A Closer Look at Geospatial Maturity Assessment Disconnects 

By Karen Rogers / April 28, 2020

The initial 2019 Geospatial Maturity Assessment (GMA) report was just that: initial. Like geologic stratigraphy, the deeper you dig, the more you can learn about the layers above. I have been digging into the report findings to do just that, and as expected, there’s a lot more to the story.

The inspiration for the GMA report card was the national report card produced by the Coalition of Geospatial Organizations (COGO), so a logical next step is to look at how the two evaluations compare and how consistent the grades are between them. Through the GMA process, we determined that about half of the framework themes are led by the federal government, while the other half are led by state and local data stewardship efforts. Most themes are in general alignment between the two report cards, but two of the federal-led themes stand out as having a serious disconnect between the grades.

Geodetic Control and Governmental Units both received an A- nationally, while the state grades don’t fare nearly as well. The National Geodetic Survey (NGS) does a great job maintaining and improving the national geodetic control system, but few states are doing much above and beyond the federal data and program. Though the theme is poorly understood, it matters for states in the Public Land Survey System (PLSS), where accurate geodetic control is needed to accurately map private property. The grade disparity suggests an overconfidence in the extent and reach of the federal program and its ability to serve state needs.

Governmental Units represents a reporting system through which the US Census Bureau identifies the boundaries of incorporated governmental bodies, be they towns, cemetery districts, or school districts (among others). We found a huge variety of circumstances and programs across the states, which made a single fair grading rubric difficult to get right the first time. The GMA team agreed this was the biggest weak spot in our grading system, and the grading metrics deserve a closer look and improvement next time.

For state-led themes, the areas of disconnect lie primarily with Addresses and Transportation. In the 2018 COGO Report Card, Addresses received a B+, while Transportation scored a C nationally. In the GMA, twelve states received a failing grade for Addresses, and 17 in total got a D or below. At the top of the scale, seven states earned an A and one an A+, with 22 in total getting a B or above. The National Address Database (NAD) is only partially populated and not consistently funded, so a B+ is generous for such a nascent program; it was clearly an overly optimistic national grade for the first year the theme was evaluated. Transportation shows the opposite pattern: it received a C nationally, yet most states scored above that, suggesting an overly harsh evaluation by COGO. These disconnects suggest that a conversation between states and the lead federal agencies is warranted to determine which data elements and quality characteristics are required to fulfill most business needs, which would better inform data standards and data reporting processes and mechanisms.

I also dug into how the grades states received for Coordination compare to their overall GPA. My assumption was that states with higher Coordination grades would also earn a higher overall GPA as a natural outcome of that central coordinating force. This mostly holds true, but about a quarter of states show a difference of a whole grade (up or down) between the two. That warranted further inspection, and the results reflect the unique circumstances of each state.

Grades are prone to vary, so to eliminate minor blips, I looked for states where the Coordination grade and overall GPA differed by a full letter grade. Using this criterion, three states have a Coordination grade lower than their overall GPA (Delaware, Illinois, and Wisconsin), and eight states have a Coordination grade higher than their overall GPA. Of these eleven states, I chose four, two from each category, where the difference was the greatest. The following commentary is based wholly on their GMA grades and responses.
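For readers who want to run this kind of screen on their own data, the filter is simple arithmetic. The sketch below is a minimal illustration, assuming a conventional letter-to-points scale (the GMA’s actual scale may differ); the example inputs are the grades mentioned in this post, converted under that assumed scale.

```python
# Minimal sketch of the whole-grade-difference screen described above.
# The letter-to-points scale is an assumption for illustration; it is not
# necessarily the scale used in the GMA grading.

GRADE_POINTS = {
    "A+": 4.3, "A": 4.0, "A-": 3.7,
    "B+": 3.3, "B": 3.0, "B-": 2.7,
    "C+": 2.3, "C": 2.0, "C-": 1.7,
    "D+": 1.3, "D": 1.0, "D-": 0.7,
    "F": 0.0,
}

# Coordination letter grade and overall GPA per state, taken from the
# grades cited in this article (GPA values use the assumed scale above).
states = {
    "Delaware": ("D", 2.7),   # D Coordination, B- overall
    "Hawaii": ("A", 1.7),     # A Coordination, C- overall
    "Wisconsin": ("D", 2.3),  # D Coordination, C+ overall
}

def whole_grade_gaps(data, threshold=1.0):
    """Flag states where Coordination and overall GPA differ by a full grade."""
    gaps = {}
    for state, (coord_letter, gpa) in data.items():
        diff = GRADE_POINTS[coord_letter] - gpa
        if abs(diff) >= threshold:
            gaps[state] = round(diff, 2)
    return gaps

# Negative values mean the Coordination grade sits below the overall GPA;
# positive values mean it sits above.
print(whole_grade_gaps(states))
```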

First up is Delaware, whose Coordination grade is doomed by the lack of a GIO (D for Coordination and B- for overall GPA). Despite the absence of someone filling that role, all state-led themes earned A’s. They earned D’s in the federal-led themes of Geodetic Control and Governmental Units, and the grading criteria for those two are the weakest and most problematic of them all because the programs are complex and vary greatly across the country. Delaware would be catapulted to greatness by establishing a GIO position; whoever fills it could make inroads with NGS and the Census Bureau on those two lower grades.

Wisconsin is another state with a big difference between grades (D for Coordination and C+ for overall GPA). They have a central coordinator, but the position is narrowly focused: the manager of the land information program at the Department of Administration. That involvement with land information shows in their A+ in Cadastre, A in Geodetic Control, and A+ in Governmental Units. Despite that, they scored F’s in both Addresses and Transportation. My guess is those two could readily be improved by building on the existing relationships with local government that have shown success in other areas, though it might take redefining the GIO role to expand its focus to the other framework layers.

On the other side of the spectrum, Hawaii scored an A in Coordination but a C- for overall GPA. Other than an outdated strategic plan and an unofficial council, they have all the characteristics of successful coordination. When it comes to the data, however, they scored F’s in Addresses, Transportation, and Elevation, along with a C in Hydrography. Thanks to the 3DEP program, their Elevation grade should improve. The Addresses and Transportation grades could be improved in conjunction with movement on an NG9-1-1 or open data program.

Finally, Virginia is another state that seemingly has everything going for it in the coordination realm, but their grades aren’t stacking up to reflect it (A in Coordination and C+ for overall GPA). They earned F’s in Elevation and Governmental Units. Again, Elevation should improve thanks to 3DEP, and I can’t help but think their Governmental Units grade reflects oddities with their incorporated vs. unincorporated places (among other possibilities in our grading system). They also scored a C in Hydrography and a C+ in Leaf-On Orthoimagery. Hydrography will likely improve thanks to better elevation data coming over the next few years, while Leaf-On Orthoimagery doesn’t seem to have a strong enough business case in the state to warrant supporting its capture.

These are just a few case studies, covering about 10% of the data we captured in 2019. Did we get everything right? Certainly not. But we made a solid first stab at it. One common thread is the difference an open data policy would make across the board. If all states adopted an open data policy whereby their address, cadastre, and road data were shared freely through an API, the needle would really shift. Our best hope is that this becomes reality within a few years, so that what scores a C this time could bump up to an A. As technology evolves and attitudes change, the minimum bar for expectations will have to rise too, and that would be a welcome evolution because it would signify true progress and maturity. Regardless, for the first time, states can see where they stand relative to their neighbors when it comes to their GIS programs. And we all know that knowing where you are is half the battle; the other half is plotting a course to get where you want to be.

To learn more, listen to Karen Rogers on the latest episode of StateScoop's GIS Addressed where she explains why states and federal agencies need to coordinate more for GIS growth.