Best Practices for Geospatial Data Sharing

Fellow Safer Drew Rifkin recently came back from the NSGIC 2011 Annual Conference touting NSGIC’s new Geospatial Data Sharing Guidelines for Best Practices document. While Safe Software has been involved in many of these initiatives over our nearly 20-year history, I was surprised to realize that I’ve never seen such a cogent set of arguments in one place for the open sharing of geospatial data.

The document itself is clearly targeted at a much broader audience than just the geospatial professionals who normally rub shoulders to talk about these sorts of things. For example, I may start using the definition it provides for “Geospatial Data” the next time I’m asked at a dinner party what I do for a living.

Geospatial data identify and relate the geographic location of features and boundaries. They are stored in databases that include descriptive attribute information about locations, allowing the information to be mapped. Geospatial data enable government, consumer and business applications. These data are accessed, manipulated or analyzed through Geographic Information Systems (GIS).

From there, the document goes on to provide a solid argument as to why the open sharing of geospatial data is in the best interest of the common good, and why policies that limit this, either by imposing fees or by outright restricting such sharing, are counterproductive. James Fee’s experience a couple of years ago with his local government, while working on a school project, highlights the downside of such policies.

I remember in the earlier years of Safe when some of our government customers held their data tightly and how different groups within the same government had to pay each other for their data. I agree with NSGIC that the effort spent negotiating and accounting for those transactions could have been more profitably spent elsewhere.

3 Myths about the Open Sharing of Geospatial Data

What packs particular punch in this document is the debunking of a few myths around open data.

The document ends with a solid charge for decision makers:

Now is the time to change existing policies that are outdated or based on incorrect assumptions. Tremendous value can be realized by all organizations through the open sharing of geospatial data. NSGIC calls on government administrators, geospatial professionals, and concerned citizens to further advance the use of important geospatial data assets and to ensure that they remain freely accessible.

If you’re a decision maker in a government at any level, in any country, this document is a must read. And if you’re a consumer of spatial data provided to you by governments, you owe it to yourself to be up-to-speed on the latest thinking in the data sharing arena. Because of the way it was written, it’s also ideal for passing on to the high-level decision makers you interact with.

As someone who has seen this debate unfold over the years** in Canada, to the point where open data now really has legs at all levels (thanks, Vancouver, for your leading role), I say a big THANK YOU to NSGIC for gathering together and presenting such a solid set of arguments around these issues to help decision makers everywhere.

You can read NSGIC’s Geospatial Data Sharing Guidelines for Best Practices here.

No Two Tier Data

** Indeed, in the year 2000, I was invited to participate in a roundtable discussion on the issue of open data, and I recall holding up a sign saying “No Two Tier Data”, with about the same impact as the similar stunt that had been previously done during an all-leaders debate in a Canadian federal election. #thingsnottodoinadebate


Dale Lutz

Dale is the co-founder and VP of Development at Safe Software. After starting his career working with spatial data (ranging from icebergs to forest stands) for many years, he and fellow co-founder Don Murray realized the need for a data integration platform like FME. His favourite TV show is Star Trek, which inspired the names for most of the meeting rooms and common areas in the Safe Software office. Dale is always looking to learn more about the data industry and FME users. Find him on Twitter to learn about his latest discoveries!


8 Responses to “Best Practices for Geospatial Data Sharing Released by NSGIC”

  1. Mike Futrell says:

    An additional cost is the consultant’s time spent negotiating agreements and acquiring typical base layers, even for the locality’s own project where a no-cost, project-specific license is offered. Often this is billed back to the locality. I’ve seen this exceed a thousand dollars and/or encompass a week of delay numerous times. It is always worse when the locality is trying to turn a buck. For non-locality projects, it’s an increased cost of doing business in the locality.

    Further, geospatial consultants can and do create new, update old, and maintain ‘shadow’ versions and partial versions of locality layers because in some places that’s the only way [AEC firms, and others] can perform their work. For the traditional gatekeeper who wants to know who has ‘his/her’ data, maybe their concern should be how many parallel versions of that data are active in the wild and who’s using them, and then think about why.

    I encourage sending this document to your favorite local government, and avoid paying for public data whenever possible to hasten the failure of less than best practices.

  2. Dale Lutz says:

    Hi Mike,

    Great points. I’d not considered the costs and dangers of the “shadow” world of independently maintained copies of some ancient data ancestor. Not only could decisions be made based on inaccurate information, but costs, which have to be paid by someone, would also be incurred as each independent shadow was maintained. What a drag. Literally.

    Thanks for chiming in.

  3. Carl Reed says:

    Dale –

    Great comments! A few weeks ago I attended a special workshop for Minnesota emergency management folks who deal with geospatial data sharing “in the trenches”. Their big issues have much more to do with jurisdictional and organizational relationships, free access to use when the data are needed, and so forth. Interestingly, during my presentation I asked how many of the organizations represented had policies in place for using VGI content (such as OSM). Very few. They did state that issues of quality and liability were a concern. And the state/Federal EM coordinator wondered about security. My response was almost identical to your myth 3 response.

    As to technology, the group understood that the tools are available. However, I also found that there is a lack of knowledge about standards and how they can be used to enhance real-time accessibility to geospatial data contained in distributed geospatial repositories. Also, the issue of semantics was not on their radar screen. This may change in the US as the Next Generation 911 architecture and supporting reference materials specify a common GIS data model for sharing GIS data between counties and PSAPs. GML is used for the GIS data model encoding, and the OGC Web Feature Service interface standard is specified as the query and access standard.

    In terms of Canada, the work of GeoConnections and related Federal-level organizations has done much to promote effective data sharing in the Canadian community.


  4. Curtis Pulford says:

    Thanks for posting this, Dale!
    I am chair of the NSGIC Geospatial Data Sharing Workgroup that produced the document, and it is very encouraging to see your positive comments right out of the gate. I worked on this over the past year with a tremendous group of dedicated and passionate individuals from across North America, and I was very happy to forward your comments their way.
    One item that I would like your readership to be aware of is the planned future support for this position paper from NSGIC.
    Within the workgroup’s Charter is reference to a second phase of work. This phase two work is just beginning, and will include “…organization of archival evidentiary materials.” This archive will contain white and gray literature and other documents on data sharing in areas such as legal opinion; policy and governance; liability statements; federal, state and local use-cases; emergency response needs; urban and regional planning; economic evaluations; cost avoidance; and other supporting materials. The goal here is to make it easy for local policy makers to find supporting materials relative to their particular struggles.
    Our workgroup would welcome further support and assistance in building the supporting arguments. The plan is to put those materials into the new NSGIC website – which is under development now. With all the work needed to organize and manage the delivery, we will certainly appreciate any help in gathering these supporting materials.
    If you know of a localized report that points out the value of, or shows measurements supporting, open geospatial data sharing policies, please contact me or a member of the Geospatial Data Sharing Workgroup.

  5. Dale Lutz says:

    Hi Carl — thanks for the comments. As opposed to years ago when data sharing discussions were first being kindled, the tool support side is definitely no longer an impediment, though knowledge of all that’s available may be. All that hard work by so many has really, really made a difference and laid a very solid foundation for any organization wishing to share data. There really are many solid options available now for sure.

  6. Dale Lutz says:

    Thanks Curtis. This secondary archive you are putting together will be a fantastic resource for organizations. Hopefully some of the readers here will be able to help out in various ways, and we’ll all be on the lookout for more reports of the effect of open data. Perhaps our friends at the City of Vancouver may have some observations soon — I’m intending to attend a talk by Dan Campbell from CoV at Autodesk University in a couple weeks and will be on the lookout for relevant information.

    Thank you and your group for the hard work you’ve put into this document. It is a fantastic resource and will be useful for many, many organizations, I’m sure.

    As a sign of how fast time is passing by, I thought I recalled recent efforts to create an Open Data Consortium (ODC) from a few years ago. A quick web search later and I see it was actually about 10 years — yikes — but some of the original thinking there is still relevant, I think — see

    Thanks again

  7. Jason Birch says:

    Speaking entirely for myself (not my employer, although it is a very pro-open local government), I think the strongest argument for free access to local government data boils down to one thing:

    Requiring payment for data is anti-local and anti-democratic.

    Having to pay for data places local businesses and developers on a tilted playing field where large out-of-town companies with coffers to match have a huge competitive advantage. If you can’t afford the data, you can’t compete.

    For regular citizens, the matter is even more critical. How can you expect citizens to fully participate in the democratic process, providing feedback and valuable input to processes as varied as zoning and social policies, without having access to the same information that City “experts” and local special interest groups are able to use? Learning how to effectively analyze the data is a tall enough barrier without making them pay for access as well.

  8. Dale Lutz says:

    Thanks Jason, but seriously, how do you feel, really :-). Very good points. I believe you have promoted and participated in hackathons in the past — I have viewed those as wonderful opportunities to create value for citizens in unexpected and surprising ways from the data, and they can also help overcome the analysis barrier.
