December 11, 2007 | By Dean Hintz

GIS Days Event at Institute of Ocean Sciences Reveals Many Opportunities for Safe’s Data Integration Tools

In late November, Safe was given the opportunity to present at the first-ever GIS Days event at the Institute of Ocean Sciences (IOS). (Located on Vancouver Island, the IOS is one of nine scientific facilities operated by Fisheries and Oceans Canada.) Historically, most of the tradeshows and conferences Safe has attended have focused on interoperability solutions for land-based spatial data. So when I was tagged to present at GIS Days, I looked forward to gaining some first-hand insight into the challenges of managing and analyzing GIS data for the coastal waters of British Columbia and the Northeast Pacific. And I was not disappointed. For interested readers, I've included some of my impressions from the event below. But first, here's a quick overview of Safe's presentation.

I have to admit that, prior to this event, I wasn't sure how much interest my presentation would generate amongst the oceanographic community. But it soon became clear that the sixty or so attendees shared an acute interest in finding common data standards and means of exchanging and integrating their oceanographic data. Safe was widely recognized as a company with much to contribute to this community – not in terms of establishing common data models per se, but in enabling organizations to move data between various models and formats and to integrate diverse data types.

The keynote address by Steve Grise of ESRI on Arc Marine, a geodatabase data model for marine applications, clearly resonated with the group and was well received. Fortuitously, the agenda had been planned so that I was able to follow Steve's discussion with an overview of Safe's FME platform, focusing on FME's support for moving data between common marine formats. As an example, I showed an FME Workbench workspace that took S-57, E00, Shape, DWG and CARIS data and loaded it into PostGIS. Since data in this industry is often spreadsheet-based, or stored in some kind of semi-structured text file, I also showed how to use FME to extract X, Y and Z values from a spreadsheet of IOS survey data from the Strait of Georgia, transform them to point geometry in KML, and overlay the results on Google Earth. I included a new demo I created recently that always seems to generate a lot of interest: a 3D time-step simulation, displayed in Google Earth, modeling flooding of Vancouver's downtown core in the aftermath of a tsunami.

Towards the end of the presentation, I showed a 300-year NetCDF dataset of global land use development overlaid on Google Earth. While the demo did not use oceanographic data, NetCDF turns out to be a very important raster format for storing multi-dimensional gridded data in the ocean and atmospheric sciences. In this case, one file contains many bands, where each band represents global land use for a specific year. The audience did seem impressed by how easy it was, with FME, to extract data from this complex raster structure and transform it into a schema compatible with KML's time-stepped raster overlays. The result was replayed as a global animation of changing land use over the last 300 years.
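
For readers curious about what that spreadsheet-to-KML step involves under the hood, here is a rough, hand-rolled Python sketch of the same idea. To be clear, this is not how FME does it; it just shows the transformation in miniature, and the file name and column headings are placeholders for whatever a survey spreadsheet might actually use.

    import csv

    # Sketch only: convert rows of X (longitude), Y (latitude) and Z (depth)
    # from a survey spreadsheet saved as CSV into KML point placemarks.
    # The file name and column names below are hypothetical.
    def survey_csv_to_kml(csv_path, kml_path):
        placemarks = []
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):
                x, y, z = row["X"], row["Y"], row["Z"]
                placemarks.append(
                    "  <Placemark>"
                    f"<Point><coordinates>{x},{y},{z}</coordinates></Point>"
                    "</Placemark>"
                )
        kml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            + "\n".join(placemarks)
            + "\n</Document>\n</kml>\n"
        )
        with open(kml_path, "w") as f:
            f.write(kml)

    survey_csv_to_kml("ios_survey_points.csv", "ios_survey_points.kml")

Opening the resulting file in Google Earth drapes the survey points over the imagery, which is essentially what the live demo showed, minus all of the schema mapping and transformation options that FME layers on top.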

Returning now to my impressions of trends in the industry: a number of interesting themes emerged, all related in one way or another to the need for data integration. I was surprised by the degree of convergence occurring between the atmospheric, land and ocean sciences in terms of data formats and standards, both in the vector domain, as with ESRI's Arc Marine, and in the raster domain, as with NetCDF.

There was strong agreement amongst participants that data sharing and GIS analysis are seriously hindered by the lack of integration tools to support common data models – models rich enough to integrate data from many different sources and provide information on diverse elements of the marine environment. What is urgently needed are models that carry not only quantitative information, such as water depth, but also the qualitative information that better characterizes the marine domain, such as bottom type, roughness, hardness, and habitat type. The need to capture metadata and accuracy information was also stressed, as was the need for tools that more fully support semantic translation – that is, translations that preserve the original meaning of the data, not just the data values. Finally, the need to integrate more readily across datasets with very different structures was also made apparent, for example when combining vector depth soundings with raster sonar grids.
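
To make that last point a little more concrete, here is a minimal Python sketch of what such vector-raster integration boils down to: attaching a value sampled from a gridded, sonar-derived surface (say, backscatter or roughness) to each vector depth sounding. The grid origin, cell size and all of the values are invented for illustration.

    # Sketch only: attach a value from a gridded sonar-derived surface to each
    # vector depth sounding. All coordinates and values are hypothetical.
    grid = [                              # rows of backscatter values, north to south
        [0.42, 0.45, 0.47],
        [0.40, 0.44, 0.48],
        [0.39, 0.43, 0.49],
    ]
    origin_x, origin_y = -123.50, 48.70   # upper-left corner of the grid (lon, lat)
    cell_size = 0.01                      # degrees per cell

    soundings = [                         # (lon, lat, depth in metres)
        (-123.49, 48.69, 35.2),
        (-123.48, 48.68, 41.7),
    ]

    def sample(lon, lat):
        # Find the grid cell containing the point and return its value.
        col = int((lon - origin_x) / cell_size)
        row = int((origin_y - lat) / cell_size)
        return grid[row][col]

    for lon, lat, depth in soundings:
        print(f"({lon}, {lat})  depth={depth} m  backscatter={sample(lon, lat)}")

A real data model would obviously do far more than this, but even a toy example hints at why having depth, substrate and habitat attributes together in one integrated model is so valuable for analysis.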

This push for new, improved data models and data integration in the marine domain seems to be driven not only by the ocean science community, but also by interests involved in security and defense, as well as environmental organizations and disaster management agencies. Many of these agencies are beginning to invest and participate more in marine surveying and data management, and for good reason. Representatives from several of these groups articulated an awareness that support for richer data models – those that can integrate many different types of survey data – will allow for a much improved understanding and management of the environment, and allow these agencies to allocate limited resources efficiently. Environmental management organizations, for example, have much to gain from improved data models that enable the automation of tasks such as habitat assessment. Based on some of the presentations and my discussions with folks from Fisheries and Oceans Canada (DFO), it's clear that ocean-floor mapping using multi-beam and backscatter analysis is becoming increasingly important to organizations like the DFO. With this new information on substrate type added to traditional data on depth, salinity, and temperature, the DFO anticipates being able to predict fisheries management factors, such as the number of fish a given region can support, without the need for extensive field surveys.

For agencies involved in defense, richer models that include information on substrate characteristics are better able to support GIS analysts searching for new objects in the environment. Once information on substrate can be included in the analysis, new survey data will quickly reveal, for example, a large metal object lying on the ocean floor. The tell-tale sign in such a scenario might be an increase in hardness or smoothness of the substrate, compared with previous survey records that characterize the substrate in that area as “soft” or “rough.”
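
The underlying analysis can be illustrated with a deliberately simple Python sketch: compare substrate hardness from a new survey against a baseline survey, cell by cell, and flag any cell where the hardness has jumped enough to suggest a hard object sitting on a previously soft bottom. The cell labels, values and threshold are all invented.

    # Sketch only: flag survey cells whose substrate hardness has jumped
    # relative to a baseline survey. All values are hypothetical.
    baseline = {          # cell id -> hardness index (0 = very soft, 1 = very hard)
        "A1": 0.15, "A2": 0.18, "A3": 0.12,
        "B1": 0.14, "B2": 0.16, "B3": 0.13,
    }
    new_survey = {
        "A1": 0.16, "A2": 0.17, "A3": 0.12,
        "B1": 0.15, "B2": 0.88, "B3": 0.14,   # B2 now reads as very hard
    }
    THRESHOLD = 0.5       # minimum jump in hardness worth investigating

    for cell, value in new_survey.items():
        if value - baseline.get(cell, value) > THRESHOLD:
            print(f"Cell {cell}: hardness {baseline[cell]:.2f} -> {value:.2f}; "
                  "possible new object on the bottom")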

To my mind, though, the most pointed examples of the need for better models and data integration tools for marine data came from agencies involved in coastal disaster management and emergency response planning. Despite the fact that many of the world's disasters occur along coastlines, integrating land and marine data to build accurate maps of the coast remains a significant challenge. Simply determining what exactly constitutes the coastline is problematic: ocean maps typically trace the coastline at the low-tide level, whereas land-based maps trace it along the high-tide level.

Robert Kung of Natural Resources Canada further underscored some of these hurdles. In his discussion, Robert revealed that the challenges of integrating sub-sea slope measurements and substrate information with land-based data are currently hindering efforts to build landslide prediction models for the eastern shoreline of Howe Sound. The highway along this steep shoreline is the only traffic corridor from Vancouver to the Whistler ski area, where many of the 2010 Olympic events will take place.

The good news in all this is that, once enhanced models are established, sophisticated spatial ETL tools such as Safe's FME can provide a way to move diverse existing data types into these preferred models. Safe's FME Server technology for web-based sharing of many different types of data also has much to offer the marine GIS community. Judging by the number of discussions focused on spatial data infrastructures and web portals, marine interests are now feeling a growing responsibility to make their data available to a public with an ever-increasing appetite for geospatial data.

For me personally, listening to some of the discussion around web-based data access was a sobering reminder of the serious difficulties organizations face in sharing information in the absence of adequate models and data sharing tools. As a case in point, a presentation by the Canadian Navy revealed that much of the essential spatial data for one domain awareness system resides in something like a dozen different applications on a dozen different computers, tracking everything from weather to surface temperature, water currents and benthic data. And here's the kicker: since there is no data interoperability between these applications, information is shared by simply taking a screen capture and emailing the JPEG image or pasting it into a report. And there's no geo-referencing attached to the data. When I hear that organizations are struggling with these kinds of issues, it sure motivates me to get out there and spread the word about FME!
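
To give a sense of how little it would take to keep geo-referencing attached to that kind of shared image, here is a small Python sketch that wraps an image in a KML GroundOverlay with a lat/lon bounding box, so it can at least be draped in Google Earth rather than passed around as a bare JPEG. The image name and bounds are made up for the example.

    # Sketch only: wrap an otherwise un-referenced image in a KML GroundOverlay
    # so the geo-referencing travels with it. Image name and bounds are hypothetical.
    image_name = "surface_current_snapshot.jpg"
    north, south = 49.40, 48.20
    east, west = -122.80, -125.30

    kml = f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <GroundOverlay>
        <name>Surface current snapshot</name>
        <Icon><href>{image_name}</href></Icon>
        <LatLonBox>
          <north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west>
        </LatLonBox>
      </GroundOverlay>
    </kml>
    """

    with open("surface_current_snapshot.kml", "w") as f:
        f.write(kml)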

Given the opportunity, Safe would certainly attend another GIS Days event at IOS. Most of the presentations were of high quality, and Terry Curran of the Canadian Hydrographic Service is to be congratulated for pulling together a well-organized and diverse gathering in under a month, with minimal resources. For us at Safe, the event provided a timely reminder that our spatial ETL technology is as relevant to the management of marine data as it is to land-based data.