Use FME's data validation tools to automatically check and repair data from 335+ data sources, including spatial applications, so you can be confident it will work properly in its destination system. Validated data meets three criteria:
It contains no inconsistencies or errors
It has no missing entries for fields where a value is required
It meets the specifications of data model standards
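The criteria above can be sketched in code. This is an illustrative example, not FME itself: the field names ("parcel_id", "zone", "area_m2") and the positivity rule are hypothetical stand-ins for the kinds of required-field and consistency checks a validation workflow performs.

```python
# Illustrative sketch (not FME): flag records with missing required
# fields or inconsistent values before they reach a destination system.
# All field names and rules here are hypothetical examples.

REQUIRED_FIELDS = {"parcel_id", "zone", "area_m2"}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in one record (empty list = valid)."""
    problems = []
    for field in sorted(REQUIRED_FIELDS):
        value = record.get(field)
        if value is None or value == "":
            problems.append(f"missing required field: {field}")
    # A simple consistency rule: an area, if present, must be positive.
    area = record.get("area_m2")
    if isinstance(area, (int, float)) and area <= 0:
        problems.append("area_m2 must be positive")
    return problems

records = [
    {"parcel_id": "P-001", "zone": "R1", "area_m2": 520.0},
    {"parcel_id": "", "zone": "C2", "area_m2": -4.0},
]
for i, rec in enumerate(records):
    for problem in validate_record(rec):
        print(f"record {i}: {problem}")
```

Because the function reports every problem it finds rather than stopping at the first, its output answers both "is this record valid?" and "what exactly is wrong?", mirroring the report-or-repair choice described below.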
FME not only identifies problems with datasets, but also reports what they are and where they occur. If issues exist, FME's transformation tools can be used to repair them or filter out bad data. With FME, it's entirely up to you and your specific needs whether you configure workflows to report problems, fix them, or both.
Author data validation and repair workflows quickly in FME Desktop's intuitive graphical interface. Once configured, workflows run without the need for manual intervention, freeing you to work on other tasks. FME workflows are reusable, and the settings of their transformer tools are easily adjusted to suit changing requirements - no coding required.
Whichever spatial data types you are working with for whichever project - CAD, GIS, BIM, or a combination of the three - FME's spatial data capabilities extend to validation and repair. Once you've set the source and destination formats for your workflow, FME's transformer tools can be configured to determine whether any issues will prevent the data from working properly in the destination application.
And once again, if there are any problems, FME can be used to clean the data so you can move forward with your project.
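As a concrete illustration of the kind of spatial check such a step performs, here is a minimal sketch, not FME itself: it flags polygon rings that are degenerate (fewer than three distinct vertices) or not closed, two common issues that break geometry in a destination application.

```python
# Illustrative sketch (not FME): minimal geometry validation of a
# polygon ring, represented as a list of (x, y) vertex tuples.

def check_polygon(ring: list[tuple[float, float]]) -> list[str]:
    """Return problems found in one polygon ring (empty list = valid)."""
    problems = []
    if ring and ring[0] != ring[-1]:
        problems.append("ring is not closed (first and last vertex differ)")
    if len(set(ring)) < 3:
        problems.append("degenerate geometry: fewer than 3 distinct vertices")
    return problems

good = [(0, 0), (4, 0), (4, 3), (0, 0)]   # closed triangle
bad = [(1, 1), (2, 2), (1, 1)]            # closed, but only a line
print(check_polygon(good))  # []
print(check_polygon(bad))
```

Production-grade spatial validation also covers self-intersection, winding order, and coordinate system conformance; the point here is only the report-then-repair pattern.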
Every database has rules for the structure and contents of the data it warehouses to ensure a consistent level of quality for everyone accessing and using it in their tasks.
FME can be used to validate datasets prior to loading them into central repositories, checking for illegal values such as blanks and domain list conflicts so that bad data is stopped before it can damage decision making.
Most industries nowadays have developed data standards to facilitate the exchange of information - and most of these are defined using XML. Although very useful for sharing data, XML is a complex format when it comes to its rules for syntax and schema. Whether you're reading an XML dataset or trying to output your data as XML, you need to ensure that it complies with your industry data standard's requirements. FME has an ongoing commitment to support XML-based standards validation with tools that have been specially designed to check your data against those rules - without any coding - and report back whether there is a problem, what it is, and where it occurs. And of course, if there are any problems, FME can be used to fix them.
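To make the two layers of XML checking concrete, here is a minimal sketch using only Python's standard library, not FME's no-code tooling: it verifies that a document is well-formed and that a required element is present. Full standards validation (XSD or Schematron) needs a schema-aware library; the element name "id" here is a hypothetical example.

```python
# Illustrative sketch (not FME): basic XML checks with the standard
# library only - well-formedness, plus presence of a required element.
# Real standards validation against an XSD needs a schema-aware library.

import xml.etree.ElementTree as ET

def check_xml(text: str, required_tag: str) -> list[str]:
    """Return problems found (empty list = passes these basic checks)."""
    try:
        root = ET.fromstring(text)
    except ET.ParseError as exc:
        return [f"not well-formed XML: {exc}"]
    if root.find(f".//{required_tag}") is None:
        return [f"required element <{required_tag}> is missing"]
    return []

print(check_xml("<feature><id>42</id></feature>", "id"))  # []
print(check_xml("<feature></feature>", "id"))
print(check_xml("<feature><id>42</feature>", "id"))       # mismatched tag
```

Note that a syntax error is reported before any schema-level check runs; a document must be well-formed before it can be judged against a standard's rules at all.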
Get tips on improving data quality with this compilation of success stories. Guest presenters from Colonial Pipeline and Global Information Systems take part in this best practice discussion of data validation for better quality assurance. Access the recorded webinar and slides.