FME makes it possible to solve complex problems, but this often leads to complex models that can be challenging to maintain without introducing errors or unwanted behaviours. Typical risk scenarios include changing the model itself, or updating the FME version, transformer versions, or other components the model depends on.
Testing is a cornerstone of the modern software development process, assuring code quality by examining the software's artifacts and behaviour. Methodologies for testing source code are mature and well described. In contrast, methods for testing scripts or programs developed with drag-and-drop development software like FME are not as well established. Some work on establishing a test framework for FME scripts has been done, for example the rTest framework developed by the company Veremes and presented at FME IUC 2017.
We present a test method for quality assurance of FME scripts that is not based on defining test cases. Instead, it works by comparing datasets at defined measure points in a script, verifying that the data has not changed in unwanted ways after the script is updated.
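The core comparison step can be sketched in a few lines of Python. The snippet below is a minimal illustration only, not part of the presented method's implementation: it assumes that the features passing a measure point have been exported as attribute dictionaries (for example via a snapshot writer), and compares a baseline run against a candidate run by fingerprinting each feature.

```python
import hashlib
import json

def feature_fingerprint(feature: dict) -> str:
    """Hash a feature's attributes in a key-order-independent way."""
    canonical = json.dumps(feature, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def compare_measure_points(baseline: list, candidate: list):
    """Compare two dataset snapshots taken at the same measure point.

    Returns (missing, added): fingerprints of features present only
    in the baseline run or only in the candidate run, respectively.
    An unchanged dataset yields two empty sets.
    """
    base = {feature_fingerprint(f) for f in baseline}
    cand = {feature_fingerprint(f) for f in candidate}
    return base - cand, cand - base

# Hypothetical snapshots from before and after a script update.
before = [{"id": 1, "area": 10.5}, {"id": 2, "area": 7.2}]
after = [{"id": 1, "area": 10.5}, {"id": 2, "area": 7.3}]

missing, added = compare_measure_points(before, after)
# Here feature 2 differs, so each set contains one fingerprint.
```

In practice the snapshots would be written by the FME script itself at each measure point; the fingerprinting shown here is just one simple way to make the comparison order-independent.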
Gitter Consult AB
FME User Conference 2022