I was recently asked about what guidelines there are for authoring workspaces in FME.

As you might know, we have the idea of Best Practices in FME: a concept covering what makes a workspace a good workspace.

Chapter three of our training course covers Best Practices. It mentions annotations, bookmarks, debugging, project organization, and a lot more. We also have a sample project that we use for certification candidates. It’s an example of what a project ought to look like – both in the workspace and for its documentation.

So, I can tell you what makes a workspace good – but it’s perhaps a little harder to say what makes a workspace bad. I mean – if a workspace runs to completion, and produces the output you want, it can’t be bad… can it?


Code Smell

When I looked up methods for identifying bad code, I found there is rarely a firm answer. Instead, you have to look for indicators that Wikipedia calls Code Smells. It states:

Code smells are usually not bugs—they are not technically incorrect and do not currently prevent the program from functioning. Instead, they indicate weaknesses in design that may be slowing down development or increasing the risk of bugs or failures in the future.

In FME terms: your workspace runs, and it produces the output you need, but… further editing may be harder than it should be, and even though the workspace runs now, it might fail in the future.

So, can I find any indicators that suggest an FME workspace is… a bit whiffy?


Duplication

Duplication is the biggest red flag of all.

Objects repeating themselves in a workspace are the blue cheese of FME: a noticeably sharp aroma that taints everything it comes into contact with.
In particular, these are bad duplications:

These are not always errors – sometimes you really do need repetition – but they indicate that your design could be weak.


Complexity

I found another quote, which I think is great because it captures a dilemma in how we evaluate workspaces for certification:

one way [to code] is to make it so simple that there are obviously no deficiencies; the other way is to make it so complicated that there are no obvious deficiencies

A workspace does not need to be complex to be a good project. It can be carrying out a complex process in a simple way. But sometimes I see a workspace that’s so complex it takes me hours to determine whether there really are no deficiencies, or whether they are just really well-hidden.

These workspaces are the bad wines of FME: the label says it’s big and bold, with an oaked nose and a complex backdrop of juicy raspberries, velvety vanilla, opulent cigars, and spicy figs. But after lengthy sampling, you figure out it reminds you more of used gym socks.

These are clues that there might be a problem:

Again, these are not always errors; sometimes (I admit) there are FME limitations that force this complexity. But none of the above is a good sign.


Bulletproofing

By bulletproofing I mean designing a workspace on the assumption that problems will arise, and therefore building in methods to handle failure. Error trapping is – I think – the term used in software development.

Bulletproofing can simply be adding a test or filter transformer to weed out bad features – usually before they get to the point at which an error would occur.

However, though easy to describe, it’s not so easy to notice when bulletproofing is missing. These are the wet dogs of FME! You have to put your nose up close and sniff hard to catch the odour (but, boy, when you do!). I would double-check my workspace in these scenarios:

To an extent, this is less of a problem now that we’ve started to add <rejected> ports to transformers, but it’s still worth investigating, I think. Bulletproofing is of particular interest when the source data is not your own creation, and/or when it is liable to change without notice – for example, when you are processing data uploads on FME Server.

If you can’t be sure what data is coming, you had better be prepared for the worst!


Deodorizing a Workspace

In the same way you might think your seafood dinner has a delicate aroma of the ocean, it can be hard to perceive your own workspaces as being anything other than delightfully fragrant! The solution to both problems is much the same: you need a good friend to point out the problem.

At Safe, our developers don’t commit their work to the FME product until it has been code-reviewed by a colleague. So that’s what I’m suggesting you try. In particular:

At Safe it’s actually quite informal – we don’t always wear doctor’s coats – and it can be simple for you too. It can be as little as a quick look over your shoulder, or can be carried out by email.

You would only need to get a group of people together for a code review meeting when the project is really large and complex, or when there is a disproportionately high cost associated with failure.

It really is worth doing, and both reviewer and reviewee may well learn about techniques they weren’t aware of.


FME Golf

The other interesting thing I found while researching code quality was the concept of “code golf”.

Like traditional golf, where the object is to play a round in the fewest strokes, code golf requires a developer to solve a problem using the shortest code possible. For example, this snippet:

!(y%100<1&&y%400||y%4)

…is (so they say) JavaScript that determines whether a given year is a leap year or not. You couldn’t get much smaller than that. In fact, you couldn’t get much further away from the concept of well-designed code either!

Still, I’m going to throw out a fun challenge. I’ll set a task and provide a dataset, and you have to create the smallest workspace possible to carry it out.

The challenge is this: download the source CSV dataset of public artwork (below) and create a workspace to do the following:

Here are the files:

Of course, there are some rules:

My workspace carries out the task in the way you would usually expect. It weighs in at 68 KB (69,411 bytes, to be exact). Can you reduce that number and produce the same result?! I’m looking forward to seeing some experiments with different transformers, or even different Readers – and to seeing how you can bend the rules (read them very carefully) without breaking them!

You can email me your creations at the address below (I’m on vacation right now, so there’s no rush). I can’t promise a prize, but I’ll see if I can raid the marketing team’s cache of swag when they aren’t looking.


Conclusions

The FME golf idea is just a bit of fun, but producing well-formed workspaces is no joke. A bad workspace can be slow and inefficient. It will be harder to maintain, and it is not necessarily proof against future problems. The instructions for code reviews at Safe suggest:

It costs more to fix defects that are found later in the development process. Fix early and fix often

And it’s hard to argue with that.



Mark Ireland

Mark, aka iMark, is the FME Evangelist (est. 2004) and has a passion for FME Training. He likes being able to help people understand and use technology in new and interesting ways. One of his other passions is football (aka. Soccer). He likes both technology and soccer so much that he wrote an article about the two together! Who would’ve thought? (Answer: iMark)
