Letter from the President - October 2018
This month’s AUGIWorld is about customization, which means we’ve all got programming on our minds... but honestly, I really can’t blame that on the magazine theme. I’ve been thinking a lot about how we interact with software—specifically, how we assign “responsibility” to the programs we use.
It’s very easy to fall into the habit of using passive language to describe software actions, or of assigning the action to the software itself. “I would have been here on time, but my maps app sent me the wrong way.” “My alarm didn’t go off this morning.” “That Revit schedule isn’t showing the right information.”
But is that really the best—or the safest—way to think about it? Any program is only as good as its input and its algorithm. A flaw or failure in either component can lead to suboptimal output. Did your maps app really send you “the wrong way” or did you miss a turn? When your alarm “doesn’t go off,” did its internal clock break or did you forget to set it? If your Revit schedule doesn’t look right, is it really broken behind the scenes or did you choose the wrong fields—or model the wrong elements? (Is it a coincidence that these examples also have to do with assigning blame?)
Most of us who aren’t programmers tend to view software as a “black box.” We don’t have the training or the expertise to fully understand the source code, so the output is all we have to go by. That’s why it’s our responsibility to thoroughly test any program or piece of code we intend to use in our workflow—whether it’s a complicated Dynamo script you wrote yourself or a LISP snippet you downloaded from the AUGI forums.
How do we do this? It’s pretty easy: feed the program input with known solutions and see if you agree with its conclusions. Start small and work your way up to more complicated problems. If the results coming from the program match up with your own solutions—based either on your own calculations or personal experience—you can start to trust the answers to problems with previously unknown solutions. (Remember the value of estimations, too. If you’re expecting a number output, you should know beforehand whether that number is about 10, about 100, or about 1,000. It’s a quick gauge of reliability.)
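To make that concrete, here’s a minimal sketch of the idea in Python. The unit-conversion function is purely hypothetical—a stand-in for whatever script or add-in you’re evaluating—but the pattern is the one described above: start with inputs whose answers you already know, then sanity-check the order of magnitude.

```python
# Hypothetical stand-in for any program you're testing: a simple
# feet-to-meters converter whose correct answers we can verify by hand.

def feet_to_meters(feet):
    """Convert a length in feet to meters (1 ft = 0.3048 m)."""
    return feet * 0.3048

# Start small, with known solutions...
assert abs(feet_to_meters(1) - 0.3048) < 1e-9
assert abs(feet_to_meters(10) - 3.048) < 1e-9

# ...then check the estimate: 100 ft should come out around 30 m,
# not around 3 or 300. A wildly wrong magnitude means stop and dig.
result = feet_to_meters(100)
assert 10 < result < 100
```

Once the known cases and the rough estimate both agree with the program, you have some grounds to trust it on the problems you can’t check by hand.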
For more complex software, the output by itself isn’t enough to completely earn our trust. We want to know how it works, preferably in as much plain language as possible.
This leads me to my second point, and what I’m beginning to think of as the three most important things in programming: Documentation, documentation, and documentation. All right, maybe that’s a slight exaggeration. But I’ve encountered a few professionally released programs lately that have, at best, bare-bones help files. Getting started in these programs has been a challenge, to say the least. Without documentation, it’s practically trial-and-error to understand what will work as input and what we should expect as output. Yes, good interface design can alleviate some of that work—but without documentation, it’s that much harder to trust the program’s results.
If you’re writing your own code, please, for the sake of those of us who will use your program in the future (including you, six months from now), include notes everywhere. Describe everything that’s going on, and what the result of each step should be. It may seem like overkill, but your colleagues—or customers—will thank you for it. It goes back to our habit of assigning responsibility to software. Since it’s unlikely that we’ll stop saying “the program says...” we need to be that much more aware of how the software “thinks.”
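For what that looks like in practice, here’s a hypothetical Python snippet written the way the paragraph above suggests: the docstring states the expected input and output, and the comments describe what each step does and what result it should produce.

```python
# Hypothetical example of "notes everywhere": a small helper whose
# docstring and comments spell out inputs, steps, and expected results.

def average_wall_height(heights):
    """Return the mean of a list of wall heights, in the model's units.

    Input:  a non-empty list of numbers, e.g. [9.0, 10.0, 12.5]
    Output: a single number; for the example above, 10.5
    """
    if not heights:
        # An empty list has no average; fail loudly here rather than
        # return a misleading 0 that a schedule might quietly display.
        raise ValueError("expected at least one wall height")

    # Sum first, then divide by the count: for [9.0, 10.0, 12.5] the
    # sum is 31.5 and the count is 3, so the result should be 10.5.
    return sum(heights) / len(heights)
```

Six months from now, those comments are the difference between reading the function and re-deriving it.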
In the end, just remember that every bit of software we use today was, at some point, written by a human being. (This statement may not be true once robots have taken over... but for now, I think it stands.) The output is based on human interpretation of data. Fortunately, software won’t get offended when you question its results. Keep checking, keep verifying, and you’ll be able to celebrate the things that technology enables you to accomplish instead of wondering what went wrong!