Tools of Dubious Productivity

So when you’re against the clock working to complete a task, how much of your time should be spent massaging the ego of a recalcitrant software package? How much should the average user need to know about stability, memory leaks and the like? Looking back on it, a very significant portion of an operator’s time is spent corralling the code in a particular direction, working around the various foibles and “features”, well aware that autosave is not always there to save you, erm, automatically. This is totally off the top of my head, but I reckon that a good 20-50% of your time working on any given “productivity tool” is actually spent working within the limitations and boundaries set out by a Russian programmer called Petr. Or Vlad. Whilst my claim is no more accurate than a certain outgoing president’s claims of electoral fraud, unlike electoral fraud, a loss in user productivity is real, and it matters. This is largely because large organisations like to make decisions based upon perceived percentages of productivity gain versus sticking with existing legacy solutions, often sold to them by some vendor’s overzealous customer relations managers. Bold improvements in productivity are often trumpeted in marketing bumf, but seldom is it mentioned that a certain proportion of that percentage improvement will be lost to “fucking about whilst the little beachball thing spins around”. Go to any busy design studio tomorrow and you can guarantee that at least one user has popped off to take a dump instead of sitting watching a percentage bar crawl across the screen.

To the nearest zillion, how many of these do you think Autodesk receive every quarter, filled with expletives?

I would be very interested to know what the actual percentage productivity gain is from, say, Revit 20xx, versus all the time lost to discovering new and wonderful bugs in this, the latest iteration. Rather than always rushing to regurgitate the latest juicy morsels to the impatient, hungry chicks, why not slow it down a bit? Mummy and daddy bird could eat together and stop off at the shops on the way home to pick something up for the kids. They could even get some groceries in and prepare something fresh using the cookery book by that fashionable celebrity chef when they get home. Exactly how this applies in the context of software development I’m not sure, but the point is that development is so frenetic that there is barely a window of stability, when everything stops moving around for long enough, for anyone to measure whether all these new, whizzy patches and add-ins are really helping as much as the vendors claim.

Now of course the image of a garden bird in chef’s whites, presenting Jamie Oliver-style to mums wanting to improve their kitchen repertoire, is absurd. But then they were saying that about a Donald Trump presidency, and look what happened there. Slow down the product cycle, I say, and aim to provide a competent core product that suits the majority of day-to-day users, instead of chasing some BIM panacea that has so far largely failed to materialise. Then the true productivity gains can be gauged over the longer term, in a stable and reliable way, without all the bluster of the bestest and latest update muddying the waters. Also, I like the idea of little birdy chefs.

Digital Twins – The New, Unwanted Buzzword

First there was “Virtual Building”, a Graphisoft trademark they used to differentiate ArchiCAD from what was referred to as “Flat CAD”, the 2D drafting-board analogue from Autodesk, which was launched the same year as the visionary Hungarian BIM tool’s first iteration, Radar CH.

Next there was BIM, Building Information Modelling, coined by Autodesk in its 2002 white paper, which is attributed to Phil Bernstein.

Software vendors love a hype wave to follow, re-branding their tired old products with the latest buzzword to try to shift more licences by scaring customers into thinking they will miss the wave and lose out to competitors. “SuperDuperCAD-X, the market-leading CAD/Virtual Building/BIM/Level 3/VDC/Digital Engineering/DfMA authoring tool, is now leading the way in Digital Twin capability!”

For years they were banging on about how BIM would reduce costs by 50%, but all that happened was that a “BIM Team” was created in the business and then treated like “drafters”, glorified tracers (if you don’t know what a tracer is, ask one of the old guys in your organisation). There was zero engagement between the engineering teams and these BIM jockeys; designs were still being done in 2D, even drawn up in 2D CAD before being handed over to the BIM guys to do their stuff. You might as well have thrown the money down the toilet.

BIM added costs without benefits. And now there are Digital Twins.

The thing is, we already have digital twins in scenarios where they make commercial sense and add value. A digital twin of a building, though, is just bollocks. The cost of setting up and maintaining a digital twin, with all the sensors and stuff, is significant, and to be honest you don’t really need a fully detailed BIM model to make use of one. Just some floor plans and PDFs.
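And honestly, the “dashboards and databases” version really is that mundane. Here is a throwaway sketch in Python, purely to illustrate the point: the room numbers, sensor names and SQLite table below are all made up, and a real building would be fed from a proper BMS or IoT platform rather than a script, but notice how the “twin” knows nothing about geometry, just room IDs that match the numbers printed on the floor plan PDFs.

import sqlite3
from datetime import datetime, timezone

# A toy "digital twin" with no BIM geometry at all: sensor readings keyed
# to room IDs. Everything here is illustrative, not a real system.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE readings (
        room_id  TEXT,   -- matches the room number printed on the floor plan PDF
        sensor   TEXT,   -- e.g. 'temperature', 'co2'
        value    REAL,
        taken_at TEXT
    )
""")

# Pretend the building management system has pushed a few readings in.
now = datetime.now(timezone.utc).isoformat()
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?, ?)",
    [
        ("L2-204", "temperature", 23.4, now),
        ("L2-204", "co2", 880.0, now),
        ("L3-310", "temperature", 19.1, now),
    ],
)

# The "dashboard": current temperature per room, no 3D model required.
for room, temp in conn.execute(
    "SELECT room_id, value FROM readings WHERE sensor = 'temperature' ORDER BY room_id"
):
    print(f"{room}: {temp} °C")

No parametric families, no federated models, no LOD arguments; just readings keyed to rooms that a dashboard can plot.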

The CAD companies promoting Digital Twins will be really pissed when the market decides it doesn’t need all that BIM modelling bullshit, just dashboards and databases.