You are probably aware of those weird benches that are sometimes placed outside, the ones full of armrests and/or spikes, often disguised as artistic expression but ultimately engineered to prevent homeless people and other undesirables from sleeping on them. The bench is fulfilling its official function while, at the same time, working against everyone who tries to use it for anything beyond its narrowest definition.
Somehow I think about these benches a lot. And it mostly happens when I’m stuck waiting for my work machine to finish doing something that has basically nothing to do with the thing I actually want to do.
Being a Contractor
To be clear, I enjoy my job. However, being a contractor in the enterprise world is its own layer of hell. I don’t get to choose my tools, I don’t get to choose how I want my environment to be set up or how our pipelines should work. It sucks.
I show up, get handed credentials to whatever system is already in place, and have to make do.
Almost always, for convenience, I have access to the environment only through a VDI (Virtual Desktop Infrastructure) instead of a physical laptop. This VDI is always set up by people who have never met me, configured by policies written years before my project even existed, maintained by people whose priorities have absolutely nothing to do with whether I can compile my code or not.
On top of that, every environment is different. Whatever I developed and got used to at Client X is now worthless because Client Y has a completely different stack of policies, security tools, preinstalled apps and restrictions (these are usually discovered at the worst possible moments).
Confluence? Docs? Wouldn’t it be nice to have a page titled “Hey, here is how to set up your personal environment”? Well, you can write one, only to realize six months later that it’s obsolete, because there’s a new client in town and half of the process has changed.
The Spikes Make It Harder
I am what someone might call an experienced Linux user. On my home machine I run arch+i3wm, I love my terminal, I rarely use my mouse, etc. This is to say that I have a decent understanding of what it means for a computer to behave properly.
- I press a key -> my pc registers it
- I save a file -> the file gets saved
- I copy text to my clipboard -> the text is now in my clipboard

I’m sure you get the point, as these are absolutely basic expectations. Which makes it even more frustrating when I can’t rely on them anymore.
At one of my previous clients I had access to a VDI (clearly Windows, it’s always Windows). In this case I’m pretty sure the V stood for Very Shit[1], because everything that should’ve taken milliseconds instead took seconds, and everything that should’ve taken seconds instead took long enough for me to completely forget what I was even doing, rethink my life choices, and wonder if this is just how things are now.
It just happens that our projects often have thousands of small files. And, yes, I know that this is partially a problem caused by Windows Defender, but the fact that my environment is configured to perform real-time security scanning (in sync mode!) on every single file operation absolutely does not help.
On top of that, we can add an NFS layer (because of course there is one), which makes everything worse. I won’t even go into detail on the times when my entire storage was on OneDrive (did you know that you can only delete 5k files at a time?), but even when the storage is supposedly an SSD, I somehow end up with IOPS in the hundreds, making me wonder why I’m getting speeds comparable to an HDD from the early 2010s and where it all went so wrong.
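Just to show how easy it would be to put a number on this: here’s a minimal sketch of the kind of probe I mean, timing a few thousand tiny file writes and reads and backing out an effective ops-per-second figure. The file count and payload size are arbitrary placeholders, and you’d run it in whatever directory your workspace actually lives on (NFS mount, OneDrive folder, supposedly-an-SSD).

```python
import os
import tempfile
import time

# Rough-and-ready probe: write and re-read a few thousand tiny files
# and report the effective operations per second. All numbers here are
# placeholders, not measurements from any real client environment.
FILE_COUNT = 2000
PAYLOAD = b"x" * 512  # ~0.5 KB, roughly the size of a small source/metadata file

with tempfile.TemporaryDirectory(dir=".") as workdir:
    start = time.perf_counter()
    for i in range(FILE_COUNT):
        with open(os.path.join(workdir, f"probe_{i}.txt"), "wb") as f:
            f.write(PAYLOAD)
    write_elapsed = time.perf_counter() - start

    start = time.perf_counter()
    for i in range(FILE_COUNT):
        with open(os.path.join(workdir, f"probe_{i}.txt"), "rb") as f:
            f.read()
    read_elapsed = time.perf_counter() - start

print(f"writes: {FILE_COUNT / write_elapsed:.0f} ops/s")
print(f"reads:  {FILE_COUNT / read_elapsed:.0f} ops/s")
```

On an unremarkable local SSD the write figure should comfortably land in the thousands; when it comes back in the hundreds, the storage stack (or whatever is scanning in front of it) is the bottleneck, not the build tool.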
But the real lottery is the clipboard. Sometimes when I copy something, my instance just freezes. “The application is not responding”. Can’t even Alt+Tab. And it doesn’t happen every time, which is arguably even worse, since it feels like the software equivalent of a jumpscare. I press Ctrl+C and then I get to sit there for 10 to 15 seconds while some security software is probably analyzing my text for threats (never mind that the file I’m copying from was already scanned five minutes ago).
And now, try to imagine all of this, combined, while I’m working on giant Eclipse projects that have to update, refresh and rebuild every time I save something.
The Saturday Incident
There’s another small story about a time we were deploying to prod and I think it indicates something important about who these environments are designed for.
A while back we had a go-live planned for a client, and the deployment was scheduled for a Saturday since the offices were closed for the weekend. We were in a bit of a rush, but for everything to go smoothly we were missing just a few things, one of those being a seven-hour-long migration process. For various reasons (some more valid than others) this process had to be started and finished on one of our VDIs.
Four hours into a seven-hour process, we got a warning saying that the machine would be restarted.
And this wasn’t a standard Windows warning! Those were deliberately disabled. This was because of some scheduled maintenance that happens every weekend. Mandatory and unskippable! We urgently called the VDI provider, only to learn that we were out of luck and there was nothing that could be done.
The system had been configured by someone who reasonably assumed that nobody would be doing serious work on a weekend, because in the world where those policies get written, work happens Monday to Friday. We weren’t told about the maintenance (because why would we be?). We didn’t know what we didn’t know.
Thankfully there were no mandatory restarts on Sunday, but the fact that we had to discover this by crashing into it is a decent indicator of where contractors often fall on the priority list.
The Museum of Approved Tech
There’s a library we use at work. It’s not particularly common, but it’s also not incredibly obscure and there are no better alternatives for our use case. In one of my previous engagements we faced a bug with it that we couldn’t explain.
I checked, and it turned out we were using a version from what feels like the distant past (2012!). There had been four (!) major version changes since then, and hundreds, if not thousands, of patches and fixes, many of them vulnerability fixes.
The approval process for new software versions takes a month on a good day. It involves multiple teams, extensive paperwork, and a review process that was clearly designed to be thorough rather than fast.
So I often find myself in a situation where, technically, we are compliant, running only approved and vetted software, while simultaneously being objectively less secure than we would be if someone just let us update our dependencies[2], or if the security process were simpler and automated. The security theatre is immaculate.
The actual security is questionable, but hey, at least the paperwork is in order.
The Question
Here is what I keep coming back to, the question I genuinely don’t have an answer to and have never seen anyone in a position of authority try to answer: how much productivity loss is acceptable?
Because there clearly is an answer, even if it’s implicit. Every single policy decision embeds an assumption about tradeoffs. When you mandate real-time security scanning on every file, you are (knowingly or unknowingly) deciding that whatever security benefit you get outweighs the productivity cost. When VDIs are configured to restart on weekends (without exception), someone has decided that maintenance reliability matters more than weekend work. When your software approval takes a month or more, you are deciding that thoroughness matters more than agility.
And I agree that, in theory, these are all defensible positions. But I have never seen anyone actually do the math. I have never seen a document or any research that says “We estimate that this security policy costs us X developer hours per year, and we believe that this is justified because it prevents Y incidents of severity Z”. The costs are hard to measure and scattered across hundreds of developers, each of them losing a few minutes here and there, which makes them invisible.
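For what it’s worth, the math doesn’t even need to be sophisticated. Here’s a deliberately crude back-of-the-envelope sketch; every number in it is a made-up placeholder, not a measurement from any real organization:

```python
# Back-of-the-envelope cost of "small" daily friction.
# All inputs are hypothetical placeholders, not measurements.
developers = 200           # people affected by the policy
minutes_lost_per_day = 15  # waiting on scans, freezes, rebuilds
working_days = 220         # per year
hourly_cost = 60.0         # fully loaded cost per developer hour, in EUR

hours_lost = developers * minutes_lost_per_day / 60 * working_days
annual_cost = hours_lost * hourly_cost

print(f"{hours_lost:,.0f} developer hours per year")
print(f"~{annual_cost:,.0f} EUR per year")
# With these placeholder numbers: 11,000 hours, ~660,000 EUR per year.
```

The point isn’t the exact figure; it’s that a number like this could be put next to the expected cost of the incidents the policy prevents, and as far as I can tell nobody does that.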
But often that’s also how software becomes unusable! That’s how things turn to shit! Small things that might look acceptable in isolation always need to be measured and considered as part of the whole context.
The result is an environment that optimizes for things that can’t be measured, by people who don’t use the environment, at the expense of things that can only be felt by people who do. And I want to be very clear about one thing: this is not malice. Nobody sat down and decided to make my life miserable. Every single decision was probably reasonable in isolation and was made by someone with limited information trying to solve a specific problem they were responsible for.
But the cumulative effect is hostile architecture. Even if the hostility towards developers is not intentional. The bench has armrests, and you can technically sit on it.
You just can’t actually get comfortable enough to do your best work.
The Point
I don’t have a solution here. The systems are what they are because they emerged out of weird and complicated corporate incentives that prioritize compliance and auditability over developer experience. Those are clearly not going to change because I wrote a blog post about how my clipboard sometimes freezes.
But I think it’s worth naming the thing and pointing out that environments are not neutral: they encode assumptions about what matters and who matters. And the people who pay the costs for those assumptions are rarely the people who made the decisions.[3]
How much productivity loss is it actually worth? Has anyone done the math? Does anyone know how to do the math?
The bench is slanted, it has armrests and spikes, and someone put them there on purpose, and the official justification probably made sense in some meeting room somewhere. But I’m the one trying to sit on the bench, and sadly it just feels like someone doesn’t want me to be comfortable.
Footnotes

[1] (And the ‘P’ stands for Performance) ↩

[2] To be clear, I am aware of the security implications and I’m not advocating for indiscriminate dependency updating. But surely there must exist a compromise in which our libs aren’t 10 years old. ↩

[3] Or rather, if the policy makers are also the ones paying the developers, then there is clearly a cost for them as well. But generally speaking it’s implicit and not direct enough to be immediately noticeable. ↩