I think of it like this: in, say, the 1990s, "computers" / "tech" was fundamentally about doing _real-world_ things more efficiently. Word processing is a step up from a typewriter, desktop publishing is more efficient than typesetting, email is faster than fax or postal mail, spreadsheets allow for more efficient calculations and accounting, and obviously databases and industrial systems allow for efficient operations of businesses, warehouses, airlines, etc.
A few decades later, all the obvious, real-world, low-hanging-fruit applications of technology had been filled, and "tech" turned into the weird, more self-contained world of "the internet", social media, advertising, the attention economy, bitcoin, high-frequency trading, AI, where it's really just your computer/algorithm fighting someone else's computer/algorithm, and rather detached from the offline world.
The microservices fad/wave was where people seemed to lose their minds. The "Solving Imaginary Scaling Issues (at scale)" meme encapsulated it for me. Most programmers of the time seemed far more interested in being architecture astronauts than making something useful. The overengineered hosting setups were also a major impediment to anyone who just wanted to make something useful.
Fortunately, AI-assisted coding seems to be wresting coding back from developers and re-empowering domain experts and systems analysts. A full recession will probably shake out a lot more of the waste in software development.
> Most programmers of the time seemed far more interested in being architecture astronauts than making something useful.
Of course they were; being an architecture astronaut got you hired.
You have to get past the resume filters and the architecture interview to even have the chance to work on the internal enterprise tool no one uses anyway. People just respond to what will grow their career.
It was always fun to work at companies doing microservice architecture for applications that would scale to thousands of users. Thousands. You could probably run it from a laptop, DB and all.
+1, came here to say this
I think what you're describing is just a consequence of software companies becoming very large. Work for a small business and you'll be back writing utilitarian code.
Largely agree, given your definitions and clarifications, but I see some of these as correlated issues rather than direct causes of that programming approach's death. The gap between programmers and end users, the scope of 'users' expanding to include other programmers, and the increased complexity creating more abstract, soft-skill code delivery/management roles are all co-existing issues. They didn't cause the death directly; it's more a co-morbidity situation: they didn't help, but they didn't cause it. I'd say the primary cause is the cost and complexity of operations, forcing the perspective shift from 'help at least one actual human being' to 'help at least <MINIMUM VIABLE MARKET SHARE> of users/developers'. As an aside, I'd also argue that well-designed frameworks and tools directed at devs are still abstractly utilitarian, because if they didn't exist, a human would have to do the programming or do the work manually, so they directly help at least one human.
Absolutely agree, for the most part. Luckily I think the tide is going out and developers are going to be forced to start actually solving problems in their domain out of necessity.
No more easy money = no more engineering for engineering’s sake, and companies are increasingly wise to the fact that architecture astronauts are a liability and cloud is 95% distraction meant to absorb time, energy, and money from IT orgs within large companies.
I’ve personally switched out of devops and onto a domain-aligned software team within my company. I am absolutely fed up with how wasteful and distracting the state of devops is to the actual business.
> no more engineering for engineering’s sake
I'm not sure many successful engineering orgs did much of that but also the environment our creations live in is much different now.
It was a huge task just to get our programs to run at all not very long ago. When I first started coding I was working on some code for a VLSI problem and I spent like a month just figuring out how to get the problem to fit on disk (not main memory, but on disk!). Now there are similar things that run on a laptop [0]. And it's not like I'm unique in having to come up with crazy solutions to problems tangent to the primary problem in this space. [1]
Now the infrastructure for our code is amazing, and the machines that execute it abound in resources, especially in the cloud. Now that the yoke has been cast off, it makes sense that more time is spent on solving the actual problem you set out to solve in the first place.
[0] https://github.com/The-OpenROAD-Project/OpenLane [1] How Prince of Persia Defeated Apple II's Memory Limitations https://www.youtube.com/watch?v=sw0VfmXKq54
<meta>
"Let me know in comments"
I seem to be seeing a lot more submissions to Ask HN that are basically blog posts. I'm not trying to police anyone, and if the mods are happy with this then ok. But I'm curious about whether there's actually a trend here, and whether HN users lack alternative places to post their thoughts. Something to do with Twitter going down the drain?
I still write utilitarian programs all the time. If the code I’m writing isn’t addressing a need and making someone’s life a little easier, what am I doing?
The world doesn’t need another generic JavaScript framework, but a lot of people have little annoyances every day that can be made better with code. This is my favorite code to write. Nothing is that impressive, technically speaking, but it changes how people work and makes their jobs suck a little less. I find this type of work much more fulfilling than working on some silly integration between 2 systems that will be gone in 3 years.
Since this is HN, I'm gonna pick a nit.
> A clever and witty bash script running on a unix server somewhere is also not utilitarian coding, no human ever directly benefited from it.
Back around 2010, my friend Mat was doing cloud consulting. He wrote some code to screen-scrape the AWS billing and usage page for an account to determine how much had been spent day-over-day. This was, of course, all orchestrated via a bash script that iterated through clients and emailed the results to them (triggered by cron, of course).
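In rough, hypothetical terms, the whole thing was the shape of something like this (a made-up sketch, not Mat's actual code; the clients file and the fetch_aws_spend scraper are inventions):

    #!/usr/bin/env bash
    # Hypothetical sketch of the orchestration described above: iterate over
    # clients, get yesterday's spend for each, and email it to them.
    set -euo pipefail

    CLIENTS_FILE=/etc/spend-report/clients.csv   # assumed format: name,email,aws_account_id

    while IFS=, read -r name email account_id; do
      # fetch_aws_spend stands in for the (hypothetical) screen-scraper that
      # returns day-over-day spend for one AWS account.
      spend=$(fetch_aws_spend "$account_id")
      printf 'Hi %s,\n\nYour AWS spend yesterday was %s.\n' "$name" "$spend" \
        | mail -s "Daily AWS spend report" "$email"
    done < "$CLIENTS_FILE"

    # Triggered by cron, of course, e.g.:
    #   0 7 * * * /usr/local/bin/spend-report.sh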
He realized he had a startup on his hands when something broke and clients started emailing him asking where their email was. Cloudability was born out of that.
I'd say that both the Ruby and bash code involved count as pretty utilitarian despite running on a server and not having a direct user interface.
I'm gonna up the nit.
Several years ago, I was the sysadmin/devops of an on-premises lab whose uplink to the rest of the company (and the proxy by extension) was melting under the CICD load.
When that became so unbearable that it escalated all the way to the top of my oversaturated backlog, I took thirty minutes from my hectic day to whip up a Git proxy/cache written in a hundred lines of Bash.
That single-handedly brought the uplink back from being pegged at the redline, cut the time spent cloning/pulling repositories in the CICD pipelines by over two-thirds, and improved the workday of over 40 software developers.
That hackjob is still in production right now, years after I left that position. They tried to decommission it at some point thinking that the newly installed fiber uplink was up to the task, only to instantly run into GitHub rate limiting.
It's still load-bearing and strangely enough is the most reliable piece of software I've ever written. It's clever and witty, because it's both easy to understand and hard to come up with. The team would strongly disagree with the statement that they didn't directly benefit from it.
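For anyone wondering what such a thing looks like, the core is basically a pull-through mirror kept warm from cron. A rough, hypothetical sketch (not the actual script; the repo list, paths, and git-daemon serving are all assumptions):

    #!/usr/bin/env bash
    # Hypothetical sketch of a pull-through Git cache: CI jobs clone from a
    # local mirror over the LAN instead of hammering GitHub through the uplink.
    set -euo pipefail

    CACHE_DIR=/srv/git-cache
    REPOS=(
      "https://github.com/example-org/service-a.git"
      "https://github.com/example-org/service-b.git"
    )

    mkdir -p "$CACHE_DIR"

    for url in "${REPOS[@]}"; do
      name=$(basename "$url" .git)
      mirror="$CACHE_DIR/$name.git"
      if [ -d "$mirror" ]; then
        # Refresh the existing bare mirror from upstream.
        git -C "$mirror" remote update --prune
      else
        # First run: create a bare mirror of the upstream repo.
        git clone --mirror "$url" "$mirror"
      fi
    done

    # Run periodically from cron to keep the mirrors warm. CI pipelines then
    # clone from the cache host, e.g. git clone git://cache-host/service-a.git,
    # served by: git daemon --base-path="$CACHE_DIR" --export-all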
It’s best to think of software as just “content” inside the VC-funded software ecosystem.
Interesting take. Most code I see floating around on the web is either useless or a copy of something else just written in a different way. I don't think this is going to end soon; if anything, "vibe coding" will continue to make it worse.
At some point, someone will start calling it out. Gen Z may not, as they've taken these things for granted as a "way of life", but Gen A might if they ever start thinking critically and out of the box.
I think you are mourning a world that never existed.
Before agile we had waterfall. Developers didn't interact with users, they got handed requirements by people who didn't know what was even possible.
It's true that software has become more abstract over time, as the need to standardise how things work overrides a bespoke approach most of the time.
Why isn't a bash script running on a server utilitarian? I have a dozen cron jobs on my server doing different things for me. Why am I not "benefiting" from them in this definition?
Somewhere a bash script on a server might be calculating the interest my bank owes me, I'm directly benefiting from that too.
I have some scheduled jobs that generate reports that go out to people weekly. Without these they would be stuck manually trying to look things up and tracking it in Excel on a weekly basis. I could give it to them to run themselves, but it’s much better for it to just show up in their inbox.
In other cases I have code that fixes error conditions when they arise. No one has to run it manually, but if it didn’t exist, they would end up with the ticket to fix it manually. Even if they forget it’s there, it is giving them back time with each run.
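That second pattern is usually just check-and-remediate on a timer. A hypothetical sketch (the stale-queue condition and the service restart are stand-ins for whatever the real failure mode is):

    #!/usr/bin/env bash
    # Hypothetical sketch of an automated remediation job: detect a known bad
    # state and fix it before anyone has to open a ticket.
    set -euo pipefail

    # Assumed failure mode: jobs sitting in the queue for over an hour mean
    # the worker process has wedged.
    stuck=$(find /var/spool/app/queue -name '*.job' -mmin +60 | wc -l)

    if [ "$stuck" -gt 0 ]; then
      systemctl restart app-worker.service
      logger -t auto-heal "restarted app-worker: $stuck stale jobs found"
    fi

    # Scheduled, e.g.: */15 * * * * /usr/local/bin/auto-heal.sh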
Have you missed the entirety of the evolution of software development?
What is WinForms itself if not "non-utilitarian"? Most of an OS is non-utilitarian. Compilers, libc, databases, web servers, browser APIs, ffmpeg, OpenGL, Unity, etc., etc., etc...
2014 is a wild year to pin the end of "utilitarian" programming on, since all of the things you appear to complain about already existed by then. If anything the beginning of making programs for other programs and programmers was 1951/52 with the invention of the compiler. It's been downhill from there.
What became incredibly obvious to me, after working in software and then as a mechanical engineer, was that software has absolutely no engineering culture.
Software has a deeply ingrained craftsman culture, which emphasizes personal flavor, unique approaches, and stylistic debates over engineering. Surprisingly, this gets worse in large organizations, where adherence to stylistic choices, specific tools, and workflows supersedes engineering at every point. Software is still full of fads: every couple of years a new or old flavor is found, which is now the "right" thing to do, and which will then be defended and attacked until a new fad is found.
Not all software is consumer software or web dev. The software used to control the space shuttle for example was created by an organization with a real engineering culture.
I’d wager it was created by scientists/physicists too and not “just” developers.
Indeed. Embedded software has a very big EE influence, which comes with a real engineering culture.
The stark difference between embedded development, especially in aerospace, and "normal" software development is really my point.
I mean, an example of truly utilitarian software that solves a nontrivial problem would be good. Abstraction has gotten a bad name, and you can certainly go overboard with it, but it's also a tool that you often need to solve more complicated problems.
Also, I think it misses a bit where programming came from. The idea of general computation was an abstract mathematical plaything long before it had concrete use cases.
We should also remember that most utilitarian software is built on top of those abstractions.
If I make something utilitarian in Apple Shortcuts, my “code” is sitting on top of countless layers of abstractions and frameworks which make it all possible, which are also abstracted away behind a drag and drop interface.
I think that's an interesting point that is often neglected, though: not all abstractions are created equal.
In a browser, the DOM, web APIs, etc. are abstractions - so are frameworks like React, etc. However, there is usually a lot less anger and attention directed at the former than at the latter.
My theory for why this is the case is that the former is a "hard" abstraction boundary while the latter is "soft": for a web site, it's intentionally not possible to peek below the abstraction layer of the web APIs. In exchange, however, the browser puts in a lot of effort to make sure that web devs don't have to go below that layer: it provides watertight error handling for everything below it and rich debugging tools that work with the concepts given by the abstraction.
In contrast, any frameworks and libraries that build on top of that are "soft", because developers can - and have to - look into the code of the framework itself if some problem requires it.
I'm with you, pal, but the underlying problem is a decline in ethics as technology development has more intimately paralleled the neoliberal expansion. In 60 some-odd years we've gone from the government giving money to institutions and inventors to create novel technologies to privatizing most parts of the development of our technical infrastructure. Now instead of SRI, Xerox PARC, the University of Utah (Sutherland), and many others, we have concentrated capital and what are effectively oligarchic trusts at the core of our development strategy. This happens as ethics is whittled down bit by bit through the efforts of groups like the Heritage Foundation and other "libertarian" special interest groups, the unresolved social conflicts that created poor whites out of Reconstruction, and the disdain the superstitious Christian right has had for popular culture since even before the post-war turn.
Computer ethics will not improve en masse in the United States in the years to come. We will get more privatization of the public good. We will get more protections for the monopolists. Social media manipulation and mass surveillance are just the beginning.
"How I Learned To Stop Worrying And Love The Palantir."
I think it comes from separating domain experts from writing software themselves. There used to be a lot of great software for different niche things, written by people who were somewhat experts in those niches and happened to write code as well. They knew what calculations people in the field would be doing, and simply wrote the function and wrapped it in some tool. Immediately useful software.
A lot of "academic" code that is pilloried by "real" software engineers is actually a great example of this. Is it the most performant? No. Could a random person off the street make use of this? No. But for those in the field, they know exactly what this tool is conceptually doing even if they don't know how it is made exactly. It is like a very specialized tool for a very specialized tradesman.
What is interesting about the shift away from utilitarian programming is that these "thought leaders" and other middlemen now find themselves in positions to set the narrative, essentially thanks to monopolizing the space, and to change the very meaning of work to suit the software they happen to peddle rather than the other way around. We saw this with enterprise software, and now we see it with AI tooling shoehorned into that enterprise software.