Yeah it's interesting, unless I lean hard on them, AI coding agents will tend to solve problems with a lot of "hedging" by splitting into cases or duplicating code. They are totally fine with infinitely many special cases, and unless you push for it, they will solve most problems with special cases and not generalise or consolidate (Gemini and Claude Code at least both seem to have this behaviour).
I feel like this comes about because it's the optimal strategy for doing robust one-shot "point fixes", but it comes at the cost of long-term codebase health.
I have noticed this bias towards lots of duplication eventually creates a kind of "ai code soup" that you can only really "fix" or keep working on with AI from that point on.
With the right guidance and hints you can get it to refactor and generalise - and it does it well - but the default style definitely trends toward "slop" in my experience so far.
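A toy example of the pattern (Python, entirely hypothetical): the default "point fix" style stacks one branch per case, while the consolidated version you usually have to ask for turns the cases into data:

    # Default agent style: one branch per request, formatting logic duplicated.
    def format_price(amount, currency):
        if currency == "USD":
            return f"${amount:.2f}"
        elif currency == "EUR":
            return f"€{amount:.2f}"
        elif currency == "GBP":
            return f"£{amount:.2f}"
        else:
            return f"{amount:.2f} {currency}"

    # Consolidated style you have to push for: special cases become data.
    SYMBOLS = {"USD": "$", "EUR": "€", "GBP": "£"}

    def format_price_consolidated(amount, currency):
        symbol = SYMBOLS.get(currency)
        return f"{symbol}{amount:.2f}" if symbol else f"{amount:.2f} {currency}"

Both behave the same; the first just grows by one branch per ticket until it's soup.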
This thing feels pretty weird to me. I'm guessing it's an attempt at organizing some sort of European Handmade event, and trying to keep it small.
But between the sparse website, invite-only and anonymous organizers, it just feels like it's emphasizing the reactionary vibes around the Handmade/casey/jblow sphere. Like they don't want a bunch of blue-haired antifa web developers to show up and ruin everything.
Glad to see they got Sweden's own Eskil Steenberg though. Tuning in for that at least.
That's pretty much what it is: it's a reaction to an implosion that happened last year in the Handmade Network, related to the Handmade Seattle conference, which caused the conference organizer and the community leads to go their separate ways.
> Like they don't want a bunch of blue-haired antifa web developers to show up and ruin everything
There's a reason web developers, and the ecosystem/community around them, are the butt of many jokes. I don't think it's at all surprising that the injection of identity politics into the software industry has had a negative effect on quality.
> I don't think it's at all surprising that the injection of identity politics into the software industry has had a negative effect on quality.
That's a pretty broad claim. This conference could be in response to a perceived negative effect on quality, but claiming that as a fact seems hard to back up to me
I've noticed that some of these types tend to be well-meaning young people (often girls) who are super excited to have scored a job doing developer outreach for $BIGTECH.
It's a clever political tactic because a 50-year-old white male middle manager at Microsoft trying to become a board member on an open source foundation would face a lot more hostility than a 20-something girl who pushes all of the diversity buttons.
It mirrors the rather successful marketing strategies for a string of movies including the Ghostbusters reboot and Barbie, among others. i.e. "There's a certain kind of person who doesn't like our latest corporate offering...". Who wants to be that person?
Yep, preemptively destroying the reputation of whoever opposes you is a common and ancient tactic of bullies at all levels, from schoolyards to fascist governments.
Look at the people who are pushing for politicizing software development, and you'll see they are always getting money out of the deal.
> Like they don't want a bunch of blue-haired antifa web developers to show up and ruin everything
You write this like this is a bad thing.
I just came to a conference to learn some cool new tech, but instead got lectured about my transphobia, told that my database is systemic discrimination, and that my HDD being named "slave" means I burn crosses in my free time - even though I have no family relation to America at all.
I hate talk titles of this form: "Most of your projects are stupid. Please make some actual games." So annoying. I know it's not personal, but I'm sure a better title exists for all talks that choose this form. Why do you have to insult the audience?
Sounds good, but unless it advocates for HR practices that retain talent, and corporate practices that incentivize Quality, it probably won't result in any changes.
Personal Quality Coding practices have been around for as long as software has been a thing. Way back when, Watts Humphrey, Steve McConnell, and Steve Maguire wrote books on how to maximize personal Quality. Many of their techniques still hold true, today.
But as long as there are bad people managers and short-sighted execs, you'll have shit quality; regardless of who does the work.
It bothers me that this is from Sweden, a most inclusive country, while being pretentiously exclusive. I live in Sweden and wouldn’t mind going to a small conference on my holidays, but unfortunately I can’t find the “charming town” where this is supposedly taking place nor know how to find Sam, Sander and Charlie.
So far this has been great; Casey Muratori's talk on the history of OOP has been quite insightful. Will need to revisit when it's on YT.
What is needed is more evidence-based software engineering. Statistical methods applied to datasets correlating issue trackers with code ASTs to show us exactly which ways of coding are correlated with longer issue times, frequent bugs etc.
I sometimes wonder if there could be an optimal number of microservices. As far as I know no one has connected issue data to the number of microservices before. Maybe there's an optimal number like "8" which leads to a lower number of bugs and faster resolution times.
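A sketch of the kind of analysis I mean (Python; the CSV and its columns are made up here - you'd export something like it from your own tracker):

    # Correlate service count with bug resolution time across projects.
    # Assumes a hand-exported CSV with columns: project,num_services,resolution_days
    import csv
    from collections import defaultdict
    from statistics import median

    by_count = defaultdict(list)
    with open("issues.csv") as f:
        for row in csv.DictReader(f):
            by_count[int(row["num_services"])].append(float(row["resolution_days"]))

    for n, days in sorted(by_count.items()):
        print(f"{n:3d} services: median resolution {median(days):.1f} days (n={len(days)})")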
Depending on who you ask the answer is either "It depends completely on the task, and it is in any case much more important that you divide your application in the right places than exactly how many bits you end up with", or "1".
If you ask Amazon then the more the merrier, because the number of microservices is effectively a multiplier on the bill.
For a non-engineer (business) person, the case of "engineering quality vs move fast and break things" sounds more like "slow & expensive vs fast". The choice is obvious.
It’s not that at all though, the adage “slow down to speed up” applies, because high quality engineering will inevitably increase throughput in the long run.
You should challenge this idea in your internal monologue. Learn a bit more about technology and how it's made. "Fast" in most cases most definitely does not equal cheap, especially over the long term.
Seems like a waste of time to me, especially in this age of AI slop somehow passing as quality. Just another excuse to drink/network/party on company’s dime.
However, I would be interested in establishing a union for technologists across the nation. Drive quality from the bottom up, form local chapters, collectively bargain.
I think I've finally figured out just what it is that annoys me about the "software quality" crowd.
Quality is a measurement. That's how it works in hardware land, anyway. Product defects - and, crucially, their associated cost to the company - are quantified.
Quality is not some abstract, feel-good concept like "developer experience". It's a real, hard number: how much money the company loses to product defects.
Almost every professional software developer I’ve ever met is completely and vehemently opposed to any part of their workflow being quantified. It’s dismissed as “micromanagement” and “bean counting”.
Bruh. You can’t talk about quality with any seriousness while simultaneously refusing metrics. Those two points are antithetical to one another.
1. It is partly because the typical metrics used for software development in big corporations (e.g., test coverage, cyclomatic complexity, etc) are such snake oil. They are constantly misused and/or misinterpreted by management, and because of that cause developers a lot of frustration.
2. Some developers see their craft as a form of art, or at least an activity for "expressing themselves" in an almost literary way. You can laugh at this, but I think it is a very humane way of thinking. We want to feel a deeper meaning and purpose in what we do. Antirez of redis fame has expressed something like this. [0]
3. Many of these programmers are working with games and graphics and they have a very distinct metric: FPS.
1. Totally agree that the field of software metrics is dominated by clueless or outright bad actors. I can say with complete certainty that I do not know the right way to measure software quality. All I know is that quality is handled as a metric in most hardware companies, not an abstract concept. When it's talked about as such an ephemeral thing by software people, it strikes me as a bit disconnected from reality. (If I were going to try, I'd probably shoot for bugs per release version, or time from first spec to feature release - rough sketch below.)
2. With respect: that's a bit of an exceptionalist mindset. There's nothing precious about software's value to a business. It's a vehicle to make money. That's not to say craft isn't important - it is, and it has tangible impact on the work. The point I'm making is that my boss would laugh me out of the room if I told him "You can't measure the quality of my electronics designs or my delivery process; it's art."
3. I’ve never heard of FPS but I’m very interested in learning more. Thanks for sharing the link.
Edit: oh ok duh yeah of course you could measure the frame rate of your graphics stack and get a metric for code quality. D’oh. Whoops. XD
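For what it's worth, the kind of hard number I mean is trivial to compute once you track the inputs - a toy sketch with made-up data:

    # Escaped defects per release as one candidate quality metric (made-up data).
    releases = {
        "1.0": {"shipped_features": 12, "escaped_bugs": 9},
        "1.1": {"shipped_features": 8,  "escaped_bugs": 2},
        "1.2": {"shipped_features": 10, "escaped_bugs": 4},
    }

    for version, r in releases.items():
        rate = r["escaped_bugs"] / r["shipped_features"]
        print(f"{version}: {rate:.2f} escaped bugs per shipped feature")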
No it isn't, as in it literally isn't. Quantification is the process of judging something numerically, objectively, by measurement. Qualification is just the opposite: judging something by its nature, essence or kind.
Software quality, like all kinds of quality, is always a subjective and experiential feature. Just like when someone says this piece of furniture is a high-quality, handmade chair: in all likelihood they haven't performed a numerical analysis of the properties of the chair, they're expressing a subjective, direct sentiment.
The handmade movement in software was exactly about this: putting the focus on the personal, lived judgement of experienced practitioners as opposed to trying to quantify software by some objective metric. That's why individual people feature so heavily in it.
Yes, it is. It is a well known field in hardware development, and generally treated as a sub field of manufacturing engineering. It deals with things like testing, sampling, statistics of yield, and process improvement. If you’ve ever done a DFMEA, an 8D report, a Five Whys review, a sampling quality analysis, or a process map, you’ve used tools produced by this discipline.
That’s what I’m trying to tell you and everyone else reading this.
Software, as a profession, collectively talks about quality with all of the rigor of joint passing English majors sharing their favorite sections of Zen and the Art of Motorcycle Maintenance.
Quality has a meaning and a definition and a field of study attached to it. Semiconductors and large scale consumer product manufacturing wouldn’t exist as we know it without this.
> Quality has a meaning and a definition and a field of study attached to it
Yes, and I gave you that definition in the first part of my response. That someone in the semiconductor industry made a poor and colloquial choice of words when he confused qualitative and quantitative processes (the hardware industry deals with the latter) is not evidence to the contrary.
When people talk about software, they're using the terms appropriately. We can objectively talk about the quantities attached to a piece of software - number of dependencies, size, startup time, what have you - but two people will never necessarily agree on the quality of software. Your high-quality software might be junk to me, because that is at its root a subjective judgement. There is not a lot of qualitative or subjective judgement in the world of elementary hardware (it either works or doesn't); there is a lot of it in end-user software.
It is very difficult to make a bad piece of hardware that does very well on a number of metrics; it's very easy to make a shoddy piece of software that performs well on an infinite number of metrics, because nobody has a subjective experience with a transistor but they do with a piece of software. That is why you should use the terms correctly and not apply processes from one domain to the other.
I notice you have not quantified any aspect of your opinion here. Which is not surprising, since your opinion is unrelated to facts, science, experience, or wisdom.
Quality is not a "real, hard number" because such a thing would depend entirely on how you collect the data, what you count as data, and how you interpret the data. All of this is brimming with controversy, as you might know if you had read more than zero books about qualitative research, epistemology, the philosophy, history, or practice of science. I say "might" because of course, the number of books one reads is no measure of wisdom. It is one indicator of an interest to learn, though.
It would be nice if you had learned, in your years on Earth, that you can't talk about quality with any seriousness while simultaneously refusing to accept that quality is about people, relationships, and feelings. It's about risks and interpretations of risk.
Now, here is the part where I agree with you: quality is assessed, not measured. But that assessment is based on evidence, and one kind of evidence is stuff that can be usefully measured.
While there is no such thing as a "qualitometer," we should not be automatically opposed to measuring things that may help us and not hurt us.
I'm disillusioned because it never happens, but purveyors of conferences and books are happy to sell the promised land™ of how "it's really going to be different this time."
Processes, tools, and vigilant diligence seem the most apparent path. Perhaps rehash the 50-year-old debate over professionalization while AI vibe coding is barking at the door, because what could possibly go wrong with even less experience doing the same thing and expecting a different result.
It doesn't happen because building the best software is not the goal of a software engineering job.
If you want to do that on your own time, that's fine - but the purpose of a job is economic. Of course you should write software of some reasonable quality, but optimizations have diminishing economic returns. Eventually, the returns are lower than the cost (in time, money, etc) of further optimizing, and this break-even point is usually at a lower level of quality than engineers would like it to be. Leadership and engineering managers know this and behave accordingly.
Companies can sacrifice everything for "time to market" - optimization, maintainability, security and safety even - but underestimate the costs of doing that. It is actually more a marketing choice than an economic choice.
One can be skeptical about the implied statement that leadership/management knows what it is doing beyond delivering at the (arbitrarily) set time. One definition of Quality is to satisfy a need entirely at the lowest cost in the shortest time, but more often than not, the last term gets 90% of the attention.
They underestimate how long it takes to turn the boat when lowering operational costs becomes the best avenue to improving revenue.
I think it's like switching CEOs when the company grows out of startup mode into a grownup company. What got you here won't keep you here.
> but underestimate the costs of doing that
Do they? I've been fighting against the tide for years until I understood that all the quality-this and quality-that doesn't matter. Sure, it sucks to be on the receiving end of buggy software, but that's where you vote with your money. At work? Finish the task with the least amount of resources and move on.
It’s about momentum. Once you lose it the wheels come off. And at first that’s shipping product, but later on it’s adding new features to the minefield you’ve laid. Until you start clearing the mines your velocity continues to drop.
> Until you start clearing the mines your velocity continues to drop.
I've been doing this for decades; it's never a problem. Either velocity tanks, in which case there's a short period where the company invests in improving it, or people leave.
I've seen projects that failed, or were killed, likely at least in part due to a culture that encouraged poor quality and tech debt. This is preventable, and for no additional up-front engineering effort or time investment.
I think this is the most common failure mode I’ve seen, short of a failure to find a proper product-market fit.
It's just really hard to overstate how much damage a bunch of crappy code can do. Even with the best of intentions. I must say I strongly disagree that this is "never a problem".
Yes, parent says "people leave" as if it is not a problem in itself; you lose the time it takes to train these people, and they probably take some knowledge about the products with them. Or maybe we are actually talking about commodity developers?
But I'm curious about how one prevents this dysfunctional culture.
At my last job the people motivated to fix the clusterfuck were the first to leave. Except me because I’m a masochist apparently.
One of my managers was fond of the phrase, “a project is done when nobody is willing to work on it anymore.” That can be because of a number of reasons, including that the money is gone, or it sucks your will to live.
At work the buyer is not the user, so be sure to hound the buyer when he saddles you with crap.
> Finish the task with the least amount of resources and move on
which is now claude code...
> Companies can sacrifice everything for "time to market" - optimization, maintainability, security and safety even - but underestimate the costs of doing that
If you're working at a company that disregards safety and security, good luck getting them to care about clean code and efficiency.
While I agree with everything you've said, I think you might be making an assumption that quality costs time. In my experience this isn't the case, unless you're starting from a low quality codebase or working with low quality people. A high quality team can produce high quality software in less time than it takes a low quality team to produce low quality software meeting the same functional requirements.
The whole ballgame is making sure you have no low quality people on your team.
This isn't an apples-to-apples comparison.
The quality of your team is more-or-less a pre-existing background variable. The question is whether a team of comparable quality takes longer to produce quality software than hacked-together software, and the answer appears to be "yes". The only way out of this is if optimizing more for code quality *actually helps you recruit better engineers*.
I can put a little data to that question, at least. I run a recruiting company that does interviews, so we have data both on preferences and on apparent skill level.
I went and split our data set by whether an engineer indicated that emphasis on code quality was important to them. Our data suggests that you can probably make slightly better hires (in a raw technical-ability sense) by appealing to that candidate pool:
- Candidates who emphasized quality were slightly (non-significantly) more likely to pass our interview and,
- Candidates who emphasized quality were slightly (non-significantly) less likely to have gotten a job already
The effect is pretty small, though, and I doubt it outweighs the additional cost.
> A high quality team can produce high quality software in less time than it takes a low quality team to produce low quality software meeting the same functional requirements.
Key word is 'can'. And it takes far more time and money to assemble a "quality" team.
This is only true in certain contexts. Most of the time software quality is overlooked not because it genuinely isn't important but simply because it's hard to perceive.
I've watched many businesses appreciate the benefits of software quality (happy customers, few incidents, fast feature turnaround) without ascribing it to anything in particular.
Then, when it went away, they chalked up the problems to something else, throwing fixes at it which didn't work.
At no point in time did they accurately perceive what they had or what they lost, even at the point of bankruptcy.
Part of the problem is that the absence of bugs, incidents and delays just feels normal and part of the problem is most people are really bad at detecting second order effects and applying second order fixes. E.g. they think "another developer will fix it" or "devs just need to dedicate more time to manual QA".
Conversely, because it's so hard to see, I think it can make a really good competitive moat.
It’s like physical fitness. Try explaining to someone who has never been in shape how it’ll feel to be in shape, and they won’t believe you.
I’ve converted people by building better systems than they’ve seen before. Some balk, but better than half end up getting it and pitching in.
> E.g. they think "another developer will fix it"
Ouch. It seems that when a manager sinks some team's velocity by adding a bad developer to it the following reaction is always to add more bad developers so the velocity recovers.
And then when they can't herd all the bad developers around, the obvious next step is to finish destroying everything again by imposing some strict process.
> ... even at the point of bankruptcy
If that were always the case we could bask in the joy that the problem sorted itself out, but alas, there's a lot of crap that keeps on going.
> At no point in time did they accurately perceive what they had or what they lost, even at the point of bankruptcy.
This is traditionally the case not only with software companies, but with other kinds of companies too.
Some people are just not quality people.
At this conference there's a presentation encouraging "You should finish your software."
If that's all people did that would be 10x better right there.
We are getting to the point where people don’t even do the “make it right” part of Make it Work, Make it Right, Make it Fast. It’s making it tough to push for the latter when you can’t even get Right out of some people.
Aye, it never happens but it does sell a lot of books ;)
I don't think we'll reach this promised land™ until incentives re-align. Treating software as an assembly line was obviously The Wrong Thing judging by the results - the problem is, how can we ever move to a model that rewards quality, perhaps similar to (book) authors and royalties?
Owner-operator SaaS is about as close as you can get but limits you to web and web-adjacent.
Just like all the fitness content.
Get a couple of shredded guys and gals to show off how fit they are so everyone feels guilty about snacking past 8PM.
Sell another batch of "how to do pushups" followed by "how to do pushups vol. 2" and "pushup pro, this time even better".
Whereas in the end, normal people are not getting paid for getting shredded; they get paid for doing their stuff.
I just constantly feel like I am not a proper dev because I mostly skip unit tests - but on the other hand, over the last 15 years I've built a couple of systems that worked and brought in value.
You could switch into a domain where safety-critical software is developed. Here devs complain about the inverse problem: Why are we required to have 100% test coverage?!
(The answer btw: Because nobody would be able to explain to a jury/judge that 80% or whatever is enough)
Or they complain that they know everything has gone to hell but if they blow the whistle revenge will be taken against them.
Everybody who worked with the 2005 Toyota Camry ETCS would have known what was up when it killed a few people, for example. Nobody can work on spaghetti code of that magnitude and not realize that something is off.
Boeing employees who tried to blow the whistle were similarly ignored or silenced while a few died in mysterious circumstances.
Why would you skip unit tests? Especially in the AI age. You can quickly verify your behavior. Also, by not writing them you're missing out on opportunities to modularize your code.
Obviously, this assumes you write enterprise grade code. YMMV
You can write modular code without writing tests - I write testable code; I don't write tests. When I need them I can always add them later, but I tend to skip them as mostly it doesn't make sense.
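By "testable" I mean keeping the seams in place - a toy sketch of the style, so tests can be bolted on later without refactoring:

    # Testable-but-untested: the logic is a pure function, the side effect
    # is injected, so a later test can fake it without any scaffolding.
    def apply_discount(total: float, customer_years: int) -> float:
        # Pure: same inputs, same output, no I/O.
        rate = 0.10 if customer_years >= 5 else 0.0
        return round(total * (1 - rate), 2)

    def checkout(cart_total: float, customer_years: int, charge=print):
        charge(apply_discount(cart_total, customer_years))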
But still, the cottage industry of "clean code" keeps pushing me into self-doubt and shame.
I'm a mechanical engineer, not a software person, but I write a lot of (hopefully close to professional quality) code for my work. Writing tests while in the beginning/middle of my development cycle has been the best change I've made in how I do things in quite a long time. Since I'm a self-taught amateur often working solo, it's invaluable for helping me gain confidence that everything I'm doing is working correctly as I learn the language/libraries necessary for me to build each new program.
I’m not saying that you yourself have this attitude - but the “tests are for suckers, I just ship” crowd really grinds my gears because to me it says “ha! Why do you care about getting things right?”
Totally get where you’re coming from though, sometimes the expected behavior is trivial to verify or (in the case of GUIs) can be very difficult and not worth the effort to implement.
If it's testable it's trivial to write tests for! You don't want the next person to introduce a bug, do you?
Read my original post again. We don't have the bugs that all those "do it the right way" people claim we would; we've been doing it this way for 15 years with a good track record.
You're just contributing to the BS scare tactics of people selling "clean code".
How do you know you don't have bugs? Is everything instrumented (observable)? I could just as easily say you're peddling ignorance as bliss!
You can have a lot of other processes and tests on other levels than unit tests ;)
That’s how I do it. 100% test coverage doesn’t make bug reports irrelevant. Every test should have a reason.
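e.g. for me the reason is usually a concrete bug report - a toy sketch (function and bug number made up):

    def loyalty_rate(years: int) -> float:
        # The 10% loyalty discount starts at 5 full years.
        return 0.10 if years >= 5 else 0.0

    def test_no_discount_at_four_years():
        # Pins the fix for (hypothetical) BUG-1423: an off-by-one
        # (">= 4" instead of ">= 5") gave 4-year customers the discount.
        assert loyalty_rate(4) == 0.0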
It happens when an ex-engineer is in a leadership position. The results are good, but it’s typically a small part of having a successful company.
However, you should want to build quality software because building quality things is fulfilling. Unfortunately certain systems have made the worship of money the be-all and end-all of human experience.
I’ve seen one company in my 30 year career with effective quality control.
The QE engineers and the development engineers were in entirely separate branches of the org chart. They had different incentive structures. The interface documentation was the source of truth.
The release cadence was slow. QE had absolute authority to stop a release. Between their tests and test automation, QE wrote more code than the development engineers did.
I've worked at one of those companies where software quality was paramount.
They did TDD for a long time, they wrote Clean Code™, they organised meetups, sponsored and went to conferences, they paid 8th Light consultants to come teach (this was actually worth it!) and sent people to Agile workshops and certifications.
At first, I was like "wow, I am in heaven".
About a year later, I noticed so much repetition and waste of time in the processes.
Code was at a point where we had a "usecase" that calls a "repository" that fetches a list of "ItemNetworkResponse" which then gets mapped into "Item" using "ItemNetworkResponseToItemMapper" and tests were written for every possible thing and path.
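For anyone who hasn't lived it, the shape was roughly this (a simplified sketch of the pattern, not the actual code):

    # Every hop is its own class, and every hop got its own tests.
    class ItemNetworkResponse:
        def __init__(self, raw: dict):
            self.raw = raw

    class Item:
        def __init__(self, name: str):
            self.name = name

    class ItemNetworkResponseToItemMapper:
        def map(self, response: ItemNetworkResponse) -> Item:
            return Item(response.raw["name"])

    class ItemRepository:
        def __init__(self, api, mapper: ItemNetworkResponseToItemMapper):
            self.api, self.mapper = api, mapper

        def get_items(self) -> list:
            return [self.mapper.map(r) for r in self.api.fetch_items()]

    class GetItemsUseCase:
        def __init__(self, repository: ItemRepository):
            self.repository = repository

        def execute(self) -> list:
            return self.repository.get_items()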
They had enterprise clients, were charging them nicely, paying developers nicely, and pocketed extra money due to "safety buffers" added by engineers, managers and sales people alike, basically doubling the length of any project for "safety".
The company kept to their "high dev standards" which meant spending way more time, and thus costing way more, than generic cookie-cutter agencies would cost for the same project.
This was great until every client wanted to save money.
The company shut down last year.
It sounds like they were cargo culting ThoughtWorks.
ThoughtWorks and companies like them do work, but they're heavily reliant upon heavy-duty sales. Delivery at high quality is necessary but not sufficient.
From your description it looks like that company wasn't into quality but into chasing every fad of the software industry.
The company I work for used to be organized like this a decade or so ago, and people who were around back then still tell horror stories that we all laugh about. Things like bug targets not being met leading to extreme bug ping-pong ("you didn't specify the phase of the moon when this crash on clicking Run reproduced, Needs Information", "this GUI control is misaligned, here are 5 bugs, one for each button that is not where it should be", endless hostile discussions on the severity of bugs and so on).
Software development and quality assurance should be tightly integrated and should work together on ensuring a good product. Passing builds over a wall of documentation is a recipe for disaster, not good quality software.
Everyone was in the same Bay Area office building, and both teams talked to each other, ate lunch together, and sat together.
Every company I’ve seen that maintains a separate QA org chart, inevitably offshores the entire QA org to India or China, with predictable results.
In 2025 I think the only thing that makes sense is having SDETs embedded in development teams.
Was the end result better or worse for this? I'm not being facetious, I just can't get if you think it was a good idea!
The software was rock solid. There were very few surprises once deployed.
> The interface documentation was the source of truth.
lol, fire business analysts and let tech writers do their job. Sounds like some kind of VC black company.
I may be the only one who thought this, but this doesn't seem to be related to the fondly remembered Better Software Magazine: https://en.wikipedia.org/wiki/Better_Software_Magazine
It seems to be socially associated with the Handmade Hero and Jon Blow Jai crowd, which is not so much concerned that their software might be buggy as that it might be lame. They're more concerned about user experience and efficiency than they are about correctness.
> which is not so much concerned that their software might be buggy as that it might be lame
This is not at _all_ my interpretation of Casey and JBlow's views. How did you arrive at this conclusion?
> They're more concerned about user experience and efficiency than they are about correctness.
They're definitely very concerned about efficiency, but user experience? Are you referring to DevX? They definitely don't prize any kind of UX above correctness.
From what I've seen, they are very much in a game developer mindset: you want to make a finished product for a specific use, you want that product to be very well received by your users, and you want it to run really fast on their hardware. When you're done with it, your next product will likely be 80% new code, so long-term maintainability is not a major concern.
And stability is important, but not critical - and the main way they want to achieve it is that errors should be very obvious so that they can be caught easily in manual testing. So C++ style UB is not great, since you may not always catch it, but crashing on reading a null pointer is great, since you'll easily see it during testing. Also, performance concerns trump correctness - paying a performance cost to get some safety (e.g. using array bounds access enforcement) is lazy design, why would you write out of bounds accesses in the first place?
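A toy illustration of that "make errors loud" preference (Python just for illustration - their world is C/C++/Jai, and this is my sketch, not their code):

    # Crash visibly during testing rather than silently clamp or default:
    def get_player(players: list, idx: int):
        assert 0 <= idx < len(players), f"bad player index: {idx}"
        return players[idx]

The silent alternative (returning None, or clamping the index into range) is exactly the kind of error-hiding this crowd objects to.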
One of the slides in Blow's talk about why he was starting work on Jai said, "If we spend a lot of time wading through high-friction environments, we had better be sure that this is a net win. Empirically, it looks to me like [that] does not usually pay off. These methods spend more time preventing potential bugs than it would have taken to fix the actual bugs that happen."
I think that's an overall good summary of the crowd's attitude. They think that mainstream programming environments err too far in the direction of keeping your software from being buggy, charging programmers a heavy cost for it. Undoubtedly for videogames they are correct.
Jai in particular does support array bounds checking, but you can turn it on or off as a compilation option: https://jai.community/t/metaprogramming-build-options/151
Jai has array bounds checking.
> This is not at _all_ my interpretation of Casey and JBlow's views.
IMHO this group's canonical lament was expressed by Mike Acton in his "Data-Oriented Design and C++" talk, where he asks: "...Then why does it take Word 2 seconds to start up?!"[0]. See also Muratori's bug reports which seem similar[1].
I think it is important to note, as the parent comment alludes, that these performance problems are real problems, but they are usually not correctness problems (for the counterpoint, see certain real time systems). To listen to Blow, who is actually developing a new programming language, it seems his issue with C++ is mostly about how it slows down his development speed, that is -- C++ compilers aren't fast enough, not the "correctness" of his software [2].
Blow has framed these same performance problems as problems in software "quality", but this term seems to share the same misunderstanding as "correctness", and therefore seems to me like another equivocation.
Software quality, to me, is dependent on the domain. Blow et al. never discuss this fact. Their argument is more like -- what if all programmers were like John Carmack and Michael Abrash? Instead of recognizing software is an economic activity and certain marginal performance gains are often left on the table, because most programmers can't be John Carmack and Michael Abrash all the time.
[0]: https://www.youtube.com/watch?v=rX0ItVEVjHc [1]: https://github.com/microsoft/terminal/issues/10362 [2]: https://www.youtube.com/watch?v=ZkdpLSXUXHY
This is a bit of a simplification of the ideas of Blow, Muratori et al, a much better source for the ideas can be found in "Preventing the collapse of civilization" [0].
The argument made there is that "software quality" in the Uncle Bob sense, or in your domain version, is not necessarily wrong but is at the very least subjective, and should not be used to guide software development.
Instead, we can state that the software we build today does the same job it did decades ago while requiring much vaster resources, which is objectively problematic. This is a factual statement about the current state of software engineering.
The theory that follows from this is that there is a decadence in how we approach software engineering, a laziness or carelessness. This is absolutely judgemental, but it's also clearly defended and not based on gut feel but rather on these observations around team sizes/hardware usage vs actual product features.
Their background in videogames makes them an obvious advocate for the opposite, as the gaming industry has always taken performance very seriously as it is core to the user experience and marketability of games.
In short, it is not about "oh, it takes 2 seconds to start up Word, ergo most programmers suck and should pray to stand in the shadow of John Carmack"; it is about a perceived explosion in complexity, both in terms of number of developers and in terms of allocated hardware, without an accompanying explosion in actual end-user software complexity.
The more I think about this, the more I have come to agree with this sentiment. Even though the bravado around the arguments can sometimes feel judgemental, at its core we all understand that nobody needs 600MB of npm packages to build a webapp.
[0]: https://www.youtube.com/watch?v=ZSRHeXYDLko
> Their argument is more like -- what if all programmers were like John Carmack and Michael Abrash? Instead of recognizing software is an economic activity and certain marginal performance gains are often left on the table, because most programmers can't be John Carmack and Michael Abrash all the time.
At least for Casey, the case is less that everyone should be Carmack or Abrash and more that programmers, through poor design choices, often prematurely pessimise their code when they don't need to.
By reading their blog posts and watching their videos.
You would think a conference that advocates for quality software would have a better website.
I hesitated mentioning it, thinking perhaps I was the only one who thought so. The Twitch video failing to load, the static text over a blurred-background video, the horizontal text scrolling on mobile, ...
Text in video always sucks, but that's just how Twitch coding sessions work.
I wish they had a better website, that's for sure.
Not even a section of where and when to find the talks offline.
Curious how they’ll balance the business needs of moving fast with AI vs quality because my agents aren’t that good. While it works, I’m often having to clean up afterwards - slowing everything down. I was almost as fast when I had just basic IntelliSense.
Anyway, I’ll watch the twitch stream from across the pond.
In DJB's paper on software quality, he identifies actionable strategies for code quality and code security that were born out of frustration with sendmail's exploit after exploit. Very accessible and fun read: https://cr.yp.to/qmail/qmailsec-20071101.pdf
I would expect this conf to expand on those types of concepts and strategies.
They probably just realize that being seen to be "moving fast with AI" simply isn't a goal unto itself; it has to deliver something of value beyond itself.
We could be in a tortoise vs. hare situation... Unless we find ourselves back in the conditions of the 2010s again, thoughtfully building software to be high quality and high performance may win out in the long run over "move fast and break things."
Always have been. It’s why the vast majority of disposable corporate garbageware, products chasing a buck, consumer shovelware, etc is built on the shoulders of thoughtfully designed, high quality, mature software that stands the test of time. No popular production software runs on an OS kernel someone vibe coded yesterday. Durable utility is where quality lies, as the cost of quality is able to amortize. Chasing trends is, by definition, costly.
Or at least "value" beyond that reaped by current investors unloading their shares onto "Greater Fool" buyers at high prices.
>Curious how they’ll balance the business needs of moving fast with AI vs quality
Why would they need to do that? Is that even a goal or something that this conference is addressing at all?
There are plenty of alternative software needs that do not need to be AI based nor do they need to change tactics due to the current obsession with AI.
If you know of any that are in need of an engineer, let me know. Every single executive I’ve talked to in the last 4 years is all “How can I use AI with this?”
Well, a couple years ago this stuff all sucked (well, a lot more). Yeah it's in many cases somewhat borderline now, but still - this is frickin magic compared to what I thought was possible just a little while ago.
My question is how far does it go - are the gains going to peter out, or does it keep going or even accelerate? Seems like one of the latter two thus far.
> Curious how they’ll balance the business needs of moving fast with AI vs quality because my agents aren’t that good
I would guess the same way humans do.
Put brain in creative mode, bang out something that works.
Put brain in rules-compliance mode and tidy everything up.
Then send it for code review.
Yeah, it's interesting: unless I lean hard on them, AI coding agents tend to solve problems with a lot of "hedging", splitting into cases or duplicating code. They are totally fine with infinitely many special cases, and unless you push for it they will not generalise or consolidate (Gemini and Claude Code at least both seem to have this behaviour).
I feel like this comes about because it's the optimal strategy for doing robust one-shot "point fixes", but it comes at the cost of long-term codebase health.
I have noticed this bias towards duplication eventually creates a kind of "AI code soup" that you can only really "fix" or keep working on with AI from that point on.
With the right guidance and hints you can get it to refactor and generalise - and it does it well - but the default style definitely trends to "slop" in my experience so far.
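A contrived toy example of the pattern (hypothetical code, not from any agent transcript): the "point fix" style grows a new branch per case, while the consolidated version you usually have to ask for handles everything uniformly:

    # The "point fix" style agents tend to produce: one branch per case,
    # and a new elif quietly appears with every bug report.
    def format_price_pointfix(amount, currency):
        if currency == "USD":
            return f"${amount:.2f}"
        elif currency == "EUR":
            return f"€{amount:.2f}"
        elif currency == "GBP":
            return f"£{amount:.2f}"

    # The consolidated version: one table, one code path.
    SYMBOLS = {"USD": "$", "EUR": "€", "GBP": "£"}

    def format_price(amount, currency):
        return f"{SYMBOLS[currency]}{amount:.2f}"

    print(format_price(9.99, "EUR"))  # €9.99

Neither is wrong at this size, but after fifty such point fixes the first style is exactly the soup.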
To be fair, a lot of humans also have this problem.
[flagged]
Take a moment to reflect on what you have written and how the casual observer may interpret it.
This thing feels pretty weird to me. I'm guessing it's an attempt at organizing some sort of european Handmade event, and trying to keep it small.
But between the sparse website, invite-only and anonymous organizers, it just feels like it's emphasizing the reactionary vibes around the Handmade/casey/jblow sphere. Like they don't want a bunch of blue-haired antifa web developers to show up and ruin everything.
Glad to see they got Sweden's own Eskil Steenberg though. Tuning in for that at least.
That's pretty much what it is: a reaction to an implosion that happened last year in the Handmade Network, related to the Handmade Seattle conference, which caused the conference organizer and the community leads to go their separate ways.
https://handmade.network/blog/p/8989-separating_from_handmad...
https://handmadecities.com/news/splitting-from-handmade-netw...
> Like they don't want a bunch of blue-haired antifa web developers to show up and ruin everything
There's a reason web developers, and the ecosystem/community around them, are the butt of many jokes. I don't think it's at all surprising that the injection of identity politics into the software industry has had a negative effect on quality.
> I don't think it's at all surprising that the injection of identity politics into the software industry has had a negative effect on quality.
That's a pretty broad claim. This conference could be in response to a perceived negative effect on quality, but claiming that as a fact seems hard to back up to me
>I don't think it's at all surprising that the injection of identity politics into the software industry has had a negative effect on quality.
If it had any effect, it would be negligible compared to offshoring and weak incentives.
I've noticed that some of these types tend to be well-meaning young people (often girls) who are super excited to have scored a job doing developer outreach for $BIGTECH.
It's a clever political tactic, because a 50-year-old white male middle manager at Microsoft trying to become a board member of an open source foundation would face a lot more hostility than a 20-something girl who pushes all of the diversity buttons.
It mirrors the rather successful marketing strategies for a string of movies, including the Ghostbusters reboot and Barbie, among others, i.e. "There's a certain kind of person who doesn't like our latest corporate offering...". Who wants to be that person?
> Who wants to be that person?
Yep, preemptively destroying the reputation of whoever opposes you is a common and ancient tactic of bullies at all levels, from schoolyards to fascist governments.
Look at the people who are pushing for politicizing software development, and you'll see they are always getting money out of the deal.
> Like they don't want a bunch of blue-haired antifa web developers to show up and ruin everything
This reads like "Oh some people are meeting, so this must actually be about ME".
> Like they don't want a bunch of blue-haired antifa web developers to show up and ruin everything
You write this like this is a bad thing.
I just came to a conference to learn some cool new tech, but instead got lectured about my transphobia, told that my database is systemic discrimination, and that my HDD being named "slave" means I burn crosses in my free time, even though I have no family ties to anything American.
I mean, this screams fun right from the get-go.
Where can you actually learn the substance of what this conference is about?
All I found is a Twitch tagline that reads "Software is getting worse. We're here to make it better."
They have a list of the presentations in the original link. That should at least give you some idea of what they're going to talk about.
I hate talk titles of this form: "Most of your projects are stupid. Please make some actual games." So annoying. I know it's not personal, but I'm sure a better title exists for every talk that chooses this form. Why do you have to insult the audience?
I know Berkeley Mono when I see it! My go-to terminal font for coming up on three years. Automatically gets me pumped about this conference.
Sounds good, but unless it advocates for HR practices that retain talent, and corporate practices that incentivize Quality, it probably won't result in any changes.
Personal Quality Coding practices have been around for as long as software has been a thing. Way back when, Watts Humphrey, Steve McConnell, and Steve Maguire wrote books on how to maximize personal Quality. Many of their techniques still hold true, today.
But as long as there are bad people managers and short-sighted execs, you'll have shit quality; regardless of who does the work.
The programming language in the background of this website appears to be Odin.
Bill Hall ("Ginger Bill"), the creator of Odin, is a speaker on day 1.
If only they could get Jonathan Blow to be a speaker.
> A software conference that advocates for quality
I am going to keep saying this: if your main tagline/ethos is broken by your website, you have failed.
* On mobile, the topics are hidden unless you scroll over them. You also can't read several of the topics without scrolling right as you read.
* The background is very distracting and disrupts readability.
* None of your speakers have links to their socials/what they are known for.
* > Who are the organizers? Sam, Sander and Charlie.
  * Ah yes, my favourite people... At least hyperlink their socials.
It bothers me that this is from Sweden, a most inclusive country, while being pretentiously exclusive. I live in Sweden and wouldn't mind going to a small conference on my holidays, but unfortunately I can't find the "charming small town" where this is supposedly taking place, nor do I know how to find Sam, Sander and Charlie.
> Physical attendance will be invite-only. Tickets will not be publicly available. Invitees will receive an attendee guide with further information.
They don't advocate for diversity, that's for sure.
Good for them.
So far this has been great; Casey Muratori's talk about the history of OOP has been quite insightful. Will need to revisit when it's on YT.
Looking forward to Casey Muratori's talk!
When I saw the title of the conference I immediately thought of him so I'm not surprised he's headlining!
What is needed is more evidence-based software engineering: statistical methods applied to datasets correlating issue trackers with code ASTs, to show us exactly which ways of coding are correlated with longer issue-resolution times, frequent bugs, etc.
I sometimes wonder if there could be an optimal number of microservices. As far as I know, no one has connected issue data to the number of microservices before. Maybe there's an optimal number, like "8", that leads to a lower number of bugs and faster resolution times.
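Nobody has that dataset as far as I know, but the analysis itself is the easy part once you do. A minimal sketch of the kind of study I mean, with entirely invented numbers standing in for a real issue-tracker export:

    # Hypothetical: does service count correlate with issue resolution time?
    # The records below are synthetic stand-ins; a real study would join an
    # issue-tracker export against deployment metadata.
    import statistics

    projects = [
        # (number_of_microservices, median_days_to_close_issue)
        (1, 2.1), (3, 1.8), (5, 1.6), (8, 1.5),
        (13, 2.4), (21, 3.9), (34, 6.2),
    ]

    services = [p[0] for p in projects]
    days = [p[1] for p in projects]

    # statistics.correlation computes Pearson's r (Python 3.10+)
    r = statistics.correlation(services, days)
    print(f"r between service count and resolution time: {r:.2f}")

The hard part is collecting clean data across enough comparable projects for the result to mean anything.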
Depending on who you ask the answer is either "It depends completely on the task, and it is in any case much more important that you divide your application in the right places than exactly how many bits you end up with", or "1".
If you ask Amazon then the more the merrier, because the number of microservices is effectively a multiplier on the bill.
For a non-engineer (business) person, the case of "engineering quality vs. move fast and break things" sounds more like "slow & expensive vs. fast". The choice is obvious.
It's not that at all, though; the adage "slow down to speed up" applies, because high-quality engineering will inevitably increase throughput in the long run.
Really that’s the core of it
You should challenge this idea in your internal monologue. Learn a bit more about technology and how it's made. "Fast" in most cases most definitely does not equal cheap, especially over the long term.
It's more like "slow and expensive vs. fast and more expensive"
"How can you not have enough time to do it right, but enough time to do it twice?"
> plan to throw one away; you will, anyhow.
What a fantastic way of responding to/framing this.
> Where in Sweden is it happening?
> In a charming small town
There is no indication I see from the website that anything about this conference relates to quality, specifically.
I don't see how anyone can be "for" quality and not talk about how quality can be assessed. Where are the talks about that?
The logo is an unsettling convolution of the Back Orifice logo.
Now that you mention it, I'll never see the symbol of the Galactic Empire the same way again.
[flagged]
Seems like a waste of time to me, especially in this age of AI slop somehow passing as quality. Just another excuse to drink/network/party on the company's dime.
However, I would be interested in establishing a union for technologists across the nation. Drive quality from the bottom up, form local chapters, collectively bargain.
I think I've finally figured out just what it is that annoys me about the "software quality" crowd.
Quality is a measurement. That's how it works in hardware land, anyway. Product defects - and, crucially, their associated cost to the company - are quantified.
Quality is not some abstract, feel good concept like “developer experience”. It’s a real, hard number of how much money the company loses to product defects.
Almost every professional software developer I’ve ever met is completely and vehemently opposed to any part of their workflow being quantified. It’s dismissed as “micromanagement” and “bean counting”.
Bruh. You can’t talk about quality with any seriousness while simultaneously refusing metrics. Those two points are antithetical to one another.
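To make that concrete, here is the sort of arithmetic I mean - a toy cost-of-poor-quality calculation (every figure invented; in a real program these come out of test-yield databases and warranty/RMA records):

    # Toy hardware-style quality math, with made-up numbers.
    units_shipped = 250_000
    field_failures = 1_800        # confirmed defects in the field
    cost_per_failure = 85.0       # average repair/replace cost, USD

    # Defects per million (one opportunity per unit, for simplicity)
    dpm = field_failures / units_shipped * 1_000_000

    # The "real, hard number" the company actually loses:
    cost_of_poor_quality = field_failures * cost_per_failure

    print(f"defects per million: {dpm:,.0f}")
    print(f"cost of poor quality: ${cost_of_poor_quality:,.2f}")

That number goes on a slide next to the cost of preventing the defects, and the argument mostly settles itself.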
Some thoughts regarding this:
1. It is partly because the typical metrics used for software development in big corporations (e.g., test coverage, cyclomatic complexity) are such snake oil. They are constantly misused and/or misinterpreted by management, and because of that they cause developers a lot of frustration.
2. Some developers see their craft as a form of art, or at least an activity for "expressing themselves" in an almost literary way. You can laugh at this, but I think it is a very humane way of thinking. We want to feel a deeper meaning and purpose in what we do. Antirez of Redis fame has expressed something like this. [0]
3. Many of these programmers work with games and graphics, and they have a very distinct metric: FPS.
[0] https://blog.brachiosoft.com/en/posts/redis/
1. Totally agree that the field of software metrics is dominated by clueless or outright bad actors. I can say with complete certainty that I do not know the right way to measure software quality. All I know is that quality is handled as a metric in most hardware companies, not as an abstract concept. When it's talked about as such an ephemeral thing by software people, it strikes me as a bit disconnected from reality. (If I were going to try, I'd probably shoot for bugs per release version, or time from first spec to feature release.)
2. With respect: that’s a bit of an exceptionalist mindset. There’s nothing precious about software’s value to a business. It’s a vehicle to make money. That’s not to say craft isn’t important - it is, and it has tangible impacts to work. The point I’m making is that: my boss would laugh me out of the room if I told him “You can’t measure the quality of my electronics designs or my delivery process; it’s art.”
3. I’ve never heard of FPS but I’m very interested in learning more. Thanks for sharing the link.
Edit: oh ok duh yeah of course you could measure the frame rate of your graphics stack and get a metric for code quality. D’oh. Whoops. XD
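For anyone else catching up: frame times are trivially measurable, and games people track not just the average but the worst cases. A minimal sketch (time.sleep simulates render work; a real harness would hook the engine's present/swap call):

    import random
    import time

    # Record per-frame wall-clock times for a simulated workload.
    frame_times = []
    for _ in range(300):
        start = time.perf_counter()
        time.sleep(random.uniform(0.010, 0.020))  # pretend to render
        frame_times.append(time.perf_counter() - start)

    avg_fps = len(frame_times) / sum(frame_times)

    # "1% low": the frame rate on the worst 1% of frames -
    # the stutter players feel, and a common quality metric.
    worst = sorted(frame_times)[int(len(frame_times) * 0.99)]
    print(f"avg FPS: {avg_fps:.1f}, 1% low: {1.0 / worst:.1f}")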
>Quality is a measurement
No it isn't, as in it literally isn't. Quantification is the process of judging something numerically, objectively, and by measurement. Qualification is the opposite: judging something by its nature, essence, or kind.
Software quality, like all kinds of quality, is always a subjective and experiential feature. When someone says a piece of furniture is a high-quality, handmade chair, in all likelihood they haven't performed a numerical analysis of the chair's properties; they're expressing a subjective, direct sentiment.
The handmade movement in software was exactly about this: putting the focus on the personal, lived judgement of experienced practitioners, as opposed to trying to quantify software by some objective metric. That's why individual people feature so heavily in it.
> No it isn't, as in it literally isn't.
Yes, it is. It is a well known field in hardware development, and generally treated as a sub field of manufacturing engineering. It deals with things like testing, sampling, statistics of yield, and process improvement. If you’ve ever done a DFMEA, an 8D report, a Five Whys review, a sampling quality analysis, or a process map, you’ve used tools produced by this discipline.
That’s what I’m trying to tell you and everyone else reading this.
Software, as a profession, collectively talks about quality with all the rigor of joint-passing English majors sharing their favorite sections of Zen and the Art of Motorcycle Maintenance.
Quality has a meaning and a definition and a field of study attached to it. Semiconductors and large scale consumer product manufacturing wouldn’t exist as we know it without this.
>Quality has a meaning and a definition and a field of study attached to it
Yes, and I gave you that definition in the first part of my response. That someone in the semiconductor industry made a poor and colloquial choice of words when he confused qualitative and quantitative processes (the hardware industry deals with the latter) is not evidence to the contrary.
When people talk about software, they're using the terms appropriately. We can objectively talk about the quantities attached to a piece of software - number of dependencies, size, startup time, what have you - but two people will never necessarily agree on the quality of software. Your high-quality software might be junk to me, because that is at root a subjective judgement. There is not a lot of qualitative or subjective judgement in the world of elementary hardware (it either works or it doesn't); there is a lot of it in end-user software.
It is very difficult to make a bad piece of hardware that does very well on a number of metrics; it's very easy to make a shoddy piece of software that performs well on an infinite number of metrics, because nobody has a subjective experience with a transistor, but they do with a piece of software. That is why you should use the terms correctly and not apply processes from one domain to the other.
Yeah, I think we’re just gonna have to agree to disagree here.
I notice you have not quantified any aspect of your opinion here, which is not surprising, since your opinion is unrelated to facts, science, experience, or wisdom.
Quality is not a "real, hard number" because such a thing would depend entirely on how you collect the data, what you count as data, and how you interpret the data. All of this is brimming with controversy, as you might know if you had read more than zero books about qualitative research, epistemology, the philosophy, history, or practice of science. I say "might" because of course, the number of books one reads is no measure of wisdom. It is one indicator of an interest to learn, though.
It would be nice if you had learned, in your years on Earth, that you can't talk about quality with any seriousness while simultaneously refusing to accept that quality is about people, relationships, and feelings. It's about risks and interpretations of risk.
Now, here is the part where I agree with you: quality is assessed, not measured. But that assessment is based on evidence, and one kind of evidence is stuff that can be usefully measured.
While there is no such thing as a "qualitometer," we should not be automatically opposed to measuring things that may help us and not hurt us.
I’m not sure what conclusion to draw from this comment, apart from the fact that you’ve sure made a lot of assumptions about me and my experience.