> To use a dog metaphor, a chainsaw tends to growl before it bites.
I believe the author is mistaken about why chainsaws don't cause more harm than they do. There are multiple ways in which a chainsaw will kill or destroy without any warning. Kickback is very quick. A tree falling in the wrong direction comes without warning.
Personally, I suspect there are 2 main reasons why chainsaws don't cause more harm.
A) They've been around for 100 years, and they've been causing fatalities and injuries for 100 years. People have invented ways to reduce the risk. Any chain you can readily buy is a low-kickback chain, and the saw comes with a chain brake. It doesn't completely remove the risk, but it substantially reduces it.
B) They've been around for 100 years, and they've been causing fatalities and injuries for 100 years. People are aware that chainsaws are inherently dangerous. Even someone without any training, and without looking at the safety instructions, understands that one of these will take off a limb without blinking.
What this means for the rest of the metaphor, I'm not sure.
>>chainsaw tends to growl before it bites.
It doesn't tend to; it growls continuously: treat me wrong and I will eat you.
The new electric chainsaws don't make the "normal" sounds.
More like swish.
Also: PSA - chainsaw pants.
I don't love the analogy, but I 100% agree that we are putting AI tools into people's hands that they are not yet able to wield safely. I think this is mostly because of how quickly AI has been developed. New technologies can take decades to be absorbed by society. Early on with cars we had no traffic lights, speed limits, or seat belts. And modern freeways were many decades off.
"Early electrical systems were poorly insulated, and fires or electrocutions were common. The creation of standards for wiring, outlets, grounding systems, and circuit breakers took decades to develop." It could be that AI will develop so fast that society never catches up, but I'm sure we'll at least get better over time.
I really don't understand why people are impressed by posts like this. The author spends way too much time talking about literal chainsaws and really mostly just asserts it's a relevant metaphor. Mostly an LLM is like a chainsaw in that most people don't need either one. There are a million things I don't need, for example, spinny hubcaps. Is an LLM really more like a chainsaw than it is like spinny hubcaps? Because it seems more like spinny hubcaps to me.
I'd be interested in hearing what, if any, danger to society spinny hubcaps present.
I personally think they are incredibly dangerous to motorcyclists and cyclists. When a car with spinner hubcaps pulls up to an intersection, it takes much longer for other vehicles to tell that it has actually stopped (because one of the main visual cues is normally whether the wheels are moving), so a couple of times I have found myself on the verge of taking emergency evasive manoeuvres, thinking the car is pulling straight out on me (cars pull out on bikers all the time). I've seen cyclists swerve dangerously, also thinking their lives were in danger from a car with spinny hubcaps.
The anime Chainsaw Man is about this. In its world, the chainsaw is the number one most powerful Devil, even more powerful than Guns.
I wondered about this but it makes sense when you imagine trees as sentient. Chainsaws are extremely efficient killing machines for the most helpless and generous creatures on earth.
I feel like this metaphor could just as easily be about "cars" instead of "chainsaws". We do put cars in almost everyone's hands even though they have the same potential for danger.
Even more so, since I would assume more people die in car accidents than chainsaw accidents. But more people find cars useful to their day-to-day lives than chainsaws.
So I guess the question is whether these models are useful to everyone like cars, or just to a subset like chainsaws. That still remains to be seen.
Conclusion: most of us don't need AI in the same way most of us don't need a chainsaw.
"We’re all mostly just regular people interacting with other regular people, trying to go about our business and get through our days. Nobody’s asking us to index and catalog entire libraries of information. Nobody really cares if we reuse a stock photo that somebody else might have used somewhere else at some point. We’re not so busy that we need everything predigested before it’s presented to us. We’re still clever enough to work our way through unfamiliar problems. Some of us, I would have to think, still possess the ability and the desire to produce some sort of output in our chosen medium without relinquishing creative control to AI."
I disagree; you're not including mental fatigue. I can say something like "I have a server OS upgrade that needs to be done by X date. The server uses XYZ technologies. Here is the list of stakeholders and their roles. Design me an assessment and implementation plan and outline the key milestones and their dates." Could I think through that myself? Yes, but it may take a couple of hours. I just got 85% of the planning work done in 2 minutes.
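For concreteness, that kind of planning prompt is just a single model call. Here's a minimal sketch assuming the OpenAI Python client; the model name, technologies, stakeholders, and dates are placeholder details, not taken from the comment above:

```python
# Minimal sketch of the planning-prompt workflow described above.
# Assumes the OpenAI Python client; all specifics below are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "I have a server OS upgrade that needs to be done by 2026-03-01. "
    "The server runs Ubuntu 20.04 with PostgreSQL and nginx. "
    "Stakeholders: DBA (approvals), sysadmin (execution), product owner (sign-off). "
    "Design an assessment and implementation plan and outline the key milestones and their dates."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

# The draft plan still needs human review before anyone acts on it.
print(response.choices[0].message.content)
```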
I feel like I'm talking with people who would rather ride a horse and buggy down a rutted dirt road for 2 hours both ways, checking the wooden wheels for damage. Giving the horses rest stops.
I think I'll take a 10-minute car drive to Costco, thank you very much.
How else can you know that your wooden wheels aren't damaged? We've used wooden wheels forever, good enough for me! Lol I agree.
> The chainsaw is a poor fit for, say, ripping down a sheet of plywood for that new shed dormer you’re trying to add onto the house. A chainsaw shouldn’t be used to dig a trench for an irrigation line.
Both of those statements are false. While not common, some professional construction crews do use a chainsaw instead of a circular saw - the vast majority don't, but some do. Likewise, a trenching tool is just a chainsaw with a large blade - not the same chainsaw as you would use for a tree, but clearly a chainsaw if you look.
Chainsaws don't like dry wood. They are not cost-effective, energy- and wear-wise, for most carpentry work.
A chainsaw in the right hands can be a precision instrument; much better than a more directed/restricted saw. Definitely not typical for most carpentry work, but a sufficiently skilled craftsman can get through cuts a lot quicker with a chainsaw than a "safer" alternative. (Note: this is only true for very skilled/experienced people. Don't try it at home, those are professionals, etc, etc). I once worked at a construction site with a guy who would use a chainsaw to cut molding and trim. He would measure one time, and then make ornate cuts that looked amazing, and he would do it so quickly that it seemed like a magic trick.
If you ever want to have your mind blown (in a good way), visit Fairbanks or similar arctic locations in the winter and go to an ice carving festival. You'll see people there making incredibly detailed cuts with chainsaws. Ice is more forgiving than wood, but still far above the level of most mere mortals.
Meanwhile I feel good about my chainsaw work if I don't knock the chain off.
Energy compared to what? Many crews are running a generator all day, which is less efficient because it keeps running even when the saws are not, while a chainsaw shuts off between cuts.
Of course battery works well today. Other sites now have grid power.
I'm kind of not sure what the article even means.
Surely, LLMs are professional tools for now. The internet was for scientists in its first days. GPUs were once for gamers, then for gamers and CAD/CGI professionals.
Today my phone uses the GPU to show a pretty unlock screen, and the internet is... the internet. Most people don't even distinguish the GPU or the internet from their device.
My prediction: at the current pace, in 20 years the average user won't even know whether they're using a local LLM on their device or not. It will just be how computers work. Very un-chainsaw-y, very inconspicuous, zero training required.
You have to scroll a kilometer, but the article does finally get around to saying something. I almost gave up.
The word "mélange" has been here longer than chainsaws, but I wouldn't trust 90% of people to use it correctly either.
Great article, Scott. Chainsaws seem to be a very apropos metaphor for generative AI.
Who are you writing for?
You can skip about half the article.
> Language models are capable of producing and digesting substantial volumes of text. More text than any single person should ever be expected to handle in the course of a lifetime. Compared to the speed at which a human can read and write, these models are the linguistic equivalent of a chainsaw. It’s much the same with computer vision, and generative algorithms producing videos and images of events that never occurred and things that don’t exist.
It’s my belief that, in our current artificial intelligence boom’s haste to grab as much business as possible, we are essentially handing out chainsaws to unqualified and inexperienced people who don’t appreciate the responsibility entrusted to them, and who probably don’t require such power in the first place. And that is not the consumers’ fault—this is all on the companies that are pushing it into their laps.
> Some would say that, compared to the tangible hazards of losing a bodily extremity or dropping a pine trunk through the bedroom ceiling, misuse of AI by irresponsible or malicious actors sounds downright genteel. But think about how quickly memes and misinformation flow through social media and the larger internet. Whoever first used the word “viral” to describe such spread, they hit that nail right on the head. Social media craves that stuff, and AI provides the almost effortless ability to produce unlimited quantities of exactly what it desires. And the reward for the creator, as much as the users of an AI product can be called the “creator” of that content, is a shower of likes, reposts, updoots, badges, and the tiny dribble of dopamine brought by those things. Thus the system perpetuates itself.
> Unlike the venerable chainsaw, AI doesn’t give any indication that it is being misused. It doesn’t growl, shake, kick, or protest. It doesn’t even give a useful indication that “hey this result might be completely useless hogwash, I dunno.” The user doesn’t get to see what happens inside, or know precisely where the information originally came from, or evaluate how the model may have compromised reality to produce an output that looked plausibly like something a human would accept. It just hums along quietly, churning out line after line of approximately whatever it believed was asked of it.