Yes, and this post may have been written by AI, but it wasn't. Corrected title: One person thinks the latest Hunger Games novel may have been co-authored by AI based on two examples and a "gut feeling".
Initially I was thinking "could be AI, could just be AO3-quality writing", but that whole "my grandmother's skin was as soft as a spiderweb" thing seems a bit much. It's very hard to imagine a human writing that.
I've read plenty of dumb things in pre-AI books.
Dumb, sure, but "my, grandma, how spiderweb-y you are" is a bit out there.
'Embellishment' seems absolutely standard for a novel, some authors more than others, and it's exactly what you're taught to do (clumsily) in school.
The actions in the train scene didn't seem so bizarre to me, and even if they did, can't we still write bizarre events and characters? Similarly with the spiderweb skin: where's the line between 'no human would ...' and simply writing a strange character?
I don't have anything like the OOP's experience with literature or even AI, but I'm not really convinced. It is nevertheless interesting that they believe they've identified it, and interesting to think about the ramifications, regardless of whether the analysis is correct.
> The moment our hearts shattered? It belongs to us.
I'm ambivalent about a lot of the other cited examples (could be AI, could just be bad ghostwriting), but this particular sentence does have a very distinct AI smell.
What I found more disturbing was this:
> It would be considered standard industry practice to hire a ghostwriter for these prequels
The use of ghostwriters is the main problem, I think. AI is just another ghostwriter.
I recently read a book that's 3rd in a trilogy. I loved the first 2 books but didn't like the 3rd at all, and indeed stopped reading after about 100 pages. It felt like the 3rd book was just a perfunctory response to the publisher's request for another installment in the series, a mere money grab. But now, suddenly, I'm starting to wonder whether the 3rd book was even written by the author...
I was aware that non-writers, e.g., politicians, use ghostwriters when they publish a book, but it would never have occurred to me that experienced, accomplished fiction writers would also use ghostwriters.
Writing is tough. If you are a famous author and not invested in the publisher's cash-grab request, it would make sense to have someone else write it while you perform basic QA for branding purposes.
It's not good at all, but it makes financial sense.
My issue with that as a reader is that when I purchase a book authored by Suzanne Collins I expect it to have been actually written by Suzanne Collins, not by somebody she contracted to imitate her style.
We are so far past that in mass market fiction. It's folding back in on itself.
Tom Clancy has new books coming out every year and he's been dead for over a decade. They don't hide the "ghostwriter" but they also put Tom Clancy in huge letters at the top even though he had less than nothing to do with it.
https://www.amazon.com/Clancy-Line-Demarcation-Jack-Novel-eb...
It's interesting though, isn't it? If she contracts someone good and the ghostwriter does an excellent job of imitating her style, then really you do get exactly what you wanted. I don't like the idea of it either, fwiw, but it's hard to rationalise.
(But then why stop there? Have the estate of the esteemed author go on contracting ghostwriters! Does it only work if you keep the death a secret, or would a licensed P.G. Wodehouse ghostwriter do as well today as if he were a recluse who had never been proclaimed dead?)
> It's interesting though, isn't it? If she contracts someone good and the ghostwriter does an excellent job of imitating her style, then really you do get exactly what you wanted. I don't like the idea of it either, fwiw, but it's hard to rationalise.
But the ghostwriters are never as good as the original. Imitating style is not the same as imitating excellence. If the ghostwriter were as talented as the original author, they would be publishing their own novels under their own name, not doing anonymous gruntwork for others.
Reminds me, there was a Stephen King interview from about a decade ago where he observed that when he passes away, his son (Joe Hill) could probably keep publishing under his name for at least a few years, since Hill has perfected imitating King's general style.
I think the distinction, for me, is that when I pay for a book I want access to the author's creative thoughts and personality, not just their particular "brand". I realize that a lot of readers don't care, especially in the YA space, but I'd rather read a worse novel from the person who conceived The Hunger Games than a perfect imitation from someone who's merely imitating the brand.
> It's interesting though, isn't it? If she contracts someone good and the ghostwriter does an excellent job of imitating her style, then really you do get exactly what you wanted. I don't like the idea of it either, fwiw, but it's hard to rationalise.
There is nothing hard about rationalizing this. If this were what "I wanted", they would just put the ghostwriter's name on the cover with "written in the style of X" in bold letters.
But it is not what people want, and fewer people would buy the book.
> Writing is tough. If you are a famous author and not invested in the publisher's cash-grab request, it would make sense to have someone else write it while you perform basic QA for branding purposes.
And it would then make 100% sense to put that other person's name on the cover. The person doing the QA can be credited as "editor".
Interesting that in music, ghostwriters are known and pretty much not hidden in any way.
Ghostwriters are by definition hidden.
Recording artists release songs composed by other songwriters, but those writers are credited on the album.
Music is different in that it includes both writing and performance, but books include no performance component (except books on tape, I suppose, but then the narrator is credited).
With that in mind, most negative reviews seem to have this in common:
> If you want to read something very similar to The Hunger Games, this is your book. It goes reaping > dress-up > training > rating > games. The characters are different, but the plot is virtually the same.
> the book reads like a hunger games #1 rewrite
https://www.goodreads.com/book/show/214331246-sunrise-on-the...
Although most fans seem to love it.
I have a lot of experience with "LLM voice" as well, and none of that sounds even remotely LLM-written.
The smoking gun for LLM-written text is when the text is a "linked list". It can only ever directly reference the previous thing. That's not the case here. And the latest Hunger Games book isn't yet another amazon-published slopfest. It's been through a couple of rounds of editing, at the very least.
I'm not saying the Reddit OP is completely off-kilter. There might be something to what he's saying. Maybe Suzanne Collins (the author of the book) has been consuming a lot of LLM-generated content. Or maybe she's just ahead of the curve and writing in a style that's likely to catch fire (no pun intended) [1].
[1]: Yes, I wrote this myself! And the entire reply!
I use AI for world- and character-building in the RPG I DM, and this is for sure something an AI would write. The AI I use professionally does tend toward the linked-list pattern, but when I ask it, without an MCP, to "write" a character with a background, catchphrases, and what he would do in 3-4 different situations, I get paragraphs that look suspiciously like the first example.
LLMs can change their "voice" with a simple instruction. If detecting LLM-generated text were as easy as you think it is, then services in this space wouldn't suffer any false positives or false negatives.
Oh I don't think it's easy or that one can be completely sure about it. And in my experience, LLMs aren't very good at changing their voice. If you have any examples to the contrary, I'd like to see em.
I don't have examples, I'm asking you to provide evidence which supports your extraordinary claim.
You claimed, with a high level of confidence, that the text isn't written by AI because it lacks obvious "tells" which you believe to be present in any LLM-generated text. But if the absence of these "tells" reliably indicated human writing, then LLM detectors would have a false negative rate of approximately 0%. Do they?
It's not necessary for the tell to be easily computable. It's not something that's true everywhere, but it has been true more often than not when I've tried to use LLMs. And I haven't seen an LLM-generated piece of writing that's long and complicated enough to rule this tell out.
> none of that sounds even remotely LLM-written.
Did we read the same extracts? The nonsensical actions and movements of the lovers in the entire train scene? The obnoxious call-and-response structure? The absurd comparison between a grandmother's skin and a spider's web because "silk"?
> It can only ever directly reference the previous thing. That's not the case here. And the latest Hunger Games book isn't yet another amazon-published slopfest. It's been through a couple of rounds of editing, at the very least.
I don't agree with your assessment here; "the last thing" can be literally anything the user prompts. Are you suggesting that because none of the previous books in the series were written by AI, that's somehow an argument that the latest in the series can't be?
Okay, first, I haven't read the book. I'm going off the cherry-picked examples in the OP.
If an LLM was indeed used, the output was likely massaged to the point where it wouldn't be immediately obvious.
Now, my 2c: writing is sometimes atrocious, and sometimes authors jam in stupid things to maintain flow. If one person could make the connection between "silk", "spider", "weaving", and "grandmother", then another person could as well (even when one is "verifying" and the other "proving"). And using those properly, in context, and succinctly is far beyond most LLMs; it would require a fair amount of gambling, which would be out of character for someone whose writing prowess was established pre-LLMs.
As for what I mean by "directly reference the previous thing": LLMs can jam well-known (to them) phrases, sentences, and structures onto an idea/request. However, they are unable to loop back on the particulars of that idea/request in a coherent manner, which leads to slopification at large output sizes and shows us the ceiling on the quality of writing an LLM can produce.
Whether it's accurate or not (how could anyone really tell?), this is just another person who spends a lot of time in AI systems starting to question everything about their reality. See also some recent anime/comic cons where vendors have been accused of selling generated artwork. Being unable to unsee the potential for AI use in creative work drives people on missions to unveil the "truth", shading into conspiracy theory and paranoia. Call it being AI-pilled. I'm not saying the urge to weed out the "interference" is unfounded, or that the situation isn't unnerving, but it's not a good direction for many.
Can we spare HN some of these off-the-wall Reddit dives?
This is truly an abysmal revelation. I can't really put into words how horrified I am that actual real physical books are now getting AI slop enshittification.
There's nothing magic about physical books in a way that might protect them, and it's a terribly low-margin industry that's vulnerable to this kind of behavior.
Precisely this. Pop literature was already prone to plagiarism before AI because of the sheer volume of slush. "LLM co-authoring" is plagiarism run through the laundry.
That's a broad statement that's untrue for most genres, at least right now. However, it is true for the YA / chick-lit and smut-disguised-as-fantasy genres. These already have authors pumping out a book every few months with different characters jammed into similar situations. And people have personalized these kinds of interactions already (CharacterAI and others). Literature is literature, whether human- or LLM-generated, and it will sell like hotcakes if it caters to the common denominator well enough.
It can also be seen in newer fanfics. If you visit AO3 you might see a few that are written entirely with LLMs, and the authors sometimes don't even bother reading them themselves before publishing. The lower quality is almost always apparent from the get-go.
LLMs can be very useful for writing, but I don't see serious writers using LLMs except maybe for checking facts / as a knowledge base. The lowest tier of readers was already one-shot by LLMs over a year ago.
Lots of stories on YouTube seem to be AI-written or heavily AI-assisted (with AI-generated voices, subtitles, and pictures).
I was listening to "HFY sci-fi" at one point, but there are hundreds of channels with "getting revenge on the boss" or 50 other story genres, each pumping out a new story every week. Some are taken from other sources, some are AI-generated.
OK, I'll rephrase: in place of "book" I'll use "cultural anchor", which covers everything. The point remains that it is all under siege.
It's one thing that the internet is full of slop now, but something about this feels much worse. Books are something that for all of history have involved real human authors. Now it seems some human authors use AI and don't even bother to check that the results make sense.
It's already big in esotericism. A relative of mine was reading some book which, upon closer inspection, turned out to be a direct ChatGPT printout (yes, with the prompts included).
Feels so wrong in physical book form.