What's the nature of these videos? I don't really see anything AI at all on my feed. That said, most of the videos I watch are of people talking long form who I've been following since before the AI craze.
A lot of these are just a voice over some random graphics that may or may not relate to the content being discussed. Most of the content is pilfered from real creators.
Travel videos, for one. I've had to wade through a bunch of "Top 10 must-see places in $CITY" that are obviously thrown together listicles, narrated by an AI voice, with no value whatsoever.
Makes me wish they'd start showing thumbs-down counts on videos again, maybe that would have some impact on the problem.
This is the world that software developers are building. At what point do they become accountable for the resulting mess?
I don't think people should be held liable for the misuse of tools they create, unless the tools were created for that purpose.
That's not something I can agree with.
Take image generation trained on a bunch of copyrighted photographs and artwork. What is the intent behind creating such a tool?
Yes, there is a stage where the software developer is building the software to do this, operating purely on a "this is a cool hack" kind of mentality, but the point at which you make it available to other people, especially for payment, is when the liability becomes real.
Having worked on open source software for most of my life, I have always had to be aware of the issues surrounding copyright. That other software developers will write software and use copyright to protect it while ignoring the copyrights of others is deeply concerning. Copyright of software is no more or less important than copyrights applied to images and artwork.
Engineers are liable for their mistakes. At some point the same mechanisms may well need to be applied to software.
Software developers rarely have the power to say no to the people who sign their paychecks. If you do, you're speaking from a position of privilege that most don't share. Count yourself lucky rather than spewing hate on those whose lives are not as privileged as yours. Not everyone can go long periods of time without income, especially in the current labor market and doubly so if it is known that they're willing to say no to their employer.
Yeah, I get that. But you gotta feel filthy. I guess over time those people become so desensitised to it they just don't feel it.
No value whatsoever? I find the travel ones to be slightly useful as a basic, neutral introduction. I will often choose an obvious AI version over some opinionated person who is trying to build a brand. (Sorry, humans.)
That being said, some of the random still frames that make up these videos are pretty stupid and valueless.
AI narrated space opera - HFY. There are like 10 original stories, and everything else is some regurgitated version of those.
In addition to the video types that mpicker0 and lysace mentioned, here's yet another:
Fake, and dishonest, Christian Country Music.
I learned of this through Fil Henley, who just pointed out Ella Scott (https://www.youtube.com/watch?v=X0wtyIljNns) which is an AI account with AI-generated songs and thumbnails being produced at an unfeasibly high rate for an actual human. There are astroturfing channels under other names that purport, with more AI fakery, to show themselves alongside Ella Scott. There's an Ella Scott psychology channel tapping into the self-help market. There's an Ella Scott Soulnotes channel. There are not-declared-AI linked accounts on other platforms. Until this all got high exposure, there was an active PayPal donation setup for a non-existent Ella Scott charity.
This isn't even the first firehose account of AI-generated CCM that M. Henley has covered. There has been the improbably double-barrelled James Hilton-Cowboy (https://www.youtube.com/watch?v=9HORuCSsDLI) for example.
And there is an audience of Christians being taken in by these thinking that they are real performers.
Some AI country music tapes are actually pretty funny - https://www.youtube.com/watch?v=KhRkkNr-6V4
The irony of believers in the supernatural being too easily fooled by AI is entertaining and unsurprising - part of being Christian is being open to ‘fake and dishonest’.
And besides, if they like the music, what’s the harm?
It's all too easy to be unsympathetic. But the unsympathetic should remember the soliciting of donations to the non-existent charity for supposedly aiding children; and then bear in mind that there are going to be AI-wielding con-artists in other fields, closer to home, with other hooks for grabbing the marks, too.
Ironically, the religious have rules about doing unto others: be sympathetic to the people conned into giving their money away, in the hopes that they'll be sympathetic to you when you're conned in your turn.
Not OP, but whenever I search for some term that might have more watch time, you will see AI-generated content. For example, try searching for "li ion battery".
One example: supposedly factual documentary videos of historic events.
But really, I've seen this across so many genres lately.
I don’t know if Alphabet wants to solve this, though.
My feed is filled with AI slop videos. They usually have AI-generated thumbnails and AI-generated voice-overs laid over either animated AI-generated images or random clips stolen from actual creators. Last month, I was getting 5-6 videos every refresh featuring fake, futuristic AI-generated cars. Nowadays, I am getting the same with AI-generated tech products.
Typically, all these videos have under 1000 views. I am at a point where I might even write a user script to hide every video under 1000 views.
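For what it's worth, that filter is only a short userscript away. Here's a minimal sketch of the idea for a Tampermonkey-style script; the `ytd-rich-item-renderer` tile selector and the "N views" text format are assumptions about YouTube's current markup, and the script will break whenever the page structure changes:

```typescript
// ==UserScript==
// @name         Hide low-view videos (sketch)
// @match        https://www.youtube.com/*
// @grant        none
// ==/UserScript==

// Assumption: feed tiles are <ytd-rich-item-renderer> elements and the view
// count shows up inside them as text like "823 views" or "1.2K views".
const MIN_VIEWS = 1000;

function parseViews(text: string): number | null {
  const m = text.match(/([\d.,]+)\s*([KMB])?\s*views/i);
  if (!m) return null;
  const multipliers: Record<string, number> = { K: 1e3, M: 1e6, B: 1e9 };
  const base = parseFloat(m[1].replace(/,/g, ""));
  return base * (multipliers[(m[2] ?? "").toUpperCase()] ?? 1);
}

function hideLowViewTiles(): void {
  document.querySelectorAll<HTMLElement>("ytd-rich-item-renderer").forEach((tile) => {
    const views = parseViews(tile.innerText);
    if (views !== null && views < MIN_VIEWS) {
      tile.style.display = "none"; // hide rather than remove, so it's reversible
    }
  });
}

// The home feed lazy-loads as you scroll, so re-apply the filter on DOM changes.
new MutationObserver(hideLowViewTiles).observe(document.body, { childList: true, subtree: true });
hideLowViewTiles();
```

Strip the type annotations and it runs as plain JavaScript in any userscript manager; the MutationObserver is what keeps the filter applied as the infinite-scroll feed keeps adding tiles.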
If you want to banjanx the algorithm, instead, watch a few Kyoto Tachibana Senior High School Band videos (e.g. https://www.youtube.com/watch?v=kIouDxRfTb8) or Shimane Prefectural Izumo Commercial High School Band videos (e.g. https://www.youtube.com/watch?v=VPLzr3cuFtk).
There appears to be an enormous algorithmic weight to those. Just a few such videos completely swamped my recommendations with Japanese marching bands a couple of years ago. Simply finding those two to copy their URLs into this post has immediately put two further band videos into my recommendations.
This doesn't surprise me. Websites and blogs and search results are polluted with trash. On the plus side, it has ruined any "explore" type watching of anything for me, which is making me spend less time on these platforms.
Looking for advice? One approach that might be interesting to try is to subscribe to channels and restrict yourself to your subscriptions. You could even bookmark the subscriptions tab in Youtube so that you don't land on the main page with recommendations.
Oh, I like the idea of sticking to subscribed channels. It's the equivalent of only using the Following tab on Twitter.
It's funny how invariably the algorithmic feed ends up being overused and pushes people to self filtered content. Facebook is the earliest example I remember of this.
I don’t see any because I only watch channels I trust.
Same. I don't stray from my core Youtube. I don't even subscribe to channels, but my algorithm is still only offering me videos from creators that I've been watching for years and trust. When one of them jumps the shark, looking at you Veritasium and Mark Rober, I just stop clicking their videos and eventually they stop showing up in my list.
Curious which video caused you to dump Veritasium? Other than click-bait titles and thumbnails, the production and information presented seem first class.
When he stopped presenting them live, and switched to animation with voice over. The change was likely made to be able to pump them out faster, and there was a definite drop in quality that came along with the change.
That’s likely due to the fact that Veritasium has PE investment now.
https://www.electrify.video/post/electrify-completes-majorit...
https://youtu.be/hJ-rRXWhElI?si=UVbFVwH9h9HQw3L2
Is there anything PE can't make worse?
The animations seem appropriate for teaching the content and actually seem like more effort than just talking into a camera.
But I agree there has been a drop in quality with the growth of his team.
Did that happen when Veritasium became owned by venture capital?
How do you find those channels?
Up until recently the YT recommender algorithms did a quite good job at that for me. Now they serve up crap.
My bar for new channels is that they must have a person presenting the content, who appears on camera for enough time to convince myself they're not fake.
Unfortunately, that rules out the likes of Minute Physics, ScienceClic English, and the famous for being camera-averse Lockpicking Lawyer. (-:
LPL appears on camera. Just not his face.
The question is whether a disembodied pair of hands, that could be those of a body double, is enough to satisfy stronglikedan.
The good news is that Big Clive, 3Blue1Brown, and even Jago Hazzard have all appeared on camera, albeit not always on their own channels. (-:
I have a pre-existing list based on my interests.
Sometimes those creators will mention other channels or I will hear about a channel from someone else and I’ll check them out.
Yes. The elderly people in my family are falling into watching AI-generated videos which a human never touched before getting views. Grandpa is watching AI-generated stories of puppies saving lives (I had to break it to him, none of these are real) -- just search "puppy saves baby" to see an endless stream of slop. Mom is watching videos from a Galactic Light Federation which she says raise her vibration, and she doesn't believe these are AI. It's tragic, such a waste.
The even worse ones are "POLITICAL FIGURE owns/destroys/etc. JUDGE/POLITICAL PARTY/OTHER POLITICAL FIGURE"
Now imagine your Grandpa was the President of the United States and he was tweeting out links to these AI slop videos for everyone to see.
And also the AI slop video featured both himself and his daughter-in-law, so you'd think he would have a pretty good idea the video was AI slop.
And the subject of the AI slop video was announcing magical beds that heal every problem you might have.
https://www.dispatch.com/story/news/2025/09/29/what-are-medb...
The United States is so very fucked.
My thinking is that YouTube/Google/Alphabet are capable of detecting this crap but choose not to use that signal in the Youtube recommender algos.
Just like Amazon could detect fraudulent sellers but chooses not to, because they make money off of those sales just like the legit ones, and customers stick around despite the fraud.
I only watch new content if there is a person presenting it and making themselves visible on camera for at least a few minutes. It's not perfect, but those few minutes let me evaluate whether I trust they are real. Of course, this doesn't apply to channels that I trust, such as those I've been watching for years.
As opposed to being full of regular crap?
I can confirm. My YouTube algorithmic feed is full of AI slop on a whole range of topics, including fake news items, which is particularly concerning because it's often pretty convincing.
As a point of reference: I usually watch high quality content, avoid shorts and other brain rot.
So, basically like any place on earth where a lot of people meet.
I stopped going to festivals for that reason.
The worst was near a lake: in the time we needed to set up our small tent, enough RVs had arrived whose occupants thought dumping their full toilets into the lake wouldn't matter... yeah, bathing never happened, because a lot of brown sausages were swimming around, plus random trash.
But it didn't stop there: before the first night, the way between the camping grounds and the different stages was clearly visible even without a lot of artificial light, due to the white plastic in the trash lining both sides. Disgusting.
A lot of festival goers are basically ignorant shit-holes on legs. Even if there are only 2-3%, that often spoils it for the rest.
/rant over
1) Disable YouTube history
2) Delete YouTube history
3) Remove ads (either adblock or YT Premium)
4) ScreenTime on iOS + one sec (or screen zen)
It's a bit drastic, but it has completely cleared my timeline of slop (by completely clearing my timeline).
Without YT Premium, YT has become a cesspool.
With YT Premium, it's boring as the algorithm keeps bringing you back to your bubble.
Search YT for “whale barnacles”; it's an endless stream of the worst AI slop.
Here's a weird data point: I was recently doing some physical labor with a neighbor who is probably a recovering addict with what some might call a "partially fried brain." After a bit, I noticed that he liked to have his phone out playing AI slop videos with Star Wars-like stories about Galactic this or Federation that.
He said it settled his brain better than human generated content. It was soothing, consistent and just one or two notches above bland.
I left with the impression that his brain and the AI content had a great impedance match.
That makes sense.
The segment of the population that is suffering from brain rot is gigantic. This is the segment Mark Zuckerberg loves.
In his own words: "dumb fucks".
Junk food has a similar effect on the brain - without exercising proper discipline it has a catastrophic effect on productivity.