https://archive.ph/9ytmy
I'm a bit clueless on this subject, but I've been wondering why this heat isn't used productively to, say, generate electricity. Why don't AI datacenters, for example, use the heat to provide the municipality with hot water?
Electricity from heat is generated by temperature differences. Large differences are much more efficient than small differences and data centers are "small differences" for these purposes.
Unless you can use the heat directly, it won't be worth it to convert it into electricity.
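The "temperature differences" point is the Carnot limit: no heat engine can convert more than a fraction 1 - T_cold/T_hot (absolute temperatures) of heat into work. A quick sketch with illustrative temperatures (the specific values are assumptions, not data-center specs):

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum fraction of heat convertible to work; temperatures in Celsius."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Data-center waste heat: ~45 C coolant vs ~25 C ambient -> a small difference.
dc = carnot_efficiency(45, 25)
# Gas-turbine combustion: ~1400 C vs ~25 C ambient -> a large difference.
turbine = carnot_efficiency(1400, 25)
print(f"data-center waste heat ceiling: {dc:.1%}")   # ~6%
print(f"gas-turbine ceiling:            {turbine:.1%}")  # ~82%
```

Real machines land well below these ceilings, which is why a ~6% theoretical maximum makes waste-heat electricity generation uneconomical.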
Using heat pumps, you only need a little electricity to heat quite a few homes with district heating.
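As a rough sketch of the heat-pump math: the coefficient of performance (COP) of ~3.5 and the 5 kW per-home heat demand below are illustrative assumptions, not measured figures:

```python
def heat_delivered_mw(electric_input_mw: float, cop: float) -> float:
    """Heat delivered by a heat pump per unit of electrical input, given its COP."""
    return electric_input_mw * cop

# Assume 1 MW of electricity driving heat pumps with COP ~3.5,
# and ~5 kW average winter heat demand per home (illustrative).
heat_mw = heat_delivered_mw(1.0, 3.5)
homes = heat_mw * 1000 / 5
print(f"{homes:.0f} homes heated per MW of electricity")  # 700
```

The multiplier is the point: the heat pump moves low-grade data-center heat rather than generating heat from scratch.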
But there probably is no nearby district to be heated.
And in the US you only need heating for, at the very most, six months of the year. Though that window has shrunk in recent years, depending on the region.
True for heating but warm water is in demand all the time.
There are already some datacenters doing this: https://www.techtarget.com/searchdatacenter/tip/Data-center-... But for now, it's still the exception, probably because of the cost of such systems.
> The facility will consume 2.2 gigawatts of electricity — enough to power a million homes. Each year, it will use millions of gallons of water to keep the chips from overheating. And it was built with a single customer in mind: the A.I. start-up Anthropic, which aims to create an A.I. system that matches the human brain.
Sentences like this never cease to amaze me. All of this juice to attempt to match what a single human brain can do with its relatively low resource requirements.
> Each year, it will use millions of gallons of water to keep the chips from overheating.
Growing corn on that same 1200 acres would require on the order of a billion gallons of water. People have no sense of just how much water agriculture consumes.
In all probability, putting a data center on farm land is greatly reducing water consumption.
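The corn comparison holds up as an order-of-magnitude estimate. Assuming roughly 22 inches of water per growing season (a typical figure for corn; actual needs vary by region and how much comes from rainfall):

```python
GALLONS_PER_ACRE_INCH = 27_154  # one acre covered one inch deep

acres = 1200
water_inches_per_season = 22  # rough seasonal corn water requirement (assumption)

gallons = acres * water_inches_per_season * GALLONS_PER_ACRE_INCH
print(f"{gallons:,.0f} gallons/season")  # ~717 million: on the order of a billion
```

Much of that comes from rain rather than irrigation, but it illustrates the scale of agricultural water use versus "millions of gallons" per year.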
Yeah. The power use is impressive, although still only 1/10th of the output of the largest power station in the world, the Three Gorges Dam. The water use is not really a big deal at all. I don't know why journalists inevitably have to mention both. And even the power numbers require much more careful contextualization.
I wish I had a good balanced article to link to when these discussions come up, one that really digs into the power use question and compares it to the power consumed by other human activities and capital projects. Does anyone know of one?
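For what it's worth, the one-tenth figure checks out against the Three Gorges Dam's roughly 22.5 GW of installed capacity (capacity factors ignored in this quick check):

```python
facility_gw = 2.2       # planned consumption of the Indiana facility
three_gorges_gw = 22.5  # installed capacity of the Three Gorges Dam

ratio = facility_gw / three_gorges_gw
print(f"{ratio:.1%}")  # ~9.8%, i.e. roughly one tenth
```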
Doesn’t a typical McDonald’s require millions of gallons of water each week just for the beef alone, without even considering things like drinks and ice and boiling potatoes and washing floors?
By pointing out the obvious, you miss out on jumping on the bandwagon of everyone writhing in pain over how devastated they are by the environmental impact. Whenever AI is brought up, you have to remind people how unsustainable it is, how it's all about to fall apart, and how the value isn't there, among other topics. Off topic/on topic: I'm reminded of the death reports during COVID for particular states. When I went to look at the statistics of prior years to compare, they were on par with, if not lower than, the few previous years. Just an example of uncontrolled (mostly manipulative) outrage in action. I'd like to expect better of HN but I don't: I've lost all respect for the moderation (dang and now tom) here over the years.
Considering the human brain evolved over hundreds of millions of years (what a lot of data!), training a model in the span of a few months almost seems cheap.
Let me fix that for you: all that juice to exploit humans' cognitive biases and trick them into paying engineers and founders ungodly sums of money.
I don't think any rational person outside of the hype cycle thinks LLMs are emulating a human brain in a meaningful way.
I thought datacenter water cooling was usually a closed loop? I keep reading conflicting info about this
For very hot data centers, evaporative cooling is still popular. This is from 2012 but I doubt much has changed.
https://blog.google/outreach-initiatives/sustainability/gett...
13 years is an incredibly long time for something as fast moving as data center development. I guarantee that a _lot_ has changed. I know AWS in particular has gone through multiple entire revisions of their DC designs, and I recall a talk from some of their engineers saying how AWS actually found it more economical to use less cooling and let their DCs run hotter than they used to.
Here’s a recent article from AWS about using closed-loop systems for their AI data centers: https://www.aboutamazon.com/news/aws/aws-liquid-cooling-data...
Data centers may change but the physics of cooling doesn't.
It's more economical to run chips hotter, but at the end of the day you'll still have heat that needs dissipating, and it's hard if not impossible to beat evaporative cooling in terms of cost.
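The cost argument for evaporation comes from water's large latent heat of vaporization: each kilogram evaporated carries away about 2.26 MJ. A rough sketch (the 100 MW heat load is an illustrative assumption, not a figure from the article):

```python
LATENT_HEAT_J_PER_KG = 2_260_000  # latent heat of vaporization of water
KG_PER_GALLON = 3.785

def gallons_evaporated_per_hour(heat_load_mw: float) -> float:
    """Water evaporated to reject a given heat load, assuming all heat
    goes into vaporization (an idealization; real towers also reject
    some heat sensibly)."""
    joules_per_hour = heat_load_mw * 1e6 * 3600
    kg = joules_per_hour / LATENT_HEAT_J_PER_KG
    return kg / KG_PER_GALLON

print(f"{gallons_evaporated_per_hour(100):,.0f} gallons/hour for 100 MW")
```

Roughly 42,000 gallons per hour rejects 100 MW of heat, which is why trading water for electricity (versus running chillers) is so attractive.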
This is like someone in 1800 saying “at the end of the day you still have transportation needs and it’s hard if not impossible to beat horses and carriages in terms of cost”.
Literally just do a Google search. There are advancements every day that improve on evaporative cooling, making it use less and less water and energy, along with alternative methods other than evaporative cooling.
Bleeding-edge advancement and commercially-viable solutions are not apples to apples.
Are new water-guzzling DCs and nuclear plants built on water sources unlikely to be affected by climate change?
Nope. Evaporative cooling.
I think you’re conflating things. It’s not a single human brain. It’s processing power to provide the human trove of knowledge and reasoning at the beck and call of millions of people nearly simultaneously. No single person would be able to do that.
It's a laudable goal to match human intelligence, but we can't ignore the cost for too long. If humans can produce the same reasoning for cheaper, why would we chug along on this path? And this may be the case when the AI hype bubble pops and real economics take over.
This is not the intelligence of one person, even the most intelligent person, but rather the intelligence (wisdom) of the whole crowd: the whole world, democratized (i.e., available to everyone for a nominal fee).
That’s an important difference.
No, but we had done a pretty good job of storing and indexing it all on this thing called the Internet.
And for the same purpose before that we had, or rather have, since it is still very much in use today, the Dewey Decimal System. Peering further onward still into the annals of library science, one can find even more creative and revealing methods of indexing human knowledge. In doing so, one might even be inclined to believe that the state of the science has come so far as to be considered solved. Alas.
I think if you went to the library and asked for information on "how to build a chicken coop" and the librarian took 60 books related to chickens and building and farming, cut up the words in them, then arranged them in a way they found satisfying, you might start going to a different library.
> It's processing power to produce near-unlimited spam and millions of pictures of shrimp jesus
ftfy
Once the overall AI bubble bursts/deflates, would they be able to re-purpose it for other workloads e.g. cloud gaming?
Though, maybe Midjourney or other niche AI companies could buy it for the right price I guess.
The server/ML GPUs are not great for gaming; they strip out all the specialized shader/pixel units to cram in more general compute cores.
In theory you could write games with just those, but aside from the amount of work needed, I'm not sure the performance would be good enough; the specialized texture samplers etc. can be faster than general-purpose compute shaders.
Also, for cloud gaming you want very low latency, which means a few GPUs in local POPs all over the world, not a lot of GPUs in a few large data centers.
Putting aside possible specialization of hardware and differences between applications, cloud gaming was tried and largely flopped. Relatedly, the power-intensive use cases where outsourcing could be useful had too much latency involved. Remember Stadia?
"Trainium accelerator" doesn't sound like it would be terribly useful outside it's current niche. It's not clear how much of the spend is that kind of thing versus general purpose compute and storage.
This feels like saying when the dot-com bubble burst, could the servers be re-purposed as mainframes. Whatever about valuations and individual stocks, AI/LLM workloads are only going up.
I am more on the AI skeptic side (LLMs are not a path to general intelligence, and most positions cannot be 100% replaced with any version of the product I have seen to date), but I fully agree. Some form of AI text generation is likely to be here forever. Maybe it gets vastly more efficient per clock cycle, but compared to N years ago, all of the tech vendors need more compute to offer this service.
It doesn't matter if it's general intelligence. What matters is whether they can get workers and consumers using it every day. Which increasingly seems to be the case.
>A year ago, a 1,200-acre stretch of farmland outside New Carlisle, Ind., was an empty cornfield. Now, seven Amazon data centers rise up from the rich soil, each larger than a football stadium.
Can't they find brownfield sites instead of fields?
> "Can't they find brownfield sites instead of fields"
You mean precisely like xAI did in Memphis? Build on the site of an abandoned factory—a historically disadvantaged, low-income industrial zone?
I couldn't imagine anyone having anything negative to say about that.
This seems pretty bad faith.
The problem with what xAI is doing in Memphis is that they were originally running "temporary" generators for over a year (effectively permanently...) without meeting the emission requirements a permanent generator would need to meet.
Sure, it's not xAI's fault the local power grid isn't suitable for their datacenter but the problem here is that xAI is effectively not following the law.
As far as "industrial" neighbors go, a datacenter is a pretty good one to have vs an abandoned husk vs an active polluter/nuisance, at least if that datacenter isn't run by a selfish psychopath that tacked on a bunch of gas burning generators to power the thing.
This comment got me wondering whether loss of farmland in the US is a serious issue.
It looks like there's about 800 million acres of farmland in the US, and we're losing about 2 million acres per year to the land being repurposed. Despite that, crop production has more than tripled in the past 70 years due to technological advances.
That said, economic effects, loss of farmland, and climate change have contributed to slower growth and higher variability of crop yields recently.
In the past decade there's been a modest 0.8% annual increase in crop production despite losing about 2 million acres per year.
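Using the figures above (800 million acres total, 2 million lost per year), a quick calculation shows how small the annual fraction is:

```python
import math

total_farmland_acres = 800e6
lost_per_year_acres = 2e6

annual_loss = lost_per_year_acres / total_farmland_acres
print(f"{annual_loss:.2%} of US farmland lost per year")  # 0.25%

# At a constant 0.25%/year, how long until half the farmland is gone?
years_to_halve = math.log(0.5) / math.log(1 - annual_loss)
print(f"~{years_to_halve:.0f} years to halve the farmland base")
```

A steady 0.25% annual loss would take well over two centuries to halve the total, which is why the yield gains have so far swamped the land loss.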
Yeah, it's not a significant amount of land. But it does seem like these companies prefer to take quality farmland instead of unused land. There's something similar happening near my town, where a company wants to put a big solar array on some prime farmland, and the locals are asking why that spot and not getting an answer. It might save a small amount on development, as farmland is fairly flat and has no trees to remove, but that's minuscule in the overall budget for these things, and rough ground would be much cheaper to buy. But the corporation behind it and the government entities involved are digging in their heels and insisting on using farmland, without any explanation why.
So it does seem like some of the people making these decisions just like the idea of taking farmland out of production for some reason. Maybe they just don't like farmers or modern farming methods. If that's their motive, they may not realize how tiny an effect they're having on the total, because most non-farmers don't really understand how much land is out there.
It's like the people who say Bill Gates is trying to control the food supply because he owns something like 270,000 acres of farmland. Even that just isn't that much, not enough for him to control anything larger than the horseradish market.
The problem is not the amount of land but whether that land is economically viable. Farming already has low margins. So if you grab a good location to build a data center and push the farmland even further away from population centers, then you are pretty much killing family farms.
There are virtually no family farms left in the US. Especially central Indiana corn farmers. 1200 acres wouldn’t be a financially viable corn farm if it were family farmed.
I think this is really close. My hunch is that agricultural land is just simply cheaper to acquire and convert, as compared to industrial land which may or may not have all kinds of remediation or razing that needs to happen to it first.
One of the major problems facing American agriculture is that there are fewer and fewer farmers/farming families.
Farming is extremely money and labor intensive and there’s a lot of upfront investment with a lot of long-tail return, and it’s not “sexy” the way (for example) AI is, so there’s not a bottomless pit of cash to shovel into the furnace for a quick buck turnaround.
Independent farmers tend to seriously rely on good weather and a lot of advantageous tax treatment.
Of course massive agri-business would very much love to continue to fill more and more of the void left by the shrinking independent farming population. That has its own problems.
It will swing back one day. Maybe in a few generations.
Has there ever been a swing back to small, independent producers of anything after an industry has consolidated?
Also whether the farmland requires irrigation. In Ohio, there’s not much irrigation. In Indiana, you see it a lot more. In Idaho, it’s basically a hard requirement
I don't know about Indiana, but in Texas companies buy farmland because it's often available near population centers and infrastructure. And the land is available because the existing owners do just enough farming/ranching to qualify for the lower tax category.
The county it is located in is about 290,000 acres.
Indiana is about 23 million acres.
This particular site is being built basically in the parking lot/backyard of a steel mill, so it sort of is brownfield. I have a hunch that “empty cornfield” is just the author using some artistic freedom instead of saying “an empty field in the Midwest”.
It was definitely farm fields growing crops, you can look in Google Earth and see for yourself.
Edit: Anywhere in the Midwest with a large expanse of flat ground will almost always be growing crops or have a town built on it. Undeveloped land is almost all hilly and forested, or floodplain; nowhere you could put a data center.
Why? Holdouts, zoning, greedy politicians and more. I drove through a blighted neighborhood. There were 60-70% unkempt houses and maybe 10-15% in great shape. Some won't want to move for their own reasons and will refuse to sell. Some will just be trying to get more money. Others will be sincere. It's very hard to unsubdivide land.
And then there are the zoning and city regulations which naturally resist. You would think they would welcome a new taxpayer, but they're really in love with the regulations that someone wrote decades ago. The odds of getting them to bend is anyone's guess, but in most cases a bribe or ten is going to be required.
It's probably not actually rich soil; maybe a writer's flourish. Post-harvest fields are pretty much barren of nutrients and life.
> The local utility will largely use natural gas to generate the additional electricity needed to power Amazon’s data center
Sad, but expected.
At this point we have to be glad it's not coal. After all, we have a president that just signed an executive order to foster the "Beautiful Clean Coal Industry."
> At this point we have to be glad it's not coal.
it will be if the coal price drops far enough
Far better than xAI's data center being powered by mobile diesel generators.
xAI is natural gas, not diesel.
For something to be considered "sad", there needs to be an available, practical alternative. What should they have chosen instead?
We could not build new data centers and focus on fixing the real issues instead.
We could always revisit this decision once clean power is actually available.
Cover the roof in solar panels and use the power to do something useful instead of running bullshit generators.
Or maybe a park with some oak trees and a frisbee golf course idk.
It's either that or nuclear.
You say that as if nuclear was bad?
Not at all. I think it's probably the best option if you want guaranteed electricity.
False equivalence bias
Dunning–Kruger effect
Decades of lies about nuclear has its downsides.
> After building seven data centers in Indiana, Amazon plans to build 23 more.
Jesus
Large numbers of the 8 billion people in the world are still not digitally literate.
The demand from all the poorer countries with large populations is only going to go up as their kids grow up with tech.
I'd imagine they would build those data centers closer to those populations.
[dead]
> is the first in a new generation of data centers being built by Amazon, and part of what the company calls Project Rainier, after the mountain that looms near its Seattle headquarters
Near both the Cascadia fault that's produced magnitude 9 earthquakes and a chain of active volcanoes, both of which are statistically likely to go off again. I do wonder what contingency plans Amazon (and Microsoft) have in the event of a megathrust earthquake.
https://en.wikipedia.org/wiki/Cascadia_subduction_zone#Forec...
https://en.wikipedia.org/wiki/Mount_Rainier#Modern_activity_...
It is named Project Rainier, but the facilities are in Indiana.
If you're speaking metaphorically, the megathrust would be some cataclysmic world event like China invading Taiwan. In which case they won't need to burn natural gas for power or waste fresh water for cooling...
Megathrust earthquakes are pretty common around the world. Some of them cause major damage, such as the ones in 2004 and 2011. But even then, the impact is mostly regional.
> Both of which are statistically likely.
It seems a little dishonest not to mention that "statistically likely" is on a geological time scale.