This new solar thermoelectric generator (STEG) traps heat on one side and cools the other, making electricity from the hot-cold gap via the Seebeck effect. Unlike solar panels, which need direct sunlight, STEGs can use ambient heat and scattered light. That’s why they still work in shaded or cloudy areas—any temperature difference can generate power. Today they’re only ~1% efficient (vs. ~20% for solar), but the new design is 15× better than earlier STEGs. They won’t beat solar panels yet, but could be useful in spots where panels underperform.
I did data collection for a paper looking at the Seebeck effect in magnetic insulators about 10-15 years ago, and it seemed like everyone in the whole physics department considered spintronics pretty dead. It feels great to see some big promising applications coming out of the field.
It’s explicitly part of Chinese science and technology strategy to think outside the box and it’s what’s pushing them forward on areas like semiconductors as well.
Given the massive advantage in talent they’ve built up while the US reverts to Drill Baby Drill, we know how this ends.
Eventually the US will push for atmospheric dimming to “fix” the negative externalities of their approach, which would have the nice side effect of degrading solar….
You’re being downvoted but I don’t think you’re entirely wrong. China has been pursuing some stuff that the western world had essentially abandoned, getting interesting wins (eg: thorium reactors).
He's wrong in the "why" though. It's not that they must think outside the box, it's that they _must_ not all focus on a single point of research. I'm pretty sure that they are also pursuing popular research topics, because it would be pretty bad if they fell behind for not doing the obvious.
So are they basically using a similar idea to that of a Stirling engine in a thermoelectric generator, or do they use a different mechanism to produce energy?
Two materials (often n-type and p-type semiconductors) are joined at two junctions; one junction is heated and the other cooled. The temperature difference makes charge diffuse from the hot side toward the cold side, and this charge is what turns into the Seebeck voltage they describe. It was just very hard to get anything meaningful out of this, because you can't easily get such a temperature difference. If you've read about the Peltier effect, it's the same thing as this, just in reverse.
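To put some rough numbers on that explanation: the open-circuit voltage scales linearly with the temperature difference, and the extractable power with its square. A minimal sketch with assumed ballpark values for a Bi2Te3-type couple (not numbers from the article):

```python
# Rough sketch of the open-circuit Seebeck voltage and matched-load power
# for a single thermoelectric couple. Material numbers are illustrative
# assumptions, not from the paper.

S = 400e-6      # combined Seebeck coefficient of the n-p couple, V/K (assumed)
R = 0.01        # internal electrical resistance of the couple, ohms (assumed)
T_hot, T_cold = 350.0, 300.0   # junction temperatures, K (assumed)

dT = T_hot - T_cold
V_oc = S * dT                  # open-circuit Seebeck voltage: V = S * dT
P_max = V_oc**2 / (4 * R)      # power into a matched load (R_load = R)

print(f"dT = {dT:.0f} K, V_oc = {V_oc*1e3:.1f} mV, P_max = {P_max*1e3:.1f} mW")
```

With these assumed numbers a 50 K gap yields only tens of millivolts per couple, which is why practical modules stack hundreds of couples in series.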
I’m really excited for this technology to be used for solid state AC. I think we’ll get there one day.
I’m even willing to accept a lot of inefficiency, because new ACs in the US can cost $10-20k now and you can need a new one as frequently as every 7-10 years.
I presume this would be useful for thermal camouflage, since you could store the heat energy in a battery instead of diverting or diffusing it. I hope bleeding edge research sees the potential for aiding the climate crisis as ethically imperative over a military's niche technological advantage/cash.
I sometimes wonder if you could make much safer nuclear plants with TEGs. There would be next to no active processes and moving parts: no pumps, coolant, or circulation. Use nuclear magic to make the hot side; cool with ambient temperature.
No doubt this would be less energy efficient, but perhaps you reap so much savings by not having to worry about water circulation that it's worth it.
Sort of. They're called Radioisotope Thermoelectric Generators (RTGs), used where you need a small amount of super-reliable power, mainly on deep space missions where there is not enough sunlight. The Soviets also used them in unmanned far-north lighthouses.
Neat tech, but very inefficient; to make it efficient, fluids start needing to be moved around, which hurts reliability. The next step up is pebble bed reactors. A few have been built (e.g. Germany's AVR, China's HTR-PM); the idea is to have self-contained fuel "pebbles", enriched enough to get hot but with enough built-in moderation that they can never melt down. Then a traditional heat engine is bolted on.
Not for the ones already launched and we're not launching any of those in the near future, especially not with the current batch of governments.
And I'm not convinced that particular discovery would yield that kind of performance increase for such an application. There are just too many things different in the environment alone.
> Use nuclear magic to make the hot side, cool with ambient temperature.
If you use ambient temperature for cooling, you are severely limited in your total power output. Like, we're talking about less than a megawatt of output (depending on how big the ambient heat exchangers are) compared to the ~1 GW of a regular old nuclear plant.
You might say: that's fine, let's just build many small ones. But you still need to track your radioactive material, make sure it's not stolen etc., which is a lot of overhead per installation.
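A rough sanity check of that power ceiling: rejecting reactor-scale heat to still air with a small temperature difference needs an enormous surface. The convection coefficient, temperatures, and TEG efficiency below are assumed ballpark values, not anything from the thread:

```python
# Back-of-envelope for ambient-air cooling of a TEG-based plant.
# Newton's law of cooling: Q = h * A * dT  =>  A = Q / (h * dT)

h = 10.0           # W/(m^2*K), natural convection to air (assumed)
dT = 30.0          # K above ambient on the radiating surface (assumed)
Q_thermal = 20e6   # W of heat to reject for ~1 MW electric at ~5% TEG efficiency (assumed)

area = Q_thermal / (h * dT)
print(f"required surface: {area:,.0f} m^2 (~{area/10000:.0f} hectares)")
```

Even for a single megawatt of electric output, the passively cooled surface comes out in the hectares, which is the overhead-per-installation problem in physical form.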
I don't entirely follow. A lot of existing nuclear plants use ambient temperature for cooling, via cooling towers, no? They use pumped coolant to get the heat out of the reactor, sure, but the cold end is just air.
Also, I guess you could make the hot end very hot too, thus improving efficiency. Especially if, by virtue of the cooling being safer, you could run it at a higher temperature (less safety margin needed).
Nuclear power plants aren't inherently any more unsafe than the nuclear material itself was before it was mined.
What makes them potentially unsafe is nuclear technology having an incredible energy density, which can be misused, and the fact that radioactive material is active even without prior activation. The latter makes many radioactive isotopes a very effective poison.
And misuse or bad practices are a general problem. One can build awful buildings, toys or government structures, too.
And if you have a look at what the worst reasonable non-political consequences from a nuclear powerplant meltdown can be, they're surprisingly harmless.
We have the Soviets to thank, for their absolutely incompetent response to the Chernobyl meltdown, that we have a good idea of the long-term effects. The power plant never stopped operating; people kept working there every day for decades. Hundreds of people were never evacuated, and hundreds more returned within weeks.
Just to put this into perspective: Chernobyl was effectively a dirty super-bomb dispersing 50t of highly active radioactive material, and yet the death count among anyone who didn't come within rock-throwing distance remains 0.
Nuclear plants sometimes aren’t even worthwhile at their current efficiency. Even if TEGs simplified things mechanically, you’d still face the problem that nuclear often can’t compete with renewables in a direct comparison (setting aside overall strategy for base load and such).
Yes, I wonder if, with the current trend, solar might become the real alternative: less expensive, less risky.
But to be fair, you have to consider your “aside”, because nuclear has the tremendous advantage of working when it’s cloudy or dark, and you need the energy most in the winter.
I do not think that we can just compare the prices; maybe we should also add the cost of storage (which is going down too) for solar.
But currently a mix is probably the pragmatic approach.
It's going to be solar + wind + battery. That's where the economics are at. Sodium batteries are just coming online now https://en.m.wikipedia.org/wiki/Sodium-ion_battery - lithium is getting phased out.
Maybe in some far off future nuclear will have a role... But the global energy investment markets paint a very clear picture: solar + wind + battery is the way.
But now you have a probabilistic system. Your battery part is designed for n days of low/no solar/wind input, so you are paying for a system that would be sufficient for x% of typical/historical years.
Which has to be factored into the design and cost calculation.
Nuclear is also probabilistic: in 2022, half of France's nuclear reactor capacity was offline because of premature aging of some components in a design used in many reactors.
Note, in case SMRs become part of our grid: what if something similar happens to your hundreds of produced and deployed SMRs?
There are always unknown unknowns that might be correlated, like that flaw in the component. Weather systems like a https://en.wikipedia.org/wiki/Dunkelflaute are known facts. The good thing is that we do have excellent weather data for the last 75 years, so it is totally feasible for proponents of renewables to run their models through that historical data and say: Look, if we have that amount of renewables and that amount of batteries that would have been enough for the last 3 quarters of a century.
Nuclear costs are largely due to regulatory burdens created for reactor designs that are not safe. That is no longer the case. Also, attempts to exploit economies of scale could also improve baseline costs, although these attempts haven't been funded enough yet to actually scale.
In the coming years, it doesn't make sense to use batteries to sustain winter load: it would be way too expensive. But batteries get cheaper quickly, such that it doesn't make sense to build expensive nuclear plants just for winter. What does make sense, until batteries are cheap enough, is natural gas during winter, plus (where available) wind energy and hydro / pumped storage, existing nuclear plants (optimised for winter), biomass (wood), photovoltaics in the mountains, and geothermal.
So show me the model of renewables + batteries that would have been sufficient for all of the last 75 years in Germany and the UK. We do have the historical weather data so there is ZERO reason for all that handwaving.
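To illustrate the shape of the backtest being asked for (with synthetic stand-in numbers, not real weather data), the core loop is just a battery state that must never go negative over the historical series:

```python
# Toy sketch of a renewables + storage backtest: given a daily generation
# series and a flat demand, check whether a given battery size would ever
# have run dry. Generation figures are made-up stand-ins for real weather data.

def battery_suffices(generation, demand_per_day, capacity):
    """Return True if the battery never runs empty over the series."""
    charge = capacity  # start full
    for gen in generation:
        charge += gen - demand_per_day
        charge = min(charge, capacity)  # the battery can't overfill
        if charge < 0:
            return False
    return True

# Synthetic example: sunny days bracketing a 5-day "Dunkelflaute"
generation = [120] * 30 + [10] * 5 + [120] * 30   # GWh/day, invented
print(battery_suffices(generation, demand_per_day=100, capacity=500))  # True
print(battery_suffices(generation, demand_per_day=100, capacity=300))  # False
```

A real study would feed 75 years of hourly wind and irradiance data through exactly this kind of loop, which is why the "show me the model" demand is entirely reasonable.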
Exponential growth is a funny thing. First it looks like nothing is happening, and all of a sudden everything has changed. Check out discussions about wind and solar some 10 years ago.
Edit: for reference, from memory, it took about 50 years to install the first TW of solar. The next TW took 2 years, and the next TW is projected to take only 1 year, in 2025.
You do realize HVDC grids can move energy 3,000 km, right? That's basically anywhere to anywhere in the continental US. There are already installs like the PDCI https://en.wikipedia.org/wiki/Pacific_DC_Intertie that take 3 GW from northern Oregon to LA.
> so the 3.2GWh battery grid storage array, in operation, this is still 1/100th what is needed?
That's closer to 1% of what California needs by itself than even 1% of the USA's needs. We aren't even taking into account the large and continual growth in electricity demand yet, either.
Nuclear costs would be way higher if plant operators had to carry insurance for catastrophic failures. Right now, they don't; the state (the population) just takes on that risk.
Even ignoring all of that, there's "time to first watt": essentially, if you break ground now, how quickly can you start producing power? Nuclear is on a scale of years; wind and solar, weeks if not days.
And when better tech comes along, you can partially transition a farm to newer panels and resell the old ones on the aftermarket.
Plus, you don't have to build Onkalo-style repositories to store waste for 100,000 years after you've produced your electricity.
I have this same issue with fusion. Who cares if the fuel is practically free, when building and operating the plant is extremely expensive and prone to failures due to the sheer complexity.
Of course the tech and science is cool, possibly useful in space or other niche environments, but whenever I see fusion proposed as some general energy solution, I just roll my eyes and move on.
People really love sci-fi on HN, and that's fine... but the investment capital has spoken, and renewables are being funded 30x nuclear. Not 30% more, 3,000% more. It's even 2x over oil, gas, and coal infrastructure.
We'll have direct antimatter annihilation at scale before we have fusion. It's basically a physics research project, with zero potential for commercial use.
There's already a convenient fusion reactor fairly close by, and it's unlikely to stop operating any time soon.
Wind + solar just adds another failure mode: times when there is no wind. There are many places without adequate wind speed. Nuclear cares about neither, and has the highest energy density on top of that.
Nuclear has its own failure modes. In Switzerland, one of the nuclear plants will be offline for winter (!) due to "unplanned repairs". This will cost the owners of the plant millions.
Nuclear also has weather failures: low water levels and high river temperatures. In both cases the power plant needs to reduce output or be turned off.
Those aren't weather failures but environmental regulations; there's nothing preventing the plant from working if you really want it to. It's just not needed, especially in summer.
If you remove environmental regulations, then (pumped) hydro, wind, and photovoltaics would also be much cheaper, and much faster to build. For windmills, it's birds and whatnot; for photovoltaics (especially large-scale in the mountains) it's wildlife and other environmental impact.
Yes, but those are still different from weather failures. When there's no sun and no wind, you can do whatever you want with regulations, you can't bring it back. Weather failure is something unique to renewables on top of everything else.
FUD about "what about where renewables aren't available" is just rhetorical handwaving. The answer, which already exists at nation-level scale, is storage and infrastructure.
Now you're just empirically lying, by equating the behaviour of solar and wind with that of hydro.
That table also doesn't say what you apparently think it does: it lists Luxembourg as 89% renewable, which is true, but does not include that Luxembourg only covers about 28% of the electricity it uses, and imports the rest.
Thus Luxembourg's production being 89% renewable is worthless information as to the viability and reliability of wind and solar for baseload: Luxembourg relies on its neighbours for reliable electricity supply.
The increase in renewable generation is great, but some of these are kind of cheating by importing energy to cover shortfalls. You need some kind of baseload generation that’s not dependent on weather, and borrowing this from a neighbour while pretending you don’t need it is like those ‘tiny house’ guys.
That's genuinely not how it works. You can see it every spring as Germany wholesale prices go negative to try and offload as much electricity as fast as possible to keep their grid from falling over.
Wind turbines can, and are, turned off (by turning, feathering the blades, and braking). There are two main cases: high wind / storm, and too much electricity in the grid. Photovoltaics can also be turned off.
The main reason for negative electricity prices is inflexible generators, e.g. nuclear and coal, because they can't easily (cheaply) ramp down or shut off. Sometimes it is cheaper to let prices go negative than to use emergency mechanisms (which do exist).
Negative prices are not all bad: they are an incentive for storage / flexible demand to step in. Notably, a negative price does not mean the grid is melting.
'Too much electricity in the grid' is a wrong way of expressing it, just like you can't have 'too much fluid in a pipe'. What happens is that the line voltage creeps up because loads are lagging further behind compared to generation.
And like you wrote, that's controlled. Agreed with the rest of your comment, especially the bit that pricing is mostly controlled by the worst parties, not by the best. What we are simply finding out is that a grid designed mostly for baseline loads needs fast-response generation (for instance: half of the UK putting their kettles on during half time requires so much extra power that pumped storage becomes a good alternative). And conversely, that if you change the mix considerably, you're going to have to have more control over the cumulative effect of many smaller generators.
But there are already standards for dealing with that even absent remote control of resources: as soon as the local grid voltage that the inverters in modern wind and solar plants see exceeds a specific maximum for a prescribed period of time, they fully autonomously back off their capacity until they are well below those maximums again, and then slowly ramp up to avoid causing grid instability due to oscillation.
What grid balancing is all about is to make this all financially optimal, it has relatively little to do with the safety of the grid, it is simply a way to extract maximum capacity without affecting that safety. A coarser mechanism would simply incur some more waste, but given the amounts of money involved it pays off to tune this.
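The autonomous backoff behaviour described above can be sketched as a small state machine. The thresholds, hold time, and ramp rate below are invented for illustration and not taken from any actual grid code:

```python
# Minimal sketch of a volt-watt backoff controller: fully back off when
# grid voltage stays over a limit for several readings, then ramp output
# back up slowly to avoid oscillation. All parameters are assumptions.

V_LIMIT = 253.0       # volts; assumed overvoltage threshold
HOLD_STEPS = 3        # consecutive over-limit readings before backing off
RAMP = 0.1            # fraction of capacity restored per step when recovered

def run_inverter(voltages):
    """Return the inverter's output fraction after each voltage reading."""
    output, over_count, history = 1.0, 0, []
    for v in voltages:
        if v > V_LIMIT:
            over_count += 1
            if over_count >= HOLD_STEPS:
                output = 0.0                      # back off fully
        else:
            over_count = 0
            output = min(1.0, output + RAMP)      # slow ramp back up
        history.append(round(output, 2))
    return history

print(run_inverter([250, 255, 255, 255, 250, 250]))
```

Real grid codes use a gradual volt-watt curve rather than an all-or-nothing cutoff, but the hold-then-back-off-then-ramp structure is the same idea.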
That’s very interesting, but as a counterpoint, it seems the major Spanish blackout was partially caused by such a voltage increase that was not mitigated properly.
So yes, there are mitigations, but it still is a major cause for concern, I think.
Yes, but that voltage increase alone wasn't enough to have caused the outage. The bigger issue was the subsequent oscillations which were amplified by the fact that that part of the grid is relatively isolated. The larger lessons there are still being learned with the report on that outage due in October I believe, but this isn't the first and it certainly won't be the last power outage. The 2003 one in the USA and Canada was much larger and didn't have any renewables other than as instantly available recovery loads for a good chunk of the grid, whereas nuclear power (everybody's baseline stand-by) took much, much longer to recover.
I think for myself the main takeaway is that we have come to rely on always-available grid power to a degree that we probably should not have. Unfortunately, inverters and battery systems that are capable of running in off-grid mode are very hard to come by compared to the on-line variety. Automatic disconnect and synchronization hardware is present in pretty much all inverters, but it is connected in such a way that the house would not be isolated from the grid, and the software does not support such a solution because of the certification requirements.
Interestingly, a large (house capacity, which is a considerable amount of power) UPS does have those capabilities, and charging UPS batteries through a different mechanism than the built in charger is easily doable.
As for that Spanish/Portuguese outage: I fully expect that there will be some regulatory demands made on grid operators, especially with respect to containment of such outages, and possibly a requirement for better interconnection to increase the amount of perceived inertia in the grid. That is the best protection against such issues. Another thing that needs to be studied better is the kind of 'thundering herd' scenario that seems to have been the cause here (that's very much preliminary, but it seems to be the most logical explanation), especially in grid regions with low internal inertia. Such inertia is basically tightly coupled to how much grid-synchronized rotating mass there is in a particular section of the grid. The more such mass, the more inertia there is, and the harder it is to make the grid go into oscillations. This mass is present both on the production side (generators) and on the consumer side (industry, because of the prevalence of electric motors). So areas where there are no traditional (non-renewable) sources and very little industry are more susceptible to this kind of problem, especially when they become more isolated.
I'm following this closely because I look at companies in this space with some regularity and it is in fact what I went to school for at some point, it has always been a field that has interested me.
As the grid moves away from physical inertia sources and loads, do you think it would be realistic to distribute a grid-wide signal separate from the actual line voltage which could assist non-rotating power sources to stay in sync or at least help reduce the chances of oscillation?
The easiest would probably be radio or satellite broadcasts, but the topology of the grid, which does change, would also have to be considered. Probably not an easy problem to solve?
The grid itself is the best source for that. What I think we will see instead is custom superconductance-based sink/source units that help with local grid stabilization. Those are already being deployed and they work quite well absent mechanical solutions, but they are still expensive and their capacity is still limited. A really dumb (but probably quite effective) way of doing this could also be to simply hook up massive but slow flywheels.
Both have the same effect. Good distribution of generation and consumption in a geographical sense is something we never really gave much consideration in the past, it wasn't rare at all to have one side of a geographic region to be 'mostly producers' and another to be 'mostly consumers' and where the two sat next to each other it was usually to accommodate some really large consumer (for instance, a paper mill or a steel or aluminum plant). That also allowed for co-generation which is far more efficient. I think we will see more of this as well, and incentives to allow EVs to be used as sinks during times of excess power availability.
Other options are HVDC interconnects between geographically distant regions or to use these to create micro grids, each of which would be less stable than a much larger one but it would serve to isolate problems if and when they occur.
Interesting detail: wind power, while theoretically rotating grid-synchronized mass, is increasingly decoupled and powers the grid through inverters. This is for efficiency reasons: the rotors have a much wider operating range that way, and you then only use furling of the blades to protect the installation from overspeeding, running at maximum efficiency the rest of the time even if that means rotating at a speed that wouldn't sync with the grid. This is optional; if the machine is synchronized it will still produce power, but not quite as much, because blades are more efficient running flatter at higher RPM than running coarse at lower RPM (though coarse, they do have more torque). So by sticking an inverter in the middle you can basically do MPPT for the windmill electronically rather than mechanically.
Over the life of an installation the cost of that inverter is more than paid back in extra power but it has the downside of not having the mechanical mass of the wind turbine rotor and blades as extra inertia. Win some, lose something else...
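The MPPT mentioned above is typically a hill-climbing loop. A minimal perturb-and-observe sketch against a made-up power curve (a stand-in for a real rotor or PV curve, not any actual controller):

```python
# Perturb-and-observe MPPT: nudge the operating point, keep the nudge if
# power went up, reverse it if power went down. The tracker oscillates in
# a small band around the maximum power point.

def power_curve(x):
    """Toy power vs. operating point: peak at x = 5."""
    return -(x - 5.0) ** 2 + 25.0

def perturb_and_observe(x=0.0, step=0.5, iterations=50):
    direction = 1.0
    p_prev = power_curve(x)
    for _ in range(iterations):
        x += direction * step
        p = power_curve(x)
        if p < p_prev:          # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return x

x_opt = perturb_and_observe()
print(f"settled near x = {x_opt:.2f}")  # oscillates around the peak at 5
```

For a turbine, the "operating point" is rotor speed via generator torque; for PV, it's the inverter's operating voltage; the algorithm is the same.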
> Negative prices are not all bad: they are an incentive for storage / flexible demand to step in.
Maybe that'll happen, but currently such events only keep increasing in frequency (https://www.pv-magazine.com/2025/08/26/germany-records-453-h...), and as neighbours also install more solar and wind, the ability for Germany to maintain its grid stability through exports is going to worsen, not improve.
That's genuinely exactly how it works. There are companies that provide that offloading-as-a-service who make good money on this concept and the grid isn't anywhere near falling over. Your 'grid melting' comment upthread is nonsense. Nothing is melting.
Melting would imply that currents exceed the rated capacity of the lines, which is entirely impossible due to how the grid is set up. What does happen is that loads that are otherwise not economical to run get turned on, and sources that are remote-controllable (which is all wind installations > 2 MW and all solar farms > 10 kW, except residential) are switched off. This is a fascinating subject and worth some study; the thing you want to read up on is called grid balancing.
Typically the day-ahead and the 15 minute ahead markets take care of this with pricing alone and there have been no meaningful excursions due to overproduction of renewables, that's just FUD and it does not contribute to the discussion.
What you could argue if you had read up on this is that there are market operators that do both sides of the market, which sets you up for an Enron like situation because they can make money by front-running. After all, they have a little bit of time between the moment where they know what they're going to do and the moment when they actually do it. Market makers that are also traders is always a dangerous combination and this has already led to some trouble, especially early on in the energy balancing market process. Now it is much better.
Where are you getting your numbers? The numbers I’ve seen indicate that non-household electricity prices in Germany have been stable for decades, fluctuating around the €35/MWh mark until the crisis in 2021, which affected all European countries, and are currently back at roughly that level. Taking inflation into account, the wholesale cost of electricity in Germany has fallen slightly over the last 2 decades in real terms.
> to the point that we’re now rapidly deindustrialising
That's not exclusively due to the price of energy. It is a factor, but there are other factors (such as wage costs) that matter much more.
The biggest is simply that China is outcompeting Germany on its own strengths, through a combination of lax environmental regulations, cheap labor, and state subsidies at a level the EU would not tolerate.
Yes, it's a doubly idiotic fuckup. Unfortunately we have incredibly dumb leadership, which decided to retire our last nuclear reactors even though our last hope, cheap Russian energy, has dried up.
Almost the entirety of safety issues with existing reactors is around cooling them. Bad things happen when you can't cool them.
There are many ways this can happen in existing reactors. You may have a catastrophic leak and lose the coolant, and you can't just send some welders in, what with radiation, superheated steam, etc. The pumps that push the coolant around might fail. Etc., etc.
Even when you "switch off" the chain reaction, the fuel rods keep emitting heat from the decay of short-lived radioactive isotopes, enough to need active cooling for days or weeks.
So a lot of new reactor designs revolve around eliminating such failure modes. NuScale, for example, IIRC, doesn't use pumps to circulate the coolant, and that's one less thing that can break.
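The decay-heat point above can be put in rough numbers with the textbook Way-Wigner approximation (good to maybe a factor of two; the plant power and run time below are assumed, not from the thread):

```python
# Way-Wigner decay-heat approximation:
#   P/P0 ~ 0.066 * (t^-0.2 - (t + T)^-0.2)
# where t = seconds since shutdown, T = seconds of prior operation.

def decay_heat_fraction(t_seconds, operated_seconds):
    return 0.066 * (t_seconds ** -0.2 - (t_seconds + operated_seconds) ** -0.2)

P0 = 3000e6             # thermal power of a ~1 GWe plant, watts (assumed)
T_op = 365 * 24 * 3600  # one year of operation (assumed)

for label, t in [("1 minute", 60), ("1 hour", 3600),
                 ("1 day", 86400), ("1 week", 7 * 86400)]:
    frac = decay_heat_fraction(t, T_op)
    print(f"{label:>8} after shutdown: {100*frac:.2f}% of full power = {P0*frac/1e6:.0f} MW")
```

Even an hour after shutdown, the residual heat is on the order of 1% of full power, tens of megawatts for a large plant, which is why "just stop cooling it" is never an option.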
What I'm daydreaming about simply cannot stop working, in terms of cooling. You have something hot in the middle, you let all the heat get out naturally, and you harvest some of it along the way.
You can build fixed-power reactors that don't go critical if cooling is lost. They just cost more and have a capped output. Pebble bed reactors are like this.
I wonder if the improvements for the aluminum heatsink might be applicable to other situations like cpu/gpu heatsinks or other places where cooling is needed. It seems like it might be economical.
I caught that as well. On further reading, it looks like they had a ~2x improvement over a flat aluminium sheet (which itself performed worse than the bare TEG). And of that improvement, about half came from radiative cooling.
So my very hot take is that while a conventional forced-air finned radiator treated with this laser process would show an improvement, it is unlikely to be economically viable versus just using a bigger radiator (at desktop/server CPU/GPU scales). At laptop scale it might be more viable, given space constraints.
I’ve not heard the term used that way—I’ve always understood it to indicate a multiple. But now that you’ve said it, the idea of -fold as “folding something in half” is interesting to think about!
I don't know if the engineering field has the same conventions, but in biology we use "fold" as it was used in the paper.
When comparing the signal of two things, the fold change is how many times one is bigger than the other (basically A/B). If A is 15-fold higher than B then it's 15x B.
What you described is the Log2 Fold Change (log2(A/B)), meaning that if A has a log2FC of 15 over B, its signal is 2^15 times higher, hence ≈32,000x.
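The two conventions from the comment, side by side as a quick sketch:

```python
import math

# "Fold change" as used in the paper: plain ratio A/B.
# "Log2 fold change": log2(A/B), common in biology for symmetric up/down scales.

def fold_change(a, b):
    return a / b

def log2_fold_change(a, b):
    return math.log2(a / b)

A, B = 1500.0, 100.0
print(fold_change(A, B))        # 15.0: "A is 15-fold higher than B"
print(log2_fold_change(A, B))   # ~3.91: the log2FC of that same ratio
print(2 ** 15)                  # 32768: what a log2FC of 15 would imply
```

So "15-fold" in the article means a plain 15x ratio; only under the log2 convention would "15" mean roughly 32,000x.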
I'm bummed that you're getting downvoted for what's IMO a very natural question, especially given that "times" is unambiguous and that folding naturally implies a doubling. But this is English, it's deliberately designed to not make sense (see also the modern definition of "literally").
I think it's getting downvoted because they suggested that the article's copyeditor should have caught it, despite the article being right. I doubt they would have been downvoted if they had just asked, but suggesting that someone failed at their job despite them actually doing it correctly tends to get people a bit uppity.
I don’t fully get it: thermoelectrics rely on a surface temperature differential, compared to the radiation temperature differential solar panels exploit. Wouldn’t the Carnot efficiency of these panels pale significantly next to solar panels?
Well yes, but this is not a competing technology but a complementary one.
You can use this to improve the efficiency of a regular solar panel and as a way to still produce electricity when there is less direct light but enough temperature difference.
It's hard to come close to solar cells' Carnot efficiency... I don't think they improved "efficiency" by much; that's why by "performance" they actually mean something like raw power output. TEGs are notorious for outputs of microwatts per square centimetre near room temperature. Now they are at a 100th to a 10th of PV "performance"... mildly disingenuous titling, IMHO.
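For concreteness, here are the Carnot limit and the standard maximum-TEG-efficiency formula with assumed temperatures and figure of merit (illustrative values, not from the article):

```python
# Carnot ceiling and realistic TEG efficiency for a modest temperature gap.
# Temperatures and ZT are assumed ballpark values.

def carnot(t_hot, t_cold):
    return 1.0 - t_cold / t_hot

def teg_efficiency(t_hot, t_cold, zt):
    """Standard maximum TEG efficiency for an average figure of merit ZT."""
    m = (1.0 + zt) ** 0.5
    return carnot(t_hot, t_cold) * (m - 1.0) / (m + t_cold / t_hot)

T_hot, T_cold = 330.0, 300.0   # K: sun-warmed hot side vs. ambient (assumed)
print(f"Carnot limit: {100 * carnot(T_hot, T_cold):.1f}%")
print(f"TEG at ZT=1:  {100 * teg_efficiency(T_hot, T_cold, 1.0):.1f}%")
```

With a 30 K gap the Carnot ceiling is already below 10%, and a ZT ~ 1 material captures only a fraction of that, consistent with the ~1% device efficiency quoted upthread.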
Sure, solar panels get wicked hot and are more efficient when cold, so attaching something to scavenge heat from them, bonus hot water and a little electricity are all wins until you factor in the cost of doing so and realize you would get 10x the return on adding a few more panels.
We have passive thermal heat tubes on our roof to heat our pool. It works amazingly well. I want to put PV on our roof, but that’d mean having to pull up those tubes first and replacing our pool heater with something electric.
Turns out there’s companies that do hybrid systems! Water is used to cool the PV, increasing the efficiency of the panels in the process, and then the heated water is used wherever you need it.
Unfortunately it seems there’s only a couple of providers, it’s rare to find installers that do it, and it’s ssuuuppppeeerrr expensive relative to the normal options. Such a shame. I wish there were more options here. It seems like a great approach.
We just did the opposite and ripped up our solar hot water system. We have a metal roof and a salt water pool. Problem is that these systems can and do leak, and salt water on a metal roof creates rust.
With photovoltaic panels being dirt cheap, we decided to rather heat our pool with a heat pump that is powered by our own electricity.
> until you factor in the cost of doing so and realize you would get 10x the return on adding a few more panels.
You're looking at what the cost would be now, and I don't think they were suggesting that, but rather a direction of development for panels.
Luckily this is exactly how things work, and why we have continuous progress in the area, including with batteries. Because 10 years ago you wouldn't even bother with super expensive lithium batteries for home energy storage and would go with NiCd, right?
PV panels work in shaded or cloudy settings (at reduced power of course).
I did data collection for a paper looking at the Seebeck effect in magnetic insulators about ~10-15 years ago, and it seemed like everyone in the whole physics department considered spintronics pretty dead. It feels great to see some big promising applications coming out of the field.
It’s explicitly part of Chinese science and technology strategy to think outside the box and it’s what’s pushing them forward on areas like semiconductors as well.
Given the massive advantage in talent they’ve built up while the US reverts to Drill Baby Drill, we know how this ends.
Eventually the US will push for atmospheric dimming to “fix” the negative externalities of their approach, which had the nice side effect of degrading solar…
FML
You’re being downvoted but I don’t think you’re entirely wrong. China has been pursuing some stuff that the Western world had essentially abandoned, getting interesting wins (eg: thorium reactors).
He's wrong about the "why" though. It's not that they must think outside the box, it's that they _must_ not all focus on a single point of research. I'm pretty sure they are also pursuing popular research topics, because it would be pretty bad if they fell behind for not doing the obvious.
So are they basically using a similar idea to that of a Stirling engine in a thermoelectric generator, or do they use a different mechanism to produce energy?
Two materials (often n-type and p-type semiconductors) are joined at two junctions; one junction is heated and the other cooled. The temperature difference makes charge carriers diffuse from the hot side toward the cold side, and that is what produces the Seebeck voltage they describe. It was just very hard to get anything meaningful out of this because you can't easily maintain such a temperature difference. If you've read about the Peltier effect, it's the same thing as this, just in reverse.
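As a back-of-envelope sketch (the numbers here are typical textbook values, not from the paper), the open-circuit voltage of one couple is just the difference in the legs' Seebeck coefficients times the temperature difference:

```python
# Toy model of a single thermoelectric couple; values are illustrative only.
# V_oc = (S_p - S_n) * dT

S_p = 200e-6    # Seebeck coefficient of the p-type leg, V/K (Bi2Te3-ish scale)
S_n = -200e-6   # n-type leg, V/K (opposite sign, so the two legs add up)
T_hot = 350.0   # K, heated junction
T_cold = 300.0  # K, cooled junction

dT = T_hot - T_cold
V_oc = (S_p - S_n) * dT   # open-circuit voltage of one couple
print(f"{V_oc * 1000:.1f} mV")  # 20.0 mV; real modules wire hundreds in series
```

Which is also why the temperature difference matters so much more than the absolute temperatures.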
They both use heat. But a Stirling engine converts it mechanically, whereas a STEG converts heat to electricity directly.
I’m really excited for this technology to be used for solid state AC. I think we’ll get there one day.
I’m even willing to accept a lot of inefficiency because new ACs in the US can cost $10-20k now, and you can need to replace them as often as every 7-10 years.
The whole industry is really getting absurd.
I presume this would be useful for thermal camouflage, since you could store the heat energy in a battery instead of diverting or diffusing it. I hope bleeding edge research sees the potential for aiding the climate crisis as ethically imperative over a military's niche technological advantage/cash.
I sometimes wonder if you could make much safer nuclear plants with TEGs. There would be next to no active processes and moving parts: no pumps, no coolant circulation. Use nuclear magic to make the hot side, cool with ambient temperature.
No doubt this would be less energy efficient, but perhaps you reap so much savings by not having to worry about water circulation that it's worth it.
Sort of. They're called Radioisotope Thermoelectric Generators (RTGs), used where you need a small amount of super reliable power, mainly on deep space missions where there is not enough sunlight. The Soviets also used them in unmanned lighthouses in the deep north.
https://en.wikipedia.org/wiki/Radioisotope_thermoelectric_ge...
Neat tech, but very inefficient; to make it efficient, fluids start needing to be moved around, which hurts reliability. The next step up is pebble bed reactors. I don't think any have been built, but the idea is to have self-contained fuel "pebbles" enriched enough to get hot but with enough built-in moderation that they can never melt down. Then a traditional heat engine is bolted on.
https://en.wikipedia.org/wiki/Pebble-bed_reactor
RTGs aren't quite the same, in that they use simple radioactive decay, no? And cooling in space is really hard.
Would this increase in performance they discovered make deep space satellites more powerful?
Not for the ones already launched and we're not launching any of those in the near future, especially not with the current batch of governments.
And I'm not convinced that particular discovery would yield that kind of performance increase for such an application. There are just too many things different in the environment alone.
> Use nuclear magic to make the hot side, cool with ambient temperature.
If you use ambient temperature for cooling, you are severely limited in your total power output. Like, we're talking about less than a megawatt output (depending on how big the ambient heat dispensers are) compared to the ~1GW of a regular old nuclear plant.
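Rough numbers (all assumed, just for scale) on how little heat a purely passive cold side can actually reject:

```python
# Rough sizing of a purely passive (natural convection) cold-side radiator.
# Q = h * A * dT, with order-of-magnitude assumed values.
h = 10.0       # W/(m^2*K), natural convection in still air, rough figure
dT = 30.0      # K, radiator surface above ambient
area = 10_000  # m^2, already a very large radiator field (100 m x 100 m)

q_watts = h * area * dT
print(f"{q_watts / 1e6:.0f} MW of heat rejected")  # 3 MW
# A ~1 GW-thermal plant would need ~100x this surface area, before even
# counting the TEG's low heat-to-electricity conversion efficiency.
```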
You might say: that's fine, let's just build many small ones. But you still need to track your radioactive material, make sure it's not stolen etc., which is a lot of overhead per installation.
I don't entirely follow. A lot of existing nuclear plants use ambient temperature for cooling, via cooling towers, no? They use pumped coolant to get the heat out of the reactor, sure, but the cold end is just air.
Also, I guess you could have the hot end very hot too, thus improving efficiency. Especially if, by virtue of cooling being safer, you could run it at a higher temperature (less safety margin needed).
Cooling towers are active not passive cooling. They don’t work without pumps.
Nuclear power plants aren't inherently any more unsafe than the nuclear material itself before having been mined.
What makes them potentially unsafe is nuclear technology having an incredible energy density, which can be misused and the radioactive material being active even without prior activation. The latter makes many radioactive isotopes a very effective poison.
And misuse or bad practices are a general problem. One can build awful buildings, toys or government structures, too.
And if you have a look at what the worst reasonable non-political consequences from a nuclear powerplant meltdown can be, they're surprisingly harmless.
We have the Soviets, with their absolutely incompetent response to the Chernobyl meltdown, to thank for the fact that we have a good idea of the long-term effects. The power plant never stopped operating; people kept working there every day for decades. Hundreds of people were never evacuated, and hundreds more returned within weeks.
Just to put this into perspective: Chernobyl was effectively a dirty super-bomb dispersing 50t of highly active radioactive materials and yet the death count among anyone who didn't approach into rock throwing distance remains 0.
Could you run a plant on less-refined fuel if it doesn’t need to get hot enough to boil water?
You could literally run it on the waste.
The question becomes is the power production worth the operation and maintenance costs.
Nuclear plants sometimes aren’t even worthwhile at their current efficiency. Even if TEGs simplified things mechanically, you’d still face the problem that nuclear often can’t compete (with renewables) in a direct comparison (setting aside overall strategy for base load and stuff).
Yes I wonder if with the current trend, solar might become the real alternative, less expensive, less risky.
But to be fair, you have to consider your “aside”, because nuclear has the tremendous advantage of working when it’s cloudy or dark, and when you need the energy the most, in the winter.
I do not think that we can just compare the prices, or maybe we should also add the cost of storage (that is going down too) for solar.
But currently a mix is probably the pragmatic approach.
It's going to be solar + wind + battery. That's where the economics are at. Sodium batteries are just coming online now https://en.m.wikipedia.org/wiki/Sodium-ion_battery - lithium is getting phased out.
Nuclear can't compete. https://en.m.wikipedia.org/wiki/Levelized_cost_of_electricit...
Maybe in some far off future nuclear will have a role... But the global energy investment markets paint a very clear picture: solar + wind + battery is the way.
But now you have a probabilistic system. Your battery capacity is designed for n days of low/no solar/wind input, so you are paying for a system that would be sufficient for x% of typical/historical years.
That has to factor into the design and cost calculation.
Nuclear is also probabilistic: in 2022, half of France's nuclear reactor capacity was offline because of premature aging of some components in a design used in many reactors.
Note in case SMRs become part of our grid: what if something similar happens to your hundreds of produced and deployed SMRs?
There are always unknown unknowns that might be correlated, like that flaw in the component. Weather systems like a https://en.wikipedia.org/wiki/Dunkelflaute are known facts. The good thing is that we do have excellent weather data for the last 75 years, so it is totally feasible for proponents of renewables to run their models through that historical data and say: Look, if we have that amount of renewables and that amount of batteries that would have been enough for the last 3 quarters of a century.
That would go a very long way toward convincing me.
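For what it's worth, the core of such a model is almost trivial; the hard part is getting good generation data, not the simulation. A toy version (made-up daily numbers, purely illustrative) might look like:

```python
# Toy adequacy check: given a renewable generation time series and demand,
# does a battery of a given size ever run empty? All numbers are made up.
def battery_adequate(generation_mwh, demand_mwh, capacity_mwh):
    """Walk the time series; return False if storage is ever exhausted
    while demand is unmet."""
    stored = capacity_mwh  # start full
    for gen, load in zip(generation_mwh, demand_mwh):
        stored += gen - load
        stored = min(stored, capacity_mwh)  # battery can't overfill
        if stored < 0:
            return False
    return True

# Two calm, dark days (a mini Dunkelflaute) in an otherwise sunny week:
gen    = [50, 50, 5, 5, 50, 50, 50]
demand = [30, 30, 30, 30, 30, 30, 30]
print(battery_adequate(gen, demand, capacity_mwh=60))  # True: rides it out
print(battery_adequate(gen, demand, capacity_mwh=40))  # False: runs dry on day 4
```

Run that over 75 years of real hourly weather-derived generation instead of a made-up week and you'd have exactly the argument being asked for.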
Nuclear costs are largely due to regulatory burdens created for reactor designs that are not safe. That is no longer the case. Also, attempts to exploit economies of scale could also improve baseline costs, although these attempts haven't been funded enough yet to actually scale.
Can you cite any stats to back up the claim that nuclear is cheaper than solar or wind in any country, any of them, that's not over 4 years old?
The price of solar and battery storage has collapsed. It's really dramatic.
This is a log scale https://ourworldindata.org/grapher/solar-pv-prices?time=earl...
Battery storage would need another 100x improvement before being usable for such usage.
Maybe it will reach that point, maybe not but anyways, you can't plan a grid on non-existing tech. Otherwise I'd pick some better non-existing one
What kind of usage? Batteries are already being built and deployed at scale to support renewables.
No they aren't, there's no country on earth which can sustain a winter load with batteries.
In the next years, it doesn't make sense to use batteries to sustain winter load: it would be way too expensive. But batteries get cheaper quickly, such that it doesn't make sense to build expensive nuclear plants just for winter. What does make sense, until batteries are cheap enough, is natural gas during winter, plus (where available) wind energy and hydro / pumped storage, existing nuclear plants (optimised for winter), biomass (wood), photovoltaics in the mountains, and geothermal.
There is no country on earth which has spent anything like as much on developing storage as it has on fragile, unreliable, expensive nuclear plants.
Systems like these are just getting started.
https://stateofgreen.com/en/solutions/storing-heat-for-a-col...
So show me the model of renewables + batteries that would have been sufficient for all of the last 75 years in Germany and the UK. We do have the historical weather data so there is ZERO reason for all that handwaving.
Yeah sure, it's right around the corner, I had the same conversation on HN 3 years ago haha.
Say what you want about nuclear plants, but they work, right now, and we have examples of countries successfully creating a grid with them.
I can't say the same about the magical batteries.
Exponential growth is a funny thing. First it looks like nothing is happening, and all of a sudden everything has changed. Check out discussions about wind and solar some 10 years ago.
E: for reference from memory, it took about 50 years to install the first TW of solar. The next TW took 2 years, and the next TW is projected to take only 1 year, 2025.
For now it looks more like a flat curve than an exponential one. Batteries haven't followed PV at all, especially not for a grid scale usage.
Making batteries viable for home use is a very different story to make them viable for a grid.
Simulated wind-water-solar-battery in one of Australia's grids is pretty close:
https://reneweconomy.com.au/a-near-100-per-cent-renewable-gr...
so the 3.2GWh battery grid storage array, in operation, this is still 1/100th what is needed?
https://www.energy-storage.news/edwards-sanborn-california-s...
You want a 320 GWh installation?
You do realize HVDC grids can do 3,000km energy travel, right? That's basically anywhere to anywhere, continental US. There's already installs like the PDCI https://en.wikipedia.org/wiki/Pacific_DC_Intertie that take 3GW from north oregon to LA.
There's even transcontinental energy links in the works like this: https://en.wikipedia.org/wiki/Australia-Asia_Power_Link
> so the 3.2GWh battery grid storage array, in operation, this is still 1/100th what is needed?
That's closer to 1% of what California needs by itself than even 1% of the USA's needs. We aren't even taking into account the large and continual growth in electricity demand yet either.
Together with a large and continual growth of battery electric vehicles.
> so the 3.2GWh battery grid storage array, in operation, this is still 1/100th what is needed?
Unless I'm mistaken, US consumption is 500GWh/day with peaks at 700GWh/day, so 3GWh isn't going to do much
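Putting the parent's rough figure next to the installation size makes the gap concrete:

```python
# Quick ratio using the parent's rough estimate of US daily consumption.
daily_consumption_gwh = 500.0  # parent's figure, a rough estimate
storage_gwh = 3.2              # a large current grid battery installation

hours_covered = storage_gwh / daily_consumption_gwh * 24
print(f"{hours_covered * 60:.0f} minutes of average US load")  # about 9 minutes
```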
"This technology that is starting to emerge is garbage because it's not already everywhere"?
... and so we're back to my first comment, a 100x improvement is still needed for this kind of usage.
3GWh isn't even at a proof of concept stage yet for this kind of usage. Even 10x that would barely be called a POC.
Nuclear costs would be way higher if the plant operators would need to have insurance for catastrophic failures. Right now, they don't need that. The state (the population) just takes this risk.
Even ignoring all of that, there's "time to first watt": essentially, if you break ground now, how quickly can you start producing power? Nuclear is on the scale of years; wind and solar weeks, if not days.
And also when better tech comes along, you can partially transition a farm to newer panels and resell the old ones after market.
Plus you don't have to build Onkalo Repository like systems to store waste for 100,000 years after you've produced your electricity.
It's wildly more feasible.
I have this same issue with fusion. Who cares if the fuel is practically free, when building and operating the plant is extremely expensive and prone to failures due to the sheer complexity.
Of course the tech and science is cool, possibly useful in space or other niche environments, but whenever I see fusion proposed as some general energy solution, I just roll my eyes and move on.
People really love scifi on hn, and that's fine ... but the investment capital has spoken and renewables are being funded 30x nuclear. Not 30% more, 3,000% more. It's even 2x over ogc infra (oil, gas, coal)
https://knowledge.energyinst.org/new-energy-world/article?id...
It's a 12-1 over OGC in what the IEA labels "advanced economies" https://www.iea.org/reports/world-energy-investment-2025
We'll have direct antimatter annihilation at scale before we have fusion. It's basically a physics research project, with zero potential for commercial use.
There's already a convenient fusion reactor fairly close by, and it's unlikely to stop operating any time soon.
Wind + solar is just adding another failure mode for when there is no wind. There are many places without adequate wind speed. Nuclear does not care about either, and has the highest energy density on top of that.
Wind + solar + batteries.
Nuclear has its own failure modes. In Switzerland, one of the nuclear plants will be offline for winter (!) due to "unplanned repairs". This will cost the owners of the plant millions.
But those are uncorrelated while weather systems are not.
Renewables also have those, it's just that you have weather failures on top of the technical failures.
Nuclear also has weather failures: low water levels and high river temperatures. In both cases the power plant needs to reduce output or be turned off.
Those aren't weather failures but environmental regulations; there's nothing preventing the plant from working if you really want it to, it's just not needed, especially in summer.
If you remove environmental regulations, then (pumped) hydro, wind, and photovoltaics would also be much cheaper, and much faster to build. For windmills, it's birds and whatnot; for photovoltaics (especially large-scale in the mountains) it's wildlife and other environmental impact.
Yes, but those are still different from weather failures. When there's no sun and no wind, you can do whatever you want with regulations, you can't bring it back. Weather failure is something unique to renewables on top of everything else.
If there's an issue with a wind turbine you don't have to shut down every turbine in a field. A nuclear plant is a single point of failure
10 countries do 100%, 20 do over 90, and this data is 2 years old
https://en.m.wikipedia.org/wiki/List_of_countries_by_renewab...
I'm an empiricist.
FUD about "what about where renewables aren't available " is just rhetorical handwaving. The answer, which already exists at nation-level scale is storage and infrastructure.
Now you're just empirically lying, by equating the behaviour of solar and wind with that of hydro.
That table also doesn't say what you apparently think it does: it lists Luxembourg as 89% renewable, which is true, but does not include that Luxembourg only covers about 28% of the electricity it uses, and imports the rest.
Thus Luxembourg's production being 89% renewable is worthless information as to the viability and reliability of wind and solar for baseload: Luxembourg relies on its neighbours for reliable electricity supply.
The increase in renewable generation is great, but some of these are kind of cheating by importing energy to cover shortfalls. You need some kind of baseload generation that’s not dependent on weather, and borrowing this from a neighbour while pretending you don’t need it is like those ‘tiny house’ guys.
> when there is no wind.
Or when there’s too much and you’re melting your grid.
No, you just turn off some windmills.
That's genuinely not how it works. You can see it every spring as Germany wholesale prices go negative to try and offload as much electricity as fast as possible to keep their grid from falling over.
Wind turbines can, and are, turned off (by turning, feathering the blades, and braking). There are two main cases: high wind / storm, and too much electricity in the grid. Photovoltaics can also be turned off.
The main reason for negative electricity prices is inflexible generators, eg. nuclear and coal, because they can't easily (cheaply) ramp down or shut off. Sometimes it is cheaper to let prices go negative than to use emergency mechanisms (which do exist).
Negative prices are not all bad: they are an incentive for storage / flexible demand to step in. Notably, a negative price does not mean the grid is melting.
'Too much electricity in the grid' is a wrong way of expressing it, just like you can't have 'too much fluid in a pipe'. What happens is that the line voltage creeps up because loads are lagging further behind compared to generation.
And like you wrote, that's controlled. Agreed with the rest of your comment, especially the bit that pricing is mostly controlled by the worst parties, not by the best. What we are simply finding out is that a grid designed mostly for baseline loads needs fast response generation (for instance: half of the UK putting their kettle on during half time requires so much extra power that pumped storage becomes a good alternative). And conversely, that if you change the mix considerably that you're going to have to have more control over the cumulative effect of many smaller generators.
But there are already standards for dealing with that even absent remote control of resources: as soon as the local grid voltage that the inverters in modern wind and solar plants see exceeds a very specific maximum for a prescribed period of time, they fully autonomously back off their capacity until they are well below those maximums again, and then slowly ramp up to avoid causing grid instability due to oscillation.
What grid balancing is all about is to make this all financially optimal, it has relatively little to do with the safety of the grid, it is simply a way to extract maximum capacity without affecting that safety. A coarser mechanism would simply incur some more waste, but given the amounts of money involved it pays off to tune this.
That’s very interesting, but as a counterpoint, it seems that the major Spain blackout was partially caused by such a voltage increase that was not mitigated properly.
So yes there are mitigations but it still is a major cause of concern I think
Yes, but that voltage increase alone wasn't enough to have caused the outage. The bigger issue was the subsequent oscillations which were amplified by the fact that that part of the grid is relatively isolated. The larger lessons there are still being learned with the report on that outage due in October I believe, but this isn't the first and it certainly won't be the last power outage. The 2003 one in the USA and Canada was much larger and didn't have any renewables other than as instantly available recovery loads for a good chunk of the grid, whereas nuclear power (everybody's baseline stand-by) took much, much longer to recover.
I think for myself the main takeaway is that we have come to rely on always available grid power to a degree that we probably should not have. Unfortunately inverters and battery systems that are capable of running in off-grid mode are very hard to come by compared to the on-line variety. Automatic disconnect and synchronization hardware are present in pretty much all inverters but they are connected in ways that the house would not be isolated from the grid and the software does not support such a solution because of the certification requirements.
Interestingly, a large (house capacity, which is a considerable amount of power) UPS does have those capabilities, and charging UPS batteries through a different mechanism than the built in charger is easily doable.
As for that Spanish/Portuguese outage: I fully expect that there will be some regulatory demands made on grid operators, especially with respect to containment of such outages, and possibly a requirement for better interconnection to increase the amount of perceived inertia in the grid. That is the best protection against such issues. Another thing that needs to be studied better is the kind of 'thundering herd' scenario that seems to have been the cause here (that's very much preliminary, but it seems the most logical explanation), especially in grid regions with low internal inertia. Such inertia is basically tightly coupled to how much grid-synchronized rotating mass there is in a particular section of the grid: the more mass like that, the more inertia there is, and the harder it is to make the grid go into oscillations. This mass is present both on the production side (generators) and on the consumer side (industry, because of the prevalence of electric motors). So areas where there are no traditional (non-renewable) sources and very little industry are more susceptible to such problems, especially when they become more isolated.
I'm following this closely because I look at companies in this space with some regularity and it is in fact what I went to school for at some point, it has always been a field that has interested me.
As the grid moves away from physical inertia sources and loads, do you think it would be realistic to distribute a grid-wide signal separate from the actual line voltage which could assist non-rotating power sources to stay in sync or at least help reduce the chances of oscillation?
The easiest is probably radio or satellite broadcasts but the topology of the grid, which does change, would also have to be considered. Probably not an easy problem to solve simply?
The grid itself is the best source for that. What I think we will see instead is custom superconductance based sink/source units that help with local grid stabilization. Those are already being deployed and they work quite well absent mechanical solutions, but they are still expensive and their capacity is still limited. A really dumb (but probably quite effective) way of doing this could also be by simply hooking up massive but slow flywheels.
Both have the same effect. Good distribution of generation and consumption in a geographical sense is something we never really gave much consideration in the past, it wasn't rare at all to have one side of a geographic region to be 'mostly producers' and another to be 'mostly consumers' and where the two sat next to each other it was usually to accommodate some really large consumer (for instance, a paper mill or a steel or aluminum plant). That also allowed for co-generation which is far more efficient. I think we will see more of this as well, and incentives to allow EVs to be used as sinks during times of excess power availability.
Other options are HVDC interconnects between geographically distant regions or to use these to create micro grids, each of which would be less stable than a much larger one but it would serve to isolate problems if and when they occur.
Interesting detail: wind power, while theoretically grid-synchronized rotating mass, is increasingly uncoupled and powering the grid through inverters. This is for efficiency reasons: the rotors have a much wider operating range that way, and you then only use furling of the blades to protect the installation from overspeeding, running at maximum efficiency the rest of the time even if that means rotating at a different speed than what would sync with the grid. This is optional; if the machine is synchronized it will still produce power, but not quite as much, because blades are more efficient at higher RPM running flatter than at lower RPM running coarse (though coarse they do have more torque). So by sticking an inverter in the middle you can basically do MPPT for the windmill electronically rather than mechanically.
Over the life of an installation the cost of that inverter is more than paid back in extra power but it has the downside of not having the mechanical mass of the wind turbine rotor and blades as extra inertia. Win some, lose something else...
> The main reason for negative electricity prices are inflexible generators eg. nuclear
Ah yes, wind and solar generation crushes the grid (https://x.com/ElectricityMaps/status/1786377006562541825) but that's the fault of all those dastardly nuclear plants germany is littered with, all zero of them (https://en.wikipedia.org/wiki/List_of_commercial_nuclear_rea...)
> Negative prices are not all bad: they are an incentive for storage / flexible demand to step in.
Maybe that'll happen, but currently such events only keep increasing in frequency (https://www.pv-magazine.com/2025/08/26/germany-records-453-h...), and as neighbours also install more solar and wind the ability for germany to maintain their grid stability through exports is going to worsen not improve.
That's genuinely exactly how it works. There are companies that provide that offloading-as-a-service who make good money on this concept and the grid isn't anywhere near falling over. Your 'grid melting' comment upthread is nonsense. Nothing is melting.
Melting would imply that currents exceed the rated capacity of the lines, which is entirely impossible due to how the grid is set up. What does happen is that loads that are otherwise not economical to run get turned on, and sources that are remote controllable (which is all wind installations > 2 MW and all solar farms > 10 kW, except for residential) are switched off. This is a fascinating subject and worth some study; the thing you want to read up on is called grid balancing.
Typically the day-ahead and the 15 minute ahead markets take care of this with pricing alone and there have been no meaningful excursions due to overproduction of renewables, that's just FUD and it does not contribute to the discussion.
What you could argue if you had read up on this is that there are market operators that do both sides of the market, which sets you up for an Enron like situation because they can make money by front-running. After all, they have a little bit of time between the moment where they know what they're going to do and the moment when they actually do it. Market makers that are also traders is always a dangerous combination and this has already led to some trouble, especially early on in the energy balancing market process. Now it is much better.
Especially at their price point, the new british one is bound to cost like 40 billion pounds in total, that's far off from solar+batteries.
I'd love to have more modern nuclear, but I don't see it happening anymore, no expertise in building them anymore, cost and time overruns all over...
> (setting aside overall strategy for base load and stuff).
You can't really set aside the reality of the electric grid; you have to deal with it.
> setting aside overall strategy for base load and stuff
Energy got increasingly expensive in Germany the further the Energiewende agenda advanced, to the point that we’re now rapidly deindustrialising.
Turns out base load kinda matters.
Where are you getting your numbers? The numbers I’ve seen indicate that non-household electricity prices in Germany have been stable for decades - fluctuating around the 35€/MWh mark until the crisis in 2021 - which affected all European countries - and is currently back at roughly that level. Taking into account of inflation, the wholesale cost of electricity in Germany has fallen slightly over the last 2 decades in real terms.
> to the point that we’re now rapidly deindustrialising
That's not exclusively due to the price of energy. It is a factor, but there are other, much larger factors (such as wages).
The biggest simply being that China is outcompeting Germany on its own strengths through a combination of a lack of environmental regulations, cheap labor and state subsidies at a level that the EU would not tolerate.
Nothing to do with being reliant on Russian oil at all? All the fault of renewables?
Yes, it's a doubly idiotic fuckup. Unfortunately we have incredibly dumb leadership, deciding to retire our last nuclear reactors even though our last hope, cheap Russian energy, has halted.
Why is it safe to have no coolants around when things run out of control?
Almost the entirety of safety issues with existing reactors is around cooling them. Bad things happen when you can't cool them.
There's many ways in which this can happen in existing reactors. You may have a catastrophic leak and lose the coolant - and you can't just send some welders in, what with radiation, superheated steam etc. The pumps that push the coolant around might fail. Etc. etc.
Even when you "switch off" the chain reaction, the fuel rods keep emitting heat from the decay of transient radioactive elements, enough to need active cooling for days or weeks.
So a lot of new reactor designs revolve around eliminating such failure modes. NuScale for example, IIRC, don't use pumps to circulate the coolant, and that's one thing less that can break.
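To put rough numbers on the decay-heat point, the classic Way-Wigner textbook approximation (crude, but the right order of magnitude) shows why cooling is needed long after shutdown:

```python
# Way-Wigner approximation for decay heat after reactor shutdown:
#   P(t)/P0 ~= 0.0622 * (t^-0.2 - (t + t_op)^-0.2)
# where t is seconds since shutdown and t_op is the prior operating time.
def decay_heat_fraction(t_s, t_op_s):
    return 0.0622 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

t_op = 365 * 24 * 3600  # assume one year of full-power operation
for label, t in [("10 s", 10), ("1 hour", 3600), ("1 day", 86400), ("1 week", 604800)]:
    print(f"{label:>7}: {decay_heat_fraction(t, t_op):.2%} of full power")
```

Even a fraction of a percent of a gigawatt-scale thermal output is megawatts of heat that has to go somewhere, a week after shutdown.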
What I'm daydreaming about simply cannot stop working, in terms of cooling. You have something hot in the middle, you let all the heat get out naturally, and you harvest some of it along the way.
You can build fixed-power reactors that don't go critical if cooling is lost. They just cost more and have a capped output. Pebble bed reactors are like this.
I wonder if the improvements for the aluminum heatsink might be applicable to other situations like cpu/gpu heatsinks or other places where cooling is needed. It seems like it might be economical.
Cars are an obvious application.
Some parts get very hot, and any electricity produced without engine or fuel adds to range / efficiency.
I caught that as well. Further reading, it looks like they had a ~2x improvement over a flat aluminium sheet (which itself performed worse than the bare TEG), and about half of that improvement came from radiative cooling.
So my very hot take is that a conventional forced air finned radiator treated with this laser process would show an improvement, it is unlikely to be economically viable versus just using a bigger radiator (at desktop/server CPU/GPU scales). At laptop scales it might be more viable given space constraints.
2x the radiative + convective heat dissipation for a passive aluminum heat sink seems like as big a deal as its use in this particular application.
I agree, it has quite broad application!
It's not a good solar power source, but could the technology be adapted to heat sinks? Maybe they could license the technology to CoolerMaster.
AI data centres can perhaps harvest some of the waste heat back as electricity.
Doesn’t 15-fold mean multiplied by two 15 times, or 32,000x not 15x?
I feel like someone should have caught that before publication.
No https://en.m.wikipedia.org/wiki/Fold_change
I’ve not heard the term used that way—I’ve always understood it to indicate a multiple. But now that you’ve said it, the idea of -fold as “folding something in half” is interesting to think about!
I imagine folding too. But then we couldn’t ever achieve more than maybe a twelve-fold increase.
I don't know if the engineering field has the same conventions, but in biology we use "fold" as it was used in the paper. When comparing the signal of two things, the fold change is how many times one is bigger than the other (basically A/B). If A is 15-fold higher than B then it's 15x B.
What you described is the Log2 Fold Change (log2(A/B)), meaning that if A has a log2FC of 15 over B, its signal is 2^15 times higher, hence ≈32,000x.
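The arithmetic difference between the two conventions, as a quick sketch (toy numbers for illustration):

```python
import math

# Fold change vs. log2 fold change, as the terms are used in biology.
def fold_change(a, b):
    """How many times bigger A is than B (plain ratio)."""
    return a / b

def log2_fold_change(a, b):
    """Number of doublings separating A from B."""
    return math.log2(a / b)

a, b = 150.0, 10.0
print(fold_change(a, b))       # 15.0 -> "15-fold higher" means 15x
print(log2_fold_change(a, b))  # ~3.9 doublings, NOT 15
print(2 ** 15)                 # 32768 -> what a log2FC of 15 would mean
```

So "15-fold" in the article is the plain ratio; only a log2 fold change of 15 would imply the ~32,000x figure.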
I'm bummed that you're getting downvoted for what's IMO a very natural question, especially given that "times" is unambiguous and that folding naturally implies a doubling. But this is English, it's deliberately designed to not make sense (see also the modern definition of "literally").
mumbles something else about 2^15
> folding naturally implies a doubling.
Why is that? I can imagine doing two folds on a sheet of paper and ending up with three layers of paper. Imo one fold adds one layer.
I think it's getting downvoted because they suggested that the article's copyeditor should have caught it, despite the article being right. I doubt they would have been downvoted if they had just asked, but suggesting that someone failed at their job when they actually did it correctly tends to get people a bit uppity.
Words are arbitrary, but there really isn't any dispute what -fold means as a suffix. See the dictionary entry for it https://www.merriam-webster.com/dictionary/-fold
Also, as far as I know, this isn't just English being weird; most Germanic languages use "fold" the same way.
I don’t fully get it: thermoelectrics rely on a surface temperature differential, versus the radiation temperature differential that solar panels exploit. Wouldn’t the Carnot efficiency of these panels pale significantly next to solar panels?
Well yes, but this is not a competing technology but a complementary one.
You can use this to improve the efficiency of a regular solar panel and as a way to still produce electricity when there is less direct light but enough temperature difference.
It's hard to come close to solar cells' Carnot limit... I don't think they improved "efficiency" by much. That's why by "performance" they actually mean something like raw power output. TEGs are notorious for outputting on the order of microwatts per square cm near room temperature. Now they're at a 100th to a 10th of PV "performance"... mildly disingenuous titling imho.
Could you combine them with solar panels?
Sure, solar panels get wicked hot and are more efficient when cold, so attaching something to scavenge heat from them, bonus hot water and a little electricity are all wins until you factor in the cost of doing so and realize you would get 10x the return on adding a few more panels.
We have passive thermal heat tubes on our roof to heat our pool. It works amazingly well. I want to put PV on our roof, but that’d mean having to pull up those tubes first and replacing our pool heater with something electric.
Turns out there’s companies that do hybrid systems! Water is used to cool the PV, increasing the efficiency of the panels in the process, and then the heated water is used wherever you need it.
Unfortunately it seems there’s only a couple of providers, it’s rare to find installers that do it, and it’s ssuuuppppeeerrr expensive relative to the normal options. Such a shame. I wish there were more options here. It seems like a great approach.
We just did the opposite and ripped up our solar hot water system. We have a metal roof and a salt water pool. The problem is that these systems can and do leak, and salt water on a metal roof creates rust.
With photovoltaic panels being dirt cheap, we decided to rather heat our pool with a heat pump that is powered by our own electricity.
> until you factor in the cost of doing so and realize you would get 10x the return on adding a few more panels.
You're looking at what the cost would be now, and I don't think they were suggesting that, but rather a direction of development for panels.
Luckily, this is exactly how things work and why we have continuous progress in the area, including with batteries. 10 years ago you wouldn't even have bothered with super expensive lithium batteries for home energy storage; you'd have gone with NiCd, right?