this post was submitted on 10 Feb 2024
Technology
Needs to be taken with a grain of salt. Actually capturing the heat for electricity, and getting more electricity out of it than required to run the reactor itself, remain massive open questions that this generation of research reactors does not even begin to tackle.
IIRC, this is a big deal because they are achieving more energy out than they put in.
If I've been reading these correctly, they're achieving it with tiny amounts of fuel and slowly scaling up as they achieve success. I see these as proofs of concept and fantastic steps in the right direction.
In this context, the "energy that they put in" only counts the heating of the plasma. It does not include the energy needed to run the rest of the reactor, like the magnets that trap the plasma. If you count those other energy needs, about an order of magnitude improvement is still required. Possibly more, if we have to extract the energy (an incredibly hard problem that's barely been scratched so far).
So yeah, it's nice to see the progress, but the road ahead is still a very long one.
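To make that accounting concrete, here's a rough back-of-the-envelope sketch. The numbers are illustrative assumptions loosely based on the widely reported NIF ignition shot, not figures from this article:

```python
# Rough fusion energy accounting (illustrative numbers, not from the article).
laser_energy_on_target_mj = 2.05   # energy actually delivered to the fuel pellet
fusion_yield_mj = 3.15             # fusion energy released
wall_plug_energy_mj = 300.0        # rough electricity drawn to fire the lasers

# "More energy out than in" only counts the energy that heated the plasma.
q_plasma = fusion_yield_mj / laser_energy_on_target_mj

# Counting what the whole facility pulled from the grid tells another story.
q_wall_plug = fusion_yield_mj / wall_plug_energy_mj

print(f"Q (plasma only):    {q_plasma:.2f}")    # above 1: "net gain" headline
print(f"Q (whole facility): {q_wall_plug:.3f}")  # far below 1: no net electricity
```

Both numbers are correct at the same time; they just answer different questions, which is why the headlines and the skepticism can coexist.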
I feel like the big scary problem is capturing the heat. The proposed method I've seen involves a beryllium "blanket" that captures the heat and sends it off to a boiler. The problem is that beryllium is quite expensive and quite limited in availability. In fact, we may only have enough beryllium (in the world) for a dozen or so reactors. But it's worse: these blankets absorb high-energy neutrons and become radioactive over time. And that means two problems: you need to replace the blanket, and you need to dispose of radioactive waste.
When you put all that together, I just think "shouldn't we stick with fission power?"
Someone please correct me if I'm wrong, but isn't the problem that uranium has a half-life of a couple hundred million years, while the half-life of beryllium is less than a second?
Only beryllium-10 has a long half-life (it's a beta emitter). Adding another neutron drops that back down to a few seconds, and additional neutrons drop it to a fraction of a second. So as long as that specific isotope of beryllium isn't used, it would be fine, right?
Edit: https://www.thoughtco.com/beryllium-isotopes-603868
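The timescale gap is easy to see with the standard exponential decay law, N(t) = N0 · 2^(−t/t½). The half-lives below are textbook values, not numbers from this thread:

```python
# Fraction of a radioactive sample remaining after time t,
# using the standard decay law N(t) = N0 * 2 ** (-t / half_life).
def fraction_remaining(t: float, half_life: float) -> float:
    return 2 ** (-t / half_life)

year = 365.25 * 24 * 3600  # seconds in a year

u238_half_life = 4.47e9 * year  # uranium-238: roughly 4.47 billion years
be11_half_life = 13.8           # beryllium-11: roughly 13.8 seconds

day = 24 * 3600
# After one day, essentially all the U-238 is still there,
# while the Be-11 is effectively gone.
print(fraction_remaining(day, u238_half_life))  # ~1.0
print(fraction_remaining(day, be11_half_life))  # ~0.0
```

The flip side is that a short half-life means intense activity while it lasts, and a long half-life means the waste sticks around, which is exactly the trade-off being argued about here.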
Those quick half-lives decay right away, losing a neutron, right? So that beryllium-11 just decays back into beryllium-10.
The problem is that the blanket is constantly absorbing neutrons from the fusion reactions; that's its job. So even though you build your blanket out of plain beryllium-9 (the stable isotope), you end up with these heavier isotopes over time, and because the heavier ones quickly decay into lighter ones, you basically end up with a whole lot of beryllium-10.
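A crude toy model of that buildup (the capture rate is a made-up illustrative number, and real activation chains also branch into other elements): each absorbed neutron converts a stable beryllium-9 nucleus into long-lived beryllium-10, and because Be-10's roughly 1.39-million-year half-life dwarfs any plant lifetime, the inventory just accumulates:

```python
import math

# Toy model: each absorbed neutron converts one stable beryllium-9
# nucleus in the blanket into long-lived beryllium-10.
# The capture rate below is a made-up illustrative number.
def be10_inventory(capture_rate_per_year: float, years: float,
                   be10_half_life_years: float = 1.39e6) -> float:
    """Be-10 nuclei present after `years` of operation.

    Constant production with first-order decay:
        N(t) = (R / lam) * (1 - exp(-lam * t))
    """
    lam = math.log(2) / be10_half_life_years  # decay constant, per year
    return (capture_rate_per_year / lam) * (1.0 - math.exp(-lam * years))

produced = be10_inventory(capture_rate_per_year=1e24, years=30)
print(f"{produced:.3e}")  # ~3e25: essentially every capture is still there
```

Over a 30-year plant life, decay removes a negligible sliver of the Be-10, so to a very good approximation the waste inventory is just (capture rate) × (time), which is the "whole lot of beryllium-10" problem.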
It doesn't look like they're generating electricity with that energy yet, so while you're correct, the person you responded to is also correct that we still need to prove we can harness it efficiently enough.
I think they'll get there; it just comes down to investment and time.