Nothing suggests mental illness more than the tortured reasoning used by anti-nuclear activists to attempt to justify their positions. Often they fixate on particular claims which turn out to be utterly ludicrous after even shallow examination. One of these is the argon-41 chromosome-damage "theory" currently being flogged by BasG
(Bas Gresnigt) over at Atomic Insights. This short-lived isotope (half-life 106 minutes, beta-decaying to stable potassium) is a noble gas, and disperses rapidly in air by diffusion and turbulent mixing. The "researchers" who tortured their data to make it confess the crimes of 41Ar got it to sign a statement amounting to this:
- Spent nuclear fuel emits neutrons due to spontaneous fission.
- The neutrons escape into the air and run into 40Ar atoms, forming radioactive 41Ar.
- 41Ar ionizes due to recoil from the neutron capture.
- The ion is attracted to dust and water particles, keeping it close to the ground after formation.
- Because of this, the effects of 41Ar are most strongly felt upwards of 20 km from its point of formation, rather than immediately next to it.
- The measurable effect is an increase in the male-female sex ratio at birth, because the X chromosome has more DNA than the Y chromosome and is more susceptible to damage.
Got that? I lost 5 IQ points just from reading the crap required to write that summary, so I'm not going to repeat it. Oh, the "researchers" conveniently left out the mechanism by which 41Ar singles out the sex chromosomes for damage, rather than causing mutations and consequent birth defects all over the genome. It's just one of the ways that 41Ar is evil, I guess.
Back in reality, things are just a little bit different.
Argon is only a trace constituent of air. Nitrogen is 78% of air by volume, while all isotopes of argon together are only 0.93%. Further, each molecule of nitrogen has two atoms while argon has but one. Last, the thermal neutron capture cross-section of nitrogen is 1.91 barns, while argon's is only 0.675 barns. The upshot is that a free neutron in air is about 470 times as likely to be soaked up by a nitrogen atom (forming 14C by the (n,p) reaction) as by 40Ar. My understanding is that even capture by 40Ar is not terribly likely, and neutrons which escape capture entirely eventually beta-decay to hydrogen (half-life about 10 minutes).
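The arithmetic behind that factor of ~470 fits in a few lines. This sketch uses the figures quoted above (volume fractions and thermal cross-sections), all of which are approximate:

```python
# Back-of-envelope: how much more likely is thermal-neutron capture
# by nitrogen than by 40Ar in ordinary air?
N2_VOL_FRACTION = 0.78      # N2 fraction of air by volume
AR_VOL_FRACTION = 0.0093    # Ar fraction of air by volume
ATOMS_PER_N2 = 2            # nitrogen is diatomic; argon is monatomic
SIGMA_N = 1.91              # thermal capture cross-section of N, barns
SIGMA_AR = 0.675            # thermal capture cross-section of 40Ar, barns

n_atom_ratio = (N2_VOL_FRACTION * ATOMS_PER_N2) / AR_VOL_FRACTION
capture_ratio = n_atom_ratio * (SIGMA_N / SIGMA_AR)
print(f"N atoms per Ar atom: {n_atom_ratio:.0f}")
print(f"Capture probability ratio (N : 40Ar): {capture_ratio:.0f}")
```

This reproduces the ~470:1 figure in the text.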
But let's follow this to the end. 40Ar plus a free neutron have a total mass of 40.9710480385 AMU. 41Ar masses 40.9645006 AMU, for a difference of 0.0065474385 AMU, on the order of 6 MeV. This will be released as a gamma ray. That is certainly enough to ionize an atom... but is it likely to stay that way? The first ionization energy of argon is 15.76 electron-volts (eV). The ionization energies of both nitrogen and oxygen molecules are less than that, so at the first collision with an oxygen or nitrogen molecule the newly-formed 41Ar ion is going to steal an electron and not be an ion any more. That will take a fraction of a nanosecond.
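The mass arithmetic is easy to verify. The masses below are the AMU figures from the text, and the 931.5 MeV/AMU conversion is the standard one:

```python
AMU_TO_MEV = 931.494   # MeV released per AMU of mass defect
m_ar40 = 39.9623831    # mass of 40Ar, AMU
m_n = 1.0086649        # mass of a free neutron, AMU
m_ar41 = 40.9645006    # mass of 41Ar, AMU

# Q-value of the 40Ar(n,gamma)41Ar capture reaction
q = (m_ar40 + m_n - m_ar41) * AMU_TO_MEV
print(f"Capture Q-value: {q:.2f} MeV")
```

The result is about 6.1 MeV, matching the "on the order of 6 MeV" above.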
Last is the issue of location of the decay. The typical human contains enough potassium that the beta decay of 40K occurs around 4,000 times per second; against this background of beta decays, you'd have to have a huge effect from 41Ar to measure something. But even if an atom of 41Ar was able to stay ionized, attach to a dust particle or water droplet, and stick around near a human, what is the likelihood that it could be ingested and migrate to the gonads before it decayed? Roughly zero.
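The ~4,000 decays per second from body potassium can be reproduced from standard constants. The 140 g of potassium is a typical round figure for a 70 kg adult, assumed here for illustration:

```python
import math

body_K_grams = 140.0        # potassium in a ~70 kg adult (round figure)
K40_abundance = 1.17e-4     # isotopic fraction of 40K in natural K
K40_molar_mass = 40.0       # g/mol, close enough for an envelope
half_life_s = 1.248e9 * 3.156e7   # 1.248 billion years in seconds
AVOGADRO = 6.022e23

atoms = body_K_grams * K40_abundance / K40_molar_mass * AVOGADRO
activity_bq = atoms * math.log(2) / half_life_s   # decays per second
print(f"40K decays per second: {activity_bq:.0f}")
```

This lands near 4,300 Bq, consistent with the figure in the text.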
The funniest thing about this 41Ar "theory" is that there are actually people who take it seriously. So who's pushing this nonsense? There are two basic possibilities here, not necessarily mutually exclusive: either they are objectively deluded (crazy), or they want you to be. The latter want to panic you into following their agenda, which you wouldn't do by pursuing your own interests. And that, my friends, is evil.
Note: Corrected the results of neutron capture in nitrogen, H/T rrmeyer.
Edit: corrected notation.
Someone on The Energy Collective suggested that 24/7/365 facilities like data centers could run on "renewable energy" (meaning unreliable wind and solar) by taking the first pick of power from a wind farm or other facility and letting others take the surplus. Specifically, he said this:
The wind farm that HP is drawing from is 300 MW. If they get first dibs on generation, it's not out of the question that 95 percent of the time it will generate more than 112 MW which would be a 37 percent CF.
Is that true? I decided to find out.
Here's a plot of Texas wind generation over March of 2014, courtesy EIA:
I cut this down to a 143-by-489 area (69927 px²) of just the plot itself trimmed down to the production maximum, and used Gimp's histogram function to measure the red area. It came to 32026 red pixels. If we assume that the production peak was 100% of nameplate (unlikely, but it's favorable to the case) that's a capacity factor of 45.8% for the month.
45.8% of the 143 pixel height is 65.5 pixels. Cutting the graph down to 65 pixels from the baseline yields this:
The curve never quite goes to zero, but it gets close to it several times; it stays very low for an entire day. Even the wind across the entire state of Texas, cut down to its capacity factor for the windy month of March, is not reliable enough to keep data centers running; clipped to that level, the net capacity factor for the entire state of Texas is just 73.9%, far less than the 95 percent assertion of "wind smith". The infrastructure of an information economy needs reliability more like 99.99%.
But what's left over? Here's what that curve looks like:
In the windy state of Texas, in March 2014, the "leftovers" from preferred loads taking everything up to 45.8% of the peak have a capacity factor of just 22.1%. It's a very spiky curve that has gaps lasting days when there is little or no power available. What sort of business or process could anyone operate using power that was so unreliable? I can't think of one. Maybe you could dump this power to heaters or some other extremely cheap load, but what you'd do with the heat I'm not sure. At one point I had the idea of using surplus electricity to heat crushed concrete, with the goal of dehydrating the cement and converting it back to separate streams of cement, sand and aggregate for recycling into new concrete. I don't know if this is chemically possible (does hot cement react with sand or otherwise become inseparable and unusable?), but at 22.1% capacity factor the kilns and sifters and whatnot would have to be very cheap to make this a workable proposition and you'd have to get the power for close to nothing.
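For anyone who wants to redo the pixel arithmetic, here is the whole chain in a few lines. The pixel counts are the Gimp measurements above; small rounding differences from measuring pixels are expected:

```python
# Capacity factor from pixel counting on the EIA generation plot.
plot_w, plot_h = 489, 143      # plot area in pixels (width x height)
red_px = 32026                 # pixels under the generation curve

cf_total = red_px / (plot_w * plot_h)   # capacity factor vs. peak
clip_h = round(cf_total * plot_h)       # clip level in pixels (~65)

# Pixels below the clip line, using the 73.9% fill factor measured
# for the truncated plot in the text:
clipped_px = 0.739 * clip_h * plot_w
leftover_px = red_px - clipped_px
# Leftover CF relative to the capacity band above the clip line:
cf_leftover = leftover_px / ((plot_h - clip_h) * plot_w)
print(f"Total CF: {cf_total:.1%}")
print(f"Leftover CF above the clip: {cf_leftover:.1%}")
```

This reproduces the 45.8% total and roughly the 22% leftover figure.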
"Renewables" fanatics (maybe I should start calling them "windbags") like to say that the wind is always blowing somewhere. At least for Texas in the month of March 2014, that much was true. However, there were many periods even in that blustery month where it was certainly not blowing hard enough to keep essential 24/7 things running. When it comes down to e.g. pumping stations filling up and backing raw sewage into homes and buildings because the unreliables are not there that day, even the most fanatic Green is likely to burn fossil fuel instead.
The unreliable sources of energy are simply not going to replace fossil fuels. They can't; their characteristics make it an engineering impossibility. This is why Greens need to drop their contrived objections to those sources of energy which actually can.
After reading "Ender's Game" and so many of his other works, I thought that it was mighty hard to go wrong with Orson Scott Card.
I can now say that it is a sinking feeling to find out just how wrong I was. At least, when he affixes his name to something as a co-author.
I read "Earth Unaware", and shouldn't have bothered finishing it. Is OSC phoning it in, or what? Based on what's in the book, I'm assuming that his contribution was his name and some of the characters which show up in later novels. His co-author Aaron Johnson has no concept whatsoever of orbital mechanics, radio propagation, or even basic physics like elementary particles. The least one can do if one can't remember the difference between gamma rays and cosmic rays is look it up, and it's excruciatingly obvious that he didn't bother.
There are scientific howlers throughout the off-earth parts of the book. An alien ship moving at a large fraction of lightspeed... takes weeks to travel inside the orbit of Jupiter, and human ships of far inferior capability nevertheless match velocities with it! This ship generates radio noise of sufficient intensity to block non-laser communications... yet humanity appears to have forgotten how to make and use radio telescopes, because it hasn't been detected, pinpointed and intensively studied because of those very emissions! Dust particles and gas in the Kuiper belt are a hazard to space-suited persons outside ships moving at speed... ignoring the fact that the solar wind sweeps dust into interstellar space, and the solar wind itself is moving as fast or faster than the velocities given in the book! Oh, best of all: the simple expedient of orienting the ship so that the hull intercepts anything from the direction of motion never occurs to the author.
If Card cared for his reputation, he'd ask Tor to remove his name from this book. If Tor had any integrity, they'd stop selling it as SCIENCE fiction.
TL;DR: No. At least, not chemically.
In a discussion about residential radon in which I linked this NIH study which found that lung cancer decreases with rising residential radon levels, someone asserted that you could die by radon poisoning. I challenged that, saying that I'd calculate just what would happen if you had even 0.1 vol% radon in air. Then someone else said that people had died by radon suffocation in Appalachia, so I went and did it.
Stipulate that 0.1% by volume in air is two orders of magnitude below anything presenting an asphyxiation hazard. Radon has a density of 9.73 grams/liter, so 0.1% by volume would be 9.73 mg per liter or 43.8 μmol/liter.
The half-life of Rn-222 (the only isotope which lasts long enough to get out of soil and hang around much) is 3.824 days. This means that there's 1/e of it left after 5.52 days, so 2.1e-6 (1/(86400*5.52)) of it decays per second. For 43.8 μmol, this is 9.2e-11 mol/sec decaying, or 55 TBq (terabecquerels).
The decay energy of Rn-222 is 5.5 MeV, so those 43.8 μmol have a total power output of 3.04e20 eV/sec. An electron-volt is 1.602e-19 J, so that works out to 48.7 watts per liter of air. The air in a room 3 m x 3 m x 2.5 m high (22,500 liters), spiked with 43.8 μmol/liter of Rn-222, would release about 1.1 megawatts of heat.
Anything and anyone in such a room would catch fire in seconds. There would be no time to suffocate.
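The whole radon-heating envelope fits in a few lines. Like the text, this counts only the 5.5 MeV alpha of Rn-222 itself and ignores the decay heat of the daughters (which would only make it worse):

```python
conc_mol_per_L = 43.8e-6    # Rn-222 at 0.1 vol%, mol per liter of air
half_life_s = 3.824 * 86400 # 3.824 days in seconds
mean_life_s = half_life_s / 0.6931
decay_frac_per_s = 1.0 / mean_life_s
AVOGADRO = 6.022e23
EV_TO_J = 1.602e-19
decay_energy_eV = 5.5e6     # alpha decay energy of Rn-222

decays_per_s = conc_mol_per_L * decay_frac_per_s * AVOGADRO
watts_per_L = decays_per_s * decay_energy_eV * EV_TO_J
room_L = 3 * 3 * 2.5 * 1000  # 3 m x 3 m x 2.5 m room, in liters
print(f"Activity: {decays_per_s/1e12:.0f} TBq per liter")
print(f"Heat: {watts_per_L:.1f} W/liter, "
      f"{watts_per_L*room_L/1e6:.1f} MW per room")
```

This reproduces the 55 TBq, ~49 W/liter and ~1.1 MW figures above.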
Do I need to mention that if such high concentrations of radon were found in nature, people would pump it into tanks and use it to boil water? It would be one of the most fantastic sources of free energy imaginable.
Pursuant to a discussion on Rod Adams' blog, I'm likely going to be digging into ultrasupercritical steam cycles today. This made me realize that most people who do this kind of analysis are either paid money for the effort, or are studying so they can be paid money for it.
Should I be taking commissions for analytical posts? Something like GoFundMe? I can use the money.
Are there any gaps in the public-domain analysis out there that maybe I could fill?
Bend my (virtual) ear.
I picked up a link to an article on the Monticello nuclear power plant and the alleged un-economic nature of the cost overruns of its recent power uprate, and went to try to add this to the discussion:
Let's have a look at the facts, shall we?
The Monticello nuclear plant is rated at 671 megawatts net. The plant can be expected to average more than 90% of this figure (more than 600 MW), and since refueling outages are scheduled for seasons of low demand its useful capacity factor is close to 100% during the peaks of summer and winter. All of that generation is free of air emissions of any kind, especially carbon.
It may be true that...
it’s enough money to install over 400 megawatts of new wind power.
But 400 nameplate megawatts of wind turbines, even at a generous capacity factor of 40%, is just 160 megawatts average (barely more than 1/4 of Monticello). Neither does that figure include the cost of new transmission lines and other upgrades which are required by the new wind even if they're not billed to it; those can cost as much as the wind farms themselves. Worst, wind farms go dead during winter and summer high-pressure systems which bring heat waves and cold snaps.
Ignoring these things won't make them go away. Tragically, the advocates of "renewables" appear to be sticking their fingers in their ears to avoid hearing the words of Robert F. Kennedy, Jr: "the plants that we're building, the wind plants and the solar plants, are gas plants". Replacing Monticello with wind plus gas means about 2 million extra tons of CO2 emissions per year, about 30 million tons by 2030. At a social cost of perhaps $50 per ton, that is $1.5 billion to go "renewable".
Please wake up.
However, immediately upon creating an account to enter this, I was faced with this screen:
Apparently, the drawbridge has been pre-emptively raised against any attempt of the truth to invade their little castle of delusion.
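For the record, the CO2 arithmetic in the comment above can be sketched in a few lines. It assumes gas fills whatever the wind doesn't cover, and the 0.5 tons CO2 per MWh emission factor for combined-cycle gas is an assumed round figure (real plants vary roughly from 0.35 to 0.55):

```python
nuclear_avg_MW = 600   # Monticello average output (90%+ of 671 MW)
wind_avg_MW = 160      # 400 MW nameplate at a generous 40% CF
gas_MW = nuclear_avg_MW - wind_avg_MW   # gas fills the gap
HOURS_PER_YEAR = 8766
T_CO2_PER_MWH = 0.5    # assumed emission factor for CCGT

t_per_year = gas_MW * HOURS_PER_YEAR * T_CO2_PER_MWH
t_by_2030 = t_per_year * 15
print(f"Extra CO2: {t_per_year/1e6:.1f} million tons/year")
print(f"By 2030: {t_by_2030/1e6:.0f} million tons, "
      f"${t_by_2030*50/1e9:.1f} billion at $50/ton")
```

The result is close to the 2 million tons/year and ~$1.5 billion figures in the comment.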
Leveraging existing assets is one of our best prospects for cleaning up the air and cutting petroleum consumption. An under-appreciated possibility is using conventional hybrids, even retrofitting existing ones, to substitute a bit of grid power for petroleum fuel.
In an article on advanced lead-carbon batteries for mild hybrids at GCC, there are these notable phrases:
The state-of-charge (SoC) of current lead-carbon batteries is typically maintained at between 30 and 50%.... Advanced lead-carbon batteries for vehicles currently under development will be capable of operating in the 30 to 70% SoC range at 12.5kW.
That's operation. What about off-line? Lead-acid likes to be held at 100% SOC, and I doubt that lead-carbon suffers at all from it. NiMH batteries also appear to prefer to be held at 100% SOC and cycled only shallowly. Neither could accept regenerative braking power when full, but when the vehicle is starting and driving shortly after start there is no braking energy to recover.
The GCC article continues:
Future battery developments will most likely combine advanced lead-carbon electrochemistry with ... substantially reducing the size of a 1 kWh battery required for mild electrification of the powertrain.
The hybrids and mild hybrids of tomorrow will have on the order of 1 kWh (above) to 1.3 kWh (base Prius) of battery, and operate it in a SOC range centered between 40% and 50%. But if the battery was charged to 100% SOC off-line, there would be between 500 and 700 Wh of extra energy to move the car (or for other functions, like instant high-power defrost). This is enough for perhaps 2 miles of petroleum-free driving, perhaps more if the first stretch after starting is creeping in traffic for an extended distance.
How much fuel could this save? Approximately 930,000 hybrids (not plug-ins or BEVs) were sold in the USA in 2012-3. If we assume 10 battery top-offs per week 52 weeks a year, 2 miles range per top-off and 40 MPG consumption avoided, the 2012-3 fleet would avoid about half a gallon of consumption per vehicle per week, 26 gallons/vehicle/year, roughly 24 million gallons/year for the fleet. The total for the hybrid fleet going back to the oldest Priuses on the road would be multiples of this, perhaps a good fraction of a percent of total US gasoline consumption.
What could this do for the grid? 1.2 kWh/day 5 days a week isn't much, but if you can draw it on demand it might be worth something. The J1772 Level 1 spec is 120 VAC @ 12 A, or 1440 watts. 930,000 vehicles @ 1.44 kW/vehicle is 1.34 GW of potential demand. It would take only about 25 minutes to put 600 Wh into a battery at that pace, but 25 minutes of demand equal to a large nuclear plant available twice a day (perhaps 3x, if vehicles are plugged in during after-work errands) might be very useful to ISOs for down-regulation of the grid.
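The grid-side numbers above are a one-minute calculation:

```python
fleet = 930_000              # hybrids sold in the US, 2012-3
level1_kW = 120 * 12 / 1000  # J1772 Level 1: 120 VAC at 12 A -> 1.44 kW
topoff_Wh = 600              # spare battery energy per top-off

fleet_demand_GW = fleet * level1_kW / 1e6
minutes_per_topoff = topoff_Wh / (level1_kW * 1000) * 60
print(f"Fleet demand if all plugged in: {fleet_demand_GW:.2f} GW")
print(f"Minutes to deliver one top-off: {minutes_per_topoff:.0f}")
```

That is the 1.34 GW and ~25 minutes quoted in the text: a dispatchable load about the size of a large nuclear plant, available a couple of times a day.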
This is a small possibility, not a big possibility. But there is a big push for mild hybrids (perhaps multiples of the number of conventional hybrids), and a lot of littles make a lot.
Do you recall the PSR hysteria about the unloading of SNF from Fukushima Dai'ichi Unit 4, and how it was a deadly threat to all life on earth, the "most dangerous moment since the Cuban Missile Crisis"? (Older post, later update.)
Well, I must announce that Wasserman's worst nightmare must have come true. TEPCO announced that all spent fuel has been removed from the Unit 4 fuel pool, leaving only 180 unused fuel elements (unused fuel is safe to handle by hand). Since we were told that it was impossible to handle this job safely, yet we heard nothing of this, it stands to reason that it killed us all in our sleep, and we are in some dreary Hadean afterlife in which we continue to go through the motions of existence without any knowledge of what happened to us. It must be some dreadful thing like this, because Harvey Wasserman, PSR, UCS, etc. could not possibly have been wrong. Could they?
Labels: Fukushima, PSR, satire, TEPCO, Wasserman
Mostly, they seem to be lying to themselves. However, that they repeat falsehoods as fact cannot be disputed and suggests that the entire Green philosophy is one enormous delusion.
A little while ago, I was flabbergasted by one Bob Wallace, moderator, censor and ban-hammer wielder of Cleantechnica.com [1], who flatly stated that nuclear is not dispatchable. Not just once to me, but many times going back years.
Digging a little deeper I found that this meme-virus had some fairly old (as the web goes) roots. The Wikipedia article on dispatchability still says as of 2014-10-08 that nuclear power is not dispatchable. This claim goes back to the very first version of the page in 2006.
In my research I found a grid-operator definition of dispatchable (meaning a generator which has filed an energy supply curve with the ISO), but I'm unable to find it again. But the EIA rides to the rescue. Under "Dispatchable Technologies", it lists:
- Conventional Coal
- Advanced Coal with CCS
AND (last but not least)
- Advanced Nuclear
(there being no un-advanced nuclear entering the market in 2018)
Bob Wallace will probably back-pedal and claim that today's nuclear isn't advanced and isn't dispatchable. But for the last 30 years, France has been load-following with its fleet of 1980's-vintage PWRs. How can you follow load, if you can't dispatch generation on demand to follow it? Ridiculous.
I expect to see more such howlers from Bob Wallace and his ilk. AAMOF, I would wager that he'd even repeat his "nuclear is not dispatchable" claim after being referred to the EIA page. In other words, what Bob Wallace says can only be taken as a recitation of Green dogma. One cannot expect any evidence of independent thought or any acceptance of facts from outside his echo chamber.
A lot of what Bob Wallace says (along with the rest of orthodox Greens) is out-and-out lies. This is not to say that they are consciously lying. They may very well have accepted a delusional belief system structured to support the romantic Green vision, and can't pry themselves away from it. But that means that they have, for all intents and purposes, surrendered their ability to think. Without the ability to actually reason from facts, even uncomfortable facts, they are intellectual zombies: shambling along, eating the brains of those too slow to evade them and turning them into more like themselves.
And that, folks, is why so much discussion of our energy future is Sofa King Stew Pit.
He doesn't like it when you use the phrase "put you in my crosshairs". There's a list of other things he doesn't like, none of which are noted prominently enough for the casual Cleantechnica commenter to have any idea what they are before being censored, banned or both for transgressing them.
Cyril R., on SeekerBlog
If we are serious about greenhouse gas emissions then the only way forward is to realize we need 10x more energy even with the best energy efficiency technology employed globally. And then plan fossil fuel consumption accordingly. But then we have 10x the energy need while we need 10x less ghg emissions long term! I think we will find that in such a future world, we may only use (10% of 10% =) 1% fossil fuels to stay within long term acceptable ghg emissions limits. It is such a drastic reduction that we really need to think about zero carbon economies. CCS with 90% efficiency will not be good enough. It will need to be 99% efficient. Solar and wind grids with 30% natural gas backup will not be acceptable. They will have to be 98% solar and wind and 2% natural gas.
Once you look at those futures it is clear that wind and solar and CCS can’t cut the mustard. They are unacceptable greenhouse gas emitters…
Hence in my opinion such technologies are part of the global energy problem (except for niche uses), not part of the solution.
With sufficiently cheap energy, pollution would become a thing of the past; at current levels of energy usage, this is impractical. Most of our species is still locked in the subsistence-agriculture trap, and simply extending the First World system of coal- and petroleum-fueled industrialism to them would probably destroy the planet.
This knowledge is the specter at the feast of progress, now that Western civilization is overcoming the self-inflicted wound of 1914 and its consequences. The so-called "soft" path is nothing more than a return to the animalistic misery of pre-industrial times; sunlight is simply too diffuse to maintain even the present level of consumption. Hard, concentrated energy is the essential prerequisite for an economy like ours; conservation and increased efficiency merely delay the problem without solving it. Restricting growth means a boot in the face of the world's have-nots; a tiny island of comfort on a swelling mound of resentful pain; eventually it means impoverishing everyone.
— S.M. Stirling, foreword to Power, copyright 1991.
I'm used to seeing nonsense on this heyah in-tar-web thing, but sometimes I stumble across something that strikes me as... special, yeah. Call it special.
One such type of special is the blogger Stock, who calls his blog Nuke Professional. His latest post is supposedly about "understanding cesium".
How does he go about understanding it? He starts "With all the big numbers, scientific notation, various ways of expressing radiation units that the pro-radiation stakeholders use to confuse people...." The various radiation units are products of history and the essential physics, and scientific notation was invented to make both large and small numbers easier to keep track of and understand. If the public doesn't follow them, the scientific community isn't at fault: they recommend that everyone learn some science, and can't be blamed if people don't.
From all appearances, Stock goes out of his way to prove that he doesn't understand big numbers, small numbers, and orders of magnitude. Take the obfuscatory incredulousness about 4 trillion Bq of Cs-137 supposedly stirred up along with dust at F. Dai'ichi. He expresses disbelief that the quantity of radio-cesium could be as small as 1.25 grams. If you were a professional, wouldn't your first question be about the actual concentration of Cs-137 in said dust?
Stock professes to read ENEnews, and my first search on the terms "cesium dust fukushima" turned up an ENEnews page claiming "over 200,000 Bq/kg" in dust from the site. That's 2.0×10⁵ Bq/kg, for all you evil scientific-notation users out there.
4 trillion Bq becomes 4×10¹² Bq in the radiation scientists' secret code. If we use the arithmetic trick called division, we can do this:
4×10¹² Bq dispersed ÷ 2.0×10⁵ Bq/kg dust = 2×10⁷ kg dust dispersed.
That 1.25 grams of Cs-137 appears to have been distributed amongst on the order of twenty MILLION kilograms (twenty thousand metric tons) of dust. The radioactive material was a very small fraction (about 60 parts per trillion) of a rather large total mass. And I got all of this from a source that he cites with apparent approval.
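Here's the whole dust calculation in one place. The 88 Ci/g specific activity is the figure used later in this piece; it lands within a couple percent of the 1.25 g quoted above:

```python
total_Bq = 4e12                  # 4 trillion Bq of Cs-137 dispersed
specific_activity = 88 * 3.7e10  # Bq per gram, from 88 Ci/g
dust_Bq_per_kg = 2.0e5           # ENEnews figure for dust at the site

cs_grams = total_Bq / specific_activity        # mass of the Cs-137
dust_kg = total_Bq / dust_Bq_per_kg            # mass of the dust
ppt = cs_grams / (dust_kg * 1000) * 1e12       # parts per trillion
print(f"Cs-137 mass: {cs_grams:.2f} g in {dust_kg:.1e} kg of dust "
      f"({ppt:.0f} parts per trillion)")
```

About 1.2 grams in twenty million kilograms: roughly 60 parts per trillion, as stated.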
Stock writes this:
slightly more than 1/3 of ONE GRAM of cesium 137, deposited across a square mile of land as a smoke or gas, is enough to render that land uninhabitable for decades.
Is that correct? Cs-137 has an activity of 88 curies per gram, so 1/3 gram (about 29 Ci) per square km (the claim said per square mile, which is 2.6 km², so using a square km overstates the dose) is 29 μCi/m², or about 1 million Bq/m². How much actual radiation would that expose you to? Let's haul out another virtual envelope:
Assume that a human standing on this contaminated ground covers an area of 1/10 square meter. Half the radiation goes straight down, half goes straight up. The beta radiation from Cs-137 is blocked by the soles of the shoes, but all the gamma radiation from the decay of Ba-137m is absorbed. That's 1 million decays per square meter, times 0.5 going upward, times 1/10 square meter: 50,000 gammas per second absorbed. The absorbed energy is 5×10⁴ events/sec × 6.62×10⁵ eV/event = 3.31×10¹⁰ eV/sec = 5.3×10⁻⁹ J/sec = 1.9×10⁻⁵ J/hr. For a body weight of 50 kg, this is an absorbed dose of about 0.38 microjoules per kg per hour, or 0.38 μGray (roughly μSv) per hour.
Unless I've slipped a decimal place, that's not even 9 milliSieverts (mSv) per year. That's less than people in Colorado average from groundshine, cosmic rays, and radon. If that land was uninhabitable, Colorado is uninhabitable. Such an assertion is as insane as the claim (made in all apparent seriousness by a certain Canadian who I believe I read lives on disability) that California is a wasteland.
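Rerunning the envelope is the easiest way to catch a slipped decimal. This sketch recomputes the chain from 1 MBq/m² under the same assumptions (0.1 m² footprint, half the decays upward, 662 keV per gamma, all of it absorbed by a 50 kg body):

```python
bq_per_m2 = 1.0e6       # ground contamination, ~1/3 g Cs-137 per km^2
body_area_m2 = 0.1      # footprint of a standing person
upward_fraction = 0.5   # half the decays go up, half down
gamma_eV = 6.62e5       # Ba-137m gamma energy, 662 keV
EV_TO_J = 1.602e-19
body_kg = 50.0

gammas_per_s = bq_per_m2 * upward_fraction * body_area_m2
gy_per_hr = gammas_per_s * gamma_eV * EV_TO_J * 3600 / body_kg
print(f"Dose rate: {gy_per_hr*1e6:.2f} uGy/h, "
      f"{gy_per_hr*8766*1000:.1f} mGy/yr")
```

This lands near 0.4 μGy/h, a few mGy (roughly mSv) per year: comfortably within the range of natural background.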
That land would not only be safe to live on; it would be safe at ten times the Cs-137 concentration, or even 50 times. And were it spiced up to 1000 times, it would not be long before rainwater washed the cesium down through the soil and reduced the surface exposure levels back to something quite tolerable. That very process is going on right now in Fukushima prefecture; radiation levels are falling much faster than the radioisotopes decay, and areas are being cleared for habitation even at Japan's hypochondriacally low standards for safety.
So, uninhabitable for decades? I don't think so. And I'm pondering a Kickstarter campaign to go live in Fukushima prefecture for a year to prove it. A year of sake, sushi and blogging. Who could ask for more?
NNadir, commenting at TheEnergyCollective:
And your theory is that building vast carbon dioxide waste dumps to contain 31 billion metric tons of a gaseous compound would be easier? If one builds a waste dump, it involves charging waste fees. If one builds a power plant, one has an asset. Which one makes economic sense? A dump or an asset?
The world's largest carbon dioxide waste dump functioning right now on this planet contains what percentage of the carbon dioxide as of 2014?
Anyone mentioning the non-issues of waste (what is carbon dioxide?), proliferation (how many people died in nuclear wars in the last half century, and how many died in oil wars?) and the absurd issue of safety - when one considers air pollution deaths and fossil fuel accidents (how many people died in oil and gas explosions compared to the number of dead from nuclear power plant failures in the last half a century?) - while attaching these issues only to nuclear energy and nothing else is simply not being serious.
Nuclear exceptionalism is simply not rational. The worst kind of critic of nuclear energy is one who pretends to be rational and open about it while dragging out tiresome nuclear exceptionalism rhetoric that simply doesn't stand scrutiny.
Repeating nuclear myths while pretending to decry their effects is not helpful, nor is it, really, ethical.
There is no way in hell that so-called renewable energy will produce 34% of humanity's energy, as it has failed to do this in half a century of trying.
Containing 14% of the world's carbon dioxide in a dump involves the capture of 41 billion tons per decade. There are, for the record, more than 30 years in a billion seconds. What's your theory, that containing 137 tons of carbon dioxide per second, every second, for half a century is a simple and cheap thing to do? Have you any idea what the technical, financial, and geological issues involved in this outcome would be?
Your guess is that this can be brought on line in 15 years, by 2029? Faster than breeder reactors? Russia and India both brought breeders to completion in the last two years. How many billion-ton carbon dioxide dumps were built in the last two years?
We built more than 400 reactors on this planet in about 20 years, and they produced an average of 28 exajoules of energy each year of the first decade of the 21st century. By contrast, CCS is something that's never been done. There is no way in hell that it will be easier to contain hundreds of billions of tons of carbon dioxide each decade when we can't find a way to store 75,000 tons - collected over half a century - of an insoluble, relatively harmless solid, as much of used nuclear fuel is.
The problem is that you are attempting to compare a theory that has failed in all cases to become significant with a reality, and coupling it with the logical fallacy of "appeal to popularity."
Nuclear energy may not be able to stop climate change in its tracks - surely it won't, because with this kind of rhetoric flying around this late in the game it can't - but it need not do so to be the best possible, cheapest possible, and most experimentally verified approach to doing so. Any money that is diverted from nuclear energy to CCS, or for that matter, so-called "renewable" energy is essentially a decision to commit suicide at this point, whether the general public knows it or not.
I wouldn't, by the way, put too much faith in the wisdom of the general public. In the middle of the last millennium the general public was pretty sure that the bubonic plague was best dealt with by prayer rather than improved sanitation. Things are not much better 500 years later.
CCS talk has been, is and always will be the equivalent of doing nothing, and the result of doing nothing is clearly visible in the planetary atmosphere in CO2 measurements over the last decade. We blew past 400 ppm this year, and we will blow through 450 ppm just as quickly as we went from 350 to here. We were at 350 in 1987, and I doubt, very much, that it will take more than 25 years to hit 450, especially with this kind of cynicism floating around.
Aruba is a very nice tourist destination. It has sun, sand and water. It also has trade winds, which lash the eastern shore with heavy waves. These winds are the energy source for the Vader Piet wind farm, on Aruba's southeast shore.
Vader Piet exists because of a coup of clever financing. During the credit crisis of 2008, Jerome Guillet managed to get the turbine vendor (Vestas) to back the deal. According to what I can find, the project was completed in just over a year and went live in December 2009.
If everyone lived happily ever after, they're being awfully quiet about it. I've done quite a bit of digging, but I can't find many generation figures for Vader Piet. The follow-on wind farm that was rumored to be in planning has generated absolutely zero news that I've been able to discover. Maybe it's just not in English, and is escaping my American-tuned search engine nets.
Or maybe it's just that these things don't exist.
The IEA has next to zero useful information on the former Netherlands Antilles, beyond the fact that their electric generation is entirely oil-fired. That makes it easier to interpret the EIA data, which has a specific page on Aruba.
I recall rumors that the winds on Aruba allow a wonderfully high capacity factor, around 60%. The 30 MW (10x 3 MW Vestas turbines) farm was expected to produce 18 MW average. In its first year, it didn't do quite so well. Between June 1 2010 and May 8 2011, Vader Piet produced as much as 60.8% and as little as 11.0% of monthly capacity, for an average of 41.5%. That's an average of about 12.45 MW, or 109 GWh/yr. Net generation rose slightly over the 2008-2012 interval, but not much:
109 GWh/yr is a big chunk of energy on a grid that produces just 920 GWh/yr. At 30% efficiency, it's equivalent to 41.5 MW thermal or about 590 bbl/day of oil at 6.1 GJ/bbl. That is about 20% of Aruba's net oil imports. Do we see this happen between 2009 and 2010? Not as I read the EIA data (which doesn't seem to be available as tables for some reason):
We do see a drop, but not a very big one... and it doesn't seem to coincide with the year 2010.
There's a further confounding effect for Aruba: in the same period as the Vader Piet installation, the island was installing some "RECIP" plants to increase the efficiency of the oil-fired electric generation. This is probably what accounts for the other 140 MW of the 170 MW increase in nameplate generating capacity over the last few years. How much of the decrease was due to better efficiency of the oil-burning generators, versus displacement by wind? This page claims 30% greater efficiency of the new diesels vs. the old steam turbines. That would produce the observed efficiency increase all by itself.
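The oil-displacement arithmetic above can be sanity-checked with a short script. All inputs are figures quoted in this post; the 30% thermal efficiency and 6.1 GJ/bbl are the stated assumptions, not measurements.

```python
# Back-of-envelope check of the Vader Piet figures: 30 MW nameplate at
# the observed 41.5% capacity factor, displacing oil burned at 30%
# thermal efficiency, 6.1 GJ per barrel (all figures from the text).

NAMEPLATE_MW = 30.0       # 10x 3 MW Vestas turbines
CAPACITY_FACTOR = 0.415   # observed June 2010 - May 2011 average
HOURS_PER_YEAR = 8760
THERMAL_EFF = 0.30        # assumed efficiency of displaced oil-fired plant
GJ_PER_BBL = 6.1          # assumed energy content of a barrel of oil

avg_mw = NAMEPLATE_MW * CAPACITY_FACTOR        # average electric output
gwh_per_year = avg_mw * HOURS_PER_YEAR / 1000  # annual generation
thermal_mw = avg_mw / THERMAL_EFF              # displaced thermal power
gj_per_day = thermal_mw * 1e6 * 86400 / 1e9    # MW -> J/s -> GJ/day
bbl_per_day = gj_per_day / GJ_PER_BBL          # displaced oil

print(f"{avg_mw:.2f} MW average, {gwh_per_year:.0f} GWh/yr")
print(f"{thermal_mw:.1f} MW thermal, {bbl_per_day:.0f} bbl/day")
```

This reproduces the 12.45 MW, 109 GWh/yr, 41.5 MW thermal, and roughly 590 bbl/day figures to within rounding.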
If wind is going to replace fossil fuels and eliminate carbon emissions, there should be few places it would work better than Aruba. Despite this, the evidence that it is working in Aruba is spotty at best. That is a mighty slim reed on which to hang the continued existence of industrial civilization and a liveable climate.
This was posted at Hiroshima Syndrome, but because the blog format does not allow permalinks to specific content I'm going to quote the entire March 1 2014 entry below. Do not skim, read every word.
March 1, 2014
Naoto Kan’s crime against Japan
This past week, a court panel in Tokyo rejected a criminal suit against former PM Naoto Kan concerning his actions during the first week of the Fukushima accident. Kan and five other officials allegedly caused the premature deaths of numerous people due to the chaotic Fukushima Daiichi evacuation. The panel said they could find no proof of the claim. I was waiting to see how the case would turn out before writing what follows. If criminal charges would have been filed, my opinion would be little more than adding insult to injury. I no longer feel this constraint.
There are numerous detailed reports concerning what happened at Fukushima, including my E-book Fukushima: The First Five Days. All of these sources show that soon after midnight of March 12, 2011, Naoto Kan made an executive decision. In my opinion, the events caused by Kan’s decision warrant criminal charges being brought against him, but not for the evacuation. His decision may have been the main reason for the severity of meltdowns with units #1, 2 & 3, and the sole cause of the hydrogen explosions at units #1, 3 & 4.
At ~12:20 am, site manager Yoshida wanted to begin the work of manually depressurizing unit #1 and asked the company’s home office for permission. Tepco-Tokyo dutifully forwarded the request to Kan, whose approval should have been a perfunctory “yes”. However, Kan told them to not depressurize until (1) the entire 3km radius’ evacuation was confirmed, and (2) a 3am Press conference in Tokyo was held to announce the impending depressurization. The Press conference was held at 3:06am, but the 3km radius could not be confirmed as evacuated until ~9am. I firmly believe these politically-mandated delays are the prime reason for the full core-relocating meltdown of unit #1 and the hydrogen explosion which decimated the upper story of the Reactor Building at 3:36pm. If these delays had not been ordered by Kan, the full meltdown could well have been mitigated and the building explosion completely avoided.
The records kept by the staff and management team at F. Daiichi show that at 10pm on March 11, control room indication for reactor water level had been energized and there was more than 20 inches of water above the top of the fuel core inside unit #1. The meltdown could not have yet begun with that much water in the core. However, reactor building radiation levels were increasing. By 11pm, radiation levels at the Turbine Building access were increasing, as well. It may have been at this point that the fuel inside the reactor was beginning to be uncovered. But, as long as there was any water and steam inside the RPV, it is unlikely that a full meltdown would happen. When the actual melting of the core began is speculative, at best, but it does not seem to have begun before midnight. Soon after midnight, site manager Yoshida ordered the staff to prepare to depressurize the Primary Containment structure surrounding the reactor itself. The lower pressure would allow low pressure fire pumps to inject cooling water into the core and stop the progression of core damage. Local authorities said the 3km radius was fully evacuated at 12:30am, so Yoshida wanted depressurization to begin in earnest. They needed Tokyo’s approval. At 1:30am they were told of Kan’s two criteria for depressurization by Tepco-Tokyo.
It is quite likely that if the depressurization of unit #1 would have happened at 1:30am, the amount of fuel melting in the core would have been severe, but a full core relocation unlikely. Some hydrogen may have begun seeping out of the PCV and into the outer reactor building. However, it is unlikely that the hydrogen level in the outer building would have been sufficient for the later-in-the-day explosion. When the actual depressurization occurred at ~ 10am, the fuel core was fully melted and had relocated to the bottom head of the pressure vessel. Also, large volumes of hydrogen gas had entered the outer reactor building in sufficient quantity to cause the subsequent explosion. The depressurization was way, way too late. The person most responsible for this situation was Naoto Kan.
The explosion with unit #1 came just six minutes after a high-voltage mobile diesel had begun sending electricity into unit #1. The staff was on the verge of starting the high-pressure Standby Liquid Control (SLC) system which would have been able to inject water inside the RPV. Flying concrete shards from the hydrogen explosion shorted out the heavy-duty cable that had been spliced between the diesel and a switchboard inside the reactor building. Flying debris also smashed into the diesel and knocked it out of commission. If it were not for the unit #1 hydrogen explosion, it is safe to say unit #1 would have been in a safe condition rather quickly, and unit #2 re-energized through tandem-unit interconnections soon thereafter. If the depressurization would have been allowed at ~1:30am on March 12, 2011, it is probable that unit #1’s fuel damage would have been stanched at the partial/severe meltdown stage and the unit #1 hydrogen explosion would never have happened! Further, it is likely that unit #2 would have completely avoided meltdown since the fuel core did not begin to uncover until around 4:30pm on March 14th! In fact, unit #2 did not lose its steam-powered emergency cooling pumps (RCIC and HPCI) until a few hours before the core began to uncover. If there had been no unit #1 explosion, there would have been no fuel melting and relatively minor fuel bundle damage.
It is possible that unit #3 could have been saved, as well. A second high-voltage mobile diesel was on its way to unit #3 from units #5&6, when the unit #1 explosion occurred. The road was damaged by the earthquake and tsunami debris had to be removed as the diesel made its way down from the two undamaged units on the bluff above units #1 through #4. It was a slow go. When unit #1 exploded, there was even more debris to clear than before, making the trip even slower. About half of the spool of heavy-duty cable used to splice the first diesel into unit #1 remained. The balance of the cable could have been used to connect the second diesel with unit #3 and reenergize the emergency cooling systems. The flow of water into unit #3’s core was not terminated until 2:42am on March 13th. The meltdown probably didn’t begin until after that time. Thus, unit #3 might have been saved if not for the hydrogen explosion with unit #1 at 3:36pm on March 12.
Kan would argue, I’m sure, that he ordered the delays to ensure that no member of the public would be exposed to the radioactive gasses released by depressurizing unit #1. However, the wind was blowing out to sea on March 11th and was projected to stay that way for at least two days, which was not unknown to Kan and his emergency team in Tokyo. Real-time meteorological data from a computerized system, acronym SPEEDI, was available to Kan the entire time, but he negligently chose to ignore it because he felt meteorological forecasting was inherently inaccurate.
Clearly, Kan panicked and gave orders that exacerbated the severity of the accident. He more than “meddled” - he criminally interfered! While we cannot say that Naoto Kan’s negligence caused the Fukushima accident, it seems that we can point a guilty finger at the former PM and say that he was the primary reason for the severity of the accident and the one person most responsible for all three hydrogen explosions. In my honest opinion, he should be criminally indicted for executive malfeasance, meddling in the emergency actions at F. Daiichi, placing the station’s entire staff in an unnecessary state of danger, and causing completely avoidable anguish to be inflicted on the people of Japan.
Labels: Fukushima, Japan
One of the nice things about analyses you can do on the back of an envelope is that they are easier to understand and lend themselves to settling issues. It occurred to me that a comparison of US LDV carbon emissions to the EV-related emissions from a nuclearized grid would be just one of those things.
First off, gasoline. Motor gasoline forms about 20 pounds of CO2 per gallon burned. In 2012, US LDVs burned 137 billion gallons of the stuff
for total emissions around 2.74 trillion pounds or 1.24 billion metric tons. At a guesstimated average fuel economy of 24.6 MPG, that same 137 billion gallons powered 3.37 trillion vehicle-miles travelled (VMT). Dividing tons by miles and moving the decimal point 6 places to the right to get grams, this comes out to 368 gCO2/mi or 229 gCO2/km.
Suppose that the average US vehicle did not have the characteristics of an ICE-powered light truck, but a Tesla Model S. Its energy consumption from the wall is 380 Wh/mi. Dividing by average transmission efficiency of 93%, this would be 409 Wh/mile at the generator. If it were charging off the French grid, with its net emissions of 77 gCO2/kWh, the vehicle's net emissions would be 31 gCO2/mi or 19.5 gCO2/km.
Things would not be so clean in "renewable" Denmark. The emissions from the Danish grid, at 385 gCO2/kWh, would result in 155 gCO2/mi or 97 gCO2/km. Some ICE-powered vehicles already emit less than this. And of course in coal-fired Australia, at 850 gCO2/kWh...
Climate scientists claim that we need no less than an 80% reduction in CO2 emissions to stabilize the atmosphere. This brief analysis shows that "renewables" will not get us there, even with electric vehicles. However, the combination of EVs and nuclear energy can achieve a reduction of around 92% even given a rather large and powerful EV, assuming French levels of carbon emission from generation. This is a pessimistic analysis in some ways; I've not assumed any reduction in per-kWh emissions due to increased base-load generation made possible by electrification and demand-side management of vehicle charging. Filling in the overnight demand trough and serving it with nuclear would reduce emissions at all times of day.
Another angle: supposedly there's room for about 1 ton/capita/year of carbon emissions. At 31 grams/mile, the 13,000 miles/yr travelled by the average US vehicle would emit just 400 kg of CO2. That leaves plenty of room for other things.
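The whole gasoline-vs-EV comparison above fits in a few lines of arithmetic. Every input is a figure quoted in this post; small discrepancies (e.g. ~157 vs. the 155 gCO2/mi quoted for Denmark) come from where the intermediate values get rounded.

```python
# Reproduce the back-of-envelope LDV emissions comparison from the text.

KG_PER_LB = 0.4536
KM_PER_MILE = 1.609

# Gasoline fleet: 137e9 gallons in 2012 at 20 lb CO2/gal, 24.6 MPG average.
gallons = 137e9
tonnes_co2 = gallons * 20 * KG_PER_LB / 1000    # metric tons CO2
vmt = gallons * 24.6                            # vehicle-miles travelled
gasoline_g_per_mi = tonnes_co2 * 1e6 / vmt
print(f"Gasoline fleet: {gasoline_g_per_mi:.0f} gCO2/mi")

# Tesla Model S: 380 Wh/mi from the wall, 93% transmission efficiency.
wh_per_mi_gen = 380 / 0.93                      # Wh/mi at the generator

for grid, g_per_kwh in [("France", 77), ("Denmark", 385), ("Australia", 850)]:
    g_per_mi = wh_per_mi_gen / 1000 * g_per_kwh
    print(f"{grid} grid: {g_per_mi:.0f} gCO2/mi, {g_per_mi / KM_PER_MILE:.0f} gCO2/km")
```

The French-grid case comes out at about 31 gCO2/mi against roughly 368 for the gasoline fleet, which is where the ~92% reduction figure comes from.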
The bottom line? There's no existence proof that renewables can save the climate (and plenty of reasons to believe the job is far more difficult than claimed). Nuclear energy can.
Labels: carbon budget, carbon emissions, EVs, nuclear power, transport
TEPCO has announced
that 220 fuel bundles have been transferred from the Unit 4 fuel pool to the common storage pool, roughly 1/8 of the total. Despite this, the islands of Japan, the west coast of the USA, and the world in general remain habitable.
This on-going tragedy affects the scaremongers, whose credibility may be irreparably damaged by this continuing state of normality. Perhaps they can blame the Polar Vortex and California drought on Fukushima, and we can all breathe a sigh of relief that our media has not lost the ability to scare us about SOMETHING.
Labels: Fukushima, sarcasm, satire, TEPCO
Over at Climate Crocks, Christopher Arcus
cites an article at Scientific American on the efficacy of wind power for reducing grid CO2 emissions. Unlike other pieces I've seen on the subject, that one fails to cite any sources for its conclusion, which is:
even if wind produced as much as 50 percent of Spain's electricity the CO2 savings would still be 80 percent of the emissions that would have been produced by the displaced thermal power stations.
This appears to be possible, if the rest of the grid mix is compatible. Hydropower is particularly well-suited, as it has no thermal-cycling limits and little in the way of startup delays. However, hydropower cannot be assumed to be present with the wattage and water storage required. Kodiak island can up its wind power and go diesel-free (so long as electric demand doesn't rise too high), but the rest of the world must deal with other constraints.
One of these constraints is the increased emissions due to more startups and low-load operation of powerplants that would otherwise run more efficiently; this leads to the net reduction in CO2
emissions from most RE being substantially less than their gross contribution to the grid. Argonne National Lab studied this, and stated in the abstract of the paper:
Our results for the power system in the state of Illinois show significant emissions effects from increased cycling and particularly start-ups of thermal power plants. However, we conclude that as the wind power penetration increases, pollutant emissions decrease overall due to the replacement of fossil fuels.
The question becomes, how MUCH do pollutant emissions decrease? There's no fine print in the abstract's text, but what the words giveth, the graphic taketh away:
There's one obvious anomaly in the graph: it strains credulity that the total emissions can decrease proportionally faster than the total fossil generation, as it does at the left edge. This could be due to an error in the baseline introduced by a graphic artist. But aside from that, the emissions curve is distinctly concave upward; well before the middle of the curve, total emissions do not fall as fast as total wind penetration. There's the further question about the total amount of wind generation usable. To achieve more than 40% penetration, the capacity factor of wind would also need to be on the order of 40% or else available power would frequently exceed total demand. Without storage, the excess generation would have to be "curtailed" (spilled). This increases the net cost per kWh.
Using my Gimp-fu to extract data points from the graphic, I get this table of data:
| Penetration, % | mmMT CO2 | % reduction |
|---:|---:|---:|
| 0 | 41.7 | 0.0 |
| 10 | 37.0 | 11.3 |
| 20 | 33.1 | 20.7 |
| 30 | 30.1 | 27.7 |
| 40 | 28.2 | 32.4 |
By 40% penetration, the total emissions reduction from wind has fallen to 81% of its contribution; worse, the total emissions reduction between 30% and 40% wind penetration is just 4.7%, less than half of the fractional addition to generation. This is well into the region of diminishing returns.
According to climate scientists, keeping total climate warming below 2°C¹
requires no less than an 80% reduction in total GHG emissions. Even if we could draw a straight line between the 30% and 40% data points to a hypothetical 100% "penetration" way off the right edge of the graph, the total emissions reduction would only be about 61%; net emissions would still be twice as high as we can allow them to be. Of course, expecting that curve not to bend upward to the right of 40% is a pipe dream.
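The diminishing-returns argument can be made explicit by computing the marginal reduction for each 10-point step of penetration. The data points are the ones extracted from the Argonne graphic; the straight-line extrapolation of the last segment is, as noted, a generous assumption.

```python
# Marginal CO2 reductions from the extracted Argonne data points:
# (penetration %, total emissions in million metric tons CO2)
data = [(0, 41.7), (10, 37.0), (20, 33.1), (30, 30.1), (40, 28.2)]

base = data[0][1]
for (p0, e0), (p1, e1) in zip(data, data[1:]):
    step = (e0 - e1) / base * 100   # points of total reduction in this segment
    print(f"{p0:2d}-{p1}% penetration: {step:.1f} points of emissions reduction")

# Generously extrapolate the last (shallowest) segment straight to 100%.
(p0, e0), (p1, e1) = data[-2], data[-1]
slope = (e1 - e0) / (p1 - p0)
e100 = e1 + slope * (100 - p1)
reduction_100 = (base - e100) / base * 100
print(f"Linear extrapolation to 100%: {reduction_100:.0f}% total reduction")
```

Each 10-point step of penetration buys less than the one before it, and even the straight-line extrapolation tops out around 60%, consistent with the ~61% figure to within rounding. Far short of an 80% cut.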
I hear the objection coming up immediately: "But wind isn't all there is. Solar and other technologies can fill the holes in wind and push emissions down further!" Sadly, solar PV (which is the only kind we're likely to see in private hands or outside sunny deserts) has a very low capacity factor; Germany's is about 11%. Achieving more than 11% penetration gets into the same region where generation exceeds instantaneous demand, and the excess must be stored (expensive) or spilled (driving up cost per kWh, and also requiring a control system to manage generation). Last, the emissions reductions from PV will be subject to the same diminishing returns evident for wind.
- It is broadly true that the addition of wind power to electric grids dominated by fossil-fired plants can reduce total pollutant emissions, including CO2.
- The substitution rapidly runs into diminishing returns (unacknowledged or even denied by the advocates).
- The claims that even an 80% reduction in carbon emissions from electric generation can be achieved with the addition of wind and solar are far-fetched and not credible. Absent other carbon-free generation using large amounts of stored energy (e.g. conventional hydro), a zero-carbon RE grid should be viewed as nigh impossible.
Because of this, if we expect to de-carbonize our supply of electricity (and energy in general) we have to look to sources other than "renewables".
(A hat-tip to Willem Post
for bringing the Argonne paper up!)
¹ James Hansen and others opine that 2°C is well past the safe zone, and we need to shoot for no more than 1°C. This requires a much lower ceiling on emissions, achieved much sooner.
Labels: renewable energy, storage, variability
This is a short piece I need to get out just so I can put all my attention on the major post I'm working on.
Germany's shutdown of many of its nuclear plants has led quite directly and inevitably to increased burning of fossil fuels. (Canadian Energy Issues
pegged this.) Contrary to the false claims of Christopher Arcus on ClimateCrocks, all German coal-fired generation has risen, both hard coal and lignite. This isn't some right-wing claim, it comes straight from the Rocky Mountain Institute.
The Energiewende can only be described as schizophrenic:
- It is represented as "green", but policy decisions are driving consumption away from low-carbon gas and zero-carbon nuclear to the dirtiest fuel on earth, coal.
- It is represented as a "transition", but to date it's more of a return to the 1950's than anything forward-looking.
- Subconsciously it might be an effort to reduce dependency on Russia, but if that was the case, why shut down nuclear plants which could be used to slash gas demand much further?
Then again, how can we expect sensibility from people who panic about tsunamis in the middle of Europe? Sie sind wahnsinnig. (They are insane.)
Don't ever think I don't like my new Fusion Energi. It's literally the best car I've ever had. But you have made some extremely annoying, unsettling or just plain counter-productive decisions (or bugs) in the software, and I wish you'd fix them.
Some of these are weird behaviors that have no obvious explanation:
- The other day I was on a relatively long trip driving on cruise control, and the graphics around the "ECO" symbol on the left-side dash display changed to vertical dotted lines. This has happened before, but what followed had not. From time to time the right-side dotted line flashed red, accompanied by a pattern of 3 pulsed vibrations on the right side of the car. There was no text pop-up to explain what was going on, and obviously it is not safe to search the owner's manual while driving. I was eventually persuaded to pull over and examine the tires to see if something was wrong with them. I found nothing, and the strange behavior went away after the stop. I still have no explanation for this.
- Sometimes when I park and return to the car, the side mirrors are turned all the way down and I have to reset them properly. This has occurred on any number of occasions, sometimes after just minutes away.
- There is a bug in the charging system. Sometimes when I plug in a charger, the car does not recognize it and refuses to charge. The charger can be un-plugged and re-plugged as many times as desired, and the car will still not charge. I had the convenience cord replaced under warranty because of this, with no change; I later reproduced this problem on a public charger. I finally discovered that cycling the ignition would get the car to charge.
But my biggest irritations are things that I ought to be able to get, but you deny me.
- The 12-volt power points shut off when the ignition is off, even when the car is plugged in and charging. There should be no danger of running down the accessory battery, so I fail to see the justification. Maybe I'd like to leave devices plugged in to charge; why can't I?
- Even worse, the 150-watt AC outlet in the center console doesn't work at all when the car is charging, even when the ignition is on! I may have 3 kilowatts coming in through the socket on the fender, but I can't get a lousy 150 watts to charge a laptop. That requires a second cord running to another outlet... IF one is available. It is bizarre to be able to run the car's air conditioning from "shore power", but not a single AC-powered device.
- And at the very basics, how about the electrical specs for the charging port? I am interested in buying a 220 volt charger to install at home. Of course, I am interested in getting the best charging performance the car will allow, for my own convenience and to electrify as much of my driving as I can. I can get chargers that go all the way up to 30 amps, but the higher currents cost more money. How much current will the car accept? YOU WON'T TELL ME! There is NOTHING on the car, in the owner's manual, or on-line that lists the basic electrical specifications that you'll find on a string of Christmas lights. How about coming clean here? My laptop power brick says "INPUT: 100-240V~ 50-60Hz 1.5A", it won't kill you to do the same.
If you really want to thrill me, open up the full specs for the car's high-voltage systems and put me in touch with your product-development engineers. There are a whole heap of options and applications that you haven't touched, and all you need to turn the marketplace loose is an open specification for plugging in. Play your cards right and you could start the PC revolution, only with plug-in vehicles. Think about it. But don't think too long, because the rest of the world is ready to steal a march on you.