19 August 2005

Entropy (or Exergy) of Electricity?

From an engineering perspective, the entropy content of a unit quantity of energy is representative of the amount of useful work that can be derived from it. By work I generally mean moving something: useful work might be turning a drive shaft, for example, as opposed to waste heat, which cannot move anything.

Thermodynamics gives us a relationship between entropy S and energy E (taking some mild liberties):

dS/dE = 1/T

T in this case is the temperature. High entropy content is bad, so we can see that energy which can achieve a high temperature must have lower entropy, and hence be capable of doing more work per unit mass. This follows naturally for chemical fuels: based on their combustion temperature, the Carnot cycle predicts the theoretical maximum efficiency with which they can do work.

efficiency = (T_hot - T_cold) / T_hot
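
To put a number on it, here is a minimal Python sketch; the 2300 K flame temperature and 300 K ambient are round numbers I have assumed for illustration, not measured values:

    # Carnot efficiency limit for an assumed combustion temperature
    T_hot = 2300.0   # K, assumed flame temperature of a hydrocarbon fuel
    T_cold = 300.0   # K, assumed ambient temperature

    efficiency = (T_hot - T_cold) / T_hot
    print(f"Carnot limit: {efficiency:.1%}")   # about 87%

Real engines fall well short of this bound, of course; irreversibilities eat into it.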

This result is used in something called exergy or availability analysis, which is built on the Carnot cycle efficiency limits. Exergy is really a wolf (entropy) in sheep's clothing. I won't go into more detail on exergy at this time.

So we can easily figure out the efficiency of chemical fuels and from that either their exergy or entropy density, whichever you prefer. But how about electricity?

We might just say that your standard best electric motor has an efficiency of 95 % and leave it there. But that doesn't really tell us what the fundamental limit is. After all, superconducting electric motors can do better, and do.

I have been digging around in research journals looking for an answer, but have found nothing thus far. So I decided to attempt some basic analysis myself. The standard model for a metallic conductor, the Drude model, treats the conduction electrons as a gas of free particles moving through the conductor. A correction factor, a damping time Tau, is inserted to reflect the collisions electrons undergo with crystal defects and phonons. Tau can be derived from the conductivity.

Tau = Conductivity*electron mass / (electron density * electron charge^2)

For Copper, Tau = 2.5 x 10^-14 s.
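
As a sanity check, here is a quick Python calculation; the conductivity and carrier density are standard textbook values for copper at room temperature:

    # Drude damping time: Tau = conductivity * m / (n * e^2)
    sigma = 5.96e7     # S/m, conductivity of copper at room temperature
    n = 8.5e28         # m^-3, conduction electron density of copper
    m = 9.109e-31      # kg, electron mass
    e = 1.602e-19      # C, elementary charge

    tau = sigma * m / (n * e**2)
    print(f"Tau = {tau:.2e} s")   # about 2.5e-14 s, as quoted above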

From Tau, we can find the drift velocity of electrons under an electric field,

v = e * electric field * Tau / m

where e is electron charge and m electron mass. And we can relate the temperature of an ideal gas (which electrons are in a pure sense) to the individual kinetic energy of an electron,

0.5 * m * v^2 = 1.5 * k_b * T

where k_b is Boltzmann's constant. Solving for temperature I find that,

T = (m / (3 * k_b)) * (conductivity * electric field / (electron density * e))^2
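
Plugging the copper numbers into this (with an electric field I have assumed, 0.1 V/m, chosen to give a realistic wiring current density of about 6 A/mm^2) shows both how slowly the electrons drift and how absurdly small the resulting effective temperature is:

    # Effective "temperature" of the electron drift motion:
    # T = (m / (3 * k_b)) * (sigma * E / (n * e))^2
    sigma = 5.96e7     # S/m, conductivity of copper
    E = 0.1            # V/m, assumed electric field inside the conductor
    n = 8.5e28         # m^-3, conduction electron density of copper
    m = 9.109e-31      # kg, electron mass
    e = 1.602e-19      # C, elementary charge
    k_b = 1.381e-23    # J/K, Boltzmann's constant

    v = sigma * E / (n * e)        # drift velocity, identical to e * E * Tau / m
    T = m * v**2 / (3 * k_b)
    print(f"drift velocity = {v:.1e} m/s")   # about 4.4e-4 m/s
    print(f"T = {T:.1e} K")                  # about 4e-15 K, vanishingly small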

The conductivity times the electric field is the current density in a conductor (usually abbreviated J, and I get the distinct impression I'm doing this ass backwards). One could relate the current density to the power density (p) and the potential (voltage, V):

J = p/V
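
For concreteness, here is a sketch with numbers I have assumed, reading p as power delivered per unit of conductor cross-section:

    # Current density from areal power density and potential: J = p / V
    p = 1.0e9    # W/m^2, assumed power per unit cross-section (1 kW through 1 mm^2)
    V = 120.0    # V, assumed potential
    J = p / V
    print(f"J = {J:.1e} A/m^2")   # about 8.3e6 A/m^2, a plausible wiring current density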

However, I think I have again taken a bigger than blog-sized bite, so I'll stop and leave it as an exercise for the reader to realize that the entropy content of electricity is very low indeed. The result that you should take from this is a realization that electricity is the best means of carrying useful work that we have, and probably will ever have.

A comparison of electricity to hydrogen is very illuminating. The 2nd law of thermodynamics is rather explicit. If you are reading about a hydrogen-powered system, take note of its electricity-powered equivalent. In all likelihood, the electrical system is more efficient. And if an electrical system outperforms hydrogen now, it probably always will. I can see now that I probably should have just spewed forth numbers and arguments regarding reversibility rather than doing the analysis, but I do call myself Entropy Production for a reason. Among chemical fuels, hydrogen is king when it comes to an entropy (or exergy) analysis. It can do more work per unit mass than any other fuel (except maybe acetylene). However, it remains just a chemical fuel.

Hydrogen can't hold a flame to Electricity.

7 comments:

Robert McLeod said...

AAAAUUUUUUUUUGGGGGGGHHHHHHHH!!!!!!!!

For the reader: my next topic will be on Science gibberish, a relative of legalese and military-speak.

Anonymous said...

From an engineering perspective, the entropy content of a unit quantity of energy is representative of the amount of useful work that can be derived from it.
...
High entropy content is bad, so we can see that energy which can achieve a high temperature must have lower entropy, and hence be capable of doing more work per unit mass.
...
This result is used in something called exergy or availability analysis, which is built on the Carnot cycle efficiency limits.


These definitions do not seem right. The available energy is the enthalpy. The entropy is the amount of available energy lost to irreversibility. Exergy is the enthalpy minus the entropy, given the current environmental and system conditions.

This doesn't change the overall statements that entropy is bad, exergy is roughly equivalent to Carnot efficiency, and higher temperatures lead to higher efficiency, which are all correct.

Robert McLeod said...

Yeah, I know. I took serious liberties in an attempt to create a simple explanation. I failed.

Tom Wayburn said...

1/T is the integrating factor in the Second Law for a closed system undergoing a reversible process. It converts del Q, an inexact (path-dependent) differential, into dS, an exact differential (independent of path).

The question is this: Is electricity pure work? As such, is its principal characteristic that it carries no entropy? (It does not even appear in an entropy balance. See http://tinyurl.com/dyqao.) However, packets of photons (electromagnetism) carry entropy expressed as s* = S/N = 4.97E-23 joules per kelvin, where S is entropy and N is the number of photons. [Bowman] If so, then why should electricity not suffer from some of the deficiencies of electromagnetic waves? After all, electricity behaves like electromagnetic waves in a wave guide, does it not? Do wave guides annihilate entropy?
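
As a check, that figure follows from standard blackbody results: S = (4/3)E/T for a photon gas, and the mean photon energy is about 2.701 k_B T, so s* = S/N comes out near 3.6 k_B regardless of temperature. A quick Python sketch:

    # Entropy per blackbody photon: s* = (S/E) * (E/N) = (4 / (3*T)) * (2.701 * k_b * T)
    k_b = 1.381e-23                         # J/K, Boltzmann's constant
    s_star = (4.0 / 3.0) * 2.701 * k_b      # the temperature cancels out
    print(f"s* = {s_star:.2e} J/K")         # about 4.97e-23 J/K, matching Bowman's value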

Anonymous said...

I think the key flaw in the derivation was the slick and deceptive equating of the drift velocity of electrons with the "v^2" in the formula for kinetic energy, and thus temperature.

The v^2 there we really ought to think of as the *variance* of the distribution of velocities, which is what we all know as heat. Consider bulk motion in a solid: the mean motion is in fact subtracted out before computing the temperature.

I believe that the thing we care about is sitting right in front of us. Plainly the resistance captures the irreversible transfer of the useful embodiment of energy (a concentrated, directed electric field of classical magnitude) into physical entropy.

The value of electricity is not in the electrons (electrons are available anywhere) but in the extremely convenient conveyance of a potent, contained and directed electric field whose bulk macroscopic classical vector is far larger than its spatial and temporal variation, in direct analogy to the kinetic energy of the atoms in a moving solid.

Therefore a proper notion of the 'temperature/entropy of electricity' might consider the magnitude of the thermodynamic fluctuations in the *electric* field due to the fact that it is conveyed by electrons which collide with a lattice at a certain rate and the lattice itself has a certain temperature: "thermal electrical noise".

Simple lab experience shows that the intrinsic 'temperature' or 'entropy' of useful electricity must be awfully small, because we can maintain a system with a voltage whose fluctuations are far, far smaller than the mean: a regulated power supply. Of course, some heat was expended in the regulation.
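
The standard Johnson-Nyquist formula, v_rms = sqrt(4 k_B T R df), puts a number on those fluctuations. A sketch, with a resistance and bandwidth I have picked arbitrarily:

    # Johnson-Nyquist thermal noise voltage: v_rms = sqrt(4 * k_b * T * R * df)
    from math import sqrt

    k_b = 1.381e-23   # J/K, Boltzmann's constant
    T = 300.0         # K, room temperature
    R = 1.0e3         # ohm, assumed source resistance
    df = 1.0e6        # Hz, assumed measurement bandwidth

    v_rms = sqrt(4 * k_b * T * R * df)
    print(f"v_rms = {v_rms:.1e} V")   # about 4 microvolts, tiny next to a 5 V rail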

Similarly, a laser with a long coherence length (i.e. tiny phase fluctuations) emits low entropy oscillating electrical fields.

One limit is certain: superconductors, by definition, are in a pure eigenstate of electricity and have, in the quantum mechanical sense, a truly zero "temperature of electricity".

Martin Juckes said...

I don't think that either thermal energy or electrical energy has an intrinsic relation to entropy: the result for thermal energy that you refer to is for equilibrium systems.

The efficiency of an electric motor might give a useful measure for practical purposes: the fact that superconducting motors have a greater energetic efficiency might not change the estimated entropy if the lower operating temperature is taken into account:

T_hot = T_cold/(1-efficiency)
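
A quick numeric check of that inversion, with numbers I have assumed: a 95% efficient motor rejecting heat at 300 K maps onto an equivalent hot-reservoir temperature of 6000 K.

    # Equivalent hot reservoir implied by an efficiency: T_hot = T_cold / (1 - efficiency)
    T_cold = 300.0       # K, assumed ambient temperature
    efficiency = 0.95    # assumed motor efficiency

    T_hot = T_cold / (1.0 - efficiency)
    print(f"equivalent T_hot = {T_hot:.0f} K")   # 6000 K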

Mark said...

That comparison with hydrogen may be true, but unfortunately we produce most of our electricity by burning chemical fuels. The entropy production still happens somewhere, just not at the point of use.