
Speed of electron in electric current



Guest raul

Can anyone please help me with this?

 

Just want to know: what is the speed of an electron inside an electric current? As per my understanding, current is a flow of electrons. And my question is, if it is equal to the speed of light (the circuit works the moment you switch it on), then why doesn't Einstein's prediction that mass increases with speed come into the picture for it?

 

New to this forum. Sorry if I have posted this in the wrong section.


All the free electrons in a wire are bumping around at high speeds anyway. By applying a voltage to the wire, a slight force is applied to them, causing them to tend to go in one direction slightly more. The sum of this effect is called their drift velocity, and is generally on the order of 10^(-4) ms^(-1). It just seems near-instantaneous because the driving voltage propagates at the speed of light.
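To put a number on that drift velocity, here is a minimal back-of-the-envelope sketch in Python, using the standard relation I = n·A·e·v. The current, wire diameter, and electron density are assumed example values for a copper wire, not figures taken from this thread.

[code]
import math

# Assumed example values for a copper wire (not from this thread)
I = 1.0         # current, amperes
d = 1.0e-3      # wire diameter, metres (~1 mm)
n = 8.5e28      # free-electron density of copper, m^-3 (textbook value)
e = 1.602e-19   # elementary charge, coulombs

A = math.pi * (d / 2) ** 2    # cross-sectional area of the wire
v_drift = I / (n * A * e)     # drift velocity from I = n * A * e * v

print(f"drift velocity ~ {v_drift:.1e} m/s")
# prints roughly 9e-5 m/s, i.e. on the order of 10^-4 m/s as stated above
[/code]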


  • 1 month later...

I recently read about a test in which 90% of electrical engineering graduates couldn't calculate the speed of electrons in a wire, and had hopelessly inaccurate ideas about it.

 

An important point to note is the idea of 'drift'. Since all electrons (and holes) are identical, how do we tell what has actually moved?

 

Another important idea is that of 'compression' or build-up. In some sense electrons are 'incompressible', and can be imagined to act like mutually repelling 'magnets' that keep a certain distance apart from each other. This image comes from classical electrostatics.

 

The 'billiard ball' model can be helpful to explain the motion: imagine a wire is a long 'tube' full of billiard balls. If you push a billiard ball in one end, then one pops out the other end, pretty much instantaneously (if there are no gaps). So this can explain why the transmission of electrical effects seems instantaneous, while the actual electrons are not believed to move very far or very fast at all along the wire. Especially with A.C., it is unlikely that a specific electron moves very far, if at all, over many cycles.

 

On the other hand, static electricity experiments and capacitors show that electric charges *do* act like a 'compressible' gas in many circumstances, crowding at one end or another in a wire or plate, and having varying densities.

 

Superimposed upon these ideas is the idea of 'free electrons' or holes that can 'wander' in a kind of random 'Brownian' motion through the substance, like gas particles in space.

 

But a key idea to come away with is that actual individual electrons (if they really exist) probably don't travel very far or fast relatively speaking, compared to the 'voltages' (i.e., pressure changes) which are transmitted great distances and very rapidly (approaching the speed of light).


Just want to know: what is the speed of an electron inside an electric current? As per my understanding, current is a flow of electrons. And my question is, if it is equal to the speed of light (the circuit works the moment you switch it on), then why doesn't Einstein's prediction that mass increases with speed come into the picture for it?

I read a book which said that the speed of the electrons is very small, but their electric fields propagate at c.


The 'billiard ball' model can be helpful to explain the motion: imagine a wire is a long 'tube' full of billiard balls. If you push a billiard ball in one end, then one pops out the other end, pretty much instantaneously (if there are no gaps). So this can explain why the transmission of electrical effects seems instantaneous, while the actual electrons are not believed to move very far or very fast at all along the wire.
While this model might apply to billiard balls, it certainly does not apply to electrons.

 

In the case of a current-carrying conductor, it is not electron-electron interactions that dominate the transport behavior. The electrons respond to the field that is set up in the conductor (by a rearrangement of charges in the battery/voltage source), and this field propagates at nearly c (c over the refractive index of the conductor, n). Resistance to transport comes from (i) electron-electron interactions, (ii) electron-phonon interactions, and (iii) electron-impurity interactions.

 

At normal (300 K) or high temperatures (and normal values of current density), the contribution from (ii) dominates the resistivity, while at low temperatures, (iii) dominates. The strength of (i) is orders of magnitude weaker than (ii) (as evidenced by the low resistivity of an electron beam in an accelerator, for instance), and this is the reason why you can have pretty accurate descriptions of the electronic properties of many normal materials by simply neglecting many-body effects.
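A rough way to picture that temperature dependence is Matthiessen's rule, in which the scattering contributions simply add to the total resistivity. The sketch below is only illustrative; the functional form and the numbers are assumptions, not values quoted in this thread.

[code]
# Toy illustration of Matthiessen's rule: rho_total = rho_impurity + rho_phonon(T)
# All constants below are assumed, order-of-magnitude values for a generic metal.

def resistivity(T, rho_imp=1.0e-10, a=6.8e-11, theta=300.0):
    """Toy total resistivity (ohm*m) at temperature T (kelvin).

    rho_imp : temperature-independent impurity/defect contribution
    phonon  : crude interpolation, ~T^5 at low T and ~linear in T above theta
    """
    rho_phonon = a * T**5 / (T**4 + theta**4)   # electron-phonon scattering
    return rho_imp + rho_phonon                  # Matthiessen's rule: terms add

for T in (4, 77, 300):
    print(f"{T:>4} K -> {resistivity(T):.2e} ohm*m")
# At 300 K the phonon term dominates; at 4 K the impurity term dominates,
# consistent with the statement in the post above.
[/code]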

 

It's not too hard to come up with a plausibility argument for the negligible contribution of electron-electron interactions to the resistivity. Consider a pair of electrons whose line of approach is along the direction of current flow. If the first electron "collides" with the second, making the second electron speed up (gain momentum), the first electron itself must lose the same amount of momentum. So, one electron speeds up, while another slows down. The average effect is that the current remains unchanged. Hence, the negligible contribution of electron-electron collisions.
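To put that argument in symbols (a sketch using the simple free-electron relation between current and momentum, not anything specific to this thread): a collision that conserves the electrons' total momentum cannot change the current. Conservation gives [imath]\mathbf{p}_1' + \mathbf{p}_2' = \mathbf{p}_1 + \mathbf{p}_2[/imath], and since [imath]\mathbf{j} = -\frac{e}{mV}\sum_i \mathbf{p}_i[/imath], the current density satisfies [imath]\Delta \mathbf{j} = 0[/imath] for any such collision.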

 

A better model is to describe a current using a vertical pipe filled with billiard balls (or a liquid). The billiard balls fall through the pipe because they are responding to the gravitational field.


Shouldn't that be v=c/n?

 

Also, I was under the impression that for most metals n~10, thus v~0.1c.

If I'm way off please correct me.

 

>>edit: v in this context is the wave propagation velocity, _not_ the drift velocity, of course.


Yes, of course: v = c/n, not c*n.

Thanks for pointing that out.

 

As for a typical value of n for a metal, I don't know anything off the top of my head, but we are now treading more murky waters.

 

(i) For one thing, the refractive index ([imath]n(\omega)=\sqrt {\epsilon _r(\omega)}[/imath] for a non-magnetic metal) is a function of the frequency, [imath]\omega[/imath]. So, if you want to specify n, you must also say at what frequency you are specifying it.

 

(ii) The dielectric constant, and hence the refractive index, is in general a complex number. At the very high and very low frequency limits, the dielectric constant [imath] \epsilon_r(\omega) [/imath] is essentially real (the imaginary part is negligible), but the refractive index can be real or imaginary.
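As a concrete illustration of that frequency dependence, here is a small Python sketch using the simple Drude form of the dielectric function. The plasma frequency and scattering rate are assumed, generic-metal numbers, not values for any particular material discussed here.

[code]
import cmath

# Drude dielectric function: eps(w) = 1 - wp^2 / (w^2 + i*gamma*w)
wp = 1.4e16      # plasma frequency, rad/s (assumed, generic metal)
gamma = 1.0e14   # scattering rate, 1/s (assumed)

def refractive_index(w):
    eps = 1 - wp**2 / (w**2 + 1j * gamma * w)   # complex dielectric constant
    return cmath.sqrt(eps)                       # complex index: n + i*kappa

for w in (1.0e14, 1.0e15, 2.0e16):               # below, near, and above wp
    N = refractive_index(w)
    print(f"w = {w:.1e} rad/s -> n = {N.real:.2f}, kappa = {N.imag:.2f}")
# Below the plasma frequency the imaginary part dominates (strong reflection
# and absorption); well above wp the index approaches a real number below 1.
[/code]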


I was just going by a _very_ dim memory that for metals n is rather high, compared to say glass at n~1.5. Naturally in most circumstances this would be "swamped out" by LC considerations of the geometry, but I seem to recall that it can be a hard limit in critical circuit design (CPUs and the like).


This is true. While I don't remember the numbers, I do recall that the analytic form for the dielectric constant has a singularity at [imath] \omega = 0 [/imath].

 

PS: I failed to mention before that what is commonly referred to as the refractive index (n) is the real part of the square root of the dielectric constant. So, while [imath]\sqrt {\epsilon } [/imath] may be complex, n is generally real, but is still frequency dependent.


PS: I failed to mention before that what is commonly referred to as the refractive index (n) is the real part of the square root of the dielectric constant. So, while [imath]\sqrt {\epsilon } [/imath] may be complex, n is generally real, but is still frequency dependent.

 

Depends on how you define things, and what you mean by "common." In more advanced applications you use a complex index of refraction, where the imaginary part gives you absorption.


True. I've seen this written both ways - just as is the case with the dielectric constant. Sometimes, the imaginary part (of the dielectric constant) is called the conductivity, other people call it simply the imaginary part of the dielectric constant.

 

Likewise with the refractive index. The wave vector is, in general, a complex number, and the refractive index is proportional to the wave vector. But many times I've noticed that people write the real part as 'n' while the imaginary part is simply labeled [imath]i \kappa[/imath] or something like that (and, as you said, is the origin of the decay term). I can't back this up right now, but even between Jackson, Kittel and Ashcroft, I'm sure you'll come across both notations.


A better model is to describe a current using a vertical pipe filled with billiard balls (or a liquid). The billiard balls fall through the pipe because they are responding to the gravitational field.
I'm probably a bit slow this morning, but I fail to see the significant difference between the two models.

 

It seems you are saying that the field acts as a 'conveyor belt' rather than a 'pressure' difference applied to the ends of the wire. That is, the field, I suppose, penetrates everything instantaneously and hooks directly into every electron, transmitting energy directly, not through 'collision'.

 

But it seems hard to deny that from this perspective the voltage difference is essentially a uniform field, and the 'motion' of the energy at c is simply a different inertial frame, measuring the transport of a different 'substance'. The electrons still maintain an equilibrium of 'equal spacing' via collision, or rather repulsion of their own electrostatic fields.

 

There is some 'gas-like' compressibility that we observe, in the bunching of charges in a capacitor for instance; and of course impurities cause non-uniform motion (friction) for the electrons, which is countered by 'collision-like' bumping that keeps things moving.

 

I have noticed others complaining about both the 'water' analogy and the 'pressure' analogy. But they seem extremely useful for people learning the basics of electricity and electronics, and cause no serious perceptual crippling. After all, new concepts can be (and can only be) introduced when students are ready, having mastered simpler ones.

 


The electrons still maintain an equilibrium of 'equal spacing' via collision or rather repulsion of their own electrostatic fields.
But this isn't true! Experiments clearly show that the electrostatic interactions between electrons in a current-carrying conductor are a very small perturbation that can very well be neglected in describing the microscopic dynamics of conduction. It is essentially for this reason that the almost naive Drude theory does spectacularly well in predicting the electronic properties of the alkali metals.
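As a rough, hedged illustration of how well the simple Drude (independent-electron) picture does, here is a sketch of the Drude DC conductivity. The carrier density and relaxation time are assumed, textbook-style values for copper, not numbers from this thread.

[code]
# Drude DC conductivity: sigma = n * e^2 * tau / m
n = 8.5e28        # free-electron density of copper, m^-3 (assumed textbook value)
tau = 2.5e-14     # relaxation time at room temperature, s (assumed)
e = 1.602e-19     # elementary charge, C
m = 9.109e-31     # electron mass, kg

sigma = n * e**2 * tau / m
print(f"Drude conductivity ~ {sigma:.1e} S/m")        # roughly 6e7 S/m
print(f"Drude resistivity  ~ {1 / sigma:.1e} ohm*m")  # roughly 1.7e-8 ohm*m
# This matches the measured room-temperature resistivity of copper; note that
# tau is itself usually inferred from such measurements, so this is a
# consistency check for the independent-electron picture, not a prediction.
[/code]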

 

In fact, models all the way up to the tight-binding model (which is resoundingly successful in predicting band structures of various materials) are based upon the "independent electron" assumption - that, basically, electron-electron interactions can be neglected. It is only with the Hubbard model (and its variants, the Anderson-Hubbard, Mott-Hubbard, or Bose-Hubbard models) that you first start to deal with electron-electron interactions. Such models have been known to produce new revelations only in the more exotic materials like Mott insulators, but even they are extremely rudimentary in the extent to which they model many-body effects. The difficulty of dealing with electron-electron interactions was next lowered by the approach of Fermi liquid theory (where weak interactions are renormalized into the effective mass of a quasiparticle, which is then essentially a non-interacting particle). Strongly correlated electronic systems (BECs, superconductors, etc.) are found at the top of the "exotic materials" ladder, and it is only in these systems that you need a robust framework capable of modeling the interactions between the electrons.

 

For regular conduction in ordinary metals though, electron-electron interactions play no role.


I could not agree more:

 

But I have two things to add:

 

(1) In extending Feynman's program for QED, Carver Mead (both Feynman's collaborator and his student) has taken QED to a new level in his small book, Collective Electrodynamics.

 

(2) The Transactional Interpretation of Quantum Mechanics puts previous interpretations to shame by maintaining the objective coherence of Bohm's work extending Einstein's ideas, and fully modernizing and harmonizing it with Feynman's time-symmetrical approach. As a bonus, the Transactional Interpretation has consequences that involve empirical tests that have already been confirmed.

