# is voltage directly proportional to current? [Answered: YES]


or is voltage inversely proportional to current?

(i've been given two conflicting pictures, one from the net; one from the television)

the one from the net; voltage is inversely proportional to current. the uk uses 240 v mains, the us uses 110v mains. the us is therefore safer, but requires more current (and therefore larger cables which are more expensive)

the one from television (bbc learning zone); voltage is directly proportional to current; the bigger the push, the bigger the flow

i'm guessing myself that the area of the cross section of the circuit wire is key to the magnitude of the current, no?

(and the voltage is key to the VELOCITY of the current; not the AMOUNT of current)?

---

V = IR

(Voltage = Current * Resistance)

So voltage and current are directly proportional to each other, for a constant resistance.

You need the same energy to power things though, and therefore the same power (energy/unit time):

P = I²R = VI = V²/R
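As a quick sanity check, the three power expressions all agree once the current is fixed by Ohm's law. This sketch uses arbitrary illustrative values (250 V across 62.5 Ohms), not numbers from any particular appliance:

```python
# Verify numerically that P = I^2 * R = V * I = V^2 / R
# for a resistance obeying Ohm's law. Values are illustrative.
V = 250.0           # volts
R = 62.5            # ohms
I = V / R           # Ohm's law: 4.0 amps

p_from_i2r = I**2 * R
p_from_vi = V * I
p_from_v2r = V**2 / R

print(p_from_i2r, p_from_vi, p_from_v2r)  # all three give 1000.0 (watts)
```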

Current can also be related to the drift velocity of the electrons in a circuit:

I = nqAv

Where n is the number density of charge carriers, q is the electron charge, A is the cross-sectional area and v is the drift velocity.
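Rearranging I = nqAv gives the drift velocity directly, and plugging in typical numbers shows how slow it is. The copper carrier density below is a standard textbook value, not something stated in this thread, and the 1 A / 1 mm² wire is just an illustrative choice:

```python
# Drift velocity from I = n*q*A*v, rearranged to v = I / (n*q*A).
n = 8.5e28          # free electrons per m^3 in copper (approx. textbook value)
q = 1.602e-19       # elementary charge in coulombs
A = 1.0e-6          # cross-sectional area in m^2 (a 1 mm^2 wire)
I = 1.0             # current in amps

v = I / (n * q * A)
print(v)            # roughly 7e-5 m/s: a fraction of a millimetre per second
```

Despite that tiny drift speed, the electric field (and hence the energy) propagates along the wire at close to the speed of light, which is why a lamp lights the instant you flip the switch.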

---

"i'm guessing myself that the area of the cross section of the circuit wire is key to the magnitude of the current, no?"

No, the thing that determines the current is the resistance of the appliance you connect it to.

Let's make the arithmetic a bit easier and pretend that here in the UK (where I am) the voltage is 250.

I want a heater that provides heat to my room at 1000 Watts.

One of the equations you will see a lot is volts times amps = watts.

From that I need the fire to use 4 amps (so that 4*250=1000)

Ohm's law tells me that I need the fire to have a resistance of 250/4 = 62.5 Ohms

The equations Klaynos gave above let you do that in one stage

250²/R = 1000 works for R = 62.5

If I wanted a 2000 Watt fire I would need twice the current so I would design the fire to have a resistance only half as big i.e. 31.25 Ohms.
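The design steps above can be sketched as a small function: given the supply voltage and the target power, P = VI fixes the current and V = IR fixes the resistance. The function name is just for illustration, and the numbers follow the simplified 250 V example:

```python
# Designing a resistive heater: from supply voltage and target power,
# find the required current (P = V * I) and resistance (V = I * R).
def design_heater(voltage, power):
    current = power / voltage        # amps
    resistance = voltage / current   # ohms
    return current, resistance

print(design_heater(250, 1000))  # (4.0, 62.5)
print(design_heater(250, 2000))  # (8.0, 31.25) - twice the power, half the resistance
```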

If I wanted a 1000 Watt fire in the US (and lets pretend they use 100 volts rather than 110) it would need to draw 10 Amps and so it would need a resistance of just 10 Ohms.

If I brought that to the UK and connected it up to the 250 volt supply, the higher voltage would push 2.5 times more current through it, i.e. 25 Amps. Not only that, but each amp of current would transfer 2.5 times more power (because the voltage is higher), and so the poor fire would end up trying to dissipate 6250 Watts. It would almost certainly overheat and burn out.
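The overload arithmetic above is easy to check: the heater's resistance is fixed by its construction, so raising the voltage raises both the current and the power per amp, and the dissipated power goes up with the square of the voltage:

```python
# A 100 V, 1000 W heater has R = 100^2 / 1000 = 10 ohms, fixed by its
# construction. On a 250 V supply, current and power both rise.
R = 10.0                 # ohms
V_uk = 250.0             # volts

I = V_uk / R             # 25.0 A instead of the intended 10 A
P = V_uk**2 / R          # 6250.0 W instead of the intended 1000 W
print(I, P)
```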

The speed of electricity is not as simple as it first seems. The electrons themselves drift quite slowly, but the energy is transferred at something near the speed of light.

The voltage is a measure of how hard the electrons are pushed; what determines how many of them flow past a point in given time (i.e. the current) is the resistance.

How fast they flow is also dependent on how many of the electrons there are available to carry the current and how thin the conductor is. I would leave this question aside until you are certain about voltage, current and resistance.

---
> V = IR
>
> (Voltage = Current * Resistance)
>
> So voltage and current are directly proportional to each other, for a constant resistance.
>
> You need the same energy to power things though, and therefore the same power (energy/unit time):
>
> P = I²R = VI = V²/R

But the power isn't a constant, and I think that's where the confusion comes in. If you increase the voltage, and thus the current, the power increases.
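That last point can be seen numerically: at a fixed resistance, doubling the voltage doubles the current but quadruples the power, because P = V²/R. The 62.5 Ohm resistance here is just the illustrative value from earlier in the thread:

```python
# At fixed resistance, power scales with the square of the voltage.
R = 62.5  # ohms (illustrative value)
for V in (125, 250, 500):
    print(V, "volts ->", V / R, "amps,", V**2 / R, "watts")
```

Halving the voltage from 250 to 125 V drops the power from 1000 W to 250 W; doubling it to 500 V pushes the power up to 4000 W.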