
Wire amp load is given at 1100 V but I need it for 200 V. What is the calculation?


q4agl


[edited: now I know what the conduit spec means]

[edited] My question was vague. What I want to know is this: wire specs are tested at 1100 V, where, for example, 1.5 sq mm copper is rated at 16 A.

But my application is 200 V. I would guess that a lower voltage heats up the wire faster, so should I use R = V/I to lower the amp rating for 200 V? Or is the resistance independent of the voltage?

Edited by q4agl
my point was not clear

If it's the same power, you can use P=IV

It's whether the wiring is in a conduit or in open air. Ratings for in-conduit (or grouped) vs unenclosed have to do with heat dissipation and the temperature the wire might achieve

https://en.wikipedia.org/wiki/Ampacity

"The allowed current in a conductor generally needs to be decreased (derated) when conductors are in a grouping or cable, enclosed in conduit, or an enclosure restricting heat dissipation. e.g. The United States National Electrical Code, Table 310.15(B)(16), specifies that up to three 8 AWG copper wires having a common insulating material (THWN) in a raceway, cable, or direct burial has an ampacity of 50 A when the ambient air is 30 °C, the conductor surface temperature allowed to be 75 °C. A single insulated conductor in free air has 70 A rating."


The current rating is pretty much independent of the voltage.


If the current is too high, the wire overheats and the insulation melts.

If the wire is on the surface of the wall (unenclosed) then the heat can escape and the cable will carry more current before it overheats.
If the cable is in an enclosed space like a conduit, there's less chance for heat to escape so, to stop it overheating, you have to ensure that the current is lower.

As Swansont has pointed out, the derating can be complicated.

It's best (in most places, it's a legal requirement) to ask an electrician.

 

 


34 minutes ago, John Cuthber said:

The current rating is pretty much independent of the voltage.


If the current is too high, the wire overheats and the insulation melts.

If the wire is on the surface of the wall (unenclosed) then the heat can escape and the cable will carry more current before it overheats.
If the cable is in an enclosed space like a conduit, there's less chance for heat to escape so, to stop it overheating, you have to ensure that the current is lower.

As Swansont has pointed out, the derating can be complicated.

It's best (in most places, it's a legal requirement) to ask an electrician.

 

 

Really? I always imagined a higher voltage allows for more current before overheating.


1 hour ago, q4agl said:

[edited: now I know what the conduit spec means]

[edited] My question was vague. What I want to know is this: wire specs are tested at 1100 V, where, for example, 1.5 sq mm copper is rated at 16 A.

But my application is 200 V. I would guess that a lower voltage heats up the wire faster, so should I use R = V/I to lower the amp rating for 200 V? Or is the resistance independent of the voltage?

Pity you edited this before I could respond to the original.

It is not a good idea to remove the original question.

Yes, I would expect the rated current density in a cable designed for a much higher voltage to be lower than in one designed for the lower voltage.

This is simply because the insulation surrounding the conductor will be thicker, and therefore also a better thermal insulator.

This does not mean you can increase the current in a higher voltage cable and still meet regulations.

I note your application is a 16 amp circuit with 1.5mm2 conductors, which sounds domestic to me.

This sounds European.

 

Note that the 1.5mm2 is not the physical cross sectional area of the cable, but an equivalent depending upon the cable type, (stranded, solid etc) as well as the insulation type.
This is determined by the manufacturer.

Note also that the insulation has to withstand the peak-to-peak voltage plus a safety factor.

This will be more than your stated 200 V.

 

Another very important factor you should consider, often overlooked by the inexperienced, is the cable length. Particularly as you appear to want to run it near maximum current.

In these circumstances the cable sizing is often limited by acceptable voltage drop, and therefore length, rather than current or voltage per se.
At some point you will need to select a larger section to avoid unacceptable voltage drop.

Edited by studiot

The heat dissipation is based on P = I^2R, so lowering the voltage and increasing the current means more power dissipated in the wire.  

P = IV would include the voltage drop across the load, which should be much larger than the drop along the wire itself.

 

IOW, if you have a ~1.1 kW device, at 1100 V it draws 1 A. At 200 V it requires 5.5 A. The power dissipated in the wire goes up by about a factor of 30 (though for a length of wire with a resistance of 0.1 ohm, this is going from 0.1 W to ~3 W. That's the issue. It's negligible in terms of the overall load, but can be significant for a thin bit of wire)
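Those numbers can be reproduced with a short sketch (the 0.1 ohm wire resistance is the illustrative value from the post, not a measured one):

```python
# Same ~1.1 kW load at two supply voltages: the wire's own loss is I^2 * R,
# so raising the current by a factor of 5.5 raises the wire loss ~30-fold.
def wire_loss(load_power_w, supply_v, wire_r_ohm=0.1):
    """Return (load current in A, power dissipated in the wire in W)."""
    current_a = load_power_w / supply_v
    return current_a, current_a ** 2 * wire_r_ohm

for volts in (1100, 200):
    amps, lost = wire_loss(1100, volts)
    print(f"{volts:5d} V: {amps:.1f} A, {lost:.3f} W lost in the wire")
```

At 1100 V the load draws 1 A and the wire dissipates 0.1 W; at 200 V it draws 5.5 A and the wire dissipates about 3 W, matching the factor-of-30 figure above.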

 


2 hours ago, swansont said:

The heat dissipation is based on P = I^2R, so lowering the voltage and increasing the current means more power dissipated in the wire.  

P = IV would include the voltage drop across the load, which should be much larger than the drop along the wire itself.

 

IOW, if you have a ~1.1 kW device, at 1100 V it draws 1 A. At 200 V it requires 5.5 A. The power dissipated in the wire goes up by about a factor of 30 (though for a length of wire with a resistance of 0.1 ohm, this is going from 0.1 W to ~3 W. That's the issue. It's negligible in terms of the overall load, but can be significant for a thin bit of wire)

 

I don't know if this was aimed at my post, but if so it is beside the point.

The UK wiring regulations publish voltage drops in mV per amp per metre for electricians to estimate the drop in a supply cable.

A value of 31 mV/A/m is provided for 1.5mm2 wire.  (BS 7671)

https://the-regs.co.uk/blog/?p=481

So say a 100 m cable was supplying an outhouse or a tall building:

voltage drop = 31 x 16 x 100 / 1000 volts = 49.6 volts at 16 amps.

So the OP's 200 volt supply would be down to roughly a 150 volt supply at the end of that cable!

This could play havoc with many pieces of equipment.
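That estimate is a one-line formula; a sketch using the 31 mV/A/m figure quoted above for 1.5mm2 cable:

```python
# Voltage drop from a tabulated mV-per-amp-per-metre figure (BS 7671 style):
#   drop (V) = (mV/A/m) * current (A) * length (m) / 1000
def voltage_drop(mv_per_a_per_m, current_a, length_m):
    return mv_per_a_per_m * current_a * length_m / 1000.0

drop = voltage_drop(31, 16, 100)          # 1.5 mm^2 cable, 16 A, 100 m run
print(f"{drop:.1f} V dropped, {200 - drop:.1f} V left at the far end")
```

This prints a 49.6 V drop, i.e. about 150 V remaining from a 200 V supply; the drop scales linearly with both current and cable length.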


Yes, the voltage drop is another consideration. For a 100 m run you may want a thicker wire, but one should note that the calculation assumes you are running the maximum rated current through the wire. From an electrical contractor/regulations/house wiring code point of view this makes sense and should be followed, but it's the worst-case scenario: if you're doing a hobby project and you absolutely know that the current you'll be using is 1/10 of the max, then the voltage drop is also 1/10 of the value the formula gives you.


3 hours ago, swansont said:

The heat dissipation is based on P = I^2R, so lowering the voltage and increasing the current means more power dissipated in the wire.  

P = IV would include the voltage drop across the load, which should be much larger than the drop along the wire itself.

 

IOW, if you have a ~1.1 kW device, at 1100 V it draws 1 A. At 200 V it requires 5.5 A. The power dissipated in the wire goes up by about a factor of 30 (though for a length of wire with a resistance of 0.1 ohm, this is going from 0.1 W to ~3 W. That's the issue. It's negligible in terms of the overall load, but can be significant for a thin bit of wire)

 

So the power loss rate would be the same for, say, 1000 V 5 A versus 100 V 5 A? This is what my question is.

I need to know if I can still run the same amps rated for 1100 V on a 200 V circuit, or if I need to get a bigger cable to run the same amps at a lower voltage.

4 hours ago, studiot said:

Pity you edited this before I could respond to the original.

It is not a good idea to remove the original question.

Yes, I would expect the rated current density in a cable designed for a much higher voltage to be lower than in one designed for the lower voltage.

This is simply because the insulation surrounding the conductor will be thicker, and therefore also a better thermal insulator.

This does not mean you can increase the current in a higher voltage cable and still meet regulations.

I note your application is a 16 amp circuit with 1.5mm2 conductors, which sounds domestic to me.

This sounds European.

 

Note that the 1.5mm2 is not the physical cross sectional area of the cable, but an equivalent depending upon the cable type, (stranded, solid etc) as well as the insulation type.
This is determined by the manufacturer.

Note also that the insulation has to withstand the peak-to-peak voltage plus a safety factor.

This will be more than your stated 200 V.

 

Another very important factor you should consider, often overlooked by the inexperienced, is the cable length. Particularly as you appear to want to run it near maximum current.

In these circumstances the cable sizing is often limited by acceptable voltage drop, and therefore length, rather than current or voltage per se.
At some point you will need to select a larger section to avoid unacceptable voltage drop.

The original was just asking what "in conduit trunking" means. English is not my native language, so I didn't know. Plus I added more info to my main question, since my intended question was missed because I was too vague.

Thanks, I totally forgot about sizing up to minimize loss. It's for wiring our new house, since the professionals here in my area are plain dumb. Plus electricians don't do the controlled testing to know these things and lack the scientific knowledge, so it's a science guy's job to figure these things out.

 

 


22 minutes ago, q4agl said:

The original was just asking what "in conduit trunking" means. English is not my native language, so I didn't know. Plus I added more info to my main question, since my intended question was missed because I was too vague.

Thanks, I totally forgot about sizing up to minimize loss. It's for wiring our new house, since the professionals here in my area are plain dumb. Plus electricians don't do the controlled testing to know these things and lack the scientific knowledge, so it's a science guy's job to figure these things out.

 

You should not be using 1.5 mm2 for power wiring in the UK.

Power should be at least 2.5mm2.

 

That cable is meant for lighting circuits.

It also used to be used for short direct spurs (less than 2 m) off a ring main, but I think that is now frowned upon.

Edited by studiot

26 minutes ago, studiot said:

You should not be using 1.5 mm2 for power wiring in the UK.

Power should be at least 2.5mm2.

 

That cable is meant for lighting circuits.

It also used to be used for short direct spurs (less than 2 m) off a ring main, but I think that is now frowned upon.

I'm just giving 1.5 as an example. I just need to know if the power loss is more at a lower voltage, given the same wire and same current. Not as a percentage of each power; just simply which would heat up faster.

6 hours ago, John Cuthber said:

The current rating is pretty much independent of the voltage.


If the current is too high, the wire overheats and the insulation melts.

If the wire is on the surface of the wall (unenclosed) then the heat can escape and the cable will carry more current before it overheats.
If the cable is in an enclosed space like a conduit, there's less chance for heat to escape so, to stop it overheating, you have to ensure that the current is lower.

As Swansont has pointed out, the derating can be complicated.

It's best (in most places, it's a legal requirement) to ask an electrician.

 

 

Thanks all, now I understand this. The first line is the answer I was looking for. I just needed more info on it, which I got on Quora.


6 hours ago, q4agl said:

So the power loss rate would be the same for, say, 1000 V 5 A versus 100 V 5 A? This is what my question is.

Power loss in the wire is given by P=IV, where V is the voltage drop along the wire.
 

Same current, lower voltage would have a smaller loss.

Quote

I need to know if I can still run the same amps rated for 1100 V on a 200 V circuit, or if I need to get a bigger cable to run the same amps at a lower voltage.

No, you should not. You deliver less power at the lower voltage when the current is the same. 1000V 5A is 5 kW. 100V 5A is 500 W. 

 

But I agree with what others have said: defer to electricians following code for home wiring projects.

 


It depends on what's being kept constant.
If I happen to have some cable rated for 10 A and 1100 V, I can use it to wire up a 200 V 2 kW heater, because the current is still only 10 A.

But if I was buying cable for that job, I would choose cable rated for 200 V at 10 A, because it would be cheaper.

Still better to check with an electrician.
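The heater example is just I = P/V; as a quick check using the values from the post above:

```python
# A 2 kW heater on a 200 V supply draws I = P / V = 10 A, so a cable rated
# for 10 A at 1100 V is fine here: the rating that matters for heating is
# the current, and the insulation rating only needs to exceed the working
# voltage.
def load_current(power_w, supply_v):
    return power_w / supply_v

amps = load_current(2000, 200)
print(f"Heater draws {amps:.0f} A")  # 10 A
```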

