How is it possible to have high voltage and low current? It seems to contradict the relationship between current and voltage in E=IR

  • I have read different forums and watched a few YouTube videos (in addition to my textbook readings), and the explanations seem to fall short. The issue seems to be how we are first taught about a direct relationship between voltage and current (that is, an increase in voltage produces an increase in current if resistance remains the same), and then we're taught about power lines that have high voltage and low current (because otherwise we would need thick wires to carry high current, which would run the risk of overheating due to the Joule effect or some such). So please don't explain to me the infrastructural reasons why high voltage, low current is necessary for power lines. I just need to know how high voltage, low current is even possible. I've only been studying DC so far, so maybe AC has rules that would enlighten me... but I thought the E=IR formula was universal.

    Setting aside the fact that power lines actually carry fairly high current: if you have high voltage and low current, just plug those values into Ohm's law and you get the resistance that makes exactly that possible.

    This isn't exactly an answer to your question, but from the calculations there you should be able to figure out the answer to your question yourself: How to calculate voltage drop over and power loss in wires

    Power: P = I·V. For constant power, if I goes up, V goes down, and vice versa.

    The general consensus seems to be that I should not even think about Ohm's law for power (even though it clearly is part of the power equation) and only consider it when calculating voltage drop (or current when figuring wire size).

    Pcustomer ≤ PpowerPlant - PlineLoss, regardless of the intermediate voltages, currents, or resistances between you and them. Inefficient, insufficient or failing transmission equipment increases PlineLoss at each component, per Ohm's law. As per my unpopular answer, customers can be billed for their usage, while power lines cannot. Ohm's law determines how much power the power lines themselves will consume, but power in is always greater than or equal to power out.

    Also from my unpopular answer, both frequency and the skin effect explain a large part of why AC and DC encounter different resistances in similar wire and why the particular values in main-stream use were chosen.

    If confusion persists, please read this answer; the question has been answered well there: This question

    What is your definition of low current? A few hundred milliamps can kill you...

    You seem to assume that the resistance is low. Do you have any basis for that assumption, any specific values for the resistance of a power line? A short segment of a power line has low resistance, but a VERY long power line may well have a VERY high resistance, making a very high voltage necessary just to push even a low current through it. Read the formula as I = E / R, where R and E are given and I comes out to whatever it does; it may happen to be low if R is big. Again, it is not just the thickness of the cable but also its total length that makes the resistance high. Any specific value of R?

  • You're confusing "high voltage" with "high voltage loss". Ohm's Law governs the loss of voltage across a resistance for a given current passing through it. Since the current is low, the voltage loss is correspondingly low.

    And by "voltage loss", you mean "voltage across the component".

    Well, if that's true (i.e., Ohm's law governing loss of voltage), it makes a lot more sense to me now. However, that raises another question. As far as forum rules go, should I open a new question or just ask it as part of this thread?

    New questions should be posted as a new question; if it is related to this one, linking to the related question is acceptable.

  • You are confused about the consumer load and the resistance of the cables.

    The point is that power is the product of voltage and current. To transmit the same power to a consumer load, you can increase the voltage and decrease the current.

    If the light in your house needs 100W, say 10A at 10V, this can be transferred from the power plant directly.

    Let's say the cable between your house and the plant has a resistance of 10 Ohm. If you draw 10 A from the plant, the plant has to provide 110 V: at 10 A, a voltage drop of 100 V occurs on the cable, plus the 10 V you need. This means you consume 100 W while the cable wastes 1000 W.

    Now, let's say your house receives 1000V.

    Of course, you need a transformer to convert the delivered voltage to the voltage needed by the light!

    The current consumed from the plant is now only 0.1A.

    The voltage drop on the cable is now just 1V, which means 0.1W loss to power your 100W light. This is much better.

    The point is the use of the transformer, which allows voltages and currents to be converted while maintaining the power:

    $$U_1\cdot I_1=U_2\cdot I_2=\text{const.}$$

    I guess I'm just having trouble conceptualizing voltage as potential energy.

    No, that's not the point (and even physically not correct). It is really `power = U*I`, the fact that high currents in a cable cause high voltage drop / power loss and that you need transformers.

    I think it's my fault that you misunderstand me. I wasn't looking for the benefits of high voltage, low current for power lines. I already understood that. I was looking for how it is possible to create the pressure (voltage) without causing electrons to increase in speed (current) (and thus causing the wires to overheat and melt). If you are saying that thinking about voltage as potential energy is wrong, you're going against a far-reaching didactic tradition (because this analogy is made a lot), but I'm definitely interested in hearing why you say it is not correct.

    @MountainScott by increasing the resistance (at the end of the cable, not the resistance of the cable itself which would just waste power)
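    The numbers in the worked example above can be checked with a short sketch (Python; the function name is just illustrative):

```python
# Sketch of the example above: a 100 W load fed through a 10-ohm cable,
# first delivered directly at 10 V, then at 1000 V with an ideal
# transformer at the house (transformer losses neglected).

def line_loss(load_power_w, delivery_voltage_v, cable_resistance_ohm):
    """Current drawn from the line and power wasted in the cable."""
    current_a = load_power_w / delivery_voltage_v   # I = P / V
    loss_w = current_a ** 2 * cable_resistance_ohm  # P_loss = I^2 * R
    return current_a, loss_w

# Direct delivery at 10 V: 10 A flows, and the cable wastes 1000 W.
i_low, loss_low = line_loss(100, 10, 10)
print(i_low, loss_low)      # -> 10.0 1000.0

# Delivery at 1000 V: only 0.1 A flows; the cable wastes about 0.1 W.
i_high, loss_high = line_loss(100, 1000, 10)
print(i_high, loss_high)
```

The same load power reaches the house either way; only the cable's share of the power changes.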

  • One word: Resistance. Recall that Voltage is calculated by multiplying the current by the resistance. You can have a high potential difference (which is what voltage is), and a low current, simply by having a high resistance in place to block that current.

    Think of it like a water hose turned on full blast, with a hose gun attached to the end. The hose gun acts as a varying resistor controlled by the user, so even though there's high potential energy in the hose (the water wanting to flow), the resistance is so great that little to no water flows. As the user presses the trigger, the resistance lowers until water flows more and more.

    It just seems that if transformers create more resistance (or impedance, I suppose), that would cause a decrease in both voltage AND current (making the output useless). Is it that the current is already relatively high, and the "high voltage/low current" relationship in power lines is all relative as well?

  • The power distribution system uses transformers to step the voltage up or down.

    Transformers handle Power (Voltage times Current). The power fed into a transformer will be equal to the power taken from the transformer (neglecting small losses) so we can calculate the voltage and current on each side of the transformer using the formula

    Vin x Iin = Vout x Iout

    Using this formula, you can see that if the input voltage is 10 times the output voltage, the input current must be 1/10 of the output current.

    At the risk of adding confusion, I'll add some more information: A transformer is also an impedance converter. The impedance of source or load goes up or down across a transformer in the same direction as the voltage goes up or down, but the impedance ratio is squared while the voltage and current ratios are "straight", compared to the turns ratio. Plug this into Ohm's Law to see that it exactly makes up for the voltage changing in one direction and the current changing in the opposite direction to keep the power equal.

    The upshot of all this is that your house, when "seen" by the high-voltage distribution lines through a step-down transformer, appears to have much higher impedance than it really does, and it's this higher impedance that goes into Ohm's Law for the distribution line. Thus, higher voltage, lower current.
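    The impedance-conversion point above can be sketched numerically (Python; an ideal, lossless transformer is assumed, and the names are illustrative):

```python
# Through an ideal transformer with turns ratio n = N_primary / N_secondary,
# voltage scales by n and current by 1/n, so the load impedance seen from
# the primary side scales by n squared.

def reflected_impedance(load_ohm, turns_ratio):
    # Z_seen = V_p / I_p = (n * V_s) / (I_s / n) = n^2 * (V_s / I_s)
    return turns_ratio ** 2 * load_ohm

# A 10-ohm house load behind a 100:1 step-down transformer appears as
# 100**2 * 10 = 100000 ohms to the high-voltage line, so the line side
# sees high voltage and low current, exactly per Ohm's law.
print(reflected_impedance(10, 100))  # -> 100000
```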

  • Your confusion comes from the fact that you're forgetting about receiver's resistance. Basically it looks like this:

    power plant -> wire -> receiver -> return wire -> power plant

    The voltage in the wire (or power plant) is high and the resistances of the wires are low, so you think that the current should be high. Right, but now consider that the receiver has a very high resistance. This is what makes the current in this circuit low.

    So you have high voltage and low current because of high resistance of the receiver between the wires. It's totally consistent with Ohm's law: \$I=U/R\$ and R is very big, so I is small.

    In this simplified scenario if we increase the power plant's voltage, we must also increase the receiver's resistance, if we want to keep the receiver's power constant.

    In reality, receivers run behind transformers which convert the high voltage down to a constant low voltage (e.g. 230 V in Europe). So in the above scenario, when we increase the voltage at the power plant, we just need to change the transformers (their turns ratio); there is no need to change the receiver's resistance. All of this is transparent to the end user.

    This explains how it's possible to have high voltage and low current. And why is it better?

    Remember the formula for power in relation to resistance and current: \$P=I^2R\$. If you have a wire with some constant resistance R and you halve the current (by doubling the voltage), the power lost in this wire decreases 4 times. That's why it's good to have a high voltage.

    Not an expert, but it feels like this is the direct answer to the question

  • Well, we call them "power lines" for a reason... what we are transmitting is POWER. And since \$P=VI\$, we can transmit the same amount of power at \$10,000\$ volts using a current of \$0.1\$ amps, or at \$100\$ volts and \$10\$ amps (\$10,000\text{ V} \times 0.1\text{ A} = 1000\text{ Watts}\$, which is equivalent to \$100\text{ V} \times 10\text{ A} = 1000\text{ Watts}\$).

    So a power plant can transmit the same amount of power (\$1000\$ Watts in this example) using \$10,000\$ Volts and just a tenth of an Amp, or \$100\$ Volts at \$10\$ Amps. What motivates their decision, then? Money. The \$V=IR\$ relationship you mentioned determines the voltage drop across the cables that transmit power. Naturally, those cables are designed with as low resistance as possible, but that resistance cannot be eliminated. Recall that \$P=VI\$, so a drop in voltage results in a drop in power. Any loss of power along the transmission lines is waste, and the power company loses money.

    Also note that when we combine these two equations, we can write the power equation as \$P=I^2R\$. This illustrates that loss of power is proportional to the SQUARE of current for a set resistance. So if the power company can reduce current by raising voltage, the benefit of that reduction is squared. In this example, dropping the current by a factor of \$100\$ (from \$10\$ Amps down to \$0.1\$ Amps) reduces the power loss by a factor of \$10,000\$.
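    As a sanity check on the squared-loss claim above, here is a minimal sketch (Python; the line resistance is an assumed value, for illustration only):

```python
# P_loss = I^2 * R: for a fixed line resistance, cutting the current by
# a factor of 100 (by raising the voltage 100x at constant power) cuts
# the line loss by a factor of 100**2 = 10,000.

def i2r_loss(current_a, resistance_ohm):
    return current_a ** 2 * resistance_ohm

R_LINE = 5.0  # assumed line resistance in ohms (illustrative)

loss_at_100v = i2r_loss(10.0, R_LINE)   # 100 V, 10 A delivery
loss_at_10kv = i2r_loss(0.1, R_LINE)    # 10,000 V, 0.1 A delivery
print(loss_at_100v / loss_at_10kv)      # roughly 10000, i.e. 100 squared
```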

  • One way to look at it is to ask what's at the other end of the power line: a customer. The customer doesn't buy current or voltage; he or she buys power (watts). So, to deliver a given amount of power, a supplier can use thinner wires by raising the voltage and lowering the current.

    The question asks how it's possible, not why it's done.

  • Since P = IV, if V increases, I has to decrease for the same power. For example: if P = 12 and V = 3, then I has to be 4. But if you step up V, you step down I: if V became 8, then I would become 1.5. A low current is desirable because less energy is lost. Imagine that the electrons within the cable were shoppers and that the energy they carried was money. Now imagine a line of 100 shoppers rushing out of a building, each carrying $15, but all having to pass through an alleyway (the alleyway being the cable), and every time they bumped into each other they lost $1 (energy lost as heat). Now imagine what it would be like if there were only 10 people carrying $150 each, and how much less they would lose.

  • In direct response to the original post, it seems to me that the other answers have over-complicated the answer to his question. Although the information they provide is great, the question seems unanswered. E = IR: your understanding that an increase in voltage should result in an increase in current is correct. Swap out a 3 V battery in a simple circuit for a 9 V battery and you've tripled the current as well.

    High voltage/low current (and vice versa) is a TRANSFORMATION of what is ALREADY there; you are not swapping one battery (or any voltage source) for another. A transformer works because of Watt's law: power is held constant (just as resistance is held constant in Ohm's law), and power is current times voltage, or "P = EI".

    A change in voltage is an inverse change in current, and vice versa, where power is conserved.

  • It appears to me that you are having conceptualization troubles, which I will address in my answer.

    It is true that (1) E = IR is a universal formula. However, you have to understand that it can also be expressed as (2) R = E/I and (3) I = E/R.

    Using form (2), I will show your current understanding of the formula: if you make the voltage 10 times larger (10E), then in order to keep the resistance the same (unchanged), the current also has to increase 10 times, R = E/I = 10E/10I. However, I can also increase the voltage and keep the current the same by increasing the resistance 10 times: I = E/R = 10E/10R. So, with form (3), I am able to show that it is possible to increase the voltage (10E) without having to increase the current (the current stays "low" (I)).
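    The rearrangement above can be checked with a tiny sketch (Python; the numbers are purely illustrative):

```python
# I = E / R: scaling voltage and resistance by the same factor leaves
# the current unchanged, so high voltage with low current simply means
# a proportionally high resistance.

def current(voltage_v, resistance_ohm):
    return voltage_v / resistance_ohm

E, R = 12.0, 4.0                                 # 12 V across 4 ohms: 3 A
assert current(E, R) == current(10 * E, 10 * R)  # still 3 A at 120 V
print(current(10 * E, 10 * R))                   # -> 3.0
```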

License under CC-BY-SA with attribution

Content dated before 6/26/2020 9:53 AM