LOL. Off topic a bit, but I remember reading ‘The Dancing Wu Li Masters’ which is a ‘Quantum Mechanics For Dummies’ primer into the absurdities of the quantum world, and how even the most casual understanding of it destroys long held beliefs about the nature of things.
I honestly walked around in a complete mental daze for a few days after that read.
Crazy implications like there is a non-zero probability that one could actually walk through a wall. Very, very small but non-zero, according to the theory.
I checked and sure enough, I had reused the power cable from the 850W PSU.
It may or may not have been a problem, but OP’s misfortune has me not wanting to take a chance.
I remember when I was about to install the new 1000W PSU, some smart people here advised me to use the modular cables that came with it, and not the ones from the 850W PSU.
I thought, “SATA schmATA,” but did a little more research, and learned that not all SATA power cables are the same. It was a pain to reroute all the modular cables, but I’m glad I did.
I find it hard to believe a cable supplied with an 850W PSU would cause problems when used with a 1000W one, and certainly not if it came from a decent manufacturer. I’ve just looked through the collection of C13 cables in my drawer and they’re all rated 10A 250V, which is 2500W.
As for the SATA (and PCI) cables, the reason for that is there’s no standard for how the wiring in those cables is configured at the PSU end. Even if the connections are physically compatible there’s a good chance they either won’t work or could seriously damage your hardware.
Yes and no… I think it’s the current rating rather than the voltage rating that matters here in the US. Since we’re at 110-120VAC, a 10A cable is only good for 1100-1200W, so you’re right at the current limit. More amps = more heat in the cable. And in my case, the power supply is rated at 1200W. The PC never gets close to that limit, but I like to keep some headroom.
If you have 220-250VAC at the socket, then that’s another story and those cables have a lot of headroom.
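To put rough numbers on that headroom point, here’s a back-of-the-envelope sketch (the 10A rating is off the cable jacket and the 1200W figure is just my PSU’s rating, so treat the numbers as illustrative):

```python
# Rough headroom check for a 10A-rated C13 cable (illustrative numbers).
CABLE_AMP_RATING = 10.0   # A, printed on the cable jacket
PSU_MAX_DRAW_W = 1200.0   # W, worst-case draw of my PSU (yours will differ)

for mains_volts in (120.0, 230.0):
    cable_capacity_w = CABLE_AMP_RATING * mains_volts      # P = V * I
    current_at_full_load = PSU_MAX_DRAW_W / mains_volts    # I = P / V
    print(f"{mains_volts:.0f} V mains: cable good for ~{cable_capacity_w:.0f} W, "
          f"a {PSU_MAX_DRAW_W:.0f} W draw is {current_at_full_load:.1f} A "
          f"({current_at_full_load / CABLE_AMP_RATING:.0%} of the cable rating)")
```

At 120V the same cable is at 100% of its current rating for a 1200W draw; at 230V it’s only around half.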
Speaking of heat, in the name of science I tried to get the cable to burn again, using a propane torch to heat up the severed end. Somewhat disappointingly, it didn’t spark; it just slowly burned and went out.
But I’m not sure that gives me a definitive “negative” conclusion as to why it spewed sparks the other day. I think the propane torch isn’t hot enough to unleash the madness. Which would also mean the poor little filaments still connected in my cable were hotter than my torch. I might just need to ask my friend to break out his oxygen-acetylene torch. That might not be enough either though. External heating might also have a different effect than a superheated strand of questionably alloyed copper. Were the sparks actually ionized copper plasma from the wire? That’s plausible I suppose, but I’d have to invoke electricity again to try to replicate the failure conditions. Might be taking it a bit too far.
Far be it from me to question the wisdom of an engineer, but here goes…
My power cable is rated 300V/10A. It’s on a 120V/15A circuit breaker (a common household amperage rating).
To me that means the cable should easily handle a load of up to 120V x 10A = 1200W without overheating. My 1000W UPS’s load meter shows around 650W (so, about 5.4A) at maximum, with the computer, two monitors, and a host of other peripherals attached.
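In other words, the arithmetic I’m doing looks something like this (just a sketch with my own numbers plugged in):

```python
# Current drawn at the wall for my measured load (illustrative numbers).
mains_volts = 120.0        # V, nominal US outlet voltage
measured_load_w = 650.0    # W, peak reading on the UPS load meter
cable_rating_a = 10.0      # A, printed on the power cable
breaker_rating_a = 15.0    # A, typical household branch circuit

current_a = measured_load_w / mains_volts  # I = P / V
print(f"{measured_load_w:.0f} W at {mains_volts:.0f} V is about {current_a:.1f} A")
print(f"Cable headroom:   {cable_rating_a - current_a:.1f} A to spare")
print(f"Breaker headroom: {breaker_rating_a - current_a:.1f} A to spare")
```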
BeardyBrun reported that his cable was rated for 250V/10A = 2500W. (Here’s where my electrical knowledge hits a brick wall) If my UPS and computer used 240V, wouldn’t the current draw be half, i.e. 325W?
I almost burned down a B&B in Scotland. Came back from golf and the voltage converter I was using to power a 120V battery charger had literally melted and dripped down the wall. So clearly I should not be trusted with electricity…
They should never have given me my Electrical Speciality (Low Voltage) Contractor’s License.
Seriously though, if the computer power supply is rated for 1000W @ 240V, the current draw would be half that of a 1000W @ 120V power supply, right?
I think where I’m getting mixed up is by thinking of a computer’s power use, when in reality the important thing is the current that’s being used to produce that power at a given voltage.
Physicists might puff their chests about being smart but I’ve found electrical engineers to have even bigger egos. Then there’s the chemists. They’re god’s gift to mankind. Just ask them.
But from my lowly physics perspective,
Power (Watts) = Volts x Amps
and then Ohm’s Law…
Volts = Amps x Resistance (Ohms)
You can rearrange and substitute the variables from these two equations to solve for any value.
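As a quick sketch of those rearrangements (the numbers are just illustrative, loosely based on figures mentioned earlier in the thread):

```python
# The two relations, rearranged to solve for whichever value you're missing.
def power(volts, amps):        # P = V * I
    return volts * amps

def current(watts, volts):     # I = P / V  (rearranged from P = V * I)
    return watts / volts

def resistance(volts, amps):   # R = V / I  (rearranged from V = I * R)
    return volts / amps

print(power(120, 5.4))        # ~650 W, roughly the UPS reading above
print(current(1000, 120))     # ~8.3 A for a 1000 W draw at 120 V
print(resistance(120, 8.3))   # ~14.5 ohms equivalent at that draw
```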
For a simple resistive load, the power something uses is essentially fixed by design, as with an old-school lightbulb or toaster: the intended supply voltage is taken into account and the resistance of the filament is set accordingly.
Computer power supplies are a bit more complicated in that they convert AC to DC and regulate the output voltage(s) over varying loads. You’ve probably seen that little red recessed switch on some power supplies. That switches the input stage to account for the lower or higher mains voltage.
Your converter in Scotland was probably just a mechanical adapter rather than a voltage converter. It didn’t have a step down transformer, or maybe it did but your load was more than it could handle.
So back to the case of your theoretical 240V supply voltage… If the power draw is constant, which is what the power supply is trying to achieve, then doubling the voltage halves the current. You said 325 watts, but that’s a power figure; it’s the current (amps) that gets halved, while the wattage stays the same.
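A tiny sketch of that constant-power case, reusing the 650W figure from earlier in the thread (purely illustrative):

```python
# Constant-power load: doubling the supply voltage halves the current drawn.
load_w = 650.0  # W, what the PSU is trying to deliver regardless of input voltage

for volts in (120.0, 240.0):
    amps = load_w / volts  # I = P / V
    print(f"{load_w:.0f} W at {volts:.0f} V -> {amps:.2f} A")

# 650 W stays 650 W; only the current changes (about 5.4 A vs 2.7 A).
```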
Important consideration here:
Many PC power supplies give their power rating as the amount they output, not what they draw at the input. They aren’t 100% efficient, so a 1500W PSU might draw around 1650W from the wall socket.
You need to check the PSU’s input power usage, not its output. It’s usually on the label.
Not much of a difference, but it can add up and get you into trouble!
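Roughly, in numbers (the 91% efficiency here is just an assumed figure for illustration; the real value is on the PSU’s label or spec sheet):

```python
# Wall draw vs. rated DC output for a PSU that isn't 100% efficient.
rated_output_w = 1500.0   # W, the DC side the PSU is sold on
efficiency = 0.91         # assumed ~91% efficiency; check your unit's label/spec

input_draw_w = rated_output_w / efficiency  # what actually comes out of the socket
print(f"{rated_output_w:.0f} W out needs about {input_draw_w:.0f} W from the wall")
# ~1650 W in this example, which is the "it can add up" part.
```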
That’s why the load meter on my UPS is valuable. When it says 650W I know that’s what the computer and other devices connected to it are drawing from the AC power outlet. (Assuming of course it’s accurate…)
(For full clarity in this scenario) It’s also worth noting that, per Ohm’s law, if you keep resistance constant and double the current, you must also have doubled the voltage to get that doubled current.
Because voltage ÷ current = resistance,
if you double the current you must also double the voltage to keep the same resistance.
So you actually pull 4x the power in this example, because you’re doubling both voltage and current.
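Here’s that fixed-resistance case as a sketch, to contrast with the constant-power example earlier (the 12-ohm value is just an example, roughly a 1200W heating element designed for 120V):

```python
# Fixed resistive load: doubling the voltage doubles the current and quadruples the power.
resistance_ohms = 12.0  # example value, e.g. roughly a 1200 W heating element at 120 V

for volts in (120.0, 240.0):
    amps = volts / resistance_ohms       # I = V / R
    watts = volts * amps                 # P = V * I
    print(f"{volts:.0f} V across {resistance_ohms:.0f} ohms -> {amps:.1f} A, {watts:.0f} W")

# 120 V: 10 A, 1200 W.  240 V: 20 A, 4800 W, four times the power.
```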
You’re right, of course. But I was referring to the cable’s ratings, not the actual resistance of the circuit. I can see how my wording wasn’t clear.