Ask Lemmy
A Fediverse community for open-ended, thought provoking questions
Please don't post about US Politics. If you need to do this, try [email protected]
Yup. Usually the device being charged can scale down the power throughput so it's not getting 60W+ if it's not able to handle it.
That's the core of charging management: The charged device controls the process, not the charger.
Anything else won't work if you think about it.
In this thread: people who don't understand what power is.
Power isn't something that is "pushed" into a device by a charger. Power is the rate at which a device uses energy. Power is "consumed" by the device, and the wattage rating on the charger is simply how much it can supply, which is determined by how much current it can handle at its output voltage. A device only draws the power it needs to operate, and this may go up or down depending on what it's doing, e.g. whether your screen is on or off.
As long as the voltage is correct, you could hook your phone up to a 1000W power supply and it will be absolutely fine. This is why everything's OK when you plug devices into your gaming PC with a 1000W power supply, or why you can swap out a power-hungry video card for a low-power one, and the power supply won't fry your PC. All that extra power capability simply goes unused if it isn't called for.
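To make the point concrete, here's a minimal sketch (hypothetical numbers, simple resistive load assumed) showing that the device's load, not the charger's wattage rating, determines how much power actually flows:

```python
# Sketch: power drawn is set by the load, not by the supply's rating.
# Assumes a simple resistive load; I = V / R (Ohm's law), P = V * I.
def power_drawn(voltage, load_resistance):
    current = voltage / load_resistance  # amps the load actually pulls
    return voltage * current             # watts consumed

# A phone presenting an effective 2.5-ohm load at 5 V draws 10 W,
# whether the supply is rated for 10 W or 1000 W:
print(power_drawn(5.0, 2.5))  # 10.0
```

The unused capacity of a big supply simply sits idle; nothing "pushes" the extra watts into the load.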
The "pushing force" that is scaled up or down is voltage. USB chargers advertise their capabilities, or a power delivery protocol is used to negotiate voltages, so the device can choose to draw more current and thus power from the charger, as it sees fit. (If the device tries to draw too much, a poorly-designed charger may fail, and in turn this could expose the device to inappropriate voltages and currents being passed on, damaging both devices. Well-designed chargers have protections to prevent this, even in the event of failure. Cheap crappy chargers often don't.)
Great write up! Definitely filled in the info I only know a little about.
Oh, please enlighten us, oh wise one. You might want to google "power draw" before you reply.
I guess he should have included the subtitles.
Have you ever heard the story of Darth Plagueis the Wise?
Not usually, but all the time. It’s part of the USB standard to negotiate the power that the device and even the cable can handle.
When all USB could do was 5V, I already didn't trust any charger but my own; I couldn't believe people dared to plug their devices into random public USB chargers.
Now that they can go up to 20V, and we have to trust everything will work with the negotiation and wiring to get the right voltage, it's even scarier!
Will go up to 48V (240W) with the next USB-PD standard.
But as long as it's reputable hardware that actually implements the standard, I'm not too worried.
48V and we're back to POTS (plain old telephone system) voltages :-)
I agree, but that's the problem: even with reputable hardware, glitches happen. Old 5V-only chargers would need many more things to go wrong to fry our devices. A 20V (or 48V!) one is just one small (SW or HW) glitch away from zapping a device that doesn't support such voltages.
The USB standard is usually really robust and the chance of SW errors is small. If you have a good-brand laptop, it will probably come with a very reliable charger as well. I really don't worry about it.
Is there some exception to USB-C I'm not aware of? Am I putting myself in danger using high-power chargers to charge low-power devices?
No, they do a handshake through the USB connection and negotiate the best charging wattage.
And to add: if the handshake fails, or no common voltage can be agreed on, it will stay at 5V.
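The negotiation described above can be sketched roughly like this (a simplified illustration, not the real USB-PD protocol; the profile list and voltage values are made up for the example):

```python
# Rough sketch of USB-PD-style negotiation: the charger advertises
# (voltage, max_current) profiles; the device picks the highest-wattage
# profile whose voltage it supports, else falls back to basic 5 V.
CHARGER_PROFILES = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 3.25)]  # V, A

def negotiate(device_supported_voltages, charger_profiles):
    usable = [(v, a) for v, a in charger_profiles
              if v in device_supported_voltages]
    if not usable:
        return (5.0, 0.5)  # handshake found no common profile: stay at 5 V
    return max(usable, key=lambda p: p[0] * p[1])  # best wattage

# A device that only supports 5 V and 9 V settles on the 9 V profile:
print(negotiate({5.0, 9.0}, CHARGER_PROFILES))  # (9.0, 3.0)
```

Until a higher profile is agreed on, the charger never puts anything above 5V on the wire, which is why a low-power device on a high-power charger is normally safe.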
If you use really cheap 3rd party chargers there is a possibility.