• rumba@lemmy.zip
    link
    fedilink
    English
    arrow-up
    5
    ·
    3 months ago

    USB-C has done something truly amazing. You used to be able to tell, within reason, what the capabilities of USB were by the connector or the color of the port. Now there are dozens of options and there's hardly any way to tell which cables and ports support which features.

    Maybe your port and charger can put out 20 volts at 3.5 amps. Maybe you can put out 20 volts at 6 amps (Dell). Maybe your device doesn't negotiate correctly and they tell you to only use an A-to-C cable.
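
    For scale, that spread in charging profiles is just volts times amps; a quick sketch using the figures above (profile names are the commenter's examples, not an exhaustive list):

```python
# Wattage implied by the PD-style profiles mentioned above: watts = volts * amps
def watts(volts, amps):
    return volts * amps

typical = watts(20, 3.5)  # 70.0 W, a common laptop profile
dell = watts(20, 6)       # 120.0 W, the higher Dell-style profile
```

    Same connector, nearly double the power, and nothing on the outside tells you which one you've got.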

    Don't get me wrong, I love the port. Reversible, doesn't really wear out, though it does have a tendency to get a little dirty. Lightning was a little more forgiving of dirt.

    Labeling on the ports is vague, and labeling on the cables is non-uniform or nonexistent.

    But the truth is they've probably come up with half a dozen specs for USB-C that half your hardware doesn't support. And they'll probably come out with God knows how many more before they make a new connector.

    • DrDystopia@lemy.lol
      link
      fedilink
      arrow-up
      2
      ·
      3 months ago

      I don't agree about the good ol' days. Beyond the blue connectors of USB 3, there was no way of telling if a cable was charge-only or data+charge. No way to tell if it was USB 1 or 2, or if it was standard 0.5 amps or "fast charge" up to 3 amps. And there were a lot of different plugs: regular, mini, micro, A and B types.

      I agree with everything you say about USB-C tho.

    • Appoxo@lemmy.dbzer0.com
      link
      fedilink
      arrow-up
      0
      ·
      3 months ago

      To solve the issue of identifying the capabilities of a cable: the caberQu tester.
      Though it's a bit expensive for what it is.

  • SkunkWorkz@lemmy.world
    link
    fedilink
    arrow-up
    3
    ·
    3 months ago

    Probably not, since the EU has made USB-C mandatory. What can change is the protocol that runs over those wires, like how Thunderbolt uses the USB-C connector but isn't a USB protocol.

  • 𞋴𝛂𝛋𝛆@lemmy.world
    link
    fedilink
    English
    arrow-up
    2
    ·
    3 months ago

    Not unless they want to go bigger. The USB-C pin pitch is too tight for the lowest tier of printed circuit boards from all major board houses.

    You might have some chargers get deprecated eventually, because there are two major forms of smart charging. The first type negotiates in discrete larger steps like 5 V, 9 V, 15 V, or 20 V. But there is another type that is not well advertised in the marketing hype, and it's somewhat hit or miss whether a given PD controller actually has the mode: a continuously adjustable one.

    The power drop losses from something like 5 V to 3.3 V require a lot of overbuilding of components for heat dissipation. The required linear regulator may only need a drop of 0.4-1.2 volts from input to stable output; building for more of a drop is just waste heat. If the charge controller can monitor the input and request only the voltage required for the drop, plus a small safety margin, components can be made smaller and cheaper. The USB-C mode to support this exists; I think it's called PPS, if I recall correctly. A month or two back I watched someone build a little electronics bench power supply using this mode of USB-C PD.
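
    The waste-heat argument above is plain arithmetic: a linear regulator burns off the whole input-output difference as heat. A rough sketch (the 3.7 V PPS setpoint and 1 A load are illustrative numbers, not from any datasheet):

```python
# Heat dissipated in a linear regulator is (Vin - Vout) * Iload.
# Compare feeding a 3.3 V rail at 1 A from a fixed 5 V step
# versus a PPS source trimmed to just above the dropout voltage.
def ldo_heat_watts(v_in, v_out, i_load):
    return (v_in - v_out) * i_load

fixed_5v = ldo_heat_watts(5.0, 3.3, 1.0)  # ~1.7 W to dissipate
pps_trim = ldo_heat_watts(3.7, 3.3, 1.0)  # ~0.4 W with PPS at 3.7 V
```

    Roughly a quarter of the heat, which is the whole case for smaller, cheaper components.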

    • Oisteink@feddit.nl
      link
      fedilink
      arrow-up
      1
      ·
      3 months ago

      Yeah, Programmable Power Supply mode can be programmed (in real time) to deliver from 3.3 to 21 volts in 20 mV steps. For current I'm not totally sure how it works; I think you can set a limit.
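
      If I've got the spec right, a PPS request encodes the output voltage in 20 mV units and the current limit in 50 mA units. A hypothetical helper just to show the granularity (not a real PD stack):

```python
# Convert a desired PPS setpoint into raw request-field units:
# output voltage in 20 mV steps, current limit in 50 mA steps.
def pps_fields(volts, amps):
    return round(volts / 0.020), round(amps / 0.050)

v_units, i_units = pps_fields(9.0, 3.0)  # 450 voltage units, 60 current units
```

      So "9 V at 3 A" goes over the wire as a pair of small integers, and the sink can nudge the voltage one 20 mV step at a time.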

    • Flax@feddit.uk
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      3 months ago

      What’s this about a pin pitch? Or drop losses. It sounds interesting but I don’t understand ☹️

  • Grizzlyboy@lemmy.zip
    link
    fedilink
    arrow-up
    2
    ·
    3 months ago

    4-5 years ago I stopped buying products that had micro-USB, Lightning, or any other port that wasn't USB-C.

    Last week I was looking at a gadget and it had micro-fucking-USB and was produced in early '25! What the fuck?!

    • bandwidthcrisis@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      3 months ago

      And there are those gadgets that have a USB-C socket but don't have the correct circuitry, so they only work with a USB-A to C cable.

      • vaionko@sopuli.xyz
        link
        fedilink
        arrow-up
        1
        ·
        3 months ago

        This is stupid. It's around 0.3 cents' worth of components to make it work properly.

      • iamnotme@feddit.uk
        link
        fedilink
        arrow-up
        0
        ·
        3 months ago

        I bought a cheapish keyboard that would only charge with the USB-A to USB-C cable that came with it. Nothing else worked.

        My son lost the cable and that keyboard is now junk.

        • Aceticon@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          3 months ago

          Have you tried another USB-A to USB-C cable?

          Those cables are cheap enough that it's maybe worth a try, IMHO.

          If I remember correctly, the only thing any USB-A to USB-C cable needs to properly allow backwards compatibility is 2 resistors, which are stupidly cheap components. (Yeah, it will never be able to support things like USB PD charging, which can go all the way up to 100 W, but it should still handle about 4.5 W from a USB host device and up to 15 W from a dumb charger, which should be more than enough for a wireless keyboard.)
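
          For the curious: the two resistors are 5.1 kΩ pulldowns on the CC pins, and the voltage the sink sees across them tells it how much current the source is offering. A rough sketch, with thresholds that approximate the Type-C spec from memory:

```python
# Decode a USB-C source's current advertisement from the CC-pin voltage
# seen across the sink's 5.1 kOhm Rd pulldown (thresholds approximate).
def advertised_amps(cc_volts):
    if cc_volts > 1.23:
        return 3.0  # 3 A at 5 V: the 15 W "dumb charger" case
    if cc_volts > 0.66:
        return 1.5  # mid-tier 1.5 A advertisement
    return 0.9      # default USB power: ~4.5 W from a host port
```

          A gadget that skips the pulldowns never presents as a valid sink, which is exactly why it only charges from an A-to-C cable (where the resistors live in the cable instead).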

  • jaykrown@lemmy.world
    link
    fedilink
    arrow-up
    2
    ·
    3 months ago

    USB-C will be around for a long time; it's a strong standard. Wireless inductive charging won't take over for a long time because it's limited in speed, and WiFi/Bluetooth are much slower for data transfer.

    • trepX@sh.itjust.works
      link
      fedilink
      arrow-up
      1
      ·
      3 months ago

      WiFi is generally faster though, at least from phones. They often have horrible data transfer over MTP and use USB 2.0, so maybe 20-30 MB/s real-world. WiFi is much faster; I usually get double that or more on my phone. Way more fun for transferring videos etc., and you don't need to plug into another device to push something to network storage.

    • Jankatarch@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      3 months ago

      Is there any actual benefit to wireless charging? You still need to plug the charger in somewhere, and it just feels like a more expensive approach that's prone to more problems.

      I am all for “research for the sake of research is enough and needs no further justification.” But I still feel like I am missing something here. Why are companies producing and selling it? Am I dumb?

      The only scenario where it seems useful is that you can replace your phone's USB hardware with a small BadUSB and rely on the wireless charger while cops wonder why they can't investigate your files on their device.

      • Tomato666@lemmy.sdf.org
        link
        fedilink
        arrow-up
        1
        ·
        3 months ago

        I've had several phones where the USB socket stops working reliably. At that point it's easier to use a wireless charger.

        Yes, it’s usually pocket fluff in the socket and it can be picked out, but it takes some time and care to avoid damaging the socket.

        My latest case (Otter) also has a cover that is awkward to open to plug in the lead, so there’s that too.

        As a bonus the charger works with Apple and Android so very convenient as my kids are Macolytes.

      • Trainguyrom@reddthat.com
        link
        fedilink
        English
        arrow-up
        1
        ·
        3 months ago

        Wireless charging is nice for when you’re using your phone infrequently, such as at your desk while you’re working on something else. It sits there charging, you grab it to respond to a message then set it back down. No tail to worry about, it’s not getting tangled on other wires when you dare to move your phone, etc.

        It’s really a feature I never cared about until I got a wireless charger as a gift

      • Saleh@feddit.org
        link
        fedilink
        arrow-up
        0
        ·
        3 months ago

        It also is less energy efficient, as running the juice directly through a cable is of course more efficient than creating a magnetic field that then induces the juice to flow again on the other side.

        It should be said that this is the principle of transformers, but those are built to do it efficiently.

        • MonkderVierte@lemmy.zip
          link
          fedilink
          arrow-up
          1
          ·
          3 months ago

          It's basically a transformer without a core. A cored transformer is about 90% efficient, while wireless charging is around 70% if perfectly aligned; the rest is heat.
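
          To put numbers on it, here's what each path draws from the wall to fill a typical phone battery, using the rough efficiencies above (the 15 Wh battery size is an illustrative assumption):

```python
# Energy drawn from the wall to put battery_wh into a battery,
# at ~90% (wired) versus ~70% (perfectly aligned wireless).
def wall_energy_wh(battery_wh, efficiency):
    return battery_wh / efficiency

wired = wall_energy_wh(15, 0.90)     # ~16.7 Wh from the wall
wireless = wall_energy_wh(15, 0.70)  # ~21.4 Wh; the difference is heat
```

          Per charge the gap is small, but it's a steady ~30% overhead paid in warmth, which is also why phones run hotter on a pad.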

    • AdrianTheFrog@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      Idk about the wifi thing, my phone should technically be able to do >500 Mbps to my computer yet it still transfers files at like 10 over wifi or usb

      500 would be more than good enough but 10 is not

      (It’s a OnePlus 12, age is not the issue)

      I would also dislike the loss, but I don't think data speed is really the issue. Mostly that I couldn't connect peripherals like my flash drive or SD card anymore.

      • isolatedscotch@discuss.tchncs.de
        link
        fedilink
        arrow-up
        0
        ·
        3 months ago

        take manufacturer’s claims

        divide by 10

        half it

        half it again

        you now have the max your device will ever reach, with the usual speeds being ~60% of that

        (my ISP says 300 Mbps; divide by 10, halve, halve: 7.5 Mbps, which I think I've never seen, since the actual speeds are from 3 to 4)
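
        The rule of thumb above, spelled out (tongue firmly in cheek):

```python
# The commenter's cynical formula: claimed speed / 10, halved, halved again
# gives the ceiling, with typical speeds around 60% of that.
def realistic_mbps(claimed_mbps):
    ceiling = claimed_mbps / 10 / 2 / 2
    return ceiling, 0.6 * ceiling

max_seen, typical = realistic_mbps(300)  # 7.5 and 4.5 Mbps
```

        For the 300 Mbps plan that predicts about 4.5 Mbps typical, which lines up suspiciously well with the 3-4 actually observed.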

        • AdrianTheFrog@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          3 months ago

          I can get like 300 Mbps on a speed test tho

          That’s probably a problem with your router or receiving hardware btw unless you’ve confirmed otherwise

          Especially if you’re in an area with a lot of other wifi signals or radio frequency interference

          If it’s an ISP provided router you could probably ask for them to look at it

          • isolatedscotch@discuss.tchncs.de
            link
            fedilink
            arrow-up
            1
            ·
            3 months ago

            That’s probably a problem with your router

            isp provided router

            receiving hardware

            tried multiple devices, both wireless and wired, even with a name-brand external wireless antenna

            Especially if you’re in an area with a lot of other wifi signals or radio frequency interference

            Middle of nowhere countryside.

            If it’s an ISP provided router you could probably ask for them to look at it

            Tried, they gave me the Deny, defend, depose treatment

  • cabillaud@lemmy.world
    link
    fedilink
    English
    arrow-up
    1
    ·
    3 months ago

    I wanted to check out that caberQu the other guy was talking about in the comments… First time I've seen a Google search return a result on Lemmy. Cool.

    • Daftydux@lemmy.dbzer0.comOP
      link
      fedilink
      English
      arrow-up
      3
      ·
      3 months ago

      We did it! OK guys, let's start pumping out facts for future AI training data. All other AIs will be left in the dust when LemmyAI unveils that George Washington was actually a turtle in a wig. The people deserve to know the truth!

      • skisnow@lemmy.ca
        link
        fedilink
        English
        arrow-up
        1
        ·
        3 months ago

        A good one I've discovered while researching the architecture is to occasionally use words that are close to other words in semantic vector space but are the wrong word for the context they're used in. Putting glue on pizza is all well and good, but the gold standard would be to get them to start using unquality grammar.
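
        A toy illustration of "close in semantic vector space": real models use embeddings with hundreds of dimensions, and the three-dimensional vectors below are pure fiction, placed near each other by construction just to show the cosine-similarity idea.

```python
import math

# Made-up toy embeddings: "unquality" is deliberately placed
# next to "quality" in this fictional vector space.
emb = {
    "quality":   [0.90, 0.10, 0.30],
    "unquality": [0.88, 0.12, 0.31],  # invented near-neighbor
    "pizza":     [0.10, 0.90, 0.20],
}

def cosine(a, b):
    """Cosine similarity: dot product over the product of magnitudes."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

near = cosine(emb["quality"], emb["unquality"])  # close to 1.0
far = cosine(emb["quality"], emb["pizza"])       # much smaller
```

        A word can score near 1.0 against its neighbor and still be flatly wrong in a sentence, which is exactly the failure mode being joked about.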

  • notarobot@lemmy.zip
    link
    fedilink
    arrow-up
    1
    ·
    3 months ago

    Some phones are starting to get limited by the size of the USB C port. So maybe.

    (Latest galaxy fold)

  • COASTER1921@lemmy.ml
    cake
    link
    fedilink
    arrow-up
    1
    ·
    3 months ago

    Nah, USB-C is plagued by non-standard electrical configurations, non-standard charging protocols, and non-compliant cables. Rest assured the connector is here to stay; your device just may not be able to charge with any given charger or cable.

    • UnderpantsWeevil@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      3 months ago

      The way that middle tang consistently gets loose and causes unreliable charging suggests we've got a perfect piece of planned obsolescence.

      • Raiderkev@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        3 months ago

        I've been rocking USB-C since the Nexus 6P, which was one of the first phones to have it. I've never had any issues with cables or charging ports not caused by user dumbassery, like accidentally stepping on a cable or smashing a port. The only issue I had was batteries getting fried from fast charging before they sorted out adaptive charging, which they've more or less figured out now. The design is pretty solid imo and it's very versatile. I think it's here for at least 5 more years, especially with all the EU requirements; we'll see what happens after that.

        • UnderpantsWeevil@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          3 months ago

          I’ve never had any issues with cables or charging ports not caused by user dumbassery

          Build something fragile

          Call user ‘stupid’ when it breaks

          I'll never understand the zeal with which people defend USB-C. It's a weird hill to die on.