

You’re worried about the screen being worn out? How does a screen wear out (excluding maybe OLED burn-in, but this ain’t OLED)? And a good chassis shouldn’t show that much wear after a few years.
They’re still far better than everything else on the market.
IdeaPads also aren’t ThinkPads. Those are the consumer grade garbage you’d want to stay away from.
That’s cool. Performance per dollar isn’t the only factor for a laptop.
Size
Weight
Durability
Battery life
I/O and other features.
A not dogshit network card
An actually usable trackpad
I’m sure I could list more, but those are all things that are important on a laptop and that you can’t change after you buy it.
The law requires things to have a backdoor (or no encryption, I guess). So it’s either sell in that market, or don’t and have security.
And it’s not in the center of the port. It’s in between two ports where a boot should never be in the way.
The first Android phone had the nipple, so it must be the layout or something.
Windows Hello (and presumably modern Linux equivalents) uses the camera + IR transmitters to work at least similarly to how Apple’s Face ID works. In theory they should both be secure, but in practice who knows what they fucked up.
Is there anyone on the internet who doesn’t lie about those things?
Lots of small companies don’t require (or give) you a dedicated work phone. Shit, my company isn’t even all that small anymore and we still don’t.
Forking splits the community, development resources, etc. and ensures Linux will stay irrelevant to the home user.
If everyone switches over to the fork that’s great. But let’s be honest. Ubuntu isn’t going anywhere any time soon.
Every picture I’ve seen has been an outside pin. So my theory is that the cable gets tugged for cable management, and even though it’s clipped in, it’s not making as good contact.
That, or just a bad cable design. I’ve bought a few cables from CableMod and I’m not happy with the wiring they used. Their website says “Crafted with 16AWG wiring,” but they also brag about the flexibility of their cables, so I assume they’re using stranded wire instead of solid core, which costs you a decent chunk of ampacity (and heat sinking).
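Rough back-of-the-envelope on why contact quality matters, as a sketch with purely illustrative numbers (a ~450 watt card, a 12V rail, six current-carrying pins; none of these are from CableMod’s specs):

```python
# Back-of-the-envelope: per-pin current on a GPU power cable.
# Assumed illustrative numbers, not from any spec sheet:
power_w = 450.0    # assumed card power draw
voltage_v = 12.0   # 12V rail
pins = 6           # assumed number of current-carrying 12V pins

total_current = power_w / voltage_v            # 37.5 A total
per_pin = total_current / pins                 # ~6.25 A per pin when all pins seat well
per_pin_one_bad = total_current / (pins - 1)   # ~7.5 A each if one pin barely conducts

print(f"total: {total_current:.1f} A, per pin: {per_pin:.2f} A, "
      f"one bad pin: {per_pin_one_bad:.2f} A")
```

Point being, each pin is already carrying a fair amount of current, so a pin that isn’t seated well both heats up at the bad contact and pushes more current onto its neighbors.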
High end GPUs are always pushed just past their peak efficiency. If you slightly underclock and undervolt them you can see some incredible performance per watt.
I have a 4090 that’s undervolted as low as it will go (0.875V on the core, more or less stock speeds) and it only draws about 250 watts while still providing 80%+ of the performance of the stock card. I had an undervolt that went to about 0.9 or 0.925V on the core with a slight overclock, and I got stock speeds at about 300 watts. Heavy RT will make the consumption spike to closer to the 450 watt TDP, but that just puts me back at the same performance as stock, because the card was already downclocking to those speeds anyway. About 70 of those 250 watts is my VRAM, so it could scale a bit better if I found the right sweet spot.
My GTX 1080 before that was undervolted but left at maybe 5% under stock clocks, and it went from 180 watts to 120 or less.
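Putting rough numbers on the perf-per-watt claim, using the eyeballed figures above (80% of stock performance at 250 watts vs. stock at the 450 watt TDP):

```python
# Rough perf-per-watt comparison from the eyeballed numbers above.
stock_perf, stock_watts = 1.00, 450.0   # stock 4090 at its ~450 W TDP
uv_perf, uv_watts = 0.80, 250.0         # ~80% of stock performance at ~250 W undervolted

stock_ppw = stock_perf / stock_watts    # ~0.0022 per watt
uv_ppw = uv_perf / uv_watts             # ~0.0032 per watt

print(f"perf/watt improvement: {uv_ppw / stock_ppw:.2f}x")   # ~1.44x
```

So even taking the 80% figure at face value, that works out to roughly 1.4x the performance per watt.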
The 8800 Ultra was 170 watts in ’06.
The GTX 280 was 230 in ’08.
The GTX 480/580 was 250 in 2010. But then we got the GTX 590 dual-GPU card, which more or less doubled that.
The 680 was a drop, but then they added the Ti cards/Titans and that brought us back up to high-TDP flagships.
These cards have always been high for their time, but that quickly became normalized. Remember when 95 watt CPUs were really high? Yeah, that’s a joke compared to modern CPUs. My laptop’s CPU draws 95 watts.
There are decades where nothing happens; and there are weeks where decades happen.
Still a charge, just no “re” for tomorrow.
Also I’m pretty sure those things lasted a lot longer than a day.
Just because it’s open source and anyone could theoretically fork it doesn’t mean it can’t be enshittified.
What “merits” needing a CPU upgrade? I upgraded from a Core i9-11950H to a 13900H machine because I needed more performance. That 11th gen machine still looks pristine besides one spot where a cat bit the corner of the lid. Even my piddling-around machine wasn’t up to snuff and got upgraded from a 10th gen i5 to a 12th gen system. That machine’s keyboard was a bit worn when I first got it, but it’s not (appreciably) worse now. Besides that and maybe the palm rest, the chassis is in pretty good condition. Why does it matter if the keycaps are a little smooth? Or there’s a small scuff on one corner? Or a cat punctured the bezel of the display and somehow didn’t break anything?