cross-posted from: https://lemmy.bestiver.se/post/1091783
- GreenBeanMachine@lemmy.worldEnglish2 hours
So there’s more supply than demand, meaning they should become really cheap. Right? That’s fucking good news.
What collapse are you talking about? Profit collapse for greedy corporations?
- floofloof@lemmy.caEnglish18 minutes
Sure, you may be able to buy a cheaper motherboard for a while. But you’ll pay through the nose to populate it, hence the falling motherboard sales.
- GreenBeanMachine@lemmy.worldEnglish16 minutes
If they get dirt cheap, I might just replace my old one using old parts
- febra@lemmy.worldEnglish1 hour
I have a home server waiting around for an SSD for a year now. I have the money, but I don’t like feeling like I’m getting scammed. So I’d rather wait for this market to collapse than give them my money.
- farmgineer@nord.pubEnglish1 hour
I was originally thinking of grabbing parts as they go on sale finally (this PC is from 2018ish, I think, and I guess I could upgrade some bits), but I think I’m just going to wait and get a laptop. In part because I do less gaming, the gaming is less intense, and I’m thinking about trying to spend part of the year living outside of Japan which would make the logistics of shipping a heavy full tower around (or even mid if I downsized) just too much of a headache. Still not 100% sure, though.
- msage@programming.devEnglish3 hours
Can we create a fund with gamers, and then buy the manufacturers that go under?
We need just one of each: MB, PSU, GPU. Hopefully something for CPUs will be buyable as well.
- boonhet@sopuli.xyzEnglish2 hours
For CPU and GPU, our options are mostly TSMC, Samsung and Intel. Nvidia and AMD don’t really have fabs. Any of those can also help with the other chips on a motherboard. Samsung also do NAND so they’d be the best to acquire.
- chunes@lemmy.worldEnglish9 hours
How has this whole saga not been an obvious indictment of ‘the free market?’
Big players shouldn’t be allowed to gobble up all the resources needed by small ones. How is it not obvious that they need to wait until production increases to meet their needs before embarking on their little project?
- Venator@lemmy.nzEnglish6 hours
The free market means the market is free to fuck you. Yes you in particular 😅
- 9 hours
No this is exactly what the free market is. Regulations that make things fairer for small people is communism.
- kiranraine@reddthat.comEnglish9 hours
Then give me communism bc this free market shiz aint it. Giving us the scraps that AI doesn’t want is just nuts if we get anything at all. Esp since no one beyond billionaires want this ai crap
- BeMoreCareful@lemmy.worldEnglish1 hour
It’s like a dozen people gaming everyone else for the rest of the chairs on the Titanic.
They want to buy the dip
- shirro@aussie.zoneEnglish11 hours
Who knew cartels were bad for markets? I am sure economists and regulators were screaming at legislators for years but I guess they couldn’t hear them over the sound of the bags of dirty lobbyist money landing on their desks.
And then there are the financial “irregularities” funding the AI boom which are also not getting any attention.
- eestileib@lemmy.blahaj.zoneEnglish19 hours
AI is an amazing tool for fascists.
Annihilate private access to computing, censor and rewrite all comms, destroy free software and the last remnants of education…
Every single decision made for evil.
And all these vendors who are locking themselves into one customer are about to learn why that’s a bad idea.
- Tollana1234567@lemmy.todayEnglish4 hours
AI is used as a tool to spread propaganda, and it can be seen on YouTube and social media quite readily. Plus it sexualizes victims, even to the point of CSAM, and fetishizes unattainable “women” for conservatives.
Conservatives pretty much buy into/believe in anything that’s scammy.
- [deleted]@piefed.worldEnglish17 hours
The worst thing is that when used for good AI is fantastic! Scientific progress with purpose built AI to find planets, predict the weather, and tons of pattern matching has been in use for decades with positive benefits!
Even LLMs can be a useful tool in the right situations where looking like words people would say but accuracy is NOT important.
The problem is trying to use LLMs to do everything and failing while running the tech industry, the environment, and soon the economy into the ground. They took something positive, ruined it and coopted the terminology while shoving it down everyone’s throats.
MrKoyun@lemmy.world English 12 hours
How are they about to learn why that’s a bad idea? Like, when the bubble pops?
- bytepursuits@programming.devEnglish10 hours
I think most likely they’re going to be outcompeted by China.
https://www.bloomberg.com/news/articles/2026-04-27/why-china-s-deepseek-qwen-and-moonshot-are-a-worry-for-us-ai-rivals
- verbalins@sh.itjust.worksEnglish17 hours
This article talks exactly about that: https://tante.cc/2026/04/21/ai-as-a-fascist-artifact/
artyom@piefed.social English 16 hours
I don’t know if spending unfathomable amounts of money buying up the entire global supply of computing products can be considered a “tool”.
- cadekat@pawb.socialEnglish18 hours
I’m not here to argue for or against LLMs in general, but self-hostable AI is a thing. Actually open AI is a thing.
A blanket statement that AI as a whole technology is good for fascism is about as useful as saying “roads are good for fascism” (they’re great for troop movements, after all).
roofuskit@lemmy.world English 18 hours
You can’t self-host anything when the hardware is no longer affordable.
ugjka@lemmy.ugjka.net English 17 hours
Self-hosting an LLM ain’t the same thing as self-hosting Nextcloud for your docs and calendar. Yes, there are small models, but their output is laughable.
- MagicShel@lemmy.zipEnglish16 hours
Small models are improving and becoming more capable. The quality of local LLMs is basically unbounded. The context size of local LLMs is bounded by hardware. So local LLMs can be very capable for small, self-contained tasks.
qwen 3.6 35b running locally:
Write a Python script that can pull weather data from public sources and provide the high and low temperature for the current day in Miami, FL.

Single shot. No tool/internet use, so it didn’t pull this script from elsewhere.

```python
import requests

def get_miami_weather():
    # Miami, FL coordinates
    LATITUDE = 25.7617
    LONGITUDE = -80.1918

    # Open-Meteo API URL (free, no API key required)
    url = (
        f"https://api.open-meteo.com/v1/forecast?"
        f"latitude={LATITUDE}&longitude={LONGITUDE}"
        f"&daily=temperature_2m_max,temperature_2m_min"
        f"&timezone=auto"
    )

    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()  # Raises error for 4xx/5xx HTTP status codes
        data = response.json()

        # Index 0 corresponds to the current day
        high_c = data["daily"]["temperature_2m_max"][0]
        low_c = data["daily"]["temperature_2m_min"][0]

        # Convert to Fahrenheit (commonly used in the US)
        high_f = (high_c * 9/5) + 32
        low_f = (low_c * 9/5) + 32

        print("🌤️ Miami, FL Weather for Today:")
        print(f"High: {high_f:.1f}°F ({high_c:.1f}°C)")
        print(f"Low: {low_f:.1f}°F ({low_c:.1f}°C)")

    except requests.exceptions.HTTPError as http_err:
        print(f"❌ HTTP error occurred: {http_err}")
    except requests.exceptions.ConnectionError:
        print("❌ Error: Could not connect to the weather API.")
    except requests.exceptions.Timeout:
        print("❌ Error: Request timed out.")
    except requests.exceptions.RequestException as err:
        print(f"❌ An error occurred: {err}")
    except KeyError as key_err:
        print(f"❌ Error parsing data: Missing expected key {key_err}")
    except Exception as err:
        print(f"❌ Unexpected error: {err}")

if __name__ == "__main__":
    get_miami_weather()
```

Output:

```
% python3 ./m_weather.py
🌤️ Miami, FL Weather for Today:
High: 88.0°F (31.1°C)
Low: 73.2°F (22.9°C)
```

I tried to keep the size and scope within something that would reasonably fit in a comment. Looks pretty decent to me, but I can’t write Python myself. Never learned. I double-checked the LAT & LON of Miami, and it’s spot on.
It did take 47 seconds, while a cloud LLM would probably take 5 or less.
All I’m saying is local LLM isn’t garbage and it is getting better all the time.
- chunes@lemmy.worldEnglish9 hours
Now show the output for an 8b model. The only one I’m capable of running
- humanspiral@lemmy.caEnglish9 hours
qwen 3.6 is awesome, but 48-64 GB of memory is still real money these days (though 32 GB on a dedicated separate machine is also more money). It benchmarks around Sonnet 3.5 to Opus 4.5 level. And the online cost metrics for the 27b and 35b are way off considering the overall usefulness of a 48-64 GB machine (inclusive of GPU VRAM for the 35b), which even in single, non-batching use could displace $5-$7/day of cloud usage.
Local costs are much lower than the online costs in the linked chart, but if you go online, there are better models.
Rimu@piefed.social English 15 hours
That’s interesting.
How much RAM did it use while running?
If you used a GPU, how much does it cost in today’s prices?
- MagicShel@lemmy.zipEnglish11 hours
It’s a MacBook Pro. 36GB of ram. I am sure Macs have some kind of gpu and I understand it somehow combines GPU ram with system ram, but I don’t really know Mac hardware very well.
It’s beefy for a laptop, but the desktop I built for myself several years ago had 32 GB of ram and a GTX 1660, so I’m guessing they are similar in capability. I gave that to my daughter, so I can’t run a comparison right now.
EDIT: After doing just a bit of research, I’ve learned the unified memory architecture that Macs use, while not ideal for many purposes, is actually a big advantage for running larger inference models. So it’s possible that this particular model wouldn’t run at all on my Linux box or would run much slower because the full model wouldn’t fit in the 6GB of VRAM and create a lot of memory thrashing.
- boonhet@sopuli.xyzEnglish2 hours
Yup, you want memory accessible to the GPU for local AI. AMD Strix Point and Mac devices are popular options. A CPU can run LLMs, but very slowly. I’ve got 32 GB of RAM and 8 GB of VRAM, and it’s borderline useless for models that don’t fit in the VRAM.
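A rough way to sanity-check whether a model fits in your VRAM is to estimate its footprint from parameter count and quantization width. The function below and its 20% overhead factor are my own back-of-the-envelope assumptions, not vendor numbers:

```python
def model_memory_gb(params_billions: float, bits_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Rough footprint: weights only, plus ~20% for KV cache and buffers."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 35B model at 4-bit quantization vs. an 8B model at 4-bit
big = model_memory_gb(35, 4)    # roughly 21 GB: won't fit an 8 GB card
small = model_memory_gb(8, 4)   # roughly 4.8 GB: fits comfortably
print(f"35B @ 4-bit: {big:.1f} GB, 8B @ 4-bit: {small:.1f} GB")
```

This is why an 8 GB GPU is stuck with small quantized models while unified-memory machines can load much larger ones.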
- SabinStargem@lemmy.todayEnglish9 hours
You can use something like KoboldCPP on Linux, which allows both RAM and VRAM combined to run a model. O’course, not as fast when compared to pure VRAM or the Mac approach, but it is an option. I use my 128gb RAM with some GPUs for running models.
- humanspiral@lemmy.caEnglish9 hours
decent performance on 6gb gpu without quantization: https://www.youtube.com/watch?v=8F_5pdcD3HY&t=9s
- Janx@piefed.socialEnglish17 hours
Or available. Companies have pre-sold years worth of inventory to AI companies.
- corsicanguppy@lemmy.caEnglish18 hours
You see how that’s tangential to what you’re replying to?
Ai is evil
LOCAL AI is not all evil
Computers are expensive
Your point is completely valid, but in another discussion.
- Fondots@lemmy.worldEnglish17 hours
Sorry, but I think the point about local AI not necessarily being evil is the tangent here.
The OP is about motherboard shortages, which is being driven by the big AI companies and is making hardware unaffordable for normal users
The top level reply to that is about how that’s bad because it removes the ability for people to be in control of their own computing
Then someone comes in, saying “yeah, but you can host your own AI so that it’s not evil so not all AI is bad”
Then someone points out that you can only host your AI if you can afford the hardware to do so which, as the OP and the comment you replied to pointed out, is getting really hard to do.
- 17 hours
Only when you ignore what was literally the first premise and conclusion.
- Jhex@lemmy.worldEnglish18 hours
if you did not understand the comment above, that’s fine, but splitting hairs like you are doing is silly (everybody knows it’s not that 100% of AI is 100% evil)…
your comment is exactly the same as when people say “guns don’t kill people, people kill people”… yes, we all know guns are not autonomously killing people, the point is that guns, as a tool, are remarkably good at doing something we do not want, which is to kill people
- meco03211@lemmy.worldEnglish17 hours
Not to go on a separate tangent, but that’s the entire point of guns. They are supposed to kill. That’s not meant to be some crazy conservative defense of them or opposition to regulating them. Just pointing out something that seems to get lost in conversations.
- Jhex@lemmy.worldEnglish15 hours
Correct… so when I tell you “guns DON’T kill people, people kill people,” you are right to assume I am just an idiot trying to jingle keys in front of you to distract you from the fact that guns do, in fact, kill people.
- brendansimms@lemmy.worldEnglish18 hours
Corps want to privatize roads and make them all toll roads too
- eestileib@lemmy.blahaj.zoneEnglish18 hours
Roads were also useful for random citizens and people who happened to be in the area.
LLMs are overwhelmingly more useful to bad actors.
- IratePirate@feddit.orgEnglish18 hours
I’ve looked into self-hosted AI and decided it’s not worth the cost - both in terms of hardware and energy - when compared to the relative value to be gotten out of it. YMMV.
- cadekat@pawb.socialEnglish14 hours
Same, pretty much. It is possible though, which makes LLMs a more democratic technology than, say, nuclear reactors.
- IratePirate@feddit.orgEnglish13 hours
The models you can run on consumer hardware are still nowhere near the stuff that runs in corporate data centers. To stick with your metaphor, it’s like running a little steam engine at home while the big guys get to operate nuclear reactors…
- cadekat@pawb.socialEnglish11 hours
You can get pretty far with a stack of 5090s and llama.cpp with split mode graph (or so I’ve heard, I’ve never tried), or AMD’s unified memory CPU thing.
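For the curious, a sketch of the multi-GPU llama.cpp invocation alluded to above. The model path and split ratios are placeholders, and flags change between releases, so check `llama-cli --help` for your build:

```shell
# Split a GGUF model across two GPUs row-wise; -ngl 99 offloads all layers.
llama-cli -m ./models/your-model.gguf \
  --split-mode row \
  --tensor-split 1,1 \
  -ngl 99 \
  -p "Hello"
```

`--split-mode layer` (the default) is usually simpler; `row` can help when a single layer is too large for one card.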
It’s not as good as data centre grade stuff, but it’s not nothing either.
- cadekat@pawb.socialEnglish14 hours
That’s kinda my point. Roads are a useful technology, but they can be used by fascists.
Ilixtze@lemmy.ml English 18 hours
The US government is already setting down the legal framework to make self-hostable AI illegal, so good luck with that. Also, self-hostable AI is still being trained on stolen material, so still fascist.
- DigDoug@lemmy.worldEnglish17 hours
Motherboards were already way too goddamn expensive, anyway.
About a year ago I was considering upgrading my AM4 PC to AM5. The rock-bottom cheapest motherboards were only slightly cheaper than the relatively high-end one I got 5-ish years ago. I decided to stick with my current PC.
- doingthestuff@lemy.lolEnglish9 hours
Yeah I’m going to be running AM4, DDR4, with my 5800x for many more years the way it’s looking.
- floofloof@lemmy.caEnglish14 hours
There’s going to be a lot of us running 2019-vintage PCs indefinitely.
- GenChadT@infosec.pubEnglish11 hours
Just repasted both of mine with phase change materials and thermal putty, including GPU dies. Looking at getting better/more fans and a smarter hub soon. Gonna make this 5800X3D last as long as possible; I’m in this for the long haul lol
- Rai@lemmy.dbzer0.comEnglish12 hours
I think motherboards have been pretty aptly priced. What do you think a complex piece of a computer like that should cost? The most important part of your computer that ties every single part together?
150USD? You can get a decent board for that.
Under 100? First of all, that’s insane, and second of all, you can get a budget board for under 100.
350 is too much? You’re looking at a high end board. You’re paying high end board prices.
Is your complaint that 500 is too much for a mobo? Why are you even looking at 500 dollar mobos?
All of my boards I’ve gotten in the last five builds have been 200-300 and those are amazing machines that are either newer and excellent or older and still hold up over a decade later.
- Asafum@lemmy.worldEnglish18 hours
If we can’t make a time machine to go backwards, can we at least pause time? The future absolutely fucking sucks, let’s just avoid it altogether lol
- 10 hours
The science is not yet clear. I think some quantum theories say time might not even exist.
- deranger@sh.itjust.worksEnglish18 hours
Pausing time would be worse than living forever. I could not imagine a greater torture.
- Fishnoodle@lemmy.worldEnglish18 hours
Seed debris in orbit to destroy current satellites and then prevent new ones from being launched for several decades
- Echo Dot@feddit.ukEnglish17 hours
Honestly I quite like space.
Better solution is to just kill like six people. Because that’s all the AI industry is, it’s six really annoying rich guys who are having the most extravagant pissing contest in history.
- boonhet@sopuli.xyzEnglish1 hour
Six? There’s Jensen Huang, Lisa Su, Alex Karp, Peter Thiel, Sam Altman, Dario Amodei, Mark Zuckerberg, Elon Musk, Satya Nadella and Sundar Pichai at a minimum to start with, but there are many others too. I’m including the CEOs of Nvidia and AMD because they’ve been pushing AI for years to get more valuable customers than we ever were to them.
Just in case anyone needs a list.
- Asafum@lemmy.worldEnglish16 hours
This is our fucking insane reality:
Who wins? 6 rich guys or 8 Billion people?
Easy. 6 rich guys.
…
- Echo Dot@feddit.ukEnglish14 hours
Honestly I wouldn’t mind so much if someone created an AI that was actually useful.
- commander@lemmy.worldEnglish18 hours
I keep thinking maybe things will be good by 2030, then remember that’s only 4 years away. Game devs, please target the Steam Deck and Switch 2 as the baseline. Mid-range and high-end are just too premium for most people. Even entry-level enthusiast gaming hardware is too expensive because of memory and storage. The Steam Deck and Switch 2 are at a good low-power-draw integrated graphics level. That’s not terrible for pricing.
- ericwdhs@discuss.onlineEnglish17 hours
Even having high-end enthusiast hardware, I want those devices as the baseline too. Whatever optimizations they do still apply over the whole hardware spectrum.
Also, you can technically say 2030 is less than 4 years away if you want to traumatize old people. Lol.
- Echo Dot@feddit.ukEnglish17 hours
I was like watching sci-fi made in the 80s and 90s where they thought that 2015 was a really advanced distant prospect.
- DigDoug@lemmy.worldEnglish12 hours
Back to the Future 2 promised us flying cars, (actual) hoverboards, and Jaws 19… 11 years ago.
- rebelsimile@sh.itjust.worksEnglish15 hours
the first time I heard “1997”, in a movie, in the 90s, I freaked out, like no way that’s a real year.
- idiomaddict@lemmy.worldEnglish12 hours
2031 still sounds like a year out of a sci-fi story to me, but it’s only five years away
- TachyonTele@piefed.socialEnglish17 hours
I agree with you both. And you also both suck for saying mean things.
- Echo Dot@feddit.ukEnglish17 hours
According to Valve’s hardware survey, over half of all gamers have hardware less powerful than the Steam Machine, so provided Valve doesn’t go mad with the pricing, that may actually end up happening.
I think the steam deck and the switch 2 are probably too low power to be reasonable targets.
- 18 hours
Dammit. Everything is going up in price: GPU prices, RAM prices, storage, then CPUs. And now motherboards. So basically everything!
tal@lemmy.today English 18 hours
Motherboards are, if anything, probably going to do the opposite — motherboard prices aren’t rising because of increased demand. Memory prices rose because of increased demand. Prices for things that use memory also rose. Motherboard sales are falling because of decreased demand; motherboards don’t use a ton of memory, and fewer people need a new motherboard because the components that they’d plug into the motherboard cost enough to cause them to defer upgrading or buying a new PC. You might see price cuts, if anything.
- 18 hours
It’s actually the other way around, prices should go down as mobo sales are low.
- 7 minutes
Yes, normally speaking that would be logical. But today nobody is producing products for consumers anymore.
Heck, even one of the three big memory producers, Micron, just said “fk consumers, we only focus on businesses (AI datacenters), since we can earn more that way.” In the short term, at least.
- ignirtoq@feddit.onlineEnglish16 hours
No, sales are going down because prices are going up. If you have a fixed inventory and sales go down, you lower prices to increase demand and move the product and keep your revenue stream. But in this case, they’re moving supply away from this market (consumer hardware) to a different market (AI data centers). So the supply is going down with (previously) fixed demand, driving prices up. The “motherboard sales are collapsing” headline comes from looking at the consumer hardware slice of the computing hardware market. If you look at total sales from each manufacturer, so include the AI data center sales in the analysis, they’re not having any trouble moving inventory nor keeping up their revenue stream overall.
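A toy sketch of that mechanism: demand stays where it was, but units get diverted to another market, so the consumer market clears at a higher price. The linear demand curve and all the numbers here are made up purely for illustration:

```python
def clearing_price(units_offered: float, intercept: float = 500.0,
                   slope: float = 0.25) -> float:
    """Linear demand curve: price = intercept - slope * quantity.
    With inelastic supply, the market clears at whatever quantity is offered."""
    return intercept - slope * units_offered

# Consumer motherboard market, hypothetical numbers
before = clearing_price(1200)  # full supply offered to consumers: 200.0
after = clearing_price(600)    # half the units diverted to data centers: 350.0
print(before, after)
```

Fewer units offered, same demand curve, higher clearing price: falling unit sales and rising prices at the same time, exactly as described.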
- 4 hours
Unlike DRAM, which is quite universal, most manufacturers of motherboards specialise in a specific direction. Asus, MSI, etc., hell, most of the consumer market players, have specialised in gaming oriented motherboards.
Do you know what a server motherboard doesn’t need?
- 4 different RGB headers
- various gamer crap baked into the motherboard
- gamer branding all over the place
what they need is:
- specially formatted motherboards with built in IPMI or similar remote management systems
- dual CPU sockets in most cases
- tons of PCIe lanes available for interconnect fabric, GPUs, and so on
the two markets simply don’t mesh. Asus losing 25-30% of its market practically overnight because people can’t afford to buy RAM, SSD, etc. does not negate the fact they need sales to survive, so what they’ll do is drop prices, lowering their profit margin, just so they’re not sitting on unsellable stock.
- ParlimentOfDoom@piefed.zipEnglish15 hours
There’s no components available to plug into a new motherboard. Demand has dropped.