The HDMI Forum, which controls the HDMI specification, continues to stonewall open source. Valve’s Steam Machine supports HDMI 2.1 in hardware, but the mini-PC is software-limited to HDMI 2.0. As a result, 4K output above 60 frames per second is only possible with limitations.
- TurboWafflz@lemmy.worldEnglish5 months
Console manufacturers all just need to switch to displayport to encourage tv manufacturers to do the same. No one’s going to not buy a ps6 or steam machine because they have to use a little dp-hdmi adapter, but they might be a little more likely to choose a tv that doesn’t need an adapter over one that does
- ShinkanTrain@lemmy.mlEnglish5 months
Sony would probably create a proprietary standard before they’d switch to displayport.
- Fluffy Kitty Cat@slrpnk.netEnglish5 months
What about shipping an adaptor? DP to HDMI for the transition?
- curbstickle@anarchist.nexusEnglish5 months
DP can do HDMI natively.
HDMI needs an active adapter to support DP.
- mybuttnolie@sopuli.xyzEnglish5 months
yes, i have a dp to hdmi 2.1 cable that cost like 35€. it works fine except each time i get up from my chair the screen flashes white. and no VRR.
- madjo@feddit.nlEnglish5 months
Put a ferrite core (or ferrite bead) around your cable to lessen the impact ESD has on your video output.
- 5 months
There are a lot of passive adapters out there that seem like black magic at first.
Obviously has drawbacks, but you can also do displayport/hdmi to VGA passively.
- 5 months
Oh, you may have got me.
By passive, I was trying to say that no external power/additional connection/etc is required. But, the port is technically powered and that is enough to make it not passive.
- yucandu@lemmy.worldEnglish5 months
I have a 25ft HDMI cable that works fine for everything except sometimes when I open a new tab in Firefox, the screen temporarily goes blank/no-signal for a couple seconds. Also sometimes when the dryer turns on.
- mybuttnolie@sopuli.xyzEnglish5 months
my friend has had this issue for years, when he turns on a fan his monitor turns off for a second
- 5 months
Yes, DP converts to HDMI natively. But because HDMI has so much proprietary BS built in, going from HDMI to DP requires an active adapter which strips out the proprietary BS.
- A_Random_Idiot@lemmy.worldEnglish5 months
Yes.
I have a secondary monitor running on an unpowered, dumb (and cheap) DP to HDMI adapter right now
- chaospatterns@lemmy.worldEnglish5 months
Can’t be a passive adapter, or else that would mean DisplayPort and HDMI have to be protocol compatible. If they were, then we wouldn’t have this issue.
Edit: Apparently I was wrong.
- 5 months
Nope. DisplayPort can adapt to HDMI or DVI passively. It won’t support the proprietary bullshit like HDCP, but it will be able to display video just fine. Pin 13 on DP is specifically used to detect adapters, so the output device can automatically change to using an HDMI protocol if it detects an HDMI adapter. This technically requires a dual-mode DP port to automatically adapt, but the vast majority of DP connectors produced in the past several years are dual-mode.
But going the other direction (HDMI to DP) requires an active adapter, to strip out all of the proprietary HDMI-only bullshit.
- 1Fuji2Taka3Nasubi@piefed.zipEnglish5 months
Technically no, it has to specifically have Dual-Mode support (DP++). In practice most of them do, at least in the consumer space.
If it doesn’t then you need an active adapter.
- Assassassin@lemmy.dbzer0.comEnglish5 months
https://duckduckgo.com/?q=dp+to+hdmi
Here’s an already filled in Google search for you.
- athatet@lemmy.zipEnglish5 months
Oh wow thanks! Alright everyone. We can all get off lemmy now. Turns out we can just look everything up online. No need to waste time talking to each other. Ugh!
- Assassassin@lemmy.dbzer0.comEnglish5 months
Or you can simply look up the answer to a super basic question in the same amount of time it takes to ask it in a forum so that you’re contributing to the conversation, rather than lazily putting it on other people to answer.
You can’t look up everything online, but you can look up basic information fundamental to the conversation you’re in.
- Archer@lemmy.worldEnglish5 months
Or choosing to ask a question means they actually want to hear what other people have to say and not whatever AI slop the major search engines have become
- dovahking@lemmy.worldEnglish5 months
Phone companies succeeded in killing 3.5mm audio port with that strategy. So why not, for once, use it for a good cause?
- 5 months
I agree with the sentiment but we’re dealing with a chicken and egg problem. If no TVs have DisplayPort, who would buy a console that can’t be used with their TV?
- 5 months
TVs are starting to come with DisplayPort already in the form of USB-C alt mode.
- rubdos@lemmy.zipEnglish5 months
Not really. Both could start shipping both connectors, unless there’s some licensing issue I’m unaware of?
- 5 months
If I’m a TV manufacturer, I have less incentive to have both connector types because it increases cost and complexity while only appealing to a very small subset of users. It will take leadership at those companies to take a bit of a leap of faith that the effort is valuable as a long term plan because it will take other manufacturers to make the ecosystem. Couple that with the fact that leadership at companies tend to not be enthusiasts or technically inclined and it makes it difficult, but not impossible. I really hope we can move electronics towards DisplayPort just so it’s an open standard instead of the HDMI for-profit model.
- 5 months
Ah, the Apple strategy of forcing a standard.
EDIT: By that I mean when Apple started putting USB (1.0) on their Macs back in the day to encourage more USB accessories. Not their proprietary (what was the old iPod connector called?) or lightning BS.
- 5 months
You’re thinking of FireWire, and that was not proprietary: Apple led its development (standardized as IEEE 1394), with Sony and others contributing, and Sony shipped it as i.LINK. I had a MiniDisc player with a FireWire port. And Thunderbolt, which is what they use now, is a successor developed by Intel together with Apple.
Both firewire and thunderbolt are superior to USB.
- 5 months
I have an old iMac that I use as a Plex server, and it has a firewire 800 port and a thunderbolt 1 port, both of which I use for a couple of very old external drive enclosures. Sure as hell beats USB 2.0.
- JoeBigelow@lemmy.caEnglish5 months
I always wanted a way to use that jack, still have no clue what it was for
- 5 months
I was thinking of the 30 Pin Dock Connector, which was proprietary, but it looks like it used both FireWire and USB protocols.
Apple was known for helping propagate FireWire too.
- 5 months
Leave it to me to only consider the male end of the connector 😜
- Gerudo@lemmy.zipEnglish5 months
The first time I ever used a firewire port, I thought it was black magic compared to usb. It was INSANELY faster and super consistent speed. It was the same level of wow as the first time I used an SSD vs HDD.
- 5 months
Compared to USB 2.0, FireWire 400 was often faster in practice (due to a consistent transfer rate rather than a variable one), and I’ll explain where the true performance came into play, and how Thunderbolt also has this feature:
When USB connections begin a data transfer, they start at 0 Kb and slowly speed up to the maximum transfer rate, then slow down again before completion. FireWire (and its successor, Thunderbolt) maintain a consistent data transfer speed: it begins at that transfer rate, and ends at that transfer rate. This is especially good if you’re moving around a large number of small files.
Also, FireWire 400’s sustained throughput already beat USB 2.0’s real-world speeds, despite USB 2.0’s higher 480 Mb/s nominal rate. FireWire 800 doubled that, and Thunderbolt 1 started at 10 Gb/s. We’re at Thunderbolt 5 now, and I stopped keeping track of the data rates because they were so blazingly fast.
One drawback, however, is that FireWire cables, and subsequently Thunderbolt cables, are both extremely expensive and not very durable. They contain a lot more twisted copper wires, and tend to wear out faster. USB cables are nearly indestructible.
Additionally, FireWire (and Thunderbolt) is also a networking protocol. You can create an ad-hoc LAN with just a FireWire or Thunderbolt cable. This is natively built into macOS, but on Linux it requires some sorcery to make it work. With a Mac, in an emergency, you can boot a Mac with a damaged hard drive remotely from another functional Mac just by using a Thunderbolt (or FireWire) cable. It’s a neat trick, and has saved my ass more times than I can count.
One final awesome feature is that Thunderbolt natively carries DisplayPort signals, and since Thunderbolt 3 it has used the USB-C connector.
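For context, the nominal rates of the interfaces discussed in this comment can be compared with a quick sketch. These are published line rates, not sustained throughput; real-world speeds are lower, especially for USB, which is where FireWire’s consistency paid off:

```python
# Nominal signaling rates for the interfaces discussed above, in Mb/s.
# (Real-world throughput is lower, especially for USB's variable bulk transfers.)
rates_mbps = {
    "USB 2.0": 480,
    "FireWire 400": 400,
    "FireWire 800": 800,
    "Thunderbolt 1": 10_000,
    "Thunderbolt 5": 80_000,
}

for name, mbps in rates_mbps.items():
    # Convert megabits per second to gigabytes per second (8000 Mb = 1 GB).
    print(f"{name:>13}: {mbps:>6} Mb/s = {mbps / 8000:.2f} GB/s")
```

On these figures, Thunderbolt 1 works out to about 1.25 GB/s, roughly twenty times USB 2.0’s nominal rate.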
- ayyy@sh.itjust.worksEnglish5 months
No, youngin’ they’re talking about USB. The original iMac was USB-only specifically to force the adoption of USB keyboards and mice.
- 5 months
Not their proprietary (what was the old iPod connector called?)
This is what I was responding to. Also, they only sold that iMac for about a year, after which point iMacs came with FireWire ports.
And I’m in my 40s. I’m not a “youngin’”.
- pivot_root@lemmy.worldEnglish5 months
As long as the manufacturers are competing against each other, that’s never going to happen.
The “gamer” consumer demographic has some of the whiniest, most entitled vocal minorities. They’re going to endlessly complain about the next generation of console needing a special cable/dongle to connect to their TV, one of the manufacturers is going to fold, and then the other one is going to walk back the lack of HDMI because they don’t want to lose sales to their competitor.
- eRac@lemmings.worldEnglish5 months
It bothers me. So many things are either not standards-compliant or support different parts of the USB feature set that compatibility is a wildcard.
I carry a large backup battery when I travel for work. It can keep my laptop going under load all day, allowing me to not care at all about proximity to outlets when working. It also allows me to painlessly recharge phones by just handing it to someone.
Last week, I was running something from someone else’s laptop (enterprise HP, like mine, but different model). It got low, so I pulled out my battery. Plug it in… No power. I could see the voltage fluctuations of it negotiating, but nothing after that.
- Bilb!@lemmy.mlEnglish5 months
I have actually been looking at modding an NES with HDMI (and other goodies) as a small project. There are various kits out there.
- pivot_root@lemmy.worldEnglish5 months
That was something they could actually market to the consumer as a necessary upgrade, though.
- “Sure, you need a new cable, but component video has cleaner edges and less color bleeding.”
- “Sure, you need a new cable, but HDMI has better resolution and no fuzziness.”
Going from HDMI 2.1 to DisplayPort 2.1a doesn’t offer anything other than higher bandwidth, and not even high-end PCs are capable of pushing resolutions at high enough framerates for that bandwidth to have been the limiting factor for games.
Because of that lack of perceptible benefit to them, the optics of replacing HDMI on consumer devices that are meant to be connected to TVs isn’t going to be good. Even if it’s an objectively better standard from a technical perspective, it will just come across to consumers as an unnecessary change meant to push their TVs towards planned obsolescence.
They’re going to complain about it, the media will pick up on the story and try to turn it into a scandal, and then legislators and regulators will step in and make decisions based on limited understanding of the technical reasons. By that point, one of the console manufacturers will have been pressured into backing down and promise to keep HDMI in their next-gen console, and the other ones will have followed suit because they don’t want to lose sales over it.
The only way console manufacturers are going to stay united in kicking HDMI to the curb is if the organization behind HDMI pulls a Unity move and starts charging royalties to the manufacturers for every time a consumer plugs the console into a TV.
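The “bandwidth isn’t the limiting factor” point above can be roughed out with a back-of-envelope sketch. The 20% blanking overhead and the quoted link rates (48 Gb/s for HDMI 2.1 FRL, 80 Gb/s for DP 2.1 UHBR20) are approximations, not exact protocol accounting:

```python
def video_bandwidth_gbps(width, height, hz, bits_per_channel=10, overhead=1.2):
    """Approximate uncompressed RGB bandwidth, with ~20% blanking overhead."""
    return width * height * hz * bits_per_channel * 3 * overhead / 1e9

# HDMI 2.1 FRL tops out around 48 Gb/s; DP 2.1 UHBR20 around 80 Gb/s.
for w, h, hz in [(3840, 2160, 120), (7680, 4320, 60)]:
    print(f"{w}x{h}@{hz}Hz 10-bit ~ {video_bandwidth_gbps(w, h, hz):.1f} Gb/s")
```

On these rough numbers, 4K120 at 10-bit (~36 Gb/s) fits inside HDMI 2.1’s 48 Gb/s without DSC; only something like 8K60 (~72 Gb/s) actually needs DisplayPort 2.1’s extra headroom.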
- tty5@lemmy.worldEnglish5 months
HDMI Forum has fewer than 80 members and membership fee is 15,000 USD/year. Valve could spin up 80 companies, have them join the forum for a low low price of 1.2M USD and outvote remaining members to open source the entire spec.
- BreadstickNinja@lemmy.worldEnglish5 months
This is hilariously plausible. Someone SCUBA down to Gabe and give him the idea.
- Katana314@lemmy.worldEnglish5 months
dives to 50 feet, removes tube from mouth to shout to Gabe, and fucking drowns
- utopiah@lemmy.worldEnglish5 months
FWIW (and I know it’s not the joke…) it’s perfectly fine to remove the mouthpiece while scuba diving. In fact it’s part of basic training: you should be able to remove the mouthpiece and take another one, your octopus or your buddy’s, in case there is an incident.
No… the real question for a good diver is how the heck you’re going to say HDMI 2.1 with hand signs! /s
- zerofk@lemmy.zipEnglish5 months
You breathe through an octopus? Don’t they need that oxygen themselves?
- utopiah@lemmy.worldEnglish5 months
Just have to ask nicely. 🐙
(for people confused https://en.wikipedia.org/wiki/Diving_regulator#Octopus )
- Croquette@sh.itjust.worksEnglish5 months
I fucking hate this phrase. You have the choice to not participate and be a normal human instead of a sociopath.
Hate the players because they perpetuate the game.
- Buddahriffic@lemmy.worldEnglish5 months
It’s even more pathetic than that. They aren’t just expressing their will to play the game, they are asking for approval despite it. It’s similar to the “nothing personal” disclaimer which is usually followed by something with significant personal disruption.
Most honestly expressed, they’d be, “I’m doing/about to do something that impacts you negatively, please don’t retaliate against me because I don’t like it when negative things happen to me.”
Edit: just noticed the commenter you replied to reversed the original saying and agrees with you.
- Cybersteel@lemmy.worldEnglish5 months
I’m pretty sure the saying goes, “don’t hate the player, hate the game.” Which implies that you shouldn’t be blaming the bad actors but the bad system that causes it to be that way.
- JcbAzPx@lemmy.worldEnglish5 months
Which is asinine both here and in its original use. If there weren’t bad actors the system wouldn’t be broken. The players make the game.
- Lost_My_Mind@lemmy.worldEnglish5 months
Does it have to be companies? Could individual people just have 15k, and join? We just need 81 new members.
- Dran@lemmy.worldEnglish5 months
Unfortunately it not only has to be companies, but unless you are a producer of products that are HDMI certified already your membership will be denied. It would take a lot of fuckery to make that many corporations and not have all of their membership applications be denied. Also I’m not sure that it’s even a voting democracy in the traditional sense even if you could.
- 5 months
Don’t you just need to set up a run of HDMI devices and have 80 companies invest together as a group for manufacturing, then have each company put their own sticker on it?
- tty5@lemmy.worldEnglish5 months
While doing that for 80 companies is not feasible I doubt all 80 members are opposed. Valve and AMD could talk to video card, monitor, laptop and handheld makers to pad the membership enough.
As for the democracy question a quick skim of their bylaws suggests it’s close enough.
- Shoshin@aussie.zoneEnglish5 months
Are people just forgetting it has a displayport also? Just ignore HDMI, they got greedy, onto the rubbish pile they go.
- pressanykeynow@lemmy.worldEnglish5 months
The people who block HDMI for Linux are also the people who make TVs and other media stuff. So you may not be able to use displayport or hdmi just because some rich people decided so to make more profit.
- 5 months
This is what I said the other day about this issue. Good luck finding a decent tv with display port! Those fuckers are rare and expensive!
- Routhinator@startrek.websiteEnglish5 months
They are called monitors, and yeah expensive, but then you don’t get a “Smart TV” with tracking and bullshit.
- sunbeam60@feddit.ukEnglish5 months
Nor do you get TV tuners. While most geeks probably couldn’t care less, any associated family members do prefer to watch Great British Bake Off as it airs.
- Cybersteel@lemmy.worldEnglish5 months
You could get an external digital tuner and an HDMI switch to switch between the PC and the TV.
- 5 months
My grandmother still has an ancient tv with one of those tuner boxes she bought for her tv when analog went off the air.
- madjo@feddit.nlEnglish5 months
I would use an Apple TV or a Chromecast in that case. Most TV providers that I know offer their own mediabox anyway, so no need for TV Tuners anymore.
- Petter1@discuss.tchncs.deEnglish5 months
My country just shut off radio and TV over coax, and providers send IP-TV boxes instead.
This allows for faster internet now, which I like.
- 5 months
Just told my elderly aunt to not buy a smart tv today!
But do they make 40ish+ monitors?
- Routhinator@startrek.websiteEnglish5 months
They do, but they’re much more expensive than a smart TV, even though they have fewer components… because a smart TV is sold for less in exchange for providing the vendor access to you as a product for all of their 3rd-party partners.
- Routhinator@startrek.websiteEnglish5 months
I will add that you can also still get Westinghouse dumb TVs with included DVD player and USB video player, 3x HDMI and a tuner, but 1080p and max 36".
- 5 months
This is big! Where can I buy one for my aunt‽ She needs a dumb TV that integrates with the Westinghouse infrastructure.
- Routhinator@startrek.websiteEnglish5 months
This is the one I’ve gotten two of, for the kids’ playroom and bedroom… They work great, old school simple tech.
https://www.amazon.ca/Westinghouse-Parental-Controls-Non-Smart-Monitor/dp/B09QXYZB3Y
- FG_3479@lemmy.worldEnglish5 months
It doesn’t matter if you get one. They are the cheapest and best image quality TVs available, and many like Samsung’s let you set them up without connecting to Wi-Fi.
- 5 months
Yeah, it’s a fucking sad state of affairs. I actually paid extra for my dumb TV, but then hooked up a jailbroken Fire Stick, so I guess I’m a hypocrite.
- FG_3479@lemmy.worldEnglish5 months
I understand why, but you didn’t have to. Many smart TVs let you set them up without Wi-Fi and just use the HDMI ports.
Samsung, LG, and Google TV models do, but Roku and Fire TV do not.
- 5 months
Basically no modern TV has DisplayPort, except for the few that come with USB-C.
- 5 months
DisplayPort to HDMI adapters are quite cheap and don’t add latency.
- FalschgeldFurkan@lemmy.worldEnglish5 months
Aside from practicality, might there be something that gets lost if you do this? (e.g. worse frame rate, quality or something else)
- 5 months
No. HDMI and DVI-D use fundamentally the same TMDS signaling, and a dual-mode DisplayPort source can emit that signaling directly, which is why passive DP-to-HDMI adapters work. Going the other way, from an HDMI source into a DisplayPort sink, requires an active adapter, since the DP sink only speaks DisplayPort’s packet protocol (and HDMI layers proprietary DRM on top). Passive DisplayPort to HDMI conversions are completely lossless and don’t add latency AFAIK.
- FalschgeldFurkan@lemmy.worldEnglish5 months
Oh wow, all that hassle just for some DRM, as if that would prevent anything… Thanks for clarifying 👍
- 5 months
Yes, but forget about VRR unless you want to flash a custom firmware, and with that the adapter is finicky as hell. I’ve been using one for 2 years now. It kinda works.
- Echo Dot@feddit.ukEnglish5 months
It might be possible to get video output wirelessly; it has a dedicated radio for video output, so presumably if you just stuck an adaptor into your TV you could cast to it. Could be quite a nice setup if you wanted to be able to connect to both your computer in the study and your TV in the living room.
- chiliedogg@lemmy.worldEnglish5 months
What we want is a solution for customers who don’t understand the benefit of DP and won’t buy an adapter when there’s already HDMI ports on both devices.
- Treczoks@lemmy.worldEnglish5 months
It probably would, but that is already too complicated for most people.
- douglasg14b@lemmy.worldEnglish5 months
That’s not really how it works, given that so many devices have HDMI ports.
If we expect to make hardware devices that are generally compatible with interfaces non-technical users use, then excluding an entire class of common modern interface spec isn’t a great choice.
It’ll be fine for now, but as the specs bump up in version and HDMI changes over time, it’s just going to get worse.
- MonkderVierte@lemmy.zipEnglish5 months
but as the specs bump
Eh, 4K is already at the point of diminishing returns. There have been 8K displays for a while now, but nobody buys them.
Rather, create 5k, please.
- TotalCourage007@lemmy.worldEnglish5 months
I wish game devs knew how to optimize for storage better. Let me install HD instead of 4k games if I choose. I do the same thing for anime series because HD barely takes up any space at all.
- MonkderVierte@lemmy.zipEnglish5 months
Weell, except some series, where every 20 min HD piece is 1 GB.
- MonkderVierte@lemmy.zipEnglish5 months
No, bad transcoding settings while ripping. Lycoris Recoil is especially bad, 1.5 GB per episode while 300 MB mpeg is usual. Ah, to note that the lossless encoding they use on DVDs is over 6 GB per episode.
- GreenKnight23@lemmy.worldEnglish5 months
no. I have a DP to HDMI cable that cost me like $20. it does great.
- b34k@lemmy.worldEnglish5 months
I have a DP to HDMI cable… it still won’t do HDMI 2.1. No 4k120 without DSC… no VRR…
- MSids@lemmy.worldEnglish5 months
Not everything you don’t like is a bot. I learned something new today, that DP supports audio, and I feel a bit foolish for not knowing that before now, though I stand by my personal experience with the connector. Between work and home, it’s always the DP that flickers at the slightest tap.
- MSids@lemmy.worldEnglish5 months
Two considerations: Displayport doesn’t support audio, and there is no connector on the planet more frustrating and unreliable than DisplayPort. It’s like a joke how sensitive it is to the lightest bump. HDMI just works.
- ArtVandelay@lemmy.worldEnglish5 months
DisplayPort absolutely does support audio:
DisplayPort supports a wide range of audio formats, including multi-channel audio like 7.1 surround sound, ensuring compatibility with various multimedia applications and delivering a robust audio experience directly aligned with the video output.
https://www.anker.com/blogs/hubs-and-docks/do-display-ports-support-audio
- massacre@lemmy.worldEnglish5 months
You seem knowledgeable, Mr. Vandelay. Perhaps you deal with imports and exports… if so, on the topic of audio on DisplayPort, are you aware of any receivers that will split the signal to send audio to speakers and video to your projector or monitor (or TV, though few have DP)?
Serious question about the receiver if you do know of any - it’s come up in the last week while seeing the Valve HDMI news on Lemmy. I found some projectors that have DP, but no receivers and hoping someone here can!
- ngdev@lemmy.zipEnglish5 months
should be able to with hdmi arc, not 100% sure tho, but it seems like you could tell the projector where to send the audio, i know you can with tvs and hdmi arc. worst case scenario you do dp in to the projector and hope valve has stereo or optical out to go into the receiver
- Petter1@discuss.tchncs.deEnglish5 months
Yes, I have noticed that if a speaker is connected to the HDMI-ARC port, the TV will still play audio from other input sources on those speakers. But in my setup, the other input sources are HDMI as well…
- SreudianFlip@sh.itjust.worksEnglish5 months
I know that there are some complicated configurations that you could use to get the audio feed from display port to your receiver, like running it through a splitter that will strip out the audio and send it to your receiver separately. I’m pretty sure there are no mainstream AV receivers that will do what you want because the market is split between home theatre and PC, as mentioned elsewhere in this thread, and manufacturers need to be convinced there’s a market for it.
In that situation, I would connect the output device, in this case a PC, directly to the TV/monitor with DP, and run optical audio from either the TV or the output device to receiver.
You lose some of the integrated control that HDMI-CEC gives you, so get a good universal remote that can adapt to this set up and get one-button source switching back.
- massacre@lemmy.worldEnglish5 months
I’ve considered this. I would like multiple sources on my receiver though, so as you say this requires at least a universal remote, or 2 remotes, to swap back to the receiver. My current projector only has 2 HDMI ports. Perhaps in the future this can be my setup.
- TonyTonyChopper@mander.xyzEnglish5 months
I run a USB DAC off my PC, then have RCA cables going to my speakers. Usually the DAC built in to a PC or TV is terrible compared to a dedicated one.
- Billegh@lemmy.worldEnglish5 months
I’m not sure where you’re getting your information, but displayport absolutely supports audio. In most of the same formats that HDMI does as well.
Also, I’ve only ever had issues with HDMI plugs. All the displayport plugs I’ve used had positive locks on them and have been the most reliable plugs I’ve ever had to use aside from BNC connections.
You could perhaps have instead gone with “you don’t find displayport on cheap consumer displays,” because that’s an accurate statement. That’s a huge part of why this is a big deal.
- alphabethunter@lemmy.worldEnglish5 months
My cat literally loves hiding behind my display port connected monitor, bumps into it all the time, it has never disconnected or stopped working. Your cable might suck.
- DragonOracleIX@lemmy.mlEnglish5 months
Where did you hear that? I use DP to connect to my monitor and I can play sound through it.
- boonhet@sopuli.xyzEnglish5 months
I have an HDMI cable somewhere that stops working temporarily when there are changes in temperature, air pressure or planetary alignment.
This is not an HDMI vs DisplayPort issue.
- Kissaki@feddit.orgEnglish5 months
https://hdmiforum.org/members/
AMD is part of the forum but can’t get them to accept their own open source driver. I guess we can’t lump all of them together for blame and shame. I wonder who voted to reject vs accept.
Sad.
- Bakkoda@lemmy.zipEnglish5 months
Then AMD needs to apply more leverage or start an awareness campaign with as much shit PR for every business supporting them.
- tty5@lemmy.worldEnglish5 months
Fewer than 80 members, a 15k/year membership fee, and very lax joining requirements. $1.2M gets you a majority, allowing you to do whatever you want even with 100% of the current members opposed :P
- felbane@lemmy.worldEnglish5 months
How hilarious would it be if the AMD board member was the one who vetoed the driver 😅
- tty5@lemmy.worldEnglish5 months
AMD has had the code ready to include in their open source driver for a while and has been trying to get HDMI Forum to let them release it for a long time https://www.phoronix.com/news/HDMI-2.1-OSS-Rejected
- DasSkelett@discuss.tchncs.deEnglish5 months
Know your enemy, they say. But jokes aside, I believe DisplayLink’s focus is primarily on the client<->docking station part, with docking station<->monitor usually still being HDMI/DP (same with direct client<->monitor links). So they still have to interface with it one way or another.
- Kissaki@feddit.orgEnglish5 months
I was confused by seeing DisplayLink there, but couldn’t fully place them, and Wikipedia made me think they may be using HDMI and have an interest in keeping it inaccessible to sell their products and services.
- 5 months
We need more DisplayPort in the world - and better support for multi-monitor setups under Linux.
- 5 months
Every time I plug a second monitor into my Linux PC (Mint), both screens start blinking on and off. Sometimes Win+P works, sometimes it doesn’t. I’ve asked in the Discord, I’ve tried arandr. Nothing works.
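For what it’s worth, on an X11 session (which Mint’s default Cinnamon desktop uses) configuring the layout directly with xrandr from a terminal sometimes works when the GUI tools fail. A sketch only: the output names (eDP-1, HDMI-1) are placeholders that vary by driver, so substitute whatever `--query` reports on your machine:

```shell
# List connected outputs and the modes each one supports.
xrandr --query

# Place the external monitor to the right of the laptop panel at 1080p60.
# Replace eDP-1/HDMI-1 with the names reported by the query above.
xrandr --output HDMI-1 --mode 1920x1080 --rate 60 --right-of eDP-1
```

If the screens still blink, trying a lower refresh rate or a different cable can help rule out a signal-integrity problem rather than a driver one.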
- TapatioOnEverythin@lemmy.zipEnglish5 months
Multi-monitor display issues are why I switched from Mint to a distro that uses Wayland.
- KhanLee@lemmy.zipEnglish5 months
Do you have adaptive sync on? If you do, try turning it off for both monitors. That worked for me, but your mileage may vary.
- Anafabula@discuss.tchncs.deEnglish5 months
Mint isn’t Wayland yet, and X11 does not support adaptive sync if multiple monitors are connected.
- Bilb!@lemmy.mlEnglish5 months
It’s a generic term for FreeSync/G-Sync, having a variable refresh rate.
- 5 months
Oh. No, my monitors should always be at 60Hz now. They’re both DC powered (internal + USB).
- Buelldozer@lemmy.todayEnglish5 months
Odd, I have three screens active on my Mint desktop right now. I’ll have to try this on my Mint laptop and see what happens.
- T4V0@lemmy.ptEnglish5 months
3 monitors here and working fine on elementary OS, Super+P works as well. Are you on Wayland or X11?
- 5 months
Whatever Cinnamon runs on natively. I have no idea beyond that.
- Anafabula@discuss.tchncs.deEnglish5 months
Last I heard, Cinnamon does not have stable Wayland support yet. One of the reasons Wayland was made is that multi-screen support on old X11 is an ugly hack. Unlike Wayland, X11 doesn’t play well with screens of different resolutions, refresh rates, adaptive sync capabilities or HDR.
- T4V0@lemmy.ptEnglish5 months
Then it would be best to check the version of your distro and session, then go to their GitHub issue list and search for it; if there isn’t one, file an issue for them to look at. I also recommend verifying that it’s not a hardware issue or cable fault.
- 5 months
Neofetch says that it’s seeing an AMD ATI Radeon HD 8730M and an Intel Haswell-ULT, which is probably the chipset connected to the i7-4600U, and I’m assuming what 99% of the graphics are being generated on.
- Kuro@feddit.orgEnglish5 months
Try making the move to Wayland. I have zero issues connecting or disconnecting monitors.
- 5 months
If I can do it while keeping Cinnamon, and my current desktop, I might.
- ramenshaman@lemmy.worldEnglish5 months
I recently watched a video about HDMI. I didn’t hate HDMI but now I kinda do.
- wavebeam@lemmy.worldEnglish5 months
Unfortunately most standards bodies are pretty much this stupid. Blu-ray, DVD, USB, hell, even codecs like H.265 and MP3 have governing bodies that are mostly enterprises enforcing their collective power on standards. That’s good in some ways, because it means they all have to decide on a standard that’ll work with pretty much anything, but bad because they can also force bullying like HDCP onto consumers.
- count_dongulus@lemmy.worldEnglish5 months
It’s $1.2M to gain a majority share on the HDMI board, but it sure would be nice if someone gave $1.2M to one of the engineers with access to the cryptographic DRM keys, so the binary could “apparently get hacked” and the keys magically appear online.
- count_dongulus@lemmy.worldEnglish5 months
I bet we would start to see Chinese adapters showing up on the market with DisplayPort to HDMI 2.1 though.
- Echo Dot@feddit.ukEnglish5 months
Oh yeah, and they would mostly be terrible, except for one brand no one has ever heard of, but it’s apparently a big name in China, and called something like Zloks, which would inexplicably be the king of that particular niche product.
- 5 months
You can/should write your congressman (or the equivalent in your country). Just the threat that congress will open up the laws if OSS can’t use HDMI will get action. In a democracy, voters have more power than big money when they care and vote like it.
- NutWrench@lemmy.worldEnglish5 months
Yup. Especially since “smart TVs” that are WiFi connected can spy on your HDMI connection. Samsung, LG and Vizio were caught doing it.
https://www.pcmag.com/how-to/is-your-tv-spying-on-you-how-to-check
- Logical@lemmy.worldEnglish5 months
After all the shit I’ve seen and heard about the creepy shit smart TV manufacturers get up to I am never ever connecting a smart TV to the internet in my home.
- XeroxCool@lemmy.worldEnglish5 months
I didn’t connect my free Roku TV to the new wifi, and suddenly the remote works like shit. Turns out, it’s a wifi remote that would rather not use infrared and the infrared receiver has been slightly blocked this whole time.
The way to set it up without connecting it to wifi is very hidden. I had to look it up after because I couldn’t figure it out. I fucking hate smart tvs.
- Echo Dot@feddit.ukEnglish5 months
I hate the Wi-Fi remote thing. My TV has a Wi-Fi remote that runs off of AA batteries, and they last about 4 days before running out. If I hadn’t replaced them with rechargeable ones, they would probably have had to open a new landfill site just for my old batteries.
- XeroxCool@lemmy.worldEnglish5 months
My roku has an internal rechargeable battery and lasts months. But what’s infuriating is I read up further and found it doesn’t actually use my wifi network. It’s direct to the TV. So why wouldn’t it work without internet? Insane.
- WhyJiffie@sh.itjust.worksEnglish5 months
if they can do that, how come they can’t do the same with displayport?
- OR3X@lemmy.worldEnglish5 months
But why does the HDMI forum not want an open source 2.1-compliant implementation? Is it DRM related? I feel like it’s DRM related.
- bobs_monkey@lemmy.zipEnglish5 months
Likely more so that they’re facing pressure from other competitors in the industry that see Steam, and open source in general, as a threat to their business model. The HDMI Forum is made up of industry leaders, and naturally Microsoft and Sony are there.
- Jesus_666@lemmy.worldEnglish5 months
They’ve been refusing open HDMI 2.1 since 2017. I don’t think that being afraid of Linux becoming the dominant gaming platform plays a role here; it’s more likely that they’re afraid people might find new ways to get at protected content.
- JoeBigelow@lemmy.caEnglish5 months
I’ve never had using HDMI prevent me from enjoying pirated media, so I’ve always been confused about what sort of DRM a TV is looking for.
- Imacat@lemmy.dbzer0.comEnglish5 months
It’s more of a barrier for people who are pirating media, not the ones consuming that pirated media.
- leftzero@lemmy.dbzer0.comEnglish5 months
Don’t they mostly download it directly from streaming platforms these days, skipping the display and its connector altogether…?
- b34k@lemmy.worldEnglish5 months
Isn’t getting at protected content pretty trivial anyway? At least that’s my impression from how easy it is to find basically anything.
- Buelldozer@lemmy.todayEnglish5 months
But why does the HDMI forum not want an open source 2.1-compliant implementation?
To my knowledge they’ve never officially said, but you can be sure it has to do with content protection, and that means DRM. An open source HDMI 2.1+ driver would make pirating much simpler, probably trivial, and they don’t want that.
It’s possible anyway, of course, but there are a couple of hardware hoops to jump through, and that’s enough to keep most people from doing it.
- 5 months
Because that would open source certain implementations they want to hold captive.
It also enforces closed source drivers, which can be shipped with spyware/crapware, further extending profits for companies… companies that happen to make up the HDMI Forum.
- tty5@lemmy.worldEnglish5 months
They charge a fee for access to the spec, control who can claim their products are HDMI compliant, and require compliance testing on those products.
An open source implementation would make the spec public and strip away a lot of the control they hold.
- plantfanatic@sh.itjust.worksEnglish5 months
Part of being open source is that the license carries forward. That would let anyone else piggyback and avoid the fee.
- tabular@lemmy.worldEnglish5 months
I wish I could buy hardware without HDMI at all so they got no money from me.
- Echo Dot@feddit.ukEnglish5 months
As much as I agree, I think it would have been a bad move for them to do that. The device’s success is already highly dependent on its price, which is still in flux as far as I understand; there’s no reason for them to make the decision even more difficult.
- Soapbox@lemmy.zipEnglish5 months
Not all software available on Linux is open source. NVIDIA drivers, for example. Hell, most of the games on Steam are closed source.
So, is it just a matter of principle on Valve and AMD’s part that they only want to ship with fully open source drivers?
I’m not technically knowledgeable enough to understand why you can’t just make the HDMI 2.1 part of the driver code closed source and keep the rest of the graphics driver open.
- turmacar@lemmy.worldEnglish5 months
Modern specs are complicated. I vaguely remember something about a cryptographic key the driver needs to be signed with to successfully complete the handshake that enables all display options between the computer and the display.
Not entirely unwarranted, either: an unexpected voltage on an unexpected pin damaging your TV because the driver/hardware is misconfigured would suck. (Still sounds like the Forum is being a dick about it, though.)
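To illustrate the point about the secret handshake: this is a toy challenge-response sketch, not the actual HDCP protocol (the key, function names, and flow here are all made up for illustration). It shows why a driver that must hold a licensed secret is hard to square with publishing its source:

```python
# Toy challenge-response handshake, loosely in the spirit of HDCP-style
# link authentication. NOT the real HDCP protocol -- just an illustration
# of why a licensed secret can't live in open source code.
import hmac, hashlib, os

DEVICE_KEY = b"licensed-device-key"  # hypothetical; real licensees keep theirs private

def sink_respond(challenge: bytes, key: bytes) -> bytes:
    """The display proves it holds a licensed key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# The source (e.g. a GPU driver) challenges the sink before enabling output.
challenge = os.urandom(16)
expected = hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()

authentic = hmac.compare_digest(sink_respond(challenge, DEVICE_KEY), expected)
impostor = hmac.compare_digest(sink_respond(challenge, b"guessed-key"), expected)
print(authentic, impostor)  # True False
```

The moment the driver's source ships with the key inside, anyone can read it and impersonate a "compliant" device, which is roughly the fear being described.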
- A_Random_Idiot@lemmy.worldEnglish5 months
Unless you’re a NES, in which case a voltage spike gets the DRM permanently out of the way forever.
- ryannathans@aussie.zoneEnglish5 months
No, they are not. The kernel module is open, which is not the driver. You’ve been lied to.
- async_amuro@lemmy.zipEnglish5 months
Maybe a dumb question… if I used a DisplayPort to HDMI 2.1 adapter, would I get 4K at 120Hz on the Steam Machine and my LG CX TV?
- unalivejoy@lemmy.zipEnglish5 months
Depends on the adapter and the source. You may run into issues playing HDCP-protected content if you buy a low quality adapter.
- 5 months
DisplayPort to HDMI doesn’t need an active adapter since DP has an alternate signalling mode with HDMI support.
Although I’m not sure whether 2.1 signalling is available that way or not.
- kkj@lemmy.dbzer0.comEnglish5 months
DP++ HDMI support tells the GPU to output HDMI. If the GPU can’t output HDMI 2.1 over an HDMI port, it can’t output it over a DisplayPort (as a general rule; you could theoretically wire the DP for a higher standard, but why would you?)
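For a rough sense of why the GPU’s signalling matters here (my own back-of-the-envelope numbers, not from the thread): even ignoring blanking intervals and encoding overhead, raw 4K120 pixel data already exceeds the 18 Gbit/s that HDMI 2.0 tops out at, which is why 2.1-class signalling is needed for 4K above 60 Hz:

```python
# Back-of-the-envelope bandwidth check for 4K @ 120 Hz, 8-bit RGB.
# Approximate: ignores blanking intervals and line-coding overhead,
# which only push the real requirement higher.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 120, 24

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"raw pixel data: {raw_gbps:.1f} Gbit/s")  # ~23.9 Gbit/s

HDMI_2_0_GBPS = 18.0  # max TMDS line rate, HDMI 2.0
HDMI_2_1_GBPS = 48.0  # max FRL line rate, HDMI 2.1

print("fits in HDMI 2.0:", raw_gbps <= HDMI_2_0_GBPS)  # False
print("fits in HDMI 2.1:", raw_gbps <= HDMI_2_1_GBPS)  # True
```

So whatever the connector, a source limited to HDMI 2.0-level output physically can’t carry 4K120 without tricks like chroma subsampling.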
- hemko@lemmy.dbzer0.comEnglish5 months
Yeah, except it doesn’t support VRR. If you’d read the article, you’d know.
- CIA_chatbot@lemmy.worldEnglish5 months
If the article didn’t require accepting cookies to read it I would :D (just being snarky)
- Buelldozer@lemmy.todayEnglish5 months
Nah, I don’t blame you. The list of crap that I had to allow under “required” to read the article was preposterous.
- Echo Dot@feddit.ukEnglish5 months
It’s got DP as well though, so it’s not all that bad. We really should be pushing manufacturers over to DP anyway.
It’s literally the same feature set.