A search for "LCD controller" on Ebay, Aliexpress, Banggood etc. returns many products; some of these boards need reprogramming for a specific screen, while others can be configured through jumpers. Here are some useful links I collected in the past on the chips and programming tools, plus info about panels.
Not directly related to these products, but I usually buy only from sellers with feedback ratings above 99%, and check their negative feedback to see whether any of it relates to the product I'm about to buy.
A great number of online vendors from China/HK sell counterfeit parts - even the best ones do it, sometimes unknowingly, as they very often have no idea what they're selling - but these are ready-made boards, which is a completely different product. They could be spare parts for cheap old TVs (some have an analog-only TV receiver on board). My experience with Far East sellers suggests never to buy active parts (chips, transistors), because they're very likely relabeled fakes, while boards, modules and passives are usually ok. Probably not the best quality, but they've always worked for me.
The same universal controller boards (more commonly known in the industry as "scalers") can be used to "dumb down" smart TVs. For monitor use, one with HDMI/DVI/VGA is sufficient but you can find the TV-oriented ones with a tuner and composite video input too, as well as speaker outputs.
It's hard to keep track of all the different variants as this seems to be a bunch of Chinese companies all copying each other. You can find these boards under names such as MSD3663/3463 DS3663LUA, TSUMV53/V56/V59, etc.
I wonder if there are any of these with VRR and lower latency. I'd like to upgrade my TV but it seems pretty wasteful to throw out the panel when it's mostly the firmware that's the problem.
> This domain is currently on the Crypto Wallets domain warning list. This means that based on information available to us, Crypto Wallets believes this domain could currently compromise your security and, as an added safety feature, Crypto Wallets has restricted access to the site. To override this, please read the rest of this warning for instructions on how to continue at your own risk.
I've sort of dumbified 2 smart TVs by coupling them with cheap small computers with Ethernet or WiFi in and HDMI out. I don't know for certain that the TVs are not spying on us, but I can get to the internet by going around the TVs, not through them.
If there's a cellular radio, it has to be declared to the FCC. You can look up your TV's FCC ID to verify that it doesn't have one. I'm not aware of any TV that has an embedded cellular radio.
I used to think that would be too expensive but then I saw smartphones sell new for USD 20 on Black Friday and I also see one year of 200 MB a month data plans selling for USD 30.
That’s retail.
I have changed my mind on this: if you were selling a million televisions a month, you could absolutely make it work from a money standpoint. They don’t need to send back tons of data and they don’t need to do it particularly fast.
I expect with a lot of these kinds of use cases you could get absolutely rock bottom volume m2m prices by promising to only use it during designated quiet periods when the network is underutilized.
Maybe, but I feel like there'd be a pretty easy calculus on it at scale— like "your competitor is cheaper. We're going with them unless you can cut your rate in half. One way you can do that is by giving us SIMs that only unlock for an hour once every 24 hours, and you can decide when that hour is, as long as we see our devices at least once a day."
This should be pretty easy to implement on the telco side and it would let them monetize the dead time in the network, same as TOU pricing for electricity.
If you ship a device with a SIM card you then need to follow regulations to ship it. This means you, the buyer, can find out from the FCC what spectrums it speaks on. I don't think mobile connectivity is a much-needed option on TVs, so you won't find it. Connecting to open WiFi hotspots is a completely different beast and doesn't leave behind a tell-tale trace.
You would think it would be easy to implement ... technically ... but one has to remember that, as well as being businesses, many telcos are great hulking slow-moving monsters draped in red tape and politics - anything that changes the existing way of doing things is often a Herculean act, and that strips any dynamism from the market. Don't misunderstand me: there are many discussions in the industry around realising the kind of utopian vision you espouse, but these are all bound up in committees with a typical lifespan of > 10 years ...
Well, I'll take your word for it, as it sounds like you have first hand experience and I'm just speculating. But it doesn't feel like this should be the kind of thing that requires a standards committee to be involved— it's literally just a SIM whose ID is mapped on the backend to reject connections most of the time, or be so heavily throttled as to be practically unusable. There's no new protocol needed here, just a profile added to existing traffic shaping/limiting systems, all of which already have time of day awareness in support of commonly-available features like free data after 5pm or whatever.
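To make that concrete, the "profile" could be little more than a per-SIM time-window check on the backend. A toy sketch in Python - the ICCID, window, and function names are made up for illustration, not any real carrier API:

    # Toy sketch: a SIM ID mapped to a designated off-peak window; data
    # sessions outside that window are rejected (or could be throttled).
    from datetime import datetime, time
    from typing import Optional

    QUIET_WINDOWS = {
        "8991000000000000000": (time(3, 0), time(4, 0)),  # 03:00-04:00 UTC
    }

    def data_session_allowed(iccid: str, now: Optional[datetime] = None) -> bool:
        window = QUIET_WINDOWS.get(iccid)
        if window is None:
            return False                      # unknown SIM: reject
        now = now or datetime.utcnow()
        start, end = window
        return start <= now.time() < end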
OTOH, perhaps it's simply a question of an insufficiently large addressable market for non-realtime data like this. Maybe there's room for an M2M-focused MVNO to negotiate a block of off-hours data for dirt cheap and offer this kind of service, maybe paired with some kind of managed platform to handle the storage and transmission side.
You’d think, ha. Economists have a term for this: “deadweight loss”.
MVNOs are probably the best place for this kind of dynamism, but they are always limited by the tools that the tier 1 carriers give them.
There is stuff like you describe on the way but progress is glacial. You’ll get bits and pieces dripped out bit by bit as the big guys satisfy themselves they’re not missing out on anything.
Just a note that this was one of the puzzles Steve Jobs managed to crack. We had smartphones before the iPhone, but progress was stymied by the industry - thankfully Jobs recognised the potential and had the wherewithal to put the telcos over his knee and make them his bitch.
Steve is gone now, there aren’t many players like that around anymore, and you can see how the smartphone market has gradually started to drift back towards the more orthodox again.
You don't, but connecting them to WiFi won't solve that problem, because they can (and probably do) switch to radio if they can't reach their servers through the WiFi connection.
I recently came into a pile of V-by-one panels, and discovered boards for them are significantly harder to come by than their eDP or LVDS brethren. But I've found a few on aliexpress and we're up and running:
The nice thing about VBO is the connector is more-or-less standardized, and it's a very durable JAE series, so plugging and unplugging isn't such an issue. (Prototyping with LVDS or MIPI DSI tends to run into fatigue issues unless you're scrupulously gentle.)
Beyond the panel interface, there's also the matter of the backlight. Sometimes the backlight driver is separate, sometimes it's built into the scaler board. The connectors for these are almost always evil and you have to hope your supplier can include the cables, because you'll never find them yourself.
I would loooooove to find a source for the SDK to build the software that runs in the scaler chip. When I buy the scaler boards, they come with random firmware that the supplier isn't particularly interested in updating. Other builds of the same board support DDC/CI but don't have the backlight driver on 'em, and cost more. There's so much stuff about the firmware I'd like to change...
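For anyone wondering what DDC/CI buys you on these boards: brightness/backlight is the standard VCP feature 0x10, so a build that exposes it can be driven from the host. A rough sketch using the ddcutil CLI (assuming it's installed and can see the display; untested against any particular scaler board):

    # Rough sketch: set brightness to 70% over DDC/CI via ddcutil.
    # VCP feature 0x10 ("10" below) is the standard brightness control.
    import subprocess

    def set_brightness(percent: int) -> None:
        subprocess.run(["ddcutil", "setvcp", "10", str(percent)], check=True)

    set_brightness(70)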
It would be amazing to have a broad movement around hacking smart TVs to make them dumb - either repurposing broken ones to reduce e-waste, or de-crapping all the privacy-abusing smart features.
Thanks for the tip. I love the idea of getting a smart TV, cracking it open and dumbing it down, but the risk of destroying it in the process is a bit scary. I think there's a business opportunity for hardware hackers to de-crap modern hardware like smart TVs or robot vacuums for those who can barely use a soldering iron or a terminal.
Does finding a panel model number involve taking it apart? Or has someone been nice enough to aggregate that data somewhere based on, say, TV model (maybe even serial number)?
Sometimes reviews for a particular monitor or TV will talk about the specific panel inside. But finding them is sort of awkward, especially since you have no guarantee that such reviews exist.
Taking the thing apart is gonna be required at some point anyway, you might as well take that plunge. Lay a soft blanket on a sufficiently large work surface, grab some tools, and have at!
Interestingly, this made me wonder whether there is an opposite of this: capturing those signals with a capture card to avoid HDCP or the like. I realize there are HDCP splitters/strippers, but maybe one day those won't work, who knows.
On the flipside, does anyone know if there's any caveats to "recycling" a laptop's motherboard? I've got a laptop that has awful thermals but decent hardware. One of the fans is dead, and I've attempted to replace it, but I guess the header itself is busted.* The processor and GPU are just wasted in there as they throttle hard on basic workloads. So, I've been meaning to rip the board out and make an HTPC out of it. I'm not experienced enough with electronics to know whether there are any issues I could run into, so I've been holding off on it. I would love to hear if anyone else has done something like that.
* Unfortunately, it's possible that my attempt to fix the dead fan is what busted the header. I'm not sure, because the previous fan was dead and the new fan I put in didn't spin, but that's part of why I don't feel comfortable just winging it.
>does anyone know if there's any caveats to "recycling" a laptop's motherboard?
The biggest caveat in re-using old compute hardware IMO is power consumption; power efficiency has been one consistent improvement in every new generation of computer chips. So a new Raspberry Pi is almost always as capable as your old hardware for HTPC, but more power efficient.
Except maybe for extremely high quality video, say 8K 10-bit.
>Unfortunately, it's possible that my attempt to fix a dead fan is what busted the header.
You can try a passive heatsink, or pass fan/thermal management to a separate unit altogether, like a Corsair Commander (I haven't tried it).
You can also use your hardware for only GPGPU applications & HW accel on GPU to reduce load on CPU e.g. Separate stream encoding machine.
If you're worried about power consumption from a cost perspective, it's going to take a long time to break even from the cost of new hardware, even for a cheap pi.
If you're worried about it from an environmental perspective, it often takes more energy / resource mining / pollution to manufacture a new piece of hardware than the total amount saved from not running older hardware over a longer lifetime. Running a computer / smartphone / tablet until it doesn't work well enough for its purpose is generally better for the environment than upgrading because shiny, because it slightly reduces the demand for new hardware.
> If you're worried about power consumption from a cost perspective, it's going to take a long time to break even from the cost of new hardware, even for a cheap pi.
That depends a lot on where you live. In Germany, prices scratch 0.3€/kWh. A new Pi sets you back 60€ [0]. Assuming you save 20W and it's on 24/7, you'll break even in just above a year[1].
[0] I know it's marketed as cheaper, but you never get it for 35€ and then you still need cables and an SD card.
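A quick sketch of the arithmetic behind "just above a year", using only the numbers above (20W saved, 0.3€/kWh, 60€ for the Pi):

    # Break-even using the figures above: 20 W saved, 0.30 EUR/kWh, 60 EUR Pi.
    watts_saved = 20
    eur_per_kwh = 0.30
    pi_cost_eur = 60.0

    kwh_per_year = watts_saved * 24 * 365 / 1000      # 175.2 kWh
    savings_per_year = kwh_per_year * eur_per_kwh     # ~52.6 EUR
    print(pi_cost_eur / savings_per_year)             # ~1.14 years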
Sure, and it never hurts to do the math for your particular situation. Being in California I'm probably paying about the same as you per kWh. I do think 20W delta is rather optimistic, since the last time I checked a decade ago, laptops only consume 10-15W at idle, and pi's likely consume at least a few watts themselves. Laptop hardware is generally pretty good about reliably suspending to ram and resuming.
My general point is that there's a lot of externalized costs to consider depending on what you're trying to optimize for. For example, I live in a somewhat chilly area most of the year, and part of my home is electrically heated rather than by natural gas. I don't particularly care how efficient any electronics are in that part of the house (when I'm using them) because that electricity is converted into useful heat just as well as a space heater would do it.
I had a nightmare last night that my 3 year old smartphone's screen got horribly scratched enough to be a constant nuisance but not bad enough to interfere with its function, and I had to decide whether or not to give up and upgrade.
This isn't necessarily related to the discussion at hand, but it reminded me:
I got a huge crack (https://ibb.co/1mPxC6C) in my smartphone's screen right around New Year's. One of my goals was to use my phone less (less social media). The crack is a big, ~.25in black diagonal line across the screen, but the touch still works. Because it's a diagonal crack, I can still read whole text messages/emails/etc, I just have to scroll the right way.
It's now March and I still haven't fixed my screen, mostly out of laziness since I'm home all the time, but also because it's done wonders to my smartphone use. I practically don't idly scroll anymore, just because it's really annoying to do so, and I don't miss it at all. Everyone thinks I'm nuts, but I haven't read twitter in like 3 months so who's the crazy one?
I'd say, don't bother upgrading! Enjoy that the addictive quality of your smartphone is diminished.
Lol, thankfully it was just a dream and my screen is still flawless! I did stop bothering to put protective glass on it after that cracked a few months ago.
The only thing I read obsessively on my phone these days is hacker news. :-) I stopped using most social media 6 or 7 years ago.
> The biggest caveat in re-using old compute hardware IMO is power consumption; power efficiency has been one consistent improvement in every new generation of computer chips. So a new Raspberry Pi is almost always as capable as your old hardware for HTPC, but more power efficient.
yes, but your typical laptop rarely exceeds 30W, so if your electricity is priced at $0.12 per kWh and you run that laptop 24x7 for a year, you're looking at roughly 263 kWh, or about $32 a year.
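A quick sketch of that math, with an assumed ~3 W Pi idle draw thrown in for comparison (an assumption, not a measurement):

    # Yearly running cost at $0.12/kWh for a device drawing `watts` continuously.
    def yearly_cost_usd(watts: float, usd_per_kwh: float = 0.12) -> float:
        return watts * 24 * 365 / 1000 * usd_per_kwh

    print(round(yearly_cost_usd(30), 2))  # laptop at 30 W: ~31.54 USD/year
    print(round(yearly_cost_usd(3), 2))   # Pi at an assumed ~3 W: ~3.15 USD/year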
The block of transactions, lol - it's literally a division problem. I could have written “1% of the energy required to process a block of 3 transactions”.
haha, without transactions you wouldn't need to mine anything would you. The network is already secure and the data valid. If there were no transactions the ledger wouldn't need to be appended to, and in that case you could just throw it into read-only storage and call it a day.
Transactions are the only thing that contributes to energy usage.
I think you are confused. The only thing that contributes to energy usage is mining blocks, aka the block reward subsidy. Without transactions, it would still happen that way. And does, in the form of empty blocks.
> I think you are confused. The only thing that contributes to energy usage is mining blocks, aka the block reward subsidy.
Which is a reward for processing transactions.
> Without transactions, it would still happen that way. And does, in the form of empty blocks.
If we eliminated the ability to transact Bitcoin, the ledger would be closed - finalized. A data blob plus a published SHA. Like any other archive.
Mining allows you to amend said ledger, by processing a batch of updates. It is the process by which you amend the ledger, grouped into blocks of 0 or more transactions.
Transactibility is what uses 100% of the energy of the bitcoin network and it's fine to quantize that energy usage on a per-block, and per-transaction basis.
Mining furnishes transactibility. Transactibility is quantized into blocks, blocks into transactions. I'm sorry, it's clear, transactions and the ability to execute them are what uses 100% of the power of Bitcoin. If not that, then what? Don't just say "mining" - explain what mining provides if not transactibility.
A block of... 0 or more what now? Which wouldn’t need to exist if the ledger was immutable? Ergo... I mean you seem pretty intelligent so I’m gonna assume you get my point.
0 or more transactions form a block, which is mined to facilitate mutations. Mutations in the form of transactions. Nothing more. The block reward is compensation for providing the service of processing blocks of 0 or more transactions. A mining scheme that consistently has 0 transactions achieved nothing. End of story. If you disagree explain to me what else mining achieves.
See it’s solely for transactions. Without transactions or mutability mining wouldn’t be required. This division is fair game. The block reward subsidy is only required to reward the processing of blocks which wouldn’t need to happen if transactions weren’t a feature.
The method of facilitating transactions consumes all the power, so you can divide the energy per block by the average number of transactions per block to obtain the energy expended per average transaction.
Trust me I’m not struggling here, except with pushing back on the koolaid consumption. Folks who pursue extensive mental gymnastics to justify their having obtained wealth through environmental destruction: it’s like if everyone holding Bitcoin woke up one day to realize they’d killed a gorilla in the process. They’d try really hard to convince themselves they didn’t or that it wasn’t their fault. Even to pretend it didn’t happen because the gorillas were simply transmuted to a better life in the sky.
> The biggest caveat in re-using old compute hardware IMO is power consumption
This is a relatively recent laptop (it has a GTX 1060 inside!), and I can't get hardware that capable for cheap enough for it to be worth it. My only reason for making it an HTPC is that I don't want to see it gather dust on a shelf for no reason.
> You can try a passive heatsink
I've considered that + other cheap ways, but I'm concerned about mounting. The CPU runs hot, even when well cooled, so I'd need some way to get good contact with the die without harming it. There's a few screw slots around it, but those are for the heat pipes it comes with. That's the biggest challenge in this project from my perspective.
> You can also use your hardware for only GPGPU applications & HW accel on GPU to reduce load on CPU e.g. Separate stream encoding machine
Unfortunately, I don't have any use cases where that could come in handy. Working a full-time job killed my dreams of producing content :'). I just want this to sit at my 4K TV and play content without snooping on me, like all the big name boxes do.
I have a 5 year old laptop whose display died after 2 years. Rather disappointing that high-end stuff doesn't last and manufacturers don't seem to care, but that's another rant. I plugged it into my TV to use for emulators. It's fun to play occasionally, but overall I hate it because of how whiny and high maintenance Windows 10 is. Every single freaking time I want to use it, I have to pull it out from under the Blu-ray player and reassure it of its insecurities. Every few days it will randomly wake up and start loudly revving its fans up and down too, just to remind me that it exists to torment me.
When I looked into it 5 years ago, there were supposedly issues with it having a non-standard NVME interface firmware or something. There's a good chance that it would just work these days, but in my experience getting linux to run reliably on a laptop that wasn't designed for it can be tricky. It was originally a work laptop, so linux compatibility wasn't originally a consideration.
Maybe I'll whip up a Linux live usb drive and try booting it this weekend!
This is the key for me, and a major reason I've always avoided the cheap, older hardware off of eBay for home use. My server rack consists of hardware that was newly released at the time of purchase, chosen with power efficiency as a major focus.
One measurement which really surprised me was that my 2019 XPS 13 idles with about 1.5W, screen on according to powertop. That's less power than a Raspberry Pi takes to idle!
I did this for my younger sister's first computer. We didn't have a lot of expendable cash, so I took my old laptop, stripped the motherboard out, mounted it on a makeshift open-air rig, and hooked it up to a second-hand monitor (the laptop screen was irreparably broken). Served her well for three years or so.
How did you get good contact with the CPU die? I've considered an open air setup even before I had a backup PC to be bold with the laptop, but I wasn't sure it could cool the CPU.
It won't be pretty but if you can access the CPU from the top or bottom, you can strap a desktop heatsink down with zipties running both ways around the laptop.
In my case, the weight of the fan connected to a 12v motorcycle battery was enough to make it not throttle at indefinite 100% load, so I didn't have to zip tie anything. Mind that I did not remove the existing fans - I just used thermal interface material on top of the existing heat sink, added some salvaged aluminium passive cooling sinks on top, and then, on top of that, had the fan screwed into an Amazon shipping box so that the fan blades didn't hit the sinks.
Leaked schematics are available for a lot of them (even famous ones like Apple, as Louis Rossmann can tell you...) Component-level repair can be possible.
This is indeed cool. A lazier and less efficient solution for those that don't want to take apart the computer is synergy (https://symless.com/synergy and https://github.com/symless/synergy-core) which lets you share keyboard/mouse and clipboard over multiple computers.
Can you expand on why you think it's weird and wrong to charge for software?
I'm not a developer and I first discovered Synergy about 10 years ago when I was working physical infrastructure and physical security in a data centre, and used it across my work-provided Windows desktop and my personal (BYOD) MacBook Air.
If I recall correctly I was able to use it for free initially, but as soon as I discovered I could pay for it I did.
In the specific case of Synergy, because it was originally open source and free. The people using the name Synergy now have sort of hijacked it. Synergy originated at SGI.
Isn't it often heard around HN that it's ok to charge for Open Source software? If you don't want to pay for it, you're free to choose not to, but if others want to pay, what's it to you?
BTW the parts of Synergy that were Open Source are still Open Source, so you're free to download those and build them yourself if you so choose.
They don’t make it very easy to do so - my understanding is that’s essentially what Barrier is (making it easy to use the open source parts of Synergy).
I think for most people this is the better option from a cost perspective. I'm nearly certain I've never purchased a new monitor, either CRT or LCD. I like the idea of repurposing laptop screens in concept, but I'm not sure what problem it's trying to solve.
From my personal experience: the problem of already having a working display right in front of you, but it no longer being reasonable to use as-is.
You can set up something like Mouse Without Borders, or something that uses it as an over-the-network display, or buy these controllers and make a custom case/mount/stand, but in the end you're probably better off accepting that you should just buy a new monitor, even though this one works great and you "just" need to detach it from the laptop.
For as many people as say "but I did it and it works great," you don't really see many going out of their way to get a laptop screen instead of a monitor before the cost is already sunk. That said, it's sometimes useful when you explicitly set out to build a budget custom video wall (11520x6480 at 52" can be had for cheaper than one might imagine), but usually you're buying specific panels at that point, not finding yourself in possession of 9 of the same model of laptop display by accident.
I don't see people going out of their way either. I know it's not explicitly stated, but there's a bit of a subtext of getting something for nothing in these articles about turning laptop screens into monitors. I can see the appeal of $50 + something you already own becomes a small high DPI monitor. For me personally, and for what I assume is the majority of the contributors to HN, I see the computers in front of me as a distributed system, each with different capabilities. To dumb it down to two or three screens attached to one computer is a downgrade from the cross-platform multi-screen environment I can build if each of these screens is attached to its own computer.
In the UK during the lockdown I suddenly needed to kit out my wife and four children for working from home, and there was a shortage of cheap screens online, so I bought screens without stands on eBay. These seem to be taken from office closures or equipment refreshes, and so they are higher-end devices with more robust design and more connectivity options than cheap consumer ones. You can buy monitor arms cheaply now that clamp to your desk and provide a much wider range of positions to get an ergonomic, clutter-free setup. Cost was roughly £30-60 per 19-24" 1080p Dell or HP screen with HDMI/DP, £20 for a good arm and £10 for a box containing enough M4 screws of different lengths to last a lifetime. Now that the kids are back in school and more offices are hitting the end of their leases, I expect the eBay prices will drop.
I'm typing this on a Compaq laptop I rescued from a skip (running Debian), a keyboard I paid 50 NOK for at a jumble sale, connected to a 20 inch Dell LCD VGA monitor bought at Fretex, the Norwegian Salvation army chain of second hand shops. I think I paid about 150 NOK for the monitor about five years ago, that was about 20 USD then.
It would be great if a vendor sold iMac mainboard swap kits to convert older models into general purpose monitors. The monitors in iMacs are useful well beyond the period Apple supports the builtin computing parts. It would be interesting to see how big a market there is for such a set of products. I suspect a lot of iMacs are collecting dust because they can’t install a version of MacOS that still receives security updates.
Ideally, a customer would enter their iMac model and serial number to see if a kit was available so they wouldn’t have to crack open their computer to find the display panel model.
This may be complicated because different models have different sets of ports and it’s possible that batches of the same model were made with different LCD panels and display controllers.
The varying ports would not be an issue if customers were willing to modify their cases with a rotary grinder.
The resulting monitors would have DisplayPort (regular & USB-C), HDMI, and DVI-I inputs that could be switched between. A builtin USB hub would also be great. It would be better if the embedded iSight webcam was connected to the USB hub internally and the audio jacks repurposed to utilize the builtin microphone and amplified speakers.
In an alternate universe Apple could have included a monitor input port and let users switch between displaying the internal and external computers. They sort of did this with Thunderbolt on a few models but that was only useful with other Macs.
I really wish the iMac had never been an all-in-one product and instead was a minitower or pizza box chassis with regular PCI Express slots and a separate monitor but here we are. In that scenario, folks with less compute and expansion needs would get a Mac Mini. Is connecting 3 extra cables (monitor, monitor power, and USB) such a heavy lift?
I've done this a few times, as 4:3 panels are good for Retro-Computing. The bigger issue for me is finding a way to support/frame the panel securely enough that it won't get damaged. I keep meaning to get a 3D printer to try and sort this but I can't really justify the cost.
Yup, I got extremely excited at the prospect of converting my 15" 4:3 monitor into a side screen for my laptop, but I can't figure out how I'd prop it up cheaply and securely.
Some aluminum t-slot extrusions and maybe some 3D printed connectors and mounts would probably work (or you could dremel and drill custom fittings out of a block of delrin if you don’t have access to a 3D printer)
Is there any "scaler" universal controller board for driving a MacBook Pro 15" Retina LCD? I miss my personal laptop's PPI when I'm at work. It's not just about the number of pixels, it's also the pixel size.
Instead of having to buy another board, I'm planning on detaching the lid & flipping it backwards like a tablet, then only using it for reading PDFs in portrait & hooking up a Leap Motion controller to it.
I’ve had batteries die on 2 laptops, motherboards on one. Never had a screen go bad. Off the top of my head this is on a herd of 10 laptops over 20 years.
Batteries in a good number of modern laptops are inside the device. Not that easy to replace - I’ve found some that are harder to replace than memory, drives and keyboards. Plus that’s beside the point - my point is that in my experience screens don’t fail that often - replaceable or not.
That hasn't been the case in my sample size of ~ 10 laptops. Batteries wear out, hard drives die, cpus get old, but screens pretty much work; unless the backlight cable breaks (I repaired that for a friend multiple times; maybe my soldering wasn't the best though)
Also, very few laptops come with screens worth reusing. It's getting better, but only a few models of laptop have screens I'd consider using, even as a secondary monitor.
Is there a good software-only solution for using a MacBook as an external display for another Mac?
I have an old MacBook 12” that it would be nice to use as an extra display. I looked into this a few years ago and all the options had really poor video quality as they used Airplay (I would probably mainly use it for a terminal window so I’d want sharp text, I could handle a little lag)
Will try this out, thanks - I think I tried it several years ago and got annoyed that AirPlay wouldn’t do full resolution, but it’s quite possible the situation has improved!
If you only / mainly plan to use the terminal on this laptop, what about using a local terminal and ssh-ing into the main computer? And maybe use Barrier to share the same mouse / keyboard between the two machines.
That's actually a really good idea that I hadn't thought of! Of course it would be nice to have full flexibility to display whatever I want on there, but I think this would be useful. Great suggestion, thank you!
I did wonder if you can create some type of laptop terminal KVM (keyboard video mouse) for a Mac Mini or small PC box.
This way, it becomes a sort of portable desktop/laptop hybrid.
Maybe someone can mill an exoframe box and sell it on Etsy, that allows you to plug in a monitor, keyboard, and the magic Apple trackpad for such an idea?
Is there a place to find a generic container to house the remote controller and allow button pressing? I'm unsure of where to look, or even what search terms to use.
I can't recommend the U2415. Unfortunately, this is one of those models where Dell chose to tarnish the UltraSharp series with crap, so you can no longer just say "just get a Dell UltraSharp".
The U2415 has a horrible power supply which gets very noisy by design and it has a 6 bit panel with noticeable temporal dithering.
I'd pick one of the 2560x1440 U27xx models (after checking they're not crap like the U2715). But beware that some of the older models (like the U2713) don't take the full resolution over HDMI and need DisplayPort or dual-link DVI.
Or you could use the FOSS alternative you can host yourself, Apache Guacamole[0].
EDIT: Interesting. It seems you actually work for/with a company affiliated with Shells? If so, can you do a compare/contrast with Guacamole, AWS Workspaces, etc.?
This requirement means that it mustn't be a very old laptop. I have a nice IBM R52 Thinkpad, Pentium M 1.8 GHz, 512 MB memory that runs Antix Linux but struggles to run a modern web browser.
It seems to me that if you can run a modern web browser then you have enough hardware to run a modern OS locally.
https://www.panelook.com/index.php
https://github.com/avtehnik/RTD2662
https://translate.google.com/translate?hl=&sl=ru&tl=en&u=htt...
http://monitor.espec.ws/files/rtd2662_spec_brief_0702_171.pd...
https://github.com/ghent360/RTD-2660-Programmer/
https://sites.google.com/site/lcd4hobby/how-to-start
http://www.mattmillman.com/info/lcd/rovatools
http://www.elecrealm.com/down/class/
https://www.codeforge.com/article/258602