I recall a utility that proxied all requests through another server and compressed assets (e.g. large images) on the fly on that server, so the end client used much less bandwidth to load a heavy web page.
It worked quite well, and I think some ISPs eventually offered it out of the box.
IIRC the proxy server did way more than just compress assets: it re-encoded the page into an internal binary markup format and handled restricted JS execution with a simplified interaction model (the client had all of four events: unload, submit, click, and change; each of these triggered a server call that executed the corresponding handler and sent the updates back).
Not everything worked on that thing (most obviously the complex web applications which were starting to crop up around that time), but it was absolutely magic on unreliable GPRS connections 15 years ago.
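A hypothetical sketch of that click round trip in Python (every name and the message format here are invented for illustration; the real system exchanged a compact binary markup, not dicts):

```python
# Hypothetical sketch of the thin-client round trip described above.
# Message format and names are invented; the real system shipped a
# compact binary markup (OBML), not dicts.

# --- server side: the full engine and the page's JS live here ---
count = 0

def server_handle(msg):
    """Run the page's real handler, return only the resulting changes."""
    global count
    if msg["event"] == "click" and msg["target"] == "counter-btn":
        count += 1
        return [{"id": "counter", "text": str(count)}]
    return []

# --- client side: no JS engine, just event forwarding ---
page = {"counter": "0"}

def client_dispatch(event, target):
    for update in server_handle({"event": event, "target": target}):
        page[update["id"]] = update["text"]  # repaint changed regions only

client_dispatch("click", "counter-btn")
client_dispatch("click", "counter-btn")
print(page["counter"])  # "2" - the handler ran server-side both times
```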
The client was actually even simpler than that: it was just clicks. Doing anything else wasn't practical with the average latency back then.
The rest was done on the server side using the old Opera Presto core - which scaled very economically, since it had gone through a lot of memory optimization effort when it was ported to early smartphones/Japanese weirdphones. We needed to keep the users' browser "tabs"/windows around until the next click, potentially a few minutes away, so memory usage was a key economic concern at the time.
I ran the product/development team from early on until 2014. Glad you liked it. Opera Mini peaked at about 150M monthly active users around 2012-2013. Since then I'm sure it's been a steady decline. If I were to guess right now, I'd say the new Chinese owners will maybe keep it alive for another 5-7 years, but who knows; they seem to be really focusing on Africa.
I used to use their proxy to bypass every web filter I ever came across haha. Ta for introducing me to the whack a mole that is bypassing porn filters, Opera! Sorry I never used your browser as intended
Google Web Accelerator [0] did this. Requests were proxied via Google. A side effect of this was that you could access internal Google tools that had been restricted to Google's IP address range – oops!
I might be remembering this totally wrong but didn't a mobile browser (maybe Opera?) offer this feature as well, like 5 or 10 years ago? I seem to remember trying it out for my Android phone back in the day
Yeah, Opera did this. I remember being amazed at the results on my Nokia E71 - viewing the desktop site rather than the terrible 2010 mobile offerings, but getting great speeds due to the compressed sizes.
One company that I worked at used NetLi (now part of Akamai). The approach they used (IIRC) had multiple data centers close to end users, plus a server in each client's data center. The server in the client data center and the servers in the edge data centers would establish a pre-negotiated connection with its TCP parameters optimized for their traffic... and then serve compressed traffic over that connection.
It was pretty impressive back in the days of CGI and JSP - before presentation was pushed entirely out to the browser, when everything was still rendered on the server.
Same! Can't remember what it was called, but there was a Java application I used 15 years ago that proxied HTTP and compressed assets, which was useful in New Zealand at the time because bandwidth was limited and slow (I'm sure that's changed by now).
I remember AOL (circa version 3 or 4) doing this. If you saved one of these images (maybe by normal means, or maybe only by poking through the local cache), they'd have a .art filename extension and be in some proprietary format that wouldn't load in most image software.
Compression in the context of (JPEG) images often implies lossy compression. You can apply that multiple times and still gain size benefits. Other ways are stripping metadata and reducing color palettes. Images on websites are often horribly unoptimized, so if you’re willing to compromise a little on quality you can have substantial gains in bandwidth requirements.
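A rough sketch of those optimizations in Python with Pillow (file names are placeholders; exact savings depend on the source image):

```python
# Sketch of proxy-style image optimization using Pillow.
# "in.jpg"/"in.png" are placeholder inputs.
from PIL import Image

img = Image.open("in.jpg")
# Re-encode at a lower JPEG quality (lossy). Since we don't pass the
# original exif/metadata through, it gets stripped as a side effect.
img.save("out.jpg", "JPEG", quality=40, optimize=True)

# For palette-friendly content (icons, screenshots), quantizing to a
# reduced color palette shrinks PNGs considerably.
png = Image.open("in.png")
png.convert("RGB").quantize(colors=64).save("out.png", optimize=True)
```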
Let’s say you normally could download 10 kilobytes in a minute. If your computer clock ran at half speed, the kilobytes would download at the same real speed, but only 30 seconds would have passed in computer time, so the computer would calculate that you had downloaded 10 kilobytes in 30 seconds, or 20 kilobytes a minute. Really, a minute passed, not 30 seconds, but the computer doesn’t know that.
In this case the reported download speed is based on your system's clock. For example, 1 minute on your system takes 2 minutes of real time, so your computer thinks downloads finish in half the time they actually take.
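The arithmetic in toy form (all numbers made up, mirroring the example above):

```python
# Toy illustration of the half-speed-clock trick.
REAL_SECONDS = 60        # wall-clock time the download really took
CLOCK_RATE = 0.5         # system clock running at half speed
BYTES = 10 * 1024        # 10 KB actually transferred

elapsed_by_clock = REAL_SECONDS * CLOCK_RATE  # 30 "seconds" on the slow clock
reported = BYTES / elapsed_by_clock           # bytes per (fake) second
actual = BYTES / REAL_SECONDS

print(f"actual:   {actual:.1f} B/s")    # ~170.7 B/s
print(f"reported: {reported:.1f} B/s")  # ~341.3 B/s, twice the real speed
```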
I had sort of thought this was by design: all the drives are sharing the same pool of free space, so if you were to use some of it, they would all show less available space. I think it’s called space sharing. https://en.wikipedia.org/wiki/Apple_File_System
Seems like a technique for the scammers on eBay who sell you an external hard drive filled with a heavy bolt and a 1GB flash stick. They program the drive to report 1TB but rewrite the same 1GB over and over.
Worst thing is you never know until you need the data you wrote...
> They program the drive to read 1TB but rewrite the 1GB over and over.
My understanding is that they didn't specifically write a "program" to do it; all they did was "program" a bunch of false configuration data into the USB flash controller and let it silently corrupt the flash. If you can find their internal programming tools on the web, such drives can even be fixed by programming the correct configuration back.
>Worst thing is you never know until you need the data you wrote...
It's worthwhile to run dd+sha256sum or badblocks(8) on suspicious USB drives.
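In the same spirit, here's a rough Python sketch of what those tools check: write distinct pseudorandom blocks across the whole device, then read them back. DEVICE is a placeholder, and this overwrites everything on it.

```python
# Rough capacity check in the spirit of dd+checksums or f3.
# DESTROYS all data on DEVICE - point it at the right disk.
import hashlib, os

DEVICE = "/dev/sdX"        # hypothetical device node
BLOCK = 1024 * 1024        # 1 MiB per block

def block_data(i):
    # Deterministic pseudorandom block derived from its index, so a
    # fake drive that loops writes onto the same 1GB of flash will
    # fail the read-back comparison.
    seed = hashlib.sha256(i.to_bytes(8, "little")).digest()
    return seed * (BLOCK // len(seed))

with open(DEVICE, "r+b", buffering=0) as dev:
    blocks = dev.seek(0, os.SEEK_END) // BLOCK
    dev.seek(0)
    for i in range(blocks):
        dev.write(block_data(i))
    dev.seek(0)
    for i in range(blocks):
        if dev.read(BLOCK) != block_data(i):
            print(f"mismatch at block {i} (~{i * BLOCK // 2**30} GiB in)")
            break
    else:
        print("all blocks verified")
```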
They are not the same thing. Check them with 'ls -lsh' and you'll see the sparse file from truncate uses no sectors, whereas the fallocated file is fully preallocated.
If you have 100G free, truncate -s 1T will succeed, but fallocate -l 1T will not.
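The same contrast in Python, assuming Linux and a filesystem with sparse-file support:

```python
# truncate vs fallocate: logical size vs actually allocated blocks.
import os

with open("sparse.bin", "wb") as f:
    os.ftruncate(f.fileno(), 1 << 30)           # 1 GiB length, no blocks

with open("alloc.bin", "wb") as f:
    os.posix_fallocate(f.fileno(), 0, 1 << 30)  # 1 GiB actually reserved

for name in ("sparse.bin", "alloc.bin"):
    st = os.stat(name)
    # st_blocks counts 512-byte sectors actually allocated
    print(name, st.st_size, st.st_blocks * 512)
# sparse.bin shows ~0 bytes allocated; alloc.bin shows the full 1 GiB,
# and posix_fallocate raises ENOSPC if the space isn't really there.
```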
I remember a program that promised to allow B&W Macs to show colors. In actuality it used POV trickery to make it look like it was displaying hints of color. (Sadly I don’t remember the name of the program, but I remember being mildly amused by it.)
That works because the RGB receptors in the eye have different transient responses, so by precisely timing flashes of light you can create a color illusion. This is called the Fechner color effect: https://en.wikipedia.org/wiki/Fechner_color
Interesting! I don’t remember the name of the program (it was some shareware program I downloaded in the mid 1990s) but I’m pretty certain it used those Fechner discs to create the illusion.
Perhaps I’m biased because my first assignment as a professional programmer was writing file system utilities, but I’m fairly certain this is the kind of thing you want to make really sure you get right.
Would a 2 trillion dollar software company allow such bugs to persist deliberately? Is it part and parcel of the carrot wrapped up in the next ‘upgrade/update’?
(2 trillion meaning, they have the money to sort this)
Something similar happened to me once - I used faulty ram for about a year (it worked most of the time, and when it didn't, it typically became apparent within a few minutes of booting the system, so it was a minor annoyance at worst). One of the times it didn't work, I noticed all of my programs were being killed by the OOM reaper, even though I had just started the system. After running free in a spare terminal, I realized it was reporting that I had roughly INT_MAX memory in use, and so it was deciding to kill basically everything to try and get back to having a non-negative amount of free memory left. There must have been memory corruption in some kernel data structure somewhere that deals with memory allocation.
I rebooted and it never happened again, but it was pretty funny running free and seeing those ridiculous numbers pop up. I was kind of amazed the rest of the system was even working at all.
Compression programs like this (e.g. DiskDoubler - https://en.m.wikipedia.org/wiki/DiskDoubler) were popular for a while when storage was more expensive. Some of them were legit and could compress most common workloads pretty well. Some others were scams.
It's too bad that the popularity has waned, considering you can still get a lot out of it. Using the new compression in Windows 10 I get to shrink my Steam folder by more than 25%, but being off the beaten path means I have to deal with a bit of jankiness and manually reapply it every once in a while.
Back in those days, we weren't using file formats that applied some sort of compression natively, so compression was effective. Now, most of our space is consumed by compressed media in image/audio/video files. Most of those files are in a compressed format "natively", so they see no benefit from applying another compression type.
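A quick Python illustration of why recompressing doesn't help (random bytes stand in for high-entropy media payloads like JPEG or H.264):

```python
# Already-compressed (high-entropy) data doesn't compress again.
import os, zlib

text = b"the quick brown fox jumps over the lazy dog " * 1000
media = os.urandom(len(text))  # stand-in for compressed media

for label, data in (("text", text), ("media-like", media)):
    out = zlib.compress(data, 9)
    print(f"{label}: {len(data)} -> {len(out)} bytes")
# text shrinks dramatically; the high-entropy payload barely changes
# (it often grows slightly from framing overhead).
```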
> A number of people have expressed disbelief that such a feat is possible, saying that they’d avoid anything like RAM Doubler because it’s obviously doing strange things to memory, which isn’t safe. The answer to these naysayers is that a program like RAM Doubler either works or it doesn’t – it’s a binary decision.
Cool that the technology was available and weird that the knowledge behind why it would work was not more widespread.
I take issue with the statement here though - it's not binary. It's of course important whether it works in normal situations, but how does it behave under extreme memory pressure or with certain incompressible memory pages? Etc.
It's funny how memory works. Human memory that is.
When I read the title of this article and thread, I thought, "This sounds a bit like that fraudulent DOS memory doubler. What was it called again, RAM Doubler?"
My favorite is getting GUID and FAT partitioning wrong. I had partitioned a drive incorrectly on a specific Mac. That Mac had no issues using the drive; however, no other Mac could mount it. My second favorite is formatting a drive in Linux without defining a partition.
And to my astonishment it worked!
A few minutes later, “the clock seems to be running really slow”
Clever program. Half speed clock makes all download speeds show double speed.