- Photon, the graphical interface for QNX. Oriented more towards real time (widgets included gauges) but good enough to support two different web browsers. No delays. This was a real time operating system.
- MacOS 8. Not the Linux thing, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. It probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be buried to justify Apple's bailout of NeXT to get Steve Jobs.
- Transaction processing operating systems. The first one was IBM's Customer Information Control System. A transaction processor is a kind of OS where everything is like a CGI program - load program, do something, exit program. Unix and Linux are, underneath, terminal oriented time sharing systems.
- IBM MicroChannel. Early minicomputer and microcomputer designers thought "bus", where peripherals can talk to memory and peripherals look like memory to the CPU. Mainframes, though, had "channels", simple processors which connected peripherals to the CPU. Channels could run simple channel programs, and managed device access to memory. IBM tried to introduce that with the PS/2, but they made it proprietary and it failed in the marketplace. Today, everything has something like channels, but they're not a unified interface concept that simplifies the OS.
- CPUs that really hypervise properly. That is, virtual execution environments look just like real ones.
IBM did that in VM, and it worked well because channels are a good abstraction for both a real machine and a VM. Storing into device registers to make things happen is not. x86 has added several layers below the "real machine" layer, and they're all hacks.
- The Motorola 680x0 series. Should have been the foundation of the microcomputer era, but it took way too long to get the MMU out the door. The original 68000 came out in 1979, but then Motorola fell behind.
- Modula. Modula 2 and 3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.
- XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman.
Would it kill people to have to close their tags properly? (A quick strict-vs-lenient parsing sketch follows this list.)
- Word Lens. Look at the world through your phone, and text is translated, standalone, on the device. No Internet connection required. Killed by Google in favor of hosted Google Translate.
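Concretely, on the XHTML point: here's a minimal sketch of the strict-vs-lenient difference using only Python's standard library. The XML parser punts at the first error with a line and column, while the HTML parser soldiers on:

```python
# Strict (XHTML-style) vs lenient (HTML-style) handling of the same broken markup.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

bad = "<html><body><p>unclosed paragraph<p>another</body></html>"

# Strict: reject at the first error and say exactly where it is.
try:
    ET.fromstring(bad)
except ET.ParseError as e:
    print("XML parser punts:", e)  # mismatched tag, with line and column

# Lenient: HTML5-style recovery -- never complain, just keep going.
class Collector(HTMLParser):
    def handle_starttag(self, tag, attrs):
        print("HTML parser accepts:", tag)

Collector().feed(bad)
```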
> Would it kill people to have to close their tags properly?
Probably not, but what would be the benefit of having more pages fail to render? If XHTML had been coupled with some cool features which only worked in XHTML mode, it might have become successful, but on its own it does not provide much value.
> Would it kill people to have to close their tags properly
It would kill the approachability of the language.
One of the joys of learning HTML when it tended to be hand-written was that if you made a mistake, you'd still see something, just with distorted output.
That was a lot more approachable for a lot of people who were put off "real" programming languages because they were overwhelmed by terrible error messages any time they missed a bracket or misspelled something.
If you've learned to program in the last decade or two, you might not even realise just how bad compiler errors tended to be in most languages.
The kind of thing where you could miss a bracket on line 47 but end up with a compiler error complaining about something 20 lines away.
Rust (in particular) got everyone to raise their game with respect to meaningful compiler errors.
But in the days of XHTML? Error messages were arcane, you had to dive in to see what the problem actually was.
Nice list. Some thoughts:
- I think without the move to NeXT, even if Jobs had come back to Apple, they would never have been able to get to the iPhone. iOS was - and still is - a unix-like OS, using unix-like philosophy, and I think that philosophy allowed them to build something game-changing compared to the SOTA in mobile OS technology at the time. So much so, Android follows suit. It doesn't have a command line, and installation is fine, so I'm not sure your line of reasoning holds strongly. One thing I think you might be hinting at though that is a missed trick: macOS today could learn a little from the way iOS and iPadOS are forced to do things and centralise configuration in a single place.
- I think transaction processing operating systems have been reinvented today as "serverless". The load/execute/quit cycle you describe is how you build in AWS Lambdas, GCP Cloud Run Functions or Azure Functions. (A minimal handler sketch follows this list.)
- Most of your other ideas (with an exception, see below) died because of people trying to grab money rather than build cool tech, and arguably the free market decided to vote with its feet - I do wonder when we might next get a major change in hardware architectures again, though; it does feel like we've now got "x86" and "ARM" and that's that for the next generation.
- XHTML died because it was too hard for people to get stuff done. The forgiving nature of the HTML specs is a feature, not a bug. We shouldn't expect people to be experts at reading specs to publish on the web, nor should it need special software that gatekeeps the web. It needs to be scrappy, and messy and evolutionary, because it is a technology that serves people - we don't want people to serve the technology.
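To make the serverless parallel concrete, here's a minimal sketch of the load/execute/quit shape as an AWS-Lambda-style Python handler (the event fields here are invented for illustration):

```python
# The platform loads the module, invokes the function once per request,
# and may tear the process down afterwards -- the same load/execute/quit
# shape as a CGI program or a CICS transaction.
import json

def handler(event, context):
    # 'event' carries the request payload; no long-lived process state
    # is assumed to survive between invocations.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

if __name__ == "__main__":
    # Local smoke test; in AWS the runtime supplies event and context.
    print(handler({"name": "CICS"}, None))
```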
> - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?
We stop at the first sign of trouble for almost every other format, we do not need lax parsing for HTML. This has caused a multitude of security vulnerabilities and only makes it more difficult for pretty much everybody.
The attitude towards HTML5 parsing seemed to grow out of this weird contrarianism that everybody who wanted to do better than whatever Internet Explorer did had their head in the clouds and that the role of a standard was just to write down all the bugs.
I've argued this for years on this site...but AOL.
At its best, having IM, email, browser, games, keywords, chats, etc. was a beautiful idea IMO. That they were an ISP seemed secondary or even unrelated to the idea. But they chose to charge for access even in the age of broadband, and people decided they'd rather not pay it, which is to be expected. I often wonder if they'd have survived as a software company otherwise.
They were basically a better Facebook before Facebook, in my opinion.
Edit: you asked why. I first saw it at SELF where Chris DiBona showed it to me and a close friend. It was awesome. Real time translation, integration of various types of messaging, tons of cool capabilities, and it was fully open source. What made it out of Google was a stripped down version of what I was shown, the market rejected it, and it was a sad day. Now, I am left with JIRA, Slack, and email. It sucks.
Google Wave was built on an awesome technology layer, and then they totally blew it on the user interface... deciding to treat it as a set of separate items instead of a single document everyone everywhere all at once could edit... killed it.
It made it seem needlessly complicated, and effectively erased all the positives.
I was blown away by the demo, but after I thought about it, it seemed like a nightmare to me. All the problems Slack has of manually checking channels for updates, except x100 (yeah, I get that Slack wasn't available then). My point is I saw that it seemed impossible to keep up with nested, constantly updated hierarchical threads. Keeping up with channels on Slack is bad enough, so imagine if Wave had succeeded. It'd be even worse.
Wave was great for conversation with one or two other people on a specific project, which I'm sure most people here used it for. I can't imagine it scaling well beyond that.
Even the watered-down version of Wave was something I used at a startup I worked at; it was effectively our project management tool. And it was amazing at that.
I don't know how it would fare compared to the options available today, but back then, it shutting down was a tremendous loss.
Google sucked (and sucks) at executive function because they completely lack appreciation for proper R&D and long-term investment, and also kill things people use and love.
Optane persistent memory had a fascinating value proposition: stop converting data structures for database storage and just persist the data directly. No more booting or application launch or data load: just pick up where you left off. Died because it was too expensive, but probably long after it should have.
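As a rough illustration of that workflow (not Optane-specific: a plain mmap'd file stands in for persistent memory, and the file name and layout are made up), a program can resume its state in place with no serialize/load step:

```python
# Pick up where you left off: a 64-bit counter that lives in a mapped
# byte range, survives across runs, and is updated in place.
import mmap, os, struct

PATH = "counter.pmem"  # hypothetical backing file
SIZE = 8               # one 64-bit counter

if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), SIZE)
    (count,) = struct.unpack_from("<Q", mem, 0)  # resume prior state
    struct.pack_into("<Q", mem, 0, count + 1)    # update in place
    mem.flush()                                  # persist; no conversion step
    print(f"run number {count + 1}")
```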
VMs persist memory snapshots (as do Apple's containers, for macOS at least), so there's still room for something like that workflow.
The world had already caught up. By the time it was released, flash memory was already nearing its speed and latency, to the point that the difference wasn't worth the cost.
Opa. In 2011, before TypeScript, Next.js or even React, they had seamless server-client code, in a strongly typed functional language with support for features like JSX-like inline HTML, async/await, string interpolation, a built-in MongoDB ORM, CSS-in-JS, and many syntax features that were added to ECMAScript since then.
I find it wild how this project was 90%+ correct on how we will build web apps 14 years later.
Sandstorm: it seemed quite nice with a lot of possibilities when it launched in 2014, but it didn’t really take off and then it moved to sandstorm.org.
Yahoo Pipes. It was so great at creating RSS feeds and custom workflows. There are replacements now like Zapier and n8n, but I loved that. Also Google Reader, which is mentioned multiple times already.
Yahoo Pipes was what the internet should have been. We're so many decades into computing, and that kind of inter-tool linking has only barely been matched by Unix pipes.
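In that spirit, here's a toy of the Pipes idea: small composable stages wired together, with Python generators standing in for Pipes' visual blocks (the feed data is inline; a real pipe would fetch and merge actual RSS):

```python
# Compose small stages into a feed-processing pipeline.
items = [
    {"title": "Show HN: ...", "tags": ["hn"]},
    {"title": "Rust 2.0?",    "tags": ["rust", "hn"]},
    {"title": "Cat pictures", "tags": ["misc"]},
]

def source(feed):
    yield from feed

def filter_tag(stream, tag):
    return (i for i in stream if tag in i["tags"])

def rename(stream, prefix):
    return ({**i, "title": prefix + i["title"]} for i in stream)

# Wire the blocks together, Pipes-style.
for item in rename(filter_tag(source(items), "hn"), "[HN] "):
    print(item["title"])
```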
Google Reader: I will forever be salty about how Google killed something that likely required very little maintenance in the long run. It could have stayed exactly the same for a decade and I wouldn't have cared, because I use an RSS reader exactly the same way I did back in 2015.
Yes. That was the single worst business decision in Google history, as somebody correctly noted. It burned an enormous amount of goodwill for no gain whatsoever.
Killing Google Reader affected a relatively small number of users, but these users disproportionately happened to be founders, CTOs, VPs of engineering, social media luminaries, and people who eventually became founders, CTOs, etc. They had been painfully taught to not trust Google, and, since that time, they didn't. And still don't.
Just think of the data mining they could have had there.
They had a core set of ultra-connected users who touched key aspects of the entire tech industry. The knowledge graph you could have built out of what those people read and shared…
They could have just kept the entire service running with, what, 2 software engineers? Such a waste.
This would require the decision-maker to think and act at the scale and in interests of the entire company. Not at the scale of a promo packet for next perf: "saved several millions in operation costs by shutting down a low-impact, unprofitable service."
I still use Picasa; it works fine. However, when Google severed the GDrive-Photos linking, it meant my photos didn't automatically download from Google to my PC. This is what killed Google for me.
> Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.
You can argue whether it's as good as GPM or not, but it's false to imply that your uploaded music disappeared when Google moved to YouTube Music. I made the transition, and all of my music moved without a new upload.
Hmm, good to know. But given Google's history, I assumed that it would stop working.
I also need to sell my Google Chromecast with Google TV 4K. Brand new, still in its shrink wrap. Bought it last year, to replace a flaky Roku. It was a flaky HDMI cable instead. I trust Roku more than Google for hardware support.
I'm still amused that they killed Google Notebook and then a few years later created Google Keep, an application with basically the same exact feature set.
You can say that for a fair few of the services mentioned by GP.
Google killed a lot of things to consolidate them into more "integrated" (from their perspective) product offerings. Picasa -> Photos, Hangouts -> Meet, Music -> YT Premium.
No idea what NFC Wallet was, other than the Wallet app on my phone that still exists and works?
The only one I'm not sure about is Chromecast - a while back my ones had an "update" to start using their newer AI Assistant system for managing it. Still works.
That was probably me, when I stopped using Google Search some years ago. :-) Got tired of the ads, the blog spam, and AI-generated content crap floating to the top of their results page.
The https://udm14.com/ flavor of Google is quite usable, though, especially with notable operators like inurl:this-or-that. But, all in all, yeah, gimme back vanilla Google Search from 2008-2010 or so. Back then it was definitely a tool (I worked in investigative journalism at the time), whereas currently "searching" means sitting with fingers crossed and hoping for the best. But, oh well. </rant>
That's more what I meant. Sure, lots of people still type stuff into the URL bar that takes them to www.google.com/search. But whatever you want to call that results page now, it's no longer Google Search in anything but name.
I’m still using
- free g suite
- play music
- finance
- nfc wallet is just google wallet isn’t it?
- chromecast, video and audio-only
I guess play music is now YouTube music, and doesn't have uploads, so that can be considered dead, but the others seem alive to me.
I used Picasa and loved it, until I realized I want all my photos available from all my devices at all times and so gave in to Google Photos (for access, not backup)
I use SyncThing for that purpose. It syncs across my phone, my laptops, and my Synologies. But I don't sync all my photos.
I don't like the thought of providing Google thousands of personal photos for their AI training. Which will eventually leak to gov't agencies, fraudsters, and criminals.
I used Google Talk, then Hangouts, but once they switched to Meet, I gave up on them. By then my family was all using Hangouts, and we never settled on a new service, because one of my siblings didn't want to support any chat services that don't freely give user information to the government, and the rest of us didn't want to use a chat platform that does freely give user information to the government.
From what I can tell (since I am just finding out about this today), they stopped manufacturing the old Chromecast hardware, and at some point, will stop supporting the old devices. The old devices may stop working in the future, for example, because they sunset the servers. Like their thermostats. Who knows?
The internet before advertising, artificial intelligence, social media and bots. When folks created startups in their bedrooms or garages. The days when google slogan was “don’t be evil”.
Heroku? I know it's still around, though IDK who uses it, but I miss those days when it was thriving. One language, one deployment platform, one database, a couple plugins to choose from, everything simple and straightforward, no decision fatigue.
I often wonder, if AI had come 15 years earlier, would it have been a ton better because there weren't a billion different ways to do things? Would we have ever bothered to come up with all the different tech, if AI was just chugging through features efficiently, with consistent training data etc.?
As soon as they put a persistent Salesforce brand banner across the top which did nothing but waste space and put that ugly logo in our face every day, my team started our transition off Heroku pretty much right away.
Didn't they offer free compute? IIRC all free compute on the Internet went away with the advent of cryptocurrencies as it became practical to abuse the compute and translate it directly into money.
My company still uses Heroku in production actually. Every time I see the Salesforce logo show up I wince, but we haven't had any issues at all. It continues to make deployment very easy.
Vine. It was already pretty big back in 2013, but Twitter had no idea what to do with it. TikTok actually launched just a few months before Vine was shut down and erased from the internet.
Whoever made the decision to kill Vine was an absolute moron, even without hindsight. It was square videos; how hard could it have been to shove an ad banner above them and call it a day? Incredible.
Midori, Microsoft's capability-based security OS[1]. Rumor has it that it was getting to the point where it was able to run Windows code, so it was killed through internal politics, but who knows! It was the Fuchsia of its time...
I've heard someone at Microsoft describe it as a moonshot but also a retention project; IIRC it had a hundred plus engineers on it at one time, including a lot of very senior people.
Apparently a bunch of research from Midori made it into .NET so it wasn't all lost, but still...
The technical foundation seems interesting, but knowing Microsoft this would have just become yet another bloated mess with its own new set of problems. And by now it would equally have become filled with spyware and AI "features" users don't want.
CLPM, the Common Lisp Package Manager. The Quicklisp client doesn't do HTTPS, ql-https doesn't do Ultralisp, and OCICL (which I'm currently using) doesn't do system-wide packages. CLPM is a great project, but it's gone neglected long enough that it's bitrotted and needs some thorough patching to be made usable. Fortunately Common Lisp is still as stable as it has been for 31 years, so it's just the code which interacts with 3rd-party libraries that needs updating.
The Lockheed D-21 drone. Supersonic ramjet without the complexity of scramjet or the cost of turbojet, hamstrung by the need for a manned launch platform (making operations safety-critical… with predictable results) and recovery to get data off it. Twenty or forty years later it would have been paired by a small number of high-cost launcher UAVs and had its cost driven down to disposable, with data recovery over radio comms… but twenty to forty years later there’s nothing like it, and the maturation of satellites means there almost certainly never will be.
It was a series of experiments with new approaches to programming. Kind of reminded me of the research that gave us Smalltalk. It would have been interesting to see where they went with it, but they wound down the project.
Yeah, Opa was wildly ahead of its time, I actually just wrote a top level comment about it. Basically Next.js+TypeScript+modern ECMAScript features, but in 2011.
Silverlight. Full vector, DPI-aware UI, with grid, complex animation, and all the other stuff that HTML5/CSS didn't have in 2018 but Silverlight had even in 2010 (probably even earlier).
MVVM pattern, two-way bindings. Expression Blend (basically Figma) that allowed designers to create UI that was XAML, had sample data, and could be used by devs as-is with maybe some cleanup.
Excellent tooling, static analysis, debugging, what have you.
Rendered and worked completely the same in any browser (Safari, IE, Chrome, Opera, Firefox) on Mac and Windows.
If that thing still worked, boy would we be in a better place regarding web apps.
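For anyone who never used it, here's a toy of the two-way-binding idea in the MVVM spirit (the names are invented, and Silverlight's real binding engine was far richer): the view model raises change notifications and anything bound to it stays in sync.

```python
# Minimal observable property: the heart of two-way binding.
class Observable:
    def __init__(self, value):
        self._value, self._subs = value, []

    def subscribe(self, fn):
        self._subs.append(fn)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, v):
        if v != self._value:      # only real changes notify
            self._value = v
            for fn in self._subs:
                fn(v)

name = Observable("Ada")
name.subscribe(lambda v: print("view updated:", v))  # view binds to the VM
name.value = "Grace"  # a change from either side propagates to the other
name.value = "Grace"  # unchanged value: no re-notification
```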
Unfortunately, the iPhone killed Adobe Flash, and Silverlight as an aftermath. Too slow a processor, too much energy consumption.
I am happy this one died. It was just another attempt by Microsoft to sidestep open web standards in favor of a proprietary platform. The other notorious example is Flash, and both should be considered malware.
Open web standards are great but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now but where would we be if that came sooner?
> Open web standards are great but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now but where would we be if that came sooner?
Why do you think JavaScript is a problem? And a big enough problem to risk destroying open web standards.
I loved silverlight. Before I got a “serious” job, I was a summer intern at a small civil engineering consultancy that had gradually moved into developing custom software that it sold mostly to local town/city/county governments in Arizona (mostly custom mapping applications; for example, imagine Google Maps but you can see an overlay of all the street signs your city owns and click on one to insert a note into some database that a worker needs to go repair it… stuff like that).
Lots of their stuff was delivered as Silverlight apps. It turns out that getting office workers to install a blessed plugin from Microsoft and navigate to a web page is much easier than distributing binaries that you have to install and keep up to date. And developing for it was pure pleasure; you got to use C# and Visual Studio, and a GUI interface builder, rather than the Byzantine HTML/JS/CSS ecosystem.
I get why it never took off, but in this niche of small-time custom software it was really way nicer than anything else that existed at the time. Web distribution combined with classic desktop GUI development.
RethinkDB. Technically it still exists (under The Linux Foundation), but (IMO) the original company's widening scope (the Horizon BaaS) that eventually led to its demise killed its momentum.
Connect your phone to a display, mouse, keyboard and get a full desktop experience.
At the time smartphones were not powerful enough, cables were fiddly (adapters, HDMI, USB-A instead of a single USB-C cable), and virtualization and containers were not quite there.
Today, going via pKVM seems like a promising approach. Seamless sharing of data, apps, etc. will take some work, though.
Microsoft Songsmith is another one that deserved a second life. It let you hum or sing a melody and would auto-generate full backing tracks, guitar, bass, drums, chords, in any style you chose.
It looked a bit goofy in the promo videos, but under the hood it was doing real-time chord detection and accompaniment generation. Basically a prototype of what AI music tools like Suno, Udio, or Mubert are doing today, fifteen years too early.
If Microsoft had kept iterating on it with modern ML models, it could’ve become the "GarageBand for ideas that start as a hum."
ReactOS, the effort to create a free and open source Windows NT reimplementation.
It has been in existence in some form or another for nearly 30 years, but has not gained the traction it needed, and as of writing it's still not in a usable state on real hardware. It's not abandoned, but progress is moving so slowly that I doubt we'll ever see it released in a state that's useful for real users.
It's too bad, because a drop-in Windows replacement would be nice for all the people losing Windows 10 support right now.
On the other hand, I think people underestimate the difficulty involved in the project and compare it unfavorably to Linux, BSD, etc.
Unix and its source code were pretty well publicly documented and understood for decades before those projects started; nothing like that ever really existed for Windows.
They had no chance. Look how long it took for Wine to get where they are. Their project is Wine + a kernel + device-driver compatibility, and a moving target.
Wine, Proton and virtualization all got good enough that there's no need for a half-baked binary-compatible Windows reimplementation, and I think that took a lot of the oxygen out of what could have been energy towards ReactOS. It's a cool concept but not really a thing anybody requires.
> ReactOS, the effort to create a free and open source Windows NT reimplementation.
Some projects creep along slowly until something triggers an interest and suddenly they leap ahead.
MAME's Tandy 2000 implementation was unusable, until someone found a copy of Windows 1.0 for the Tandy 2000; then the emulation caught up until Windows ran.
Maybe ReactOS will get a big influx of activity after Windows 10 support goes offline in a couple days, or even shortly after when you can't turn AI spying off, not even three times a year.
RAM Disks. Basically extremely fast storage using RAM sticks slotted into a specially made board that fit in a PCIe slot. Not sure what happened to the project exactly but the website disappeared sometime in 2023.
The idea that you could read and write data at RAM speeds was really exciting to me. At work it's very common to see microscope image sets anywhere from 20 to 200 GB and file transfer rates can be a big bottleneck.
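For a feel of the gap, here's a rough sketch (Linux-only assumption: /dev/shm is a RAM-backed tmpfs; the paths and the 256 MiB size are arbitrary) comparing write throughput to RAM-backed versus ordinary storage:

```python
# Time sequential writes to a RAM-backed path vs. a disk-backed one.
import os, time

def time_write(path, size=1 << 28, chunk=1 << 20):
    buf = os.urandom(chunk)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size // chunk):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size / elapsed / 1e9  # GB/s

print("tmpfs:", round(time_write("/dev/shm/bench.bin"), 2), "GB/s")
print("disk :", round(time_write("bench.bin"), 2), "GB/s")
```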
Products to attach RAM to expansion slots have long existed and continue to be developed. It's a matter of adding more memory once all of the DIMMs are full.
What to do with it, once it's there, is a concern of software, but specialized hardware is needed to get it there.
Non DAW. Breaking each function of the DAW into its own application gave a better experience in each of those functions; especially when you only needed that aspect, you were not working around everything else that the DAW offers. The integration between the various parts was not all that it could be, but I think the idea has some real potential.
Thought about Non immediately, but I figured it must have (had) about 2 other users amongst HNers, though. :) Nice to see it mentioned.
I used it quite a bit to produce radio shows for my country's public broadcasting. Because Non's line-oriented session format was so easy to parse with classic Unix tools, I wrote a bunch of scripts for it with Awk etc. (E.g. calculating the total length of clips highlighted with brown color in the DAW -- which was stuff meant for editing out; or creating a poor man's "ripple editing" feature by moving loosely-placed clips precisely side by side; or, eventually, converting the sessions to Samplitude EDL format, and, from there, to Pro Tools via AATranslator [1] (because our studio was using PT), etc. Really fun times!)
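The line format below is entirely invented for illustration (Non's real session format differs), but it shows the kind of one-off script the line-oriented design invited, like summing the length of brown-highlighted clips:

```python
# Hypothetical "clip <start> <length> <color>" lines; sum the brown ones.
session = """\
clip 0.0 12.5 brown
clip 12.5 30.0 blue
clip 42.5 7.5 brown
"""

total = sum(float(length)
            for line in session.strip().splitlines()
            for _, _, length, color in [line.split()]
            if color == "brown")
print(total)  # 20.0
```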
I loved my N900, and my N800 before that, and I would have loved to have seen successors. Ultimately, I ended up switching to Android because I was tired of things only available as apps. Since then, web technologies have gotten better, and it's become much more feasible to use almost exclusively websites.
They should have partnered not only with Intel, but with Palm, RIM or whatever other then-giant to rival Android. Those two went their own ways with WebOS and buying QNX, so maybe they could have agreed to form a consortium for an open and interoperable mobile OS
Boot2Gecko, or whatever the browser-as-operating-system was called. This was a project that should have focused on providing whatever its current users needed, expanding and evolving to do whatever those users wanted it to do better.
Instead it went chasing markets, abandoning existing users as it did so, in favour of potential larger pools of users elsewhere. In the end it failed to find a niche going forward while leaving a trail of abandoned niches behind it.
I adored my Firefox Phones. Writing apps was so easy I built myself dozens of little one-offs. Imagine if it had survived to today, its trivial html/css/js apps could be vibe coded on-device and be the ultimate personalized phone.
Luckily it wasn't long after Mozilla abandoned it that PWAs were introduced and I could port the apps I cared about.
For a few short months circa 2016 or 2017, KaiOS was the number one mobile OS in India. This was probably because of all the ultra-cheap KaiOS-powered Reliance Jio phones flooding the Indian market at the time.
I noticed the trend when I was working on a major web property for the Aditya Birla conglomerate. My whole team was pleasantly surprised, and we made sure to test everything in Firefox for that project. But everyone switched to Android + Chrome over the next few years, which was a shame.
I always thought Microsoft Popfly had huge potential and was way ahead of its time. It made building web mashups feel like playing with Lego blocks, drag, drop, connect APIs, and instantly see the result.
If something like that existed today, powered by modern APIs and AI, it could become the ultimate no-code creativity playground.
Macromedia Flash. Its scope and security profile were too big. It gave way to HTML's canvas. But man, the tooling is still nowhere near as good. MovieClips, my beloved. I loved it all.
The iPhone killed Flash, probably because it would've been a way to create apps for it, more probably because it would've been laggy in the 2007 hardware, and people would've considered the iPhone "a piece of junk".
Interesting how Flash became the almost universal way to play videos in the browser in the latter half of the 2000s (damn, I'm old...).
It's incredible to me that they killed the whole tool instead of making a JS/canvas port. Even without "full Flash websites", there's still a need for vector animations on the web.
As a Linux user, I hated Flash with a passion. It mostly didn't work despite several Linux implementations. About the time they sorted all the bugs out, it went away. Good riddance.
Was recently reading about Project Ara, the modular smartphone project by Google/Motorola [1]. Would have liked to see a few more iterations of the idea. Something more customizable than what we have today without having to take the phone apart.
It might be too soon to call it abandoned, but I was very intrigued by the Austral [1] language. The spec [2] is worth reading, it has an unusual clarity of thought and originality, and I was hoping that it would find some traction. Unfortunately it seems that the author is no longer actively working on it.
I played with Austral about a year ago and really wanted to use it for my projects, but as a hobbyist and mostly inept programmer it lacked the community and ecosystem I require. I found it almost intuitive and the spec does an amazing job of explaining the language. Would love to see it get a foothold.
The author got hired by Modular, the AI startup founded by the creators of LLVM and Swift, and is now working on the new language Mojo.
He’s been bringing a bunch of ideas from Vale to Mojo
Oh nice! I just had an excuse to try mojo via max inference, it was pretty impressive. Basically on par with vllm for some small benchmarks, bit of variance in ttft and tpot. Very cool!
Just on principle, I'd have liked to see it on the market for more than 49 days! It pains me as an engineer to think of the effort to bring a hardware device to market for such a minuscule run.
>This presentation introduces Via, a virtual file system designed to address the challenges of large game downloads and storage. Unlike cloud gaming, which suffers from poor image quality, input latency, and high hosting costs, Via allows games to run locally while only downloading game data on demand. The setup process is demonstrated with Halo Infinite, showing a simple installation that involves signing into Steam and allocating storage space for Via's cache.
>Via creates a virtual Steam library, presenting all owned games as installed, even though their data is not fully downloaded. When a game is launched, Via's virtual file system intercepts requests and downloads only the necessary game content as it's needed. This on-demand downloading is integrated with the game's existing streaming capabilities, leveraging features like level-of-detail and asset streaming. Performance metrics are displayed, showing download rates, server ping, and disk commit rates, illustrating how Via fetches data in real-time.
>The system prioritizes caching frequently accessed data. After an initial download, subsequent play sessions benefit from the on-disk cache, significantly reducing or eliminating the need for network downloads. This means the actual size of a game becomes less relevant, as only a portion of it needs to be stored locally. While server locations are currently limited, the goal is to establish a global network to ensure low ping. The presentation concludes by highlighting Via's frictionless user experience, aiming for a setup so seamless that users are unaware of its presence. Via is currently in early access and free to use, with hopes of future distribution partnerships.
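A toy sketch of the on-demand idea described above (nothing here reflects Via's actual implementation; fetch_chunk, the chunk size, and the API are invented):

```python
# Read-through chunk cache: reads are satisfied locally, and missing
# chunks are fetched from a remote source only when first touched.
CHUNK = 1 << 20  # 1 MiB

class OnDemandFile:
    def __init__(self, fetch_chunk):
        self.fetch_chunk = fetch_chunk  # callable: chunk index -> bytes
        self.cache = {}                 # chunk index -> bytes

    def read(self, offset, length):
        out = bytearray()
        first = offset // CHUNK
        last = (offset + length - 1) // CHUNK
        for idx in range(first, last + 1):
            if idx not in self.cache:   # cache miss: download on demand
                self.cache[idx] = self.fetch_chunk(idx)
            out += self.cache[idx]
        start = offset % CHUNK
        return bytes(out[start:start + length])

# First read downloads; the second is served entirely from cache.
f = OnDemandFile(lambda idx: bytes([idx % 256]) * CHUNK)
f.read(0, 4096)
f.read(0, 4096)
```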
I'm amazed the video still has under 4,000 views. Sadly, Flaherty got hired by xAI and gave up promoting the project.
Wait until you hear that almost all Unity games don't really have asset streaming because the engine loads things eagerly by default.
I don't see how this could take off. Internet speeds are getting quicker, disk space is getting cheaper, and this will slow down load times. And what's worse, the more you need this tech, the worse an experience you have.
It's a real shame its raster functionality wasn't integrated into Illustrator. Adobe really butchered the whole Macromedia portfolio, didn't they?
(For those unfamiliar, Illustrator is a pure vector graphics editor; once you rasterize its shapes, they become uneditable fixed bitmaps. Fireworks was a vector graphics editor that rendered at a constant DPI, so it basically let you edit raster bitmaps like they were vectors. It was invaluable for pixel-perfect graphic design. Nothing since lets you do that, though with high-DPI screens and resolution-independent UIs being the norm these days, this functionality is less relevant than it used to be.)
VPRI. I was really hoping it would profoundly revolutionise desktop application development and maybe even lead to a new desktop model, and instead they wound up the project without having achieved the kind of impact I was dreaming of.
The IBM Schools Computer. Developed by IBM Hursley in 1967, it was years ahead in its design, with display out to a television and storage on normal audio tape. It would have kick-started an educational revolution if it had been launched beyond the 10 prototype machines.
I came to say Opa too. I liked the language but the meteor-like framework it was bundled with, while nice for prototyping, was a pain to work around when it didn't do what you needed.
That said, frameworks were all the buzz back in the day, so the language alone probably wouldn't have gone anywhere without it.
Nokia Maps. There was a brief period in the early 2010s where Nokia had the best mapping product on the planet, and it was given away for free on Lumia phones at a time when TomTom and Garmin were still charging $60+ for navigation apps.
wua.la … the original version. You share part of your storage to get the same amount back as resilient cloud storage from others. Was bought and killed by LaCie (now Seagate). They later provided paid-for cloud storage under the same name but it didn’t take off.
Ceylon, JVM language, developed by Red Hat, now abandoned at Eclipse. Lost the race with Kotlin but proposed more than just syntax sugar over Java. Anonymous union types, comprehensions, proper module system...
In the late 90s there was a website called fuckedcompany which was a place where people could spill the beans about startups (mainly in silicon valley). It was anonymous and a pretty good view into the real state of tech. Now there is twitter/x but it's not as focused on this niche.
The closest sites I've found are Web3 is Going Just Great and Pivot to AI, which are newsfeeds of various car crashes in their respective hype arenas, although without any insider scoops/gossip.
OSI's session layer did very little more than TCP/UDP port numbers; in the OSI model you would open a connection to a machine, then use that connection to open a session to a particular application.
X.400 was a nice idea, but the ideal of having a single global directory predates security. I can understand why it never happened.
On X.509, the spec spends two chapters on attribute certificates, which I've never seen used in the wild. It's a shame; identity certificates do a terrible job at authentication.
Anyone remember Openmoko, the first commercialised open-source smartphone? It was heaps buggy, though, not really polished, etc. Its only redeeming feature was the open-source software and hardware (specs?).
I could think of many examples, but I'll talk about the top four that I have in mind, that I'd like to see re-evaluated for today's times.
1. When Windows Vista was being developed, there were plans to replace the file system with a database, allowing users to organize and search for files using database queries. This was known as WinFS (https://en.wikipedia.org/wiki/WinFS). I was looking forward to this in the mid-2000s. Unfortunately Vista was famously delayed, and in an attempt to get Vista released, Microsoft pared back features, and one of these features was WinFS. Instead of WinFS, we ended up getting improved file search capabilities. It's unfortunate that there have been no proposals for database file systems for desktop operating systems since. (A toy sketch of the idea follows this list.)
2. OpenDoc (https://en.wikipedia.org/wiki/OpenDoc) was an Apple technology from the mid-1990s that promoted component-based software. Instead of large, monolithic applications such as Microsoft Excel and Adobe Photoshop, functionality would be offered in the form of components, and users and developers can combine these components to form larger solutions. For example, as an alternative to Adobe Photoshop, there would be a component for the drawing canvas, and there would be separate components for each editing feature. Components can be bought and sold on an open marketplace. It reminds me of Unix pipes, but for GUIs. There's a nice promotional video at https://www.youtube.com/watch?v=oFJdjk2rq4E.
OpenDoc was a radically different paradigm for software development and distribution, and I think it could have been an interesting contender against the dominance that Microsoft and Adobe enjoy in their markets. OpenDoc actually did ship, and there were some products made using OpenDoc, most notably Apple's Cyberdog browser (https://en.wikipedia.org/wiki/Cyberdog).
Unfortunately, Apple was in dire straits in the mid-1990s. Windows 95 was a formidable challenger to Mac OS, and cheaper x86 PCs were viable alternatives to Macintosh hardware. Apple was an acquisition target; IBM and Apple almost merged, and there was also an attempt to merge Apple with Sun. Additionally, the Macintosh platform depended on the availability of software products like Microsoft Office and Adobe Photoshop, the very types of products that OpenDoc directly challenged. When Apple purchased NeXT in December 1996, Steve Jobs returned to Apple, and all work on OpenDoc ended not too long afterward, leading to this now-famous exchange during WWDC 1997 between Steve Jobs and an upset developer (https://www.youtube.com/watch?v=oeqPrUmVz-o).
I don't believe that OpenDoc fits in with Apple's business strategy, even today, and while Microsoft offers component-based technologies that are similar to OpenDoc (OLE, COM, DCOM, ActiveX, .NET), the Windows ecosystem is still dominated by monolithic applications.
I think it would have been cool had the FOSS community pursued component-based software. It would have been really cool to apt-get components from remote repositories and link them together, either using GUI tools, command-line tools, or programmatically to build custom solutions. Instead, we ended up with large, monolithic applications like LibreOffice, Firefox, GIMP, Inkscape, Scribus, etc.
3. I am particularly intrigued by Symbolics Genera (https://en.wikipedia.org/wiki/Genera_(operating_system)), an operating system designed for Symbolics Lisp machines (https://en.wikipedia.org/wiki/Symbolics). In Genera, everything is a Lisp object. The interface is an interesting hybrid of early GUIs and the command line. To me, Genera could have been a very interesting substrate for building component-based software; in fact, it would have been far easier building OpenDoc on top of Common Lisp than on top of C or C++. Sadly, Symbolics' fortunes soured after the AI winter of the late 1980s/early 1990s, and while Genera was ported to other platforms such as the DEC Alpha and later the x86-64 via the creation of a Lisp machine emulator, it's extremely difficult for people to obtain a legal copy, and it was never made open source. The closest things to Genera we have are Xerox Interlisp, a competing operating system that was recently made open source, and open-source descendants of Smalltalk-80: Squeak, Pharo, and Cuis-Smalltalk.
4. Apple's "interregnum" years between 1985 and 1996 were filled with many intriguing projects that were either never commercialized, were cancelled before release, or did not make a splash in the marketplace. One of the most interesting projects during the era was Bauhaus, a Lisp operating system developed for the Newton platform. Mikel Evins, a regular poster here, describes it here (https://mikelevins.github.io/posts/2021-07-12-reimagining-ba...). It would have been really cool to have a mass-market Lisp operating system, especially if it had the same support for ubiquitous dynamic objects as Symbolics Genera.
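On the WinFS idea in point 1, here's a toy of "organize and search for files using database queries", using SQLite over a directory walk (the schema and example query are invented; WinFS itself was far richer):

```python
# Index file metadata into a real database, then query it.
import os, sqlite3, time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT, name TEXT, size INTEGER, mtime REAL)")

for root, _, names in os.walk("."):
    for n in names:
        p = os.path.join(root, n)
        try:
            st = os.stat(p)
        except OSError:
            continue
        db.execute("INSERT INTO files VALUES (?, ?, ?, ?)",
                   (p, n, st.st_size, st.st_mtime))

# "Files modified this week, largest first" as a query, not a crawl.
week_ago = time.time() - 7 * 86400
for path, size in db.execute(
        "SELECT path, size FROM files WHERE mtime > ? ORDER BY size DESC LIMIT 5",
        (week_ago,)):
    print(size, path)
```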
OpenDoc was mostly given to Taligent (the Apple and IBM joint venture) to develop. It was full-on OO: about 35 files for a minimal application, which meant that Erich Gamma had to build a whole new type of IDE which was unusable. He likely learned his lesson: it's pretty hard to define interfaces between unknown components without forcing each one to know about all the others.
MIME types for mail addressed much of the demand for pluggable data types.
For me, DESQview. Microsoft tried to buy it in order to use its tech in their Windows system. I wonder how things would be today if they had been able to purchase it. But DESQview said "no".
Instead it went into a slow death spiral due to Windows 95.
Love seeing this one. My uncle was co-founder of Quarterdeck, and I grew up in a world of DESQview and QEMM. It was a big influence on me as a child.
Got a good family story about that whole acquisition attempt, but I don't want to speak publicly on behalf of my uncle. I know we've talked at length about the what-ifs of that moment.
I do have a scattering of some neat Quarterdeck memorabilia I can share, though:
DESQview/X took the wind out of DESQview's sails. It was, on paper, a massive upgrade. I had been running DESQview for years, with a dial-up BBS in the background.
But you couldn't actually buy /X. After trying to buy a copy, my publisher even contacted DESQ's marketing people to get a copy for me, and they wouldn't turn one over. Supposedly there were some copies actually sold, but too few, too late, and then /X was dropped. There was at least one more release of plain DESQview after that, but by then Windows was eating its lunch.
Windows Phone's UI is still with us, from Windows 8 onwards. Everything on 8, 10, and 11 is optimized for a touch interface on a small screen, which is ridiculous on a modern desktop with a 32" or so monitor and a trackball or mouse.
I'm booting and running Haiku on my Thinkpad. It's a from-scratch workalike of BeOS, and able to run Be software. Though, frankly, Be software is totally 1990s, so a lot of Linux software written for Qt has been ported to Haiku.
In the end I wound up with basically the same application software as on my Debian desktop, except running on Haiku instead of Linux. Haiku is noticeably snappier and more responsive than Linux+X+Qt+KDE, though.
In late September or early October 1996, Fry's Electronics placed a full-page promo ad on the back of the business section of the San Jose Mercury News for OS/2 4.0 "WRAP [sic]" in 256 pt font in multiple places. Oops!
Nah, that time has passed and there's not much to miss from the base OS. What would be interesting is for IBM to publish the source to the Workplace Shell and the underlying SOM code so it might get a new life running on one of the free *nixes.
Fortress language. It suffered from being too Haskell-like in terms of too many non-orthogonal features. Rust and Go applied lessons from it, perhaps indirectly.
Their operator precedence system was one of my favourite pieces of language design. The tl;dr was that you could group operators into precedence sets, and an expression involving operators that all came from the same set would have that set's precedence rules applied, but if you had an expression involving mixed sets you needed to add the parentheses. Crucially, they also supported operator overloading, and the same operator could be used in a different set as long as everything could be parsed unambiguously. (Caveat: I never used the language, I just read about the operator design in the docs, and it was very eye-opening in the sense that every other language's operator precedence system suddenly felt crude and haphazard.)
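A sketch of the rule as described (the sets and the flat representation are simplified; real Fortress also had precedence rules within a set): operators from one set compose freely, but mixing sets without parentheses is rejected rather than silently resolved.

```python
# Reject expressions that mix operators from different precedence sets.
PRECEDENCE_SETS = {
    "+": "arithmetic", "-": "arithmetic",
    "*": "arithmetic", "/": "arithmetic",
    "&": "bitwise", "|": "bitwise", "^": "bitwise",
}

def check(ops):
    """ops: the operators appearing at one parenthesization level."""
    sets = {PRECEDENCE_SETS[o] for o in ops}
    if len(sets) > 1:
        raise SyntaxError(
            f"operators from different sets {sorted(sets)}: add parentheses")

check(["+", "*"])      # fine: one set, its internal precedence applies
try:
    check(["+", "&"])  # mixed sets: the programmer must parenthesize
except SyntaxError as e:
    print(e)
```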
Humane AI Pin. I think they launched 2 years too early and were too greedy with device pricing and the subscription. Also, if they had focused on being an accessory for Android/iPhone, they could have reduced power usage and cost as well.
Their execution was of course bad, but I think today's LLM models are better and faster, and there are many more OSS models to reduce costs. The hardware looked nice, though, and the pico projector was an interesting concept, even if not the best executed.
Wine predates ReactOS. It was basically a FOSS duplicate of Sun's WABI.
I wrote a bunch of software in Borland Delphi, which ran in Windows, Wine, and ReactOS with no problems. Well, except for ReactOS' lack of printing support.
As long as you stay within the ECMA or published Windows APIs, everything runs fine in Wine and ReactOS. But Microsoft products are full of undocumented functions, as well as checks to see if they're running on real Windows. That goes back to the Windows 3.1 days, when 3.1 developers regularly used OS/2 instead of DOS, and Microsoft started adding patches to fail under OS/2 and DR-DOS. So all that has to be accounted for by Wine and ReactOS. A lot of third-party software uses undocumented functions as well, especially stuff written back during the days when computer magazines were a thing, and regularly published that kind of information. A lot of programmers found the lure of undocumented calls to be irresistible, and they wound up in all kinds of commercial applications where they really shouldn't have been.
In my experience anything that will load under Wine will run with no problems. ReactOS has some stability problems, but then the developers specifically call it "alpha" software. Despite that, I've put customers on ReactOS systems after verifying all their software ran on it. It gets them off the Microsoft upgrade treadmill. Sometimes there are compatibility problems and I fall back to Wine on Linux. Occasionally nothing will do but real Windows.
Hard disagree. The Humane AI Pin ad was a classic silicon valley ad that screamed B2VC and demonstrated nothing actually useful that couldn't be done with an all-in-one phone app (or even the ChatGPT app) and bluetooth earbuds that you already have.
Which reduces its innovation level to nothing more than a chest-mounted camera.
You want real B2C products that people would actually buy? Look at the Superbowl ads instead. Then watch the Humane ad again. It's laughable.
Nah, Glass was impressive for such a big org like Google, but smartphones are popular because people use them like portable televisions. Glanceable info and walking directions are more like an Apple Watch-sized market, without the fashion element. Meta is about to find out.
Google Glass sucks, though, and glasses will never be a thing. Google and Meta and ... can spend $8T and come up with the most insane tech, etc., but no one will be wearing f'ing glasses :)
Apple’s scanning system for CSAM. The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
It was an extremely interesting effort where you could tell a huge amount of thought and effort went into making it as privacy-preserving as possible. I’m not convinced it’s a great idea, but it was a substantial improvement over what is in widespread use today and I wanted there to be a reasonable debate on it instead of knee-jerk outrage. But congrats, I guess. All the cloud hosting systems scan what they want anyway, and the one that was actually designed with privacy in mind got screamed out of existence by people who didn’t care to learn the first thing about it.
Good riddance to a system that would have provided precedent for client-side scanning for arbitrary other things, as well as likely false positives.
> I wanted there to be a reasonable debate on it
I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
We need to just keep making it clear the answer is "no", and hopefully strengthen that to "no, and perhaps the massive smoking crater that used to be your political career will serve as a warning to the next person who tries".
This. No matter how cool the engineering might have been, from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for… Apple was very much creating the Torment Nexus from “Don’t Create the Torment Nexus.”
> from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for…
I can’t think of a single thing that’s come along since that is even remotely similar. What are you thinking of?
I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
The problem isn’t the system as implemented; the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”
Once that idea appears, it allows every lobbyist and insider to say “mandate this, we’ll do something like what Apple did but for other types of Bad People” and all of a sudden you have regulations that force messaging systems to make this possible in the name of Freedom.
Remember: if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image. There are many in politics for whom that level of control is the actual goal.
> the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”
Apple never made that assertion, and the system they designed is incapable of doing that.
> if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image.
Apple’s system cannot do that. If you change parts of it, sure. But the system they proposed cannot.
To reiterate what I said earlier:
> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
So far, you are saying that you don’t have a problem with the system Apple designed, and you do have a problem with some other design that Apple didn’t propose, that is significantly different in multiple ways.
Also, what do you mean by “model”? When I used the word “model” it was in the context of using another system as a model. You seem to be using it in the AI sense. You know that’s not how it worked, right?
> Chat Control, and other proposals that advocate backdooring individual client systems.
Chat Control is older than Apple’s CSAM scanning and is very different from it.
> Clients should serve the user.
Apple’s system only scanned things that were uploaded to iCloud.
You missed the most important part of my comment:
> I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
I don’t think you can accurately describe it as client-side scanning and false positives were not likely. Depending upon how you view it, false positives were either extremely unlikely, or 100% guaranteed for practically everybody. And if you think the latter part is a problem, please read up on it!
> I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
Right, well I wanted a debate. And Apple changed their minds. So how is it reminding you of that? Neither of those things apply here.
No, but I have a hard time imagining a bug that would meaningfully compromise this kind of system. Can you give an example?
> How about making Apple vulnerable to demands from every government where they do business?
They already are. So are Google, Meta, Microsoft, and all the other giants we all use. And all those other companies are already scanning your stuff. Meta made two million reports in 2024Q4 alone.
Apple designed a system. People guessed at what it did. Their guesses were way off the mark. This poisoned all rational discussion on the topic. If you imagine a system that works differently to Apple’s system, you can complain about that imaginary system all you want, but it won’t be meaningful, it’s just noise.
You understand it just fine; you're just trying to pass your fantasy of an immutable, safe future as rational while painting the obvious objections based on the real world as meaningless noise.
There is no place for spyware of any kind on my phone. Saying that it is to "protect the children" and "to catch terrorists" does not make it any more acceptable.
Compiler errors were the kind of thing where you could miss a bracket on line 47 but end up with an error complaining about something 20 lines away.
Rust (in particular) got everyone to up their game with respect to meaningful compiler errors.
But in the days of XHTML? Error messages were arcane; you had to dive in to see what the problem actually was.
Nice list. Some thoughts:
- I think without the move to NeXT, even if Jobs had come back to Apple, they would never have been able to get to the iPhone. iOS was - and still is - a unix-like OS, using unix-like philosophy, and I think that philosophy allowed them to build something game-changing compared to the SOTA in mobile OS technology at the time. So much so that Android follows suit. It doesn't have a command line, and installation is fine, so I'm not sure your line of reasoning holds strongly. One thing I think you might be hinting at, though, is a missed trick: macOS today could learn a little from the way iOS and iPadOS are forced to do things and centralise configuration in a single place.
- I think transaction processing operating systems have been reinvented today as "serverless". The load/execute/quit cycle you describe is how you build in AWS Lambdas, GCP Cloud Run Functions or Azure Functions (see the sketch after this list).
- Most of your other ideas (with an exception, see below) died because of people trying to grab money rather than build cool tech, and arguably the free market decided to vote with its feet. I do wonder when we might next get a major change in hardware architectures, though; it does feel like we've now got "x86" and "ARM" and that's that for the next generation.
- XHTML died because it was too hard for people to get stuff done. The forgiving nature of the HTML specs is a feature, not a bug. We shouldn't expect people to be experts at reading specs to publish on the web, nor should it need special software that gatekeeps the web. It needs to be scrappy, and messy and evolutionary, because it is a technology that serves people - we don't want people to serve the technology.
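On the serverless point, here is a minimal sketch of that load/do-something/exit shape, written as an AWS Lambda-style Python handler. The `name` field in the event is a made-up example:

```python
# Minimal sketch of the transaction-processor / serverless shape: the
# platform loads this module, calls the handler once per request, and
# may tear the process down afterwards -- load, do something, exit.
import json

def handler(event, context):
    # 'event' carries the whole request, much like argv/stdin for a CGI
    # program; 'name' is a hypothetical field for illustration.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```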
> - XHTML. Have you ever read the parsing rules for HTML 5, where the semantics for bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman. Would it kill people to have to close their tags properly?
Amen. Postel’s Law was wrong:
https://datatracker.ietf.org/doc/html/rfc9413
We stop at the first sign of trouble for almost every other format, we do not need lax parsing for HTML. This has caused a multitude of security vulnerabilities and only makes it more difficult for pretty much everybody.
The attitude towards HTML5 parsing seemed to grow out of this weird contrarianism that everybody who wanted to do better than whatever Internet Explorer did had their head in the clouds and that the role of a standard was just to write down all the bugs.
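To make the contrast concrete, here is a small sketch using only the Python standard library: the XML parser (the XHTML model) refuses the document at the first malformed tag, while the HTML parser (the HTML5 model) recovers without complaint.

```python
# Strict vs. lax parsing of the same broken markup, using only the
# Python standard library. XML (the XHTML model) rejects the document;
# the HTML parser (the HTML5 model) recovers and keeps going.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

broken = "<html><body><p>unclosed paragraph</body></html>"

try:
    ET.fromstring(broken)          # XHTML-style: stop at the first error
except ET.ParseError as e:
    print("strict parser stopped:", e)

class Collector(HTMLParser):
    def handle_data(self, data):
        print("lax parser recovered text:", data)

Collector().feed(broken)           # HTML5-style: error recovery, no complaint
```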
I've argued this for years on this site...but AOL.
At its best, having IM, email, browser, games, keywords, chats, etc. was a beautiful idea IMO. That they were an ISP seemed secondary or even unrelated to the idea. But they chose to charge for access even in the age of broadband, and people decided they'd rather not pay it, which is to be expected. I often wonder if they'd have survived as a software company otherwise.
They were basically a better Facebook before Facebook, in my opinion.
Google Wave.
Edit: you asked why. I first saw it at SELF where Chris DiBona showed it to me and a close friend. It was awesome. Real time translation, integration of various types of messaging, tons of cool capabilities, and it was fully open source. What made it out of Google was a stripped down version of what I was shown, the market rejected it, and it was a sad day. Now, I am left with JIRA, Slack, and email. It sucks.
Google Wave was built on an awesome technology layer, and then they totally blew it on the user interface... deciding to treat it as a set of separate items instead of a single document everyone everywhere all at once could edit... killed it.
It made it seem needlessly complicated, and effectively erased all the positives.
I was blown away by the demo, but after I thought about it, it seemed like a nightmare to me. All the problems Slack has of having to manually check channels for updates, except ×100. (Yeah, I get that Slack wasn't available then. My point is that it seemed impossible to keep up with nested, constantly updated hierarchical threads.) Keeping up with channels on Slack is bad enough, so imagine if Wave had succeeded. It'd be even worse.
Wave was great for conversation with one or two other people on a specific project, which I'm sure most people here used it for. I can't imagine it scaling well beyond that.
Is there a video or anything of this version of Wave?
Isn't Nextcloud (including Nextcloud Talk) a viable alternative? Certainly, something like Discord (centralized and closed source) isn't.
Immediately thought of this.
Even the watered-down version of Wave was something I used at my startup; it was effectively our project management tool. And it was amazing at that.
I don't know how it would fare compared to the options available today, but back then, it shutting down was a tremendous loss.
wave was fucking amazing. buggy but amazing
Google sucked (and sucks) at executive function because they completely lack appreciation for proper R&D and long-term investment, and also kill things people use and love.
Honestly, a lot of the time they seem to be in "what do humans want?" mode.
Discord is, function-wise, the best now...
Optane persistent memory had a fascinating value proposition: stop converting data structures for database storage and just persist the data directly. No more booting or application launch or data load: just pick up where you left off. Died because it was too expensive, but probably long after it should have.
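A toy sketch of that value proposition, using an ordinary memory-mapped file as a stand-in for persistent memory. (Real Optane programming typically went through DAX mounts and libraries like PMDK; the file name here is made up.)

```python
# Toy sketch of the persistent-memory idea: treat a memory-mapped file
# as the data structure itself, so there is no load/serialize step --
# you just pick up where you left off. An ordinary file stands in for
# real persistent memory here.
import mmap, os, struct

PATH = "counter.pmem"   # hypothetical file standing in for pmem
SIZE = 8                # one 64-bit counter

if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    buf = mmap.mmap(f.fileno(), SIZE)
    (count,) = struct.unpack_from("<Q", buf, 0)   # state is just... there
    struct.pack_into("<Q", buf, 0, count + 1)     # update in place
    buf.flush()                                   # persistence point
    print("this program has run", count + 1, "times")
```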
VMs persist memory snapshots (as do Apple's containers, for macOS at least), so there's still room for something like that workflow.
Optane was impressive from a tech standpoint.
We were about to get rid of the split between RAM and disk and use a single stick for both!
+1 for 3D XPoint.
The technology took decades to mature, but the business people didn’t have the patience to let the world catch up to this revolutionary technology.
The world had already caught up. By the time it was released, flash memory was already nearing its speed and latency, to the point that the difference wasn't worth the cost.
I have an optane drive with the kernel on it, instant boot!
Definitely Opa: http://opalang.org/
In 2011, before TypeScript, Next.js or even React, they had seamless server-client code, in a strongly typed functional language with support for features like JSX-like inline HTML, async/await, string interpolation, built-in MongoDB ORM, CSS-in-JS, and many syntax features that were added to ECMAScript since then.
I find it wild how this project was 90%+ correct about how we'd build web apps 14 years later.
Sandstorm: it seemed quite nice with a lot of possibilities when it launched in 2014, but it didn’t really take off and then it moved to sandstorm.org.
The creator, kentonv (on HN), commented about it recently here https://news.ycombinator.com/item?id=44848099
Yahoo Pipes. It was so great at creating RSS feeds and custom workflows. There are replacements now like Zapier and n8n, but I loved it. Also Google Reader, which has been mentioned multiple times already.
I never used it, but Yahoo pipes sounds like it was awesome whenever I hear people talk about it.
I don't know if it was Yahoo Pipes that died, or a mainstream internet based on open protocols and standards.
Yahoo Pipes was what internet should have been. We're so many decades into computing and that kind of inter-tool linking has only barely been matched by unix pipes.
hey, but we got MCP...
A lot of things on https://killedbygoogle.com/ . I used to use 30-40 Google products and services. I'm down to 3-4.
Google Picasa: Everything local, so fast, so good. I'm never going to give my photos to G Photos.
Google Hangouts: Can't keep track of all the Google chat apps. I use Signal now.
Google G Suite Legacy: It was supposed to be free forever. They killed it, tried to make me pay. I migrated out of Google.
Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.
Google Finance: Tracked my stocks and funds there. Then they killed it. Won't trust them with my data again.
Google NFC Wallet: They killed it. Then Apple launched the same thing, and took over.
Google Chromecast Audio: It did one thing, which is all I needed. Sold mine as soon as they announced they were killing it.
Google Chromecast: Wait, they killed Chromecast? I did not know that until I started writing this..
Google Reader: I will forever be salty about how Google killed something that likely required very little maintenance in the long run. It could have stayed exactly the same for a decade and I wouldn't have cared, because I use an RSS reader exactly the same way now that I did back in 2015.
Yes. That was the single worst business decision in Google history, as somebody correctly noted. It burned an enormous amount of goodwill for no gain whatsoever.
Killing Google Reader affected a relatively small number of users, but these users disproportionately happened to be founders, CTOs, VPs of engineering, social media luminaries, and people who eventually became founders, CTOs, etc. They had been painfully taught to not trust Google, and, since that time, they didn't. And still don't.
Just think of the data mining they could have had there.
They had a core set of ultra-connected users who touched key aspects of the entire tech industry. The knowledge graph you could have built out of what those people read and shared…
They could have just kept the entire service running with, what, 2 software engineers? Such a waste.
This would require the decision-maker to think and act at the scale and in interests of the entire company. Not at the scale of a promo packet for next perf: "saved several millions in operation costs by shutting down a low-impact, unprofitable service."
I still use Picasa; it works fine. However, when Google severed the GDrive-photo linking, it meant my photos didn't automatically download from Google to my PC. This is what killed Google for me.
> Google Play Music: I had uploaded thousands of MP3 files there. They killed it. I won't waste my time uploading again.
You can argue whether it's as good as GPM or not, but it's false to imply that your uploaded music disappeared when Google moved to YouTube Music. I made the transition, and all of my music moved without a new upload.
You made the transition, under differing licensing terms. Not always an option.
Chromecast Audio still works! They just don't sell them anymore. I use mine every day, and have been keeping an eye out for anyone selling theirs...
Hmm, good to know. But given Google's history, I assumed that it would stop working.
I also need to sell my Google Chromecast with Google TV 4K. Brand new, still in its shrink wrap. Bought it last year, to replace a flaky Roku. It was a flaky HDMI cable instead. I trust Roku more than Google for hardware support.
Immich is a great replacement for Google Photos, if maybe not Picasa.
I'm still amused that they killed Google Notebook and then a few years later created Google Keep, an application with basically the same exact feature set.
You can say that for a fair few of the services mentioned by GP.
Google killed a lot of things to consolidate them into more "integrated" (from their perspective) product offerings. Picasa -> Photos, Hangouts -> Meet, Music -> YT Premium.
No idea what NFC Wallet was, other than the Wallet app on my phone that still exists and works?
The only one I'm not sure about is Chromecast - a while back mine had an "update" to start using their newer AI Assistant system for managing it. Still works.
I'm still upset that Google Maps no longer tracks my location. It was very useful to be able to go back and see how often and where I had gone.
Is there another app where I can store this locally?
Google Maps still tracks my location.
The difference is they no longer store the data on their servers, it's stored on your phone (iPhone/Android)
https://support.google.com/maps/answer/6258979
That way, they can't respond to requests for that data by governments as they don't have it.
I can look on my phone and see all the places I've been today/yesterday, etc
Arc and its free Arc Mini companion. iOS. Been using it since Facebook eclipsed Moves app. A decade later, it's still not as good as Moves.
Apple Maps added a Visited Places (beta) feature recently.
Strava? :-) Half-joking, half-serious, I haven't used Strava in years, I don't remember all its capabilities.
Edit: Missed the "locally" part. Sorry no suggestions. Maybe Garmin has something?
Nope, Garmin only tracks your location when you record an activity that uses gps, which is good, frankly.
Google Search: Not officially dead yet, but....
yup, losing 0.000087% year-over-year so in 865 billion years it’ll be dead :)
That was probably me, when I stopped using Google Search some years ago. :-) Got tired of the ads, the blog spam, and AI-generated content crap floating to the top of their results page.
The https://udm14.com/ flavor of Google is quite usable, though, esp with notable operators like inurl:this-or-that. But, all in all, yeah, gimme back vanilla Google search from 2008-2010 or so. Back then it was definitely a tool (I worked in investigative journalism at the time), whereas currently "searching" stands for sitting fingers crossed and hoping for the better. But, oh well. </rant>
That's more what I meant. Sure, lots of people still type stuff into the URL bar that takes them to www.google.com/search. But whatever you want to call that results page now, it's no longer Google Search in anything but name.
I'm still using:
- free G Suite
- Play Music
- Finance
- NFC wallet (that's just Google Wallet, isn't it?)
- Chromecast, video and audio-only
I guess Play Music is now YouTube Music, and doesn't have uploads, so that can be considered dead, but the others seem alive to me.
YouTube Music still supports uploads.
https://support.google.com/youtubemusic/answer/9716522
I used Picasa and loved it, until I realized I want all my photos available from all my devices at all times and so gave in to Google Photos (for access, not backup)
I use SyncThing for that purpose. It syncs across my phone, my laptops, and my Synologies. But I don't sync all my photos.
I don't like the thought of providing Google thousands of personal photos for their AI training. Which will eventually leak to gov't agencies, fraudsters, and criminals.
Google Desktop Search (and also the Search Appliance if you were an SMB).
Why did you keep on using so many Google products if those products get cancelled?
Why didn’t you quit Google after, say, the third product you used got canned?
I used Google Talk, then Hangouts, but once they switched to Meet, I gave up on them. By then my family was all using Hangouts, and we never settled on a new service, because one of my siblings didn't want to support any chat services that don't freely give user information to the government, and the rest of us didn't want to use a chat platform that does freely give user information to the government.
I think Chromecast has been replaced by Google TV which is a souped up Chromecast.
Isn't it "Google TV Streamer" now?
From what I can tell (since I am just finding out about this today), they stopped manufacturing the old Chromecast hardware, and at some point, will stop supporting the old devices. The old devices may stop working in the future, for example, because they sunset the servers. Like their thermostats. Who knows?
I wish there was some law that requires open-sourcing firmware and flashing tools if a company decides to EOL a product ...
Picasa definitely went against the grain of Google, which is all about tying you to online services.
Hangouts had trouble scaling to many participants. Google Meet is fine, and better than e.g. MS Teams.
Legacy suite, free forever? Did they also promise a pony?..
Play Music: music is a legal minefield. Don't trust anybody commercial who suggests you upload music you did not write yourself.
Finance: IDK, I still get notifications about the stocks I'm interested in.
NFC Wallet: alive and kicking, I use it literally every day to pay for subway.
Can't say anything about Chromecast. I have a handful of ancient Chromecasts that work. I don't want any updates for them.
The information superhighway
The internet before advertising, artificial intelligence, social media and bots. When folks created startups in their bedrooms or garages. The days when google slogan was “don’t be evil”.
AKA "back when Marc Andreessen had hair and not enough money to build an apocalypse bunker on a personal island."
And when no one knew you were a dog and neither did they care.
Animated GIFs of cats, banner bars, and pixels that cost one dollar until a million were sold.
And it all ran on Chuck Norris' personal computer.
That’s the internet before commercialisation and silos.
Heroku? I know it's still around, though IDK who uses it, but I miss those days when it was thriving. One language, one deployment platform, one database, a couple plugins to choose from, everything simple and straightforward, no decision fatigue.
I often wonder, if AI had come 15 years earlier, would it have been a ton better because there weren't a billion different ways to do things? Would we have ever bothered to come up with all the different tech, if AI was just chugging through features efficiently, with consistent training data etc.?
As soon as they put a persistent Salesforce brand banner across the top which did nothing but waste space and put that ugly logo in our face every day, my team started our transition off Heroku pretty much right away.
> One language, one deployment platform, one database, a couple plugins to choose from, everything simple and straightforward, no decision fatigue.
Sounds not that different from containers, if you just choose the most popular tooling.
Small projects: docker compose, postgres, redis, nginx
Big projects: kubernetes, postgres, redis, nginx
This is why Heroku lost popularity.
Yes. And fittingly, Docker was born out of a Heroku competitor.
Didn't they offer free compute? IIRC all free compute on the Internet went away with the advent of cryptocurrencies as it became practical to abuse the compute and translate it directly into money.
My company still uses Heroku in production actually. Every time I see the Salesforce logo show up I wince, but we haven't had any issues at all. It continues to make deployment very easy.
Vine. It was already pretty big back in 2013 but Twitter had no idea what to do with it. TikTok actually launched just a few months before Vine was shut down and erased from the internet.
Whoever took the decision to kill Vine was an absolute moron, even without hindsight. It was square videos; how hard could it have been to shove an ad banner above them and call it a day? Incredible.
They also killed Periscope right as the explosion of streaming online video happened... Twitter has always been pretty incompetent.
I will never forgive twitter for this catch and kill of a platform so full of life
Perhaps because they already had Periscope that no one used. It was a "buy competitor to kill it" play that didn't have the desired effect.
Amusingly Periscope was their clone of Meerkat which was briefly popular before they killed it.
I've thought about this too. Imagine all the drama the US government could've avoided if Vine had won over TikTok!
With Elon running it? He probably would have actively sold it to China.
In a world where Vine is as successful as TikTok ended up being, who’s to say they get to a point where selling to Musk even happens?
Midori, Microsoft's capability-based security OS[1]. Rumor has it that it was getting to the point where it was able to run Windows code, so it was killed through internal politics, but who knows! It was the Fuchsia of its time...
[1] https://en.wikipedia.org/wiki/Midori_%28operating_system%29
Midori was fascinating. Joe Duffy's writing on it is the most comprehensive I've seen: https://joeduffyblog.com/2015/11/03/blogging-about-midori/
I've heard someone at Microsoft describe it as a moonshot but also a retention project; IIRC it had a hundred plus engineers on it at one time, including a lot of very senior people.
Apparently a bunch of research from Midori made it into .NET so it wasn't all lost, but still...
> retention project
Never heard this phrase before, but I can definitely see this happening at companies of that size
The technical foundation seems interesting, but knowing Microsoft this would have just become yet another bloated mess with its own new set of problems. And by now it would have equally become filled with spyware and AI "features" users don't want.
CLPM, the Common Lisp Package Manager. The Quicklisp client doesn't do HTTPS, ql-https doesn't do Ultralisp, and OCICL (which I'm currently using) doesn't do system-wide packages. CLPM is a great project, but it's gone neglected long enough that it's bitrotted and needs some thorough patching to be made usable. Fortunately Common Lisp is still as stable as it has been for 31 years, so it's just the code which interacts with 3rd-party libraries that needs updating.
The Lockheed D-21 drone. Supersonic ramjet without the complexity of scramjet or the cost of turbojet, hamstrung by the need for a manned launch platform (making operations safety-critical… with predictable results) and recovery to get data off it. Twenty or forty years later it would have been paired by a small number of high-cost launcher UAVs and had its cost driven down to disposable, with data recovery over radio comms… but twenty to forty years later there’s nothing like it, and the maturation of satellites means there almost certainly never will be.
The "Eve" programming language / IDE - https://witheve.com
It was a series of experiments with new approaches to programming. Kind of reminded me of the research that gave us Smalltalk. It would have been interesting to see where they went with it, but they wound down the project.
https://eyg.run/ is heavily inspired by eve!
Why did they not pursue this? Were there any applications using this in the wild? It was not immediately obvious from their github repository.
Elm programming language. Arguably not dead but somewhat incomplete and not actively worked on.
A few commits recently.
There are lots of competing MLs you can use instead:
- F# (Fable)
- ReasonML
- OCaml (Bucklescript)
- Haskell
- PureScript
IMO the problem with Elm was actually The Elm Architecture.
What's "the Elm architecture"?
opa, along the same lines - really nice ML based language for isomorphic full stack web development.
Yeah, Opa was wildly ahead of its time, I actually just wrote a top level comment about it. Basically Next.js+TypeScript+modern ECMAScript features, but in 2011.
Microsoft Silverlight.
Full C# instead of god-forbidden JS.
Full vector, DPI-aware UI, with grid, complex animation, and all the other stuff that HTML5/CSS didn't have in 2018 but Silverlight had even in 2010 (probably even earlier).
MVVM pattern, two-way bindings. Expression Blend (basically Figma) that allowed designers to create UI that was XAML, had sample data, and could be used by devs as-is with maybe some cleanup.
Excellent tooling, static analysis, debugging, what have you.
Rendered and worked completely the same in any browser (Safari, IE, Chrome, Opera, Firefox) on Mac and Windows.
If that thing still worked, boy would we be in a better place regarding web apps.
Unfortunately, the iPhone killed Adobe Flash, and Silverlight as an aftermath. Too slow a processor, too much energy consumption.
I am happy this one died. It was just another attempt by Microsoft to sidestep open web standards in favor of a proprietary platform. The other notorious example is Flash, and both should be considered malware.
Open web standards are great but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now but where would we be if that came sooner?
> Open web standards are great but consider where we could have been if competition drove them a different way? We're still stuck with JavaScript today (wasm still needs it). Layout/styling is caught up now but where would we be if that came sooner?
Why do you think JavaScript is a problem? And a big enough problem to risk destroying open web standards.
Did Silverlight have the same security issues as Flash?
Both a Silverlight and Adobe Flex fan here!
I loved silverlight. Before I got a “serious” job, I was a summer intern at a small civil engineering consultancy that had gradually moved into developing custom software that it sold mostly to local town/city/county governments in Arizona (mostly custom mapping applications; for example, imagine Google Maps but you can see an overlay of all the street signs your city owns and click on one to insert a note into some database that a worker needs to go repair it… stuff like that).
Lots of their stuff was delivered as Silverlight apps. It turns out that getting office workers to install a blessed plugin from Microsoft and navigate to a web page is much easier than distributing binaries that you have to install and keep up to date. And developing for it was pure pleasure; you got to use C# and Visual Studio, and a GUI interface builder, rather than the Byzantine HTML/JS/CSS ecosystem.
I get why it never took off, but in this niche of small-time custom software it was really way nicer than anything else that existed at the time. Web distribution combined with classic desktop GUI development.
RethinkDB. Technically it still exists (under The Linux Foundation), but (IMO) the original company's widening scope (the Horizon BaaS) killed its momentum and eventually led to its demise.
Man, I loved the original concept from demos but never built anything real with it. Curious if anyone did?
https://maruos.com/ https://en.wikipedia.org/wiki/Ubuntu_Edge
Connect your phone to a display, mouse, keyboard and get a full desktop experience.
At the time smartphones were not powerful enough, cables were fiddly (adapters, HDMI, USB A instead of a single USB c cable) and virtualization and containers not quite there.
Today, going via pkvm seems like a promising approach. Seamless sharing of data, apps, etc. will take some work, though.
Microsoft Songsmith is another one that deserved a second life. It let you hum or sing a melody and would auto-generate full backing tracks, guitar, bass, drums, chords, in any style you chose.
It looked a bit goofy in the promo videos, but under the hood it was doing real-time chord detection and accompaniment generation. Basically a prototype of what AI music tools like Suno, Udio, or Mubert are doing today, fifteen years too early.
If Microsoft had kept iterating on it with modern ML models, it could’ve become the "GarageBand for ideas that start as a hum."
It also had one of the best campy promotional videos ever produced: https://www.youtube.com/watch?v=k8GIwFkIuP8
ReactOS, the effort to create a free and open source Windows NT reimplementation.
It has been in existence in some form or another for nearly 30 years, but did not gain the traction it needed and as of writing it's still not in a usable state on real hardware. It's not abandoned, but progress on it is moving so slow that I doubt we'll ever see it be released in a state that's useful for real users.
It's too bad, because a drop in Windows replacement would be nice for all the people losing Windows 10 support right now.
On the other hand, I think people underestimate the difficulty involved in the project and compare it unfavorably to Linux, BSD, etc. Unix and its source code was pretty well publicly documented and understood for decades before those projects started, nothing like that ever really existed for Windows.
They had no chance. Look how long it took for Wine to get where they are. Their project is Wine + a kernel + device-driver compatibility, and a moving target.
The easiest way to avoid patent liabilities is to always be 20 years behind.
Wine, Proton and virtualization all got good enough that there's no need for a half-baked binary-compatible Windows reimplementation, and I think that took a lot of the oxygen out of what could have been energy towards ReactOS. It's a cool concept but not really a thing anybody requires.
> ReactOS, the effort to create a free and open source Windows NT reimplementation.
Some projects creep along slowly until something triggers an interest and suddenly they leap ahead.
MAME's Tandy 2000 implementation was unusable, until someone found a copy of Windows 1.0 for the Tandy 2000, then the emulation caught up until Windows ran.
Maybe ReactOS will get a big influx of activity after Windows 10 support goes offline in a couple days, or even shortly after when you can't turn AI spying off, not even three times a year.
RAM Disks. Basically extremely fast storage using RAM sticks slotted into a specially made board that fit in a PCIe slot. Not sure what happened to the project exactly but the website disappeared sometime in 2023.
The idea that you could read and write data at RAM speeds was really exciting to me. At work it's very common to see microscope image sets anywhere from 20 to 200 GB and file transfer rates can be a big bottleneck.
Archive capture circa 2023: https://web.archive.org/web/20230329173623/https://ddramdisk...
HN post from 2023: https://news.ycombinator.com/item?id=35195029
There's now a standard for memory over a physical PCIe interface (https://en.wikipedia.org/wiki/Compute_Express_Link) and off-the-shelf products (https://www.micron.com/products/memory/cxl-memory).
I’m confused why this can’t be done in software?
Products to attach RAM to expansion slots have long existed and continue to be developed. It's a matter of adding more memory once all of the DIMMs are full.
What to do with it, once it's there, is a concern of software, but specialized hardware is needed to get it there.
You can do this in software, I tried it a few times with games and just other stuff ~10 years ago. Why would it have to be a hardware solution?
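Right: on Linux the software version is essentially tmpfs. Here is a rough sketch comparing write throughput to /dev/shm (RAM-backed) against an ordinary disk path. Both paths are assumptions, and on distros where /tmp is itself tmpfs you'd want a real disk path instead.

```python
# Sketch of a "RAM disk in software" on Linux: /dev/shm is tmpfs, a
# RAM-backed filesystem, so writing there needs no special hardware.
import os, time

def write_speed(path, mb=256):
    chunk = b"\x00" * (1 << 20)              # 1 MiB of zeroes
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())                 # force it out of the page cache
    return mb / (time.perf_counter() - start)

print("tmpfs (RAM):", round(write_speed("/dev/shm/bench.bin")), "MB/s")
print("disk       :", round(write_speed("/tmp/bench.bin")), "MB/s")
```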
Soon you will be able to buy a Gigabyte AI TOP CXL R5X4: a PCIe expansion card with up to 512 GB of RAM over four DIMMs.
Non DAW. Breaking up each function of the DAW into its own application gave a better experience in each of those functions, especially when you only needed that aspect: you were not working around everything else that the DAW offers. The integration between the various parts was not all that it could be, but I think the idea has some real potential.
https://non.tuxfamily.org
Thought about Non immediately, but I figured it must have (had) about 2 other users amongst HNers, though. :) Nice to see it mentioned.
I used it quite a bit to produce radio shows for my country's public broadcasting. Because Non's line-oriented session format was so easy to parse with classic Unix tools, I wrote a bunch of scripts for it with Awk etc. (E.g. calculating the total length of clips highlighted with brown color in the DAW -- which was stuff meant for editing out; or creating a poor man's "ripple editing" feature by moving loosely-placed clips precisely side by side; or, eventually, converting the sessions to Samplitude EDL format, and, from there, to Pro Tools via AATranslator [1] (because our studio was using PT), etc. Really fun times!)
1: https://aatranslator.com.au/
Maemo/MeeGo. I know there is Sailfish still around, but things would have been very different today if Nokia had put all its weight on it back then.
In my ideal world, Maemo/Meego and Palm's WebOS (not LG's bastardization of it) would be today's Android and iOS.
Apple would have inevitably done their own thing, but it would have been really nice to have two widely used, mature and open mobile Linux platforms.
I loved my N900, and my N800 before that, and I would have loved to have seen successors. Ultimately, I ended up switching to Android because I was tired of things only available as apps. Since then, web technologies have gotten better, and it's become much more feasible to use almost exclusively websites.
> it's become much more feasible to use almost exclusively websites.
And that's precisely why companies nerf their web sites and put a little popup that says "<service> works better on the app".
They should have partnered not only with Intel, but with Palm, RIM or whatever other then-giant to rival Android. Those two went their own ways with WebOS and buying QNX, so maybe they could have agreed to form a consortium for an open and interoperable mobile OS
Boot2Gecko, or whatever the browser-as-operating-system was called. This was a project that should have focused on providing whatever its current users needed, expanding and evolving to do whatever those users wanted it to do better.
Instead it went chasing markets, abandoning existing users as it did so, in favour of potential larger pools of users elsewhere. In the end it failed to find a niche going forward while leaving a trail of abandoned niches behind it.
I adored my Firefox Phones. Writing apps was so easy I built myself dozens of little one-offs. Imagine if it had survived to today, its trivial html/css/js apps could be vibe coded on-device and be the ultimate personalized phone.
Luckily it wasn't long after Mozilla abandoned it that PWAs were introduced and I could port the apps I cared about.
It lives on as KaiOS. Has limited success as a low end phone platform now.
For a few short months circa 2016 or 2017, KaiOS was the number one mobile OS in India. This was probably because of all the ultra-cheap KaiOS-powered Reliance Jio phones flooding the Indian market at the time.
I noticed the trend when I was working on a major web property for the Aditya Birla conglomerate. My whole team was pleasantly surprised, and we made sure to test everything in Firefox for that project. But everyone switched to Android + Chrome over the next few years, which was a shame.
Today, India is 90% Chrome :(
Everpix: Looked like good execution, but they were probably ahead of their time.
Also this: https://news.ycombinator.com/item?id=6676494
Redmart (Singapore): Best web based online store to this date (obviously personal view). No one even tries now that mobile apps have won.
https://techcrunch.com/2016/11/01/alibaba-lazada-redmart-con...
Gentoo file manager.
(Not the Linux distribution with the same name)
I have used it for years.
A two-pane manager, it makes defining file associations, applications invoked by extensions, and shortcut buttons easy and convenient.
Sadly it is abandonware now.
Slowly migrating to Double Commander now...
Secure Scuttlebutt (the gossiped social network) died circa 2019 or 2024, depending on who you ask. It died before its time for various reasons, including:
1. competing visions for how the entire system should work
2. dependence on early/experimental npm libraries
3. devs breaking existing features due to "innovation"
4. a lot of interpersonal drama because it was not just open source but also a social network
the ideas are really good, someone should make the project again and run with it
So much drama there too, but it's designed to attract dramas.
I always thought Microsoft Popfly had huge potential and was way ahead of its time. It made building web mashups feel like playing with Lego blocks, drag, drop, connect APIs, and instantly see the result.
If something like that existed today, powered by modern APIs and AI, it could become the ultimate no-code creativity playground.
Macromedia Flash. Its scope and security profile were too big. It gave way to HTML's canvas. But man, the tooling is still nowhere near as good. MovieClips, my beloved. I loved it all.
The iPhone killed Flash, probably because it would've been a way to create apps for it, more probably because it would've been laggy in the 2007 hardware, and people would've considered the iPhone "a piece of junk".
Interesting how Flash became the almost universal way to play videos in the browser, in the latter half of the 2000's (damn I'm old...).
It's incredible to me that they killed the whole tool instead of making a JS/Canvas port. Even without "full flash websites", there's still need for vectorial animations on the web.
There was the discontinued Adobe Edge suite, which was what you described.
https://en.wikipedia.org/wiki/Adobe_Edge
Adobe Animate (new name for Macromedia/Adobe Flash) can output to JS/Canvas now.
I agree that the tooling was unbelievable…better for interactive web than anything that exists today AFAIK.
I wonder why no one has managed to build something comparable that does work on a phone.
I agree the tooling was great, .... for making apps/games for desktops with a mouse and keyboard and a landscape screen of at least a certain size.
Maybe they could have fixed all that for touch screens, small portrait screens, and more but they never did make it responsive AFAIK.
Adobe Animate is still just Flash from a tool standpoint.
Are you referring to the SWF file format?
I took it as sarcasm.
As a Linux user, I hated Flash with a passion. It mostly didn't work despite several Linux implementations. About the time they sorted all the bugs out, it went away. Good riddance.
I for one am so glad Flash died. At one point I dreaded navigating to a new website because of it.
Was recently reading about Project Ara, the modular smartphone project by Google/Motorola [1]. Would have liked to see a few more iterations of the idea. Something more customizable than what we have today without having to take the phone apart.
[1]: https://en.wikipedia.org/wiki/Project_Ara
It might be too soon to call it abandoned, but I was very intrigued by the Austral [1] language. The spec [2] is worth reading, it has an unusual clarity of thought and originality, and I was hoping that it would find some traction. Unfortunately it seems that the author is no longer actively working on it.
[1] https://austral-lang.org/ [2] https://austral-lang.org/spec/spec.html
I played with Austral about a year ago and really wanted to use it for my projects, but as a hobbyist and mostly inept programmer it lacked the community and ecosystem I require. I found it almost intuitive and the spec does an amazing job of explaining the language. Would love to see it get a foothold.
Same with Vale: https://vale.dev
ouch, last “recent update” in 2023. Any idea what happened?
The author got hired by Modular, the AI startup founded by the creators of LLVM and Swift, and is now working on the new language Mojo. He’s been bringing a bunch of ideas from Vale to Mojo
Oh nice! I just had an excuse to try mojo via max inference, it was pretty impressive. Basically on par with vllm for some small benchmarks, bit of variance in ttft and tpot. Very cool!
HP TouchPad
Just on principle, I'd have liked to see it on the market for more than 49 days! It pains me as an engineer to think of the effort to bring a hardware device to market for such a minuscule run.
Developer Ryan Flaherty's "Via" project, a novel approach to streaming large games in real time.
https://www.youtube.com/watch?v=e5wAn-4e5hQ
https://www.youtube.com/watch?v=QWsNFVvblLw
Summary:
>This presentation introduces Via, a virtual file system designed to address the challenges of large game downloads and storage. Unlike cloud gaming, which suffers from poor image quality, input latency, and high hosting costs, Via allows games to run locally while only downloading game data on demand. The setup process is demonstrated with Halo Infinite, showing a simple installation that involves signing into Steam and allocating storage space for Via's cache.
>Via creates a virtual Steam library, presenting all owned games as installed, even though their data is not fully downloaded. When a game is launched, Via's virtual file system intercepts requests and downloads only the necessary game content as it's needed. This on-demand downloading is integrated with the game's existing streaming capabilities, leveraging features like level-of-detail and asset streaming. Performance metrics are displayed, showing download rates, server ping, and disk commit rates, illustrating how Via fetches data in real-time.
>The system prioritizes caching frequently accessed data. After an initial download, subsequent play sessions benefit from the on-disk cache, significantly reducing or eliminating the need for network downloads. This means the actual size of a game becomes less relevant, as only a portion of it needs to be stored locally. While server locations are currently limited, the goal is to establish a global network to ensure low ping. The presentation concludes by highlighting Via's frictionless user experience, aiming for a setup so seamless that users are unaware of its presence. Via is currently in early access and free to use, with hopes of future distribution partnerships.
I'm amazed the video still has under 4,000 views. Sadly, Flaherty got hired by XAI and gave up promoting the project.
https://x.com/rflaherty71/status/1818668595779412141
But I could see the technology behind it working wonders for Steam, Game Pass, etc.
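Stripped of the virtual-file-system plumbing, what the summary describes is a read-through block cache. A toy sketch, with a hypothetical `fetch_block` standing in for the remote download:

```python
# Stripped-down sketch of the read-through idea behind a system like
# Via: reads are served from a local cache when possible, and fetched
# on demand otherwise. The real thing intercepts file-system reads;
# here the "remote" fetcher is a made-up stand-in.
BLOCK = 1 << 20   # 1 MiB blocks

class ReadThroughCache:
    def __init__(self, fetch_block):
        self.fetch_block = fetch_block   # downloads one block by index
        self.blocks = {}                 # block index -> bytes

    def read(self, offset, size):
        out = bytearray()
        for idx in range(offset // BLOCK, (offset + size - 1) // BLOCK + 1):
            if idx not in self.blocks:                    # cache miss:
                self.blocks[idx] = self.fetch_block(idx)  # fetch on demand
            out += self.blocks[idx]
        lo = offset % BLOCK
        return bytes(out[lo:lo + size])

# Usage with a fake "remote" that just manufactures bytes:
cache = ReadThroughCache(lambda idx: bytes([idx % 256]) * BLOCK)
cache.read(5 * BLOCK + 10, 4096)   # first read downloads; repeats are local
```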
Wait until you hear that almost all Unity games don't really have asset streaming because the engine loads things eagerly by default.
I don't see how this could take off. Internet speeds are getting quicker, disk space is getting cheaper, and this will slow down load times. And what's worse is that the more you need this tech, the worse the experience you have.
Visual Basic 6 - arguably the most accessible way of creating GUI apps.
you still have Lazarus, "a Delphi compatible cross-platform IDE for Rapid Application Development."
Adobe Fireworks - easiest vector / photo editor crossover app there ever was.
It's a real shame its raster functionality wasn't integrated into Illustrator. Adobe really butchered the whole Macromedia portfolio, didn't they?
(For those unfamiliar, Illustrator is a pure vector graphics editor; once you rasterize its shapes, they become uneditable fixed bitmaps. Fireworks was a vector graphics editor that rendered at a constant DPI, so it basically let you edit raster bitmaps like they were vectors. It was invaluable for pixel-perfect graphic design. Nothing since lets you do that, though with high-DPI screens and resolution-independent UIs being the norm these days, this functionality is less relevant than it used to be.)
Did not expect to see FW mentioned here. Absolutely loved it.
Just barely stopped using my CS6 copy. Still haven't found anything as intuitive.
Gah. Fireworks and Dreamweaver were my "web designer" jumpstart. Ps and Ai had nothing on Fireworks
Google Reader. We could have had a great society, man.
The loss of Google Reader really does feel like the beginning of the end in retrospect.
VPRI, I was really hoping it would profoundly revolutionise desktop application development and maybe even lead to a new desktop model, and instead they wound up the project without having achieved the kind of impact I was dreaming of.
Windows Longhorn. It looked cool and had some promising features that never made it into Vista, like WinFS.
https://wiki.mozilla.org/Labs/Ubiquity
https://en.wikipedia.org/wiki/IGoogle
https://en.wikipedia.org/wiki/Google_Desktop
and why? = UI/UX
I feel like Zen (Firefox based) captures a few good things from Ubiquity. It could do more though. Zen + Kagi gets even more with the bang commands.
The IBM Schools Computer. Developed by IBM Hursley in 1967, it was years ahead in its design, with display out to a television and storage on normal audio tape. It would have kick-started an educational revolution if it had been launched beyond the 10 prototype machines.
Died due to legal wranglings about patents, iirc.
More here: https://news.ycombinator.com/item?id=45061680
The Opa language, 2012; it was a typed Next.js before its time.
http://opalang.org/
I think the market was still skeptical about nodejs on the server at the time but other than that I don’t really know why it didn’t take off
Launching under AGPL was the kiss of death. They eventually went MIT, but the developers it steered away probably never gave it a second chance.
I came to say Opa too. I liked the language but the meteor-like framework it was bundled with, while nice for prototyping, was a pain to work around when it didn't do what you needed.
That said, frameworks were all the buzz back in the day, so the language alone probably wouldn't have gone anywhere without it.
Nokia Maps. There was a brief period in the early 2010s where Nokia had the best mapping product on the planet, and it was given away for free on Lumia phones at a time when TomTom and Garmin were still charging $60+ for navigation apps.
Still around as "Here Maps"
Started to suck pretty badly not long after getting acquired by German car companies. It used to be good.
choojs
All of the upside and none of the downside of react
No JSX and no compiler, all native js
The main dev is paid by microsoft to do oss rust nowadays
I use choo for my personal projects and have used it twice professionally
https://github.com/choojs/choo#example
The example is like 25 lines and introduces all the concepts
Fewer moving parts than Svelte
You can get the same thing with lit-html and any of the add on libraries that flesh it out.
For example, Haunted is a react hooks implementation for lit: https://github.com/matthewp/haunted
Choo suffered from not having an ecosystem; same with Mithril and other "like React but not" also-rans.
Microsoft Courier.
Dual screen iPad killer, productivity optimised. IIRC Microsoft OneNote is its only legacy.
Killed because both the Windows team and the Office team thought it was stepping on their toes.
wua.la … the original version. You share part of your storage to get the same amount back as resilient cloud storage from others. Was bought and killed by LaCie (now Seagate). They later provided paid-for cloud storage under the same name but it didn’t take off.
https://en.wikipedia.org/wiki/Wuala
OpenSocial: https://en.wikipedia.org/wiki/OpenSocial
Ceylon, JVM language, developed by Red Hat, now abandoned at Eclipse. Lost the race with Kotlin but proposed more than just syntax sugar over Java. Anonymous union types, comprehensions, proper module system...
The Amiga. Just... the Amiga.
In the late 90s there was a website called fuckedcompany which was a place where people could spill the beans about startups (mainly in silicon valley). It was anonymous and a pretty good view into the real state of tech. Now there is twitter/x but it's not as focused on this niche.
The closest sites I've found are Web3 is Going Just Great and Pivot to AI, which are newsfeeds of various car crashes in their respective hype arenas, although without any insider scoops/gossip.
creator now makes wild, bespoke headphones https://www.reddit.com/user/pudjam667/submitted/
This is hilarious, thanks for sharing.
fuckedcompany was awesome but very much a product of the early stages of the .com bubble poppage
I kind of expect we might see something similar if the AI bubble pops
I wonder who owns the domain now
Adobe Flex with Adobe Catalyst. Design a GUI in Photoshop, export it to Flex/Flash to add interactivity.
Looked cool during demos. Got killed when Flash died.
I thought Google Wave was going to kill email and chat and a whole bunch of other stuff.
FireChat.
https://en.wikipedia.org/wiki/FireChat
Keybase <3
Memex, it was a solution to the biggest problem facing the scientific community just after WW2 and it still hasn't been implemented, 80 years later!
SMIL. Nothing comparable for seamless media stream composition, 20 years later.
X.400. We're approaching it by stepwise refinement. It had X.500, which lives on as X.509 certificates and LDAP.
ISO/OSI had a session layer, i.e. much of what QUIC does regarding underlying multiple transports.
Speaking of X.509, the S-expression certificate format was more interesting in many ways.
OSI's session layer did very little more than TCP/UDP port numbers; in the OSI model you would open a connection to a machine, then use that connection to open a session to a particular application.
X.400 was a nice idea, but the ideal of having a single global directory predates security. I can understand why it never happened
On X.509, the spec spends two chapters on attribute certificates, which I've never seen used in the wild. It's a shame; identity certificates do a terrible job at authentication
Yahoo Pipes
Anyone remember Openmoko, the first commercialised open-source smartphone? It was heaps buggy, not really polished, etc. Its only redeeming feature was the open-source software and hardware (specs?).
I could think of many examples, but I'll talk about the top four that I have in mind, that I'd like to see re-evaluated for today's times.
1. When Windows Vista was being developed, there were plans to replace the file system with a database, allowing users to organize and search for files using database queries. This was known as WinFS (https://en.wikipedia.org/wiki/WinFS). I was looking forward to this in the mid-2000s. Unfortunately Vista was famously delayed, and in an attempt to get Vista released, Microsoft pared back features, and one of those features was WinFS. Instead of WinFS, we ended up getting improved file search capabilities. It's unfortunate that there have been no proposals for database file systems for desktop operating systems since. (See the sketch after this list for the flavor of what we missed.)
2. OpenDoc (https://en.wikipedia.org/wiki/OpenDoc) was an Apple technology from the mid-1990s that promoted component-based software. Instead of large, monolithic applications such as Microsoft Excel and Adobe Photoshop, functionality would be offered in the form of components, and users and developers can combine these components to form larger solutions. For example, as an alternative to Adobe Photoshop, there would be a component for the drawing canvas, and there would be separate components for each editing feature. Components can be bought and sold on an open marketplace. It reminds me of Unix pipes, but for GUIs. There's a nice promotional video at https://www.youtube.com/watch?v=oFJdjk2rq4E.
OpenDoc was a radically different paradigm for software development and distribution, and I think it could have been an interesting contender against the dominance that Microsoft and Adobe enjoy in their markets. OpenDoc actually did ship, and there were some products made using OpenDoc, most notably Apple's Cyberdog browser (https://en.wikipedia.org/wiki/Cyberdog).
Unfortunately, Apple was in dire straits in the mid-1990s. Windows 95 was a formidable challenger to Mac OS, and cheaper x86 PCs were viable alternatives to Macintosh hardware. Apple was an acquisition target; IBM and Apple almost merged, and there was also an attempt to merge Apple with Sun. Additionally, the Macintosh platform depended on the availability of software products like Microsoft Office and Adobe Photoshop, the very types of products that OpenDoc directly challenged. When Apple purchased NeXT in December 1996, Steve Jobs returned to Apple, and all work on OpenDoc ended not too long afterward, leading to this now-famous exchange during WWDC 1997 between Steve Jobs and an upset developer (https://www.youtube.com/watch?v=oeqPrUmVz-o).
I don't believe that OpenDoc fits in with Apple's business strategy, even today, and while Microsoft offers component-based technologies that are similar to OpenDoc (OLE, COM, DCOM, ActiveX, .NET), the Windows ecosystem is still dominated by monolithic applications.
I think it would have been cool had the FOSS community pursued component-based software. It would have been really cool to apt-get components from remote repositories and link them together, either using GUI tools, command-line tools, or programmatically to build custom solutions. Instead, we ended up with large, monolithic applications like LibreOffice, Firefox, GIMP, Inkscape, Scribus, etc.
3. I am particularly intrigued by Symbolics Genera (https://en.wikipedia.org/wiki/Genera_(operating_system)), an operating system designed for Symbolics Lisp machines (https://en.wikipedia.org/wiki/Symbolics). In Genera, everything is a Lisp object. The interface is an interesting hybrid of early GUIs and the command line. To me, Genera could have been a very interesting substrate for building component-based software; in fact, it would have been far easier building OpenDoc on top of Common Lisp than on top of C or C++. Sadly, Symbolics' fortunes soured after the AI winter of the late 1980s/early 1990s, and while Genera was ported to other platforms such as the DEC Alpha and later the x86-64 via the creation of a Lisp machine emulator, it's extremely difficult for people to obtain a legal copy, and it was never made open source. The closest things to Genera we have are Xerox Interlisp, a competing operating system that was recently made open source, and open-source descendants of Smalltalk-80: Squeak, Pharo, and Cuis-Smalltalk.
4. Apple's "interregnum" years between 1985 and 1996 were filled with many intriguing projects that were either never commercialized, were cancelled before release, or did not make a splash in the marketplace. One of the most interesting projects during the era was Bauhaus, a Lisp operating system developed for the Newton platform. Mikel Evins, a regular poster here, describes it here (https://mikelevins.github.io/posts/2021-07-12-reimagining-ba...). It would have been really cool to have a mass-market Lisp operating system, especially if it had the same support for ubiquitous dynamic objects like Symbolic Genera.
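To make item 1 concrete: a minimal sketch of the "query your files like a database" experience, built from SQLite over ordinary filesystem metadata. This illustrates the flavor only; WinFS went much further (typed items, relationships), and the table layout and root path here are assumptions.

```python
# Sketch of the WinFS idea at its smallest: index file metadata into a
# real database and find files with queries instead of crawling paths.
import os, sqlite3, time

ROOT = os.path.expanduser("~")   # assumption: index your home directory
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT, name TEXT, size INTEGER, mtime REAL)")

for root, _dirs, names in os.walk(ROOT):
    for n in names:
        p = os.path.join(root, n)
        try:
            st = os.stat(p)
        except OSError:
            continue                 # broken symlinks, permission errors
        db.execute("INSERT INTO files VALUES (?, ?, ?, ?)",
                   (p, n, st.st_size, st.st_mtime))

# "Every PDF over 10 MB touched this week" as a query, not a crawl:
week_ago = time.time() - 7 * 86400
for path, size in db.execute(
        "SELECT path, size FROM files "
        "WHERE name LIKE '%.pdf' AND size > 10*1024*1024 AND mtime > ?",
        (week_ago,)):
    print(path, size)
```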
OpenDoc was mostly given to Taligent (the Apple and IBM joint venture) to develop. It was full-on OO: about 35 files for a minimal application, which meant that Erich Gamma had to build a whole new type of IDE which was unusable. He likely learned his lesson: it's pretty hard to define interfaces between unknown components without forcing each one to know about all the others.
MIME types for mail addressed much of the demand for pluggable data types.
For me, DESQview. Microsoft tried to buy it in order to use its tech in their Windows system. I wonder how things would be today if they had been able to purchase it. But DESQview said "no".
Instead it went into a slow death spiral due to Windows 95.
Love seeing this one. My uncle was co-founder of Quarterdeck, and I grew up in a world of DESQview and QEMM. It was a big influence on me as a child.
Got a good family story about that whole acquisition attempt, but I don't want to speak publicly on behalf of my uncle. I know we've talked at length about the what-ifs of that moment.
I do have a scattering of some neat Quarterdeck memorabilia I can share, though:
https://www.dropbox.com/scl/fo/0ca1omn2kwda9op5go34e/ACpO6bz...
DESQview/X sucked the wind out of DESQview's sails. It was, on paper, a massive upgrade. I had been running DESQview for years, with a dial-up BBS in the background.
But you couldn't actually buy /X. I tried to buy a copy, and my publisher even contacted DESQ's marketing people to get one for me, but they wouldn't turn one over. Supposedly some copies were actually sold, but too few, too late, and then /X was dropped. There was at least one more release of plain DESQview after that, but by then Windows was eating its lunch.
WebOS.
A JavaScript/HTML-based smartphone app interface.
I had a Palm Pre and really enjoyed it; shame it didn't make it.
Windows Phone
Windows Phone's UI is still with us, from Windows 8 onwards. Everything on 8, 10, and 11 is optimized for a touch interface on a small screen, which is ridiculous on a modern desktop with a 32" or so monitor and a trackball or mouse.
ello.co - what a fun and pretty social media website that was.
OS/2 my beloved.
I was super excited for BeOS myself.
Binder IPC, which traces its lineage back to BeOS, continues in Android.
I'm booting and running Haiku on my Thinkpad. It's a from-scratch workalike of BeOS, and able to run Be software. Though, frankly, Be software is totally 1990s, so a lot of Linux software written for Qt has been ported to Haiku.
In the end I wound up with basically the same application software as on my Debian desktop, except running on Haiku instead of Linux. Haiku is noticeably snappier and more responsive than Linux+X+Qt+KDE, though.
Did an install of OS/2 3.0 recently, and it was just as wonderful as the first time I used it. That team got so much so right.
OS/2 ISV Stardock gave us the Win8 start button.
In late September or early October 1996, Fry's Electronics placed a full-page promo ad on the back of the business section of the San Jose Mercury News for OS/2 4.0, printed as "WRAP" [sic] in 256 pt type in multiple places. Oops!
Nah, that time has passed and there's not much to miss from the base OS. What would be interesting is for IBM to publish the source to the Workplace Shell and the underlying SOM code so it might get a new life running on one of the free *nixes.
It ran lots of banking ATMs that were not hacked.
Lotus Agenda, Ecco Pro, and Chandler. AI-like personal organization, 1980s style.
XMMS
The Fortress language. It suffered from being too Haskell-like, with too many non-orthogonal features. Rust and Go applied lessons from it, perhaps indirectly.
Their operator precedence system was one of my favourite pieces of language design. The tl;dr: you could group operators into precedence sets, and an expression involving operators that all came from the same set would have that set's precedence rules applied, but if an expression mixed operators from different sets, you had to add parentheses. Crucially, they also supported operator overloading, and the same operator could appear in a different set as long as everything could be parsed unambiguously. (Caveat: I never used the language, I just read about the operator design in the docs, and it was very eye-opening, in the sense that every other language's operator precedence system suddenly felt crude and haphazard.)
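To make the idea concrete, here is a minimal sketch in Python of the precedence-set rule described above. The set names and members are hypothetical stand-ins, not Fortress's actual tables; the point is only the rule that operators from one set combine freely, while mixing sets without parentheses is a static error.

```python
# Hypothetical precedence sets (Fortress's real tables were larger
# and defined in its specification).
OPERATOR_SETS = {
    "arithmetic": {"+", "-", "*", "/"},
    "set_ops": {"∪", "∩"},
    "comparison": {"<", ">", "<=", ">="},
}

def check_expression(operators: list[str]) -> str:
    """Return the name of the set containing all operators, else reject."""
    matching = [
        name for name, ops in OPERATOR_SETS.items()
        if all(op in ops for op in operators)
    ]
    if not matching:
        raise SyntaxError(
            f"operators {operators} mix precedence sets; "
            "add parentheses to disambiguate"
        )
    return matching[0]

# `a + b * c` is fine: both operators come from the arithmetic set.
print(check_expression(["+", "*"]))   # -> "arithmetic"

# `a + b ∪ c` is rejected: the writer must parenthesize.
try:
    check_expression(["+", "∪"])
except SyntaxError as e:
    print(e)
```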
Fortress had great ideas, but I'd say the closest thing to it in the real world now might be Julia.
Nokia smartphone line killed by Microshaft.
Humane AI Pin. I think they launched two years too early and were too greedy with the device pricing and subscription. Also, if they had focused on being an accessory for Android/iPhone, they could have reduced power usage and cost as well.
Their execution was of course bad, but today's LLMs are better and faster, and there are far more open-source models available to reduce costs. The hardware looked nice, though, and the pico projector was an interesting concept, even if not the best executed.
Wine predates ReactOS. It was basically a FOSS duplicate of Sun's WABI.
I wrote a bunch of software in Borland Delphi, which ran in Windows, Wine, and ReactOS with no problems. Well, except for ReactOS' lack of printing support.
As long as you stay within the ECMA-standardized or otherwise published Windows APIs, everything runs fine in Wine and ReactOS. But Microsoft products are full of undocumented functions, as well as checks to see whether they're running on real Windows. That goes back to the Windows 3.1 days, when 3.1 developers regularly used OS/2 instead of DOS, and Microsoft started adding patches to fail under OS/2 and DR-DOS. All of that has to be accounted for by Wine and ReactOS. A lot of third-party software uses undocumented functions as well, especially stuff written back when computer magazines were a thing and regularly published that kind of information. Many programmers found the lure of undocumented calls irresistible, and those calls wound up in all kinds of commercial applications where they really shouldn't have been.
In my experience anything that will load under Wine will run with no problems. ReactOS has some stability problems, but then the developers specifically call it "alpha" software. Despite that, I've put customers on ReactOS systems after verifying all their software ran on it. It gets them off the Microsoft upgrade treadmill. Sometimes there are compatibility problems and I fall back to Wine on Linux. Occasionally nothing will do but real Windows.
Hard disagree. The Humane AI Pin ad was a classic silicon valley ad that screamed B2VC and demonstrated nothing actually useful that couldn't be done with an all-in-one phone app (or even the ChatGPT app) and bluetooth earbuds that you already have.
Which reduces its innovation level to nothing more than a chest-mounted camera.
You want real B2C products that people would actually buy? Look at the Super Bowl ads instead. Then watch the Humane ad again. It's laughable.
Betamax. Because I bought a player and it gave better quality video.
Google Inbox
Meteor
It's alive and well!
Riak
Google Glass. Thanks society.
People always fail to see what is inevitable. Humans lack foresight because they don't like change.
At least with a smartphone it’s pretty clear when someone is filming you. Google Glass was too much of an enabler for creeps.
Nah, Glass was impressive for such a big org as Google, but smartphones are popular because people use them like portable televisions. Glanceable info and walking directions are more like an Apple Watch-sized market, without the fashion element. Meta is about to find out.
Google Wear is pretty much Google Glass on your wrist, so you don't burn out your eyes looking up and to the side.
Wild that people would downvote your low-stakes personal opinion, given as a direct answer to OP's question. I am 100% with you.
Google Glass was so much before its time, it might be reinvented a few more times and abandoned again before finally becoming a success.
yea, crazy, I upvoted just now.
google glass sucks though and glasses will never be a thing. google and meta and … can spend $8T and come up with the most insane tech etc but no one will be wearing f’ing glasses :)
USA
Apple’s scanning system for CSAM. The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
It was an extremely interesting effort where you could tell a huge amount of thought and effort went into making it as privacy-preserving as possible. I’m not convinced it’s a great idea, but it was a substantial improvement over what is in widespread use today and I wanted there to be a reasonable debate on it instead of knee-jerk outrage. But congrats, I guess. All the cloud hosting systems scan what they want anyway, and the one that was actually designed with privacy in mind got screamed out of existence by people who didn’t care to learn the first thing about it.
Good riddance to a system that would have provided precedent for client-side scanning for arbitrary other things, as well as likely false positives.
> I wanted there to be a reasonable debate on it
I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
We need to just keep making it clear the answer is "no", and hopefully strengthen that to "no, and perhaps the massive smoking crater that used to be your political career will serve as a warning to the next person who tries".
This. No matter how cool the engineering might have been, from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for… Apple was very much creating the Torment Nexus from “Don’t Create the Torment Nexus.”
> from the perspective of what surveillance policies it would have (and very possibly did) inspire/set precedent for…
I can’t think of a single thing that’s come along since that is even remotely similar. What are you thinking of?
I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
The problem isn’t the system as implemented; the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”
Once that idea appears, it allows every lobbyist and insider to say “mandate this, we’ll do something like what Apple did but for other types of Bad People” and all of a sudden you have regulations that force messaging systems to make this possible in the name of Freedom.
Remember: if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image. There are many in politics for whom that level of control is the actual goal.
> The problem isn’t the system as implemented
Great!
> the problem is the very assertion “it is possible to preserve the privacy your constituents want, while running code at scale that can detect Bad Things in every message.”
Apple never made that assertion, and the system they designed is incapable of doing that.
> if a model can detect CSAM at scale, it can also detect anyone who possesses any politically sensitive image.
Apple’s system cannot do that. If you change parts of it, sure. But the system they proposed cannot.
To reiterate what I said earlier:
> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
So far, you are saying that you don’t have a problem with the system Apple designed, and you do have a problem with some other design that Apple didn’t propose, that is significantly different in multiple ways.
Also, what do you mean by “model”? When I used the word “model” it was in the context of using another system as a model. You seem to be using it in the AI sense. You know that’s not how it worked, right?
> I can’t think of a single thing that’s come along since that is even remotely similar. What are you thinking of?
Chat Control, and other proposals that advocate backdooring individual client systems.
Clients should serve the user.
> Chat Control, and other proposals that advocate backdooring individual client systems.
Chat Control is older than Apple’s CSAM scanning and is very different from it.
> Clients should serve the user.
Apple’s system only scanned things that were uploaded to iCloud.
You missed the most important part of my comment:
> I think it’s actually a horrible system to implement if you want to spy on people. That’s the point of it! If you wanted to spy on people, there are already loads of systems that exist which don’t intentionally make it difficult to do so. Why would you not use one of those models instead? Why would you take inspiration from this one in particular?
I don't think you can accurately describe it as client-side scanning, and false positives were not likely. Depending upon how you view it, false positives were either extremely unlikely or 100% guaranteed for practically everybody: devices deliberately emitted synthetic "matches" so the server couldn't even count the real ones below the reporting threshold. And if you think the latter part is a problem, please read up on it!
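For readers who never dug into the design, here is a toy sketch in Python of that counting dynamic. This is only an illustration of the threshold-plus-synthetic-vouchers idea, not Apple's actual private set intersection protocol; the threshold and synthetic rate are assumed values for demonstration.

```python
import random

THRESHOLD = 30         # assumed: matches required before anything is readable
SYNTHETIC_RATE = 0.02  # hypothetical probability of a dummy "match" voucher

def make_voucher(is_true_match: bool) -> dict:
    """To the server, a synthetic voucher looks identical to a real one."""
    synthetic = (not is_true_match) and random.random() < SYNTHETIC_RATE
    return {
        "appears_matching": is_true_match or synthetic,
        # Only genuine matches carry usable secret shares, so synthetics
        # can never push an account over the decryption threshold.
        "has_valid_share": is_true_match,
    }

def server_view(vouchers: list[dict]) -> str:
    apparent = sum(v["appears_matching"] for v in vouchers)
    usable = sum(v["has_valid_share"] for v in vouchers)
    if usable < THRESHOLD:
        return (f"{apparent} apparent matches, fewer than {THRESHOLD} "
                "valid shares: nothing decryptable, true count unknown")
    return "threshold of valid shares reached: flagged content becomes readable"

# An innocent library of 2,000 photos: "false positives" guaranteed in the
# harmless sense (synthetic vouchers), none in the harmful sense.
library = [make_voucher(False) for _ in range(2000)]
print(server_view(library))
```

The point being made above: "false positive" in the statistical sense was engineered to be ubiquitous and harmless, while falsely flagging an account required crossing the threshold with valid shares.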
> I'm reminded of a recent hit-piece about Chat Control, in which one of the proponent politicians was quoted as complaining about not having a debate. They didn't actually want a debate, they wanted to not get backlash. They would never have changed their minds, so there's no grounds for a debate.
Right, well I wanted a debate. And Apple changed their minds. So how is it reminding you of that? Neither of those things apply here.
Forgot about the concept of bugs have we? How about making Apple vulnerable to demands from every government where they do business?
No thanks. I'll take a hammer to any device in my vicinity that implements police scanning.
> Forgot about the concept of bugs have we?
No, but I have a hard time imagining a bug that would meaningfully compromise this kind of system. Can you give an example?
> How about making Apple vulnerable to demands from every government where they do business?
They already are. So are Google, Meta, Microsoft, and all the other giants we all use. And all those other companies are already scanning your stuff. Meta made two million reports in 2024Q4 alone.
> The vast majority of the debate was dominated by how people imagined it worked, which was very different to how it actually worked.
But not very different to how it was actually going to work, as you say:
> If you change parts of it, sure.
Now try to reason your way out of the obvious "parts of it will definitely change" knee-jerk.
I’m not sure I’m understanding you.
Apple designed a system. People guessed at what it did. Their guesses were way off the mark. This poisoned all rational discussion on the topic. If you imagine a system that works differently to Apple’s system, you can complain about that imaginary system all you want, but it won’t be meaningful, it’s just noise.
You understand it just fine; you're just trying to pass off your fantasy of an immutable, safe future as rational, while painting the obvious real-world objections as meaningless noise.
There is no place for spyware of any kind on my phone. Saying that it is to "protect the children" and "to catch terrorists" does not make it any more acceptable.