OpenXanadu (xanadu.com)
107 points by warent on July 24, 2022 | 79 comments


I remember reading about the never-ending story of Xanadu in a 1992 issue of Dr. Dobb's Journal, which my local newsagent had to order specially for me... no web in those days. Imagine: there will be developers reading this who weren't even born when it was already vapourware!


The Web existed in 1992 (invented 1989) just not for you. Gopher also existed by then (1991) and you maybe see the problem: The real world gave people something usable long before Xanadu could even begin to get off the ground. The same thing happened with the Internet versus OSI: The Internet protocol suite rendered the OSI protocols mostly irrelevant to the point people now insist OSI is just a model and the protocols are never mentioned. Insisting that we all think in terms of a seven-layer model while using SMTP instead of X.400 for email is some delicate mix of funny and infuriating otherwise known only to the parents of small children.


Maybe it's just me, but it feels like Xanadu was always vaporware for a very important reason: It never had a specification concrete enough to implement because its chief architect always came off like a schizophrenic. I can't be alone in thinking that. No one who's read any of the numerous papers and books surrounding this fiasco can walk away thinking its author is sane. There's just something about that "stream of consciousness" style of all Ted Nelson's papers that points to there being something fundamentally wrong with him.

I say all this not to be a dick, mind you, but to point out that when something has been vaporware for decades, maybe it's time to ignore the guy who's at the heart of it and, if there's anything useful in it, move on without him. The world did that as far as hypertext goes. Now if only the world could do that with GNU.


There's a fine line between schizophrenia and creative genius, and Nelson walks that line enormously, all things considered. Project Xanadu aside, Computer Lib and his work on hypertext systems in the 1960s are still incredibly ahead of their time. Nelson is a pioneer on the same level as McCarthy, Kay, or Minsky, but he didn't come up the way that they did. He's firstly a filmmaker; I think a lot of people miss this.

It's worth noting that Brenda Laurel, Alan Kay, Nelson, and Seymour Papert had "art" backgrounds. The future of computing will effectively be led by people with these cross-sectional skills. There's a reason programmers are shitty at designing software.


Everyone does the color-rays thing first, because that's the easy part. At the heart of his vision are several non-trivial, unsolved problems. Most of these are still non-trivial & unsolved. Instead of clearly identifying & expanding upon these problems, one-by-one, & posting a bounty, he blames his programmers. The man does not like programmers.

It is ridiculous. Going by the "Computer Lib" book, and when it was published, it's obvious that Ted knows enough about how computers work, and has had more than enough time to write the thing all by himself--if his only problems were programming ones.

In a way, I am glad he failed. A Google-scale Google is bad enough. Adding a Google-scale Elsevier into the mix is not a plus.


In all fairness, there's a reason why Ted had a grudge against programmers, and he spelled it out in Computer Lib/Dream Machines. When he grew up, programmers were still generally the High Priests of the Machine, dressed in their business suits and working for the Man.

Also, Xanadu was always intended to be a federated network of individually owned backend servers, not a centralised network. So neither the Google nor the Elsevier model. The idea behind including micropayments was to attempt to encourage existing rightsholders to participate in the pre-granted permission transcopyright ecosystem so that people can easily make remixes using the majority of our culture that is currently locked up, not just the portion of it available under permissive licences.


They're still high priests of the machine, but now dressed in free t-shirts & warby parkers. I also dislike programmers, but they are the least of his problems.

Going by his own writings, his zealous consistency on trademark/copyright/credit, and the limits of the technology at the time (until blockchain)... any micro-payments system coming out of that was destined to be more Lexis-Nexis than anything resembling decentralization, regardless of whatever Ted's intentions were.

And most of that "locked up" culture is garbage in the first place--the 100th starwars novel, the 25th tom clancy book, the latest nutrition or psychology paper that has been p-hacked to hell and back for the benefit of some commercial interest. Money doesn't attract creativity. Money attracts whores. So in addition to "SEO Experts" optimizing PageRank, we'd have Xanadu Experts gaming that system as well, and it would be just as full of garbage as Google is. Money would surely change hands, but to the pewds & mr beasts of XanaduSpace.


> Now if only the world could do that with GNU.

What do you mean? GNU is not vaporware, many people use it on top of Linux. Do you mean GNU/Hurd?


[flagged]


I'm not clear: Linux and FreeBSD are GNU products? I didn't know! I scoured www.gnu.org, and couldn't find either there. (GNU/Linux refers to a GNU userland coupled with Linux, you know.)

I've been using Emacs happily for going on 40 years. Thanks for alerting me that Emacs is Generally Not Usable. I didn't know.


His statement is too out there to be credible. It is immediately and trivially falsifiable but if you discuss it with him he'll likely continue to argue about it in perpetuity.

It happens on the Internet. Don't feed the trolls; not everything merits a response.


> I'm not clear: Linux and FreeBSD are GNU products?

Nope! Despite people trying to claim otherwise, as I'm sure you couldn't possibly be trying to do


You know precisely that Linux and FreeBSD both depend upon GNU utilities. ("GNU/Linux refers to a GNU userland coupled with Linux") Don't be obtuse. If Emacs works for you then that's great, but if you are at all interested in having a system that works then you should be prepared to hear that for other people your preferred solution doesn't work, and why, be prepared to acknowledge that, and accept that something should be done to fix that. Responding with hostility is not at all productive and is precisely why GNU remains unusable for the majority.


Clearly your first statement is incorrect: Android is an example of a Linux system that doesn't expose a GNU userland. As for "the majority", I don't know what you mean by that. The vast majority of computer users don't have any experience of anything that is released by the Free Software Foundation; they just use applications, some of which are released under the GPL (e.g., Audacity or VLC), and most of which are not. So what is the population of which the majority is unable to use GNU products?

Re Emacs: there is a substantial community of Emacs users. These people have not found it unusable. That other people use vim, VScode, or something else, or have no use for a text editor, doesn't undercut this fact.


Your argument seems to be that GNU has failed because it isn't the majority choice for general purpose computing systems.

Correct me if I'm wrong, because that makes no sense at all.


No, my argument is that GNU failed by being generally broken and unusable for the vast majority of people such that they won't use it.


GNU the project may be non-optimally managed (I have no opinion), but that doesn't have much impact on its products.

GNU the set of software packages is clearly not broken nor unusable. The vast majority of people who would use them, do use them, generally quite successfully.

GNU the software licensing model is quite popular and has its pragmatic and political proponents. Not universally loved, but clearly not broken nor unusable.

So, what are you talking about?


FreeBSD does not depend on GNU utilities.


> GNU remains unusable for the majority.

GNU provides a set of tools that are useful to people who make software. Most people don't make software, so GNU doesn't target their needs. There's other software for most users.

Most people can't play the guitar, but that doesn't mean guitars "don't work" and "something should be done to fix" them. The people who want to make music spend the requisite time to learn them, then they make music. People who just want to hear that music can buy it without having to own or learn guitar.


> You know precisely that Linux and FreeBSD both depend upon GNU utilities.

I'm reasonably sure FreeBSD, being a BSD, depends upon BSD utilities rather than GNU utilities (unless you're running some distro of GNU/kFreeBSD, but in that case your soul is already probably too thoroughly damned to warrant further comment).


Do you have any reasoning beyond the general notion that you're entitled to your opinion?


"It never had a specification concrete enough to implement because its chief architect always came off like a schizophrenic"

My takeaway was that it never went anywhere because the chief architect is a non-technical person who doesn't understand the technical limits but has a great ego, and was therefore very hard to work with.

I do admire the basic vision (minus royalty), but I think it would take a new, clean and clear design approach to make it actually usable. The way it is, it looks to me like a very early proof of concept, which is not much after all of those years.


Ted wrote a book ("Computer Lib/Dream Machines", 1974). He still sells it! Ted has known how computers work, in high detail, for longer than most of us have been alive.


Yes! That book was an important contribution in its day. I wrote an appreciation of it here a few years ago:

https://news.ycombinator.com/item?id=22176769


Hm, I have not read that book yet, but from what I gathered, it confirmed my view - that he has a generalist understanding of computers (and their sociological impact), but lacks the details and doesn't know how to program himself.


As someone who has read that book, & owns it, I can attest to the fact that Ted knows more about how computers work than most people who call themselves professional programmers (I know, I know, low bar..., but even so).

If this were a forum, instead of a plantation, I could post selections from it, and you could see for yourself.


> its chief architect always came off like a schizophrenic

do you realize it's our economic system and the perversion of technology that literally criminalize cooperation and the sharing of information (the intellectual property regime; read the book Abolish Silicon Valley by Wendy Liu for more) for one of two groups of people (workers and owners)?

> Maybe it's just me

normalization of an anti-human economy starts from day -9 months (i.e. health/ill-health of the human organism starts and depends on the health of the mother). what has the 'modern world' got to offer those who don't have clean drinking water due to imperialist systems?


Note the classic troll: Making idiotic statements without backing, and constantly teasing that they're going to back them while making further idiotic statements along the way. This troll never backed anything they said, of course, and the moderators here will never be intelligent enough to sanction this behavior.


Ted also got into fights with almost everybody he ever worked with on Xanadu


Did he? This is a part of the story I've never heard. Got any stories / links?


There is definitely some truth to this, but he has never got into any fights with me over the last 40 years that I've been working with him.


"Amazingly, Herzog concludes that Ted is the only sane person in the computer field." [https://www.youtube.com/watch?v=Bqx6li5dbEY]

At the time Ted conceived his visionary ideas, the magazines were full of pictures of guys in suits standing in front of secretaries typing into screenless hardware, and business machines with racks of IBM tape-reels. There was little non-corporate networking (except for the beginnings of PLATO), and only dreamers talked of making all information available to everyone. Wikipedia is a kludge by comparison.


For someone born after it was already vapourware, can anyone explain how issues with such deep interlinking were supposed to be solved? Like, what is supposed to happen if the linked host dies, or if the content becomes paywalled, copyrighted, or distributed illegally in the first place? Or if somebody highly referenced gets hacked and malicious code gets injected into the referenced text?


The design always included replication of the content. When information is originally published, you can request that "n" copies are sent to other back-end hosts that are advertising they have available storage. I believe we intended n to be at least 3. In addition, when someone requests content that is not already available locally and especially when they transclude the content, the back-end they are using is encouraged to make a local copy. So that answers "what if the linked host dies".

All content is already copyrighted by the author(s), and in order to publish it on the Xanadu network they have to agree to publish it under transcopyright which grants prior permission to transclude it. That does not preclude also offering the same content elsewhere under different license terms, but revoking the original license agreement would require the content be removed from the Xanadu network. IANAL but I suspect people might have some rights to rely on the original license unless properly notified that the rightsholder had revoked it.

All Xanadu content is append-only versioned, so if someone gets hacked and content is changed, nobody is obligated to transclude from the altered version. They can continue to transclude earlier versions.
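The append-only versioning described in this comment can be sketched as a toy data structure. Everything below (class name, method names, the sample content) is a hypothetical illustration, not the actual Xanadu back-end: the point is only that a transclusion pins a specific (document, version, span), so later edits or tampering never change what an existing link resolves to.

```python
# Toy sketch of an append-only, versioned document store in which
# transclusions address a frozen (document, version, span) rather than
# copying text. All names here are hypothetical, not real Xanadu APIs.

class VersionedStore:
    def __init__(self):
        self.docs = {}  # doc_id -> list of immutable version strings

    def publish(self, doc_id, text):
        """Append a new version; earlier versions are never mutated."""
        self.docs.setdefault(doc_id, []).append(text)
        return len(self.docs[doc_id]) - 1  # version number

    def resolve(self, doc_id, version, start, end):
        """Resolve a transclusion: a character span of one frozen version."""
        return self.docs[doc_id][version][start:end]

store = VersionedStore()
v0 = store.publish("genesis", "In the beginning God created the heaven and the earth.")
quote = ("genesis", v0, 0, 16)  # a transclusion pointer, not a copy

# Later the document is "hacked" or edited - a new version is appended:
store.publish("genesis", "Something completely different.")

# The old transclusion still resolves against the version it cited:
print(store.resolve(*quote))  # -> "In the beginning"
```

Nobody transcluding the original is forced onto the altered version; they keep citing version 0.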


The simplest way is to store everything locally, as was proposed by V Bush in 1945. His Memex included an output (on microfilm) of all the relevant pages, along with the annotation trails.

If you have a local copy of everything referenced, none of your links break.


Everything is pay per view in Xanadu. It's a micropayment model. When you read, you pay, and the payments are distributed, word by word, to all the creators of the input text. It's for storing expensive information in expensive computers to be sold expensively. Think of it as a successor to Mead Data Central, Lexis, and Nexis[1], early centralized and expensive information retrieval systems. (Those are still around, merged into Reed Elsevier, publisher of expensive books and journals.)

The technology became cheap enough that such a business model was unnecessary. Although it was considered in the early days of the Web. Early thinking was that you'd have to subscribe to everything, and there would be charges on your Internet bill for each service, like cable TV. You'd pay to subscribe to a search engine. The New York Times went that route successfully, but few others did.

I knew some of the people behind Xanadu in the 1980s. They were mostly everything-is-a-market libertarians.

[1] https://en.wikipedia.org/wiki/LexisNexis
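The "payments distributed, word by word, to all the creators of the input text" idea in the comment above can be made concrete with a small sketch. This is a hypothetical illustration of the accounting, not how any real Xanadu back-end worked: a reading fee is split among authors in proportion to the words each contributed to the composite document that was viewed.

```python
from collections import defaultdict

# Hypothetical sketch: a reader pays a fee for one view of a composite
# document; the fee is divided among the original authors in proportion
# to how many words each contributed to what was displayed.

def split_payment(fee_cents, spans):
    """spans: list of (author, text) pairs making up the viewed document."""
    word_counts = defaultdict(int)
    for author, text in spans:
        word_counts[author] += len(text.split())
    total_words = sum(word_counts.values())
    return {a: fee_cents * n / total_words for a, n in word_counts.items()}

doc = [
    ("ted",   "hypertext means non sequential writing"),   # 5 words
    ("alice", "a comment transcluding the phrase above"),  # 6 words
    ("ted",   "with more original text"),                  # 4 words
]
print(split_payment(15, doc))  # -> {'ted': 9.0, 'alice': 6.0}
```

At sub-cent granularity this is exactly the micropayment-clearing problem that made the model look more Lexis-Nexis than web.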


Not exactly. It is a lot closer to pay for first view.

https://xanadu.com/xuTco.html is Ted's most recent thoughts on the matter.


Note that prices could always be zero - the reason to include micropayments was to try and get access to the vast majority of existing culture that is currently only available for a fee. Ted wanted to bring in existing rightsholders rather than fight them. Unfortunately as it turned out that was never going to work. Just look at how hard the film industry fought home video, among many other examples of established industries fighting disruptors!

Ted once called me a libertarian, and I immediately set him straight. Not even close, dude.


A very informative post. I think a lot of the young people hip to Nelson miss this point. I'm no longer a, "free and open everything" kind of person but Nelson is far more of a BioShock character than you'd think. He's not a baby boomer in any sense; he's a child of Welles, Hawks, and Howard Hughes. I could see him being old enough not to freak out over Birth Of A Nation.

God bless that crazy bastard. We used to make Americans so well.


It's not about cheap vs expensive. It's about overall incentives within the system. A read should be as valuable as a write. At the very least, a read shouldn't be completely free.


Depends what I am reading.

If it is useful to me, then incentive modelling would suggest I should pay. If I get useless information, then I won't want to pay. If I get advertising, then they should pay.

But how do I tell what value I will extract from it before I have read it? Information/concepts/ideas are not like physical objects at all.


Yup, that never made sense to me, either.

There might be ways to help with that, like a free trial of x amount of words, etc., but this would be hard to implement consistently.

Also, there is clearly information so dense it is way more valuable than some shallow, mindless post.

So the creators would have to set a price for every sentence? And the readers get that information before they read and link?

Altogether just way too complicated and impractical.


Animats, bless you for posting this every time Xanadu comes up. There's always plenty of discussion on the man himself and his peculiarities. Not enough looking at the context of the times this was conceived, the problems Xanadu was expected to resolve, or the perspective and philosophies of the developers that ended up baked into the very structure written into it.


This sounds eerily like web3.


Kind of, but the crypto crowd can't do micropayments. Blockchains are too expensive to run to process payments in the penny range.


Much less the fraction of a penny range Xanadu wants!


a quote from the system: "Adam and Lilith began to fight. She said, 'I will not lie below,' and he said, 'I will not lie beneath you, but only on top. For you are fit only to be in the bottom position, while I am to be in the superior one.' Lilith responded, 'We are equal to each other inasmuch as we were both created from the earth.'"

That's just a few seconds of clicking around in systems like this. It's like surfing with a fucking jetpack.

Spend a few minutes with the demo and try to "feel" the concept. I would pay huge amounts of money to have all my research materials "Xanadu'd." (Anyone looking to be hired?)


About that quote, it's from the fifth Alphabet of Sirach[0]

[0] https://en.wikipedia.org/wiki/Alphabet_of_Sirach


Open for contractor work, see my email at my website (check my bio). We can talk about the details via email.


Legend has it that at CERN they flipped a coin to decide whether they would choose Xanadu or Tim Berners-Lee's WWW as their hypertext system.

That day coins failed us.


Obviously we're in the worse timeline. We got evil sentient AI as well: "The Algorithm" that runs YouTube, FaceBook, Twitter, et al.


Huge inspiration for many of the features present in tools such as Obsidian, Logseq and Roam Research.


In 1988, Autodesk (makers of AutoCAD) was so impressed by the Xanadu project that they gave them financial backing, hoping to bring a product to market the following year. After four years without any progress, Autodesk gave up:

> […] Come 1992, the “resources of Autodesk” were still funding “talent of the Xanadu team” which had not, as of that date, produced anything remotely like a production prototype—in fact, nothing as impressive as the 88.1x prototype which existed before Autodesk invested in Xanadu. On August 21, 1992 Autodesk decided to pull the plug and give its interest in Xanadu back to the Xanadudes.

(Quoted from footnote linked from this page: https://www.fourmilab.ch/autofile/e5/chapter2_64.html)


Am I getting the idea behind this correctly? The idea is embedding content in one stream from different sources?

It seems like folks here are skeptical, but... we're seeing a growing trend towards this. Nevermind the origins of hypertext, look at the more modern timeline:

- tiddlywiki encouraging bite-sized references that stack on top of each other in one view
- The growing popularity of digital zettelkasten(s?) such as Roam or Obsidian. Backlink views and embedding snippets are at least similar.
- Finally, the block protocol: https://blockprotocol.org/

Is this going to be something where people look back on this and say "if only it had been understood earlier, we'd have had $whatever_is_after_block_protocol sooner" ?


>Is this going to be something where people look back on this and say "if only it had been understood earlier, we'd have had $whatever_is_after_block_protocol sooner" ?

Yes, in the July, 1945 issue of "The Atlantic", the editor prefixes "As We May Think" with this:

  As Director of the Office of Scientific Research and Development, Dr. Vannevar Bush has coordinated the activities of some six thousand leading American scientists in the application of science to warfare. In this significant article he holds up an incentive for scientists when the fighting has ceased. He urges that men of science should then turn to the massive task of making more accessible our bewildering store of knowledge. For years inventions have extended man's physical powers rather than the powers of his mind. Trip hammers that multiply the fists, microscopes that sharpen the eye, and engines of destruction and detection are new results, but not the end results, of modern science. Now, says Dr. Bush, instruments are at hand which, if properly developed, will give man access to and command over the inherited knowledge of the ages. The perfection of these pacific instruments should be the first objective of our scientists as they emerge from their war work. Like Emerson's famous address of 1837 on "The American Scholar," this paper by Dr. Bush calls for a new relationship between thinking man and the sum of our knowledge. — THE EDITOR  

  [1] https://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/


Wait, what? What does this show?

The link goes to an example. The front page of the site says the following, which doesn't help much:

The computer world is not just technicality and razzle-dazzle. It is a continual war over software politics and paradigms. With ideas which are still radical, WE FIGHT ON.

We hope for vindication, the last laugh, and recognition as an additional standard-- electronic documents with visible connections.

We propose a new kind of writing-- PARALLEL PAGES, VISIBLY CONNECTED.


One thing to keep in mind is that Project Xanadu predates the web by a decade or so. Whereas the web settled on simple unidirectional links embedded in linear text documents, Ted Nelson's Xanadu was aiming for bidirectional transclusion and nonlinear documents ("zippered lists").

Reading old papers from Nelson, Engelbart, or Bush is a lot of fun, though it can sometimes take a little work to get into a pre-internet headspace. I don't read very many authors these days who are attempting to rethink fundamental computer interface concepts like that, though there are a few.


Xanadu was the original project in which the term "hypertext" was coined. It's the primary influence on all hypertext things, including HTML. Its intended use was "directly embedded documents" instead of clicking through links. So, this web site just revives that original idea. I agree that its novelty over the web model is still questionable, but that's how significant it is.


I would argue that memex is the primary influence over all hypertext things. The concept was around before Xanadu.


You're right, I hadn't thought about Memex. So the concept was already there, but still, Ted Nelson coined the term.


>It's the primary influence over all hypertext things, including HTML

Is it? Was Tim Berners-Lee influenced by Project Xanadu? Was he aware of it at all?


He was certainly influenced by Ted Nelson and I’d be shocked if he had no awareness of this. He’d have to have been completely naive about the field he was working in, and I don’t think he was naive.


You can check yourself by reading TBL's first WWW site [1].

In essence, Ted Nelson (and Vannevar Bush) are referenced in the "History" section, and also for coining the term "Hypertext", but not in the "Technical" [2] and "Standards" [3] sections.

[1]: http://info.cern.ch/hypertext/WWW/TheProject.html

[2]: http://info.cern.ch/hypertext/WWW/Technical.html

[3]: http://info.cern.ch/hypertext/Standards/Overview.html


Ted Nelson was mentioned specifically in his original proposal document, so I would say it had some influence.


It's a fascinating dead end in the history of hypertext. Ted Nelson has always come across to me as either 51% visionary and 49% crackpot, or the other way 'round. Technically, Xanadu's bidirectional links and document transclusion (e.g., not just including links but whole sections of other web pages in yours) are really interesting. From a copyright standpoint, they're kind of a mess, and an integral idea of Xanadu is essentially being able to make links and embedded microdocuments require micropayments for usage and/or inclusion, which sounds like the kind of thing that people who don't like libertarians use to parody libertarian ideas. And, from a practical standpoint, they've just never gone anywhere -- as far as I can tell, in no small part because Mr. Nelson is apparently impossible to work with.

Earlier discussion: https://news.ycombinator.com/item?id=15269827

A famous (and infamous) article, "The Curse of Xanadu", on "the longest-running vaporware project in the history of computing, a 30-year saga of rabid prototyping and heart-slashing despair" (now closer to 50 years and counting): https://www.wired.com/1995/06/xanadu/

I don't think I'd actually clicked through and seen this demo before, though, which adds some interesting new objections to it on my end -- I find the user experience here ugly and cryptic. I'm sure a much nicer one with these ideas could be made, but it seems oddly telling that Xanadu's proponents think this -- and the Xanadu web site in general -- are great ways to sell the ideas.


As a demo, I can forgive ugly and being a bit cryptic. It's the ramifications that get me. Okay, in the main document, you have a transclusion link to the King James Bible (Genesis 1:1-3). But one of the main points of Xanadu is the bidirectional links, so when you switch to reading the King James Bible, you should also see bidirectional links not only to the document you were just reading ("Xanadoc"), but every other link to the King James Bible in Xanalogical space, and past a few dozen such links, things will get incredibly messy (and a "missed opportunity"---one of the sides documents actually quotes one of the other side documents, yet there's no link).


Note that since Xanadu links and transclusions are external, you can select which linkbases to show. So you can filter them, which is important because otherwise you will surely end up with an unlimited number of links on every single character. :)

If the system is working correctly, you should always be able to follow transcluded quotations back to their original primary source - that is one of the most fundamental design principles. Conversely, the rightsholder of a document should be able to see where their work is being quoted.


So, all in all, quite like the sprawling mansion of the same name from the film Citizen Kane then?


More like Coleridge's unfinished poem Kubla Khan[0]

[0]: https://en.wikipedia.org/wiki/Kubla_Khan


Yeah, I don't get it either.


Imagine the web had actual support for hypertext. Then you could "hotlink" not just images, but also "hotlink" (i.e. natively quote, and annotate) textual content, with arbitrary start and end-points. As it is, HTML can't even express the equivalent of margin notes or footnotes, because it has some braindamaged ideas about inline vs block elements and the former can't contain the latter (they changed the nomenclature, but I'm pretty sure the limitation remains).
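The "arbitrary start and end-points" idea in the comment above is essentially span addressing: a link names a source plus a character range, and the client pulls just that range to render it inline as a live quotation. A minimal sketch, with the URL, the source table, and the function name all made up for illustration:

```python
# Sketch of "hotlinking" text: a link addresses an arbitrary span of a
# remote document rather than the whole page. The SOURCES dict stands in
# for remote hosts; names and URLs are hypothetical.

SOURCES = {
    "http://example.org/kubla-khan":
        "In Xanadu did Kubla Khan a stately pleasure-dome decree",
}

def transclude(url, start, end):
    """Fetch a source and return only the addressed character span."""
    return SOURCES[url][start:end]

# A margin note quoting characters 3 to 24 of the poem, provenance kept:
quote = transclude("http://example.org/kubla-khan", 3, 24)
print(quote)  # -> "Xanadu did Kubla Khan"
```

Contrast with HTML, where a link can only target a whole document or a pre-placed anchor, and quoting means copying the text with no live connection back to the source.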


Not just text, either. Other media formats as well, though text was always the first format supported.


SGML, the meta-language on which HTML is based, has entities, i.e. text-expansion variables bound to the content of external resources accessed via URLs or file names. So HTML itself doesn't need such mechanisms if it's understood that the general markup facilities of SGML are available (which, however, isn't the case with browsers). XML (XInclude) attempted to establish more granular inclusion of external document fragments via XPath, but failed on the web.

Even if HTML had transclusion, what would the use case look like? HTML as a markup vocabulary was introduced for casual academic publishing at CERN, reusing folklore SGML markup elements that were already widely used at the time. You can argue that HTML has been effectively stagnant while everything around it (CSS and JS) was changed ad absurdum to cater for HTML's limitations, but still, in an academic setting you'll use inline citations or block quotes with references to sources, just as HTML provides.


You can do all this with the mostly ignored part of Fielding's REST, HATEOAS.

Xanadu has a lot of interesting ideas and Nelson is a visionary, though the idea of visually linking works just doesn't work. However, transclusion is an important idea that is critical for Wikipedia's style of evolution, and will be important as Wikidata and other linked data (semantic web) applications get off the ground.

However, the vast majority of the web consists of cheap shortcut web sites that aren't much different from dial-up systems. AKA start-up money-grabbing culture trying to carve up every little idea into a unicorn. And once they start carving things up, they sure don't want to collapse it and help create the semantic web, except privately behind the scenes, in creepy surveillance-capitalist and other centralizing projects.


I think the Memex (er, Xanadu) is an effective Rorschach test to reveal perceptions of the power of computing.

Many are happy with the status quo, and see it as overly complex.

Some see it as evil, like communism.

Some see it as hope for the future. Hope that should have arrived half a century ago, at least.


I hope to see Xanadu succeed, even if only in some specialized niches. By seeing value in those niches, more and more projects and people would understand the benefits of such a system and want to implement it.

But I also believe there will always be a space for current webapps, treating links as RPCs against external systems.

EDIT: For anyone interested here is Ted Nelson explaining the system in 2008 on an example system implementation of his original idea:

https://www.youtube.com/watch?v=En_2T7KH6RA


A significant source of inspiration for a great deal of the functionality found in programs such as Obsidian, Logseq, and Roam Research.


This looks truly innovative to me, the way the different text-sources connect. And it seems to work on some level.


How is this practically useful anyway? How is this better than Roam?


This crashed the safari tab on my iPhone, lol.



