
I am extremely unlikely to ever allow anything even vaguely related to Elon Musk to implant something in my brain, or even to wear a completely noninvasive "through the skin" helmet or headband-like sensing device. Never mind something with him as a founder.


The article shows a video of Noland, paralyzed from the shoulders down, playing Polytopia. It's great to be fully able-bodied and mock Elon Musk, but for someone to go from using a mouth stylus to playing Polytopia via telepathy is very cool and should be celebrated.


There is more than one organization/company in the world working on human-computer assistive interfaces for the paralyzed. For instance:

https://www.news-medical.net/news/20230824/Brain-computer-in...

https://news.brown.edu/articles/2012/05/braingate2

https://www.ahajournals.org/doi/10.1161/STROKEAHA.123.037719

If you google "BCI brain computer interface paralyzed" you will find a wealth of researchers and organizations working on it which are not Neuralink.


Sure, but have any made as much progress as Neuralink? Not a rhetorical question; genuinely asking as someone who doesn't know much about this field. Though, even if others have, isn't this kind of technological achievement something to be lauded regardless of who owns the company that did it?


Yes: https://www.youtube.com/@BCIcanDoBetter

Shaking President Obama's hand with "touch feedback" in 2016: https://www.youtube.com/watch?v=itkgmMLi7l4

Eating a taco in 2018: https://www.youtube.com/watch?v=fUjfA78FuZM

Robot arm in 2018: https://www.youtube.com/watch?v=MjFr0rnbT24

Playing Final Fantasy 14 with a BCI in 2019: https://www.youtube.com/watch?v=WjNHkRH0Dus

Non-invasive robot arm control in 2011: https://www.youtube.com/watch?v=8eOSlzDdOpg

Non-invasive robot arm control in 2020: https://www.youtube.com/watch?v=asDwupMbE2I

Speech/voice generation in 2024: https://www.youtube.com/watch?v=v8frSsvwPp4

The technology to do these sorts of things as proofs of concept is fairly old. You do not see widespread deployment because brain surgery betas are not a very good idea. There is insufficient evidence that the technology is mature or safe enough to support full-scale deployment. A common class of problem is brain scarring around the invasive insertions, which reduces the efficacy of the implant and requires further damaging brain surgery to remove it within a few years.

When you have technology that is insufficiently mature for deployment, you optimize for research. For that, you only need enough units to saturate your researchers with data and well-designed tests, which is usually achieved with a small number. This is similar to the reason you only need a few prototype cars even when you are going to make millions of them. If you are not deploying, you do not need many units to saturate your design/development process, and making a batch of each half-baked version prior to the final release candidate is a waste of time.

When the technology is minimally adequate, then you scale up. In contrast, deploying middling quantities of proof-of-concept versions as if that "tests" anything is a recipe for a slow-burning disaster. Nobody else is "trying to compete" on who can deploy more because competing on who can deploy more half-baked brain implants would be unethical.


very interesting links, thanks! I was not aware that the technology had progressed that rapidly (outside of Neuralink, which captures a lot of the attention)


BCIs have been doing what Neuralink is showing off since at least the 90s. It is emphatically not a difficult concept to understand that you can put wires in the brain and someone can learn to influence signals on those electrodes. Hell, Deep Brain Stimulation has been an FDA approved use of putting electrodes in your brain since 1997.

The hard parts of BCI are: electrode sensing, which is a much less difficult problem nowadays; implant longevity, probably an unsolvable problem without massive advancements in understanding the body; and brain surgery, which will never not be a huge deal, because piercing the barriers that protect the brain is just inherently a huge deal and risky to do.

I'm pretty sure Neuralink is the only one mass killing monkeys though.

Note that Elon has also helped push for the killing of US science funding, like funding used to further study BCIs. How convenient for him that all his competition is suddenly going to struggle.


Great. Happy to celebrate them too.


I'm reminded of the novel The Diamond Age, where a character hears about someone whose implant was hacked. If I recall correctly, the hack caused the implant to display an advertisement in a different language at the edge of their vision.

That seems pretty benign compared to what a neural implant could be made to do to someone.


In Iain M Banks' Culture novels there's a ship (the General Systems Vehicle "Grey Area", but most often called "Meatfucker" by other ships) which has converted its interior into a museum of torture devices. Lots of stuff we'd recognise, but one we wouldn't: a Neural Lace. Almost all Culture citizens have one; it's typically implanted in early adulthood and grows next to your brain. A Culture citizen touring the museum is confused: why does the Grey Area have a Neural Lace?

Well, of course the device doesn't have to be programmed to be controlled by the host, does it? Torture entirely by manipulating the compute substrate your mind runs on would be effective† and yet very easy to do... so this is in fact just another torture device.

† Effective in the sense that it would inflict needless misery on people, that's what torture is actually for, it's not an effective interrogation strategy and never has been.


> That seems pretty benign compared to what a neural implant could be made to do to someone.

Black Mirror: https://en.wikipedia.org/wiki/Men_Against_Fire

just as an example


If it was that or be paralyzed for the rest of your life, would you at least consider it?

I don't like Musk and I find Neuralink spooky in terms of their overall goals, but it's hard to deny how much this invention helps people.


Given the security track record of software in general, not even specifically those of Musk's companies but more broadly zero-days in all the major platforms, I would worry about a scenario half way between the plot of the film Upgrade and the long-standing trope of using hypnosis to turn someone into an unwitting assassin: https://en.wikipedia.org/wiki/Upgrade_(film)


It seems easy to worry about somewhat far-fetched scenarios like this when one is not paralyzed and thus not making a trade-off between risks and a fully paralyzed life.


The tech itself is far-fetched; that hackers exploit vulnerabilities in every system is not. Rather, it is so common you can measure attempts in hertz.


You have again ignored the trade-off in your word-play.


I disagree.

I see the promise, but I've got too many real life examples of security issues to draw on to trust it would even keep working very long — let alone working appropriately and under my control — to allow one to control my body, which an implant would necessarily need to do.

And that's even with 100% of the biological compatibility issues solved (I'm told those take several years to show up in all the other research examples from everyone else), and even assuming there were no trust deficit from Musk's companies selling products on the promise of what they aspire to do "this year" and don't, or from their misleading demos. This is a fundamental issue of digital security being hard.

If an accident like Christopher Reeve's were to happen, I'd wait for something that repaired or regenerated tissue over a chip.


You seem awfully sure about what you would do, but you again do not seem to have considered what it is like to be fully paralyzed. Let me be clear: I understand the risks of these devices. But my impression is that you're having trouble empathizing with people with such medical conditions, and you're not really considering how it feels to live like that. In fact, you again spend your whole message talking about abstract considerations, but you do not talk about the experience of being unable to do almost anything, and how that shapes a person's willingness to take risks and weigh them differently. That is my point about trade-offs: consider the personal and emotional as well as the technical.


> In fact, you again spend your whole message talking about abstract considerations

No.

Not abstractions.

I have experience of software, I know how bad the entire industry is.

https://xkcd.com/2030/ applies to everything.

Even without malice, my degree used as case studies the failures of the Therac-25 and the 1992 failure of the London Ambulance Service's computerised dispatch system.

Hospitals and devices do get attacked. Bitcoin ransomware does affect hospitals. These are not abstractions, they are things that actually happen: https://en.wikipedia.org/wiki/Medical_device_hijack

I wasn't being "abstract" when I said the frequency with which attacks are attempted can be measured in hertz; that's an actual anecdote from someone I knew a decade ago.


Software safety is an abstract concept. You have experience with specific instances of problems that fall under the abstract concept of software security. This is not to say it is not important.

Not being able to move your hands is not an abstract concept. It can be directly experienced.


"A specific disability" is as little of an abstraction as "this specific software vulnerability".

"Disability in general" is exactly as much an abstraction as "software safety in general".


If it would make the greed and exploitation that form people like Musk go away without a trace and forever, I'd happily be paralyzed for the rest of my life. Nobody would even have to know or thank me for it, as long as I knew... I'd watch humanity flourish on TV and cry tears of happiness. Even the best version of my best life is still just one life.


It is very brave of you to make this totally hypothetical sacrifice for us - only in your mind, of course.


> If it was that or be paralyzed for the rest of your life, would you at least consider it?

It's all hypothetical anyway; what's your deal? Are you saying the totally hypothetical life-changing cure is so impressive that the real suffering caused by those pursuing it is to be ignored? Ignoring the real suffering for some hypothetical deus ex machina is very cowardly, and if my hypothetical sacrifice reminded you of that, that's fine.

"If Elon Musk gave a shit about anything other than profit, and knew his ass from his elbow, and this tech was feasible, and you were paralyzed, would you do it?"

He doesn't, he doesn't, it may not be, and I'm not, so the question is moot. But it's very scientific to ask, and a great way to navigate such society impacting questions, thanks!


Yikes dude. Neither SpaceX nor Tesla would exist if he only cared about profit. Nor would Neuralink!

This is Reddit-level delusion right here. Please don't bring that here.


While the discourse could be better, that counterargument doesn't work either: all of those companies would still make sense even if they were purely driven by profit motives, as they also represent gaps in the marketplace.

I think Musk is driven by both.

Even now, despite the flaws I see in him, I still assume Musk thinks he's improving humanity.

But he needs, and knows he needs, a lot of money for Mars. There's unambiguously a lot of profit motive.

Unlike @computerthings, my objection is to the tech, not the person. The person doesn't help (he also doesn't seem to get the mindset needed for quality software security), but he doesn't make it much worse given how bad this is everywhere.


I know people this paralyzed. The concerns they have are usually more "How will I pay rent next month" and "how do I not get such bad bed sores"

How much do you think Neuralink is going to cost? How will people who can't get around on their own pay that? How are people who can't work going to pay that?

I don't know why supporters of all these things are so unable to view the whole situation. Musk doesn't want to pay taxes to a government that will support these disabled people. Musk doesn't want to support these disabled people. They are literally pawns for PR to him.

Musk doesn't want to advance the HUMAN RACE. Musk wants to advance CERTAIN PEOPLE.


To add to what you say:

> Musk doesn't want to advance the HUMAN RACE. Musk wants to advance CERTAIN PEOPLE.

I think he can't tell the difference between those certain people and the human race as a whole. Trans people in particular would be the obvious example of his failure here — ironically, given how much inspiration he's taken from a fictional universe where people can change physical gender by thinking about it a bit and waiting a few months.

It's… not intended as a compliment when I say he "seems sincere" about wanting to advance the human race when it comes with this caveat. Quite the opposite.

Likewise given what else he's "seemed sincere" about in the past and hasn't manifested.


Why should we believe that he is not just straight up lying, or hiding some kind of fatal flaw? He is intentionally and systematically dismantling any regulatory or enforcement bodies that would hold him accountable or investigate his claims.


"Mind virus" has a whole extra dimension when IoT is hardwired to your brain.



