Hacker News

How typical are you thinking? I'd guess less than 10%, there's a lot of AI researchers and this is just one strain of thought.


He's on the extreme end but way more than 10% believe that AGI will be a technological breakthrough on the same level as fire.


For those 'in the know' a lot more typical than you would think. If we don't reach at least Kardashev scale 1 in the next hundred years or so, we're going to go extinct due to several now-predictable factors.

And an unchained LLM trained on reality is far more capable of finding solutions to that problem than a bunch of squabbling politicians.


> And an unchained LLM trained on reality is far more capable of finding solutions to that problem than a bunch of squabbling politicians.

Not that I disagree with this statement, I don't, but it's not a silver bullet. Technology is, ultimately, operated by humans, and no amount of frontier research and development can overcome collective action problems. At some point you do have to sit down with these stupid politicians and get everyone on board. The loom was invented hundreds of years before the industrial revolution; in fact, it was nearly forgotten, and the design survived due to a few happy accidents. It was only after the English Civil War and the establishment of checks on royal power that widespread adoption was possible.


Technology is operated by humans now, but I believe it is a mistake to think that technology could not evolve to the complexity that it can operate itself.


Sure, but is this what we want, as engineers? Can computers be held accountable?


I can see this in Minsky-era AI research, but surely, with the number of people getting into AI from a purely practical angle right now, I would expect that mindset to be diluted. As someone not in the know, I could very well be wrong.

In response to the coming apocalypse: this isn't the first time everyone has had a vague sense of potential doom about the future. I believe this happens during any time of fundamental change; the change makes the future uncertain, and we interpret that uncertainty as apocalyptic. Back during the Thirty Years' War that apocalyptic belief manifested as God being angry with us; today it attaches to the (very real) problems our rapid industrialization has created. Not to minimize the problems we face - minimizing only in the sense that they probably won't lead to extinction. The various predictable factors mentioned have the potential to make life really shitty and cause massive casualties.

While framing these issues as a matter of extinction may feel like a way of adding urgency, it's actually contributing, on an individual level, to fracturing our society: we all "know" an apocalypse is coming, but we're fighting over what is actually causing it. Except there will be no apocalypse - it's just fear of the unknown. Something is fundamentally changing in the world, we have no idea how the cards will land, and that's no different from a fear of the dark.


We accuse GPT of confidently giving answers on things, but man, it learned from the best.

I cannot assure you that we won't have something like a nuclear apocalypse in the next few decades, yet here you are certain it's not going to happen. How can you be so sure of the future when underlying assumptions like the value of labor are about to experience massive changes, while asset inflation spirals ever upward?


I think you misread what I said - I was responding to this quote:

> If we don't reach at least Kardashev scale 1 in the next hundred years or so, we're going to go extinct due to several now-predictable factors.

Many people are certain of human extinction for one reason or another; it doesn't sound like you're one of them. I'm saying that we don't know what the future will bring, and that uncertainty manifests as apocalyptic thinking. I also specifically mentioned that we are facing multiple problems that can cause huge devastation - I'm not making the argument that "Oh hey, everything is ok!" Just that framing things as apocalyptic contributes to the schism and prevents us from doing anything, because everyone refuses to listen since they believe their lives are at stake.

I guess I shouldn't say "it won't be extinction" - but that's a way, way lower probability than people think. A massive number of people have thought the world would end many times throughout history, so I'm skeptical of "well, this time we're RIGHT".



