> I think the key idea that consciousness is a local maximum in the universe is fundamentally mistaken, if a cool thought experiment.
Very much agree, on both counts. Even "just" on earth, it's not like the more mechanistic approaches didn't have plenty of chances to compete. The near-instant and very complete domination by the first species to reach a certain level - and the subsequent orders-of-magnitude rise as it then starts to compete mainly with itself - was totally driven by evolution, and there's no reason to think it wouldn't happen anywhere else. The only real variable I can see is where it (evolution-driven intelligence increase) stops.
It's extremely hard to imagine even the basics such as group coordination and complex communication without some sense of self, and even the animals we see on earth with some level of these abilities demonstrate at least a proto-consciousness, which might well have won out in the end had we not been first past the post. There's no reason to think this phenomenon would be an earth-only anomaly.
It reminds me of the "million monkeys with a million keyboards" thought experiment. Yes, given infinite time, they will produce the entire works of Shakespeare. But in the real universe they don't have infinite time: the actual Shakespeare evolved, wrote it all first, then took over the monkey planet and enslaved, or at least repurposed, them before they even got the first paragraph out.
I don't mean to suggest that Homo sapiens is some kind of universal peak of capability - we are almost certainly not - but I find it pretty much impossible to imagine sentient creatures being outcompeted by non-sentient ones under any circumstances.
It's expensive, sure, but consciousness appears to be a - the - superweapon, and whoever grasps it first, wins.
I think maybe you are misrepresenting the thesis of the book. My impression was that, in the book, consciousness was a consequence of the evolution of intelligence on earth, and that the scramblers had a totally different evolutionary pathway. Once intelligence had evolved, its side effects, namely consciousness, were taking up processing power that could be used for more intelligence. Maybe I am misrepresenting the thesis of the book too, but at the very least I think it presents something more than an interesting thought experiment.
I think the idea that intelligence, creativity etc. can be totally separate from consciousness is a perfectly valid proposition, and certainly relevant to current and future developments in ML and in collective intelligence (companies, countries, mobs etc.)
I think in reality, separate from the story of the book, it makes a lot of sense that consciousness is a key component in evolving truly intelligent systems from scratch (and it probably isn't as "expensive" as the book makes it out to be). I think it also makes sense that you can have non-sentient/unconscious intelligent systems just as intelligent as conscious ones, and perhaps more so.
If the Chinese Room gives convincing answers, why does it matter one way or the other whether it truly "understands" Chinese?
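The Chinese Room intuition can be sketched in a few lines: a responder that mechanically matches symbols to canned replies can produce convincing answers with no understanding at all. (A toy illustration only; the tiny rulebook here is invented for the example, not a claim about any real system.)

```python
# Toy "Chinese Room": the operator follows a rulebook mechanically,
# mapping input symbols to canned replies. No meaning is involved
# anywhere in the process - only symbol lookup.
RULEBOOK = {
    "你好": "你好！",        # "hello" -> "hello!"
    "你懂中文吗": "当然懂。",  # "do you understand Chinese?" -> "of course."
}

def chinese_room(symbols: str) -> str:
    """Return the rulebook's reply for the input, or a stock fallback."""
    return RULEBOOK.get(symbols, "请再说一遍。")  # fallback: "please say that again."

# The room confidently "claims" to understand Chinese - by pure lookup.
print(chinese_room("你懂中文吗"))
```

From the outside, the room's "of course I understand" is indistinguishable from a sincere answer, which is exactly the point being argued above.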
> I think maybe you are misrepresenting the thesis of the book
I certainly could be! It has been a while. But are you saying your understanding was that sentience was achieved in the scramblers' evolution, and then later discarded in favour of "more teraflops", if you'll excuse my analogy? That wasn't my understanding - but I might well be wrong.
Even if so, the idea remains unconvincing, in fact more so. After a species achieves a certain domination in its environment - something you would very much expect from a civilisation capable of building spacecraft - evolution simply doesn't apply any more, not in any Darwinian sense. Even with humans, evolution has basically stopped, or if it continues, does so under circumstances entirely under our control. I find it impossible to imagine any naturally occurring scenario in which any sentient species is forced by evolutionary pressure to optimise away consciousness in favour of other mental tasks. They would instead, as you mention, simply augment their powers with technology - or deliberate genetic intervention.
It's still an interesting premise and definitely got me thinking. I also highly recommend the book. It's actually quite exciting, all this philosophy does not take up that much space in the text ;)
The problem with the Chinese Room is that someone had to build it... and someone had to invent Chinese.
>I find it impossible to imagine any naturally occurring scenario in which any sentient species is forced by evolutionary pressure to optimise away consciousness in favour of other mental tasks.
If a person is born tomorrow, on earth, with a mutation that makes them slightly less sentient but more intelligent, they would not have any issue reproducing.
Or you could conduct this thought experiment: if GPT-4 were somehow installed in a robot with the prompt "reproduce", do you think it would not replace humans after some time?
Here is an interesting passage from the book I feel is relevant:
"So sentience has gotta be good for something, then. Because it's expensive, and if it sucks up energy without doing anything useful then evolution's gonna weed it out just like that."
"Maybe it did." He paused long enough to chew food or suck smoke. "Chimpanzees are smarter than Orangutans, did you know that? Higher encephalisation quotient. Yet they can't always recognize themselves in a mirror. Orangs can."
"So what's your point? Smarter animal, less self-awareness? Chimpanzees are becoming nonsentient?"
"Or they were, before we stopped everything in its tracks."
"So why didn't that happen to us?"
"What makes you think it didn't?"
It was such an obviously stupid question that Sascha didn't have an answer for it. I could imagine her gaping in the silence.
"You're not thinking this through," Cunningham said. "We're not talking about some kind of zombie lurching around with its arms stretched out, spouting mathematical theorems. A smart automaton would blend in. It would observe those around it, mimic their behavior, act just like everyone else. All the while completely unaware of what it was doing. Unaware even of its own existence."
"Why would it bother? What would motivate it?"
"As long as you pull your hand away from an open flame, who cares whether you do it because it hurts or because some feedback algorithm says withdraw if heat flux exceeds critical T? Natural selection doesn't care about motives. If impersonating something increases fitness, then nature will select good impersonators over bad ones. Keep it up long enough and no conscious being would be able to pick your zombie out of a crowd." Another silence; I could hear him chewing through it. "It'll even be able to participate in a conversation like this one. It could write letters home, impersonate real human feelings, without having the slightest awareness of its own existence."
"I dunno, Rob. It just seems—"
"Oh, it might not be perfect. It might be a bit redundant, or resort to the occasional expository infodump. But even real people do that, don't they?"
"And eventually, there aren't any real people left. Just robots pretending to give a shit."
"Perhaps. Depends on the population dynamics, among other things. But I'd guess that at least one thing an automaton lacks is empathy; if you can't feel, you can't really relate to something that does, even if you act as though you do. Which makes it interesting to note how many sociopaths show up in the world's upper echelons, hmm? How ruthlessness and bottom-line self-interest are so lauded up in the stratosphere, while anyone showing those traits at ground level gets carted off into detention with the Realists. Almost as if society itself is being reshaped from the inside out."
"Oh, come on. Society was always pretty— wait, you're saying the world's corporate elite are nonsentient?"
"God, no. Not nearly. Maybe they're just starting down that road. Like chimpanzees."
"Yeah, but sociopaths don't blend in well."
"Maybe the ones that get diagnosed don't, but by definition they're the bottom of the class. The others are too smart to get caught, and real automatons would do even better. Besides, when you get powerful enough, you don't need to act like other people. Other people start acting like you."
While I'm not totally convinced by the premise of the "1%ers" being zombies, I think it's again a bit more than an interesting thought experiment.
"Things do what they do, because if they didn't, they wouldn't be what they are."
If machines can be sentient, then fire could be justifiably called sentient as well. Considering that our metabolic processes largely amount to exothermic energy transfer through oxidization and distribution via what amounts to liquid rust, I suppose the circle from man to machine to simpler machine to thermodynamics, and back to man, could be completed through this pathway.