This paper is part of a growing body of thought on the distinction between experiential understanding and symbolic understanding, and the problems caused by either equating the two or relying entirely on the latter.
Experiential understanding is an understanding of ‘things as they truly exist’. It is intuitive and non-linguistic, which means it is hard to coordinate efforts with other people based on it alone, and it offers few affordances - it is not very useful for engineering.
Symbolic understanding - our descriptions of reality - is a step removed/reduced from experiential understanding. It is a re-presentation. Reality is continuous in space and time and has no distinct parts. To be able to name things, to have symbolic/linguistic representations of things, we must somehow chop reality into distinct pieces, and there is no objectively correct or complete way to do this. The boundaries we draw around things are always slightly arbitrary, and meaningful bits are always left out. But the resulting symbolic re-presentation of reality is amazingly powerful. It lets us communicate and coordinate with others about the world around us, and it lets us develop scientific models that provide massive affordances over the world.
As society comes to value and trust symbolic understanding more and more, we come to think that it is the whole story. That if experiential understanding does contain any meaning beyond that of symbolic understanding, it is only a temporary state of affairs because science has not yet caught up. The growing dominance of a symbolic worldview is only driven forward faster by the muteness of experiential understanding - by its very nature it cannot defend itself in a court of words, and when it tries to do so through a friendly linguistic interpreter it comes off as unhinged and not a little woo-woo. What well-trained STEM practitioner reading this doesn’t scoff at the idea that representational/scientific/symbolic understandings of reality cannot be the whole story? In fact, if you asked most STEM professionals which provides a more fundamental and true picture of reality, an experience or a scientific model, most would say the latter.
But any symbolic understanding derives all its meaning from experiential understanding. It’s only experience that gives words their semantic weight, that ontologically ‘grounds’ them in reality (what does ‘red’ mean to a blind person?). What many philosophers, writers, and ethicists are circling around is the sense that if we place symbolic understanding fully on the throne and kick experiential understanding not just out of the castle but out of the kingdom, a process we seem desperate to do as quickly as possible, then we will be left with a hollow and meaningless, but highly articulate and self-consistent, ontology. One so decoupled from reality that it easily drifts off into convincing delusions and other forms of madness.
Iain McGilchrist popularized this experience-vs-representation epistemological framework in his book The Master and His Emissary.
This paper argues that this same dynamic leads to AI-induced psychosis on the individual level.
In his recent collection of essays, Against The Machine, Paul Kingsnorth blames this same over-indexing on symbolic understanding as the root of a growing society-wide delusion, in which we are detaching all forms of culture from any grounding in experiential reality and so are left with only a hollow and fragile shell - one too weak to bind us together toward any common goal.
So what then? Don’t confuse a picture of a place with the experience of being in that place. Don’t think that experiencing the world through a screen (where it has been filtered through a symbolic representation) is the same as touching it. Don’t let the singularitarians and accelerationists fool you that transcendence or eternal life is available through digital upload, so you should help bring about the singularity as quickly as possible. The map is not the territory. Touch grass.
I agree. Over the last 10 years, I've become increasingly aware of how people in general - developers, their managers, and the owners of their companies - are incapable of the communication necessary for basic cooperation in project teams or in their own planning, to say nothing of their ability to deal with more complex issues like market and socio-political manipulation of the public. Far too many people appear to be capable of only one-step-forward planning. Seriously. My career has become that of a consultant who can see multiple steps forward, and when I explain this insight I get everything from angry refusal to daunted obedience, but never a competing insight - only one step past their present situation. People are way too quick to enter into short-sighted conflict, and then completely ignore their real responsibilities.
I was ready to dismiss the article on the basis of its sensationalist writing, but I wanted to dig into the study as well. After reading through it, I asked my wife, who's a sociologist and researcher, what she thought about these conclusions based on the survey data. She said that collecting survey results in this kind of longitudinal study is considered a valid source of data about trends like this, but the obvious caveat is that you can't determine causation or what the practical impact is. The article does not hesitate to state hypotheses as fact, which is probably where the dismissals are coming from.
From the talk: “Intelligence is the conversion ratio between past experience and potential operating area [where the potential operating area of interest is the area of problems that system has no experience of].”
So we might say, “General Intelligence is the ability to do the things we haven’t yet thought of.”
“Like what?”
“Well, as soon as I name something it stops counting.”
Gödelian - I like it. Does that mean a constructive definition of General Intelligence is uncomputable?
Had the mistake been made in the other direction, making the significance of their finding very small, would they have double checked their math? You bet they would have.
Take a bunch of videos of the real world and calculate the differential camera motion with optical flow or feature tracking. Call this the video’s control input. Now we can play Sora.
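As a minimal sketch of the "recover a control input from video" idea: the snippet below estimates the global translation between consecutive frames via phase correlation, a pure-NumPy stand-in for the optical flow or feature tracking mentioned above (which a real pipeline would likely do with something like OpenCV, and which can recover rotation and parallax, not just translation). The synthetic two-frame "video" here is an assumption for illustration.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) such that b ~ np.roll(a, (dy, dx), axis=(0, 1))."""
    # Normalized cross-power spectrum; its inverse FFT peaks at the shift.
    cross = np.fft.fft2(b) * np.conj(np.fft.fft2(a))
    cross /= np.abs(cross) + 1e-12
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak positions to signed shifts.
    h, w = a.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

# Synthetic "video": a textured frame scrolled by (2, 3) pixels in one step.
rng = np.random.default_rng(0)
frame0 = rng.random((64, 64))
frame1 = np.roll(frame0, shift=(2, 3), axis=(0, 1))

print(phase_correlation_shift(frame0, frame1))  # → (2, 3)
```

The per-step shifts, collected over a whole clip, form exactly the kind of low-dimensional "control input" trace the paragraph describes pairing with video frames.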