🧠 Edelman on Mental Representations
Gerald Edelman, especially in his theory of Neural Darwinism and in works like Bright Air, Brilliant Fire (1992), rejects the idea of mental representations as discrete internal symbols or stored models. Instead, he insists that consciousness and cognition emerge from dynamic, context-sensitive processes in neural systems—not from the manipulation of representations. Here’s a brief sketch of Edelman’s position.
Edelman critiques the representationalist view on both neuroscientific and philosophical grounds:
- Degeneracy over coding: Neural circuits are degenerate; different neural configurations can produce the same functional output. This undermines the idea of fixed internal representations or mappings from input to meaning (see the sketch after this list).
- Reentrant connectivity: The brain functions through reentrant signalling, continuous recursive communication between and within neural maps. There is no central model or representation, only ongoing coordination across distributed areas.
- Embodied, dynamic systems: Cognition is the result of embodied interaction between the organism and its environment. Meaning is not stored internally but arises through history-dependent patterns of activation shaped by bodily experience.
- Selectional systems, not representational ones: The brain is a selectional system, like evolution: it doesn't encode representations but selects among variations generated through development and experience. This is radically different from computationalist models of representation.
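To make degeneracy concrete, here is a minimal toy sketch in Python (my illustration, not Edelman's model; the circuits, weights, and threshold are all invented) in which two structurally different "circuits" categorise the same stimulus identically:

```python
import numpy as np

def circuit_output(weights, stimulus):
    """A stand-in 'neural circuit': threshold the weighted input sums."""
    return (weights @ stimulus > 0.5).astype(int)

# One stimulus presented to two different connectivity patterns.
stimulus = np.array([1.0, 0.0, 1.0])

circuit_a = np.array([[0.4, 0.9, 0.3],
                      [0.1, 0.2, 0.0]])
circuit_b = np.array([[0.7, 0.0, 0.1],
                      [0.0, 0.3, 0.2]])

# Structurally different circuits, identical functional categorisation:
# a many-to-one mapping from neural structure to function.
print(circuit_output(circuit_a, stimulus))  # [1 0]
print(circuit_output(circuit_b, stimulus))  # [1 0]
```

If different configurations do the same functional work, no single configuration can be read off as "the" internal representation, which is the force of the degeneracy argument.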
“There is no image, no homunculus, no representation in the brain. There is only the capacity to categorize and to select from past activity.” (Bright Air, Brilliant Fire, p. 120)
Edelman’s view aligns with the non-representational stance of many systemic theorists, notably Halliday, Matthiessen, and Thibault, who have long argued that language is not a system of internal representations but a socially constructed semiotic resource.
In fact, Edelman provides neuroscientific validation for the idea that:
- The brain does not need representations to support language use.
- Cognition can be understood as emergent, context-sensitive, and relational.
- Systems (like language) are better conceived as structured potentials for construal, not as storage systems of meaning.
Why Edelman Was Right: Mental Representation as a Category Error
What emerges from Edelman’s work is a view that not only undermines the idea of mental representation, but also supports a non-representational, relational ontology in which meaning is neither internal nor stored.
Mental Representation: A Problematic Inheritance
The notion of “mental representation” has long been central to many traditions in linguistics, particularly those influenced by cognitive science. The idea is simple, perhaps deceptively so: when we understand language, we do so by constructing internal models or symbolic representations—of concepts, sentences, or mental images—that can be retrieved or manipulated by the mind.
This metaphor of storage and retrieval has deep roots in the computational view of mind. But it carries with it several assumptions:
- That cognition is a matter of processing symbolic input.
- That knowledge consists of stored internal content.
- That meaning exists prior to and independent of context, residing inside the brain.
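For contrast, the storage-and-retrieval picture carrying these assumptions can be caricatured in a few lines (a deliberately crude sketch, not any particular cognitive model; the lexicon entries are invented placeholders): meaning as pre-stored content fetched on demand.

```python
# A caricature of the representationalist picture: meaning as stored
# internal content, retrieved by key. Entries are invented placeholders.
mental_lexicon = {
    "cat": "CONCEPT_CAT",
    "chases": "RELATION_CHASE",
}

def understand(word: str) -> str:
    """'Understanding' reduced to retrieval of a pre-stored symbol."""
    return mental_lexicon[word]

print(understand("cat"))  # CONCEPT_CAT
```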
These assumptions, however, are exactly what Edelman calls into question.
Edelman’s Rejection of Representationalism
In Bright Air, Brilliant Fire and elsewhere, Edelman argues that the brain does not represent the world internally in any meaningful way. Instead, it categorises experience through patterns of neural activation shaped by selection, history, and interaction with the environment. There are no internal models or stored symbols—only coordinated, dynamic patterns of neural activity, continuously shaped by feedback loops.
“There is no image, no homunculus, no representation in the brain. There is only the capacity to categorize and to select from past activity.” (Gerald Edelman, 1992, p. 120)
In place of representations, Edelman proposes:
- Degeneracy: Multiple neural configurations can give rise to the same functional outcome. This rules out the idea of one-to-one mappings between input and internal representations.
- Reentrant connectivity: The brain is massively recursive, with ongoing signalling across distributed maps. Meaning is not stored; it is emergent from ongoing coordination.
- Selectional dynamics: The brain is more like a Darwinian system than a computer. It doesn't encode; it selects from structured potentialities based on past experience and current context (a toy sketch follows this list).
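As a loose illustration of those selectional dynamics (a toy sketch under invented assumptions: the repertoire size, fitness measure, and amplification rate are mine, not Edelman's theory of neuronal group selection), the following simulation amplifies whichever pre-existing variants happen to fit a recurring environmental signal, rather than writing that signal into a stored code:

```python
import random

random.seed(1)

# A repertoire of variant "neuronal groups", each with a random response
# bias and an initial connection strength. All values are invented.
repertoire = [{"bias": random.uniform(-1.0, 1.0), "strength": 1.0}
              for _ in range(20)]

def fit(group, signal):
    """How well a group's response happens to match the signal."""
    return 1.0 - abs(group["bias"] - signal)

signal = 0.7  # a recurring feature of the environment

for _ in range(50):
    for group in repertoire:
        # Differential amplification: a good fit strengthens the group,
        # a poor fit weakens it -- selection over variation, not encoding.
        group["strength"] *= 1.0 + 0.1 * (fit(group, signal) - 0.5)

dominant = max(repertoire, key=lambda g: g["strength"])
print(f"dominant bias after selection: {dominant['bias']:.2f}")  # near 0.7
```

Nothing here stores the signal; a group whose bias already lay near 0.7 simply comes to dominate through differential amplification, which is the sense in which a selectional system differs from an encoding one.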
In short, Edelman offers a vision of cognition that is relational, emergent, and embodied—not representational, symbolic, or internalist.
Conclusion: The Real Error
Mental representation is not just a flawed model. It is a category error. It mistakes the infrastructure that supports construal for the construal itself. It reifies meaning as substance. It confuses system with storage. It localises meaning where meaning does not reside.
Edelman saw this clearly from the vantage point of neuroscience. It’s time more linguistic theories saw it too.