Why Edelman Was Right: Mental Representation as a Category Error
What emerges from Edelman’s work is a view that not only undermines the idea of mental representation, but also supports a non-representational, relational ontology in which meaning is neither internal nor stored.
Mental Representation: A Problematic Inheritance
The notion of “mental representation” has long been central to many traditions in linguistics, particularly those influenced by cognitive science. The idea is simple, perhaps deceptively so: when we understand language, we do so by constructing internal models or symbolic representations—of concepts, sentences, or mental images—that can be retrieved or manipulated by the mind.
This metaphor of storage and retrieval has deep roots in the computational view of mind. But it carries with it several assumptions:
- That cognition is a matter of processing symbolic input.
- That knowledge consists of stored internal content.
- That meaning exists prior to and independent of context, residing inside the brain.
These assumptions, however, are exactly what Edelman calls into question.
Edelman’s Rejection of Representationalism
In Bright Air, Brilliant Fire and elsewhere, Edelman argues that the brain does not represent the world internally in any meaningful way. Instead, it categorises experience through patterns of neural activation shaped by selection, history, and interaction with the environment. There are no internal models or stored symbols—only coordinated, dynamic patterns of neural activity, continuously shaped by feedback loops.
“There is no image, no homunculus, no representation in the brain. There is only the capacity to categorize and to select from past activity.”
— Gerald Edelman (1992, p. 120)
In place of representations, Edelman proposes:
- Degeneracy: Multiple neural configurations can give rise to the same functional outcome, ruling out any one-to-one mapping between inputs and internal representations (a toy sketch follows this list).
- Reentrant connectivity: The brain is massively recursive, with ongoing signalling across distributed maps. Meaning is not stored; it emerges from ongoing coordination.
- Selectional dynamics: The brain is more like a Darwinian system than a computer. It doesn't encode; it selects from structured potentialities based on past experience and current context.
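The degeneracy claim can be made concrete with a deliberately crude sketch. The Python below is not a model of Edelman's Theory of Neuronal Group Selection; it simply treats a single threshold unit as a stand-in for a "neural configuration", fixes one categorisation task, and finds by blind generate-and-test that many distinct weight settings realise the same functional outcome. All names, parameters, and the choice of task are illustrative assumptions.

```python
# Toy illustration of degeneracy and selection. NOT Edelman's model:
# many distinct weight configurations realise the same categorisation,
# and they are found by selecting among random variants rather than
# by encoding a stored symbol.

import random

INPUTS = [(0, 0), (0, 1), (1, 0), (1, 1)]

def target_category(x):
    """The functional outcome to be realised: categorise an input
    as 1 when either feature is present (logical OR)."""
    return 1 if (x[0] or x[1]) else 0

def unit(weights, bias, x):
    """A single threshold unit: one possible 'configuration'."""
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

def behaviour(weights, bias):
    """The unit's complete input-output mapping."""
    return tuple(unit(weights, bias, x) for x in INPUTS)

TARGET = tuple(target_category(x) for x in INPUTS)

# Selection, not instruction: generate random variants and keep
# those whose behaviour matches the target categorisation.
random.seed(0)
survivors = []
for _ in range(20_000):
    w = (random.uniform(-2, 2), random.uniform(-2, 2))
    b = random.uniform(-2, 2)
    if behaviour(w, b) == TARGET:
        survivors.append((w, b))

print(f"{len(survivors)} distinct configurations realise the same category")
for w, b in survivors[:3]:
    print(f"  weights=({w[0]:+.2f}, {w[1]:+.2f}), bias={b:+.2f} -> {behaviour(w, b)}")
```

The point of the sketch is the many-to-one mapping it exposes: no surviving configuration "stores" the category, and nothing in its weights can be read off as a representation of it; the same function is realised by an open-ended family of structurally different systems.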
In short, Edelman offers a vision of cognition that is relational, emergent, and embodied—not representational, symbolic, or internalist.
Conclusion: The Real Error
Mental representation is not just a flawed model. It is a category error. It mistakes the infrastructure that supports construal for the construal itself. It reifies meaning as substance. It confuses system with storage. It localises meaning where meaning does not reside.