- Date of recording: 8 December 2016
- Duration: 62 min
- Dewey classification: Ontology, Semantics, Computer Science
- Author(s): Brian Cantwell Smith
- Producer: INRIA (Institut national de recherche en informatique et automatique)
- Publisher: INRIA (Institut national de recherche en informatique et automatique), CNRS - Centre National de la Recherche Scientifique, Région PACA, UNS
Semantics in the Time of Computing
Much of the technical terminology of computer science betrays its logical heritage: ‘language’, ‘symbol’, ‘syntax’, ‘semantics’, ‘value’, ‘reference’, ‘identifier’, ‘data’, etc. Classically, such terms were used to name essential phenomena underlying logic, human thought and language — phenomena, it was widely believed, that would never succumb to scientific (causal, mechanical) explanation. Computer science, however, now uses all these terms in perfectly good scientific ways, to name respectable scientific (causally explicable, mathematically modellable) phenomena.
There are two possibilities. The first is that computer science has given us a scientific understanding of the fundamental mysteries of language, logic, and mind. The second is that computer science has redefined these words, so that, although they have been brought into the realm of the scientific, they no longer refer to what they used to refer to. Most people believe the former. I will argue for the latter: that, for reasons traceable back to Turing’s 1936–7 paper, computer science has redefined these terms in such a way as to “disappear” much of what is fundamental to the human condition: language’s long-distance reach, the “non-effectiveness” of truth and reference, thought’s normative deference to the world.
The result, I believe, not only challenges the prospects for Artificial Intelligence and cognitive science, but also limits our ability to understand databases, knowledge representation, even programs. It also hinders communication, because overlapping technical vocabulary means different things in different communities. Most seriously, it undermines our ability to talk about the most fundamental aspects of semantic or symbolic systems.