This communication focuses on the interplay between fiction, style and tools, in order to trace a genealogy of the practices and thought which have led to the current state of AI technologies, illustrated particularly by GPT-3. Since the current wave of interest in artificial intelligence isn’t the first one, it matters to resituate it within the tradition of modern computing technologies, starting with the second half of the twentieth century. The approach here focuses mostly on ways of doing, manières de faire, in order to highlight three aspects. First, it is a matter of highlighting the assumptions within which AI practitioners were working, the worldviews they were surrounded with, and some of the hypotheses (fictions) they grounded their work in. Second, I want to show how these approaches translate into concrete, material forms, taking the specific example of programming languages, from LISP to Python. Third, I would like to examine more closely the shift from one epistemological stance to another: from the one I will call atomic to the one I will call patternic. Finally, I will sketch out some of the possible consequences of this shift (or rather, this re-organization of priorities) for contemporary works of fiction, resulting in a literature of patterns.
This investigation starts from the perspective of communication sciences at large, encompassing science and technology studies, media studies, philosophy of science and literature. The main analytical lens I will rely on here is that of style: not just as a set of aesthetic manifestations, but style as an epistemological stance, as developed by Gilles-Gaston Granger in his Essai pour une philosophie du style. What Granger assumes throughout his work is that specific formal manifestations tend to highlight different tendencies and perspectives on a specific conceptual problem. As he analyzes the different ways of doing mathematics, from Euclid to Descartes and vector math, he highlights the intertwining of “work” and “thought”.
As the connection between the singular and the collective, style cannot be separated from inspiration. Part of what I would like to show today is the mutually-reinforcing relationship between C.P. Snow’s two not-so-distant cultures. Styles exist across both of those domains, and nonetheless enter into a dialogue. Fiction and theories aren’t so different from each other (“Given X…”, “Once upon a time Y…”): we will see how computer scientists themselves mention specific styles and themes of fiction in their scientific work, and how this scientific work in turn influences, more or less directly, writers of fiction.
Turning to specific practices, we can further enrich these relationships, and illustrate these stylistic, and therefore epistemological, stances by looking at programming languages.
lisp as experimental research which trickles down to the rest of practitioners
formalism vs. materialism -> how can tools represent epistemological stances?
differentiations of style
style can be explicit or not, from fashion to science
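A minimal sketch in Python (chosen here only for legibility; the function names are mine, not drawn from any source) of how the same computation can be written in two styles, one expression-oriented in the lineage of Lisp, one imperative in the lineage of C:

```python
from functools import reduce

# Expression-oriented style, in the lineage of Lisp: the sum is a single
# composed expression, with no mutable state.
def total_as_expression(xs):
    return reduce(lambda acc, x: acc + x, xs, 0)

# Imperative style, in the lineage of C: the sum is an accumulator
# mutated step by step.
def total_as_steps(xs):
    acc = 0
    for x in xs:
        acc += x
    return acc
```

Both functions compute the same value; what differs is the stance they take toward the computation, as a timeless expression or as a sequence of state changes.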
The first approach to artificial intelligence in computer science (even though it was barely called that at the time, the first CS department appearing at Cornell in XXXX) was kickstarted in 1956 at the Dartmouth Summer Research Project on Artificial Intelligence. Organized by John McCarthy, Marvin Minsky, Nathaniel Rochester, Claude Shannon, and others, it was intended to formalize the diverse approaches which existed at the time, focusing on what was then called “thinking machines” rather than artificial intelligence; indeed, the term itself was coined by McCarthy for the occasion.
The results of the workshop, like so much AI work at the time, and perhaps still today, vastly underestimated the nature and scale of the task. It did, however, establish AI research as a coherent field, and laid out the foundations for two different heuristics for solving the problem of finding “how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves”. These two approaches are the physiological approach and the formal approach. The physiological approach, favored by Minsky, Solomonoff and Turing, intended to re-create the human brain’s plasticity, with the assumption that the simulation of neurons would result in the simulation of thought. The formal approach, favored by McCarthy, Shannon and McCulloch, posits in turn that the problem space, rather than the phenomenological space, should be abstracted away into formally manipulable symbols in order to enable its processing by the computers of the time, computers designed on the assumption of the vast reach of discrete logic operations.
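The two heuristics can be caricatured in a few lines of Python (a deliberately toy sketch; the rules, weights and function names are invented for illustration, not taken from any historical program):

```python
# Formal approach: the problem space is abstracted into discrete symbols,
# and reasoning is the explicit rewriting of those symbols by rules.
def forward_chain(facts, rules):
    """Derive new symbols until a fixed point is reached."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Physiological approach: no symbols at all, only a McCulloch-Pitts-style
# unit that fires when its weighted inputs cross a threshold.
def neuron(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)
```

Here `forward_chain({"raining", "cold"}, [({"raining"}, "wet"), ({"wet", "cold"}, "ice")])` derives `"ice"` by purely symbolic steps, while `neuron([1, 1], [0.6, 0.6], 1.0)` behaves as a logical AND without any symbol ever being named.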
These two approaches embody two distinct epistemological stances, the former specifically phenomenological, the latter specifically rational. Historically, the rational approach prevailed for a time, until the deep learning renaissance of the early 2000s. Looking more closely, we can highlight two “fictional environments” which have surrounded this stage of research in AI: the first taken from cybernetics, the second from children’s games.
The founding fathers of AI, as we know them, are not particularly known for explicitly quoting the works of fiction that inspired them. Their lineage, rather, aligns more closely with philosophical traditions than with artistic ones. In particular, the work of the rational AI scientists can be traced back to logicians and philosophers, from Russell to Frege and Leibniz [INSERT PROOF WHICH COMES FROM MOST HISTORY OF CS PAPERS].
then sum it up, history of CS, and then semantics
make believe, rules, etc.
turing - wittgenstein (opposite approaches?)
thesis of mccarthy
winograd -> blocks world -> symbolic artificial intelligence
as input (programming languages)
and output (data, json, corpus, word2vec, etc.)
only schools, fantasies (YCombinator), OOP and C worked better
Lisp as a myth for today’s programmers
not just computer scientists, but also the programmers that come after
from lisp to python, scripting is anything
mccarthy -> minsky
it was intertwined before, it is still intertwined
what is the lisp of today? platforms (tensorflow, etc.)
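The atomic/patternic distinction sketched above can be made concrete in a few lines of Python (the vectors below are invented for illustration, in the spirit of word2vec, not taken from any trained model):

```python
import math

# Atomic stance: a word is a discrete symbol; two symbols are either
# identical or simply unrelated.
def atomic_related(a, b):
    return a == b

# Patternic stance: a word is a position in a space of usage patterns;
# relatedness is graded similarity between vectors.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy vectors, invented for illustration.
VECTORS = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "lisp":  [0.1, 0.2, 0.9],
}
```

Atomically, “king” and “queen” are simply different symbols; patternically, they sit far closer to each other than either does to “lisp”.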