1854: Boolean algebra
- George Boole founds modern logic with his book "The Laws of Thought", in which he expounds an algebra of logic. [Wikipedia]
1880-1900: Sense and reference, semantic compositionality
- In the late nineteenth century Gottlob Frege laid the groundwork for modern logic and the philosophy of language with his function-argument analysis of the proposition, the distinction between concept and object, the principle of compositionality, the context principle, and the distinction between sense and reference. [Wikipedia]
1950: The Turing Test
- In his 1950 paper "Computing Machinery and Intelligence" Alan Turing poses the question "Can machines think?" and proposes to interpret it in terms of an "imitation game". In this game an interrogator must decide whether the agent in the other room is a human or a machine, and may communicate with it only via typed messages. This game is now known as the Turing Test.
1953: Language games, meaning as use
- Ludwig Wittgenstein's book "Philosophical Investigations" is published posthumously. In it he develops the concept of the "language game": the use of language can be compared to a game, in that language interactions are bound by rules, and the words and sentences used derive their meaning from their use in the game. [Wikipedia]
1956: IPL, AI
- IPL was invented by Allen Newell, Cliff Shaw, and Herbert Simon at RAND Corporation and the Carnegie Institute of Technology. IPL introduced the concept of list processing. It was used in several NLI systems. [Wikipedia]
- At the Dartmouth conference the field of Artificial Intelligence was born. It was posed that "every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it." Main participants: Marvin Minsky, John McCarthy, Allen Newell, and Herbert Simon.
1957: Transformational grammar, COMIT, Generalized quantifiers
- Noam Chomsky publishes "Syntactic Structures", in which he introduces Transformational Grammar. It was used by a few NLI systems. [Wikipedia]
- COMIT, the first string processing language, was invented by Victor Yngve at MIT. It was used in several NLI systems. [Wikipedia]
- A. Mostowski extends the logical universal and existential quantifiers to generalized quantifiers. Natural language uses a wide range of quantifiers (such as "most" and "few") that go beyond the universal and existential ones. [Article] [Encyclopedia.com]
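
To illustrate, here is a standard textbook rendering of generalized quantifiers as relations between two sets, the restrictor A (the noun) and the scope B (the verb phrase). This binary form postdates Mostowski's paper, but it shows why determiners like "most" need more than the classical quantifiers:

```latex
% "every A is B", "some A is B", "most A are B" as relations between sets:
\mathrm{every}(A, B) \iff A \subseteq B
\mathrm{some}(A, B)  \iff A \cap B \neq \emptyset
\mathrm{most}(A, B)  \iff |A \cap B| > |A \setminus B|
```

No formula over just ∀ and ∃ can express "most"; the cardinality comparison above is essential.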
1958: Lisp
- Lisp was invented by John McCarthy. It became the favored programming language for AI research and was influenced by IPL. [Wikipedia]
1959: Programs with common sense
- John McCarthy wrote the influential paper "Programs with Common Sense". The new idea was to use formal logic for ordinary commonsense reasoning, whereas logic had previously been used only to represent mathematical truths. The program to be written, "Advice Taker", was to be a joint project with Marvin Minsky, but it was never realized. Reasoning is done by processing rules containing premises and a conclusion (a(x, y), b(y, z), c(z, a) -> d(x, a)). Fischer Black later made a first implementation of this system (1964). [Wikipedia]
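
A minimal sketch of this style of rule processing, using a simplified two-premise rule in the same notation. This is a hypothetical forward chainer for illustration, not Black's implementation:

```python
# A forward chainer over rules of the form premises -> conclusion.
# Facts and premise patterns are tuples; strings starting with '?'
# are variables.

def match(pattern, fact, bindings):
    """Match one premise pattern against a ground fact."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith('?'):
            if b.setdefault(p, f) != f:   # variable must bind consistently
                return None
        elif p != f:                      # constants must be equal
            return None
    return b

def satisfy(premises, facts, bindings):
    """Yield every binding that satisfies all premises."""
    if not premises:
        yield bindings
        return
    for fact in facts:
        b = match(premises[0], fact, bindings)
        if b is not None:
            yield from satisfy(premises[1:], facts, b)

def forward_chain(facts, rules):
    """Apply all rules repeatedly until no new facts are derived."""
    facts = set(facts)
    while True:
        new = set()
        for premises, conclusion in rules:
            for b in satisfy(premises, facts, {}):
                derived = tuple(b.get(t, t) for t in conclusion)
                if derived not in facts:
                    new.add(derived)
        if not new:
            return facts
        facts |= new

# Rule: a(x, y), b(y, z) -> d(x, z)
rules = [([('a', '?x', '?y'), ('b', '?y', '?z')], ('d', '?x', '?z'))]
facts = {('a', 'john', 'book'), ('b', 'book', 'mary')}
print(forward_chain(facts, rules))  # derives ('d', 'john', 'mary')
```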
1962: Speech Acts
- J.L. Austin showed in "How to Do Things with Words" that sentences are not just conveyors of declarative meanings that are true or false. They may also be actions in themselves. A sentence may inquire, command, promise, etc. A speech act may require pragmatic information to be interpreted. [Wikipedia]
1963: Selectional Restrictions
- Selectional restrictions specify the allowed values for predicate arguments (for example, the subject of "drink" should be animate). Work on selectional restrictions as a way of characterizing semantic well-formedness began with Katz and Fodor. [Wikipedia]
1966: Chatbot
- Joseph Weizenbaum invents the chatbot with his program ELIZA. Chatbots are accessible to the user and simple for a programmer to create. The chatbot uses hard-coded request-response rules. Many chatbots have since been created and they are still popular. In their basic form they are mindless, but they can be enhanced with other functions such as virtual assistance and decision support.
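
A minimal sketch of such hard-coded request-response rules. The patterns here are illustrative, not Weizenbaum's actual ELIZA script:

```python
import re

# Each rule pairs a regular-expression pattern with a response
# template; groups captured from the input fill the template.
RULES = [
    (re.compile(r'\bi am (.+)', re.I), 'Why do you say you are {0}?'),
    (re.compile(r'\bi need (.+)', re.I), 'Why do you need {0}?'),
    (re.compile(r'\bmy (\w+)', re.I),   'Tell me more about your {0}.'),
]
DEFAULT = 'Please go on.'

def respond(utterance):
    """Return the response of the first matching rule."""
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(*m.groups())
    return DEFAULT

print(respond('I am unhappy'))   # Why do you say you are unhappy?
print(respond('It is raining'))  # Please go on.
```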
1967: Events as objects
- Donald Davidson first explicitly represents events as objects in a formal representation in his paper "The Logical Form of Action Sentences".
1968: Chart parser, Case Grammar, Micro-Planner
- Jay Earley creates a chart parser for natural language parsing. Chart parsers use dynamic programming to store structures that have been built before, in order to avoid duplicate work (a minimal sketch of the chart idea follows this year's entries). [Wikipedia]
- Case Grammar was developed by Charles J. Fillmore. It emphasises the semantic roles in a sentence, such as Agent, Object, Benefactor, Location, and Instrument. [Wikipedia]
- Planner, created by Carl Hewitt (MIT), was an early AI language based on the procedural embedding of knowledge. It applied forward and backward chaining. Its subset Micro-Planner was implemented in Lisp and used by SHRDLU. [Wikipedia]
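
As referenced above, here is a minimal sketch of the chart idea behind Earley-style parsing. This is a CYK-style recognizer over a hypothetical toy grammar rather than Earley's actual algorithm, but it shows how a chart memoizes each analyzed span so it is computed only once:

```python
# CYK recognition for a grammar in Chomsky normal form: the cell
# chart[i][l] memoizes every nonterminal that derives the span of
# l+1 words starting at position i.

GRAMMAR = {                      # hypothetical toy grammar (CNF)
    ('NP', 'VP'): 'S',
    ('Det', 'N'): 'NP',
    ('V', 'NP'): 'VP',
}
LEXICON = {'the': 'Det', 'dog': 'N', 'cat': 'N', 'sees': 'V'}

def recognize(words):
    n = len(words)
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):            # length-1 spans
        chart[i][0].add(LEXICON[w])
    for length in range(2, n + 1):           # longer spans, bottom-up
        for i in range(n - length + 1):      # span start
            for split in range(1, length):   # split point
                for left in chart[i][split - 1]:
                    for right in chart[i + split][length - split - 1]:
                        parent = GRAMMAR.get((left, right))
                        if parent:
                            chart[i][length - 1].add(parent)
    return 'S' in chart[0][n - 1]

print(recognize('the dog sees the cat'.split()))  # True
```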
1969: Semantic Memory, Conceptual Dependency
- Quillian introduces Semantic Memory into NLI. Semantic Memory holds factual world knowledge. [Wikipedia]
- Roger Schank introduces Conceptual Dependency Theory, a theory for natural language understanding that represents meaning with a small set of primitive acts, such as ATRANS (transfer of possession), PTRANS (physical transfer of location), and MTRANS (transfer of mental information).
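
As an illustration, "John gave Mary a book" can be represented as an ATRANS. This is a hypothetical encoding as a plain data structure, not Schank's own notation:

```python
# 'John gave Mary a book': the surface verb 'give' maps to the
# primitive act ATRANS (abstract transfer of possession).
event = {
    'act':    'ATRANS',
    'actor':  'John',
    'object': 'book',
    'from':   'John',
    'to':     'Mary',
}
```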
1970: Augmented Transition Network, Relational Databases
- W.A. Woods invents the Augmented Transition Network (ATN), an extension of finite state machines used to parse natural language sentences. [Wikipedia]
- E.F. Codd (IBM) publishes "A Relational Model of Data for Large Shared Data Banks", the basis of modern relational databases.
1972: Semantic Parsing, Prolog
- Yorick Wilks coins the term "semantic parsing" for parsing directly to semantic form, skipping a separate syntactic analysis. [Wikipedia]
- Prolog was created by Alain Colmerauer and Philippe Roussel, based on Robert Kowalski's procedural interpretation of Horn clauses.
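
The idea behind that interpretation, in textbook notation (my rendering, not from the original papers): a Horn clause has at most one positive literal, and Kowalski's insight was that it can be read both declaratively and procedurally:

```latex
% A Horn clause:
H \leftarrow B_1 \land B_2 \land \dots \land B_n
% Declarative reading: H is true if B_1, ..., B_n are all true.
% Procedural reading:  to solve goal H, solve subgoals B_1, ..., B_n.
```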
1973: Montague Grammar
- Montague publishes "The Proper Treatment of Quantification in Ordinary English" (also known as PTQ), the first formal treatment of the semantics of natural languages. Since the paper requires a good deal of prior knowledge of logic, you may wish to approach it through Dowty et al.'s "Introduction to Montague Semantics".
1975: Scripts, Gricean maxims
- Roger Schank and Robert P. Abelson introduce Script Theory into AI. It builds on a psychological theory by Silvan Tomkins that explains how people know how to react in all sorts of stereotypical situations.
- Paul Grice states the implied assumptions in a conversation. These so-called Gricean maxims are: a contribution to a conversation should be informative, true, relevant, and clear (the maxims of quantity, quality, relation, and manner). [Wikipedia]
1979: Feature Unification
- Martin Kay invents Feature Structure Unification, an enhancement to phrase structure grammars that enforces agreement of features such as tense and number. [Wikipedia]
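
A minimal sketch of unifying flat feature structures. Real feature structures are recursive and support shared variables; this simplification only shows the core merge-or-fail behavior:

```python
def unify(fs1, fs2):
    """Unify two flat feature structures (dicts); return the merged
    structure, or None if any shared feature has conflicting values."""
    result = dict(fs1)
    for feature, value in fs2.items():
        if feature in result and result[feature] != value:
            return None          # clash, e.g. singular vs. plural
        result[feature] = value
    return result

# 'she' and 'walks' agree in number and person, so they unify:
print(unify({'num': 'sg', 'per': 3}, {'num': 'sg'}))  # {'num': 'sg', 'per': 3}
# 'she' and 'walk' clash on number:
print(unify({'num': 'sg'}, {'num': 'pl'}))            # None
```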
1981: Discourse Representation Theory
- Hans Kamp invents Discourse Representation Theory (DRT), which builds representations that resolve inter-sentential references such as cross-sentence anaphora. [Wikipedia]
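
A standard textbook example (not from Kamp's paper): the discourse "A man walks. He smiles." yields a single discourse representation structure in which the pronoun "he" is resolved to the referent introduced by the first sentence:

```latex
% DRS in linear notation: [ discourse referents | conditions ]
[\, x \mid \mathit{man}(x),\ \mathit{walk}(x),\ \mathit{smile}(x) \,]
```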
1983: Centering Theory
- Barbara Grosz, Aravind K. Joshi, and Scott Weinstein present a unified account of anaphora resolution that describes the preferences for pronouns and definite noun phrases. Their 1995 paper "Centering: A Framework for Modeling the Local Coherence of Discourse" provides a more complete theory. [Wikipedia]
1987: BDI
- The book "Intention, Plans, and Practical Reason", written by Michael Bratman, was published. It describes the BDI model (Belief-Desire-Intention) of action selection. [Wikipedia]
2006: Linked Open Data
- Tim Berners-Lee, inventor of the WWW and director of the W3C, coined the term Linked (Open) Data to describe structured data that is interlinked with other data over the internet. Large interconnected open databases have spawned many new question answering systems that aim to access this data. Many of these databases are triple stores that are accessible via the query language SPARQL. [Wikipedia]
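
For example, a triple store can be queried like this. The sketch below assumes the public DBpedia endpoint and the SPARQLWrapper Python library; endpoint availability and vocabulary may vary:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Query DBpedia, a linked open data triple store, for the birth
# place of Tim Berners-Lee. Data is stored as subject-predicate-object
# triples, which the WHERE clause pattern-matches.
sparql = SPARQLWrapper('https://dbpedia.org/sparql')
sparql.setQuery('''
    SELECT ?birthPlace WHERE {
        dbr:Tim_Berners-Lee dbo:birthPlace ?birthPlace .
    }
''')
sparql.setReturnFormat(JSON)
results = sparql.query().convert()
for row in results['results']['bindings']:
    print(row['birthPlace']['value'])
```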
2018: The generative pre-trained transformer
- The invention of the transformer neural network by researchers at Google Brain, Google Research, and the University of Toronto led researchers at OpenAI to develop GPT, a language model that is able to generate syntactically correct sentences on an unbounded range of conversation subjects. It opens the door to a variety of language tasks without the need for hard-coded request-response rules (see chatbot). [Wikipedia]