Natural Language Interface Gems



Programming language
Programming language
Which programming language is used to implement the natural language processing core components of the system? This excludes the languages that are only used to interact with the system.

APL ( 0 )
Fortran ( 0 )
Lisp ( 0 )
Prolog ( 0 )
Java ( 0 )

System structure

Type of analysis
Type of analysis
The main categories of natural language interfaces
Pattern matching
Literal occurrences of a pattern in a sentence are converted directly to parts of a DB query
Syntax based
A sentence is parsed and the parse tree is mapped directly to a DB query
Semantics based
After a sentence is parsed, it is first converted into an intermediate semantic expression, which is in turn converted into a DB query

From: Androutsopoulos, et al., Natural Language Interfaces to Databases - An Introduction

Pattern matching ( 0 )
Syntax based (maps parse tree to DB query) ( 0 )
Semantics based (via semantic intermediate)
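The pattern-matching approach can be illustrated with a minimal sketch (the patterns, table, and column names below are invented for illustration): a literal pattern occurring in the sentence is converted directly into a DB query.

```python
import re

# Hypothetical pattern table: a literal match produces a SQL fragment.
PATTERNS = [
    (re.compile(r"capital of (\w+)", re.I),
     "SELECT capital FROM countries WHERE name = '{0}'"),
    (re.compile(r"population of (\w+)", re.I),
     "SELECT population FROM countries WHERE name = '{0}'"),
]

def to_query(sentence):
    """Return the DB query of the first pattern that occurs literally
    in the sentence; no parsing or semantic analysis is involved."""
    for pattern, template in PATTERNS:
        match = pattern.search(sentence)
        if match:
            return template.format(match.group(1))
    return None
```

The sketch also shows the approach's weakness: anything not covered by a literal pattern is simply not understood.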

How does the system deal with the ambiguity in the input sentence?

Ambiguity occurs at the tokenization phase (the word 'de' may be part of a last name, or it may be an article), at the parsing phase (causing multiple parse trees), and at the semantic analysis phase (quantifier scoping problems, for example).
Early convergence
Apply as many restrictions as are available, early in the process. At all stages, pick only a single interpretation.
Late convergence
Keep all interpretations open. Pick the interpretation that gives the 'best' result in the end. Score results.

Early convergence ( 0 )
Late convergence
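Late convergence can be sketched as follows, assuming a list of candidate interpretations, an `execute` function, and a `score` function (all hypothetical): every interpretation stays alive until the results are compared.

```python
def best_interpretation(interpretations, execute, score):
    """Late convergence: execute every candidate interpretation and
    keep the one whose result scores highest."""
    results = [(interp, execute(interp)) for interp in interpretations]
    return max(results, key=lambda pair: score(pair[1]))
```

Early convergence would instead prune the list of interpretations at every stage and never keep more than one.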


Semantic grammar ( 0 )
Semantic grammar
Domain specific grammar.

The grammar used to parse the sentence contains non-leaf structures that are specially designed for some domain. Each new application requires a different grammar.

From: Androutsopoulos, et al., Natural Language Interfaces to Databases - An Introduction

An example grammar rule from C-PHRASE

POST → ⟨"in the" · NP, λf.λx.f(x) ∧ City(x) ∧ (∃y)(State(y) ∧ x.state = y ∧ NP(y))⟩
Integrated knowledge base
Integrated knowledge base
The knowledge base is part of the system.

  • No need to convert a semantic structure to a knowledge base structure.

  • The system is not extendable at this point. It has no ready-made facilities to link to external knowledge bases.

Input / Output

User Input

Types of questions

Yes / No
Which / What / Who
How many
How ( 0 )


Answers questions about the Knowledge Base
Answers questions about the Knowledge Base
This is the basic function of a Natural Language Interface: to answer questions about a knowledge base.
Answers questions about the Goal Model ( 0 )
Answers questions about the Goal Model
The user may ask the system about the means and motives of the system (how and why).

Example from SHRDLU:
User: When did you pick it up?
SHRDLU: While I was stacking up the red cube, a large red block and a large green cube.
User: Why?
SHRDLU: To get rid of it.
Answers questions about the Domain Model ( 0 )
Answers questions about the Domain Model
The user may ask the system about the structure of the knowledge base, which is stored in the Domain Model.
The user may teach the system
The user may teach the system
The user may teach the system about new words or concepts from within the dialog.

An example user sentence: Call the biggest block "superblock" (SHRDLU)
Act on user input
Act on user input
The user may tell the system to actually do things, other than answer questions.

Example from SHRDLU:
User: Pick up a big red block.
Multiple sentences ( 0 )
Multiple sentences
The user input may consist of several sentences.

Example from RENDEZVOUS:
User: I want to find certain projects. Pipes were sent to them in feb. 1975.
RENDEZVOUS: This is what the system understands your query to be: print the name of every project to which a shipment of a part named pipe was sent during february 1975.
Partial sentences ( 0 )
Partial sentences
The user input may consist of part of a sentence (usually a noun phrase).

Example from C-PHRASE:
User: cities of over 100,000 people in the largest area mid-western state
Handle idioms
Handle idioms
The user may use expressions whose meaning cannot be analyzed, and need to be taken as-is.

Example from SHRDLU:
User: Thank you.
SHRDLU: You're welcome

Classic idiom example from ThoughtTreasure:
User: Peter kicked the bucket.
ThoughtTreasure (structure): [died Peter]

System Output


Spelling correction ( 0 )
Spelling correction
Example from RENDEZVOUS:
User: Give me their locatio also
RENDEZVOUS: Is the word 'locatio' intended to be: location? (yes or no)
Cooperative responses
Cooperative responses
The system goes beyond literally answering the question, it answers in a way that actually helps the user.

Example from SHRDLU:
User: Is it supported?
SHRDLU: Yes, by the table
Paraphrase knowledge base query ( 0 )
Paraphrase knowledge base query
The system is able to render the query, as it will be sent to the knowledge base, in a human-readable form.

It is used to check if the understanding of the system matches the intent of the user.

Example from RENDEZVOUS:
User: I want to find certain projects. Pipes were sent to them in feb. 1975.
RENDEZVOUS: This is what the system understands your query to be: print the name of every project to which a shipment of a part named pipe was sent during february 1975.
Canned responses
Canned responses
The system shows a fixed piece of text as a response to a question.

Example from SHRDLU:
User: Stack up two pyramids
SHRDLU: I can't.

Example from ThoughtTreasure:
User: How are you?
ThoughtTreasure: Very well, thank you.
Simple responses with variables
Simple responses with variables
The system shows simple pieces of text, with some variables, as a response.

Example from SHRDLU:
User: How many blocks are not in the box?
SHRDLU: Four of them
Generate full response
Generate full response
Example from ThoughtTreasure:
User: List my appointments.
ThoughtTreasure: You have an appointment with Ruth Northville at the Four Seasons in one hour. You have an appointment with Amy Newton on Thursday March 21, 1996 at eight pm.




Lexicon lookup
Lexicon lookup
Uses (among others) a lexicon to recognize tokens in a sentence.

Especially useful for compound nouns, like 'distance learning' that cannot be recognized by using space as a delimiter alone.

The lexicon may also provide the part-of-speech of the word, i.e. noun, verb, preposition, to be used in the parsing process.
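A greedy longest-match tokenizer is one way to realize lexicon lookup for compound nouns; the toy lexicon below is invented for illustration.

```python
# Toy lexicon; a compound noun is stored as a single multi-word entry.
LEXICON = {
    "distance": "noun",
    "learning": "noun",
    "distance learning": "noun",
}

def tokenize(sentence):
    """Greedy longest-match tokenization against the lexicon, so that
    'distance learning' becomes one token rather than two."""
    words = sentence.lower().split()
    tokens, i = [], 0
    while i < len(words):
        # Try the longest span starting at position i first.
        for j in range(len(words), i, -1):
            candidate = " ".join(words[i:j])
            if candidate in LEXICON:
                tokens.append(candidate)
                i = j
                break
        else:
            tokens.append(words[i])  # unknown word: keep as-is
            i += 1
    return tokens
```

The same lookup can return the stored part-of-speech for each recognized token.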
Morphological analysis
Morphological analysis
Removes the prefixes and suffixes of a word to find the root form (present in the lexicon)

For example: larger => large; finding => find; unable => able
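A minimal sketch of affix stripping, assuming a small lexicon of root forms; the rules below are illustrative, and a real analyzer handles many more spelling changes at the morpheme boundary.

```python
# Illustrative affix rules: (affix, replacement after stripping).
SUFFIX_RULES = [("ing", ""), ("er", "e"), ("er", ""), ("est", ""), ("s", "")]
PREFIX_RULES = [("un", "")]

def root_form(word, lexicon):
    """Strip affixes until the remainder is found in the lexicon;
    return None if no known root can be recovered."""
    if word in lexicon:
        return word
    for suffix, repl in SUFFIX_RULES:
        if word.endswith(suffix):
            candidate = word[: len(word) - len(suffix)] + repl
            if candidate in lexicon:
                return candidate
    for prefix, repl in PREFIX_RULES:
        if word.startswith(prefix):
            candidate = repl + word[len(prefix):]
            if candidate in lexicon:
                return candidate
    return None
```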
Open-ended token recognition
Open-ended token recognition
Recognizes words from an open-ended category that is not a good fit for a lexicon.

Examples are ordinals: 42, forty-two, forty-second

ThoughtTreasure: Date expressions such as the following are recognized (Mueller):

Monday March 11, 1996
March 11, 1996
March 1996
March '96
lundi le 11 mars 1996
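A sketch of open-ended recognition for the ordinal and cardinal examples above; the word tables are deliberately tiny, and a full spelled-out-number parser is elided.

```python
import re

# Tiny illustrative word tables; a real recognizer composes number
# words instead of enumerating them.
ORDINAL_WORDS = {"first": 1, "second": 2, "forty-second": 42}
CARDINAL_WORDS = {"one": 1, "two": 2, "forty-two": 42}

def recognize_number(token):
    """Recognize numeric tokens from an open-ended category:
    digits ('42'), digit ordinals ('42nd'), and number words."""
    if re.fullmatch(r"\d+", token):
        return int(token)
    if re.fullmatch(r"\d+(st|nd|rd|th)", token):
        return int(token[:-2])
    if token in ORDINAL_WORDS:
        return ORDINAL_WORDS[token]
    if token in CARDINAL_WORDS:
        return CARDINAL_WORDS[token]
    return None
```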
Proper names lookup in knowledge base
Proper names lookup in knowledge base
When a word is not present in the lexicon, the Knowledge Base is queried to find out whether the word is present as a proper name.
Proper names by matching
Proper names by matching
Proper names are recognized by fitting them into a pattern.

For example: [A-Z][a-z]* van der [A-Z][a-z]*
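The pattern above translates directly into a regular expression; a minimal sketch:

```python
import re

# The document's example pattern: a Dutch-style 'van der' surname.
NAME_PATTERN = re.compile(r"[A-Z][a-z]* van der [A-Z][a-z]*")

def find_proper_names(sentence):
    """Return all substrings that fit the proper-name pattern."""
    return NAME_PATTERN.findall(sentence)
```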
Quoted string recognition
Quoted string recognition
Recognizes quoted sentences as part of a sentence.

For example: Who said "Gravitation is not responsible for people falling in love"?

Examples from ThoughtTreasure

  • le film "Horizons lointains"
  • une chanson, "I will always love you"
  • the "Dangerous" album
  • What does (the word) "stupid" mean?
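Quoted spans can be captured with a regular expression and treated as opaque tokens; a sketch (the placeholder token is an invented convention):

```python
import re

QUOTED = re.compile(r'"([^"]*)"')

def extract_quotes(sentence):
    """Return the quoted spans and the sentence with each span
    replaced by a placeholder, so later phases treat it as one token."""
    quotes = QUOTED.findall(sentence)
    stripped = QUOTED.sub("<QUOTE>", sentence)
    return quotes, stripped
```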
Uses a part-of-speech tagger
Uses a part-of-speech tagger
An off-the-shelf part-of-speech tagger is used to determine the parts-of-speech of the words in a sentence.


Syntactic form type

Parse trees


Accept ungrammatical sentences ( 0 )
Accept ungrammatical sentences
Sentences that do not follow the system's grammar are not discarded out of hand. The system will make an effort to understand them and / or to make the user change them.
Drop non-essential words ( 0 )
Drop non-essential words
Words and phrases that are not important for the result of the query are ignored.

This may be part of tokenization, parsing, or semantic analysis.

Words that are not in the lexicon may not simply be dropped. Words may be dropped only if the user agrees or if they are part of a set of known superfluous words.

Example from RENDEZVOUS
User: What the hell does Jones supply?
RENDEZVOUS: drops 'hell' after checking with the user; the sentence 'What the does Jones supply?' is processed further.
Apply selectional restrictions
Apply selectional restrictions
The parser excludes sentences that violate (semantic) selectional restrictions that the verb (predicate) imposes on its arguments.

For example, the sentence "Sam drank a car" will not parse if the verb "drink" imposes the class "liquid" on its object.
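A sketch of the check, assuming a hypothetical type hierarchy and restriction table:

```python
# Hypothetical type assignments and verb restrictions, for illustration.
ISA = {"sam": "human", "water": "liquid", "car": "artifact"}
RESTRICTIONS = {"drink": {"subject": "human", "object": "liquid"}}

def parse_ok(verb, subject, obj):
    """Reject a reading whose arguments violate the verb's selectional
    restrictions (e.g. 'Sam drank a car')."""
    wanted = RESTRICTIONS.get(verb)
    if wanted is None:
        return True  # the verb imposes no restrictions
    return (ISA.get(subject) == wanted["subject"]
            and ISA.get(obj) == wanted["object"])
```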

Semantic Analysis


Analyse while parsing ( 0 )
Analyse while parsing
Semantic analysis takes place as part of the parsing process.

The alternative would be that semantic analysis only takes place after parsing is complete.

The semantic structures created for different parts of the parse tree may conflict, and when they do, this path is abandoned. This helps to cut down the number of possible parse trees.
Semantic attachment ( 0 )
Semantic attachment
Meaning structures are taken from the lexicon entries of the matched words and attached to them in the parse tree.
Semantic composition
Semantic composition
The meaning structure of a phrase, and the sentence as a whole is derived by composing the meaning of the words.
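Semantic composition is commonly realized as function application; a sketch using Python closures, with a toy model of blocks and colors (all denotations below are invented):

```python
# Word meanings as functions; composing a phrase is function application.
def every(entities):
    return lambda prop: all(prop(x) for x in entities)

def some(entities):
    return lambda prop: any(prop(x) for x in entities)

# Toy model: 'block' denotes a set of entities, 'is red' a property.
blocks = [{"color": "red"}, {"color": "red"}, {"color": "green"}]
is_red = lambda x: x["color"] == "red"

# "every block is red" composes as determiner(noun)(property):
every_block_is_red = every(blocks)(is_red)  # one block is green
some_block_is_red = some(blocks)(is_red)
```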
Morphological semantic composition ( 0 )
Morphological semantic composition
Compose the meaning of morphologically compound words by combining the meaning of the morphemes.

Example from SHRDLU:

Words like 'littlest' are not in the dictionary but are interpreted from the root forms like 'little'. (Winograd)
Use lexicon
Use lexicon
The system may use these attributes from a word in the lexicon for semantic analysis:
  • Semantic definition: a semantic form representation of the meaning of a word.
  • Grammatical relations: for verbs, the presence and position of the object and indirect object in the semantic form.
  • Semantic selectional restrictions: interpretations may be discarded if these restrictions don't match.
  • Phrasal verbs: in a word group like "Abe looks after Bob" "looks after" is turned into a single predicate with Bob as the object
  • Idioms: "X kicks the bucket" may be interpreted as "die(X)" in this phase
Modifier attachment ( 0 )
Modifier attachment
The problem is to identify the constituent to which each modifier has to be attached.

From: Androutsopoulos, et al., Natural Language Interfaces to Databases - An Introduction

An example from SHRDLU:

Put the blue pyramid on the block in the box.
Proper interpretation of conjunction and disjunction ( 0 )
Nominal compounds ( 0 )
Nominal compounds
An attempt is made to derive the meaning of compounds that are not in the lexicon.

Example from RENDEZVOUS
User: How many Whitney shipments have a shipdate 6/10/1975
RENDEZVOUS: The word 'Whitney' is unfamiliar. Is it one of the following? 1. supplier name 2. supplier location 3. supplier rating ...
Semantic conflict detection ( 0 )
Semantic conflict detection
The system detects conflicts in semantic structure information. For example: How many corners has a ball?
Quantifier scoping ( 0 )
Anaphora resolution

Commonsense reasoning

Commonsense types
Commonsense types
This system uses knowledge that people find "commonsense", to answer questions.
Question Answering
Rules and procedures designed to answer different types of questions in a cooperative manner.
Emotions
"A set of emotions is maintained for each actor in the context, and the weights of those emotions are decayed over time." (ThoughtTreasure)
Personal relations
Inference rules that update the attitudes between two persons (friendship, animosity)
Space, Time
Update the deictic center of the discourse model

ThoughtTreasure has many more of these commonsense collections, aptly called "understanding agents": for sleep, weather, showering, appointments, trade, occupation, analogy. The user can provide an ASCII representation (called a grid) of the contents of a space (i.e. a room) to inform the system of the location of things.

Example of space based commonsense from ThoughtTreasure:
User: Jeanne Püchle was where?
ThoughtTreasure: She was in the corner grocery.
User: She was near what electronic devices?
ThoughtTreasure: She was near the cash register.

Question answering
Personal relations


Plausibility resolution ( 0 )
Plausibility resolution
Determine if the semantic interpretation thus far is plausible with respect to the context.
Clarification dialog to improve input sentence
Clarification dialog to improve input sentence
The system interacts with the user (by asking extra questions) in order to establish exactly what the user means.

This subsystem makes use of domain knowledge.

Example from SHRDLU:
User: How many things are on top of green cubes?
SHRDLU: I'm not sure what you mean by "on top of" in the phrase "on top of green cubes".
Do you mean:
1 - directly on the surface
2 - anywhere on top of?
SHRDLU: Three of them

Example from ThoughtTreasure:
User: I want to buy a Fiat Spider.
ThoughtTreasure: A 124, a 2000, or a 1800?
User: A 124.
ThoughtTreasure: A 1978 Fiat 124 was for sale for 3000 dollars by Todd Spire at "".
Interpret speech act ( 0 )
Interpret speech act
In general:
A sentence that starts with a question word is a question. A sentence without a subject is an imperative.

But this system is also capable of correctly interpreting sentences like this: Can you tell me where I can find Chinese food? (not a yes/no question)

Example from RENDEZVOUS:
User: I want to find certain projects. Pipes were sent to them in feb. 1975.
RENDEZVOUS: This is what the system understands your query to be: print the name of every project to which a shipment of a part named pipe was sent during february 1975.



Deductive reasoning: reason from premises to conclusions.

Apply the deduction rules from the Domain Model to reach factual statements that are not stored literally in the Knowledge Base.
Proof by example ( 0 )
Proof by example
Conclude that something is possible from the existence of at least a single instance.

Example from SHRDLU:
User: can a pyramid be supported by a block?
The deductive system finds an actual example, so it knows this is possible. (Winograd)
Proof by custom procedure
Proof by custom procedure
A custom procedure implemented in code decides whether a statement is true or false.

Example from ThoughtTreasure

near(X, Y) is determined by invoking a space routine.

Conversion to knowledge base form


Syntactic rewrites
Syntactic rewrites
Rewrite the generic semantic form to a domain-specific semantic form.
Optimize query ( 0 )
Optimize query
The raw knowledge base query is rewritten for reasons of processing speed.
Restructure information ( 0 )

Knowledge base execution


Queries multiple knowledge bases for single request ( 0 )
Queries multiple knowledge bases for single request
The system queries multiple knowledge bases for the same sentence, and integrates the results.
Logical reasoning ( 0 )
Logical reasoning
The knowledge base itself contains inference rules that allow facts to be deduced from other facts.



Goal creation
Goal creation
When a certain state in the Knowledge Base triggers a goal creation rule from the domain model, a goal is created and placed in the Goal Model.
Plan execution
Plan execution
When a goal is active in the Goal Model, the system will find plans and actions that are geared towards fulfillment of the goal.

An action may be implemented by a custom procedure, or it may be a question asking the user for more information.
Process feedback
Process feedback
Interpret the response of the user in the context of an active plan.

For instance, when the user says "Yes" this may answer an active question by the system.



Using transformation rules
Using transformation rules
When generating a linguistic response (a sentence) based on a found answer (a semantic representation), transformation rules are used to create responses that do not sound rigid.

Examples from ThoughtTreasure

  • a friend of you => your friend
  • is not => isn't
  • le arbre => l'arbre
Generate pronouns
Generate pronouns
In the linguistic response, nouns that are part of the active context are replaced by pronouns (I, he, it).
Generate articles
Generate articles
In the linguistic response, nouns will need a proper article.

Examples from ThoughtTreasure

  • Elephants are smart (empty article)
  • An elephant is a mammal (indefinite article)
  • The elephant (definite article)
Generate aspects
Generate aspects
In the linguistic response, verbs will need to be expressed according to the correct aspect.

Aspect - Wikipedia



Learn new names
Learn new names
The user may teach the system new names for things.

Example from SHRDLU:
User: Call the biggest block "superblock"
User: Have you picked up superblock since we began?


If the name text agent encounters an unknown word in a context where it is likely to be a name (for example, after a first name, or carrying a prefix or suffix commonly used in names), it learns the new name.
Learn new words by telling ( 0 )
Learn new words by telling
The user may explain to the system what a word means.

Example from SHRDLU:
User: A "steeple" is a stack which contains two green cubes and a pyramid
User: Are there any steeples now?
Learn new words by deduction
Learn new words by deduction
The system may deduce the meaning of new words using derivation rules.

Example from ThoughtTreasure:
User: What is a xylophonephile?
ThoughtTreasure: A xylophonephile is a type of human. Xylophonephiles like xylophones.
Learn new facts
Learn new facts
The user may tell the system about a new fact for the knowledge base.

Example from SHRDLU:
User: The blue block is mine.
User: Do I own anything in the box?

Example from ThoughtTreasure:
User: In 1994, Kimba became chair of Sony.
ThoughtTreasure: (learns @1994:na|[chair-of Sony Kimba])
Learn new rules ( 0 )
Learn new rules
The user may tell the system about rules that apply to the knowledge base.

Example from SHRDLU:
User: I own blocks which are not red, but I don't own anything which supports a pyramid.
User: Do I own anything in the box?
Refuse to accept ( 0 )
Refuse to accept
Based on a contradiction with a known fact, the system refuses to accept what the user tells it.

Example from SHRDLU:
User: There were five blocks to the left of the box then.
SHRDLU: No, only four of them: the red cube, two large green cubes and a large red block

Temporary Data Structures

Semantic form

Semantic form type
Semantic form type
List based
A nested list of [predicate argument, argument, ...] where each argument can be another list.
Goal based
A nested structure of [goal means, means, ...] where each means can be another tree.
First Order Predicate Logic
A nested logical structure of functions and operators.

An example from ThoughtTreasure (list-based):
"Who directed Rendezvous in Paris?"
is represented as
	[director-of *RDP *human-interrogative-pronoun]
An example from SHRDLU (goal-based):
"Is any block supported by three pyramids?"
is represented as the Planner construct
	(THFIND 3 $?X2 (X2)
		(THGOAL (#SUPPORT $?X2 $?X1)))
An example from ORAKEL (FOPL):
"Which river passes through Berlin?"
is represented as
	?x (river(x) ∧ flow_through(x, Berlin))

Relational ( 0 )
List based
Goal based ( 0 )
First order Predicate Logic ( 0 )


Event based ( 0 )
Temporal ( 0 )
Uses constants for proper nouns ( 0 )

Knowledge base form


Handle aggregations ( 0 )


Database / Knowledge Base

Knowledge base type
Knowledge base type
The way data is stored in the knowledge base:
Relational
A relational database with tables
Tree based
A hierarchical database with trees
List based
Data is stored as a set of (nested) lists. The outermost list contains a predicate.

Relational ( 0 )
Tree based ( 0 )
List based


History of states and events ( 0 )
History of states and events
In order to answer questions about previous states, the system needs to keep track of its states and events, and how they were connected.

Example from SHRDLU
User: What did the red cube support before you started to clear it off?
SHRDLU: The green pyramid

Syntax to semantics mappings

Semantic composition type
Semantic composition type
Semantic composition is the process of building the meaning of a sentence from the meanings of the phrases and eventually, the words.
Production rules
A pattern -> action rule that maps a syntax tree sub-structure to its semantic form.
Lambda calculus
Custom procedures
Custom pieces of code act on the contents of parse tree nodes and attach semantic structures to them. Very flexible but can only be extended by a programmer with detailed knowledge of the system.

An example production rule from LUNAR:

	(S.V (OR (EQU 1 HAVE)
		     (EQU 1 CONTAIN))
	->(QUOTE (CONTAIN (# 1 1) ( # 3 1))) ]
S:CONTAIN is the name of the rule. The action follows the -> mark.
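A pattern -> action rule system can be sketched as follows; the tree shapes and the rule are invented, and only echo the spirit of the LUNAR rule above:

```python
# A parse tree as a nested tuple: (label, child, child, ...).
# A production rule pairs a match predicate with an action.
def matches_have_or_contain(tree):
    return tree[0] == "S" and tree[1][0] in ("HAVE", "CONTAIN")

def to_contain(tree):
    # Map (S, (VERB, subj, obj)) to the semantic form (CONTAIN subj obj).
    _, (verb, subj, obj) = tree
    return ("CONTAIN", subj, obj)

RULES = [(matches_have_or_contain, to_contain)]

def apply_rules(tree):
    """Return the semantic form produced by the first matching rule."""
    for matches, action in RULES:
        if matches(tree):
            return action(tree)
    return None
```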

Unification ( 0 )
Production rules ( 0 )
Lambda calculus ( 0 )
Custom procedures


Natural language
Natural language
Which natural languages are supported by this system? The majority of systems support only English.


Sentence types

Imperative ( 0 )

Phrase types
Phrase types
Clauses as objects; example from SHRDLU:
User: Find a block which is taller than the one I told you to pick up.
"you to pick up" is a clause that is treated as an object (Winograd)

Noun Phrases
Verb Phrases
Preposition Phrases
Determiner Phrases
ADVerb Phrases
ADJective Phrases
Relative Clauses
Modals ( 0 )
Comparative expressions ( 0 )
Passives ( 0 )
Clefts ( 0 )
There be ( 0 )
Clauses as objects ( 0 )
Extraposition ( 0 )


Ellipsis ( 0 )



Syntactic features
Syntactic features
A lexical entry has information about syntactic features:
the syntactic category (e.g. work: part-of-speech = verb)
singular or plural (e.g. birds: number = plural)
Semantic definition ( 0 )
Semantic definition
A lexical entry has a definition of the meaning of the word.

This usually includes a predicate.
Semantic form properties
Semantic form properties
The semantic form of a lexical entry has specific properties, other than just a predicate.

Examples from ThoughtTreasure:
Type of relation
Is the relation one-to-one, one-to-many, or many-to-many? This property is used in generation to determine if an expression is "the president of the US" or "a president of the US".
Fuzzy value
For the predicate "like-human", the word "like" has a fuzzy value in the range of 0.5 - 0.8, while "love" has a value of 0.8 to infinity.

Only irregular forms ( 0 )
Only irregular forms
The lexicon stores only irregular forms, like 'geese' and 'slept'. The regular morphological compound forms are derived by applying rules.
Grammatical relations
Grammatical relations
The lexicon codes which grammatical relations (like subject, object, and indirect object) a verb has.
A verb with only a subject
A verb with a subject and an object
A verb with a subject, an object and an indirect object

Coded grammatical relations help restrict the number of possible parse trees.

Note that verbs may have multiple "frames". For example: "That man eats" (intransitive) and "He eats apples" (mono-transitive).

Grammatical relation - Wikipedia
Semantic selectional restrictions
Semantic selectional restrictions
Also called "S-selection". The lexicon stores semantic constraints for each argument of a verb.

For example, the verb may contain these restrictions:
  • subject: instance of living organism
  • object: instance of a liquid
Selection (linguistics) - Wikipedia
Category selectional restrictions
Category selectional restrictions
Also called "C-selection". The lexicon stores syntactical constraints for each argument of a verb.

For instance, the lexicon may specify that an argument may be not just a noun phrase ("John talks to Peter"), but also a subordinate clause ("John tells Peter to go").

Selection (linguistics) - Wikipedia
The prototypical example of an idiom is "kick the bucket". It means: to die.

Such an idiom may be stored in the lexicon. Note that its structure must be stored, not just the surface form, since variants are also possible. E.g. "Our dog kicked the bucket."

Idiom - Wikipedia
Phrasal verbs
Phrasal verbs
The lexicon stores the structure of phrasal verbs.

There are three types of phrasal verbs:
prepositional phrasal verbs
verb + preposition. Ex: "Who is looking after the kids?"
particle phrasal verbs
verb + particle. Ex: "They brought that up twice."
particle-prepositional phrasal verbs
Verb + particle + preposition. Ex: "Who can put up with that?"

Explicit use of phrasal verbs helps restrict the number of possible parse trees. Also, it distinguishes between syntactically similar but semantically different verbs like "look" and "look after", and produces different semantic concepts ("look", "look-after").

Phrasal verb - Wikipedia
Examples of roles are "president-of", "mother-of", "nationality-of", "friend-of".

ThoughtTreasure stores these relations in the lexicon as surface form / semantic form.

Domain model


Uses an ontology
Uses an ontology
An ontology is an explicit description of the types, attributes, and relations of a domain.

See Wikipedia
Gradable adjectives
Gradable adjectives
The ontology uses weights to express the measure of adjectives.

From ThoughtTreasure

A weight is a value from -1.0 to +1.0. If a weight is not provided, it is assumed to be 0.55.

[hot A]: A is hot
[hot A 1.0]: A is extremely hot
[hot A 0.2]: A is slightly hot
[hot A -0.55]: A is cold
[hot A -0.7]: A is very cold
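The weight-to-phrase mapping can be sketched as follows; the thresholds are assumptions read off the examples above, not ThoughtTreasure's actual values:

```python
def describe_hotness(weight):
    """Map a gradable-adjective weight in [-1.0, 1.0] to a surface
    expression (illustrative thresholds)."""
    if weight >= 1.0:
        return "extremely hot"
    if weight >= 0.55:  # 0.55 is the assumed default weight
        return "hot"
    if weight > 0.0:
        return "slightly hot"
    if weight > -0.7:
        return "cold"
    return "very cold"
```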
Deduction rules
Deduction rules
IF/THEN inference rules to deduce facts from other facts.
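IF/THEN deduction can be sketched as naive forward chaining over ground facts; the facts and rules in the test are invented for illustration:

```python
def forward_chain(facts, rules):
    """Naive forward chaining: keep applying IF/THEN rules until no
    new fact is produced. A rule is a pair (premises, conclusion)
    over ground facts represented as tuples."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known
```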
A plan library
A plan library
A set of plans needed to reach certain goals. A plan consists of a goal and a set of actions, or lower level plans.
Procedures for representing and manipulation
Procedures for representing and manipulation
A set of custom procedures to represent the state of the knowledge base other than through language.

Also, a set of procedures to manipulate data or physical objects.

These procedures are accessible through natural language interaction.
Goal creation rules
Goal creation rules
A set of IF/THEN rules that specify under which Knowledge Base conditions a goal is created and added to the Goal Model.

Goal model


Instantiated goals ( 0 )
Instantiated goals
Goals, plans and actions. Goals are taken from the Domain Model and instantiated with data from the user query.

The plans are also taken from the Domain Model, and bound to variables from the goal.

User: Pick up a big red block.
The system answers "OK" when it carries out a command. In order to pick up the red block, it had to clear it off by finding a space for the green one and moving the green one away. (Winograd)
History of goals, plans and actions ( 0 )
History of goals, plans and actions
In order to answer questions about its motives, the system needs to keep track of its goals and plans, and how they were connected.

Example from SHRDLU:
User: Why did you do that?
SHRDLU: To clear off the red cube

Discourse model

Deictic center
Deictic center
Some words can only be understood in reference to an origin. This origin is called the deictic center.
Words like 'he', 'their'
Words like 'yesterday'
Words like 'there', 'this'

Hence the system needs to update the origin with each new sentence.



Keep track of active plans ( 0 )
Keep track of active plans
The system needs to remember what plans it is currently working on.
Baseball (1959?), Bert F. Green, Jr., Alice K. Wolf, Carol Chomsky, Kenneth Laughery
The Conversation Machine (1959), L.E.S. Green, E.C. Berkeley, C.C. Gotlieb
ALA (1960), Householder, Lyons, Thorne
The Oracle (1960), A.V. Philips
PLM (1961?), R. Kirsch, D. Cohen, B. Rankin, W. Sillars
The General Inquirer (1962?), P. Stone
Protosynthex (1963?), Simmons, McConlogue, Klein
SAD SAM (1963?), R. Lindsay
SIR (1964?), Bertram Raphael
SQA (1964?), F. Black
STUDENT (1964?), D. Bobrow
The Cooper System (1964?), William S. Cooper
The Darlington Logic Programs (1964?), J.L. Darlington
ELIZA (1964), Joseph Weizenbaum
DEACON (1965?), F. Thompson, J. Craig
SHRDLU (1968), Terry Winograd
REL (1969?), Bozena H. Dostert, Frederick B. Thompson
LUNAR (1971), William A. Woods
PARRY (1971), K.M. Colby, H. Enea, L. Tesler, D.C. Smith
MARGIE (1973), Roger C. Schank, Neil M. Goldman, Charles J. Rieger, Chris Riesbeck
RENDEZVOUS (1974), Edgar F. Codd, Robert S. Arnold, Jean-Marc Cadiou, Chin-Liang Chang, Nick Roussopoulos
SAM (1975), Roger Schank, Cullingford
Lifer (1976?), Gary G. Hendrix
EUFID (1976), Marjorie Templeton, John F. Burger
PAM (1976), Robert Wilensky
Veronica Dahl's systems (1976), Veronica Dahl
QUALM (1977), Wendy G. Lehnert
FAUSTUS, formerly PAMELA (1978?), Peter Norvig
Ladder (1978?), Gary G. Hendrix, Earl D. Sacerdoti, Daniel Sagalowicz, Jonathan Slocum
Politics (1979?), Carbonell
TEAM (1980), Barbara J. Grosz, Douglas E. Appelt, Fernando C.N. Pereira, David H.D. Warren, Paul Martin, Armar Archbold, Robert C. Moore, Jerry Hobbs, Jane J. Robinson, Daniel Sagalowicz
Plot Units (1981?), Lehnert
Chat-80 (1981), David H.D. Warren, Fernando C.N. Pereira
IR-NLI (1982?), Giovanni Guida, Carlo Tasso
PANDORA (1982), Joseph (Joe) Faletti
Ask (1983?), Bozena H. Thompson, Frederick B. Thompson
PHLIQA1 (1983?), Remko J.H. Scha
CYC (1984), Douglas Lenat
CLE (1986), Hiyan Alshawi, David Carter, Jan van Eijk, Björn Gambäck, Robert C. Moore, Douglas B. Moran, Fernando C.N. Pereira, Stephen G. Pulman, Manny Rayner, Arnold G. Smith
DAYDREAMER (1987), Erik T. Mueller
Unix Consultant (1988?)
JANUS (1989?), Philip Resnik, Erhard W. Hinrichs, Ralph M. Weischedel, Marie Meteer, Lance Ramshaw, Jeff Palmucci, Damaris M. Ayuso, Robert J. Bobrow
ThoughtTreasure (1994), Erik T. Mueller
Faq Finder (1997?), Robin D. Burke, Kristian J. Hammond, Vladimir A. Kulyukin, Steven L. Lytinen, Noriko Tomuro, Scott Schoenberg
Lasso (1999?), Moldovan D., Harabagiu S., Pasca M., Mihalcea R., Goodrum R., Girju R., Rus V.
Falcon (2000?), Moldovan D., Harabagiu S., Pasca M., Mihalcea R., Goodrum R., Girju R., Rus V.
MULDER (2000?), Cody Kwok, Oren Etzioni, Daniel S. Weld
Qalc (2000?), Ferret O., Grau B., Huraults-Plantet M.
DIMAP (2001?), Kenneth C. Litkowski
START (2002?), Boris Katz, Gary Borchardt, Sue Felshin, Deniz Yuret, Ali Ibrahim, Jimmy Lin, Gregory Marton, Alton Jerome McFarland, Baris Temelkuran, Yuan Shen, Gabriel Zaccak
Precise (2003?), Ana-Maria Popescu, Oren Etzioni, Henry Kautz
CALO (2003), about 250 people
ORAKEL (2004), Philipp Cimiano, Peter Haase, Jörg Heizmann, Matthias Mantel, Rudi Studer
NaLIX (2005?), Yunyao Li, Huahai Yang, H. V. Jagadish
Ephyra (2006?), N. Schlaefer, P. Gieselmann, G. Sautter
Ginseng (2006?), Bernstein A., Kauffmann E., Kaiser C., Kiefer C.
Watson (2006), David Ferrucci and many others
AquaLog (2007?), Lopez V., Uren V., Motta E., Pasin M.
Panto (2007?), Wang C., Xiong M., Zhou Q., Yu Y.
Qristal (2007?), Laurent D., Séguéla P., NègreCross S.
WebQA (2007?), Parthasarathy S., Chen J.
C-PHRASE (2008), Michael Minock
Qacid (2009?), Fernandez O., Izquierdo R., Ferrandez S., Vicedo J. L.
FREyA (2010?), Damljanovic D., Agatonovic M., Cunningham H.
Pythia (2011?), Unger C., Cimiano P.
PowerAqua (2012?), Lopez V., Fernández M., Motta E., Stieler N.
QAKiS (2012?), Cabrio E., Cojan J., Aprosio A. P., Magnini B., Lavelli A., Gandon F.
TBSL (2012?), Unger C., Bühmann L., Lehmann J., Ngomo A. C. N., Gerber D., Cimiano P.
LodQA (2013?), Kim J. D., Cohen K. B.
Squall (2013?), Ferré S.
CASIA@V2 (2014?), Shizhu H., Yuanzhe Z., Liu K., Zhao J.
Swip (2014?), Pradel C., Haemmerlé O., Hernandez N.