Natural Language Interface Gems

Systems properties

General

Code

Programming language
Which programming language is used to implement the natural language processing core components of the system? This excludes the languages that are only used to interact with the system.

APL ( 1 )
C ( 2 )
Fortran ( 1 )
Lisp ( 4 )
Prolog ( 2 )
Java ( 1 )

System structure

Type of analysis
The main categories of natural language interfaces
Pattern matching
Literal occurrences of a pattern in a sentence are converted directly to parts of a DB query
Syntax based
A sentence is parsed and the parse tree is mapped directly to a DB query
Semantics based
After a sentence is parsed, it is first converted into an intermediate semantic expression, which is in turn converted into a DB query


From: Androutsopoulos, et al., Natural Language Interfaces to Databases - An Introduction

Pattern matching ( 1 )
Syntax based (maps parse tree to DB query) ( 1 )
Semantics based (via semantic intermediate) ( 7 )
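
As an illustration of the first category, here is a minimal Python sketch of the pattern-matching approach: literal patterns map directly to fragments of a DB query. The patterns and SQL templates are hypothetical, not taken from any of the systems covered here.

import re

PATTERNS = [
    # hypothetical pattern -> query-template pairs
    (re.compile(r"capital of (\w+)", re.I),
     "SELECT capital FROM country WHERE name = '{0}'"),
    (re.compile(r"population of (\w+)", re.I),
     "SELECT population FROM country WHERE name = '{0}'"),
]

def to_query(sentence):
    for pattern, template in PATTERNS:
        match = pattern.search(sentence)
        if match:
            return template.format(match.group(1))
    return None

print(to_query("What is the capital of France?"))
# SELECT capital FROM country WHERE name = 'France'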

Ambiguity
How does the system deal with the ambiguity in the input sentence?

Ambiguity occurs at the tokenization phase (the word 'de' may be part of a last name, or it may be an article), at the parsing phase (causing multiple parse trees), and at the semantic analysis phase (quantifier scoping problems, for example).
Early convergence
Apply as many restrictions as possible, as early in the process as possible. At each stage, keep only a single interpretation.
Late convergence
Keep all interpretations open, and score the final results. In the end, pick the interpretation that gives the 'best' result.


Early convergence ( 3 )
Late convergence ( 1 )
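
The difference between the two strategies can be sketched in Python; the stages (each mapping one reading to several alternatives) and the scoring function are invented for illustration.

def score(reading):
    # Hypothetical scoring function; a real system might count satisfied
    # constraints or knowledge-base hits.
    return len(reading)

def early_convergence(reading, stages):
    # Commit to the single best interpretation at every stage.
    for stage in stages:
        reading = max(stage(reading), key=score)
    return reading

def late_convergence(reading, stages):
    # Keep every interpretation open; score only the final results.
    readings = [reading]
    for stage in stages:
        readings = [r2 for r in readings for r2 in stage(r)]
    return max(readings, key=score)

With a reliable scorer, late convergence cannot miss the best final reading; early convergence is cheaper, but may commit to a path that turns out to be suboptimal.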

Features

Semantic grammar ( 2 )
Domain-specific grammar.

The grammar used to parse the sentence contains non-leaf structures that are specially designed for some domain. Each new application requires a different grammar.

From: Androutsopoulos, et al., Natural Language Interfaces to Databases - An Introduction

An example grammar rule from C-PHRASE

POST → ⟨"in the" · NP, λf.λx. f(x) ∧ City(x) ∧ (∃y)(State(y) ∧ x.state = y.name ∧ NP(y))⟩
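
A Python sketch of how such a domain-specific rule could be represented and applied; the toy knowledge base and all names are invented, and the nested functions mirror the lambda term above.

KB = [
    {"kind": "state", "name": "Texas"},
    {"kind": "city", "name": "Houston", "state": "Texas"},
]

def post_rule(f, np):
    # POST -> "in the" NP
    # mirrors: λf.λx. f(x) ∧ City(x) ∧ (∃y)(State(y) ∧ x.state = y.name ∧ NP(y))
    return lambda x: (f(x)
                      and x["kind"] == "city"
                      and any(y["kind"] == "state"
                              and x["state"] == y["name"]
                              and np(y)
                              for y in KB))

is_city = lambda x: x["kind"] == "city"      # the incoming restriction f
np_texas = lambda y: y["name"] == "Texas"    # stand-in for a parsed NP meaning
post = post_rule(is_city, np_texas)
print([x["name"] for x in KB if post(x)])    # ['Houston']
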
Integrated knowledge base ( 3 )
The knowledge base is part of the system.

Advantages:
  • No need to convert a semantic structure to a knowledge base structure.


Disadvantages:
  • The system is not extensible on this point: it has no ready-made facilities to link to external knowledge bases.

Input / Output

User Input

Types of questions

Yes / No ( 6 )
Which / What / Who ( 8 )
How many ( 5 )
When ( 2 )
How ( 1 )
Why ( 3 )

Features

Answers questions about the Knowledge Base ( 9 )
This is the basic function of a Natural Language Interface: to answer questions about a knowledge base.
Answers questions about the Goal Model ( 1 )
The user may ask the system about the means and motives of the system (how and why).

Example from SHRDLU:
User: When did you pick it up?
SHRDLU: While I was stacking up the red cube, a large red block and a large green cube.
User: Why?
SHRDLU: To get rid of it.
Answers questions about the Domain Model ( 1 )
The user may ask the system about the structure of the knowledge base, which is stored in the Domain Model.
The user may teach the system ( 3 )
The user may teach the system about new words or concepts from within the dialog.

An example user sentence: Call the biggest block "superblock" (SHRDLU)
Act on user input ( 3 )
The user may tell the system to actually do things, other than answer questions.

Example from SHRDLU:
User: Pick up a big red block.
Multiple sentences ( 1 )
The user input may consist of several sentences.

Example from RENDEZVOUS:
User: I want to find certain projects. Pipes were sent to them in feb. 1975.
RENDEZVOUS: This is what the system understands your query to be: print the name of every project to which a shipment of a part named pipe was sent during february 1975.
Partial sentences ( 2 )
The user input may consist of part of a sentence (usually a noun phrase).

Example from C-PHRASE:
User: cities of over 100,000 people in the largest area mid-western state
Handle idioms ( 2 )
The user may use expressions whose meaning cannot be analyzed and that need to be taken as-is.

Example from SHRDLU:
User: Thank you.
SHRDLU: You're welcome


Classic idiom example from ThoughtTreasure:
User: Peter kicked the bucket.
ThoughtTreasure structure: [died Peter]

System Output

Features

Spelling correction ( 4 )
Example from RENDEZVOUS:
User: Give me their locatio also
RENDEZVOUS: Is the word 'locatio' intended to be: location? (yes or no)
User: yes
Cooperative responses ( 3 )
The system goes beyond literally answering the question; it answers in a way that actually helps the user.

Example from SHRDLU:
User: Is it supported?
SHRDLU: Yes, by the table
Paraphrase knowledge base query ( 3 )
The system is able to turn the query, as it will be sent to the knowledge base, into a human-readable form.

It is used to check whether the system's understanding matches the user's intent.

Example from RENDEZVOUS:
User: I want to find certain projects. Pipes were sent to them in feb. 1975.
RENDEZVOUS: This is what the system understands your query to be: print the name of every project to which a shipment of a part named pipe was sent during february 1975.
Canned responses ( 2 )
The system shows a fixed piece of text as a response to a question.

Example from SHRDLU:
User: Stack up two pyramids
SHRDLU: I can't.


Example from ThoughtTreasure:
User: How are you?
ThoughtTreasure: Very well, thank you.
Simple responses with variables ( 2 )
The system shows simple pieces of text, with some variables, as a response.

Example from SHRDLU:
User: How many blocks are not in the box?
SHRDLU: Four of them
Generate full response ( 1 )
Example from ThoughtTreasure:
User: List my appointments.
ThoughtTreasure: You have an appointment with Ruth Northville at the Four Seasons in one hour. You have an appointment with Amy Newton on Thursday March 21, 1996 at eight pm.

Processes

Tokenization

Features

Lexicon lookup ( 9 )
Uses a lexicon (among other means) to recognize tokens in a sentence.

Especially useful for compound nouns, like 'distance learning', that cannot be recognized by using whitespace as a delimiter alone.

The lexicon may also provide the part-of-speech of the word (e.g. noun, verb, preposition), to be used in the parsing process.
Morphological analysis ( 5 )
Removes the prefixes and suffixes of a word to find the root form (present in the lexicon).

For example: larger => large; finding => find; unable => able
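
A Python sketch of this kind of affix stripping against a small lexicon of root forms; the rules and the lexicon are illustrative only.

LEXICON = {"large", "find", "able"}
SUFFIX_RULES = [("er", "e"), ("ing", ""), ("est", "e")]  # suffix -> replacement
PREFIXES = ["un"]

def root(word):
    if word in LEXICON:
        return word
    for suffix, replacement in SUFFIX_RULES:
        if word.endswith(suffix):
            candidate = word[:-len(suffix)] + replacement
            if candidate in LEXICON:
                return candidate
    for prefix in PREFIXES:
        if word.startswith(prefix) and word[len(prefix):] in LEXICON:
            return word[len(prefix):]
    return None

print(root("larger"), root("finding"), root("unable"))  # large find able
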
Open-ended token recognition ( 5 )
Recognizes words from an open-ended category that is not a good fit for a lexicon.

Examples are ordinals: 42, forty-two, forty-second

ThoughtTreasure: Date expressions such as the following are recognized (Mueller):

Monday March 11, 1996
March 11, 1996
March 1996
March '96
lundi le 11 mars 1996
...
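
A Python sketch of open-ended recognition for the English date expressions above; the regular expression is illustrative and far from complete.

import re

WEEKDAY = r"(Monday|Tuesday|Wednesday|Thursday|Friday|Saturday|Sunday)"
MONTH = (r"(January|February|March|April|May|June|July|August"
         r"|September|October|November|December)")
DATE = re.compile("(" + WEEKDAY + r"\s+)?" + MONTH
                  + r"(\s+\d{1,2},)?(\s+(\d{4}|'\d{2}))")

for text in ("Monday March 11, 1996", "March 11, 1996",
             "March 1996", "March '96"):
    print(bool(DATE.fullmatch(text)), text)   # True for all four
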
Proper names lookup in knowledge base ( 5 )
When a word is not present in the lexicon, the Knowledge Base is queried to find out whether the word occurs as a proper name.
Proper names by matching ( 1 )
Proper names are recognized by fitting them into a pattern.

For example: [A-Z][a-z]* van der [A-Z][a-z]*
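
Applying the example pattern is a one-liner in Python:

import re

NAME = re.compile(r"[A-Z][a-z]* van der [A-Z][a-z]*")
print(NAME.search("a letter from Jan van der Berg arrived").group())
# Jan van der Berg
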
Quoted string recognition ( 3 )
Recognizes quoted strings as part of a sentence.

For example: Who said "Gravitation is not responsible for people falling in love"?

Examples from ThoughtTreasure

  • le film "Horizons lointains"
  • une chanson, "I will always love you"
  • the "Dangerous" album
  • What does (the word) "stupid" mean?
Uses a part-of-speech tagger ( 1 )
An off-the-shelf part-of-speech tagger is used to determine the parts-of-speech of the words in a sentence.

Parsing

Syntactic form type

Parse trees ( 6 )

Features

Accept ungrammatical sentences ( 1 )
Sentences that do not follow the system's grammar are not discarded out of hand. The system will make an effort to understand them and/or to have the user rephrase them.
Drop non-essential words ( 1 )
Words and phrases that are not important for the result of the query are ignored.

This may be part of tokenization, parsing, or semantic analysis.

Words that are not in the lexicon may not simply be dropped; a word may be dropped only if the user agrees, or if it is part of a set of known superfluous words.

Example from RENDEZVOUS
User: What the hell does Jones supply?
RENDEZVOUS: drops 'hell' after checking with the user; the sentence 'What the does Jones supply?' is processed further.
Apply selectional restrictions ( 1 )
The parser excludes sentences that violate (semantic) selectional restrictions that the verb (predicate) imposes on its arguments.

For example, the sentence "Sam drank a car" will not parse if the verb "drink" imposes the class "liquid" on its object.
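
A Python sketch of such a check; the miniature type hierarchy and verb table are invented for illustration.

CLASS_OF = {"water": "liquid", "beer": "liquid", "car": "artifact"}
RESTRICTIONS = {"drink": {"object": "liquid"}}  # verb -> required class per role

def satisfies(verb, role, noun):
    required = RESTRICTIONS.get(verb, {}).get(role)
    return required is None or CLASS_OF.get(noun) == required

print(satisfies("drink", "object", "beer"))  # True
print(satisfies("drink", "object", "car"))   # False: the parse is rejected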

Semantic Analysis

Features

Analyse while parsing ( 2 )
Semantic analysis takes place as part of the parsing process.

The alternative would be that semantic analysis only takes place after parsing is complete.

The semantic structures created for different parts of the parse tree may conflict, and when they do, this path is abandoned. This helps to cut down the number of possible parse trees.
Semantic attachment ( 6 )
Meaning structures are taken from the lexicon entries of the matched words and attached to them in the parse tree.
Semantic composition ( 6 )
The meaning structure of a phrase, and of the sentence as a whole, is derived by composing the meanings of the words.
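
A Python sketch of composition by function application: the meaning of "red block" is built from the meanings attached to "red" and "block". The representations are illustrative.

block = lambda x: x["shape"] == "block"                       # "block"
red = lambda noun: lambda x: x["color"] == "red" and noun(x)  # "red"

red_block = red(block)  # composition by function application
print(red_block({"shape": "block", "color": "red"}))    # True
print(red_block({"shape": "pyramid", "color": "red"}))  # False
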
Morphological semantic composition ( 1 )
Compose the meaning of morphologically compound words by combining the meaning of the morphemes.

Example from SHRDLU:

Words like 'littlest' are not in the dictionary but are interpreted from root forms like 'little'. (Winograd)
Use lexicon ( 1 )
The system may use these attributes from a word in the lexicon for semantic analysis:
  • Semantic definition: a semantic form representation of the meaning of a word.
  • Grammatical relations: for verbs, the presence and position of the object and indirect object in the semantic form.
  • Semantic selectional restrictions: interpretations may be discarded if these restrictions don't match.
  • Phrasal verbs: in a word group like "Abe looks after Bob", "looks after" is turned into a single predicate with Bob as the object.
  • Idioms: "X kicks the bucket" may be interpreted as "die(X)" in this phase.
Modifier attachment ( 1 )
The problem is to identify the constituent to which each modifier has to be attached.

From: Androutsopoulos, et al., Natural Language Interfaces to Databases - An Introduction

An example from SHRDLU:

Put the blue pyramid on the block in the box.
Proper interpretation of conjunction and disjunction ( 1 )
Nominal compounds ( 1 )
An attempt is made to derive the meaning of compounds that are not in the lexicon.

Example from RENDEZVOUS
User: How many Whitney shipments have a shipdate 6/10/1975
RENDEZVOUS: The word 'Whitney' is unfamiliar. Is it one of the following? 1. supplier name 2. supplier location 3. supplier rating ...
User: 1
Semantic conflict detection ( 4 )
The system detects conflicts in semantic structure information. For example: How many corners has a ball?
Quantifier scoping ( 4 )
Anaphora resolution ( 6 )

Commonsense reasoning

Commonsense types
This system uses knowledge that people consider "common sense" to answer questions.
Question Answering
Rules and procedures designed to answer different types of questions in a cooperative manner.
Emotions
"A set of emotions is maintained for each actor in the context, and the weights of those emotions are decayed over time." (ThoughtTreasure)
Personal relations
Inference rules that update the attitudes between two persons (friendship, animosity)
Space, Time
Update the deictic center of the discourse model


ThoughtTreasure has many more of these commonsense collections, aptly called "understanding agents": for sleep, weather, showering, appointments, trade, occupation, analogy. The user can provide an ASCII representation (called a grid) of the contents of a space (e.g. a room) to inform the system of the location of things.

Example of space based commonsense from ThoughtTreasure:
User: Jeanne Püchle was where?
ThoughtTreasure: She was in the corner grocery.
User: She was near what electronic devices?
ThoughtTreasure: She was near the cash register.

Question answering ( 3 )
Emotions ( 1 )
Personal relations ( 1 )
Space ( 2 )
Time ( 1 )

Features

Plausibility resolution ( 2 )
Determine if the semantic interpretation thus far is plausible with respect to the context.
Clarification dialog to improve input sentence ( 5 )
The system interacts with the user (by asking extra questions) in order to establish exactly what the user means.

This subsystem makes use of domain knowledge.

Example from SHRDLU:
User: How many things are on top of green cubes?
SHRDLU: I'm not sure what you mean by "on top of" in the phrase "on top of green cubes".
Do you mean:
1 - directly on the surface
2 - anywhere on top of?
User: 2
SHRDLU: Three of them


Example from ThoughtTreasure:
User: I want to buy a Fiat Spider.
ThoughtTreasure: A 124, a 2000, or a 1800?
User: A 124.
ThoughtTreasure: A 1978 Fiat 124 was for sale for 3000 dollars by Todd Spire at "toddspi@quapaw.astate.edu".
Interpret speech act ( 2 )
In general:
A sentence that starts with a question word is a question. A sentence without a subject is an imperative.

But this system is also capable of correctly interpreting sentences like: "Can you tell me where I can find Chinese food?" (not a yes/no question)

Example from RENDEZVOUS:
User: I want to find certain projects. Pipes were sent to them in feb. 1975.
RENDEZVOUS: This is what the system understands your query to be: print the name of every project to which a shipment of a part named pipe was sent during february 1975.

Inference

Features

Deduction ( 3 )
Deductive reasoning: reason from premises to conclusions.

Apply the deduction rules from the Domain Model to reach factual statements that are not stored literally in the Knowledge Base.
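
A Python sketch of forward chaining with one such rule; the facts and the rule are invented.

facts = {("parent", "Ann", "Bob"), ("parent", "Bob", "Carol")}

def grandparent_rule(facts):
    # IF parent(x, y) AND parent(y, z) THEN grandparent(x, z)
    return {("grandparent", x, z)
            for (p1, x, y1) in facts if p1 == "parent"
            for (p2, y2, z) in facts if p2 == "parent" and y1 == y2}

facts |= grandparent_rule(facts)
print(("grandparent", "Ann", "Carol") in facts)  # True: not stored literally
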
Proof by example ( 1 )
Conclude that something is possible from the existence of at least a single instance.

Example from SHRDLU:
User: can a pyramid be supported by a block?
SHRDLU: YES.
The deductive system finds an actual example, so it knows this is possible. (Winograd)
Proof by custom procedure ( 1 )
A custom procedure implemented in code decides whether a statement is true or false.

Example from ThoughtTreasure

near(X, Y) is determined by invoking a space routine.

Conversion to knowledge base form

Features

Syntactic rewrites ( 6 )
Rewrite the generic semantic form into the domain-specific semantic form used in a particular domain.
Optimize query ( 2 )
The raw knowledge base query is rewritten for reasons of processing speed.
Restructure information ( 1 )

Knowledge base execution

Features

Queries multiple knowledge bases for single request ( 0 )
The system queries multiple knowledge bases for the same sentence, and integrates the results.
Logical reasoning ( 2 )
Logical reasoning
The knowledge base itself contains inference rules that allow facts to be deduced from other facts.

Planning

Features

Goal creation ( 3 )
When a certain state in the Knowledge Base triggers a goal creation rule from the domain model, a goal is created and placed in the Goal Model.
Plan execution ( 2 )
When a goal is active in the Goal Model, the system will find plans and actions that are geared towards fulfillment of the goal.

An action may be implemented by a custom procedure, or it may be a question asking the user for more information.
Process feedback ( 2 )
Interpret the response of the user in the context of an active plan.

For instance, when the user says "Yes" this may answer an active question by the system.

Generation

Features

Using transformation rules ( 1 )
When generating a linguistic response (a sentence) based on a found answer (a semantic representation), transformation rules are used to create responses that do not sound rigid.

Examples from ThoughtTreasure

  • a friend of you => your friend
  • is not => isn't
  • le arbre => l'arbre
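
A Python sketch that applies such rules as plain string rewrites; a real system would match semantic or syntactic structures rather than substrings.

RULES = [
    ("a friend of you", "your friend"),
    ("is not", "isn't"),
    ("le arbre", "l'arbre"),
]

def smooth(sentence):
    for old, new in RULES:
        sentence = sentence.replace(old, new)
    return sentence

print(smooth("Peter is not a friend of you"))  # Peter isn't your friend
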
Generate pronouns ( 1 )
In the linguistic response, nouns that are part of the active context are replaced by pronouns (I, he, it).
Generate articles ( 1 )
In the linguistic response, nouns will need a proper article.

Examples from ThoughtTreasure

  • Elephants are smart (empty article)
  • An elephant is a mammal (indefinite article)
  • The elephant (definite article)
Generate aspects ( 1 )
In the linguistic response, verbs will need to be expressed according to the correct aspect.

Aspect - Wikipedia

Learning

Features

Learn new names ( 2 )
The user may teach the system new names for things.

Example from SHRDLU:
User: Call the biggest block "superblock"
User: Have you picked up superblock since we began?
SHRDLU: YES.

ThoughtTreasure

If the name text agent encounters an unknown word in a context where it is likely to be a name (such as after a first name), or if it has a prefix or suffix commonly used in names, it learns the new name.
Learn new words by telling ( 1 )
The user may explain to the system what a word means.

Example from SHRDLU:
User: A "steeple" is a stack which contains two green cubes and a pyramid
User: Are there any steeples now?
SHRDLU: NO.
Learn new words by deduction ( 2 )
The system may deduce the meaning of new words using derivation rules.

Example from ThoughtTreasure:
User: What is a xylophonephile?
ThoughtTreasure: A xylophonephile is a type of human. Xylophonephiles like xylophones.
Learn new facts ( 3 )
The user may tell the system about a new fact for the knowledge base.

Example from SHRDLU:
User: The blue block is mine.
User: Do I own anything in the box?
SHRDLU: YES. TWO THINGS: THE BLUE BLOCK AND THE BLUE PYRAMID.


Example from ThoughtTreasure:
User: In 1994, Kimba became chair of Sony.
ThoughtTreasure: (learns @1994:na|[chair-of Sony Kimba])
Learn new rules ( 1 )
The user may tell the system about rules that apply to the knowledge base.

Example from SHRDLU:
User: I own blocks which are not red, but I don't own anything which supports a pyramid.
User: Do I own anything in the box?
SHRDLU: YES. TWO THINGS: THE BLUE BLOCK AND THE BLUE PYRAMID.
Refuse to accept ( 1 )
Based on a contradiction with a known fact, the system refuses to accept what the user tells it.

Example from SHRDLU:
User: There were five blocks to the left of the box then.
SHRDLU: No, only four of them: the red cube, two large green cubes and a large red block

Temporary Data Structures

Semantic form

Semantic form type
Relational
?
List-based
A nested list of [predicate argument, argument, ...] where each argument can be another list.
Goal-based
A nested structure of [goal means means, ...] where each means can be another tree
First Order Predicate Logic
A nested logical structure of functions and operators


An example from ThoughtTreasure (list-based):
"Who directed Rendezvous in Paris?"
is represented as
[preterit-indicative
	[director-of *RDP *human-interrogative-pronoun]]
An example from SHRDLU (goal-based):
"Is any block supported by three pyramids?"
is represented as the Planner construct
(THFIND ALL $?X1 (X1)
	(THGOAL (#IS $?X1 #BLOCK))
	(THFIND 3 $?X2 (X2)
		(THGOAL (#IS $?X2 #PYRAMID))
		(THGOAL (#SUPPORT $?X2 $?X1))))
An example from Orakel (FOPL):
"Which river passes through Berlin?"
is represented as
?x (river(x) ∧ flow_through(x, Berlin))

Relational ( 4 )
List based ( 1 )
Goal based ( 1 )
First order Predicate Logic ( 0 )

Features

Event based ( 2 )
Temporal ( 1 )
Uses constants for proper nouns ( 3 )

Knowledge base form

Features

Handle aggregations ( 7 )

Models

Database / Knowledge Base

Knowledge base type
The way data is stored in the knowledge base:
Relational
A relational database with tables
Tree based
A hierarchical database with trees
List based
Data is stored as a set of (nested) lists. The outermost list contains a predicate.

Relational ( 6 )
Tree based ( 1 )
List based ( 3 )

Features

History of states and events ( 1 )
In order to answer questions about previous states, the system needs to keep track of its states and events, and how they were connected.

Example from SHRDLU
User: What did the red cube support before you started to clear it off?
SHRDLU: The green pyramid

Syntax to semantics mappings

Semantic composition type
Semantic composition is the process of building the meaning of a sentence from the meanings of the phrases and eventually, the words.
Unification
Feature structures of constituents are merged (unified) to build the semantic form of the larger constituent.
Production rules
A pattern -> action rule that maps a syntax tree sub-structure to its semantic form.
Lambda calculus
Meanings are lambda expressions; the meaning of a constituent is built by applying the meaning of one part to the meaning of another (function application).
Custom procedures
Custom pieces of code act on the contents of parse tree nodes and attach semantic structures to them. Very flexible but can only be extended by a programmer with detailed knowledge of the system.

An example production rule from LUNAR:

[ S:CONTAIN
	(S.NP (MEM 1 SAMPLE))
	(S.V (OR (EQU 1 HAVE)
	         (EQU 1 CONTAIN)))
	(S.OBJ (MEM 1 (ELEMENT OXIDE ISOTOPE)))
	-> (QUOTE (CONTAIN (# 1 1) (# 3 1))) ]
S:CONTAIN is the name of the rule. The action follows the -> mark.
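
A Python sketch of the same idea: a pattern over a parse-tree fragment plus an action that builds the semantic form. The tree encoding and all names are invented.

def s_contain(tree):
    # pattern: the NP is a sample, the verb is 'have' or 'contain', and the
    # object is an element, oxide or isotope
    if (tree["NP"]["class"] == "sample"
            and tree["V"] in ("have", "contain")
            and tree["OBJ"]["class"] in ("element", "oxide", "isotope")):
        # action: produce the semantic form
        return ("CONTAIN", tree["NP"]["head"], tree["OBJ"]["head"])
    return None

tree = {"NP": {"class": "sample", "head": "S10046"},
        "V": "contain",
        "OBJ": {"class": "element", "head": "silicon"}}
print(s_contain(tree))  # ('CONTAIN', 'S10046', 'silicon')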

Unification ( 1 )
Production rules ( 3 )
Lambda calculus ( 1 )
Custom procedures ( 2 )

Grammar

Natural language
Which natural languages are supported by this system? The majority of systems support only English.

English ( 10 )
French ( 1 )

Sentence types

Question ( 10 )
Declarative ( 3 )
Imperative ( 4 )

Phrase types
Clauses as objects; example from SHRDLU:
User: Find a block which is taller than the one I told you to pick up.
"you to pick up" is a clause that is treated as an object (Winograd)

Noun Phrases ( 9 )
Verb Phrases ( 8 )
Preposition Phrases ( 9 )
Determiner Phrases ( 5 )
ADVerb Phrases ( 3 )
ADJective Phrases ( 4 )
Relative Clauses ( 6 )
Negations ( 5 )
Conjunctions ( 4 )
Anaphora ( 4 )
Auxiliaries ( 4 )
Modals ( 1 )
Comparative expressions ( 4 )
Passives ( 2 )
Clefts ( 1 )
There be ( 1 )
Clauses as objects ( 2 )
Extraposition ( 1 )

Features

Ellipsis ( 2 )

Lexicon

Features

Syntactic features ( 5 )
A lexical entry has information about syntactic features.
part-of-speech
the syntactic category (e.g. work: part-of-speech = verb)
number
singular or plural (e.g. birds: number = plural)
Semantic definition ( 3 )
A lexical entry has a definition of the meaning of the word.

This usually includes a predicate.
Semantic form properties ( 1 )
The semantic form of a lexical entry has specific properties, other than just a predicate.

Examples from ThoughtTreasure:
Type of relation
Is the relation one-to-one, one-to-many, or many-to-many? This property is used in generation to determine if an expression is "the president of the US" or "a president of the US".
Fuzzy value
For the predicate "like-human", the word "like" has a fuzzy value in the range of 0.5 - 0.8, while "love" has a value of 0.8 to infinity.


Only irregular forms ( 2 )
The lexicon stores only irregular forms, like 'geese' and 'slept'. Regular morphological forms are derived by applying rules.
Grammatical relations ( 1 )
The lexicon codes which grammatical relations (like subject, object, and indirect object) a verb has.
intransitive
A verb with only a subject
mono-transitive
A verb with a subject and an object
ditransitive
A verb with a subject, an object and an indirect object


Coded grammatical relations help restrict the number of possible parse trees.

Note that verbs may have multiple "frames". For example: "That man eats" (intransitive) and "He eats apples" (mono-transitive).

Grammatical relation - Wikipedia
Semantic selectional restrictions ( 2 )
Also called "S-selection". The lexicon stores semantic constraints for each argument of a verb.

For example, the verb may contain these restrictions:
  • subject: instance of living organism
  • object: instance of a liquid
Selection (linguistics) - Wikipedia
Category selectional restrictions ( 1 )
Also called "C-selection". The lexicon stores syntactical constraints for each argument of a verb.

For instance, the lexicon may specify that an argument may be not only a noun phrase ("John talks to Peter") but also a subordinate clause ("John tells Peter to go").

Selection (linguistics) - Wikipedia
Idioms ( 1 )
The prototypical example of an idiom is "kick the bucket". It means: to die.

Such an idiom may be stored in the lexicon. Note that its structure must be stored, not just the surface form, since variants are also possible. E.g. "Our dog kicked the bucket."

Idiom - Wikipedia
Phrasal verbs ( 1 )
The lexicon stores the structure of phrasal verbs.

There are three types of phrasal verbs:
prepositional phrasal verbs
verb + preposition. Ex: "Who is looking after the kids?"
particle phrasal verbs
verb + particle. Ex: "They brought that up twice."
particle-prepositional phrasal verbs
Verb + particle + preposition. Ex: "Who can put up with that?"


Explicit use of phrasal verbs helps restrict the number of possible parse trees. Also, it distinguishes between syntactically similar but semantically different verbs like "look" and "look after", and produces different semantic concepts ("look", "look-after").

Phrasal verb - Wikipedia
Roles ( 1 )
Examples of roles are "president-of", "mother-of", "nationality-of", "friend-of".

ThoughtTreasure stores these relations in the lexicon as surface form / semantic form.

Domain model

Features

Uses an ontology ( 5 )
An ontology is an explicit description of the types, attributes, and relations of a domain.

See Wikipedia
Gradable adjectives ( 1 )
The ontology uses weights to express the degree of gradable adjectives.

From ThoughtTreasure

A weight is a value from -1.0 to +1.0. If a weight is not provided, it is assumed to be 0.55.

Examples
[hot A]: A is hot
[hot A 1.0]: A is extremely hot
[hot A 0.2]: A is slightly hot
[hot A -0.55]: A is cold
[hot A -0.7]: A is very cold
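
A Python sketch that maps such weights back to surface degrees, following the table above; the thresholds and the antonym table are invented.

def realize(adjective, weight=0.55):
    # Negative weights flip to the antonym; the magnitude picks an adverb.
    antonym = {"hot": "cold"}
    word = adjective if weight >= 0 else antonym[adjective]
    w = abs(weight)
    adverb = ("extremely " if w >= 0.9 else
              "very " if w >= 0.7 else
              "slightly " if w <= 0.3 else "")
    return "A is " + adverb + word

for w in (0.55, 1.0, 0.2, -0.55, -0.7):
    print(w, "->", realize("hot", w))
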
Deduction rules ( 2 )
IF/THEN inference rules to deduce facts from other facts.
A plan library ( 2 )
A set of plans needed to reach certain goals. A plan consists of a goal and a set of actions, or lower level plans.
Procedures for representing and manipulation ( 1 )
A set of custom procedures to represent the state of the knowledge base other than through language.

Also, a set of procedures to manipulate data or physical objects.

These procedures are accessible through natural language interaction.
Goal creation rules ( 2 )
A set of IF/THEN rules that specify under which Knowledge Base conditions a goal is created and added to the Goal Model.

Goal model

Features

Instantiated goals ( 1 )
Goals, plans and actions. Goals are taken from the Domain Model and instantiated with data from the user query.

The plans are also taken from the Domain Model, and bound to variables from the goal.

From SHRDLU:
User: Pick up a big red block.
SHRDLU: OK
The system answers "OK" when it carries out a command. In order to pick up the red block, it had to clear it off by finding a space for the green one and moving the green one away. (Winograd)
History of goals, plans and actions ( 1 )
In order to answer questions about its motives, the system needs to keep track of its goals and plans, and how they were connected.

Example from SHRDLU:
User: Why did you do that?
SHRDLU: To clear off the red cube

Discourse model

Deictic center
Some words can only be understood in reference to an origin. This origin is called the deictic center.
Person
Words like 'he', 'their'
Time
Words like 'yesterday'
Space
Words like 'there', 'this'


Hence the system needs to update the origin with each new sentence.

Person ( 2 )
Time ( 2 )
Space ( 2 )

Features

Keep track of active plans ( 1 )
The system needs to remember what plans it is currently working on.

The systems

SHRDLU: 1968, Terry Winograd
LUNAR: 1971, William A. Woods
RENDEZVOUS: 1974, Edgar F. Codd, Robert S. Arnold, Jean-Marc Cadiou, Chin-Liang Chang, Nick Roussopoulos
EUFID: 1976, Marjorie Templeton, John F. Burger
TEAM: 1980, Barbara J. Grosz, Douglas E. Appelt, Fernando C.N. Pereira, David H.D. Warren, Paul Martin, Armar Archbold, Robert C. Moore, Jerry Hobbs, Jane J. Robinson, Daniel Sagalowicz
Chat-80: 1981, David H.D. Warren, Fernando C.N. Pereira
CLE: 1986, Hiyan Alshawi, David Carter, Jan van Eijk, Björn Gambäck, Robert C. Moore, Douglas B. Moran, Fernando C.N. Pereira, Stephen G. Pulman, Manny Rayner, Arnold G. Smith
ThoughtTreasure: 1994, Erik T. Mueller
CALO: 2003, about 250 people
C-PHRASE: 2008, Michael Minock