
MAI0203 Lecture 10: Knowledge Representation and Engineering


Methods of Artificial Intelligence
WS 2002/2003
Part II: Inference and Knowledge Representation

II.10 Knowledge Representation and Knowledge Engineering



Introduction
Knowledge representation was the “big topic” of AI in
the 1980s: development of representation
formalisms/languages and tools
Focus on expert systems (instead of general
mechanisms, such as problem solving and inference)
The knowledge bottleneck (Feigenbaum): acquisition
of expert knowledge and its formal representation
(knowledge engineering)
alternative: symbolic approaches to machine learning
Approaches to knowledge representation:
Logic/Prolog, Terminological Logic, Semantic Nets,
Frames/Object-Oriented Representation, Procedural
Representation/Production Systems, Analogical
Representation, ...
Ongoing Discussion
Talk by Pim Haselager (8th of Jan. 2003, IKW
Colloquium): “Embodied embedded cognition and the
addiction to representations in cognitive science”



Knowledge
Knowledge base vs. database
Traditional definition in philosophy and linguistics:
if “P knows that X” is true, then the following hold:
X is true (knowledge is always true!)
P believes that X holds
P can justify why X holds
in cognitive psychology: content of long-term memory;
justified, subjective belief
in AI: content of the knowledge base of a system
(might be revisable, remember default logic)

There are a lot of specialized concepts of knowledge ...



Semantic vs. Episodic K.
General/Semantic/Conceptual knowledge: knowledge
about the meaning of concepts, their attributes and
relations; knowledge which is viewed as “valid without
restrictions”, in the form of propositions or rules
Apples are red, A dog is a mammal, Birds lay eggs
vs.
Episodic/Case-based knowledge: refers to concrete
examples, includes context information (place and
time)
My dog Lassie likes to sleep on the sofa



Declarative vs. Procedural K.
Declarative/explicit knowledge: Knowledge about facts,
verbalizable, accessible, represented in the form of
propositions, frames, explicit rules
vs.
Procedural/implicit knowledge: knowledge which can
immediately be transformed into actions, not
verbalizable, not accessible, represented in the form of
production rules, algorithms, functions



Further Kinds of K.
World knowledge vs. language specific knowledge
Heuristic knowledge (cf. problem solving)
Strategic knowledge, Meta knowledge
Background knowledge
Domain-specific knowledge, common-sense
knowledge
...



Knowledge Representation
Knowledge itself is a representation of “real facts”
(“Das logische Bild der Tatsachen ist der Gedanke”, i.e.
“The logical picture of the facts is the thought”;
Wittgenstein, Tractatus)
Knowledge is a model (in the logical sense), that is, a
truth-preserving mapping from “the real world” to a
model which can be expressed in some (formal)
language.
Representation means the construction of a
model for some part of reality.



Knowledge Representation cont.
[Figure after Palmer (1978): a represented world of four objects a–d and
several representing worlds a’–d’, in which relations of the represented
world such as “taller than”, “wider than” or “larger than” are mapped onto
different representing relations: “longer than”, “larger than”, “points to”,
“chains to”.]
What do I represent and how do I represent it?



Representation and Computation
The Roman and Arabic numeral systems are
equivalent representations for positive integers (they can
be transformed into each other), but: arithmetic operations
can be performed more efficiently in the Arabic system
(see the sketch below)!
Equivalence of (symbolic) representations (Larkin &
Simon, 1987): informational equivalence (the representations
can be transformed into each other) vs. computational
equivalence
(any inference which can be drawn easily and quickly in one can also be
drawn easily and quickly in the other)
Representation and computation are interdependent!
(cf. computer programs: explicit storage in data
structures vs. operations working on the data)
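A small Prolog illustration of this difference (the symbol table and the
restriction to purely additive numerals such as XVII are assumptions made
for brevity; subtractive forms like IV are not handled):

value('I', 1).    value('V', 5).     value('X', 10).    value('L', 50).
value('C', 100).  value('D', 500).   value('M', 1000).

roman_to_int([], 0).
roman_to_int([S|Ss], N) :- value(S, V), roman_to_int(Ss, Rest), N is V + Rest.

/* ?- roman_to_int(['X','V','I','I'], N).  -->  N = 17 */

Both notations carry the same information about the number (informational
equivalence), but column-wise addition or multiplication only becomes
available once the positional (Arabic) form has been computed.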



Logical Representation
Suited for solving inference problems
First-order logic/Prolog
Restriction to inheritance nets, concept languages,
terminological logic, description logic
Introduced by Brachman (1979) as a variant of
semantic networks
Special formalisms: KL-ONE, BACK, CLASSIC



Concept
Concept: elementary unit of knowledge
abstraction over objects/facts/events
classification based on common attributes and
relations
Bachelor ≡ And(Unmarried, Adult, Male)   (CLASSIC)
∀x: Bachelor(x) ↔ Unmarried(x) ∧ Adult(x) ∧ Male(x)   (FOL)
Grounding problem!
Psychological research: e.g. E. Rosch (Prototype
theory)
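A minimal Prolog sketch of the definition above (only the “if” half of the
biconditional can be stated as a rule; the individual tom and its facts are
hypothetical examples):

bachelor(X) :- unmarried(X), adult(X), male(X).

unmarried(tom).  adult(tom).  male(tom).   /* hypothetical facts */

/* ?- bachelor(tom).  -->  yes */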



Terminological Logic
Representation of concepts in a (tractable) subset of
FOL (no negation, no disjunction)
Solving classification tasks: for a new concept, find the
set of most specific concepts in the knowledge base
which subsume the new concept (see the “Staubsauger”
(vacuum cleaner) example, lecture 10 of MAI 01/02; a
sketch of subsumption follows below)
hierarchical semantic net: isa-relations and
property-relations, inheritance
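A minimal sketch of subsumption testing, assuming (as a simplification) that
a concept is just a conjunction of atomic properties; the predicate names
are illustrative, not KL-ONE/CLASSIC syntax:

concept(adult_male, [adult, male]).
concept(bachelor,   [unmarried, adult, male]).

/* General subsumes Specific if every property required by General
   is also required by Specific */
subsumes(General, Specific) :-
    concept(General, PG), concept(Specific, PS), all_in(PG, PS).

all_in([], _).
all_in([P|Ps], L) :- member(P, L), all_in(Ps, L).

/* ?- subsumes(adult_male, bachelor).  -->  yes
   Classification then means collecting the most specific concepts
   that subsume a newly defined concept. */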



Example
A hierarchical net with isa and hasprop links:
Theo --isa--> trout --isa--> fish --isa--> animal
Carl --isa--> carp --isa--> fish
heart --isa--> organ
animal --hasprop--> heart,  organ --hasprop--> tissue,  tissue --hasprop--> cell



Semantics (sets and relations)
Concepts: {trout, carp, fish, animal, heart, organ, tissue, cell}
Instances: {Theo, Carl}
Theo ∈ trout, Carl ∈ carp, trout ⊆ fish, carp ⊆ fish, fish ⊆ animal, heart ⊆ organ
ISA = {(Theo, trout), (Carl, carp), (trout, fish), (carp, fish), (fish, animal), (heart, organ)}
HASPROP = {(animal, heart), (organ, tissue), (tissue, cell)}
Transitive closure:
if (x, y) ∈ ISA and (y, z) ∈ ISA then (x, z) ∈ ISA
if (x, y) ∈ HASPROP and (y, z) ∈ HASPROP then (x, z) ∈ HASPROP
if (x, y) ∈ HASPROP and (y, z) ∈ ISA then (x, z) ∈ HASPROP
if (x, y) ∈ ISA and (y, z) ∈ HASPROP then (x, z) ∈ HASPROP


PROLOG
is_a(fish,animal).
is_a(trout,fish).
is_a(carp,fish).
is_a(theo,trout).
is_a(carl,carp).
is_a(heart,organ).
has_prop(animal,heart).
has_prop(organ,tissue).
has_prop(tissue,cell).

isa(A,B) :- is_a(A,B).
isa(A,C) :- is_a(A,B), isa(B,C). /* Transitivity of isa */

has(A,X) :- has_prop(A,X).
has(X,Z) :- has_prop(X,Y), has(Y,Z). /* Transitivity of has */

has(A,X) :- has_prop(A,Y), isa(Y,X). /* Generalization of has wrt isa */


has(A,X) :- is_a(A,B), has(B,X). /* Inheritance of has wrt isa */
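
/* Example queries against this knowledge base (the answers follow
   from the facts and rules above):
   ?- isa(theo, animal).  -->  yes  (theo -> trout -> fish -> animal)
   ?- has(theo, heart).   -->  yes  (inherited from animal)
   ?- has(theo, cell).    -->  yes  (heart is_a organ, organ has tissue,
                                     tissue has cell)                   */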



Ontology
A hierarchical structure of concepts is called an
ontology
In psychology, there are empirical investigations about
the development of ontologies (e.g. children learn very
early to distinguish animate from inanimate objects)
In AI systems (and semantic web): problem of the
knowledge engineer is to create ontologies which are
shared by a large community of people



Semantic Nets and Case Frames
Hierarchical net/terminological logic: concepts and
properties as nodes, set/subset relations (isa) and
property relations as arcs
In general: A graph with “objects” as nodes and
labelled arcs as relations between objects
Attribute-Value Structures: prop(Obj, Att, Value)
Example: prop(Ball71, color, Red)   (Object: Ball71, Attribute: color, Value: Red)

Case Frames: (Fillmore, 1971)


[Case frame net for “John gave Mary the book”: the node event1 is an
instance of the verb “gave”; its agent arc points to John, its object arc
to book69 (an instance of book), and its receiver arc to Mary.]
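One possible encoding of this example as prop(Object, Attribute, Value)
triples in Prolog (a sketch using the constants from the slide):

prop(ball71,  color,    red).

prop(event1,  instance, gave).
prop(event1,  agent,    john).
prop(event1,  object,   book69).
prop(event1,  receiver, mary).
prop(book69,  instance, book).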



Examples for Cases
Agent: instigator of an event
Object: entity that moves or changes or whose position
or existence is in consideration
Instrument: stimulus or direct physical cause of an
event
Location: place where an event happens
Time: time when an event happens
Source: place from which something moves
Destination/Goal: place to which something moves
Recipient: recipient of an event

(see also Theta-Theory in linguistics)


Deep Structure
Representation of natural language sentences in
semantic nets:
Extract the events: based on the verbs
Each verb is associated with one or more case
frames
give(agent, object, recipient)
discriminate: giving a physical object vs. giving a
kiss/affection!
Case frames represent the deep structure, e.g. no
distinction between active and passive:
John gave Mary the book.
The book was given to Mary by John.



Inference in Semantic Nets
For classification, see hierarchical nets
Question answering: based on pattern matching and
unification
What was given to Mary?
[Query net: a gave event with receiver Mary and object ?X, matched against
the stored network.]

Activation networks: intersection search (e.g.
declarative memory in ACT, Anderson)
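With the prop/3 triples sketched above, question answering by pattern
matching becomes a single Prolog query (the ?X of the query net turns into
a logic variable):

/* "What was given to Mary?"
   ?- prop(E, instance, gave), prop(E, receiver, mary), prop(E, object, X).
   -->  E = event1, X = book69                                            */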



Problems of Semantic Nets
Find a “minimal” and “complete” set of roles to
describe events (current proposal in linguistics: use
“proto roles”, e.g. only “proto agent” roles and “proto
patient” roles, Dowty)
Unclear semantics of relations
no distinction between the instance relation (between
instances and concepts) and the subset relation (between
subconcepts and superconcepts): isa(book69, book) vs. isa(trout, fish)
part-of: ∀x, y: partof(x, y) → loc(x) ⊆ loc(y)
only leads to valid conclusions for physical part-whole
relations (e.g. not for being part of a family)
There are a lot of degrees of freedom in constructing
representations (knowledge engineering problem)
A failed project? CYC (Lenat et al.): how to include common-sense
knowledge
Evidence for Propositional Rep.
Sachs (1967): humans do not store the surface structure of
sentences

Text about the life of Galileo, with variation of one sentence:


1. He sent a letter about it to Galileo, the great Italian
scientist.
2. He sent Galileo, the great Italian scientist, a letter
about it.
3. Galileo, the great Italian scientist, sent him a letter
about it.
Already after 80 further syllables had been read, sentences 1
and 2 could no longer be discriminated.



Evidence for Prop. Rep. cont.
McKoon & Ratcliff (1980): Evidence that people represent
propositions and not sequences of words
Sentence Proposition
The (1) businessman gestured to a (2) waiter. (p1) gesture(businessman, waiter)
The waiter brought (3) coffee. (p2) bring(waiter, coffee)
The coffee stained the (4) napkins. (p3) stain(coffee, napkins)
The napkins protected the (5) tablecloth. (p4) protect(napkin, tablecloth)
The businessman flourished (6) documents. (p5) flourish(businessman, documents)
The documents explained a (7) contract. (p6) explain(documents, contract)
The contract satisfied the (8) client. (p7) satisfy(contract, client)
[Propositional network: P1 at the centre, connected to the chain P2, P3, P4
and to the chain P5, P6, P7.]

Priming experiment: “document” is recognized faster after the prime
“waiter” than e.g. “client” is recognized after “napkins”
(word distance 4 in both cases)
Frame Problem
Problem of representing that common-sense knowledge
which allows us to predict consequences of interactions
with the environment, that is, things that change and things
that do not change when an action is performed.

Originally formulated in the context of the situation calculus by
McCarthy (see lecture on Planning)



Frames
Closely related to semantic nets with inheritance
In psychology: schema (Bartlett, 1932, reconstructive
memory; Rumelhart & Ortony, 1977)
In AI: Minsky (1975)
closely related to object-oriented representation
structured representation of expectations about
objects/events
slots with values (constraints/ranges), default
instantiations
Frame hierarchies with inheritance
Example: “Apple” frame with slots for color, taste,
region of growth; super-frame “Fruit”
Problem: multiple inheritance!
Frames cont.
RESTAURANT Frame
Specialization-of: BUSINESS-ESTABLISHMENT
Types:
range: (Cafeteria, Seat-Yourself, Wait-to-be-seated)
default: Wait-to-be-seated
if-needed: IF plastic-orange-counter THEN Fast-food
IF stack of trays THEN cafeteria
...
Location:
range: an ADDRESS
if-needed: (Look at the Menu)
Name:
if-needed: (Look at the Menu)
Food-Style:
range: (Burgers, Chinese, American, Seafood, French)
default: American
Times-of-Operation:
range: a Time-of-Day
default: open evenings except Mondays
Payment-Form:
range: (Cash, Credit Card, Check, Washing-Dishes-Script)
Event-Sequence:
default: Eat-at-Restaurant Script
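A minimal sketch of how such slots with defaults could be encoded in Prolog
(the predicates and the instance joes_diner are hypothetical; real frame
systems offer far richer machinery):

slot(restaurant, food_style, default, american).
slot(restaurant, types,      default, wait_to_be_seated).

instance_of(joes_diner, restaurant).
fact(joes_diner, food_style, burgers).

/* explicit slot value if present, otherwise the frame's default */
value(Inst, Slot, V) :- fact(Inst, Slot, V).
value(Inst, Slot, V) :- \+ fact(Inst, Slot, _),
                        instance_of(Inst, Frame),
                        slot(Frame, Slot, default, V).

/* ?- value(joes_diner, food_style, V).  -->  V = burgers
   ?- value(joes_diner, types, V).       -->  V = wait_to_be_seated */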
Scripts
Schank (1972) proposed a set of primitive acts from
which more complex actions can be built (semantic
primitives, conceptual dependency)
Such acts can be used in scripts
Restaurant-Script
Scene 1: Entering
S PTRANS S into restaurant, S ATTEND eyes to tables, S MBUILD where to sit, S
PTRANS S to table, S MOVE S to sitting position
Scene 2: Ordering
S PTRANS menu to S (menu already on table), S MBUILD choice of food, S
MTRANS signal to waiter, waiter PTRANS to table, S MTRANS ’I want food’ to waiter,
waiter PTRANS to cook
Scene 3: Eating
Cook ATRANS food to waiter, waiter PTRANS food to S, S INGEST food
Scene 4: Exiting
waiter MOVE write check, waiter PTRANS to S, waiter ATRANS check to S, S
ATRANS money to waiter, S PTRANS out of restaurant
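One way to make such a script machine-readable is to encode each scene as a
list of primitive-act terms, e.g. in Prolog (the term notation below is an
assumption for illustration, not Schank's original formalism):

scene(restaurant, entering,
      [ ptrans(s, s, restaurant),          /* S moves self into restaurant */
        attend(s, eyes, tables),           /* S looks at the tables        */
        mbuild(s, where_to_sit),           /* S decides where to sit       */
        ptrans(s, s, table),               /* S walks to the table         */
        move(s, s, sitting_position) ]).   /* S sits down                  */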
Procedural Representation
Internal representation of state changes/action
sequences
Procedural semantics: “understanding” a sentence means
that the corresponding action can be performed (the meaning of
“take the hammer” is given by the procedure that allows
the system to grasp it)
Concepts can be represented as classification rules
(e.g. learned with a decision tree algorithm or with a
feed-forward net)
Explicit representation of rules: can be used in a
declarative or in a procedural way; suitable for
analogical reasoning (Rumelhart & Norman, 1981)



Procedural Representation cont.
Procedural knowledge in LOGO
define square(:x)
loop(4, &(forward(:x), right(90)))

define pentagon(:x)
loop(5, &(forward(:x), right(72)))

Knowledge of what a square is: the action sequence to
draw one
Analogy: a pentagon is like a square, but with 5 sides
instead of 4 and with 72-degree angles instead of
90-degree angles.



Analogical/Direct Representation
Direct/analogical means that “the structure of the
representation gives information about the structure of
what is represented” (Sloman, 1971)
E.g., a street map is a direct representation of a city in
the sense that the distance between two points on the
map corresponds to the distance of the places they
represent.
Diagrammatical reasoning
Algorithms: Scanning (over pixel matrices)
In psychology: Mental Models, Imagery
Mental rotation experiments (Shepard & Metzler, 1971)



Analogical Representation cont.
Experiments by Kosslyn (1981) consistent with the direct
representation assumption:
subjects memorized a map of an island with some
landmarks (well, lake, swamp, hut)
task: (1) imagine the memorized map, (2) focus on
one landmark, (3) refocus on another landmark
result: the time for shifting attention increases linearly
with the distance between the landmarks



Analogical Representation cont.
compare:
Propositional: leftOf(A, B), leftOf(B, C), leftOf(C, D)
Analogical:    A  B  C  D
Is A to the left of D?
In the propositional representation: inference using the transitivity rule.
In the analogical representation: the information is given explicitly.
But: our introspective impression that we “just see” that A
is left of D nevertheless relies on computation (of a
different kind).
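The propositional variant, spelled out as a small Prolog program (a sketch;
the name of the transitive predicate is an assumption):

left_of(a, b).
left_of(b, c).
left_of(c, d).

left_of_star(X, Y) :- left_of(X, Y).
left_of_star(X, Z) :- left_of(X, Y), left_of_star(Y, Z).

/* ?- left_of_star(a, d).  -->  yes, derived by chaining three facts,
   whereas in the analogical representation "A B C D" the answer can
   simply be read off.                                                */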



Knowledge Engineering
A knowledge engineer is involved with
Knowledge acquisition: obtaining and formalizing knowledge
Knowledge operationalization: design and
implementation
Tools: KADS (Schreiber, Wielinga & Breuker, 1993):
Modeling on the “knowledge level” (Newell)



Knowledge Engineering cont.
The problem is the acquisition:
Only for very small and concrete domains is an
analytic description possible; otherwise, you
need to obtain the knowledge from domain experts.
Expert knowledge is typically highly automated and not
verbalizable (try indirect methods
instead of interviews)
There is no methodology to select “suitable”
granularity and representation format (cf.
requirement and design specification in software
engineering)



Human K. Acquisition and Expertise
Learning by being told:
rote learning
generalization learning with an external teacher
(supervised learning)
Learning by doing: Compilation of rules by practice
(e.g., Anderson, 1983)
We seldom learn “really new” things: new experience
is relative to already existing knowledge
Research on human expertise:
Large amount of domain-knowledge
efficient knowledge organization (e.g. chunks,
Chase & Simon, 1973)
High meta-cognitive skills (Gruber & Strube, 1989):
evaluation of the state of a partial solution
Automated skills
The Running Gag of MAI 02/03
Question: How many AI people does it take to change a lightbulb?
Answer: At least 67.
5th part of the solution: The Knowledge Engineering Group (6)
One to study electricians changing lightbulbs
One to arrange for the purchase of the Lisp machines
One to assure the customer that this is a hard problem and
that great accomplishments in theory will come from support
of this effort
The same one can negotiate the project budget
One to study related research
One to indicate how it is a description of human
lightbulb-changing behavior
One to call the Lisp hackers (“Artificial Intelligence”, Rich & Knight)
