AI Unit 3
UNIT-03
Ms. Neha Yadav
Assistant Professor
Applied Computational Science and Engineering
Knowledge Representation
• Knowledge Representation in AI is the study of how the beliefs, intentions, and
judgments of an intelligent agent can be expressed suitably for automated reasoning.
• One of the primary purposes of Knowledge Representation is modeling intelligent
behavior for an agent.
• It is responsible for representing information about the real world so that a computer
can understand it and utilize this knowledge to solve complex real-world problems,
such as diagnosing a medical condition or communicating with humans in natural language.
What to Represent?
Following are the kinds of knowledge which need to be represented in AI systems:
Object: All the facts about objects in our world domain. E.g., guitars contain strings; trumpets are
brass instruments.
Events: Events are the actions which occur in our world.
Performance: It describes behavior which involves knowledge about how to do things.
Meta-knowledge: It is knowledge about what we know.
Facts: Facts are the truths about the real world and what we represent.
Knowledge-Base: The central component of knowledge-based agents is the knowledge base, denoted as KB.
The knowledge base is a group of sentences (here, "sentence" is used as a technical term and is not
identical to a sentence in the English language).
Different Types of Knowledge
1. Declarative Knowledge:
• Declarative knowledge is knowledge about something.
• It includes concepts, facts, and objects.
• It is also called descriptive knowledge and is expressed in declarative sentences.
• It is simpler than procedural knowledge.
2. Procedural Knowledge
• It is also known as imperative knowledge.
• Procedural knowledge is a type of knowledge which is responsible for knowing how to do something.
• It can be directly applied to any task.
• It includes rules, strategies, procedures, agendas, etc.
• Procedural knowledge depends on the task on which it can be applied.
Different Types of Knowledge
3. Meta-knowledge:
Knowledge about the other types of knowledge is called Meta-knowledge.
4. Heuristic knowledge:
• Heuristic knowledge represents the knowledge of experts in a field or subject.
• Heuristic knowledge consists of rules of thumb based on previous experience and awareness of
approaches that are likely to work but are not guaranteed.
5. Structural knowledge:
• Structural knowledge is basic knowledge for problem-solving.
• It describes relationships between various concepts such as kind of, part of, and grouping
of something.
• It describes the relationship that exists between concepts or objects.
AI knowledge cycle
Artificial intelligence systems usually consist of various components that together produce their
intelligent behavior. Some of these components include:
1. Perception
2. Learning
3. Knowledge Representation & Reasoning
4. Planning
5. Execution
AI knowledge cycle
1. Perception Block
• This will help the AI system gain information regarding its surroundings through various sensors, thus
making the AI system familiar with its environment and helping it interact with it.
• These senses can be in the form of typical structured data or other forms such as video, audio, text,
time, temperature, or any other sensor-based input.
2. Learning Block
• The knowledge gained will help the AI system to run the deep learning algorithms.
• These algorithms are written in the learning block, making the AI system transfer the necessary
information from the perception block to the learning block for learning (training).
AI knowledge cycle
3. Knowledge and Reasoning Block
• As mentioned earlier, we use knowledge, reason based on it, and then take decisions.
• Thus, these two blocks act much as humans do: they go through all the knowledge data and
find the relevant pieces to provide to the learning model whenever required.
Disadvantages of Frame Representation:
• In a frame system, the inference mechanism cannot be easily processed.
• The inference mechanism cannot proceed smoothly with frame representation.
• It has a very generalized approach.
Production Rules
• In production rules, the agent checks the condition, and if the condition holds, the production rule fires
and the corresponding action is carried out.
• The condition part of the rule determines which rule may be applied to a problem, whereas the action
part carries out the associated problem-solving steps. This complete process is called a recognize-act
cycle.
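To make the recognize-act cycle concrete, here is a minimal Python sketch with made-up rules and facts (illustrative only, not from the slides): the agent repeatedly recognizes a rule whose condition part matches the current facts, fires it, and carries out its action part, until no more rules apply.

```python
# Minimal sketch of a production system's recognize-act cycle.
# The facts and rules below are illustrative examples only.

facts = {"it_is_raining"}

# Each rule is (set of condition facts, fact asserted by the action).
rules = [
    ({"it_is_raining"}, "take_umbrella"),
    ({"take_umbrella"}, "stay_dry"),
]

fired = True
while fired:                                  # recognize-act cycle
    fired = False
    for condition, action in rules:
        # "Recognize": the condition part decides whether the rule may fire.
        if condition <= facts and action not in facts:
            facts.add(action)                 # "Act": carry out the problem-solving step.
            fired = True

print(facts)   # {'it_is_raining', 'take_umbrella', 'stay_dry'}
```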
Inheritable Knowledge
Knowledge here is stored hierarchically. A well-structured hierarchy of classes is formed where data is stored,
which provides the opportunity for inference. Here we can apply inheritance property, allowing us to have
inheritable knowledge. This way, the relations between instance and class (aka instance relation) can be
identified. Unlike Simple Relations, here, the objects are represented as nodes.
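A rough Python sketch of how such a hierarchy might be coded (the classes and properties below are invented for illustration): properties stored higher in the class hierarchy are inherited by subclasses and their instances, which is the instance relation described above.

```python
# Illustrative sketch of inheritable knowledge as a class hierarchy.
# The classes and attributes are made-up examples, not from the slides.

class Instrument:
    makes_sound = True                 # property stored at the top of the hierarchy

class StringInstrument(Instrument):    # "kind of" relation: a StringInstrument is an Instrument
    has_strings = True

class Guitar(StringInstrument):        # inherits both properties without restating them
    pass

my_guitar = Guitar()                   # instance relation: my_guitar is an instance of Guitar
print(my_guitar.makes_sound, my_guitar.has_strings)   # True True
```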
Inferential Knowledge
In this method, logics are used. Being a very formal approach, facts can be retrieved with a high level of
accuracy.
Procedural Knowledge
This method uses programs and code that apply simple if-then rules. This is the way many programming
languages such as LISP and Prolog store information. We may not use this method to represent all forms of
knowledge, but domain-specific knowledge can be stored very efficiently in this manner.
Propositional logic in Artificial intelligence
• Propositional logic (PL) is the simplest form of logic where all the statements are made by propositions.
• A proposition is a declarative statement which is either true or false.
• It is a technique of knowledge representation in logical and mathematical form.
• A statement is a proposition if it is either true or false. Examples of propositions include "2+2=4" and
"The sky is blue."
• Logical connectives are used to combine propositions to form more complex statements.
• Truth tables are used to represent the truth values of propositions and the logical connectives that
combine them.
• In propositional logic, inference rules are used to derive new propositions from existing ones.
• Propositional logic is a limited form of logic that only deals with propositions that are either true or false.
Example:
a) It is Sunday.
b) The Sun rises in the West. (False proposition)
c) 3 + 3 = 7 (False proposition)
d) 5 is a prime number.
Syntax of Propositional Logic
• Syntax of propositional logic refers to the formal rules for constructing statements in propositional logic.
• Propositional logic deals with the study of propositions, which are declarative statements that are either
true or false.
• The syntax of propositional logic consists of two main components: atomic propositions and compound
propositions.
Atomic Propositions
• Atomic propositions are simple statements that cannot be broken down into simpler statements. They are
the building blocks of propositional logic. An atomic proposition can be represented by a letter or symbol,
such as p, q, r, or s. For example, the following are atomic propositions:
• p: The sky is blue.
• q: The grass is green.
• r: 2 + 2 = 4.
• s: The Earth orbits the Sun.
Compound Propositions
• Compound propositions are formed by combining atomic propositions using logical operators. There are
several logical operators in propositional logic, including negation, conjunction, disjunction, implication, and
equivalence.
• Example:
• "It is raining today, and state it is wet."
• "Ankit is a doctor, and his clinic is in Mumbai
Logical Connectives
When connecting two simpler assertions or logically expressing a statement, logical connectives are used. Using
logical connectives, we can build compound propositions. The following list of connectives includes the main five:
Negation
The negation of a proposition p is denoted by ¬p and is read as "not p". For example: ¬p: The sky is not blue.
Conjunction
The conjunction of two propositions p and q is denoted by p ∧ q and is read as "p and q". The conjunction
is true only if both p and q are true. For example: p ∧ q: The sky is blue and the grass is green.
Disjunction
The disjunction of two propositions p and q is denoted by p ∨ q and is read as "p or q". The disjunction is
true if at least one of p and q is true. For example: p ∨ q: The sky is blue or the grass is green.
Logical Connectives
Implication
The implication of two propositions p and q is denoted by p → q and is read
as "if p then q". The implication is false only if p is true and q is false.
Biconditional
A sentence such as P ⇔ Q is a biconditional sentence. Example: "I will eat
lunch if and only if my mood improves." With P = "I will eat lunch" and Q = "my
mood improves," it can be represented as P ⇔ Q.
Summarized table for Propositional Logic Connectives
Truth table with Three Propositions
Precedence of Connectives
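For reference, the usual precedence of the connectives, from highest to lowest, is ¬, ∧, ∨, →, ⇔. As an illustrative sketch (not taken from the slides), the following Python snippet generates a truth table for three propositions and the main connectives, computing the implication P → Q as ¬P ∨ Q:

```python
from itertools import product

# Print a truth table for three propositions P, Q, R and the main connectives.
# P -> Q is computed as (not P) or Q; P <-> Q as equality of truth values.
header = ["P", "Q", "R", "¬P", "P∧Q", "P∨Q", "P→Q", "P⇔Q"]
print("  ".join(f"{h:5}" for h in header))
for P, Q, R in product([True, False], repeat=3):
    row = [P, Q, R, not P, P and Q, P or Q, (not P) or Q, P == Q]
    print("  ".join(f"{str(v):5}" for v in row))
```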
Logical equivalence
Logical equivalence is one of the features of propositional logic. Two
propositions are said to be logically equivalent if and only if their columns in the
truth table are identical to each other.
Let's take two propositions A and B; for logical equivalence, we can write A ⇔ B. Comparing
the truth-table columns for ¬A ∨ B and A → B shows that they are identical; hence ¬A ∨ B is
equivalent to A → B, as checked in the sketch below.
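The equivalence can be verified mechanically by comparing the two truth-table columns. A minimal Python sketch (illustrative, not part of the original slides):

```python
from itertools import product

def implies(p, q):
    """Material implication p -> q: false only when p is true and q is false."""
    return (not p) or q

# Two propositions are logically equivalent iff their truth-table columns are identical.
rows = list(product([True, False], repeat=2))
column_not_a_or_b = [(not A) or B for A, B in rows]    # column for ¬A ∨ B
column_a_implies_b = [implies(A, B) for A, B in rows]  # column for A → B

print(column_not_a_or_b == column_a_implies_b)         # True: ¬A ∨ B ≡ A → B
```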
Properties of Operators:
Commutativity:
• P∧ Q= Q ∧ P, or
• P ∨ Q = Q ∨ P.
Associativity:
• (P ∧ Q) ∧ R= P ∧ (Q ∧ R),
• (P ∨ Q) ∨ R= P ∨ (Q ∨ R)
Identity element:
• P ∧ True = P,
• P ∨ True= True.
Distributive:
• P∧ (Q ∨ R) = (P ∧ Q) ∨ (P ∧ R).
• P ∨ (Q ∧ R) = (P ∨ Q) ∧ (P ∨ R).
DE Morgan's Law:
• ¬ (P ∧ Q) = (¬P) ∨ (¬Q)
• ¬ (P ∨ Q) = (¬ P) ∧ (¬Q).
Double-negation elimination:
• ¬ (¬P) = P.
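These identities can be verified exhaustively over all truth assignments. A short illustrative Python check of De Morgan's laws, distributivity, and double-negation elimination:

```python
from itertools import product

# Exhaustively verify some of the listed identities for every truth assignment.
for P, Q, R in product([True, False], repeat=3):
    assert (not (P and Q)) == ((not P) or (not Q))        # De Morgan's law
    assert (not (P or Q)) == ((not P) and (not Q))        # De Morgan's law
    assert (P and (Q or R)) == ((P and Q) or (P and R))   # distributivity
    assert (P or (Q and R)) == ((P or Q) and (P or R))    # distributivity
    assert (not (not P)) == P                              # double-negation elimination
print("All identities hold for every assignment.")
```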
Limitations of Propositional logic:
• We cannot represent relations like ALL, some, or none with propositional logic.
Example:
• All the girls are intelligent.
• Some apples are sweet.
• Propositional logic has limited expressive power.
• In propositional logic, we cannot describe statements in terms of their properties
or logical relationships.
First Order Logic
To represent complicated phrases or natural language statements, PL is insufficient.
The expressive power of propositional logic is quite restricted.
Consider sentences such as "All the girls are intelligent" or "Some apples are sweet," which cannot be
represented using PL.
PL is insufficient to represent such statements, so we require some more powerful
logic, such as first-order logic.
First-order logic is another method of knowledge representation. It is an extension of propositional
logic.
FOL has enough expressiveness to convey natural language statements succinctly.
Predicate logic or First-order predicate logic are other names for first-order logic.
First Order Logic
Like propositional logic, first-order logic (like natural language) assumes that the world contains
facts, but it also assumes the following things in the world:
Objects: A, B, people, numbers, colors, squares, pits, wars, theories, wumpus, ......
Relations: These can be unary relations such as red, round, is adjacent, or n-ary relations such as the
sister of, brother of, has color, comes between.
Function: Father of, best friend, third inning of, end of, ......
Variables: x, y, z, a, b, ...
Predicates: Brother, Father, >, ...
Connectives: ∧, ∨, ¬, ⇒, ⇔
Equality: ==
Quantifiers: ∀, ∃
Atomic sentences:
• Atomic sentences are the most fundamental first-order logic sentences.
These sentences are made up of a predicate symbol followed by a
parenthesized sequence of terms.
• An atomic sentence is written as Predicate(term1, term2, ......, term n).
• Example: Ravi and Ajay are brothers: => Brothers(Ravi, Ajay).
Chinky is a cat: => cat(Chinky).
Complex Sentences:
Connectives are used to join atomic sentences to form complex sentences.
A first-order logic statement has two parts:
Subject: The subject is the main component of the sentence.
Predicate: A predicate is a relationship that binds two atoms together in a
sentence.
Consider the following statement: "x is an integer." It has two parts: the
first component, x, is the statement's subject, and the second half, "is an
integer," is a predicate.
Quantifiers in First-order logic:
• Quantification specifies the quantity of specimens in the universe of
discourse, and a quantifier is the linguistic element that generates
quantification.
• These are the symbols that allow you to determine or identify the
variable's range and scope in a logical expression. There are two
different kinds of quantifiers:
Universal Quantifier, ∀ (for all, everyone, everything)
Existential Quantifier, ∃ (for some, at least one).
Universal Quantifier:
A universal quantifier is a logical symbol that indicates that a statement inside its range is true for
everything or every instance of a specific thing.
Note: the universal quantifier is typically used with implication, "→".
If x is a variable, then ∀x is read as:
For all x
For each x
For every x.
Existential Quantifier:
The Existential Quantifier (∃) is a key idea in logic that allows you to say that at least one
member in a group or domain meets a certain condition.
Existential quantifiers are a sort of quantifier that expresses that a statement is true for at least one
instance of something within its scope.
Note: we always use the AND or Conjunction symbol (∧) with Existential quantifiers.
If x is a variable, then the existential quantifier is written as ∃x or ∃(x), and it is read as follows:
There exists an 'x.'
For some 'x.'
For at least one 'x.'
Some Examples of FOL using quantifier:
1. All birds fly.
The predicate in this question is "fly(bird)."
Because all birds are able to fly, it will be portrayed as follows.
∀x bird(x) →fly(x).
2. Every man respects his parent.
The predicate in this question is "respect(x, y)," where x=man, and y= parent.
Because there is every man so will use ∀, and it will be portrayed as follows:
∀x man(x) → respects (x, parent).
3. Some boys play cricket.
In this question, the predicate is "play(x, y)," where x = boys and y = game. Because there are some
boys, we will use ∃, and it will be portrayed as:
∃x boys(x) ∧ play(x, cricket).
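Over a finite domain, such quantified formulas can be evaluated directly. The Python sketch below uses an invented domain of individuals (names and attributes are illustrative only); all() paired with implication models ∀, and any() paired with conjunction models ∃:

```python
# Illustrative finite domain: each individual is a dict of made-up attributes.
domain = [
    {"name": "Tweety", "bird": True,  "flies": True},
    {"name": "Rahul",  "boy": True,   "plays_cricket": True},
    {"name": "Aman",   "boy": True,   "plays_cricket": False},
]

def implies(p, q):
    return (not p) or q

# ∀x bird(x) → fly(x): the universal quantifier pairs with implication.
all_birds_fly = all(implies(x.get("bird", False), x.get("flies", False)) for x in domain)

# ∃x boys(x) ∧ play(x, cricket): the existential quantifier pairs with conjunction.
some_boy_plays = any(x.get("boy", False) and x.get("plays_cricket", False) for x in domain)

print(all_birds_fly, some_boy_plays)   # True True
```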
Inference Rules In AI:
Inference:
In artificial intelligence, we need intelligent computers which can create
new logic from old logic or from evidence; generating conclusions
from evidence and facts is termed inference.
Inference rules:
Inference rules are the templates for generating valid arguments. Inference
rules are applied to derive proofs in artificial intelligence, and a proof is a
sequence of conclusions that leads to the desired goal.
In inference rules, the implication among all the connectives plays an important role.
Following are some
terminologies related to inference rules:
1. Implication: It is one of the logical connectives which can be represented as P → Q.
It is a Boolean expression.
2. Converse: The converse of implication, which means the right-hand side proposition
goes to the left-hand side and vice-versa. It can be written as Q → P.
3. Contrapositive: The negation of converse is termed as contrapositive, and it can be
represented as ¬ Q → ¬ P.
4. Inverse: The negation of implication is called inverse. It can be represented as ¬ P
→ ¬ Q.
Some of these compound statements are equivalent to each other (an implication is equivalent to its
contrapositive, and the converse is equivalent to the inverse), which we can prove using a truth table.
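These equivalences can be confirmed with a short truth-table sweep; an illustrative Python check:

```python
from itertools import product

def implies(p, q):
    return (not p) or q   # material implication

for P, Q in product([True, False], repeat=2):
    # An implication P → Q is equivalent to its contrapositive ¬Q → ¬P.
    assert implies(P, Q) == implies(not Q, not P)
    # The converse Q → P is equivalent to the inverse ¬P → ¬Q.
    assert implies(Q, P) == implies(not P, not Q)
print("Equivalences hold for all truth assignments.")
```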
Types of Inference rules:
1. Modus Ponens:
The Modus Ponens rule is one of the most important rules of inference, and it states that
if P and P → Q are true, then we can infer that Q will be true. It can be represented as:
((P → Q) ∧ P) → Q
Example:
Given: If it's raining (P), then I'll take an umbrella (Q).
Statement 1: It's raining (P).
Conclusion: I'll take an umbrella (Q).
2. Modus Tollens:
The Modus Tollens rule states that if P → Q is true and ¬Q is true, then ¬P will
also be true. It can be represented as: ((P → Q) ∧ ¬Q) → ¬P
Example:
Statement-1: If it's raining (P), then I'll take an umbrella (Q). P → Q
Statement-2: I did not take an umbrella. ¬Q
Conclusion: It is not raining. ¬P
5. Addition:
The Addition rule is one of the common inference rules, and
it states that if P is true, then P ∨ Q will be true.
Example:
Statement: I have a vanilla ice-cream. ==> P
Statement-2: I have Chocolate ice-cream.==> Q
Conclusion: I have vanilla or chocolate ice-cream. ==> (P ∨Q)
6. Resolution:
The Resolution rule states that if P ∨ Q and ¬P ∨ R are true, then Q ∨ R will also
be true. It can be represented as: ((P ∨ Q) ∧ (¬P ∨ R)) → (Q ∨ R)
7. Simplification:
The Simplification rule states that if P ∧ Q is true, then P and Q
will each be true. It can be represented as: (P ∧ Q) → P and (P ∧ Q) → Q.
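As a rough illustration of how the Resolution rule can be applied mechanically, here is a small Python sketch that represents clauses as sets of literal strings; the "~" prefix marking negation is an illustrative convention, not a standard API:

```python
# Minimal sketch of the Resolution rule on clauses represented as sets of literals.
# A literal is a string; "~" marks negation (illustrative convention only).

def resolve(clause1, clause2):
    """Return every resolvent obtainable from the two clauses."""
    resolvents = []
    for lit in clause1:
        complement = lit[1:] if lit.startswith("~") else "~" + lit
        if complement in clause2:
            # Drop the complementary pair and merge the rest: Q ∨ R from P ∨ Q and ¬P ∨ R.
            resolvents.append((clause1 - {lit}) | (clause2 - {complement}))
    return resolvents

print(resolve({"P", "Q"}, {"~P", "R"}))   # one resolvent: {'Q', 'R'}
```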
1. Universal Generalization:
• Universal generalization is a valid inference rule which states that if premise P(c) is true for any
arbitrary element c in the universe of discourse, then we can conclude ∀x P(x).
• This rule can be used if we want to show that every element has a similar property.
• In this rule, x must not appear as a free variable.
Example: Let's represent P(c): "A byte contains 8 bits"; then ∀x P(x), "All
bytes contain 8 bits," will also be true.
2. Universal Instantiation:
• Universal instantiation is also called as universal elimination or UI is a valid
inference rule.
• It can be applied multiple times to add new sentences.
• As per UI, we can infer any sentence obtained by substituting a ground term
for the variable.
• The UI rule states that we can infer any sentence P(c) by substituting a ground term
c (a constant within domain x) for the variable in ∀x P(x), for any object in the universe of
discourse.
• It can be represented as: ∀x P(x) ⟹ P(c)
Example: If "Every person likes ice-cream" => ∀x P(x), then we can infer that
"John likes ice-cream" => P(c).
3. Existential Instantiation
• Existential instantiation is also called Existential Elimination; it is a valid inference rule in first-order logic.
• It can be applied only once to replace the existential sentence.
• This rule states that one can infer P(c) from the formula ∃x P(x) for a new constant symbol c.
• The restriction with this rule is that the c used in the rule must be a new term for which P(c) is true.
• It can be represented as: ∃x P(x) ⟹ P(c)
Unification Example
Comparison
Expression A ("Eats(x, Apple)") is compared to Expression B ("Eats(Riya, y)").
Substitution Variable
We can see that Expression A's first parameter is a variable "x," and the second argument is a constant
"Apple."
The first parameter in Expression B is a constant "Riya," while the second argument is a variable "y."
Unifying Variables
We unify the variable "x" in Expression A with "Riya" in Expression B. This results in the substitution: x =
Riya.
We also unify the variable "y" in Expression B with the constant "Apple." This gives us the substitution: y =
Apple.
Applying Substitutions
After substitutions, Expression A becomes "Eats(Riya, Apple)."
Unified Expression
Both expressions are now identical: "Eats(Riya, Apple)."
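A very simplified Python sketch of this unification step, using the illustrative convention that lowercase arguments are variables and capitalized arguments are constants (real unification algorithms also handle nested terms and occurs checks):

```python
# Simplified unification of the arguments of two atomic sentences with the same predicate.
# Convention (illustrative only): lowercase strings are variables, capitalized strings are constants.

def unify_args(args_a, args_b):
    subst = {}
    for a, b in zip(args_a, args_b):
        a = subst.get(a, a)                 # apply any substitution found so far
        b = subst.get(b, b)
        if a == b:
            continue
        if a[0].islower():                  # a is a variable: bind it to b
            subst[a] = b
        elif b[0].islower():                # b is a variable: bind it to a
            subst[b] = a
        else:
            return None                     # two different constants: unification fails
    return subst

# Eats(x, Apple) unified with Eats(Riya, y)
print(unify_args(["x", "Apple"], ["Riya", "y"]))   # {'x': 'Riya', 'y': 'Apple'}
```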
To prove the goal by resolution, we apply negation to the conclusion statement, which will be written as ¬likes(John,
Peanuts).
Step-4: Draw Resolution graph:
Deductive reasoning:
Deductive reasoning is deducing new information from logically related known information. It is the form of valid reasoning, which means
the argument's conclusion must be true when the premises are true.
Deductive reasoning is a type of propositional logic in AI, and it requires various rules and facts. It is sometimes referred to as top-down
reasoning, and contradictory to inductive reasoning.
In deductive reasoning, the truth of the premises guarantees the truth of the conclusion.
Inductive reasoning:
Inductive reasoning is a type of propositional logic, also known as cause-effect reasoning or bottom-up reasoning.
In inductive reasoning, we use historical data or various premises to generate a generic rule, for which the premises support the conclusion.
Example:
Premise: All of the pigeons we have seen in the zoo are white.
Conclusion: Therefore, we can expect all pigeons to be white.
Abductive reasoning:
Abductive reasoning is an extension of deductive reasoning, but in abductive reasoning, the premises do not guarantee the conclusion.
Common Sense reasoning:
Common sense reasoning simulates the human ability to make presumptions about events which occur every day.
It relies on good judgment rather than exact logic and operates on heuristic knowledge and heuristic rules.
Monotonic Reasoning:
In monotonic reasoning, once a conclusion is taken, it will remain the same even if we add some other information to the existing
information in our knowledge base. In monotonic reasoning, adding knowledge does not decrease the set of propositions that can be derived.
To solve monotonic problems, we can derive the valid conclusion from the available facts only, and it will not be affected by new facts.
Monotonic reasoning is not useful for the real-time systems, as in real time, facts get changed, so we cannot use monotonic reasoning.
Monotonic reasoning is used in conventional reasoning systems, and a logic-based system is monotonic.
Example:
Non-monotonic Reasoning:
Logic will be said to be non-monotonic if some conclusions can be invalidated by adding more knowledge to our knowledge base.
"Human perceptions of various things in daily life" is a general example of non-monotonic reasoning.
Example: Let us suppose the knowledge base contains the following knowledge:
• Birds can fly.
• Penguins cannot fly.
• Pitty is a bird.
From the above sentences, we can conclude that Pitty can fly. However, if we later add "Pitty is a penguin" to the knowledge base, this conclusion is invalidated.
Bayes' theorem:
Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning, which determines the probability of an event with
uncertain knowledge.
In probability theory, it relates the conditional probability and marginal probabilities of two random events.
Bayes' theorem allows updating the probability prediction of an event by observing new information of the real world.
Example: If cancer corresponds to one's age then by using Bayes' theorem, we can determine the probability of cancer more accurately with
the help of age.
Bayes' theorem can be derived using the product rule and the conditional probability of event A with known event B:
P(A ∧ B) = P(A|B) P(B)   and   P(A ∧ B) = P(B|A) P(A)
Equating the two expressions and dividing by P(B) gives Bayes' theorem:
P(A|B) = P(B|A) P(A) / P(B)
It shows the simple relationship between joint and conditional probabilities. Here,
P(A|B) is known as the posterior, which we need to calculate; it is read as the probability of hypothesis A given that evidence B has occurred.
P(B|A) is called the likelihood: assuming the hypothesis is true, it is the probability of the evidence.
P(A) is called the prior probability: the probability of the hypothesis before considering the evidence.
P(B) is called the marginal probability: the probability of the evidence alone.
Question: From a standard deck of playing cards, a single card is drawn. The probability that the card is a king is 4/52. Calculate the
posterior probability P(King|Face), i.e., the probability that a drawn face card is a king.
Solution:
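A sketch of the calculation, assuming a standard 52-card deck with 12 face cards (jack, queen, and king of each suit): every king is a face card, so P(Face|King) = 1, and Bayes' theorem gives P(King|Face) = P(Face|King) × P(King) / P(Face) = 1 × (4/52) / (12/52) = 1/3. The same arithmetic in Python:

```python
from fractions import Fraction

# Worked check of P(King | Face) with Bayes' theorem on a standard 52-card deck.
p_king = Fraction(4, 52)          # prior: 4 kings in the deck
p_face = Fraction(12, 52)         # evidence: 12 face cards (J, Q, K of each suit)
p_face_given_king = Fraction(1)   # likelihood: every king is a face card

p_king_given_face = p_face_given_king * p_king / p_face
print(p_king_given_face)          # 1/3
```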
Bayesian Belief Network
A Bayesian belief network is a key technology for dealing with probabilistic events and for solving problems that involve uncertainty. We
can define a Bayesian network as:
"A Bayesian network is a probabilistic graphical model which represents a set of variables and their conditional dependencies using a
directed acyclic graph."
Bayesian networks are probabilistic, because these networks are built from a probability distribution, and also use probability theory for
prediction and anomaly detection.
A Bayesian network can be used for building models from data and experts' opinions, and it consists of two parts:
○ Causal Component
○ Actual numbers
The generalized form of Bayesian network that represents and solve decision problems under uncertain knowledge is known as an
Influence diagram.
A Bayesian network graph is made up of nodes and Arcs (directed links), where:
○ Each node corresponds to the random variables, and a variable can be continuous or
discrete.
○ Arc or directed arrows represent the causal relationship or conditional probabilities
between random variables. These directed links or arrows connect the pair of nodes in
the graph.
○ In the above diagram, A, B, C, and D are random variables represented by the
nodes of the network graph.
○ If we are considering node B, which is connected with node A by a directed arrow,
then node A is called the parent of Node B.
○ Node C is independent of node A.
The Bayesian network graph does not contain any cyclic graph. Hence, it is known
as a directed acyclic graph or DAG.
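As a rough illustration of these two parts, the causal structure plus the actual numbers, here is a tiny Python sketch of a two-node network A → B with made-up conditional probability tables, using the chain rule P(A, B) = P(A) × P(B | A):

```python
# Tiny illustrative Bayesian network with two nodes, A -> B.
# All probability numbers below are invented for demonstration only.

p_A = {True: 0.3, False: 0.7}                   # prior P(A)
p_B_given_A = {True:  {True: 0.9, False: 0.1},  # P(B | A=True)
               False: {True: 0.2, False: 0.8}}  # P(B | A=False)

def joint(a, b):
    """P(A=a, B=b) = P(A=a) * P(B=b | A=a), the chain rule for this DAG."""
    return p_A[a] * p_B_given_A[a][b]

# Marginal P(B=True) obtained by summing the joint distribution over A.
p_B_true = sum(joint(a, True) for a in (True, False))
print(round(joint(True, True), 2), round(p_B_true, 2))   # 0.27 0.41
```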