Introduction to Logic, Propositions & Propositional Equivalences
A Complete Study Guide — Discrete Mathematics • Honors Level
1. Introduction to Logic
1.1 What is Logic?
Logic is the systematic study of the principles of valid reasoning and inference. At its foundation, logic provides a formal language and a set of rules that allow us to distinguish valid arguments from invalid ones. When we say an argument is logically valid, we mean that if the premises are true, then the conclusion must also be true — not just coincidentally, but necessarily.
For students pursuing a BSc (Honours) in Computer Science or Information Technology, logic is not merely a theoretical exercise. It is the backbone of programming, algorithm design, circuit design, artificial intelligence, database query languages, formal verification of software, and much more. In fact, every if-else statement you have ever written is a direct application of logical reasoning.
Logic is a branch of mathematics and philosophy that deals with the formal principles of reasoning. In computer science, it is used to model computation, specify system behavior, verify programs, and design hardware circuits.
1.2 Historical Background
The study of logic dates back to ancient Greece. Aristotle (384–322 BC) is considered the father of formal logic. His work on syllogisms — a form of reasoning where a conclusion follows from two premises — laid the foundation for centuries of logical study. The famous syllogism “All men are mortal; Socrates is a man; therefore Socrates is mortal” is a classic example of Aristotelian deductive reasoning.
The next great leap came in the 19th century. George Boole (1815–1864) published his landmark work The Laws of Thought (1854), which translated logic into algebraic form. Boolean algebra, named after him, became the mathematical system underlying all modern digital computers. Every transistor in a CPU operates according to Boolean logic.
Gottlob Frege, Bertrand Russell, and Alfred North Whitehead further developed predicate logic and formal systems in the late 19th and early 20th centuries. Later, Claude Shannon connected Boolean algebra to electrical circuits in his 1937 MIT thesis, and the rest, as they say, is the history of computing.
1.3 Importance of Logic in Computer Science and IT
Logic permeates virtually every area of computer science. Understanding it deeply is not optional for a CS/IT Honours student — it is fundamental. Here is how logic appears across the discipline:
💻 Programming
Conditional statements (if, while, for), boolean expressions, and control flow are all applications of propositional logic. Every program is, at some level, a logical specification.
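As a concrete sketch (variable names are illustrative, not from any real API), the five connectives map directly onto Python's boolean operators, with implication expanded as ¬p ∨ q:

```python
# Illustrative mapping of the five connectives onto Python booleans.
raining, cold = True, False   # p, q

negation      = not raining                 # ¬p
conjunction   = raining and cold            # p ∧ q
disjunction   = raining or cold             # p ∨ q
implication   = (not raining) or cold       # p → q, written as ¬p ∨ q
biconditional = raining == cold             # p ↔ q: same truth value

print(negation, conjunction, disjunction, implication, biconditional)
# With p = True, q = False: False False True False False
```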
⚙ Digital Circuit Design
Logic gates (AND, OR, NOT, NAND, NOR, XOR) implement propositional operators in hardware. Designing efficient circuits requires mastery of Boolean algebra and equivalences.
📊 Database Systems
SQL WHERE clauses, relational algebra, and query optimization rely heavily on propositional and predicate logic to filter, join, and retrieve data correctly.
🤖 Artificial Intelligence
Knowledge representation, automated reasoning, expert systems, and machine learning theory all build on formal logical foundations including propositional and first-order logic.
🔒 Software Verification
Formal methods use propositional and temporal logic to mathematically prove that software systems are correct and free from certain classes of bugs before deployment.
🔢 Algorithm Analysis
Proofs of correctness, loop invariants, and complexity arguments all use logical inference rules. Understanding logic makes you a better algorithm designer and analyst.
2. Propositions
2.1 What is a Proposition?
A proposition (also called a statement) is a declarative sentence that is either true or false, but not both. This binary nature — the law of the excluded middle — is the cornerstone of classical propositional logic.
A proposition is a declarative sentence (a sentence that declares a fact) that is either true (T or 1) or false (F or 0), but never both simultaneously. The truth value of a proposition is its assignment of true or false.
Examples of Propositions:
• “The Sun rises in the East.” → TRUE
• “2 + 2 = 5” → FALSE
• “Karachi is the capital of Pakistan.” → FALSE (Islamabad is)
• “Every even integer greater than 2 is the sum of two primes.” → Truth value unknown (Goldbach's conjecture), but it IS a proposition
• “Python is a programming language.” → TRUE
• “What time is it?” → A question — not a declarative sentence
• “Close the door!” → A command
• “x + 1 = 5” → An open sentence with a variable — truth depends on x
• “This statement is false.” → A paradox (the Liar's Paradox) — not a valid proposition
• “Wow, what a great result!” → An exclamation
Propositions are typically denoted by lowercase letters: p, q, r, s, …. For example, we might let:
p: “It is raining outside.”
q: “The ground is wet.”
r: “I need an umbrella.”
2.2 Atomic vs Compound Propositions
Propositions fall into two fundamental categories:
An atomic proposition (or simple proposition) is one that cannot be broken down into simpler propositions. It has no logical connectives inside it. For example: “Today is Monday” is atomic.
A compound proposition is formed by combining one or more propositions using logical connectives (also called logical operators or Boolean operators). For example: “Today is Monday and it is raining” combines two atomic propositions using the connective “and”.
2.3 Logical Connectives and Operators
The following table summarizes the five primary logical connectives used in propositional logic. These are the building blocks for constructing all compound propositions:
| Connective | Symbol | Name | Read as | Example |
|---|---|---|---|---|
| Negation | ¬p or ~p | NOT | “not p” | ¬p: “It is not raining” |
| Conjunction | p ∧ q | AND | “p and q” | p ∧ q: “Raining and cold” |
| Disjunction | p ∨ q | OR | “p or q” | p ∨ q: “Raining or cold” |
| Implication | p → q | IF…THEN | “if p then q” | p → q: “If raining, then wet” |
| Biconditional | p ↔ q | IF AND ONLY IF | “p iff q” | p ↔ q: “Raining iff cloudy” |
Exclusive OR (XOR)
There is a sixth connective worth knowing: Exclusive OR, written p ⊕ q (or p XOR q). It is true when exactly one of p or q is true, but not both. In digital logic, XOR is one of the most important gates. In regular OR, p ∨ q is true even when both p and q are true; XOR eliminates that case.
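In Python, `^` applied to booleans computes XOR; a quick sketch verifying the identity p ⊕ q ≡ (p ∨ q) ∧ ¬(p ∧ q) over all four assignments:

```python
from itertools import product

# Check that ^ agrees with (p ∨ q) ∧ ¬(p ∧ q) on every assignment.
for p, q in product([True, False], repeat=2):
    xor = p ^ q
    expanded = (p or q) and not (p and q)
    assert xor == expanded
    print(p, q, xor)
```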
p ⊕ q ≡ (p ∨ q) ∧ ¬(p ∧ q)

Understanding Each Connective in Depth
¬p — Negation (NOT)
Flips the truth value of a proposition. If p is TRUE, then ¬p is FALSE. If p is FALSE, then ¬p is TRUE. This is the simplest connective — it acts on a single proposition (unary operator).
p: “The sky is blue.” (TRUE)
¬p: “The sky is not blue.” (FALSE)
p ∧ q — Conjunction (AND)
True ONLY when BOTH p and q are true. If either or both are false, the conjunction is false. This mirrors the everyday meaning of “and”.
“It is Tuesday AND it is raining” — TRUE only if both conditions hold simultaneously.
p ∨ q — Disjunction (OR)
True when AT LEAST ONE of p or q is true. It is only false when BOTH are false. Note: this is inclusive OR, meaning it’s true even if both p and q are true.
“You can pay by cash OR card” — paying by both is also acceptable.
p → q — Implication (IF…THEN)
Called a conditional. The only case where it is FALSE is when p is TRUE and q is FALSE. In all other cases, it is TRUE. This surprises many students: if the hypothesis (p) is false, the implication is vacuously true.
“If it rains, the ground is wet.” — FALSE only if it rains but the ground stays dry.
The Conditional p → q: Related Forms
The conditional is especially important in logic and proofs. It has three related variants you must know:
| Name | Form | Logically Equivalent To |
|---|---|---|
| Original | p → q | the contrapositive, ¬q → ¬p |
| Converse | q → p | the inverse, ¬p → ¬q |
| Inverse | ¬p → ¬q | the converse, q → p |
| Contrapositive | ¬q → ¬p | ✔ the original, p → q |
The contrapositive (¬q → ¬p) is always logically equivalent to the original implication (p → q). This is extremely useful in proofs — sometimes it’s easier to prove the contrapositive than the original statement.
2.4 Truth Tables
A truth table is a tabular representation that lists all possible combinations of truth values for the component propositions and shows the resulting truth value of the compound proposition. For a compound proposition with n atomic variables, there are 2ⁿ rows (one for each possible combination of T/F values).
Truth Table: Negation (¬p)
| p | ¬p |
|---|---|
| T | F |
| F | T |
Truth Table: All Basic Binary Connectives (p, q)
| p | q | p ∧ q | p ∨ q | p ⊕ q | p → q | p ↔ q |
|---|---|---|---|---|---|---|
| T | T | T | T | F | T | T |
| T | F | F | T | T | F | F |
| F | T | F | T | T | T | F |
| F | F | F | F | F | T | T |
AND (∧): Both must be TRUE to be TRUE. Think: “strict requirement”
OR (∨): At least one must be TRUE. Think: “generous requirement”
Implication (→): Only FALSE when TRUE leads to FALSE. Think: “breaking a promise”
Biconditional (↔): TRUE when both are the same value. Think: “if and only if”
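The table above can be regenerated programmatically; a sketch that encodes each connective as a small predicate (implication as ¬p ∨ q, biconditional as equality of truth values):

```python
from itertools import product

# Each binary connective as a Python predicate.
connectives = {
    "p ∧ q": lambda p, q: p and q,
    "p ∨ q": lambda p, q: p or q,
    "p ⊕ q": lambda p, q: p != q,
    "p → q": lambda p, q: (not p) or q,
    "p ↔ q": lambda p, q: p == q,
}

print("p", "q", *connectives)
for p, q in product([True, False], repeat=2):
    print(p, q, [f(p, q) for f in connectives.values()])
```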
Precedence of Logical Operators
Just like arithmetic has BODMAS/PEMDAS, propositional logic has an operator precedence. From highest to lowest:
| Precedence | Operator | Symbol |
|---|---|---|
| 1 (Highest) | Negation | ¬ (NOT) |
| 2 | Conjunction | ∧ (AND) |
| 3 | Disjunction | ∨ (OR) |
| 4 | Implication | → (IF…THEN) |
| 5 (Lowest) | Biconditional | ↔ (IFF) |
So in the expression p ∨ q ∧ ¬r → s, we evaluate: first ¬r, then q ∧ ¬r, then p ∨ (q ∧ ¬r), and finally the whole thing → s. Always use parentheses in complex expressions to avoid ambiguity.
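This precedence reading can be sanity-checked in Python, whose boolean operators happen to share the same relative ordering (not binds tightest, then and, then or); the conditional is written out explicitly as ¬LHS ∨ RHS since Python has no → operator:

```python
from itertools import product

def implies(a, b):
    # a → b, encoded as ¬a ∨ b
    return (not a) or b

# Precedence reading of  p ∨ q ∧ ¬r → s  is  (p ∨ (q ∧ ¬r)) → s.
for p, q, r, s in product([True, False], repeat=4):
    fully_parenthesized = implies(p or (q and (not r)), s)
    precedence_reading  = implies(p or q and not r, s)   # Python groups the same way
    assert fully_parenthesized == precedence_reading
print("All 16 assignments agree.")
```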
3. Types of Propositions
Once we can construct compound propositions and evaluate them using truth tables, we want to classify them based on their truth values across all possible interpretations (all rows of the truth table). This classification is one of the most important concepts in propositional logic.
• Tautology: a proposition that is TRUE for every possible assignment of truth values to its variables. No matter what, it is always true.
• Contradiction: a proposition that is FALSE for every possible assignment. No matter what, it is always false. Also called a “fallacy”.
• Contingency: a proposition that is TRUE for some assignments and FALSE for others. Its truth value depends on the specific values of its variables.
3.1 Tautology
A tautology is a compound proposition that is always true, regardless of the truth values of the individual atomic propositions. The term comes from the Greek tautologos meaning “repeating what has been said.”
A compound proposition P is a tautology if P is TRUE for every possible combination of truth values of its component propositions. Symbolically: P is a tautology if P ≡ T (where T represents the constant “True”).
Classic Tautology Examples
Example 1: p ∨ ¬p (Law of Excluded Middle)
| p | ¬p | p ∨ ¬p |
|---|---|---|
| T | F | T |
| F | T | T |
The final column is all TRUE. This is the Law of Excluded Middle: every proposition is either true or its negation is true.
Example 2: (p → q) ∧ (q → r) → (p → r) (Hypothetical Syllogism)
| p | q | r | p→q | q→r | (p→q)∧(q→r) | p→r | Result |
|---|---|---|---|---|---|---|---|
| T | T | T | T | T | T | T | T |
| T | T | F | T | F | F | F | T |
| T | F | T | F | T | F | T | T |
| T | F | F | F | T | F | F | T |
| F | T | T | T | T | T | T | T |
| F | T | F | T | F | F | T | T |
| F | F | T | T | T | T | T | T |
| F | F | F | T | T | T | T | T |
All 8 rows evaluate to TRUE — this is a tautology, and it represents the famous logical rule of Hypothetical Syllogism.
Other Common Tautologies You Must Know
• p → p — (Identity: everything implies itself)
• ¬(p ∧ ¬p) — (Law of Non-Contradiction)
• p ∨ ¬p — (Law of Excluded Middle)
• (p → q) ↔ (¬q → ¬p) — (Contrapositive equivalence)
• [(p → q) ∧ p] → q — (Modus Ponens)
• [(p → q) ∧ ¬q] → ¬p — (Modus Tollens)
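A brute-force checker makes these easy to verify mechanically. The sketch below (the helper names `is_tautology` and `implies` are mine, not standard library functions) tests each listed tautology over all assignments:

```python
from itertools import product

def is_tautology(formula, nvars):
    """TRUE on all 2**nvars assignments of its variables."""
    return all(formula(*vals) for vals in product([True, False], repeat=nvars))

def implies(a, b):
    return (not a) or b   # a → b as ¬a ∨ b

assert is_tautology(lambda p: implies(p, p), 1)                              # p → p
assert is_tautology(lambda p: not (p and not p), 1)                          # ¬(p ∧ ¬p)
assert is_tautology(lambda p: p or not p, 1)                                 # p ∨ ¬p
assert is_tautology(lambda p, q: implies(p, q) == implies(not q, not p), 2)  # contrapositive
assert is_tautology(lambda p, q: implies(implies(p, q) and p, q), 2)         # modus ponens
assert is_tautology(lambda p, q: implies(implies(p, q) and not q, not p), 2) # modus tollens
print("All listed tautologies verified.")
```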
3.2 Contradiction (Fallacy)
A contradiction is a compound proposition that is always false, regardless of the truth values of its atomic components. It is the logical opposite of a tautology.
A compound proposition P is a contradiction (or fallacy) if P is FALSE for every possible combination of truth values. Symbolically: P ≡ F.
Classic Contradiction Example: p ∧ ¬p
| p | ¬p | p ∧ ¬p |
|---|---|---|
| T | F | F |
| F | T | F |
This is p AND NOT p — a statement can never be both true and false at the same time. This represents the Law of Non-Contradiction. In logic, if you can derive a contradiction from a set of premises, those premises are inconsistent (they cannot all be true simultaneously).
Practical significance: In programming and formal verification, if your specifications contain a contradiction, no implementation can ever satisfy them. Detecting contradictions is therefore crucial in requirements engineering.
Another Example: (p → q) ∧ p ∧ ¬q
| p | q | p→q | ¬q | (p→q) ∧ p ∧ ¬q |
|---|---|---|---|---|
| T | T | T | F | F |
| T | F | F | T | F |
| F | T | T | F | F |
| F | F | T | T | F |
3.3 Contingency
A contingency is a proposition that is neither a tautology nor a contradiction. It is sometimes true and sometimes false, depending on the truth values assigned to its atomic components. Most propositions we encounter in everyday life are contingencies.
A compound proposition P is a contingency if there exists at least one assignment that makes it TRUE and at least one assignment that makes it FALSE.
Example: p → q
| p | q | p → q |
|---|---|---|
| T | T | T |
| T | F | F |
| F | T | T |
| F | F | T |
Since the result is TRUE in some rows and FALSE in one row, p → q is a contingency. Its truth depends on the specific values of p and q.
3.4 Satisfiable Propositions
A proposition is satisfiable if there exists at least one assignment of truth values that makes it TRUE. This is a broader category:
| Type | Satisfiable? | Description |
|---|---|---|
| Tautology | YES | Always TRUE — trivially satisfiable by every assignment |
| Contingency | YES | TRUE for some assignments — satisfiable |
| Contradiction | NO | Never TRUE — unsatisfiable |
The Boolean Satisfiability Problem (SAT) — determining whether a given propositional formula is satisfiable — was the first problem proven to be NP-complete (Cook–Levin theorem, 1971). It sits at the center of computational complexity theory and has enormous practical applications in hardware verification, AI planning, and cryptography.
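Brute force enumerates all 2ⁿ assignments, which is exactly the exponential blow-up that makes SAT hard in general; real SAT solvers use much cleverer search. A toy sketch (helper name is mine):

```python
from itertools import product

def satisfying_assignments(formula, nvars):
    """Brute-force SAT: every assignment that makes the formula TRUE."""
    return [vals for vals in product([True, False], repeat=nvars)
            if formula(*vals)]

# A contingency is satisfiable; a contradiction is not.
assert satisfying_assignments(lambda p, q: (not p) or q, 2)   # p → q: satisfiable
assert not satisfying_assignments(lambda p: p and not p, 1)   # p ∧ ¬p: unsatisfiable
```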
4. Propositional Equivalences
4.1 Definition and Notation
Propositional equivalence is one of the most powerful concepts in logic. Two compound propositions are said to be logically equivalent if they have identical truth values for every possible assignment of truth values to their atomic variables.
Propositions P and Q are logically equivalent, written P ≡ Q (or P ⇔ Q), if and only if P ↔ Q is a tautology. That is, P and Q produce the same truth value for every possible combination of truth values of their component propositions.
The symbol ≡ denotes logical equivalence. Note carefully: this is different from the biconditional connective ↔. The biconditional is a formula (a compound proposition), while logical equivalence is a relationship between two formulas.
P ≡ Q if and only if P ↔ Q is a tautology

Quick Verification Example
Are ¬(p ∧ q) and ¬p ∨ ¬q equivalent? (De Morgan’s First Law)
| p | q | p∧q | ¬(p∧q) | ¬p | ¬q | ¬p∨¬q | Same? |
|---|---|---|---|---|---|---|---|
| T | T | T | F | F | F | F | ✔ |
| T | F | F | T | F | T | T | ✔ |
| F | T | F | T | T | F | T | ✔ |
| F | F | F | T | T | T | T | ✔ |
The columns for ¬(p ∧ q) and ¬p ∨ ¬q are identical — confirmed equivalence!
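The same column-by-column comparison can be automated: two formulas are equivalent exactly when they agree on every assignment. A small sketch (the `equivalent` helper is illustrative):

```python
from itertools import product

def equivalent(f, g, nvars):
    """P ≡ Q iff P and Q agree on every one of the 2**nvars assignments."""
    return all(f(*vals) == g(*vals)
               for vals in product([True, False], repeat=nvars))

# De Morgan's First Law: ¬(p ∧ q) ≡ ¬p ∨ ¬q
assert equivalent(lambda p, q: not (p and q),
                  lambda p, q: (not p) or (not q), 2)

# The converse of an implication is NOT equivalent to it:
assert not equivalent(lambda p, q: (not p) or q,      # p → q
                      lambda p, q: (not q) or p, 2)   # q → p
```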
4.2 Laws of Propositional Equivalence
Just as algebra has laws (commutative, associative, distributive, etc.), propositional logic has a complete set of equivalence laws. Mastering these laws allows you to simplify, transform, and prove logical statements without constructing truth tables every time. These laws form the foundation of Boolean algebra.
Identity Laws
• Identity for AND: p ∧ T ≡ p. Any proposition ANDed with TRUE equals the proposition itself.
• Identity for OR: p ∨ F ≡ p. Any proposition ORed with FALSE equals the proposition itself.

Domination Laws
• Domination for OR: p ∨ T ≡ T. TRUE dominates OR: anything ORed with TRUE is always TRUE.
• Domination for AND: p ∧ F ≡ F. FALSE dominates AND: anything ANDed with FALSE is always FALSE.

Idempotent Laws
• Idempotent for OR: p ∨ p ≡ p
• Idempotent for AND: p ∧ p ≡ p

Double Negation Law
• Double Negation: ¬(¬p) ≡ p. Negating twice returns the original, like “not not raining” = “raining”.

Commutative Laws
• Commutative for OR: p ∨ q ≡ q ∨ p
• Commutative for AND: p ∧ q ≡ q ∧ p

Associative Laws
• Associative for OR: (p ∨ q) ∨ r ≡ p ∨ (q ∨ r)
• Associative for AND: (p ∧ q) ∧ r ≡ p ∧ (q ∧ r)

Distributive Laws
• Distributive, AND over OR: p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r)
• Distributive, OR over AND: p ∨ (q ∧ r) ≡ (p ∨ q) ∧ (p ∨ r)

Complement (Negation) Laws
• Complement for OR (Excluded Middle): p ∨ ¬p ≡ T
• Complement for AND (Non-Contradiction): p ∧ ¬p ≡ F
• Negation of T and F: ¬T ≡ F and ¬F ≡ T

Absorption Laws
• Absorption 1: p ∨ (p ∧ q) ≡ p. p already “absorbs” the conjunction: the OR cannot add anything new.
• Absorption 2: p ∧ (p ∨ q) ≡ p

Implication and Biconditional Equivalences
• Implication Elimination: p → q ≡ ¬p ∨ q. Every conditional can be expressed using only OR and NOT.
• Biconditional Elimination: p ↔ q ≡ (p → q) ∧ (q → p)
• Biconditional Alternative: p ↔ q ≡ (p ∧ q) ∨ (¬p ∧ ¬q)
• Contrapositive: p → q ≡ ¬q → ¬p
• Exportation: (p ∧ q) → r ≡ p → (q → r)
• Absurdity (Implication from Contradiction): p → F ≡ ¬p
4.3 De Morgan’s Laws — In Depth
De Morgan’s Laws, named after British mathematician Augustus De Morgan (1806–1871), are arguably the most used equivalence laws in both logic and computer science. They describe how negation distributes over AND and OR.
First Law: ¬(p ∧ q) ≡ ¬p ∨ ¬q
“NOT (p AND q)” is equivalent to “(NOT p) OR (NOT q)”
Second Law: ¬(p ∨ q) ≡ ¬p ∧ ¬q
“NOT (p OR q)” is equivalent to “(NOT p) AND (NOT q)”
Intuitive Understanding of De Morgan’s Laws
Consider the First Law with a real-world example: “It is NOT the case that (it is raining AND it is cold).” This means: either it is not raining, or it is not cold (or both). You break the AND, flip both propositions, and change AND to OR.
For the Second Law: “It is NOT the case that (it is raining OR it is cold).” This means: it is not raining AND it is not cold — both conditions must fail.
Memory Trick: When pushing a NOT inside parentheses, flip AND to OR (or OR to AND), and negate each proposition individually. The connective always swaps.
Extension to n Variables
De Morgan’s Laws generalize to any number of variables:
¬(p₁ ∧ p₂ ∧ … ∧ pₙ) ≡ ¬p₁ ∨ ¬p₂ ∨ … ∨ ¬pₙ
¬(p₁ ∨ p₂ ∨ … ∨ pₙ) ≡ ¬p₁ ∧ ¬p₂ ∧ … ∧ ¬pₙ

De Morgan’s Laws in Programming
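In everyday code, De Morgan’s Laws let you rewrite a negated compound condition into an often more readable form. A minimal sketch (function and variable names are illustrative):

```python
# Two equivalent guards; the second applies De Morgan's Second Law:
# ¬(connected ∨ cancelled)  ≡  ¬connected ∧ ¬cancelled
def should_retry(connected, cancelled):
    return not (connected or cancelled)

def should_retry_demorgan(connected, cancelled):
    return (not connected) and (not cancelled)

for c in (True, False):
    for x in (True, False):
        assert should_retry(c, x) == should_retry_demorgan(c, x)
```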
4.4 The Substitution Principle
The Substitution Principle (also called the Principle of Replacement) states: if P ≡ Q, then in any compound formula, any occurrence of P may be replaced by Q without changing the truth value of the formula. This is the foundation of all algebraic proof methods in logic.
If P ≡ Q, then for any compound formula R that contains P as a subformula, the formula R’ obtained by replacing one or more occurrences of P with Q satisfies R ≡ R’.
5. Methods to Prove Logical Equivalences
Proving that two propositions are logically equivalent is a fundamental skill in discrete mathematics and computer science. There are several systematic methods, each with its own strengths and appropriate use cases. In your BSc (Hons) examinations, you will be expected to know all of them.
5.1 Method 1: The Truth Table Method
The truth table method is the most direct and mechanical approach. To prove P ≡ Q, construct a truth table with all possible truth assignments for the atomic variables, compute the truth values of P and Q for each row, and verify that the columns for P and Q are identical.
Use the truth table method when: (a) the number of variables is small (3 or fewer), (b) you want a definitive, mechanical proof, or (c) the formula structure makes algebraic simplification unclear.
Step-by-Step Procedure
1. Identify all atomic variables. Count the distinct atomic propositions. For n variables, you will need 2ⁿ rows. With p and q: 4 rows. With p, q, r: 8 rows.
2. Fill in the truth value columns. List all 2ⁿ combinations systematically: for the first variable, alternate T and F every 2ⁿ⁻¹ rows; for the second, every 2ⁿ⁻² rows; and so on.
3. Compute intermediate columns. Add intermediate columns for each subformula, evaluated left to right (respecting precedence). Build up from atomic variables toward the full formula.
4. Compare final columns. Check whether the column for P and the column for Q are identical in every row. If yes, P ≡ Q is proven.
Worked Example: Prove p → q ≡ ¬p ∨ q
| p | q | ¬p | p → q | ¬p ∨ q | Identical? |
|---|---|---|---|---|---|
| T | T | F | T | T | ✔ |
| T | F | F | F | F | ✔ |
| F | T | T | T | T | ✔ |
| F | F | T | T | T | ✔ |
✓ All rows match — p → q ≡ ¬p ∨ q is proven by truth table.
Limitations of the Truth Table Method
While foolproof, truth tables become unwieldy quickly. With 4 variables you have 16 rows; with 5 variables, 32 rows; with 10 variables, 1024 rows. For proofs involving many variables, the algebraic method is far more practical.
5.2 Method 2: The Algebraic (Law Substitution) Method
The algebraic method (also called the law substitution method or equational proof method) proves logical equivalence by starting from one side of the equivalence and systematically applying known equivalence laws, step by step, until the other side is reached.
Prefer the algebraic method when: (a) there are many variables (truth tables would be too large), (b) you want to simplify a formula to its simplest form, or (c) the exam requires you to show a chain of logical steps citing specific laws.
Step-by-Step Procedure
1. Write the starting formula. Begin with the left-hand side (LHS) or whichever side looks more complex.
2. Apply equivalence laws. At each step, apply exactly one equivalence law to transform the expression. Cite the law used at each step in the margin or parenthetically.
3. Arrive at the target. Continue applying laws until the expression matches the right-hand side (RHS). This completes the proof.
Worked Example 1: Prove ¬(p → q) ≡ p ∧ ¬q
1. ¬(p → q) (starting expression, LHS)
2. ≡ ¬(¬p ∨ q) (by Implication Elimination: p → q ≡ ¬p ∨ q)
3. ≡ ¬(¬p) ∧ ¬q (by De Morgan’s Second Law: ¬(A ∨ B) ≡ ¬A ∧ ¬B)
4. ≡ p ∧ ¬q (by Double Negation: ¬(¬p) ≡ p) ✓ QED
Worked Example 2: Prove (p ∨ q) ∧ (p ∨ ¬q) ≡ p
1. (p ∨ q) ∧ (p ∨ ¬q) (starting expression, LHS)
2. ≡ p ∨ (q ∧ ¬q) (by the Distributive Law, read in reverse: (A ∨ B) ∧ (A ∨ C) ≡ A ∨ (B ∧ C))
3. ≡ p ∨ F (by Complement Law: q ∧ ¬q ≡ F)
4. ≡ p (by Identity Law: p ∨ F ≡ p) ✓ QED
Worked Example 3: Prove ¬(p ∨ (¬p ∧ q)) ≡ ¬p ∧ ¬q
1. ¬(p ∨ (¬p ∧ q)) (starting expression)
2. ≡ ¬((p ∨ ¬p) ∧ (p ∨ q)) (by Distributive Law: A ∨ (B ∧ C) ≡ (A ∨ B) ∧ (A ∨ C))
3. ≡ ¬(T ∧ (p ∨ q)) (by Complement Law: p ∨ ¬p ≡ T)
4. ≡ ¬(p ∨ q) (by Identity Law: T ∧ A ≡ A)
5. ≡ ¬p ∧ ¬q (by De Morgan’s Second Law) ✓ QED
5.3 Method 3: The Conditional Proof / Biconditional Method
Another method to prove P ≡ Q is to prove both P → Q and Q → P separately. Since P ↔ Q ≡ (P → Q) ∧ (Q → P), showing both implications hold proves the equivalence. This is essentially the definition of biconditional applied as a proof technique.
This method is particularly useful in predicate logic and formal proofs where you want to show two statements are exactly the same. For propositional logic, the algebraic or truth table methods are usually more efficient.
Procedure
1. Prove P → Q. Assume P is true. Using valid inference rules (modus ponens, hypothetical syllogism, etc.), derive Q. This establishes the “if P then Q” direction.
2. Prove Q → P. Assume Q is true. Using valid inference rules, derive P. This establishes the “if Q then P” direction.
3. Conclude P ≡ Q. Since both implications are established, by the biconditional equivalence, P ≡ Q follows.
5.4 Comprehensive Worked Examples
Let’s now work through more complex examples that combine multiple methods and are representative of Honours-level examination questions.
Example A: Simplify (p ∧ q) ∨ (p ∧ ¬q) ∨ (¬p ∧ q)
1. (p ∧ q) ∨ (p ∧ ¬q) ∨ (¬p ∧ q) (original expression)
2. ≡ (p ∧ (q ∨ ¬q)) ∨ (¬p ∧ q) (by Distributive Law on the first two terms: factor out p)
3. ≡ (p ∧ T) ∨ (¬p ∧ q) (by Complement Law: q ∨ ¬q ≡ T)
4. ≡ p ∨ (¬p ∧ q) (by Identity Law: p ∧ T ≡ p)
5. ≡ (p ∨ ¬p) ∧ (p ∨ q) (by Distributive Law: A ∨ (B ∧ C) ≡ (A ∨ B) ∧ (A ∨ C))
6. ≡ T ∧ (p ∨ q) (by Complement Law: p ∨ ¬p ≡ T)
7. ≡ p ∨ q (by Identity Law: T ∧ A ≡ A) ✓ Final simplified form
Example B: Prove (p → q) ∧ (p → r) ≡ p → (q ∧ r)
1. (p → q) ∧ (p → r) (LHS)
2. ≡ (¬p ∨ q) ∧ (¬p ∨ r) (by Implication Elimination, applied twice)
3. ≡ ¬p ∨ (q ∧ r) (by Distributive Law: (A ∨ B) ∧ (A ∨ C) ≡ A ∨ (B ∧ C))
4. ≡ p → (q ∧ r) (by Implication Elimination in reverse: ¬p ∨ A ≡ p → A) ✓ QED
Example C: Prove (p → q) ∨ (p → r) ≡ p → (q ∨ r)
1. (p → q) ∨ (p → r) (LHS)
2. ≡ (¬p ∨ q) ∨ (¬p ∨ r) (by Implication Elimination, twice)
3. ≡ ¬p ∨ q ∨ ¬p ∨ r (by Associativity of OR, removing parentheses)
4. ≡ ¬p ∨ ¬p ∨ q ∨ r (by Commutativity of OR)
5. ≡ ¬p ∨ (q ∨ r) (by Idempotent Law: ¬p ∨ ¬p ≡ ¬p, then Associativity)
6. ≡ p → (q ∨ r) (by Implication Elimination in reverse) ✓ QED
Example D: Prove that p → (q → r) ≡ (p ∧ q) → r (Exportation)
1. p → (q → r) (LHS)
2. ≡ ¬p ∨ (¬q ∨ r) (by Implication Elimination, applied to the outer, then the inner conditional)
3. ≡ (¬p ∨ ¬q) ∨ r (by Associativity of OR)
4. ≡ ¬(p ∧ q) ∨ r (by De Morgan’s First Law in reverse: ¬p ∨ ¬q ≡ ¬(p ∧ q))
5. ≡ (p ∧ q) → r (by Implication Elimination in reverse) ✓ QED
6. Applications in Computer Science and IT
Understanding propositional logic is not merely an academic exercise for BSc (Hons) students. Every concept covered in this guide has direct, practical applications across computer science and information technology.
6.1 Circuit Design and Boolean Algebra
Every digital circuit — from the simplest logic gate to the most complex CPU — implements propositional logic. The AND, OR, and NOT gates directly correspond to ∧, ∨, and ¬. Logical equivalences allow circuit engineers to simplify circuits, reducing the number of gates (and thus cost and power consumption) while maintaining the same input-output behavior.
Suppose a circuit implements: (A ∧ B) ∨ (A ∧ C)
Using the distributive law: A ∧ (B ∨ C)
The simplified version requires 2 gates (one OR, one AND) instead of 3 (two ANDs and one OR), a one-third hardware reduction that represents real cost savings at scale.
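The distributive-law simplification can be verified exhaustively in code; a sketch simulating both circuits as boolean functions:

```python
from itertools import product

def original(a, b, c):
    return (a and b) or (a and c)   # two AND gates + one OR gate

def simplified(a, b, c):
    return a and (b or c)           # one OR gate + one AND gate

# Same input-output behavior on all 8 input combinations.
assert all(original(a, b, c) == simplified(a, b, c)
           for a, b, c in product([True, False], repeat=3))
```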
6.2 Programming and Conditional Logic
Every if-else construct in programming is a direct implementation of propositional logic. De Morgan’s Laws are particularly useful when refactoring complex boolean conditions. Understanding logical equivalences allows developers to write cleaner, more readable, and more efficient conditional logic.
6.3 Database Query Optimization
SQL query optimizers use logical equivalences to rewrite WHERE clauses into more efficient forms. For example, a condition like NOT (status = 'active' AND age > 18) can be rewritten using De Morgan’s Law as status != 'active' OR age <= 18, which may allow the query engine to use different indexes and execute much faster.
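The rewrite can be sanity-checked in Python over ordinary two-valued booleans (SQL's three-valued NULL logic needs separate care, which this sketch ignores; the field names are illustrative):

```python
# Rows standing in for a table; check that both WHERE conditions
# select exactly the same rows.
rows = [
    {"status": "active",   "age": 25},
    {"status": "active",   "age": 15},
    {"status": "inactive", "age": 40},
]

# NOT (status = 'active' AND age > 18)
original  = [r for r in rows if not (r["status"] == "active" and r["age"] > 18)]
# De Morgan: status != 'active' OR age <= 18
rewritten = [r for r in rows if r["status"] != "active" or r["age"] <= 18]

assert original == rewritten
```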
6.4 Artificial Intelligence: Knowledge Bases
In AI knowledge representation, propositions represent facts about the world. Logical equivalences allow the inference engine to transform complex queries into canonical forms that can be matched against stored knowledge. The process of converting any formula to Conjunctive Normal Form (CNF) or Disjunctive Normal Form (DNF) — both used extensively in AI and automated theorem proving — relies entirely on the equivalence laws covered in this guide.
6.5 Formal Software Verification
In formal methods, pre-conditions and post-conditions of functions are expressed as logical propositions. Tools like model checkers and theorem provers (Coq, Isabelle, PVS) use logical equivalences to automatically verify that software satisfies its specification. This is used in safety-critical systems like medical devices, aircraft control software, and financial systems.
6.6 Cryptography
Many cryptographic protocols rely on problems that are computationally hard but logically verifiable. The SAT problem, which asks whether a propositional formula is satisfiable, underlies the hardness assumptions of many cryptographic systems. Logical equivalences are also used in the design and analysis of hash functions and encryption algorithms.
7. Exam Tips and Common Mistakes
7.1 Top Tips for BSc (Hons) Logic Examinations
1. ★ Memorize ALL equivalence laws. There is no substitute. Write each law on a flashcard: name on front, formula on back. The exam may ask you to cite laws by name at each step of a proof.
2. ★ Know when to use which proof method. For 2–3 variables, a truth table is fine. For complex formulas requiring algebraic manipulation, use the law substitution method. Examiners often specify which method to use, so read the question carefully.
3. ★ Work from the more complex side. In algebraic proofs, start from the more complicated expression and simplify toward the simpler one. Working in both directions and meeting in the middle can also work.
4. ★ Cite the law at every step. In formal proofs, every transformation must be justified by a named law. Losing marks for missing citations is unnecessary: make it a habit to always write the law name.
5. ★ Practice systematic truth table filling. Fill truth tables column by column, not row by row; this prevents errors. For n variables, always check that you have exactly 2ⁿ rows.
7.2 Common Mistakes and How to Avoid Them
Wrong: “p → q is equivalent to q → p (converse)”
Correct: p → q ≡ ¬q → ¬p (contrapositive). The converse is NOT equivalent.
Wrong: ¬(p ∧ q) ≡ ¬p ∧ ¬q (forgetting to change AND to OR)
Correct: ¬(p ∧ q) ≡ ¬p ∨ ¬q. Always flip the connective when applying De Morgan’s!
Wrong: If p → q, then q → p
Correct: Implication is NOT symmetric. p → q is false only when p is true and q is false. q → p is a different statement entirely (the converse).
For n variables, always use 2ⁿ rows. Students often forget to count all distinct variables. Remember: p, q, r means 2³ = 8 rows, not 6 or 9.
The logical equivalence symbol ≡ is a metalogical relation (a statement about formulas), not a connective inside a formula. The biconditional ↔ is a connective that appears inside formulas. Using them interchangeably leads to category errors in proofs.
8. Frequently Asked Questions
Q: What is the difference between a proposition and a predicate?
A proposition is a statement that is either true or false (no variables). A predicate (studied in predicate/first-order logic) contains variables and becomes a proposition only when specific values are substituted for those variables. For example, “x > 5” is a predicate; “7 > 5” (after substituting x = 7) is a proposition.
Q: Can a statement with no variables be a tautology?
Yes. The statement “TRUE” (the constant T) is trivially a tautology. Also, expressions like “1 = 1” are propositions that are always true and have no variables. In propositional logic, however, we typically discuss tautologies in the context of formulas with propositional variables.
Q: Why is p → q true when p is false?
This is called vacuous truth. If you promise “If it rains, I will bring an umbrella” and it does not rain, you have not broken your promise regardless of whether you have an umbrella or not. The promise is only broken if it rains AND you don’t bring an umbrella. Logically, a false hypothesis means the conditional imposes no constraint, so it is vacuously (trivially) true.
Q: Is a tautology the same thing as a logical equivalence?
Not exactly. A logical equivalence is a relationship between two formulas (P ≡ Q), established by showing P ↔ Q is a tautology. So every logical equivalence corresponds to a tautological biconditional, but a tautology by itself (like p ∨ ¬p) is not the same as an equivalence — it is simply a formula that is always true.
Q: What are CNF and DNF?
Conjunctive Normal Form (CNF) is a conjunction of disjunctions (AND of ORs). Disjunctive Normal Form (DNF) is a disjunction of conjunctions (OR of ANDs). Any propositional formula can be converted to CNF or DNF using the equivalence laws. These forms are crucial for automated theorem proving (SAT solvers work on CNF) and circuit design (sum of products / product of sums).
Q: What does it mean for a set of connectives to be functionally complete?
A set of connectives is functionally complete if every possible truth function can be expressed using only connectives from that set. The sets {¬, ∧}, {¬, ∨}, and {¬, →} are all functionally complete. Remarkably, the single connective NAND (NOT AND, written ↑) alone is functionally complete — this is why NAND gates are so important in digital circuit design.
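A quick sketch showing NAND expressing NOT, AND, and OR (the helper names are mine), which is the core of the functional-completeness claim:

```python
def nand(p, q):
    return not (p and q)

# Build the other connectives from NAND alone.
def not_(p):    return nand(p, p)
def and_(p, q): return nand(nand(p, q), nand(p, q))
def or_(p, q):  return nand(nand(p, p), nand(q, q))

for p in (True, False):
    for q in (True, False):
        assert not_(p) == (not p)
        assert and_(p, q) == (p and q)
        assert or_(p, q) == (p or q)
```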
Q: What is the Principle of Duality?
The Principle of Duality states that if a logical equivalence is valid, then its dual (obtained by replacing all ∧ with ∨, all ∨ with ∧, all T with F, and all F with T) is also a valid equivalence. This is why the identity, domination, De Morgan, distributive, and complement laws always come in pairs.
Q: How is propositional logic related to set theory?
Propositional logic and set theory are deeply connected through Boolean algebra. The connective ∧ (AND) corresponds to set intersection ∩, ∨ (OR) corresponds to set union ∪, and ¬ (NOT) corresponds to set complement. De Morgan’s Laws hold identically in both systems. This parallel is not a coincidence — both are models of Boolean algebra.
✅ Key Takeaways for BSc (Hons) Students
- A proposition is a declarative statement that is either true or false — questions, commands, and open sentences are NOT propositions.
- The five primary logical connectives are: Negation (¬), Conjunction (∧), Disjunction (∨), Implication (→), and Biconditional (↔). Each has a specific truth table definition.
- A tautology is always TRUE; a contradiction is always FALSE; a contingency is sometimes TRUE and sometimes FALSE.
- Two propositions are logically equivalent (P ≡ Q) if their biconditional P ↔ Q is a tautology — they produce identical truth values in every row of the truth table.
- The essential equivalence laws are: Identity, Domination, Idempotent, Double Negation, Commutative, Associative, Distributive, De Morgan’s, Absorption, and Implication Elimination.
- De Morgan’s Laws: ¬(p∧q) ≡ ¬p∨¬q and ¬(p∨q) ≡ ¬p∧¬q. The connective always flips when negation is pushed inside.
- To prove logical equivalence: use truth tables (small formulas) or the algebraic law substitution method (complex formulas), citing a law at every step.
- The contrapositive (¬q → ¬p) is always equivalent to the original implication (p → q). Use this in proofs when the contrapositive is easier to verify.
- Propositional logic has direct applications in programming, circuit design, database queries, AI knowledge representation, and formal software verification.
🎓 Conclusion
Logic is not just one topic among many in your BSc (Hons) Computer Science syllabus — it is the mathematical language through which all of computer science is expressed. Every algorithm you analyze, every circuit you design, every program you write, and every system you verify rests on the propositional logic concepts covered in this guide.
Start by mastering the truth tables for all five connectives. Then memorize the equivalence laws — especially De Morgan’s Laws and the implication elimination. Practice proving equivalences both by truth table and algebraically, citing laws at every step. With these skills solidly in place, you will find that predicate logic, set theory, Boolean algebra, and formal methods all feel like natural extensions of what you have learned here.
Keep practicing, stay consistent, and revisit this guide before your examinations. Logic rewards repetition — each time you review these concepts, you will discover new depth and new connections to other areas of computer science.