Cheatsheet: Theory of Computation

1. Finite Automata

1.1 Basic Definitions

  • Alphabet (Σ): finite, non-empty set of symbols
  • String: finite sequence of symbols from Σ; ε denotes the empty string
  • Σ*: set of all strings over Σ, including ε
  • Language: any subset of Σ*

1.2 Deterministic Finite Automaton (DFA)

  • M = (Q, Σ, δ, q₀, F)
  • Q: finite set of states; Σ: input alphabet
  • δ: Q × Σ → Q (transition function)
  • q₀ ∈ Q: start state; F ⊆ Q: set of accept (final) states

1.2.1 DFA Properties

  • Exactly one transition for each state-symbol pair
  • No ε-transitions allowed
  • Total number of transitions = |Q| × |Σ|
  • For an n-state NFA, the equivalent DFA can have up to 2ⁿ states (subset construction).
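The subset construction mentioned above can be sketched as follows. The NFA used here (state names `q0`, `q1`, `q2` are hypothetical) accepts binary strings ending in "01"; each DFA state is a frozenset of NFA states, so at most 2ⁿ DFA states can arise.

```python
from itertools import chain

def subset_construction(nfa_delta, start, accepts, alphabet):
    """Convert an NFA (dict: (state, symbol) -> set of states) to a DFA.

    DFA states are frozensets of NFA states; returns (delta, start, accept_sets).
    """
    start_set = frozenset([start])
    dfa_delta = {}
    seen = {start_set}
    worklist = [start_set]
    while worklist:
        S = worklist.pop()
        for a in alphabet:
            # All NFA states reachable from any state in S on symbol a
            T = frozenset(chain.from_iterable(nfa_delta.get((q, a), ()) for q in S))
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                worklist.append(T)
    dfa_accepts = {S for S in seen if S & set(accepts)}
    return dfa_delta, start_set, dfa_accepts

def dfa_accepts_string(delta, s0, accept_sets, w):
    """Run the constructed DFA on string w."""
    S = s0
    for c in w:
        S = delta[(S, c)]
    return S in accept_sets

# Hypothetical 3-state NFA for binary strings ending in "01"
nfa = {("q0", "0"): {"q0", "q1"}, ("q0", "1"): {"q0"}, ("q1", "1"): {"q2"}}
delta, s0, accepts = subset_construction(nfa, "q0", {"q2"}, "01")
```

Here the 3-state NFA yields only 3 reachable DFA states, well below the 2³ = 8 worst-case bound.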

1.3 Non-deterministic Finite Automaton (NFA)

  • M = (Q, Σ, δ, q₀, F) with δ: Q × Σ → 2^Q
  • Accepts w if some sequence of choices ends in a state of F

1.3.1 NFA Properties

  • Multiple transitions possible for same state-symbol pair
  • Can have 0, 1, or more transitions
  • Accepts if at least one path leads to final state
  • For n-state NFA, equivalent DFA can have up to 2ⁿ states
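The "at least one path accepts" rule can be simulated directly by tracking the set of states reachable so far, without building the DFA. A minimal sketch (the NFA below, accepting strings containing "ab", is hypothetical):

```python
def nfa_accepts(delta, start, accept, w):
    """Simulate an NFA by tracking the set of currently reachable states.

    delta: dict mapping (state, symbol) -> set of successor states.
    Accepts iff at least one computation path ends in an accept state.
    """
    current = {start}
    for c in w:
        current = set().union(*(delta.get((q, c), set()) for q in current))
        if not current:          # every path died on this symbol
            return False
    return bool(current & accept)

# Hypothetical NFA over {a, b} accepting strings that contain "ab":
# state s loops and guesses when an 'a' starts the pattern.
delta = {
    ("s", "a"): {"s", "t"},   # non-deterministic guess
    ("s", "b"): {"s"},
    ("t", "b"): {"f"},
    ("f", "a"): {"f"},
    ("f", "b"): {"f"},
}
```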

1.4 NFA with ε-transitions (ε-NFA)

  • Transition function: δ: Q × (Σ ∪ {ε}) → 2^Q
  • ε-closure(q): Set of states reachable from q using only ε-transitions
  • ε-NFA, NFA, and DFA are equivalent in power
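The ε-closure defined above is a plain graph reachability computation over ε-edges only. A minimal sketch, encoding ε as the empty string "" (the transition fragment is hypothetical):

```python
def epsilon_closure(delta, states):
    """Set of states reachable from `states` using only ε-transitions.

    delta: dict mapping (state, symbol) -> set of states; ε is encoded as "".
    """
    closure = set(states)
    stack = list(states)
    while stack:
        q = stack.pop()
        for r in delta.get((q, ""), set()):
            if r not in closure:
                closure.add(r)
                stack.append(r)
    return closure

# Hypothetical ε-NFA fragment: q0 -ε-> q1 -ε-> q2, and q1 -a-> q3
delta = {("q0", ""): {"q1"}, ("q1", ""): {"q2"}, ("q1", "a"): {"q3"}}
```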

1.5 Conversion Algorithms

  • ε-NFA → NFA: eliminate ε-moves using ε-closures
  • NFA → DFA: subset construction (each DFA state is a set of NFA states)
  • DFA minimization: merge indistinguishable states (partition refinement)

1.6 Closure Properties of Regular Languages

  • Closed under: union, intersection, complement, difference, concatenation, Kleene star, reversal, homomorphism, inverse homomorphism

2. Regular Expressions and Languages

2.1 Regular Expression Operators

  • Union: r₁ + r₂ (also written r₁ | r₂)
  • Concatenation: r₁r₂
  • Kleene star: r*
  • Precedence: star > concatenation > union

2.2 Regular Expression Laws

  • Associativity: (r₁ + r₂) + r₃ = r₁ + (r₂ + r₃); (r₁r₂)r₃ = r₁(r₂r₃)
  • Commutativity: r₁ + r₂ = r₂ + r₁ (union only, not concatenation)
  • Distributivity: r₁(r₂ + r₃) = r₁r₂ + r₁r₃
  • Identity: r + ∅ = r; rε = εr = r; r∅ = ∅r = ∅
  • Idempotence: r + r = r; (r*)* = r* (so r** = r*); also r* = ε + r⁺
  • Absorption: r*r* = r*; ε* = ε; ∅* = ε

2.3 Arden's Theorem

  • If R = Q + RP where ε ∉ L(P), then R = QP*
  • Used to convert FA to regular expression
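As a worked example of using Arden's Theorem (the two-state DFA below is hypothetical; it accepts binary strings ending in 1, with A the start state and B the accept state, where δ(A,0)=A, δ(A,1)=B, δ(B,0)=A, δ(B,1)=B):

```latex
\begin{align*}
A &= \varepsilon + A0 + B0, \qquad B = A1 + B1 \\
B &= A1 + B1 \;\Rightarrow\; B = A1\,1^{*}
   && \text{(Arden: } R = Q + RP \Rightarrow R = QP^{*}\text{)} \\
A &= \varepsilon + A0 + A1\,1^{*}0 = \varepsilon + A\,(0 + 11^{*}0)
   \;\Rightarrow\; A = (0 + 11^{*}0)^{*} \\
L &= B = (0 + 11^{*}0)^{*}\,1\,1^{*}
\end{align*}
```

Each state variable denotes the set of strings that lead the DFA from the start state to that state; solving the equations for the accept state yields the regular expression for L.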

2.4 Regular Expression to NFA (Thompson's Construction)

  • Builds an ε-NFA inductively from the structure of the regular expression
  • Each sub-automaton has one start state and one accept state
  • For a regular expression of length n, the resulting ε-NFA has O(n) states

2.5 Pumping Lemma for Regular Languages

  • If L is regular, then ∃ pumping length p ≥ 1 such that:
  • Every string s ∈ L with |s| ≥ p can be divided as s = xyz where:
    1. |xy| ≤ p
    2. |y| ≥ 1
    3. ∀ i ≥ 0, xyⁱz ∈ L
  • Used to prove languages are NOT regular (proof by contradiction)

2.6 Non-Regular Languages Examples

  • L = {aⁿbⁿ | n ≥ 0}
  • L = {aⁿ | n is prime}
  • L = {ww | w ∈ {a,b}*} (over a unary alphabet this language is regular)
  • L = {aⁿbⁿcⁿ | n ≥ 0}
  • L = {aⁿ² | n ≥ 0}
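A standard pumping-lemma argument for the first example above:

```latex
% Claim: L = \{a^n b^n \mid n \ge 0\} is not regular.
% Assume L is regular with pumping length p, and take s = a^p b^p \in L.
\text{Any split } s = xyz \text{ with } |xy| \le p,\ |y| \ge 1
\text{ forces } y = a^k \text{ for some } k \ge 1.
\text{Pumping with } i = 2 \text{ gives } xy^2z = a^{p+k} b^p \notin L,
\text{ contradicting the lemma; hence } L \text{ is not regular.}
```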

2.7 Myhill-Nerode Theorem

  • L is regular iff the number of equivalence classes of ≡_L is finite
  • x ≡_L y iff ∀z ∈ Σ*, xz ∈ L ⇔ yz ∈ L
  • Number of equivalence classes = number of states in minimal DFA
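A short illustration of the theorem in use:

```latex
% Example: L = \{a^n b^n \mid n \ge 0\}. For i \ne j, the strings a^i and a^j
% are distinguished by z = b^i, since a^i b^i \in L but a^j b^i \notin L.
% Hence \equiv_L has infinitely many equivalence classes, so L is not regular.
```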

3. Context-Free Grammars

3.1 CFG Definition

  • G = (V, T, P, S)
  • V: non-terminals; T: terminals (V ∩ T = ∅)
  • P: productions of the form A → α, where A ∈ V and α ∈ (V ∪ T)*
  • S ∈ V: start symbol

3.2 Derivations

  • α ⇒ β: one production applied to a sentential form
  • Leftmost derivation: always expand the leftmost non-terminal
  • Rightmost derivation: always expand the rightmost non-terminal
  • L(G) = {w ∈ T* | S ⇒* w}

3.3 Parse Trees

  • Root: start symbol S
  • Internal nodes: non-terminals
  • Leaves: terminals or ε
  • Yield: concatenation of leaves from left to right
  • Each derivation corresponds to a parse tree

3.4 Ambiguity

  • Grammar is ambiguous if ∃ string w with two or more distinct parse trees
  • Equivalently: two or more distinct leftmost (or rightmost) derivations
  • Given a CFG, it is undecidable whether the grammar is ambiguous.
  • Some CFLs are inherently ambiguous (every grammar for the language is ambiguous)

3.5 Simplification of CFG

3.5.1 Removal Steps (in order)

  • 1. Remove ε-productions: A → ε (except S → ε if ε ∈ L)
  • 2. Remove unit productions: A → B where A, B ∈ V
  • 3. Remove useless symbols: non-generating and unreachable symbols
  • Order matters: first eliminate non-generating, then unreachable

3.5.2 Normal Forms

  • Chomsky Normal Form (CNF): A → BC or A → a (S → ε allowed if ε ∈ L)
  • Greibach Normal Form (GNF): A → aα, where a ∈ T and α ∈ V*

3.6 Closure Properties of CFLs

  • Closed under: union, concatenation, Kleene star, reversal, substitution, homomorphism, intersection with a regular language
  • NOT closed under: intersection, complement, difference

3.7 Pumping Lemma for CFLs

  • If L is CFL, then ∃ pumping length p such that:
  • Every string s ∈ L with |s| ≥ p can be divided as s = uvxyz where:
  • 1. |vxy| ≤ p
  • 2. |vy| ≥ 1
  • 3. ∀ i ≥ 0, uvⁱxyⁱz ∈ L
  • Used to prove languages are NOT context-free

3.8 Non-CFL Examples

  • L = {aⁿbⁿcⁿ | n ≥ 0}
  • L = {ww | w ∈ {a,b}*}
  • L = {aⁿ | n is prime}
  • L = {aⁿ² | n ≥ 0}

4. Pushdown Automata

4.1 PDA Definition

  • M = (Q, Σ, Γ, δ, q₀, Z₀, F)
  • Γ: stack alphabet; Z₀ ∈ Γ: initial stack symbol
  • δ: Q × (Σ ∪ {ε}) × Γ → finite subsets of Q × Γ*

4.2 Acceptance Modes

  • By final state: input consumed and PDA is in a state of F
  • By empty stack: input consumed and the stack is empty
  • Both modes are equivalent in power (can convert between them)

4.3 DPDA vs NPDA

  • DPDA: at most one applicable move in every configuration (including ε-moves)
  • DPDA recognizes DCFLs, NPDA recognizes CFLs, and DCFL ⊂ CFL.
  • DCFLs are closed under complement; CFLs are not

4.4 CFG and PDA Equivalence

  • For every CFG, there exists an equivalent NPDA
  • For every NPDA, there exists an equivalent CFG
  • PDA accepts exactly the class of context-free languages

5. Turing Machines

5.1 Turing Machine Definition

  • M = (Q, Σ, Γ, δ, q₀, q_accept, q_reject)
  • Γ: tape alphabet (Σ ⊆ Γ; blank symbol ∈ Γ \ Σ)
  • δ: Q × Γ → Q × Γ × {L, R}

5.2 TM Characteristics

  • Infinite tape (unbounded on both or one side)
  • Can read, write, and move left or right
  • Can halt in accepting, rejecting, or non-halting state
  • Computation may not halt (loop forever)

5.3 TM Variants

  • Multi-tape, multi-track, multi-head TMs
  • Two-way infinite tape; non-deterministic TM
  • All variants are equivalent in computational power

5.4 Church-Turing Thesis

  • Any intuitively computable function is Turing-computable
  • TM captures the notion of algorithmic computation
  • Cannot be proven (thesis, not theorem)

5.5 Language Classification by TM

  • Recursive (REC, decidable): some TM halts on every input and accepts exactly L
  • Recursively enumerable (RE): some TM accepts exactly L (may loop on w ∉ L)
  • REC = RE ∩ Co-RE
  • REC ⊂ RE
  • RE languages are closed under union, intersection, concatenation, Kleene star
  • RE languages are NOT closed under complement
  • Recursive languages are closed under complement, union, intersection

5.6 Decidability

  • Decidable: membership, emptiness, finiteness, and equivalence for DFAs; membership and emptiness for CFGs
  • Undecidable: equivalence and ambiguity for CFGs; CFL universality (L(G) = Σ*); halting and most other TM properties

5.7 Halting Problem

  • Problem: Given TM M and input w, does M halt on w?
  • Undecidable (proven by diagonalization)
  • L_H = {⟨M,w⟩ | M halts on w} is RE but not recursive
  • Complement of L_H is not RE

5.8 Rice's Theorem

  • Any non-trivial property of RE languages is undecidable
  • Non-trivial: some RE languages have it, some don't
  • Property: depends only on language recognized, not on TM description

6. Chomsky Hierarchy

6.1 Grammar Types

  • Type 0 (unrestricted): α → β with α containing a non-terminal — recognized by TM
  • Type 1 (context-sensitive): |α| ≤ |β| — recognized by LBA
  • Type 2 (context-free): A → α with A ∈ V — recognized by PDA
  • Type 3 (regular): A → aB or A → a (right-linear) — recognized by FA

6.2 Language Hierarchy

  • Type 3 ⊂ Type 2 ⊂ Type 1 ⊂ Type 0
  • Regular ⊂ CFL ⊂ CSL ⊂ RE
  • All inclusions are strict (proper subsets)

6.3 Linear Bounded Automaton (LBA)

  • TM with tape length bounded by c·n where n is input length, c is constant
  • Accepts exactly context-sensitive languages
  • LBA acceptance problem is decidable
  • LBA emptiness problem is undecidable

7. Complexity Classes

7.1 Time Complexity Classes

  • P: decidable in polynomial time by a deterministic TM
  • NP: decidable in polynomial time by a non-deterministic TM (equivalently, has polynomial-time verifiable certificates)
  • Co-NP: complements of NP languages
  • NP-Hard: every NP problem reduces to it in polynomial time
  • NP-Complete: in NP and NP-Hard
  • EXPTIME: decidable in time 2^poly(n)

7.2 Relationships

  • P ⊆ NP
  • P ⊆ Co-NP
  • NP-Complete ⊆ NP-Hard
  • If any NP-Complete problem is in P, then P = NP
  • P = NP question remains open

7.3 NP-Complete Problems

  • SAT (Boolean Satisfiability) - first proven NP-Complete (Cook's theorem)
  • 3-SAT
  • Clique
  • Vertex Cover
  • Hamiltonian Cycle
  • Traveling Salesman Problem (decision version)
  • Subset Sum
  • Knapsack (decision version)
  • Graph Coloring

7.4 Polynomial-Time Reduction

  • Language L₁ ≤_p L₂ if ∃ polynomial-time computable function f: Σ* → Σ* such that x ∈ L₁ iff f(x) ∈ L₂
  • Used to prove NP-Completeness: reduce known NP-Complete problem to new problem
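As a concrete (and deliberately tiny) illustration of the reduction pattern: G has an independent set of size k iff the complement graph has a clique of size k, so INDEPENDENT-SET ≤_p CLIQUE. The sketch below pairs the polynomial-time transformation with a brute-force CLIQUE checker used only to verify the iff on a toy instance.

```python
from itertools import combinations

def independent_set_to_clique(n, edges, k):
    """Polynomial-time reduction f: (G, k) -> (G', k), where G' is the
    complement graph. G has an independent set of size k iff G' has a
    clique of size k."""
    edge_set = {frozenset(e) for e in edges}
    complement = [(u, v) for u, v in combinations(range(n), 2)
                  if frozenset((u, v)) not in edge_set]
    return n, complement, k

def has_clique(n, edges, k):
    """Brute-force CLIQUE decider (exponential; for tiny instances only)."""
    edge_set = {frozenset(e) for e in edges}
    return any(all(frozenset(p) in edge_set for p in combinations(S, 2))
               for S in combinations(range(n), k))

# Toy instance: path graph 0-1-2-3; {0, 2} is an independent set of size 2,
# so the complement graph must contain a 2-clique.
n, comp_edges, k = independent_set_to_clique(4, [(0, 1), (1, 2), (2, 3)], 2)
```

Note the reduction itself runs in O(n²); only the verification step is exponential, which is exactly the asymmetry the reduction technique exploits.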

7.5 Space Complexity Classes

  • L: O(log n) space, deterministic
  • NL: O(log n) space, non-deterministic
  • PSPACE: polynomial space, deterministic
  • NPSPACE: polynomial space, non-deterministic

7.6 Space-Time Relationships

  • L ⊆ NL ⊆ P ⊆ NP ⊆ PSPACE ⊆ NPSPACE ⊆ EXPTIME
  • Savitch's Theorem: PSPACE = NPSPACE
  • PSPACE-Complete problems exist

8. Key Theorems and Results

8.1 Fundamental Theorems

  • Kleene's Theorem: regular expressions, DFAs, and NFAs define exactly the same languages
  • Myhill-Nerode Theorem (Section 2.7)
  • Rice's Theorem (Section 5.8)
  • Cook-Levin Theorem: SAT is NP-Complete
  • Savitch's Theorem: NSPACE(f(n)) ⊆ DSPACE(f(n)²)

8.2 Conversion Time Complexities

  • RE (length n) → ε-NFA: O(n) states (Thompson's construction)
  • NFA (n states) → DFA: up to 2ⁿ states (subset construction)
  • DFA minimization: O(n log n) (Hopcroft's algorithm)

8.3 Decision Algorithm Complexities

  • DFA membership: O(|w|)
  • DFA emptiness: O(|Q|·|Σ|) (reachability from the start state)
  • DFA equivalence: decidable via minimization or product construction
  • CFG membership (CNF): O(n³) via the CYK algorithm

9. Important Formulas and Counts

9.1 State and Transition Counts

  • DFA transitions: exactly |Q| × |Σ|
  • NFA transitions are defined by a transition relation; no fixed upper bound exists.
  • Minimal DFA states for L: equals number of Myhill-Nerode equivalence classes
  • Product construction (intersection/union): |Q₁| × |Q₂| states
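The product construction counted above can be sketched directly; the two DFAs below (even number of a's; at least one b) are hypothetical examples, and the product state set is the full |Q₁| × |Q₂| grid.

```python
from itertools import product

def product_dfa(dfa1, dfa2, alphabet, mode="intersection"):
    """Product construction on two DFAs, each given as (delta, start, accepts).

    Builds |Q1| x |Q2| pair states; accepts when both components accept
    (intersection) or when either does (union).
    """
    (t1, s1, F1), (t2, s2, F2) = dfa1, dfa2
    # Recover each DFA's state set from its transition table
    Q1 = {s1} | {q for (q, _) in t1} | set(t1.values())
    Q2 = {s2} | {q for (q, _) in t2} | set(t2.values())
    delta = {((p, q), a): (t1[(p, a)], t2[(q, a)])
             for p, q in product(Q1, Q2) for a in alphabet}
    if mode == "intersection":
        accepts = {(p, q) for p, q in product(Q1, Q2) if p in F1 and q in F2}
    else:
        accepts = {(p, q) for p, q in product(Q1, Q2) if p in F1 or q in F2}
    return delta, (s1, s2), accepts

def run(dfa, w):
    """Run a DFA given as (delta, start, accepts) on string w."""
    delta, s, F = dfa
    for c in w:
        s = delta[(s, c)]
    return s in F

# Hypothetical DFAs over {a, b}: even number of a's; at least one b
even_a = ({("e", "a"): "o", ("e", "b"): "e",
           ("o", "a"): "e", ("o", "b"): "o"}, "e", {"e"})
has_b = ({("0", "a"): "0", ("0", "b"): "1",
          ("1", "a"): "1", ("1", "b"): "1"}, "0", {"1"})
both = product_dfa(even_a, has_b, "ab")
```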

9.2 String and Language Operations

  • |Σⁿ| = |Σ|ⁿ (number of strings of length exactly n)
  • |Σ≤ⁿ| = (|Σ|^(n+1) - 1)/(|Σ| - 1) (total strings of length ≤ n, valid for |Σ| ≥ 2; when |Σ| = 1 the count is n + 1)
  • |wᴿ| = |w| (reversal preserves length)
  • |w₁w₂| = |w₁| + |w₂| (concatenation)
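The count-of-strings formula above is a geometric series and can be cross-checked against direct summation:

```python
def count_upto(sigma_size, n):
    """Number of strings of length <= n over an alphabet of size >= 2,
    using the closed-form geometric-series formula."""
    return (sigma_size ** (n + 1) - 1) // (sigma_size - 1)

# Direct summation of |Sigma|^k for k = 0..n for comparison
def count_upto_naive(sigma_size, n):
    return sum(sigma_size ** k for k in range(n + 1))
```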

9.3 Parse Tree Heights

  • CNF grammar: a parse tree for a string of length n has height at least about log₂ n (balanced tree) and at most n (skewed chain); the derivation uses exactly 2n − 1 steps
  • Yield of parse tree with height h in CNF: at most 2^h terminals

9.4 Computation Bounds

  • 1-tape TM simulating a k-tape TM: time increases by at most a quadratic factor, O(t²)
  • Non-deterministic TM simulating deterministic TM: time remains same
  • Deterministic TM simulating non-deterministic TM: exponential time increase possible

10. Exam Tips and Quick Checks

10.1 Proving Non-Regularity

  • Use Pumping Lemma: assume regular, derive contradiction
  • Use Myhill-Nerode: show infinite equivalence classes
  • Use closure properties: if L₁ ∩ L₂ is non-regular and L₂ is regular, then L₁ is non-regular

10.2 Proving Non-CFL

  • Use Pumping Lemma for CFLs
  • Use closure properties: CFLs not closed under intersection or complement
  • Intersection with regular language: if L ∩ R is non-CFL and R is regular, then L is non-CFL

10.3 Grammar Design Strategy

  • Regular: use right-linear or left-linear productions
  • CFL: use recursive productions; balance with non-terminals
  • CNF conversion: eliminate ε-productions, unit productions, then convert

10.4 Automata Design Strategy

  • DFA: track necessary information in states (equivalence classes)
  • NFA: use non-determinism to guess; verify incrementally
  • PDA: use stack for counting/matching; mirror for palindromes
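The "use stack for counting/matching" strategy can be sketched as a deterministic PDA for {aⁿbⁿ | n ≥ 0}, pushing a marker per a and popping one per b, accepting by empty stack (this toy simulator is an illustration, not a general PDA engine):

```python
def pda_anbn(w):
    """Deterministic PDA sketch for {a^n b^n | n >= 0}: push X for each 'a',
    pop one X for each 'b'; accept by empty stack with input consumed."""
    stack = []
    phase = "push"                 # state: still reading a's
    for c in w:
        if c == "a":
            if phase != "push":
                return False       # 'a' after a 'b' -- reject
            stack.append("X")
        elif c == "b":
            phase = "pop"          # state: now reading b's
            if not stack:
                return False       # more b's than a's
            stack.pop()
        else:
            return False           # symbol outside {a, b}
    return not stack               # empty stack <=> equal counts
```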

10.5 Reduction Strategy for NP-Completeness

  • Choose appropriate known NP-Complete problem
  • Define polynomial-time transformation
  • Prove: x ∈ L₁ ⟺ f(x) ∈ L₂
  • Verify transformation runs in polynomial time
FAQs on Cheatsheet: Theory of Computation

1. What is a finite automaton and how does it function?
Ans. A finite automaton (FA) is a theoretical model of computation used to design and analyse the behaviour of systems. It consists of a finite number of states, transitions between these states, an initial state, and a set of accept states. The automaton processes input strings of symbols, transitioning between states according to predefined rules. It accepts an input string if it ends in an accept state after processing all symbols. There are two main types of finite automata: deterministic (DFA) and nondeterministic (NFA), with DFAs having a single unique transition for each input symbol from any state, while NFAs may have multiple possible transitions.
2. How are regular expressions related to finite automata?
Ans. Regular expressions are a formal way to describe regular languages, which can be represented by finite automata. Each regular expression can be converted into an equivalent finite automaton that accepts the same language. The connection lies in the ability of both to define patterns of strings over a given alphabet. Regular expressions use operators such as union, concatenation, and Kleene star to denote complex patterns, while finite automata provide a graphical representation of how these patterns can be recognised through state transitions.
3. What is a pushdown automaton and how does it differ from finite automata?
Ans. A pushdown automaton (PDA) is a type of automaton that includes a stack, which allows it to store an unbounded amount of information. This feature enables PDAs to recognise context-free languages, which cannot be recognised by finite automata. The stack allows PDAs to handle nested structures, such as parentheses in expressions. While finite automata rely solely on their states to process input, PDAs use both their states and the stack to control transitions, making them more powerful in terms of the languages they can accept.
4. Can you explain the Chomsky hierarchy and its significance in formal languages?
Ans. The Chomsky hierarchy is a classification of formal languages into four types based on their generative power: Type 0 (recursively enumerable languages), Type 1 (context-sensitive languages), Type 2 (context-free languages), and Type 3 (regular languages). Each type of language can be generated by a corresponding grammatical framework. The hierarchy is significant as it helps in understanding the relationships between different classes of languages and the computational power of various models, such as Turing machines, context-free grammars, and finite automata. It also provides insights into what can be computed or recognised by different computational models.
5. What are complexity classes, and why are they important in theoretical computer science?
Ans. Complexity classes are categories used to classify computational problems based on the resources required to solve them, such as time and space. Common complexity classes include P (problems solvable in polynomial time), NP (nondeterministic polynomial time), NP-complete, and PSPACE. Understanding these classes is crucial in theoretical computer science because they help determine the efficiency and feasibility of algorithms. They also play a significant role in identifying problems that are inherently difficult to solve and understanding the limits of what can be computed within reasonable time constraints.