Table of contents
Multiple Choice Questions (MCQs)
Fill in the Blanks
True or False
Short Answer Questions
Long Answer Questions

Multiple Choice Questions (MCQs)
Q.1: What is the primary focus of Natural Language Processing (NLP)?
a) Enabling computers to understand human language
b) Processing numerical data for predictions
c) Identifying symbols in images
d) Developing computer vision algorithms
Ans: a) Enabling computers to understand human language
The primary focus of NLP is to enable computers to understand and interpret human language, which distinguishes it from other areas like data processing or computer vision.
Q.2: Which of the following is NOT a domain of AI mentioned in the content?
a) Data Science
b) Computer Vision
c) Natural Language Processing
d) Robotics
Ans: d) Robotics
Robotics is generally treated as a separate field from the three AI domains mentioned in the content: Data Science, Computer Vision, and Natural Language Processing.
Q.3: Which application of NLP involves assigning predefined categories to documents?
a) Automatic Summarization
b) Sentiment Analysis
c) Text Classification
d) Virtual Assistants
Ans: c) Text Classification
Text Classification specifically refers to the process of categorizing documents into predefined groups, distinguishing it from other NLP applications like summarization or sentiment analysis.
Q.4: What distinguishes a Smart Bot from a Script Bot?
a) Smart Bots are easier to create
b) Smart Bots have limited language processing
c) Smart Bots can learn and improve over time
d) Smart Bots are based on predefined scripts
Ans: c) Smart Bots can learn and improve over time
Smart Bots utilize machine learning to adapt and enhance their performance over time, unlike Script Bots, which rely solely on fixed scripts for responses.
Q.5: In the Bag of Words model, what does the term 'bag' imply?
a) The order of words matters
b) The model focuses on word frequency regardless of order
c) The model only processes stopwords
d) The model prioritizes sentence structure
Ans: b) The model focuses on word frequency regardless of order
The 'bag' in the Bag of Words model signifies that it analyzes word frequency without considering the sequence in which words appear, contrasting with models that do factor in order.
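To see why order does not matter, here is a minimal sketch (plain Python with collections.Counter, not a library named in the chapter) showing that two sentences with the same words in different orders produce identical word counts:

    # Minimal sketch: Bag of Words counts ignore word order.
    from collections import Counter

    sentence_a = "the cat chased the dog"
    sentence_b = "the dog chased the cat"

    bow_a = Counter(sentence_a.split())  # word -> frequency
    bow_b = Counter(sentence_b.split())

    print(bow_a)           # e.g. Counter({'the': 2, 'cat': 1, 'chased': 1, 'dog': 1})
    print(bow_a == bow_b)  # True: same frequencies, hence the same 'bag'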
Fill in the Blanks
Q6: Natural Language Processing (NLP) is a sub-field of __________.
Ans: Artificial Intelligence
Natural Language Processing (NLP) is a vital sub-field of Artificial Intelligence that aims to facilitate communication between humans and machines through understanding and processing human language.
Q7: The process of breaking a corpus into individual sentences is called __________.
Ans: Sentence Segmentation
Sentence Segmentation is a crucial step in text processing that involves dividing a larger text into its constituent sentences, enabling better analysis and understanding of the content.
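A minimal sketch of sentence segmentation, assuming a simple split on end-of-sentence punctuation (real tokenizers also handle abbreviations and other edge cases):

    # Minimal sketch: split a corpus into sentences at ., ! and ? boundaries.
    import re

    corpus = "AI is changing the world. NLP helps computers read text! Can machines understand us?"

    # Split wherever sentence-ending punctuation is followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", corpus.strip())
    print(sentences)
    # ['AI is changing the world.', 'NLP helps computers read text!', 'Can machines understand us?']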
Q8: In text normalization, common words like 'and' or 'the' that are removed are called __________.
Ans: Stopwords
Stopwords are frequently used words in a language that are often filtered out during text processing because they carry less meaning and can skew analysis results.
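A minimal sketch of stopword removal, assuming a small hand-written stopword list (NLP libraries ship much larger lists):

    # Minimal sketch: drop stopwords from a tokenized sentence.
    stopwords = {"and", "the", "is", "a", "to", "of"}

    tokens = "the weather is pleasant and the city is clean".split()

    # Keep only the words that carry meaning for later analysis.
    filtered = [word for word in tokens if word not in stopwords]
    print(filtered)  # ['weather', 'pleasant', 'city', 'clean']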
Q9: The __________ algorithm creates a vocabulary of unique words and their frequencies from a processed corpus.
Ans: Bag of Words
The Bag of Words model is a simplified representation used in NLP that organizes text data into a collection of unique words along with their occurrence frequencies, aiding in various text analysis tasks.
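A minimal sketch of this step, building the vocabulary of unique words and their frequencies from a small, already-normalized corpus (the documents below are made up for illustration):

    # Minimal sketch: vocabulary and word frequencies for Bag of Words.
    from collections import Counter

    corpus = [
        "aman and anil are stressed",
        "aman went to a therapist",
        "anil went to download a health chatbot",
    ]

    # Count every occurrence of every word across all documents.
    frequencies = Counter(word for doc in corpus for word in doc.split())
    vocabulary = sorted(frequencies)   # the unique words

    print(vocabulary)
    print(frequencies)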
Q10: The formula for TF-IDF is TF(W) * __________.
Ans: log(IDF(W))
The TF-IDF formula combines Term Frequency (TF) with Inverse Document Frequency (IDF) to assess the importance of a word in a document relative to a collection of documents, highlighting significant terms.
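A minimal sketch of this formula as a function, assuming the base-10 logarithm commonly used in worked examples (the function name and inputs here are illustrative, not from the chapter):

    # Minimal sketch of TF-IDF: TFIDF(W) = TF(W) * log(IDF(W)),
    # where IDF(W) = total documents / documents containing W.
    import math

    def tfidf(term_frequency, total_documents, documents_with_word):
        idf = total_documents / documents_with_word
        return term_frequency * math.log10(idf)  # base-10 log assumed

    print(round(tfidf(2, 10, 3), 3))  # ~1.046, matching Q.23 below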
True or False
Q.11: Data Science focuses on identifying symbols and patterns in images, while Computer Vision is based on mathematical and statistical principles.
Ans: False
Computer Vision primarily focuses on identifying symbols and patterns in images, whereas Data Science encompasses mathematical and statistical principles.
Q.12: Sentiment Analysis can classify sentiments as positive, negative, or neutral.
Ans: True
Sentiment Analysis effectively categorizes sentiments into positive, negative, or neutral classifications.
Q.13: Script Bots are highly adaptable and can handle complex tasks.
Ans: False
Script Bots are limited in functionality and operate based on predefined scripts, making them less adaptable.
Q.14: Syntax refers to the meaning of a sentence, while semantics refers to its grammatical structure.
Ans: False
Syntax pertains to the grammatical structure of sentences, while semantics deals with their meaning.
Q.15: In the Bag of Words model, the order of words in a document affects the output.
Ans: False
The Bag of Words model disregards word order and focuses solely on the frequency of words.
Short Answer Questions
Q.16: Define Natural Language Processing (NLP) in one sentence.
Ans: NLP is a sub-field of AI that enables computers to understand, process, and generate human language, both spoken and written.
Q.17: Name two real-life applications of NLP mentioned in the content.
Ans: Sentiment Analysis and Virtual Assistants.
Q.18: What is the difference between stemming and lemmatization in text normalization?
Ans: Stemming reduces words to their root form, which may not be meaningful (e.g., "caring" → "car"), while lemmatization reduces words to their meaningful base form (e.g., "caring" → "care").
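A minimal sketch contrasting the two, assuming NLTK is installed and its WordNet data can be downloaded; the exact stemmed forms depend on the stemmer used:

    # Minimal sketch: stemming vs lemmatization with NLTK.
    import nltk
    from nltk.stem import PorterStemmer, WordNetLemmatizer

    nltk.download("wordnet", quiet=True)  # lemmatizer needs WordNet data

    stemmer = PorterStemmer()
    lemmatizer = WordNetLemmatizer()

    for word in ["studies", "caring", "healed"]:
        # Stemming chops suffixes and may leave a non-word (e.g. 'studi');
        # lemmatization returns a valid dictionary form (e.g. 'study').
        print(word, "->", stemmer.stem(word), "|", lemmatizer.lemmatize(word, pos="v"))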
Q.19: Explain what a document vector represents in the Bag of Words model.
Ans: A document vector lists, for each word in the vocabulary, how often that word appears in a specific document; in the simplest case it is a row of 0s and 1s marking the absence or presence of each word.
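A minimal sketch of building document vectors against a fixed vocabulary, using word counts (a 0/1 presence vector is the simplest variant):

    # Minimal sketch: document vectors over a shared vocabulary.
    corpus = [
        "the traffic is heavy today",
        "the air is clean today",
    ]

    # Vocabulary: every unique word in the corpus, in a fixed order.
    vocabulary = sorted({word for doc in corpus for word in doc.split()})

    # Each document becomes one row of counts aligned with the vocabulary.
    vectors = [[doc.split().count(word) for word in vocabulary] for doc in corpus]

    print(vocabulary)  # ['air', 'clean', 'heavy', 'is', 'the', 'today', 'traffic']
    for row in vectors:
        print(row)     # [0, 0, 1, 1, 1, 1, 1] and [1, 1, 0, 1, 1, 1, 0]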
Q.20: What does Inverse Document Frequency (IDF) measure in the TF-IDF algorithm?
Ans: IDF measures the importance of a word by calculating the total number of documents divided by the number of documents containing the word, indicating how rare or common the word is across the corpus.
Long Answer Questions
Q.21: Explain how the human brain processes spoken language in a classroom setting, including how it prioritizes certain sounds.
Ans: In a classroom, the human brain continuously processes sounds it hears, such as the teacher’s lesson, by converting sound waves from the speaker’s mouth into neuron impulses via the eardrum. These impulses are transported to the brain, which interprets their meaning and stores clear signals or seeks clarification if unclear. For example, if a friend whispers during the lesson, the brain may prioritize the friend’s conversation over the teacher’s speech based on interest, demonstrating its ability to shift focus and process multiple sounds simultaneously while prioritizing relevant ones.
Q.22: Discuss the challenges computers face in understanding human language, focusing on syntax and semantics, with an example from the content.
Ans: Computers face challenges in understanding human language due to the complexity of syntax (grammatical structure) and semantics (meaning). Unlike human brains, which intuitively process context, computers require structured data (numbers) and struggle with ambiguous word arrangements and meanings. For instance, the content provides the example: "His face turned red after he found out that he had taken the wrong bag." The word "red" could imply shame (from an honest mistake) or anger (from failing to steal), depending on context. Computers lack the intuitive contextual understanding humans develop through experience, making it difficult to interpret such ambiguities without advanced algorithms like NLP to analyze syntax (sentence structure) and semantics (contextual meaning).
Q.23: Calculate the TF-IDF for the word "pollution" given: total documents = 10, documents containing "pollution" = 3, and TF(pollution) = 2 for a specific document. Show all steps.
Ans: To calculate TF-IDF for "pollution":
Step 1: Compute IDF(pollution) = total documents / documents containing the word = 10 / 3 ≈ 3.33.
Step 2: Apply the formula TFIDF(W) = TF(W) * log(IDF(W)): TFIDF(pollution) = 2 * log(3.33).
Step 3: Using a base-10 logarithm, log(3.33) ≈ 0.52, so TFIDF(pollution) ≈ 2 * 0.52 ≈ 1.05.
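A quick check of this arithmetic, assuming the same base-10 logarithm as in the sketch after Q.10:

    # Quick check of the Q.23 result (base-10 log assumed).
    import math
    print(2 * math.log10(10 / 3))  # ≈ 1.046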
Q.24: A company wants to implement a chatbot for customer service. Should they choose a Script Bot or a Smart Bot? Justify your recommendation based on their features and potential use cases.
Ans: For customer service, a Smart Bot is recommended over a Script Bot. Smart Bots are adaptable, powerful, and capable of learning from interactions, making them suitable for handling diverse and complex customer queries. They use extensive databases and advanced NLP to provide accurate, context-aware responses, which is crucial for addressing varied customer needs (e.g., troubleshooting, product inquiries, or personalized recommendations). For example, Smart Bots like Google Assistant or Alexa can manage a wide range of tasks, improving over time with user interactions. In contrast, Script Bots are easier to create, cheaper, and follow predefined scripts, but their limited language processing and functionality restrict them to basic, repetitive tasks. In a customer service environment requiring flexibility and nuanced responses, a Smart Bot’s ability to learn and handle complex interactions outweighs the simplicity of a Script Bot, despite the higher setup complexity.