Information Theory - Electronics and Communication Engineering (ECE)

INFORMATION THEORY:

Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal-processing operations such as compressing data and reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography, networks other than communication networks (as in neurobiology), the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, plagiarism detection, and other forms of data analysis.

A key measure of information is known as entropy, which is usually expressed as the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a die (six equally likely outcomes).
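To make the coin/die comparison concrete, here is a minimal Python sketch; the entropy helper below is our own illustration, not a library function:

```python
import math

def entropy(probs, base=2):
    # Shannon entropy: H = -sum(p * log_base(p)), skipping zero-probability outcomes.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([1/6] * 6))    # fair six-sided die: log2(6) ≈ 2.585 bits
```

As expected, a die roll carries more information per outcome than a coin flip, because there is more uncertainty to resolve.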

Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s), and channel coding (e.g. for DSL lines). The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields. Important sub-fields of information theory are source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.

 

OVERVIEW: 

The main concepts of information theory can be grasped by considering the most widespread means of human communication: language. Two important aspects of a concise language are as follows: First, the most common words (e.g., "a", "the", "I") should be shorter than less common words (e.g., "benefit", "generation", "mediocre"), so that sentences will not be too long. Such a tradeoff in word length is analogous to data compression and is the essential aspect of source coding. Second, if part of a sentence is unheard or misheard due to noise — e.g., a passing car — the listener should still be able to glean the meaning of the underlying message. Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communications is done by channel coding. Source coding and channel coding are the fundamental concerns of information theory.
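The word-length tradeoff described above can be illustrated with a toy source code. In the sketch below the symbol frequencies and codewords are invented for illustration; an optimal code would come from a procedure such as Huffman coding, which is not shown here:

```python
# Toy "vocabulary" with skewed frequencies: common words should get short codewords.
freqs = {"the": 0.50, "a": 0.25, "benefit": 0.15, "mediocre": 0.10}

# A fixed-length code needs 2 bits to distinguish 4 symbols.
fixed_bits = 2

# Hand-picked prefix code: shorter codewords for the more frequent symbols.
code = {"the": "0", "a": "10", "benefit": "110", "mediocre": "111"}

avg_bits = sum(freqs[w] * len(code[w]) for w in freqs)
print(f"fixed-length: {fixed_bits} bits/symbol, variable-length: {avg_bits} bits/symbol")
# variable-length average: 0.5*1 + 0.25*2 + 0.15*3 + 0.10*3 = 1.75 bits/symbol
```

Because the frequent symbols receive the short codewords, the average length drops below the fixed-length cost, which is exactly the source-coding tradeoff described above.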

Note that these concerns have nothing to do with the importance of messages. For example, a platitude such as "Thank you; come again" takes about as long to say or write as the urgent plea, "Call an ambulance!" while the latter may be more important and more meaningful in many contexts. Information theory, however, does not consider message importance or meaning, as these are matters of the quality of data rather than the quantity and readability of data, the latter of which is determined solely by probabilities.

Information theory is generally considered to have been founded in 1948 by Claude Shannon in his seminal work, "A Mathematical Theory of Communication". The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel. The most fundamental results of this theory are Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy; and Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided that the rate of communication is below a certain threshold, called the channel capacity. The channel capacity can be approached in practice by using appropriate encoding and decoding systems.
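As a concrete textbook instance of channel capacity (not derived in the text above), the binary symmetric channel with crossover probability p has capacity C = 1 - H(p) bits per channel use, where H is the binary entropy function. The sketch below simply evaluates that formula:

```python
import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover):
    # Capacity of the binary symmetric channel: C = 1 - H(p) bits per channel use.
    return 1.0 - binary_entropy(crossover)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"crossover {p}: capacity {bsc_capacity(p):.4f} bits/use")
```

At p = 0.5 the output is independent of the input and the capacity is zero; at p = 0 the channel is noiseless and carries one full bit per use.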

Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory.

Coding theory is concerned with finding explicit methods, called codes, of increasing the efficiency and reducing the net error rate of data communication over a noisy channel to near the limit that Shannon proved is the maximum possible for that channel. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible. A third class of information-theoretic codes comprises cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis. See the article ban (information) for a historical application.
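A triple-repetition code with majority-vote decoding is one of the simplest error-correcting (channel) codes; it sits far from the Shannon limit but shows the idea. The helper names below are our own, and the channel is a simulated binary symmetric channel:

```python
import random

def encode_repetition(bits, n=3):
    # Channel code: repeat each data bit n times (simple, far from optimal).
    return [b for bit in bits for b in [bit] * n]

def bsc(bits, crossover, rng):
    # Binary symmetric channel: flip each bit independently with probability `crossover`.
    return [bit ^ (rng.random() < crossover) for bit in bits]

def decode_repetition(bits, n=3):
    # Majority vote over each block of n received bits.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

rng = random.Random(0)
data = [rng.randint(0, 1) for _ in range(10_000)]
received = bsc(encode_repetition(data), crossover=0.1, rng=rng)
decoded = decode_repetition(received)

errors = sum(d != r for d, r in zip(data, decoded))
print(f"bit error rate after decoding: {errors / len(data):.4f}")
# expected ≈ 0.028 after decoding, versus 0.1 without any coding
```

The decoded error rate falls well below the raw crossover probability, but this code sends only one data bit per three channel uses; Shannon's theorem guarantees that much better rate/reliability tradeoffs exist at any rate below capacity.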

Information theory is also used in information retrieval, intelligence gathering, gambling, statistics, and even in musical composition.

 

Quantities of information 

Information theory is based on probability theory and statistics. The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables. The former quantity indicates how easily message data can be compressed, while the latter can be used to find the communication rate across a channel. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the hartley, which is based on the common logarithm. In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because lim_{p→0+} p log p = 0 for any logarithmic base.
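The sketch below (our own helper functions, not from a particular library) shows how the choice of logarithmic base changes only the unit, and computes mutual information from a joint distribution via I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math
from collections import Counter

def entropy(probs, base=2):
    # H = -sum(p * log_base(p)), using the convention 0 log 0 = 0.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
print(entropy(p, base=2))       # 1.5  bits
print(entropy(p, base=math.e))  # ~1.04 nats
print(entropy(p, base=10))      # ~0.45 hartleys

def mutual_information(joint, base=2):
    # joint is a dict {(x, y): probability}; marginals are accumulated below.
    px, py = Counter(), Counter()
    for (x, y), pxy in joint.items():
        px[x] += pxy
        py[y] += pxy
    return (entropy(px.values(), base) + entropy(py.values(), base)
            - entropy(joint.values(), base))

# Two perfectly correlated fair bits share exactly 1 bit of information.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```

Two perfectly correlated fair bits share exactly one bit of mutual information, matching the intuition that observing one removes all uncertainty about the other.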


FAQs on Information Theory - Electronics and Communication Engineering (ECE)

1. What is information theory in electronics and communication engineering?
Ans. Information theory in electronics and communication engineering is a mathematical framework that quantifies the amount of information that can be transmitted over a communication channel. It focuses on the study of encoding, transmission, and decoding of information in various forms, such as signals, messages, or data.
2. How is information measured in information theory?
Ans. Information is measured in information theory using a unit called "bits" or "binary digits." A bit represents the amount of information needed to decide between two equally likely alternatives. It can be used to measure the uncertainty or randomness of an event or the amount of information contained in a message.
3. What are the key concepts in information theory?
Ans. The key concepts in information theory include entropy, channel capacity, noise, coding theory, and mutual information. Entropy measures the average amount of information in a message, while channel capacity represents the maximum data rate that can be transmitted without error. Noise refers to unwanted disturbances that affect the transmission of information. Coding theory deals with techniques to efficiently encode and decode information, and mutual information measures the amount of information shared between two random variables.
4. How is information transmitted over a communication channel?
Ans. Information is transmitted over a communication channel by encoding it into a suitable form, such as electrical signals, electromagnetic waves, or digital data. The encoded information is then transmitted through the channel, which can be wired or wireless, and is subject to various types of noise and distortion. At the receiving end, the encoded information is decoded to retrieve the original message.
5. What are the applications of information theory in electronics and communication engineering?
Ans. Information theory has numerous applications in electronics and communication engineering. It is used in the design of efficient communication systems, such as wireless networks, satellite communications, and data transmission over the internet. It also finds applications in error detection and correction codes, data compression algorithms, cryptography, and speech and image processing.