Information theory

Information theory (pronounced: /ˌɪnfərˈmeɪʃən θɪəri/) is a branch of applied mathematics and electrical engineering concerned with the quantification of information. Originating with Claude Shannon in 1948, it establishes fundamental limits on signal processing and communication operations such as compressing data and reliably storing and transmitting data.
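
Shannon's central measure is entropy, expressed in bits. As a minimal illustrative sketch (the four-symbol source below is hypothetical, not taken from this article), the entropy H(X) = -Σ p(x) log2 p(x) of a source sets a lower bound on the average number of bits per symbol that any lossless compression scheme can achieve:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits: H(X) = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source with unequal probabilities.
source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
h = shannon_entropy(source.values())
print(f"Entropy: {h:.3f} bits/symbol")  # 1.750 bits/symbol
# No lossless code can average fewer than 1.75 bits per symbol for this source,
# even though a naive fixed-length code would spend 2 bits per symbol.
```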

Etymology

The term "information theory" was coined by Claude Shannon in his groundbreaking 1948 paper "A Mathematical Theory of Communication". The word "information" is derived from the Latin informare which means to give form to the mind, to discipline, instruct, teach. "Theory" comes from the Greek theoria, which means a looking at, viewing, beholding.

Related Terms

  • Entropy (information theory): In information theory, entropy, sometimes called Shannon entropy, measures the average amount of information (equivalently, the uncertainty) associated with a message before it is received.
  • Redundancy (information theory): Redundancy is the amount of information transmitted beyond the minimum needed to convey the message (see the redundancy sketch after this list).
  • Data compression: Data compression involves encoding information using fewer bits than the original representation.
  • Channel capacity: In information theory, channel capacity is the tight upper bound on the rate at which information can be reliably transmitted over a communications channel (see the channel-capacity sketch after this list).
  • Coding theory: Coding theory is the study of the properties of codes and their fitness for a specific application.
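
For the redundancy entry above, one common way to express it is the gap between a source's actual entropy and the maximum possible entropy log2(N) of its N-symbol alphabet. The sketch below uses a hypothetical distribution purely for illustration:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source; one symbol dominates, so the source is redundant.
probs = [0.7, 0.1, 0.1, 0.1]
h = shannon_entropy(probs)
h_max = math.log2(len(probs))   # maximum entropy: a uniform distribution over 4 symbols
redundancy = 1 - h / h_max      # fraction of capacity "wasted" on predictable structure
print(f"H = {h:.3f} bits, H_max = {h_max:.3f} bits, redundancy = {redundancy:.1%}")
```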
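
For the channel-capacity entry above, the standard textbook example is the binary symmetric channel, whose capacity is C = 1 - H_b(p) bits per channel use, where H_b is the binary entropy function and p the crossover (bit-flip) probability. The 10% crossover value below is purely illustrative:

```python
import math

def binary_entropy(p):
    """Binary entropy function H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover):
    """Capacity of a binary symmetric channel: C = 1 - H_b(p) bits per channel use."""
    return 1 - binary_entropy(crossover)

# Hypothetical channel that flips each bit with probability 0.1.
print(f"C = {bsc_capacity(0.1):.3f} bits per channel use")  # roughly 0.531
```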


