Entropy
Entropy (/ˈɛntrəpi/; from Ancient Greek ἐντροπία, entropía) is a fundamental concept in physics and information theory. It is often associated with the amount of disorder or randomness in a system.
Etymology
The term "entropy" was coined in 1865 by the German physicist Rudolf Clausius from the Greek words en (in) and trope (transformation). It was originally used in the context of Thermodynamics as a measure of the heat energy not available for useful work in a thermodynamic process.
Definition
In physics, entropy is a measure of the number of specific ways in which a system may be arranged, and is often taken to be a measure of disorder. In information theory, entropy is a measure of the uncertainty, or randomness, in a set of data.
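In statistical mechanics, the count of arrangements is made quantitative by Boltzmann's relation S = k ln W, where W is the number of microstates. For the information-theoretic sense, the short Python sketch below computes the Shannon entropy, H = -Σ p(x) log₂ p(x), of the symbols in a sample; the function name and example data are illustrative only and are not drawn from this article.

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Return the Shannon entropy (in bits) of the symbols in `data`.

    H = -sum over distinct symbols x of p(x) * log2(p(x)),
    where p(x) is the empirical probability of x in `data`.
    """
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform sample over four symbols carries 2 bits of uncertainty per symbol;
# a heavily skewed sample carries less.
print(shannon_entropy("ABCD"))  # 2.0
print(shannon_entropy("AAAB"))  # ~0.81
```

The more evenly the symbols are distributed, the higher the entropy, which mirrors the physical intuition that more possible arrangements mean more disorder.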
Related Terms
- Thermodynamics: The branch of physics that deals with heat and temperature, and their relation to energy, work, radiation, and properties of matter.
- Information Theory: A branch of applied mathematics and electrical engineering involving the quantification of information.
- Rudolf Clausius: A German physicist and mathematician who is considered one of the central founders of the science of thermodynamics.
- Disorder (physics): In physics, disorder refers to the randomness or lack of order in a system.
External links
- Medical encyclopedia article on Entropy
- Wikipedia article on Entropy