Negentropy : English Wikipedia
Negentropy
The negentropy, also negative entropy, syntropy, extropy, ectropy or entaxy,〔Wiener, Norbert〕 of a living system is the entropy that it exports to keep its own entropy low; it lies at the intersection of entropy and life. The concept and phrase "negative entropy" were introduced by Erwin Schrödinger in his 1944 popular-science book ''What is Life?''〔Schrödinger, Erwin, ''What is Life? The Physical Aspect of the Living Cell'', Cambridge University Press, 1944〕 Later, Léon Brillouin shortened the phrase to ''negentropy''〔Brillouin, Léon: (1953) "Negentropy Principle of Information", ''Journal of Applied Physics'', v. 24(9), pp. 1152–1163〕〔Léon Brillouin, ''La science et la théorie de l'information'', Masson, 1959〕 to express it in a more "positive" way: a living system imports negentropy and stores it.〔Mae-Wan Ho, (What is (Schrödinger's) Negentropy?), Bioelectrodynamics Laboratory, Open University, Walton Hall, Milton Keynes〕 In 1974, Albert Szent-Györgyi proposed replacing the term ''negentropy'' with ''syntropy''. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but ''negentropy'' remains common.
In a note to ''What is Life?'', Schrödinger explained his use of this phrase.
Indeed, negentropy has been used by biologists as the basis for purpose or direction in life, namely cooperative or moral instincts.〔Jeremy Griffith. 2011. ''What is the Meaning of Life?''. In ''The Book of Real Answers to Everything!'' ISBN 9781741290073. From http://www.worldtransformation.com/what-is-the-meaning-of-life/〕
In 2009, Mahulikar & Herwig redefined the negentropy of a dynamically ordered sub-system as the specific entropy deficit of the ordered sub-system relative to its surrounding chaos.〔Mahulikar, S.P. & Herwig, H.: (2009) "Exact thermodynamic principles for dynamic order existence and evolution in chaos", ''Chaos, Solitons & Fractals'', v. 41(4), pp. 1939–1948〕 Thus, negentropy has SI units of J kg−1 K−1 when defined in terms of specific entropy per unit mass, and K−1 when defined in terms of specific entropy per unit energy. This definition enabled: ''i'') scale-invariant thermodynamic representation of dynamic order existence, ''ii'') formulation of physical principles exclusively for dynamic order existence and evolution, and ''iii'') mathematical interpretation of Schrödinger's negentropy debt.
==Information theory==
In information theory and statistics, negentropy is used as a measure of distance to normality.〔Aapo Hyvärinen, (Survey on Independent Component Analysis, node32: Negentropy), Helsinki University of Technology Laboratory of Computer and Information Science〕〔Aapo Hyvärinen and Erkki Oja, (Independent Component Analysis: A Tutorial, node14: Negentropy), Helsinki University of Technology Laboratory of Computer and Information Science〕〔Ruye Wang, (Independent Component Analysis, node4: Measures of Non-Gaussianity)〕 Out of all distributions with a given mean and variance, the normal or Gaussian distribution is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant under any invertible linear change of coordinates, and vanishes if and only if the signal is Gaussian.
Negentropy is defined as
:J(p_x) = S(\phi_x) - S(p_x)\,
where S(\phi_x) is the differential entropy of the Gaussian density with the same mean and variance as p_x and S(p_x) is the differential entropy of p_x:
:S(p_x) = - \int p_x(u) \log p_x(u) du
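The definitions above can be checked numerically. The following is a minimal sketch (not part of the article) that estimates J(p_x) for one-dimensional samples: the Gaussian term S(\phi_x) has the closed form ½ log(2πe σ²), and S(p_x) is approximated with a histogram estimate of the differential entropy; the function names and the histogram approach are illustrative choices, not a standard API.

```python
import numpy as np

def gaussian_entropy(var):
    # Closed-form differential entropy of a 1-D Gaussian with variance `var`:
    # S(phi) = 0.5 * log(2 * pi * e * var)
    return 0.5 * np.log(2.0 * np.pi * np.e * var)

def entropy_hist(samples, bins=100):
    # Histogram estimate of S(p) = -integral of p(u) log p(u) du.
    # `density=True` normalizes the histogram to a probability density.
    p, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = p > 0  # skip empty bins, where p log p -> 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

def negentropy(samples, bins=100):
    # J(p) = S(phi) - S(p), with phi the Gaussian matching the sample variance
    return gaussian_entropy(np.var(samples)) - entropy_hist(samples, bins)

rng = np.random.default_rng(0)
gauss = rng.normal(size=100_000)          # Gaussian signal: J should be near 0
unif = rng.uniform(-1, 1, size=100_000)   # non-Gaussian signal: J should be > 0

print(negentropy(gauss))  # near 0, up to histogram bias
print(negentropy(unif))   # positive (analytically about 0.18 for this uniform)
```

For the uniform distribution on [−1, 1] the value is known exactly: σ² = 1/3, so J = ½ log(2πe/3) − log 2 ≈ 0.18, which the histogram estimate should approach. The histogram estimator is biased for small samples; kernel or k-nearest-neighbor entropy estimators are common refinements.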
Negentropy is used in statistics and signal processing. It is related to network entropy, which is used in Independent Component Analysis.〔P. Comon, Independent Component Analysis - a new concept?, ''Signal Processing'', 36 287-314, 1994.〕〔Didier G. Leibovici and Christian Beckmann, (An introduction to Multiway Methods for Multi-Subject fMRI experiment ), FMRIB Technical Report 2001, Oxford Centre for Functional Magnetic Resonance Imaging of the Brain (FMRIB), Department of Clinical Neurology, University of Oxford, John Radcliffe Hospital, Headley Way, Headington, Oxford, UK.〕

Excerpt source: the free encyclopedia Wikipedia.
Read the full article on "Negentropy" at Wikipedia.