Mutual Information
Two systems that have no mutual relation contain no information about one another. Conversely, if they do have some connection, each contains information about the other. For instance, a measurement of x helps estimate y (or provides information about y, or reduces the uncertainty in y). Suppose now that we want to determine the amount (if any) by which a measurement of x reduces our uncertainty about a value of y. Two "uncertainties of y" contribute. First, y by itself has an uncertainty, as measured by the self-entropy $H_Y$. Secondly, there is the uncertainty that remains in y once x has been measured, as measured by the conditional entropy $H_{Y|X}$.
The conditional entropy $H_{Y|X}$ is a number that represents an amount of information about y. In particular, measuring x reduces the basic uncertainty $H_Y$ down to the residual uncertainty $H_{Y|X}$. In symbols, the overall decrease in uncertainty is therefore $H_Y - H_{Y|X}$. The name for that difference, or reduction in uncertainty, is the mutual information, $I_{Y;X}$:
$I_{Y;X} = H_Y - H_{Y|X}$ .......(1)
The semicolon in the subscript $Y;X$ characterizes our symbol for mutual information. For joint probabilities or entropies we used a comma, as in the joint entropy $H_{X,Y}$; for conditional probabilities or entropies we used a vertical bar, as in $H_{Y|X}$.
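As a concrete sketch of Equation (1), the following Python snippet computes $H_Y$, $H_{Y|X}$, and $I_{Y;X}$ for a small two-by-two joint distribution. The joint probability table, variable names, and use of NumPy are illustrative assumptions, not values taken from the text.

```python
import numpy as np

# Hypothetical joint probability table p(x, y); rows index x, columns index y.
# The numbers are illustrative only -- any valid joint distribution works.
p_xy = np.array([[0.30, 0.10],
                 [0.15, 0.45]])

p_x = p_xy.sum(axis=1)   # marginal distribution p(x)
p_y = p_xy.sum(axis=0)   # marginal distribution p(y)

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Self-entropy of y: the uncertainty in y before x is measured.
H_Y = entropy(p_y)

# Conditional entropy H(Y|X): average over x of the entropy of p(y | x).
H_Y_given_X = sum(
    p_x[i] * entropy(p_xy[i] / p_x[i])
    for i in range(len(p_x)) if p_x[i] > 0
)

# Mutual information, Equation (1): I(Y;X) = H(Y) - H(Y|X).
I_YX = H_Y - H_Y_given_X
print(f"H_Y = {H_Y:.4f} bits, H_Y|X = {H_Y_given_X:.4f} bits, I_Y;X = {I_YX:.4f} bits")
```

For this particular table the result is roughly 0.18 bits; if x and y were independent, the conditional entropy would equal $H_Y$ and the mutual information would be zero.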
Alternate definitions
The technical literature of chaos theory also expresses mutual information in two other ways. As before, one is in terms of probabilities, the other in terms of entropies. This time we'll take the easy one (the entropy form) first.