Mutual Information
Two systems that have no mutual relation have no information about one another. Conversely, if they do have some connection, each contains information about the other. For instance, a measurement of x helps estimate y (or provides information about y, or reduces the uncertainty in y). Suppose now that we want to determine the amount (if any) by which a measurement of x reduces our uncertainty about a value of y. Two "uncertainties of y" contribute. First, y by itself has an uncertainty, as measured by the self-entropy H_Y. Second, there's the uncertainty of y given a measurement of x, as measured by the conditional entropy H_{Y|X}.
Conditional entropy H_{Y|X} is a number representing the uncertainty about y that remains after x has been measured. In other words, measuring x reduces the basic uncertainty H_Y down to the residual amount H_{Y|X}. In symbols, the overall decrease in uncertainty is therefore H_Y − H_{Y|X}. The name for that difference, or reduction in uncertainty, is mutual information, I_{Y;X}:
I_{Y;X} = H_Y − H_{Y|X}.     (1)
Using a semicolon in the subscript, as in Y;X, characterizes our symbol for mutual information. For joint probabilities or entropies we used a comma, as in the joint entropy H_{X,Y}; for conditional probabilities or entropies we used a vertical slash, as in H_{Y|X}.
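To make eq. (1) concrete, here is a minimal Python sketch that computes H_Y, H_{Y|X}, and I_{Y;X} for a small joint distribution. The 2×2 probability table is a made-up example, and the sketch obtains the conditional entropy through the standard chain rule H_{Y|X} = H_{X,Y} − H_X rather than from conditional probabilities directly.

```python
import numpy as np

# Hypothetical joint probability table p(x, y): rows index x, columns index y.
# Any table whose entries are nonnegative and sum to 1 would do.
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)            # marginal p(x)
p_y = p_xy.sum(axis=0)            # marginal p(y)

H_Y = entropy(p_y)                # self-entropy of y
H_X = entropy(p_x)                # self-entropy of x
H_XY = entropy(p_xy.flatten())    # joint entropy H_{X,Y}

# Conditional entropy via the chain rule: H_{Y|X} = H_{X,Y} - H_X.
H_Y_given_X = H_XY - H_X

# Mutual information, eq. (1): I_{Y;X} = H_Y - H_{Y|X}.
I_YX = H_Y - H_Y_given_X
print(f"H_Y   = {H_Y:.4f} bits")
print(f"H_Y|X = {H_Y_given_X:.4f} bits")
print(f"I_Y;X = {I_YX:.4f} bits")
```

For this table, measuring x removes about 0.12 bits of the roughly 0.97 bits of uncertainty in y; for independent systems the table would factor as p(x)p(y) and I_{Y;X} would come out exactly zero.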
Alternate definitions
The technical literature of chaos theory also expresses mutual information in two other ways. As before, one is in terms of probabilities, the other in terms of entropies. This time we'll take the easier one (the entropy form) first.
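For reference, the entropy form that the excerpt cuts off before stating is usually written in the literature in terms of the two self-entropies and the joint entropy:

I_{Y;X} = H_X + H_Y − H_{X,Y},

which agrees with eq. (1) because of the chain rule H_{Y|X} = H_{X,Y} − H_X already used in the sketch above.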