This book presents the mutual information (MI) estimation methods recently proposed by the author and published in several major journals. It covers two types of applications: learning a forest structure from multivariate data and identifying independent variables (independent component analysis).

MI between a pair of random variables is mathematically defined in information theory. It measures how dependent the two variables are, takes nonnegative values, and is zero if and only if they are independent. Knowing the value of MI between two variables is often necessary in machine learning, statistical data analysis, and various sciences, including physics, psychology, and economics. However, the true value of MI is not available; it can only be estimated from data.

The essential difference between the estimators proposed by the author and existing ones is that both consistency and independence testing are proved for the former: an estimator satisfies consistency if the estimate converges to the true value as the sample size grows, and it satisfies independence testing if the estimate is zero with probability one for large samples whenever the two variables are independent. Thus far, no other MI estimators have been shown to satisfy both properties at once.
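To make the estimation problem concrete, the following minimal sketch (not the author's method, but the standard naive "plug-in" estimator for discrete variables) computes an MI estimate from paired samples. It also illustrates why independence testing is nontrivial: for independent variables, the plug-in estimate is biased upward and is typically positive rather than exactly zero.

```python
# Naive plug-in MI estimator for two discrete variables (in nats),
# computed from the empirical joint and marginal frequencies.
# This is a generic illustration, not the estimator proposed in the book.
import random
from collections import Counter
from math import log

def plugin_mi(xs, ys):
    """Plug-in estimate of I(X;Y) in nats from paired samples xs, ys."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # empirical joint counts
    px = Counter(xs)            # empirical marginal counts for X
    py = Counter(ys)            # empirical marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), with empirical probabilities
        mi += (c / n) * log(c * n / (px[x] * py[y]))
    return mi

# Fully dependent pair (Y = X, fair coin): true MI is log 2 ≈ 0.693 nats.
xs = [0, 1] * 500
print(plugin_mi(xs, xs))

# Independent pair: true MI is 0, but the plug-in estimate is
# almost always strictly positive, so it cannot serve directly
# as an independence test.
random.seed(0)
ys = [random.randint(0, 1) for _ in range(1000)]
print(plugin_mi(xs, ys))
```

The estimate is itself the MI of the empirical distribution, so it is always nonnegative; the difficulty the book addresses is making it converge to the true value while still vanishing under independence.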