Implementing Various Entropies in MATLAB: Shannon Entropy, Conditional Entropy, Fuzzy Entropy, Sample Entropy, etc.


1 Shannon Entropy
In 1948, Shannon introduced the concept of Boltzmann entropy into information theory as a quantitative measure of the uncertainty, or information content, of a random variable.
1.1 Basic Principle
The greater a variable's uncertainty, the greater its entropy, and the more information is needed to resolve it.
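For a discrete random variable X taking values x_1, ..., x_n with probabilities p(x_i), the Shannon entropy is defined as:

H(X) = -\sum_{i=1}^{n} p(x_i) \log p(x_i)

With base-2 logarithms the unit is the bit; the ShannonEn function below uses MATLAB's natural logarithm, so its result is in nats, with the probabilities estimated from the relative frequencies of quantified patterns.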
1.2 Three Properties of Information Entropy
Claude Shannon, the father of information theory, stated three properties of information entropy:
1. Monotonicity: the higher an event's probability, the less information it carries;
2. Non-negativity: information entropy can be viewed as an extensive quantity, so non-negativity is a natural requirement;
3. Additivity: the total uncertainty of several random events occurring jointly can be expressed as the sum of the individual events' uncertainties, another reflection of entropy being an extensive quantity.
1.3 MATLAB Implementation
MATLAB code:
function [SE,unique] = ShannonEn(series,L,num_int)
%{
Function which computes the Shannon Entropy (SE) of a time series of length
'N' using an embedding dimension 'L' and 'num_int' uniform intervals of
quantification. The algorithm presented by Porta et al. in "Measuring
regularity by means of a corrected conditional entropy in sympathetic
outflow" (PMID: 9485587) has been followed.
INPUT:
    series: the time series.
    L: the embedding dimension.
    num_int: the number of uniform intervals used in the quantification
    of the series.
OUTPUT:
    SE: the SE value.
    unique: the number of patterns which have appeared only once. This
    output is only useful for computing other more complex entropy
    measures such as Conditional Entropy or Corrected Conditional
    Entropy. If you do not want to use it, put '~' in the call of the
    function.
PROJECT: Research Master in signal theory and bioengineering - University of Valladolid
DATE: 15/10/2014
VERSION: 1.0
AUTHOR: Jesús Monge Álvarez
%}
%% Checking the input parameters:
control = ~isempty(series);
assert(control,'The user must introduce a time series (first input).');
control = ~isempty(L);
assert(control,'The user must introduce an embedding dimension (second input).');
control = ~isempty(num_int);
assert(control,'The user must introduce a number of intervals (third input).');
%% Processing:
% Normalization of the input time series:
series = (series-mean(series))/std(series);
% We set the values of the parameters required for the quantification:
epsilon = (max(series)-min(series))/num_int;
partition = min(series):epsilon:max(series);
codebook = -1:num_int;
% Uniform quantification of the time series:
[~,quants] = quantiz(series,partition,codebook);
% The minimum value of the quantified signal is moved from -1 to 0:
quants(logical(quants == -1)) = 0;
% We compose the patterns of length 'L':
N = length(quants);
X = quants(1:N);
for j = 1:L-1
    X = [X; quants(j+1:N) zeros(1,j)];
end
% We eliminate the last 'L-1' columns of 'X' since they are not real patterns:
X = X(:,1:N-L+1);
% We get the number of repetitions of each pattern:
num = ones(1,N-L+1); % This vector will contain the repetitions of each pattern
% This loop goes over the columns of 'X':
for j = 1:(N-L+1)
    for i = j+1:(N-L+1)
        tmp = ~isnan(X(:,j));
        if (tmp(1)) && (isequal(X(:,j),X(:,i)))
            num(j) = num(j) + 1; % The counter is incremented by one unit
            X(:,i) = NaN(L,1);   % The pattern is replaced by NaN values
        end
        % Reset of the auxiliary variable each iteration:
        tmp = NaN;
    end
end
% We get those patterns which are not NaN:
aux = ~isnan(X(1,:));
% Now, we can compute the number of different patterns:
new_num = num(logical(aux));
% We get the number of patterns which have appeared only once:
unique = sum(new_num == 1);
% We compute the probability of each pattern:
p_i = new_num/(N-L+1);
% Finally, the Shannon Entropy is computed as:
SE = (-1) * ((p_i)*(log(p_i)).');
end % End of the 'ShannonEn.m' function
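A minimal usage sketch (the white-noise signal and parameter values are illustrative assumptions; quantiz requires the Communications Toolbox, and the series must be a row vector for the pattern-building step):

% Illustrative call: Shannon entropy of a white-noise series,
% embedding dimension L = 3, 6 uniform quantification intervals.
rng(0);                            % reproducible example
x = randn(1,1000);                 % hypothetical test signal (row vector)
[SE,uniq] = ShannonEn(x,3,6);      % 'uniq' avoids shadowing MATLAB's unique()
fprintf('SE = %.4f with %d single-occurrence patterns\n',SE,uniq);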
2 Entropy Concepts in a System of Two Random Variables
2.1 Mutual Information
In probability theory and information theory, the mutual information (MI), or transinformation, of two random variables is a measure of their mutual dependence. Unlike the correlation coefficient, mutual information is not limited to real-valued random variables; it is more general and quantifies how similar the joint distribution p(X,Y) is to the product of the marginal distributions p(X)p(Y). Mutual information is the expected value of the pointwise mutual information (PMI), and its most common unit is the bit. It measures the average amount of information two random variables share: the higher the mutual information, the stronger the dependence between the variables; the weaker the dependence, the smaller the mutual information, down to 0 when the variables are independent.
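Formally, for discrete variables with joint distribution p(x,y) and marginals p(x) and p(y):

I(X;Y) = \sum_{x}\sum_{y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}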
2.2 Joint Entropy
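For discrete random variables X and Y with joint distribution p(x,y), the joint entropy measures the total uncertainty of the pair and is defined as:

H(X,Y) = -\sum_{x}\sum_{y} p(x,y) \log p(x,y)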
2.3 Conditional Entropy
Conditional entropy is the mathematical expectation of the conditional self-information over the joint symbol set XY. Given the random variable X, the conditional entropy of the random variable Y is defined as:
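H(Y|X) = -\sum_{x}\sum_{y} p(x,y) \log p(y|x) = \sum_{x} p(x)\, H(Y|X=x)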
Conditional entropy is a definite value. It represents the uncertainty about the source Y that remains after the receiver has obtained X, which is caused by transmission distortion. H(X|Y) is sometimes called the channel equivocation, also the loss entropy, while the conditional entropy H(Y|X) is called the noise entropy.
2.3.1 Basic Principle
2.3.2 MATLAB Implementation
MATLAB code:
function [CE,unique] = CondEn(series,L,num_int)
%{
Function which computes the Conditional Entropy (CE) of a time series of
length 'N' using an embedding dimension 'L' and 'num_int' uniform intervals
of quantification. The algorithm presented by Porta et al. in "Measuring
regularity by means of a corrected conditional entropy in sympathetic
outflow" (PMID: 9485587) has been followed.
INPUT:
    series: the time series.
    L: the embedding dimension.
    num_int: the number of uniform intervals used in the quantification
    of the series.
OUTPUT:
    CE: the CE value.
    unique: the number of patterns which have appeared only once. This
    output is only useful for computing other more complex entropy
    measures such as Corrected Conditional Entropy. If you do not want
    to use it, put '~' in the call of the function.
PROJECT: Research Master in signal theory and bioengineering - University of Valladolid
DATE: 15/10/2014
VERSION: 1.0
AUTHOR: Jesús Monge Álvarez
%}
%% Checking the input parameters:
control = ~isempty(series);
assert(control,'The user must introduce a time series (first input).');
control = ~isempty(L);
assert(control,'The user must introduce an embedding dimension (second input).');
control = ~isempty(num_int);
assert(control,'The user must introduce a number of intervals (third input).');
%% Processing:
% First, we call the Shannon Entropy function:
% 'L' as embedding dimension:
[SE,unique] = ShannonEn(series,L,num_int);
% 'L-1' as embedding dimension:
[SE_1,~] = ShannonEn(series,(L-1),num_int);
% The Conditional Entropy is computed as a difference of Shannon entropies:
CE = SE - SE_1;
end % End of the 'CondEn.m' function
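A matching usage sketch (assuming the same illustrative signal x as in the Shannon entropy example above):

% Illustrative call: conditional entropy with L = 3, i.e. SE(L=3) - SE(L=2).
[CE,~] = CondEn(x,3,6);
fprintf('CE = %.4f\n',CE);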
2.4 Relationships among Mutual Information, Joint Entropy, and Conditional Entropy
The relationships among the quantities above can be depicted with a Venn diagram:
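In formulas, the standard identities the diagram encodes are:

H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)

In the diagram, the mutual information is the overlap of the two entropy circles, and the conditional entropies are the non-overlapping parts.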
2.5 Corrected Conditional Entropy
2.5.1 Basic Principle
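Following Porta et al. (PMID: 9485587), the plain conditional entropy estimate decays artificially as L grows, because in a finite series more and more length-L patterns occur only once. The correction adds back a penalty proportional to the fraction perc(L) of such single-occurrence patterns; this is exactly what the 'unique' output of the two functions above is for:

CCE(L) = CE(L) + perc(L) \cdot SE(1)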
2.5.2 MATLAB Implementation
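A minimal sketch of the formula above, reusing the ShannonEn and CondEn functions already listed (the function name CorrCondEn is hypothetical; this is an outline under the stated assumptions, not the authors' reference implementation):

function CCE = CorrCondEn(series,L,num_int)
% Corrected Conditional Entropy sketch after Porta et al. (PMID: 9485587):
% CCE(L) = CE(L) + perc(L)*SE(1), where perc(L) is the fraction of
% length-L patterns that appear only once in the quantified series.
% NOTE: an assumption-laden outline, not a reference implementation.
[CE,uniq] = CondEn(series,L,num_int);     % conditional entropy and single-occurrence count
[SE_1,~] = ShannonEn(series,1,num_int);   % Shannon entropy with embedding dimension 1
N = length(series);
perc = uniq/(N-L+1);                      % fraction of patterns seen only once
CCE = CE + perc*SE_1;
end % End of the 'CorrCondEn.m' sketch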
