arXiv:0801.1136v1 [cs.IT] 7 Jan 2008
A Constrained Channel Coding Approach to Joint Communication and Channel Estimation
Wenyi Zhang, Satish Vedantam, and Urbashi Mitra
Ming Hsieh Department of Electrical Engineering
University of Southern California
{wenyizha, vedantam, ubli}@usc.edu
Abstract—A joint communication and channel state estimation problem is investigated, in which reliable information transmission over a noisy channel, and high-fidelity estimation of the channel state, are simultaneously sought. The tradeoff between the achievable information rate and the estimation distortion is quantified by formulating the problem as a constrained channel coding problem, and the resulting capacity-distortion function characterizes the fundamental limit of the joint communication and channel estimation problem. The analytical results are illustrated through case studies, and further issues such as multiple cost constraints, channel uncertainty, and capacity per unit distortion are also briefly discussed.
I. INTRODUCTION
In this paper, we consider the problem of joint communication and channel estimation over a channel with a time-varying channel state. We consider a noisy channel with a random channel state that evolves with time in a memoryless fashion and is available to neither the transmitter nor the receiver. The objective is to have the receiver recover both the information transmitted from the transmitter and the state of the channel over which the information was transmitted. This problem setting may prove relevant for situations such as environment monitoring in sensor networks [1], underwater acoustic applications [2], and cognitive radio [3]. A distinct feature of our problem formulation is that both communication and channel estimation are required.
The interplay between information measures and estimation (minimum mean-squared error (MMSE) in particular) has long been investigated; see, e.g., [4] and references therein. Previously, however, estimation served only to facilitate information transmission, rather than being a separate goal. For example, a common strategy in block interference channels [5] is channel estimation via training [6]. The purpose of channel training is only to increase the information rate for communication, and thus the quality of the channel estimate is not traded off against the information rate, as we consider in this paper.
The problem formulation in [7], [8] bears some similarity to the one we consider, in that the receiver is interested in both communication and channel estimation. It differs from our work in a critical way: the channel state is assumed non-causally known at the transmitter. In contrast, neither the transmitter nor the receiver knows the channel state in our problem formulation.
Intuitively, there exists a tradeoff between a channel's capability to transfer information and its capability to exhibit its state. Increasing the randomness of the channel inputs increases information transfer while reducing the receiver's ability to estimate the channel. In contrast, deterministic signaling facilitates channel estimation at the expense of zero information transfer. In this paper, we show that the optimal tradeoff can be formulated as a channel coding problem, with the channel input distribution constrained by an average "estimation cost" constraint.
The rest of this paper is organized as follows. Section II introduces the channel model and the capacity-distortion function, and Section III formulates the equivalent constrained channel coding problem. Section IV illustrates the application of the capacity-distortion function through several simple examples. Section V briefly discusses some related issues including multiple cost constraints, channel uncertainty, and capacity per unit distortion. Finally, Section VI concludes the paper.
II. CHANNEL MODEL
We consider the channel model in Figure 1. For a length-n block of channel inputs, a message M is selected equiprobably from {1, ..., ⌈e^{nR}⌉} and is encoded by the encoder, generating the corresponding channel inputs {X_1, ..., X_n}. We provide the following definition.
Definition 1 (Encoder): An encoder is defined by a function f_n : M = {1, ..., ⌈e^{nR}⌉} → X^n, for each n ∈ N.
The channel state evolves with time in a memoryless fashion. We note that this model encompasses the block interference channel model, because we can treat a block as a super-symbol and thus convert a block interference channel into a memoryless channel.
Definition 2 (Joint decoder and estimator): A joint decoder and estimator is defined by a pair of functions, g_n : Y^n → M and h_n : Y^n → S^n, for each n ∈ N.
This definition differs from that of a conventional channel decoder (see, e.g., [9]) in that it explicitly requires estimation of the channel state S at the receiver. The quality of estimation is measured by the distortion function d : S × S → R_+ ∪ {0}. That is, if Ŝ_i is the i-th element of h_n(Y^n), then d(S_i, Ŝ_i) denotes the distortion at time i, i = 1, ..., n. For technical convenience, we assume that d(·,·) is bounded from above, so that there exists a finite T > 0 with d(s, s') ≤ T < ∞ for any s, s' ∈ S. Note that for length-n block coding schemes, the average distortion is given by

d̄(S^n, Ŝ^n) = (1/n) Σ_{i=1}^{n} d(S_i, Ŝ_i).
because the channel is memoryless. For each fixed n, we have

P(S^n | X^n, Y^n) = P(X^n, Y^n, S^n) / Σ_{S^n} P(X^n, Y^n, S^n)
                  = ∏_{i=1}^{n} P(S_i, X_i, Y_i) / ∏_{i=1}^{n} P(Y_i | X_i) P_X(X_i)
                  = ∏_{i=1}^{n} P(S_i | X_i, Y_i).    (9)

As we take n → ∞, the lemma is established. Q.E.D.
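As a sanity check, the factorization in (9) is easy to verify numerically for a small memoryless channel. The following Python sketch builds a toy per-symbol joint distribution P(s, x, y) = P_S(s) P_X(x) P(y | x, s) with i.i.d. inputs and confirms that the block posterior P(S^n | X^n, Y^n) equals the product of the one-shot posteriors; the alphabets, distributions, and block length are illustrative choices, not taken from the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy binary alphabets for X, S, Y (illustrative choices).
P_X = np.array([0.6, 0.4])                  # input distribution P_X(x)
P_S = np.array([0.7, 0.3])                  # state distribution P_S(s)
# Channel law P(y | x, s): a noisy version of a multiplicative channel.
P_Y_given_XS = np.zeros((2, 2, 2))          # indexed [x, s, y]
for x, s in itertools.product(range(2), range(2)):
    clean = x * s
    P_Y_given_XS[x, s, clean] = 0.9
    P_Y_given_XS[x, s, 1 - clean] = 0.1

def joint_symbol(s, x, y):
    """Per-symbol joint P(S=s, X=x, Y=y)."""
    return P_S[s] * P_X[x] * P_Y_given_XS[x, s, y]

def one_shot_posterior(s, x, y):
    """P(S=s | X=x, Y=y), computed symbol by symbol."""
    den = sum(joint_symbol(t, x, y) for t in range(2))
    return joint_symbol(s, x, y) / den

n = 3
x_blk = rng.integers(0, 2, size=n)
y_blk = rng.integers(0, 2, size=n)

for s_blk in itertools.product(range(2), repeat=n):
    # Block posterior from the block joint, summing the denominator over all S^n.
    num = np.prod([joint_symbol(s_blk[i], x_blk[i], y_blk[i]) for i in range(n)])
    den = sum(np.prod([joint_symbol(t[i], x_blk[i], y_blk[i]) for i in range(n)])
              for t in itertools.product(range(2), repeat=n))
    # Product of one-shot posteriors, as in (9).
    prod = np.prod([one_shot_posterior(s_blk[i], x_blk[i], y_blk[i]) for i in range(n)])
    assert np.isclose(num / den, prod)
print("P(S^n | X^n, Y^n) factorizes into one-shot posteriors, as in (9).")
```

The check holds for any input distribution here because the state and channel noise act independently across time; the i.i.d. input assumption simply mirrors the form of (9).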
Proof of Theorem 1: From Lemmas 1 and 2, we can rewrite the average distortion constraint (2) as

lim sup_{n→∞} (1/n) Σ_{i=1}^{n} E d(S_i, h*_0(X_i, Y_i)) ≤ D.    (10)

Utilizing (4) and the fact that the channel is memoryless, we can further deduce from (10) that

E d*(X) ≤ D.    (11)

So now the constraints in Definition 3 reduce to having P_e^{(n)} → 0 as n → ∞, subject to the constraint (11). This is exactly the problem of channel coding with a cost constraint on the input distribution, and Theorem 1 directly follows from standard proofs; see, e.g., [10]. Q.E.D.

Discussion:
(1) The proof of Theorem 1 suggests that the joint decoder and estimator first decode the transmitted message in a "non-coherent" fashion, and then utilize the reconstructed channel inputs along with the channel outputs to estimate the channel states. As the coding block length grows large, such a two-stage procedure becomes asymptotically optimal.
(2) For each x ∈ X, d*(x) quantifies its associated minimum distortion. Alternatively, d*(x) can be viewed as the "estimation cost" of signaling with x. Hence the average distortion constraint in (6) regulates the input distribution so that the signaling is estimation-efficient (a small numerical sketch of d*(x) follows this discussion). We emphasize that d*(x) depends on the channel through the distribution of the channel state S, and thus differs from other usual costs such as symbol energies or time durations.
(3) A key condition that leads to the constrained channel coding formulation is that the channel is memoryless. Due to the memoryless property, we can decompose a block estimator into multiple one-shot estimators without loss of optimality, asymptotically. If the channel state evolves with time in a correlated fashion, then such a decomposition is generally suboptimal.
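To make the notion of estimation cost concrete, the sketch below computes d*(x) for a discrete channel by evaluating, for each input x, the expected distortion of the optimal one-shot estimator h*_0(x, y) = argmin_ŝ E[d(S, ŝ) | X = x, Y = y]. The channel used is the binary multiplicative channel treated later in Section IV-B with r = 0.3; this form of h*_0 and the chosen parameters are illustrative assumptions consistent with the discussion above, not code from the paper.

```python
import numpy as np

P_S = np.array([0.7, 0.3])            # P(S=0), P(S=1): r = P(S=1) = 0.3
# P(y | x, s) for the noiseless multiplicative channel Y = S X.
P_Y_given_XS = np.zeros((2, 2, 2))    # indexed [x, s, y]
for x in range(2):
    for s in range(2):
        P_Y_given_XS[x, s, x * s] = 1.0

def hamming(s, shat):
    return float(s != shat)

def estimation_cost(x):
    """d*(x): minimum expected distortion when the channel input is x."""
    cost = 0.0
    for y in range(2):
        # Joint P(s, y | x) and the conditional output probability P(y | x).
        joint = np.array([P_S[s] * P_Y_given_XS[x, s, y] for s in range(2)])
        p_y = joint.sum()
        if p_y == 0.0:
            continue                  # output y cannot occur under input x
        posterior = joint / p_y
        # The optimal one-shot estimate minimizes the posterior expected distortion.
        risks = [sum(posterior[s] * hamming(s, shat) for s in range(2))
                 for shat in range(2)]
        cost += p_y * min(risks)
    return cost

# Expect d*(0) = r = 0.3 and d*(1) = 0, matching the example in Section IV-B.
print([estimation_cost(x) for x in range(2)])
```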
IV. ILLUSTRATIVE EXAMPLES
In this section, we discuss several simple examples to illustrate the application of Theorem 1.
A. Uniform Estimation Costs
A special case arises when d*(x) = d_0 for all x ∈ X. For such channels, the average cost constraint in (6) exhibits a singular behavior. If D < d_0, then the joint communication and channel estimation problem is infeasible; otherwise, P_D consists of all possible input distributions, and thus the capacity-distortion function C(D) equals the unconstrained capacity of the channel. One of the simplest channels with uniform estimation costs is the additive channel Y_i = X_i + S_i, for which, once the receiver reliably decodes M, it can subtract X_i from Y_i to recover S_i.

B. A Scalar Multiplicative Channel
Consider the scalar multiplicative channel

Y_i = S_i X_i,    (12)

where all the alphabets are binary, X = Y = S = {0, 1}, and the multiplication is in the conventional sense for real numbers. The reader may interpret S as the status of an informed jamming source, a fading level, or the status of another transmitter. Activating S to its "effective status" S = 0 shuts down the link between X and Y; otherwise, the link X → Y is essentially noiseless. We take the distortion measure to be the Hamming distance: d(s, ŝ) = 1 if and only if ŝ ≠ s, and zero otherwise.
The tradeoff between communication and channel estimation is straightforward to observe from the nature of the channel: for good estimation of S, we want X = 1 as often as possible, whereas this reduces the achievable information rate. In this example, we assume that P(S = 1) = r ≤ 1/2. We shall optimize P(X = 1), denoted by p ∈ [0, 1]. The channel mutual information is I(X; Y) = H_2(pr) − p H_2(r), where H_2(·) denotes the binary entropy function H_2(t) = −t log t − (1 − t) log(1 − t). For x = 0, the optimal one-shot estimator is Ŝ = 0 (note that P(S = 1) = r ≤ 1/2), and the resulting minimum conditional distortion is d*(0) = r. For x = 1, the optimal one-shot estimator is Ŝ = Y = S, leading to d*(1) = 0. Therefore the input distribution should satisfy (1 − p)r ≤ D.
After some manipulations, we find that the optimal solution is given as follows. If D ≥ r − [1 + e^{H_2(r)/r}]^{-1}, the constraint is inactive: p* = 1/[r(1 + e^{H_2(r)/r})] and C(D) equals the unconstrained channel capacity. Otherwise, p* = 1 − D/r and

C(D) = H_2(r − D) − (1 − D/r) H_2(r).
In the latter case, C(D) deviates from the unconstrained channel capacity. We can show from the expression of C(D) that, as D → 0,

C(D) = −(log(1 − r)/r) D + o(D),

so the capacity-distortion function vanishes linearly in D.
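A brute-force check of this example is straightforward: sweep p over a grid, keep only the values satisfying (1 − p)r ≤ D, and maximize H_2(pr) − p H_2(r). The sketch below (working in nats, with illustrative values of r and D) compares the grid search against the closed-form expressions reconstructed above; it is a numerical sanity check, not code from the paper.

```python
import numpy as np

def H2(t):
    """Binary entropy in nats, with H2(0) = H2(1) = 0."""
    if t <= 0.0 or t >= 1.0:
        return 0.0
    return -t * np.log(t) - (1.0 - t) * np.log(1.0 - t)

def C_of_D(r, D, grid=20001):
    """Grid-search C(D) = max_p H2(p r) - p H2(r) subject to (1 - p) r <= D."""
    best = 0.0
    for p in np.linspace(0.0, 1.0, grid):
        if (1.0 - p) * r <= D:                           # estimation-cost constraint
            best = max(best, H2(p * r) - p * H2(r))
    return best

r = 0.3
threshold = r - 1.0 / (1.0 + np.exp(H2(r) / r))          # constraint inactive for D >= threshold
for D in [0.02, 0.05, 0.10, 0.20]:
    if D < threshold:
        closed = H2(r - D) - (1.0 - D / r) * H2(r)       # constrained regime, p* = 1 - D/r
    else:
        p_star = 1.0 / (r * (1.0 + np.exp(H2(r) / r)))   # unconstrained maximizer
        closed = H2(p_star * r) - p_star * H2(r)         # unconstrained capacity
    print(f"D = {D:.2f}: grid {C_of_D(r, D):.4f}, closed form {closed:.4f}")
```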
V. FURTHER ISSUES

A. Multiple Cost Constraints

Multiple cost constraints should be simultaneously satisfied by augmenting the feasible set of input distributions, P_D in (6), to the intersection of multiple feasible sets, one for each cost constraint. For either single or multiple cost constraints, the capacity-distortion function can be defined following Section II, formulated as a constrained channel coding problem following Section III, and computed with efficient algorithms such as the Blahut-Arimoto algorithm [11], [12] for discrete alphabets.
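For discrete alphabets, the cost-constrained computation can be carried out with a Lagrangian variant of the Blahut-Arimoto iteration: maximize I(X;Y) − λ E[d*(X)] for a multiplier λ ≥ 0, and sweep λ to trace out the capacity-distortion curve. The sketch below is one such implementation under the assumption that the state-averaged law P(y|x) and the costs d*(x) have already been computed (here they are hard-coded to the binary multiplicative channel of Section IV-B with r = 0.3); it illustrates the approach rather than reproducing any code from [11], [12].

```python
import numpy as np

def blahut_arimoto_cost(W, cost, lam, iters=500):
    """Maximize I(X;Y) - lam * E[cost(X)] over the input distribution.

    W    : array of shape (|X|, |Y|), rows are the state-averaged P(y|x)
    cost : array of shape (|X|,) with per-input estimation costs d*(x)
    lam  : Lagrange multiplier trading rate against average distortion
    Returns (input distribution, mutual information in nats, average cost).
    """
    nx, ny = W.shape
    p = np.full(nx, 1.0 / nx)
    for _ in range(iters):
        q_y = p @ W                                      # output marginal P(y)
        with np.errstate(divide="ignore"):
            log_post = np.log(W) + np.log(p)[:, None] - np.log(q_y)[None, :]
        log_post = np.where(W > 0, log_post, 0.0)        # W=0 terms contribute nothing
        # p(x) is proportional to exp( sum_y W(y|x) log q(x|y) - lam * cost(x) )
        expo = (W * log_post).sum(axis=1) - lam * cost
        p = np.exp(expo - expo.max())
        p /= p.sum()
    q_y = p @ W
    ratio = np.where(W > 0, W / q_y[None, :], 1.0)
    I = float((p[:, None] * W * np.log(ratio)).sum())
    return p, I, float(p @ cost)

# Binary multiplicative channel, r = P(S=1) = 0.3: P(Y=1|X=0) = 0, P(Y=1|X=1) = r.
W = np.array([[1.0, 0.0],
              [0.7, 0.3]])
cost = np.array([0.3, 0.0])                              # d*(0) = r, d*(1) = 0

for lam in [0.0, 0.5, 1.0, 2.0, 5.0]:
    p, I, avg = blahut_arimoto_cost(W, cost, lam)
    print(f"lam = {lam:4.1f}: rate = {I:.4f} nats, average distortion = {avg:.4f}")
```

Each value of λ yields one (average distortion, rate) point on the capacity-distortion curve; λ = 0 recovers the unconstrained capacity, while large λ drives the input toward the zero-cost symbol.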
B. Uncertainty in Channel State Statistics

The constrained channel coding formulation in Section III can also be extended to the case in which the distribution of the channel state S is uncertain. For such a compound channel setting, we assume that the joint channel distribution P_θ(x, s, y) = P(y|x, s) P_X(x) P_{S,θ}(s) is parametrized by an unknown parameter θ ∈ Θ, which is induced by the parametrized distribution of S, P_{S,θ}(s). If all the alphabets X, Y, and S are discrete, we can show, following the proof in [13], that the capacity-distortion function of the compound channel is

sup_{P_X ∈ P_D} inf_{θ ∈ Θ} I_θ(X; Y),    (19)

where

P_D = { P_X : Σ_{x ∈ X} P_X(x) d*_θ(x) ≤ D, for all θ ∈ Θ }.    (20)
In I_θ(X; Y) and d*_θ(x), the subscript θ denotes that they are evaluated with respect to P_θ(x, s, y).
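As a toy illustration of (19)-(20), the following sketch evaluates the compound capacity-distortion value for the binary multiplicative channel of Section IV-B when the state probability r is only known to lie in a two-element set Θ = {0.2, 0.4}. The parameter set, distortion level, and grids are illustrative assumptions, not from the paper.

```python
import numpy as np

def H2(t):
    return 0.0 if t <= 0.0 or t >= 1.0 else -t * np.log(t) - (1 - t) * np.log(1 - t)

def I_theta(p, r):
    # Mutual information of Y = S X with P(X=1) = p, P(S=1) = r (Section IV-B).
    return H2(p * r) - p * H2(r)

def dstar_theta(r):
    # Per-input estimation costs: d*(0) = min(r, 1-r), d*(1) = 0.
    return np.array([min(r, 1.0 - r), 0.0])

thetas = [0.2, 0.4]
D = 0.1
best = -np.inf
for p in np.linspace(0.0, 1.0, 10001):
    # The input distribution must meet the distortion constraint for every theta, as in (20).
    if all((1.0 - p) * dstar_theta(r)[0] <= D for r in thetas):
        worst_rate = min(I_theta(p, r) for r in thetas)   # inner infimum in (19)
        best = max(best, worst_rate)
print(f"compound capacity-distortion at D = {D}: {best:.4f} nats")
```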
C. Capacity Per Unit Distortion

In light of the definition of channel capacity per unit cost for general cost-constrained channels [14], we can analogously define the capacity per unit distortion, and show that it is equal to

C_d = sup_{P_X} I(X; Y) / Σ_{x ∈ X} P_X(x) d*(x),    (21)

which, when there exists an input symbol x_0 with d*(x_0) = 0, equals sup_{x: d*(x) > 0} D(P_{Y|X=x} || P_{Y|X=x_0}) / d*(x), where D(· || ·) denotes the Kullback-Leibler divergence between two distributions. Here, note that in P_{Y|X} we marginalize over the channel state S.
Given (21), we can then conveniently evaluate C_d for various channels. For example, the scalar multiplicative channel in Section IV-B has C_d = −log(1 − r)/r, approached by input distributions that place almost all of their mass on inputs that lead to d*(·) = 0 (here, x = 1).
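This value can be checked numerically by evaluating I(X;Y)/E[d*(X)] on a grid of input distributions for the multiplicative channel; the supremum is approached from below as P(X = 1) → 1. The numbers below use r = 0.3 as an illustrative choice.

```python
import numpy as np

def H2(t):
    return 0.0 if t <= 0.0 or t >= 1.0 else -t * np.log(t) - (1 - t) * np.log(1 - t)

r = 0.3
ratios = []
for p in np.linspace(1e-4, 1.0 - 1e-4, 20000):
    rate = H2(p * r) - p * H2(r)          # I(X;Y) with P(X=1) = p
    avg_d = (1.0 - p) * r                 # E[d*(X)] = (1 - p) d*(0)
    ratios.append(rate / avg_d)
# The grid supremum approaches the closed-form value as p -> 1.
print(f"grid sup of I/E[d*]    : {max(ratios):.4f}")
print(f"closed form -log(1-r)/r: {-np.log(1.0 - r) / r:.4f}")
```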
VI. CONCLUSIONS
In this paper, we introduce a joint communication and channel estimation problem for state-dependent channels, and characterize its fundamental tradeoff by formulating it as a channel coding problem with the input distribution constrained by an average "estimation cost" constraint. The resulting capacity-distortion function permits a systematic investigation of a channel's capability for simultaneous communication and state estimation. Future research topics include specializing the general framework to particular channel models in realistic applications, and generalizing the results to multiuser systems and channels with generally correlated state processes.
ACKNOWLEDGMENT
This work has been supported in part by NSF OCE0520324, the Annenberg Foundation, and the University of Southern California.
REFERENCES

[1] R. Szewczyk, E. Osterweil, J. Polastre, M. Hamilton, A. Mainwaring, and D. Estrin, "Habitat Monitoring with Sensor Networks," Communications of the ACM, vol. 47, no. 6, pp. 34-40, Jun. 2004.
[2] M. Stojanovic, "Recent Advances in High-Speed Underwater Acoustic Communications," IEEE J. Oceanic Eng., vol. 21, no. 2, pp. 125-137, Apr. 1996.
[3] S. Haykin, "Cognitive Radio: Brain-Empowered Wireless Communications," IEEE J. Select. Areas Commun., vol. 23, no. 2, pp. 201-220, Feb. 2005.
[4] D. Guo, S. Shamai (Shitz), and S. Verdú, "Mutual Information and Minimum Mean-Square Error in Gaussian Channels," IEEE Trans. Inform. Theory, vol. 51, no. 4, pp. 1261-1281, Apr. 2005.
[5] R. McEliece and W. Stark, "Channels with Block Interference," IEEE Trans. Inform. Theory, vol. 30, no. 1, pp. 44-53, Jan. 1984.
[6] B. Hassibi and B. Hochwald, "How Much Training is Needed in Multiple-Antenna Wireless Links?" IEEE Trans. Inform. Theory, vol. 49, no. 4, pp. 951-963, Apr. 2003.
[7] A. Sutivong, M. Chiang, T. M. Cover, and Y.-H. Kim, "Channel Capacity and State Estimation for State-Dependent Gaussian Channels," IEEE Trans. Inform. Theory, vol. 51, no. 4, pp. 1486-1496, Apr. 2005.
[8] T. M. Cover, Y.-H. Kim, and A. Sutivong, "Simultaneous Communication of Data and State," [Online] Available at ArXiv, 2007.
[9] T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, Inc., 1991.
[10] R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.
[11] R. E. Blahut, "Computation of Channel Capacity and Rate-Distortion Functions," IEEE Trans. Inform. Theory, vol. 18, no. 4, pp. 460-478, Jul. 1972.
[12] S. Arimoto, "An Algorithm for Calculating the Capacity of an Arbitrary Discrete Memoryless Channel," IEEE Trans. Inform. Theory, vol. 18, no. 1, pp. 14-20, Jan. 1972.
[13] D. Blackwell, L. Breiman, and A. J. Thomasian, "The Capacity of a Class of Channels," The Annals of Mathematical Statistics, vol. 30, no. 4, pp. 1229-1241, Dec. 1959.
[14] S. Verdú, "On Channel Capacity Per Unit Cost," IEEE Trans. Inform. Theory, vol. 36, no. 5, pp. 1019-1030, Sep. 1990.
