

From Wikipedia, the free encyclopedia

In information theory, turbo codes are a class of high-performance forward error correction (FEC) codes developed around 1990–91, but first published in 1993. They were the first practical codes to closely approach the maximum channel capacity or Shannon limit, a theoretical maximum for the code rate at which reliable communication is still possible given a specific noise level. Turbo codes are used in 3G/4G mobile communications (e.g., in UMTS and LTE) and in (deep space) satellite communications as well as other applications where designers seek to achieve reliable information transfer over bandwidth- or latency-constrained communication links in the presence of data-corrupting noise. Turbo codes compete with low-density parity-check (LDPC) codes, which provide similar performance. Until the patent for turbo codes expired,[1] the patent-free status of LDPC codes was an important factor in LDPC's continued relevance.[2]

The name "turbo code" arose from the feedback loop used during normal turbo code decoding, which was analogized to the exhaust feedback used for engine turbocharging. Hagenauer has argued the term turbo code is a misnomer since there is no feedback involved in the encoding process.[3]

History


The fundamental patent application for turbo codes was filed on 23 April 1991. The patent application lists Claude Berrou as the sole inventor of turbo codes. The patent filing resulted in several patents including US Patent 5,446,747, which expired 29 August 2013.

The first public paper on turbo codes was "Near Shannon Limit Error-correcting Coding and Decoding: Turbo-codes".[4] This paper was published in 1993 in the Proceedings of the IEEE International Communications Conference. It was formed from three separate submissions that were combined due to space constraints; as a result, the paper lists three authors: Berrou, Glavieux, and Thitimajshima (from Télécom Bretagne, formerly ENST Bretagne, France). However, the original patent filing names Berrou as the sole inventor of turbo codes, and the other authors of the paper contributed material other than the core concepts.

Turbo codes were so revolutionary at the time of their introduction that many experts in the field of coding did not believe the reported results. When the performance was confirmed a small revolution in the world of coding took place that led to the investigation of many other types of iterative signal processing.[5]

The first class of turbo code was the parallel concatenated convolutional code (PCCC). Since the introduction of the original parallel turbo codes in 1993, many other classes of turbo code have been discovered, including serial concatenated convolutional codes and repeat-accumulate codes. Iterative turbo decoding methods have also been applied to more conventional FEC systems, including Reed–Solomon corrected convolutional codes, although these systems are too complex for practical implementations of iterative decoders. Turbo equalization also flowed from the concept of turbo coding.

In addition to turbo codes, Berrou also invented recursive systematic convolutional (RSC) codes, which are used in the example implementation of turbo codes described in the patent. Turbo codes that use RSC component codes seem to perform better than those that do not.

Prior to turbo codes, the best constructions were serial concatenated codes based on an outer Reed–Solomon error correction code combined with an inner Viterbi-decoded short constraint length convolutional code, also known as RSV codes.

In a later paper, Berrou gave credit to the intuition of "G. Battail, J. Hagenauer and P. Hoeher, who, in the late 80s, highlighted the interest of probabilistic processing." He adds "R. Gallager and M. Tanner had already imagined coding and decoding techniques whose general principles are closely related," although the necessary calculations were impractical at that time.[6]

An example encoder


There are many different instances of turbo codes, using different component encoders, input/output ratios, interleavers, and puncturing patterns. This example encoder implementation describes a classic turbo encoder, and demonstrates the general design of parallel turbo codes.

This encoder implementation sends three sub-blocks of bits. The first sub-block is the m-bit block of payload data. The second sub-block is n/2 parity bits for the payload data, computed using a recursive systematic convolutional code (RSC code). The third sub-block is n/2 parity bits for a known permutation of the payload data, again computed using an RSC code. Thus, two redundant but different sub-blocks of parity bits are sent with the payload. The complete block has m + n bits of data with a code rate of m/(m + n). The permutation of the payload data is carried out by a device called an interleaver.
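The three sub-blocks can be sketched in code. The following toy encoder is a minimal illustration assuming a constraint-length-3 RSC component code with octal generators (7, 5) and a random interleaver; these choices are for the example only, as real systems specify particular polynomials, interleavers, and puncturing patterns:

```python
import random

def rsc_parity(bits):
    """Parity stream of a rate-1/2 recursive systematic convolutional (RSC)
    code with octal generators (7, 5): feedback 1 + D + D^2, feedforward
    1 + D^2. (The generator choice is illustrative, not from a standard.)"""
    s1 = s2 = 0                      # two-cell shift register
    out = []
    for d in bits:
        fb = d ^ s1 ^ s2             # recursive feedback bit entering the register
        out.append(fb ^ s2)          # feedforward taps 1 + D^2
        s1, s2 = fb, s1
    return out

def turbo_encode(payload, interleaver):
    """Return the three sub-blocks: systematic bits, parity of the payload,
    and parity of the interleaved payload."""
    permuted = [payload[i] for i in interleaver]
    return payload, rsc_parity(payload), rsc_parity(permuted)

random.seed(0)
m = 8
payload = [random.randint(0, 1) for _ in range(m)]
pi = list(range(m))
random.shuffle(pi)                   # a random interleaver, for illustration
x, y1, y2 = turbo_encode(payload, pi)
# m systematic + 2m parity bits: unpunctured code rate m/(m + n) = 1/3 here
print(len(x) + len(y1) + len(y2))    # 24
```

Puncturing (transmitting only some of the parity bits) is what raises such a rate-1/3 design to rate 1/2 or higher in practice.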

Hardware-wise, this turbo code encoder consists of two identical RSC coders, C1 and C2, as depicted in the figure, which are connected to each other using a concatenation scheme, called parallel concatenation:

In the figure, M is a memory register. The delay line and interleaver force input bits dk to appear in different sequences. At the first iteration, the input sequence dk appears at both outputs of the encoder, xk and y1k or y2k, due to the encoder's systematic nature. If the encoders C1 and C2 are used in n1 and n2 iterations, their rates are respectively equal to

R1 = n1 / (2n1 + n2)  and  R2 = n2 / (n1 + 2n2).

The decoder


The decoder is built in a similar way to the above encoder. Two elementary decoders are interconnected to each other, but in series, not in parallel. The DEC1 decoder operates at the lower rate (i.e., R1); thus, it is intended for the C1 encoder, and DEC2 is for C2 correspondingly. DEC1 yields a soft decision, which causes a delay L1. The same delay is caused by the delay line in the encoder. DEC2's operation causes a delay L2.

An interleaver installed between the two decoders is used here to scatter error bursts coming from the DEC1 output. The DI block is a demultiplexing and insertion module. It works as a switch, redirecting input bits to DEC1 at one moment and to DEC2 at another. In the OFF state, it feeds both y1k and y2k inputs with padding bits (zeros).

Consider a memoryless AWGN channel, and assume that at the k-th iteration, the decoder receives a pair of random variables:

xk = (2dk − 1) + ak
yk = (2Yk − 1) + bk

where ak and bk are independent noise components having the same variance σ². Yk is the k-th bit from the yk encoder output.
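Drawing one received pair under such a memoryless AWGN channel can be sketched as follows, assuming the usual BPSK-style mapping of bits {0, 1} to levels {−1, +1}:

```python
import random

def awgn_pair(d, Y, sigma):
    """One received pair from a memoryless AWGN channel: the data bit d and
    the (multiplexed) parity bit Y are mapped to levels -1/+1 via 2b - 1,
    then independent zero-mean Gaussian noise of standard deviation sigma
    is added to each component."""
    a = random.gauss(0.0, sigma)     # noise on the systematic component
    b = random.gauss(0.0, sigma)     # noise on the parity component
    return (2 * d - 1) + a, (2 * Y - 1) + b

random.seed(1)
x_k, y_k = awgn_pair(1, 0, 0.5)      # noisy levels near +1 and -1
print(x_k, y_k)
```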

Redundant information is demultiplexed and sent through the DI to DEC1 (when yk = y1k) and to DEC2 (when yk = y2k).

DEC1 yields a soft decision; i.e.:

Λ(dk) = log( p(dk = 1) / p(dk = 0) )

and delivers it to DEC2. Λ(dk) is called the logarithm of the likelihood ratio (LLR). p(dk = i), i ∈ {0, 1}, is the a posteriori probability (APP) of the data bit dk, which shows the probability of interpreting a received bit dk as i. Taking the LLR into account, DEC2 yields a hard decision; i.e., a decoded bit.
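The LLR and the hard decision derived from it can be sketched directly from the definition:

```python
import math

def llr(p1):
    """Log-likelihood ratio of a data bit given its a posteriori probability
    p1 = p(d_k = 1); the complementary probability is p(d_k = 0) = 1 - p1."""
    return math.log(p1 / (1.0 - p1))

def hard_decision(l):
    """A positive LLR decodes to 1, a negative one to 0."""
    return 1 if l > 0 else 0

print(round(llr(0.9), 3))        # log(0.9/0.1) ~ 2.197: strongly favors 1
print(hard_decision(llr(0.9)))   # 1
print(hard_decision(llr(0.1)))   # 0
```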

It is known that the Viterbi algorithm is unable to calculate the APP, so it cannot be used in DEC1. Instead, a modified BCJR algorithm is used. For DEC2, the Viterbi algorithm is an appropriate one.

However, the depicted structure is not an optimal one, because DEC1 uses only a proper fraction of the available redundant information. In order to improve the structure, a feedback loop is used (see the dotted line in the figure).

Soft decision approach


The decoder front-end produces an integer for each bit in the data stream. This integer is a measure of how likely it is that the bit is a 0 or 1 and is also called a soft bit. The integer could be drawn from the range [−127, 127], where:

  • −127 means "certainly 0"
  • −100 means "very likely 0"
  • 0 means "it could be either 0 or 1"
  • 100 means "very likely 1"
  • 127 means "certainly 1"

This introduces a probabilistic aspect to the data-stream from the front end, but it conveys more information about each bit than just 0 or 1.

For example, for each bit, the front end of a traditional wireless-receiver has to decide if an internal analog voltage is above or below a given threshold voltage level. For a turbo code decoder, the front end would provide an integer measure of how far the internal voltage is from the given threshold.
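This threshold-distance measure can be sketched as a simple quantizer. The `threshold` and `scale` parameters below are illustrative assumptions for the sketch, not taken from any actual receiver design:

```python
def soft_bit(voltage, threshold, scale):
    """Quantize the distance of the front-end's analog voltage from the
    decision threshold into an integer soft bit in [-127, 127]; `scale`
    is the voltage swing mapped to full scale (an assumed parameter)."""
    v = int(round((voltage - threshold) / scale * 127))
    return max(-127, min(127, v))    # clamp to the 8-bit soft-value range

print(soft_bit(1.0, 0.5, 0.5))   # 127 -> "certainly 1"
print(soft_bit(0.5, 0.5, 0.5))   # 0   -> could be either
print(soft_bit(0.3, 0.5, 0.5))   # -51 -> "likely 0"
```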

To decode the m + n-bit block of data, the decoder front-end creates a block of likelihood measures, with one likelihood measure for each bit in the data stream. There are two parallel decoders, one for each of the n/2-bit parity sub-blocks. Both decoders use the sub-block of m likelihoods for the payload data. The decoder working on the second parity sub-block knows the permutation that the coder used for this sub-block.

Solving hypotheses to find bits


The key innovation of turbo codes is how they use the likelihood data to reconcile differences between the two decoders. Each of the two convolutional decoders generates a hypothesis (with derived likelihoods) for the pattern of m bits in the payload sub-block. The hypothesis bit-patterns are compared, and if they differ, the decoders exchange the derived likelihoods they have for each bit in the hypotheses. Each decoder incorporates the derived likelihood estimates from the other decoder to generate a new hypothesis for the bits in the payload. Then they compare these new hypotheses. This iterative process continues until the two decoders come up with the same hypothesis for the m-bit pattern of the payload, typically in 15 to 18 cycles.
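The iterative exchange can be sketched as a control-flow skeleton. The component decoders themselves (e.g. BCJR) are assumed here as black boxes mapping input LLRs to posterior LLRs, and the extrinsic-information bookkeeping shown is a common formulation simplified for illustration, not a complete turbo decoder:

```python
def iterative_decode(dec1, dec2, channel_llrs, permute, depermute, max_iters=18):
    """Skeleton of the hypothesis exchange: dec1/dec2 stand in for
    soft-in/soft-out component decoders (assumed, not implemented);
    permute/depermute apply the interleaver and its inverse."""
    extrinsic = [0.0] * len(channel_llrs)
    bits2 = [1 if c > 0 else 0 for c in channel_llrs]
    for _ in range(max_iters):
        # Decoder 1 refines its hypothesis using the channel values plus
        # the extrinsic information received from decoder 2.
        post1 = dec1([c + e for c, e in zip(channel_llrs, extrinsic)])
        ext1 = [p - c - e for p, c, e in zip(post1, channel_llrs, extrinsic)]
        # Decoder 2 works on the interleaved order; results are
        # de-interleaved so both hypotheses can be compared bit by bit.
        post2 = depermute(dec2(permute([c + e for c, e in zip(channel_llrs, ext1)])))
        extrinsic = [p - c - e for p, c, e in zip(post2, channel_llrs, ext1)]
        bits1 = [1 if p > 0 else 0 for p in post1]
        bits2 = [1 if p > 0 else 0 for p in post2]
        if bits1 == bits2:           # hypotheses agree: stop iterating
            break
    return bits2
```

Passing only extrinsic information (posterior minus input) between the decoders, rather than full posteriors, is what keeps each decoder from simply being fed back its own beliefs.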

An analogy can be drawn between this process and that of solving cross-reference puzzles like crossword or sudoku. Consider a partially completed, possibly garbled crossword puzzle. Two puzzle solvers (decoders) are trying to solve it: one possessing only the "down" clues (parity bits), and the other possessing only the "across" clues. To start, both solvers guess the answers (hypotheses) to their own clues, noting down how confident they are in each letter (payload bit). Then, they compare notes, by exchanging answers and confidence ratings with each other, noticing where and how they differ. Based on this new knowledge, they both come up with updated answers and confidence ratings, repeating the whole process until they converge to the same solution.

Performance


Turbo codes perform well due to the attractive combination of the code's random appearance on the channel and a physically realisable decoding structure. Turbo codes are, however, affected by an error floor.

Practical applications using turbo codes


Telecommunications:

Bayesian formulation


From an artificial intelligence viewpoint, turbo codes can be considered as an instance of loopy belief propagation in Bayesian networks.[8]

See also


References

  1. ^ US 5446747 
  2. ^ Erico Guizzo (1 March 2004). "CLOSING IN ON THE PERFECT CODE". IEEE Spectrum. Archived from the original on 23 April 2023. "Another advantage, perhaps the biggest of all, is that the LDPC patents have expired, so companies can use them without having to pay for intellectual-property rights."
  3. ^ Hagenauer, Joachim; Offer, Elke; Papke, Luiz (March 1996). "Iterative Decoding of Binary Block and Convolutional Codes" (PDF). IEEE Transactions on Information Theory. 42 (2): 429–445. doi:10.1109/18.485714. Archived from the original (PDF) on 11 June 2013. Retrieved 20 March 2014.
  4. ^ Berrou, Claude; Glavieux, Alain; Thitimajshima, Punya (1993), "Near Shannon Limit Error-correcting Coding and Decoding: Turbo-codes", Proceedings of IEEE International Communications Conference, vol. 2, pp. 1064–70, doi:10.1109/ICC.1993.397441, S2CID 17770377, retrieved 11 February 2010
  5. ^ Erico Guizzo (1 March 2004). "CLOSING IN ON THE PERFECT CODE". IEEE Spectrum. Archived from the original on 23 April 2023.
  6. ^ Berrou, Claude, The ten-year-old turbo codes are entering into service, Bretagne, France, retrieved 11 February 2010
  7. ^ Digital Video Broadcasting (DVB); Interaction channel for Satellite Distribution Systems, ETSI EN 301 790, V1.5.1, May 2009.
  8. ^ McEliece, Robert J.; MacKay, David J. C.; Cheng, Jung-Fu (1998), "Turbo decoding as an instance of Pearl's "belief propagation" algorithm" (PDF), IEEE Journal on Selected Areas in Communications, 16 (2): 140–152, doi:10.1109/49.661103, ISSN 0733-8716.

Further reading


Publications
