
Dynamic Data Encoding for Page-Oriented Memories

Abstract

This dissertation presents a key portion of the system architecture for a high-performance page-oriented memory. The focus of this research is the development of new dynamic encoding algorithms that provide high data reliability with higher code density than conventional static modulation schemes. It also presents an intelligent read/write head architecture capable of implementing the most promising of these algorithms in real time.

Data encoding techniques for page-oriented mass storage devices are typically conservative in order to overcome the destructive effects of inter-symbol interference and noise arising from the physical characteristics of the media. As a result, significantly more bits are required in the encoded data than in the original information. This penalty in code density, usually expressed as the code rate, keeps utilization of the media relatively low, often below 50% of the capacity of a maximally dense code. This is partly because conventional encoding techniques are static and assume the worst case for the information surrounding the data block being encoded. In the context of page-oriented data transfers, however, it is possible to evaluate the surrounding information at each code block location and thus to apply a custom code set for each code block. Since evaluating every possible code at runtime leads to very high time complexity for encoding and decoding, we also present alternative algorithms that trade some code density for lower time complexity and compete strongly with traditional static modulation schemes. To verify that the encoding algorithms are both efficient and applicable, they were analyzed using a two-photon optical memory model, with the analysis focused on the trade-off between complexity and code density. A full enumeration of codes yielded code density as high as 83%, although the time complexity of the enumeration approach was exponential. A linear-time algorithm was also analyzed, achieving code density of just over 54%. Finally, a novel quasi-dynamic encoding algorithm was developed, which yielded 76% code density with constant time complexity.
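The gain from dynamic over static encoding can be illustrated with a small, hypothetical sketch. The constraint below (no two adjacent "on" bits within a block or against the neighboring block's edge bit) and all names are illustrative assumptions standing in for an inter-symbol-interference limit; they are not the dissertation's actual channel model. The sketch shows how the achievable code rate of a block grows when the surrounding information is known to be favorable rather than assumed to be the worst case.

    # Illustrative sketch: code rate of one block under an assumed adjacency constraint.
    from itertools import product
    from math import log2

    BLOCK = 4  # bits per 1-D code block (assumed)

    def valid(block, left_neighbor_bit):
        """A block is valid if no two 'on' bits are adjacent,
        counting the edge bit of the block to its left."""
        prev = left_neighbor_bit
        for b in block:
            if b == 1 and prev == 1:
                return False
            prev = b
        return True

    def code_rate(left_neighbor_bit):
        """Achievable rate for one block given its neighbor's edge bit:
        log2(number of valid codewords) / block length."""
        count = sum(valid(blk, left_neighbor_bit)
                    for blk in product((0, 1), repeat=BLOCK))
        return log2(count) / BLOCK

    # A static scheme must assume the worst-case neighbor (edge bit = 1);
    # a dynamic scheme inspects the neighbor and uses the larger code set when it ends in 0.
    print(f"worst-case (static) rate : {code_rate(1):.2f}")
    print(f"favorable (dynamic) rate : {code_rate(0):.2f}")

Under this toy constraint the static worst-case rate is about 0.58, while a dynamic encoder that observes a favorable neighbor reaches 0.75, mirroring the density gap the dissertation reports between static schemes and the enumeration and quasi-dynamic algorithms.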
