9 research outputs found

    A NEW SYSTEM TO INCREASE THE LIFE EXPECTANCY OF OPTICAL DISCS. PERFORMANCE ASSESSMENT WITH A DEDICATED EXPERIMENTAL SETUP.

    The past decade has witnessed an exponential growth in the use of digital supports for big data archiving. However, the expected lifespan of these supports is inadequate with respect to the actual needs of heritage institutions. Stemming from the issues raised by UNESCO [1, 2], we address the problem of alleviating the effects of aging on optical discs. To achieve this purpose, we propose a novel logical approach that counteracts the physical and chemical degradation of different types of optical discs, increasing their life expectancy. The objectives of this thesis are therefore the design and implementation of a new intelligent strategy aimed at increasing the life expectancy of optical discs. An experimental setup was developed to investigate the physical and chemical degradation processes of discs by means of accelerated aging tests that simulate operational disc-use conditions. Critical areas were identified where disc degradation is statistically faster than average, along with safe areas where degradation is comparatively slow.
To collect the needed data, two experimental devices were built. The first is a climatic chamber able to induce artificial disc aging. The second is a robotic device that detects the number of errors in each data block prior to the “Reed-Solomon” error-correction stage (before and after the accelerated aging stage, in a dust-free environment with controlled temperature and humidity). Analysis of these data identifies the aforementioned physical areas where data blocks have an error count that approaches or exceeds the correction capability of the standard Reed-Solomon code. These results led to the development of an “Adaptive Reed-Solomon Code” (A-RS code) that protects the information stored within the critical areas. The A-RS code applies a redistribution algorithm to the parity symbols. The redistribution is computed from a degradation function, itself obtained by fitting the experimentally measured errors. The algorithm shifts a certain number of parity symbols from data blocks in safe areas to data blocks in critical areas. Interestingly, this process does not diminish the storage capacity of Digital Versatile Discs (DVDs) and Blu-ray Discs (BDs). The strategy therefore avoids the early decommissioning of optical discs due to possible losses of even minimal parts of the recorded information.
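The parity-redistribution idea described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithm: the function name, the block layout, and the per-block degradation figures are all assumptions; the only property taken from the abstract is that parity moves from safe to critical blocks while the total redundancy budget stays fixed.

```python
# Illustrative sketch (hypothetical names and figures): shifting
# Reed-Solomon parity symbols toward disc areas that a fitted
# degradation function predicts will accumulate errors fastest.

def redistribute_parity(base_parity, degradation):
    """Reallocate parity symbols proportionally to predicted degradation,
    keeping the total parity budget unchanged.

    base_parity -- parity symbols per block in the standard layout
    degradation -- predicted error count per block (from fitting
                   accelerated-aging measurements)
    """
    n_blocks = len(degradation)
    total = base_parity * n_blocks           # fixed redundancy budget
    weight_sum = sum(degradation)
    # Allocate parity proportionally to predicted degradation.
    alloc = [max(1, round(total * d / weight_sum)) for d in degradation]
    # Correct rounding drift so the budget is exactly preserved.
    drift = total - sum(alloc)
    order = sorted(range(n_blocks), key=lambda i: degradation[i], reverse=True)
    for i in range(abs(drift)):
        alloc[order[i % n_blocks]] += 1 if drift > 0 else -1
    return alloc

# Example: 4 blocks with 16 parity symbols each in the standard layout;
# the first block is assumed to lie in a critical area.
print(redistribute_parity(16, [8.0, 2.0, 2.0, 4.0]))
```

The critical block ends up with more parity symbols than the standard layout gives it, at the expense of blocks in safe areas, so overall disc capacity is unchanged.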

    Complexity Analysis of Reed-Solomon Decoding over GF(2^m) Without Using Syndromes

    For the majority of the applications of Reed-Solomon (RS) codes, hard-decision decoding is based on syndromes. Recently, there has been renewed interest in decoding RS codes without using syndromes. In this paper, we investigate the complexity of syndromeless decoding for RS codes and compare it to that of syndrome-based decoding. Aiming to provide guidelines for practical applications, our complexity analysis differs in several aspects from existing asymptotic complexity analysis, which is typically based on multiplicative fast Fourier transform (FFT) techniques and is usually in big-O notation. First, we focus on RS codes over characteristic-2 fields, over which some multiplicative FFT techniques are not applicable. Second, due to the moderate block lengths of RS codes in practice, our analysis is complete, since all terms in the complexities are accounted for. Finally, in addition to fast implementation using additive FFT techniques, we also consider direct implementation, which is still relevant for RS codes with moderate lengths. Comparing the complexities of both syndromeless and syndrome-based decoding algorithms based on direct and fast implementations, we show that syndromeless decoding algorithms have higher complexities than syndrome-based ones for high-rate RS codes regardless of the implementation. Both errors-only and errors-and-erasures decoding are considered in this paper. We also derive tighter bounds on the complexities of fast polynomial multiplications based on Cantor's approach and the fast extended Euclidean algorithm. Comment: 11 pages, submitted to EURASIP Journal on Wireless Communications and Networking.
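The syndrome computation that syndrome-based decoding starts from can be shown concretely over GF(2^8), a characteristic-2 field of the kind the paper considers. This is a minimal sketch: the code parameters (11 data symbols, 4 parity symbols) and the primitive polynomial 0x11d are illustrative choices, not taken from the paper.

```python
# Systematic RS encoding and syndrome evaluation over GF(2^8) with
# primitive polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11d).
# Parameters are illustrative, not from the paper under discussion.

# Build exp/log tables for GF(2^8).
EXP = [0] * 512
LOG = [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]           # wrap so LOG[a]+LOG[b] never overflows

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def rs_generator(n_parity):
    """g(x) = (x - alpha)(x - alpha^2)...(x - alpha^t), highest degree first."""
    g = [1]
    for j in range(1, n_parity + 1):
        root = EXP[j]
        gx = g + [0]                                  # g(x) * x
        gr = [0] + [gf_mul(c, root) for c in g]       # g(x) * alpha^j
        g = [a ^ b for a, b in zip(gx, gr)]
    return g

def rs_encode(msg, n_parity):
    """Systematic encoding: parity = (msg(x) * x^t) mod g(x)."""
    g = rs_generator(n_parity)
    buf = list(msg) + [0] * n_parity
    for i in range(len(msg)):        # polynomial long division by g
        c = buf[i]
        if c:
            for j in range(1, len(g)):   # g[0] == 1, so skip it
                buf[i + j] ^= gf_mul(g[j], c)
    return list(msg) + buf[len(msg):]

def syndromes(received, n_parity):
    """S_j = r(alpha^j) for j = 1..t; all zero means no detected errors."""
    out = []
    for j in range(1, n_parity + 1):
        s = 0
        for coeff in received:       # Horner evaluation at alpha^j
            s = gf_mul(s, EXP[j]) ^ coeff
        out.append(s)
    return out
```

A valid codeword evaluates to zero at every root of g(x), so its syndromes are all zero; corrupting any symbol makes at least one syndrome nonzero, which is exactly the information a syndrome-based decoder works from.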

    Design and analysis of parity-check-code-based optical recording systems

    Ph.D. thesis, NUS-TU/e Joint Ph.D. Programme

    System characterization and reception techniques for two-dimensional optical storage


    Computer Aided Verification

    This open access two-volume set LNCS 13371 and 13372 constitutes the refereed proceedings of the 34th International Conference on Computer Aided Verification, CAV 2022, which was held in Haifa, Israel, in August 2022. The 40 full papers presented together with 9 tool papers and 2 case studies were carefully reviewed and selected from 209 submissions. The papers were organized in the following topical sections: Part I: invited papers; formal methods for probabilistic programs; formal methods for neural networks; software verification and model checking; hyperproperties and security; formal methods for hardware, cyber-physical, and hybrid systems. Part II: probabilistic techniques; automata and logic; deductive verification and decision procedures; machine learning; synthesis and concurrency. This is an open access book.

    Implementation of a VLC HDTV Distribution System for Consumer Premises

    A unidirectional, visible light communication (VLC) system intended for the distribution of Digital Video Broadcasting (DVB), high-definition television (HDTV) content to DVB compatible TVs within consumer premises is presented. The system receives off-air HDTV content through a consumer grade DVB-T/T2 terrestrial set-top-box (STB) and re-encodes its Moving Picture Experts Group (MPEG) transport stream (TS) using a pulse position modulation (PPM) scheme called inversion offset PPM (IOPPM). The re-encoded TS is used to intensity modulate (IM) a blue light-emitting diode (LED) operating at a wavelength of 470 nm. Directed line-of-sight (DLOS) transmission is used over a free-space optical (FSO) channel exhibiting a Gaussian impulse response. A direct-detection (DD) receiver is used to detect the transmitted IOPPM stream, which is then decoded to recover the original MPEG TS. A STB supporting a high-definition multimedia interface (HDMI) is used to decode the MPEG TS and enable connectivity to an HD monitor. The system is presented as a complementary or an alternative distribution system to existing Wi-Fi and power-line technologies. VLC connectivity is promoted as a safer, more secure, unlicensed and unregulated approach. The system is intended to enable TV manufacturers to reduce costs by, firstly, relocating the TV's region specific radio frequency (RF) tuner and demodulator blocks to an external STB capable of supporting DVB reception standards, and, secondly, by eliminating all input and output connector interfaces from the TV. Given the current trend for consumers to wall-mount TVs, the elimination of all connector interfaces, except the power cable, makes mounting simpler and easier. The operation of the final system was verified using real-world, off-air broadcast DVB-T/T2 channels supporting HDTV content. A serial optical transmission at a frequency of 66 MHz was achieved.
The system also achieved 60 Mbit/s, error-free transmission over a distance of 1.2 m without using error-correction techniques. The methodology used to realise the system was a top-down, modular approach. Results were obtained from electrical modelling, simulation and experimental techniques, using time-domain and FFT-based measurements and analysis. The modular approach was adopted to enable design, development and testing of the subsystems independently of the overall system.
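The pulse-position idea behind the modulation stage can be illustrated with plain 4-PPM. Note this is a generic PPM sketch, not the IOPPM variant the system actually uses, whose inversion/offset details are not described here: each pair of input bits selects which of four equal time slots carries the optical pulse.

```python
# Minimal sketch of plain 4-PPM (not the IOPPM variant used in the
# system above): 2 bits select one of 4 slots; exactly one slot per
# symbol carries a pulse, giving a constant average optical power.

def ppm_encode(bits):
    """Map bit pairs to 4-slot PPM symbols (one '1' slot per symbol)."""
    assert len(bits) % 2 == 0
    slots = []
    for i in range(0, len(bits), 2):
        pos = bits[i] * 2 + bits[i + 1]     # 2 bits -> slot index 0..3
        symbol = [0, 0, 0, 0]
        symbol[pos] = 1                     # pulse in the selected slot
        slots.extend(symbol)
    return slots

def ppm_decode(slots):
    """Recover the bit pairs by locating the pulse in each 4-slot symbol."""
    bits = []
    for i in range(0, len(slots), 4):
        pos = slots[i:i + 4].index(1)       # find the pulse slot
        bits.extend([pos >> 1, pos & 1])
    return bits

data = [1, 0, 1, 1]                         # two symbols: slots 2 and 3
assert ppm_decode(ppm_encode(data)) == data
```

The fixed one-pulse-per-symbol structure is what makes PPM attractive for intensity-modulated free-space links: the receiver only needs to find the brightest slot, not an absolute amplitude threshold.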

    The Art of Movies

    Movies are considered an important art form: films entertain, educate, enlighten and inspire audiences. Film is a term that encompasses motion pictures as individual projects, as well as, in metonymy, the field in general. The origin of the name comes from the fact that photographic film (also called filmstock) has historically been the primary medium for recording and displaying motion pictures. Many other terms exist: motion pictures (or just pictures or "picture"), the silver screen, photoplays, the cinema, picture shows, flicks, and, commonly, movies.

    Maritime expressions:a corpus based exploration of maritime metaphors

    This study uses a purpose-built corpus to explore the linguistic legacy of Britain’s maritime history, found in the form of hundreds of specialised ‘Maritime Expressions’ (MEs), such as TAKEN ABACK, ANCHOR and ALOOF, that permeate modern English. Selecting just those expressions commencing with ‘A’, it analyses 61 MEs in detail and describes the processes by which these technical expressions, from a highly specialised occupational discourse community, have made their way into modern English. The Maritime Text Corpus (MTC) comprises 8.8 million words, encompassing a range of text types and registers, selected to provide a cross-section of ‘maritime’ writing. It is analysed using WordSmith analytical software (Scott, 2010), with the 100 million-word British National Corpus (BNC) as a reference corpus. Using the MTC, a list of keywords of specific salience within the maritime discourse has been compiled and, using frequency data, concordances and collocations, these MEs are described in detail and their use and form in the MTC and the BNC are compared. The study examines the transformation from ME to figurative use in the general discourse, in terms of form and metaphoricity. MEs are classified according to their metaphorical strength and their transference from maritime usage into new registers and domains, such as those of business, politics, sports and reportage. A revised model of metaphoricity is developed and a new category of figurative expression, the ‘resonator’, is proposed. Additionally, developing the work of Lakoff and Johnson, Kövecses and others on Conceptual Metaphor Theory (CMT), a number of Maritime Conceptual Metaphors are identified and their cultural significance is discussed.
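Keyword extraction of the kind performed here with WordSmith is commonly based on a log-likelihood "keyness" statistic comparing a word's frequency in the study corpus against the reference corpus. The sketch below shows that statistic; the corpus sizes match those stated in the abstract (8.8M and 100M words), but the word frequencies are hypothetical.

```python
# Sketch of the log-likelihood keyness statistic (Dunning-style) used by
# corpus tools such as WordSmith to rank keywords. The frequencies in
# the example are hypothetical; the corpus sizes echo the abstract.
import math

def log_likelihood(freq_study, size_study, freq_ref, size_ref):
    """Log-likelihood keyness of one word: study corpus vs reference corpus."""
    total = size_study + size_ref
    combined = freq_study + freq_ref
    expected_study = size_study * combined / total   # expected count if the
    expected_ref = size_ref * combined / total       # word were equally common
    ll = 0.0
    if freq_study:
        ll += freq_study * math.log(freq_study / expected_study)
    if freq_ref:
        ll += freq_ref * math.log(freq_ref / expected_ref)
    return 2 * ll

# Hypothetical: 'aback' occurring 120 times in the 8.8M-word MTC versus
# 340 times in the 100M-word BNC. A score above 6.63 is conventionally
# taken as significant at p < 0.01.
score = log_likelihood(120, 8_800_000, 340, 100_000_000)
```

A word occurring at exactly the same relative frequency in both corpora scores zero, so the statistic naturally surfaces expressions that are disproportionately frequent in maritime writing.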