Energy-aware embedded media processing: customizable memory subsystems and energy management policies
The design of energy-efficient data memory architectures for embedded
system platforms has received considerable attention in recent years. In
this dissertation we propose a special-purpose data memory subsystem, called
Xtream-Fit, targeted to streaming media applications executing on both generic
uniprocessor embedded platforms and powerful SMT-based multi-threading
platforms. We empirically demonstrate that Xtream-Fit achieves high energy-delay
efficiency across a wide range of media devices, from systems running a
single media application to systems concurrently executing multiple media applications
under synchronization constraints. Xtream-Fit’s energy efficiency
is predicated on a novel task-based execution model that exposes/enhances
opportunities for efficient prefetching, and aggressive dynamic energy conservation
techniques targeting on-chip and off-chip memory components. A key
novelty of Xtream-Fit is that it exposes a single customization parameter, thus
enabling a very simple and yet effective design space exploration methodology
to find the best memory configuration for the target application(s). Extensive
experimental results show that Xtream-Fit reduces energy-delay product
substantially – by 32% to 69% – as compared to ‘standard’ general-purpose
memory subsystems enhanced with state-of-the-art cache decay and SDRAM
power mode control policies.
Electrical and Computer Engineering
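The single-parameter exploration methodology described above amounts to sweeping one knob and picking the configuration with the lowest energy-delay product (EDP). The sketch below is a hypothetical illustration only; the parameter name, candidate values, and energy/delay model are invented stand-ins, not Xtream-Fit's actual interface:

```python
# Hypothetical sketch of a one-parameter design-space sweep that picks the
# memory configuration minimizing the energy-delay product (EDP).
# The toy energy/delay model below is invented for illustration.

def simulate(buffer_kb):
    """Toy model: larger buffers cost static energy but cut stall delay."""
    energy = 1.0 + 0.05 * buffer_kb   # joules (made up)
    delay = 1.0 + 16.0 / buffer_kb    # seconds (made up)
    return energy, delay

def explore(candidates):
    """Return the candidate with the lowest energy * delay."""
    def edp(c):
        e, d = simulate(c)
        return e * d
    return min(candidates, key=edp)

best = explore([2, 4, 8, 16, 32, 64])
print(best)  # the sweep settles on the mid-sized configuration
```

Under this toy model the extremes lose: small buffers stall too often, large ones burn static energy, and the sweep selects the knee of the EDP curve.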
1994 Science Information Management and Data Compression Workshop
This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on September 26-27, 1994, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival and retrieval of large quantities of data in future Earth and space science missions. It consisted of eleven presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.
Third International Symposium on Space Mission Operations and Ground Data Systems, part 1
Under the theme of 'Opportunities in Ground Data Systems for High Efficiency Operations of Space Missions,' the SpaceOps '94 symposium included presentations of more than 150 technical papers spanning five topic areas: Mission Management, Operations, Data Management, System Development, and Systems Engineering. The papers focus on improvements in the efficiency, effectiveness, productivity, and quality of data acquisition, ground systems, and mission operations. New technology, techniques, methods, and human systems are discussed. Accomplishments are also reported in the application of information systems to improve data retrieval, reporting, and archiving; the management of human factors; the use of telescience and teleoperations; and the design and implementation of logistics support for mission operations.
Compressive sensing based image processing and energy-efficient hardware implementation with application to MRI and JPEG 2000
In the present age of technology, the buzzwords are low-power, energy-efficient and compact systems. This directly shapes the data processing and hardware techniques employed at the core of these devices. One of the most power-hungry and space-consuming tasks is image/video processing, due to its high quality requirements. In current design methodologies, a point has nearly been reached at which physical and physiological effects limit the ability to simply encode data faster. These limits have led to research into methods that reduce the amount of acquired data without degrading image quality or increasing energy consumption.
Compressive sensing (CS) has emerged as an efficient signal compression and recovery technique, which can be used to efficiently reduce data acquisition and processing. It exploits the sparsity of a signal in a transform domain to perform sampling and stable recovery. This is an alternative paradigm to conventional data processing and is robust in nature. Unlike conventional methods, CS provides an information-capturing paradigm with both sampling and compression. It permits signals to be sampled below the Nyquist rate while still allowing optimal reconstruction of the signal. The required measurements are far fewer than those of conventional methods, and the process is non-adaptive, making the sampling process faster and universal.
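A minimal sketch of this sample-then-recover idea, using a random Gaussian sensing matrix and Orthogonal Matching Pursuit (a simpler greedy cousin of the CoSaMP algorithm used later in the thesis); all dimensions and the recovery routine here are illustrative assumptions, not the thesis's actual pipeline:

```python
import numpy as np

def omp(A, y, k, tol=1e-10):
    """Recover a k-sparse x from y = A @ x via Orthogonal Matching Pursuit."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(k):
        # Greedily pick the column most correlated with the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the current support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 256, 80, 5                          # ambient dim, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random Gaussian sensing matrix
y = A @ x                                     # m << n non-adaptive measurements
x_hat = omp(A, y, k)
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```

With 80 measurements of a 5-sparse, 256-dimensional signal, the greedy recovery reproduces the signal to numerical precision, far below the Nyquist-dictated 256 samples.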
In this thesis, CS methods are applied to magnetic resonance imaging (MRI) and JPEG 2000, which are widely used in clinical imaging and image compression, respectively. Over the years, MRI has improved dramatically in both imaging quality and speed, which has further revolutionized the field of diagnostic medicine. However, imaging speed, which is essential to many MRI applications, still remains a major challenge. The specific challenge addressed in this work is the use of non-Fourier-based complex measurement-based data acquisition. This method provides the possibility of reconstructing high-quality MRI data with minimal measurements, due to the high incoherence between the two chosen matrices. Similarly, JPEG 2000, though providing high compression, can be further improved upon by using compressive sampling, which also improves image quality. Moreover, an optimized JPEG 2000 architecture reduces the overall processing and yields faster computation when combined with CS.
Considering these requirements, this thesis is presented in two parts. In the first part: (1) a complex Hadamard matrix (CHM) based 2D and 3D MRI data acquisition with recovery using a greedy algorithm is proposed. The CHM measurement matrix is shown to satisfy the necessary condition for CS, known as the restricted isometry property (RIP). The sparse recovery is done using compressive sampling matching pursuit (CoSaMP); (2) an optimized matrix and a modified CoSaMP are presented, which enhance the MRI performance when compared with conventional sampling; (3) an energy-efficient, cost-efficient hardware design based on a field-programmable gate array (FPGA) is proposed, to provide a platform for low-cost MRI processing hardware. At every stage, the design is shown to be superior to other commonly used MRI-CS methods and comparable with conventional MRI sampling.
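As a hedged illustration of why Hadamard-type matrices make attractive sensing operators, the real-valued Sylvester construction below (an assumption for illustration, not the complex Hadamard matrix of the thesis) has perfectly orthogonal ±1 rows:

```python
import numpy as np

def sylvester_hadamard(n):
    """Real Sylvester-Hadamard matrix of order n (n must be a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        # Each doubling step: [[H, H], [H, -H]] preserves row orthogonality.
        H = np.block([[H, H], [H, -H]])
    return H

H = sylvester_hadamard(8)
# Rows are mutually orthogonal: H @ H.T == n * I, so row subsets used as a
# sensing matrix preserve signal energy well (an RIP-friendly property).
print(np.allclose(H @ H.T, 8 * np.eye(8)))
```

Orthogonality also means the transform is its own (scaled) inverse, which keeps both acquisition and hardware implementation cheap: every entry is ±1, so sensing reduces to additions and subtractions.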
In the second part, CS techniques are applied to image processing and combined with the JPEG 2000 coder. While CS can reduce the encoding time, the effect on the overall JPEG 2000 encoder is not very significant due to some complex JPEG 2000 algorithms. One bottleneck is JPEG 2000 arithmetic encoding (AE), which is completely based on bit-level operations. In this work, this problem is tackled by proposing a two-symbol AE with an efficient FPGA-based hardware design. Furthermore, this design is energy-efficient, fast, and has lower complexity when compared to conventional JPEG 2000 encoding.
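To make the bit-serial nature of arithmetic coding concrete, here is a minimal floating-point binary arithmetic coder (a textbook sketch, not the JPEG 2000 MQ-coder, which uses integer renormalization and context modelling); every symbol requires a sequential interval update, which is what makes AE hard to parallelize:

```python
def ac_encode(bits, p0):
    """Encode a bit sequence by shrinking [low, high) one symbol at a time."""
    low, high = 0.0, 1.0
    for b in bits:
        mid = low + (high - low) * p0   # split proportional to P(bit == 0)
        if b == 0:
            high = mid
        else:
            low = mid
    return (low + high) / 2             # any number inside the final interval

def ac_decode(code, p0, n):
    """Invert the encoder by replaying the same interval subdivisions."""
    low, high = 0.0, 1.0
    out = []
    for _ in range(n):
        mid = low + (high - low) * p0
        if code < mid:
            out.append(0)
            high = mid
        else:
            out.append(1)
            low = mid
    return out

msg = [0, 1, 1, 0, 0, 0, 1, 0, 0, 0]
code = ac_encode(msg, p0=0.7)
print(ac_decode(code, 0.7, len(msg)) == msg)  # round trip succeeds
```

A two-symbol coder in the spirit the abstract describes would consume two bits per interval update instead of one, halving the length of this strictly sequential loop.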
TV White Space and Broadband Power Line Communications for Indoor High Speed Networks
Current indoor networks have growing data-rate demands to satisfy high-speed applications. Broadband power line communications (BPLC) and TV white space (TVWS) communications are considered effective solutions for indoor networks. However, they encounter several challenges concerning coexistence with wireless services. In this thesis, cooperative BPLC and TVWS is investigated in the very high frequency (VHF) band, with the aim of the two complementing each other to deliver enhanced performance. The thesis makes several main contributions. In the first contribution, a general statistical-based path loss mapping (GSBPL) approach is proposed for modelling the path loss of indoor low-voltage (i.e., 220 V) BPLC. A simplification method is also proposed for computing the channel transfer function, which is shown to be more general and computationally more efficient than the previous method in the literature. The feasibility of cooperation between BPLC and wireless communications is thus established by comparing their corresponding path losses. In the second contribution, a general model is proposed to map the TVWS interference with the BPLC in the VHF band, through exciting antenna-mode currents along low-voltage BPLC cables. A new model is presented for current conversion from antenna to differential mode, which includes a general formula for the antenna-mode characteristic impedance and two solutions to the formulated problem: a) a numerical solution referred to as the antenna theory numerical (ATN) approach; b) an analytical solution referred to as the enhanced TL approximation (ETLA) approach. This is the first reported work to obtain the antenna-mode characteristic impedance by antenna theory. The ETLA approach outperforms the previous frequency-independent solution and has lower complexity than the ATN approach. In the third contribution, new hybrid systems utilising BPLC and TVWS in the VHF band are proposed, referred to as white BPLC (WBPLC).
Two cases are considered in the proposed system: a) a point-to-point WBPLC multiple-input multiple-output (MIMO) system, where a power allocation algorithm and an iterative precoding technique are proposed to maximise the ergodic capacity, subject to the constraints of total power and the interference limit at the TV primary user (PU) receiver (Rx); b) a point-to-multipoint WBPLC MIMO system, where the overall network downlink capacity maximisation problem is investigated using an efficient algorithm for power and subcarrier allocation among different users.
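The power-allocation step in such capacity-maximisation problems is classically solved by water-filling; the sketch below allocates a total power budget across parallel subchannels (channel gains and budget are illustrative numbers, and the PU interference constraint from the thesis is omitted for simplicity):

```python
import numpy as np

def water_filling(gains, p_total, iters=100):
    """Maximize sum log2(1 + g_i * p_i) s.t. sum p_i <= p_total, p_i >= 0.

    Bisection on the water level mu, with p_i = max(0, mu - 1/g_i):
    strong channels (small 1/g_i) sit deep under the water and get more power.
    """
    inv = 1.0 / np.asarray(gains, dtype=float)
    lo, hi = 0.0, inv.max() + p_total
    for _ in range(iters):
        mu = (lo + hi) / 2
        if np.maximum(0.0, mu - inv).sum() > p_total:
            hi = mu    # water level too high: spending more than the budget
        else:
            lo = mu
    return np.maximum(0.0, lo - inv)

p = water_filling([2.0, 1.0, 0.25], p_total=3.0)
print(p.round(3), p.sum().round(3))
```

Note how the weakest channel (gain 0.25) receives no power at all: below a certain gain it is cheaper, capacity-wise, to pour the entire budget into the stronger subchannels.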
The Space and Earth Science Data Compression Workshop
This document is the proceedings from a Space and Earth Science Data Compression Workshop, which was held on March 27, 1992, at the Snowbird Conference Center in Snowbird, Utah. This workshop was held in conjunction with the 1992 Data Compression Conference (DCC '92), which was held at the same location, March 24-26, 1992. The workshop explored opportunities for data compression to enhance the collection and analysis of space and Earth science data. The workshop consisted of eleven papers presented in four sessions. These papers describe research that is integrated into, or has the potential of being integrated into, a particular space and/or Earth science data information system. Presenters were encouraged to take into account the scientists' data requirements, and the constraints imposed by the data collection, transmission, distribution, and archival system.
Entropy in Image Analysis III
Image analysis can be applied to rich and assorted scenarios; the aim of this recent research field is therefore not only to mimic the human vision system. Image analysis is among the main methods computers use today, and there is a body of knowledge that they will be able to manage in a totally unsupervised manner in the future, thanks to artificial intelligence. The articles published in the book clearly show such a future.
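As a reminder of the measure the book is organised around, the Shannon entropy of an image's grey-level histogram quantifies its information content; a minimal sketch (8-bit greyscale assumed, images invented for illustration):

```python
import numpy as np

def shannon_entropy(img):
    """Shannon entropy in bits/pixel of an 8-bit greyscale image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return float(-(p * np.log2(p)).sum() + 0.0)  # +0.0 normalises -0.0

flat = np.full((8, 8), 128, dtype=np.uint8)        # constant image: no information
print(shannon_entropy(flat))                       # → 0.0
checker = (np.indices((8, 8)).sum(0) % 2 * 255).astype(np.uint8)
print(shannon_entropy(checker))                    # two equiprobable levels → 1.0
```

A constant image carries zero bits per pixel, a two-level checkerboard exactly one; natural images fall between these extremes, which is what makes entropy a useful yardstick for segmentation thresholds and compression limits.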
Computational methods for gene characterization and genomic knowledge extraction
Joint MAPi doctoral programme in Computer Science
Motivation: Medicine and health sciences are changing from the classical
symptom-based to a more personalized and genetics-based paradigm, with an
invaluable impact in health-care. While advancements in genetics were already
contributing significantly to the knowledge of the human organism, the
breakthrough achieved by several recent initiatives provided a comprehensive
characterization of the human genetic differences, paving the way for a new era
of medical diagnosis and personalized medicine.
Data generated from these and subsequent experiments are now becoming
available, but their volume is far beyond what is humanly feasible to explore.
It is then the responsibility of computer scientists to create the means for
extracting the information and knowledge contained in those data.
Within the available data, genetic structures contain significant amounts of
encoded information that has been uncovered in the past decades. Finding,
reading and interpreting that information are necessary steps for building
computational models of genetic entities, organisms and diseases; a goal that
in due course leads to human benefits.
Aims: Numerous patterns can be found within the human variome and exome.
Exploring these patterns enables the computational analysis and manipulation
of digital genomic data, but requires specialized algorithmic approaches. In this
work we sought to create and explore efficient methodologies to
computationally calculate and combine known biological patterns for various
purposes, such as the in silico optimization of genetic structures, analysis of
human genes, and prediction of pathogenicity from human genetic variants.
Results: We devised several computational strategies to evaluate genes,
explore genomes, manipulate sequences, and analyze patients’ variomes. By
resorting to combinatorial and optimization techniques we were able to create
and combine sequence redesign algorithms to control genetic structures; by
combining the access to several web-services and external resources we
created tools to explore and analyze available genetic data and patient data;
and by using machine learning we developed a workflow for analyzing human
mutations and predicting their pathogenicity.
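The biological-pattern computations described above can be as simple as counting in-frame codon frequencies, a basic ingredient of sequence-redesign scoring; the function name and example sequence below are invented for illustration, not taken from the thesis tooling:

```python
from collections import Counter

def codon_usage(cds):
    """Relative codon frequencies of a coding sequence, read in-frame."""
    usable = len(cds) - len(cds) % 3              # drop any trailing partial codon
    codons = [cds[i:i + 3] for i in range(0, usable, 3)]
    counts = Counter(codons)
    total = sum(counts.values())
    return {codon: n / total for codon, n in counts.items()}

# Toy coding sequence: start codon, two alanine codons, stop codon.
print(codon_usage("ATGGCTGCTTAA"))  # → {'ATG': 0.25, 'GCT': 0.5, 'TAA': 0.25}
```

Tables like this, computed over a whole genome, are what redesign algorithms optimise against when recoding a gene while preserving its protein product.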
Advances in Robotics, Automation and Control
The book presents an excellent overview of recent developments in the different areas of Robotics, Automation and Control. Through its 24 chapters, this book presents topics related to control and robot design; it also introduces new mathematical tools and techniques devoted to improving system modeling and control. An important point is the use of rational agents and heuristic techniques to cope with the computational complexity required for controlling complex systems. Through this book, we also find navigation and vision algorithms, automatic handwriting comprehension and speech recognition systems that will be included in the next generation of productive systems developed by man.
Special oils for halal and safe cosmetics
Three types of non-conventional oils were extracted, analyzed and tested for toxicity: date palm kernel oil (DPKO), mango kernel oil (MKO) and rambutan seed oil (RSO). Oil content for two cultivars of dates, Deglet Noor and Moshkan, was 9.67% and 7.30%, respectively. The three varieties of mango were found to contain about 10% oil on average. The red and yellow types of rambutan were found to have 11% and 14% oil, respectively. The phenolic compounds in DPKO, MKO and RSO were 0.98, 0.88 and 0.78 mg/ml gallic acid equivalent, respectively. The oils were analyzed for their fatty acid composition; they are rich in oleic acid (C18:1) and showed the presence of lauric acid (dodecanoic acid, C12:0), which is reported to have some antimicrobial activity. All extracted oils (DPKO, MKO and RSO) showed no toxic effect in a brine shrimp bioassay. Since these oils are stable, melt at skin temperature, have good lubricity and are a good source of essential fatty acids, they could be used as highly moisturizing, cleansing and nourishing oils because of their high oleic acid content. They are ideal for use in halal cosmetics such as skin-care and massage, hair-care, soap and shampoo products.