95 research outputs found

    Still-image compression algorithm using the wavelet transform

    In this work we present the application of a still-image compression algorithm based on the wavelet transform. The wavelet transform is a convenient tool for multiresolution signal analysis and, in particular, lends itself naturally to image compression by adapting the required bandwidth automatically. The algorithm studies the characteristics of grayscale images in order to exploit important properties of the human visual system: the human eye is less sensitive to high spatial frequencies (the edges of an image) than to low spatial frequencies (its textures). The method therefore encodes the coefficients representing high frequencies with few bits and those representing low frequencies with more bits. Thesis digitized in SEDICI thanks to the collaboration of the Biblioteca de la Facultad de Informática. Facultad de Ciencias Exacta

    Still-image compression using the wavelet transform

    In this work we present the application of a still-image compression algorithm based on the wavelet transform. The wavelet transform is a convenient tool for multiresolution signal analysis and, in particular, lends itself naturally to image compression by adapting the required bandwidth automatically. The algorithm studies the characteristics of grayscale images in order to exploit important properties of the human visual system: the human eye is less sensitive to high spatial frequencies (the edges of an image) than to low spatial frequencies (its textures). The method therefore encodes the coefficients representing high frequencies with few bits and those representing low frequencies with more bits. The compression stages are:
    • Wavelet decomposition using different FIR filters, among them the Haar and Daubechies filters.
    • Quantization, during which the effective compression takes place, comprising two steps: bit allocation, and thresholding and quantization.
    • Coding, which applies run-length encoding followed by dynamic or static Huffman coding.
    Decompression applies the inverse of these processes. The algorithm proves effective in terms of the quality of the compressed images, and preliminary tests have achieved compression ratios on the order of ten to one. Track: Distributed and parallel processing. Signal processing. Red de Universidades con Carreras en Informática (RedUNCI
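    The pipeline this abstract describes (wavelet decomposition, thresholding of high-frequency coefficients, then run-length and entropy coding) can be sketched as follows. This is an illustrative toy, not the thesis code: the function names are invented, only one Haar decomposition level is shown, and the bit-allocation and Huffman stages are reduced to a simple run-length pass.

```python
# Illustrative sketch (not the thesis implementation): one level of a
# 2D Haar decomposition, thresholding of the high-frequency subband,
# and run-length coding. All names are invented for this example.
import numpy as np

def haar2d_level(img):
    """One level of the 2D Haar transform (orthonormal filters)."""
    a = img.astype(float)
    # Rows: average (low-pass) and difference (high-pass) of pixel pairs
    lo = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    hi = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    # Columns: repeat on each half, yielding the four subbands
    LL = (lo[0::2] + lo[1::2]) / np.sqrt(2)
    LH = (lo[0::2] - lo[1::2]) / np.sqrt(2)
    HL = (hi[0::2] + hi[1::2]) / np.sqrt(2)
    HH = (hi[0::2] - hi[1::2]) / np.sqrt(2)
    return LL, LH, HL, HH

def threshold(band, t):
    """Zero out coefficients with magnitude below t (the thresholding step)."""
    return np.where(np.abs(band) < t, 0.0, band)

def run_length(flat):
    """Run-length encode a 1-D sequence as [value, count] pairs."""
    runs = []
    for v in flat:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

img = np.arange(64).reshape(8, 8)           # toy 8x8 grayscale image
LL, LH, HL, HH = haar2d_level(img)
HH_t = threshold(HH, 1.0)                   # few bits for high frequencies
print(run_length(HH_t.ravel().tolist()))    # → [[0.0, 16]]
```

    Because the eye tolerates coarse coding of high frequencies, zeroing the small HH coefficients produces long runs of zeros, which the run-length and Huffman stages then compress efficiently.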

    Understanding Molecular Pathology along Injured Spinal Cord Axis: Moving Frontiers toward Effective Neuroprotection and Regeneration

    Spinal cord injury (SCI) is a severe, often life-threatening, traumatic condition leading to serious neurological dysfunction. The pathological hallmarks of SCI include inflammation, reactive gliosis, axonal demyelination, neuronal death, and cyst formation. Although much has been learned about the progression of SCI pathology, which affects a large number of biochemical cascades and reactions, the roles of the proteins involved in these processes are not well understood. Advances in proteomic technologies have made it possible to examine the spinal cord proteome of healthy and experimental animals and to provide a detailed overview of the spatial and temporal regionalization of these secondary processes. The data clearly demonstrate that neurotrophic molecules dominate in the segment above the central lesion, while proteins associated with necrotic/apoptotic pathways abound in the segment below the lesion. This knowledge is extremely important for finding the optimal targets and pathways on which complementary neuroprotective and neuroregenerative approaches should focus. In terms of neuroprotection, several active substances and cell-based therapies, together with biomaterials releasing bioactive substances, have shown partial improvement after spinal cord injury. However, one of the major challenges is to select specific therapies that can be combined safely, and in the appropriate order, to provide the maximum value of each individual treatment.

    LA SEÑORA [Graphic material]

    Digital copy. Madrid: Ministerio de Educación, Cultura y Deporte. Subdirección General de Coordinación Bibliotecaria, 201

    Central American Trachemys revisited: New sampling questions current understanding of taxonomy and distribution (Testudines: Emydidae)

    Using 3226-bp-long mtDNA sequences and five nuclear loci (Cmos, ODC, R35, Rag1, Rag2; together 3409 bp), we examine the genetic differentiation and relationships of Central American slider turtles (Trachemys grayi, T. venusta). Our investigation also included samples from taxa endemic to North America (T. gaigeae, T. scripta), the Antilles (T. decorata, T. decussata, T. stejnegeri, T. terrapen), and South America (T. dorbigni, T. medemi, plus the two T. venusta subspecies endemic to northern South America). Our mitochondrial phylogeny retrieves all studied species as distinct, with three well-supported clades in a polytomy: (1) the Central and South American species (T. grayi + T. venusta) + (T. dorbigni + T. medemi), (2) the Antillean species, and (3) T. gaigeae + T. scripta. Our nuclear DNA analyses also suggest three distinct but conflicting clusters: (1) T. scripta plus the Antillean species, (2) T. gaigeae, and (3) the Central and South American species T. dorbigni, T. grayi, T. medemi, and T. venusta. In the mitochondrial phylogeny, however, T. gaigeae is a weakly divergent sister taxon of T. scripta. This conflicting placement of T. gaigeae suggests a distinct evolutionary trajectory, with old hybridization with T. scripta and mitochondrial capture. Despite prominent color pattern differences, genetic divergences within T. grayi and T. venusta are shallow, and the taxonomic diversity of each species, with several currently recognized subspecies, may be overestimated. Finally, we provide the first evidence for the occurrence of T. grayi along the Caribbean versant of Costa Rica.

    Dispersal limitations and historical factors determine the biogeography of specialized terrestrial protists

    Recent studies show that soil eukaryotic diversity is immense and dominated by micro-organisms. However, it is unclear to what extent the processes that shape the distribution of diversity in plants and animals also apply to micro-organisms. Major diversification events in multicellular organisms have often been attributed to long-term climatic and geological processes, but the impact of such processes on protist diversity has received much less attention as their distribution has often been believed to be largely cosmopolitan. Here, we quantified phylogeographical patterns in Hyalosphenia papilio, a large testate amoeba restricted to Holarctic Sphagnum-dominated peatlands, to test if the current distribution of its genetic diversity can be explained by historical factors or by the current distribution of suitable habitats. Phylogenetic diversity was higher in Western North America, corresponding to the inferred geographical origin of the H. papilio complex, and was lower in Eurasia despite extensive suitable habitats. These results suggest that patterns of phylogenetic diversity and distribution can be explained by the history of Holarctic Sphagnum peatland range expansions and contractions in response to Quaternary glaciations that promoted cladogenetic range evolution, rather than the contemporary distribution of suitable habitats. Species distributions were positively correlated with climatic niche breadth, suggesting that climatic tolerance is key to dispersal ability in H. papilio. This implies that, at least for large and specialized terrestrial micro-organisms, propagule dispersal is slow enough that historical processes may contribute to their diversification and phylogeographical patterns and may partly explain their very high overall diversity

    Bayesian Network Modeling and Expert Elicitation for Probabilistic Eruption Forecasting: Pilot Study for Whakaari/White Island, New Zealand

    Bayesian Networks (BNs) are probabilistic graphical models that provide a robust and flexible framework for understanding complex systems. A limited number of case studies have demonstrated the potential of BNs for modeling multiple data streams in eruption forecasting and volcanic hazard assessment; nevertheless, BNs are not widely employed in volcano observatories. Motivated by their need to determine eruption-related fieldwork risks, we have worked closely with the New Zealand volcano monitoring team to appraise BNs for eruption forecasting, with the purpose, at this stage, of assessing the utility of the concept rather than developing a full operational framework. We adapted a previously published BN for a pilot study to forecast volcanic eruptions at Whakaari/White Island. Developing the model structure provided a useful framework for the members of the volcano monitoring team to share their knowledge and interpretation of the volcanic system. We aimed to capture the conceptual understanding of the volcanic processes and to represent all observables that are regularly monitored. The pilot model has a total of 30 variables, four of them describing the volcanic processes that can lead to three different types of eruption: phreatic, magmatic explosive, and magmatic effusive. The remaining 23 variables are grouped into observations related to seismicity, fluid geochemistry, and surface manifestations. To estimate the model parameters, we held a workshop with 11 experts, including two from outside the monitoring team. To reduce the number of conditional probabilities the experts needed to estimate, each variable is described by only two states. However, the experts were concerned about this limitation, particularly for continuous data, and were therefore reluctant to define thresholds to distinguish between states. We conclude that volcano monitoring requires BN modeling techniques that can accommodate continuous variables. More work is required to link unobservable (latent) processes with observables and with eruptive patterns, and to model dynamic processes. A provisional application of the pilot model revealed several key insights. Refining the BN modeling techniques will help advance understanding of volcanoes and improve capabilities for forecasting volcanic eruptions. We consider that BNs will become essential for handling ever-growing volumes of observations and for assessing their evidential meaning in operational eruption forecasting.
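    The abstract's core idea, binary-state variables linked to latent volcanic processes, can be illustrated with a deliberately tiny hypothetical network. This is not the 30-variable Whakaari pilot model, and all probabilities below are invented: a latent "unrest" node drives two observables and an eruption node, and the eruption probability given the observations is obtained by summing out the latent state.

```python
# Hypothetical three-observable sketch (NOT the Whakaari pilot model):
# a binary latent 'unrest' process conditions tremor, gas flux, and
# eruption. Inference is exact enumeration over the latent state.
P_unrest = {True: 0.2, False: 0.8}              # prior on the latent process
P_tremor = {True: {True: 0.7, False: 0.3},      # P(tremor | unrest)
            False: {True: 0.1, False: 0.9}}
P_gas    = {True: {True: 0.6, False: 0.4},      # P(high gas flux | unrest)
            False: {True: 0.2, False: 0.8}}
P_erupt  = {True: {True: 0.3, False: 0.7},      # P(eruption | unrest)
            False: {True: 0.01, False: 0.99}}

def p_eruption_given(tremor, gas):
    """P(eruption | tremor, gas), summing out the latent 'unrest' node."""
    num = den = 0.0
    for unrest in (True, False):
        joint = P_unrest[unrest] * P_tremor[unrest][tremor] * P_gas[unrest][gas]
        num += joint * P_erupt[unrest][True]
        den += joint
    return num / den

print(round(p_eruption_given(True, True), 3))   # → 0.254
```

    Even in this toy, concordant observations sharply raise the eruption probability relative to the prior, which is the evidential reasoning the abstract argues BNs should provide for operational forecasting.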

    The EU Center of Excellence for Exascale in Solid Earth (ChEESE): Implementation, results, and roadmap for the second phase

    The EU Center of Excellence for Exascale in Solid Earth (ChEESE) develops exascale transition capabilities in the domain of Solid Earth, an area of geophysics rich in computational challenges embracing different approaches to exascale (capability, capacity, and urgent computing). The first implementation phase of the project (ChEESE-1P; 2018-2022) addressed scientific and technical computational challenges in seismology, tsunami science, volcanology, and magnetohydrodynamics, in order to understand the phenomena, anticipate the impact of natural disasters, and contribute to risk management. The project initiated the optimisation of 10 community flagship codes for the upcoming exascale systems and implemented 12 Pilot Demonstrators that combine the flagship codes with dedicated workflows in order to address the underlying capability and capacity computational challenges. Pilot Demonstrators reaching more mature Technology Readiness Levels (TRLs) were further enabled in operational service environments on critical aspects of geohazards such as long-term and short-term probabilistic hazard assessment, urgent computing, and early warning and probabilistic forecasting. Partnership and service co-design with members of the project Industry and User Board (IUB) leveraged the uptake of results across multiple research institutions, academia, industry, and public governance bodies (e.g. civil protection agencies). This article summarises the implementation strategy and the results from ChEESE-1P, outlining also the underpinning concepts and the roadmap for the on-going second project implementation phase (ChEESE-2P; 2023-2026). This work has been funded by the European Union Horizon 2020 research and innovation program under the ChEESE project, Grant Agreemen