616 research outputs found
EPSILOD: efficient parallel skeleton for generic iterative stencil computations in distributed GPUs
Iterative stencil computations are widely used in numerical simulations. They present a high degree of parallelism, high locality and mostly-coalesced memory access patterns. Therefore, GPUs are good candidates to speed up their computation. However, the development of stencil programs that can work with huge grids in distributed systems with multiple GPUs is not straightforward, since it requires solving problems related to the partition of the grid across nodes and devices, and the synchronization and data movement across remote GPUs. In this work, we present EPSILOD, a high-productivity parallel programming skeleton for iterative stencil computations on distributed multi-GPU systems, of the same or different vendors, that supports any type of n-dimensional geometric stencil of any order. It uses an abstract specification of the stencil pattern (neighbors and weights) to internally derive the data partition, synchronizations and communications. Computation is split to better overlap with communications. This paper describes the underlying architecture of EPSILOD and its main components, and presents an experimental evaluation to show the benefits of our approach, including a comparison with another state-of-the-art solution. The experimental results show that EPSILOD is faster and shows good strong and weak scalability on platforms with both homogeneous and heterogeneous types of GPU.
Funding: Junta de Castilla y León, Ministerio de Economía, Industria y Competitividad, and Fondo Europeo de Desarrollo Regional (FEDER): PCAS project (TIN2017-88614-R) and PROPHET-2 project (VA226P20). Ministerio de Ciencia e Innovación, Agencia Estatal de Investigación and "European Union NextGenerationEU/PRTR" (MCIN/AEI/10.13039/501100011033), grant TED2021-130367B-I00. CTE-POWER and Minotauro and the technical support provided by Barcelona Supercomputing Center (RES-IM-2021-2-0005, RES-IM-2021-3-0024, RES-IM-2022-1-0014). Open-access publication funded by the Consorcio de Bibliotecas Universitarias de Castilla y León (BUCLE), under Operational Programme 2014ES16RFOP009 FEDER 2014-2020 de Castilla y León, action 20007-CL - Apoyo Consorcio BUCL
Tiny Machine Learning Environment: Enabling Intelligence on Constrained Devices
Running machine learning (ML) algorithms on constrained devices at the extreme edge of the network is problematic due to the computational overhead of ML algorithms, the resources available on the embedded platform, and the application budget (i.e., real-time requirements, power constraints, etc.). This has required the development of specific solutions and development tools for what is now referred to as TinyML. In this dissertation, we focus on improving the deployment and performance of TinyML applications, taking into consideration the aforementioned challenges, especially memory requirements.
This dissertation contributed to the construction of the Edge Learning Machine (ELM) environment, a platform-independent open-source framework that provides three main TinyML services, namely shallow ML, self-supervised ML, and binary deep learning on constrained devices. In this context, the work comprises the following steps, which are reflected in the thesis structure. First, we present a performance analysis of state-of-the-art shallow ML algorithms, including dense neural networks, implemented on mainstream microcontrollers. The comprehensive analysis in terms of algorithms, hardware platforms, datasets, preprocessing techniques, and configurations shows performance comparable to a desktop machine and highlights the impact of these factors on overall performance. Second, despite the assumption that the scarcity of resources restricts TinyML to model inference, we have gone a step further and enabled self-supervised on-device training on microcontrollers and tiny IoT devices by developing the Autonomous Edge Pipeline (AEP) system. AEP achieves accuracy comparable to the typical TinyML paradigm, i.e., models trained on resource-abundant devices and then deployed on microcontrollers. Next, we present the development of a memory allocation strategy for convolutional neural network (CNN) layers that optimizes memory requirements. This approach reduces the memory footprint without affecting accuracy or latency. Moreover, e-skin systems share the main requirements of the TinyML field: enabling intelligence with low memory, low power consumption, and low latency. Therefore, we designed an efficient tiny CNN architecture for e-skin applications. The architecture leverages the memory allocation strategy presented earlier and provides better performance than existing solutions.
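The memory-saving intuition behind activation-buffer planning can be sketched as follows. This is an assumption-laden illustration, not the thesis's actual allocator: it assumes a strictly sequential CNN where only the current layer's input and output activation buffers must be live at once, and the function name is mine.

```python
# Illustrative sketch (not the thesis's algorithm): for a sequential CNN
# executed layer by layer, only the current layer's input and output
# activation buffers need to coexist, so peak activation memory is the
# maximum over layers of (input size + output size) -- typically far below
# the sum of all activation tensors.

def peak_activation_bytes(layer_sizes, bytes_per_elem=1):
    """layer_sizes: activation tensor element counts [input, l1_out, l2_out, ...]."""
    pairs = zip(layer_sizes, layer_sizes[1:])
    return max(a + b for a, b in pairs) * bytes_per_elem

# Toy int8 network: 28x28x1 input -> 26x26x8 -> 13x13x8 -> 11x11x16 -> 10
sizes = [28 * 28 * 1, 26 * 26 * 8, 13 * 13 * 8, 11 * 11 * 16, 10]
peak = peak_activation_bytes(sizes)   # compare with sum(sizes) if all coexisted
```

Real TinyML allocators go further (e.g., overlapping input and output in one arena when the layer allows it), but even this simple pairwise bound shows why scheduling-aware allocation matters on devices with a few hundred kilobytes of RAM.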
A major contribution of the thesis is CBin-NN, a library of functions for implementing extremely efficient binary neural networks on constrained devices. The library outperforms state-of-the-art NN deployment solutions by drastically reducing memory footprint and inference latency. All the solutions proposed in this thesis have been implemented on representative devices and tested in relevant applications, whose results are reported and discussed. The ELM framework is open source, and this work is becoming a useful, versatile toolkit for the IoT and TinyML research and development community.
Translating Islamic Law: the postcolonial quest for minority representation
This research sets out to investigate how culture-specific or signature concepts are rendered in English-language discourse on Islamic, or ‘shariʿa’ law, which has Arabic roots. A large body of literature has investigated Islamic law from a technical perspective. However, from the perspective of linguistics and translation studies, little attention has been paid to the lexicon that makes up this specialised discourse. Much of the commentary has so far been prescriptive, with limited empirical evidence. This thesis aims to bridge this gap by exploring how ‘culturalese’ (i.e., ostensive cultural discourse) travels through language, as evidenced in the self-built Islamic Law Corpus (ILC), a 9-million-word monolingual English corpus, covering diverse genres on Islamic finance and family law.
Using a mixed methods design, the study first quantifies the different linguistic strategies used to render shariʿa-based concepts in English, in order to explore ‘translation’ norms based on linguistic frequency in the corpus. This quantitative analysis employs two models: profile-based correspondence analysis, which considers the probability of lexical variation in expressing a conceptual category, and logistic regression (using MATLAB programming software), which measures the influence of the explanatory variables ‘genre’, ‘legal function’ and ‘subject field’ on the choice between an Arabic loanword and an endogenous English lexeme, i.e., a close English equivalent. The findings are then interpreted qualitatively in the light of postcolonial translation agendas, which aim to preserve intangible cultural heritage and promote the representation of minoritised groups.
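The regression model described above can be sketched in miniature. The data below are invented toy observations, not the ILC, and the thesis implements the analysis in MATLAB; this pure-Python fitting loop only illustrates the model form, with a one-hot 'genre' predictor for the binary choice between an Arabic loanword (1) and an English equivalent (0).

```python
# Hedged sketch (toy data, invented names): logistic regression of loanword
# choice on genre, fitted by plain gradient descent on the negative
# log-likelihood.
import math

# (genre, used_loanword): hypothetical observations for two genres.
data = [("finance", 1)] * 8 + [("finance", 0)] * 2 + \
       [("family", 1)] * 3 + [("family", 0)] * 7

def fit(data, lr=0.5, steps=2000):
    w = {"bias": 0.0, "finance": 0.0, "family": 0.0}
    for _ in range(steps):
        grad = {k: 0.0 for k in w}
        for genre, y in data:
            p = 1 / (1 + math.exp(-(w["bias"] + w[genre])))
            for k in ("bias", genre):
                grad[k] += (p - y)          # d(neg log-likelihood)/dw
        for k in w:
            w[k] -= lr * grad[k] / len(data)
    return w

w = fit(data)
p_finance = 1 / (1 + math.exp(-(w["bias"] + w["finance"])))
```

With these toy counts the fitted probability of a loanword in the 'finance' genre converges towards its observed frequency, which is the kind of genre effect the regression in the thesis is designed to quantify.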
The research finds that the English-language discourse on Islamic law is characterised by linguistic borrowing and glossing, implying an ideologically driven variety of English that can be usefully labelled as a kind of ‘Islamgish’ (blending ‘Islamic’ and ‘English’) aimed at retaining symbols of linguistic hybridity. The regression analysis confirms the influence of the above-mentioned contextual factors on the use of an Arabic loanword versus English alternatives.
Evaluation of user response by using visual cues designed to direct the viewer's attention to the main scene in an immersive environment
Today, the visualization of 360-degree videos has become a means of living immersive experiences. However, an important challenge to overcome is how to guide the viewer's attention to the video's main scene without interrupting the immersion experience and the narrative thread. To meet this challenge, we developed a software prototype to assess three approaches: Arrows, Radar and Auto Focus. These are based on visual guidance cues used in first-person shooter games, such as Radar-Sonar, Radar-Compass and Arrows. In the study, a questionnaire was administered to evaluate comprehension of the narrative, the user's perspective on the design of the visual cues, and the usability of the system. In addition, data was collected on the movement of the user's head in order to analyze the focus of attention. The study used statistical methods to perform the analysis; the results show that participants who used any of the visual cues showed significant improvements over the control group (which used no visual cues) in finding the main scene. With respect to narrative comprehension, significant improvements were obtained in the user groups that used Radar and Auto Focus compared to the control group.
Contributions to time series analysis, modelling and forecasting to increase reliability in industrial environments.
The integration of the Internet of Things into the industrial sector is key to achieving business intelligence. This study focuses on improving or proposing new approaches to increase the reliability of AI solutions based on time-series data in industry. Three phases are addressed: improving data quality, models, and errors. A standard definition of quality metrics is proposed and included in the dqts package for R. The steps of time-series modelling are explored, from feature extraction to the choice and application of the most efficient prediction model. The KNPTS method, based on searching for patterns in the historical data, is presented as an R package for estimating future data. In addition, the use of elastic similarity measures for evaluating regression models is suggested, as is the importance of adequate metrics for imbalanced-class problems. The contributions were validated in industrial use cases from different fields: product quality, electricity consumption forecasting, porosity detection, and machine diagnostics.
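The pattern-search idea behind a method like KNPTS can be sketched roughly as follows. This is my illustration of the general approach, not the actual R package, whose similarity measure and aggregation may differ: find the k historical windows most similar to the most recent observations and forecast by averaging what followed each of them.

```python
# Hedged sketch of nearest-pattern forecasting (illustrative only):
# match the last observed window against all historical windows by squared
# Euclidean distance, then average the continuations of the k best matches.

def pattern_forecast(series, window=3, horizon=2, k=2):
    query = series[-window:]
    candidates = []
    for i in range(len(series) - window - horizon + 1):
        past = series[i:i + window]
        dist = sum((a - b) ** 2 for a, b in zip(past, query))
        candidates.append((dist, series[i + window:i + window + horizon]))
    candidates.sort(key=lambda t: t[0])            # most similar patterns first
    nearest = [fut for _, fut in candidates[:k]]
    return [sum(vals) / k for vals in zip(*nearest)]

# A repeating pattern: the forecast should continue the cycle.
series = [1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3]
print(pattern_forecast(series, window=3, horizon=2, k=2))  # → [1.0, 2.0]
```

Elastic similarity measures of the kind the thesis advocates (e.g., DTW-style distances) would replace the squared Euclidean distance here, allowing patterns that are slightly shifted or stretched in time to match.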
Algorithms for automated diagnosis of cardiovascular diseases based on ECG data: A comprehensive systematic review
The prevalence of cardiovascular diseases is increasing around the world. At the same time, the technology is evolving, and these diseases can be monitored with low-cost sensors anywhere at any time. This subject is being actively researched, and different methods can automatically identify these diseases, helping patients and healthcare professionals with treatment. This paper presents a systematic review of disease identification, classification, and recognition with ECG sensors. The review focused on studies published between 2017 and 2022 in different scientific databases, including PubMed Central, Springer, Elsevier, Multidisciplinary Digital Publishing Institute (MDPI), IEEE Xplore, and Frontiers, and resulted in the quantitative and qualitative analysis of 103 scientific papers. The study demonstrated that different datasets are available online with data related to various diseases. Several ML/DL-based models were identified in the research, with Convolutional Neural Networks and Support Vector Machines being the most applied algorithms. This review allows us to identify the techniques that can be used in a system that promotes the patient's autonomy.
Business valuation using fuzzy logic
The complexity of decision-making in the field of economics and finance has increased in recent years. As a result, growing attention is being paid to the development and implementation of mathematical models that can address these problems. Research in the field of fuzzy logic has been a topic of increasing interest for many decades, since it is a fundamental and common concept in science. Since 1965, when the seminal work "Fuzzy sets" (Zadeh, L. A., 1965) was published, there has been a shift from binary logic to multivalued logic. This shift gives way to theories of uncertainty, through a fuzzy methodology, that make it possible to consider all possible scenarios in decision-making, taking into account the objectivity and subjectivity of the parameters under consideration.
In general, the main objective of this doctoral thesis is to identify business characteristics and opportunities through a business valuation analysis that allows a better interpretation of the uncertain context for decision-making. That is, decision theory under uncertainty is developed through business valuation. The current state of the field is analysed, and the contributions we can make are studied using the main fuzzy-logic algorithms developed by authors such as J. Gil Aluja, A. Kaufmann and R. Yager, among others, with special emphasis on those that have been applied to the business and financial domains. Business valuation is a fundamental and complex process in economic-financial systems. In an environment evolving towards more complex and uncertain forms, it is necessary to present new, more dynamic business valuation models based on techniques for handling and managing uncertainty and decision-making, to eliminate ambiguity and confusion in uncertain environments.
The first contribution of this work is an analysis of the state of the art, carried out through two bibliometric studies of the scientific community's contributions to fuzzy logic and business valuation. It highlights the importance of subjective factors when making decisions in an economic and financial environment.
The second contribution is the development of applications that demonstrate decision-making under uncertainty applied to business valuation methods. This study allows us to develop generic algorithms and mathematical models that can be applied to business practice, in order to test their usefulness. In this work, we highlight the adequacy coefficient, the qualification coefficient, the Hamming distance, clone theory, the subjective preference model, the Hungarian algorithm, OWA operators, intervals and expertons.
The third contribution is a new algorithm that combines fuzzy mathematics and business valuation, contributing to the development of decision theory in the business domain. Specifically, a business valuation model is developed using discounted cash flows and fuzzy mathematics, showing its usefulness and its potential application by the academic and professional community in subsequent analyses of a company's value. The proposed model systematises and orders the use of intervals to establish minimum and maximum business values for the company. We thus obtain a confidence interval for the possible business value.
Finally, we could say that, at a general level, there are two important contributions to highlight in this doctoral thesis: applicability and development. We apply algorithms and models to business valuation methods, and we develop a new algorithm that contributes to the development of decision theory.
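The interval-based discounted-cash-flow idea can be sketched as follows. This is my illustration under simple interval arithmetic, not the thesis's exact algorithm, and all numbers are invented: cash flows and the discount rate are given as [low, high] confidence intervals, and the arithmetic propagates the uncertainty into a [minimum, maximum] business value.

```python
# Hedged sketch (illustrative, not the thesis's model): discounted cash
# flow with interval-valued cash flows and discount rate. The value interval
# follows from monotonicity: the worst case pairs low cash flows with the
# high rate, the best case pairs high cash flows with the low rate.

def interval_dcf(cashflows, rate):
    """cashflows: list of (low, high) per year; rate: (low, high)."""
    lo = sum(cf_lo / (1 + rate[1]) ** t          # worst case: low CF, high rate
             for t, (cf_lo, _) in enumerate(cashflows, start=1))
    hi = sum(cf_hi / (1 + rate[0]) ** t          # best case: high CF, low rate
             for t, (_, cf_hi) in enumerate(cashflows, start=1))
    return lo, hi

# Three years of uncertain cash flows, discount rate between 8% and 12%.
value = interval_dcf([(90, 110), (95, 120), (100, 130)], (0.08, 0.12))
```

The resulting pair is exactly the kind of minimum/maximum business value the thesis systematises: a confidence interval for the company's worth rather than a single crisp figure.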
Study of the evolution of the future climate of the Mediterranean region with a regionally coupled climate model
By the end of the 21st century, the Mediterranean Sea is expected to be among the regions most vulnerable to the impacts of climate change. In this context, climate change lies at the heart of sustainable human-environment interaction in the Mediterranean. As such, the region is an optimal test bed for new approaches to science-society partnership sustained by the provision of adequate climate information applicable to a broad range of vulnerable sectors. The region is located in a transitional area between tropical and mid-latitudes and presents complex orography and coastlines where intense local air-sea and land-sea interactions take place. These intense local air-sea interactions lead to deep convection which, together with the inflow of Atlantic surface water, drives the Mediterranean thermohaline circulation. Global climate models have too coarse a resolution to correctly describe these air-sea fluxes of energy and mass, while stand-alone atmospheric models can be inadequate to simulate the correct fluxes. For these reasons, the Mediterranean Sea is a region where regional climate system models are critical for the study of processes in the atmosphere and ocean.
In this doctoral dissertation, we analyze simulations of the regional climate system model ROM in order to assess the role of ocean-atmosphere feedbacks in the simulation of the present climate and of the downscaled climate change signal. In ROM, the regional atmosphere model (REMO) is coupled to the global ocean model (MPIOM), which has regionally high horizontal resolution. The coupling is only effective within a selected domain, where the ocean and the atmosphere interact. Outside this domain, the ocean model is uncoupled and driven by prescribed atmospheric forcing.
The simulation forced by the ERA-Interim reanalysis shows an accurate representation of the present Mediterranean climate. The biases of the main physical variables are in the range of those shown by other state-of-the-art regional climate models. Our analysis of the simulation driven by MPI-ESM under the RCP8.5 scenario shows that Mediterranean waters will be warmer and saltier by the end of this century. In the upper ocean layer, temperature is projected to increase by a mean of 2.7 °C, while mean salinity will increase by 0.2 psu. The warming that initially takes place at the surface is transferred progressively to deeper layers. These changes in the hydrographic properties of surface and intermediate waters strengthen the stratification, hampering vertical mixing and thus convection in the main spots for deep water formation in the Mediterranean Sea. These changes seem to have an impact on the deep ventilation and on the thermohaline circulation of the Mediterranean Sea by the end of the 21st century.
Making Presentation Math Computable
This open-access book addresses the issue of translating mathematical expressions from LaTeX to the syntax of Computer Algebra Systems (CAS). Over the past decades, especially in the domain of Science, Technology, Engineering, and Mathematics (STEM), LaTeX has become the de facto standard for typesetting mathematical formulae in publications. Since scientists are generally required to publish their work, LaTeX has become an integral part of today's publishing workflow. On the other hand, modern research increasingly relies on CAS to simplify, manipulate, compute, and visualize mathematics. However, existing LaTeX import functions in CAS are limited to simple arithmetic expressions and are therefore insufficient for most use cases. Consequently, the workflow of experimenting and publishing in the sciences often includes time-consuming and error-prone manual conversions between presentational LaTeX and computational CAS formats. To address the lack of a reliable and comprehensive translation tool between LaTeX and CAS, this thesis makes the following three contributions. First, it provides an approach to semantically enhance LaTeX expressions with sufficient semantic information for translation into CAS syntaxes. Second, it demonstrates LaCASt, the first context-aware LaTeX-to-CAS translation framework. Third, the thesis provides a novel approach to evaluate the performance of LaTeX-to-CAS translations on large-scale datasets with automatic verification of equations in digital mathematical libraries.
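The gap between presentational LaTeX and CAS syntax can be illustrated with a naive rule-based converter. This is not LaCASt; the rules and function names are mine. It handles a few constructs but, as the book argues, lacks the semantic context a real translation needs (for instance, it cannot decide whether `sin x` denotes function application or multiplication).

```python
# Illustrative sketch (not LaCASt): purely syntactic rewriting of a few
# LaTeX constructs into a generic CAS-like syntax. The exponent rule runs
# first so that \frac arguments no longer contain braces when matched.
import re

RULES = [
    (re.compile(r"\^\{([^{}]+)\}"), r"^(\1)"),                     # x^{2} -> x^(2)
    (re.compile(r"\\frac\{([^{}]+)\}\{([^{}]+)\}"), r"((\1)/(\2))"),
    (re.compile(r"\\sqrt\{([^{}]+)\}"), r"sqrt(\1)"),
    (re.compile(r"\\(sin|cos|tan|log)\b"), r"\1"),                 # drop backslash
]

def latex_to_cas(expr):
    for pattern, repl in RULES:
        expr = pattern.sub(repl, expr)
    return expr

print(latex_to_cas(r"\frac{x^{2}}{2} + \sin x"))  # → ((x^(2))/(2)) + sin x
```

Nested fractions, implicit multiplication, and context-dependent symbols already defeat rules like these, which is precisely the motivation for the semantic enhancement and context-aware translation the book develops.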