
    neuroAIx-Framework: design of future neuroscience simulation systems exhibiting execution of the cortical microcircuit model 20× faster than biological real-time

    Introduction: Research in the field of computational neuroscience relies on highly capable simulation platforms. With real-time capabilities surpassed for established models like the cortical microcircuit, it is time to conceive next-generation systems: neuroscience simulators providing significant acceleration, even for larger networks with natural density, biologically plausible multi-compartment models, and the modeling of long-term and structural plasticity. Methods: Stressing the need for agility to adapt to new concepts or findings in the domain of neuroscience, we have developed the neuroAIx-Framework, consisting of an empirical modeling tool, a virtual prototype, and a cluster of FPGA boards. This framework is designed to support and accelerate the continuous development of such platforms driven by new insights in neuroscience. Results: Based on design space explorations using this framework, we devised and realized an FPGA cluster consisting of 35 NetFPGA SUME boards. Discussion: This system functions as an evaluation platform for our framework. At the same time, it resulted in a fully deterministic neuroscience simulation system surpassing the state of the art in both performance and energy efficiency. It is capable of simulating the microcircuit with 20× acceleration compared to biological real-time and achieves an energy efficiency of 48 nJ per synaptic event.
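The two headline figures above (20× speed-up, 48 nJ per synaptic event) imply a rough power envelope. The sketch below works that out; the microcircuit parameters (synapse count, mean firing rate) are approximate literature values for the cortical microcircuit model, not numbers taken from this abstract, so treat them as illustrative assumptions.

```python
# Back-of-envelope power estimate from the reported 48 nJ per synaptic event.
# N_SYNAPSES and MEAN_RATE_HZ are assumed, rough literature values for the
# cortical microcircuit model -- not figures from the abstract itself.

N_SYNAPSES = 3.0e8          # ~0.3 billion synapses (assumed)
MEAN_RATE_HZ = 4.0          # mean firing rate in biological time (assumed)
ACCELERATION = 20.0         # reported speed-up vs. biological real-time
ENERGY_PER_EVENT_J = 48e-9  # reported 48 nJ per synaptic event

events_per_bio_second = N_SYNAPSES * MEAN_RATE_HZ           # ~1.2e9 events/s
events_per_wall_second = events_per_bio_second * ACCELERATION
power_w = events_per_wall_second * ENERGY_PER_EVENT_J

print(f"{events_per_bio_second:.2e} synaptic events per biological second")
print(f"~{power_w:.0f} W implied by 48 nJ/event at 20x speed-up")
```

Under these assumed parameters the 48 nJ/event figure corresponds to a power draw on the order of a kilowatt for the whole 35-board cluster at full acceleration.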

    Modelling and Solving the Single-Airport Slot Allocation Problem

    Currently, there are about 200 overly congested airports where airport capacity does not suffice to accommodate airline demand. These airports play a critical role in the global air transport system since they concern 40% of global passenger demand and act as a bottleneck for the entire air transport system. This imbalance between airport capacity and airline demand leads to excessive delays, as well as multi-billion economic costs and significant environmental and societal costs. Concurrently, the implementation of airport capacity expansion projects requires time and space, and is subject to significant resistance from local communities. As a short- to medium-term response, Airport Slot Allocation (ASA) has been used as the main demand management mechanism. The main goal of this thesis is to improve ASA decision-making through the proposition of models and algorithms that provide enhanced ASA decision support. In doing so, this thesis is organised into three distinct chapters that shed light on the following questions (I–V), which remain unaddressed in the existing literature. In parentheses, we identify the chapters of this thesis that relate to each research question. I. How to improve the modelling of airline demand flexibility and the utility that each airline assigns to each available airport slot? (Chapters 2 and 4) II. How can one model the dynamic and endogenous adaptation of the airport’s landside and airside infrastructure to the characteristics of airline demand? (Chapter 2) III. How to consider operational delays in strategic ASA decision-making? (Chapter 3) IV. How to involve the pertinent stakeholders in the ASA decision-making process to select a commonly agreed schedule; and how can one reduce the inherent decision complexity without compromising the quality and diversity of the schedules presented to the decision-makers? (Chapter 3) V. 
Given that the ASA process involves airlines (submitting requests for slots) and coordinators (assigning slots to requests based on a set of rules and priorities), how can one jointly consider the interactions between these two sides to improve ASA decision-making? (Chapter 4) With regards to research questions (I) and (II), the thesis proposes a Mixed Integer Programming (MIP) model that considers airlines’ timing flexibility (research question I) and constraints that enable the dynamic and endogenous allocation of the airport’s resources (research question II). The proposed modelling variant addresses several additional problem characteristics and policy rules, and considers multiple efficiency objectives, while integrating all constraints that may affect airport slot scheduling decisions, including the asynchronous use of the different airport resources (runway, aprons, passenger terminal) and the endogenous consideration of the capabilities of the airport’s infrastructure to adapt to the airline demand’s characteristics and the aircraft/flight type associated with each request. The proposed model is integrated into a two-stage solution approach that considers all primary and several secondary policy rules of ASA. New combinatorial results and valid tightening inequalities that facilitate the solution of the problem are proposed and implemented. An extension of the above MIP model that considers the trade-offs among schedule displacement, maximum displacement, and the number of displaced requests, is integrated into a multi-objective solution framework. The proposed framework holistically considers the preferences of all ASA stakeholder groups (research question IV) concerning multiple performance metrics and models the operational delays associated with each airport schedule (research question III). 
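The displacement objective at the heart of these MIP models can be illustrated on a toy instance. The sketch below brute-forces a minimum-total-displacement assignment of requests to slots; the thesis's actual formulations add many further constraints (asynchronous resource use, terminal and apron capacity, policy rules), and the airline names and times here are hypothetical.

```python
# Toy illustration of the core ASA objective: assign each request to one slot
# so that total displacement (|assigned time - requested time|) is minimised.
# Brute force over permutations -- only viable for tiny instances; the thesis
# uses a Mixed Integer Programming model with many additional constraints.
from itertools import permutations

requested = {"AL1": 8, "AL2": 8, "AL3": 9}  # requested slot times (hours), hypothetical
slots = [8, 9, 10]                           # one movement per slot (assumed capacity)

best_cost, best_assignment = None, None
for perm in permutations(slots):
    cost = sum(abs(t - r) for t, r in zip(perm, requested.values()))
    if best_cost is None or cost < best_cost:
        best_cost = cost
        best_assignment = dict(zip(requested, perm))

print(best_assignment, "total displacement:", best_cost)
```

Here two airlines both request 08:00 but only one slot exists at that hour, so at least one request must be displaced; the minimum total displacement is 2 hours.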
The delays of each schedule/solution are macroscopically estimated, and a subtractive clustering algorithm and a parameter tuning routine reduce the inherent decision complexity by pruning non-dominated solutions without compromising the representativeness of the alternatives offered to the decision-makers (research question IV). Following the determination of the representative set, the expected delay estimates of each schedule are further refined by considering the whole airfield’s operations, the landside, and the airside infrastructure. The representative schedules are ranked based on the preferences of all ASA stakeholder groups concerning each schedule’s displacement-related and operational-delay performance. Finally, in considering the interactions between airlines’ timing flexibility and utility, and the policy-based priorities assigned by the coordinator to each request (research question V), the thesis models the ASA problem as a two-sided matching game and provides guarantees on the stability of the proposed schedules. A Stable Airport Slot Allocation Model (SASAM) capitalises on the flexibility considerations introduced for addressing research question (I) through the exploitation of data submitted by the airlines during the ASA process and provides functions that proxy each request’s value, considering both the airlines’ timing flexibility for each submitted request and the requests’ prioritisation by the coordinators under the policy rules defining the ASA process. The thesis argues for the compliance of the proposed functions with the primary regulatory requirements of the ASA process and demonstrates their applicability to different types of slot requests. SASAM guarantees stability through sets of inequalities that prune allocations blocking the formation of stable schedules. A multi-objective Deferred-Acceptance (DA) algorithm guaranteeing the stability of each generated schedule is developed. 
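The deferred-acceptance mechanism referred to above can be sketched in a few lines. This is the classic Gale-Shapley procedure with requests proposing to slots; the preference lists and coordinator priorities below are hypothetical, whereas the thesis's SASAM derives them from airlines' submitted flexibility data and the ASA policy rules.

```python
# Minimal deferred-acceptance (Gale-Shapley) sketch for request-to-slot
# matching. Requests propose to slots in preference order; each slot
# tentatively holds the highest-priority proposer and releases the rest.
# The resulting matching is stable: no request-slot pair prefers each
# other to their assigned partners.

def deferred_acceptance(request_prefs, slot_priority):
    """request_prefs: request -> ordered list of acceptable slots.
    slot_priority: slot -> ordered list of requests (highest priority first)."""
    held = {}                        # slot -> currently held request
    free = list(request_prefs)       # requests still proposing
    next_choice = {r: 0 for r in request_prefs}
    while free:
        r = free.pop(0)
        if next_choice[r] >= len(request_prefs[r]):
            continue                 # r exhausted its list: stays unmatched
        s = request_prefs[r][next_choice[r]]
        next_choice[r] += 1
        if s not in held:
            held[s] = r
        elif slot_priority[s].index(r) < slot_priority[s].index(held[s]):
            free.append(held[s])     # coordinator prefers r: bump the holder
            held[s] = r
        else:
            free.append(r)           # r rejected, will try its next slot
    return held

prefs = {"R1": ["S1", "S2"], "R2": ["S1", "S2"], "R3": ["S2", "S1"]}
priority = {"S1": ["R2", "R1", "R3"], "S2": ["R1", "R3", "R2"]}
print(deferred_acceptance(prefs, priority))
```

In this hypothetical instance R3 ends up unmatched (the "spilled demand" the multi-objective DA variant trades off against maximum displacement), while R2 and R1 obtain their assigned slots with no blocking pair.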
The algorithm can generate all stable non-dominated points by considering the trade-off between the spilled airline and passenger demand and maximum displacement. The work conducted in this thesis addresses several problem characteristics and sheds light on their implications for ASA decision-making, hence having the potential to improve it. Our findings suggest that the consideration of airlines’ timing flexibility (research question I) results in improved capacity utilisation and scheduling efficiency. The endogenous consideration of the ability of the airport’s infrastructure to adapt to the characteristics of airline demand (research question II) enables a more efficient representation of airport declared capacity that results in the scheduling of additional requests. The concurrent consideration of airlines’ timing flexibility and the endogenous adaptation of airport resources to airline demand achieves an improved alignment between the airport infrastructure and the characteristics of airline demand, thus proposing schedules of improved efficiency. The modelling and evaluation of the peak operational delays associated with the different airport schedules (research question III) allows the study of the implications of strategic ASA decision-making for operations and quantifies the impact of the airport’s declared capacity on each schedule’s operational performance. In considering the preferences of the relevant ASA stakeholders (airlines, coordinators, airport, and air traffic authorities) concerning multiple operational and strategic ASA efficiency metrics (research question IV), the thesis assesses the impact of alternative preference considerations and indicates a commonly preferred schedule that balances the stakeholders’ preferences. 
The proposition of representative subsets of alternative schedules reduces decision complexity without significantly compromising the quality of the alternatives offered to the decision-making process (research question IV). The modelling of the ASA as a two-sided matching game (research question V) results in stable schedules consisting of request-to-slot assignments that provide no incentive to airlines and coordinators to reject or alter the proposed timings. Furthermore, the proposition of stable schedules results in more intensive use of airport capacity, while simultaneously improving scheduling efficiency. The models and algorithms developed as part of this thesis are tested using airline requests and airport capacity data from coordinated airports. Computational results that are relevant to the context of the considered airport instances provide evidence of potential improvements for the current ASA process and facilitate data-driven policy and decision-making. In particular, with regard to the alignment of airline demand with the capabilities of the airport’s infrastructure (questions I and II), computational results report improved slot allocation efficiency and airport capacity utilisation, which for the considered airport instance translate to improvements ranging between 5% and 24% for various schedule performance metrics. In reducing the difficulty associated with the assessment of multiple ASA solutions by the stakeholders (question IV), instance-specific results suggest reductions in the number of alternative schedules by 87%, while maintaining the quality of the solutions presented to the stakeholders above 70% (expressed in relation to the initially considered set of schedules). 
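The pruning described above starts from the set of non-dominated schedules. A minimal sketch of that filtering step is shown below; the three metrics per schedule (say total displacement, maximum displacement, number of displaced requests, all minimised) and their values are hypothetical, and the thesis follows this with subtractive clustering to pick a representative subset.

```python
# Keep only non-dominated schedules: a schedule is dominated if another one
# is at least as good on every metric and strictly better on at least one
# (all metrics are to be minimised).

def pareto_front(solutions):
    front = []
    for i, a in enumerate(solutions):
        dominated = any(
            all(b[k] <= a[k] for k in range(len(a)))
            and any(b[k] < a[k] for k in range(len(a)))
            for j, b in enumerate(solutions) if j != i
        )
        if not dominated:
            front.append(a)
    return front

# (total displacement, max displacement, displaced requests) -- hypothetical
schedules = [(120, 30, 14), (100, 45, 12), (130, 30, 14), (100, 45, 15)]
print(pareto_front(schedules))
```

The third schedule is dominated by the first and the fourth by the second, so only two schedules survive; clustering then operates on this reduced set.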
Meanwhile, computational results suggest that the concurrent consideration of ASA stakeholders’ preferences (research question IV) with regard to both operational (research question III) and strategic performance metrics leads to alternative airport slot scheduling solutions that inform on the trade-offs between the schedules’ operational and strategic performance and the stakeholders’ preferences. Concerning research question (V), the application of SASAM and the DA algorithm suggests improvements to the number of unaccommodated flights and passengers (improvements of 13% and 40%, respectively) at the expense of requests concerning fewer passengers and days of operations (increasing the number of rejected requests by 1.2% in relation to the total number of submitted requests). The research conducted in this thesis aids in the identification of limitations that should be addressed by future studies to further improve ASA decision-making. First, the thesis focuses on exact solution approaches that consider the landside and airside infrastructure of the airport and generate multiple schedules. Pre-processing techniques that identify the bottleneck of the airport’s capacity, i.e., landside and/or airside, could be used to reduce the size of the proposed formulations and improve the required computational times. Meanwhile, the development of multi-objective heuristic algorithms that consider several problem characteristics and generate multiple efficient schedules in reasonable computational times could extend the capabilities of the models proposed in this thesis and provide decision support for some of the world’s most congested airports. Furthermore, the thesis models and evaluates the operational implications of strategic airport slot scheduling decisions. 
The explicit consideration of operational delays as an objective in ASA optimisation models and algorithms is an issue that merits investigation, since it may further improve the operational performance of the generated schedules. In accordance with current practice, the models proposed in this work have considered deterministic capacity parameters. Future research could propose formulations that consider stochastic representations of airport declared capacity and improve strategic ASA decision-making through the anticipation of operational uncertainty and weather-induced capacity reductions. Finally, in modelling airlines’ utility for each submitted request and available time slot, the thesis proposes time-dependent functions that utilise available data to approximate airlines’ scheduling preferences. Future studies wishing to improve the accuracy of the proposed functions could utilise commercial data sources that provide route-specific information; or, where such data are unavailable, employ data mining and machine learning methodologies to extract airlines’ time-dependent utility and preferences.

    Graphical scaffolding for the learning of data wrangling APIs

    In order for students across the sciences to avail themselves of modern data streams, they must first know how to wrangle data: how to reshape ill-organised tabular data into another format, and how to do this programmatically, in languages such as Python and R. Despite the cross-departmental demand and the ubiquity of data wrangling in analytical workflows, research on how to optimise its instruction has been minimal. Although data wrangling as a programming domain presents distinctive challenges - characterised by on-the-fly syntax lookup and code example integration - it also presents opportunities. One such opportunity is how easily tabular data structures are visualised. To leverage the inherent visualisability of data wrangling, this dissertation evaluates three types of graphics that could be employed as scaffolding for novices: subgoal graphics, thumbnail graphics, and parameter graphics. Using a specially built e-learning platform, this dissertation documents a multi-institutional, randomised, and controlled experiment that investigates the pedagogical effects of these graphics. Our results indicate that the graphics are well-received, that subgoal graphics boost the completion rate, and that thumbnail graphics improve navigability within a command menu. We also obtained several non-significant results, and indications that parameter graphics are counter-productive. We discuss these findings in the context of general scaffolding dilemmas, and how they fit into a wider research programme on data wrangling instruction.
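The reshaping task this dissertation's students practise can be made concrete with a tiny wide-to-long example. The sketch below spells out the mechanics in plain Python; in practice this is a single call to `pandas.melt` in Python or `tidyr::pivot_longer` in R, and the toy table here is invented for illustration.

```python
# Wide-to-long reshape ("melt") written out by hand: each wide row with two
# quiz columns becomes two long rows, one per (student, quiz) observation.

wide = [
    {"student": "A", "quiz1": 7, "quiz2": 9},
    {"student": "B", "quiz1": 5, "quiz2": 6},
]

long_form = [
    {"student": row["student"], "quiz": key, "score": row[key]}
    for row in wide
    for key in ("quiz1", "quiz2")
]

for record in long_form:
    print(record)
```

Seeing both representations side by side is exactly what the thumbnail and subgoal graphics studied here aim to visualise: the same data, reorganised from one column per quiz to one row per observation.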

    Emergency Advanced Clinical Practitioners: Quality and Acceptability


    Novel Cardiac Mapping Approaches and Multimodal Techniques to Unravel Multidomain Dynamics of Complex Arrhythmias Towards a Framework for Translational Mechanistic-Based Therapeutic Strategies

    Cardiac arrhythmias are a major problem for health systems in the developed world due to their high incidence and prevalence as the population ages. Atrial fibrillation (AF) and ventricular fibrillation (VF) are amongst the most complex arrhythmias seen in clinical practice. 
Clinical consequences of such arrhythmic disturbances include the development of complex cardio-embolic events in AF, and dramatic repercussions due to sustained life-threatening fibrillatory processes with subsequent neurological damage under VF, leading to cardiac arrest and sudden cardiac death (SCD). However, despite the technological advances of the last decades, their intrinsic mechanisms are incompletely understood and, to date, therapeutic strategies lack a sufficient mechanistic basis and have low success rates. Most of the progress in developing optimal biomarkers and novel therapeutic strategies in this field has come from valuable techniques in the research of arrhythmia mechanisms. Amongst the mechanisms involved in the induction and perpetuation of cardiac arrhythmias such as AF, dynamic high-frequency re-entrant and focal sources, in their different modalities, are thought to be the primary sources underlying the arrhythmia. However, little is known about the attractors and spatiotemporal dynamics of such primary fibrillatory sources, specifically the dominant rotational or focal sources maintaining the arrhythmia. Therefore, a computational platform for understanding the active, passive, and structural determinants and modulators of such dynamics was developed. This allowed establishing a framework for understanding the complex multidomain dynamics of rotors, with emphasis on their deterministic properties, to develop mechanistic approaches for diagnostic aid and therapy. Understanding fibrillatory processes is key to developing physiologically and clinically relevant scores and tools for early diagnostic aid. Specifically, spectral and time-frequency properties of fibrillatory processes have been shown to highlight major deterministic behaviour of the intrinsic mechanisms underlying the arrhythmias and the impact of such arrhythmic events. 
Using prior knowledge, signal processing, machine learning techniques, and data analytics, we aimed at developing a reliable mechanistic risk score for comatose survivors of cardiac arrest due to VF. Cardiac optical mapping and electrophysiological mapping techniques have proven to be invaluable resources for shaping new hypotheses and developing novel mechanistic approaches and therapeutic strategies. For many years, this technology has allowed the testing of new pharmacological or ablative therapeutic strategies and the development of multidomain methods to accurately track arrhythmia dynamics, identifying dominant sources and attractors. Even though panoramic mapping is the primary method for simultaneously tracking electrophysiological parameters, its adoption by the multidisciplinary cardiovascular research community is limited mainly by the cost of the technology. Taking advantage of recent technological advances, we focus on developing and validating low-cost optical mapping systems for panoramic imaging using clinically relevant models for basic research and bioengineering. Calvo Saiz, CJ. (2022). Novel Cardiac Mapping Approaches and Multimodal Techniques to Unravel Multidomain Dynamics of Complex Arrhythmias Towards a Framework for Translational Mechanistic-Based Therapeutic Strategies [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/182329
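One of the spectral markers central to this line of work is the dominant frequency of a fibrillatory signal, estimated from its power spectrum. The sketch below computes it for a synthetic 6 Hz signal with added noise; the signal, sampling rate, and duration are illustrative assumptions, not data from the thesis.

```python
# Dominant-frequency estimation via the power spectrum: generate a noisy
# synthetic 6 Hz "fibrillatory" signal, take the real FFT, and locate the
# frequency bin with the most power (skipping the DC component).
import numpy as np

np.random.seed(0)                          # reproducible noise
fs = 500.0                                 # sampling rate (Hz), assumed
t = np.arange(0, 4, 1 / fs)                # 4 s of signal
signal = np.sin(2 * np.pi * 6.0 * t) + 0.3 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # +1: we skipped the DC bin

print(f"dominant frequency: {dominant:.2f} Hz")
```

In mapping studies the same computation is applied per electrode or per pixel, so that spatial maps of dominant frequency can point to the high-frequency sources discussed above.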

    Studies of b Hadron decays to Charmonium, the LHCb upgrade and operation

    Precise measurements of CP violation provide stringent tests of the Standard Model in the search for signs of new physics. Using LHC proton-proton collision data collected by the LHCb detector during 2015 and 2016 at a centre-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 1.9 fb⁻¹, the latest measurement of the CP-violating phase ϕs using B0s → J/ψϕ decays is presented. The machine-learning-based data selection, data-driven corrections to simulated event samples, and the control of systematic effects using dedicated samples are discussed. The values ϕs = −0.083 ± 0.041 ± 0.006 rad, ΔΓs = 0.077 ± 0.008 ± 0.003 ps⁻¹ (the decay width difference between the light and the heavy mass eigenstates in the B0s system), and Γs − Γd = −0.0041 ± 0.0024 ± 0.0015 ps⁻¹ (the difference of the average B0s and B0d meson decay widths) are obtained, yielding the world’s most precise determination of these quantities. Furthermore, efforts and contributions towards the LHCb Upgrade are presented: the quality assurance and testing of the LHCb RICH Upgrade components, and the redesign and upgrade of the fully online software trigger, the LHCb HLT Upgrade. Regarding the former, an original implementation of a parallelised, robust, and highly available automation system is introduced. In connection with the latter, a novel neural network architecture and optimisation methods are laid out, enabling complex machine learning to be performed in a low-latency, high-throughput environment. These directly influence the future deployment of the experiment and its data collection and analysis capabilities, and are thus essential for future, more precise and stringent research.
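The two uncertainties quoted for ϕs are statistical and systematic. When a single error bar is wanted, a common convention (assumed here, not stated in this abstract) is to combine them in quadrature, which the quick sketch below does for the quoted result.

```python
# Combine the statistical and systematic uncertainties of the quoted
# phi_s measurement in quadrature (a common, assumed convention).
import math

stat, syst = 0.041, 0.006   # rad, as quoted above
total = math.sqrt(stat**2 + syst**2)

print(f"phi_s = -0.083 ± {total:.3f} rad (stat ⊕ syst)")
```

Because the systematic term is much smaller than the statistical one, the combined uncertainty is barely larger than the statistical uncertainty alone.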