
    Theory of Mobile Infocommunication Systems. System Architecture

    The textbook describes the logical and physical structures, procedures, algorithms, protocols, and principles of construction and operation of cellular mobile communication networks (up to 3G) and mobile infocommunication networks (4G and beyond). Particular attention is paid to the overall architectures of mobile operators' networks, their management and coordination, and the continuous evolution of how such networks operate and deliver services. The book comprises seven chapters, arranged so that the complexity of the material increases with each subsequent chapter. It is intended for bachelor's students in specialty 172 "Telecommunications and Radio Engineering" and will also be useful to graduate students, researchers, and engineers working on information and telecommunication systems and technologies.

    Formal Modelling and Verification of the Clock Synchronization Algorithm of FlexRay

    The hundreds of electronic control devices used in an automotive system can communicate effectively with one another thanks to an in-vehicle network (IVN) such as FlexRay. Even though every node in the network runs on its own local clock, a global notion of time is essential; the clock synchronisation algorithm establishes this global time between the nodes in FlexRay. In this era of self-driving cars, vehicle safety is paramount. For a vehicle to operate safely and smoothly, timely communication of information is critical, and the clock synchronisation algorithm plays a vital role in this, so it is essential to formally verify its correctness. This paper models and verifies the clock synchronisation algorithm of FlexRay using formal methods, which in turn enhances the reliability of safety-critical automotive systems. The clock synchronisation is modelled as a network of six timed automata in the UPPAAL model checker. Three system models were developed: one for an ideal clock, one for a drifting clock, and a third accounting for propagation delay. The precision of the clocks is verified to be within the prescribed limits, and simulation studies on the model confirm that the clocks' drift always stays within this precision.
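    The drifting-clock idea can be illustrated with a toy simulation (a sketch only: the node count, drift rates, and median-correction rule are illustrative assumptions, not the FlexRay algorithm or the paper's UPPAAL automata):

```python
import random

# Toy model: four nodes with drifting local clocks that resynchronize
# at cycle boundaries. All parameters are illustrative assumptions.
class DriftingClock:
    def __init__(self, drift_ppm):
        self.time = 0.0
        self.rate = 1.0 + drift_ppm * 1e-6  # local rate vs. ideal time

    def tick(self, ideal_dt):
        self.time += ideal_dt * self.rate

def synchronize(clocks):
    # Simple offset correction: jump every clock to the median value
    # (a common fault-tolerant midpoint choice, not FlexRay's own rule).
    values = sorted(c.time for c in clocks)
    target = values[len(values) // 2]
    for c in clocks:
        c.time = target

rng = random.Random(1)
clocks = [DriftingClock(rng.uniform(-100, 100)) for _ in range(4)]

worst_spread = 0.0
for cycle in range(10):
    for _ in range(1000):            # one cycle = 1000 ideal 1 ms ticks
        for c in clocks:
            c.tick(0.001)
    spread = max(c.time for c in clocks) - min(c.time for c in clocks)
    worst_spread = max(worst_spread, spread)
    synchronize(clocks)

# With |drift| <= 100 ppm over a 1 s cycle, the spread observed just
# before each resynchronization stays below 0.2 ms.
print(worst_spread < 2e-4)  # True
```

    Periodic correction is what keeps the spread bounded: without the `synchronize` step, the spread would grow linearly with elapsed time.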

    neuroAIx-Framework: design of future neuroscience simulation systems exhibiting execution of the cortical microcircuit model 20× faster than biological real-time

    Introduction: Research in the field of computational neuroscience relies on highly capable simulation platforms. With real-time capabilities surpassed for established models like the cortical microcircuit, it is time to conceive next-generation systems: neuroscience simulators providing significant acceleration, even for larger networks with natural density, biologically plausible multi-compartment models, and the modeling of long-term and structural plasticity. Methods: Stressing the need for agility to adapt to new concepts or findings in the domain of neuroscience, we have developed the neuroAIx-Framework consisting of an empirical modeling tool, a virtual prototype, and a cluster of FPGA boards. This framework is designed to support and accelerate the continuous development of such platforms driven by new insights in neuroscience. Results: Based on design space explorations using this framework, we devised and realized an FPGA cluster consisting of 35 NetFPGA SUME boards. Discussion: This system functions as an evaluation platform for our framework. At the same time, it resulted in a fully deterministic neuroscience simulation system surpassing the state of the art in both performance and energy efficiency. It is capable of simulating the microcircuit with 20× acceleration compared to biological real-time and achieves an energy efficiency of 48 nJ per synaptic event.
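    The headline figures can be combined in a quick back-of-the-envelope check (only the 20× acceleration and 48 nJ/event figures come from the abstract; the synaptic-event count below is a hypothetical workload assumed for illustration):

```python
# 20x acceleration: one second of biological time in 50 ms of wall clock.
ACCELERATION = 20
wallclock_ms_per_bio_second = 1000 / ACCELERATION
print(wallclock_ms_per_bio_second)  # 50.0

# Total energy scales linearly with the number of synaptic events.
ENERGY_PER_EVENT_NJ = 48
events_per_bio_second = 10**9  # assumed workload size, not from the paper
energy_per_bio_second_J = ENERGY_PER_EVENT_NJ * events_per_bio_second / 10**9
print(energy_per_bio_second_J)  # 48.0 J per simulated biological second
```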

    Innovative Hybrid Approaches for Vehicle Routing Problems

    This thesis deals with the efficient resolution of Vehicle Routing Problems (VRPs). The first chapter addresses the archetype of all VRPs: the Capacitated Vehicle Routing Problem (CVRP). Despite having been introduced more than 60 years ago, it remains an extremely challenging problem. In this chapter I design a Fast Iterated-Local-Search Localized Optimization algorithm for the CVRP, shortened to FILO. The simplicity of the CVRP definition allowed me to experiment with advanced local search acceleration and pruning techniques that eventually became the core optimization engine of FILO. Experiments show FILO to be extremely scalable and able to solve very large-scale instances of the CVRP in a fraction of the computing time required by existing state-of-the-art methods, while still obtaining solutions of competitive quality. The second chapter deals with an extension of the CVRP called the Extended Single Truck and Trailer Vehicle Routing Problem, or simply XSTTRP. The XSTTRP models a broad class of VRPs in which a single vehicle, composed of a truck and a detachable trailer, has to serve a set of customers with accessibility constraints that make some of them unreachable by the complete vehicle. This problem moves towards VRPs with more realistic constraints, modeling scenarios such as parcel deliveries in crowded city centers or rural areas where maneuvering a large vehicle is forbidden or dangerous. The XSTTRP generalizes several well-known VRPs, such as the Multiple Depot VRP and the Location Routing Problem. For its solution I developed a hybrid metaheuristic which combines fast heuristic optimization with a polishing phase based on the resolution of a limited set partitioning problem. Finally, the thesis includes a final chapter aimed at guiding the computational evaluation of new approaches to VRPs proposed by the machine learning community.
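    The improve-and-perturb loop at the heart of iterated local search can be sketched on a toy tour problem (an illustration only: plain 2-opt on a small TSP instance, not FILO's localized moves, acceleration, or pruning techniques):

```python
import math
import random

def tour_length(tour, pts):
    # Total length of the closed tour visiting pts in the given order.
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    # Local search: apply improving segment reversals until none remain.
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, pts) < tour_length(tour, pts) - 1e-12:
                    tour, improved = cand, True
    return tour

def iterated_local_search(pts, iters=20, seed=0):
    rng = random.Random(seed)
    best = two_opt(list(range(len(pts))), pts)
    for _ in range(iters):
        # Perturbation: rotate a random middle segment, then re-optimize.
        t = best[:]
        i, j = sorted(rng.sample(range(1, len(t)), 2))
        t = two_opt(t[:i] + t[j:] + t[i:j], pts)
        if tour_length(t, pts) < tour_length(best, pts):
            best = t                 # accept only improving restarts
    return best

rng = random.Random(42)
pts = [(rng.random(), rng.random()) for _ in range(12)]
best = iterated_local_search(pts)
print(round(tour_length(best, pts), 3))
```

    FILO-style methods make this loop scale by restricting each local search to the neighborhood actually disturbed by the perturbation, rather than re-optimizing the whole solution as done here.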

    Science with Neutrino Telescopes in Spain

    The authors gratefully acknowledge the funding support from the following Spanish programs: Ministerio de Ciencia, Innovacion, Investigacion y Universidades (MCIU): Programa Estatal de Generacion de Conocimiento (refs. PGC2018-096663-B-C41, -A-C42, -B-C43, -B-C44) (MCIU/FEDER); Generalitat Valenciana: Prometeo (PROMETEO/2020/019) and GenT (refs. CIDEGENT/2018/034, /2020/049, /2021/023); Junta de Andalucia (ref. A-FQM-053-UGR18). The primary scientific goal of neutrino telescopes is the detection and study of cosmic neutrino signals. However, the range of physics topics that these instruments can tackle is exceedingly wide and diverse. Neutrinos coming from outside the Earth, in association with other messengers, can help clarify the mechanisms that power the astrophysical accelerators which are known to exist from the observation of high-energy cosmic and gamma rays. Cosmic neutrinos can also be used to bring relevant information about the nature of dark matter, to study the intrinsic properties of neutrinos, and to look for physics beyond the Standard Model. Likewise, atmospheric neutrinos can be used to study an ample variety of particle physics issues, such as neutrino oscillation phenomena, the determination of the neutrino mass ordering, non-standard neutrino interactions, neutrino decays, and a diversity of other physics topics. In this article, we review a selected number of these topics, chosen on the basis of their scientific relevance and the involvement in their study of the Spanish physics community working in the KM3NeT and ANTARES neutrino telescopes.

    Analysis of reliable deployment of TDOA local positioning architectures

    Local Positioning Systems (LPS) have become an attractive research topic over the last few years. LPS are ad-hoc deployments of wireless sensor networks adapted to the particular characteristics of harsh environments. Among LPS, those based on temporal measurements stand out for their trade-off among accuracy, robustness, and cost. However, regardless of the LPS architecture considered, an optimization of the sensor distribution is required to achieve competitive results. Recent studies have shown that, even under optimized node distributions, time-based LPS accumulate larger error bounds due to synchronization errors. Consequently, asynchronous architectures such as Asynchronous Time Difference of Arrival (A-TDOA) have recently been proposed. However, the A-TDOA architecture concentrates the time measurement in the single clock of a coordinator sensor, making it less versatile. In this paper, we present an optimization methodology for overcoming the drawbacks of the A-TDOA architecture in nominal and failure conditions with regard to the synchronous TDOA. Results show that this optimization strategy reduces the uncertainties in the target location by 79% and 89.5% and enhances the convergence properties of the A-TDOA architecture by 86% and 33% with regard to the synchronous TDOA architecture in two different application scenarios. In addition, maximum convergence points are more easily found in the A-TDOA in both configurations, confirming the benefits of this architecture in highly demanding LPS applications.
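    The underlying TDOA principle can be sketched numerically (an illustration only: the sensor layout, target, and the simple pattern-search solver are assumed examples, not the paper's optimization methodology):

```python
import math

# Each sensor pair measures a range *difference* c*(t_i - t_0), which
# constrains the target to a hyperbola; intersecting several such
# hyperbolas localizes the target. Values below are noise-free examples.
SENSORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
TRUE_TARGET = (3.0, 7.0)

d0 = math.dist(TRUE_TARGET, SENSORS[0])
measured = [math.dist(TRUE_TARGET, s) - d0 for s in SENSORS[1:]]

def residual(p):
    # Sum of squared mismatches between candidate and measured
    # range differences relative to sensor 0.
    r0 = math.dist(p, SENSORS[0])
    return sum((math.dist(p, s) - r0 - m) ** 2
               for s, m in zip(SENSORS[1:], measured))

# Compass (pattern) search: move while a neighboring step improves the
# fit, otherwise halve the step size.
best, step = (5.0, 5.0), 4.0
while step > 1e-4:
    cands = [(best[0] + dx * step, best[1] + dy * step)
             for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    cand = min(cands, key=residual)
    if residual(cand) < residual(best):
        best = cand
    else:
        step *= 0.5

print(tuple(round(c, 3) for c in best))
```

    With noise-free measurements the residual reaches zero exactly at the true target; with real synchronization errors the minimum shifts, which is why the sensor distribution itself must be optimized.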

    Optical coherence tomography methods using 2-D detector arrays

    Optical coherence tomography (OCT) is a non-invasive, non-contact optical technique that allows cross-sectional imaging of biological tissues with high spatial resolution, high sensitivity, and high dynamic range. Standard OCT uses a focused beam to illuminate a point on the target and detects the signal using a single photodetector; to acquire transverse information, transversal scanning of the illumination point is required. Alternatively, multiple OCT channels can be operated in parallel simultaneously, with the parallel OCT signals recorded by a two-dimensional (2D) detector array. This approach is known as parallel-detection OCT. In this thesis, methods, experiments, and results using three parallel OCT techniques are presented: full-field (time-domain) OCT (FF-OCT), full-field swept-source OCT (FF-SS-OCT), and line-field Fourier-domain OCT (LF-FD-OCT). Several 2D digital cameras of different formats have been used and evaluated in the experiments. With the LF-FD-OCT method, photography equipment such as flashtubes and commercial DSLR cameras has been employed and tested for OCT imaging. The techniques used in FF-OCT and FF-SS-OCT are employed in a novel wavefront sensing technique, which combines OCT methods with a Shack-Hartmann wavefront sensor (SH-WFS). This combined technique is demonstrated to be capable of measuring depth-resolved wavefront aberrations, which has the potential to extend the applications of the SH-WFS in wavefront-guided biomedical imaging techniques.

    Delay measurements in a live 5G cellular network

    Abstract. 5G networks have many important properties, including increased bandwidth, increased data throughput, high reliability, high network density, and low latency. This thesis concentrates on the low-latency attribute of the 5G Standalone (SA) mode and 5G Non-Standalone (NSA) mode. One of the most critical considerations in 5G is a low-latency network for various delay-sensitive applications, such as remote diagnostics and surgery in healthcare, self-driving cars, industrial factory automation, and live audio productions in the music industry. Therefore, 5G employs various retransmission algorithms and techniques to meet the low-latency standards, a new frame structure with multiple subcarrier spacings (SCS) and time slots, and a new cloud-native core. For the low-latency measurements, a test setup was built: a video is sent from the 5G User Equipment (UE) to a multimedia server deployed on the edge server of the University of Oulu 5G Test Network (5GTN), which operates in both NSA and SA modes. Delay is measured in both the downlink and the uplink direction with the Qosium tool. When measuring millisecond-level transmission delays, clock synchronization is essential; therefore, the Precision Time Protocol daemon (PTPd) service is run on both the sending and receiving machines. The tests comply with the specifications developed at the University of Oulu 5GTN for both the SA and the NSA mode. When the delay measurement findings were compared between the two deployment modes, it was observed that the comparison was not appropriate: in the 5GTN, the NSA and the SA modes have entirely different data routing paths and configurations, and the author did not have sufficient resources to make the required architectural changes.
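    Why clock synchronization matters for one-way delay can be illustrated with made-up numbers (the delay and offset below are assumptions, not measurements from the 5GTN; in practice PTP drives the receiver's offset toward zero):

```python
# One-way delay is computed as receive timestamp minus send timestamp,
# taken on two *different* machines. Any offset between their clocks
# corrupts the result by exactly that offset.
TRUE_DELAY_MS = 12.0       # assumed actual one-way network delay
RX_CLOCK_OFFSET_MS = -7.5  # receiver clock runs 7.5 ms behind the sender

t_send = 1000.0                                       # sender clock (ms)
t_recv = t_send + TRUE_DELAY_MS + RX_CLOCK_OFFSET_MS  # receiver clock (ms)

naive = t_recv - t_send                  # corrupted by the clock offset
corrected = naive - RX_CLOCK_OFFSET_MS   # offset removed (PTP's job)

print(naive, corrected)  # 4.5 12.0
```

    A millisecond-scale offset is the same order of magnitude as the delays being measured, which is why the thesis runs PTPd on both endpoints before collecting data.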

    Developing automated meta-research approaches in the preclinical Alzheimer's disease literature

    Alzheimer’s disease is a devastating neurodegenerative disorder for which there is no cure. A crucial part of the drug development pipeline involves testing therapeutic interventions in animal disease models. However, promising findings in preclinical experiments have not translated into clinical trial success. Reproducibility has often been cited as a major issue affecting biomedical research, where experimental results in one laboratory cannot be replicated in another. By using meta-research (research on research) approaches such as systematic reviews, researchers aim to identify and summarise all available evidence relating to a specific research question. By conducting a meta-analysis, researchers can also combine the results from different experiments statistically to understand the overall effect of an intervention and to explore reasons for variations seen across different publications. Systematic reviews of the preclinical Alzheimer’s disease literature could inform decision making, encourage research improvement, and identify gaps in the literature to guide future research. However, due to the vast amount of potentially useful evidence from animal models of Alzheimer’s disease, it remains difficult to make sense of and utilise this data effectively. Systematic reviews are common practice within evidence-based medicine, yet their application to preclinical research is often limited by the time and resources required. In this thesis, I develop, build upon, and implement automated meta-research approaches to collect, curate, and evaluate the preclinical Alzheimer’s literature. I searched several biomedical databases to obtain all research relevant to Alzheimer’s disease. I developed a novel deduplication tool to automatically identify and remove duplicate publications identified across different databases with minimal human effort.
I trained a crowd of reviewers to annotate a subset of the publications identified and used this data to train a machine learning algorithm to screen the remaining publications for relevance. I developed text-mining tools to extract model, intervention, and treatment information from publications, and I improved existing automated tools to extract reported measures to reduce the risk of bias. Using these tools, I created a categorised database of research in transgenic Alzheimer’s disease animal models and created a visual summary of this dataset on an interactive, openly accessible online platform. Using the techniques described, I also identified relevant publications within the categorised dataset to perform systematic reviews of two key outcomes of interest in transgenic Alzheimer’s disease models: (1) synaptic plasticity and transmission in hippocampal slices and (2) motor activity in the open field test. Over 400,000 publications were identified across biomedical research databases, with 230,203 unique publications. In a performance evaluation across different preclinical datasets, the automated deduplication tool I developed could identify over 97% of duplicate citations and had an error rate similar to that of human performance. When evaluated on a test set of publications, the machine learning classifier trained to identify relevant research in transgenic models was highly sensitive (capturing 96.5% of relevant publications) and excluded 87.8% of irrelevant publications. Tools to identify the model(s) and outcome measure(s) within the full text of publications may reduce the burden on reviewers and were found to be more sensitive than searching only the title and abstract of citations. Automated tools to assess risk-of-bias reporting were highly sensitive and could have the potential to monitor research improvement over time.
The final dataset of categorised Alzheimer’s disease research contained 22,375 publications, which were then visualised in the interactive web application. Within the application, users can see how many publications report measures to reduce the risk of bias and how many have been classified as using each transgenic model, testing each intervention, and measuring each outcome. Users can also filter to obtain curated lists of relevant research, allowing them to perform systematic reviews at an accelerated pace, with reduced effort required to search across databases and a reduced number of publications to screen for relevance. Both systematic reviews and meta-analyses highlighted failures to report key methodological information within publications. Poor transparency of reporting limited the statistical power I had to understand the sources of between-study variation. However, some variables were found to explain a significant proportion of the heterogeneity. The transgenic animal model used had a significant impact on results in both reviews. For certain open field test outcomes, the wall colour of the open field arena and the reporting of measures to reduce the risk of bias were found to impact results. For in vitro electrophysiology experiments measuring synaptic plasticity, several electrophysiology parameters, including the magnesium concentration of the recording solution, were found to explain a significant proportion of the heterogeneity. Automated meta-research approaches and curated web platforms summarising preclinical research could have the potential to accelerate the conduct of systematic reviews and maximise the potential of existing evidence to inform translation.
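    The deduplication idea described above can be sketched as a normalized-title comparison (a hedged illustration only: the thesis' tool uses more sophisticated matching across multiple citation fields; the threshold and sample records here are assumptions):

```python
import difflib
import re

def normalize(title):
    # Lowercase, replace punctuation with spaces, collapse whitespace,
    # so formatting differences between databases do not block a match.
    cleaned = re.sub(r"[^a-z0-9 ]", " ", title.lower())
    return re.sub(r"\s+", " ", cleaned).strip()

def is_duplicate(a, b, threshold=0.9):
    # Similarity of the normalized titles; threshold is an assumption.
    ratio = difflib.SequenceMatcher(None, normalize(a), normalize(b)).ratio()
    return ratio >= threshold

records = [
    "Synaptic plasticity in APP/PS1 mice: a systematic review",
    "Synaptic Plasticity in APP-PS1 Mice - A Systematic Review.",
    "Open field activity in 3xTg-AD transgenic mice",
]

# Keep the first occurrence of each title; drop near-identical repeats.
unique = []
for title in records:
    if not any(is_duplicate(title, kept) for kept in unique):
        unique.append(title)

print(len(unique))  # 2: the first two titles collapse into one record
```

    A pairwise pass like this scales quadratically, so real deduplication tools typically block candidate pairs (e.g. by year or first-author surname) before comparing titles.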