1,738 research outputs found

    The development and evaluation of an improved Cloverleaf Model for the assessment of technology readiness for commercialisation

    Technology commercialisation is the process of converting technology developed for the government, e.g., defence technology, to commercial use, thereby benefiting the wider community. As technology commercialisation is a high-risk and expensive process, care must be taken to commercialise only those technologies with the best chances of success. Heslop, McGregor & Griffiths [1] developed the Cloverleaf Model to assess technology readiness for commercialisation. This model is evaluated and improved for use in the South African environment. http://sajie.journals.ac.z

    Studying Parallel Evolutionary Algorithms: The Cellular Programming Case

    Parallel evolutionary algorithms, studied to some extent over the past few years, have proven empirically worthwhile, though a better understanding of their workings is still lacking. In this paper we concentrate on cellular (fine-grained) models, presenting a number of statistical measures at both the genotypic and phenotypic levels. We demonstrate the application and utility of these measures on a specific example, the cellular programming evolutionary algorithm, used to evolve solutions to a hard problem in the cellular-automata domain known as synchronization.
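    To make the cellular model concrete, below is a minimal Python sketch of a fine-grained EA on a toroidal grid with purely local selection, plus one simple genotypic statistic (per-locus allele variance as a convergence measure). The grid size, the OneMax stand-in fitness, and the elitist replacement rule are illustrative assumptions, not the paper's cellular programming setup or its synchronization task.

```python
import random

# Minimal sketch of a cellular (fine-grained) EA on a toroidal grid.
# OneMax is a stand-in fitness, not the paper's CA synchronization task.
GRID, GENES, GENERATIONS, P_MUT = 8, 16, 50, 0.02

def fitness(g):                      # OneMax: count of 1-bits
    return sum(g)

def neighbors(x, y):                 # von Neumann neighborhood, toroidal wrap
    return [((x - 1) % GRID, y), ((x + 1) % GRID, y),
            (x, (y - 1) % GRID), (x, (y + 1) % GRID)]

def genotypic_diversity(pop):        # mean per-locus allele variance:
    cells = [pop[x][y] for x in range(GRID) for y in range(GRID)]
    n = len(cells)                   # 0 = fully converged, 0.25 = maximal
    freqs = [sum(c[i] for c in cells) / n for i in range(GENES)]
    return sum(p * (1 - p) for p in freqs) / GENES

pop = [[[random.randint(0, 1) for _ in range(GENES)]
        for _ in range(GRID)] for _ in range(GRID)]

for gen in range(GENERATIONS):
    new = [[None] * GRID for _ in range(GRID)]
    for x in range(GRID):
        for y in range(GRID):
            # Local selection: the best neighbor mates with the resident cell.
            mate = max((pop[nx][ny] for nx, ny in neighbors(x, y)), key=fitness)
            cut = random.randrange(GENES)                   # one-point crossover
            child = pop[x][y][:cut] + mate[cut:]
            child = [b ^ (random.random() < P_MUT) for b in child]  # bit-flip mutation
            new[x][y] = max(child, pop[x][y], key=fitness)  # elitist replacement
    pop = new
    if gen % 10 == 0:
        print(gen, genotypic_diversity(pop))
```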

    Noninvasive Detection of Fibrosis Applying Contrast-Enhanced Cardiac Magnetic Resonance in Different Forms of Left Ventricular Hypertrophy: Relation to Remodeling

    Objectives: We aimed to evaluate the incidence and patterns of late gadolinium enhancement (LGE) in different forms of left ventricular hypertrophy (LVH) and to determine their relation to the severity of left ventricular (LV) remodeling. Background: Left ventricular hypertrophy is an independent predictor of cardiac mortality. The relationship between LVH and myocardial fibrosis as defined by LGE cardiovascular magnetic resonance (CMR) is not well understood. Methods: A total of 440 patients with aortic stenosis (AS), arterial hypertension (AH), or hypertrophic cardiomyopathy (HCM) fulfilling echo criteria of LVH underwent CMR with assessment of LV size, weight, function, and LGE. Patients with increased left ventricular mass index (LVMI) resulting in global LVH on CMR were included in the study. Results: Criteria were fulfilled by 83 patients (56 men, age 57 ± 14 years; AS, n = 21; AH, n = 26; HCM, n = 36). Late gadolinium enhancement was present in all forms of LVH (AS: 62%; AH: 50%; HCM: 72%; p = NS) and was correlated with LVMI (r = 0.237, p = 0.045). There was no significant relationship between morphological obstruction and LGE. The AS subjects with LGE showed higher LV end-diastolic volumes than those without (1.0 ± 0.2 ml/cm vs. 0.8 ± 0.2 ml/cm, p < 0.015). Typical patterns of LGE were observed in HCM but not in AS and AH. Conclusions: Fibrosis as detected by CMR is a frequent feature of LVH, regardless of its cause, and depends on the severity of LV remodeling. As LGE emerges as a useful tool for risk stratification also in nonischemic heart diseases, our findings have the potential to individualize treatment strategies.
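    As a small worked example of the inclusion criterion, the left ventricular mass index is LV mass indexed to body surface area (BSA). The paper does not state its BSA convention, so the Mosteller formula and the patient numbers below are assumptions for illustration only.

```python
import math

# Worked LVMI example. The Mosteller BSA formula is assumed here;
# the study's actual indexing convention is not specified in the abstract.

def bsa_mosteller(height_cm, weight_kg):
    return math.sqrt(height_cm * weight_kg / 3600.0)      # BSA in m^2

def lvmi(lv_mass_g, height_cm, weight_kg):
    return lv_mass_g / bsa_mosteller(height_cm, weight_kg)  # g/m^2

# Hypothetical 175 cm, 80 kg patient with 210 g LV mass:
print(round(lvmi(210.0, 175.0, 80.0)))   # ~106 g/m^2, above common male reference limits
```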

    An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    BACKGROUND: In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now no universal software program has been available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a program would allow researchers to test and compare new imaging strategies and thus significantly facilitate research in the area of quantitative tissue characterization. RESULTS: After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different types of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker inversion recovery (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. CONCLUSIONS: MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
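    The pixel-wise mapping such a tool automates boils down to fitting a relaxation model per pixel. Below is an illustrative Python sketch (not MRmap itself, which is a separate program written in a high-level graphics language) of mono-exponential T2 fitting over a multi-echo image stack; the noise floor, initial guesses, and synthetic phantom are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative pixel-wise T2 mapping: fit S(TE) = S0 * exp(-TE / T2)
# independently at each pixel of a multi-echo magnitude image stack.

def decay(te, s0, t2):
    return s0 * np.exp(-te / t2)

def t2_map(echoes, tes, noise_floor=20.0):
    """echoes: (n_echoes, ny, nx) magnitude images; tes: echo times in ms."""
    n_echoes, ny, nx = echoes.shape
    t2 = np.zeros((ny, nx))
    for y in range(ny):
        for x in range(nx):
            sig = echoes[:, y, x]
            if sig[0] < noise_floor:              # skip background pixels
                continue
            try:
                (s0, t2_px), _ = curve_fit(decay, tes, sig,
                                           p0=(sig[0], 50.0), maxfev=200)
                t2[y, x] = t2_px
            except RuntimeError:                  # fit failed to converge
                pass
    return t2

# Synthetic demo: 16x16 phantom with true T2 = 80 ms at 5 echo times
tes = np.array([10.0, 20.0, 40.0, 60.0, 80.0])
imgs = 1000.0 * np.exp(-tes[:, None, None] / 80.0) * np.ones((5, 16, 16))
print(t2_map(imgs, tes).mean())   # ~80 ms
```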

    Virtual Reality via Object Pose Estimation and Active Learning: Realizing Telepresence Robots with Aerial Manipulation Capabilities

    This paper presents a novel telepresence system for advancing aerial manipulation in dynamic and unstructured environments. The proposed system features not only a haptic device but also a virtual reality (VR) interface that provides real-time 3D displays of the robot’s workspace as well as haptic guidance to its remotely located operator. To realize this, multiple sensors, namely a LiDAR, cameras, and IMUs, are utilized. For processing of the acquired sensory data, pose estimation pipelines are devised for industrial objects of both known and unknown geometries. We further propose an active learning pipeline to increase the sample efficiency of a pipeline component that relies on a Deep Neural Network (DNN) based object detector. These algorithms jointly address various challenges encountered during the execution of perception tasks in industrial scenarios. In the experiments, exhaustive ablation studies validate the proposed pipelines. Methodologically, these results commonly suggest how an awareness of the algorithms’ own failures and uncertainty (“introspection”) can be used to tackle the encountered problems. Moreover, outdoor experiments are conducted to evaluate the effectiveness of the overall system in enhancing aerial manipulation capabilities. In particular, with flight campaigns over days and nights, from spring to winter, and with different users and locations, we demonstrate over 70 robust executions of pick-and-place, force-application, and peg-in-hole tasks with the DLR cable-Suspended Aerial Manipulator (SAM). As a result, we show the viability of the proposed system in future industrial applications.
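    As a sketch of the active-learning idea, the snippet below shows a generic pool-based loop with least-confidence sampling: the detector's own uncertainty ("introspection") decides which unlabeled frames are sent for annotation. The stub scoring function, budget, and file names are hypothetical; the paper's actual detector, acquisition function, and labeling interface are not specified here.

```python
import numpy as np

# Generic pool-based active learning with least-confidence sampling.
# A simplified stand-in for the paper's pipeline; the scoring stub below
# replaces a real DNN detector for the sake of a self-contained example.

rng = np.random.default_rng(0)

def detector_confidence(model, images):
    """Placeholder: top detection confidence per image.
    In practice this would run the object detector on each frame."""
    return rng.uniform(size=len(images))          # stub scores

def select_for_labeling(model, unlabeled, budget):
    conf = detector_confidence(model, unlabeled)
    order = np.argsort(conf)                      # least confident first
    return order[:budget]                         # indices for the annotator

def active_learning_round(model, labeled, unlabeled, budget, train_fn):
    picks = set(select_for_labeling(model, unlabeled, budget))
    # Annotation effort goes to the frames the model handles worst.
    labeled.extend(unlabeled[i] for i in picks)
    remaining = [u for i, u in enumerate(unlabeled) if i not in picks]
    return train_fn(model, labeled), labeled, remaining

# Usage with stub data and a no-op trainer:
model, labeled = None, []
pool = [f"frame_{i:04d}.png" for i in range(100)]   # hypothetical file names
train = lambda m, data: m
model, labeled, pool = active_learning_round(model, labeled, pool, 10, train)
print(len(labeled), len(pool))    # 10 90
```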

    Antihyperalgesia by α2-GABAA Receptors Occurs Via a Genuine Spinal Action and Does Not Involve Supraspinal Sites

    Drugs that enhance GABAergic inhibition alleviate inflammatory and neuropathic pain after spinal application. This antihyperalgesia occurs mainly through GABAA receptors (GABAARs) containing α2 subunits (α2-GABAARs). Previous work indicates that potentiation of these receptors in the spinal cord evokes profound antihyperalgesia also after systemic administration, but possible synergistic or antagonistic actions of supraspinal α2-GABAARs on spinal antihyperalgesia have not yet been addressed. Here we generated two lines of GABAAR-mutated mice, which either lack α2-GABAARs specifically in the spinal cord or express only benzodiazepine-insensitive α2-GABAARs at this site. We analyzed the consequences of these mutations for the antihyperalgesia evoked by systemic treatment with the novel non-sedative benzodiazepine-site agonist HZ166 in neuropathic and inflammatory pain. Wild-type mice and both types of mutated mice had similar baseline nociceptive sensitivities and developed similar hyperalgesia. However, antihyperalgesia by systemic HZ166 was reduced in both mutated mouse lines by about 60% and was virtually indistinguishable from that of global point-mutated mice, in which all α2-GABAARs were benzodiazepine-insensitive. The major (α2-dependent) component of GABAAR-mediated antihyperalgesia was therefore exclusively of spinal origin, whereas supraspinal α2-GABAARs had neither synergistic nor antagonistic effects on antihyperalgesia. Our results thus indicate that drugs that specifically target α2-GABAARs exert their antihyperalgesic effect through enhanced spinal nociceptive control. Such drugs may therefore be well suited for the systemic treatment of different chronic pain conditions.

    Simulation Study of Photon-to-Digital Converter (PDC) Timing Specifications for the LoLX Experiment

    The Light only Liquid Xenon (LoLX) experiment is a prototype detector aimed at studying liquid xenon (LXe) light properties and various photodetection technologies. LoLX also aims to quantify LXe's time resolution as a potential scintillator for 10 ps time-of-flight (TOF) PET. Another key goal of LoLX is to perform a time-based separation of Cerenkov and scintillation photons for new background-rejection methods in LXe experiments. To achieve this separation, LoLX is set to be equipped with photon-to-digital converters (PDCs), a photosensor type that provides a timestamp for each observed photon. To guide the PDC design, we explore the requirements for time-based Cerenkov separation. We use a PDC simulator, whose input is the light information from the Geant4-based LoLX simulation model, and evaluate the separation quality against time-to-digital converter (TDC) parameters. Simulation results with TDC parameters offer possible configurations supporting a good separation. Compared with the current filter-based approach, simulations show that the Cerenkov separation level increases from 54% to 71% when using PDCs and time-based separation. With the current photon time profile of the LoLX simulation, the results also show that 71% separation is achievable with just 4 TDCs per PDC. These simulation results will lead to a specification guide for the PDC as well as expected results to compare against future PDC-based experimental measurements. In the longer term, the overall LoLX results will assist large LXe-based experiments and motivate the assembly of an LXe-based TOF-PET demonstrator system.
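    To illustrate the kind of study described, the toy Python sketch below quantizes photon timestamps to a TDC grid and applies an early-time cut to tag prompt Cerenkov photons against exponentially delayed scintillation light. The decay constant, jitter, bin width, photon counts, and cut value are illustrative assumptions, not LoLX or LXe specifications.

```python
import numpy as np

# Toy time-based Cerenkov/scintillation separation under TDC quantization.
# All parameters below are illustrative assumptions, not LoLX values.

rng = np.random.default_rng(1)

N_CHER, N_SCINT = 30, 300    # assumed photon counts per event
TAU_SCINT_NS = 27.0          # assumed effective scintillation decay time
TDC_LSB_NS = 0.05            # assumed TDC bin width (timestamp resolution)
JITTER_NS = 0.1              # assumed single-photon timing jitter
CUT_NS = 0.3                 # photons earlier than this are tagged Cerenkov

cher = rng.normal(0.0, JITTER_NS, N_CHER)                  # prompt photons
scint = (rng.exponential(TAU_SCINT_NS, N_SCINT)
         + rng.normal(0.0, JITTER_NS, N_SCINT))            # delayed photons

def tdc(ts):                 # quantize timestamps to the TDC grid
    return np.round(ts / TDC_LSB_NS) * TDC_LSB_NS

tagged_cher = tdc(cher) < CUT_NS
tagged_scint = tdc(scint) < CUT_NS
purity = tagged_cher.sum() / (tagged_cher.sum() + tagged_scint.sum())
print(f"Cerenkov efficiency {tagged_cher.mean():.0%}, purity {purity:.0%}")
```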