
    Volterra Filtering for ADC Error Correction

    Dynamic non-linearity of analog-to-digital converters (ADCs) contributes significantly to the distortion of digitized signals. This paper introduces a new, effective method for compensating such distortion based on Volterra filtering. Starting from an a-priori error model of the ADC, an efficient inverse Volterra model for error correction can be found. The efficiency of the proposed method is demonstrated on experimental results.
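A post-correction of this kind can be sketched as a truncated second-order inverse Volterra filter applied to the ADC output. The function and kernel shapes below are illustrative assumptions, not the paper's actual model; in practice the kernels would be identified from the a-priori error model.

```python
import numpy as np

def volterra_correct(x, h1, h2):
    """Truncated second-order inverse Volterra filter applied to ADC samples x:
        y[n] = sum_i h1[i]*x[n-i] + sum_{i<=j} h2[i,j]*x[n-i]*x[n-j]
    h1 is the linear kernel (length M); h2 the quadratic kernel (M x M,
    upper triangle used). Kernel shapes are illustrative assumptions."""
    N, M = len(x), len(h1)
    y = np.zeros(N)
    for n in range(N):
        for i in range(M):
            if n - i < 0:
                break
            y[n] += h1[i] * x[n - i]
            for j in range(i, M):
                if n - j < 0:
                    break
                y[n] += h2[i, j] * x[n - i] * x[n - j]
    return y
```

With an identity linear kernel and a small negative quadratic term, the filter subtracts a quadratic distortion component from each sample, which is the essence of the inverse-model correction.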

    A SIFT approach for analysing failure by delamination and disbonding in composite structures

    A strain invariant failure theory (SIFT) has been developed to predict resin failure in damaged and pristine composite structures. The finite element (FE) analysis in this work uses shell elements, consistent with common practice in the aeronautical industry. The new SIFT is similar in nature to a characteristic length method that requires a matched finite element mesh. It samples the strains in brick elements lofted between two layers of shell elements, each representing half of the damaged laminate in the failure-critical zone. Experimental tests involving three laminate materials have been carried out to validate the modified SIFT approach for notched laminates, including single and multiple-level delamination tests, and stiffened panel tests under shear or compression load. These results are summarised in a table. Square and rectangular single-delamination tests are presented in more detail and indicate that the failure location and load predicted by the modified SIFT approach correlate well with the experimental results.
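SIFT-type criteria compare strain invariants, typically the first (dilatational) invariant and a von-Mises-type equivalent (distortional) strain, against critical values. The sketch below computes those two quantities for a sampled strain tensor; the exact invariants and critical values used in the modified SIFT are assumptions here, not taken from the paper.

```python
import numpy as np

def sift_invariants(eps):
    """First (dilatational) strain invariant J1 and a von-Mises-type
    equivalent (distortional) strain for a 3x3 strain tensor eps.
    Which invariants the modified SIFT actually uses is assumed here."""
    j1 = np.trace(eps)
    dev = eps - (j1 / 3.0) * np.eye(3)               # deviatoric part
    eps_eq = np.sqrt(2.0 / 3.0 * np.sum(dev * dev))  # equivalent strain
    return j1, eps_eq
```

A purely dilatational strain state yields J1 only, while pure shear yields the equivalent strain only, so the two invariants separate the volumetric and distortional failure modes.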

    Investigation of the clinical relevance of computer-assisted diagnosis software for the quantification of interstitial lung processes using multislice spiral CT and correlation with pulmonary function testing

    The socioeconomic importance of diseases of the lung parenchyma has increased enormously in recent years. Reliable detection of both pulmonary emphysema and pulmonary fibrosis is therefore desirable. Growing data volumes from the use of multislice scanners, together with a lack of objective quantification methods, make additional computer-assisted evaluation of CT data desirable in clinical routine. The aim of this work was to evaluate the clinical applicability of a software tool for quantifying diseases of the lung parenchyma from MS-CT datasets, in comparison with pulmonary function testing. The retrospective study included data from a total of 100 patients who had undergone both body plethysmography and MS-CT of the thorax. The overall cohort was divided into subgroups with airway obstruction and with restrictive ventilation disorder. Using the CAD system PULMO 3D from MeVis, the CT scans of all patients were analysed fully automatically. The parameters output by the software were correlated with the volumes determined by pulmonary function testing. This showed that it is feasible to extract diagnostically relevant pulmonary function parameters from CT data using CAD, although the determination of dynamic parameters such as the forced expiratory volume in one second (FEV1) is currently not sufficiently successful. Distinguishing between obstruction and restriction is in principle possible by quantitative analysis. Furthermore, there were significant differences in the CAD-derived parameters with respect to the stages of peripheral obstruction and of restriction, so that a dependence on disease severity can be assumed. However, the results of this study also demonstrate that reproducible severity grading will only become possible once reference or normal values have been defined.

    Advances in the characterization of nanowire photovoltaic devices

    III-V nanowires (NWs) have great potential for solar energy applications due to their diameter-dependent optical properties, which may enhance the absorption of light. In addition, core-shell radial p-i-n structures, in which the direction of light absorption is orthogonal to that of carrier collection, can provide efficient carrier collection. The main goal of this thesis is the experimental study of the challenges of NW-based solar cells related to materials and device fabrication. In the first part of the thesis, we present an analysis of where the electrical losses originate. By applying an equivalent-circuit analysis approach, we classified them into three main groups: (i) non-uniformity of the NWs, which may result in a reduction of the parallel resistance; (ii) potential barriers originating at the different material interfaces in the solar cell structure, which may result in an increase of the series resistance or the addition of a second diode; and (iii) surface recombination, resulting in a reduction of the open-circuit voltage. In this thesis, we propose separate strategies to characterize and tackle these factors. The electric scheme of a NW-based solar cell consists of an ensemble of p-n junctions connected in parallel. We show how conductive-probe atomic force microscopy (C-AFM) is an essential tool for the characterization and optimization of these parallel-connected NW devices. We demonstrate topography and current mapping of the NW arrays, combined with current-voltage (IV) measurements of the individual NW junctions from the ensemble. Our results shed light on some of the factors limiting the performance of a NW-based solar cell, such as the uniformity and photosensitivity of the individual NW p-n junctions within the array, and thereby indicate a path for their improvement. Besides parallel losses due to uniformity issues, barriers to carrier collection across the various heterointerfaces composing the device are discussed.
To analyze these, we illuminate GaAs NW-based solar cells at different levels of light intensity and extract IV characteristics. This analysis allows the NW p-n junction response and the series resistance to be studied separately. The high series resistance of the NW-ensemble device can be attributed to the following interfaces: 1) the GaAs-ITO interface, forming a photoactive Schottky diode, which suppresses the p-n junction at high concentrations of light, and 2) the Si-GaAs heterojunction, disturbing the flow of majority carriers. Finally, the characterization of surface passivation in high-aspect-ratio nano/micro structures is addressed by electrochemical impedance spectroscopy (EIS). The method is applied to Si micropillars as a proof of concept prior to its application to III-V nanowires. We tested structures passivated by a dielectric layer. The effect of different surface treatments on the interface state density was quantified by analysis of the capacitance-voltage and conductance-voltage characteristics. This method allows electrical measurements on rough vertical surfaces, which would otherwise suffer from high gate leakage currents if tested using a solid-state metal-insulator-semiconductor scheme. The results and characterization methods demonstrated in this work contribute to the overall efforts of the scientific community to reveal the main engineering challenges in NW-based solar cells, and thus pave the way to approaching the fundamental conversion efficiencies predicted by theory.
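The equivalent-circuit view used in this analysis is commonly formalized as a single-diode model with series resistance Rs and parallel (shunt) resistance Rp. The sketch below solves the implicit diode equation by bisection; all parameter values are illustrative placeholders, not measured nanowire data.

```python
import numpy as np

def diode_current(V, Iph=0.02, I0=1e-12, n=1.5, Rs=5.0, Rp=1e4, Vt=0.02585):
    """Terminal current of the single-diode model at voltage V, solving
        I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rp
    by bisection. Parameter values are placeholders, not device data."""
    def f(I):
        return Iph - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1) - (V + I * Rs) / Rp - I
    lo, hi = -1.0, 1.0  # bracket: f(lo) > 0 > f(hi) for the voltages used here
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

In this picture, a low Rp (NW non-uniformity) drains photocurrent through shunt paths, while a high Rs (interface barriers) flattens the IV curve, which is how the thesis separates the two loss mechanisms.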

    Professional guideline versus product label selection for treatment with IV thrombolysis: an analysis from SITS registry

    Introduction: Thrombolysis usage in ischaemic stroke varies across sites. Divergent advice from professional guidelines and product labels may contribute. Patients and methods: We analysed SITS-International registry patients enrolled January 2010 through June 2016. We grouped sites into organisational tertiles by number of patients arriving ≤2.5 h and treated ≤3 h, percentage arriving ≤2.5 h and treated ≤3 h, and numbers treated ≤3 h. We assigned scores of 1–3 (lower/middle/upper) per variable and 2 for onsite thrombectomy. We classified sites as lower efficiency (summed scores 3–5), medium efficiency (6–8) or higher efficiency (9–11). Sites were also grouped by adherence with the European product label and ESO guideline: ‘label adherent’ (>95% on-label), ‘guideline adherent’ (≥5% off-label, ≥95% on-guideline) or ‘guideline non-adherent’ (>5% off-guideline). We cross-tabulated site efficiency and adherence. We estimated the potential benefit of universally selecting patients by ESO guidance, using onset-to-treatment-time-specific numbers needed to treat for day-90 modified Rankin Scale (mRS) 0–1. Results: A total of 56,689 patients at 597 sites were included: 163 sites were higher efficiency, 204 medium efficiency and 230 lower efficiency. Fifty-six sites were ‘label adherent’, 204 ‘guideline adherent’ and 337 ‘guideline non-adherent’. There were strong associations between site efficiency and adherence (P < 0.001). Almost all ‘label adherent’ sites (55, 98%) were lower efficiency. If all patients were treated according to ESO guidelines, an additional 17,031 would receive alteplase, which translates into 1922 more patients with favourable three-month outcomes. Discussion: Adherence with product labels is highest in lower efficiency sites. Closer alignment with professional guidelines would increase the number of patients treated and favourable outcomes. Conclusion: Product labels should be revised to allow treatment of patients ≤4.5 h from onset and aged ≥80 years.
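The projected gain follows from number-needed-to-treat (NNT) arithmetic: additional favourable outcomes ≈ additional patients treated / NNT. The study used onset-to-treatment-time-specific NNTs; the single averaged figure below is only what the two headline numbers imply, shown for illustration.

```python
def extra_good_outcomes(extra_treated, nnt):
    """Expected additional favourable (day-90 mRS 0-1) outcomes when
    `extra_treated` more patients are thrombolysed at a given NNT."""
    return extra_treated / nnt

# the two headline figures imply an average NNT of about 8.9 patients
# treated per additional favourable outcome
implied_nnt = 17031 / 1922
```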

    X-ray diffracted intensity for double reflection channel cut Ge monochromators at extremely asymmetric diffraction conditions

    The width and the integrated intensity of the 220 x-ray double-diffraction profile and the shift of the Bragg condition due to refraction have been measured in a channel-cut Ge crystal in an angular range near the critical angle of total external reflection. The Bragg angle and incidence condition were varied by changing the x-ray energy. In agreement with the extended dynamical theory of x-ray diffraction, the integrated intensity of the double diffraction remained almost constant even for grazing incidence conditions very close to the critical angle θC for total external reflection. A broadening of the diffraction profile not predicted by the extended theory of x-ray diffraction was observed when the Bragg condition was at angles of incidence lower than 0.6°. Plane-wave topographs revealed a contrast that could be explained by a slight residual crystal surface undulation of 0.3 degrees, due to the etching performed to remove the cutting damage, and by the increasing effect of refraction at glancing angles close to the critical angle. These findings confirm that highly asymmetric channel-cut Ge crystals can work as efficient monochromators or image magnifiers even at glancing angles close to the critical angle, and that the main limitation is the crystal surface preparation.
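Tuning the incidence condition by changing the photon energy follows directly from the Bragg condition, since wavelength scales inversely with energy. A minimal sketch for the Ge 220 reflection, using standard constants rather than values from the paper:

```python
import math

a_Ge = 5.658                     # Ge lattice constant, angstrom (standard value)
d_220 = a_Ge / math.sqrt(8)      # (220) interplanar spacing, ~2.00 angstrom

def bragg_angle_deg(E_keV):
    """Bragg angle for Ge(220) at photon energy E in keV:
    sin(theta_B) = lambda / (2 * d_220), with lambda = 12.398 / E angstrom."""
    lam = 12.398 / E_keV         # hc = 12.398 keV*angstrom
    return math.degrees(math.asin(lam / (2 * d_220)))
```

Raising the energy shortens the wavelength and lowers the Bragg angle, which is how the grazing-incidence condition can be pushed toward the critical angle in such an experiment.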

    Predictors for cerebral edema in acute ischemic stroke treated with intravenous thrombolysis

    Cerebral edema (CED) is a severe complication of acute ischemic stroke. There is uncertainty regarding the predictors for the development of CED after cerebral infarction. We aimed to determine which baseline clinical and radiological parameters predict the development of CED in patients treated with intravenous thrombolysis. We used an image-based classification of CED with 3 degrees of severity (least severe CED 1 to most severe CED 3) on post-intravenous-thrombolysis imaging scans. We extracted data from 42,187 patients recorded in the SITS International Register (Safe Implementation of Treatments in Stroke) during 2002 to 2011. We performed univariate comparisons of baseline data between patients with or without CED. We used backward logistic regression to select a set of predictors for each CED severity. CED was detected in 9579/42,187 patients (22.7%: 12.5% CED 1, 4.9% CED 2, 5.3% CED 3). In patients with CED versus no CED, the baseline National Institutes of Health Stroke Scale score was higher (17 versus 10; P<0.001), signs of acute infarct were more common (27.9% versus 19.2%; P<0.001), the hyperdense artery sign was more common (37.6% versus 14.6%; P<0.001), and blood glucose was higher (6.8 versus 6.4 mmol/L; P<0.001). Baseline National Institutes of Health Stroke Scale score, hyperdense artery sign, blood glucose, impaired consciousness, and signs of acute infarct on imaging were independent predictors for all edema types. The most important baseline predictors for early CED are the National Institutes of Health Stroke Scale score, hyperdense artery sign, higher blood glucose, decreased level of consciousness, and signs of infarct at baseline. The findings can be used to improve the selection and monitoring of patients for drug or surgical treatment.

    Ant Algorithm for AP-N Aimed at Optimization of Complex Systems

    The Assignment Problem (AP), a well-known combinatorial problem, has been studied extensively in the course of many operational and technical research efforts. It has been shown to be NP-hard for three or more dimensions, and a few non-deterministic methods have been proposed to solve it. This paper presents a new heuristic search method for the n-dimensional assignment problem, based on swarm intelligence, and compares its results with those obtained by other researchers. It indicates possible directions for solving such problems and presents an ant-algorithm approach to the multidimensional optimization of complex systems. Results of this research, in the form of computational simulation outcomes, are presented.
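The flavour of such an ant algorithm can be sketched on the classical two-dimensional assignment problem: ants build complete assignments guided by pheromone trails and cost-based desirability, and trails are evaporated and then reinforced toward the best solution found. This is a generic ant-colony sketch, not the paper's AP-N algorithm.

```python
import random

def aco_assignment(cost, n_ants=20, n_iter=100, rho=0.1, seed=0):
    """Ant-colony search for the classical 2-D assignment problem: give each
    agent a distinct task minimizing total cost. Generic ACO sketch only."""
    rng = random.Random(seed)
    n = len(cost)
    tau = [[1.0] * n for _ in range(n)]              # pheromone trails
    best, best_cost = None, float('inf')
    for _ in range(n_iter):
        for _ in range(n_ants):
            tasks = list(range(n))
            assign = []
            for agent in range(n):
                # pick a task with probability ~ pheromone * desirability
                w = [tau[agent][t] / (1.0 + cost[agent][t]) for t in tasks]
                t = rng.choices(tasks, weights=w)[0]
                tasks.remove(t)
                assign.append(t)
            c = sum(cost[a][t] for a, t in enumerate(assign))
            if c < best_cost:
                best, best_cost = assign, c
        # evaporate all trails, then reinforce the best assignment so far
        tau = [[(1 - rho) * v for v in row] for row in tau]
        for a, t in enumerate(best):
            tau[a][t] += 1.0 / (1.0 + best_cost)
    return best, best_cost
```

Extending this scheme to the n-dimensional AP means each ant selects a tuple per step instead of a single task, with pheromone stored per tuple component; the evaporation/reinforcement cycle is unchanged.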

    The SITS-UTMOST: a registry-based prospective study in Europe investigating the impact of regulatory approval of intravenous Actilyse in the extended time window (3–4.5 h) in acute ischaemic stroke

    Introduction: The SITS-UTMOST (Safe Implementation of Thrombolysis in Upper Time window Monitoring Study) was a registry-based prospective study of intravenous alteplase used in the extended time window (3–4.5 h) in acute ischaemic stroke, designed to evaluate the impact of the approval of the extended time window on routine clinical practice. Patients and methods: Inclusion of at least 1000 patients treated within 3–4.5 h according to the licensed criteria and actively registered in the SITS-International Stroke Thrombolysis Registry was planned. Prospective data collection started 2 May 2012 and ended 2 November 2014. A historical cohort was identified for the 2 years preceding May 2012. Clinical management and outcome were contrasted between patients treated within 3 h versus 3–4.5 h in the prospective cohort, and between historical and prospective cohorts for the 3 h time window. Outcomes were functional independence (modified Rankin Scale, mRS, 0–2), favourable outcome (mRS 0–1), death at 3 months, and symptomatic intracerebral haemorrhage (SICH) per SITS. Results: 4157 patients from 81 centres in 12 EU countries were entered prospectively (N = 1118 in the 3–4.5 h and N = 3039 in the 0–3 h time window), together with 3454 retrospective patients in the 0–3 h time window who met the marketing approval conditions. In the prospective cohort, median arrival-to-treatment time was longer in the 3–4.5 h than the 3 h window (79 vs. 55 min). Within the 3 h time window, treatment delays were shorter for prospective than historical patients (55 vs. 63 min). There was no significant difference between the 3–4.5 h and 3 h prospective cohorts with regard to the percentage of reported SICH (1.6 vs. 1.7), death (11.6 vs. 11.1), functional independence (66 vs. 65) at 3 months or favourable outcome (51 vs. 50). Discussion: The main weakness is the observational design of the study.
Conclusion: This study identified no negative impact on treatment delay or on outcome following extension of the approved time window to 4.5 h for the use of alteplase in stroke.

    Does Circuit Analysis Interpretability Scale? Evidence from Multiple Choice Capabilities in Chinchilla

    Circuit analysis is a promising technique for understanding the internal mechanisms of language models. However, existing analyses have been done in small models far from the state of the art. To address this, we present a case study of circuit analysis in the 70B Chinchilla model, aiming to test the scalability of circuit analysis. In particular, we study multiple-choice question answering and investigate Chinchilla's capability to identify the correct answer label given knowledge of the correct answer text. We find that the existing techniques of logit attribution, attention pattern visualization, and activation patching naturally scale to Chinchilla, allowing us to identify and categorize a small set of `output nodes' (attention heads and MLPs). We further study the `correct letter' category of attention heads, aiming to understand the semantics of their features, with mixed results. For normal multiple-choice question answering, we can significantly compress the query, key and value subspaces of the head without loss of performance when operating on the answer labels, and we show that the query and key subspaces represent an `Nth item in an enumeration' feature to at least some extent. However, when we attempt to use this explanation to understand the heads' behaviour on a more general distribution including randomized answer labels, we find that it is only a partial explanation, suggesting there is more to learn about the operation of `correct letter' heads on multiple-choice question answering.
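Activation patching, one of the techniques named above, measures a component's causal effect by re-running the model on a corrupted input while restoring that component's activation from a clean run. The toy two-layer network below only illustrates the mechanics; it stands in for Chinchilla, which is not reproducible here, and all names and weights are invented.

```python
import numpy as np

# Toy two-layer network standing in for a transformer component; the weights
# are random and purely illustrate the patching mechanics.
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 3))

def forward(x, hidden_override=None):
    """Forward pass; optionally restore selected hidden units to given values."""
    h = np.tanh(x @ W1)
    if hidden_override is not None:
        for i, v in hidden_override.items():
            h[i] = v
    return h @ W2  # logits

clean, corrupt = rng.normal(size=4), rng.normal(size=4)
h_clean = np.tanh(clean @ W1)

target = 0  # logit whose change we attribute to individual hidden units
base = forward(corrupt)[target]
# effect of unit i = logit change when unit i alone is restored to its
# clean-run activation while the rest of the corrupted run is kept
effects = [forward(corrupt, {i: h_clean[i]})[target] - base for i in range(8)]
```

Because the logits are linear in the hidden layer, each unit's patching effect here equals its activation difference times its output weight; in a deep nonlinear model the same patching procedure captures interactions that such a linear attribution would miss.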