Validation of multiprocessor systems
Experiments that can be used to validate the fault-free performance of multiprocessor systems in aerospace applications integrating flight controls and avionics are discussed. Engineering prototypes for two fault-tolerant multiprocessors are tested.
Multiple junction biasing of superconducting tunnel junction detectors
We describe a new biasing scheme for single-photon detectors based on superconducting tunnel junctions. It replaces a single detector junction with a circuit of three junctions and achieves biasing of a detector junction at subgap currents without the use of an external magnetic field. The biasing occurs through the nonlinear interaction of the three junctions, which we demonstrate through numerical simulation. This nonlinear state is numerically stable against external fluctuations and is compatible with high-fidelity electrical readout of the photon-induced current. Eliminating the external magnetic field potentially increases the capability of these photon detectors and eases constraints on the fabrication of large detector arrays.
Software-implemented fault insertion: An FTMP example
This report presents a model for fault insertion through software; describes its implementation on a fault-tolerant computer, FTMP; summarizes fault detection, identification, and reconfiguration data collected with software-implemented fault insertion; and compares the results to hardware fault insertion data. Experimental results show detection time to be a function of the time of insertion and the system workload. Fault detection times show no correlation between software-inserted and hardware-inserted faults: hardware-inserted faults must first manifest as errors before they can be detected, whereas software-inserted faults exercise the error detection mechanisms immediately. In summary, software-implemented fault insertion can be used to evaluate a system's fault-handling capabilities in fault detection, identification, and recovery. Although software-inserted faults do not map directly to hardware-inserted faults, experiments show that software-implemented fault insertion can emulate hardware fault insertion with greater ease and automation.
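As a toy illustration of the idea (a hypothetical Python sketch, not the FTMP mechanism), a software fault inserter can corrupt one bit of a replicated value and let an ordinary error-detection path, here a simple majority vote, find the faulty replica:

```python
def flip_bit(value, bit):
    """Software-implemented fault insertion: corrupt one bit of a word,
    emulating a transient hardware fault (toy sketch, not FTMP's mechanism)."""
    return value ^ (1 << bit)

def detect(replicas):
    """Majority-vote error detection over redundant copies of a value.
    Returns the voted value and the indices of disagreeing replicas."""
    votes = {}
    for r in replicas:
        votes[r] = votes.get(r, 0) + 1
    majority = max(votes, key=votes.get)
    faulty = [i for i, r in enumerate(replicas) if r != majority]
    return majority, faulty
```

Because the corruption is injected directly into the replicated state, the detection machinery is exercised immediately, which mirrors the report's observation that software-inserted faults skip the error-manifestation delay of hardware faults.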
A Validated Reversed-Phase HPLC Method for the Determination of Atorvastatin Calcium in Tablets
A reversed-phase liquid chromatographic (RP-LC) assay method was developed for the quantitative determination of atorvastatin calcium in the presence of its degradation products. The assay involved isocratic elution of atorvastatin calcium on a LiChroCART 250 × 4 mm HPLC cartridge with a LiChrospher 100 RP-18 (5 µm) column, using a mobile phase of 0.1% acetic acid solution:acetonitrile (45:55, v/v), pH 3.8. The flow rate was 0.8 mL/min and the analytes were monitored at 246 nm. The assay method was found to be linear from 8.13 to 23.77 µg/mL. All validation parameters were within the acceptance range. The developed method was successfully applied to estimate the amount of atorvastatin calcium in tablets.
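The quantitation step behind such an assay is a linear calibration of peak area against concentration over the validated range. A minimal sketch with synthetic numbers (the calibration data here are hypothetical, not the paper's):

```python
import numpy as np

def calibrate(conc, area):
    """Fit the calibration line area = slope * conc + intercept by
    least squares over the validated linear range."""
    slope, intercept = np.polyfit(conc, area, 1)
    return slope, intercept

def quantify(area, slope, intercept):
    """Back-calculate concentration from a measured peak area."""
    return (area - intercept) / slope
```

Given standards spanning the validated 8.13-23.77 µg/mL range, `quantify` converts a tablet sample's peak area at 246 nm into an estimated concentration.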
Fault-free performance validation of fault-tolerant multiprocessors
A validation methodology for testing the performance of fault-tolerant computer systems was developed and applied to the Fault-Tolerant Multiprocessor (FTMP) at NASA Langley's AIRLAB facility. This methodology was claimed to be general enough to apply to any ultrareliable computer system. The goal of this research was to extend the validation methodology and to demonstrate its robustness through more extensive application to NASA's Fault-Tolerant Multiprocessor (FTMP) and to the Software Implemented Fault-Tolerance (SIFT) computer system. Furthermore, the performance of the two multiprocessors was compared by conducting similar experiments on both. An analysis of the results shows that high-level language instruction execution times for both SIFT and FTMP were consistent and predictable, with SIFT having greater throughput. At the operating system level, FTMP consumes 60% of its throughput on its real-time dispatcher and 5% on fault-handling tasks. In contrast, SIFT consumes 16% of its throughput on the dispatcher but 66% on fault-handling software overhead.
A Network Inversion Filter combining GNSS and InSAR for tectonic slip modeling
Studies of the earthquake cycle benefit from long-term time-dependent slip modeling, as it can be a powerful means to improve our understanding of the interaction of earthquake cycle processes such as interseismic, coseismic, postseismic, and aseismic slip. Observations from Interferometric Synthetic Aperture Radar (InSAR) allow us to model slip at depth with a higher spatial resolution than when using GNSS alone. While the temporal resolution of InSAR has typically been limited, the recent fleet of SAR satellites, including Sentinel-1, COSMO-SkyMed, and RADARSAT-2, permits the use of InSAR for time-dependent slip modeling at intervals of a few days when the missions are combined. With the vast amount of SAR data available, simultaneous inversion of data from all epochs becomes challenging. Here, we expanded the original Network Inversion Filter (NIF) to include InSAR observations of surface displacements in addition to GNSS. In the NIF framework, geodetic observations are limited to those of a given epoch, with a stochastic model describing slip evolution over time. The combination of Kalman forward filtering and backward smoothing allows all geodetic observations to constrain the complete observation period. Combining GNSS and InSAR allows modeling of time-dependent slip at unprecedented spatial resolution. We validate the approach with a simulation of the 2006 Guerrero slow slip event. We highlight the importance of including InSAR covariance information, and demonstrate that InSAR provides an additional constraint on the spatial extent of the slow slip.
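The forward-filter/backward-smoother structure can be sketched for a single scalar state; this is a hypothetical toy (random-walk dynamics, direct noisy observations of one slip parameter), not the authors' NIF implementation:

```python
import numpy as np

def kalman_rts(obs, r, q, x0=0.0, p0=1.0):
    """Scalar Kalman forward filter followed by a Rauch-Tung-Striebel
    backward smoother, for a random-walk state observed directly with
    measurement variance r and process variance q. Smoothing lets late
    observations constrain early epochs, as in the NIF framework."""
    n = len(obs)
    xp = np.zeros(n); pp = np.zeros(n)   # one-step predictions
    xf = np.zeros(n); pf = np.zeros(n)   # filtered estimates
    x, p = x0, p0
    for k in range(n):
        # predict: random walk keeps the state, variance grows by q
        p = p + q
        xp[k], pp[k] = x, p
        # update with observation k
        gain = p / (p + r)
        x = x + gain * (obs[k] - x)
        p = (1.0 - gain) * p
        xf[k], pf[k] = x, p
    # backward RTS smoother: propagate information from later epochs
    xs = xf.copy()
    for k in range(n - 2, -1, -1):
        c = pf[k] / pp[k + 1]
        xs[k] = xf[k] + c * (xs[k + 1] - xp[k + 1])
    return xs
```

The real NIF carries a vector of slip on many fault patches and maps it to GNSS and InSAR displacements through an elastic Green's function matrix, but the filter-then-smooth recursion has exactly this shape.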
Imputation of Assay Bioactivity Data Using Deep Learning.
We describe a novel deep learning neural network method and its application to impute assay pIC50 values. Unlike conventional machine learning approaches, this method is trained on sparse bioactivity data as input, typical of that found in public and commercial databases, enabling it to learn directly from correlations between activities measured in different assays. In two case studies on public-domain data sets we show that the neural network method outperforms traditional quantitative structure-activity relationship (QSAR) models and other leading approaches. Furthermore, by focusing on only the most confident predictions, the accuracy of our method increases to R² > 0.9, compared with R² = 0.44 when reporting all predictions.
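The core idea of training directly on a sparse activity matrix, fitting only the measured entries so that unmeasured ones are imputed from correlations between assays, can be sketched with a toy low-rank factorization (a hypothetical stand-in, not the paper's deep network; all names here are illustrative):

```python
import numpy as np

def masked_mse(pred, Y):
    """MSE over observed entries only; NaN marks an unmeasured assay value."""
    m = ~np.isnan(Y)
    return float(np.mean((pred[m] - Y[m]) ** 2))

def factorize(Y, rank=2, lr=0.01, steps=5000, seed=0):
    """Toy gradient-descent factorization of a compounds-by-assays matrix Y.
    The loss is computed on observed entries only, so predictions for the
    NaN entries come from cross-assay correlations learned by the factors."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    mask = ~np.isnan(Y)
    Y0 = np.where(mask, Y, 0.0)
    for _ in range(steps):
        E = (U @ V.T - Y0) * mask        # residual on observed entries only
        U, V = U - lr * (E @ V), V - lr * (E.T @ U)
    return U @ V.T                        # dense matrix: imputed + fitted
```

Masking the loss, rather than filling missing entries with zeros, is what lets a model of this kind consume the sparse matrices typical of public and commercial bioactivity databases.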
Band structure analysis of the conduction-band mass anisotropy in 6H and 4H SiC
The band structures of 6H and 4H SiC calculated by means of the FP-LMTO method are used to determine the effective mass tensors for their conduction-band minima. The results are shown to be consistent with recent optically detected cyclotron resonance measurements and predict an unusual band filling dependence for 6H-SiC. (To appear in Phys. Rev. B, Rapid Communications.)
Egalitarian justice and expected value
According to all-luck egalitarianism, the differential distributive effects of both brute luck, which defines the outcome of risks which are not deliberately taken, and option luck, which defines the outcome of deliberate gambles, are unjust. Exactly how to correct the effects of option luck is, however, a complex issue. This article argues that (a) option luck should be neutralized not just by correcting luck among gamblers, but among the community as a whole, because it would be unfair for gamblers as a group to be disadvantaged relative to non-gamblers by bad option luck; (b) individuals should receive the warranted expected results of their gambles, except insofar as individuals blamelessly lacked the ability to ascertain which expectations were warranted; and (c) where societal resources are insufficient to deliver expected results to gamblers, gamblers should receive a lesser distributive share which is in proportion to the expected results. Where all-luck egalitarianism is understood in this way, it allows risk-takers to impose externalities on non-risk-takers, which seems counterintuitive. This may, however, be an advantage as it provides a luck egalitarian rationale for assisting "negligent victims".
- …