The curvHDR Method for Gating Flow Cytometry Samples
Motivation: High-throughput flow cytometry experiments produce hundreds of large multivariate samples of cellular characteristics. These samples require specialized processing to obtain clinically meaningful measurements. A major component of this processing is a form of cell subsetting known as gating. Manual gating is time-consuming and subjective. Good automatic and semi-automatic gating algorithms are very beneficial to high-throughput flow cytometry.
Results: We develop a statistical procedure, named curvHDR, for automatic and semi-automatic gating. The method combines the notions of significant high negative curvature regions and highest density regions and has the ability to adapt well to human-perceived gates. The underlying principles apply in arbitrary dimension, although we focus on dimensions up to three. Accompanying software, compatible with contemporary flow cytometry informatics, is developed.
Availability: Software for Bioconductor within R is available
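The released curvHDR software itself is an R/Bioconductor package. As a language-neutral illustration of one of its two ingredients (not the curvHDR algorithm), the sketch below estimates a highest density region (HDR) gate from a kernel density estimate: the smallest region containing a chosen fraction of the probability mass, approximated by thresholding the estimated density at the corresponding sample quantile. The two simulated cell populations are placeholders.

    import numpy as np
    from scipy.stats import gaussian_kde

    # Sketch of a highest density region (HDR) gate from a kernel density
    # estimate -- an illustration of one curvHDR ingredient, not the package's
    # actual algorithm (the released software is R/Bioconductor).
    rng = np.random.default_rng(0)
    # Placeholder two-dimensional sample standing in for two cell populations.
    cells = np.vstack([rng.normal([2.0, 2.0], 0.3, size=(800, 2)),
                       rng.normal([5.0, 4.0], 0.5, size=(1200, 2))])

    kde = gaussian_kde(cells.T)              # expects shape (dim, n_events)
    dens = kde(cells.T)                      # estimated density at each event

    # 95% HDR: keep events whose density exceeds the 5th percentile of the
    # density values evaluated at the sample points.
    threshold = np.quantile(dens, 0.05)
    gated = cells[dens >= threshold]
    print(f"{gated.shape[0]} of {cells.shape[0]} events fall inside the 95% HDR gate")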
LTP interferometer - noise sources and performance
The LISA Technology Package (LTP) uses laser interferometry to measure the changes in relative displacement between two inertial test masses. The goals of the mission require a displacement measuring precision of 10 pm Hz^(-1/2) at frequencies in the 3–30 mHz band. We report on progress with a prototype LTP interferometer optical bench in which fused silica mirrors and beamsplitters are fixed to a ZERODUR® substrate using hydroxide catalysis bonding to form a rigid interferometer. The couplings of two expected noise sources, laser frequency noise and ambient temperature fluctuations, to the displacement noise of this interferometer have been investigated, and an additional, unexpected noise source has been identified. The additional noise is due to small amounts of signal at the heterodyne frequency arriving at the photodiode preamplifiers with a phase that quasistatically changes with respect to the optical signal. The phase shift is caused by differential changes in the external optical paths the beams travel before they reach the rigid interferometer. Two different external path length stabilization systems have been demonstrated, and these allowed the performance of the overall system to meet the LTP displacement noise requirement.
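The displacement noise figures quoted here are amplitude spectral densities. As a generic illustration only (not from this paper), the sketch below converts a displacement time series to an amplitude spectral density in m/√Hz with a Welch periodogram and compares it against a 10 pm Hz^(-1/2) requirement in the 3–30 mHz band; the sampling rate, record length and simulated noise level are placeholder assumptions.

    import numpy as np
    from scipy.signal import welch

    # Placeholder example: compare a simulated displacement time series against
    # a 10 pm/sqrt(Hz) requirement in the 3-30 mHz band.
    fs = 10.0                                   # sampling rate in Hz (assumed)
    n = 1_000_000                               # ~28 hours of data (assumed)
    x = 5e-12 * np.random.default_rng(0).standard_normal(n)   # displacement, metres

    f, psd = welch(x, fs=fs, nperseg=2**16)     # power spectral density, m^2/Hz
    asd = np.sqrt(psd)                          # amplitude spectral density, m/sqrt(Hz)

    band = (f >= 3e-3) & (f <= 30e-3)
    print("max ASD in band: %.2f pm/sqrt(Hz)" % (asd[band].max() * 1e12))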
LISA pathfinder optical interferometry
The LISA Technology Package (LTP) aboard the LISA Pathfinder mission is dedicated to demonstrating and verifying key technologies for LISA, in particular drag-free control, ultra-precise laser interferometry and the gravitational sensor. Two inertial sensors and the optical interferometry between them, combined with the dimensionally stable glass-ceramic Zerodur structure, make up the LTP. Drag-free operation of the spacecraft is to be validated by interferometrically measuring the relative displacement and tilt between the two test masses (and the optical bench) with noise levels of 10 pm/√Hz and 10 nrad/√Hz between 3 mHz and 30 mHz. This performance, together with the overall environmental tests, has been verified at engineering model (EM) level. The OB structure is able to support two inertial sensors (approximately 17 kg each) and to withstand 25 g design loads as well as a 0–40 °C temperature range. Optical functionality was verified successfully after the environmental tests. The engineering model development and manufacturing of the optical bench and interferometry hardware and their verification tests will be presented.
A Symmetric Approach to Compilation and Decompilation
Just as specializing a source interpreter can achieve compilation from a source language to a target language, we observe that specializing a target interpreter can achieve compilation from the target language to the source language. In both cases, the key issue is the choice of whether to perform an evaluation or to emit code that represents this evaluation. We substantiate this observation by specializing two source interpreters and two target interpreters. We first consider a source language of arithmetic expressions and a target language for a stack machine, and then the lambda-calculus and the SECD-machine language. In each case, we prove that the target-to-source compiler is a left inverse of the source-to-target compiler, i.e., it is a decompiler. In the context of partial evaluation, compilation by source-interpreter specialization is classically referred to as a Futamura projection. By symmetry, it seems logical to refer to decompilation by target-interpreter specialization as a Futamura embedding
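As a self-contained illustration of the source and target languages named above (not the paper's partial-evaluation machinery), the sketch below gives a source interpreter and compiler for arithmetic expressions, a target interpreter for a small stack machine, and a decompiler obtained by running the stack machine symbolically, pushing expressions instead of values; the final assertions check the left-inverse property on one example.

    # Illustrative sketch only: arithmetic-expression source language and
    # stack-machine target language, with hand-written compile/decompile
    # functions mirroring the "evaluate or emit code" choice described above.

    def eval_expr(e):
        """Source interpreter: evaluate an expression to a number."""
        if e[0] == "lit":
            return e[1]
        if e[0] == "add":
            return eval_expr(e[1]) + eval_expr(e[2])
        raise ValueError(e[0])

    def compile_expr(e):
        """Source-to-target compiler: emit stack code instead of evaluating."""
        if e[0] == "lit":
            return [("push", e[1])]
        if e[0] == "add":
            return compile_expr(e[1]) + compile_expr(e[2]) + [("add",)]
        raise ValueError(e[0])

    def run_stack(code):
        """Target interpreter: execute stack-machine code."""
        stack = []
        for instr in code:
            if instr[0] == "push":
                stack.append(instr[1])
            elif instr[0] == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
        return stack[-1]

    def decompile(code):
        """Target-to-source compiler: run the stack machine symbolically,
        pushing expressions instead of numbers."""
        stack = []
        for instr in code:
            if instr[0] == "push":
                stack.append(("lit", instr[1]))
            elif instr[0] == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(("add", a, b))
        return stack[-1]

    e = ("add", ("lit", 1), ("add", ("lit", 2), ("lit", 3)))
    code = compile_expr(e)
    assert eval_expr(e) == run_stack(code) == 6
    assert decompile(code) == e    # decompilation is a left inverse of compilation here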
Sub-femto-g free fall for space-based gravitational wave observatories: LISA pathfinder results
We report the first results of the LISA Pathfinder in-flight experiment. The results demonstrate that two free-falling reference test masses, such as those needed for a space-based gravitational wave observatory like LISA, can be put in free fall with a relative acceleration noise with a square root of the power spectral density of 5.2 ± 0.1 fm s⁻²/√Hz, or (0.54 ± 0.01) × 10⁻¹⁵ g/√Hz, with g the standard gravity, for frequencies between 0.7 and 20 mHz. This value is lower than the LISA Pathfinder requirement by more than a factor 5 and within a factor 1.25 of the requirement for the LISA mission, and is compatible with Brownian noise from viscous damping due to the residual gas surrounding the test masses. Above 60 mHz the acceleration noise is dominated by interferometer displacement readout noise at a level of (34.8 ± 0.3) fm/√Hz, about 2 orders of magnitude better than requirements. At f ≤ 0.5 mHz we observe a low-frequency tail that stays below 12 fm s⁻²/√Hz down to 0.1 mHz. This performance would allow for a space-based gravitational wave observatory with a sensitivity close to what was originally foreseen for LISA.
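A quick unit check using only the numbers quoted above: dividing 5.2 fm s⁻²/√Hz by the standard gravity g = 9.80665 m/s² reproduces the quoted value of roughly 0.53 × 10⁻¹⁵ g/√Hz.

    # Unit check: convert the quoted acceleration ASD from m s^-2/sqrt(Hz) to g/sqrt(Hz).
    g0 = 9.80665          # standard gravity in m/s^2
    asd = 5.2e-15         # 5.2 fm s^-2/sqrt(Hz), expressed in m s^-2/sqrt(Hz)
    print(asd / g0)       # ~0.53e-15, consistent with (0.54 +/- 0.01) x 10^-15 g/sqrt(Hz)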
Consistency of the kernel density estimator - a survey
Various consistency proofs for the kernel density estimator have been developed over the last few decades. Important milestones are the pointwise consistency and almost sure uniform convergence with a fixed bandwidth on the one hand, and the rate of convergence with a fixed or even a variable bandwidth on the other hand. While considering global properties of the empirical distribution functions is sufficient for strong consistency, proofs of exact convergence rates use deeper information about the underlying empirical processes. A unifying feature, however, is that both earlier and more recent proofs use bounds on the probability that a sum of random variables deviates from its mean.
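As a small numerical illustration of the consistency being surveyed (not taken from the survey itself), the sketch below evaluates the sup-norm distance between a Gaussian kernel density estimate and the true standard-normal density for growing sample sizes; the bandwidth is chosen automatically by Scott's rule, and the target density and sample sizes are arbitrary choices.

    import numpy as np
    from scipy.stats import norm, gaussian_kde

    # Minimal illustration of KDE consistency: the sup-norm error between a
    # Gaussian KDE and the true standard-normal density shrinks as n grows.
    rng = np.random.default_rng(0)
    grid = np.linspace(-4, 4, 401)

    for n in (100, 1_000, 10_000):
        sample = rng.standard_normal(n)
        kde = gaussian_kde(sample)            # bandwidth chosen by Scott's rule
        sup_err = np.max(np.abs(kde(grid) - norm.pdf(grid)))
        print(f"n={n:6d}  sup-norm error ~ {sup_err:.4f}")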
Deep learning for healthcare applications based on physiological signals: A review
Background and objective: We have cast the net into the ocean of knowledge to retrieve the latest scientific research on deep learning methods for physiological signals. We found 53 research papers on this topic, published from 01.01.2008 to 31.12.2017. Methods: An initial bibliometric analysis shows that the reviewed papers focused on the electromyogram (EMG), electroencephalogram (EEG), electrocardiogram (ECG), and electrooculogram (EOG). These four categories were used to structure the subsequent content review. Results: During the content review, we found that deep learning performs better for large and varied datasets than classic analysis and machine classification methods. Deep learning algorithms try to develop the model by using all the available input. Conclusions: This review depicts the application of various deep learning algorithms used to date; in the future, deep learning is expected to be applied to further healthcare areas to improve the quality of diagnosis.
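The review covers many architectures; as a concrete, generic illustration (not a model from the review), the sketch below defines a minimal 1D convolutional classifier for fixed-length physiological signal windows in Keras. The window length, channel count and number of classes are placeholders.

    import tensorflow as tf

    # Minimal 1D-CNN sketch for classifying fixed-length physiological signal
    # windows (e.g. single-lead ECG segments). Window length, channel count and
    # number of classes below are placeholders, not values from the review.
    n_samples, n_channels, n_classes = 1000, 1, 5

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_samples, n_channels)),
        tf.keras.layers.Conv1D(16, kernel_size=7, activation="relu"),
        tf.keras.layers.MaxPooling1D(4),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()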
The association between hip fracture and hip osteoarthritis: A case-control study
Background: There have been reports both supporting and refuting an inverse relationship between hip fracture and hip osteoarthritis (OA). We explore this relationship using a case-control study design. Methods: Exclusion criteria were previous hip fracture (same side or contralateral side), age younger than 60 years, foreign nationality, pathological fracture, rheumatoid arthritis and cases where radiographic examinations were not found in the archives. We studied all subjects with hip fracture that remained after the exclusion process and that were treated at Akureyri University Hospital, Iceland, 1990-2008, n = 562 (74% women). Hip fracture cases were compared with a cohort of subjects with colon radiographs, n = 803 (54% women), to determine the expected population prevalence of hip OA. Presence of radiographic hip OA was defined as a minimum joint space of 2.5 mm or less on an anteroposterior radiograph, or Kellgren and Lawrence grade 2 or higher. Possible causes of secondary osteoporosis were identified by review of medical records. Results: The age-adjusted odds ratio (OR) for subjects with hip fracture having radiographic hip OA was 0.30 (95% confidence interval [95% CI] 0.12-0.74) for men and 0.33 (95% CI 0.19-0.58) for women, compared to controls. The probability of subjects with hip fracture and hip OA having a secondary cause of osteoporosis was three times higher than for subjects with hip fracture without hip OA. Conclusion: The results of our study support an inverse relationship between hip fractures and hip OA.
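The abstract reports age-adjusted odds ratios. As a generic illustration of how such an estimate is typically obtained (not the study's actual analysis, and using simulated placeholder data), the sketch below fits a logistic regression of hip OA on fracture status and age and exponentiates the fracture coefficient to obtain the adjusted OR with its confidence interval.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Generic sketch of an age-adjusted odds ratio via logistic regression.
    # The data below are simulated placeholders, not the study's data.
    rng = np.random.default_rng(1)
    n = 1365
    df = pd.DataFrame({
        "fracture": rng.integers(0, 2, n),     # 1 = hip fracture case
        "age": rng.normal(78, 8, n),           # years
    })
    # Simulate hip OA that is rarer among fracture cases (inverse association).
    logit_p = -1.0 - 1.1 * df["fracture"] + 0.02 * (df["age"] - 78)
    df["hip_oa"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

    fit = smf.logit("hip_oa ~ fracture + age", data=df).fit(disp=0)
    print("age-adjusted OR for fracture:", np.exp(fit.params["fracture"]))
    print("95% CI:", np.exp(fit.conf_int().loc["fracture"]).values)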
New Head, Old Ideas? The "Normalisation" of the Front National under Marine Le Pen
In this article, it is investigated whether vibrational entropy (VE) is an important contribution to the free energy of globular proteins at ambient conditions. VE represents the major configurational-entropy contribution of these proteins. By definition, it is an average of the configurational entropies of the protein within single minima of the energy landscape, weighted by their occupation probabilities. A large part of it originates from thermal motion of flexible torsion angles, giving rise to the finite peak widths observed in torsion angle distributions. While VE may affect the equilibrium properties of proteins, it is usually neglected in numerical calculations as its consideration is difficult. Moreover, it is sometimes believed that all well-packed conformations of a globular protein have similar VE anyway. Here, we explicitly measure the VE for six different conformations from simulation data of a test protein. Estimates are obtained using the quasi-harmonic approximation for three coordinate sets: Cartesian, bond-angle-torsion (BAT), and a new set we term rotamer-degeneracy-lifted BAT coordinates. The new set gives improved estimates as it overcomes a known shortcoming of the quasi-harmonic approximation caused by multiply populated rotamer states, and it may serve for VE estimation of macromolecules in a very general context. The obtained VE values depend considerably on the type of coordinates used. However, for all coordinate sets we find large entropy differences between the conformations, of the order of the overall stability of the protein. This result may have important implications for the choice of free energy expressions used in software for protein structure prediction, protein design, and NMR refinement.
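The entropy estimates described above rely on the quasi-harmonic approximation. As a generic sketch of that approximation in Cartesian coordinates (not the authors' implementation, and not the rotamer-degeneracy-lifted BAT variant), the code below builds the mass-weighted covariance matrix of coordinate fluctuations, converts its eigenvalues to quasi-harmonic frequencies, and sums the standard quantum harmonic-oscillator entropy per mode; the trajectory array and atomic masses are random placeholders used only to make the example runnable.

    import numpy as np

    # Generic quasi-harmonic entropy sketch in Cartesian coordinates.
    # `coords` is a placeholder trajectory of shape (n_frames, n_atoms, 3) in
    # metres; `masses` has shape (n_atoms,) in kg.
    kB = 1.380649e-23        # Boltzmann constant, J/K
    hbar = 1.054571817e-34   # reduced Planck constant, J s
    T = 300.0                # temperature, K

    def quasi_harmonic_entropy(coords, masses):
        n_frames, n_atoms, _ = coords.shape
        x = coords.reshape(n_frames, 3 * n_atoms)
        x = x - x.mean(axis=0)                    # fluctuations about the mean
        w = np.sqrt(np.repeat(masses, 3))         # mass weighting per coordinate
        cov = np.cov((x * w).T)                   # mass-weighted covariance matrix
        lam = np.linalg.eigvalsh(cov)
        lam = lam[lam > 1e-30]                    # drop near-zero modes
        omega = np.sqrt(kB * T / lam)             # quasi-harmonic frequencies
        u = hbar * omega / (kB * T)
        # Quantum harmonic-oscillator entropy per mode, summed over all modes.
        return kB * np.sum(u / np.expm1(u) - np.log1p(-np.exp(-u)))   # J/K

    # Random placeholder data, for shape checking only:
    rng = np.random.default_rng(0)
    coords = rng.normal(scale=1e-11, size=(500, 20, 3))
    masses = np.full(20, 12 * 1.66054e-27)        # roughly carbon masses
    print(quasi_harmonic_entropy(coords, masses), "J/K")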
Meta Modeling for Business Process Improvement
Conducting business process improvement (BPI) initiatives is a topic of high priority for today's companies. However, performing BPI projects has become challenging. This is due to rapidly changing customer requirements and an increase of inter-organizational business processes, which need to be considered from an end-to-end perspective. In addition, traditional BPI approaches are more and more perceived as overly complex and too resource-consuming in practice. Against this background, the paper proposes a BPI roadmap, which is an approach for systematically performing BPI projects and serves practitioners' needs for manageable BPI methods. Based on this BPI roadmap, a domain-specific conceptual modeling method (DSMM) has been developed. The DSMM supports the efficient documentation and communication of the results that emerge during the application of the roadmap. Thus, conceptual modeling acts as a means for purposefully codifying the outcomes of a BPI project. Furthermore, a corresponding software prototype has been implemented using a meta modeling platform to assess the technical feasibility of the approach. Finally, the usability of the prototype has been empirically evaluated.