Thromboelastometry (ROTEM®) in children: age-related reference ranges and correlations with standard coagulation tests
Background The small sample volume needed and the prompt availability of results make viscoelastic methods like rotational thromboelastometry (ROTEM®) attractive for monitoring coagulation in small children. However, data on reference ranges for ROTEM® parameters in children are scarce. Methods Four hundred and seven children (ASA I and II) undergoing elective surgery were recruited for this prospective, two-centre, observational study. Subjects were grouped as follows: 0-3, 4-12, and 13-24 months, and 2-5, 6-10, and 11-16 yr. Study objectives were to establish age-dependent reference ranges for ROTEM® assays, analyse age dependence of parameters, and compare ROTEM® data with standard coagulation tests. Results Data from 359 subjects remained for final analysis. Except for extrinsically activated clot strength and lysis, parameters for ROTEM® assays were significantly different among all age groups. The most striking finding was that subjects aged 0-3 months exhibited accelerated initiation (ExTEM coagulation time: median 48 s, Q1-Q3 38-65 s; P=0.001) and propagation of coagulation (α angle: median 78°, Q1-Q3 69-84°; P<0.001) and greater maximum clot firmness (median 62 mm, Q1-Q3 54-74 mm), although standard plasma coagulation test results were prolonged (prothrombin time: median 13.2 s, Q1-Q3 12.6-13.6 s; activated partial thromboplastin time: median 42 s, Q1-Q3 40-46 s). Lysis indices of <85% were observed in nearly one-third of all children without increased bleeding tendency. Platelet count and fibrinogen levels correlated significantly with clot strength, and fibrinogen levels correlated with fibrin polymerization. Conclusions Reference ranges for ROTEM® assays were determined for all paediatric age groups. These values will be helpful when monitoring paediatric patients and in studies of perioperative coagulation in children.
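The abstract reports each ROTEM® parameter as a median with a Q1-Q3 interquartile range per age group. A minimal sketch of that kind of nonparametric summary is shown below; the sample values are illustrative, not study data.

```python
from statistics import quantiles

def reference_summary(values):
    """Nonparametric summary (median, Q1, Q3) for one ROTEM parameter
    in one age group, matching the median/Q1-Q3 style of the abstract."""
    q1, med, q3 = quantiles(values, n=4)  # quartiles, default 'exclusive' method
    return med, q1, q3

# Hypothetical ExTEM coagulation times (s) for a small group:
ct = [38, 42, 45, 48, 50, 55, 60, 65, 70]
med, q1, q3 = reference_summary(ct)
```

A real reference-range study would compute these per age group across all subjects; only the summary statistic itself is sketched here.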
Large scale Optimal Transportation Meshfree (OTM) Simulations of Hypervelocity Impact
Large scale three-dimensional numerical simulations of hypervelocity impact of Aluminum alloy 6061-T6 plates by a Nylon 6/6 cylindrical projectile have been performed using the Optimal Transportation Meshfree (OTM) method of Li et al. [7], along with the seizing contact and variational material-point failure algorithms [17, 18]. The dynamic response of the Al6061-T6 plate, including phase transitions in the high-strain-rate, high-pressure, and high-temperature regime expected in our numerical analysis, is described by a variational thermomechanically coupled constitutive model with a SESAME equation of state, rate-dependent J2 plasticity with power-law hardening and thermal softening, and a temperature-dependent Newtonian viscosity. A polytropic-type equation of state fitted to in-house ReaxFF calculations is employed to model the Nylon 6/6 projectile under extreme conditions. The evaluation of the performance of the numerical model takes the form of a conventional validation analysis. In support of the analysis, we have conducted experiments over a range of plate thicknesses of [0.5, 3.0] mm, a range of impact velocities of [5.0, 7.0] km/s, and a range of obliquities of [0, 70]° at Caltech's Small Particle Hypervelocity Impact Range (SPHIR) Facility. Large scale three-dimensional OTM simulations of hypervelocity impact are performed on departmental-class systems using a dynamic load-balancing MPI/PThreads parallel implementation of the OTM method. We find excellent full-field agreement between measured and computed perforation areas, debris clouds, and temperature fields.
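The abstract names rate-dependent J2 plasticity with power-law hardening and thermal softening for the plate. A common multiplicative flow-stress form consistent with that description is sketched below; the functional choice and all parameter values are illustrative assumptions, not the paper's variational model or calibration.

```python
def flow_stress(eps_p, eps_dot, T, *,
                sigma0=270e6,               # reference yield stress, Pa (illustrative)
                eps0=0.001, n=10.0,         # power-law hardening parameters
                eps_dot0=1.0, m=20.0,       # power-law rate-sensitivity parameters
                T0=293.0, Tm=925.0, l=1.0): # thermal softening toward melt
    """Flow stress combining power-law hardening, power-law rate
    dependence, and thermal softening. A generic sketch, not the
    paper's constitutive model."""
    hardening = (1.0 + eps_p / eps0) ** (1.0 / n)
    rate = (1.0 + eps_dot / eps_dot0) ** (1.0 / m)
    softening = max(0.0, 1.0 - ((T - T0) / (Tm - T0)) ** l)
    return sigma0 * hardening * rate * softening

s_ref = flow_stress(0.0, 0.0, 293.0)  # reduces to sigma0 at reference conditions
```

Hardening raises the flow stress with accumulated plastic strain, rate sensitivity with strain rate, and the softening factor drives it to zero at the melting temperature.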
Detection of acute infections during HIV testing in North Carolina
BACKGROUND: North Carolina has added nucleic acid amplification testing for the human immunodeficiency virus (HIV) to standard HIV antibody tests to detect persons with acute HIV infection who are viremic but antibody-negative. METHODS: To determine the effect of nucleic acid amplification testing on the yield and accuracy of HIV detection in public health practice, we conducted a 12-month observational study of methods for state-funded HIV testing. We compared the diagnostic performance of standard HIV antibody tests (i.e., enzyme immunoassay and Western blot analysis) with an algorithm whereby serum samples that yielded negative results on standard antibody tests were tested again with the use of nucleic acid amplification. A surveillance algorithm with repeated sensitive-less-sensitive enzyme immunoassay tests was also evaluated. HIV infection was defined as a confirmed positive result on a nucleic acid amplification test or as HIV antibody seroconversion. RESULTS: Between November 1, 2002, and October 31, 2003, 109,250 persons at risk for HIV infection who had consented to HIV testing presented at state-funded sites. There were 606 HIV-positive results. Established infection, as identified by standard enzyme immunoassay or Western blot analysis, appeared in 583 participants; of these, 107 were identified, with the use of sensitive-less-sensitive enzyme immunoassay tests, as recent infections. A total of 23 acutely infected persons were identified only with the use of the nucleic acid amplification algorithm. With all detectable infections taken into account, the sensitivity of standard antibody testing was 0.962 (95 percent confidence interval, 0.944 to 0.976). There were two false-positive results on nucleic acid amplification tests.
The specificity and positive predictive value of the algorithm that included nucleic acid amplification testing were greater than 0.999 (95 percent confidence interval, 0.999 to >0.999) and 0.997 (95 percent confidence interval, 0.988 to >0.999), respectively. Of the 23 acute HIV infections, 16 were detected at sexually transmitted disease clinics. Emergency measures for HIV prevention protected 48 sex partners and one fetus from high-risk exposure to HIV. CONCLUSIONS: The addition of nucleic acid amplification testing to an HIV testing algorithm significantly increases the identification of cases of infection without impairing the performance of diagnostic testing. The detection of highly contagious, acutely infected persons creates new opportunities for HIV surveillance and prevention.
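The reported sensitivity of 0.962 is consistent with 583 of 606 infections detected by standard antibody testing. A minimal sketch of that calculation follows, using a normal-approximation (Wald) confidence interval; the paper's interval (0.944 to 0.976) was presumably computed with an exact binomial method, so the bounds here only roughly reproduce it.

```python
from math import sqrt

def sensitivity_with_ci(detected, total, z=1.96):
    """Point estimate and Wald (normal-approximation) 95% CI for a
    proportion. Illustrative only; an exact binomial interval, as
    likely used in the paper, would differ slightly."""
    p = detected / total
    half = z * sqrt(p * (1.0 - p) / total)
    return p, p - half, p + half

# 583 established infections detected by standard antibody testing
# out of 606 total HIV-positive results:
p, lo, hi = sensitivity_with_ci(583, 606)
```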
A massively parallel implementation of the Optimal Transportation Meshfree method for explicit solid dynamics
Presented is a massively parallel implementation (pOTM) of the Optimal Transportation Meshfree (OTM) method of Li et al. (2010) for explicit solid dynamics. The implementation is based on a two-level scheme using the Message Passing Interface between compute servers and threaded parallelism on the multi-core processors within each server. Both layers dynamically subdivide the problem to provide excellent parallel scalability. pOTM is applied to three problems and compared to experiments to demonstrate accuracy and performance. For both a Taylor-anvil and a hypervelocity impact problem, the pOTM implementation scales nearly perfectly to about 8000 cores.
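The abstract describes dynamically subdividing the problem so that work stays balanced across ranks and threads. One simple way to balance per-point (or per-block) costs, sketched below as a hedged single-process stand-in, is greedy longest-processing-time assignment: hand the next-largest cost to the currently least-loaded rank. The actual pOTM balancing scheme is not specified in the abstract.

```python
import heapq

def balance(costs, n_ranks):
    """Greedy LPT assignment of work-item costs to ranks: the
    next-largest cost always goes to the least-loaded rank.
    A generic load-balancing sketch, not the paper's algorithm."""
    heap = [(0.0, r) for r in range(n_ranks)]   # (current load, rank id)
    heapq.heapify(heap)
    assignment = [[] for _ in range(n_ranks)]
    for i in sorted(range(len(costs)), key=lambda i: -costs[i]):
        load, r = heapq.heappop(heap)           # least-loaded rank
        assignment[r].append(i)
        heapq.heappush(heap, (load + costs[i], r))
    return assignment

parts = balance([5.0, 3.0, 3.0, 2.0, 2.0, 1.0], 2)
```

In a dynamic setting, the costs would be re-measured and the assignment recomputed as the simulation evolves.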
Rigorous model-based uncertainty quantification with application to terminal ballistics, part I: Systems with controllable inputs and small scatter
This work is concerned with establishing the feasibility of a data-on-demand (DoD) uncertainty quantification (UQ) protocol based on concentration-of-measure inequalities. Specific aims are to establish the feasibility of the protocol and its basic properties, including the tightness of the predictions afforded by the protocol. The assessment is based on an application to terminal ballistics and a specific system configuration consisting of 6061-T6 aluminum plates struck by spherical S-2 tool steel projectiles at ballistic impact speeds. The system's inputs are the plate thickness and impact velocity, and the perforation area is chosen as the sole performance measure of the system. The objective of the UQ analysis is to certify the lethality of the projectile, i.e., that the projectile perforates the plate with high probability over a prespecified range of impact velocities and plate thicknesses. The net outcome of the UQ analysis is an M/U ratio, or confidence factor, of 2.93, indicative of a small probability of no perforation of the plate over its entire operating range. The high confidence (>99.9%) in the successful operation of the system afforded by the analysis, together with the small number of tests (40) required for the determination of the modeling-error diameter, establishes the feasibility of the DoD UQ protocol as a rigorous yet practical approach for model-based certification of complex systems.
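In the concentration-of-measure UQ framework this abstract draws on, the confidence factor CF = M/U (design margin over total uncertainty) typically bounds the probability of failure through a McDiarmid-type inequality, PoF ≤ exp(-2 CF²). Whether this exact bound is the one used in the paper is an assumption from the broader literature; the sketch below simply evaluates it at the reported CF of 2.93.

```python
from math import exp

def pof_upper_bound(cf):
    """McDiarmid-type concentration-of-measure upper bound on the
    probability of failure (here, no perforation), in terms of the
    confidence factor CF = M/U. Assumed standard form, not quoted
    from this abstract."""
    return exp(-2.0 * cf * cf)

bound = pof_upper_bound(2.93)  # CF reported in the abstract
```

A CF of 2.93 makes the bound vanishingly small, consistent with the abstract's statement of >99.9% confidence in successful operation.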