
    Prosthetic overhang is the most effective way to prevent scapular conflict in a reverse total shoulder prosthesis

    Methods An average and a "worst case scenario" scapular shape in A-P view were created in a 2-D computer model, using data from 200 "normal" scapulae, so that the positions of the glenoid and humeral components could be varied, as well as design features such as the depth of the polyethylene insert, the size of the glenosphere, the position of the center of rotation, and downward glenoid inclination. The model calculated the maximum adduction (notch angle) in the scapular plane at which the cup of the humeral component came into conflict with the scapula. Results A change in humeral neck-shaft inclination from 155° to 145° gave a 10° gain in notch angle. A change in cup depth from 8 mm to 5 mm gave a gain of 12°. With no inferior prosthetic overhang, lateralization of the center of rotation from 0 mm to 5 mm gained 16°. With an inferior overhang of only 1 mm, lateralizing the center of rotation had no effect. Downward glenoid inclination from 0° to 10° gained 10°. A change in glenosphere radius from 18 mm to 21 mm gained 31°, due to the inferior overhang created by the larger glenosphere. An increase in prosthetic overhang relative to the bone from 0 mm to 5 mm gained 39°. Interpretation Of all 6 solutions tested, prosthetic overhang created the biggest gain in notch angle, and this should be considered when designing reverse arthroplasties and defining optimal surgical technique.
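
    The geometric reasoning above can be illustrated with a small numerical sweep. The sketch below is not the authors' 2-D scapular model: it reduces the scapula to a quarter-plane below the glenoid face, places the center of rotation at the glenoid surface, and treats the notch angle as the arm position at which the inferomedial rim of the humeral cup first meets the inferior bone corner. The function and parameter names (notch_angle, inferior_overhang, etc.) and all dimensions are illustrative assumptions.

        # Minimal 2-D impingement sweep (illustrative geometry, not the authors' model)
        import numpy as np

        def notch_angle(glenosphere_radius=18.0,  # R, mm
                        cup_depth=8.0,            # polyethylene insert depth, mm
                        neck_shaft_angle=155.0,   # degrees
                        inferior_overhang=0.0,    # glenosphere extending below the bone rim, mm
                        step=0.05):               # sweep resolution, degrees
            """Sweep the arm from abduction toward adduction and return the first
            abduction angle (degrees; lower is better) at which the inferomedial
            cup rim contacts the inferior glenoid bone corner."""
            R, d, h = glenosphere_radius, cup_depth, inferior_overhang
            theta_rim = np.degrees(np.arccos((R - d) / R))  # cup-rim half-angle on the sphere
            bone_rim_y = -(R - h)   # inferior glenoid corner, measured from the center of rotation
            for A in np.arange(90.0, -90.0, -step):         # abduction angle, decreasing
                # Direction (from the lateral axis) of the inferomedial cup rim point:
                psi = np.radians(A + 90.0 - neck_shaft_angle - theta_rim)
                x, y = R * np.cos(psi), R * np.sin(psi)
                if x <= 0.0 and y >= bone_rim_y:  # rim point has entered the bone quarter-plane
                    return A
            return None  # no impingement within the sweep

        # Toy-model comparison (these numbers come from this sketch, not from the study):
        print(notch_angle(inferior_overhang=0.0))  # ~ +31: impinges before the arm reaches the side
        print(notch_angle(inferior_overhang=5.0))  # ~ -12.5: adducts past neutral before impinging

    In this toy geometry, a 10° reduction in neck-shaft angle shifts the contact angle by 10° and a shallower cup by the change in rim half-angle, while inferior overhang lets the cup rim swing past the bone corner, qualitatively reproducing the ranking reported above.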

    Mobile health in adults with congenital heart disease: Current use and future needs

    Objective Many adults with congenital heart disease (CHD) are affected lifelong by cardiac events, particularly arrhythmias and heart failure. Despite the care provided, the cardiac event rate remains high. Mobile health (mHealth) brings opportunities to enhance daily monitoring, and hence timely response, in an attempt to improve outcome. However, it is not known whether adults with CHD currently use mHealth and what type of mHealth they may need in the near future. Methods Consecutive adult patients with CHD who visited the outpatient clinic at the Academic Medical Center in Amsterdam were asked to fill out questionnaires. Exclusion criteria were mental impairment or inability to read and write Dutch. Results All 118 patients participated (median age 40 years, range 18–78; 40% male; 49% symptomatic) and 92% owned a smartphone. Whereas only a small minority (14%) of patients used mHealth, the large majority (75%) were willing to start. Most patients wanted to use mHealth to receive more information on their physical health and advice on progression of symptoms or signs of deterioration. Analyses by age, gender and complexity of defect showed significantly less current smartphone use at older age, but no difference in interest or in preferences for the type of mHealth application. Conclusion This relatively young adult CHD population rarely uses mHealth, but the majority are motivated to start. New mHealth initiatives are needed in these patients, who have a chronic condition requiring lifelong surveillance, to determine whether a reduction in morbidity and mortality and an improvement in quality of life can be achieved.

    Marginalization of end-use technologies in energy innovation for climate protection

    Mitigating climate change requires directed innovation efforts to develop and deploy energy technologies. Innovation activities are directed towards the outcome of climate protection by public institutions, policies and resources that in turn shape market behaviour. We analyse diverse indicators of activity throughout the innovation system to assess these efforts. We find that efficient end-use technologies contribute large potential emission reductions and provide higher social returns on investment than energy-supply technologies. Yet public institutions, policies and financial resources pervasively privilege energy-supply technologies. Directed innovation efforts are strikingly misaligned with the needs of an emissions-constrained world. Significantly greater effort is needed to develop the full potential of efficient end-use technologies.

    Sharing tasks or sharing actions? Evidence from the joint Simon task.

    In a joint Simon task, a pair of co-acting individuals divide the labor of performing a choice-reaction task such that each actor responds to one type of stimulus and ignores the other type, which is assigned to the co-actor. It has been suggested that the actors share the mental representation of the joint task and perform the co-actor's trials as if they were their own. However, it remains unclear exactly which aspects of the co-actor's task-set the actors share in the joint Simon task. The present study addressed this issue by manipulating the proportions of compatible and incompatible trials for one actor (the inducer actor) and observing the influence on the performance of the other actor (the diagnostic actor), for whom there was always an equal proportion of compatible and incompatible trials. This design disentangled the effect of trial proportion from the confounding effect of compatibility on the preceding trial. The results showed that the trial proportions for the inducer actor strongly influenced the inducer actor's own performance but had little influence on the diagnostic actor's performance. Thus, the diagnostic actor did not represent aspects of the inducer actor's task-set beyond the inducer actor's stimuli and responses. We propose a new account of the effect of preceding compatibility on the joint Simon effect.

    Constraints on Nucleon Decay via "Invisible" Modes from the Sudbury Neutrino Observatory

    Data from the Sudbury Neutrino Observatory have been used to constrain the lifetime for nucleon decay to "invisible" modes, such as n -> 3 nu. The analysis was based on a search for gamma rays from the de-excitation of the residual nucleus that would result from the disappearance of either a proton or a neutron from O16. A limit of tau_inv > 2 x 10^{29} years is obtained at 90% confidence for either neutron or proton decay modes. This is about an order of magnitude more stringent than previous constraints on invisible proton decay modes and 400 times more stringent than those on similar neutron modes. Comment: Update includes missing efficiency factor (limits change by a factor of 2). Submitted to Physical Review Letters.
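
    The scaling behind such a limit is simple counting: the partial lifetime bound is the number of nucleons monitored times the live time and the detection efficiency, divided by the upper limit on signal counts, which is why a missing efficiency factor changes the limit proportionally (here, by a factor of 2). The numbers below are hypothetical placeholders, not the SNO exposure or efficiency.

        # Generic counting-experiment scaling for a nucleon-lifetime limit
        # (hypothetical numbers, not the SNO analysis).
        n_nucleons = 1.0e32   # nucleons monitored (hypothetical)
        live_time  = 1.0      # live time in years (hypothetical)
        efficiency = 0.5      # efficiency for detecting the de-excitation gammas (hypothetical)
        s90        = 25.0     # 90% C.L. upper limit on signal counts (hypothetical)

        tau_limit = n_nucleons * live_time * efficiency / s90
        print(f"tau > {tau_limit:.1e} years (90% C.L.)")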

    Classification of heterogeneous microarray data by maximum entropy kernel

    Background There is a large amount of microarray data accumulating in public databases, providing varied data waiting to be analyzed jointly. Powerful kernel-based methods are commonly used in microarray analyses with support vector machines (SVMs) to approach a wide range of classification problems. However, the standard vectorial data kernel family (linear, RBF, etc.), which takes vectorial data as input, often fails in prediction if the data come from different platforms or laboratories, due to the low gene overlap or consistency between the different datasets. Results We introduce a new type of kernel, called the maximum entropy (ME) kernel, which has no pre-defined function but is generated by kernel entropy maximization with sample distance matrices as constraints, into the field of SVM classification of microarray data. We assessed the performance of the ME kernel on three different datasets: heterogeneous kidney carcinoma, noise-introduced leukemia, and heterogeneous oral cavity carcinoma metastasis data. The results clearly show that the ME kernel is very robust for heterogeneous data containing missing values and high noise, and gives higher prediction accuracies than the standard kernels, namely linear, polynomial and RBF. Conclusion The results demonstrate its utility in effectively analyzing promiscuous microarray data of rare specimens, e.g., minor diseases or species, for which it is difficult to compile homogeneous data in a single laboratory.
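
    As a practical note, a kernel that is generated numerically rather than from a closed-form function enters an SVM as a precomputed Gram matrix. The sketch below shows only that classification stage with scikit-learn; the entropy-maximization step is not reproduced here, the data are random toy arrays, and the RBF matrix is merely a stand-in for an ME kernel matrix.

        # Using a precomputed kernel matrix with an SVM (the ME kernel itself is not
        # computed here; rbf_kernel is a placeholder and the data are toy arrays).
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.metrics.pairwise import rbf_kernel

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(60, 500))      # toy "expression profiles"
        y_train = rng.integers(0, 2, size=60)     # toy class labels
        X_test  = rng.normal(size=(20, 500))

        K_train = rbf_kernel(X_train, X_train)    # replace with an ME kernel matrix (train x train)
        K_test  = rbf_kernel(X_test, X_train)     # kernel values between test and training samples

        clf = SVC(kernel="precomputed").fit(K_train, y_train)
        predictions = clf.predict(K_test)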

    Dynamical Mean-Field Theory

    The dynamical mean-field theory (DMFT) is a widely applicable approximation scheme for the investigation of correlated quantum many-particle systems on a lattice, e.g., electrons in solids and cold atoms in optical lattices. In particular, the combination of the DMFT with conventional methods for the calculation of electronic band structures has led to a powerful numerical approach which allows one to explore the properties of correlated materials. In this introductory article we discuss the foundations of the DMFT, derive the underlying self-consistency equations, and present several applications which have provided important insights into the properties of correlated matter. Comment: Chapter in "Theoretical Methods for Strongly Correlated Systems", edited by A. Avella and F. Mancini, Springer (2011), 31 pages, 5 figures.
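
    To make the structure of those self-consistency equations concrete, the sketch below iterates the single-band DMFT loop for the half-filled Hubbard model on the Bethe lattice, where the hybridization is fixed by the local Green's function, using second-order (IPT-style) perturbation theory as a crude impurity solver. This is an illustrative toy under those assumptions, not code from the chapter, and all parameters are arbitrary.

        # Minimal DMFT self-consistency loop: half-filled Hubbard model, Bethe lattice,
        # second-order (IPT-style) impurity solver. Illustrative parameters only.
        import numpy as np

        D, U, beta = 1.0, 2.5, 16.0   # half-bandwidth, interaction, inverse temperature
        Niw, Ntau  = 256, 4096        # positive Matsubara frequencies / imaginary-time points
        mix        = 0.5              # linear mixing of the self-energy

        wn   = (2 * np.arange(Niw) + 1) * np.pi / beta   # positive fermionic frequencies
        iw   = 1j * wn
        tau  = np.linspace(0.0, beta, Ntau)
        dtau = beta / (Ntau - 1)
        w_trap = np.full(Ntau, dtau); w_trap[[0, -1]] = dtau / 2   # trapezoid weights
        k_to_tau = np.exp(-1j * np.outer(tau, wn))   # e^{-i w_n tau}
        k_to_iw  = np.exp(+1j * np.outer(wn, tau))   # e^{+i w_n tau}

        def bethe_gloc(zeta):
            """Local Green's function G(z) = 2(z - sqrt(z^2 - D^2))/D^2 of the
            semicircular DOS, branch chosen so Im G < 0 in the upper half-plane."""
            s = np.sqrt(zeta ** 2 - D ** 2)
            s = np.where(s.imag * zeta.imag > 0, s, -s)
            return 2.0 * (zeta - s) / D ** 2

        def to_tau(g_iw):
            """Matsubara -> imaginary time, subtracting the 1/(i w_n) tail,
            whose transform is -1/2 for 0 < tau < beta."""
            return -0.5 + (2.0 / beta) * np.real(k_to_tau @ (g_iw - 1.0 / iw))

        def to_iw(f_tau):
            """Imaginary time -> Matsubara by trapezoidal integration."""
            return k_to_iw @ (w_trap * f_tau)

        sigma = np.zeros(Niw, dtype=complex)
        for it in range(200):
            g_loc = bethe_gloc(iw - sigma)        # chemical potential and Hartree term cancel at half filling
            g0    = 1.0 / (1.0 / g_loc + sigma)   # Weiss field: G0^{-1} = G_loc^{-1} + Sigma
            # Second-order self-energy; at particle-hole symmetry G0(beta - tau) = G0(tau),
            # so Sigma(tau) = U^2 G0(tau)^3:
            sigma_new = U ** 2 * to_iw(to_tau(g0) ** 3)
            if np.max(np.abs(sigma_new - sigma)) < 1e-6:
                break
            sigma = mix * sigma_new + (1.0 - mix) * sigma

        print(f"stopped after {it + 1} iterations, Im G(i w_0) = {g_loc[0].imag:.3f}")

    The loop alternates between the lattice step (local Green's function and Weiss field) and the impurity step (new self-energy), which is the generic structure of any DMFT calculation; a production code would replace the second-order solver with QMC, exact diagonalization, or a more careful IPT implementation.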