A multifactorial obesity model developed from nationwide public health exposome data and modern computational analyses
Summary. Statement of the problem: Obesity is both multifactorial and multimodal, making it difficult to identify, unravel and distinguish causative and contributing factors. The lack of a clear model of aetiology hampers the design and evaluation of interventions to prevent and reduce obesity. Methods: Using modern graph-theoretical algorithms, we are able to coalesce and analyse thousands of inter-dependent variables and interpret their putative relationships to obesity. Our modelling differs from traditional approaches: we make no a priori assumptions about the population, and instead model the actual characteristics of a population. Paracliques, noise-resistant collections of highly correlated variables, are differentially distilled from data taken over counties associated with low versus high obesity rates. Factor analysis is then applied and a model is developed. Results and conclusions: Latent variables concentrated around social deprivation, community infrastructure and climate, especially heat stress, were connected to obesity. Infrastructure, environment and community organisation differed between counties with low and high obesity rates. The clear connections of community infrastructure with obesity in our results lead us to conclude that community-level interventions are critical. This effort suggests that it might be useful to study and plan interventions around community organisation and structure, rather than just the individual, to combat the nation's obesity epidemic.
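The abstract does not give the paraclique algorithm itself. As a rough, hedged illustration of the general idea it names (threshold a correlation matrix into a graph, then pull out densely connected groups of variables), here is a minimal pure-Python sketch; the variable names and the greedy clique growth are illustrative stand-ins, not the authors' method:

```python
from itertools import combinations

def correlation(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sx * sy)

def correlation_graph(data, threshold=0.8):
    """Place an edge between variables whose |Pearson r| meets the threshold."""
    names = list(data)
    edges = set()
    for u, v in combinations(names, 2):
        if abs(correlation(data[u], data[v])) >= threshold:
            edges.add(frozenset((u, v)))
    return names, edges

def greedy_clique(names, edges, seed):
    """Grow a clique from a seed variable -- a toy stand-in for paraclique extraction."""
    clique = [seed]
    for v in names:
        if v != seed and all(frozenset((v, c)) in edges for c in clique):
            clique.append(v)
    return sorted(clique)
```

In the study's setting, such variable groups would be extracted separately for low- and high-obesity counties before factor analysis; real paraclique algorithms additionally tolerate a bounded number of missing edges, which this sketch omits.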
Experimental study and analysis of lubricants dispersed with nano Cu and TiO2 in a four-stroke two wheeler
The present investigation summarizes detailed experimental studies with a standard commercial lubricant, Racer-4 of Hindustan Petroleum Corporation (India), dispersed with different mass concentrations of nanoparticles of Cu and TiO2. The test bench is fabricated with a four-stroke Hero-Honda motorbike hydraulically loaded at the rear wheel, with proper instrumentation to record the fuel consumption, the load on the rear wheel, and the linear velocity. The whole range of data obtained on a stationary bike is subjected to regression analysis to arrive at various relationships expressing fuel consumption as a function of brake power, linear velocity, and percentage mass concentration of nanoparticles in the lubricant. The empirical relation correlates with the observed data with reasonable accuracy. Further, extension of the analysis by developing a mathematical model has revealed a definite improvement in brake thermal efficiency, which ultimately improves fuel economy by diminishing frictional power in the system with the introduction of nanoparticles into the lubricant. The performance of the engine appears better with the nano Cu-Racer-4 combination than with nano TiO2.
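The abstract reports a regression of fuel consumption on brake power, velocity, and nanoparticle concentration but does not give the fitted form. A generic multiple-linear-regression sketch in pure Python (ordinary least squares via the normal equations; the coefficients and data below are hypothetical, not the paper's):

```python
def fit_linear(X, y):
    """Ordinary least squares with intercept, solved by Gaussian elimination.

    X: list of predictor tuples, e.g. (brake_power, velocity, concentration).
    y: list of responses, e.g. fuel consumption.
    Returns [intercept, coef_1, coef_2, ...].
    """
    rows = [[1.0] + list(x) for x in X]          # prepend intercept column
    p = len(rows[0])
    # Normal equations: (X^T X) beta = X^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # Gaussian elimination with partial pivoting
    for i in range(p):
        piv = max(range(i, p), key=lambda k: abs(A[k][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for k in range(i + 1, p):
            f = A[k][i] / A[i][i]
            for j in range(i, p):
                A[k][j] -= f * A[i][j]
            b[k] -= f * b[i]
    coef = [0.0] * p
    for i in range(p - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, p))) / A[i][i]
    return coef
```

Fitting such a model to bench data and inspecting the sign of the concentration coefficient is one way the reported fuel-economy effect of the nanoparticle additive could be quantified.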
Chemical Power for Microscopic Robots in Capillaries
The power available to microscopic robots (nanorobots) that oxidize
bloodstream glucose while aggregated in circumferential rings on capillary
walls is evaluated with a numerical model using axial symmetry and
time-averaged release of oxygen from passing red blood cells. Robots about one
micron in size can produce up to several tens of picowatts, in steady-state, if
they fully use oxygen reaching their surface from the blood plasma. Robots with
pumps and tanks for onboard oxygen storage could collect oxygen to support
burst power demands two to three orders of magnitude larger. We evaluate
effects of oxygen depletion and local heating on surrounding tissue. These
results give the power constraints when robots rely entirely on ambient
available oxygen and identify aspects of the robot design significantly
affecting available power. More generally, our numerical model provides an
approach to evaluating robot design choices for nanomedicine treatments in and
near capillaries.
A GPU implementation of a track-repeating algorithm for proton radiotherapy dose calculations
An essential component in proton radiotherapy is the algorithm to calculate
the radiation dose to be delivered to the patient. The most common dose
algorithms are fast but they are approximate analytical approaches. However
their level of accuracy is not always satisfactory, especially for
heterogeneous anatomic areas, like the thorax. Monte Carlo techniques provide
superior accuracy, however, they often require large computation resources,
which render them impractical for routine clinical use. Track-repeating
algorithms, for example the Fast Dose Calculator, have shown promise for
achieving the accuracy of Monte Carlo simulations for proton radiotherapy dose
calculations in a fraction of the computation time. We report on the
implementation of the Fast Dose Calculator for proton radiotherapy on a card
equipped with graphics processor units (GPU) rather than a central processing
unit architecture. This implementation reproduces the full Monte Carlo and
CPU-based track-repeating dose calculations within 2%, while achieving a
statistical uncertainty of 2% in less than one minute utilizing one single GPU
card, which should allow accurate dose calculations in real time.
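The Fast Dose Calculator's internals are not described in this abstract. As a drastically simplified, hedged 1-D illustration of the general track-repeating idea (replay a proton track precomputed in water through patient voxels, rescaling step lengths by each voxel's relative stopping power), with all names and numbers hypothetical:

```python
def repeat_track(track, densities, voxel_size=1.0):
    """Replay a precomputed water track through a 1-D voxel column.

    track: list of (step_length_in_water, energy_deposit) pairs from a
           one-off Monte Carlo simulation in water.
    densities: relative stopping power per voxel (water = 1.0).
    Returns energy deposited per voxel.
    """
    dose = [0.0] * len(densities)
    pos = 0.0
    for step, edep in track:
        idx = int(pos // voxel_size)
        if idx >= len(densities):
            break                      # track exits the column
        rsp = densities[idx]
        pos += step / rsp              # denser material shortens the water step
        dose[idx] += edep              # deposit the step's energy in this voxel
    return dose
```

A GPU port maps naturally onto this structure: each thread replays one precomputed track through the voxel grid independently, with atomic adds into the shared dose array.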
Religiosity, financial risk taking, and reward processing: an experimental study
The present study investigated the extent to which financial risk-taking (FRT) perspectives and religiosity influenced an individual's performance on financial decision-making tasks under risk and/or uncertainty. It further investigated the potential to measure this interaction using electroencephalogram (EEG) assessments through reward-related event-related potentials (P3 and FRN). EEG data were collected from 37 participants undergoing four decision-making tasks: the Balloon Analogue Risk Task (BART), Iowa Gambling Test (IGT), Mixed-Gamble Loss-Aversion Task (MGLAT), and MGLA-Success Task (MGLAST). The present study found that BART performance may be affected by an interaction of FRT perspectives and religiosity. The physiological effects of task feedback were also distinguished objectively between religious and non-religious individuals using EEG data. Overall, while religiosity and FRT may not significantly influence IGT and MGLA performance, and interact with BART in a complex way, physiological reaction to feedback after BART performance appears to be strongly affected by religiosity and FRT perspectives.
Mononuclear metal phosphinates with ancillary pyrazole ligands. Synthesis and X-ray crystal structures of [M(Ph₂PO₂)₂(3,5-DMPZ)₂] (M = Co, Zn)
The reaction of diphenylphosphinic acid, [Ph₂P(O)(OH)], with CoCl₂ or ZnCl₂ and 3,5-dimethylpyrazole (DMPZ) in the presence of triethylamine affords the mononuclear phosphinates M(Ph₂PO₂)₂(DMPZ)₂ (M = Co (1), Zn (2)). The molecular structures of 1 and 2 reveal a tetrahedral environment around the metal. The phosphinate ligands are monoanionic and bind to the metal in a monodentate manner, while the pyrazole ligands act as neutral monodentate ligands. Intermolecular C–H···O and π–π interactions in 1 and 2 result in the generation of two-dimensional supramolecular polymeric sheets in the solid state.
Parallel Adaptive Quantum Trajectory Method for Wavepacket Simulations
Time-dependent wavepackets are widely used to model various phenomena in physics. One approach to simulating wavepacket dynamics is the quantum trajectory method (QTM). Based on the hydrodynamic formulation of quantum mechanics, the QTM represents the wavepacket by an unstructured set of pseudoparticles whose trajectories are coupled by the quantum potential. The governing equations for the pseudoparticle trajectories are solved using a computationally intensive moving weighted least squares (MWLS) algorithm, and the trajectories can be computed in parallel. This paper contributes a strategy for improving the performance of wavepacket simulations using the QTM. Specifically, adaptivity is incorporated into the MWLS algorithm, and loop scheduling techniques are employed to dynamically load balance the parallel computation of the trajectories. The adaptive MWLS algorithm reduces the amount of computation without sacrificing accuracy, while adaptive loop scheduling addresses the load imbalance introduced by the algorithm and the runtime system. Results of experiments on a Linux cluster confirm that adaptive MWLS reduces the trajectory computation time by up to 24%, and adaptive loop scheduling achieves parallel efficiencies of up to 85% when simulating a free particle.
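The MWLS details are not spelled out in the abstract. As a hedged one-dimensional sketch of the general moving-weighted-least-squares idea (a Gaussian-weighted local linear fit around each evaluation point; the actual QTM fits higher-order bases over pseudoparticle neighbours to obtain derivatives for the quantum potential):

```python
import math

def mwls_fit(xs, fs, x0, bandwidth=1.0):
    """Gaussian-weighted local linear fit f ~ a + b*(x - x0) around x0.

    xs, fs: pseudoparticle positions and field values.
    Returns (value, derivative) of the local model at x0.
    """
    w = [math.exp(-((x - x0) / bandwidth) ** 2) for x in xs]
    # Weighted normal equations for the two coefficients a and b
    s0 = sum(w)
    s1 = sum(wi * (x - x0) for wi, x in zip(w, xs))
    s2 = sum(wi * (x - x0) ** 2 for wi, x in zip(w, xs))
    t0 = sum(wi * f for wi, f in zip(w, fs))
    t1 = sum(wi * f * (x - x0) for wi, f, x in zip(w, fs, xs))
    det = s0 * s2 - s1 * s1
    a = (t0 * s2 - t1 * s1) / det
    b = (s0 * t1 - s1 * t0) / det
    return a, b
```

Because each pseudoparticle's fit involves a different, position-dependent neighbour set, per-trajectory costs vary, which is the load imbalance the paper's adaptive loop scheduling targets.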