85 research outputs found

    Subsonic turbulence in smoothed particle hydrodynamics and moving-mesh simulations

    Highly supersonic, compressible turbulence is thought to be of paramount importance for star formation processes in the interstellar medium. Likewise, cosmic structure formation is expected to give rise to subsonic turbulence in the intergalactic medium, which may substantially modify the thermodynamic structure of gas in virialized dark matter halos and affect small-scale mixing processes in the gas. Numerical simulations have played a key role in characterizing the properties of astrophysical turbulence, but thus far systematic code comparisons have been restricted to the supersonic regime, leaving it unclear whether subsonic turbulence is faithfully represented by the numerical techniques commonly employed in astrophysics. Here we focus on comparing the accuracy of smoothed particle hydrodynamics (SPH) and our new moving-mesh technique AREPO in simulations of driven subsonic turbulence. To make contact with previous results, we also analyze simulations of transonic and highly supersonic turbulence. We find that the widely employed standard formulation of SPH yields problematic results in the subsonic regime. Instead of building up a Kolmogorov-like turbulent cascade, large-scale eddies are quickly damped close to the driving scale and decay into small-scale velocity noise. Reduced viscosity settings improve the situation, but the shape of the dissipation range differs compared with expectations for a Kolmogorov cascade. In contrast, our moving-mesh technique does yield power-law scaling laws for the power spectra of velocity, vorticity and density, consistent with expectations for fully developed isotropic turbulence. We show that large errors in SPH's gradient estimate and the associated subsonic velocity noise are ultimately responsible for producing inaccurate results in the subsonic regime. In contrast, SPH's performance is much better for supersonic turbulence. [Abridged] Comment: 22 pages, 20 figures, accepted in MNRAS. Includes a rebuttal to arXiv:1111.1255 of D. Price and significant revisions to address referee comments. Conclusions of original submission unchanged.
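The Kolmogorov cascade against which the abstract benchmarks both codes predicts an inertial-range energy spectrum E(k) = C_K ε^(2/3) k^(-5/3). As a minimal illustration (not the paper's analysis code; the Kolmogorov constant and wavenumbers below are generic placeholders), the expected log-log slope of such a spectrum can be checked numerically:

```python
import math

def kolmogorov_spectrum(k, eps, c_k=1.6):
    """Model inertial-range energy spectrum E(k) = C_K * eps^(2/3) * k^(-5/3)
    for fully developed, isotropic, incompressible turbulence."""
    return c_k * eps ** (2.0 / 3.0) * k ** (-5.0 / 3.0)

def log_slope(k1, k2, eps=1.0):
    """Slope of log E versus log k between two wavenumbers."""
    e1 = kolmogorov_spectrum(k1, eps)
    e2 = kolmogorov_spectrum(k2, eps)
    return (math.log(e2) - math.log(e1)) / (math.log(k2) - math.log(k1))

print(round(log_slope(2.0, 64.0), 3))  # -1.667, i.e. the -5/3 inertial-range slope
```

A measured velocity power spectrum that follows this slope between the driving and dissipation scales is the signature of a resolved turbulent cascade; the abstract reports that standard SPH fails to reproduce it in the subsonic regime while the moving-mesh code does.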

    How good are Bayesian belief networks for environmental management? A test with data from an agricultural river catchment

    1. The ecological health of rivers worldwide continues to decline despite increasing effort and investment in river science and management. Bayesian belief networks (BBNs) are increasingly being used as a mechanism for decision-making in river management because they provide a simple visual framework to explore different management scenarios for the multiple stressors that impact rivers. However, most applications of BBN modelling to resource management use expert knowledge and/or limited real data, and fail to accurately assess the ability of the model to make predictions.
    2. We developed a BBN to model ecological condition in a New Zealand river using field/GIS data (from multiple rivers), rather than expert opinion, and assessed its predictive ability on an independent dataset. The developed BBN performed moderately better than a number of other modelling techniques (e.g., artificial neural networks, classification trees, random forest, logistic regression), although model construction was more time-consuming. Thus the predictive ability of BBNs is (in this case at least) on a par with other modelling methods, but the approach is distinctly better for its ability to visually present the data linkages, issues and potential outcomes of management options in real time.
    3. The BBN suggested that management of habitat quality, such as riparian planting, along with the current management focus on limiting nutrient leaching from agricultural land, may be most effective in improving ecological condition.
    4. BBNs can be a powerful and accurate method of effectively portraying the multiple interacting drivers of environmental condition in an easily understood manner. However, most BBN applications fail to appropriately test the model fit prior to use. We believe this lack of testing may seriously undermine their long-term effectiveness in resource management, and recommend that BBNs should be used in conjunction with some measure of uncertainty about model predictions. We have demonstrated this for a BBN of ecological condition in a New Zealand river, shown that model fit is better than that for other modelling techniques, and shown that improving habitat would be as effective as reducing nutrients in improving ecological condition.
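The scenario analysis a BBN supports can be sketched with a toy discrete network. This is purely illustrative: the structure (Nutrients → Condition ← Habitat), the variable names, and every probability below are hypothetical placeholders, not values from the study.

```python
# Toy Bayesian belief network with hypothetical conditional probability tables.
# Structure: Nutrients -> Condition <- Habitat (all variables binary).

P_NUTRIENTS_HIGH = 0.6  # assumed prior: P(nutrient leaching is high)
P_HABITAT_GOOD = 0.3    # assumed prior: P(riparian habitat is good)

# Assumed CPT: P(ecological condition good | nutrients high?, habitat good?)
P_COND_GOOD = {
    (True, True): 0.40,
    (True, False): 0.15,
    (False, True): 0.80,
    (False, False): 0.50,
}

def p_condition_good(p_nut=P_NUTRIENTS_HIGH, p_hab=P_HABITAT_GOOD):
    """Marginal P(condition good), by enumerating the parent states."""
    total = 0.0
    for nut in (True, False):
        for hab in (True, False):
            p_parents = (p_nut if nut else 1 - p_nut) * \
                        (p_hab if hab else 1 - p_hab)
            total += p_parents * P_COND_GOOD[(nut, hab)]
    return total

baseline = p_condition_good()                 # current conditions
nutrient_mgmt = p_condition_good(p_nut=0.2)   # scenario: reduce leaching
habitat_mgmt = p_condition_good(p_hab=0.8)    # scenario: riparian planting
```

With these made-up numbers, both intervention scenarios raise the probability of good ecological condition relative to the baseline, which is the kind of side-by-side comparison of management options that the abstract highlights as the BBN's main practical strength.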

    N-body simulations of gravitational dynamics

    We describe the astrophysical and numerical basis of N-body simulations, both of collisional stellar systems (dense star clusters and galactic centres) and collisionless stellar dynamics (galaxies and large-scale structure). We explain and discuss the state-of-the-art algorithms used for these quite different regimes, attempt to give a fair critique, and point out possible directions of future improvement and development. We briefly touch upon the history of N-body simulations and their most important results. Comment: invited review (28 pages), to appear in European Physical Journal Plus.

    Ketohexokinase-mediated fructose metabolism is lost in hepatocellular carcinoma and can be leveraged for metabolic imaging

    The ability to break down fructose is dependent on ketohexokinase (KHK), which phosphorylates fructose to fructose-1-phosphate (F1P). We show that KHK expression is tightly controlled and limited to a small number of organs and is down-regulated in liver and intestinal cancer cells. Loss of fructose metabolism is also apparent in hepatocellular adenoma and carcinoma (HCC) patient samples. KHK overexpression in liver cancer cells results in decreased fructose flux through glycolysis. We then developed a strategy to detect this metabolic switch in vivo using hyperpolarized magnetic resonance spectroscopy. Uniformly deuterating [2-13C]-fructose and dissolving in D2O increased its spin-lattice relaxation time (T1) fivefold, enabling detection of F1P and its loss in models of HCC. In summary, we posit that in the liver, fructolysis to F1P is lost in the development of cancer and can be used as a biomarker of tissue function in the clinic using metabolic imaging.
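Why a fivefold T1 increase matters for hyperpolarized imaging can be seen from the simple exponential decay of the hyperpolarized signal, S(t) = S0·exp(−t/T1). The sketch below is illustrative only: the baseline T1 and the imaging delay are assumed round numbers, not measurements from the study; only the fivefold factor comes from the abstract.

```python
import math

def signal_fraction(t, t1):
    """Fraction of hyperpolarized signal surviving after time t, assuming
    simple mono-exponential T1 decay: S(t) = S0 * exp(-t / T1)."""
    return math.exp(-t / t1)

t1_baseline = 10.0               # hypothetical baseline T1 in seconds
t1_deuterated = 5.0 * t1_baseline  # fivefold increase reported in the abstract
t_delay = 30.0                   # assumed delay from dissolution to imaging, s

print(signal_fraction(t_delay, t1_baseline))    # ~0.050 of signal left
print(signal_fraction(t_delay, t1_deuterated))  # ~0.549 of signal left
```

Under these assumptions, roughly 5% of the polarization survives the delay at the baseline T1 versus about 55% after the fivefold extension, which is what makes detection of F1P practical on in vivo timescales.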

    Towards the prediction of essential genes by integration of network topology, cellular localization and biological process information

    <p>Abstract</p> <p>Background</p> <p>The identification of essential genes is important for the understanding of the minimal requirements for cellular life and for practical purposes, such as drug design. However, the experimental techniques for essential genes discovery are labor-intensive and time-consuming. Considering these experimental constraints, a computational approach capable of accurately predicting essential genes would be of great value. We therefore present here a machine learning-based computational approach relying on network topological features, cellular localization and biological process information for prediction of essential genes.</p> <p>Results</p> <p>We constructed a decision tree-based meta-classifier and trained it on datasets with individual and grouped attributes-network topological features, cellular compartments and biological processes-to generate various predictors of essential genes. We showed that the predictors with better performances are those generated by datasets with integrated attributes. Using the predictor with all attributes, i.e., network topological features, cellular compartments and biological processes, we obtained the best predictor of essential genes that was then used to classify yeast genes with unknown essentiality status. Finally, we generated decision trees by training the J48 algorithm on datasets with all network topological features, cellular localization and biological process information to discover cellular rules for essentiality. We found that the number of protein physical interactions, the nuclear localization of proteins and the number of regulating transcription factors are the most important factors determining gene essentiality.</p> <p>Conclusion</p> <p>We were able to demonstrate that network topological features, cellular localization and biological process information are reliable predictors of essential genes. 
Moreover, by constructing decision trees based on these data, we could discover cellular rules governing essentiality.</p>
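The kind of cellular rule the decision trees extract can be illustrated with a toy classifier. This is a deliberately simplified sketch, not the J48/meta-classifier pipeline of the paper: the gene records, the interaction-count threshold, and the two-feature rule below are all invented for illustration, loosely following the abstract's finding that interaction count and nuclear localization are the most informative attributes.

```python
# Toy dataset of (physical_interactions, nuclear_localized, essential) triples.
# All values are hypothetical, constructed so the rule below fits perfectly.
genes = [
    (42, True, True),
    (35, True, True),
    (28, True, True),
    (12, False, False),
    (8, False, False),
    (3, False, False),
]

def predict(interactions, nuclear, threshold=20):
    """Single decision rule: call a gene essential when it has many physical
    interactions AND its protein is nuclear-localized (threshold is assumed)."""
    return interactions >= threshold and nuclear

correct = sum(predict(i, n) == e for i, n, e in genes)
accuracy = correct / len(genes)
```

On this contrived sample the rule is perfectly accurate; on real data a full decision tree would combine many such thresholded splits, and its fit would have to be validated on held-out genes, as the paper does.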

    Nuclear Actin and Lamins in Viral Infections

    Lamins are the best characterized cytoskeletal components of the cell nucleus that help to maintain the nuclear shape and participate in diverse nuclear processes including replication or transcription. Nuclear actin is now widely accepted to be another cytoskeletal protein present in the nucleus that fulfills important functions in gene expression. Some viruses replicating in the nucleus evolved the ability to interact with and probably utilize nuclear actin for their replication, e.g., for the assembly and transport of capsids or mRNA export. On the other hand, lamins play a role in the propagation of other viruses since the nuclear lamina may represent a barrier for virions entering or escaping the nucleus. This review will summarize the current knowledge about the roles of nuclear actin and lamins in viral infections.

    Summary of included studies.

    General practice is generally the first point of contact for patients presenting with COVID-19. Since the start of the COVID-19 pandemic, general practitioners (GPs) across Europe have had to adapt to using telemedicine consultations in order to minimise the number of social contacts made. GPs had to balance two needs: preventing the spread of COVID-19, while providing their patients with regular care for other health issues. The aim of this study was to conduct a scoping review of the literature examining the use of telemedicine for delivering routine general practice care since the start of the pandemic from the perspectives of patients and practitioners. The six-stage framework developed by Arksey and O’Malley, with recommendations by Levac et al., was used to review the existing literature. The study selection process was conducted according to the PRISMA Extension for Scoping Reviews guidelines. Braun and Clarke’s ‘Thematic Analysis’ approach was used to interpret data. A total of eighteen studies across nine countries were included in the review. Thirteen studies explored the practitioner perspective of the use of telemedicine in general practice since the COVID-19 pandemic, while five studies looked at the patient perspective. The types of studies included were: qualitative studies, literature reviews, a systematic review, observational studies, quantitative studies, a critical incident technique study, and surveys employing both closed and open styled questions. Key themes identified related to the patient/practitioner experience and knowledge of using telemedicine, patient/practitioner levels of satisfaction, GP collaboration, nature of workload, and suitability of consultations for telemedicine. The nature of general practice was radically changed during the COVID-19 pandemic. Certain patient groups and areas of clinical and administrative work were identified as having performed well, if not better, using telemedicine. Our findings suggest a level of acceptability and satisfaction with telemedicine among GPs and patients during the pandemic; however, further research is warranted in this area.