CRISPR/Cas9 and next-generation sequencing in the personalized treatment of cancer
Background: Cancer is caused by a combination of genetic and epigenetic abnormalities. Current cancer therapies are limited by the complexity of the mechanisms involved, underlining the need for alternative therapeutic approaches. Interestingly, combining the Clustered Regularly Interspaced Short Palindromic Repeats/CRISPR-associated protein 9 (CRISPR/Cas9) system with next-generation sequencing (NGS) has the potential to speed up the identification, validation, and targeting of high-value targets.
Main text: Personalized or precision medicine combines genetic information with phenotypic and environmental characteristics to produce healthcare tailored to the individual, eliminating the constraints of “one-size-fits-all” therapy. Precision medicine is now possible thanks to cancer genome sequencing. Its modest sample requirements and the recent development of biomarkers have made NGS a major advance in personalized medicine. Tumor and cell-free DNA profiling using NGS, proteome and RNA analyses, and a better understanding of immunological systems are all helping to improve cancer treatment choices. Finally, direct targeting of tumor genes in cancer cells with CRISPR/Cas9 may become achievable, making it possible to eliminate the genetic changes that drive tumor growth and metastatic capability.
Conclusion: With NGS and CRISPR/Cas9, the goal is no longer to match a treatment to the diagnosed tumor type but rather to build a treatment that fits the tumor exactly. Hence, in this review, we discuss the potential role of CRISPR/Cas9 and NGS in advancing personalized medicine.
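The workflow sketched in this abstract — using NGS variant calls to nominate genes that CRISPR/Cas9 could then target — can be made concrete in a few lines of code. The Python sketch below is purely illustrative: the variant records, the damaging-effect classes, and the allele-frequency threshold are assumptions made for this example, not data or criteria from the review.

```python
# Toy sketch (not from the review): nominating candidate genes for CRISPR/Cas9
# follow-up from NGS somatic variant calls. Gene names, thresholds, and the
# variant records below are illustrative assumptions, not real patient data.
from collections import Counter

# Minimal stand-in for parsed NGS calls (tumour vs. matched normal).
somatic_variants = [
    {"gene": "KRAS", "effect": "missense", "vaf": 0.41},
    {"gene": "TP53", "effect": "stop_gained", "vaf": 0.38},
    {"gene": "KRAS", "effect": "missense", "vaf": 0.12},
    {"gene": "BRCA1", "effect": "frameshift", "vaf": 0.05},
]

def nominate_targets(variants, min_vaf=0.10,
                     damaging={"missense", "stop_gained", "frameshift"}):
    """Rank genes by the number of damaging, well-supported somatic calls."""
    hits = Counter(
        v["gene"] for v in variants
        if v["effect"] in damaging and v["vaf"] >= min_vaf
    )
    return hits.most_common()

print(nominate_targets(somatic_variants))  # e.g. [('KRAS', 2), ('TP53', 1)]
```

In practice this prioritisation step would draw on curated driver-gene databases and guide-design tools rather than a simple recurrence count; the sketch only shows where NGS output feeds into target selection.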
Use of the bootstrap in analysing cost data from cluster randomised trials: some simulation results
BACKGROUND: This work investigated under what conditions confidence intervals around differences in mean costs from a cluster RCT are better estimated using a commonly used cluster-adjusted bootstrap than using methods based on the Huber-White robust estimator of variance. The bootstrap's main advantage is in dealing with skewed data, which often characterise patient costs. However, it is insufficiently well recognised that one method of adjusting the bootstrap to deal with clustered data is only valid in large samples. In particular, the requirement that the number of clusters randomised should be large would not be satisfied in many cluster RCTs performed to date.
METHODS: The performances of confidence intervals for simple differences in mean costs, based on a robust (cluster-adjusted) standard error and on two cluster-adjusted bootstrap procedures, were compared in terms of confidence interval coverage in a large number of simulations. Parameters varied included the intracluster correlation coefficient, the sample size and the distributions used to generate the data.
RESULTS: The bootstrap's advantage in dealing with skewed data was found to be outweighed by its poor confidence interval coverage when the number of clusters was at the level frequently found in cluster RCTs in practice. Simulations showed that confidence intervals based on robust methods of standard error estimation achieved coverage rates between 93.5% and 94.8% for a 95% nominal level, whereas those for the bootstrap ranged between 86.4% and 93.8%.
CONCLUSION: In general, 24 clusters per treatment arm is probably the minimum number for which one would even begin to consider the bootstrap in preference to traditional robust methods, for the parameter combinations investigated here. At least this number of clusters, together with extremely skewed data, would be necessary for the bootstrap to be considered in favour of the robust method. There is a need for further investigation of more complex bootstrap procedures if economic data from cluster RCTs are to be analysed appropriately.
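To make the comparison concrete, the sketch below simulates skewed, clustered cost data and constructs two confidence intervals for the difference in mean costs: a percentile interval from a bootstrap that resamples whole clusters, and a simple interval based on cluster-level means as a stand-in for the robust (Huber-White) approach used in the paper. All distributions, parameter values, and the normal-approximation interval are assumptions for illustration, not the study's actual simulation design.

```python
# Illustrative sketch (assumptions, not the paper's code): a cluster-resampling
# bootstrap percentile CI versus a simple cluster-level CI for the difference
# in mean costs between two arms of a cluster RCT, using simulated skewed costs.
import numpy as np

rng = np.random.default_rng(0)

def simulate_arm(n_clusters, patients_per_cluster, mean_shift=0.0, cluster_sd=0.3):
    """One arm: log-normal patient costs with a cluster-level random effect."""
    clusters = []
    for _ in range(n_clusters):
        u = rng.normal(0.0, cluster_sd)                  # cluster effect
        costs = np.exp(rng.normal(7.0 + mean_shift + u, 0.8, patients_per_cluster))
        clusters.append(costs)
    return clusters

def diff_in_means(arm_a, arm_b):
    return np.concatenate(arm_a).mean() - np.concatenate(arm_b).mean()

def cluster_bootstrap_ci(arm_a, arm_b, n_boot=2000, alpha=0.05):
    """Resample whole clusters with replacement within each arm (percentile CI)."""
    diffs = []
    for _ in range(n_boot):
        resampled_a = [arm_a[i] for i in rng.integers(0, len(arm_a), len(arm_a))]
        resampled_b = [arm_b[i] for i in rng.integers(0, len(arm_b), len(arm_b))]
        diffs.append(diff_in_means(resampled_a, resampled_b))
    return np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)])

def cluster_means_ci(arm_a, arm_b):
    """Crude robust alternative: interval based on cluster mean costs."""
    ma = np.array([c.mean() for c in arm_a])
    mb = np.array([c.mean() for c in arm_b])
    diff = ma.mean() - mb.mean()
    se = np.sqrt(ma.var(ddof=1) / len(ma) + mb.var(ddof=1) / len(mb))
    z = 1.96  # normal approximation; a t quantile would be more careful
    return diff - z * se, diff + z * se

arm_a = simulate_arm(10, 30, mean_shift=0.2)
arm_b = simulate_arm(10, 30)
print("observed difference:", diff_in_means(arm_a, arm_b))
print("cluster bootstrap CI:", cluster_bootstrap_ci(arm_a, arm_b))
print("cluster-means CI:   ", cluster_means_ci(arm_a, arm_b))
```

Resampling clusters rather than individual patients is what makes the bootstrap "cluster-adjusted"; with only a handful of clusters per arm, each resample contains very few distinct cluster means, which is the small-sample limitation behind the poor coverage reported above.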
Rose Bengal sensitized bilayered photoanode of nano-crystalline TiO2–CeO2 for dye-sensitized solar cell application
There are two traditional ways to read Kant’s claim that every event necessarily has a cause: the weaker every-event some-cause (WCP) and the stronger same-cause same-effect (SCP) causal principles. The debate on whether and where he subscribes to the SCP has focused on the Analogies in the Critique of Pure Reason (Guyer, Allison, and Watkins) and on the Metaphysical Foundations of Natural Science (Friedman). By analysing the arguments and conclusions of both the Analogies and the Postulates, as well as the two Latin principles non datur casus and non datur fatum that summarise their results, I will argue that the SCP is actually demonstrated in the Postulates section of the First Critique
Coordinated optimization of visual cortical maps (II) Numerical studies
It is an attractive hypothesis that the spatial structure of visual cortical architecture can be explained by the coordinated optimization of multiple visual cortical maps representing orientation preference (OP), ocular dominance (OD), spatial frequency, or direction preference. In part (I) of this study we defined a class of analytically tractable coordinated optimization models and solved representative examples in which a spatially complex organization of the orientation preference map is induced by inter-map interactions. We found that attractor solutions near symmetry breaking threshold predict a highly ordered map layout and require a substantial OD bias for OP pinwheel stabilization. Here we examine in numerical simulations whether such models exhibit biologically more realistic spatially irregular solutions at a finite distance from threshold and when transients towards attractor states are considered. We also examine whether model behavior qualitatively changes when the spatial periodicities of the two maps are detuned and when considering more than 2 feature dimensions. Our numerical results support the view that neither minimal energy states nor intermediate transient states of our coordinated optimization models successfully explain the spatially irregular architecture of the visual cortex. We discuss several alternative scenarios and additional factors that may improve the agreement between model solutions and biological observations.
Comment: 55 pages, 11 figures. arXiv admin note: substantial text overlap with arXiv:1102.335
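As a rough illustration of what "coordinated optimization" of two interacting maps can look like numerically, the toy sketch below evolves a complex orientation-preference field z and a real ocular-dominance field o under Swift-Hohenberg-type dynamics with a simple |z|²o² coupling and an OD bias. The operator, the coupling form, and every parameter value here are assumptions chosen for a minimal runnable example; they are not the models defined in part (I) of the study.

```python
# Toy sketch only: gradient-descent-like dynamics for a complex orientation-
# preference (OP) field z and a real ocular-dominance (OD) field o, coupled
# through an assumed |z|^2 * o^2 energy term on a periodic grid.
import numpy as np

N, dt, steps = 64, 0.05, 2000
kc = 2 * np.pi * 8 / N            # typical map wavenumber (assumed)
beta, bias = 0.5, 0.1             # inter-map coupling strength, OD bias (assumed)

k = 2 * np.pi * np.fft.fftfreq(N)
kx, ky = np.meshgrid(k, k)
L = 0.1 - (kc**2 - (kx**2 + ky**2)) ** 2   # Swift-Hohenberg-type growth rates (Fourier space)

rng = np.random.default_rng(1)
z = 0.01 * (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N)))   # OP map
o = 0.01 * rng.normal(size=(N, N)) + bias                             # OD map

def step(field, nonlinearity):
    """Semi-implicit Euler step: nonlinearity explicit, linear operator implicit."""
    fhat = np.fft.fft2(field + dt * nonlinearity) / (1.0 - dt * L)
    out = np.fft.ifft2(fhat)
    return np.real(out) if np.isrealobj(field) else out

for _ in range(steps):
    nz = -np.abs(z) ** 2 * z - beta * o**2 * z        # saturation + inter-map coupling
    no = -o**3 - beta * np.abs(z) ** 2 * o + bias     # saturation + coupling + OD bias
    z, o = step(z, nz), step(o, no)

# Pinwheels correspond to zeros of z; a crude summary of the resulting layout:
print("mean |z|:", float(np.abs(z).mean()), " OD range:", float(o.min()), float(o.max()))
```

Varying the coupling strength and the OD bias in a sketch like this mimics, in spirit, the kind of parameter exploration the paper performs to ask whether coupled dynamics settle into ordered or spatially irregular layouts.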
Collagen based magnetic nanocomposites for oil removal applications
A stable magnetic nanocomposite of collagen and superparamagnetic iron oxide nanoparticles (SPIONs) is prepared by a simple process utilizing protein wastes from the leather industry. Molecular interaction between helical collagen fibers and spherical SPIONs is established through calorimetric, microscopic and spectroscopic techniques. The nanocomposite exhibits selective oil absorption and magnetic tracking ability, allowing it to be used in oil removal applications. The environmental sustainability of the oil-adsorbed nanobiocomposite is also demonstrated here through its conversion into a bi-functional graphitic nanocarbon material via heat treatment. The approach highlights new avenues for converting bio-wastes into useful nanomaterials in scalable and inexpensive ways.
Diagnostic work-up and loss of tuberculosis suspects in Jogjakarta, Indonesia
Background: Early and accurate diagnosis of pulmonary tuberculosis (TB) is critical for successful TB control. To assist in the diagnosis of smear-negative pulmonary TB, the World Health Organisation (WHO) recommends the use of a diagnostic algorithm. Our study evaluated the implementation of the national tuberculosis programme's diagnostic algorithm in routine health care settings in Jogjakarta, Indonesia. The diagnostic algorithm is based on the WHO TB diagnostic algorithm, which had already been implemented in the health facilities.
Methods: We prospectively documented the diagnostic work-up of all new tuberculosis suspects until a diagnosis was reached. We used clinical audit forms to record each step chronologically. Data on each patient's gender, age, symptoms, examinations (types, dates, and results), and final diagnosis were collected.
Results: Information was recorded for 754 TB suspects; 43.5% of them were lost during the diagnostic work-up in health centres, compared with 0% in lung clinics. Among the TB suspects who completed the diagnostic work-up, 51.1% and 100.0% were diagnosed without following the national TB diagnostic algorithm in health centres and lung clinics, respectively. However, the work-up in the health centres and lung clinics generally conformed to the International Standards for Tuberculosis Care (ISTC). Diagnostic delays were significantly longer in health centres than in lung clinics.
Conclusions: The high rate of patients lost in health centres needs to be addressed through the implementation of TB suspect tracing and better programme supervision. The national TB algorithm needs to be revised and differentiated according to the level of care.
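A minimal sketch of how audit records like those described above might be summarised — the fraction of suspects lost during work-up and the diagnostic delay, split by facility type — is shown below. The field names and the three example records are hypothetical; they are not data from the study, nor its actual analysis code.

```python
# Hypothetical sketch: summarising clinical audit records to estimate loss
# during diagnostic work-up and diagnostic delay by facility type.
from datetime import date
from statistics import median

audit_records = [
    {"facility": "health centre", "first_visit": date(2009, 3, 2),
     "diagnosis_date": date(2009, 3, 20), "completed": True},
    {"facility": "health centre", "first_visit": date(2009, 3, 5),
     "diagnosis_date": None, "completed": False},          # lost during work-up
    {"facility": "lung clinic", "first_visit": date(2009, 3, 7),
     "diagnosis_date": date(2009, 3, 12), "completed": True},
]

def summarise(records, facility):
    """Fraction lost and median days from first visit to diagnosis, per facility type."""
    subset = [r for r in records if r["facility"] == facility]
    lost = sum(1 for r in subset if not r["completed"]) / len(subset)
    delays = [(r["diagnosis_date"] - r["first_visit"]).days
              for r in subset if r["completed"]]
    return {"lost_fraction": lost, "median_delay_days": median(delays)}

for facility in ("health centre", "lung clinic"):
    print(facility, summarise(audit_records, facility))
```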