Temporal trends in mode, site and stage of presentation with the introduction of colorectal cancer screening: a decade of experience from the West of Scotland
background: Population colorectal cancer screening programmes have been introduced to reduce cancer-specific mortality through the detection of early-stage disease. The present study aimed to examine the impact of screening introduction in the West of Scotland.
methods: Data on all patients with a diagnosis of colorectal cancer between January 2003 and December 2012 were extracted from a prospectively maintained regional audit database. Changes in mode, site and stage of presentation before, during and after screening introduction were examined.
results: In a population of 2.4 million, over a 10-year period, 14 487 incident cases of colorectal cancer were noted. Of these, 7827 (54%) were males and 7727 (53%) were socioeconomically deprived. In the postscreening era, 18% were diagnosed via the screening programme. There was a reduction in both emergency presentation (20% prescreening vs 13% postscreening, P < 0.001) and the proportion of rectal cancers (34% prescreening vs 31% postscreening, P < 0.001) over the timeframe. Within non-metastatic disease, an increase in the proportion of stage I tumours at diagnosis was noted (17% prescreening vs 28% postscreening, P < 0.001).
conclusions: Within non-metastatic disease, a shift towards earlier stage at diagnosis has accompanied the introduction of a national screening programme. Such a change should lead to improved outcomes in patients with colorectal cancer.
Erlotinib in patients with previously irradiated, recurrent brain metastases from non-small cell lung cancer: Two case reports
Background: With the current improvements in primary lung care, the long-term control of brain metastases becomes a clinical challenge. No established therapeutic approaches exist for cranial relapse after response to previous radiotherapy and systemic therapy. Tyrosine kinase inhibitors like erlotinib, with its proven activity in non-small cell lung cancer, may provide clinical benefits in such patients. Patients and Methods: Two case reports are presented illustrating the efficacy of erlotinib in patients with recurrent brain metastases and parallel thoracic progression. Results: Both patients showed lasting partial remissions in the brain and lung, and clinical symptom improvement. Conclusion: The observed survival times of above 18 and 15 months, respectively, since occurrence of cranial disease manifestation, in line with the achieved progression-free survival times of 9 and 6 months under the erlotinib third-line therapy, are remarkable. The use of targeted therapies after whole-brain irradiation should be investigated more systematically in prospective clinical trials.
Using reciprocity for relating the simulation of transcranial current stimulation to the EEG forward problem
To explore the relationship between transcranial current stimulation (tCS) and the electroencephalography (EEG) forward problem, we investigate and compare accuracy and efficiency of a reciprocal and a direct EEG forward approach for dipolar primary current sources both based on the finite element method (FEM), namely the adjoint approach (AA) and the partial integration approach in conjunction with a transfer matrix concept (PI). By analyzing numerical results, comparing to analytically derived EEG forward potentials and estimating computational complexity in spherical shell models, AA turns out to be essentially identical to PI. It is then proven that AA and PI are also algebraically identical even for general head models. This relation offers a direct link between the EEG forward problem and tCS. We then demonstrate how the quasi-analytical EEG forward solutions in sphere models can be used to validate the numerical accuracies of FEM-based tCS simulation approaches. These approaches differ with respect to the ease with which they can be employed for realistic head modeling based on MRI-derived segmentations. We show that while the accuracy of the easiest-to-realize approach, based on regular hexahedral elements, is already quite high, it can be significantly improved if a geometry-adaptation of the elements is employed in conjunction with an isoparametric FEM approach. While the latter approach does not involve any additional difficulties for the user, it reaches the high accuracies of surface-segmentation based tetrahedral FEM, which is considerably more difficult to implement and topologically less flexible in practice. Finally, in a highly realistic head volume conductor model and when compared to the regular alternative, the geometry-adapted hexahedral FEM is shown to result in significant changes in tCS current flow orientation and magnitude of up to 45° and a factor of 1.66, respectively.
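The reciprocity link between stimulation and measurement that this abstract exploits can be illustrated on a toy discretized volume conductor. The sketch below is a stand-in for a FEM stiffness system, not the authors' actual solver: the head is reduced to a tiny resistor network with a symmetric conductance matrix, and we check that the potential at one electrode from current injected at another is unchanged when the two roles are swapped. All node counts and conductance values are invented for illustration.

```python
# Toy illustration of reciprocity in a discretized volume conductor,
# modeled as a small resistor network (stand-in for a FEM stiffness matrix).
# All conductance values below are made up for illustration.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b_i] for row, b_i in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Symmetric conductance matrix of a 4-node network with node 3 grounded;
# only nodes 0-2 remain as unknowns.
G = [
    [3.0, -1.0, -1.0],
    [-1.0, 2.5, -0.5],
    [-1.0, -0.5, 3.5],
]

def potential(inject, measure):
    """Potential at `measure` for unit current injected at `inject`
    (returned through the grounded node)."""
    b = [0.0, 0.0, 0.0]
    b[inject] = 1.0
    return solve(G, b)[measure]

# Reciprocity: because G is symmetric, injecting at node 0 and measuring
# at node 2 gives the same value as injecting at 2 and measuring at 0.
v_02 = potential(0, 2)
v_20 = potential(2, 0)
print(abs(v_02 - v_20) < 1e-12)  # True
```

This symmetry of the system matrix is the discrete form of the reciprocity principle that allows an EEG forward (lead-field) computation to be reused for predicting tCS current flow, and vice versa.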
Degeneracy: a link between evolvability, robustness and complexity in biological systems
A full accounting of biological robustness remains elusive; both in terms of the mechanisms by which robustness is achieved and the forces that have caused robustness to grow over evolutionary time. Although its importance to topics such as ecosystem services and resilience is well recognized, the broader relationship between robustness and evolution is only starting to be fully appreciated. A renewed interest in this relationship has been prompted by evidence that mutational robustness can play a positive role in the discovery of adaptive innovations (evolvability) and evidence of an intimate relationship between robustness and complexity in biology.
This paper offers a new perspective on the mechanics of evolution and the origins of complexity, robustness, and evolvability. Here we explore the hypothesis that degeneracy, a partial overlap in the functioning of multi-functional components, plays a central role in the evolution and robustness of complex forms. In support of this hypothesis, we present evidence that degeneracy is a fundamental source of robustness, it is intimately tied to multi-scaled complexity, and it establishes conditions that are necessary for system evolvability
A new modelling approach of evaluating preventive and reactive strategies for mitigating supply chain risks
Supply chains are becoming more complex and vulnerable due to globalization and the interdependency between different risks. Existing studies have focused on identifying different preventive and reactive strategies for mitigating supply chain risks and advocating the need to adopt a specific strategy in a particular situation. However, current research has not addressed the issue of evaluating an optimal mix of preventive and reactive strategies taking into account their relative costs and benefits within the supply network setting of interconnected firms and organizations. We propose a new modelling approach for evaluating different combinations of such strategies using Bayesian belief networks. This technique helps determine an optimal solution on the basis of the maximum improvement in the network's expected loss. We demonstrate our approach through a simulation study and discuss practical and managerial implications.
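A minimal numerical sketch of the evaluation idea follows. All probabilities, losses and strategy costs are invented for illustration, and the belief network is collapsed to two independent risk events (the paper's actual model is a full Bayesian network over interconnected firms), but the selection criterion is the same: pick the strategy combination that maximizes the improvement in expected loss, net of strategy cost.

```python
# Toy sketch: evaluate combinations of preventive and reactive mitigation
# strategies by the improvement they yield in expected loss. All numbers
# are invented; the "network" is two independent risk events.
from itertools import product

# Baseline risk events: (probability of occurrence, loss if it occurs)
risks = {"supplier_failure": (0.10, 500_000.0),
         "transport_delay":  (0.20, 100_000.0)}

# Candidate strategies: cost, plus the effect each has on one risk.
# Preventive strategies cut the probability; reactive ones cut the loss.
strategies = {
    "dual_sourcing":  {"cost": 20_000.0, "risk": "supplier_failure",
                       "prob_factor": 0.4, "loss_factor": 1.0},
    "safety_stock":   {"cost": 5_000.0,  "risk": "transport_delay",
                       "prob_factor": 1.0, "loss_factor": 0.3},
    "expedited_ship": {"cost": 30_000.0, "risk": "transport_delay",
                       "prob_factor": 0.5, "loss_factor": 0.8},
}

def expected_loss(active):
    """Expected loss of the network plus the cost of the active strategies."""
    total = 0.0
    for name, (p, loss) in risks.items():
        for s in active:
            spec = strategies[s]
            if spec["risk"] == name:
                p *= spec["prob_factor"]
                loss *= spec["loss_factor"]
        total += p * loss
    return total + sum(strategies[s]["cost"] for s in active)

baseline = expected_loss([])
best = min(
    (tuple(name for name, on in zip(strategies, bits) if on)
     for bits in product([0, 1], repeat=len(strategies))),
    key=expected_loss,
)
print(best, baseline - expected_loss(best))
```

With these made-up numbers the exhaustive search selects a mix of one preventive and one reactive strategy; in the paper's setting the same criterion is evaluated on the belief network's expected loss rather than on independent events.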
Phred-Phrap package to analyses tools: a pipeline to facilitate population genetics re-sequencing studies
BACKGROUND: Targeted re-sequencing is one of the most powerful and widely used strategies for population genetics studies because it allows an unbiased screening for variation that is suitable for a wide variety of organisms. Examples of studies that require re-sequencing data are evolutionary inferences, epidemiological studies designed to capture rare polymorphisms responsible for complex traits and screenings for mutations in families and small populations with high incidences of specific genetic diseases. Despite the advent of next-generation sequencing technologies, Sanger sequencing is still the most popular approach in population genetics studies because of the widespread availability of automatic sequencers based on capillary electrophoresis and because it is still less prone to sequencing errors, which is critical in population genetics studies. Two popular software applications for re-sequencing studies are Phred-Phrap-Consed-Polyphred, which performs base calling, alignment, graphical editing and genotype calling, and DNAsp, which performs a set of population genetics analyses. These independent tools are the start and end points of basic analyses. In between the use of these tools, there is a set of basic but error-prone tasks to be performed with re-sequencing data.
RESULTS: In order to assist with these intermediate tasks, we developed a pipeline that facilitates data handling typical of re-sequencing studies. Our pipeline: (1) consolidates different outputs produced by distinct Phred-Phrap-Consed contigs sharing a reference sequence; (2) checks for genotyping inconsistencies; (3) reformats genotyping data produced by Polyphred into a matrix of genotypes with individuals as rows and segregating sites as columns; (4) prepares input files for haplotype inferences using the popular software PHASE; and (5) handles PHASE output files that contain only polymorphic sites to reconstruct the inferred haplotypes including polymorphic and monomorphic sites as required by population genetics software for re-sequencing data such as DNAsp.
CONCLUSION: We tested the pipeline in re-sequencing studies of haploid and diploid data in humans, plants, animals and microorganisms and observed that it allowed a substantial decrease in the time required for sequencing analyses, as well as being a more controlled process that eliminates several classes of error that may occur when handling datasets. The pipeline is also useful for investigators using other tools for sequencing and population genetics analyses.
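As an illustration of step (5) of the pipeline described above, the sketch below restores monomorphic sites from a reference sequence around PHASE-style haplotypes that cover the polymorphic sites only. The input representations are simplified stand-ins, not the real PHASE or DNAsp file formats.

```python
# Minimal sketch of step (5): PHASE emits haplotypes over polymorphic
# sites only, so monomorphic sites must be restored from a reference
# sequence before population genetics tools can consume full-length
# sequences. Formats here are simplified stand-ins.

def expand_haplotype(reference, polymorphic_positions, alleles):
    """Rebuild a full-length haplotype from PHASE-style output.

    reference             -- full reference sequence (str)
    polymorphic_positions -- 0-based positions of the segregating sites
    alleles               -- alleles this haplotype carries at those sites
    """
    seq = list(reference)
    for pos, allele in zip(polymorphic_positions, alleles):
        seq[pos] = allele
    return "".join(seq)

reference = "ACGTACGTAC"
positions = [1, 4, 8]               # segregating sites (illustrative)
phase_haplotypes = ["TGT", "CAA"]   # one row per inferred haplotype

full = [expand_haplotype(reference, positions, h) for h in phase_haplotypes]
print(full)  # ['ATGTGCGTTC', 'ACGTACGTAC']
```

In practice this step must also carry over sample identifiers and handle missing data, which is exactly the kind of basic but error-prone bookkeeping the pipeline automates.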
Characterizing Interdisciplinarity of Researchers and Research Topics Using Web Search Engines
Researchers' networks have been subject to active modeling and analysis.
Earlier literature mostly focused on citation or co-authorship networks
reconstructed from annotated scientific publication databases, which have
several limitations. Recently, general-purpose web search engines have also
been utilized to collect information about social networks. Here we
reconstructed, using web search engines, a network representing the relatedness
of researchers to their peers as well as to various research topics.
Relatedness between researchers and research topics was characterized by the
visibility boost: the increase in a researcher's visibility when focusing on a
particular topic. It was observed that researchers who had high visibility
boosts by the same research topic tended to be close to each other in their
network. We calculated correlations between visibility boosts by research
topics and researchers' interdisciplinarity at individual level (diversity of
topics related to the researcher) and at social level (his/her centrality in
the researchers' network). We found that visibility boosts by certain research
topics were positively correlated with researchers' individual-level
interdisciplinarity despite their negative correlations with the general
popularity of researchers. It was also found that visibility boosts by
network-related topics had positive correlations with researchers' social-level
interdisciplinarity. Research topics' correlations with researchers'
individual- and social-level interdisciplinarities were found to be nearly
independent from each other. These findings suggest that the notion of
"interdisciplinarity" of a researcher should be understood as a
multi-dimensional concept that should be evaluated using multiple assessment
means.
Comment: 20 pages, 7 figures. Accepted for publication in PLoS ONE
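The abstract does not spell out how the visibility boost is computed, so the following is only one plausible reading of it: a researcher's share of search-engine hits within a topic, divided by their overall share of hits, so that values above 1 indicate a topic that boosts that researcher's visibility. All hit counts below are invented.

```python
# Illustrative sketch of a "visibility boost" measure in the spirit of
# the abstract above. The exact formula is not given there; this assumed
# definition compares a researcher's hit share within a topic to their
# overall hit share. All counts are invented.

hits = {  # hits[(researcher, topic)]; topic None = unrestricted query
    ("alice", None): 1000, ("alice", "networks"): 300, ("alice", "genomics"): 20,
    ("bob",   None): 5000, ("bob",   "networks"): 100, ("bob",   "genomics"): 900,
}

def visibility_boost(researcher, topic):
    total_all = sum(v for (r, t), v in hits.items() if t is None)
    total_topic = sum(v for (r, t), v in hits.items() if t == topic)
    share_all = hits[(researcher, None)] / total_all
    share_topic = hits[(researcher, topic)] / total_topic
    return share_topic / share_all

print(round(visibility_boost("alice", "networks"), 2))  # 4.5
```

Under this reading, "alice" is strongly boosted by the topic "networks" despite lower overall popularity than "bob", mirroring the abstract's finding that boosts can correlate negatively with general popularity.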
Towards Open and Equitable Access to Research and Knowledge for Development
Leslie Chan and colleagues discuss the value of open access not just for access
to health information, but also for transforming structural inequity in current
academic reward systems and for valuing scholarship from the South.
Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems
A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one single functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
LHC Discovery Potential for Non-Standard Higgs Bosons in the 3b Channel
In a variety of well motivated models, such as two Higgs Doublet Models
(2HDMs) and the Minimal Supersymmetric Standard Model (MSSM), there are neutral
Higgs bosons that have significantly enhanced couplings to b-quarks and tau
leptons in comparison to those of the SM Higgs. These so called non-standard
Higgs bosons could be copiously produced at the LHC in association with b
quarks, and subsequently decay into b-quark pairs. However, this production
channel suffers from large irreducible QCD backgrounds. We propose a new search
strategy for non-standard neutral Higgs bosons at the 7 TeV LHC in the 3b's
final state topology. We perform a simulation of the signal and backgrounds,
using state of the art tools and methods for different sets of selection cuts,
and conclude that neutral Higgs bosons with couplings to b-quarks of about 0.3
or larger, and masses up to 400 GeV, could be seen with a luminosity of 30
fb^{-1}. In the case of the MSSM we also discuss the complementarity between
the 3b channel and the inclusive tau pair channel in exploring the
supersymmetric parameter space.
Comment: 14 pages, 3 figures, 4 tables, references added, published version