    Systematic clustering of transcription start site landscapes

    Get PDF
    Genome-wide, high-throughput methods for transcription start site (TSS) detection have shown that most promoters have an array of neighboring TSSs, some used more than others, forming a distribution of initiation propensities. TSS distributions (TSSDs) vary widely between promoters, and earlier studies have shown that TSSDs have biological implications for both regulation and function. However, no systematic study has explored how many types of TSSDs, and by extension core promoters, exist, or which biological features distinguish them. In this study, we developed a new non-parametric dissimilarity measure and clustering approach to explore the similarities and stabilities of clusters of TSSDs. Previous studies have used arbitrary thresholds to arrive at two general classes: broad and sharp. We demonstrate that, in addition to this broad/sharp dichotomy, a third category of promoters exists. Unlike typical TATA-driven sharp TSSDs, where the TSS position can vary by a few nucleotides, in this category virtually all TSSs originate from the same genomic position. These promoters lack the epigenetic signatures of typical mRNA promoters, and a substantial subset of them map upstream of ribosomal protein pseudogenes. We present evidence that these are likely mapping errors, which have confounded earlier analyses, arising from the high similarity of ribosomal gene promoters in combination with the known G addition bias in CAGE libraries. Thus, previous two-class separations of promoters based on TSS distributions are justified, but ultra-sharp TSS distributions will confound downstream analyses if not removed.
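
    The paper's dissimilarity measure is not given in this abstract, so the following is only a minimal sketch of the general recipe it describes: normalise per-promoter TSS counts into distributions, compute a non-parametric pairwise dissimilarity, and cluster the resulting matrix. The Jensen-Shannon distance and all names below are stand-in assumptions, not the authors' method.

```python
# Hedged sketch (not the authors' method): compare TSS distributions (TSSDs)
# with a non-parametric dissimilarity, then cluster the dissimilarity matrix.
import numpy as np
from scipy.spatial.distance import jensenshannon, squareform
from scipy.cluster.hierarchy import linkage, fcluster

def tssd_dissimilarity(counts_a, counts_b):
    """Dissimilarity between two TSS count vectors over a common window."""
    p = counts_a / counts_a.sum()  # normalise to initiation propensities
    q = counts_b / counts_b.sum()
    return jensenshannon(p, q)     # 0 = identical, 1 = disjoint

# Toy promoters: sharp, ultra-sharp (single position), and broad TSSDs.
rng = np.random.default_rng(0)
sharp = np.histogram(rng.normal(50, 2, 1000), bins=100, range=(0, 100))[0]
ultra = np.zeros(100)
ultra[50] = 1000.0                 # every tag from one genomic position
broad = np.histogram(rng.normal(50, 15, 1000), bins=100, range=(0, 100))[0]
tssds = np.array([sharp, ultra, broad], dtype=float)

# Pairwise dissimilarities -> average-linkage hierarchical clustering.
n = len(tssds)
d = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d[i, j] = d[j, i] = tssd_dissimilarity(tssds[i], tssds[j])
labels = fcluster(linkage(squareform(d), method="average"),
                  t=2, criterion="maxclust")
print(labels)
```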

    Shared care in mental illness: A rapid review to inform implementation

    Get PDF
    Background: While integrated primary healthcare for the management of depression has been well researched, appropriate models of primary care for people with severe and persistent psychotic disorders are poorly understood. In 2010 the NSW (Australia) Health Department commissioned a review of the evidence on "shared care" models of ambulatory mental health services. This focussed on critical factors in the implementation of these models in clinical practice, with a view to providing policy direction. The review excluded evidence about dementia, substance use and personality disorders.
    Methods: A rapid review was conducted, involving a search for systematic reviews on The Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects (DARE). This was followed by a search for papers published since these systematic reviews on Medline, supplemented by limited iterative searching from reference lists.
    Results: Shared care trials report improved mental and physical health outcomes in some clinical settings, with improved social function, self-management skills, service acceptability and reduced hospitalisation. Other benefits include improved access to specialist care and better engagement with, and acceptability of, mental health services. Limited economic evaluation shows significant set-up costs, reduced patient costs and service savings often realised by other providers. Nevertheless, these findings are not evident across all clinical groups. Gains require substantial cross-organisational commitment and carefully designed and consistently delivered interventions, with attention to staff selection, training and supervision. Effective models incorporated linkages across various service levels, clinical monitoring within agreed treatment protocols, and improved continuity and comprehensiveness of services.
    Conclusions: "Shared care" models of mental health service delivery require attention to multiple levels (from organisational to individual clinicians) and complex service re-design. Re-evaluation of the roles of specialist mental health staff is a critical requirement. As expected, no one model of "shared" care fits diverse clinical groups. On the basis of the available evidence, we recommended a local trial examining the process of implementing core principles of shared care within primary care and specialist mental health clinical services.

    NAVY EXPEDITIONARY ADDITIVE MANUFACTURING (NEAM) CAPABILITY INTEGRATION

    Get PDF
    This capstone report analyzes the current and future use of additive manufacturing (AM) technologies within the Department of Defense (DOD). This analysis provided the technical background necessary to develop the Additive Manufacturing Process and Analysis Tool (AMPAT). AMPAT will help stakeholders identify which AM equipment best serves warfighters and their missions in expeditionary environments. Furthermore, stakeholders can use the tool to identify the most advantageous dispersions of AM capabilities across the fleet and to decide how those capabilities should be integrated into the greater naval mission and the larger DOD enterprise. A systems engineering (SE) approach was implemented to gather information on current and prospective AM methods in order to understand and define the AM system operational requirements. Additionally, an SE process was used to analyze alternative software options for building the tool, implement agile software development processes to develop it, and verify and validate that it met the project requirements. The study found that AMPAT successfully outputs a ranked list of AM system recommendations based upon user-defined input parameters and weighting values. Recommendations for choosing AM equipment and developing dispersion plans for the fleet include using the AMPAT deliverable to conduct customized, iterative analysis with user-defined inputs tailored to specific expeditionary environments.
    Outstanding Thesis. Approved for public release. Distribution is unlimited.
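
    AMPAT's internals are not described beyond "user-defined input parameters and weighting values", so the following is a hedged sketch of a generic weighted-scoring ranker of the kind such a tool might use; the criteria, weights, and candidate systems are invented for illustration and are not taken from the capstone report.

```python
# Hedged sketch of weighted-scoring ranking (illustrative, not AMPAT's code).
from dataclasses import dataclass

@dataclass
class AMSystem:
    name: str
    scores: dict  # criterion -> normalized score in [0, 1]

def rank_systems(systems, weights):
    """Return systems sorted by normalized weighted score, best first."""
    total = sum(weights.values())
    def weighted(s):
        return sum(weights[c] * s.scores.get(c, 0.0) for c in weights) / total
    return sorted(systems, key=weighted, reverse=True)

# Invented candidates and user-defined weights, for illustration only.
candidates = [
    AMSystem("Printer A", {"ruggedness": 0.9, "build_volume": 0.4, "power_draw": 0.7}),
    AMSystem("Printer B", {"ruggedness": 0.5, "build_volume": 0.9, "power_draw": 0.6}),
]
user_weights = {"ruggedness": 5, "build_volume": 3, "power_draw": 2}
for s in rank_systems(candidates, user_weights):
    print(s.name)
```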

    An Experimentation Infrastructure for Quantitative Measurements of Cyber Resilience

    Full text link
    The vulnerability of cyber-physical systems to cyber attack is well known, and the requirement to build cyber resilience into these systems has been firmly established. The key challenge this paper addresses is that maturing this discipline requires the development of techniques, tools, and processes for objectively, rigorously, and quantitatively measuring the attributes of cyber resilience. Researchers and program managers need to be able to determine whether the implementation of a resilience solution actually increases the resilience of the system. In previous work, a table-top exercise was conducted using a notional heavy vehicle on a fictitious military mission while under cyber attack. While this exercise provided some useful data, more and higher-fidelity data are required to refine the measurement methodology. This paper details the efforts made to construct a cost-effective experimentation infrastructure to provide such data. It also presents a case study using some of the data generated by the infrastructure.
    Comment: 6 pages, 2022 IEEE Military Communications Conference, pp. 855-86

    Quantitative Measurement of Cyber Resilience: Modeling and Experimentation

    Full text link
    Cyber resilience is the ability of a system to resist and recover from a cyber attack, thereby restoring the system's functionality. Effective design and development of a cyber-resilient system requires experimental methods and tools for quantitatively measuring cyber resilience. This paper describes an experimental method and test bed for obtaining resilience-relevant data as a system (in our case, a truck) traverses its route, in repeatable, systematic experiments. We model a truck that is equipped with an autonomous cyber-defense system and also includes inherent physical resilience features. When attacked by malware, this ensemble of cyber-physical features (i.e., "bonware") strives to resist and recover from the performance degradation caused by the malware's attack. We propose parsimonious mathematical models to aid in quantifying systems' resilience to cyber attacks. Using the models, we identify quantitative characteristics obtainable from experimental data, and show that these characteristics can serve as useful quantitative measures of cyber resilience.
    Comment: arXiv admin note: text overlap with arXiv:2302.04413, arXiv:2302.0794
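
    The abstract does not reproduce the paper's models, so the sketch below is only a guess at what a "parsimonious" functionality model might look like: malware degrades functionality during an attack window, bonware pushes it back toward nominal, and time-averaged functionality serves as one candidate quantitative measure. Every equation and parameter here is an assumption for illustration, not the paper's model.

```python
# Hedged toy model: malware degrades functionality F, bonware restores it.
import numpy as np

def simulate(f0=1.0, attack=0.8, recover=0.3, t_end=10.0, dt=0.01):
    """Integrate dF/dt = recover*(1-F) - attack*F, attack active for t < 2."""
    t = np.arange(0.0, t_end, dt)
    f = np.empty_like(t)
    f[0] = f0
    for i in range(1, len(t)):
        deg = attack * f[i - 1] if t[i] < 2.0 else 0.0  # attack window
        f[i] = f[i - 1] + dt * (recover * (1.0 - f[i - 1]) - deg)
    return t, f

t, f = simulate()
# One candidate measure: delivered functionality relative to the ideal of 1.0,
# i.e., the time average of F over the mission (uniform time grid).
resilience = f.mean()
print(f"resilience index = {resilience:.3f}")  # 1.0 = no degradation at all
```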

    On p-filtrations of Weyl modules

    Full text link
    This paper considers Weyl modules for a simple, simply connected algebraic group over an algebraically closed field k of positive characteristic p ≠ 2. The main result proves that, if p ≥ 2h − 2 (where h is the Coxeter number) and if the Lusztig character formula holds for all (irreducible modules with) regular restricted highest weights, then any Weyl module Δ(λ) has a Δ^p-filtration, namely a filtration with sections of the form Δ^p(μ_0 + pμ_1) := L(μ_0) ⊗ Δ(μ_1)^[1], where μ_0 is restricted and μ_1 is arbitrary dominant. In case the highest weight λ of the Weyl module Δ(λ) is p-regular, the p-filtration is compatible with the G_1-radical series of the module. The problem of showing that Weyl modules have Δ^p-filtrations was first proposed as a worthwhile ("wünschenswert") problem in Jantzen's 1980 Crelle paper.
    Comment: Latest version corrects minor mistakes in previous versions. A reference is made to Williamson's recent arXiv posting, providing some relevant discussion in a footnote. [Comments on earlier versions: Previous v. 1 with minor errors and statements corrected. Improved organization. Should replace v. 2, which is an older version (even older than v. 1) and was mistakenly posted.]
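
    For orientation, the sections Δ^p(μ_0 + pμ_1) mirror Steinberg's tensor product theorem, a standard fact not specific to this paper, with the Weyl module replacing the irreducible in the Frobenius-twisted factor; in the display below, X_1(T) and X(T)_+ denote the restricted and dominant weights respectively.

```latex
% Steinberg tensor product theorem (standard background, not a result of
% this paper): L(\mu_0 + p\mu_1) \cong L(\mu_0) \otimes L(\mu_1)^{[1]}.
% The \Delta^p-sections of the main result mimic this decomposition:
\[
  \Delta^p(\mu_0 + p\mu_1) := L(\mu_0) \otimes \Delta(\mu_1)^{[1]},
  \qquad \mu_0 \in X_1(T),\ \mu_1 \in X(T)_+ .
\]
```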

    A strained silicon cold electron bolometer using Schottky contacts

    Get PDF
    We describe the optical characterisation of a strained silicon cold electron bolometer (CEB), operating on a 350 mK stage, designed for absorption of millimetre-wave radiation. The silicon cold electron bolometer utilises Schottky contacts between a superconductor and an n++ doped silicon island to detect changes in the temperature of the charge carriers in the silicon due to variations in absorbed radiation. By using strained silicon as the absorber, we decrease the electron-phonon coupling in the device and increase the responsivity to incoming power. The strained silicon absorber is coupled to a planar aluminium twin-slot antenna, designed to couple to 160 GHz, that also serves as the superconducting contacts. From the measured optical responsivity and spectral response, we calculate a maximum optical efficiency of 50% for radiation coupled into the device by the planar antenna and an overall noise equivalent power, referred to absorbed optical power, of 1.1 × 10^−16 W Hz^−1/2 when the detector is observing a 300 K source through a 4 K throughput-limiting aperture. Even though this optical system is not optimised, we measure a system noise equivalent temperature difference of 6 mK Hz^−1/2. We measure the noise of the device using a cross-correlation of time-stream data, measured simultaneously with two junction field-effect transistor amplifiers, with a base correlated noise level of 300 pV Hz^−1/2, and find that the total noise is consistent with a combination of photon noise, current shot noise, and electron-phonon thermal noise.
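
    The cross-correlation measurement lends itself to a short illustration: noise originating in the detector appears in both readout chains, while each JFET amplifier's own noise is uncorrelated between chains and averages away in the cross-spectrum. The sketch below is a toy simulation of that idea, not the paper's analysis code; the sample rate and noise levels are invented, with the device level set to the 300 pV Hz^−1/2 quoted above.

```python
# Hedged toy demo of two-amplifier cross-correlation noise measurement.
import numpy as np
from scipy.signal import csd

rng = np.random.default_rng(1)
fs, n = 10_000, 2**20                        # sample rate [Hz], sample count
target_asd = 300e-12                         # device noise, V Hz^-1/2 (cf. above)
sigma = target_asd * np.sqrt(fs / 2)         # std of white noise with that ASD
device = sigma * rng.standard_normal(n)      # noise common to both chains
amp1 = device + 3 * sigma * rng.standard_normal(n)  # chain 1 adds JFET noise
amp2 = device + 3 * sigma * rng.standard_normal(n)  # chain 2, independent JFET

# Cross spectral density: uncorrelated amplifier noise averages toward zero,
# leaving an estimate of the common (device) noise despite noisier chains.
f, pxy = csd(amp1, amp2, fs=fs, nperseg=4096)
asd = np.sqrt(np.mean(pxy.real))             # band-averaged amplitude density
print(f"recovered noise ~ {asd:.2e} V Hz^-1/2 (true device level 3.00e-10)")
```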

    Eddington-Limited Accretion in z~2 WISE-selected Hot, Dust-Obscured Galaxies

    Full text link
    Hot, Dust-Obscured Galaxies, or "Hot DOGs", are a rare, dusty, hyperluminous galaxy population discovered by the WISE mission. Predominantly at redshifts 2-3, they include the most luminous known galaxies in the universe. Their high luminosities likely come from accretion onto highly obscured supermassive black holes (SMBHs). We have conducted a pilot survey to measure the SMBH masses of five z~2 Hot DOGs via broad H_alpha emission lines, using Keck/MOSFIRE and Gemini/FLAMINGOS-2. We detect broad H_alpha emission in all five Hot DOGs. We find substantial SMBH masses for these Hot DOGs (~10^{9} M_sun), and their derived Eddington ratios are close to unity. These z~2 Hot DOGs are the most luminous AGNs at a given BH mass, suggesting they are accreting at the maximum rates for their BHs. A similar property is found for known z~6 quasars. Our results are consistent with scenarios in which Hot DOGs represent a transitional, high-accretion phase between obscured and unobscured quasars. Hot DOGs may mark a special evolutionary stage before the red quasar and optical quasar phases, and they may be present at other cosmic epochs.
    Comment: 15 pages, 9 figures. Accepted by Ap
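
    The Eddington-ratio claim rests on textbook physics: L_Edd ≈ 1.26×10^38 erg s^−1 × (M_BH/M_sun) for ionised hydrogen. The sketch below applies that standard formula; the mass and luminosity are illustrative values of the order quoted in the abstract, not measurements from the paper.

```python
# Hedged sketch using the standard Eddington-luminosity formula.
def eddington_ratio(m_bh_msun: float, l_bol_erg_s: float) -> float:
    """Eddington ratio L_bol / L_Edd, with L_Edd = 1.26e38 erg/s per solar mass."""
    l_edd = 1.26e38 * m_bh_msun   # Eddington luminosity [erg/s]
    return l_bol_erg_s / l_edd

# A ~1e9 M_sun black hole radiating ~1.3e47 erg/s sits at the Eddington limit.
print(f"lambda_Edd = {eddington_ratio(1e9, 1.26e47):.2f}")  # -> 1.00
```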