
    Transcriptional Termination Modulated by Nucleotides Outside the Characterized Gene End Sequence of Respiratory Syncytial Virus

    The genes of respiratory syncytial (RS) virus are transcribed sequentially by the viral RNA polymerase from a single 3′-proximal promoter. Polyadenylation and termination are directed by a sequence at the end of each gene, after which the polymerase crosses an intergenic region and reinitiates at the start sequence of the next gene. The 10 viral genes have different gene end sequences and different termination efficiencies, which allow for regulation of gene expression, since termination of each gene is required for initiation of the downstream gene. RNA sequences within the previously characterized 13-nucleotide gene end, including a conserved sequence 3′-UCAAU-5′ and a tract of U residues, are important for termination. In this study, two additional sequence elements outside of the 13-nucleotide gene end were found to modulate termination efficiency: the A residue upstream of the 3′-UCAAU-5′ sequence, and the first nucleotide of the intergenic region when it follows a U4 tract.
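    As a rough illustration of the sequence elements described above, the following Python sketch scans a genome-sense sequence (written 3′ to 5′, as in the abstract) for the conserved UCAAU motif followed by a run of U residues. The example sequence, the allowance for a few intervening non-U nucleotides, and the minimum tract length of four are illustrative assumptions, not values taken from the study.

        import re

        # Hypothetical RSV-like gene-end region, written 3' -> 5' for illustration.
        template_3_to_5 = "AGUUAUCAAUUUUUUUGAUACG"

        # Conserved gene-end motif (UCAAU), optionally a few non-U nucleotides,
        # then a tract of U residues; the minimum tract length of 4 is assumed.
        gene_end = re.compile(r"UCAAU[ACG]{0,3}(U{4,})")

        for m in gene_end.finditer(template_3_to_5):
            print(f"gene end motif at position {m.start()}, U tract of {len(m.group(1))} residues")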

    An Earth pole-sitter using hybrid propulsion

    In this paper we investigate optimal pole-sitter orbits using hybrid solar sail and solar electric propulsion (SEP). A pole-sitter is a spacecraft that remains constantly above one of the Earth's poles by means of continuous thrust. Optimal orbits that minimize propellant mass consumption are found both through a shape-based approach and by solving an optimal control problem with a direct method based on pseudo-spectral techniques. Both the pure SEP case and the hybrid case are investigated and compared. It is found that the hybrid spacecraft allows considerable savings in propellant mass fraction. Finally, it is shown that for sufficiently long missions (more than 8 years), a hybrid spacecraft based on mid-term technology enables a considerable reduction in the launch mass for a given payload with respect to a pure SEP spacecraft.
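    As a back-of-the-envelope illustration of why propellant mass fraction drives such mission comparisons, the sketch below applies the rocket equation to a continuously thrusting spacecraft held at constant acceleration. The acceleration level, specific impulse, and mission durations are assumed values chosen only for illustration; they are not figures from the paper, and the hybrid sail contribution is not modelled.

        import math

        G0 = 9.80665  # standard gravity, m/s^2

        def propellant_fraction(accel_m_s2, years, isp_s):
            """Propellant mass fraction for a constant-acceleration, continuous-thrust
            mission: delta-v = a * t, then the Tsiolkovsky rocket equation."""
            delta_v = accel_m_s2 * years * 365.25 * 86400.0
            return 1.0 - math.exp(-delta_v / (isp_s * G0))

        # Assumed, illustrative numbers: ~0.2 mm/s^2 of continuous acceleration
        # and a 3000 s specific impulse for the SEP thruster.
        for years in (2, 5, 8):
            frac = propellant_fraction(2.0e-4, years, 3000.0)
            print(f"{years}-year mission: propellant mass fraction ~ {frac:.2f}")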

    Development of high temperature refractory-based multi-principal-component alloys by thermodynamic calculations and rapid alloy prototyping

    Recently, new refractory-based high entropy alloys (HEAs) have been investigated for potential use as high temperature structural alloys, and some alloys exhibit excellent high temperature strength and ductility. While the high entropy alloy community is generally concerned with obtaining single-phase solid solutions, secondary strengthening phases are usually required to achieve an adequate balance of mechanical and physical properties for structural applications. This contribution will report on new Mo,Nb-based alloys that have been developed using HEA design guidelines, as well as new tools that enable thermodynamic property predictions and rapid alloy prototyping and assessment. An elemental palette of Mo-Nb-Hf-Ta-Ti-V-W-Zr was chosen in order to promote the formation of a single body-centered cubic (BCC) solid-solution phase upon solidification, which facilitates homogenization heat treatments. Al, Cr, and Si were also included to promote secondary phase formation. These 11 elements were then used to calculate the phases present and their reaction temperatures for 3-, 4-, 5-, and 6-component alloy compositions from all of the available Pandat™ databases. Mo and Nb were required to be present in each alloy composition in order to maintain modest alloy costs and densities.
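    A minimal sketch of the composition-enumeration step described above: generating every 3- to 6-component candidate system from the 11-element palette while requiring Mo and Nb in each alloy. The element list and constraints come from the abstract; the call into the actual CALPHAD/Pandat screening is not shown, since it depends on the local installation.

        from itertools import combinations

        # Elemental palette from the abstract: refractory BCC formers plus
        # Al, Cr, and Si, which were added to promote secondary phase formation.
        palette = ["Mo", "Nb", "Hf", "Ta", "Ti", "V", "W", "Zr", "Al", "Cr", "Si"]
        required = {"Mo", "Nb"}
        optional = [el for el in palette if el not in required]

        candidates = []
        for n_components in range(3, 7):  # 3-, 4-, 5-, and 6-component alloys
            for extra in combinations(optional, n_components - len(required)):
                candidates.append(tuple(sorted(required | set(extra))))

        print(len(candidates), "candidate alloy systems")  # 255 systems
        print(candidates[:3])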

    Spatial and observational homogeneities of the galaxy distribution in standard cosmologies

    This work discusses the possible empirical verification of the geometrical concept of homogeneity of the standard relativistic cosmology, considering its various definitions of distance. We study the physical consequences of the distinction between the usual concept of spatial homogeneity (SH), as defined by the Cosmological Principle, and the concept of observational homogeneity (OH), arguing that OH is in principle falsifiable by means of astronomical observations, whereas verifying SH is only possible indirectly. Simulated counts of cosmological sources are produced by means of a generalized number-distance expression that can be specialized to produce either the counts of the Einstein-de Sitter (EdS) cosmology, which has SH by construction, or other types of counts, which do, or do not, have OH by construction. Expressions for observational volumes and differential densities are derived with the various cosmological distance definitions in the EdS model. Simulated counts that have OH by construction do not always exhibit SH features, and the reverse is also true. Moreover, simulated counts with no OH features at low redshift start showing OH characteristics at high redshift. The comoving distance seems to be the only distance definition with which both SH and OH appear simultaneously. The results show that observations indicating a possible lack of OH do not necessarily falsify the standard Friedmannian cosmology, meaning that this cosmology will not necessarily always produce observationally homogeneous densities. The general conclusion is that the use of different cosmological distances in the characterization of the galaxy distribution leads to significant ambiguities in reaching conclusions about the behavior of the large-scale galaxy distribution in the Universe. Comment: 12 pages, 12 figures, LaTeX. Matches the final version sent to the journal. Accepted for publication in "Astronomy and Astrophysics".
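    A numerical sketch of the effect described above, assuming an Einstein-de Sitter model populated with a constant comoving number density of sources (a simplified stand-in for the paper's generalized number-distance expression; the Hubble constant is an arbitrary illustrative value). The average density built with the comoving distance stays constant, while the same counts divided by luminosity- or angular-diameter-distance volumes do not.

        import math

        C_KM_S = 299792.458   # speed of light, km/s
        H0 = 70.0             # assumed Hubble constant, km/s/Mpc (illustrative)
        N_C = 1.0             # constant comoving number density, arbitrary units

        def eds_distances(z):
            """Comoving, luminosity, and angular-diameter distances (Mpc) in the
            Einstein-de Sitter model."""
            d_c = (2.0 * C_KM_S / H0) * (1.0 - 1.0 / math.sqrt(1.0 + z))
            return d_c, (1.0 + z) * d_c, d_c / (1.0 + z)

        def average_densities(z):
            """Cumulative count inside the comoving volume, divided by the volume
            built with each distance definition."""
            d_c, d_l, d_a = eds_distances(z)
            count = (4.0 / 3.0) * math.pi * N_C * d_c ** 3
            return {name: count / ((4.0 / 3.0) * math.pi * d ** 3)
                    for name, d in (("comoving", d_c),
                                    ("luminosity", d_l),
                                    ("angular diameter", d_a))}

        for z in (0.1, 1.0, 3.0):
            print(z, {k: round(v, 3) for k, v in average_densities(z).items()})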

    Evaluation of EDISON's Data Science Competency Framework Through a Comparative Literature Analysis

    During the emergence of Data Science as a distinct discipline, discussions of what exactly constitutes Data Science have been a source of contention, with no clear resolution. These disagreements have been exacerbated by the lack of a clear single disciplinary 'parent.' Many early efforts at defining curricula and courses exist, with the EDISON Project's Data Science Framework (EDISON-DSF) from the European Union being the most complete. The EDISON-DSF includes both a Data Science Body of Knowledge (DS-BoK) and a Competency Framework (CF-DS). This paper takes a critical look at how EDISON's CF-DS compares to recent work and other published curricular or course materials. We identify areas of strong agreement and disagreement with the framework. Results from the literature analysis provide strong insights into which topics the broader community sees as belonging (or not belonging) in Data Science, both at the curricular and course levels. This analysis can provide important guidance for groups working to formalize the discipline and for any college or university looking to build its own undergraduate Data Science degree or programs.

    The Apparent Fractal Conjecture: Scaling Features in Standard Cosmologies

    This paper presents an analysis of the smoothness problem in cosmology by focusing on the ambiguities originating in the simplifying hypotheses aimed at observationally verifying whether the large-scale distribution of galaxies is homogeneous, and it conjectures that this distribution should follow a fractal pattern in perturbed standard cosmologies. This is due to a geometrical effect, appearing when certain types of average densities are calculated along the past light cone. The paper starts by reviewing the argument concerning the possibility that the galaxy distribution follows such a scaling pattern, and the premises behind the assumption that the spatial homogeneity of standard cosmology can be observable. Next, it is argued that to discuss observable homogeneity one needs to make a clear distinction between local and average relativistic densities, and it is shown how the different distance definitions strongly affect them, leading the various average densities to display asymptotically opposite behaviours. Then the paper revisits Ribeiro's (1995: astro-ph/9910145) results, showing that in a fully relativistic treatment some observational average densities of the flat Friedmann model are not well defined at z ~ 0.1, implying that in this range average densities behave in a fundamentally different manner from the linearity of the Hubble law, which remains valid for z < 1. This conclusion brings into question the widespread assumption that relativistic corrections can always be neglected at low z. It is also shown how some key features of fractal cosmologies can be found in the Friedmann models. In view of these findings, it is suggested that the so-called contradiction between the cosmological principle and a galaxy distribution forming an unlimited fractal structure may not exist. Comment: 30 pages, 2 figures, LaTeX. This paper is a follow-up to gr-qc/9909093. Accepted for publication in "General Relativity and Gravitation".
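    A compact statement of the distinction drawn above between local and observational average densities, sketched in LaTeX with our own notation (the symbols and the spherical-volume construction are assumptions in the spirit of the paper, not its exact definitions):

        % Observational average density: cumulative rest mass inside the volume
        % built from an observational distance d_i measured down the past light
        % cone, where d_i may be the comoving, luminosity, or angular-diameter
        % distance, m_g is a typical galaxy rest mass, and N(d_i) is the
        % cumulative source count.
        \[
          \langle \rho \rangle_i \;=\; \frac{M(d_i)}{V(d_i)}
          \;=\; \frac{m_g\, N(d_i)}{\tfrac{4}{3}\pi d_i^{\,3}},
          \qquad d_L = (1+z)\, d_C, \quad d_A = \frac{d_C}{1+z},
        \]
        % so the averages built with d_L and d_A pull apart from the one built
        % with d_C as z grows, even when the local (proper) density is exactly
        % homogeneous.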

    Differences in brain morphometry associated with creative performance in high- and average-creative achievers.

    Nearly everyone has the ability for creative thought. Yet, certain individuals create works that propel their fields, challenge paradigms, and advance the world. What are the neurobiological factors that might underlie such prominent creative achievement? In this study, we focus on morphometric differences in brain structure between high creative achievers from diverse fields of expertise and a 'smart' comparison group of age-, intelligence-, and education-matched average creative achievers. Participants underwent a high-resolution structural brain imaging scan and completed a series of intelligence, creative thinking, personality, and creative achievement measures. We examined whether high and average creative achievers could be distinguished based on the relationship between morphometric brain measures (cortical area and thickness) and behavioral measures. Although participants' performance on the behavioral measures did not differ between the two groups aside from creative achievement, the relationship between posterior parietal cortex morphometry and creativity, intelligence, and personality measures depended on group membership. These results suggest that extraordinary creativity may be associated with measurable structural brain differences, especially within parietal cortex.

    Total versus superficial parotidectomy for stage III melanoma

    Background: The primary purpose of this study was to describe the parotid recurrence rates after superficial and total parotidectomy. Methods: A retrospective cohort study was performed on patients with cutaneous melanoma metastatic to the parotid gland who underwent parotidectomy from 1998 through 2014. Primary outcome was parotid bed recurrence. Secondary outcomes were facial nerve function postoperatively and at last follow-up. Results: One hundred twenty-nine patients were included in the study. Thirty-four patients (26%) underwent a total parotidectomy and 95 patients underwent superficial parotidectomy. Twelve patients (13%) developed parotid bed recurrence after superficial parotidectomy alone versus zero after total parotidectomy (P = .035). Facial nerve function, clinically detected disease, stage, and adjuvant treatment were not statistically different between the groups (P = .32, .32, .13, and .99, respectively). Conclusion: Parotid bed melanoma recurrence was more common after superficial parotidectomy compared to total parotidectomy, and recurrence resulted in significant facial nerve functional deficit. Our results support total parotidectomy when metastatic melanoma involves the parotid nodal basin.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/137735/1/hed24810_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/137735/2/hed24810.pd
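    The headline comparison above (12/95 recurrences after superficial parotidectomy versus 0/34 after total parotidectomy) can be illustrated with a standard exact test on the 2x2 table, sketched below. The abstract does not state which analysis produced P = .035, so this is only an illustration of how such counts might be compared, and it need not reproduce the reported value (a time-to-event analysis, for example, would differ).

        from scipy.stats import fisher_exact

        # Rows: superficial vs. total parotidectomy; columns: recurrence vs. none.
        table = [[12, 95 - 12],
                 [0, 34 - 0]]

        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"two-sided Fisher exact test: P = {p_value:.3f}")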

    A Phase II Randomized, Double-Blind, Placebo-Controlled Safety and Efficacy Study of Lenalidomide in Lumbar Radicular Pain with a Long-Term Open-Label Extension Phase.

    OBJECTIVE: This phase II study assessed lenalidomide efficacy and safety. DESIGN: Three-phase core study: 14-day prerandomization, 12-week treatment, and 52-week open-label extension. SETTING: Fourteen US centers from July 2005 to July 2007. SUBJECTS: Patients with chronic lumbar radicular pain without a history of nerve injury or deficit. METHODS: Subjects were randomized (1:1) to double-blind treatment with lenalidomide 10 mg or placebo once daily for 12 weeks, followed by a 52-week open-label extension. A 12-week, single-center, randomized-withdrawal (1:2, lenalidomide:placebo), exploratory study with an open-label extension was undertaken in 12 subjects from the core extension who were naïve to neuropathic pain medications and had at least a two-point decrease from baseline in average daily Pain Intensity-Numerical Rating Scale score. RESULTS: Of 180 subjects enrolled, 176 had at least one postbaseline measure; 132 completed the 12-week treatment phase. In the core study, no statistically significant difference in Pain Intensity-Numerical Rating Scale mean change (-0.02, P = 0.958) was observed at week 12 between lenalidomide and placebo; the proportions achieving pain reduction at week 12 and other secondary measures were comparable between lenalidomide and placebo. In the exploratory study, week 12 mean changes in Pain Intensity-Numerical Rating Scale scores were -0.05 (lenalidomide: N = 3) and 2.11 (placebo: N = 8). Mean changes in Brief Pain Inventory-short form interference scores were -3.33 and 8.38, respectively; scores at six months were maintained or decreased in 10 of 12 subjects. CONCLUSIONS: While this study does not support lenalidomide use in an unselected lumbar radicular pain population, an immunomodulating agent may relieve pain in select subjects naïve to neuropathic pain medications. ClinicalTrials.gov identifier: NCT00120120