Discrete Newtonian Cosmology
In this paper we lay down the foundations for a purely Newtonian theory of
cosmology, valid at scales small compared with the Hubble radius, using only
Newtonian point particles acted on by gravity and a possible cosmological term.
We describe the cosmological background which is given by an exact solution of
the equations of motion in which the particles expand homothetically with their
comoving positions constituting a central configuration. We point out, using
previous work, that an important class of central configurations are
homogeneous and isotropic, thus justifying the usual assumptions of elementary
treatments. The scale factor is shown to satisfy the standard Raychaudhuri and
Friedmann equations without making any fluid dynamic or continuum
approximations. Since we make no commitment as to the identity of the point
particles, our results are valid for cold dark matter, galaxies, or clusters of
galaxies. In future publications we plan to discuss perturbations of our
cosmological background from the point particle viewpoint laid down in this
paper and show consistency with much standard theory usually obtained by more
complicated and conceptually less clear continuum methods. Apart from its
potential use in large scale structure studies, we believe that our approach
has great pedagogic advantages over existing elementary treatments of the
expanding universe, since it requires no use of general relativity or continuum
mechanics but concentrates on the basic physics: Newton's laws for
gravitationally interacting particles.
Comment: 33 pages; typos fixed, references added, some clarification
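The homothetic construction summarized in this abstract can be sketched as follows (a schematic reconstruction under assumed conventions; the symbols λ and C below are illustrative constants, not the paper's own notation):

```latex
% Homothetic ansatz: each particle's position scales with a common factor,
%   x_a(t) = a(t) q_a ,
% where the comoving positions q_a form a central configuration:
%   \sum_{b \neq a} G m_b (q_b - q_a)/|q_b - q_a|^3 = -\lambda q_a .
% Substituting into Newton's equations with a cosmological term \Lambda x_a / 3 gives
\ddot{a} = -\frac{\lambda}{a^{2}} + \frac{\Lambda}{3}\,a
\qquad \text{(Raychaudhuri form)}
% and multiplying by \dot{a} and integrating once yields the first integral
\frac{1}{2}\dot{a}^{2} = \frac{\lambda}{a} + \frac{\Lambda}{6}\,a^{2} - \frac{C}{2}
\qquad \text{(Friedmann form, $C$ an integration constant).}
```

The point of the construction is that both equations emerge for the discrete particle system itself, with no fluid or continuum approximation.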
Assessing the Proposed IAM, UAW, and USW Merger: Critical Issues and Potential Outcomes
[Excerpt] We examine the many difficult issues facing the IAM, UAW, and USW as they move toward the creation of a single organization. In order to place this merger in context, the larger issue of mergers in the American labor movement will be addressed, as will the origins and history of each of the three unions. The specific issues confronting the unions will be examined in three categories — structure, administration, and functions and services. We conclude with an assessment of the current status of the unification effort and the prospects for its realization.
Analysis of wheat SAGE tags reveals evidence for widespread antisense transcription
BACKGROUND: Serial Analysis of Gene Expression (SAGE) is a powerful tool for genome-wide transcription studies. Unlike microarrays, it has the ability to detect novel forms of RNA such as alternatively spliced and antisense transcripts, without the need for prior knowledge of their existence. One limitation of using SAGE on an organism with a complex genome and lacking detailed sequence information, such as the hexaploid bread wheat Triticum aestivum, is accurate annotation of the tags generated. Without accurate annotation it is impossible to fully understand the dynamic processes involved in such complex polyploid organisms. Hence we have developed and utilised novel procedures to characterise, in detail, SAGE tags generated from the whole grain transcriptome of hexaploid wheat. RESULTS: Examination of 71,930 Long SAGE tags generated from six libraries derived from two wheat genotypes grown under two different conditions suggested that SAGE is a reliable and reproducible technique for use in studying the hexaploid wheat transcriptome. However, our results also showed that in poorly annotated and/or poorly sequenced genomes, such as hexaploid wheat, considerably more information can be extracted from SAGE data by carrying out a systematic analysis of both perfect and "fuzzy" (partially matched) tags. This detailed analysis of the SAGE data shows first that while there is evidence of alternative polyadenylation this appears to occur exclusively within the 3' untranslated regions. Secondly, we found no strong evidence for widespread alternative splicing in the developing wheat grain transcriptome. However, analysis of our SAGE data shows that antisense transcripts are probably widespread within the transcriptome and appear to be derived from numerous locations within the genome. 
Examination of antisense transcripts showing sequence similarity to the Puroindoline a and Puroindoline b genes suggests that such antisense transcripts might have a role in the regulation of gene expression. CONCLUSION: Our results indicate that the detailed analysis of transcriptome data, such as SAGE tags, is essential to understand fully the factors that regulate gene expression and that such analysis of the wheat grain transcriptome reveals that antisense transcripts may be widespread and hence probably play a significant role in the regulation of gene expression during grain development.
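The distinction between perfect and "fuzzy" (partially matched) tag annotation can be illustrated with a minimal sketch. This is a hypothetical example, not the authors' pipeline: the 17-bp tag length, the reference sequences, and the one-mismatch threshold are all assumptions.

```python
# Sketch of perfect vs. "fuzzy" SAGE tag matching (illustrative only;
# the one-mismatch rule and the sequences below are assumptions).

def hamming(a, b):
    """Number of mismatched positions between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def classify_tag(tag, reference_tags, max_mismatches=1):
    """Return ('perfect', hit), ('fuzzy', hit), or ('unmatched', None)."""
    best, best_d = None, max_mismatches + 1
    for ref in reference_tags:
        d = hamming(tag, ref)
        if d < best_d:
            best, best_d = ref, d
    if best_d == 0:
        return "perfect", best
    if best_d <= max_mismatches:
        return "fuzzy", best
    return "unmatched", None

def reverse_complement(seq):
    """Antisense candidates match the reverse complement of a sense transcript."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

refs = ["GATTACAGATTACAGAT"]  # a hypothetical 17-bp LongSAGE tag after the anchor
print(classify_tag("GATTACAGATTACAGAT", refs))  # perfect match
print(classify_tag("GATTACAGATTACAGAA", refs))  # fuzzy (1 mismatch)
```

Matching a tag against `reverse_complement(ref)` rather than `ref` itself is one simple way antisense transcription can be flagged in this kind of analysis.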
The nature and extent of trace element contamination associated with fly-ash disposal sites in the Chisman Creek Watershed
This study was conducted by the Virginia Institute of Marine Science (VIMS) and the Virginia Associated Research Campus (VARC), both branches of the College of William and Mary, to document the nature, extent, and severity of environmental contamination by trace elements from the landfill disposal of fly-ash within the Chisman Creek watershed. Previous work in the area demonstrated that some metals were apparently mobile in the groundwater, and that two nearby household wells were contaminated (Va. SWCB, 1981). These short term studies were limited to the testing of only a few selected contaminants in wells near the fly-ash pits. The goal of our study was to provide a more comprehensive sampling of the basin to delineate the geographical extent of trace element contamination, and to assess whether the levels found there pose a hazard to man or to the terrestrial and aquatic ecosystem. An important aspect of the program is the use of an analytical technique which provides simultaneous measurement of a large number of elements, thereby obviating the need to speculate which elements would be found before the field work was begun. Proton Induced X-ray Emission (PIXE) is such a technique and provided data on 70 elements from each sample collected during this study.
Spatio-temporal dynamics of Marbled Murrelet hotspots during nesting in nearshore waters along the Washington to California coast
The Marbled Murrelet, Brachyramphus marmoratus, is a federally listed alcid that forages in nearshore waters of the Pacific Northwest, and nests in adjacent older-forest conifers within 40-80 km of shore. To estimate abundance and distribution of murrelets, we conduct at-sea surveys from May to July each year, starting in 2000 and continuing to present. We record numbers of individuals sighted by using distance-based transects and compute annual estimates of density after adjusting for detectability. At-sea transects are subdivided into 5-km segments, and we summarized mean and variance of density at each segment in Puget Sound and along the coast from the Canadian border south to San Francisco Bay. We used a boosted regression tree analysis to investigate the contributions of marine and terrestrial attributes to murrelet abundance in each segment. We observed that terrestrial attributes, especially the amount and pattern of suitable nesting habitat in proximity to each segment, made the strongest contribution, but that marine attributes also helped explain variation in murrelet abundance. Hotspots of murrelet abundance therefore reflect not only suitable marine foraging habitat but proximity of suitable inland nesting habitat.
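The detectability adjustment mentioned above follows the standard line-transect form D = n / (2wLp̂). A minimal sketch, with hypothetical numbers rather than survey data (the counts, strip width, and detection probability below are assumptions):

```python
# Line-transect density estimate with a detectability correction
# (standard distance-sampling form; the numbers are hypothetical).

def density_per_km2(n_birds, transect_km, strip_half_width_km, p_detect):
    """Estimated density D = n / (2 * w * L * p_hat)."""
    surveyed_area = 2.0 * strip_half_width_km * transect_km  # both sides of the trackline
    return n_birds / (surveyed_area * p_detect)

# e.g. 18 murrelets counted on one 5-km segment, 150-m strip each side,
# with an assumed detection probability of 0.8:
d = density_per_km2(n_birds=18, transect_km=5.0, strip_half_width_km=0.15, p_detect=0.8)
print(round(d, 1))  # 15.0 birds per km^2
```

Dividing by the detection probability inflates the raw count to account for birds present in the strip but missed by observers.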
Combined Thermal Weakening and Mechanical Disintegration of Hard Rock
This investigation of the combined effects of thermal weakening and mechanical disintegration (thermomechanical fragmentation) was initiated with a view toward better understanding of the processes required for more rapidly and economically fragmenting or excavating hard rock. Boring machines for utility tunnels, transportation tunnels or mining operations may be able to utilize the advantages of processes such as thermomechanical fragmentation. Secondary fragmentation or rock crushing processes also can conceivably employ the data obtained from this study.
Improved cardiac management with a disease management program incorporating comprehensive lipid profiling.
The objective of this study was to evaluate the improved effectiveness of a disease management treatment protocol incorporating comprehensive lipid profiling and targeted lipid care based on lipid profile findings in patients with ischemic heart disease (IHD) or congestive heart failure (CHF) enrolled in a managed care plan. This retrospective cohort study, conducted over a 2-year period, compared outcomes between patients evaluated with a standard lipid profile and those evaluated with a comprehensive lipid profile. All adult members of the WellMed Medical Management, Inc. managed care health plan diagnosed with IHD or CHF, and continuously enrolled between July 1, 2006 and June 30, 2008, were included in the study. Cases were defined as those who had at least 1 comprehensive lipid test (the VAP [vertical auto profile] ultracentrifuge test) during this period (n=1767); they were compared to those who had no lipid testing or traditional standard lipid testing only (controls, n=289). Univariate statistics were analyzed to describe the groups, and bivariate t tests or chi-square tests examined differences between the 2 cohorts. Multivariate regression analyses were performed to control for potential confounders. The results show that the case group had lower total costs (7413.18; P=0.0255), fewer inpatient stays (13.1% vs. 18.3% of controls; P=0.0175) and emergency department visits (11.9% vs. 15.6% of controls; P=0.0832). Prescription use and frequency of lipid measurement suggested improved control resulting from a targeted approach to managing specific dyslipidemias. A treatment protocol incorporating a comprehensive lipid profile appears to improve care and reduce utilization and costs in a disease management program for cardiac patients. (Population Health Management 2012;15:46-51)
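The reported inpatient-stay difference (13.1% of 1767 cases vs. 18.3% of 289 controls; P=0.0175) can be checked with a back-of-envelope Pearson chi-square on counts reconstructed from the percentages. This is an illustrative recomputation, not the study's analysis, which used multivariate models to control for confounders:

```python
# Back-of-envelope chi-square on the reported inpatient-stay rates.
# Counts are reconstructed from the abstract's percentages (an approximation).

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]], no continuity correction."""
    n = a + b + c + d
    chi2 = 0.0
    for obs, row, col in [(a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)]:
        exp = row * col / n          # expected count under independence
        chi2 += (obs - exp) ** 2 / exp
    return chi2

cases_in = round(0.131 * 1767)       # inpatient stays among cases (~231)
controls_in = round(0.183 * 289)     # inpatient stays among controls (~53)
chi2 = chi_square_2x2(cases_in, 1767 - cases_in, controls_in, 289 - controls_in)
print(round(chi2, 2))  # well above the 3.84 critical value at alpha = 0.05
```

The statistic comes out near 5.8, consistent with the reported P=0.0175 on one degree of freedom.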
Amphiphilic block copolymers as stabilizers in emulsion polymerization: Effects of molecular weight dispersity and evidence of self-folding behavior
Emulsion polymerizations, used to produce many commodity materials, require stabilizing agents to prevent phase separation. Incorporation of these stabilizers in the final polymer may have negative effects on product properties, so the design of new stabilizers is being actively pursued. Amphiphilic diblock copolymers are a promising type of emulsion polymerization stabilizer and are the focus of this work (Fig. 1). First, the tolerance of an amphiphilic diblock copolymer stabilizer’s performance to high molecular weight dispersity and homopolymer impurity has been investigated. Polystyrene-b-poly(acrylic acid) block copolymers were studied due to their previously demonstrated efficacy as stabilizers in emulsion polymerization, and their similarity to commercially important polystyrene-r-poly(acrylic acid) stabilizers. Neither greater molecular weight dispersity nor homopolymer impurity was found to negatively impact the stabilization performance of these block copolymers, suggesting that the economically unfavorable conditions required to achieve low molecular weight dispersity and homopolymer impurity may be avoided. We then examined novel polystyrene-b-[polystyrene-r-poly(acrylic acid)] block-random copolymers which were shown to stabilize emulsion polymerizations with up to 50 weight percent solids content, exceeding what was possible using the polystyrene-b-poly(acrylic acid) block copolymers. Of even greater significance and scientific value is that the block-random copolymers were also observed to have unusual solution behavior, self-folding rather than self-assembling, to give single chain nanoparticles. Emulsion polymerizations stabilized by these block-random copolymers had a total particle surface area which was directly proportional to the stabilizer concentration and was unaffected by polymerization kinetics. 
A novel “seeded-coagulative” emulsion polymerization mechanism has been proposed to explain these results, which were unexplainable by any known emulsion polymerization mechanism.
Social Preferences and the Efficiency of Bilateral Exchange
Under what conditions do social preferences, such as altruism or a concern for fair outcomes, generate efficient trade? I analyze theoretically a simple bilateral exchange game: Each player sequentially takes an action that reduces his own material payoff but increases the other player’s. Each player’s preferences may depend on both his own material payoff and the other player’s. I identify necessary conditions and sufficient conditions on the players’ preferences for the outcome of their interaction to be Pareto efficient. The results have implications for interpreting the rotten kid theorem, gift exchange in the laboratory, and gift exchange in the field.
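The logic of the game can be illustrated with a toy version. This is a sketch under explicit assumptions, not the paper's model: the transfer grid, the benefit multiplier B, and the linear altruism weights are all hypothetical choices made for the example.

```python
# Toy sequential bilateral exchange. Each unit a player gives up produces
# B > 1 units for the other player, so full mutual transfer maximizes total
# material payoff. Preferences are linear in both material payoffs:
#     u_i = m_i + alpha_i * m_j   (alpha_i = 0 is pure self-interest).
import itertools

B = 2.0
GRID = [0.0, 0.5, 1.0]  # feasible transfer levels (an assumption)

def material(t1, t2):
    """Material payoffs after transfers t1 (by player 1) and t2 (by player 2)."""
    return 10.0 - t1 + B * t2, 10.0 - t2 + B * t1

def utilities(t1, t2, a1, a2):
    m1, m2 = material(t1, t2)
    return m1 + a1 * m2, m2 + a2 * m1

def equilibrium(a1, a2):
    """Player 1 moves first; player 2 observes t1 and best-responds."""
    def best_t2(t1):
        return max(GRID, key=lambda t2: utilities(t1, t2, a1, a2)[1])
    t1 = max(GRID, key=lambda t1: utilities(t1, best_t2(t1), a1, a2)[0])
    return t1, best_t2(t1)

def pareto_efficient(t1, t2, a1, a2):
    """No feasible profile weakly improves both utilities, strictly one."""
    u1, u2 = utilities(t1, t2, a1, a2)
    for s1, s2 in itertools.product(GRID, GRID):
        v1, v2 = utilities(s1, s2, a1, a2)
        if v1 >= u1 and v2 >= u2 and (v1 > u1 or v2 > u2):
            return False
    return True

print(equilibrium(0.0, 0.0), pareto_efficient(*equilibrium(0.0, 0.0), 0.0, 0.0))
print(equilibrium(0.6, 0.6), pareto_efficient(*equilibrium(0.6, 0.6), 0.6, 0.6))
```

With purely selfish players the equilibrium is no trade, which is Pareto inefficient since mutual full transfer would make both better off; with sufficiently altruistic players, full mutual transfer is the equilibrium and is efficient. The paper's contribution is to characterize exactly which preference conditions separate these cases.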