9,597 research outputs found
Gravity device Patent
Gravity device for accurate and rapid indication of relative gravity conditions aboard accelerating carriers.
Beginning at the End: Reimagining the Dissertation Committee, Reimagining Careers
In this article, we forward a perspective on interdisciplinarity and diversity that reconsiders the notion of expertise in order to unstick discussions of graduate education reform that have been at an impasse for some forty-five years. As research problems have become increasingly complex, so has demand for scholars who specialize narrowly within a discipline and who understand the importance of contributions from other disciplines. In light of this, we reimagine the dissertation committee as a group of diverse participants from within and beyond the academy who contribute their knowledge and skills to train the next generation of scholars and researchers to be members of interdisciplinary teams. Graduate students, then, are not expected to be interdisciplinary themselves, but to work in interdisciplinary and diverse teams to discover new insights on their research areas and to prepare for careers interacting with a range of academic and non-academic stakeholders.
Constraining Dark Matter-Neutrino Interactions using the CMB and Large-Scale Structure
We present a new study on the elastic scattering cross section of dark matter (DM) and neutrinos using the latest cosmological data from Planck and large-scale structure experiments. We find that the strongest constraints are set by the Lyman-alpha forest, giving sigma_{DM-neutrino} < 10^{-33} (m_DM/GeV) cm^2 if the cross section is constant, and a present-day value of sigma_{DM-neutrino} < 10^{-45} (m_DM/GeV) cm^2 if it scales as the temperature squared. These are the most robust limits on DM-neutrino interactions to date, demonstrating that one can use the distribution of matter in the Universe to probe dark ("invisible") interactions. Additionally, we show that scenarios involving thermal MeV DM and a constant elastic scattering cross section naturally predict (i) a cut-off in the matter power spectrum at the Lyman-alpha scale, (ii) N_eff ~ 3.5 +/- 0.4, (iii) H_0 ~ 71 +/- 3 km/s/Mpc and (iv) the possible generation of neutrino masses.
Comment: 12 pages, 5 figures
A weighty interpretation of the Galactic Centre excess
Previous attempts at explaining the gamma-ray excess near the Galactic Centre have focussed on dark matter annihilation directly into Standard Model particles. This results in a preferred dark matter mass of 30-40 GeV (if the annihilation is into b quarks) or 10 GeV (if it is into leptons). Here we show that the gamma-ray excess is also consistent with heavier dark matter particles; in models of secluded dark matter, dark matter with mass up to 76 GeV provides a good fit to the data. This occurs if the dark matter first annihilates to an on-shell particle that subsequently decays to Standard Model particles through a portal interaction. This is a generic process that works in models with annihilation, semi-annihilation or both. We explicitly demonstrate this in a model of hidden vector dark matter with an SU(2) gauge group in the hidden sector.
Comment: 5 pages, 4 figures. v2: Matches PRD version. Note: title of PRD version is "Interpretation of the Galactic Center excess of gamma rays with heavier dark matter particles".
Global Mapping Function (GMF): A new empirical mapping function based on numerical weather model data
Troposphere mapping functions are used in the analyses of Global Positioning System and Very Long Baseline Interferometry observations to map a priori zenith hydrostatic and wet delays to any elevation angle. Most analysts use the Niell Mapping Function (NMF), whose coefficients are determined from site coordinates and the day of year. Here we present the Global Mapping Function (GMF), based on data from the global ECMWF numerical weather model. The coefficients of the GMF were obtained from an expansion of the Vienna Mapping Function (VMF1) parameters into spherical harmonics on a global grid. Similar to NMF, the values of the coefficients require only the station coordinates and the day of year as input parameters. Compared to the 6-hourly values of the VMF1, a slight degradation in short-term precision occurs using the empirical GMF. However, the regional height biases and annual errors of NMF are significantly reduced with GMF.
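Mapping functions of this family (NMF, VMF1, GMF) are commonly evaluated with the Marini/Herring continued fraction, normalized so the factor equals 1 at zenith. A minimal sketch in Python, where the coefficient values are illustrative placeholders only (real GMF hydrostatic and wet coefficients come from the spherical-harmonic expansion described in the abstract):

```python
import math

def mapping_factor(elevation_deg, a, b, c):
    """Normalized continued-fraction mapping function (Marini/Herring form).

    Multiplies a zenith delay to obtain the slant delay at the given
    elevation angle; returns exactly 1.0 at zenith (elevation = 90 deg).
    """
    s = math.sin(math.radians(elevation_deg))
    numerator = 1.0 + a / (1.0 + b / (1.0 + c))
    denominator = s + a / (s + b / (s + c))
    return numerator / denominator

# Illustrative coefficients, order-of-magnitude only (NOT actual GMF values):
a, b, c = 1.2e-3, 2.9e-3, 62.6e-3
mf_low = mapping_factor(5.0, a, b, c)   # roughly 10: slant path is ~10x the zenith path
mf_zenith = mapping_factor(90.0, a, b, c)  # exactly 1.0
```

The normalization term in the numerator is what guarantees the factor is unity at zenith, so the a priori zenith delay is returned unchanged there.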
Job polarization has squeezed the American middle class
Has the decline in manufacturing and clerical jobs been responsible for the lagging wages of middleskill workers in the United States? To answer this question, Michael Boehm compares the occupational choices and earnings of survey respondents in the 1980s and today. The decline of the middle class has been much debated in the United States and elsewhere in recent years. The first important component of this decline is the fall in the number of well-paid middle-skill jobs in manufacturing and clerical occupations since the 1980s
The Use of Quality Cost Measurement Systems for Improving Profitability
The focus of this study was to determine if the use of a quality cost measurement system will allow companies to track and analyze costs and provide a means of improving profitability. Focusing on the philosophies and theories developed by W. Edwards Deming, J.M. Juran, and Philip Crosby, three of the leading authorities in the quality control field, a quality control system can be designed to track and analyze quality costs. Quality costs have been grouped into three categories: prevention costs, appraisal costs, and failure costs. Prevention costs are those associated with preventing poor quality, such as new machinery, inspections, and training. Appraisal costs are associated with analysis of finished products and other such functions. Failure costs are divided into two further categories: internal failure and external failure. Internal failures occur as a result of problems within the company. External failures are caused by problems with raw materials from suppliers or problems with consumers. Failure costs include such items as scrap, rework of product, and placating irate customers. The purpose of this study was to determine if a correlation exists between quality cost measurement and profitability. Information was gathered through secondary data collection; magazine articles and published studies were the primary source of secondary data. Hypothetical case scenarios were also utilized.
The following hypothesis was tested: If quality costs are tracked and analyzed, they can be controlled in order to increase profitability.
Results of the analysis failed to supply sufficient information to support the hypothesis completely, but a positive correlation between quality cost measurement and profitability was revealed. Because of insufficient data, it was concluded that the study needed to be revised by changing the sampling frame and determining more useful analysis calculations.
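The category structure described in the abstract (prevention, appraisal, and failure costs, with failure split into internal and external) can be sketched as a nested cost breakdown. All figures below are hypothetical, for illustration only; the category names follow the study:

```python
# Hypothetical quality-cost figures (illustrative only).
quality_costs = {
    "prevention": {"new_machinery": 40_000, "inspections": 5_000, "training": 12_000},
    "appraisal": {"finished_product_analysis": 8_000},
    "failure": {
        "internal": {"scrap": 15_000, "rework": 9_000},
        "external": {"placating_irate_customers": 6_000},
    },
}

def total_cost(costs):
    """Recursively sum a nested dictionary of cost items."""
    return sum(total_cost(v) if isinstance(v, dict) else v for v in costs.values())

total = total_cost(quality_costs)  # 95_000 for the figures above
```

Tracking costs at this granularity is what lets prevention and appraisal spending be compared against the failure costs they are meant to reduce.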