
    Integrative Biological Chemistry Program Includes The Use Of Informatics Tools, GIS And SAS Software Applications

    Wesley College is a private, primarily undergraduate minority-serving institution located in the historic district of Dover, Delaware (DE). The College recently revised its baccalaureate biological chemistry program requirements to include a one-semester Physical Chemistry for the Life Sciences course and project-based experiential-learning courses using instrumentation, data collection, data storage, statistical modeling, analysis, visualization, and computational techniques. In this revised curriculum, students begin with a traditional set of biology, chemistry, physics, and mathematics major core requirements, a geographic information systems (GIS) course, and a choice of an instrumental analysis course or a statistical analysis systems (SAS) programming course; students can then add major electives that add depth and value to their future post-graduate specialty areas. Open-source georeferenced census, health, and health-disparity data were coupled with GIS and SAS tools in a public health surveillance project, based on US county ZIP codes, to develop use cases for chronic adult obesity, with income, poverty status, health insurance coverage, education, and age as categorical variables. Across the 48 contiguous states, obesity rates were found to be directly proportional to high poverty and inversely proportional to median income and educational achievement. For the State of Delaware, age and educational attainment were found to be limiting obesity risk factors in its adult population. Furthermore, the 2004-2010 obesity trends showed that in two of the less densely populated Delaware counties, Sussex and Kent, adult obesity was progressing at much higher rates than the national average.
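
    As an illustration of the kind of analysis described above, the following Python sketch regresses a county-level adult obesity rate on socioeconomic predictors, with the categorical variables named in the abstract. The file name and column names are hypothetical placeholders rather than the project's actual data schema, and Python/statsmodels stands in for the SAS and GIS tools used in the course.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical county-level table: one row per county (ZIP-code aggregates),
        # with an obesity rate and the socioeconomic variables named in the abstract.
        counties = pd.read_csv("county_obesity.csv")

        # Ordinary least squares with categorical education and age-group terms,
        # mirroring the use case of relating obesity to income, poverty, and education.
        model = smf.ols(
            "obesity_rate ~ median_income + poverty_rate + insurance_coverage"
            " + C(education_level) + C(age_group)",
            data=counties,
        ).fit()
        print(model.summary())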

    Physical and financial characteristics of high input and low input dairy farms in New Zealand : research project for thesis, to be presented in partial fulfilment of the requirements for the degree of Master of Science (M.Sc.) in Animal Science, Institute of Veterinary Animal and Biomedical Sciences, Massey University, Palmerston North, New Zealand

    In recent years the use of supplements on New Zealand dairy farms has increased, but there is little information about how this extra feed has influenced the dairy system. This research work aimed at analysing the effect of extra feed input on the physical and financial performance of dairy farms. ProfitWatch data corresponding to 915 owner-operated dairy farms were analysed. The data were classified according to dairy season (1998/99, 1999/00, 2000/01, 2001/02), extra feed offered per cow (low input systems: 500kg DM extra feed/cow) and quartiles according to EFS/ha. The definition of extra feed comprised supplements imported, winter grazing and maize grown on the farm. The statistical analysis comprised analysis of variance (ANOVA) and regression analysis done in SAS. In all 4 dairy seasons, high input systems had higher stocking rates (2.7-2.8 vs 2.4-2.5 cows/ha), lower comparative stocking rate (83-86 vs 92-83 kg LWT/t DM), higher milksolids production per cow (293-341 vs 249-295 kg MS/cow) and per hectare (826-921 vs 616-744 kg MS/ha), and higher use of nitrogen fertiliser per hectare (85-116 vs 53-67 kg N/ha/year) than low input systems. During the period of study, milksolids payout increased from $3.58/kg MS in 1998/99 to $5.30/kg MS in 2001/02. High input systems had higher Gross Farm Income per hectare ($3287/ha vs $2374/ha in 1998/99; and $5377/ha vs $4362/ha in 2001/02) and higher Farm Working Expenses per hectare ($2519/ha vs $1760/ha in 1998/99, and $3259/ha vs $2187/ha in 2001/02) than low input systems. There were no significant differences in EFS/ha, Return on Assets (%) and Return on Equity (%) between farms in the 3 feed input systems. Within each feed input system, farms in the top quartile for EFS/ha had higher stocking rates and higher estimated pasture consumed per hectare than their corresponding farm system in the bottom quartile. Regression analysis of all 915 farms showed that, across all farms, the marginal (average of 4 years) response to the extra feed used was 50g MS/cow/kg DM extra feed per cow. But the marginal response per hectare to extra feed was higher (96g MS/ha/kg DM extra feed per hectare) due to associated increases in stocking rate and other inputs. The operating cash surplus per hectare increased by approximately $0.07 to $0.12/kg DM of extra feed used per hectare, but EFS/ha was not significantly affected by these differences in cash operating surplus. Keywords: low, intermediate and high input systems; extra feed
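
    A minimal sketch of the regression and ANOVA steps described above is given below, assuming a hypothetical farm-season table (the file and column names are illustrative, not the ProfitWatch schema), with Python/statsmodels standing in for SAS. The slope of milksolids per hectare on extra feed per hectare estimates the marginal response in kg MS per kg DM.

        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical farm-season records: milksolids, extra feed, season, feed system.
        farms = pd.read_csv("profitwatch_farms.csv")

        # Marginal milksolids response per hectare to extra feed, controlling for season.
        fit = smf.ols("ms_per_ha ~ extra_feed_kg_dm_per_ha + C(season)", data=farms).fit()
        slope = fit.params["extra_feed_kg_dm_per_ha"]  # kg MS per kg DM of extra feed
        print(f"Marginal response: {slope * 1000:.0f} g MS/ha per kg DM extra feed/ha")

        # One-way ANOVA of economic farm surplus across feed-input systems.
        anova = sm.stats.anova_lm(smf.ols("efs_per_ha ~ C(feed_system)", data=farms).fit())
        print(anova)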

    What is the True Cost to stay in the Hospital?

    It is now widely acknowledged, if reluctantly accepted, that many health procedures are extremely expensive. But have you ever wondered which specific factors drive the cost per day charged by a hospital? Through examination, investigation, and evaluation using SAS Enterprise Guide, SAS Enterprise Miner, and Tableau, I have attempted to answer this question. Utilizing a 1.5-million-row data set provided by Rhode Island for the years 2003-2013, I analyzed the elements that could conceivably bear on the cost per day at a hospital. Regressions, decision trees, neural networks, ANOVA, linear models, and numerous visual representations helped in reaching a robust conclusion. Overall, a number of input variables of differing magnitudes, such as age, services provided, year of discharge, and hospital provider, shape the overall cost per day at a hospital.
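
    The sketch below shows one way such an analysis could be set up in Python with scikit-learn rather than SAS Enterprise Miner; the file and column names are placeholders for the Rhode Island discharge data, not its actual layout.

        import pandas as pd
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeRegressor

        # Hypothetical discharge-level table with the inputs named in the abstract.
        visits = pd.read_csv("ri_discharges_2003_2013.csv")
        X = pd.get_dummies(visits[["age", "service", "discharge_year", "provider"]])
        y = visits["cost_per_day"]

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X_train, y_train)
        print("R^2 on held-out discharges:", tree.score(X_test, y_test))

        # Feature importances suggest which inputs most influence cost per day.
        ranked = sorted(zip(X.columns, tree.feature_importances_), key=lambda t: -t[1])
        for name, importance in ranked[:10]:
            print(f"{name}: {importance:.3f}")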

    BCAS: A Web-enabled and GIS-based Decision Support System for the Diagnosis and Treatment of Breast Cancer

    For decades, geographical variations in cancer rates have been observed, but the precise determinants of such geographic differences in breast cancer development are unclear. Various statistical models have been proposed. Applications of these models, however, require that the data be assembled from a variety of sources, converted into the statistical models’ parameters and delivered effectively to researchers and policy makers. A web-enabled and GIS-based system can be developed to provide the needed functionality. This article overviews the conceptual web-enabled and GIS-based system (BCAS), illustrates the system’s use in diagnosing and treating breast cancer, and examines the potential benefits and implications for breast cancer research and practice.

    Selection of Statistical Software for Solving Big Data Problems for Teaching

    The need for analysts with expertise in big data software is becoming more apparent in today’s society. Unfortunately, the demand for these analysts far exceeds the number available. A potential way to combat this shortage is to identify the software sought by employers and to align this with the software taught by universities. This paper will examine multiple data analysis software – Excel add-ins, SPSS, SAS, Minitab, and R – and it will outline the cost, training, statistical methods/tests/uses, and specific uses within industry for each of these software. It will further explain implications for universities and students.

    Selection of Statistical Software for Solving Big Data Problems: A Guide for Businesses, Students, and Universities

    The need for analysts with expertise in big data software is becoming more apparent in today’s society. Unfortunately, the demand for these analysts far exceeds the number available. A potential way to combat this shortage is to identify the software taught in colleges or universities. This article examines four data analysis software packages—Excel add-ins, SPSS, SAS, and R—and outlines the cost, training, and statistical methods/tests/uses for each. It further explains implications for universities and future students.

    Probabilistic methods for seasonal forecasting in a changing climate: Cox-type regression models

    For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative assessment of possible alternative actions. Although the degree of uncertainty associated with CDF estimation could influence decisions, such information is rarely provided. Hence, we propose Cox-type regression models (CRMs) as a statistical framework for making inferences on CDFs in climate science. CRMs were designed for modelling probability distributions rather than just mean or median values. This makes the approach appealing for risk assessments where probabilities of extremes are often more informative than central tendency measures. CRMs are semi-parametric approaches originally designed for modelling risks arising from time-to-event data. Here we extend this original concept to other positive variables of interest beyond the time domain. We also provide tools for estimating CDFs and surrounding uncertainty envelopes from empirical data. These statistical techniques intrinsically account for non-stationarities in time series that might be the result of climate change. This feature makes CRMs attractive candidates to investigate the feasibility of developing rigorous global circulation model (GCM)-CRM interfaces for provision of user-relevant forecasts. To demonstrate the applicability of CRMs, we present two examples for El Niño/Southern Oscillation (ENSO)-based forecasts: the onset date of the wet season (Cairns, Australia) and total wet season rainfall (Quixeramobim, Brazil). This study emphasises the methodological aspects of CRMs rather than discussing merits or limitations of the ENSO-based predictor.
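
    To make the idea concrete, the sketch below fits a Cox proportional hazards model to a positive variable (simulated wet-season rainfall totals) with an ENSO index as covariate and converts the estimated survival function into a CDF. It uses the Python lifelines package and synthetic data, so it is only an illustration of the approach, not the authors' implementation.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        # Synthetic data: wet-season rainfall (mm) with an ENSO-index covariate.
        rng = np.random.default_rng(42)
        n = 200
        enso = rng.normal(size=n)
        rainfall = rng.gamma(shape=4.0, scale=150.0 * np.exp(0.2 * enso))
        df = pd.DataFrame({
            "rainfall_mm": rainfall,
            "enso_index": enso,
            "observed": 1,  # rainfall is fully observed, so no censoring
        })

        # Treat rainfall as the positive "time-to-event" variable and fit the Cox model.
        cph = CoxPHFitter()
        cph.fit(df, duration_col="rainfall_mm", event_col="observed")

        # The survival function S(x) is P(rainfall > x); the forecast CDF is 1 - S(x).
        scenarios = pd.DataFrame({"enso_index": [-1.5, 0.0, 1.5]})
        cdf = 1.0 - cph.predict_survival_function(scenarios)
        print(cdf.tail())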

    HetHetNets: Heterogeneous Traffic Distribution in Heterogeneous Wireless Cellular Networks

    A recent approach to modeling and analysis of supply and demand in heterogeneous wireless cellular networks has been the use of two independent Poisson point processes (PPPs) for the locations of base stations (BSs) and user equipments (UEs). This popular approach has two major shortcomings. First, although the PPP model may be a fitting one for the BS locations, it is less adequate for the UE locations, mainly because the model is not adjustable (tunable) to represent the severity of the heterogeneity (non-uniformity) in the UE locations. Second, the independence assumption between the two PPPs does not capture the often-observed correlation between the UE and BS locations. This paper presents a novel heterogeneous spatial traffic model that allows statistical adjustment. Simple and non-parameterized, yet sufficiently accurate, measures for capturing the traffic characteristics in space are introduced. Only two statistical parameters related to the UE distribution are adjusted: the coefficient of variation (the normalized second moment) of an appropriately defined inter-UE distance measure, which controls the degree of heterogeneity, and the correlation coefficient (the normalized cross-moment) between UE and BS locations, which controls the bias towards the BS locations. This model is used in heterogeneous wireless cellular networks (HetNets) to demonstrate the impact of heterogeneous and BS-correlated traffic on network performance. This network is called a HetHetNet since it has two types of heterogeneity: heterogeneity in the infrastructure (supply), and heterogeneity in the spatial traffic distribution (demand). Comment: JSA
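
    As a toy illustration of a distance-based heterogeneity statistic (not the paper's exact definition), the Python sketch below simulates a PPP of BSs, then compares uniform UEs with UEs clustered around BSs by computing the coefficient of variation of nearest-neighbour inter-UE distances.

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(1)

        def cv_nearest_neighbour(points):
            # Coefficient of variation of nearest-neighbour distances, one simple
            # inter-UE distance measure on the unit square.
            d, _ = cKDTree(points).query(points, k=2)
            nn = d[:, 1]  # distance to the nearest *other* point
            return nn.std() / nn.mean()

        # Base stations: homogeneous PPP on the unit square.
        n_bs = rng.poisson(50)
        bs = rng.uniform(0.0, 1.0, size=(n_bs, 2))

        # Case 1: UEs as an independent homogeneous PPP (no heterogeneity).
        ue_uniform = rng.uniform(0.0, 1.0, size=(rng.poisson(500), 2))

        # Case 2: UEs clustered around randomly chosen BSs (BS-correlated traffic),
        # wrapped back onto the unit square.
        parents = bs[rng.integers(0, n_bs, size=500)]
        ue_clustered = (parents + rng.normal(scale=0.02, size=(500, 2))) % 1.0

        print("CV, uniform UEs:  ", cv_nearest_neighbour(ue_uniform))
        print("CV, clustered UEs:", cv_nearest_neighbour(ue_clustered))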

    Selecting Undergraduate Business Majors

    The paper begins with a brief review of the literature on how business students in the U.S. choose their major, and we list the most popular majors at U.S. universities. We also discuss the factors that influence students’ choices. In our next research project, we will not only use a larger sample size but also draw the sample from several universities to reduce sampling bias. In this paper, we also discuss changing trends among international students. We discuss the large group of Chinese, Indian, and Arabic students, and we support this with literature and graphs. In the next section, we analyze one of the up-and-coming new business majors, “Business Analytics”. We finish the paper with a discussion of the growth of international students at both the graduate and undergraduate levels, and how we will address the shortcomings of this paper in our next project.

    The AGILE Alert System for Gamma-Ray Transients

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) onboard the AGILE space mission. The AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many gamma-ray transients of galactic and extragalactic origin. This work presents the AGILE innovative approach to fast gamma-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe: (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind-search identification of transients within a short processing time, (3) the AGILE procedure for gamma-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. When the appropriate flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, as e-mails, and as push notifications of an application for smartphones and tablets. These alerts are cross-checked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in gamma-ray astrophysics. Comment: 34 pages, 9 figures, 5 tables
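
    As a rough sketch of the post-trial evaluation step (under the usual Wilks approximation, not AGILE's documented pipeline), the Python snippet below converts a likelihood-ratio test statistic into pre- and post-trial Gaussian significances for an assumed number of independent trials.

        from scipy.stats import chi2, norm

        def significances(ts, n_trials, dof=1):
            # Pre-trial p-value from the likelihood-ratio test statistic (Wilks' theorem).
            p_pre = chi2.sf(ts, df=dof)
            # Post-trial p-value after correcting for the number of independent trials.
            p_post = 1.0 - (1.0 - p_pre) ** n_trials
            # Convert both to two-sided Gaussian sigmas.
            return norm.isf(p_pre / 2.0), norm.isf(p_post / 2.0)

        # Example: TS = 30 in a blind search over ~200 independent positions/timescales
        # (both numbers are hypothetical).
        pre_sigma, post_sigma = significances(30.0, n_trials=200)
        print(f"pre-trial: {pre_sigma:.1f} sigma, post-trial: {post_sigma:.1f} sigma")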