3,180 research outputs found

    An Assessment of Factors Important to Legislators in Budget Decisions; How Much Impact Can Agencies Have?

    Budget deliberations represent a dynamic interaction among many actors, including agency officials and legislators. These groups may hold opposing perceptions about the relative importance of various types of information, and legislators likely base their decisions on many priority items that budget officials may or may not be able to influence or control. Through a survey of state legislators, we first determined the relative importance of 27 items in approving budget proposals. Agency officials were then surveyed and asked to rate the degree to which they can influence each of the 27 items. We considered how legislators' party affiliation relates to the type of information they view as important in budget decisions. We then compared the importance ratings of legislators with the impact ratings of budget officials, which led to some recommendations aimed at agency officials.
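    As a minimal sketch of the comparison step described above, the snippet below ranks items by the gap between legislator importance and officials' self-rated ability to influence them. The item names, ratings, and 1-5 scale are hypothetical, not the survey's actual 27 items.

        # Hypothetical illustration of the importance-vs-impact comparison described
        # above; item names and ratings are invented, not the survey's 27 items.
        import pandas as pd

        ratings = pd.DataFrame({
            "item": ["agency performance data", "constituent impact", "governor's priorities"],
            "legislator_importance": [4.2, 4.6, 3.9],  # mean legislator rating (assumed 1-5 scale)
            "official_impact": [3.8, 2.1, 1.5],        # mean self-rated ability of officials to influence
        })

        # Items legislators weight heavily but officials feel little control over
        ratings["gap"] = ratings["legislator_importance"] - ratings["official_impact"]
        print(ratings.sort_values("gap", ascending=False))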

    Predictions of the Compressible Fluid Model and its Comparison to Experimental Measurements of Q Factors and Flexural Resonance Frequencies for Microcantilevers

    The qualitative agreement between experimental measurements of the Q factors and flexural resonance frequencies in air of microcantilevers and calculations based on the compressible fluid model of Van Eysden and Sader (2009) is presented. The Q factors and resonance frequencies observed on two sets of cantilever arrays were slightly lower than those predicted by the model. This is attributed to the individual design and geometry of the microfabricated hinged end of the cantilever beams in the array.
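    For orientation, a minimal sketch of the in-vacuum flexural resonance frequencies of a rectangular cantilever from Euler-Bernoulli beam theory is given below; the compressible-fluid corrections of Van Eysden and Sader are not reproduced, and the dimensions and material values are assumed, not those of the arrays in the study.

        # Illustrative in-vacuum flexural resonance frequencies of a rectangular
        # microcantilever (Euler-Bernoulli beam theory). Dimensions and material
        # properties below are assumptions, not those of the measured arrays.
        import numpy as np

        E, rho = 169e9, 2330.0           # silicon Young's modulus (Pa) and density (kg/m^3)
        L, w, t = 200e-6, 30e-6, 2e-6    # length, width, thickness (m)

        I = w * t**3 / 12                # second moment of area
        A = w * t                        # cross-sectional area
        kL = np.array([1.8751, 4.6941, 7.8548])  # clamped-free mode constants k_n * L

        f_n = (kL**2 / (2 * np.pi * L**2)) * np.sqrt(E * I / (rho * A))
        print(f_n / 1e3, "kHz")          # first three flexural modes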

    Predicting the Draft and Career Success of Tight Ends in the National Football League

    National Football League teams have complex drafting strategies based on college and combine performance that are intended to predict success in the NFL. In this paper, we focus on the tight end position, which is growing in importance as the NFL moves towards a more passing-oriented league. We create separate prediction models for (1) the NFL Draft and (2) NFL career performance based on data available prior to the NFL Draft: college performance, the NFL combine, and physical measures. We use linear regression and recursive partitioning decision trees to predict both NFL draft order and NFL career success from this pre-draft data. With both modeling approaches, we find that the measures most predictive of NFL draft order are not necessarily the most predictive of NFL career success. This finding suggests that current drafting strategies for tight ends can be improved upon. After factoring the salary cost of drafted players into our analysis in order to identify the tight ends with the highest value, we find that size measures (BMI, weight, height) are over-emphasized in the NFL draft.
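    A hedged sketch of the two modeling approaches described above appears below, assuming a hypothetical table of pre-draft measures; the feature names, target definitions, and file name are illustrative, not the authors' actual data set or tuning.

        # Illustrative sketch: fit a linear regression and a recursive-partitioning
        # (CART-style) decision tree to pre-draft measures for two targets, then
        # compare which measures each model leans on. Column names are hypothetical.
        import pandas as pd
        from sklearn.linear_model import LinearRegression
        from sklearn.tree import DecisionTreeRegressor

        df = pd.read_csv("tight_ends.csv")          # hypothetical pre-draft data file
        features = ["college_rec_yards", "forty_time", "bmi", "weight", "height"]
        X = df[features]

        for target in ["draft_pick", "career_value"]:   # draft order vs. career success
            lin = LinearRegression().fit(X, df[target])
            tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, df[target])
            print(target, dict(zip(features, lin.coef_.round(3))))
            print(target, dict(zip(features, tree.feature_importances_.round(3))))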

    Cracking in cement paste induced by autogenous shrinkage

    Detection and quantification of microcracks caused by restrained autogenous shrinkage in high-performance concrete is a difficult task. Available techniques either lack the required resolution or may produce additional cracks that are indistinguishable from the original ones. A recently developed technique allows identification of microcracks while avoiding artefacts induced by unwanted restraint, drying, or temperature variations during sample preparation. Small cylindrical samples of cement paste are cast with steel rods of different diameters in their centre. The rods restrain the autogenous shrinkage of the paste and may cause crack formation. The crack pattern is identified by impregnation with gallium and analyzed by optical and scanning electron microscopy. In this study, a non-linear numerical analysis of the samples was performed. Autogenous strain, elastic modulus, fracture energy, and creep as a function of hydration time were used as inputs in the analysis. Both the experimental results and the numerical analysis showed that samples with larger steel rods had the highest probability of developing microcracks. In addition, the pattern and the width of the observed microcracks showed good agreement with the simulation results.
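    A greatly simplified, illustrative check of the restrained-shrinkage cracking mechanism is sketched below: the stress induced when a rigid inclusion restrains autogenous shrinkage is estimated with a creep-reduced effective modulus and compared to the tensile strength. All property values are assumptions, and the study's actual non-linear analysis with fracture energy is not reproduced.

        # Simplified restrained-shrinkage cracking check. The restrained stress is
        # estimated with an effective (creep-reduced) modulus and compared against
        # tensile strength at each hydration age. All values below are assumed.
        import numpy as np

        t = np.array([1, 3, 7, 14, 28])                        # hydration time (days)
        eps_auto = np.array([50, 150, 250, 320, 380]) * 1e-6   # autogenous shrinkage strain
        E = np.array([10, 18, 24, 27, 30]) * 1e9               # elastic modulus (Pa)
        phi = np.array([1.5, 1.2, 1.0, 0.8, 0.7])              # creep coefficient
        f_t = np.array([1.0, 2.0, 2.8, 3.2, 3.5]) * 1e6        # tensile strength (Pa)

        restraint = 0.8                                        # assumed degree of restraint by the rod
        sigma = restraint * eps_auto * E / (1 + phi)           # effective-modulus stress estimate
        print("cracking indicated at (days):", t[sigma > f_t])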

    TIGER: Toolbox for integrating genome-scale metabolic models, expression data, and transcriptional regulatory networks

    Background: Several methods have been developed for analyzing genome-scale models of metabolism and transcriptional regulation. Many of these methods, such as Flux Balance Analysis, use constrained optimization to predict relationships between metabolic flux and the genes that encode and regulate enzyme activity. Recently, mixed integer programming has been used to encode these gene-protein-reaction (GPR) relationships into a single optimization problem, but these techniques are often of limited generality and lack a tool for automating the conversion of rules to a coupled regulatory/metabolic model. Results: We present TIGER, a Toolbox for Integrating Genome-scale Metabolism, Expression, and Regulation. TIGER converts a series of generalized, Boolean or multilevel rules into a set of mixed integer inequalities. The package also includes implementations of existing algorithms to integrate high-throughput expression data with genome-scale models of metabolism and transcriptional regulation. We demonstrate how TIGER automates the coupling of a genome-scale metabolic model with GPR logic and models of transcriptional regulation, thereby serving as a platform for algorithm development and large-scale metabolic analysis. Additionally, we demonstrate how TIGER's algorithms can be used to identify inconsistencies and improve existing models of transcriptional regulation with examples from the reconstructed transcriptional regulatory network of Saccharomyces cerevisiae. Conclusion: The TIGER package provides a consistent platform for algorithm development and extending existing genome-scale metabolic models with regulatory networks and high-throughput data.
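    The sketch below illustrates the kind of Boolean-rule-to-inequality conversion described above, using the standard mixed integer linearization of AND/OR over binary indicators and a big-M coupling of flux to the rule outcome. It is written against the PuLP modeling library for concreteness and is not TIGER's actual MATLAB interface; the rule and bounds are invented.

        # Illustrative MILP encoding of a GPR-style rule r = (g1 AND g2) OR g3,
        # with the reaction flux v forced to zero when the rule is false.
        # This is a generic sketch, not TIGER's interface.
        from pulp import LpProblem, LpVariable, LpMaximize, LpBinary

        prob = LpProblem("gpr_demo", LpMaximize)
        g1, g2, g3, a, r = (LpVariable(n, cat=LpBinary) for n in ["g1", "g2", "g3", "a", "r"])
        v = LpVariable("v", lowBound=-1000, upBound=1000)   # reaction flux
        M = 1000                                            # big-M bound on |v|

        prob += a <= g1                  # a = g1 AND g2
        prob += a <= g2
        prob += a >= g1 + g2 - 1
        prob += r >= a                   # r = a OR g3
        prob += r >= g3
        prob += r <= a + g3
        prob += v <= M * r               # flux shut off when r = 0
        prob += v >= -M * r

        prob += 1 * v                    # toy objective: maximize flux
        prob += g1 == 1                  # example gene states (g3 knocked out)
        prob += g2 == 1
        prob += g3 == 0
        prob.solve()
        print(v.value(), r.value(), a.value())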

    Optimizations of Pt/SiC and W/Si multilayers for the Nuclear Spectroscopic Telescope Array

    The Nuclear Spectroscopic Telescope Array, NuSTAR, is a NASA-funded Small Explorer Mission (SMEX) scheduled for launch in mid-2011. The spacecraft will fly two co-aligned conical-approximation Wolter-I optics with a focal length of 10 meters. The mirrors will be coated with Pt/SiC and W/Si multilayers to provide broad-band reflectivity from 6 keV up to 78.4 keV. To optimize the mirror coating we use a figure-of-merit procedure developed for grazing incidence optics, which averages the effective area over the energy range and combines an energy weighting function with an angular weighting function to control the shape of the desired effective area. The NuSTAR multilayers are depth graded with a power law, d_i = a/(b + i)^c, and we optimize over the total number of bi-layers N, the power-law index c, and the maximum bi-layer thickness d_max. The result is a design of 10 mirror groups optimized for a flat, even energy response both on and off-axis.
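    As a sketch of the power-law depth grading mentioned above, the snippet below computes the bi-layer thickness sequence d_i = a/(b + i)^c with a and b fixed by prescribing the thickest (top) and thinnest (bottom) bi-layers; the values of N, c, d_max, and d_min are assumptions, not the optimized NuSTAR recipe.

        # Power-law depth-graded multilayer: d_i = a / (b + i)^c, i = 1..N,
        # with a and b chosen so that d_1 = d_max and d_N = d_min.
        # N, c, d_max, d_min below are illustrative values only.
        import numpy as np

        def bilayer_thicknesses(N=200, c=0.25, d_max=12.0, d_min=2.5):  # thicknesses in nm
            eta = (d_max / d_min) ** (1.0 / c)
            b = (N - eta) / (eta - 1.0)      # enforces d_N = d_min
            a = d_max * (b + 1.0) ** c       # enforces d_1 = d_max
            i = np.arange(1, N + 1)
            return a / (b + i) ** c

        d = bilayer_thicknesses()
        print(d[:3], d[-3:])                 # thickest bi-layers on top, thinnest at the bottom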

    Improving Transit Predictions of Known Exoplanets with TERMS

    Transiting planet discoveries have largely been restricted to the short-period or low-periastron-distance regimes due to the bias inherent in the geometric transit probability. Refining planetary orbital parameters reduces the size of the transit windows, making long-period planets feasible targets for photometric follow-up. Here we describe the TERMS project, which is monitoring these host stars at predicted transit times. Comment: 3 pages, 2 figures, to be published in ASP Conf. Proceedings: "Detection and dynamics of transiting exoplanets" 2010, OHP, France (eds.: F. Bouchy, R.F. Díaz, C. Moutou).
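    A minimal sketch of why refined orbital parameters shrink the transit window: the uncertainty of a predicted transit time grows with the number of orbits elapsed since the reference epoch, scaled by the period uncertainty. The ephemeris values below are assumptions, not those of a TERMS target.

        # Propagation of ephemeris uncertainty into a predicted transit time:
        # sigma_Tc(n) = sqrt(sigma_T0^2 + (n * sigma_P)^2) after n orbits.
        # Period, epoch, and uncertainties below are assumed values.
        import numpy as np

        P, sigma_P = 100.0, 0.05     # orbital period and its uncertainty (days)
        sigma_T0 = 0.1               # uncertainty of the reference transit epoch (days)
        n = np.arange(0, 21)         # orbits elapsed since the reference epoch

        sigma_Tc = np.sqrt(sigma_T0**2 + (n * sigma_P) ** 2)
        print("1-sigma window half-width after 20 orbits:", round(sigma_Tc[-1], 3), "days")

        # Refining the period by a factor of 10 shrinks the late-time window ~10x
        print("with refined period:", round(np.sqrt(sigma_T0**2 + (20 * sigma_P / 10) ** 2), 3), "days")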

    2022 Crummer SunTrust Portfolio Recommendations: Crummer Investment Management 23rd Anniversary

    SunTrust (now Truist) endowed this portfolio to provide scholarships for future Crummer students and to give current students a practical, hands-on learning opportunity. This year, we are pleased to be able to disburse $55,000 to be used for scholarships. We are extremely grateful for SunTrust's generosity and investment in higher education. We have all learned a great deal from this experience and the responsibility of managing real money.

    Our first challenge is to establish a portfolio position that takes advantage of economic opportunities while avoiding unnecessary risk and conforming to the Crummer SunTrust Investment Policy Statement (IPS). We are also tasked by the IPS to operate at two levels simultaneously: tactical for the near term, and strategic for the long run. Additionally, this portfolio presents some unusual portfolio management challenges because it trades only once a year, in early April.

    Our tactical approach began with a top-down sector analysis. We established an economic forecast based on research and consultation with economists, including Professor William Seyfried of the Crummer School and Philip Rich of Seaside Bank. We based our equity and fixed income split on that forecast, with a modest allocation to bonds of 10%. The forecast also drove our allocation among the eleven S&P sectors: Communication Services, Consumer Discretionary, Consumer Staples, Energy, Financials, Healthcare, Industrials, Information Technology, Materials, Real Estate, and Utilities. This year, we forecast moderated but still strong economic growth amid inflationary pressures over the next twelve months, and we tilted the allocation towards sectors that should do well in such a macro environment while paying attention to post-pandemic dynamics and the war in Ukraine.

    Our asset class allocation embodies the long-run strategy of the portfolio. The IPS sets asset class ranges from low to moderate risk to keep the portfolio from being whipsawed by transitory market cycles. Our equity allocations entail a reasonable level of risk, consistent with our view that the stock market will outperform the fixed income market as interest rates are expected to rise between now and March 2023. We maintain an allocation to a sector ETF in each sector to ensure diversification. Due to enrollment constraints, we actively manage only five sectors this year, with a limit of two individual stocks in each sector; the remaining sectors are invested 100% in their sector ETF.

    Fixed income is our anchor sector, providing a hedge against the risk of an economic slowdown adversely impacting our equity holdings. Consistent with our projection of an upward-shifting yield curve, we are at the low end of our IPS range for fixed income at 10%, which is the same as last year's allocation and slightly higher than the 9.6% market position on February 28, 2022.

    Furthermore, we have continued to incorporate the theme of Environmental, Social, and Governance (ESG) investing into our portfolio selection process. Whether you believe a high ESG rating signals a company's prospects or that ESG ratings are a popularity contest, the ESG wave is sweeping the equity markets. Regardless of a security's consistency with this theme, all recommendations must be undervalued after rigorous quantitative and qualitative analysis. In other words, our intent is not to maximize the ESG impact of our portfolio but to tilt towards this factor. Specifically, the proposed equity holdings in this year's portfolio have a weighted average FTSE ESG score of 3.38 out of 5, while the S&P 500 holdings have a cap-weighted average score of 3.19.

    Since the onset of the COVID-19 pandemic, we have witnessed two extraordinary and unpredictable years in many respects. Inflation levels not seen in the past 40 years, supply chain problems, and the Russian-Ukrainian war have all contributed to increased uncertainty. We do not intend to simply follow the crowd. Yet, echoing the philosophy of Warren Buffett, "our opinions and beliefs, grounded in economics and guided by all of those who have counseled us," lead us to a strategy that is not significantly different from that of many investors. Even so, we accept responsibility for our investment decisions. We are investing for the long term and have been conservative in our forecasts and recommendations. Simultaneously, in the short term, we are mindful of the need to protect the portfolio's commitment to scholarships.
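    A small illustrative computation of the portfolio-weighted ESG comparison quoted above is sketched below; the tickers, weights, and scores are hypothetical, not this year's holdings.

        # Weighted-average ESG score of a set of holdings, as in the comparison of the
        # portfolio's 3.38 against the S&P 500's cap-weighted 3.19. Values are invented.
        import pandas as pd

        holdings = pd.DataFrame({
            "ticker": ["AAA", "BBB", "CCC"],
            "weight": [0.50, 0.30, 0.20],     # portfolio weights, summing to 1
            "ftse_esg": [3.6, 3.1, 3.2],      # FTSE ESG scores on a 0-5 scale
        })

        portfolio_esg = (holdings["weight"] * holdings["ftse_esg"]).sum()
        print(round(portfolio_esg, 2))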

    DNA Fingerprinting of Mycobacterium leprae Strains Using Variable Number Tandem Repeat (VNTR) - Fragment Length Analysis (FLA)

    The study of the transmission of leprosy is particularly difficult since the causative agent, Mycobacterium leprae, cannot be cultured in the laboratory. The only sources of the bacteria are leprosy patients, and experimentally infected armadillos and nude mice. Thus, many of the methods used in modern epidemiology are not available for the study of leprosy. Despite an extensive global drug treatment program for leprosy implemented by the WHO [1], leprosy remains endemic in many countries, with approximately 250,000 new cases each year [2]. The entire M. leprae genome has been mapped [3,4] and many loci have been identified that have repeated segments of 2 or more base pairs (called micro- and minisatellites) [5]. Clinical strains of M. leprae may vary in the number of tandemly repeated segments (short tandem repeats, STR) at many of these loci [5,6,7]. Variable number tandem repeat (VNTR) [5] analysis has been used to distinguish different strains of the leprosy bacilli. Some of the loci appear to be more stable than others, showing less variation in repeat numbers, while others seem to change more rapidly, sometimes in the same patient. While the variability of certain VNTRs has raised questions regarding their suitability for strain typing [7,8,9], the emerging data suggest that analyzing multiple loci, which are diverse in their stability, can serve as a valuable epidemiological tool. Multiple locus VNTR analysis (MLVA) [10] has been used to study leprosy evolution and transmission in several countries including China [11,12], Malawi [8], the Philippines [10,13], and Brazil [14]. MLVA involves multiple steps. First, bacterial DNA is extracted along with host tissue DNA from clinical biopsies or slit skin smears (SSS) [10]. The desired loci are then amplified from the extracted DNA via polymerase chain reaction (PCR). Fluorescently-labeled primers for 4-5 different loci are used per reaction, with 18 loci being amplified in a total of four reactions [10]. The PCR products may be subjected to agarose gel electrophoresis to verify the presence of the desired DNA segments, and then submitted for fluorescent fragment length analysis (FLA) using capillary electrophoresis. DNA from armadillo-passaged bacteria with a known number of repeat copies for each locus is used as a positive control. The FLA chromatograms are then examined using Peak Scanner software, and each fragment length is converted to a number of VNTR copies (the allele). Finally, the VNTR haplotypes are analyzed for patterns and, when combined with patient clinical data, can be used to track the distribution of strain types.
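    As a minimal sketch of the final conversion step described above (fragment length to VNTR copy number), the snippet below subtracts the locus's flanking sequence from the measured fragment length and divides by the repeat unit size. The locus parameters and fragment length are hypothetical, not published M. leprae values.

        # Convert an FLA fragment length to a VNTR copy number for one locus:
        # copies = (fragment length - flanking sequence amplified by the primers) / repeat unit length.
        # The example locus parameters below are hypothetical.
        def vntr_copies(fragment_len_bp, flank_len_bp, repeat_unit_bp):
            return round((fragment_len_bp - flank_len_bp) / repeat_unit_bp)

        # e.g. a 156 bp fragment at a locus with 96 bp of flanking sequence and a
        # 12 bp repeat unit corresponds to (156 - 96) / 12 = 5 copies
        print(vntr_copies(156, 96, 12))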