
    iBusy: Research on children, families, and smartphones

    Within the past 10 years, mobile devices have been widely adopted by adults and are now present in the lives of almost all U.S. children. While phones are common, our understanding of what effect this technology has upon children's development is lagging. Bioecological theory and attachment theory suggest that this new technology may be disruptive, especially to the degree to which it interferes with the parent-child relationship. This article reflects a National Organization for Human Services conference presentation and shares preliminary results from semi-structured interviews conducted with 18 youth, ages 7 through 11. Only four of the eighteen interviewees voiced any negative thoughts concerning their parents' use of mobile devices. However, those who reported feeling ignored by their parents experienced the negative emotions deeply. Themes that emerged from analysis of transcripts included devices as tools and boundaries.

    Systematic procedural and sensitivity analysis of the pattern informatics method for forecasting large (M > 5) earthquake events in southern California

    Recent studies in the literature have introduced a new approach to earthquake forecasting based on representing the space-time patterns of localized seismicity by a time-dependent system state vector in a real-valued Hilbert space and deducing information about future space-time fluctuations from the phase angle of the state vector. While the success rate of this Pattern Informatics (PI) method has been encouraging, the method is still in its infancy. Procedural analysis, statistical testing, parameter sensitivity investigation and optimization all still need to be performed. In this paper, we attempt to optimize the PI approach by developing quantitative values for "predictive goodness" and analyzing possible variations in the proposed procedure. In addition, we attempt to quantify the systematic dependence on the quality of the input catalog of historic data and develop methods for combining catalogs from regions of different seismic rates.
    Comment: 39 pages, 4 tables, 9 figures. Submitted to Pure and Applied Geophysics on 30 November 200
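    The core PI idea described above is that cells whose seismicity rate changes anomalously (whether it rises or falls) are flagged as candidate sites of future large events. A minimal sketch of that squared-change-in-normalized-rate calculation follows; the function name, the binary-count inputs, and the peak-relative threshold are illustrative assumptions, and the published method additionally averages over reference times and works with Hilbert-space state vectors.

```python
# Hypothetical sketch of a pattern-informatics-style hotspot map.
# counts_ref / counts_recent: earthquake counts per grid cell in a
# reference window and a recent window.
def pi_hotspots(counts_ref, counts_recent, threshold=0.5):
    def normalize(x):
        # Scale cell counts to zero mean, unit variance across the grid
        n = len(x)
        mean = sum(x) / n
        std = (sum((v - mean) ** 2 for v in x) / n) ** 0.5 or 1.0
        return [(v - mean) / std for v in x]

    a, b = normalize(counts_ref), normalize(counts_recent)
    # Squared change in the normalized rate is taken as proportional to
    # the relative probability of a future large event in that cell
    dp = [(y - x) ** 2 for x, y in zip(a, b)]
    peak = max(dp)
    # Flag cells whose change exceeds a fraction of the peak change
    return [i for i, v in enumerate(dp) if v >= threshold * peak]
```

    On a four-cell toy grid with a uniform reference window and one anomalous cell in the recent window, only the anomalous cell is flagged.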

    Incorporating safety into targeted pavement friction data collection and maintenance procedures

    The objective of this research was to develop a methodology for targeted pavement friction data collection based on the analysis of weather-related crashes. Furthermore, the aim was to identify threshold values of pavement friction characteristics indicating a significant impact on safety and prompting the need for maintenance and improvements. Spatial analysis using the Local Moran's I statistic identified hotspots where pavement friction data were collected. A master database was assembled including Wisconsin State Trunk Network (STN) road attributes, hotspots of weather-related crashes, and pavement friction data collected based on hotspot analysis. The analysis results provide evidence in support of hotspot analysis as a viable procedure for targeted pavement friction data collection to enable efficiency and cost reductions. Classification tree analysis using the GUIDE (Generalized, Unbiased, Interaction Detection and Estimation) algorithm was used to further explore the relationship between pavement friction characteristics and safety. Statistically significant hotspots were observed below a pavement friction number of approximately 57 and very high hotspots below a pavement friction number of approximately 42. The results indicate that pavement friction thresholds identified in the literature between 20 and 32 may be too low and that safety may be impacted at friction numbers as high as the forties. The results also show differences in friction and safety for various types of pavement surfaces. The use of weather-related crashes provides a data-driven and cost-effective method of prioritizing locations for pavement friction data collection and maintenance. Results from this research can be readily used in initial steps of systemic road safety management procedures by practitioners.
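    The Local Moran's I statistic mentioned above flags segments whose crash counts cluster with similar values nearby (high-high clusters are hotspot candidates). A minimal sketch follows, assuming a simple row-standardized binary adjacency between road segments; the study's actual spatial weights and significance testing are not specified in the abstract.

```python
# Illustrative Local Moran's I for crash counts on road segments.
# x: crash count per segment; neighbors[i]: indices adjacent to i.
def local_morans_i(x, neighbors):
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n  # population variance
    out = []
    for i in range(n):
        # Row-standardized weights: each neighbor weighted 1/k
        w = 1.0 / len(neighbors[i]) if neighbors[i] else 0.0
        # Spatial lag: weighted deviations of neighboring counts
        lag = sum((x[j] - mean) * w for j in neighbors[i])
        # Positive I: value clusters with similar neighbors
        out.append((x[i] - mean) / m2 * lag)
    return out
```

    Positive values where counts are high indicate high-high clusters (hotspots); positive values where counts are low indicate low-low clusters, so the crash counts themselves must be checked alongside the statistic.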

    Effect of Three Initial Implant Programs with a Common Terminal Revalor®-200 on Feedlot Performance and Carcass Traits of Weaned Steers

    A commercial feedlot study utilizing 1,350 calf-fed steers (initial BW = 623 lb; ±23 lb) compared three initial implant strategies: Revalor®-IS (day 1), Revalor®-IS (day 1) and Revalor®-200 (day 67), or Revalor®-XS (day 1). Each initial implant strategy was followed by a terminal Revalor®-200 implant (day 133) to determine effects on performance and carcass traits. No differences in final body weight, intake, gain, or feed conversion were observed on either a live or carcass-adjusted basis. There were also no differences in hot carcass weight, USDA quality grade, or USDA yield grade. Results from this study suggest initial implant strategy has minimal impact on feedlot and carcass performance when followed with a terminal Revalor®-200 implant.

    Sustainability effects of next-generation intersection control for autonomous vehicles

    Transportation sustainability is adversely affected by recurring traffic congestion, especially at urban intersections. Frequent vehicle deceleration and acceleration caused by stop-and-go behaviour at congested intersections adversely impacts energy consumption and ambient air quality. The availability of maturing vehicle technologies such as autonomous vehicles and Vehicle-To-Vehicle (V2V) / Vehicle-To-Infrastructure (V2I) communications makes it technically feasible to develop solutions that reduce vehicle stops at intersections and hence enhance the sustainability of intersections. This paper presents a next-generation intersection control system for autonomous vehicles, named ACUTA. ACUTA employs an enhanced reservation-based control algorithm that controls the passing sequence of autonomous vehicles at an intersection. Specifically, the intersection is divided into n-by-n tiles. An intersection controller reserves certain time-space for each vehicle and assures that no conflict exists between reservations. The algorithm was modelled in the microscopic traffic simulation platform VISSIM. The modelling of the ACUTA algorithm, as well as enhancement strategies to minimize vehicle intersection stops and hence emissions and energy consumption, is discussed in the paper. Sustainability benefits offered by this next-generation intersection were evaluated and compared with traditional intersection control strategies. The evaluation reveals that multi-tile ACUTA reduces carbon monoxide (CO) and Particulate Matter (PM) 2.5 emissions by about 5% under low to moderate volume conditions and by about 3% under high volume conditions. Meanwhile, energy consumption is reduced by about 4% under low to moderate volume conditions and by about 12% under high volume conditions. Compared with four-way stop control, single-tile ACUTA reduces CO and PM 2.5 emissions as well as energy consumption by about 15% under all prevailing volume conditions. These findings validate the sustainability benefits of employing next-generation vehicle technologies in intersection traffic control. In addition, extending ACUTA to the corridor level is explored in the paper.
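    The reservation check at the heart of such tile-based control can be sketched very compactly: the controller books space-time cells (tile row, tile column, time step) and grants a request only if none of its cells is already taken. The class and method names below are illustrative assumptions; tile geometry, vehicle dynamics, and the VISSIM integration are abstracted away.

```python
# Minimal sketch of a reservation-based ("tile") intersection
# controller in the spirit of ACUTA.
class IntersectionController:
    def __init__(self, n):
        self.n = n                 # intersection divided into n-by-n tiles
        self.reserved = set()      # booked (row, col, time_step) cells

    def request(self, tiles_by_time):
        """tiles_by_time: iterable of (row, col, t) cells a vehicle
        would occupy while crossing. Grants only if conflict-free."""
        cells = set(tiles_by_time)
        if cells & self.reserved:
            return False           # conflict: vehicle must slow/retry
        self.reserved |= cells     # book the space-time cells
        return True
```

    Because reservations are checked in space *and* time, two vehicles may cross the same tile at different time steps, which is what lets the scheme avoid full stops that a stop-controlled intersection would force.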

    Astrometry with the Keck-Interferometer: the ASTRA project and its science

    The sensitivity and astrometry upgrade ASTRA of the Keck Interferometer is introduced. After a brief overview of the underlying interferometric principles, the technology and concepts of the upgrade are presented. The interferometric dual-field technology of ASTRA will provide the KI with the means to observe two objects simultaneously and measure the distance between them with a precision eventually better than 100 μas. This astrometric functionality of ASTRA will add a unique observing tool to fields of astrophysical research as diverse as exo-planetary kinematics, binary astrometry, and the investigation of stars accelerated by the massive black hole in the center of the Milky Way, as discussed in this contribution.
    Comment: 22 pages, 10 figures (low resolution), contribution to the summer school "Astrometry and Imaging with the Very Large Telescope Interferometer", 2-13 June, 2008, Keszthely, Hungary, corrected author list

    Planetary Candidates Observed by Kepler. VIII. A Fully Automated Catalog With Measured Completeness and Reliability Based on Data Release 25

    We present the Kepler Object of Interest (KOI) catalog of transiting exoplanets based on searching four years of Kepler time series photometry (Data Release 25, Q1-Q17). The catalog contains 8054 KOIs of which 4034 are planet candidates with periods between 0.25 and 632 days. Of these candidates, 219 are new and include two in multi-planet systems (KOI-82.06 and KOI-2926.05), and ten high-reliability, terrestrial-size, habitable zone candidates. This catalog was created using a tool called the Robovetter which automatically vets the DR25 Threshold Crossing Events (TCEs, Twicken et al. 2016). The Robovetter also vetted simulated data sets and measured how well it was able to separate TCEs caused by noise from those caused by low signal-to-noise transits. We discuss the Robovetter and the metrics it uses to sort TCEs. For orbital periods less than 100 days the Robovetter completeness (the fraction of simulated transits that are determined to be planet candidates) across all observed stars is greater than 85%. For the same period range, the catalog reliability (the fraction of candidates that are not due to instrumental or stellar noise) is greater than 98%. However, for low signal-to-noise candidates between 200 and 500 days around FGK dwarf stars, the Robovetter is 76.7% complete and the catalog is 50.5% reliable. The KOI catalog, the transit fits and all of the simulated data used to characterize this catalog are available at the NASA Exoplanet Archive.
    Comment: 61 pages, 23 figures, 9 tables, accepted to The Astrophysical Journal Supplement Series
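    The two catalog metrics defined in the abstract are simple fractions once the simulated-data counts are in hand: completeness is the fraction of injected transits the vetter keeps, and reliability is one minus the estimated fraction of candidates attributable to noise. The sketch below uses invented counts; the actual DR25 analysis derives the noise estimate from inverted and scrambled light curves rather than taking it as a given number.

```python
# Back-of-envelope sketch of the two catalog metrics.
def completeness(injected_total, injected_passed):
    """Fraction of simulated (injected) transits classified as
    planet candidates by the vetter."""
    return injected_passed / injected_total

def reliability(n_candidates, n_false_alarms_expected):
    """Fraction of catalog candidates not attributable to noise,
    given an estimate of how many noise events slipped through."""
    return 1.0 - n_false_alarms_expected / n_candidates
```

    With, say, 850 of 1000 injections recovered and 4 expected noise events among 200 candidates, the metrics echo the >85% completeness and >98% reliability quoted for short periods.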

    Analytical Challenges and Metrological Approaches to Ensuring Dietary Supplement Quality: International Perspectives

    The increased utilization of metrology resources and expanded application of its approaches in the development of internationally agreed-upon measurements can lay the basis for regulatory harmonization, support reproducible research, and advance scientific understanding, especially of dietary supplements and herbal medicines. Yet, metrology is often underappreciated and underutilized in dealing with the many challenges presented by these chemically complex preparations. This article discusses the utility of applying rigorous analytical techniques and adopting metrological principles more widely in studying dietary supplement products and ingredients, particularly medicinal plants and other botanicals. An assessment of current and emerging dietary supplement characterization methods is provided, including targeted and non-targeted techniques, as well as data analysis and evaluation approaches, with a focus on chemometrics, toxicity, dosage form performance, and data management. Quality assessment, statistical methods, and optimized methods for data management are also discussed. Case studies provide examples of applying metrological principles in thorough analytical characterization of supplement composition to clarify their health effects. A new frontier for metrology in dietary supplement science is described, including opportunities to improve methods for analysis and data management, development of relevant standards and good practices, and communication of these developments to researchers and analysts, as well as to regulatory and policy decision makers in the public and private sectors. The promotion of closer interactions between analytical, clinical, and pharmaceutical scientists who are involved in research and product development with metrologists who develop standards and methodological guidelines is critical to advance research on dietary supplement characterization and health effects.

    IMRT commissioning: multiple institution planning and dosimetry comparisons, a report from AAPM Task Group 119.

    AAPM Task Group 119 has produced quantitative confidence limits as baseline expectation values for IMRT commissioning. A set of test cases was developed to assess the overall accuracy of planning and delivery of IMRT treatments. Each test uses contours of targets and avoidance structures drawn within rectangular phantoms. These tests were planned, delivered, measured, and analyzed by nine facilities using a variety of IMRT planning and delivery systems. Each facility had passed the Radiological Physics Center credentialing tests for IMRT. The agreement between the planned and measured doses was determined using ion chamber dosimetry in high and low dose regions, film dosimetry on coronal planes in the phantom with all fields delivered, and planar dosimetry for each field measured perpendicular to the central axis. The planar dose distributions were assessed using gamma criteria of 3%/3 mm. The mean values and standard deviations were used to develop confidence limits for the test results using the concept confidence limit = |mean| + 1.96σ. Other facilities can use the test protocol and results as a basis for comparison to this group. Locally derived confidence limits that substantially exceed these baseline values may indicate the need for improved IMRT commissioning.
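    The confidence-limit formula above, CL = |mean| + 1.96σ, is straightforward to compute from a set of signed planned-versus-measured dose differences. A minimal sketch follows; the sample data are invented, and the use of the sample (n-1) standard deviation is an assumption of this sketch.

```python
# TG-119-style confidence limit: CL = |mean| + 1.96 * sigma,
# computed from signed percent dose differences.
def confidence_limit(diffs):
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample standard deviation (n - 1 in the denominator)
    sigma = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return abs(mean) + 1.96 * sigma
```

    Note that the absolute value is taken of the mean, not of the individual differences, so a facility with offsetting positive and negative errors still sees the spread reflected through the 1.96σ term.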