
    Developing an integrated technology roadmapping process to meet regional technology planning needs: the e-bike pilot study

    Smart grid is a promising class of new technologies offering many potential benefits for electric utility systems, including smart appliances that can communicate with power systems and help to better match supply and demand. Additional services include the ability to better integrate growing supplies of renewable energy and to perform a variety of value-added services on the grid. However, a number of challenges must be overcome to achieve these benefits. Many utility systems have substantial regulatory structures that make business processes and technology innovation substantially different from those in other industries. Due to complex histories of regulatory and deregulatory efforts, and due to what some economists consider natural monopoly characteristics of the industry, such regulatory structures are unlikely to change in the immediate future. Therefore, innovation within these industries, including the development of the smart grid, will require an understanding of such regulatory and policy frameworks, development of appropriate business models, and adaptation of technologies to fit these emerging requirements. Technology roadmapping may be a useful method of planning this type of future development within the smart grid sector, but such roadmaps would require a high level of integrated thinking regarding technology, business, and regulatory and policy considerations. This research provides an initial examination of the process for creating such an integrated technology roadmapping and assessment process. It proposes to build upon previous research in the Pacific Northwest and create a more robust technology planning process that allows key variables to be tested and different pathways to be explored.

    Maximizing Educator Enhancement: Aligned Seminar And Online Professional Development

    Professional development and learning (PDL) has a long history in seminar-like models, as well as in more educator-personalized delivery approaches. The question is whether an intentionally coordinated, integrated combination of the two PDL approaches has the best impact for educators, as quantified by improved student performance. Contrasts between baseline and post-program performance levels showed 19% gains in Reading and 24% gains in Math, significantly beyond expectation. Analyses for Title 1 schools showed significant shrinkage of performance gaps relative to non-Title 1 schools. These gains outpaced those found for either PDL approach alone, indicating that educational leaders would be wise to implement intentionally aligned and coordinated approaches combining PDL seminars with online, on-demand PDL.

    Debye relaxation in high magnetic fields

    Dielectric relaxation is universal in characterizing polar liquids and solids, insulators, and semiconductors, and the theoretical models are well developed. However, in high magnetic fields, previously unknown aspects of dielectric relaxation can be revealed and exploited. Here, we report low-temperature dielectric relaxation measurements in lightly doped silicon in high dc magnetic fields B both parallel and perpendicular to the applied ac electric field E. For B ∥ E, we observe a temperature- and magnetic-field-dependent dielectric dispersion ε(ω) characteristic of conventional Debye relaxation, where the free carrier concentration depends on thermal dopant ionization, magnetic freeze-out, and/or magnetic localization effects. However, for B ⊥ E, anomalous dispersion emerges in ε(ω) with increasing magnetic field. It is shown that the Debye formalism can be simply extended by adding the Lorentz force to describe the general response of a dielectric in crossed magnetic and electric fields. Moreover, we predict and observe a new transverse dielectric response E_H ⊥ B ⊥ E not previously described in magneto-dielectric measurements. The new formalism allows the determination of the mobility and the ability to discriminate between magnetic localization/freeze-out and Lorentz force effects in the magneto-dielectric response.
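For reference, the conventional Debye dispersion invoked above has the standard single-relaxation-time form. The symbols below (static permittivity ε_s, high-frequency permittivity ε_∞, relaxation time τ) are the textbook notation, not expressions reproduced from this paper, whose Lorentz-force extension is not shown here:

```latex
% Standard Debye dispersion with a single relaxation time \tau:
\epsilon(\omega) = \epsilon_\infty + \frac{\epsilon_s - \epsilon_\infty}{1 + i\omega\tau}
% Real and imaginary parts:
\epsilon'(\omega)  = \epsilon_\infty + \frac{\epsilon_s - \epsilon_\infty}{1 + \omega^2\tau^2},
\qquad
\epsilon''(\omega) = \frac{(\epsilon_s - \epsilon_\infty)\,\omega\tau}{1 + \omega^2\tau^2}
```

The paper's extension amounts to adding the Lorentz force qv × B to the carrier equation of motion before deriving the dispersion, which is what produces the anomalous transverse response in crossed fields.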

    Micromanipulation of InP lasers with optoelectronic tweezers for integration on a photonic platform

    The integration of light sources on a photonic platform is a key aspect of the fabrication of self-contained photonic circuits with a small footprint, and it does not yet have a definitive solution. Several approaches are being actively researched for this purpose. In this work we propose optoelectronic tweezers for the manipulation and integration of light sources on a photonic platform and report the positional and angular accuracy of the micromanipulation of standard Fabry-Pérot InP semiconductor laser dies. These lasers are over three orders of magnitude larger in volume than any previously assembled with optofluidic techniques, and the fact that they are industry-standard lasers makes them significantly more useful than previously assembled microdisk lasers. We measure the accuracy to be 2.5 ± 1.4 µm and 1.4 ± 0.4° and conclude that optoelectronic tweezers are a promising technique for the micromanipulation and integration of optoelectronic components in general and semiconductor lasers in particular.

    From Burdens To Benefits: The Societal Impact Of PDL-Enriched, Efficacy-Enhanced Educators

    Societies continue to absorb increasing costs for helping citizens unable to achieve at optimal levels. Building on past research, we project educational benefits to offset current societal burdens through enhanced educator capabilities. The studies reviewed show that participation in a high-impact professional development and learning (PDL) solution resulted in improved student performance, reduced dropout and disciplinary rates, increased college-bound rates, and lower teacher turnover. Computations show that generalizing such impacts should trade societal burdens for benefits of between $3.7 billion and $6.9 billion within the first year. Cumulatively, within 20 years the burdens converted to benefits are projected to exceed $85 billion. Enhanced educator capabilities will substantively reduce the need for and cost of societal programs, replacing them with tangible benefits to all.
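As a rough consistency check on the figures above: the abstract's projection model is not given, so the sketch below simply assumes a constant annual benefit at the reported first-year range and accumulates it over 20 years. All assumptions are illustrative, not the authors' method.

```python
# Illustrative check: cumulative 20-year benefit if the first-year
# benefit range ($3.7B-$6.9B) simply repeated each year (an assumption;
# the paper's actual projection model is not stated in the abstract).
low_first_year_bn, high_first_year_bn = 3.7, 6.9  # $ billions per year
years = 20

low_total = low_first_year_bn * years    # flat low-end cumulative, $B
high_total = high_first_year_bn * years  # flat high-end cumulative, $B
print(f"20-year cumulative range (flat benefits): ${low_total:.0f}B - ${high_total:.0f}B")
```

Under this flat assumption, mid-to-high values of the reported range already exceed the $85 billion 20-year figure, while the low end falls short, suggesting the authors' projection includes some growth in annual benefits.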

    Safeguard: Progress and Test Results for a Reliable Independent On-Board Safety Net for UAS

    As demands increase to use unmanned aircraft systems (UAS) for a broad spectrum of commercial applications, regulatory authorities are examining how to safely integrate them without compromising safety or disrupting traditional airspace operations. For small UAS, several operational rules have been established; e.g., do not operate beyond visual line-of-sight, do not fly within five miles of a commercial airport, do not fly above 400 feet above ground level. Enforcing these rules is challenging for UAS, as evidenced by the number of incident reports received by the Federal Aviation Administration (FAA). This paper reviews the development of an onboard system - Safeguard - designed to monitor and enforce conformance to a set of operational rules defined prior to flight (e.g., geospatial stay-out or stay-in regions, speed limits, and altitude constraints). Unlike typical geofencing or geo-limitation functions, Safeguard operates independently of the off-the-shelf UAS autopilot and is designed in a way that can be realized by a small set of verifiable functions to simplify compliance with existing standards for safety-critical systems (e.g. for spacecraft and manned commercial transportation aircraft systems). A framework is described that decouples the system from any other devices on the UAS as well as introduces complementary positioning source(s) for applications that require integrity and availability beyond what can be provided by the Global Positioning System (GPS). This paper summarizes the progress and test results for Safeguard research and development since presentation of the design concept at the 35th Digital Avionics Systems Conference (DASC '16). 
Significant accomplishments include completion of software verification and validation in accordance with NASA standards for spacecraft systems (to Class B), development of improved hardware prototypes, development of a simulation platform that allows for hardware-in-the-loop testing and fast-time Monte Carlo evaluations, and flight testing on multiple air vehicles. Integration testing with NASA's UAS Traffic Management (UTM) service-oriented architecture was also demonstrated.
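The stay-in regions and altitude constraints that Safeguard monitors can be illustrated with a minimal conformance predicate. This is a hypothetical sketch, not NASA's verified implementation; the function names, ray-casting approach, and the 400 ft default ceiling are assumptions for illustration:

```python
# Hypothetical sketch of a geofence conformance check of the kind
# Safeguard evaluates: is the vehicle inside its stay-in region and
# below its altitude ceiling? Not the actual verified flight software.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the point (x, y) inside the closed polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def conforms(x, y, alt_ft, stay_in, alt_ceiling_ft=400.0):
    """True iff the position is inside the stay-in region and at or below the ceiling."""
    return point_in_polygon(x, y, stay_in) and alt_ft <= alt_ceiling_ft

# Example: a simple 100 m x 100 m stay-in square (local coordinates).
stay_in = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0), (0.0, 100.0)]
print(conforms(50.0, 50.0, 150.0, stay_in))   # inside region, below ceiling
print(conforms(150.0, 50.0, 150.0, stay_in))  # outside region
```

A real monitor would evaluate such predicates against geodetic coordinates with buffer margins and predicted (not just current) position, which is part of what makes the verified implementation nontrivial.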

    UAS Autonomous Hazard Mitigation through Assured Compliance with Conformance Criteria

    The behavior of a drone depends on the integrity of the data it uses and the reliability of the avionics systems that process that data to affect the operation of the aircraft. Commercial unmanned aircraft systems frequently rely on commercial off-the-shelf and open-source avionics components and data sources whose reliability and integrity are not easily assured. To mitigate failure events for aircraft that do not comply with conventional aviation safety standards, operational limitations are typically prescribed by regulators. Part 107 of the Federal Aviation Regulations serves as a good example of operational limitations that mitigate risk for small unmanned aircraft systems. These limitations, however, restrict growth possibilities for the industry. Any reasonable path toward achieving routine operation of all types of drones will have to address the need for assurance of avionics systems, especially their software. This paper discusses the possibility of strategically using assured systems as a stepping stone to routine operation of drones. A specimen system for assured geofencing, called Safeguard, is described as an example of such a stepping stone.

    Beyond a warming fingerprint: individualistic biogeographic responses to heterogeneous climate change in California.

    Understanding recent biogeographic responses to climate change is fundamental for improving our predictions of likely future responses and guiding conservation planning at both local and global scales. Studies of observed biogeographic responses to 20th century climate change have principally examined effects related to ubiquitous increases in temperature - collectively termed a warming fingerprint. Although the importance of changes in other aspects of climate - particularly precipitation and water availability - is widely acknowledged from a theoretical standpoint and supported by paleontological evidence, we lack a practical understanding of how these changes interact with temperature to drive biogeographic responses. Further complicating matters, differences in life history and ecological attributes may lead species to respond differently to the same changes in climate. Here, we examine whether recent biogeographic patterns across California are consistent with a warming fingerprint. We describe how various components of climate have changed regionally in California during the 20th century and review empirical evidence of biogeographic responses to these changes, particularly elevational range shifts. Many responses to climate change do not appear to be consistent with a warming fingerprint, with downslope shifts in elevation being as common as upslope shifts across a number of taxa and many demographic and community responses being inconsistent with upslope shifts. We identify a number of potential direct and indirect mechanisms for these responses, including the influence of aspects of climate change other than temperature (e.g., the shifting seasonal balance of energy and water availability), differences in each taxon's sensitivity to climate change, trophic interactions, and land-use change. 
Finally, we highlight the need to move beyond a warming fingerprint in studies of biogeographic responses by considering a more multifaceted view of climate, emphasizing local-scale effects, and including a priori knowledge of relevant natural history for the taxa and regions under study.

    Assessing U.S. Food Wastage and Opportunities for Reduction

    Reducing food wastage is one of the key strategies to combat hunger and sustainably feed the world. We present a comprehensive analysis of available data, despite uncertainties due to data limitations, indicating that the U.S. loses at least 150 million metric tonnes (MMT) of food between farm and fork annually, of which about 70 MMT is edible food loss. Currently, less than 2% of the edible food loss is recovered for human consumption. A reasonably attainable goal of reducing food waste at the source by 20% would save more food than the annual increase in total food production and would feed millions of people. This is an opportunity of significant magnitude, offering food security, resource, and environmental benefits with few negatives. Seizing this opportunity requires technological innovation, policy intervention, and public outreach. This U.S.-based analysis is pertinent to other mid- to high-income countries.
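The headline quantities above can be combined in a quick back-of-envelope calculation. This sketch applies the 20% source reduction to the edible loss figure, which is an interpretive assumption; the paper's full accounting is more detailed:

```python
# Back-of-envelope arithmetic using the abstract's headline figures.
# Applying the 20% reduction to edible loss is an assumption made here
# for illustration, not necessarily the paper's exact baseline.
total_loss_mmt = 150.0   # total U.S. farm-to-fork food loss, MMT/yr
edible_loss_mmt = 70.0   # edible portion of that loss, MMT/yr
recovery_rate = 0.02     # <2% of edible loss currently recovered
source_reduction = 0.20  # proposed reduction of waste at the source

recovered_mmt = recovery_rate * edible_loss_mmt  # upper bound on current recovery
saved_mmt = source_reduction * edible_loss_mmt   # food saved by a 20% cut

print(f"currently recovered: < {recovered_mmt:.1f} MMT/yr")
print(f"saved by 20% source reduction: {saved_mmt:.1f} MMT/yr")
```

Even under this conservative reading, the roughly 14 MMT/yr saved dwarfs the under 1.4 MMT/yr currently recovered for human consumption.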

    Robust Weak-lensing Mass Calibration of Planck Galaxy Clusters

    In light of the tension in cosmological constraints reported by the Planck team between their SZ-selected cluster counts and Cosmic Microwave Background (CMB) temperature anisotropies, we compare the Planck cluster mass estimates with robust, weak-lensing mass measurements from the Weighing the Giants (WtG) project. For the 22 clusters in common between the Planck cosmology sample and WtG, we find an overall mass ratio of ⟨M_Planck/M_WtG⟩ = 0.688 ± 0.072. Extending the sample to clusters not used in the Planck cosmology analysis yields a consistent value of ⟨M_Planck/M_WtG⟩ = 0.698 ± 0.062 from 38 clusters in common. Identifying the weak-lensing masses as proxies for the true cluster mass (on average), these ratios are ~1.6σ lower than the default mass bias of 0.8 assumed in the Planck cluster analysis. Adopting the WtG weak-lensing-based mass calibration would substantially reduce the tension found between the Planck cluster count cosmology results and those from CMB temperature anisotropies, thereby dispensing with the need for "new physics" such as uncomfortably large neutrino masses (in the context of the measured Planck temperature anisotropies and other data). We also find modest evidence (at 95 per cent confidence) for a mass dependence of the calibration ratio and discuss its potential origin in light of systematic uncertainties in the temperature calibration of the X-ray measurements used to calibrate the Planck cluster masses. Our results exemplify the critical role that robust absolute mass calibration plays in cluster cosmology, and the invaluable role of accurate weak-lensing mass measurements in this regard.
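An overall mass ratio like ⟨M_Planck/M_WtG⟩ is, in its simplest form, an inverse-variance-weighted mean over per-cluster ratios. The sketch below shows that combination with made-up per-cluster values; the paper's actual fit handles intrinsic scatter and correlated errors more carefully:

```python
import numpy as np

# Simplified sketch: combine per-cluster Planck/WtG mass ratios with an
# inverse-variance-weighted mean. The per-cluster values below are
# invented for illustration, not data from the paper.
ratios = np.array([0.65, 0.72, 0.70, 0.66, 0.74])  # M_Planck / M_WtG
sigmas = np.array([0.10, 0.12, 0.08, 0.15, 0.11])  # per-cluster errors

weights = 1.0 / sigmas**2
mean_ratio = np.sum(weights * ratios) / np.sum(weights)
mean_err = np.sqrt(1.0 / np.sum(weights))  # error on the weighted mean
print(f"<M_Planck/M_WtG> = {mean_ratio:.3f} +/- {mean_err:.3f}")
```

Inverse-variance weighting gives the minimum-variance unbiased combination when the measurements are independent with Gaussian errors, which is why it is the natural starting point before adding intrinsic scatter to the model.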