4,788 research outputs found

    Focal Spot, Fall/Winter 1998

    Get PDF

    Developing a distributed electronic health-record store for India

    Get PDF
    The DIGHT project is addressing the problem of building a scalable and highly available information store for the Electronic Health Records (EHRs) of the over one billion citizens of India
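
    The abstract gives no architectural details, so the following is only a hedged sketch of how a scalable, highly available record store for EHRs might be organized: records keyed by a citizen identifier, hash-partitioned across shards, and replicated. The shard count, replication factor, record fields, and identifier format are assumptions for illustration, not details of the DIGHT design.

```python
# Illustrative sketch only: a hash-sharded, replicated record store keyed by a
# citizen identifier. Shard count, replication factor, and the record layout
# are assumptions for illustration, not details taken from the DIGHT project.
import hashlib

NUM_SHARDS = 16          # assumed shard count
REPLICATION_FACTOR = 3   # assumed number of replicas per record

shards = [{} for _ in range(NUM_SHARDS)]

def shard_ids(citizen_id: str) -> list[int]:
    """Map a citizen identifier to a primary shard plus replica shards."""
    digest = hashlib.sha256(citizen_id.encode()).hexdigest()
    primary = int(digest, 16) % NUM_SHARDS
    return [(primary + i) % NUM_SHARDS for i in range(REPLICATION_FACTOR)]

def put_record(citizen_id: str, record: dict) -> None:
    """Write the health record to every replica shard."""
    for s in shard_ids(citizen_id):
        shards[s][citizen_id] = record

def get_record(citizen_id: str) -> dict | None:
    """Read from the first replica that holds the record (tolerates shard loss)."""
    for s in shard_ids(citizen_id):
        if citizen_id in shards[s]:
            return shards[s][citizen_id]
    return None

put_record("IN-0000000001", {"allergies": ["penicillin"], "blood_group": "O+"})
print(get_record("IN-0000000001"))
```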

    Standard Operating Procedure - Collaborative Spatial Assessment CoSA - Release 1.0

    Get PDF
    The purpose of this Standard Operating Procedure (SOP) is to establish uniform procedures pertaining to the preparation for, the performance of, and the reporting of COllaborative (geo) Spatial Assessment (CoSA). CoSA provides a synoptic, unbiased assessment over the impact area of a disaster, which feeds the two main recovery perspectives of the Post-Disaster Needs Assessment (PDNA): i) the valuation of damages and losses carried out through the Damage and Loss Assessment (DaLA) methodology; and ii) the identification of human impacts and recovery needs carried out through the Human Recovery Needs Assessment (HRNA). CoSA is distinct from other geospatial and remote sensing based assessments because it i) draws on the collaborative efforts of distributed capacities in remote sensing and geospatial analysis, ii) aims to achieve the highest possible accuracy in line with the requirements of the PDNA, and iii) tries to do so under the stringent timing constraints set by the PDNA schedule. The current SOP will aid in ensuring the credibility, consistency, transparency, accuracy, and completeness of the CoSA. It is a living document, however, that will be enriched with new practical experiences and regularly updated to incorporate state-of-the-art procedures and new technical developments. JRC.DG.G.2 - Global security and crisis management

    Density and disasters: economics of urban hazard risk

    Get PDF
    Today, 370 million people live in cities in earthquake prone areas and 310 million in cities with high probability of tropical cyclones. By 2050, these numbers are likely to more than double. Mortality risk therefore is highly concentrated in many of the world’s cities and economic risk even more so. This paper discusses what sets hazard risk in urban areas apart, provides estimates of valuation of hazard risk, and discusses implications for individual mitigation and public policy. The main conclusions are that urban agglomeration economies change the cost-benefit calculation of hazard mitigation, that good hazard management is first and foremost good general urban management, and that the public sector must perform better in generating and disseminating credible information on hazard risk in cities. Banks & Banking Reform, Environmental Economics & Policies, Hazard Risk Management, Urban Housing, Labor Policies
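
    The paper's own estimates are not reproduced here; the toy calculation below only illustrates the shape of the cost-benefit reasoning the abstract describes, comparing expected annual losses with and without mitigation. All figures are assumed for illustration.

```python
# Toy expected-loss calculation for urban hazard mitigation.
# All figures are illustrative assumptions, not estimates from the paper.

annual_probability = 0.02          # assumed chance of a damaging event per year
loss_without_mitigation = 500e6    # assumed loss per event (USD)
loss_with_mitigation = 200e6       # assumed residual loss after mitigation
annual_mitigation_cost = 4e6       # assumed annualized cost of mitigation

expected_loss_before = annual_probability * loss_without_mitigation
expected_loss_after = annual_probability * loss_with_mitigation
annual_benefit = expected_loss_before - expected_loss_after

print(f"Expected annual loss without mitigation: ${expected_loss_before:,.0f}")
print(f"Expected annual loss with mitigation:    ${expected_loss_after:,.0f}")
print(f"Benefit-cost ratio of mitigation: {annual_benefit / annual_mitigation_cost:.1f}")
```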

    Composition and combination‐based object trust evaluation for knowledge management in virtual organizations

    Get PDF
    Purpose – This paper aims to develop a framework for object trust evaluation and related object trust principles to facilitate knowledge management in a virtual organization. It proposes systematic methods to quantify the trust of an object and defines the concept of object trust management. The study aims to expand the domain of subject trust to object trust evaluation in terms of whether an object is correct and accurate in expressing a topic or issue and whether the object is secure and safe to execute (in the case of an executable program). By providing theoretical and empirical insights about object trust composition and combination, this research facilitates better knowledge identification, creation, evaluation, and distribution. Design/methodology/approach – This paper presents two object trust principles – trust composition and trust combination. These principles provide formal methodologies and guidelines to assess whether an object has the required level of quality and security features (hence it is trustworthy). The paper uses a component‐based approach to evaluate the quality and security of an object. Formal approaches and algorithms have been developed to assess the trustworthiness of an object in different cases. Findings – The paper provides qualitative and quantitative analysis about how object trust can be composed and combined. Novel mechanisms have been developed to help users evaluate the quality and security features of available objects. Originality/value – This effort fulfills an identified need to address the challenging issue of evaluating the trustworthiness of an object (e.g. a software program, a file, or other type of knowledge element) in a loosely‐coupled system such as a virtual organization. It is the first of its kind to formally define object trust management and study object trust evaluation
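
    The abstract does not reproduce the paper's formal algorithms, so the sketch below is a hedged illustration of the two principles it names: trust composition aggregates the scores of an object's components, and trust combination merges scores from independent evaluators. The specific rules used here (minimum over components, weighted average over evaluators) are assumptions, not the paper's definitions.

```python
# Hedged sketch of object trust evaluation. The aggregation rules below
# (minimum over components, weighted average over evaluators) are illustrative
# assumptions and not the formal algorithms defined in the paper.

def compose_trust(component_scores: list[float]) -> float:
    """Trust composition: treat an object as only as trustworthy as its weakest component."""
    return min(component_scores) if component_scores else 0.0

def combine_trust(evaluations: list[tuple[float, float]]) -> float:
    """Trust combination: weighted average of (score, evaluator_weight) pairs."""
    total_weight = sum(w for _, w in evaluations)
    if total_weight == 0:
        return 0.0
    return sum(score * w for score, w in evaluations) / total_weight

# An executable object built from three components, rated by two evaluators.
composed = compose_trust([0.9, 0.8, 0.95])
combined = combine_trust([(composed, 0.7), (0.85, 0.3)])
print(f"composed={composed:.2f}, combined={combined:.2f}")
```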

    Modeling Through

    Get PDF
    Theorists of justice have long imagined a decision-maker capable of acting wisely in every circumstance. Policymakers seldom live up to this ideal. They face well-understood limits, including an inability to anticipate the societal impacts of state intervention along a range of dimensions and values. Policymakers cannot see around corners or address societal problems at their roots. When it comes to regulation and policy-setting, policymakers are often forced, in the memorable words of political economist Charles Lindblom, to “muddle through” as best they can. Powerful new affordances, from supercomputing to artificial intelligence, have arisen in the decades since Lindblom’s 1959 article that stand to enhance policymaking. Computer-aided modeling holds promise in delivering on the broader goals of forecasting and systems analysis developed in the 1970s, arming policymakers with the means to anticipate the impacts of state intervention along several lines—to model, instead of muddle. A few policymakers have already dipped a toe into these waters; others are being told that the water is warm. The prospect that economic, physical, and even social forces could be modeled by machines confronts policymakers with a paradox. Society may expect policymakers to avail themselves of techniques already usefully deployed in other sectors, especially where statutes or executive orders require the agency to anticipate the impact of new rules on particular values. At the same time, “modeling through” holds novel perils that policymakers may be ill-equipped to address. Concerns include privacy, brittleness, and automation bias, all of which law and technology scholars are keenly aware. They also include the extension and deepening of the quantifying turn in governance, a process that obscures normative judgments and recognizes only that which the machines can see. The water may be warm, but there are sharks in it. These tensions are not new. And there is danger in hewing to the status quo. (We should still pursue renewable energy even though wind turbines as presently configured waste energy and kill wildlife.) As modeling through gains traction, however, policymakers, constituents, and academic critics must remain vigilant. This being early days, American society is uniquely positioned to shape the transition from muddling to modeling

    Some Remarks about the Complexity of Epidemics Management

    Full text link
    Recent outbreaks of Ebola, H1N1 and other infectious diseases have shown that the assumptions underlying the established theory of epidemics management are too idealistic. For an improvement of the procedures and organizations involved in fighting epidemics, extended models of epidemics management are required. The necessary extensions consist of a representation of the management loop and the potential frictions influencing the loop. The effects of the non-deterministic frictions can be taken into account by including measures of robustness and risk in the assessment of management options. Thus, besides the increased structural complexity resulting from the model extensions, the computational complexity of the task of epidemics management - interpreted as an optimization problem - is increased as well. This is a serious obstacle for analyzing the model and may require additional pre-processing to simplify the analysis process. The paper closes with an outlook discussing some forthcoming problems
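
    The abstract frames epidemics management as an optimization problem whose assessment should include measures of robustness and risk under non-deterministic frictions. The sketch below illustrates one way such an assessment could look: sample friction scenarios and score each management option by its expected and worst-case outcome. The options, friction model, and outcome function are assumptions for illustration only, not the extended model described in the paper.

```python
# Hedged sketch: scoring management options under non-deterministic frictions.
# Options, the friction model, and the outcome function are illustrative
# assumptions, not the extended model described in the paper.
import random

random.seed(0)

OPTIONS = {
    "early_containment": 120.0,   # assumed baseline effect (arbitrary units)
    "mass_vaccination": 150.0,
    "wait_and_monitor": 60.0,
}

def outcome(baseline: float, friction: float) -> float:
    """Frictions (delays, resource shortfalls) erode the baseline effect."""
    return baseline * (1.0 - friction)

def assess(baseline: float, samples: int = 1000) -> tuple[float, float]:
    """Return (expected outcome, worst-case outcome) over sampled friction levels."""
    results = [outcome(baseline, random.uniform(0.0, 0.6)) for _ in range(samples)]
    return sum(results) / len(results), min(results)

for name, baseline in OPTIONS.items():
    expected, worst = assess(baseline)
    print(f"{name:18s} expected={expected:7.1f} worst_case={worst:7.1f}")
```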

    A Hybrid Simulation Methodology To Evaluate Network Centric Decision Making Under Extreme Events

    Get PDF
    Network centric operations and network centric warfare have generated a new area of research focused on determining how hierarchical organizations composed of human beings and machines make decisions in collaborative environments. One of the most stressful scenarios for these kinds of organizations is the so-called extreme event. This dissertation provides a hybrid simulation methodology, based on classical simulation paradigms combined with social network analysis, for evaluating and improving organizational structures and procedures, mainly the incident command systems and plans for facing extreme events. Accordingly, we provide a methodology for generating hypotheses and afterwards testing organizational procedures either in real training systems or in simulation models with validated data. Because the organization changes its dyadic relationships dynamically over time, we propose to capture the longitudinal digraph in time and analyze it by means of its adjacency matrix. Thus, using an object oriented approach, three domains are proposed for better understanding the performance and the surrounding environment of an emergency management organization. System dynamics is used for modeling the critical infrastructure linked to the warning alerts of a given organization at federal, state, and local levels. Discrete simulation, based on the defined concept of community of state, enables us to control the complete model. Discrete event simulation allows us to create entities that represent the data and resource flows within the organization. We propose that cognitive models are well suited to our methodology; for instance, we show how team performance decays in time, according to the Yerkes-Dodson curve, affecting the measures of performance of the whole organizational system. Accordingly, we suggest that the hybrid model could be applied to other types of organizations, such as military peacekeeping operations and joint task forces. Along with providing insight about organizations, the methodology supports the analysis of the after action review (AAR), based on data collected from command and control systems or the so-called training scenarios. Furthermore, a rich set of mathematical measures arises from the hybrid models, such as the triad census, dyad census, eigenvalues, utilization, and feedback loops, which provides a strong foundation for studying an emergency management organization. Future research will be necessary for analyzing real data and validating the proposed methodology
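
    The abstract mentions capturing the organization's longitudinal digraph and analyzing it via its adjacency matrix, with measures such as the dyad census and eigenvalues. The snippet below is a minimal illustration of that idea on a single assumed snapshot; the example network and the choice of measures are not taken from the dissertation.

```python
# Minimal sketch: analyzing one time snapshot of an organizational digraph via
# its adjacency matrix. The example network and the choice of measures
# (dyad census, dominant eigenvalue) are illustrative assumptions.
import numpy as np

# Adjacency matrix of an assumed 4-node directed communication snapshot.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
])

def dyad_census(adj: np.ndarray) -> dict[str, int]:
    """Count mutual, asymmetric, and null dyads in a directed graph."""
    n = adj.shape[0]
    census = {"mutual": 0, "asymmetric": 0, "null": 0}
    for i in range(n):
        for j in range(i + 1, n):
            ties = adj[i, j] + adj[j, i]
            if ties == 2:
                census["mutual"] += 1
            elif ties == 1:
                census["asymmetric"] += 1
            else:
                census["null"] += 1
    return census

print("dyad census:", dyad_census(A))
print("dominant eigenvalue magnitude:", max(abs(np.linalg.eigvals(A))))
```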

    Value of Research

    Get PDF
    Transportation research in Kentucky has been ongoing since 1914. The Road Materials Testing Laboratory was established that year at the University of Kentucky. Operation of the laboratory commenced in 1915 under the guidance of Professor D.V. Terrell, who later became Dean of the College of Engineering. Dean Terrell often noted that the purpose of research was not to save money but to make it go farther. Public funding was never sufficient to accomplish all that various agency personnel considered essential or that the public deemed necessary, and research was most often conducted to gain more from the money available. The purposes of this report are: 1) to provide an abbreviated background of the evolution of transportation research in Kentucky; 2) to discuss research and the administration of research within the Kentucky Transportation Cabinet and its predecessor agencies - the Department of Highways and the Department of Transportation; and 3) to compile recent and current research accomplishments which have contributed to the design, construction, maintenance, management, and operations of the highway network within the Commonwealth. Significant benefits of research endeavors are derived through implementation of research findings and technology transfer to the users. Steps have been initiated by Transportation Cabinet and Federal Highway Administration officials to greatly enhance implementation and technology transfer. Dividends from investments in research multiply in proportion to implementation of research discoveries