empathi: An ontology for Emergency Managing and Planning about Hazard Crisis
In the domain of emergency management during hazard crises, having sufficient
situational awareness information is critical. This requires capturing and
integrating information from sources such as satellite images, local sensors
and social media content generated by local people. A major obstacle to
capturing, representing and integrating such heterogeneous and diverse
information is the lack of an ontology that properly conceptualizes this
domain and aggregates and unifies datasets. Thus, in this paper we introduce
the empathi ontology, which conceptualizes the core concepts of the domain of
emergency managing and planning of hazard crises. Although empathi takes a
coarse-grained view, it covers the concepts and relations essential to this
domain. The ontology is available at
https://w3id.org/empathi/
Optimizing the Structure and Scale of Urban Water Infrastructure: Integrating Distributed Systems
Large-scale, centralized water infrastructure has provided clean drinking water, wastewater treatment, stormwater management and flood protection for U.S. cities and towns for many decades, protecting public health, safety and environmental quality. To accommodate increasing demands driven by population growth and industrial needs, municipalities and utilities have typically expanded centralized water systems with longer distribution and collection networks. This approach achieves financial and institutional economies of scale and allows for centralized management. It comes with tradeoffs, however, including higher energy demands for long-distance transport; extensive maintenance needs; and disruption of the hydrologic cycle, including the large-scale transfer of freshwater resources to estuarine and saline environments.
While smaller-scale distributed water infrastructure has been available for quite some time, it has yet to be widely adopted in urban areas of the United States. However, interest in rethinking how to best meet our water and sanitation needs has been building. Recent technological developments and concerns about sustainability and community resilience have prompted experts to view distributed systems as complementary to centralized infrastructure, and in some situations the preferred alternative.
In March 2014, the Johnson Foundation at Wingspread partnered with the Water Environment Federation and the Patel College of Global Sustainability at the University of South Florida to convene a diverse group of experts to examine the potential for distributed water infrastructure systems to be integrated with or substituted for more traditional water infrastructure, with a focus on right-sizing the structure and scale of systems and services to optimize water, energy and sanitation management while achieving long-term sustainability and resilience.
Environmental hazard identification, assessment and control for a sustainable maritime transportation system
A demand exists to contribute towards the widening awareness of the need for sustainable maritime development and for coordinated maritime policies worldwide. Maritime shipping is considered the most eco-efficient means of transportation, and yet it is responsible for negative environmental impacts.
This dissertation focuses on developing data-driven decision support tools to evaluate the sustainability performance of the maritime transportation system (MTS) by focusing on the elements of the MTS that place stress on the environment. The first research contribution is a System Dynamics simulation model that examines MTS resiliency after an extreme event and determines the sequence needed to restore an ocean-going port to its pre-event state. The second is a Decision-Making in Complex Environments (DMCE) tool developed by integrating fuzzy logic with a combination of the Analytic Hierarchy Process (FAHP) and the Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS) to quantify and rank preferred environmental impact indicators within the MTS. The third is an extension of this DMCE tool that integrates a Monte Carlo simulation to better understand the risks associated with the resulting rankings of those preferred environmental indicators. The fourth is a predictive model for monitoring vegetation changes in near-port areas and understanding the long-term impacts that maritime activity has on the environment. The developed models address the impacts the MTS has on the natural environment and help achieve environmental sustainability of this complex system by evaluating its sustainability performance. --Abstract, page iv
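The abstract names fuzzy AHP and fuzzy TOPSIS only in passing; as a hedged illustration of the underlying ranking step, here is a minimal crisp (non-fuzzy) TOPSIS sketch, with invented indicator names, scores and weights:

```python
import math

# Toy decision matrix: rows are candidate environmental indicators and
# columns are evaluation criteria (all treated as benefit criteria here).
# Indicator names, scores and weights are invented for illustration only.
alternatives = ["air emissions", "ballast water", "underwater noise"]
matrix = [
    [7.0, 5.0, 8.0],
    [6.0, 9.0, 4.0],
    [5.0, 6.0, 6.0],
]
weights = [0.5, 0.3, 0.2]  # hypothetical criterion weights (e.g. from AHP)
n = len(weights)

# 1. Vector-normalize each column, then apply the criterion weights.
norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]

# 2. Ideal (best) and anti-ideal (worst) value per criterion.
best = [max(col) for col in zip(*v)]
worst = [min(col) for col in zip(*v)]

# 3. Closeness coefficient: nearer to the ideal and farther from the
#    anti-ideal solution is better; sort descending to rank indicators.
def closeness(row):
    d_plus, d_minus = math.dist(row, best), math.dist(row, worst)
    return d_minus / (d_plus + d_minus)

scores = {a: closeness(v[i]) for i, a in enumerate(alternatives)}
ranking = sorted(alternatives, key=scores.get, reverse=True)
```

A fuzzy variant would replace each crisp score with, for example, a triangular fuzzy number and defuzzify the distances, but the closeness-coefficient ranking logic is the same.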
MICROGRID RESILIENCE ANALYSIS SOFTWARE DEVELOPMENT
Military installation microgrids need to be resilient to a variety of potential disruptions (storms, attacks, et cetera). Various metrics for assessing microgrid resilience have been described in the literature, and multiple tools for simulating microgrid performance have been constructed; however, it is often left to system owners and maintainers to bring these efforts together to identify and realize effective, efficient improvement strategies. Military microgrid stakeholders have expressed a desire for an integrated, unified platform that provides these multiple capabilities in a coordinated fashion. In support of these endeavors, analysis methods developed by NPS and NAVFAC Expeditionary Warfare Center researchers for measuring microgrid resilience have been integrated into an existing web-based microgrid power flow simulation and distributed energy resource right-sizing software tool. This was achieved by the development of additional functions and methods within the existing software platform code base, and expansion of the application programming interface (API). These API additions enabled access to the new calculation and analysis capabilities, as well as increased control over power flow simulation parameters. These analytical and functional contributions were validated through a design of experiments, including comparison to independently generated data, and factorial analysis. Outstanding Thesis. Civilian, Department of the Navy. Approved for public release. Distribution is unlimited.
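The abstract does not spell out which resilience metrics were integrated; as a generic, hypothetical illustration, one common area-under-the-curve style metric over a load-served time series can be sketched as:

```python
# Hypothetical hourly time series of the fraction of critical load served;
# a disruption hits at hour 2 and service is fully restored by hour 7.
performance = [1.0, 1.0, 0.4, 0.5, 0.6, 0.8, 0.9, 1.0, 1.0, 1.0]

def resilience(perf, target=1.0):
    """Area-under-the-curve resilience: energy actually delivered divided
    by the target delivery over the whole assessment window."""
    return sum(perf) / (target * len(perf))

def max_outage(perf, threshold=1.0):
    """Longest consecutive run of samples below the service threshold."""
    longest = run = 0
    for p in perf:
        run = run + 1 if p < threshold else 0
        longest = max(longest, run)
    return longest

auc = resilience(performance)           # fraction of demand served
outage_hours = max_outage(performance)  # longest degraded-service run
```

A simulation tool of the kind described would feed power-flow results into metrics like these, then vary distributed energy resource sizing to find configurations that improve them.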
Tropical Cyclones and Storm Surge Modelling Activities
The Global Disaster Alert and Coordination System (GDACS) automatically invokes ad hoc numerical models to analyse the hazard level of natural disasters such as earthquakes, tsunamis, tropical cyclones, floods and volcanoes. Tropical cyclones (TCs) are among the most damaging events, due to strong winds, heavy rains and storm surge. In order to estimate the area and the population affected, all three of these physical impacts must be taken into account. GDACS covers all of these dangerous effects, using various sources of data.
The JRC set up an automatic routine that merges the TC information provided by the Joint Typhoon Warning Center (JTWC) and the National Oceanic and Atmospheric Administration (NOAA) into a single database covering all TC basins. This information is used in GDACS for the wind impact and as input for the JRC storm surge system. Recently, the global numerical models and other TC models have notably improved their resolution, so one of the first aims of this work is the assessment and implementation of new data sources for the wind, storm surge and rainfall impacts in GDACS. Moreover, the TC modelling workflow has been revised in order to provide redundancy, transparency and efficiency while addressing issues of accuracy and incorporation of additional physical processes. The status of development is presented along with an outline of future steps. JRC.E.1 - Disaster Risk Management
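The single-database merge step described above can be sketched as follows; the field names and the source-precedence rule are assumptions for illustration, not the JRC schema:

```python
from datetime import datetime

# Illustrative advisory records from two agencies; real feeds carry far
# more fields (radii of maximum winds, pressure, forecast track, etc.).
jtwc = [
    {"storm": "WP082023", "time": "2023-07-25T00:00",
     "lat": 18.1, "lon": 125.4, "wind_kt": 95},
]
noaa = [
    {"storm": "WP082023", "time": "2023-07-25T00:00",
     "lat": 18.0, "lon": 125.5, "wind_kt": 100},
    {"storm": "AL052023", "time": "2023-07-25T00:00",
     "lat": 25.2, "lon": -60.1, "wind_kt": 65},
]

def merge_tracks(*sources):
    """Merge advisories from several agencies into one table keyed by
    (storm id, advisory time); earlier-listed sources take precedence."""
    merged = {}
    for feed in reversed(sources):  # later args are overwritten by earlier
        for rec in feed:
            key = (rec["storm"], datetime.fromisoformat(rec["time"]))
            merged[key] = rec
    return merged

db = merge_tracks(jtwc, noaa)       # JTWC preferred where both report
```

Keying on storm identifier and advisory time lets the routine cover every basin in one table while tolerating agencies that report the same storm with slightly different positions and intensities.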
Many-Task Computing and Blue Waters
This report discusses many-task computing (MTC) generically and in the
context of the proposed Blue Waters system, which is planned to be the largest
NSF-funded supercomputer when it begins production use in 2012. The aim of this
report is to inform the BW project about MTC, including understanding aspects
of MTC applications that can be used to characterize the domain and
understanding the implications of these aspects for middleware and policies.
Many MTC applications do not neatly fit the stereotypes of high-performance
computing (HPC) or high-throughput computing (HTC) applications. Like HTC
applications, by definition MTC applications are structured as graphs of
discrete tasks, with explicit input and output dependencies forming the graph
edges. However, MTC applications have significant features that distinguish
them from typical HTC applications. In particular, different engineering
constraints for hardware and software must be met in order to support these
applications. HTC applications have traditionally run on platforms such as
grids and clusters, through either workflow systems or parallel programming
systems. MTC applications, in contrast, will often demand a short time to
solution, may be communication intensive or data intensive, and may comprise
very short tasks. Therefore, hardware and software for MTC must be engineered
to support the additional communication and I/O and must minimize task dispatch
overheads. The hardware of large-scale HPC systems, with its high degree of
parallelism and support for intensive communication, is well suited for MTC
applications. However, HPC systems often lack a dynamic resource-provisioning
feature, are not ideal for task communication via the file system, and have an
I/O system that is not optimized for MTC-style applications. Hence, additional
software support is likely to be required to gain full benefit from the HPC
hardware.
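The task-graph structure the report attributes to MTC applications can be illustrated with Python's standard-library graphlib; the task names and dependencies here are invented:

```python
from graphlib import TopologicalSorter

# Toy MTC-style application: many short, discrete tasks whose explicit
# input/output dependencies form a DAG (edges point from dependency to task).
graph = {
    "split": set(),
    "analyze_a": {"split"},
    "analyze_b": {"split"},
    "merge": {"analyze_a", "analyze_b"},
}

results = {}

def run(task):
    # Stand-in for a short-lived task; MTC middleware would dispatch this
    # to a worker while keeping per-task dispatch overhead minimal.
    results[task] = f"output of {task}"

ts = TopologicalSorter(graph)
ts.prepare()
while ts.is_active():
    for task in ts.get_ready():  # all tasks returned here can run in parallel
        run(task)
        ts.done(task)

order = list(results)            # one valid dependency-respecting order
```

In a real MTC system, everything returned together by `get_ready()` would be dispatched concurrently, which is exactly where the report's concerns about dispatch overhead, communication and I/O pressure arise.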
Comparison of Traditional Versus CubeSat Remote Sensing: A Model-Based Systems Engineering Approach
This thesis compares the ability of both traditional and CubeSat remote sensing architectures to fulfill a set of mission requirements for a remote sensing scenario. Mission requirements originating from a hurricane disaster response scenario are developed to derive a set of system requirements. Using a Model-Based Systems Engineering approach, these system requirements are used to develop notional traditional and CubeSat architecture models. The technical performance of these architectures is analyzed using Systems Tool Kit (STK); the results are compared against Measures of Effectiveness (MOEs) derived from the disaster response scenario. Additionally, systems engineering cost estimates are obtained for each satellite architecture using the Constructive Systems Engineering Cost Model (COSYSMO). The technical and cost comparisons between the traditional and CubeSat architectures are intended to inform future discussions of the benefits and limitations of using CubeSats to conduct operational missions.
Innovating Pedagogy 2017: Exploring new forms of teaching, learning and assessment, to guide educators and policy makers. Open University Innovation Report 6
This series of reports explores new forms of teaching, learning and assessment for an interactive world, to guide teachers and policy makers in productive innovation. This sixth report proposes ten innovations that are already in currency but have not yet had a profound influence on education. To produce it, a group of academics at the Institute of Educational Technology in The Open University collaborated with researchers from the Learning In a NetworKed Society (LINKS) Israeli Center of Research Excellence (I-CORE).
Themes:
• Big-data inquiry: thinking with data
• Learners making science
• Navigating post-truth societies
• Immersive learning
• Learning with internal values
• Student-led analytics
• Intergroup empathy
• Humanistic knowledge-building communities
• Open Textbooks
• Spaced Learning
A CASE STUDY OF VARIOUS WIRELESS NETWORK SIMULATION TOOLS
4G is the fastest-developing system in the history of mobile communication networks. Network connectivity is paramount for all kinds of big enterprises. 4G not only provides super-fast connectivity to millions of users, but can also act as an enterprise network connectivity enabler, with inherent advantages such as higher bandwidth, low latency and higher spectrum efficiency, along with backward compatibility and future proofing. The design of the 4G-based Long Term Evolution (LTE) physical network provides the required flexibility for optimization during the development phase. In this paper, simulation tools supporting LTE networks are presented to demonstrate the need for hardware co-simulation of the LTE system. After the feasibility analysis, the porting of the model to a Field Programmable Gate Array (FPGA) platform is surveyed in detail with supporting inferences, along with a comparison of different wireless network simulators suitable for LTE.