
    The Yamanote Loop: Unifying Rail Transportation and Disaster Resilience in Tokyo

    As climate change and population growth persist and the world rapidly urbanizes, major cities across the globe will face unprecedented strains. The risk of devastating impacts from natural disasters increases in areas with a growing concentration of people. Asian megacities are the most at risk of natural disasters, given their geographic locations and high population densities. With the highest projected population growth in the world, Asian cities must quickly expand and adapt their existing infrastructure to accommodate these changing global conditions. A remarkable anomaly among Asian megacities, Tokyo, Japan is effectively adapting to its earthquake-prone environment. Within the last century, Japan has implemented seismically reinforced buildings and educational resources for earthquake preparedness. Among other technological innovations, investments in railway transportation have permitted major cities like Tokyo to expand and adjust to their changing needs. The Yamanote Line is the primary commuter rail line in Tokyo. Its antecedent originated in 1885 and has since undergone significant changes to evolve into the highly sophisticated system it is today. By examining the evolution of the Yamanote Line from its inception into the 21st century, this study explores the correlation between local rail transportation networks and their city's resilience to natural disasters. A descriptive analysis aligned with four constructs of transportation resilience (robustness, redundancy, resourcefulness, and rapidity) observes instances in which the Yamanote Line potentially strengthens Tokyo's comprehensive disaster preparedness. The study intentionally avoids normative-prescriptive conclusions and focuses primarily on the impact of transformations in railway transportation on the broader urban context over time with respect to disaster resilience, taking other relevant factors into consideration.

    A survey of data mining techniques for social media analysis

    Social networks have gained remarkable attention in the last decade. Accessing social network sites such as Twitter, Facebook, LinkedIn and Google+ through the internet and Web 2.0 technologies has become more affordable. People are becoming more interested in and reliant on social networks for information, news and the opinions of other users on diverse subject matters. This heavy reliance on social network sites causes them to generate massive data characterised by three computational issues, namely size, noise and dynamism. These issues often make social network data very complex to analyse manually, motivating the use of computational means of analysing them. Data mining provides a wide range of techniques for detecting useful knowledge, such as trends, patterns and rules, from massive datasets [44]. Data mining techniques are used for information retrieval, statistical modelling and machine learning, and they employ data pre-processing, data analysis and data interpretation processes in the course of data analysis. This survey discusses the different data mining techniques used to mine diverse aspects of social networks over the decades, from historical techniques to up-to-date models, including our novel technique named TRCM. All the techniques covered in this survey are listed in Table 1, together with the tools employed and the names of their authors.
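    As a concrete illustration of the pre-processing and analysis stages this survey describes, the sketch below mines simple term-frequency trends from a small batch of posts. The field names, stopword list and toy data are assumptions for demonstration only; this is not TRCM or any specific technique covered in the survey.

```python
# Minimal sketch of a social-media text-mining pipeline: pre-processing,
# analysis (term frequencies per time window), and a naive trend signal.
# Posts and field names below are illustrative assumptions, not data or
# code from any surveyed technique.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "are", "of", "on", "in", "to", "and"}

def preprocess(text):
    """Lowercase, strip URLs and punctuation, drop stopwords (noise reduction)."""
    text = re.sub(r"https?://\S+", " ", text.lower())
    tokens = re.findall(r"[a-z']+", text)
    return [t for t in tokens if t not in STOPWORDS and len(t) > 2]

def term_counts(posts):
    """Aggregate term frequencies over a window of posts."""
    counts = Counter()
    for post in posts:
        counts.update(preprocess(post["text"]))
    return counts

def emerging_terms(window_now, window_prev, min_count=2):
    """Flag terms whose frequency grew between two consecutive windows."""
    now, prev = term_counts(window_now), term_counts(window_prev)
    return sorted(
        (t for t, c in now.items() if c >= min_count and c > prev.get(t, 0)),
        key=lambda t: now[t] - prev.get(t, 0),
        reverse=True,
    )

if __name__ == "__main__":
    earlier = [{"text": "Traffic is slow on the bridge today"}]
    latest = [
        {"text": "Flooding reported near the bridge, traffic diverted"},
        {"text": "Bridge closed due to flooding #weather"},
    ]
    print(emerging_terms(latest, earlier))  # ['flooding', 'bridge']
```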

    CAIR: Using Formal Languages to Study Routing, Leaking, and Interception in BGP

    The Internet routing protocol BGP expresses topological reachability and policy-based decisions simultaneously in path vectors. A complete view of Internet backbone routing would be given by the collection of all valid routes, which is infeasible to obtain due to the information hiding of BGP, the lack of omnipresent collection points, and data complexity. Commonly, graph-based data models are used to represent the Internet topology from a given set of BGP routing tables, but they fall short of capturing policy contexts. As a consequence, routing anomalies such as route leaks and interception attacks cannot be explained with graphs. In this paper, we use formal languages to represent the global routing system in a rigorous model. Our CAIR framework translates BGP announcements into a finite route language that allows for the incremental construction of minimal route automata. CAIR preserves route diversity, is highly efficient, and is well suited to monitoring BGP path changes in real time. We formally derive implementable search patterns for route leaks and interception attacks. In contrast to the state of the art, we can detect these incidents. In practical experiments, we analyze public BGP data over the last seven years.
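    To make the idea of treating routes as words concrete, the toy sketch below stores AS paths in a prefix tree and applies a naive valley-free check as a stand-in for a route-leak search pattern. This is not the CAIR framework or its minimal route automata; the AS numbers and provider relationships are invented for illustration.

```python
# Toy illustration (not CAIR): treat each BGP AS path as a word over AS
# numbers, store the words in a prefix tree, and apply a naive valley-free
# check as a stand-in for a route-leak search pattern. AS paths are read in
# propagation order (origin AS first); relationships below are invented.

class PathTrie:
    """Prefix tree over AS paths; shared prefixes are stored once."""
    def __init__(self):
        self.children = {}
        self.terminal = False  # True if a complete AS path ends here

    def insert(self, as_path):
        node = self
        for asn in as_path:
            node = node.children.setdefault(asn, PathTrie())
        node.terminal = True

    def paths(self, prefix=()):
        """Enumerate all stored AS paths."""
        if self.terminal:
            yield prefix
        for asn, child in self.children.items():
            yield from child.paths(prefix + (asn,))

def violates_valley_free(as_path, providers):
    """Naive leak heuristic: once a path goes provider->customer (downhill),
    it must not go customer->provider (uphill) again.
    `providers` maps an AS to the set of its providers (assumed data)."""
    went_down = False
    for a, b in zip(as_path, as_path[1:]):
        uphill = b in providers.get(a, set())    # next hop is a's provider
        downhill = a in providers.get(b, set())  # next hop is a's customer
        if downhill:
            went_down = True
        elif uphill and went_down:
            return True
    return False

if __name__ == "__main__":
    # Hypothetical relationships: AS65001 and AS65003 are providers of AS65002.
    providers = {65002: {65001, 65003}}
    trie = PathTrie()
    trie.insert((65001, 65002, 65003))  # customer re-exports one provider's
    trie.insert((65001, 65004))         # route to another: a classic leak shape
    for path in trie.paths():
        print(path, "leak?", violates_valley_free(path, providers))
```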

    A contour tree based spatio-temporal data model for oceanographic applications

    Presenting spatio-temporal data from oceanographic modeling in GIS has been a challenging task due to the highly dynamic character and complex patterns of variables in relation to time and space. This dissertation focuses on a spatio-temporal GIS data model applied to oceanographic model data, especially homogeneous iso-surface data. The available spatio-temporal data models are carefully reviewed, and the spatial and temporal characteristics of oceanographic model data are discussed in detail. As an important tool for data modeling, ontology is introduced to categorize oceanographic model data and to set up the fundamental software components of the new data model. The proposed data model is based on the concept of the contour tree. By adding temporal information to each node and arc of the contour tree, and by using multiple contour trees to represent different time steps in the temporal domain, changes can be stored and tracked by the data model. In order to reduce the data volume and increase data quality, the new data model integrates spatial and temporal interpolation methods. The spatial interpolation calculates the data that fall between neighboring contours at a single time step; Inverse Distance Weighting (IDW) is applied as the main algorithm, and the Minimum Bounding Rectangle (MBR) is used to improve its performance. The temporal interpolation calculates the data that are not recorded, falling between neighboring contour trees at adjacent time steps; the linear interpolation algorithm is preferred to the nearest-neighbor and spline interpolation methods for its modest accuracy and simple implementation. To evaluate the supporting functions of the new data model, a case study is presented to show how the data model supports complicated spatio-temporal queries in forecasting applications. This dissertation also presents work on contour tree simplification. A new simplification algorithm is introduced to reduce data complexity; it is based on the branch decomposition method and supports the temporal information integrated into contour trees. Three types of criteria parameters are introduced to run different simplification methods for various applications.
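    To make the two interpolation components concrete, the sketch below implements the ideas named in the abstract: Inverse Distance Weighting between sampled contour points at a single time step, and linear interpolation of a value between two adjacent time steps. The function names, parameters and sample points are assumptions for illustration, not the dissertation's implementation.

```python
# Minimal sketch of the two interpolation ideas described above: IDW for
# values between neighboring contours at one time step, and linear
# interpolation between adjacent time steps. All names, parameters and
# sample data are illustrative assumptions, not the dissertation's code.
import math

def idw(query_xy, samples, power=2):
    """Inverse Distance Weighting: weight each sample by 1/d**power.
    `samples` is a list of ((x, y), value) pairs, e.g. points taken from
    the contours that bracket the query location."""
    num, den = 0.0, 0.0
    for (x, y), value in samples:
        d = math.hypot(query_xy[0] - x, query_xy[1] - y)
        if d == 0.0:
            return value  # query coincides with a sample point
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

def linear_time_interp(t, t0, v0, t1, v1):
    """Value at time t, t0 <= t <= t1, e.g. for the same node in two
    contour trees recorded at adjacent time steps."""
    if t1 == t0:
        return v0
    alpha = (t - t0) / (t1 - t0)
    return (1 - alpha) * v0 + alpha * v1

if __name__ == "__main__":
    # Two bracketing contours with values 10.0 and 12.0 (e.g. deg C).
    contour_points = [((0.0, 0.0), 10.0), ((0.0, 2.0), 10.0),
                      ((2.0, 0.0), 12.0), ((2.0, 2.0), 12.0)]
    v_here = idw((1.0, 1.0), contour_points)          # 11.0 by symmetry
    v_then = linear_time_interp(1.5, 1.0, v_here, 2.0, 11.6)
    print(round(v_here, 2), round(v_then, 2))         # 11.0 11.3
```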

    Engineering brain : metaverse for future engineering

    The past decade has witnessed a notable transformation in the Architecture, Engineering and Construction (AEC) industry, with efforts made in both academia and industry to improve efficiency, safety and sustainability in civil projects. Such advances have greatly contributed to a higher level of automation in the lifecycle management of civil assets within a digitalised environment. To integrate the achievements delivered so far and to further advance them, this study proposes a novel theory, Engineering Brain, by adopting the Metaverse concept in the field of civil engineering. Specifically, the evolution of the Metaverse and its key supporting technologies are first reviewed; then the Engineering Brain theory is presented, including its theoretical background, key components and their interconnections. An outlook on the theory's implementation within the AEC sector is offered as a description of the Metaverse for future engineering. Through a comparison between the proposed Engineering Brain theory and the Metaverse, their relationship is illustrated, and how Engineering Brain may function as the Metaverse for future engineering is further explored. By providing an innovative insight into the future of the engineering sector, this study can potentially guide the entire industry towards a new era based on the Metaverse environment.

    Multi-faceted analytics of social events: Identification, representation and monitoring


    Thermoregulation in exercising horses: Aspects of temperature monitoring during field exercise

    Joint Degree Program between the School of Animal and Veterinary Science, University of Adelaide, and the Faculty of Veterinary Medicine, Ghent University, Belgium.
    Hyperthermia is an ongoing welfare and performance issue for all horses exercising in racing and other competitive sport events. At present, little is known about how core body temperature evolves in real time during different types of exercise performed in field conditions such as racing and endurance events. Consequently, it is becoming increasingly important to establish appropriate policies for detecting and preventing all types of heat stress. To achieve this, a detailed view of the variability of equine thermoregulation during field exercise and recovery is essential. To date, the vast majority of thermoregulatory studies have been conducted in indoor laboratory conditions using a treadmill and subjecting horses to specific standardized exercise tests. However, this approach cannot fully reflect real field conditions. Hence, there is a need to accurately and reliably monitor equine core body temperature responses to avoid potential harm from an increasing heat load.
    Chapter 1 provides a review of current research into thermoregulation, hyperthermia and exertional heat illness (EHI) in exercising horses. In addition, several temperature monitoring methods in horses are described, along with some relevant methods used in human athletes. However, no studies have investigated the promising continuous monitoring method involving a telemetric gastrointestinal (GI) pill that can be applied in the field in all conditions. Chapter 2 outlines the scientific aims of the thesis.
    Chapter 3 describes a study designed to evaluate, for the first time, the efficacy of continuous monitoring of core body temperature using the novel telemetric GI pill during real-time field exercise. The results showed that continuous recording of the GI core temperature in exercising horses in the field using the GI pill was non-invasive, practical and accurate. Temperature fluctuations experienced during exercise and recovery were reliably recorded, and tendencies toward EHI could be easily observed during field exercise. Importantly, the GI pill proved to be a more accurate and precise tool for monitoring the core thermal response than serial rectal temperature (Tre) measurements in the field.
    Chapter 4 describes the application of this novel thermoregulation monitoring method in detail. The study involved measurements in both endurance horses and trotters in order to compare exercise types in real-life competitions in the field. Not only were core body temperatures (Tc) continuously monitored during exercise and recovery, the thermoregulatory responses to the different exercise intensities were also compared. The findings report real-time temperature evolution during real-time competition in the field. More specifically, endurance horses reached peak temperature at 75% of completion of 40 km of exercise, whereas trotters always reached peak temperature during recovery. In addition, the Tc of endurance horses returned to baseline within 60 minutes of recovery, while in 30% of trotters Tc was still higher than 39°C at the end of recovery. Since endurance horses are considered 'fit to continue' competition when the heart rate (HR) is 60 beats per minute or below, the finding that the mean Tc was still 38.8 ± 0.4°C at an HR of 60 bpm is of importance. Overall, the study showed that horses have very individual thermoregulatory responses which require highly accurate monitoring no matter what type of exercise is performed in the field.
    Chapter 5 investigated the usefulness of monitoring skin temperature in endurance horses. A large array of skin temperature methodologies recently used in the field, mainly at pre- and post-exercise time points, is reviewed. In this study, to evaluate whether skin temperature could be used as a proxy for core temperature, skin temperature was continuously monitored with an infrared monitor during a real-life endurance competition. The skin temperature was compared to the GI temperature and, importantly, there was no correlation between skin and GI temperature.
    Thesis (Ph.D.) -- University of Adelaide, School of Animal and Veterinary Sciences, 202
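    Since the final chapter turns on whether skin temperature tracks core (GI) temperature, the sketch below shows one simple way such paired, time-aligned readings could be compared using a Pearson correlation coefficient. The temperature values are invented and this is not the analysis performed in the thesis.

```python
# Illustrative sketch only: Pearson correlation between time-aligned skin
# and gastrointestinal (GI) temperature readings, as one simple way to ask
# whether skin temperature tracks core temperature. The readings below are
# invented; this is not the thesis' analysis.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equally long series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    gi_temp   = [37.8, 38.2, 38.9, 39.4, 39.1, 38.5]  # deg C, exercise then recovery
    skin_temp = [33.1, 34.0, 33.4, 32.8, 34.6, 33.9]  # deg C, same time points
    r = pearson(gi_temp, skin_temp)
    print(f"r = {r:.2f}")  # a value near 0 indicates no linear association
```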