
    Assessment of the impact of climate change on road maintenance

    Climate affects road deterioration, vehicle operating costs, road safety and the environment. Current and past pavement design guides and engineering models assume a static climate whose variability can be determined from past data. This fixed-climate assumption is often used in road management decision support models such as the Highway Development and Management system (HDM-4) to simulate the future behaviour of road sections and consequently inform long-term road maintenance strategies and policies. Contrary to the assumption of a static climate in road management approaches, observations over the last 40 or 50 years show an increasing trend in global warming. This raises the possibility that the severity and frequency of pavement defects may be altered, leading to premature pavement deterioration and increased costs of managing and using roads. As a consequence, current road management strategies and policies may not offer sufficient resilience to an increased frequency of future extreme climate events. A study was undertaken at the University of Birmingham to develop an improved deterioration model for asphalt rut depth prediction. The approach entailed the application of Bayesian Monte Carlo analysis. The output of the study will be used to improve existing road management systems such as HDM-4 and consequently to facilitate the investigation of strategies for adapting to future changes in climate.
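    The kind of Bayesian Monte Carlo calibration referred to above can be sketched as follows. This is a minimal illustration only: the linear rut-growth model, the observation values, the noise level and the prior are invented assumptions for the example, not the Birmingham study's actual model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical observed rut depths (mm) at pavement ages of 2, 5 and 8 years
ages = np.array([2.0, 5.0, 8.0])
observed = np.array([3.1, 6.8, 10.5])
sigma = 1.0  # assumed measurement noise (mm)

# Prior: rut depth grows linearly with age; rate ~ N(1.0, 0.5) mm/year
n_samples = 10_000
rates = rng.normal(1.0, 0.5, n_samples)

# Gaussian likelihood of each sampled rate given the observations
predictions = rates[:, None] * ages[None, :]
log_like = -0.5 * np.sum(((observed - predictions) / sigma) ** 2, axis=1)
weights = np.exp(log_like - log_like.max())
weights /= weights.sum()

# Posterior mean deterioration rate via importance-sampling weights
posterior_rate = np.sum(weights * rates)
```

    The same weighting step is what lets observed field data update a prior deterioration model, which is the essence of the approach described.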

    RETROSPECTIVE ANALYSIS AND ANTICIPATION OF CURRICULUM DESIGN FOR FUTURE SOUTHEAST ASIA

    Background and Purpose: Curriculum design has undergone chronological pattern adjustment globally; however, the alterations are heavily contextualized. This article therefore aims to understand how futuristic curriculum design is perceived in South East Asia (SEA). How futuristic curriculum design was perceived in the past is established through a retrospective examination of published documents. After identifying past trends, anticipation from historical trends refers to a systematic projection of how the curriculum would be constructed for the future generation in the SEA region. Methodology: This study presents an in-depth bibliometric analysis and visual scientific mapping of 2733 published documents in a reputable database. To examine how a futuristic curriculum has been regarded over time, temporal, geographical, institutional, partnership and keyword mappings were quantitatively analyzed. The succession of past events and the emerging keywords visible in the present were then qualitatively assessed in order to anticipate what is viewed as the future curriculum in SEA. Findings: According to the findings, the design of the futuristic curriculum has changed since the 1980s in terms of 1) centricity, 2) measured dimensions, and 3) technological advances that support 4) the learning dynamics between internalization, regionalization and localization. Based on past and current trends, it is anticipated that curriculum design for the future will be 1) centered on the individual as a unit of a larger society, 2) focused on measuring one's tangible and intangible performance using indicators, and 3) supported by technologically advanced, seamless and self-regulated learning. Contributions: The findings and recommendations of this article serve as baseline evidence on curriculum design in the SEA education ecosystem to inform pedagogy and policy, exploring new areas of research and fostering evidence-based knowledge in education.
Keywords: Future studies, Southeast Asia, foresight, bibliometric, anticipatory. Cite as: Zainun, M., Farida, N., Saedah, S., Deva Nanthini, S., Shah Jahan, A., & Wan Nor Adzmin, M. S. (2023). Retrospective analysis and anticipation of curriculum design for future Southeast Asia. Journal of Nusantara Studies, 8(2), 266-286. http://dx.doi.org/10.24200/jonus.vol8iss2pp266-28

    An investigation of the impact of using different methods for network meta-analysis: A protocol for an empirical evaluation

    BACKGROUND: Network meta-analysis, a method to synthesise evidence from multiple treatments, has increased in popularity in the past decade. Two broad approaches are available to synthesise data across networks, namely, arm- and contrast-synthesis models, with a range of models that can be fitted within each. There has been recent debate about the validity of the arm-synthesis models, but to date, there has been limited empirical evaluation comparing results using the methods applied to a large number of networks. We aim to address this gap through the re-analysis of a large cohort of published networks of interventions using a range of network meta-analysis methods. METHODS: We will include a subset of networks from a database of network meta-analyses of randomised trials that have been identified and curated from the published literature. The subset of networks will include those where the primary outcome is binary, the number of events and participants are reported for each direct comparison, and there is no evidence of inconsistency in the network. We will re-analyse the networks using three contrast-synthesis methods and two arm-synthesis methods. We will compare the estimated treatment effects, their standard errors, the treatment hierarchy based on the surface under the cumulative ranking (SUCRA) curve, the SUCRA value, and the between-trial heterogeneity variance across the network meta-analysis methods. We will investigate whether differences in the results are affected by network characteristics and baseline risk. DISCUSSION: The results of this study will inform whether, in practice, the choice of network meta-analysis method matters, and if it does, in what situations differences in the results between methods might arise. The results from this research might also inform future simulation studies.
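    As a minimal illustration of the SUCRA statistic mentioned above, the sketch below computes it from a rank-probability matrix. The three treatments and their probabilities are invented for the example; SUCRA is the average of the cumulative ranking curve over ranks 1 to a-1, where a is the number of treatments.

```python
import numpy as np

# Hypothetical rank probabilities for 3 treatments (rows) over ranks 1..3
# (columns); rank 1 = best. Each row sums to 1.
rank_probs = np.array([
    [0.70, 0.20, 0.10],  # treatment A
    [0.25, 0.60, 0.15],  # treatment B
    [0.05, 0.20, 0.75],  # treatment C
])

a = rank_probs.shape[1]  # number of treatments
# Cumulative probability of being at rank j or better
cum = np.cumsum(rank_probs, axis=1)
# SUCRA: mean height of the cumulative ranking curve over ranks 1..a-1
sucra = cum[:, :-1].sum(axis=1) / (a - 1)
# sucra -> [0.80, 0.55, 0.15]: A ranks highest, C lowest
```

    A SUCRA of 1 would mean a treatment is certain to be best, 0 certain to be worst, which is why it is a convenient single-number summary of a treatment hierarchy.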

    Historical Changes in Planform Geometry of the Amite and Comite Rivers and Implications on Flood Routing

    The Amite River Basin is a 2,220 square-mile basin spanning from southwest Mississippi through southeast Louisiana, encompassing Baton Rouge and its suburbs. In response to historic flooding in August 2016 and other major flood events in the past several decades, the basin has been the subject of a number of studies to quantify the impacts of changes in land use and reductions in river length and sinuosity. However, no relationships have yet been defined between the changes in the historical river planform and the resulting flows, stages, and subsequent flood depths. River lengths and sinuosity were measured from the 1930s to present, confirming overall decreases of 6 and 13% in the length and sinuosity of the Comite and Amite Rivers upstream of their confluence, respectively, from the 1930s to present. Planform geometries from four time-period scenarios from the 1930s to present were input into a combined 1D/2D unsteady flow HEC-RAS model, which was run using four spatially variable rainfall events ranging from 1- to greater than 500-year return period flows to examine the significance on flood characteristics. The results show an overall increase in peak flow and stage magnitude over time, corresponding to an overall decrease in river length and sinuosity. The impacts were largest for the 3- to 6-year return period flow event due to the magnitude and rainfall distribution of the historic event used. Results from this study will be compared to and combined with complementary projects, focused on spatial and temporal changes in land use and precipitation events, to better understand the driving variables impacting the stages, discharges, and subsequent flood risk within the basin. Further, this knowledge can be applied to better inform mitigation and construction projects within the basin in the future.
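    Sinuosity here is the ratio of channel length to straight-line valley length, so channel straightening reduces it. The brief sketch below shows the calculation with invented reach lengths; the study's actual measured values are not reproduced.

```python
# Sinuosity = channel (thalweg) length / straight-line valley length.
# Reach lengths below are hypothetical, chosen only to illustrate the
# style of percent-decrease reported in the abstract.

def sinuosity(channel_length_mi: float, valley_length_mi: float) -> float:
    """Ratio of along-channel length to straight-line valley length."""
    return channel_length_mi / valley_length_mi

def percent_decrease(old: float, new: float) -> float:
    """Percent reduction from an earlier value to a later one."""
    return 100.0 * (old - new) / old

# Hypothetical 1930s vs present-day values for a straightened reach
old_s = sinuosity(26.0, 13.0)            # 1930s sinuosity = 2.0
new_s = sinuosity(22.6, 13.0)            # present sinuosity ~ 1.74
change = percent_decrease(old_s, new_s)  # ~13% decrease
```

    With a fixed valley length, any shortening of the channel translates directly into the sinuosity decrease, which is why length and sinuosity changes are reported together.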

    Historical Exploration - Learning Lessons from the Past to Inform the Future

    This report examines a number of exploration campaigns that have taken place during the last 700 years, and considers them from a risk perspective. The explorations are those led by Christopher Columbus, Sir Walter Raleigh, John Franklin, Sir Ernest Shackleton, the Company of Scotland to Darien and the Apollo project undertaken by NASA. To provide a wider context for investigating the selected exploration campaigns, we seek ways of finding analogies at mission, programmatic and strategic levels and thereby to develop common themes. Ultimately, the purpose of the study is to understand how risk has shaped past explorations, in order to learn lessons for the future. From this, we begin to identify and develop tools for assessing strategic risk in future explorations. Figure 0.1 (see Page 6) summarizes the key inputs used to shape the study, the process and the results, and provides a graphical overview of the methodology used in the project. The first step was to identify the potential cases that could be assessed and to create criteria for selection. These criteria were collaboratively developed through discussion with a Business Historian. From this, six cases were identified as meeting our key criteria. Preliminary analysis of two of the cases allowed us to develop an evaluation framework that was used across all six cases to ensure consistency. This framework was revised and developed further as all six cases were analyzed. A narrative and summary statistics were created for each exploration case studied, in addition to a method for visualizing the important dimensions that capture major events. These Risk Experience Diagrams illustrate how the realizations of events, linked to different types of risks, have influenced the historical development of each exploration campaign. From these diagrams, we can begin to compare risks across each of the cases using a common framework. 
In addition, exploration risks were classified in terms of mission, program and strategic risks. From this, a Venn diagram and Belief Network were developed to identify how different exploration risks interacted. These diagrams allow us to quickly view the key risk drivers and their interactions in each of the historical cases. By looking at the context in which individual missions take place we have been able to observe the dynamics within an exploration campaign, and gain an understanding of how these interact with influences from stakeholders and competitors. A qualitative model has been created to capture how these factors interact, and are further challenged by unwanted events such as mission failures and competitor successes. This Dynamic Systemic Risk Model is generic and applies broadly to all the exploration ventures studied. This model is an amalgamation of a System Dynamics model, hence incorporating the natural feedback loops within each exploration mission, and a risk model, in order to ensure that the unforeseen events that may occur can be incorporated into the modeling. Finally, an overview is given of the motivational drivers and summaries are presented of the overall costs borne in each exploration venture. An important observation is that all the cases - with the exception of Apollo - were failures in terms of meeting their original objectives. However, despite this, several were strategic successes and indeed changed goals as needed in an entrepreneurial way. The Risk Experience Diagrams developed for each case were used to quantitatively assess which risks were realized most often during our case studies and to draw comparisons at mission, program and strategic levels. In addition, using the Risk Experience Diagrams and the narrative of each case, specific lessons for future exploration were identified. There are three key conclusions to this study: Analyses of historical cases have shown that there exists a set of generic risk classes. 
This set of risk classes covers mission, program and strategic levels, and includes all the risks encountered in the cases studied. At mission level these are Leadership Decisions, Internal Events and External Events; at program level these are Lack of Learning, Resourcing and Mission Failure; at strategic level they are Programmatic Failure, Stakeholder Perception and Goal Change. In addition, there are two further risks that impact at all levels: Self-Interest of Actors, and False Model. There is no reason to believe that these risk classes will not be applicable to future exploration and colonization campaigns. We have deliberately selected a range of different exploration and colonization campaigns, taking place between the 15th and the 20th centuries. The generic risk framework is able to describe the significant types of risk for these missions. Furthermore, many of these risks relate to how human beings interact and learn lessons to guide their future behavior. Although we are better schooled than our forebears and are technically further advanced, there is no reason to think we are fundamentally better at identifying, prioritizing and controlling these classes of risk. Modern risk modeling techniques are capable of addressing mission and program risk but are not as well suited to strategic risk. We have observed that strategic risks are prevalent throughout historic exploration and colonization campaigns. However, systematic approaches do not exist at the moment to analyze such risks. A risk-informed approach to understanding what happened in the past helps us guard against the danger of assuming that those events were inevitable, and highlights those chance events that produced the history the world experienced. In turn, it allows us to learn more clearly from the past about the way our modern risk modeling techniques might help us to manage the future - and also bring to light those areas where they may not. This study has been retrospective.
Based on this analysis, the potential for developing the work in a prospective way by applying the risk models to future campaigns is discussed. Follow-on work from this study will focus on creating a portfolio of tools for assessing strategic and programmatic risk.

    Spectres in the studio : Time, memory and the artist's haunted imagination

    I spent a week in a haunted house, and that is what inspired this project. I then discovered hauntology, a concept developed by the French philosopher Jacques Derrida and articulated in his 1993 lecture Specters of Marx. Hauntology is a theory reliant on time, in which the historical past and the envisaged future bleed into the present moment: it recognises the ghosts of the past that are bound to return again and again to haunt the mind, as well as spectres that represent expectant futures that did not eventuate. While hauntology (also known as spectral studies) has been used as a thematic device in the fields of music and literature as both a creative and a critical tool, it has not been utilised to the same degree in the visual arts. This PhD by creative project aims to correct that by using hauntology as a framework through which to explore my own visual art practice. To do this I have made a body of work that reflects my own life experience and memories as read through a hauntological sensibility. By beginning with an overview of hauntology and then reflecting on the recursive nature of memory evidenced throughout my previous artworks, I suggest in this thesis that hauntology is an interesting and revealing framework through which to better understand the nature of both my art practice and contemporary art practice in general. In my discussion, I examine the parallel themes of memory and collapsed time, as well as motifs such as the ‘white-sheet ghost’ and abandoned domestic spaces. Inspired by this research, I have created a body of work using hauntology as my visual framework, exploring haunted memories and moments from my own past, something I have never attempted previously. One of my aims was to address certain incidents and moments from my own life that have stayed with me, and to show how my haunted imagination can inform my art practice.
The creation of artworks about these past events neither dulls nor exorcises the memories, but allows the artist to better comprehend the range and depth of human experience and the way it contributes to, and haunts, our subjectivity both in the present and, probably, in the future. The final body of work was exhibited on the Australian Catholic University campus in Brisbane to complete this project.

    An evolutionary approach to the representation of adverse events

    One way to detect, monitor and prevent adverse events with the help of Information Technology is by using ontologies capable of representing three levels of reality: what is the case, what is believed about reality, and what is represented. We report on how Basic Formal Ontology and Referent Tracking exhibit this capability and how they are used to develop an adverse event ontology and related data annotation scheme for the European ReMINE project.