12 research outputs found

    Measuring the public health impact of vaccines: disease burden, vaccine coverage, safety and effectiveness

    This thesis has eight chapters describing the inter-relationships between work in 66 papers published between 1999 and 2020 relevant to the over-arching topic of the public health impact of vaccines: measurement of disease burden, vaccine coverage, safety and effectiveness. Chapter 1 outlines the key data sources used: 1. routinely collected administrative data (disease notifications, ICD-coded hospitalisations and mortality data) and 2. additional data sources the author had a key role in developing (National Serosurveillance and Paediatric Active Enhanced Surveillance (PAEDS)). In Chapter 4, the development of analysis and reporting of data from the Australian Immunisation Register is described, and in Chapter 6 the development of platforms for vaccine safety evaluation. In Chapter 5, how this work culminated in pilot initiatives to link data sources relevant to the public health impact of vaccines in a birth cohort from New South Wales and Western Australia, with the aim of demonstrating the potential for an all-age national capacity, is outlined. Chapter 2 focuses on disease due to Bordetella pertussis and research under the headings of measuring the prevalence and severity of pertussis, the effectiveness and impact of pertussis vaccines, and clinical trials conducted to evaluate the immunogenicity and safety of acellular pertussis vaccine given within 4 days of birth. Chapter 3 focuses on disease due to Streptococcus pneumoniae and research measuring pneumococcal disease, the effectiveness and impact of pneumococcal vaccines, and a randomised trial comparing immune responses to pneumococcal conjugate and polysaccharide vaccines in the frail elderly. Chapter 7 includes studies of vaccine impact against hepatitis B, varicella, meningococcal C disease, mumps and Q fever, and Chapter 8 includes four major international reviews of vaccine programs and impact.

    Evidence-based eLearning Design: Develop and Trial a Prototype Software Instrument for Evaluating the Quality of eLearning Design Within a Framework of Cognitive Load Theory

    A major research direction within higher education in Australia and internationally is the evaluation of learning design quality and the extent to which the design–teaching–learning–evaluation cycle is evidence based. The quest for increased evidence-based learning design, which has been influenced by evidence-based medical research standards, is driven by its link to improved learning outcomes, higher learner engagement levels and lower attrition rates. Cognitive Load Theory (CLT) has risen to prominence over the past three decades as an evidence-based framework for informing instructional design in traditional, blended and multimedia learning environments. CLT approaches learning from the perspective of engaging specific strategies to manage the loads imposed on a limited working memory in order to form and automate long-term memory schemas. CLT operates on the premise that optimal learning conditions may be obtained by aligning pedagogical strategies with the structure and functions of human cognitive architecture and the individual learner’s prior knowledge. CLT has contributed a suite of strategies derived from a unified model of human cognitive architecture and validated through randomised controlled trial (RCT) experiments as having strengthening effects on learning, making the CLT framework suitable for use as an evidence-based standard in this study. To date, no single digital system has been developed for managing, monitoring and evaluating the implementation and impact of CLT strategies at scale. The key contribution of this study is a new prototype software instrument called the Cognitive Load Evaluation Management System (CLEMS) that addresses this issue and also provides a model for its implementation.
CLEMS is underpinned by a personalised model of teacher–learner interactions, defined as mediative–adaptive in nature, that includes diagnostic conversations (DCs) for identifying barriers to learning, interventions called Nodes of Expertise (NOEs) for advancing learners to new levels of understanding of complex knowledge, and validation conversations (VCs) for evaluating learner progress. In addition, the heutagogical or self-directed learning capability of learners, including motivation, has been brought to the fore as a significant factor contributing to schema automation. A qualitative Design-based Research (DBR) methodological approach was used to develop CLEMS, which emerged over three research iterations through the synthesis of literature review findings and empirical data from expert focus groups. Emergent data was continuously triangulated between research iterations and ongoing literature reviews to refine the design and development of CLEMS from a theoretical model to an operational digital prototype. The conceptual framework of the study has been derived from Critical Realism (CR), which posits an ontological–epistemological view of reality that is stratified and multi-mechanistic, thus aligning with the complex nature of authentic learning environments as well as the multi-faceted model of human cognitive architecture contributed by CLT. The implications of the study have been discussed with reference to stakeholders including teachers, learners and educational institutions. Recommendations for future research include the ongoing development of CLEMS for the systematic implementation of CLT strategies at scale. Thesis (Ph.D.) -- University of Adelaide, School of Education, 202

    Debating Lapita: Distribution, chronology, society and subsistence

    ‘This volume is the most comprehensive review of Lapita research to date, tackling many of the lingering questions regarding origin and dispersal. Multidisciplinary in nature with a focus on summarising new findings, but also identifying important gaps that can help direct future research.’ — Professor Scott Fitzpatrick, Department of Anthropology, University of Oregon ‘This substantial volume offers a welcome update on the definition of the Lapita culture. It significantly refreshes the knowledge on this foundational archaeological culture of the Pacific Islands in providing new data on sites and assemblages, and new discussions of hypotheses previously proposed.’ — Dr Frédérique Valentin, Centre national de la recherche scientifique (CNRS), Paris This volume comprises 23 chapters that focus on the archaeology of Lapita, a cultural horizon associated with the founding populations who first colonised much of the southwest Pacific some 3000 years ago. The Lapita culture has been most clearly defined by its distinctive dentate-stamped decorated pottery and the design system represented on it and on further incised pots. Modern research now encompasses a whole range of aspects associated with Lapita, and this is reflected in this volume. The broad overlapping themes of the volume — Lapita distribution and chronology, society and subsistence — relate to research questions that have long been debated in relation to Lapita.

    Proceedings of the WABER 2017 Conference

    The scientific information published in peer-reviewed outlets carries special status, and confers unique responsibilities on editors and authors. We must protect the integrity of the scientific process by publishing only manuscripts that have been properly peer-reviewed by scientific reviewers and confirmed by editors to be of sufficient quality. I confirm that all papers in the WABER 2017 Conference Proceedings have been through a peer review process involving initial screening of abstracts, review of full papers by at least two referees, reporting of comments to authors, revision of papers by authors, and re-evaluation of re-submitted papers to ensure quality of content. It is the policy of the West Africa Built Environment Research (WABER) Conference that all papers must go through a systematic peer review process involving examination by at least two referees who are knowledgeable on the subject. A paper is only accepted for publication in the conference proceedings based on the recommendation of the reviewers and the decision of the editors. The names and affiliations of members of the Scientific Committee & Review Panel for the WABER 2017 Conference are published in the Conference Proceedings and on our website www.waberconference.com. Papers in the WABER Conference Proceedings are published open access on the conference website www.waberconference.com to facilitate public access to the research papers and wider dissemination of the scientific knowledge.

    Performance assessment of real-time data management on wireless sensor networks

    Technological advances in recent years have allowed the maturity of Wireless Sensor Networks (WSNs), which aim at performing environmental monitoring and data collection. This sort of network is composed of hundreds, thousands or possibly even millions of tiny smart computers known as wireless sensor nodes, which may be battery powered and equipped with sensors, a radio transceiver, a Central Processing Unit (CPU) and some memory. However, due to their small size and the requirement for low-cost nodes, sensor node resources such as processing power, storage and especially energy are very limited. Once the sensors perform their measurements of the environment, the problem of storing and querying the data arises. In fact, the sensors have restricted storage capacity, and the ongoing interaction between sensors and environment results in huge amounts of data. Techniques for data storage and query in WSNs can be based on either external or local storage. External storage, called the warehousing approach, is a centralized system in which the data gathered by the sensors are periodically sent to a central database server where user queries are processed. Local storage, on the other hand, called the distributed approach, exploits the computational capabilities of the sensors, which act as local databases; data may be stored both in a central database server and in the devices themselves, enabling queries over both. WSNs are used in a wide variety of applications, which may perform certain operations on collected sensor data. For certain applications, however, such as real-time applications, the sensor data must closely reflect the current state of the targeted environment. Yet the environment changes constantly, and the data is collected at discrete moments in time. As such, the collected data has a temporal validity, and as time advances it becomes less accurate, until it no longer reflects the state of the environment.
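The notion of temporal validity described above can be made concrete with a small sketch. This is an illustration, not code from the thesis; the `Reading` class and its `validity_interval` field are hypothetical names chosen for the example.

```python
import time
from dataclasses import dataclass

@dataclass
class Reading:
    value: float              # the sensed measurement
    timestamp: float          # when it was sensed (seconds since epoch)
    validity_interval: float  # seconds during which the value counts as fresh

    def is_valid(self, now=None):
        # A reading reflects the environment only within its validity interval;
        # after that it is stale and must not be used to answer real-time queries.
        if now is None:
            now = time.time()
        return (now - self.timestamp) <= self.validity_interval

# A temperature sample taken at t=1000 that stays valid for 30 seconds:
r = Reading(value=21.5, timestamp=1000.0, validity_interval=30.0)
print(r.is_valid(now=1010.0))  # True: 10 s old, still fresh
print(r.is_valid(now=1040.0))  # False: 40 s old, stale
```

A real-time query processor would discard stale readings like the second case and trigger re-sampling instead of answering with outdated data.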
Thus, applications such as industrial automation and aviation must query and analyze the data within a bounded time in order to make decisions and react efficiently. In this context, the design of efficient real-time data management solutions is necessary to deal with both time constraints and energy consumption. This thesis studies real-time data management techniques for WSNs. In particular, it focuses on the challenges in handling real-time data storage and query for WSNs and on efficient real-time data management solutions for WSNs. First, the main specifications of real-time data management are identified, and the real-time data management solutions for WSNs available in the literature are presented. Secondly, in order to provide an energy-efficient real-time data management solution, the techniques used to manage data and queries in WSNs based on the distributed paradigm are studied in depth. In fact, many research works argue that the distributed approach is the most energy-efficient way of managing data and queries in WSNs, as opposed to warehousing. In addition, this approach can provide quasi-real-time query processing, because the most current data will be retrieved from the network. Thirdly, based on these two studies and considering the complexity of developing, testing and debugging this kind of complex system, a model for a simulation framework for real-time database management on WSNs using a distributed approach, together with its implementation, is proposed. This will help to explore various real-time database techniques for WSNs before deployment, saving time and money. Moreover, one may improve the proposed model by adding the simulation of protocols or by porting part of this simulator to another available simulator. To validate the model, a case study considering real-time constraints as well as energy constraints is discussed.
Fourth, a new architecture that combines statistical modeling techniques with the distributed approach, together with a query processing algorithm to optimize real-time user query processing, is proposed. This combination allows a query processing algorithm based on admission control that uses the error tolerance and the probabilistic confidence interval as admission parameters. Experiments based on real-world data sets as well as synthetic data sets demonstrate that the proposed solution optimizes real-time query processing to save more energy while meeting low-latency requirements. Fundação para a Ciência e a Tecnologia
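The admission-control idea in the fourth contribution can be sketched as follows. This is a minimal illustration of the general technique, not the thesis's algorithm: the function name, the use of a normal-approximation confidence interval, and the parameter choices are all assumptions made for the example.

```python
import math
import statistics

def admit_query(samples, error_tolerance, z=1.96):
    """Decide whether a model-based answer is precise enough for a query.

    samples: recent readings held by the statistical model
    error_tolerance: maximum error the user accepts in the answer
    z: normal quantile for the confidence level (1.96 is roughly 95%)
    """
    mean = statistics.mean(samples)
    # Half-width of the confidence interval around the model's estimate.
    half_width = z * statistics.stdev(samples) / math.sqrt(len(samples))
    if half_width <= error_tolerance:
        return ("model", mean)   # answer locally: no radio traffic, low latency
    return ("network", None)     # estimate too uncertain: query the sensors

# Stable readings: the model's interval fits the tolerance, so it answers.
print(admit_query([20.1, 20.2, 19.9, 20.0], error_tolerance=0.5))
# Volatile readings: the interval is too wide, so the query goes to the network.
print(admit_query([15.0, 25.0, 10.0, 30.0], error_tolerance=0.5))
```

The energy saving comes from the first branch: every query answered from the model avoids waking sensor radios, at the cost of a bounded, user-specified error.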

    Distributed Database Management Techniques for Wireless Sensor Networks

    In sensor networks, the large amount of data generated by sensors greatly influences the lifetime of the network. In order to manage this amount of sensed data in an energy-efficient way, new methods of storage and data query are needed. Here, the distributed database approach for sensor networks has proved to be one of the most energy-efficient data storage and query techniques. This paper surveys the state of the art of the techniques used to manage data and queries in wireless sensor networks based on the distributed paradigm. A classification of these techniques is also proposed.
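One widely studied family of techniques in this distributed-storage literature is data-centric storage, where data is indexed by its type rather than by the node that produced it. The sketch below is a generic illustration of that idea, not code from the surveyed paper; the function and node names are hypothetical.

```python
import hashlib

def storage_node(event_type, nodes):
    # Hash the event type so that every producer and every consumer maps a
    # given kind of data to the same node, with no coordination traffic.
    digest = hashlib.sha256(event_type.encode()).digest()
    return nodes[int.from_bytes(digest[:4], "big") % len(nodes)]

nodes = ["node-%02d" % i for i in range(16)]

# Storing and querying "temperature" readings both resolve to the same node,
# so a query visits a single location instead of flooding the whole network.
assert storage_node("temperature", nodes) == storage_node("temperature", nodes)
print(storage_node("temperature", nodes), storage_node("humidity", nodes))
```

In a deployed WSN the hash would typically map to a geographic location rather than a node list, but the energy argument is the same: queries are routed to one rendezvous point instead of being broadcast.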
The goal of this work is not only to present how data and query management techniques have advanced, but also to show their benefits and drawbacks, and to identify open issues, providing guidelines for further contributions in this type of distributed architecture. This work was partially supported by the Instituto de Telecomunicacoes, Next Generation Networks and Applications Group (NetGNA), Portugal; by the Ministerio de Ciencia e Innovacion, through the Plan Nacional de I+D+i 2008-2011 in the Subprograma de Proyectos de Investigacion Fundamental, project TEC2011-27516; by the Polytechnic University of Valencia, through the PAID-05-12 multidisciplinary projects; by the Government of the Russian Federation, Grant 074-U01; and by National Funding from the FCT - Fundacao para a Ciencia e a Tecnologia through the Pest-OE/EEI/LA0008/2013 Project. Diallo, O.; Rodrigues, J.J.P.C.; Sene, M.; Lloret, J. (2013). Distributed Database Management Techniques for Wireless Sensor Networks. IEEE Transactions on Parallel and Distributed Systems, PP(99):1-17. https://doi.org/10.1109/TPDS.2013.207

    Research Selections 2012

    ANSTO is an instrument of the Australian Government committed to Australian scientists and researchers, Australian innovation and Australia’s future. ANSTO’s existence is inextricably linked to efforts to protect and sustain Australia’s environment, to improve our health, to find ways to better diagnose and treat diseases, to produce nuclear medicines and to make a contribution to the global progress of nuclear science and technology. While ANSTO’s world-class infrastructure and science platforms are a demonstration of the foresight of the Australian Government, through the work of our researchers - and our collaborative thinking and strong partnerships - we contribute significantly to momentum in scientific discovery: bringing ideas to life. Through strong collaborations great advances are made. ANSTO’s local and international partnerships, as evidenced by the articles in this year’s Research Selections, are part of a scientific network that reaches into universities, government, industry and other research organisations – all of which have a crucial contribution to make to research outcomes and, by extension, our living world. ANSTO is part of the international science and technology community and a hub for vibrant research engagement and discovery. Each year hundreds of the world’s top researchers who use nuclear techniques to advance knowledge head to ANSTO. These collaborations enable us to magnify our influence and reach across the geography of possibilities, not least through the mechanism of collaborative agreements with international organisations, thereby ensuring a gateway for Australian scientists to the best international facilities, including CERN’s Large Hadron Collider - home to the intrepid Higgs boson particle physicists. Research Selections is a small sample of the vast amount of great science being leveraged with our infrastructure. But science doesn’t stand still.
To continue to achieve great science and ensure Australia remains at the forefront of discovery and innovation, we need to develop, grow and build on existing foundations of knowledge. ANSTO is already custodian of OPAL – Australia’s only nuclear reactor and one of the best research reactors in the world. Thanks to OPAL, we are able to supply vital nuclear medicine for Australians, irradiate silicon for the global semi-conductor market and, importantly, provide neutrons enabling researchers to probe matter at the nano and atomic scale, unlocking mysteries that can lead to profound discovery. While OPAL puts ANSTO streets ahead of where it otherwise would be, we cannot and are not resting on our laurels. In April 2012, approximately 100 national and international experts attended the Second OPAL Guide Hall Workshop to discuss the medium- to long-term future of OPAL, an essential step towards the full exploitation of this world-leading research infrastructure. They discussed the extremely rapid advances in neutron beam instruments over the past few years and brainstormed strategies for using these advances and building a second suite of beamlines and instruments that will be at the forefront of neutron beam capabilities. ANSTO is also taking great strides in accelerator science. Our particle accelerators ANTARES and STAR are well established for analysing the elemental composition and age of materials using ion beam analysis and accelerator mass spectrometry. Two new accelerators will be established at ANSTO, as part of our Centre for Accelerator Science, which has been funded through investment by the Government. These will enhance our capabilities in, for example, radiocarbon dating of historical artefacts, environmental studies, and determining how fossil fuels are contributing to climate change.
The formation of the Australian Collaboration for Accelerator Science between the Australian Synchrotron, the University of Melbourne and the Australian National University is aimed at creating and maintaining a national pool of world-class facilities and accelerator expertise. This collaboration will ensure Australia remains at the leading edge of accelerator capabilities and facilities.