
    South Bank Corporation's Draft Ecologically Sustainable Development Policy and Implementation Strategy

    The Centre for Subtropical Design has reviewed the Draft Ecologically Sustainable Development Policy and Implementation Strategy provided by South Bank Corporation, gathering a team of QUT experts to comment on the full range of sustainability aspects covered by the policy. The Centre has prepared this submission to assist South Bank Corporation in finalising an ESD policy and implementation strategy that will create a truly sustainable, prosperous, and liveable urban parkland precinct.

    Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis

    Introduction: Syndromic surveillance aims to augment traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data, such as web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. Diverse interactions between performance indicators, such as timeliness, and various system characteristics make the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether comparing several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods: We compiled case-based, mixed data on the performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health-threatening events. Results: We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. Conclusions: We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped to interpret complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.
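    The crisp-set analysis described above reduces each system to binary conditions and outcomes and looks for configurations consistently associated with success. Below is a minimal sketch of that truth-table logic in Python, using hypothetical conditions (non-clinical data source, automation, multi-syndrome capability) and invented cases rather than the study's 19 real systems.

```python
# A minimal sketch of crisp-set QCA-style truth-table analysis.
# The cases, conditions, and coding below are illustrative only.

from collections import defaultdict

# Conditions: non-clinical data source, automated processing, multiple syndromes.
# Outcome: timely detection (1) or not (0).  All values are hypothetical.
cases = {
    "system_A": {"non_clinical": 1, "automated": 1, "multi_syndrome": 1, "timely": 1},
    "system_B": {"non_clinical": 1, "automated": 0, "multi_syndrome": 1, "timely": 1},
    "system_C": {"non_clinical": 0, "automated": 1, "multi_syndrome": 0, "timely": 0},
    "system_D": {"non_clinical": 0, "automated": 0, "multi_syndrome": 1, "timely": 0},
    "system_E": {"non_clinical": 1, "automated": 1, "multi_syndrome": 0, "timely": 1},
}

conditions = ["non_clinical", "automated", "multi_syndrome"]

# Group cases into truth-table rows (one row per configuration of conditions).
rows = defaultdict(list)
for name, c in cases.items():
    rows[tuple(c[k] for k in conditions)].append(name)

# A configuration is treated as sufficient for the outcome when every case
# showing that configuration also shows the outcome (consistency = 1.0).
for config, members in sorted(rows.items()):
    outcomes = [cases[m]["timely"] for m in members]
    consistency = sum(outcomes) / len(outcomes)
    label = " * ".join(
        cond.upper() if present else f"~{cond}"
        for cond, present in zip(conditions, config)
    )
    print(f"{label:45s} cases={members} consistency={consistency:.2f}")
```

    In a full crisp-set QCA the sufficient configurations would then be logically minimized (for example with the Quine-McCluskey procedure); this sketch stops at the consistency check.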

    Assessing Syndromic Surveillance of Cardiovascular Outcomes from Emergency Department Chief Complaint Data in New York City

    Prospective syndromic surveillance of emergency department visits has been used for near real-time tracking of communicable diseases to detect outbreaks or other unexpected disease clusters. The utility of syndromic surveillance for tracking cardiovascular events, which may be influenced by environmental factors and influenza, has not been evaluated. We developed and evaluated a method for tracking cardiovascular events using emergency department free-text chief complaints. There were three phases to our analysis. First, we applied text-processing algorithms to chief complaint data reported by 11 New York City emergency departments for which ICD-9 discharge diagnosis codes were available, evaluating them in terms of sensitivity, specificity, and positive predictive value. Second, the same algorithms were applied to data reported by a larger sample of 50 New York City emergency departments for which discharge diagnoses were unavailable. From this more complete data, we evaluated the consistency of temporal variation between cardiovascular syndromic events and hospitalizations from 76 New York City hospitals. Finally, we examined associations between particulate matter ≤2.5 µm (PM2.5), syndromic events, and hospitalizations. Sensitivity and positive predictive value were low for syndromic events, while specificity was high. Using the larger sample of emergency departments, a strong day-of-week pattern and a weak seasonal trend were observed for syndromic events and hospitalizations. These time series were highly correlated after removing the day-of-week, holiday, and seasonal trends. The estimated percent excess risks in the cold season (October to March) were 1.9% (95% confidence interval (CI): 0.6, 3.2), 2.1% (95% CI: 0.9, 3.3), and 1.8% (95% CI: 0.5, 3.0) per same-day 10 µg/m³ increase in PM2.5 for cardiac-only syndromic data, cardiovascular syndromic data, and hospitalizations, respectively. Near real-time emergency department chief complaint data may be useful for timely surveillance of cardiovascular morbidity related to ambient air pollution and other environmental events.
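    As a rough illustration of the first phase, the sketch below classifies free-text chief complaints with a keyword rule and scores the rule against ICD-9 discharge diagnoses. The keyword list, ICD-9 prefixes, and example visits are illustrative assumptions, not the study's actual syndrome definitions.

```python
# A minimal sketch of keyword-based syndrome coding of free-text chief
# complaints, scored against ICD-9 discharge diagnoses.  All keywords, code
# prefixes, and visits are toy placeholders.

CARDIAC_KEYWORDS = {"chest pain", "palpitations", "heart attack", "cp"}
CARDIAC_ICD9_PREFIXES = ("410", "411", "413", "427")  # e.g. AMI, angina, dysrhythmia

def flag_cardiac(chief_complaint: str) -> bool:
    """Keyword rule: flag a visit as a cardiac syndromic event."""
    text = chief_complaint.lower()
    return any(kw in text for kw in CARDIAC_KEYWORDS)

def is_cardiac_dx(icd9_code: str) -> bool:
    """Gold standard: discharge diagnosis falls in a cardiac ICD-9 range."""
    return icd9_code.startswith(CARDIAC_ICD9_PREFIXES)

# (chief complaint, ICD-9 discharge diagnosis) pairs -- toy data.
visits = [
    ("CHEST PAIN x 2 hrs", "410.71"),
    ("sob and palpitations", "427.31"),
    ("abd pain", "789.00"),
    ("weakness, dizzy", "427.89"),
    ("cp radiating to arm", "413.9"),
]

tp = fp = fn = tn = 0
for complaint, dx in visits:
    flagged, truth = flag_cardiac(complaint), is_cardiac_dx(dx)
    tp += flagged and truth
    fp += flagged and not truth
    fn += (not flagged) and truth
    tn += (not flagged) and (not truth)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} PPV={ppv:.2f}")
```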

    Exploring spatial-frequency-sequential relationships for motor imagery classification with recurrent neural network

    Background: Conventional methods for motor imagery brain-computer interfaces (MI-BCIs) suffer from limited numbers of samples and simplified features, producing poor performance with spatial-frequency features and shallow classifiers. Methods: Alternatively, this paper applies a deep recurrent neural network (RNN) with a sliding window cropping strategy (SWCS) to the signal classification of MI-BCIs. The spatial-frequency features are first extracted by the filter bank common spatial pattern (FB-CSP) algorithm, and these features are cropped by the SWCS into time slices. By extracting spatial-frequency-sequential relationships, the cropped time slices are then fed into the RNN for classification. To overcome memory distractions, the commonly used gated recurrent unit (GRU) and long short-term memory (LSTM) unit are applied in the RNN architecture, and experimental results are used to determine which unit is more suitable for processing EEG signals. Results: Experimental results on common BCI benchmark datasets show that the spatial-frequency-sequential relationships outperform all other competing spatial-frequency methods. In particular, the proposed GRU-RNN architecture achieves the lowest misclassification rates on all BCI benchmark datasets. Conclusion: By introducing spatial-frequency-sequential relationships with cropped time-slice samples, the proposed method offers a novel way to construct high-accuracy and robust MI-BCIs based on limited trials of EEG signals.
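    A minimal sketch of the classification stage is shown below, assuming FB-CSP features have already been extracted and cropped into time slices by the SWCS. It uses PyTorch's stock GRU; layer sizes, slice counts, and the random inputs are placeholders, not the paper's configuration.

```python
# Sketch of a GRU-based classifier over cropped time slices of FB-CSP features.
# Shapes and hyperparameters are illustrative assumptions.

import torch
import torch.nn as nn

class GRUClassifier(nn.Module):
    def __init__(self, n_features: int, n_classes: int, hidden: int = 64):
        super().__init__()
        # The GRU reads the sequence of cropped slices; the last hidden state
        # feeds a linear classification head.
        self.gru = nn.GRU(input_size=n_features, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):              # x: (batch, n_slices, n_features)
        _, h_n = self.gru(x)           # h_n: (1, batch, hidden)
        return self.head(h_n[-1])      # logits: (batch, n_classes)

# Toy batch: 8 trials, each cropped into 20 slices of 16 FB-CSP features.
x = torch.randn(8, 20, 16)
y = torch.randint(0, 4, (8,))          # four motor-imagery classes

model = GRUClassifier(n_features=16, n_classes=4)
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()                        # one illustrative training step (no optimizer shown)
print("logits shape:", model(x).shape, "loss:", float(loss))
```

    Swapping `nn.GRU` for `nn.LSTM` (whose second return value is a `(h_n, c_n)` tuple) gives the LSTM variant the paper compares against.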

    Using Ethnographic Methods to Articulate Community-Based Conceptions of Cultural Heritage Management

    How can ethnographic methods help communities articulate and enact their own conceptions of heritage management? This and related questions are being explored through an international research project, ‘Intellectual Property Issues in Cultural Heritage’. The project includes up to twenty community-based initiatives that incorporate community-based participatory research and ethnographic methods to explore emerging intellectual property-related issues in archaeological contexts; the means by which they are being addressed or resolved; and the broader implications of these issues and concerns. We discuss three examples that use ethnography to (a) articulate local or customary laws and principles of archaeological heritage management among a First Nations group in British Columbia; (b) assemble knowledge related to land/sea use and cultural practices of the Moriori people of Rekohu (Chatham Islands) for their use in future land and heritage management policies; and (c) aid a tribal cultural centre in Michigan in crafting co-management strategies to protect spiritual traditions associated with a rock art site on state property. Such situations call for participatory methods that place control over the design, process, products, and interpretation of ‘archaeology’ in the hands of cultural descendants. We hope that these examples of community-based conceptions of archaeological heritage management, facilitated through ethnographic methods and participatory approaches, will increase awareness of the value of these and other alternative approaches and the need to share them widely.

    Sensitivity to Experiencing Alcohol Hangovers: Reconsideration of the 0.11% Blood Alcohol Concentration (BAC) Threshold for Having a Hangover

    The 2010 Alcohol Hangover Research Group consensus paper defined a cutoff blood alcohol concentration (BAC) of 0.11% as a toxicological threshold indicating that sufficient alcohol had been consumed to develop a hangover. The cutoff was based on previous research and applied mostly in studies comprising student samples. Previously, we showed that sensitivity to hangovers depends on (estimated) BAC during acute intoxication, with a greater percentage of drinkers reporting hangovers at higher BAC levels. However, a substantial number of participants also reported hangovers at comparatively lower BAC levels, which calls the suitability of the 0.11% threshold into question. Recent research has shown that subjective intoxication, i.e., the severity of reported drunkenness, and not BAC, is the most important determinant of hangover severity. Non-student samples often have a much lower alcohol intake than student samples, and their overall BACs often remain below 0.11%. Despite these lower BACs, many non-student participants report having a hangover, especially when their subjective intoxication levels are high. This may be the case when alcohol consumption on the drinking occasion that results in a hangover significantly exceeds their “normal” drinking level, irrespective of whether the 0.11% threshold is met. Whereas consumers may have relative tolerance to the adverse effects at their “regular” drinking level, considerably higher alcohol intake, irrespective of the absolute amount, may consequently result in a next-day hangover. Taken together, these findings suggest that the 0.11% threshold value as a criterion for having a hangover should be abandoned.
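    For a concrete sense of where the 0.11% cutoff falls, the sketch below estimates peak BAC with the Widmark approximation and compares it to the threshold. The formula is a common stand-in for "estimated BAC" and is not taken from the paper; the example drinking sessions are hypothetical.

```python
# Widmark-style BAC estimate compared against the 0.11% cutoff.  The formula
# and parameters (r, elimination rate) are a standard approximation used here
# purely for illustration; the drinkers are hypothetical.

def estimated_bac(std_drinks: float, weight_kg: float, hours: float,
                  r: float = 0.68, grams_per_drink: float = 14.0,
                  beta: float = 0.015) -> float:
    """Estimated peak BAC in % (g/dL), floored at zero."""
    grams_alcohol = std_drinks * grams_per_drink
    bac = grams_alcohol / (weight_kg * 1000 * r) * 100 - beta * hours
    return max(bac, 0.0)

THRESHOLD = 0.11  # % BAC cutoff from the 2010 consensus paper

for label, drinks, weight, hours in [
    ("student-style session", 10, 75, 5),
    ("lighter social drinker", 4, 75, 4),
]:
    bac = estimated_bac(drinks, weight, hours)
    print(f"{label}: estimated BAC ~{bac:.3f}% "
          f"({'above' if bac >= THRESHOLD else 'below'} the 0.11% threshold)")
```

    The lighter session lands well below 0.11% in this toy calculation, which mirrors the paper's point: many non-student drinkers never reach the cutoff yet still report hangovers.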

    Ocean data product integration through innovation-the next level of data interoperability

    In the next decade, the pressures on ocean systems and the communities that rely on them will increase, along with impacts from the multiple stressors of climate change and human activities. Our ability to manage and sustain our oceans will depend on the data we collect and the information and knowledge derived from it. Much of the uptake of this knowledge will be outside the ocean domain, for example by policy makers, local governments, custodians, and other organizations, so it is imperative that we democratize, or open up, access to and use of ocean data. This paper looks at how technologies, scoped by standards, best practice, and communities of practice, can be deployed to change the way that ocean data is accessed, utilized, augmented, and transformed into information and knowledge. The current portal-download model, which requires the user to know what data exists, where it is stored, in what format, and with what processing, limits the uptake and use of ocean data. Using examples from a range of disciplines, a web-services model of data and information flows is presented. A framework is described, including the systems, processes, and human components, which delivers a radical rethink of how knowledge is delivered from ocean data. A series of statements describes parts of the future vision, along with recommendations about how this may be achieved. The paper recommends the development of virtual test-beds for end-to-end development of new data workflows and knowledge pathways. This supports the continued development, rationalization, and uptake of standards; creates a platform around which a community of practice can be developed; promotes cross-discipline engagement from ocean science through to ocean policy; allows the commercial sector, including the informatics sector, to partner in delivering outcomes; and provides a focus to leverage long-term sustained funding. The next 10 years will be “make or break” for many ocean systems. The decadal challenge is to develop the governance and co-operative mechanisms to harness emerging information technology to deliver on the goal of generating the information and knowledge required to sustain oceans into the future.
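    The contrast between the portal-download model and a web-services model can be made concrete with a small client-side sketch: the consumer asks a service for exactly the subset it needs and gets analysis-ready data back. The endpoint URL, parameter names, and variable names below are hypothetical placeholders, not a real service's API.

```python
# Sketch of web-services style access to ocean data: request a spatial and
# temporal subset of one variable and receive tidy CSV, instead of manually
# locating and downloading whole files from a portal.  The endpoint and its
# query parameters are invented for illustration.

import io
import requests
import pandas as pd

ENDPOINT = "https://example.org/ocean-data/api/v1/timeseries"  # hypothetical service

params = {
    "variable": "sea_surface_temperature",
    "bbox": "150.0,-35.0,155.0,-30.0",   # lon_min,lat_min,lon_max,lat_max
    "start": "2020-01-01",
    "end": "2020-12-31",
    "format": "csv",
}

response = requests.get(ENDPOINT, params=params, timeout=30)
response.raise_for_status()

# The service returns analysis-ready CSV; no local file wrangling required.
df = pd.read_csv(io.StringIO(response.text), parse_dates=["time"])
print(df.groupby(df["time"].dt.month)["sea_surface_temperature"].mean())
```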

    Evolutionary Epidemiology of Drug-Resistance in Space

    The spread of drug-resistant parasites erodes the efficacy of therapeutic treatments against many infectious diseases and is a major threat of the 21st century. The evolution of drug-resistance depends, among other things, on how treatments are administered at the population level. “Resistance management” consists of finding optimal treatment strategies that both reduce the consequences of an infection at the individual host level and limit the spread of drug-resistance in the pathogen population. Several studies have focused on the effect of mixing different treatments, or of alternating them in time. Here, we analyze another strategy, in which the use of the drug varies spatially: there are places where no one receives any treatment. We find that such spatial heterogeneity can entirely prevent the rise of drug-resistance, provided that the size of the treated patches is below a critical threshold. The range of parasite dispersal, the relative costs and benefits of being drug-resistant compared to being drug-sensitive, and the duration of an infection with drug-resistant parasites are the main factors determining the value of this threshold. Our analysis thus provides some general guidance regarding the optimal spatial use of drugs to prevent or limit the evolution of drug-resistance.
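    The spatial argument can be illustrated with a toy two-strain, multi-patch SIS model: treatment is applied only in alternating blocks of patches, resistance carries a transmission cost, and infections disperse to neighbouring patches. All parameter values and the patch layout below are illustrative assumptions, not the paper's model or parameterization.

```python
# Toy two-strain SIS dynamics on a ring of patches with spatially restricted
# treatment.  Everything here is an illustrative assumption.

import numpy as np

n_patches, patch_size = 40, 4          # treated/untreated blocks of 4 patches each
beta, gamma = 0.5, 0.1                 # transmission and natural clearance rates
treat_clear, cost, m = 0.3, 0.1, 0.05  # extra clearance under treatment, resistance cost, dispersal
dt, steps = 0.1, 20000

# Alternating blocks of treated (True) and untreated (False) patches.
treated = (np.arange(n_patches) // patch_size) % 2 == 0

I_s = np.full(n_patches, 0.10)         # prevalence of drug-sensitive infections
I_r = np.full(n_patches, 0.01)         # prevalence of drug-resistant infections

def mix(x):
    """Local dispersal: a fraction m of contacts reach the two neighbouring patches."""
    return (1 - m) * x + (m / 2) * (np.roll(x, 1) + np.roll(x, -1))

for _ in range(steps):
    S = 1.0 - I_s - I_r                                   # susceptible fraction per patch
    dI_s = beta * S * mix(I_s) - (gamma + treat_clear * treated) * I_s
    dI_r = beta * (1 - cost) * S * mix(I_r) - gamma * I_r  # resistant strain ignores treatment
    I_s += dt * dI_s
    I_r += dt * dI_r

print("mean resistant prevalence :", I_r.mean().round(3))
print("mean sensitive prevalence :", I_s.mean().round(3))
```

    Varying `patch_size` and the dispersal fraction `m` in this toy model lets you explore whether small treated patches, continually fed by sensitive infections dispersing in from untreated neighbours, can keep resistance from rising, which is the qualitative threshold effect the analysis above formalizes.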