
    Efficacy of Feedforward and LSTM Neural Networks at Predicting and Gap Filling Coastal Ocean Timeseries: Oxygen, Nutrients, and Temperature

    Ocean data timeseries are vital for a diverse range of stakeholders (from government, to industry, to academia) to underpin research, support decision making, and identify environmental change. However, continuous monitoring and observation of ocean variables is difficult and expensive. Moreover, since oceans are vast, observations are typically sparse in spatial and temporal resolution. In addition, the hostile ocean environment creates challenges for collecting and maintaining data sets, such as instrument malfunctions and servicing, often resulting in temporal gaps of varying lengths. Neural networks (NNs) have proven effective in many diverse big data applications, but few oceanographic applications have been tested using modern frameworks and architectures. Therefore, here we demonstrate a “proof of concept” neural network application using a popular “off-the-shelf” framework, TensorFlow, to predict subsurface ocean variables, including dissolved oxygen and nutrient (nitrate, phosphate, and silicate) concentrations and temperature timeseries, and show how these models can be used successfully for gap filling data products. We achieved a final prediction accuracy of over 96% for oxygen and temperature, and mean squared errors (MSE) of 2.63, 0.0099, and 0.78 for nitrate, phosphate, and silicate, respectively. The temperature gap filling was done with an innovative contextual Long Short-Term Memory (LSTM) NN that uses data before and after the gap as separate feature variables. We also demonstrate the application of a novel dropout-based approach to approximate the Bayesian uncertainty of these temperature predictions. This Bayesian uncertainty is represented in the form of 100 Monte Carlo dropout estimates of the two longest gaps in the temperature timeseries, from a model with 25% dropout in the input and recurrent LSTM connections.
Throughout the study, we present the NN training process, including the tuning of the large number of NN hyperparameters, which could pose a barrier to uptake among researchers and other oceanographic data users. Our models can be scaled up and applied operationally to provide consistent, gap-free data to all data users, thus encouraging data uptake for data-based decision making.
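The Monte Carlo dropout idea described above (keep dropout active at inference, run many stochastic forward passes, and read the spread as uncertainty) is framework-agnostic. As a minimal illustrative sketch only, with a plain feedforward layer standing in for the paper's contextual LSTM and all weights supplied by the caller, it might look like:

```python
import numpy as np

def mc_dropout_predict(x, W1, W2, rate=0.25, n_samples=100, seed=0):
    """Monte Carlo dropout: keep dropout active at inference time and
    collect many stochastic forward passes; their mean is the point
    prediction and their spread approximates Bayesian uncertainty."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_samples):
        h = np.maximum(x @ W1, 0.0)          # hidden layer with ReLU
        mask = rng.random(h.shape) >= rate   # random unit-dropping mask
        h = h * mask / (1.0 - rate)          # inverted-dropout rescaling
        preds.append(h @ W2)                 # linear output layer
    preds = np.stack(preds)                  # shape: (n_samples, ...)
    return preds.mean(axis=0), preds.std(axis=0)
```

Applying the same keep-dropout-at-inference trick to a recurrent model's input and recurrent connections, as the abstract describes with 25% dropout, yields an ensemble of 100 plausible gap reconstructions whose pointwise standard deviation serves as the uncertainty band.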

    Internet scalability: properties and evolution

    Copyright © 2008 IEEE. Matthew Roughan; Steve Uhlig; Walter Willinger

    Generating name-like vectors for testing large-scale entity resolution

    Entity resolution (ER), the problem of identifying and linking records that belong to the same real-world entities in structured and unstructured data, is a primary task in data integration. Accurate and efficient ER has a major practical impact on various applications across commercial, security and scientific domains. Recently, scalable ER techniques have received enormous attention with the increasing need to combine large-scale datasets. However, the shortage of training and ground-truth data impedes the development and testing of ER algorithms. Good public datasets, especially those containing personal information, are restricted in this area and usually small in size. Due to privacy and confidentiality issues, testing algorithms or techniques with real datasets is challenging in ER research. Simulation is one technique for generating synthetic datasets that have characteristics similar to those of real data for testing algorithms. Many existing simulation tools in ER lack support for generating large-scale data and have problems with complexity, scalability, and the limitations of resampling. In our work, we propose a simple, inexpensive, and fast synthetic data generation tool. In its first stage, our tool generates only entity names, but these are commonly used as identification keys in ER algorithms. We avoid detail-level simulation of entity names by using a simple vector representation that delivers simplicity and efficiency. In this paper, we discuss how to simulate simple vectors that approximate the properties of entity names. We describe the overall construction of the tool based on data analysis of a namespace that contains entity names collected from the actual environment. Samudra Herath, Matthew Roughan and Gary Glonek
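The abstract does not give the tool's exact representation, but the core idea, replacing character-level name simulation with cheap vectors that preserve name-like properties, can be sketched with a hypothetical character-bigram encoding (the encoding, the `corrupt` helper, and its noise model are illustrative assumptions, not the authors' design):

```python
import string
from itertools import product
import numpy as np

ALPHABET = string.ascii_lowercase
BIGRAMS = ["".join(p) for p in product(ALPHABET, repeat=2)]
INDEX = {bg: i for i, bg in enumerate(BIGRAMS)}

def name_to_vector(name):
    """Represent a name as a character-bigram count vector (length 26*26)."""
    v = np.zeros(len(BIGRAMS))
    s = "".join(c for c in name.lower() if c in ALPHABET)
    for a, b in zip(s, s[1:]):
        v[INDEX[a + b]] += 1
    return v

def corrupt(vec, n_errors=1, rng=None):
    """Simulate a noisy duplicate record by perturbing a few bigram counts,
    mimicking typos without simulating names at the character level."""
    if rng is None:
        rng = np.random.default_rng()
    out = vec.copy()
    for i in rng.integers(0, len(out), size=n_errors):
        out[i] += rng.choice([-1, 1])
    return np.clip(out, 0, None)
```

Generating and corrupting vectors directly, rather than strings, is what makes large-scale test data cheap: duplicates for ER benchmarking become vector perturbations instead of character-level edit simulations.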

    The one comparing narrative social network extraction techniques

    Analysing narratives through their social networks is an expanding field in quantitative literary studies. Manually extracting a social network from any narrative can be time consuming, so automatic extraction methods of varying complexity have been developed. However, the effect of different extraction methods on the resulting networks is unknown. Here we model and compare three extraction methods for social networks in narratives: manual extraction, co-occurrence automated extraction, and automated extraction using machine learning. Although the manual extraction method produces more precise results in the network analysis, it is highly time consuming. The automatic extraction methods yield comparable results for density, centrality measures and edge weights. Our results provide evidence that automatically extracted social networks are reliable for many analyses. We also describe which aspects of analysis are not reliable with such a social network. Our findings provide a framework to analyse narratives, helping us improve our understanding of how stories are written and evolve, and how people interact with each other. Index Terms: social networks, narratives, television. Michelle Edwards, Jonathan Tuke, Matthew Roughan, Lewis Mitchell
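Of the three methods compared, co-occurrence extraction is the simplest to illustrate. A minimal sketch (the explicit character list and a sentence-level co-occurrence window are simplifying assumptions; the paper's actual pipeline is more involved) could be:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_network(sentences, characters):
    """Build a weighted co-occurrence network: every pair of characters
    named in the same sentence gains one unit of edge weight."""
    edges = Counter()
    for sentence in sentences:
        present = sorted(c for c in characters if c in sentence)
        for pair in combinations(present, 2):
            edges[pair] += 1
    return edges
```

The resulting edge-weight table can be loaded into any graph library to compute the density and centrality measures the study compares across extraction methods.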

    The Assessment of Post-Vasectomy Pain in Mice Using Behaviour and the Mouse Grimace Scale

    Background: Current behaviour-based pain assessments for laboratory rodents have significant limitations. Assessment of facial expression changes, as a novel means of pain scoring, may overcome some of these limitations. The Mouse Grimace Scale appears to offer a means of assessing post-operative pain in mice that is as effective as manual behaviour-based scoring, without the limitations of such schemes. Effective assessment of post-operative pain is critical not only for animal welfare but also for the validity of science using animal models. Methodology/Principal Findings: This study compared changes in behaviour assessed using both an automated system (“HomeCageScan”) and manual analysis with changes in facial expressions assessed using the Mouse Grimace Scale (MGS). Mice (n = 6/group) were assessed before and after surgery (scrotal approach vasectomy) and received either saline, meloxicam or bupivacaine. Both the MGS and manual scoring of pain behaviours identified clear differences between the pre- and post-surgery periods and between those animals receiving analgesia (20 mg/kg meloxicam or 5 mg/kg bupivacaine) or saline post-operatively. The two assessments were highly correlated, with mice showing high MGS scores also exhibiting high frequencies of pain behaviours. Automated behavioural analysis, in contrast, was only able to detect differences between the pre- and post-surgery periods. Conclusions: In conclusion, both the Mouse Grimace Scale and manual scoring of pain behaviours are assessing th

    Lagrangian and Eulerian characterization of two counter-rotating submesoscale eddies in a western boundary current

    In recent decades, high-spatial-resolution ocean radar and satellite imagery measurements have revealed a complex tangle of submesoscale filaments and eddies in the surface velocity, temperature, and chlorophyll a fields. We use a suite of high-resolution data to characterize two counter-rotating, short-lived eddies formed at the front between the warm East Australian Current (EAC) and temperate coastal waters (30°S, Eastern Australia). In this region, submesoscale filaments and short-lived eddies are dynamically generated and decay at time scales of hours to days. Dominant cyclonic filaments of O(1) Rossby number formed along frontal jets and eddy boundaries, generating localized ageostrophic circulations at the submesoscale. Measurements of over-ocean wind direction and surface currents from high-frequency radars reveal the influence of short-term, small-scale wind forcing on the surface circulation, the enhancement of horizontal shear, frontal jet destabilization, and the generation and decay of the cyclonic eddy. By contrast, the anticyclonic eddy formation was most likely associated with EAC mesoscale instability and anticyclonic vorticity. Lagrangian tracks show that surface particles can be temporarily trapped in the eddies and frontal convergent zones, limiting their transport. Mixing between EAC-derived and coastal waters was increased along the frontal regions, and particles starting at the divergent regions around the eddies experienced significant dispersion at submesoscales. The cyclonic cold-core eddy entrained high chlorophyll a shelf waters on its convergent side, suggesting spiral eddy cyclogenesis.

    Coastal seascape variability in the intensifying East Australian Current Southern Extension

    Funding: This study was funded by Australian Research Council Linkage Grants (LP110200603 awarded to RH, DS and Iain Field, and LP160100162 awarded to IJ, Martina Doblin, MC, GC, DS, Iain Suthers and RH) with contributions from the Taronga Conservation Society Australia, NSW National Parks and the Australian Antarctic Division. Coastal pelagic ecosystems are highly variable in space and time, with environmental conditions and the distribution of biomass being driven by complex processes operating at multiple scales. The emergent properties of these processes and their interactive effects result in complex and dynamic environmental mosaics referred to as “seascapes”. Mechanisms that link large-scale oceanographic processes and ecological variability in coastal environments remain poorly understood, despite their importance for predicting how ecosystems will respond to climate change. Here we assessed seascape variability along the path of the rapidly intensifying East Australian Current (EAC) Southern Extension in southeast Australia, a hotspot of ocean warming and ecosystem tropicalisation. Using satellite and in situ measures of temperature, salinity and current velocity, coupled with contemporaneous measurements of pelagic biomass distribution from nine boat-based active acoustic surveys in five consecutive years, we investigated relationships between the physical environment and the distribution of pelagic biomass (zooplankton and fish) at multiple timescales. Survey periods were characterised by high variability in oceanographic conditions, with variation in coastal conditions influenced by meso-to-large-scale processes occurring offshore, including the position and strength of eddies. Intra-annual variability was often of a similar or greater magnitude to inter-annual variability, suggesting highly dynamic conditions with important variation occurring at scales of days to weeks. 
Two seascape categories were identified, characterised by (A) warmer, less saline water and (B) cooler, more saline water, with the former indicating greater influence of the EAC on coastal processes. Warmer waters were also associated with fewer, deeper and less dense biological aggregations. As the EAC continues to warm and penetrate further south, it is likely that this will have substantial effects on biological activity in coastal pelagic ecosystems, including a potential reduction in the accessibility of prey aggregations to surface-feeding predators and to fisheries. These results highlight the important role of offshore oceanographic processes in driving coastal seascape variability and biological activity in a region undergoing rapid oceanic warming and ecological change.

    A refinement approach in a mouse model of rehabilitation research: analgesia strategy, reduction approach and infrared thermography in spinal cord injury

    The principles of Refinement, Replacement and Reduction (the 3Rs) should be taken into account when animals must be used for scientific purposes. Here, a Reduction/Refinement approach was applied to the procedure of spinal cord injury (SCI), an animal model used in rehabilitation medicine research, in order to improve the quality of experiments and avoid unnecessary suffering. The aims of this investigation were (1) to assess acute surgical pain in mice subjected to SCI; (2) to compare the efficacy of commonly used analgesia (three subcutaneous buprenorphine injections over 48 hours, 0.15 mg/kg each) with a combination of an opioid and an NSAID (one subcutaneous injection of 5 mg/kg carprofen before surgery, followed by three subcutaneous buprenorphine injections over 48 hours, 0.15 mg/kg each); and (3) to test whether infrared thermography (IRT) could be a potential new Refinement method to easily assess thermoregulation, an important metabolic parameter. Finally, we aimed to achieve these goals without recruiting animals on purpose, instead using mice already scheduled for studies on SCI. Using behavioural analysis, we found that, despite being commonly used, buprenorphine does not completely relieve acute surgical pain, whereas the combination of buprenorphine and carprofen significantly decreases pain signs by 80%. IRT turned out to be a very useful Refinement tool, being a non-invasive method of measuring animal temperature that is particularly useful when a rectal probe cannot be used, as in the case of SCI. We found that temperatures constantly and significantly increased until 7 days after surgery and then slowly decreased; finally, we observed that temperatures in the buprenorphine-and-carprofen-treated group were statistically lower than in the buprenorphine-alone treated mice. To our knowledge, this is the first work providing an analgesic Refinement and a description of the thermoregulatory response using IRT in mice subjected to SCI.
