A novel application of deep learning with image cropping: a smart city use case for flood monitoring
© 2020, The Author(s). Event monitoring is an essential application of Smart City platforms. Real-time monitoring of gully and drainage blockage is an important part of flood monitoring applications. Building viable IoT sensors for detecting blockage is a complex task due to the limitations of deploying such sensors in situ. Image classification with deep learning is a potential alternative solution. However, there are no image datasets of gullies and drainages. We faced these challenges while developing a flood monitoring application in a European Union-funded project. To address them, we propose a novel image classification approach based on deep learning with an IoT-enabled camera to monitor gullies and drainages. The approach uses deep learning to build an image classification model that assigns blockage images to class labels based on severity. To handle the complexity of video-based images, and the consequent poor classification accuracy of the model, we carried out experiments in which image edges were removed by cropping. The cropping in our experiments is intended to concentrate on the regions of interest within the images, leaving out a proportion of the image edges. An image dataset of crowd-sourced, publicly accessible images was curated to train and test the proposed model. For validation, model accuracies were compared with and without image cropping. The cropping-based image classification improved classification accuracy. This paper outlines the lessons from our experimentation, which have a wider impact on many similar use cases involving IoT-based cameras as part of smart city event monitoring platforms.
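The edge-removal step described above can be sketched as a simple centre crop. This is a minimal illustration only: the margin fraction and the list-of-rows image representation are assumptions, not parameters reported in the paper.

```python
def center_crop(image, margin=0.1):
    """Trim a fraction of each border, keeping the central region of interest.

    `image` is a 2D list (rows of pixel values); `margin` is the fraction of
    width/height removed from each side (an illustrative default, not the
    paper's actual setting).
    """
    h, w = len(image), len(image[0])
    dy, dx = int(h * margin), int(w * margin)
    return [row[dx:w - dx] for row in image[dy:h - dy]]

# A 10x10 image cropped with a 10% margin leaves the central 8x8 region.
img = [[r * 10 + c for c in range(10)] for r in range(10)]
cropped = center_crop(img, margin=0.1)
```

In practice the same effect is usually achieved with an image library's crop call before the tensor is fed to the classifier; the point is that edge pixels, which rarely contain the drain itself, are discarded.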
Design and simulation studies of the novel beam arrival monitor pickup at Daresbury Laboratory
We present the novel beam arrival monitor pickup design currently under construction at Daresbury Laboratory, Warrington, UK. The pickup consists of four flat electrodes in a transverse gap. CST Particle Studio simulations have been undertaken for the new pickup design as well as for a pickup design from DESY, which is used as a reference for comparison. Simulation results have highlighted two advantages of the new pickup design over the DESY design: the signal bandwidth is 25 GHz, half that of the DESY design, and the response slope is a factor of 1.6 greater. We discuss optimisation studies of the design parameters to maximise the response slope for bandwidths up to 50 GHz and present the final design of the pickup.
Self-medication amongst pregnant women in a tertiary care teaching hospital in India
Background: Self-medication is a popular practice in developing countries where there is no strict regulation of drugs sold in local pharmacies. The general public is usually unaware of the adverse effects of drugs used for common illnesses and continues using them without prescription during pregnancy. This study was carried out to determine the extent of self-medication practised by pregnant women and the factors associated with it. Methods: A questionnaire-based, cross-sectional study of pregnant women visiting the OBGYN OPD of a tertiary care teaching hospital was conducted. 303 eligible subjects were questioned and statistical analysis was carried out. Results: In total, 16.5% of women were found to self-medicate during pregnancy for common conditions such as headache (26%), fever (23%) and common cold (19%). Odds ratios between the self-medicating and non-self-medicating groups for the variables age (<25 years; ≥25 years), education (illiterate; literate) and gestational age (<20 weeks; ≥20 weeks) were 1.6, 2.0 and 1.73, respectively. Women with a history of self-medicating before pregnancy were significantly more likely to continue doing so during pregnancy (p < 0.00001). Conclusions: A significant proportion of pregnant women were found to self-medicate without knowing the adverse effects of the drugs used. Thus, spreading awareness of this health predicament is necessary.
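The odds ratios reported above come from standard 2×2 contingency tables. As a reminder of the arithmetic, a minimal sketch follows; the counts used are hypothetical for illustration and are not the study's data.

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio for a 2x2 table: (a/b) / (c/d) = (a*d) / (b*c)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts for illustration only (not the study's figures):
# 20 of 120 women aged <25 self-medicated vs. 30 of 183 aged >=25.
or_age = odds_ratio(20, 100, 30, 153)
```

An odds ratio above 1 indicates that the first group (here, the younger women) had higher odds of self-medicating than the second.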
Understanding urban planning outcomes in the UK
The planning process in the UK is a highly complex system, developed over many decades, and is undergoing a rapid transition to digital planning. Among these transformations is a desire to move from an outputs-based to an outcomes-based assessment process. This is challenging, and in this paper the authors explore the variety of factors that make outcomes assessment difficult. The authors first studied the literature to understand how outcomes are complex, ranging across different sectors and practices, and identified 359 indicators related to outcomes. The authors then conducted a knowledge mapping exercise to understand the characteristics of the indicators across multiple themes. The authors also interviewed practitioners about their perspectives on outcomes assessment, definitions of outcomes, barriers to outcomes, the benefits of outcomes assessment, and how practitioners envision a world with outcomes assessment. The authors conclude the paper with future directions for research.
Explainable artificial intelligence for developing smart cities solutions
Traditional Artificial Intelligence (AI) technologies used in developing smart city solutions, Machine Learning (ML) and more recently Deep Learning (DL), rely more on utilising representative training datasets and feature engineering and less on the available domain expertise. We argue that such an approach to solution development makes the outcome of solutions less explainable, i.e., it is often not possible to explain the results of the model. There is growing concern among policymakers in cities about this lack of explainability of AI solutions, and it is considered a major hindrance to the wider acceptance of, and trust in, such AI-based solutions. In this work, we survey the concept of 'explainable deep learning' as a subset of the 'explainable AI' problem and propose a new solution using Semantic Web technologies, demonstrated with a smart city flood monitoring application in the context of a European Commission-funded project. Monitoring of gullies and drainage in geographical areas susceptible to flooding is an important aspect of any flood monitoring solution. Typical solutions to this problem use cameras to capture real-time images of the affected areas, showing objects such as leaves and plastic bottles, and build a DL-based classifier to detect such objects and classify blockages based on their presence and coverage in the images. In this work, we propose an Explainable AI solution using DL and Semantic Web technologies to build a hybrid classifier. In this hybrid classifier, the DL component detects object presence and coverage level, while semantic rules designed in close consultation with experts carry out the classification. By using expert knowledge in the flooding context, our hybrid classifier provides the flexibility of categorising an image using objects and their coverage relationships. The experimental results, demonstrated with a real-world use case, showed that this hybrid approach to image classification achieves an average 11% improvement in classification performance (F-measure) compared to a DL-only classifier. It also has the distinct advantage of integrating experts' knowledge by defining decision-making rules to represent complex circumstances, and of using that knowledge to explain the results.
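The hybrid design described above (DL detects objects and their coverage; expert rules assign the class) can be sketched minimally as follows. The object names, coverage thresholds and severity labels are illustrative assumptions, not the paper's actual rules, and the DL detector is mocked as a coverage dictionary.

```python
def classify_blockage(coverage):
    """Map detected object coverage (fraction of drain area) to a severity label.

    `coverage` stands in for the DL component's output; the thresholds below
    are hypothetical expert rules, chosen here only to illustrate the idea.
    """
    debris = coverage.get("leaves", 0.0) + coverage.get("plastic_bottles", 0.0)
    if debris >= 0.5:
        return "blocked"
    if debris >= 0.2:
        return "partially_blocked"
    return "clear"

label = classify_blockage({"leaves": 0.35, "plastic_bottles": 0.10})
```

Because the final decision is taken by explicit rules rather than by network weights, each classification can be explained by stating which rule fired and which object coverages triggered it.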
Temporal Logic Control of POMDPs via Label-based Stochastic Simulation Relations
The synthesis of controllers guaranteeing linear temporal logic specifications on partially observable Markov decision processes (POMDPs) via their belief models causes computational issues due to the continuous belief spaces. In this work, we construct a finite-state abstraction on which a control policy is synthesised and then refined back to the original belief model. We introduce a new notion of label-based approximate stochastic simulation to quantify the deviation between belief models. We develop a robust synthesis methodology that yields a lower bound on the satisfaction probability by compensating for deviations a priori, and that utilises a less conservative control refinement.
Intracellular sodium elevation reprograms cardiac metabolism
Intracellular Na elevation in the heart is a hallmark of pathologies in which both acute and chronic metabolic remodelling occurs. Here, we assess whether acute (75 μM ouabain; 100 nM blebbistatin) or chronic (PLM3SA mouse) myocardial Nai load is causally linked to metabolic remodelling, and whether the failing heart shares a common Na-mediated metabolic 'fingerprint'. Control (PLMWT), transgenic (PLM3SA), ouabain-treated and hypertrophied Langendorff-perfused mouse hearts were studied by ²³Na, ³¹P and ¹³C NMR, followed by ¹H-NMR metabolomic profiling. Elevated Nai leads to common adaptive metabolic alterations that precede energetic impairment: a switch from fatty acid to carbohydrate metabolism and changes in steady-state metabolite concentrations (glycolytic, anaplerotic and Krebs cycle intermediates). Inhibition of the mitochondrial Na/Ca exchanger by CGP37157 ameliorates these metabolic changes. In silico modelling indicates altered metabolic fluxes (Krebs cycle, fatty acid, carbohydrate and amino acid metabolism). Preventing Nai overload or inhibiting the mitochondrial Na/Ca exchanger may be a new approach to ameliorate metabolic dysregulation in heart failure.
Ontology-based discovery of time-series data sources for landslide early warning system
Modern early warning systems (EWS) require sophisticated knowledge of natural hazards, the urban context and the underlying risk factors to enable dynamic and timely decision making (e.g., hazard detection, hazard preparedness). Landslides are a common form of natural hazard with a global impact and are closely linked to a variety of other hazards. EWS for landslide prediction and detection rely on scientific methods and models that require input from time series data, such as earth observation (EO) and urban environment data. Such data sets are produced by a variety of remote sensing satellites and Internet of Things sensors deployed in landslide-prone areas. The automatic discovery of potential time series data sources has therefore become a challenge due to the complexity and high variety of data sources. To address this hard research problem, in this paper we propose a novel ontology, the Landslip Ontology, to provide the knowledge base that establishes the relationships between landslide hazards and EO and urban data sources. The purpose of the Landslip Ontology is to facilitate time series data source discovery for the verification and prediction of landslide hazards. The ontology is evaluated against scenarios and competency questions to verify its coverage and consistency. Moreover, the ontology can also be used to implement a data source discovery system, an essential component of an EWS that needs to manage (store, search, process) rich information from heterogeneous data sources.
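The discovery idea above can be sketched with a tiny in-memory triple store that links hazard types to the data sources able to observe them, including sources for triggering hazards. The class and property names are illustrative assumptions, not the actual Landslip Ontology vocabulary; a real implementation would use an RDF store and SPARQL queries.

```python
# Hypothetical (subject, predicate, object) triples standing in for the ontology.
TRIPLES = {
    ("Landslide", "observedBy", "InSAR_satellite"),
    ("Landslide", "observedBy", "soil_moisture_sensor"),
    ("Rainfall", "observedBy", "rain_gauge"),
    ("Landslide", "triggeredBy", "Rainfall"),
}

def discover_sources(hazard):
    """Return data sources for a hazard, including those for its triggers."""
    sources = {o for s, p, o in TRIPLES if s == hazard and p == "observedBy"}
    triggers = {o for s, p, o in TRIPLES if s == hazard and p == "triggeredBy"}
    for trigger in triggers:
        sources |= discover_sources(trigger)  # follow the causal chain
    return sources

sources = discover_sources("Landslide")
```

The point of the ontology is exactly this kind of inference: a query for landslide data sources also surfaces rain gauges, because the knowledge base records that rainfall triggers landslides.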
A Novel Mutation in the Upstream Open Reading Frame of the CDKN1B Gene Causes a MEN4 Phenotype
PubMed ID: 23555276. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.