781 research outputs found

    Water quality monitoring, control and management (WQMCM) framework using collaborative wireless sensor networks

    No full text
    Improving water quality is of global concern, with agricultural practices being a major contributor to reduced water quality. Reusing nutrient-rich drainage water can be a valuable strategy for gaining combined economic and environmental benefits; however, the tools and techniques to enable this do not currently exist. We therefore propose a framework, WQMCM, which builds on increasingly common local farm-scale networks across a catchment and adds provision for collaborative information sharing. Using this framework, individual sub-networks can learn their environment and predict the impact of catchment events on their locality, allowing dynamic decision making for local irrigation strategies. Because the resource constraints of network nodes (e.g. power consumption and computing capacity) demand a simplified predictive model for discharges, low-dimensional model parameters are derived from the Natural Resources Conservation Service (NRCS) curve number method using real-time field values. Evaluation of the predictive models, developed using M5 decision trees, demonstrates 84-94% accuracy compared with the traditional NRCS curve number model. The discharge volume model performed with 6% relative root mean square error (RRMSE) even for a small training set of around 100 samples; the discharge response time model, however, required a minimum of 300 training samples to reach a reasonable performance of 16% RRMSE.
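
    The tree-based discharge prediction described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: M5 model trees are not in scikit-learn, so a plain `DecisionTreeRegressor` stands in, and the predictor variables and synthetic response are hypothetical stand-ins loosely inspired by NRCS curve-number inputs.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Hypothetical low-dimensional predictors (assumptions, not the paper's
# actual features): rainfall depth, antecedent moisture, curve number.
n = 400
rainfall = rng.uniform(5, 80, n)
moisture = rng.uniform(0.1, 0.4, n)
curve_number = rng.uniform(60, 95, n)
X = np.column_stack([rainfall, moisture, curve_number])

# Synthetic discharge volume: a toy nonlinear response plus noise.
y = 0.02 * rainfall**1.5 * moisture * (curve_number / 100) + rng.normal(0, 0.1, n)

# Train on ~100 samples, mirroring the small-training-set scenario.
model = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X[:100], y[:100])
pred = model.predict(X[100:])

# Relative root mean square error (RRMSE), the metric quoted in the abstract.
rrmse = np.sqrt(np.mean((pred - y[100:]) ** 2)) / np.mean(y[100:])
print(f"RRMSE: {rrmse:.2%}")
```

    A real deployment would replace the synthetic arrays with field measurements streamed from the sensor nodes.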

    The impact of agricultural activities on water quality: a case for collaborative catchment-scale management using integrated wireless sensor networks

    No full text
    The challenge of improving water quality is a growing global concern, typified by the European Commission Water Framework Directive and the United States Clean Water Act. The main drivers of poor water quality are economics, poor water management, agricultural practices and urban development. This paper reviews the extensive role of non-point sources, in particular outdated agricultural practices, with respect to nutrient and contaminant contributions. Water quality monitoring (WQM) is currently undertaken through a number of data acquisition methods, from grab sampling to satellite-based remote sensing of water bodies. Based on the surveyed sampling methods and their numerous limitations, it is proposed that wireless sensor networks (WSNs), despite their own limitations, remain very attractive and effective for real-time spatio-temporal data collection in WQM applications. WSNs have been employed for WQM of surface water, groundwater and catchments, and have been fundamental in advancing the knowledge of contaminant trends through their high-resolution observations. However, these applications have yet to explore the implementation and impact of this technology for management and control decisions that minimize and prevent individual stakeholders' contributions in an autonomous and dynamic manner. Here, the potential of WSN-controlled agricultural activities and different environmental compartments for integrated water quality management is presented, and the limitations of WSNs in agriculture and WQM are identified. Finally, a case for collaborative networks at catchment scale is proposed for enabling cooperation among individually networked activities/stakeholders (farming activities, water bodies) for integrated water quality monitoring, control and management.

    Pretrained DcAlexnet Cardiac Diseases Classification on Cognitive Multi-Lead Ultrasound Dataset

    Get PDF
    The DcAlexNet CNN deep learning classifier can precisely track patterns in medical images (of the brain, heart, spinal cord, etc.). According to the WHO (World Health Organization), heart diseases and heart attacks affect 5 billion people every year. Heart abnormalities can be fatal; therefore, an efficient medical image pre-processor and deep learning classifier is needed for diagnosis. In this research work, a multi-class DcAlexNet classifier with an RRS-GHSB segment-filter has been implemented. The RRS (Restrictive Random Segmentation) and GHSB (Gaussian Hue-Saturation-Brightness filtration) modules are fused to obtain multi-level features. Training is performed on the EchoNet dataset and testing is verified on real-time samples. The segmented and filtered features are loaded into a weighted .CSV file, and these features are classified to predict abnormalities in heart ultrasound images. The pretrained DcAlexNet CNN model is loaded with 100,000 (1 lakh) EchoNet samples using 165 layers, including normalization, dense, flatten, max-pooling and ReLU layers. The computer-aided design with the corresponding CNN layers searches hidden samples to locate heart abnormalities. The experimental results attained a Dice score of 98.89%, accuracy of 99.45%, precision of 99.23%, recall of 98.34%, F1 score of 98.92%, CC of 99.27% and sensitivity of 99.34%. These performance metrics compete with present technologies and outperform them in heart-diagnosis accuracy.
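
    The evaluation metrics quoted in the abstract (Dice score, precision, recall, F1) can all be computed from binary segmentation masks. The sketch below uses tiny hypothetical 4x4 masks as stand-ins for a heart-region segmentation; it illustrates the metric definitions only, not the paper's pipeline.

```python
import numpy as np

def segmentation_metrics(pred: np.ndarray, truth: np.ndarray) -> dict:
    """Dice score, precision, recall and F1 for binary segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()     # true positives
    fp = np.logical_and(pred, ~truth).sum()    # false positives
    fn = np.logical_and(~pred, truth).sum()    # false negatives
    dice = 2 * tp / (2 * tp + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"dice": dice, "precision": precision, "recall": recall, "f1": f1}

# Toy masks (hypothetical): predicted region misses one ground-truth pixel.
truth = np.array([[0,0,0,0],[0,1,1,0],[0,1,1,0],[0,0,0,0]])
pred  = np.array([[0,0,0,0],[0,1,1,0],[0,1,0,0],[0,0,0,0]])
m = segmentation_metrics(pred, truth)
print(m)
```

    Note that for binary masks the Dice score coincides with the F1 score, which is why the two reported values in such studies are typically close.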

    Far-from-equilibrium transport with constrained resources

    Full text link
    The totally asymmetric simple exclusion process (TASEP) is a well-studied example of far-from-equilibrium dynamics. Here, we consider a TASEP with open boundaries but impose a global constraint on the total number of particles. In other words, the boundary reservoirs and the system must share a finite supply of particles. Using simulations and analytic arguments, we obtain the average particle density and current of the system as a function of the boundary rates and the total number of particles. Our findings are relevant to biological transport problems if the availability of molecular motors becomes a rate-limiting factor. Comment: 14 pages, 7 figures, uses iopart12.clo and iopart.cls
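
    A constrained-resource TASEP of this kind is straightforward to simulate with random-sequential Monte Carlo updates. The sketch below is a minimal toy model, not the paper's implementation: the assumed feedback, an effective entry rate alpha * (reservoir / N_total), is one simple choice for coupling the entry rate to the finite supply, and all parameter values are hypothetical.

```python
import numpy as np

def tasep_finite_reservoir(L=100, n_total=60, alpha=0.7, beta=0.7,
                           steps=200_000, seed=1):
    """Monte Carlo TASEP with open boundaries sharing a finite particle pool.

    Assumption: the effective entry rate scales linearly with the reservoir
    occupation, alpha * reservoir / n_total. Returns the time-averaged
    lattice density.
    """
    rng = np.random.default_rng(seed)
    lattice = np.zeros(L, dtype=int)
    reservoir = n_total
    density_acc = 0.0
    for _ in range(steps):
        i = rng.integers(-1, L)           # pick a move: -1 = entry, L-1 = exit
        if i == -1:                        # attempted entry from the reservoir
            if lattice[0] == 0 and rng.random() < alpha * reservoir / n_total:
                lattice[0] = 1
                reservoir -= 1
        elif i == L - 1:                   # attempted exit into the reservoir
            if lattice[-1] == 1 and rng.random() < beta:
                lattice[-1] = 0
                reservoir += 1
        else:                              # bulk hop to the right
            if lattice[i] == 1 and lattice[i + 1] == 0:
                lattice[i], lattice[i + 1] = 0, 1
        density_acc += lattice.mean()
    return density_acc / steps

rho = tasep_finite_reservoir()
print(f"average density: {rho:.3f}")
```

    Particle number is conserved by construction (lattice plus reservoir always sums to n_total), so the average density can never exceed n_total/L; varying n_total at fixed boundary rates traces out the crossover the abstract describes.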

    On the Scale-Invariant Distribution of the Diffusion Coefficient for Classical Particles Diffusing in Disordered Media

    Full text link
    The scaling form of the whole distribution P(D) of the random diffusion coefficient D(x) in a model of classically diffusing particles is investigated. The renormalization group approach above the lower critical dimension d=0 is applied to the distribution P(D) using the n-replica approach. In the annealed approximation (n=1), the inverse Gaussian distribution is found to be the stable one under rescaling. This identification is made based on symmetry arguments and subtle relations between this model and that of fluctuating interfaces studied by Wallace and Zia. The renormalization-group flow for the ratios between consecutive cumulants shows a regime of pure diffusion for small disorder, in which P(D) goes to delta(D - <D>), and a regime of strong disorder where the cumulants grow infinitely large and the diffusion process is ill defined. The boundary between these two regimes is associated with an unstable fixed point and a subdiffusive behavior: <x^2> = C t^(1-d/2). For the quenched case (n goes to 0) we find that unphysical operators are generated, raising doubts on the renormalizability of this model. Implications for other random systems near their lower critical dimension are discussed. Comment: 21 pages, 1 figure (not included), use LaTeX twice
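
    For reference, the inverse Gaussian distribution that the abstract identifies as stable under rescaling can be written, in its standard textbook parametrization with mean \mu and shape parameter \lambda (the paper's own parametrization may differ), as

```latex
P(D) \;=\; \sqrt{\frac{\lambda}{2\pi D^{3}}}\;
\exp\!\left[-\,\frac{\lambda\,(D-\mu)^{2}}{2\mu^{2}D}\right],
\qquad D > 0 .
```

    In the weak-disorder limit the variance collapses and this density sharpens toward delta(D - <D>), consistent with the pure-diffusion regime described above.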

    Mixed methods approach to understanding farmer and agricultural advisor perceptions of climate change and adaptation in Vermont, United States

    Get PDF
    The relationships among farmers' belief in climate change, perceptions of climate-related risk, and use of climate adaptation practices are a growing topic of interest in U.S. scholarship. The northeast region is not well represented in the literature, although it is highly agricultural and will likely face climate-related risks that differ from those faced in other regions. We used a mixed methods approach to examine northeastern farmers' perceptions of climate change and climate-related risks over time, and the perceived trade-offs associated with on-farm practices. Our investigation shows how northeastern farmers think about climate risk, and what they are doing to address it.

    Feedback and Fluctuations in a Totally Asymmetric Simple Exclusion Process with Finite Resources

    Full text link
    We revisit a totally asymmetric simple exclusion process (TASEP) with open boundaries and a global constraint on the total number of particles [Adams et al. 2008 J. Stat. Mech. P06009]. In this model, the entry rate of particles into the lattice depends on the number available in the reservoir. Thus, the total occupation of the lattice feeds back into its filling process. Although a simple domain wall theory provided reasonably good predictions of Monte Carlo simulation results for certain quantities, it did not account for the fluctuations of this feedback. We generalize the previous study and find dramatically improved predictions for, e.g., the density profile on the lattice, and provide a better understanding of the phenomenon of "shock localization." Comment: 11 pages, 3 figures, v2: Minor changes

    Making Sense of the Legendre Transform

    Full text link
    The Legendre transform is an important tool in theoretical physics, playing a critical role in classical mechanics, statistical mechanics, and thermodynamics. Yet, in typical undergraduate or graduate courses, the motivation and elegance of the method are often missing, unlike the treatments frequently enjoyed by Fourier transforms. We review and modify the presentation of Legendre transforms in a way that explicates the formal mathematics, resulting in manifestly symmetric equations, thereby clarifying the structure of the transform algebraically and geometrically. Then we bring in the physics to motivate the transform as a way of choosing independent variables that are more easily controlled. We demonstrate how the Legendre transform arises naturally from statistical mechanics and show how the use of dimensionless thermodynamic potentials leads to more natural and symmetric relations. Comment: 11 pages, 3 figures
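
    The manifestly symmetric structure mentioned in the abstract can be sketched in one line (standard textbook form, not necessarily the paper's notation): if G is the Legendre transform of a convex function F, then

```latex
F(x) + G(p) \;=\; x\,p,
\qquad p = \frac{dF}{dx},
\qquad x = \frac{dG}{dp}.
```

    The symmetry makes it obvious that the transform is an involution: transforming G returns F. In thermodynamics, for instance, the Helmholtz free energy A(T,V) = U - TS trades the hard-to-control entropy S for the easily controlled temperature T, with S = -(\partial A/\partial T)_V.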