
    Quality of Information in Mobile Crowdsensing: Survey and Research Challenges

    Smartphones have become the most pervasive devices in people's lives, and are clearly transforming the way we live and perceive technology. Today's smartphones benefit from almost ubiquitous Internet connectivity and come equipped with a plethora of inexpensive yet powerful embedded sensors, such as the accelerometer, gyroscope, microphone, and camera. This unique combination has enabled revolutionary applications based on the mobile crowdsensing paradigm, such as real-time road traffic monitoring, air and noise pollution monitoring, crime control, and wildlife monitoring, to name a few. Unlike prior sensing paradigms, humans are now the primary actors of the sensing process, since they are fundamental in retrieving reliable and up-to-date information about the event being monitored. As humans may behave unreliably or maliciously, assessing and guaranteeing Quality of Information (QoI) becomes more important than ever. In this paper, we provide a new framework for defining and enforcing QoI in mobile crowdsensing, and analyze in depth the current state of the art on the topic. We also outline novel research challenges, along with possible directions for future work. Comment: To appear in ACM Transactions on Sensor Networks (TOSN).

    A data analytics approach to gas turbine prognostics and health management

    As a consequence of the recent deregulation of the electrical power production industry, there has been a shift in the traditional ownership of power plants and the way they are operated. To hedge their business risks, many of the new private entrepreneurs enter into long-term service agreements (LTSA) with third parties for their operation and maintenance activities. As the major LTSA providers, original equipment manufacturers have invested huge amounts of money to develop preventive maintenance strategies that minimize the occurrence of costly unplanned outages resulting from failures of the equipment covered under LTSA contracts. In fact, a recent study by the Electric Power Research Institute estimates the cost benefit of preventing a failure of a General Electric 7FA or 9FA technology compressor at $10 to $20 million. Therefore, in this dissertation, a two-phase data analytics approach is proposed that uses the existing gas path and vibration monitoring sensor data to, first, develop a proactive strategy that systematically detects and validates catastrophic failure precursors so as to avoid the failure and, second, estimate the residual time to failure of the unhealthy items. For the first part of this work, the time-frequency technique of the wavelet packet transform is used to de-noise the noisy sensor data. Next, the time-series signal of each sensor is decomposed to perform a multi-resolution analysis and extract its features. After that, probabilistic principal component analysis is applied as a data fusion technique to reduce the potentially correlated multi-sensor measurements into a few uncorrelated principal components. The last step of the failure precursor detection methodology, the anomaly detection decision, is itself a multi-stage process. The principal components obtained from the data fusion step are first combined into a one-dimensional reconstructed signal representing the overall health assessment of the monitored systems. Then, two damage indicators of the reconstructed signal are defined and monitored for defects using a statistical process control approach. Finally, the Bayesian evaluation method for hypothesis testing is applied to a computed threshold to test for deviations from the healthy band. To model the residual time to failure, the anomaly severity index and the anomaly duration index are defined as defect characteristics. Two modeling techniques are investigated for the prognostication of the survival time after an anomaly is detected: a deterministic regression approach, and a parametric approximation of the non-parametric Kaplan-Meier estimator. It is established that the deterministic regression provides poor prediction estimates. The non-parametric survival data analysis technique of the Kaplan-Meier estimator provides the empirical survivor function of a data set comprising both non-censored and right-censored data. Though powerful because no a priori lifetime distribution is assumed, the Kaplan-Meier result lacks the flexibility to be transplanted to other units of a given fleet. The parametric analysis of survival data is performed with two popular failure analysis distributions: the exponential distribution and the Weibull distribution.
The conclusion from the parametric analysis of the Kaplan-Meier plot is that the larger the data set, the more accurate the prognostication ability of the residual time to failure model.
    Ph.D. Committee Chair: Mavris, Dimitri; Committee Members: Jiang, Xiaomo; Kumar, Virendra; Saleh, Joseph; Vittal, Sameer; Volovoi, Vital
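
    The survival-analysis step described above lends itself to a short sketch. The following is a minimal illustration, not the dissertation's implementation: it computes a Kaplan-Meier survivor function from right-censored time-to-failure data and then fits a Weibull distribution by maximum likelihood with censoring; all durations and censoring flags are made-up example values.

```python
# Minimal sketch (illustrative data, not the dissertation's): Kaplan-Meier
# survivor function from right-censored data, then a censored Weibull MLE fit.
import numpy as np
from scipy.optimize import minimize

# Time from anomaly detection to failure (hours); event=1 failure observed, 0 right-censored.
time  = np.array([120., 180., 200., 260., 300., 340., 400., 450.])
event = np.array([1,    1,    0,    1,    1,    0,    1,    0   ])

def kaplan_meier(time, event):
    """Empirical survivor function S(t) from right-censored data (no tied times)."""
    order = np.argsort(time)
    t, e = time[order], event[order]
    n_at_risk = len(t)
    times, surv, s = [], [], 1.0
    for ti, ei in zip(t, e):
        if ei:                      # only observed failures step the curve down
            s *= 1.0 - 1.0 / n_at_risk
            times.append(ti)
            surv.append(s)
        n_at_risk -= 1              # censored units simply leave the risk set
    return np.array(times), np.array(surv)

def weibull_neg_log_lik(params, time, event):
    """Negative log-likelihood of Weibull(shape k, scale lam) with right censoring."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return np.inf
    z = time / lam
    log_pdf = np.log(k / lam) + (k - 1) * np.log(z) - z**k   # observed failures
    log_surv = -z**k                                         # censored units
    return -(event * log_pdf + (1 - event) * log_surv).sum()

km_t, km_s = kaplan_meier(time, event)
fit = minimize(weibull_neg_log_lik, x0=[1.5, 300.0], args=(time, event),
               method="Nelder-Mead")
k_hat, lam_hat = fit.x
print("Kaplan-Meier S(t):", dict(zip(km_t, np.round(km_s, 3))))
print(f"Weibull fit: shape={k_hat:.2f}, scale={lam_hat:.1f} h")
```

    A parametric fit of this kind is what gives the model the flexibility, noted above, to be carried over to other units of a fleet, at the cost of assuming a lifetime distribution.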

    A Coalition Formation Game for Cooperative Spectrum Sensing in Cognitive Radio Network under the Constraint of Overhead

    Cooperative spectrum sensing improves the sensing performance of secondary users by exploiting spatial diversity in cognitive radio networks. However, the cooperation of secondary users also introduces overhead that may degrade the overall performance of cooperative spectrum sensing. The trade-off between cooperation gain and overhead plays a vital role in modeling cooperative spectrum sensing. This paper considers overhead in terms of reporting energy and reporting time. We propose a coalitional game model for cooperative spectrum sensing in which the utility of the game is formulated as a function of throughput gain and overhead. To achieve a rational average throughput for secondary users, the overhead incurred must be optimized, and this work emphasizes that optimization. In cooperative spectrum sensing, a large number of cooperating users improves detection performance but also increases overhead. We therefore propose a formulation that limits the maximum coalition size under a constraint on the probability of false alarm. An efficient fusion center selection scheme and an algorithm for selecting the secondary users eligible to report are proposed to reduce the reporting overhead. We also outline a distributed cooperative spectrum sensing algorithm using the properties of the coalition formation game and prove that the utility of the proposed game is non-transferable. The simulation results show that the proposed schemes reduce the reporting overhead without compromising the overall detection performance of cooperative spectrum sensing.
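
    To make the coalition-size constraint concrete, here is a minimal sketch, not the paper's exact formulation: under simple OR-rule fusion, each additional reporting user raises both the coalition's detection probability and its false-alarm probability, so a cap on the probability of false alarm bounds the coalition size. The per-user probabilities and the false-alarm target below are illustrative assumptions.

```python
# Minimal sketch: OR-rule fusion and the largest coalition that still respects
# a false-alarm constraint. All probabilities are illustrative assumptions.
import numpy as np

p_d = 0.6      # per-user probability of detecting the primary user
p_f = 0.05     # per-user probability of false alarm
q_f_max = 0.20 # target bound on the coalition's false-alarm probability

def coalition_detection(n, p_d):
    """OR rule: the coalition declares 'PU present' if any member detects it."""
    return 1.0 - (1.0 - p_d) ** n

def coalition_false_alarm(n, p_f):
    return 1.0 - (1.0 - p_f) ** n

def max_coalition_size(p_f, q_f_max):
    """Largest n such that the fused false-alarm probability stays below q_f_max."""
    return int(np.floor(np.log(1.0 - q_f_max) / np.log(1.0 - p_f)))

n_max = max_coalition_size(p_f, q_f_max)
for n in range(1, n_max + 1):
    print(f"n={n}: Q_d={coalition_detection(n, p_d):.3f}, "
          f"Q_f={coalition_false_alarm(n, p_f):.3f}")
print("maximum coalition size under the false-alarm constraint:", n_max)
```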

    A Review of Classification Problems and Algorithms in Renewable Energy Applications

    Classification problems and their corresponding solution approaches constitute one of the fields of machine learning. The application of classification schemes in Renewable Energy (RE) has gained significant attention in the last few years, contributing to the deployment, management and optimization of RE systems. The main objective of this paper is to review the most important classification algorithms applied to RE problems, including both classical and novel algorithms. The paper also provides a comprehensive literature review and discussion of different classification techniques in specific RE problems, including wind speed/power prediction, fault diagnosis in RE systems, power quality disturbance classification and other applications in alternative RE systems. In this way, the paper describes classification techniques and metrics applied to RE problems, and is thus useful both for researchers working on this kind of problem and for practitioners in the field.
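
    As a concrete illustration of the kind of classification task the review covers, the sketch below trains an off-the-shelf classifier on synthetic fault-diagnosis data for a wind turbine. The features, labelling rule and thresholds are assumptions made purely for demonstration; they do not come from the paper.

```python
# Minimal, illustrative fault-diagnosis classification sketch (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 600
# Hypothetical SCADA-style features: wind speed, rotor speed, gearbox temperature.
X = np.column_stack([
    rng.normal(8.0, 2.0, n),     # wind speed (m/s)
    rng.normal(15.0, 3.0, n),    # rotor speed (rpm)
    rng.normal(60.0, 5.0, n),    # gearbox temperature (deg C)
])
# Hypothetical rule: an overheating gearbox at moderate wind marks a "fault" sample.
y = ((X[:, 2] > 65) & (X[:, 0] < 10)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["healthy", "fault"]))
```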

    Practical Methods for Optimizing Equipment Maintenance Strategies Using an Analytic Hierarchy Process and Prognostic Algorithms

    Many large organizations report limited success using Condition Based Maintenance (CbM). This work explains some of the causes of that limited success and recommends practical methods that enable the benefits of CbM. The backbone of CbM is a Prognostics and Health Management (PHM) system. Use of PHM alone does not ensure success; it needs to be integrated into enterprise-level processes and culture, and aligned with customer expectations. To integrate PHM, this work recommends a novel life cycle framework that expands the concept of maintenance into several levels, beginning with an overarching maintenance strategy and subordinate policies, tactics, and PHM analytical methods. During the design and in-service phases of the equipment’s life, an organization must prove that a maintenance policy satisfies specific safety and technical requirements and business practices, and is supported by a logistic and resourcing plan that satisfies end-user needs and expectations. These factors often compete with each other because they are designed and considered separately, and serve disparate customers. This work recommends the Analytic Hierarchy Process (AHP) as a practical method for consolidating input from stakeholders and quantifying the most preferred maintenance policy. AHP forces simultaneous consideration of all factors, resolving conflicts in the trade space of the decision process. When used within the recommended life cycle framework, it is a vehicle for justifying the decision to transition from generalized high-level concepts down to specific lower-level actions. This work demonstrates AHP using degradation data, prognostic algorithms, cost data, and stakeholder input to select the most preferred maintenance policy for a paint coating system. It concludes the following for this particular system: a proactive maintenance policy is most preferred, and a predictive (CbM) policy is preferred over predeterminative (time-directed) and corrective policies; a General Path prognostic Model with Bayesian updating (GPM) provides the most accurate prediction of the Remaining Useful Life (RUL); and long periods between inspections and the use of categorical variables in inspection reports severely limit the accuracy of RUL predictions. In summary, this work recommends using the proposed life cycle model, AHP, PHM, a GPM model, and embedded sensors to improve the success of a CbM policy.
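
    The AHP step can be sketched compactly: pairwise judgments about the candidate maintenance policies are collected in a comparison matrix, the priority weights are taken from its principal eigenvector, and a consistency ratio checks the judgments. The matrix entries below are illustrative assumptions, not the stakeholder judgments elicited in this work.

```python
# Minimal AHP sketch: priority weights from the principal eigenvector of a
# pairwise comparison matrix, plus Saaty's consistency ratio (illustrative data).
import numpy as np

policies = ["predictive (CbM)", "predeterminative (time-directed)", "corrective"]
# A[i, j] = how strongly policy i is preferred over policy j (Saaty's 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1); random index RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58

for p, w in zip(policies, weights):
    print(f"{p}: weight {w:.3f}")
print(f"consistency ratio: {cr:.3f} (values below ~0.10 are usually acceptable)")
```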

    From Sensing to Predictions and Database Technique: A Review of TV White Space Information Acquisition in Cognitive Radio Networks

    Acquiring white space information is the single most significant functionality in cognitive radio networks (CRNs) and, as such, it has undergone some evolution to enhance information accuracy. The evolutionary trends are spectrum sensing, prediction algorithms and, recently, the geo-location database technique. Previously, spectrum sensing was the main technique for detecting the presence or absence of a primary user (PU) signal in a given radio frequency (RF) spectrum. However, this expectation did not materialize, as a result of numerous technical challenges ranging from hardware imperfections to RF signal impairments. To convey the evolutionary trends in the development of white space information, we present a survey of the contemporary advancements in PU detection, with emphasis on the practical deployment of CRNs, i.e., television white space (TVWS) networks. It is found that the geo-location database is the most reliable technique for acquiring TVWS information, although it is financially driven. Finally, using the financially driven database model, this study compares the data rate and spectral efficiency of FCC and Ofcom TV channelization. It was found that Ofcom TV channelization outperforms FCC TV channelization as a result of its higher channel bandwidth. We propose the adoption of an all-inclusive TVWS information acquisition model as the future research direction for TVWS information acquisition techniques.
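
    The bandwidth argument behind the FCC versus Ofcom comparison can be illustrated with a back-of-the-envelope Shannon-capacity calculation. This is a minimal sketch, not the paper's model, and the SNR values are assumptions; FCC TV channels are 6 MHz wide while Ofcom channels are 8 MHz wide, so at the same SNR the wider channel supports a proportionally higher data rate.

```python
# Minimal sketch: single-channel Shannon capacity C = B * log2(1 + SNR) under
# FCC (6 MHz) and Ofcom (8 MHz) TV channelization; SNR values are assumptions.
import numpy as np

def shannon_capacity_mbps(bandwidth_hz, snr_db):
    snr_linear = 10 ** (snr_db / 10.0)
    return bandwidth_hz * np.log2(1.0 + snr_linear) / 1e6

channel_bw = {"FCC (6 MHz channel)": 6e6, "Ofcom (8 MHz channel)": 8e6}
for snr_db in (5, 10, 20):
    for name, bw in channel_bw.items():
        print(f"SNR {snr_db:>2} dB, {name}: {shannon_capacity_mbps(bw, snr_db):6.1f} Mbit/s")
```

    Under this idealized model the spectral efficiency in bit/s/Hz is identical for both channelizations at equal SNR; the per-channel data-rate advantage comes purely from the wider channel.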

    DECISION SUPPORT MODEL IN FAILURE-BASED COMPUTERIZED MAINTENANCE MANAGEMENT SYSTEM FOR SMALL AND MEDIUM INDUSTRIES

    A maintenance decision support system is crucial to ensure the maintainability and reliability of equipment in production lines. This thesis investigates a few decision support models to aid maintenance management activities in small and medium industries. In order to improve the reliability of resources in production lines, this study introduces a conceptual framework to be used in failure-based maintenance. Maintenance strategies are identified using the Decision-Making Grid model, based on two important factors: the machines’ downtime and their frequency of failures. The machines are categorized into three levels of downtime and of failure frequency: high, medium and low. This research derives a formula, based on maintenance cost, to re-position the machines prior to Decision-Making Grid analysis. Subsequently, the clustering analysis formula in the Decision-Making Grid model is improved to solve multiple-criteria problems. This research work also introduces a formula to estimate a contractor’s response and repair time. The estimates are used as input parameters in the Analytical Hierarchy Process model. The decisions are synthesized using models based on the contractors’ technical skills, such as experience in maintenance, skill in diagnosing machines and ability to take prompt action during troubleshooting activities. Other important criteria considered in the Analytical Hierarchy Process are the business principles of the contractors, which include maintenance quality, tools, equipment and enthusiasm in problem-solving. The raw data were collected through observation, interviews and surveys in the case studies to understand some risk factors in small and medium food processing industries. The risk factors are analysed with the Ishikawa Fishbone diagram to reveal delay times in machinery maintenance. The experimental studies are conducted using maintenance records from food processing industries. The Decision-Making Grid model can detect the ten worst production machines on the production lines, and the Analytical Hierarchy Process model is used to rank the contractors and their best maintenance practices. This research recommends displaying the results on the production’s indicator boards and implementing the strategies on the production shop floor. The proposed models can be used by decision makers to identify maintenance strategies and enhance competitiveness among contractors in failure-based maintenance. The models can be programmed as decision support sub-procedures in computerized maintenance management systems.
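
    A minimal sketch of how a Decision-Making Grid assignment might look is given below. The binning thresholds, the cell-to-strategy mapping and the machine data are illustrative assumptions, not the thesis's calibrated values; middle cells default to fixed-time maintenance in this toy version.

```python
# Minimal Decision-Making Grid sketch: bin each machine by downtime and failure
# frequency, then map the grid cell to a maintenance strategy (illustrative data).
from typing import Dict, Tuple

def bin_value(value: float, medium: float, high: float) -> str:
    """Categorise a value as 'low', 'medium' or 'high' against two thresholds."""
    if value >= high:
        return "high"
    if value >= medium:
        return "medium"
    return "low"

# One common cell-to-strategy assignment for the corner cells (hypothetical here).
STRATEGY: Dict[Tuple[str, str], str] = {
    ("low", "low"): "operate to failure",
    ("low", "high"): "skill-level upgrade",
    ("high", "low"): "condition-based maintenance",
    ("high", "high"): "design-out maintenance",
}

def dmg_strategy(downtime_h: float, failures: int) -> str:
    d = bin_value(downtime_h, medium=20, high=50)     # downtime hours per period
    f = bin_value(float(failures), medium=3, high=6)  # failures per period
    return STRATEGY.get((d, f), "fixed-time maintenance")  # middle cells default

machines = {"mixer-01": (65, 8), "oven-03": (55, 2), "packer-02": (10, 7), "pump-05": (5, 1)}
for name, (downtime, failures) in machines.items():
    print(f"{name}: {dmg_strategy(downtime, failures)}")
```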

    ISBIS 2016: Meeting on Statistics in Business and Industry

    This book includes the abstracts of the talks presented at the 2016 International Symposium on Business and Industrial Statistics, held in Barcelona, June 8-10, 2016, and hosted at the Universitat Politècnica de Catalunya - Barcelona TECH by the Department of Statistics and Operations Research. The meeting took place in the ETSEIB Building (Escola Tècnica Superior d'Enginyeria Industrial) at Avda. Diagonal 647. The meeting organizers celebrated the continued success of the ISBIS and ENBIS societies, and the meeting drew together the international community of statisticians, both academics and industry professionals, who share the goal of making statistics the foundation for decision making in business and related applications. The Scientific Program Committee was constituted by: David Banks, Duke University; Amílcar Oliveira, DCeT - Universidade Aberta and CEAUL; Teresa A. Oliveira, DCeT - Universidade Aberta and CEAUL; Nalini Ravishankar, University of Connecticut; Xavier Tort Martorell, Universitat Politècnica de Catalunya, Barcelona TECH; Martina Vandebroek, KU Leuven; Vincenzo Esposito Vinzi, ESSEC Business School.

    Mathematics in health care with applications

    The author aims to show how mathematics can be useful in supporting key activities in a hospital, including: noninvasive measurement of a patient’s status (see chapter 1), evaluation of the quality of services (see chapter 2), business and clinical administration (see chapter 3), and diagnosis and prognosis (see chapter 4). Such applications suggest the development of innovative projects to improve health care processes, services and systems. In this way, mathematics can be a very important tool for technological and societal development.