
    Smart Grid for the Smart City

    Modern cities are embracing cutting-edge technologies to improve the services they offer to citizens, from traffic control to the reduction of greenhouse gas emissions and energy provisioning. In this chapter, we look at the energy sector, advocating how Information and Communication Technologies (ICT) and signal processing techniques can be integrated into next-generation power grids for increased effectiveness in terms of electrical stability, distribution, communication security, energy production, and utilization. In particular, we discuss the use of these techniques within new demand response paradigms, where communities of prosumers (e.g., households generating part of the electricity they consume) contribute to satisfying energy demand through load balancing and peak shaving. Our discussion also covers the use of big data analytics for demand response and of serious games as a tool to promote energy-efficient behavior among end users.
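
    The peak-shaving mechanism mentioned above lends itself to a small illustration. Below is a minimal sketch, not taken from the chapter itself, of greedy peak shaving for a prosumer community: surplus local generation charges a shared battery, which discharges whenever net demand exceeds a target cap. The hourly profile, battery size, and cap are all invented for illustration.

    # Minimal peak-shaving sketch for a prosumer community (illustrative only).
    # Net demand above the cap is served from a shared battery; local surplus
    # (negative net demand) recharges it. All numbers are hypothetical, and
    # hourly steps mean kW and kWh coincide numerically.

    PEAK_CAP = 50.0      # kW, hypothetical cap the community tries to stay under
    BATTERY_KWH = 120.0  # hypothetical shared storage capacity

    def peak_shave(net_demand_kw, cap=PEAK_CAP, capacity=BATTERY_KWH):
        """Return the grid-demand profile after shaving hourly net demand (kW)."""
        soc = 0.0                                # battery state of charge
        grid = []
        for load in net_demand_kw:
            if load > cap:                       # peak hour: discharge battery
                discharge = min(load - cap, soc)
                soc -= discharge
                grid.append(load - discharge)
            elif load < 0:                       # local surplus: charge battery
                charge = min(-load, capacity - soc)
                soc += charge
                grid.append(load + charge)       # unstored surplus is exported
            else:
                grid.append(load)
        return grid

    # Invented hourly net-demand profile (generation makes some hours negative).
    print(peak_shave([10, 5, -8, -12, 20, 65, 80, 55, 30]))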

    Improving Knowledge-Based Systems with statistical techniques, text mining, and neural networks for non-technical loss detection

    Power distribution companies currently face several problems related to energy losses. For example, energy that is used might not be billed due to illegal manipulation or a breakdown in the customer’s measurement equipment. These types of losses are called non-technical losses (NTLs), and they are usually greater than the losses due to the distribution infrastructure (technical losses). Traditionally, a large number of studies have used data mining to detect NTLs, but to the best of our knowledge, no study has involved a Knowledge-Based System (KBS) created from the knowledge and expertise of the inspectors. In the present study, a KBS was built that is based on the knowledge and expertise of the inspectors and that uses text mining, neural networks, and statistical techniques for the detection of NTLs. These techniques were used to extract information from samples, and this information was translated into rules, which were joined with the rules generated from the inspectors’ knowledge. The system was tested with real samples extracted from the databases of Endesa, one of the most important distribution companies in Spain; it also plays an important role in international markets in both Europe and South America and has more than 73 million customers.
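
    To make the rule-joining idea concrete, here is a minimal sketch of how inspector-style rules can be blended with a data-driven score to rank customers for inspection. The feature names, thresholds, and weights are entirely hypothetical, not Endesa’s actual rules.

    # Illustrative NTL-detection sketch: expert (inspector-style) rules are
    # joined with a model score, e.g. a neural network's output probability.
    # Feature names, thresholds, and weights are hypothetical.

    RULES = [
        ("consumption dropped more than 60% vs. last year",
         lambda c: c["kwh_this_year"] < 0.4 * c["kwh_last_year"]),
        ("meter reported zero for 3+ consecutive months",
         lambda c: c["zero_months"] >= 3),
    ]

    def inspection_priority(customer, model_probability):
        """Blend fired rules with the model score into a 0..1 priority."""
        fired = [desc for desc, rule in RULES if rule(customer)]
        score = min(model_probability + 0.2 * len(fired), 1.0)
        return score, fired

    customer = {"kwh_this_year": 900, "kwh_last_year": 3100, "zero_months": 4}
    score, fired = inspection_priority(customer, model_probability=0.55)
    print(f"priority={score:.2f}, rules fired={fired}")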

    An Integrated Approach for Characterizing Aerosol Climate Impacts and Environmental Interactions

    Aerosols exert myriad influences on the Earth's environment and climate, and on human health. The complexity of aerosol-related processes requires that information gathered to improve our understanding of climate change must originate from multiple sources, and that effective strategies for data integration need to be established. While a vast array of observed and modeled data are becoming available, the aerosol research community currently lacks the necessary tools and infrastructure to reap maximum scientific benefit from these data. Spatial and temporal sampling differences among a diverse set of sensors, nonuniform data qualities, aerosol mesoscale variabilities, and difficulties in separating cloud effects are some of the challenges that need to be addressed. Maximizing the long-term benefit from these data also requires maintaining consistently well-understood accuracies as measurement approaches evolve and improve. A comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the Earth system can be achieved only through a multidisciplinary, inter-agency, and international initiative capable of dealing with these issues. A systematic approach, capitalizing on modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies, can provide the necessary machinery to support this objective. We outline a framework for integrating and interpreting observations and models, and establishing an accurate, consistent, and cohesive long-term record, following a strategy whereby information and tools of progressively greater sophistication are incorporated as problems of increasing complexity are tackled. This concept is named the Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON). To encompass the breadth of the effort required, we present a set of recommendations dealing with data interoperability; measurement and model integration; multisensor synergy; data summarization and mining; model evaluation; calibration and validation; augmentation of surface and in situ measurements; advances in passive and active remote sensing; and design of satellite missions. Without an initiative of this nature, the scientific and policy communities will continue to struggle with understanding the quantitative impact of complex aerosol processes on regional and global climate change and air quality.
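
    As one concrete instance of the data-integration machinery such a framework needs, the sketch below (not from the paper; all values invented) bins irregularly sampled aerosol optical depth (AOD) retrievals from multiple sensors onto a common 1-degree grid, keeping per-cell sample counts so that the sampling differences the abstract highlights remain visible.

    # Sketch of a basic multi-sensor integration step: grid-box averaging of
    # aerosol optical depth (AOD) retrievals. Observation values are invented.
    from collections import defaultdict

    def grid_aod(observations, cell_deg=1.0):
        """observations: iterable of (lat, lon, aod) -> {cell: (mean, count)}."""
        sums = defaultdict(lambda: [0.0, 0])
        for lat, lon, aod in observations:
            cell = (int(lat // cell_deg), int(lon // cell_deg))
            sums[cell][0] += aod
            sums[cell][1] += 1
        return {cell: (total / n, n) for cell, (total, n) in sums.items()}

    sensor_a = [(40.2, -105.3, 0.12), (40.7, -105.1, 0.15)]  # invented samples
    sensor_b = [(40.5, -105.9, 0.22)]
    print(grid_aod(sensor_a + sensor_b))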

    The Semantic Grid: A future e-Science infrastructure

    e-Science offers a promising vision of how computer and communication technology can support and enhance the scientific process. It does this by enabling scientists to generate, analyse, share and discuss their insights, experiments and results in an effective manner. The underlying computer infrastructure that provides these facilities is commonly referred to as the Grid. At this time, there are a number of grid applications being developed, and there is a whole raft of computer technologies that provide fragments of the necessary functionality. However, there is currently a major gap between these endeavours and the vision of e-Science, in which there is a high degree of easy-to-use and seamless automation and in which there are flexible collaborations and computations on a global scale. To bridge this practice–aspiration divide, this paper presents a research agenda whose aim is to move from the current state of the art in e-Science infrastructure to the future infrastructure that is needed to support the full richness of the e-Science vision. Here, the future e-Science research infrastructure is termed the Semantic Grid (the name is meant to connote that the Semantic Grid stands in the same relationship to the Grid as the Semantic Web does to the Web). In particular, we present a conceptual architecture for the Semantic Grid. This architecture adopts a service-oriented perspective in which distinct stakeholders in the scientific process, represented as software agents, provide services to one another, under various service level agreements, in various forms of marketplace. We then focus predominantly on the issues concerned with the way that knowledge is acquired and used in such environments, since we believe this is the key differentiator between current grid endeavours and those envisioned for the Semantic Grid.
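
    The service-oriented perspective can be pictured with a small sketch. The following is not the paper's architecture but a toy rendering of it: stakeholders are modelled as agents advertising services under service level agreements (SLAs) in a marketplace, and a consumer is matched to the cheapest offer meeting its requirements. All names and fields are illustrative.

    # Toy marketplace sketch: agents offer services under SLAs; consumers
    # are matched to the cheapest offer that satisfies their constraints.
    from dataclasses import dataclass

    @dataclass
    class SLA:
        max_latency_s: float  # promised turnaround time
        price: float          # agreed cost per invocation

    @dataclass
    class ServiceOffer:
        provider: str
        service: str          # e.g. "sequence-alignment" (hypothetical name)
        sla: SLA

    class Marketplace:
        def __init__(self):
            self.offers = []

        def advertise(self, offer):
            self.offers.append(offer)

        def match(self, service, max_latency_s):
            """Cheapest offer meeting the consumer's latency requirement."""
            ok = [o for o in self.offers
                  if o.service == service and o.sla.max_latency_s <= max_latency_s]
            return min(ok, key=lambda o: o.sla.price, default=None)

    m = Marketplace()
    m.advertise(ServiceOffer("agent-A", "sequence-alignment", SLA(30.0, 5.0)))
    m.advertise(ServiceOffer("agent-B", "sequence-alignment", SLA(10.0, 8.0)))
    print(m.match("sequence-alignment", max_latency_s=20.0))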

    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in computing power; this has been characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, documented examples and potential solutions are advanced, and the authors discuss paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, operations research, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights for trainees and analysts. High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities to help avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities, and they discuss the beneficial impact of embracing these technologies as well as the risk mitigation required to ensure success.
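
    The pairing of parallel computing with Monte Carlo sensitivity analysis reviewed above can be illustrated with a short sketch. The toy response surface and parameter ranges below are invented; only the pattern, fanning independent replicates across processor cores and summarizing the spread, is the point.

    # Sketch of a parallel Monte Carlo sensitivity analysis: independent
    # replicates with perturbed inputs are mapped across processor cores.
    import random
    from multiprocessing import Pool

    def run_trial(seed):
        """One replicate with randomly perturbed (hypothetical) inputs."""
        rng = random.Random(seed)
        detection_range = rng.uniform(8.0, 12.0)
        civilian_density = rng.uniform(0.1, 0.9)
        # Toy response surface standing in for a full simulation run.
        return detection_range * (1.0 - civilian_density)

    if __name__ == "__main__":
        with Pool() as pool:                     # one worker per CPU core
            outcomes = pool.map(run_trial, range(10_000))
        mean = sum(outcomes) / len(outcomes)
        var = sum((x - mean) ** 2 for x in outcomes) / (len(outcomes) - 1)
        print(f"mean={mean:.3f}, variance={var:.3f}")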