378 research outputs found

    SETIS expert workshop on the assessment of the potential of pumped hydropower storage

    Energy storage is an important option for enabling a higher share of variable renewable electricity, such as wind and solar, in the energy system. Pumped hydropower storage (PHS) is currently the only storage technology able to provide the large capacity needed to accommodate renewable electricity under the 2020 EU energy targets. Moreover, transforming an existing water reservoir into a PHS facility has a much smaller environmental and social impact than most new hydropower plants in Europe. The JRC collaborated with University College Cork (UCC) in Ireland to develop a GIS-based methodology and model to assess the potential for transforming single reservoirs into PHS systems. The JRC then organised a multi-disciplinary expert workshop to validate the methodology and model, provide a set of recommendations for improving the effectiveness and efficiency of the methodology, address the issue of data availability in the Member States, and share and disseminate the methodology among relevant stakeholders such as policy makers, industry and research. This report presents the results of the workshop, which concluded that the assessment of PHS potential differs depending on whether its purpose is site assessment or policy planning and decision-making, and that geographical information system models are effective, efficient and convenient for both purposes; what differs is the intensity of use of the tools, the level of detail of the data needed and the assumptions behind the model and methodology. The restrictions that the different types of nature protection areas (NPA) impose on PHS development vary between countries. Laws and perceptions also change over time, and because PHS projects take a long time to realise, a scientific assessment of European or national potential cannot weight current NPAs and laws as heavily as a site assessment for a proposed PHS project would. Country-level and European assessments depend heavily on the assumptions made. For example, sensitivity analysis showed that enlarging the maximum distance between two reservoirs from 5 to 20 km increased the theoretical potential for Croatia from 60 GWh to nearly 600 GWh
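    The report itself does not reproduce the JRC/UCC model, but the core screening step of such a GIS-based assessment can be sketched as follows. This is a minimal illustration in Python, assuming a simple pairwise search over existing reservoirs; the distance and head thresholds, the reservoir fields and the round-trip efficiency are hypothetical values, not those of the actual methodology.

```python
# Illustrative sketch (not the JRC/UCC model): screening pairs of existing
# reservoirs for pumped hydropower storage (PHS) potential. The 20 km search
# radius, 150 m minimum head and 75% efficiency are assumed example values.
from math import radians, sin, cos, asin, sqrt

RHO, G = 1000.0, 9.81          # water density [kg/m3], gravity [m/s2]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2)**2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2)**2
    return 2 * 6371.0 * asin(sqrt(a))

def theoretical_energy_gwh(volume_m3, head_m, efficiency=0.75):
    """Theoretical storage of one reservoir pair: E = rho * g * V * h * eta (J -> GWh)."""
    return RHO * G * volume_m3 * head_m * efficiency / 3.6e12

def screen_pairs(reservoirs, max_distance_km=20.0, min_head_m=150.0):
    """Return (upper, lower, GWh) for every pair passing the distance/head screen."""
    candidates = []
    for i, a in enumerate(reservoirs):
        for b in reservoirs[i + 1:]:
            head = abs(a["elev_m"] - b["elev_m"])
            dist = haversine_km(a["lat"], a["lon"], b["lat"], b["lon"])
            if dist <= max_distance_km and head >= min_head_m:
                upper, lower = (a, b) if a["elev_m"] > b["elev_m"] else (b, a)
                volume = min(upper["volume_m3"], lower["volume_m3"])
                candidates.append((upper["name"], lower["name"],
                                   theoretical_energy_gwh(volume, head)))
    return candidates
```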

    Integration of Synthetic Aperture Radar Interferometry (InSAR) and Geographical Information Systems (GIS) for monitoring mining induced surface deformations

    Surface subsidence induced by mining is a source of risk to people, equipment and the environment. It may also disrupt mining schedules and increase the cost of mine safety. To provide an accurate assessment of surface subsidence and its level of impact on mine production and the environment, it is necessary to develop and introduce comprehensive subsidence monitoring systems. Current techniques for monitoring surface deformation are usually based on classical survey principles. In general these techniques have disadvantages that limit their applicability: they follow point-by-point data collection, they are relatively time-consuming and costly, they usually cover only a small area, they are not applicable to inaccessible areas and they cannot collect data continuously. As a complementary or alternative technique, the thesis discusses the applicability of SAR interferometry for monitoring mining-induced deformations. InSAR is a remote sensing technique that uses Synthetic Aperture Radar (SAR) observations to measure change in terrain topography. In spite of the widespread application of the technique for monitoring large-scale deformations of the Earth's crust, specific modifications are necessary to use it in a mining context. Limitations such as the difficulty of resolving deformation on high-gradient slopes, the difficulty of retrieving subsidence for localised, highly dynamic ground movements and the unavailability of SAR images with the desired specifications restrict the potential to monitor high-rate, localised mine subsidence on a day-to-day basis. The secondary aim of the thesis is to present the integration of InSAR and GIS in order to propose an optimum methodology for processing InSAR data to determine mine subsidence. The research also involves a detailed analysis of InSAR limitations, which in turn has led to suggestions on how to improve current InSAR capability with respect to mining needs. The thesis introduces a set of new GIS-based tools and methodologies that are integrated into a conventional InSAR processing chain to further improve and facilitate the application of InSAR in mining. The developed tools and techniques cover the three main stages of data processing (pre-processing, processing and post-processing). The research addresses InSAR's limitations in mining-related applications and provides practical solutions to resolve these issues
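    As an illustration of the basic InSAR measurement principle discussed above, the sketch below converts unwrapped differential phase to line-of-sight displacement and projects it to vertical subsidence. The C-band wavelength, the incidence angle and the assumption of purely vertical motion are illustrative simplifications, not the processing chain developed in the thesis.

```python
# Minimal sketch of the core InSAR deformation step: converting unwrapped
# differential phase to line-of-sight (LOS) displacement. The wavelength and
# incidence angle assume a C-band, ERS/Envisat-like geometry (example values).
import numpy as np

WAVELENGTH_M = 0.056   # C-band radar wavelength (~5.6 cm)

def phase_to_los_displacement(unwrapped_phase_rad):
    """One interferometric fringe (2*pi) corresponds to lambda/2 of LOS motion."""
    return -(WAVELENGTH_M / (4 * np.pi)) * unwrapped_phase_rad

def los_to_vertical(los_disp_m, incidence_angle_deg=23.0):
    """Project LOS displacement to vertical subsidence, assuming purely
    vertical ground motion (a common simplification for mining subsidence)."""
    return los_disp_m / np.cos(np.radians(incidence_angle_deg))

# Example: one 2*pi fringe maps to ~2.8 cm of LOS change, ~3.0 cm of subsidence.
fringe = phase_to_los_displacement(np.array([2 * np.pi]))
print(fringe, los_to_vertical(fringe))
```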

    Mapping concentrated solar power site suitability in Algeria

    Investment in solar thermal power technologies has become increasingly attractive, despite their still perceived high costs. Algeria has presented an ambitious plan for increasing the share of renewable energy sources (RES) in its power system, with significant investments foreseen for solar power technologies. To achieve this objective, it is necessary to identify optimal sites for the implementation of these plants, as well as sites where implementation is highly inadvisable from an economic, social or environmental point of view. The main goal of this study is to present and apply a methodology to identify adequate locations for the installation of solar power plants in Algeria. The study addressed the particular case of concentrated solar power (CSP) and proposed a hybrid approach combining multi-criteria decision-making and a Geographic Information System. The approach allowed unfeasible areas to be mapped and visualized and the feasible sites to be ranked. The results showed that more than 51% of the territory of the country is unfeasible for the implementation of CSP, mainly due to criteria related to topographic aspects, water availability and distance to the grid. The results demonstrated that relying only on Direct Normal Irradiation (DNI) values may result in a reductionist vision for energy planning, and that other criteria can play a fundamental role in the decision process. The model also made it possible to identify the best regions for CSP investment and opens routes for more detailed studies of exact site selection. The authors would like to thank all open-source data providers and ESRI Maps for providing the background maps. The authors also thank J. R. Oakleaf et al. for making available spatial data linked to global potential for renewable energy, and are grateful to the experts of the research center CDER and the engineering experts who participated in the AHP for their assistance
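    As a hedged illustration of the weighting step behind such a hybrid approach, the sketch below derives AHP priority weights from a pairwise comparison matrix and checks its consistency. The criteria list and the judgement values are hypothetical, not those elicited from the Algerian experts.

```python
# Illustrative sketch of the AHP weighting step used in GIS-based multi-criteria
# site suitability studies. Criteria names and pairwise judgements are made up.
import numpy as np

criteria = ["DNI", "slope", "water availability", "distance to grid"]

# Pairwise comparison matrix (Saaty 1-9 scale), A[i, j] = importance of i over j.
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
], dtype=float)

# Priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is usually considered acceptable).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90          # 0.90 = random consistency index for n = 4
print(dict(zip(criteria, weights.round(3))), "CR =", round(cr, 3))
```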

    An Assessment of Path Loss Tools and Practical Testing of Television White Space Frequencies for Rural Broadband Deployments

    Broadband internet has grown to become a major part of our daily routines. As this growth continues, those without direct access are denied the opportunities that come with it, so ubiquitous broadband coverage is needed to provide these opportunities to everyone. Rural environments are at particular risk of falling behind, as low population densities make wired broadband solutions cost-prohibitive. Wireless technologies are often the only option for many of these areas; WiFi, cellular, and WiMAX networks are currently used around the world, but with the opening of the unused broadcast television frequencies, termed TV White Space (TVWS), a new option is hitting the market. This new technology needs to be assessed before it can be seen as a viable solution. The contribution of this work is two-fold. First, findings from a real, ongoing trial of commercially available TVWS radios in the area surrounding the University of New Hampshire campus are presented. The trial shows that although the radios can provide Internet access to a distance of at least 12.5 km, certain terrain and foliage characteristics of the path can form coverage holes in that region. The second contribution explores the use of empirical path loss models to predict the path loss, and compares the predictions to actual path loss measurements from the TVWS network setup. The Stanford University Interim (SUI) model and a modified version of the Okumura-Hata model provide the lowest root mean squared error (RMSE) for the setup. Additionally, the deterministic Longley-Rice model was explored with the Radio Mobile prediction software. It was determined that without extensive tuning of the foliage component of the algorithm, the model could produce significant prediction errors, resulting in a trade-off between low-cost, un-tuned predictions and prediction accuracy
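    For context, the sketch below shows the kind of empirical prediction being compared: the standard Okumura-Hata median path loss formula plus an RMSE helper. The thesis evaluates a modified Okumura-Hata and the SUI model, whose additional correction terms are not reproduced here; the antenna heights and frequency are example values.

```python
# Illustrative sketch of empirical path loss prediction for a TV-band link.
# Standard Okumura-Hata (small/medium city) form; parameters are examples only.
from math import log10

def okumura_hata_db(freq_mhz, dist_km, h_base_m=30.0, h_mobile_m=1.5):
    """Median path loss in dB, valid roughly for 150-1500 MHz and 1-20 km."""
    a_hm = (1.1 * log10(freq_mhz) - 0.7) * h_mobile_m - (1.56 * log10(freq_mhz) - 0.8)
    return (69.55 + 26.16 * log10(freq_mhz) - 13.82 * log10(h_base_m) - a_hm
            + (44.9 - 6.55 * log10(h_base_m)) * log10(dist_km))

def rmse_db(predicted, measured):
    """Root mean squared error between predicted and measured losses (dB)."""
    return (sum((p - m) ** 2 for p, m in zip(predicted, measured)) / len(predicted)) ** 0.5

# Example: a 600 MHz TV-band link at 12.5 km from a 30 m mast.
print(round(okumura_hata_db(600.0, 12.5), 1), "dB")
```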

    Quantitative analysis of non-cooperative transboundary river basins

    Sharing waters in a transboundary river basin is challenging, especially when there is no tradition of cooperation between riparian countries in other, non-water-related issues such as trade.
Moreover, as water resources are being developed and climate change is a new source of risk, the lack of shared information on hydrological flows and on human and institutional decisions about resources management makes it increasingly difficult to distinguish between natural and anthropogenic factors affecting a flow regime. Attempts to retrieve hydrological data in hardly accessible areas have been made successfully using remote sensing. But the use of this technique for water systems modeling, and particularly for characterizing infrastructure or understanding water user behaviors, remains challenging because it requires extensive on-the-ground observations and interactions with water resources managers. The scope of most modeling techniques is also limited by their inability to handle the multiplicity of institutions dealing with water, or the impact of their specific and often competing interests on water resources. For decades, this lack of detailed data and suitable modeling techniques has led many studies on non-cooperatively managed international river basins to remain qualitative or conceptual, and has therefore frustrated policy makers, who are unable to independently understand and quantify the causes of hydrological changes. In the Yarmouk River basin, for example, which is shared between Syria, Jordan and Israel, the annual outflow now corresponds to less than 15% of that of the pre-development era, despite the signing of bilateral agreements between Syria and Jordan (1987) and between Jordan and Israel (1994). This state of affairs has led riparian countries to develop their own, contested, narratives regarding the collapse of the Yarmouk flow. Taking the Yarmouk basin as a case study, this Ph.D. thesis consequently aims at quantitatively analyzing past hydrological changes in non-cooperatively managed, institutionally complex, over-built, transboundary river basins. This objective is pursued through two main research activities: (i) the monitoring of small reservoirs' storage in inaccessible areas, as a first step toward characterizing a multi-reservoir system; and (ii) the simulation and analysis of scenarios to quantitatively study changes in a river basin. Results reveal that the contributions of natural and anthropogenic factors to the decline of the Yarmouk flows can be identified and then assessed using remote sensing, multi-agent simulation, and scenario analysis
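    A minimal sketch of the first research activity, monitoring small-reservoir storage from space, might look like the following: classify water pixels with an NDWI threshold, then convert the surface area to a volume through an area-storage curve. The index, threshold and power-law coefficients are assumptions for illustration and are not taken from the thesis.

```python
# Illustrative remote-sensing sketch: reservoir surface area from an optical
# scene via an NDWI threshold, then an (uncalibrated) area-storage power law.
import numpy as np

def water_surface_area_m2(green, nir, pixel_area_m2, ndwi_threshold=0.2):
    """Count water pixels using NDWI = (green - nir) / (green + nir)."""
    ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)
    return float(np.count_nonzero(ndwi > ndwi_threshold)) * pixel_area_m2

def storage_m3(area_m2, a=0.012, b=1.44):
    """Power-law area-storage curve V = a * A**b, fitted per reservoir
    (the coefficients here are placeholders, not calibrated values)."""
    return a * area_m2 ** b

green = np.random.rand(100, 100)   # stand-in for a green band tile
nir = np.random.rand(100, 100)     # stand-in for a near-infrared band tile
area = water_surface_area_m2(green, nir, pixel_area_m2=10 * 10)
print(area, storage_m3(area))
```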

    3D visualization of in-flight recorded data.

    Human beings acquire information more easily by being shown an object than by reading a description of it. The brain stores the images that the eyes see and, through this mapping, people can later analyse the information by recalling it. This is why visualization is important and powerful: it helps people remember the scene later. Visualization transforms the symbolic into the geometric, enabling researchers to observe their simulations and computations (Flurchick, 2001). As a consequence, many computer scientists and programmers devote considerable effort to building better visualizations of data for users. For flight data from an aircraft, it is better to understand the data through 3D computer graphics than to look at mere numbers. The flight data consist of several fields such as elapsed time, latitude, longitude, altitude, ground speed, roll angle, pitch angle, heading, wind speed, and so on. With these data variables, filtering is the first step of the visualization process, in order to gather the important information. The collection of processed data is transformed into 3D graphics form to be rendered, by generating Keyhole Markup Language (KML) files in the system. KML is an XML grammar and file format for modeling and storing geographic features such as points, lines, images, polygons, and models for display in Google Earth or Google Maps. Like HTML, KML has a tag-based structure with names and attributes used for specific display purposes. In the present work, new approaches to visualizing flights using Google Earth are developed. Because of the limitations of the Google Earth API, great-circle distance calculations and trigonometric functions are implemented to handle the position, the roll and pitch angles, and a range of camera positions to generate several points of view. Currently, visual representation of flight data relies on 2D graphics, although an aircraft flies in 3D space. The graphical interface allows flight analysts to create ground traces in 2D, and flight ribbons and flight paths with altitude in 3D. Additionally, by incorporating weather information, fog and clouds can also be generated as part of the animation effects. With a 3D stereoscopic technique, a realistic visual representation of the flights is realized
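    The KML-generation step described above can be illustrated with a short sketch that writes a 3D flight path as a KML LineString for display in Google Earth. The tag structure follows the public KML schema; the sample coordinates are made up.

```python
# Minimal sketch: turn filtered flight records (lon, lat, alt) into a KML
# LineString that Google Earth renders as a 3D flight path.
def flight_path_kml(records, name="Flight path"):
    """records: iterable of (lon, lat, alt_m) tuples in WGS84 coordinates."""
    coords = "\n".join(f"{lon},{lat},{alt}" for lon, lat, alt in records)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <LineString>
        <altitudeMode>absolute</altitudeMode>
        <coordinates>
{coords}
        </coordinates>
      </LineString>
    </Placemark>
  </Document>
</kml>"""

sample = [(-71.0, 43.2, 500.0), (-71.1, 43.3, 900.0), (-71.2, 43.4, 1200.0)]
with open("flight_path.kml", "w") as f:
    f.write(flight_path_kml(sample))
```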

    Autonomous 3D Urban and Complex Terrain Geometry Generation and Micro-Climate Modelling Using CFD and Deep Learning

    Sustainable building design requires a clear understanding and realistic modelling of the complex interaction between climate and the built environment to create safe and comfortable outdoor and indoor spaces. This necessitates unprecedented urban climate modelling at high temporal and spatial resolution. The interaction between complex urban geometries and the microclimate is characterized by complex transport mechanisms. The challenge of generating geometric and physics boundary conditions in an automated manner is hindering the progress of computational methods in urban design. Thus, realistic and pragmatic numerical urban micro-climate modelling for wind engineering, environmental, and building energy simulation applications must address the complexity of the geometry and the variability of surface types involved in urban exposures. The original contribution to knowledge in this research is a proposed end-to-end workflow that employs a cutting-edge deep learning model for image segmentation to generate building footprint polygons autonomously, and combines those polygons with LiDAR data to generate level of detail three (LOD3) 3D building models, tackling the geometry modelling issue in climate modelling and solar power potential assessment. Urban and topography geometric modelling is a challenging task when undertaking climate model assessment. This paper describes a deep learning technique based on the U-Net architecture that automates 3D building model generation by combining satellite imagery with LiDAR data. The deep learning model used registered a mean squared error of 0.02. The extracted building polygons were extruded using height information from the corresponding LiDAR data. The building roof structures were also modelled from the same point cloud data. The method has the potential to automate the task of generating urban-scale 3D building models and can be used for city-wide applications. The advantage of applying a deep learning model in an image processing task is that, once trained, it can be applied to a new set of input images to extract building footprint polygons autonomously. In addition, the model can be improved over time with minimal adjustment when a better-quality dataset is available, and the trained parameters can be improved further by building on previously learned features. Application examples for pedestrian-level wind and solar energy availability assessment, as well as modelling wind flow over complex terrain, are presented
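    As an illustration of the extrusion step, the sketch below assigns each segmented footprint a height from the LiDAR returns that fall inside it and extrudes a simple block model. It is a simplified, LOD1-style stand-in for the LOD3 workflow described in the paper; the percentile rule, the library choice (shapely) and the sample data are assumptions.

```python
# Illustrative sketch: extrude a building footprint polygon using a height
# derived from LiDAR points inside it (flat roof; roof-shape modelling omitted).
import numpy as np
from shapely.geometry import Polygon, Point

def building_height_m(footprint: Polygon, lidar_xyz: np.ndarray, ground_m: float) -> float:
    """Height above ground from the 95th percentile of in-footprint LiDAR returns."""
    inside = [z for x, y, z in lidar_xyz if footprint.contains(Point(x, y))]
    return float(np.percentile(inside, 95) - ground_m) if inside else 0.0

def extrude(footprint: Polygon, height_m: float):
    """Return wall quads and a flat roof polygon for a prism-shaped building."""
    ring = list(footprint.exterior.coords)
    walls = [[(x1, y1, 0.0), (x2, y2, 0.0), (x2, y2, height_m), (x1, y1, height_m)]
             for (x1, y1), (x2, y2) in zip(ring[:-1], ring[1:])]
    roof = [(x, y, height_m) for x, y in ring]
    return walls, roof

# Synthetic example: a 10 m x 8 m footprint with LiDAR returns near 12 m.
footprint = Polygon([(0, 0), (10, 0), (10, 8), (0, 8)])
points = np.column_stack([np.random.uniform(0, 10, 500),
                          np.random.uniform(0, 8, 500),
                          np.random.normal(12.0, 0.3, 500)])
walls, roof = extrude(footprint, building_height_m(footprint, points, ground_m=0.0))
```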

    Fingerprint location methods using ray-tracing

    Mobile location methods that employ signal fingerprints are becoming increasingly popular in a number of wireless positioning solutions. A fingerprint is a spatial database of the radio environment, created either from recorded measurements or by simulation. It is used to assign signal characteristics, such as received signal strength or power delay profiles, to an actual location. Measurements made by either the handset or the network are then matched against the fingerprint in order to determine a location. Creating the fingerprint through an a priori measurement campaign is costly and time-consuming. Virtual fingerprints, those created by a ray-tracing radio propagation prediction tool, normally require a lengthy off-line simulation that must be repeated each time changes are made to the network or the built environment. An open research question is whether a virtual fingerprint could be created dynamically, via a ray-tracing model embedded on a mobile handset, for positioning purposes. The key aim of this thesis is to investigate the trade-off between the complexity of the physics required for ray-tracing models and the accuracy of the virtual fingerprints they produce. The most computationally demanding phase of a ray-tracing simulation is the ray-path finding stage, in which a distribution of rays cast from a source point, interacting with walls and edges through reflection and diffraction, is traced to a set of receive points. We therefore develop a new technique that reduces the computation of the ray-path finding stage. The new technique uses a modified method of images rather than brute-force ray casting. It leads to the creation of virtual fingerprints requiring significantly less computational effort than ray-casting techniques, with only small decreases in accuracy. Our new technique for virtual fingerprint creation was then applied to the development of a signal strength fingerprint for a 3G UMTS network covering the Sydney central business district. Our main goal was to determine whether, on current mobile handsets, sub-50 m location accuracy could be achieved within a timescale of a few seconds using our system. The results show that this was indeed achievable. We also show how virtual fingerprinting can lead to more accurate solutions. Based on these results we claim that user-embedded fingerprinting is now a viable alternative to a priori measurement schemes
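    The method-of-images idea underlying the faster ray-path search can be sketched for the simplest case, a single specular wall reflection in 2D: mirror the transmitter across the wall and intersect the image-to-receiver line with the wall segment. The geometry and coordinates below are illustrative only and do not reproduce the thesis's full 3D technique.

```python
# Minimal 2D method-of-images sketch: find the specular reflection point on a
# wall segment for one transmitter/receiver pair, or None if no such path exists.
import numpy as np

def mirror_point(p, wall_a, wall_b):
    """Reflect point p across the infinite line through wall endpoints a, b."""
    p = np.asarray(p, float)
    a, b = np.asarray(wall_a, float), np.asarray(wall_b, float)
    d = (b - a) / np.linalg.norm(b - a)
    proj = a + np.dot(p - a, d) * d
    return 2 * proj - p

def reflection_point(tx, rx, wall_a, wall_b):
    """Specular reflection point on the wall segment, or None if the image
    ray misses the segment (no single-bounce path via this wall)."""
    image = mirror_point(tx, wall_a, wall_b)
    a, b = np.asarray(wall_a, float), np.asarray(wall_b, float)
    r, d = np.asarray(rx, float) - image, b - a
    # Solve image + t*(rx - image) == a + s*(b - a) for t and s.
    m = np.array([[r[0], -d[0]], [r[1], -d[1]]])
    if abs(np.linalg.det(m)) < 1e-12:
        return None
    t, s = np.linalg.solve(m, a - image)
    return image + t * r if 0.0 <= t <= 1.0 and 0.0 <= s <= 1.0 else None

# Symmetric example: tx and rx on y = 0, wall along y = 3 -> bounce at (2, 3).
print(reflection_point(tx=(0, 0), rx=(4, 0), wall_a=(-2, 3), wall_b=(6, 3)))
```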

    The relationship between choice of spectrum sensing device and secondary-user intrusion in database-driven cognitive radio systems

    As radios in future wireless systems become more flexible and reconfigurable whilst available radio spectrum becomes scarce, the possibility of using TV White Space devices (WSD) as secondary users in the TV broadcast bands, without causing harmful interference to licensed incumbents, becomes ever more attractive. Cognitive Radio encompasses a number of technologies which enable adaptive self-programming of systems at different levels to provide more effective use of the increasingly congested radio spectrum. Cognitive Radio has the potential to use spectrum allocated to TV services which is not actually being used by those services, without causing disruptive interference to licensed users, by using channel selection aided by appropriate propagation modelling in TV White Spaces. The main purpose of this thesis is to explore the potential of the Cognitive Radio concept to provide additional bandwidth and improved efficiency, and so help accelerate the development and acceptance of Cognitive Radio technology. Specifically, firstly, three main classes of spectrum sensing techniques (Energy Detection, Matched Filtering and Cyclostationary Feature Detection) are compared in terms of the time and spectrum resources consumed, the prior knowledge required and the complexity, ranking the three classes according to accuracy and performance. Secondly, the spectrum occupancy of the UHF TV band in the frequency range from 470 to 862 MHz is investigated by undertaking spectrum occupancy measurements at different locations around the Hull area in the UK, using two different receiver devices: a low-cost Software-Defined Radio device and a laboratory-quality spectrum analyser. Thirdly, the best of three propagation models (Extended-Hata, Davidson-Hata and Egli) for use in the TV band is identified, whilst also finding the optimum terrain data resolution to use (1000, 100 or 30 m); modelled results are compared with the previously mentioned practical measurements, and it is described how such models can be integrated into a database-driven tool for Cognitive Radio channel selection within the TV White Space environment. Fourthly, a flexible simulation system for creating a TV White Space database using different propagation models is developed. Finally, a flexible system is designed which uses a combination of a geolocation database and spectrum sensing in the TV band, comparing the performance of two spectrum analysers (Agilent E4407B and Agilent EXA N9010A) with that of a low-cost Software-Defined Radio in the real radio environment. The results show that white space devices can be designed using SDRs based on the Realtek RTL2832U chip (RTL-SDR), combined with a geolocation database for identifying the primary user at a specific location, in a cost-effective manner. Furthermore, it is shown that improving the sensitivity of the RTL-SDR will affect the accuracy and performance of the WSD
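    Of the three sensing classes compared, energy detection is the simplest and can be sketched directly: declare a channel occupied when the averaged received energy exceeds a threshold set above the noise floor. The signal model, threshold margin and synthetic test below are assumptions for illustration, not the measurement setup used in the thesis.

```python
# Illustrative sketch of energy detection spectrum sensing on a block of IQ
# samples. Noise power, threshold margin and the synthetic carrier are examples.
import numpy as np

def energy_detect(samples, noise_power, threshold_db_above_noise=3.0):
    """Return (occupied?, measured power in dBm) for one block of IQ samples."""
    power = np.mean(np.abs(samples) ** 2)
    threshold = noise_power * 10 ** (threshold_db_above_noise / 10)
    return power > threshold, 10 * np.log10(power) + 30

# Synthetic test: a noise-only block vs. a block containing a weak carrier.
rng = np.random.default_rng(0)
noise_power = 1e-9
noise = np.sqrt(noise_power / 2) * (rng.standard_normal(4096) + 1j * rng.standard_normal(4096))
carrier = 4e-5 * np.exp(2j * np.pi * 0.1 * np.arange(4096))
print(energy_detect(noise, noise_power))            # expected: not occupied
print(energy_detect(noise + carrier, noise_power))  # expected: occupied
```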