
    PADAMOT : project overview report

    Background and relevance to radioactive waste management

    International consensus confirms that placing radioactive wastes and spent nuclear fuel deep underground in a geological repository is the generally preferred option for their long-term management and disposal. This strategy provides a number of advantages compared to leaving them on or near the Earth's surface. These advantages arise because, for a well-chosen site, the geosphere can provide:
    • a physical barrier that can negate or buffer the effects of surface-dominated natural disruptive processes such as deep weathering, glaciation, river and marine erosion, flooding, asteroid/comet impact and earthquake shaking;
    • long and slow groundwater return pathways from the facility to the biosphere, along which retardation, dilution and dispersion processes may operate to reduce radionuclide concentrations in the groundwater;
    • a stable and benign geochemical environment that maximises the longevity of the engineered barriers, such as the waste containers and backfill in the facility;
    • a natural radiation shield around the wastes;
    • a mechanically stable environment in which the facility can be constructed and will afterwards be protected;
    • an environment that reduces the likelihood of the repository being disturbed by inadvertent human intrusion, such as land-use changes, construction projects, drilling, quarrying and mining;
    • protection against the effects of deliberate human activities such as vandalism, terrorism and war.
    However, safety considerations for storing and disposing of long-lived radioactive wastes must take into account various scenarios that might affect the ability of the geosphere to provide the functionality listed above. Therefore, in order to provide confidence in the ability of a repository to perform within the deep geological setting at a particular site, a demonstration of geosphere "stability" needs to be made.
Stability is defined here as the capacity of a geological and hydrogeological system to minimise the impact of external influences on the repository environment, or at least to account for them in a manner that allows their impacts to be evaluated in any safety assessment. A repository should be sited where the deep geosphere is a stable host: one in which the engineered containment can continue to perform according to design, and in which the surrounding hydrogeological, geomechanical and geochemical environment will continue to act as a natural barrier to radionuclide movement towards the biosphere. However, over the long periods during which long-lived radioactive wastes will pose a hazard, environmental change at the surface has the potential to disrupt the stability of the geosphere; the causes of environmental change and their potential consequences therefore need to be evaluated. As noted above, environmental change can include processes such as deep weathering, glaciation, and river and marine erosion. It can also alter groundwater boundary conditions through changing recharge/discharge relationships. One of the key drivers of environmental change is climate variability. The question then arises: how can geosphere stability be assessed with respect to changes in climate? Key issues raised in this connection are:
• What evidence is there that 'going underground' eliminates the extreme conditions to which storage on the surface would be subjected in the long term?
• How can the additional stability and safety of the deep geosphere be demonstrated with evidence from the natural system?
As a corollary, the capacity of repository sites deep underground in stable rock masses to mitigate potential impacts of future climate change on groundwater conditions needs to be tested and demonstrated.
Generic scenarios for groundwater evolution relating to climate change are currently only weakly constrained by data and process understanding. Hence, the possibility of site-specific changes in groundwater conditions in the future can only be assessed and demonstrated by studying groundwater evolution in the past. Stability of groundwater conditions in the past is an indication of future stability, though both the climatic and geological contexts must be taken into account in making such an assertion.

    How Much of the Web Is Archived?

    Although the Internet Archive's Wayback Machine is the largest and best-known web archive, a number of public web archives have emerged in the last several years. With varying resources, audiences and collection development policies, these archives overlap with each other to varying degrees. While individual archives can be measured in terms of number of URIs, number of copies per URI, and intersection with other archives, to date there has been no answer to the question "How much of the Web is archived?" We study the question by approximating the Web using sample URIs from DMOZ, Delicious, Bitly, and search engine indexes, and counting the number of copies of the sample URIs that exist in various public web archives. Each sample set carries its own bias. The results from our sample sets indicate that 35%-90% of the Web has at least one archived copy, 17%-49% has between 2 and 5 copies, 1%-8% has 6-10 copies, and 8%-63% has more than 10 copies in public web archives. The number of URI copies varies as a function of time, but no more than 31.3% of URIs are archived more than once per month.
    Comment: This is the long version of the short paper by the same title published at JCDL'11. 10 pages, 5 figures, 7 tables. Version 2 includes minor typographical corrections.
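The copy-count buckets the abstract reports can be tallied for any URI sample with a short script. A minimal Python sketch (the archive lookup itself is omitted; in practice each archive would be queried for mementos of each URI, and the sample data below is hypothetical):

```python
from collections import Counter

def bucket(copies: int) -> str:
    """Assign a URI's archived-copy count to the buckets used in the study."""
    if copies == 0:
        return "unarchived"
    if copies == 1:
        return "1 copy"
    if 2 <= copies <= 5:
        return "2-5 copies"
    if 6 <= copies <= 10:
        return "6-10 copies"
    return ">10 copies"

def archive_coverage(copy_counts: dict) -> dict:
    """Percentage of sample URIs falling in each bucket.

    copy_counts maps URI -> number of archived copies found across
    public web archives (hypothetical sample data here)."""
    tally = Counter(bucket(n) for n in copy_counts.values())
    total = len(copy_counts)
    return {b: 100.0 * c / total for b, c in tally.items()}

# Hypothetical sample: four URIs with differing archive coverage.
sample = {"a.example": 0, "b.example": 1, "c.example": 4, "d.example": 12}
coverage = archive_coverage(sample)
```

With a real sample, the per-archive counts would come from each archive's CDX or Memento TimeMap interface rather than a hand-written dictionary.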

    A Framework for Developing Real-Time OLAP algorithm using Multi-core processing and GPU: Heterogeneous Computing

    The ever-increasing volume of stored data has spurred researchers to seek methods for exploiting it optimally, most of which face a response-time problem caused by the sheer size of the data. Most solutions suggest materialization as the favoured approach. However, such a solution cannot by itself attain real-time answers. In this paper we propose a framework that lays out the barriers, and suggested solutions, on the way to achieving the real-time OLAP answers widely used in decision support systems and data warehouses.
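The staleness problem with materialization that motivates the framework can be seen in a few lines: a precomputed aggregate answers queries instantly but silently ignores facts that arrive after it was built. A minimal Python sketch (the fact table and all names are invented for illustration):

```python
from collections import defaultdict

# Hypothetical fact table: (region, product, sales).
facts = [("EU", "a", 10), ("EU", "b", 5), ("US", "a", 7), ("US", "b", 3)]

# Materialize: precompute the per-region aggregate once.
materialized = defaultdict(int)
for region, _product, sales in facts:
    materialized[region] += sales

def query_materialized(region: str) -> int:
    """O(1) lookup, but stale as soon as new facts arrive."""
    return materialized[region]

def query_on_demand(region: str) -> int:
    """Always current, but rescans the fact table on every query."""
    return sum(s for r, _p, s in facts if r == region)

facts.append(("EU", "c", 100))    # a real-time update arrives
stale = query_materialized("EU")  # still the pre-update total
fresh = query_on_demand("EU")     # includes the new fact
```

Real-time OLAP frameworks aim to close exactly this gap, answering with the freshness of `query_on_demand` at something near the latency of `query_materialized`.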

    Operation of Modular Smart Grid Applications Interacting through a Distributed Middleware

    IoT functionality can broaden the scope of distribution system automation in terms of functionality and communication. However, it also poses risks regarding resource consumption and security. This article presents a field-approved IoT-enabled smart grid middleware, which allows for flexible deployment and management of applications within smart grid operation. In the first part of the work, the resource consumption of the middleware is analyzed and current memory bottlenecks are identified. The bottlenecks can be resolved by introducing a new entity that dynamically loads multiple applications within one JVM. The performance was experimentally tested, and the results suggest that this approach can significantly reduce the applications' memory footprint on the physical device. The second part of the study identifies and discusses potential security threats, focusing on attacks stemming from malicious software applications within the framework. To prevent such attacks, a proxy-based prevention mechanism is developed and demonstrated.
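The memory optimisation described, hosting several applications inside one runtime rather than one JVM per application, can be sketched language-agnostically as a loader that instantiates app objects into a shared process. A Python stand-in (all class and method names are hypothetical, not the middleware's actual API):

```python
class App:
    """Hypothetical base class for a smart grid application."""
    def __init__(self, name: str):
        self.name = name
        self.running = False

    def start(self) -> None:
        self.running = True

    def stop(self) -> None:
        self.running = False

class SharedRuntimeLoader:
    """Hosts several applications in one process, mimicking the new
    middleware entity that loads multiple apps into a single JVM so
    each app avoids paying a full per-runtime memory overhead."""
    def __init__(self):
        self._apps = {}

    def deploy(self, app: App) -> None:
        self._apps[app.name] = app
        app.start()

    def undeploy(self, name: str) -> None:
        self._apps.pop(name).stop()

    def deployed(self) -> list:
        return sorted(self._apps)

loader = SharedRuntimeLoader()
loader.deploy(App("voltage-control"))
loader.deploy(App("load-forecast"))
```

The security trade-off the article's second part addresses follows directly from this design: co-located applications share one address space, so a malicious app must be contained by an extra mechanism such as the proxy described.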

    Marea: An Efficient Application-Level Object Graph Swapper

    During the execution of object-oriented applications, several million objects are created, used and then collected if they are no longer referenced. Problems appear when objects are unused but cannot be garbage-collected because they are still referenced from other objects. This is an issue because those objects waste primary memory, so applications use more primary memory than they actually need. We claim that relying on the operating system's (OS) virtual memory is not always enough, since it cannot take into account the domain and structure of applications. At the same time, applications have no easy way to parametrize or cooperate with memory management. In this paper, we present Marea, an efficient application-level object graph swapper for object-oriented programming languages. Its main goal is to offer the programmer a novel solution for handling application-level memory. Developers can instruct our system to release primary memory by swapping out unused yet referenced objects to secondary memory. Our approach has been qualitatively and quantitatively validated. Our experiments and benchmarks on real-world applications show that Marea can reduce the memory footprint by between 23% and 36%.
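The core idea, swapping a referenced-but-unused object graph to secondary storage behind a proxy and faulting it back in on access, can be sketched in a few lines of Python (an illustrative stand-in for the technique, not Marea's actual implementation; all names are hypothetical):

```python
import os
import pickle
import tempfile

class SwapProxy:
    """Holds an object either in primary memory or serialized on disk.
    Reading .value transparently swaps the object back in."""
    def __init__(self, obj):
        self._obj = obj
        self._path = None

    def swap_out(self) -> None:
        """Serialize the object graph to disk and drop the in-memory reference."""
        if self._obj is None:
            return
        fd, self._path = tempfile.mkstemp()
        with os.fdopen(fd, "wb") as f:
            pickle.dump(self._obj, f)
        self._obj = None  # frees primary memory once no other refs remain

    @property
    def value(self):
        if self._obj is None:  # fault the graph back in on demand
            with open(self._path, "rb") as f:
                self._obj = pickle.load(f)
            os.remove(self._path)
            self._path = None
        return self._obj

graph = {"nodes": list(range(1000)), "label": "rarely used"}
proxy = SwapProxy(graph)
proxy.swap_out()        # the graph now lives on disk, not in RAM
restored = proxy.value  # transparently restored on first access
```

Unlike OS paging, this application-level approach lets the developer decide which object graphs are cold, which is exactly the domain knowledge the abstract argues virtual memory cannot exploit.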

    Why Do Developers Get Password Storage Wrong? A Qualitative Usability Study

    Passwords are still a mainstay of various security systems, as well as the cause of many usability issues. For end-users, many of these issues have been studied extensively, highlighting problems and informing design decisions for better policies and motivating research into alternatives. However, end-users are not the only ones who have usability problems with passwords! Developers who are tasked with writing the code by which passwords are stored must do so securely. Yet history has shown that this complex task often fails due to human error, with catastrophic results. While an end-user's choice of a bad password can have dire consequences, a developer who forgets to hash and salt a password database can cause far larger problems. In this paper we present a first qualitative usability study with 20 computer science students to discover how developers deal with password storage and to inform research into aiding developers in the creation of secure password systems.
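The "hash and salt" step the paper says developers often get wrong can be done with Python's standard library alone. A minimal sketch using PBKDF2 (the iteration count here is illustrative, not a vetted policy recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 200_000) -> tuple:
    """Return (salt, digest) using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # a fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    *, iterations: int = 200_000) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse")
ok = verify_password("correct horse", salt, digest)
bad = verify_password("wrong guess", salt, digest)
```

The per-password salt defeats precomputed rainbow tables, and the slow key-derivation function raises the cost of brute force, which is precisely the protection a plain hashed (or worse, plaintext) password database lacks.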

    Freshwater ecosystem services in mining regions : modelling options for policy development support

    The ecosystem services (ES) approach offers an integrated perspective on social-ecological systems, suitable for holistic assessments of mining impacts. Yet for ES models to be policy-relevant, methodological consensus in mining contexts is needed. We review articles assessing ES in mining areas, focusing on freshwater components and the potential for policy support. Twenty-six articles were analysed concerning (i) methodological complexity (data types, number of parameters, processes and ecosystem-human integration level) and (ii) potential applicability for policy development (communication of uncertainties, scenario simulation, stakeholder participation and management recommendations). The articles illustrate mining impacts on ES mostly through valuation exercises. However, the lack of ground- and surface-water measurements, as well as insufficient representation of the connectivity among soil, water and humans, leaves room for improvement. Inclusion of mining-specific environmental stressor models, increased resolution of topographies, determination of baseline ES patterns and inclusion of multi-stakeholder perspectives would all strengthen policy support. We argue that achieving more holistic assessments requires practitioners to aim for high social-ecological connectivity, using mechanistic models where possible and inductive methods only where necessary. Given data constraints, cause-effect networks may be the most feasible solution. We therefore propose a policy-oriented framework in which data science is directed to environmental modelling for the analysis of mining impacts on water ES.
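The cause-effect networks proposed as the most feasible modelling option can be illustrated as a small directed graph that propagates a mining stressor through to freshwater ecosystem services. A toy Python sketch (the network structure and edge weights are invented for illustration, under a simple linear-propagation assumption):

```python
# Hypothetical cause-effect network: each edge carries a signed weight
# describing how strongly a change in the source propagates to the target.
edges = {
    "mine_discharge": [("water_quality", -0.8)],
    "water_quality": [("fish_habitat", 0.7), ("drinking_water", 0.9)],
    "fish_habitat": [("fishery_yield", 0.6)],
}

def propagate(node: str, change: float, effects=None) -> dict:
    """Multiply the change along every downstream path (linear assumption)
    and accumulate the total effect on each ecosystem service node."""
    if effects is None:
        effects = {}
    for target, weight in edges.get(node, []):
        effects[target] = effects.get(target, 0.0) + change * weight
        propagate(target, change * weight, effects)
    return effects

impacts = propagate("mine_discharge", 1.0)  # unit increase in discharge
```

Even this toy version shows the appeal noted in the review: the network needs only expert-elicited links and weights rather than dense ground- and surface-water time series, at the cost of strong simplifying assumptions.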