Structural and stochastical modelling of possible contaminant pathways below nuclear installations
Richard Haslam¹, Stuart Clarke¹, Peter Styles¹ & Clive Auton²
¹Earth Sciences and Geography, School of Physical and Geographical Sciences, Keele University, Keele, Staffordshire, ST5 5BG, United Kingdom
²British Geological Survey, Murchison House, West Mains Road, Edinburgh, EH9 3LA, United Kingdom
Dounreay Nuclear Power Station is situated on the northern coast of Caithness, Scotland, on complex, normally faulted Devonian sedimentary rocks with a thin, intermittent cover of superficial deposits, predominantly comprising glacial tills of varying provenance.
Bedrock structure, fracture patterns and the relationships between bedrock and the superficial deposits have a considerable impact on the transmissivity of any possible contaminants. Consequently, an understanding of the bedrock–superficial boundary, and of how fractures and faults influence and control the transport of fluids, is a key concern. The principal aims of this work are to gain an understanding of the processes and controls on fluid flow pathways within such complex geological terrains, and to develop methods of stochastically evaluating likely contaminant transport within the subsurface.
This work focuses on the near-surface bedrock geology and superficial deposits. The near-surface geology of the Dounreay site comprises cyclic sequences of lacustrine rocks; their cyclicity has enabled a reference stratigraphy to be created and correlated across the site. This stratigraphy, together with the coastal exposures and the extensive borehole data available, provides a unique opportunity to construct and constrain a three-dimensional bedrock model, the interpretive element of which has been robustly tested using structural restoration techniques.
In the bedrock at Dounreay, three principal fracture sets have been identified. The first two sets are approximately orthogonal, trending north-northwest and west-southwest respectively; they represent the regional fracture sets. It is proposed that these fractures were produced during loading and burial of the Devonian sediments. The final fracture set is predominantly parallel to the bedding of the laminated sediments; it gives the Caithness Flagstones their ‘flaggy’ nature.
The regional fracture sets are approximately constant over the site area and vary little with depth, whereas the bedding-parallel fracture set shows a marked decrease in the number of fractures per metre with depth, following a logarithmic trend. This relationship is also visible in the Rock Quality Designation (RQD) values and hydraulic conductivity data from boreholes. It follows that the bedding-parallel fractures are the major controlling factor on flow in the shallow subsurface and that RQD values can be used as a proxy for fracture density. RQD values are commonly collected during borehole drilling, and the relationship between hydraulic conductivity and RQD offers a method for stochastically populating a 3D geological model with hydraulic conductivity values.
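A minimal sketch of how such a relationship could be used to populate a model grid stochastically. The log-linear form, its coefficients, the assumed RQD–depth trend and the noise term are illustrative placeholders, not values derived in this study; in practice they would be fitted to borehole packer-test and RQD data.

```python
import numpy as np

rng = np.random.default_rng(42)

def conductivity_from_rqd(rqd, a=-4.0, b=-0.03, sigma=0.5):
    """Illustrative log-linear proxy: log10(K) = a + b * RQD + noise.
    Coefficients a, b and the lognormal spread sigma are assumed for
    demonstration only."""
    log10_k = a + b * np.asarray(rqd, dtype=float) + rng.normal(0.0, sigma, np.shape(rqd))
    return 10.0 ** log10_k  # hydraulic conductivity, assumed units of m/s

# Populate a small 3D grid: RQD increases with depth (fewer bedding-parallel
# fractures), so conductivity decreases downwards on a logarithmic trend.
nz, ny, nx = 10, 20, 20
depth = np.linspace(1.0, 50.0, nz)[:, None, None]            # metres below rockhead
rqd_trend = np.clip(40.0 + 25.0 * np.log10(depth), 0, 100)   # assumed depth trend
rqd_grid = np.broadcast_to(rqd_trend, (nz, ny, nx))
k_grid = conductivity_from_rqd(rqd_grid)

print(f"Shallow K ~ {k_grid[0].mean():.2e} m/s, deep K ~ {k_grid[-1].mean():.2e} m/s")
```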
Current geological interpretations of the superficial deposits are based primarily on their genesis. Consequently, subdivisions based on the origin of the sediments do not relate directly to their fluid transmissivity. The superficial deposits generally have a very low hydraulic conductivity compared with that of the bedrock, impeding the flow of water from the surface to the groundwater system at depth. A combination of drillers' descriptions and comparisons of grain-size distributions enables subdivisions of the Quaternary strata to be established based on their properties instead of their genesis. These properties can then be stochastically interpolated throughout the 3D geological model.
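A brief sketch of the kind of property-based reclassification described above. The three-way split and the grain-size thresholds are illustrative assumptions, not the subdivision scheme used in this work.

```python
def classify_interval(gravel_pct, sand_pct, fines_pct):
    """Assign a borehole interval to a property-based unit from its
    grain-size fractions (thresholds are assumed for illustration)."""
    if fines_pct >= 50:
        return "low-K diamicton/clay"      # till-like, very low conductivity
    if gravel_pct + sand_pct >= 70 and fines_pct <= 15:
        return "higher-K sand and gravel"  # potential shallow pathway
    return "intermediate-K mixed deposit"

# Example drillers' intervals: (gravel %, sand %, fines %)
for g, s, f in [(5, 20, 75), (45, 40, 10), (20, 45, 35)]:
    print(classify_interval(g, s, f))
```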
This work provides a framework from which likely contamination scenarios can be modelled, both in the well-constrained subsurface of Dounreay and at other nuclear installations where the nature of the subsurface is less well constrained.
Graph-based Modelling of Concurrent Sequential Patterns
Structural relation patterns have been introduced recently to extend the search for complex patterns often hidden behind large sequences of data. This has motivated a novel approach to sequential-pattern post-processing, and a corresponding data mining method was proposed for Concurrent Sequential Patterns (ConSP). This article refines the approach in the context of ConSP modelling, where a companion graph-based model is devised as an extension of previous work. Two new modelling methods are presented here, together with a construction algorithm, to complete the transformation of concurrent sequential patterns into a ConSP-Graph representation. Customer order data is used to demonstrate the effectiveness of ConSP mining, while synthetic sample data highlights the strength of the modelling technique, illuminating the theories developed.
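One plausible way to encode a set of concurrent sequential patterns as a graph, for intuition only; the node and edge conventions below are assumptions and are not the ConSP-Graph construction algorithm defined in the article.

```python
from collections import defaultdict

def build_consp_graph(concurrent_patterns):
    """Encode concurrent sequential patterns as a directed graph.

    `concurrent_patterns` is a list of groups; each group is a list of
    sequential patterns (item tuples) observed to occur concurrently.
    Each pattern contributes a chain of 'follows' edges, and the patterns
    in a group fan out from a shared branch node recording the concurrency."""
    edges = defaultdict(set)
    for gid, group in enumerate(concurrent_patterns):
        branch = f"concurrent[{gid}]"
        for pattern in group:
            edges[branch].add(pattern[0])           # branch node fans out
            for a, b in zip(pattern, pattern[1:]):  # sequential 'follows' edges
                edges[a].add(b)
    return edges

# Toy customer-order-like data: two patterns observed to run concurrently.
graph = build_consp_graph([[("browse", "add_to_cart", "pay"),
                            ("browse", "wishlist")]])
for src, dsts in graph.items():
    print(src, "->", sorted(dsts))
```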
A SIMD architecture for hard real-time systems
Emerging safety-critical systems require high-performance data-parallel architectures and, problematically, ones that can guarantee tight and safe worst-case execution times. Given the complexity of existing architectures like GPUs, it is unlikely that sufficiently accurate models and algorithms for timing analysis will emerge in the foreseeable future. This motivates a clean-slate approach to designing a real-time data-parallel architecture.
In this work I present Sim-D: a wide-SIMD architecture for hard real-time systems. Similar to GPUs, Sim-D performs hardware strip-mining to schedule the work for a compute kernel in entities called work-groups. Sim-D schedules the work for each work-group as a sequence of uninterruptible access and execute program phases, interleaving the phases of two work-groups. By providing performance isolation between the memory and compute resources, the execution time of each phase can be tightly bounded through static analysis.
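A simplified sketch of how such an interleaved schedule could be bounded: while one work-group holds the DRAM resource for its access phase, the other may occupy the compute resource, and vice versa. The per-phase cycle bounds and the greedy resource model are illustrative assumptions, not the actual Sim-D hardware scheduler or its analysis.

```python
def wcet_two_workgroup_interleave(phases):
    """Bound the makespan of a pair of work-groups whose uninterruptible
    access/execute phases share one DRAM resource and one compute resource.
    `phases` is a list of (access_bound, execute_bound) cycle-bound tuples,
    one per program phase. Illustrative simplification only."""
    dram_free_at = 0      # cycle at which the DRAM resource becomes free
    compute_free_at = 0   # cycle at which the compute resource becomes free
    finish = [0, 0]       # completion time of work-group 0 and work-group 1
    for access, execute in phases:
        for wg in (0, 1):
            start_access = max(finish[wg], dram_free_at)
            dram_free_at = start_access + access
            start_exec = max(dram_free_at, compute_free_at)
            compute_free_at = start_exec + execute
            finish[wg] = compute_free_at
    return max(finish)

# Example: three phases with assumed (access, execute) cycle bounds.
print(wcet_two_workgroup_interleave([(400, 300), (250, 500), (100, 100)]))
```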
I present a predictable closed-page DRAM controller that processes requests for large 1D- and 2D blocks of data, as well as indirect indexed transfers. These large transfers coalesce the data requests of a whole work-group. For a linear 4KiB transfer over a 64-bit data bus, the utilisation provably exceeds 78% for DDR4-3200AA DRAM. For 2D blocks, a well-chosen tiling configuration can achieve near-similar efficiency. I show that bounds on the execution time of indexed transfers are pessimistic by nature, but propose a novel snoopy indexed transfer mechanism that permits more reasonable bounds when the buffer size is limited.
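A back-of-envelope view of the utilisation figure quoted above: a 4 KiB linear transfer over a 64-bit bus occupies 512 data beats, i.e. 256 DRAM clocks at double data rate, and the bound is the ratio of those data clocks to data clocks plus worst-case overhead. The overhead figure below is an assumed placeholder standing in for the detailed DDR4-3200AA timing analysis in the thesis.

```python
def utilisation_lower_bound(bytes_transferred, bus_bytes_per_beat=8,
                            beats_per_clock=2, overhead_clocks=70):
    """Lower bound on bus utilisation for one large linear transfer:
    data clocks / (data clocks + worst-case overhead clocks).
    overhead_clocks (activates, precharges, turnaround, refresh) is assumed."""
    beats = bytes_transferred // bus_bytes_per_beat     # 4 KiB -> 512 beats
    data_clocks = beats // beats_per_clock              # DDR: 2 beats per clock
    return data_clocks / (data_clocks + overhead_clocks)

# With ~70 clocks of assumed worst-case overhead the bound lands near 0.78.
print(f"{utilisation_lower_bound(4096):.2f}")
```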
Finally, I present a worst-case execution time calculation algorithm for Sim-D. This algorithm is paired with two hardware work-group scheduling policies that deterministically reduce run-time variance. The worst-case execution time analysis algorithm combines static control flow analysis with a simulation-based cost model for execution and DRAM transfers. Its key novelty is the addition of a stage that considers work-group scheduling effects. I show that the work-group scheduling policies degrade performance on average by 8.9%, but permit the calculation of worst-case execution time bounds that are tight within 14.3% on average for benchmarks that avoid inefficient indexed transfers.
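To make the combination of control flow analysis and a cost model concrete, here is a textbook-style sketch: a longest-path computation over an acyclic control flow graph whose per-block costs would come from a simulated cost model and whose loop bounds are supplied statically. The CFG, costs and loop bounds are invented for illustration; this is not the Sim-D analysis algorithm itself.

```python
def wcet_longest_path(cfg, block_cost, loop_bound, entry, exit_block):
    """Worst-case path length through an acyclic CFG with per-block loop
    bounds folded in. block_cost[b] stands in for a simulation-based cost
    model of execute phases and DRAM transfers."""
    memo = {}

    def longest(block):
        if block == exit_block:
            return block_cost[block] * loop_bound.get(block, 1)
        if block not in memo:
            memo[block] = (block_cost[block] * loop_bound.get(block, 1)
                           + max(longest(s) for s in cfg[block]))
        return memo[block]

    return longest(entry)

# Toy kernel: entry -> {loop body | bypass} -> exit, with assumed cycle costs.
cfg = {"entry": ["body", "bypass"], "body": ["exit"], "bypass": ["exit"], "exit": []}
cost = {"entry": 50, "body": 120, "bypass": 20, "exit": 30}
print(wcet_longest_path(cfg, cost, {"body": 16}, "entry", "exit"))
```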
Bitproperty
Property is the law of lists and ledgers. County land records, stock certificate entries, mortgage registries, UCC filings on personal property, United States Copyright and Patent registries of interests in intellectual property, bank accounts, domain name systems, and consumers’ Kindle eBook collections in the cloud — all are merely entries in a list, determining who owns what. Each such list has suffered under a traditional limitation. To prevent falsification or duplication, a single entity must maintain the list, and users must trust (and pay) that entity. As a result, transactions must proceed at significant expense and delay. Yet zero or near-zero expense is the fuel of internet scalability. Until technologies get cheap and fast enough, they cannot benefit from the full power of the internet. Property transactions have not yet truly seen an internet revolution because they are constrained by the cost of creating centralized trusted authorities.

This article retheorizes the contours of digital property if that central constraint were removed. There is every reason to believe it can be. A spate of interest in cryptocurrencies has driven the development of a series of technologies for creating public, cryptographically secure ledgers of property interests that do not rely on trust in a specific entity to curate the list. Previously, the digital objects that users could buy and sell online were not rivalrous in the same way as offline physical objects, unless some centralized entity such as a social network, digital currency issuer, or game company served the function of trusted list curator. Trustless public ledgers change this dynamic. Counterparties can hand one another digital, rivalrous objects in the same way that they used to hand each other gold bars or dollar bills. No intermediary or curator is needed.

Trustless public ledgers can help to reshape property law online. They offer the kind of near-zero transaction costs that have provoked radical disruptive innovation across the internet. With near-zero transaction costs, online property transactions can finally benefit from the huge scaling effects of internet technologies.

In addition, the advent of this disruptive technology provides an opportunity to more deeply theorize property interests in information environments. Property online is anemic. Consumers control few online resources and own even less. This is in no small part due to antiquated notions of property as the law of physical, tangible resources. With the advent of new technology that can create digital, scarce, and rival intangible assets, these basic assumptions should be reexamined, discarded, and replaced with a theory of property as an information communication and storage system. That is the project of this piece.
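A toy illustration of the ledger idea discussed above: each ownership-transfer entry hashes over the previous entry, so altering earlier history is detectable. This is only a didactic sketch, not the article's proposal; a real trustless ledger additionally needs signatures and a consensus protocol across mutually untrusting nodes.

```python
import hashlib
import json

def append_entry(ledger, asset_id, new_owner):
    """Append an ownership-transfer entry whose hash commits to the prior
    history, making later tampering with earlier entries detectable."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    body = {"asset": asset_id, "owner": new_owner, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    ledger.append({**body, "hash": digest})
    return ledger

ledger = []
append_entry(ledger, "lot-42", "alice")
append_entry(ledger, "lot-42", "bob")   # Alice hands the asset to Bob
print(ledger[-1]["owner"], ledger[-1]["prev"][:8])
```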
Post-quantum blockchain for internet of things domain
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.

In the evolving realm of quantum computing, emerging advancements reveal substantial challenges and threats to existing cryptographic infrastructures, particularly impacting blockchain technologies. These are pivotal for securing Internet of Things (IoT) ecosystems. The traditional blockchain structures, integral to myriad IoT applications, are susceptible to potential quantum computations, emphasizing an urgent need for innovations in post-quantum blockchain solutions to reinforce security in the expansive domain of IoT.
This PhD thesis delves into the crucial exploration and meticulous examination of the development and implementation of post-quantum blockchain within the IoT landscape, focusing on the incorporation of advanced post-quantum cryptographic algorithms into Hyperledger Fabric, a leading blockchain platform renowned for its versatility and robustness. The primary aim is to discern viable post-quantum cryptographic solutions capable of fortifying blockchain systems against impending quantum threats, enhancing security and reliability in IoT applications.
The research comprehensively evaluates various post-quantum public-key generation and digital signature algorithms, performing detailed analyses of their computational time and memory usage to identify optimal candidates. Furthermore, the thesis proposes an innovative variant of the lattice-based digital signature scheme Fast-Fourier Lattice-based Compact Signature over NTRU (Falcon), which leverages a Markov chain Monte Carlo (MCMC) algorithm as a trapdoor sampler to augment its security attributes.
The research introduces a post-quantum version of the Hyperledger Fabric blockchain that integrates post-quantum signatures. The system utilizes the Open Quantum Safe (OQS) library, rigorously tested against NIST round 3 candidates for optimal performance. The study highlights the capability to manage IoT data securely on the post-quantum Hyperledger Fabric blockchain through the Message Queue Telemetry Transport (MQTT) protocol. Such a configuration ensures safe data transfer from IoT sensors directly to the blockchain nodes, securing the processing and recording of sensor data within the node ledger. The research addresses the multifaceted challenges of quantum computing advancements and significantly contributes to establishing secure, efficient, and resilient post-quantum blockchain infrastructures tailored explicitly for the IoT domain. These findings are instrumental in elevating the security paradigms of IoT systems against quantum vulnerabilities and catalysing innovations in post-quantum cryptography and blockchain technologies.
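A minimal sketch of the ingestion path described above, assuming the liboqs-python ("oqs") and paho-mqtt packages. The broker address, topic layout and the ledger stub are placeholders, and submission to a Hyperledger Fabric channel is abstracted to a single function; this is not the thesis's implementation.

```python
import oqs
import paho.mqtt.client as mqtt

SIG_ALG = "Falcon-512"
signer = oqs.Signature(SIG_ALG)
public_key = signer.generate_keypair()

def submit_to_ledger(payload: bytes, signature: bytes, pub: bytes) -> None:
    # Placeholder for the Fabric client call that writes the signed reading
    # to the node ledger; printing stands in for chaincode invocation.
    print(f"ledger <- {payload!r} ({len(signature)}-byte {SIG_ALG} signature)")

def on_message(client, userdata, msg):
    # Sign each sensor reading with the post-quantum key before submission.
    signature = signer.sign(msg.payload)
    submit_to_ledger(msg.payload, signature, public_key)

# paho-mqtt 1.x-style constructor; newer releases also take a callback API version.
client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.local")   # assumed broker address
client.subscribe("sensors/+/reading")    # assumed topic layout
client.loop_forever()
```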
Furthermore, this thesis introduces strategies for the optimization of performance and scalability of post-quantum blockchain solutions and explores alternative, energy-efficient consensus mechanisms such as the Raft and Stellar Consensus Protocol (SCP), providing sustainable alternatives to the conventional Proof-of-Work (PoW) approach.
A critical insight emphasized throughout this thesis is the imperative of synergistic collaboration among academia, industry, and regulatory bodies. This collaboration is pivotal to expedite the adoption and standardization of post-quantum blockchain solutions, fostering the development of interoperable and standardized technologies enriched with robust security and privacy frameworks for end users.
In conclusion, this thesis furnishes profound insights and substantial contributions to the implementation of post-quantum blockchain in the IoT domain. It delineates original contributions to knowledge and practice in the field, offering practical solutions and advancing the state of the art in post-quantum cryptography and blockchain research, thereby paving the way for a secure and resilient future for interconnected IoT systems.
Community Enforcement of Informal Contracts: Jewish Diamond Merchants in New York
The diamond industry is home to many unusual features: the predominance of an ethnically homogeneous community of merchants, the norm of intergenerational family businesses, and a rejection of public courts in favor of private contract enforcement. This paper explains that the diamond industry's unique attributes arise specifically to meet the particularly rigorous hazards of transacting in diamonds. Since diamonds are portable, easily concealable, and extremely valuable, the risk associated with a credit sale can be especially costly. However, the industry enjoys valuable organizational efficiencies if transactions occur on credit between independent, fully incentivized agents. Thus, an efficient system of exchange will find ways to induce merchants who purchase on credit to fulfill their payment obligations. The very features that give the diamond industry an unusual profile are responsible for providing institutions to support credit sales. A system of private arbitration spreads information regarding merchants' past dealings, so a reputation mechanism to monitor merchants can take hold. Intergenerational legacies, though restricting entry only to those who can inherit good reputations from family members, resolve an end-game problem and induce merchants to deal honestly through their very last transaction. And Ultra-Orthodox Jewish participants, for whom inclusion and participation in their communities is as paramount as their material wealth, provide important value-added services as diamond cutters and brokers without posing the threat of theft and flight.
Beggar Thy Neighbour Exchange Rate Regime Misadvice from Misapplications of Mundell (1961) and the Remedy
Economists invoke Mundell (1961) in arguing for the general policy of a flexible exchange rate regime as a means of restoring equilibria after shocks. But there is a discrepancy between the intent of the general policy and attempts at its implementation as identified by specific changes in exchange rates. When we assemble the set of specific changes called for by distinct economists operating as advocates for individual countries, these are uniformly in the form of beggar-thy-neighbour advice, i.e. travesties of objectively identifying disequilibria and a menace to international cooperation and peace. This paper traces the unintended travesties to problems of complexity and uncertainty, problems that are implicitly assumed absent in Mundell (1961), rendering the situation so simple that equilibria are transparent. The problems remained essentially unaddressed when economists extended Mundell (1961) via expected utility theory, since this theory also ignores the impossibility of maximising and the complexities facing central bankers, private firms and others in the evaluation stage of reaching decisions. The problems can be overcome by modelling within SKAT, the Stages of Knowledge Ahead Theory. This paper points to experimental evidence in support of the view that, under all sorts of disequilibrating shocks, currency unions outperform flexible currencies by eliminating the inefficiencies generated by exchange rate uncertainty.

Keywords: optimal currency area; exchange rate regime; certainty effects; policy; beggar-thy-neighbour; SKAT, the Stages of Knowledge Ahead Theory; complexity; equilibrium; small world; shocks; expenditure-switching shocks; supply-side shocks; demand shocks; experiment; safety; international competitiveness.
Runtime-assisted cache coherence deactivation in task parallel programs
With increasing core counts, the scalability of directory-based cache coherence has become a challenging problem. To reduce the area and power needs of the directory, recent proposals reduce its size by classifying data as private or shared, and disable coherence for private data. However, existing classification methods suffer from inaccuracies and require complex hardware support with limited scalability.
This paper proposes a hardware/software co-designed approach: the runtime system identifies data that is guaranteed by the programming model semantics to not require coherence and notifies the microarchitecture. The microarchitecture deactivates coherence for this private data and powers off unused directory capacity. Our proposal reduces directory accesses to just 26% of the baseline system, and supports a 64x smaller directory with only 2.8% performance degradation. By dynamically calibrating the directory size, our proposal saves 86% of the dynamic energy consumption in the directory without harming performance.

This work has been supported by the RoMoL ERC Advanced Grant (GA 321253), by the European HiPEAC Network of Excellence, by the Spanish Ministry of Economy and Competitiveness (contract TIN2015-65316-P), by the Generalitat de Catalunya (contracts 2014-SGR-1051 and 2014-SGR-1272), and by the European Union's Horizon 2020 research and innovation programme (grant agreements 671697 and 779877). M. Moreto has been partially supported by the Spanish Ministry of Economy, Industry and Competitiveness under Ramon y Cajal fellowship number RYC-2016-21104.
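A schematic sketch of the software side of the co-design described above: classify memory regions from task dependence annotations and report regions touched by only one task (between synchronisation points) as not needing coherence. The region and task names are invented, and the actual runtime/microarchitecture notification interface is hardware-specific and not modelled here.

```python
def classify_regions(task_dependences):
    """Classify memory regions as 'private' or 'shared' from task dependence
    annotations: a region accessed by a single task needs no coherence and
    could be reported to the hardware so the directory skips it."""
    seen = {}
    for task_id, regions in task_dependences.items():
        for region in regions:
            seen.setdefault(region, set()).add(task_id)
    return {region: ("private" if len(tasks) == 1 else "shared")
            for region, tasks in seen.items()}

# Two tasks: region B is accessed by both, so only B keeps coherence.
deps = {"task0": ["A", "B"], "task1": ["B", "C"]}
for region, kind in classify_regions(deps).items():
    print(region, kind)
```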
Supply Chains and Porous Boundaries: The Disaggregation of Legal Services
The economic downturn has had significant effects on law firms, and is causing many of them to rethink some basic assumptions about how they operate. In important respects, however, the downturn has simply intensified the effects of some deeper trends that preceded it, which are likely to continue after any recovery that may occur.
This paper explores one of these trends, which is corporate client insistence that law firms “disaggregate” their services into discrete tasks that can be delegated to the least costly providers who can perform them. With advances in communications technology, there is increasing likelihood that some of these persons may be located outside the formal boundaries of the firm. This means that law firms may need increasingly to confront the make or buy decision that their corporate clients have regularly confronted for some time. The potential for vertical disintegration is a relatively recent development for legal services, but is well-established in other sectors of the global economy.
Empirical work in several disciplines has identified a number of issues that arise for organizations as the make or buy decision becomes a potentially more salient feature of their operations. Much of this work has focused in particular on the implications of relying on outsourcing as an integral part of the production process. This paper discusses research on: (1) the challenges of ensuring that work performed outside the firm is fully integrated into the production process; (2) coordinating projects for which networks of organizations are responsible; (3) managing the transfer of knowledge inside and outside of firms that are participants in a supply chain; and (4) addressing the impact of using contingent workers on an organization’s workforce, structure, and culture. A review of this research suggests considerations that law firms will need to assess if they begin significantly to extend the process of providing services beyond their formal boundaries. Discussing the research also is intended to introduce concepts that may become increasingly relevant to law firms, but which currently are not commonly used to analyze their operations. Considering how these concepts are applicable to law firms may prompt us to rethink how to conceptualize these firms and what they do.
This paper therefore is a preliminary attempt to explore: (1) the extent to which law firms may come to resemble the vertically disintegrated organizations that populate many other economic sectors, and (2) the potential implications of this trend for the provision of legal services, the trajectory of legal careers, and lawyers' sense of themselves as members of a distinct profession.