4,474 research outputs found

    Comparative Analysis of Nuclear Event Investigation Methods, Tools and Techniques

    Feedback from operating experience is one of the key means of enhancing nuclear safety and operational risk management. The effectiveness of learning from experience at NPPs could be maximised if the best available event investigation practices, drawn from a range of methodologies, methods and tools in the form of a ‘toolbox’ approach, were promoted. Based on available sources of technical, scientific, normative and regulatory information, this study performed an inventory, review and brief comparative analysis of event investigation methods, tools and techniques either referenced or already used in the nuclear industry (with some examples from other high-risk industries). Its results, including the advantages and drawbacks identified for the different instruments, together with preliminary recommendations and conclusions, are covered in this report. The results of the comparative analysis presented in this interim report are of a preliminary character. To generate more concrete recommendations on selecting the most effective and appropriate event investigation methods and tools, new data from experienced practitioners in the nuclear industry and/or regulatory institutions are needed. These data are to be collected using a prepared questionnaire and a survey currently underway. This is the second step in carrying out an inventory of, reviewing, comparing and evaluating the most recent data on developments and systematic approaches in event investigation used by organisations (mainly utilities) in the EU Member States. Once the data from this survey are collected and analysed, the final recommendations and conclusions will be developed and presented in the final report on this topic. This should help current and prospective investigators to choose the most suitable and efficient event investigation methods and tools for their particular needs.
    JRC.DDG.F.5-Safety of present nuclear reactor

    Application and Selection of Nuclear Event Investigation Methods, Tools and Techniques

    This final report presents the results gained in the last step of a project launched within the technical task “Comparative study of event assessment methodologies with recommendations for an optimized approach in the EU” for 2010–2011. It contains an analysis and evaluation of existing practices, along with conclusions and recommendations developed to facilitate the selection and implementation of relevant instruments for improving the quality of nuclear event investigations. The study is based on the results of a survey conducted with experts from nuclear power plants and regulatory bodies representing 12 European countries and the USA. The main attention was paid to the analysis of existing practices and organizational aspects of nuclear event investigation, as well as to the qualitative evaluation of currently employed event investigation methods, tools and techniques. Methodological inconsistencies, existing barriers, bottlenecks and emerging traps in the event investigation process were identified and analyzed in greater depth. To avoid existing ambiguities and misunderstandings in the system of terms and definitions used in event investigation, and in root cause analysis in particular, several improved definitions were suggested. To better distinguish event investigation instruments of different levels and applicability, an original classification system for basic root cause analysis methods and tools was proposed. This system should facilitate the comparison of characteristics and the selection of the most relevant instruments for event investigation. Methodological and practical recommendations on how to conduct analysis while overcoming the identified obstacles were put forward, and detailed recommendations for the selection of root cause analysis methods and tools are presented as well.
    JRC.F.5-Nuclear Reactor Safety Assessmen

    Space is the machine, part four: theoretical syntheses

    Part IV of the book, ‘Theoretical Syntheses’, begins to draw together some of the questions raised in Part I, the regularities shown in Part II and the laws proposed in Part III, to suggest how the two central problems in architectural theory, namely the form-function problem and the form-meaning problem, can be reconceptualised. Chapter 10, ‘Space is the machine’, reviews the form-function theory in architecture and attempts to establish a pathology of its formulation: how it came to be set up in such a way that it could not be solved. It then proposes how the configuration paradigm permits a reformulation, through which we can not only make sense of the relation between form and function in buildings, but also make sense of how and why buildings are, in a powerful sense, ‘social objects’ and in fact play a powerful role in the realisation and sustaining of human society. Finally, in Chapter 11, ‘The reasoning art’, the notion of configuration is applied to the study of what architects do, that is, design. Previous models of the design process are reviewed, and it is shown that without knowledge of configuration and the concept of the non-discursive, we cannot understand the internalities of the design process. A new knowledge-based model of design is proposed, with configuration at its centre. It is argued from this that because design is a configurational process, and because it is the characteristic of configuration that local changes make global differences, design is necessarily a top-down process. This does not mean that it cannot be analysed, or supported by research. It shows, however, that only configurationally biased knowledge can really support the design process, and this, essentially, is theoretical knowledge. It follows from this that attempts to support designers by building methods and systems for bottom-up construction of designs must eventually fail as explanatory systems. They can serve to create specific architectural identities, but not to advance general architectural understanding. In pursuing an analytic rather than a normative theory of architecture, the book might be thought by some to have pretensions to make the art of architecture into a science. This is not what is intended. One effect of a better scientific understanding of architecture is to show that although architecture as a phenomenon is capable of considerable scientific understanding, this does not mean that as a practice architecture is not an art. On the contrary, it shows quite clearly why it is an art and what the nature and limits of that art are. Architecture is an art because, although in key respects its forms can be analysed and understood by scientific means, its forms can only be prescribed by scientific means in a very restricted sense. Architecture is law-governed but it is not determinate. What is governed by the laws is not the form of individual buildings but the field of possibility within which the choice of form is made. This means that the impact of these laws on the passage from problem statement to solution is not direct but indirect. It lies deep in the spatial and physical forms of buildings, in their genotypes, not their phenotypes.
Architecture is therefore not part art, and part science, in the sense that it has both technical and aesthetic aspects, but is both art and science in the sense that it requires both the processes of abstraction by which we know science and the processes of concretion by which we know art. The architect as scientist and as theorist seeks to establish the laws of the spatial and formal materials with which the architect as artist then composes. The greater scientific content of architecture over art is simply a function of the far greater complexity of the raw materials of space and form, and their far greater reverberations for other aspects of life, than any materials that an artist uses. It is the fact that the architect designs with the spatial stuff of living that builds the science of architecture into the art of architecture. It may seem curious to argue that the quest for a scientific understanding of architecture does not lead to the conclusion that architecture is a science, but nevertheless it is the case. In the last analysis, architectural theory is a matter of understanding architecture as a system of possibilities, and how these are restricted by laws which link this system of possibilities to the spatial potentialities of human life. At this level, and perhaps only at this level, architecture is analogous to language. Language is often naïvely conceptualised as a set of words and meanings, set out in a dictionary, and syntactic rules by which they may be combined into meaningful sentences, set out in grammars. This is not what language is, and the laws that govern language are not of this kind. This can be seen from the simple fact that if we take the words of the dictionary and combine them in grammatically correct sentences, virtually all are utterly meaningless and do not count as legitimate sentences. The structures of language are the laws which restrict the combinatorial possibilities of words, and through these restrictions construct the sayable and the meaningful. The laws of language do not therefore tell us what to say, but prescribe the structure and limits of the sayable. It is within these limits that we use language as the prime means to our individuality and creativity. In this sense architecture does resemble language. The laws of the field of architecture do not tell designers what to do. By restricting and structuring the field of combinatorial possibility, they prescribe the limits within which architecture is possible. As with language, what is left from this restrictive structuring is rich beyond imagination. Even so, without these laws buildings would not be human products, any more than meaningless but syntactically correct concatenations of words are human sentences. The case for a theoretical understanding of architecture then rests eventually not on aspiration to philosophical or scientific status, but on the nature of architecture itself. The foundational proposition of the book is that architecture is an inherently theoretical subject. The very act of building raises issues about the relations of the form of the material world and the way in which we live in it which (as any archaeologist knows who has tried to puzzle out a culture from material remains) are unavoidably both philosophical and scientific. Architecture is the most everyday, the most enveloping, the largest and the most culturally determined human artefact. The act of building implies the transmission of cultural conventions answering these questions through custom and habit. 
Architecture is their rendering explicit, and their transmutation into a realm of innovation and, at its best, of art. In a sense, architecture is abstract thought applied to building, and therefore, in a sense, theory applied to building. This is why, in the end, architecture must have analytic theories.

    LAI Whitepaper Series: “Lean Product Development for Practitioners”: Program Management for Large Scale Engineering Programs

    The whitepaper begins by introducing the challenges of programs in section 4, proceeds to define program management in section 5 and then gives an overview of existing program management frameworks in section 6. In section 7, we introduce a new program management framework that is tailored towards describing the early program management phases – up to the start of production. This framework is used in section 8 to summarize the relevant LAI research.

    Isolated systems with wind power. Main report

    The overall objective of this research project is to study the development of methods and guidelines, rather than "universal solutions", for the use of wind energy in isolated communities. The main specific objective of the project is to develop and present a more unified and generally applicable approach for assessing the technical and economic feasibility of isolated power supply systems with wind energy. As part of the project, the following tasks were carried out: a review of the literature, field measurements in Egypt, development of an inventory of small isolated systems, an overview of end-user demands, analysis of findings and development of proposed guidelines. The project is reported in one main report and four topical reports, all of them issued as Risø reports. This is the Main Report, Risø-R-1256, summing up the activities and findings of the project and outlining an Implementation Strategy for Isolated Systems with Wind Power, applicable for international organisations such as donor agencies and development banks.

    Program Management for Large Scale Engineering Programs

    The goal of this whitepaper is to summarize the LAI research that applies to program management. The context of most of the research discussed in this whitepaper is large-scale engineering programs, particularly in the aerospace & defense sector. The main objective is to make a large number of LAI publications – around 120 – accessible to industry practitioners by grouping them along major program management activities. Our goal is to provide starting points for program managers, program management staff and system engineers to explore the knowledge accumulated by LAI and discover new thoughts and practical guidance for their everyday challenges. The whitepaper begins by introducing the challenges of programs in section 4, proceeds to define program management in section 5 and then gives an overview of existing program management frameworks in section 6. In section 7, we introduce a new program management framework that is tailored towards describing the early program management phases – up to the start of production. This framework is used in section 8 to summarize the relevant LAI research.

    Improving the odds of drug development success through human genomics: modelling study.

    Lack of efficacy in the intended disease indication is the major cause of clinical phase drug development failure. Explanations could include the poor external validity of pre-clinical (cell, tissue, and animal) models of human disease and the high false discovery rate (FDR) in preclinical science. FDR is related to the proportion of true relationships available for discovery (γ), and the type 1 (false-positive) and type 2 (false-negative) error rates of the experiments designed to uncover them. We estimated the FDR in preclinical science, its effect on drug development success rates, and improvements expected from use of human genomics rather than preclinical studies as the primary source of evidence for drug target identification. Calculations were based on a sample space defined by all human diseases (the 'disease-ome'), represented as columns, and all protein-coding genes (the 'protein-coding genome'), represented as rows, producing a matrix of unique gene- (or protein-) disease pairings. We parameterised the space based on 10,000 diseases, 20,000 protein-coding genes, 100 causal genes per disease and 4000 genes encoding druggable targets, examining the effect of varying the parameters and a range of underlying assumptions on the inferences drawn. We estimated γ, defined mathematical relationships between preclinical FDR and drug development success rates, and estimated improvements in success rates based on human genomics (rather than orthodox preclinical studies). Around one in every 200 protein-disease pairings was estimated to be causal (γ = 0.005), giving an FDR in preclinical research of 92.6%, which likely makes a major contribution to the reported drug development failure rate of 96%. The observed success rate was only slightly greater than expected for a random pick from the sample space. Values for γ back-calculated from reported preclinical and clinical drug development success rates were also close to the a priori estimates. Substituting genome-wide (or druggable genome-wide) association studies for preclinical studies as the major information source for drug target identification was estimated to reverse the probability of late stage failure because of the more stringent type 1 error rate employed and the ability to interrogate every potential druggable target in the same experiment. Genetic studies conducted at much larger scale, with greater resolution of disease end-points, e.g. by connecting genomics and electronic health record data within healthcare systems, have the potential to produce a radical improvement in drug development success rates.
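    A quick back-of-the-envelope check of the quoted figures (a sketch, not the paper's actual code): with 100 causal genes per disease out of 20,000 protein-coding genes, γ = 0.005; assuming a conventional type 1 error rate of 0.05 and power of 0.8 – values not stated in the abstract – the standard false discovery rate formula FDR = α(1 − γ) / (α(1 − γ) + (1 − β)γ) reproduces the reported 92.6%.

        # Python sketch; alpha (type 1 error rate) and power are assumed conventional
        # values (0.05 and 0.8), not figures taken from the paper.
        def false_discovery_rate(gamma, alpha=0.05, power=0.8):
            """Expected share of positive findings that are false, given the prior
            probability gamma that a tested gene-disease pairing is truly causal."""
            false_pos = alpha * (1 - gamma)   # non-causal pairings that test positive
            true_pos = power * gamma          # causal pairings correctly detected
            return false_pos / (false_pos + true_pos)

        gamma = 100 / 20_000                  # causal genes per disease / protein-coding genes
        print(f"gamma = {gamma}")             # 0.005
        print(f"FDR   = {false_discovery_rate(gamma):.1%}")   # 92.6%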

    Synthesizing TSCA and REACH: Practical Principles for Chemical Regulation Reform

    The European Union's newly enacted comprehensive regulation for industrial chemicals, known as REACH, draws heavily on three decades of experience in the United States under the Toxic Substances Control Act. Much of that experience has been negative, inasmuch as TSCA is widely regarded as a disappointment among US environmental laws, and so REACH deliberately reverses many of the legislative choices that Congress made in TSCA. REACH also takes advantage of important new regulatory concepts that were not available to the framers of TSCA thirty years ago. The passage of REACH has sparked renewed interest in reforming TSCA, and the reformers will undoubtedly look to REACH for ideas. This article contends that, while many aspects of REACH can fairly be understood as the Anti-TSCA, on closer examination REACH follows many of TSCA's fundamental approaches to chemicals regulation. The fundamental similarities offer a unique opportunity to develop a synthesis of the two regulatory regimes, which could form the practical basis for updating TSCA. While reform based on a synthesis of TSCA and REACH would be evolutionary rather than revolutionary, it could nevertheless greatly improve chemicals regulation in the US. This article offers principles for such reform. The article concludes with a discussion of the global impact of national regulatory systems like REACH.

    Data analytics and algorithms in policing in England and Wales: Towards a new policy framework

    RUSI was commissioned by the Centre for Data Ethics and Innovation (CDEI) to conduct an independent study into the use of data analytics by police forces in England and Wales, with a focus on algorithmic bias. The primary purpose of the project is to inform CDEI’s review of bias in algorithmic decision-making, which is focusing on four sectors, including policing, and working towards a draft framework for the ethical development and deployment of data analytics tools for policing. This paper focuses on advanced algorithms used by the police to derive insights, inform operational decision-making or make predictions. Biometric technology, including live facial recognition, DNA analysis and fingerprint matching, is outside the direct scope of this study, as are covert surveillance capabilities and digital forensics technology, such as mobile phone data extraction and computer forensics. However, because many of the policy issues discussed in this paper stem from general underlying data protection and human rights frameworks, these issues will also be relevant to other police technologies, and their use must be considered in parallel with the tools examined in this paper. The project involved engaging closely with senior police officers, government officials, academics, legal experts, regulatory and oversight bodies and civil society organisations. Sixty-nine participants took part in the research in the form of semi-structured interviews, focus groups and roundtable discussions. The project has revealed widespread concern across the UK law enforcement community regarding the lack of official national guidance for the use of algorithms in policing, with respondents suggesting that this gap should be addressed as a matter of urgency. Any future policy framework should be principles-based and complement existing police guidance in a ‘tech-agnostic’ way. Rather than establishing prescriptive rules and standards for different data technologies, the framework should establish standardised processes to ensure that data analytics projects follow recommended routes for the empirical evaluation of algorithms within their operational context and evaluate the project against legal requirements and ethical standards. The new guidance should focus on ensuring multi-disciplinary legal, ethical and operational input from the outset of a police technology project; a standard process for model development, testing and evaluation; a clear focus on the human–machine interaction and the ultimate interventions a data-driven process may inform; and ongoing tracking and mitigation of discrimination risk.
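    As a purely illustrative sketch of what ongoing tracking of discrimination risk could look like in practice – the metric, record format and 20% tolerance below are assumptions for the example, not drawn from the RUSI/CDEI study – one simple check compares a predictive tool's false-positive rates across demographic groups:

        # Python sketch; group labels, outcomes and the tolerance are invented for illustration.
        from collections import defaultdict

        def false_positive_rate_by_group(records):
            """records: iterable of (group, predicted_high_risk, reoffended) tuples."""
            fp = defaultdict(int)    # flagged high risk but did not reoffend
            neg = defaultdict(int)   # everyone who did not reoffend
            for group, predicted, actual in records:
                if not actual:
                    neg[group] += 1
                    if predicted:
                        fp[group] += 1
            return {g: fp[g] / neg[g] for g in neg}

        rates = false_positive_rate_by_group([
            ("A", True, False), ("A", False, False), ("A", False, False),
            ("B", True, False), ("B", True, False), ("B", False, False),
        ])
        print("False-positive rate by group:", rates)
        if max(rates.values()) - min(rates.values()) > 0.20:   # assumed tolerance
            print("Disparity exceeds tolerance - flag model for review")

    A check of this kind could be run periodically on live data as one concrete form of the ongoing monitoring the paper recommends.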

    Pension Reform, Retirement and Life-Cycle Unemployment

    The labor market effects of pension reform stem from retirement behavior and from the job search and hours worked of prime-age workers. This paper investigates the impact of four often-proposed policy measures for sustainable pensions: strengthening the tax-benefit link, moving from wage to price indexation of benefits, lengthening calculation periods, and introducing more actuarial fairness in pension assessment. We provide some analytical results and use a computational model to demonstrate the economic and welfare impact of recent pension reform in Austria.
    Keywords: Pension Reform, Retirement, Job Search, Life-cycle Unemployment