
    Analyzing competitiveness of automotive industry through cumulative belief degrees

    Ülengin, Füsun (Dogus Author) -- Önsel, Şule (Dogus Author) -- Kabak, Özgür (Dogus Author)
    Conference full title: 10th International Fuzzy Logic and Intelligent Technologies in Nuclear Science Conference, FLINS 2012; Istanbul, Turkey; 26 August 2012 through 29 August 2012
    This study aims to analyze the automotive industry from a competitiveness perspective using a novel cumulative belief degrees (CBD) approach. For this purpose, a mathematical model based on CBD is proposed to quantify the relations among the variables in a system. This model is used to analyze the Turkish Automotive Industry through scenario analysis. SEDEFED (Federation of Industrial Associations), REF (TÜSİAD Sabanci University Competitiveness Forum), and OSD (Automotive Manufacturers Association…
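
    The abstract names the CBD machinery without detail. Purely as an illustration of the cumulative-belief idea, and not the authors' actual model, the sketch below turns a belief distribution over an assumed five-point linguistic scale into cumulative belief degrees; the scale labels and numbers are hypothetical.

        # Hypothetical sketch of cumulative belief degrees (CBD); not the paper's model.
        # A belief structure assigns degrees of belief to an ordered evaluation scale;
        # the cumulative form at level k aggregates belief in level k *or better*.

        SCALE = ["very low", "low", "medium", "high", "very high"]  # assumed 5-point scale

        def cumulative_belief(beliefs):
            """beliefs[i] = degree of belief that an indicator sits at SCALE[i].
            Returns cbd[i] = total belief that the indicator is at SCALE[i] or above."""
            assert abs(sum(beliefs) - 1.0) < 1e-9, "belief degrees should sum to 1"
            cbd, total = [], 0.0
            for b in reversed(beliefs):      # accumulate from the top of the scale down
                total += b
                cbd.append(total)
            return list(reversed(cbd))

        # Example: an indicator judged mostly "high", with some residual uncertainty.
        print(cumulative_belief([0.0, 0.1, 0.2, 0.5, 0.2]))
        # -> [1.0, 1.0, 0.9, 0.7, 0.2]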

    The Problem of Prevention

    prevention; accidents; Volunteer's Dilemma; learning; career concerns
    Many disasters are foreshadowed by insufficient preventative care. In this paper, we argue that there is a true problem of prevention, in that insufficient care is often the result of rational calculations on the part of agents. We identify three factors that lead to deficient care. First, when the objective risks of a disaster are poorly understood, positive experiences may lead to an underestimation of those risks and a corresponding underinvestment in prevention. Second, redundancies designed for safety may lead agents to take substandard care. Finally, elected officials have an incentive to underinvest in prevention for some disasters, especially those that are relatively unlikely.
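
    The redundancy factor is the Volunteer's Dilemma named in the keywords. As a hedged illustration (the paper's own model may differ), the symmetric mixed-strategy equilibrium below shows each agent's probability of taking care falling as the number of redundant agents grows; the parameter values are hypothetical.

        # Hypothetical illustration of the redundancy effect via the Volunteer's Dilemma;
        # not the paper's model. Each of n agents gets benefit v if at least one of them
        # volunteers (takes care); volunteering costs c, with 0 < c < v.
        #
        # In the symmetric mixed equilibrium each agent is indifferent between volunteering
        # (payoff v - c) and abstaining (payoff v * P(someone else volunteers)), which gives
        # P(abstain) = (c / v) ** (1 / (n - 1)).

        def volunteer_probability(n, v=1.0, c=0.25):
            """Equilibrium probability that any single agent takes preventative care."""
            return 1.0 - (c / v) ** (1.0 / (n - 1))

        for n in (2, 3, 5, 10):
            p = volunteer_probability(n)
            print(f"n={n:2d} agents: each volunteers with p={p:.3f}")
        # Individual care falls as n grows: redundancy crowds out effort.

    The overall probability that no one takes care, (c/v)^(n/(n-1)), also rises with n, which is the sense in which safety redundancy can rationally erode care.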

    Towards operational measures of computer security

    Ideally, a measure of the security of a system should capture quantitatively the intuitive notion of ‘the ability of the system to resist attack’. That is, it should be operational, reflecting the degree to which the system can be expected to remain free of security breaches under particular conditions of operation (including attack). Instead, current security levels at best merely reflect the extensiveness of safeguards introduced during the design and development of a system. Whilst we might expect a system developed to a higher level than another to exhibit ‘more secure behaviour’ in operation, this cannot be guaranteed; more particularly, we cannot infer what the actual security behaviour will be from knowledge of such a level. In the paper we discuss similarities between reliability and security with the intention of working towards measures of ‘operational security’ similar to those that we have for reliability of systems. Very informally, these measures could involve expressions such as the rate of occurrence of security breaches (cf. rate of occurrence of failures in reliability), or the probability that a specified ‘mission’ can be accomplished without a security breach (cf. reliability function). This new approach is based on the analogy between system failure and security breach. A number of other analogies to support this view are introduced. We examine this duality critically, and have identified a number of important open questions that need to be answered before this quantitative approach can be taken further. The work described here is therefore somewhat tentative, and one of our major intentions is to invite discussion about the plausibility and feasibility of this new approach.
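
    Taken at face value, the reliability analogy suggests measures of the following form; this is a sketch under an assumed stochastic model, not notation from the paper itself.

        % Sketch of the reliability analogy under an assumed stochastic model;
        % not the paper's notation. Let T denote the (random) time to the first
        % security breach under fixed operational conditions (including attack).
        \[
          R_{\mathrm{sec}}(t) = \Pr(T > t)
          \quad \text{(probability of a breach-free ``mission'' of length } t \text{; cf.\ the reliability function)}
        \]
        \[
          \lambda_{\mathrm{sec}}(t) = -\frac{\mathrm{d}}{\mathrm{d}t}\,\ln R_{\mathrm{sec}}(t)
          \quad \text{(rate of occurrence of breaches; cf.\ the failure rate)}
        \]
        % Under the simplest assumption of a constant breach rate \lambda,
        % R_sec(t) = e^{-\lambda t} and the mean time to breach is 1/\lambda.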

    Planning for Excellence: Insights from an International Review of Regulators’ Strategic Plans

    What constitutes regulatory excellence? Answering this question is an indispensable first step for any public regulatory agency that is measuring, striving towards, and, ultimately, achieving excellence. One useful way to answer this question would be to draw on the broader literature on regulatory design, enforcement, and management. But perhaps a more authentic way would be to look at how regulators themselves define excellence. However, we actually know remarkably little about how the regulatory officials who are immersed in the task of regulation conceive of their own success. In this Article, we investigate regulators’ definitions of regulatory excellence by drawing on a unique source of data that provides an important window on regulators’ own aspirations: their strategic plans. Strategic plans have been required or voluntarily undertaken for the past decade or longer by regulators around the globe. In these plans, regulators offer mission statements, strategic goals, and measurable and achievable outcomes, all of which indicate what regulators value and are striving to become. Occasionally, they even state explicitly where they have fallen short of “best-in-class” status and how they intend to improve. To date, a voluminous literature exists examining agency practices in strategic planning, but we are aware of no study that tries to glean from the substance of a sizeable number of plans how regulators themselves construe regulatory excellence. The main task of this Article is to undertake this effort. This Article draws on twenty plans from different regulators in nine countries. We found most generally that excellent regulators describe themselves (though not necessarily using exactly these words) as institutions that are more (1) efficient, (2) educative, (3) multiplicative, (4) proportional, (5) vital, (6) just, and (7) honest. In addition to these seven shared attribute categories, our reading of the plans also revealed five other “unusual” attributes that only one or two agencies mentioned. Beyond merely cataloguing the attributes identified by agencies, this Article also discusses commonalities (and differences) between plan structures, emphases, and framings. We found that the plans differed widely in features such as the specificity of their mission statements, the extent to which they emphasized actions over outcomes (or vice versa), and the extent to which commitments were organized along organizational fiefdoms or cut across bureaucratic lines. We urge future scholarship to explore alternative methods of text mining, and to study strategic plans over time within agencies, in order to track how agencies’ notions of regulatory excellence respond to changes in the regulatory context and the larger circumstances within which agencies operate. Looking longitudinally will also shed light on how agencies handle strategic goals that are either met or prove to be unattainable.

    Science or Security: The Future of the Free Flow of Scientific Information in the Age of Terror

    Politically or ideologically motivated speech has been the primary focus of much of the recent political, legal, and academic debate on restrictions on speech imposed as a reaction to perceived threats to national and international security. However, restrictions imposed on informing speech as a response to the threat of terrorism raise equally serious concerns. The development of the body of knowledge relies on the free flow of information, including persuasive speech. Since the terrorist attacks of September 11 and the subsequent anthrax attacks in the US, the issue of censorship of scientific information has been the subject of debate in both government and scientific circles.

    A Safeguards Design Strategy for Domestic Nuclear Materials Processing Facilities.

    The outdated and oversized nuclear manufacturing complex within the United States requires transformation into a smaller, safe, and secure enterprise. Health and safety risks, environmental concerns, and the end of the Cold War have all contributed to this necessity. The events of September 11, 2001, emphasized the protection requirements for nuclear materials within the U.S. as well as abroad. Current Nuclear Safeguards regulations contain minimal prescriptive requirements relating to the design of new production facilities. Project management and engineering design guides require that design documents contain specific and measurable statements of systems requirements. The systems engineering process evaluates alternatives for an effective and integrated solution during project design. This work proposes and justifies a Safeguards Design Strategy for domestic nuclear materials processing facilities, based upon a core framework of safeguards regulatory programmatic elements that also draws on the prescriptive requirements and similar goals of safety, health, and physical security regulations.

    Time-Interval Analysis for Radiation Monitoring

    On-line radiation monitoring is essential to the U.S. Department of Energy (DOE) Environmental Management Science Program for assessing the impact of contaminated media at DOE sites. The goal of on-line radiation monitoring is to quickly detect small or abrupt changes in activity levels in the presence of a significant ambient background. The focus of this research is on developing effective statistical algorithms that meet this goal using time-interval data (the time differences between consecutive radiation pulses). Compared to the more commonly used count data, which are registered over a fixed count time, time-interval data have the potential to reduce the sampling time required to obtain statistically sufficient information to detect changes in radiation levels.

    This dissertation is organized into three sections based on three statistical methods: the sequential probability ratio test (SPRT), Bayesian statistics, and the cumulative sum (CUSUM) control chart. In each section, time-interval analysis based on one of the three methods was investigated and compared to conventional analyses based on count data in terms of average run length (ARL, the average time to detect a change in radiation levels) and detection probability, using both experimental and simulated data. The experimental data were acquired with a DGF-4C (XIA, Inc.) system in list mode. Simulated data were obtained by using Monte Carlo techniques to draw random samples from a Poisson process. Statistical algorithms were developed using the statistical software package R and the programming functions built into the data analysis environment IGOR Pro 4.03.

    Overall, the results showed that the statistical analyses based on time-interval data provided similar or higher detection probabilities than the corresponding analyses based on count data, and at relatively high radiation levels could make a detection more quickly, with fewer pulses. To increase the detection probability and further reduce the time needed to detect a change in radiation levels, modifications were proposed for each of the three statistical methods. Adjusting the preset background level parameter in the SPRT could reduce the average time to detect a source by 50%. The enhanced-reset and moving-prior modifications proposed for the Bayesian analysis of time intervals yielded a higher detection probability than the unmodified Bayesian analysis, and were independent of the amount of background data registered before a radioactive source was present. The robust CUSUM control chart, coupled with a modified runs rule, further reduced the ARL while keeping the false positive rate at a required level; for example, the ARL was about 40% shorter than that of the standard time-interval CUSUM control chart at 10.0 cps against a background count rate of 2.0 cps.

    The statistical algorithms developed for time-interval data analysis demonstrate the feasibility and versatility of this approach for on-line radiation monitoring. The special properties of time-interval information provide an alternative for low-level radiation monitoring, and these findings establish an important base for future on-line monitoring applications in which time-interval data are registered.
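
    As a rough sketch of the time-interval idea (an assumed exponential inter-arrival model, with rates taken from the abstract's 2.0 cps background and 10.0 cps example; the function names are hypothetical, not the dissertation's code), an SPRT on successive pulse intervals accumulates the log-likelihood ratio between a background-only rate and an elevated rate:

        # Hypothetical sketch of an SPRT on time-interval data; not the dissertation's code.
        # Inter-arrival times of a Poisson pulse stream are exponential, so each interval t
        # contributes log(lam1/lam0) - (lam1 - lam0) * t to the log-likelihood ratio between
        # H1 (background + source, rate lam1) and H0 (background only, rate lam0).
        import math
        import random

        def sprt_time_interval(intervals, lam0=2.0, lam1=10.0, alpha=0.01, beta=0.01):
            """Return ('H1' or 'H0', number of pulses used) for a stream of intervals in seconds."""
            upper = math.log((1 - beta) / alpha)   # cross above: accept H1 (source present)
            lower = math.log(beta / (1 - alpha))   # cross below: accept H0 (background only)
            llr = 0.0
            for n, t in enumerate(intervals, start=1):
                llr += math.log(lam1 / lam0) - (lam1 - lam0) * t
                if llr >= upper:
                    return "H1", n
                if llr <= lower:
                    return "H0", n
            return "undecided", len(intervals)

        # Simulated check: exponential intervals at the background rate vs. the source rate.
        random.seed(1)
        bg = [random.expovariate(2.0) for _ in range(500)]    # 2.0 cps background
        src = [random.expovariate(10.0) for _ in range(500)]  # 10.0 cps with source present
        print(sprt_time_interval(bg))   # decides 'H0' after a handful of pulses
        print(sprt_time_interval(src))  # decides 'H1' after a handful of pulses

    Wald's thresholds fix the nominal error rates alpha and beta in advance, which is why such a test can terminate after only a few pulses when the true rate is far from background, consistent with the quick-detection behaviour the abstract reports.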