
    Graphical modelling language for specifying concurrency based on CSP

    This (shortened) paper introduces a graphical modelling language for specifying concurrency in software designs. The language notations are derived from CSP, and the resulting designs form CSP diagrams. The notations reflect both the data-flow and control-flow aspects of concurrent software architectures. These designs can automatically be described by CSP algebraic expressions that can be used for formal analysis; the designer does not have to be aware of the underlying mathematics. The techniques and rules presented provide guidance for the development of concurrent software architectures. One can detect and reason about compositional conflicts (errors in design), potential deadlocks (errors at run-time), and priority inversion problems (performance burdens) at a high level of abstraction. The CSP diagram approach is compatible with object-oriented modelling languages and structured methods

    Information security management in cloud computing:a case study

    Abstract. Organizations are quickly adopting cloud computing in their daily operations. As a result, spending on cloud security solutions is increasing as security threats shift to the cloud. Information security is a constant race against evolving security threats, and it must also advance to accommodate the adoption of cloud computing. The aim of this thesis is to investigate the topics and issues related to information security management in cloud computing environments. Related information security management issues include risk management, security technology selection, security investment decision-making, employees’ security policy compliance, security policy development, and security training. By interviewing three different types of actors (regular employees, IT security specialists, and security managers) in a large ICT-oriented company, this study gathers different viewpoints on these issues and provides suggestions on how to improve information security management in cloud computing environments. This study contributes to the community by attempting to give a holistic perspective on information security management in the specific setting of cloud computing. The results illustrate how investment decisions directly affect all other covered topics, which in turn affect one another, forming effective information security management

    The Measure of Success: Evaluating Corporate Citizenship Performance

    This is the second publication resulting from the Measure of Success research project, designed to explore the current corporate perspective on and practices in measuring corporate citizenship performance. It presents a discussion of the evolution of corporate citizenship and how the political debates have influenced measurement practice; and a discussion of measurement as a management tool in the design and implementation of corporate citizenship programmes

    An Inference about Interference: A Surprising Application of Existing International Law to Inhibit Anti-Satellite Weapons

    This article presents a thesis that most readers will find surprising, in an effort to develop a novel, simultaneous solution to three urgent, complex problems related to outer space. The three problems are: a) the technical fact that debris in outer space (the accumulated orbital junk produced by decades of space activities) has grown to present a serious hazard to safe and effective exploration and exploitation of space; b) the strategic fact that many countries (notably the United States, China and Russia, but others, too) continue to demonstrate a misguided interest in pursuing anti-satellite weapons, which can jeopardize the security of space; and c) the political fact that attempts to provide additional legal regulation of outer space (via new bilateral or multilateral international agreements) have failed, with little prospect for prompt conclusion of meaningful new accords. The proposed solution is to adapt existing international law in an unforeseen way. Specifically, numerous current and historical arms control treaties provide for verification of parties’ compliance via “national technical means” (NTM) of verification, which prominently include satellite-based sensory and communications systems. These treaties routinely provide protection for those essential space assets by requiring parties to undertake “not to interfere” with NTM. The argument developed here is that additional tests in space of debris-creating anti-satellite weapons would already be illegal, even without the conclusion of any dedicated new treaty against further weaponization of space, because in the current crowded conditions of space, a new cloud of orbital debris would, sooner or later, impermissibly interfere with NTM satellites. If sustained, this thesis can provide a new rationale for opposition to the development, testing, and use of anti-satellite weapons. 
It offers a legal reinforcement for the political instinct to avoid activities that further undercut the optimal usability of outer space, and it demonstrates how creative re-interpretation of existing legal provisions can promote the advancement of the rule of international law, even in circumstances where the articulation of new treaties is blocked

    Efficiency Improvements in the Quality Assurance Process for Data Races

    As the use of concurrency in software has gained importance in recent years, and is still rising, new types of defects have increasingly appeared in software. One of the most prominent and critical of these new defect types is the data race. Although research has increased the effectiveness of dynamic quality assurance regarding data races, the efficiency of the quality assurance process is still a factor preventing widespread practical application. First, the dynamic quality assurance techniques used for the detection of data races are inefficient: too much effort is needed to conduct dynamic quality assurance. Second, the techniques used for the analysis of reported data races are inefficient: too much effort is needed to analyze reported data races and identify issues in the source code. The goal of this thesis is to enable efficiency improvements in the process of quality assurance for data races by: (1) analyzing the representation of the dynamic behavior of a system under test; the results are used to focus the instrumentation of this system, resulting in a lower runtime overhead during test execution compared to full instrumentation of the system; (2) analyzing characteristics and preprocessing of reported data races; the results of the preprocessing are then provided to developers and quality assurance personnel, enabling an analysis and debugging process that is more efficient than the traditional analysis of data race reports. Apart from dynamic data race detection itself, which the solution complements, all steps in the process of dynamic quality assurance for data races are discussed in this thesis. The solution for analyzing UML Activities for nodes that may execute in parallel with other nodes or with themselves rests on a formal foundation in graph theory. A major problem solved in this thesis was the handling of cycles within UML Activities.
This thesis provides a dynamic limit for the number of cycle traversals, based on the elements of each UML Activity to be analyzed and their semantics. Formal proofs are provided for the creation of directed acyclic graphs and for their analysis to identify elements that may execute in parallel with other elements. Based on an examination of the characteristics of data races and data race reports, the results of dynamic data race detection are preprocessed, and the outcome of this preprocessing is presented to users for further analysis. This thesis further provides an exemplary application of the solution idea and of the results of analyzing UML Activities, as well as an exemplary examination of the efficiency improvement of dynamic data race detection, which showed a reduction in runtime overhead of 44% when using the focused instrumentation compared to full instrumentation. In addition, a controlled experiment was set up and conducted to examine the effects of the preprocessing of reported data races on the efficiency of analyzing data race reports. The results show that the solution presented in this thesis enables efficiency improvements in the analysis of data race reports of between 190% and 660% compared to traditional approaches. Finally, opportunities for future work are shown, which may enable a broader usage of the results of this thesis and further improvements in the efficiency of quality assurance for data races.

    “We are passionate about our social performance” Communicating CSR on the corporate website of a tobacco company

    Objectives of the study: The thesis investigates the CSR communication of a tobacco company on its corporate website. The study is focused on one case company. The aim is to examine how the case company manages the dilemma between its controversial business and CSR through its main communication tool, the corporate website. The research questions are formulated as follows: 1) How does the case company communicate about CSR on its website: which issues are emphasized, which textual and rhetorical features are used, and to what extent are existing social and political discourses replicated? 2) What are the expectations of the audience in terms of CSR from a tobacco company? 3) Is the case company’s CSR communication in line with the expectations of the audience? Methodology and theoretical framework: The methods for the study consisted of a critical discourse analysis of the textual material regarding CSR on the selected website, and a web survey. The analysis of the CSR communicative material was based on a model developed by Nielsen and Thomsen (2007), through which CSR communication is studied along four dimensions: perspective, stakeholder priorities, context, and the ambition level of the company. The web survey aimed to determine the opinions and expectations of the audience about CSR and tobacco companies. The results from the two parts of the analysis were compared in order to determine which elements of the case company’s CSR communication match audience expectations and which should be corrected. Findings and conclusions: The case company communicates extensively and in detail about CSR. The survey’s results show that this is advisable, in that respondents linked the credibility of the CSR information to data supporting CSR statements.
The CSR topics emphasized by the case company, namely communication of the health risks of smoking, tobacco regulation, environmental responsibility, employees, local communities and philanthropy, are in line with the expectations of the audience, but the importance given to the topics does not correspond to the preferences of the audience. The communication is based on the naturalization of neo-liberal principles, but fails to address aspects that have been recently questioned in the debate about the social responsibility of companies. Hence, some limitations are stressed regarding the case company’s ability to effectively communicate about CSR

    Second-Best Criminal Justice

    Criminal procedure reform can be understood as a “second-best” enterprise. The general theory of second best applies where an ingredient necessary for a “first-best” ordering is unattainable. That’s an apt description of the contemporary criminal process. Our normative ideals of criminal justice require fair and frequent trials to judge guilt or innocence, but the criminal trial rate has been falling for at least a century; today it is vanishingly close to zero. What may be even worse is how we’ve eliminated trials—by endowing prosecutors with enough leverage to coerce guilty pleas. Excessive prosecutorial leverage is the source of some of criminal procedure’s deepest pathologies. This Article asks the reader to accept—as a thought experiment—that a negligible trial rate is a constraint on criminal procedure reform in the near term. From that starting point, the crucial question becomes whether there is a less destructive way to ensure a negligible trial rate. There is: inefficiency. The road to a more just, humane, and rational criminal process could begin with making formal criminal litigation more inefficient. In matters of institutional design, the general theory of the second best counsels using unseemly practices, like inefficient procedure, to offset fixed constraints, like the absence of criminal trials. If the formal process of criminal litigation could be made unreasonably expensive for both parties, both would want to settle to avoid it. Policymakers would then be free to dismantle the tools of prosecutorial leverage—overlapping offenses, draconian sentencing laws, punitive pre-trial detention, and more—without worrying about increasing the trial rate. The result would not achieve our criminal justice ideals—no second-best solution can—but it could be better than the status quo. Without more trials, it may be the best we can do