    Developing a GIS-Database and Risk Index for Potentially Polluting Marine Sites

    The increasing availability of geospatial marine data provides an opportunity for hydrographic offices to contribute to the identification of “Potentially Polluting Marine Sites” (PPMS). These include shipwrecks, oil rigs, pipelines, and dumping areas. To adequately assess the environmental risk of these sites, relevant information must be collected and converted into a multi-scale geodatabase suitable for site inventory and geospatial analysis. In addition, a Risk Index, representing an assessment of the magnitude of risk associated with any site, can be derived to determine the potential impacts of these PPMS. However, the successful collection and integration of PPMS information requires some effort to normalize and standardize the data based on recognized international standards. In particular, there is benefit in structuring the data in conformance with the Universal Hydrographic Data Model (IHO S-100) recently adopted by the International Hydrographic Organization. In this paper, an S-100 compliant product specification for a PPMS geospatial database and an associated Marine Site Risk Index is proposed, which can be used by national hydrographic offices and marine protection agencies.
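The general shape of a site record plus weighted-sum risk index can be sketched as follows. This is a minimal illustration only: the attribute names, weights, and normalization constant are invented for this sketch and are not drawn from the S-100 product specification the paper proposes.

```python
from dataclasses import dataclass

@dataclass
class MarineSite:
    site_type: str            # e.g. "shipwreck", "pipeline", "dumping_area"
    pollutant_volume: float   # estimated tonnes of pollutant on site
    hull_integrity: float     # 0.0 (intact) .. 1.0 (fully degraded)
    env_sensitivity: float    # 0.0 .. 1.0 sensitivity of surrounding waters

def risk_index(site: MarineSite) -> float:
    """Combine normalized factors into a single 0..100 risk score."""
    # Normalize volume against an assumed 10,000 t reference cargo.
    volume_factor = min(site.pollutant_volume / 10_000.0, 1.0)
    # Weighted sum; the weights stand in for expert-derived values.
    score = (0.4 * volume_factor
             + 0.3 * site.hull_integrity
             + 0.3 * site.env_sensitivity)
    return round(100.0 * score, 1)

wreck = MarineSite("shipwreck", pollutant_volume=5_000,
                   hull_integrity=0.8, env_sensitivity=0.9)
print(risk_index(wreck))  # 71.0
```

A real index of this kind would typically be calibrated against expert judgment per site type; the point of the sketch is only that each site record carries the attributes the index needs, so the score can be recomputed as survey data improves.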

    Efficient RDF Interchange (ERI) format for RDF data streams

    RDF streams are sequences of timestamped RDF statements or graphs, which can be generated by several types of data sources (sensors, social networks, etc.). They may provide data at high volumes and rates, and be consumed by applications that require real-time responses. Hence it is important to publish and interchange them efficiently. In this paper, we exploit a key feature of RDF data streams, namely the regularity of their structure and data values, proposing a compressed, Efficient RDF Interchange (ERI) format that can reduce the amount of data transmitted when processing RDF streams. Our experimental evaluation shows that our format achieves state-of-the-art streaming compression while remaining efficient in processing performance.
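The structural regularity the abstract refers to can be illustrated with a toy encoder: recurring predicate patterns in the stream are stored once in a dictionary and referenced by ID, so only the changing values travel with each item. This sketch is an illustration of the general idea, not the actual ERI wire format.

```python
def encode_stream(triples):
    """Encode (subject, predicate, object) triples as structure IDs plus values."""
    structures = {}  # predicate pattern -> numeric ID, sent once per stream
    encoded = []
    for subject, predicate, obj in triples:
        if predicate not in structures:
            structures[predicate] = len(structures)
        # After the first occurrence, the predicate travels as a small integer.
        encoded.append((structures[predicate], subject, obj))
    return structures, encoded

# Sensor readings share the same two predicates over and over (hypothetical data).
stream = [
    ("sensor1", "obs:temperature", "21.3"),
    ("sensor1", "obs:humidity", "40"),
    ("sensor2", "obs:temperature", "19.8"),
    ("sensor2", "obs:humidity", "43"),
]
structures, encoded = encode_stream(stream)
print(structures)  # {'obs:temperature': 0, 'obs:humidity': 1}
print(encoded[2])  # (0, 'sensor2', '19.8')
```

Because sensor streams repeat the same handful of predicates indefinitely, the dictionary cost is paid once while every subsequent item shrinks, which is the intuition behind structure-aware stream compression.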

    Risk Disclosure and Re-establishing Legitimacy in the Event of a Crisis - Did Northern Rock Use Risk Disclosure to Repair Legitimacy after their 2007 Collapse?

    Banks are exposed to a wide range of risks in their everyday operations, and in response they have developed various tools and strategies to help avoid, measure, or manage these risks. These tools and strategies are not always successful, which has led to several well-publicised crises including, among others, the Barings Bank collapse, the Allfirst fraud, the Santander fraud and, more recently, the collapse of Northern Rock and its subsequent nationalisation. For society to permit their operation, firms require legitimacy, whereby their actions must conform to cultural and social norms (Suchman 1995). Legitimacy and trust are vitally important and central to bank operations (Linsley and Kajuter 2008); society is therefore more likely to hold banks to account (Ashforth and Gibbs 1990), meaning that following a legitimacy-damaging risk event or crisis a banking organisation must enact strategies to repair and re-establish trust. Whilst legitimacy theory and the use of voluntary disclosures as part of a strategy for restoring organisational legitimacy and reputation have received academic attention (Linsley and Kajuter 2008), there has been limited research on banking disclosures (Linsley and Shrives 2006) and even less on disclosures issued in response to an adverse risk event or crisis. The only previous study in this area (see Linsley and Kajuter 2008) focused solely on disclosures located in the annual report, but its concluding remarks identified that the annual report is not the only risk disclosure vehicle and that future research should consider disclosures issued through alternative communication methods. This study therefore examines the use of disclosures in alternative communication methods (as suggested by Linsley and Kajuter (2008)) in the event of an organisational crisis.
The research aims to add to the still-evolving academic understanding of the use of risk disclosure, as well as any part it may play in organisational legitimacy-repairing strategies. This exploratory research may also help to identify possible new areas for further study. The study will attempt to achieve these aims by comparing and contrasting Northern Rock Plc's risk disclosures before and after its spectacular collapse in 2007 to identify any changes in its risk disclosure in press releases, which have been highlighted as a potential candidate for research in past studies (see Lebar 1982) that, like most others, focused solely on disclosures in the annual report. The Northern Rock Plc collapse, caused by the lack of wholesale market finance resulting from the 2007 subprime crisis, was widely reported and caused substantial damage to the bank's legitimacy. The study will first compare press release disclosures in the periods before and after the crisis and, if any changes are found, will attempt to identify whether those changes were part of a strategy formulated to re-legitimise the bank. The first chapter comprises a review of the existing risk, risk disclosure and legitimacy theory literature, as well as a definition of key concepts and background to Northern Rock and its legitimacy crisis. Secondly, the research methods are explained, hypotheses are developed and the choice of press releases as the basis of the study is justified. Finally, the results are analysed, conclusions are drawn and suggestions for further research are made.
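The before/after comparison the study describes amounts to simple content analysis over two corpora of press releases. A toy version can be sketched as follows; the keyword list and the sample texts are invented for illustration and do not reflect the study's actual coding scheme.

```python
import re
from collections import Counter

# Hypothetical risk-related terms to count; a real study would derive
# its coding categories from the risk-disclosure literature.
RISK_TERMS = {"risk", "exposure", "liquidity", "credit", "uncertainty"}

def risk_term_counts(text: str) -> Counter:
    """Count occurrences of risk-related terms in a press release."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w in RISK_TERMS)

# Invented example snippets standing in for pre- and post-crisis releases.
before = "Our credit policies remain prudent and our funding diverse."
after = ("Liquidity risk and funding exposure are now monitored daily; "
         "credit risk limits were tightened.")

print(risk_term_counts(before))  # Counter({'credit': 1})
print(risk_term_counts(after))
```

Comparing such frequency profiles across the two periods would reveal a shift in disclosure emphasis; attributing any shift to a deliberate re-legitimisation strategy, as the study notes, requires further qualitative interpretation.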

    Data Literacy defined pro populo: To read this article, please provide a little information

    Data literacy is of fundamental importance in societies that emphasize extensive use of data for information and decision-making. Yet, prior definitions of data literacy fall short of addressing the myriad ways individuals are shepherds of, and subject to, data. This article proposes a definition that accurately reflects the individual in society, including knowledge of what data are; how they are collected, analyzed, visualized and shared; and the understanding of how data are applied for benefit or detriment, within the cultural context of security and privacy. The article concludes by proposing opportunities, strengths, limitations and directions for future research.

    Models, Techniques, and Metrics for Managing Risk in Software Engineering

    The field of Software Engineering (SE) is the study of systematic and quantifiable approaches to software development, operation, and maintenance. This thesis presents a set of scalable and easily implemented techniques for quantifying and mitigating risks associated with the SE process. The thesis comprises six papers corresponding to SE knowledge areas such as software requirements, testing, and management. The techniques for risk management are drawn from stochastic modeling and operational research. The first two papers relate to software testing and maintenance. The first paper describes and validates a novel iterative-unfolding technique for filtering a set of execution traces relevant to a specific task. The second paper analyzes and validates the applicability of some entropy measures to the trace classification described in the first paper. The techniques in these two papers can speed up the diagnosis of defects encountered by customers, leading to improved organizational response, increased customer satisfaction, and eased resource constraints. The third and fourth papers are applicable to maintenance, overall software quality, and SE management. The third paper uses Extreme Value Theory and Queuing Theory tools to derive and validate metrics based on defect rediscovery data. The metrics can aid the allocation of resources to service and maintenance teams, highlight gaps in quality assurance processes, and help assess the risk of using a given software product. The fourth paper characterizes and validates a technique for automatic selection and prioritization of a minimal set of customers for profiling. The minimal set is obtained using Binary Integer Programming and prioritized using a greedy heuristic. Profiling the resulting customer set leads to enhanced comprehension of user behaviour, improved test specifications, and clearer quality assurance policies, hence reducing risks associated with unsatisfactory product quality. The fifth and sixth papers pertain to software requirements. The fifth paper models the relation between requirements and their underlying assumptions, and measures the risk associated with failure of those assumptions, using Boolean networks and stochastic modeling. The sixth paper models the risk associated with injection of requirements late in the development cycle with the help of stochastic processes.
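The entropy measures applied to trace classification in the second paper can be illustrated with a generic Shannon-entropy sketch over an execution trace's event-frequency distribution; the trace format here is an assumption, and the thesis's actual measures may differ.

```python
import math
from collections import Counter

def trace_entropy(trace):
    """Shannon entropy (bits) of the event-frequency distribution of a trace.

    A trace whose events are spread evenly scores high; a trace dominated
    by one repeated event scores near zero, which can be used as a signal
    when classifying traces as relevant or irrelevant to a task.
    """
    counts = Counter(trace)
    total = len(trace)
    probs = [c / total for c in counts.values()]
    return sum(-p * math.log2(p) for p in probs)

uniform_trace = ["open", "read", "write", "close"]  # four distinct events
repetitive_trace = ["read", "read", "read", "read"]  # one event repeated

print(trace_entropy(uniform_trace))     # 2.0
print(trace_entropy(repetitive_trace))  # 0.0
```

With four equally frequent events the entropy is log2(4) = 2 bits, while a single repeated event carries no surprise at all, so ranking traces by this score separates structurally diverse traces from degenerate ones.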

    A Framework for Classification of the Data and Information Quality Literature and Preliminary Results (1996-2007)

    The value of management decisions, the security of our nation, and the very foundations of our business integrity all depend on the quality of data and information. However, the quality of data and information depends on how that data or information will be used. This paper proposes a theory of data quality based on the five principles defined by J. M. Juran for product and service quality and extends Wang et al.'s 1995 framework for data quality research. It then examines the data and information quality literature from journals within the context of this framework.