250 research outputs found

    Some issues in the 'archaeology' of software evolution

    During a software project's lifetime, the software goes through many changes, as components are added, removed and modified to fix bugs and add new features. This paper is intended as a lightweight introduction to some of the issues arising from an 'archaeological' investigation of software evolution. We use our own work to look at some of the challenges faced, techniques used, findings obtained, and lessons learnt when measuring and visualising the historical changes that happen during the evolution of software.

    Identification and tracking of maritime objects for collision risk estimation

    With the advent of modern high-speed passenger ferries and the general increase in maritime traffic, both commercial and recreational, marine safety is becoming an increasingly important issue. From lightweight catamarans and fishing trawlers to container ships and cruise liners, one question remains the same: is anything in the way? This question is addressed in this thesis. Through the use of image processing techniques applied to video sequences of maritime scenes, the images are segmented into two regions, sea and object. This is achieved using statistical measures taken from the histogram data of the images. Each segmented object has a feature vector built containing information including its size and previous centroid positions. The feature vectors are used to track the identified objects across many frames. With information recorded about an object's previous motion, its future motion is predicted using a least squares method. Finally, a high-level rule-based algorithm is applied in order to estimate the collision risk posed by each object present in the image. The result is an image with the objects identified by placing a white box around them. The predicted motion is shown and the estimated collision risk posed by that object is displayed. The algorithms developed in this work have been evaluated using two previously unseen maritime image sequences. These show that the algorithms developed here can be used to estimate the collision risk posed by maritime objects.
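    The least-squares prediction step described above can be sketched as follows. This is a minimal illustration, not code from the thesis: it fits a straight line to an object's past centroid positions and extrapolates one frame ahead; all names are invented for the example.

```python
# Sketch of least-squares motion prediction: fit x(t) and y(t) linearly
# over an object's centroid history and extrapolate to the next frame.
# Illustrative only; the thesis's exact formulation may differ.

def predict_next_centroid(history):
    """Fit each coordinate as v = a*t + b by least squares; predict t = n."""
    n = len(history)
    ts = list(range(n))

    def fit(vals):
        # Closed-form simple linear regression.
        mean_t = sum(ts) / n
        mean_v = sum(vals) / n
        num = sum((t - mean_t) * (v - mean_v) for t, v in zip(ts, vals))
        den = sum((t - mean_t) ** 2 for t in ts)
        a = num / den
        b = mean_v - a * mean_t
        return a * n + b  # value at the next time step t = n

    xs = [p[0] for p in history]
    ys = [p[1] for p in history]
    return fit(xs), fit(ys)

# An object moving 5 px right and 2 px down per frame:
print(predict_next_centroid([(0, 0), (5, 2), (10, 4), (15, 6)]))  # (20.0, 8.0)
```

    In practice the prediction would feed the rule-based risk estimator, which compares the extrapolated track against the vessel's own position.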

    Efficient Change Management of XML Documents

    XML-based documents play a major role in modern information architectures and their corresponding workflows. In this context, the ability to identify and represent differences between two versions of a document is essential. A second important aspect is the merging of document versions, which becomes crucial in parallel editing processes. Many different approaches exist that meet these challenges. Most rely on operational transformation or document annotation. In both approaches, the operations leading to changes are tracked, which requires corresponding editing applications. In the context of software development, however, a state-based approach is common. Here, document versions are compared and merged using external tools, called diff and patch. This allows users to edit documents freely without being tied to special tools. Approaches exist that are able to compare XML documents, but a corresponding merge capability is still not available. In this thesis, I present a comprehensive framework that allows for comparing and merging XML documents using a state-based approach. Its design is based on an analysis of XML documents and their modification patterns. The heart of the framework is a context-oriented delta model. I present a diff algorithm that appears to be highly efficient in terms of speed and delta quality. The patch algorithm is able to merge document versions efficiently and reliably. The efficiency and the reliability of my approach are verified using a competitive test scenario.
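    The state-based idea can be illustrated in a few lines: two document versions are compared directly, with no edit history required. This toy diff over element paths is far simpler than the thesis's context-oriented delta model and is only meant to show the principle.

```python
# Toy state-based XML diff: flatten each version into path -> text,
# then report paths whose text differs. A deliberate simplification,
# not the delta model described in the thesis.
import xml.etree.ElementTree as ET

def flatten(elem, path=""):
    """Map each element path to its (stripped) text content."""
    p = f"{path}/{elem.tag}"
    out = {p: (elem.text or "").strip()}
    for child in elem:
        out.update(flatten(child, p))
    return out

def diff(old_xml, new_xml):
    old = flatten(ET.fromstring(old_xml))
    new = flatten(ET.fromstring(new_xml))
    return {p: (old.get(p), new.get(p))
            for p in set(old) | set(new)
            if old.get(p) != new.get(p)}

v1 = "<doc><title>Draft</title><body>Hello</body></doc>"
v2 = "<doc><title>Final</title><body>Hello</body></doc>"
print(diff(v1, v2))  # {'/doc/title': ('Draft', 'Final')}
```

    Note that this sketch collapses repeated sibling tags onto one path; handling ordered, repeated children is exactly where a context-oriented delta model earns its keep.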

    Cyclist Detection, Tracking, and Trajectory Analysis in Urban Traffic Video Data

    The major objective of this thesis work is examining computer vision and machine learning detection methods, tracking algorithms and trajectory analysis for cyclists in traffic video data and developing an efficient system for cyclist counting. Due to the growing number of cyclist accidents on urban roads, methods for collecting information on cyclists are of significant importance to the Department of Transportation. The collected information provides insights into solving critical problems related to transportation planning, implementing safety countermeasures, and managing traffic flow efficiently. Intelligent Transportation System (ITS) employs automated tools to collect traffic information from traffic video data. In comparison to other road users, such as cars and pedestrians, automated cyclist data collection is a relatively new research area. In this work, a vision-based method for gathering cyclist count data at intersections and road segments is developed. First, we develop a methodology for efficient detection and tracking of cyclists. A combination of classification features and motion-based properties is evaluated to detect cyclists in the test video data. A Convolutional Neural Network (CNN) based detector called You Only Look Once (YOLO) is implemented to increase the detection accuracy. In the next step, the detection results are fed into a tracker based on Kernelized Correlation Filters (KCF), which, in cooperation with a bipartite graph matching algorithm, allows multiple cyclists to be tracked concurrently. Then, a trajectory rebuilding method and a trajectory comparison model are applied to refine the accuracy of tracking and counting. The trajectory comparison is performed based on a semantic similarity approach. The proposed counting method is the first cyclist counting method that has the ability to count cyclists under different movement patterns. The trajectory data obtained can be further utilized for cyclist behavioral modeling and safety analysis.
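    The detection-to-track association step above can be sketched with a greedy intersection-over-union matcher. The thesis pairs KCF tracking with bipartite graph matching; the greedy variant below is a simplified stand-in using only the standard library, with all names invented for the example.

```python
# Simplified detection-to-track association: greedily pair each track
# with its best-overlapping unmatched detection. A stand-in for the
# bipartite graph matching used in the thesis (e.g. Hungarian method).

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def associate(tracks, detections, threshold=0.3):
    """Return (track_index, detection_index) pairs, best overlaps first."""
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True)
    matched, used_t, used_d = [], set(), set()
    for score, ti, di in pairs:
        if score >= threshold and ti not in used_t and di not in used_d:
            matched.append((ti, di))
            used_t.add(ti)
            used_d.add(di)
    return matched

tracks = [(0, 0, 10, 10), (50, 50, 60, 60)]
dets = [(52, 51, 61, 60), (1, 0, 11, 10)]
print(associate(tracks, dets))  # [(0, 1), (1, 0)]
```

    Unmatched detections would then spawn new tracks, and tracks unmatched for several frames would be retired before counting.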

    The information content of interim financial reports: U.K. evidence

    The aim of this study is to investigate whether the public release of interim financial reports in the United Kingdom conveys information that affects share prices. The major objective for reporting the financial affairs of business enterprises is assumed to be the provision of information to help investors make investment decisions. Interim reports fulfil an important role as a source of frequent information regarding events in the business enterprise which could give investors some indication of the risks and uncertainties attached to a particular firm's cash flows. Accounting data, therefore, is assumed to be part of the broad market information set that is utilised in establishing prices. The study is carried out in the context of semi-strong form market efficiency, since the announcement of interim earnings puts the information in the announcement in the public domain. An efficient securities market impounds price-relevant information into prices instantaneously and without bias. Changes in security prices therefore reflect the flow of new information into the market information set utilised in establishing prices. The information in interim earnings can therefore be established if security prices change on the public release of the earnings data, barring the release of any other price-sensitive information in the same period. The major finding of the study is that interim accounting reports have information content that affects price activity on the day of release. It is argued that accounting policy makers have an incentive to provide economic benefits by recommending the preparation of quarterly reports by firms.
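    The test logic behind this kind of study can be sketched as a market-model event study: the announcement-day abnormal return is the actual return minus what the market model predicts. The code below is a minimal illustration with invented numbers, not the thesis's dataset or methodology in detail.

```python
# Minimal event-study sketch: estimate the market model (alpha, beta)
# over a pre-announcement window by OLS, then compute the abnormal
# return on the announcement day. All figures are illustrative.

def market_model(stock, market):
    """OLS estimate of alpha and beta from daily return series."""
    n = len(stock)
    mm = sum(market) / n
    ms = sum(stock) / n
    beta = (sum((m - mm) * (s - ms) for m, s in zip(market, stock))
            / sum((m - mm) ** 2 for m in market))
    alpha = ms - beta * mm
    return alpha, beta

def abnormal_return(alpha, beta, stock_ret, market_ret):
    """Actual return minus the market-model expectation."""
    return stock_ret - (alpha + beta * market_ret)

# Estimation window, then the announcement day:
alpha, beta = market_model([0.01, 0.02, 0.00, 0.03], [0.01, 0.02, 0.00, 0.03])
print(round(abnormal_return(alpha, beta, 0.05, 0.01), 4))  # 0.04
```

    A statistically significant abnormal return on the release day, absent other news, is what "information content" means in the semi-strong efficiency framing above.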

    Incrementalizing Lattice-Based Program Analyses in Datalog

    Program analyses detect errors in code, but when code changes frequently, as in an IDE, repeated re-analysis from scratch is unnecessary: it leads to poor performance unless we give up on precision and recall. Incremental program analysis promises to deliver fast feedback without giving up on precision or recall by deriving a new analysis result from the previous one. However, Datalog and other existing frameworks for incremental program analysis are limited in expressive power: they only support the powerset lattice as a representation of analysis results, whereas many practically relevant analyses require custom lattices and aggregation over lattice values. To address this limitation, we present a novel algorithm called DRedL that supports incremental maintenance of recursive lattice-value aggregation in Datalog. The key insight of DRedL is to dynamically recognize increasing replacements of old lattice values by new ones, which allows us to avoid the expensive deletion of the old value. We integrate DRedL into the analysis framework IncA and use IncA to realize incremental implementations of strong-update points-to analysis and string analysis for Java. As our performance evaluation demonstrates, both analyses react to code changes within milliseconds.
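    The key insight can be illustrated with a toy lattice aggregation. Here the lattice is integers under max, standing in for an arbitrary join-semilattice: when a contribution is replaced by a larger value, the cached join is updated in place; only a decrease forces re-aggregation. This is a didactic sketch of the idea, not the DRedL algorithm itself.

```python
# Toy illustration of increasing-replacement detection in lattice
# aggregation. Lattice: integers under max (join = max). Invented
# names; a didactic stand-in for DRedL's maintenance strategy.

class MaxAggregate:
    def __init__(self):
        self.contribs = {}   # key -> current lattice value
        self.result = None   # cached least upper bound (max)

    def update(self, key, value):
        old = self.contribs.get(key)
        self.contribs[key] = value
        if old is None or value >= old:
            # Increasing replacement: cheap in-place join, no deletion
            # of the old value is needed.
            self.result = value if self.result is None else max(self.result, value)
        else:
            # Decreasing replacement: the old value may have been the
            # maximum, so re-aggregate over all contributions.
            self.result = max(self.contribs.values())

agg = MaxAggregate()
agg.update("a", 3)
agg.update("b", 5)
agg.update("a", 7)   # increasing: fast path
print(agg.result)    # 7
agg.update("a", 2)   # decreasing: recompute
print(agg.result)    # 5
```

    In a real Datalog engine the decreasing case corresponds to the expensive delete-and-rederive step that DRedL's increasing-replacement detection avoids whenever values only move up the lattice.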