
    Towards critical event monitoring, detection and prediction for self-adaptive future Internet applications

    The Future Internet (FI) will be composed of a multitude of diverse types of services that offer flexible, remote access to software features, content, computing resources, and middleware solutions through different cloud delivery models, such as IaaS, PaaS and SaaS. Ultimately, this means that loosely coupled Internet services will form a comprehensive base for developing value-added applications in an agile way. Unlike traditional application development, which uses computing resources and software components under local administrative control, FI applications will thus strongly depend on third-party services. To maintain their quality of service, those applications therefore need to dynamically and autonomously adapt to an unprecedented level of changes that may occur during runtime. In this paper, we present our recent experiences on monitoring, detection, and prediction of critical events for both software services and multimedia applications. Based on these findings, we introduce potential directions for future research on self-adaptive FI applications, bringing these research directions together
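
    The abstract only surveys these techniques; purely as an illustration of what runtime monitoring, detection and prediction of a critical event can look like for a third-party service, the hedged Python sketch below (all names and thresholds are invented, not taken from the paper) flags a response-time violation and extrapolates a simple trend to warn before one occurs.

```python
from collections import deque

# Hypothetical sketch, not the authors' system: a sliding-window monitor that
# detects a critical event when a service's response time breaches a threshold
# and predicts one when the recent trend points above that threshold.
class ResponseTimeMonitor:
    def __init__(self, threshold_ms, window=10):
        self.threshold_ms = threshold_ms
        self.samples = deque(maxlen=window)

    def observe(self, response_ms):
        self.samples.append(response_ms)
        if response_ms > self.threshold_ms:
            return "critical"            # detection: threshold already breached
        if len(self.samples) >= 2:
            # naive linear trend over the window stands in for prediction
            trend = (self.samples[-1] - self.samples[0]) / (len(self.samples) - 1)
            if self.samples[-1] + 3 * trend > self.threshold_ms:
                return "warning"         # prediction: violation likely soon
        return "ok"

monitor = ResponseTimeMonitor(threshold_ms=200)
for ms in [90, 110, 150, 170, 190, 230]:
    print(ms, monitor.observe(ms))
```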

    Developing correct, distributed, adaptive software

    We illustrate our approach to developing and verifying distributed, adaptive software systems. The cornerstone of our framework is the use of choreography languages, which allow us to obtain correctness by construction. Behavioural Design Patterns are also used as abstract tools to design real systems, while techniques based on abstract interpretation and on dynamic verification are integrated in our framework to reduce the complexity of verification
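
    As a rough illustration of the dynamic-verification side of such a framework (the participants, messages and traces below are invented, not taken from the paper), a choreography can be viewed as the expected global order of message exchanges against which an observed execution trace is checked at runtime:

```python
# Hypothetical sketch: a choreography as the expected global order of message
# exchanges, and a runtime check that an observed trace conforms to it.
CHOREOGRAPHY = [
    ("Client",  "Broker",  "request"),
    ("Broker",  "Service", "forward"),
    ("Service", "Broker",  "result"),
    ("Broker",  "Client",  "reply"),
]

def conforms(trace):
    """True if the observed trace follows the choreography step by step."""
    return len(trace) == len(CHOREOGRAPHY) and all(
        observed == expected for observed, expected in zip(trace, CHOREOGRAPHY)
    )

print(conforms(CHOREOGRAPHY))                                            # True
print(conforms([("Client", "Service", "request")] + CHOREOGRAPHY[1:]))   # False
```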

    The evolving dynamics of outsourcing: control and conflict - a vendor's perspective

    This paper reports on unique longitudinal research conducted within a complex multi-vendor environment in the defence sector and describes the evolving relationship between the vendors and the defence client organization as they developed and implemented an outsourced HRM application development project within Europe. The scale, technical complexity and multinational nature of the project drove the formation of a supplier consortium to deliver the project. The project was tracked over a period of four years and a contextualised process model was applied, focused on the design and configuration phase, to clarify complex social processes and to expose the key incidents that framed its evolution. The analysis demonstrated the critical nature of initial and antecedent conditions and how governance and strict contractual controls interacted to cause project failure. The study gives a unique insight into the vendor perspective in an outsourced context and shows how the interplay between bargaining and control led to the focal organization shifting between collaborative and compliant work processes

    Navigating Cyberthreat Intelligence with CYBEX-P: Dashboard Design and User Experience

    As the world’s data exponentially grows, two major problems increasingly need to be solved. The first is how to interpret large and complex datasets so that actionable insight can be achieved. The second is how to effectively protect these data and the assets they represent. This thesis’ topic lies at the intersection of these two crucial issues. The research presented in the thesis learns from past work on applying data visualization to multiple domains, with a focus on cybersecurity visualization. These learnings were then applied to a new research area: cybersecurity information sharing. The frontend considerations for CYBEX-P, a cybersecurity information sharing platform developed at UNR, are discussed in detail. A user-facing web application was developed from these requirements, resulting in an approachable, highly visual cyberthreat investigation tool. The threat-intelligence graph at the center of this dashboard-style tool allows analysts to interact with indicators of compromise and efficiently reach security conclusions. In addition to research and related software development, a user study was conducted with participants from cybersecurity backgrounds to test different visualization configurations. Subsequent analysis revealed that the misuse of simple visual properties can lead to perilous reductions in accuracy and response-time. Recommendations are provided for avoiding these pitfalls and balancing information density. The study results inform the final functionalities of the CYBEX-P front end and serve as a foundation for similar prospective tools. By improving how insights can be extracted from large cybersecurity datasets, the work presented in the thesis paves the way towards a more secure and informed future in a technology-driven world
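
    To make the central idea of the dashboard concrete, the hedged sketch below (not the CYBEX-P implementation; the indicators are placeholder values and the networkx library is assumed to be available) builds a tiny threat-intelligence graph of linked indicators of compromise that an analyst could traverse from a starting observable.

```python
import networkx as nx

# Illustrative only: a small threat-intelligence graph linking indicators of
# compromise (IOCs) so that related indicators can be explored interactively.
g = nx.Graph()
g.add_edge("phish@example.com", "malicious.example.org", relation="sender-domain")
g.add_edge("malicious.example.org", "203.0.113.7", relation="resolves-to")
g.add_edge("203.0.113.7", "a3f5...deadbeef", relation="served-sample")  # file hash placeholder

# An analyst starting from the IP address can walk outward to connected IOCs.
for neighbor in g.neighbors("203.0.113.7"):
    print("203.0.113.7 ->", neighbor, g.edges["203.0.113.7", neighbor]["relation"])
```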

    A Teledentistry System for the Second Opinion

    In this paper we present a teledentistry system aimed at the Second Opinion task. It makes use of a particular camera, called an intra-oral (or dental) camera, to capture photographs and real-time video of the inner part of the mouth. The pictures acquired by the Operator with such a device are sent to the Oral Medicine Expert (OME) by means of a standard File Transfer Protocol (FTP) service, and the real-time video is channeled into a video stream by the VideoLAN client/server (VLC) application. The system is composed of HTML5 web pages generated by PHP and allows the Second Opinion to be performed both when the Operator and the OME are logged in and when one of them is offline
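
    As a minimal sketch of the image hand-off described above (host, credentials, folder and file path are placeholders, not those of the actual system), the Operator-side upload could be performed with Python's standard ftplib:

```python
from ftplib import FTP

# Placeholder host, credentials and paths; illustrative of the FTP hand-off only.
def send_image_to_ome(path, host="ftp.example.org", user="operator", password="secret"):
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        ftp.cwd("incoming")                      # folder the OME checks for new cases
        with open(path, "rb") as image:
            ftp.storbinary(f"STOR {path.rsplit('/', 1)[-1]}", image)

# send_image_to_ome("cases/patient42/tooth_26.jpg")
```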

    A Case Study on Software Evolution towards Service-Oriented Architecture.

    The evolution of any software product over its lifetime is unavoidable, caused both by bugs to be fixed and by new requirements appearing in the later stages of the product’s lifecycle. Traditional development and architecture paradigms have proven ill-suited to these continual changes, resulting in large maintenance costs. This has driven the rise of approaches such as Service-Oriented Architecture (SOA), based on loosely coupled, interoperable services, which aim to address these issues. This paper describes a case study of the evolution of an existing legacy system towards a more maintainable SOA. The proposed process includes the recovery of the legacy system architecture as a first step to define the specific evolution plan to be executed and validated. The case study has been applied to a medical imaging system, evolving it into a service model
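
    Purely as an illustration of the kind of step such an evolution plan may contain (the routine, route and framework choice below are hypothetical and not taken from the case study), an existing legacy function can be wrapped behind a small, loosely coupled service interface:

```python
from flask import Flask, jsonify, request

# Hypothetical sketch: wrapping a legacy routine as an HTTP service endpoint.
app = Flask(__name__)

def legacy_measure_lesion(pixels):
    """Stand-in for an existing routine buried in the legacy imaging code."""
    return {"area_mm2": 0.25 * len(pixels)}

@app.route("/measurements", methods=["POST"])
def measurements():
    pixels = request.get_json().get("pixels", [])
    return jsonify(legacy_measure_lesion(pixels))

if __name__ == "__main__":
    app.run(port=8080)
```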

    Attribute based component design: Supporting model driven development in CbSE

    Over the evolution of Software Engineering, the scale of components has increased, requirements across different domains have become more complex, and a variety of component frameworks and their associated models have emerged. Many modern component frameworks provide enterprise-level facilities and services, such as instance management and component container support, that developers can apply as needed to manage scale and complexity. Although the services provided by these frameworks are broadly common, their models and implementations differ. Accordingly, the main problem is that, when developing a component-based application using a component framework, the design of the components becomes tightly integrated with the framework implementation and the framework model is embedded in the component functionality, which reduces reusability. A further problem is that designers must have in-depth knowledge of a component framework's implementation to be able to model, design and implement the components and take advantage of the services provided. To address these problems, this research proposes the Attribute based Component Design (AbCD) approach, which allows developers to model software using logical and abstract components at the specification level. The components encapsulate the provided functionality, as well as the required services, runtime requirements and interaction models, using a set of attributes. These attributes are systematically derived by grouping common features and services from the lightweight and heavyweight component frameworks available in the literature. The AbCD approach consists of the AbCD Meta-model, which is an extension of the UML meta-model, and the Component Design Guidelines (CDG), which include core Component-based Software Engineering principles to assist designers in the modelling process. To support the AbCD approach, an implementation has been developed as a set of plug-ins for the Eclipse IDE, called the AbCD tool suite. An evaluation of the AbCD approach was conducted by using the tool suite with two case studies: the first focuses on the abstraction achieved by the AbCD approach and the second on the reusability of the components. The evaluation shows that the artefacts produced using the approach provide an alternative architectural view of the design and help to re-factor the design based on aspects. At the same time, the evaluation process identified possible improvements to the AbCD meta-model and the tool suite. This research provides a non-invasive approach for designing component-based software using model-driven development
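
    The following hedged sketch is not the AbCD tool suite (which targets UML and Eclipse); with invented attribute names and values, it merely illustrates the underlying idea of describing a component through abstract attributes for provided functionality, required services and runtime requirements rather than through framework-specific code.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of attribute-based component description; the
# attribute names and values are invented for the example.
@dataclass
class ComponentSpec:
    name: str
    provides: list = field(default_factory=list)    # offered functionality
    requires: list = field(default_factory=list)    # services it depends on
    runtime: dict = field(default_factory=dict)     # e.g. instance management needs

order_manager = ComponentSpec(
    name="OrderManager",
    provides=["placeOrder", "cancelOrder"],
    requires=["PersistenceService", "AuditLog"],
    runtime={"instancing": "pooled", "transactional": True},
)
print(order_manager)
```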

    ImageJ2: ImageJ for the next generation of scientific image data

    ImageJ is an image analysis program extensively used in the biological sciences and beyond. Due to its ease of use, recordable macro language, and extensible plug-in architecture, ImageJ enjoys contributions from non-programmers, amateur programmers, and professional developers alike. Enabling such a diversity of contributors has resulted in a large community that spans the biological and physical sciences. However, a rapidly growing user base, diverging plugin suites, and technical limitations have revealed a clear need for a concerted software engineering effort to support emerging imaging paradigms, to ensure the software's ability to handle the requirements of modern science. Due to these new and emerging challenges in scientific imaging, ImageJ is at a critical development crossroads. We present ImageJ2, a total redesign of ImageJ offering a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats, to scripting languages, to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. ImageJ2 provides a framework engineered for flexibility, intended to support these requirements as well as accommodate future needs
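
    As a small, hedged illustration of driving the redesigned N-dimensional data model from an external environment (it assumes the pyimagej bridge and a Java runtime are installed; the sample path is a placeholder), the sketch below opens an image through an ImageJ2 gateway and reports its dimensions.

```python
import imagej

# Assumes the pyimagej package and a local Java runtime are available; the file
# path is a placeholder. Opens an image via an ImageJ2 gateway and prints its shape.
ij = imagej.init()                                  # start an ImageJ2 gateway
print("ImageJ2 version:", ij.getVersion())

dataset = ij.io().open("samples/timelapse.tif")     # hypothetical sample file
dims = [dataset.dimension(d) for d in range(dataset.numDimensions())]
print("dimensions:", dims)
```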