
    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, thus requiring a new protocol to be built from scratch for every new need. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking. Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials

    Proceedings of International Workshop "Global Computing: Programming Environments, Languages, Security and Analysis of Systems"

    According to the IST/FET proactive initiative on GLOBAL COMPUTING, the goal is to obtain techniques (models, frameworks, methods, algorithms) for constructing systems that are flexible, dependable, secure, robust and efficient. The dominant concerns are not those of representing and manipulating data efficiently but rather those of handling the co-ordination and interaction, security, reliability, robustness, failure modes, and control of risk of the entities in the system and the overall design, description and performance of the system itself. Completely different paradigms of computer science may have to be developed to tackle these issues effectively. The research should concentrate on systems having the following characteristics:
    • The systems are composed of autonomous computational entities where activity is not centrally controlled, either because global control is impossible or impractical, or because the entities are created or controlled by different owners.
    • The computational entities are mobile, due to the movement of the physical platforms or by movement of the entity from one platform to another.
    • The configuration varies over time. For instance, the system is open to the introduction of new computational entities and likewise their deletion. The behaviour of the entities may vary over time.
    • The systems operate with incomplete information about the environment. For instance, information becomes rapidly out of date and mobility requires information about the environment to be discovered.
    The ultimate goal of the research action is to provide a solid scientific foundation for the design of such systems, and to lay the groundwork for achieving effective principles for building and analysing such systems. This workshop covers the aspects related to languages and programming environments as well as analysis of systems and resources, involving 9 projects (AGILE, DART, DEGAS, MIKADO, MRG, MYTHS, PEPITO, PROFUNDIS, SECURE) out of the 13 funded under the initiative. One year after the start of the projects, the goal of the workshop is to take stock of the state of the art on the topics covered by the two clusters related to programming environments and analysis of systems, as well as to devise strategies and new ideas to profitably continue the research effort towards the overall objective of the initiative. We acknowledge the Dipartimento di Informatica e Telecomunicazioni of the University of Trento, the Comune di Rovereto, and the project DEGAS for partially funding the event, and the Events and Meetings Office of the University of Trento for the valuable collaboration.

    Generating collaborative systems for digital libraries: A model-driven approach

    This is an open access article shared under a Creative Commons Attribution 3.0 Licence (http://creativecommons.org/licenses/by/3.0/). Copyright © 2010 The Authors. The design and development of a digital library involves different stakeholders, such as information architects, librarians, and domain experts, who need to agree on a common language to describe, discuss, and negotiate the services the library has to offer. To this end, high-level, language-neutral models have to be devised. Metamodeling techniques favor the definition of domain-specific visual languages through which stakeholders can share their views and directly manipulate representations of the domain entities. This paper describes CRADLE (Cooperative-Relational Approach to Digital Library Environments), a metamodel-based framework and visual language for the definition of notions and services related to the development of digital libraries. A collection of tools allows the automatic generation of several services, defined with the CRADLE visual language, and of the graphical user interfaces providing access to them for the end user. The effectiveness of the approach is illustrated by presenting digital libraries generated with CRADLE, while the CRADLE environment has been evaluated using the cognitive dimensions framework.
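
    As a rough illustration of the model-driven generation described above, the sketch below turns a tiny declarative entity model into a service stub. It is purely hypothetical: CRADLE's actual metamodel, visual language and generators are not shown here, and all names are invented for the example.

        from string import Template

        # A minimal declarative "model" of one digital-library entity, standing in for
        # what stakeholders would edit through a visual language.
        collection_model = {
            "name": "TechnicalReport",
            "attributes": {"title": "str", "authors": "list[str]", "year": "int"},
            "services": ["search", "annotate"],
        }

        service_tpl = Template(
            "class ${name}Service:\n"
            '    """Auto-generated access service for the ${name} entity."""\n'
            "${methods}"
        )
        method_tpl = Template(
            "    def ${svc}(self, **criteria):\n"
            "        raise NotImplementedError('${svc} over ${name} records')\n"
        )

        # Generate one method stub per service declared in the model
        # (the attribute list could drive schema generation; omitted here).
        methods = "".join(
            method_tpl.substitute(svc=svc, name=collection_model["name"])
            for svc in collection_model["services"]
        )
        print(service_tpl.substitute(name=collection_model["name"], methods=methods))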

    Towards a debugging tutor for object-oriented environments

    Programming has provided a rich domain for Artificial Intelligence in Education, and many systems have been developed to advise students about the bugs in their programs, either during program development or post-hoc. Surprisingly few systems have been developed specifically to teach debugging. Learning environment builders have assumed that either the student will be taught these skills elsewhere or that they will be learnt piecemeal without explicit advice. This paper reports on two experiments on Java debugging strategy by novice programmers and discusses their implications for the design of a debugging tutor for Java that pays particular attention to how students use the variety of program representations available. The experimental results are in agreement with research in the area that suggests that good debugging performance is associated with a balanced use of the available representations and a sophisticated use of the debugging step facility, which enables programmers to detect and obtain information from critical moments in the execution of the program. A balanced use of the available representations seems to be fostered by providing representations with a higher degree of dynamic linking as well as by explicit instruction about the representation formalism employed in the program visualisations.

    Scattering line polarization in rotating, optically thick disks

    To interpret observations of astrophysical disks it is essential to understand the formation process of the emitted light. If the disk is optically thick, scattering-dominated and permeated by a Keplerian velocity field, non-local thermodynamic equilibrium (NLTE) radiative transfer modeling must be done to compute the emergent spectrum from a given disk model. We investigate NLTE polarized line formation in different simple disk models and aim to demonstrate the importance of both radiative transfer effects and scattering, as well as the effects of velocity fields. We self-consistently solve the coupled equations of radiative transfer and statistical equilibrium for a two-level atom model by means of Jacobi iteration. We compute scattering polarization, that is, Q/I and U/I line profiles. The degree of scattering polarization is significantly influenced by the inclination of the disk with respect to the observer, but also by the optical thickness of the disk and the presence of rotation. Stokes U shows double-lobed profiles whose amplitude increases with the disk rotation. Our results suggest that the line profiles, especially the polarized ones, emerging from gaseous disks differ significantly from the profiles predicted by simple approximations. The profiles are diverse in shape, but typically symmetric in Stokes Q and antisymmetric in Stokes U. A clear indicator of disk rotation is the presence of Stokes U, which might prove to be a useful diagnostic tool. We also demonstrate that, for moderate rotational velocities, an approximate treatment can be used, where NLTE radiative transfer is done in the velocity-field-free approximation and the Doppler shift is applied in the process of spatial integration over the whole emitting surface. Comment: 16 pages; 12 figures; accepted with revision for A&A. This is the version after the first round of referee's suggestions.
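
    The Jacobi scheme mentioned in this abstract is the standard operator-splitting iteration for the two-level-atom NLTE problem, in which the source function satisfies S = (1 - eps) * Lambda[S] + eps * B. The Python sketch below is only a toy illustration of that scheme on a static 1D slab, not the authors' rotating-disk code; the grid, the kernel discretization and all parameter values are assumptions chosen for brevity.

        import numpy as np
        from scipy.special import exp1  # exponential integral E1

        # Toy problem: two-level atom in a static 1D slab with photon destruction
        # probability eps, so the NLTE source function obeys
        #   S = (1 - eps) * Lambda[S] + eps * B.
        eps, B, ntau = 1e-4, 1.0, 80
        tau = np.logspace(-4, 4, ntau)               # optical-depth grid

        def kernel_cell(a, b):
            # Integral of (1/2) E1(t) dt over [a, b] (0 <= a <= b);
            # the antiderivative of E1(t) is t*E1(t) - exp(-t).
            F = lambda x: x * exp1(np.maximum(x, 1e-300)) - np.exp(-x)
            return 0.5 * (F(b) - F(a))

        # Discrete Lambda operator: integrate the slab kernel over each grid cell.
        mid = 0.5 * (tau[1:] + tau[:-1])
        bl = np.concatenate(([0.0], mid))            # left cell edges
        br = np.concatenate((mid, [tau[-1]]))        # right cell edges
        d1 = np.abs(tau[:, None] - bl[None, :])
        d2 = np.abs(tau[:, None] - br[None, :])
        Lam = kernel_cell(np.minimum(d1, d2), np.maximum(d1, d2))
        idx = np.arange(ntau)
        Lam[idx, idx] = kernel_cell(0.0, tau - bl) + kernel_cell(0.0, br - tau)

        # Jacobi iteration: precondition the fixed-point residual with diag(Lambda).
        S = np.full(ntau, eps * B)
        diag = Lam[idx, idx]
        for it in range(2000):
            resid = (1.0 - eps) * (Lam @ S) + eps * B - S
            S += resid / (1.0 - (1.0 - eps) * diag)
            if np.max(np.abs(resid)) < 1e-6 * B:
                break

        print(f"iterations: {it + 1},  S(surface)/B = {S[0] / B:.3e}  "
              f"(classical sqrt(eps) limit = {np.sqrt(eps):.3e})")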

    Personalised mobile services supporting the implementation of clinical guidelines

    Telemonitoring is emerging as a compelling application of Body Area Networks (BANs). We describe two health BAN systems, developed respectively by a European team and an Australian team, and discuss some issues encountered relating to the formalization of clinical knowledge to support real-time analysis and interpretation of BAN data. Our example application is an evidence-based telemonitoring and teletreatment application for home-based rehabilitation. The application is intended to support implementation of a clinical guideline for cardiac rehabilitation following myocardial infarction. In addition, the proposal is to establish the patient's individual baseline risk profile and, by real-time analysis of BAN data, continually re-assess the current risk level in order to give timely personalised feedback. Static and dynamic risk factors are derived from the literature. Many sources express evidence probabilistically, suggesting a requirement for reasoning with uncertainty; elsewhere the evidence requires qualitative reasoning: both are familiar modes of reasoning in knowledge-based systems (KBSs). However, even at this knowledge-acquisition stage, some issues arise concerning how best to apply the clinical evidence. Furthermore, in cases where insufficient clinical evidence is currently available, telemonitoring can yield large collections of clinical data with the potential for data mining in order to furnish more statistically powerful and accurate clinical evidence.
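
    Where this abstract mentions continual re-assessment of the current risk level through probabilistic reasoning over BAN data, a minimal sketch of that idea might look as follows. This is a hypothetical illustration: the RiskAssessor class, the heart-rate bands and all likelihood values are invented for the example and are not taken from either of the systems described above.

        # Hypothetical sketch: combine a patient's static baseline risk with streaming
        # sensor readings via a Bayesian update of P(high risk) per observation.
        class RiskAssessor:
            # Illustrative likelihoods P(observation band | risk state); real values
            # would have to come from the clinical evidence discussed above.
            LIKELIHOOD = {
                "hr_normal":   {"high": 0.3, "low": 0.8},
                "hr_elevated": {"high": 0.7, "low": 0.2},
            }

            def __init__(self, baseline_high_risk: float):
                self.p_high = baseline_high_risk          # patient-specific static baseline

            def update(self, heart_rate: int) -> float:
                band = "hr_elevated" if heart_rate > 100 else "hr_normal"
                lik = self.LIKELIHOOD[band]
                num = self.p_high * lik["high"]
                den = num + (1.0 - self.p_high) * lik["low"]
                self.p_high = num / den                   # posterior becomes the new prior
                return self.p_high

        assessor = RiskAssessor(baseline_high_risk=0.15)
        for hr in (82, 95, 110, 118, 104):                # simulated telemonitoring samples
            p = assessor.update(hr)
            print(f"HR={hr:3d}  P(high risk)={p:.2f}  {'ALERT' if p > 0.5 else 'ok'}")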

    Refinement of SDBC Business Process Models Using ISDL

    Aiming at aligning business process modeling and software specification, the SDBC approach considers multi-viewpoint modeling in which static, dynamic, and data business process aspect models have to be mapped adequately to corresponding static, dynamic, and data software specification aspect models. In addition, the approach also considers a business process modeling viewpoint which concerns real-life communication and coordination issues, such as meanings, intentions, negotiations, commitments, and obligations. Hence, in order to adequately align communication and dynamic aspect models, SDBC has to use at least two modeling techniques. However, the transformation between two techniques unnecessarily complicates the modeling process. Moreover, different techniques use different modeling formalisms, whose differences sometimes impose limitations. For this reason, we explore in the current paper the value which the (modeling) language ISDL could bring to SDBC in the alignment of communication and behavioral (dynamic) business process aspect models; ISDL can usefully refine dynamic process models. Thus, it is feasible to expect that ISDL can complement the SDBC approach, allowing refinement of dynamic business process aspect models by adding communication and coordination actions. Furthermore, SDBC could benefit from ISDL-related methods that assess whether a realized refinement conforms to the original process model. Our studies in the paper are supported by an illustrative example.