
    TURTLE-P: a UML profile for the formal validation of critical and distributed systems

    The timed UML and RT-LOTOS environment, or TURTLE for short, extends UML class and activity diagrams with composition and temporal operators. TURTLE is a real-time UML profile with a formal semantics expressed in RT-LOTOS, and it is supported by a formal validation toolkit. This paper introduces TURTLE-P, an extended profile no longer restricted to the abstract modeling of distributed systems. Indeed, TURTLE-P addresses concrete descriptions of communication architectures, including quality-of-service parameters (delay, jitter, etc.). This new profile enables co-design of hardware and software components with extended UML component and deployment diagrams. Properties of these diagrams can be evaluated and/or validated thanks to the formal semantics given in RT-LOTOS. The application of TURTLE-P is illustrated with a telecommunication satellite system.

    The IceCube Neutrino Observatory IV: Searches for Dark Matter and Exotic Particles

    Exotic particle searches: WIMPs annihilating in the Sun, in the Galactic Center, and in nearby dwarf galaxies; magnetic monopoles. Papers submitted by the IceCube Collaboration to the 32nd International Cosmic Ray Conference, Beijing 2011; part I.

    RV Sonne Cruise 200, 11 Jan-11 Mar 2009. Jakarta - Jakarta

    All plate boundaries are divided into segments: pieces of fault that are distinct from one another, either separated by gaps or with different orientations. The maximum size of an earthquake on a fault system is controlled by the degree to which a propagating rupture can cross the boundaries between such segments. A large earthquake may rupture a whole segment of plate boundary, but a great earthquake usually ruptures more than one segment at once. The December 26th 2004 Mw 9.3 earthquake and the March 28th 2005 Mw 8.7 earthquake ruptured, respectively, 1200–1300 km and 300–400 km of the subduction boundary between the Indian-Australian plate and the Burman and Sumatra blocks. Rupture in the 2004 event started at the southern end of the fault segment and propagated northwards. The observation that the slip did not propagate significantly southwards in December 2004, even though the magnitude of slip was high at the southern end of the rupture, strongly suggests a barrier at that place. Maximum slip in the March 2005 earthquake occurred within ~100 km of the barrier between the 2004 and 2005 ruptures, confirming both the physical importance of the barrier and the loading of the March 2005 rupture zone by the December 2004 earthquake.
    The Sumatran Segmentation Project, funded by the Natural Environment Research Council (NERC), aims to characterise the boundaries between these great earthquakes (in terms of both subduction zone structure at scales of 10^1–10^4 m and rock physical properties), record seismic activity, improve and link earthquake slip distributions to the structure of the subduction zone, and determine the sedimentological record of great earthquakes (both recent and historic) along this part of the margin. The Project is focussed on the areas around two earthquake segment boundaries: Segment Boundary 1 (SB1) between the 2004 and 2005 ruptures at Simeulue Island, and SB2 between the 2005 and smaller 1935 ruptures between Nias and the Batu Islands.
    Cruise SO200 is the third of three cruises which will provide a combined geophysical and geological dataset in the source regions of the 2004 and 2005 subduction zone earthquakes. SO200 was divided into two legs. Leg 1 (SO200-1), Jakarta to Jakarta between January 22nd and February 22nd, was composed of three main operations: long-term deployment OBS retrieval, TOBI sidescan sonar survey, and coring. Leg 2 (SO200-2), Jakarta to Jakarta between February 23rd and March 11th, was composed of two main operations: multichannel seismic reflection (MCS) profiles and heatflow probe transects.

    Characterizing a Meta-CDN

    CDNs have reshaped the Internet architecture at large. They operate (globally) distributed networks of servers to reduce latencies, to increase availability for content, and to handle large traffic bursts. Traditionally, content providers were mostly limited to a single CDN operator. However, in recent years, more and more content providers employ multiple CDNs to serve the same content and provide the same services. Thus, switching between CDNs, which can be beneficial to reduce costs, to select CDNs by optimal performance in different geographic regions, or to overcome CDN-specific outages, becomes an important task. Services that tackle this task have emerged, known as CDN brokers, Multi-CDN selectors, or Meta-CDNs. Despite their existence, little is known about Meta-CDN operation in the wild. In this paper, we thus shed light on this topic by dissecting a major Meta-CDN. Our analysis provides insights into its infrastructure, its operation in practice, and its usage by Internet sites. We leverage PlanetLab and RIPE Atlas as distributed infrastructures to study how a Meta-CDN impacts web latency.
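    A Meta-CDN typically steers clients by returning different CNAME chains depending on the broker's current decision, so one way to observe which member CDN serves a request is to inspect the resolved names. The sketch below illustrates this idea with a small, purely illustrative suffix-to-operator table; it is not the paper's actual methodology or dataset.

    ```python
    # Sketch: infer which CDN serves a hostname by inspecting its CNAME chain.
    # The suffix-to-operator table is illustrative, not exhaustive.
    CDN_SUFFIXES = {
        "akamaiedge.net": "Akamai",
        "edgesuite.net": "Akamai",
        "cloudfront.net": "Amazon CloudFront",
        "fastly.net": "Fastly",
        "edgecastcdn.net": "Edgecast",
    }

    def classify_cname_chain(chain):
        """Return the first CDN operator matched along a CNAME chain, else None."""
        for name in chain:
            for suffix, operator in CDN_SUFFIXES.items():
                if name.rstrip(".").endswith(suffix):
                    return operator
        return None

    # Example: a broker answer that hands the request to Fastly.
    print(classify_cname_chain(
        ["www.example.com.", "broker.example-meta.net.", "global.prod.fastly.net."]
    ))  # → Fastly
    ```

    Repeating such lookups from many vantage points (as the paper does with PlanetLab and RIPE Atlas) reveals how the broker's choice varies over time and geography.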

    Artificial table testing dynamically adaptive systems

    Dynamically Adaptive Systems (DAS) are systems that modify their behavior and structure in response to changes in their surrounding environment. Critical mission systems increasingly incorporate adaptation and response to the environment; examples include disaster relief and space exploration systems. These systems can be decomposed into two parts: the adaptation policy that specifies how the system must react to environmental changes, and the set of possible variants to reconfigure the system. A major challenge for testing these systems is the combinatorial explosion of variants and environment conditions to which the system must react. In this paper we focus on testing the adaptation policy and propose a strategy for the selection of environmental variations that can reveal faults in the policy. Artificial Shaking Table Testing (ASTT) is a strategy inspired by shaking table testing (STT), a technique widely used in civil engineering to evaluate a building's structural resistance to seismic events. ASTT makes use of artificial earthquakes that simulate violent changes in the environmental conditions and stress the system's adaptation capability. We model the generation of artificial earthquakes as a search problem in which the goal is to optimize different types of environmental variations.
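    The abstract frames artificial-earthquake generation as a search problem over environmental variations. A toy sketch of that idea, using a made-up fitness that rewards abrupt step-to-step changes and simple hill climbing (the paper's actual objective and search operators are not specified here), might look like:

    ```python
    import random

    # Toy sketch: search-based generation of an "artificial earthquake",
    # i.e. a sequence of environmental-variable values scored by how
    # violently it varies. The fitness function is illustrative only.

    def fitness(sequence):
        """Reward large step-to-step changes (violent environmental variation)."""
        return sum(abs(b - a) for a, b in zip(sequence, sequence[1:]))

    def hill_climb(length=8, steps=200, seed=0):
        """Mutate one position at a time, keeping candidates that raise fitness."""
        rng = random.Random(seed)
        best = [rng.uniform(0.0, 1.0) for _ in range(length)]
        for _ in range(steps):
            cand = best[:]
            i = rng.randrange(length)
            cand[i] = min(1.0, max(0.0, cand[i] + rng.uniform(-0.5, 0.5)))
            if fitness(cand) > fitness(best):
                best = cand
        return best

    quake = hill_climb()
    print(round(fitness(quake), 2))
    ```

    The resulting sequence would then drive the DAS's monitored environment, stressing the adaptation policy far harder than typical operating conditions.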

    Why It Takes So Long to Connect to a WiFi Access Point

    Today's WiFi networks deliver a large fraction of traffic. However, the performance and quality of WiFi networks are still far from satisfactory. Among many popular quality metrics (throughput, latency), the probability of successfully connecting to WiFi APs and the time cost of the WiFi connection set-up process are two of the most critical metrics that affect WiFi users' experience. To understand the WiFi connection set-up process in real-world settings, we carry out measurement studies on 5 million mobile users from 4 representative cities associating with 7 million APs in 0.4 billion WiFi sessions, collected from a mobile "WiFi Manager" App that tops the Android/iOS App market. To the best of our knowledge, we are the first to do such a large-scale study on: how large the WiFi connection set-up time cost is, what factors affect the WiFi connection set-up process, and what can be done to reduce the WiFi connection set-up time cost. Based on the measurement analysis, we develop a machine learning based AP selection strategy that can significantly improve WiFi connection set-up performance against the conventional strategy purely based on signal strength, by reducing the connection set-up failures from 33% to 3.6% and reducing 80% of the connection set-up time costs by more than 10 times.
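    The gap between the two strategies can be illustrated with a toy ranking function. The features and hand-set weights below are illustrative stand-ins for a trained model; the paper learns its model from large-scale measurement data.

    ```python
    import math

    # Sketch: feature-based AP ranking versus signal strength alone.
    # Weights are hand-set for illustration, not learned from data.

    def connect_success_score(rssi_dbm, historical_success_rate, load_fraction):
        """Logistic score combining signal strength, past success, and AP load."""
        x = 0.08 * (rssi_dbm + 70) + 3.0 * historical_success_rate - 2.0 * load_fraction
        return 1.0 / (1.0 + math.exp(-x))

    def pick_ap(aps, by="score"):
        """aps: list of (name, rssi_dbm, historical_success_rate, load_fraction)."""
        if by == "rssi":
            return max(aps, key=lambda ap: ap[1])[0]
        return max(aps, key=lambda ap: connect_success_score(*ap[1:]))[0]

    aps = [
        ("strong-but-flaky", -50, 0.30, 0.9),   # loud beacon, poor history, busy
        ("moderate-reliable", -65, 0.95, 0.2),  # weaker signal, reliable, idle
    ]
    print(pick_ap(aps, by="rssi"))   # → strong-but-flaky (signal-strength-only)
    print(pick_ap(aps, by="score"))  # → moderate-reliable (feature-based)
    ```

    The signal-strength-only strategy latches onto the loudest beacon even when that AP historically fails, which is exactly the failure mode the paper's learned selector avoids.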

    Nanopipettes as Monitoring Probes for the Single Living Cell: State of the Art and Future Directions in Molecular Biology.

    Examining the behavior of a single cell within its natural environment is valuable for understanding both the biological processes that control the function of cells and how injury or disease leads to pathological changes in their function. Single-cell analysis can reveal information regarding the causes of genetic changes, and it can contribute to studies on the molecular basis of cell transformation and proliferation. By contrast, whole tissue biopsies can only yield information on a statistical average of several processes occurring in a population of different cells. Electrowetting within a nanopipette provides a nanobiopsy platform for the extraction of cellular material from single living cells. Additionally, functionalized nanopipette sensing probes can differentiate analytes based on their size, shape, or charge density, making the technology uniquely suited to sensing changes in single-cell dynamics. In this review, we highlight the potential of nanopipette technology as a non-destructive analytical tool to monitor single living cells, with particular attention to integration into applications in molecular biology.