
    A general purpose programming framework for ubiquitous computing environments

    The need to support ad-hoc and potentially mobile arrangements of devices in ubiquitous environments does not fit well within the traditional client/server architecture. We believe peer-to-peer communication offers a preferable alternative due to its decentralised nature, removing dependence on individual nodes. However, this choice adds to the complexity of the developer's task. In this paper, we describe a two-tiered approach to address this problem: a lower tier employing peer-to-peer interactions for managing the network infrastructure, and an upper tier providing a mobile agent based programming framework. The result is a general purpose framework for developing ubiquitous applications and services, where the underlying complexity is hidden from the developer. This paper discusses our ongoing work, presenting our design decisions, the features supported by our framework, and some of the challenges still to be addressed in a complex programming environment.
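
    The abstract describes the architecture only in outline. Below is a minimal, hypothetical Python sketch of how a two-tiered design of this kind might present itself to a developer: a peer-to-peer lower tier hidden behind an overlay interface, and an upper tier where mobile agents are programmed against it. All names (Overlay, Agent, migrate) are illustrative assumptions, not the framework's actual API:

        import uuid

        class Overlay:
            """Lower tier: peer-to-peer management of the network (illustrative stub)."""
            def __init__(self):
                self.nodes = {}  # node_id -> address

            def join(self, address):
                node_id = uuid.uuid4().hex
                self.nodes[node_id] = address
                return node_id

            def lookup(self, node_id):
                # A real peer-to-peer tier would route this query through
                # the overlay rather than consult a local table.
                return self.nodes.get(node_id)

        class Agent:
            """Upper tier: a mobile agent programmed against the overlay."""
            def __init__(self, overlay):
                self.overlay = overlay
                self.location = None

            def migrate(self, node_id):
                address = self.overlay.lookup(node_id)
                if address is None:
                    raise LookupError(f"node {node_id} not found")
                # A real framework would serialise the agent and resume it
                # remotely; here we only model the lookup-and-move control flow.
                self.location = address

    The point of the sketch is the division of labour: the developer writes agent logic and calls migrate, while all peer-to-peer routing stays inside the lower tier.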

    Mobile object location discovery in unpredictable environments

    Emerging mobile and ubiquitous computing environments present hard challenges to software engineering. The use of mobile code has been suggested as a natural fit for simplifying software development for these environments. However, the task of discovering mobile code location becomes a problem in unpredictable environments when using existing strategies, designed with fixed and relatively stable networks in mind. This paper introduces AMOS, a mobile code platform augmented with a structured overlay network. We demonstrate how the location discovery strategy of AMOS has better reliability and scalability properties than existing approaches, with minimal communication overhead. Finally, we demonstrate how AMOS can autonomously and fairly distribute effort throughout a network, using probabilistic methods that require no global knowledge of host capabilities.
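
    AMOS's specific protocol is not given in the abstract. The sketch below shows the generic structured-overlay idea it builds on: hashing names onto a shared key ring so that any node can locate a mobile code unit from its name alone, with no global directory. This is the standard DHT pattern, not AMOS's actual algorithm:

        import hashlib
        from bisect import bisect_right

        def ring_key(name: str) -> int:
            # Map names and node identifiers onto the same circular key space.
            return int(hashlib.sha1(name.encode()).hexdigest(), 16) % (2 ** 32)

        class OverlayRing:
            """Generic structured overlay: each node owns a segment of the ring."""
            def __init__(self, node_names):
                self.ring = sorted((ring_key(n), n) for n in node_names)

            def responsible_node(self, item_name: str) -> str:
                k = ring_key(item_name)
                keys = [nk for nk, _ in self.ring]
                i = bisect_right(keys, k) % len(self.ring)  # clockwise successor
                return self.ring[i][1]

        ring = OverlayRing(["host-a", "host-b", "host-c"])
        print(ring.responsible_node("image-filter-agent"))

    Because a lookup depends only on the hash of the name, discovery keeps working as hosts join and leave, which is the kind of reliability in unpredictable networks that the abstract claims.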

    Flow resistance equations for gravel- and boulder-bed streams

    Alternative general forms are considered for equations to predict mean velocity over the full range of relative submergence experienced in gravel- and boulder-bed streams. A partial unification is suggested for some previous semiempirical models and physical concepts. Two new equations are proposed: a nondimensional hydraulic geometry equation with different parameters for deep and shallow flows, and a variable-power resistance equation that is asymptotic to roughness-layer formulations for shallow flows and to the Manning-Strickler approximation of the logarithmic friction law for deep flows. Predictions by existing and new equations using D₈₄ as the roughness scale are compared to a compilation of measured velocities in natural streams at relative submergences from 0.1 to over 30. The variable-power equation performs as well as the best existing approach, which is a logarithmic law with a roughness multiplier. For predicting how a known or assumed discharge is partitioned between depth and velocity, a nondimensional hydraulic geometry approach outperforms equations using relative submergence. Factor-of-two prediction errors occur with all approaches because of sensitivity to operational definitions of depth, velocity, and slope, the inadequacy of using a single grain-size length scale, and the complexity of flow physics in steep shallow streams.
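
    The abstract does not reproduce the equations themselves. As an illustration of the variable-power form it describes, one widely used expression with the stated asymptotics can be written as follows (the coefficients a₁ and a₂ are empirical; treat this as a sketch rather than the paper's exact formulation):

        \[
        \frac{U}{u_*} = \sqrt{\frac{8}{f}}
          = \frac{a_1 a_2 \,(d/D_{84})}{\sqrt{a_1^{2} + a_2^{2}\,(d/D_{84})^{5/3}}}
        \]

    Here U is mean velocity, u_* shear velocity, f the Darcy-Weisbach friction factor, and d/D_{84} the relative submergence. For shallow flows (d/D_{84} → 0) this reduces to the roughness-layer form U/u_* ≈ a_2 (d/D_{84}); for deep flows it tends to the Manning-Strickler power law U/u_* ≈ a_1 (d/D_{84})^{1/6}, matching the two limits named in the abstract.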

    PRECEPT: a framework for ethical digital forensics investigations

    Purpose: Cyber-enabled crimes are on the increase, and law enforcement has had to expand many of its detection activities into the digital domain. As such, the field of digital forensics has become far more sophisticated over the years and is now able to uncover even more evidence that can be used to support prosecution of cyber criminals in a court of law. Governments, too, have embraced the ability to track suspicious individuals in the online world. Forensics investigators are driven to gather data exhaustively, being under pressure to provide law enforcement with sufficient evidence to secure a conviction. Yet there are concerns about the ethics and justice of untrammeled investigations on a number of levels. On an organizational level, unconstrained investigations could interfere with, and damage, the organization's right to control the disclosure of its intellectual capital. On an individual level, those being investigated could easily have their legal privacy rights violated by forensics investigations. On a societal level, there might be a sense of injustice at the perceived inequality of current practice in this domain. This paper argues the need for a practical, ethically grounded approach to digital forensic investigations, one that acknowledges and respects the privacy rights of individuals and the intellectual capital disclosure rights of organisations, as well as acknowledging the needs of law enforcement. We derive a set of ethical guidelines, then map these onto a forensics investigation framework. We subjected the framework to expert review in two stages, refining it after each stage. We conclude by proposing the refined ethically grounded digital forensics investigation framework. Our treatise is primarily UK based, but the concepts presented here have international relevance and applicability.

    Design methodology: In this paper, the lens of justice theory is used to explore the tension that exists between the needs of digital forensic investigations into cybercrimes on the one hand and, on the other, individuals' rights to privacy and organizations' rights to control intellectual capital disclosure.

    Findings: The investigation revealed a potential inequality between the practices of digital forensics investigators and the rights of other stakeholders. That being so, the need for a more ethically informed approach to digital forensics investigations, as a remedy, is highlighted, and a framework is proposed to provide this.

    Practical implications: Our proposed ethically informed framework for guiding digital forensics investigations suggests a way of re-establishing the equality of the stakeholders in this arena and ensuring that the potential for a sense of injustice is reduced.

    Originality/value: Justice theory is used to highlight the difficulties in squaring the circle between the rights and expectations of all stakeholders in the digital forensics arena. The outcome is the forensics investigation guideline, PRECEpt: Privacy-Respecting EthiCal framEwork, which provides the basis for a re-aligning of the balance between the requirements and expectations of digital forensic investigators on the one hand, and individual and organizational expectations and rights on the other.

    Climate and human forcing of Alpine river flow

    River flow in Alpine environments is likely to be highly sensitive to climate change because of the effects of warming upon snow and ice, and hence upon the intra-annual distribution of river runoff. It is also likely to be influenced strongly by human impacts upon both hydrology (e.g. flow abstraction) and river regulation. This paper compares the river flow and sediment flux of two Alpine drainage basins over the last five to seven decades, one largely unimpacted by human activities, the other strongly impacted by flow abstraction for hydroelectricity. The analysis shows that both river flow and sediment transport capacity are strongly dependent upon the effects of temperature and precipitation availability upon snow accumulation. As the latter tends to increase annual maximum flows, and given the non-linear form of most sediment transport laws, current warming trends may lead to increased sedimentation in Alpine rivers. However, extension to a system impacted by flow abstraction reveals the dominant effect that human activity can have upon river sedimentation, but also how human responses to sediment management have co-evolved with climate forcing, making the two very difficult to disentangle.
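
    The non-linearity argument can be made concrete with a generic transport law (purely illustrative, not the paper's calibration). If transport capacity grows as a power of discharge with exponent b > 1,

        \[
        Q_s = k\,Q^{b}, \qquad b > 1
        \quad\Rightarrow\quad
        \frac{Q_s(\alpha Q)}{Q_s(Q)} = \alpha^{b} > \alpha \ \text{ for } \alpha > 1,
        \]

    then any amplification of annual maximum flows raises transport capacity more than proportionally: with b = 2, a 20% increase in peak discharge (α = 1.2) increases capacity at the peak by 44%.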

    MetaBuilder: The Diagrammer’s Diagrammer


    Accelerated SARS-CoV-2 intrahost evolution leading to distinct genotypes during chronic infection

    The chronic infection hypothesis for novel severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) variant emergence is increasingly gaining credence following the appearance of Omicron. Here, we investigate intrahost evolution and genetic diversity of lineage B.1.517 during a SARS-CoV-2 chronic infection lasting for 471 days (and still ongoing) with consistently recovered infectious virus and high viral genome copies. During the infection, we find an accelerated virus evolutionary rate translating to 35 nucleotide substitutions per year, approximately 2-fold higher than the global SARS-CoV-2 evolutionary rate. This intrahost evolution results in the emergence and persistence of at least three genetically distinct genotypes, suggesting the establishment of spatially structured viral populations that continually reseed different genotypes into the nasopharynx. Finally, we track the temporal dynamics of genetic diversity to identify advantageous mutations and highlight hallmark changes for chronic infection. Our findings demonstrate that untreated chronic infections accelerate SARS-CoV-2 evolution, providing an opportunity for the emergence of genetically divergent variants.
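
    As a quick arithmetic check of how a per-year rate relates to the 471-day observation window, here is a minimal Python sketch; the substitution count of 45 is back-calculated from the abstract's quoted rate, not a figure from the paper:

        # Convert an observed substitution count over an infection window
        # into a per-year rate.
        DAYS_PER_YEAR = 365.25

        def per_year_rate(substitutions: float, days: float) -> float:
            return substitutions / (days / DAYS_PER_YEAR)

        # ~45 substitutions over 471 days corresponds to ~35 substitutions/year,
        # the intrahost rate quoted in the abstract (count is illustrative).
        print(f"{per_year_rate(45, 471):.1f} subs/year")  # -> 34.9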

    Performance of reconstruction and identification of τ leptons decaying to hadrons and ντ in pp collisions at √s=13 TeV

    The algorithm developed by the CMS Collaboration to reconstruct and identify τ leptons produced in proton-proton collisions at √s=7 and 8 TeV, via their decays to hadrons and a neutrino, has been significantly improved. The changes include a revised reconstruction of π⁰ candidates, and improvements in multivariate discriminants to separate τ leptons from jets and electrons. The algorithm is extended to reconstruct τ leptons in highly Lorentz-boosted pair production, and in the high-level trigger. The performance of the algorithm is studied using proton-proton collisions recorded during 2016 at √s=13 TeV, corresponding to an integrated luminosity of 35.9 fb⁻¹. The performance is evaluated in terms of the efficiency for a genuine τ lepton to pass the identification criteria and of the probabilities for jets, electrons, and muons to be misidentified as τ leptons. The results are found to be very close to those expected from Monte Carlo simulation.
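
    The two performance measures named in the abstract have standard schematic definitions (notation here is generic, not the CMS paper's exact convention):

        \[
        \varepsilon_{\tau} = \frac{N(\text{genuine } \tau \text{ passing identification})}{N(\text{genuine } \tau)},
        \qquad
        P^{x}_{\text{misid}} = \frac{N(x \text{ identified as } \tau)}{N(x)},
        \quad x \in \{\text{jet},\, e,\, \mu\}.
        \]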

    An embedding technique to determine ττ backgrounds in proton-proton collision data


    Search for heavy resonances decaying to a top quark and a bottom quark in the lepton+jets final state in proton–proton collisions at 13 TeV

    A search is presented for narrow heavy resonances decaying to a top quark and a bottom quark using data collected by the CMS experiment at √s = 13 TeV in 2016. The data set analyzed corresponds to an integrated luminosity of 35.9 fb⁻¹. Final states that include a single lepton (e, μ), multiple jets, and missing transverse momentum are analyzed. No evidence is found for the production of a W′ boson, and the production of right-handed W′ bosons is excluded at 95% confidence level for masses up to 3.6 TeV, depending on the scenario considered. Exclusion limits for W′ bosons are also presented as a function of their coupling strength to left- and right-handed fermions. These limits on a W′ boson decaying via a top and a bottom quark are the most stringent published to date.
