19 research outputs found

    Node localisation in wireless ad hoc networks

    Wireless ad hoc networks often require a method for estimating their nodes' locations. Typically this is achieved using pair-wise measurements between nodes and their neighbours, where a number of nodes already know their locations accurately and the remaining nodes must calculate theirs from these known locations. A minimum mean square estimate (MMSE) or a maximum likelihood estimate (MLE) is usually used to generate the unknown node locations, making use of range estimates derived from measurements between the nodes. In this paper we investigate the efficacy of using radio frequency received signal strength (RSS) measurements for the accurate location of the transmitting nodes over long ranges. We show that, with signal strength measurements from three or more wireless probes in noisy propagation conditions, a weighted MMSE approach can obtain significant improvements in the variance of the location estimate over both the standard MMSE and MLE approaches.
    Jon Arnold, Nigel Bean, Miro Kraetzl, Matthew Roughan
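
    The weighted estimation idea above can be sketched numerically. The Python snippet below is an illustrative assumption rather than the authors' implementation: it estimates an unknown node position from noisy range estimates to three anchor probes by solving a weighted nonlinear least-squares problem with Gauss-Newton iterations, weighting each residual by the inverse of its assumed range variance.

import numpy as np

def weighted_mmse_locate(anchors, ranges, variances, iters=20):
    """Gauss-Newton solution of a weighted least-squares localisation problem.

    anchors   : (N, 2) known probe positions
    ranges    : (N,)   noisy range estimates derived from RSS
    variances : (N,)   assumed variance of each range estimate (the weights)
    """
    x = anchors.mean(axis=0)          # crude initial guess: centroid of the anchors
    w = 1.0 / np.asarray(variances)   # inverse-variance weights
    for _ in range(iters):
        d = np.linalg.norm(anchors - x, axis=1)   # predicted ranges from current estimate
        r = ranges - d                            # range residuals
        J = (x - anchors) / d[:, None]            # Jacobian of predicted ranges w.r.t. x
        JTW = J.T * w
        dx = np.linalg.solve(JTW @ J, JTW @ r)    # weighted normal equations
        x = x + dx
        if np.linalg.norm(dx) < 1e-9:
            break
    return x

# Example: three probes, true node at (40, 25), ranges corrupted by assumed noise levels
rng = np.random.default_rng(0)
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 80.0]])
true_pos = np.array([40.0, 25.0])
sigma = np.array([3.0, 5.0, 2.0])                 # per-probe range noise (assumed)
ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0, sigma)
print(weighted_mmse_locate(anchors, ranges, sigma**2))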

    Topology reconstruction and characterisation of wireless ad hoc networks

    © Copyright 2007 IEEE. Wireless ad hoc networks provide a useful communications infrastructure for the mobile battlefield. In this paper we apply and develop passive radio frequency signal strength monitoring and packet transmission time profiling techniques to characterise and reconstruct an encrypted wireless network's topology. We show that by using signal strength measurements from three or more wireless probes, and by assuming the use of carrier sense multiple access with collision avoidance for physical layer control, we can produce a representation of a wireless network's logical topology and in some cases reconstruct the physical topology. Smoothed Kalman filtering is used to track the reconstructed topology over time and, in conjunction with a weighted least squares template fitting technique, enables the profiling of the individual network nodes and the characterisation of their transmissions. © 2007 Crown Copyright.
    http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?tp=&arnumber=4289257&isnumber=428867
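
    As a rough illustration of the smoothing step mentioned above (a simplified sketch, not the paper's model), the following snippet applies a scalar random-walk Kalman filter followed by a Rauch-Tung-Striebel smoothing pass to a noisy RSS time series for a single inferred link; the process and measurement variances q and r are assumed values.

import numpy as np

def kalman_rts_smooth(z, q=0.01, r=1.0):
    """Scalar random-walk Kalman filter plus RTS smoother for a noisy link metric.

    z : measurements (e.g. per-interval RSS for one inferred link)
    q : assumed process-noise variance; r : assumed measurement-noise variance
    """
    n = len(z)
    xf = np.zeros(n); pf = np.zeros(n)        # filtered mean / variance
    xp = np.zeros(n); pp = np.zeros(n)        # predicted mean / variance
    x, p = z[0], r
    for k in range(n):
        xp[k], pp[k] = x, p + q               # predict (random-walk state model)
        gain = pp[k] / (pp[k] + r)            # Kalman gain
        x = xp[k] + gain * (z[k] - xp[k])     # update with the measurement
        p = (1 - gain) * pp[k]
        xf[k], pf[k] = x, p
    xs = xf.copy()
    for k in range(n - 2, -1, -1):            # backward RTS smoothing pass
        c = pf[k] / pp[k + 1]
        xs[k] = xf[k] + c * (xs[k + 1] - xp[k + 1])
    return xs

# Example: slowly drifting link strength observed through heavy measurement noise
rng = np.random.default_rng(1)
rss = -60 + np.cumsum(rng.normal(0, 0.2, 200)) + rng.normal(0, 2.0, 200)
print(kalman_rts_smooth(rss)[:5])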

    The tree cut and merge algorithm for estimation of network reliability

    This article presents Monte Carlo techniques for estimating network reliability. For highly reliable networks, techniques based on graph evolution models provide very good performance; however, they are known to have significant simulation cost. An existing hybrid scheme (based on partitioning the time space) is available to speed up the simulations, but there are difficulties with optimizing the important parameter associated with this scheme. To overcome these difficulties, a new hybrid scheme (based on partitioning the edge set) is proposed in this article. The proposed scheme shows orders-of-magnitude performance improvements over the existing techniques in certain classes of networks. It also provides reliability bounds with little overhead.
    K.P. Hui, N. Bean, M. Kraetzl and D. Kroese
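
    For context, the crude Monte Carlo baseline that such hybrid schemes improve upon can be written in a few lines. The sketch below is not the Tree Cut and Merge algorithm itself; it estimates two-terminal reliability of a small example graph by repeatedly sampling edge states, which illustrates why more sophisticated schemes are needed for highly reliable networks.

import random
from collections import defaultdict

def crude_mc_reliability(edges, s, t, trials=100_000, seed=0):
    """Crude Monte Carlo estimate of two-terminal reliability.

    edges : list of (u, v, p) where p is the probability the edge is working.
    Returns the estimated probability that s and t remain connected.
    """
    rnd = random.Random(seed)
    hits = 0
    for _ in range(trials):
        adj = defaultdict(list)
        for u, v, p in edges:                 # sample each edge's up/down state
            if rnd.random() < p:
                adj[u].append(v); adj[v].append(u)
        seen, stack = {s}, [s]                # depth-first search for s-t connectivity
        while stack:
            x = stack.pop()
            for y in adj[x]:
                if y not in seen:
                    seen.add(y); stack.append(y)
        hits += t in seen
    return hits / trials

# Small "bridge" network: the crude estimator converges here, but for highly
# reliable networks its relative error grows rapidly, which is the problem the
# graph-evolution and edge-partitioning hybrid schemes address.
edges = [(0, 1, 0.9), (0, 2, 0.9), (1, 2, 0.9), (1, 3, 0.9), (2, 3, 0.9)]
print(crude_mc_reliability(edges, 0, 3))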

    Formation of Re-Aggregated Neonatal Porcine Islet Clusters Improves In Vitro Function and Transplantation Outcome

    Neonatal porcine islet-like cell clusters (NPICCs) are a promising source for islet cell transplantation. Excellent islet quality is important to achieve a cure for type 1 diabetes. We investigated formation of cell clusters from dispersed NPICCs on microwell cell culture plates, evaluated the composition of re-aggregated porcine islets (REPIs), and compared their in vivo function with that of native NPICCs after transplantation into diabetic NOD-SCID IL2rγ−/− (NSG) mice. Dissociation of NPICCs into single cells and re-aggregation resulted in the formation of uniform REPI clusters. A higher prevalence of normoglycemia was observed in diabetic NSG mice after transplantation with a limited number (n = 1500) of REPIs (85.7%) versus NPICCs (n = 1500) (33.3%) (p < 0.05). Transplanted REPIs and NPICCs displayed a similar architecture of endocrine and endothelial cells. Intraperitoneal glucose tolerance tests revealed an improved beta cell function after transplantation of 1500 REPIs (AUC glucose 0–120 min 6260 ± 305.3) as compared to transplantation of 3000 native NPICCs (AUC glucose 0–120 min 8073 ± 536.2) (p < 0.01). Re-aggregation of single cells from dissociated NPICCs generates cell clusters with excellent functionality and improved in vivo function as compared to native NPICCs.

    Use of a cepstral information norm for anomaly detection in a BGP-inferred Internet

    In this paper we use a particular type of mutual information norm, the cepstral information norm, for anomaly detection at the router level in the Internet. We combine the cepstral norm with a state space Kalman filter to define two distance metrics that capture anomalous behaviour. These metrics are implemented using a subspace-based, model-free paradigm to aid real-time analysis. We infer a top-level Internet topology using Border Gateway Protocol router updates and characterise the structural evolution of the network using a selection of graph metrics. Analysis over one week of non-time-homogeneous updates, which includes the SQL Slammer worm event, shows that the combined use of the two cepstral distance metrics detects the occurrence and severity of anomalous network events.
    Belinda A. Chiera, Miro Kraetzl, Matthew Roughan and Langford B. White
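
    A minimal sketch of a cepstral distance of the kind used for such anomaly detection (an assumed, simplified form, not the exact norm or state-space construction in the paper): compute the power cepstrum of two time-series windows of a network metric and compare their leading cepstral coefficients.

import numpy as np

def cepstrum(x, n_coeffs=20):
    """Leading coefficients of the real power cepstrum of a time-series window."""
    spec = np.abs(np.fft.rfft(np.asarray(x, dtype=float))) ** 2 + 1e-12  # power spectrum
    return np.fft.irfft(np.log(spec))[:n_coeffs]                         # inverse FFT of the log spectrum

def cepstral_distance(x, y, n_coeffs=20):
    """Euclidean distance between the leading cepstral coefficients of x and y."""
    return float(np.linalg.norm(cepstrum(x, n_coeffs) - cepstrum(y, n_coeffs)))

# Baseline window versus a window with an injected level-and-variance shift
rng = np.random.default_rng(3)
baseline = rng.normal(0, 1, 512)
anomaly = np.concatenate([rng.normal(0, 1, 256), rng.normal(4, 3, 256)])
print(cepstral_distance(baseline, rng.normal(0, 1, 512)))   # similar statistics: small distance
print(cepstral_distance(baseline, anomaly))                 # injected shift: larger distance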

    On the predictive power of shortest-path weight inference

    Copyright © 2008 ACM. Reverse engineering of the Internet is a valuable activity. Apart from providing scientific insight, the resulting datasets are invaluable in providing realistic network scenarios for other researchers. The Rocketfuel project attempted this process, but it is surprising how little effort has been made to validate its results. This paper concentrates on validating a particular inference methodology used to obtain link weights on a network. There is a basic difficulty in assessing the accuracy of such inferences in that a non-unique set of link weights may produce the same routing, so simple measurements of accuracy (even where ground truth data are available) do not capture the usefulness of a set of inferred weights. We propose a methodology based on predictive power to assess the quality of the weight inference. We used this to test Rocketfuel's algorithm, and our tests suggest that it is reasonably good, particularly on certain topologies, though it has limitations when its underlying assumptions are incorrect.
    Andrew Coyle, Miro Kraetzl, Olaf Maennel and Matthew Roughan
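
    The predictive-power idea can be illustrated roughly as follows (a simplified sketch, not Rocketfuel's or the authors' actual methodology): compute shortest paths under the inferred link weights and count how often they reproduce observed routes that were held out from the inference.

import heapq

def shortest_path(adj, src, dst):
    """Dijkstra shortest path; adj maps node -> list of (neighbour, weight)."""
    dist, prev, heap = {src: 0.0}, {}, [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, u = [dst], dst                      # reconstruct the route from the predecessors
    while u != src:
        u = prev[u]
        path.append(u)
    return path[::-1]

def predictive_power(adj, held_out_routes):
    """Fraction of held-out observed routes reproduced by the inferred weights."""
    hits = sum(shortest_path(adj, r[0], r[-1]) == list(r) for r in held_out_routes)
    return hits / len(held_out_routes)

# Toy topology with assumed inferred weights and two hypothetical held-out routes
adj = {"a": [("b", 1), ("c", 5)], "b": [("c", 1), ("d", 4)], "c": [("d", 1)]}
print(predictive_power(adj, [("a", "b", "c", "d"), ("a", "c", "d")]))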

    Network reliability estimation using the tree cut and merge algorithm with importance sampling

    It is well known that the exact calculation of network reliability is an NP-complete problem, and for large networks estimating the reliability using simulation techniques becomes attractive. For highly reliable networks, a Monte Carlo scheme called the Merge Process is one of the best performing algorithms, but it has a relatively high computational cost per sample. The authors previously proposed a hybrid Monte Carlo scheme, the Tree Cut and Merge algorithm, which can improve simulation performance by over seven orders of magnitude in some heterogeneous networks. In homogeneous networks, however, the performance of the algorithm may degrade. In this paper we first analyse the Tree Cut and Merge algorithm and explain why it does not perform well in some networks. We then propose a modification that subdivides the problem into smaller problems and introduces importance sampling into the simulation process. The modified algorithm addresses the slow convergence problem in those hard cases while keeping the performance improvement in heterogeneous networks. Experiments and results are presented with some discussion.
    K.-P. Hui, N.G. Bean, M. Kraetzl and D. Kroese
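
    The role of importance sampling in this setting can be sketched as follows (an assumed, simplified illustration, not the Tree Cut and Merge algorithm): when edge failures are rare, sampling failures from a biased distribution and correcting each sample with its likelihood ratio keeps the unreliability estimator unbiased while greatly reducing its variance.

import random

def _connected(up_edges, s, t):
    """Union-find check that s and t are joined by the surviving edges."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in up_edges:
        parent[find(u)] = find(v)
    return find(s) == find(t)

def is_unreliability(edges, s, t, q_bias=0.3, trials=50_000, seed=0):
    """Importance-sampling estimate of P(s and t are disconnected).

    edges  : list of (u, v, q), with q the (small) failure probability of the edge
    q_bias : inflated failure probability used as the sampling (proposal) law
    """
    rnd = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        weight, up_edges = 1.0, []
        for u, v, q in edges:
            failed = rnd.random() < q_bias                                  # sample under the biased law
            weight *= (q / q_bias) if failed else ((1 - q) / (1 - q_bias))  # likelihood ratio
            if not failed:
                up_edges.append((u, v))
        if not _connected(up_edges, s, t):                                  # rare event: an s-t cut occurred
            total += weight
    return total / trials

# Bridge network with highly reliable links: crude Monte Carlo would rarely see a
# disconnection at all, while the re-weighted biased samples still give an
# unbiased estimate of the unreliability.
edges = [(0, 1, 0.01), (0, 2, 0.01), (1, 2, 0.01), (1, 3, 0.01), (2, 3, 0.01)]
print(is_unreliability(edges, 0, 3))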

    GSM signalling in prioritised LEO satellite environment
