14 research outputs found

    Horner's Syndrome as a Complication of Ultrasound-Guided Central Cannulation: A Case Report

    Cannulation of the internal jugular vein is often necessary in the management of critically ill patients. Although the procedure is very common and increasingly safe, complications still occur. Horner's Syndrome (HS) is one such complication, described before ultrasound came into use as a method of guidance. HS is caused by functional interruption of the sympathetic nerve supply to the eye, leading to the classic triad of ipsilateral ptosis, miosis, and anhidrosis. We present the case of a patient who required emergent surgery to control a hemorrhagic focus after delivery and who developed transient HS secondary to internal jugular vein cannulation under real-time ultrasound guidance.

    Systematic review of first trimester ultrasound screening in detecting fetal structural anomalies and factors affecting screening performance.

    OBJECTIVES: To determine the sensitivity and specificity of first trimester ultrasound for the detection of fetal abnormalities, and to establish which factors might impact this screening. METHODS: A systematic review and meta-analysis was performed of all relevant publications assessing the diagnostic accuracy of first trimester 2D (transabdominal and transvaginal) ultrasound in the detection of congenital fetal anomalies prior to 14 weeks' gestation. The reference standard used was the detection of abnormalities at birth or postmortem. Factors that may impact detection rates were evaluated, including population characteristics, gestation, healthcare setting, ultrasound modality, use of an anatomical checklist for first trimester anomaly detection, and the types of malformations included in each study. To reduce the impact of study heterogeneity on the results of the meta-analysis, data from the studies were analyzed within subgroups of major anomalies versus all types of anomalies, and low-risk/unselected populations versus high-risk populations. RESULTS: An initial electronic search identified 2,225 citations, from which a total of 30 relevant studies, published between 1991 and 2015, were selected for inclusion. For low-risk or unselected populations (19 studies, 115,731 fetuses) the pooled estimate for detection of major abnormalities was 46.10% (95% C.I. 36.88-55.46). The detection rate for all abnormalities in low-risk or unselected populations was 32.35% (95% C.I. 22.45-43.12) in 14 studies (97,976 fetuses), while the detection rate in high-risk populations for all types of anomalies (six studies, 2,841 fetuses) was 61.18% (95% C.I. 37.71-82.19). Of the factors examined, there was a statistically significant relationship between the use of an anatomical protocol during first trimester anomaly screening and sensitivity for the detection of fetal anomalies in all subgroups (P < 0.0001).
CONCLUSION: Detection rates for first trimester anomalies range from 32% in low-risk groups to over 60% in high-risk groups. This demonstrates that first trimester ultrasound has the potential to identify a large proportion of fetuses affected by structural anomalies. The use of a standardized anatomical protocol improves the sensitivity of first trimester ultrasound screening for all anomalies and major anomalies in populations of varying risk. International protocols with standard anatomical views should be developed and introduced in order to optimize rates of first trimester anomaly detection.
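A pooled estimate like those reported above combines per-study detection counts into a single proportion with a confidence interval. The sketch below uses a simple fixed-effect pooling (total detected over total screened) with a Wilson 95% interval; real meta-analyses of this kind typically use random-effects models, and the study numbers here are invented for illustration.

```python
import math

def pooled_rate(studies):
    """Fixed-effect pooled proportion: total events / total sample.
    `studies` is a list of (detected, total) pairs. This is only a
    rough sketch; published meta-analyses usually use random-effects
    pooling to account for between-study heterogeneity."""
    detected = sum(d for d, n in studies)
    total = sum(n for d, n in studies)
    p = detected / total
    # Wilson score 95% confidence interval for the pooled proportion
    z = 1.96
    denom = 1 + z**2 / total
    centre = (p + z**2 / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return p, centre - half, centre + half

# Hypothetical data: three studies as (anomalies detected, fetuses screened)
rate, lo, hi = pooled_rate([(45, 100), (120, 250), (60, 150)])
```

The width of the interval shrinks with the total pooled sample, which is why the low-risk subgroup above (115,731 fetuses) has a much tighter interval than the high-risk one (2,841 fetuses).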

    Contributions on detection and classification of internet traffic anomalies

    Nowadays it is certain that network traffic is not well behaved: its pattern is always changing. Several causes have been identified for such variations, some of them extrinsic to the traffic itself, such as the interaction between legitimate and illegitimate traffic. From a practical point of view, traffic irregularities, or traffic anomalies, can be described as the result of one or more occurrences that change the normal flow of data over a network. Such occurrences can be triggered by different factors, such as Denial of Service (DoS) attacks, flash crowds or management operations. Because such misbehaviours can lead to a loss of control over the network (security, resource or accuracy issues), the traffic-anomaly domain has in recent years become one of the top research areas, with significant ongoing contributions. For instance, some work has focused on the isolation of network failures; other work has aimed at the statistical prediction of traffic anomalies using mathematical models; still other work has introduced new features to enhance traffic analysis. The Network Anomaly Detection Algorithm (NADA) is an approach that aims to detect, classify and identify traffic anomalies, a network anomaly being an event capable of introducing some level of variation in measurable network data. Such variations are disturbing because they can deviate network operations from their normal behaviour. The execution of NADA and its accuracy are guaranteed by considering three axes of action: multi-criteria, multi-scale and multi-aggregation-level. Together they allow the detection of traffic anomalies in traffic traces, as well as their classification through the definition of traffic profiles, particularly anomaly traffic profiles; the latter form the basis of an anomaly-signature database.
Moreover, the use of those three axes allows an anomaly to be detected independently of the traffic parameters it affects (multi-criteria axis), its duration (multi-scale axis) and its intensity (multi-aggregation-level axis). Hence, anomaly detection and classification form a pair that can be applied in several areas, ranging from network security to traffic engineering or overlay networks, to name a few. Moreover, if IP information on anomalous flows is added to all this knowledge, as NADA does, it becomes possible, with minimal effort, to decide the best actions to take to control damage from anomaly occurrences, i.e. to have a fully functional detection system. [Translated from French:] It is clear today that Internet traffic is far more complex and irregular than expected, which seriously hinders efficient network operation and the guarantee of satisfactory performance and quality of service (QoS) levels. In particular, network behaviour is most degraded when the traffic contains significant anomalies. These anomalies can have various causes, such as Denial of Service (DoS) attacks, flash crowds, or network maintenance and management operations. The detection of anomalies in networks and their traffic has thus become one of the hottest current research topics. The objective of this thesis was therefore to develop new, original methods to detect, classify and identify traffic anomalies. The proposed method relies in particular on searching for significant deviations in traffic statistics with respect to normal traffic. The thesis led to the design and development of the NADA algorithm: Network Anomaly Detection Algorithm.
The originality of NADA, which guarantees its effectiveness, lies in analysing traffic along three axes jointly: a multi-criteria analysis (bytes, packets, flows, ...), a multi-scale analysis, and an analysis over several aggregation levels. Classification then relies on the definition of signatures for traffic anomalies. The use of the three axes of analysis makes it possible to detect anomalies independently of the traffic parameters affected (multi-criteria analysis), their durations (multi-scale analysis), and their intensities (multi-aggregation-level analysis). The detection and classification mechanisms proposed in this thesis can thus be used in various areas of network engineering and operations, such as network security, traffic engineering or overlay networks, to name a few. An important contribution of the thesis concerns the validation and evaluation methodology used for NADA. NADA was validated on a traffic-trace database containing documented anomalies, then evaluated on the main available traffic traces. The results obtained are very good, in particular when compared with those obtained by other anomaly detection tools. Moreover, the quality of the results is independent of the type of traffic analysed and of the type of anomaly. In particular, NADA was shown to detect and classify low-intensity anomalies effectively, which have emerged as essential components of DoS attacks. NADA therefore makes a valuable contribution to network security.
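The core idea described above, detecting deviations from normal traffic statistics, can be sketched generically. The code below is not NADA itself (its actual multi-criteria, multi-scale algorithm is more elaborate); it only illustrates flagging time bins where a single traffic metric deviates from its trailing mean by more than k standard deviations. The traffic series is invented for illustration.

```python
import statistics

def detect_anomalies(series, window=10, k=3.0):
    """Flag indices where a value deviates from the trailing mean by
    more than k standard deviations. A generic deviation-based
    detection sketch, applied to one metric at one aggregation level."""
    flagged = []
    for i in range(window, len(series)):
        ref = series[i - window:i]          # trailing reference window
        mu = statistics.mean(ref)
        sigma = statistics.pstdev(ref) or 1e-9  # avoid division by zero
        if abs(series[i] - mu) / sigma > k:
            flagged.append(i)
    return flagged

# Hypothetical packet counts per second, with a burst at index 15
traffic = [100, 102, 98, 101, 99, 100, 103, 97, 100, 101,
           99, 100, 102, 98, 100, 900, 100, 101]
```

NADA's multi-criteria axis would run this kind of test over several metrics (bytes, packets, flows) at once, and its multi-scale axis over several window sizes, so that both short spikes and slow drifts are caught.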

    Contributions on detection and classification of Internet traffic anomalies

    Doctoral thesis in Informatics Engineering presented to the Faculty of Sciences and Technology of the University of Coimbra.

    Détection, classification et identification d'anomalies de trafic

    [Translated from French:] This article presents a new iterative algorithm, NADA, which detects, classifies and identifies traffic anomalies. Beyond what other algorithms provide, NADA aims to supply all the information required to stop the propagation of anomalies: locating them in time, identifying their class (e.g. denial-of-service attack, network scan, or any other type of anomaly), and determining their attributes, such as the source and destination addresses and ports involved. To do so, NADA relies on a generic tomographic approach that is multi-scale, multi-criteria, and uses multiple aggregation levels. In addition, NADA uses an exhaustive set of anomaly signatures defined specifically to enable the classification of these anomalies. These signatures, represented in graphical form, allow visual classification by network operators. NADA was validated using traffic traces containing known and documented anomalies, such as those collected in the MétroSec project.

    Some Issues raised by DoS Attacks and the TCP/IP Suite

    The Internet is a network of such significance that it is now the target of many attacks. Hackers first targeted users, but in recent years they have also been attacking the network itself. This is why several kinds of attacks have already been studied, classified and resolved. However, this is not true of Denial of Service (DoS) attacks, for which little has been done: the only countermeasures presented so far are effective only for a limited time. The MetroSec project therefore aims to study these poorly understood DoS attacks against the network. For this purpose, MetroSec relies on monitoring and measurement techniques to evaluate the impact of DoS attacks and to discover how they succeed in degrading network QoS. As a first stage in this analysis, this paper presents a classification of DoS attacks associated with the TCP/IP suite and compares it with other taxonomies from the literature, to help us understand the similarities and differences among DoS attacks and the scope of the DoS problem. We expect to obtain the information necessary for devising solutions for fighting DoS.