21 research outputs found

    Qualitative Comparison of Community Detection Algorithms

    Community detection is a very active field in complex network analysis, consisting in identifying groups of nodes that are more densely interconnected among themselves than with the rest of the network. Existing algorithms are usually tested and compared on real-world and artificial networks, their performance being assessed through some partition similarity measure. However, the realism of artificial networks can be questioned, and the appropriateness of those measures is not obvious. In this study, we take advantage of recent advances in the characterization of community structures to tackle these questions. We first generate networks with the most realistic model available to date. Their analysis reveals that they display only some of the properties observed in real-world community structures. We then apply five community detection algorithms to these networks and find that the quantitatively assessed performance does not necessarily agree with a qualitative analysis of the identified communities. It therefore seems that both approaches should be applied to perform a relevant comparison of the algorithms. Comment: DICTAP 2011, The International Conference on Digital Information and Communication Technology and its Applications, Dijon, France (2011)
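    The partition similarity measure mentioned above is commonly the normalized mutual information (NMI) between the planted and the detected community assignments. A minimal, self-contained sketch of NMI over two partitions (the toy labelings are illustrative, not data from the study):

```python
import math
from collections import Counter

def nmi(labels_a, labels_b):
    """Normalized mutual information between two partitions, each given
    as a list of community labels, one per node."""
    n = len(labels_a)
    ca, cb = Counter(labels_a), Counter(labels_b)
    joint = Counter(zip(labels_a, labels_b))
    # Mutual information between the two label distributions
    mi = sum((nij / n) * math.log(nij * n / (ca[a] * cb[b]))
             for (a, b), nij in joint.items())
    # Entropies of each partition, used for normalization
    ha = -sum((c / n) * math.log(c / n) for c in ca.values())
    hb = -sum((c / n) * math.log(c / n) for c in cb.values())
    return mi / math.sqrt(ha * hb) if ha > 0 and hb > 0 else 1.0

# Two labelings that induce the same grouping score 1.0,
# regardless of how the labels themselves are named:
print(nmi([0, 0, 1, 1], [1, 1, 0, 0]))  # 1.0
```

    An NMI of 1 means identical partitions and 0 means independent ones, which is exactly why, as the abstract argues, it can miss qualitative differences in how communities are wrong.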

    IP-Based Mobility Management and Handover Latency Measurement in heterogeneous environments

    One serious concern in ubiquitous networks is seamless vertical handover management between different wireless technologies. To meet this challenge, many standardization organizations have proposed protocols at different layers of the protocol stack. The Internet Engineering Task Force (IETF) has several groups working on mobility at the IP level, enhancing Mobile IPv4 and Mobile IPv6 with variants for seamless handover: HMIPv6 (Hierarchical Mobile IPv6), FMIPv6 (Fast Mobile IPv6) and PMIPv6 (Proxy Mobile IPv6). The IEEE 802.21 standard provides another framework for seamless handover. The 3GPP standard provides the Access Network Discovery and Selection Function (ANDSF) to support seamless handover between 3GPP and non-3GPP networks such as Wi-Fi, considered untrusted, and WiMAX, considered trusted. In this paper, we present an in-depth analysis of seamless vertical handover protocols and a handover latency comparison of the main mobility management approaches in the literature. The comparison shows the advantages and drawbacks of each mechanism, in order to facilitate the adoption of the most suitable one for vertical handover in Next Generation Network (NGN) environments. Keywords: seamless vertical handover, mobility management protocols, IEEE 802.21 MIH, handover latency
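    Handover latency comparisons of this kind usually decompose the total delay into link-layer handover, movement detection, address configuration and binding update phases. The sketch below illustrates such a decomposition; all timing values, and the idea that a given protocol simply skips certain steps, are hypothetical placeholders, not measurements from the paper:

```python
# Illustrative decomposition of vertical-handover latency into its
# classic components (all values in milliseconds are made up).
COMPONENTS = {
    "l2_handover": 50.0,          # link-layer association
    "movement_detection": 100.0,  # e.g. waiting for a router advertisement
    "address_config": 500.0,      # new care-of address + duplicate address detection
    "binding_update": 80.0,       # registration with home agent / mobility anchor
}

def handover_latency(components, skip=()):
    """Total latency; `skip` lists the steps a protocol avoids
    (network-based schemes such as PMIPv6 spare the client
    the address-configuration step, for example)."""
    return sum(v for k, v in components.items() if k not in skip)

mipv6 = handover_latency(COMPONENTS)
pmipv6 = handover_latency(COMPONENTS, skip=("address_config",))
print(mipv6, pmipv6)  # 730.0 230.0
```

    Even with made-up numbers, the structure shows why the variants differ: each one removes or overlaps a subset of these phases rather than speeding up all of them.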

    On color image quality assessment using natural image statistics

    Color distortion can significantly damage perceived visual quality; however, most existing reduced-reference quality measures are designed for grayscale images. In this paper, we consider a basic extension of well-known image-statistics-based quality assessment measures to color images. In order to evaluate the impact of color information on the measures' efficiency, two color spaces are investigated: RGB and CIELAB. Results of an extensive evaluation using the TID2013 benchmark demonstrate that a significant improvement can be achieved for many distortion types when the CIELAB color representation is used
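    Natural-image-statistics measures of this family typically operate on mean-subtracted, contrast-normalized (MSCN) coefficients computed per channel, so extending a grayscale measure to RGB or CIELAB amounts to applying the same transform to each color channel. A minimal sketch of the MSCN transform, using a box window instead of the Gaussian window common in the literature (a simplification; window size and the constant `c` are illustrative):

```python
import numpy as np

def _box_mean(x, r):
    """Local mean over (2r+1) x (2r+1) windows via an integral image
    (returns only the fully covered 'valid' region)."""
    s = np.pad(np.cumsum(np.cumsum(x, axis=0), axis=1), ((1, 0), (1, 0)))
    w = 2 * r + 1
    return (s[w:, w:] - s[:-w, w:] - s[w:, :-w] + s[:-w, :-w]) / w ** 2

def mscn(channel, r=3, c=1.0):
    """Mean-subtracted, contrast-normalized (MSCN) coefficients of one
    image channel, the basic natural-image statistic such measures
    build on."""
    x = channel.astype(np.float64)
    mu = _box_mean(x, r)
    sigma = np.sqrt(np.maximum(_box_mean(x * x, r) - mu ** 2, 0.0))
    return (x[r:-r, r:-r] - mu) / (sigma + c)

# For a color image, mscn() would be applied to each RGB or CIELAB
# channel separately, and the channel statistics compared.
```

    The MSCN coefficients of pristine natural images follow a characteristic distribution; distortion perturbs that distribution, which is what statistics-based measures quantify per channel.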

    Towards European Portuguese Conversational Assistants for Smart Homes

    Nowadays, smart environments such as Smart Homes are becoming a reality, thanks to access to a wide variety of smart devices at low cost. These devices are connected to the home network, and inhabitants can interact with them using smartphones, tablets and smart assistants, a feature of rising popularity. The diversity of devices, users' expectations regarding Smart Homes, and assistants' requirements pose several challenges. In this context, a Smart Home assistant capable of conversation and device integration can be a valuable help to the inhabitants, not only for smart device control, but also for obtaining valuable information and a broader picture of how the house and its devices behave. This paper presents the current stage of development of one such assistant, targeting European Portuguese, which not only supports the control of home devices but also provides a potentially more natural way to access a variety of information about the home and its devices. The development has been carried out within the scope of the Smart Green Homes (SGH) project

    Oscillation-free video adaptation at application layer on server side and experiments using DCCP

    Nowadays, video data transfers account for much of the Internet bandwidth, and a huge number of users stream video daily. However, despite this interest, video streaming is still done in a suboptimal manner. More and more high-definition, high-quality videos are stored on the Internet, but they are not accessible to everybody, because a high and stable bandwidth is needed to stream them; likewise, during videoconferencing, the highest possible quality often exceeds the available bandwidth. Hence, a lower bitrate encoding is usually chosen, but this leads to lower quality and network under-utilization. This paper presents Video Adaptation at Application Layer (VAAL), a simple and efficient method designed to use network resources optimally and to improve the user's video experience. It involves only the application layer on the server side. The main idea of VAAL is to check for buffer overflows of a TCP-friendly transport protocol and adapt the video bitrate accordingly; as a result, the bitrate constantly matches the network bandwidth. It can be used together with the Zigzag Avoidance Algorithm (ZAAL), a novel algorithm that avoids quality oscillations. Experimental results show that video adaptation using VAAL+ZAAL performs much better than the currently widespread static encoding, making it a strong candidate for hard real-time video streaming
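    The adaptation idea in this abstract can be sketched as a control loop: drop the bitrate when the transport buffer overflows, raise it otherwise, and refuse up-switches that were just undone so that quality does not zigzag. The levels, the single-step moves and the anti-oscillation rule below are illustrative, not VAAL/ZAAL's actual algorithm:

```python
# Hedged sketch of sender-side bitrate adaptation in the spirit of
# VAAL + ZAAL (hypothetical levels and rules, not the paper's code).
LEVELS = [250, 500, 1000, 2000, 4000]  # available encodings (kbit/s)

def adapt(level, overflowed, recently_dropped_from):
    """Return the next bitrate index.  `recently_dropped_from` records
    levels we were just forced out of; refusing to re-enter them
    immediately is the anti-zigzag rule."""
    if overflowed and level > 0:
        recently_dropped_from.add(level)
        return level - 1
    if (not overflowed and level + 1 < len(LEVELS)
            and level + 1 not in recently_dropped_from):
        return level + 1
    return level

history = set()
lvl = 2                            # start at 1000 kbit/s
lvl = adapt(lvl, True, history)    # overflow: drop to 500
lvl = adapt(lvl, False, history)   # no overflow, but 1000 just failed: hold
print(LEVELS[lvl])  # 500
```

    In a real system the `recently_dropped_from` memory would age out over time, so the sender can eventually probe higher bitrates again once conditions improve.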

    Enhancing Existing Communication Services with Context Awareness


    ScaleSem (model checking and the Semantic Web)

    The increasing development of networks, and especially of the Internet, has greatly widened the gap between heterogeneous information systems. A review of studies on the interoperability of heterogeneous information systems shows that work in this area concentrates on solving problems of semantic heterogeneity. The W3C (World Wide Web Consortium) proposes standards for representing semantics through ontologies, and ontologies are becoming an indispensable support for the interoperability of information systems, particularly at the semantic level. An ontology combines concepts, properties and relations; this combination is also called a semantic graph. Several languages have been developed in the context of the Semantic Web, most of them using XML (eXtensible Markup Language) syntax. OWL (Web Ontology Language) and RDF (Resource Description Framework), both based on XML, are the most important languages of the Semantic Web. RDF is the first W3C standard for enriching Web resources with detailed descriptions, and it facilitates the automatic processing of Web resources. The descriptions may be characteristics of a resource, such as the author or the content of a website; such descriptions are metadata. Enriching the Web with metadata enables the development of the so-called Semantic Web. RDF is also used to represent semantic graphs corresponding to specific knowledge models.
    RDF files are typically stored in a relational database and manipulated with SQL or derived languages such as SPARQL. This solution is well suited to small RDF graphs, but unfortunately not to large ones. These graphs evolve rapidly, and adapting them to change can introduce inconsistencies. Applying changes while maintaining the consistency of a semantic graph is a crucial task that is costly in time and complexity, so an automated process is essential. For large RDF graphs, we propose a new approach based on formal verification, namely model checking. Model checking is a verification technique that explores all possible states of a system; in this way, one can show that a model of a given system satisfies a given property. This thesis contributes a new method for verifying and querying semantic graphs. We propose an approach, named ScaleSem, which transforms semantic graphs into graphs understood by a model checker (the verification tool of the model-checking method). This requires software tools that translate a graph described in one formalism into the same graph (or an adaptation of it) described in another formalism
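    The kind of translation described above can be illustrated on a toy RDF graph: subjects and objects become states of a transition system, predicates become transition labels, and a model checker then explores the resulting state space. The triples and the encoding below are illustrative, not ScaleSem's actual mapping:

```python
# Hedged sketch: recast RDF triples (subject, predicate, object) as a
# transition system suitable for a model checker.  The ontology
# fragment here is a made-up example.
triples = [
    ("Person", "subClassOf", "Agent"),
    ("Author", "subClassOf", "Person"),
    ("Author", "writes", "Document"),
]

def to_transition_system(triples):
    """States are all subjects and objects; each triple becomes a
    labeled transition from its subject to its object."""
    states = sorted({s for s, _, _ in triples} | {o for _, _, o in triples})
    transitions = {(s, o): p for s, p, o in triples}
    return states, transitions

states, transitions = to_transition_system(triples)
print(states)
# ['Agent', 'Author', 'Document', 'Person']
```

    A real encoding would additionally serialize this structure in the model checker's input language so that temporal-logic properties (e.g. reachability along `subClassOf` edges) can be checked mechanically.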