17 research outputs found

    Performance analysis of a new packet trace compressor based on TCP flow clustering

    Get PDF
    In this paper we study the properties of a new packet trace compression method based on clustering of TCP flows. With our proposed method, the compression ratio that we achieve is around 3%, reducing the file size, for instance, from 100 MB to 3 MB. Although this specification defines a lossy compressed data format, it preserves important statistical properties present in the original trace. To validate the method, memory performance studies were carried out with the Radix Tree algorithm executing a trace generated by our method. To support these studies, measurements were taken of memory accesses and the cache miss ratio. So far, the results have shown that our proposed method provides a good solution for packet trace compression.
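
    As a purely illustrative sketch of the clustering idea, and not the authors' implementation, the Python fragment below groups packets by their TCP 5-tuple and keeps only per-flow summary statistics (packet and byte counts, duration) instead of every packet; the record fields and names are hypothetical.

    # Illustrative sketch only: collapse a packet trace into per-flow records
    # keyed by the TCP 5-tuple. Field names and the record layout are hypothetical.
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class FlowSummary:
        packets: int = 0
        bytes: int = 0
        first_ts: float = float("inf")
        last_ts: float = 0.0

    def summarise(trace):
        """trace: iterable of dicts with src, dst, sport, dport, ts, length."""
        flows = defaultdict(FlowSummary)
        for p in trace:
            key = (p["src"], p["dst"], p["sport"], p["dport"], "TCP")
            f = flows[key]
            f.packets += 1
            f.bytes += p["length"]
            f.first_ts = min(f.first_ts, p["ts"])
            f.last_ts = max(f.last_ts, p["ts"])
        return flows

    # Three packets of the same flow collapse into one record that preserves
    # aggregate statistics (count, volume, duration) but drops the payloads.
    trace = [
        {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 4321, "dport": 80, "ts": 0.00, "length": 60},
        {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 4321, "dport": 80, "ts": 0.01, "length": 1500},
        {"src": "10.0.0.1", "dst": "10.0.0.2", "sport": 4321, "dport": 80, "ts": 0.02, "length": 1500},
    ]
    print(dict(summarise(trace)))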

    An Intelligent Multicriteria Model for Diagnosing Dementia in People Infected with Human Immunodeficiency Virus

    Get PDF
    Hybrid models to detect dementia based on Machine Learning can provide accurate diagnoses in individuals with neurological disorders and cognitive complications caused by Human Immunodeficiency Virus (HIV) infection. This study proposes a hybrid approach that uses Machine Learning algorithms associated with the multicriteria method of Verbal Decision Analysis (VDA). Dementia, which affects many HIV-infected individuals, refers to neurocognitive and mental disorders. Several manuals standardize the information used in the correct detection of neurological disorders with cognitive complications. Among the most commonly used are the DSM-5 (Diagnostic and Statistical Manual of Mental Disorders, 5th edition), published by the American Psychiatric Association, and the International Classification of Diseases, 10th edition (ICD-10), published by the World Health Organization (WHO). The model is designed to explore the predictive power of specific data. Furthermore, a well-defined data set improves and optimizes the diagnostic models sought in this research.

    Hybrid model for early identification post-Covid-19 sequelae

    Get PDF
    Artificial Intelligence techniques based on Machine Learning algorithms, Neural Networks and Naïve Bayes can optimise the diagnostic process for SARS-CoV-2 infection (Covid-19). The most significant contribution of these techniques is in analysing the data recorded by health professionals when treating patients with this disease. A more specific focus from health professionals is needed because the number of observable signs and symptoms is small, with presentations ranging from an acute respiratory condition to severe pneumonia, which calls for efficient attribute engineering. It is important to note that the clinical picture can vary from asymptomatic to extremely severe. About 80% of patients with Covid-19 may be asymptomatic or have few symptoms. Approximately 20% of detected cases require hospital care because they have difficulty breathing, of which about 5% may require ventilatory support in the Intensive Care Unit. The present study proposes a hybrid model that combines Artificial Intelligence techniques, using Machine Learning algorithms, with multicriteria decision-support methods based on the Verbal Decision Analysis methodology, aiming at knowledge discovery and at exploring the predictive power of specific data in order to optimise diagnostic models for Covid-19. Thus, the model will provide greater accuracy for the diagnosis sought through clinical observation.
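
    As a purely illustrative sketch, and not the model described in the study, the fragment below trains a tiny Bernoulli Naïve Bayes classifier over binary symptom attributes, the kind of signal such a hybrid model might extract from clinical records; the symptom names, toy records and labels are all hypothetical.

    # Sketch only: a minimal Bernoulli Naive Bayes over binary symptom flags.
    import math
    from collections import Counter

    SYMPTOMS = ["fever", "cough", "dyspnoea", "anosmia"]  # hypothetical attributes

    def train(records):
        """records: list of (symptom_dict, label) with binary values and labels 0/1."""
        counts = {0: Counter(), 1: Counter()}
        totals = Counter(label for _, label in records)
        for symptoms, label in records:
            for s in SYMPTOMS:
                counts[label][s] += symptoms.get(s, 0)
        # Laplace-smoothed conditionals P(symptom present | label)
        probs = {c: {s: (counts[c][s] + 1) / (totals[c] + 2) for s in SYMPTOMS} for c in (0, 1)}
        priors = {c: totals[c] / len(records) for c in (0, 1)}
        return priors, probs

    def posterior_positive(priors, probs, symptoms):
        log_odds = math.log(priors[1] / priors[0])
        for s in SYMPTOMS:
            present = symptoms.get(s, 0)
            p1, p0 = probs[1][s], probs[0][s]
            log_odds += math.log((p1 if present else 1 - p1) / (p0 if present else 1 - p0))
        return 1 / (1 + math.exp(-log_odds))

    # Toy, fabricated records purely for illustration (1 = Covid-19 suspected).
    data = [({"fever": 1, "cough": 1, "anosmia": 1}, 1), ({"fever": 1, "dyspnoea": 1}, 1),
            ({"cough": 1}, 0), ({}, 0)]
    priors, probs = train(data)
    print(round(posterior_positive(priors, probs, {"fever": 1, "anosmia": 1}), 3))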

    External control of Information Technology governance

    No full text
    The growing computerization of the Brazilian Public Administration contributes to greater agility and quality in the public services provided to society and a consequent increase in the transparency of government actions. On the other hand, spending on investments in and maintenance of Information Technology (IT) resources has been rising considerably, and institutions have become strongly dependent on their computerized systems and on the security of their databases. With the increase in the strategic importance of the IT area, governance models have been applied with the objective of making the area controllable, with measurable results, and oriented towards the institution's business objectives. The main function of IT auditing is to evaluate the management process in its various aspects, such as corporate governance, IT risk management and procedures for adherence to regulatory standards, pointing out possible deviations and vulnerabilities, as well as offering alternative solutions to these problems. In the scope of external control, the Courts of Accounts are beginning to recognize the need to establish specialized units for carrying out IT audits. In this context, the present work presents the approaches used in this area of oversight, highlighting IT Governance as an important instrument for external control in overseeing the management and use of Information Technology in the Public Administration.

    Which resources can contribute to optimizing learning in a virtual environment

    No full text
    We are implementing a distance education course on Dynamic Geometry, using the software Cabri-Géomètre II for Windows, to train public school teachers in the content of Plane Euclidean Geometry. Beyond teacher training, the course aims to build a tool for developing didactic activities in mathematics using Internet resources and educational and multimedia software. The present work reports on an experiment carried out with students at the Laboratório Multimeios FACED/UFC, in which we sought to verify experimentally how audio and video communication and the sharing of a common workspace effectively contribute to significantly expanding the possibilities for learning, mediation and collaboration among peers.

    Using QoC for improving energy-efficient context management in U-Health Systems

    No full text
    A Context Management Framework (CMF) for Ubiquitous Health (U-Health) Systems should be able to continuously gather raw data from observed entities in order to characterize their current situation (context). However, the death of battery-dependent sensors reduces their ability to detect the context, which directly affects the availability of context-aware u-health services. This paper proposes the use of Quality of Context (QoC) integrated with a data reduction approach to minimize the amount of sensed raw data sent to the CMF, reducing energy consumption and maximizing the lifetime of the sensor-based CMF. The proposed approach rebuilds the gathered raw data taking QoC requirements into account, avoiding loss of precision (QoC indicator precision) and timeliness (QoC indicator up-to-dateness), and has been integrated into our Context Management Framework (CxtMF). Experimental results demonstrate the effectiveness of our approach, reducing the number of packets sent over the network to 3% for the ECG monitoring service.
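
    A minimal sketch of one common data-reduction rule that matches the two QoC indicators mentioned, precision and up-to-dateness: a sensor-side filter forwards a new sample only when it deviates from the last transmitted value by more than a precision tolerance or when the maximum staleness interval has elapsed. The class name and thresholds below are hypothetical, not taken from CxtMF.

    # Sketch only (not the CxtMF implementation): suppress samples that violate
    # neither the precision tolerance nor the up-to-dateness deadline.
    class QoCReducer:
        def __init__(self, precision, up_to_dateness):
            self.precision = precision            # max tolerated deviation, in the reading's unit
            self.up_to_dateness = up_to_dateness  # max seconds between transmitted samples
            self.last_value = None
            self.last_sent_at = None

        def should_send(self, value, timestamp):
            if self.last_value is None:
                send = True
            else:
                stale = (timestamp - self.last_sent_at) >= self.up_to_dateness
                drifted = abs(value - self.last_value) > self.precision
                send = stale or drifted
            if send:
                self.last_value, self.last_sent_at = value, timestamp
            return send

    # Toy usage: only readings that drift or become stale reach the CMF.
    reducer = QoCReducer(precision=0.1, up_to_dateness=2.0)
    samples = [(0.80, 0.0), (0.82, 0.5), (1.00, 1.0), (1.01, 3.5)]
    print([s for s in samples if reducer.should_send(*s)])  # keeps the 1st, 3rd and 4th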

    Infrastructure for Integration of Legacy Electrical Equipment into a Smart-Grid Using Wireless Sensor Networks

    No full text
    At present, the standardisation of electrical equipment communications is on the rise. In particular, manufacturers are releasing equipment for the smart grid endowed with communication protocols such as DNP3, IEC 61850, and MODBUS. However, there is legacy equipment operating in the electricity distribution network that cannot communicate using any of these protocols. Thus, we propose an infrastructure that allows the integration of legacy electrical equipment into smart grids by using wireless sensor networks (WSNs). In this infrastructure, each legacy electrical device is connected to a sensor node, and the sink node runs a middleware that enables the integration of this device into a smart grid based on suitable communication protocols. This middleware performs tasks such as the translation of messages between the power substation control centre (PSCC) and electrical equipment in the smart grid. Moreover, the infrastructure satisfies certain requirements for communication between the electrical equipment and the PSCC, such as enhanced security, short response time, and automatic configuration. The paper’s contributions include a solution that enables electrical companies to integrate their legacy equipment into smart-grid networks relying on any of the above-mentioned communication protocols. This integration will reduce the costs related to the modernisation of power substations.
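
    As a sketch of the translation idea only, the fragment below shows a sink-node middleware mapping a generic register-read request from the control centre onto a WSN query addressed to the sensor node wired to a legacy device, and mapping the reply back. None of the message formats shown is real DNP3, IEC 61850 or MODBUS framing; all names are hypothetical.

    # Sketch only: protocol-agnostic request/response translation at the sink node.
    from dataclasses import dataclass

    @dataclass
    class PsccRequest:        # hypothetical, protocol-agnostic read request
        device_id: str
        register: int

    @dataclass
    class WsnMessage:         # hypothetical payload sent into the sensor network
        node_addr: int
        command: str
        register: int

    class SinkMiddleware:
        def __init__(self, device_to_node):
            self.device_to_node = device_to_node  # legacy device id -> sensor node address

        def to_wsn(self, req):
            return WsnMessage(self.device_to_node[req.device_id], "READ", req.register)

        def to_pscc(self, req, raw_value):
            # A real middleware would encode this in whichever protocol the PSCC speaks.
            return {"device": req.device_id, "register": req.register, "value": raw_value}

    mw = SinkMiddleware({"recloser-07": 0x2A})
    req = PsccRequest("recloser-07", register=40001)
    print(mw.to_wsn(req))
    print(mw.to_pscc(req, raw_value=132))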

    A Middleware for the Integration of Smart Grid Elements with WSN Based Solutions

    No full text
    Currently, electricity distributors make use of various types of equipment divided into levels of automation. This automation enables the integration of elements such as Intelligent Electronic Devices (IEDs) into the supervision of the electrical distribution system, but there is no appropriate environment for increasing the scale of these elements. In this context, the smart grid provides specifications that allow new elements to be added to the intelligent operation of the power grid. However, the cost of communication is still an impediment to scaling up the integration of these elements into the current structure. In this paper, we propose a middleware that optimizes the communication required for this integration using wireless sensor networks (WSNs). The goal is to ensure a gradual integration of new elements, taking advantage of the growth in the number of sensor nodes in the network that follows from the scalability of the system itself. Protocol conversion solutions are used to allow easy communication between the WSN and the smart grid system, and data aggregation and compression techniques are applied to increase the lifetime of the wireless sensor network.
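
    As a small sketch of the two WSN-side techniques named above, aggregation and compression: readings are averaged over a fixed window and the batched payload is compressed with zlib before leaving the network. The window size and the JSON record format are hypothetical choices, not the middleware's actual encoding.

    # Sketch only: window-mean aggregation followed by zlib compression.
    import json
    import zlib

    def aggregate(readings, window=10):
        """Collapse every `window` consecutive readings into their mean."""
        return [sum(readings[i:i + window]) / len(readings[i:i + window])
                for i in range(0, len(readings), window)]

    def pack(aggregated):
        payload = json.dumps(aggregated).encode("utf-8")
        return zlib.compress(payload)

    readings = [220.0 + 0.1 * (i % 7) for i in range(100)]   # toy voltage samples
    packet = pack(aggregate(readings))
    print(len(json.dumps(readings).encode("utf-8")), "bytes raw ->", len(packet), "bytes sent")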