
    Implementation of Secure Log Management Over Cloud

    Log records contain important information about the activities of systems, applications, and networks, and each type of log has its own fields and syntax. Logs are generated automatically as users act on a system, in applications such as Google Chrome, or on a network. Organizations need them for future reference: to identify and diagnose problems, to record events, to measure performance, and to investigate malicious activity in systems, networks, and applications. Logs must therefore be protected from attackers, and organizations should maintain their integrity, confidentiality, and security. Retaining logs for long periods is also expensive for organizations. We therefore developed a secure log management system over the cloud that reduces cost while protecting logs from attackers. To achieve this, log records are encrypted with the Blowfish algorithm, a SHA-1 digest protects the records during transmission, and Shamir's Secret Sharing algorithm provides security at the end points. DOI: 10.17762/ijritcc2321-8169.150511
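
    The abstract does not include implementation details; the sketch below is only an illustration of how these three primitives could be combined in Python, assuming the pycryptodome package, and is not the authors' code. Here the SHA-1 digest acts as a tamper check on the transmitted ciphertext, and Shamir's scheme splits the encryption key into shares.

```python
# Illustrative sketch only -- not the authors' implementation.
# Assumes the third-party package pycryptodome is installed.
from Crypto.Cipher import Blowfish
from Crypto.Hash import SHA1
from Crypto.Protocol.SecretSharing import Shamir
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

log_record = b"2024-01-01T12:00:00Z sshd[811]: Failed password for root"

# 1. Encrypt the log record with Blowfish (CBC mode, random key and IV).
key = get_random_bytes(16)                      # 16-byte key (Blowfish allows 4-56 bytes)
cipher = Blowfish.new(key, Blowfish.MODE_CBC)
ciphertext = cipher.iv + cipher.encrypt(pad(log_record, Blowfish.block_size))

# 2. SHA-1 digest of the ciphertext, sent alongside it so the receiver
#    can detect tampering in transit.
digest = SHA1.new(ciphertext).hexdigest()

# 3. Split the 16-byte key into 5 shares, any 3 of which reconstruct it.
shares = Shamir.split(3, 5, key)

# Later: any 3 shares recover the key for decryption.
recovered = Shamir.combine(shares[:3])
assert recovered == key
```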

    Development of a centralized log management system

    Logs are a crucial piece of any system, giving a helpful insight into what it is doing and into what happened in case of failure. Every process running on a system generates logs in some format, and these logs are generally written to local storage. As systems evolved, the number of logs to analyze grew, and with it the need for a standardized log format that minimizes dependencies and simplifies analysis. ams is a company that develops and creates sensor solutions. With twenty-two design centers and three manufacturing locations, the company serves over eight thousand clients worldwide. One design center, located in Funchal, includes a team of application engineers who design and develop software applications for internal clients. Their development process involves several applications and programs, each with its own logging system, and the log entries generated by different applications are kept in separate storage systems. A developer or administrator who wants to troubleshoot an issue spanning several applications has to visit the different storage locations, collect the logs, and correlate them across requests to understand the problem. This is a tiresome process, and if the environment is auto-scaled, troubleshooting such an issue becomes infeasible. This project aimed to solve these problems by creating a Centralized Log Management System capable of handling logs from a variety of sources, as well as providing services that help developers and administrators better understand the affected environments. The solution was built with a set of open-source technologies, including the Elastic Stack (Elasticsearch, Logstash and Kibana), Node.js, GraphQL and Cassandra. This document describes the process and the decisions taken to arrive at that solution.
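
    As a rough illustration of the ingestion side of such a centralized system (not the project's actual code), a client can push uniformly structured log entries into Elasticsearch through its REST API; the endpoint URL, index name and field layout below are assumptions.

```python
# Minimal sketch of shipping a structured log entry to Elasticsearch.
# The endpoint, index name and document fields are illustrative assumptions.
import datetime
import requests

ELASTICSEARCH_URL = "http://localhost:9200"   # assumed local Elasticsearch node
INDEX = "app-logs"                            # hypothetical index name

def ship_log(application: str, level: str, message: str) -> None:
    """Send one log entry, in a uniform format, to the central store."""
    doc = {
        "@timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "application": application,
        "level": level,
        "message": message,
    }
    resp = requests.post(f"{ELASTICSEARCH_URL}/{INDEX}/_doc", json=doc, timeout=5)
    resp.raise_for_status()

ship_log("billing-service", "ERROR", "payment gateway timeout")
```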

    Blockchain analytics - real-time log management in healthcare

    With the growing dependence on information technologies in healthcare organizations, it is crucial to ensure the security of data given its sensitivity. To meet this requirement, a system was developed consisting of a private blockchain network designed to record hospital logs and an interface that facilitates data analysis. This article covers the development of the private blockchain solution, built with Hyperledger technologies and Socket.IO, together with the web technologies used to interface the real-time logs. The objective of the solution is to allow logs to be visualized in real time, with analysis methods applied to them. FCT - Fundação para a Ciência e a Tecnologia (UIDB/00319/2020)
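
    The article's code is not reproduced here; the following is a minimal sketch of the real-time interface side only, assuming the python-socketio and eventlet packages, with the Hyperledger part omitted and the event name chosen purely for illustration.

```python
# Sketch of pushing log events to a browser dashboard in real time.
# Uses python-socketio and eventlet; the event name is an illustrative
# assumption and the Hyperledger Fabric integration is deliberately omitted.
import eventlet
import socketio

sio = socketio.Server(cors_allowed_origins="*")
app = socketio.WSGIApp(sio)

@sio.event
def connect(sid, environ):
    print("dashboard client connected:", sid)

def publish_log(entry: dict) -> None:
    """Broadcast one hospital log entry to every connected dashboard."""
    sio.emit("log_entry", entry)

if __name__ == "__main__":
    # Serve on port 5000; in the real system, events would arrive from the
    # blockchain network rather than being emitted manually.
    eventlet.wsgi.server(eventlet.listen(("", 5000)), app)
```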

    Evaluation of standard monitoring tools (including log analysis) for control systems at CERN

    Project Specification: The goal of this Openlab Summer Student project was to assess the implications and benefits of integrating two standard IT tools, Icinga and Splunkstorm, with the existing production setup for monitoring and managing control systems at CERN. Icinga, an open-source monitoring tool based on Nagios, would need to be integrated with MOON, an in-house WinCC OA application currently used to monitor and manage all the components that make up the control systems. Splunkstorm, an online data analysis and log management application, would be used stand-alone, so it required no integration with other software, only an understanding of its features and installation procedure. Abstract: This document provides insights into the installation procedures, key features, functionality, and projected implementation effort of the Icinga and Splunkstorm tools. The focus is on presenting the most feasible implementation paths that surfaced once both tools were well understood.

    On Security Log Management Systems

    A log management system (LMS) is a system for creating, receiving, processing, releasing, and transferring security log data. Its main objectives include detecting and preventing unauthorised access and abuse, and meeting regulatory requirements. One of its main components is the classification of events, used to make decisions about archiving and about invoking responses to certain events. Most current approaches to LMS design are system-dependent and involve specific hardware (e.g., firewalls, servers) and commercial software. This paper presents a theoretical framework for LMS in terms of a flow-based conceptual model, with an emphasis on security-related events. The framework comprises four separate flow systems: an active system, a log system, an alarm system, and a response system, each composed of five stages: receiving, processing, creating, releasing, and transferring. The experimental part of the paper concentrates on log analysis in the processing stage of the log system: actual log entries are selected and classified according to these five stages.
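
    The paper's actual classification criteria are not given in the abstract; the sketch below merely illustrates the idea of assigning log entries to the five stages, using hypothetical keyword rules that are not taken from the paper.

```python
# Hypothetical sketch of stage classification for log entries.
# The keyword rules below are illustrative assumptions; the paper's actual
# classification criteria are not reproduced here.
STAGE_KEYWORDS = {
    "receiving":    ("accepted", "incoming", "received"),
    "processing":   ("parsed", "filtered", "analyzed"),
    "creating":     ("created", "generated", "new event"),
    "releasing":    ("released", "published"),
    "transferring": ("sent", "forwarded", "transferred"),
}

def classify(entry: str) -> str:
    """Assign a log entry to one of the five flow stages (or 'unclassified')."""
    lowered = entry.lower()
    for stage, keywords in STAGE_KEYWORDS.items():
        if any(word in lowered for word in keywords):
            return stage
    return "unclassified"

print(classify("2023-04-02 firewall: packet received from 10.0.0.7"))  # receiving
print(classify("2023-04-02 syslog: event forwarded to archive host"))  # transferring
```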

    Testing goGPS low-cost RTK positioning with a web-based track log management system

    Location-based online collaborative platforms are proving to be an effective and widely adopted solution for collecting, updating and sharing geospatial data. Popular collaborative projects such as OpenStreetMap, Wikimapia and other services that collect and publish user-generated geographic content have been fostered by the increasing availability of location-aware palmtop devices. These include GPS-enabled mobile phones and low-cost GPS receivers, which are employed for quick field surveys at both professional and non-professional levels. Nevertheless, data collected with such devices are often not accurate enough to be used or shared without heavy user intervention. Providing tools for collecting and sharing accuracy-enhanced positioning data to a wide and diverse user base requires integrating modern web technologies and online services with advanced satellite positioning techniques. A web-based prototype system for enhancing GPS track quality and managing track logs and points of interest (POI), originally developed for standard GPS devices, was tested using the goGPS software to apply kinematic relative positioning (RTK) with low-cost single-frequency receivers. The workflow consists of acquiring raw GPS measurements from the user receiver and from a network of permanent GPS stations, processing them by RTK positioning with the goGPS Kalman filter algorithm, sending the accurate positioning data to the web-based system, performing further quality enhancements if needed, logging the data and displaying them. The whole system can work either in real time or in post-processing, the latter providing a way to collect and publish enhanced location data without requiring a mobile Internet connection in the field. Tests were performed in open areas and in variously dense urban environments, comparing different indices for quality-based filtering. The results are promising and suggest that integrating web technologies with advanced geodetic techniques applied to low-cost instruments can be an effective solution for collecting, updating and sharing accurate location data on collaborative platforms.
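
    The indices used for quality-based filtering are not named in the abstract; the sketch below only illustrates the general idea, using an assumed per-point horizontal accuracy estimate and an arbitrary threshold rather than goGPS output.

```python
# Illustrative sketch of quality-based filtering of track points.
# The quality index (per-point horizontal standard deviation) and the
# threshold value are assumptions, not values produced by goGPS.
from dataclasses import dataclass

@dataclass
class TrackPoint:
    lat: float          # latitude in degrees
    lon: float          # longitude in degrees
    h_sigma: float      # estimated horizontal std. dev. in metres (assumed index)

def filter_track(points, max_sigma=0.5):
    """Keep only points whose estimated accuracy is within the threshold."""
    return [p for p in points if p.h_sigma <= max_sigma]

track = [
    TrackPoint(45.478, 9.227, 0.12),
    TrackPoint(45.479, 9.228, 1.80),   # discarded: too uncertain
    TrackPoint(45.480, 9.229, 0.30),
]
print(len(filter_track(track)))  # 2
```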

    Centralized log management with HP ArcSight

    The purpose of this Bachelor's thesis was to study, for Ymon Oy, how a centralized log management system is built with the Hewlett-Packard ArcSight product family and how well the objectives of good log management are met. The aspects of information security are confidentiality, integrity, non-repudiation and availability. Log management is an essential component of each of these four areas and significantly improves their realization in the chosen environment. Almost without exception, every device in a communication network produces log information, which gives a precise picture of an event, its time and, when authentication is used, its author. Gathering log information into a centralized log management system and normalizing the different logs into one common format gives an overview of the events across the whole system; the required information can be retrieved far more easily than from scattered systems with differing log formats. These requirements are particularly important for organizations that handle confidential information, such as credit card companies. The thesis resulted in a study that assesses how guidelines and regulations, the VAHTI log guide in particular, affect the implementation of log management. The outcome also includes a simplified model of how a simple, fault-tolerant centralized log management system can be implemented with HP ArcSight log management products, covering the basics of planning, design and implementation, and is especially helpful for those looking for information about the kinds of problems that may appear at various stages of building such a system.
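
    The thesis configuration itself is not quoted here; as a small illustration of the normalization step it describes, the sketch below formats a parsed event as a Common Event Format (CEF) line, the event format consumed by ArcSight connectors. The vendor, product and sample values are assumptions.

```python
# Sketch of normalizing a parsed log event into a CEF line
# (Common Event Format, the event format consumed by ArcSight).
# Vendor/product names and the sample event are illustrative assumptions.
def to_cef(event: dict) -> str:
    header = "|".join([
        "CEF:0",                           # CEF version
        event.get("vendor", "Acme"),       # device vendor (assumed)
        event.get("product", "Firewall"),  # device product (assumed)
        event.get("version", "1.0"),
        event.get("signature_id", "100"),
        event.get("name", "event"),
        str(event.get("severity", 5)),
    ])
    extension = " ".join(f"{k}={v}" for k, v in event.get("fields", {}).items())
    return f"{header}|{extension}"

sample = {
    "signature_id": "100",
    "name": "connection denied",
    "severity": 7,
    "fields": {"src": "10.0.0.5", "dst": "192.168.1.2", "msg": "blocked by policy"},
}
print(to_cef(sample))
# CEF:0|Acme|Firewall|1.0|100|connection denied|7|src=10.0.0.5 dst=192.168.1.2 msg=blocked by policy
```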

    The Making of Cloud Applications: An Empirical Study on Software Development for the Cloud

    Cloud computing is gaining more and more traction as a deployment and provisioning model for software. While a large body of research already covers how to optimally operate a cloud system, we still lack insights into how professional software engineers actually use clouds, and how the cloud impacts development practices. This paper reports on the first systematic study of how software developers build applications in the cloud. We conducted a mixed-method study consisting of qualitative interviews with 25 professional developers and a quantitative survey with 294 responses. Our results show that adopting the cloud has a profound impact throughout the software development process, as well as on how developers utilize tools and data in their daily work. Among other things, we found that (1) developers need better means to anticipate runtime problems and rigorously define metrics for improved fault localization, and (2) the cloud offers an abundance of operational data, yet developers still often rely on their experience and intuition rather than utilizing metrics. From our findings, we extracted a set of guidelines for cloud development and identified challenges for researchers and tool vendors.