
    XML Schema-based Minification for Communication of Security Information and Event Management (SIEM) Systems in Cloud Environments

    XML-based communication governs most of today's systems communication because XML can represent complex structural and hierarchical data. However, XML documents carry considerable structural overhead that can be reduced to lower bandwidth usage, shorten transmission time, and improve performance, leading to more efficient resource usage; in cloud environments, this directly affects the amount of money the consumer pays. Several techniques are used to achieve this goal. This paper discusses these techniques and proposes a new XML Schema-based Minification technique. The proposed technique reduces the XML structure through minification while maintaining a separation between the meaningful names and the underlying minified names, which preserves software/code readability. The technique is applied to Intrusion Detection Message Exchange Format (IDMEF) messages exchanged by a Security Information and Event Management (SIEM) system hosted on the Microsoft Azure cloud. Test results show message size reductions ranging from 8.15% to 50.34% on the raw message, without using time-consuming compression techniques. Adding GZip compression to the proposed technique produces messages 66.1% smaller than the original XML messages.
    Comment: XML, JSON, Minification, XML Schema, Cloud, Log, Communication, Compression, XMill, GZip, Code Generation, Code Readability, 9 pages, 12 figures, 5 tables, Journal Article
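    The following minimal Python sketch illustrates the general idea of schema-based tag minification followed by GZip compression; the tag mapping and the IDMEF-like sample message are assumptions for illustration, not the paper's actual schema mapping or implementation.

        # Minimal sketch of schema-based tag minification followed by GZip.
        # The tag mapping and sample message are illustrative assumptions,
        # not the paper's actual IDMEF schema mapping.
        import gzip
        import xml.etree.ElementTree as ET

        TAG_MAP = {"Alert": "A", "Analyzer": "Y", "CreateTime": "T", "Source": "S"}

        def minify(xml_text: str) -> str:
            root = ET.fromstring(xml_text)
            for elem in root.iter():
                elem.tag = TAG_MAP.get(elem.tag, elem.tag)  # keep unknown tags as-is
            return ET.tostring(root, encoding="unicode")

        original = ("<Alert><Analyzer>ids-01</Analyzer>"
                    "<CreateTime>2015-01-01T00:00:00Z</CreateTime>"
                    "<Source>10.0.0.5</Source></Alert>")
        minified = minify(original)

        for label, text in (("original", original), ("minified", minified)):
            raw = text.encode()
            print(label, len(raw), "bytes ->", len(gzip.compress(raw)), "bytes gzipped")

    The meaningful-to-minified mapping lives in a single table, which is where the separation between readable names and wire-format names comes from in this simplified view.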

    spChains: A Declarative Framework for Data Stream Processing in Pervasive Applications

    Pervasive applications rely on increasingly complex streams of sensor data continuously captured from the physical world. Such data is crucial to enable applications to "understand" the current context and to infer the right actions to perform, be they fully automatic or involving some user decisions. However, the continuous nature of such streams, the relatively high throughput at which data is generated, and the number of sensors usually deployed in the environment make direct data handling practically infeasible. Data not only needs to be cleaned, it must also be filtered and aggregated to relieve higher-level algorithms from near real-time handling of such massive data flows. We propose a stream-processing framework (spChains), based upon state-of-the-art stream processing engines, which enables declarative and modular composition of stream-processing chains built atop a set of extensible stream-processing blocks. While the stream-processing blocks are delivered as a standard, yet extensible, library of application-independent processing elements, the chains themselves can be defined by the pervasive-application engineering team. We demonstrate the flexibility and effectiveness of the spChains framework on two real-world applications, in the energy management and industrial plant management domains, by evaluating them on a prototype implementation based on the Esper stream processor.
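    A minimal Python sketch of the chain-composition idea, assuming hypothetical block names (threshold_filter, window_average) and a toy numeric stream; the actual spChains blocks are built on the Esper engine and are not reproduced here.

        # Minimal sketch of a declarative chain of reusable stream-processing blocks.
        # Block names and the chain definition are illustrative assumptions.
        from statistics import mean
        from typing import Callable, Iterable, List

        Block = Callable[[Iterable[float]], Iterable[float]]

        def threshold_filter(limit: float) -> Block:
            def run(stream):
                return (x for x in stream if x <= limit)   # drop out-of-range samples
            return run

        def window_average(size: int) -> Block:
            def run(stream):
                window: List[float] = []
                for x in stream:
                    window.append(x)
                    if len(window) == size:
                        yield mean(window)                 # aggregate per window
                        window.clear()
            return run

        def chain(*blocks: Block) -> Block:
            def run(stream):
                for block in blocks:
                    stream = block(stream)
                return stream
            return run

        # A chain declared once by the engineering team and reused by the application:
        power_chain = chain(threshold_filter(5000.0), window_average(4))
        print(list(power_chain([120.0, 130.0, 9999.0, 125.0, 128.0,
                                122.0, 124.0, 126.0, 127.0])))

    The point of the sketch is the declarative part: the application declares which blocks to wire together, while the blocks themselves remain application-independent library code.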

    Remote real-time monitoring of subsurface landfill gas migration

    The cost of monitoring greenhouse gas emissions from landfill sites is of major concern for regulatory authorities. The current monitoring procedure is recognised as labour intensive, requiring agency inspectors to physically travel to perimeter borehole wells in rough terrain and manually measure gas concentration levels with expensive hand-held instrumentation. In this article, we present a cost-effective and efficient system for remotely monitoring the subsurface migration of methane and carbon dioxide concentration levels at landfill sites. Based purely on an autonomous sensing architecture, the proposed sensing platform was capable of performing complex analytical measurements in situ and successfully communicating the data remotely to a cloud database. A web tool was developed to present the sensed data to relevant stakeholders. We report our experiences in deploying such an approach in the field over a period of approximately 16 months.
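    A minimal Python sketch of the reporting loop such an autonomous node might run, assuming a hypothetical cloud endpoint, payload fields, and sampling interval; the deployed platform's actual protocol is not described in the abstract.

        # Minimal sketch of an autonomous sensor node pushing CH4/CO2 readings to a
        # cloud endpoint. The endpoint URL, payload fields, and reporting interval
        # are illustrative assumptions, not the deployed platform's actual protocol.
        import json
        import time
        import urllib.request

        ENDPOINT = "https://example-cloud-db.invalid/api/landfill/readings"  # hypothetical

        def read_sensors() -> dict:
            # Placeholder for the in-situ analytical measurement step.
            return {"well_id": "BH-07", "ch4_ppm": 1250.0, "co2_ppm": 3400.0,
                    "timestamp": time.time()}

        def push_reading(reading: dict) -> None:
            req = urllib.request.Request(
                ENDPOINT,
                data=json.dumps(reading).encode(),
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req, timeout=10) as resp:
                resp.read()

        if __name__ == "__main__":
            while True:
                push_reading(read_sensors())
                time.sleep(15 * 60)   # assumed 15-minute reporting interval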

    Temporal and Cross Correlations in Business News

    We empirically investigated temporal and cross correlations in the frequency of news reports on companies, using a unique dataset of more than 100 million news articles reported in English by around 500 press agencies worldwide for the period 2003-2009. Our main findings are as follows. First, the frequency of news reports on a company does not follow a Poisson process; instead, it is characterized by long memory, with a positive autocorrelation for more than a year. Second, there exist significant correlations in the frequency of news across companies. Specifically, on a daily or longer time scale the frequency of news is governed by external dynamics, such as an increase in the number of news reports due to, for example, the outbreak of an economic crisis, while on a time scale of minutes it is governed by internal dynamics. These two findings indicate that the frequency of news on companies has statistical properties similar to those of trading activity in stock markets, measured by trading volume or price volatility, suggesting that the flow of information through company news plays an important role in stock-market price dynamics.
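    A minimal Python sketch of the kind of autocorrelation measurement described above, run on a synthetic daily news-count series rather than on the article's dataset.

        # Sketch of a lagged autocorrelation check on daily news counts.
        # The series below is simulated, not the article's data.
        import numpy as np

        rng = np.random.default_rng(0)
        days = 2000
        # Slowly varying level plus Poisson noise, as a stand-in for long memory.
        level = np.cumsum(rng.normal(0, 0.05, days)) + 10
        counts = rng.poisson(np.clip(level, 1, None))

        def autocorr(x: np.ndarray, lag: int) -> float:
            x = x - x.mean()
            return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

        for lag in (1, 7, 30, 180, 365):
            print(f"lag {lag:>3} days: autocorrelation = {autocorr(counts, lag):.3f}")

    A Poisson process would show autocorrelations near zero at all lags; persistently positive values at long lags are the long-memory signature the abstract refers to.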

    ROPocop - Dynamic Mitigation of Code-Reuse Attacks

    Control-flow attacks, usually achieved by exploiting a buffer-overflow vulnerability, have been a serious threat to system security for over fifteen years. Researchers have answered the threat with various mitigation techniques, but nevertheless, new exploits that successfully bypass these technologies still appear on a regular basis. In this paper, we propose ROPocop, a novel approach for detecting and preventing the execution of injected code and for mitigating code-reuse attacks such as return-oriented programming (ROP). ROPocop uses dynamic binary instrumentation, requiring neither source code nor debug symbols, and no changes to the operating system. It mitigates attacks by monitoring the program counter at potentially dangerous points and by detecting suspicious program flows. We have implemented ROPocop for Windows x86 using PIN, a dynamic program instrumentation framework from Intel. Benchmarks using the SPEC CPU2006 suite show an average overhead of 2.4x, which is comparable to similar approaches that give weaker guarantees. Real-world applications show only a noticeable input lag at startup and no stutter afterwards. In our evaluation, our tool successfully detected all 11 of the latest real-world code-reuse exploits, with no false alarms. Therefore, despite the overhead, it is a viable temporary solution for securing critical systems against exploits when a vendor patch is not yet available.
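    As a conceptual illustration only, the Python sketch below applies a shadow-stack check to a synthetic call/return trace; ROPocop itself instruments x86 binaries with PIN and uses different heuristics, so this is a simplified stand-in showing how a mismatched return target can reveal a hijacked control flow.

        # Conceptual shadow-stack check on a synthetic call/return trace.
        # This is not ROPocop's mechanism, only a simplified illustration of
        # detecting a return that does not go back to its legitimate call site.
        shadow_stack = []

        def on_call(return_address: int) -> None:
            shadow_stack.append(return_address)    # remember the legitimate return site

        def on_return(target_address: int) -> bool:
            expected = shadow_stack.pop() if shadow_stack else None
            if target_address != expected:
                print(f"suspicious return to {target_address:#x} "
                      f"(expected {expected:#x})")
                return False
            return True

        # Benign call/return pair, then a return redirected to a gadget address.
        on_call(0x401020)
        on_return(0x401020)          # ok
        on_call(0x401050)
        on_return(0x40f3c2)          # flagged as suspicious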