Shadow Honeypots
We present Shadow Honeypots, a novel hybrid architecture that combines the best features of honeypots and anomaly detection. At a high level, we use a variety of anomaly detectors to monitor all traffic to a protected network or service. Traffic that is considered anomalous is processed by a "shadow honeypot" to determine the accuracy of the anomaly prediction. The shadow is an instance of the protected software that shares all internal state with a regular ("production") instance of the application, and is instrumented to detect potential attacks. Attacks against the shadow are caught, and any incurred state changes are discarded. Legitimate traffic that was misclassified will be validated by the shadow and will be handled correctly by the system, transparently to the end user. The outcome of processing a request by the shadow is used to filter future attack instances and could be used to update the anomaly detector. Our architecture allows system designers to fine-tune systems for performance, since false positives will be filtered by the shadow. We demonstrate the feasibility of our approach in a proof-of-concept implementation of the Shadow Honeypot architecture for the Apache web server and the Mozilla Firefox browser. We show that, despite a considerable overhead in the instrumentation of the shadow honeypot (up to 20% for Apache), the overall impact on the system is diminished by the ability to minimize the rate of false positives.
Detecting Targeted Attacks Using Shadow Honeypots
We present Shadow Honeypots, a novel hybrid architecture that combines the best features of honeypots and anomaly detection. At a high level, we use a variety of anomaly detectors to monitor all traffic to a protected network or service. Traffic that is considered anomalous is processed by a "shadow honeypot" to determine the accuracy of the anomaly prediction. The shadow is an instance of the protected software that shares all internal state with a regular ("production") instance of the application, and is instrumented to detect potential attacks. Attacks against the shadow are caught, and any incurred state changes are discarded. Legitimate traffic that was misclassified will be validated by the shadow and will be handled correctly by the system, transparently to the end user. The outcome of processing a request by the shadow is used to filter future attack instances and could be used to update the anomaly detector. Our architecture allows system designers to fine-tune systems for performance, since false positives will be filtered by the shadow. Contrary to regular honeypots, our architecture can be used both for server and client applications. We demonstrate the feasibility of our approach in a proof-of-concept implementation of the Shadow Honeypot architecture for the Apache web server and the Mozilla Firefox browser. We show that, despite a considerable overhead in the instrumentation of the shadow honeypot (up to 20% for Apache), the overall impact on the system is diminished by the ability to minimize the rate of false positives.
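The request flow these two abstracts describe (anomaly detector in front, shadow instance validating flagged traffic, state changes from attacks discarded) can be sketched as follows. This is a minimal illustration: the detector heuristic, the attack check, and the copy-based state handling are invented stand-ins, not the paper's actual Apache/Firefox instrumentation.

```python
# Hedged sketch of the Shadow Honeypot request flow. All heuristics
# below are toy stand-ins for the paper's anomaly detectors and
# memory-violation instrumentation.

def is_anomalous(request: str) -> bool:
    """Toy anomaly detector: flags overlong requests (stand-in heuristic)."""
    return len(request) > 64

def shadow_process(request: str, state: dict) -> tuple[bool, dict]:
    """Process a suspicious request in the instrumented shadow.
    Returns (attack_detected, new_state); an attack's state changes are discarded."""
    scratch = dict(state)            # shadow works on a copy of shared state
    attack = "overflow" in request   # stand-in for the shadow's attack checks
    if attack:
        return True, state           # discard the shadow's state changes
    scratch["handled"] = scratch.get("handled", 0) + 1
    return False, scratch            # false positive: keep the valid changes

def handle(request: str, state: dict) -> tuple[str, dict]:
    if is_anomalous(request):
        attack, state = shadow_process(request, state)
        if attack:
            return "dropped", state  # caught by the shadow, filtered
        return "ok", state           # misclassified traffic served transparently
    state["handled"] = state.get("handled", 0) + 1
    return "ok", state               # fast path: production instance only
```

Note how a false positive costs only the shadow's instrumentation overhead for that one request, which is why the architecture tolerates loosely tuned detectors.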
Validation of Memory Accesses Through Symbolic Analyses
The C programming language does not prevent out-of-bounds memory accesses. There exist several techniques to secure C programs; however, these methods tend to slow down these programs substantially, because they populate the binary code with runtime checks. To deal with this problem, we have designed and tested two static analyses - symbolic region and range analysis - which we combine to remove the majority of these guards. In addition to the analyses themselves, we bring two other contributions. First, we describe live range splitting strategies that improve the efficiency and the precision of our analyses. Secondly, we show how to deal with integer overflows, a phenomenon that can compromise the correctness of static algorithms that validate memory accesses. We validate our claims by incorporating our findings into AddressSanitizer. We generate SPEC CINT 2006 code that is 17% faster and 9% more energy efficient than the code produced originally by this tool. Furthermore, our approach is 50% more effective than Pentagons, a state-of-the-art analysis to sanitize memory accesses.
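The core idea behind using range analysis to remove guards can be sketched in a few lines: if the analysis proves that an index always lies in [0, length), the runtime bounds check is redundant and can be elided. The interval representation below is a generic illustration, not the paper's actual compiler-IR implementation.

```python
# Hedged sketch of check elision via range analysis. Real tools (such
# as the paper's AddressSanitizer extension) run this on compiler IR.

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: int
    hi: int  # inclusive symbolic bounds proven for an integer variable

def check_needed(index: Interval, array_len: int) -> bool:
    """A bounds check stays only if the index may escape [0, array_len)."""
    return index.lo < 0 or index.hi >= array_len

# A loop `for i in range(0, n)` indexing an array of size n: the
# induction variable is provably in [0, n-1], so the guard is elided.
n = 1000
assert not check_needed(Interval(0, n - 1), n)   # guard removed

# An index derived from untrusted input has unknown bounds: guard kept.
assert check_needed(Interval(-(2**31), 2**31 - 1), n)
```

This also shows why the paper's integer-overflow handling matters: if the proven interval itself can wrap around, the "guard removed" conclusion above would be unsound.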
A deep stratosphere-to-troposphere ozone transport event over Europe simulated in CAMS global and regional forecast systems: analysis and evaluation
Stratosphere-to-troposphere transport (STT) is an important natural source of tropospheric ozone, which can occasionally influence ground-level ozone concentrations relevant for air quality. Here, we analyse and evaluate the Copernicus Atmosphere Monitoring Service (CAMS) global and regional forecast systems during a deep STT event over Europe for the period from 4 to 9 January 2017. The predominant synoptic condition is described by a deep upper-level trough over eastern and central Europe, favouring the formation of tropopause folding events along the jet stream axis and therefore the intrusion of stratospheric ozone into the troposphere. Both global and regional CAMS forecast products reproduce the hook-shaped streamer of ozone-rich and dry air in the middle troposphere depicted in the observed satellite images of water vapour. The CAMS global model successfully reproduces the folding of the tropopause at various European sites, such as Trapani (Italy), where a deep folding down to 550 hPa is seen. The stratospheric ozone intrusions into the troposphere observed by WOUDC ozonesonde and IAGOS aircraft measurements are satisfactorily forecast up to 3 days in advance by the CAMS global model in terms of both temporal and vertical features of ozone. The fractional gross error (FGE) of the CAMS ozone day-1 forecast between 300 and 500 hPa is 0.13 over Prague, while over Frankfurt it is 0.04 and 0.19, highlighting the contribution of data assimilation, which in most cases improves the model performance. Finally, the meteorological and chemical forcing of the CAMS global forecast system in the CAMS regional forecast systems is found to be beneficial for predicting the enhanced ozone concentrations in the middle troposphere during a deep STT event.
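The FGE values quoted above can be read against the metric's common definition, FGE = (2/N) Σ |m_i − o_i| / (m_i + o_i), which ranges from 0 (perfect) to 2. The snippet below is a generic implementation of that formula, not the CAMS evaluation code itself.

```python
# Hedged sketch: the fractional gross error (FGE) metric, in its common
# form FGE = (2/N) * sum(|m - o| / (m + o)) over modelled/observed pairs.

def fractional_gross_error(model: list[float], obs: list[float]) -> float:
    """FGE between modelled and observed values; 0 = perfect, 2 = worst."""
    n = len(model)
    return (2.0 / n) * sum(abs(m - o) / (m + o) for m, o in zip(model, obs))

# Perfect agreement gives FGE = 0; a factor-of-3 error gives 1.0.
assert fractional_gross_error([50.0, 60.0], [50.0, 60.0]) == 0.0
```

On this scale, the 0.04-0.19 range reported for the day-1 forecast corresponds to mean model-observation discrepancies of a few percent to roughly 20% of the pairwise mean.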
In Situ Microscopy Analysis Reveals Local Innate Immune Response Developed around Brucella Infected Cells in Resistant and Susceptible Mice
Brucella are facultative intracellular bacteria that chronically infect humans and animals, causing brucellosis. Brucella are able to invade and replicate in a broad range of cell lines in vitro; however, the cells supporting bacterial growth in vivo are largely unknown. In order to identify these, we used a Brucella melitensis strain stably expressing mCherry fluorescent protein to determine the phenotype of infected cells in spleen and liver, two major sites of B. melitensis growth in mice. In both tissues, the majority of primary infected cells expressed the F4/80 myeloid marker. The peak of infection correlated with granuloma development. These structures were mainly composed of CD11b+ F4/80+ MHC-II+ cells expressing iNOS/NOS2 enzyme. A fraction of these cells also expressed the CD11c marker and appeared similar to inflammatory dendritic cells (DCs). Analysis of genetically deficient mice revealed that differentiation of iNOS+ inflammatory DCs, granuloma formation and control of bacterial growth were deeply affected by the absence of MyD88, IL-12p35 and IFN-γ molecules. During the chronic phase of infection in susceptible mice, we identified a particular subset of DCs expressing both CD11c and CD205, serving as a reservoir for the bacteria. Taken together, our results describe the cellular nature of immune effectors involved during Brucella infection and reveal a previously unappreciated role for DC subsets, both as effectors and reservoir cells, in the pathogenesis of brucellosis.
Modern web technologies
Nowadays, the World Wide Web is one of the most significant tools that people employ to seek information, locate new sources of knowledge, communicate, share ideas and experiences, or even purchase products and make online bookings. The technologies adopted by modern Web applications are discussed in this book chapter. We summarize the most fundamental principles employed by the Web, such as the client-server model and the HTTP protocol, and then continue by presenting current trends such as asynchronous communications, distributed applications, cloud computing and mobile Web applications. Finally, we conduct a short discussion regarding the future of the Web and the technologies that are going to play key roles in the deployment of novel applications. © 2011 Springer-Verlag Berlin Heidelberg
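The client-server model and HTTP protocol mentioned in this summary can be illustrated at the text level: the client sends a request line plus headers, and the server answers with a status line and a body. The sketch below is a minimal, self-contained illustration of that exchange, not a full or robust HTTP implementation.

```python
# Hedged sketch of an HTTP/1.1 request/response exchange, handled as
# plain text rather than over a real socket, to show the protocol shape.

def build_request(method: str, path: str, host: str) -> str:
    """Client side: request line, Host header, blank line."""
    return f"{method} {path} HTTP/1.1\r\nHost: {host}\r\n\r\n"

def handle_request(raw: str) -> str:
    """Server side: parse the request line and build a response."""
    method, path, _version = raw.split("\r\n", 1)[0].split(" ")
    if method == "GET" and path == "/":
        body = "<h1>hello</h1>"
        return f"HTTP/1.1 200 OK\r\nContent-Length: {len(body)}\r\n\r\n{body}"
    return "HTTP/1.1 404 Not Found\r\n\r\n"

response = handle_request(build_request("GET", "/", "example.org"))
assert response.startswith("HTTP/1.1 200")
```

The asynchronous communications the chapter discusses build on exactly this exchange; the client simply issues it in the background instead of blocking on the response.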
Effective ranking fusion methods for personalized metasearch engines
Metasearch engines are a significant part of the information retrieval process. Most Web users use them directly or indirectly to access information from more than one data source. The cornerstone of their technology is their rank aggregation method, which is the algorithm they use to classify the collected results. In this paper we present three new rank aggregation methods. First, we propose a method that takes into consideration the regional data for the user and the pages and assigns scores according to a variety of user-defined parameters. In the second expansion, not all component engines are treated equally: the user is free to define the importance of each engine by setting appropriate weights. The third algorithm is designed to classify pages having URLs that contain subdomains. The three presented methods are combined into a single, personalized scoring formula, the Global KE. All algorithms have been implemented in QuadSearch, an experimental metasearch engine available at http://quadsearch.csd.auth.gr. © 2008 IEEE
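The second method's idea, weighting component engines by user-defined importance, can be sketched with a classic weighted, Borda-style fusion. The abstract does not give the paper's actual formulas (or the Global KE combination), so the scoring rule below is a generic stand-in for how weighted rank aggregation works.

```python
# Hedged sketch of weighted rank aggregation: each engine's ranked list
# contributes a position-based score, scaled by that engine's weight.
# This is a generic Borda-style rule, not the paper's scoring formula.

def fuse(rankings: dict[str, list[str]], weights: dict[str, float]) -> list[str]:
    """rankings: engine -> ordered result URLs; weights: engine -> importance."""
    scores: dict[str, float] = {}
    for engine, urls in rankings.items():
        w = weights.get(engine, 1.0)
        for rank, url in enumerate(urls):
            # Higher positions earn more points; the weight scales the vote.
            scores[url] = scores.get(url, 0.0) + w * (len(urls) - rank)
    return sorted(scores, key=scores.get, reverse=True)

merged = fuse(
    {"engineA": ["a.com", "b.com"], "engineB": ["b.com", "c.com"]},
    {"engineA": 1.0, "engineB": 2.0},
)
assert merged[0] == "b.com"  # ranked by both engines, boosted by engineB's weight
```

A result returned by several highly weighted engines therefore climbs the merged list even if no single engine ranked it first, which is the behaviour a personalized fusion method aims for.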
The f Index: Quantifying the Impact of Coterminal Citations on Scientists' Ranking
Designing fair and unbiased metrics to measure the "level of excellence" of a scientist is a very significant task, because such metrics have recently also been taken into account when deciding faculty promotions, when allocating funds, and so on. Despite the criticism that such scientometric evaluators are confronted with, they do have their merits, and efforts should be spent to arm them with robustness and resistance to manipulation. This article aims at initiating the study of coterminal citations (their existence and implications) and presents them as a generalization of self-citations and co-citations; it also shows how they can be used to capture any manipulation attempts against scientometric indicators, and finally presents a new index, the f index, that takes coterminal citations into account. The utility of the new index is validated using the academic production of a number of esteemed computer scientists. The results confirm that the new index can discriminate those individuals whose work penetrates many scientific communities.
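The abstract does not spell out the f-index formula, so no attempt is made to reproduce it here. As background, the two notions it generalizes can be stated precisely: a self-citation is a citation whose citing and cited papers share an author, and a co-citation counts how often two papers are cited together by the same citing paper. A minimal sketch of both, on a toy citation graph:

```python
# Hedged sketch of the two special cases coterminal citations
# generalize (per the abstract): self-citations and co-citations.
# The toy data structures here are illustrative only.

def is_self_citation(citing_authors: set[str], cited_authors: set[str]) -> bool:
    """A citation is a self-citation if the author sets overlap."""
    return bool(citing_authors & cited_authors)

def cocitation_count(refs_by_paper: dict[str, set[str]], a: str, b: str) -> int:
    """Co-citation count: number of papers whose reference lists contain both a and b."""
    return sum(1 for refs in refs_by_paper.values() if {a, b} <= refs)

assert is_self_citation({"Alice", "Bob"}, {"Bob", "Carol"})
assert cocitation_count({"p1": {"a", "b"}, "p2": {"a"}}, "a", "b") == 1
```

An index robust to manipulation would discount citation patterns like heavy self-citation that inflate raw counts without reflecting penetration into other communities, which is the property the article claims for the f index.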
On the impact of future climate change on tropopause folds and tropospheric ozone
Using a transient simulation for the period 1960-2100 with the state-of-the-art ECHAM5/MESSy Atmospheric Chemistry (EMAC) global model and a tropopause fold identification algorithm, we explore the future projected changes in tropopause folds, stratosphere-to-troposphere transport (STT) of ozone, and tropospheric ozone under the RCP6.0 scenario. Statistically significant changes in tropopause fold frequencies from 1970-1999 to 2070-2099 are identified in both hemispheres, regionally exceeding 3 %, and are associated with the projected changes in the position and intensity of the subtropical jet streams. A strengthening of ozone STT is projected for the future in both hemispheres, with an induced increase in transported stratospheric ozone tracer throughout the whole troposphere, reaching up to 10 nmol mol-1 in the upper troposphere, 8 nmol mol-1 in the middle troposphere, and 3 nmol mol-1 near the surface. Notably, the regions exhibiting the largest changes of ozone STT at 400 hPa coincide with those with the highest fold frequency changes, highlighting the role of the tropopause folding mechanism in STT processes under a changing climate. For both the eastern Mediterranean and Middle East (EMME) and Afghanistan (AFG) regions, which are known as hotspots of fold activity and ozone STT during the summer period, the year-to-year variability of middle-tropospheric ozone with stratospheric origin is largely explained by the short-term variations in ozone at 150 hPa and tropopause fold frequency.
Finally, ozone in the lower troposphere is projected to decrease under the RCP6.0 scenario during MAM (March, April, and May) and JJA (June, July, and August) in the Northern Hemisphere and during DJF (December, January, and February) in the Southern Hemisphere, due to the decline of ozone precursor emissions and the enhanced ozone loss from higher water vapour abundances, while in the rest of the troposphere ozone shows a remarkable increase owing mainly to the STT strengthening and the stratospheric ozone recovery.