
    Towards a set of metrics to guide the generation of fake computer file systems

    Fake file systems are used in the field of cyber deception to bait intruders and fool forensic investigators. File system researchers also frequently generate their own synthetic document repositories, due to the data privacy and copyright concerns associated with experimenting on real-world corpora. For both fields, realism is critical. Unfortunately, after creating a set of files and folders, there are no current testing standards that can be applied to validate their authenticity, or conversely, to reliably automate their detection. This paper reviews the previous 30 years of file system surveys on real-world corpora to identify a set of discrete measures for generating synthetic file systems. Statistical distributions, such as the size, age and lifetime of files, common file types, compression and duplication ratios, and directory distribution and depth (and its relationship with the numbers of files and sub-directories), were identified and their respective merits discussed. Additionally, this paper highlights notable absences in these surveys that could be beneficial to address, such as analysing text content distribution and file naming habits en masse, and comparing file access times against traditional working hours.
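    A minimal sketch of how such measures could drive generation: the distribution families and parameter values below (lognormal file sizes, a fixed subdirectory probability, a small extension list) are purely illustrative assumptions, not figures taken from the surveyed corpora.

```python
import os
import random

# Hypothetical distribution parameters -- illustrative only, not values
# taken from the file system surveys reviewed in the paper.
SIZE_MU, SIZE_SIGMA = 9.0, 2.5     # lognormal file-size parameters (bytes)
FILES_PER_DIR_MEAN = 8             # mean number of files per directory
SUBDIR_PROB = 0.3                  # chance a directory gains a subdirectory
MAX_DEPTH = 6                      # cap on directory depth
EXTENSIONS = [".txt", ".doc", ".pdf", ".jpg", ".xls"]

def generate_tree(root: str, depth: int = 0) -> None:
    """Recursively populate `root` with files and subdirectories drawn
    from the configured distributions."""
    os.makedirs(root, exist_ok=True)
    n_files = max(1, int(random.expovariate(1 / FILES_PER_DIR_MEAN)))
    for i in range(n_files):
        size = int(random.lognormvariate(SIZE_MU, SIZE_SIGMA))
        name = f"file_{depth}_{i}{random.choice(EXTENSIONS)}"
        with open(os.path.join(root, name), "wb") as f:
            f.write(os.urandom(min(size, 1_000_000)))  # cap writes for the demo
    if depth < MAX_DEPTH and random.random() < SUBDIR_PROB:
        generate_tree(os.path.join(root, f"dir_{depth}"), depth + 1)

if __name__ == "__main__":
    generate_tree("synthetic_corpus")
```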

    Two ways to Grid: the contribution of Open Grid Services Architecture (OGSA) mechanisms to service-centric and resource-centric lifecycles

    Service Oriented Architectures (SOAs) support service lifecycle tasks, including Development, Deployment, Discovery and Use. We observe that there are two disparate ways to use Grid SOAs such as the Open Grid Services Architecture (OGSA) as exemplified in the Globus Toolkit (GT3/4). One is traditional enterprise SOA use, where end-user services are developed, deployed and resourced behind firewalls for use by external consumers: a service-centric (or ‘first-order’) approach. The other supports end-user development, deployment and resourcing of applications across organizations via execution and resource management services: a resource-centric (or ‘second-order’) approach. We analyze and compare the two approaches using a combination of empirical experiments and an architectural evaluation methodology (scenario, mechanism, and quality attributes) to reveal common and distinct strengths and weaknesses. The impact of potential improvements (which are likely to be manifested in GT4) is estimated, and opportunities for alternative architectures and technologies are explored. We conclude by investigating whether the two approaches can be converged or combined, and whether they are compatible on shared resources.
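    The distinction can be sketched with two deliberately hypothetical interfaces (these are not the GT3/GT4 or OGSA APIs): in the first-order case the provider's domain service is invoked directly, while in the second-order case a generic execution service runs the consumer's own application on shared resources.

```python
# Hypothetical interfaces only -- illustrative, not the Globus Toolkit (GT3/GT4) API.

class RenderService:
    """Service-centric ('first-order'): the provider develops, deploys and
    resources a domain service; external consumers simply invoke it."""
    def render(self, scene: str) -> str:
        return f"rendered:{scene}"

class ExecutionService:
    """Resource-centric ('second-order'): a generic execution/resource
    management service runs the consumer's own application."""
    def submit(self, executable: str, args: list) -> str:
        return f"job '{executable} {' '.join(args)}' queued on shared resources"

# First-order use: call the end-user service offered by the provider.
print(RenderService().render("scene.xml"))

# Second-order use: deploy and run your own application via the execution service.
print(ExecutionService().submit("my_renderer", ["scene.xml"]))
```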

    Experiences In Migrating An Industrial Application To Aspects

    Aspect-Oriented Software Development (AOSD) is a paradigm that aims to solve problems of object-oriented programming (OOP). With plain OOP it is often difficult to achieve fine-grained system modularity, because crosscutting concerns end up scattered and tangled throughout the system. AOSD resolves this problem through its ability to crosscut the regular code and, as a consequence, move each crosscutting concern into a single module called an aspect. This thesis describes an experiment on an industrial application in which the effectiveness of aspect-oriented techniques is examined by migrating the OOP application to aspects. The experiment first aims to identify the crosscutting concerns in the source code of the industrial application and then transform these concerns into a functionally equivalent aspect-oriented version. In addition to presenting the experiences gained through the experiment, the thesis aims to provide practical guidance on applying aspect solutions in a real application.
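    As a language-neutral illustration of the extraction step (the thesis itself concerns an aspect-oriented migration of an industrial code base, not this example), the sketch below factors a scattered tracing concern out of a hypothetical business class into a single module, analogous to moving a crosscutting concern into an aspect.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)

# The crosscutting concern (tracing) lives in one place, like an aspect,
# instead of being scattered and tangled through every business method.
def traced(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        logging.info("entering %s", func.__name__)
        result = func(*args, **kwargs)
        logging.info("leaving %s", func.__name__)
        return result
    return wrapper

class OrderProcessor:  # hypothetical business class
    @traced
    def place_order(self, item: str) -> str:
        return f"order placed for {item}"

OrderProcessor().place_order("widget")
```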

    Remodularization Analysis Using Semantic Clustering

    In this paper, we report on an experience using and adapting Semantic Clustering to evaluate software remodularizations. Semantic Clustering is an approach that relies on information retrieval and clustering techniques to extract sets of similar classes in a system, according to their vocabularies. We adapted Semantic Clustering to support remodularization analysis and evaluated our adaptation using six real-world remodularizations of four software systems. We report that Semantic Clustering and conceptual metrics can be used to express and explain the intention of the architects when performing common modularization operators, such as module decomposition.
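    A minimal sketch of the underlying idea, assuming class vocabularies have already been extracted as bags of identifier terms (the corpus, vectorization choices and cluster count below are illustrative assumptions, not the authors' setup): classes whose vocabularies cluster together approximate a "semantic" module that can be compared against the actual module structure before and after remodularization.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical class vocabularies: identifier terms extracted per class.
class_vocabularies = {
    "OrderDao":    "order database insert update query sql",
    "CustomerDao": "customer database insert update query sql",
    "InvoiceView": "invoice render html template widget",
    "ReportView":  "report render html template chart widget",
}

names = list(class_vocabularies)
tfidf = TfidfVectorizer().fit_transform(class_vocabularies.values())

# Group classes with similar vocabularies; each cluster approximates a
# "semantic" module to compare against the system's declared modules.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(tfidf)
for name, label in zip(names, labels):
    print(label, name)
```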

    Intelligent Radio Spectrum Monitoring

    Spectrum monitoring is an important part of the radio spectrum management process, providing feedback to the workflow that enables our current wirelessly interconnected lifestyle. The constantly increasing number of users and uses of wireless technologies is pushing the limits and capabilities of the existing infrastructure, demanding new alternatives to manage and analyse the extremely large volume of data produced by existing spectrum monitoring networks. This study addresses this problem by proposing an information management system architecture able to increase the analytical level of a spectrum monitoring measurement network. The proposal includes an alternative for managing the data produced by such a network, methods to analyse the spectrum data, and a way to automate the data gathering process. The study was conducted using system requirements from the Brazilian National Telecommunications Agency, and related functional concepts were aggregated from the reviewed scientific literature and publications of the International Telecommunication Union. The proposed solution employs a microservice architecture to manage the data, covering tasks such as format conversion, analysis, optimization and automation. To enable efficient data exchange between services, we propose the use of a hierarchical structure created using the HDF5 format. The suggested architecture was partially implemented as a pilot project, which allowed us to demonstrate the viability of the presented ideas and to perform an initial refinement of the proposed data format and analytical algorithms. The results point to the potential of the solution to address some of the limitations of the existing spectrum monitoring workflow. The proposed system may play a crucial role in integrating spectrum monitoring activities into open data initiatives, promoting transparency and data reusability for this important public service.
    Santos Lobão, F. (2019). Intelligent Radio Spectrum Monitoring. http://hdl.handle.net/10251/128850
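    A minimal sketch of one possible hierarchical layout for spectrum measurements in HDF5 (the station and sweep names, attributes and dataset shapes below are illustrative assumptions, not the schema adopted by the pilot project):

```python
import numpy as np
import h5py

# Hypothetical layout -- one possible hierarchical organisation of spectrum
# monitoring data in HDF5; not the exact schema used in the pilot project.
with h5py.File("spectrum_monitoring.h5", "w") as f:
    station = f.create_group("stations/BR-SP-001")
    station.attrs["latitude"] = -23.55
    station.attrs["longitude"] = -46.63

    sweep = station.create_group("sweeps/2019-06-01T12:00:00Z")
    sweep.attrs["resolution_bandwidth_hz"] = 10_000
    sweep.create_dataset("frequency_hz",
                         data=np.linspace(88e6, 108e6, 2048))
    sweep.create_dataset("level_dbm",
                         data=np.random.uniform(-120, -30, 2048),
                         compression="gzip")  # compress bulk measurement data
```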