
    A cooperative agent-based security framework

    The current economic paradigm is based on a strongly cooperative model that aims to support a more competitive, global response by organizations. Cooperation brings an intrinsic need: the interconnection and interoperability of information systems among business partners. In many areas this represents a huge organizational challenge, and information and communication security emerges as a key issue, a natural enabler of cooperative behavior and of the proper establishment and support of trust among network partners. Security frameworks able to describe and act on the basis of interoperability, cooperation, and proactivity have become essential to support the needs of modern business models. This paper presents a framework that aims to contribute to sustainable organizational support of information security processes, based on the ability to describe the allowed business process interactions among cooperating partners. Furthermore, the framework takes a cooperative security perspective, grounded in the idea that if organizations cooperate in business, they should also cooperate actively in security and its regulation. If organizations that need to cooperate do not feel secure when they interconnect their information systems, the whole cooperative arrangement can collapse. Since trust is a basic precondition for cooperation, this drives the need for a new security approach for cooperative scenarios
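A minimal sketch of the kind of interaction regulation such a framework could describe is an explicit allow-list of business process interactions between partners; the organization names, action labels, and policy structure below are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical policy: each tuple is an agreed (source, destination, action)
# business process interaction between cooperating partners.
ALLOWED_INTERACTIONS = {
    ("OrgA", "OrgB", "submit-invoice"),
    ("OrgB", "OrgA", "order-status"),
}

def interaction_allowed(src: str, dst: str, action: str) -> bool:
    """Permit only interactions the partners have explicitly agreed on;
    anything not listed is denied by default."""
    return (src, dst, action) in ALLOWED_INTERACTIONS
```

A deny-by-default set like this is just one way to encode "allowed business process interactions"; a real framework would likely add roles, constraints, and auditing on top.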

    Towards Protection Against Low-Rate Distributed Denial of Service Attacks in Platform-as-a-Service Cloud Services

    Nowadays the variety of technology available for daily tasks is abundant, and businesses and people alike benefit from this diversity. The more technology evolves, the more useful it becomes; in contrast, it also becomes a target for malicious users. Cloud Computing is one of the technologies that companies worldwide have been adopting over the years. Its popularity is essentially due to its characteristics and the way it delivers its services. This expansion of the Cloud also means that malicious users may try to exploit it, as the research studies presented throughout this work reveal. According to these studies, the Denial of Service attack is a type of threat that constantly tries to take advantage of Cloud Computing services. Several companies have moved or are moving their services to hosted environments provided by Cloud Service Providers and are using several applications based on those services. The literature on the subject points out that, because of this expansion of Cloud adoption, the use of applications has increased. Therefore, DoS threats increasingly target the Application Layer, and advanced variations such as Low-Rate Distributed Denial of Service attacks are being used. Some research is being conducted specifically on the detection and mitigation of this kind of threat, and the significant problem found with this DDoS variant is the difficulty of differentiating malicious traffic from legitimate user traffic. The main goal of this attack is to exploit the communication behavior of the HTTP protocol, sending legitimate-looking traffic with small changes that slowly fill a server's request capacity, nearly blocking real users' access to the server's resources for the duration of the attack. This kind of attack usually has a small time window, but to be more efficient it is launched from infected computers forming a network of attackers, turning it into a Distributed attack.
For this work, the idea for battling Low-Rate Distributed Denial of Service attacks is to integrate different technologies into a Hybrid Application whose main goal is to identify and separate malicious traffic from legitimate traffic. First, a study is done to observe the behavior of each type of Low-Rate attack, in order to gather information on its characteristics while the attack is executing in real time. Then, using Tshark filters, that packet information is collected. The next step is to build combinations of specific fields obtained from the packet filtering and compare them; finally, each packet is analyzed against these combination patterns. A log file stores the data gathered after the Entropy calculation in a friendly format. To test the efficiency of the application, a virtual Cloud infrastructure was built using OpenNebula Sandbox and the Apache Web Server. Two tests were run against the infrastructure: the first verified the effectiveness of the tool against the Cloud environment created; based on its results, a second test was proposed to demonstrate how the Hybrid Application behaves against the attacks performed. The tests showed how disruptive the types of Slow-Rate DDoS can be and also yielded promising results for the Hybrid Application's performance against Low-Rate Distributed Denial of Service attacks. The Hybrid Application succeeded in identifying each type of Low-Rate DDoS and separating the traffic, generating few false positives in the process. The results are displayed in the form of parameters and graphs.
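The entropy calculation step mentioned above can be sketched as follows. This is a minimal illustration of Shannon entropy over a traffic feature per time window; the choice of feature (source IPs), the threshold value, and the assumption that low entropy signals an attack are all illustrative guesses, not the thesis's actual parameters:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of the observed feature values
    in one traffic window."""
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_window(source_ips, threshold=1.0):
    """Flag a window as suspicious when source-IP entropy drops below a
    (hypothetical) threshold, i.e. many slow requests from few hosts."""
    return shannon_entropy(source_ips) < threshold
```

In practice the feature and threshold would be tuned against the packet fields extracted with Tshark, and entropy would be only one signal among the combination patterns the thesis describes.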

    Semantic discovery and reuse of business process patterns

    Patterns currently play an important role in modern information systems (IS) development, and their use has mainly been restricted to the design and implementation phases of the development lifecycle. Given the increasing significance of business modelling in IS development, patterns have the potential to provide a viable solution for promoting the reusability of recurrent generalized models in the very early stages of development. As a statement of research in progress, this paper focuses on business process patterns and proposes an initial methodological framework for the discovery and reuse of business process patterns within the IS development lifecycle. The framework borrows ideas from the domain engineering literature and proposes the use of semantics to drive both the discovery of patterns and their reuse

    Internet of robotic things : converging sensing/actuating, hypoconnectivity, artificial intelligence and IoT Platforms

    The Internet of Things (IoT) concept is evolving rapidly and influencing new developments in various application domains, such as the Internet of Mobile Things (IoMT), Autonomous Internet of Things (A-IoT), Autonomous System of Things (ASoT), Internet of Autonomous Things (IoAT), Internet of Things Clouds (IoT-C) and the Internet of Robotic Things (IoRT), all of which are advancing through the use of IoT technology. The IoT influence brings new development and deployment challenges in different areas, such as seamless platform integration, context-based cognitive network integration, new mobile sensor/actuator network paradigms, things identification (addressing and naming in IoT), dynamic things discoverability, and many others. The IoRT raises new convergence challenges that need to be addressed: on one side, the programmability and communication of multiple heterogeneous mobile/autonomous/robotic things cooperating, together with their coordination, configuration, exchange of information, security, safety, and protection. Developments in IoT heterogeneous parallel processing/communication and dynamic systems based on parallelism and concurrency require new ideas for integrating the intelligent “devices”, collaborative robots (COBOTS), into IoT applications. Dynamic maintainability, self-healing, self-repair of resources, changing resource state, (re-)configuration, and context-based IoT systems for service implementation and integration with IoT network service composition are of paramount importance when new “cognitive devices” become active participants in IoT applications. This chapter aims to give an overview of the IoRT concept, technologies, architectures, and applications, and to provide comprehensive coverage of future challenges, developments, and applications

    From Social Data Mining to Forecasting Socio-Economic Crisis

    Socio-economic data mining has a great potential in terms of gaining a better understanding of problems that our economy and society are facing, such as financial instability, shortages of resources, or conflicts. Without large-scale data mining, progress in these areas seems hard or impossible. Therefore, a suitable, distributed data mining infrastructure and research centers should be built in Europe. It also appears appropriate to build a network of Crisis Observatories. They can be imagined as laboratories devoted to the gathering and processing of enormous volumes of data on both natural systems such as the Earth and its ecosystem, as well as on human techno-socio-economic systems, so as to gain early warnings of impending events. Reality mining provides the chance to adapt more quickly and more accurately to changing situations. Further opportunities arise by individually customized services, which however should be provided in a privacy-respecting way. This requires the development of novel ICT (such as a self-organizing Web), but most likely new legal regulations and suitable institutions as well. As long as such regulations are lacking on a world-wide scale, it is in the public interest that scientists explore what can be done with the huge data available. Big data do have the potential to change or even threaten democratic societies. The same applies to sudden and large-scale failures of ICT systems. Therefore, dealing with data must be done with a large degree of responsibility and care. Self-interests of individuals, companies or institutions have limits, where the public interest is affected, and public interest is not a sufficient justification to violate human rights of individuals. Privacy is a high good, as confidentiality is, and damaging it would have serious side effects for society. Comment: 65 pages, 1 figure, Visioneer White Paper, see http://www.visioneer.ethz.c

    A study of EU data protection regulation and appropriate security for digital services and platforms

    A law often has more than one purpose, more than one intention, and more than one interpretation. A meticulously formulated and context-agnostic law text will still, when faced with a field propelled by intense innovation, eventually become obsolete. The European Data Protection Directive is a good example of such legislation. It may be argued that the technological modifications brought on by the EU General Data Protection Regulation (GDPR) are nominal in comparison to the previous Directive, but from a business perspective the changes are significant and important. The Directive's lack of a direct economic incentive for companies to protect personal data has changed with the Regulation, as companies may now have to pay severe fines for violating the legislation. The objective of the thesis is to establish the notion of trust as a key design goal for information systems handling personal data. This includes interpreting the EU legislation on data protection and using the interpretation as a foundation for further investigation. This interpretation is connected to the areas of analytics, security, and privacy concerns for intelligent service development. Finally, the centralised platform business model and its challenges are examined, and three main resolution themes for regulating platform privacy are proposed. The aims of the proposed resolutions are to create a more trustful relationship between providers and data subjects, while also improving the conditions for competition and thus providing data subjects with service alternatives. The thesis contributes new insights into the evolving privacy practices of the digital society at an important time of transition from service-driven business models to platform business models. Firstly, privacy-related regulation and state-of-the-art analytics development are examined to understand their implications for intelligent services that are based on automated processing and profiling.
The ability to choose between providers of intelligent services is identified as the core challenge. Secondly, the thesis examines what is meant by appropriate security for systems that handle personal data, something the GDPR requires organisations to use without, however, specifying what can be considered appropriate. We propose a method for active network security in web software, developed through the use of analytics for detection and by inserting data generators into a software installation. The active network security method is proposed as a framework for achieving compliance with the GDPR requirement that services and platforms use appropriate security. Thirdly, the platform business model is considered from the privacy point of view, together with the implications of “processing silos” for intelligent services. The centralised platform model is considered problematic from both the data subject and the competition standpoint. A resolution is offered for enabling user-initiated open data flow to counter the centralised “processing silos”, and thereby to facilitate the introduction of decentralised platforms. The thesis provides an interdisciplinary analysis: the legal study considers the law as it stands (lex lata), while a resolution is defined through argumentativist legal dogmatics (de lege ferenda) of how the legal framework ought to be adapted to fit the described environment. User-friendly Legal Science is applied as a theory framework to provide a holistic approach to answering the research questions. The User-friendly Legal Science theory has its roots in design science and offers a way towards achieving interdisciplinary research in the fields of information systems and legal science
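The "data generator" idea mentioned above can be sketched as planting synthetic canary records in an installation and using an analytics hook to detect when they surface where they should not; the record format, field names, and detection logic below are hypothetical illustrations, not the thesis's actual implementation:

```python
import secrets

def plant_canaries(db_insert, n=3):
    """Insert synthetic 'data generator' records into a system via the
    provided insert callable; any later appearance of these tokens in
    outbound traffic or logs signals a leak. Fields are hypothetical."""
    tokens = [f"canary-{secrets.token_hex(8)}" for _ in range(n)]
    for t in tokens:
        db_insert({"email": f"{t}@example.invalid"})
    return set(tokens)

def scan_log_line(line, canaries):
    """Analytics hook: flag any log line that references a planted token."""
    return any(t in line for t in canaries)
```

A real deployment would feed the scan into a detection pipeline rather than checking lines one by one, but the principle is the same: the planted data has no legitimate use, so any access to it is suspicious.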

    ERP implementation methodologies and frameworks: a literature review

    Enterprise Resource Planning (ERP) implementation is a complex and dynamic process, one that involves a combination of technological and organizational interactions. Often an ERP implementation project is the single largest IT project that an organization has ever launched, and it requires a mutual fit of system and organization. Moreover, the concept of an ERP implementation supporting business processes across many different departments is not a generic, rigid, and uniform one; it depends on a variety of factors. As a result, the issues surrounding the ERP implementation process have been a major concern in industry. ERP implementation therefore receives attention from practitioners and scholars alike, and both the business and the academic literature are abundant, though not always conclusive or coherent. However, research on ERP systems has so far focused mainly on diffusion, use, and impact issues. Less attention has been given to the methods used during the configuration and implementation of ERP systems: even though they are commonly used in practice, they remain largely unexplored and undocumented in Information Systems research. The academic relevance of this research is thus its contribution to the existing body of scientific knowledge. An annotated brief literature review is conducted in order to evaluate the current state of the academic literature. The purpose is to present a systematic overview of relevant ERP implementation methodologies and frameworks, with a view to achieving a better taxonomy of ERP implementation methodologies. This paper is useful to researchers interested in ERP implementation methodologies and frameworks, and its results will serve as input for a classification of the existing ERP implementation methodologies and frameworks.
The paper also addresses the professional ERP community involved in the process of ERP implementation, by promoting a better understanding of ERP implementation methodologies and frameworks, their variety, and their history

    Critical Infrastructure Protection Metrics and Tools Papers and Presentations

    Contents: Dr. Hilda Blanco: Prioritizing Assets in Critical Infrastructure Systems; Christine Poptanich: Strategic Risk Analysis; Geoffrey S. French/Jin Kim: Threat-Based Approach to Risk Case Study: Strategic Homeland Infrastructure Risk Assessment (SHIRA); William L. McGill: Techniques for Adversary Threat Probability Assessment; Michael R. Powers: The Mathematics of Terrorism Risk; Stefan Pickl: SOA Approach to the IT-based Protection of CIP; Richard John: Probabilistic Project Management for a Terrorist Planning a Dirty Bomb Attack on a Major US Port; LCDR Brady Downs: Maritime Security Risk Analysis Model (MSRAM); Chel Stromgren: Terrorism Risk Assessment and Management (TRAM); Steve Lieberman: Convergence of CIP and COOP in Banking and Finance; Harry Mayer: Assessing the Healthcare and Public Health Sector with Model Based Risk Analysis; Robert Powell: How Much and On What? Defending and Deterring Strategic Attackers; Ted G. Lewis: Why Do Networks Cascade?

    Security Enhanced Applications for Information Systems

    Every day, more users access services and electronically transmit information, which is usually disseminated over insecure networks and processed by websites and databases that lack proper security protection mechanisms and tools. This can affect both users’ trust and the reputation of the system’s stakeholders. Designing and implementing security-enhanced systems is therefore of vital importance. This book aims to present a number of innovative security-enhanced applications. It is titled “Security Enhanced Applications for Information Systems” and includes 11 chapters. The book is a quality guide for teaching purposes as well as for young researchers, since it presents leading innovative contributions on security-enhanced applications for various Information Systems. It covers cases based on standalone, network, and Cloud environments

    Cybersecurity issues in software architectures for innovative services

    The recent advances in data center development underlie the widespread success of the cloud computing paradigm, which in turn underpins the “Everything as a Service” (XaaS) model for software-based applications and services. According to the XaaS model, services of any kind are deployed on demand as cloud-based applications, with a great degree of flexibility and a limited need for investment in dedicated hardware or software components. This approach opens up many opportunities, for instance giving small or emerging businesses access to complex and widely distributed applications whose cost and complexity represented a significant entry barrier in the past. Unfortunately, networking is now embedded in every service and application, raising several cybersecurity issues related to corruption and leakage of data, unauthorized access, and so on. In this context, new service-oriented architectures are emerging: the so-called service enabler architectures. The aim of these architectures is not only to expose resources to these types of services, but also to validate them. The validation covers numerous aspects, from legal to infrastructural ones, but above all cybersecurity threats. A solid threat analysis of the aforementioned architecture is therefore necessary, and this is the main goal of this thesis. This work investigates the security threats of the emerging service enabler architectures, providing proofs of concept for these issues as well as solutions, based on several use cases implemented in real-world scenarios
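A threat analysis like the one described can start from a simple per-component enumeration of threat categories (in the style of STRIDE); the component names and threat assignments below are illustrative assumptions, not the thesis's actual findings:

```python
# Hypothetical STRIDE-style threat map for a service enabler architecture.
THREATS = {
    "service registry": ["spoofing", "tampering"],
    "validation pipeline": ["tampering", "denial of service"],
    "data plane": ["information disclosure", "elevation of privilege"],
}

def report(threats):
    """Render the threat map as one line per component, sorted by name,
    so the enumeration can be reviewed or diffed between analyses."""
    return [f"{comp}: {', '.join(ts)}" for comp, ts in sorted(threats.items())]
```

Such a table is only the starting point; each entry would then be backed by a proof of concept and a mitigation, as the thesis does for its use cases.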