
    MICROTUBULE BASED NEURO-FUZZY NESTED FRAMEWORK FOR SECURITY OF CYBER PHYSICAL SYSTEM

    Network and system security of cyber-physical systems is of vital significance in today's information and communication environment. Hackers and network intruders make numerous successful attempts to crash networks and web services through unauthorized intrusion. Computing systems connected to the Internet are confronted with a plethora of security threats, ranging from classic computer worms to drive-by downloads and botnets. In recent years these threats have reached a new level of automation and sophistication, rendering most defenses inadequate. Conventional security measures that rely on the manual investigation of security incidents and attack development inherently fail to provide protection against such threats. As a consequence, computer systems often remain unprotected over long periods of time. This study presents a network intrusion detection approach based on machine learning, a natural match for this problem, since learning methods can automatically analyze data and support early detection of threats. The study has produced practical results so far, although there is notable wariness in the community about learning-based defenses. Machine-learning-based intrusion detection and network security systems are one such solution: they analyze and predict the behavior of users, and each observed behavior is then classified as an attack or as normal conduct.
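    The behaviour-classification idea in the abstract above can be illustrated with a toy sketch. The study's actual features and model are not specified here, so the per-flow statistics, the class centroids, and the nearest-centroid rule below are all illustrative assumptions, not the authors' method.

```python
# Toy behaviour-based intrusion detection: label a network flow by which
# class centroid (normal vs. attack) it is nearer to in feature space.
# All feature values below are invented for illustration.
from math import dist

# Hypothetical labelled flows: (packets/s, mean packet size, distinct ports)
normal = [(20, 512, 3), (18, 480, 2), (25, 530, 4)]
attack = [(900, 60, 150), (1200, 64, 200), (800, 70, 120)]

def centroid(rows):
    """Per-feature mean of a list of equal-length feature tuples."""
    return tuple(sum(col) / len(rows) for col in zip(*rows))

C_NORMAL, C_ATTACK = centroid(normal), centroid(attack)

def classify(flow):
    """Label a flow by its nearer class centroid (Euclidean distance)."""
    return "attack" if dist(flow, C_ATTACK) < dist(flow, C_NORMAL) else "normal"

print(classify((1000, 60, 180)))  # a port-scan-like flow
print(classify((22, 500, 3)))     # a typical user flow
```

    A real system would of course use many more features and a trained model rather than two hand-picked centroids; the sketch only shows the "behavior is classified as attack or normal" step.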

    An experimental study on network intrusion detection systems

    A signature database is the key component of an elaborate intrusion detection system. Efficient signature generation is a crucial requirement because of the rapid appearance of new attacks on the World Wide Web. However, in commercial applications signature generation is still a manual process that requires professional skills and heavy human effort. Knowledge discovery and data mining methods may offer a solution to this problem: data mining and machine learning algorithms can be applied to network traffic databases in order to generate signatures automatically. The purpose of this thesis and the work related to it is to construct a feasible architecture for building a database of network traffic data, which can then be used to generate signatures automatically. This goal is achieved using network traffic data captured on the data communication network at the New Jersey Institute of Technology (NJIT).
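    As a hedged illustration of mining traffic data for signatures (the thesis's actual algorithm is not given in this abstract), one simple approach is to select a frequent payload n-gram that occurs in malicious traffic but never in benign traffic. The payloads and the n-gram length below are invented for the example.

```python
# Candidate-signature mining sketch: count payload n-grams in malicious
# traffic, exclude any n-gram also seen in benign traffic, and propose the
# most frequent survivor as a signature. Illustrative only.
from collections import Counter

def ngrams(payload, n=4):
    """Set of all length-n byte substrings of a payload."""
    return {payload[i:i + n] for i in range(len(payload) - n + 1)}

def candidate_signature(malicious, benign, n=4):
    benign_grams = set().union(*(ngrams(p, n) for p in benign))
    counts = Counter()
    for p in malicious:
        counts.update(ngrams(p, n) - benign_grams)  # drop grams seen in benign
    gram, _ = counts.most_common(1)[0]
    return gram

malicious = [b"GET /cmd.exe?/c+dir", b"GET /scripts/cmd.exe"]
benign = [b"GET /index.html", b"POST /login"]
print(candidate_signature(malicious, benign))
```

    Production signature generators work on far larger traces and apply statistical tests to control false positives, but the mine-and-filter pattern is the same.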

    From Social Data Mining to Forecasting Socio-Economic Crisis

    Socio-economic data mining has a great potential in terms of gaining a better understanding of problems that our economy and society are facing, such as financial instability, shortages of resources, or conflicts. Without large-scale data mining, progress in these areas seems hard or impossible. Therefore, a suitable, distributed data mining infrastructure and research centers should be built in Europe. It also appears appropriate to build a network of Crisis Observatories. They can be imagined as laboratories devoted to the gathering and processing of enormous volumes of data on both natural systems such as the Earth and its ecosystem, as well as on human techno-socio-economic systems, so as to gain early warnings of impending events. Reality mining provides the chance to adapt more quickly and more accurately to changing situations. Further opportunities arise from individually customized services, which however should be provided in a privacy-respecting way. This requires the development of novel ICT (such as a self-organizing Web), but most likely new legal regulations and suitable institutions as well. As long as such regulations are lacking on a world-wide scale, it is in the public interest that scientists explore what can be done with the huge data available. Big data do have the potential to change or even threaten democratic societies. The same applies to sudden and large-scale failures of ICT systems. Therefore, dealing with data must be done with a large degree of responsibility and care. Self-interests of individuals, companies or institutions have limits, where the public interest is affected, and public interest is not a sufficient justification to violate human rights of individuals. Privacy is a high good, as confidentiality is, and damaging it would have serious side effects for society.
    Comment: 65 pages, 1 figure, Visioneer White Paper, see http://www.visioneer.ethz.c

    Big Data Meets Telcos: A Proactive Caching Perspective

    Mobile cellular networks are becoming increasingly complex to manage, while classical deployment/optimization techniques and current solutions (i.e., cell densification, acquiring more spectrum, etc.) are cost-ineffective and thus seen as stopgaps. This calls for the development of novel approaches that leverage recent advances in storage/memory, context awareness, and edge/cloud computing, and that fall into the framework of big data. However, big data is itself another complex phenomenon to handle and comes with its notorious 4Vs: velocity, veracity, volume, and variety. In this work, we address these issues in the optimization of 5G wireless networks via the notion of proactive caching at the base stations. In particular, we investigate the gains of proactive caching in terms of backhaul offloading and request satisfaction, while tackling the large amount of data available for content popularity estimation. To estimate content popularity, we first collect users' mobile traffic data from several base stations of a Turkish telecom operator over time intervals of several hours. An analysis is then carried out locally on a big data platform, and the gains of proactive caching at the base stations are investigated via numerical simulations. It turns out that several gains are possible depending on the level of available information and the storage size. For instance, with 10% of content ratings and 15.4 Gbyte of storage size (87% of the total catalog size), proactive caching achieves 100% request satisfaction and offloads 98% of the backhaul when considering 16 base stations.
    Comment: 8 pages, 5 figures
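    The caching policy evaluated above can be sketched as a greedy, popularity-ranked cache under a storage budget: rank contents by estimated popularity and cache the most requested ones that still fit. The content identifiers, request counts, and sizes below are made up for illustration; the paper's actual popularity estimator and trace data are not reproduced here.

```python
# Popularity-based proactive caching sketch: fill a storage budget with the
# most-requested contents, then measure what fraction of requests is served
# from the cache (i.e., offloaded from the backhaul). Numbers are invented.
def proactive_cache(popularity, sizes, budget):
    """Greedily cache the most-requested contents that fit in `budget`."""
    cached, used = set(), 0.0
    for cid in sorted(popularity, key=popularity.get, reverse=True):
        if used + sizes[cid] <= budget:
            cached.add(cid)
            used += sizes[cid]
    return cached

popularity = {"a": 500, "b": 300, "c": 150, "d": 50}  # requests per hour
sizes = {"a": 4.0, "b": 6.0, "c": 5.0, "d": 2.0}      # Gbyte per content
cached = proactive_cache(popularity, sizes, budget=10.0)

total = sum(popularity.values())
satisfied = sum(popularity[c] for c in cached) / total  # served at the edge
print(cached, f"{satisfied:.0%} of requests offloaded from the backhaul")
```

    In the paper's setting the popularity ranking is estimated from partial content ratings rather than known exactly, which is why the reported gains vary with the level of available information.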