1,515 research outputs found

    Evolution of 5G Network: A Precursor towards the Realtime Implementation of VANET for Safety Applications in Nigeria

    Get PDF
    A crucial requirement for the successful real-time design and deployment of Vehicular Ad-hoc Networks (VANETs) is to ensure high-speed data rates, low latency, information security, and a wide coverage area without sacrificing the required Quality of Service (QoS). These requirements must be met for flawless communication on the VANET. This study examines the generational patterns in mobile wireless communication and looks into the possibilities of adopting fifth-generation (5G) network technology for real-time communication of road anomalies in VANET. The current paper addresses the second phase of an ongoing project to develop real-time road anomaly detection, characterization, and communication systems for VANET. The major goal is to reduce the number of traffic accidents on Nigerian roadways. The platform will also serve for the real-time deployment and testing of various road anomaly detection algorithms, as well as schemes for communicating such detected anomalies in the VANET.
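As a hedged, worked illustration of why the latency requirement matters for VANET safety messages (the speed and latency figures are assumed values, not figures from the study): the distance a vehicle travels while a warning message is still in transit shrinks sharply when moving from 4G-class to 5G-class latency.

```python
# Minimal worked example (assumed values, not from the study): distance a
# vehicle covers while a safety warning is still in transit.
speed_kmh = 120.0                    # assumed highway speed
speed_ms = speed_kmh * 1000 / 3600   # convert to metres per second
for label, latency_s in [("~4G latency (50 ms)", 0.050), ("~5G target latency (1 ms)", 0.001)]:
    print(f"{label}: {speed_ms * latency_s:.2f} m travelled before delivery")
```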

    Specifics of Algorithmization in Data Culture

    Get PDF
    Information societies effectively transform existing cultures. These new cultures are variously defined, but because they are dominated by information, the term "data cultures" seems to be the most fitting name for them. Although it is possible that a single global data culture will emerge in the future, and such predictions prevail in academic and non-academic reflections on the subject, so far local data cultures have arisen alongside global trends, which also dynamizes and enriches both individual and collective identities. As Kazimierz Krzysztofek aptly notes, in this situation: "The greatest contradiction of the civilization of the 21st century is drawn. On the one hand, a continuous imperative: be creative and innovative, on the other hand, an increasing pressure on prediction of people's behaviour, because unpredictability causes chaos, which cannot be managed". In other words, one of the most important social issues today is to create some order in data culture(s), often pictured by columnists and researchers as a "magnetic storm", and to reduce their infinite complexity; in short, the process of algorithmization. With regard to culture, it is not possible to apply unequivocally a mathematical algorithm, which is the most precise, or a genetic or hormonal algorithm as found in nature, because taming cultural chaos is always strongly ideologized. The algorithm should instead be treated as a metaphor used to explain cultural phenomena, especially their developmental tendencies. For the researcher of contemporary societies, it is very important to answer the question: what proportions of structure and network are most beneficial for the survival of data culture(s)? This answer also directs reflection on the quality of life of individuals and societies, limiting or promoting individualism and collective intelligence in the era of hyper-digitization. These considerations are limited to an initial characterization and evaluation of the information algorithmization of man. The author refers to the concepts of researchers from different countries, highlighting the specificity of today's algorithmization, among others: the model of ambient perception, which facilitates participation in the networked information environment; the scope and reach of the big data phenomenon; forms of data visualization; personalization of content; the Isotype visual language; network custody; data journalism; and others. In conclusion, it is pointed out that the information algorithmization of man is constantly growing, which shows that data management strategies weaken the phenomenon of information overload through the logic of numerical civilization, which limits diversity by seeking to count, record, and globalize everything.

    Optimization of a wifi wireless network that maximizes the level of satisfaction of users and allows the use of new technological trends in higher education institutions

    Get PDF
    Campus wireless networks have many users with different roles and network requirements, ranging from the use of educational platforms to informative consultations, email, and other services. Currently, due to the inefficient use of network resources and limited wireless planning, caused by the growth of the technological infrastructure (which is often due to daily pressures rather than a lack of preparation by those in charge of managing the network), there are two essential factors that undermine the requirement of a stable and robust network platform: first, the degradation of the quality of service perceived by users, and second, the congestion caused by the high demand for convergent traffic (video, voice, and data). Both factors pose great challenges for network administrators, who are often overwhelmed by permanent incidents of instability, coverage, and congestion, as well as by the difficulty of maintaining the network economically. The present investigation seeks to propose a process for optimizing the infrastructure and configuration parameters of a wireless network that allows maximizing the level of satisfaction of users in Higher Education Institutions. It is expected, first, to determine an adequate methodology to estimate the level of satisfaction of the users (defining a mathematical criterion or algorithm based on the study variables [1]); then to characterize the environment in which the project will be developed by making a complete study of the wireless conditions; and finally to implement optimization strategies with software-defined networks (SDN). SDN is a concept in computer networking that allows network management to be carried out efficiently and flexibly by separating the control plane from the data plane in network devices. The SDN architecture consists of an infrastructure layer, a collection of network devices connected to the SDN controller using OpenFlow as the southbound protocol [2]. SDN will also be used to study traffic patterns on the network as a basis for optimizing network device usage [3]. The phases of the research will be carried out following the life cycle defined by the Cisco PPDIOO methodology (Prepare, Plan, Design, Implement, Operate, Optimize) [4]. Institución Universitaria ITSA, Corporación Universitaria Reformada CUR, Corporación Universitaria Latinoamericana CUL, Universidad de la Costa CUC, Universitaria Minuto de Dios UNIMINUTO, Universidad Libre
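As a hedged illustration of the SDN idea summarized above (control plane separated from the data plane, with OpenFlow as the southbound protocol), the sketch below uses the Ryu controller framework in Python. The application name, the table-miss rule, and the packet-in counting are illustrative choices, not part of the cited proposal.

```python
# Minimal sketch (assumption: Ryu controller framework, OpenFlow 1.3).
# Installs a table-miss rule so unmatched packets reach the controller,
# then counts packet-in events per switch as a crude traffic observation.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class TrafficObserver(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.packet_in_count = {}  # datapath id -> number of packet-in events

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        datapath = ev.msg.datapath
        ofproto = datapath.ofproto
        parser = datapath.ofproto_parser
        # Table-miss rule: send unmatched packets to the controller (control plane).
        match = parser.OFPMatch()
        actions = [parser.OFPActionOutput(ofproto.OFPP_CONTROLLER,
                                          ofproto.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS, actions)]
        datapath.send_msg(parser.OFPFlowMod(datapath=datapath, priority=0,
                                            match=match, instructions=inst))

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in_handler(self, ev):
        # Data-plane devices forward unknown traffic here; count it per switch.
        dpid = ev.msg.datapath.id
        self.packet_in_count[dpid] = self.packet_in_count.get(dpid, 0) + 1
        self.logger.info("switch %s: %d packet-in events so far",
                         dpid, self.packet_in_count[dpid])
```

Run with `ryu-manager` against an OpenFlow-capable switch or an emulator such as Mininet; the counted events are only a stand-in for the traffic-pattern analysis the abstract mentions.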

    Software's Copyright Anticommons

    Get PDF
    Scholars have long assessed “anticommons” problems in creative and innovative environments. An anticommons develops when an asset has numerous rights holders, each of which has a right to prevent use of the asset, but none of which has a right to use the asset without authorization from the other rights holders. Hence, when any one of those rights holders uses its rights in ways that inhibit use of the common asset, an anticommons may result. In the software world, scholars have long argued that anticommons problems arise, if at all, because of patent rights. Copyright, on the other hand, has not been viewed as a significant source of anticommons problems. But this Article argues that copyright is an increasingly significant cause of anticommons concerns in the software context for at least two related reasons. First, the increasingly collaborative nature of much modern software innovation means that any given software resource is subject to dozens, hundreds, or even thousands of distinct copyright interests, each of which can ultimately hamper use of the software resource. While collaborative innovation licensing models help reduce the threat of any given copyright holder restricting use of the software resource, these licensing models do not altogether eliminate such risks and, in fact, actually create risks of holdup and underuse that have previously received less attention than they are due. Second, interoperability needs in the growing “Internet of Things” and “cloud” economies demand sharing and reuse of software for these ecosystems to work. Yet because these technological ecosystems implicate thousands of different parties with distinct copyright interests in their software, the threat of any one of those parties ultimately using its rights in ways that inhibit the successful development and use of the Internet of Things and cloud economies looms large. In order to illustrate some of these anticommons problems in practice, this Article examines a recent high-profile software copyright dispute between Oracle and Google. As a possible solution to these types of problems, this Article assesses the merits of more explicitly adapting copyright’s fair use defense to the collaborative and interconnected nature of modern software innovation. The Article concludes by arguing that copyright disputes in other fields of creativity characterized by collaborative, interconnected development may also merit such fair use adaptations. Otherwise, anticommons problems may increasingly affect those fields as well.

    Digital-Twins towards Cyber-Physical Systems: A Brief Survey

    Get PDF
    Cyber-Physical Systems (CPS) are integrations of computation and physical processes. Physical processes are monitored and controlled by embedded computers and networks, which frequently form feedback loops where physical processes affect computations and vice versa. To ease the analysis of a system, costly physical plants can be replaced by high-fidelity virtual models that provide a framework for Digital Twins (DT). This paper aims to briefly review the state of the art and recent developments in DT and CPS. Three main components of CPS, namely communication, control, and computation, are reviewed. In addition, the main tools and methodologies required for implementing practical DT are discussed, following the main applications of DT in the fourth industrial revolution across smart manufacturing, the sixth wireless generation (6G), health, production, energy, and so on. Finally, the main limitations and directions for future work are discussed, followed by a short guideline for the real-world application of DT towards CPS.
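To make the CPS feedback loop and the digital-twin idea concrete, the following is a minimal sketch under assumed dynamics: a first-order thermal plant, a noisy sensor, and a virtual model that is corrected by measurements and used to compute a control input. All names and parameter values are assumptions for illustration, not taken from the surveyed work.

```python
# Minimal sketch (illustrative assumptions, not from the surveyed work):
# a first-order thermal "plant" and a digital twin that is kept in sync
# from noisy sensor readings and used to compute a feedback control input.
import random

AMBIENT = 20.0      # ambient temperature (deg C), assumed
SETPOINT = 50.0     # desired plant temperature, assumed
DT = 1.0            # time step (s)
TAU = 30.0          # plant time constant (s), assumed

def plant_step(temp, heater_power):
    """Physical process: simple first-order heating/cooling dynamics."""
    return temp + DT * ((AMBIENT - temp) / TAU + 0.5 * heater_power)

def sensor(temp):
    """Embedded sensing: true state observed with measurement noise."""
    return temp + random.gauss(0.0, 0.2)

class DigitalTwin:
    """Virtual model mirroring the plant, corrected by measurements."""
    def __init__(self, initial_estimate, gain=0.3):
        self.estimate = initial_estimate
        self.gain = gain  # blending factor between model prediction and measurement

    def update(self, measurement, heater_power):
        predicted = plant_step(self.estimate, heater_power)  # run the virtual model forward
        self.estimate = predicted + self.gain * (measurement - predicted)
        return self.estimate

    def control(self):
        """Feedback loop: proportional control computed on the twin's state."""
        return max(0.0, 0.1 * (SETPOINT - self.estimate))

if __name__ == "__main__":
    true_temp, twin, power = AMBIENT, DigitalTwin(AMBIENT), 0.0
    for step in range(100):
        true_temp = plant_step(true_temp, power)          # physical process evolves
        estimate = twin.update(sensor(true_temp), power)  # twin synced from sensor data
        power = twin.control()                            # computation drives actuation
    print(f"true={true_temp:.1f} C, twin estimate={estimate:.1f} C")
```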

    An investigation upon Industry 4.0 implementation: the case of small and medium enterprises and Lean organizations

    Get PDF
    In recent years, industries have undergone several shifts in their operating and management systems. Alongside technological innovation, rapid market changes, and high competitiveness, growing customer needs are driving industries to focus on producing highly customized products with ever less time to market. In this context, Industry 4.0 is a manufacturing paradigm that promises to have a great impact not only on improving productivity but also on developing new products, services, and business models. However, the literature review has shown that research on Industry 4.0 implementation is still characterized by some weaknesses and gaps (e.g., topics such as the implementation of Industry 4.0 in SMEs and its integration with the Lean Management approach). Motivated by this, the thesis sought to answer four key questions: (RQ1) What are the challenges and opportunities for SMEs in the Industry 4.0 field? (RQ2) What are the resources and capabilities for Industry 4.0 implementation in SMEs? (RQ3) How can these resources and capabilities be acquired and/or developed? (RQ4) How can Industry 4.0 and Lean Management be integrated? To address the first research question, a semi-systematic literature review in the Industry 4.0 field was conducted. The main goal was to explore the implementation of Industry 4.0 in SMEs in order to identify common challenges and opportunities for SMEs in the Industry 4.0 era. To address the second and third research questions, a multiple case study of five Portuguese SMEs was conducted with two main aims: (1) to identify the resources and capabilities required to implement Industry 4.0 in Portuguese SMEs and, drawing on mainstream theories such as the resource-based view (RBV) and dynamic capability theory, to seek empirical evidence on how SMEs use resources and capabilities to gain sustainable competitive advantage; and (2) to shed light on how those SMEs acquire and/or develop the Industry 4.0 resources and capabilities. Finally, this thesis employed a semi-systematic literature review methodology to deal with the fourth research question. As such, it explored the synergistic relationship between Industry 4.0 and Lean Management to identify the main trends in this field of research and, ultimately, the best practices. The analysis and discussion of the best practices revealed a set of potential relationships which provided a clearer understanding of the outcomes of an Industry 4.0-LM integration.

    A Survey of Using Machine Learning in IoT Security and the Challenges Faced by Researchers

    Get PDF
    The Internet of Things (IoT) has become more popular in the last 15 years, as it has significantly improved and gained control in multiple fields. We are nowadays surrounded by billions of IoT devices that directly integrate with our lives; some of them are at the center of our homes, while others handle sensitive data in domains such as the military, healthcare, and data centers. This popularity drives factories and companies to compete in producing and developing many types of these devices without caring about how secure they are. On the other hand, the IoT is considered an attractive, insecure environment for cyber theft. Machine Learning (ML) and Deep Learning (DL) have also gained importance in the last 15 years and have achieved success in the network security field as well. The IoT shares many security requirements with traditional networks, but its characteristics impose specific security features and environmental limitations, such as low energy resources, limited computational capability, and small memory. These limitations drive researchers to search for lightweight security approaches that strike a balance between performance and security. This survey provides a comprehensive discussion of the use of machine learning and deep learning in IoT devices within the last five years. It also lists the challenges faced by each model and algorithm. In addition, this survey presents some of the current solutions as well as future directions and suggestions. It also focuses on research that took the IoT environment limitations into consideration.
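As a hedged example of the kind of lightweight ML security mechanism the survey discusses, the sketch below trains an Isolation Forest on synthetic benign IoT traffic features and flags an out-of-distribution sample. The feature set and parameter values are assumptions chosen for the example, not drawn from the surveyed papers.

```python
# Minimal sketch (illustrative, not from the survey): a lightweight ML anomaly
# detector for IoT network traffic using an Isolation Forest. The feature
# choice (mean packet size, inter-arrival time, destination-port entropy) is
# an assumption made for the example.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic "benign" traffic: [mean packet size (B), inter-arrival time (s), port entropy]
benign = np.column_stack([
    rng.normal(120, 15, 500),    # small sensor payloads
    rng.normal(1.0, 0.1, 500),   # regular reporting interval
    rng.normal(0.5, 0.05, 500),  # few distinct destination ports
])

# A small forest keeps memory and compute low, matching device constraints.
model = IsolationForest(n_estimators=50, contamination=0.01, random_state=0)
model.fit(benign)

# Suspicious sample: large payloads, bursty traffic, many destination ports.
suspicious = np.array([[900.0, 0.01, 3.5]])
print(model.predict(suspicious))  # -1 flags an anomaly, 1 means normal
```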

    Management And Security Of Multi-Cloud Applications

    Get PDF
    Single cloud management platform technology has reached maturity and is quite successful in information technology applications. Enterprises and application service providers are increasingly adopting a multi-cloud strategy to reduce the risk of cloud service provider lock-in and cloud blackouts and, at the same time, gain benefits such as competitive pricing, flexible resource provisioning, and better points of presence. Another class of applications that is attracting increasing interest from cloud service providers is carriers' virtualized network services. However, virtualized carrier services require high levels of availability and performance and impose stringent requirements on cloud services. They necessitate the use of multi-cloud management and innovative techniques for placement and performance management. We consider two classes of distributed applications, virtual network services and next-generation healthcare, that would benefit immensely from deployment over multiple clouds. This thesis deals with the design and development of new processes and algorithms to enable these classes of applications. We have evolved a method for the optimization of multi-cloud platforms that paves the way for obtaining optimized placement for both classes of services. The approach we follow for placement itself is predictive, cost-optimized, latency-controlled virtual resource placement for both types of applications. To improve the availability of virtual network services, we have made innovative use of machine learning and deep learning to develop a framework for fault detection and localization. Finally, to secure patient data flowing through the wide expanse of sensors, cloud hierarchy, virtualized network, and visualization domain, we have evolved hierarchical autoencoder models for data in motion between the IoT domain and the multi-cloud domain and within the multi-cloud hierarchy.
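As a hedged, much-simplified stand-in for the hierarchical autoencoder models mentioned above, the sketch below trains a single autoencoder on synthetic "normal" records of data in motion and flags records whose reconstruction error exceeds a percentile threshold. The feature layout, network sizes, and threshold are illustrative assumptions.

```python
# Minimal sketch (a single-level simplification of the hierarchical autoencoders
# described in the thesis; features and thresholds are assumptions). Records
# with high reconstruction error are flagged as anomalous data in motion.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(1)
n_features = 6                        # e.g., flow-level statistics (assumed)
normal = rng.normal(0.0, 1.0, size=(2000, n_features)).astype("float32")

autoencoder = keras.Sequential([
    keras.layers.Input(shape=(n_features,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(3, activation="relu"),            # bottleneck
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(n_features, activation="linear"),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(normal, normal, epochs=10, batch_size=64, verbose=0)

# Threshold set from the training data's own reconstruction-error distribution.
train_err = np.mean((autoencoder.predict(normal, verbose=0) - normal) ** 2, axis=1)
threshold = np.percentile(train_err, 99)

suspect = np.full((1, n_features), 5.0, dtype="float32")  # out-of-distribution record
err = np.mean((autoencoder.predict(suspect, verbose=0) - suspect) ** 2, axis=1)[0]
print("anomalous" if err > threshold else "normal", f"(error={err:.3f})")
```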

    Graph-Theoretic Approach for Manufacturing Cybersecurity Risk Modeling and Assessment

    Full text link
    Identifying, analyzing, and evaluating cybersecurity risks are essential to assess the vulnerabilities of modern manufacturing infrastructures and to devise effective decision-making strategies to secure critical manufacturing against potential cyberattacks. In response, this work proposes a graph-theoretic approach for risk modeling and assessment to address the lack of quantitative cybersecurity risk assessment frameworks for smart manufacturing systems. In doing so, first, threat attributes are represented using an attack graphical model derived from manufacturing cyberattack taxonomies. Attack taxonomies offer consistent structures to categorize threat attributes, and the graphical approach helps model their interdependence. Second, the graphs are analyzed to explore how threat events can propagate through the manufacturing value chain and to identify the manufacturing assets that threat actors can access and compromise during a threat event. Third, the proposed method identifies the attack path that maximizes the likelihood of success and minimizes the attack detection probability, and then computes the associated cybersecurity risk. Finally, the proposed risk modeling and assessment framework is demonstrated via an illustrative example of an interconnected smart manufacturing system. Using the proposed approach, practitioners can identify critical connections and manufacturing assets requiring prioritized security controls, and can develop and deploy appropriate defense measures accordingly. Comment: 25 pages, 10 figures
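A minimal sketch of the path computation described above, under assumed probabilities: the attack graph is modeled with networkx, and "maximize success likelihood while minimizing detection probability" is turned into a shortest-path problem via the edge weight -log(p_success * (1 - p_detect)). The asset names and probabilities are invented for illustration and are not the paper's case study.

```python
# Minimal sketch (illustrative assumptions, not the paper's model): an attack
# graph whose edges carry a success probability and a detection probability.
# The preferred attack path from entry point to target asset maximizes
# prod(p_success * (1 - p_detect)), i.e., a shortest path under
# weight = -log(p_success * (1 - p_detect)).
import math
import networkx as nx

G = nx.DiGraph()
edges = [  # (source, target, success prob., detection prob.) -- assumed values
    ("internet", "hmi",       0.6, 0.3),
    ("internet", "vpn",       0.4, 0.1),
    ("vpn",      "hmi",       0.9, 0.2),
    ("hmi",      "plc",       0.7, 0.4),
    ("hmi",      "historian", 0.8, 0.2),
    ("historian","plc",       0.6, 0.1),
]
for u, v, p_succ, p_det in edges:
    G.add_edge(u, v, weight=-math.log(p_succ * (1.0 - p_det)),
               p_succ=p_succ, p_det=p_det)

path = nx.shortest_path(G, "internet", "plc", weight="weight")
likelihood = math.exp(-nx.shortest_path_length(G, "internet", "plc", weight="weight"))
print("highest-likelihood, low-detection path:", " -> ".join(path))
print(f"combined success x evasion probability: {likelihood:.3f}")
```

In the spirit of the framework, the resulting path likelihood could then be combined with an impact estimate for the compromised asset to obtain a risk value, although the paper's exact risk computation may differ.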

    A review of Smart Contract Blockchain Based on Multi-Criteria Analysis: Challenges and Motivations

    Full text link
    A smart contract is a digital program encoding a transaction protocol (the rules of a contract) that runs on the consensus architecture of a blockchain. Smart contracts and blockchain are modern technologies that have gained enormous attention in scientific and practical applications. The smart contract is the central aspect of a blockchain that enables its use as a platform beyond the cryptocurrency spectrum. The development of blockchain technology, with a focus on smart contracts, has advanced significantly in recent years. However, research on the smart contract idea still has weaknesses in implementation sectors based on a decentralized network that shares an identical state. This paper extensively reviews smart contracts from a multi-criteria analysis perspective, covering challenges and motivations. Implementing blockchain in multi-criteria analysis is necessary to increase the efficiency of interaction between users by supporting trusted information exchange, detecting malfunctions, helping users with performance issues, reaching consensus, deploying distributed solutions, and allocating plans, tasks, and joint missions. Smart contracts with strong decision-making, planning, and execution performance improve implementation in terms of efficiency, sustainability, and management. Furthermore, addressing uncertainty and supply chain performance improves users' confidence in offering new solutions to problems in smart contracts. Evaluation includes code analysis and performance, while development performance is still a work in progress. Comment: Review
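As a loose, hedged illustration of multi-criteria analysis in this context, the sketch below applies simple additive weighting (SAW) to score hypothetical smart-contract platforms against the efficiency, sustainability, and management criteria mentioned in the abstract; the platform names, weights, and scores are invented for the example.

```python
# Minimal sketch (illustrative, not from the review): simple additive weighting
# (SAW) over assumed, normalized criterion scores for hypothetical platforms.
criteria = {"efficiency": 0.4, "sustainability": 0.3, "management": 0.3}  # weights sum to 1
platforms = {  # normalized scores in [0, 1], assumed values
    "platform_a": {"efficiency": 0.8, "sustainability": 0.6, "management": 0.7},
    "platform_b": {"efficiency": 0.6, "sustainability": 0.9, "management": 0.5},
}

def saw_score(scores, weights):
    """Weighted sum of normalized criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(platforms, key=lambda p: saw_score(platforms[p], criteria), reverse=True)
for p in ranking:
    print(p, round(saw_score(platforms[p], criteria), 3))
```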