
    Digital transformation in food supply chains: an implementation framework

    Purpose: Digital transformation using Industry 4.0 technologies can address various challenges in food supply chains (FSCs). However, how emerging technologies should be integrated to achieve digital transformation in FSCs remains unclear. This study aims to establish how the digital transformation of FSCs can be achieved by adopting key technologies such as the Internet of Things (IoT), cloud computing (CC) and big data analytics (BDA).
    Design/methodology/approach: A systematic literature review (SLR) yielded 57 articles published between 2008 and 2022. Following descriptive and thematic analysis, a conceptual framework based on the diffusion of innovation (DOI) theory and the context-intervention-mechanism-outcome (CIMO) logic is established, along with avenues for future research.
    Findings: The combination of DOI theory and CIMO logic provides the theoretical foundation for linking the general innovation process to the digital transformation process. A novel conceptual framework for achieving digital transformation in FSCs is developed, spanning the initiation and implementation phases. Objectives and principles for digitally transforming FSCs are identified for the initiation phase. A four-layer technology implementation architecture is developed for the implementation phase, facilitating multiple applications for FSC digital transformation.
    Originality/value: The study contributes to the development of theory on digital transformation in FSCs and offers managerial guidelines for accelerating the growth of the food industry using key Industry 4.0 emerging technologies. The proposed framework brings clarity to the “neglected” intermediate stage of data management between data collection and analysis. The study highlights the need for a balanced integration of IoT, CC and BDA as key Industry 4.0 technologies to achieve digital transformation successfully.
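
    The abstract does not enumerate the four layers of the implementation architecture, so the minimal sketch below is only an illustration of how a layered flow from IoT data collection, through an explicit data management stage, to cloud storage and analytics could be chained in code; the layer names, validation rule and sample readings are assumptions, not the paper's architecture.

```python
# Purely illustrative sketch: layer names (IoT, data management, cloud, analytics),
# the validation rule, and the sample readings are assumptions, not the paper's design.

from dataclasses import dataclass
from statistics import mean
from typing import Dict, List


@dataclass
class SensorReading:
    """A single IoT measurement from a food supply chain asset."""
    asset_id: str
    metric: str      # e.g. "temperature_c"
    value: float


def iot_layer() -> List[SensorReading]:
    """Layer 1: collect raw readings (stubbed with static data here)."""
    return [
        SensorReading("truck-17", "temperature_c", 4.2),
        SensorReading("truck-17", "temperature_c", 7.9),
        SensorReading("cold-store-3", "temperature_c", 2.1),
    ]


def data_management_layer(readings: List[SensorReading]) -> Dict[str, List[float]]:
    """Layer 2: the intermediate data management stage -- validate and organise."""
    grouped: Dict[str, List[float]] = {}
    for r in readings:
        if -30.0 <= r.value <= 60.0:   # discard implausible temperature readings
            grouped.setdefault(r.asset_id, []).append(r.value)
    return grouped


def cloud_layer(grouped: Dict[str, List[float]]) -> Dict[str, List[float]]:
    """Layer 3: persist and share data; simply passed through in this sketch."""
    return grouped


def analytics_layer(grouped: Dict[str, List[float]]) -> Dict[str, float]:
    """Layer 4: big data analytics, reduced here to a per-asset mean."""
    return {asset: mean(values) for asset, values in grouped.items()}


if __name__ == "__main__":
    print(analytics_layer(cloud_layer(data_management_layer(iot_layer()))))
```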

    Cybersecurity in Motion: A Survey of Challenges and Requirements for Future Test Facilities of CAVs

    The way we travel is changing rapidly, and Cooperative Intelligent Transportation Systems (C-ITSs) are at the forefront of this evolution. However, the adoption of C-ITSs introduces new risks and challenges, making cybersecurity a top priority for ensuring safety and reliability. Building on this premise, this paper introduces an envisaged Cybersecurity Centre of Excellence (CSCE) designed to bolster the research, testing, and evaluation of C-ITS cybersecurity. We explore the design, functionality, and challenges of the CSCE's testing facilities, outlining the technological, security, and societal requirements. Through a thorough survey and analysis, we assess the effectiveness of these systems in detecting and mitigating potential threats, highlighting their flexibility to adapt to future C-ITSs. Finally, we identify currently unresolved challenges in various C-ITS domains, with the aim of motivating further research into the cybersecurity of C-ITSs.

    Optimization of Beyond 5G Network Slicing for Smart City Applications

    Transitioning from the current fifth-generation (5G) wireless technology, the advent of beyond 5G (B5G) signifies a pivotal stride toward sixth-generation (6G) communication technology. At its essence, B5G harnesses end-to-end (E2E) network slicing (NS) technology, enabling multiple logical networks with distinct performance requirements to be accommodated simultaneously on a shared physical infrastructure. At the forefront of this implementation lies the critical process of network slice design, a phase central to the realization of efficient smart city networks.
    This thesis focuses on a key phase of the network slicing life cycle, emphasizing the analysis and formulation of optimal procedures for configuring, customizing, and allocating E2E network slices. The focus extends to the unique demands of smart city applications, encompassing critical areas such as emergency response, smart buildings, and video surveillance. By addressing the intricacies of network slice design, the study navigates the complexities of tailoring slices to specific application needs, thereby contributing to the seamless integration of diverse services within the smart city framework.
    Addressing the core challenge of NS, which involves allocating virtual networks onto the physical topology with optimal resource allocation, the thesis formulates a dual integer linear programming (ILP) optimization problem that jointly minimizes embedding cost and latency. Given the NP-hard nature of this ILP, however, finding an efficient alternative becomes a significant hurdle. In response, the thesis introduces a novel heuristic approach: the matroid-based modified greedy breadth-first search (MGBFS) algorithm. This algorithm leverages matroid properties to guide virtual network embedding and resource allocation, aiming to provide near-optimal solutions while avoiding the computational complexity of the ILP. The proposed MGBFS algorithm not only satisfies the connectivity, cost, and latency constraints but also outperforms the benchmark model, delivering solutions remarkably close to optimal. This approach represents a substantial advancement in the optimization of smart city applications, promising heightened connectivity, efficiency, and resource utilization within the evolving landscape of B5G-enabled communication technology.
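
    The MGBFS algorithm itself is not given in this abstract, so the sketch below only illustrates the general shape of a greedy, breadth-first-search-based virtual network embedding that scores candidate placements by combined embedding cost and latency. The toy topology, capacities and scoring rule are invented, and no matroid machinery is included.

```python
# Minimal sketch, not the thesis's MGBFS algorithm: a greedy BFS-based embedding of
# virtual network functions onto a physical topology, scored by cost plus latency.

from collections import deque
from typing import Dict, List, Optional, Tuple

# Invented physical topology: node -> {neighbour: (link_cost, link_latency_ms)}
PHYSICAL = {
    "A": {"B": (1.0, 2.0), "C": (2.0, 1.0)},
    "B": {"A": (1.0, 2.0), "D": (1.0, 3.0)},
    "C": {"A": (2.0, 1.0), "D": (1.5, 1.0)},
    "D": {"B": (1.0, 3.0), "C": (1.5, 1.0)},
}
NODE_CAPACITY = {"A": 8, "B": 4, "C": 6, "D": 10}  # invented CPU capacities


def bfs_path(src: str, dst: str) -> Optional[List[str]]:
    """Shortest hop-count path between two physical nodes (graph is connected)."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in PHYSICAL[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None


def path_score(path: List[str]) -> float:
    """Combined embedding cost plus latency accumulated along a physical path."""
    cost = latency = 0.0
    for u, v in zip(path, path[1:]):
        c, l = PHYSICAL[u][v]
        cost, latency = cost + c, latency + l
    return cost + latency


def embed_slice(vnfs: List[Tuple[str, int]]) -> Dict[str, str]:
    """Greedily map each (vnf, cpu_demand) onto the feasible node with best score."""
    capacity = dict(NODE_CAPACITY)
    mapping: Dict[str, str] = {}
    prev: Optional[str] = None
    for vnf, demand in vnfs:
        candidates = [n for n, cap in capacity.items() if cap >= demand]
        if not candidates:
            raise RuntimeError(f"no capacity left for {vnf}")
        # Score each candidate by the BFS path from the previously placed VNF.
        best = min(
            candidates,
            key=lambda n: 0.0 if prev is None else path_score(bfs_path(prev, n)),
        )
        mapping[vnf], capacity[best], prev = best, capacity[best] - demand, best
    return mapping


if __name__ == "__main__":
    print(embed_slice([("ran-du", 3), ("upf", 4), ("app-server", 2)]))
```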

    Software Defined Radio, a perspective from education

    The evolution of communication systems has brought about a paradigm shift, particularly in radiocommunications, where software has increasingly taken precedence over hardware. This transition has not only reduced implementation costs but has also significantly enhanced the flexibility of equipment architecture. A prime example of this trend is the emergence and consolidation of software-defined radio (SDR) technology in recent decades. This study provides a comprehensive contextualization of SDR technology, offering insights into its current state in terms of development tools and market equipment. Additionally, two learning scenarios employing different teaching methodologies are presented. In the first scenario, communication theory is approached exclusively from a theoretical perspective; in the second, knowledge acquisition is encouraged through the implementation of low-cost laboratories that incorporate SDR technology. The study indicates that using SDR technology boosts student motivation and learning, with 73.13% of students believing it enhances engineering education and 96% showing increased motivation. Students using SDR in practical laboratories perform better on knowledge tests, but statistical analysis shows that the difference is not statistically significant.
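
    The abstract does not name the statistical test behind the non-significant result; a common choice for comparing two independent groups is a two-sample t-test, sketched below on invented scores purely to illustrate the kind of check involved.

```python
# Hypothetical illustration: the test and the scores are assumptions, not the study's data.
from scipy import stats

theory_only = [62, 70, 58, 75, 66, 71, 64, 69]   # invented exam scores
sdr_lab     = [68, 74, 63, 77, 70, 72, 66, 73]   # invented exam scores

t_stat, p_value = stats.ttest_ind(sdr_lab, theory_only, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value above the usual 0.05 threshold would mirror the study's conclusion
# that the observed difference is not statistically significant.
```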

    Natural and Technological Hazards in Urban Areas

    Natural hazard events and technological accidents are distinct causes of environmental impacts. Natural hazards are physical phenomena that have been active throughout geological time, whereas technological hazards result from actions or facilities created by humans. In our time, however, combined natural and man-made hazards are increasingly induced. Overpopulation and urban development in areas prone to natural hazards increase the impact of natural disasters worldwide. Additionally, urban areas are frequently characterized by intense industrial activity and rapid, poorly planned growth that threatens the environment and degrades the quality of life. Proper urban planning is therefore crucial to minimize fatalities and reduce the environmental and economic impacts that accompany both natural and technological hazardous events.

    Improving the SEP licensing framework by revising SSOs’ IPR policies

    This thesis examines the SEP licensing framework with a view to understanding whether it can be improved by revising IPR policies. ICT standardisation, which provides interoperability, is one of the building blocks of the modern economy. Put simply, without standards there would be no IoT and, for example, consumers would only be able to connect to a wireless network with devices built specifically for that network. Standards are not a new phenomenon; however, they have become more complex with the increasing importance of technology, which in turn has made them more dependent on patented technologies (i.e. SEPs). SEPs cause complications in standardisation, as they require SEP owners and potential licensees to negotiate and agree on usually complex licensing agreements. Although SSOs have attempted to regulate this relationship with their IPR policies, these policies now seem unable to keep up with the changing dynamics and needs of standardisation. Dysfunctions in the system not only affect competition in the relevant markets but also prejudice consumers’ interests, for example by passing on higher prices to cover supra-competitive royalties. In particular, since the first Rambus case in the US, competition/antitrust agencies and courts have been dealing with SEP-related issues, and the EU has recently been considering addressing some of them through legislation. Conversely, this research derives from the notion that active standardisation participants are better equipped to deal with SEP-related issues, and that flexible IPR policies are more suitable for addressing these issues in the dynamic standardisation ecosystem. Against this backdrop, this comparative research aims to identify areas where the SEP licensing framework can be improved by reforming IPR policies and, using black-letter and empirical research methods, develops proposals that SSOs can implement.

    Adaptive Data-driven Optimization using Transfer Learning for Resilient, Energy-efficient, Resource-aware, and Secure Network Slicing in 5G-Advanced and 6G Wireless Systems

    Title from PDF of title page, viewed January 31, 2023
    Dissertation advisor: Cory Beard
    Vita
    Includes bibliographical references (pages 134-141)
    Dissertation (Ph.D.)--Department of Computer Science and Electrical Engineering, University of Missouri--Kansas City, 2022

    5G-Advanced is the next step in the evolution of fifth-generation (5G) technology. It will introduce a new level of expanded capabilities beyond connectivity and enable a broader range of advanced applications and use cases. 5G-Advanced will support modern applications with greater mobility and high dependability, while artificial intelligence and machine learning will enhance network performance through spectral-efficiency and energy-saving improvements. This research established a framework to optimally control and manage an appropriate selection of network slices for incoming requests from diverse applications and services in Beyond 5G (B5G) networks. The developed DeepSlice model is used to optimize network and per-slice load efficiency across isolated slices and to manage the slice life cycle in case of failure. The DeepSlice framework can predict unknown connections by utilizing the learning from a developed deep-learning neural network model.
    The research also addresses threats to the performance, availability, and robustness of B5G networks by proactively preventing and resolving them. The study proposed a Secure5G framework for authentication, authorization, trust, and control in a network slicing architecture for 5G systems. The developed model protects the 5G infrastructure from Distributed Denial of Service attacks by analyzing and learning from incoming connections. The research demonstrates preventive measures against volume, flooding, and masking (spoofing) attacks, and builds the framework toward the zero-trust objective (never trust, always verify, and verify continuously) that improves resilience.
    Another fundamental difficulty for wireless network systems is providing a desirable user experience under varying network conditions, such as fluctuating network loads and bandwidth. Mobile Network Operators (MNOs) have long battled unforeseen network traffic events. This research proposed ADAPTIVE6G, which tackles the network load estimation problem using knowledge-inspired transfer learning on radio network Key Performance Indicators (KPIs) from network slices. These algorithms enable MNOs to optimally coordinate their computational tasks in stochastic and time-varying network states.
    Energy efficiency is another significant KPI for tracking the sustainability of network slicing. Increasing traffic demands in 5G dramatically increase the energy consumption of mobile networks, which is unsustainable in terms of both cost and environmental impact. This research proposed an innovative ECO6G model to attain sustainability and energy efficiency. The findings suggest that the developed model can reduce network energy costs without negatively impacting performance or the end-customer experience, compared with classical machine learning and statistically driven models. The proposed model is validated against the industry-standardized energy efficiency definition, and the derived operational expenditure savings show significant cost reductions for MNOs.
    Contents: Introduction -- A deep neural network framework towards a resilient, efficient, and secure network slicing in Beyond 5G networks -- Adaptive resource management techniques for network slicing in Beyond 5G networks using transfer learning -- Energy and cost analysis for network slicing deployment in Beyond 5G networks -- Conclusion and future scope
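
    DeepSlice's architecture and training data are not described in this abstract, so the sketch below merely illustrates the general pattern of mapping per-request KPIs to a slice type (eMBB, URLLC, mMTC) with a small neural network classifier; the features, labelling rule and data are invented and are not the dissertation's model.

```python
# Illustrative sketch only, not the DeepSlice model: features, labels, and data are invented.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 600

# Invented per-request features: [requested_bandwidth_mbps, latency_budget_ms, packet_rate_pps]
X = np.column_stack([
    rng.uniform(0.1, 1000.0, n),
    rng.uniform(1.0, 200.0, n),
    rng.uniform(1.0, 10_000.0, n),
])

# Toy labelling rule standing in for real traffic traces.
y = np.where(X[:, 1] < 10, "URLLC", np.where(X[:, 0] > 100, "eMBB", "mMTC"))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Scale features, then fit a small feed-forward network to classify the slice type.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
)
model.fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
print("predicted slice for a new request:", model.predict([[500.0, 50.0, 200.0]]))
```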

    Potential of machine learning/Artificial Intelligence (ML/AI) for verifying configurations of 5G multi Radio Access Technology (RAT) base station

    The enhancements in mobile networks from 1G to 5G have greatly increased data transmission reliability and speed. However, concerns with 5G must be addressed. As demands on system performance and reliability grow, ML and AI integration in products and services becomes more common. Integration teams developing cellular network equipment test devices end to end to ensure that hardware and software components function correctly. Radio unit integration is typically the first integration phase, in which the radio is tested on its own, without additional network components such as the baseband unit (BBU) and user equipment (UE).
    The 5G architecture and the technology it uses are then explained. The architecture defined by 3GPP for 5G differs from previous generations in using Network Functions (NFs) instead of network entities. This service-based architecture offers NF reusability to reduce costs and provides modularity, allowing the best vendor options for customer radio products. 5G introduced the O-RAN concept to decompose the RAN architecture, allowing for increased speed, flexibility, and innovation; NG-RAN provided this solution to speed up the development and implementation of 5G. The O-RAN concept aims to improve the efficiency of the RAN by breaking it down into components, allowing for more agility and customization. The four protocols, the eCPRI interface, and the fronthaul functionalities that NG-RAN follows are described next. Additionally, the significance of NR is explained along with its benefits, which include high data rates, lower latency, improved spectral efficiency, increased network flexibility, and improved energy efficiency. The timeline for 5G development is provided along with the different 3GPP releases. Standalone and non-standalone architectures are integral to 5G deployment; hence, they are also defined with illustrations. The two frequency ranges that NR utilizes, FR1 and FR2, are described as well: FR1 is the sub-6 GHz range, containing low- and mid-band frequencies, whereas FR2 comprises much higher frequencies and is commonly known as the mmWave band.
    Data collection for the ML approaches is then described, covering the test setup, data collection, data description, and data visualization parts of the thesis. The test PC runs tests, executes test cases using test libraries, and collects data from various logs to analyze the system's performance. The logs contain information about the test results, which can be used to identify issues and evaluate the system's performance. The data were initially stored in JSON files, extracted with a Python script, and then fed into an Excel sheet for further analysis. The data description explains the parameters used when training the models, and a Jupyter notebook is used to visualize the data with graphs.
    Moreover, the ML techniques used to analyze the data are described. Three methods are used, all belonging to supervised learning: random forest, XGBoost, and LSTM. These three models form the basis of the ML techniques applied in the thesis. The results and discussion section explains the outcomes of the ML models and discusses how the thesis will be used in the future.
    The results cover the parameters to which the ML models are applied: SINR, noise power, rxPower, and RSSI are the monitored metrics. The variance of these parameters is essential in evaluating the quality of the product test setup, the quality of the software being tested, and the state of the test environment. The discussion section explains why these parameters were chosen, which ML model is most appropriate for the data being analyzed, and what the next steps in implementation are.
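
    The thesis pipeline itself is not reproduced in this abstract; the sketch below only illustrates how one of the three mentioned models (a random forest) could be fitted to the monitored metrics (SINR, noise power, rxPower, RSSI) to flag problematic test runs. All data and the pass/fail labelling rule are invented for the example.

```python
# Hypothetical sketch, not the thesis pipeline: data, thresholds, and labels are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 400

# Invented radio metrics resembling the monitored parameters.
sinr = rng.normal(20, 5, n)          # dB
noise_power = rng.normal(-95, 3, n)  # dBm
rx_power = rng.normal(-70, 6, n)     # dBm
rssi = rx_power + rng.normal(0, 1, n)

X = np.column_stack([sinr, noise_power, rx_power, rssi])
# Toy label: a run "fails" when SINR is low or received power drops sharply.
y = ((sinr < 12) | (rx_power < -80)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
print("feature importances (SINR, noise, rxPower, RSSI):", model.feature_importances_)
```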