Deep generative models for network data synthesis and monitoring
Measurement and monitoring are fundamental tasks in all networks, enabling the downstream management and optimization of the network. Although networks inherently generate abundant monitoring data, accessing and effectively measuring that data is another matter. The challenges exist in many aspects. First, network monitoring data is inaccessible to external users, and it is hard to provide a high-fidelity dataset without leaking commercially sensitive information. Second, effective data collection covering a large-scale network system can be very expensive, given the growing size of networks, e.g., the number of cells in a radio network or the number of flows in an Internet Service Provider (ISP) network. Third, it is difficult to ensure fidelity and efficiency simultaneously in network monitoring, as the resources available in network elements to support measurement functions are too limited to implement sophisticated mechanisms. Finally, understanding and explaining the behavior of the network becomes challenging due to its size and complex structure. Various emerging optimization-based solutions (e.g., compressive sensing) and data-driven solutions (e.g., deep learning) have been proposed for the aforementioned challenges. However, the fidelity and efficiency of existing methods cannot yet meet current network requirements.
The contributions made in this thesis significantly advance the state of the art in the domain of network measurement and monitoring techniques. Overall, we leverage cutting-edge machine learning technology, deep generative modeling, throughout the entire thesis. First, we design and realize APPSHOT, an efficient city-scale network traffic sharing system based on a conditional generative model, which only requires open-source contextual data during inference (e.g., land use information and population distribution). Second, we develop an efficient drive testing system, GENDT, based on a generative model, which combines graph neural networks, conditional generation, and quantified model uncertainty to enhance the efficiency of mobile drive testing. Third, we design and implement DISTILGAN, a high-fidelity, efficient, versatile, and real-time network telemetry system with latent GANs and spectral-temporal networks. Finally, we propose SPOTLIGHT, an accurate, explainable, and efficient anomaly detection system for the Open RAN (Radio Access Network). The lessons learned through this research are summarized, and interesting topics are discussed for future work in this domain. All proposed solutions have been evaluated with real-world datasets and applied to support different applications in real systems.
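The core idea of conditional traffic synthesis described above can be illustrated with a toy sketch. This is not the thesis's actual APPSHOT model: the land-use categories, per-capita rates, and log-normal noise below are purely hypothetical assumptions standing in for a learned generative distribution conditioned on open-source context.

```python
import random

def synthesize_traffic(land_use: str, population: int, rng: random.Random) -> float:
    """Sample a synthetic hourly traffic volume (GB) for one city cell,
    conditioned on hypothetical contextual inputs (land use, population)."""
    # Hypothetical per-capita traffic rates by land-use category.
    base_per_capita = {"residential": 0.002, "commercial": 0.005, "industrial": 0.001}
    mean = population * base_per_capita.get(land_use, 0.002)
    # Log-normal noise stands in for the learned generative distribution.
    return mean * rng.lognormvariate(0.0, 0.3)

rng = random.Random(42)
samples = [synthesize_traffic("commercial", 10_000, rng) for _ in range(1000)]
avg = sum(samples) / len(samples)
```

The point of the sketch is the interface, not the numbers: the synthesizer needs only public contextual data at inference time, never the operator's raw measurements.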
Digitalization and Development
This book examines the diffusion of digitalization and Industry 4.0 technologies in Malaysia by focusing on the ecosystem critical for its expansion. The chapters examine the digital proliferation in major sectors of agriculture, manufacturing, e-commerce and services, as well as the intermediary organizations essential for the orderly performance of socioeconomic agents.
The book incisively reviews policy instruments critical for the effective and orderly development of the embedding organizations, and the regulatory framework needed to quicken the appropriation of socioeconomic synergies from digitalization and Industry 4.0 technologies. It highlights the importance of collaboration between government, academic and industry partners, as well as makes key recommendations on how to encourage adoption of IR4.0 technologies in the short- and long-term.
This book bridges the concepts and applications of digitalization and Industry 4.0 and will be a must-read for policy makers seeking to quicken the adoption of these technologies.
Techno-Economic Assessment in Communications: New Challenges
This article presents a brief history of Techno-Economic Assessment (TEA) in Communications and a proposed redefinition of TEA, as well as the new challenges derived from a dynamic context with cloud-native virtualized networks, the Helium Network and similar blockchain-based decentralized networks, the new network-as-a-platform (NaaP) paradigm, carbon pricing, network sharing, and web3, metaverse and blockchain technologies. The authors formulate the research question and show the need to improve TEA models to integrate and manage all this increasing complexity. This paper also proposes the characteristics TEA models should have and their current degree of compliance for several use cases: 5G and beyond, software-defined wide area network (SD-WAN), secure access service edge (SASE), secure service edge (SSE), and cloud cybersecurity risk assessment. The authors also present the extensibility of TEA to request for proposals (RFP) processes and to other industries, concluding that there is an urgent need for agile and effective TEA in Communications that allows the industrialization of agile decision-making for all market stakeholders to choose the optimal solution for any technology, scenario and use case.
Improving the SEP licensing framework by revising SSOs’ IPR policies
This thesis examines the SEP licensing framework with a view to understanding whether it can be improved by revising IPR policies.
ICT standardisation, which provides interoperability, is one of the building blocks of the modern economy. Put simply, without standards there would be no IoT, and consumers, for example, would only be able to connect to a wireless network with devices specifically built for that network. Standards are not a new phenomenon; however, they became more complex with the increasing importance of technology, which made them, in turn, more dependent on patented technologies (i.e. SEPs). SEPs cause complications in standardisation as they require SEP owners and potential licensees to negotiate and agree on usually complex licensing agreements. Although SSOs have attempted to regulate this relationship with their IPR policies, it now seems these policies cannot keep up with the changing dynamics and needs in standardisation. Dysfunctions in the system do not only affect competition in the relevant markets; they also prejudice consumers' interests, for example, by passing on higher prices to cover supra-competitive royalties.
In particular, since the first Rambus case in the US, competition/antitrust agencies and courts have been dealing with SEP-related issues. Recently, the EU has been considering addressing some of those with legislation. Conversely, this research derives from the notion that active standardisation participants are better equipped to deal with SEP-related issues, and flexible IPR policies are more suitable for addressing these issues in the dynamic standardisation ecosystem.
Against this backdrop, this comparative research aims to identify areas where the SEP licensing framework can be improved by reforming IPR policies, and it develops, using black-letter and empirical research methods, proposals that SSOs can implement.
Investigating the Usability and Quality of Experience of Mobile Video-Conferencing Apps Among Bandwidth-Constrained Users in South Africa
In response to Covid-19 and global lockdowns, we have seen a surge in the usage of video-conferencing tools to enable people to work from home and stay connected to family and friends. Although understanding the performance and the perceived quality of experience for users with bandwidth caps and poor internet connections could guide the design of video-conferencing apps, the usability of video-conferencing applications has been severely overlooked in developing countries like South Africa, where one-third of adults rely on mobile devices to access the internet and where the per-gigabyte cost of data is among the highest in Africa. Considering these numbers, we conduct a two-pronged study in which 1) we measure the bandwidth consumption of different Android apps through bandwidth measurement experiments and 2) we interview bandwidth-constrained users to better understand their perceptions of mobile video-conferencing apps. The key benefit of this study will be to inform organisations that seek to be inclusive about these tools' relative usability by letting them know about the factors influencing users' quality of experience.
The Economics of Information and Communication Technologies in our Society
Information and Communication Technologies (ICTs) play a fundamental role in today's society. As ICTs become more mature and widely adopted, societies become more dependent on their use to operationalize daily activities. However, there are multiple societal impacts of ICTs that are not yet well understood. In this dissertation, I explore three different aspects of ICTs that have been widely discussed by media and industry during recent years. I analyze these topics from an economic perspective, contributing to the debate with rigorous modeling and the ensuing discussion of its implications. First, I study the impact that the COVID-19 pandemic had on the usage of remote meeting technologies. Second, I empirically tackle the long-debated question of whether internet users perceive internet providers' Network Neutrality practices. Finally, I analyze the most recent and ambitious public policy in the U.S. to improve households' broadband internet connectivity, the so-called policy of bridging the digital divide.
Split Federated Learning for 6G Enabled-Networks: Requirements, Challenges and Future Directions
Sixth-generation (6G) networks anticipate intelligently supporting a wide
range of smart services and innovative applications. Such a context urges a
heavy usage of Machine Learning (ML) techniques, particularly Deep Learning
(DL), to foster innovation and ease the deployment of intelligent network
functions/operations, which are able to fulfill the various requirements of the
envisioned 6G services. Specifically, collaborative ML/DL consists of deploying
a set of distributed agents that collaboratively train learning models without
sharing their data, thus improving data privacy and reducing the
time/communication overhead. This work provides a comprehensive study on how
collaborative learning can be effectively deployed over 6G wireless networks.
In particular, our study focuses on Split Federated Learning (SFL), a recently
emerged technique that promises better performance than existing collaborative
learning approaches. We first provide an overview of three emerging
collaborative learning paradigms (federated learning, split learning, and
split federated learning), as well as of 6G networks, their main vision, and
the timeline of key developments. We then highlight the need for split
federated learning in upcoming 6G networks across every aspect,
including 6G technologies (e.g., intelligent physical layer, intelligent edge
computing, zero-touch network management, intelligent resource management) and
6G use cases (e.g., smart grid 2.0, Industry 5.0, connected and autonomous
systems). Furthermore, we review existing datasets along with frameworks that
can help in implementing SFL for 6G networks. We finally identify key technical
challenges, open issues, and future research directions related to SFL-enabled
6G networks.
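The aggregation step that SFL inherits from federated learning can be sketched in a few lines. This is a deliberately simplified illustration, not any specific framework's API: models are plain lists of floats, and real SFL additionally splits each model between client and server, with clients sending intermediate activations rather than raw data.

```python
def fed_avg(client_weights: list[list[float]], num_samples: list[int]) -> list[float]:
    """Sample-weighted average of per-client parameter vectors,
    as performed by the aggregation server each training round."""
    total = sum(num_samples)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, num_samples)) / total
        for i in range(dim)
    ]

# Three clients, two parameters each, weighted by local dataset size.
clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 10, 20]
global_weights = fed_avg(clients, sizes)  # -> [3.5, 4.5]
```

Only parameters (or, in SFL, split-layer activations and gradients) cross the network; each client's training data never leaves the device, which is the privacy property the abstract highlights.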
Resource Allocation in Networking and Computing Systems: A Security and Dependability Perspective
In recent years, there has been a trend to integrate networking and computing systems, whose management is getting increasingly complex. Resource allocation is one of the crucial aspects of managing such systems and is affected by this increased complexity. Resource allocation strategies aim to effectively maximize performance, system utilization, and profit by considering virtualization technologies, heterogeneous resources, context awareness, and other features. In such a complex scenario, security and dependability are vital concerns that need to be considered in future computing and networking systems in order to provide future advanced services, such as mission-critical applications. This paper provides a comprehensive survey of existing literature that considers security and dependability for resource allocation in computing and networking systems. The current research works are categorized by the type of resources allocated for different technologies, scenarios, issues, attributes, and solutions. The paper presents the research works on resource allocation that include security and dependability, both singularly and jointly. Future research directions on resource allocation are also discussed. The paper shows that only a few works consider, even singularly, security and dependability in resource allocation in future computing and networking systems, and it highlights the importance of jointly considering security and dependability and the need for intelligent, adaptive and robust solutions. This paper aims to help researchers effectively consider security and dependability in future networking and computing systems.
Transformation of the Digital Payment Ecosystem in India: A Case Study of Paytm
Paytm is a payment app in India providing e‐wallet services; it is also the most prominent mobile e‐commerce app in the world's third‐largest economy. This article uses Paytm as a case study to better understand the global platform economy and its implications for social and economic inequities. We contextualize the emergence of Paytm by drawing attention to its relationship with India's developing digital infrastructure and marginalized populations—many of whom are part of the platform's user base. We use a political economy lens to investigate Paytm's market structure, stakeholders, innovations, and beneficiaries. Our research is guided by the question: What resources, infrastructures, and policies have given rise to India's digital payment ecosystem, and how have these contributed to economic and social inequities? Accordingly, we audited the international and Indian business press and Paytm's corporate communications from 2016 to 2020. Our analysis points to the tensions between private and public interests in the larger platform ecosystem, dispelling notions of platforms as neutral arbiters of market transactions. We argue that Paytm is socially beneficial to the extent that it reduces transaction costs and makes digital payments more accessible for marginalized populations; it is detrimental to the extent that it jeopardizes user data and privacy while suppressing competition in the platform economy.