
    Ecosystem synergies, change and orchestration

    This thesis investigates ecosystem synergies, change, and orchestration. The research topics are motivated by my curiosity, a fragmented research landscape, theoretical gaps, and new phenomena that challenge extant theories. To address these motivators, I conduct literature reviews to organise existing studies and identify their limiting assumptions in light of new phenomena. Empirically, I adopt a case study method with abductive reasoning for a longitudinal analysis of the Alibaba ecosystem from 1999 to 2020. My findings provide an integrated and updated conceptualisation of ecosystem synergies that comprises three distinctive but interrelated components: 1) stacking and integrating generic resources for efficiency and optimisation, 2) empowering generative changes for variety and evolvability, and 3) governing tensions for sustainable growth. Theoretically grounded and empirically refined, this new conceptualisation helps us better understand the unique synergies of ecosystems, which differ from those of alternative collective organisations, and explains the forces that drive voluntary participation for value co-creation. Regarding ecosystem change, I find a duality relationship between intentionality and emergence and develop a phasic model of sustainable ecosystem growth with internal and external drivers. This new understanding challenges and extends prior discussions, with their dominant dualism view, focus on partial drivers, and taken-for-granted lifecycle model. I propose that ecosystem orchestration involves the systematic coordination of technological, adoption, internal, and institutional activities and is driven by long-term visions and adjusted by re-visioning. My analysis reveals the important role of internal orchestration (re-envisioning, piloting, and reconfiguring the organisational architecture), the synergy and system principles in designing adoption activities, and the expanding arena of institutional activities.
Finally, building on the above findings, I reconceptualise ecosystems and sustainable ecosystem growth to highlight multi-stakeholder value creation, inclusivity, long-term orientation, and an interpretative approach. The thesis ends by discussing the implications for practice, policy, and future research.

    Data Center Server Virtualization Solution Using Microsoft Hyper-V

    Cloud Computing has helped businesses scale within minutes and take their services to their customers much faster. Virtualization is considered the core computing layer of a cloud setup. The problems of a traditional data center environment, such as space, power, resilience, centralized data management, and rapid deployment of servers according to business need, have been addressed by the introduction of Hyper-V (a server virtualization solution from Microsoft). Companies can now deploy multiple servers and applications with just a click, and they can also centrally manage data storage. This paper focuses on the differences between the VMware and Hyper-V virtualization platforms and on building a virtualized infrastructure solution using Hyper-V.

    Towards 6G Through SDN and NFV-Based Solutions for Terrestrial and Non-Terrestrial Networks

    As societal needs continue to evolve, there has been a marked rise in a wide variety of emerging use cases that cannot be served adequately by existing networks. For example, increasing industrial automation has not only resulted in a massive rise in the number of connected devices, but has also brought forth the need for remote monitoring and reconnaissance at scale, often in remote locations characterized by a lack of connectivity options. Going beyond 5G, which has largely focused on enhancing the quality-of-experience for end devices, the next generation of wireless communications is expected to be centered around the idea of "wireless ubiquity". The concept of wireless ubiquity mandates that the quality of connectivity is not only determined by classical metrics such as throughput, reliability, and latency, but also by the level of coverage offered by the network. In other words, the upcoming sixth generation of wireless communications should be characterized by networks that exhibit high throughput and reliability with low latency, while also providing robust connectivity to a multitude of devices spread across the surface of the Earth, without any geographical constraints. The objective of this PhD thesis is to design novel architectural solutions for the upcoming sixth generation of cellular and space communications systems with a view to enabling wireless ubiquity with software-defined networking and network function virtualization at its core. Towards this goal, this thesis introduces a novel end-to-end system architecture for cellular communications characterized by innovations such as the AirHYPE wireless hypervisor. Furthermore, within the cellular systems domain, solutions for radio access network design with software-defined mobility management, and containerized core network design optimization have also been presented. On the other hand, within the space systems domain, this thesis introduces the concept of the Internet of Space Things (IoST). 
IoST is a novel cyber-physical system centered on nanosatellites and is capable of delivering ubiquitous connectivity for a wide variety of use cases, ranging from monitoring and reconnaissance to in-space backhauling. In this direction, contributions relating to constellation design, routing, and automatic network slicing form a key aspect of this thesis.

    Proyecto Docente e Investigador, Trabajo Original de Investigación y Presentación de la Defensa, preparado por Germán Moltó para concursar a la plaza de Catedrático de Universidad, concurso 082/22, plaza 6708, área de Ciencia de la Computación e Inteligencia Artificial

    This document contains the teaching and research project of the candidate Germán Moltó Martínez, submitted as a requirement for the competitive examination for access to positions in the University Teaching Bodies. Specifically, the document concerns the competition for position 6708, Full Professor (Catedrático de Universidad) in the area of Computer Science, in the Departamento de Sistemas Informáticos y Computación of the Universitat Politècnica de València. The position is attached to the Escola Tècnica Superior d'Enginyeria Informàtica, and its profile covers the courses "Infraestructuras de Cloud Público" (Public Cloud Infrastructures) and "Estructuras de Datos y Algoritmos" (Data Structures and Algorithms). The Academic, Teaching and Research Record is also included, as well as the presentation used during the defence. Germán Moltó Martínez (2022). Proyecto Docente e Investigador, Trabajo Original de Investigación y Presentación de la Defensa, preparado por Germán Moltó para concursar a la plaza de Catedrático de Universidad, concurso 082/22, plaza 6708, área de Ciencia de la Computación e Inteligencia Artificial. http://hdl.handle.net/10251/18903

    Detection and Mitigation of Steganographic Malware

    A new attack trend concerns the use of some form of steganography and information hiding to make malware stealthier and able to elude many standard security mechanisms. This Thesis therefore addresses the detection and mitigation of this class of threats. In particular, it considers malware implementing covert communications within network traffic or cloaking malicious payloads within digital images. The first research contribution of this Thesis is in the detection of network covert channels. Unfortunately, the literature on the topic lacks real traffic traces or attack samples for performing precise tests or security assessments. Thus, a preliminary research activity was devoted to developing two ad-hoc tools. The first creates covert channels targeting the IPv6 protocol by eavesdropping flows, whereas the second embeds secret data within arbitrary traffic traces that can be replayed to perform investigations in realistic conditions. The Thesis then presents a security assessment of the impact of hidden network communications in production-quality scenarios. Results were obtained by considering channels cloaking data in the most popular protocols (e.g., TLS, IPv4/v6, and ICMPv4/v6) and showed that de facto standard intrusion detection systems and firewalls (i.e., Snort, Suricata, and Zeek) are unable to spot this class of hazards. Since malware can conceal information (e.g., commands and configuration files) in almost every protocol, traffic feature, or network element, configuring or adapting pre-existing security solutions may not be straightforward. Moreover, inspecting multiple protocols, fields, or conversations at the same time could lead to performance issues.
Thus, a major effort was devoted to developing a suite based on the extended Berkeley Packet Filter (eBPF) to gain visibility over different network protocols and components and to efficiently collect various performance indicators or statistics using a single technology. This part of the research made it possible to spot network covert channels targeting the header of the IPv6 protocol or the inter-packet time of generic network conversations. In addition, the eBPF-based approach turned out to be very flexible and also revealed hidden data transfers between two processes co-located within the same host. Another important contribution of this part of the Thesis concerns the deployment of the suite in realistic scenarios and its comparison with other similar tools. Specifically, a thorough performance evaluation demonstrated that eBPF can be used to inspect traffic and reveal the presence of covert communications even under high loads; for instance, it can sustain rates up to 3 Gbit/s with commodity hardware. To further address the problem of revealing network covert channels in realistic environments, this Thesis also investigates malware targeting traffic generated by Internet of Things devices. In this case, an incremental ensemble of autoencoders was used to cope with the "unknown" location of the hidden data generated by a threat covertly exchanging commands with a remote attacker. The second research contribution of this Thesis is in the detection of malicious payloads hidden within digital images. In fact, the majority of real-world malware exploits hiding methods based on Least Significant Bit steganography and some of its variants, such as the Invoke-PSImage mechanism. Therefore, a substantial amount of research has been done to detect the presence of hidden data and classify the payload (e.g., malicious PowerShell scripts or PHP fragments).
To this aim, mechanisms leveraging Deep Neural Networks (DNNs) proved to be flexible and effective, since they can learn by combining raw low-level data and can be updated or retrained to consider unseen payloads or images with different features. To take realistic threat models into account, this Thesis studies malware targeting different types of images (i.e., favicons and icons) and various payloads (e.g., URLs and Ethereum addresses, as well as webshells). The results showed that DNNs can be considered a valid tool for spotting the presence of hidden contents, since their detection accuracy remains above 90% even when facing "elusion" mechanisms such as basic obfuscation techniques or alternative encoding schemes. Lastly, when detection or classification is not possible (e.g., due to resource constraints), approaches enforcing "sanitization" can be applied. Thus, this Thesis also considers autoencoders able to disrupt hidden malicious contents without degrading the quality of the image.
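To make the hiding technique concrete, the following is a minimal, illustrative sketch of Least Significant Bit steganography over raw pixel bytes; the function and payload names are assumptions for the example, not artifacts from the thesis tooling:

```python
# Minimal LSB steganography sketch: hide a payload in the least significant
# bit of each pixel byte, then recover it. Illustrative only.

def embed_lsb(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide `payload` (MSB-first per byte) in the LSBs of `pixels`."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    out = bytearray(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit  # overwrite only the LSB
    return out

def extract_lsb(pixels: bytearray, n_bytes: int) -> bytes:
    """Recover `n_bytes` of hidden data from the pixel LSBs."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for bit in pixels[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (bit & 1)
        out.append(byte)
    return bytes(out)

cover = bytearray(range(256)) * 4        # stand-in for image pixel data
stego = embed_lsb(cover, b"evil.ps1")    # hide a mock script name
assert extract_lsb(stego, 8) == b"evil.ps1"
```

Because each byte changes by at most one, the visual quality of the cover image is essentially preserved, which is exactly what makes this class of payloads hard to spot without the statistical or DNN-based detectors discussed above.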

    End-to-End Trust Fulfillment of Big Data Workflow Provisioning over Competing Clouds

    Cloud Computing has emerged as a promising and powerful paradigm for delivering data-intensive, high-performance computation, applications and services over the Internet. Cloud Computing has enabled the implementation and success of Big Data, a relatively recent phenomenon consisting of the generation and analysis of abundant data from various sources. Accordingly, to satisfy the growing demands of Big Data storage, processing, and analytics, a large market has emerged for Cloud Service Providers, offering a myriad of resources, platforms, and infrastructures. The proliferation of these services often makes it difficult for consumers to select the most suitable and trustworthy provider to fulfill the requirements of building complex workflows and applications in a relatively short time. In this thesis, we first propose a quality specification model to support dual pre- and post-cloud workflow provisioning, consisting of service provider selection and workflow quality enforcement and adaptation. This model captures key properties of the quality of work at different stages of the Big Data value chain, enabling standardized quality specification, monitoring, and adaptation. Subsequently, we propose a two-dimensional trust-enabled framework to facilitate end-to-end Quality of Service (QoS) enforcement that: 1) automates cloud service provider selection for Big Data workflow processing, and 2) maintains the required QoS levels of Big Data workflows during runtime through dynamic orchestration using multi-model architecture-driven workflow monitoring, prediction, and adaptation. The trust-based automatic service provider selection scheme we propose in this thesis is comprehensive and adaptive, as it relies on a dynamic trust model to evaluate the QoS of a cloud provider prior to taking any selection decisions.
It is a multi-dimensional trust model for Big Data workflows over competing clouds that assesses the trustworthiness of cloud providers based on three trust levels: (1) the presence of the most up-to-date verified cloud resource capabilities, (2) reputational evidence measured by neighboring users, and (3) a recorded personal history of experiences with the cloud provider. The trust-based workflow orchestration scheme we propose aims to avoid performance degradation or cloud service interruption. Our workflow orchestration approach is not only based on automatic adaptation and reconfiguration supported by monitoring, but also on predicting cloud resource shortages, thus preventing performance degradation. We formalize the cloud resource orchestration process using a state machine that efficiently captures different dynamic properties of the cloud execution environment. In addition, we use a model checker to validate our monitoring model in terms of reachability, liveness, and safety properties. We evaluate both our automated service provider selection scheme and our cloud workflow orchestration, monitoring and adaptation schemes on a workflow-enabled Big Data application. A set of scenarios was carefully chosen to evaluate the performance of the service provider selection, workflow monitoring and adaptation schemes we have implemented. The results demonstrate that our service selection outperforms other selection strategies and ensures trustworthy service provider selection. The results of evaluating automated workflow orchestration further show that our model is self-adapting and self-configuring, reacts efficiently to changes, and adapts accordingly while enforcing the QoS of workflows.
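The three-level assessment above can be sketched as a weighted aggregation followed by a highest-score selection. The weights, provider names, and scores below are illustrative assumptions, not the thesis's actual model parameters:

```python
# Hypothetical sketch of trust-based provider selection: three normalized
# trust levels (verified capabilities, neighbour reputation, personal
# history) are combined into one weighted score per provider.

def trust_score(capabilities: float, reputation: float, history: float,
                weights=(0.4, 0.3, 0.3)) -> float:
    """Each input is a normalized score in [0, 1]; returns weighted trust."""
    levels = (capabilities, reputation, history)
    if not all(0.0 <= v <= 1.0 for v in levels):
        raise ValueError("trust inputs must be normalized to [0, 1]")
    return sum(w * v for w, v in zip(weights, levels))

def select_provider(providers: dict) -> str:
    """Pick the provider with the highest aggregate trust score."""
    return max(providers, key=lambda p: trust_score(*providers[p]))

providers = {
    "cloudA": (0.9, 0.7, 0.8),   # strong capabilities, decent record
    "cloudB": (0.6, 0.9, 0.9),   # weaker capabilities, great reputation
}
best = select_provider(providers)
```

A dynamic trust model would additionally update the reputation and history inputs at runtime as monitoring evidence accumulates, which is what makes the selection adaptive rather than a one-off ranking.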

    Deep Learning Techniques for Mobility Prediction and Management in Mobile Networks

    Trajectory prediction is an important research topic in modern mobile networks (e.g., 5G and beyond 5G) to enhance the network quality of service by accurately predicting the future locations of mobile users, such as pedestrians and vehicles, based on their past mobility patterns. A trajectory is defined as the sequence of locations the user visits over time. The primary objective of this thesis is to improve the modeling of mobility data and establish personalized, scalable, collective-intelligent, distributed, and strategic trajectory prediction techniques that can effectively adapt to the dynamics of urban environments in order to facilitate the optimal delivery of mobility-aware network services. Our proposed approaches aim to increase the accuracy of trajectory prediction while minimizing communication and computational costs, leading to more efficient mobile networks. The thesis begins by introducing a personalized trajectory prediction technique using deep learning and reinforcement learning. It adapts the neural network architecture to capture the distinct characteristics of mobile users’ data. Furthermore, it introduces advanced anticipatory handover management and dynamic service migration techniques that optimize network management using our high-performance trajectory predictor. This approach ensures seamless connectivity and proactively migrates network services, enhancing the quality of service in dense wireless networks. The second contribution of the thesis introduces cluster-level prediction to extend the reinforcement learning-based trajectory prediction, addressing scalability challenges in large-scale networks. Cluster-level trajectory prediction leverages users’ similarities within clusters to train only a few representatives. This enables efficient transfer learning of pre-trained mobility models and reduces computational overhead, enhancing network scalability.
The third contribution proposes a collaborative social-aware multi-agent trajectory prediction technique that accounts for the interactions between multiple intra-cluster agents in a dynamic urban environment, increasing the prediction accuracy while decreasing the algorithm complexity and computational resource usage. The fourth contribution proposes a federated learning-driven multi-agent trajectory prediction technique that leverages the collaborative power of multiple local data sources in a decentralized manner to enhance user privacy and improve the accuracy of trajectory prediction while jointly minimizing computational and communication costs. The fifth contribution proposes a game-theoretic non-cooperative multi-agent prediction technique that considers the strategic behaviors among competitive inter-cluster mobile users. The proposed approaches are evaluated on small-scale and large-scale location-based mobility datasets, where locations can be GPS coordinates or cellular base station IDs. Our experiments demonstrate that our proposed approaches outperform state-of-the-art trajectory prediction methods, making significant contributions to the field of mobile networks.
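Since a trajectory is just a time-ordered sequence of visited locations, the prediction task can be illustrated with a simple first-order Markov baseline; this is a didactic stand-in, not the thesis's deep- or reinforcement-learning predictors, and the base-station IDs are made up:

```python
# Toy next-location predictor: count observed transitions between
# locations and predict the most frequent successor of the current one.

from collections import Counter, defaultdict

class MarkovPredictor:
    def __init__(self):
        self.transitions = defaultdict(Counter)  # loc -> Counter of next locs

    def fit(self, trajectories):
        """Each trajectory is a time-ordered list of location IDs."""
        for traj in trajectories:
            for cur, nxt in zip(traj, traj[1:]):
                self.transitions[cur][nxt] += 1

    def predict(self, current):
        """Most frequently observed next location, or None if unseen."""
        nxt = self.transitions.get(current)
        return nxt.most_common(1)[0][0] if nxt else None

# Locations here are cellular base-station IDs, one of the two location
# types the evaluation datasets use (the other being GPS coordinates).
model = MarkovPredictor()
model.fit([["bs1", "bs2", "bs3"],
           ["bs1", "bs2", "bs4"],
           ["bs1", "bs2", "bs3"]])
assert model.predict("bs2") == "bs3"   # bs3 observed twice, bs4 once
```

The deep-learning techniques in the thesis replace these raw counts with learned sequence models, but the input/output contract is the same: past location sequence in, most likely next location out.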

    HyperCell: A Bio-inspired Design Framework for Real-time Interactive Architectures

    This pioneering research focuses on Biomimetic Interactive Architecture, using “Computation”, “Embodiment”, and “Biology” to generate an intimate embodied convergence and to propose a novel rule-based design framework for creating organic architectures composed of swarm-based intelligent components. Furthermore, the research boldly claims that Interactive Architecture should emerge as the next truly Organic Architecture. As the world and society are dynamically changing, especially in this digital era, the research dares to challenge the Utilitas, Firmitas, and Venustas of the traditional architectural Weltanschauung, and rejects them by adopting the novel notion that architecture should be dynamic, fluid, and interactive. This project reflects a trajectory from the 1960s, with the advent of the avant-garde architectural design group Archigram and its numerous intriguing and pioneering visionary projects. Archigram’s non-standard, mobile, and interactive projects profoundly influenced a new generation of architects to explore the connection between technology and their architectural projects. This research continues this trend of exploring novel design thinking and the framework of Interactive Architecture by discovering the interrelationship amongst three major topics: “Computation”, “Embodiment”, and “Biology”. The project aims to elucidate pioneering research combining these three topics in one discourse: “Bio-inspired digital architectural design”. These three major topics are introduced in this Summary. “Computation” is any type of calculation that includes both arithmetical and non-arithmetical steps and follows a well-defined model, such as an algorithm. In this research, however, it refers to the use of data storage, parametric design applications, and physical computing for developing informed architectural designs.
“Form” has always been the most critical focus in architectural design, and this focus has also been a major driver behind the application of computational design in Architecture. Nonetheless, this research will interpret the term “Form” in architecture as a continual “information processor” rather than the result of information processing. In other words, “Form” should not be perceived only as an expressive, appearance-based computational outcome but rather as a real-time process of information processing, akin to organic “Formation”. Architecture should embody the kinetic ability to adjust or change its shape, processing its surroundings and providing feedback of its own free will, with the inherent interactive, intelligent movement of a living body. Additionally, it is crucial to ask whether computational technologies are being properly harnessed if they are used only for form-generating purposes in architectural design, or whether this use should be replaced with real-time information communication and control systems to produce interactive architectures with embodied computation abilities. “Embodiment” in the context of this research is embedded in Umberto Eco’s vision of Semiotics, the theories underlying media studies in Marshall McLuhan’s “Body Extension” (McLuhan, 1964), the contemporary philosophical thought of the “Body Without Organs” (Gilles Deleuze and Félix Guattari, 1983), the computational logic of “Swarm Behavior”, and the philosophical notion of “Monadology” proposed by Gottfried Leibniz (Leibniz, 1714). Embodied computation and design are predominant today within the wearable computing and smart living domains, which combine the virtual and real worlds. Technical progress and prowess in VR development also contribute to advancing 3D smart architectural design and display solutions. The proposed “organic body-like architectural spaces” emphasize the realization of a body-like interactive space.
Developing Interactive Architecture will imply eliciting the collective intelligence prevalent in nature and the virtual world of Big Data. Interactive Architecture shall thus embody integrated information exchange protocols and decision-making systems in order to possess organic body-like qualities. “Biology”, in this research, explores biomimetic principles intended to create purpose-driven kinetic and organic architecture. This involves a detailed study and critique of organic architecture, the generation of organic shapes, digital fabrication techniques based on performance optimization, and kinetic systems. A holistic bio-inspired architecture embodies multiple performance criteria akin to natural systems, which integrate structural and infrastructural performance throughout the growth of an organic body. Such a natural morphogenesis process of architectural design explores what Janine M. Benyus described as “learning the natural process”. Profoundly influenced by the processes behind morphogenesis, the research further explores Evolutionary Developmental Biology (Evo-Devo), which explains how embryological regulation strongly affects the resulting formations. Evo-Devo in interactive architecture implies the development of architecture based on three fundamental principles: “Simple to Complex”, “Geometric Information Distribution”, and “On/Off Switch and Trigger”. The research seeks to create a relatively intelligent architectural body and a tactile interactive spatial environment by applying the knowledge extracted from the study of the aforementioned Evo-Devo principles in the following fashion: A. Extract a self-similar componential-system-based approach from the “Simple to Complex” principle of Evo-Devo; B. Extract the idea of “Collective Intelligence” from the “Geometric Information Distribution” principle of Evo-Devo; C.
Extract the principle of “Assembly Regulation” from the “On/Off Switch and Trigger” principle of Evo-Devo. Through an elaborate investigation of the three aforementioned topics, the “HyperCell” research develops a design framework for real-time adaptive spatial systems. HyperCell does this by developing a system of transformable cubic elements that can self-organize, adapt, and interact in real time. These HyperCells comprise an organic space that can adjust itself in relation to our human bodies. The furniture system is literally reified and embodied to develop an intra-active space that proactively provokes human movement. The space thus acquires an emotive dimension and can become your pet, partner, or even friend, and might also support multiple uses of the same space. The research and its progression were also actively connected with a five-year collaborative European Culture project, “MetaBody”. The research thus explores Interactive Architecture from the following perspectives: architectural design, the trajectory of digital architectural history, computational technology, philosophical discourse related to embodiment, media and digital culture, current VR and body-related technology, and Evolutionary Developmental Biology. “HyperCell” will encourage young architects to pursue interdisciplinary design initiatives via the fusion of computational design, embodiment, and biology for developing bio-inspired organic architectures.

    Ubiquitous Robotics System for Knowledge-based Auto-configuration System for Service Delivery within Smart Home Environments

    The future smart home will be enhanced and driven by the recent advance of the Internet of Things (IoT), which advocates the integration of computational devices within an Internet architecture on a global scale [1, 2]. In the IoT paradigm, the smart home will be developed by interconnecting a plethora of smart objects both inside and outside the home environment [3-5]. The recent take-up of these connected devices within home environments is slowly and surely transforming traditional home living environments. Such connected and integrated home environments lead to the concept of the smart home, which has attracted significant research efforts to enhance the functionality of home environments with a wide range of novel services. The wide availability of services and devices within contemporary smart home environments makes their management a challenging and rewarding task. The trend whereby the development of smart home services is decoupled from that of smart home devices increases the complexity of this task. As such, it is desirable that smart home services are developed and deployed independently, rather than pre-bundled with specific devices, although it must be recognised that this is not always practical. Moreover, systems need to facilitate the deployment process and cope with any changes in the target environment after deployment. Maintaining complex smart home systems throughout their lifecycle entails considerable resources and effort. These challenges have stimulated the need for dynamic auto-configurable services amongst such distributed systems. Although significant research has been directed towards achieving auto-configuration, none of the existing solutions is sufficient to achieve auto-configuration within smart home environments. All such solutions are considered incomplete, as they lack the ability to meet all smart home requirements efficiently.
These requirements include the ability to adapt flexibly to new and dynamic home environments without direct user intervention. Fulfilling these requirements would enhance the performance of smart home systems and help to address cost-effectiveness, considering the financial implications of the manual configuration of smart home environments. Current configuration approaches fail to meet one or more of the requirements of smart homes. Where an approach meets the flexibility criterion, the configuration either cannot be executed online without affecting the system or requires direct user intervention. In other words, there is no adequate solution that allows smart home systems to adapt dynamically to changing circumstances and hence to enable the correct interconnections among their components without direct user intervention or interruption of the whole system. Therefore, it is necessary to develop an efficient, adaptive, agile and flexible system that adapts dynamically to each new requirement of the smart home environment. This research aims to devise methods to automate the activities associated with customised service delivery for dynamic home environments by exploiting recent advances in the field of ubiquitous robotics and Semantic Web technologies. It introduces a novel approach called the Knowledge-based Auto-configuration Software Robot (Sobot) for Smart Home Environments, which utilises the Sobot to achieve auto-configuration of the system. The research work was conducted under the Distributed Integrated Care Services and Systems (iCARE) project, which was designed to accomplish and deliver integrated distributed ecosystems with a homecare focus. The auto-configuration Sobot, which is the focus of this thesis, is a key component of the iCARE project. It will become one of the key enabling technologies for generic smart home environments, and it has a profound impact on designing and implementing a high-quality system.
Its main role is to generate a feasible configuration that meets the given requirements using the knowledgebase of the smart home environment as a core component. The knowledgebase plays a pivotal role in helping the Sobot automatically select the most appropriate resources in a given context-aware system via semantic searching and matching. Ontology, as a knowledgebase representation technique, helps to model a specific domain. It is also a key technology for the Semantic Web: it enables a common understanding amongst software agents and people, clarifies the domain assumptions, and facilitates the reuse and analysis of domain knowledge. The main advantage of the Sobot over traditional applications is its awareness of the changing digital and physical environments and its ability to interpret these changes, extract the relevant contextual data and merge any new information or knowledge. The Sobot is capable of creating new or alternative feasible configurations to meet the system’s goal by utilising inferred facts based on the smart home ontological model, so that the system can adapt to the changed environment. Furthermore, the Sobot has the capability to execute the generated reconfiguration plan without interrupting the running of the system. A proof-of-concept testbed has been designed and implemented. The case studies carried out have shown the potential of the proposed approach to achieve flexible and reliable auto-configuration of the smart home system, with promising directions for future research.
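The core matching step described above can be sketched as mapping each required capability to a device that provides it. All names here (devices, capabilities, function) are illustrative assumptions, not identifiers from the iCARE codebase, and a real Sobot would perform this search over an ontological model rather than a plain dictionary:

```python
# Toy sketch of configuration generation: match a service's required
# capabilities against a device-capability knowledgebase.

def feasible_configuration(service_needs, devices):
    """Map each required capability to a device that provides it.

    Returns None when no complete configuration exists, mirroring the
    Sobot's need to search for an alternative plan in that case."""
    plan = {}
    for need in service_needs:
        match = next((d for d, caps in devices.items() if need in caps), None)
        if match is None:
            return None  # no feasible configuration in this environment
        plan[need] = match
    return plan

devices = {
    "hue_lamp": {"lighting"},
    "nest_cam": {"video", "motion_sensing"},
    "echo_dot": {"audio", "speech"},
}
plan = feasible_configuration({"lighting", "motion_sensing"}, devices)
assert plan == {"lighting": "hue_lamp", "motion_sensing": "nest_cam"}
```

Semantic matching via an ontology generalizes the `need in caps` membership test: a reasoner can infer, for example, that a device offering a subclass of a required capability still satisfies the requirement.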

    Perspectives of university teaching in Costa Rica in times of digital media

    Perspectives of university teaching in Costa Rica in times of digital media examines an educational approach to understand the space of learning that takes place in higher education. For that, a selection of viewpoints on digital media and university teaching are discussed in the light of a tradition: the Journeyman Years. The key research question is: what is a space of learning in higher education from the students' and professors' perspectives at the Universidad de Costa Rica? Pertinent to this topic, other sub-questions are: what kind of spaces of learning are being offered at the Universidad de Costa Rica? How can the space of learning at a university be reconsidered? Chapter Two introduces the Wanderjahre (Journeyman Years) story, a leading metaphor for this manuscript, where an approach to learning in terms of space is presented. Chapter Three examines two different knowledge approaches: first, mechanistic thinking is highlighted in relation to digital media. Humans learn of natural phenomena through rational means, seeking to demystify and unveil a true world. Second, romantic thinking is featured in relation to higher education. Individuals learn about the world by engaging in practice while being social, experiencing directly the world in continuous change. Chapter Four presents an interpretation of the previous theoretical perspectives. After a selection of reviewed concepts, Learning by Wandering is proposed, a structure to analyze the construction of the space of learning in higher education. Chapter Five describes an ethnographic case study of the space of learning at the Universidad de Costa Rica, where 150 students and eight university teachers across different contexts are studied. Chapter Six features the major relevant findings of the thesis for analyzing university teaching in terms of space. In this chapter, a list of recommendations for the Universidad de Costa Rica is offered, in order to foster higher education in terms of space.