
    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when it is properly developed, including technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect or discipline and lack a holistic view of the entire process. A more holistic, multi-disciplinary, in-depth, academic- and industry-oriented review is therefore required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications, and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. For each of these components, we also examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies in supporting decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date, allowing users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and to identify their opportunities and potential contributions.

    Machine Learning Research Trends in Africa: A 30 Years Overview with Bibliometric Analysis Review

    In this paper, a critical bibliometric analysis is conducted, coupled with an extensive literature survey of recent developments and associated applications in machine learning research from an African perspective. The bibliometric analysis covers 2761 machine-learning-related documents, of which 98% were articles with at least 482 citations, published in 903 journals during the past 30 years. The documents were retrieved from the Science Citation Index EXPANDED and comprise research publications from 54 African countries between 1993 and 2021. The study visualizes the current landscape and future trends of machine learning research and its applications, so as to facilitate future collaborative research and knowledge exchange among authors from research institutions scattered across the African continent.

    Visualisation of Fundamental Movement Skills (FMS): An Iterative Process Using an Overarm Throw

    Fundamental Movement Skills (FMS) are precursor gross motor skills to more complex or specialised skills and are recognised as important indicators of physical competence, a key component of physical literacy. FMS are predominantly assessed using pre-defined manual methodologies, most commonly the various iterations of the Test of Gross Motor Development; however, such assessments are time-consuming and often require a minimum level of training to conduct. The overall aim of this thesis was therefore to use accelerometry to develop a visualisation concept, as part of a feasibility study, to support the learning and assessment of FMS by reducing subjectivity and the overall time taken to conduct a gross motor skill assessment. The overarm throw, an important fundamental movement skill, was selected for the visualisation development because it is an acyclic movement with a distinct initiation and conclusion. Thirteen children (14.8 ± 0.3 years; 9 boys) wore an ActiGraph GT9X Link Inertial Measurement Unit on the dominant wrist whilst performing a series of overarm throws. The thesis illustrates how the visualisation concept was developed from raw accelerometer data, processed and manipulated in MATLAB 2019b to obtain and depict key throw performance data, including the trajectory and velocity of the wrist during the throw. Overall, the developed visualisation concept can provide strong indicators of throw competency based on the shape of the throw trajectory. Future research should use a larger, more diverse population and incorporate machine learning, and further work is required to translate this concept to other gross motor skills.
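
    The trajectory and velocity data described above come from double integration of the wrist acceleration signal. A minimal sketch of that step follows, assuming gravity-compensated acceleration samples and a hypothetical 100 Hz sampling rate; Python stands in here for the MATLAB used in the thesis:

    ```python
    import numpy as np

    def throw_kinematics(acc, fs=100.0):
        """Estimate wrist velocity and trajectory for one throw by numerically
        integrating gravity-compensated acceleration (shape [n, 3], in m/s^2).
        fs is the sampling rate in Hz; 100 Hz is an assumed value, not taken
        from the thesis."""
        dt = 1.0 / fs
        # Trapezoidal cumulative integration: acceleration -> velocity.
        vel = np.vstack([np.zeros(3),
                         np.cumsum((acc[:-1] + acc[1:]) * 0.5 * dt, axis=0)])
        # Velocity -> position, assuming the wrist starts at rest at the origin.
        pos = np.vstack([np.zeros(3),
                         np.cumsum((vel[:-1] + vel[1:]) * 0.5 * dt, axis=0)])
        speed = np.linalg.norm(vel, axis=1)  # scalar wrist speed per sample
        return vel, pos, speed
    ```

    Double integration accumulates drift, which is one reason the acyclic overarm throw, with its distinct initiation and conclusion, is a convenient skill to segment and visualise.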

    A Model for Automated Support for the Recognition, Extraction, Customisation, and Reconstruction of Static Charts

    Data charts are widely used in our daily lives, appearing in regular media such as newspapers, magazines, web pages, and books. A well-constructed data chart leads to an intuitive understanding of its underlying data; likewise, when a chart embodies poor design choices, a redesign of the representation may be needed. In most cases, however, these charts are available only as static images, meaning the original data are not accessible, so automatic methods could be applied to extract the underlying data from chart images and enable such changes. Recognizing charts and extracting data from them is a complex task, largely due to the variety of chart types and their visual characteristics. Computer vision techniques for image classification and object detection are widely used for chart recognition, but only on images without any disturbance; other features of real-world images that can make this task difficult, such as photographic distortion, noise, and misalignment, are absent from most works in the literature. Two computer vision techniques that can assist this task and have been little explored in this context are perspective detection and correction. These methods transform a distorted, noisy chart into a clean one, with its type identified and ready for data extraction or other uses. Reconstructing the data is straightforward: as long as the data are available, the visualization can be reconstructed; reconstructing it in the same context, however, is complex. A visualization grammar is a key component for this scenario, as such grammars usually offer extensions for interaction, chart layers, and multiple views without extra development effort. This work presents a model for automated support for the custom recognition and reconstruction of charts in images. The model automatically performs the process steps, such as reverse engineering a static chart back into its data table for later reconstruction, while allowing the user to make modifications in cases of uncertainty. The work also features a model-based architecture along with prototypes for various use cases. Validation is performed step by step, with methods inspired by the literature. Three use cases provide proof of concept and validation of the model: the first applies chart recognition methods to real-world documents, the second focuses on chart vocalization, using a visualization grammar to reconstruct a chart in audio format, and the third presents an Augmented Reality application that recognizes and reconstructs charts in the same context (a piece of paper), overlaying the new chart and interaction widgets. The results showed that, with slight changes, chart recognition and reconstruction methods are ready for real-world charts when time, accuracy, and precision are taken into consideration.
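
    The perspective detection and correction step singled out above can be sketched with standard OpenCV primitives. A minimal example, assuming the four chart corners have already been located upstream (corner detection itself, e.g. by contour approximation, is outside this sketch):

    ```python
    import cv2
    import numpy as np

    def rectify_chart(image, corners, out_w=800, out_h=600):
        """Warp a photographed chart to a fronto-parallel view.
        `corners` lists the chart's corners in the photo, ordered top-left,
        top-right, bottom-right, bottom-left; detecting them is assumed to
        happen in an upstream step."""
        src = np.array(corners, dtype=np.float32)
        dst = np.array([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]],
                       dtype=np.float32)
        H = cv2.getPerspectiveTransform(src, dst)  # 3x3 homography
        return cv2.warpPerspective(image, H, (out_w, out_h))
    ```

    The rectified image can then be passed to the chart-type classification and data-extraction stages of the model.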

    Underfill Reliability and Lifetime Estimation of Microelectronic Assemblies

    In order to protect the interconnections in flip-chip packages, an underfill material layer is used to fill the volume and provide mechanical support between the silicon chip and the substrate. Due to the chip-corner geometry and the mismatch in coefficient of thermal expansion (CTE), the underfill suffers from stress concentration at the chip corners whenever the temperature falls below the curing temperature. This stress concentration leads to subsequent mechanical failures in flip-chip packages, such as chip-underfill interfacial delamination and underfill cracking. Local stresses and strains are the most important parameters for understanding the mechanism of underfill failure. The industry therefore currently relies on the finite element method (FEM) to calculate the stress components, but FEM results may not be accurate enough compared with the actual stresses in the underfill, and the simulations require careful treatment of important geometrical details and material properties. This thesis proposes a modeling approach that can accurately estimate underfill delamination areas and crack trajectories, with the following three objectives. The first objective was to develop an experimental technique capable of measuring underfill deformations around the chip-corner region. This technique combined confocal microscopy and the digital image correlation (DIC) method to enable three-dimensional strain measurements at different temperatures, and was named the confocal-DIC technique. It was first validated by a theoretical analysis of thermal strains; in a test component similar to a flip-chip package, the strain distribution obtained by the FEM model was in good agreement with the results measured by the confocal-DIC technique, with relative errors of less than 20% at the chip corners. The second objective was to measure the strain near a crack in the underfill. Artificial cracks with lengths of 160 μm and 640 μm were fabricated from the chip corner along the 45° diagonal direction. The confocal-DIC-measured maximum hoop strains and first principal strains were located at the crack-front area for both the 160 μm and 640 μm cracks. A crack model was developed using the extended finite element method (XFEM), and the simulated strain distribution showed the same trend as the experimental results; the distribution of hoop strains was in good agreement with the measured values when the model element size was smaller than 22 μm, small enough to capture the strong strain gradient near the crack tip. The third objective was to propose a modeling approach for underfill delamination and cracking that accounts for the effects of manufacturing variables. A deep thermal cycling test was performed on 13 test cells to obtain reference chip-underfill delamination areas and crack profiles. An artificial neural network (ANN) was trained to relate the manufacturing variables to the number of cycles to first delamination of each cell, and the predicted numbers of cycles for all 6 cells in the test dataset fell within the intervals of the experimental observations. Delamination growth was modelled in FEM by evaluating the strain energy amplitude at the interface elements between the chip and the underfill; for 5 of the 6 validation cells, the delamination growth model was consistent with the experimental observations. Cracks in the bulk underfill were modelled by XFEM without predefined paths, and the directions of the edge cracks were in good agreement with the experimental observations, with an error of less than 2.5°. This approach met the thesis goal of estimating initial underfill delamination, delamination areas, and crack paths in actual industrial flip-chip assemblies.
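
    The ANN step relating manufacturing variables to cycles to first delamination can be sketched as a small regression network. The feature set, network size, and data below are illustrative assumptions; only the 13-cell split with 6 test cells comes from the abstract:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # X: one row of manufacturing variables per cell (3 hypothetical features);
    # y: observed cycles to first delamination. Placeholder data stands in for
    # the thesis measurements: 7 training cells and 6 test cells.
    X_train, y_train = rng.random((7, 3)), rng.random(7) * 1000
    X_test = rng.random((6, 3))

    model = make_pipeline(
        StandardScaler(),  # scale features before the small MLP
        MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0),
    )
    model.fit(X_train, y_train)
    predicted_cycles = model.predict(X_test)  # compare against observed intervals
    ```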

    Stakeholder Governance: Empirical and Theoretical Developments

    Stakeholder governance receives attention across many disciplines, resulting in fragmented knowledge. The inherent complexity of stakeholder governance requires the integration of this knowledge to develop comprehensive and inclusive theories that better conceptualize the phenomenon. In this research, we develop stakeholder governance through empirical and theoretical approaches. In the first essay, we use multiple case comparisons to examine empirically how and why organizations manage food waste, developing grounded theory through contextualized explanations. We contribute grounded theoretical and empirical evidence showing that food waste represents a significant business problem, and our data suggest that dimensions of logistics and stakeholder governance dictate how and why organizations manage it. These findings motivate a deeper dive into stakeholder governance, revealing fragmentation in knowledge that requires holistic, interdisciplinary review and synthesis. In the second essay, we identify definitions and terminologies, review the evolution of theories and orientations, organize mechanisms and conceptualizations, synthesize key theoretical tensions, and offer suggestions for future research to advance theoretical development of stakeholder governance. We contribute pluralist conceptual frameworks that integrate knowledge across disciplines to provide a comprehensive overview and recommendations. Overall, we contribute empirical and theoretical research to advance theory development for stakeholder governance.

    Delay measurements in live 5G cellular network

    5G networks have many important properties, including increased bandwidth, higher data throughput, high reliability, high network density, and low latency. This thesis concentrates on the low-latency attribute of the 5G Standalone (SA) and Non-Standalone (NSA) modes. One of the most critical considerations in 5G is a low-latency network for delay-sensitive applications such as remote diagnostics and surgery in healthcare, self-driving cars, industrial factory automation, and live audio production in the music industry. To meet low-latency standards, 5G therefore employs various retransmission algorithms and techniques, a new frame structure with multiple subcarrier spacings (SCS) and time slots, and a new cloud-native core. For the low-latency measurements, a test setup was built in which a video is sent from 5G User Equipment (UE) to a multimedia server deployed on the edge server of the University of Oulu 5G Test Network (5GTN), which operates in both NSA and SA modes. Delay is measured in both the downlink and the uplink direction with the Qosium tool. Clock synchronization is essential when measuring millisecond-level transmission delays, so the Precision Time Protocol daemon (PTPd) service is run on both the sending and receiving machines. The tests comply with the specifications developed at the University of Oulu 5GTN for both the SA and NSA modes. When the delay measurements were compared between the two deployment modes, the comparison proved inappropriate: in the 5GTN, NSA and SA have entirely different data routing paths and configurations, and the author did not have sufficient resources to make the required architectural changes.
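
    Qosium is a proprietary tool, but the underlying one-way delay measurement can be sketched as timestamped UDP probes, assuming both hosts are already clock-synchronized (e.g. by PTPd, as in the thesis). The port number and packet layout below are arbitrary choices for this sketch:

    ```python
    import socket
    import struct
    import time

    PORT = 5005  # arbitrary port chosen for this sketch

    def send_probe(dst_ip, seq):
        """Send one timestamped UDP probe to the receiver."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        # Network byte order: 4-byte sequence number + 8-byte send timestamp.
        sock.sendto(struct.pack("!Id", seq, time.time()), (dst_ip, PORT))
        sock.close()

    def receive_probes():
        """Report one-way delay as receive time minus embedded send time.
        Valid only because both clocks are assumed to be PTP-synchronized."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", PORT))
        while True:
            data, _ = sock.recvfrom(64)
            seq, t_sent = struct.unpack("!Id", data)
            print(f"probe {seq}: one-way delay {(time.time() - t_sent) * 1e3:.3f} ms")
    ```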

    Emergency Advanced Clinical Practitioners: Quality and Acceptability


    Developing a Decision Support System for Integrated Decision-Making in Purchasing and Scheduling under Lead Time Uncertainty

    Decision-making in supply chain management is complex because of the interdependencies between planning tasks across different stages and planning levels. Uncertainties such as unpredictable supplier lead times and supply chain disruptions further complicate decisions. Based on the case study of a company in printed circuit board assembly, a three-level concept that includes a decision support system is proposed. The company's global single-source supply network is characterized by highly variable lead times, so it maintains high inventory levels to avoid stockouts. The decision support system treats the purchasing and scheduling decision problems in an integrated way. The prototypical implementation of the purchasing algorithm uses a genetic algorithm that recommends reorder days and order quantities based on a simulation model, and it evaluates the risk of the recommended solution by calculating the probability of a stockout in each order cycle.
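
    A minimal sketch of the genetic-algorithm loop described above: chromosomes encode a reorder day and an order quantity, and fitness is returned by a simulation model that would, in the real system, replay lead-time scenarios and score stockout risk. The toy objective, population size, and operators below are assumptions, not the paper's parameters:

    ```python
    import random

    def simulate(plan):
        """Stand-in for the simulation model: scores a (reorder_day, quantity)
        plan. The real system would replay stochastic lead-time scenarios and
        combine expected cost with a stockout-probability penalty."""
        reorder_day, qty = plan
        return abs(reorder_day - 10) + abs(qty - 500) / 50  # toy objective

    def genetic_search(pop_size=30, generations=100):
        pop = [(random.randint(1, 28), random.randint(100, 1000))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=simulate)            # lower score is fitter
            survivors = pop[: pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                day, qty = a[0], b[1]         # one-point crossover
                if random.random() < 0.1:     # occasionally mutate the quantity
                    qty = max(1, qty + random.randint(-50, 50))
                children.append((day, qty))
            pop = survivors + children
        return min(pop, key=simulate)

    best_day, best_qty = genetic_search()  # recommended reorder day and quantity
    ```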