The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions
The Metaverse offers a second world beyond reality, where boundaries are
non-existent and possibilities are endless, through engagement and immersive
experiences enabled by virtual reality (VR) technology. Many disciplines can
benefit from the advancement of the Metaverse when it is properly developed,
including technology, gaming, education, art, and culture.
Nevertheless, developing the Metaverse environment to its full potential is an
ill-defined task that needs proper guidance and direction. Existing surveys on
the Metaverse focus only on a specific aspect or discipline and lack a holistic
view of the entire process. A more holistic, multi-disciplinary, in-depth
review, oriented to both academia and industry, is therefore required to
provide a thorough study of the Metaverse development pipeline. To
address these issues, we present in this survey a novel multi-layered pipeline
ecosystem composed of (1) the Metaverse computing, networking, communications
and hardware infrastructure, (2) environment digitization, and (3) user
interactions. For every layer, we discuss the components that detail the steps
of its development. Also, for each of these components, we examine the impact
of a set of enabling technologies and empowering domains (e.g., Artificial
Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on
its advancement. In addition, we explain the importance of these technologies
to support decentralization, interoperability, user experiences, interactions,
and monetization. Our study highlights the existing challenges for each
component, followed by research directions and potential solutions. To the
best of our knowledge, this survey is the most comprehensive to date, allowing
users, scholars, and entrepreneurs to gain an in-depth understanding of the
Metaverse ecosystem and to identify their opportunities for contribution.
Interview with Wolfgang Knauss
An oral history in four sessions (September 2019–January 2020) with Wolfgang Knauss, von Kármán Professor of Aeronautics and Applied Mechanics, Emeritus. Born in Germany in 1933, he speaks about his early life and experiences under the Nazi regime, his teenage years in Siegen and Heidelberg during the Allied occupation, and his move to Pasadena, California, in 1954 under the sponsorship of a local minister and his family. He enrolled in Caltech as an undergraduate in 1957, commencing a more than half-century affiliation with the Institute and GALCIT (today the Graduate Aerospace Laboratories of Caltech). He recalls the roots of his interest in aeronautics, his PhD solid mechanics studies with his advisor, M. Williams, and the GALCIT environment in the late 1950s and 1960s at the dawn of the Space Age, including the impact of Sputnik and classes with NASA astronauts. He discusses his experimental and theoretical work on materials deformation, dynamic fracture, and crack propagation, including his solid-propellant fuels research for NASA and the US Army, wide-ranging programs with the US Navy, and his pioneering micromechanics investigations and work on the time-dependent fracture of polymers in the 1990s.
He offers his perspective on GALCIT’s academic culture, its solid mechanics and fluid mechanics programs, and its evolving administrative directions over the course of five decades, as well as its impact and reputation both within and beyond Caltech. He describes his work with Caltech’s undergraduate admissions committee and his scientific collaborations with numerous graduate students and postdocs and shares his recollections of GALCIT and other Caltech colleagues, including C. Babcock, D. Coles, R.P. Feynman, Y.C. Fung, G. Neugebauer, G. Housner, D. Hudson, H. Liepmann, A. Klein, G. Ravichandran, A. Rosakis, A. Roshko, and E. Sechler.
Six appendices contributed by Dr. Knauss, offering further insight into his life and career, also form part of this oral history and are cross-referenced in the main text.
Defining Service Level Agreements in Serverless Computing
The emergence of serverless computing has brought significant advancements to the delivery of computing resources to cloud users. With the abstraction of infrastructure, ecosystem, and execution environments, users can focus on their code while relying on the cloud provider to manage the abstracted layers. In addition, desirable features such as autoscaling and high availability became a provider's responsibility and can be adopted by the user's application at no extra overhead.
Despite such advancements, significant challenges must be overcome as applications transition from monolithic stand-alone deployments to the ephemeral and stateless microservice model of serverless computing. These challenges pertain to the uniqueness of the conceptual and implementation models of serverless computing. One of the notable challenges is the complexity of defining Service Level Agreements (SLA) for serverless functions. As the serverless model shifts the administration of resources, ecosystem, and execution layers to the provider, users become mere consumers of the provider’s abstracted platform with no insight into its performance. Suboptimal conditions of the abstracted layers are not visible to the end-user who has no means to assess their performance. Thus, SLA in serverless computing must take into consideration the unique abstraction of its model.
This work investigates the Service Level Agreement (SLA) modeling of serverless functions' and serverless chains' executions. We highlight how serverless SLA fundamentally differs from earlier cloud delivery models. We then propose an approach to define SLA for serverless functions by utilizing resource utilization fingerprints for functions' executions and a method to assess whether executions adhere to that SLA. We evaluate the approach's accuracy in detecting SLA violations for a broad range of serverless application categories. Our validation results illustrate a high accuracy in detecting SLA violations resulting from resource contentions and provider's ecosystem degradations. We conclude by presenting the empirical validation of our proposed approach, which could detect Execution-SLA violations with accuracy up to 99%.
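The abstract does not spell out the fingerprinting mechanics; the underlying idea of comparing an execution's resource profile against a baseline fingerprint can be sketched as follows. This is a minimal illustration, assuming per-metric mean/standard-deviation baselines; all function and metric names are hypothetical, not taken from the thesis:

```python
import statistics

def build_fingerprint(baseline_runs):
    """Summarize a function's baseline behavior as per-metric (mean, stdev).

    baseline_runs: list of dicts of metrics per execution,
    e.g. {"duration_ms": 101, "cpu_pct": 50}.
    """
    metrics = baseline_runs[0].keys()
    return {
        m: (statistics.mean(r[m] for r in baseline_runs),
            statistics.stdev(r[m] for r in baseline_runs))
        for m in metrics
    }

def violates_sla(execution, fingerprint, k=3.0):
    """Flag an execution whose metrics drift beyond k standard deviations
    from the fingerprinted baseline."""
    for metric, (mean, stdev) in fingerprint.items():
        if abs(execution[metric] - mean) > k * stdev:
            return True
    return False
```

A consumer could fingerprint a function over a set of warm, uncontended executions, then flag later executions whose profile drifts, attributing the drift to the provider's abstracted layers rather than to the user's code.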
Cost-effective non-destructive testing of biomedical components fabricated using additive manufacturing
Biocompatible titanium-alloys can be used to fabricate patient-specific medical components using additive manufacturing (AM). These novel components have the potential to improve clinical outcomes in various medical scenarios. However, AM introduces stability and repeatability concerns, which are potential roadblocks for its widespread use in the medical sector. Micro-CT imaging for non-destructive testing (NDT) is an effective solution for post-manufacturing quality control of these components. Unfortunately, current micro-CT NDT scanners require expensive infrastructure and hardware, which translates into prohibitively expensive routine NDT. Furthermore, the limited dynamic-range of these scanners can cause severe image artifacts that may compromise the diagnostic value of the non-destructive test. Finally, the cone-beam geometry of these scanners makes them susceptible to the adverse effects of scattered radiation, which is another source of artifacts in micro-CT imaging.
In this work, we describe the design, fabrication, and implementation of a dedicated, cost-effective micro-CT scanner for NDT of AM-fabricated biomedical components. Our scanner reduces the limitations of costly image-based NDT by optimizing the scanner's geometry and the image acquisition hardware (i.e., X-ray source and detector). Additionally, we describe two novel techniques to reduce image artifacts caused by photon-starvation and scatter radiation in cone-beam micro-CT imaging.
Our cost-effective scanner was designed to match the image requirements of medium-size titanium-alloy medical components. We optimized the image acquisition hardware by using an 80 kVp low-cost portable X-ray unit and developing a low-cost lens-coupled X-ray detector. Image artifacts caused by photon-starvation were reduced by implementing dual-exposure high-dynamic-range radiography. For scatter mitigation, we describe the design, manufacturing, and testing of a large-area, highly-focused, two-dimensional, anti-scatter grid.
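The core of dual-exposure high-dynamic-range radiography is to merge a short and a long exposure so that saturated pixels in the long exposure are replaced by scaled values from the short one. A minimal sketch of that merge, assuming a linear detector response and normalized images; the function and parameter names are illustrative, not from the thesis:

```python
import numpy as np

def hdr_combine(low_exposure, high_exposure, exposure_ratio, saturation=0.95):
    """Merge two radiographs into one high-dynamic-range image.

    low_exposure, high_exposure: detector images normalized to [0, 1].
    exposure_ratio: ratio of high to low exposure time; with a linear
    detector, scaling the short exposure by this ratio puts both images
    on the same intensity scale.
    """
    # Use the long exposure where it is below saturation,
    # and the rescaled short exposure where it has saturated.
    saturated = high_exposure >= saturation
    return np.where(saturated, low_exposure * exposure_ratio, high_exposure)
```

The scaled short-exposure values can exceed 1.0; that extended range is precisely what recovers detail behind thick, attenuating sections of the part.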
Our results demonstrate that cost-effective NDT using low-cost equipment is feasible for medium-sized, titanium-alloy, AM-fabricated medical components. Our proposed high-dynamic-range strategy improved by 37% the penetration capabilities of an 80 kVp micro-CT imaging system for a total X-ray path length of 19.8 mm. Finally, our novel anti-scatter grid provided a 65% improvement in CT number accuracy and a 48% improvement in low-contrast visualization. Our proposed cost-effective scanner and artifact reduction strategies have the potential to improve patient care by accelerating the widespread use of patient-specific, bio-compatible, AM-manufactured, medical components.
Sediment transport of sand mixtures in coastal environments
Sediment dynamics is a complex subject, playing an important role in coastal areas. The interaction between wave action and sediment particles is determinant to understanding sediment transport. Knowledge of sediment transport in sand mixtures is relevant because the coastal zone usually presents large heterogeneities of sediment particle sizes in the horizontal and vertical directions, which denote the existence of selective transport processes.
The goal of the present study was to understand selective sand transport mechanisms associated with wave-dominated conditions. To achieve this objective, the work was supported by several approaches that give insight into the processes associated with heterometric sediment transport. The first approach consisted of a literature review to establish what is known and what knowledge is lacking on the subject. The second approach consisted of a set of experiments with fluorescent sand tracers in natural conditions, in the field, and in a controlled environment, in the laboratory. The experiment conducted at Patos beach, Spain, aimed to observe the behavior of the native sand in the natural environment. The one performed at a large wave flume (Großer Wellenkanal, GWK) in Hannover considered distinct sediment mixtures under two wave conditions. In both, measurements of the total and fractional transport were made. Finally, a quasi-steady and a semi-unsteady model were used to calculate the net sediment transport of a large data set, for uniform and graded sand, allowing us to identify limitations of the models and to propose new methodologies that yield more adequate results.
The experimental results obtained in the field allowed us to observe the tracer transport towards the beach, in the wave direction, and to characterize the transport of the different tracer fractions in terms of the hydrodynamic processes. The laboratory experiment allowed us to verify the interaction between the sand fractions, with the finer (coarser) sediment decreasing (increasing) its transport as the percentage of coarse sand in the bed mixture increases.
The validation of the models was conducted and their performance was improved with the introduction of parameters related to surface wave streaming effects and bedforms.
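As an illustration of what a quasi-steady transport calculation looks like, the classical Meyer-Peter and Müller (1948) bedload formula relates a dimensionless transport rate to the excess Shields parameter. This is a textbook example of the model class, not necessarily the specific formulation used in this work:

```python
import math

def mpm_bedload(theta, d50, s=2.65, g=9.81, theta_cr=0.047):
    """Meyer-Peter & Mueller bedload transport rate in m^2/s.

    theta: Shields parameter (dimensionless bed shear stress)
    d50: median grain diameter in meters
    s: sediment specific gravity (quartz ~2.65)
    theta_cr: critical Shields parameter for initiation of motion
    """
    # No transport below the threshold of motion.
    excess = max(theta - theta_cr, 0.0)
    # Dimensionless transport rate, then rescaled to volumetric units.
    q_star = 8.0 * excess ** 1.5
    return q_star * math.sqrt((s - 1.0) * g * d50 ** 3)
```

In a quasi-steady approach, a formula of this kind is evaluated at each instant of the intra-wave velocity record and the results are averaged to obtain the net transport; graded-sand extensions apply it per size fraction with hiding/exposure corrections.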
How to Be a God
When it comes to questions concerning the nature of Reality, Philosophers and Theologians have the answers.
Philosophers have the answers that can’t be proven right. Theologians have the answers that can’t be proven wrong.
Today’s designers of Massively-Multiplayer Online Role-Playing Games create realities for a living. They can’t spend centuries mulling over the issues: they have to face them head-on. Their practical experiences can indicate which theoretical proposals actually work in practice.
That’s today’s designers. Tomorrow’s will have a whole new set of questions to answer.
The designers of virtual worlds are the literal gods of those realities. Suppose Artificial Intelligence comes through and allows us to create non-player characters as smart as us. What are our responsibilities as gods? How should we, as gods, conduct ourselves?
How should we be gods?
Exploring Blockchain Adoption Supply Chains: Opportunities and Challenges
Acquisition Research Program Sponsored Report Series. In modern supply chains, acquisition often occurs with the involvement of a network of organizations. The resilience, efficiency, and effectiveness of supply networks are crucial for the viability of acquisition. Disruptions in the supply chain require adequate communication infrastructure to ensure resilience. However, supply networks do not have a shared information technology infrastructure that ensures effective communication. Therefore, decision-makers seek new methodologies for supply chain management resilience. Blockchain technology offers new decentralization and service delegation methods that can transform supply chains and result in a more flexible, efficient, and effective supply chain. This report presents a framework for the application of Blockchain technology in supply chain management to improve resilience. In the first part of this study, we discuss the limitations and challenges of the supply chain system that can be addressed by integrating Blockchain technology. In the second part, the report provides a comprehensive Blockchain-based supply chain network management framework. The application of the proposed framework is demonstrated using modeling and simulation. The differences in the simulation scenarios can provide guidance for decision-makers who consider using the developed framework during the acquisition process. Approved for public release; distribution is unlimited.
Industry 4.0: product digital twins for remanufacturing decision-making
Currently there is a desire to reduce natural resource consumption and expand circular business principles, whilst Industry 4.0 (I4.0) is regarded as the evolutionary and potentially disruptive movement of technology, automation, digitalisation, and data manipulation into the industrial sector. The remanufacturing industry is recognised as vital to the circular economy (CE) as it extends the in-use life of products, but its synergy with I4.0 has had little attention thus far. This thesis documents the first investigation into I4.0 in remanufacturing for a CE, contributing the design and demonstration of a model that optimises remanufacturing planning using data from different stages of a product's life cycle.
The initial aim of this work was to identify the I4.0 technology that would enhance stability in remanufacturing with a view to reducing resource consumption. As the project progressed, it narrowed to focus on the development of a product digital twin (DT) model to support data-driven decision making for operations planning. The model's architecture was derived using a bottom-up approach in which requirements were extracted from the identified complications in production planning and control that differentiate remanufacturing from manufacturing. Simultaneously, the benefits of enabling visibility of an asset's through-life health were obtained using a DT as the modus operandi. A product simulator and DT prototype were designed using Internet of Things (IoT) components, a neural network for remaining-life estimations, and a search algorithm for operational-planning optimisation. The DT was iteratively developed using case studies to validate and examine the real opportunities that exist in deploying a business model that harnesses, and commodifies, early-life product data for end-of-life processing optimisation. Findings suggest that, using intelligent programming networks and algorithms, a DT can enhance decision-making if it has visibility of the product and access to reliable remanufacturing process information; existing IoT components provide rudimentary “smart” capabilities, but their integration is complex, and the durability of the systems over extended product life cycles needs to be further explored.
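The pairing of remaining-life estimation with operational-planning optimisation can be caricatured as a simple per-component decision rule. The sketch below is a hypothetical illustration only: the thesis uses a neural network and a search algorithm, which this toy replaces with fixed remaining-useful-life (RUL) estimates and a direct cost comparison; all names are invented:

```python
def plan_remanufacturing(components, cost_new, cost_reman, rul_threshold):
    """Decide remanufacture vs. replace for each returned component.

    components: list of (name, estimated_rul_cycles) pairs, where the RUL
    estimate would come from a through-life health model (e.g. a neural
    network fed with early-life sensor data).
    cost_new / cost_reman: per-component cost of a new part vs. remanufacture.
    rul_threshold: minimum acceptable remaining life after remanufacture.
    """
    plan = {}
    for name, rul in components:
        # Remanufacture only if the part has enough life left
        # and doing so is cheaper than buying new.
        if rul >= rul_threshold and cost_reman[name] < cost_new[name]:
            plan[name] = "remanufacture"
        else:
            plan[name] = "replace"
    return plan
```

A real planner would search over sequences of disassembly and reprocessing operations rather than make independent per-part choices, but the data dependency is the same: the decision quality is bounded by the quality of the through-life visibility the DT provides.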