The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions
The Metaverse offers a second world beyond reality, where boundaries are
non-existent and possibilities are endless, through engagement and immersive
experiences using virtual reality (VR) technology. Many disciplines can
benefit from the advancement of the Metaverse when accurately developed,
including the fields of technology, gaming, education, art, and culture.
Nevertheless, developing the Metaverse environment to its full potential is an
ambiguous task that needs proper guidance and directions. Existing surveys on
the Metaverse focus only on a specific aspect and discipline of the Metaverse
and lack a holistic view of the entire process. To this end, a more holistic,
multi-disciplinary, in-depth review, oriented to both academia and industry, is
required to provide a thorough study of the Metaverse development pipeline. To
address these issues, we present in this survey a novel multi-layered pipeline
ecosystem composed of (1) the Metaverse computing, networking, communications
and hardware infrastructure, (2) environment digitization, and (3) user
interactions. For every layer, we discuss the components that detail the steps
of its development. Also, for each of these components, we examine the impact
of a set of enabling technologies and empowering domains (e.g., Artificial
Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on
its advancement. In addition, we explain the importance of these technologies
to support decentralization, interoperability, user experiences, interactions,
and monetization. Our presented study highlights the existing challenges for
each component, followed by research directions and potential solutions. To the
best of our knowledge, this survey is the most comprehensive and allows users,
scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse
ecosystem and identify their opportunities for contribution.
Testing the nomological network for the Personal Engagement Model
The study of employee engagement has been a key focus of management for over three decades. The academic literature on engagement has generated multiple definitions, but there are two primary models of engagement: the Personal Engagement Model (PEM) of Kahn (1990) and the Work Engagement Model (WEM) of Schaufeli et al. (2002). While the former is cited by most authors as the seminal work on engagement, research has tended to focus on elements of the model, and most theoretical work on engagement has predominantly used the WEM to consider the topic.
The purpose of this study was to test all the elements of the nomological network of the PEM to determine whether the complete model of personal engagement is viable. This was done using data from a large, complex public sector workforce. Survey questions were designed to test each element of the PEM and administered to a sample of the workforce (n = 3,103). The scales were tested and refined using confirmatory factor analysis, and the model was then tested to determine the structure of the nomological network. This was validated, and the generalisability of the final model was tested across different work and organisational types.
The results showed that the PEM is viable but there were differences from what was originally proposed by Kahn (1990). Specifically, of the three psychological conditions deemed necessary for engagement to occur, meaningfulness, safety, and availability, only meaningfulness was found to contribute to employee engagement. The model demonstrated that employees experience meaningfulness through both the nature of the work that they do and the organisation within which they do their work. Finally, the findings were replicated across employees in different work types and different organisational types.
This thesis makes five contributions to the engagement paradigm. It advances engagement theory by testing the PEM and showing that it is an adequate representation of engagement. A model for testing the causal mechanism for engagement has been articulated, demonstrating that meaningfulness in work is a primary mechanism for engagement. The research has shown the key aspects of the workplace in which employees experience meaningfulness: the nature of the work that they do and the organisation within which they do it. It has demonstrated that this is consistent across organisations and types of work. Finally, it has developed a reliable measure of the different elements of the PEM, which will support future research in this area.
Self-Supervised Learning to Prove Equivalence Between Straight-Line Programs via Rewrite Rules
We target the problem of automatically synthesizing proofs of semantic
equivalence between two programs made of sequences of statements. We represent
programs using abstract syntax trees (AST), where a given set of
semantics-preserving rewrite rules can be applied on a specific AST pattern to
generate a transformed and semantically equivalent program. In our system, two
programs are equivalent if there exists a sequence of application of these
rewrite rules that leads to rewriting one program into the other. We propose a
neural network architecture based on a transformer model to generate proofs of
equivalence between program pairs. The system outputs a sequence of rewrites,
and the validity of the sequence is simply checked by verifying it can be
applied. If no valid sequence is produced by the neural network, the system
reports the programs as non-equivalent, ensuring by design no programs may be
incorrectly reported as equivalent. Our system is fully implemented for a given
grammar which can represent straight-line programs with function calls and
multiple types. To efficiently train the system to generate such sequences, we
develop an original incremental training technique, named self-supervised
sample selection. We extensively study the effectiveness of this novel training
approach on proofs of increasing complexity and length. Our system, S4Eq,
achieves 97% proof success on a curated dataset of 10,000 pairs of equivalent
programs.
Comment: 30 pages including appendix
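The core guarantee described above (a proof is just a sequence of rewrites, and checking it means applying them) can be sketched in a few lines. The tuple encoding, rule names, and root-only rule application below are illustrative simplifications, not S4Eq's actual grammar or rule set.

```python
# Programs as nested tuples, semantics-preserving rewrite rules as partial
# functions (returning None when they do not apply), and a proof checker
# that validates a proposed rewrite sequence.

def commute_add(prog):
    # (add, a, b) -> (add, b, a)
    if isinstance(prog, tuple) and prog[0] == "add":
        return ("add", prog[2], prog[1])
    return None

def unfold_double(prog):
    # (double, a) -> (add, a, a)
    if isinstance(prog, tuple) and prog[0] == "double":
        return ("add", prog[1], prog[1])
    return None

RULES = {"commute_add": commute_add, "unfold_double": unfold_double}

def check_proof(src, dst, proof):
    """A proof is valid only if every named rule applies in sequence and
    the final program equals dst; an invalid or absent proof never
    certifies equivalence, so no false positives are possible."""
    cur = src
    for name in proof:
        cur = RULES[name](cur)
        if cur is None:
            return False
    return cur == dst

assert check_proof(("double", "x"), ("add", "x", "x"), ["unfold_double"])
assert not check_proof(("double", "x"), ("add", "x", "x"), ["commute_add"])
```

The asymmetry in the abstract is visible here: a valid sequence certifies equivalence, while a failed check only means "no proof found", never "the programs are equivalent".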
Central-provincial Politics and Industrial Policy-making in the Electric Power Sector in China
In addition to studies that provide meaningful insights into the complexity of technical and economic issues, a growing body of work has focused on the political process of market transition in network industries such as the electric power sector. This dissertation studies central–provincial interactions in industrial policy-making and implementation, and evaluates the roles of Chinese provinces in the market reform process of the electric power sector. Market reforms of this sector are used as an illustrative case because the new round of market reforms has achieved significant breakthroughs in areas such as pricing reform and wholesale market trading, while other policy measures, such as the liberalization of the distribution market and cross-regional market-building, remain at a nascent stage and have scored only moderate progress. It is important to investigate why some policy areas make greater progress in market reforms than others, and to examine the impact of Chinese central–provincial politics in producing these different reform outcomes. Guangdong and Xinjiang are the two provinces analyzed in this dissertation. The progress of market reforms in the two provinces showed similarities, although the provinces differ greatly in local conditions such as their stages of economic development and energy structures. The actual reform can be understood as the outcome of certain modes of interaction between central and provincial actors in the context of their particular capabilities and preferences in different policy areas. This dissertation argues that market reform is more successful in policy areas where the central and provincial authorities are able to engage mainly in integrative negotiations than in areas where they engage mainly in distributive negotiations.
Examples of works for practising staccato technique on the clarinet
The stages of strengthening the clarinet's staccato technique were applied through work on pieces. Rhythm and nuance exercises that accelerate staccato transitions were included. The most important aim of the study is not only staccato practice itself but also an emphasis on the precision of simultaneous finger–tongue coordination. To make staccato practice more productive, etude work was incorporated into the piece work. Careful attention to these exercises, together with the inspiring effect of staccato practice, has added a new dimension to musical identity. Each stage of eight original piece studies is described, and each stage is designed to reinforce the next performance and technique. This study reports in which areas the staccato technique is used and what results were obtained. How the notes are shaped through finger and tongue coordination, and within what kind of practice discipline this takes place, is planned. It was found that the concepts of reed, note, diaphragm, finger, tongue, nuance, and discipline form an inseparable whole in staccato technique. A literature review of studies on staccato was carried out. The review found that few piece-based staccato studies are used in clarinet technique, while the survey of methods showed that etude-based studies are far more common. Accordingly, exercises for accelerating and strengthening the clarinet's staccato technique are presented. It was observed that interspersing piece work among staccato etudes relaxes the mind and increases willingness. The choice of a correct reed when practising staccato was also emphasised: a correct reed was found to increase tongue speed, and a correct reed choice depends on the reed producing sound easily. If the reed does not support the power of tonguing, it is emphasised that a more suitable reed must be chosen. Interpreting a piece from beginning to end in staccato can be difficult; in this respect, the study showed that observing the given musical nuances eases tonguing performance. Passing on the acquired knowledge and experience to future generations in a way that develops them is encouraged. How new pieces can be worked out and how the staccato technique can be mastered is explained, with the aim of resolving the staccato technique in a shorter time. It is as important to commit the exercises to memory as it is to teach the fingers their places. The work that emerges as the result of the determination and patience shown will raise success to even higher levels.
Victims' Access to Justice in Trinidad and Tobago: An exploratory study of experiences and challenges of accessing criminal justice in a post-colonial society
This thesis investigates victims' access to justice in Trinidad and Tobago, using their own narratives. It seeks to capture how their experiences affected their identities as victims and citizens, alongside their perceptions of the legitimacy of the criminal justice system. While there have been some reforms in the administration of criminal justice in Trinidad and Tobago, such reforms have not focused on victims' access to the justice system. Using grounded theory methodology, qualitative data were collected through 31 in-depth interviews with victims and victim advocates. The analysis found that victims experienced interpersonal, structural, and systemic barriers at varying levels throughout the criminal justice system, which manifested as institutionalized secondary victimization, silencing, and inequality. This thesis argues that such experiences not only served to appropriate victims' conflicts but also show that access is often granted in a very narrow sense. Furthermore, the system fails to deliver genuine access to justice, as appropriated conflicts are left to stagnate with little resolution. Adopting a postcolonial lens to analyse victims' experiences, the analysis identified othering practices that served to institutionalize the vulnerability and powerlessness associated with victim identities. It is argued that these othering practices also affected victims' rights consciousness, delegitimating their identities as citizens. Moreover, as a result of their experiences, victims held mixed perceptions of the justice system: while the system is a legitimate authority, victims' endorsement of it is questionable, and their experiences suggest a reinforcement of the system's legal hegemony.
The findings suggest that within the legal system of Trinidad and Tobago, legacies of colonialism shape the postcolonial present, as the psychology and inequalities of the past persist in the interactions and processes of justice. These findings are relevant for policymakers in Trinidad and Tobago and other regions. This study recognizes that, to improve access to justice for victims, there needs to be a move towards victim empowerment that promotes resilience and enhances social capital. Going forward, further research is needed.
Antecedents of business intelligence system use
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
Organisational reliance on information has become vital for organisational competitiveness. With increasing data volumes, Business Intelligence (BI) becomes a cornerstone of the decision-support system. However, employee resistance to using Business Intelligence Systems (BIS) is evident. This creates a problem for organisations in realising the benefits of BIS. It is thus important to study the enablers of sustained use of BIS amongst employees.
This thesis identifies existing theories that can be used to study BI system use. It integrates and extends technology use theories through a framework focusing on Business Intelligence System Use (BISU). Empirical research is then conducted in Kuwait’s telecom and banking industries through a close-ended, self-administered questionnaire using a five-point Likert scale. Responses were received from 211 BI users. The data was analysed using SmartPLS to study the convergent and discriminant validity and reliability. Partial least squares structural equation modelling (PLS-SEM) was used to study the direct and indirect relationships between constructs and answer the hypotheses. In addition to SmartPLS, SPSS was used for descriptive analysis.
The results indicated that UTAUT factors consisting of performance expectancy, effort expectancy and social influence positively impact BI system use. Voluntariness of use was found to positively moderate the relationship between social influence and BI system use. Furthermore, BI system quality positively impacts both performance expectancy and effort expectancy. The BI user’s self-efficacy also positively impacts effort expectancy. In addition, social influence was found to be positively influenced by organisational factors, namely top management support and information culture.
The findings of this research contribute to the literature by determining and quantifying the factors that influence BISU through the lens of employee perspectives. This thesis also explains how employees’ object-based beliefs about BI affect their behavioural beliefs, which in turn impact BISU. Limitations of this research include the omission of UTAUT’s facilitating conditions and the limited variance of respondent demographics.
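As a hedged illustration of the kind of moderation effect reported (voluntariness strengthening the social influence → use relationship), the sketch below fits an interaction term by ordinary least squares on simulated data. It stands in for, and is much simpler than, the PLS-SEM analysis actually used; all variable values here are synthetic.

```python
import numpy as np

# Simulated data standing in for the 211 survey responses; a positive
# moderation structure (the effect of social influence on BI use grows
# with voluntariness) is built in so the regression can recover it.
rng = np.random.default_rng(0)
n = 211
social = rng.normal(size=n)   # social influence
volunt = rng.normal(size=n)   # voluntariness of use
use = (0.5 * social + 0.2 * volunt + 0.3 * social * volunt
       + rng.normal(scale=0.5, size=n))  # BI system use

# Moderation is tested by adding an interaction term to the regression.
X = np.column_stack([np.ones(n), social, volunt, social * volunt])
beta, *_ = np.linalg.lstsq(X, use, rcond=None)
_, b_social, b_volunt, b_interact = beta

# A positive interaction coefficient indicates positive moderation.
assert b_interact > 0
```

In SEM terms this is only the structural part of one hypothesis; measurement of the latent constructs (the questionnaire scales) is omitted entirely.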
Foundations for programming and implementing effect handlers
First-class control operators provide programmers with an expressive and efficient
means for manipulating control through reification of the current control state as a first-class object, enabling programmers to implement their own computational effects and
control idioms as shareable libraries. Effect handlers provide a particularly structured
approach to programming with first-class control by naming control-reifying
operations and separating them from their handling.
This thesis is composed of three strands of work in which I develop operational
foundations for programming and implementing effect handlers as well as exploring
the expressive power of effect handlers.
The first strand develops a fine-grain call-by-value core calculus of a statically
typed programming language with a structural notion of effect types, as opposed to the
nominal notion of effect types that dominates the literature. With the structural approach,
effects need not be declared before use. The usual safety properties of statically typed
programming are retained by making crucial use of row polymorphism to build and
track effect signatures. The calculus features three forms of handlers: deep, shallow,
and parameterised. They each offer a different approach to manipulate the control state
of programs. Traditional deep handlers are defined by folds over computation trees,
and are the original construct proposed by Plotkin and Pretnar. Shallow handlers are
defined by case splits (rather than folds) over computation trees. Parameterised handlers
are deep handlers extended with a state value that is threaded through the folds over
computation trees. To demonstrate the usefulness of effects and handlers as a practical
programming abstraction I implement the essence of a small UNIX-style operating
system complete with multi-user environment, time-sharing, and file I/O.
The second strand studies continuation passing style (CPS) and abstract machine
semantics, which are foundational techniques that admit a unified basis for implementing deep, shallow, and parameterised effect handlers in the same environment. The
CPS translation is obtained through a series of refinements of a basic first-order CPS
translation for a fine-grain call-by-value language into an untyped language. Each refinement moves toward a more intensional representation of continuations eventually
arriving at the notion of generalised continuations, which admit simultaneous support for
deep, shallow, and parameterised handlers. The initial refinement adds support for deep
handlers by representing stacks of continuations and handlers as a curried sequence of
arguments. The image of the resulting translation is not properly tail-recursive, meaning some function application terms do not appear in tail position. To rectify this the
CPS translation is refined once more to obtain an uncurried representation of stacks
of continuations and handlers. Finally, the translation is made higher-order in order to
contract administrative redexes at translation time. The generalised continuation representation is used to construct an abstract machine that provides simultaneous support for
deep, shallow, and parameterised effect handlers.
The third strand explores the expressiveness of effect handlers. First, I show that
deep, shallow, and parameterised notions of handlers are interdefinable by way of typed
macro-expressiveness, which provides a syntactic notion of expressiveness that affirms
the existence of encodings between handlers, but it provides no information about the
computational content of the encodings. Second, using the semantic notion of expressiveness, I show that for a class of programs a programming language with first-class
control (e.g. effect handlers) admits asymptotically faster implementations than is possible in a language without first-class control.
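The distinction between the three handler forms described in the first strand can be sketched over explicit computation trees. The Python encoding below is illustrative only, not the thesis's typed calculus: a tree is either ("return", v) or ("op", name, arg, k), where k maps an operation's result to the rest of the computation.

```python
def deep_handle(tree, clauses, ret):
    """Deep handler: a fold over the tree -- resuming re-enters the same
    handler, so every residual operation is handled by the same clauses."""
    if tree[0] == "return":
        return ret(tree[1])
    _, name, arg, k = tree
    resume = lambda x: deep_handle(k(x), clauses, ret)
    return clauses[name](arg, resume)

def shallow_handle(tree, clauses, ret):
    """Shallow handler: a case split -- only the first operation is
    handled; the continuation returns the raw, unhandled remainder."""
    if tree[0] == "return":
        return ret(tree[1])
    _, name, arg, k = tree
    return clauses[name](arg, k)

def param_handle(tree, clauses, ret, state):
    """Parameterised handler: a deep handler with a state value threaded
    through the fold."""
    if tree[0] == "return":
        return ret(tree[1], state)
    _, name, arg, k = tree
    resume = lambda x, s: param_handle(k(x), clauses, ret, s)
    return clauses[name](arg, state, resume)

# State effect written as a tree:  put 1; x <- get; return x + 1
prog = ("op", "put", 1,
        lambda _: ("op", "get", None,
                   lambda x: ("return", x + 1)))

state_clauses = {
    "get": lambda _a, s, resume: resume(s, s),     # resume with state
    "put": lambda a, _s, resume: resume(None, a),  # thread new state
}
assert param_handle(prog, state_clauses, lambda v, s: (v, s), 0) == (2, 1)
```

The interdefinability result from the third strand is visible in miniature: the deep handler can be recovered by a shallow handler that re-installs itself in every clause, and the parameterised handler differs from the deep one only in the extra threaded argument.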
Digital asset management via distributed ledgers
Distributed ledgers rose to prominence with the advent of Bitcoin, the first provably secure protocol to solve consensus in an open-participation setting. Following, active research and engineering efforts have proposed a multitude of applications and alternative designs, the most prominent being Proof-of-Stake (PoS). This thesis expands the scope of secure and efficient asset management over a distributed ledger around three axes: i) cryptography; ii) distributed systems; iii) game theory and economics. First, we analyze the security of various wallets. We start with a formal model of hardware wallets, followed by an analytical framework of PoS wallets, each outlining the unique properties of Proof-of-Work (PoW) and PoS respectively. The latter also provides a rigorous design to form collaborative participating entities, called stake pools. We then propose Conclave, a stake pool design which enables a group of parties to participate in a PoS system in a collaborative manner, without a central operator. Second, we focus on efficiency. Decentralized systems are aimed at thousands of users across the globe, so a rigorous design for minimizing memory and storage consumption is a prerequisite for scalability. To that end, we frame ledger maintenance as an optimization problem and design a multi-tier framework for designing wallets which ensure that updates increase the ledger’s global state only to a minimal extent, while preserving the security guarantees outlined in the security analysis. Third, we explore incentive-compatibility and analyze blockchain systems from a micro and a macroeconomic perspective. We enrich our cryptographic and systems' results by analyzing the incentives of collective pools and designing a state efficient Bitcoin fee function. 
We then analyze the Nash dynamics of distributed ledgers, introducing a formal model that evaluates whether rational, utility-maximizing participants are disincentivized from exhibiting undesirable infractions, and highlighting the differences between PoW- and PoS-based ledgers, both in a standalone setting and under external parameters, such as market price fluctuations. We conclude by introducing a macroeconomic principle, cryptocurrency egalitarianism, and then describing two mechanisms for enabling taxation in blockchain-based currency systems.
Statistical Learning for Gene Expression Biomarker Detection in Neurodegenerative Diseases
In this work, statistical learning approaches are used to detect biomarkers for neurodegenerative diseases (NDs). NDs are becoming increasingly prevalent as populations age, making understanding of disease and identification of biomarkers progressively important for facilitating early diagnosis and the screening of individuals for clinical trials. Advancements in gene expression profiling have enabled the exploration of disease biomarkers at an unprecedented scale. The work presented here demonstrates the value of gene expression data in understanding the underlying processes and detecting biomarkers of NDs. The value of novel approaches to previously collected -omics data is shown, and it is demonstrated that new therapeutic targets can be identified. Additionally, the importance of meta-analysis in improving the power of multiple small studies is demonstrated. The value of blood transcriptomics data is shown in applications to researching NDs, to understand underlying processes using network analysis and a novel hub detection method. Finally, after demonstrating the value of blood gene expression data for investigating NDs, a combination of feature selection and classification algorithms was used to identify novel, accurate biomarker signatures for the diagnosis and prognosis of Parkinson’s disease (PD) and Alzheimer’s disease (AD). Additionally, the use of feature pools based on prior knowledge of disease, and the viability of neural networks for dimensionality reduction and biomarker detection, are demonstrated and discussed. In summary, gene expression data are shown to be valuable for the investigation of NDs, and novel gene biomarker signatures are presented for the diagnosis and prognosis of PD and AD.
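The general recipe described (feature selection followed by classification) can be sketched on simulated expression data. The scoring rule, gene counts, and nearest-centroid classifier below are illustrative stand-ins, not the thesis's actual pipeline, algorithms, or data.

```python
import numpy as np

# Simulated expression matrix: 500 genes, two classes of 40 samples each;
# only the first 10 genes carry a real class difference.
rng = np.random.default_rng(1)
n_genes, n_per_class = 500, 40
X0 = rng.normal(size=(n_per_class, n_genes))
X1 = rng.normal(size=(n_per_class, n_genes))
X1[:, :10] += 1.5  # the informative genes are shifted in class 1
X = np.vstack([X0, X1])
y = np.array([0] * n_per_class + [1] * n_per_class)

def select_genes(X, y, k):
    """Feature selection: rank genes by a two-class t-like score and
    keep the k most discriminative ones."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    s = X[y == 0].std(0) + X[y == 1].std(0) + 1e-8
    return np.argsort(-np.abs(m0 - m1) / s)[:k]

def nearest_centroid(Xtr, ytr, Xte):
    """Classification: label each sample by its nearest class centroid."""
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    d0 = np.linalg.norm(Xte - c0, axis=1)
    d1 = np.linalg.norm(Xte - c1, axis=1)
    return (d1 < d0).astype(int)

genes = select_genes(X, y, 10)               # the "biomarker signature"
preds = nearest_centroid(X[:, genes], y, X[:, genes])
assert (preds == y).mean() > 0.8             # training accuracy only
```

In practice the accuracy reported would come from held-out or cross-validated samples, and selection would be repeated inside each fold to avoid selection bias; the sketch omits this for brevity.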