Ensuring Access to Safe and Nutritious Food for All Through the Transformation of Food Systems
A Phenomenological Study of How Active Engagement in Black Greek Letter Sororities Influences Christian Members' Spiritual Growth
This phenomenological study explored how being part of a Black Greek Letter Organization (BGLO) sorority impacts the spiritual growth of its Christian members. One of the issues explored was the influence relationships within these sororities have on members striving to be like Christ. There is a dichotomy of perspectives regarding BGLOs. They play a significant role in the Black community as organizations that foster leadership, philanthropy, and sisterhood and promote education. They are admired on and off college campuses and, through graduate chapters, in the broader community. The objective of phenomenology is to describe phenomena, here the spiritual growth of Christian sorority members, from the life experiences of those who live them; that premise guided the interviews conducted for this study. The results found that active engagement in a BGLO sorority positively impacts its members' spiritual growth. From the emotional stories of sisterhood, service, and devotion to prayer, their experiences evidenced strengthened walks of faith. This study counters the anti-BGLO narrative as a testament to these organizations' legacy and practices deeply grounded in the church.
Model Diagnostics meets Forecast Evaluation: Goodness-of-Fit, Calibration, and Related Topics
Principled forecast evaluation and model diagnostics are vital in fitting probabilistic models and forecasting outcomes of interest. A common principle is that fitted or predicted distributions ought to be calibrated, ideally in the sense that the outcome is indistinguishable from a random draw from the posited distribution. Much of this thesis is centered on calibration properties of various types of forecasts.
In the first part of the thesis, a simple algorithm for exact multinomial goodness-of-fit tests is proposed. The algorithm computes exact p-values based on various test statistics, such as the log-likelihood ratio and Pearson's chi-square. A thorough analysis shows improvements over extant methods. However, the runtime of the algorithm grows exponentially in the number of categories, and hence its use is limited.
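To make the exponential blow-up concrete, the idea behind an exact multinomial goodness-of-fit test can be sketched as follows (a brute-force illustration, not the thesis's algorithm): enumerate every count vector with the observed total, and sum the probabilities of those at least as extreme as the observation under the chosen test statistic.

```python
from itertools import product
from math import factorial, log, prod

def multinomial_pmf(counts, probs):
    """Probability of an exact count vector under the multinomial model."""
    coef = factorial(sum(counts))
    for c in counts:
        coef //= factorial(c)
    return coef * prod(p ** c for p, c in zip(probs, counts))

def llr_statistic(counts, probs):
    """Log-likelihood ratio statistic G = 2 * sum c * log(c / (n p))."""
    n = sum(counts)
    return 2 * sum(c * log(c / (n * p)) for c, p in zip(counts, probs) if c > 0)

def exact_p_value(observed, probs):
    """Exact p-value: total probability of outcomes at least as extreme."""
    n, k = sum(observed), len(observed)
    t_obs = llr_statistic(observed, probs)
    p_value = 0.0
    # full enumeration of count vectors summing to n: exponential in k
    for counts in product(range(n + 1), repeat=k):
        if sum(counts) == n and llr_statistic(counts, probs) >= t_obs - 1e-12:
            p_value += multinomial_pmf(counts, probs)
    return p_value

print(exact_p_value([6, 0, 0], [1/3, 1/3, 1/3]))  # all mass on one category
```

Even for a handful of categories the enumeration grows rapidly with n and k, which mirrors the runtime limitation noted above.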
In the second part, a framework rooted in probability theory is developed, which gives rise to hierarchies of calibration, and applies to both predictive distributions and stand-alone point forecasts. Based on a general notion of conditional T-calibration, the thesis introduces population versions of T-reliability diagrams and revisits a score decomposition into measures of miscalibration, discrimination, and uncertainty. Stable and efficient estimators of T-reliability diagrams and score components arise via nonparametric isotonic regression and the pool-adjacent-violators algorithm. For in-sample model diagnostics, a universal coefficient of determination is introduced that nests and reinterprets the classical R² in least squares regression.
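For intuition, the pool-adjacent-violators (PAV) algorithm underlying these estimators can be sketched in a few lines. This is a minimal illustration for binary outcomes sorted by forecast value; the thesis's general T-calibration machinery is not reproduced here.

```python
def pav(y):
    """Isotonic (non-decreasing) least-squares fit via pool-adjacent-violators."""
    merged = []  # blocks of pooled values, stored as [sum, count]
    for v in y:
        merged.append([v, 1])
        # pool while the last two block means violate monotonicity
        # (means compared by cross-multiplication to avoid division)
        while len(merged) > 1 and merged[-2][0] * merged[-1][1] > merged[-1][0] * merged[-2][1]:
            s, c = merged.pop()
            merged[-1][0] += s
            merged[-1][1] += c
    fit = []
    for s, c in merged:
        fit.extend([s / c] * c)
    return fit

# Binary outcomes paired with forecasts sorted in increasing order; the fitted
# values are the recalibrated (reliability-diagram) estimates per forecast.
print(pav([0, 1, 0, 1]))  # → [0.0, 0.5, 0.5, 1.0]
```

The monotone fit pools adjacent observations until the estimated event frequencies are non-decreasing in the forecast, which is exactly the shape constraint a reliability diagram should satisfy.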
In the third part, probabilistic top lists are proposed as a novel type of prediction in classification, which bridges the gap between single-class predictions and predictive distributions. The probabilistic top list functional is elicited by strictly consistent evaluation metrics, based on symmetric proper scoring rules, which admit comparison of various types of predictions.
On the Mechanism of Building Core Competencies: a Study of Chinese Multinational Port Enterprises
This study aims to explore how Chinese multinational port enterprises (MNPEs) build their core competencies. Core competencies are a firm's special capabilities and sources of sustainable competitive advantage (SCA) in the marketplace, and the concept has led to extensive research and debate. However, few studies have examined the mechanisms of building core competencies in the context of Chinese MNPEs. Accordingly, answers were sought to three research questions:
1. What are the core competencies of the Chinese MNPEs?
2. What are the mechanisms that the Chinese MNPEs use to build their core competencies?
3. What are the paths that the Chinese MNPEs pursue to build their resource bases?
The study adopted a multiple-case study design, focusing on the mechanism of building core competencies through the lens of the resource-based view (RBV). Five leading Chinese MNPEs and three industry associations were purposively selected as Case Companies.
The study revealed three main findings. First, it identified three generic core competencies possessed by the Case Companies: innovation in business models and operations, utilisation of technologies, and acquisition of strategic resources. Second, it developed the conceptual framework of the Mechanism of Building Core Competencies (MBCC), defined as a process of change in collective learning about the effective and efficient utilisation of a firm's resources in response to critical events. Third, it proposed three paths to build core competencies: enhancing collective learning, selecting sustainable processes, and building the resource base.
The study contributes to the knowledge of core competencies and the RBV in three ways: (1) presenting three generic core competencies of the Chinese MNPEs, (2) proposing a new conceptual framework to explain how Chinese MNPEs build their core competencies, and (3) suggesting a solid anchor point (MBCC) to explain the links among resources, core competencies, and SCA. The findings set benchmarks for the Chinese logistics industry and provide guidelines for building core competencies.
Meaning-Making Practices of Emergent Arabic–English Bilingual Kindergarten Children in Cairo
The number of British Schools in the Middle East and North Africa (MENA) region is growing. The National Curriculum of England is used by an increasing number of such schools. As well as exporting a culturally-specific curriculum, these schools usually adopt an ideology of monolingualism, thus potentially limiting communication for emergent bilinguals and failing to acknowledge the multiple ways of meaning-making.
Current studies of translanguaging are moving the focus to multimodal forms of communication as a resource for thinking and communicating (García and Wei 2014, Wei 2018). Building on the work of Kress (1997, 2010), I explore pre-school emergent bilinguals' wider signifying practices and create an analytical framework, which I call MMTL (multimodal translanguaging), used as a lens to illustrate meaning-making.
Valley Hill in Cairo, Egypt is a British school which encourages ‘English-only’ as the medium of instruction in the kindergarten. Using a case study methodology, this research explores the meaning-making practices of eight emergent bilingual children aged 3–4 during child-initiated play, later reduced to four in the thesis to provide a detailed multimodal analysis. The principal aim is to explore their speech, gaze, gesture, and their engagement (layout/position) with artefacts during play.
The findings of this study suggest that although there is an 'English-only' approach, these young emergent bilingual children are meaning-making in a variety of ways. Children are translanguaging, but never in isolation from other modes of communication. Emergent bilinguals use a range of modes to mediate their understanding and communication with others. They use gesture, gaze, and artefacts alongside translingual practices to move meaning across to more accessible modes, enabling communication and understanding. The implication for schools is that they should embrace such hybrid practices, and that teachers should be more responsive to young children's meaning-making to enable learning.
Digital asset management via distributed ledgers
Distributed ledgers rose to prominence with the advent of Bitcoin, the first provably secure protocol to solve consensus in an open-participation setting. Subsequently, active research and engineering efforts have proposed a multitude of applications and alternative designs, the most prominent being Proof-of-Stake (PoS). This thesis expands the scope of secure and efficient asset management over a distributed ledger along three axes: i) cryptography; ii) distributed systems; iii) game theory and economics. First, we analyze the security of various wallets. We start with a formal model of hardware wallets, followed by an analytical framework for PoS wallets, each outlining the unique properties of Proof-of-Work (PoW) and PoS respectively. The latter also provides a rigorous design for forming collaborative participating entities, called stake pools. We then propose Conclave, a stake pool design which enables a group of parties to participate in a PoS system collaboratively, without a central operator. Second, we focus on efficiency. Decentralized systems are aimed at thousands of users across the globe, so a rigorous design that minimizes memory and storage consumption is a prerequisite for scalability. To that end, we frame ledger maintenance as an optimization problem and construct a multi-tier framework for designing wallets that ensure updates increase the ledger's global state only minimally, while preserving the guarantees established in the security analysis. Third, we explore incentive compatibility and analyze blockchain systems from both a micro- and a macroeconomic perspective. We enrich our cryptographic and systems results by analyzing the incentives of collective pools and designing a state-efficient Bitcoin fee function.
We then analyze the Nash dynamics of distributed ledgers, introducing a formal model that evaluates whether rational, utility-maximizing participants are disincentivized from committing undesirable infractions, and highlighting the differences between PoW- and PoS-based ledgers, both in a standalone setting and under external parameters, such as market price fluctuations. We conclude by introducing a macroeconomic principle, cryptocurrency egalitarianism, and describing two mechanisms for enabling taxation in blockchain-based currency systems.
International Conference "Shaping light for health and wellbeing in cities"
The book collects contributions presented during the international conference "Shaping light for health and wellbeing in cities", organized in the framework of the H2020 ENLIGHTENme project. The conference investigated the multifaceted consequences light has on life in cities, adopting a multidisciplinary and integrated approach to explore the complex challenges urban lighting poses to health and wellbeing, the urban realm, and social life. Papers cover several disciplines, such as clinical and biomedical sciences, ethics and Responsible Research & Innovation, urban planning and architecture, data accessibility and interoperability, as well as social sciences and economics, and provide multifaceted insights that inspire further exploration. The contributions represent a step towards the development of innovative policies for improving health and wellbeing in our cities, addressing both indoor and outdoor lighting.
Extraction of transforming sequences and sentence histories from writing process data : a first step towards linguistic modeling of writing
Online first, part of special issue "Methods for understanding writing process by analysis of writing timecourse"
Acquired as part of the Swiss National Licences (http://www.nationallizenzen.ch).
Producing written texts is a non-linear process: in contrast to speech, writers are free to change already written text at any place at any point in time. Linguistic considerations are likely to play an important role, but so far, no linguistic models of the writing process exist. We present an approach for the analysis of writing processes with a focus on linguistic structures, based on the novel concepts of transforming sequences, text history, and sentence history. The processing of raw keystroke logging data and the application of natural language processing tools allows for the extraction and filtering of product and process data to be stored in a hierarchical data structure. This structure is used to re-create and visualize the genesis and history of a text and its individual sentences. Focusing on sentences as the primary building blocks of written language and full texts, we aim to complement established writing process analyses and, ultimately, to interpret writing timecourse data with respect to linguistic structures. To enable researchers to explore this view, we provide a fully functional implementation of our approach as an open-source software tool, along with visualizations of the results. We report on a small-scale exploratory study in German in which we used our tool. The results indicate both the feasibility of the approach and that writers actually revise on a linguistic level. The latter confirms the need for modeling written text production from the perspective of linguistic structures beyond the word level.
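As an illustration of the underlying idea (using Python's standard difflib, not the authors' tool), a transforming sequence between two consecutive text versions can be recovered as the non-matching spans of a character alignment:

```python
import difflib

def transforming_sequences(old, new):
    """Return the non-matching spans between two consecutive text versions."""
    sm = difflib.SequenceMatcher(a=old, b=new, autojunk=False)
    return [(op, old[i1:i2], new[j1:j2])
            for op, i1, i2, j1, j2 in sm.get_opcodes()
            if op != "equal"]

# A toy text history: each element is one snapshot from a keystroke log.
history = ["The cat sat.", "The black cat sat.", "The black cat slept."]
for before, after in zip(history, history[1:]):
    print(transforming_sequences(before, after))
```

Chaining such pairwise differences over all logged snapshots yields a history for the full text; restricting the spans to sentence boundaries would give per-sentence histories in the spirit described above.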
Hunting Wildlife in the Tropics and Subtropics
The hunting of wild animals for their meat has been a crucial activity in the evolution of humans. It continues to be an essential source of food and a generator of income for millions of Indigenous and rural communities worldwide. Conservationists rightly fear that excessive hunting of many animal species will cause their demise, as has already happened throughout the Anthropocene. Many species of large mammals and birds have been decimated or annihilated due to overhunting by humans. If such pressures continue, many other species will meet the same fate. Equally, if the use of wildlife resources by those who depend on them is to continue, sustainable practices must be implemented. These communities need to remain or become custodians of the wildlife resources within their lands, for their own well-being as well as for biodiversity in general. This title is also available via Open Access on Cambridge Core.
Elasto-plastic deformations within a material point framework on modern GPU architectures
Plastic strain localization is an important process on Earth. It strongly influences the mechanical behaviour of natural processes, such as fault mechanics, earthquakes or orogeny. At a smaller scale, a landslide is a fantastic example of elasto-plastic deformations. Such behaviour spans from pre-failure mechanisms to post-failure propagation of the unstable material. To fully resolve the landslide mechanics, the selected numerical methods should be able to efficiently address a wide range of deformation magnitudes.
Accurate and performant numerical modelling requires substantial computational resources. Mesh-free numerical methods such as the material point method (MPM) or smoothed-particle hydrodynamics (SPH) are particularly computationally expensive when compared with mesh-based methods, such as the finite element method (FEM) or the finite difference method (FDM). Still, mesh-free methods are particularly well-suited to numerical problems involving large elasto-plastic deformations. However, the computational efficiency of these methods must first be improved in order to tackle complex three-dimensional problems, such as landslides.
As such, this research work attempts to alleviate the computational cost of the material point method by using the most recent graphics processing unit (GPU) architectures available. GPUs are many-core processors originally designed to refresh screen pixels (e.g., for computer games) independently. This allows GPUs to deliver massive parallelism when compared to central processing units (CPUs).
To do so, this research work first investigates code prototyping in a high-level language, e.g., MATLAB. This makes it possible to implement vectorized algorithms and to benchmark numerical results of two-dimensional analyses against analytical solutions and/or experimental results in an affordable amount of time. Afterwards, a low-level language, CUDA C, is used to efficiently implement a GPU-based solver, ep2-3De v1.0, which can resolve three-dimensional problems in a decent amount of time. This part takes advantage of the massive parallelism of modern GPU architectures. In addition, a first attempt at multi-GPU computing is made to further increase performance and to address the on-chip memory limitation. Finally, this GPU-based solver is used to investigate three-dimensional granular collapses and is compared with experimental evidence obtained in the laboratory.
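As a toy illustration of the vectorized prototyping style described above (NumPy standing in for MATLAB; the one-dimensional model, function names, and parameter values are invented for the example), an elastic-predictor/plastic-corrector stress update can be applied to many material points at once:

```python
import numpy as np

def return_mapping(stress, dstrain, E=1e7, yield_stress=5e4):
    """1D elasto-plastic update: linear elasticity capped by a yield stress."""
    trial = stress + E * dstrain          # elastic predictor, all points at once
    over = np.abs(trial) > yield_stress   # points violating the yield criterion
    # plastic corrector: project offending trial stresses back onto the yield surface
    trial[over] = np.sign(trial[over]) * yield_stress
    return trial

stress = np.zeros(4)                           # initial stress per material point
dstrain = np.array([1e-3, 6e-3, -8e-3, 2e-3])  # strain increments this step
print(return_mapping(stress, dstrain))
```

The same data-parallel structure, one independent update per material point, is what maps naturally onto GPU threads in the CUDA C implementation.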
This research work demonstrates that the material point method is well suited to resolving small to large elasto-plastic deformations. Moreover, the computational efficiency of the method can be dramatically increased using modern GPU architectures. These allow fast, performant and accurate three-dimensional modelling of landslides, provided that the on-chip memory limitation is alleviated with an appropriate parallel strategy.