Ensuring Access to Safe and Nutritious Food for All Through the Transformation of Food Systems
Offline and Online Models for Learning Pairwise Relations in Data
Pairwise relations between data points are essential for numerous machine learning algorithms. Many representation learning methods consider pairwise relations to identify the latent features and patterns in the data. This thesis investigates the learning of pairwise relations from two perspectives: offline learning and online learning.

The first part of the thesis focuses on offline learning. It begins with an investigation of performance modeling for a synchronization method in concurrent programming, using a Markov chain whose state transition matrix models pairwise relations between the cores involved in a computer process. The thesis then focuses on a particular pairwise distance measure, the minimax distance, and explores memory-efficient approaches to computing it. It proposes a hierarchical representation of the data with a linear memory requirement with respect to the number of data points, from which the exact pairwise minimax distances can be derived in a memory-efficient manner. Next, a memory-efficient sampling method is proposed that follows this hierarchical representation and samples the data points in such a way that the minimax distances between all data points are maximally preserved. Finally, the first part proposes a practical non-parametric clustering of vehicle motion trajectories to annotate traffic scenarios, based on transitive relations between trajectories in an embedded space.

The second part of the thesis takes an online learning perspective. It starts by presenting an online learning method for identifying bottlenecks in a road network by extracting the minimax path, where bottlenecks are road segments with the highest cost, e.g., in the sense of travel time. Inspired by real-world road networks, the thesis assumes a stochastic traffic environment in which the road-specific probability distribution of travel time is unknown. The parameters of this distribution must therefore be learned from observations, which is done by modeling the bottleneck identification task as a combinatorial semi-bandit problem. The proposed approach takes prior knowledge into account and follows a Bayesian approach to update the parameters. Moreover, it develops a combinatorial variant of Thompson Sampling and derives an upper bound on the corresponding Bayesian regret. Furthermore, the thesis proposes an approximate algorithm to address the associated computational intractability. Finally, the thesis incorporates contextual information about road network segments by extending the proposed model to a contextual combinatorial semi-bandit framework, and investigates and develops various algorithms for this contextual combinatorial setting.
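The abstract does not give the algorithmic details, so the following is only a minimal illustrative sketch of one round of a combinatorial Thompson Sampling scheme for bottleneck identification. It assumes Gaussian travel times with known unit observation variance and conjugate Normal priors per road segment; all names and modelling choices here are illustrative, not the thesis's actual method.

```python
import random
import heapq

def minimax_path_cost_edges(graph, source, target, cost):
    """Minimax variant of Dijkstra: find the path whose maximum
    edge cost is minimal, returning the edges on that path."""
    best = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > best.get(u, float("inf")):
            continue  # stale heap entry
        for v in graph[u]:
            nd = max(d, cost[(u, v)])
            if nd < best.get(v, float("inf")):
                best[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    edges, node = [], target          # reconstruct the edge list
    while node != source:
        edges.append((prev[node], node))
        node = prev[node]
    return edges[::-1]

def thompson_round(graph, priors, source, target, observe):
    """One round of combinatorial Thompson Sampling with Normal
    priors, assuming unit-variance travel-time observations."""
    # 1. Sample a mean travel time for every edge from its posterior.
    sampled = {e: random.gauss(mu, sigma) for e, (mu, sigma) in priors.items()}
    # 2. Act greedily w.r.t. the sample: play the minimax path.
    path = minimax_path_cost_edges(graph, source, target, sampled)
    # 3. Observe travel times on the chosen edges; conjugate update.
    for e in path:
        mu, sigma = priors[e]
        x = observe(e)
        tau = 1.0 / sigma**2          # prior precision
        post_tau = tau + 1.0          # add one unit-variance observation
        priors[e] = ((tau * mu + x) / post_tau, (1.0 / post_tau) ** 0.5)
    return path
```

The edge with the largest sampled cost on the returned path is the round's bottleneck estimate; as posteriors concentrate, the played path converges to the true minimax path.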
A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms
Vertical farming (VF) is the practice of growing crops or animals using the vertical dimension via multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data.
A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting. This could benefit complex sectors that have only scarce data with which to predict business viability.
To begin executing the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Lessons learned from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks, which are organised into a risk taxonomy. Labour was the most commonly reported top challenge; research was therefore conducted to explore lean principles to improve productivity.
A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects. This enabled flexible computation, without precise production or financial data, to improve the accuracy of economic estimation. The model assessed two VPF cases (one in the UK and another in Japan), providing the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and assessed the viability of the UK and Japan cases.
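The abstract does not disclose the model's structure. As a hedged illustration only, propagating uncertain inputs to a profit distribution by Monte Carlo might look like the sketch below; every figure is a hypothetical placeholder, not data from the thesis.

```python
import random

def simulate_annual_profit(n_sims=10_000, seed=42):
    """Monte Carlo sketch: propagate uncertain inputs (triangular
    distributions as a simple stand-in for imprecise data) to a
    distribution of annual profit. All numbers are hypothetical."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n_sims):
        yield_kg = rng.triangular(40_000, 80_000, 60_000)     # kg/year
        price = rng.triangular(4.0, 9.0, 6.0)                 # GBP/kg
        energy_cost = rng.triangular(80_000, 200_000, 120_000)    # GBP/year
        labour_cost = rng.triangular(100_000, 250_000, 150_000)   # GBP/year
        profits.append(yield_kg * price - energy_cost - labour_cost)
    return profits

profits = sorted(simulate_annual_profit())
p_loss = sum(p < 0 for p in profits) / len(profits)  # probability of a loss
var_5 = profits[len(profits) // 20]                  # 5th-percentile profit
```

Risk measures such as the probability of a loss and a percentile-based value-at-risk fall directly out of the simulated distribution, which is the kind of output a risk-empowered DSS could report even when inputs are only known as ranges.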
An environmental impact assessment model was developed, allowing VPF operators to evaluate their carbon footprint relative to traditional agriculture using life-cycle assessment. I explore strategies for net-zero-carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably powered VPF can reduce carbon emissions compared to field-based agriculture when land-use change is considered.
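As a back-of-the-envelope illustration of why the electricity source dominates this comparison (all emission factors below are hypothetical placeholders, not the thesis's LCA values):

```python
# Hypothetical emission factors; real LCA values depend on region,
# technology, and system boundary.
GRID_EF = 0.23         # kg CO2e per kWh, grid-electricity assumption
SOLAR_EF = 0.05        # kg CO2e per kWh, PV life-cycle assumption
ENERGY_PER_KG = 10.0   # kWh of electricity per kg of VPF produce (assumed)
FIELD_FOOTPRINT = 1.4  # kg CO2e per kg, field crop incl. land-use change (assumed)

def vpf_footprint(ef_electricity):
    """Footprint of 1 kg of VPF produce, treating it as electricity-dominated."""
    return ENERGY_PER_KG * ef_electricity

grid = vpf_footprint(GRID_EF)    # grid-powered VPF
solar = vpf_footprint(SOLAR_EF)  # renewably powered VPF
```

Under these placeholder numbers, grid-powered VPF exceeds the field-based footprint while solar-powered VPF falls below it, mirroring the sensitivity-analysis conclusion that the energy mix decides which system wins.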
The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
Recommended from our members
Reliable Decision-Making with Imprecise Models
The rapid growth in the deployment of autonomous systems across various sectors has generated considerable interest in how these systems can operate reliably in large, stochastic, and unstructured environments. Despite recent advances in artificial intelligence and machine learning, it is challenging to assure that autonomous systems will operate reliably in the open world. One of the causes of unreliable behavior is the impreciseness of the model used for decision-making. Due to the practical challenges in data collection and precise model specification, autonomous systems often operate based on models that do not represent all the details in the environment. Even if the system has access to a comprehensive decision-making model that accounts for all the details in the environment and all possible scenarios the agent may encounter, it may be intractable to solve this complex model optimally. Consequently, this complex, high-fidelity model may be simplified to accelerate planning, introducing imprecision. Reasoning with such imprecise models affects the reliability of autonomous systems. A system's actions may sometimes produce unexpected, undesirable consequences, which are often identified after deployment. How can we design autonomous systems that can operate reliably in the presence of uncertainty and model imprecision?
This dissertation presents solutions to address three classes of model imprecision in a Markov decision process, along with an analysis of the conditions under which bounded performance can be guaranteed. First, an adaptive outcome selection approach is introduced to devise risk-aware reduced models of the environment that efficiently balance the trade-off between model simplicity and fidelity, to accelerate planning in resource-constrained settings. Second, a framework that extends the stochastic shortest path framework to problems with imperfect information about the goal state during planning is introduced, along with two solution approaches to solve this problem. Finally, two complementary solution approaches are presented to minimize the negative side effects of agent actions. The techniques presented in this dissertation enable an autonomous system to detect and mitigate undesirable behavior, without redesigning the model entirely.
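As a rough illustration of the reduced-model idea (not the dissertation's actual algorithms), the sketch below pairs standard value iteration with a naive outcome-selection reduction that keeps only the most likely outcomes of each action; the risk-aware, adaptive selection described above is considerably more sophisticated.

```python
def value_iteration(states, actions, transitions, reward, gamma=0.95, eps=1e-6):
    """Standard value iteration. transitions[(s, a)] is a list of
    (probability, next_state) pairs."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                reward(s, a) + gamma * sum(p * V[ns] for p, ns in transitions[(s, a)])
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V

def reduce_model(transitions, k=1):
    """Outcome-selection sketch: keep only the k most likely outcomes
    of each action and renormalise their probabilities. Planning on the
    reduced model is cheaper but introduces imprecision."""
    reduced = {}
    for key, outcomes in transitions.items():
        top = sorted(outcomes, reverse=True)[:k]
        z = sum(p for p, _ in top)
        reduced[key] = [(p / z, ns) for p, ns in top]
    return reduced
```

With k=1 this degenerates to most-likely-outcome determinization; the adaptive approach in the dissertation instead chooses which outcomes to retain per state-action pair so that risky exceptions are not silently discarded.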
A productive response to legacy system petrification
Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system progressively fall further and further behind their evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change-resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process - termed Productive Migration - that homes in upon the specific causes of petrification within each particular legacy system and provides guidance upon how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system, and lead us to suitable antidote productive patterns via which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets.
The mechanisms of antibody generation in the llama
The llama is able to generate a unique class of antibody. The heavy chain immunoglobulins consist only of two heavy chain polypeptides and bind antigen specifically through single protein domains. Although the mechanisms by which such an antibody interacts with antigen have been studied at some length, the manner in which the heavy chain antibody is generated within the llama is unknown. In this study a number of components of the llama immune system have been characterised. The isolation of genes encoding the variable domain of the heavy chain antibody indicates that specific genetic elements within the llama genome are responsible for the generation of the heavy chain antibody. The discovery of constant region genes that encode the heavy chain antibody provides an explanation for the absence of a major immunoglobulin domain from the final, secreted gene product. The lack of this domain within the expressed antibody is believed to be the result of a single nucleotide splice site mutation. In order to investigate the process of llama antibody generation further, additional components of the llama immune system, the recombination activating genes (rag), were isolated. One such llama rag gene (rag-1) was cloned, expressed and utilised in an in vitro assay system to investigate recombination events taking place during antibody generation. This assay involved the use of specific signal sequences derived from variable domain gene sequence data and represents, to our knowledge, the first examination of non-murine RAG activity. Through the use of this system distinct differences between llama and mouse recombination signal sequences (RSSs) were uncovered. These differences, located within a specific region of the RSS known as the coding flank, may play an important role in llama antibody generation. These results have led to the proposal of a number of models for the mechanisms involved in llama antibody generation.
In search of 'The people of La Manche': A comparative study of funerary practices in the Transmanche region during the Late Neolithic and Early Bronze Age (2500 BC-1500 BC)
This research project sets out to discover whether archaeological evidence dating between 2500 BC - 1500 BC from supposed funerary contexts in Kent, Flanders and north-eastern Transmanche France is sufficient to make valid comparisons between social and cultural structures on either side of the short-sea Channel region. Evidence from the beginning of the period primarily comes in the form of the widespread Beaker phenomenon. Chapter 5 shows that this class of data is abundant in Kent but quite sparse in the Continental zones - most probably because it has not survived well. This problem also affects the human depositional evidence catalogued in Chapter 6, particularly in Flanders but also in north-eastern Transmanche France. This constricts comparative analysis; however, the abundant data from Kent means that general trends are still discernible. The quality and volume of data relating to the distribution, location, morphology and use of circular monuments in all three zones is far better - as demonstrated in Chapter 7 - mostly due to extensive aerial surveying over several decades. When the datasets are taken as a whole, it becomes possible to successfully apply various forms of comparative analyses. Most remarkably, this has revealed that some monuments apparently have encoded within them a sophisticated and potentially symbolically charged geometric shape. This, along with other less contentious evidence, demonstrates a level of conformity that strongly suggests a stratum of cultural homogeneity existed throughout the Transmanche region during the period 2500 BC - 1500 BC. The fact that such changes as are apparent seem to have developed simultaneously in each of the zones adds additional weight to the theory that contact throughout the Transmanche region was endemic. Even so, it may not have been continuous; there may actually have been times of relative isolation - the data is simply too coarse to eliminate such a possibility.
SUBSUMPTION AS DEVELOPMENT: A WORLD-ECOLOGICAL CRITIQUE OF THE SOUTH KOREAN "MIRACLE"
This work offers a critical reinterpretation of South Korean "economic development" from the perspectives of Marxian form critique and Jason Moore's world-ecology. Against the "production in general" view of economic life that dominates the extant debates, it analyzes the rise, spread, and deepening of capitalism's historically specific social forms in twentieth-century (South) Korea: commodity, wage-labor, value, and capital. Eschewing the binary language of development and underdevelopment, we adopt Marx's non-stagist distinctions regarding the relative degree of labor's (and society's) subsumption under capital: hybrid, formal, and real. Examining the (South) Korean experience across three dialectically interrelated scales – regional, global, and "national" – we outline the historical-geographical contingency surrounding South Korea's emergence by c.1980 as a regime of (industrialized) real subsumption, one of the only non-Western societies ever to do so. Crucial to this was the generalization of commodification and proletarianization that betokened deep structural changes in (South) Korea's class structure, but also a host of often-mentioned issues such as land reform, foreign aid, the developmental state, and a "heaven sent" position within the US-led Cold War order. Despite agreeing on the importance of these latter factors, however, the conclusions we draw from them differ radically from those of the extant analyses. For although regimes of real subsumption are the most materially, socially, and technologically dynamic, they are also the most socio-ecologically unsustainable and alienating due to the dualistic tensions inherent to capital's "fully developed" forms, in particular the temporal grounding of value. US protestations about the generalizability of these relations aside, moreover, these regimes have always been in the extreme minority and, crucially, have depended on less developed societies for their success.
Historically, this has been achieved through widening the net of capitalist value relations; however, four decades of neoliberalization have all but eliminated any further large-scale "frontier strategies" of this sort. Due to its relatively dense population vis-a-vis its geographical size, contemporary South Korea faces stark challenges that render it anything but a model of "sustainable development," and that rather signal the growing anachronism of value as the basis for regulating the future of nature-society relations in the "developed world" and beyond.
The Theatre of Linda Griffiths
Linda Griffiths, actor and playwright, is a charismatic and vital presence on the Toronto theatre scene from the early 1970s until her untimely death in 2014. She travels across Canada and to Broadway, performing Maggie & Pierre after it premieres in the Backspace of Theatre Passe Muraille in 1980. She performs in her final play, Heaven Above, Heaven Below, with Layne Coleman in this same intimate space in 2013. Between these two shows, Griffiths works in theatres across Canada, all the while maintaining her dedication to Theatre Passe Muraille. Her beginnings in collective creation lead her to experiment with process and with the formal composition of her plays, as well as to continuously navigate between her roles as actor and playwright. This dissertation studies the arc of Griffiths's career in order to reposition her in the field.
It explores Griffiths's experiments with form as well as her embodiment and continuation of the spirit and enthusiasm of the alternative theatre movement in Canada. I trace the development of her œuvre as that of a playwright whose creative process travels the arc from collective, to collaboration, to writing solo for backspaces and mainstages, for both intimate venues and large national theatres, ultimately establishing her as one of Canada's most original and vibrant playwrights. This dissertation analyzes Griffiths's career as she discovers her actor-playwright identity, develops her own distinct creative process, and, using her own unique methods, writes and performs meaningful, powerful pieces which imagine new possibilities in women's representation.
I draw on original archival research from the Linda Griffiths fonds held at the University of Guelph as well as archives and papers in the private possession of Layne Coleman, Griffiths's long-term fellow theatre practitioner. Griffiths was a kaleidoscopic creator, and her work necessitates a kaleidoscopic study. The methods of analysis, however, remain focused on archival research into her process. Because of her tendency to write from lived experience and to thoroughly research her subjects, who are often derived from real people, I investigate Griffiths's alchemical methods of transforming truthful material into illusory, fantastical, ephemeral, yet poignant and impactful performances.
Controlled Discovery and Localization of Signals via Bayesian Linear Programming
In many statistical problems, it is necessary to simultaneously discover signals and localize them as precisely as possible. For instance, genetic fine-mapping studies aim to discover causal genetic variants, but the strong local dependence structure of the genome makes it hard to identify the exact locations of those variants. So the statistical task is to output as many regions as possible and have those regions be as small as possible while controlling how many outputted regions contain no signal. The same problem arises in any application where signals cannot be perfectly localized, such as locating stars in astronomical sky surveys and change point detection in time series data. However, there are two competing objectives: maximizing the number of discoveries and minimizing the size of those discoveries (all while controlling false positives), so our first contribution is to propose a unified measure called resolution-adjusted power that formally trades off these two objectives and thus, in principle, can be maximized subject to a constraint on false positives. We take a Bayesian approach, but the resulting posterior optimization problem is non-convex and extremely high-dimensional. Thus our second contribution is Bayesian Linear Programming (BLiP), a method which overcomes this intractability to jointly detect and localize signals in a way that verifiably nearly maximizes the expected resolution-adjusted power while provably controlling false positives. BLiP is very computationally efficient and can wrap around nearly any Bayesian model and algorithm. Applying BLiP on top of existing state-of-the-art analyses of UK Biobank data (for genetic fine-mapping) and the Sloan Digital Sky Survey (for astronomical point source detection) increased resolution-adjusted power by 30-120% in just a few minutes of computation. BLiP is implemented in the new packages pyblip (Python) and blipr (R).

Comment: 57 pages, 20 figures
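The selection problem behind BLiP can be illustrated at toy scale. The brute-force sketch below (not the actual BLiP algorithm, which solves a relaxed linear program) maximizes resolution-adjusted power over disjoint candidate regions subject to an expected false-discovery constraint; the candidate regions and posterior probabilities are hypothetical.

```python
from itertools import combinations

def resolution_adjusted_power(regions, selection):
    """Sum over selected regions of P(signal in R) / |R|: smaller
    regions with higher posterior probability score higher."""
    return sum(regions[i][1] / len(regions[i][0]) for i in selection)

def select_regions(regions, q=0.1):
    """Brute-force stand-in for BLiP's optimization: among all disjoint
    subsets of candidate regions, maximise resolution-adjusted power
    subject to an expected false-discovery-rate constraint.
    regions: list of (set_of_locations, P(region contains a signal))."""
    best, best_power = (), 0.0
    for r in range(1, len(regions) + 1):
        for sel in combinations(range(len(regions)), r):
            locs = [regions[i][0] for i in sel]
            # Selected regions must be disjoint.
            if sum(len(s) for s in locs) != len(set().union(*locs)):
                continue
            # Expected FDR constraint: E[#false] / #selected <= q.
            if sum(1 - regions[i][1] for i in sel) > q * len(sel):
                continue
            power = resolution_adjusted_power(regions, sel)
            if power > best_power:
                best, best_power = sel, power
    return best, best_power
```

This exhaustive search is exponential in the number of candidate regions, which is precisely the intractability that BLiP's linear-programming relaxation is designed to overcome at genome or sky-survey scale.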