9,893 research outputs found

    Towards Autonomous Selective Harvesting: A Review of Robot Perception, Robot Design, Motion Planning and Control

    Full text link
    This paper provides an overview of the current state-of-the-art in selective harvesting robots (SHRs) and their potential for addressing the challenges of global food production. SHRs have the potential to increase productivity, reduce labour costs, and minimise food waste by selectively harvesting only ripe fruits and vegetables. The paper discusses the main components of SHRs, including perception, grasping, cutting, motion planning, and control. It also highlights the challenges in developing SHR technologies, particularly in the areas of robot design, motion planning, and control. The paper further discusses the potential benefits of integrating AI, soft robotics, and data-driven methods to enhance the performance and robustness of SHR systems. Finally, the paper identifies several open research questions in the field and highlights the need for further research and development efforts to advance SHR technologies to meet the challenges of global food production. Overall, this paper provides a starting point for researchers and practitioners interested in developing SHRs and highlights the need for more research in this field.
    Comment: Preprint; to appear in the Journal of Field Robotics.

    Impact of Microstructure of Nanoscale Magnetron Sputtered Ru/Al Multilayers on Thermally Induced Phase Formation

    Get PDF
    In this study, we report on phase formation and microstructure evolution in nanoscale magnetron sputtered Ru/Al multilayers upon thermal annealing in vacuum at slow heating rates of 10 K/min. By specifically adjusting the microstructure and design of the as-deposited multilayers, the formation of certain desired phases can be tuned. We demonstrate that the synthesis of single-phase RuAl thin films is possible in a controlled manner in the solid state via thermal activation alone, without initiating the self-propagating exothermic reactions of Ru/Al multilayers. To investigate phase formation sequences and the resulting microstructures, Ru/Al multilayers were deposited via magnetron sputtering with systematic variation of bilayer modulation periods and subsequently vacuum annealed. Thin film samples were characterized by in situ high-temperature XRD as well as TEM imaging and diffraction. It is shown that different phase sequences appear in strong correlation with the modulation length. Depending on the multilayer design, the phase formation toward single-phase RuAl thin films occurs as either a multi-step or a single-step event. In particular, below a critical threshold of the modulation period, the multi-step phase formation can be suppressed, and only the desired RuAl target phase is obtained, with pronounced growth in a preferred orientation. This finding may prove useful for the targeted synthesis of intermetallic phases, contributing to a further understanding of phase formation in such nanoscale multilayer systems.

    Pretrained Embeddings for E-commerce Machine Learning: When it Fails and Why?

    Full text link
    The use of pretrained embeddings has become widespread in modern e-commerce machine learning (ML) systems. In practice, however, we have encountered several key issues when using pretrained embeddings in a real-world production system, many of which cannot be fully explained by current knowledge. Unfortunately, we find that there is a lack of a thorough understanding of how pretrained embeddings work, especially their intrinsic properties and interactions with downstream tasks. Consequently, it becomes challenging to make interactive and scalable decisions regarding the use of pretrained embeddings in practice. Our investigation leads to two significant discoveries about using pretrained embeddings in e-commerce applications. First, we find that the design of the pretraining and downstream models, particularly how they encode and decode information via embedding vectors, can have a profound impact. Second, we establish a principled perspective on pretrained embeddings through the lens of kernel analysis, which can be used to evaluate their predictability, interactively and scalably. These findings help to address the practical challenges we faced and offer valuable guidance for the successful adoption of pretrained embeddings in real-world production. Our conclusions are backed by solid theoretical reasoning, benchmark experiments, and online testing.
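    As a concrete illustration of the kernel-analysis viewpoint (a minimal sketch under our own assumptions, not the paper's exact procedure), one can treat frozen pretrained embeddings as inputs to a kernel model and use its cross-validated score as a cheap estimate of how predictable a downstream target is from the embedding geometry alone:

    # Illustrative probe: score how well a downstream target can be predicted
    # from frozen pretrained embeddings by fitting a kernel model on them.
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.model_selection import cross_val_score

    def embedding_predictability(embeddings, labels):
        # Cross-validated R^2 of kernel ridge regression on the embeddings;
        # a low score suggests the embedding geometry carries little signal
        # for this task, whatever downstream model is stacked on top.
        model = KernelRidge(kernel="rbf", alpha=1.0)
        return cross_val_score(model, embeddings, labels, cv=5).mean()

    # Toy usage with stand-in data (500 items, 64-dim embeddings):
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 64))
    y = X[:, :8].sum(axis=1) + rng.normal(scale=0.1, size=500)
    print(f"cross-validated predictability: {embedding_predictability(X, y):.3f}")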

    The place where curses are manufactured: four poets of the Vietnam War

    Get PDF
    The Vietnam War was unique among American wars. To pinpoint its uniqueness, it was necessary to look for a non-American voice that would enable me to articulate its distinctiveness and explore the American character as observed by an Asian. Takeshi Kaiko proved to be most helpful. From his novel, Into a Black Sun, I was able to establish a working pair of 'bookends' from which to approach the poetry of Walter McDonald, Bruce Weigl, Basil T. Paquet and Steve Mason. Chapter One is devoted to those seemingly mismatched 'bookends,' Walt Whitman and General William C. Westmoreland, and their respective anthropocentric and technocentric visions of progress and the peculiarly American concept of the "open road" as they manifest themselves in Vietnam. In Chapter Two, I analyze the war poems of Walter McDonald. A pilot writing primarily about flying, McDonald produces poetry that manifests General Westmoreland's technocentric vision of the 'road' as determined by and manifest through technology. Chapter Three focuses on the poems of Bruce Weigl. The poems analyzed portray the literal and metaphorical descent from the technocentric, 'numbed' distance of aerial warfare to the world of ground warfare, and the initiation of a 'fucking new guy,' who discovers the contours of the self's interior through a set of experiences that lead from aerial insertion into the jungle to the degradation of burning human feces. Chapter Four, devoted to the thirteen poems of Basil T. Paquet, focuses on the continuation of the descent begun in Chapter Two. In his capacity as a medic, Paquet tended the maimed, the mortally wounded and the dead, and his entire body of poems details these quotidian tasks. The final chapter deals with Steve Mason's Johnny's Song and his depiction of the plight of Vietnam veterans back in "The World" who are still trapped inside the interior landscape of their individual "ghettoes" of the soul created by their war-time experiences.

    Image classification over unknown and anomalous domains

    Get PDF
    A longstanding goal in computer vision research is to develop methods that are simultaneously applicable to a broad range of prediction problems. In contrast to this, models often perform best when they are specialized to some task or data type. This thesis investigates the challenges of learning models that generalize well over multiple unknown or anomalous modes and domains in data, and presents new solutions for learning robustly in this setting. Initial investigations focus on normalization for distributions that contain multiple sources (e.g. images in different styles like cartoons or photos). Experiments demonstrate the extent to which existing modules, batch normalization in particular, struggle with such heterogeneous data, and a new solution is proposed that can better handle data from multiple visual modes, using differing sample statistics for each. While ideas to counter the overspecialization of models have been formulated in sub-disciplines of transfer learning, e.g. multi-domain and multi-task learning, these usually rely on the existence of meta information, such as task or domain labels. Relaxing this assumption gives rise to a new transfer learning setting, called latent domain learning in this thesis, in which training and inference are carried out over data from multiple visual domains, without domain-level annotations. Customized solutions are required for this, as the performance of standard models degrades: a new data augmentation technique that interpolates between latent domains in an unsupervised way is presented, alongside a dedicated module that sparsely accounts for hidden domains in data, without requiring domain labels to do so. In addition, the thesis studies the problem of classifying previously unseen or anomalous modes in data, a fundamental problem in one-class learning, and anomaly detection in particular. While recent work has focused on developing self-supervised solutions for the one-class setting, this thesis formulates new methods based on transfer learning. Extensive experimental evidence demonstrates that a transfer-based perspective benefits new problems that have recently been proposed in the anomaly detection literature, in particular challenging semantic detection tasks.
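    To make the per-mode normalization idea concrete, the following is a minimal sketch (assuming PyTorch and, unlike the thesis's latent domain setting, mode labels that are known in advance) of keeping separate batch normalization statistics per visual mode rather than pooling statistics across heterogeneous data:

    # Minimal sketch: one BatchNorm per known visual mode, so sample
    # statistics are never pooled across visually distinct distributions.
    import torch
    import torch.nn as nn

    class PerModeNorm(nn.Module):
        def __init__(self, num_features, num_modes):
            super().__init__()
            self.norms = nn.ModuleList(
                [nn.BatchNorm2d(num_features) for _ in range(num_modes)]
            )

        def forward(self, x, mode_ids):
            # Route each sample to the normalizer of its own mode.
            out = torch.empty_like(x)
            for m, norm in enumerate(self.norms):
                idx = (mode_ids == m).nonzero(as_tuple=True)[0]
                if idx.numel() > 0:
                    out[idx] = norm(x[idx])
            return out

    # Toy usage: one batch mixing two modes (e.g. photos and cartoons).
    x = torch.randn(8, 16, 32, 32)
    mode_ids = torch.tensor([0, 0, 1, 1, 0, 1, 0, 1])
    y = PerModeNorm(16, num_modes=2)(x, mode_ids)

    The thesis's latent domain module must additionally infer the mode assignment itself, since domain labels are unavailable in that setting.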

    Data-to-text generation with neural planning

    Get PDF
    In this thesis, we consider the task of data-to-text generation, which takes non-linguistic structures as input and produces textual output. The inputs can take the form of database tables, spreadsheets, charts, and so on. The main application of data-to-text generation is to present information in a textual format, making it accessible to a layperson who may otherwise find it problematic to understand numerical figures. The task can also automate routine document generation jobs, thus improving human efficiency. We focus on generating long-form text, i.e., documents with multiple paragraphs. Recent approaches to data-to-text generation have adopted the very successful encoder-decoder architecture or its variants. These models generate fluent (but often imprecise) text and perform quite poorly at selecting appropriate content and ordering it coherently. This thesis focuses on overcoming these issues by integrating content planning with neural models. We hypothesize that data-to-text generation will benefit from explicit planning, which manifests itself in (a) micro planning, (b) latent entity planning, and (c) macro planning. Throughout this thesis, we assume the inputs to our generator are tables (with records) in the sports domain, and the outputs are summaries describing what happened in the game (e.g., who won/lost, …, scored, etc.). We first describe our work on integrating fine-grained or micro plans with data-to-text generation. As part of this, we generate a micro plan highlighting which records should be mentioned and in which order, and then generate the document while taking the micro plan into account. We then show how data-to-text generation can benefit from higher-level latent entity planning. Here, we make use of entity-specific representations which are dynamically updated. The text is generated conditioned on entity representations and the records corresponding to the entities by using hierarchical attention at each time step. We then combine planning with the high-level organization of entities, events, and their interactions. Such coarse-grained macro plans are learnt from data and given as input to the generator. Finally, we present work on making macro plans latent while incrementally generating a document paragraph by paragraph. We infer latent plans sequentially with a structured variational model while interleaving the steps of planning and generation. Text is generated by conditioning on previous variational decisions and previously generated text. Overall, our results show that planning makes data-to-text generation more interpretable, improves the factuality and coherence of the generated documents, and reduces redundancy in the output document.
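    As a schematic of the plan-then-generate decomposition (all names are hypothetical stand-ins; the thesis learns both stages as neural components), content selection and ordering first produce a micro plan, which a realiser then verbalises:

    # Schematic two-stage pipeline: select and order records (micro plan),
    # then verbalise the plan. Scorer and realiser stand in for learned models.
    from dataclasses import dataclass

    @dataclass
    class Record:
        entity: str   # e.g. a team or player
        field: str    # e.g. "points"
        value: str

    def make_micro_plan(records, scorer, k=5):
        # Content selection + ordering: keep the k most salient records,
        # in descending order of salience.
        return sorted(records, key=scorer, reverse=True)[:k]

    def generate(plan, realiser):
        # Surface realisation conditioned on the plan; in the thesis this
        # is a neural decoder attending to the plan representation.
        return " ".join(realiser(r) for r in plan)

    # Toy usage with a hand-written scorer and template realiser:
    records = [Record("Hawks", "points", "108"),
               Record("Magic", "points", "98"),
               Record("Millsap", "points", "21")]
    plan = make_micro_plan(records, scorer=lambda r: int(r.value), k=2)
    print(generate(plan, lambda r: f"{r.entity} scored {r.value} points."))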

    Increased lifetime of Organic Photovoltaics (OPVs) and the impact of degradation, efficiency and costs in the LCOE of Emerging PVs

    Get PDF
    Emerging photovoltaic (PV) technologies such as organic photovoltaics (OPVs) and perovskites (PVKs) have the potential to disrupt the PV market due to their ease of fabrication (compatible with cheap roll-to-roll processing) and installation, as well as their significant efficiency improvements in recent years. However, rapid degradation is still an issue in many emerging PVs, which must be addressed to enable their commercialisation. This thesis presents an OPV lifetime-enhancing technique based on adding the insulating polymer PMMA to the active layer, and a novel model for quantifying the impact of degradation (alongside efficiency and cost) on the levelized cost of energy (LCOE) of real-world emerging PV installations. The effect of PMMA morphology on the success of the ternary strategy was investigated, leading to device design guidelines. It was found that increasing either the weight percent (wt%) or the molecular weight (MW) of PMMA resulted in an increase in the volume of PMMA-rich islands, which protected the OPV against water and oxygen ingress. It was also found that adding PMMA can be effective in enhancing the lifetime of different active material combinations, although not to the same extent, and that processing additives can have a negative impact on device lifetime. A novel model was developed taking into account realistic degradation profiles sourced from a literature review of state-of-the-art OPV and PVK devices. It was found that the optimal strategies for improving LCOE depend on the present characteristics of a device, and that panels with a good balance of efficiency and degradation outperformed panels with higher efficiency but also higher degradation. Further, low-cost locations benefited more from reductions in degradation rate and module cost, whilst high-cost locations benefited more from improvements in initial efficiency, lower discount rates, and reductions in installation costs.
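    For orientation, the levelized cost of energy underlying such a model is conventionally defined as the ratio of discounted lifetime costs to discounted lifetime energy yield; a minimal form with a constant annual degradation rate (an illustrative assumption; the thesis instead uses realistic degradation profiles from the literature) is

    \mathrm{LCOE} = \frac{C_0 + \sum_{t=1}^{N} C_t/(1+i)^t}{\sum_{t=1}^{N} E_0\,(1 - r_d)^t/(1+i)^t}

    where C_0 is the upfront installation cost, C_t the operating cost in year t, E_0 the first-year energy yield, r_d the annual degradation rate, i the discount rate, and N the system lifetime in years. In this form a higher r_d shrinks every term of the denominator, which is why sustained degradation can outweigh a nominally higher initial efficiency E_0.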

    Defining Service Level Agreements in Serverless Computing

    Get PDF
    The emergence of serverless computing has brought significant advancements to the delivery of computing resources to cloud users. With the abstraction of infrastructure, ecosystem, and execution environments, users could focus on their code while relying on the cloud provider to manage the abstracted layers. In addition, desirable features such as autoscaling and high availability became a provider's responsibility and can be adopted by the user's application at no extra overhead. Despite such advancements, significant challenges must be overcome as applications transition from monolithic stand-alone deployments to the ephemeral and stateless microservice model of serverless computing. These challenges pertain to the uniqueness of the conceptual and implementation models of serverless computing. One of the notable challenges is the complexity of defining Service Level Agreements (SLA) for serverless functions. As the serverless model shifts the administration of resources, ecosystem, and execution layers to the provider, users become mere consumers of the provider's abstracted platform with no insight into its performance. Suboptimal conditions of the abstracted layers are not visible to the end-user, who has no means to assess their performance. Thus, SLA in serverless computing must take into consideration the unique abstraction of its model. This work investigates SLA modeling for the executions of serverless functions and serverless chains. We highlight how serverless SLA fundamentally differs from earlier cloud delivery models. We then propose an approach to define SLA for serverless functions by utilizing resource utilization fingerprints of functions' executions, together with a method to assess whether executions adhere to that SLA. We evaluate the approach's accuracy in detecting SLA violations for a broad range of serverless application categories. Our validation results illustrate a high accuracy in detecting SLA violations resulting from resource contentions and degradations of the provider's ecosystem. We conclude by presenting the empirical validation of our proposed approach, which could detect Execution-SLA violations with up to 99% accuracy.
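    As a rough illustration of the fingerprinting idea (the names and the z-score thresholding scheme here are our own assumptions, not the paper's exact method), a function's healthy executions can be summarised into per-metric statistics against which new executions are checked:

    # Minimal sketch: summarise healthy executions of a serverless function
    # as mean/std over resource-utilisation metrics, then flag executions
    # that drift too far from that fingerprint.
    import numpy as np

    def build_fingerprint(healthy_runs):
        # healthy_runs: shape (n_runs, n_metrics), e.g. columns for CPU
        # time, peak memory, and wall-clock duration per execution.
        runs = np.asarray(healthy_runs, dtype=float)
        return runs.mean(axis=0), runs.std(axis=0) + 1e-9

    def violates_sla(execution, fingerprint, z_max=3.0):
        # Flag an execution whose z-score exceeds z_max on any metric,
        # e.g. a slowdown caused by contention on the provider side.
        mean, std = fingerprint
        z = np.abs((np.asarray(execution, dtype=float) - mean) / std)
        return bool((z > z_max).any())

    # Hypothetical usage: 100 healthy runs over 3 metrics.
    rng = np.random.default_rng(1)
    healthy = rng.normal(loc=[50.0, 128.0, 200.0],
                         scale=[2.0, 4.0, 10.0], size=(100, 3))
    fp = build_fingerprint(healthy)
    print(violates_sla([51.0, 130.0, 205.0], fp))   # nominal run -> False
    print(violates_sla([50.0, 129.0, 420.0], fp))   # 2x duration -> True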

    A productive response to legacy system petrification

    Get PDF
    Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system progressively fall further and further behind their evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process - termed Productive Migration - that homes in upon the specific causes of petrification within each particular legacy system and provides guidance upon how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system, and lead us to suitable antidote productive patterns via which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets