398 research outputs found

    Genetic connectivity patterns in Holothuria mammata considering different spatial scales

    Master's dissertation, Marine Biology, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2015. As a consequence of world population growth, the demand for seafood resources is increasing. The worldwide transition of fisheries from depleted finfish resources to alternative invertebrate species has consequently created a massive industry. Sea cucumber stocks have been overfished in the Indo-Pacific as a result of a lack of effective management, unregulated fisheries, and increasing demand from East Asian markets. The demand for "bêche-de-mer" has driven the worldwide expansion of these fisheries and has resulted in catches of new target species from the Mediterranean Sea and NE Atlantic Ocean. With high commercial value and fragile life-history traits, sea cucumbers are particularly vulnerable to commercial fisheries, especially when there is no scientific knowledge to support their effective management. This study aims to fill some of the gaps in scientific data about sea cucumber populations, namely Holothuria mammata. Genetic diversity and structure, connectivity, and effective population size were assessed with novel polymorphic molecular markers (microsatellites). The analysis was carried out at several spatial scales and took oceanographic patterns into account. Some morphometric traits were also analysed, such as the distribution of length and weight classes. The results showed that Holothuria mammata has globally high genetic diversity, higher genetic connectivity among Atlantic populations, and genetic differentiation between the Atlantic/Mediterranean and the eastern/western Mediterranean basins. Effective population sizes were smaller in the Atlantic, showing some mutation-drift disequilibrium. Oceanographic patterns were strongly correlated with the genetic differentiation patterns. Atlantic populations presented larger individuals (in both length and weight) than Mediterranean ones, clearly associated with environmental conditions and ecological features.
Biometric data and genetic analysis allowed us to identify three potential stocks inhabiting the studied geographic area and to improve the biological knowledge of this new target species. This information will be useful for drafting the first recommendations for effective fishery management and for future comparisons to assess fishery effects at the genetic and/or morphometric level.
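As a rough illustration of the kind of statistics such microsatellite data support, the sketch below computes Nei's gene diversity (expected heterozygosity) and G_ST between populations. The genotype encoding and function names are illustrative, not the thesis's actual analysis pipeline.

```python
from collections import Counter

def allele_freqs(genotypes):
    """Allele frequencies from diploid genotypes, e.g. [(171, 173), (171, 171)]."""
    alleles = [a for g in genotypes for a in g]
    n = len(alleles)
    return {a: c / n for a, c in Counter(alleles).items()}

def expected_heterozygosity(freqs):
    """Nei's gene diversity at one locus: He = 1 - sum(p_i^2)."""
    return 1.0 - sum(p * p for p in freqs.values())

def gst(pops):
    """Nei's G_ST = (Ht - Hs) / Ht across populations at one locus."""
    freqs = [allele_freqs(g) for g in pops]
    hs = sum(expected_heterozygosity(f) for f in freqs) / len(freqs)
    # Ht is computed from allele frequencies averaged over populations
    alleles = set().union(*freqs)
    mean = {a: sum(f.get(a, 0.0) for f in freqs) / len(freqs) for a in alleles}
    ht = expected_heterozygosity(mean)
    return (ht - hs) / ht if ht > 0 else 0.0
```

With identical allele frequencies in every population G_ST approaches 0; with fully private alleles it approaches 1, matching the Atlantic/Mediterranean differentiation signal described above.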

    Equity research - BHP Billiton Ltd.

    Master's in Finance. This work presents a company valuation of BHP Billiton Ltd, prepared as the Final Master's Work Project in Finance at ISEG. The equity research follows the format recommended by the CFA Institute (Pinto, Henry, Robinson & Stowe, 2010) and is issued considering all publicly available information on the company as of October 29th, 2018. The main methodology used to derive the target price is Discounted Cash Flows (DCF). The assumptions considered in the valuation result from a careful analysis of the company's historical data, the industry's main drivers, and future market prospects. Our final recommendation is HOLD, with a price target of $50.39 per share and an upside potential of 11.26%, with medium risk.
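A minimal sketch of the DCF mechanics the report relies on: explicit free cash flows are discounted at the WACC and a Gordon-growth terminal value is added. Function names and figures are illustrative, not the report's actual model.

```python
def dcf_value(fcfs, wacc, terminal_growth):
    """Enterprise value: PV of explicit FCFs plus a Gordon-growth terminal value."""
    pv = sum(fcf / (1 + wacc) ** t for t, fcf in enumerate(fcfs, start=1))
    terminal = fcfs[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    return pv + terminal / (1 + wacc) ** len(fcfs)

def price_per_share(enterprise_value, net_debt, shares):
    """Equity value per share after bridging from enterprise to equity value."""
    return (enterprise_value - net_debt) / shares
```

For example, a single year of 100 in FCF at a 10% WACC and zero terminal growth values the firm at exactly 1000, since the terminal value is a flat perpetuity.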

    Digital content management on multiple displays

    With the generalized use of systems for digital content dissemination comes the opportunity to implement solutions capable of evaluating audience reaction. This dissertation describes the implementation of one such solution. To this end, the development involved adapting a previously functional digital signage system: digital cameras were paired with the content display terminals in order to capture information from the area in front of them. Using computer vision technologies, the terminals detect, in real time, people who appear in the cameras' field of view, and this information is sent to a server for data extraction. On the server, methods are applied to perform face and emotion recognition and to extract head-pose data, which allows the calculation of an attention coefficient. The data are stored in a relational database and can be consulted through a web platform, where they are presented alongside the contents that were on display at the moment of capture and extraction. This solution thus allows evaluating the impact of the digital contents presented by the system. Master's in Computer and Telematics Engineering.
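The abstract does not give the attention-coefficient formula; a plausible head-pose-based score could look like the sketch below, where the angular cutoff and the cosine falloff are assumptions for illustration only.

```python
import math

def attention_coefficient(yaw_deg, pitch_deg, max_angle=60.0):
    """Score in [0, 1]: 1.0 when the head faces the display directly,
    decreasing with combined yaw/pitch deviation and reaching 0 at
    max_angle degrees (cutoff value is an illustrative assumption)."""
    deviation = math.hypot(yaw_deg, pitch_deg)
    if deviation >= max_angle:
        return 0.0
    # cosine falloff rescaled so the score hits 0 exactly at max_angle
    return math.cos(math.radians(deviation) * (90.0 / max_angle))
```

Averaging this score over all detected faces during a content slot would give a per-content attention measure of the kind the platform reports.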

    Improving business value for Portuguese SMEs that adopt Industry 4.0

    Dissertation presented as a partial requirement for obtaining the Master's degree in Information Management, with a specialization in Information Systems and Technologies Management. Many industrial organizations have already begun the Digital Transformation path, with the objective of applying the features and models that Industry 4.0 brought to the industrial field, particularly factories, creating a new context of smart factories expected to provide higher revenue. Alongside Industry 4.0, new types of architecture have emerged that create great challenges in terms of security, control, and interoperability, among others. One of these areas is Edge computing, which combines the best of cloud and on-site computing. With control authority, overall automation management, and cloud-based analytics, edge systems answer the challenges identified as crucial for any Industry 4.0 deployment. The research methodology of this work consists of a comprehensive review and strategic analysis of the existing global literature on those topics, in parallel with interviews and data analysis involving entities representing Portuguese Small and Medium Enterprises in the industrial market. This work aims to promote a more strategic and operational approach to the use of edge computing within organizations and to clarify the existing strengths in this area. This way, an Industry 4.0 strategy can be implemented with effective approaches and planned actions for a direct increase in business value, and a reference model can be created that could be applied in similar organizations. As a result, this work brings knowledge to companies in the same market segment as the study that want to adopt Industry 4.0 initiatives.

    Automatic Completion of Text-based Tasks

    Crowdsourcing is a widespread problem-solving model which consists in assigning tasks to an existing pool of workers in order to solve a problem, being a scalable alternative to hiring a group of experts for labeling high volumes of data. It can provide results of similar quality, with the advantage of reaching such standards in a faster and more efficient manner. Modern approaches to crowdsourcing use Machine Learning models to label the data and ask the crowd to validate the results. Such approaches can only be applied if the data on which the model was trained (source data) and the data that needs labeling (target data) share some relation. Furthermore, since the model is not adapted to the target data, its predictions may contain a substantial number of errors, and validating these predictions can be very time-consuming. In this thesis, we propose an approach that leverages in-domain data, a labeled portion of the target data, to adapt the model; the remainder of the data is labeled based on the model's predictions. The crowd is tasked with generating the in-domain data and validating the model's predictions. Under this approach, we train the model with in-domain data only, and with both in-domain data and data from an outer domain. We apply these learning settings with the intent of optimizing a crowdsourcing pipeline for Natural Language Processing, more concretely for the task of Named Entity Recognition (NER). This optimization targets the effort required by the crowd to perform the NER task. The results of the experiments show that the usage of in-domain data achieves effort savings ranging from 6% to 53%. Furthermore, we observe such savings in nine distinct datasets, which demonstrates the robustness and breadth of application of this approach. In conclusion, the in-domain data approach is capable of optimizing a crowdsourcing pipeline for NER, and it has a broader range of use cases when compared to reusing a model to generate predictions on the target data.
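To illustrate where such savings come from, the toy cost model below compares labeling everything manually with labeling only a seed portion and then correcting the adapted model's errors on the rest. The linear cost model and all parameters are assumptions for illustration, not the thesis's actual effort metric.

```python
def crowd_effort_savings(total_units, seed_units, error_rate, correction_cost=1.0):
    """Fraction of full manual-annotation effort saved when the crowd labels a
    seed portion (used to adapt the model) and only corrects model errors on
    the remainder. Assumes one unit of effort per manual label and
    `correction_cost` units per corrected model error (illustrative model)."""
    remaining = total_units - seed_units
    crowd_effort = seed_units + error_rate * remaining * correction_cost
    return 1.0 - crowd_effort / total_units
```

For instance, with a 10% seed and a 10% model error rate on the remainder, the crowd does 190 units of work instead of 1000, an 81% saving; higher error rates shrink the saving, which is consistent with the 6% to 53% range varying across datasets.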

    A Model for Planning TELCO Work-Field Activities Enabled by Genetic and Ant Colony Algorithms

    Telecommunication Companies (TELCOs) continuously strive for effectiveness in their daily work. Planning the activities of their workers is a crucial, sensitive, and time-consuming task, usually carried out by experts. This plan aims to find an optimized solution that maximizes the number of activities assigned to workers and minimizes the inherent costs (e.g., workers' labor, fuel, and other transportation costs). This paper proposes a model that computes a maximized plan for the activities assigned to workers, alleviating the burden on the existing experts, even when they are supported by software implementing rule-based heuristic models. The proposed model is inspired by nature and relies on two stages supported by Genetic and Ant Colony evolutionary algorithms. In the first stage, a Genetic Algorithm (GA) identifies the optimal set of activities to assign to workers so as to maximize revenue. In the second stage, an Ant Colony algorithm searches for an efficient path among the activities to minimize costs. The conducted experimental work validates the effectiveness of the proposed model in optimizing the planning of TELCO work-field activities in comparison to a rule-based heuristic model.
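A compact sketch of the two-stage idea: a knapsack-style GA selects which activities to serve, and a basic ant colony search then orders them into a short route. The problem encoding, operators, and parameters are illustrative simplifications, not the paper's actual model.

```python
import random

def ga_select_activities(revenues, costs, capacity, pop=30, gens=60, seed=1):
    """Stage 1 (GA): pick a subset of activities maximizing revenue under a
    capacity budget -- a knapsack-style stand-in for the real constraints."""
    rng = random.Random(seed)
    n = len(revenues)

    def fitness(bits):
        cost = sum(c for b, c in zip(bits, costs) if b)
        rev = sum(r for b, r in zip(bits, revenues) if b)
        return rev if cost <= capacity else -1  # penalize infeasible plans

    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]          # elitist selection
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.1:                # bit-flip mutation
                i = rng.randrange(n)
                child[i] = 1 - child[i]
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

def aco_route(dist, ants=20, iters=50, evap=0.5, seed=1):
    """Stage 2 (ACO): search for a short path visiting all selected activities."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]           # pheromone trails
    best, best_len = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:                  # pheromone- and distance-biased step
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                w = [tau[i][j] / (dist[i][j] + 1e-9) for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            length = sum(dist[tour[k]][tour[k + 1]] for k in range(n - 1))
            if length < best_len:
                best, best_len = tour, length
        for row in tau:                           # pheromone evaporation
            for j in range(n):
                row[j] *= 1 - evap
        for k in range(n - 1):                    # reinforce the best path found
            tau[best[k]][best[k + 1]] += 1.0 / (best_len + 1e-9)
    return best, best_len
```

Running the GA over revenue/cost pairs and then feeding the selected activities' travel-distance matrix into `aco_route` mirrors the paper's revenue-then-cost decomposition.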

    Using the cloud in lab classes

    Computation laboratories are an essential part of educational institutions. The management of these laboratories presents some challenges to the institutions and to their IT departments: the laboratories need to be prepared in advance, before the start of the cycle of study, and usually entail high costs. [...] This dissertation presents the development of an application that allows creating and managing virtual computation laboratories using Cloud platforms. The use of virtual laboratories is common in areas such as engineering and computer science. However, most of the existing solutions are proprietary and require some learning effort. [...]
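A minimal sketch of how such an application might orchestrate per-student lab machines. `LabManager` and the provider interface are hypothetical stand-ins for whatever cloud driver the dissertation actually uses; the in-memory provider exists only to make the sketch self-contained.

```python
class InMemoryProvider:
    """Stand-in for a real cloud compute driver (hypothetical interface)."""
    def __init__(self):
        self.nodes = {}

    def create_node(self, name, image, size):
        self.nodes[name] = {"image": image, "size": size, "state": "running"}
        return name

    def destroy_node(self, name):
        self.nodes.pop(name, None)

class LabManager:
    """Creates one identical VM per student and tears the lab down afterwards,
    so machines only incur cost while the class actually runs."""
    def __init__(self, provider):
        self.provider = provider
        self.labs = {}

    def create_lab(self, lab_id, students, image, size):
        self.labs[lab_id] = [
            self.provider.create_node(f"{lab_id}-{s}", image, size) for s in students
        ]

    def destroy_lab(self, lab_id):
        for node in self.labs.pop(lab_id, []):
            self.provider.destroy_node(node)
```

Swapping `InMemoryProvider` for a real driver keeps the lab lifecycle logic unchanged, which is the portability argument for building on a provider abstraction.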

    Application development for Software-Defined Networks in state-of-the-art controllers

    In the last few years, the importance of the internet in our lives has increased considerably. Networks have become a big part of our lives and are set up almost everywhere we go: in our homes, in the workplace, in stores, in universities, in the subway. Each and every one of these places has a network, a router, Wi-Fi, etc. Given this importance, service providers must guarantee a fully operational network, 24 hours a day, leaving no room for mistakes. However, this creates a problem: how can developers test new protocols? No service provider is willing to risk ruining its network because a developer tested a non-working protocol. Researchers who study these networks believe that the main problems of a fully operational network lie essentially in its architecture, as network devices incorporate different and quite complex functions. Major networks, such as those of service providers, are built upon robust architectures able to support large traffic volumes with different characteristics: a service provider can process large amounts of data simultaneously, as well as route and forward traffic. Because these networks have built-in control functions that work in a distributed manner, and because their devices are made by a limited number of manufacturers, they present several limitations. Beyond complexity and configuration, every network must be prepared to deal with potential failures, as well as with security-related problems: a network, regardless of its level of use, must allow its users to use it as safely as possible. Networks today have poor flexibility, and their development, growth, and innovation are far from simple. Thus, providing more diversified services to satisfy users presents a challenge to service providers, since the system and the administration functions are separated.
The answer to these problems lies in Software-Defined Networks (SDN), which seem very promising as far as innovation is concerned, allowing the development of new management and control strategies for networks. These networks use programmable switches and routers that can process packets for several isolated experimental networks simultaneously, through virtualization. The control logic runs in the Control Plane, on servers operating separately from the network devices. This gives the network administrator greater control over the network, as it allows managing different resources by directing them to different traffic flows. An SDN using OpenFlow is capable of reacting to controller failures without slowing the network's response, as it offers great flexibility and helps fight the limitations of existing networks. The main goal of this thesis is to explain how to use this new approach (SDN) and its capabilities. This work will serve as a basis for all who wish to obtain new knowledge about this topic. One of its main focuses is to pinpoint the advantages and disadvantages of SDN with an OpenFlow architecture.
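The separation of control and data planes can be illustrated with the classic OpenFlow learning-switch behavior, sketched here as plain control-plane logic without tying it to any particular controller framework (a minimal sketch, not the thesis's actual application):

```python
class LearningSwitch:
    """Control-plane logic of an OpenFlow-style learning switch: on each
    packet-in event, learn which port the source MAC lives on, then either
    forward to the known destination port or flood when it is unknown."""
    FLOOD = "flood"

    def __init__(self):
        self.mac_to_port = {}

    def packet_in(self, src_mac, dst_mac, in_port):
        self.mac_to_port[src_mac] = in_port           # learn source location
        # known destination -> output port; unknown -> flood all ports
        return self.mac_to_port.get(dst_mac, self.FLOOD)
```

In a real controller (e.g. one speaking the OpenFlow protocol), the decision returned here would be installed on the switch as a flow rule, so subsequent packets of the flow never reach the controller, which is precisely what keeps the data plane fast while the control plane stays programmable.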