
    Marine species distributions : from data to predictive models

    The increased anthropogenic pressure on the marine environment through over-use and overfishing, invasion of species and global climate change has led to an urgent need for more knowledge of the marine ecosystem. Marine species distribution modelling is an important element of marine ecosystem management. Marine spatial planning relies on it for, e.g., predicting biological resources, designing marine protected areas, designating essential fish habitats, assessing species invasion risk, pest control, and preventing human-animal conflict. This study aims to improve and contribute to the process and understanding of marine species distribution modelling in order to facilitate an in-depth study of the trends, vectors and distribution of introduced seaweeds in Europe. More specifically, we wanted to 1) provide quality indicators for the marine species distribution data available in the Ocean Biogeographic Information System (OBIS), 2) make global datasets for species distribution modelling in past, current and future climates more accessible in R, 3) explore the relevance of different predictors of marine species distributions with MarineSPEED, a marine benchmark dataset of more than 500 species, 4) investigate the introduction history and trends of introduced seaweeds in Europe, 5) evaluate the risk of the aquarium trade as a vector for future introductions of seaweeds, and 6) study the ability of species distribution modelling to predict the introduction and spread of introduced seaweeds and propose a method for identifying candidate areas for further spread under climate change. The first part of this thesis concerns general aspects of marine species distributions, the environmental data used for modelling and the relevance of marine predictors of species distributions.
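    The modelling step described above can be sketched in miniature. The example below fits a plain presence/background logistic regression on two synthetic environmental predictors; the data, predictors and training settings are invented for illustration and are not taken from the thesis (which works with OBIS data and richer models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two environmental predictors (e.g. temperature
# and salinity) at 200 sampling points, plus presence/absence labels
# generated from a noisy linear response -- all invented for illustration.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(float)

# Minimal logistic-regression species distribution model, fitted by
# batch gradient descent on the mean log-loss.
Xb = np.hstack([np.ones((200, 1)), X])        # prepend an intercept column
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))         # predicted occurrence probability
    w -= 0.1 * Xb.T @ (p - y) / len(y)        # gradient step

p = 1.0 / (1.0 + np.exp(-Xb @ w))
accuracy = float(((p > 0.5) == y).mean())
print(accuracy > 0.7)
```

    Real marine SDMs add cross-validation, spatial bias correction and many more predictors, but the fit-then-predict loop has this shape.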

    Flat Cellular (UMTS) Networks

    Traditionally, cellular systems have been built in a hierarchical manner: many specialized cellular access network elements that collectively form a hierarchical cellular system. When 2G and later 3G systems were designed there was a good reason to make the system hierarchical: from a cost perspective it was better to concentrate traffic and to share the cost of processing equipment over a large set of users while keeping the base stations relatively cheap. However, we believe the economic reasons for designing cellular systems in a hierarchical manner have disappeared; in fact, hierarchical architectures hinder future efficient deployments. In this paper, we argue for completely flat cellular wireless systems, which need just one type of specialized network element to provide radio access network (RAN) functionality, supplemented by standard IP-based network elements to form a cellular network. While the reason for building a cellular system in a hierarchical fashion has disappeared, there are other good reasons to make the system architecture flat: (1) as wireless transmission techniques evolve into hybrid ARQ systems, there is less need for a hierarchical cellular system to support spatial diversity; (2) we foresee that future cellular networks are part of the Internet, while hierarchical systems typically use interfaces between network elements that are specific to cellular standards or proprietary; at best such systems use IP as a transport medium, not as a core component; (3) a flat cellular system can be self-scaling while a hierarchical system has inherent scaling issues; (4) moving all access technologies to the edge of the network makes it easier to converge access technologies into a common packet core; and (5) using an IP common core makes the cellular network part of the Internet.

    Lignocellulosic feedstocks: research progress and challenges in optimising biomass quality and yield

    Lignocellulosic biomass derived from energy crops and agricultural residues is a promising renewable source for the production of transportation fuels and bio-based materials. Plants exhibiting C4 photosynthesis are amongst the most promising dedicated energy crops as they possess tremendous intrinsic efficiency in converting solar energy to biomass. Van der Weijde et al. (2013) provide an excellent overview of the potential of five C4 grasses from the Panicoideae clade (maize, Miscanthus, sorghum, sugarcane, and switchgrass) as lignocellulosic feedstock for the production of biofuels. The authors discuss yield potential, biomass quality and genetic improvement of dual-purpose food and energy cultivars and dedicated energy cultivars through plant breeding and also highlight several research needs. Perennial growth habit provides a number of environmental advantages over annuals as bioenergy crops, including the requirement o

    Quantum computational finance: martingale asset pricing for incomplete markets

    A derivative is a financial security whose value is a function of underlying traded assets and market outcomes. Pricing a financial derivative involves setting up a market model, finding a martingale ("fair game") probability measure for the model from the given asset prices, and using that probability measure to price the derivative. When the number of underlying assets and/or the number of market outcomes in the model is large, pricing can be computationally demanding. We show that a variety of quantum techniques can be applied to the pricing problem in finance, with a particular focus on incomplete markets. We discuss three different methods that are distinct from previous works: they do not use the quantum algorithms for Monte Carlo estimation, and they extract the martingale measure from market variables akin to bootstrapping, a common practice among financial institutions. The first two methods are based on a formulation of the pricing problem as a linear program and use, respectively, the quantum zero-sum game algorithm and the quantum simplex algorithm as subroutines. For the last algorithm, we formalize a new market assumption, milder than market completeness, under which quantum linear systems solvers can be applied with the associated potential for large speedups. As a prototype use case, we conduct numerical experiments in the framework of the Black-Scholes-Merton model.
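    The linear-program route can be illustrated classically. The toy one-period market below (payoffs and prices invented for the example, not taken from the paper) extracts martingale measures with an off-the-shelf LP solver and reports the resulting no-arbitrage price interval, which is an interval rather than a point precisely because the market is incomplete:

```python
import numpy as np
from scipy.optimize import linprog

# Toy one-period market with 3 outcomes and 2 traded assets
# (a bond and a stock); zero interest rate for simplicity.
payoffs = np.array([[1.0, 1.0, 1.0],     # bond pays 1 in every outcome
                    [0.5, 1.0, 2.0]])    # stock payoff per outcome
prices = np.array([1.0, 1.0])            # current asset prices

# Derivative to price: a call on the stock with strike 1.
deriv = np.maximum(payoffs[1] - 1.0, 0.0)

# A martingale measure q >= 0 satisfies payoffs @ q = prices (the bond
# row also forces q to sum to 1). With 3 outcomes and only 2 assets the
# market is incomplete, so we optimise E_q[deriv] over all such q.
results = []
for sign in (+1, -1):                    # min, then max, of E_q[deriv]
    res = linprog(sign * deriv, A_eq=payoffs, b_eq=prices,
                  bounds=[(0, None)] * 3)
    results.append(sign * res.fun)
low, high = results
print(low, high)                         # the no-arbitrage price interval
```

    Here every martingale measure has the form q = (2t, 1 - 3t, t) with 0 <= t <= 1/3, so the call's price interval is [0, 1/3].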

    An analog quantum variational embedding classifier

    Quantum machine learning has the potential to provide powerful algorithms for artificial intelligence. The pursuit of quantum advantage in quantum machine learning is an active area of research. For current noisy intermediate-scale quantum (NISQ) computers, various quantum-classical hybrid algorithms have been proposed. One such previously proposed hybrid algorithm is a gate-based variational embedding classifier, which is composed of a classical neural network and a parameterized gate-based quantum circuit. We propose a quantum variational embedding classifier based on an analog quantum computer, where control signals vary continuously in time; our particular focus is implementation using quantum annealers. In our algorithm, the classical data is transformed into the parameters of the time-varying Hamiltonian of the analog quantum computer by a linear transformation. The nonlinearity needed for a nonlinear classification problem is provided purely by the analog quantum computer, through the nonlinear dependence of the final quantum state on the control parameters of the Hamiltonian. We performed numerical simulations that demonstrate the effectiveness of our algorithm for performing binary and multi-class classification on linearly inseparable datasets such as concentric circles and MNIST digits. Our algorithm performs much better than classical linear classifiers. We find that the performance of our classifier can be increased by increasing the number of qubits. Our algorithm presents the possibility of using current quantum annealers for solving practical machine-learning problems, and it could also be useful for exploring quantum advantage in quantum machine learning.
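    A minimal sketch of the encoding idea, for a single qubit with an identity data-to-control map (both simplifications invented here, not the paper's setup): a data point is mapped linearly to Hamiltonian coefficients, the qubit evolves under that Hamiltonian, and the measured expectation of Z in the final state depends nonlinearly on the point:

```python
import numpy as np
from scipy.linalg import expm

# Single-qubit Pauli matrices.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def embed_and_measure(point, W, T=1.0):
    """Map a 2-D data point linearly to Hamiltonian coefficients (a, b),
    evolve |0> under H = a*X + b*Z for time T, and return <Z> of the
    final state -- a nonlinear function of the input point."""
    a, b = W @ point
    H = a * sx + b * sz
    psi = expm(-1j * T * H) @ np.array([1.0, 0.0], dtype=complex)
    return float(np.real(psi.conj() @ (sz @ psi)))

W = np.eye(2)                                        # hypothetical, untrained map
inner = embed_and_measure(np.array([0.1, 0.0]), W)   # point near the origin
outer = embed_and_measure(np.array([1.5, 0.0]), W)   # point farther out
print(inner > outer)
```

    For these two points <Z> is close to +1 and -1 respectively, so a simple threshold already separates the two radii even though <Z> is a nonlinear function of the input; training the data-to-control map (and using more qubits) is what the full classifier adds.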

    Neural Networks for Programming Quantum Annealers

    Quantum machine learning has the potential to enable advances in artificial intelligence, such as solving problems intractable on classical computers. Some fundamental ideas behind quantum machine learning are similar to kernel methods in classical machine learning: both process information by mapping it into high-dimensional vector spaces without explicitly calculating their numerical values. We explore a setup for performing classification on labeled classical datasets, consisting of a classical neural network connected to a quantum annealer. The neural network programs the quantum annealer's controls and thereby maps the annealer's initial states into new states in the Hilbert space. The neural network's parameters are optimized to maximize the distance between states corresponding to inputs from different classes and minimize the distance between quantum states corresponding to the same class. Recent literature showed that at least some of the "learning" is due to the quantum annealer, by connecting a small linear network to a quantum annealer and using it to learn small, linearly inseparable datasets. In this study, we consider a similar but not identical case, in which a fully fledged classical neural network is connected to a small quantum annealer. In such a setting, the classical network already has built-in nonlinearity and learning power and can handle the classification problem alone, so we want to see whether an additional quantum layer could boost its performance. We simulate this system to learn several common datasets, including those for image and sound recognition. We conclude that adding a small quantum annealer does not provide a significant benefit over just using a regular (nonlinear) classical neural network.
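    The training objective described above (pull same-class states together, push different-class states apart) can be sketched with pairwise state fidelities; the loss form and the toy states below are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def embedding_loss(states, labels):
    """Average pairwise penalty over embedded quantum states: same-class
    pairs are penalized for low fidelity, different-class pairs for high
    fidelity (fidelity |<psi_i|psi_j>|^2 is 1 for identical states)."""
    loss, n = 0.0, len(states)
    for i in range(n):
        for j in range(i + 1, n):
            fid = abs(np.vdot(states[i], states[j])) ** 2
            loss += (1.0 - fid) if labels[i] == labels[j] else fid
    return loss / (n * (n - 1) / 2)

# Two well-separated classes of single-qubit states give zero loss...
up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
good = embedding_loss([up, up, down, down], [0, 0, 1, 1])
# ...while states scrambled across the classes give a large one.
bad = embedding_loss([up, down, up, down], [0, 0, 1, 1])
print(good < bad)
```

    Optimizing the classical network's parameters to drive this kind of loss down is what "programs" the annealer's embedding.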

    When Two Worlds Collide: A Rare Case of Multiple Myeloma With Extramedullary Plasmacytoma

    In this case report, we discuss the presentation, diagnosis, and management of a 67-year-old man with stage II multiple myeloma and concurrent biopsy-proven bone plasmacytoma, and why it is important to understand the molecular intricacies of these disorders. We emphasize the critical role of radiology in identifying, characterizing, and managing these lesions. Furthermore, we shed light on the critical differentiation between solitary extramedullary plasmacytoma and multiple myeloma and discuss treatment modalities for both conditions.

    Tissue microarrays analysis in chondrosarcomas: light microscopy, immunohistochemistry and xenograft study

    Background: Chondrosarcoma (Chs) is the third most frequent primary malignant tumour of bone and can be primary or secondary; the latter results mainly from the malignant transformation of a benign pre-existing tumour. Methods: All the cases diagnosed as Chs (primary tumours, recurrences and/or metastases and xenotransplanted Chs) were collected from the files of our Department. Only cases with paraffin blocks available were selected (32 cases in total). Six tissue microarrays (TMAs) were constructed and all the cases and biopsies were distributed into the following groups: a) only a paraffin block available from primary and/or metastatic tumours (3 TMAs), b) a paraffin block available from primary and/or metastatic tumours as well as from the corresponding nude-mouse xenotransplant (2 TMAs), c) only a paraffin block available from xenotransplanted Chs (1 TMA). All the cases were reclassified; in addition, conventional hematoxylin-eosin staining as well as immunohistochemistry (S100, SOX-9, Ki-67, BCL-2, p53, p16, CK, CD99, Survivin and Caveolin) was analyzed in all the TMAs. Results: The distribution of the cases according to histopathological pattern and tumour location was as follows: fourteen Grade I Chs (all primaries), two primary Grade II Chs, ten Grade III Chs (all primaries), five dedifferentiated Chs (four primaries and one primary with metastasis), and two Chs from cell cultures (Ch Grade III). One recurrent extraskeletal myxoid Chs was included as a control in the TMA. Although the immunohistochemistry results were heterogeneous across the material analyzed, S100, SOX-9, Caveolin and Survivin were the most expressed markers. The number of passages in xenotransplants ranged between 1 and 13. Curiously, implanted Grade I Chs tumours hardly grew, and the number of passages did not exceed one. Conclusion: The study of Chs by means of TMA techniques is very important because it improves the assessment of different antibodies applied in immunohistochemical assays. Xenotransplanted tumours in TMAs improve knowledge concerning the variability in the morphological pattern shown by these tumours during their evolution in nude mice.

    Bio-ORACLE v2.0 : extending marine data layers for bioclimatic modelling

    Motivation: The availability of user-friendly, high-resolution global environmental datasets is crucial for bioclimatic modelling. For terrestrial environments, WorldClim has served this purpose since 2005, but equivalent marine data only became available in 2012, with pioneer initiatives like Bio-ORACLE providing data layers for several ecologically relevant variables. Currently, the available marine data packages have not yet been updated to the most recent Intergovernmental Panel on Climate Change (IPCC) predictions nor to present times, and are mostly restricted to the top surface layer of the oceans, precluding the modelling of a large fraction of the benthic diversity that inhabits deeper habitats. To address this gap, we present a significant update of Bio-ORACLE for new future climate scenarios, present-day conditions and benthic layers (near sea bottom). The reliability of data layers was assessed using a cross-validation framework against in situ quality-controlled data. This test showed a generally good agreement between our data layers and the global climatic patterns. We also provide a package of functions in the R software environment (sdmpredictors) to facilitate listing, extraction and management of data layers and allow easy integration with the available pipelines for bioclimatic modelling. Main types of variable contained: Surface and benthic layers for water temperature, salinity, nutrients, chlorophyll, sea ice, current velocity, phytoplankton, primary productivity, iron and light at bottom. Spatial location and grain: Global at 5 arcmin (c.0.08 degrees or 9.2 km at the equator). Time period and grain: Present (2000-2014) and future (2040-2050 and 2090-2100) environmental conditions based on monthly averages. Major taxa and level of measurement: Marine biodiversity associated with sea surface and epibenthic habitats. 
    Software format: ASCII and TIFF grid formats for geographical information systems and a package of functions developed for the R software environment.
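    As a rough usage sketch (the function name and the upper-left origin convention are assumptions for illustration, not part of the sdmpredictors API or the Bio-ORACLE specification), mapping an occurrence coordinate onto a global 5-arcmin grid looks like:

```python
import numpy as np

# A global 5-arcmin grid: 5 arcmin = 1/12 degree, so 4320 x 2160 cells.
CELLS_PER_DEG = 12

def cell_index(lon, lat):
    """Row/column of the grid cell containing (lon, lat), assuming the
    layer's origin is the upper-left corner at (-180, 90) -- a common
    ASCII-grid convention, assumed here rather than taken from Bio-ORACLE."""
    col = int((lon + 180.0) * CELLS_PER_DEG)
    row = int((90.0 - lat) * CELLS_PER_DEG)
    return row, col

layer = np.zeros((2160, 4320))          # stand-in for a loaded raster layer
row, col = cell_index(2.5, 51.2)        # e.g. a North Sea occurrence record
value = layer[row, col]                 # environmental value at that record
print(row, col)
```

    This nearest-cell lookup is the step that links occurrence records to the environmental layers before any bioclimatic model is fitted.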