
    Weighted entropy and optimal portfolios for risk-averse Kelly investments

    Following a series of works on capital growth investment, we analyse log-optimal portfolios in which the return evaluation includes 'weights' of different outcomes. The results are twofold: (A) under certain conditions, the logarithmic growth rate leads to a supermartingale, and (B) the optimal (martingale) investment strategy is proportional betting. We focus on properties of the optimal portfolios and discuss a number of simple examples extending the well-known Kelly betting scheme. An important restriction is that the investment does not exceed the current capital value and allows the trader to cover the worst possible losses. The paper deals with a class of discrete-time models. A continuous-time extension is the topic of an ongoing study.
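The well-known Kelly scheme that this paper extends can be illustrated with a short numerical sketch (an illustration of the classic result, not of the paper's weighted-entropy generalisation): for a single bet won with probability p at net odds b, the log-optimal fraction is f* = p - (1 - p)/b, which grid maximisation of the expected log growth confirms.

```python
import numpy as np

def log_growth(f, p, b):
    """Expected log growth rate when betting a fraction f of capital
    on an event won with probability p at net odds b."""
    return p * np.log(1 + f * b) + (1 - p) * np.log(1 - f)

p, b = 0.6, 1.0
f_star = p - (1 - p) / b          # classic Kelly fraction: 0.2

# Numerical check: the grid maximiser agrees with the closed form.
fs = np.linspace(0.0, 0.99, 1000)
f_num = fs[np.argmax(log_growth(fs, p, b))]
print(round(f_star, 2), round(f_num, 2))  # 0.2 0.2
```

Betting more than f* raises variance without raising growth; the weighted-entropy setting of the paper modifies the objective but keeps this proportional-betting structure.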

    Constraints on the resistivity of the oceanic lithosphere and asthenosphere from seafloor ocean tidal electromagnetic measurements

    Author Posting. © The Author(s), 2019. This is the author's version of the work. It is posted here by permission of Oxford University Press for personal use, not for redistribution. The definitive version was published in Geophysical Journal International, 219(1), (2019): 464-478, doi:10.1093/gji/ggz315. The electromagnetic (EM) field generated by ocean tidal flow is readily detectable both in satellite magnetic field data and in ocean-bottom measurements of electric and magnetic fields. The availability of accurate charts of tidal currents, constrained by assimilation of modern satellite altimetry data, opens the possibility of using tidal EM fields as a source to image mantle electrical resistivity beneath the ocean basins, as highlighted by the recent success in defining the globally averaged lithosphere–asthenosphere boundary (LAB) with satellite data. In fact, seafloor EM data would be expected to provide better constraints on the structure of the resistive oceanic lithosphere, since the toroidal magnetic mode, which can constrain resistive features, is a significant component of the tidal EM field within the ocean but is absent above the surface (in particular in satellite data). Here we consider this issue in more detail, using a combination of simplified theoretical analysis and 1-D and 3-D numerical modelling to provide a thorough discussion of the sensitivity of satellite and seafloor data to subsurface electrical structure. As part of this effort, and as a step toward 3-D inversion of seafloor tidal data, we have developed a new flexible 3-D spherical-coordinate finite-difference scheme for both global- and regional-scale modelling, with higher-resolution models nested in larger-scale solutions. We use the new 3-D model, together with Monte Carlo simulations of errors in tidal current estimates, to provide a quantitative assessment of errors in the computed tidal EM signal caused by uncertainty in the tidal source.
Over the open ocean this component of error is below 0.01 nT in Bz at satellite height and 0.05 nT in Bx on the seafloor, well below typical signal levels. However, as coastlines are approached, error levels can increase substantially. Both analytical and 3-D modelling demonstrate that the seafloor magnetic field is most sensitive to the lithospheric resistance (the product of resistivity and thickness), and is more weakly influenced (primarily in the phase) by the resistivity of the underlying asthenosphere. Satellite data, which contain only the poloidal magnetic mode, are more sensitive to the conductive asthenosphere but have little sensitivity to lithospheric resistance. For both seafloor and satellite data, changes due to plausible variations in Earth parameters are well above the error levels associated with source uncertainty, at least in the ocean interior. Although the 3-D modelling results are qualitatively consistent with the theoretical analysis, the presence of coastlines and bathymetric variations generates a complex response, confirming that quantitative interpretation of ocean tidal EM fields will require a 3-D treatment. As an illustration of the nested 3-D scheme, seafloor data at five magnetic and seven electric stations in the northeastern Pacific (41°N, 165°W) are fit with trial-and-error forward modelling of a local domain. The simulation results indicate that the lithospheric resistance is roughly 7 × 10⁸ Ω·m². The phase of the seafloor data in this region is inconsistent with a sharp transition between the resistive lithosphere and conductive asthenosphere. This work has been supported by National Natural Science Foundation of China grants 41804072 and 41574104, and NSF grant EAR-1447109. Special thanks to Dr Benjamin Murphy, who provided the conductivity-depth profile for the 1-D Earth model, Dr Min Ding, who provided valuable discussion about the oceanic lithosphere, and Dr Jeffery Love, who provided comments on the stylistics of the manuscript.
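The "lithospheric resistance" referred to above is the depth-integrated (transverse) resistance: the product of resistivity and thickness, summed over layers. A minimal sketch with illustrative layer values (assumed for illustration, not the paper's actual conductivity profile):

```python
# Transverse resistance of a layered resistive lithosphere:
# R = sum(rho_i * h_i) over layers, in ohm-m^2.
layers = [
    (1.0e3, 30e3),   # (resistivity in ohm-m, thickness in m): upper lithosphere
    (1.0e4, 60e3),   # lower lithosphere
]
R = sum(rho * h for rho, h in layers)
print(f"{R:.1e} ohm-m^2")  # 6.3e+08 ohm-m^2, same order as the ~7e8 reported
```

Because the seafloor toroidal-mode field constrains only this product, trade-offs between resistivity and thickness within a layer are expected to remain unresolved.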

    Determinants on an efficient cellulase recycling process for the production of bioethanol from recycled paper sludge under high solid loadings

    Background: In spite of the continuous efforts and investments in the last decades, lignocellulosic ethanol is still not economically competitive with fossil fuels. Optimization is still required in different parts of the process. Namely, the cost-effective usage of enzymes has been pursued by different strategies, one of them being recycling. Results: Cellulase recycling was analyzed in the conversion of Recycled Paper Sludge (RPS) into bioethanol under intensified conditions. Different cocktails were studied regarding thermostability, hydrolysis efficiency, distribution in the multiphasic system and recovery from the solid. Celluclast showed inferior stability at higher temperatures (45–55 °C); nevertheless, its performance at moderate temperatures (40 °C) was slightly superior to that of the other cocktails (ACCELLERASE®1500 and Cellic®CTec2). Celluclast distribution in the solid-liquid medium was also more favorable, enabling recovery of 88 % of the final activity at the end of the process. A Central Composite Design was used to study the influence of solids concentration and enzyme dosage on RPS conversion by Celluclast. Solids concentration showed a significant positive effect on glucose production, with no major limitations found in utilizing high amounts of solids under the studied conditions. Increasing the enzyme loading from 20 to 30 FPU/g cellulose had no significant effect on sugar production, suggesting that 22 % solids and 20 FPU/g cellulose are the best operational conditions towards an intensified process. Applying these, a system of multiple rounds of hydrolysis with enzyme recycling was implemented, allowing steady levels of enzyme activity to be maintained with only 50 % fresh enzyme at each recycling stage. Additionally, interesting levels of solid conversion (70–81 %) were also achieved, leading to considerable improvements in glucose and ethanol production compared with the reports available so far (3.4- and 3.8-fold, respectively).
Conclusions: Enzyme recycling viability depends on the enzyme distribution between the solid and liquid phases at the end of hydrolysis, as well as on enzyme thermostability. Both are critical features to be observed for a judicious choice of enzyme cocktail. This work demonstrates that enzyme recycling in intensified biomass degradation can be achieved through simple means. The process is possibly much more effective at larger scale; hence, novel enzyme formulations favoring this possibility should be developed for industrial usage. This work had the financial support of the Portuguese Foundation for Science and Technology (FCT) under the scope of the strategic funding of the UID/BIO/04469/2013 unit, COMPETE 2020 (POCI-01-0145-FEDER-006684) and the MultiBiorefinery project (POCI-01-0145-FEDER-016403). Furthermore, FCT equally supported the Ph.D. grant of DG (SFRH/BD/88623/2012).
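The steady activity levels described above follow from a simple recurrence (illustrative numbers, not fitted to the paper's data): if a fraction r of the activity survives each round (combining thermal deactivation and solid/liquid losses) and a fresh dose t is added, the activity converges to t / (1 - r).

```python
# Steady-state enzyme activity under recycling: each round keeps a fraction r
# of the current activity and adds a fresh top-up t, both normalised to the
# initial enzyme load.
def recycle(r, t, rounds):
    a = 1.0                 # activity, normalised to the initial load
    history = [a]
    for _ in range(rounds):
        a = r * a + t
        history.append(a)
    return history

# With 50% carryover and 50% fresh enzyme per round, the geometric series
# converges to t / (1 - r) = 1.0: steady activity at half the enzyme cost.
h = recycle(r=0.5, t=0.5, rounds=10)
print(round(h[-1], 3))  # 1.0
```

Higher carryover r (better recovery or a more thermostable cocktail) lowers the fresh dose t needed to hold the same steady activity, which is the economic case for recycling made above.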

    Enriched biodiversity data as a resource and service

    Background: Recent years have seen a surge in projects that produce large volumes of structured, machine-readable biodiversity data. To make these data amenable to processing by generic, open source “data enrichment” workflows, they are increasingly being represented in a variety of standards-compliant interchange formats. Here, we report on an initiative in which software developers and taxonomists came together to address the challenges and highlight the opportunities in the enrichment of such biodiversity data by engaging in intensive, collaborative software development: The Biodiversity Data Enrichment Hackathon. Results: The hackathon brought together 37 participants (including developers and taxonomists, i.e. scientific professionals that gather, identify, name and classify species) from 10 countries: Belgium, Bulgaria, Canada, Finland, Germany, Italy, the Netherlands, New Zealand, the UK, and the US. The participants brought expertise in processing structured data, text mining, development of ontologies, digital identification keys, geographic information systems, niche modeling, natural language processing, provenance annotation, semantic integration, taxonomic name resolution, web service interfaces, workflow tools and visualisation. Most use cases and exemplar data were provided by taxonomists. One goal of the meeting was to facilitate re-use and enhancement of biodiversity knowledge by a broad range of stakeholders, such as taxonomists, systematists, ecologists, niche modelers, informaticians and ontologists. The suggested use cases resulted in nine breakout groups addressing three main themes: i) mobilising heritage biodiversity knowledge; ii) formalising and linking concepts; and iii) addressing interoperability between service platforms. 
Another goal was to further foster a community of experts in biodiversity informatics and to build human links between research projects and institutions, in response to recent calls to further such integration in this research domain. Conclusions: Beyond deriving prototype solutions for each use case, areas of inadequacy were discussed and are being pursued further. It was striking how many possible applications for biodiversity data there were, and how quickly solutions could be put together when the normal constraints to collaboration were broken down for a week. Conversely, mobilising biodiversity knowledge from its silos in heritage literature and natural history collections will continue to require formalisation of the concepts (and the links between them) that define the research domain, as well as increased interoperability between the software platforms that operate on these concepts.

    Modelling Stochastic and Deterministic Behaviours in Virus Infection Dynamics

    Many human infections with viruses such as human immunodeficiency virus type 1 (HIV-1) are characterized by low numbers of founder viruses, for which the random effects and discrete nature of populations have a strong effect on the dynamics, e.g., extinction versus spread. It remains to be established whether HIV transmission is a stochastic process on the whole. In this study, we consider the simplest (so-called 'consensus') virus dynamics model and develop a computational methodology for building an equivalent stochastic model based on a Markov chain, accounting for random interactions between the components. The model is used to study the evolution of the probability densities for the virus and target cell populations. It predicts the probability of infection spread as a function of the number of transmitted viruses. A hybrid algorithm is suggested to compute efficiently the dynamics in a state-space domain characterized by a mix of small and large species densities.
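The extinction-versus-spread dichotomy can be sketched with a much cruder model than the paper's full consensus dynamics (a hypothetical linear birth-death caricature, purely illustrative): each virion replicates at rate b and is cleared at rate d, and infection "spreads" if the virus count reaches a threshold before hitting zero.

```python
import random

# Birth-death caricature of early infection: per-virion replication rate b,
# clearance rate d. Between events, only the ratio b/(b+d) matters for which
# event fires next.
def spreads(v0, b=1.5, d=1.0, threshold=100, rng=random):
    v = v0
    while 0 < v < threshold:
        # The next event is a replication with probability b / (b + d).
        v += 1 if rng.random() < b / (b + d) else -1
    return v >= threshold

def spread_probability(v0, trials=4000, seed=1):
    rng = random.Random(seed)
    return sum(spreads(v0, rng=rng) for _ in range(trials)) / trials

# Branching-process theory gives extinction probability (d/b)**v0, so spread
# probability ~ 1 - (2/3)**v0 here: it rises steeply with founder virus count,
# the qualitative behaviour described above.
for v0 in (1, 3, 5):
    print(v0, round(spread_probability(v0), 2))
```

This is where deterministic ODE models fail for small founder populations: they predict growth whenever b > d, whereas the stochastic model gives a substantial extinction probability for each transmitted virion.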

    Simultaneous saccharification and fermentation of hydrothermal pretreated lignocellulosic biomass: evaluation of process performance under multiple stress conditions

    Industrial lignocellulosic bioethanol processes are exposed to different environmental stresses (such as inhibitor compounds, high temperature, and high solid loadings). In this study, a systematic approach was followed where the liquid and solid fractions were mixed to evaluate the influence of varied solid loadings, and different percentages of liquor were used as the liquid fraction to determine the inhibitor effect. Ethanol production by simultaneous saccharification and fermentation (SSF) of hydrothermally pretreated Eucalyptus globulus wood (EGW) was studied under combined diverse stress operating conditions (30–38 °C, 60–80 g of liquor from hydrothermal treatment or autohydrolysis (containing inhibitor compounds) per 100 g of liquid, and a liquid-to-solid ratio between 4 and 6.4 g liquid in SSF/g unwashed pretreated EGW) using an industrial Saccharomyces cerevisiae strain supplemented with low-cost byproducts derived from the agro-food industry. Evaluation of these variables revealed that the combination of temperature and higher solid loadings was the most significant variable affecting final ethanol concentration and cellulose-to-ethanol conversion, whereas solid and autohydrolysis liquor loadings had the most significant impact on ethanol productivity. After optimization, an ethanol concentration of 54 g/L (corresponding to 85 % conversion and 0.51 g/(L·h) productivity at 96 h) was obtained at 37 °C using 60 % autohydrolysis liquor and 16 % solid loading (liquid-to-solid ratio of 6.4 g/g). The selection of a suitable strain along with nutritional supplementation enabled the production of noticeable ethanol titers under quite restrictive SSF operating conditions, which can reduce operating costs and boost the economic feasibility of lignocellulose-to-ethanol processes. The authors thank the financial support from the Strategic Project UID/BIO/04469/2013 of the CEB Unit and the postdoctoral grant of A. Romaní funded by Xunta de Galicia (Plan I2C, 2014).
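The reported numbers can be cross-checked with standard stoichiometric factors (assumed here, not taken from the paper): hydrolysis yields 1.111 g glucose per g cellulose, and fermentation at most 0.511 g ethanol per g glucose.

```python
# Theoretical maximum ethanol yield from cellulose, g/g.
MAX_YIELD = 1.111 * 0.511  # ~0.568

ethanol = 54.0      # g/L, reported titer
conversion = 0.85   # reported cellulose-to-ethanol conversion

# Implied cellulose concentration in the SSF broth (not stated in the abstract):
cellulose = ethanol / (conversion * MAX_YIELD)
print(round(cellulose, 1))  # 111.9 g/L

# Simple titer-over-time productivity; the reported 0.51 g/(L*h) presumably
# uses a slightly different basis (e.g. a correction for initial conditions).
print(round(ethanol / 96, 2))  # 0.56
```

An implied cellulose load near 112 g/L is consistent with the 16 % solid loading quoted above if the pretreated solid is predominantly cellulose, which hydrothermal pretreatment of EGW would suggest.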

    TaLoS: secure and transparent TLS termination inside SGX enclaves

    We introduce TaLoS, a drop-in replacement for existing transport layer security (TLS) libraries that protects itself from a malicious environment by running inside an Intel SGX trusted execution environment. By minimising the number of enclave transitions and reducing the overhead of the remaining transitions, TaLoS imposes an overhead of no more than 31% in our evaluation with the Apache web server and the Squid proxy.

    Glamdring: automatic application partitioning for Intel SGX

    Trusted execution support in modern CPUs, as offered by Intel SGX enclaves, can protect applications in untrusted environments. While prior work has shown that legacy applications can run in their entirety inside enclaves, this results in a large trusted computing base (TCB). Instead, we explore an approach in which we partition an application and use an enclave to protect only security-sensitive data and functions, thus obtaining a smaller TCB. We describe Glamdring, the first source-level partitioning framework that secures applications written in C using Intel SGX. A developer first annotates security-sensitive application data. Glamdring then automatically partitions the application into untrusted and enclave parts: (i) to preserve data confidentiality, Glamdring uses dataflow analysis to identify functions that may be exposed to sensitive data; (ii) for data integrity, it uses backward slicing to identify functions that may affect sensitive data. Glamdring then places security-sensitive functions inside the enclave, and adds runtime checks and cryptographic operations at the enclave boundary to protect it from attack. Our evaluation of Glamdring with the Memcached store, the LibreSSL library, and the Digital Bitbox bitcoin wallet shows that it achieves small TCB sizes and has acceptable performance overheads.
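The two analyses named above can be caricatured in a few lines (hypothetical call graph and function names, purely illustrative, not Glamdring's actual implementation): functions forward-reachable from annotated sensitive data are enclaved for confidentiality, and functions on backward slices into that data are enclaved for integrity.

```python
# Directed data-flow edges: data moves from each key to the listed functions.
flows = {
    "parse_request": ["lookup_key"],
    "lookup_key": ["decrypt_value"],      # touches sensitive data
    "decrypt_value": ["format_reply"],
    "log_stats": [],
}
sensitive = {"decrypt_value"}  # the developer's annotation

def reachable(graph, start):
    """All nodes reachable from `start` by following graph edges."""
    seen, stack = set(), list(start)
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(graph.get(n, []))
    return seen

# Confidentiality: everything sensitive data can flow *to* (forward closure).
forward = reachable(flows, sensitive)
# Integrity: everything that can flow *into* sensitive functions (backward).
reverse = {n: [m for m in flows if n in flows[m]] for n in flows}
backward = reachable(reverse, sensitive)

enclave = forward | backward
print(sorted(enclave))  # log_stats stays outside the enclave
```

Only `log_stats` remains untrusted here; the payoff grows with the fraction of the application that never touches sensitive data, which is what keeps the TCB small.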