54 research outputs found

    BigDL: A Distributed Deep Learning Framework for Big Data

    This paper presents BigDL, a distributed deep learning framework for Apache Spark, which has been used by a variety of users in industry for building deep learning applications on production big data platforms. It allows deep learning applications to run on Apache Hadoop/Spark clusters so as to directly process production data, and to operate as part of an end-to-end data analysis pipeline for deployment and management. Unlike existing deep learning frameworks, BigDL implements distributed, data-parallel training directly on top of the functional compute model (with copy-on-write and coarse-grained operations) of Spark. We also share real-world experience and "war stories" of users that have adopted BigDL to address their challenges (i.e., how to easily build end-to-end data analysis and deep learning pipelines for their production data).
    Comment: In ACM Symposium of Cloud Computing conference (SoCC) 201
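As a minimal, hypothetical sketch of the data-parallel pattern this abstract describes (local gradients computed per data partition, then averaged by a driver before a synchronous update), plain Python stands in here for Spark's RDD operations; all names are illustrative and are not BigDL's API:

```python
# Hypothetical sketch of synchronous data-parallel training: each partition
# computes a local gradient against a broadcast copy of the weights, and the
# driver averages the gradients before updating. Plain Python lists stand in
# for Spark workers.

def local_gradient(w, partition):
    # Gradient of mean squared error for the model y = w * x on one partition.
    g = 0.0
    for x, y in partition:
        g += 2.0 * (w * x - y) * x
    return g / len(partition)

def train(partitions, w=0.0, lr=0.05, steps=200):
    for _ in range(steps):
        # "map": every worker computes a gradient from the broadcast weights.
        grads = [local_gradient(w, p) for p in partitions]
        # "reduce": the driver averages gradients and updates the model.
        w -= lr * sum(grads) / len(grads)
    return w

if __name__ == "__main__":
    # Data follows y = 3x, split across two emulated workers.
    data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
    w = train([data[:2], data[2:]])
    print(round(w, 2))  # converges near 3.0
```

The synchronous average keeps every worker's model identical after each step, which is the property that lets a functional, coarse-grained compute model express training without fine-grained mutable state.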

    Social Internet of Things and New Generation Computing -- A Survey

    The Social Internet of Things (SIoT) tries to overcome challenges of the Internet of Things (IoT), such as scalability, trust and discovery of resources, by drawing inspiration from social computing. This survey investigates the research done on SIoT from two perspectives: application domains, and integration with new computing models. To this end, a two-dimensional framework is proposed and existing projects are investigated accordingly. The first dimension considers and classifies available research from the application-domain perspective, and the second does the same from the standpoint of integration with new computing models. The aim is to technically describe SIoT, to classify related research, to foster the dissemination of the state of the art, and to discuss open research directions in this field.
    Comment: IoT, Social computing, Survey

    Critical analysis of vendor lock-in and its impact on cloud computing migration: a business perspective

    Vendor lock-in is a major barrier to the adoption of cloud computing, due to the lack of standardisation. Current solutions and efforts tackling the vendor lock-in problem are predominantly technology-oriented, and few studies exist that analyse and highlight the complexity of the vendor lock-in problem in the cloud environment. Consequently, most customers are unaware of the proprietary standards which inhibit interoperability and portability of applications when taking services from vendors. This paper provides a critical analysis of the vendor lock-in problem from a business perspective. A survey based on qualitative and quantitative approaches identified the main risk factors that give rise to lock-in situations. The analysis of our survey of 114 participants shows that, as computing resources migrate from on-premise to the cloud, the vendor lock-in problem is exacerbated. Furthermore, the findings exemplify the importance of interoperability, portability and standards in cloud computing. A number of strategies are proposed on how to avoid and mitigate lock-in risks when migrating to cloud computing: attending to contracts; selecting vendors that support standardised formats and protocols for data structures and APIs; and developing awareness of commonalities and dependencies among cloud-based solutions. We strongly believe that implementing these strategies has great potential to reduce the risks of vendor lock-in.

    A Digital Repository and Execution Platform for Interactive Scholarly Publications in Neuroscience

    The CARMEN Virtual Laboratory (VL) is a cloud-based platform that allows neuroscientists to store, share, develop, execute, reproduce and publicise their work. This paper describes new functionality in the CARMEN VL: an interactive publications repository. This new facility allows users to link data and software to publications, enabling other users to examine the data and software associated with a publication and to execute the associated software within the VL using the same data as the authors used. The cloud-based architecture and SaaS (Software as a Service) framework allow vast data sets to be uploaded and analysed using software services. Thus, this new interactive publications facility allows others to build on research results through reuse. This aligns with recent developments by funding agencies, institutions, and publishers in the move to open access research. Open access supports reproducibility and verification of research resources and results. Publications and their associated data and software are assured of long-term preservation and curation in the repository. Further, analysing research data and the evaluations described in publications frequently requires a number of execution stages, many of which are iterative. The VL provides a scientific workflow environment to combine software services into a processing tree, and these workflows can also be associated with publications and executed by users. The VL also provides a secure environment where users can decide the access rights for each resource, to ensure copyright and privacy restrictions are met.
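As an illustration of the processing-tree idea mentioned above, here is a hedged, self-contained sketch (not the CARMEN API; all service names are hypothetical) in which each node applies a service to the outputs of its children, so a stored workflow can be re-executed against the same published data:

```python
# Illustrative sketch of composing software services into a processing tree:
# leaves apply a service to the raw input, and inner nodes combine their
# children's results. None of these names come from CARMEN itself.

class ServiceNode:
    def __init__(self, service, children=()):
        self.service = service        # callable applied at this node
        self.children = list(children)

    def execute(self, data):
        if not self.children:
            return self.service(data)             # leaf: consume raw input
        results = [c.execute(data) for c in self.children]
        return self.service(*results)             # inner node: combine children

# Hypothetical neuroscience-flavoured pipeline: artifact rejection and a
# spike threshold both read the raw recording; a report combines the two.
filtered = ServiceNode(lambda d: [x for x in d if abs(x) < 50])
spikes   = ServiceNode(lambda d: [x for x in d if x > 30])
report   = ServiceNode(lambda clean, s: {"samples": len(clean), "spikes": len(s)},
                       children=[filtered, spikes])

out = report.execute([5, 40, -80, 12, 33, 70])    # {'samples': 4, 'spikes': 3}
```

Because the whole analysis is a data structure rather than a script, it can be stored alongside a publication and re-run later on the archived data, which is the reproducibility property the abstract emphasises.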

    Viruses affect picocyanobacterial abundance and biogeography in the North Pacific Ocean

    The photosynthetic picocyanobacteria Prochlorococcus and Synechococcus are models for dissecting how ecological niches are defined by environmental conditions, but how interactions with bacteriophages affect picocyanobacterial biogeography in open ocean biomes has rarely been assessed. We applied single-virus and single-cell infection approaches to quantify cyanophage abundance and infected picocyanobacteria in 87 surface water samples from five transects that traversed approximately 2,200 km in the North Pacific Ocean on three cruises, with a duration of 2–4 weeks, between 2015 and 2017. We detected a 550-km-wide hotspot of cyanophages and virus-infected picocyanobacteria in the transition zone between the North Pacific Subtropical and Subpolar gyres that was present in each transect. Notably, the hotspot occurred at a consistent temperature and displayed distinct cyanophage-lineage composition on all transects. On two of these transects, the levels of infection in the hotspot were estimated to be sufficient to substantially limit the geographical range of Prochlorococcus. Coincident with the detection of high levels of virally infected picocyanobacteria, we measured an increase of 10–100-fold in the Synechococcus populations in samples that are usually dominated by Prochlorococcus. We developed a multiple regression model of cyanophages, temperature and chlorophyll concentrations that inferred that the hotspot extended across the North Pacific Ocean, creating a biological boundary between gyres, with the potential to release organic matter comparable to that of the sevenfold-larger North Pacific Subtropical Gyre. Our results highlight the probable impact of viruses on large-scale phytoplankton biogeography and biogeochemistry in distinct regions of the oceans.

    Evolutionary genomics of a cold-adapted diatom: Fragilariopsis cylindrus

    The Southern Ocean houses a diverse and productive community of organisms [1, 2]. Unicellular eukaryotic diatoms are the main primary producers in this environment, where photosynthesis is limited by low concentrations of dissolved iron and large seasonal fluctuations in light, temperature and the extent of sea ice [3–7]. How diatoms have adapted to this extreme environment is largely unknown. Here we present insights into the genome evolution of a cold-adapted diatom from the Southern Ocean, Fragilariopsis cylindrus [8, 9], based on a comparison with temperate diatoms. We find that approximately 24.7 per cent of the diploid F. cylindrus genome consists of genetic loci with alleles that are highly divergent (15.1 megabases of the total genome size of 61.1 megabases). These divergent alleles were differentially expressed across environmental conditions, including darkness, low iron, freezing, elevated temperature and increased CO2. Alleles with the largest ratio of non-synonymous to synonymous nucleotide substitutions also show the most pronounced condition-dependent expression, suggesting a correlation between diversifying selection and allelic differentiation. Divergent alleles may be involved in adaptation to environmental fluctuations in the Southern Ocean.

    Cell-specific deletion of C1qa identifies microglia as the dominant source of C1q in mouse brain

    BACKGROUND: The complement cascade not only provides protection from infection but can also mediate destructive inflammation. Complement is also involved in the elimination of neuronal synapses, which is essential for proper development but can be detrimental during aging and disease. C1q, required for several of these complement-mediated activities, is present in the neuropil, microglia, and a subset of interneurons in the brain. METHODS: To identify the source(s) of C1q in the brain, the C1qa gene was selectively inactivated in microglia or Thy-1(+) neurons in both wild-type mice and a mouse model of Alzheimer's disease (AD), and C1q synthesis was assessed by immunohistochemistry, qPCR, and western blot analysis. RESULTS: While C1q expression in the brain was unaffected after inactivation of C1qa in Thy-1(+) neurons, the brains of C1qa(FL/FL):Cx3cr1(CreERT2) mice, in which C1qa was ablated in microglia, were devoid of C1q with the exception of limited C1q in subsets of interneurons. Surprisingly, this loss of C1q occurred even in the absence of tamoxifen by 1 month of age, demonstrating that Cre activity is tamoxifen-independent in microglia in Cx3cr1(CreERT2/WganJ) mice. C1q expression in C1qa(FL/FL):Cx3cr1(CreERT2/WganJ) mice continued to decline and remained almost completely absent through aging and in AD model mice. No difference in C1q was detected in the liver or kidney from C1qa(FL/FL):Cx3cr1(CreERT2/WganJ) mice relative to controls, and C1qa(FL/FL):Cx3cr1(CreERT2/WganJ) mice had minimal, if any, reduction in plasma C1q. CONCLUSIONS: Thus, microglia, but not neurons or peripheral sources, are the dominant source of C1q in the brain. While demonstrating that the Cx3cr1(CreERT2/WganJ) deleter cannot be used for adult-induced deletion of genes in microglia, the model described here enables further investigation of the physiological roles of C1q in the brain and the identification of therapeutic targets for the selective control of complement-mediated activities contributing to neurodegenerative disorders. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s12974-017-0814-9) contains supplementary material, which is available to authorized users.

    The History, Relevance, and Applications of the Periodic System in Geochemistry

    Geochemistry is a discipline in the earth sciences concerned with understanding the chemistry of the Earth and what that chemistry tells us about the processes that control the formation and evolution of Earth materials and the planet itself. The periodic table and the periodic system, as developed by Mendeleev and others in the nineteenth century, are as important in geochemistry as in other areas of chemistry. In fact, systemisation of the myriad observations that geochemists make is perhaps even more important in this branch of chemistry, given the huge variability in the nature of Earth materials – from the Fe-rich core, through the silicate-dominated mantle and crust, to the volatile-rich ocean and atmosphere. This systemisation started in the eighteenth century, when geochemistry did not yet exist as a separate pursuit. Mineralogy, one of the disciplines that eventually became geochemistry, was central to the discovery of the elements, and nineteenth-century mineralogists played a key role in this endeavour. Early "geochemists" continued this systemisation effort into the twentieth century, as highlighted particularly in the career of V.M. Goldschmidt. The focus of the modern discipline of geochemistry has moved well beyond classification: the information held in the properties of elements across the periodic table, and in their distribution across Earth and planetary materials, is now inverted to learn about the physicochemical processes that shaped the Earth and other planets, on all scales. We illustrate this approach with key examples, both those rooted in the patterns inherent in the periodic law and those that exploit concepts that only became familiar after Mendeleev, such as stable and radiogenic isotopes.

    Canine cancer immunotherapy studies: linking mouse and human

    Despite recent major clinical breakthroughs in human cancer immunotherapy, including the use of checkpoint inhibitors and engineered T cells, important challenges remain, including determining which sub-populations of patients will respond and which will experience significant toxicities. Although advances in cancer immunotherapy depend on preclinical testing, the majority of in-vivo testing currently relies on genetically identical inbred mouse models which, while offering critical insights regarding efficacy and mechanism of action, vastly underrepresent the heterogeneity and complex interplay of human immune cells and cancers. Additionally, laboratory mice rarely develop spontaneous tumors, are housed under specific-pathogen-free conditions that markedly impact immune development, and incompletely model key aspects of the tumor/immune microenvironment. The canine model represents a powerful tool in cancer immunotherapy research as an important link between murine models and human clinical studies. Dogs are outbred companion animals that experience spontaneous cancer development in the setting of an intact immune system, which allows the study of complex immune interactions during the course of treatment while also directly addressing the long-term efficacy and toxicity of cancer immunotherapies. However, immune dissection requires access to robust, validated immune assays and reagents, as well as cohorts large enough for statistical evaluation, and canine studies will need further optimization of these mechanistic tools for the model to fulfill its promise. This review discusses the canine model in the context of existing preclinical cancer immunotherapy models, evaluates its advantages and limitations, and highlights its growth as a powerful tool in the burgeoning fields of both human and veterinary immunotherapy.

    An evolutionary simulating annealing algorithm for Google machine reassignment problem

    The Google Machine Reassignment Problem (GMRP) is a real-world problem proposed at the ROADEF/EURO Challenge 2012 competition, where instances must be solved within 5 minutes. GMRP consists of reassigning a set of services to a set of machines with the aim of improving machine usage while satisfying numerous constraints. This paper proposes an evolutionary simulated annealing (ESA) algorithm for solving this problem. Simulated annealing (SA) is a single-solution-based heuristic that has been successfully applied to various optimisation problems. The proposed ESA uses a population of solutions instead of a single solution. Each solution has its own SA algorithm, and all SA chains work in parallel. Each SA starts from a different initial solution, which can lead to a different search path with distinct local optima. In addition, mutation operators are applied once a solution has not improved for a certain number of iterations. This not only helps the search avoid being trapped in local optima but also reduces computation time, because new solutions are not generated from scratch but built from existing ones. This study shows that the proposed ESA method can outperform state-of-the-art algorithms on GMRP.
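The algorithm this abstract outlines (a population of solutions, each advanced by its own SA chain, with a mutation kick applied after stagnation) can be sketched as follows. This is a toy illustration with a stand-in objective, not the authors' GMRP implementation; all names and parameters are illustrative:

```python
import math
import random

# Hedged sketch of evolutionary simulated annealing: several SA chains run
# over a shared cooling schedule, and a chain that stagnates for too long
# receives a random mutation to escape its local optimum. The sum-of-squares
# objective stands in for the real GMRP cost function.

def anneal_population(pop_size=4, dims=6, steps=500, stagnation_limit=50, seed=0):
    rng = random.Random(seed)
    cost = lambda s: sum(x * x for x in s)          # toy objective: minimise to 0
    pop = [[rng.randint(-10, 10) for _ in range(dims)] for _ in range(pop_size)]
    best = min(pop, key=cost)[:]                    # copy so later mutation can't alias it
    stale = [0] * pop_size
    temp = 10.0
    for _ in range(steps):
        for i, sol in enumerate(pop):
            # SA move: perturb one coordinate, accept by the Metropolis rule.
            cand = sol[:]
            j = rng.randrange(dims)
            cand[j] += rng.choice([-1, 1])
            delta = cost(cand) - cost(sol)
            if delta < 0 or rng.random() < math.exp(-delta / temp):
                pop[i] = cand
                stale[i] = 0 if delta < 0 else stale[i] + 1
            else:
                stale[i] += 1
            # Mutation: kick a chain that has not improved for too long.
            if stale[i] >= stagnation_limit:
                pop[i][rng.randrange(dims)] = rng.randint(-10, 10)
                stale[i] = 0
            if cost(pop[i]) < cost(best):
                best = pop[i][:]
        temp *= 0.99                                # geometric cooling shared by all chains
    return best
```

In the paper's setting the chains run in genuinely parallel fashion; the sequential inner loop here only emulates that, and the mutation step plays the role the abstract assigns it: restarting a stuck chain from a nearby solution rather than from scratch.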