
    Harnessing the Data Revolution

    Harnessing Data for 21st Century Science and Engineering (aka Harnessing the Data Revolution, HDR) is one of NSF's six Big Research Ideas, aimed at supporting fundamental research in data science and engineering; developing a cohesive, federated approach to the research data infrastructure needed to power this revolution; and developing a 21st-century data-capable workforce. HDR will enable new modes of data-driven discovery, allowing researchers to ask and answer new questions in frontier science and engineering, generate new knowledge and understanding by working with domain experts, and accelerate discovery and innovation. This initiative builds on NSF's history of data science investments. The HDR Big Idea is particularly well suited for collaborations and partnerships with industry. After providing an overview of HDR, we will explore areas for potential collaboration and partnership with industry. As the only federal agency supporting all fields of science and engineering, NSF is uniquely positioned to help ensure that our country's future is one enriched and improved by data.

    An Overview of Big Data Analytics in Banking: Approaches, Challenges and Issues

    Banks are harnessing the power of Big Data, using Big Data and Data Science to drive a shift towards data and analytics and to gain an overall competitive advantage. Big Data has the potential to transform enterprise operations and processes, especially in the banking sector, because banks hold huge amounts of transaction data. The goal of this paper is to give an overview of the different approaches and challenges that exist for Big Data in the banking sector. The work presented here fills a gap in the research literature of the last five years, with a focus on Big Data in central banks and on credit scoring in central banks. For this paper, we reviewed existing research literature, official reports, surveys, and seminars of central banks, all related directly or indirectly to Big Data in banks.
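
    As a minimal, hedged illustration of one analytics use case the survey touches on (credit scoring), the Python sketch below fits a logistic-regression scorecard on synthetic data; the features, data, and model choice are assumptions for illustration only, not a method from the reviewed literature.

        # Illustrative credit-scoring sketch: synthetic data, hypothetical features.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 5000
        # Hypothetical applicant features: monthly income, credit utilisation, late payments.
        income = rng.normal(3000, 800, n)
        utilisation = rng.uniform(0, 1, n)
        late_payments = rng.poisson(0.5, n)
        X = np.column_stack([income, utilisation, late_payments])

        # Synthetic default flag, loosely driven by utilisation and late payments.
        logit = -3 + 2.5 * utilisation + 0.8 * late_payments - 0.0002 * income
        y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        print("holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))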

    Unlocking the potential of deep learning for marine ecology: overview, applications, and outlook

    The deep learning (DL) revolution is touching all scientific disciplines and corners of our lives as a means of harnessing the power of big data. Marine ecology is no exception. New methods provide analysis of data from sensors, cameras, and acoustic recorders, even in real time, in ways that are reproducible and rapid. Off-the-shelf algorithms find, count, and classify species from digital images or video and detect cryptic patterns in noisy data. These endeavours require collaboration across ecological and data science disciplines, which can be challenging to initiate. To promote the use of DL towards ecosystem-based management of the sea, this paper aims to bridge the gap between marine ecologists and computer scientists. We provide insight into popular DL approaches for ecological data analysis, focusing on supervised learning techniques with deep neural networks, and illustrate challenges and opportunities through established and emerging applications of DL to marine ecology. We present case studies on plankton, fish, marine mammals, pollution, and nutrient cycling that involve object detection, classification, tracking, and segmentation of visualized data. We conclude with a broad outlook of the field's opportunities and challenges, including potential technological advances and issues with managing complex data sets.
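
    To make the supervised-learning workflow concrete, here is a minimal sketch of a small convolutional image classifier of the kind such case studies rely on; it is not any model from the paper, and the class count, patch size, and random placeholder data are assumptions.

        # Minimal supervised image-classification sketch (PyTorch); random tensors
        # stand in for labelled ecological images such as plankton patches.
        import torch
        import torch.nn as nn

        class SmallCNN(nn.Module):
            """Tiny convolutional classifier for 64x64 RGB patches."""
            def __init__(self, n_classes: int = 5):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, n_classes))

            def forward(self, x):
                return self.head(self.features(x))

        model = SmallCNN()
        images = torch.randn(8, 3, 64, 64)    # placeholder batch of image patches
        labels = torch.randint(0, 5, (8,))    # placeholder class labels
        loss = nn.CrossEntropyLoss()(model(images), labels)
        loss.backward()                        # one illustrative backward pass (optimizer omitted)
        print("loss:", float(loss))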

    Towards an Italian Energy Data Space

    The efficient use and the sustainable production of energy are among the main challenges in facing the ever-increasing demand for energy and the need to limit damage to the Earth. Smart energy grids and pervasive computing and communication technologies have enabled stakeholders in the energy industry to collect large amounts of useful and highly granular energy data. These data are generated in large volumes and in a variety of formats, depending on their originating systems and intended purposes; they can be structured or unstructured, in open or proprietary formats. This work focuses on harnessing the power of Big Data management to propose a first model of an Italian Energy Data Lake: the goal is to create a repository of national energy data that respects the key FAIR principles [1], aimed at providing a decision support system and making FAIR data available for open science. Starting from data in two thematic areas that are part of the nine common European Data Spaces identified in the European Data Strategy [2], namely the Green Deal data space and the Energy data space, we present an open and extensible platform that enables secure, resilient acquisition and sharing of information, supporting the Green Deal priority actions on issues such as climate change, the circular economy, pollution, biodiversity, and deforestation.
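
    As a minimal sketch of the kind of FAIR-oriented catalogue entry such a data lake might keep for each ingested dataset, the record below uses hypothetical field names and a hypothetical access URL; it is not the platform's actual schema.

        # Hypothetical catalogue record for a dataset landing in the energy data lake.
        # Field names only roughly map to the FAIR principles: Findable (id, keywords),
        # Accessible (access_url, licence), Interoperable (format, schema), Reusable (provenance, licence).
        import json
        import uuid
        from datetime import datetime, timezone

        record = {
            "id": str(uuid.uuid4()),                        # persistent identifier (Findable)
            "title": "Hourly smart-meter consumption, region X",
            "keywords": ["energy", "smart grid", "consumption"],
            "format": "text/csv",                           # declared format (Interoperable)
            "schema": {"timestamp": "ISO 8601", "kwh": "float"},
            "access_url": "https://example.org/data/ab12",  # hypothetical endpoint (Accessible)
            "licence": "CC-BY-4.0",                         # reuse conditions (Reusable)
            "provenance": {
                "source_system": "smart-meter head-end (hypothetical)",
                "ingested_at": datetime.now(timezone.utc).isoformat(),
            },
        }
        print(json.dumps(record, indent=2))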

    Launching the Grand Challenges for Ocean Conservation

    The ten most pressing Grand Challenges in Ocean Conservation were identified at the Oceans Big Think and described in a detailed working document:
    1. A Blue Revolution for Oceans: Reengineering Aquaculture for Sustainability
    2. Ending and Recovering from Marine Debris
    3. Transparency and Traceability from Sea to Shore: Ending Overfishing
    4. Protecting Critical Ocean Habitats: New Tools for Marine Protection
    5. Engineering Ecological Resilience in Near Shore and Coastal Areas
    6. Reducing the Ecological Footprint of Fishing through Smarter Gear
    7. Arresting the Alien Invasion: Combating Invasive Species
    8. Combatting the Effects of Ocean Acidification
    9. Ending Marine Wildlife Trafficking
    10. Reviving Dead Zones: Combating Ocean Deoxygenation and Nutrient Runoff

    Harnessing the cognitive surplus of the nation: new opportunities for libraries in a time of change. The 2012 Jean Arnot Memorial Fellowship Essay.

    This essay is the winner of the 2012 Jean Arnot Memorial Fellowship. It draws on Rose Holley's experience of managing innovative library services that engage crowds, such as the Australian Newspapers Digitisation Program and Trove, and her ongoing research into library, archive, and museum crowdsourcing projects. This experience and knowledge are put into the context of Jean Arnot's values and visions for Australian libraries. Jean Arnot, the distinguished Australian librarian, described her vision for an innovative library service over sixty years ago. Rose suggests how some of those goals are now being achieved through the use of the internet and digital technologies, and how we can build on them to ensure that libraries remain valued and relevant by harnessing the cognitive surplus of the nation they serve, and by crowdsourcing.

    EChO Payload electronics architecture and SW design

    EChO is a three-module (VNIR, SWIR, MWIR), highly integrated spectrometer covering the wavelength range from 0.55 μm to 11.0 μm. The baseline design includes the goal wavelength extension to 0.4 μm, while an optional LWIR module extends the range to the goal wavelength of 16.0 μm. An Instrument Control Unit (ICU) is foreseen as the main electronic subsystem interfacing with the spacecraft and collecting data from all the payload spectrometer modules. The ICU is in charge of two main tasks: overall payload control (Instrument Control Function) and digital processing of housekeeping and scientific data (Data Processing Function), including lossless compression prior to storing the science data in the spacecraft's Solid State Mass Memory. These two main tasks are accomplished by the Payload On-Board Software (P-OBSW) running on the ICU CPUs. Comment: Experimental Astronomy - EChO Special Issue 201
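
    As a hedged illustration of the lossless-compression step performed by the Data Processing Function before science data reach the spacecraft's mass memory, the sketch below round-trips a simulated detector frame through zlib; the actual P-OBSW algorithm and data format are not specified here, and zlib is only a stand-in.

        # Illustrative lossless compression of a simulated 16-bit detector frame.
        # zlib stands in for whatever lossless scheme the P-OBSW actually implements.
        import zlib
        import numpy as np

        rng = np.random.default_rng(0)
        ramp = np.add.outer(np.arange(256), np.arange(256))           # smooth spatial structure
        frame = (ramp * 50 + rng.integers(0, 8, (256, 256))).astype(np.uint16)

        raw = frame.tobytes()
        packed = zlib.compress(raw, level=6)
        print(f"raw: {len(raw)} bytes, compressed: {len(packed)} bytes")

        restored = np.frombuffer(zlib.decompress(packed), dtype=np.uint16).reshape(frame.shape)
        assert np.array_equal(frame, restored)                        # lossless round trip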

    The Evidence Hub: harnessing the collective intelligence of communities to build evidence-based knowledge

    Conventional document and discussion websites give users no help in assessing the quality or quantity of evidence behind any given idea. Moreover, the very meaning of what counts as evidence may not be unequivocally defined within a community, and may require deep understanding, common ground, and debate. An Evidence Hub is a tool for pooling a community's collective intelligence on what constitutes evidence for an idea. It provides an infrastructure for debating and building evidence-based knowledge and practice. An Evidence Hub is best thought of as a filter onto other websites: a map that distills the most important issues, ideas, and evidence from the noise by making clear why ideas and web resources may be worth further investigation. This paper describes the Evidence Hub concept and rationale, the breadth of user engagement, and the evolution of specific features, derived from our work with different community groups in the healthcare and educational sectors.
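
    A minimal sketch of the idea-to-evidence linkage an Evidence Hub maintains might look like the following; the class names, fields, and example entries are hypothetical and are not the Hub's actual data model.

        # Hypothetical data model: ideas backed by evidence items that point at web resources.
        from dataclasses import dataclass, field

        @dataclass
        class Evidence:
            claim: str              # what the evidence asserts
            url: str                # the external resource it points to
            supports: bool = True   # supports or challenges the idea

        @dataclass
        class Idea:
            title: str
            evidence: list[Evidence] = field(default_factory=list)

            def support_ratio(self) -> float:
                """Fraction of linked evidence that supports the idea (0.0 if none)."""
                if not self.evidence:
                    return 0.0
                return sum(e.supports for e in self.evidence) / len(self.evidence)

        idea = Idea("Peer support improves patient self-management")
        idea.evidence.append(Evidence("Cohort study reports improved outcomes", "https://example.org/study"))
        idea.evidence.append(Evidence("Small trial finds no effect", "https://example.org/trial", supports=False))
        print(idea.title, "support ratio:", idea.support_ratio())    # 0.5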