    From Copernicus Big Data to Extreme Earth Analytics

    Copernicus is the European programme for monitoring the Earth. It consists of a set of systems that collect data from satellites and in-situ sensors, process this data and provide users with reliable and up-to-date information on a range of environmental and security issues. The data and information processed and disseminated put Copernicus at the forefront of the big data paradigm, giving rise to all the relevant challenges, the so-called 5 Vs: volume, velocity, variety, veracity and value. In this short paper, we discuss the challenges of extracting information and knowledge from huge archives of Copernicus data. We propose to achieve this with scale-out distributed deep learning techniques that run on very large clusters offering virtual machines and GPUs. We also discuss the challenges of achieving scalability in the management of the extreme volumes of information and knowledge extracted from Copernicus data. The envisioned scientific and technical work will be carried out in the context of the H2020 project ExtremeEarth, which starts in January 2019.
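
    The paper only sketches the approach at a high level; as a purely illustrative aside, the snippet below shows what synchronous data-parallel ("scale-out") training across the GPUs of a single node can look like using TensorFlow's MirroredStrategy. The model, data and shapes are placeholders, not the architectures or infrastructure used in ExtremeEarth.

```python
# Illustrative sketch only: synchronous data-parallel training across the GPUs
# of one node with tf.distribute.MirroredStrategy. The model, data and shapes
# are placeholders, not the architectures used in ExtremeEarth.
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # replicates the model on every visible GPU
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Toy classifier standing in for a real satellite image patch classifier.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(64, 64, 3)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# Random placeholder data; in practice this would stream from a large archive.
images = tf.random.uniform((256, 64, 64, 3))
labels = tf.random.uniform((256,), maxval=10, dtype=tf.int32)
model.fit(images, labels, batch_size=32, epochs=1)
```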

    ExtremeEarth meets satellite data from space

    Bringing together a number of cutting-edge technologies that range from storing extremely large volumes of data all the way to developing scalable machine learning and deep learning algorithms in a distributed manner, and having them operate over the same infrastructure, poses unprecedented challenges. One of these challenges is the integration of the European Space Agency's (ESA) Thematic Exploitation Platforms (TEPs) and data and information access service platforms with a data platform, namely Hopsworks, that enables scalable data processing, machine learning and deep learning on Copernicus data, as well as the development of very large training datasets for deep learning architectures targeting the classification of Sentinel images. In this paper, we present the software architecture of ExtremeEarth, which aims at the development of scalable deep learning and geospatial analytics techniques for processing and analyzing petabytes of Copernicus data. The ExtremeEarth software infrastructure seamlessly integrates existing and novel software platforms and tools for storing, accessing, processing, analyzing and visualizing large amounts of Copernicus data. New techniques in the areas of remote sensing and artificial intelligence, with an emphasis on deep learning, are developed. These techniques and the corresponding software presented in this paper are to be integrated with and used in two ESA TEPs, namely the Polar and Food Security TEPs. Furthermore, we present the integration of Hopsworks with the Polar and Food Security use cases and the flow of events for the products offered through the TEPs.
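
    As a hedged illustration of the "very large training datasets" mentioned above, one common way to materialize such a dataset is to serialize labelled Sentinel-like patches into sharded TFRecord files that a distributed training job can later stream. The field names, patch size and band count below are assumptions, not the ExtremeEarth schema.

```python
# Illustrative only: serializing labelled Sentinel-like patches into a TFRecord
# shard, a common storage layout for very large training datasets. Field names,
# patch size and band count are assumptions, not the ExtremeEarth schema.
import numpy as np
import tensorflow as tf

def encode_patch(patch: np.ndarray, label: int) -> bytes:
    """Pack one multi-band patch and its class label into a tf.train.Example."""
    feature = {
        "patch": tf.train.Feature(
            bytes_list=tf.train.BytesList(value=[patch.astype(np.float32).tobytes()])),
        "height": tf.train.Feature(int64_list=tf.train.Int64List(value=[patch.shape[0]])),
        "width": tf.train.Feature(int64_list=tf.train.Int64List(value=[patch.shape[1]])),
        "bands": tf.train.Feature(int64_list=tf.train.Int64List(value=[patch.shape[2]])),
        "label": tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
    }
    return tf.train.Example(features=tf.train.Features(feature=feature)).SerializeToString()

# Write a small shard of random placeholder patches (10 bands, 32x32 pixels).
with tf.io.TFRecordWriter("sentinel_patches-00000.tfrecord") as writer:
    for _ in range(100):
        patch = np.random.rand(32, 32, 10)
        writer.write(encode_patch(patch, label=int(np.random.randint(0, 5))))
```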

    Encoding and Validation of Earth Observation Metadata using Schema.org and SHACL

    This thesis presents a schema.org vocabulary extension for encoding Earth observation (EO) datasets and their properties. It is based on the vocabulary defined in the OGC 17-003 specification, which describes a GeoJSON and JSON-LD encoding of Earth observation metadata for datasets. We updated this vocabulary in order to make it simpler, as schema.org's principles demand, without excluding any of the information provided for the EO datasets. We also used the Shapes Constraint Language (SHACL) to create a shapes graph for our schema.org extension. This shapes graph includes constraints on the properties of our vocabulary, so that we can model and validate RDF graphs constructed from EO data. We conclude by providing detailed examples of annotating and validating EO datasets based on our schema.org vocabulary extension.
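
    A minimal sketch of the workflow the thesis describes, assuming simplified properties and a simplified shape rather than the actual OGC 17-003-based extension: describe an EO dataset with schema.org terms in RDF, then validate it against a SHACL shapes graph with rdflib and pySHACL.

```python
# Minimal sketch of the general workflow: an EO dataset described with
# schema.org terms, validated against a SHACL shape. The shape and the chosen
# properties are simplified illustrations, not the actual vocabulary extension.
from rdflib import Graph
from pyshacl import validate

data_ttl = """
@prefix schema: <https://schema.org/> .
@prefix ex: <https://example.org/> .

ex:dataset1 a schema:Dataset ;
    schema:name "Sentinel-2 L2A tile over Paris" ;
    schema:temporalCoverage "2018-06-01/2018-06-30" .
"""

shapes_ttl = """
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix schema: <https://schema.org/> .

schema:DatasetShape a sh:NodeShape ;
    sh:targetClass schema:Dataset ;
    sh:property [ sh:path schema:name ; sh:minCount 1 ] ;
    sh:property [ sh:path schema:temporalCoverage ; sh:minCount 1 ] .
"""

data_graph = Graph().parse(data=data_ttl, format="turtle")
shapes_graph = Graph().parse(data=shapes_ttl, format="turtle")

conforms, _, report_text = validate(data_graph, shacl_graph=shapes_graph, inference="none")
print("Conforms:", conforms)
print(report_text)
```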

    From Copernicus Big Data to Big Information and Big Knowledge: A Demo from the Copernicus App Lab Project

    Copernicus is the European program for monitoring the Earth. It consists of a set of complex systems that collect data from satellites and in-situ sensors, process this data and provide users with reliable and up-to-date information on a range of environmental and security issues. The data collected by Copernicus is made available freely under an open access policy. Information extracted from Copernicus data is disseminated to users through the Copernicus services, which address six thematic areas: land, marine, atmosphere, climate, emergency and security. We present a demo from the Horizon 2020 Copernicus App Lab project, which takes big data from the Copernicus land service, makes it available on the Web as linked geospatial data and interlinks it with other useful public data to aid the development of applications by developers who may not be Earth Observation experts. Our demo targets a scenario where we want to study the “greenness” of Paris.
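
    As a rough sketch of how a developer might consume such linked geospatial data, the query below asks a SPARQL endpoint for vegetation-index observations linked to Paris. The endpoint URL, ontology and property names are hypothetical placeholders, not the Copernicus App Lab data model.

```python
# Rough illustration of consuming linked geospatial data with SPARQL.
# The endpoint URL and the ex: vocabulary are hypothetical placeholders,
# not the Copernicus App Lab endpoint or data model.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/sparql")  # hypothetical endpoint
sparql.setQuery("""
PREFIX ex:  <https://example.org/ontology#>
PREFIX geo: <http://www.opengis.net/ont/geosparql#>

SELECT ?obs ?value ?wkt WHERE {
  ?obs a ex:VegetationIndexObservation ;
       ex:value ?value ;
       geo:hasGeometry/geo:asWKT ?wkt ;
       ex:region ?region .
  ?region ex:name "Paris" .
}
LIMIT 10
""")
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["obs"]["value"], row["value"]["value"])
```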

    Providing Satellite Data to Mobile Developers Using Semantic Technologies and Linked Data

    Copernicus is the European program for monitoring the Earth. It consists of a set of complex systems that collect data from satellites and in-situ sensors, process it, and provide users with reliable and up-to-date information on a range of environmental and security issues. Information extracted from Copernicus data is made available to users through Copernicus services addressing six thematic areas: land, marine, atmosphere, climate, emergency and security. The data processed and disseminated puts Copernicus at the forefront of the big data paradigm and gives rise to all the relevant challenges: volume, velocity, variety, veracity and value. In this paper we discuss the challenges of big Copernicus data and how the Copernicus program has handled them. We also present lessons learned from our project Copernicus App Lab, which takes Copernicus services information and makes it available on the Web using semantic technologies to aid its uptake by mobile developers.
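
    As a hedged sketch of the "publish as linked data" idea, the snippet below expresses one hypothetical land-service measurement as RDF with a GeoSPARQL geometry and serializes it as JSON-LD, a representation that mobile developers can consume directly. All URIs and property names are invented for illustration.

```python
# Hedged sketch: expressing one hypothetical land-service measurement as RDF
# with a GeoSPARQL WKT geometry, then serializing it as JSON-LD (rdflib >= 6).
# All URIs and property names below are invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

EX = Namespace("https://example.org/ontology#")
GEO = Namespace("http://www.opengis.net/ont/geosparql#")

g = Graph()
g.bind("ex", EX)
g.bind("geo", GEO)

obs = URIRef("https://example.org/observation/1")
geom = URIRef("https://example.org/observation/1/geometry")

g.add((obs, RDF.type, EX.LeafAreaIndexObservation))
g.add((obs, EX.value, Literal(2.7, datatype=XSD.double)))
g.add((obs, GEO.hasGeometry, geom))
g.add((geom, GEO.asWKT, Literal("POINT(2.3522 48.8566)", datatype=GEO.wktLiteral)))

print(g.serialize(format="json-ld", indent=2))
```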

    Artificial Intelligence and Big Data Technologies for Copernicus Data: The ExtremeEarth Project

    ExtremeEarth is a three-year H2020 ICT research and innovation project which is currently in its final year. The main objective of ExtremeEarth is to develop Artificial Intelligence and Big Data techniques and technologies that scale to the large volumes of Copernicus data, information and knowledge, and to apply these technologies in two of the ESA Thematic Exploitation Platforms: Food Security and Polar. The technical contributions of the project so far include: (i) new deep learning architectures for crop type mapping in the context of the Food Security use case, (ii) new deep learning architectures for sea ice mapping in the context of the Polar use case, (iii) the development and open publication of very large datasets for training these architectures, (iv) new versions of scalable semantic technologies for managing big linked geospatial data, and (v) a new platform that brings all of the previous technologies together and applies them to the two use cases.