
    Data intensive scientific analysis with grid computing

    At the end of September 2009, a new Italian GPS receiver for radio occultation was launched from the Satish Dhawan Space Center (Sriharikota, India) on the Indian Remote Sensing OCEANSAT-2 satellite. The Italian Space Agency has assembled a consortium of Italian universities and research centers to implement the overall radio occultation processing chain. After a brief description of the adopted algorithms, which can be used to characterize temperature, pressure and humidity, the contribution focuses on a method for automatically processing these data, based on a distributed architecture. This paper presents a possible application of grid computing to scientific research.
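    The core property the abstract relies on is that each occultation event can be processed independently, so a batch parallelizes trivially across grid nodes. A minimal sketch of that idea, with purely illustrative names (`retrieve_profile`, `run_chain`) standing in for the paper's actual processing chain:

    ```python
    # Hedged sketch, not the paper's code: independent occultation events
    # are farmed out to parallel workers, as a grid would farm them out
    # to distributed nodes.
    from concurrent.futures import ThreadPoolExecutor

    def retrieve_profile(raw_samples):
        """Toy stand-in for one inversion step: here, just a mean."""
        return sum(raw_samples) / len(raw_samples)

    def run_chain(events, workers=4):
        # Each event is independent, so the batch maps cleanly onto a
        # pool of workers -- the property grid computing exploits.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(retrieve_profile, events))

    events = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
    print(run_chain(events))  # [2.0, 5.0]
    ```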

    Enhancing Job Scheduling of an Atmospheric Intensive Data Application

    Nowadays, e-Science applications involve a great deal of data in order to produce more accurate analyses. One such application domain is radio occultation, which manages satellite data. Grid Processing Management is a geographically distributed physical infrastructure, based on grid computing, implemented to carry out the overall radio occultation analysis. After a brief description of the algorithms adopted to characterize atmospheric profiles, the paper presents an improvement to job scheduling intended to decrease processing time and optimize resource utilization. The grid's computing capacity is extended with virtual machines added to the existing physical grid in order to satisfy temporary job requests. Scheduling also plays an important role in the infrastructure: it is handled by a pair of schedulers developed to manage data automatically.
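    The elasticity described above can be sketched in a few lines: when queued jobs exceed the physical grid's slots, temporary virtual-machine slots absorb the overflow up to some limit, and the remainder is deferred. All names and the slot counts here are illustrative assumptions, not details from the paper:

    ```python
    # Hedged sketch of VM-based capacity extension for a grid scheduler.
    def schedule(jobs, physical_slots, vm_slot_limit):
        """Split a job queue into (physical, temporary-VM, deferred) tiers."""
        on_physical = jobs[:physical_slots]          # fill the physical grid first
        overflow = jobs[physical_slots:]
        on_vms = overflow[:vm_slot_limit]            # absorb overflow on temporary VMs
        deferred = overflow[vm_slot_limit:]          # the rest waits for a free slot
        return on_physical, on_vms, deferred

    jobs = [f"job{i}" for i in range(7)]
    phys, vms, rest = schedule(jobs, physical_slots=4, vm_slot_limit=2)
    print(len(phys), len(vms), len(rest))  # 4 2 1
    ```

    A real scheduler would also account for VM start-up latency and release the temporary slots once the backlog clears; this sketch only captures the admission decision.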

    GNSS transpolar earth reflectometry exploriNg system (G-TERN): mission concept

    The Global Navigation Satellite System (GNSS) Transpolar Earth Reflectometry exploriNg system (G-TERN) was proposed in response to ESA's Earth Explorer 9 revised call by a team of 33 multi-disciplinary scientists. The primary objective of the mission is to quantify, at high spatio-temporal resolution, crucial characteristics, processes and interactions between sea ice and other Earth system components, in order to advance the understanding and prediction of climate change and its impacts on the environment and society. The objective is articulated through three key questions. 1) In a rapidly changing Arctic regime and under the resilient Antarctic sea ice trend, how will highly dynamic forcings and couplings between the various components of the ocean, atmosphere, and cryosphere modify or influence the processes governing the characteristics of the sea ice cover (ice production, growth, deformation, and melt)? 2) What are the impacts of extreme events and feedback mechanisms on sea ice evolution? 3) What are the effects of cryosphere behaviors, whether rapidly changing or resiliently stable, on the global oceanic and atmospheric circulation and on mid-latitude extreme events? To contribute to answering these questions, G-TERN will measure key parameters of the sea ice, the oceans, and the atmosphere with frequent and dense coverage over polar areas, becoming a “dynamic mapper” of ice conditions, ice production and loss on multiple time and space scales, and of the surrounding environment. Over polar areas, G-TERN will measure sea ice surface elevation (<10 cm precision), roughness, and polarimetric properties at 30-km resolution with full coverage every 3 days. G-TERN will implement the interferometric GNSS reflectometry concept from a single satellite in near-polar orbit with the capability for 12 simultaneous observations. Unlike currently orbiting GNSS reflectometry missions, G-TERN uses the full available GNSS bandwidth to improve its ranging measurements.
    The lifetime would be 2025-2030, or optimally 2025-2035, covering key stages of the transition toward a nearly ice-free Arctic Ocean in summer. This paper describes the mission objectives, reviews its measurement techniques, summarizes the suggested implementation, and finally estimates the expected performance.

    GRID AND CLOUD COMPUTING FOR E-SCIENCE APPLICATIONS

    e-Science fields, which include areas such as spatial data, electromagnetics, bioinformatics, energy, the social sciences, simulation, and the physical sciences, have seen in recent years a significant growth in the complexity of algorithms and applications for data analysis. Scientific data have also evolved, with an explosion in data volume and dataset size for the scientific community. This has led researchers to identify new needs regarding analysis tools and applications, driving a profound change in how computing infrastructures are used. The field of e-Science is constantly evolving through the creation of ever-growing scientific communities with real needs for increasingly powerful computational resources. Another important issue is the ability to share results, which is why cloud technology, through virtualization, can be an important aid to the scientific community, providing a flexible and scalable IT infrastructure that adapts to actual necessities. Indeed, cloud computing allows computing and storage resources to be provisioned in an easily configurable way, adaptable as a function of real needs. Researchers often do not have all the computing capacity they require, so cloud technology and cloud models such as Private, Public and Hybrid are enabling technologies for guaranteeing service availability, scalability and flexibility. The transition from traditional infrastructure to new virtualized, distributed models gives researchers access to an extremely flexible environment, allowing an optimization of hardware use that makes more resources available. However, the computational needs of e-Science have a direct effect on the way applications are developed. The approach to writing algorithms and applications is still too tied to a model centered on a single workstation.
    The vast majority of researchers write their applications on a laptop or workstation, in a limited context of computing power and storage, and in a non-distributed way.
