1,670 research outputs found

    Numerical simulation of the heavy rainfall caused by a convection band over Korea: a case study on the comparison of WRF and CReSS

    This study investigates the capability of two numerical models, the Weather Research and Forecasting (WRF) model and the Cloud Resolving Storm Simulator (CReSS), to simulate the heavy rainfall that occurred on September 21, 2010 in the middle of the Korean peninsula. This event was typical of rainfall caused by an intense quasi-stationary convection band, which produces a large accumulated rainfall amount within a narrow area. To investigate the relevant characteristics of this heavy rainfall and the ability of the numerical models to simulate them, experiments with both models were designed with a focus on Korea at a horizontal grid spacing of 2 km. The initial and lateral boundary conditions were interpolated from the output of the Japan Meteorological Agency mesoscale model, and the integration spanned the 24-h period from 2100 UTC on September 20, 2010, when the rainfall started over the Yellow Sea. Generally, the spatial distribution and temporal evolution of the rainfall simulated by CReSS are closer to the in situ observations (655 stations) than those of the WRF. The WRF simulation fails to capture the unusually stagnant behavior of this event. The spatial and vertical patterns of reflectivity are consistent with the rainfall pattern, indicating that strong reflectivity coincides with the convective activity that accompanies excessive rainfall. The thermodynamic structure is the main driver of the different behavior of the two simulations. The higher equivalent potential temperature, deep moist absolutely unstable layer, and strong veering wind shear seen in the CReSS simulation help create an environment favorable for inducing convection. (National Institute of Meteorological Research (Korea), Grant NIMR-2012-B-7; Korea Meteorological Administration)

    Meteorological influences of the Japan Sea data assimilation on the northwestern Pacific area

    The 33rd Symposium on Polar Meteorology and Glaciology, December 1 (Wed.), National Institute of Polar Research, 2nd-floor large conference room

    The Goddard Cumulus Ensemble Model (GCE): Improvements and Applications for Studying Precipitation Processes

    Convection is the primary transport process in the Earth's atmosphere. About two-thirds of the Earth's rainfall and severe floods derive from convection. In addition, two-thirds of the global rain falls in the tropics, while the associated latent heat release accounts for three-fourths of the total heat energy for the Earth's atmosphere. Cloud-resolving models (CRMs) have been used to improve our understanding of cloud and precipitation processes and phenomena from micro-scale to cloud-scale and mesoscale, as well as their interactions with radiation and surface processes. CRMs use sophisticated and realistic representations of cloud microphysical processes and can resolve reasonably well the time evolution, structure, and life cycles of clouds and cloud systems. CRMs also allow for explicit interaction between clouds, outgoing longwave (cooling) and incoming solar (heating) radiation, and ocean and land surface processes. Observations are required to initialize CRMs and to validate their results. The Goddard Cumulus Ensemble model (GCE) has been developed and improved at NASA/Goddard Space Flight Center over the past three decades. It is a multi-dimensional non-hydrostatic CRM that can simulate clouds and cloud systems in different environments. Early improvements and testing were presented in Tao and Simpson (1993) and Tao et al. (2003a). A review of the application of the GCE to the understanding of precipitation processes can be found in Simpson and Tao (1993) and Tao (2003). In this paper, recent model improvements (microphysics, radiation, and land surface processes) are described along with their impact and performance on cloud and precipitation events in different geographic locations via comparisons with observations.
In addition, recent advanced applications of the GCE are presented, including understanding the physical processes responsible for diurnal variation, examining the impact of aerosols (cloud condensation nuclei, or CCN, and ice nuclei, or IN) on precipitation processes, utilizing a satellite simulator to improve the microphysics, providing better simulations for satellite-derived latent heating retrieval, and coupling with a general circulation model to improve the representation of precipitation processes.

    Global Cloud-Resolving Models

    Global cloud-resolving models (GCRMs) are a new category of atmospheric global models designed to solve different flavors of the nonhydrostatic equations through the use of kilometer-scale global meshes. GCRMs make it possible to explicitly simulate deep convection, thereby avoiding the need for cumulus parameterization and allowing for clouds to be resolved by microphysical models responding to grid-scale forcing. GCRMs require high-resolution discretization over the globe, for which a variety of mesh structures have been proposed and employed. The first GCRM was constructed 15 years ago, and in recent years, other groups have also begun adopting this approach, enabling the first intercomparison studies of such models. Because conventional general circulation models (GCMs) suffer from large biases associated with cumulus parameterization, GCRMs are attractive tools for researchers studying global weather and climate. In this review, GCRMs are described, with some emphasis on their historical development and the associated literature documenting their use. The advantages of GCRMs are presented, and currently existing GCRMs are listed and described. Future prospects for GCRMs are also presented in the final section.
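To make the cost of kilometer-scale global meshes concrete, here is a back-of-envelope estimate (not from the review itself; an area-based approximation that ignores actual mesh details) of the horizontal cell count at a given grid spacing:

```python
import math

R_EARTH_KM = 6371.0  # mean Earth radius

def global_cells(dx_km):
    """Rough number of horizontal grid columns needed to tile Earth's
    surface at spacing dx_km (area estimate; ignores mesh structure)."""
    surface_km2 = 4 * math.pi * R_EARTH_KM ** 2  # ~5.1e8 km^2
    return surface_km2 / dx_km ** 2

# At 3 km spacing this gives roughly 5.7e7 columns; halving the spacing
# quadruples the column count, and the shorter stable time step pushes
# the total cost up by roughly a factor of eight.
```

This quartic-to-octic scaling with resolution is why GCRMs only became feasible with recent high-performance computing systems.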

    A Fortran-Keras Deep Learning Bridge for Scientific Computing

    Implementing artificial neural networks is commonly achieved via high-level programming languages such as Python and easy-to-use deep learning libraries such as Keras. These software libraries come preloaded with a variety of network architectures, provide autodifferentiation, and support GPUs for fast and efficient computation. As a result, a deep learning practitioner will favor training a neural network model in Python, where these tools are readily available. However, many large-scale scientific computation projects are written in Fortran, making it difficult to integrate with modern deep learning methods. To alleviate this problem, we introduce a software library, the Fortran-Keras Bridge (FKB). This two-way bridge connects environments where deep learning resources are plentiful with those where they are scarce. The paper describes several unique features offered by FKB, such as customizable layers, loss functions, and network ensembles. The paper concludes with a case study that applies FKB to address open questions about the robustness of an experimental approach to global climate simulation, in which subgrid physics are outsourced to deep neural network emulators. In this context, FKB enables a hyperparameter search of more than one hundred candidate models of subgrid cloud and radiation physics, initially implemented in Keras, to be transferred and used in Fortran. Such a process allows the model’s emergent behavior to be assessed, i.e., when fit imperfections are coupled to explicit planetary-scale fluid dynamics. The results reveal a previously unrecognized strong relationship between offline validation error and online performance, in which the choice of the optimizer proves unexpectedly critical. This in turn reveals many new neural network architectures that produce considerable improvements in climate model stability, including some with reduced error, for an especially challenging training dataset.
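The bridge concept rests on the fact that a Keras Dense layer is just act(Wx + b), which a Fortran counterpart can reproduce with plain array arithmetic once the trained weights are exported. A minimal pure-Python sketch of that forward pass (the weight values are made up for illustration, not taken from the paper):

```python
# Weights for a tiny 2-layer dense network (hypothetical values).
# In an FKB-style workflow these would be exported from a trained Keras
# model and read by the Fortran side; here we hard-code them to show
# exactly what arithmetic the bridge must reproduce.
W1 = [[0.5, -0.2], [0.1, 0.4]]   # layer 1: 2 inputs -> 2 hidden units
b1 = [0.0, 0.1]
W2 = [[1.0, -1.0]]               # layer 2: 2 hidden -> 1 output
b2 = [0.05]

def dense(x, W, b, act):
    """One Keras-style Dense layer: act(W @ x + b), row by row."""
    return [act(sum(w * xi for w, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

relu = lambda z: max(0.0, z)
linear = lambda z: z

def forward(x):
    return dense(dense(x, W1, b1, relu), W2, b2, linear)

y = forward([1.0, 2.0])  # hidden = [0.1, 1.0] -> output = [-0.85]
```

Because the forward pass is only loops over arrays, the Fortran side needs no deep learning runtime at all, just the exported weights and matching layer semantics.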

    High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6

    Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest the possibility for significant changes in both large-scale aspects of circulation, as well as improvements in small-scale processes and extremes. However, such high resolution global simulations at climate time scales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centers and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other MIPs. Increases in High Performance Computing (HPC) resources, as well as the revised experimental design for CMIP6, now enables a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility to extend to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, and the connection with the other CMIP6-endorsed MIPs, as well as the DECK and CMIP6 historical simulation. HighResMIP thereby focuses on one of the CMIP6 broad questions: “what are the origins and consequences of systematic model biases?”, but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.

    Resource provisioning and scheduling algorithms for hybrid workflows in edge cloud computing

    In recent years, Internet of Things (IoT) technology has been applied across a wide range of domains to provide real-time monitoring, tracking, and analysis services. The worldwide number of IoT-connected devices is projected to reach 43 billion by 2023, and IoT technologies are expected to be involved in 25% of the business sector. Latency-sensitive applications such as intelligent video surveillance, smart homes, autonomous vehicles, and augmented reality are all emergent research directions in industry and academia. These applications require connecting large numbers of sensing devices to attain the desired level of service quality for decision accuracy in a timely manner. Moreover, continuous data streams require processing large amounts of data, which adds a huge overhead on computing and network resources. Thus, latency-sensitive and resource-intensive applications introduce new challenges for the current computing models, i.e., batch and stream. In this thesis, we refer to the integrated application model of stream and batch applications as a hybrid workflow model. The main challenge of the hybrid model is achieving the quality-of-service (QoS) requirements of the two computation systems. This thesis provides a systematic and detailed model for hybrid workflows that describes the internal structure of each application type for the purposes of resource estimation, system tuning, and cost modeling. To optimize the execution of hybrid workflows, this thesis proposes algorithms, techniques, and frameworks for resource provisioning and task scheduling on various computing systems, including cloud, edge cloud, and cooperative edge cloud.
Overall, the experimental results in this thesis provide strong evidence for the proposed understanding and vision of integrating stream and batch applications, and show how edge computing and other emergent technologies, such as 5G networks and IoT, can contribute to more sophisticated and intelligent solutions in many areas of life for a safer, more secure, healthy, smart, and sustainable society.
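As an illustration only (the policy below is a hypothetical sketch, not an algorithm from the thesis), a latency-aware placement decision for hybrid tasks might look like:

```python
# Hypothetical placement policy for hybrid (stream + batch) tasks.
# All names and round-trip-time thresholds are illustrative defaults,
# not values taken from the thesis.
def place_task(kind, deadline_ms, edge_rtt_ms=5.0, cloud_rtt_ms=60.0):
    """Prefer the cheaper, more capable cloud unless the task's deadline
    forces edge execution; reject tasks even the edge cannot meet."""
    if kind == "stream" and deadline_ms < cloud_rtt_ms:
        # Cloud round trip would miss the deadline: try the edge.
        return "edge" if deadline_ms >= edge_rtt_ms else "reject"
    # Batch tasks and slack stream tasks go to the cloud.
    return "cloud"
```

The point of the sketch is the core trade-off the thesis formalizes: stream tasks with tight deadlines must run near the data source, while resource-intensive batch tasks tolerate cloud latency.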

    cISP: A Speed-of-Light Internet Service Provider

    Low latency is a requirement for a variety of interactive network applications. The Internet, however, is not optimized for latency. We thus explore the design of cost-effective wide-area networks that move data over paths very close to great-circle paths, at speeds very close to the speed of light in vacuum. Our cISP design augments the Internet's fiber with free-space wireless connectivity. cISP addresses the fundamental challenge of simultaneously providing low latency and scalable bandwidth, while accounting for numerous practical factors ranging from transmission tower availability to packet queuing. We show that instantiations of cISP across the contiguous United States and Europe would achieve mean latencies within 5% of that achievable using great-circle paths at the speed of light, over medium and long distances. Further, we estimate that the economic value from such networks would substantially exceed their expense.
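The latency bound cISP targets follows from simple geometry: propagation delay is great-circle distance divided by signal speed, and light in fiber travels roughly 1.5x slower than in vacuum. A quick sketch (city coordinates are approximate and chosen purely for illustration):

```python
import math

C_VACUUM = 299_792.458     # speed of light in vacuum, km/s
C_FIBER = C_VACUUM / 1.5   # fiber's refractive index (~1.5) slows light

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance via the haversine formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# New York -> Los Angeles (approximate coordinates)
d = great_circle_km(40.71, -74.01, 34.05, -118.24)   # ~3940 km
rtt_vacuum_ms = 2 * d / C_VACUUM * 1000  # round trip at c: ~26 ms
rtt_fiber_ms = 2 * d / C_FIBER * 1000    # lower bound over straight fiber
```

The gap between the two round-trip times, before even counting fiber's non-straight routes and queuing, is the budget a vacuum-speed free-space design can reclaim.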