179,814 research outputs found

    Filament L1482 in the California molecular cloud

    Aims. The process of gravitational fragmentation in the L1482 molecular filament of the California molecular cloud is studied by combining several complementary observations and physical estimates. We investigate the kinematic and dynamical states of this molecular filament and the physical properties of the several dozen dense molecular clumps embedded therein. Methods. We present and compare molecular line emission observations of the J=2--1 and J=3--2 transitions of 12CO in this molecular complex, obtained with the KOSMA 3-meter telescope. These observations are complemented with archival observations and analyses of the 13CO J=1--0 emission obtained with the Purple Mountain Observatory 13.7-meter radio telescope at Delingha Station in Qinghai Province, western China, as well as infrared emission maps from the Herschel Space Telescope online archive, obtained with the SPIRE and PACS cameras. Comparison of these complementary datasets allows a comprehensive multi-wavelength analysis of the L1482 molecular filament. Results. We have identified 23 clumps along the molecular filament L1482 in the California molecular cloud. All of these molecular clumps show supersonic non-thermal gas motions. Although the California molecular cloud is surprisingly similar in mass and size to the much better known Orion molecular cloud, its rate of high-mass star formation appears to be suppressed relative to that in the Orion molecular cloud, based on the mass-radius threshold derived from the static Bonnor-Ebert sphere. Our analysis suggests that these molecular filaments are thermally supercritical, and that the molecular clumps may form by gravitational fragmentation along the filament. Rather than being static, these molecular clumps are most likely undergoing dynamical evolution. Comment: 10 pages, 9 figures, 2 tables, accepted to Astronomy and Astrophysics
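    For context on the stability criteria invoked above, the relations below sketch the standard textbook quantities involved (isothermal sound speed, non-thermal velocity dispersion, Bonnor-Ebert critical mass, and critical line mass of an isothermal filament); they are illustrative and not necessarily the exact formulation adopted by the authors.

```latex
% Isothermal sound speed for mean molecular weight \mu:
\[ c_s = \sqrt{\frac{k_B T}{\mu m_{\rm H}}} \]

% Non-thermal velocity dispersion from an observed FWHM line width \Delta v of a
% tracer with molecular mass m_obs; motions are "supersonic" when \sigma_NT > c_s:
\[ \sigma_{\rm NT} = \sqrt{\frac{\Delta v^{2}}{8\ln 2} - \frac{k_B T}{m_{\rm obs}}} \]

% Critical mass of a pressure-bounded isothermal Bonnor-Ebert sphere of radius R_BE:
\[ M_{\rm BE} \simeq 2.4\,\frac{R_{\rm BE}\, c_s^{2}}{G} \]

% Critical line mass of an isothermal filament; the filament is thermally
% supercritical when its observed mass per unit length exceeds this value:
\[ \left(\frac{M}{L}\right)_{\rm crit} = \frac{2 c_s^{2}}{G}
   \approx 16\, M_{\odot}\,\mathrm{pc}^{-1}\,\left(\frac{T}{10\,\mathrm{K}}\right) \]
```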

    Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    In recent years, progress in technology and the increasing availability of fast connections have driven a migration of Information Technology services from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research problem: detecting the presence or absence of heart failure using nothing more than the electrocardiographic signal, in particular through the analysis of Heart Rate Variability. The results obtained are comparable with those reported in the literature in terms of accuracy and predictive power. Advantages and drawbacks of cloud versus static approaches are discussed in the final sections.
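    As a rough illustration of the kind of Heart Rate Variability features such an analysis typically starts from (the abstract does not specify the exact feature set used), a minimal Python sketch computing two standard time-domain measures from RR intervals might look like this; the sample interval values are hypothetical.

```python
import math

def hrv_time_domain(rr_ms):
    """Compute two standard time-domain HRV features from RR intervals (ms).

    SDNN  : standard deviation of all RR (NN) intervals.
    RMSSD : root mean square of successive RR-interval differences.
    These are common HRV measures; the exact feature set used in the study
    is not specified in the abstract.
    """
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (n - 1))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd}

# Example with a short, hypothetical RR-interval series (milliseconds)
print(hrv_time_domain([812, 790, 805, 830, 799, 785, 820, 808]))
```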

    Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, cloud computing is at a disadvantage compared to traditional hosting when it comes to providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate how effectively organizations can execute their workflows in traditional and cloud computing architectures. The two-part modeling framework characterizes both demand, using a probability distribution for each type of service request, and enterprise computing resource constraints. Our simulations provide quantitative analysis for designing and provisioning computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and our findings on the appropriateness of cloud computing for various applications.
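    To make that modeling style concrete (this is an illustrative sketch, not the authors' model), the following minimal heapq-based discrete event simulation has Poisson arrivals of several hypothetical request types competing for a fixed pool of servers; all rates, service-time distributions, and the capacity are made-up parameters.

```python
import heapq
import random

def simulate(arrival_rate, service_means, capacity, horizon, seed=1):
    """Minimal discrete event simulation: Poisson arrivals of several request
    types competing for `capacity` identical servers with a FIFO queue.

    arrival_rate  : total arrivals per time unit (Poisson process)
    service_means : {request_type: mean service time} (exponential service)
    capacity      : number of servers (the resource constraint)
    horizon       : simulated time span
    Returns the mean waiting time across started requests.
    """
    rng = random.Random(seed)
    types, weights = zip(*[(k, 1.0) for k in service_means])  # equal mix
    events = []       # (time, kind, payload) min-heap
    queue = []        # waiting requests: (arrival_time, request_type)
    busy = 0
    waits = []

    heapq.heappush(events, (rng.expovariate(arrival_rate), "arrival", None))
    while events:
        t, kind, _ = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            req_type = rng.choices(types, weights)[0]
            queue.append((t, req_type))
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival", None))
        else:  # a departure frees one server
            busy -= 1
        while busy < capacity and queue:
            arrived, req_type = queue.pop(0)
            waits.append(t - arrived)
            busy += 1
            service = rng.expovariate(1.0 / service_means[req_type])
            heapq.heappush(events, (t + service, "departure", None))
    return sum(waits) / len(waits) if waits else 0.0

# Hypothetical workload: web, batch, and database requests on 4 servers
print(simulate(arrival_rate=2.0,
               service_means={"web": 0.5, "batch": 3.0, "db": 1.0},
               capacity=4, horizon=10_000))
```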

    Constraining the Environment of CH+ Formation with CH3+ Observations

    The formation of CH+ in the interstellar medium has long been an outstanding problem in chemical models. In order to probe the physical conditions of the ISM in which CH+ forms, we propose the use of CH3+ observations. The pathway to forming CH3+ begins with CH+, and a steady-state analysis of CH3+ and the reaction intermediary CH2+ results in a relationship between the CH+ and CH3+ abundances. This relationship depends on the molecular hydrogen fraction, f_H2, and the gas temperature, T, so observations of CH+ and CH3+ can be used to infer the properties of the gas in which both species reside. We present observations of both molecules along the diffuse cloud sight line toward Cyg OB2 No. 12. Using our computed column densities and upper limits, we place constraints on the f_H2 vs. T parameter space in which CH+ and CH3+ form. We find that average, static, diffuse molecular cloud conditions (i.e., f_H2 > 0.2, T ~ 60 K) are excluded by our analysis. However, current theory suggests that non-equilibrium effects drive the reaction C+ + H_2 --> CH+ + H, which is endothermic by 4640 K. If we consider a higher effective temperature due to collisions between neutrals and accelerated ions, the CH3+ partition function predicts that the overall population will be spread out over several excited rotational levels. As a result, observations of more CH3+ transitions at higher signal-to-noise ratios are necessary to place any constraints on models in which magnetic acceleration of ions drives the formation of CH+. Comment: 7 pages, 3 figures, 2 tables, accepted for publication in Ap
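    For readers who want the shape of the steady-state argument, the sketch below follows the standard hydrogen-abstraction and dissociative-recombination channels; it is schematic, neglects secondary destruction routes, and is not necessarily the authors' exact derivation.

```latex
% Schematic reaction chain (hydrogen abstraction followed by dissociative
% recombination); secondary destruction channels are neglected:
%   CH+  + H2 -> CH2+ + H      (k_1)
%   CH2+ + H2 -> CH3+ + H      (k_2)
%   CH2+ + e- -> products      (k_{e2})
%   CH3+ + e- -> products      (k_{e3})

% Steady state for the intermediary CH2+ and for CH3+:
\[ k_1\, n(\mathrm{CH^+})\, n(\mathrm{H_2})
   = \bigl[\, k_2\, n(\mathrm{H_2}) + k_{e2}\, n(e) \,\bigr]\, n(\mathrm{CH_2^+}),
   \qquad
   k_2\, n(\mathrm{CH_2^+})\, n(\mathrm{H_2}) = k_{e3}\, n(e)\, n(\mathrm{CH_3^+}) \]

% Eliminating n(CH2+) gives the abundance ratio:
\[ \frac{n(\mathrm{CH_3^+})}{n(\mathrm{CH^+})}
   = \frac{k_1 k_2\, n(\mathrm{H_2})^{2}}
          {k_{e3}\, n(e)\,\bigl[\, k_2\, n(\mathrm{H_2}) + k_{e2}\, n(e) \,\bigr]} \]

% With n(H2) = (f_H2 / 2) n_H and temperature-dependent rate coefficients k_i(T),
% the observed CH3+/CH+ column density ratio constrains f_H2 and T.
```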

    A large-scale study on the security vulnerabilities of cloud deployments

    As cloud deployments are becoming ubiquitous, the rapid adoption of this new paradigm may bring additional cyber security issues. It is crucial that practitioners and researchers pose questions about the current state of cloud deployment security. By better understanding existing vulnerabilities, progress towards a more secure cloud can be accelerated. This is of paramount importance, especially as more and more critical infrastructures move to the cloud, where the consequences of a security incident can be significantly broader. This study presents a data-centric approach to security research: by using three static code analysis tools and scraping the internet for publicly available codebases, a footprint of the current state of open-source infrastructure-as-code repositories can be obtained. Of the 44,485 repository links scraped, the study concentrates on 8,256 repositories from the same cloud provider, across which 292,538 security violations have been collected. Our contributions consist of: an understanding of the existing security vulnerabilities of cloud deployments, a list of Top Guidelines for practitioners to follow to deploy systems securely in the cloud, and the raw data for further studies.
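    As an illustration of the aggregation step such a data-centric study implies (not the authors' actual pipeline), the sketch below tallies rule violations from per-repository scan results exported as JSON; the directory name and the JSON field layout are hypothetical, since real static-analysis tools each have their own output schemas.

```python
import json
from collections import Counter
from pathlib import Path

def tally_violations(results_dir):
    """Aggregate security violations across per-repository scan results.

    Assumes each scanned repository produced a JSON file shaped like
    {"repo": "...", "findings": [{"rule_id": "...", "severity": "..."}]}.
    This layout is hypothetical; tools such as checkov or tfsec each emit
    their own schemas, which would need to be mapped onto it.
    """
    by_rule, by_severity = Counter(), Counter()
    repos = 0
    for path in Path(results_dir).glob("*.json"):
        report = json.loads(path.read_text())
        repos += 1
        for finding in report.get("findings", []):
            by_rule[finding["rule_id"]] += 1
            by_severity[finding.get("severity", "unknown")] += 1
    return repos, by_rule, by_severity

if __name__ == "__main__":
    repos, by_rule, by_severity = tally_violations("scan-results")
    print(f"{repos} repositories, {sum(by_rule.values())} violations")
    for rule, count in by_rule.most_common(10):
        print(f"{count:8d}  {rule}")
```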

    A Penny a Function: Towards Cost Transparent Cloud Programming

    Understanding and managing monetary cost factors is crucial when developing cloud applications. However, the diverse range of factors influencing costs for computation, storage, and networking in cloud applications poses a challenge for developers who want to manage and minimize costs proactively. Existing tools for understanding cost factors are often detached from source code, causing opaqueness regarding the origin of costs. Moreover, existing cost models for cloud applications focus on specific factors such as compute resources and necessitate manual effort to create the models. This paper presents initial work toward a cost model based on a directed graph that allows monetary cost estimations to be derived directly from code using static analysis. Leveraging the cost model, we explore visualizations embedded in a code editor that display costs close to the code causing them. This makes cost exploration an integrated part of the developer experience, thereby removing the overhead of external tooling for cost estimation of cloud applications at development time. Comment: Proceedings of the 2nd ACM SIGPLAN International Workshop on Programming Abstractions and Interactive Notations, Tools, and Environments (PAINT 2023), 10 pages, 5 figures
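    To make the directed-graph idea concrete, here is a minimal illustrative sketch rather than the paper's model: nodes represent cloud operations annotated with a unit price and an estimated invocation count, edges follow the call structure recovered by static analysis, and the estimate for an entry point is the sum over everything reachable from it. All names and prices below are made up.

```python
from collections import defaultdict

class CostGraph:
    """Toy directed-graph cost model: nodes are operations carrying a unit
    price and an estimated invocation count; edges follow the call/data-flow
    structure. The estimate for an entry point sums over all reachable nodes.
    """
    def __init__(self):
        self.nodes = {}                 # name -> (unit_price, est_calls)
        self.edges = defaultdict(set)   # name -> set of callee names

    def add_node(self, name, unit_price, est_calls):
        self.nodes[name] = (unit_price, est_calls)

    def add_edge(self, caller, callee):
        self.edges[caller].add(callee)

    def estimate(self, entry):
        """Monetary cost of `entry` plus everything reachable from it."""
        seen, stack, total = set(), [entry], 0.0
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            price, calls = self.nodes[node]
            total += price * calls
            stack.extend(self.edges[node])
        return total

# Hypothetical serverless handler: one function call, one DB write, one upload
g = CostGraph()
g.add_node("handler",  unit_price=0.0000002,  est_calls=1_000_000)
g.add_node("db.write", unit_price=0.00000125, est_calls=1_000_000)
g.add_node("blob.put", unit_price=0.000005,   est_calls=200_000)
g.add_edge("handler", "db.write")
g.add_edge("handler", "blob.put")
print(f"estimated monthly cost: ${g.estimate('handler'):.2f}")
```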