257 research outputs found

    Planning and monitoring the execution of web service requests

    Interaction with web-service-enabled marketplaces would be greatly facilitated if users were given a high-level service request language to express their goals in complex business domains. This could be achieved by using a planning framework which monitors the execution of planned goals against predefined standard business processes and interacts with the user to achieve goal satisfaction. We present a planning architecture that accepts high-level requests, expressed in XSRL (XML Service Request Language). The planning framework is based on the principle of interleaving planning and execution. This is accomplished on the basis of refinement and revision as new service-related information is gathered from UDDI and web service instances, and as execution circumstances necessitate change. The system interacts with the user whenever confirmation or verification is needed.
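
    The interleaving of planning and execution described in this abstract can be pictured as a loop that executes one step at a time, binds services lazily from a registry lookup, and re-plans the remaining goal when execution circumstances change. The following minimal Python sketch illustrates that loop under stated assumptions; all names (Step, lookup_services, execute) are hypothetical illustrations, not the XSRL planner's actual API.

```python
# Minimal sketch of interleaved planning and execution with plan revision.
# Hypothetical names throughout; not the paper's implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Step:
    action: str
    service: Optional[str] = None  # bound lazily from registry lookups

def plan(goal, known_services):
    """Build a coarse plan; steps without a bound service are refined later."""
    return [Step(a, known_services.get(a)) for a in goal]

def lookup_services(action):
    """Stand-in for a UDDI-style registry query returning a concrete service."""
    return f"http://example.org/services/{action}"

def execute(step):
    """Stand-in for invoking a web service; a failure would trigger revision."""
    print(f"executing {step.action} via {step.service}")
    return True

def run(goal):
    registry = {}
    steps = plan(goal, registry)
    while steps:
        step = steps.pop(0)
        if step.service is None:                 # refinement: gather info just in time
            step.service = lookup_services(step.action)
            registry[step.action] = step.service
        if not execute(step):                    # revision: re-plan the remaining goal
            steps = plan([s.action for s in [step] + steps], registry)

run(["find_supplier", "place_order", "arrange_shipping"])
```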

    Unraveling the evolution of hot Jupiter systems under the effect of tidal and magnetic interactions and mass loss

    Various interactions affect the population of close-in planets. Among them, tidal and magnetic interactions drive orbital decay and star-planet angular momentum exchange, leading to stellar spin-up. As a result of these processes, a planet may initiate mass transfer to the host star once it encounters the Roche limit. Another mechanism providing substantial mass loss is atmospheric escape caused by photoevaporation, followed by orbital expansion, which is thought to be important for hot Neptunes and super-Earths. Thus, a fraction of the initial hot Jupiter population may transform into lower-mass planets through the Roche-lobe overflow (RLO) phase and continue secular evolution under the effect of photoevaporation. In the present paper, we compile the latest prescriptions for tidal and magnetic migration and mass-loss rates to explore the dynamics of hot Jupiter systems. We study how the implemented interactions shape the orbital architecture of Jovian planets and whether their impact is enough to reproduce the observational sample. Our models suggest that tidal interaction is able to generate the upper boundary of the hot Jupiter population in the mass-separation diagram. To recreate the sub-Jovian desert, we need to make additional assumptions regarding the RLO phase or the influence of the protoplanetary disc's inner edge on the initial planetary location. According to our estimates, 12-15% of hot Jupiters around solar-mass stars have been engulfed or have become lower-mass planets, and 0.20-0.25% of the present-day giant planet population undergoes decay intense enough to be detected with modern facilities.
    Comment: 18 pages, 13 figures. Submitted to MNRAS
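
    For context, the two ingredients driving this evolution are commonly written in the following textbook forms (a hedged illustration; the prescriptions compiled in the paper may differ): the Roche limit for a fluid planet and the orbital decay rate from the equilibrium-tide, constant-Q' model.

```latex
% Commonly quoted forms (Ford & Rasio-style Roche limit; constant-Q' tidal decay).
% Illustrative only, not necessarily the prescriptions adopted in the paper.
\[
  a_{\mathrm{Roche}} \simeq 2.16\, R_{\mathrm{p}}
    \left( \frac{M_\star}{M_{\mathrm{p}}} \right)^{1/3},
  \qquad
  \frac{\mathrm{d}a}{\mathrm{d}t} \simeq
    -\frac{9}{2}\,\sqrt{\frac{G}{M_\star}}\;
    \frac{R_\star^{5}\, M_{\mathrm{p}}}{Q'_\star}\; a^{-11/2}
\]
```

    In this picture, orbital decay proceeds until the shrinking semi-major axis reaches roughly the Roche limit, at which point Roche-lobe overflow and mass transfer to the star can begin.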

    Adaptive On-the-Fly Changes in Distributed Processing Pipelines

    Distributed data processing systems have become the standard means for big data analytics. These systems are based on processing pipelines where operations on data are performed in a chain of consecutive steps. Normally, the operations performed by these pipelines are set at design time, and any changes to their functionality require the applications to be restarted. This is not always acceptable, for example, when we cannot afford downtime or when a long-running calculation would lose significant progress. The introduction of variation points to distributed processing pipelines allows for on-the-fly updating of individual analysis steps. In this paper, we extend such basic variation point functionality to provide fully automated reconfiguration of the processing steps within a running pipeline through an automated planner. We enable pipeline modeling through constraints. Based on these constraints, we not only ensure that configurations are type-compatible but also verify that the expected pipeline functionality is achieved. Furthermore, automating the reconfiguration process simplifies its use, in turn allowing users with less development experience to make changes. The system can automatically generate and validate pipeline configurations that achieve a specified goal, selecting from operation definitions available at planning time. It then automatically integrates these configurations into the running pipeline. We verify the system by testing a proof-of-concept implementation. The proof of concept also shows promising results when reconfiguration is performed frequently.
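
    To make the type-compatibility check behind variation points concrete, the sketch below (illustrative names, not the paper's implementation) validates a candidate replacement step against a running pipeline before it is swapped in.

```python
# Hedged sketch of a type-compatibility check for pipeline reconfiguration:
# before a step is swapped into a running pipeline, verify that input/output
# types still chain correctly. Names are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class StepDef:
    name: str
    in_type: str
    out_type: str

def compatible(pipeline):
    """A pipeline is valid if every step consumes what its predecessor produces."""
    return all(a.out_type == b.in_type for a, b in zip(pipeline, pipeline[1:]))

def reconfigure(pipeline, index, candidate):
    """Return the updated pipeline only if the candidate keeps it well-typed."""
    updated = pipeline[:index] + [candidate] + pipeline[index + 1:]
    if not compatible(updated):
        raise ValueError(f"{candidate.name} breaks type compatibility at position {index}")
    return updated

pipeline = [
    StepDef("ingest", "raw", "records"),
    StepDef("clean", "records", "records"),
    StepDef("report", "records", "summary"),
]
# Swap the cleaning step on the fly for an anonymising variant.
pipeline = reconfigure(pipeline, 1, StepDef("anonymise", "records", "records"))
print([s.name for s in pipeline])
```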

    ECiDA: Evolutionary Changes in Data Analysis

    Modern data analysis platforms all too often rely on the fact that the application and underlying data flow are static. That is, such platforms generally do not implement the capabilities to update individual components of running pipelines without restarting the pipeline, and they rely on data sources to remain unchanged while they are being used. However, in reality these assumptions do not hold: data scientists come up with new methods to analyze data all the time, and data sources are almost by definition dynamic. Companies performing data science analyses either need to accept the fact that their pipeline goes down during an update, or they should run a duplicate setup of their often costly infrastructure that continues the pipeline operations.

    In this research we present the Evolutionary Changes in Data Analysis (ECiDA) platform, with which we show how evolution and data science can go hand in hand. ECiDA aims to bridge the gap that is present between engineers that build large-scale computation platforms on the one hand, and data scientists that perform analyses on large quantities of data on the other, while making change a first-class citizen. ECiDA allows data scientists to build their data science pipelines on scalable infrastructures, and make changes to them while they remain up and running. Such changes can range from parameter changes in individual pipeline components to general changes in network topology. Changes may also be initiated by an ECiDA pipeline itself as part of a diagnostic response: for instance, it may dynamically replace a data source that has become unavailable with one that is available. To make sure the platform remains in a consistent state while performing these updates, ECiDA uses a set of automatic formal verification methods, such as constraint programming and AI planning, to transparently check the validity of updates and prevent undesired behavior.
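
    As a rough illustration of the constraint-based validity check mentioned above (hypothetical names and constraints; not ECiDA's actual code), the sketch below enumerates candidate replacements for two variation points in a running pipeline and keeps only the configurations that satisfy the pipeline's constraints, for instance when an unavailable data source must be replaced.

```python
# Hedged sketch: validate a pipeline update before it is applied by enumerating
# candidate component choices and keeping only constraint-satisfying ones.
# Component names and constraints are illustrative only.
from itertools import product

# Candidate values for two variation points in a running pipeline.
candidates = {
    "source": ["sensor_feed_a", "sensor_feed_b", "archive_dump"],
    "cleaner": ["dedup_v1", "dedup_v2"],
}

# Constraints the chosen configuration must satisfy (availability, pairing rules).
available = {"sensor_feed_b", "archive_dump", "dedup_v1", "dedup_v2"}
constraints = [
    lambda cfg: cfg["source"] in available,           # replace the unreachable feed
    lambda cfg: cfg["cleaner"] in available,
    lambda cfg: not (cfg["source"] == "archive_dump"  # archived data needs the newer cleaner
                     and cfg["cleaner"] == "dedup_v1"),
]

def valid_configurations():
    names = list(candidates)
    for combo in product(*(candidates[n] for n in names)):
        cfg = dict(zip(names, combo))
        if all(check(cfg) for check in constraints):
            yield cfg

# The platform would pick one valid configuration and apply it without a restart.
print(next(valid_configurations()))
```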

    Workshop for e-Government via Software Services (WeGovS2 2009)
