
    GRIDA3—a shared resources manager for environmental data analysis and applications

    GRIDA3 (Shared Resources Manager for Environmental Data Analysis and Applications) is a multidisciplinary project designed to deliver an integrated system for tackling environmental challenges such as the constant increase of polluted sites, the sustainability of natural resource usage, and the forecasting of extreme meteorological events. The GRIDA3 portal is mainly based on Web 2.0 technologies and the EnginFrame framework. The portal, now at an advanced stage of development, provides end-users with intuitive web interfaces and tools that simplify job submission to the underlying computing resources. The framework manages user authentication and authorization, controls action and job execution in the grid computing environment, collects the results, and transforms them into a useful format on the client side. The GRIDA3 portal framework will provide a problem-solving platform allowing, through appropriate access policies, the integration and sharing of skills, resources and tools located at multiple sites across federated domains.
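    The authenticate-authorize-execute-transform flow described above can be sketched as follows. This is a minimal illustration, not the GRIDA3 or EnginFrame API: all class, function and role names are hypothetical.

```python
# Hypothetical sketch of a portal job-submission flow: check the user's
# authorization under an access policy, run the job on a compute resource,
# then wrap the raw result in a client-friendly format.

class PortalSession:
    def __init__(self, user, roles):
        self.user = user
        self.roles = set(roles)

    def authorized(self, action):
        # Access policy: map each action to the roles allowed to perform it.
        policy = {"submit_job": {"researcher", "admin"}}
        return bool(self.roles & policy.get(action, set()))

def run_job(session, job, executor):
    if not session.authorized("submit_job"):
        raise PermissionError(f"{session.user} may not submit jobs")
    raw = executor(job)  # stands in for execution on the grid resources
    return {"job": job, "status": "done", "result": raw}  # client-side format

session = PortalSession("alice", ["researcher"])
out = run_job(session, "pollution-model", lambda j: f"output of {j}")
print(out["status"])  # -> done
```

    The same pattern generalises to any action the portal exposes: the policy table is the single place where federated access rules would be declared.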

    Incorporating Agile with MDA Case Study: Online Polling System

    Nowadays agile software development is widely used, but mostly in small organizations, whereas MDA suits large organizations but is not yet standardized. This paper discusses the pros and cons of Model Driven Architecture (MDA) and Extreme Programming. As both have limitations and neither can serve large-scale and small-scale organizations alike, a new architecture is proposed. The model attempts to adopt the advantages and important values of both development approaches while overcoming their limitations. In support of the proposed architecture, its implementation on an Online Polling System is discussed, and all the phases of software development are explained. (Comment: 14 pages, 1 figure, 1 table)

    Integrating climate change impacts to improve understanding of coastal climate change: heavy rains, strong winds, and high seas in coastal Hawaii, Alaska and the Pacific Northwest

    Coastal storms and the strong winds, heavy rains, and high seas that accompany them pose a serious threat to the lives and livelihoods of the peoples of the Pacific basin, from the tropics to the high latitudes. To reduce their vulnerability to the economic, social, and environmental risks associated with these phenomena (and correspondingly enhance their resiliency), decision-makers in coastal communities require timely access to accurate information that affords them an opportunity to plan and respond accordingly. This includes information about the potential for coastal flooding, inundation and erosion at time scales ranging from hours to years, as well as the long-term climatological context of this information. The Pacific Storms Climatology Project (PSCP) was formed in 2006 with the intent of improving scientific understanding of patterns and trends of storm frequency and intensity - "storminess" - and the related impacts of these extreme events. The project is currently developing a suite of integrated information products that can be used by emergency managers, mitigation planners, government agencies and decision-makers in key sectors, including: water and natural resource management, agriculture and fisheries, transportation and communication, and recreation and tourism. The PSCP is exploring how the climate-related processes that govern extreme storm events are expressed within and between three primary thematic areas: heavy rains, strong winds, and high seas. To address these thematic areas, PSCP has focused on developing analyses of historical climate records collected throughout the Pacific region, and on integrating these climatological analyses with near-real-time observations to put recent weather and climate events into a longer-term perspective. (PDF contains 4 pages)

    Development and formative evaluation of the e-Health implementation toolkit

    Background: The use of Information and Communication Technology (ICT) or e-Health is seen as essential for a modern, cost-effective health service. However, there are well-documented problems with the implementation of e-Health initiatives, despite the existence of a great deal of research into how best to implement e-Health (an example of the gap between research and practice). This paper reports on the development and formative evaluation of an e-Health Implementation Toolkit (e-HIT), which aims to summarise and synthesise new and existing research on the implementation of e-Health initiatives and present it to senior managers in a user-friendly format. Results: The content of the e-HIT was derived by combining data from a systematic review of reviews of barriers and facilitators to the implementation of e-Health initiatives with qualitative data from interviews with "implementers", that is, people who had been charged with implementing an e-Health initiative. These data were summarised, synthesised and combined with the constructs from the Normalisation Process Model. The software for the toolkit was developed by a commercial company (RocketScience). Formative evaluation was undertaken by obtaining user feedback. There are three components to the toolkit: a section on background and instructions for use aimed at novice users; the toolkit itself; and the report generated by completing the toolkit. It is available to download from http://www.ucl.ac.uk/pcph/research/ehealth/documents/e-HIT.xls Conclusions: The e-HIT shows potential as a tool for enhancing future e-Health implementations. Further work is needed to make it fully web-enabled and to determine its predictive potential for future implementations.
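    A questionnaire-then-report structure like the e-HIT's can be sketched as below. This is purely illustrative and is not the e-HIT itself; the 0-5 rating scale and the report layout are assumptions, though the four construct names come from the Normalisation Process Model.

```python
# Illustrative sketch: rate an e-Health initiative against the four
# Normalisation Process Model constructs and generate a short report
# flagging the weakest area. Scale and report format are hypothetical.
CONSTRUCTS = ["interactional workability", "relational integration",
              "skill-set workability", "contextual integration"]

def report(scores):
    # scores: construct name -> 0..5 rating from the implementer's answers
    lines = [f"{c}: {scores[c]}/5" for c in CONSTRUCTS]
    weakest = min(CONSTRUCTS, key=lambda c: scores[c])
    lines.append(f"Focus area: {weakest}")
    return "\n".join(lines)

print(report({"interactional workability": 4, "relational integration": 2,
              "skill-set workability": 3, "contextual integration": 5}))
```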

    Higher-Order Process Modeling: Product-Lining, Variability Modeling and Beyond

    We present a graphical and dynamic framework for binding and execution of (business) process models. It is tailored to integrate 1) ad hoc processes modeled graphically, 2) third-party services discovered in the (Inter)net, and 3) (dynamically) synthesized process chains that solve situation-specific tasks, with the synthesis taking place not only at design time but also at runtime. Key to our approach is the introduction of type-safe stacked second-order execution contexts that allow for higher-order process modeling. Tamed by our underlying strict service-oriented notion of abstraction, this approach is also tailored to be used by application experts with little technical knowledge: users can select, modify, construct and then pass (component) processes during process execution as if they were data. We illustrate the impact and essence of our framework along a concrete, realistic (business) process modeling scenario: the development of Springer's browser-based Online Conference Service (OCS). The most advanced feature of our new framework allows one to combine online synthesis with the integration of the synthesized process into the running application. This ability leads to a particularly flexible way of implementing self-adaptation, and to a particularly concise and powerful way of achieving variability not only at design time but also at runtime. (Comment: In Proceedings Festschrift for Dave Schmidt, arXiv:1309.455)
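    The core higher-order idea, processes passed around as if they were data, can be illustrated with plain higher-order functions. This is a minimal sketch, not the framework's notation or API; all names are made up.

```python
# Minimal sketch of higher-order process modeling: (component) processes
# are first-class values that can be synthesized into chains and passed
# to other processes at runtime, exactly like data.
from functools import reduce

def chain(*steps):
    """Synthesize a process chain: a new process that pipes each
    step's output into the next step."""
    return lambda x: reduce(lambda acc, step: step(acc), steps, x)

register = chain(str.strip, str.lower)         # ad hoc component process
confirm  = lambda name: f"registered: {name}"  # third-party service stub

# The chain itself is the argument -- a second-order execution context.
def run(process, payload):
    return process(payload)

print(run(chain(register, confirm), "  Alice  "))  # -> registered: alice
```

    Because `chain` returns an ordinary callable, a chain synthesized at runtime can immediately be integrated into the running application, which is the flexibility the abstract describes.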

    A cloud-based tool for sentiment analysis in reviews about restaurants on TripAdvisor

    The tourism industry has been promoting its products and services based on the reviews that people write on travel websites like TripAdvisor.com, Booking.com and similar platforms. These reviews have a profound effect on decision making when evaluating which places to visit, which restaurants to book, and so on. This contribution presents a cloud-based software tool for the massive analysis of this social media data (TripAdvisor.com). The main characteristics of the tool are: i) the ability to aggregate data obtained from social media; ii) the possibility of carrying out combined analyses of both people and comments; iii) the ability to detect the sentiment (positive, negative or neutral) of the comments, quantifying the degree to which they are positive or negative, as well as predicting behaviour patterns from this information; and iv) the ease of doing everything in the same application (data download, pre-processing, analysis and visualisation). As a test and validation case, more than 33,500 reviews written in English on restaurants in the Province of Granada (Spain) were analysed.
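    The sentiment-detection step (point iii) can be sketched with a toy lexicon-based classifier. The paper's actual method is not specified here; the word lists and scoring rule below are purely illustrative.

```python
# Toy lexicon-based sentiment sketch: label a review positive, negative
# or neutral, and quantify the degree via a simple word-count score.
POSITIVE = {"great", "delicious", "friendly", "excellent"}
NEGATIVE = {"slow", "cold", "rude", "awful"}

def polarity(review):
    words = review.lower().split()
    # Degree: positive hits minus negative hits.
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return label, score

print(polarity("Great food but awful slow service"))  # -> ('negative', -1)
```

    A production tool would replace the hand-written lexicon with a trained model, but the output shape (label plus degree) matches what the abstract describes.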

    Spatial groundings for meaningful symbols

    The increasing availability of ontologies raises the need to establish relationships and make inferences across heterogeneous knowledge models. The approach proposed and supported by knowledge representation standards consists in establishing formal symbolic descriptions of a conceptualisation which, it has been argued, lack grounding and are not expressive enough to identify relations across separate ontologies. Ontology mapping approaches address this issue by exploiting structural or linguistic similarities between symbolic entities, which is costly, error-prone and, in most cases, lacks cognitive soundness. We argue that knowledge representation paradigms should better support similarity, and we propose two distinct approaches to achieve it. We first present a representational approach that grounds symbolic ontologies in Conceptual Spaces (CS), allowing automated computation of similarities between instances across ontologies. An alternative approach is then presented, which considers symbolic entities as contextual interpretations of processes in spacetime, or Differences. By becoming a process of interpretation, symbols acquire the same status as other processes in the world and can be described (tagged) as well, which allows the bottom-up production of meaning.
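    The Conceptual Spaces idea of computing similarity geometrically can be sketched as below. The quality dimensions and coordinate values are invented for illustration and are not taken from the paper.

```python
# Sketch of grounding symbols in a Conceptual Space: instances become
# points on shared quality dimensions, so cross-ontology similarity is
# just a function of geometric distance.
import math

def similarity(a, b):
    """Similarity decays with Euclidean distance in the quality space."""
    return 1.0 / (1.0 + math.dist(a, b))

# Two ontologies describe fruit as (sweetness, acidity, size) points.
apple_ont1 = (0.6, 0.4, 0.5)
apple_ont2 = (0.65, 0.38, 0.52)
lemon_ont2 = (0.1, 0.9, 0.3)

# The two "apple" instances align across ontologies without any
# structural or linguistic matching of their symbolic labels.
assert similarity(apple_ont1, apple_ont2) > similarity(apple_ont1, lemon_ont2)
```

    This is the payoff of the grounding: the comparison needs no shared vocabulary, only shared quality dimensions.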

    Development of an Extended Product Lifecycle Management through Service Oriented Architecture.

    Organised by: Cranfield University. The aim of this work is to define new business opportunities through the concept of Extended Product Lifecycle Management (ExtPLM), analysing its potential implementation within a Service Oriented Architecture. ExtPLM merges the concepts of Extended Product, Avatar and PLM. It aims at allowing a closer interaction between enterprises and their customers, who are integrated into all phases of the life cycle, creating new technical functionalities and services and improving both the practical side (e.g. improving usage, improving safety, allowing predictive maintenance) and the emotional side (e.g. extreme customization) of the product. Mori Seiki – The Machine Tool Company; BAE Systems; S4T – Support Service Solutions: Strategy and Transition.