
    SMIL State: an architecture and implementation for adaptive time-based web applications

    In this paper we examine adaptive time-based web applications (or presentations). These are interactive presentations where time dictates which parts of the application are presented (providing the major structuring paradigm), and that require interactivity and other dynamic adaptation. We investigate the current technologies available to create such presentations and their shortcomings, and suggest a mechanism for addressing these shortcomings. This mechanism, SMIL State, can be used to add user-defined state to declarative time-based languages such as SMIL or SVG animation, thereby enabling the author to create control flows that are difficult to realize within the temporal containment model of the host languages. In addition, SMIL State can be used as a bridging mechanism between languages, enabling easy integration of external components into the web application. Finally, SMIL State enables richer expressions for content control. This paper defines SMIL State in terms of an introductory example, followed by a detailed specification of the State model. Next, the implementation of this model is discussed. We conclude with a set of potential use cases, including dynamic content adaptation and delayed insertion of custom content such as advertisements. © 2009 Springer Science+Business Media, LLC
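The control-flow idea can be sketched outside SMIL itself. The following Python toy is an illustration only, not SMIL State syntax (which uses XPath expressions over an XML data model): an author-defined state variable guards which content plays next, something a purely temporal containment model cannot express directly.

```python
# Illustrative sketch of the SMIL State idea: user-defined state drives
# content control in a time-based presentation. Names and the "delayed
# advertisement" scenario are hypothetical examples from the use cases.

class PresentationState:
    """Holds author-defined state variables, as SMIL State adds to SMIL/SVG."""
    def __init__(self):
        self.vars = {}

    def set(self, name, value):
        self.vars[name] = value

    def test(self, expression):
        # SMIL State evaluates XPath expressions; a Python eval over the
        # variables stands in for that here.
        return eval(expression, {}, dict(self.vars))

state = PresentationState()
state.set("adsWatched", 0)

def next_clip(state):
    # Content control: play the advertisement once, then the feature.
    if state.test("adsWatched < 1"):
        state.set("adsWatched", state.vars["adsWatched"] + 1)
        return "advertisement.mp4"
    return "feature.mp4"
```

The same pattern covers the bridging use case: an external component can set a state variable that the declarative presentation then tests.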

    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, for each reference an author-supplied abstract, a number of keywords, and a classification are provided. In some cases our own comments are added; the purpose of these comments is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the review, classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstract, classification and comments.
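As a reminder of the construct the survey covers: a decision table maps combinations of condition outcomes to actions, so every case is enumerated explicitly. A minimal Python sketch, with hypothetical conditions and values:

```python
# A minimal decision table of the kind surveyed in the report: each
# combination of condition outcomes maps to one action. The discount
# scenario and all values are illustrative, not from the report.

# Conditions: (customer_is_member, order_total > 100)
DISCOUNT_TABLE = {
    (True,  True):  0.15,   # member, large order
    (True,  False): 0.05,   # member, small order
    (False, True):  0.10,   # non-member, large order
    (False, False): 0.00,   # non-member, small order
}

def discount(is_member, order_total):
    # Look up the action for this combination of condition outcomes.
    return DISCOUNT_TABLE[(is_member, order_total > 100)]
```

Because the table enumerates all condition combinations, completeness and consistency can be checked mechanically, which is one reason decision tables appear in both practical and theoretical literature.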

    Systems integration with DLSI

    Systems integration aims to provide a homogeneous view of a system consisting of many heterogeneous systems. It can be achieved by many means, of which the middleware approach is the most popular, because middleware can be built around existing systems, reducing the amount of change needed for integration. Nowadays many middleware technologies are available to integrate heterogeneous systems, and one can choose among them depending upon the requirements. But because so many middleware technologies are available, a muddle of middleware exists. To reduce this muddle and to integrate systems already implemented using different middleware technologies, Web Services play a key role. Web Services is an XML-based technology which uses known protocols to communicate between applications, making them platform and technology independent. Digital Library Service Integration (DLSI) provides a systematic approach to integrating digital library collections and services. Amazon's website, www.amazon.com, is integrated with DLSI such that users see the same web pages as when they log in normally, but augmented with link anchors. Wrappers parse the web pages to look for elements of interest, such as book title and author name, and provide them with link anchors that connect the user to the library at the New Jersey Institute of Technology, where he or she can search for books or request them through interlibrary loan. The thesis proposes a new architecture for DLSI as a web service which can provide this functionality.
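The wrapper idea described above can be sketched in a few lines. This is a hedged illustration, not DLSI's actual implementation: the markup pattern (`<span class="title">`) and the catalog URL are assumptions invented for the example.

```python
import re

# Sketch of a DLSI-style wrapper: scan a web page for elements of
# interest (here, book titles) and wrap them in link anchors pointing at
# a library catalog. Both the markup pattern matched and the catalog
# endpoint are hypothetical.

CATALOG_URL = "https://library.example.edu/search?q="  # hypothetical endpoint

def add_link_anchors(html):
    # Assume titles are marked up as <span class="title">...</span>.
    def wrap(match):
        title = match.group(1)
        return ('<a href="%s%s">%s</a>'
                % (CATALOG_URL, title.replace(" ", "+"), match.group(0)))
    return re.sub(r'<span class="title">([^<]+)</span>', wrap, html)

page = '<p><span class="title">Compilers</span> by A. Aho</p>'
```

A real wrapper would use a proper HTML parser and per-site extraction rules; the point is only that the original page content is preserved and augmented with anchors.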

    Guideline-based decision support in medicine : modeling guidelines for the development and application of clinical decision support systems

    The number and use of decision support systems that incorporate guidelines with the goal of improving care is rapidly increasing. Although developing systems that are both effective in supporting clinicians and accepted by them has proven to be a difficult task, the majority of the systems that were evaluated in a controlled trial showed impact. The work described in this thesis aims at developing a methodology and framework that facilitates all stages of the guideline development process, ranging from the definition of models that represent guidelines to the implementation of run-time systems that provide decision support based on the guidelines developed during the previous stages. The framework consists of 1) a guideline representation formalism that uses the concepts of primitives, Problem-Solving Methods (PSMs) and ontologies to represent guidelines of various complexity, granularity and application domains; 2) a guideline authoring environment that enables guideline authors to define guidelines based on the newly developed representation formalism; and 3) a guideline execution environment that translates defined guidelines into a more efficient symbol-level representation, which can be read in and processed by an execution-time engine. The described methodology and framework were used to develop and validate a number of guidelines and decision support systems in various clinical domains such as Intensive Care, Family Practice, Psychiatry and the areas of Diabetes and Hypertension control.
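The execution-environment idea, guidelines compiled to machine-evaluable rules that an engine runs against patient data, can be sketched minimally. The rules below are hypothetical placeholders, not taken from the thesis or from any clinical guideline.

```python
# Illustrative sketch of guideline execution: guidelines reduced to
# condition/action rules that an engine evaluates against patient data.
# Rule names, thresholds and recommendations are hypothetical.

GUIDELINE_RULES = [
    # (rule name, condition over patient data, recommendation)
    ("hypertension", lambda p: p["systolic_bp"] >= 140,
     "consider antihypertensive therapy"),
    ("diabetes",     lambda p: p["hba1c"] >= 6.5,
     "refer for diabetes management"),
]

def decision_support(patient):
    # Return the recommendations of every rule whose condition fires.
    return [advice for name, cond, advice in GUIDELINE_RULES if cond(patient)]

patient = {"systolic_bp": 150, "hba1c": 5.4}
```

The thesis's symbol-level representation plays the role of `GUIDELINE_RULES` here: a form the engine can evaluate efficiently, derived from the richer authored model.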

    Space shuttle avionics system

    The Space Shuttle avionics system, which was conceived in the early 1970s and became operational in the 1980s, represents a significant advancement of avionics system technology in the areas of systems and redundancy management, digital data base technology, flight software, flight control integration, digital fly-by-wire technology, crew display interface, and operational concepts. The origins and the evolution of the system are traced; the requirements, the constraints, and other factors which led to the final configuration are outlined; and the functional operation of the system is described. An overall system block diagram is included.

    In Situ Visualization of Performance Data in Parallel CFD Applications

    This thesis summarizes the author's work on visualization of performance data in parallel Computational Fluid Dynamics (CFD) simulations. Current performance analysis tools are unable to show their data on top of complex simulation geometries (e.g. an aircraft engine). But in CFD simulations, performance is expected to be affected by the computations being carried out, which in turn are tightly related to the underlying computational grid. It is therefore imperative that performance data be visualized on top of the same computational geometry from which they originate. However, performance tools have no native knowledge of the underlying mesh of the simulation. This scientific gap can be filled by merging the branches of HPC performance analysis and in situ visualization of CFD simulation data, which is done by integrating existing, well-established state-of-the-art tools from each field. To this end, an extension for the open-source performance tool Score-P was designed and developed, which intercepts an arbitrary number of manually selected code regions (mostly functions) and sends their respective measurements (number of executions and cumulative time spent) to the visualization software ParaView, through its in situ library Catalyst, as if they were any other flow-related variable. Subsequently the tool was extended with the capacity to also show communication data (messages sent between MPI ranks) on top of the CFD mesh. Testing and evaluation are done with two industry-grade codes: Rolls-Royce's CFD code, Hydra, and the Onera, DLR and Airbus CFD code, CODA. It has also been noticed that current performance tools have limited capacity for displaying their data on top of three-dimensional, framed (i.e. time-stepped) representations of the cluster's topology.
So that the approach is not limited to codes which already have the in situ adapter, it was further extended to take the performance data and display it, also in codes without in situ, on a three-dimensional, framed representation of the hardware resources used by the simulation. Testing is done with the Multi-Grid and Block Tri-diagonal NAS Parallel Benchmarks (NPB), as well as with Hydra and CODA again. The benchmarks are used to explain how the new visualizations work, while real performance analyses are done with the industry-grade CFD codes. The proposed solution is able to provide concrete performance insights which would not have been reached with current performance tools and which motivated beneficial changes to the respective source codes in real life. Finally, its overhead is discussed and shown to be suitable for use with CFD codes. The dissertation provides a valuable addition to the state of the art of highly parallel CFD performance analysis and serves as a basis for further suggested research directions.
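The per-region measurement being forwarded, execution count and cumulative time for manually selected regions, can be illustrated with a decorator. This is an analogy only, not the actual Score-P/Catalyst adapter; the region name and the stand-in CFD function are invented for the example.

```python
import time
from collections import defaultdict

# Sketch of the measurement the extension forwards to ParaView: for each
# selected code region, the number of executions and the cumulative time
# spent. The region name and the toy workload are hypothetical.

region_stats = defaultdict(lambda: {"count": 0, "seconds": 0.0})

def instrument(region_name):
    """Intercept a function as a named code region and accumulate stats."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                stats = region_stats[region_name]
                stats["count"] += 1
                stats["seconds"] += time.perf_counter() - start
        return wrapper
    return decorator

@instrument("flux_computation")   # hypothetical CFD region name
def compute_fluxes(cells):
    return [c * 0.5 for c in cells]
```

In the thesis's setting, each MPI rank would hold such statistics for its part of the mesh, so the values can be rendered on the corresponding grid cells like any other flow variable.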

    Web Mail Information Extraction

    This project is conducted to deliver the background of study, problem statements, objectives, scope, literature review, methodology chosen for the development process, results and discussion, conclusion, recommendations and references used throughout its completion. The objective of this project is to extract relevant and useful information from Google Mail (GMail) by performing Information Extraction (IE) using the Java programming language. After several rounds of testing, the system developed is able to successfully extract relevant and useful information from a GMail account, with the emails coming from different folders such as All Mail, Inbox, Drafts, Starred, Sent Mail, Spam and Trash. The focus is to extract email information such as the sender, recipient, subject and content. The extracted information is presented in two mediums, as a text file or stored inside a database, in order to better suit users who come from different backgrounds and have different needs.
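The core extraction step (pulling sender, recipient, subject and content out of a raw message) can be sketched with a standard-library parser. The project itself used Java; this Python version with an invented sample message is an illustration of the same idea.

```python
from email import message_from_string

# Sketch of the extraction described above: parse a raw RFC 822 message
# and pull out the fields of interest. The sample message is invented.

RAW_EMAIL = """\
From: alice@example.com
To: bob@example.com
Subject: Meeting notes

See the attached notes from today.
"""

def extract_fields(raw):
    msg = message_from_string(raw)
    return {
        "sender": msg["From"],
        "recipient": msg["To"],
        "subject": msg["Subject"],
        "content": msg.get_payload().strip(),
    }
```

Writing the resulting dictionary out as a text file or as a database row corresponds to the two presentation mediums the project supports.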

    A new Web-based medical tool for assessment and prevention of comprehensive cardiovascular risk

    Daniele Franchi¹,², Davide Cini¹, Giorgio Iervasi¹
    ¹Istituto di Fisiologia Clinica, CNR, Pisa, Italy; ²Dipartimento di Oncologia, dei Trapianti e delle Nuove Tecnologie in Medicina, Università di Pisa, Pisa, Italy

    Background: Multifactor cardiovascular disease is the leading cause of death; besides well-known cardiovascular risk factors, several emerging factors such as mental stress, diet type, and physical inactivity have been associated with cardiovascular disease. To date, preventive strategies are based on the concept of absolute risk calculated by different algorithms and scoring systems. However, in general practice the collection of patient data represents a critical issue.

    Design: A new multipurpose computer-based program has been developed in order to: 1) easily calculate and compare the absolute cardiovascular risk by the Framingham, Procam, and Progetto Cuore algorithms; 2) design a web-based computerized tool for prospective collection of structured data; 3) support the doctor in the decision-making process for patients at risk according to recent international guidelines.

    Methods: During a medical consultation the doctor uses a common computer connected via the Internet to a medical server where all the patient's data and software reside. The program evaluates absolute and relative cardiovascular risk factors, personalized patient goals, and multiparametric trends, monitors critical parameter values, and generates an automated medical report.

    Results: In a pilot study on 294 patients (47% males; mean age 60 ± 12 years [± SD]), the global time to collect data at the first consultation was 13 ± 11 minutes, which declined to 8 ± 7 minutes at the subsequent consultation. In 48.2% of cases the program revealed 2 or more primary risk factor parameters outside guideline indications and gave specific clinical suggestions to return the altered parameters to target values.

    Conclusion: The web-based system proposed here may represent a feasible and flexible tool for clinical management of patients at risk of cardiovascular disease and for epidemiological research.

    Keywords: internet, informatics, cardiovascular risk, evidence-based practice, medical consultation, decision support, clinical information system
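The guideline check reported in the results, flagging risk-factor parameters that fall outside guideline indications, can be sketched as a comparison against target values. The parameters and thresholds below are illustrative placeholders, not the guideline values used by the system.

```python
# Sketch of the guideline check: compare a patient's risk-factor
# parameters against target upper bounds and flag those outside the
# indication. All parameter names and limits are hypothetical.

TARGETS = {
    "systolic_bp": 140,      # mmHg, hypothetical target upper bound
    "ldl_cholesterol": 130,  # mg/dL, hypothetical
    "fasting_glucose": 110,  # mg/dL, hypothetical
}

def parameters_outside_target(patient):
    # Return the names of all monitored parameters above their target.
    return [name for name, limit in TARGETS.items()
            if patient.get(name, 0) > limit]

patient = {"systolic_bp": 155, "ldl_cholesterol": 120, "fasting_glucose": 128}
```

A system like the one described would attach a clinical suggestion to each flagged parameter and track the values across consultations as a multiparametric trend.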