
    Investigating five key predictive text entry with combined distance and keystroke modelling

    This paper investigates text entry on mobile devices using only five keys. Intended primarily to support text entry on devices smaller than mobile phones, the method can also be used to maximise screen space on mobile phones. The combined Fitts' law and keystroke modelling reported here predicts that bigram prediction on a five-key keypad can match the performance currently achieved on standard mobile phones using unigram prediction. User studies reported here show user performance on five-key pads similar to that found elsewhere for novice nine-key pad users.
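    A minimal sketch of how such a combined distance (Fitts' law) and keystroke model can be put together is given below. It is illustrative only: the Fitts' law coefficients, key layout and keystrokes-per-character value are hypothetical placeholders, not the values fitted in the paper.

```python
import math

def fitts_time(distance, width, a=0.1, b=0.2):
    """Shannon form of Fitts' law: MT = a + b * log2(D / W + 1), in seconds."""
    return a + b * math.log2(distance / width + 1)

def predicted_entry_rate(key_positions, key_width, transition_freq, kspc):
    """Estimate words per minute for a small keypad.

    key_positions   - key -> (x, y) centre of each of the five keys
    key_width       - key width, in the same units as the positions
    transition_freq - (from_key, to_key) -> relative frequency of that key pair
    kspc            - average keystrokes per character of the prediction scheme
    """
    total = sum(transition_freq.values())
    avg_mt = 0.0
    for (k1, k2), freq in transition_freq.items():
        x1, y1 = key_positions[k1]
        x2, y2 = key_positions[k2]
        avg_mt += (freq / total) * fitts_time(math.hypot(x2 - x1, y2 - y1), key_width)
    seconds_per_word = avg_mt * kspc * 5      # 5-character "word" convention
    return 60.0 / seconds_per_word
```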

    Validation of Soft Classification Models using Partial Class Memberships: An Extended Concept of Sensitivity & Co. applied to the Grading of Astrocytoma Tissues

    We use partial class memberships in soft classification to model uncertain labelling and mixtures of classes. Partial class memberships are not restricted to predictions, but may also occur in reference labels (ground truth, gold standard diagnosis) for training and validation data. Classifier performance is usually expressed as fractions of the confusion matrix, such as sensitivity, specificity, and negative and positive predictive values. We extend this concept to soft classification and discuss the bias and variance properties of the extended performance measures. Ambiguity in reference labels translates into differences between best-case, expected and worst-case performance. We present a second set of measures comparing expected and ideal performance, closely related to regression performance, namely the root mean squared error (RMSE) and the mean absolute error (MAE). All calculations apply to classical crisp classification as well as to soft classification (partial class memberships and/or one-class classifiers). The proposed performance measures make it possible to test classifiers with actual borderline cases. In addition, hardening of, e.g., posterior probabilities into class labels is not necessary, avoiding the corresponding information loss and increase in variance. We implement the proposed performance measures in the R package "softclassval", which is available from CRAN and at http://softclassval.r-forge.r-project.org. Our reasoning, as well as the importance of partial memberships for chemometric classification, is illustrated by a real-world application: astrocytoma brain tumor tissue grading (80 patients, 37000 spectra) for finding surgical excision borders. As borderline cases are the actual target of the analytical technique, samples which are diagnosed to be borderline cases must be included in the validation. Comment: The manuscript is accepted for publication in Chemometrics and Intelligent Laboratory Systems. Supplementary figures and tables are at the end of the PDF.
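    To make the idea concrete, the sketch below computes a sensitivity-like agreement from partial class memberships, together with RMSE and MAE. It is a simplified illustration of the concept (using the minimum operator as the conjunction between reference and prediction), not a reimplementation of the softclassval package, which provides several operators and best/worst-case variants.

```python
import numpy as np

def soft_sensitivity(reference, prediction):
    """Sensitivity-like agreement for one class with memberships in [0, 1].

    The minimum operator plays the role of the logical "and" between reference
    and predicted membership; other operators (e.g. the product) lead to
    different expected/best-case interpretations.
    """
    reference = np.asarray(reference, dtype=float)
    prediction = np.asarray(prediction, dtype=float)
    return np.minimum(reference, prediction).sum() / reference.sum()

def rmse(reference, prediction):
    return float(np.sqrt(np.mean((np.asarray(prediction) - np.asarray(reference)) ** 2)))

def mae(reference, prediction):
    return float(np.mean(np.abs(np.asarray(prediction) - np.asarray(reference))))

# Borderline cases keep their partial memberships instead of being hardened:
ref  = [1.0, 0.5, 0.0, 1.0]   # reference membership in the class of interest
pred = [0.9, 0.6, 0.1, 0.7]   # soft prediction, e.g. a posterior probability
print(soft_sensitivity(ref, pred), rmse(ref, pred), mae(ref, pred))
```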

    Model-driven performance evaluation for service engineering

    Service engineering and service-oriented architecture, as an integration and platform technology, form a recent approach to software systems integration. Software quality aspects such as performance are of central importance for the integration of heterogeneous, distributed service-based systems. Empirical performance evaluation is the process of measuring and calculating performance metrics of the implemented software. We present an approach for the empirical, model-based performance evaluation of services and service compositions in the context of model-driven service engineering. Temporal database theory is utilised for the empirical performance evaluation of service systems developed in a model-driven manner.
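    The sketch below illustrates the empirical side of such an evaluation: service invocations are logged with timestamps and durations (the temporal data the evaluation rests on), and simple metrics are derived from the log. The service names, metric choices and helper functions are hypothetical and are not taken from the paper.

```python
import statistics
import time
from datetime import datetime, timezone

# A time-stamped log of service invocations; the temporal dimension is what
# allows metrics to be evaluated over arbitrary observation intervals.
call_log = []   # list of (service_name, start_timestamp, duration_seconds)

def timed_call(service_name, func, *args, **kwargs):
    """Invoke a service operation and record its start time and duration."""
    start = datetime.now(timezone.utc)
    t0 = time.perf_counter()
    result = func(*args, **kwargs)
    call_log.append((service_name, start, time.perf_counter() - t0))
    return result

def response_time_metrics(service_name):
    """Empirical performance metrics for one service over the logged period."""
    durations = sorted(d for (name, _, d) in call_log if name == service_name)
    return {
        "calls": len(durations),
        "mean_s": statistics.mean(durations),
        "p95_s": durations[int(0.95 * (len(durations) - 1))],
    }
```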

    Dynamic dependence networks: Financial time series forecasting and portfolio decisions (with discussion)

    We discuss Bayesian forecasting of increasingly high-dimensional time series, a key area of application of stochastic dynamic models in the financial industry and allied areas of business. Novel state-space models characterizing sparse patterns of dependence among multiple time series extend existing multivariate volatility models to enable scaling to higher numbers of individual time series. The theory of these "dynamic dependence network" models shows how the individual series can be "decoupled" for sequential analysis, and then "recoupled" for applied forecasting and decision analysis. Decoupling allows fast, efficient analysis of each of the series in individual univariate models that are linked, for later recoupling, through a theoretical multivariate volatility structure defined by a sparse underlying graphical model. Computational advances are especially significant in connection with model uncertainty about the sparsity patterns among series that define this graphical model; Bayesian model averaging using discounting of historical information builds substantially on this computational advance. An extensive, detailed case study showcases the use of these models, and the improvements in forecasting and financial portfolio investment decisions that are achievable. Using a long series of daily international currency, stock index and commodity prices, the case study includes evaluations of multi-day forecasts and Bayesian portfolio analysis with a variety of practical utility functions, as well as comparisons against commodity trading advisor benchmarks. Comment: 31 pages, 9 figures, 3 tables.
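    As a concrete hint of the "decoupling" step, the sketch below runs a sequential, discount-factor-based volatility update independently for each series. It is a deliberately simplified stand-in (zero-mean returns, a single variance discount factor, illustrative hyperparameters), not the authors' dynamic dependence network model; the sparse graphical-model recoupling step is only noted in a comment.

```python
import numpy as np

def decoupled_volatility_paths(returns, beta=0.95, n0=5.0, d0=5.0):
    """Per-series sequential volatility learning with variance discounting.

    returns has shape (T, m) for T time points and m series; the function
    returns the (T, m) array of one-step variance estimates s_t = d_t / n_t,
    computed independently for each series ("decoupling").
    """
    T, m = returns.shape
    n = np.full(m, n0)
    d = np.full(m, d0)
    s = np.empty((T, m))
    for t in range(T):
        # discount historical information, then update with the new observation
        n = beta * n + 1.0
        d = beta * d + returns[t] ** 2
        s[t] = d / n
    return s

# The "recoupling" step would then combine these univariate posteriors through
# a sparse graphical-model precision structure; that step is not sketched here.
```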

    Coarse grained force field for the molecular simulation of natural gases and condensates

    The atomistically-detailed molecular modelling of petroleum fluids is challenging, amongst other aspects, due to the very diverse multicomponent and asymmetric nature of the mixtures in question. Complicating matters further, the time scales for many important processes can be much larger than the current and foreseeable capacity of modern computers running fully-atomistic models. To overcome these limitations, a coarse grained (CG) model is proposed where some of the less-important degrees of freedom are safely integrated out, leaving as key parameters the average energy levels, the molecular conformations and the range of the Mie intermolecular potentials employed as the basis of the model. The parametrization is performed by using an analytical equation of state of the statistical associating fluid theory (SAFT) family to link the potential parameters to macroscopically observed thermophysical properties. The parameters found through this top-down approach are used directly in molecular dynamics simulations of multi-component multi-phase systems. The procedure is exemplified by calculating the phase envelope of the methane–decane binary and of two synthetic light condensate mixtures. A methodology based on the discrete expansion of a mixture is used to determine the bubble points of these latter mixtures, with excellent agreement with experimental data. The model presented is entirely predictive, and an abridged table of parameters for some fluids of interest is provided.
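    For reference, the Mie (generalised Lennard-Jones) pair potential that underpins such coarse-grained force fields is sketched below. The functional form is standard; the numerical parameters in the example call are illustrative only and are not the SAFT-fitted values tabulated in the paper.

```python
def mie_potential(r, sigma, epsilon, lambda_r=12.0, lambda_a=6.0):
    """Mie pair potential used as the basis of SAFT-style coarse-grained models.

      U(r) = C * epsilon * [ (sigma/r)**lambda_r - (sigma/r)**lambda_a ]
      C    = lambda_r / (lambda_r - lambda_a)
             * (lambda_r / lambda_a)**(lambda_a / (lambda_r - lambda_a))

    lambda_r = 12, lambda_a = 6 recovers the Lennard-Jones 12-6 form (C = 4).
    """
    c = (lambda_r / (lambda_r - lambda_a)) * (lambda_r / lambda_a) ** (
        lambda_a / (lambda_r - lambda_a)
    )
    return c * epsilon * ((sigma / r) ** lambda_r - (sigma / r) ** lambda_a)

# Example call with illustrative (not fitted) parameters for a single CG bead:
print(mie_potential(r=0.45, sigma=0.40, epsilon=0.35))
```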

    A proposed case for the cloud software engineering in security

    This paper presents a Cloud Software Engineering in Security (CSES) proposal that combines the benefits of a good software engineering process with those of security engineering. While the existing literature does not yet offer such a proposal for Cloud security, we use Business Process Modeling Notation (BPMN) to illustrate the concept of CSES across its design, implementation and test phases. BPMN can be used to raise alarms for protecting Cloud security in a real-case scenario in real time. Results from BPMN simulations show that a long execution time of 60 hours is required to protect the real-time security of 2 petabytes (PB) of data. When data is not in use, BPMN simulations show that the execution time for securing all data falls off rapidly. We demonstrate a proposal to deal with Cloud security and aim to improve its current performance for Big Data.
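    A back-of-envelope reading of the reported figure (assuming decimal petabytes; the paper may use binary units) gives the sustained throughput it implies:

```python
# Securing 2 PB in 60 hours implies roughly the following sustained throughput.
data_bytes = 2 * 10**15            # 2 PB, decimal definition (assumption)
duration_s = 60 * 3600             # 60 hours
print(f"~{data_bytes / duration_s / 10**9:.1f} GB/s")   # ~9.3 GB/s
```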

    Numerical validation of a population balance model describing cement paste rheology

    Rheology control is essential while cement and concrete pastes are in the fresh state, because it directly affects workability, initial placement and the structural performance of the hardened material. Optimizations of clinker formulations and reductions in cement-to-water ratios induced by economic and environmental considerations have a significant effect on rheology, which creates the need for mechanistic models capable of describing the effect of multiple relevant phenomena on the observed paste flow. In this work, the population balance framework was used to develop a model able to relate the transient microstructural evolution of cement pastes under typical experimental conditions to their macroscopic rheological response. Numerical details and performance are assessed and discussed. The model is found to be capable of reproducing experimentally observed flow curves using measured cluster size distribution information, and it is also able to predict the complex rheological characteristics typically found in cement pastes. Furthermore, a spatially resolved scheme was proposed to investigate the nature of the flow inside a parallel-plate rheometer geometry, with the objective of assessing the ability of the model to qualitatively predict experimentally observed behavior and to gain insight into the effect of possible secondary flows.
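    To illustrate the kind of coupling the abstract describes, the sketch below integrates a generic discrete (Smoluchowski-type) population balance with shear-driven aggregation and breakage, and maps the resulting cluster size distribution to a suspension viscosity through a Krieger-Dougherty-type closure for fractal clusters. The kernels, closure and all numerical constants are illustrative assumptions and do not reproduce the model developed in the paper.

```python
import numpy as np

def evolve_cluster_distribution(n, shear_rate, dt, steps, k_a=1e-3, k_b=5e-4):
    """Explicit-Euler integration of a discrete population balance.

    n[i] is the number density of clusters containing i+1 primary particles.
    Aggregation is shear-driven (rate ~ k_a * shear_rate); clusters with an
    even number of primaries break into two equal halves at a rate
    ~ k_b * shear_rate * size.  Aggregates larger than the largest tracked
    class are simply dropped in this sketch.
    """
    n = np.asarray(n, dtype=float).copy()
    N = len(n)
    for _ in range(steps):
        dn = np.zeros(N)
        for i in range(N):                  # aggregation over ordered pairs
            for j in range(N):
                k = i + j + 1               # class holding the combined cluster
                rate = k_a * shear_rate * n[i] * n[j]
                dn[i] -= rate
                dn[j] -= rate
                if k < N:
                    dn[k] += rate
        for i in range(1, N, 2):            # binary breakage into equal halves
            rate = k_b * shear_rate * (i + 1) * n[i]
            dn[i] -= rate
            dn[i // 2] += 2 * rate
        n += dt * dn
    return n

def suspension_viscosity(n, mu_water=1e-3, phi_primary=0.45, phi_max=0.64, d_f=2.3):
    """Krieger-Dougherty-type closure: fractal clusters trap water, so the
    effective solid fraction grows with the mean cluster size."""
    sizes = np.arange(1, len(n) + 1)
    mean_size = (sizes * n).sum() / n.sum()
    phi_eff = min(phi_primary * mean_size ** (3.0 / d_f - 1.0), 0.99 * phi_max)
    return mu_water * (1.0 - phi_eff / phi_max) ** (-2.5 * phi_max)
```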