
    An intelligent auto-response short message service categorization model using semantic index

    Short message service (SMS) is one of the quickest and easiest channels of communication, used by businesses, government organizations, and banks to send short messages to large groups of people. Categorizing inbox messages by type gives receivers a concise view of their SMS. Earlier studies treated the problem as binary ham/spam classification, which masked messages that were useful to the end user but were labelled as spam. Later work extended this to a few labels such as ham, spam, and others, which is still not sufficient to meet the needs of end users. Hence, a multi-class SMS categorization based on the semantics (information) embedded in each message is needed. This paper introduces an intelligent auto-response model that uses a semantic index and a multi-layer perceptron (MLP) to categorize SMS messages into five categories: ham, spam, info, transactions, and one-time passwords. In this approach, each SMS is assigned to exactly one of the predefined categories. The experiment was conducted on the “multi-class SMS dataset” of 7,398 messages spanning the five classes, and the accuracy obtained was 97%.
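
    As a rough illustration of the pipeline described above, the sketch below trains a multi-layer perceptron on TF-IDF features for five-way SMS classification with scikit-learn. The file name, column names, and network size are assumptions made for illustration, and TF-IDF only approximates the paper's semantic-index features; this is a minimal sketch, not the authors' implementation.

```python
# Minimal sketch of multi-class SMS categorization with an MLP.
# The CSV path and column names ("text", "label") are assumptions;
# the paper's semantic-index features are approximated here by TF-IDF.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("multiclass_sms.csv")  # hypothetical file with 5 label values
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, stratify=df["label"], random_state=42)

vectorizer = TfidfVectorizer(lowercase=True, stop_words="english")
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

# One hidden layer; the paper does not disclose the exact architecture used.
clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=300, random_state=42)
clf.fit(X_train_vec, y_train)

print("accuracy:", accuracy_score(y_test, clf.predict(X_test_vec)))
```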

    Probabilistic Models of Motor Production

    N. Bernstein defined the ability of the central nervous system (CNS) to control the many degrees of freedom of a physical body, with all its redundancy and flexibility, as the main problem in motor control. He pointed out that man-made mechanisms usually have one, sometimes two degrees of freedom (DOF); when the number of DOF increases further, it becomes prohibitively hard to control them. The brain, however, seems to perform such control effortlessly. He suggested how the brain might deal with it: when a motor skill is being acquired, the brain artificially limits the degrees of freedom, leaving only one or two. As the skill level increases, the brain gradually "frees" the previously fixed DOF, applying control when needed and in the directions that have to be corrected, eventually arriving at a control scheme where all the DOF are "free". This approach of reducing the dimensionality of motor control remains relevant today. One possible solution to Bernstein's problem is the hypothesis of motor primitives (MPs): small building blocks that constitute complex movements and facilitate motor learning and task completion. Just as in the visual system, a homogeneous hierarchical architecture built of similar computational elements may be beneficial.
    When studying an object as complicated as the brain, it is important to define at which level of detail one works and which questions one aims to answer. David Marr suggested three levels of analysis: 1. computational, analysing which problem the system solves; 2. algorithmic, asking which representation the system uses and which computations it performs; 3. implementational, finding how such computations are performed by neurons in the brain. In this thesis we stay at the first two levels, seeking the basic representation of motor output.
    In this work we present a new model of motor primitives that comprises multiple interacting latent dynamical systems, and we give it a full Bayesian treatment. Modelling within the Bayesian framework, in my opinion, must become the new standard for hypothesis testing in neuroscience: only the Bayesian framework gives us guarantees when dealing with the inevitable plethora of hidden variables and uncertainty. The particular coupling of dynamical systems we propose, based on the Product of Experts, has many natural interpretations in the Bayesian framework. If the dynamical systems run in parallel, it yields Bayesian cue integration. If they are organized hierarchically through serial coupling, we obtain hierarchical priors over the dynamics. If one of the dynamical systems represents the sensory state, we arrive at sensory-motor primitives. The compact representation that follows from the variational treatment allows learning a library of motor primitives; once primitives are learned separately, a combined motion can be represented as a matrix of coupling values.
    We performed a set of experiments to compare different models of motor primitives. In a series of two-alternative forced choice (2AFC) experiments, participants discriminated natural from synthesised movements, in effect running a graphics Turing test. When available, the Bayesian model score predicted the naturalness of the perceived movements. For simple movements, such as walking, Bayesian model comparison and psychophysics tests indicate that one dynamical system is sufficient to describe the data. For more complex movements, such as walking and waving, motion is better represented as a set of coupled dynamical systems. We also experimentally confirmed that a Bayesian treatment of model learning on motion data is superior to a simple point estimate of the latent parameters. Experiments with non-periodic movements show that they do not benefit from more complex latent dynamics, despite their high kinematic complexity. With fully Bayesian models we could quantitatively disentangle the influence of motion dynamics and pose on the perception of naturalness, and we confirmed that rich and correct dynamics matter more than the kinematic representation.
    There are numerous further directions of research. In the models we devised, for multiple parts, even though the latent dynamics was factorized into a set of interacting systems, the kinematic parts were completely independent; thus, interaction between the kinematic parts could be mediated only by the latent dynamics. A more flexible model would allow dense interaction at the kinematic level as well. Another important problem concerns the representation of time in Markov chains. Discrete-time Markov chains approximate continuous dynamics; since the time step is assumed fixed, we face the problem of time-step selection. Time is also not an explicit parameter in Markov chains, which prohibits explicit optimization of, and inference about, time. For example, in optimal control the boundary conditions are usually set at exact time points, which is not an ecological scenario, where time is usually itself a parameter of optimization. Making time an explicit parameter of the dynamics may alleviate this.
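
    The Product-of-Experts coupling mentioned above has a simple closed form when each expert's prediction over a latent state is Gaussian: the product of Gaussian densities is again Gaussian, with summed precision and a precision-weighted mean, which is exactly Bayesian cue integration. The sketch below illustrates this fusion rule; the numbers are illustrative and not taken from the thesis.

```python
# Product of Gaussian "experts": the combined belief is Gaussian with
# precision (inverse variance) equal to the sum of the experts' precisions,
# and mean equal to the precision-weighted average of their means.
# Values below are illustrative, not taken from the thesis.
import numpy as np

def product_of_gaussians(means, variances):
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    combined_precision = precisions.sum()
    combined_mean = (precisions * means).sum() / combined_precision
    return combined_mean, 1.0 / combined_precision

# Two parallel dynamical systems (or a motor prediction and a sensory cue)
# each predict the next latent state with some uncertainty.
mean, var = product_of_gaussians(means=[0.3, 0.7], variances=[0.04, 0.16])
print(f"fused mean = {mean:.3f}, fused variance = {var:.4f}")
# fused mean = 0.380, fused variance = 0.0320
```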

    An Investigation of Methods for CT Synthesis in MR-only Radiotherapy


    Complexity of Atrial Fibrillation Electrograms Through Nonlinear Signal Analysis: In Silico Approach

    Identification of atrial fibrillation (AF) mechanisms could improve the rate of ablation success. However, the incomplete understanding of those mechanisms makes it difficult to choose target sites for ablation. This work focuses on the importance of electrogram (EGM) analysis for detecting and modulating rotors in order to guide ablation procedures and improve their outcomes. Virtual atrial models are used to show how nonlinear measures can generate electroanatomical maps that detect critical sites in AF. We describe the atrial cell mathematical models and the procedure for coupling them within two-dimensional and three-dimensional virtual atrial models to simulate arrhythmogenic mechanisms. Mathematical modeling of unipolar and bipolar EGMs is introduced, followed by a discussion of EGM signal processing. Nonlinear descriptors, such as approximate entropy and multifractal analysis, are used to study the dynamical behavior of EGM signals, which are not well described by a linear law. Our results show that nonlinear analysis of EGMs can provide information about the dynamics of rotors and other AF mechanisms, and that these fibrillatory patterns can be simulated using virtual models. Combining such features with machine learning tools can help identify arrhythmogenic sources of AF.
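
    Approximate entropy, one of the nonlinear descriptors mentioned above, can be computed as in the sketch below, following Pincus's standard definition. The embedding dimension m = 2 and tolerance r = 0.2 × std are common defaults rather than values reported in this chapter, and the test signals are synthetic.

```python
# Approximate entropy (ApEn) of a 1-D signal, following Pincus's definition:
# ApEn(m, r) = Phi(m) - Phi(m+1), where Phi(m) is the average log-frequency
# of m-length template matches within tolerance r (Chebyshev distance).
import numpy as np

def approximate_entropy(signal, m=2, r_factor=0.2):
    x = np.asarray(signal, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def phi(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r (self-matches included).
        counts = np.mean(dist <= r, axis=1)
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

# A regular sine wave yields a lower ApEn than an irregular (noisy) signal.
t = np.linspace(0, 4 * np.pi, 500)
print(approximate_entropy(np.sin(t)))                                   # low complexity
print(approximate_entropy(np.random.default_rng(0).normal(size=500)))   # high complexity
```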

    Bottleneck Management through Strategic Sequencing in Smart Manufacturing Systems

    Industries place significant emphasis on finding the optimum order in which to carry out jobs, since job sequencing is a crucial element of net productivity. Depending on the demand criterion, all production systems, including flexible manufacturing systems, follow a predefined sequence of job-based machine operations. The complexity of the problem grows with the number of machines and jobs to sequence, demanding an appropriate sequencing technique. The major contribution of this work is to adapt an existing algorithm to a very unusual machine setup and find the sequence that minimizes the makespan. This custom machine setup completes all tasks while maintaining precedence and satisfying all other constraints. The thesis concentrates on identifying the most effective sequencing technique, validated in both a lab environment and a simulated environment, and illustrates key methods for addressing a circular non-permutation flow shop sequencing problem with additional constraints. Comparisons among various heuristic algorithms are presented based on different sequencing criteria. The optimum sequence is then provided as input to a real-life machine setup and to a simulated environment in order to select the best-performing algorithm, which is the basic goal of this research. To achieve this goal, a Python program was first written to find an optimum sequence. The results show that the makespan increases with the number of jobs, but the pallet constraint analysis shows that adding more pallets helps reduce the makespan for both flow shops and job shops. Although the two algorithms produce different sequences, for flow shops the makespan is the same in both cases, whereas in the job shop scenario the Nawaz, Enscore and Ham (NEH) algorithm consistently outperforms the Campbell, Dudek and Smith (CDS) algorithm. For job shops with different job mixes, the makespan decreases most when the proportion of easy jobs is highest and medium and complex jobs are present in equal proportions.
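
    For reference, the sketch below gives a minimal Python implementation of the NEH heuristic for the plain permutation flow-shop makespan problem. The thesis's setting is a circular non-permutation flow shop with pallet and precedence constraints; those extensions are omitted here, and the processing-time data are illustrative, not from the thesis.

```python
# NEH heuristic for the permutation flow-shop makespan problem.
# processing_times[j][k] = time of job j on machine k (illustrative data only).
def makespan(sequence, processing_times):
    num_machines = len(processing_times[0])
    completion = [0.0] * num_machines
    for job in sequence:
        for k in range(num_machines):
            start = max(completion[k], completion[k - 1]) if k > 0 else completion[k]
            completion[k] = start + processing_times[job][k]
    return completion[-1]

def neh(processing_times):
    num_jobs = len(processing_times)
    # 1. Order jobs by decreasing total processing time.
    order = sorted(range(num_jobs), key=lambda j: -sum(processing_times[j]))
    sequence = [order[0]]
    # 2. Insert each remaining job at the position minimizing the partial makespan.
    for job in order[1:]:
        sequence = min(
            (sequence[:pos] + [job] + sequence[pos:] for pos in range(len(sequence) + 1)),
            key=lambda seq: makespan(seq, processing_times),
        )
    return sequence, makespan(sequence, processing_times)

jobs = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]   # 4 jobs x 3 machines
seq, cmax = neh(jobs)
print("sequence:", seq, "makespan:", cmax)
```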