
    A general framework for online audio source separation

    We consider the problem of online audio source separation. Existing algorithms adopt either a sliding block approach or a stochastic gradient approach, which is faster but less accurate. Also, they rely either on spatial cues or on spectral cues and cannot separate certain mixtures. In this paper, we design a general online audio source separation framework that combines both approaches and both types of cues. The model parameters are estimated in the Maximum Likelihood (ML) sense using a Generalised Expectation Maximisation (GEM) algorithm with multiplicative updates. The separation performance is evaluated as a function of the block size and the step size and compared to that of an offline algorithm. Comment: International Conference on Latent Variable Analysis and Signal Separation (2012)
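    To picture how a step size blends block-wise and stochastic processing, the sketch below runs a plain online IS-NMF loop with multiplicative updates. It is not the paper's combined spatial-plus-spectral GEM model; the matrices `W`, `H` and the names `block_size`, `step_size` are illustrative assumptions.

```python
# A minimal sketch, NOT the paper's model: online NMF with multiplicative
# updates, processing a power spectrogram block by block.
import numpy as np

def online_is_nmf(V, n_components=8, block_size=20, step_size=0.5,
                  n_inner=10, eps=1e-12, seed=0):
    """V: nonnegative power spectrogram of shape (n_freq, n_frames)."""
    rng = np.random.default_rng(seed)
    n_freq, n_frames = V.shape
    W = rng.random((n_freq, n_components)) + eps   # spectral dictionary
    A = np.zeros_like(W)                           # running numerator statistics
    B = np.zeros_like(W)                           # running denominator statistics
    for start in range(0, n_frames, block_size):
        Vb = V[:, start:start + block_size]
        H = rng.random((n_components, Vb.shape[1])) + eps  # activations for this block
        for _ in range(n_inner):
            Vh = W @ H + eps
            # multiplicative update for H under the Itakura-Saito divergence
            H *= (W.T @ (Vb / Vh**2)) / (W.T @ (1.0 / Vh) + eps)
            Vh = W @ H + eps
            num = (Vb / Vh**2) @ H.T
            den = (1.0 / Vh) @ H.T
            # step_size blends the new block's statistics with the running ones:
            # step_size -> 1 behaves like a sliding block, a small step_size
            # behaves like a stochastic (online) update
            A = (1 - step_size) * A + step_size * num
            B = (1 - step_size) * B + step_size * den
            W *= A / (B + eps)
    return W
```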

    Efficient Coxian duration modelling for activity recognition in smart environment with the hidden semi-Markov model

    In this paper, we exploit the discrete Coxian distribution and propose a novel form of stochastic model, termed the Coxian hidden semi-Markov model (Cox-HSMM), and apply it to the task of recognising activities of daily living (ADLs) in a smart house environment. The use of the Coxian has several advantages over traditional parameterisations (e.g. multinomial or continuous distributions), including the low number of free parameters needed, its computational efficiency, and the existence of closed-form solutions. To further enrich the model for real-world applications, we also address the problem of handling missing observations in the proposed Cox-HSMM. In the domain of ADLs, we emphasise the importance of duration information and model it via the Cox-HSMM. Our experimental results show the superiority of the Cox-HSMM over the standard HMM in all cases. They further show that outstanding recognition accuracy can be achieved with a relatively low number of phases in the Coxian, making the Cox-HSMM particularly suitable for recognising ADLs, whose movement trajectories are typically very long.
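    To make the "low number of free parameters" point concrete, here is a minimal sampler for a discrete Coxian duration: a chain of geometric phases that, after each phase, either absorbs or continues to the next. The parameterisation used here (per-phase exit probability `mu[i]` and continuation probability `c[i]`) is an illustrative assumption, not necessarily the paper's exact one.

```python
# A minimal sketch of sampling a duration from a discrete Coxian distribution.
import numpy as np

def sample_discrete_coxian(mu, c, rng):
    """mu[i]: probability of leaving phase i at each time step (geometric parameter).
    c[i]: probability of continuing to phase i+1 rather than absorbing on exit."""
    duration, phase, n_phases = 0, 0, len(mu)
    while True:
        # time spent in the current phase is geometric with parameter mu[phase]
        duration += rng.geometric(mu[phase])
        last_phase = (phase == n_phases - 1)
        if last_phase or rng.random() >= c[phase]:
            return duration          # absorb: the activity duration ends here
        phase += 1                   # otherwise move on to the next phase

rng = np.random.default_rng(0)
# three phases -> only a handful of free parameters for a long-tailed duration
durations = [sample_discrete_coxian([0.2, 0.1, 0.05], [0.7, 0.5, 0.0], rng)
             for _ in range(5)]
print(durations)
```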

    RA2: predicting simulation execution time for cloud-based design space explorations

    Design space exploration refers to the evaluation of implementation alternatives for many engineering and design problems. A popular exploration approach is to run a large number of simulations of the actual system with varying sets of configuration parameters to search for the optimal ones. Due to the potentially huge resource requirements, cloud-based simulation execution strategies should be considered in many cases. In this paper, we look at the issue of running large-scale simulation-based design space exploration problems on commercial Infrastructure-as-a-Service clouds, namely Amazon EC2, Microsoft Azure and Google Compute Engine. To efficiently manage the cloud resources used for execution, the key problem is to accurately predict the running time of each simulation instance in advance. This is not trivial given the wide range of cloud resource types currently on offer, which provide varying levels of performance. In addition, the widespread use of virtualization techniques by most cloud providers often introduces unpredictable performance interference. We therefore propose a resource- and application-aware (RA2) prediction approach to combat performance variability on clouds. In particular, we employ neural-network-based techniques coupled with non-intrusive monitoring of resource availability to obtain more accurate predictions. We conducted extensive experiments on commercial cloud platforms using an evacuation planning design problem over a month-long period. The results demonstrate that it is possible to predict simulation execution times with high accuracy in most cases. The experiments also provide some interesting insights into how similar simulation problems should be run on the various commercially available clouds.
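    As a rough illustration of the prediction step, the sketch below trains a small neural-network regressor that maps instance descriptors and monitored resource availability to run time. The feature set, the synthetic runtime model, and the use of scikit-learn are assumptions for demonstration only, not the RA2 implementation.

```python
# A minimal sketch (illustrative, not the paper's RA2 pipeline): a small
# neural-network regressor predicting simulation run time from resource
# descriptors and monitored availability. All features and data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# columns: vCPUs, memory (GB), measured CPU availability (%), scenario size
X = rng.uniform([1, 2, 50, 100], [16, 64, 100, 10000], size=(500, 4))
# synthetic ground truth: runtime grows with scenario size, shrinks with
# resources, and is perturbed to mimic interference-induced variability
y = X[:, 3] / (X[:, 0] * (X[:, 2] / 100.0)) + rng.normal(0, 5, size=500)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0))
model.fit(X[:400], y[:400])
print("held-out R^2:", model.score(X[400:], y[400:]))
```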

    Technical Report: Using Static Analysis to Compute Benefit of Tolerating Consistency Violations

    Synchronization is the Achilles heel of concurrent programs. A synchronization requirement is often imposed to ensure that the execution of a concurrent program can be serialized. Without such a requirement, a program may suffer from consistency violations. Recently, it was shown that if programs are designed to tolerate such consistency violation faults (CVFs), then one can obtain substantial performance gains. Previous efforts to analyze the effect of CVF-tolerance are limited to run-time analysis of the program to determine whether tolerating CVFs can improve performance. Such run-time analysis is very expensive and provides limited insight. In this work, we consider the question, 'Can static analysis of the program predict the benefit of CVF-tolerance?' We find that the answer to this question is affirmative. Specifically, we use static analysis to evaluate the cost of a CVF and demonstrate that it can be used to predict the benefit of CVF-tolerance. We also find that when faced with a large state space, partial analysis of the state space (via sampling) still provides the information required to predict the benefit of CVF-tolerance. Furthermore, we observe that the CVF-cost distribution is exponential in nature, i.e., the probability that a CVF has a cost of $c$ is $A \cdot B^{-c}$, where $A$ and $B$ are constants; in other words, most CVFs cause no or low perturbation, whereas a small number of CVFs cause a large perturbation. This opens up new avenues to evaluate the benefit of CVF-tolerance.
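    The exponential shape of the cost distribution can be checked on a sample of costs by fitting a line to log-frequencies, since $\log P(c) = \log A - c \log B$. The sketch below does exactly that on synthetic stand-in data; the sampling step is an assumption for illustration, not the paper's tooling.

```python
# A minimal sketch: verifying the P(cost = c) ~ A * B**(-c) shape by fitting
# a straight line to the log of the empirical frequencies of sampled costs.
import numpy as np

rng = np.random.default_rng(0)
costs = rng.geometric(0.6, size=10_000) - 1      # synthetic stand-in for CVF costs

values, counts = np.unique(costs, return_counts=True)
freqs = counts / counts.sum()

# log P(c) = log A - c * log B, so a linear fit recovers A and B
slope, intercept = np.polyfit(values, np.log(freqs), 1)
A, B = np.exp(intercept), np.exp(-slope)
print(f"fitted A ~= {A:.3f}, B ~= {B:.3f}")   # most mass at cost 0, B > 1
```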

    Dust masses of disks around 8 Brown Dwarfs and Very Low-Mass Stars in Upper Sco OB1 and Ophiuchus

    We present the results of ALMA band 7 observations of dust and CO gas in the disks around 7 objects with spectral types ranging between M5.5 and M7.5 in Upper Scorpius OB1, and one M3 star in Ophiuchus. We detect unresolved continuum emission in all but one source, and the $^{12}$CO J=3-2 line in two sources. We constrain the dust and gas content of these systems using a grid of models calculated with the radiative transfer code MCFOST, and find disk dust masses between 0.1 and 1 $M_\oplus$, suggesting that the stellar mass / disk mass correlation can be extrapolated to brown dwarfs with masses as low as 0.05 $M_\odot$. The one disk in Upper Sco in which we detect CO emission, 2MASS J15555600, is also the disk with the warmest inner disk as traced by its H - [4.5] photometric color. Using our radiative transfer grid, we extend the correlation between stellar luminosity and mass-averaged disk dust temperature, originally derived for stellar-mass objects, to the brown dwarf regime as $\langle T_{dust} \rangle \approx 22\,(L_{*}/L_{\odot})^{0.16}$ K, applicable to spectral types of M5 and later. This is slightly shallower than the relation for earlier spectral type objects and yields warmer low-mass disks. The two prescriptions cross at 0.27 $L_\odot$, corresponding to masses between 0.1 and 0.2 $M_\odot$ depending on age. Comment: 9 pages, 6 figures, accepted to ApJ on 26/01/201
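    To see what the prescription implies in practice, the sketch below evaluates $\langle T_{dust} \rangle \approx 22\,(L_{*}/L_{\odot})^{0.16}$ K for a brown-dwarf luminosity and plugs it into the standard optically thin dust-mass estimate $M_{dust} = F_\nu d^2 / (\kappa_\nu B_\nu(T_{dust}))$. The flux, distance, and opacity values are illustrative assumptions, and the paper itself fits an MCFOST radiative-transfer grid rather than using this textbook formula.

```python
# A minimal sketch: the paper's temperature prescription used inside the
# standard optically thin dust-mass estimate. All input values are assumed.
import numpy as np

h, k, c = 6.626e-27, 1.381e-16, 2.998e10          # CGS: erg s, erg/K, cm/s

def planck_nu(nu, T):
    """Planck function B_nu(T) in erg s^-1 cm^-2 Hz^-1 sr^-1."""
    return 2 * h * nu**3 / c**2 / np.expm1(h * nu / (k * T))

L_star = 0.003                                     # assumed luminosity [L_sun]
T_dust = 22.0 * L_star**0.16                       # mass-averaged dust temperature [K]

nu = 340e9                                         # ALMA band 7 frequency [Hz]
F_nu = 1.0e-26                                     # assumed 1 mJy continuum flux [cgs]
d = 140 * 3.086e18                                 # assumed 140 pc distance [cm]
kappa = 3.5                                        # assumed dust opacity [cm^2 g^-1]

M_dust = F_nu * d**2 / (kappa * planck_nu(nu, T_dust))
print(f"T_dust ~ {T_dust:.1f} K, M_dust ~ {M_dust / 5.97e27:.2f} Earth masses")
```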