
    DeFeeNet: Consecutive 3D Human Motion Prediction with Deviation Feedback

    Let us rethink the real-world scenarios that require human motion prediction techniques, such as human-robot collaboration. Current works simplify the task of predicting human motion into a one-off process: forecasting a short future sequence (usually no longer than 1 second) from a historical observed one. However, such simplification may fail to meet practical needs, because motion prediction in real applications is not an isolated ``observe then predict'' unit, but a consecutive process composed of many rounds of such units, semi-overlapped along the entire sequence. As time goes on, the predicted part of the previous round has its corresponding ground truth observable in the new round, but the deviation between them is neither exploited nor capturable by the existing isolated learning fashion. In this paper, we propose DeFeeNet, a simple yet effective network that can be added onto existing one-off prediction models to realize deviation perception and feedback in the consecutive motion prediction task. At each prediction round, the deviation generated by the previous unit is first encoded by DeFeeNet and then incorporated into the existing predictor to enable a deviation-aware prediction manner, which, for the first time, allows information to be transmitted across adjacent prediction units. We design two versions of DeFeeNet, MLP-based and GRU-based. On Human3.6M and the more complicated BABEL, experimental results indicate that our proposed network improves consecutive human motion prediction performance regardless of the basic model.
    Comment: accepted by CVPR202
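    The round-by-round deviation feedback described in this abstract can be illustrated with a minimal numerical sketch. This is not the authors' architecture: `base_predictor`, `encode_deviation`, and all shapes here are hypothetical stand-ins for an arbitrary one-off predictor and a small MLP-style encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def base_predictor(history):
    # Stand-in for any one-off motion predictor: constant-velocity extrapolation.
    velocity = history[-1] - history[-2]
    return history[-1] + velocity * np.arange(1, 6)[:, None]

def encode_deviation(deviation, W):
    # Hypothetical MLP-style deviation encoder (one linear layer + ReLU).
    return np.maximum(deviation.reshape(-1) @ W, 0.0)

T, J = 30, 3                                            # frames, joint coordinates
sequence = np.cumsum(rng.normal(size=(T, J)), axis=0)   # toy motion sequence
W = rng.normal(scale=0.1, size=(5 * J, J))

prediction = None
for start in range(10, T - 5, 5):       # semi-overlapped prediction rounds
    history = sequence[start - 10:start]
    if prediction is None:
        prediction = base_predictor(history)
    else:
        # Ground truth for the previous round's prediction is now observable.
        deviation = sequence[start - 5:start] - prediction
        feedback = encode_deviation(deviation, W)
        prediction = base_predictor(history) + feedback  # deviation-aware round

print(prediction.shape)  # (5, 3): 5 predicted frames, 3 coordinates
```

    The point of the sketch is the data flow: each round's residual against newly observed ground truth is encoded and injected into the next round, which an isolated ``observe then predict'' loop never does.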

    On Monte Carlo methods for the Dirichlet process mixture model, and the selection of its precision parameter prior

    Two issues commonly faced by users of Dirichlet process mixture models are: 1) how to appropriately select a hyperprior for the precision parameter alpha, and 2) the typically slow mixing of the MCMC chain produced by conditional Gibbs samplers based on the stick-breaking representation, as opposed to marginal collapsed Gibbs samplers based on the Polya urn, which have smaller integrated autocorrelation times. In this thesis, we analyse the most common approaches to hyperprior selection for alpha, identify their limitations, and propose a new methodology to overcome them. First, to address slow mixing, we revisit three label-switching Metropolis moves from the literature (Hastie et al., 2015; Papaspiliopoulos and Roberts, 2008), improve them, and introduce a fourth move. Second, we revisit two i.i.d. sequential importance samplers which operate in the collapsed space (Liu, 1996; S. N. MacEachern et al., 1999), and we develop a new sequential importance sampler for the stick-breaking parameters of Dirichlet process mixtures, which operates in the stick-breaking space and has minimal integrated autocorrelation time. Third, we introduce the i.i.d. transcoding algorithm which, conditional on a partition of the data, can infer which specific stick in the stick-breaking construction each observation originated from. We use it as a building block to develop the transcoding sampler, which removes the need for label-switching Metropolis moves in the conditional stick-breaking sampler: it uses the better-performing marginal sampler (or any other sampler) to drive the MCMC chain, and augments its exchangeable partition posterior with conditional i.i.d. stick-breaking parameter inferences after the fact, thereby inheriting the shorter autocorrelation times.
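    For readers unfamiliar with the stick-breaking representation this abstract builds on, a minimal truncated sketch of drawing Dirichlet process weights and cluster labels (truncation level, alpha, and sample sizes are illustrative choices, not values from the thesis):

```python
import numpy as np

rng = np.random.default_rng(42)

def stick_breaking_weights(alpha, n_sticks):
    # Truncated stick-breaking construction of a Dirichlet process:
    # v_k ~ Beta(1, alpha), w_k = v_k * prod_{j<k} (1 - v_j).
    v = rng.beta(1.0, alpha, size=n_sticks)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining

alpha = 2.0                     # the precision parameter discussed above
w = stick_breaking_weights(alpha, n_sticks=50)

# Sample cluster labels for 100 observations from the (renormalised) weights.
labels = rng.choice(len(w), size=100, p=w / w.sum())
print(w.sum())                  # close to 1 for a long enough truncation
print(len(np.unique(labels)))   # number of occupied sticks
```

    Larger alpha spreads mass over more sticks, which is why the hyperprior on alpha governs the implied number of clusters.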

    Bayesian Reconstruction of Magnetic Resonance Images using Gaussian Processes

    A central goal of modern magnetic resonance imaging (MRI) is to reduce the time required to produce high-quality images. Efforts have included hardware and software innovations such as parallel imaging, compressed sensing, and deep learning-based reconstruction. Here, we propose and demonstrate a Bayesian method to build statistical libraries of magnetic resonance (MR) images in k-space and use these libraries to identify optimal subsampling paths and reconstruction processes. Specifically, we compute a multivariate normal distribution based upon Gaussian processes using a publicly available library of T1-weighted images of healthy brains. We combine this library with physics-informed envelope functions to retain only meaningful correlations in k-space. This covariance function is then used to select a series of ring-shaped subsampling paths using Bayesian optimization, such that they optimally explore space while remaining practically realizable in commercial MRI systems. Combining optimized subsampling paths found for a range of images, we compute a generalized sampling path that, when used for novel images, produces structural similarity and error superior to previously reported reconstruction processes (i.e., 96.3% structural similarity and <0.003 normalized mean squared error from sampling only 12.5% of the k-space data). Finally, we use this reconstruction process on pathological data without retraining to show that reconstructed images are clinically useful for stroke identification.
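    The core conditioning step, estimating unobserved k-space points from a subsampled set under an envelope-damped Gaussian covariance, can be sketched in a 1D toy setting. The kernels, length scales, and sizes below are illustrative assumptions, not the paper's library-derived covariance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1D analogue: treat a k-space line as jointly Gaussian, with a
# squared-exponential covariance damped by an exponential envelope.
n = 64
k = np.arange(n)
base_cov = np.exp(-0.5 * (k[:, None] - k[None, :]) ** 2 / 4.0)
envelope = np.exp(-np.abs(k[:, None] - k[None, :]) / 20.0)
K = base_cov * envelope + 1e-6 * np.eye(n)      # jitter for stability

signal = np.linalg.cholesky(K) @ rng.normal(size=n)   # simulated k-space line

observed = np.arange(0, n, 8)                   # 12.5% subsampling, as above
K_oo = K[np.ix_(observed, observed)]
K_ao = K[:, observed]

# Gaussian-process posterior mean of the full line given the subsample.
reconstruction = K_ao @ np.linalg.solve(K_oo, signal[observed])

print(np.max(np.abs(reconstruction[observed] - signal[observed])))  # ~0
```

    The posterior mean interpolates exactly at observed points and fills unobserved points according to the learned correlations, which is the mechanism the envelope functions are shaping.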

    Limit theorems for non-Markovian and fractional processes

    This thesis examines various non-Markovian and fractional processes---rough volatility models, stochastic Volterra equations, Wiener chaos expansions---through the prism of asymptotic analysis. Stochastic Volterra systems serve as a conducive framework encompassing most rough volatility models used in mathematical finance. In Chapter 2, we provide a unified treatment of pathwise large and moderate deviations principles for a general class of multidimensional stochastic Volterra equations with singular kernels, not necessarily of convolution form. Our methodology is based on the weak convergence approach by Budhiraja, Dupuis and Ellis. This powerful approach also enables us to investigate the pathwise large deviations of families of white noise functionals characterised by their Wiener chaos expansion as $X^\varepsilon = \sum_{n=0}^{\infty} \varepsilon^n I_n\big(f_n^{\varepsilon}\big)$. In Chapter 3, we provide sufficient conditions for the large deviations principle to hold in path space, thereby revisiting a problem left open by Pérez-Abreu (1993). Hinging on analysis on Wiener space, the proof involves describing, controlling and identifying the limit of perturbed multiple stochastic integrals. In Chapter 4, we come back to mathematical finance via the route of Malliavin calculus. We present explicit small-time formulae for the at-the-money implied volatility, skew and curvature in a large class of models, including rough volatility models and their multi-factor versions. Our general setup encompasses both European options on a stock and VIX options. In particular, we develop a detailed analysis of the two-factor rough Bergomi model. Finally, in Chapter 5, we consider the large-time behaviour of affine stochastic Volterra equations, an under-developed area in the absence of Markovianity. We leverage a measure-valued Markovian lift introduced by Cuchiero and Teichmann and the associated notion of generalised Feller property. This setting allows us to prove the existence of an invariant measure for the lift and hence of a stationary distribution for the affine Volterra process, featuring in the rough Heston model.

    Statistical-dynamical analyses and modelling of multi-scale ocean variability

    This thesis aims to provide a comprehensive analysis of multi-scale oceanic variabilities using various statistical and dynamical tools, and to explore data-driven methods for correct statistical emulation of the oceans. We considered the classical, wind-driven, double-gyre ocean circulation model in quasi-geostrophic approximation and obtained its eddy-resolving solutions in terms of potential vorticity anomaly (PVA) and geostrophic streamfunctions. The reference solutions possess two asymmetric gyres of opposite circulations and a strong meandering eastward jet separating them, with rich eddy activities around it, analogous to the Gulf Stream in the North Atlantic and the Kuroshio in the North Pacific. This thesis is divided into two parts. The first part discusses a novel scale-separation method based on local spatial correlations, called correlation-based decomposition (CBD), and provides a comprehensive analysis of mesoscale eddy forcing. In particular, we analyse the instantaneous and time-lagged interactions between the diagnosed eddy forcing and the evolving large-scale PVA using the novel `product integral' characteristics. The product integral time series uncover robust causality between two drastically different yet interacting flow quantities, termed `eddy backscatter'. We also show data-driven augmentation of non-eddy-resolving ocean models by feeding them the eddy fields to restore the missing eddy-driven features, such as the merging western boundary currents, their eastward extension, and the low-frequency variabilities of gyres. In the second part, we present a systematic inter-comparison of linear regression (LR), stochastic, and deep-learning methods to build low-cost reduced-order statistical emulators of the oceans. We obtain forecasts on seasonal and centennial timescales and assess them for skill, cost and complexity. We found that the multi-level linear stochastic model performs best, followed by the hybrid stochastically-augmented deep learning models. The superiority of these methods underscores the importance of incorporating core dynamics, memory effects and model errors for robust emulation of multi-scale dynamical systems, such as the oceans.
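    The eddy/large-scale separation and the `product integral' diagnostic can be illustrated on a toy snapshot sequence. The moving-average filter below is a crude stand-in for the correlation-based decomposition, and all sizes and the diagnostic's exact form are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def large_scale(field, width=5):
    # Crude stand-in for CBD: a separable moving-average spatial filter
    # splitting the field into large-scale flow plus eddy anomaly.
    kernel = np.ones(width) / width
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, field)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, rows)

T, N = 20, 32
snapshots = np.cumsum(rng.normal(size=(T, N, N)), axis=0)  # toy PVA snapshots

product_integral = []
for pva in snapshots:
    mean_flow = large_scale(pva)
    eddy = pva - mean_flow                      # eddy anomaly
    # Domain integral of the eddy/large-scale product, one value per snapshot.
    product_integral.append(np.sum(eddy * mean_flow))

print(len(product_integral))  # a time series of 20 diagnostic values
```

    Tracking such a time series (instantaneously and at time lags) is what lets one probe causality between the eddy forcing and the evolving large-scale flow.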

    Application of advanced fluorescence microscopy and spectroscopy in live-cell imaging

    Since its inception, fluorescence microscopy has been a key source of discoveries in cell biology. Advancements in fluorophores, labeling techniques and instrumentation have made fluorescence microscopy a versatile quantitative tool for studying dynamic processes and interactions both in vitro and in live cells. In this thesis, I apply quantitative fluorescence microscopy techniques in live-cell environments to investigate several biological processes. To study Gag processing in HIV-1 particles, fluorescence lifetime imaging microscopy and single particle tracking are combined to follow nascent HIV-1 virus particles during assembly and release on the plasma membrane of living cells. Proteolytic release of eCFP embedded in the Gag lattice of immature HIV-1 virus particles results in a characteristic increase in its fluorescence lifetime. Gag processing and rearrangement can be detected in individual virus particles using this approach. In another project, a robust method for quantifying Förster resonance energy transfer (FRET) in live cells is developed to allow direct comparison of live-cell FRET experiments between laboratories. Finally, I apply image fluctuation spectroscopy to study protein behavior in a variety of cellular environments. Image cross-correlation spectroscopy is used to study the oligomerization of CXCR4, a G-protein coupled receptor on the plasma membrane. With raster image correlation spectroscopy, I measure the diffusion of histones in the nucleoplasm and heterochromatin domains of the nuclei of early mouse embryos. The lower diffusion coefficient of histones in the heterochromatin domain supports the conclusion that heterochromatin forms a liquid phase-separated domain. The wide range of topics covered in this thesis demonstrates that fluorescence microscopy is more than just an imaging tool: it is also a powerful instrument for the quantification and elucidation of dynamic cellular processes.
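    The lifetime-based FRET readout mentioned above follows the standard relation E = 1 - tau_DA / tau_D, where tau_DA is the donor lifetime in the presence of the acceptor and tau_D the unquenched donor lifetime. A minimal sketch (the example lifetimes are illustrative, not measurements from the thesis):

```python
def fret_efficiency_from_lifetimes(tau_da, tau_d):
    # Standard lifetime-based FRET efficiency: E = 1 - tau_DA / tau_D.
    # tau_da: donor lifetime with acceptor present; tau_d: donor-only lifetime.
    return 1.0 - tau_da / tau_d

# Example: donor lifetime drops from 2.5 ns to 1.5 ns upon energy transfer.
print(fret_efficiency_from_lifetimes(1.5, 2.5))  # 0.4
```

    Because lifetimes are independent of fluorophore concentration and excitation intensity, this readout is what makes FRET measurements comparable across instruments and laboratories.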

    A Case Study Examining Japanese University Students' Digital Literacy and Perceptions of Digital Tools for Academic English learning

    Current Japanese youth are constantly connected to the Internet and using digital devices, but predominantly for social media and entertainment. According to literature on the Japanese digital native, tertiary students do not, and cannot, use technology with any reasonable fluency, but the likely reasons are rarely addressed. To fill this gap in the literature, this study employs a case study methodology to explore students' experience with technology for English learning through the introduction of digital tools. First-year Japanese university students in an Academic English Program (AEP) were introduced to a variety of easily available digital tools. The instruction was administered online, and each tool was accompanied by a task directly related to classwork. Both quantitative and qualitative data were collected in the form of a pre-course Computer Literacy Survey, a post-course open-ended Reflection Activity survey, and interviews. The qualitative data were reviewed drawing on the Technology Acceptance Model (TAM) and its educational variants as an analytical framework. Educational, social, and cultural factors were also examined to help identify underlying factors that would influence students' perceptions. The results suggest that the subjects' lack of awareness of, and experience with, the use of technology for learning is the fundamental cause of their perceptions of initial difficulty. Based on these findings, this study proposes a possible technology integration model that enhances digital literacy for more effective language learning in the context of Japanese education.

    Addressing infrastructure challenges posed by the Harwich Formation through understanding its geological origins

    Variable deposits known to make up the sequence of the Harwich Formation in London have been the subject of ongoing uncertainty within the engineering industry. Current stratigraphical subdivisions do not account for the systematic recognition of individual members in unexposed ground, where recovered material is usually disturbed: fines are flushed out during the drilling process, and loose materials are often lost or mixed with the surrounding layers. Most engineering problems associated with the Harwich Formation deposits stem from their unconsolidated nature and irregular cementation within layers. The consequent engineering hazards are commonly reflected in high permeability, raised groundwater pressures, ground settlements when found near the surface, and poor stability when exposed during excavations or tunnelling operations. This frequently leads to sudden design changes or requires contingency measures during construction, all of which can result in damaged equipment, slow progress, and unforeseen costs. This research proposes a facies-based approach, in which lithological facies were assigned based on reinterpretation of available borehole data from various ground investigations in London, supported by visual inspection of deposits in situ and a selection of laboratory testing including Particle Size Distribution, Optical and Scanning Electron Microscopy, and X-ray Diffraction analyses. Two ground models were developed as a result: first, a 3D geological model (MOVE model) of the stratigraphy found within the study area, which explores the influence of local structural processes controlling these sediments pre-, syn- and post-deposition; and second, a sequence stratigraphic model (Dionisos Flow model) unveiling stratal geometries of facies at various stages of accretion. The models present a series of sediment distribution maps, localised 3D views and cross-sections that aim to provide a novel approach to assist the geotechnical industry in predicting the likely distribution of the Harwich Formation deposits, decreasing the engineering risks associated with this stratum.

    Management controls, government regulations, customer involvement: Evidence from a Chinese family-owned business

    This research reports on a case study of a family-owned elevator manufacturing company in China, where management control was sandwiched between state policies and global customer production requirements. By analysing the roles of government and customer, this thesis aims to illustrate how management control operated in a family-owned business, and how and why such businesses do management control differently. In particular, it focuses on how international production standards and existing Chinese industry policies were translated into a set of management control practices through a local network within the family-owned business I studied. Based on an ethnographic approach, I spent six months in the field, conducted over 30 interviews and several conversations, and reviewed relevant internal documents to understand how management control (MC) techniques and human actors cooperated in the company. I also examined how two layers of pressure shaped company behaviour, and how a company located in a developing country connects with a global network. I found considerable tension among key actors, and investigated how the company responded to and managed it. Drawing on Actor Network Theory (ANT), I analysed the interviews with key actors and examined the roles of government regulations and customer requirements to see how management control was managed under two layers of pressure, i.e., government regulations (e.g., labour, tax, environment control) and customer requirements (e.g., quality and production control). Management controls acted as an obligatory passage point (OPP): Western production requirements and government requirements were transformed as they arrived at the local Chinese factory, and influenced management control and budgeting. The findings suggest that management control systems are not only a set of technical procedures but also a means of managing tensions. A purely technical understanding reflects a linear rather than a social perspective on MC practices. When we use ANT as a theoretical perspective, however, we see actors who are obliged, sandwiched, and controlled by external forces they must follow. Consequently, human actors must work through an unavoidable OPP. This is the tension they face, which constructs the mundane practices of MC; hence, MCs are about managing such tensions. This study contributes to management control research by analysing management controls in terms of an OPP, extends our understanding by illustrating the roles of the government and customers, and deepens our understanding of family-owned businesses from a management controls perspective in a developing country.