
    L2-Homogenization of Heat Equations on Tubular Neighborhoods

    We consider the heat equation with Dirichlet boundary conditions on the tubular neighborhood of a closed Riemannian submanifold. We show that, as the tube radius decreases, the semigroup of a suitably rescaled and renormalized generator can be effectively described by a Hamiltonian on the submanifold with a potential that depends on the geometry of the submanifold and of the embedding. Comment: 34 pages

    Chernoff's Theorem and Discrete Time Approximations of Brownian Motion on Manifolds

    Let S = (S(t)) be a one-parameter family of positive integral operators on a locally compact space L. For a possibly non-uniform partition of [0,1], define a measure on the path space C([0,1],L) by using (a) S(dt) for the transition between consecutive partition times of distance dt, and (b) a suitable continuous interpolation scheme (e.g. Brownian bridges or geodesics); if necessary, normalize to get a probability measure. We prove a version of Chernoff's theorem of semigroup theory and tightness results which together yield convergence in law of such measures as the partition gets finer. In particular, let L be a closed smooth submanifold of a Riemannian manifold M. We prove convergence of Brownian motion on M, conditioned to visit L at all partition times, to a process on L whose law has a Radon-Nikodym density with respect to Brownian motion on L which contains scalar, mean and sectional curvature terms. Various approximation schemes for Brownian motion are also given. These results substantially extend earlier work by the authors and by Andersson and Driver. Comment: 35 pages, revised version for publication, more detailed exposition
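    The discrete-time approximation schemes discussed here can be illustrated on the simplest closed manifold, the unit circle, where a geodesic random walk with Gaussian step lengths reproduces wrapped Brownian motion exactly in distribution. The following sketch is our illustration, not the paper's construction; function names and sample sizes are arbitrary.

```python
import math
import random

def geodesic_walk_circle(t, n_steps, rng):
    """Discrete-time approximation of Brownian motion on the unit circle:
    at each partition time, take a geodesic step of signed length
    sqrt(dt) * N(0,1); the resulting angle is wrapped modulo 2*pi."""
    dt = t / n_steps
    theta = 0.0
    for _ in range(n_steps):
        theta += math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return theta % (2.0 * math.pi)

# Sanity check against the exact heat kernel on the circle:
# for Brownian motion started at angle 0, E[cos(theta_t)] = exp(-t/2).
rng = random.Random(0)
t, n_paths = 1.0, 20000
est = sum(math.cos(geodesic_walk_circle(t, 50, rng))
          for _ in range(n_paths)) / n_paths
```

    On the circle the sum of Gaussian geodesic steps is itself Gaussian, so convergence in law is immediate; on a curved manifold the analogous walk only converges as the partition gets finer, which is where the Chernoff-type argument is needed.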

    Smooth Homogenization of Heat Equations on Tubular Neighborhoods

    We consider the heat equation with Dirichlet boundary conditions on the tubular neighborhood of a closed Riemannian submanifold. We show that, as the tube diameter tends to zero, a suitably rescaled and renormalized semigroup converges to a limit semigroup in Sobolev spaces of arbitrarily large Sobolev index. Comment: 30 pages

    Parsimonious Segmentation of Time Series by Potts Models

    Typical problems in the analysis of data sets like time-series or images crucially rely on the extraction of primitive features based on segmentation. Variational approaches are a popular and convenient framework in which such problems can be studied. We focus on Potts models as simple nontrivial instances. The discussion proceeds along two data sets from brain mapping and functional genomics
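    In one dimension the Potts functional, squared error plus a penalty gamma for each segment boundary, can be minimized exactly by dynamic programming in O(n^2). The sketch below is a minimal illustration under that standard formulation; the function names and the toy step signal are ours, not the paper's.

```python
def potts_segment(y, gamma):
    """Exact 1D Potts segmentation: minimize SSE + gamma * (#jumps)
    over piecewise constant fits, via dynamic programming."""
    n = len(y)
    # prefix sums give O(1) squared error of the best constant on y[l:r]
    s1 = [0.0] * (n + 1)
    s2 = [0.0] * (n + 1)
    for i, v in enumerate(y):
        s1[i + 1] = s1[i] + v
        s2[i + 1] = s2[i] + v * v

    def sse(l, r):
        m = r - l
        return s2[r] - s2[l] - (s1[r] - s1[l]) ** 2 / m

    # B[r] = optimal value for the prefix y[:r]; B[0] = -gamma cancels
    # the penalty of the first segment, so only jumps are charged.
    B = [0.0] * (n + 1)
    prev = [0] * (n + 1)
    B[0] = -gamma
    for r in range(1, n + 1):
        best, arg = float("inf"), 0
        for l in range(r):
            c = B[l] + gamma + sse(l, r)
            if c < best:
                best, arg = c, l
        B[r], prev[r] = best, arg

    # backtrack to recover segments (start, end, mean)
    bounds, r = [], n
    while r > 0:
        bounds.append(r)
        r = prev[r]
    bounds.reverse()
    segs, l = [], 0
    for r in bounds:
        segs.append((l, r, (s1[r] - s1[l]) / (r - l)))
        l = r
    return segs

# toy example: a noiseless step signal with one change point
segs = potts_segment([0.0] * 10 + [5.0] * 10, 1.0)
```

    For this signal the optimum places the single boundary at index 10, since any extra segment costs gamma without reducing the squared error.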

    Robust nonparametric detection of objects in noisy images

    We propose a novel statistical hypothesis testing method for detection of objects in noisy images. The method uses results from percolation theory and random graph theory. We present an algorithm that allows one to detect objects of unknown shapes in the presence of nonparametric noise of unknown level and unknown distribution. No boundary shape constraints are imposed on the object; only a weak bulk condition for the object's interior is required. The algorithm has linear complexity and exponential accuracy and is appropriate for real-time systems. In this paper, we develop further the mathematical formalism of our method and explore important connections to the mathematical theory of percolation and statistical physics. We prove results on consistency and algorithmic complexity of our testing procedure. In addition, we address not only the asymptotic behavior of the method, but also its finite-sample performance
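    A toy version of the percolation idea: threshold the image so that pure noise is subcritical for site percolation (occupancy below the square-lattice critical probability of about 0.593), then flag an object when the largest 4-connected cluster of exceedances is larger than subcritical noise would plausibly produce. The threshold and critical cluster size below are illustrative choices, not the paper's calibrated values.

```python
import random
from collections import deque

def largest_cluster(img, thresh):
    """Size of the largest 4-connected component of pixels above thresh."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    best = 0
    for i in range(h):
        for j in range(w):
            if img[i][j] > thresh and not seen[i][j]:
                seen[i][j] = True
                size, q = 0, deque([(i, j)])
                while q:  # breadth-first search over the cluster
                    x, y = q.popleft()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = x + dx, y + dy
                        if (0 <= a < h and 0 <= b < w
                                and img[a][b] > thresh and not seen[a][b]):
                            seen[a][b] = True
                            q.append((a, b))
                best = max(best, size)
    return best

def detect(img, thresh, crit):
    """Reject 'noise only' when the largest cluster reaches crit pixels."""
    return largest_cluster(img, thresh) >= crit

# illustrative data: uniform noise, plus a copy with a planted 6x6 object
rng = random.Random(1)
noise = [[rng.random() for _ in range(30)] for _ in range(30)]
img = [row[:] for row in noise]
for i in range(12, 18):
    for j in range(12, 18):
        img[i][j] = 1.0
```

    With threshold 0.7 the noise occupies each site with probability 0.3, well below criticality, so noise clusters stay small, while the planted object alone contributes a 36-pixel cluster. The scan is a single pass over the pixels, matching the linear complexity claimed in the abstract.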

    Scale space consistency of piecewise constant least squares estimators -- another look at the regressogram

    We study the asymptotic behavior of piecewise constant least squares regression estimates, when the number of partitions of the estimate is penalized. We show that the estimator is consistent in the relevant metric if the signal is in L^2([0,1]), in the space of càdlàg functions equipped with the Skorokhod metric, or in C([0,1]) equipped with the supremum metric. Moreover, we consider the family of estimates under a varying smoothing parameter, also called scale space. We prove convergence of the empirical scale space towards its deterministic target. Comment: Published at http://dx.doi.org/10.1214/074921707000000274 in the IMS Lecture Notes Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org)

    An Elementary Rigorous Introduction to Exact Sampling

    We introduce coupling from the past, a recently developed method for exact sampling from a given distribution. Focus is on rigour and thorough proofs. We stay on an elementary level which requires little or no prior knowledge of probability theory. This should fill an obvious gap between innumerable intuitive and incomplete reviews, and the few precise derivations on an abstract level
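    Coupling from the past is easiest to see on a monotone chain, where it suffices to couple only the top and bottom states. The sketch below uses a reflecting birth-death walk as an illustrative example (the chain and its parameters are our choice, not from the text): trajectories started from the extremes at time -T share the same randomness, and T is doubled until they coalesce, at which point the common value is an exact draw from the stationary distribution.

```python
import random

def update(x, u, p=0.3, n_max=4):
    """Monotone random mapping on {0,...,n_max}: step up with
    probability p, otherwise step down, reflecting at the boundaries."""
    return min(x + 1, n_max) if u < p else max(x - 1, 0)

def cftp(rng, n_max=4):
    """Propp-Wilson coupling from the past for the monotone chain above."""
    us = []
    T = 1
    while True:
        # fresh randomness is PREPENDED: it belongs to earlier times,
        # while already-drawn randomness keeps its position near time 0
        us = [rng.random() for _ in range(T - len(us))] + us
        hi, lo = n_max, 0
        for u in us:
            hi = update(hi, u)
            lo = update(lo, u)
        if hi == lo:          # coalescence: every start gives this value
            return hi
        T *= 2

# draw exact samples; by detailed balance the stationary law is
# pi(x) proportional to (p/(1-p))^x
rng = random.Random(42)
draws = [cftp(rng) for _ in range(20000)]
```

    Reusing the old random numbers at the most recent time steps, rather than redrawing them, is exactly what distinguishes coupling from the past from naive forward coupling and makes the output bias-free.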

    Importance sampling for high speed statistical Monte-Carlo simulations

    As transistor dimensions of Static Random Access Memory (SRAM) become smaller with each new technology generation, they become increasingly susceptible to statistical variations in their parameters. These statistical variations can result in failing memory. SRAM is used as a building block for the construction of large Integrated Circuits (ICs). To ensure SRAM does not degrade the yield (fraction of functional devices) of ICs, very low failure probabilities of P_fail = 10^-10 are strived for. For instance, in SRAM memory design one aims at a 0.1% yield loss for a 10 Mbit memory, which means that 1 in 10 billion cells fails (P_fail = 10^-10; this corresponds to an occurrence of -6.4σ when dealing with a normal distribution). To simulate such probabilities, traditional Monte Carlo simulations are not sufficient and more advanced techniques are required. Importance sampling is a technique that is relatively easy to implement and provides sufficiently accurate results. It is a well-known technique in statistics for estimating the occurrence of rare events. Rare or extreme events can be associated with dramatic costs, as in finance, or with safety concerns (dikes, power plants). Recently this technique has also received new attention in circuit design. Importance sampling tunes Monte Carlo to the area in parameter space from which the rare events are generated. This achieves a speedup of several orders of magnitude compared to standard Monte Carlo methods. We describe the underlying mathematics. Experiments reveal the intrinsic power of the method. The efficiency of the method increases when the dimension of the parameter space increases. The method could be a valuable extension to the statistical capabilities of any circuit simulator. A Matlab implementation is included in the Appendix
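    The mean-shift form of importance sampling described here can be sketched in a few lines: to estimate P(X > a) for a standard normal X at the -6.4σ point mentioned above, sample from N(a, 1) instead, so that "failures" are common, and reweight each hit by the likelihood ratio. This is our minimal one-dimensional illustration, not the paper's Matlab implementation; the sample size and seed are arbitrary.

```python
import math
import random

def tail_prob_is(a, n_samples, rng):
    """Importance-sampling estimate of P(X > a) for X ~ N(0,1).
    Proposal: Y ~ N(a,1). The likelihood ratio phi(y)/phi(y - a)
    simplifies algebraically to exp(a^2/2 - a*y)."""
    acc = 0.0
    for _ in range(n_samples):
        y = rng.gauss(a, 1.0)
        if y > a:
            acc += math.exp(0.5 * a * a - a * y)
    return acc / n_samples

rng = random.Random(0)
a = 6.4                       # the -6.4 sigma failure point from the text
est = tail_prob_is(a, 100000, rng)
exact = 0.5 * math.erfc(a / math.sqrt(2.0))   # closed-form normal tail
```

    With the proposal centered at the threshold, roughly half the samples land in the failure region, so 10^5 samples already give a relative error of about 1%; a plain Monte Carlo run would need on the order of 10^12 samples to see even a handful of events at P_fail = 10^-10.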

    Prevalence of sensory impairments in home care and long-term care using interRAI data from across Canada

    Background: In the general population, sensory impairments increase markedly with age in adults over 60 years of age. We estimated the prevalence of hearing loss only (HL), vision loss only (VL), and a combined impairment (i.e., dual sensory loss or DSL) in Canadians receiving home care (HC) or long-term care (LTC). Methods: Annual cross-sectional analyses were conducted using data collected with one of two interRAI assessments, one used for the HC setting (n = 2,667,199) and one for LTC (n = 1,538,691). Items in the assessments were used to measure three mutually exclusive outcomes: prevalence of VL only, HL only, or DSL. Trends over time for each outcome were examined using the Cochran-Armitage trend test. A negative binomial model was used to quantify the trends over time for each outcome while adjusting for age, sex and province. Results: In HC, there was a significant trend in the rate for all three outcomes (p < 0.001), with a small increase (roughly 1%) each year. In HC, HL was the most prevalent sensory loss, with a rate of roughly 25% to 29%, while in LTC, DSL was the most prevalent impairment, at roughly 25% across multiple years of data. In both settings, roughly 60% of the sample was female. Males in both HC and LTC had a higher prevalence of HL compared to females, but the differences were very small (no more than 2% in any given year). The prevalence of HL differed by province after adjusting for year, age and sex. Compared to Ontario, Yukon Territory had a 26% higher rate of HL in HC (relative rate [RR] = 1.26; 95% confidence interval [CI]: 1.11, 1.43), but LTC residents in Newfoundland and Labrador had a significantly lower rate of HL (RR: 0.57; CI: 0.43, 0.76). When combined, approximately 60% of LTC residents or HC clients had at least one sensory impairment. Conclusions: Sensory impairments are highly prevalent in both HC and LTC, with small sex-related differences and some variation across Canadian provinces. The interRAI assessments provide clinicians with valuable information to inform care planning and can also be used to estimate the prevalence of these impairments in specific population sub-groups
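    The Cochran-Armitage trend test used for the year-over-year rates has a simple closed form. The sketch below implements the standard two-sided version with equally spaced scores; the yearly counts are clearly hypothetical, not the interRAI data.

```python
import math

def cochran_armitage(cases, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in binomial
    proportions: cases[i] positives out of totals[i] in ordered group i."""
    k = len(cases)
    s = scores if scores is not None else list(range(k))
    n = sum(totals)
    p_bar = sum(cases) / n
    # test statistic: score-weighted deviations from the pooled proportion
    t = sum(si * (xi - ni * p_bar) for si, xi, ni in zip(s, cases, totals))
    sw = sum(ni * si for si, ni in zip(s, totals))
    sw2 = sum(ni * si * si for si, ni in zip(s, totals))
    var = p_bar * (1.0 - p_bar) * (sw2 - sw * sw / n)
    z = t / math.sqrt(var)
    p_value = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided normal p-value
    return z, p_value

# hypothetical yearly prevalence counts with a small rising trend,
# e.g. four annual cross-sections of 1,000 clients each
z, p = cochran_armitage([240, 260, 280, 300], [1000] * 4)
```

    A positive z with a small p-value indicates a rising prevalence across the ordered years, which is the direction of the roughly 1% annual increase reported above.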

    A multimodal imaging workflow for monitoring CAR T cell therapy against solid tumor from whole-body to single-cell level

    CAR T cell research in solid tumors often lacks spatiotemporal information; there is therefore a need for molecular tomography to facilitate high-throughput preclinical monitoring of CAR T cells. Furthermore, a gap exists between macro- and micro-level imaging data for assessing intratumoral infiltration of therapeutic cells. We addressed this challenge by combining 3D micro-computed tomography bioluminescence tomography (µCT/BLT), light-sheet fluorescence microscopy (LSFM) and cyclic immunofluorescence (IF) staining. Methods: NSG mice with subcutaneous AsPC1 xenograft tumors were treated with EGFR CAR T cells (± IL-2) or control BDCA-2 CAR T cells (± IL-2) (n = 7 each). Therapeutic T cells were genetically modified to co-express the CAR of interest and the luciferase CBR2opt. IL-2 was administered s.c. under the xenograft tumor on days 1, 3, 5 and 7 post-therapy-initiation at a dose of 25,000 IU/mouse. CAR T cell distribution was measured by 2D BLI and 3D µCT/BLT every 3-4 days. On day 6, 4 tumors were excised for cyclic IF, where tumor sections were stained with a panel of 25 antibodies. On days 6 and 13, 8 tumors were excised from rhodamine-lectin-preinjected mice, permeabilized, stained for CD3 and imaged by LSFM. Results: 3D µCT/BLT revealed that CAR T cell pharmacokinetics are affected by antigen recognition: tumor accumulation based on target-dependent infiltration was significantly increased in comparison to target-independent infiltration, and spleen accumulation was delayed. LSFM supported these findings and revealed higher T cell accumulation in target-positive groups at day 6, which also infiltrated the tumor more deeply. Interestingly, LSFM showed that most CAR T cells accumulate at the tumor periphery and around vessels. Surprisingly, LSFM and cyclic IF revealed that local IL-2 application resulted in increased proliferation in the early phase but long-term overstimulation of CAR T cells, which negated the early therapeutic benefit.
    Conclusion: Overall, we demonstrated that 3D µCT/BLT is a valuable non-isotope-based technology for whole-body cell-therapy monitoring and for investigating CAR T cell pharmacokinetics. We also presented the combination of LSFM and MICS for ex vivo 3D and 2D microscopy tissue analysis to assess intratumoral therapeutic-cell distribution and status