528 research outputs found

    Sub-clinical assessment of atopic dermatitis severity using angiographic optical coherence tomography

    Measurement of sub-clinical atopic dermatitis (AD) is important for determining how long therapies should be continued after clinical clearance of visible AD lesions. An important biomarker of sub-clinical AD is epidermal hypertrophy, but structural measurement of it with optical coherence tomography (OCT) is often challenging due to the lack of a clearly delineated dermal-epidermal junction in AD patients. Alternatively, angiographic OCT measurements of vascular depth and morphology may represent a robust biomarker for quantifying the severity of clinical and sub-clinical AD. To investigate this, angiographic data sets were acquired from 32 patients with a range of AD severities. Deeper vascular layers within the skin were found to correlate with increasing clinical severity. Moreover, for AD patients exhibiting no clinical symptoms, the superficial plexus was found to be significantly deeper than in healthy patients at both the elbow (p = 0.04) and knee (p < 0.001), suggesting that sub-clinical changes in severity can be detected. The morphology of vessels also appeared altered in patients with severe AD, with significantly different vessel diameter, length, density and fractal dimension. These metrics provide valuable insight into the sub-clinical severity of the condition, allowing the effects of treatments to be monitored past the point of clinical remission.
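    One of the vessel-morphology metrics mentioned above, the fractal dimension, is commonly estimated by box counting on a segmented vessel mask. The sketch below is a generic illustration, not the study's pipeline; the function name, box sizes, and the toy mask are assumptions, and a 2-D boolean mask is assumed as input.

```python
import numpy as np

def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate fractal dimension as minus the slope of log N(s) vs log s,
    where N(s) is the number of s-by-s boxes containing any vessel pixel."""
    counts = []
    for s in sizes:
        h = (mask.shape[0] // s) * s          # trim so the grid tiles evenly
        w = (mask.shape[1] // s) * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(int(blocks.any(axis=(1, 3)).sum()))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# sanity check: a completely filled square has box-counting dimension 2
mask = np.ones((64, 64), dtype=bool)
d = box_count_dimension(mask)
```

A sparse, branching vessel network would give a value between 1 and 2, which is how the metric distinguishes morphologies.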

    Second order gradient ascent pulse engineering

    We report several improvements to the gradient ascent pulse engineering (GRAPE) algorithm for optimal control of quantum systems. These include more accurate gradients, convergence acceleration using the BFGS quasi-Newton algorithm, and faster control-derivative calculation algorithms. In all test systems, the wall-clock time and the convergence rates show a considerable improvement over approximate gradient ascent.
    Comment: Submitted for publication
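    A minimal sketch of the kind of optimization being improved here, assuming a single qubit, piecewise-constant controls, and SciPy's BFGS routine with finite-difference gradients; the paper's exact-gradient and propagator-derivative machinery is not reproduced, and all parameter values are illustrative.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

sx = np.array([[0, 1], [1, 0]], dtype=complex)    # control Hamiltonian (Pauli-X)
sz = np.array([[1, 0], [0, -1]], dtype=complex)   # drift Hamiltonian (Pauli-Z)
psi0 = np.array([1, 0], dtype=complex)            # initial state |0>
target = np.array([0, 1], dtype=complex)          # target state |1>
n_slices, dt = 20, 0.1                            # piecewise-constant pulse grid

def infidelity(controls):
    """1 - |<target|psi(T)>|^2 under piecewise-constant controls."""
    psi = psi0
    for c in controls:
        psi = expm(-1j * dt * (sz + c * sx)) @ psi
    return 1.0 - abs(target.conj() @ psi) ** 2

rng = np.random.default_rng(0)
x0 = 0.1 * rng.standard_normal(n_slices)          # small random initial pulse
res = minimize(infidelity, x0, method="BFGS")     # quasi-Newton descent on infidelity
```

The start point is randomized because the zero pulse is a stationary point of the fidelity; BFGS then drives the infidelity close to zero.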

    A Hamilton-Jacobi Formalism for Thermodynamics

    We show that classical thermodynamics has a formulation in terms of Hamilton-Jacobi theory, analogous to mechanics. Even though the thermodynamic variables come in conjugate pairs such as pressure/volume or temperature/entropy, the phase space is odd-dimensional: for a system with n thermodynamic degrees of freedom it is (2n+1)-dimensional. The equations of state of a substance pick out an n-dimensional submanifold. A family of substances whose equations of state depend on n parameters defines a hypersurface of co-dimension one. This can be described by the vanishing of a function which plays the role of a Hamiltonian. The ordinary differential equations (characteristic equations) defined by this function describe a dynamical system on the hypersurface. Its orbits can be used to reconstruct the equations of state. The `time' variable associated to this dynamics is related to, but not identical to, entropy. After developing this formalism on well-grounded systems such as the van der Waals gases and the Curie-Weiss magnets, we derive a Hamilton-Jacobi equation for black hole thermodynamics in General Relativity. The cosmological constant appears as a constant of integration in this picture.
    Comment: Minor typos fixed
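    For concreteness, the characteristic equations referred to above take the standard form for a first-order PDE (a textbook result, with notation assumed here rather than taken from the paper): writing the Hamiltonian-like function as F(q^i, p_i, S) on the (2n+1)-dimensional phase space, the characteristic system is

```latex
\dot{q}^{\,i} = \frac{\partial F}{\partial p_i}, \qquad
\dot{p}_i = -\left(\frac{\partial F}{\partial q^i} + p_i\,\frac{\partial F}{\partial S}\right), \qquad
\dot{S} = \sum_{i=1}^{n} p_i\,\frac{\partial F}{\partial p_i},
```

so the hypersurface F = 0 is preserved along the flow and its orbits trace out the equations of state.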

    Coastal squeeze on temperate reefs: Long-term shifts in salinity, water quality, and oyster-associated communities

    Foundation species, such as mangroves, saltmarshes, kelps, seagrasses, and oysters, thrive within suitable environmental envelopes as narrow ribbons along the land–sea margin. Therefore, these habitat-forming species and resident fauna are sensitive to modified environmental gradients. For oysters, many estuaries impacted by sea-level rise, channelization, and municipal infrastructure are experiencing saltwater intrusion and water-quality degradation that may alter reef distributions, functions, and services. To explore decadal-scale oyster–reef community patterns across a temperate estuary in response to environmental change, we resampled reefs in the Newport River Estuary (NRE) during 2013–2015 that had previously been studied during 1955–1956. We also coalesced historical NRE reef distribution (1880s–2015), salinity (1913–2015), and water-quality-driven shellfish closure boundary (1970s–2015) data to document environmental trends that could influence reef ecology and service delivery. Over the last 60–120 years, the entire NRE has shifted toward higher salinities. Consequently, oyster–reef communities have become less distinct across the estuary, manifested by 20%–27% lower species turnover and decreased faunal richness among NRE reefs in the 2010s relative to the 1950s. During the 2010s, NRE oyster–reef communities tended to cluster around a euhaline, intertidal-reef type more so than during the 1950s. This followed faunal expansions farther up estuary and biological degradation of subtidal reefs as NRE conditions became more marine and favorable for aggressive, reef-destroying taxa. In addition to these biological shifts, the area of suitable bottom on which subtidal reefs persist (contracting due to up-estuary intrusion of marine waters) and support human harvest (driven by water quality, eroding from up-estuary) has decreased by >75% since the natural history of NRE reefs was first explored.
This “coastal squeeze” on harvestable subtidal oysters (reduced from a 4.5-km to a 0.75-km envelope along the NRE's main axis) will likely have consequences regarding the economic incentives for future oyster conservation, as well as the suite of services delivered by remaining shellfish reefs (e.g., biodiversity maintenance, seafood supply). More broadly, these findings exemplify how “squeeze” may be a pervasive concern for biogenic habitats along terrestrial or marine ecotones during an era of intense global change.

    Characterizing the microcirculation of atopic dermatitis using angiographic optical coherence tomography

    Background and Aim: With inflammatory skin conditions such as atopic dermatitis (AD), epidermal thickness is mediated both by pathological hyperplasia and by atrophy, such as that resulting from corticosteroid treatment. Such changes are likely to influence the depth and shape of the underlying microcirculation. Optical coherence tomography (OCT) provides a non-invasive view into the tissue; however, structural measures of epidermal thickness are made challenging by the lack of a delineated dermal-epidermal junction in AD patients. Instead, angiographic extensions to OCT may allow for direct measurement of vascular depth, potentially presenting a more robust method of estimating the degree of epidermal thickening. Methods and results: To investigate microcirculatory changes within AD patients, volumes of angiographic OCT data were collected from 5 healthy volunteers and compared with those of 5 AD patients. Test sites included the cubital and popliteal fossae, which are commonly affected by AD. Measurements of the capillary loop and superficial arteriolar plexus (SAP) depth were acquired and used to estimate the lower and upper bounds of the undulating basement membrane of the dermal-epidermal junction. Furthermore, quantitative parameters such as vessel density and diameter were derived from each dataset and compared between groups. Capillary loop depth increased slightly for AD patients at the popliteal fossa, and the SAP was found to be measurably deeper in AD patients at both sites, likely due to localized epidermal hyperplasia.

    Low Complexity Regularization of Linear Inverse Problems

    Inverse problems and regularization theory form a central theme in contemporary signal processing, where the goal is to reconstruct an unknown signal from partial, indirect, and possibly noisy measurements of it. A now standard method for recovering the unknown signal is to solve a convex optimization problem that enforces some prior knowledge about its structure. This has proved efficient in many problems routinely encountered in imaging sciences, statistics and machine learning. This chapter delivers a review of recent advances in the field where the regularization prior promotes solutions conforming to some notion of simplicity/low-complexity. These priors encompass as popular examples sparsity and group sparsity (to capture the compressibility of natural signals and images), total variation and analysis sparsity (to promote piecewise regularity), and low rank (as a natural extension of sparsity to matrix-valued data). Our aim is to provide a unified treatment of all these regularizations under a single umbrella, namely the theory of partial smoothness. This framework is very general and accommodates all the low-complexity regularizers just mentioned, as well as many others. Partial smoothness turns out to be the canonical way to encode low-dimensional models that can be linear spaces or more general smooth manifolds. This review is intended to serve as a one-stop shop toward the understanding of the theoretical properties of the so-regularized solutions. It covers a large spectrum including: (i) recovery guarantees and stability to noise, both in terms of $\ell^2$-stability and model (manifold) identification; (ii) sensitivity analysis to perturbations of the parameters involved (in particular the observations), with applications to unbiased risk estimation; (iii) convergence properties of the forward-backward proximal splitting scheme, which is particularly well suited to solving the corresponding large-scale regularized optimization problem.

    Three-dimensional random Voronoi tessellations: From cubic crystal lattices to Poisson point processes

    We perturb the SC, BCC, and FCC crystal structures with a spatial Gaussian noise whose dimensionless strength is controlled by the parameter a, and analyze the topological and metrical properties of the resulting Voronoi tessellations (VT). The topological properties of the VT of the SC and FCC crystals are unstable with respect to the introduction of noise, because the corresponding polyhedra are geometrically degenerate, whereas the tessellation of the BCC crystal is topologically stable even against noise of small but finite intensity. For weak noise, the mean area of the perturbed BCC and FCC crystal VT increases quadratically with a. In the case of perturbed SC crystals, there is an optimal amount of noise that minimizes the mean area of the cells. Already for moderate noise (a>0.5), the properties of the three perturbed VT are indistinguishable, and for intense noise (a>2) the results converge to the Poisson-VT limit. Notably, two-parameter gamma distributions are an excellent model for the empirical distributions of all considered properties. The VT of the perturbed BCC and FCC structures are local maxima for the isoperimetric quotient, which measures the degree of sphericity of the cells, among space-filling VT. In the BCC case, this suggests a weaker form of the recently disproved Kelvin conjecture. Due to the fluctuations of the shape of the cells, anomalous scaling with exponents >3/2 is observed between the areas and the volumes of the cells, and, except for the FCC case, also for a->0. In the Poisson-VT limit, the exponent is about 1.67. As the number of faces is positively correlated with the sphericity of the cells, the anomalous scaling is greatly reduced when we perform power-law fits separately on cells with a specific number of faces.
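    The construction can be sketched numerically with SciPy; the following is an illustrative reimplementation (lattice size, noise level, seed, and helper names are assumptions, not the authors' code) that perturbs a BCC lattice with Gaussian noise of strength a and measures the volumes of the bounded Voronoi cells:

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def bcc_points(n):
    """BCC lattice with unit lattice constant: n^3 cube corners plus body centres."""
    g = np.arange(n, dtype=float)
    corners = np.stack(np.meshgrid(g, g, g), axis=-1).reshape(-1, 3)
    return np.vstack([corners, corners + 0.5])

rng = np.random.default_rng(1)
a = 0.3                                        # dimensionless noise strength
pts = bcc_points(6)
pts = pts + a * rng.standard_normal(pts.shape)  # Gaussian perturbation of each site
vor = Voronoi(pts)

# volumes of the bounded cells (cells of hull points are unbounded and skipped)
vols = []
for reg_idx in vor.point_region:
    cell = vor.regions[reg_idx]
    if len(cell) == 0 or -1 in cell:
        continue
    vols.append(ConvexHull(vor.vertices[cell]).volume)

med_vol = float(np.median(vols))               # ~0.5 per site for the unperturbed lattice
```

Each Voronoi cell is convex, so the convex hull of its vertices recovers the cell exactly; areas, face counts, and isoperimetric quotients can be read off the same hull objects.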

    Gravitational Lensing at Millimeter Wavelengths

    With today's millimeter and submillimeter instruments, observers use gravitational lensing mostly as a tool to boost the sensitivity when observing distant objects. This is evident in the dominance of gravitationally lensed objects among those detected in CO rotational lines at z>1. It is also evident in the use of lensing magnification by galaxy clusters in order to reach faint submm/mm continuum sources. There are, however, a few cases where millimeter lines have been directly involved in understanding lensing configurations. Future mm/submm instruments, such as the ALMA interferometer, will have both the sensitivity and the angular resolution to allow detailed observations of gravitational lenses. The almost constant sensitivity to dust emission over the redshift range z=1-10 means that the likelihood for strong lensing of dust continuum sources is much higher than for optically selected sources. A large number of new strong lenses are therefore likely to be discovered with ALMA, allowing a direct assessment of cosmological parameters through lens statistics. Combined with an angular resolution <0.1", ALMA will also be efficient for probing the gravitational potential of galaxy clusters, where we will be able to study both the sources and the lenses themselves, free of obscuration and extinction corrections, derive rotation curves and orientations for the lenses, and thus greatly constrain lens models.
    Comment: 69 pages, Review on quasar lensing. Part of a LNP Topical Volume on "Dark matter and gravitational lensing", eds. F. Courbin, D. Minniti. To be published by Springer-Verlag 2002. Paper with full resolution figures can be found at ftp://oden.oso.chalmers.se/pub/tommy/mmviews.ps.g

    The Mathematical Universe

    I explore physics implications of the External Reality Hypothesis (ERH) that there exists an external physical reality completely independent of us humans. I argue that with a sufficiently broad definition of mathematics, it implies the Mathematical Universe Hypothesis (MUH) that our physical world is an abstract mathematical structure. I discuss various implications of the ERH and MUH, ranging from standard physics topics like symmetries, irreducible representations, units, free parameters, randomness and initial conditions to broader issues like consciousness, parallel universes and Gödel incompleteness. I hypothesize that only computable and decidable (in Gödel's sense) structures exist, which alleviates the cosmological measure problem and helps explain why our physical laws appear so simple. I also comment on the intimate relation between mathematical structures, computations, simulations and physical systems.
    Comment: Replaced to match accepted Found. Phys. version, 31 pages, 5 figs; more details at http://space.mit.edu/home/tegmark/toe.htm