Ptychography
Ptychography is a computational imaging technique. A detector records an extensive data set consisting of many interference patterns obtained as an object is displaced to various positions relative to an illumination field. A computer algorithm of some type is then used to invert these data into an image. It has three key advantages: it does not depend upon a good-quality lens, or indeed on using any lens at all; it can obtain the image wave in phase as well as in intensity; and it can self-calibrate in the sense that errors that arise in the experimental setup can be accounted for and their effects removed. Its transfer function is in theory perfect, with resolution being wavelength limited. Although the main concepts of ptychography were developed many years ago, it has only recently (over the last 10 years) become widely adopted. This chapter surveys visible light, x-ray, electron, and EUV ptychography as applied to microscopic imaging. It describes the principal experimental arrangements used at these various wavelengths. It reviews the most common inversion algorithms that are nowadays employed, giving examples of meta code to implement these. It describes, for those new to the field, how to avoid the most common pitfalls in obtaining good-quality reconstructions. It also discusses more advanced techniques such as modal decomposition and strategies to cope with three-dimensional (3D) multiple scattering.
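For readers who want a concrete picture of the kind of inversion algorithm the chapter refers to, the following is a minimal, illustrative sketch (not taken from the chapter) of an ePIE-style update loop in Python/NumPy. The function name, the assumption of integer-pixel scan positions, and the use of unshifted FFTs are simplifications made here purely for illustration.

```python
import numpy as np

def epie_reconstruct(diff_intensities, positions, probe, obj_shape,
                     n_iter=100, alpha=1.0, beta=1.0):
    """Minimal ePIE-style ptychographic reconstruction sketch.

    diff_intensities : (J, N, N) measured diffraction intensities
    positions        : (J, 2) integer top-left pixel offsets of each scan position
    probe            : (N, N) complex initial probe estimate
    obj_shape        : shape of the complex object array to recover
    """
    obj = np.ones(obj_shape, dtype=complex)          # flat initial object guess
    probe = probe.astype(complex).copy()
    n = probe.shape[0]

    for _ in range(n_iter):
        for I, (y, x) in zip(diff_intensities, positions):
            # object region under the probe (copied so the probe update uses the old values)
            patch = obj[y:y + n, x:x + n].copy()
            psi = probe * patch                      # exit wave at this scan position
            Psi = np.fft.fft2(psi)                   # propagate to the detector plane
            # enforce the measured modulus, keep the current phase estimate
            Psi = np.sqrt(I) * np.exp(1j * np.angle(Psi))
            psi_new = np.fft.ifft2(Psi)              # back-propagate the corrected wave
            diff = psi_new - psi
            # ePIE-style object and probe updates
            obj[y:y + n, x:x + n] = patch + alpha * np.conj(probe) * diff / np.max(np.abs(probe))**2
            probe = probe + beta * np.conj(patch) * diff / np.max(np.abs(patch))**2
    return obj, probe
```

In practice the measured patterns would need centring (fftshift), sub-pixel position handling, and the self-calibration refinements the chapter describes; the sketch only shows the basic modulus-constraint and update structure.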
Myeloid cell-specific Irf5 deficiency stabilizes atherosclerotic plaques in Apoe–/– mice
Objective: Interferon regulatory factor (IRF) 5 is a transcription factor known for promoting M1-type macrophage polarization in vitro. Given the central role of inflammatory macrophages in promoting atherosclerotic plaque progression, we hypothesized that myeloid cell-specific deletion of IRF5 is protective against atherosclerosis. Methods: Female Apoe–/– LysmCre/+ Irf5fl/fl and Apoe–/– Irf5fl/fl mice were fed a high-cholesterol diet for three months. Atherosclerotic plaque size and composition as well as inflammatory gene expression were analyzed. Mechanistically, IRF5-dependent bone marrow-derived macrophage cytokine profiles were tested under M1 and M2 polarizing conditions. Mixed bone marrow chimeras were generated to determine intrinsic IRF5-dependent effects on macrophage accumulation in atherosclerotic plaques. Results: Myeloid cell-specific Irf5 deficiency blunted LPS/IFNγ-induced inflammatory gene expression in vitro and in the atherosclerotic aorta in vivo. While atherosclerotic lesion size was not reduced in myeloid cell-specific Irf5-deficient Apoe–/– mice, plaque composition was favorably altered, resembling a stable plaque phenotype with reduced macrophage and lipid content, reduced inflammatory gene expression, and increased collagen deposition alongside elevated Mertk and Tgfβ expression. Irf5-deficient macrophages, when directly competing with wild-type macrophages in the same mouse, were less prone to accumulate in atherosclerotic lesions, independent of monocyte recruitment. Irf5-deficient monocytes, when exposed to oxidized low-density lipoprotein, were less likely to differentiate into macrophage foam cells, and Irf5-deficient macrophages proliferated less in the plaque. Conclusion: Our study provides genetic evidence that selectively altering macrophage polarization induces a stable plaque phenotype in mice.
Gaussian processes for autonomous data acquisition at large-scale synchrotron and neutron facilities
The execution and analysis of complex experiments are challenged by the vast dimensionality of the underlying parameter spaces. Although an increase in data-acquisition rates should allow broader querying of the parameter space, the complexity of experiments and the subtle dependence of the model function on input parameters remain daunting owing to the sheer number of variables. New strategies for autonomous data acquisition are being developed, with one promising direction being the use of Gaussian process regression (GPR). GPR is a quick, non-parametric and robust approximation and uncertainty quantification method that can be applied directly to autonomous data acquisition. We review GPR-driven autonomous experimentation and illustrate its functionality using real-world examples from large experimental facilities in the USA and France. We introduce the basics of a GPR-driven autonomous loop with a focus on Gaussian processes, and then shift the focus to the infrastructure that needs to be built around GPR to create a closed loop. Finally, the case studies we discuss show that Gaussian-process-based autonomous data acquisition is a widely applicable method that can facilitate the optimal use of instruments and facilities by enabling the efficient acquisition of high-value datasets.
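As a rough illustration of what a GPR-driven closed loop can look like, here is a minimal sketch in Python using scikit-learn. It is not the software used at the facilities described in the review; the function names, the random candidate search, and the choice of maximum posterior uncertainty as the acquisition rule are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def autonomous_loop(measure, bounds, n_init=5, n_steps=20, n_candidates=2000):
    """Sketch of a GPR-driven autonomous data-acquisition loop.

    measure : callable mapping a parameter vector to a measured scalar value
    bounds  : (d, 2) array of lower/upper limits of the experimental parameter space
    """
    rng = np.random.default_rng(0)
    bounds = np.asarray(bounds, dtype=float)
    d = len(bounds)

    # seed the surrogate model with a few randomly placed measurements
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, d))
    y = np.array([measure(x) for x in X])

    for _ in range(n_steps):
        # fit a Gaussian process surrogate to all data collected so far
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)

        # acquisition rule (illustrative): measure where the posterior uncertainty is largest
        cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_candidates, d))
        _, std = gp.predict(cand, return_std=True)
        x_next = cand[np.argmax(std)]

        # "steer" the instrument: take the new measurement and update the training set
        X = np.vstack([X, x_next])
        y = np.append(y, measure(x_next))

    return gp, X, y
```

In a real deployment the `measure` callable would be replaced by calls to the beamline or instrument control system, and the acquisition function would typically trade uncertainty against other experimental objectives rather than using pure uncertainty maximization.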