2,891 research outputs found

    Fluorescence microscopy: Established and emerging methods, experimental strategies, and applications in immunology

    Cutting-edge biophysical technologies including total internal reflection fluorescence microscopy, single-molecule fluorescence, single-channel opening events, fluorescence resonance energy transfer, high-speed exposures, two-photon imaging, fluorescence lifetime imaging, and other tools are becoming increasingly important in immunology because they link molecular events to cellular physiology, a key goal of modern immunology. The primary concern in all forms of microscopy is the generation of contrast; in fluorescence microscopy, contrast can be thought of as the difference in intensity between the cell and the background relative to the noise, i.e., the signal-to-noise ratio. High-information-content images can be formed by enhancing the signal, suppressing the noise, or both. As improved tools, such as ICCD and EMCCD cameras, become available for fluorescence imaging in molecular and cellular immunology, it is important to optimize other aspects of the imaging system. Numerous practical strategies to enhance fluorescence microscopy experiments are reviewed. The use of instrumentation such as light traps, cameras, objectives, improved fluorescent labels, and image filtration routines applicable to low-light-level experiments is discussed. New methodologies providing resolution well beyond the Rayleigh criterion are outlined. Ongoing and future developments in fluorescence microscopy instrumentation and technique are reviewed. This review is intended to address situations where the signal is weak, which is important for emerging techniques stressing super-resolution or live-cell dynamics, but less so for conventional applications such as indirect immunofluorescence. This review provides a broad, integrative discussion of fluorescence microscopy with selected applications in immunology. Microsc. Res. Tech., 2007. © 2007 Wiley-Liss, Inc.
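    The abstract's definition of contrast as signal over background noise, and the benefit of signal enhancement, can be illustrated with a minimal shot-noise simulation. All numbers below (background of 20 counts, excess signal of 5 counts, 16 averaged frames) are assumed values for illustration, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative values: mean background counts and mean excess
# counts per pixel from the fluorescently labeled cell.
background = 20.0
signal = 5.0

# Photon counting is shot-noise limited, i.e. Poisson-distributed.
cell_pixels = rng.poisson(background + signal, size=100_000)
bg_pixels = rng.poisson(background, size=100_000)

# Contrast as described: intensity difference between cell and
# background, relative to the noise level.
snr = (cell_pixels.mean() - bg_pixels.mean()) / bg_pixels.std()
print(f"single-frame SNR: {snr:.2f}")

# Averaging N frames reduces the noise by sqrt(N), one of the simplest
# noise-suppression strategies for weak signals.
n_frames = 16
avg_cell = rng.poisson(background + signal, size=(n_frames, 100_000)).mean(axis=0)
avg_bg = rng.poisson(background, size=(n_frames, 100_000)).mean(axis=0)
snr_avg = (avg_cell.mean() - avg_bg.mean()) / avg_bg.std()
print(f"SNR after averaging {n_frames} frames: {snr_avg:.2f}")
```

    With these assumed numbers, the single-frame SNR is about 5/sqrt(20) ≈ 1.1, and frame averaging improves it by roughly sqrt(16) = 4, mirroring the review's point that weak-signal imaging can be rescued by noise suppression as well as by signal enhancement.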

    Stabilizing optical microcavities in 3D

    Optical (micro-)cavities are the workhorse for studying light-matter interactions, with important applications in lasing, sensing, and quantum simulation, to name a few. Open resonators in particular offer great versatility due to their tunability but pose challenges in terms of control. This concerns, on the one hand, the control of their length and, on the other hand, the relative orientation (tilt) of the mirror planes with respect to each other. The latter becomes particularly important when working with optically unstable resonators, such as plane-parallel resonators. There are numerous strategies to enhance stability using passive techniques, such as material selection, mechanical damping, or thermal compensation, but especially for tunable microcavities an active stabilization method with feedback control must often be employed. Here, we present a novel method for tilt measurement and stabilization based on inversely solving the Schrödinger equation that arises in the paraxial description of the cavity modes. Our method enables the highly precise determination of absolute tilt angles, making it suitable for microcavity applications that require the highest level of cavity parallelism.
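    The paraxial Schrödinger equation mentioned in the abstract can be sketched in the forward direction (mode spectrum from a known potential); the authors' inverse method is not reproduced here. As a hedged illustration, the transverse modes of a plano-concave cavity behave like eigenstates of a harmonic well, which a finite-difference diagonalization recovers. The grid size and dimensionless harmonic potential below are assumptions for illustration.

```python
import numpy as np

# Finite-difference eigenmodes of the 1D Schrodinger-like equation that
# governs transverse cavity modes in the paraxial approximation. For a
# plano-concave cavity the mirror curvature acts as an effective
# harmonic potential (illustrative, dimensionless units).
n = 400
x = np.linspace(-5.0, 5.0, n)       # transverse coordinate
dx = x[1] - x[0]
potential = 0.5 * x**2              # effective well from mirror curvature

# Hamiltonian H = -(1/2) d^2/dx^2 + V(x), second derivative by
# central differences.
lap = (np.diag(-2.0 * np.ones(n)) +
       np.diag(np.ones(n - 1), 1) +
       np.diag(np.ones(n - 1), -1)) / dx**2
h = -0.5 * lap + np.diag(potential)

eigvals = np.linalg.eigvalsh(h)
# Equally spaced transverse-mode resonances E_k ~ k + 1/2; a mirror
# tilt perturbs this spectrum, which is what a tilt-measurement scheme
# can exploit.
print(eigvals[:4])
```

    The computed spectrum is approximately 0.5, 1.5, 2.5, 3.5; deviations of the measured resonance spacing from such an ideal ladder are the kind of signature an inverse solver can map back to absolute cavity parameters.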

    Single photons in an interferometer: Forbidden outcomes

    The well-known Hong-Ou-Mandel (HOM) effect is the first demonstration of perfect destructive interference of quantum amplitudes in a linear optical system with single-photon Fock states at the input. A natural question to ask is whether we can predict similar behavior for systems of general size, that is, predict forbidden transitions for a given input state and a given linear optical system. Previous studies have found that certain symmetries between the input and output configurations, in combination with a symmetric interferometer, will always result in so-called suppressions. Recently, however, a few examples have been found of suppressions that do not obey the suppression laws constructed in previous studies. In this work, we parametrize a general three-mode interferometer and use a numerical optimization algorithm to find all three-mode interferometers that exhibit forbidden transitions. These results help us gain a better understanding of the fundamental reason behind suppressions. In addition, we compare the quality of the different interferometers for applications in quantum tomography.
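    The HOM suppression described above can be checked directly: for indistinguishable photons, the amplitude of a transition through a linear interferometer is the permanent of the relevant scattering submatrix, and for a balanced beamsplitter the coincidence permanent vanishes. The brute-force permanent below is a generic sketch, not the paper's optimization algorithm.

```python
import numpy as np
from itertools import permutations

def permanent(m):
    # Brute-force matrix permanent; fine for the small matrices that
    # describe few-mode interferometers.
    n = m.shape[0]
    return sum(np.prod([m[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# Balanced (50:50) beamsplitter unitary.
bs = np.array([[1.0, 1.0],
               [1.0, -1.0]]) / np.sqrt(2)

# One photon in each input mode, one photon in each output mode: the
# coincidence amplitude is the permanent of the full 2x2 matrix.
amp_coincidence = permanent(bs)
print(abs(amp_coincidence) ** 2)  # 0: the HOM suppression
```

    The two permutation terms contribute +1/2 and -1/2 and cancel exactly; the same permanent criterion, applied to 3x3 submatrices, is what decides which outcomes of a three-mode interferometer are forbidden.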

    Time-domain Physical Unclonable Functions

    One can replace one-way functions (also known as hash functions) commonly found in cryptography with physical processes, known as physical unclonable functions (PUFs). Optical PUFs have been devised based on the complex response of scattering media to the spatial wavefront. However, such PUFs are intrinsically impractical for use over larger distances. In this project, we design PUFs with a time-domain scattering response (tPUFs), whose readout can be performed over a single spatial mode, in our case an optical fiber. These tPUFs are networks of microring resonators, whose transfer function is highly complex and varies strongly from realization to realization as a result of manufacturing imperfections. These devices are designed for pulses with a very low average number of photons per pulse, which negates any attempt at reading out the pulse shape in transit, thereby eliminating eavesdropping. These PUFs can then be used for a variety of applications in asymmetric cryptography, such as proof of identity and secure messaging.

    Deep Learning-based Kinetic Analysis in Paper-based Analytical Cartridges Integrated with Field-effect Transistors

    This study explores the fusion of a field-effect transistor (FET), a paper-based analytical cartridge, and the computational power of deep learning (DL) for quantitative biosensing via kinetic analyses. The FET sensors address the low-sensitivity challenge observed in paper analytical devices, enabling electrical measurements with kinetic data. The paper-based cartridge eliminates the need for the surface chemistry required in FET sensors, ensuring economical operation (cost < $0.15/test). The DL analysis mitigates chronic challenges of FET biosensors, such as sample matrix interference, by leveraging kinetic data from target-specific bioreactions. In our proof-of-concept demonstration, the DL-based analyses showed a coefficient of variation of < 6.46% and a decent concentration measurement correlation, with an r2 value of > 0.976, for cholesterol testing when blindly compared to results obtained from a CLIA-certified clinical laboratory. These integrated technologies can create a new generation of FET-based biosensors, potentially transforming point-of-care diagnostics and at-home testing through enhanced accessibility, ease of use, and accuracy. Comment: 18 pages, 4 figures

    Coherent diffraction imaging for enhanced fault and fracture network characterization

    Faults and fractures represent unique features of the solid Earth and are especially pervasive in the shallow crust. Aside from directly relating to crustal dynamics and the systematic assessment of associated risk, fault and fracture networks enable the efficient migration of fluids and therefore have a direct impact on concrete topics relevant to society, including climate-change-mitigating measures like CO2 sequestration or geothermal exploration and production. Due to their small-scale complexity, fault zones and fracture networks are typically poorly resolved, and their presence can often only be inferred indirectly in seismic and ground-penetrating radar (GPR) subsurface reconstructions. We suggest a largely data-driven framework for the direct imaging of these features by making use of the faint and still often underexplored diffracted portion of the wave field. Finding inspiration in the fields of optics and visual perception, we introduce two different conceptual pathways for coherent diffraction imaging and discuss their respective advantages and disadvantages in different contexts of application. At the heart of both of these strategies lies the assessment of data coherence, for which a range of quantitative measures is introduced. To illustrate the versatility and effectiveness of the approach for high-resolution geophysical imaging, several seismic and GPR field data examples are presented, in which the diffracted wave field sheds new light on crustal features like fluvial channels, erosional surfaces, and intricate fault and fracture networks on land and in the marine environment.
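    The data-coherence assessment at the heart of the framework can be sketched with semblance, a standard coherence measure for multichannel seismic and GPR data; this is a generic illustration, not necessarily one of the specific measures the paper introduces, and the trace counts, wavelet, and noise level are assumed.

```python
import numpy as np

def semblance(gather):
    # Semblance: a [0, 1] coherence measure across traces. For a gather
    # of shape (n_traces, n_samples), it is the energy of the stacked
    # trace divided by n_traces times the summed energy of the
    # individual traces. ~1 for an aligned event, ~1/n_traces for noise.
    n_traces = gather.shape[0]
    stack = gather.sum(axis=0)
    return (stack ** 2).sum() / (n_traces * (gather ** 2).sum())

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
# Assumed 25 Hz wavelet: a coherent event, e.g. a flattened diffraction.
wavelet = np.sin(2 * np.pi * 25 * t) * np.exp(-((t - 0.5) ** 2) / 0.002)

coherent = np.tile(wavelet, (12, 1)) + 0.05 * rng.standard_normal((12, 200))
incoherent = rng.standard_normal((12, 200))

print(semblance(coherent))    # close to 1: coherent diffracted energy
print(semblance(incoherent))  # close to 1/12: incoherent noise
```

    Scanning such a measure over candidate diffraction traveltime curves is what lets the faint diffracted wavefield be separated from noise and mapped into an image of faults and fractures.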

    Small business innovation research. Abstracts of 1988 phase 1 awards

    Non-proprietary proposal abstracts of Phase 1 Small Business Innovation Research (SBIR) projects supported by NASA are presented. Projects in the fields of aeronautical propulsion, aerodynamics, acoustics, aircraft systems, materials and structures, teleoperators and robots, computer sciences, information systems, data processing, spacecraft propulsion, bioastronautics, satellite communication, and space processing are covered.

    Development of High-speed Optical Coherence Tomography for Time-lapse Non-destructive Characterization of Samples

    Optical coherence tomography (OCT) is an established optical imaging modality which can obtain label-free, non-destructive 3D images of samples with micron-scale resolution and millimeter penetration. OCT has been widely adopted for biomedical research.

    The Emergence of the Modern Universe: Tracing the Cosmic Web

    This is the report of the Ultraviolet-Optical Working Group (UVOWG) commissioned by NASA to study the scientific rationale for new missions in ultraviolet/optical space astronomy approximately ten years from now, when the Hubble Space Telescope (HST) is de-orbited. The UVOWG focused on a scientific theme, The Emergence of the Modern Universe, the period from redshifts z = 3 to 0, occupying over 80% of cosmic time and beginning after the first galaxies, quasars, and stars emerged into their present form. We considered high-throughput UV spectroscopy (10-50x the throughput of HST/COS) and wide-field optical imaging (at least 10 arcmin square). The exciting science to be addressed in the post-HST era includes studies of dark matter and baryons, the origin and evolution of the elements, and the major construction phase of galaxies and quasars. Key unanswered questions include: Where is the rest of the unseen universe? What is the interplay of the dark and luminous universe? How did the IGM collapse to form the galaxies and clusters? When were galaxies, clusters, and stellar populations assembled into their current form? What is the history of star formation and chemical evolution? Are massive black holes a natural part of most galaxies? A large-aperture UV/O telescope in space (ST-2010) will provide a major facility in the 21st century for solving these scientific problems. The UVOWG recommends that the first mission be a 4m-aperture, SIRTF-class mission that focuses on UV spectroscopy and wide-field imaging. In the coming decade, NASA should investigate the feasibility of an 8m telescope, by 2010, with deployable optics similar to NGST. No high-throughput UV/Optical mission will be possible without significant NASA investments in technology, including UV detectors, gratings, mirrors, and imagers. Comment: Report of UV/O Working Group to NASA, 72 pages, 13 figures. Full document with postscript figures available at http://casa.colorado.edu/~uvconf/UVOWG.htm