5,722 research outputs found

    Get PDF
    Roll-to-roll manufacturing of micro components based on advanced printing, structuring and lamination of ceramic tapes is rapidly progressing. This large-scale, cost-effective manufacturing process for ceramic micro devices is, however, prone to hiding defects within the visually opaque tape stacks. To achieve sustainable, zero-defect manufacturing in the future, there is an urgent need for reliable inspection systems. The systems to be developed have to perform high-resolution in-process quality control at high speed. Optical coherence tomography (OCT) is a promising technology for detailed in-depth inspection and metrology. Combined with infrared screening of larger areas, it can meet the inspection demands of roll-to-roll ceramic tape processes. In this thesis, state-of-the-art commercial and laboratory OCT systems, operating at central wavelengths of 1.3 µm and 1.7 µm respectively, are evaluated for detecting microchannels, metal prints, defects and delaminations embedded in alumina and zirconia ceramic layers hundreds of micrometers beneath the surface. The effect of surface-roughness-induced scattering and scattering by pores on the probing radiation is analyzed using experimentally captured and theoretically simulated OCT images of the ceramic samples, while varying surface roughness and operating wavelength. By extending the Monte Carlo simulations of the OCT response to the mid-infrared, the optimal operating wavelength is found to be 4 µm for alumina and 2 µm for zirconia. At these wavelengths we predict a sufficient probing depth of about 1 mm, and we demonstrate and discuss the effect of rough surfaces on the detectability of embedded boundaries. For high-precision measurement, a new automated 3D image processing algorithm for the analysis of volumetric OCT data is developed. We show its capability by measuring the geometric dimensions of embedded structures in ceramic layers, extracting features with irregular shapes and detecting geometric deformations. The method demonstrates its suitability for industrial applications through rapid inspection of manufactured samples with high accuracy and robustness. The new inspection methods we demonstrate are finally analyzed in the context of measurement uncertainty, in both the axial and lateral directions, revealing that scattering in the sample indeed affects the lateral measurement uncertainty. Two types of image artefact are found in the OCT images, caused by multiple reflections between neighboring boundaries and by refractive index inhomogeneity. A wavefront aberration is found in the OCT system with its two-galvo-mirror scanning scheme; it can be corrected using our image processing algorithm.
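
    The abstract gives no implementation details, but the boundary-measurement step at the heart of a volumetric OCT algorithm can be illustrated with a minimal sketch. The snippet below is a hypothetical example rather than the thesis' method: it locates the two strongest reflections in a single OCT A-scan by peak detection and converts their separation into a physical thickness, assuming a known axial pixel pitch and group refractive index; in a full 3D pipeline such per-A-scan estimates would be combined across B-scans.

    # Hypothetical sketch: estimate the thickness of an embedded layer from one
    # OCT A-scan by finding its two strongest boundary reflections.
    import numpy as np
    from scipy.signal import find_peaks

    def layer_thickness(ascan, pixel_pitch_um, n_group):
        """ascan: 1-D intensity profile; pixel_pitch_um: axial sampling in air (µm);
        n_group: assumed group refractive index of the ceramic layer."""
        ascan = np.asarray(ascan, dtype=float)
        # Find reflection peaks that stand out above the speckle background.
        peaks, props = find_peaks(ascan, prominence=0.2 * ascan.max())
        if len(peaks) < 2:
            raise ValueError("fewer than two boundaries detected")
        # Keep the two most prominent peaks (top and bottom of the layer).
        top, bottom = sorted(peaks[np.argsort(props["prominences"])[-2:]])
        # Optical path difference divided by the group index gives geometric depth.
        return (bottom - top) * pixel_pitch_um / n_group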

    High-order div- and quasi curl-conforming basis functions for Calderon multiplicative preconditioning of the EFIE

    Get PDF
    A new high-order Calderon multiplicative preconditioner (HO-CMP) for the electric field integral equation (EFIE) is presented. In contrast to previous CMPs, the proposed preconditioner allows for high-order surface representations and current expansions by using a novel set of high-order quasi curl-conforming basis functions. Like its predecessors, the HO-CMP can be seamlessly integrated into existing EFIE codes. Numerical results demonstrate that the linear systems of equations obtained using the proposed HO-CMP converge rapidly, regardless of the mesh density and the order of the current expansion.
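
    The claim that a CMP can be dropped into an existing EFIE code amounts to wrapping the code's existing matrix-vector products in a preconditioned operator. The sketch below is a hypothetical, structural illustration only (the names, the left-preconditioned form, and the sparse Gram factorization are assumptions, and it says nothing about the paper's high-order basis construction): T is the EFIE matrix in the primal div-conforming basis, T_dual the same operator discretized in the dual quasi curl-conforming basis, G_mix the mixed Gram matrix linking the two, and v the excitation vector.

    # Hypothetical sketch of a Calderon multiplicative preconditioner wrapped
    # around matrices assembled by an existing EFIE code.
    import numpy as np
    from scipy.sparse import csc_matrix
    from scipy.sparse.linalg import LinearOperator, gmres, splu

    def cmp_solve(T, T_dual, G_mix, v):
        # Solve the left-preconditioned system  G^-1 T_dual G^-1 T j = G^-1 T_dual G^-1 v.
        n = T.shape[0]
        Glu = splu(csc_matrix(G_mix).astype(complex))   # factor the mixed Gram matrix once
        def matvec(x):
            # Calderon-preconditioned operator applied to a trial current vector.
            return Glu.solve(T_dual @ Glu.solve(T @ x))
        A = LinearOperator((n, n), matvec=matvec, dtype=complex)
        rhs = Glu.solve(T_dual @ Glu.solve(v))          # same preconditioner applied to the excitation
        j, info = gmres(A, rhs)                         # rapid convergence is the point of the CMP
        return j, info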

    Simultaneous Multicolor Detection of Faint Galaxies in the Hubble Deep Field

    Get PDF
    We present a novel way to detect objects when multiband images are available. Typically, object detection is performed in one of the available bands or on a somewhat arbitrarily co-added image. Our technique provides an almost optimal way to use all the color information available. We build up a composite image of the N passbands where each pixel value corresponds to the probability that the given pixel is just sky. By knowing the probability distribution of sky pixels (a chi-square distribution with N degrees of freedom), the data can be used to derive the distribution of pixels dominated by object flux. From the two distributions an optimal segmentation threshold can be determined. Clipping the probability image at this threshold yields a mask, where pixels unlikely to be sky are tagged. After using a standard connected-pixel criterion, the regions of this mask define the detected objects. Applying this technique to the Hubble Deep Field data, we find that we can extend the detection limit of the data below that possible using linearly co-added images. We also discuss possible ways of enhancing object detection probabilities for certain well-defined classes of objects by using various optimized linear combinations of the pixel fluxes (optimal subspace filtering).
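
    The detection scheme maps naturally onto a few lines of array code. The sketch below is illustrative and not the authors' pipeline: it assumes sky-subtracted images with known per-band sky noise, forms the per-pixel chi-square statistic over the N bands, converts it to a probability of being pure sky, applies a threshold (here an arbitrary placeholder, whereas the paper derives an optimal value from the sky and object distributions), and labels connected pixels as detections.

    # Illustrative sketch of chi-square multiband detection (not the authors' code).
    # images: array of shape (N_bands, H, W), sky-subtracted; sigmas: per-band sky RMS.
    import numpy as np
    from scipy.stats import chi2
    from scipy.ndimage import label

    def detect_objects(images, sigmas, p_sky_max=1e-3, min_pixels=4):
        sigmas = np.asarray(sigmas, dtype=float)
        n_bands = images.shape[0]
        # Per-pixel chi-square statistic summed over all bands.
        chisq = np.sum((images / sigmas[:, None, None]) ** 2, axis=0)
        # Probability that a pure-sky pixel would exceed this chi-square value.
        p_sky = chi2.sf(chisq, df=n_bands)
        mask = p_sky < p_sky_max                  # pixels unlikely to be sky
        labels, n_regions = label(mask)           # standard connected-pixel criterion
        # Discard regions with fewer than min_pixels connected pixels.
        sizes = np.bincount(labels.ravel())
        good = np.nonzero(sizes >= min_pixels)[0]
        good = good[good != 0]                    # drop the background label
        keep = np.isin(labels, good)
        return label(keep)                        # relabelled detections and their count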

    The Law and Economics of Critical Race Theory

    Get PDF
    Legal academics often perceive law and economics (L&E) and critical race theory (CRT) as oppositional discourses. Using a recently published collection of essays on CRT as a starting point, we argue that the understanding of workplace discrimination can be furthered through a collaboration between L&E and CRT. L&E's strength is in its attention to incentives and norms, specifically its concern with explicating how norms incentivize behavior. Its limitation is that it treats race as exogenous and static. Thus, the literature fails to consider how institutional norms affect, and are affected by, race. To put the point another way, L&E does not discuss how norms incentivize racial behavior, obscuring that how people present their race (or themselves as racial subjects) is a function of norms. The strength of CRT is its conception of race as a social construction. Under this view, race is neither biologically determined nor fixed. Instead, race is ever evolving as a function of social, political, legal, and economic pressures. A limitation of CRT is that much of its analysis of race as a social construction is macro-oriented. Thus, CRT has paid insufficient attention to the social construction of race within specific institutional settings, like the workplace. Further, CRT has virtually ignored the agency people of color exercise to shape how their racial identity is interpreted - that is to say, constructed. Explicitly incorporating L&E's focus on incentives and norms into CRT provides CRT with a means by which to articulate the notion of race as a social construction at the level of individual choice. The basic idea is that people of color construct (present racial impressions of) themselves in response to norms. Norms, in this sense, are racially productive, and individuals are part of the production apparatus. Having set out the basic elements of the collaborative enterprise, we deploy this collaboration to respond to a specific and important question about the workplace: How are modern employers and employees likely to manage workplace racial diversity? We raise this question because we assume that, for institutional legitimacy reasons, most workplaces will strive to achieve at least a modicum of racial diversity. The question, again, is: How will this diversity be managed? Part of the answer has to do with assimilation, an ideological technology for constructing race and a central theme in CRT; and part of the answer has to do with efficiency, an ideological technology for creating incentives and a central theme in L&E. Both ideas - assimilation and efficiency - combine to tell a story about workplace discrimination that derives from what we call the homogeneity incentive. In sum, in order to increase efficiency, employers have incentives to screen prospective employees for homogeneity, and, in order to counter racial stereotypes, nonwhite employees have incentives to demonstrate a willingness and capacity to assimilate. In this sense, the modern workplace discrimination problem may be more about employers requiring people of color to demonstrate racial palatability than about employers totally excluding people of color from the workplace. We discuss whether and to what extent anti-discrimination law can ameliorate this problem.

    High-density speckle contrast optical tomography (SCOT) for three dimensional tomographic imaging of the small animal brain

    Get PDF
    High-density speckle contrast optical tomography (SCOT), utilizing tens of thousands of source-detector pairs, was developed for in vivo imaging of blood flow in small animals. The reduction in cerebral blood flow (CBF) due to local ischemic stroke in a mouse brain was transcranially imaged and reconstructed in three dimensions. The reconstructed volume was then compared with corresponding magnetic resonance images, demonstrating that the volume of reduced CBF agrees with the infarct zone at twenty-four hours.
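
    Speckle contrast itself is computed from the raw speckle images before any tomographic reconstruction. A minimal illustration of that front-end step is sketched below; it is not the reconstruction used in the paper, and the 7x7 window size is an assumption. Local contrast K is the ratio of the standard deviation to the mean intensity in each window, and lower K indicates faster flow.

    # Minimal sketch: local speckle contrast K = sigma / mean over sliding windows.
    # 'frame' is a raw speckle image; 'window' is an assumed square neighbourhood.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_contrast(frame, window=7):
        frame = frame.astype(float)
        mean = uniform_filter(frame, size=window)
        mean_sq = uniform_filter(frame ** 2, size=window)
        var = np.maximum(mean_sq - mean ** 2, 0.0)              # guard against round-off
        return np.sqrt(var) / np.maximum(mean, np.finfo(float).eps)   # contrast image K(x, y)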

    A Predictive Algorithm For Wetlands In Deep Time Paleoclimate Models

    Get PDF
    Methane is a powerful greenhouse gas produced in wetland environments via microbial action in anaerobic conditions. If the location and extent of wetlands are unknown, such as for the Earth many millions of years in the past, a model of wetland fraction is required in order to calculate methane emissions and thus help reduce uncertainty in the understanding of past warm greenhouse climates. Here we present an algorithm for predicting inundated wetland fraction for use in calculating wetland methane emission fluxes in deep time paleoclimate simulations. The algorithm determines, for each grid cell in a given paleoclimate simulation, the wetland fraction predicted by a nearest-neighbours search of modern-day data in a space described by a set of environmental, climate and vegetation variables. To explore this approach, we first test it for the modern-day climate with variables obtained from observations, and then for an Eocene climate with variables derived from a fully coupled global climate model (HadCM3BL-M2.2). Two independent dynamic vegetation models were used to provide two sets of equivalent vegetation variables, which yielded two different wetland predictions. As a first test, the method, using both vegetation models, satisfactorily reproduces the wetland fraction in the modern-day data at a coarse grid resolution similar to those used in paleoclimate simulations. We then applied the method to an early Eocene climate, testing its outputs against the locations of Eocene coal deposits. We predict a global mean monthly wetland area for the early Eocene of 8 to 10 × 10⁶ km², with a corresponding total annual methane flux of 656 to 909 Tg, depending on which of the two dynamic global vegetation models is used to model wetland fraction and methane emission rates. Both values are significantly higher than the modern-day estimates of 4 × 10⁶ km² and around 190 Tg (Poulter et al., 2017; Melton et al., 2013).
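
    The core of the algorithm, predicting each paleoclimate grid cell's wetland fraction from its nearest neighbours in a modern-day environmental/climate/vegetation variable space, can be sketched as below. This is an illustrative reimplementation, not the authors' code; the variable names, the choice of k, the distance weighting and the standardisation step are assumptions.

    # Illustrative k-nearest-neighbours wetland-fraction predictor (not the paper's code).
    # X_modern: (n_cells, n_vars) modern environmental/climate/vegetation variables
    # y_modern: (n_cells,) observed modern wetland fraction
    # X_paleo:  (m_cells, n_vars) the same variables from a paleoclimate simulation
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neighbors import KNeighborsRegressor

    def predict_wetland_fraction(X_modern, y_modern, X_paleo, k=10):
        # Standardise variables so the neighbour distance is not dominated by units,
        # then average the wetland fraction of the k nearest modern analogues.
        model = make_pipeline(StandardScaler(),
                              KNeighborsRegressor(n_neighbors=k, weights="distance"))
        model.fit(X_modern, y_modern)
        return np.clip(model.predict(X_paleo), 0.0, 1.0)   # fractions stay in [0, 1]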

    Successful difficult airway intubation using the Miller laryngoscope blade and paraglossal technique – a comparison with the Macintosh blade and midline technique

    Get PDF
    In anaesthetic practice, clinicians are often faced with difficult airway situations. The conventional approach to intubation is the midline technique using a curved Macintosh blade for direct laryngoscopy. However, we have been successful in such a case using old technology and a seldom-used technique. This case raised the question of whether older, alternative methods of tracheal intubation may offer an advantage in airway management over conventional practice. During pre-operative evaluation, a patient presented with a large, visible epiglottis on examination of the mouth and oropharynx. On direct laryngoscopy with a Macintosh 3 laryngoscope blade and the midline technique, a Cormack and Lehane grade-3b view was obtained due to the long epiglottis but normal position of the larynx. The Miller 4 blade and the paraglossal technique yielded a Cormack and Lehane grade-1 view, and the trachea was successfully intubated using this approach. Use of the Miller blade and the paraglossal technique provided a perfect view of the glottis. Based on this experience and the findings of several studies on this topic, this approach could be a viable alternative for airway management.
    Keywords: difficult airway, Miller blade, Mallampati 0, Macintosh technique, paraglossal technique

    Operation of Graphene Transistors at GHz Frequencies

    Full text link
    Top-gated graphene transistors operating at high frequencies (GHz) have been fabricated and their characteristics analyzed. The measured intrinsic current gain shows an ideal 1/f frequency dependence, indicating an FET-like behavior for graphene transistors. The cutoff frequency fT is found to be proportional to the dc transconductance gm of the device. The peak fT increases with a reduced gate length, and an fT as high as 26 GHz is measured for a graphene transistor with a gate length of 150 nm. The work represents a significant step towards the realization of graphene-based electronics for high-frequency applications.
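
    The reported proportionality between cutoff frequency and dc transconductance is what standard quasi-static FET small-signal theory predicts; neglecting parasitic capacitances (an assumption of this note, not a statement from the paper), the intrinsic current gain falls off as fT/f, and

        fT = gm / (2π C_G),

    where C_G is the total gate capacitance. For a fixed gate capacitance per unit area, a shorter gate lowers C_G, which is consistent with the reported increase of peak fT at reduced gate length.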

    ENSO dynamics in current climate models: an investigation using nonlinear dimensionality reduction

    Get PDF
    Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid in the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have been developed recently that may provide a potentially useful tool for the identification of low-dimensional manifolds in climate datasets arising from nonlinear dynamics. Here, we apply Isomap, one such technique, to the study of El Niño/Southern Oscillation variability in tropical Pacific sea surface temperatures, comparing observational data with simulations from a number of current coupled atmosphere-ocean general circulation models. We use Isomap to examine El Niño variability in the different datasets and assess the suitability of the Isomap approach for climate data analysis. We conclude that, for the application presented here, analysis using Isomap does not provide additional information beyond that already provided by principal component analysis.
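
    As a rough illustration of the comparison described above (not the study's actual processing chain), the sketch below applies both PCA and Isomap to a matrix of gridded SST anomalies with time samples as rows; the number of components, the neighbourhood size, and the shape of the input array are assumptions.

    # Illustrative comparison of PCA and Isomap embeddings of SST anomalies.
    # 'sst_anom' is assumed to have shape (n_months, n_gridpoints), e.g. detrended
    # tropical Pacific SST anomalies flattened per time step.
    from sklearn.decomposition import PCA
    from sklearn.manifold import Isomap

    def embed(sst_anom, n_components=2, n_neighbors=20):
        pcs = PCA(n_components=n_components).fit_transform(sst_anom)
        iso = Isomap(n_neighbors=n_neighbors,
                     n_components=n_components).fit_transform(sst_anom)
        return pcs, iso   # (n_months, 2) coordinates from each method, for comparison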

    Transient simulations of the last 22,000 years, with a fully dynamic atmosphere in the GENIE earth-system framework

    No full text
    This paper presents and discusses an ensemble of transient model simulations from the Last Glacial Maximum to the present day. The model includes a fully dynamic, primitive-equation atmosphere (the Reading IGCM), computed vegetation (TRIFFID), and a slab ocean and sea ice. The atmospheric model is more akin to a low-resolution GCM than to traditional EMICs, and yet is fast enough for long ensemble simulations to be carried out. The model is tuned in a purely objective manner, using a genetic algorithm that perturbs 30 tunable parameters in the model to find the best fit to a prescribed pre-industrial climate. The control deglaciation experiment shows good agreement with data at the Last Glacial Maximum and the mid-Holocene. The deglaciation ensembles span initial conditions, physical processes, and tunable model parameters. The ice sheets are prescribed and changes in oceanic heat transport are neglected, yet the model exhibits rapid transitions in many of the ensemble members. These are attributable to the interaction of the dynamic atmosphere with the sea ice, and are not observed when the ocean and sea-ice surface temperatures are prescribed. The timing of these transitions is sensitive to the initial conditions, pointing to the chaotic nature of the climate system. The simulations have been carried out making use of GRID technologies developed as part of the GENIE project.
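
    The objective tuning step, a genetic algorithm perturbing 30 model parameters to minimise the mismatch with a prescribed pre-industrial climate, can be illustrated schematically. The sketch below is hypothetical and is not the GENIE tuning code: run_model and target stand in for the coupled model and the tuning target, and the population size, generation count, and mutation scale are arbitrary.

    # Hypothetical sketch of objective model tuning with a simple genetic algorithm.
    # run_model(params) returns simulated climate fields; target holds the prescribed
    # pre-industrial fields; bounds is an array of shape (n_params, 2) of ranges.
    import numpy as np

    def fitness(params, run_model, target):
        # Negative root-mean-square mismatch, so that larger is better.
        return -np.sqrt(np.mean((run_model(params) - target) ** 2))

    def tune(run_model, target, bounds, pop_size=40, generations=50, mutation=0.05):
        rng = np.random.default_rng(0)
        lo, hi = bounds[:, 0], bounds[:, 1]
        pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))    # random initial population
        for _ in range(generations):
            scores = np.array([fitness(p, run_model, target) for p in pop])
            # Tournament selection of parents.
            idx = rng.integers(pop_size, size=(pop_size, 2))
            parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]],
                                   idx[:, 0], idx[:, 1])]
            # Uniform crossover between consecutive parents, then Gaussian mutation.
            mask = rng.random(pop.shape) < 0.5
            children = np.where(mask, parents, np.roll(parents, 1, axis=0))
            children += rng.normal(0.0, mutation * (hi - lo), size=pop.shape)
            pop = np.clip(children, lo, hi)
        best = pop[np.argmax([fitness(p, run_model, target) for p in pop])]
        return best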