
    Optimal slice thickness for object detection with longitudinal partial volume effects in computed tomography.

    Longitudinal partial volume effects (z-axial PVE), which occur when an object only partly occupies a slice, degrade image resolution and contrast in computed tomography (CT). Z-axial PVE is unavoidable for subslice objects and reduces their contrast in proportion to the fraction of the object contained within the slice. The effect can be countered with a smaller slice thickness, but at the cost of increased image noise or radiation dose. The aim of this study is to offer a tool for optimizing the reconstruction parameters (slice thickness and slice spacing) of CT protocols in the presence of partial volume effects. The optimization is based on the tradeoff between axial resolution and noise. For that purpose, we developed a simplified analytical model describing the average statistical effect of z-axial PVE on contrast and contrast-to-noise ratio (CNR). A Catphan 500 phantom was scanned with various pitches and CTDI values and reconstructed with different slice thicknesses to assess the visibility of subslice targets that simulate low-contrast anatomical features present in CT exams. The detectability scores of human observers were used to rank perceptual image quality against the CNR. The contrast and CNR reductions due to z-axial PVE measured on experimental data were compared first to numerical calculations and then to the analytical model. Compared to the numerical calculations, the simplified algebraic model slightly overestimated the contrast, but the differences remained below 5%. The model could determine the optimal reconstruction parameters that maximize object visibility for a given dose in the presence of z-axial PVE. For nonoverlapping slices it correctly proposed an optimal slice thickness equal to three-fourths of the object width. The tradeoff between detectability and dose is maximized for a slice spacing of half the slice thickness combined with a slice width equal to the characteristic object width.
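    The contrast-noise tradeoff described in the abstract can be sketched numerically. The toy model below is our own simplification, not the paper's analytical model: voxel contrast is taken as the best single-slice overlap divided by the slice thickness, and noise is assumed to scale as 1/sqrt(thickness), so the relative CNR is contrast times sqrt(thickness). For contiguous (nonoverlapping) slices, maximizing this over the thickness lands near three-fourths of the object width, consistent with the result quoted above.

```python
import math

def mean_contrast(t, n=2000):
    """Average best-slice contrast for a unit-width object in contiguous
    slices of thickness t (in units of the object width), averaged over
    the object's axial offset within one slice period (toy model)."""
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * t / n              # object start position in [0, t)
        best = 0.0
        k = 0
        while k * t < x + 1.0:             # slices overlapping the object [x, x+1]
            overlap = min(x + 1.0, (k + 1) * t) - max(x, k * t)
            best = max(best, overlap)
            k += 1
        total += min(best, t) / t          # PVE: contrast = occupied fraction of slice
    return total / n

def relative_cnr(t):
    # noise assumed to scale as 1/sqrt(t), so CNR ~ contrast * sqrt(t)
    return mean_contrast(t) * math.sqrt(t)

ts = [0.50 + 0.01 * i for i in range(71)]  # thickness from 0.5 to 1.2 object widths
t_opt = max(ts, key=relative_cnr)          # lands near 0.75
```

The CNR curve is very flat around its optimum, so the exact position of the peak depends only weakly on the details of this simplified model.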

    Rapid mechanical stimulation of inner-ear hair cells by photonic pressure

    Hair cells, the receptors of the inner ear, detect sounds by transducing mechanical vibrations into electrical signals. From the top surface of each hair cell protrudes a mechanical antenna, the hair bundle, which the cell uses to detect and amplify auditory stimuli, thus sharpening frequency selectivity and providing a broad dynamic range. Current methods for mechanically stimulating hair bundles are too slow to cover the frequency range of mammalian hearing and are plagued by inconsistencies. To overcome these challenges, we have developed a method for moving individual hair bundles with photonic force. The technique uses an optical fiber whose tip is tapered to a diameter of a few micrometers and endowed with a ball lens to minimize divergence of the light beam. Here we describe the fabrication, characterization, and application of this optical system and demonstrate the rapid application of photonic force to vestibular and cochlear hair cells.
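    For a rough sense of the forces involved, note that a light beam of power P carries momentum flux P/c, so the radiation-pressure force on a target is at most about 2P/c (perfect retroreflection) and smaller for a weakly reflecting structure such as a hair bundle. This back-of-envelope bound is ours, not a figure from the paper:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def radiation_force(power_w, efficiency=1.0):
    """Radiation-pressure force in newtons for a beam of the given power (W).
    efficiency is 1 for full absorption and up to 2 for perfect retroreflection."""
    return efficiency * power_w / C

force = radiation_force(10e-3)  # a fully absorbed 10 mW beam: ~3.3e-11 N, i.e. tens of pN
```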

    Towards the development of an integrated sustainability and resilience benefits assessment framework of urban green growth interventions

    Considering current demographic, urbanization, and climatic trends, integrating sustainability and resilience principles into urban development has become a key priority for decision-makers worldwide. Local and national governments, project developers, and other urban stakeholders dealing with the complexities of urban development need projects with clear structure and outcomes in order to inform decision-making and secure sources of financing. The need for an integrated assessment methodology that captures and quantifies the multiple urban sustainability and resilience benefits of projects in one common framework, and eventually leads to verifiable sustainability and resilience outcomes, is immense and challenging at the same time. The main objective of this paper is to present the development of a methodological approach that integrates the sustainability and resilience benefits derived from the implementation of green growth urban projects into a unified framework of criteria addressing environmental, social, economic, and institutional perspectives. The proposed sustainability and resilience benefits assessment (SRBA) methodology combines top-down and bottom-up approaches, including GIS-based scenario building. The different types of sustainability and resilience benefits of urban green growth projects are also identified at different levels (individual, neighborhood, city, and global). Moreover, the proposed methodology creates scenarios that can be illustrated by a map-based approach to enable better illustration and visualization of benefits. It demonstrates how a map-based approach can assess not only the extent of the sustainability and resilience benefits accrued (how much benefit there is) but also their spatial distribution (who benefits). The main methodological challenges and issues in developing an integrated sustainability and resilience benefits assessment are identified and discussed.

    Hs 76: Cicero: De officiis, in German - Golden Bull - Privileges - Johannes Hartlieb: Kunst der gedächtnüs (Art of Memory) (c. 1475)

    Fair allocation of flows in multicommodity networks has been attracting growing attention. In Max-Min Fair (MMF) flow allocation, not only is the flow of the commodity with the smallest allocation maximized but so too, in turn, are the second smallest, the third smallest, and so on. Since the MMF paradigm makes it possible to approximate the TCP flow allocation when the routing paths are given and the flows are elastic, we address the network routing problem in which, given a graph with arc capacities and a set of origin-destination pairs with unknown demands, each commodity must be routed over a single path so as to maximize the throughput, subject to the constraint that the flows are allocated according to the MMF principle. After discussing two properties of the problem, we describe a column-generation-based heuristic and report some computational results.
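    The MMF principle for fixed paths can be illustrated with the classic progressive-filling (water-filling) algorithm: raise all rates together until some link saturates, freeze the flows crossing it, and repeat. The sketch below is a generic textbook version, not the column-generation heuristic of the paper; the link names and paths are illustrative.

```python
def max_min_fair(capacities, paths):
    """Progressive-filling MMF allocation: raise all unfrozen rates equally
    until a link saturates, then freeze every flow crossing that link."""
    rates = [0.0] * len(paths)
    frozen = [False] * len(paths)
    cap = dict(capacities)                      # remaining capacity per link
    while not all(frozen):
        # smallest equal increment that saturates some link
        increments = []
        for link, c in cap.items():
            users = sum(1 for i, p in enumerate(paths) if link in p and not frozen[i])
            if users:
                increments.append((c / users, link))
        if not increments:
            break                               # remaining flows are unconstrained
        inc = min(increments)[0]
        for i in range(len(paths)):
            if not frozen[i]:
                rates[i] += inc
        for link in cap:
            users = sum(1 for i, p in enumerate(paths) if link in p and not frozen[i])
            cap[link] -= inc * users
        for link, c in cap.items():
            if c <= 1e-9:                       # saturated: freeze its flows
                for i, p in enumerate(paths):
                    if link in p:
                        frozen[i] = True
    return rates
```

For example, with capacities {'a': 1.0, 'b': 2.0} and paths [['a'], ['a', 'b'], ['b']], link a saturates first at rate 0.5, after which the third flow alone fills link b, giving rates [0.5, 0.5, 1.5].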

    New development: Directly elected mayors in Italy: creating a strong leader doesn’t mean creating strong leadership

    More than 20 years after their introduction, directly elected mayors are key players in Italian urban governance. This article explains the main effects of this reform on local government systems and provides lessons for other countries considering directly elected mayors.

    The NA48 event-building PC farm

    The NA48 experiment at the CERN SPS aims to measure the parameter $\Re(\epsilon'/\epsilon)$ of direct CP violation in the neutral kaon system with an accuracy of $2 \times 10^{-4}$. Based on the requirements of:

    * high event rates (up to 10 kHz) with negligible dead time;
    * support for a variety of detectors with very wide variation in the number of readout channels;
    * data rates of up to 150 MByte/s sustained over the beam burst;
    * level-3 filtering and remote data logging in the CERN computer center;

    the collaboration has designed and built a modular pipelined data-flow system with a 40 MHz sampling rate. The architecture combines custom-designed components with commercially available hardware for cost effectiveness and flexibility. To increase the available data bandwidth and to add filtering and monitoring capabilities, the original custom-built event-builder hardware was replaced during the shutdown between the 1997 and 1998 data-taking periods by a farm of 24 Intel Pentium II based PCs running the Linux operating system. During the 1998 data-taking period the system was operated successfully, recording about 70 terabytes of data.
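    The quoted figures imply some simple back-of-envelope numbers (our own arithmetic, using only the rates given in the abstract):

```python
event_rate = 10_000        # events/s (up to 10 kHz)
data_rate = 150e6          # bytes/s sustained over the beam burst
farm_size = 24             # event-building PCs

avg_event_size = data_rate / event_rate   # 15000 bytes, i.e. about 15 kB per event
per_node_rate = data_rate / farm_size     # 6.25e6 bytes/s, about 6 MB/s per PC
```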

    Fast algorithm for real-time rings reconstruction

    The GAP project is dedicated to studying the application of GPUs in several contexts in which real-time response is important for decision-making. The definition of real-time depends on the application under study, with response times ranging from microseconds up to several hours for very computing-intensive tasks. At this conference we presented our work on low-level triggers [1] [2] and high-level triggers [3] in high-energy physics experiments, as well as specific applications in nuclear magnetic resonance (NMR) [4] [5] and cone-beam CT [6]. Apart from the study of dedicated solutions to decrease the latency due to data transport and preparation, the computing algorithms play an essential role in any GPU application. In this contribution, we present an original algorithm, developed for trigger applications, that accelerates ring reconstruction in RICH detectors when seeds from external trackers are not available.
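    A minimal illustration of seedless ring finding is an algebraic circle fit. The Kasa least-squares fit below is a standard textbook method, shown only as a sketch (the abstract does not specify the GAP algorithm); it is attractive for GPU trigger work because each fit reduces to accumulating a few sums and solving a 3x3 linear system.

```python
import math

def solve3(m, d):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    a = [row[:] + [rhs] for row, rhs in zip(m, d)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, 3):
            f = a[r][col] / a[col][col]
            for c in range(col, 4):
                a[r][c] -= f * a[col][c]
    x = [0.0, 0.0, 0.0]
    for r in range(2, -1, -1):
        x[r] = (a[r][3] - sum(a[r][c] * x[c] for c in range(r + 1, 3))) / a[r][r]
    return x

def fit_ring(points):
    """Kasa algebraic circle fit: model x^2 + y^2 = A x + B y + C and solve
    the least-squares normal equations; returns (cx, cy, radius)."""
    sxx = sxy = syy = sx = sy = sxz = syz = sz = 0.0
    for x, y in points:
        z = x * x + y * y
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y
        sxz += x * z; syz += y * z; sz += z
    A, B, C = solve3([[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, float(len(points))]],
                     [sxz, syz, sz])
    cx, cy = A / 2.0, B / 2.0
    return cx, cy, math.sqrt(C + cx * cx + cy * cy)
```

Real RICH events contain noise hits and multiple overlapping rings, so a production algorithm would combine such a fit with pattern recognition (for example, Hough-style voting) to assign hits to candidate rings.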

    ChPT tests at the NA48 and NA62 experiments at CERN

    The NA48/2 Collaboration at CERN has accumulated unprecedented statistics of rare kaon decays in the Ke4 modes Ke4(+-) ($K^\pm \to \pi^+ \pi^- e^\pm \nu$) and Ke4(00) ($K^\pm \to \pi^0 \pi^0 e^\pm \nu$) with background contamination of nearly one percent. The detailed study of form factors and branching ratios based on these data has recently been completed. The results bring new inputs to the description of low-energy strong interactions and to tests of Chiral Perturbation Theory (ChPT) and lattice QCD calculations. In particular, the new data support the ChPT prediction of a cusp in the $\pi^0\pi^0$ invariant-mass spectrum at the two-charged-pion threshold in the Ke4(00) decay. New final results from an analysis of about 400 $K^\pm \to \pi^\pm \gamma \gamma$ rare-decay candidates, collected by the NA48/2 and NA62 experiments at CERN during low-intensity runs with minimum-bias trigger configurations, are also presented. These include a model-independent decay-rate measurement and fits to the ChPT description.

    Comment: XIIth International Conference on Heavy Quarks and Leptons 2014, Mainz, Germany