Validity and reliability of the Structured Clinical Interview for Depersonalization-Derealization Spectrum (SCI-DER).
This study evaluates the validity and reliability of a new instrument developed to assess symptoms of depersonalization: the Structured Clinical Interview for the Depersonalization-Derealization Spectrum (SCI-DER). The instrument is based on a spectrum model that emphasizes soft signs and sub-threshold syndromes as well as clinical and subsyndromal manifestations. In addition to the DSM-IV criteria for depersonalization, the interview items include a number of features derived from clinical experience and from a review of phenomenological descriptions. Study participants included 258 consecutive patients with mood and anxiety disorders (16.7% bipolar I disorder, 18.6% bipolar II disorder, 32.9% major depression, 22.1% panic disorder, 4.7% obsessive-compulsive disorder, and 1.5% generalized anxiety disorder); 2.7% of patients were also diagnosed with depersonalization disorder. A comparison group of 42 unselected controls was enrolled at the same site. The SCI-DER showed excellent reliability and good concurrent validity with the Dissociative Experiences Scale. It significantly discriminated subjects with any diagnosis of mood and anxiety disorders from controls, and subjects with depersonalization disorder from controls. The hypothesized structure of the instrument was confirmed empirically.
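As a rough illustration of the kind of concurrent-validity check reported above, the sketch below correlates a new scale's totals with an established scale's scores. All names and values are fabricated for demonstration; these are not the study's data or its analysis code.

```python
# Minimal sketch of a concurrent-validity check via Pearson correlation.
# The score arrays below are simulated placeholders, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
sci_der_totals = rng.normal(20.0, 5.0, size=258)                 # hypothetical SCI-DER totals
des_scores = 0.8 * sci_der_totals + rng.normal(0.0, 3.0, 258)    # hypothetical DES scores

r, p = pearsonr(sci_der_totals, des_scores)
print(f"concurrent validity: r = {r:.2f}, p = {p:.3g}")
```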
Comparing Multiple Turbulence Restoration Algorithms Performance on Noisy Anisoplanatic Imagery
In this paper, we compare the performance of multiple turbulence mitigation algorithms in restoring imagery degraded by atmospheric turbulence and camera noise. To quantify and compare algorithm performance, imaging scenes were simulated by applying noise and varying levels of turbulence. For the simulation, a Monte-Carlo wave optics approach is used to generate the spatially and temporally varying turbulence in an image sequence. A Poisson-Gaussian noise mixture model is then used to add noise to the observed turbulence image set. These degraded image sets are processed with three separate restoration algorithms: lucky look imaging, bispectral speckle imaging, and a block matching method with a restoration filter. These algorithms were chosen because they incorporate different approaches and processing techniques. The results quantitatively show how well the algorithms are able to restore the simulated degraded imagery.
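For concreteness, here is a minimal sketch of a Poisson-Gaussian noise mixture of the kind described, applied to a clean (turbulence-degraded) frame. The gain and read-noise values are illustrative assumptions, not the paper's parameters.

```python
# Sketch: signal-dependent Poisson shot noise plus additive Gaussian read noise.
import numpy as np

def add_poisson_gaussian_noise(frame, photons_per_unit=100.0, read_noise_std=2.0, rng=None):
    """Apply a Poisson-Gaussian noise mixture to a frame with values in [0, 1]."""
    rng = rng or np.random.default_rng()
    # Scale to photon counts, draw Poisson shot noise, scale back.
    shot = rng.poisson(frame * photons_per_unit) / photons_per_unit
    # Add zero-mean Gaussian read noise in the same normalized units.
    return shot + rng.normal(0.0, read_noise_std / photons_per_unit, frame.shape)

clean = np.random.default_rng(1).random((128, 128))   # stand-in turbulence frame
noisy = add_poisson_gaussian_noise(clean)
```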
Procedures for the Estimation of Pavement and Bridge Preservation Costs for Fiscal Planning and Programming
Facility preservation generally refers to the set of activities carried out to keep a facility in usable condition until the next reconstruction activity. For fiscal planning and programming, it is necessary to know the expected costs of preservation projects and how long they would last. Such information, coupled with minimum standards and facility inventory data, enables estimation of overall monetary needs for bridge and pavement preservation and would assist INDOT in undertaking appropriate programming and attendant financial planning over the long term. However, detailed engineering analyses are not possible every year because of the time and effort involved; therefore, simple procedures to help estimate annual pavement and bridge preservation needs are useful for long-term fiscal planning. The study methodology consisted of first undertaking a full analysis based on engineering principles and detailed work in order to determine pavement and bridge needs over a period of time. Simple procedures to estimate yearly pavement and bridge preservation costs were then developed, and the results were compared to the detailed engineering needs. Deterioration and cost models to establish engineering needs were developed using an array of statistical techniques, including analysis of variance and regression analysis. Using the deterioration models, system inventory, and minimum standards, the level of physical needs was determined for the entire pavement and bridge network over the analysis period. Finally, using the identified physical needs and the developed cost models, the monetary needs were estimated. An age-based approach (one that considers fixed time intervals instead of deterioration trends and minimum standards) was used for the bridge preservation needs. Based on historical expenditure records and the amount of work performed in the past, simple regression models were developed to estimate future annual pavement and bridge preservation needs. The results obtained proved to be consistent with the engineering analysis.
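The simple regression procedure described above can be illustrated with a short sketch: fit annual preservation expenditure against year and extrapolate. The figures below are fabricated placeholders, not INDOT data.

```python
# Sketch: least-squares trend fit of annual preservation cost on year.
import numpy as np

years = np.array([2015, 2016, 2017, 2018, 2019, 2020], dtype=float)
cost_millions = np.array([41.2, 43.5, 44.1, 47.0, 48.8, 51.3])  # hypothetical expenditures

slope, intercept = np.polyfit(years, cost_millions, deg=1)      # simple linear regression
forecast_2025 = slope * 2025 + intercept
print(f"projected 2025 preservation need: ${forecast_2025:.1f}M")
```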
Male factor infertility and assisted reproductive technologies: indications, minimum access criteria, and outcomes
Background: Infertility, defined as the inability to conceive after at least 12 months of regular unprotected sexual intercourse, affects about 15-20% of couples worldwide, and a male factor is involved in about half of the cases. The development of assisted reproductive technology (ART) has made conception possible even for individuals affected by severe oligospermia or azoospermia. However, the impact of the male factor on embryo development, implantation, prevalence of chromosomal abnormalities, genetic and epigenetic alterations, and clinical and obstetric outcomes is still controversial. Purpose: This narrative review examines the indications, minimum access criteria, and outcomes of each individual ART technique in relation to the male factor.
Block Matching and Wiener Filtering Approach to Optical Turbulence Mitigation and Its Application to Simulated and Real Imagery with Quantitative Error Analysis
We present a block-matching and Wiener filtering approach to atmospheric turbulence mitigation for long-range imaging of extended scenes. We evaluate the proposed method, along with some benchmark methods, using simulated and real image sequences. The simulated data are generated with a simulation tool developed by one of the authors. These data provide objective truth and allow for quantitative error analysis. The proposed turbulence mitigation method takes a sequence of short-exposure frames of a static scene and outputs a single restored image. A block-matching registration algorithm is used to provide geometric correction for each of the individual input frames. The registered frames are then averaged, and the average image is processed with a Wiener filter to provide deconvolution. An important aspect of the proposed method lies in how we model the degradation point spread function (PSF) for the purposes of Wiener filtering. We use a parametric model that takes into account the level of geometric correction achieved during image registration. This is unlike any method we are aware of in the literature. By matching the PSF to the level of registration in this way, the Wiener filter is able to fully exploit the reduced blurring achieved by registration. We also describe a method for estimating the atmospheric coherence diameter (or Fried parameter) from the estimated motion vectors. We provide a detailed performance analysis that illustrates how the key tuning parameters impact system performance. The proposed method is relatively simple computationally, yet it has excellent performance in comparison with state-of-the-art benchmark methods in our study.
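As a hedged sketch of the deconvolution step (not the authors' exact implementation), the following applies a frequency-domain Wiener filter to the averaged, registered image. The centered PSF and the noise-to-signal ratio are assumptions supplied by the caller.

```python
# Sketch: Wiener deconvolution, W = H* / (|H|^2 + NSR), applied in the
# frequency domain. Assumes `psf` is centered and the same size as the image.
import numpy as np

def wiener_deconvolve(avg_image, psf, nsr=0.01):
    """Restore an averaged frame given a modeled degradation PSF."""
    H = np.fft.fft2(np.fft.ifftshift(psf))      # degradation transfer function
    G = np.fft.fft2(avg_image)                  # observed spectrum
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)     # Wiener transfer function
    return np.real(np.fft.ifft2(W * G))
```

In practice the `nsr` term acts as regularization: larger values suppress noise amplification at frequencies where the PSF response is weak.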
Application of tilt correlation statistics to anisoplanatic optical turbulence modeling and mitigation
Atmospheric optical turbulence can be a significant source of image degradation, particularly in long-range imaging applications. Many turbulence mitigation algorithms rely on an optical transfer function (OTF) model that includes the Fried parameter. We present anisoplanatic tilt statistics for spherical wave propagation. We transform these into 2D autocorrelation functions that can inform turbulence modeling and mitigation algorithms. Using these, we construct an OTF model that accounts for image registration. We also propose a spectral ratio Fried parameter estimation algorithm that is robust to camera motion and requires no specialized scene content or sources. We employ the Fried parameter estimation and OTF model for turbulence mitigation. A numerical wave-propagation turbulence simulator is used to generate data to quantitatively validate the proposed methods. Results with real camera data are also presented.
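For readers unfamiliar with Fried-parameter-based OTF models, the sketch below evaluates the classic long-exposure atmospheric OTF, exp(-3.44 (lambda f rho / r0)^(5/3)). This is the textbook form, not the registration-aware model of the paper, and the system values are illustrative assumptions.

```python
# Sketch: standard long-exposure atmospheric OTF parameterized by the
# Fried parameter r0 (all system values below are assumptions).
import numpy as np

def long_exposure_atm_otf(rho, wavelength, focal_length, r0):
    """rho: spatial frequency at the focal plane (cycles/m)."""
    return np.exp(-3.44 * (wavelength * focal_length * rho / r0) ** (5.0 / 3.0))

rho = np.linspace(0.0, 2e5, 256)                       # cycles per meter
otf = long_exposure_atm_otf(rho, wavelength=0.6e-6,    # visible-band example
                            focal_length=0.3, r0=0.05)
```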
Validation of the Italian version of the Patient Reported Experience Measures for intermediate care services
Background: Intermediate care (IC) services are a key component of integrated care for elderly people, providing a link between hospital and home through provision of rehabilitation and health and social care. The Patient Reported Experience Measures (PREMs) are designed to measure user experience of care in IC settings. Objective: To examine the feasibility and the scaling properties of the Italian version of the PREMs questionnaires for use in IC services. Methods: A cross-sectional survey was conducted on consecutive users of 1 home-based and 4 bed-based IC services in Emilia-Romagna (Italy). The main outcome measure was the PREMs questionnaire results. The PREMs for the home- and bed-based IC services were translated, back-translated, and adapted through consensus among the members of the advisory board and pilot testing of face validity in 15 patients. A total of 199 questionnaires were returned from users of bed-based services and 185 were returned by mail from users of home-based services. The return rates and responses were examined. Mokken analysis was used to examine the scaling properties of the PREMs. Results: Analysis performed on the bed-based PREMs (N = 154) revealed that 13 items measured the same construct and formed a moderate-strength scale (Loevinger H = 0.488) with good reliability (Cronbach's alpha = 0.843). Analysis of the home-based PREMs (N = 134 records) revealed that 15 items constituted a strong scale (Loevinger H = 0.543) with good reliability (Cronbach's alpha = 0.875). Conclusion: The Italian versions of the bed- and home-based IC-PREMs questionnaires proved to be valid and reliable tools to assess patients' experience of care. Future plans include monitoring user experience over time in the same facilities and in other Italian IC settings for between-service benchmarking.
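The reliability statistic reported above can be illustrated with a short sketch computing Cronbach's alpha from a respondents-by-items matrix. The simulated data below are placeholders chosen only to mimic 13 correlated items from 154 respondents.

```python
# Sketch: Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / total variance).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array of shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(154, 1))                          # shared construct
responses = latent + rng.normal(scale=0.7, size=(154, 13))  # 13 correlated items
print(f"alpha = {cronbach_alpha(responses):.3f}")
```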
Fusion of interpolated frames superresolution in the presence of atmospheric optical turbulence
An extension of the fusion of interpolated frames superresolution (FIF SR) method to perform SR in the presence of atmospheric optical turbulence is presented. The goal of such processing is to improve the performance of imaging systems impacted by turbulence. We provide an optical transfer function analysis that illustrates regimes where significant degradation from both aliasing and turbulence may be present in imaging systems. This analysis demonstrates the potential need for simultaneous SR and turbulence mitigation (TM). While the FIF SR method was not originally proposed to address this joint restoration problem, we believe it is well suited for the task. We propose a variation of the FIF SR method with a fusion parameter that allows it to transition from traditional diffraction-limited SR to pure TM with no SR, as well as a continuum in between. This fusion parameter balances subpixel resolution, needed for SR, with the amount of temporal averaging, needed for TM and noise reduction. In addition, we develop a model of the interpolation blurring that results from the fusion process as a function of this tuning parameter. The blurring model is then incorporated into the overall degradation model that is addressed in the restoration step of the FIF SR method. This innovation benefits the FIF SR method in all applications. We present a number of experimental results to demonstrate the efficacy of the FIF SR method in different levels of turbulence. Simulated imagery with known ground truth is used for a detailed quantitative analysis. Three real infrared image sequences are also used. Two of these include bar targets that allow for a quantitative resolution enhancement assessment.
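The SR/TM trade controlled by the fusion parameter can be illustrated with a toy 1-D sketch: registered samples from many frames are fused onto a fine grid with a width parameter that trades subpixel detail against temporal averaging. This is an illustration of the idea only, not the authors' FIF SR implementation.

```python
# Toy sketch: Gaussian-weighted fusion of registered samples onto a fine grid.
# Small fusion_sigma favors subpixel detail (SR); large favors averaging (TM).
import numpy as np

def fuse_samples(sample_pos, sample_val, grid, fusion_sigma):
    """Fuse scattered 1-D samples onto grid points via Gaussian weights."""
    out = np.zeros_like(grid)
    for i, g in enumerate(grid):
        w = np.exp(-0.5 * ((sample_pos - g) / fusion_sigma) ** 2)
        out[i] = np.sum(w * sample_val) / np.sum(w)
    return out

rng = np.random.default_rng(3)
pos = np.sort(rng.uniform(0.0, 10.0, 200))        # registered sample locations
val = np.sin(pos) + rng.normal(0.0, 0.1, 200)     # noisy samples of a scene
grid = np.linspace(0.0, 10.0, 101)
sr_estimate = fuse_samples(pos, val, grid, fusion_sigma=0.05)  # SR end of the continuum
tm_estimate = fuse_samples(pos, val, grid, fusion_sigma=0.5)   # TM end of the continuum
```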
Super-resolution in the presence of atmospheric optical turbulence
The design of imaging systems involves navigating a complex trade space. As a result, many imaging systems employ focal plane arrays with a detector pitch that is insufficient to meet the Nyquist sampling criterion under diffraction-limited imaging conditions. This undersampling may result in aliasing artifacts and prevent the imaging system from achieving the full resolution afforded by the optics. Another potential source of image degradation, especially for long-range imaging, is atmospheric optical turbulence. Optical turbulence gives rise to spatially and temporally varying image blur and warping from fluctuations in the index of refraction along the optical path. Under heavy turbulence, the blurring from the turbulence acts as an anti-aliasing filter, and undersampling does not generally occur. However, under light to moderate turbulence, many imaging systems will exhibit both aliasing artifacts and turbulence degradation. Few papers in the literature have analyzed or addressed both of these degradations together. In this paper, we provide a novel analysis of undersampling in the presence of optical turbulence. Specifically, we provide an optical transfer function analysis that illustrates regimes where aliasing and turbulence are both present, and where they are not. We also propose and evaluate a super-resolution (SR) method for combating aliasing that offers robustness to optical turbulence. The method has a tuning parameter that allows it to transition from traditional diffraction-limited SR to pure turbulence mitigation with no SR. The proposed method is based on Fusion of Interpolated Frames (FIF) SR, recently proposed by two of the current authors. We quantitatively evaluate the SR method with varying levels of optical turbulence using simulated sequences. We also present results using real infrared imagery.
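The undersampling condition described above reduces to a simple comparison between the detector Nyquist frequency and the diffraction cutoff of the optics. A minimal sketch follows; the wavelength, f-number, and pixel pitch are illustrative assumptions, not the paper's systems.

```python
# Sketch: detector Nyquist frequency vs. diffraction-limited optical cutoff.
wavelength = 4.0e-6     # m (hypothetical MWIR system)
f_number = 2.3
pixel_pitch = 15.0e-6   # m

diffraction_cutoff = 1.0 / (wavelength * f_number)   # cycles/m at the focal plane
nyquist = 1.0 / (2.0 * pixel_pitch)                  # detector Nyquist frequency

ratio = nyquist / diffraction_cutoff
print(f"Nyquist / cutoff = {ratio:.2f} "
      "(< 1 implies aliasing under diffraction-limited conditions)")
```

Heavy turbulence lowers the effective optical cutoff below the detector Nyquist frequency, which is why undersampling artifacts recede in that regime.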
Deep learning for anisoplanatic optical turbulence mitigation in long-range imaging
We present a deep learning approach for restoring images degraded by atmospheric optical turbulence. We consider the case of terrestrial imaging over long ranges with a wide field-of-view. This produces an anisoplanatic imaging scenario where turbulence warping and blurring vary spatially across the image. The proposed turbulence mitigation (TM) method assumes that a sequence of short-exposure images is acquired. A block matching (BM) registration algorithm is applied to the observed frames for dewarping, and the resulting images are averaged. A convolutional neural network (CNN) is then employed to perform spatially adaptive restoration. We refer to the proposed TM algorithm as the block matching and CNN (BM-CNN) method. Training the CNN is accomplished using simulated data from a fast turbulence simulation tool capable of rapidly producing a large amount of degraded imagery from declared truth images. Testing is done using independent data simulated with a different well-validated numerical wave-propagation simulator. Our proposed BM-CNN TM method is evaluated in a number of experiments using quantitative metrics. The quantitative analysis is made possible by virtue of having truth imagery from the simulations. A number of restored images are provided for subjective evaluation. We demonstrate that the BM-CNN TM method outperforms the benchmark methods in the scenarios tested.
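To make the CNN restoration stage concrete, here is a small fully convolutional residual network of the general kind used for image restoration. This is a generic sketch, not the authors' BM-CNN architecture; the channel count and depth are assumptions.

```python
# Sketch: a small fully convolutional network that maps the block-matched,
# averaged frame to a restored image via residual learning.
import torch
import torch.nn as nn

class RestorationCNN(nn.Module):
    def __init__(self, channels=64, depth=5):
        super().__init__()
        layers = [nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True)]
        layers.append(nn.Conv2d(channels, 1, 3, padding=1))
        self.net = nn.Sequential(*layers)

    def forward(self, x):          # x: (N, 1, H, W) averaged, dewarped frame
        return x - self.net(x)     # network predicts the residual degradation

model = RestorationCNN()
restored = model(torch.randn(1, 1, 128, 128))   # toy forward pass
```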