
    Prediction of annual joint rain fade on EHF networks by weighted rain field selection

    ©2015. American Geophysical Union. All Rights Reserved. We present a computationally efficient method to predict joint rain fade on arbitrary networks of microwave links. Methods based on synthetic rain fields composed of a superposition of rain cells have been shown to produce useful predictions of joint fade, with low computational overhead. Other methods using rain fields derived from radar systems have much higher computational overhead but provide better predictions. The proposed method combines the best features of both approaches by using a small number of measured rain fields to produce annual fade predictions. Rain fields are grouped into heavy-rain and light-rain groups by maximum rain rate. A small selection of rain fields from each group is downscaled, and fade predictions are generated by pseudointegration of specific attenuation. This paper presents a method to optimize the weights used to combine the heavy-rain and light-rain fade predictions to yield an estimate of the average annual distribution. The algorithm presented yields estimates of average annual fade distributions with an error small compared to year-to-year variation, using only 0.2% of the annual data set of rain fields.
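The weighted combination described above can be sketched numerically: given exceedance distributions predicted from the heavy-rain and light-rain groups, a pair of weights is fitted so the combination best matches a reference annual distribution. This is a minimal illustration with synthetic curves and hypothetical function names, not the paper's actual optimization procedure.

```python
import numpy as np

def combine_fade_distributions(p_heavy, p_light, w_heavy, w_light):
    """Weighted combination of two fade exceedance distributions (illustrative)."""
    return w_heavy * p_heavy + w_light * p_light

def fit_weights(p_heavy, p_light, p_annual):
    """Least-squares fit of the two group weights to a reference annual distribution."""
    A = np.column_stack([p_heavy, p_light])
    w, *_ = np.linalg.lstsq(A, p_annual, rcond=None)
    return w

# Toy example: fade depths 0..20 dB with synthetic exceedance curves.
fade_db = np.linspace(0.0, 20.0, 50)
p_heavy = np.exp(-fade_db / 8.0)   # heavy-rain events: exceedance falls off slowly
p_light = np.exp(-fade_db / 2.0)   # light-rain events: exceedance falls off quickly
p_annual = 0.3 * p_heavy + 0.7 * p_light  # synthetic "measured" annual curve

w = fit_weights(p_heavy, p_light, p_annual)
```

Because the synthetic annual curve lies exactly in the span of the two group curves, the fit recovers the weights used to build it.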

    A rain height model to predict fading due to wet snow on terrestrial links

    Recommendation ITU‐R P.530‐13 provides an internationally recognized prediction model for the fading due to wet snow on low‐elevation, terrestrial microwave links. An important parameter in this model is the altitude difference between the link and the rain height. The top of rain events is usually assumed to be 360 m above the zero‐degree isotherm (ZDI). Above this height, hydrometeors are ice with low specific attenuation. Below this level, melting ice particles produce a specific attenuation up to 4 times that of the associated rain rate. A previous paper identified increasing ZDI height trends across northern Europe, North America and central Asia with slopes up to 10 m/yr. This paper examines NOAA National Centers for Environmental Prediction-National Center for Atmospheric Research Reanalysis 1 data to identify global distributions of ZDI height around mean levels that increase linearly over time. The average annual distribution of ZDI heights relative to the annual mean is calculated for each NOAA Reanalysis grid square, and skew normal distributions are fitted. These are compared to models in Recommendation ITU‐R P.530‐13 and Recommendation ITU‐R P.452‐14. The effects of ZDI trends and the calculated skew normal distributions are illustrated using calculated trends in fading due to wet snow for two notional 38 GHz links in Edinburgh. A slow decrease in the incidence of fading due to wet snow is predicted over most of Europe. However, some links could experience increases where warming has increased the wetness of snow.
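The per-grid-square skew-normal fitting step can be sketched as follows. The data here are synthetic ZDI height anomalies, not reanalysis values, and the parameters are purely illustrative; the sketch only shows the kind of fit the abstract describes, using SciPy's skew-normal distribution.

```python
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(0)

# Hypothetical ZDI height anomalies (metres) relative to the annual mean
# for one reanalysis grid square: synthetic, right-skewed sample data.
anomalies = skewnorm.rvs(a=4.0, loc=-200.0, scale=400.0, size=5000,
                         random_state=rng)

# Fit a skew normal distribution to the anomalies, as done per grid square.
a, loc, scale = skewnorm.fit(anomalies)
fitted_mean = skewnorm.mean(a, loc=loc, scale=scale)
```

The fitted shape parameter `a` controls the asymmetry; `a = 0` would reduce to an ordinary normal distribution.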

    Parallel algorithms for three dimensional electrical impedance tomography

    This thesis is concerned with Electrical Impedance Tomography (EIT), an imaging technique in which pictures of the electrical impedance within a volume are formed from current and voltage measurements made on the surface of the volume. The focus of the thesis is the mathematical and numerical aspects of reconstructing the impedance image from the measured data (the reconstruction problem). The reconstruction problem is mathematically difficult and most reconstruction algorithms are computationally intensive. Many of the potential applications of EIT in medical diagnosis and industrial process control depend upon rapid reconstruction of images. The aim of this investigation is to find algorithms and numerical techniques that lead to fast reconstruction while respecting the real mathematical difficulties involved. A general framework for Newton-based reconstruction algorithms is developed which describes a large number of the reconstruction algorithms used by other investigators. Optimal experiments are defined in terms of current drive and voltage measurement patterns, and it is shown that adaptive current reconstruction algorithms are a special case of their use. This leads to a new reconstruction algorithm using optimal experiments which is considerably faster than other methods of the Newton type. A tomograph is tested to measure the magnitude of the major sources of error in the data used for image reconstruction. An investigation into the numerical stability of reconstruction algorithms identifies the resulting uncertainty in the impedance image. A new data collection strategy and a numerical forward model are developed which minimise the effects of what were previously the major sources of error. A reconstruction program is written for a range of Multiple Instruction Multiple Data (MIMD) distributed-memory parallel computers. These machines offer high computational power at low cost and so are attractive as components in medical tomographs. The performance of several reconstruction algorithms on these computers is analysed in detail.
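A single iteration of a Newton-type reconstruction of the kind the abstract describes can be sketched as a regularized Gauss-Newton update: the conductivity correction is obtained from the Jacobian of the boundary voltages with respect to conductivity. This is a generic textbook sketch on a toy linear problem, with hypothetical names, not the thesis's specific algorithm or optimal-experiment scheme.

```python
import numpy as np

def gauss_newton_step(J, v_meas, v_pred, lam=1e-2):
    """One regularized Newton-type update for the EIT inverse problem.

    J      : Jacobian of boundary voltages w.r.t. conductivity pixels
    v_meas : measured boundary voltages
    v_pred : voltages predicted by the forward model at the current estimate
    lam    : Tikhonov regularization weight (stabilizes the ill-posed step)
    Returns the conductivity update delta_sigma.
    """
    r = v_meas - v_pred                       # residual of boundary voltages
    H = J.T @ J + lam * np.eye(J.shape[1])    # regularized normal equations
    return np.linalg.solve(H, J.T @ r)

# Toy linear example: recover a known conductivity perturbation.
rng = np.random.default_rng(1)
J = rng.standard_normal((64, 16))   # 64 boundary measurements, 16 image pixels
delta_true = rng.standard_normal(16)
v_meas = J @ delta_true             # synthetic "measured" data
delta = gauss_newton_step(J, v_meas, np.zeros(64), lam=1e-8)
```

On this well-posed toy problem the step recovers the perturbation almost exactly; in real EIT the Jacobian is severely ill-conditioned, which is why the regularization term and careful measurement design matter.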

    C9orf72-Associated FTD/ALS: When Less Is More

    Hexanucleotide repeat expansions in C9ORF72 cause neurodegeneration in FTD and ALS by unknown mechanisms. A new report, by Donnelly et al. (2013), finds that these repeats trigger a pathogenic gain-of-function cascade that can be corrected by suppressing expression of the repeat transcript, paving the way for therapeutic strategies aimed at eliminating the toxic RNA.