
    Stochastic multiple-stream decoding of Cortex codes

    Cortex codes are short block codes with a large Hamming distance. Their modular construction, based on simple and very short block codes, makes them difficult to decode efficiently with digital decoders implementing the Sum-Product algorithm. However, this construction lends itself to analog decoding with performance close to ML decoding, as was recently demonstrated. Stochastic decoding is a digital decoding method close in spirit to analog decoding. This paper brings the two together to design a stochastic Cortex decoder architecture with good decoding performance. Moreover, the proposed stochastic decoder architecture is simpler than the customary one: instead of edge memories or tracking forecast memories, it uses multiple streams to represent the same probability, together with deterministic shufflers. This results in a more efficient architecture in terms of the ratio of data throughput to hardware complexity. Finally, the proposed method offers decoding performance similar to that of a Min-Sum decoder with 50 iterations.
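The multiple-stream idea rests on a standard stochastic-computing identity: if two independent Bernoulli bit streams encode probabilities p and q, their bitwise AND encodes the product pq, so a single AND gate stands in for a multiplier. A minimal sketch in Python (illustrative only; the stream length and probabilities are arbitrary and not taken from the paper):

```python
import random

def bernoulli_stream(p, n, rng):
    """Unipolar stochastic encoding: each bit is 1 with probability p."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stream_mean(bits):
    """Decode a stochastic stream back to a probability estimate."""
    return sum(bits) / len(bits)

rng = random.Random(0)
n = 100_000
a = bernoulli_stream(0.8, n, rng)
b = bernoulli_stream(0.5, n, rng)

# Bitwise AND of independent streams multiplies the encoded probabilities.
prod = [x & y for x, y in zip(a, b)]
print(stream_mean(prod))  # close to 0.8 * 0.5 = 0.4
```

Estimation accuracy grows with stream length, which is the precision/throughput trade-off that representing one probability with several parallel streams is meant to ease.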

    Effectiveness of organised versus opportunistic mammography screening

    Background: A detailed comparison of effectiveness between organised and opportunistic mammography screening operating in the same country has seldom been carried out. Patients and methods: Prognostic indicators, as defined in the European Guidelines, were used to evaluate screening effectiveness in Switzerland. Matching of screening programmes' records with population-based cancer registries made it possible to compare indicators of effectiveness by screening and detection modality (organised versus opportunistic screening, unscreened, interval cancers). Comparisons of prognostic profile were also drawn with two Swiss regions not covered by service screening, with low and high prevalence of opportunistic screening, respectively. Results: Opportunistic and organised screening yielded overall little difference in prognostic profile. Both screening types led to substantial stage shifting. Breast cancer prognostic indicators were systematically more favourable in Swiss regions covered by a programme. In regions without a screening programme, the higher the prevalence of opportunistic screening, the better the prognostic profile. Conclusions: Organised screening appeared as effective as opportunistic screening. Mammography screening has strongly influenced the stage distribution of breast cancer in Switzerland, and a favourable impact on mortality is anticipated. Extension of organised mammography screening to the whole of Switzerland can be expected to further improve breast cancer prognosis in a cost-effective way.

    Predators and food turn the teal's day into night.


    Experimental functional response and inter-individual variation in foraging rate of teal (Anas crecca)

    The functional response, i.e. the change in per capita food intake rate per time unit with changed food availability, is a widely used tool for understanding the ecology and behaviour of animals. However, waterfowl remain poorly explored in this context. In an aviary experiment we derived a functional response curve for teal (Anas crecca) foraging on rice (Oryza sativa) seeds. We found a linear relationship between intake rate and seed density, as expected for a filter-feeder. At high seed densities we found a threshold, above which intake rate still increased linearly but with a lower slope, possibly reflecting a switch from filter-feeding to a scooping foraging mode. The present study shows that food intake rate in teal is linearly related to food availability within the range of naturally occurring seed densities, a finding with major implications for management and conservation of wetland habitats.
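The two-slope relationship described above can be written as a piecewise-linear functional response. The sketch below is a generic illustration; the slopes and threshold are made-up parameters, not the fitted values from the experiment:

```python
def intake_rate(density, slope1, slope2, threshold):
    """Piecewise-linear functional response: intake rises with slope1 up to
    a threshold seed density, then continues with a shallower slope2
    (e.g. a switch from filter-feeding to a scooping foraging mode)."""
    if density <= threshold:
        return slope1 * density
    return slope1 * threshold + slope2 * (density - threshold)

# Hypothetical parameters for illustration only.
print(intake_rate(50, 0.05, 0.02, 100))   # below threshold: 2.5
print(intake_rate(150, 0.05, 0.02, 100))  # above threshold: 5.0 + 1.0 = 6.0
```

The response is continuous at the threshold, which is what distinguishes a slope change from a hard ceiling on intake.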

    The origin of the "European Medieval Warm Period"

    Proxy records and results of a three-dimensional climate model show that European summer temperatures roughly a millennium ago were comparable to those of the last 25 years of the 20th century, supporting the existence of a summer "Medieval Warm Period" in Europe. Those two relatively mild periods were separated by a rather cold era, often referred to as the "Little Ice Age". Our modelling results suggest that the warm summer conditions during the early second millennium, compared to the climate background state of the 13th–18th century, are due in large part to the long-term cooling induced by changes in land use in Europe. During the last 200 years, the effect of increasing greenhouse gas concentrations, which was partly levelled off by that of sulphate aerosols, has dominated the climate history over Europe in summer. This induces a clear warming during the last 200 years, allowing summer temperature during the last 25 years to return to the values simulated for the early second millennium. Volcanic and solar forcing plays a weaker role in this comparison between the last 25 years of the 20th century and the early second millennium. Our hypothesis appears consistent with proxy records, but modelling results have to be weighed against the existing uncertainties in the external forcing factors, in particular those related to land-use changes, and against the uncertainty of the regional climate sensitivity. Evidence for winter is more equivocal than for summer. The forced response in the model displays a clear temperature maximum at the end of the 20th century. However, the uncertainties are too large to state that this period is the warmest of the past millennium in Europe during winter.

    High-throughput turbo decoding of a product code using an extended (128,106,8) t=3 BCH code

    This article proposes a new architecture for soft-input, soft-output turbo decoding of (128,106,8) BCH codes correcting 3 errors. Using the concept of parallelism and the properties of the matrix generated by a product code, the article presents the design and complexity of three decoders able to process 2, 4, and 8 data items at a time to achieve high decoding throughput. To compare performance and complexity across the different decoders, the C language was used for simulations, VHDL for functional simulations, and Synopsys Design Compiler for synthesis. The results obtained open the possibility of silicon integration of turbo decoders with strong error-correcting power (minimum distance 64, code rate 0.8) at high throughput (6.4 Gbit/s).
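The row/column structure that such decoders exploit is the generic product-code construction: encode every row with the component code, then every column. A toy sketch using even parity as the component code (not the extended (128,106,8) BCH code from the article, but the same row-then-column construction):

```python
def encode_product(data):
    """Encode a k1 x k2 bit matrix as a product code, using a single even
    parity bit as the (toy) component code: append a parity bit to every
    row, then a parity row covering every column."""
    rows = [r + [sum(r) % 2] for r in data]
    parity_row = [sum(col) % 2 for col in zip(*rows)]
    return rows + [parity_row]

code = encode_product([[1, 0], [1, 1]])
# Every row and every column of the codeword has even parity; in general,
# the minimum distance of a product code is the product of the component
# codes' minimum distances.
for r in code:
    assert sum(r) % 2 == 0
for col in zip(*code):
    assert sum(col) % 2 == 0
```

Because rows are encoded independently of one another, a hardware decoder can process several of them in parallel (2, 4, or 8 in the article), which is where the throughput gain comes from.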