90 research outputs found

    Cervical neoplasia–related factors and decreased prevalence of uterine fibroids among a cohort of African American women

    To investigate whether the previously reported inverse association between cervical neoplasia and uterine fibroids is corroborated

    Prevalence of Uterine Leiomyomas in the First Trimester of Pregnancy: An Ultrasound-Screening Study

    To estimate the proportion of pregnant women with one or more leiomyomas detected by research-quality ultrasound screening in the first trimester; to describe the size and location of the leiomyomas identified; and to report variation in prevalence by race/ethnicity

    Short-term change in growth of uterine leiomyoma: tumor growth spurts

    To describe the short-term changes in growth of uterine leiomyomata (fibroids)

    Information and Discriminability as Measures of Reliability of Sensory Coding

    Response variability is a fundamental issue in neural coding because it limits all information processing. The reliability of neuronal coding is quantified by various approaches in different studies, and in most cases it is largely unclear to what extent the conclusions depend on the applied reliability measure, making comparisons across studies almost impossible. We demonstrate that different reliability measures can lead to very different conclusions even when applied to the same set of data. In particular, we applied information-theoretic measures (Shannon information capacity and Kullback-Leibler divergence) as well as a discrimination measure derived from signal-detection theory to the responses of blowfly photoreceptors, a well-established model system for sensory information processing. We stimulated the photoreceptors with white-noise-modulated light intensity fluctuations of different contrasts. Surprisingly, the signal-detection approach permits reliable discrimination of the photoreceptor responses even when the response signal-to-noise ratio (SNR) is well below unity, whereas the Shannon information capacity and the Kullback-Leibler divergence indicate very low performance. Applying different measures can therefore lead to very different interpretations of the system's coding performance. Because they are less sensitive than the signal-detection approach, the information-theoretic measures overestimate internal noise sources and underestimate the importance of photon shot noise. We stress that none of the measures used, and most likely no other measure alone, allows for an unbiased estimate of a neuron's coding properties. The applied measure therefore needs to be selected with respect to the scientific question and the analyzed neuron's functional context
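    To make the contrast between these families of measures concrete, here is a minimal Python sketch (not the study's code) that computes a Gaussian-channel Shannon capacity from assumed signal and noise power spectra and a signal-detection d' from two toy response samples; every number, array, and function name below is a hypothetical placeholder.

    import numpy as np

    def shannon_capacity(signal_psd, noise_psd, df):
        # Gaussian-channel capacity in bits/s: C = sum_f df * log2(1 + S(f)/N(f))
        return np.sum(np.log2(1.0 + signal_psd / noise_psd)) * df

    def dprime(resp_a, resp_b):
        # Signal-detection discriminability between two sets of scalar responses
        mu_a, mu_b = np.mean(resp_a), np.mean(resp_b)
        var_a, var_b = np.var(resp_a), np.var(resp_b)
        return abs(mu_a - mu_b) / np.sqrt(0.5 * (var_a + var_b))

    rng = np.random.default_rng(0)

    # Toy single-trial responses to two stimuli, with a noise SD three times the
    # mean response difference, i.e. a per-trial SNR well below unity.
    resp_ref  = rng.normal(0.0, 3.0, 2000)
    resp_test = rng.normal(1.0, 3.0, 2000)
    print("single-trial d':", round(dprime(resp_ref, resp_test), 2))

    # Averaging N independent trials shrinks the noise SD by sqrt(N), so d'
    # grows as sqrt(N); discrimination can become reliable even though the
    # per-band SNR entering the capacity estimate stays below 1.
    n_avg = 25
    avg_ref  = resp_ref.reshape(-1, n_avg).mean(axis=1)
    avg_test = resp_test.reshape(-1, n_avg).mean(axis=1)
    print("averaged d':", round(dprime(avg_ref, avg_test), 2))

    # A flat SNR of ~0.11 across 100 bands of 1 Hz gives a low bit rate.
    signal_psd = np.full(100, 1.0)
    noise_psd  = np.full(100, 9.0)
    print("capacity (bits/s):", round(shannon_capacity(signal_psd, noise_psd, df=1.0), 1))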

    Consequences of converting graded to action potentials upon neural information coding and energy efficiency

    Information is encoded in neural circuits using both graded and action potentials, and is converted between them within single neurons and across successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are by-products of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise, (2) they introduce non-linearities, and (3) the finite duration of the action potential creates a ‘footprint’ in the generator potential that obscures incoming signals. These three processes reduce information rates in generator potentials by ~50%, to ~3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of their lower information rates, generator potentials are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains, owing to the higher energy costs and low information content of spikes, emphasizing that converting analogue to digital carries a two-fold cost: information loss and cost inflation
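    The efficiency comparison above reduces to dividing an information rate by an energy consumption rate. The Python sketch below merely reproduces that arithmetic with placeholder numbers chosen to mirror the rough ratios quoted in the abstract (~50% loss, ~3x the spike-train rate, ~10x lower energy use); the absolute values are invented for illustration and are not the paper's results.

    # Back-of-envelope coding-efficiency comparison with assumed numbers.
    spike_info_rate   = 200.0      # bits/s in the spike train (assumed)
    spike_energy_rate = 1.0e9      # energy/s for spiking output, arbitrary units (assumed)

    gen_info_rate      = 3.0 * spike_info_rate    # generator potential: ~3x the spike-train rate
    graded_info_rate   = 2.0 * gen_info_rate      # graded potential: ~2x (i.e. without the ~50% loss)
    analog_energy_rate = spike_energy_rate / 10   # analogue signalling: ~an order of magnitude cheaper

    def efficiency(bits_per_s, energy_per_s):
        # Energy efficiency of coding: information transmitted per unit of energy.
        return bits_per_s / energy_per_s

    for name, info, energy in [
        ("spike train",         spike_info_rate,   spike_energy_rate),
        ("generator potential", gen_info_rate,     analog_energy_rate),
        ("graded potential",    graded_info_rate,  analog_energy_rate),
    ]:
        print(f"{name:20s} {efficiency(info, energy):.2e} bits per energy unit")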

    Network adaptation improves temporal representation of naturalistic stimuli in Drosophila eye: II Mechanisms

    Retinal networks must adapt constantly to best present the ever-changing visual world to the brain. Here we test the hypothesis that adaptation is the result of different mechanisms at several synaptic connections within the network. In a companion paper (Part I), we showed that adaptation in the photoreceptors (R1-R6) and large monopolar cells (LMCs) of the Drosophila eye improves sensitivity to under-represented signals within seconds by enhancing both the amplitude and frequency distribution of the LMCs' voltage responses to repeated naturalistic contrast series. In this paper, we show that such adaptation requires both the light-mediated conductance and the feedback-mediated synaptic conductance. A faulty feedforward pathway in histamine-receptor mutant flies speeds up the LMC output, mimicking extreme light adaptation. A faulty feedback pathway from L2 LMCs to photoreceptors slows down the LMC output, mimicking dark adaptation. These results underline the importance of network adaptation for efficient coding and as a mechanism for selectively regulating the size and speed of signals in neurons. We suggest that the concerted action of many different mechanisms and neural connections is responsible for adaptation to visual stimuli. Further, our results demonstrate the need for detailed circuit reconstructions, such as that of the Drosophila lamina, to understand how networks process information

    The Natural Variation of a Neural Code

    The way information is represented by sequences of action potentials of spiking neurons is determined by the input each neuron receives, but also by its biophysics and the specifics of the circuit in which it is embedded. Even the “code” of identified neurons can vary considerably from individual to individual. Here we compared the neural codes of the identified H1 neuron in the visual systems of two families of flies, blow flies and flesh flies, and explored the effect of the sensory environment that the flies were exposed to during development on the H1 code. We found that the two families differed considerably in the temporal structure of the code, its content and energetic efficiency, as well as the temporal delay of the neural response. Differences in the environmental conditions during the flies' development had no significant effect. Our results may thus reflect an instance of a family-specific design of the neural code. They may also suggest that individual variability in information processing by this specific neuron, in terms of both form and content, is regulated genetically

    Evidence for dynamic network regulation of Drosophila photoreceptor function from mutants lacking the neurotransmitter histamine

    Synaptic feedback from interneurons to photoreceptors can help to optimize visual information flow by balancing its allocation across retinal pathways under changing light conditions. But little is known about how this critical network operation is regulated dynamically. Here, we investigate this question by comparing the signaling properties and performance of wild-type Drosophila R1-R6 photoreceptors to those of the hdcJK910 mutant, which lacks the neurotransmitter histamine and therefore cannot transmit information to interneurons. Recordings show that hdcJK910 photoreceptors sample similar amounts of information from naturalistic stimulation as wild-type photoreceptors, but this information is packaged in smaller responses, especially under bright illumination. Analyses reveal how these altered dynamics resulted primarily from network overload, which affected hdcJK910 photoreceptors in two ways. First, the missing inhibitory histamine input to interneurons almost certainly depolarized them irrevocably, which in turn increased their excitatory feedback to hdcJK910 R1-R6s. This tonic excitation depolarized the photoreceptors to artificially high potentials, reducing their operational range. Second, rescuing histamine input to interneurons in the hdcJK910 mutant also restored their normal phasic feedback modulation to R1-R6s, causing photoreceptor output to accentuate dynamic intensity differences at bright illumination, similar to the wild-type. These results provide mechanistic explanations of how synaptic feedback connections optimize information packaging in photoreceptor output, and novel insight into the operation and design of dynamic network regulation of sensory neurons

    Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons

    The brain is considered to use a relatively small amount of energy for its efficient information processing. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not suit the brain. The MMI attempts to send information as accurately as possible, and this usually requires a sufficient energy supply for establishing clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activities recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints, one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a tradeoff between the energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of the recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. We thus propose a mathematical principle by which cortical neurons may represent information about synaptic input in their output spike trains
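    To make the inter-spike-interval argument concrete, the Python sketch below fits an exponential and a Pareto (power-law) tail to the same ISI sample by maximum likelihood and compares their log-likelihoods. The data are synthetic and the tail threshold t_min is an arbitrary assumption, so this only sketches the shape of such an analysis, not the authors' actual procedure.

    import numpy as np

    # Hypothetical ISI sample (seconds); real data would come from the recordings.
    rng = np.random.default_rng(1)
    isi = (rng.pareto(1.5, 5000) + 1.0) * 0.01    # heavy-tailed ISIs for illustration

    t_min = 0.01
    tail = isi[isi >= t_min]                      # fit both models to the same tail data

    # Exponential tail model: p(t) = (1/tau) * exp(-(t - t_min)/tau), t >= t_min
    tau = np.mean(tail - t_min)
    ll_exp = np.sum(-np.log(tau) - (tail - t_min) / tau)

    # Power-law (Pareto) tail model: p(t) = ((a - 1)/t_min) * (t/t_min)**(-a), t >= t_min
    alpha = 1.0 + len(tail) / np.sum(np.log(tail / t_min))
    ll_pow = np.sum(np.log((alpha - 1.0) / t_min) - alpha * np.log(tail / t_min))

    print(f"exponential tail: tau = {tau:.4f} s, logL = {ll_exp:.1f}")
    print(f"power-law tail:   alpha = {alpha:.2f},  logL = {ll_pow:.1f}")
    # A clearly higher power-law log-likelihood indicates a heavier-than-exponential
    # tail, the signature the authors associate with the CMFE rather than the MMI.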

    “I Look in Your Eyes, Honey”: Internal Face Features Induce Spatial Frequency Preference for Human Face Processing

    Numerous psychophysical experiments have found that humans preferentially rely on a narrow band of spatial frequencies for recognition of face identity. A recently conducted theoretical study by the author suggests that this frequency preference reflects an adaptation of the brain's face-processing machinery to this specific stimulus class (i.e., faces). The purpose of the present study is to examine this property in greater detail and specifically to elucidate the contribution of internal face features (i.e., eyes, mouth, and nose). To this end, I parameterized Gabor filters to match the spatial receptive fields of contrast-sensitive neurons in the primary visual cortex (simple and complex cells). Filter responses to a large number of face images were computed, aligned for internal face features, and response-equalized (“whitened”). The results demonstrate that the frequency preference is caused by the internal face features. Thus, the psychophysically observed human frequency bias for face processing seems to be specifically caused by the intrinsic spatial frequency content of internal face features
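    A minimal Python sketch of this kind of analysis is given below: it builds Gabor patches as stand-ins for V1 simple-cell receptive fields, pools rectified responses over orientations within each spatial-frequency band, and then equalizes (“whitens”) the band responses by their ensemble mean. The patch size, sampling density, frequency bands, and the random “images” are assumptions for illustration, not the parameters or stimuli used in the study.

    import numpy as np

    def gabor_kernel(freq_cpd, theta, sigma_deg, size=65, px_per_deg=64):
        # 2-D Gabor patch: a sinusoidal carrier at freq_cpd (cycles/degree) under a
        # Gaussian envelope, a standard model of a V1 simple-cell receptive field.
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1] / px_per_deg   # degrees
        xr = x * np.cos(theta) + y * np.sin(theta)
        envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma_deg**2))
        return envelope * np.cos(2.0 * np.pi * freq_cpd * xr)

    def band_responses(patch, freqs, thetas, sigma_deg=0.25):
        # Rectified filter response per frequency band, pooled over orientations
        # (a crude, complex-cell-like energy measure at the patch centre).
        out = []
        for f in freqs:
            energy = sum(abs(np.sum(gabor_kernel(f, th, sigma_deg) * patch))
                         for th in thetas)
            out.append(energy / len(thetas))
        return np.array(out)

    # Hypothetical ensemble of 65x65 patches aligned on an internal face feature
    # (e.g. an eye); a real analysis would use many aligned face images.
    rng = np.random.default_rng(2)
    patches = rng.normal(size=(20, 65, 65))

    freqs = [2, 4, 8, 16]                                  # cycles per degree (assumed)
    thetas = np.linspace(0.0, np.pi, 4, endpoint=False)    # orientations
    resp = np.array([band_responses(p, freqs, thetas) for p in patches])

    # Response equalization ("whitening"): divide each band by its ensemble mean,
    # so that a residual peak in any one image reflects feature-specific content.
    whitened = resp / resp.mean(axis=0)
    print(dict(zip(freqs, np.round(whitened[0], 2))))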