The Arc of Instability in Ukrainian Geopolitics
The article addresses the formation of the geopolitical vector of modern Ukraine. It highlights negative, effectively crisis-level phenomena and processes in Ukrainian state-building. Ukraine is a borderland between the modern West and the East. The intensifying confrontation between the West and Russia may lead to a split in Ukrainian society and in the state as a whole. The orientation of the country's politicians toward the West brings enormous harm to the East Slavic peoples: the resolution of vitally important economic problems is being supplanted by political actions aimed at dividing those peoples. By the will of its leaders, Ukraine is becoming an organizer of anti-Russian policy under present-day conditions. Attempts, with US support, to push the ideas of the "orange revolution" eastward were not crowned with success, but they inflict enormous economic and other harm on the state and on the entire Ukrainian people
The impact of acute asymmetric hearing loss on multisensory integration
Humans have the remarkable ability to integrate information from different senses, which greatly facilitates the detection, localization and identification of events in the environment. About 466 million people worldwide suffer from hearing loss. Yet, the impact of hearing loss on how the senses work together is rarely investigated. Here, we investigate how a common sensory impairment, asymmetric conductive hearing loss (AHL), alters the way our senses interact by examining human orienting behaviour with normal hearing (NH) and acute AHL. This type of hearing loss disrupts auditory localization. We hypothesized that this creates a conflict between auditory and visual spatial estimates and alters how auditory and visual inputs are integrated to facilitate multisensory spatial perception. We analysed the spatial and temporal properties of saccades to auditory, visual and audiovisual stimuli before and after plugging the right ear of participants. Both spatial and temporal aspects of multisensory integration were affected by AHL. Compared with NH, AHL caused participants to make slow, inaccurate and imprecise saccades towards auditory targets. Surprisingly, increased weight on visual input resulted in accurate audiovisual localization with AHL. This came at a cost: saccade latencies for audiovisual targets increased significantly. The larger the auditory localization errors, the less participants were able to benefit from audiovisual integration in terms of saccade latency. Our results indicate that observers immediately change sensory weights to effectively deal with acute AHL and preserve audiovisual accuracy in a way that cannot be fully explained by statistical models of optimal cue integration
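The statistical models of optimal cue integration referred to above typically formalize integration as reliability-weighted averaging (maximum-likelihood estimation), where each cue is weighted by its inverse variance. A minimal sketch, using made-up variance values rather than data from the study:

```python
# Maximum-likelihood (reliability-weighted) cue integration.
# A cue's weight is proportional to its inverse variance; all numbers
# below are illustrative, not measurements from the study.

def integrate_cues(est_a, var_a, est_v, var_v):
    """Combine auditory and visual location estimates by reliability."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)  # auditory weight
    w_v = 1.0 - w_a                                    # visual weight
    combined = w_a * est_a + w_v * est_v
    combined_var = 1.0 / (1.0 / var_a + 1.0 / var_v)   # never exceeds either cue's variance
    return combined, combined_var

# Normal hearing: cues equally reliable, the estimate splits the difference.
print(integrate_cues(est_a=10.0, var_a=4.0, est_v=8.0, var_v=4.0))  # → (9.0, 2.0)

# Acute hearing loss: auditory variance rises, so vision dominates.
print(integrate_cues(est_a=25.0, var_a=100.0, est_v=8.0, var_v=4.0))
```

The abstract's point is that behaviour under acute AHL is not fully captured by this scheme; for instance, the scheme predicts no latency cost for integrating a degraded cue.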
Short-Latency Evoked Potentials of the Human Auditory System
Auditory Brainstem Responses (ABR) are short-latency electric potentials from the auditory nervous system that can be evoked by presenting transient acoustic stimuli to the ear. Sources of the ABR are the auditory nerve and brainstem auditory nuclei. Clinical application of ABRs includes identification of the site of lesion in retrocochlear hearing loss, establishing functional integrity of the auditory nerve, and objective audiometry. Recording of ABR requires a measurement setup with a high-quality amplifier with adequate filtering and low skin-electrode impedance to reduce non-physiological interference. Furthermore, signal averaging and artifact rejection are essential tools for obtaining a good signal-to-noise ratio. Comparing latencies for different peaks at different stimulus intensities allows the determination of hearing threshold, location of the site of lesion, and establishment of neural integrity. Audiological assessment of infants who are referred after failing hearing screening relies on accurate estimation of hearing thresholds. Frequency-specific ABR using tone-burst stimuli is a clinically feasible method for this. Appropriate correction factors should be applied to estimate the hearing threshold from the ABR threshold. Whenever possible, obtained thresholds should be confirmed with behavioral testing. The Binaural Interaction Component of the ABR provides important information regarding binaural processing in the brainstem
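Signal averaging and artifact rejection, noted above as essential for an adequate signal-to-noise ratio, can be sketched in a few lines; the epoch count, amplitudes, and rejection threshold below are illustrative assumptions, not clinical recording parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: a small evoked response buried in much larger noise.
n_epochs, n_samples = 2000, 240               # illustrative values
t = np.arange(n_samples)
evoked = 1.0 * np.exp(-(t - 60) ** 2 / 50.0)  # hypothetical peak at sample 60
epochs = evoked + rng.normal(0.0, 5.0, (n_epochs, n_samples))
epochs[::50] += 40.0                          # simulate occasional large artifacts

# Artifact rejection: discard any epoch whose amplitude exceeds a threshold.
threshold = 20.0
keep = np.abs(epochs).max(axis=1) < threshold

# Signal averaging: uncorrelated noise shrinks roughly as 1/sqrt(N),
# so the evoked response emerges from the mean of the retained epochs.
average = epochs[keep].mean(axis=0)
print(f"kept {keep.sum()} of {n_epochs} epochs")
print(f"estimated peak latency: sample {average.argmax()}")
```

A real ABR pipeline would also band-pass filter and time-lock epochs to stimulus onset; this sketch only illustrates why averaging plus rejection can recover a response far smaller than the background noise.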
Auditory timing-tuned neural responses in the human auditory cortices
Perception of sub-second auditory event timing supports multisensory integration, and speech and music perception and production. Neural populations tuned for the timing (duration and rate) of visual events were recently described in several human extrastriate visual areas. Here we ask whether the brain also contains neural populations tuned for auditory event timing, and whether these are shared with visual timing. Using 7T fMRI, we measured responses to white noise bursts of changing duration and rate. We analyzed these responses using neural response models describing different parametric relationships between event timing and neural response amplitude. This revealed auditory timing-tuned responses in the primary auditory cortex, and auditory association areas of the belt, parabelt and premotor cortex. While these areas also showed tonotopic tuning for auditory pitch, pitch and timing preferences were not consistently correlated. Auditory timing-tuned response functions differed between these areas, though without clear hierarchical integration of responses. The similarity of auditory and visual timing tuned responses, together with the lack of overlap between the areas showing these responses for each modality, suggests modality-specific responses to event timing are computed similarly but from different sensory inputs, and then transformed differently to suit the needs of each modality
Visual timing-tuned responses in human association cortices and response dynamics in early visual cortex
Quantifying the timing (duration and frequency) of brief visual events is vital to human perception, multisensory integration and action planning. Tuned neural responses to visual event timing have been found in association cortices, in areas implicated in these processes. Here we ask how these timing-tuned responses are related to the responses of early visual cortex, which monotonically increase with event duration and frequency. Using 7-Tesla functional magnetic resonance imaging and neural model-based analyses, we find a gradual transition from monotonically increasing to timing-tuned neural responses beginning in the medial temporal area (MT/V5). Therefore, across successive stages of visual processing, timing-tuned response components gradually become dominant over inherent sensory response modulation by event timing. This additional timing-tuned response component is independent of retinotopic location. We propose that this hierarchical emergence of timing-tuned responses from sensory processing areas quantifies sensory event timing while abstracting temporal representations from spatial properties of their inputs
Cortical quantity representations of visual numerosity and timing overlap increasingly into superior cortices but remain distinct
Many sensory brain areas are organized as topographic maps where neural response preferences change gradually across the cortical surface. Within association cortices, 7-Tesla fMRI and neural model-based analyses have also revealed many topographic maps for quantities like numerosity and event timing, often in similar locations. Numerical and temporal quantity estimations also show behavioral similarities and even interactions. For example, the duration of high-numerosity displays is perceived as longer than that of low-numerosity displays. Such interactions are often ascribed to a generalized magnitude system with shared neural responses across quantities. Anterior quantity responses are more closely linked to behavior. Here, we investigate whether common quantity representations hierarchically emerge by asking whether numerosity and timing maps become increasingly closely related in their overlap, response preferences, and topography. While the earliest quantity maps do not overlap, more superior maps overlap increasingly. In these overlapping areas, some intraparietal maps have consistently correlated numerosity and timing preferences, and some maps have consistent angles between the topographic progressions of numerosity and timing preferences. However, neither of these relationships increases hierarchically like the amount of overlap does. Therefore, responses to different quantities are initially derived separately, then progressively brought together, without generally becoming a common representation. Bringing together distinct responses to different quantities may underlie behavioral interactions and allow shared access to comparison and action planning systems
Auditory distance perception in front and rear space
The distance of sound sources relative to the body can be estimated using acoustic level and direct-to-reverberant ratio cues. However, the ability to do this may differ for sounds that are in front compared to behind the listener. One reason for this is that vision, which plays an important role in calibrating auditory distance cues early in life, is unavailable for rear space. Furthermore, the filtering of sounds by the pinnae differs if they originate from the front compared to the back. We investigated auditory distance discrimination in front and rear space by comparing performance for auditory spatial bisection of distance and minimum audible distance discrimination (MADD) tasks. In the bisection task, participants heard three successive bursts of noise at three different distances and indicated whether the second sound (probe) was closer in space to the first or third sound (references). In the MADD task, participants reported which of two successive sounds was closer. An analysis of variance with factors task and region of space showed worse performance for rear than for front space, but no significant interaction between task and region of space. For the bisection task, the point of subjective equality (PSE) was slightly biased towards the body, but the absolute magnitude of the PSE did not differ between front and rear space. These results are consistent with the hypothesis that visual information is important in calibrating the auditory representation of front space in distance early in life
Language Comprehension in the Balance: The Robustness of the Action-Compatibility Effect (ACE)
How does language comprehension interact with motor activity? We investigated the conditions under which comprehending an action sentence affects people's balance. We performed two experiments to assess whether sentences describing forward or backward movement modulate the lateral movements made by subjects who made sensibility judgments about the sentences. In one experiment subjects were standing on a balance board and in the other they were seated on a balance board that was mounted on a chair. This allowed us to investigate whether the action compatibility effect (ACE) is robust and persists in the face of salient incompatibilities between sentence content and subject movement. Growth-curve analysis of the movement trajectories produced by the subjects in response to the sentences suggests that the ACE is indeed robust. Sentence content influenced movement trajectory despite salient inconsistencies between implied and actual movement. These results are interpreted in the context of the current discussion of embodied, or grounded, language comprehension and meaning representation
Motion Perception: Auditory Motion Encoded in a Visual Motion Area
The motion of a translating sound source is easily perceived, yet clear evidence of motion mechanisms in auditory cortex has proved elusive. A new study may explain why, revealing that auditory motion is encoded in a motion-specialised region of visual cortex