24 research outputs found

    Dental microwear as a behavioral proxy for distinguishing between canids at the Upper Paleolithic (Gravettian) site of Předmostí, Czech Republic

    Get PDF
    Morphological and genetic evidence place dog domestication in the Paleolithic, sometime between 40,000 and 15,000 years ago, though identification of the earliest dogs remains debated. We predict that these earliest dogs (referred to herein as protodogs), while potentially difficult to distinguish morphologically from wolves, experienced behavioral shifts, including changes in diet. Specifically, protodogs may have consumed more bone and other less desirable scraps within human settlement areas. Here we apply Dental Microwear Texture Analysis (DMTA) to canids from the Gravettian site of Předmostí (approx. 28,500 BP), which were previously assigned to the Paleolithic dog or Pleistocene wolf morphotypes. We test whether these groups separate significantly by diet-related variation in microwear patterning. Results are consistent with differences in dietary breadth, with the Paleolithic dog morphotype showing evidence of greater durophagy than those assigned to the wolf morphotype. This supports the presence of two morphologically and behaviorally distinct canid types at this middle Upper Paleolithic site. Our primary goal here was to test whether these two morphotypes expressed notable differences in dietary behavior; however, in the context of a major Gravettian settlement, this may also support evidence of early-stage dog domestication. Dental microwear is a behavioral signal that may appear generations before morphological changes are established in a population. It shows promise for distinguishing protodogs from wolves in the Pleistocene, and domesticated dogs from wolves elsewhere in the archaeological record.

    Pre-dive Whole-Body Vibration Better Reduces Decompression-Induced Vascular Gas Emboli than Oxygenation or a Combination of Both

    Get PDF
    Purpose: Since non-provocative dive profiles are no guarantee of protection against decompression sickness, novel means, including pre-dive “preconditioning” interventions, have been proposed for its prevention. This study investigated and compared the effects of pre-dive oxygenation, pre-dive whole-body vibration, and a combination of both on post-dive bubble formation. Methods: Six healthy volunteers performed 6 no-decompression dives each, to a depth of 33 mfw for 20 min (3 control dives without preconditioning and 1 dive with each preconditioning protocol), with a minimum interval of 1 week between dives. Post-dive bubbles were counted in the precordium by two-dimensional echocardiography, 30 and 90 min after the dive, with and without knee flexing. Each diver served as his own control. Results: Vascular gas emboli (VGE) were systematically observed before and after knee flexing at each post-dive measurement. Compared to the control dives, we observed a decrease in VGE count of 23.8 ± 7.4% after oxygen breathing (p < 0.05), 84.1 ± 5.6% after vibration (p < 0.001), and 55.1 ± 9.6% after vibration combined with oxygen (p < 0.001). The differences between all preconditioning methods were statistically significant. Conclusions: The precise mechanism that induces the decrease in post-dive VGE, and thus makes the diver more resistant to decompression stress, is still not known. However, it seems that a pre-dive mechanical reduction of existing gas nuclei might best explain the beneficial effects of this strategy. The apparent non-synergistic effect of oxygen and vibration is probably explained by the different mechanisms involved.

    Comparison of insulation provided by dry or wetsuits among recreational divers during cold water immersion (< 5°C)

    No full text
    Background: Divers’ thermal status influences susceptibility to decompression sickness, hence the need for proper insulation during immersion in cold water. However, there is a lack of data on the thermal protection provided by diving suits, hence this study. Materials and methods: Two groups of divers wearing either a wetsuit (n = 15) or a dry suit (n = 15) volunteered for this study. Anthropometric data and dive experience were recorded; skin temperatures at the cervical-supraclavicular (C-SC) area and hands were assessed through high-resolution thermal infrared imaging taken pre- and post-dive. Results: As far as anthropometrics, pre-dive C-SC temperatures (37.0 ± 0.4°C), depth (dry: 43 ± 4.6 mfw vs. wet: 40.3 ± 4.0 mfw) and water temperature exposure (4.3°C) are concerned, both groups were comparable. Total dive time was slightly longer for dry suit divers (39.6 ± 4.0 min vs. 36.5 ± 4.1 min, p = 0.049). Post-dive, C-SC temperature had increased in dry suit divers by 0.6 ± 0.6°C and significantly decreased in wetsuit divers by 0.8 ± 0.6°C. The difference between groups was highly significant (dry: 37.5 ± 0.7°C vs. wet: 36.2 ± 0.7°C, p = 0.004). Hand temperature decreased significantly in both groups (dry: 30.3 ± 1.2°C vs. wet: 29.8 ± 0.8°C); the difference between groups was not significant (p = 0.33). Conclusions: Medium-duration immersion in cold water (< 5°C) of healthy and fully protected subjects was well tolerated. It was demonstrated that proper insulation based on a three-layer strategy allows thermal balance to be maintained or even slightly improved. However, from an operational point of view, the skin of the extremities is not preserved.

    Influences of Atmospheric Pressure and Temperature on the Intraocular Pressure

    No full text
    PURPOSE: To determine whether the atmospheric pressure change experienced during diving can induce changes in the intraocular pressure (IOP) of eyes in a normal population. METHODS: The IOP of 27 healthy volunteers (aged 23.8 ± 4.9 years; range, 18-44) was measured with a Perkins applanation tonometer by two independent investigators, who were blinded to the previous measurements. Measurements were taken at baseline (normal atmospheric pressure of 1 Bar and 24°C), at temperatures of both 28°C and 24°C after increasing the atmospheric pressure to 2 Bar in a hyperbaric chamber, at baseline again, and finally at the normal atmospheric pressure of 1 Bar but a temperature of 28°C. A multivariate regression analysis was used to evaluate the results. RESULTS: The mean IOP significantly decreased from 11.8 mmHg in the right eye (RE) and 11.7 mmHg in the left eye (LE) at 1 Bar to 10.7 mmHg (RE) and 10.3 mmHg (LE) at 2 Bar (P = 0.024, RE and P = 0.0006, LE). The IOP decrease remained constant during the atmospheric pressure rise period (40 minutes) and was independent of the temperature change. The temperature increase alone did not significantly influence the IOP. CONCLUSIONS: An increase of the atmospheric pressure to 2 Bar (equal to conditions experienced during underwater diving at 10 meters) modestly, but significantly, decreased the IOP independently of the temperature change. During the period of increased atmospheric pressure (60 minutes), the IOP decrease remained stable and was independent of blood pressure change or corneal thickness.

    Influences of atmospheric pressure and temperature on intraocular pressure. Invest Ophthalmol Vis Sci 49(12)

    No full text
    PURPOSE. To determine whether the atmospheric pressure (ATM) change experienced during diving can induce changes in the intraocular pressure (IOP) of eyes in a normal population. METHODS. The IOP of 27 healthy volunteers (ages, 23.8 ± 4.9 years; range, 18-44) was measured with a Perkins applanation tonometer by two independent investigators who were masked to the previous measurements. Measurements were taken at baseline (normal ATM, 1 Bar and 24°C), at 28°C and 24°C after the ATM was increased to 2 Bar in a hyperbaric chamber, at baseline again, and finally at the normal ATM of 1 Bar but a temperature of 28°C. Multivariate regression analysis was used to evaluate the results. RESULTS. The mean IOP decreased significantly from 11.8 mm Hg in the right eye (RE) and 11.7 mm Hg in the left eye (LE) at 1 Bar to 10.7 mm Hg (RE) and 10.3 mm Hg (LE) at 2 Bar (P = 0.024, RE; P = 0.0006, LE). The IOP decrease remained constant during the ATM increase period (40 minutes) and was independent of the temperature change. The temperature increase alone did not significantly influence the IOP. CONCLUSIONS. An increase of the ATM to 2 Bar (equal to conditions experienced during underwater diving at 10 meters) modestly but significantly decreased the IOP independently of the temperature change. During the period of increased ATM (60 minutes), the IOP decrease remained stable and was independent of blood pressure change or corneal thickness. (Invest Ophthalmol Vis Sci. 2008;49:5392-5396) DOI:10.1167/iovs.07-1578
Glaucoma is the second leading cause of irreversible blindness worldwide and affects approximately 70 million people, of whom 7 million are blind. 1,2 Elevated intraocular pressure (IOP) is widely regarded as the most important modifiable risk factor associated with the development and progression of glaucomatous optic neuropathy. Little is known about the effects of external factors such as atmospheric pressure (ATM) and surrounding temperature (T) on the IOP.
Rather than looking at changes in IOP, most published reports on the effects of high altitude (low ATM) focus on high-altitude retinal hemorrhage 3-5 or on systemic side effects, such as increased blood pressure, 6,7 and on cardiac side effects, 8 which lead to mountain sickness. However, one study does mention that a higher baseline IOP is a significant risk factor for altitude retinopathy. 9 IOP at high altitude has been the subject of controversy for many years. Ninety years ago, Wilmer and Berens 10 measured the IOP of 14 aviators in a hypobaric chamber, but no significant changes were found. More recently, the effect of decreased ATM on IOP has been studied by several groups; however, their results are conflicting: some groups have observed a decrease in IOP, 11 while others have found an increase. 21 This controversy may be partially attributable to the different methodologies used in these studies. Some investigators examined subjects only several days before departure to and after descent from high altitude; in these cases, the IOP always returned to baseline levels but could have been even lower with prolonged exposure to altitude. 18 Reports on the effect of increased ATM on IOP, such as that of Ersanli et al., are rare. The aim of this prospective study was to investigate the effect of ATM and temperature increase on the IOP of healthy eyes. This study was inspired primarily by frequent inquiries from glaucoma patients about the potential deleterious effects of underwater diving, airplane travel, and mountaineering on IOP.
MATERIALS AND METHODS. Our sample population, consisting of 27 healthy, nonsmoking subjects (54 eyes; subject ages, 23.8 ± 4.9 [range, 18-44] years; male/female ratio, 1.25), was randomly divided into two groups. None of the subjects had any ophthalmologic disorder apart from a refractive correction of less than ±4 D with glasses or contact lenses.
Physical preexamination consisted of a standard intake examination for hyperbaric treatments, including plain chest x-ray, tonal audiometry, and resting electrocardiography. Ophthalmologic examination consisted of biomicroscopy, nondilated funduscopy, and central corneal thickness (CCT) measurement. All subjects were treated in accordance with the Declaration of Helsinki. They were asked to fill out a general health

    Increased Risk of Decompression Sickness When Diving With a Right-to-Left Shunt: Results of a Prospective Single-Blinded Observational Study (The “Carotid Doppler” Study)

    No full text
    Introduction: Divers with a patent foramen ovale (PFO) have an increased risk of decompression sickness (DCS) when diving with compressed breathing gas. The relative risk increase, however, is difficult to establish, as the PFO status of divers is usually only determined after a DCS occurrence. Methods: This prospective, single-blinded, observational study was designed to collect DCS data from volunteer divers after screening for right-to-left shunt (RLS) using a Carotid Doppler test. Divers were blinded to the result of the test, but all received a standardized briefing on current scientific knowledge of diving physiology and “low-bubble” diving techniques; they were then allowed to dive without restrictions. After a mean interval of 8 years, a questionnaire was sent collecting data on their dives and any cases of DCS. Results: Data were collected on 148 divers totaling 66,859 dives. There was no significant difference in diving data between divers with or without RLS. Divers with RLS had a 3.02 times higher incidence of confirmed DCS than divers without RLS (p = 0.04). When all cases (confirmed or possible DCS) were considered, the relative risk was 1.42 (p = 0.46). DCS occurred mainly in divers who did not dive according to “low-bubble” diving techniques, in both groups. Conclusion: This prospective study confirms that DCS is more frequent in divers with RLS (such as a PFO), with a relative risk of 1.42 (all DCS) to 3.02 (confirmed DCS). It appears this risk is linked to diving behavior, more specifically diving to the limits of the adopted decompression procedures.
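    The relative risk figures above come from comparing DCS incidence between the RLS and non-RLS groups. As a minimal sketch of that arithmetic, the snippet below computes a relative risk from hypothetical counts chosen only to reproduce the reported RR ≈ 3.02; they are not the study's actual case numbers.

    ```python
    def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
        """Relative risk of an outcome (e.g., DCS) in an exposed group
        (e.g., divers with RLS) versus an unexposed group (no RLS)."""
        risk_exposed = events_exposed / n_exposed
        risk_unexposed = events_unexposed / n_unexposed
        return risk_exposed / risk_unexposed

    # Hypothetical counts (NOT the study's data): 9 confirmed DCS cases
    # among 74 RLS divers vs. 3 among 74 divers without RLS.
    rr = relative_risk(9, 74, 3, 74)
    print(round(rr, 2))  # 3.0
    ```

    An RR of 3.0 means the outcome was three times as frequent per diver in the exposed group; it says nothing about absolute incidence, which in this study remained low in both groups.
    
    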

    First impressions: Use of the Azoth Systems O'Dive subclavian bubble monitor on a liveaboard dive vessel.

    No full text
    The Azoth Systems O'Dive bubble monitor is marketed to recreational and professional divers as a tool to improve personal diving decompression safety. We report the use of this tool during a 12-day dive trip aboard a liveaboard vessel.

    A Neuro-fuzzy Approach of Bubble Recognition in Cardiac Video Processing

    No full text
    Two-dimensional echocardiography, the gold standard in clinical practice, is becoming the preferred method of analysis in diving research owing to its portability for diagnosis. A major weakness of this approach, however, is the lack of an integrated analysis platform for bubble recognition. In this study, we developed a fully automatic method to recognize bubbles in videos. Gabor wavelet-based neural networks are commonly used in face recognition and biometrics; we adopted a similar approach, training our system on real bubble morphologies. Our method does not require a segmentation step, which is an almost indispensable stage in several other studies. Our correct detection rate varies between 82.7% and 94.3%. After detection, we classified our findings in the ventricles and atria using the fuzzy k-means algorithm. Bubbles were clustered in three different subjects with accuracy rates of 84.3-93.7%. We suggest that this routine would be useful in longitudinal analysis and in subjects with congenital risk factors.
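    The clustering step named above (fuzzy k-means, also known as fuzzy c-means) assigns each detected bubble a soft membership in every cluster rather than a hard label. The sketch below is a generic implementation of the standard fuzzy c-means update equations on synthetic 2-D feature points; it is not the authors' code, and the toy data stand in for whatever bubble features the study actually used.

    ```python
    import numpy as np

    def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
        """Minimal fuzzy c-means ('fuzzy k-means') clustering.
        X: (n, d) data; c: number of clusters; m: fuzziness exponent (> 1).
        Returns (centers, U) where U is (n, c) and each row sums to 1."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        # Random initial membership matrix, rows normalized to sum to 1.
        U = rng.random((n, c))
        U /= U.sum(axis=1, keepdims=True)
        for _ in range(n_iter):
            W = U ** m
            # Cluster centers: membership-weighted means of the data.
            centers = (W.T @ X) / W.sum(axis=0)[:, None]
            # Squared distances from every point to every center.
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            d2 = np.maximum(d2, 1e-12)  # guard against division by zero
            # Membership update: u_ij proportional to d_ij^(-2/(m-1)).
            inv = d2 ** (-1.0 / (m - 1.0))
            U = inv / inv.sum(axis=1, keepdims=True)
        return centers, U

    # Toy "bubble feature" data: two well-separated blobs.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 0.3, (30, 2)),
                   rng.normal(3.0, 0.3, (30, 2))])
    centers, U = fuzzy_c_means(X, c=2)
    labels = U.argmax(axis=1)  # hard assignment for inspection
    ```

    In practice each bubble would be described by morphological features extracted after detection, and the soft memberships, not just the argmax labels, can feed downstream accuracy estimates.
    
    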

    Increasing Oxygen Partial Pressures Induce a Distinct Transcriptional Response in Human PBMC: A Pilot Study on the “Normobaric Oxygen Paradox”

    No full text
    The term “normobaric oxygen paradox” (NOP) describes the response to the return to normoxia after a hyperoxic event, which is sensed by tissues as an oxygen shortage and results in up-regulation of Hypoxia-inducible factor 1α (HIF-1α) transcription factor activity. The molecular characteristics of this response have not yet been fully characterized. Herein, we report the activation time trend of oxygen-sensitive transcription factors in human peripheral blood mononuclear cells (PBMCs) obtained from healthy subjects after one hour of exposure to mild (MH), high (HH), and very high (VHH) hyperoxia, corresponding to 30%, 100%, and 140% O2, respectively. Our observations confirm that MH is perceived as a hypoxic stress, characterized by the activation of HIF-1α and Nuclear factor (erythroid-derived 2)-like 2 (NRF2), but not Nuclear Factor kappa-light-chain-enhancer of activated B cells (NF-κB). Conversely, HH is associated with a progressive loss of the NOP response and with an increase in oxidative stress leading to NRF2 and NF-κB activation, accompanied by the synthesis of glutathione (GSH). After VHH, HIF-1α activation is totally absent and an oxidative stress response, accompanied by NF-κB activation, is prevalent. Intracellular GSH and Matrix metallopeptidase 9 (MMP-9) plasma levels parallel the transcription factor activation pattern and remain elevated throughout the observation time. In conclusion, our study confirms that, in vivo, the return to normoxia after MH is sensed as a hypoxic trigger characterized by HIF-1α activation. On the contrary, HH and VHH induce a shift toward an oxidative stress response, characterized by NRF2 and NF-κB activation in the first 24 h post exposure.