
    Importance of Achromatic Contrast in Short-Range Fruit Foraging of Primates

    Trichromatic primates have a ‘red-green’ chromatic channel in addition to luminance and ‘blue-yellow’ channels. It has been argued that the red-green channel evolved in primates as an adaptation for detecting reddish or yellowish objects, such as ripe fruits, against a background of foliage. However, foraging advantages for trichromatic primates remain unverified by behavioral observation of primates in their natural habitats. New World monkeys (platyrrhines) are an excellent model for this evaluation because of the highly polymorphic nature of their color vision due to allelic variation of the L-M opsin gene on the X chromosome. In this study we carried out field observations of a group of wild, frugivorous black-handed spider monkeys (Ateles geoffroyi frontatus, Gray 1842, Platyrrhini), consisting of both dichromats (n = 12) and trichromats (n = 9), in Santa Rosa National Park, Costa Rica. We determined the color vision type of each individual in the group by genotyping its L-M opsin gene and measured each individual's foraging efficiency for fruits located within grasping distance. Contrary to the predicted advantage for trichromats, there was no significant difference in foraging efficiency between dichromats and trichromats, and among the red-green, blue-yellow, and luminance contrasts, luminance contrast was the main determinant of variation in foraging efficiency. Our results suggest that luminance contrast can serve as an important cue in short-range foraging attempts despite other sensory cues that could be available. Additionally, the advantage of red-green color vision in primates may not be as salient as previously thought and needs to be evaluated in further field observations.
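
    As an illustration of the kind of contrast comparison discussed above, the sketch below computes simple opponent-channel contrasts (luminance, red-green, blue-yellow) between a fruit and its foliage background from cone quantum catches. The Michelson-style formulas and the example catch values are assumptions chosen for illustration, not the chromatic-contrast model or data used in the study.

    def michelson(a, b):
        """Michelson contrast between two channel responses."""
        return (a - b) / (a + b)

    def channel_contrasts(target, background):
        """Return (luminance, red-green, blue-yellow) contrasts.

        target and background are dicts of cone quantum catches 'S', 'M', 'L'.
        Luminance is approximated as L + M, red-green as L/M, and
        blue-yellow as S/(L + M); all values here are hypothetical.
        """
        lum = michelson(target['L'] + target['M'],
                        background['L'] + background['M'])
        rg = michelson(target['L'] / target['M'],
                       background['L'] / background['M'])
        by = michelson(target['S'] / (target['L'] + target['M']),
                       background['S'] / (background['L'] + background['M']))
        return lum, rg, by

    # Hypothetical catches for a ripe fruit against foliage.
    fruit = {'S': 0.05, 'M': 0.30, 'L': 0.55}
    foliage = {'S': 0.04, 'M': 0.35, 'L': 0.40}

    lum, rg, by = channel_contrasts(fruit, foliage)
    print(f"luminance: {lum:.3f}  red-green: {rg:.3f}  blue-yellow: {by:.3f}")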

    Food color is in the eye of the beholder: the role of human trichromatic vision in food evaluation

    Non-human primates evaluate food quality based on the brightness of red and green shades of color, with red signaling higher energy or greater protein content in fruits and leaves. Despite the strong association between food and other sensory modalities, humans, too, estimate critical food features, such as calorie content, from vision. Previous research focused primarily on the effects of color on taste/flavor identification and intensity judgments; whether evaluations of perceived calorie content and arousal in humans are biased by color has received comparatively less attention. In this study we showed that the color content of food images predicts the arousal and perceived calorie content reported when viewing food, even when confounding variables were controlled for. Specifically, arousal co-varied positively with red-brightness, while green-brightness was negatively associated with arousal and perceived calorie content. This result holds for a large array of foods comprising natural foods, where color likely predicts calorie content, and transformed foods, where color is instead poorly diagnostic of energy content. Importantly, this pattern did not emerge with nonfood items. We conclude that in humans visual inspection of food is central to its evaluation and seems to partially engage the same basic system as in non-human primates.
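
    As a rough illustration of how red- and green-brightness predictors can be derived from food images, the sketch below extracts the mean red- and green-channel brightness of an image with Pillow and NumPy; the pipeline, function name, and file name are assumptions and not the study's actual image analysis.

    import numpy as np
    from PIL import Image

    def red_green_brightness(path):
        """Mean red- and green-channel brightness (0-255) of an RGB image."""
        img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
        return img[..., 0].mean(), img[..., 1].mean()

    # Hypothetical usage: derive one (red, green) pair per stimulus image and
    # regress the ratings collected in the experiment on these predictors.
    # red, green = red_green_brightness("food_image_001.jpg")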

    History of clinical transplantation

    The emergence of transplantation has seen the development of increasingly potent immunosuppressive agents, progressively better methods of tissue and organ preservation, refinements in histocompatibility matching, and numerous innovations in surgical techniques. Such efforts in combination ultimately made it possible to successfully engraft all of the organs and bone marrow cells in humans. At a more fundamental level, however, the transplantation enterprise hinged on two seminal turning points. The first was the recognition by Billingham, Brent, and Medawar in 1953 that chimerism-associated neonatal tolerance could be induced deliberately. This discovery escalated over the next 15 years to the first successful bone marrow transplantations in humans in 1968. The second turning point was the demonstration during the early 1960s that canine and human organ allografts could self-induce tolerance with the aid of immunosuppression. By the end of 1962, however, it had been incorrectly concluded that the two turning points involved different immune mechanisms, and the error was not corrected until well into the 1990s. This historical account summarizes the vast literature that sprang up during the intervening 30 years. Although it admirably documented empirical progress in clinical transplantation, its failure to explain organ allograft acceptance predestined organ recipients to lifetime immunosuppression and precluded fundamental changes in treatment policies. After it was discovered in 1992 that long-surviving organ transplant recipients had persistent microchimerism, it became possible to see the mechanistic commonality of organ and bone marrow transplantation. A clarifying central principle of immunology could then be synthesized with which to guide efforts to induce tolerance systematically to human tissues and perhaps ultimately to xenografts.

    Effects of oestradiol and tamoxifen on VEGF, soluble VEGFR-1, and VEGFR-2 in breast cancer and endothelial cells

    Angiogenesis is regulated by the balance between pro- and antiangiogenic factors. Vascular endothelial growth factor (VEGF), acting via the receptors VEGFR-1 and VEGFR-2, is a key mediator of tumour angiogenesis, and the soluble form of VEGF receptor-1 (sVEGFR-1) is an important negative regulator of VEGF-mediated angiogenesis. The majority of breast cancers are oestrogen dependent, but it is not fully understood how oestrogen and the antioestrogen tamoxifen affect the balance of angiogenic factors. Angiogenesis results from the interplay between cancer and endothelial cells, and sex steroids may exert effects on both cell types. In this study we show that oestradiol decreased secreted sVEGFR-1, increased secreted VEGF, and decreased the sVEGFR-1/VEGF ratio in MCF-7 human breast cancer cells; the addition of tamoxifen opposed these effects. Moreover, human umbilical vein endothelial cells (HUVEC) incubated with supernatants from oestradiol-treated MCF-7 cells exhibited higher VEGFR-2 levels than controls. In vivo, MCF-7 tumours from oestradiol+tamoxifen-treated nude mice exhibited decreased tumour vasculature. Our results suggest that tamoxifen and oestradiol exert dual effects on the angiogenic environment in breast cancer, both by regulating cancer cell-secreted angiogenic factors such as VEGF and sVEGFR-1 and by affecting VEGFR-2 expression in endothelial cells.

    The potential benefits of low-molecular-weight heparins in cancer patients

    Cancer patients are at increased risk of venous thromboembolism due to a range of factors directly related to their disease and its treatment. Given the high incidence of post-surgical venous thromboembolism in cancer patients and the poor outcomes associated with its development, thromboprophylaxis is warranted. A number of evidence-based guidelines delineate anticoagulation regimens for venous thromboembolism treatment, primary and secondary prophylaxis, and long-term anticoagulation in cancer patients; however, many give equal weight to several different drugs and do not make specific recommendations regarding duration of therapy. In terms of their efficacy and safety profiles, practicality of use, and cost-effectiveness, the low-molecular-weight heparins are at least comparable to, and offer several advantages over, other available antithrombotics in cancer patients. In addition, data are emerging that antithrombotics, and particularly low-molecular-weight heparins, may exert an antitumor effect that could contribute to improved survival in cancer patients when given for long-term prophylaxis. Such findings reinforce the importance of thromboprophylaxis with low-molecular-weight heparin in cancer patients.

    A History of Clinical Transplantation

    Effort-related functions of nucleus accumbens dopamine and associated forebrain circuits

    Background: Over the last several years, it has become apparent that there are critical problems with the hypothesis that brain dopamine (DA) systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Hypotheses related to DA function are undergoing a substantial restructuring, such that the classic emphasis on hedonia and primary reward is giving way to diverse lines of research that focus on aspects of instrumental learning, reward prediction, incentive motivation, and behavioral activation. Objective: The present review discusses dopaminergic involvement in behavioral activation and, in particular, emphasizes the effort-related functions of nucleus accumbens DA and associated forebrain circuitry. Results: The effects of accumbens DA depletions on food-seeking behavior are critically dependent upon the work requirements of the task. Lever-pressing schedules with minimal work requirements are largely unaffected by accumbens DA depletions, whereas reinforcement schedules with high work (e.g., ratio) requirements are substantially impaired. Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related decision making: rats with accumbens DA depletions reallocate their instrumental behavior away from food-reinforced tasks with high response requirements and instead select a less effortful type of food-seeking behavior. Conclusions: Along with the prefrontal cortex and the amygdala, the nucleus accumbens is a component of the brain circuitry regulating effort-related functions. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue, or anergia in depression.

    Route of tracer administration does not affect real endogenous nitrogen recovery measured with the N-15-isotope dilution technique in pigs fed rapidly digestible diets

    The N-15-isotope dilution technique (N-15-IDT), with either pulse-dose oral administration or continuous i.v. administration (carotid artery) of [N-15]-L-leucine, both at 5 mg/(kg body weight · d), was used to measure ileal (post-valve T-cecum cannula) endogenous nitrogen recovery (ENR) in pigs (9 ± 0.6 kg). Diets were based on cornstarch and enzyme-hydrolyzed casein, with no (control) or a high (4%) content of quebracho extract (Schinopsis spp.) rich in condensed tannins. Blood was sampled from a catheter in the external jugular vein. Mean plasma N-15 enrichment at d 8-10 was higher (P = 0.0009) after i.v. than after oral administration [0.0356 vs. 0.0379 atom% excess (APE)]. Plasma N-15 enrichment for i.v.-infused pigs was 0.01117 APE higher (P < 0.0001), and for orally dosed pigs 0.0081 APE lower (P < 0.0001), at 11 h postprandial compared with 1 h postprandial. Apparent ileal N digestibility was higher (P < 0.0001) for the control diet (85.5%) than for the quebracho diet (69.5%). ENR was calculated from the ratio of the N-15 enrichment of plasma and digesta. The ENR for the quebracho diet was approximately 300% higher than for the control diet (6.03 vs. 1.94 g/kg dry matter intake, P < 0.001). Real N digestibility (92.2 ± 0.4%) did not differ between the diets (P = 0.1030) or between the tracer methods (P = 0.9730). We conclude that oral administration of [N-15]leucine provides reasonable estimates of ENR in pigs fed semipurified diets with a high or low content of tannins; however, care must be taken in extrapolating this conclusion to studies with other protein sources or feeding frequencies.
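
    A minimal worked sketch of the isotope dilution calculation outlined above is given below: endogenous N is estimated by scaling the digesta N flow by the digesta-to-plasma N-15 enrichment ratio, from which apparent and real ileal N digestibility follow. The formulas follow the standard N-15-IDT form, and the numbers are illustrative placeholders, not the study's raw data.

    def endogenous_n(n_digesta, ape_digesta, ape_plasma):
        """Endogenous N flow, assuming plasma N-15 enrichment marks the body N pool."""
        return n_digesta * (ape_digesta / ape_plasma)

    def apparent_digestibility(n_intake, n_digesta):
        """Apparent ileal N digestibility: fraction of intake N not recovered at the ileum."""
        return (n_intake - n_digesta) / n_intake

    def real_digestibility(n_intake, n_digesta, n_endogenous):
        """Real N digestibility: only the dietary fraction of digesta N counts as undigested."""
        dietary_n_in_digesta = n_digesta - n_endogenous
        return (n_intake - dietary_n_in_digesta) / n_intake

    # Illustrative flows in g N per kg dry matter intake (not the study's data).
    n_intake, n_digesta = 30.0, 4.5
    ape_plasma, ape_digesta = 0.036, 0.016   # atom% excess

    enr = endogenous_n(n_digesta, ape_digesta, ape_plasma)
    print(f"ENR: {enr:.2f} g/kg DMI")
    print(f"apparent ileal N digestibility: {apparent_digestibility(n_intake, n_digesta):.1%}")
    print(f"real N digestibility: {real_digestibility(n_intake, n_digesta, enr):.1%}")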