
    Regulated deficit irrigation in table olive trees during a sensitive period

    The olive tree is one of the most important irrigated fruit crops in Spain (around 400,000 ha). The water needs of olive orchards are greater than the available water, so deficit conditions are common in the field. The aim of this work is to study a regulated deficit irrigation (RDI) schedule based on midday stem water potential (Ψ) that limits irrigation before harvest. The experiment was performed at the La Hampa experimental farm (Coria del Río, Seville, Spain) on 45-year-old olive trees (cv. Manzanillo). Three irrigation treatments were applied in a randomized complete block design during 2014. This research was supported by the Spanish Ministerio de Economía y Competitividad (MINECO) (AGL2013-45922-C2-1-R). Peer reviewed.
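    As a rough illustration of how an RDI rule keyed to midday stem water potential could be encoded, the sketch below applies a reduced fraction of crop evapotranspiration before harvest unless the potential drops past a trigger value. The threshold, deficit fraction, and function name are hypothetical assumptions, not values taken from the experiment described above.

```python
# Hypothetical sketch of a stem-water-potential-based RDI rule; the
# threshold and deficit fraction are placeholders, not the experiment's values.

SWP_THRESHOLD_MPA = -1.4   # assumed trigger for restoring full irrigation
DEFICIT_FRACTION = 0.5     # assumed fraction of ETc applied under deficit

def daily_irrigation_dose(midday_swp_mpa: float, etc_mm: float,
                          before_harvest: bool) -> float:
    """Return the irrigation dose (mm) for one day.

    Before harvest, irrigation is limited while the midday stem water
    potential stays above (less negative than) the threshold; once the
    threshold is crossed, full crop evapotranspiration is replaced.
    """
    if before_harvest and midday_swp_mpa > SWP_THRESHOLD_MPA:
        return DEFICIT_FRACTION * etc_mm   # tree not overly stressed: apply deficit
    return etc_mm                          # restore full irrigation

# Example: a tree at -1.6 MPa before harvest receives the full dose.
print(daily_irrigation_dose(-1.6, 4.2, before_harvest=True))
```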

    Limitations and usefulness of maximum daily shrinkage (MDS) and trunk growth rate (TGR) indicators in the irrigation scheduling of table olive trees

    8 pages, 7 figures, 2 tables, 32 references. Maximum daily trunk shrinkage (MDS) is the most popular indicator derived from trunk diameter fluctuations in most fruit trees and has been reported to be one of the earliest signs in the detection of water stress. However, in some species such as olive (Olea europaea L.), MDS does not usually change under water stress conditions, and trunk growth rate (TGR) has been suggested as a better indicator. Much of this lack of sensitivity to drought conditions has been related to the relationship between MDS and water potential: this curvilinear relationship produces an uncertain zone where large variations in water potential do not imply any change in MDS. The MDS signal, the ratio between measured MDS and the MDS estimated under full irrigation, has been considered a better indicator than MDS because it reduces the effect of the environment. On the other hand, although published results suggest an effect of the environment on TGR values, there is no clear relationship between this indicator and meteorological data. The aims of this work are, on the one hand, to study the improvements of the baseline approach in the MDS signal and, on the other, to study the influence of several meteorological variables on TGR. Three years of data from an irrigation experiment were used to carry out the MDS analysis, and six years of data from fully irrigated trees during the pit hardening period were used for the TGR study. The comparisons of MDS vs. water potential and MDS signal vs. water potential both showed considerable scatter. Values of the MDS signal between 1.1 and 1.4 were always identified with moderate water stress conditions (-1.4 to -2 MPa of water potential). However, since these MDS signal values are around the maximum of the curvilinear relationship with water potential, greater values of the MDS signal within this 1.1-1.4 range did not necessarily correspond to lower values of water potential. In addition, during low fruit load seasons the MDS signal was not an accurate indicator. On the other hand, absolute values of several climatological measurements were not significantly related to TGR; only daily increments explained part of the variations in TGR of fully irrigated trees. In all the data analysed, the daily increment of average vapour pressure deficit (VPD) was the indicator most closely related to TGR: increases in this indicator decreased TGR values. In addition, the agreement between this indicator and TGR was affected by fruit load; high-yield seasons decreased the influence of the VPD increment on TGR. This research was supported by the Spanish Ministerio de Ciencia e Innovación (MICINN) (AGL2010-19201-CO4-03). Thanks are due to J. Rodriguez and A. Montero for help with field measurements. Peer reviewed.
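    The two indicators discussed above lend themselves to a short numerical sketch: the MDS signal (measured MDS divided by the MDS estimated under full irrigation) and a simple fit of TGR against the day-to-day increment of mean VPD. All values and the choice of a linear fit below are illustrative assumptions, not the study's data or analysis.

```python
# Sketch of the two indicators discussed above. All numbers are invented
# for illustration; the study's data and fitting procedure may differ.
import numpy as np

def mds_signal(mds_measured: np.ndarray, mds_full_irrigation: np.ndarray) -> np.ndarray:
    """MDS signal = measured MDS / MDS expected under full irrigation."""
    return mds_measured / mds_full_irrigation

def tgr_vs_vpd_increment(tgr: np.ndarray, vpd_mean: np.ndarray):
    """Fit TGR (mm/day) against the day-to-day increment of mean VPD (kPa)."""
    d_vpd = np.diff(vpd_mean)                 # daily VPD increment
    slope, intercept = np.polyfit(d_vpd, tgr[1:], 1)
    return slope, intercept

# Hypothetical week of data
mds = np.array([0.22, 0.25, 0.28, 0.30, 0.33, 0.31, 0.29])      # mm
mds_ref = np.array([0.20, 0.21, 0.22, 0.22, 0.23, 0.23, 0.22])  # mm
tgr = np.array([0.15, 0.10, 0.05, -0.02, -0.08, 0.00, 0.06])    # mm/day
vpd = np.array([1.8, 2.1, 2.6, 3.0, 3.4, 2.9, 2.4])             # kPa

print(mds_signal(mds, mds_ref))
print(tgr_vs_vpd_increment(tgr, vpd))
```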

    Marginal bone loss around implants placed in maxillary native bone or grafted sinuses: a retrospective cohort study

    Objectives: To assess differences in marginal bone loss around implants placed in maxillary pristine bone and implants placed following maxillary sinus augmentation over a period of 3 years after functional loading. Material and methods: Two cohorts of subjects (Group 1: subjects who received sinus augmentation with simultaneous implant placement; Group 2: subjects who underwent conventional implant placement in posterior maxillary pristine bone) were included in this retrospective study. Radiographic marginal bone loss was measured around one implant per patient on digitized panoramic radiographs obtained at the time of prosthesis delivery (baseline) and 12, 24, and 36 months later. The influence of age, gender, smoking habits, history of periodontal disease, and type of prosthetic connection (internal or external) on marginal bone loss was analyzed as a function of the type of osseous support (previously grafted or pristine). Results: A total of 105 subjects were included in this study. Cumulative radiographic marginal bone loss ranged from 0 mm to 3.9 mm after 36 months of functional loading. There were statistically significant differences in marginal bone loss between implants placed in grafted and pristine bone at the 12-month assessment, but not in the subsequent progression rate. External prosthetic connection, smoking, and history of periodontitis negatively influenced peri-implant bone maintenance, regardless of the type of osseous substrate. Conclusions: Implants placed in sites that received maxillary sinus augmentation exhibited more marginal bone loss than implants placed in pristine bone, although marginal bone loss mainly occurred during the first 12 months after functional loading. Implants with an external implant connection were strongly associated with increased marginal bone loss over time. Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/102685/1/clr12122.pd
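    The abstract does not state which statistical test was used; purely as an illustration of the comparison it describes (bone loss at 12 months and the 12-to-36-month progression rate, grafted vs. pristine bone), the sketch below uses simulated placeholder values and a nonparametric test, neither of which comes from the study itself.

```python
# Illustrative comparison of marginal bone loss between grafted and
# pristine bone. The test choice and all numbers are placeholders,
# not the study's data or its actual analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-patient bone loss (mm) at 12 and 36 months
grafted_12 = rng.normal(1.1, 0.4, 50)
pristine_12 = rng.normal(0.8, 0.4, 55)
grafted_36 = grafted_12 + rng.normal(0.3, 0.2, 50)
pristine_36 = pristine_12 + rng.normal(0.3, 0.2, 55)

# Difference between groups at 12 months
print(stats.mannwhitneyu(grafted_12, pristine_12))

# Progression rate (mm/year) between 12 and 36 months
rate_grafted = (grafted_36 - grafted_12) / 2.0
rate_pristine = (pristine_36 - pristine_12) / 2.0
print(stats.mannwhitneyu(rate_grafted, rate_pristine))
```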

    Alveolar ridge preservation reduces the need for ancillary bone augmentation in the context of implant therapy.

    BACKGROUND There is limited information on the need for bone augmentation in the context of delayed implant placement, whether or not alveolar ridge preservation (ARP) is previously performed. The primary aim of this retrospective cohort study was to evaluate the efficacy of ARP therapy after tooth extraction, compared with unassisted socket healing (USH), in reducing the need for ancillary bone augmentation before or at the time of implant placement. METHODS Adult subjects who underwent non-molar single tooth extraction with or without simultaneous ARP therapy were included in this study. Cone beam computed tomography scans obtained before tooth extraction and after a variable healing period were used to record the baseline facial bone thickness and to virtually plan implant placement according to a standard method. A logistic regression model was used to evaluate the effect of facial alveolar bone thickness upon tooth extraction and baseline therapy (USH or ARP) on the need for additional bone augmentation, adjusting for several covariates (i.e., age, sex, baseline KMW, and tooth type). RESULTS One hundred and forty subjects, equally distributed between the two baseline therapy groups, constituted the study population. Implant placement was deemed virtually feasible in all study sites. Simultaneous bone augmentation was considered necessary in 60% and 11.4% of the sites in the USH and ARP groups, respectively. Most of these sites (64.2% in the USH group and 87.5% in the ARP group) exhibited a thin facial bone phenotype (<1 mm) at baseline. Logistic regression revealed that the odds of not needing ancillary bone augmentation were 17.8 times higher in sites that received ARP therapy. Furthermore, the need for additional bone augmentation was reduced 7.7-fold for every 1 mm increase in facial bone thickness, regardless of baseline therapy. CONCLUSIONS Based on a digital analysis, ARP therapy, compared with USH, and a thick facial alveolar bone largely reduce the need for ancillary bone augmentation at the time of implant placement in non-molar sites.
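    A minimal sketch of the kind of logistic model described above, fitted on simulated placeholder data rather than the study's dataset; the variable names, the data-generating rule, and the reduced set of covariates are assumptions for illustration only, and the printed odds ratios are analogous in form (not in value or coding direction) to the 17.8-fold and 7.7-fold figures reported above.

```python
# Hypothetical logistic regression: need for ancillary bone augmentation
# as a function of baseline therapy (USH vs. ARP) and facial bone thickness.
# Data are simulated placeholders, not the study's records.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 140
therapy = rng.choice(["USH", "ARP"], size=n)
facial_bone = rng.uniform(0.3, 2.5, size=n)                 # mm
# Assumed data-generating rule: ARP and thicker bone lower the odds
logit_p = 1.5 - 2.0 * (therapy == "ARP") - 1.5 * facial_bone
p = 1.0 / (1.0 + np.exp(-logit_p))
needs_aug = rng.binomial(1, p)

df = pd.DataFrame({"needs_augmentation": needs_aug,
                   "therapy": therapy,
                   "facial_bone_mm": facial_bone})

model = smf.logit("needs_augmentation ~ C(therapy) + facial_bone_mm", data=df).fit(disp=False)
print(np.exp(model.params))   # odds ratios for the simulated data
```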

    Modeling effects of voltage dependent properties of the cardiac muscarinic receptor on human sinus node function

    The cardiac muscarinic receptor (M2R) regulates heart rate, in part, by modulating the acetylcholine (ACh)-activated K⁺ current I_K,ACh through dissociation of G-proteins that, in turn, activate K_ACh channels. Recently, M2Rs were noted to exhibit intrinsic voltage sensitivity, i.e., their affinity for ligands varies in a voltage-dependent manner. The voltage sensitivity of M2R implies that the affinity for ACh (and thus the ACh effect) varies throughout the time course of a cardiac electrical cycle. The aim of this study was to investigate the contribution of M2R voltage sensitivity to the rate and shape of human sinus node action potentials in physiological and pathophysiological conditions. We developed a Markovian model of the I_K,ACh modulation by voltage and integrated it into a computational model of the human sinus node. We performed simulations with the integrated model, varying ACh concentration and voltage sensitivity. Low ACh exerted a larger effect on I_K,ACh at hyperpolarized versus depolarized membrane voltages. This led to a slowing of the pacemaker rate due to an attenuated slope of phase 4 depolarization, with only a marginal effect on action potential duration and amplitude. We also simulated the theoretical effects of genetic variants that alter the voltage sensitivity of M2R. Modest negative shifts in voltage sensitivity, predicted to increase the affinity of the receptor for ACh, slowed the rate of phase 4 depolarization and slowed heart rate, while modest positive shifts increased heart rate. These simulations support our hypothesis that altered M2R voltage sensitivity contributes to disease and provide a novel mechanistic foundation for studying clinical disorders such as atrial fibrillation and inappropriate sinus tachycardia.
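    The abstract does not give the model equations; the following is a generic sketch of how a small voltage- and ACh-dependent Markov gating scheme can be integrated to produce a current. The two-state scheme, rate constants, and voltage dependence are hypothetical placeholders, not the model developed in the study.

```python
# Generic sketch of a voltage-dependent Markov gating model of an
# ACh-activated K+ current. States, rates, and parameters are invented.
import numpy as np

G_KACH = 0.3      # nS, hypothetical maximal conductance
E_K = -85.0       # mV, potassium reversal potential

def rates(v_mv: float, ach_um: float):
    """Opening/closing rates (1/ms); opening grows with [ACh] and with
    hyperpolarization to mimic a voltage-sensitive receptor."""
    alpha = 0.05 * ach_um / (ach_um + 0.1) * np.exp(-(v_mv + 60.0) / 40.0)
    beta = 0.02 * np.exp((v_mv + 60.0) / 80.0)
    return alpha, beta

def simulate(v_mv: float, ach_um: float, t_ms: float = 500.0, dt: float = 0.1):
    """Integrate dP_open/dt = alpha*(1-P) - beta*P and return the final current."""
    p_open = 0.0
    for _ in range(int(t_ms / dt)):
        alpha, beta = rates(v_mv, ach_um)
        p_open += dt * (alpha * (1.0 - p_open) - beta * p_open)
    return G_KACH * p_open * (v_mv - E_K)   # pA, since nS * mV

# A low ACh concentration opens more channels at hyperpolarized voltages
print(simulate(-70.0, 0.05), simulate(-20.0, 0.05))
```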

    Identification of water stress conditions in olive trees through frequencies of trunk growth rate

    Continuous monitoring of tree water status will enhance irrigation performance, particularly when applying deficit schedules. The olive tree is a highly drought-resistant species, and management of water stress could increase water savings. Trunk diameter fluctuations can be displayed as daily curves representing shrinkage and swelling, and can provide information about tree water status. In olive trees, trunk growth rate (TGR) is the most useful indicator, but its daily variability has limited commercial application. Recently, weekly frequencies of TGR values were associated with water status in a one-season experiment. The aim of this work is to study the seasonal pattern and the interannual variations of these parameters in order to integrate them in an irrigation scheduling tool. The experiment was performed during two consecutive seasons (2018 and 2019) in a super-high-density mature olive orchard at Carmona (Seville, Spain). Three different irrigation scheduling treatments were considered in a randomized complete block design. The control treatment was fully irrigated with 150–175% of crop evapotranspiration (ETc) in order to ensure an optimum water status. Regulated deficit irrigation 1 (RDI-1) was scheduled using only the TGR data provided by continuous dendrometer measurements; in this treatment, water stress conditions were controlled during the pit hardening period. RDI-2 was similar to RDI-1, but with more severe water stress conditions during pit hardening and a maximum seasonal amount of water that limited rehydration. Water stress was greater during the 2019 season than the 2018 season, according to the midday stem water potential (SWP). Weekly frequencies of TGR values lower than −0.3 mm day⁻¹ (Severe FR) and of values between −0.1 and 0.3 mm day⁻¹ (Good FR) described the water status pattern in the three treatments for both seasons. Only under severe water stress conditions (SWP more negative than −4 MPa) did the values of these frequencies fail to identify the water status accurately. However, the use of the weekly frequency of values greater than 0.3 mm day⁻¹ (Alert FR), together with the pattern of the Severe FR and Good FR themselves, identified such conditions. The use of these three weekly frequencies (the Severe, Good and Alert (SGA) approach) is suggested for continuous deficit irrigation scheduling in olive trees.
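    The weekly-frequency classification described above can be sketched directly from the stated thresholds (Severe FR below −0.3 mm day⁻¹, Good FR between −0.1 and 0.3 mm day⁻¹, Alert FR above 0.3 mm day⁻¹); the example week of TGR readings is hypothetical, and how the frequencies are turned into an irrigation decision is not specified here.

```python
# Sketch of the weekly TGR frequency bands (SGA approach) described above.
# Thresholds follow the abstract; the example readings are invented.
from typing import Sequence

def weekly_tgr_frequencies(tgr_mm_day: Sequence[float]) -> dict:
    """Return the weekly frequencies of daily TGR values in three bands:
    Severe (< -0.3), Good (-0.1 to 0.3) and Alert (> 0.3) mm/day."""
    n = len(tgr_mm_day)
    severe = sum(v < -0.3 for v in tgr_mm_day) / n
    good = sum(-0.1 <= v <= 0.3 for v in tgr_mm_day) / n
    alert = sum(v > 0.3 for v in tgr_mm_day) / n
    return {"Severe FR": severe, "Good FR": good, "Alert FR": alert}

# One hypothetical week of daily TGR readings (mm/day) from a dendrometer
week = [-0.45, -0.35, -0.15, 0.05, 0.10, 0.25, 0.40]
print(weekly_tgr_frequencies(week))
```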

    Plant water status indicators for detecting water stress in pomegranate trees

    Measurements obtained by the continuous monitoring of trunk diameter fluctuations were compared with discrete measurements of midday stem water potential (Ψstem) and midday leaf conductance (gl) in adult pomegranate trees (Punica granatum (L.) cv. Mollar de Elche). Control plants (T0) were irrigated daily above their crop water requirements in order to attain non-limiting soil water conditions, while T1 plants were subjected to water stress by depriving them of irrigation water for 34 days, after which irrigation was restored and plant recovery was studied for 7 days. T1 plants showed a substantial degree of water stress, which developed slowly. Maximum daily trunk shrinkage (MDS) was identified as the most suitable plant-based indicator for irrigation scheduling in adult pomegranate trees, because its signal:noise ratio ((T1/T0):coefficient of variation) was higher than that for Ψstem ((T1/T0):coefficient of variation) and gl ((T0/T1):coefficient of variation). MDS increased in response to water stress, but when Ψstem fell below −1.67 MPa, the MDS values decreased. This research was supported by CICYT/FEDER (AGL2010-19201-C04-01AGR) and AECID (A1/035430/11) grants to the authors. AG, JCG and ZNC were funded by an FPU, an FPI and an AECID grant, respectively.
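    A short sketch of the signal:noise calculation used to rank the indicators above, assuming the signal is the stressed/control (T1/T0) ratio and the noise is the coefficient of variation of the control readings; the example MDS values are invented for illustration.

```python
# Sketch of the signal:noise ratio used to compare plant water status
# indicators. Example values are hypothetical, not the study's data.
import numpy as np

def signal_to_noise(t1_values: np.ndarray, t0_values: np.ndarray) -> float:
    """Signal:noise = (mean T1 / mean T0) / CV of the control (T0) readings."""
    signal = np.mean(t1_values) / np.mean(t0_values)
    noise = np.std(t0_values, ddof=1) / np.mean(t0_values)   # coefficient of variation
    return signal / noise

# Hypothetical MDS readings (mm) for stressed (T1) and control (T0) trees
mds_t1 = np.array([0.42, 0.45, 0.40, 0.47])
mds_t0 = np.array([0.24, 0.26, 0.25, 0.23])
print(signal_to_noise(mds_t1, mds_t0))
```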