
    Natural intraepithelial lymphocyte populations rise during necrotic enteritis in chickens

    Intraepithelial lymphocytes (IEL) reside in the epithelium at the interface between the contents of the intestinal lumen and the sterile environment of the lamina propria. Because of this strategic location, IEL play a crucial role in various immunological processes, ranging from pathogen control to tissue stability. In mice and humans, IEL exhibit high diversity and are categorized into induced IEL (conventional CD4 and CD8αβ T cells) and natural IEL (TCRαβCD8αα, TCRγδ, and TCRneg IEL). In chickens, however, the subpopulations of IEL and their functions in enteric diseases remain unclear. We therefore conducted this study to investigate the role of IEL populations during necrotic enteritis (NE) in chickens. At 14 days of age, sixty-three specific-pathogen-free (SPF) birds were randomly assigned to three treatments: control (sham challenge), Eimeria maxima challenge (EM), and Eimeria maxima + Clostridium perfringens (C. perfringens) co-challenge (EM/CP). The EM and EM/CP birds were infected with Eimeria maxima at day 14 of age, and EM/CP birds were additionally orally inoculated with C. perfringens at days 18 and 19 of age. Birds were weighed at days 18, 20, and 26 of age to assess body weight gain (BWG). At 20 days of age (1 day post C. perfringens infection; dpi) and 26 days of age (7 dpi), 7 birds per treatment were euthanized, and jejunum was harvested for gross lesion scoring, IEL isolation, and gene expression analysis. The EM/CP birds exhibited subclinical NE, lower BWG, and shorter colon length. Most changes in the IEL populations were observed at 1 dpi. The EM/CP group showed at least two-fold increases in the total numbers of natural IEL subsets, including TCRαβ+CD4-CD8-, TCRαβ+CD8αα+, TCRγδ+, TCRneg and innate CD8α (iCD8α) cells. However, by 7 dpi, only the numbers of TCRαβ+CD4-CD8- and TCRαβ+CD8αα+ IEL remained elevated in the EM/CP group. The EM/CP group also had significantly higher expression of proinflammatory cytokines (IL-1β and IFN-γ) and osteopontin (OPN) in the jejunum at 1 dpi. These findings suggest that natural IEL with innate and innate-like functions might play a critical role in the host response during subclinical NE, potentially conferring protection against C. perfringens infection.

    Influence of temperature, salinity and Mg:Ca ratio on microbially-mediated formation of Mg-rich carbonates by Virgibacillus strains isolated from a sabkha environment.

    Studies have demonstrated that microbes facilitate the incorporation of Mg into carbonate minerals, leading to the formation of potential dolomite precursors. Most microbes that are capable of mediating Mg-rich carbonates have been isolated from evaporitic environments in which temperature and salinity are higher than those of average marine environments. However, how such physicochemical factors affect and interact with microbial activity to influence mineral precipitation remains poorly constrained. Here, we report the results of laboratory precipitation experiments using two mineral-forming Virgibacillus strains and one non-mineral-forming strain of Bacillus licheniformis, all isolated from the Dohat Faishakh sabkha in Qatar. They were grown under different combinations of temperature (20, 30, and 40 °C), salinity (3.5, 7.5, and 10% w/v NaCl), and Mg:Ca ratio (1:1, 6:1, and 12:1). Our results show that the incorporation of Mg into the carbonate minerals is significantly affected by all three tested factors. With a Mg:Ca ratio of 1, no Mg-rich carbonates formed during the experiments. With Mg:Ca ratios of 6 and 12, multivariate analysis indicates that temperature has the highest impact, followed by salinity and Mg:Ca ratio. The outcome of this study suggests that warm and saline environments are particularly favourable for microbially mediated formation of Mg-rich carbonates and provides new insight for interpreting ancient dolomite formations.
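    A minimal sketch of the kind of factorial analysis described above, assuming a hypothetical table of experimental runs; the file name and column names are illustrative, not from the study:

```python
# Hedged sketch: factorial ANOVA ranking the influence of temperature,
# salinity and Mg:Ca ratio on Mg incorporation into the precipitates.
# The data file and column names below are assumptions for illustration.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

runs = pd.read_csv("precipitation_runs.csv")  # assumed columns: temp_c,
                                              # salinity_pct, mg_ca, mg_mol_pct

model = smf.ols(
    "mg_mol_pct ~ C(temp_c) + C(salinity_pct) + C(mg_ca)", data=runs
).fit()
# Larger sums of squares indicate a stronger factor effect
print(sm.stats.anova_lm(model, typ=2))
```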

    Quality by Design (QbD) based process optimisation to develop functionalised particles with modified release properties using novel dry particle coating technique

    Quality by Design (QbD), a current approach used to develop and optimise critical pharmaceutical processes, is a systematic methodology based on the ethos that quality should be designed into the product itself, not merely tested after manufacture. The present work details a step-wise application of QbD principles to optimise process parameters for the production of particles with modified functionalities using dry particle coating technology. Initial risk assessment identified speed, air pressure, processing time, and batch size (independent factors) as having high-to-medium impact on the dry coating process. A design of experiments (DOE) using MODDE software employed a D-optimal design to determine the effect of variations in these factors on the identified responses (content uniformity, dissolution rate, particle size, and intensity of the Fourier transform infrared (FTIR) C=O band). Results showed that batch size had the most significant effect on dissolution rate, particle size, and FTIR intensity, with an increase in batch size enhancing the dissolution rate, decreasing particle size (indicating an absence of coated particles), and increasing the FTIR intensity. Content uniformity, in contrast, was affected by several interaction terms, with speed and batch size having the largest negative effect. The optimal design space for producing functionalised particles with the desired properties required maximum air pressure (40 psi), a low batch size (6 g), speeds between 850 and 1500 rpm, and processing times between 15 and 60 minutes. The validity and predictive ability of the revised model proved reliable across all experiments. Overall, QbD was demonstrated to provide an expedient and cost-effective tool for developing and optimising processes in the pharmaceutical industry.
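    As a rough illustration of the DOE modelling step described above, the sketch below fits a main-effects-plus-interaction model to a hypothetical table of D-optimal runs; the file name, column names, and the specific interaction term are assumptions, not the authors' MODDE workflow:

```python
# Hedged sketch: regression model for one DOE response (dissolution rate)
# against the four factors named in the abstract. Illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

runs = pd.read_csv("doe_runs.csv")  # assumed columns: speed, pressure,
                                    # proc_time, batch_size, dissolution

model = smf.ols(
    "dissolution ~ speed + pressure + proc_time + batch_size"
    " + speed:batch_size",           # interaction highlighted in the abstract
    data=runs,
).fit()

print(model.summary())               # coefficient table: size and sign of each effect
print(model.params["batch_size"])    # batch size expected to dominate per the study
```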

    Assessment of diabetes and prediabetes prevalence and predictors by HbA1c in a population from sub-Saharan Africa with a high proportion of anemia: a prospective cross-sectional study

    Epidemiological data about diabetes mellitus (DM) for sub-Saharan Africa (SSA) are scarce, and the utility of glycated hemoglobin (HbA1c) to diagnose DM is uncertain in African populations with a high proportion of anemia. In a cross-sectional study, age-adjusted prevalence rates and predictors for DM and pre-DM were prospectively assessed by HbA1c in a semirural walk-in population of Tanzania (n=992). Predictors for DM were calculated by logistic regression. Correlations between HbA1c, hemoglobin, and blood glucose levels were assessed by Pearson's correlation. Overall, DM and pre-DM prevalence rates were 6.8% (95% CI 5.3 to 8.5) and 25% (95% CI 22.8 to 28.3), respectively. DM prevalence was higher in patients aged 50-59 years (14.9%; 95% CI 9.1 to 22.5) and ≥60 years (18.5%; 95% CI 12.2 to 26.2) and in patients with overweight (9.3%; 95% CI 5.9 to 13.7) or obesity (10.9%; 95% CI 6.9 to 16) compared with patients aged 18-29 years (2.2%; 95% CI 0.9 to 4.4) (p<0.001) and with normal-weight patients (3.6%; 95% CI 2.1 to 5.6) (p<0.01), respectively. Age (OR 1.08, 95% CI 1.05 to 1.12; p<0.001), body mass index (BMI) (OR 1.10, 95% CI 1.04 to 1.16; p<0.001), and acute infection (OR 3.46, 95% CI 1.02 to 10.8; p=0.038) were predictors for DM. Comparing patients with a BMI of 20 kg/m² and a BMI of 35 kg/m², the relative risk for DM increases on average 2.12-fold (range 1.91-2.24) across the age groups. Comparing patients 20 years old with patients 70 years old, the relative risk for DM increases on average 9.7-fold (range 8.9-10.4) across the BMI groups. Overall, 333 patients (36%) suffered from anemia. The Pearson's correlation coefficient (r) between HbA1c and hemoglobin was -0.009 (p=0.779), while the correlations between HbA1c and fasting blood glucose and between HbA1c and random blood glucose were 0.775 and 0.622, respectively (p<0.001). We observed a high prevalence of DM and pre-DM, mainly driven by increasing age and BMI, and provide evidence that HbA1c is suitable for assessing DM in SSA populations with high proportions of anemia. Trial registration: NCT03458338.
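    A minimal sketch of the two analyses named in the abstract (logistic regression and Pearson's correlation), assuming a hypothetical per-patient dataset; the file and column names are illustrative, not the study's analysis script:

```python
# Hedged sketch: logistic regression of DM status on age, BMI and acute
# infection, and HbA1c correlations with hemoglobin and glucose.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

df = pd.read_csv("survey.csv")  # assumed columns: diabetes (0/1), age, bmi,
                                # acute_infection (0/1), hba1c, hemoglobin,
                                # fasting_glucose, random_glucose

logit = smf.logit("diabetes ~ age + bmi + acute_infection", data=df).fit()
print(np.exp(logit.params).round(2))   # odds ratios, e.g. ~1.08 per year of age

print(pearsonr(df["hba1c"], df["hemoglobin"]))       # expected near zero
print(pearsonr(df["hba1c"], df["fasting_glucose"]))  # expected strongly positive
print(pearsonr(df["hba1c"], df["random_glucose"]))
```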

    Utilization of millet husk ash as a supplementary cementitious material in eco-friendly concrete: RSM modelling and optimization

    The increase in cement consumption has greatly impacted the environment: cement production consumes a huge quantity of energy and releases large amounts of harmful gases into the atmosphere. To reduce cement manufacturing and energy usage and to aid environmental protection, researchers are attempting to introduce agricultural and industrial waste materials with cementitious characteristics. In this study, millet husk ash (MHA) is used as a supplementary cementitious material (SCM) to produce more environmentally sustainable concrete. The main purpose of this investigation is to assess the workability, compressive strength, splitting tensile strength, flexural strength, and drying shrinkage of concrete incorporating 0%, 5%, 10%, 15%, and 20% MHA as SCM. A total of 165 concrete samples were made with a mix proportion of 1:1.5:3 and cured for 7, 28, and 90 days. The experimental results showed improvements in compressive strength, tensile strength, and flexural strength of 11.39%, 9.80%, and 9.39%, respectively, at 10% MHA replacement of cement. Water absorption also decreased with increasing MHA content after 28 days, as did drying shrinkage. However, workability declined as the proportion of MHA in the concrete increased. Moreover, embodied carbon declined as the proportion of Portland cement (PC) replaced with MHA increased. In addition, response prediction models were built and validated using ANOVA at a 95% significance level; R² values for the models ranged from 87.47% to 99.59%. The study concludes that the inclusion of 10% MHA in concrete has a favourable effect on the characteristics of the concrete.
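    A minimal sketch of the response-prediction step described above, assuming a hypothetical table of test results; the file name, column names, and the quadratic model form are illustrative, not the paper's RSM setup:

```python
# Hedged sketch: quadratic response-surface model for compressive strength
# as a function of MHA replacement level and curing age, checked with ANOVA.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

data = pd.read_csv("mha_concrete.csv")  # assumed columns: mha_pct, age_days, fc_mpa

rsm = smf.ols(
    "fc_mpa ~ mha_pct + age_days + I(mha_pct**2) + I(age_days**2)"
    " + mha_pct:age_days",
    data=data,
).fit()

print(sm.stats.anova_lm(rsm, typ=2))       # term-wise significance at alpha = 0.05
print(round(rsm.rsquared * 100, 2))        # R^2 (%), comparable to the reported 87-99% range
```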

    Choosing the most suitable classifier for supporting assistive technology adoption in people with Parkinson’s disease: a fuzzy multi-criteria approach

    Parkinson’s disease (PD) is the second most common neurodegenerative disorder and requires long-term, interdisciplinary disease management. While there remains no cure for Parkinson’s disease, treatments are available to help reduce the main symptoms and maintain quality of life for as long as possible. Owing to the global burden of chronic conditions such as PD, assistive technologies (ATs) are becoming an increasingly commonly prescribed form of treatment. However, low adoption is hampering the potential of digital technologies within health and social care. It is therefore necessary to employ classification algorithms that differentiate adopters from non-adopters of these technologies, so that potential negative effects on people with PD and cost overruns can be minimized. This paper bridges this gap by extending the multi-criteria decision-making approach adopted in technology adoption modeling for people with dementia. First, the fuzzy Analytic Hierarchy Process (FAHP) is applied to estimate the initial relative weights of criteria and sub-criteria. Then, the Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is used to evaluate the interrelations and feedback among criteria and sub-criteria. The Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) is finally implemented to rank three classifiers (Lazy IBk k-nearest neighbors, Naïve Bayes, and J48 decision tree) according to their ability to model technology adoption. A real case study is presented to validate the proposed approach.
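    A minimal sketch of the final TOPSIS ranking step, with placeholder performance scores and criterion weights; the FAHP/DEMATEL-derived weights and the decision matrix are assumptions, not the study's values:

```python
# Hedged sketch: TOPSIS ranking of the three classifiers named in the abstract.
# The decision matrix and weights below are placeholders for illustration.
import numpy as np

alternatives = ["Lazy IBk (k-NN)", "Naive Bayes", "J48 decision tree"]
# Rows = alternatives, columns = evaluation criteria (all benefit-type here)
X = np.array([[0.82, 0.78, 0.90],
              [0.75, 0.85, 0.80],
              [0.88, 0.80, 0.70]])
w = np.array([0.5, 0.3, 0.2])          # assumed FAHP/DEMATEL criterion weights

V = w * X / np.linalg.norm(X, axis=0)  # weighted, vector-normalised matrix
ideal, anti_ideal = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)       # distance to ideal solution
d_neg = np.linalg.norm(V - anti_ideal, axis=1)  # distance to anti-ideal solution
closeness = d_neg / (d_pos + d_neg)             # higher = better alternative

for name, c in sorted(zip(alternatives, closeness), key=lambda t: -t[1]):
    print(f"{name}: {c:.3f}")
```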

    Wild mammals in the Gaza Strip, with particular reference to Wadi Gaza

    Mammalian fauna are considered good indicators of the degree of anthropogenic disturbance to various ecosystems. Many mammalian species have disappeared from the Gaza Strip during the last 5-6 decades, and no efforts have been made to stop this disappearance. The present work aims at surveying the remaining wild mammals in the Gaza Strip, particularly in Wadi Gaza as a natural area. A total of 15 mammalian species belonging to 5 orders and 11 families were recorded. Most of the recorded species were small-sized and resident. Seven mammalian species that have disappeared were reported by locals. The causes of disappearance were mostly anthropogenic and included the limited area of the Gaza Strip, over-population, residential and agricultural encroachment at the expense of natural areas, and the over-exploitation of natural resources, of which hunting was, and still is, a common practice threatening wildlife. The Israeli Occupation is still adversely affecting wildlife ecology in the area. Finally, the authors recommend improving cooperation among different parties to enhance public awareness and to implement environmental laws and legislation to conserve nature and protect wildlife.

    Fracture analysis of steel fibre-reinforced concrete using finite element method modeling

    Concrete has a great capacity to withstand compressive stresses, but it is rather weak at resisting tensile stresses, which ultimately results in the formation of cracks in concrete buildings. The development of cracks has a significant impact on the durability of concrete because they serve as direct pathways for corrosive substances that harm the concrete’s constituents. Consequently, the reinforced concrete may experience degradation, cracking, weakening, or progressive disintegration. To mitigate such problems, it is advisable to distribute discrete fibres uniformly throughout the concrete mixture. The fibres function by bridging the gaps created by fractures, thereby decelerating crack initiation and propagation. It is not practical to assess crack initiation and propagation through probabilistic methods when there are uncertainties in the material components and geometrical factors. This research examines the behaviour of varying steel fibre contents in fibre-reinforced concrete (FRC) via finite element method (FEM) modeling. Fracture parameters such as fracture energy and fracture toughness were also computed through the FEM analysis. The FEM constitutive model developed was validated against experimental results: the compressive strength of the developed constitutive model was 28.50 MPa, which is very close to the 28-day compressive strength obtained experimentally, i.e., 28.79 MPa. The load-carrying capacity obtained through FEM was 7.9 kN, 18 kN, and 24 kN for the three FEM models developed with steel fibre contents of 0.25%, 0.5%, and 0.75%, respectively. The study developed an FEM model that can be used to calculate the fracture parameters of steel fibre-reinforced concrete (SFRC).
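    As a rough illustration of one of the fracture parameters mentioned above, the sketch below estimates fracture energy from a load-deflection curve as the work of fracture divided by the ligament area; the numerical values and beam dimensions are placeholders, not the paper's FEM output:

```python
# Hedged sketch: fracture energy from a load-deflection curve, a common
# post-processing step for SFRC beam models. All values are placeholders.
import numpy as np

deflection_mm = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])   # mid-span deflection
load_kn       = np.array([0.0, 7.9, 6.5, 5.0, 3.5, 2.0])   # applied load

# Work of fracture = area under the load-deflection curve (convert to N and m)
work_j = np.trapz(load_kn * 1e3, deflection_mm * 1e-3)

ligament_area_m2 = 0.10 * 0.10          # assumed fracture ligament (0.1 m x 0.1 m)
g_f = work_j / ligament_area_m2         # fracture energy in J/m^2 (= N/m)
print(f"Estimated fracture energy: {g_f:.1f} N/m")
```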

    Assessment of treatment burden and its impact on quality of life in dialysis-dependent and pre-dialysis chronic kidney disease patients

    Background: The management of chronic kidney disease (CKD) and its complications places a significant burden on patients, resulting in impairment of their health-related quality of life (HR-QOL). Little is known about treatment-related burden in pre-dialysis and hemodialysis (HD) CKD patients. Objective: This study aimed to investigate the magnitude of treatment-related burden and its impact on HR-QOL among patients with CKD. Methods: This was a prospective, cross-sectional study to assess treatment-related burden and HR-QOL among patients with CKD in Qatar. Treatment-related burden and HR-QOL were assessed quantitatively using the Treatment Burden Questionnaire (TBQ) and the Kidney Disease Quality of Life (KDQOL™) questionnaire, respectively. The total TBQ score ranges from 0 to 150, with a higher score indicating higher treatment burden, while the total possible KDQOL™ score ranges from 0 to 3600, with a higher transformed score indicating better QOL. Pre-dialysis and hemodialysis (HD) CKD patients who had regular follow-up appointments at Fahad Bin Jassim Kidney Center in Qatar were enrolled. Data were analyzed descriptively and inferentially using SPSS version 24. Results: Two hundred and eighty CKD patients (HD = 223 and pre-dialysis = 57) were included in the analyses (response rate 60.9%). Approximately 35% of the participants reported moderate to high treatment-related burden (TBQ global score 51–150). HD patients experienced significantly higher treatment burden compared with pre-dialysis patients, with a median (IQR) score of 45 (36) versus 25 (33), respectively (p < 0.001). Medication burden and lifestyle-change burden were the highest perceived treatment-related burdens. Overall, the perceived median (IQR) HR-QOL measured using the KDQOL-36™ among the participants was 2280.6 (1096.2), compared with the maximum global score of 3600. Similarly, HD patients demonstrated significantly lower HR-QOL compared with pre-dialysis patients [median (IQR) score of 2140 (1100) vs. 2930 (995), respectively; p < 0.001]. There was a strong negative correlation between TBQ score and KDQOL-36™ score [rs (251) = −0.616, p < 0.001], signifying that HR-QOL decreases as treatment burden increases. Conclusions: This study suggests that a considerable proportion of CKD patients suffered from treatment-related burden and deterioration in HR-QOL at varying degrees of severity. HD patients experienced a significantly higher burden of treatment and lower HR-QOL compared with pre-dialysis patients, and HR-QOL declined as treatment burden increased. Therefore, treatment-related burden should be considered in CKD management, and factors that increase it should be considered when designing healthcare interventions directed to CKD patients. This research was funded by Qatar University under Student Grant number QUST-CPH-SPR/2017-19 [approved amount QAR 20,000.00 (~US$ 5,480)]. The funders had no role in the design, planning, and implementation of the study. The content is the sole responsibility of the authors.
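    A minimal sketch of the correlation and group-comparison analyses reported above, assuming a hypothetical per-patient dataset; the file and column names are illustrative, not the study's data:

```python
# Hedged sketch: Spearman correlation between treatment burden (TBQ) and
# quality of life (KDQOL-36), plus a non-parametric HD vs. pre-dialysis
# comparison of burden scores. Data layout is assumed for illustration.
import pandas as pd
from scipy.stats import spearmanr, mannwhitneyu

df = pd.read_csv("ckd_survey.csv")  # assumed columns: tbq_total, kdqol_total, group

rs, p = spearmanr(df["tbq_total"], df["kdqol_total"])
print(f"Spearman rs = {rs:.3f}, p = {p:.4f}")   # reported value was rs = -0.616

hd  = df.loc[df["group"] == "HD", "tbq_total"]
pre = df.loc[df["group"] == "pre-dialysis", "tbq_total"]
print(mannwhitneyu(hd, pre))        # burden comparison between the two groups
```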

    Perceived risk of falls among acute care patients

    Purpose: In an effort to lower the number of falls that occur among hospitalized patients, several facilities have begun introducing various fall prevention programs. However, the efficacy of fall prevention programs is diminished if patients do not consider themselves to be at risk for falls and do not follow recommended procedures. The goal of this study was to characterize how patients in four different acute care specialist services perceived their risk of falling while in the hospital. Methods: One hundred patients admitted to the study hospital with a Morse Fall Scale score of 45 or higher were given the Patient Perception Questionnaire, a tool designed to assess a patient's perception of their own fall risk, fear of falling, and motivation to take part in fall prevention efforts. Scores on the Morse Fall Scale were gathered through a retrospective review of medical records. Descriptive statistics, Pearson's correlation coefficients, and independent-samples t-tests were used to examine the data. Results: The average age was 65 years; 52% of participants were men and 48% were women. Based on their Morse Fall Scale ratings, all 100 participants were classified as being at high risk for falls, yet only 55.5% of the individuals agreed with this assessment. The likelihood that a patient would seek assistance and the degree to which they feared falling both declined as their confidence in their mobility increased. Patients hospitalized after a fall exhibited considerably lower confidence scores and greater fear scores than patients who had not been injured in a fall. Conclusions: Patients with a high fall risk assessment score may not believe they are at risk for falls and may not take any steps to reduce their risk. The prevalence of falls in hospitals might be mitigated by the creation of a fall risk assessment technique that takes into account both objective and subjective factors.
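    A minimal sketch of the correlation and group-comparison analyses named in the Methods, assuming a hypothetical per-patient dataset; the column names and data layout are illustrative, not the study's records:

```python
# Hedged sketch: Pearson correlations and an independent-samples t-test on
# perceived fall-risk data. All column names are assumed for illustration.
import pandas as pd
from scipy.stats import pearsonr, ttest_ind

df = pd.read_csv("fall_perception.csv")  # assumed columns: confidence, fear,
                                         # help_seeking, admitted_after_fall

# Confidence in mobility vs. willingness to seek help and fear of falling
print(pearsonr(df["confidence"], df["help_seeking"]))
print(pearsonr(df["confidence"], df["fear"]))

# Compare patients admitted after a fall with the remaining patients
fallers = df[df["admitted_after_fall"] == 1]
others  = df[df["admitted_after_fall"] == 0]
print(ttest_ind(fallers["confidence"], others["confidence"], equal_var=False))
```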