30 research outputs found
A phase III, randomized, two-armed, double-blind, parallel-group, active-controlled, non-inferiority clinical trial to compare the efficacy and safety of biosimilar adalimumab (CinnoRA®) to the reference product (Humira®) in patients with active rheumatoid arthritis
Background: This study aimed to compare the efficacy and safety of test adalimumab (CinnoRA®, CinnaGen, Iran) to the innovator product (Humira®, AbbVie, USA) in adult patients with active rheumatoid arthritis (RA). Methods: In this randomized, double-blind, active-controlled, non-inferiority trial, a total of 136 patients with active RA were randomized to receive 40 mg subcutaneous injections of either CinnoRA® or Humira® every other week, while receiving methotrexate (15 mg/week), folic acid (1 mg/day), and prednisolone (7.5 mg/day), over a period of 24 weeks. Physical examinations, vital-sign evaluations, and laboratory tests were conducted at baseline and at the 12-week and 24-week visits. The primary endpoint was the proportion of patients achieving a good or moderate European League Against Rheumatism (EULAR) response based on the disease activity score in 28 joints-erythrocyte sedimentation rate (DAS28-ESR). The secondary endpoints were the proportion of patients achieving American College of Rheumatology 20% (ACR20), 50% (ACR50), and 70% (ACR70) responses, the disability index of the health assessment questionnaire (HAQ), and safety. Results: Patients randomized to the CinnoRA® and Humira® arms had comparable demographic information, laboratory results, and disease characteristics at baseline. The proportion of patients achieving good and moderate EULAR responses in the CinnoRA® group was non-inferior to that in the Humira® group at 12 and 24 weeks in both the intention-to-treat (ITT) and per-protocol (PP) populations (all p values >0.05). No significant difference was noted in the proportion of patients attaining ACR20, ACR50, and ACR70 responses between the CinnoRA® and Humira® groups (all p values >0.05). Further, the differences in HAQ scores and safety outcome measures between treatment arms were not statistically significant. Conclusion: CinnoRA® was shown to be non-inferior to Humira® in terms of efficacy at week 24, with a safety profile comparable to the reference product
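As an illustrative aside (not part of the trial's statistical analysis plan), a non-inferiority comparison of response proportions like the one above can be sketched in Python: compute the risk difference between arms with a Wald 95% confidence interval and check its lower bound against a pre-specified margin. The counts and the margin below are hypothetical placeholders, not the trial's data.

```python
import math

# Hypothetical counts: responders / total per arm (not the trial's actual data)
resp_test, n_test = 55, 68     # biosimilar arm
resp_ref,  n_ref  = 57, 68     # reference arm
margin = -0.20                 # assumed non-inferiority margin for the risk difference

p_t = resp_test / n_test
p_r = resp_ref / n_ref
diff = p_t - p_r

# Wald standard error of the risk difference and 95% CI
se = math.sqrt(p_t * (1 - p_t) / n_test + p_r * (1 - p_r) / n_ref)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

print(f"risk difference = {diff:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
# Non-inferiority is concluded if the lower CI bound exceeds the margin
print("non-inferior" if lo > margin else "non-inferiority not shown")
```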
Use of a Visual Scoring System to Assess External Udder Conformation in Ewes and the Relationship to Lamb Growth Rates
Implications of external udder conformation and its relationship to colostrum quality and lamb growth rates have not been widely considered in sheep used for meat production. We hypothesized that a favorable udder conformation would correlate with higher colostrum quality and increased growth rates in lambs. We evaluated 50 ewes within 6-8 hours of parturition to score their udders for conformation and sampled colostrum from both halves of the udder. Colostrum was analyzed with a refractometer to measure total proteins as an indicator of overall quality. On day 2, day 45, and day 60 after lambing, lamb weights were recorded, and udder conformation measurements were repeated. This study used a visual scoring system assessing udder floor (1-4; 1 = defined halving, 2 = too flat, 3 = broken, 4 = asymmetric), udder depth (1-9; 1 = low udder, 9 = shallow udder, with 5 using the hock as a reference point), teat placement (1-9; 1 = most medial, 9 = most lateral), teat lesions (present or absent), and the presence of wool (present or absent) to assess external udder conformation. Normal udder parameters included udder depth scores of 5 or 6; an udder floor score of 1 or 2; teat placement scores of 4, 5, or 6; and the absence of teat lesions and wool. All ewes not meeting the normal parameters were treated as abnormal. Upon initial evaluation, 22% of ewes displayed ‘normal’ conformation, with an average total protein of 14.82 ± 0.58 mg/dL, while 78% of the ewes displayed ‘abnormal’ conformation, with an average total protein of 13.31 ± 0.33 mg/dL. The data were analyzed using the GLM procedure in SAS, with significance declared at P < 0.05. A significant difference (P = 0.0284) was detected in total proteins between ewes with a ‘normal’ udder conformation and ewes with an ‘abnormal’ conformation. No significant difference (P > 0.05) was detected between normal and abnormal conformations regarding lamb weights. These data provide evidence of increased total protein values in ewes with ‘normal’ udder conformation and no difference in lamb weights between udder conformations
Use of a Visual Scoring System to Assess External Udder Conformation and Its Relationship to Colostrum Quality and Lamb Growth Rates
In sheep raised for meat production, the relationship between external udder conformation, colostrum quality, and lamb growth rates has not received much attention. We hypothesized that ewes with a more desirable udder conformation at lambing would have greater colostrum quality and greater growth rates in lambs. Fifty Suffolk ewes were used in this study. Within 6–8 h of parturition, colostrum samples from both halves of the udder were collected and visual scoring of the udder was conducted. Colostrum quality was measured for total proteins using both optical and Brix refractometers. On day 2, day 45, and day 60 after parturition, lamb weights were recorded, and udder conformation measurements were repeated. A visual scoring system evaluating udder floor (scale 1–4), udder depth (scale 1–9), teat placement (scale 1–9), teat/mammary lesions (present or absent), and the presence of wool (present or absent) was used to assess the external udder conformation. Normal udder parameters included udder depth scores of 5 or 6; udder floor scores of 1 or 2; teat placement scores of 4, 5, or 6; and the absence of teat/mammary lesions and wool. All ewes not meeting normal parameters were considered to have an abnormal udder. The data were analyzed using the GLM procedure. Mean total colostrum protein was greater (p = 0.03) in ewes displaying a ‘normal’ udder conformation compared with those with an ‘abnormal’ conformation (14.82 ± 0.5 and 13.31 ± 0.3 mg/dL, respectively). Mean Brix values were also greater (p = 0.03) for ewes with a ‘normal’ udder compared with those with an ‘abnormal’ udder conformation (21.70 ± 0.8 and 19.54 ± 0.5, respectively). On day 2 after parturition, mean lamb body weight did not differ between ewes with ‘normal’ and ‘abnormal’ udders (5.38 ± 0.26 vs. 5.46 ± 0.15). No differences (p > 0.05) in lamb weights were detected between ewes with normal and abnormal udder conformations on days 45 and 60 after parturition. These data provide evidence of greater colostrum total protein and Brix values in ewes with a ‘normal’ udder conformation. There were no differences in the weights of lambs born to ewes with normal or abnormal udder conformations
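For illustration only, a one-way general linear model comparable to the SAS GLM analysis described above can be sketched in Python with statsmodels; the data frame, column names, and values below are hypothetical, not the study's dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: colostrum total protein (mg/dL) by udder conformation group
df = pd.DataFrame({
    "group":   ["normal"] * 5 + ["abnormal"] * 5,
    "protein": [15.1, 14.6, 14.9, 15.3, 14.2, 13.0, 13.5, 13.2, 13.8, 12.9],
})

# One-way GLM: total protein modeled on conformation group,
# analogous in spirit to PROC GLM with a single class effect
model = smf.ols("protein ~ group", data=df).fit()
print(model.summary())   # includes the F test for the group effect
```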
Studying the Impact of Government Expenditure Shocks on Macroeconomic Variables of the Iranian Economy
This paper studies the impact of government expenditure shocks on gross domestic product (GDP), personal consumption, the trade balance, and the effective exchange rate. For this purpose, time-series data on Iranian macroeconomic variables covering 1976 to 2007 were used. A vector autoregressive (VAR) model, forecast error variance decomposition, and impulse response functions were used to study the impact of government expenditure shocks on the macroeconomic variables of the Iranian economy. The results extracted from the estimated VAR model and the forecast error variance decomposition analyses showed that positive government expenditure shocks increase GDP and personal consumption but decrease the trade balance. Positive government expenditure shocks decrease the effective exchange rate only in the first year; thereafter, they have a positive but very small impact on the effective exchange rate
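As a hedged sketch of the workflow described above (not the paper's actual code or data), a VAR with impulse responses and forecast error variance decomposition can be estimated in Python with statsmodels. The file and column names are placeholders.

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical annual series, 1976-2007; file and column names are placeholders
# (expected columns: gov_spending, gdp, consumption, trade_balance, reer)
df = pd.read_csv("iran_macro.csv", index_col="year")

model = VAR(df)
results = model.fit(maxlags=2, ic="aic")   # short lag order, suitable for ~32 annual obs

irf = results.irf(10)         # impulse response functions over a 10-period horizon
fevd = results.fevd(10)       # forecast error variance decomposition

irf.plot(impulse="gov_spending")   # responses to a government expenditure shock
print(fevd.summary())
```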
Inflammatory response of canine gingiva to a chemical retraction agent placed at different time intervals
Background: Exposure of the gingival sulcus while controlling hemorrhage is a prerequisite for maximizing treatment outcomes of cervical carious lesions and for obtaining quality impressions for the fabrication of indirect restorations with cervical finish lines. Gingival retraction cords saturated with different chemical agents are widely used for this purpose. The aim of this study was to investigate and compare the inflammatory potential of 15.5% ferric sulfate on connective tissue when placed for different time intervals.
Materials and Methods: All procedures were performed on three dogs under general anesthesia. Retraction cords saturated with a 15.5% ferric sulfate solution were placed into the gingival sulcus and evaluated after 3 min and 10 min of exposure to the chemical agent. Excisional biopsies of the exposed gingival tissue were then obtained at intervals of 1 h, 24 h, and 7 days. Histologic evaluation of all specimens was performed using light microscopy. Data collected from the microscopic images of all tissue specimens were analyzed using the Wilcoxon signed-rank and Kruskal-Wallis tests. A P value less than 0.05 was considered significant.
Results: Histopathologic examination of the biopsied gingival tissue revealed that the ferric sulfate solution caused significant tissue changes at the beginning of both the 3-min and 10-min gingival exposure times (P < 0.05). However, the tissue returned to a normal histological appearance by the end of day 7 in all cases (P > 0.05).
Conclusion: The results of this study revealed that the biologic effects of 15.5% ferric sulfate solution are clinically acceptable and reliable when gingival exposure times of 3 min and 10 min are used for gingival retraction
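Purely as an illustration of the nonparametric tests named in the Methods above (not the study's data or code), the sketch below runs a Wilcoxon signed-rank test on paired exposure-time scores and a Kruskal-Wallis test across the three biopsy intervals using SciPy; all scores are hypothetical.

```python
from scipy.stats import wilcoxon, kruskal

# Hypothetical ordinal inflammation scores (not the study's data)
scores_3min  = [2, 3, 2, 3, 2, 1]   # paired sites, 3-min exposure
scores_10min = [3, 4, 4, 5, 3, 2]   # same sites, 10-min exposure

# Paired comparison of the two exposure times
stat, p = wilcoxon(scores_3min, scores_10min)
print(f"Wilcoxon signed-rank: p = {p:.3f}")

# Comparison across the three biopsy intervals (1 h, 24 h, 7 days)
h1  = [3, 3, 2, 3]
h24 = [2, 2, 2, 3]
d7  = [1, 1, 1, 2]
stat, p = kruskal(h1, h24, d7)
print(f"Kruskal-Wallis: p = {p:.3f}")
```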
Neonatal and early infancy burns in the only referral burn center in northeast Iran: Report of a decade
Background: Burns in the neonatal and early infancy period (<6 months of age) are relatively rare, but they can cause severe problems. This study was designed to evaluate the epidemiology and etiology of burn injury in the neonatal and early infancy period.
Methods: In a cross-sectional study, we collected information about neonatal and early infancy burn injuries from the hospital information system in a 10-year period starting from January 1, 2007, in Imam Reza Hospital in the northeast of Iran. Data were analyzed by SPSS version 16.
Results: There were 3 neonatal and 47 early infancy burn injuries (0.7% of all burn injury admissions). All injuries occurred at home. The mean age was 122.3 ± 51.7 days, and 31 patients (62%) were male. The mean burned total body surface area (TBSA) was 19.21 ± 11.44% (range = 3–55%). The mean hospital stay was 11.9 ± 7.5 days. The fatality rate was 2%. The most common mechanisms of burn injury were scalds (41, 82%) and flame (5, 10%). The most common hot-liquid containers were the kettle (21, 42%) and the samovar (8, 16%). Explosion injuries were associated with the longest hospital stay (28.50 ± 2.12 days; P = 0.01). Patients burned by hot liquid splashed from a samovar had a greater burned TBSA (30.13 ± 10.71%; P = 0.04).
Conclusions: Preparing hot beverages and food while simultaneously caring for a child is a dangerous situation that can lead to burn injuries in infants and neonates. The results of this study provide a valuable background for prevention programs aimed at protecting neonates and infants from burn injury
Dietary protein intake and mortality among survivors of liver cirrhosis: a prospective cohort study
Background: Liver cirrhosis is a worldwide burden and is associated with poor clinical outcomes, including increased mortality. Dietary modifications can have beneficial effects in reducing morbidity and mortality. Aim: The current study aimed to evaluate the potential association of dietary protein intake with cirrhosis-related mortality. Methods: In this cohort study, 121 ambulatory cirrhotic patients with a cirrhosis diagnosis of at least 6 months were followed up for 48 months. A 168-item validated food frequency questionnaire was used to assess dietary intake. Total dietary protein was classified as dairy, vegetable, and animal protein. We estimated crude and multivariable-adjusted hazard ratios (HRs) with 95% confidence intervals (CIs) using Cox proportional hazards analyses. Results: After full adjustment for confounders, higher total (HR = 0.38, 95% CI = 0.2–1.1, p trend = 0.045) and dairy (HR = 0.38, 95% CI = 0.13–1.1, p trend = 0.046) protein intake was associated with a 62% lower risk of cirrhosis-related mortality, whereas a higher intake of animal protein was associated with a 3.8-fold increase in the risk of mortality (HR = 3.8, 95% CI = 1.7–8.2, p trend = 0.035). Higher intake of vegetable protein was inversely but not significantly associated with mortality risk. Conclusion: This evaluation of the associations between dietary protein intake and cirrhosis-related mortality indicated that higher intakes of total and dairy protein and a lower intake of animal protein are associated with a reduced risk of mortality in cirrhotic patients
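As a minimal sketch of the survival analysis named above (assuming a lifelines-style workflow; the file and column names are hypothetical, not the study's dataset), a Cox proportional hazards model could be fitted in Python as follows.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort data; column names are placeholders
# expected columns: months_followed, died, total_protein_g, dairy_protein_g,
# animal_protein_g, plus confounders such as age, sex, energy_intake
df = pd.read_csv("cirrhosis_cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df,
    duration_col="months_followed",   # follow-up time (up to 48 months)
    event_col="died",                 # 1 = cirrhosis-related death
)
# All remaining columns are treated as covariates, which is how
# multivariable adjustment for confounders would enter the model
cph.print_summary()   # hazard ratios with 95% confidence intervals
```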
Identification of Xq22.1-23 as a region linked with hereditary recurrent spontaneous abortion in a family
Background: Recurrent spontaneous abortion (RSA) is one of the most common health complications and has a strong genetic component. Several genetic disorders have been identified as etiological factors of hereditary X-linked RSA; however, more genetic factors remain to be identified.
Objective: In this study, we performed linkage analysis on a large X-linked RSA pedigree to find a novel susceptibility locus for RSA.
Materials and Methods: A linkage scan using 11 microsatellites was performed in 27 members of a large pedigree of hereditary X-linked RSA. Two-point parametric linkage analysis was performed using the Superlink v1.6 program.
Results: Evidence of linkage was observed for the markers DXS7133 at Xq23 and DXS101 at Xq22.1, with LOD scores of 3.12 and 1.60, respectively.
Conclusion: The locus identified in this study may harbor a gene responsible for RSA. Narrowing down this region may lead to the identification of this gene
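For context, the LOD scores reported above are the standard two-point linkage statistic, conventionally defined as:

```latex
\mathrm{LOD}(\theta) = \log_{10} \frac{L(\theta)}{L(\theta = 0.5)}
```

where L(θ) is the likelihood of the pedigree data at recombination fraction θ, and θ = 0.5 corresponds to no linkage. A LOD score of 3 or more, as observed here for DXS7133, is conventionally taken as significant evidence of linkage.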
The quality of care index for low back pain: a systematic analysis of the Global Burden of Disease Study 1990–2017
Background: Low back pain is one of the major causes of morbidity worldwide. Studies on the quality of care for low back pain are limited. This study aimed to evaluate the quality of care for low back pain worldwide and to compare gender, age, and socioeconomic groups. Methods: This study used Global Burden of Disease (GBD) data from 1990 to 2017 from the Institute for Health Metrics and Evaluation (IHME) website. Extracted data included low back pain incidence, prevalence, disability-adjusted life years (DALYs), and years lived with disability (YLDs). The DALYs-to-prevalence ratio and the prevalence-to-incidence ratio were calculated and used in a principal component analysis (PCA) to construct a proxy of the quality-of-care index (QCI). Age groups, genders, and countries with different socioeconomic statuses were compared regarding the quality of low back pain care from 1990 to 2017. Results: The proxy of the QCI showed a slight decrease, from 36.44 in 1990 to 35.20 in 2017. High- and upper-middle-income countries showed a decrease in the quality of care, from 43.17 to 41.57 and from 36.37 to 36.00, respectively, from 1990 to 2017. On the other hand, low- and lower-middle-income countries improved, from a QCI proxy of 20.99 to 27.89 and from 27.74 to 29.36, respectively. Conclusion: Despite improvements in the quality of care for low back pain in low- and lower-middle-income countries between 1990 and 2017, there is still a large gap between these countries and higher-income countries. Continued steps must be taken to reduce healthcare barriers in these countries
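As a hedged sketch of the QCI-proxy construction described above (the file and column names are placeholders, and the exact preprocessing is an assumption, not the study's published code), the two ratios can be fed into a PCA in Python and the first principal component taken as the proxy.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical GBD-style extract; column names are placeholders
# (expected columns: incidence, prevalence, dalys, ylds, one row per country-year)
df = pd.read_csv("gbd_low_back_pain.csv")

# Ratios the study uses as inputs to the quality-of-care proxy
df["dalys_to_prev"] = df["dalys"] / df["prevalence"]
df["prev_to_inc"]   = df["prevalence"] / df["incidence"]

# Standardize, then take the first principal component as the QCI proxy
X = StandardScaler().fit_transform(df[["dalys_to_prev", "prev_to_inc"]])
pca = PCA(n_components=1)
df["qci_proxy"] = pca.fit_transform(X)
print(pca.explained_variance_ratio_)   # share of variance captured by the proxy
```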