4,866 research outputs found
Lack of Z-DNA Conformation in Mitomycin-Modified Polynucleotides Having Inverted Circular Dichroism
Poly(dG-dC)·poly(dG-dC) and Micrococcus lysodeikticus DNA were modified by exposure to reductively activated mitomycin C, an antitumor antibiotic. The resulting covalent drug-polynucleotide complexes displayed varying degrees of CD inversion, strikingly similar to the inverted spectrum observed with Z-DNA. The following criteria establish, however, that the inverted CD pattern seen in mitomycin C-polynucleotide complexes does not reflect a Z-DNA conformation. (i) The ethanol-induced transition of poly(dG-dC)·poly(dG-dC) from the B to the Z conformation is not facilitated but rather inhibited by mitomycin C modification, possibly because of the presence of crosslinks. (ii) Radioimmunoassay indicated no competition for Z-DNA-specific antibody by any of the mitomycin C-modified polynucleotides. (iii) ³¹P NMR of the complexes yielded a single, relatively narrow resonance, which is inconsistent with the dinucleotide repeat characteristic of Z-DNA. Alternative explanations for the inverted CD pattern include a drug-induced left-handed but non-Z conformational change, or the superposition of an induced CD onto the CD of B-DNA due to drug-base electronic interactions. These results illustrate the need for caution in interpreting CD changes alone as an indication of Z-DNA conformation.
A computer-assisted motivational social network intervention to reduce alcohol, drug and HIV risk behaviors among Housing First residents.
Background: Individuals transitioning from homelessness to housing face challenges to reducing alcohol, drug and HIV risk behaviors. To aid in this transition, this study developed and will test a computer-assisted intervention that delivers personalized social network feedback by an intervention facilitator trained in motivational interviewing (MI). The intervention goal is to enhance motivation to reduce high-risk alcohol and other drug (AOD) use and to reduce HIV risk behaviors. Methods/design: In this Stage 1b pilot trial, 60 individuals who are transitioning from homelessness to housing will be randomly assigned to the intervention or control condition. The intervention condition consists of four biweekly social network sessions conducted using MI. AOD use and HIV risk behaviors will be monitored prior to and immediately following the intervention and compared to control participants' behaviors to explore whether the intervention was associated with any systematic changes in AOD use or HIV risk behaviors. Discussion: Social network health interventions are an innovative approach for reducing future AOD use and HIV risk problems, but little is known about their feasibility, acceptability, and efficacy. The current study develops and pilot-tests a computer-assisted intervention that incorporates social network visualizations and MI techniques to reduce high-risk AOD use and HIV risk behaviors among the formerly homeless. ClinicalTrials.gov identifier: NCT02140359.
Influence of Neck-Rail Placement on Free-Stall Preference, Use, and Cleanliness
Three experiments examined how the presence of a neck rail at different heights and locations influenced dairy cattle behavior and stall cleanliness. Experiment 1 compared 4 levels of neck-rail height (102, 114, and 127 cm, and no neck rail; presented at 160 or 180 cm from the curb) in a preference test. Cows (n = 10) showed no consistent preference based on neck-rail height, regardless of the horizontal position of the neck rail. When cows were restricted to each treatment in turn, however, time spent standing fully (with all 4 hooves) in the stall was least in the stall with the lowest neck rail (mean, 22 min/24 h) and greatest in the stall with no neck rail (mean, 83 min/24 h). A second experiment examined the effect of a neck rail placed at 3 distances from the curb (140, 175, and 233 cm) when height was held constant (131 cm; n = 12). Time spent standing fully in the stall was least when the neck rail was close to the curb (140 cm; mean, 11 min/24 h) and greatest when the neck rail was furthest from the curb (233 cm; mean, 86 min/24 h). When the neck rail was far from the curb, the cows were more likely to soil the stall by defecating while standing fully in the stall. Experiment 3 compared soiling of the stall by 14 cows with and without a neck rail at a height of 124.5 cm. When the neck rail was removed, cows were more than twice as likely to soil the stall by defecating while standing fully in the stall compared with when the neck rail was present (1.3 vs. 0.5 defecations/24 h). Thus, restrictive neck-rail placement prevents cows from standing in the stall but helps keep stalls clean. Access to more comfortable flooring surfaces outside the stall may help mitigate the negative effects of restrictive neck rails.
Tail Docking Dairy Cattle: Effects on Cow Cleanliness and Udder Health
To determine whether tail docking would influence cow cleanliness and udder health in a free-stall system, we monitored milking cows after half the animals in a herd were docked. A sample of 223 docked and 190 undocked cows (reducing to 169 and 105 over the study as cows were dried off) was monitored for 8 wk. Cow cleanliness was scored in two areas, along the spine and on the rump adjacent to the tail, at 1, 2, 3, 5, and 8 wk after docking. Cleanliness was evaluated by counting squares that were soiled (0 to 14 on a 5- × 17.5-cm grid) and by judging soiling severity on a scale from 0 (clean) to 3 (thickly caked). Udder cleanliness was scored with the same scale (0 to 3) and by counting the number of teats with debris on them. Udder health was assessed by measuring the somatic cell count (SCC) of two milk samples and the number of animals diagnosed as mastitic by the on-farm veterinarian. No treatment differences were found in four measures of cow cleanliness, two measures of udder cleanliness, or udder health. However, cow cleanliness did differ over time, and analysis of a subsample of cows illustrated individual differences in cleanliness.
Effects of Three Types of Free-Stall Surfaces on Preferences and Stall Usage by Dairy Cows
One important criterion in choosing appropriate housing systems for dairy cattle is that the free stall provides a comfortable surface for the cow. This paper describes two experiments testing the effects of commonly used lying surfaces on stall preference and stall usage by Holstein cows. In both experiments, 12 cows were housed individually in separate pens. Each pen contained three free stalls, each with a different surface: deep-bedded sawdust, deep-bedded sand, and a geotextile mattress covered with 2 to 3 cm of sawdust. The animals were restricted to each surface in turn, in random order, for either 2 d (Experiment 1) or 3 d (Experiment 2). Both before and after this restriction phase, the animals were allowed access to all three surfaces, and preference was determined based on lying times. Of the 12 cows used in Experiment 1, 10 preferred sawdust before and nine after the restriction phase. During the restriction phase, average lying times and the number of lying events were significantly lower for the sand-bedded stalls (P ≤ 0.05), and standing times were higher on mattresses (P ≤ 0.05), compared with sawdust. Although these cows had some experience with all three surfaces during the experiment, they had been housed in sawdust-bedded stalls during their previous lactation. Cows used in Experiment 2 had spent their previous lactation in sand-bedded stalls. In this experiment, about half the cows preferred sand and half sawdust after the restriction phase. During the restriction phase, lying times and the number of lying events were lower, and standing times were higher, when the animals were restricted to the mattresses compared with either sand or sawdust (P ≤ 0.05). These results indicate that (1) free-stall surface can affect both stall preferences and stall usage, and (2) mattresses are less preferred than deep-bedded surfaces.
Free-Stall Dimensions: Effects on Preference and Stall Usage
In 2 experiments, free-stall dimensions were examined to determine how they affected stall preference, usage, cleanliness, and milk production in Holstein dairy cattle. In experiment 1, stall width (112 or 132 cm) and stall length (229 or 274 cm from curb to wall) were compared in a 2 × 2 factorial arrangement of stall treatments using 15 individually housed, non-lactating animals. Cows showed no clear preference for stall size as measured by lying time. When animals had no choice between stalls, average lying time was higher in the wide stalls than in the narrow stalls (10.8 vs. 9.6 ± 0.3 h/24 h). Both length and width affected the time spent standing with only the front hooves in the stall; variation in this behavior was best explained by total stall area. In experiment 2, 27 lactating dairy cattle were alternately housed with access to stalls 106, 116, or 126 cm in width using a cross-over design, with exposure to each treatment lasting 3 wk. Animals spent an additional 42 min/24 h lying in stalls measuring 126 cm in width compared with stalls with only 106 cm between partitions. Free-stall width influenced the time spent standing with the front 2 hooves in the stall; animals averaged 58 min/24 h in the widest stalls and 85 min/24 h in the narrowest stalls. The amount of time spent standing with all 4 hooves in the stall tended to be longer in wider stalls, and these stalls were also most likely to become soiled with feces. Stall width did not affect the number of lying events or milk production. In conclusion, animals spent more time lying down, and less time standing with only the front hooves in the stall, in larger stalls, but larger stalls were also more likely to become soiled.
The impact of obstructive sleep apnea variability measured in-lab versus in-home on sample size calculations
Background: When conducting a treatment intervention, it is assumed that variability associated with measurement of the disease can be controlled sufficiently to reasonably assess the outcome. In this study we investigate the variability of the Apnea-Hypopnea Index obtained by polysomnography and by in-home portable recording in untreated mild to moderate obstructive sleep apnea (OSA) patients at a four- to six-month interval. Methods: Thirty-seven adult patients serving as placebo controls underwent a baseline polysomnography and in-home sleep study followed by a second set of studies under the same conditions. The polysomnography studies were acquired and scored at three independent American Academy of Sleep Medicine accredited sleep laboratories. The in-home studies were acquired by the patient and scored using validated auto-scoring algorithms. The initial in-home study was conducted on average two months prior to the first polysomnography; the follow-up polysomnography and in-home studies were conducted approximately five to six months after the initial polysomnography. Results: When comparing the test-retest Apnea-Hypopnea Index (AHI) and apnea index (AI), the in-home results were more highly correlated (r = 0.65 and 0.68) than the comparable polysomnography results (r = 0.56 and 0.58). The in-home results provided approximately 50% less test-retest variability than the comparable polysomnography AHI and AI values. Both the overall polysomnography AHI and AI showed a substantial bias toward increased severity upon retest (8 and 6 events/h, respectively), while the in-home bias was essentially zero. The in-home percentage of time supine showed a better correlation compared to polysomnography (r = 0.72 vs. 0.43). Patients were biased toward more time supine during the initial polysomnography; no trends in time supine for the in-home studies were noted. Conclusion: Night-to-night variability in sleep-disordered breathing can be a confounding factor in assessing treatment outcomes. The sample size of this study was small given the night-to-night variability in OSA and the limited understanding of polysomnography reliability. We found that in-home studies provided a repeated measure of sleep-disordered breathing that was less variable than polysomnography. Investigators using polysomnography to assess treatment outcomes should factor in the increased variability and the bias toward increased AHI values upon retest to ensure the study is adequately powered.
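The abstract's point about variability and power can be made concrete with a standard paired-design sample-size formula: the required number of subjects grows with the square of the test-retest standard deviation, so a less variable measure (here, the in-home recordings) directly shrinks the study. A minimal sketch; the numbers are illustrative only, not values taken from the study.

```python
from math import ceil
from statistics import NormalDist

def paired_n(sd_diff: float, delta: float,
             alpha: float = 0.05, power: float = 0.8) -> int:
    """Subjects needed to detect a mean paired change `delta`
    (e.g. in AHI events/h), given the SD of within-subject
    test-retest differences `sd_diff`; normal approximation."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided alpha
    z_b = NormalDist().inv_cdf(power)
    return ceil(((z_a + z_b) * sd_diff / delta) ** 2)

# Illustrative numbers only: halving relative variability roughly
# halves the required sample size (n scales with sd_diff**2).
print(paired_n(sd_diff=10, delta=5))  # → 32 (more variable measure)
print(paired_n(sd_diff=7, delta=5))   # → 16 (less variable measure)
```

The quadratic dependence on `sd_diff` is the abstract's practical message: underestimating night-to-night variability leaves a trial underpowered.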
Distribution of Abundant and Active Planktonic Ciliates in Coastal and Slope Waters Off New England
Despite their important role in linking microbial and classic marine food webs, data on biogeographical patterns of microbial eukaryotic grazers are limited, and even fewer studies have used molecular tools to assess active (i.e., those expressing genes) community members. Marine ciliate diversity is believed to be greatest at the chlorophyll maximum, where there is an abundance of autotrophic prey, and is often assumed to decline with depth. Here, we assess the abundant (DNA) and active (RNA) marine ciliate communities throughout the water column at two stations off the New England coast (Northwest Atlantic)—a coastal station 43 km from shore (40 m depth) and a slope station 135 km off shore (1,000 m). We analyze ciliate communities using a DNA fingerprinting technique, Denaturing Gradient Gel Electrophoresis (DGGE), which captures patterns of abundant community members. We compare estimates of ciliate communities from SSU-rDNA (abundant) and SSU-rRNA (active) and find complex patterns throughout the water column, including many active lineages below the photic zone. Our analyses reveal (1) a number of widely distributed taxa that are both abundant and active; (2) considerable heterogeneity in patterns of presence/absence of taxa in offshore samples taken 50 m apart throughout the water column; and (3) three distinct ciliate assemblages based on position from shore and depth. Analysis of active (RNA) taxa uncovers biodiversity hidden from traditional DNA-based approaches (e.g., clone library and rDNA amplicon studies).
Motor vehicle accidents in patients with an implantable cardioverter-defibrillator
Objectives: This study was designed to examine driving safety in patients at risk for sudden death after implantation of a cardioverter-defibrillator. Background: Cardioverter-defibrillators are frequently implanted in patients at high risk for sudden death. Despite concern about the safety of driving in these patients, little is known about their actual motor vehicle accident rates. Methods: Surveys were sent to all 742 physicians in the United States involved in cardioverter-defibrillator implantation and follow-up. Physicians were questioned about the numbers of patients followed up, the numbers of fatal and nonfatal accidents, physician recommendations to patients about driving, and knowledge of state driving laws. Results: Surveys were returned by 452 physicians (61%). A total of 30 motor vehicle accidents related to shocks from implantable defibrillators were reported by 25 physicians over a 12-year period from 1980 to 1992. Of these, nine were fatal accidents involving eight patients with a defibrillator and one passenger in a car driven by a patient. No bystanders were fatally injured. There were 21 nonfatal accidents involving 15 patients, 3 passengers, and 3 bystanders. The estimated fatality rate for patients with a defibrillator, 7.5/100,000 patient-years, is significantly lower than that for the general population (18.4/100,000 patient-years, p < 0.05). The estimated injury rate, 17.6/100,000 patient-years, is also significantly lower than that for the general public (2,224/100,000 patient-years, p < 0.05). Only 10.5% (30 of 286) of all defibrillator discharges during driving resulted in accidents. Regarding physician recommendations, most physicians (58.1%) ask their patients to wait a mean (± SD) of 7.3 ± 3.4 months after implantation or a shock before driving again. Conclusions: The motor vehicle accident rate caused by discharge from an implantable cardioverter-defibrillator is low. Although restricting driving for a short period of time after implantation may be appropriate, excessive restrictions or a total ban on driving appears to be unwarranted.
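As a quick check on the abstract's arithmetic, a rate per 100,000 patient-years is simply events divided by exposure, scaled. A minimal sketch; note that the 120,000 patient-years denominator below is back-solved from the reported 9 fatalities at 7.5/100,000 and is an assumption, not a figure stated in the abstract.

```python
def rate_per_100k(events: int, patient_years: float) -> float:
    """Event rate per 100,000 patient-years of follow-up."""
    return events / patient_years * 100_000

# Assumed exposure: 9 fatal accidents at 7.5/100,000 patient-years
# implies roughly 120,000 patient-years of follow-up (back-solved,
# not reported directly in the abstract).
print(rate_per_100k(9, 120_000))   # → 7.5
print(rate_per_100k(21, 120_000))  # ≈ 17.5, near the reported 17.6
```

The same denominator reproduces both reported rates to within rounding, which is a useful sanity check when comparing them against the general-population figures.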