Niacin therapy and the risk of new-onset diabetes: a meta-analysis of randomized controlled trials
Objective Previous studies have suggested that niacin treatment raises glucose levels in patients with diabetes and may increase the risk of developing diabetes. We undertook a meta-analysis of published and unpublished data from randomised trials to determine whether an association exists between niacin and new-onset diabetes.
Methods We searched Medline, EMBASE and the Cochrane Central Register of Controlled Trials, from 1975 to 2014, for randomised controlled trials of niacin primarily designed to assess its effects on cardiovascular endpoints and cardiovascular surrogate markers. We included trials with ≥50 non-diabetic participants and average follow-up of ≥24 weeks. Published data were tabulated and unpublished data sought from investigators. We calculated risk ratios (RR) for new-onset diabetes with random-effects meta-analysis. Heterogeneity between trials was assessed using the I² statistic.
Results In 11 trials with 26 340 non-diabetic participants, 1371 (725/13 121 assigned niacin; 646/13 219 assigned control) were diagnosed with diabetes during a weighted mean follow-up of 3.6 years. Niacin therapy was associated with a RR of 1.34 (95% CI 1.21 to 1.49) for new-onset diabetes, with limited heterogeneity between trials (I²=0.0%, p=0.87). This equates to one additional case of diabetes per 43 (95% CI 30 to 70) initially non-diabetic individuals treated with niacin for 5 years. Results were consistent regardless of whether participants received background statin therapy (p for interaction=0.88) or combination therapy with laropiprant (p for interaction=0.52).
Conclusions Niacin therapy is associated with a moderately increased risk of developing diabetes regardless of background statin or combination laropiprant therapy.
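The pooled RR and I² reported above come from a random-effects meta-analysis of per-trial event counts. As a minimal illustration of that procedure, here is a DerSimonian-Laird sketch in Python; the trial counts passed to it are invented toy numbers, since the abstract reports only pooled totals:

```python
import math

def dersimonian_laird(events_t, n_t, events_c, n_c):
    """Pooled risk ratio via DerSimonian-Laird random-effects meta-analysis,
    with the I^2 heterogeneity statistic. Assumes nonzero event counts
    (continuity corrections omitted for brevity)."""
    y, v = [], []  # per-trial log risk ratios and their variances
    for et, nt, ec, nc in zip(events_t, n_t, events_c, n_c):
        y.append(math.log((et / nt) / (ec / nc)))
        # large-sample variance of the log risk ratio
        v.append(1 / et - 1 / nt + 1 / ec - 1 / nc)
    w = [1 / vi for vi in v]                       # inverse-variance weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-trial variance
    w_re = [1 / (vi + tau2) for vi in v]           # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0
    return math.exp(mu), (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se)), i2
```

For example, `dersimonian_laird([30, 40], [1000, 1500], [20, 30], [1000, 1500])` pools two hypothetical trials into a single RR with its 95% CI; the "one additional case per 43 treated" figure in the abstract is the corresponding number needed to harm derived from the pooled estimate and baseline risk.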
Advancing an Organizational Health Perspective for Insider Threat Prevention and Management
Malicious insiders pose a serious risk to valued organizational assets, including proprietary information, institutional processes, personnel, finances, reputation, and firm connections. Research-based solutions for predicting, detecting, and mitigating insider threats have focused heavily on individual, organizational, and cyber risk factors (Kont et al. 2015; Greitzer et al. 2018). Indeed, scholars have increasingly recognized that people’s personalities, motivations, grievances, and work stressors raise the risk of insider threat events, and the corresponding interventional strategies involve cybersecurity and work design practices to safeguard the organization against human error and deviance (Homoliak et al. 2019; Greitzer et al. 2013; Maasberg, Warren, and Beebe 2015). Yet, despite evidence that insider threat events are perpetrated by people situated within a social and organizational context, discussions of insider threat have only started to recognize the importance of socio-organizational protective factors for reducing the occurrence of insider threats (Moore, Gardner, and Rousseau 2022; Whitty 2021). We argue that a healthy organization—an organization whose people, practices, and policies effectively sustain its survival and performance—may be key to preventing and managing insider threats.
Quantitative assessment of the use of modified nucleoside triphosphates in expression profiling: differential effects on signal intensities and impacts on expression ratios
BACKGROUND: The power of DNA microarrays derives from their ability to monitor the expression levels of many genes in parallel. One of the limitations of such powerful analytical tools is the inability to detect certain transcripts in the target sample because of artifacts caused by background noise or poor hybridization kinetics. The use of base-modified analogs of nucleoside triphosphates has been shown to increase complementary duplex stability in other applications, and here we attempted to enhance microarray hybridization signal across a wide range of sequences and expression levels by incorporating these nucleotides into labeled cRNA targets. RESULTS: RNA samples containing 2-aminoadenosine showed increases in signal intensity for a majority of the sequences. These results were similar, and additive, to those seen with an increase in the hybridization time. In contrast, 5-methyluridine and 5-methylcytidine decreased signal intensities. Hybridization specificity, as assessed by mismatch controls, was dependent on both target sequence and extent of substitution with the modified nucleotide. Concurrent incorporation of modified and unmodified ATP in a 1:1 ratio resulted in significantly greater numbers of above-threshold ratio calls across tissues, while preserving ratio integrity and reproducibility. CONCLUSIONS: Incorporation of 2-aminoadenosine triphosphate into cRNA targets is a promising method for increasing signal detection in microarrays. Furthermore, this approach can be optimized to minimize impact on yield of amplified material and to increase the number of expression changes that can be detected.
Active Transport of Peptides Across the Intact Human Tympanic Membrane.
We previously identified peptides that are actively transported across the intact tympanic membrane (TM) of rats with infected middle ears. To assess the possibility that this transport would also occur across the human TM, we first developed and validated an assay to evaluate transport in vitro using fragments of the TM. Using this assay, we demonstrated the ability of phage bearing a TM-transiting peptide to cross freshly dissected TM fragments from infected rats or from uninfected rats, guinea pigs and rabbits. We then evaluated transport across fragments of the human TM that were discarded during otologic surgery. Human trans-TM transport was similar to that seen in the animal species. Finally, we found that free peptide, unconnected to phage, was transported across the TM at a rate comparable to that seen for peptide-bearing phage. These studies provide evidence supporting the concept of peptide-mediated drug delivery across the intact TM and into the middle ears of patients.
Deterministic/Fragmented-Stochastic Exchange for Large Scale Hybrid DFT Calculations
We develop an efficient approach to evaluate range-separated exact exchange for grid- or plane-wave-based representations within the Generalized Kohn-Sham DFT (GKS-DFT) framework. The Coulomb kernel is fragmented in reciprocal space, and we employ a mixed deterministic-stochastic representation, retaining long-wavelength (low-wavevector) contributions deterministically and using a sparse ("fragmented") stochastic basis for the high-wavevector part. Coupled with a projection of the Hamiltonian onto a subspace of valence and conduction states from a prior local-DFT calculation, this method allows for the calculation of long-range exchange in large molecular systems with hundreds and potentially thousands of coupled valence states delocalized over millions of grid points. We find that even a small number of valence and conduction states is sufficient for converging the HOMO and LUMO energies of the GKS-DFT. Excellent tuning of long-range-separated hybrids (RSH) is easily obtained with the method for very large systems, as exemplified here for the chlorophyll hexamer of Photosystem II with 1,320 electrons.
Comment: 9 pages, 3 figures
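The fragmentation idea can be illustrated on a toy one-dimensional reciprocal-space grid. The Coulomb kernel is split at a cutoff wavevector; the low-wavevector fragment is kept deterministically, and the action of the high-wavevector remainder on a test vector is recovered on average from random-phase stochastic vectors. The grid, cutoff, and sample count below are arbitrary illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D reciprocal-space grid and Coulomb kernel v(q) = 4*pi/q^2
# (the q = 0 component is set to zero by convention).
q = 2 * np.pi * np.fft.fftfreq(64, d=0.5)
v = np.where(q != 0.0, 4 * np.pi / np.where(q != 0.0, q, 1.0) ** 2, 0.0)

# Fragment the kernel at a cutoff: the long-wavelength (low-|q|)
# fragment is kept deterministically, the remainder sampled stochastically.
qc = 2.0                                   # illustrative cutoff wavevector
v_low = np.where(np.abs(q) <= qc, v, 0.0)  # deterministic fragment
v_high = v - v_low                         # stochastically sampled fragment

# Random-phase vectors xi_q = sqrt(v_high(q)) * exp(i*theta_q) obey
# E[xi_q conj(xi_q')] = v_high(q) delta_{qq'}, so averaging xi (xi^dag f)
# recovers the action of the high-|q| kernel on any test vector f.
f = rng.standard_normal(q.size)
exact = v_high * f
est = np.zeros(q.size, dtype=complex)
n_samples = 4000
for _ in range(n_samples):
    xi = np.sqrt(v_high) * np.exp(2j * np.pi * rng.random(q.size))
    est += xi * np.vdot(xi, f)
est /= n_samples

rel_err = np.linalg.norm(est - exact) / np.linalg.norm(exact)
```

Because the deterministic fragment carries the long-range part of the kernel exactly, the statistical noise touches only the high-wavevector remainder, which is consistent with the abstract's use of a sparse stochastic basis for that part alone.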
The SIMPSONS project: An integrated Mars transportation system
In response to the Request for Proposal (RFP) for an integrated transportation system network for an advanced Martian base, Frontier Transportation Systems (FTS) presents the results of the SIMPSONS project (Systems Integration for Mars Planetary Surface Operations Networks). The following topics are included: the project background, vehicle design, future work, conclusions, management status, and cost breakdown. The project focuses solely on surface-to-surface transportation at an advanced Martian base.
A Case Study Exploring William & Mary Military Alums' Experiences with Career Transition
Background: The purpose of this qualitative study was to understand William & Mary (W&M) Master of Business Administration (MBA) military alums’ experiences with their career transitions post-graduation and the influence of the program on their experiences.
Methods: This study utilized a case study design, including focus groups and interviews with 21 alums from the last 10 years and formal, semi-structured interviews with six current faculty and staff. We applied Schlossberg’s (1981) framework of the individual’s situation, self, support, and strategies, analyzing the data thematically and iteratively reducing codes into themes.
Results: Demographic findings showed a lack of diversity, with few women, people of color, or LGBTQ+ participants. Our findings indicated that participants perceived the program’s impact on their transitions as positive. We identified themes around academic and career development resources, the impact of military service on transitions, and the offering of the Executive Partner Program. Our research recognized influential policies and networks and determined the importance of human networks. Lastly, we identified a lack of reported disabilities and the shared experiences of minorities.
Conclusions: We found that active-duty alums’ situations differed from those of veterans due to delayed transitions into the civilian workforce. Our participants described the different supports they used, such as the Center for Military Transition (CMT), the Executive Director of the CMT, the Graduate Career Management Center, and the Executive Partners Program. Networking and the use of transition coaches also benefited military alums.
A Simplified Faceted Approach To Information Retrieval for Reusable Software Classification
Software reuse is widely recognized as the most promising technique presently available for reducing the cost of software production. It is the adaptation or incorporation of previously developed software components, designs, or other software-related artifacts (e.g. test plans) into new software or software development regimes. Researchers and vendors have devoted considerable effort to the topic of software reuse. Most have focused on mechanisms to construct reusable software, but few have focused on the problem of discovering components or designs that meet specific needs. For software reuse to be successful, it must be perceived as less costly to discover a software component or related artifact that satisfies a given need than to develop one anew. Accordingly, this study describes a method to classify software components that meet a specified need.
Specifically, the purpose of the present research study is to provide a flexible system, comprised of a classification scheme and searcher system, entitled Guides-Search, in which processes can be retrieved by carrying out a structured dialogue with the user. The classification scheme provides both the structure of questions to be posed to the user and the set of possible answers to each question. The model is not an attempt to replace current structures; rather, it seeks to provide a conceptual and structural method to support the improvement of software reuse methodology.
The investigation focuses on the following goals and objectives for the classification scheme and searcher system: the classification will be flexible and extensible, but usable by the Searcher; the user will not be presented with a large number of questions; the user will never be required to answer a question not known to be germane to the query.
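The question-driven dialogue described above can be sketched as a filter over facet/value descriptors. The component catalog and facet names below are invented examples in the classic function/objects/medium style, not the Guides-Search scheme itself; a question counts as "germane" only while its possible answers still discriminate among the remaining candidates:

```python
# Invented component catalog: name -> {facet: value}. The facet names
# (function / objects / medium) follow the classic faceted-classification
# style; they are illustrative, not the Guides-Search scheme itself.
COMPONENTS = {
    "quicksort":   {"function": "sort",   "objects": "array", "medium": "memory"},
    "heapsort":    {"function": "sort",   "objects": "array", "medium": "memory"},
    "merge-files": {"function": "merge",  "objects": "lines", "medium": "file"},
    "grep-like":   {"function": "search", "objects": "lines", "medium": "file"},
}

def germane_facets(candidates):
    """Return only the facets whose values still differ among the
    remaining candidates: the questions still worth posing to the user."""
    values = {}
    for attrs in candidates.values():
        for facet, value in attrs.items():
            values.setdefault(facet, set()).add(value)
    return {f: vals for f, vals in values.items() if len(vals) > 1}

def refine(candidates, facet, answer):
    """Narrow the candidate set by one question/answer exchange."""
    return {name: attrs for name, attrs in candidates.items()
            if attrs.get(facet) == answer}
```

Once every remaining candidate shares the same facet values, `germane_facets` returns an empty dict and the dialogue stops, matching the stated goal that the user is never asked a question that cannot narrow the query.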
Optimized Attenuated Interaction: Enabling Stochastic Bethe-Salpeter Spectra for Large Systems
We develop an improved stochastic formalism for the Bethe-Salpeter equation, based on an exact separation of the effective interaction into two parts, the second of which is formally any translationally invariant interaction. When the fit of the exchange kernel to this interaction is optimized by stochastic sampling, the difference between the two becomes quite small. Then, in the main BSE routine, this small difference is stochastically sampled. The number of stochastic samples needed for an accurate spectrum is then largely independent of system size. While the method is formally cubic in scaling, the scaling prefactor is small due to the constant number of stochastic orbitals needed for the sampling.
Comment: 9 pages, 5 figures, 2 tables
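The efficiency claim, that few samples suffice when only a small difference is sampled, can be illustrated generically. In the sketch below, random matrices stand in for the BSE kernels and all sizes and sample counts are arbitrary: stochastically sampling only a small residual on top of a deterministic fit gives a far smaller statistical error than sampling the full operator with the same budget.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40

# A symmetric "kernel" A = fit + R: a cheap deterministic fit plus a
# small residual R left for stochastic sampling. These are random
# stand-ins, not actual BSE quantities.
fit = rng.standard_normal((n, n))
fit = (fit + fit.T) / 2
noise = rng.standard_normal((n, n))
R = 0.05 * (noise + noise.T) / 2          # residual is ~5% of the fit's scale
A = fit + R

f = rng.standard_normal(n)
exact = A @ f

def stochastic_apply(M, vec, n_samples):
    """Estimate M @ vec by averaging (M chi)(chi . vec) over Rademacher
    vectors chi, which satisfy E[chi chi^T] = I."""
    acc = np.zeros(len(vec))
    for _ in range(n_samples):
        chi = rng.choice([-1.0, 1.0], size=len(vec))
        acc += (M @ chi) * (chi @ vec)
    return acc / n_samples

n_samples = 500
# Sample only the small residual: deterministic fit plus sampled R.
est = fit @ f + stochastic_apply(R, f, n_samples)
# Naive alternative: sample the full kernel A with the same budget.
naive = stochastic_apply(A, f, n_samples)

rel_err = np.linalg.norm(est - exact) / np.linalg.norm(exact)
rel_err_naive = np.linalg.norm(naive - exact) / np.linalg.norm(exact)
```

The statistical error of the stochastic estimate scales with the magnitude of the operator being sampled, so shrinking that operator to a small residual (here by a deterministic fit, in the paper by the optimized attenuated interaction) is what keeps the required number of samples modest and roughly size-independent.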