76 research outputs found
Expression and Purification of a Cleavable Recombinant Fortilin from Escherichia coli for Structure Activity Studies
Complications related to atherosclerosis account for approximately 1 in 4 deaths in the United States, and treatment has focused on lowering serum LDL-cholesterol levels with statins. However, approximately 50% of those diagnosed with atherosclerosis have blood cholesterol levels within normal parameters. Human fortilin is an anti-apoptotic protein and a factor in macrophage-mediated atherosclerosis; it is hypothesized to protect inflammatory macrophages from apoptosis, leading to subsequent cardiac pathogenesis. Fortilin is unique in that it provides a novel drug target for atherosclerosis that goes beyond lowering cholesterol, but a solution nuclear magnetic resonance (NMR) spectroscopy, structure-based drug discovery approach requires milligram quantities of pure, bioactive, recombinant fortilin. Here, we designed expression constructs with different affinity tags and protease cleavage sites to find optimal conditions for obtaining protein of the quantity and purity necessary for structure-activity relationship studies. Plasmids encoding fortilin with maltose binding protein (MBP), 6-histidine (6His), and glutathione-S-transferase (GST) N-terminal affinity tags were expressed and purified from Escherichia coli (E. coli). Cleavage sites for tobacco etch virus (TEV) protease and human rhinovirus (HRV) 3C protease were assessed. Despite high levels of expression of soluble protein, the fusion constructs were resistant to the proteases without the inclusion of additional amino acids between the cleavage site and the fortilin N-terminus. We surveyed constructs with increasing lengths of glycine/serine (GGS) linkers between the cleavage site and fortilin and found that inclusion of at least one GGS insert led to successful protease cleavage and pure fortilin with conserved calcium binding, as measured by NMR.
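The linker survey described above can be illustrated with a minimal sketch. The two recognition motifs below are the standard TEV and HRV 3C cleavage sequences; the helper function and construct names are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: enumerating (GGS)n linker variants placed between a
# protease cleavage site and the fortilin N-terminus, as in the survey above.
# Motifs are the standard recognition sequences; names here are invented.

TEV_SITE = "ENLYFQG"     # tobacco etch virus protease recognition motif
HRV3C_SITE = "LEVLFQGP"  # human rhinovirus 3C protease recognition motif

def linker_variants(site: str, max_repeats: int = 3) -> dict:
    """Return cleavage-site + (GGS)n peptides for n = 0 .. max_repeats."""
    return {f"(GGS)x{n}": site + "GGS" * n for n in range(max_repeats + 1)}

variants = linker_variants(TEV_SITE)
for name, peptide in variants.items():
    print(name, peptide)  # e.g. (GGS)x1 ENLYFQGGGS
```

The (GGS)x0 variant corresponds to the protease-resistant construct with no spacer; (GGS)x1 and longer match the constructs that cleaved successfully.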
Instrumentation for Routine Analysis of Acrylamide in French Fries: Assessing Limitations for Adoption
The purpose of this experimental review was to detect acrylamide in French fries using the methods most adaptable to the food process industry for quality control assessment of products. French fries were prepared at different cook times using the same fryer oil over a five-day period to assess the influence of oil degradation and to monitor trends in acrylamide formation. Acrylamide detection was performed using LC-MS, GC-MS, and FT-NIR. The low levels of acrylamide produced during frying, the low molecular weight of the analyte, and the complexity of the potato matrix make routine acrylamide measurement challenging even in a well-outfitted analytical lab with trained personnel. The findings of this study are presented as the pros and cons of each acrylamide measurement method, in enough detail for food processors to appraise the method that may work best for them based on their available instrumentation and extent of personnel training.
Physical science research needed to evaluate the viability and risks of marine cloud brightening
Marine cloud brightening (MCB) is the deliberate injection of aerosol particles into shallow marine clouds to increase their reflection of solar radiation and reduce the amount of energy absorbed by the climate system. From the physical science perspective, the consensus of a broad international group of scientists is that the viability of MCB will ultimately depend on whether observations and models can robustly assess the scale-up of local-to-global brightening in today's climate and identify strategies that will ensure an equitable geographical distribution of the benefits and risks associated with projected regional changes in temperature and precipitation. To address the physical science knowledge gaps required to assess the societal implications of MCB, we propose a substantial and targeted program of research: field and laboratory experiments, monitoring, and numerical modeling across a range of scales.
2023 State of Open at the University of Colorado Boulder: An Update on Open Access Practices Based on Data from 2022
Using data from 2022, this report is the fifth annual update to the “State of Open at the University of Colorado Boulder: A Baseline Analysis of Open Access Practices from 2012 to 2018”: https://doi.org/10.25810/vprn-v113. It includes analyses of open access (OA) article publishing activities, OA repository usage, and data publishing practices by researchers at the University of Colorado Boulder (CU Boulder). Data used to produce this report can be found here: https://doi.org/10.25810/ktb4-ce48
Key findings from this report include:
72% of articles published in 2022 by CU Boulder authors are available via some type of OA (Gold, Green, Hybrid, or Bronze) (up from 62% at the time of the 2021 report);
In 2022, the CU Boulder Libraries OA Fund funded author fees totaling $89,761 for 53 journal articles (down from 2021); however, these decreases have more to do with the OA Fund being exhausted earlier in the fiscal year than with an actual decrease in funding;
At the end of 2022, there were 16,090 OA items in the CU Scholar institutional repository (up from 13,791 in 2021), and these items were downloaded a total of 36,730 times in 2022 (down from 39,393 in 2021);
In the annual Faculty Report of Professional Activities (FRPA), faculty reported 56 published data sets in 2022 (down from 92 in 2021) with 87.5% of these citations including Digital Object Identifiers (DOIs) (up from 82.6% in 2021) and 95% of these citations identifying a formal data repository (same as 95% in 2021);
The Libraries and its partners registered 335 DataCite DOIs for published data sets in 2022 (down from 416 in 2021);
This is the first year there has been a decrease in either the number of reported published data sets in FRPA or the number of DataCite DOIs registered for published data sets, so it will be important to monitor these numbers in the coming years to see if this is an anomaly or the start of a new trend.
Robust Single-view Cone-beam X-ray Pose Estimation with Neural Tuned Tomography (NeTT) and Masked Neural Radiance Fields (mNeRF)
Many tasks performed in image-guided, minimally invasive medical procedures can be cast as pose estimation problems, in which an X-ray projection is used to reach a target in 3D space. Expanding on recent advances in the differentiable rendering of optically reflective materials, we introduce new methods for pose estimation of radiolucent objects using X-ray projections, and we demonstrate the critical role of optimal view synthesis in performing this task. We first develop an algorithm (DiffDRR) that efficiently computes Digitally Reconstructed Radiographs (DRRs) and leverages automatic differentiation within TensorFlow. Pose estimation is performed by iterative gradient descent using a loss function that quantifies the similarity between the DRR synthesized from a randomly initialized pose and the true fluoroscopic image at the target pose. We propose two novel methods for high-fidelity view synthesis, Neural Tuned Tomography (NeTT) and masked Neural Radiance Fields (mNeRF). Both methods rely on classic Cone-Beam Computerized Tomography (CBCT); NeTT directly optimizes the CBCT densities, while the non-zero values of mNeRF are constrained by a 3D mask of the anatomic region segmented from CBCT. We demonstrate that both NeTT and mNeRF distinctly improve pose estimation within our framework. Defining a successful pose estimate as a 3D angle error of less than 3 degrees, we find that NeTT and mNeRF achieve similar results, both with overall success rates of more than 93%. However, the computational cost of NeTT is significantly lower than that of mNeRF in both training and pose estimation. Furthermore, we show that a NeTT trained for a single subject can generalize to synthesize high-fidelity DRRs and ensure robust pose estimation for all other subjects. We therefore suggest that NeTT is an attractive option for robust pose estimation using fluoroscopic projections.
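The iterative gradient-descent loop described in this abstract can be sketched with a toy stand-in for the renderer. This is not the paper's DiffDRR: here "rendering" is just a 2D rotation of landmark points and the gradient is computed by finite differences, but the structure (synthesize at the current pose, score similarity against the target image, step downhill) is the same.

```python
import math

# Toy sketch of pose estimation by iterative gradient descent: minimize an
# image-similarity-style loss between a projection synthesized at the current
# pose and one rendered at the (unknown) target pose. The real method
# differentiates through DRR synthesis; this stand-in rotates 2D landmarks.

POINTS = [(1.0, 0.0), (0.0, 1.0), (0.7, 0.7)]

def render(angle):
    """Stand-in renderer: rotate landmark points by the pose angle."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in POINTS]

def loss(angle, target_img):
    """Sum of squared differences between synthesized and target projections."""
    img = render(angle)
    return sum((a - b) ** 2 + (c - d) ** 2
               for (a, c), (b, d) in zip(img, target_img))

target_angle = math.radians(25.0)
target_img = render(target_angle)

angle, lr, eps = 0.0, 0.1, 1e-6       # "randomly initialized" pose, step size
for _ in range(200):                   # iterative gradient descent
    grad = (loss(angle + eps, target_img)
            - loss(angle - eps, target_img)) / (2 * eps)
    angle -= lr * grad

print(math.degrees(angle))  # converges to ~25.0, recovering the target pose
```

In the paper's setting the loss is evaluated against a real fluoroscopic image and gradients flow through the DRR renderer automatically; the success criterion quoted above (3D angle error below 3 degrees) would here correspond to |angle − target_angle| < 3°.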
Comparison of Nanotrap® Microbiome A Particles, membrane filtration, and skim milk workflows for SARS-CoV-2 concentration in wastewater
Introduction: Severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) RNA monitoring in wastewater has become an important tool for Coronavirus Disease 2019 (COVID-19) surveillance. Grab (quantitative) and passive (qualitative) samples are two distinct wastewater sampling methods. Although many viral concentration methods, such as membrane filtration and skim milk, have been reported, these methods generally require large volumes of wastewater, expensive lab equipment, and laborious processes.
Methods: The objectives of this study were to compare two workflows (Nanotrap® Microbiome A Particles coupled with the MagMax kit, and membrane filtration coupled with the RNeasy kit) for SARS-CoV-2 recovery in grab samples, and two workflows (Nanotrap® Microbiome A Particles and skim milk, each coupled with the MagMax kit) for SARS-CoV-2 recovery in Moore swab samples. The Nanotrap particle workflow was initially evaluated with and without the addition of enhancement reagent 1 (ER1) in 10 mL of wastewater. RT-qPCR targeting the nucleocapsid protein was used to detect SARS-CoV-2 RNA.
Results: Adding ER1 to wastewater prior to viral concentration significantly improved viral concentration results (P < 0.0001) in 10 mL grab and swab samples processed by either the automated or the manual Nanotrap workflow. SARS-CoV-2 concentrations in 10 mL grab and Moore swab samples with ER1 processed by the automated workflow were significantly higher (P < 0.001) than those in 150 mL grab samples using the membrane filtration workflow and 250 mL swab samples using the skim milk workflow, respectively. Spiking known genome copies (GC) of inactivated SARS-CoV-2 into 10 mL of wastewater indicated that the limit of detection of the automated Nanotrap workflow was ~11.5 GC/mL using RT-qPCR and 115 GC/mL using digital PCR.
Discussion: These results suggest that the Nanotrap workflows could substitute for the traditional membrane filtration and skim milk workflows for viral concentration without compromising assay sensitivity. The manual workflow can be used in resource-limited areas, and the automated workflow is appropriate for large-scale COVID-19 wastewater-based surveillance.
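The detection limits reported above are per-mL values for a fixed 10 mL input; a quick back-of-envelope conversion gives the total genome copies that must be present in the processed sample. The function below is an illustrative helper, not from the study.

```python
# Illustrative arithmetic: convert a per-mL limit of detection (LOD) to the
# total genome copies (GC) present in the processed sample volume.

SAMPLE_VOLUME_ML = 10.0  # volume used in the Nanotrap workflow above

def total_copies_at_lod(lod_gc_per_ml, volume_ml=SAMPLE_VOLUME_ML):
    """Total GC in the sample when the concentration equals the LOD."""
    return lod_gc_per_ml * volume_ml

print(total_copies_at_lod(11.5))  # RT-qPCR LOD: 115.0 GC per 10 mL sample
print(total_copies_at_lod(115))   # digital PCR LOD: 1150.0 GC per 10 mL sample
```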
REFORMS: Reporting Standards for Machine Learning Based Science
Machine learning (ML) methods are proliferating in scientific research. However, the adoption of these methods has been accompanied by failures of validity, reproducibility, and generalizability. These failures can hinder scientific progress, lead to false consensus around invalid claims, and undermine the credibility of ML-based science. ML methods are often applied and fail in similar ways across disciplines. Motivated by this observation, our goal is to provide clear reporting standards for ML-based science. Drawing on an extensive review of past literature, we present the REFORMS checklist (Reporting Standards for Machine Learning Based Science). It consists of 32 questions and a paired set of guidelines. REFORMS was developed based on a consensus of 19 researchers across computer science, data science, mathematics, social sciences, and biomedical sciences. REFORMS can serve as a resource for researchers when designing and implementing a study, for referees when reviewing papers, and for journals when enforcing standards for transparency and reproducibility.