Cheminformatics Analysis and Computational Modeling of Detergent-Sensitive Aggregation
Small molecule aggregates cause detergent-reversible protein sequestration and are the most prevalent source of nonspecific activity in biochemical screening assays. Large volumes of publicly available dose-response screens performed in the presence or absence of detergent have enabled cheminformatics analyses of chemical aggregation, reinforcing prior observations that aggregation is prevalent and context dependent. We report the development of random forest classifiers, trained on screens against β-lactamase or cruzain targets under well-defined assay conditions, that distinguish putative aggregators from non-aggregators with balanced accuracies as high as 78%. These models overcome limitations of existing computational predictors related to programmatic errors, blurred modeling endpoints, and poor external predictivity. Model interpretation indicated that polarity, aliphaticity, and molecular weight are significantly correlated with aggregation propensity, although these features alone estimate behavior with under 70% accuracy. Our curated datasets and validated models will help identify potential aggregators and reduce resource waste during drug discovery and optimization.
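The balanced-accuracy metric quoted above (78%) is the mean of per-class recall, which weights aggregators and non-aggregators equally even when the screening data are imbalanced. A minimal sketch of the metric itself, with toy labels that are illustrative and not from the study:

```python
# Balanced accuracy = (sensitivity + specificity) / 2, with
# 1 = putative aggregator, 0 = non-aggregator. Each class contributes
# equally regardless of how many compounds it holds in the screen.

def balanced_accuracy(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    pos = sum(1 for t in y_true if t == 1)   # actual aggregators
    neg = len(y_true) - pos                  # actual non-aggregators
    sensitivity = tp / pos                   # aggregator recall
    specificity = tn / neg                   # non-aggregator recall
    return (sensitivity + specificity) / 2

# Toy example: 3 of 4 aggregators and 1 of 2 non-aggregators classified correctly.
print(balanced_accuracy([1, 1, 1, 1, 0, 0], [1, 1, 1, 0, 0, 1]))  # 0.625
```

A plain accuracy on the same toy labels would be 4/6 ≈ 0.667, illustrating why the balanced variant is the more honest report when one class dominates.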
More Than Just Middlemen: The Legacy and Influence of Art Dealers Joseph Duveen, Peggy Guggenheim and Leo Castelli on Shaping Art Collections
The purpose of this study is to examine whether art gallerists are replaceable in the current climate, in which the plea for removing the middlemen has been growing. The speed and ease of art transactions through digital platforms provide an alternative to the relationship-based, in-person elements of the art world. Before the pandemic, the art market was seen as notoriously opaque, and gallerists were stereotyped as middlemen who take high commissions from art sales. However, art gallerists have played an important role throughout art history, not only buying and selling works of art like traders, but also shaping the art historical narrative by championing important artists. They influence the tastes of major collectors and place art into the most significant museum collections to preserve art for the next generation. By tracing its roots within the history of art dealing, this thesis focuses on the influence and legacy of three different art dealers, Joseph Duveen, Peggy Guggenheim and Leo Castelli, to examine whether gallerists are replaceable or indispensable.
Agricultural science, plant breeding and the emergence of a Mendelian system in Britain, 1880-1930
Following Thomas P. Hughes’s systems approach in the history of technology, and making use of previously unexamined sources, this dissertation seeks to show that the development of British Mendelism may be explained, and the success it enjoyed more accurately gauged, by analysing the emergence of a system whose elements justified the theory, protected it, made it useful, and slowly territorialized the world. Accordingly, the analysis will cover the principal elements of this system: the system builders, institutes, ideas and varieties that were, in one way or another, Mendelian. The first of the Mendelian system builders, William Bateson, is already well known for his introduction of Mendelism to Britain in the years after 1901 and his coinage of a new name for the discipline: genetics. He was joined by two colleagues, Rowland Biffen and Thomas Wood, both of whom collaborated with Bateson in creating a string of institutes concerned with changing agriculture by using the new Mendelian theory. The proponents of the new theory often spoke of their newfound ability to transfer characters and build up new varieties of agricultural value. These claims were welcomed by politicians and the popular press, and the idea that the new genetics would lead to a beneficial revolution in agriculture became a popular cause of the day. However, the release of the first of these new Mendelian varieties in Britain in 1910 is far less well known than the almost simultaneous development of the chromosome theory at Columbia University by Thomas Hunt Morgan. On one view of the history of genetics, the discipline, which had been born in Moravia and popularised in Britain, was from 1910 most fruitfully developed in Morgan’s fly room. From this perspective it might be thought that the British school, under Bateson, became a disciplinary backwater, at least in part because Bateson refused to accept chromosome theory. This thesis argues that, far from being in a genetic backwater, Bateson and his Mendelian allies Biffen and Wood were at the cutting edge of a wide-ranging movement to improve agriculture through the introduction of new Mendelian varieties.
Making Informed Choices about Microarray Data Analysis
This article describes the typical stages in the analysis of microarray data for non-specialist researchers in systems biology and medicine. Particular attention is paid to significant data analysis issues that are commonly encountered among practitioners, some of which need wider airing. The issues addressed include experimental design, quality assessment, normalization, and summarization of multiple-probe data. This article is based on the ISMB 2008 tutorial on microarray data analysis. An expanded version of the material in this article and the slides from the tutorial can be found at http://www.people.vcu.edu/~mreimers/OGMDA/index.html
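One of the stages listed above, normalization, is often done by quantile normalization, which forces every array to share the same empirical intensity distribution. The following is an illustrative sketch of that general technique (assuming no tied intensities within an array), not code from the tutorial:

```python
# Quantile normalization: replace each value with the mean of the values
# holding the same rank across all arrays, so every array ends up with an
# identical distribution. Assumes no ties within an array.

def quantile_normalize(arrays):
    n = len(arrays[0])
    sorted_cols = [sorted(a) for a in arrays]
    # Mean intensity at each rank, pooled over all arrays.
    rank_means = [sum(col[i] for col in sorted_cols) / len(arrays)
                  for i in range(n)]
    normalized = []
    for a in arrays:
        order = sorted(range(n), key=lambda i: a[i])  # indices, ascending by value
        out = [0.0] * n
        for rank, idx in enumerate(order):
            out[idx] = rank_means[rank]
        normalized.append(out)
    return normalized

# Two hypothetical three-probe arrays; after normalization both contain
# the same set of values {1.5, 3.5, 5.5}, in rank-preserving order.
print(quantile_normalize([[5.0, 2.0, 3.0], [4.0, 1.0, 6.0]]))
# [[5.5, 1.5, 3.5], [3.5, 1.5, 5.5]]
```

The key property is that within-array rank order is preserved while between-array distributional differences are removed, which is why it is a popular default for multi-array comparisons.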
At the Locus of Performance: A Case Study in Enhancing CPUs with Copious 3D-Stacked Cache
Over the last three decades, innovations in the memory subsystem have primarily targeted the data movement bottleneck. In this paper, we focus on a specific trend in memory technology: 3D-stacked memory and caches. We investigate the impact of extending the on-chip memory capabilities of future HPC-focused processors, particularly via 3D-stacked SRAM. First, we propose a method, oblivious to the memory subsystem, to gauge the upper bound on performance improvements when data movement costs are eliminated. Then, using the gem5 simulator, we model two variants of LARC, a processor fabricated in 1.5 nm and enriched with high-capacity 3D-stacked cache. With a large volume of experiments involving a broad set of proxy applications and benchmarks, we aim to reveal where HPC CPU performance could be circa 2028, and conclude that cache-sensitive HPC applications gain an average boost of 9.77x on a per-chip basis. Additionally, we exhaustively document our methodological exploration to motivate HPC centers to drive their own technological agenda through enhanced co-design.
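The upper-bound idea in this abstract, the speedup attainable if data movement costs vanish, can be illustrated with a simple Amdahl-style bound. The profile fractions below are hypothetical, and this sketch is our illustration of the general reasoning, not the paper's actual method:

```python
# Amdahl-style upper bound: if a fraction f of runtime is spent on data
# movement and that cost drops to zero, the remaining compute time caps
# the speedup at 1 / (1 - f), no matter how good the memory subsystem gets.

def upper_bound_speedup(data_movement_fraction):
    compute_fraction = 1.0 - data_movement_fraction
    return 1.0 / compute_fraction

# Hypothetical application profiles: 50%, 80%, and 90% of time moving data.
for f in (0.5, 0.8, 0.9):
    print(f"data movement {f:.0%} of runtime -> at most {upper_bound_speedup(f):.1f}x")
```

The bound grows sharply as the data-movement fraction approaches one, which is why memory-bound (cache-sensitive) applications are the ones with the most headroom from larger on-chip caches.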