64 research outputs found
funcX: A Federated Function Serving Fabric for Science
Exploding data volumes and velocities, new computational methods and
platforms, and ubiquitous connectivity demand new approaches to computation in
the sciences. These new approaches must enable computation to be mobile, so
that, for example, it can occur near data, be triggered by events (e.g.,
arrival of new data), be offloaded to specialized accelerators, or run remotely
where resources are available. They also require new design approaches in which
monolithic applications can be decomposed into smaller components that may in
turn be executed separately and on the most suitable resources. To address
these needs we present funcX---a distributed function as a service (FaaS)
platform that enables flexible, scalable, and high-performance remote function
execution. funcX's endpoint software can transform existing clouds, clusters,
and supercomputers into function serving systems, while funcX's cloud-hosted
service provides transparent, secure, and reliable function execution across a
federated ecosystem of endpoints. We motivate the need for funcX with several
scientific case studies, present our prototype design and implementation, show
optimizations that deliver throughput in excess of 1 million functions per
second, and demonstrate, via experiments on two supercomputers, that funcX can
scale to more than 130,000 concurrent workers.
Comment: Accepted to ACM Symposium on High-Performance Parallel and Distributed Computing (HPDC 2020). arXiv admin note: substantial text overlap with arXiv:1908.0490
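To make the function-serving model concrete, here is a minimal sketch of how a client might register and remotely execute a Python function through the funcX SDK. The endpoint UUID is a placeholder, the toy workload is invented for illustration, and the exact client calls (FuncXClient, register_function, run, get_result) may differ between SDK versions:

# Minimal sketch of remote function execution via a funcX endpoint.
# Assumes the funcX SDK is installed and an endpoint is available;
# the UUID below is a placeholder.
from funcx.sdk.client import FuncXClient

ENDPOINT_ID = "00000000-0000-0000-0000-000000000000"  # placeholder endpoint UUID

def estimate_pi(n_samples):
    # CPU-bound toy workload to ship to the remote endpoint.
    import random
    hits = sum(1 for _ in range(n_samples)
               if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * hits / n_samples

fxc = FuncXClient()                               # authenticates the user
func_id = fxc.register_function(estimate_pi)      # register the function once
task_id = fxc.run(1_000_000,                      # arguments are serialized and shipped
                  endpoint_id=ENDPOINT_ID,
                  function_id=func_id)
print(fxc.get_result(task_id))                    # fetch the result (may raise until the task completes)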
Energy-Efficient Databases Using Sweet Spot Frequencies
Database management systems (DBMS) are typically tuned for high performance and scalability. Nevertheless, carbon footprint and energy efficiency are becoming increasingly important concerns. Unfortunately, existing studies mainly present theoretical contributions but fall short of proposing practical techniques that administrators or query optimizers could use to increase the energy efficiency of a DBMS. This paper therefore explores the effect of so-called sweet spots, which are energy-efficient CPU frequencies, on the energy required to execute queries. From our findings, we derive the Sweet Spot Technique, which relies on identifying an energy-efficient sweet spot frequency and the optimal number of threads that together minimize energy consumption for a query or an entire database workload. The technique is simple, has a practical implementation, and leads to energy savings of up to 50% compared to using the nominal frequency and the maximum number of threads.
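In practice, a sweet spot can be found by a brute-force sweep: measure the energy a representative query needs at each candidate CPU frequency and thread count, then keep the cheapest combination. Below is a minimal sketch of such a sweep on Linux; it assumes Intel RAPL energy counters under /sys/class/powercap, the cpupower utility (run as root with the userspace governor), and a hypothetical PostgreSQL benchmark query standing in for the actual workload:

# Illustrative sweet-spot search over (CPU frequency, thread count) pairs.
# Assumes Linux with Intel RAPL counters and cpupower; the query runner is
# a hypothetical stand-in for the real DBMS workload.
import itertools
import subprocess

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # package energy counter

def read_energy_uj():
    with open(RAPL) as f:
        return int(f.read())

def run_query(threads):
    # Hypothetical stand-in: run the benchmark query with `threads` parallel workers.
    sql = (f"SET max_parallel_workers_per_gather = {threads}; "
           "SELECT count(*) FROM lineitem;")
    subprocess.run(["psql", "-d", "tpch", "-c", sql], check=True)

def energy_for(freq_khz, threads):
    # Pin the CPU frequency (requires root and the userspace governor).
    subprocess.run(["cpupower", "frequency-set", "-f", f"{freq_khz}kHz"], check=True)
    before = read_energy_uj()
    run_query(threads)
    return read_energy_uj() - before   # ignores counter wrap-around for brevity

frequencies = [1_200_000, 1_800_000, 2_400_000, 3_000_000]   # kHz, example values
thread_counts = [1, 2, 4, 8, 16]

sweet_spot = min(itertools.product(frequencies, thread_counts),
                 key=lambda cfg: energy_for(*cfg))
print("Sweet spot (frequency in kHz, threads):", sweet_spot)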
On a Linear Program for Minimum-Weight Triangulation
Minimum-weight triangulation (MWT) is NP-hard. It has a polynomial-time
constant-factor approximation algorithm, and a variety of effective polynomial-
time heuristics that, for many instances, can find the exact MWT. Linear
programs (LPs) for MWT are well-studied, but previously no connection was known
between any LP and any approximation algorithm or heuristic for MWT. Here we
show the first such connections: for an LP formulation due to Dantzig et al.
(1985): (i) the integrality gap is bounded by a constant; (ii) given any
instance, if the aforementioned heuristics find the MWT, then so does the LP.
Comment: To appear in SICOMP. Extended abstract appeared in SODA 201
Recognizing Treelike k-Dissimilarities
A k-dissimilarity D on a finite set X, |X| >= k, is a map from the set of
size k subsets of X to the real numbers. Such maps naturally arise from
edge-weighted trees T with leaf-set X: Given a subset Y of X of size k, D(Y) is
defined to be the total length of the smallest subtree of T with leaf-set Y.
In case k = 2, it is well-known that 2-dissimilarities arising in this way can
be characterized by the so-called "4-point condition". However, in case k > 2,
Pachter and Speyer recently posed the following question: Given an arbitrary
k-dissimilarity, how do we test whether this map comes from a tree? In this
paper, we provide an answer to this question, showing that for k >= 3 a
k-dissimilarity on a set X arises from a tree if and only if its restriction to
every 2k-element subset of X arises from some tree, and that 2k is the least
possible subset size to ensure that this is the case. As a corollary, we show
that there exists a polynomial-time algorithm to determine when a
k-dissimilarity arises from a tree. We also give a 6-point condition for
determining when a 3-dissimilarity arises from a tree, which is similar to the
aforementioned 4-point condition.
Comment: 18 pages, 4 figures
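For the familiar case k = 2, the 4-point condition mentioned above can be checked by brute force: for every quadruple of taxa, the largest of the three pairwise sums d(x,y)+d(z,w), d(x,z)+d(y,w), d(x,w)+d(y,z) must be attained at least twice. The following is a small illustrative sketch of this test, assuming the 2-dissimilarity is given as a dictionary keyed by unordered pairs; the paper's 6-point and 2k-point conditions generalize the same idea:

# Brute-force check of the 4-point condition for a 2-dissimilarity given as a
# dict mapping unordered pairs of taxa (frozensets) to non-negative values.
from itertools import combinations

def satisfies_four_point_condition(d, taxa, tol=1e-9):
    # For every quadruple of taxa, the maximum of the three pairwise sums
    # below must be attained at least twice (the 4-point condition).
    def dist(a, b):
        return d[frozenset((a, b))]

    for x, y, z, w in combinations(taxa, 4):
        sums = sorted([dist(x, y) + dist(z, w),
                       dist(x, z) + dist(y, w),
                       dist(x, w) + dist(y, z)])
        if sums[2] - sums[1] > tol:        # maximum attained only once
            return False
    return True

# Example: the quartet tree with leaves x, z below one internal node and
# y, w below the other, all five edges of length 1.
taxa = ["x", "y", "z", "w"]
d = {frozenset(p): v for p, v in {
    ("x", "y"): 3, ("x", "z"): 2, ("x", "w"): 3,
    ("y", "z"): 3, ("y", "w"): 2, ("z", "w"): 3,
}.items()}
print(satisfies_four_point_condition(d, taxa))   # True: these values come from a tree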
Neighborhoods of trees in circular orderings
In phylogenetics, a common strategy used to construct an evolutionary tree for a set of species X is to search in the space of all such trees for one that optimizes some given score function (such as the minimum evolution, parsimony or likelihood score). As this can be computationally intensive, it was recently proposed to restrict such searches to the set of all those trees that are compatible with some circular ordering of the set X. To inform the design of efficient algorithms to perform such searches, it is therefore of interest to find bounds for the number of trees compatible with a fixed ordering in the neighborhood of a tree that is determined by certain tree operations commonly used to search for trees: the nearest neighbor interchange (nni), the subtree prune and regraft (spr) and the tree bisection and reconnection (tbr) operations. We show that the size of such a neighborhood of a binary tree associated with the nni operation is independent of the tree’s topology, but that this is not the case for the spr and tbr operations. We also give tight upper and lower bounds for the size of the neighborhood of a binary tree for the spr and tbr operations and characterize those trees for which these bounds are attained
Extending the honey bee venome with the antimicrobial peptide apidaecin and a protein resembling wasp antigen 5
Honey bee venom is a complex mixture of toxic proteins and peptides. In the present study we tried to extend our knowledge of the venom composition using two different approaches. First, worker venom was analysed by liquid chromatography-mass spectrometry and this revealed the antimicrobial peptide apidaecin for the first time in such samples. Its expression in the venom gland was confirmed by reverse transcription PCR and by a peptidomic analysis of the venom apparatus tissue. Second, genome mining revealed a list of proteins with resemblance to known insect allergens or venom toxins, one of which showed homology to proteins of the antigen 5 (Ag5)/Sol i 3 cluster. It was demonstrated that the honey bee Ag5-like gene is expressed by venom gland tissue of winter bees but not of summer bees. Besides this seasonal variation, it shows an interesting spatial expression pattern with additional production in the hypopharyngeal glands, the brains and the midgut. Finally, our immunoblot study revealed that both synthetic apidaecin and the Ag5-like recombinant from bacteria evoke no humoral activity in beekeepers. Also, no IgG4-based cross-reactivity was detected between the honey bee Ag5-like protein and its yellow jacket paralogue Ves v 5
Radiochemotherapy with or without cetuximab for unresectable esophageal cancer: final results of a randomized phase 2 trial (LEOPARD-2)
Abstract
Purpose
To investigate the efficacy and toxicity of cetuximab when added to radiochemotherapy for unresectable esophageal cancer.
Methods
This randomized phase 2 trial (clinicaltrials.gov, identifier NCT01787006) compared radiochemotherapy plus cetuximab (arm A) to radiochemotherapy (arm B) for unresectable esophageal cancer. Primary objective was 2‑year overall survival (OS). Arm A was considered insufficiently active if 2‑year OS was ≤40% (null hypothesis = H0), and promising if the lower limit of the 95% confidence interval was >45%. If that lower limit was >40%, H0 was rejected. Secondary objectives included progression-free survival (PFS), locoregional control (LC), metastases-free survival (MFS), response, and toxicity. The study was terminated early after 74 patients; 68 patients were evaluable.
Results
Two-year OS was 71% in arm A (95% CI: 55–87%) vs. 53% in arm B (95% CI: 36–71%); H0 was rejected. Median OS was 49.1 vs. 24.1 months (p = 0.147). Hazard ratio (HR) for death was 0.60 (95% CI: 0.30–1.21). At 2 years, PFS was 56% vs. 44%, LC 84% vs. 72%, and MFS 74% vs. 54%. HRs were 0.51 (0.25–1.04) for progression, 0.43 (0.13–1.40) for locoregional failure, and 0.43 (0.17–1.05) for distant metastasis. Overall response was 81% vs. 69% (p = 0.262). Twenty-six patients in arm A and 27 patients in arm B experienced at least one grade ≥3 toxicity (p = 0.573). A significant difference was found for grade ≥3 allergic reactions (12.5% vs. 0%, p = 0.044).
Conclusion
Given the limitations of this trial, radiochemotherapy plus cetuximab was feasible. There was a trend towards improved PFS and MFS. Larger studies are required to better define the role of cetuximab for unresectable esophageal cancer
Proper application of antibodies for immunohistochemical detection: antibody crimes and how to prevent them
For several decades, antibodies raised against specific proteins, peptides, or peptide epitopes have proven to be versatile and very powerful tools to demonstrate molecular identity in cells and tissues. New techniques of immunohistochemistry and immunofluorescence have improved both the optical resolution of such protein identification and its sensitivity, particularly through the use of amplification methodology. However, this improved sensitivity has also increased the risks of false-positive and false-negative staining and thereby raised the necessity for proper and adequate controls. In this review, the authors draw on many years of experience to illuminate many of the more common errors and problematic issues in immunohistochemistry, and how these may be avoided. A key factor in all of this is that techniques need to be properly documented and, in particular, that antibodies and procedures are adequately described. Antibodies are a valuable and shared resource within the scientific community; it is therefore essential that mistakes involving antibodies and their controls are not perpetuated through inadequate reporting in the literature.
Stereotypical Chronic Lymphocytic Leukemia B-Cell Receptors Recognize Survival Promoting Antigens on Stromal Cells
Chronic lymphocytic leukemia (CLL) is the most common leukemia in the Western world. Survival of CLL cells depends on their close contact with stromal cells in lymphatic tissues, bone marrow and blood. This microenvironmental regulation of CLL cell survival involves the stromal secretion of chemo- and cytokines as well as the expression of adhesion molecules. Since CLL survival may also be driven by antigenic stimulation through the B-cell antigen receptor (BCR), we explored the hypothesis that these processes may be linked to each other. We tested whether stromal cells could serve as an antigen reservoir for CLL cells, thus promoting CLL cell survival by stimulation through the BCR. As a proof of principle, we found that two CLL BCRs with a common stereotyped heavy chain complementarity-determining region 3 (previously characterized as “subset 1”) recognize antigens highly expressed in stromal cells – vimentin and calreticulin. Both antigens are well-documented targets of autoantibodies in autoimmune disorders. We demonstrated that vimentin is displayed on the surface of viable stromal cells and that it is present and bound by the stereotyped CLL BCR in CLL-stroma co-culture supernatant. Blocking the vimentin antigen with recombinant soluble CLL BCR under CLL-stromal cell co-culture conditions reduces stroma-mediated anti-apoptotic effects by 20–45%. We therefore conclude that CLL BCR stimulation by stroma-derived antigens can contribute to the protective effect that the stroma exerts on CLL cells. This finding sheds new light on the pathobiology of this so far largely incurable disease.