Normal edge-colorings of cubic graphs
A normal $k$-edge-coloring of a cubic graph is an edge-coloring with $k$ colors having the additional property that, when looking at the set of colors assigned to any edge and the four edges adjacent to it, we have either exactly five distinct colors or exactly three distinct colors. We denote by $\chi'_N(G)$ the smallest $k$ for which $G$ admits a normal $k$-edge-coloring. Normal $k$-edge-colorings were introduced by Jaeger in order to study his well-known Petersen Coloring Conjecture. More precisely, it is known that proving $\chi'_N(G)\leq 5$ for every bridgeless cubic graph $G$ is equivalent to proving the Petersen Coloring Conjecture, and hence, among others, the Cycle Double Cover Conjecture and the Berge-Fulkerson Conjecture. Considering the larger class of all simple cubic graphs (not necessarily bridgeless), some interesting questions naturally arise. For instance, there exist simple cubic graphs, not bridgeless, with $\chi'_N(G)=7$. On the other hand, the best known general upper bound for $\chi'_N(G)$ was $9$. Here, we improve it by proving that $\chi'_N(G)\leq 7$ for any simple cubic graph $G$, which is best possible. We obtain this result by proving the existence of specific nowhere-zero $\mathbb{Z}_2^2$-flows in $4$-edge-connected graphs.
Comment: 17 pages, 6 figures
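The defining condition is purely local and easy to check by machine. The following is a minimal, illustrative Python sketch (the helper name and the K_4 example are ours, not from the paper) that verifies whether a given edge-coloring of a simple cubic graph is normal.

```python
from itertools import combinations

def is_normal_coloring(edges, coloring):
    """Check whether an edge-coloring of a simple cubic graph is normal.

    edges    -- list of edges, each a frozenset({u, v})
    coloring -- dict mapping each edge to its color

    Normality: every edge e, together with the four edges adjacent to it,
    must carry either exactly 3 distinct colors (e is 'poor') or exactly 5
    distinct colors (e is 'rich').
    """
    for e in edges:
        adjacent = [f for f in edges if f != e and e & f]  # edges sharing an endpoint with e
        if len(adjacent) != 4:  # in a simple cubic graph every edge has exactly 4 neighbours
            return False
        colors_seen = {coloring[e]} | {coloring[f] for f in adjacent}
        if len(colors_seen) not in (3, 5):
            return False
    return True

# Example: K_4 is cubic, and a proper 3-edge-coloring (opposite edges share
# a color) is normal -- every edge is poor.
k4_edges = [frozenset(p) for p in combinations(range(4), 2)]
k4_coloring = {frozenset({0, 1}): 1, frozenset({2, 3}): 1,
               frozenset({0, 2}): 2, frozenset({1, 3}): 2,
               frozenset({0, 3}): 3, frozenset({1, 2}): 3}
print(is_normal_coloring(k4_edges, k4_coloring))  # True
```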
Targeted LC-MS/MS-based metabolomics and lipidomics on limited hematopoietic stem cell numbers
Metabolism is important for the regulation of hematopoietic stem cells (HSCs) and drives cellular fate. Due to the scarcity of HSCs, it has been technically challenging to perform metabolome analyses gaining insight into HSC metabolic regulatory networks. Here, we present two targeted liquid chromatography–mass spectrometry approaches that enable the detection of metabolites after fluorescence-activated cell sorting when sample amounts are limited. One protocol covers signaling lipids and retinoids, while the second detects tricarboxylic acid cycle metabolites and amino acids. For complete details on the use and execution of this protocol, please refer to Schönberger et al. (2022)
25 years of homicide in the Netherlands: A trend analysis of the sex and age of homicide victims.
This study describes the trend in homicide victimization in the Netherlands over the period 1992-2016, drawing on the Dutch Homicide Monitor. The findings show that the homicide rate has been declining since the 1990s. This decline is largest among male and female victims aged 20 to 39. The study stresses the importance of shifting the discussion around the declining homicide trend towards a more in-depth analysis of victims' sex and age
Gab2 deficiency prevents Flt3-ITD driven acute myeloid leukemia in vivo
Internal tandem duplications (ITD) of the FMS-like tyrosine kinase 3 (FLT3) predict poor prognosis in acute myeloid leukemia (AML) and often co-exist with inactivating DNMT3A mutations. In vitro studies implicated Grb2-associated binder 2 (GAB2) as an FLT3-ITD effector. Utilizing a Flt3-ITD knock-in, Dnmt3a haploinsufficient mouse model, we demonstrate that Gab2 is essential for the development of Flt3-ITD-driven AML in vivo, as Gab2-deficient mice displayed prolonged survival, attenuated liver and spleen pathology, and reduced blast counts. Furthermore, leukemic bone marrow from Gab2-deficient mice exhibited reduced colony-forming unit capacity and increased FLT3 inhibitor sensitivity. Using transcriptomics, we identify the genes encoding Axl and the Ret co-receptor Gfra2 as targets of the Flt3-ITD/Gab2/Stat5 axis. We propose a pathomechanism in which Gab2 increases signaling of these receptors by inducing their expression and by serving as a downstream effector. Thereby, Gab2 promotes AML aggressiveness and drug resistance as it incorporates these receptor tyrosine kinases into the Flt3-ITD signaling network. Consequently, our data identify GAB2 as a promising biomarker and therapeutic target in human AML
Catching the flu: Syndromic surveillance, algorithmic governmentality and global health security
How do algorithms shape the imaginary and practice of security? Does their proliferation point to a shift in the political rationality of security? If so, what is the nature and extent of that shift? This article explores these questions in relation to global health security. Prompted by an epidemic of new infectious disease outbreaks, from HIV, SARS and pandemic flu through to MERS and Ebola, many governments are making health security an integral part of their national security strategies. Algorithms are central to these developments because they underpin a number of next-generation syndromic surveillance systems now routinely used by governments and international organizations to rapidly detect new outbreaks globally. This article traces the origins, design and evolution of three such internet-based surveillance systems: 1) the Program for Monitoring Emerging Diseases, 2) the Global Public Health Intelligence Network, and 3) HealthMap. The article shows how the successive introduction of those three syndromic surveillance systems has propelled algorithmic technologies into the heart of global outbreak detection. This growing recourse to algorithms for the purposes of strengthening global health security, the article argues, signals a significant shift in the underlying problem, nature, and role of knowledge in contemporary security practices
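Purely to illustrate the kind of algorithmic detection such systems rely on, and not as a description of ProMED, GPHIN or HealthMap themselves, the sketch below flags unusual spikes in a daily count of symptom-related reports against a rolling baseline; the function name, window and threshold are illustrative assumptions.

```python
from statistics import mean, stdev

def flag_outbreak_signals(daily_counts, window=14, threshold=3.0):
    """Flag days whose report count exceeds a rolling baseline.

    daily_counts -- list of daily counts of symptom-related reports
    window       -- number of preceding days used as the baseline
    threshold    -- standard deviations above baseline needed to raise a signal
    """
    signals = []
    for day in range(window, len(daily_counts)):
        baseline = daily_counts[day - window:day]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            sigma = 1.0  # avoid division by zero on a perfectly flat baseline
        z = (daily_counts[day] - mu) / sigma
        if z > threshold:
            signals.append((day, z))
    return signals

# Toy data: a stable background of ~10 reports/day with a spike on the last day
counts = [10, 9, 11, 10, 12, 9, 10, 11, 10, 9, 11, 10, 10, 9, 10, 11, 35]
print(flag_outbreak_signals(counts))  # the final spike is flagged
```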
Forecasting in the light of Big Data
Predicting the future state of a system has always been a natural motivation
for science and practical applications. Such a topic, beyond its obvious
technical and societal relevance, is also interesting from a conceptual point
of view. This is because forecasting lends itself to two equally radical, yet
opposite, methodologies: a reductionist one, based on first principles, and a
naive inductivist one, based only on data. This latter view
has recently gained some attention in response to the availability of
unprecedented amounts of data and increasingly sophisticated algorithmic
analytic techniques. The purpose of this note is to assess critically the role
of big data in reshaping the key aspects of forecasting and in particular the
claim that bigger data leads to better predictions. Drawing on the
representative example of weather forecasts, we argue that this is not generally
the case. We conclude by suggesting that a clever and context-dependent
compromise between modelling and quantitative analysis stands out as the best
forecasting strategy, as anticipated nearly a century ago by Richardson and von
Neumann
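To make the contrast between the two methodologies concrete, here is a toy Python sketch (our illustration, not taken from the paper): a "first-principles" forecast iterates the known governing equation of a chaotic logistic map from an imperfectly measured initial condition, while a purely data-driven forecast looks up the closest analog in a long archive of past states. Even a large archive limits the inductive forecast to the density of available analogs, which is the kind of limitation discussed for weather.

```python
import random

def logistic(x, r=3.9):
    """One step of the chaotic logistic map x_{n+1} = r * x_n * (1 - x_n)."""
    return r * x * (1.0 - x)

def iterate(x, steps):
    for _ in range(steps):
        x = logistic(x)
    return x

def analog_forecast(history, x, steps):
    """Data-driven forecast: find the closest past state (the 'analog')
    and assume its recorded future repeats itself."""
    best = min(range(len(history) - steps), key=lambda i: abs(history[i] - x))
    return history[best + steps]

random.seed(0)

# A long observational record of the system ("big data")
history, s = [], random.random()
for _ in range(10_000):
    history.append(s)
    s = logistic(s)

x0, horizon = 0.123456789, 5
truth = iterate(x0, horizon)

# First-principles forecast: the governing law is known, but the initial
# condition is only measured with a small error.
model_pred = iterate(x0 + 1e-6, horizon)

# Purely inductive forecast from the archive of past states.
analog_pred = analog_forecast(history, x0, horizon)

print(f"truth  = {truth:.6f}")
print(f"model  = {model_pred:.6f}  (error {abs(model_pred - truth):.2e})")
print(f"analog = {analog_pred:.6f}  (error {abs(analog_pred - truth):.2e})")
```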
Modeling the impact of amino acid substitution in a monoclonal antibody on cation exchange chromatography
A vital part of biopharmaceutical research is decision making around which lead candidate should be progressed in early-phase development. When multiple antibody candidates show similar biological activity, developability aspects are taken into account to ease the challenges of manufacturing the potential drug candidate. While current strategies for developability assessment mainly focus on drug product stability, only limited information is available on how antibody candidates with minimal differences in their primary structure behave during downstream processing. With increasing time-to-market pressure and an abundance of monoclonal antibodies (mAbs) in development pipelines, developability assessments should also consider the ability of mAbs to integrate into the downstream platform. This study investigates the influence of amino acid substitutions in the complementarity-determining region (CDR) of a full-length IgG1 mAb on the elution behavior in preparative cation exchange chromatography. Single amino acid substitutions within the investigated mAb resulted in an additional positive charge in the light chain (L) and heavy chain (H) CDR, respectively. The mAb variants showed an increased retention volume in linear gradient elution compared with the wild-type antibody. Furthermore, the substitution of tryptophan with lysine in the H-CDR3 increased the charge heterogeneity of the product. A multiscale in silico analysis, consisting of homology modeling, protein surface analysis, and mechanistic chromatography modeling, increased understanding of the adsorption mechanism. The results reveal the potential effects of lead optimization during antibody drug discovery on downstream processing
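As a rough, generic illustration of how mechanistic ion-exchange modeling links an extra positive charge to retention (a textbook stoichiometric displacement relation with made-up parameters, not the specific model or data of this study), the sketch below shows how increasing the characteristic binding charge by one shifts the isocratic retention volume.

```python
def sdm_retention_factor(c_salt, nu, keq=1e-3, ligand_density=0.5):
    """Retention factor k' from a stoichiometric displacement model (SDM):
        k' = keq * (ligand_density / c_salt) ** nu
    c_salt         -- counter-ion (salt) concentration in M
    nu             -- characteristic charge (number of binding sites)
    keq            -- lumped equilibrium constant (hypothetical value)
    ligand_density -- accessible ligand density of the resin in M (hypothetical)
    """
    return keq * (ligand_density / c_salt) ** nu

def retention_volume(c_salt, nu, v0=1.0):
    """Isocratic retention volume in column volumes: V_R = V_0 * (1 + k')."""
    return v0 * (1.0 + sdm_retention_factor(c_salt, nu))

c_salt = 0.10  # 100 mM salt, chosen arbitrarily for the example
for label, nu in [("wild type", 5.0), ("variant with +1 charge in CDR", 6.0)]:
    print(f"{label:32s} k' = {sdm_retention_factor(c_salt, nu):8.2f}  "
          f"V_R = {retention_volume(c_salt, nu):8.2f} CV")
```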
Socially sensitive lactation: Exploring the social context of breastfeeding
Many women report difficulties with breastfeeding and do not maintain the practice for as long as intended. Although psychologists and other researchers have explored some of the difficulties they experience, fuller exploration of the relational contexts in which breastfeeding takes place is warranted to enable more in-depth analysis of the challenges these pose for breastfeeding women. The present paper is based on qualitative data collected from 22 first-time breastfeeding mothers through two phases of interviews and audio-diaries which explored how the participants experienced their relationships with significant others and the wider social context of breastfeeding in the first five weeks postpartum. Using a thematic analysis informed by symbolic interactionism, we develop the overarching theme of "Practising socially sensitive lactation", which captures how participants felt the need to manage tensions between breastfeeding and their perceptions of the needs, expectations and comfort of others. We argue that breastfeeding remains a problematic social act, despite its agreed importance for child health. Whilst acknowledging the limitations of our sample and analytic approach, we suggest ways in which perinatal and public health interventions can take more effective account of the social challenges of breastfeeding in order to facilitate the health and psychological well-being of mothers and their infants
- …