Generic local distinguishability and completely entangled subspaces
A subspace of a multipartite Hilbert space is completely entangled if it
contains no product states. Such subspaces can be large, with a known maximum
dimension, S, approaching the full dimension of the system, D. We show that almost
all subspaces with dimension less than or equal to S are completely entangled,
and then use this fact to prove that n random pure quantum states are
unambiguously locally distinguishable if and only if n does not exceed D-S.
This condition holds for almost all sets of states of all multipartite systems,
and reveals something surprising. The criterion is identical for separable and
for nonseparable states: entanglement makes no difference.
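For concreteness, a hedged illustration of the criterion, assuming the standard formula for the maximal dimension S of a completely entangled subspace (the known maximum the abstract refers to, not restated there):

```latex
% Assumed: the known maximal dimension of a completely entangled
% subspace of H_1 \otimes \cdots \otimes H_k with \dim H_i = d_i.
\[
  D = \prod_{i=1}^{k} d_i, \qquad
  S = D - \sum_{i=1}^{k} (d_i - 1) - 1,
\]
% so the criterion n <= D - S reads
\[
  n \;\le\; D - S \;=\; \sum_{i=1}^{k} (d_i - 1) + 1.
\]
% Example: two qubits give D = 4, S = 1, so at most n = 3 random
% pure states are unambiguously locally distinguishable.
```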
Evaluation of Patients for Radiotherapy for Prostate Adenocarcinoma
Prostate adenocarcinoma is the most common non-cutaneous malignancy among men in the United States, and the second leading cause of cancer death in this population. However, most cases are now diagnosed at early, curable stages, or, if they recur, are associated with survival times long enough that patients usually succumb to competing comorbidities. This chapter discusses a brief history of prostate cancer evaluation and its pertinence today, including the Gleason scoring system, the advent of PSA testing, and the development of the NCCN classification system in use today. Alternative classification systems, such as the UCSF-CAPRA scoring system, are also discussed. The latter half of the chapter covers the evolution from personalized medicine to precision medicine, including PSMA imaging and prostate cancer genomics, with ongoing trials and future directions. Finally, the chapter discusses selecting appropriate men for active surveillance, and appropriate regimens for active surveillance.
Topic Modeling and Text Analysis for Qualitative Policy Research
This paper contributes to a critical methodological discussion with direct ramifications for policy studies: how computational methods can be concretely incorporated into existing processes of textual analysis and interpretation without compromising scientific integrity. We focus on the computational method of topic modeling and investigate how it interacts with two larger families of qualitative methods: content and classification methods, characterized by an interest in words as communication units, and discourse and representation methods, characterized by an interest in the meaning of communicative acts. Based on an analysis of recent academic publications that have used topic modeling for textual analysis, our findings show that different mixed-method research designs are appropriate when combining topic modeling with the two groups of methods. Our main concluding argument is that topic modeling enables scholars to apply policy theories and concepts to much larger sets of data. That said, the use of computational methods requires a genuine understanding of these techniques to obtain substantively meaningful results. We encourage policy scholars to reflect carefully on methodological issues, and offer a simple heuristic to help identify and address critical points when designing a study using topic modeling.
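Since the argument turns on how topic modeling operates as a computational step within a qualitative design, a minimal sketch may help. The paper prescribes no implementation; the library (scikit-learn), toy corpus, and number of topics below are illustrative assumptions, not the authors' choices:

```python
# Minimal illustrative topic-modeling sketch (assumed setup, toy corpus).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

documents = [
    "housing policy reform and rent regulation in cities",
    "carbon tax debates and environmental regulation",
    "municipal budgets, housing subsidies, and urban planning",
    "emissions trading schemes and climate policy instruments",
]

# Words as communication units: a document-term matrix of raw counts.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(documents)

# Fit a two-topic model; k is a research-design choice, not a given.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(dtm)  # per-document topic proportions

# Inspect the top words per topic to support qualitative interpretation.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {topic_idx}: {', '.join(top)}")
```

The number of topics, n_components, is itself a design decision of exactly the kind the paper's heuristic is meant to inform.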
Whole-genome sequencing identifies emergence of a quinolone resistance mutation in a case of Stenotrophomonas maltophilia bacteremia
Whole-genome sequences of Stenotrophomonas maltophilia serial isolates, taken from a bacteremic patient before and after the development of levofloxacin resistance, were assembled de novo and differed by one single-nucleotide variant in smeT, a repressor of the multidrug efflux operon smeDEF. Along with sequenced isolates from five contemporaneous cases, they displayed considerable diversity compared with all published complete genomes. Whole-genome sequencing and complete assembly can conclusively identify resistance mechanisms emerging in S. maltophilia strains during clinical therapy.
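The core comparison, once assemblies are in hand, is a base-by-base scan for single-nucleotide variants. As a conceptual sketch only, with invented sequences standing in for the real smeT alleles and omitting the de novo assembly and alignment steps that do the actual work:

```python
# Toy sketch: locating single-nucleotide variants between two pre-aligned
# gene sequences (e.g., smeT before and after therapy). Sequences are
# invented for illustration; real pipelines compare whole assemblies.
def snvs(ref: str, alt: str):
    """Return (position, ref_base, alt_base) for each mismatch."""
    assert len(ref) == len(alt), "sequences must be pre-aligned"
    return [(i, r, a) for i, (r, a) in enumerate(zip(ref, alt)) if r != a]

smeT_before = "ATGGCATTGACCGA"
smeT_after  = "ATGGCATTAACCGA"  # hypothetical variant at position 8
print(snvs(smeT_before, smeT_after))  # -> [(8, 'G', 'A')]
```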
AI is a viable alternative to high throughput screening: a 318-target study
High-throughput screening (HTS) is routinely used to identify bioactive small molecules. This requires physical compounds, which limits coverage of accessible chemical space. Computational approaches combined with vast on-demand chemical libraries can access far greater chemical space, provided that their predictive accuracy is sufficient to identify useful molecules. Through the largest and most diverse virtual HTS campaign reported to date, comprising 318 individual projects, we demonstrate that our AtomNet® convolutional neural network successfully finds novel hits across every major therapeutic area and protein class. We address historical limitations of computational screening by demonstrating success for target proteins without known binders, without high-quality X-ray crystal structures, and without manual cherry-picking of compounds. We show that the molecules selected by the AtomNet® model are novel drug-like scaffolds rather than minor modifications to known bioactive compounds. Our empirical results suggest that computational methods can substantially replace HTS as the first step of small-molecule drug discovery.
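AtomNet®'s internals are not described in this abstract. As a hedged illustration of the general family of methods (a 3D convolutional network scoring a voxelized protein-ligand complex), not the AtomNet® architecture itself, with all shapes, channel semantics, and layer sizes assumed for the example:

```python
# Toy 3D-CNN scoring function over a voxelized protein-ligand complex.
# NOT the AtomNet architecture; every design choice here is an assumption.
import torch
import torch.nn as nn

class ToyScreeningCNN(nn.Module):
    def __init__(self, in_channels: int = 8):  # e.g., one channel per atom type
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Flatten(),
            nn.Linear(32 * 6 * 6 * 6, 1),  # one binding score for a 24^3 grid
        )

    def forward(self, voxels: torch.Tensor) -> torch.Tensor:
        return self.net(voxels)

# One random "complex" on a 24x24x24 grid with 8 atom-type channels.
model = ToyScreeningCNN()
score = model(torch.randn(1, 8, 24, 24, 24))
print(score.shape)  # torch.Size([1, 1])
```

Ranking an on-demand library then amounts to voxelizing each candidate complex and sorting by predicted score, with the model's accuracy determining whether the top of the list contains real hits.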
Assumption without representation: the unacknowledged abstraction from communities and social goods
We have not clearly acknowledged the abstraction from unpriceable “social goods” (derived from communities) which, unlike private and public goods, simply disappear if one attempts to market them. Their separability from markets and economics has not been argued, much less established. Acknowledging communities would reinforce rather than undermine them, and would thus facilitate the production of social goods. It would also help economics, by facilitating our understanding of, and response to, financial crises as well as environmental destruction and many social problems, and by reducing the alienation from economics often felt by students and the public.