
    The Effects of Combination Treatment Using Phenoxodiol and Docetaxel, and Phenoxodiol and Secreted Frizzled-related Protein 4 on Prostate Cancer Cell Lines

    Although much progress has been made in the treatment of prostate cancer, patients with advanced disease still have a poor 5-year survival rate. Current practice for hormone-refractory/castrate-resistant, metastatic prostate cancer involves the use of taxanes. Docetaxel, in particular, is being incorporated in numerous current clinical trials, either as a single agent or in combination, against androgen-independent prostate cancer. Combination therapies have the potential to increase the effectiveness of drug treatments while simultaneously improving quality of life by reducing side effects, lowering effective dosage rates, or increasing the effectiveness of one compound when combined with another. Using three diverse human prostate cancer cell lines, LNCaP, DU145, and PC3, we studied the effect of the novel prostate cancer drug phenoxodiol in combination with docetaxel using isobolograms, and found that docetaxel-induced cell death was attenuated by co-treatment or pre-treatment of cells with phenoxodiol. This attenuation is associated with cells being prevented from entering the G2/M phase of the cell cycle, where docetaxel acts by damaging the spindle fibers, and is potentially due to p21WAF1-mediated cell survival after docetaxel treatment. We also investigated the use of the Wnt signaling pathway antagonist secreted frizzled-related protein 4 (sFRP4) to increase the effectiveness of phenoxodiol treatment. We found that, through stabilization of the GSK3β molecule, sFRP4 induces degradation of active β-catenin, which increases sensitivity to isoflavone-induced cytotoxicity by increasing p21WAF1 expression and decreasing expression of c-Myc, cyclin D1, and other potent oncogenes. Phenoxodiol induces significant cytotoxicity when combined with a Wnt/β-catenin receptor blocker such as sFRP4. This supports the concept that combination therapy of a Wnt inhibitor with phenoxodiol might increase the effectiveness of phenoxodiol and give a subset of prostate cancer patients a more effective treatment regime.

    Block CUR: Decomposing Matrices using Groups of Columns

    A common problem in large-scale data analysis is to approximate a matrix using a combination of specifically sampled rows and columns, known as CUR decomposition. Unfortunately, in many real-world environments, the ability to sample specific individual rows or columns of the matrix is limited by either system constraints or cost. In this paper, we consider matrix approximation by sampling predefined blocks of columns (or rows) from the matrix. We present an algorithm for sampling useful column blocks and provide novel guarantees for the quality of the approximation. This algorithm has applications in problems as diverse as biometric data analysis and distributed computing. We demonstrate the effectiveness of the proposed algorithms for computing the Block CUR decomposition of large matrices in a distributed setting with multiple nodes in a compute cluster, where such blocks correspond to columns (or rows) of the matrix stored on the same node, which can be retrieved with much less overhead than retrieving individual columns stored across different nodes. In the biometric setting, the rows correspond to different users and the columns correspond to users' biometric reactions to external stimuli, e.g., watching video content, at a particular time instant. There is significant cost in acquiring each user's reaction to lengthy content, so we sample a few important scenes to approximate the biometric response. An individual time sample in this use case cannot be queried in isolation because it lacks the context that caused the biometric reaction. Instead, collections of time segments (i.e., blocks) must be presented to the user. The practical application of these algorithms is shown via experimental results using real-world user biometric data from a content testing environment. Comment: shorter version to appear in ECML-PKDD 201
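
    As a rough illustration of the block-sampling idea, the sketch below builds a CUR-style approximation from uniformly sampled column blocks and rows. This is only a minimal NumPy sketch under our own assumptions (uniform block sampling, pseudoinverse-based middle factor); the paper's actual sampling distributions and guarantees are not reproduced here, and the function name block_cur is ours.

```python
# Minimal, illustrative block-CUR-style approximation (not the paper's algorithm).
import numpy as np

def block_cur(A, block_size, n_col_blocks, n_rows, rng=None):
    """Approximate A ~ C @ U @ R using uniformly sampled column *blocks*
    (groups of adjacent columns) and uniformly sampled rows."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    blocks = [np.arange(s, min(s + block_size, n)) for s in range(0, n, block_size)]

    # Sample whole column blocks rather than individual columns.
    chosen = rng.choice(len(blocks), size=min(n_col_blocks, len(blocks)), replace=False)
    col_idx = np.concatenate([blocks[b] for b in chosen])
    row_idx = rng.choice(m, size=min(n_rows, m), replace=False)

    C = A[:, col_idx]                                 # sampled column blocks
    R = A[row_idx, :]                                 # sampled rows
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)     # middle factor given C and R
    return C, U, R

if __name__ == "__main__":
    gen = np.random.default_rng(0)
    # Nearly low-rank test matrix: rank-30 signal plus small noise.
    A = gen.standard_normal((200, 30)) @ gen.standard_normal((30, 400))
    A += 0.01 * gen.standard_normal((200, 400))
    C, U, R = block_cur(A, block_size=20, n_col_blocks=5, n_rows=60, rng=1)
    err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
    print(f"relative Frobenius error: {err:.4f}")
```

    In the distributed setting described above, each block would correspond to the columns stored on one node, so sampling a block means contacting a single node rather than many.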

    Peer-review in a world with rational scientists: Toward selection of the average

    One of the virtues of peer review is that it provides a self-regulating selection mechanism for scientific work, papers, and projects. Peer review as a selection mechanism is hard to evaluate in terms of its efficiency, and serious efforts to understand its strengths and weaknesses have not yet led to clear answers. In theory, peer review works if the involved parties (editors and referees) conform to a set of requirements, such as a love for high-quality science, objectivity, and the absence of biases, nepotism, friend and clique networks, selfishness, etc. If these requirements are violated, what is the effect on the selection of high-quality work? We study this question with a simple agent-based model. In particular, we are interested in the effects of rational referees, who may have no incentive to see high-quality work other than their own published or promoted. We find that a small fraction of incorrect (selfish or rational) referees can drastically reduce the quality of the published (accepted) scientific standard. We quantify the fraction beyond which peer review no longer selects better than pure chance. The decline in quality of accepted scientific work is shown as a function of the fraction of rational and unqualified referees. We show how a simple quality-increasing policy of, e.g., a journal can lead to a loss in overall scientific quality, and how mutual support networks of authors and referees deteriorate the system. Comment: 5 pages, 4 figures
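
    To make the setup concrete, here is a toy agent-based simulation in the spirit of the model sketched above. The acceptance rules (threshold acceptance, unanimous two-referee decisions, rational referees blocking above-threshold work) are illustrative assumptions of ours, not the authors' exact model.

```python
# Toy peer-review simulation: how does the mean quality of accepted papers
# change as the fraction of rational (selfish) referees grows?
import random

def review(quality, referee_kind, threshold=0.5):
    """Return True if a referee of the given kind accepts a paper of this quality."""
    if referee_kind == "correct":       # accepts work above the quality threshold
        return quality >= threshold
    if referee_kind == "random":        # unqualified referee: coin flip
        return random.random() < 0.5
    return quality < threshold          # rational referee: blocks good work

def accepted_quality(frac_rational, frac_random=0.0, n_papers=20000, n_referees=2):
    """Mean quality of accepted papers under a unanimity rule."""
    pool = (["rational"] * int(frac_rational * 100)
            + ["random"] * int(frac_random * 100))
    pool += ["correct"] * (100 - len(pool))
    accepted = []
    for _ in range(n_papers):
        q = random.random()                       # paper quality in [0, 1)
        referees = random.sample(pool, n_referees)
        if all(review(q, r) for r in referees):
            accepted.append(q)
    return sum(accepted) / len(accepted) if accepted else float("nan")

if __name__ == "__main__":
    random.seed(0)
    for f in (0.0, 0.1, 0.3, 0.5):
        print(f"rational fraction {f:.1f}: mean accepted quality {accepted_quality(f):.3f}")
```

    Even in this crude version, the mean quality of accepted work drops from about 0.75 with only correct referees toward the 0.5 expected under pure chance as the rational fraction grows.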

    Interprofessional curriculum on environmental and social determinants of health in rural Kenya: Aga Khan University East Africa University of California San Francisco integrated primary health care program

    The Aga Khan University East Africa (AKU)-University of California San Francisco (UCSF) Integrated Primary Health Care Program (IPHC) is a public–private partnership with community and government to strengthen the primary health care (PHC) system in Kenya. IPHC provides opportunities for health professions students to work and learn together in the rural and underserved district of Kaloleni.

    On-Chip Microwave Quantum Hall Circulator

    Circulators are non-reciprocal circuit elements integral to technologies including radar systems, microwave communication transceivers, and the readout of quantum information devices. Their non-reciprocity arises from the interference of microwaves over the centimetre scale of the signal wavelength in the presence of bulky magnetic media that break time-reversal symmetry. Here we realize a completely passive on-chip microwave circulator with a size one-thousandth of the wavelength by exploiting the chiral, slow-light response of a 2-dimensional electron gas (2DEG) in the quantum Hall regime. For an integrated GaAs device with a 330 µm diameter and a 1 GHz centre frequency, a non-reciprocity of 25 dB is observed over a 50 MHz bandwidth. Furthermore, the direction of circulation can be selected dynamically by varying the magnetic field, an aspect that may enable reconfigurable passive routing of microwave signals on-chip.
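
    For scale, a back-of-the-envelope check of the size claim (our own arithmetic, taking the free-space wavelength at the quoted centre frequency):

```latex
\lambda = \frac{c}{f} \approx \frac{3\times10^{8}\,\mathrm{m/s}}{1\times10^{9}\,\mathrm{Hz}} = 0.3\,\mathrm{m},
\qquad
\frac{d}{\lambda} \approx \frac{330\,\mu\mathrm{m}}{0.3\,\mathrm{m}} \approx 1.1\times10^{-3}
```

    so the 330 µm device is indeed roughly one-thousandth of the centimetre-scale signal wavelength.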

    Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined both by the process's internal organization and by the observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for the block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity. Comment: 25 pages, 13 figures, 1 table
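
    The block-entropy convergence referred to here can be illustrated empirically. The sketch below (our own example, using the Golden Mean process rather than anything from the paper) estimates the Shannon block entropy H(L) from a long sample and prints its discrete derivative h(L) = H(L) - H(L-1), which converges to the process's entropy rate.

```python
# Empirical Shannon block entropy H(L) and its discrete derivative h(L)
# for a sample of the Golden Mean process (no two consecutive 1s).
from collections import Counter
from math import log2
import random

def sample_golden_mean(n, rng):
    """After a 1 the next symbol must be 0; after a 0, emit 0 or 1 with equal probability."""
    out, prev = [], 0
    for _ in range(n):
        x = 0 if prev == 1 else rng.choice((0, 1))
        out.append(x)
        prev = x
    return out

def block_entropy(seq, L):
    """Empirical Shannon entropy (bits) of length-L blocks of seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

if __name__ == "__main__":
    seq = sample_golden_mean(200_000, random.Random(0))
    prev_H = 0.0
    for L in range(1, 9):
        H = block_entropy(seq, L)
        print(f"L={L}  H(L)={H:.4f}  h(L)={H - prev_H:.4f}")
        prev_H = H
    # h(L) approaches the Golden Mean entropy rate of 2/3 bit per symbol.
```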

    A comparative framework: how broadly applicable is a 'rigorous' critical junctures framework?

    The paper tests Hogan and Doyle's (2007, 2008) framework for examining critical junctures. This framework sought to incorporate the concept of ideational change into the understanding of critical junctures. Until its development, the frameworks used to identify critical junctures were subjective, seeking only to identify crises and subsequent policy changes and arguing that one invariably led to the other because both occurred around the same time. Hogan and Doyle (2007, 2008) hypothesized ideational change as an intermediating variable in their framework, determining if, and when, a crisis leads to radical policy change. Here we test this framework on cases similar to, but different from, those employed in developing the exemplar. This will enable us to determine whether the framework's relegation of ideational change to a condition of crisis holds, or whether ideational change has more importance than the framework ascribes to it. It will also enable us to determine whether the framework itself is robust and fit for the purpose it was designed to perform: identifying the nature of policy change.