Truncated human endothelin receptor A produced by alternative splicing and its expression in melanoma
In this study, reverse transcriptase polymerase chain reaction was used to amplify human endothelin receptor A (ETA) and ETB receptor mRNA. A truncated ETA receptor transcript with exons 3 and 4 skipped was found. The skipping of these two exons deletes 109 amino acids from the receptor. The truncated receptor was expressed in all tissues and cells examined, but the level of expression varied. In melanoma cell lines and melanoma tissues, the truncated receptor transcript was the major species, whereas the wild-type ETA was predominant in other tissues. A 1.9-kb ETA transcript was identified in melanoma cell lines by Northern blot, much smaller than the 4.3-kb transcript previously reported in heart and other tissues. The cDNA coding regions of the truncated and wild-type ETA receptors were stably transfected into Chinese hamster ovary (CHO) cells. CHO cells transfected with the truncated ETA receptor showed no binding affinity for endothelin 1 (ET-1) or endothelin 3 (ET-3). The function and biological significance of this truncated ETA receptor are not clear, but it may play regulatory roles in cell responses to ETs.
Small Vessel Disease in the Heart and Brain: Current Knowledge, Unmet Therapeutic Need and Future Directions
No abstract available
Evaluation of genetic markers as instruments for Mendelian randomization studies on vitamin D.
Mendelian randomization (MR) studies use genetic variants that mimic the influence of a modifiable exposure to assess and quantify a causal association with an outcome, with the aim of avoiding the problems of confounding and reverse causality that affect other types of observational studies.
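The core MR logic described above can be sketched with a Wald ratio estimator on simulated data. All variables and effect sizes below are hypothetical, invented for illustration (not from the study): dividing the instrument-outcome association by the instrument-exposure association recovers the causal effect even when an unobserved confounder biases the naive regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical simulated data: G is an allele count (0/1/2),
# U an unobserved confounder of exposure X and outcome Y.
G = rng.binomial(2, 0.3, n).astype(float)
U = rng.normal(size=n)
X = 0.5 * G + U + rng.normal(size=n)   # modifiable exposure
Y = 0.8 * X + U + rng.normal(size=n)   # outcome; true causal effect is 0.8

def slope(x, y):
    """OLS slope of y on x."""
    return np.cov(x, y, ddof=0)[0, 1] / np.var(x)

beta_GX = slope(G, X)            # instrument -> exposure association
beta_GY = slope(G, Y)            # instrument -> outcome association
wald_ratio = beta_GY / beta_GX   # MR causal estimate

print(f"naive OLS of Y on X: {slope(X, Y):.2f}")  # biased upward by U
print(f"MR Wald ratio:       {wald_ratio:.2f}")   # close to 0.8
```

Because the genotype G is independent of the confounder U, the ratio estimate is unbiased while the direct regression of Y on X is not.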
Realism, Objectivity, and Evaluation
I discuss Benacerraf's epistemological challenge for realism about areas like mathematics, metalogic, and modality, and describe the pluralist response to it. I explain why normative pluralism is peculiarly unsatisfactory, and use this explanation to formulate a radicalization of Moore's Open Question Argument. According to the argument, the facts -- even the normative facts -- fail to settle the practical questions at the center of our normative lives. One lesson is that the concepts of realism and objectivity, which are widely identified with one another, are actually in tension.
Semantic distillation: a method for clustering objects by their contextual specificity
Techniques for data mining, latent semantic analysis, contextual search of databases, etc. were developed long ago by computer scientists working on information retrieval (IR). Experimental scientists from all disciplines, having to analyse large collections of raw experimental data (astronomical, physical, biological, etc.), have developed powerful methods for their statistical analysis and for clustering, categorising, and classifying objects. Finally, physicists have developed a theory of quantum measurement, unifying the logical, algebraic, and probabilistic aspects of queries into a single formalism. The purpose of this paper is twofold: first, to show that when formulated at an abstract level, problems from IR, from statistical data analysis, and from physical measurement theories are very similar and hence can profitably be cross-fertilised; and, secondly, to propose a novel method of fuzzy hierarchical clustering, termed \textit{semantic distillation}, strongly inspired by the theory of quantum measurement, which we developed to analyse raw data coming from various types of experiments on DNA arrays. We illustrate the method by analysing DNA array experiments and clustering the genes of the array according to their specificity.
Comment: Accepted for publication in Studies in Computational Intelligence, Springer-Verlag
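As a point of comparison for the clustering task described above, here is a minimal sketch of ordinary agglomerative (average-linkage) hierarchical clustering on toy expression profiles. This is the standard baseline technique, not the paper's quantum-measurement-inspired semantic distillation, and all gene names and values below are invented for illustration.

```python
import math

# Toy "expression profiles" (hypothetical, not real DNA-array data):
# each gene is a vector of measurements across conditions.
genes = {
    "geneA": [1.0, 2.0, 3.0, 4.0],
    "geneB": [1.1, 2.1, 2.9, 4.2],   # tracks geneA
    "geneC": [4.0, 3.0, 2.0, 1.0],   # opposite trend to geneA
    "geneD": [3.9, 3.1, 1.8, 1.2],   # tracks geneC
}

def dist(u, v):
    """Euclidean distance between two profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def average_linkage(profiles, n_clusters):
    """Plain agglomerative clustering: repeatedly merge the two
    clusters with the smallest average inter-profile distance."""
    clusters = [[name] for name in profiles]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = sum(dist(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return [sorted(c) for c in clusters]

print(average_linkage(genes, 2))
# -> [['geneA', 'geneB'], ['geneC', 'geneD']]
```

The co-varying genes end up grouped together; semantic distillation as described in the abstract replaces this purely geometric merge criterion with one derived from quantum measurement theory.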
Tamsulosin-induced severe hypotension during general anesthesia: a case report.
Introduction: Tamsulosin, a selective alpha1-adrenergic receptor (alpha1-AR) antagonist, is a widely prescribed first-line agent for benign prostatic hypertrophy (BPH). Its interaction with anesthetic agents has not been described. Case Presentation: We report the case of a 54-year-old Asian man undergoing elective left thyroid lobectomy. The only medication the patient was taking was tamsulosin 0.4 mg, taken for the past year for BPH. He developed persistent hypotension during the maintenance phase of anesthesia while receiving oxygen, nitrous oxide and 1% isoflurane. The hypotension may have been attributable to an interaction between the inhalational anesthetic and tamsulosin. Conclusion: Vigilance for unexpected hypotension is important in surgical patients who are treated with selective alpha1-AR blockers. If hypotension occurs, vasopressors that act directly on adrenergic receptors could be more effective.
Chiral tunneling and the Klein paradox in graphene
The so-called Klein paradox - unimpeded penetration of relativistic particles through high and wide potential barriers - is one of the most exotic and counterintuitive consequences of quantum electrodynamics (QED). The phenomenon is discussed in many contexts in particle, nuclear and astrophysics, but direct tests of the Klein paradox using elementary particles have so far proved impossible. Here we show that the effect can be tested in a conceptually simple condensed-matter experiment by using electrostatic barriers in single- and bi-layer graphene. Due to the chiral nature of their quasiparticles, quantum tunneling in these materials becomes highly anisotropic, qualitatively different from the case of normal, nonrelativistic electrons. Massless Dirac fermions in graphene allow a close realization of Klein's gedanken experiment, whereas massive chiral fermions in bilayer graphene offer an interesting complementary system that elucidates the basic physics involved.
Comment: 15 pages, 4 figures
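The anisotropic tunneling described above can be sketched with the well-known high-barrier transmission formula for massless Dirac fermions, T(phi) = cos^2(phi) / (1 - cos^2(q_x D) sin^2(phi)). The barrier parameters below are hypothetical illustrative values in units where hbar * v_F = 1, not figures taken from the paper.

```python
import numpy as np

# Hypothetical illustrative parameters (units: hbar * v_F = 1):
# incident energy E, barrier height V0 and width D.
E, V0, D = 1.0, 2.5, 10.0

def transmission(phi):
    """Transmission probability through the barrier at incidence angle phi
    (high-barrier limit for massless Dirac fermions in graphene)."""
    ky = E * np.sin(phi)                    # conserved transverse momentum
    qx = np.sqrt((V0 - E) ** 2 - ky ** 2)   # longitudinal momentum inside barrier
    return np.cos(phi) ** 2 / (1.0 - np.cos(qx * D) ** 2 * np.sin(phi) ** 2)

angles = np.linspace(-1.2, 1.2, 7)
for phi in angles:
    print(f"phi = {phi:+.2f} rad  ->  T = {transmission(phi):.3f}")
# At normal incidence (phi = 0) the barrier is perfectly transparent, T = 1:
# this is the Klein tunneling the abstract refers to.
```

Away from normal incidence the transmission oscillates with angle, which is the strong anisotropy that distinguishes chiral quasiparticles from ordinary nonrelativistic electrons.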
Pauli's Principle in Probe Microscopy
Exceptionally clear images of intramolecular structure can be attained in dynamic force microscopy through the combination of a passivated tip apex and operation in what has become known as the "Pauli exclusion regime" of the tip-sample interaction. We discuss, from an experimentalist's perspective, a number of aspects of the exclusion principle which underpin this ability to achieve submolecular resolution. Our particular focus is on the origins, history, and interpretation of Pauli's principle in the context of interatomic and intermolecular interactions.
Comment: This is a chapter from "Imaging and Manipulation of Adsorbates using Dynamic Force Microscopy", a book which is part of the "Advances in Atom and Single Molecule Machines" series published by Springer [http://www.springer.com/series/10425]. To be published late 201
Disease progression in Plasmodium knowlesi malaria is linked to variation in invasion gene family members.
Emerging pathogens undermine initiatives to control the global health impact of infectious diseases. Zoonotic malaria is no exception. Plasmodium knowlesi, a malaria parasite of Southeast Asian macaques, has entered the human population. P. knowlesi, like Plasmodium falciparum, can reach high parasitaemia in human infections, and the World Health Organization guidelines for severe malaria list hyperparasitaemia among the measures of severe malaria in both infections. Not all patients with P. knowlesi infections develop hyperparasitaemia, and it is important to determine why. Between-isolate variability in erythrocyte invasion efficiency seems key. Here we investigate the idea that particular alleles of two P. knowlesi erythrocyte invasion genes, the P. knowlesi normocyte binding proteins Pknbpxa and Pknbpxb, influence parasitaemia and human disease progression. Pknbpxa and Pknbpxb reference DNA sequences were generated from five geographically and temporally distinct P. knowlesi patient isolates. Polymorphic regions of each gene (approximately 800 bp) were identified by haplotyping 147 patient isolates at each locus. Parasitaemia in the study cohort was associated with markers of disease severity including liver and renal dysfunction, haemoglobin, platelets and lactate (r >= 0.34, p < 0.0001 for all). Seventy-five Pknbpxa haplotypes were resolved in 138 (94%) patient isolates, and 51 Pknbpxb haplotypes in 134 (92%). The haplotypes formed twelve Pknbpxa and two Pknbpxb allelic groups. Patients infected with parasites carrying particular Pknbpxa and Pknbpxb alleles within these groups had significantly higher parasitaemia and other markers of disease severity. Our study strongly suggests that P. knowlesi invasion gene variants contribute to parasite virulence. We focused on two invasion genes, and we anticipate that additional virulence loci will be identified in pathogen genome-wide studies. The multiple sustained entries of this diverse pathogen into the human population must give cause for concern to malaria elimination strategists in the Southeast Asian region.
No imminent quantum supremacy by boson sampling
It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of photons in linear optics, which has sparked interest as a rapid way to demonstrate this quantum supremacy. Photon statistics are governed by intractable matrix functions known as permanents, which suggests that sampling from the distribution obtained by injecting photons into a linear-optical network could be solved more quickly by a photonic experiment than by a classical computer. The contrast between the apparently awesome challenge faced by any classical sampling algorithm and the apparently near-term experimental resources required for a large boson sampling experiment has raised expectations that quantum supremacy by boson sampling is on the horizon. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. While the largest boson sampling experiments reported so far are with 5 photons, our classical algorithm, based on Metropolised independence sampling (MIS), allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. We argue that the impact of experimental photon losses means that demonstrating quantum supremacy by boson sampling would require a step change in technology.
Comment: 25 pages, 9 figures. Comments welcome
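The permanents that make boson sampling classically hard can be illustrated directly. The sketch below is a textbook implementation of Ryser's inclusion-exclusion formula (not the paper's MIS algorithm): it computes a permanent exactly, but in exponential time, which is exactly why brute-force classical simulation stalls as the photon number grows.

```python
from itertools import combinations

def permanent(A):
    """Matrix permanent via Ryser's formula: exact, but exponential in n."""
    n = len(A)
    total = 0
    for r in range(1, n + 1):                 # inclusion-exclusion over
        for cols in combinations(range(n), r):  # non-empty column subsets
            prod = 1
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# Like the determinant but with every sign positive:
# perm([[1,2],[3,4]]) = 1*4 + 2*3 = 10, whereas det = 1*4 - 2*3 = -2.
print(permanent([[1, 2], [3, 4]]))   # 10
```

The sign cancellations that make determinants efficiently computable are absent here, so no polynomial-time shortcut analogous to Gaussian elimination is known; Ryser's formula remains among the fastest exact methods.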