Transverse Electronic Transport through DNA Nucleotides with Functionalized Graphene Electrodes
Graphene nanogaps and nanopores show potential for the purpose of electrical
DNA sequencing, in particular because single-base resolution appears to be
readily achievable. Here, we evaluated from first principles the advantages of
a nanogap setup with functionalized graphene edges. To this end, we employed
density functional theory and the non-equilibrium Green's function method to
investigate the transverse conductance properties of the four nucleotides
occurring in DNA when located between the opposing functionalized graphene
electrodes. In particular, we determined the variation of the tunneling
current with the applied bias and the associated differential conductance at a
voltage that appears suitable for distinguishing between the four nucleotides.
Intriguingly, for one of the nucleotides we observe a negative differential
resistance effect. Comment: 19 pages, 7 figures
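For orientation, the current-voltage characteristics and differential conductance described above are, in NEGF-DFT transport calculations of this kind, commonly obtained from a Landauer-type expression; the relations below are a generic sketch for the reader, not reproduced from the paper, and the symbols T(E,V), f, and mu_L/R (with a symmetric bias split assumed) are standard notation introduced here only for illustration.

% Generic Landauer-type relations commonly underlying NEGF-DFT
% current-voltage and differential-conductance calculations
% (a sketch for orientation, not taken from the paper itself):
\[
  I(V) = \frac{2e}{h} \int \mathrm{d}E \; T(E,V)
         \left[ f(E-\mu_{\mathrm{L}}) - f(E-\mu_{\mathrm{R}}) \right],
  \qquad
  G(V) = \frac{\mathrm{d}I}{\mathrm{d}V},
\]
where $T(E,V)$ is the bias-dependent transmission through the nucleotide in the gap, $f$ is the Fermi function, and $\mu_{\mathrm{L,R}} = E_{\mathrm{F}} \pm eV/2$ are the chemical potentials of the two graphene electrodes. In this language, the negative differential resistance mentioned above corresponds to a bias window in which $\mathrm{d}I/\mathrm{d}V < 0$.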
A new protocol for the propagation of dendritic cells from rat bone marrow using recombinant GM-CSF, and their quantification using the mAb OX-62
Bone marrow (BM)-derived dendritic cells (DC) are the most potent known antigen (Ag)-presenting cells in vivo and in vitro. Detailed analysis of their properties and mechanisms of action requires an ability to produce large numbers of DC. Although DC have been isolated from several rat tissues, including BM, the yield is uniformly low. We describe a simple method for the propagation of large numbers of DC from rat BM and document the cell yield with the rat DC marker OX-62. After depletion of plastic-adherent and Fc+ cells by panning on dishes coated with normal serum, residual BM cells were cultured in gelatin-coated flasks in medium supplemented with murine rGM-CSF. Prior to analysis, non-adherent cells were re-depleted of contaminating Fc+ cells. Propagation of DC was monitored by double staining for FACS analysis (major histocompatibility complex (MHC) class II+, OX-62+, OX-19-). Functional assays, morphological analysis and evaluation of the homing patterns of cultured cells revealed typical DC characteristics. MHC class II and OX-62 antigen expression increased with time in culture and correlated with allostimulatory ability. DC yield increased until day 7, when 3.3 × 10⁶ DC were obtained from an initial 3 × 10⁸ unfractionated BM cells. Significant numbers of DC can be generated from rat BM using these simple methods. This should permit analysis and manipulation of rat DC functions in vivo and in vitro. © 1995
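As a point of scale not spelled out in the abstract, the reported yield corresponds to 3.3 × 10⁶ / 3 × 10⁸ ≈ 0.011, i.e. roughly 1% of the unfractionated BM cells placed in culture are recovered as DC by day 7.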
30 years of collaboration
We highlight some of the most important cornerstones of the long-standing and very fruitful collaboration between the Austrian Diophantine Number Theory research group and the Number Theory and Cryptography School of Debrecen. We do not aim to be complete in any sense, but rather give some interesting data and selected results that we find particularly nice. At the end we focus on two topics in more detail, namely a problem that originates from a conjecture of Rényi and Erdős (on the number of terms of the square of a polynomial) and another that originates from a question of Zelinsky (on the unit sum number problem). This paper evolved from a plenary invited talk that the authors gave at the Joint Austrian-Hungarian Mathematical Conference 2015, August 25-27, 2015, in Győr (Hungary).
Market Efficiency after the Financial Crisis: It's Still a Matter of Information Costs
Compared to the worldwide financial carnage that followed the Subprime Crisis of 2007-2008, it may seem of small consequence that it is also said to have demonstrated the bankruptcy of an academic financial institution: the Efficient Capital Market Hypothesis (“ECMH”). Two things make this encounter between theory and seemingly inconvenient facts of consequence. First, the ECMH had moved beyond academia, fueling decades of a deregulatory agenda. Second, when economic theory moves from academics to policy, it also enters the realm of politics, and is inevitably refashioned to serve the goals of political argument. This happened starkly with the ECMH. It was subject to its own bubble – as a result of politics, it expanded from a narrow but important academic theory about the informational underpinnings of market prices to a broad ideological preference for market outcomes over even measured regulation. In this Article we examine the Subprime Crisis as a vehicle to return the ECMH to its information cost roots that support a more modest but sensible regulatory policy. In particular, we argue that the ECMH addresses informational efficiency, which is a relative, not an absolute measure. This focus on informational efficiency leads to a more focused understanding of what went wrong in 2007-2008. Yet informational efficiency is related to fundamental efficiency – if all information relevant to determining a security’s fundamental value is publicly available and the mechanisms by which that information comes to be reflected in the securities market price operate without friction, fundamental and informational efficiency coincide. But where all value relevant information is not publicly available and/or the mechanisms of market efficiency operate with frictions, the coincidence is an empirical question both as to the information efficiency of prices and their relation to fundamental value. Properly framing market efficiency focuses our attention on the frictions that drive a wedge between relative efficiency and efficiency under perfect market conditions. So framed, relative efficiency is a diagnostic tool that identifies the information costs and structural barriers that reduce price efficiency, which, in turn, provides part of a realistic regulatory strategy. While it will not prevent future crises, improving the mechanisms of market efficiency will make prices more efficient, frictions more transparent, and the influence of politics on public agencies more observable, which may allow us to catch the next problem earlier. Recall that on September 8, 2008, the Congressional Budget Office publicly stated its uncertainty about whether there would be a recession and predicted 1.5 percent growth in 2009. Eight days later, Lehman Brothers had failed, and AIG was being nationalized.
