Distances from Surface Brightness Fluctuations
The practice of measuring galaxy distances from their spatial fluctuations in
surface brightness is now a decade old. While several past articles have
included some review material, this is the first intended as a comprehensive
review of the surface brightness fluctuation (SBF) method. The method is
conceptually quite simple, the basic idea being that nearby (but unresolved)
star clusters and galaxies appear "bumpy", while more distant ones appear
smooth. This is quantified via a measurement of the amplitude of the Poisson
fluctuations in the number of unresolved stars encompassed by a CCD pixel
(usually in an image of an elliptical galaxy). Here, we describe the technical
details and difficulties involved in making SBF measurements, discuss
theoretical and empirical calibrations of the method, and review the numerous
applications of the method from the ground and space, in the optical and
near-infrared. We include discussions of stellar population effects and the
"universality" of the SBF standard candle. A final section considers the future
of the method.

Comment: Invited review article to appear in "Post-Hipparcos Cosmic Candles",
A. Heck & F. Caputo (Eds), Kluwer Academic Publ., Dordrecht, in press. 22
pages, including 3 postscript figures; uses Kluwer's crckapb.sty LaTeX macro
file, enclosed.
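The scaling behind the SBF method can be sketched numerically: a pixel of fixed angular size encloses a mean number of unresolved stars that grows as distance squared, while Poisson noise grows only as its square root, so the relative "bumpiness" falls as 1/distance. A toy simulation (all numbers illustrative, not from the review):

```python
import numpy as np

def sbf_relative_rms(distance, mean_stars_at_d1=25.0, n_pixels=200_000, seed=0):
    """Relative surface-brightness fluctuation for a toy galaxy image.

    A CCD pixel of fixed angular size encloses a mean star count that
    scales as distance**2; Poisson noise scales as its square root, so
    nearby galaxies look "bumpy" while distant ones look smooth.
    """
    rng = np.random.default_rng(seed)
    mean_n = mean_stars_at_d1 * distance**2
    counts = rng.poisson(mean_n, size=n_pixels)
    return counts.std() / counts.mean()   # ~ 1/sqrt(mean_n) ~ 1/distance

near = sbf_relative_rms(1.0)   # "bumpy"
far = sbf_relative_rms(4.0)    # ~4x smaller relative fluctuation
```

Comparing the two amplitudes recovers the relative distance, which is how SBF measurements are turned into a distance indicator once the absolute fluctuation magnitude is calibrated.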
The Distances of the Magellanic Clouds
The present status of our knowledge of the distances to the Magellanic Clouds
is evaluated from a post-Hipparcos perspective. After a brief summary of the
effects of structure, reddening, age and metallicity, the primary distance
indicators for the Large Magellanic Cloud are reviewed: The SN 1987A ring,
Cepheids, RR Lyraes, Mira variables, and Eclipsing Binaries. Distances derived
via these methods are weighted and combined to produce final "best" estimates
for the Magellanic Clouds distance moduli.

Comment: Invited review article to appear in "Post-Hipparcos Cosmic Candles",
F. Caputo & A. Heck (Eds.), Kluwer Academic Publ., Dordrecht, in press.
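Combining distance estimates from several indicators into a final "best" value is typically an inverse-variance weighted mean. A minimal sketch, with invented moduli and errors standing in for the individual LMC determinations:

```python
import numpy as np

def combine_moduli(moduli, sigmas):
    """Inverse-variance weighted mean of distance moduli and its error."""
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mu = np.average(moduli, weights=w)
    err = 1.0 / np.sqrt(w.sum())
    return mu, err

# Illustrative (invented) LMC moduli from different indicators,
# e.g. Cepheids, RR Lyraes, SN 1987A ring, eclipsing binaries:
mu, err = combine_moduli([18.55, 18.45, 18.30, 18.60],
                         [0.05, 0.10, 0.15, 0.10])
```

The weighting ensures that the most precise indicators dominate the combined modulus while the quoted error shrinks below that of any single method.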
Astronomical Spectroscopy
Spectroscopy is one of the most important tools that an astronomer has for
studying the universe. This chapter begins by discussing the basics, including
the different types of optical spectrographs, with extension to the ultraviolet
and the near-infrared. Emphasis is given to the fundamentals of how
spectrographs are used, and the trade-offs involved in designing an
observational experiment. It then covers observing and reduction techniques,
noting that some of the standard practices of flat-fielding often actually
degrade the quality of the data rather than improve it. Although the focus is
on point sources, spatially resolved spectroscopy of extended sources is also
briefly discussed. Discussions of differential extinction, the impact of
crowding, multi-object techniques, optimal extractions, flat-fielding
considerations, and determining radial velocities and velocity dispersions
provide the spectroscopist with the fundamentals needed to obtain the best
data. Finally the chapter combines the previous material by providing some
examples of real-life observing experiences with several typical instruments.

Comment: An abridged version of a chapter to appear in Planets, Stars and
Stellar Systems, to be published in 2011 by Springer. Slightly revised.
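One of the fundamentals the chapter covers, determining radial velocities, reduces in its simplest form to the non-relativistic Doppler relation. A minimal sketch (line and shift values below are illustrative):

```python
C_KM_S = 299_792.458  # speed of light, km/s

def radial_velocity(lambda_obs, lambda_rest):
    """Non-relativistic Doppler radial velocity (km/s) from a line shift.

    Positive values mean the source is receding (redshifted line).
    """
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest

# H-alpha (rest 6562.8 A) observed redshifted by 2.2 A:
v = radial_velocity(6565.0, 6562.8)   # ~ +100 km/s, receding
```

In practice velocities are measured by cross-correlating a whole spectrum against a template rather than from a single line, but the underlying scaling is this one.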
Simultaneous quantification of 12 different nucleotides and nucleosides released from renal epithelium and in human urine samples using ion-pair reversed-phase HPLC
Nucleotides and nucleosides are not only involved in cellular metabolism but also act extracellularly, via P1 and P2 receptors, to elicit a wide variety of physiological and pathophysiological responses through paracrine and autocrine signalling pathways. For the first time, we have used an ion-pair reversed-phase high-performance liquid chromatography ultraviolet (UV)-coupled method to rapidly and simultaneously quantify 12 different nucleotides and nucleosides (adenosine triphosphate, adenosine diphosphate, adenosine monophosphate, adenosine, uridine triphosphate, uridine diphosphate, uridine monophosphate, uridine, guanosine triphosphate, guanosine diphosphate, guanosine monophosphate, guanosine): (1) released from a mouse renal cell line (M1 cortical collecting duct) and (2) in human biological samples (i.e., urine). To facilitate analysis of urine samples, a solid-phase extraction step was incorporated (overall recovery rate ≥ 98%). All samples were analyzed following injection (100 µl) into a Synergi Polar-RP 80 Å (250 × 4.6 mm) reversed-phase column with a particle size of 10 µm, protected with a guard column. A gradient elution profile was run with a mobile phase (phosphate buffer plus the ion-pairing agent tetrabutylammonium hydrogen sulfate; pH 6) in 2-30% acetonitrile (v/v) for 35 min (including equilibration time) at a 1 ml min⁻¹ flow rate. Eluted compounds were detected by UV absorbance at 254 nm and quantified using standard curves for nucleotide and nucleoside mixtures of known concentration. Following validation (specificity, linearity, limits of detection and quantitation, system precision, accuracy, and intermediate precision parameters), this protocol was successfully and reproducibly used to quantify picomolar to nanomolar concentrations of nucleosides and nucleotides in isotonic and hypotonic cell buffers that transiently bathed M1 cells, and in urine samples from normal subjects and overactive bladder patients.
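Quantification against a standard curve, as described above, amounts to fitting a linear calibration of peak area versus known concentration and inverting it for unknowns. A minimal sketch with invented calibration points (not the paper's data):

```python
import numpy as np

# Hypothetical calibration standards for one analyte (e.g. ATP):
conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])          # concentration, uM
area = np.array([12.0, 61.0, 119.0, 604.0, 1195.0])  # UV peak area, 254 nm

# Linear standard curve: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

def quantify(peak_area):
    """Invert the standard curve: concentration = (area - b) / m."""
    return (peak_area - intercept) / slope

sample_conc = quantify(300.0)   # ~2.5 uM for these illustrative numbers
```

Linearity, limits of detection, and precision checks in a validated method all concern how trustworthy this inversion is near the low end of the curve.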
The porin and the permeating antibiotic: A selective diffusion barrier in gram-negative bacteria
Gram-negative bacteria are responsible for a large proportion of antibiotic-resistant bacterial diseases. These bacteria have a complex cell envelope comprising an outer membrane and an inner membrane that delimit the periplasm. The outer membrane contains various protein channels, called porins, which are involved in the influx of various compounds, including several classes of antibiotics. Bacterial adaptation to reduce influx through porins is a growing problem worldwide that contributes, together with efflux systems, to the emergence and dissemination of antibiotic resistance. An exciting challenge is to decipher the genetic and molecular basis of membrane impermeability as a bacterial resistance mechanism. This Review outlines the bacterial response to antibiotic stress through altered membrane permeability and discusses recent advances in molecular approaches that are improving our knowledge of the physico-chemical parameters that govern the translocation of antibiotics through porin channels.
Episodic Source Memory Over-Distribution by Quantum-Like Dynamics – A Model Exploration
In source memory studies, a decision-maker is concerned with identifying the context in which a given episodic experience occurred. A common paradigm for studying source memory is the ‘three-list’ experimental paradigm, where a subject studies three lists of words and is later asked whether a given word appeared on one or more of the studied lists. Surprisingly, the sum total of the acceptance probabilities generated by asking for the source of a word separately for each list (‘list 1?’, ‘list 2?’, ‘list 3?’) exceeds the acceptance probability generated by asking whether that word occurred on the union of the lists (‘list 1 or 2 or 3?’). The episodic memory for a given word therefore appears over-distributed on the disjoint contexts of the lists. A quantum episodic memory model [QEM] was proposed by Brainerd, Wang and Reyna [8] to explain this type of result. In this paper, we apply a Hamiltonian dynamical extension of QEM for over-distribution of source memory. The Hamiltonian operators are simultaneously driven by parameters for re-allocation of gist-based and verbatim-based acceptance support as subjects are exposed to the cue word in the first temporal stage, and are attenuated for description-dependence by the querying probe in the second temporal stage. Overall, the model predicts well the choice proportions in both separate-list and union-list queries and the over-distribution effect, suggesting that a Hamiltonian dynamics for QEM can provide a good account of the acceptance processes involved in episodic memory tasks.
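The over-distribution effect itself is just an arithmetic comparison between the three separate-list acceptance probabilities and the union-query probability. A minimal sketch with invented acceptance rates (not the paper's data):

```python
def overdistribution_index(p_separate, p_union):
    """Episodic over-distribution index: the sum of separate-list
    acceptance probabilities minus the union-query acceptance
    probability. If each studied word were attributed to a single
    source, this index should not be positive; empirically it is."""
    return sum(p_separate) - p_union

# Illustrative (invented) acceptance rates for one cue word:
eod = overdistribution_index([0.45, 0.30, 0.25], 0.80)   # positive index
```

A positive index for disjoint list contexts is the signature the quantum-like (QEM) account sets out to model.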
Statistical learning leads to persistent memory: evidence for one-year consolidation
Statistical learning is a robust mechanism of the brain that enables the extraction of environmental patterns, which is crucial in perceptual and cognitive domains. However, the dynamical change of the processes underlying long-term statistical memory formation has not been tested in an appropriately controlled design. Here we show that a memory trace acquired by statistical learning is resistant to interference as well as to forgetting after one year. Participants performed a statistical learning task and were retested one year later without further practice. The acquired statistical knowledge was resistant to interference, since after one year, participants showed similar memory performance on the previously practiced statistical structure after being tested with a new statistical structure. These results could be key to understanding the stability of long-term statistical knowledge.
A new method for determining physician decision thresholds using empiric, uncertain recommendations
Background: The concept of risk thresholds has been studied in medical decision making for over 30 years. During that time, physicians have been shown to be poor at estimating the probabilities required to use this method. To better assess physician risk thresholds and to more closely model medical decision making, we set out to design and test a method that derives thresholds from actual physician treatment recommendations. Such an approach would avoid the need to ask physicians for estimates of patient risk when trying to determine individual thresholds for treatment. Assessments of physician decision making are increasingly relevant as new data are generated from clinical research. For example, recommendations made in the setting of ocular hypertension are of interest, as a large clinical trial has identified new risk factors that should be considered by physicians. Precisely how physicians use this new information when making treatment recommendations has not yet been determined.

Results: We derived a new method for estimating treatment thresholds using ordinal logistic regression and tested it by asking ophthalmologists to review cases of ocular hypertension before expressing how likely they would be to recommend treatment. Fifty-eight physicians were recruited from the American Glaucoma Society. Demographic information was collected from the participating physicians and the treatment threshold for each physician was estimated. The method was validated by showing that while treatment thresholds varied over a wide range, the most common values were consistent with the 10-15% 5-year risk of glaucoma suggested by expert opinion and decision analysis.

Conclusions: This method has advantages over prior means of assessing treatment thresholds. It does not require physicians to explicitly estimate patient risk and it allows for uncertainty in the recommendations. These advantages will make it possible to use this method when assessing interventions intended to alter clinical decision making.
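The core idea of deriving a threshold from recommendations can be sketched with a simplified binary stand-in for the paper's ordinal logistic regression: fit the probability of a "treat" recommendation against case risk, then read off the risk at which that probability crosses 0.5. All case data below are invented for illustration:

```python
import numpy as np

def fit_threshold(risk, treat, lr=0.1, iters=20_000):
    """Fit a binary logistic regression of treat/don't-treat
    recommendations on case risk (in %) by gradient ascent, and
    return the risk at which P(treat) = 0.5.

    This is a simplified stand-in for the paper's ordinal logistic
    regression, which handles graded likelihood-of-treating responses.
    """
    x = np.asarray(risk, dtype=float)
    y = np.asarray(treat, dtype=float)
    xm = x.mean()
    xc = x - xm                      # center risk for stable steps
    b0 = b1 = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * xc)))
        b0 += lr * np.mean(y - p)
        b1 += lr * np.mean((y - p) * xc)
    # P(treat) = 0.5 where b0 + b1 * (risk - xm) = 0
    return xm - b0 / b1

# Illustrative (invented) case reviews: 5-year risk in %, 1 = would treat
risk = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
treat = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
threshold = fit_threshold(risk, treat)   # this physician's risk threshold, %
```

The same read-off applies per physician, which is how individual thresholds can be compared against the 10-15% range cited above without ever asking physicians to estimate probabilities directly.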
Empirical Examination of the Role of Three Sets of Innovation Attributes for Determining Adoption of IRCTC Mobile Ticketing Service
The Indian Railway Catering and Tourism Corporation Limited's (IRCTC) mobile ticketing service was recently introduced in India. In this study of its adoption, three competing attribute sets are compared, with the aim of revealing which set best predicts adoption. The research model was empirically tested and validated using SPSS. Four attributes from the Diffusion of Innovations (DOI) theory, four from the Perceived Characteristics of Innovating (PCI) framework, and four from Tornatzky and Klein's meta-analysis significantly affected behavioral intentions. Only complexity failed to influence use intentions, while behavioral intention and riskiness significantly impacted adoption.