Long-Term Potentiation: One Kind or Many?
Do neurobiologists aim to discover natural kinds? I address this question in this chapter via a critical analysis of the classification practices operative across the 43-year history of research on long-term potentiation (LTP). I argue that this history supports the idea that the structure of scientific practice surrounding LTP research has remained an obstacle to the discovery of natural kinds.
Potentiality in Biology
We take the potentialities that are studied in the biological sciences (e.g., totipotency) to be an important subtype of biological dispositions. The goal of this paper is twofold: first, we want to provide a detailed understanding of what biological dispositions are. We claim that two features are essential for dispositions in biology: the importance of the manifestation process and the diversity of conditions that need to be satisfied for the disposition to be manifested. Second, we demonstrate that the concept of a disposition (or potentiality) is a very useful tool for the analysis of explanatory practice in the biological sciences. On the one hand, it allows an in-depth analysis of the nature and diversity of the conditions under which biological systems display specific behaviors. On the other hand, the concept of a disposition may serve a unificatory role in the philosophy of the natural sciences, since it captures the explanatory practice not only of biology but of all natural sciences. Towards the end we briefly come back to the notion of a potentiality in biology.
Clinical significance of perioperative Q-wave myocardial infarction: The Emory Angioplasty versus Surgery Trial
Abstract. Objective: The primary end point of the Emory Angioplasty versus Surgery Trial was a composite of three events: death, Q-wave infarction, and a new large defect on 3-year postoperative thallium scan. This study examines the clinical significance of Q-wave infarction in the surgical cohort (194 patients) of the Emory trial. Methods: Twenty patients (10.3%) with Q-wave infarctions were identified: 13 patients had inferior Q-wave infarctions and seven patients had anterior, lateral, septal, or posterior Q-wave infarctions (termed anterior Q-wave infarctions). Results: In the inferior Q-wave infarction group, postoperative cardiac catheterization (at 1 year or 3 years) in 11 patients revealed normal ejection fraction (ejection fraction >55%) in 10 (91%), no wall motion abnormalities in 10 (91%), and all grafts patent in 10 (91%). In the anterior Q-wave infarction group, postoperative catheterization in six patients revealed normal ejection fractions in five (83%), no wall motion abnormalities in three (50%), and all grafts patent in three (50%). Average peak postoperative creatine kinase MB levels were as follows: no Q-wave infarction (n = 174) 37 ± 43 IU/L, inferior Q-wave infarction 40 ± 27 IU/L, and anterior Q-wave infarction 58 ± 38 IU/L. Mortality in the 20 patients with Q-wave infarctions was 5% (1/20) at 3 years; in patients without a Q-wave infarction it was 6.3% (11/174) (p = 0.64). Of 17 patients with a Q-wave infarction who underwent postoperative catheterization, 11 (65%) had a normal ejection fraction, normal wall motion, and all grafts patent with an uneventful 3-year postoperative course. Conclusions: The core laboratory screening of postoperative electrocardiograms, particularly in the case of inferior Q-wave infarctions, appears to identify a number of patients as having a Q-wave infarction with minimal clinical significance.
Q-wave infarction identified in the postoperative period seems to be a weak end point with little prognostic significance and is therefore not valuable for future randomized trials. (J Thorac Cardiovasc Surg 1996;112:1447-54)
Compare and Contrast: How to Assess the Completeness of Mechanistic Explanation
Opponents of the new mechanistic account of scientific explanation argue that the new mechanists are committed to a ‘More Details Are Better’ claim: adding details about the mechanism always improves an explanation. Due to this commitment, the mechanistic account cannot be descriptively adequate, as actual scientific explanations usually leave out details about the mechanism. In reply to this objection, defenders of the new mechanistic account have highlighted that only adding relevant mechanistic details improves an explanation, and that relevance is to be determined relative to the phenomenon-to-be-explained. Craver and Kaplan (B J Philos Sci 71:287–319, 2020) provide a thorough reply along these lines, specifying that the phenomena at issue are contrasts. In this paper, we discuss Craver and Kaplan’s reply. We argue that it needs to be modified in order to avoid three problems, namely what we call the Odd Ontology Problem, the Multiplication of Mechanisms Problem, and the Ontic Completeness Problem. However, even this modification is confronted with two challenges: first, it remains unclear how explanatory relevance is to be determined for contrastive explananda within the mechanistic framework. Second, it remains to be shown how the new mechanistic account can avoid what we call the ‘Vertical More Details Are Better’ objection. We provide answers to both challenges.
Radiative capture of protons by deuterons
The differential cross section for radiative capture of protons by deuterons
is calculated using different realistic NN interactions. We compare our results
with the available experimental data below …. Excellent agreement
is found when taking into account meson-exchange currents, dipole and
quadrupole contributions, and the full initial-state interaction. There is only
a small difference between the magnitudes of the cross sections for the
different potentials considered. The angular distributions, however, are
practically potential-independent. Comment: 4 pages (two-column), 4 postscript
figures included, submitted for publication, revised version
JLab Measurement of the 4He Charge Form Factor at Large Momentum Transfers
The charge form factor of 4He has been extracted in the range 29 fm-2 < Q2 <
… fm-2 from elastic electron scattering, by detecting 4He nuclei and electrons
in coincidence with the High Resolution Spectrometers of the Hall A Facility of
Jefferson Lab. The results are in qualitative agreement with realistic
meson-nucleon theoretical calculations. The data have uncovered a second
diffraction minimum, which was predicted in the range of this experiment, and
conclusively rule out long-standing predictions of dimensional scaling of
high-energy amplitudes using quark counting. Comment: 4 pages, 2 figures
JLab Measurements of the 3He Form Factors at Large Momentum Transfers
The charge and magnetic form factors, FC and FM, of 3He have been extracted
in the kinematic range 25 fm-2 < Q2 < 61 fm-2 from elastic electron scattering
by detecting 3He recoil nuclei and electrons in coincidence with the High
Resolution Spectrometers of the Hall A Facility at Jefferson Lab. The
measurements are indicative of a second diffraction minimum for the magnetic
form factor, which was predicted in the Q2 range of this experiment, and of a
continuing diffractive structure for the charge form factor. The data are in
qualitative agreement with theoretical calculations based on realistic
interactions and accurate methods to solve the three-body nuclear problem.
On the Gold Standard for Security of Universal Steganography
While symmetric-key steganography is quite well understood both in the
information-theoretic and in the computational setting, many fundamental
questions about its public-key counterpart resist persistent attempts to solve
them. The computational model for public-key steganography was proposed by von
Ahn and Hopper in EUROCRYPT 2004. At TCC 2005, Backes and Cachin gave the first
universal public-key stegosystem - i.e. one that works on all channels -
achieving security against replayable chosen-covertext attacks (SS-RCCA) and
asked whether security against non-replayable chosen-covertext attacks (SS-CCA)
is achievable. Later, Hopper (ICALP 2005) provided such a stegosystem for every
efficiently sampleable channel, but did not achieve universality. He posed the
question whether universality and SS-CCA-security can be achieved
simultaneously. No progress had been made on this question for more than
a decade. In this work we resolve Hopper's problem in an essentially complete manner:
As our main positive result we design an SS-CCA-secure stegosystem that works
for every memoryless channel. On the other hand, we prove that this result is
the best possible in the context of universal steganography. We provide a
family of 0-memoryless channels - where the already sent documents have only
marginal influence on the current distribution - and prove that no
SS-CCA-secure steganography for this family exists in the standard
non-look-ahead model. Comment: EUROCRYPT 2018, LLNCS style
Deeply Virtual Compton Scattering off the neutron
The present experiment exploits the interference between the Deeply Virtual
Compton Scattering (DVCS) and the Bethe-Heitler processes to extract the
imaginary part of DVCS amplitudes on the neutron and on the deuteron from the
helicity-dependent cross section measured on a deuterium target at Q2 = 1.9
GeV2 and xB = 0.36. We extract a linear combination of generalized parton
distributions (GPDs) particularly sensitive to E, the least constrained
GPD. A model-dependent constraint on the contribution of the up and down quarks
to the nucleon spin is deduced. Comment: Published in Phys. Rev. Lett.