
    A Study of Ductile Damage and Failure of Pure Copper – Part I: Constitutive Equations and Experiments

    This paper presents the results of an experimental study of ductile damage and failure of pure copper. Uniaxial tension tests were performed on specimens with different arrangements of pre-drilled micro-holes serving as physical models of cylindrical voids. This experimental method has already been applied by a number of researchers to investigate the damage of metals under plastic deformation and has proved useful for studying the evolution of damage in ductile materials in terms of the local strains of both representative volume elements (RVE) and meso-elements (i.e., material unit cells with a single void). Two measures are used for the assessment of damage in the deformed material. The first relates damage to an increase in the void volume. The second accounts for the damage associated with a change in the void shape. Both measures were introduced as part of a tensorial theory of damage in Zapara et al. (2008). They are based on experimental studies of damage kinetics in metallic materials under plasticity conditions. In combination with similar data from the literature, the obtained results are important for the modeling of metal forming processes with dominant tensile deformation (e.g., deep drawing, ironing, wire drawing).
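The two damage measures belong to the tensorial theory of Zapara et al. (2008), which the abstract does not spell out. As a rough illustration only, scalar analogues of "volume-type" and "shape-type" damage can be computed from measured hole dimensions; the function names and formulas below are assumptions for illustration, not the paper's formulation.

```python
def volumetric_damage(v_voids: float, v_total: float) -> float:
    """Scalar damage from void volume growth: the current void volume fraction."""
    return v_voids / v_total

def shape_damage(a: float, b: float) -> float:
    """Scalar damage from void shape change: deviation of the semi-axis
    ratio of an initially circular hole (a = b) from unity."""
    return abs(a / b - 1.0)

# Hypothetical measurements of a pre-drilled micro-hole before/after tension
d0 = volumetric_damage(v_voids=0.8, v_total=100.0)   # undeformed state
d1 = volumetric_damage(v_voids=1.4, v_total=100.0)   # after straining
growth = d1 - d0                                     # volume-type damage increment
elongation = shape_damage(a=0.75, b=0.45)            # hole has become elliptical
```

In the tensorial theory both quantities are components of a damage tensor rather than scalars; the sketch only conveys the distinction between the two mechanisms.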

    A Study of Ductile Damage and Failure of Pure Copper – Part II: Analysis of the Deep Drawing Process of a Cylindrical Shell

    The analysis of the stress-strain state and the strain-induced damage of a cylindrical copper shell during deep drawing is presented. The stresses on the contact surfaces of the operating tools (punch and die) are assigned implicitly, which leads to a mixed boundary-value problem. The results are obtained by solving the constitutive differential equations describing plane plastic flow in curvilinear characteristic coordinates. The material functions required for the analysis of deep drawing were obtained from experimental studies of ductile damage and failure of pure copper (cf. Zapara et al., 2011). The process of deformation with discontinuities of the tangential velocities at the plastic zone boundaries is discussed. An estimate of the local strains and damage in the material is given both for the plastic zone and for its boundaries. The distributions of strains and of damage within the wall of a finished part are determined. These distributions strongly affect the strength properties of a shell. The modeling of ductile damage in a material during deep drawing is based on experimental results and considerably extends them to a wider range of stress triaxialities. It is shown that the use of a drawing die with a cone angle of 12°...13° leads to a noticeable shift of the stress triaxialities into a range of negative values as compared to deep drawing with a die of larger angle (15°...18°). The modeling reveals a smoother increase and decrease in damage of the finished part in the case of the smaller cone angle, as well as the absence of void coalescence. This fact is very important when manufacturing such products at high operating speeds. The obtained results, in combination with similar ones from the literature, can be applied to the analysis of metal forming processes with dominant tensile deformation (e.g., drawing, deep drawing, strip drawing, ironing).
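Stress triaxiality, the quantity shifted toward negative values by the smaller die cone angle above, is the ratio of mean stress to von Mises equivalent stress. A minimal sketch of its computation from principal stresses (the numerical values are hypothetical):

```python
import math

def triaxiality(s1: float, s2: float, s3: float) -> float:
    """Stress triaxiality eta = sigma_mean / sigma_eq (von Mises)."""
    mean = (s1 + s2 + s3) / 3.0
    eq = math.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))
    return mean / eq

# Uniaxial tension gives eta = +1/3; uniaxial compression gives -1/3.
# Negative triaxiality (compression-dominated states) suppresses void growth,
# which is why the shift reported above matters for damage.
eta_tension = triaxiality(100.0, 0.0, 0.0)
eta_compression = triaxiality(-100.0, 0.0, 0.0)
```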

    X-ray resonant photoexcitation: line widths and energies of Kα transitions in highly charged Fe ions

    Photoabsorption by and fluorescence of the Kα transitions in highly charged iron ions are essential mechanisms for X-ray radiation transfer in astrophysical environments. We study photoabsorption due to the main Kα transitions in highly charged iron ions from heliumlike to fluorinelike (Fe 24+...17+) using monochromatic X-rays around 6.6 keV at the PETRA III synchrotron photon source. Natural linewidths were determined with hitherto unattained accuracy. The observed transitions are of particular interest for the understanding of photoexcited plasmas found in X-ray binaries and active galactic nuclei.

    Classificatory Theory in Data-Intensive Science: The Case of Open Biomedical Ontologies

    This is the author's version of a paper that was subsequently published in International Studies in the Philosophy of Science; please cite the published version. Knowledge-making practices in biology are being strongly affected by the availability of data on an unprecedented scale, the insistence on systemic approaches and a growing reliance on bioinformatics and digital infrastructures. What role does theory play within data-intensive science, and what does that tell us about scientific theories in general? To answer these questions, I focus on Open Biomedical Ontologies, digital classification tools that have become crucial to sharing results across research contexts in the biological and biomedical sciences, and argue that they constitute an example of classificatory theory. This form of theorizing emerges from classification practices in conjunction with experimental know-how and expresses the knowledge underpinning the analysis and interpretation of data disseminated online. Funding: Economic and Social Research Council (ESRC), The British Academy, Leverhulme Trust.

    The emergence of modern statistics in agricultural science : Analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919–1933

    During the twentieth century, statistical methods transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results, and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher’s methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand, the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians’ tools and expertise into the station research programme. Fisher’s statistical methods did not remain confined within agricultural research, and by the end of the 1950s they had become established in psychology, sociology, education, chemistry, medicine, engineering, economics and quality control, to mention just a few of the disciplines that adopted them.
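Fisher's one-way analysis of variance partitions total variation into a between-treatment part and a within-treatment part and compares them through an F ratio. A minimal stdlib sketch on invented plot yields (the field data below are purely illustrative, not Rothamsted data):

```python
from statistics import mean

def one_way_anova_F(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)                       # number of treatments
    n = sum(len(g) for g in groups)       # total number of plots
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical yields (t/ha) from plots under three fertiliser treatments
F = one_way_anova_F([
    [4.1, 4.3, 3.9, 4.2],
    [4.8, 5.0, 4.7, 4.9],
    [4.0, 4.2, 4.1, 3.9],
])
# A large F (relative to the F distribution with k-1 and n-k degrees of
# freedom) indicates the treatment means differ beyond within-plot noise.
```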

    Matter in Strong Magnetic Fields

    The properties of matter are significantly modified by strong magnetic fields, B ≫ 2.35×10^9 Gauss (1 G = 10^-4 Tesla), as are typically found on the surfaces of neutron stars. In such strong magnetic fields, the Coulomb force on an electron acts as a small perturbation compared to the magnetic force. The strong field condition can also be mimicked in laboratory semiconductors. Because of the strong magnetic confinement of electrons perpendicular to the field, atoms attain a much greater binding energy compared to the zero-field case, and various other bound states become possible, including molecular chains and three-dimensional condensed matter. This article reviews the electronic structure of atoms, molecules and bulk matter, as well as the thermodynamic properties of dense plasma, in strong magnetic fields, 10^9 G ≪ B < 10^16 G. The focus is on the basic physical pictures and approximate scaling relations, although various theoretical approaches and numerical results are also discussed. For a neutron star surface composed of light elements such as hydrogen or helium, the outermost layer constitutes a nondegenerate, partially ionized Coulomb plasma if B ≪ 10^14 G, and may be in the form of a condensed liquid if the magnetic field is stronger (and the temperature is below 10^6 K). For an iron surface, the outermost layer of the neutron star can be in a gaseous or a condensed phase depending on the cohesive property of the iron condensate. Accepted for publication in Reviews of Modern Physics.
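The threshold quoted above, B ≈ 2.35×10^9 G, is the field at which the electron cyclotron energy ħω_c equals twice the Rydberg energy, so that magnetic confinement rivals Coulomb binding; it can be checked directly from CODATA constants:

```python
HBAR = 1.054571817e-34        # reduced Planck constant, J s
E_CHARGE = 1.602176634e-19    # elementary charge, C
M_E = 9.1093837015e-31        # electron mass, kg
RYDBERG_J = 2.1798723611e-18  # Rydberg energy, J

# hbar * omega_c = hbar * e * B / m_e = 2 Ry  =>  B0 = 2 Ry m_e / (hbar e)
b0_tesla = 2 * RYDBERG_J * M_E / (HBAR * E_CHARGE)
b0_gauss = b0_tesla * 1e4     # 1 G = 1e-4 T, so 1 T = 1e4 G
```

Fields well above b0_gauss are the "strong field" regime of the review, where the Coulomb force is the small perturbation.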

    Secondary Endoleak Management Following TEVAR and EVAR.

    Endovascular repair of abdominal and thoracic aortic aneurysms (EVAR and TEVAR) is widely used to treat increasingly complex aneurysms. Secondary endoleaks, defined as those detected more than 30 days after the procedure and after previous negative imaging, remain a challenge for aortic specialists, conferring a need for long-term surveillance and reintervention. Endoleaks are classified on the basis of their anatomic site and aetiology. Type 1 and type 2 endoleaks (EL1 and EL2) are the most common endoleaks necessitating intervention. Their management requires an understanding of their mechanics, and of the risk of sac enlargement and rupture due to increased sac pressure. Endovascular techniques are the main treatment approach for secondary endoleaks. However, surgery should be considered where endovascular treatments fail to arrest aneurysm growth. This chapter reviews the aetiology, significance, management strategy and techniques for the different endoleak types.

    Cold atoms in space: community workshop summary and proposed road-map

    We summarise the discussions at a virtual Community Workshop on Cold Atoms in Space concerning the status of cold atom technologies, the prospective scientific and societal opportunities offered by their deployment in space, and the developments needed before cold atoms could be operated in space. The cold atom technologies discussed include atomic clocks, quantum gravimeters and accelerometers, and atom interferometers. Prospective applications include metrology, geodesy and measurement of terrestrial mass change due to, e.g., climate change, and fundamental science experiments such as tests of the equivalence principle, searches for dark matter, measurements of gravitational waves and tests of quantum mechanics. We review the current status of cold atom technologies and outline the requirements for their space qualification, including the development paths and the corresponding technical milestones, and identifying possible pathfinder missions to pave the way for missions to exploit the full potential of cold atoms in space. Finally, we present a first draft of a possible road-map for achieving these goals, that we propose for discussion by the interested cold atom, Earth Observation, fundamental physics and other prospective scientific user communities, together with the European Space Agency (ESA) and national space and research funding agencies.
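For the quantum gravimeters mentioned, the standard figure of merit is the gravitationally induced phase of a Mach-Zehnder atom interferometer, φ = k_eff g T², which grows quadratically with the free-fall time T (a key motivation for operating in space). A back-of-the-envelope sketch; the parameter values are illustrative, assuming two-photon Raman transitions on the Rb D2 line:

```python
import math

WAVELENGTH = 780e-9                      # m, Rb D2 line (assumed species)
K_EFF = 2 * (2 * math.pi / WAVELENGTH)   # counter-propagating Raman beams
G = 9.81                                 # m/s^2, local gravity
T = 0.1                                  # s, pulse separation time

# Phase accumulated between the two interferometer arms; sensitivity to g
# scales as T^2, so longer free fall (e.g. in space) pays off quadratically.
phase = K_EFF * G * T ** 2               # rad
```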
