
    Speciation without chromatography: Part I. Determination of tributyltin in aqueous samples by chloride generation, headspace solid-phase microextraction and inductively coupled plasma time of flight mass spectrometry

    An analytical procedure was developed for the determination of tributyltin in aqueous samples. The relatively high volatility of the organometal halide species makes them suitable for headspace sampling from the vapour phase above natural waters or leached solid samples. Tributyltin was collected from the sample headspace above various chloride-containing matrices, including HCl, sodium chloride solution and sea-water, by passive sampling using a polydimethylsiloxane/divinylbenzene (PDMS/DVB)-coated solid-phase microextraction (SPME) fibre. Inductively coupled plasma time-of-flight mass spectrometry (ICP-TOFMS) was used for detection following thermal desorption of analytes from the fibre. A detection limit of 5.8 pg ml⁻¹ (as tin) was realized in aqueous samples. Method validation was achieved using NRCC PACS-2 (Sediment) certified reference material, for which reasonable agreement between certified and measured values for tributyltin content was obtained.
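
    The reported detection limit presumably follows the usual 3σ convention: three times the standard deviation of replicate blank measurements divided by the slope of the calibration curve. A minimal sketch of that calculation, using invented blank and calibration numbers rather than the paper's data:

```python
import numpy as np

# Hypothetical replicate blank signals (detector counts); invented values
blank_counts = np.array([212.0, 198.0, 205.0, 221.0, 209.0, 201.0, 215.0])

# Hypothetical calibration: signal (counts) vs. TBT concentration (pg/ml as Sn)
conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0])            # pg/ml
signal = np.array([207.0, 980.0, 2130.0, 4150.0, 8240.0])  # counts

# Least-squares slope of the calibration curve = sensitivity (counts per pg/ml)
slope, intercept = np.polyfit(conc, signal, 1)

# 3-sigma limit of detection: 3 x sample SD of the blank / sensitivity
lod = 3.0 * blank_counts.std(ddof=1) / slope
print(f"Detection limit: {lod:.1f} pg/ml (as Sn)")
```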

    RGB-D Mapping and Tracking in a Plenoxel Radiance Field

    Building on the success of Neural Radiance Fields (NeRFs), recent years have seen significant advances in the domain of novel view synthesis. These models capture the scene's volumetric radiance field, creating highly convincing dense photorealistic models through the use of simple, differentiable rendering equations. Despite their popularity, these algorithms suffer from severe ambiguities in visual data inherent to the RGB sensor, which means that although images generated with view synthesis can visually appear very believable, the underlying 3D model will often be wrong. This considerably limits the usefulness of these models in practical applications like robotics and extended reality (XR), where an accurate dense 3D reconstruction otherwise would be of significant value. In this technical report, we present the vital differences between view synthesis models and 3D reconstruction models. We also comment on why a depth sensor is essential for modeling accurate geometry in general outward-facing scenes using the current paradigm of novel view synthesis methods. Focusing on the structure-from-motion task, we practically demonstrate this need by extending the Plenoxel radiance field model, presenting an analytical differential approach for dense mapping and tracking with radiance fields based on RGB-D data without a neural network. Our method achieves state-of-the-art results in both the mapping and tracking tasks while also being faster than competing neural network-based approaches. Comment: The two authors contributed equally to this paper.
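
    The "simple, differentiable rendering equations" referenced here are the standard emission-absorption quadrature shared by NeRF- and Plenoxel-style models. The same per-sample weights that composite color along a ray also yield an expected depth, which is what an RGB-D extension can supervise directly. A minimal NumPy sketch of that quadrature (the paper's actual losses and optimization details may differ):

```python
import numpy as np

def composite_ray(sigmas, colors, t_vals):
    """Volume rendering quadrature: composite per-sample densities (sigmas)
    and RGB values (colors) at ray depths (t_vals) into a pixel color and
    an expected depth along one ray."""
    deltas = np.diff(t_vals, append=t_vals[-1] + 1e10)  # spacing between samples
    alphas = 1.0 - np.exp(-sigmas * deltas)             # opacity of each sample
    # Transmittance: probability the ray survives to reach each sample
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alphas[:-1])))
    weights = trans * alphas
    color = (weights[:, None] * colors).sum(axis=0)     # rendered RGB
    depth = (weights * t_vals).sum()                    # expected ray depth
    return color, depth, weights

# With an RGB-D sensor, the rendered depth can be penalized directly,
# e.g. adding (depth - measured_depth)**2 to the photometric loss.
```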

    Removing Adverse Volumetric Effects From Trained Neural Radiance Fields

    While the use of neural radiance fields (NeRFs) in different challenging settings has been explored, only very recently have there been any contributions that focus on the use of NeRF in foggy environments. We argue that traditional NeRF models are able to replicate scenes filled with fog and propose a method to remove the fog when synthesizing novel views. By calculating the global contrast of a scene, we can estimate a density threshold that, when applied, removes all visible fog. This makes it possible to use NeRF as a way of rendering clear views of objects of interest located in fog-filled environments. Additionally, to benchmark performance on such scenes, we introduce a new dataset that expands some of the original synthetic NeRF scenes through the addition of fog and natural environments. The code, dataset, and video results can be found on our project page: https://vegardskui.com/fognerf/ Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
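
    The paper estimates its threshold from the global contrast of the scene; the clearing step itself amounts to clamping low-density voxels in the trained field. A minimal sketch under that reading, with an invented fixed threshold standing in for the contrast-based estimate: homogeneous fog contributes a roughly uniform low density everywhere, while solid surfaces produce sharp high-density peaks, so zeroing densities below the threshold removes the fog.

```python
import numpy as np

def defog_density_grid(sigma_grid, threshold):
    """Zero out voxels whose density falls below the threshold, leaving
    only the high-density geometry of the scene."""
    cleared = sigma_grid.copy()
    cleared[cleared < threshold] = 0.0
    return cleared

# Hypothetical grid: faint fog everywhere plus one dense "object" region
rng = np.random.default_rng(0)
grid = rng.uniform(0.0, 0.5, size=(64, 64, 64))  # fog-like densities
grid[20:30, 20:30, 20:30] += 50.0                # solid geometry

defogged = defog_density_grid(grid, threshold=5.0)
```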

    Perceptions of Licensure: A Survey of Michigan Genetic Counselors

    This study by the Michigan Genetic Counselor Licensure Committee is the first known published documentation of genetic counselors’ beliefs and attitudes about licensure. The response rate from genetic counselors in Michigan was 66% (41/62). Ninety‐five percent of respondents were supportive of licensure. Respondents believed licensure would legitimize genetic counseling as a distinct allied healthcare profession (97.5%), increase the public’s protection (75%), and allow genetic counselors to practice independently (67%). While 45% felt licensure would increase counselor involvement in lawsuits, this did not impact licensure support (p = 0.744). Opinions were split regarding physician supervision and ordering tests. Even though 28% favored physician supervision, there was overwhelming support for genetic counselors performing some components of genetic testing (95%) and ordering some types of genetic tests (82%) independent of a physician. Use of this survey may be helpful in other states to assess genetic counselors’ interest in licensure and for drafting legislation. Peer Reviewed. https://deepblue.lib.umich.edu/bitstream/2027.42/147114/1/jgc40357.pdf
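
    The reported p = 0.744 suggests a standard test of independence between belief about lawsuit involvement and licensure support. A minimal sketch of such a test with invented counts (the study's actual contingency table and test choice are not given in the abstract; with cells this small a Fisher exact test may be preferable):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = believes licensure increases lawsuit
# involvement (yes/no); columns = supports licensure (yes/no).
# Counts are invented for illustration; only the analysis pattern matters.
table = np.array([[17, 1],
                  [21, 2]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")  # large p -> no detectable association
```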

    Hydrostatic pressure does not cause detectable changes to survival of human retinal ganglion cells

    Purpose: Elevated intraocular pressure (IOP) is a major risk factor for glaucoma. One consequence of raised IOP is that ocular tissues are subjected to increased hydrostatic pressure (HP). The effect of raised HP on stress pathway signalling and retinal ganglion cell (RGC) survival in the human retina was investigated. Methods: A chamber was designed to expose cells to increased HP (constant and fluctuating). Accurate pressure control (10-100 mmHg) was achieved using mass flow controllers. Human organotypic retinal cultures (HORCs) from donor eyes (<24 h post mortem) were cultured in serum-free DMEM/Ham F12. Increased HP was compared to simulated ischemia (oxygen-glucose deprivation, OGD). Cell death and apoptosis were measured by LDH and TUNEL assays, RGC marker expression by qRT-PCR (THY-1), and RGC number by immunohistochemistry (NeuN). Activated p38 and JNK were detected by Western blot. Results: Exposure of HORCs to constant (60 mmHg) or fluctuating (10-100 mmHg; 1 cycle/min) pressure for 24 or 48 h caused no loss of structural integrity, LDH release, decrease in RGC marker expression (THY-1) or loss of RGCs compared with controls. In addition, there was no increase in TUNEL-positive NeuN-labelled cells at either time point, indicating no increase in apoptosis of RGCs. OGD increased apoptosis, reduced RGC marker expression and RGC number, and caused elevated LDH release at 24 h. p38 and JNK phosphorylation remained unchanged in HORCs exposed to fluctuating pressure (10-100 mmHg; 1 cycle/min) for 15, 30, 60 and 90 min durations, whereas OGD (3 h) increased activation of p38 and JNK, which remained elevated for 90 min post-OGD. Conclusions: Directly applied HP had no detectable impact on RGC survival and stress signalling in HORCs. Simulated ischemia, however, activated stress pathways and caused RGC death. These results show that direct HP does not cause degeneration of RGCs in the ex vivo human retina.
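
    The abstract does not state the waveform used for the fluctuating condition; assuming a smooth sinusoidal sweep between the stated limits, the setpoint fed to the mass flow controllers might be generated as follows (a sketch under that assumption, not the authors' control code):

```python
import numpy as np

def pressure_setpoint(t_seconds, p_min=10.0, p_max=100.0, cycles_per_min=1.0):
    """Sinusoidal setpoint (mmHg) oscillating between p_min and p_max at one
    full cycle per minute, matching the 10-100 mmHg, 1 cycle/min protocol."""
    mid = 0.5 * (p_min + p_max)          # mean pressure, 55 mmHg
    amp = 0.5 * (p_max - p_min)          # oscillation amplitude, 45 mmHg
    freq_hz = cycles_per_min / 60.0
    return mid + amp * np.sin(2.0 * np.pi * freq_hz * t_seconds)

t = np.arange(0.0, 90 * 60, 1.0)   # 90-minute exposure at 1 s resolution
setpoints = pressure_setpoint(t)   # stream to the mass flow controllers
```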

    The Laser Astrometric Test of Relativity Mission

    This paper discusses a new fundamental physics experiment to test relativistic gravity to an accuracy better than the effects of the second order in the gravitational field strength. The Laser Astrometric Test Of Relativity (LATOR) mission uses laser interferometry between two micro-spacecraft whose lines of sight pass close by the Sun to accurately measure the deflection of light in the solar gravity field. The key element of the experimental design is a redundant-geometry optical truss provided by a long-baseline (100 m) multi-channel stellar optical interferometer placed on the International Space Station. The geometric redundancy enables LATOR to measure the departure from Euclidean geometry caused by the solar gravity field to very high accuracy. LATOR will not only improve the value of the parameterized post-Newtonian (PPN) parameter gamma to an unprecedented accuracy of 1 part in 10^8, it will also gain the ability to measure effects of the next post-Newtonian order (1/c^4) of light deflection resulting from gravity's intrinsic non-linearity. The solar quadrupole moment parameter, J2, will be measured with high precision, as well as a variety of other relativistic effects. LATOR will lead to very robust advances in the tests of fundamental physics: this mission could discover a violation or extension of general relativity, or reveal the presence of an additional long-range interaction in the physical law. There are no analogs to the LATOR experiment; it is unique and is a natural culmination of solar system gravity experiments. Comment: 8 pages, 2 figures, invited talk given at the Second International Conference on Particle and Fundamental Physics in Space (SpacePart'03), 10-12 December 2003, Washington, DC.
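
    The first-order observable behind the gamma measurement is the classic PPN light-deflection formula, theta = (1 + gamma)/2 * 4GM/(c^2 b), which for gamma = 1 gives about 1.75 arcsec at the solar limb. A minimal check of that number (the standard formula only, not LATOR's full observable model):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
R_SUN = 6.957e8     # solar radius, m
C = 2.998e8         # speed of light, m/s

def deflection_arcsec(impact_parameter_m, gamma=1.0):
    """First-order PPN light deflection: (1 + gamma)/2 * 4GM/(c^2 b).
    gamma = 1 recovers general relativity."""
    rad = (1.0 + gamma) / 2.0 * 4.0 * G * M_SUN / (C**2 * impact_parameter_m)
    return rad * 180.0 / math.pi * 3600.0  # radians -> arcseconds

print(f"{deflection_arcsec(R_SUN):.2f} arcsec at the solar limb")  # ~1.75
```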

    Space-based research in fundamental physics and quantum technologies

    Space-based experiments today can uniquely address important questions related to the fundamental laws of Nature. In particular, high-accuracy physics experiments in space can test relativistic gravity and probe physics beyond the Standard Model; they can perform direct detection of gravitational waves and are naturally suited for precision investigations in cosmology and astroparticle physics. In addition, atomic physics has recently shown substantial progress in the development of optical clocks and atom interferometers. If placed in space, these instruments could turn into powerful high-resolution quantum sensors, greatly benefiting fundamental physics. We discuss the current status of space-based research in fundamental physics, its discovery potential, and its importance for modern science. We offer a set of recommendations to be considered by the upcoming National Academy of Sciences' Decadal Survey in Astronomy and Astrophysics. In our opinion, the Decadal Survey should include space-based research in fundamental physics as one of its focus areas. We recommend establishing an Astronomy and Astrophysics Advisory Committee interagency "Fundamental Physics Task Force" to assess the status of both ground- and space-based efforts in the field, to identify the most important objectives, and to suggest the best ways to organize the work of the several federal agencies involved. We also recommend establishing a new NASA-led interagency program in fundamental physics that will consolidate new technologies, prepare key instruments for future space missions, and build a strong scientific and engineering community. Our goal is to expand NASA's science objectives in space by including "laboratory research in fundamental physics" as an element in the agency's ongoing space research efforts. Comment: A white paper, RevTeX, 27 pages, updated bibliography.

    LTC: a novel algorithm to improve the efficiency of contig assembly for physical mapping in complex genomes

    Background: Physical maps are the substrate of genome sequencing and map-based cloning, and their construction relies on the accurate assembly of BAC clones into large contigs that are then anchored to genetic maps with molecular markers. High Information Content Fingerprinting has become the method of choice for large and repetitive genomes such as those of maize, barley, and wheat. However, the high level of repeated DNA present in these genomes requires the application of very stringent criteria to ensure a reliable assembly with the FingerPrinted Contig (FPC) software, which often results in short contig lengths (of 3-5 clones before merging) as well as an unreliable assembly in some difficult regions. Difficulties can originate from a non-linear topological structure of clone overlaps, the low power of clone-ordering algorithms, and the absence of tools to identify sources of gaps in Minimal Tiling Paths (MTPs).

    Results: To address these problems, we propose a novel approach that: (i) reduces the rate of false connections and Q-clones by using a new cutoff calculation method; (ii) obtains reliable clusters robust to the exclusion of a single clone or clone overlap; (iii) explores the topological contig structure by considering contigs as networks of clones connected by significant overlaps; (iv) performs iterative clone clustering combined with ordering and order verification using re-sampling methods; and (v) uses global optimization methods for clone ordering and Band Map construction. The elements of this new analytical framework, called Linear Topological Contig (LTC), were applied to datasets used previously for the construction of the physical map of wheat chromosome 3B with FPC. The performance of LTC vs. FPC was also compared on simulated BAC libraries based on the known genome sequences of chromosome 1 of rice and chromosome 1 of maize.

    Conclusions: The results show that, compared to other methods, LTC enables the construction of highly reliable and longer contigs (5-12 clones before merging), the detection of "weak" connections in contigs and their "repair", and the elongation of contigs obtained by other assembly methods.
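
    LTC's core move is to treat a contig as a network of clones joined by significant overlaps and to require a linear topology: exactly two end clones and no branching. A minimal sketch of that representation with invented overlap scores and a hypothetical significance cutoff (the actual cutoff calculation and re-sampling verification in LTC are more involved):

```python
from collections import defaultdict

def build_overlap_graph(overlaps, cutoff):
    """Keep only clone-clone overlaps whose significance score passes the
    cutoff, giving LTC's view of a contig as a network of clones."""
    graph = defaultdict(set)
    for clone_a, clone_b, score in overlaps:
        if score >= cutoff:
            graph[clone_a].add(clone_b)
            graph[clone_b].add(clone_a)
    return graph

def is_linear(graph):
    """A topologically linear contig has exactly two end clones (degree 1)
    and no branching clone (degree > 2)."""
    degrees = [len(nbrs) for nbrs in graph.values()]
    return degrees.count(1) == 2 and max(degrees) <= 2

# Hypothetical fingerprint-overlap scores between BAC clones
overlaps = [("c1", "c2", 0.9), ("c2", "c3", 0.8),
            ("c3", "c4", 0.7), ("c1", "c4", 0.1)]
graph = build_overlap_graph(overlaps, cutoff=0.5)
print(is_linear(graph))  # True: c1-c2-c3-c4 forms a linear chain
```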