
    Blue-Green Coalitions: Fighting for Safe Workplaces and Healthy Communities

    [Excerpt] My goal in this book is to examine the formation of labor-environmental alliances that focus on health issues. Health concerns are increasingly a common ground on which blue-green coalitions are developing across the United States. Activists from both movements often see health issues through different lenses, which lends a particular slant to how they approach potential solutions for reducing exposures to toxics. The coalition framework emphasizes the fundamental link between occupational and environmental health, providing an internal cohesion and a politically persuasive agenda based on the centrality of health-related issues. By engaging labor and environmental activists in a common dialogue regarding the need for cooperative action to reduce the risks of community and workplace exposures, blue-green coalitions are creating new opportunities for progressive social change.

    Automatic Processing of High-Rate, High-Density Multibeam Echosounder Data

    Multibeam echosounders (MBES) are currently the best way to determine the bathymetry of large regions of the seabed with high accuracy. They are becoming the standard instrument for hydrographic surveying and are also used in geological studies, mineral exploration and scientific investigation of the earth's crustal deformations and life cycle. The significantly increased data density provided by an MBES offers clear advantages in accurately delineating the morphology of the seabed, but comes with the attendant disadvantage of having to handle and process a much greater volume of data. Current data processing approaches typically involve (computer aided) human inspection of all data, with time-consuming and subjective assessment of all data points. As data rates increase with each new generation of instrument and required turn-around times decrease, manual approaches become unwieldy and automatic methods of processing essential. We propose a new method for automatically processing MBES data that attempts to address concerns of efficiency, objectivity, robustness and accuracy. The method attributes each sounding with an estimate of vertical and horizontal error, and then uses a model of information propagation to transfer information about the depth from each sounding to its local neighborhood. Embedded in the survey area are estimation nodes that aim to determine the true depth at an absolutely defined location, along with its associated uncertainty. As soon as soundings are made available, the nodes independently assimilate propagated information to form depth hypotheses which are then tracked and updated on-line as more data is gathered. Consequently, we can extract at any time a “current-best” estimate for all nodes, plus co-located uncertainties and other metrics. The method can assimilate data from multiple surveys, multiple instruments or repeated passes of the same instrument in real-time as data is being gathered.
The data assimilation scheme is sufficiently robust to deal with typical survey echosounder errors. Robustness is improved by pre-conditioning the data, and allowing the depth model to be incrementally defined. A model monitoring scheme ensures that inconsistent data are maintained as separate but internally consistent depth hypotheses. A disambiguation of these competing hypotheses is only carried out when required by the user. The algorithm has a low memory footprint, runs faster than data can currently be gathered, and is suitable for real-time use. We call this algorithm CUBE (Combined Uncertainty and Bathymetry Estimator). We illustrate CUBE on two data sets gathered in shallow water with different instruments and for different purposes. We show that the algorithm is robust to even gross failure modes, and reliably processes the vast majority of the data. In both cases, we confirm that the estimates made by CUBE are statistically similar to those generated by hand.
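The hypothesis-tracking idea described above can be sketched in a few lines. The following is an illustrative simplification, not the published CUBE implementation: the 3-sigma acceptance gate, the inverse-variance weighted update, and the "most soundings wins" disambiguation rule are all assumptions made for the sketch, and sounding-to-node information propagation is omitted.

```python
import math

class EstimationNode:
    """Minimal sketch of CUBE-style hypothesis tracking at one estimation node.

    Each hypothesis is [mean_depth, variance, n_soundings]. A sounding that is
    statistically consistent with an existing hypothesis refines it; an
    inconsistent sounding spawns a new, separately tracked hypothesis.
    """

    def __init__(self, gate=3.0):
        self.gate = gate          # acceptance gate, in standard deviations
        self.hypotheses = []

    def assimilate(self, depth, variance):
        for h in self.hypotheses:
            # accept if within `gate` sigma of the hypothesis, measured
            # against the combined uncertainty of hypothesis and sounding
            if abs(depth - h[0]) <= self.gate * math.sqrt(h[1] + variance):
                w = 1.0 / h[1] + 1.0 / variance
                h[0] = (h[0] / h[1] + depth / variance) / w  # weighted mean
                h[1] = 1.0 / w                               # shrinking variance
                h[2] += 1
                return
        # inconsistent with every hypothesis: keep it separate but tracked
        self.hypotheses.append([depth, variance, 1])

    def best_estimate(self):
        # disambiguate only when asked: most-supported hypothesis wins
        best = max(self.hypotheses, key=lambda h: h[2])
        return best[0], math.sqrt(best[1])

node = EstimationNode()
for d in (25.1, 24.9, 25.0, 25.2):    # consistent soundings near 25 m
    node.assimilate(d, variance=0.04)
node.assimilate(40.0, variance=0.04)  # gross blunder -> separate hypothesis
depth, sigma = node.best_estimate()   # blunder does not contaminate the estimate
```

Because the blunder is held as its own hypothesis rather than averaged in, the node's current-best estimate stays near 25 m with a small co-located uncertainty.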

    Error Estimation of Bathymetric Grid Models Derived from Historic and Contemporary Data Sets

    The past century has seen remarkable advances in technologies associated with positioning and the measurement of depth. Lead lines have given way to single beam echo sounders, which in turn are being replaced by multibeam sonars and other means of remotely and rapidly collecting dense bathymetric datasets. Sextants were replaced by radio navigation, then transit satellite, GPS and now differential GPS. With each new advance comes tremendous improvement in the accuracy and resolution of the data we collect. Given these changes and given the vastness of the ocean areas we must map, the charts we produce are mainly compilations of multiple data sets collected over many years and representing a range of technologies. Yet despite our knowledge that the accuracy of the various technologies differs, our compilations have traditionally treated each sounding with equal weight. We address these issues in the context of generating regularly spaced grids containing bathymetric values. Gridded products are required for a number of earth sciences studies and for generating the grid we are often forced to use a complex interpolation scheme due to the sparseness and irregularity of the input data points. Consequently, we are faced with the difficult task of assessing the confidence that we can assign to the final grid product, a task that is not usually addressed in most bathymetric compilations. Traditionally the hydrographic community has considered each sounding equally accurate and there has been no error evaluation of the bathymetric end product. This has important implications for use of the gridded bathymetry, especially when it is used for generating further scientific interpretations. In this paper we approach the problem of assessing the confidence of the final bathymetry gridded product via a direct-simulation Monte Carlo method. 
We start with a small subset of data from the International Bathymetric Chart of the Arctic Ocean (IBCAO) grid model [Jakobsson et al., 2000]. This grid is compiled from a mixture of data sources ranging from single beam soundings with available metadata, to spot soundings with no available metadata, to digitized contours; the test dataset shows examples of all of these types. From this database, we assign a priori error variances based on available metadata, and when this is not available, based on a worst-case scenario in an essentially heuristic manner. We then generate a number of synthetic datasets by randomly perturbing the base data using normally distributed random variates, scaled according to the predicted error model. These datasets are next re-gridded using the same methodology as the original product, generating a set of plausible grid models of the regional bathymetry that we can use for standard deviation estimates. Finally, we repeat the entire random estimation process and analyze each run’s standard deviation grids in order to examine sampling bias and standard error in the predictions. The final products of the estimation are a collection of standard deviation grids, which we combine with the source data density in order to create a grid that contains information about the bathymetric model’s reliability.
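The simulation recipe above (perturb each sounding by its predicted error, re-grid, measure the spread across runs) reduces to a short loop. In this illustrative sketch, simple cell-mean binning stands in for the actual IBCAO interpolation scheme, and `sigma_of` stands in for the metadata-derived a priori error model; both are assumptions for the example, not the authors' implementation.

```python
import random
import statistics

def grid_soundings(soundings, cell_size=1.0):
    """Bin (x, y, depth) soundings into a regular grid by cell mean -- a
    stand-in for the real interpolation used to build the grid product."""
    cells = {}
    for x, y, z in soundings:
        key = (int(x // cell_size), int(y // cell_size))
        cells.setdefault(key, []).append(z)
    return {k: sum(v) / len(v) for k, v in cells.items()}

def monte_carlo_grid_error(soundings, sigma_of, n_runs=200, seed=0):
    """Direct-simulation Monte Carlo over the gridding step.

    Each run perturbs every sounding with a normally distributed variate
    scaled by its predicted 1-sigma error (`sigma_of`), re-grids, and
    records the resulting depth per cell; the spread across runs yields a
    per-cell standard deviation grid.
    """
    rng = random.Random(seed)
    realizations = {}
    for _ in range(n_runs):
        perturbed = [(x, y, z + rng.gauss(0.0, sigma_of((x, y, z))))
                     for x, y, z in soundings]
        for key, depth in grid_soundings(perturbed).items():
            realizations.setdefault(key, []).append(depth)
    return {k: statistics.stdev(v) for k, v in realizations.items()}

# four soundings falling in one 1 m x 1 m cell, each with 1 m predicted error:
# averaging four perturbed depths should leave roughly 0.5 m of spread
soundings = [(0.2, 0.3, 10.0), (0.6, 0.1, 10.2), (0.4, 0.8, 9.8), (0.9, 0.9, 10.1)]
cell_std = monte_carlo_grid_error(soundings, lambda s: 1.0)
```

Combining such a standard deviation grid with source data density, as the abstract describes, then gives a per-cell reliability estimate for the final bathymetric model.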

    Expression of the CD6 T lymphocyte differentiation antigen in normal human brain

    Antigens shared by the immune and central nervous systems (CNS) have been described repeatedly. The present study reports the expression of the CD6 lymphocyte differentiation antigen in normal human brain evidenced by immunohistochemistry and Northern blot analysis. A panel of various anti-CD6 monoclonal antibodies (mabs) tested on serial cryostat sections identified CD6-positive cells randomly scattered in parenchyma of all examined brain areas. Northern blot analysis with a highly sensitive cRNA probe revealed a 3.1 kb CD6-specific mRNA in various brain regions, especially in basal ganglia and cerebellar cortex. Staining with mabs raised against different hematopoietic cell types, as well as hybridization with probes specific for the β- and γ-T cell receptor (TCR) chains, support the notion that CD6 is expressed by cells of brain origin. The nature of the CD6-positive cell type and possible functions of shared antigens in immune and nervous systems are discussed.

    Seafloor mapping in the Arctic: support for a potential U.S. extended continental shelf

    For the United States, the greatest opportunity for an extended continental shelf under UNCLOS is in the ice-covered regions of the Arctic north of Alaska. Since 2003, CCOM/JHC has been using the icebreaker Healy equipped with a multibeam echosounder, chirp subbottom profiler, and dredges, to map and sample the region of Chukchi Borderland and Alpha-Mendeleev Ridge complex. These data have led to the discovery of several new features, have radically changed our view of the bathymetry and geologic history of the area, and may have important ramifications for the determination of the limits of a U.S. extended continental shelf under Article 76.

    Providing the Third Dimension: High-resolution Multibeam Sonar as a Tool for Archaeological Investigations - An Example from the D-day Beaches of Normandy

    In general, marine archaeological investigations begin in the archives, using historic maps, coast surveys, and other materials, to define submerged areas suspected to contain potentially significant historical sites. Following this research phase, a typical archaeological survey uses sidescan sonar and marine magnetometers as initial search tools. Targets are then examined through direct observation by divers, video, or photographs. Magnetometers can demonstrate the presence, absence, and relative susceptibility of ferrous objects but provide little indication of the nature of the target. Sidescan sonar can present a clear image of the overall nature of a target and its surrounding environment, but the sidescan image is often distorted and contains little information about the true 3-D shape of the object. Optical techniques allow precise identification of objects but suffer from very limited range, even in the best of situations. Modern high-resolution multibeam sonar offers an opportunity to cover a relatively large area from a safe distance above the target, while resolving the true three-dimensional (3-D) shape of the object with centimeter-level resolution. A clear demonstration of the applicability of high-resolution multibeam sonar to wreck and artifact investigations occurred this summer when the Naval Historical Center (NHC), the Center for Coastal and Ocean Mapping (CCOM) at the University of New Hampshire, and Reson Inc., collaborated to explore the state of preservation and impact on the surrounding environment of a series of wrecks located off the coast of Normandy, France, adjacent to the American landing sectors. The survey augmented previously collected magnetometer and high-resolution sidescan sonar data using a Reson 8125 high-resolution focused multibeam sonar with 240, 0.5° (at nadir) beams distributed over a 120° swath.
The team investigated 21 areas in water depths ranging from about 3 to 30 meters (m); some areas contained individual targets such as landing craft, barges, a destroyer, troop carrier, etc., while others contained multiple smaller targets such as tanks and trucks. Of particular interest were the well-preserved caissons and blockships of the artificial Mulberry Harbor deployed off Omaha Beach. The near-field beam-forming capability of the Reson 8125 combined with 3-D visualization techniques provided an unprecedented level of detail including the ability to recognize individual components of the wrecks (ramps, gun turrets, hatches, etc.), the state of preservation of the wrecks, and the impact of the wrecks on the surrounding seafloor.

    Proposition 3: The Children’s Hospital Bond Act of 2008


    Robust Automatic Multi-beam Bathymetric Processing


    Monetizing Athlete Brand Image: An Investigation of Athlete Managers’ Perspectives

    In a highly competitive sport marketplace, personal branding is a top priority for athletes. Thus, marketers should leverage athletes’ talents and influence in creative ways to maximize their earning potential. This research explored the attributes of a marketable athlete, as well as promotional strategies to help secure athlete sponsorships. Semi-structured interviews were conducted with purposefully selected talent marketing practitioners with sport marketing agencies. The findings revealed relatable story, as well as perceived persona, as prevalent themes for a marketable athlete. Additionally, the themes of athlete-brand alignment and social media marketing were important to securing client promotion and sponsorships. These findings extended previous conceptualizations by illuminating the essential role of brand authenticity not only in quality of fan-athlete interaction, but in appeal to prospective sponsors.