
    Panel Discussion - Management of Eurasian watermilfoil in the United States using native insects: State regulatory and management issues

    While researchers have evaluated the potential of native insect herbivores to manage nonindigenous aquatic plant species such as Eurasian watermilfoil (Myriophyllum spicatum L.), the practical matters of regulatory compliance and implementation have been neglected. A panel of aquatic nuisance species program managers from three state natural resource management agencies (Minnesota, Vermont and Washington) discussed their regulatory and policy concerns. In addition, an ecological consultant attempting to market one of the native insects added his perspective on the special challenges of distributing a native biological control agent for management of Eurasian watermilfoil.

    Fabrication and Characterization of Electrospun Poly(acrylonitrile-co-Methyl Acrylate)/Lignin Nanofibers: Effects of Lignin Type and Total Polymer Concentration

    Lignin macromolecules are potential precursor materials for producing electrospun nanofibers for composite applications. However, little is known about the effect of lignin type and blend ratios with synthetic polymers. This study analyzed blends of poly(acrylonitrile-co-methyl acrylate) (PAN-MA) with two types of commercially available lignin, low sulfonate (LSL) and alkali (kraft) lignin (AL), in DMF solvent. The electrospinning and polymer blend solution conditions were optimized to produce thermally stable, smooth lignin-based nanofibers with total polymer content of up to 20 wt % in solution and a 50/50 blend weight ratio. Microscopy studies revealed that AL blends possess good solubility, miscibility, and dispersibility compared to LSL blends. Regardless of lignin content or type, rheological studies demonstrated that the PAN-MA concentration in solution dictated the blend’s viscosity. Smooth electrospun nanofibers were fabricated using AL, depending upon the total polymer content and blend ratio. AL’s addition to PAN-MA did not affect the glass transition or degradation temperatures of the nanofibers compared to neat PAN-MA. We confirmed the presence of each lignin type within the PAN-MA nanofibers through infrared spectroscopy. PAN-MA/AL nanofibers possessed morphological and thermal properties similar to those of PAN-MA; thus, these lignin-based nanofibers can replace PAN in future applications, including the production of carbon fibers and supercapacitors.

    Global optimization of data quality checks on 2‐D and 3‐D networks of GPR cross‐well tomographic data for automatic correction of unknown well deviations

    Significant errors related to poor time-zero estimation, well deviation, or mislocation of the transmitter (TX) and receiver (RX) stations can render even the most sophisticated modeling and inversion routine useless. Previous examples of methods for the analysis and correction of data errors in geophysical tomography include the works of Maurer and Green (1997), Squires et al. (1992) and Peterson (2001); here we follow the analysis and techniques of Peterson (2001) for data quality control and error correction. Through our data acquisition and quality control procedures we have very accurate control on the surface locations of the wells, the travel distance of both the transmitter and receiver within the boreholes, and the change in apparent zero time. However, we often have poor control on well deviations, either because of economic constraints or because the nature of the borehole itself prevented the acquisition of deviation logs, and deviation logs themselves can contain significant errors.
    Problems with borehole deviations can be diagnosed prior to inversion of travel-time tomography data sets by plotting the apparent velocity of a straight ray connecting a transmitter to a receiver against the take-off angle of the ray. Issues with the time-zero pick or with the distances between wells appear as symmetric smiles or frowns in these QC plots, whereas well deviation or strong dipping anisotropy produces an asymmetric correlation between apparent velocity and take-off angle (Figure 1-B). When a network of interconnected GPR tomography data is available, one has the additional quality constraint of ensuring continuity in velocity between immediately adjacent tomograms; a sudden shift in the mean velocity indicates that either position deviations are present or there is a shift in the pick times. Small errors in well geometry may be treated during inversion by including weighting, or relaxation, parameters (e.g. Bautu et al., 2006). In algebraic reconstruction tomography (ART), which is used herein for the travel-time inversion (Peterson et al., 1985), a small relaxation parameter will smooth imaging artifacts caused by data errors, at the expense of resolution and contrast (Figure 2). However, large data errors such as unaccounted well deviations cannot be adequately suppressed through inversion weighting schemes, and in large data sets and/or networks of data sets the traditional manual, trial-and-error adjustment of well geometries becomes increasingly difficult and ineffective.
    Our approach is to use cross-well data quality checks and a simplified model of borehole deviation with particle swarm optimization (PSO) to automatically correct the source and receiver locations prior to tomographic inversion. We present a simple model of well deviation designed to minimize potential corruption of actual data trends, together with quantitative quality control measures based on minimizing the correlation between take-off angle and apparent velocity and a check on the continuity of velocity between adjacent wells. The methodology is shown to be accurate and robust for simple 2-D synthetic test cases, and we also demonstrate it on field data, where it is compared against deviation logs. This study shows the promise of automatic correction of well deviations in GPR tomographic data: very precise estimates of well deviation can be made for small deviations in the synthetic tests, even in the presence of static data errors. However, both the synthetic analysis and the application of the method to a large network of field data show that the technique is sensitive to data errors that vary between neighboring tomograms.
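    The straight-ray QC diagnostic described above is simple to compute. Below is a minimal sketch, assuming nominal (undeviated) TX/RX station coordinates and picked first-arrival times; the function name, array layout, and the use of a Pearson correlation as the quantitative QC measure are illustrative assumptions rather than the authors' implementation.

    ```python
    # Minimal sketch of the straight-ray QC check: apparent velocity vs. take-off
    # angle, with their correlation used as a quantitative quality-control metric.
    # Geometry conventions and names here are assumptions, not the authors' code.
    import numpy as np

    def straight_ray_qc(tx_xyz, rx_xyz, travel_time):
        """tx_xyz, rx_xyz: (N, 3) station coordinates; travel_time: (N,) picked times.

        Returns take-off angles (degrees from horizontal), apparent straight-ray
        velocities, and their Pearson correlation. A correlation near zero is
        consistent with adequate well geometry; a strong, asymmetric trend suggests
        unaccounted well deviation or dipping anisotropy.
        """
        d = rx_xyz - tx_xyz
        horiz = np.hypot(d[:, 0], d[:, 1])              # horizontal offset TX -> RX
        length = np.linalg.norm(d, axis=1)              # straight-ray path length
        angle = np.degrees(np.arctan2(d[:, 2], horiz))  # take-off angle of the ray
        v_app = length / travel_time                    # apparent straight-ray velocity
        corr = np.corrcoef(angle, v_app)[0, 1]
        return angle, v_app, corr
    ```

    In the automatic-correction scheme, the parameters of the simplified deviation model could then be adjusted (for example by PSO) to drive this correlation, together with the velocity jumps between adjacent tomograms, toward zero before the ART inversion is run.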

    Scaling laws governing stochastic growth and division of single bacterial cells

    Uncovering the quantitative laws that govern the growth and division of single cells remains a major challenge. Using a unique combination of technologies that yields unprecedented statistical precision, we find that the sizes of individual Caulobacter crescentus cells increase exponentially in time. We also establish that they divide upon reaching a critical multiple (≈1.8) of their initial sizes, rather than an absolute size. We show that when the temperature is varied, the growth and division timescales scale proportionally with each other over the physiological temperature range. Strikingly, the cell-size and division-time distributions can both be rescaled by their mean values such that the condition-specific distributions collapse to universal curves. We account for these observations with a minimal stochastic model that is based on an autocatalytic cycle. It predicts the scalings, as well as specific functional forms for the universal curves. Our experimental and theoretical analysis reveals a simple physical principle governing these complex biological processes: a single temperature-dependent scale of cellular time governs the stochastic dynamics of growth and division in balanced growth conditions.
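    A minimal simulation sketch (not the authors' model) of the growth and division rule summarized above: single-cell size grows exponentially, division occurs at a fixed multiple of the initial size, and rescaling division times by their mean removes the temperature-dependent timescale. The gamma-distributed growth rate and the specific rate values are illustrative assumptions.

    ```python
    # Exponential growth s(t) = s0 * exp(k t) with division at s = alpha * s0 gives
    # a division time tau = ln(alpha) / k; rescaling tau by its mean collapses the
    # distributions obtained at different (temperature-dependent) mean growth rates.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha = 1.8                      # critical multiple of initial size at division

    def division_times(mean_rate, n_cells=50_000, shape=20.0):
        """Division times for per-cell growth rates k drawn from a gamma distribution."""
        k = rng.gamma(shape, mean_rate / shape, n_cells)
        return np.log(alpha) / k

    for mean_rate in (0.5, 1.0, 2.0):                    # stand-ins for different temperatures
        tau = division_times(mean_rate)
        rescaled = tau / tau.mean()
        print(f"mean rate {mean_rate}: <tau> = {tau.mean():.2f}, "
              f"std of tau/<tau> = {rescaled.std():.3f}")  # same across all rates
    ```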

    Making Classical Ground State Spin Computing Fault-Tolerant

    We examine a model of classical deterministic computing in which the ground state of the classical system is a spatial history of the computation. This model is relevant to quantum dot cellular automata as well as to recent universal adiabatic quantum computing constructions. In its most primitive form, systems constructed in this model cannot compute in an error-free manner when working at non-zero temperature. However, by exploiting a mapping between the partition function for this model and probabilistic classical circuits, we are able to show that it is possible to make this model effectively error-free. We achieve this by using techniques from fault-tolerant classical computing, and the result is that the system can compute effectively error-free if the temperature is below a critical temperature. We further link this model to computational complexity and show that a certain problem concerning finite temperature classical spin systems is complete for the complexity class Merlin-Arthur. This provides an interesting connection between the physical behavior of certain many-body spin systems and computational complexity.

    The components of career capital and how they are acquired by knowledge workers across different industries

    The literature shows that the way in which knowledge workers manage their careers in the global economy has changed fundamentally in the last twenty years. Career capital is a tradable commodity between and within organisations which impacts both human resource managers and knowledge workers. There is insufficient empirical evidence of the components of career capital and how these are acquired, and there has been a dearth of investigation as to whether career capital is managed differently in different industries. The research was conducted in two phases. The first, qualitative phase, via 21 in-depth interviews, identified 27 components of career capital and 23 methods of career capital accrual. In the second, quantitative phase, data were collected using those constructs from 200 knowledge workers in four sectors: the public service sector and the manufacturing, financial, and high-tech research and development industries. The research determined the most important career capital components and methods of their accrual, and showed these to differ greatly between the four employment sectors. The data raise questions with regard to two important themes in the career literature.

    Supporting Practices to Adopt Registry-Based Care (SPARC): protocol for a randomized controlled trial

    Background: Diabetes is predicted to increase in incidence by 42% from 1995 to 2025. Although most adults with diabetes seek care from primary care practices, adherence to treatment guidelines in these settings is not optimal. Many practices lack the infrastructure to monitor patient adherence to recommended treatment and are slow to implement changes critical for effective management of patients with chronic conditions. Supporting Practices to Adopt Registry-Based Care (SPARC) will evaluate effectiveness and sustainability of a low-cost intervention designed to support work process change in primary care practices and enhance focus on population-based care through implementation of a diabetes registry. Methods: SPARC is a two-armed randomized controlled trial (RCT) of 30 primary care practices in the Virginia Ambulatory Care Outcomes Research Network (ACORN). Participating practices (including control groups) will be introduced to population health concepts and tools for work process redesign and registry adoption at a meeting of practice-level implementation champions. Practices randomized to the intervention will be assigned study peer mentors, receive a list of specific milestones, and have access to a physician informaticist. Peer mentors are clinicians who successfully implemented registries in their practices and will help champions in the intervention practices throughout the implementation process. During the first year, peer mentors will contact intervention practices monthly and visit them quarterly. Control group practices will not receive support or guidance for registry implementation. We will use a mixed-methods explanatory sequential design to guide collection of medical record, participant observation, and semistructured interview data in control and intervention practices at baseline, 12 months, and 24 months. We will use grounded theory and a template-guided approach using the Consolidated Framework for Implementation Research to analyze qualitative data on contextual factors related to registry adoption. We will assess intervention effectiveness by comparing changes in patient-level hemoglobin A1c scores from baseline to year 1 between intervention and control practices. Discussion: Findings will enhance our understanding of how to leverage existing practice resources to improve diabetes care in primary care practices by implementing and using a registry. SPARC has the potential to validate the effectiveness of low-cost implementation strategies that target practice change in primary care.
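    As a rough illustration of the stated effectiveness comparison, the sketch below computes the baseline-to-year-1 change in hemoglobin A1c for each patient and compares it between arms. The column names are hypothetical, Welch's t-test is an illustrative choice rather than the protocol's specified analysis, and a full analysis of a practice-randomized trial would also account for clustering of patients within practices.

    ```python
    # Hypothetical sketch of the patient-level A1c comparison; not the study's code.
    import pandas as pd
    from scipy import stats

    def compare_a1c_change(df: pd.DataFrame):
        """Assumed columns: 'arm' in {'intervention', 'control'},
        'a1c_baseline' and 'a1c_year1' as patient-level measurements."""
        df = df.assign(change=df["a1c_year1"] - df["a1c_baseline"])
        summary = df.groupby("arm")["change"].agg(["mean", "std", "count"])
        t, p = stats.ttest_ind(
            df.loc[df["arm"] == "intervention", "change"],
            df.loc[df["arm"] == "control", "change"],
            equal_var=False,                  # Welch's t-test
        )
        return summary, t, p
    ```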
