
    Upper Missouri Waterkeeper v. EPA

    State water quality standards developed under the Clean Water Act play a key role in curtailing the negative environmental, economic, and human health impacts of water pollution. Under the state water quality regulatory framework, EPA may grant variances to state standards should the state demonstrate that compliance with its standards is infeasible for a certain pollutant discharger or waterbody. Montana DEQ developed a variance for nutrients based on evidence that compliance with those standards would cause economic harm. EPA approved Montana's nutrient pollutant variance, and Upper Missouri Waterkeeper challenged EPA's approval on the grounds that the variance violates the Clean Water Act. The Ninth Circuit held that (1) EPA may consider the cost of implementing pollution control technology to attain compliance with state standards when approving variance requests, and (2) EPA properly interpreted its regulations as requiring compliance with the variance standard only at the end of the variance term. This note will explore how the decision may incentivize states to engage in a water quality race to the bottom, sacrificing improvements in the name of cost, failing to protect the health of our nation's waters, and further exposing low-income communities to degraded resources.

    The Beginning of Professional Music in Albania

    The instrumental miniature in Albanian music is tied to the beginnings of professional music composition. The instrumental miniature in general, and the miniature for piano in particular, has a rich history compared to other genres, starting with Martin Gjoka's first composition for piano, "March in D major". The development of small forms at the time, despite their simple musical language, shows the effort to create an Albanian national music. After the Second World War, the development of music gained new momentum, and during this period a large number of works called miniatures were created. The priority given to miniatures in general, and to the miniature for piano in particular, has several causes, but the main ones are the creation of a music education system, which brought about the opening of the piano class, and the need to develop a national pedagogical and concert repertoire.

    Bridging Single-Particle Characterisation Gaps of Optical Microscopy in the Nano-Submicron Regime

    As the practical importance of particles in the nano-submicron size regime continues to increase in both biomedical applications and industrial processes, so does the need for accurate and versatile characterisation methods. Optical scattering microscopy methods are commonly used for single-particle characterisation, as they provide quick measurements at physiologically relevant conditions with detection limits reaching down to individual biomolecules. However, quantitative particle characterisation using optical microscopy often relies on assumptions about the surrounding medium and the particle, including solution viscosity, boundary conditions, and particle shape and material. Since these assumptions are difficult to evaluate, particle characterisation beyond hydrodynamic radius and/or mass remains challenging. The aim of this thesis is to help bridge the gaps that limit quantitative optical microscopy-based characterisation of individual particles in the nano-submicron regime, both by developing new microscopy methods and by improving existing ones. Specifically, in Paper I a method was developed to evaluate the relation between diffusivity and particle size, enabling measurements of the hydrodynamic boundary condition. Papers II-V are based on the development of holographic nanoparticle tracking (H-NTA) and extensions thereof, with the intent of using the complex-valued optical field for material-sensitive particle characterisation with minimal dependence on the surrounding medium. In Paper II, H-NTA by itself was used to characterise suspensions containing nanobubbles and molecular aggregates. In Paper III, the combination of H-NTA with deep learning was used to achieve simultaneous quantification of size and refractive index directly from single microscopy images, which allowed detection of reversible fluctuations in nanoparticle aggregates.
    In Paper IV, H-NTA augmented with a low-frequency attenuation filter, coined twilight holography, was used to investigate the interaction between herpes viruses and functionalised gold nanoparticles in terms of size, bound gold mass, and virus refractive index. In Paper V, the combination of twilight holography and interferometric scattering microscopy (iSCAT) was used to quantify both size and polarizability of individual nanoparticles without the need for detailed knowledge of the surrounding medium. Taken together, the results presented in this thesis provide new insights into heterogeneous nanoparticle systems and contribute to narrowing the gap for detailed optical particle characterisation.
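The relation between diffusivity, particle size, and the hydrodynamic boundary condition evaluated in Paper I can be illustrated numerically. The following is a minimal sketch, not the thesis's actual method: the Stokes-Einstein relation with a standard partial-slip correction to the Stokes drag, where the function name and the default temperature and water viscosity are assumptions made for the example.

```python
import math

# Boltzmann constant (J/K)
K_B = 1.380649e-23

def stokes_einstein_diffusivity(radius_m, temperature_k=298.15,
                                viscosity_pa_s=8.9e-4, slip_length_m=0.0):
    """Diffusion coefficient (m^2/s) of a sphere from the Stokes-Einstein
    relation, D = k_B * T / friction, with a partial-slip drag correction:
    friction = 6*pi*eta*R * (1 + 2b/R) / (1 + 3b/R), where b is the slip
    length. b = 0 recovers the no-slip (stick) case, 6*pi*eta*R, and
    b -> infinity gives the perfect-slip limit, 4*pi*eta*R."""
    r, b = radius_m, slip_length_m
    friction = (6 * math.pi * viscosity_pa_s * r
                * (1 + 2 * b / r) / (1 + 3 * b / r))
    return K_B * temperature_k / friction
```

For a R = 150 nm sphere in water at room temperature, the no-slip case gives D on the order of 1.6e-12 m^2/s; a finite slip length increases D, which is why an independent size measurement lets the boundary condition be used as a fitting parameter.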

    Optical characterisation of subwavelength dielectric particles using particle tracking beyond the Stokes-Einstein relation

    As the importance of nanoparticles continues to increase in both biology and industrial processes, so does the need for accurate and versatile characterisation methods. However, most light-based methods for quantifying the size and refractive index of individual particles are limited to snapshot observations, to particles larger than the wavelength of light, or to non-dynamic particle properties, or they assume the hydrodynamic boundary condition without experimental evaluation. The aim of this thesis is to partially overcome these limitations by further developing two characterisation methods based on optical microscopy combined with particle tracking, where the analysis goes beyond the ordinary Stokes-Einstein relation. The first method combines off-axis holographic nanoparticle tracking with deep learning (Paper I). By utilizing the optical signal, both size and refractive index of individual particles down to R = 150 nm were accurately determined using only five particle observations. The method was evaluated using particles of different sizes, refractive indices, and surrounding media, as well as for polystyrene nanoparticle clusters, for which reversible fluctuations in the number of monomers could be resolved while the fractal dimension remained constant. The second method is based on particles tethered to a laterally fluid supported lipid bilayer and quantification of their diffusivity and flow-induced motion (Paper II). By separating the friction contributions of the tethers and the particle, simultaneous measurement of size and diffusivity enabled a comparison with theory using partial slip as a fitting parameter. This was used to quantify the slip length for different lipid vesicles, and to clarify the size-dependent mechanistic aspects of the mobility of membrane-attached nanoparticles.
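Both methods build on extracting a diffusion coefficient from a tracked trajectory before going beyond the Stokes-Einstein relation. A minimal sketch of that first step, using the lag-1 mean squared displacement of a two-dimensional trajectory (the function name and the simulated Brownian trajectory are assumptions for illustration, not the papers' pipeline):

```python
import random
import statistics

def estimate_diffusivity(positions, dt):
    """Estimate D (length^2/time) from a 2-D trajectory via the lag-1
    mean squared displacement: MSD(dt) = 4*D*dt in two dimensions."""
    sq_disp = [(x1 - x0) ** 2 + (y1 - y0) ** 2
               for (x0, y0), (x1, y1) in zip(positions, positions[1:])]
    return statistics.mean(sq_disp) / (4 * dt)

# Simulate Brownian motion with a known D to check the estimator.
rng = random.Random(0)
D_true, dt, n = 2.0, 0.05, 20000
sigma = (2 * D_true * dt) ** 0.5  # per-axis step standard deviation
x = y = 0.0
traj = [(x, y)]
for _ in range(n):
    x += rng.gauss(0, sigma)
    y += rng.gauss(0, sigma)
    traj.append((x, y))
D_hat = estimate_diffusivity(traj, dt)  # close to D_true
```

Converting D into a hydrodynamic radius then requires the very assumptions (viscosity, boundary condition) that the thesis seeks to evaluate experimentally.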

    Estimating and testing multiple structural changes in models with endogenous regressors

    We consider the problem of estimating and testing for multiple breaks in a single-equation framework with regressors that are endogenous, i.e., correlated with the errors. First, we show, under standard assumptions about the regressors, instruments, and errors, that the second-stage regression of the instrumental variable (IV) procedure involves regressors and errors that satisfy all the assumptions in Perron and Qu (2006), so that the results about consistency, rate of convergence, and limit distributions of the estimates of the break dates, as well as the limit distributions of the tests, follow as simple consequences. More importantly from a practical perspective, we show that even in the presence of endogenous regressors, it is still preferable to simply estimate the break dates and test for structural change using the usual ordinary least squares (OLS) framework. It delivers estimates of the break dates with higher precision and tests with higher power compared to those obtained using an IV method. To illustrate the relevance of our theoretical results, we consider the stability of the New Keynesian hybrid Phillips curve. IV-based methods do not indicate any instability. On the other hand, OLS-based ones strongly indicate a change in 1991:1, after which the model loses all explanatory power.

    Using OLS to estimate and test for structural changes in models with endogenous regressors

    We consider the problem of estimating and testing for multiple breaks in a single-equation framework with regressors that are endogenous, i.e., correlated with the errors. We show that even in the presence of endogenous regressors it is still preferable, in most cases, to simply estimate the break dates and test for structural change using the usual ordinary least squares (OLS) framework. Except for some knife-edge cases, it delivers estimates of the break dates with higher precision and tests with higher power compared to those obtained using an instrumental variable (IV) method. Also, the OLS method avoids potential weak identification problems caused by weak instruments. To illustrate the relevance of our theoretical results, we consider the stability of the New Keynesian hybrid Phillips curve. IV-based methods only provide weak evidence of instability. On the other hand, OLS-based ones strongly indicate a change in 1991:Q1, after which the model loses all explanatory power.
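The OLS break-date estimation these abstracts recommend can be sketched for the single-break case as a grid search over candidate break dates that minimises the combined sum of squared residuals of the two sub-sample regressions. This is a simplified illustration in the spirit of Bai-Perron segmentation, not the authors' code; the function name, trimming fraction, and synthetic data are assumptions.

```python
import numpy as np

def estimate_break_date(y, x, trim=0.15):
    """Estimate a single break date in y = x @ beta + e by OLS: pick the
    candidate break T_b minimising the total sum of squared residuals of
    the pre-break and post-break regressions. Trimming keeps each regime
    away from the sample edges."""
    n = len(y)
    lo, hi = int(n * trim), int(n * (1 - trim))

    def ssr(yy, xx):
        beta, *_ = np.linalg.lstsq(xx, yy, rcond=None)
        resid = yy - xx @ beta
        return float((resid ** 2).sum())

    best_tb, best_ssr = None, np.inf
    for tb in range(lo, hi):
        total = ssr(y[:tb], x[:tb]) + ssr(y[tb:], x[tb:])
        if total < best_ssr:
            best_tb, best_ssr = tb, total
    return best_tb

# Synthetic check: the slope changes from 1 to 3 at observation 120 of 200.
rng = np.random.default_rng(1)
n, true_tb = 200, 120
x = rng.normal(size=(n, 1))
beta = np.where(np.arange(n) < true_tb, 1.0, 3.0)
y = beta * x[:, 0] + 0.3 * rng.normal(size=n)
tb_hat = estimate_break_date(y, x)  # close to true_tb
```

The papers' point is that this OLS objective can be used for the break dates even when x is endogenous; the multi-break case repeats the search recursively or by dynamic programming.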

    Reusable rocket engine intelligent control system framework design, phase 2

    Elements of an advanced functional framework for reusable rocket engine propulsion system control are presented for the Space Shuttle Main Engine (SSME) demonstration case. Functional elements of the baseline functional framework are defined in detail. The SSME failure modes are evaluated and specific failure modes identified for inclusion in the advanced functional framework diagnostic system. Active control of the SSME start transient is investigated, leading to the identification of a promising approach to mitigating start transient excursions. Key elements of the functional framework are simulated and demonstration cases are provided. Finally, the advanced functional framework for control of reusable rocket engines is presented.

    Large-scale linear regression: Development of high-performance routines

    In statistics, series of ordinary least squares (OLS) problems are used to study the linear correlation among sets of variables of interest; in many studies, the number of such variables is at least in the millions, and the corresponding datasets occupy terabytes of disk space. As the availability of large-scale datasets increases regularly, so does the challenge of dealing with them. Indeed, traditional solvers, which rely on "black-box" routines optimized for one single OLS, are highly inefficient and fail to provide a viable solution for big-data analyses. As a case study, in this paper we consider a linear regression consisting of two-dimensional grids of related OLS problems that arise in the context of genome-wide association analyses, and give a careful walkthrough of the development of ols-grid, a high-performance routine for shared-memory architectures; analogous steps are relevant for tailoring OLS solvers to other applications. In particular, we first illustrate the design of efficient algorithms that exploit the structure of the OLS problems and eliminate redundant computations; then, we show how to effectively deal with datasets that do not fit in main memory; finally, we discuss how to cast the computation in terms of efficient kernels and how to achieve scalability. Importantly, each design decision along the way is justified by simple performance models. ols-grid enables the solution of 10^11 correlated OLS problems operating on terabytes of data in a matter of hours.
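The core idea of eliminating redundant computation across related OLS problems can be illustrated in a few lines: when the problems share a design matrix, one factorisation serves every right-hand side. This is a simplified sketch under that assumption, not the actual ols-grid routine, which additionally handles the two-dimensional grid structure and out-of-core data.

```python
import numpy as np

def ols_grid(X, Y):
    """Solve min ||Y[:, j] - X @ b||_2 for every column of Y at once:
    form and Cholesky-factor X.T @ X a single time, then reuse the factor
    for all right-hand sides, instead of calling a black-box solver once
    per OLS problem."""
    XtX = X.T @ X
    XtY = X.T @ Y
    L = np.linalg.cholesky(XtX)      # XtX = L @ L.T, computed once
    Z = np.linalg.solve(L, XtY)      # forward substitution, all columns
    return np.linalg.solve(L.T, Z)   # back substitution -> coefficients

# 1000 related OLS problems sharing one 500 x 4 design matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
B_true = rng.normal(size=(4, 1000))
Y = X @ B_true + 0.01 * rng.normal(size=(500, 1000))
B_hat = ols_grid(X, Y)  # recovers B_true up to the noise level
```

The factorisation costs O(p^3) once rather than per problem, and the triangular solves become matrix-matrix operations, which is exactly the kind of restructuring into efficient kernels the paper describes.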