Efficacy Of A Structured Free Recall Intervention To Improve Rating Quality In Performance Evaluations
This experiment investigated the effects of rater training on halo errors and accuracy in performance evaluations. 408 participants were randomly assigned to one of three groups (n = 136 each) in which they were presented with a structured free recall intervention (SFRI), frame-of-reference training (FoRT), or no training. The purpose of this study was to further investigate the efficacy of SFRI against prominent training methods and against no training at all. Results were not significant and did not support previous findings in the literature. Further explanations are offered, and a discussion is presented as to why these results were obtained.
Bronco Ember: An Edge Computing Acceleration Platform with Computer Vision
Bronco Ember is a nascent wildfire detection system that leverages edge computing, multi-spectral imaging, and artificial intelligence to greatly increase the performance of small satellite remote sensing payloads. The core hardware onboard is a SWIR InGaAs camera imaging in the 900 nm to 1700 nm wavelength range and a GPU-enabled single-board computer. Artificial intelligence is used for fire detection and analysis, with computer vision and neural networks able to detect fires filling only a few pixels in each image. The system is based on traditional CNN architectures and includes time series analysis, giving it an 85% success rate in detecting wildfires of about 50 m diameter during a high-altitude balloon technology demonstration flight. The neural net is trained to monitor the movement and spread of the fire against prediction maps, which greatly reduces the number of false positives detected. The development of this payload has been supported through the NASA TechLeap Autonomous Observation Challenge No. 1, which has pushed the technology from concept to test flight in less than one calendar year. The system acts as a rapid-response remote sensing technology.
Individualized Learning Plans in Guiding Career-Technical Course-Taking and Achieving Post-High-School Employment Goals
Most adolescents and young adults in the U.S. seek employment after high school regardless of their education or work status, yet career readiness and work preparation have not received attention equal to college readiness and preparation at the secondary level. Using data from the High School Longitudinal Study of 2009 (HSLS:2009), we explored possible connections between individualized learning plans (ILPs) and both secondary Career-Technical Education (CTE) course taking and employment goal attainment in the U.S. Results showed that ILPs were positively associated with establishing employment goals, securing employment, and achieving employment goals after high school. Students who had employment goals were likely to earn more CTE credits and had higher probabilities of working after high school. However, ILPs did not moderate the relationship between employment goals and earned CTE credits, nor the relationship between employment goals and work activities. Findings reflect the neglect of integrated college and career readiness preparation and the underutilization of school-based career education resources. Keywords: individualized learning plan; career and technical education; employment; school-based career development; college and career readiness. DOI: 10.7176/JEP/14-16-04. Publication date: June 30th 202
Maximum discharges and maximum runoffs in Poland
Published in: Natural environment of Poland and its protection in Łódź University Geographical Research, edited by E. Kobojek and T. Marsza
Five Total Maximum Daily Loads for Indicator Bacteria in Four Austin Streams
The report provides an overview of Waller Creek as well as detailed information about land use, flow duration, and TMDLs of bacteria.

EXECUTIVE SUMMARY: This document describes total maximum daily loads (TMDLs) for four Austin streams and their tributaries in which concentrations of indicator bacteria exceed the criteria used to evaluate attainment of the contact recreation use. The Texas Commission on Environmental Quality (TCEQ) first identified the impairments to the Spicewood Tributary to Shoal Creek (Segment 1403J) and Taylor Slough South (1403K) in the 2002 State of Texas Clean Water Act Section 303(d) List
(TCEQ, 2002), adding Waller Creek (1429C) and Walnut Creek (1428B) when the list was updated in 2006. The impaired segments and corresponding assessment units (AUs) are:
- Spicewood Tributary to Shoal Creek (1403J_01)
- Taylor Slough South (1403K_01)
- Walnut Creek (1428B_05)
- Waller Creek (1429C_02, 1429C_03)
Together these four freshwater streams total approximately 31.6 miles in length, with watersheds covering 63.465 square miles. They lie almost entirely within the City of Austin's full purpose, planning, or extraterritorial jurisdiction, and almost entirely within Travis County, except that the Walnut Creek watershed includes a very small portion of Williamson County.
Currently, there are no permitted domestic wastewater discharges within the watersheds of any of these streams. The Walnut Creek Wastewater Treatment Facility (WWTF), operated by the City of Austin, discharges its effluent directly into the Colorado River rather than Walnut Creek. There are also no permitted industrial discharges of bacteria within the watersheds; the Freescale Semiconductor WWTF discharges only bacteria-free process water. The primary loads are from various nonpoint sources that enter the streams via stormwater.
The Spicewood Tributary to Shoal Creek is an intermittent freshwater stream approximately 1.4 miles in length from MOPAC/Loop-1 upstream to its headwaters near Spicewood Springs Road and Mesa Drive. The watershed is about 0.650 square miles and is entirely located in the City of Austin. There are no regulated wastewater discharges within this watershed.
Taylor Slough South is a perennial freshwater stream approximately 1.1 miles in length from Lake Austin upstream to its headwaters near West 35th Street and MOPAC/Loop-1. The watershed is 0.650 square miles and is entirely located in the City of Austin. There are no regulated wastewater discharges within this watershed.
Waller Creek is a perennial freshwater stream approximately 6.7 miles in length from its confluence with Lady Bird Lake upstream to its headwaters near Northcrest Boulevard and West St. Johns Avenue. The watershed is 5.648 square miles and is entirely located in the City of Austin. There are no regulated domestic wastewater discharges within this watershed.
Walnut Creek is a perennial freshwater stream approximately 22.4 miles in length from its confluence with the Colorado River upstream to its headwaters near McNeil Drive and Parmer Lane. The watershed is approximately 56.517 square miles and is mostly within the City of Austin's full purpose jurisdiction, though portions are in the planning or extraterritorial jurisdictions. Currently, there is only one industrial wastewater discharge within the watershed: the Freescale Semiconductor plant, which discharges only bacteria-free process water into Walnut Creek AU 1428B_01.
Escherichia coli (E. coli) are the preferred indicator bacteria for assessing the contact recreation use in freshwater, and were used for development of the TMDLs, with one exception: fecal coliform bacteria were used for assessment of Walnut Creek AU 1428B_02 because fecal coliform was the standard when the data were collected in 1999. E. coli data are not currently available but will be collected in the future.
The criteria for assessing attainment of the contact recreation use are expressed as the number (or “counts”) of bacteria. The primary contact recreation use is not supported when the geometric mean of E. coli samples exceeds 126 most probable number (MPN) per 100 milliliters (mL), or the geometric mean of fecal coliform samples exceeds 200 MPN per 100 mL.
For the 2012 assessment period, the geometric means of all AUs examined exceeded 126 MPN/100 mL E. coli or 200 MPN/100 mL fecal coliform, indicating non-support of primary contact recreation.
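The geometric-mean comparison described above is simple arithmetic; a minimal sketch in Python (the sample values below are invented for illustration, not monitoring data):

```python
import math

# Hypothetical E. coli sample results for one assessment unit, in MPN/100 mL.
samples = [240, 98, 410, 150, 87, 310, 200]

# Geometric mean: the n-th root of the product, computed via logs for stability.
geo_mean = math.exp(sum(math.log(s) for s in samples) / len(samples))

# Primary contact recreation criterion for E. coli in freshwater (from the text).
E_COLI_CRITERION = 126  # MPN/100 mL

supports_use = geo_mean <= E_COLI_CRITERION
print(f"geometric mean = {geo_mean:.1f} MPN/100 mL, supports use: {supports_use}")
```

Note that the criterion applies to the geometric mean, not to individual samples, so a single high result does not by itself indicate non-support.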
Possible sources of indicator bacteria within the watersheds of the impaired AUs are stormwater runoff from regulated storm sewers, illicit discharges from storm sewers, sanitary sewer overflows (SSOs), and unregulated sources such as wildlife, unmanaged feral animals, and pets.
Load duration curve (LDC) analyses of instream flows were used to estimate allowable pollutant loads and specific TMDL allocations. Because bacteria loads are usually highest at high flow, the very high flow regime was used as the critical flow for determining the TMDL.
Predictions of future growth of existing or new domestic point sources were not necessary. The City of Austin has informed TCEQ that it intends to accommodate all growth with its central wastewater treatment system, which discharges directly into the Colorado River instead of these watersheds.
The wasteload allocation (WLA) for regulated stormwater was based on the percentage of each watershed regulated under a Phase I or Phase II Texas Pollutant Discharge Elimination System (TPDES) stormwater permit.
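The load duration curve arithmetic behind the TMDL and the WLA can be sketched as follows. The flow value and the regulated-watershed fraction are hypothetical, and a margin of safety is omitted for brevity:

```python
# Sketch of a load-duration-curve (LDC) allocation. The allowable load at a
# given flow is the flow volume per day times the concentration criterion.

CRITERION = 126          # E. coli criterion, MPN/100 mL (from the text)
L_PER_CF = 28.3168       # liters per cubic foot
S_PER_DAY = 86400        # seconds per day

def allowable_load(flow_cfs: float) -> float:
    """Allowable E. coli load (MPN/day) at a given stream flow in cfs."""
    liters_per_day = flow_cfs * L_PER_CF * S_PER_DAY
    # The criterion is per 100 mL, i.e. per 0.1 L, hence the factor of 10.
    return liters_per_day * 10 * CRITERION

# Critical condition: the "very high" flow regime (hypothetical flow value).
tmdl = allowable_load(flow_cfs=50.0)

# Wasteload allocation for regulated stormwater, proportional to the fraction
# of the watershed under a TPDES stormwater permit (hypothetical fraction).
regulated_fraction = 0.85
wla = tmdl * regulated_fraction
la = tmdl - wla  # load allocation for unregulated (nonpoint) sources

print(f"TMDL = {tmdl:.3e} MPN/day, WLA = {wla:.3e}, LA = {la:.3e}")
```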
Compliance with these TMDLs is based on keeping indicator bacteria concentrations in the selected waters below the geometric mean criterion of E. coli less than 126 MPN/100 mL or fecal coliform less than 200 MPN/100 mL.Waller Creek Working Grou
Maximum Fidelity
The most fundamental problem in statistics is the inference of an unknown
probability distribution from a finite number of samples. For a specific
observed data set, answers to the following questions would be desirable: (1)
Estimation: Which candidate distribution provides the best fit to the observed
data?, (2) Goodness-of-fit: How concordant is this distribution with the
observed data?, and (3) Uncertainty: How concordant are other candidate
distributions with the observed data? A simple unified approach for univariate
data that addresses these traditionally distinct statistical notions is
presented, called "maximum fidelity". Maximum fidelity is a strict frequentist
approach that is fundamentally based on model concordance with the observed
data. The fidelity statistic is a general information measure based on the
coordinate-independent cumulative distribution and critical yet previously
neglected symmetry considerations. An approximation for the null distribution
of the fidelity allows its direct conversion to absolute model concordance (p
value). Fidelity maximization allows identification of the most concordant
model distribution, generating a method for parameter estimation, with
neighboring, less concordant distributions providing the "uncertainty" in this
estimate. Maximum fidelity provides an optimal approach for parameter
estimation (superior to maximum likelihood) and a generally optimal approach
for goodness-of-fit assessment of arbitrary models applied to univariate data.
Extensions to binary data, binned data, multidimensional data, and classical
parametric and nonparametric statistical tests are described. Maximum fidelity
provides a philosophically consistent, robust, and seemingly optimal foundation
for statistical inference. All findings are presented in an elementary way to
be immediately accessible to all researchers utilizing statistical analysis.
Comment: 66 pages, 32 figures, 7 tables, submitted
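The abstract's three questions (estimation, goodness-of-fit, uncertainty) can be illustrated concretely. The sketch below uses a Cramér–von Mises-type CDF concordance score as a stand-in, since the fidelity statistic itself is not given in the abstract; this is not the paper's method, only the shared problem setting:

```python
# Illustrating estimation and uncertainty via a CDF-based concordance score.
# NOTE: a Cramer-von Mises-type statistic is used as a stand-in; it is NOT
# the paper's fidelity statistic.
import math
import random

random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(200)]
n = len(data)

def norm_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def discordance(mu, sigma):
    """Distance between the model CDF and the empirical CDF of the data."""
    u = sorted(norm_cdf(x, mu, sigma) for x in data)
    return sum((ui - (i + 0.5) / n) ** 2 for i, ui in enumerate(u))

# (1) Estimation: scan candidate (mu, sigma) pairs for the most concordant.
grid = [(mu / 10, s / 10) for mu in range(40, 61) for s in range(15, 26)]
best = min(grid, key=lambda p: discordance(*p))

# (3) Uncertainty: neighbouring candidates that are nearly as concordant.
tol = 1.5 * discordance(*best)
near = [p for p in grid if discordance(*p) <= tol]

print("best (mu, sigma):", best, "| near-concordant candidates:", len(near))
```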
Distributed Approximation of Maximum Independent Set and Maximum Matching
We present a simple distributed $\Delta$-approximation algorithm for maximum
weight independent set (MaxIS) in the CONGEST model which completes in
$O(\texttt{MIS}(G)\cdot \log W)$ rounds, where $\Delta$ is the maximum
degree, $\texttt{MIS}(G)$ is the number of rounds needed to compute a maximal
independent set (MIS) on $G$, and $W$ is the maximum weight of a node. Whether
our algorithm is randomized or deterministic depends on the \texttt{MIS}
algorithm used as a black-box.
Plugging in the best known algorithm for MIS gives a randomized solution in
$O(\log n \log W)$ rounds, where $n$ is the number of nodes.
We also present a deterministic $O(\Delta + \log^* n)$-round algorithm based
on coloring.
We then show how to use our MaxIS approximation algorithms to compute a
2-approximation for maximum weight matching without incurring any additional
round penalty in the CONGEST model. We use a known reduction for
simulating algorithms on the line graph while incurring congestion, but we show
our algorithm is part of a broad family of \emph{local aggregation algorithms}
for which we describe a mechanism that allows the simulation to run in the
CONGEST model without an additional overhead.
Next, we show that for maximum weight matching, relaxing the approximation
factor to $(2+\varepsilon)$ allows us to devise a distributed algorithm
requiring $O\left(\frac{\log \Delta}{\log\log \Delta}\right)$ rounds for any constant
$\varepsilon > 0$. For the unweighted case, we can even obtain a
$(1+\varepsilon)$-approximation in this number of rounds. These algorithms are
the first to achieve the provably optimal round complexity with respect to
dependency on $\Delta$.
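The approximation guarantee for maximum weight independent set has a simple sequential analogue: greedily take the heaviest remaining node and discard its neighbours. Each chosen node of weight w eliminates at most Δ optimal nodes (Δ being the maximum degree), each of weight at most w, so the optimum is at most Δ times the greedy total. A sketch of this sequential intuition, not the paper's distributed algorithm:

```python
# Sequential greedy sketch of a max-degree approximation for maximum weight
# independent set: repeatedly take the heaviest remaining node and drop its
# neighbours. The distributed (CONGEST) implementation is not reproduced here.

def greedy_mwis(adj: dict, weight: dict) -> set:
    remaining = set(adj)
    chosen = set()
    while remaining:
        v = max(remaining, key=weight.get)   # heaviest remaining node
        chosen.add(v)
        remaining.discard(v)
        remaining -= set(adj[v])             # neighbours can no longer join
    return chosen

# Small example: a 4-cycle with one heavy node (weights are illustrative).
adj = {1: [2, 4], 2: [1, 3], 3: [2, 4], 4: [1, 3]}
weight = {1: 10, 2: 3, 3: 4, 4: 2}
print(greedy_mwis(adj, weight))  # picks 1, then 3 -> {1, 3}
```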
Maximum entanglement of formation for a two-mode Gaussian state over passive operations
We quantify the maximum amount of entanglement of formation (EoF) that can be
achieved by continuous-variable states under passive operations, which we refer
to as EoF-potential. Focusing, in particular, on two-mode Gaussian states we
derive analytical expressions for the EoF-potential for specific classes of
states. For more general states, we demonstrate that this quantity can be
upper-bounded by the minimum amount of squeezing needed to synthesize the
Gaussian modes, a quantity called squeezing of formation. Our work, thus,
provides a new link between non-classicality of quantum states and the
non-classicality of correlations.
Comment: Revised version
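For context, EoF is simplest for pure states, where it reduces to the entropy of entanglement. For a pure two-mode squeezed vacuum with squeezing parameter $r$ (a standard result quoted here for orientation, not derived from this abstract):

$$
E_F = \cosh^2 r \,\log_2\!\left(\cosh^2 r\right) - \sinh^2 r \,\log_2\!\left(\sinh^2 r\right),
$$

which vanishes at $r = 0$ and grows without bound as the squeezing increases, linking the amount of squeezing to the achievable entanglement.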