3.4. The Development of the PaleoWay Digital Workflows in the Context of Archaeological Consulting
PaleoWest Archaeology began to develop technology and methods for digital data collection in 2010, and quickly became the first archaeological consulting firm in the United States to adopt an all-digital workflow. The initial phase of research and development of this workflow coincided with a period of rapid software and hardware development, most notably the launch of the first- and second-generation iPads. The digital archaeological toolkit we assembled was used to collect survey data from tens of thousands of acres, document thousands of isolated artifacts, and record hundreds of archaeological sites throughout the American Southwest and elsewhere. This experience informed a second phase of development in which a custom database was constructed using FileMaker Pro. Ultimately, we developed a number of all-digital workflows that we refer to collectively as the PaleoWay. The development of this workflow has allowed us to collect better-quality data while becoming more efficient in our field and reporting operations.
Using The DuPont Decomposing Process To Create A Marketing Model
The DuPont Model of Business is almost 100 years old. It provides a classic template for the decomposition process that can be used to build integrated systems of performance metrics. The marketing field has never embraced the DuPont Model as completely as accounting and finance because its focus is on the earnings returned to owners' equity. Earnings flow is not the same as cash flow, and marketing managers are more focused on operating profits than net profits. However, the decomposition process used to create the DuPont Model can be applied to the creation of a Marketing Model. This paper reviews the classic decomposition process and uses it to create a new Marketing Model of Profitability.
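The classic decomposition referred to here is ROE = profit margin × asset turnover × equity multiplier, each factor isolating one driver of return. A minimal numerical sketch with hypothetical figures:

```python
# Illustrative sketch of the classic DuPont decomposition.
# All figures below are hypothetical, chosen only for the arithmetic.
net_income = 120_000.0
sales = 1_500_000.0
total_assets = 1_000_000.0
equity = 600_000.0

profit_margin = net_income / sales          # operating efficiency
asset_turnover = sales / total_assets       # asset-use efficiency
equity_multiplier = total_assets / equity   # financial leverage

roe = profit_margin * asset_turnover * equity_multiplier
# The product of the three factors recovers the direct ratio
# net_income / equity, which is what makes the decomposition useful:
# each factor can be tracked and managed separately.
assert abs(roe - net_income / equity) < 1e-12
print(f"ROE = {roe:.2%}")  # → ROE = 20.00%
```

The same telescoping-ratio trick (insert intermediate quantities so adjacent numerators and denominators cancel) is what the paper applies to marketing metrics.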
Making Price Elasticity A Useful Metric For Maximizing Profit
An estimate of a product’s price elasticity can be used to calculate whether a price change will increase or decrease sales revenue. However, the price elasticity of demand alone does not indicate whether a price change will increase or decrease gross profit, because the marginal cost per unit confounds the calculation. An estimate of the price elasticity can instead be combined with the product’s markup to determine whether a change in the selling price will increase or decrease profit. The purpose of this paper is to demonstrate how estimates of the price elasticity and the markup can be combined to help managers decide whether greater profits can be realized with a price decrease or a price increase.
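The decision rule described here can be sketched directly. Assuming the markup is measured on price, m = (P − MC)/P, and writing E for the (negative) price elasticity, gross profit (P − MC)·Q(P) rises with price exactly when 1 + m·E > 0. A minimal, hypothetical implementation:

```python
def profit_direction(elasticity: float, markup: float) -> str:
    """Suggest which way a price change moves gross profit.

    elasticity: price elasticity of demand, E (negative).
    markup: markup on price, m = (P - MC) / P, in [0, 1).

    With profit = (P - MC) * Q(P), differentiating gives
    d(profit)/dP = Q * (1 + m * E), so the sign of (1 + m * E)
    determines whether raising the price helps or hurts.
    """
    signal = 1 + markup * elasticity
    if signal > 0:
        return "raise price"
    if signal < 0:
        return "lower price"
    return "already at the profit-maximizing price (|E| = 1/m)"

# Elasticity -2.0 with a 40% markup: 1 + 0.4 * (-2.0) > 0,
# so a price increase should raise profit.
print(profit_direction(-2.0, 0.40))  # → raise price
# Elasticity -4.0 with a 50% markup: 1 + 0.5 * (-4.0) < 0.
print(profit_direction(-4.0, 0.50))  # → lower price
```

The boundary case 1 + m·E = 0 is the familiar Lerner condition, which is why elasticity and markup together, but neither alone, settle the question.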
Three-Dimensional Spectral Classification of Low-Metallicity Stars Using Artificial Neural Networks
We explore the application of artificial neural networks (ANNs) for the estimation of atmospheric parameters (Teff, log g, and [Fe/H]) for Galactic F- and G-type stars. The ANNs are fed with medium-resolution (Δλ ~ 1-2 Å) non-flux-calibrated spectroscopic observations. From a sample of 279 stars with previous high-resolution determinations of metallicity and a set of (external) estimates of temperature and surface gravity, our ANNs are able to predict Teff with an accuracy of σ(Teff) = 135-150 K over the range 4250 ≤ Teff ≤ 6500 K, log g with an accuracy of σ(log g) = 0.25-0.30 dex over the range 1.0 ≤ log g ≤ 5.0 dex, and [Fe/H] with an accuracy of σ([Fe/H]) = 0.15-0.20 dex over the range -4.0 ≤ [Fe/H] ≤ 0.3. Such accuracies are competitive with the results obtained by fine analysis of high-resolution spectra. It is noteworthy that the ANNs are able to obtain these results without consideration of photometric information for these stars. We have also explored the impact of the signal-to-noise ratio (S/N) on the behavior of ANNs and conclude that, when analyzed with ANNs trained on spectra of commensurate S/N, it is possible to extract physical parameter estimates of similar accuracy with stellar spectra having S/N as low as 13. Taken together, these results indicate that the ANN approach should be of primary importance for use in present and future large-scale spectroscopic surveys.
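The regression task described above, flux vector in, (Teff, log g, [Fe/H]) out, can be sketched with a generic feed-forward network. This is an illustration only: scikit-learn's MLPRegressor stands in for the authors' network, and the "spectra" are synthetic stand-ins (a noisy linear encoding of the labels), not real observations.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Synthetic stand-ins for medium-resolution spectra. The label ranges
# match the abstract; everything else (sizes, architecture) is invented.
rng = np.random.default_rng(0)
n_stars, n_pixels = 300, 100
labels = np.column_stack([
    rng.uniform(4250, 6500, n_stars),   # Teff (K)
    rng.uniform(1.0, 5.0, n_stars),     # log g (dex)
    rng.uniform(-4.0, 0.3, n_stars),    # [Fe/H] (dex)
])
# Toy "spectra": a fixed linear encoding of the labels plus noise.
proj = rng.normal(size=(3, n_pixels))
spectra = (labels - labels.mean(0)) / labels.std(0) @ proj
spectra += rng.normal(scale=0.1, size=spectra.shape)

# Standardize inputs and outputs, as is usual for small networks.
X = StandardScaler().fit_transform(spectra)
y = StandardScaler().fit_transform(labels)

# One small hidden layer, trained on 250 stars, scored on the rest.
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
ann.fit(X[:250], y[:250])
print("held-out R^2:", round(ann.score(X[250:], y[250:]), 2))
```

The held-out R² here only confirms the pipeline runs; the σ values quoted in the abstract come from real spectra and the authors' own training setup.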
Orienting Graphical User Interfaces Reduces Errors: The Low Error Voting Interface
This paper demonstrates opportunities for reducing errors with orienting graphical interfaces for voting. We have built many interfaces to explore opportunities for keeping voters aware of selections they have made and are making. Tests of our best prototypes show that missed races and incorrect selection errors are greatly reduced with orienting graphics. The interface reduces errors significantly while extending the time required to vote.
Do Economic Downturns Dampen Patent Litigation?
Recent studies estimate that the economic impact of U.S. patent litigation may be as large as $80 billion per year and that the overall rate of U.S. patent litigation has been growing rapidly over the past twenty years. And yet, the relationship of the macroeconomy to patent litigation rates has never been studied in any rigorous fashion. This lacuna is notable given that there are two opposing theories among lawyers regarding the effect of economic downturns on patent litigation. One camp argues for a substitution theory, holding that patent litigation should increase in a downturn because potential plaintiffs have a greater incentive to exploit patent assets relative to other investments. The other camp posits a capital constraint theory, which holds that the decrease in cash flow and available capital disincentivizes litigation. Analyzing quarterly patent infringement suit filing data from 1971-2009 using a time-series vector autoregression (VAR) model, we show that economic downturns have significantly affected patent litigation rates. (To aid other researchers in testing and extending our analyses, we have made our entire dataset available online.) Importantly, we find that these effects have changed over time. In particular, patent litigation has become more dependent on credit availability in a downturn. We hypothesize that such changes resulted from an increase in the use of contingent-fee attorneys by patent plaintiffs and the rise of non-practicing entities (NPEs), which, unlike most operating companies, generally fund their lawsuits directly from outside capital sources. Over roughly the last twenty years, we find that macroeconomic conditions have affected patent litigation in contrasting ways. Decreases in GDP (particularly economy-wide investment) are correlated with significant increases in patent litigation, a countercyclical trend. On the other hand, increases in T-bill and real interest rates, as well as increases in economy-wide financial risk, are generally correlated with significant decreases in patent suits, a procyclical trend. Thus, the specific nature of a downturn predicts whether patent litigation rates will tend to rise or fall.
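The time-series setup can be sketched with a hand-rolled least-squares VAR(1) on simulated quarterly data. The series, the coefficient values, and the countercyclical sign built into the simulation are placeholders chosen for illustration, not the authors' data or estimates.

```python
import numpy as np

# Simulate ~160 quarters (roughly 1971-2009): low GDP growth this
# quarter raises patent filings next quarter, mimicking the
# countercyclical GDP effect the abstract reports. All coefficients
# below are invented.
rng = np.random.default_rng(1)
n = 160
gdp_growth = rng.normal(0.7, 0.5, n)
filings = np.empty(n)
filings[0] = 200.0
for t in range(1, n):
    filings[t] = (100.0 + 0.5 * filings[t - 1]
                  - 20.0 * gdp_growth[t - 1] + rng.normal(0, 5))

# VAR(1) by ordinary least squares: regress Y_t on [1, Y_{t-1}].
Y = np.column_stack([filings, gdp_growth])
X = np.column_stack([np.ones(n - 1), Y[:-1]])
coef, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)

# coef rows: intercept, lagged filings, lagged GDP growth;
# columns: equations for filings and gdp_growth. The entry below is
# the effect of lagged GDP growth on filings, which should recover
# the negative (countercyclical) sign we simulated.
print("lagged GDP growth -> filings:", round(coef[2, 0], 1))
```

A production analysis would use a dedicated package (e.g. statsmodels' VAR) for lag selection, standard errors, and impulse responses; the point here is only the structure of the regression.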
(Teff,log g,[Fe/H]) Classification of Low-Resolution Stellar Spectra using Artificial Neural Networks
New generation large-aperture telescopes, multi-object spectrographs, and large-format detectors are making it possible to acquire very large samples of stellar spectra rapidly. In this context, traditional star-by-star spectroscopic analyses are no longer practical. New tools are required that are capable of extracting, quickly and with reasonable accuracy, the important basic stellar parameters coded in the spectra. Recent analyses of Artificial Neural Networks (ANNs) applied to the classification of astronomical spectra have demonstrated the ability of this concept to derive estimates of temperature and luminosity. We have adapted the back-propagation ANN technique developed by von Hippel et al. (1994) to predict effective temperatures, gravities, and overall metallicities from spectra with resolving power ~2000 and low signal-to-noise ratio. We show that ANN techniques are very effective in executing a three-parameter (Teff, log g, [Fe/H]) stellar classification. The preliminary results show that the technique is even capable of identifying outliers from the training sample.
Comment: 6 pages, 3 figures (5 files); to appear in the proceedings of the 11th Cambridge Workshop on Cool Stars, Stellar Systems and the Sun, held on Tenerife (Spain), October 1999; also available at http://hebe.as.utexas.ed
Vascular access-specific health-related quality of life impacts among hemodialysis patients: qualitative development of the hemodialysis access-related quality of life (HARQ) instrument.
Background: End-stage kidney disease and hemodialysis dependence are associated with impairments in health-related quality of life (HRQOL), which may be related to vascular access (VA). Few HRQOL measures are VA-specific, and none differentiate HRQOL impact by VA type. We developed a VA-targeted HRQOL measure to distinguish the impact of fistulas, grafts, and catheters. Methods: We created an initial item pool based on a literature review and then conducted focus groups at 4 US sites with 37 adults, as well as interviews with nine dialysis clinicians, about VA's impact on HRQOL. We then drafted the Hemodialysis Access-Related Quality of Life (HARQ) measure and cognitively tested it with 17 hemodialysis patients. Focus group and cognitive interview participants were diverse in age, gender, years on dialysis, and VA. Results: We identified six domains for the HARQ: symptoms, physical functioning, emotional impacts, social and role functioning, sleep, and care-related burdens. Cognitive interviews indicated that items were easily understood and supported content validity. Attributing HRQOL impact to VA, as opposed to other hemodialysis burdens, was challenging for some items. Items that patients considered redundant were dropped, an item on limitations while dressing was added, and a reference to VA-specific impact was included for each item. The average Flesch-Kincaid reading grade level for the revised 47-item HARQ was 5.3. Conclusions: The HARQ features VA-specific content not addressed in other HRQOL measures, making it ideal for comparisons of different VA types and new VA technologies. The psychometric properties of the HARQ will be evaluated in future research.
Mapping Genre at the Page Level in English-Language Volumes from HathiTrust, 1700-1899
Using regularized logistic regression and hidden Markov models, we predict genre at the page level in a collection of 469,000 volumes from HathiTrust Digital Library. Accuracy is comparable to human crowdsourcing.
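The two-stage idea, a per-page classifier whose outputs are then smoothed by an HMM so predictions respect the fact that genre rarely flips page to page, can be sketched with a generic Viterbi decoder. The genre labels, probabilities, and transition values below are invented for illustration; they are not the authors' trained models.

```python
import numpy as np

genres = ["fiction", "poetry", "paratext"]

# Stage 1 stand-in: per-page classifier probabilities for a 6-page
# span (each row sums to 1). Page 3 is deliberately noisy.
page_probs = np.array([
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.3, 0.4, 0.3],   # noisy page the HMM should smooth over
    [0.8, 0.1, 0.1],
    [0.1, 0.1, 0.8],
    [0.1, 0.1, 0.8],
])

# Stage 2: sticky transitions, since genre rarely changes mid-run.
stay, switch = 0.9, 0.05
trans = np.full((3, 3), switch)
np.fill_diagonal(trans, stay)

# Viterbi decoding in log space.
log_e, log_t = np.log(page_probs), np.log(trans)
n_pages, n_states = log_e.shape
score = log_e[0].copy()
back = np.zeros((n_pages, n_states), dtype=int)
for t in range(1, n_pages):
    cand = score[:, None] + log_t        # cand[i, j]: from i to j
    back[t] = cand.argmax(axis=0)
    score = cand.max(axis=0) + log_e[t]
path = [int(score.argmax())]
for t in range(n_pages - 1, 0, -1):
    path.append(int(back[t][path[-1]]))
path.reverse()
print([genres[s] for s in path])
```

With these numbers the noisy third page (whose raw argmax is "poetry") is smoothed to "fiction" because the cost of two genre switches outweighs its weak emission evidence, which is exactly the behavior page-level genre mapping needs.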
Network Structure and Biased Variance Estimation in Respondent Driven Sampling
This paper explores bias in the estimation of sampling variance in Respondent Driven Sampling (RDS). Prior methodological work on RDS has focused on its problematic assumptions and the biases and inefficiencies of its estimators of the population mean. Nonetheless, researchers have given only slight attention to the topic of estimating sampling variance in RDS, despite the importance of variance estimation for the construction of confidence intervals and hypothesis tests. In this paper, we show that the estimators of RDS sampling variance rely on a critical assumption that the network is First Order Markov (FOM) with respect to the dependent variable of interest. We demonstrate, through intuitive examples, mathematical generalizations, and computational experiments, that current RDS variance estimators will always underestimate the population sampling variance of RDS in empirical networks that do not conform to the FOM assumption. Analysis of 215 observed university and school networks from Facebook and Add Health indicates that the FOM assumption is violated in every empirical network we analyze, and that these violations lead to substantially biased RDS estimators of sampling variance. We propose and test two alternative variance estimators that show some promise for reducing biases, but which also illustrate the limits of estimating sampling variance with only partial information on the underlying population social network.
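The FOM assumption can be checked directly on a toy graph: on a FOM network, the class of the next node visited by a random walk depends only on the current node's class, not on earlier classes. The path graph below (one class-A node attached to a chain of class-B nodes) is invented so that the violation is obvious; the paper's tests use observed Facebook and Add Health networks instead.

```python
import random

# Toy graph: node 0 is class A; nodes 1-3 form a class-B chain.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
node_class = {0: "A", 1: "B", 2: "B", 3: "B"}

# Long simple random walk, recording only the class sequence.
random.seed(0)
node, walk = 0, []
for _ in range(200_000):
    walk.append(node_class[node])
    node = random.choice(neighbors[node])

# FOM test: does P(next = A | cur = B) depend on the previous class?
counts = {("A", "B"): [0, 0], ("B", "B"): [0, 0]}  # [to A, total]
for prev, cur, nxt in zip(walk, walk[1:], walk[2:]):
    if cur == "B":
        counts[(prev, cur)][1] += 1
        counts[(prev, cur)][0] += int(nxt == "A")

for (prev, cur), (to_a, total) in counts.items():
    print(f"P(next=A | cur={cur}, prev={prev}) = {to_a / total:.2f}")
```

Coming from A, the walk must be at node 1 and returns to A half the time; coming from B it is usually deeper in the chain and rarely reaches A. The two conditional probabilities differ sharply (about 0.50 versus 0.12), so the class process is not first-order Markov, which is the kind of violation the paper shows biases RDS variance estimators downward.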