Tort Law - Reckless Misconduct in Sports
The United States Court of Appeals for the Tenth Circuit has allowed a professional athlete to recover for an injury caused during a game by the reckless misconduct of an opponent in violation of game rules and customs.
Hackbart v. Cincinnati Bengals, Inc., 601 F.2d 516 (10th Cir.), cert. denied, 444 U.S. 931 (1979)
Substantial Justification Further Defined by Phillips
The Tax Court in Phillips (referred to as Phillips II) has set helpful guidelines for determining when the IRS has taken a position that is not substantially justified. This is one of the tests that must be satisfied before a court of proper jurisdiction can grant a motion for litigation costs pursuant to Section 7430. Phillips I also addresses whether settlement attempts are needed to exhaust administrative remedies, a further requirement for Section 7430 treatment. The following discussion describes the legislative purpose of Section 7430 and its effectiveness in promoting fee-shifting. Attention is focused on Phillips and other cases in analyzing the tests of Section 7430.
Observation of large-scale multi-agent based simulations
The computational cost of large-scale multi-agent based simulations (MABS)
can be extremely high, especially if simulations have to be monitored for
validation purposes. In this paper, two methods, based on self-observation and
statistical survey theory, are introduced to optimize the computation
of observations in MABS. An empirical comparison of the computational cost of
these methods is performed on a toy problem.
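The survey-theory idea in the abstract above can be sketched as follows: rather than scanning every agent at each observation step, estimate a global indicator from a small random sample and report a sampling error alongside it. This is a minimal illustrative sketch, not the paper's method; the agent attribute `energy` and all sizes are assumptions.

```python
import random

# Illustrative sketch of survey-based observation in a MABS: estimate a
# population mean from a random sample of agents instead of a full scan.
# The attribute name "energy" and the sizes below are hypothetical.

random.seed(0)
agents = [{"energy": random.gauss(50.0, 10.0)} for _ in range(100_000)]

def full_observation(agents):
    # Exhaustive scan: O(N) work per observation step.
    return sum(a["energy"] for a in agents) / len(agents)

def survey_observation(agents, n=1_000):
    # Simple random sample without replacement: O(n) work per step.
    sample = random.sample(agents, n)
    mean = sum(a["energy"] for a in sample) / n
    var = sum((a["energy"] - mean) ** 2 for a in sample) / (n - 1)
    stderr = (var / n) ** 0.5  # sampling error shrinks as 1/sqrt(n)
    return mean, stderr

true_mean = full_observation(agents)
est, se = survey_observation(agents)
```

The trade-off the paper studies empirically is visible here: the survey pays a small, quantifiable accuracy cost (the standard error) for a hundred-fold reduction in per-step observation work.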
Optical implementation of systolic array processing
Algorithms for matrix-vector multiplication are implemented using acousto-optic cells for multiplication and input data transfer, and charge-coupled device (CCD) detector arrays for accumulation and output of the results. No two-dimensional matrix mask is required; matrix changes are implemented electronically. A system for multiplying a 50-component nonnegative real vector by a 50 by 50 nonnegative real matrix is described. Modifications for bipolar real and complex-valued processing are possible, as are extensions to matrix-matrix multiplication and multiplication of a vector by multiple matrices.
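The systolic dataflow described above can be emulated in software to show the accumulation pattern: at each clock step one vector component streams in, partial products are formed (optically, in the paper's system), and an array of accumulators sums them, playing the role of the CCD detectors. This is a digital sketch of the dataflow only; the dimensions are illustrative.

```python
import random

# Software sketch of the systolic matrix-vector product: one input
# component is broadcast per clock step and partial products accumulate
# in a detector-like array. Sizes are illustrative, not the paper's 50x50.

random.seed(1)
N = 5
M = [[random.random() for _ in range(N)] for _ in range(N)]
x = [random.random() for _ in range(N)]

def systolic_matvec(M, x):
    n = len(x)
    acc = [0.0] * n  # accumulator array (role of the CCD detectors)
    for t in range(n):  # each clock step streams in one component x[t]
        for i in range(n):
            acc[i] += M[i][t] * x[t]  # the multiply is done acousto-optically
    return acc

y = systolic_matvec(M, x)
ref = [sum(M[i][j] * x[j] for j in range(N)) for i in range(N)]
```

Note how the matrix column M[:, t] is the only data needed at step t, which is why the optical system can feed the matrix electronically with no fixed two-dimensional mask.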
In vivo effects on intron retention and exon skipping by the U2AF large subunit and SF1/BBP in the nematode Caenorhabditis elegans
The in vivo analysis of the roles of splicing factors in regulating alternative splicing in animals remains a challenge. Using a microarray-based screen, we identified a Caenorhabditis elegans gene, tos-1, that exhibited three of the four major types of alternative splicing: intron retention, exon skipping, and, in the presence of U2AF large subunit mutations, the use of alternative 3' splice sites. Mutations in the splicing factors U2AF large subunit and SF1/BBP altered the splicing of tos-1. The 3' splice sites of the retained intron, or those preceding the skipped exon, regulate the splicing pattern of tos-1. Our study provides in vivo evidence that intron retention and exon skipping can be regulated largely by the identities of 3' splice sites.
Is the Most Accurate AI the Best Teammate? Optimizing AI for Teamwork
AI practitioners typically strive to develop the most accurate systems,
making an implicit assumption that the AI system will function autonomously.
However, in practice, AI systems often are used to provide advice to people in
domains ranging from criminal justice and finance to healthcare. In such
AI-advised decision making, humans and machines form a team, where the human is
responsible for making final decisions. But is the most accurate AI the best
teammate? We argue "No" -- predictable performance may be worth a slight
sacrifice in AI accuracy. Instead, we argue that AI systems should be trained
in a human-centered manner, directly optimized for team performance. We study
this proposal for a specific type of human-AI teaming, where the human overseer
chooses to either accept the AI recommendation or solve the task themselves. To
optimize the team performance for this setting we maximize the team's expected
utility, expressed in terms of the quality of the final decision, cost of
verifying, and individual accuracies of people and machines. Our experiments
with linear and non-linear models on real-world, high-stakes datasets show that
the most accurate AI may not lead to the highest team performance, and show the
benefit of modeling teamwork during training through improvements in expected
team utility across datasets, considering parameters such as human skill and
the cost of mistakes. We discuss the shortcomings of current optimization
approaches beyond well-studied loss functions such as log-loss, and encourage
future work on AI optimization problems motivated by human-AI collaboration.
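The expected team utility the abstract optimizes can be sketched numerically: the overseer either accepts the AI's recommendation or re-solves the task at a cost, and the team's expected utility combines decision quality, mistake costs, and verification effort. All parameter values below are illustrative assumptions, not the paper's.

```python
# Hedged sketch of expected team utility in AI-advised decision making:
# the human either accepts the AI's answer or solves the task themselves
# at an extra cost. All numbers are illustrative, not from the paper.

def expected_team_utility(p_ai, p_human, accept_prob,
                          reward=1.0, mistake_cost=5.0, solve_cost=0.3):
    """Expected utility per task.

    p_ai, p_human: accuracies of the AI and of the human solving alone.
    accept_prob:   fraction of tasks where the human accepts the AI's advice.
    """
    def util(p_correct, extra_cost=0.0):
        # Quality of the final decision minus the cost of mistakes and effort.
        return p_correct * reward - (1 - p_correct) * mistake_cost - extra_cost

    return (accept_prob * util(p_ai)
            + (1 - accept_prob) * util(p_human, solve_cost))

# A slightly less accurate AI that the human always trusts can beat a more
# accurate one whose advice the human distrusts and frequently re-solves:
u_predictable = expected_team_utility(p_ai=0.90, p_human=0.85, accept_prob=1.0)
u_accurate    = expected_team_utility(p_ai=0.93, p_human=0.85, accept_prob=0.5)
```

This toy calculation shows the abstract's central point: team utility, not standalone AI accuracy, is the quantity worth optimizing once human trust and verification costs enter the picture.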
Invasion speeds for structured populations in fluctuating environments
We live in a time where climate models predict future increases in
environmental variability and biological invasions are becoming increasingly
frequent. A key to developing effective responses to biological invasions in
increasingly variable environments will be estimates of their rates of spatial
spread and the associated uncertainty of these estimates. Using stochastic,
stage-structured, integro-difference equation models, we show analytically that
invasion speeds are asymptotically normally distributed with a variance that
decreases in time. We apply our methods to a simple juvenile-adult model with
stochastic variation in reproduction and an illustrative example with published
data for the perennial herb Calathea ovandensis. These examples,
buttressed by additional analysis, reveal that increased variability in vital
rates simultaneously slows down invasions yet generates greater uncertainty about
rates of spatial spread. Moreover, while temporal autocorrelations in vital
rates inflate variability in invasion speeds, the effect of these
autocorrelations on the average invasion speed can be positive or negative
depending on life history traits and how well vital rates "remember" the
past.
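The claim that variability in vital rates slows invasions can be illustrated with a much simpler scalar model than the paper's stage-structured one. For a scalar integro-difference model with Gaussian dispersal (variance sigma2) and i.i.d. random growth rates lambda_t, the asymptotic speed is governed by the mean log growth rate, roughly c = sqrt(2 * sigma2 * E[ln lambda]); since variability lowers E[ln lambda] relative to ln E[lambda], it lowers the speed. This is a sketch under those stated assumptions, not the paper's analysis.

```python
import math
import random

# Illustrative sketch: two growth-rate sequences with the SAME arithmetic
# mean (1.5), one constant and one fluctuating. Variability depresses the
# mean LOG growth rate, and hence the invasion speed in this scalar model.

random.seed(2)

def mean_log_growth(lams):
    return sum(math.log(l) for l in lams) / len(lams)

sigma2 = 1.0  # variance of the Gaussian dispersal kernel (illustrative)
constant = [1.5] * 10_000
variable = [random.choice([0.75, 2.25]) for _ in range(10_000)]

c_const = math.sqrt(2 * sigma2 * mean_log_growth(constant))
c_var   = math.sqrt(2 * sigma2 * mean_log_growth(variable))
```

Here c_var comes out below c_const even though both environments have the same average growth rate, mirroring the abstract's finding that increased variability in vital rates slows the invasion.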
On Counting Triangles through Edge Sampling in Large Dynamic Graphs
Traditional frameworks for dynamic graphs have relied on processing only the
stream of edges added into or deleted from an evolving graph, but not any
additional related information such as the degrees or neighbor lists of nodes
incident to the edges. In this paper, we propose a new edge sampling framework
for big-graph analytics in dynamic graphs which enhances the traditional model
by enabling the use of additional related information. To demonstrate the
advantages of this framework, we present a new sampling algorithm, called Edge
Sample and Discard (ESD). It generates an unbiased estimate of the total number
of triangles, which can be continuously updated in response to both edge
additions and deletions. We provide a comparative analysis of the performance
of ESD against two current state-of-the-art algorithms in terms of accuracy and
complexity. The results of the experiments performed on real graphs show that,
with the help of the neighborhood information of the sampled edges, the
accuracy achieved by our algorithm is substantially better. We also
characterize the impact of properties of the graph on the performance of our
algorithm by testing on several Barabási-Albert graphs.
Comment: A short version of this article appeared in Proceedings of the 2017 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM 2017).
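The edge-sampling idea behind the abstract above can be sketched on a static graph: sample each edge independently with probability p, use the neighbor lists of its endpoints (the "additional related information") to count the triangles that edge closes, and rescale. Since each triangle is seen through its 3 edges, the estimator divides by 3p. This is a hedged sketch of the general idea, not ESD's actual streaming update rule.

```python
import random
from itertools import combinations

# Sketch of triangle counting by edge sampling with neighbor-list access.
# Each edge is sampled with probability p; a sampled edge (u, v) closes
# |N(u) & N(v)| triangles. Dividing the total by 3p gives an unbiased
# estimate, since every triangle can be counted via each of its 3 edges.

random.seed(3)

def build_adj(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return adj

def exact_triangles(adj):
    # Each triangle is counted once at each of its 3 vertices.
    return sum(1 for u in adj for v, w in combinations(adj[u], 2)
               if v in adj[w]) // 3

def sampled_estimate(edges, adj, p=0.5):
    total = 0
    for u, v in edges:
        if random.random() < p:  # sample and inspect this edge
            total += len(adj[u] & adj[v])  # triangles closed by (u, v)
    return total / (3 * p)

# Small random example graph (illustrative sizes).
nodes = range(60)
edges = [(u, v) for u, v in combinations(nodes, 2) if random.random() < 0.15]
adj = build_adj(edges)
exact = exact_triangles(adj)
est = sampled_estimate(edges, adj, p=0.5)
```

The accuracy gain the paper reports comes from exactly this kind of neighbor-list lookup: a sampled edge yields an exact local triangle count rather than relying on two independently sampled edges to coincide.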