Medical Liability Erased: How the Protecting Access to Care Act of 2017 Limits Patients’ Access to Proper Care
This paper outlines the severe impact that the Protecting Access to Care Act would have on victims of malpractice who have suffered grave injuries, and explains how the bill would nearly eliminate patients’ ability to recover damages when doctors or hospitals provide negligent care. Part II of this paper will examine some of the limits that this bill would impose and the impact it would have on injured patients’ ability to recover damages. Part III will describe the entities that are truly driving this bill and their motives for doing so. Part IV will clarify some of the misconceptions about tort reform and caps on damages and explain why the enactment of this bill would ultimately do more harm than good. Finally, Part V will examine the benefits of medical malpractice litigation and why it is imperative to ensure that patients have the ability to find redress in a court of law.
Breakdown of the Internet under intentional attack
We study the tolerance of random networks to intentional attack, whereby a
fraction p of the most connected sites is removed. We focus on scale-free
networks, having a connectivity distribution P(k) ~ k^(-a) (where k is the site
connectivity), and use percolation theory to study analytically and numerically
the critical fraction p_c needed for the disintegration of the network, as well
as the size of the largest connected cluster. We find that even networks with
a<=3, known to be resilient to random removal of sites, are sensitive to
intentional attack. We also argue that, near criticality, the average distance
between sites in the spanning (largest) cluster scales with its mass, M, as
sqrt(M), rather than as log_k M, as expected for random networks away from
criticality. Thus, the disruptive effects of intentional attack become relevant
even before the critical threshold is reached.
Comment: LaTeX, 4 pages, 3 eps figures
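As a toy illustration of the attack considered in this abstract (not the paper's analytical percolation treatment), the sketch below grows a scale-free graph by preferential attachment, removes a fraction p of the most connected sites, and measures the largest connected cluster; the generator and all parameters are illustrative assumptions:

```python
import random
from collections import defaultdict, deque

def ba_graph(n, m, seed=0):
    """Scale-free graph via preferential attachment (illustrative generator)."""
    rng = random.Random(seed)
    adj = defaultdict(set)
    targets = list(range(m))      # first new node links to the m seed nodes
    repeated = []                 # nodes repeated in proportion to their degree
    for new in range(m, n):
        for t in set(targets):
            adj[new].add(t)
            adj[t].add(new)
        repeated.extend(targets)
        repeated.extend([new] * m)
        targets = [rng.choice(repeated) for _ in range(m)]
    return adj

def giant_component(adj, removed):
    """Size of the largest connected cluster after deleting `removed` sites."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

adj = ba_graph(2000, 2)
hubs = sorted(adj, key=lambda u: len(adj[u]), reverse=True)
for p in (0.0, 0.01, 0.05):      # fraction of most connected sites removed
    removed = set(hubs[:int(p * len(adj))])
    print(p, giant_component(adj, removed))
```

Removing even a small fraction of hubs fragments the largest cluster far faster than random removal would, consistent with the sensitivity described above.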
Studies of Bacterial Branching Growth using Reaction-Diffusion Models for Colonial Development
Various bacterial strains exhibit colonial branching patterns during growth
on poor substrates. These patterns reflect bacterial cooperative
self-organization and cybernetic processes of communication, regulation and
control employed during colonial development. One method of modeling is the
continuous, or coupled reaction-diffusion approach, in which continuous time
evolution equations describe the bacterial density and the concentration of the
relevant chemical fields. In the context of branching growth, this idea has
been pursued by a number of groups. We present an additional model which
includes a lubrication fluid excreted by the bacteria. We also add fields of
chemotactic agents to the other models. We then present a critique of this
whole enterprise with focus on the models' potential for revealing new
biological features.
Comment: 1 LaTeX file, 40 GIF/JPEG files (compressed into tar-gzip). Physica A, in press
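In the spirit of the coupled reaction-diffusion approach described above, here is a minimal 1-D sketch in which a bacterial density grows by consuming a diffusing nutrient; the fields, constants, and the omission of the lubrication and chemotaxis terms are all simplifying assumptions, not the paper's model:

```python
# Minimal 1-D sketch of the coupled reaction-diffusion idea: a bacterial
# density b(x, t) diffuses and grows by consuming a diffusing nutrient
# n(x, t). Constants are illustrative; the lubrication fluid and
# chemotaxis fields of the actual models are omitted.
D_b, D_n, rate = 0.1, 1.0, 1.0   # assumed diffusion and reaction constants
dx, dt, steps, N = 1.0, 0.1, 500, 100
b = [0.0] * N                    # bacterial density
n = [1.0] * N                    # nutrient concentration
b[N // 2] = 0.1                  # small inoculum at the centre

def laplacian(f, i):
    """Discrete second derivative with no-flux (reflecting) boundaries."""
    left = f[i - 1] if i > 0 else f[i + 1]
    right = f[i + 1] if i < len(f) - 1 else f[i - 1]
    return (left - 2.0 * f[i] + right) / dx ** 2

for _ in range(steps):
    growth = [rate * b[i] * n[i] for i in range(N)]
    b = [b[i] + dt * (D_b * laplacian(b, i) + growth[i]) for i in range(N)]
    n = [n[i] + dt * (D_n * laplacian(n, i) - growth[i]) for i in range(N)]

print(round(sum(b), 2), round(sum(n), 2))  # biomass rises as nutrient falls
```

The explicit Euler step is stable here because D * dt / dx^2 <= 0.5 for both fields; a branching-pattern model would add the lubrication field and a second spatial dimension on top of this skeleton.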
Modeling branching and chiral colonial patterning of lubricating bacteria
In nature, microorganisms must often cope with hostile environmental
conditions. To do so they have developed sophisticated cooperative behavior and
intricate communication capabilities, such as: direct cell-cell physical
interactions via extra-membrane polymers, collective production of
extracellular "wetting" fluid for movement on hard surfaces, long range
chemical signaling such as quorum sensing and chemotactic (bias of movement
according to gradient of chemical agent) signaling, collective activation and
deactivation of genes and even exchange of genetic material. Utilizing these
capabilities, the colonies develop complex spatio-temporal patterns in response
to adverse growth conditions. We present a wealth of branching and chiral
patterns formed during colonial development of lubricating bacteria (bacteria
which produce a wetting layer of fluid for their movement). Invoking ideas from
pattern formation in non-living systems and using "generic" modeling we are
able to reveal novel survival strategies which account for the salient features
of the evolved patterns. Using the models, we demonstrate how communication
leads to self-organization via cooperative behavior of the cells. In this
regard, pattern formation in microorganisms can be viewed as the result of the
exchange of information between the micro-level (the individual cells) and the
macro-level (the colony). We mainly review known results, but include a new
model of chiral growth, which enables us to study the effect of chemotactic
signaling on the chiral growth. We also introduce a measure for weak chirality
and use this measure to compare the results of model simulations with
experimental observations.
Comment: 50 pages, 24 images in 44 GIF/JPEG files. Proceedings of IMA workshop: Pattern Formation and Morphogenesis (1998)
Real-Time Classification of Twitter Trends
Social media users give rise to social trends as they share about common
interests, which can be triggered by different reasons. In this work, we
explore the types of triggers that spark trends on Twitter, introducing a
typology with the following four types: 'news', 'ongoing events', 'memes', and
'commemoratives'. While previous research has analyzed trending topics over the
long term, we look at the earliest tweets that produce a trend, with the aim of
categorizing trends early on. This would make it possible to provide a filtered subset of
trends to end users. We analyze and experiment with a set of straightforward
language-independent features based on the social spread of trends to
categorize them into the introduced typology. Our method provides an efficient
way to accurately categorize trending topics without the need for external data,
enabling news organizations to discover breaking news in real-time, or to
quickly identify viral memes that might enrich marketing decisions, among
others. The analysis of social features also reveals patterns associated with
each type of trend, such as tweets about ongoing events being shorter as many
were likely sent from mobile devices, or memes having more retweets originating
from a few trend-setters.
Comment: Pre-print of article accepted for publication in the Journal of the American Society for Information Science and Technology, copyright © 2013 (American Society for Information Science and Technology)
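The kind of language-independent, social-spread features discussed above can be sketched as follows; the three feature names and the input format are hypothetical illustrations, not the paper's exact feature set:

```python
# Illustrative "social spread" features computed from the earliest tweets of
# a trend; feature names and input format are hypothetical, not the paper's
# exact feature set.
def trend_features(tweets):
    """tweets: list of dicts with 'text', 'user' and 'is_retweet' keys."""
    n = len(tweets)
    authors = {t["user"] for t in tweets}
    return {
        "avg_length": sum(len(t["text"]) for t in tweets) / n,
        "retweet_ratio": sum(t["is_retweet"] for t in tweets) / n,
        "user_diversity": len(authors) / n,  # 1.0: every tweet has a new author
    }

sample = [
    {"text": "short update", "user": "a", "is_retweet": False},
    {"text": "short update", "user": "b", "is_retweet": True},
    {"text": "a longer report about an unfolding event", "user": "c", "is_retweet": False},
]
print(trend_features(sample))
```

Note that none of these features inspects the words themselves, which is what makes this style of classifier language-independent.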
Geographical Embedding of Scale-Free Networks
A method for embedding graphs in Euclidean space is suggested. The method
connects nodes to their geographically closest neighbors and economizes on the
total physical length of links. The topological and geometrical properties of
scale-free networks embedded by the suggested algorithm are studied both
analytically and through simulations. Our findings indicate dramatic changes in
the embedded networks, in comparison to their off-lattice counterparts, and
call into question the applicability of off-lattice scale-free models to
realistic, everyday-life networks.
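A minimal sketch of the suggested embedding idea, under assumed details (random points in the unit square, degrees drawn from a power law, links to the geographically nearest neighbours); the paper's exact lattice and wiring rule may differ:

```python
import math
import random

# Sketch: scatter nodes in the unit square, draw target degrees from a power
# law, and link each node to its geographically nearest neighbours, keeping
# the total physical link length small. The concrete choices (unit square,
# wiring order, cutoff kmax) are assumptions, not the paper's algorithm.
def embed(n, gamma=2.5, seed=1):
    rng = random.Random(seed)
    kmax = int(math.sqrt(n))
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    weights = [k ** -gamma for k in range(1, kmax + 1)]   # P(k) ~ k^(-gamma)
    degs = rng.choices(range(1, kmax + 1), weights=weights, k=n)
    edges = set()
    for i, (x, y) in enumerate(pts):
        near = sorted((u for u in range(n) if u != i),
                      key=lambda u: (pts[u][0] - x) ** 2 + (pts[u][1] - y) ** 2)
        for u in near[:degs[i]]:          # connect to the degs[i] closest nodes
            edges.add((min(i, u), max(i, u)))
    total_len = sum(math.dist(pts[a], pts[b]) for a, b in edges)
    return pts, edges, total_len

pts, edges, total_len = embed(200)
print(len(edges), round(total_len, 3))
```

Because every link goes to a nearby point, the total physical length stays small, which is exactly the geographical constraint that distinguishes these embedded networks from their off-lattice counterparts.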
Two-Source Condensers with Low Error and Small Entropy Gap via Entropy-Resilient Functions
In their seminal work, Chattopadhyay and Zuckerman (STOC'16) constructed a two-source extractor with error epsilon for n-bit sources having min-entropy polylog(n/epsilon). Unfortunately, the construction's running time is poly(n/epsilon), which means that with polynomial-time constructions, only polynomially small errors are possible. Our main result is a poly(n, log(1/epsilon))-time computable two-source condenser. For any k >= polylog(n/epsilon), our condenser transforms two independent (n,k)-sources into a distribution over m = k - O(log(1/epsilon)) bits that is epsilon-close to having min-entropy m - o(log(1/epsilon)), hence achieving an entropy gap of o(log(1/epsilon)).
The bottleneck for obtaining low error in recent constructions of two-source extractors lies in the use of resilient functions. Informally, this is a function that receives input bits from r players, with the property that the function's output has small bias even if a bounded number of corrupted players feed adversarial inputs after seeing the inputs of the other players. The drawback of using resilient functions is that the error cannot be smaller than ln r/r. This, in turn, forces the running time of the construction to be polynomial in 1/epsilon.
A key component in our construction is a variant of resilient functions which we call entropy-resilient functions. This variant can be seen as playing the above game for several rounds, each round outputting one bit. The goal of the corrupted players is to reduce, with as high a probability as they can, the min-entropy accumulated throughout the rounds. We show that while the bias decreases only polynomially with the number of players in a one-round game, their success probability decreases exponentially in the entropy gap they are attempting to incur in a repeated game.
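A toy simulation (not the paper's construction) of the one-round limitation mentioned above: with majority as the resilient function, q corrupted players who see the honest bits decide the output exactly when the honest vote lands within q of the threshold, so their control probability scales roughly like q/sqrt(n) and cannot be driven arbitrarily low.

```python
import random

# Toy illustration (not the paper's construction) of the one-round bottleneck:
# with majority over n players, q corrupted players who see the n-q honest
# coin flips decide the output exactly when the honest count of ones lands
# within q of the majority threshold.
def control_prob(n, q, trials=20000, seed=0):
    """Fraction of rounds in which the q adversarial players decide the majority."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        ones = bin(rng.getrandbits(n - q)).count("1")  # honest ones among n-q fair coins
        # adversary can force 0 (send all zeros) iff ones <= n/2, and can
        # force 1 (send all ones) iff ones + q > n/2; it controls iff both hold
        if n / 2 - q < ones <= n / 2:
            hits += 1
    return hits / trials

print(control_prob(1001, 10), control_prob(1001, 50))
```

The entropy-resilient variant described in the abstract repeats this game over many rounds, which is what converts the polynomial one-round bias into an exponentially small success probability for a given entropy gap.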