Google Votes: A Liquid Democracy Experiment on a Corporate Social Network
This paper introduces Google Votes, an experiment in liquid democracy built on Google's internal corporate Google+ social network. Liquid democracy decision-making systems can scale to cover large groups by enabling voters to delegate their votes to other voters. This approach contrasts with direct democracy systems, where voters vote directly on issues, and representative democracy systems, where voters elect representatives to vote on issues for them. Liquid democracy systems can provide many of the benefits of both direct and representative democracy systems with few of the weaknesses. Thus far, high implementation complexity and infrastructure costs have prevented widespread adoption. Google Votes demonstrates how the use of social-networking technology can overcome these barriers and enable practical liquid democracy systems. A case study of Google Votes usage at Google over a 3-year timeframe is included, as well as a framework for evaluating vote visibility called the Golden Rule of Liquid Democracy.
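The core delegation mechanic described in the abstract can be sketched in a few lines. This is an illustrative toy tally with invented names, not Google Votes' actual implementation: each non-voting participant's ballot follows the delegation chain until it reaches someone who voted directly, and cyclic or dangling chains are discarded.

```python
def tally(direct_votes, delegations):
    """Tally a liquid-democracy poll (toy sketch).

    direct_votes: dict voter -> choice, for voters who voted directly.
    delegations:  dict voter -> delegate, for voters who delegated instead.
    """
    counts = {}
    for voter in set(direct_votes) | set(delegations):
        current = voter
        seen = set()
        # Follow the delegation chain until it reaches a direct voter.
        while current not in direct_votes and current in delegations:
            if current in seen:   # cycle: discard this ballot
                break
            seen.add(current)
            current = delegations[current]
        if current in direct_votes:
            choice = direct_votes[current]
            counts[choice] = counts.get(choice, 0) + 1
    return counts
```

In this sketch a delegated vote counts with weight one once resolved; a production system would additionally handle vote visibility and revocation.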
Private Multiplicative Weights Beyond Linear Queries
A wide variety of fundamental data analyses in machine learning, such as
linear and logistic regression, require minimizing a convex function defined by
the data. Since the data may contain sensitive information about individuals,
and these analyses can leak that sensitive information, it is important to be
able to solve convex minimization in a privacy-preserving way.
A series of recent results show how to accurately solve a single convex
minimization problem in a differentially private manner. However, the same data
is often analyzed repeatedly, and little is known about solving multiple convex
minimization problems with differential privacy. For simpler data analyses,
such as linear queries, there are remarkable differentially private algorithms
such as the private multiplicative weights mechanism (Hardt and Rothblum, FOCS
2010) that accurately answer exponentially many distinct queries. In this work,
we extend these results to the case of convex minimization and show how to give
accurate and differentially private solutions to *exponentially many* convex
minimization problems on a sensitive dataset.
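For intuition, here is a toy, non-rigorous sketch of the private multiplicative weights idea the abstract builds on: the mechanism maintains a public synthetic histogram and spends noise only on rounds where that histogram answers a query badly, which is what lets it serve many queries. All parameter values below are illustrative assumptions, not the calibrated privacy settings of Hardt and Rothblum.

```python
import numpy as np

def private_mw(histogram, queries, threshold=0.1, noise_scale=0.01, rng=None):
    """Toy sketch of private multiplicative weights for linear queries.

    histogram: true normalized histogram over the data universe (1-D array).
    queries:   list of 0/1 arrays; a linear query's answer is q @ histogram.
    """
    rng = rng or np.random.default_rng(0)
    x = np.full(len(histogram), 1.0 / len(histogram))  # public estimate
    answers = []
    for q in np.asarray(queries, dtype=float):
        est = float(q @ x)
        noisy = float(q @ histogram) + rng.laplace(scale=noise_scale)
        if abs(noisy - est) <= threshold:
            answers.append(est)        # "lazy" round: no privacy cost spent
        else:
            # Update round: multiplicative-weights step toward the noisy answer.
            sign = 1.0 if noisy > est else -1.0
            x = x * np.exp(sign * threshold * q)
            x = x / x.sum()
            answers.append(noisy)
    return answers
```

The extension discussed in this paper replaces the linear queries with convex minimization problems while keeping the same pay-only-for-updates structure.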
Testing Conditional Independence of Discrete Distributions
We study the problem of testing \emph{conditional independence} for discrete
distributions. Specifically, given samples from a discrete random variable
$(X, Y, Z)$ on domain $[\ell_1] \times [\ell_2] \times [n]$, we want to
distinguish, with probability at least $2/3$, between the case that $X$ and
$Y$ are conditionally independent given $Z$ and the case that $(X, Y, Z)$ is
$\epsilon$-far, in $\ell_1$-distance, from every distribution that has this
property. Conditional independence is a concept of central importance in
probability and statistics with a range of applications in various scientific
domains. As such, the statistical task of testing conditional independence has
been extensively studied in various forms within the statistics and
econometrics communities for nearly a century. Perhaps surprisingly, this
problem has not been previously considered in the framework of distribution
property testing, and in particular no tester with sublinear sample complexity
is known, even for the important special case that the domains of $X$ and $Y$
are binary.
The main algorithmic result of this work is the first conditional
independence tester with {\em sublinear} sample complexity for discrete
distributions over $[\ell_1] \times [\ell_2] \times [n]$. To complement our upper
bounds, we prove information-theoretic lower bounds establishing that the
sample complexity of our algorithm is optimal, up to constant factors, for a
number of settings. Specifically, for the prototypical setting when
$\ell_1 = \ell_2 = 2$, we show that the sample complexity of testing conditional
independence (upper bound and matching lower bound) is
\[
\Theta\left(\max\left(n^{1/2}/\epsilon^2,\ \min\left(n^{7/8}/\epsilon,\ n^{6/7}/\epsilon^{8/7}\right)\right)\right)\,.
\]
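The sample-complexity bound above mixes three terms whose relative size depends on the regime of $n$ and $\epsilon$. A few lines of Python (constants suppressed; an illustration, not part of the paper) make the dominant term easy to inspect:

```python
def ci_testing_bound(n, eps):
    """Evaluate the terms of the Theta(...) bound with constants suppressed.

    Returns max(n^{1/2}/eps^2, min(n^{7/8}/eps, n^{6/7}/eps^{8/7})).
    """
    t1 = n ** 0.5 / eps ** 2          # dominates when eps is very small
    t2 = n ** (7 / 8) / eps           # dominates for moderate eps, large n
    t3 = n ** (6 / 7) / eps ** (8 / 7)
    return max(t1, min(t2, t3))
```

For fixed $n$, shrinking $\epsilon$ eventually makes the $n^{1/2}/\epsilon^2$ term take over, since its $\epsilon$-exponent is the largest of the three.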
Global Solution to the Three-Dimensional Incompressible Flow of Liquid Crystals
The equations for the three-dimensional incompressible flow of liquid
crystals are considered in a smooth bounded domain. The existence and
uniqueness of the global strong solution with small initial data are
established. It is also proved that when the strong solution exists, all the
global weak solutions constructed in [16] must be equal to the unique strong
solution.
Foundation and empire: a critique of Hardt and Negri
In this article, Thompson complements recent critiques of Hardt and Negri's Empire (see Finn Bowring in Capital and Class, no. 83), using the tools of labour process theory to critique the political economy of Empire and to note its unfortunate similarities to conventional theories of the knowledge economy.
Differentially Private Model Selection with Penalized and Constrained Likelihood
In statistical disclosure control, the goal of data analysis is twofold: The
released information must provide accurate and useful statistics about the
underlying population of interest, while minimizing the potential for an
individual record to be identified. In recent years, the notion of differential
privacy has received much attention in theoretical computer science, machine
learning, and statistics. It provides a rigorous and strong notion of
protection for individuals' sensitive information. A fundamental question is
how to incorporate differential privacy into traditional statistical inference
procedures. In this paper we study model selection in multivariate linear
regression under the constraint of differential privacy. We show that model
selection procedures based on penalized least squares or likelihood can be made
differentially private by a combination of regularization and randomization,
and propose two algorithms to do so. We show that our private procedures are
consistent under essentially the same conditions as the corresponding
non-private procedures. We also find that under differential privacy, the
procedure becomes more sensitive to the tuning parameters. We illustrate and
evaluate our method using simulation studies and two real data examples.
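As a rough illustration of the "regularization plus randomization" recipe (not the paper's actual algorithms), one can score candidate predictor subsets by a penalized least-squares criterion and then select among them with an exponential-mechanism-style randomized choice. The sensitivity constant below is a placeholder assumption, and the BIC-style penalty stands in for the paper's penalized likelihood:

```python
import numpy as np
from itertools import combinations

def private_model_selection(X, y, eps, max_size=2, rng=None):
    """Hedged sketch: randomized model selection over penalized LS scores."""
    rng = rng or np.random.default_rng(0)
    n, p = X.shape
    models = [s for k in range(1, max_size + 1)
              for s in combinations(range(p), k)]
    scores = []
    for s in models:
        Xs = X[:, list(s)]
        beta, res, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(res[0]) if res.size else float(((y - Xs @ beta) ** 2).sum())
        # BIC-style penalized fit: reward low residual error, penalize size.
        scores.append(-(n * np.log(rss / n) + len(s) * np.log(n)))
    scores = np.array(scores)
    sensitivity = 1.0  # assumed bound on one record's influence on a score
    # Exponential mechanism: sample a model with prob. proportional to
    # exp(eps * score / (2 * sensitivity)).
    probs = np.exp(eps * (scores - scores.max()) / (2 * sensitivity))
    probs = probs / probs.sum()
    return models[rng.choice(len(models), p=probs)]
```

With a clear signal and a reasonable privacy budget, the randomization concentrates on the correct subset; as eps shrinks, the selection becomes noisier, matching the paper's observation that private procedures grow more sensitive to tuning.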
Footpaths: pedogenic and geomorphological long-term effects of human trampling
Footpaths are among the oldest and most widely distributed forms of human imprint on the landscape. These elongated features are the result of continuous usage of a certain route for walking, at time scales ranging from days to centuries or millennia. In this qualitative investigation, we take a holistic approach combining micromorphology (including voids analysis), chemical soil parameters (such as selective iron oxide dissolution), and remote sensing (spatial distribution and orientation of footpaths in the landscape) to evaluate the long-term residues and environmental effects resulting from the formation of footpaths. Our diverse case studies incorporate footpaths used for recreational and transport purposes in temperate and sub-humid climates from both recent and historical perspectives. A reduction of the large pores was observed down to 3 cm below current and historical surfaces compared to control areas without footpaths. The lower porosity subsequently hinders the supply of oxygen and/or water into the sub-surface and encourages water stagnation on the compacted footpath surface. These processes result in higher amounts of pedogenic Fe oxides and, at times, macro-organic residues under footpaths, and hinder soil formation. As an additional result of compaction, surface runoff is promoted. The latter may either trigger the initiation of gullies directly downslope from footpaths or lead to incision of the footpaths themselves. Incised footpaths are more likely to occur when the footpath is oriented parallel to the stream network. Once an incised footpath is formed, it may reduce gully erosion susceptibility downslope, as the incised footpath acts as a channel that decreases a footpath's 'overbank' flow.
With a better understanding of footpaths as landscape units we can (1) pose archaeological questions related to human-environment interaction, (2) assess carbon storage potential under footpaths, and (3) use incised footpaths as possible measures against gully erosion.
Real-Time control of sheet stability during forming
During the stamping of complex three-dimensional sheet metal parts, the in-plane compressive stresses created often lead to failure by buckling. These are typically suppressed by binding the material at the periphery to provide a tensile bias. In practice, these biases are difficult to determine, and must be addressed with a combination of a priori analysis and die-making skill. Even then, in-process variations …

Introduction

Three-dimensional sheet forming is a highly productive process capable of forming complex shapes at high rates. However, this productivity comes at the expense of lengthy and costly tooling development. A primary element of this tooling is the "blankholder" which provides the in-plane tensile bias necessary to avoid buckling failure of the sheet caused by in-plane compressive strains. Blankholder design is complicated not only by the difficult contours involved, but also by the critical nature of sheet stability in such bi-axial strain conditions. As a result, sheet-forming production is often disrupted by tensile or compressive instabilities (tearing and wrinkling failures) caused by incorrect blankholder forces. Despite careful design and optimization, variations in lubrication, material properties, and blankholder wear can drive a process into an unstable region of operation. This paper treats the problem of sheet stability as a real-time process control problem. The objective is to keep the margins of process stability within acceptable limits even when the abovementioned disturbances occur. The approach taken here is largely empirical, and is based on the concept of trajectory or signature following. In this method, two accessible measures of process performance (punch force and flange draw-in) are monitored during "optimal" forming conditions. In subsequent forming cycles, the process is forced to follow these trajectories, and the blankholder force is modulated to accomplish real-time tracking.
The key issues become robustness of the scheme to the expected variations and the ability to apply the method to general processes. In earlier reports on this work …

Background

Research into the stability of sheet metal forming has concentrated on topics such as material properties, circular grid strain analysis, forming limit diagrams, finite element analysis, strain path corrections, and shape analysis. Below is a brief review of studies involving tearing, buckling, and forming limits, concentrating on those of direct relevance to the conical cup geometry. The frequently used forming limit diagram, developed by Goodwin (1968) and Keeler (1969), is a good indicator of the tearing strains in plane-strain loading. For a given material, these diagrams are developed by using a hemispherical punch stretch test and plotting the circumferential and radial strains.
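The trajectory-following idea described above can be sketched as a simple per-step correction loop: at each step of the forming cycle, compare the measured punch force with the stored "optimal" reference trajectory and nudge the blankholder force (BHF) to close the gap. The proportional control law and the function names are illustrative assumptions, not the controller used in the study.

```python
def track_blankholder(reference, measure_punch_force, set_bhf, bhf0, gain=0.8):
    """Hedged sketch of trajectory (signature) following for sheet forming.

    reference:           stored punch-force trajectory from an optimal cycle.
    measure_punch_force: callable (step, bhf) -> measured punch force.
    set_bhf:             callable applying the new blankholder force.
    """
    bhf = bhf0
    history = []
    for step, ref in enumerate(reference):
        measured = measure_punch_force(step, bhf)
        error = ref - measured
        # Punch force rises with BHF (more restraint -> more stretching),
        # so raise BHF when the measured force falls below the reference.
        bhf = bhf + gain * error
        set_bhf(bhf)
        history.append((measured, bhf))
    return history
```

Against a plant whose punch force grows with BHF, this loop drives the measured force toward the reference trajectory; a real controller would also bound the BHF command and handle the draw-in signal.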
Fairness Beyond Disparate Treatment & Disparate Impact: Learning Classification without Disparate Mistreatment
Automated data-driven decision making systems are increasingly being used to
assist, or even replace humans in many settings. These systems function by
learning from historical decisions, often taken by humans. In order to maximize
the utility of these systems (or, classifiers), their training involves
minimizing the errors (or, misclassifications) over the given historical data.
However, it is quite possible that the optimally trained classifier makes
decisions for people belonging to different social groups with different
misclassification rates (e.g., misclassification rates for females are higher
than for males), thereby placing these groups at an unfair disadvantage. To
account for and avoid such unfairness, in this paper, we introduce a new notion
of unfairness, disparate mistreatment, which is defined in terms of
misclassification rates. We then propose intuitive measures of disparate
mistreatment for decision boundary-based classifiers, which can be easily
incorporated into their formulation as convex-concave constraints. Experiments
on synthetic as well as real world datasets show that our methodology is
effective at avoiding disparate mistreatment, often at a small cost in terms of
accuracy.

Comment: To appear in Proceedings of the 26th International World Wide Web Conference (WWW), 2017. Code available at: https://github.com/mbilalzafar/fair-classificatio
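The notion of disparate mistreatment above lends itself to a direct computation: compare group-conditional false-positive and false-negative rates. A minimal sketch, assuming binary labels and a binary sensitive attribute (helper names are invented, not from the paper's code):

```python
def disparate_mistreatment(y_true, y_pred, group):
    """Return (FPR gap, FNR gap) between the two groups (0/1)."""
    def rates(g):
        idx = [i for i, z in enumerate(group) if z == g]
        fp = sum(1 for i in idx if y_true[i] == 0 and y_pred[i] == 1)
        fn = sum(1 for i in idx if y_true[i] == 1 and y_pred[i] == 0)
        neg = sum(1 for i in idx if y_true[i] == 0)
        pos = sum(1 for i in idx if y_true[i] == 1)
        return fp / max(neg, 1), fn / max(pos, 1)
    fpr0, fnr0 = rates(0)
    fpr1, fnr1 = rates(1)
    return abs(fpr0 - fpr1), abs(fnr0 - fnr1)
```

A classifier free of disparate mistreatment would drive both gaps toward zero; the paper's contribution is expressing (proxies for) these constraints in convex-concave form so they can be enforced during training.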
Grid services for the MAGIC experiment
Exploring signals from outer space has become a rapidly expanding observational science. On the basis of its advanced technology, the MAGIC telescope is the natural building block for the first large-scale ground-based high-energy gamma-ray observatory. The low energy threshold for gamma-rays, together with different background sources, leads to a considerable amount of data. The analysis will be done at different institutes spread over Europe. MAGIC therefore offers the opportunity to use Grid technology to set up a distributed computational and data-intensive analysis system with currently available technology. Benefits of Grid computing for the MAGIC telescope are presented.

Comment: 5 pages, 1 figure, to be published in the Proceedings of the 6th International Symposium ''Frontiers of Fundamental and Computational Physics'' (FFP6), Udine (Italy), Sep. 26-29, 200