
    New Observation on Division Property

    The Feistel structure is among the most popular choices for designing ciphers. Recently, 3-round/5-round integral distinguishers for Feistel structures with non-bijective/bijective round functions were presented. At EUROCRYPT 2015, Todo proposed the Division Property to effectively construct integral distinguishers for both Feistel and SPN structures. In this paper, we first prove that if X, a subset of F_2^n, has the division property D_k^n, then X contains at least 2^k elements; consequently, if a multiset X has the division property D_n^n, it is in some sense equivalent to either F_2^n or the empty set. Second, let d be the algebraic degree of the round function of a Feistel structure. If d ≤ n-1, the corresponding integral distinguishers are improved as follows: there exists a 3-round integral distinguisher with at most 2^n chosen plaintexts and a 4-round integral distinguisher with at most 2^{2n-2} chosen plaintexts. These results offer new insights into both the division property and Feistel structures.
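
    As a quick, hedged illustration of the notion in this abstract: the Python sketch below checks Todo's division property D_k^n for a multiset of n-bit values (X has D_k^n when the XOR over X of the bit product pi_u(x) vanishes for every mask u of Hamming weight less than k). The function names are ours, and the exhaustive loop is only practical for small n.

```python
# Minimal sketch (not from the paper): testing the division property D_k^n
# of a multiset X of n-bit values.  X has D_k^n if, for every mask u with
# Hamming weight wt(u) < k, the XOR over x in X of the bit product pi_u(x)
# equals 0.  Function names are ours.

def pi_u(x: int, u: int) -> int:
    """Bit product: 1 iff every bit selected by the mask u is set in x."""
    return 1 if (x & u) == u else 0

def has_division_property(X, n: int, k: int) -> bool:
    """Check D_k^n for a multiset X of n-bit integers (exhaustive, small n only)."""
    for u in range(1 << n):
        if bin(u).count("1") < k:
            parity = 0
            for x in X:
                parity ^= pi_u(x, u)
            if parity != 0:
                return False
    return True

# Sanity check: the full space F_2^n satisfies D_n^n, matching the paper's
# observation that D_n^n essentially forces X to be F_2^n or the empty set.
n = 4
assert has_division_property(range(1 << n), n, k=n)
```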

    New Complexity Results and Algorithms for the Minimum Tollbooth Problem

    The inefficiency of the Wardrop equilibrium of nonatomic routing games can be eliminated by placing tolls on the edges of a network so that the socially optimal flow is induced as an equilibrium flow. A solution that tolls the minimum number of edges may be preferable to others because it is easier to implement in real networks. In this paper we consider the minimum tollbooth (MINTB) problem, which seeks tolls of minimum support that induce the social optimum. We prove, for single-commodity networks with linear latencies, that the problem is NP-hard to approximate within a factor of 1.1377, via a reduction from the minimum vertex cover problem. Insights from network design motivate us to formulate a new variation of the problem where, in addition to placing tolls, one may remove edges that are unused by the social optimum. We prove that this new problem remains NP-hard even for single-commodity networks with linear latencies, using a reduction from the partition problem. On the positive side, we give the first exact polynomial-time algorithm for MINTB on an important class of graphs: series-parallel graphs. Our algorithm solves MINTB by first tabulating the candidate solutions for subgraphs of the series-parallel network and then combining them optimally.
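
    A hedged restatement of the MINTB objective in our own notation (not the paper's), for readers who want the optimization problem spelled out:

```latex
% MINTB, informally: given edge latencies \ell_e and the socially optimal
% flow f^*, find non-negative tolls \tau of minimum support under which f^*
% is a Wardrop equilibrium of the tolled game.  Notation is ours.
\begin{align*}
\min_{\tau \ge 0} \;\; & \bigl|\{\, e : \tau_e > 0 \,\}\bigr| \\
\text{s.t.} \;\;       & f^* \text{ is a Wardrop equilibrium for the edge costs } \ell_e(f^*_e) + \tau_e .
\end{align*}
```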

    Approximation Algorithms for Polynomial-Expansion and Low-Density Graphs

    We study the family of intersection graphs of low-density objects in low-dimensional Euclidean space. This family is quite general and includes planar graphs. We prove that such graphs have small separators. Next, we present efficient (1+ε)-approximation algorithms for these graphs for Independent Set, Set Cover, and Dominating Set, among other problems. We also prove corresponding hardness of approximation for some of these optimization problems, providing a characterization of their intractability in terms of density.
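
    The abstract does not spell out the algorithms, so the sketch below shows only the classical separator-based divide-and-conquer scheme such results typically build on, not the paper's construction; the helper find_small_separator is hypothetical and stands in for the small-separator theorem proved for low-density graphs.

```python
# Generic separator-based scheme for Independent Set (a sketch, not the
# paper's algorithm).  It assumes a hypothetical helper
# find_small_separator(G) returning a vertex set whose removal splits G into
# pieces of at most ~2/3 of the vertices; when separators are small (as for
# low-density graphs), the vertices discarded with the separators cost only
# a small fraction of the optimum.

import itertools
import networkx as nx

def brute_force_mis(G: nx.Graph) -> set:
    """Exact maximum independent set by enumeration; used on small pieces."""
    nodes = list(G.nodes)
    for r in range(len(nodes), 0, -1):
        for cand in itertools.combinations(nodes, r):
            if not any(G.has_edge(u, v) for u, v in itertools.combinations(cand, 2)):
                return set(cand)
    return set()

def separator_mis(G: nx.Graph, base_size: int, find_small_separator) -> set:
    """Approximate maximum independent set via recursive separators."""
    if G.number_of_nodes() <= base_size:
        return brute_force_mis(G)
    S = find_small_separator(G)      # separator vertices are simply dropped
    H = G.copy()
    H.remove_nodes_from(S)
    result = set()
    for comp in nx.connected_components(H):
        result |= separator_mis(G.subgraph(comp).copy(), base_size, find_small_separator)
    return result
```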

    High-quality patents for emerging science and technology through external actors: community scientific experts and knowledge societies

    This article explores one type of administrative mechanism to achieve high-quality patents: Article 115 of the European Patent Convention, which permits third parties to provide input to the prior art search and to communicate relevant information to the examiner in charge. Our empirical research analyzes the field of human genetic inventions. The findings show that third parties usually participate only after patents have been granted: between 1999 and 2009, only a limited number of human gene patent cases made use of third-party, pre-grant interventions. There is thus an imbalance between third-party participation in the pre- and post-grant phases of patent prosecution, and we urge greater participation of knowledge communities in the search and examination process. Europe should create a funnel for participation through advisory bodies and learned societies, which would allow judicious consideration of the search and examination, with a resultant improvement in patent quality.

    Dividing bads under additive utilities

    We compare the Egalitarian rule (aka Egalitarian Equivalent) and the Competitive rule (aka Competitive Equilibrium with Equal Incomes) for dividing bads (chores). Both rules are welfarist: the competitive disutility profile(s) are the critical points of the Nash product of disutilities on the set of efficient feasible profiles. The Competitive rule is Envy-Free, Maskin Monotonic, and has better incentive properties than the Egalitarian rule. But, unlike the Egalitarian rule, it can be wildly multivalued, admits no selection continuous in the utility and endowment parameters, and is harder to compute. Thus in the division of bads, unlike that of goods, neither rule normatively dominates the other.
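
    A hedged formalization of the welfarist characterization quoted above, in our notation rather than the paper's:

```latex
% z_{ik} is agent i's share of bad k, and disutilities are additive.
% The competitive disutility profiles are exactly the critical points of the
% Nash product restricted to the efficient feasible profiles.
\[
  D_i(z_i) \;=\; \sum_k d_{ik}\, z_{ik},
  \qquad
  (D_1,\dots,D_n)\ \text{competitive}
  \;\Longleftrightarrow\;
  (D_1,\dots,D_n) \in \operatorname{crit}\!\Bigl(\textstyle\prod_i D_i \;\Big|\; \text{efficient feasible profiles}\Bigr).
\]
```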

    Stable states of perturbed Markov chains

    Given an infinitesimal perturbation of a discrete-time finite Markov chain, we seek the states that are stable despite the perturbation, i.e. the states whose weights in the stationary distributions can be bounded away from 0 as the noise fades away. Chemists, economists, and computer scientists have been studying irreducible perturbations built with exponential maps. Under these assumptions, Young proved the existence of and computed the stable states in cubic time. We fully drop these assumptions, generalize Young's technique, and show that stability is decidable as long as f ∈ O(g) is. Furthermore, if the perturbation maps (and their multiplications) satisfy f ∈ O(g) or g ∈ O(f), we prove the existence of and compute the stable states and the metastable dynamics at all time scales where some states vanish. Conversely, if the big-O assumption does not hold, we build a perturbation with these maps and no stable state. Our algorithm also runs in cubic time despite the general assumptions and the additional work. Proving the correctness of the algorithm relies on new or rephrased results in Markov chain theory, and on algebraic abstractions thereof.
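
    A minimal numeric illustration of the stability notion (this is not the paper's cubic-time algorithm, and the toy chain below is ours): perturb a 3-state chain with exponential maps and watch which stationary weights stay bounded away from 0 as the noise fades.

```python
# Toy illustration: stationary distributions of a perturbed 3-state chain for
# shrinking eps.  A state is "stable" if its stationary weight stays bounded
# away from 0 as eps -> 0.  The chain and its exponents are invented for the
# demo; they are not taken from the paper.

import numpy as np

def stationary(P: np.ndarray) -> np.ndarray:
    """Stationary distribution of a row-stochastic matrix (left eigenvector for eigenvalue 1)."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    v = np.abs(v)
    return v / v.sum()

def perturbed_chain(eps: float) -> np.ndarray:
    """Transitions out of states 0 and 1 cost eps; escaping state 2 costs eps**2."""
    return np.array([
        [1 - eps,  eps,          0.0],
        [eps,      1 - 2 * eps,  eps],
        [eps ** 2, 0.0,          1 - eps ** 2],
    ])

for eps in (1e-1, 1e-2, 1e-3):
    pi = stationary(perturbed_chain(eps))
    print(f"eps={eps:g}  pi={np.round(pi, 4)}")
# State 2's weight tends to 1 (its escape rate eps**2 is the slowest), so it
# is the unique stable state of this toy perturbation; states 0 and 1 vanish.
```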