
    Interactive Coding with Constant Round and Communication Blowup


    Locally Decodable Codes with Randomized Encoding

We initiate a study of locally decodable codes with randomized encoding. Standard locally decodable codes are error correcting codes with a deterministic encoding function and a randomized decoding function, such that any desired message bit can be recovered with good probability by querying only a small number of positions in the corrupted codeword. This allows one to recover any message bit very efficiently, in sub-linear or even logarithmic time. Besides this straightforward application, locally decodable codes have found many other applications, such as private information retrieval, secure multiparty computation, and average-case complexity. However, despite extensive research, the tradeoff between the rate of the code and the number of queries remains somewhat disappointing. For example, the best known constructions still need super-polynomially long codewords even with a logarithmic number of queries, and need a polynomial number of queries to achieve a constant rate. In this paper, we show that by using a randomized encoding, we can achieve a significantly better rate-query tradeoff in several models. In addition, our codes work for both standard Hamming errors and the more general and harder edit errors.
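As a concrete illustration of the local-decoding idea (not the randomized-encoding construction from the paper), the sketch below implements the textbook 2-query Hadamard code: any message bit x_i equals C(a) XOR C(a XOR e_i) whenever both queried positions are uncorrupted. The function names and parameters are hypothetical; the 2^k codeword length also illustrates the poor rate the abstract alludes to.

```python
import random

def hadamard_encode(x_bits):
    """Codeword position a holds the inner product <a, x> over GF(2)."""
    k = len(x_bits)
    return [sum(((a >> j) & 1) * x_bits[j] for j in range(k)) % 2
            for a in range(2 ** k)]

def decode_bit(codeword, k, i, trials=31, rng=random):
    """Recover x_i with 2 queries per trial: C(a) xor C(a xor e_i) = x_i
    when both queried positions are uncorrupted; a majority vote over
    random trials boosts the success probability under a small
    fraction of corruptions."""
    votes = 0
    for _ in range(trials):
        a = rng.randrange(2 ** k)
        votes += codeword[a] ^ codeword[a ^ (1 << i)]
    return int(votes > trials // 2)

if __name__ == "__main__":
    x = [1, 0, 1, 1]
    cw = hadamard_encode(x)
    cw[3] ^= 1                      # corrupt one codeword position
    print([decode_bit(cw, len(x), i) for i in range(len(x))])  # [1, 0, 1, 1]
```

A single trial already succeeds with good probability when fewer than a quarter of the positions are corrupted; the majority vote merely amplifies this.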

    Protecting Single-Hop Radio Networks from Message Drops

Single-hop radio networks (SHRN) are a well studied abstraction of communication over a wireless channel. In this model, in every round, each of the n participating parties may decide to broadcast a message to all the others, potentially causing collisions. We consider the SHRN model in the presence of stochastic message drops (i.e., erasures), where in every round, the message received by each party is erased (replaced by '?') with some small constant probability, independently. Our main result is a constant rate coding scheme, allowing one to run protocols designed to work over the (noiseless) SHRN model over the SHRN model with erasures. Our scheme converts any protocol π of length at most exponential in n over the SHRN model to a protocol π′ that is resilient to a constant fraction of erasures and has length linear in the length of π. We mention that for the special case where the protocol π is non-adaptive, i.e., the order of communication is fixed in advance, such a scheme was known. Nevertheless, adaptivity is widely used and is known to hugely boost the power of wireless channels, which makes handling the general case of adaptive protocols π both important and more challenging. Indeed, to the best of our knowledge, our result is the first constant rate scheme that converts adaptive protocols to noise resilient ones in any multi-party model.
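A minimal sketch of the noise model described above, with hypothetical names (this is the channel being defended against, not the paper's coding scheme): each party's received copy of a broadcast is independently erased with small constant probability.

```python
import random

def receive_round(broadcast, n_parties, eps, rng=random):
    """What each of the n parties hears after one noisy SHRN round:
    the broadcast symbol, or None (standing in for the erasure symbol
    '?') with probability eps, independently per party."""
    return [broadcast if rng.random() >= eps else None
            for _ in range(n_parties)]
```

Over a protocol of many rounds, each party thus misses roughly an eps fraction of the messages it should have heard, which is what a constant rate coding scheme must compensate for.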

    Noisy Radio Network Lower Bounds via Noiseless Beeping Lower Bounds


    Joint, Incremental Disfluency Detection and Utterance Segmentation from Speech

Hough J, Schlangen D. Joint, Incremental Disfluency Detection and Utterance Segmentation from Speech. In: Proceedings of the Annual Meeting of the European Chapter of the Association for Computational Linguistics (EACL). 2017: 326–336.

    Quantifying mutual-understanding in dialogue

There are two components of communication that provide a natural index of mutual-understanding in dialogue. The first is Repair: the ways in which people detect and deal with problems with understanding. The second is Ellipsis/Anaphora: the use of expressions that depend directly on the accessibility of the local context for their interpretation. This thesis explores the use of these two phenomena in systematic comparative analyses of human-human dialogue under different task and media conditions. In order to do this it is necessary to a) develop reliable, valid protocols for coding the different Repair and Ellipsis/Anaphora phenomena, b) establish their baseline patterns of distribution in conversation, and c) model their basic statistical inter-relationships and their predictive value. Two new protocols for coding Repair and Ellipsis/Anaphora phenomena are presented and applied to two dialogue corpora, one of ordinary 'everyday' conversations and one of task-oriented dialogues. These data illustrate that there are significant differences in how understanding is created and negotiated across conditions. Repair is shown to be a ubiquitous feature of all dialogue. The goals of the speaker directly affect the type of Repair used: giving instructions leads to a higher rate of self-editing; following instructions increases corrections and requests for clarification. Medium and familiarity also influence Repair; when eye contact is not possible there are a greater number of repeats and clarifications. Anaphora are used less frequently in task-oriented dialogue, whereas types of Ellipsis increase. The use of elliptical phrases that check, confirm or acknowledge is higher when there is no eye contact. Familiar pairs use more elliptical expressions, especially endophora and elliptical questions. Following instructions leads to greater use of elliptical (non-sentential) phrases. Medium, task and social norms all have a measurable effect on the components of dialogue that underpin mutual-understanding.

    Noisy Interactive Quantum Communication

We consider the problem of implementing two-party interactive quantum communication over noisy channels, a necessary endeavor if we wish to fully reap quantum advantages for communication. For an arbitrary protocol with n messages, designed for noiseless qudit channels (where d is arbitrary), our main result is a simulation method that fails with probability less than 2^(−Θ(nΔ)) and uses the qudit channel n(1 + Θ(√Δ)) times, of which an Δ fraction can be corrupted adversarially. The simulation is thus capacity achieving to leading order, and we conjecture that it is optimal up to a constant factor in the √Δ term. Furthermore, the simulation works in a model that does not require pre-shared resources such as randomness or entanglement between the communicating parties. Surprisingly, this outperforms the best known overhead of 1 + O(√(Δ log log 1/Δ)) in the corresponding classical model, which is also conjectured to be optimal [Haeupler, FOCS’14]. Our work also improves over the best previously known quantum result, where the overhead is a non-explicit large constant [Brassard et al., FOCS’14], for small Δ.

    Pseudorandom Constructions: Computing in Parallel and Applications to Edit Distance Codes

The thesis focuses on two problems about pseudorandom constructions. The first problem is how to compute pseudorandom constructions with constant depth circuits. Pseudorandom constructions are deterministic functions used to substitute for random constructions in various computational tasks. Constant depth circuits here refer to the computation model of circuits built from AND, OR, and negation gates, with constant depth and unbounded fan-in, taking function inputs on input wires and producing function outputs on output wires. They can be simulated by fast parallel algorithms. We study such constructions mainly for randomness extractors, secret sharing schemes, and their applications. Randomness extractors are functions which transform biased random bits into uniform ones. They can be used to recycle random bits in computations when some entropy remains. Secret sharing schemes efficiently share secrets among multiple parties such that the collusion of a bounded number of parties cannot recover any information about the secret, while some larger number of parties can recover it. Our work constructs these objects with near optimal parameters and explores their applications.

The second problem is about applying pseudorandom constructions to build error correcting codes (ECCs) for edit distance. ECCs map messages to codewords in a metric space such that one can recover the codeword even in the presence of a bounded number of errors, which can move the codeword away by some bounded distance. They are widely used in both the theoretical and practical parts of computer science. Classic errors are Hamming errors, i.e., substitutions and erasures of symbols, which have been studied extensively in prior work. We consider a more general kind of error, edit errors, consisting of insertions and deletions that may change the positions of symbols. Our work gives explicit constructions of binary ECCs for edit errors with near optimal redundancy length. The constructions utilize document exchange protocols, which let two parties synchronize strings within bounded edit distance by having one party send a short sketch of its string to the other. We apply various pseudorandom constructions to obtain deterministic document exchange protocols from randomized ones, and then construct ECCs using them. We also extend these constructions to handle block insertions/deletions and transpositions. All these constructions have near optimal parameters.
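To ground the secret-sharing notion used above, here is the textbook Shamir scheme over a prime field. This is an illustration of the definition only, not the thesis's constant-depth constructions (polynomial evaluation and interpolation are not constant-depth computations); all names and the choice of modulus are illustrative.

```python
import random

P = 2**61 - 1  # a Mersenne prime, used as the field modulus

def share(secret, t, n, rng=random):
    """Split `secret` into n shares so that any t+1 shares reconstruct it
    and any t shares reveal nothing: sample a random degree-t polynomial
    with constant term `secret` and hand out its values at x = 1..n."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t)]
    def poly(x):
        y = 0
        for c in reversed(coeffs):  # Horner evaluation mod P
            y = (y * x + c) % P
        return y
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * (-xj)) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

if __name__ == "__main__":
    shares = share(secret=42, t=2, n=5)
    assert reconstruct(shares[:3]) == 42   # any t+1 = 3 shares suffice
```

Any t shares leave the secret information-theoretically hidden, since for every candidate secret there is exactly one consistent degree-t polynomial.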

Searching Spontaneous Conversational Speech: Proceedings of ACM SIGIR Workshop (SSCS2008)
