
    Institutions, Capital Constraints and Entrepreneurial Firm Dynamics: Evidence from Europe

    We explore the impact of the institutional environment on the nature of entrepreneurial activity across Europe. Political, legal, and regulatory variables that have been shown to impact capital market development influence entrepreneurial activity in the emerging markets of Europe, but not in the more mature economies of Europe. Greater fairness and greater protection of property rights increase entry rates, reduce exit rates, and lower average firm size. Additionally, these same factors are also associated with increased industrial vintage (a size-weighted measure of age) and reduced skewness in firm-size distributions. The results suggest that capital constraints induced by these institutional factors impact both entry and the ability of firms to transition and grow, particularly in lesser-developed markets.

    Book Review

    A Scholarly Review of “Error Control for Network-On-Chip Links” (Authors: Bo Fu and Paul Ampadu, 2012). Fu, B., and Ampadu, P. 2012. Error Control for Network-On-Chip Links. Springer Science+Business Media, LLC, New York, NY, USA. Available: <http://dx.doi.org/10.1007/978-1-4419-9313-7>

    Phased burst error-correcting array codes

    Various aspects of single-phased burst-error-correcting array codes are explored. These codes are composed of two-dimensional arrays with row and column parities and a diagonally cyclic readout order; they are capable of correcting a single burst error along one diagonal. Optimal codeword sizes are found to have dimensions n1×n2 such that n2 is the smallest prime number larger than n1. These codes are capable of reaching the Singleton bound. A new type of error, approximate errors, is defined; in q-ary applications, these errors corrupt data only slightly, so the corrupted value remains close to the true data level. Phased burst array codes can be tailored to correct these errors at even higher rates than before.
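    As a rough, hedged illustration of the two ingredients named above (row and column parities plus a diagonal readout), here is a minimal Python sketch on a toy binary array; the function names, the even-parity convention, and the wrap-around readout order are illustrative and are not taken from the paper, whose exact construction and dimension constraints may differ.

```python
import numpy as np

def encode_array(data: np.ndarray) -> np.ndarray:
    """Append a parity column and a parity row so that every row and
    every column of the resulting array has even parity."""
    with_col = np.hstack([data, data.sum(axis=1, keepdims=True) % 2])
    return np.vstack([with_col, with_col.sum(axis=0, keepdims=True) % 2])

def diagonal_readout(code: np.ndarray) -> list:
    """Read the array out along wrapped diagonals, so that a burst on
    the channel falls along a single diagonal of the array."""
    n1, n2 = code.shape
    return [[int(code[i, (i + d) % n2]) for i in range(n1)] for d in range(n2)]

data = np.array([[1, 0, 1],
                 [0, 1, 1]])
code = encode_array(data)
print(code)                    # 3x4 array with even row and column parity
print(diagonal_readout(code))  # transmission order along wrapped diagonals
```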

    Coding for interactive communication correcting insertions and deletions

    We consider the question of interactive communication, in which two remote parties perform a computation while their communication channel is (adversarially) noisy. We extend here the discussion into a more general and stronger class of noise, namely, we allow the channel to perform insertions and deletions of symbols. These types of errors may bring the parties "out of sync", so that there is no consensus regarding the current round of the protocol. In this more general noise model, we obtain the first interactive coding scheme that has a constant rate and resists noise rates of up to $1/18 - \varepsilon$. To this end we develop a novel primitive we name edit distance tree code. The edit distance tree code is designed to replace the Hamming distance constraints in Schulman's tree codes (STOC 93) with a stronger edit distance requirement. However, the straightforward generalization of tree codes to edit distance does not seem to yield a primitive that suffices for communication in the presence of synchronization problems. Giving the "right" definition of edit distance tree codes is a main conceptual contribution of this work.
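    The construction itself is not reproduced here, but since the key move is replacing Hamming distance with edit distance, the following small Python sketch of the underlying metric (Levenshtein distance: minimum number of insertions, deletions, and substitutions) may help as background; it is illustrative only and is not the paper's edit distance tree code.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: minimum number of insertions, deletions,
    and substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))            # distances from "" to prefixes of b
    for i, ca in enumerate(a, start=1):
        curr = [i]                            # distance from a[:i] to ""
        for j, cb in enumerate(b, start=1):
            curr.append(min(
                prev[j] + 1,                  # delete ca
                curr[j - 1] + 1,              # insert cb
                prev[j - 1] + (ca != cb),     # substitute (or match)
            ))
        prev = curr
    return prev[-1]

# A single deletion throws two transmissions "out of sync": the strings
# disagree almost everywhere when compared position by position, yet
# their edit distance is only 1.
print(edit_distance("0101010", "101010"))     # 1
```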

    Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

    In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and use a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), or number of bit errors, between the binary vectors obtained from the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the tradeoff between the false rejection rate (FRR) and the false acceptance rate (FAR). A method to improve the FRR consists of using multiple biometric samples in either the enrollment or verification phase. The noise is suppressed, hence reducing the number of bit errors and decreasing the HD. In practice, the number of samples is empirically chosen without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples used in the enrollment and the verification phase. The detection error tradeoff (DET) curve, which combines the false acceptance and false rejection rates, is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
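    As a hedged illustration of the decision rule described above (accept when the Hamming distance between enrollment and verification bit strings stays below a threshold, and reduce bit errors by combining several samples), here is a minimal Python sketch; the threshold, bit-error rate, and majority-vote combination are illustrative choices, not the paper's analytical framework.

```python
import numpy as np

def hamming_distance(enrolled: np.ndarray, probe: np.ndarray) -> int:
    """Number of differing bits between two binary biometric vectors."""
    return int(np.count_nonzero(enrolled != probe))

def verify(enrolled: np.ndarray, probe: np.ndarray, threshold: int) -> bool:
    """Accept the probe as genuine if the HD does not exceed the
    decision threshold (e.g. the ECC's error-correcting capacity)."""
    return hamming_distance(enrolled, probe) <= threshold

def majority_bits(samples: np.ndarray) -> np.ndarray:
    """Bitwise majority vote over several samples; combining samples
    suppresses noise and so reduces the number of bit errors."""
    return (samples.mean(axis=0) >= 0.5).astype(int)

# Toy example: a genuine probe with roughly 10% bit errors.
rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, size=256)
probe = enrolled ^ (rng.random(256) < 0.1).astype(int)
print(verify(enrolled, probe, threshold=40))   # accepted for most draws
```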

    The employment effects of innovation

    The issue of technological unemployment receives perennial popular attention. Although previous empirical investigations have focused on the relationship between innovation and employment, the originality of our approach lies in our choice of method. We focus on four 2-digit manufacturing industries that are known for their high patenting activity. We then use Principal Components Analysis to generate a firm- and year-specific "innovativeness" index by extracting the common variance in a firm's patenting and R&D expenditure histories. To begin with, we explore the heterogeneity of firms using semi-parametric quantile regression. Whilst some firms may reduce employment levels after innovating, others increase employment. We then move on to a weighted least squares (WLS) analysis, which explicitly takes into account the different job-creating potential of firms of different sizes. As a result, we focus on the effect of innovation on the total number of jobs, whereas previous studies have focused on the effect of innovation on firm behavior. Indeed, previous studies have typically taken the firm as the unit of analysis, implicitly weighting each firm equally according to the principle of "one firm equals one observation". Our results suggest that firm-level innovative activity leads to employment creation that may have been underestimated in previous studies. Keywords: technological unemployment, innovation, firm growth, weighted least squares, aggregation, quantile regression.
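    The two methodological steps named above (a PCA-based innovativeness index and a size-weighted regression) can be sketched in Python as follows; the panel is synthetic, and the variable names, data-generating process, and growth regression are invented for illustration rather than taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA

# Hypothetical firm-level data: patent counts, R&D spending, employment,
# and subsequent employment growth.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "patents":    rng.poisson(3, n).astype(float),
    "rd_spend":   rng.gamma(2.0, 1.0, n),
    "employment": rng.integers(10, 5000, n),
})
df["emp_growth"] = 0.02 * df["patents"] + rng.normal(0.0, 1.0, n)

# "Innovativeness" index: first principal component of the standardized
# patenting and R&D variables (their common variance).
X = df[["patents", "rd_spend"]]
X = (X - X.mean()) / X.std()
df["innovativeness"] = PCA(n_components=1).fit_transform(X).ravel()

# Weighted least squares with employment as the weight, so large firms
# count for more and the estimate speaks to the total number of jobs.
wls = sm.WLS(df["emp_growth"],
             sm.add_constant(df["innovativeness"]),
             weights=df["employment"]).fit()
print(wls.params)
```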

    Demystifying the Information Reconciliation Protocol Cascade

    Cascade is an information reconciliation protocol proposed in the context of secret key agreement in quantum cryptography. This protocol allows removing discrepancies between two partially correlated sequences that belong to distant parties connected through a public noiseless channel. It is highly interactive, thus requiring a large number of channel communications between the parties to proceed, and, although its efficiency is not optimal, it has become the de facto standard for practical implementations of information reconciliation in quantum key distribution. The aim of this work is to analyze the performance of Cascade, to discuss its strengths, weaknesses, and optimization possibilities, and to compare it with some of the modified versions that have been proposed in the literature. When looking at all design trade-offs, a new view emerges that allows us to put forward a number of guidelines and propose near-optimal parameters for the practical implementation of Cascade, improving performance significantly in comparison with all previous proposals. Comment: 30 pages, 13 figures, 3 tables.
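    The abstract does not spell out how Cascade proceeds, so the following Python sketch shows only the textbook core of a single pass (block parity comparison over the public channel plus binary search to locate one error), under the assumption that Bob corrects his string toward Alice's; the backtracking between passes that gives the protocol its name, and the accounting of disclosed parity bits, are deliberately omitted, and the block sizes are illustrative.

```python
import random

def parity(bits, idxs):
    """Parity of the bits at the given positions."""
    return sum(bits[i] for i in idxs) % 2

def binary_search_correct(alice, bob, idxs):
    """Locate and flip one erroneous bit in a block whose parity differs,
    using about log2(len(block)) parity exchanges."""
    while len(idxs) > 1:
        half = idxs[:len(idxs) // 2]
        if parity(alice, half) != parity(bob, half):
            idxs = half
        else:
            idxs = idxs[len(idxs) // 2:]
    bob[idxs[0]] ^= 1

def cascade_pass(alice, bob, block_size, rng):
    """One pass: shuffle positions, split into blocks, and correct every
    block whose parity disagrees with Alice's."""
    order = list(range(len(bob)))
    rng.shuffle(order)
    for start in range(0, len(order), block_size):
        block = order[start:start + block_size]
        if parity(alice, block) != parity(bob, block):
            binary_search_correct(alice, bob, block)

# Toy run: Bob's 64-bit string differs from Alice's in three positions.
rng = random.Random(1)
alice = [rng.randint(0, 1) for _ in range(64)]
bob = alice.copy()
for i in rng.sample(range(64), 3):
    bob[i] ^= 1
for size in (8, 16, 32):                      # growing block sizes per pass
    cascade_pass(alice, bob, size, rng)
print("remaining errors:", sum(a != b for a, b in zip(alice, bob)))
```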

    We need to talk - or do we? Geographic distance and the commercialization of technologies from public research

    Using a new dataset with detailed geographic information about licensing activities of the Max Planck Society, Germany's largest non-university public research organization, we analyze how the probability and magnitude of commercial success are affected by geographic distance between licensors and licensees. Our evidence suggests that proximity is not generally associated with superior commercialization outcomes. A negative association between distance and commercialization success is identified only for the specific cases of, first, spin-off licensees located outside Germany and, second, foreign licensees within the subsample of inventions with multiple licensees. Keywords: academic inventions, licensing, spin-off entrepreneurship, geographic distance.

    Advanced flight control system study

    A fly-by-wire flight control system architecture designed for high reliability includes spare sensor and computer elements to permit safe dispatch with failed elements, thereby reducing unscheduled maintenance. A methodology capable of demonstrating that the architecture achieves the predicted performance characteristics consists of a hierarchy of activities ranging from analytical calculations of system reliability and formal methods of software verification to iron-bird testing followed by flight evaluation. Interfacing this architecture to the Lockheed S-3A aircraft for flight test is discussed. This testbed vehicle can be expanded to support flight experiments in advanced aerodynamics, electromechanical actuators, secondary power systems, flight management, new displays, and air traffic control concepts.
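    Purely as a hedged illustration of the kind of analytical reliability calculation mentioned above, the sketch below computes the probability that a redundant set of identical, independent elements (spare sensors or computers) keeps at least the minimum number needed; the 2-out-of-4 configuration and the per-element reliability are invented numbers, not figures from the study.

```python
from math import comb

def k_of_n_reliability(n: int, k: int, p: float) -> float:
    """Probability that at least k of n identical, independent elements
    are still working, given per-element reliability p."""
    return sum(comb(n, m) * p**m * (1 - p)**(n - m) for m in range(k, n + 1))

# Illustrative only: a quadruplex channel that remains usable with any
# 2 of its 4 elements working, each element 99% reliable per mission.
print(k_of_n_reliability(4, 2, 0.99))
```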

    Minimum Energy to Send k Bits Over Multiple-Antenna Fading Channels

    This paper investigates the minimum energy required to transmit $k$ information bits with a given reliability over a multiple-antenna Rayleigh block-fading channel, with and without channel state information (CSI) at the receiver. No feedback is assumed. It is well known that the ratio between the minimum energy per bit and the noise level converges to $-1.59$ dB as $k$ goes to infinity, regardless of whether CSI is available at the receiver or not. This paper shows that lack of CSI at the receiver causes a slowdown in the speed of convergence to $-1.59$ dB as $k \to \infty$ compared to the case of perfect receiver CSI. Specifically, we show that, in the no-CSI case, the gap to $-1.59$ dB is proportional to $((\log k)/k)^{1/3}$, whereas when perfect CSI is available at the receiver, this gap is proportional to $1/\sqrt{k}$. In both cases, the gap to $-1.59$ dB is independent of the number of transmit antennas and of the channel's coherence time. Numerically, we observe that, when the receiver is equipped with a single antenna, to achieve an energy per bit of $-1.5$ dB in the no-CSI case, one needs to transmit at least $7 \times 10^7$ information bits, whereas $6 \times 10^4$ bits suffice for the case of perfect CSI at the receiver.
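    For reference, the $-1.59$ dB figure quoted above is the classical minimum energy per bit over noise spectral density, $E_b/N_0 = \ln 2$, expressed in decibels:

```latex
\frac{E_b}{N_0}\bigg|_{\min} = \ln 2 \approx 0.693
\qquad\Longrightarrow\qquad
10\log_{10}(\ln 2) \approx -1.59\ \text{dB}.
```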