
    What are Firms? Evolution from Birth to Public Companies

    We study how firm characteristics evolve from early business plan to initial public offering to public company for 49 venture capital financed companies. The average time elapsed is almost 6 years. We describe the financial performance, business idea, point(s) of differentiation, non-human capital assets, growth strategy, customers, competitors, alliances, top management, ownership structure, and the board of directors. Our analysis focuses on the nature and stability of those firm attributes. Firm business lines remain remarkably stable from business plan through public company. Within those business lines, non-human capital aspects of the businesses appear more stable than human capital aspects. In the cross-section, firms with more alienable assets have substantially more human capital turnover.
    Keywords: Theory of the firm; Entrepreneurship; Venture capital; Firm life cycle

    Decoupling the Spread of Grasslands from the Evolution of Grazer-type Herbivores in South America

    The evolution of high-crowned cheek teeth (hypsodonty) in herbivorous mammals during the late Cenozoic is classically regarded as an adaptive response to the near-global spread of grass-dominated habitats. Precocious hypsodonty in middle Eocene (~38 million years (Myr) ago) faunas from Patagonia, South America, is therefore thought to signal Earth’s first grasslands, 20 million years earlier than elsewhere. Here, using a high-resolution, 43–18 million-year record of plant silica (phytoliths) from Patagonia, we show that although open-habitat grasses existed in southern South America since the middle Eocene (~40 Myr ago), they were minor floral components in overall forested habitats between 40 and 18 Myr ago. Thus, distinctly different, continent-specific environmental conditions (arid grasslands versus ash-laden forests) triggered convergent cheek-tooth evolution in Cenozoic herbivores. Hypsodonty evolution is an important example where the present is an insufficient key to the past, and contextual information from fossils is vital for understanding processes of adaptation.

    Biodegradation in soil effects on PLA/sisal and PHBV/sisal biocomposites

    The use of bio-based composites, such as lignocellulosic fibre/polymer composites, as alternative materials is continuously increasing in applications such as automobile manufacturing, packaging, construction, and household and agricultural equipment. In order to guarantee the durability of green biocomposites based on polymer matrices such as poly(hydroxybutyrate-co-valerate) (PHBV) and poly(lactide) (PLA), prior knowledge of the influence of ambient agents on their macromolecular properties is necessary. In this sense, standardised soil-burial biodegradation experiments are useful. In this work, two commercial polymers, PHBV and PLA, were reinforced with sisal fibres at 10%, 20% and 30% by weight, with the aid of maleic anhydride as a coupling agent. The influence of the amount of sisal fibre and the effect of the coupling agent on the impact of soil biodegradation on the materials were evaluated in terms of the variation of the physico-chemical properties of the biocomposites.

    Uncertainty Analysis and Order-by-Order Optimization of Chiral Nuclear Interactions

    Chiral effective field theory (χEFT) provides a systematic approach to describing low-energy nuclear forces. Moreover, χEFT is able to provide well-founded estimates of statistical and systematic uncertainties, although this unique advantage has not yet been fully exploited. We fill this gap by performing an optimization and statistical analysis of all the low-energy constants (LECs) up to next-to-next-to-leading order. Our optimization protocol corresponds to a simultaneous fit to scattering and bound-state observables in the pion-nucleon, nucleon-nucleon, and few-nucleon sectors, thereby utilizing the full model capabilities of χEFT. Finally, we study the effect on other observables by demonstrating forward-error-propagation methods that can easily be adopted by future works. We employ mathematical optimization and implement automatic differentiation to attain efficient and machine-precise first- and second-order derivatives of the objective function with respect to the LECs. This is also vital for the regression analysis. We use power-counting arguments to estimate the systematic uncertainty that is inherent to χEFT, and we construct chiral interactions at different orders with quantified uncertainties. Statistical error propagation is compared with Monte Carlo sampling, showing that statistical errors are, in general, small compared to systematic ones. In conclusion, we find that a simultaneous fit to different sets of data is critical to (i) identify the optimal set of LECs, (ii) capture all relevant correlations, (iii) reduce the statistical uncertainty, and (iv) attain order-by-order convergence in χEFT. Furthermore, certain systematic uncertainties in the few-nucleon sector are shown to be substantially magnified in the many-body sector, in particular when varying the cutoff in the chiral potentials. The methodology and results presented in this paper open a new frontier for uncertainty quantification in ab initio nuclear theory.
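
    As a minimal sketch of the first-order (linear) error propagation compared against Monte Carlo sampling described above: this is not the authors' code; the observable function, LEC values, and covariance matrix below are placeholder assumptions, and finite differences stand in for the automatic differentiation used in the paper.

```python
import numpy as np

# Hypothetical observable O(lecs): a stand-in for a chiEFT prediction;
# the true dependence on the LECs comes from the nuclear theory.
def observable(lecs):
    return 1.5 * lecs[0] - 0.8 * lecs[1] ** 2 + 0.3 * lecs[0] * lecs[2]

def gradient(f, x, h=1e-6):
    """Central finite differences. The paper instead uses automatic
    differentiation for machine-precise derivatives."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2 * h)
    return g

# Placeholder optimum and covariance of the LECs, as would come
# out of the simultaneous least-squares fit.
lecs_opt = np.array([0.9, -0.6, 2.1])
cov = np.array([[0.010, 0.002, 0.000],
                [0.002, 0.020, 0.001],
                [0.000, 0.001, 0.005]])

# Linear statistical error propagation: sigma^2 = g^T C g.
g = gradient(observable, lecs_opt)
sigma_lin = np.sqrt(g @ cov @ g)

# Cross-check by Monte Carlo sampling of the LEC distribution.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(lecs_opt, cov, size=50_000)
sigma_mc = np.std([observable(s) for s in samples])

print(f"linear propagation: {sigma_lin:.4f}, Monte Carlo: {sigma_mc:.4f}")
```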

    The mixed problem in L^p for some two-dimensional Lipschitz domains

    We consider the mixed problem for the Laplace operator in a class of Lipschitz graph domains in two dimensions with Lipschitz constant at most 1. The boundary of the domain is decomposed into two disjoint sets D and N. We suppose the Dirichlet data f_D has one derivative in L^p(D) and the Neumann data f_N is in L^p(N). We find conditions on the domain and the sets D and N so that there is a p_0 > 1 such that, for p in the interval (1, p_0), the mixed problem has a unique solution whose gradient lies in L^p.
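
    For orientation, the mixed (Zaremba) problem described in the abstract can be written out as follows. This is a standard formulation reconstructed from the text, with Ω the Lipschitz graph domain, ν the outward unit normal, and (∇u)* the non-tangential maximal function of the gradient:

```latex
\begin{cases}
\Delta u = 0 & \text{in } \Omega,\\
u = f_D & \text{on } D, \quad f_D \in W^{1,p}(D),\\
\partial u / \partial \nu = f_N & \text{on } N, \quad f_N \in L^p(N),
\end{cases}
\qquad (\nabla u)^* \in L^p(\partial\Omega), \quad 1 < p < p_0.
```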

    Genuine Counterfactual Communication with a Nanophotonic Processor

    In standard communication information is carried by particles or waves. Counterintuitively, in counterfactual communication particles and information can travel in opposite directions. The quantum Zeno effect allows Bob to transmit a message to Alice by encoding information in particles he never interacts with. The first suggested protocol not only required thousands of ideal optical components, but also resulted in a so-called "weak trace" of the particles having travelled from Bob to Alice, calling the scalability and counterfactuality of previous proposals and experiments into question. Here we overcome these challenges, implementing a new protocol in a programmable nanophotonic processor, based on reconfigurable silicon-on-insulator waveguides that operate at telecom wavelengths. This, together with our telecom single-photon source and highly-efficient superconducting nanowire single-photon detectors, provides a versatile and stable platform for a high-fidelity implementation of genuinely trace-free counterfactual communication, allowing us to actively tune the number of steps in the Zeno measurement, and achieve a bit error probability below 1%, with neither post-selection nor a weak trace. Our demonstration shows how our programmable nanophotonic processor could be applied to more complex counterfactual tasks and quantum information protocols.Comment: 6 pages, 4 figure
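
    As a rough illustration of why tuning the number of Zeno steps controls the bit error: in the idealized lossless textbook model, each of N interrogation steps rotates the photon state by π/(2N), and when Bob blocks the channel the photon survives on Alice's side with probability cos^(2N)(π/(2N)), which approaches 1 as N grows. This sketch shows that scaling only; it is not a model of the actual processor experiment.

```python
import numpy as np

def zeno_survival(n_steps):
    """Probability the photon stays on Alice's side after n_steps weak
    interrogations when Bob blocks the channel (ideal, lossless model)."""
    theta = np.pi / (2 * n_steps)      # per-step rotation angle
    return np.cos(theta) ** (2 * n_steps)

for n in (5, 10, 25, 50, 100, 250):
    p = zeno_survival(n)
    print(f"N={n:4d}  survival={p:.4f}  bit-error={1 - p:.4f}")
```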

    Hardness as a Spectral Peak Estimator for Gamma-Ray Bursts

    Simple hardness ratios are found to be a good estimator for the spectral peak energy in Gamma-Ray Bursts (GRBs). Specifically, a strong correlation is found between the νF_ν peak in the spectra of BATSE GRBs, E_peak, and the hardness of GRBs, HR, defined as the fluence in channels 3 and 4 divided by the combined fluence in channels 1 and 2 of the BATSE Large Area Detectors. The correlation is independent of burst type, whether long-duration GRB (LGRB) or short-duration (SGRB), and remains almost linear over the wide range of the BATSE energy window (20–2000 keV). Based on Bayes' theorem and Markov chain Monte Carlo techniques, we also present multivariate analyses of the observational data while accounting for data truncation and sample incompleteness. Prediction intervals for the proposed HR-E_peak relation are derived. These results and further simulations are used to compute E_peak estimates for nearly the entire BATSE catalog: 2130 GRBs. The results may be useful for investigating the cosmological utility of the spectral peak in GRB intrinsic-luminosity estimates.
    Comment: MNRAS submitted; some technical side analyses removed or reduced following the referee's review; 68 pages, 13 figures
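
    A minimal sketch of the hardness ratio defined above and an almost-linear (in log-log space) mapping to E_peak. The fluence values and the coefficients a, b are illustrative placeholders, not the paper's fitted results.

```python
import numpy as np

# Hypothetical BATSE 4-channel fluences (erg/cm^2) for a few bursts;
# real values would come from the BATSE catalog.
fluence = np.array([
    # ch1      ch2      ch3      ch4
    [1.2e-7, 2.5e-7, 4.1e-7, 1.0e-7],
    [0.8e-7, 1.9e-7, 6.3e-7, 3.2e-7],
    [2.1e-7, 3.0e-7, 2.2e-7, 0.4e-7],
])

# Hardness ratio as defined in the abstract: HR = (S3 + S4) / (S1 + S2).
hr = (fluence[:, 2] + fluence[:, 3]) / (fluence[:, 0] + fluence[:, 1])

# Power-law relation E_peak ~ a * HR^b, i.e. linear in log-log space;
# a and b here are placeholders standing in for the fitted coefficients.
a, b = 150.0, 1.0  # a in keV
e_peak = a * hr ** b

for h, e in zip(hr, e_peak):
    print(f"HR = {h:.2f}  ->  E_peak ~ {e:.0f} keV")
```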

    Exclusivity and exclusion on platform markets

    We examine conditions under which an exclusive license, granted by the upstream producer of a component that some consumers regard as essential, to one of two potential suppliers in a downstream platform market can make the unlicensed supplier unprofitable, even though both firms would be profitable if both were licensed. If downstream varieties are close substitutes, an exclusive license need not be exclusionary. If downstream varieties are highly differentiated, an exclusive license is exclusionary, but it is not in the interest of the upstream firm to grant one. For intermediate levels of product differentiation, an exclusive license is exclusionary and maximizes the upstream firm's payoff.

    An Experimental Investigation of Colonel Blotto Games

    "This article examines behavior in the two-player, constant-sum Colonel Blotto game with asymmetric resources in which players maximize the expected number of battlefields won. The experimental results support all major theoretical predictions. In the auction treatment, where winning a battlefield is deterministic, disadvantaged players use a 'guerilla warfare' strategy which stochastically allocates zero resources to a subset of battlefields. Advantaged players employ a 'stochastic complete coverage' strategy, allocating random, but positive, resource levels across the battlefields. In the lottery treatment, where winning a battlefield is probabilistic, both players divide their resources equally across all battlefields." (author's abstract)"Dieser Artikel untersucht das Verhalten von Individuen in einem 'constant-sum Colonel Blotto'-Spiel zwischen zwei Spielern, bei dem die Spieler mit unterschiedlichen Ressourcen ausgestattet sind und die erwartete Anzahl gewonnener Schlachtfelder maximieren. Die experimentellen Ergebnisse bestätigen alle wichtigen theoretischen Vorhersagen. Im Durchgang, in dem wie in einer Auktion der Sieg in einem Schlachtfeld deterministisch ist, wenden die Spieler, die sich im Nachteil befinden, eine 'Guerillataktik' an, und verteilen ihre Ressourcen stochastisch auf eine Teilmenge der Schlachtfelder. Spieler mit einem Vorteil verwenden eine Strategie der 'stochastischen vollständigen Abdeckung', indem sie zufällig eine positive Ressourcenmenge auf allen Schlachtfeldern positionieren. Im Durchgang, in dem sich der Gewinn eines Schlachtfeldes probabilistisch wie in einer Lotterie bestimmt, teilen beide Spieler ihre Ressourcen gleichmäßig auf alle Schlachtfelder auf." (Autorenreferat