
    What influences the speed of prototyping? An empirical investigation of twenty software startups

    It is essential for startups to experiment with business ideas quickly by building tangible prototypes and collecting user feedback on them. As prototyping is an inevitable part of learning for early-stage software startups, how fast startups can learn depends on how fast they can prototype. Despite its importance, there is a lack of research about prototyping in software startups. In this study, we aimed to understand which factors influence different types of prototyping activities. We conducted a multiple case study of twenty European software startups. The results are twofold: first, we propose a prototype-centric learning model for early-stage software startups; second, we identify factors that act as barriers but also as facilitators for prototyping in early-stage software startups. The factors are grouped into (1) artifacts, (2) team competence, (3) collaboration, (4) customer and (5) process dimensions. To speed up a startup's progress at the early stage, it is important to incorporate the learning objective into a well-defined collaborative approach to prototyping. Comment: This is the author's version of the work. The copyright owner's version can be accessed at doi.org/10.1007/978-3-319-57633-6_2, XP2017, Cologne, Germany.

    A Hierarchy of Polynomial Kernels

    In parameterized algorithmics, kernelization is defined as a polynomial-time algorithm that transforms an instance of a given problem into an equivalent instance whose size is bounded by a function of the parameter. As this smaller instance can then be solved to answer the original question, kernelization is often presented as a form of preprocessing. A natural generalization of kernelization allows a number of smaller instances to be produced to provide an answer to the original problem, possibly also using negation. This generalization is called Turing kernelization. Questions of equivalence immediately arise: when is one form possible and not the other? These have been long-standing open problems in parameterized complexity. In the present paper, we answer many of them. In particular, we show that Turing kernelization differs not only from regular kernelization, but also from intermediate forms such as truth-table kernelization. We obtain absolute results by diagonalization, as well as results on natural problems that depend on widely accepted complexity-theoretic assumptions. In particular, we improve on known lower bounds for the kernel size of compositional problems under these assumptions.
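
    To make the definition above concrete, the following is a minimal sketch of a classical kernelization, the well-known high-degree rule for Vertex Cover, after which the reduced instance has at most k^2 edges and O(k^2) vertices, i.e. a size bounded by a function of the parameter alone. This illustration is not taken from the paper; the function name and the edge-list representation of instances are choices made for this sketch.

        def vertex_cover_kernel(edges, k):
            """Reduce (edges, k) to an equivalent instance of size bounded by k, or report NO."""
            edges = set(map(frozenset, edges))
            changed = True
            while changed and k >= 0:
                changed = False
                degree = {}
                for e in edges:
                    for v in e:
                        degree[v] = degree.get(v, 0) + 1
                # High-degree rule: a vertex of degree > k must be in every cover of size <= k.
                high = next((v for v, d in degree.items() if d > k), None)
                if high is not None:
                    edges = {e for e in edges if high not in e}
                    k -= 1
                    changed = True
            if k < 0 or len(edges) > k * k:
                return None   # more than k^2 edges of degree <= k cannot be covered: NO-instance
            return edges, k   # kernel: at most k^2 edges, hence O(k^2) vertices

        # Usage: a triangle plus a pendant edge, parameter k = 2
        print(vertex_cover_kernel([(1, 2), (2, 3), (1, 3), (3, 4)], 2))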

    Human Prion Diseases in The Netherlands (1998–2009): Clinical, Genetic and Molecular Aspects

    Prion diseases are rare and fatal neurodegenerative disorders that can be sporadic, inherited or acquired by infection. Based on a national surveillance program in the Netherlands, we describe here the clinical, neuropathological, genetic and molecular characteristics of 162 patients with neuropathologically confirmed prion disease over a 12-year period (1998–2009). Since 1998, there has been a relatively stable mortality of Creutzfeldt-Jakob disease (CJD) in the Netherlands, ranging from 0.63 to 1.53 per million inhabitants per annum. Genetic analysis of the codon 129 methionine/valine (M/V) polymorphism in all patients with sporadic CJD (sCJD) showed a trend towards under-representation of VV cases (7.0%) compared with sCJD cohorts in other Western countries, whereas the MV genotype was relatively over-represented (22.4%). Combined PrPSc and histopathological typing identified all sCJD subtypes known to date, except for the VV1 subtype. In particular, a "pure" phenotype was demonstrated in 60.1% of patients, whereas a mixed phenotype was detected in 39.9% of all sCJD cases. The relative excess of MV cases was largely accounted for by a relatively high incidence of the MV 2K subtype. Genetic analysis of the prion protein gene (PRNP) was performed in 161 patients and showed a mutation in 9 of them (5.6%), including one FFI and four GSS cases. Iatrogenic CJD was a rare phenomenon (3.1%), mainly associated with dura mater grafts. Three patients were diagnosed with new variant CJD (1.9%) and one with variably protease-sensitive prionopathy (VPSPr). Post-mortem examination revealed an alternative diagnosis in 156 patients, most commonly Alzheimer's disease (21.2%) or vascular causes of dementia (19.9%). The mortality rates of sCJD in the Netherlands are similar to those in other European countries, whereas iatrogenic and genetic cases are relatively rare. The unusual incidence of the VV2 sCJD subtype compared to that reported to date in other Western countries deserves further investigation.

    Chiral Effective Lagrangian and Quark Masses

    The status of lattice determinations of quark masses is reviewed (with the exception of m_b). Attempts to extract the low-energy constants in the effective chiral Lagrangian are discussed, with special emphasis on those couplings which are required to test the hypothesis of a massless up-quark. Furthermore, the issue of quenched chiral logarithms is addressed. Comment: Invited talk presented at Lattice 2002 (plenary), 12 pages, 3 figures.

    Time sequence of the damage to the acceptor and donor sides of photosystem II by UV-B radiation as evaluated by chlorophyll a fluorescence

    The effects of ultraviolet-B (UV-B) radiation on photosystem II (PS II) were studied in leaves of Chenopodium album. After the UV-B treatment, the damage was estimated using chlorophyll a fluorescence techniques. Measurements of modulated fluorescence with a pulse-amplitude-modulated fluorometer revealed that the efficiency of photosystem II decreased both with increasing duration of UV-B radiation and with increasing UV-B intensity. Fluorescence induction rise curves were analyzed using a mechanistic model of energy trapping. It appears that the damage caused by UV-B radiation occurs first at the acceptor side of photosystem II and only later at the donor side.

    On opportunistic software reuse

    The availability of open source assets for almost all imaginable domains has led the software industry to opportunistic design, an approach in which people develop new software systems in an ad hoc fashion by reusing and combining components that were not designed to be used together. In this paper we investigate this emerging approach. We demonstrate the approach with an industrial example in which Node.js modules and various subsystems are used in an opportunistic way. Furthermore, to study opportunistic reuse as a phenomenon, we present the results of three contextual interviews and a survey with reuse practitioners to understand to what extent opportunistic reuse offers improvements over traditional systematic reuse approaches.

    The ρ(1S,2S), ψ(1S,2S), Υ(1S,2S) and ψ_t(1S,2S) mesons in a double pole QCD Sum Rule

    We use the double pole QCD sum rule method, which is essentially a fit of two exponentials to the correlation function, from which we can extract the masses and decay constants of mesons as functions of the Borel mass. We apply this method to study the mesons ρ(1S,2S), ψ(1S,2S), Υ(1S,2S) and ψ_t(1S,2S). We also present predictions for the toponium ψ_t(1S,2S) masses of m(1S) = 357 GeV and m(2S) = 374 GeV. Comment: 14 pages, 11 figures, in Braz J Phys (2016).
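
    As a rough illustration of the kind of two-exponential fit the abstract describes (a sketch only, not the authors' analysis; the correlator data, noise level and starting values below are synthetic and chosen for demonstration):

        import numpy as np
        from scipy.optimize import curve_fit

        def two_pole(t, A1, m1, A2, m2):
            """Double pole ansatz: C(t) = A1*exp(-m1*t) + A2*exp(-m2*t) (1S + 2S)."""
            return A1 * np.exp(-m1 * t) + A2 * np.exp(-m2 * t)

        # Synthetic correlator with a 1S state near 0.77 and a 2S state near 1.45 (arbitrary units)
        t = np.arange(1.0, 20.0)
        rng = np.random.default_rng(0)
        data = two_pole(t, 1.0, 0.77, 0.4, 1.45) * (1.0 + 0.01 * rng.standard_normal(t.size))

        # Fit both exponentials at once; positivity bounds keep amplitudes and masses physical
        popt, pcov = curve_fit(two_pole, t, data, p0=[1.0, 0.5, 0.5, 1.0], bounds=(0.0, np.inf))
        A1, m1, A2, m2 = popt
        print(f"1S mass ~ {min(m1, m2):.3f}, 2S mass ~ {max(m1, m2):.3f} (arbitrary units)")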

    Towards the glueball spectrum from unquenched lattice QCD

    We use a variational technique to study heavy glueballs on gauge configurations generated with 2+1 flavours of ASQTAD improved staggered fermions. The variational basis includes glueball scattering states. The measurements were made using 2150 configurations at a lattice spacing of 0.092 fm with a pion mass of 360 MeV. We report masses for 10 glueball states. We discuss the prospects for unquenched lattice QCD calculations of the oddballs. Comment: 19 pages, 4 tables and 8 figures. One figure added. Now matches the published version.
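
    For context, a minimal sketch of the standard variational (generalized eigenvalue) method on a correlator matrix; this is not the paper's analysis code, and the operator overlaps, energies and time slices below are invented for demonstration:

        import numpy as np
        from scipy.linalg import eigh

        energies = np.array([1.7, 2.4, 3.0])       # pretend energies in lattice units
        Z = np.array([[1.0, 0.6, 0.2],
                      [0.3, 1.0, 0.5],
                      [0.1, 0.4, 1.0]])            # pretend operator overlaps <0|O_i|n>

        def correlator(t):
            """Correlator matrix C_ij(t) = sum_n Z[i,n] * Z[j,n] * exp(-E_n * t)."""
            return (Z * np.exp(-energies * t)) @ Z.T

        t0, t = 1, 4
        # Generalized eigenvalue problem C(t) v = lambda * C(t0) v
        lam = eigh(correlator(t), correlator(t0), eigvals_only=True)
        lam = np.sort(lam)[::-1]                   # largest eigenvalue <-> lightest state
        print("extracted energies:", np.round(-np.log(lam) / (t - t0), 3))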

    Crossing Paths with Hans Bodlaender: A Personal View on Cross-Composition for Sparsification Lower Bounds

    On the occasion of Hans Bodlaender’s 60th birthday, I give a personal account of our history and work together on the technique of cross-composition for kernelization lower bounds. I present several simple new proofs for polynomial kernelization lower bounds using cross-composition, interlaced with personal anecdotes about my time as Hans’ PhD student at Utrecht University. Concretely, I will prove that Vertex Cover, Feedback Vertex Set, and the H-Factor problem for every graph H that has a connected component of at least three vertices, do not admit kernels of O(n^{2-ε}) bits when parameterized by the number of vertices n, for any ε > 0, unless NP ⊆ coNP/poly. These lower bounds are obtained by elementary gadget constructions, in particular avoiding the use of the Packing Lemma by Dell and van Melkebeek.