
    Linguistic sleuthing for innovators

    For centuries, “innovation” has been a topic for book authors and academic researchers, as documented by Ngram and Google Scholar search results. In contrast, “innovators” have received substantially less attention in both the popular and the academic domain. The purpose of this paper is to introduce a text analysis research methodology that linguistically identifies “innovators” and “non-innovators” using Herbert F. Crovitz’s 42 relational words. Specifically, we demonstrate how to combine two complementary text analysis software programs, Linguistic Inquiry and Word Count (LIWC) and WORDij, to count the percentage of use of these relational words and determine the statistical difference in use between “innovators” and “non-innovators.” We call this the “Crovitz Innovator Identification Method” in honor of Herbert F. Crovitz, who envisioned the possibility of using a small group of 42 words to signal “innovation” language. The Crovitz Innovator Identification Method is inexpensive, fast, scalable, and ready to be applied by others using this example as their guide. Nevertheless, this method does not confirm the viability of any innovation being created, used, or implemented; it simply detects how a person’s language signals innovative thinking. We invite other scholars to join us in this linguistic sleuthing for innovators.
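The counting step behind such a method can be sketched in a few lines. This is a minimal illustration assuming a hypothetical subset of relational words (the published 42-word list is not reproduced here) and a simple tokenizer, rather than the LIWC/WORDij pipelines named above:

```python
import re

# Illustrative subset only -- NOT the published 42-word Crovitz list.
RELATIONAL_WORDS = {"about", "across", "between", "of", "through", "with"}

def relational_percent(text: str) -> float:
    """Percent of tokens that are relational words."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    return 100.0 * sum(t in RELATIONAL_WORDS for t in tokens) / len(tokens)
```

Comparing the resulting percentages across the two groups of speakers (e.g. with a t-test) would then mirror the statistical comparison the abstract describes.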

    Scientific mindfulness: a foundation for future themes in international business

    We conceptualize new ways to qualify what themes should dominate the future IB research agenda by examining three questions: Whom should we ask? What should we ask, and which selection criteria should we apply? What are the contextual forces? We propose scientific mindfulness as the way forward for generating themes in IB research.

    Input-state-output representations and constructions of finite-support 2D convolutional codes

    Two-dimensional convolutional codes are considered, with codewords having compact support indexed in N^2 and taking values in F^n, where F is a finite field. Input-state-output representations of these codes are introduced and several aspects of such representations are discussed. Constructive procedures of such codes with a designed distance are also presented. © 2010 AIMS-SDU

    A state space approach to periodic convolutional codes

    In this paper we study periodically time-varying convolutional codes by means of input-state-output representations. Using these representations, we investigate under which conditions a given time-invariant convolutional code can be transformed into an equivalent periodic time-varying one. The relation between these two classes of convolutional codes is studied for period 2. We illustrate the ideas presented in this paper by constructing a periodic time-varying convolutional code from a time-invariant one. The resulting periodic code has larger free distance than any time-invariant convolutional code with equivalent parameters.
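The input-state-output viewpoint used in this line of work can be sketched generically: the encoder is a linear system over a finite field, and each codeword block pairs the input with the system's output. A minimal sketch over GF(2), with small illustrative matrices rather than any code from the paper:

```python
import numpy as np

def iso_encode(A, B, C, D, u_seq):
    """Run a time-invariant input-state-output encoder over GF(2).

    x_{t+1} = A x_t + B u_t,  y_t = C x_t + D u_t  (arithmetic mod 2);
    each codeword block is the pair (u_t, y_t).
    A, B, C, D are assumed to be 0/1 numpy arrays of compatible sizes.
    """
    x = np.zeros(A.shape[0], dtype=int)  # zero initial state
    code = []
    for u in u_seq:
        u = np.asarray(u, dtype=int)
        y = (C @ x + D @ u) % 2
        code.append((u.copy(), y))
        x = (A @ x + B @ u) % 2
    return code
```

With A = [[0]], B = [[1]], C = [[1]], D = [[1]] this realizes y_t = u_t + u_{t-1} mod 2, a one-memory encoder; a periodic time-varying encoder would additionally switch (A, B, C, D) with the time index.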

    Column Rank Distances of Rank Metric Convolutional Codes

    In this paper, we deal with so-called multi-shot network coding, meaning that the network is used several times (shots) to propagate the information. The framework we present is slightly more general than the one found in the literature. We introduce and study the notion of column rank distance of rank metric convolutional codes for any given rate and finite field. Within this new framework we generalize previous results on column distances of Hamming and rank metric convolutional codes [3, 8]. This contribution can be considered a continuation of the work presented in [10].
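As background to the rank metric: the rank weight of a word over an extension field is the rank, over the base field, of its matrix expansion. A minimal sketch of that rank computation for base field GF(2), generic and not specific to the paper:

```python
import numpy as np

def gf2_rank(M):
    """Rank of a 0/1 matrix over GF(2) by Gaussian elimination."""
    M = (np.array(M, dtype=int) % 2).copy()
    rank = 0
    rows, cols = M.shape
    for c in range(cols):
        # find a pivot in column c below the current rank
        piv = next((i for i in range(rank, rows) if M[i, c]), None)
        if piv is None:
            continue
        M[[rank, piv]] = M[[piv, rank]]
        for i in range(rows):
            if i != rank and M[i, c]:
                M[i] = (M[i] + M[rank]) % 2  # XOR rows
        rank += 1
    return rank
```

The rank weight of a word is then `gf2_rank` applied to its m-by-n expansion, in contrast to the Hamming weight, which simply counts nonzero coordinates.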

    Decoding of 2D convolutional codes over an erasure channel

    In this paper we address the problem of decoding 2D convolutional codes over an erasure channel. To this end we introduce the notion of neighbors around a set of erasures which can be considered an analogue of the notion of sliding window in the context of 1D convolutional codes. The main idea is to reduce the decoding problem of 2D convolutional codes to a problem of decoding a set of associated 1D convolutional codes. We first show how to recover sets of erasures that are distributed on vertical, horizontal and diagonal lines. Finally we outline some ideas to treat any set of erasures distributed randomly on the 2D plane. © 2016 AIMS
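The 1D building block that erasure decoding reduces to, namely filling in erased positions by solving a linear system over the field, can be sketched as follows. This is a generic linear-algebra sketch assuming a known parity-check matrix H over GF(2), not the sliding-window scheme of the paper:

```python
import numpy as np

def solve_gf2(A, b):
    """Solve A x = b over GF(2); return one solution or None if inconsistent."""
    M = np.concatenate([A % 2, (b % 2)[:, None]], axis=1).astype(int)
    rows, n = M.shape[0], M.shape[1] - 1
    pivots, r = [], 0
    for c in range(n):
        piv = next((i for i in range(r, rows) if M[i, c]), None)
        if piv is None:
            continue
        M[[r, piv]] = M[[piv, r]]
        for i in range(rows):
            if i != r and M[i, c]:
                M[i] = (M[i] + M[r]) % 2
        pivots.append(c)
        r += 1
    if any(M[i, n] and not M[i, :n].any() for i in range(r, rows)):
        return None  # inconsistent: erasure pattern not recoverable
    x = np.zeros(n, dtype=int)
    for i, c in enumerate(pivots):
        x[c] = M[i, n]
    return x

def recover_erasures(H, word, erased):
    """Fill in the erased positions of word so that H @ word = 0 over GF(2)."""
    known = [i for i in range(H.shape[1]) if i not in erased]
    # move known contributions to the right: H_E x = H_K c_K (mod 2)
    rhs = (H[:, known] @ word[known]) % 2
    x = solve_gf2(H[:, erased], rhs)
    if x is None:
        return None
    out = word.copy()
    out[erased] = x
    return out
```

In the 1D convolutional setting the same idea is applied to a window of the received stream; the paper's contribution is how to carve 2D erasure patterns into such 1D subproblems.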

    Periodic state-space representations of periodic convolutional codes

    In this paper we study the representation of periodically time-varying convolutional codes by means of periodic input-state-output models. In particular, we focus on period two and investigate under which conditions a given two-periodic convolutional code (obtained by alternating two time-invariant encoders) can be represented by a periodic input-state-output system. We first show that one cannot expect, in general, to obtain a periodic input-state-output representation of a periodic convolutional code by means of the individual realizations of each of the associated time-invariant codes. We, however, provide sufficient conditions for this to hold in terms of the column degrees of the associated column reduced generator matrices. Moreover, we derive a sufficient condition to obtain a periodic state-space realization that is minimal. Finally, examples to illustrate the results are presented.

    On optimal extended row distance profile

    In this paper, we investigate extended row distances of Unit Memory (UM) convolutional codes. In particular, we derive upper and lower bounds for these distances and present a concrete construction of a UM convolutional code that almost achieves the derived upper bounds. The generator matrix of these codes is built by means of a particular class of matrices called superregular matrices. We conjecture that the construction presented is optimal with respect to the extended row distances, i.e., that it achieves the maximum extended row distances possible; this in particular implies that the derived upper bound is not completely tight. The results presented in this paper further develop the line of research devoted to the distance properties of convolutional codes, which has so far focused mainly on the notions of free distance and column distance. Some open problems are left for further research.
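Superregularity can be checked by brute force for small matrices. A minimal sketch: note that this tests the strongest notion, every square minor nonzero mod a prime p, whereas for the triangular Toeplitz matrices used in some constructions only the minors that are not trivially zero are required to be nonzero:

```python
import itertools

def det_int(M):
    """Exact integer determinant by Laplace expansion (fine for small matrices)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               det_int([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def is_fully_superregular(M, p):
    """True if every square minor of M (list of lists) is nonzero mod the prime p."""
    rows, cols = len(M), len(M[0])
    for k in range(1, min(rows, cols) + 1):
        for ri in itertools.combinations(range(rows), k):
            for ci in itertools.combinations(range(cols), k):
                sub = [[M[r][c] for c in ci] for r in ri]
                if det_int(sub) % p == 0:
                    return False
    return True
```

The exhaustive minor check is exponential in the matrix size, which is one reason explicit superregular constructions, rather than search, matter in this line of work.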