50 research outputs found

    Efficient Algorithms for Constructing Minimum-Weight Codewords in Some Extended Binary BCH Codes

    Full text link
    We present $O(m^3)$ algorithms for specifying the support of minimum-weight words of extended binary BCH codes of length $n=2^m$ and designed distance $d(m,s,i):=2^{m-1-s}-2^{m-1-i-s}$ for some values of $m,i,s$, where $m$ may grow to infinity. The support is specified as the sum of two sets: a set of $2^{2i-1}-2^{i-1}$ elements, and a subspace of dimension $m-2i-s$, specified by a basis. In some detail, for designed distance $6\cdot 2^j$, we have a deterministic algorithm for even $m\geq 4$, and a probabilistic algorithm with success probability $1-O(2^{-m})$ for odd $m>4$. For designed distance $28\cdot 2^j$, we have a probabilistic algorithm with success probability $\geq 1/3-O(2^{-m/2})$ for even $m\geq 6$. Finally, for designed distance $120\cdot 2^j$, we have a deterministic algorithm for $m\geq 8$ divisible by $4$. We also present a construction via Gold functions when $2i\mid m$. Our construction builds on results of Kasami and Lin (IEEE T-IT, 1972), who proved that for extended binary BCH codes of designed distance $d(m,s,i)$, the minimum distance equals the designed distance. Their proof makes use of a non-constructive result of Berlekamp (Inform. Control, 1970), and a constructive ``down-conversion theorem'' that converts some words in BCH codes to lower-weight words in BCH codes of lower designed distance. Our main contribution is in replacing the non-constructive argument of Berlekamp by a low-complexity algorithm. In one aspect, we extend the results of Grigorescu and Kaufman (IEEE T-IT, 2012), who presented explicit minimum-weight words for designed distance $6$ (and hence also for designed distance $6\cdot 2^j$, by a well-known ``up-conversion theorem''), as we cover more cases of the minimum distance. However, the minimum-weight words we construct are not affine generators for designed distance $>6$.
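
    The support decomposition quoted above can be sanity-checked with elementary arithmetic. The sketch below (illustrative parameter choices, not the authors' algorithm) verifies that a set of $2^{2i-1}-2^{i-1}$ elements summed with a subspace of dimension $m-2i-s$ accounts for exactly $d(m,s,i)$ positions when all sums are distinct; since $d(m,s,i)=(2^i-1)\cdot 2^{m-1-i-s}$, the designed distances $6\cdot 2^j$, $28\cdot 2^j$ and $120\cdot 2^j$ correspond to $i=2,3,4$ respectively.

```python
# Arithmetic sanity check of the support decomposition quoted above
# (illustrative parameters, not the paper's algorithm): a set of
# 2^(2i-1) - 2^(i-1) elements summed with a subspace of dimension
# m - 2i - s covers d(m, s, i) positions when all sums are distinct.

def designed_distance(m: int, s: int, i: int) -> int:
    """d(m, s, i) = 2^(m-1-s) - 2^(m-1-i-s)."""
    return 2 ** (m - 1 - s) - 2 ** (m - 1 - i - s)

def sumset_size(m: int, s: int, i: int) -> int:
    set_size = 2 ** (2 * i - 1) - 2 ** (i - 1)   # the finite set
    subspace_size = 2 ** (m - 2 * i - s)         # the subspace, given by a basis
    return set_size * subspace_size

# i = 2, 3, 4 give designed distances 6*2^j, 28*2^j and 120*2^j respectively.
for m, s, i in [(4, 0, 2), (6, 0, 2), (8, 0, 3), (10, 2, 4)]:
    assert sumset_size(m, s, i) == designed_distance(m, s, i)
    print(f"m={m}, s={s}, i={i}: d = {designed_distance(m, s, i)}")
```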

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    Get PDF
    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    Defying Convention: Devising new approaches to heritage values in Valletta, Malta.

    Get PDF
    This thesis is concerned with heritage values and significance in the context of the World Heritage-listed city of Valletta, the capital of the Maltese Islands, and the author’s home. Though the city’s fortunes have ebbed and flowed over the years, it has recently experienced a period of accelerated regeneration. Now a popular destination for cultural tourists, Valletta is a city in transition, where renewal has led to the conservation and restoration of its urban fabric, yet change wrought by these processes has had a demonstrable, detrimental effect on its lesser-recognized heritage. An intentional focus on these ‘unofficial’ heritage values is the principal subject of this research, in order to produce an alternative reading of a heritage landscape most often associated with more traditional criteria. This research is framed and informed by more contemporary approaches to heritage values, and the intellectual foundation for this approach is drawn from recent scholarship and related heritage conventions. Of particular note are the Burra Charter (Australia ICOMOS 2013 [1979]) and the Faro Convention (Council of Europe 2005), which reconceptualize the ideas of significance and emphasize public contribution in the process of defining what it is about a heritage place that is significant. This thesis adopts an innovative, in-situ data collection method, where the participants lead the researcher on walks around the city. Interviews were conducted with 19 participants in three phases between 2021 and 2022. The interviews themselves are participant-defined and captured on video as well as audio. The resultant data are analysed, organized and presented thematically. This analysis demonstrates the pivotal role of social values in understanding significance from a grassroots perspective, an approach that has never previously been applied in the context of Valletta.

    Decreasing norm-trace codes

    Full text link
    The decreasing norm-trace codes are evaluation codes defined by a set of monomials closed under divisibility and the rational points of the extended norm-trace curve. In particular, the decreasing norm-trace codes contain the one-point algebraic geometry (AG) codes over the extended norm-trace curve. We use Gröbner basis theory and find the indicator functions on the rational points of the curve to determine the basic parameters of the decreasing norm-trace codes: length, dimension, and minimum distance. We also obtain their dual codes. We give conditions for a decreasing norm-trace code to be a self-orthogonal or a self-dual code. We provide a linear exact repair scheme to correct single erasures for decreasing norm-trace codes, which applies to higher-rate codes than the scheme developed by Jin, Luo, and Xing (IEEE Transactions on Information Theory 64(2), 900-908, 2018) when applied to the one-point AG codes over the extended norm-trace curve.
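
    To make the construction concrete, here is a toy sketch (an illustrative special case, not the paper's exact setup): it evaluates a divisibility-closed set of monomials at the affine rational points of the norm-trace curve x^3 = y^2 + y over GF(4), one instance of the extended norm-trace curve, and measures the resulting length, dimension and minimum distance by brute force.

```python
# Toy evaluation-code construction in the spirit of the abstract (a sketch,
# not the paper's exact setup): evaluate a divisibility-closed monomial set
# at the affine points of the norm-trace curve x^3 = y^2 + y over GF(4) and
# compute [n, k, d] by brute force. Monomial set and curve are illustrative.
from itertools import product
from math import log

def gf4_mul(a, b):
    """Multiply in GF(4) = GF(2)[t]/(t^2 + t + 1); elements coded as 0..3."""
    r = 0
    for i in range(2):
        if (b >> i) & 1:
            r ^= a << i
    if (r >> 2) & 1:            # reduce the degree-2 term using t^2 = t + 1
        r ^= 0b111
    return r & 0b11

def gf4_pow(a, e):
    r = 1
    for _ in range(e):
        r = gf4_mul(r, a)
    return r

# Affine rational points of the curve: norm(x) = trace(y) for GF(4)/GF(2).
points = [(x, y) for x in range(4) for y in range(4)
          if gf4_pow(x, 3) == gf4_pow(y, 2) ^ y]

# Exponent pairs (i, j) for x^i * y^j, closed under divisibility.
monomials = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 0)]

# Generator matrix: one row per monomial, one column per curve point.
G = [[gf4_mul(gf4_pow(x, i), gf4_pow(y, j)) for x, y in points]
     for i, j in monomials]

def combination(coeffs):
    """GF(4)-linear combination of the rows of G (addition is XOR)."""
    word = [0] * len(points)
    for c, row in zip(coeffs, G):
        for idx, v in enumerate(row):
            word[idx] ^= gf4_mul(c, v)
    return tuple(word)

codewords = {combination(c) for c in product(range(4), repeat=len(G))}
n = len(points)
k = round(log(len(codewords), 4))
d = min(sum(1 for v in w if v) for w in codewords if any(w))
print(f"[n, k, d] = [{n}, {k}, {d}]")
```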

    First Draft Thinking: Reading, Writing, and Researching at the Doctoral Level Using Social Annotation

    Get PDF
    This study addressed doctoral students’ academic challenges in earning their degrees. A mixed-method needs assessment conducted in October 2020 of U.S. doctoral students in various fields (n = 270) indicated that the top academic challenges were reading the primary literature (PL), understanding research methods, finding information, and writing. These findings corroborated prior research indicating that reading, writing, and researching skills are co-constructed. The main study focused on one of these factors, reading PL, because it is foundational to developing the others. This is the first study to examine PL skills at the doctoral level. Reading PL is part of a hidden curriculum often assumed and unaided by coursework in doctoral programs, although it is essential for degree progress. The study examined how participation in a four-week online intervention changed doctoral students' reading skills, annotation practices, and reading self-perceptions. The findings indicated that assessed pre-posttest skills improved significantly with participation. The pre-post reading comprehension (RC) assessment indicated that the intervention resulted in a statistically significant improvement in RC, measured as critical reading of PL outside the participants' fields of study, t(23) = 13.6, p < .0001, with a large effect size, Cohen's d = 1.68. The pre-post research self-efficacy (RSE) scores indicated that the intervention increased RSE, t(23) = 4.9, p < .0001, with a medium effect size, Cohen's d = 0.72. Finally, the pre-post reading apprehension (RA) scores indicated that the intervention decreased RA of PL outside the participants' fields of study, t(23) = 4.3, p < .0001, with a medium effect size, Cohen's d = 0.71. In addition, participants reported significant positive changes in self-perceptions of their PL reading abilities in terms of ease and confidence. They received the most benefit from learning a structured reading method combined with low-stakes peer-based discussion. The study findings update doctoral preparation pedagogy concerning critical reading of the primary literature. The main implication is that doctoral students need and benefit from explicit instruction in critical reading skills related to the PL. The main recommendation is that all doctoral programs should explicitly teach critical reading. Another recommendation is to teach critical reading explicitly in higher education wherever students encounter PL.

    A right to roam? A trans-species approach to understanding cat-human relations and social discourses associated with free-roaming urban cats (Felis catus)

    Get PDF
    This thesis employs thematic discourse analysis to elucidate prominent themes and points of contention associated with roaming cats (Felis catus). The data comprised 2476 online user comments responding to content related to roaming cats, 75 qualitative survey responses, 771 Facebook responses, and biographies reconstructed from eight case studies of cat-human relationships. These reflect broader social discourses surrounding more-than-human animals and human governance over other animals. Notions of guardian (owner) responsibilities are underpinned by different perceptions of companion cats (pets), ranging from childlike dependents who need to be protected and ‘parented’ to wild-like animals whose confinement would be morally wrong. Comments reveal how discourses from scientists, cat and wildlife advocacy groups, and the media are filtered through a local lens and often applied out of context. The data supports the notion that media reporting is instigating a moral panic over roaming cats by evoking emotive responses to predation by cats. These invariably become entangled within discourses related to cat safety, welfare, and complaints of ‘nuisance’ behaviours. Discourses surrounding cats in the community are further examined within a more-than-human biopolitical framework that describes how cohesive social mechanisms exert control over feline bodies through normalisation of practices such as desexing and confinement. Language was found to play a key role in biopolitical control by dominating the narrative of ‘responsible’ cat guardianship. Language is also central to moral panic theory, and the term ‘feral’ was shown to reinforce a ‘folk devil’ trope of free-living cats as transgressive and inherently different from companion cats. ‘Feral’ also invoked pity among those adamant cats need human love and care. However, cats are not without agency and can co-create meaning within a multispecies home or community. Case studies demonstrated cat-human intersubjectivity (joint meaning-making) and the various relationships formed between cats and non-feline animals (including humans), both inside and outside of their homes.

    Decoding and constructions of codes in rank and Hamming metric

    Get PDF
    As coding theory plays an important role in data transmission, decoding algorithms for new families of error correction codes are of great interest. This dissertation is dedicated to decoding algorithms for new families of maximum rank distance (MRD) codes, including additive generalized twisted Gabidulin (AGTG) codes and Trombetti-Zhou (TZ) codes, to a decoding algorithm for Gabidulin codes beyond half the minimum distance, and to encoding and decoding algorithms for some new optimal rank metric codes with restrictions. We propose an interpolation-based decoding algorithm for AGTG codes in which the decoding problem is reduced to the problem of solving a projective polynomial equation of the form q(x) = x^{q^u+1} + bx + a = 0 for a, b ∈ F_{q^m}. We investigate the zeros of q(x) when gcd(u, m) = 1 and propose a deterministic algorithm to solve a linearized polynomial equation which has a close connection to the zeros of q(x). An efficient polynomial-time decoding algorithm is proposed for TZ codes. The interpolation-based decoding approach transforms the decoding problem of TZ codes into the problem of solving a quadratic polynomial equation. Two new communication models are defined, and using these models we manage to decode Gabidulin codes one unit beyond half the minimum distance. Our models also allow us to improve the complexity of decoding GTG and AGTG codes. Besides working on MRD codes, we also work on restricted optimal rank metric codes, including symmetric, alternating and Hermitian rank metric codes. Both encoding and decoding algorithms for these optimal families are proposed. In all the decoding algorithms presented in this thesis, the properties of the Dickson matrix and the BM algorithm play crucial roles. We also address two problems in the Hamming metric. For the first problem, some cryptographic properties of the Welch permutation polynomial are investigated, and we use these properties to determine the weight distribution of a binary linear code with few weights. For the second one, we introduce two new subfamilies of maximum weight spectrum codes with respect to their weight distribution and then investigate their properties.
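
    The projective polynomial equation mentioned above can be explored directly for small parameters. The sketch below (a brute-force reference, not the thesis's deterministic algorithm) enumerates the zeros of q(x) = x^{q^u+1} + bx + a over GF(2^5) with u = 1, so that gcd(u, m) = 1; the field modulus and the sample values of a and b are illustrative choices.

```python
# Brute-force reference (not the thesis algorithm) for the projective
# polynomial equation q(x) = x^(2^u + 1) + b*x + a = 0 over GF(2^m),
# here with m = 5 and u = 1 so that gcd(u, m) = 1. The modulus and the
# sample (a, b) pairs are illustrative choices.
M = 5
MOD = 0b100101      # x^5 + x^2 + 1, irreducible over GF(2)

def gf_mul(x, y):
    """Multiply two GF(2^M) elements represented as integers."""
    r = 0
    while y:
        if y & 1:
            r ^= x
        y >>= 1
        x <<= 1
        if (x >> M) & 1:
            x ^= MOD
    return r

def q_poly(x, a, b, u=1):
    """Evaluate q(x) = x^(2^u + 1) + b*x + a in GF(2^M)."""
    p = 1
    for _ in range(2 ** u + 1):
        p = gf_mul(p, x)
    return p ^ gf_mul(b, x) ^ a

def roots(a, b, u=1):
    """All zeros of q(x) in GF(2^M); known results on projective polynomials
    (for b != 0) give 0, 1, 2 or 2^gcd(u, M) + 1 of them, here at most 3."""
    return [x for x in range(2 ** M) if q_poly(x, a, b, u) == 0]

for a, b in [(1, 1), (7, 3), (0b10110, 0b01011)]:
    print(f"a={a:2d}, b={b:2d}: roots = {roots(a, b)}")
```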

    Role competition in Central Asia? Network analysis, role theory and great power regionalism: a framework for analysis

    Get PDF
    The present thesis develops an analytical framework that rests on three pillars: 1. Network Analysis; 2. Role Theory; 3. Neorealism. These theoretical and analytical approaches have been hitherto disconnected in IR and FPA, despite their potential for synthesis. Through a critical appreciation of each approach, the author highlights their interoperability and reconceptualizes central themes in international relations such as the agency-structure debate, the concept of power, interdependence and institutions, and the security dilemma. It concludes that the analysis of real-world phenomena needs to take into account both material and ideational factors, since ideational and material structures are inextricably interlinked in the conduct of foreign policy. The second part of the thesis applies this analytical framework to the regional case of Central Asia, and traces how great powers have engaged in role competition between 2007 and 2022. In an interpretative content analysis, it identifies 13 roles conceptualized by the United States and Russia respectively, of which five are the most salient. In addition, it explores the roles enacted by the European Union and China. The main finding is that the great powers engage in competitive role-play and reject each other’s role conceptions; create conflicting role expectations; and eventually find themselves in ideational security dilemmas that are partially characterized by capacity-identity gaps. Importantly, the case demonstrates the interdependence of regional subsystems through international feedback loops. Role location processes in the Central Asian network cluster contributed to the deterioration of great power relations – and conflictual great power relations shaped the regional context.

    Trellis Decoding And Applications For Quantum Error Correction

    Get PDF
    Compact, graphical representations of error-correcting codes called trellises are a crucial tool in classical coding theory, establishing both theoretical properties and performance metrics for practical use. The idea was extended to quantum error-correcting codes by Ollivier and Tillich in 2005. Here, we use their foundation to establish a practical decoder able to compute the maximum-likely error for any stabilizer code over a finite field of prime dimension. We define a canonical form for the stabilizer group and use it to classify the internal structure of the graph. Similarities and differences between the classical and quantum theories are discussed throughout. Numerical results are presented which match or outperform current state-of-the-art decoding techniques. New construction techniques for large trellises are developed and practical implementations discussed. We then define a dual trellis and use algebraic graph theory to solve the maximum-likely coset problem for any stabilizer code over a finite field of prime dimension at minimum added cost. Classical trellis theory makes occasional theoretical use of a graph product called the trellis product. We establish the relationship between the trellis product and the standard graph products and use it to provide a closed form expression for the resulting graph, allowing it to be used in practice. We explore its properties and classify all idempotents. The special structure of the trellis allows us to present a factorization procedure for the product, which is much simpler than that of the standard products. Finally, we turn to an algorithmic study of the trellis and explore what coding-theoretic information can be extracted assuming no other information about the code is available. In the process, we present a state-of-the-art algorithm for computing the minimum distance for any stabilizer code over a finite field of prime dimension. We also define a new weight enumerator for stabilizer codes over F_2 incorporating the phases of each stabilizer and provide a trellis-based algorithm to compute it.
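
    As a minimal illustration of the task such a decoder solves (not the trellis algorithm itself), the sketch below brute-forces a most likely Pauli error consistent with a given syndrome for the three-qubit bit-flip code under depolarizing noise; the choice of code, noise model and parameter values are illustrative assumptions.

```python
# Minimal brute-force illustration (not the trellis decoder from the thesis)
# of the underlying task: return a most likely Pauli error consistent with a
# measured syndrome. The three-qubit bit-flip code and the depolarizing
# noise model used here are illustrative assumptions.
from itertools import product

N = 3
# Stabilizer generators Z Z I and I Z Z in symplectic (x | z) form.
STABILIZERS = [((0, 0, 0), (1, 1, 0)),
               ((0, 0, 0), (0, 1, 1))]
PAULIS = {'I': (0, 0), 'X': (1, 0), 'Y': (1, 1), 'Z': (0, 1)}

def symplectic_inner(a, b):
    """Symplectic inner product mod 2; 1 means the two Paulis anticommute."""
    (ax, az), (bx, bz) = a, b
    return (sum(x * z for x, z in zip(ax, bz))
            + sum(z * x for z, x in zip(az, bx))) % 2

def to_symplectic(pauli_string):
    xs, zs = zip(*(PAULIS[c] for c in pauli_string))
    return xs, zs

def syndrome(error):
    return tuple(symplectic_inner(error, s) for s in STABILIZERS)

def most_likely_error(target, p=0.1):
    """Search all 4^N Pauli errors; each non-identity factor has
    probability p/3 under depolarizing noise with parameter p."""
    best, best_prob = None, -1.0
    for paulis in product('IXYZ', repeat=N):
        if syndrome(to_symplectic(paulis)) != target:
            continue
        w = sum(c != 'I' for c in paulis)
        prob = (1 - p) ** (N - w) * (p / 3) ** w
        if prob > best_prob:
            best, best_prob = ''.join(paulis), prob
    return best

# Syndrome (1, 0) is triggered by X (or Y) on qubit 0; the search returns
# one of the minimum-weight errors consistent with it.
print(most_likely_error((1, 0)))
```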

    The Subfield Codes of Some Few-Weight Linear Codes

    Full text link
    Subfield codes of linear codes over finite fields have recently received a lot of attention, as some of these codes are optimal and have applications in secret sharing, authentication codes and association schemes. In this paper, the $q$-ary subfield codes $\bar{C}_{f,g}^{(q)}$ of six different families of linear codes $\bar{C}_{f,g}$ are presented. The parameters and weight distribution of the subfield codes and their punctured codes $\bar{C}_{f,g}^{(q)}$ are explicitly determined. The parameters of the duals of these codes are also studied. Some of the resultant $q$-ary codes $\bar{C}_{f,g}^{(q)}$ and their dual codes are optimal and some have the best known parameters. The parameters and weight enumerators of the first two families of linear codes $\bar{C}_{f,g}$ are also settled, among which the first family is an optimal two-weight linear code meeting the Griesmer bound, and the dual codes of these two families are almost MDS codes. As a byproduct of this paper, a family of $[2^{4m-2},2m+1,2^{4m-3}]$ quaternary Hermitian self-dual codes is obtained for $m \geq 2$. As an application, several infinite families of 2-designs and 3-designs are also constructed from three families of linear codes in this paper. Comment: arXiv admin note: text overlap with arXiv:1804.06003, arXiv:2207.07262 by other authors.
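
    Since the abstract appeals to the Griesmer bound, a small generic helper (not taken from the paper) can be used to check whether stated $[n,k,d]$ parameters are length-optimal; the sample parameters below are illustrative.

```python
# Generic helper (not from the paper) for the Griesmer bound cited above:
# a q-ary linear [n, k, d] code must satisfy n >= sum_{i=0}^{k-1} ceil(d / q^i).
# The sample parameters are illustrative, not claims about the paper's codes.

def griesmer_bound(k: int, d: int, q: int) -> int:
    """Smallest length permitted by the Griesmer bound for a q-ary [n, k, d] code."""
    return sum((d + q ** i - 1) // q ** i for i in range(k))   # integer ceiling

def meets_griesmer(n: int, k: int, d: int, q: int) -> bool:
    return n == griesmer_bound(k, d, q)

# The binary simplex code [7, 3, 4] meets the bound: 4 + 2 + 1 = 7.
print(griesmer_bound(3, 4, 2), meets_griesmer(7, 3, 4, 2))
# Quaternary parameters [2^(4m-2), 2m+1, 2^(4m-3)] from the abstract, with m = 2.
print(griesmer_bound(5, 32, 4), meets_griesmer(64, 5, 32, 4))
```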