
    From sequence-defined macromolecules to macromolecular pin codes

    Dynamic sequence-defined oligomers carrying a chemically written pin code are obtained through a strategy combining multicomponent reactions with the thermoreversible addition of 1,2,4-triazoline-3,5-diones (TADs) to indole substrates. The precision oligomers are specifically designed to be encrypted upon heating as a result of the random reshuffling of the TAD-indole covalent bonds within the backbone, thereby scrambling the encoded information. The encrypted pin code can subsequently be decrypted in a second heating step, which enables the macromolecular pin code to be deciphered using 1D electrospray ionization-mass spectrometry (ESI-MS). This encryption/decryption concept represents a key advance over current strategies, which typically rely on uncontrolled degradation to erase information and on tandem mass spectrometry (MS/MS) to analyze, decipher, and read out chemically encrypted information. Additionally, the synthesized macromolecules are coated onto a high-value polymer material, demonstrating their potential as coded product tags for anti-counterfeiting purposes.
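    The readout principle behind such a macromolecular pin code can be caricatured in a few lines: each digit is written as a monomer of distinct mass, and the sequence is recovered from the mass differences between successive chain fragments, as a mass spectrometer would report them. This is a toy model, not the paper's chemistry; the monomer masses and function names are entirely hypothetical.

    ```python
    # Toy model of a sequence-encoded pin read back by mass spectrometry.
    # All masses are hypothetical and chosen only to be mutually distinct.
    MONOMER_MASS = {d: 100.0 + 7.0 * d for d in range(10)}

    def fragment_masses(pin):
        """Cumulative masses of chain fragments 1..n (the simulated 'spectrum')."""
        masses, total = [], 0.0
        for digit in pin:
            total += MONOMER_MASS[digit]
            masses.append(total)
        return masses

    def decode(masses):
        """Recover the pin from successive mass differences between fragments."""
        inverse = {round(m, 3): d for d, m in MONOMER_MASS.items()}
        pin, prev = [], 0.0
        for m in masses:
            pin.append(inverse[round(m - prev, 3)])
            prev = m
        return pin

    pin = [4, 0, 2, 7]
    spectrum = fragment_masses(pin)
    print(decode(spectrum))  # → [4, 0, 2, 7]
    ```

    Scrambling the backbone (the encryption step) corresponds to permuting the monomer order, which destroys the fragment-mass series until the sequence is restored.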

    Measuring Infringement of Intellectual Property Rights

    © Crown Copyright 2014. You may re-use this information (excluding logos) free of charge in any format or medium, under the terms of the Open Government Licence. To view this licence, visit http://www.nationalarchives.gov.uk/doc/open-government-licence/. Where we have identified any third-party copyright information, you will need to obtain permission from the copyright holders concerned.

    The review is wide-ranging in scope, and overall our findings evidence a lack of appreciation, among those producing research, of the high-level principles of measurement and assessment of scale. To date, the approaches adopted by industry seem designed more for internal consumption and are usually contingent on particular technologies and/or sector perspectives. There is typically a lack of transparency in the methodologies and data used to form the basis of claims, making much of this an unreliable basis for policy formulation. The research approaches we found are characterised by a preference for reactive approaches that seek to establish snapshots of an important issue at the time of investigation. Most studies are ad hoc in nature, and on the whole we found a lack of sustained longitudinal approaches that would develop an appreciation of change. Typically, the studies are designed to address specific hypotheses that might serve to support the position of the particular commissioning body.

    To help bring some structure to this area, we propose a framework for assessing the volume of infringement in each area. The underlying aim is to draw out a common approach wherever possible, rather than being drawn initially to the differences between fields. We advocate ongoing survey tracking of the attitudes, perceptions and, where practical, behaviours of both perpetrators and claimants in IP infringement. Clearly, the nature of perpetrators, claimants and enforcement differs within each IPR, but in our view the assessment for each IPR should include all of these elements. The key element of the survey structure is the adoption of a survey sampling methodology with smaller volumes of representative participation. Once selection is given the appropriate priority, a traditional offline survey will have a part to play; but as the opportunity arises, new technological methodologies, particularly for the voluntary monitoring of online behaviour, can add detail to the overall assessment of the scale of activity. This framework can be applied within each of the IP right sectors: copyright, trademarks, patents, and design rights. The costs involved with this common approach may well be mitigated by a syndicated approach to the survey elements. Indeed, a syndicated approach has advantages beyond cost: it could be designed to reduce any tendency either to hide inappropriate or illegal activity or to exaggerate its volume to fit the theme of the survey, and it allows for monthly assessments of attitudes rather than being vulnerable to unmeasured seasonal impacts.

    Social Media’s impact on Intellectual Property Rights

    This is a draft chapter. The final version is available in the Handbook of Research on Counterfeiting and Illicit Trade, edited by Peggy E. Chaudhry, published in 2017 by Edward Elgar Publishing Ltd, https://doi.org/10.4337/9781785366451. This material is for private use only and cannot be used for any other purpose without further permission of the publisher. Peer reviewed.

    Nano-artifact metrics based on random collapse of resist

    Artifact metrics is an information security technology that uses the intrinsic characteristics of a physical object for authentication and clone resistance. Here, we demonstrate nano-artifact metrics based on silicon nanostructures formed via an array of resist pillars that randomly collapse when exposed to electron-beam lithography. The proposed technique uses conventional and scalable lithography processes, and because of the random collapse of resist, the resultant structure has extremely fine-scale morphology with a minimum dimension below 10 nm, which is less than the resolution of current lithography capabilities. By evaluating false match, false non-match and clone-resistance rates, we show that the nanostructured patterns based on resist collapse satisfy the requirements for high-performance security applications.
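    The false match and false non-match rates mentioned above are standard biometric-style figures of merit: at a chosen decision threshold, the fraction of impostor (different-device) comparisons accepted and the fraction of genuine (same-device, re-measured) comparisons rejected. A minimal sketch, assuming hypothetical Gaussian score distributions purely for illustration (the paper derives its scores from measured nanostructure images):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical similarity scores in [0, 1]: genuine pairs (same artifact,
    # measured twice) score high; impostor pairs (different artifacts) score low.
    genuine = rng.normal(0.85, 0.05, 1000)
    impostor = rng.normal(0.30, 0.08, 1000)

    def fmr_fnmr(genuine, impostor, threshold):
        """False match rate: impostors accepted. False non-match rate: genuines rejected."""
        fmr = float(np.mean(impostor >= threshold))
        fnmr = float(np.mean(genuine < threshold))
        return fmr, fnmr

    for t in (0.4, 0.5, 0.6):
        fmr, fnmr = fmr_fnmr(genuine, impostor, t)
        print(f"threshold={t:.1f}  FMR={fmr:.4f}  FNMR={fnmr:.4f}")
    ```

    Sweeping the threshold trades the two error rates against each other; well-separated score distributions, as the paper reports for collapsed-resist patterns, allow both rates to be driven low simultaneously.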

    The entropy of keys derived from laser speckle

    Laser speckle has been proposed in a number of papers as a high-entropy source of unpredictable bits for use in security applications. Bit strings derived from speckle can be used for a variety of security purposes such as identification, authentication, anti-counterfeiting, secure key storage, random number generation and tamper protection. The choice of laser speckle as a source of random keys is quite natural, given the chaotic properties of speckle. However, this same chaotic behaviour also causes reproducibility problems. Cryptographic protocols require either zero noise or very low noise in their inputs; hence the issue of error rates is critical to applications of laser speckle in cryptography. Most of the literature uses an error reduction method based on Gabor filtering. Though the method is successful, it has not been thoroughly analysed. In this paper we present a statistical analysis of Gabor-filtered speckle patterns. We introduce a model in which perturbations are described as random phase changes in the source plane. Using this model we compute the second- and fourth-order statistics of Gabor coefficients. We determine the mutual information between perturbed and unperturbed Gabor coefficients and the bit error rate in the derived bit string. The mutual information provides an absolute upper bound on the number of secure bits that can be reproducibly extracted from noisy measurements.
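    The pipeline described above (speckle formation, a random-phase perturbation in the source plane, Gabor filtering, binarization, bit error rate) can be sketched end to end. This is a toy numerical model under stated assumptions: the speckle is simulated as the Fourier transform of a random phase screen, and the bits are taken as the sign of the real Gabor coefficients; the paper's actual optical setup and quantization scheme may differ.

    ```python
    import numpy as np
    from scipy.signal import convolve2d

    rng = np.random.default_rng(1)

    def speckle_amplitude(phase):
        """Toy speckle: amplitude of the far-field pattern of a random phase screen."""
        return np.abs(np.fft.fft2(np.exp(1j * phase)))

    def gabor_kernel(freq=0.25, sigma=3.0, size=15):
        """Complex Gabor kernel (Gaussian envelope times a plane wave along x)."""
        ax = np.arange(size) - size // 2
        x, y = np.meshgrid(ax, ax)
        return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.exp(2j * np.pi * freq * x)

    def bits_from_speckle(amplitude, kernel):
        """Binarize the sign of the real Gabor coefficients into a key bit string."""
        coeffs = convolve2d(amplitude, kernel, mode="valid")
        return coeffs.real > 0

    n = 64
    phase = rng.uniform(0, 2 * np.pi, (n, n))
    # The paper's perturbation model: small random phase changes in the source plane.
    perturbed = phase + rng.normal(0, 0.1, (n, n))

    kernel = gabor_kernel()
    b0 = bits_from_speckle(speckle_amplitude(phase), kernel).ravel()
    b1 = bits_from_speckle(speckle_amplitude(perturbed), kernel).ravel()
    ber = float(np.mean(b0 != b1))
    print(f"bit error rate under small phase perturbation: {ber:.3f}")
    ```

    A small phase perturbation leaves the Gabor coefficients strongly correlated, so the bit error rate stays well below the 0.5 expected for independent patterns; cryptographic use then requires an error-correction (fuzzy extractor) stage, whose leakage is bounded by the mutual information analysed in the paper.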