
    Subpopulations of Mononuclear Cells in Microscopic Lesions of Psoriatic Patients. Selective Accumulation of Suppressor/Cytotoxic T Cells in Epidermis During the Evolution of the Lesion

    The age of microscopic lesions in psoriatic subjects was assessed from the stacking characteristics of the horny layer and related to the type and density (cells per tissue volume) of mononuclear cells in the epidermis and the dermis, determined by immunoperoxidase methods using monoclonal antibodies. Pan T cells (Lyt-2+, Lyt-3+, Leu-4+, OKT3+), T helper cells (Leu-3a+, OKT4+), T suppressor/cytotoxic cells (Leu-2a+, OKT8+), Ia+ cells and monocytes (OKM2+, BRL αmono+) were determined in epidermis and dermis. The psoriatic lesion was divided into regions underneath a parakeratotic and an orthohyperkeratotic/hypergranular portion of the horny layer and contrasted with perilesional and uninvolved psoriatic skin as well as with healthy skin. Across the regions and skin layers, cell density was highest beneath parakeratosis and decreased toward normal values with decreasing histologic abnormality. The relation between epidermal and dermal cell densities of the T-cell subsets was altered in involved psoriatic skin, with a selective preponderance of T suppressor/cytotoxic cells in the epidermis. This accumulation was present in the youngest lesion found (3 days old), and cell densities were unchanged in older lesions. The findings suggest that the altered relationship between T-cell subsets plays an important role in the induction and progression of the psoriatic process in the skin.

    Data-driven decoding of quantum error correcting codes using graph neural networks

    To leverage the full potential of quantum error-correcting stabilizer codes it is crucial to have an efficient and accurate decoder. Accurate maximum-likelihood decoders are computationally very expensive, whereas decoders based on more efficient algorithms give sub-optimal performance. In addition, their accuracy depends on the quality of the models and estimates of error rates for idling qubits, gates, measurements, and resets, and they typically assume symmetric error channels. In this work we instead explore a model-free, data-driven approach to decoding, using a graph neural network (GNN). The decoding problem is formulated as a graph classification task in which a set of stabilizer measurements is mapped to an annotated detector graph for which the neural network predicts the most likely logical error class. We show that the GNN-based decoder can outperform a matching decoder for circuit-level noise on the surface code given only simulated experimental data, even when the matching decoder is given full information about the underlying error model. Although training is computationally demanding, inference is fast and scales approximately linearly with the space-time volume of the code. We also find that we can use large, but more limited, datasets of real experimental data [Google Quantum AI, Nature 614, 676 (2023)] for the repetition code, giving decoding accuracies that are on par with minimum-weight perfect matching. The results show that a purely data-driven approach to decoding may be a viable future option for practical quantum error correction that is competitive in terms of speed, accuracy, and versatility. Comment: 15 pages, 12 figures.
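    The abstract casts decoding as a graph classification problem: each syndrome becomes an annotated detector graph and the network predicts a logical error class. As a rough, hedged illustration of that formulation only (not the authors' implementation), the sketch below uses PyTorch Geometric; the node features, graph construction, layer choices, and class count are all illustrative assumptions.

```python
# Minimal sketch: syndrome decoding as graph classification with a GNN.
# Node features, edges, and hyperparameters are illustrative assumptions,
# not the architecture used in the paper.
import torch
from torch import nn
from torch_geometric.nn import GraphConv, global_mean_pool
from torch_geometric.data import Data

class GNNDecoder(nn.Module):
    def __init__(self, num_node_features=4, hidden=64, num_classes=2):
        super().__init__()
        self.conv1 = GraphConv(num_node_features, hidden)
        self.conv2 = GraphConv(hidden, hidden)
        self.head = nn.Linear(hidden, num_classes)    # logits per logical class

    def forward(self, x, edge_index, batch):
        x = torch.relu(self.conv1(x, edge_index))
        x = torch.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)                # one vector per graph
        return self.head(x)

# One hypothetical detector graph: nodes are fired detectors, features could
# encode space-time coordinates; edges connect nearby detectors.
x = torch.tensor([[0.0, 1.0, 2.0, 0.0],
                  [1.0, 1.0, 3.0, 0.0]])
edge_index = torch.tensor([[0, 1], [1, 0]])           # undirected edge 0-1
graph = Data(x=x, edge_index=edge_index)
batch = torch.zeros(x.size(0), dtype=torch.long)      # all nodes in graph 0

logits = GNNDecoder()(graph.x, graph.edge_index, batch)
print(logits.shape)  # torch.Size([1, 2])
```

    The appeal of this formulation, as the abstract notes, is that the expensive work happens during training, while inference reduces to a single forward pass whose cost grows roughly with the size of the detector graph.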

    Error-rate-agnostic decoding of topological stabilizer codes

    Efficient high-performance decoding of topological stabilizer codes has the potential to crucially improve the balance between logical failure rates and the number and individual error rates of the constituent qubits. High-threshold maximum-likelihood decoders require an explicit error model for Pauli errors to decode a specific syndrome, whereas lower-threshold heuristic approaches such as minimum-weight matching are "error agnostic". Here we consider an intermediate approach, formulating a decoder that depends on the bias, i.e., the relative probability of phase-flip to bit-flip errors, but is agnostic to the error rate. Our decoder is based on counting the number and effective weight of the most likely error chains in each equivalence class of a given syndrome. We use Metropolis-based Monte Carlo sampling to explore the space of error chains and find unique chains, which are efficiently identified using a hash table. Using the error-rate invariance, the decoder can sample chains effectively at an error rate higher than the physical error rate, without the need for "thermalization" between chains in different equivalence classes. Applied to the surface code and the XZZX code, the decoder matches maximum-likelihood decoders for moderate code sizes or low error rates. We anticipate that, because of the compressed information content per syndrome, the decoder can be exploited to full advantage in combination with machine-learning methods to extrapolate Monte Carlo-generated data. Comment: 15 pages, 9 figures; V2 added analysis of low error-rate performance.
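    To make the sampling idea concrete, here is a heavily hedged toy sketch of Metropolis exploration of syndrome-preserving error chains, with unique chains collected per equivalence class in a hash-based set. The error model (bit-flip only), the proposal move (applying a random stabilizer loop), the acceptance rule, and the class labelling are illustrative assumptions, not the decoder from the paper.

```python
# Toy sketch: Metropolis sampling of error chains, bucketing unique chains
# by equivalence class.  Chains are sets of flipped qubits; applying a
# stabilizer "loop" preserves the syndrome.  All details are illustrative.
import math
import random
from collections import defaultdict

def sample_chains(initial_chain, stabilizer_loops, logical_support,
                  beta=1.0, steps=10_000):
    chain = frozenset(initial_chain)              # current error chain
    unique = defaultdict(set)                     # class -> set of chains seen
    for _ in range(steps):
        loop = random.choice(stabilizer_loops)    # syndrome-preserving move
        proposal = chain ^ loop                   # symmetric difference
        d_weight = len(proposal) - len(chain)
        # Metropolis rule: always accept lighter chains, heavier ones with
        # probability exp(-beta * weight increase).
        if d_weight <= 0 or random.random() < math.exp(-beta * d_weight):
            chain = proposal
        cls = len(chain & logical_support) % 2    # toy equivalence-class label
        unique[cls].add(chain)                    # hash set keeps chains unique
    return unique

# Toy usage: 5 qubits, two stabilizer loops, one logical operator support.
loops = [frozenset({0, 1}), frozenset({2, 3})]
buckets = sample_chains(frozenset({0}), loops, logical_support=frozenset({0, 2, 4}))
print({cls: len(chains) for cls, chains in buckets.items()})
```

    In the decoder described by the abstract, the counts and effective weights of the unique chains found in each class would then be compared to pick the most likely logical equivalence class for the syndrome.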

    Long-term effects of no-take zones in Swedish waters

    Marine protected areas (MPAs) are increasingly established worldwide to protect and restore degraded ecosystems. However, the level of protection varies among MPAs and has been found to affect the outcome of the closure. In no-take zones (NTZs), no fishing or extraction of marine organisms is allowed. The EU Commission recently committed to protecting 30% of European waters by 2030 through the updated Biodiversity Strategy. Importantly, one third of these 30% should be under strict protection. Exactly what is meant by strict protection is not entirely clear, but fishing would likely have to be fully or largely prohibited in these areas. This new target for strictly protected areas highlights the need to evaluate the ecological effects of NTZs, particularly in regions like northern Europe where such evaluations are scarce. The Swedish NTZs made up approximately two thirds of the total areal extent of NTZs in Europe a decade ago. Given that these areas have been closed for at least 10 years and can provide insights into the long-term effects of NTZs on fish and ecosystems, they are of broad interest in light of the new commitment by EU member states to strictly protect 10% of waters by 2030.

    In total, eight NTZs in Swedish coastal and offshore waters were evaluated in the current report, primarily with respect to the responses of the focal species of the conservation measure, but in some of the areas also ecosystem responses. Five of the NTZs were established in 2009-2011 as part of a government commission, while the other three had been established earlier. The results of the evaluations are presented in a synthesis and in separate, more detailed chapters for each of the eight NTZs. Overall, the results suggest that NTZs can increase the abundance and biomass of fish and decapod crustaceans, provided that the closed areas are strategically placed and of an appropriate size in relation to the life cycle of the focal species. A meta-regression of the effects on the focal species of the NTZs showed that CPUE was on average 2.6 times higher than in fished reference areas after three years of protection, and 3.8 times higher after six years of protection. The proportion of old and large individuals increased in most NTZs, and thereby also the reproductive potential of the populations. The increase in abundance of large predatory fish also likely contributed to restoring ecosystem functions, such as top-down control. These effects appeared after a 5-year period and in many cases remained and continued to increase in the longer term (>10 years). In the two areas where cod was the focal species of the NTZs, positive responses were weak, likely as an effect of long-term past recruitment overfishing, which in the Kattegat is still ongoing. In the Baltic Sea, predation by grey seal and cormorant was in some cases so high that it likely counteracted the positive effects of removing fisheries and led to stock declines in the NTZs. In most cases, the introduction of the NTZs likely decreased the total fishing effort rather than displacing it to adjacent areas. In the Kattegat NTZ, however, the purpose was explicitly to displace an unselective coastal mixed bottom-trawl fishery targeting Norway lobster and flatfish to areas where the bycatches of mature cod were smaller. In two areas that were reopened to fishing after 5 years, the positive effects of the NTZs on fish stocks quickly eroded to pre-closure levels, even though the areas remained closed during the spawning period, highlighting that permanent closures may be necessary to maintain positive effects.

    We conclude from the Swedish case studies that NTZs may well function as a complement to other fisheries management measures, such as catch, effort and gear regulations. The experiences from the current evaluation show that NTZs can be an important tool for fisheries management, especially for local coastal fish populations and areas with mixed fisheries, as well as in cases where there is a need to counteract adverse ecosystem effects of fishing. NTZs are also needed as reference areas for marine environmental management, and for understanding the effects of fishing on fish populations and other ecosystem components in relation to other pressures. MPAs where the protection of both fish and their habitats is combined may be an important instrument for ecosystem-based management, where the recovery of large predatory fish may lead to the restoration of important ecosystem functions and contribute to improving degraded habitats.

    With the new Biodiversity Strategy, the EU's level of ambition for marine conservation increases significantly, with the goal of protecting 30% of coastal and marine waters by 2030 and, importantly, strictly protecting one third of these areas. From a conservation perspective, rare, sensitive and/or charismatic species or habitats are often in focus when designating MPAs, and displacement of fisheries is then considered an unwanted side effect. However, if the establishment of strictly protected areas also aims to rebuild fish stocks, these MPAs should be placed in heavily fished areas and designed to protect depleted populations by accounting for their home ranges in order to generate positive outcomes. Thus, extensive displacement of fisheries is required to achieve benefits for depleted populations, and this needs to be accounted for, e.g. by specific regulations outside the strictly protected areas. These new, extensive EU goals for MPA establishment pose a challenge for management, but at the same time offer an opportunity to bridge the current gap between conservation and fisheries management.

    Introduction: building the history of language learning and teaching (HoLLT)

    The papers presented in this issue are the result of a workshop held at the University of Nottingham in December 2012 as part of an Arts and Humanities Research Council research network Towards a History of Modern Foreign Language Teaching and Learning (2012–14) intended to stimulate historical research into language teaching and learning. This, the first workshop in the programme, focused on exchanging information on the history of language learning and teaching (HoLLT) across the different language traditions, for it had become clear to us that scholars working within their own language disciplines were often relatively unaware of work outside these. We hope that this special issue — with overview articles on the history of English, French, German, and Spanish as second/foreign languages — will help overcome that lack of awareness and facilitate further research collaboration. Charting the history of language teaching and learning will, in turn, make us all better informed in facing challenges and changes to policy and practice now and in the future. It is instructive in the current climate, for example, to realize that grave doubts were held about whether second foreign languages could survive alongside French in British schools in the early twentieth century (McLelland, forthcoming), or to look back at earlier attempts to establish foreign languages in primary schools (Bayley, 1989; Burstall et al., 1974; Hoy, 1977). As we write, language learning in England is undergoing yet more radical change. Language teaching for all children from the age of seven is being made compulsory in primary schools from 2014, while at Key Stage 3 (up to age 16), where a foreign language has not been compulsory since 2002, the most recent programme of study for England has virtually abandoned the recent focus on intercultural competence and now requires learners to ‘read great literature in the original language’,1 a radical change in emphasis compared to the previous half-century, which seems to reflect a very different view of what language learning is for. We seem to be little closer in 2014 than we were at the dawn of the twentieth century to answering with any certainty the questions that lie at the very foundations of language teaching: who should learn a foreign language, why learners learn, what they need to learn, and what we want to teach them — answers that we need before we can consider how we want to teach. The research programme begun under our research network is intended to help us to take ‘the long view’ on such questions

    Orders From the Cloud : Business Integration as a Service

    This thesis describes the development of a SOA-based architecture for integrating the purchasing processes of large, EDI-using manufacturing companies with smaller manufacturing companies that are not EDI-capable, using online services. The underlying need for this project lies in the fact that these small manufacturing companies risk missing out on business due to their inability to communicate via the industry-standard EDI format. At the same time, becoming EDI-capable involves significant investments in software licenses, connectivity services and consulting or training that these small companies may not be ready to make. The mentor company of this thesis project, System Andersson, produces resource planning software for this type of company and would like to provide them with an easy-to-use way of "jacking in" EDI support, without having to make such significant investments. Ideally, this feature would be developed as a standalone, subscription-based service to which their existing System Andersson software could connect, so that no further hardware or software would be needed on site. The EDI enablement should be so easy as to be entirely transparent to the end-user companies. The task handed to the author was thus to develop an architecture for such a subscription-based service. Furthermore, in order to promote reuse and simplify development, the architecture was to be based on SOA concepts. As a result of the project, such an architecture has been developed. The architecture details two services for translating, and storing for later delivery, a number of EDI message types of the EDIFACT variety. It also specifies communication protocols (SOAP over HTTPS and AS2 over HTTPS) and APIs (web services) for communicating with these services. These specifications can be used to implement a system that performs the necessary integration, so that the smaller companies may indeed communicate via EDI. The fitness of the developed architecture has been tested by implementing a prototype version of such a system, and it has also been validated by assessing how well it adheres to SOA design principles. All in all, the design appears to be sound and presents a working solution to the studied problem.
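    The core of such a translation service is mapping EDIFACT messages to the format the small company's ERP system understands. The following is a minimal, hedged sketch of that idea only: a simple segment/element split of an EDIFACT string and a mapping of a hypothetical ORDERS message to a plain dictionary. Segment choices and field positions are illustrative assumptions; in the thesis architecture this logic would sit behind a SOAP-over-HTTPS web service rather than run locally.

```python
# Hedged sketch of EDIFACT translation: split a message into segments,
# elements and components, then map a hypothetical ORDERS message to a
# plain order dictionary.  Field positions are illustrative assumptions.
def parse_edifact(message: str) -> list:
    """Segments are terminated by ', elements split by +, components by :."""
    segments = [s for s in message.strip().split("'") if s]
    return [[element.split(":") for element in seg.split("+")] for seg in segments]

def to_simple_order(segments) -> dict:
    order = {"lines": []}
    for seg in segments:
        tag = seg[0][0]
        if tag == "BGM":                           # document/message header
            order["order_number"] = seg[2][0]
        elif tag == "LIN":                         # line item
            order["lines"].append({"item": seg[3][0]})
        elif tag == "QTY" and order["lines"]:      # quantity for latest line
            order["lines"][-1]["quantity"] = int(seg[1][1])
    return order

sample = "UNH+1+ORDERS:D:96A:UN'BGM+220+ORDER123+9'LIN+1++ITEM001:EN'QTY+21:48'"
print(to_simple_order(parse_edifact(sample)))
# {'order_number': 'ORDER123', 'lines': [{'item': 'ITEM001', 'quantity': 48}]}
```

    A storage-for-later-delivery service, as described in the architecture, could then persist the parsed orders and expose them to the ERP client through the specified web service API.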

    Reasoning Performance Indicators for Ontology Design Patterns

    Ontologies are increasingly used in systems where performance is an important requirement. While there is a substantial body of work on ontology structures that affect reasoning performance, how these structures appear in Ontology Design Patterns (ODPs) is as yet relatively unknown. This paper surveys the existing literature on performance indicators in ontologies that are applicable to ODPs, and studies how those indicators are expressed in patterns published on two well-known ODP portals. Based on this, it proposes recommendations and design principles for the development of new ODPs.

    Towards an Ontology Design Pattern Quality Model

    The use of semantic technologies, and of Semantic Web ontologies in particular, has enabled many recent developments in information integration, search engines, and reasoning over formalised knowledge. Ontology Design Patterns have been proposed as a way of simplifying the development of Semantic Web ontologies by codifying and reusing modelling best practices. This thesis investigates the quality of Ontology Design Patterns. The main contribution of the thesis is a theoretically grounded and partially empirically evaluated quality model for such patterns, including a set of quality characteristics, indicators, measurement methods and recommendations. The quality model is based on established theory on information system quality, conceptual model quality, and ontology evaluation. It has been tested in a case study setting and in two experiments. The main findings of this thesis are that the quality of Ontology Design Patterns can be identified, formalised and measured, and furthermore, that these qualities interact in such a way that ontology engineers using patterns need to make trade-offs regarding which qualities they wish to prioritise. The developed model may aid them in making these choices. This work has been supported by Jönköping University.