29 research outputs found

    Prediction of Extracellular Proteases of the Human Pathogen Helicobacter pylori Reveals Proteolytic Activity of the Hp1018/19 Protein HtrA

    Exported proteases of Helicobacter pylori (H. pylori) are potentially involved in pathogen-associated disorders leading to gastric inflammation and neoplasia. By comprehensively screening the H. pylori proteome for predicted secreted proteases, we retrieved several candidate genes. We detected caseinolytic activity for several of these proteases, which are released independently of the H. pylori type IV secretion system encoded by the cag pathogenicity island (cagPAI). Among them, we found the predicted serine protease HtrA (Hp1019), previously identified in the bacterial secretome of H. pylori. Importantly, we further found that the H. pylori genes hp1018 and hp1019 constitute a single gene that likely codes for an exported protein. Here, we directly verified the proteolytic activity of HtrA in vitro and identified the HtrA protease in zymograms by mass spectrometry. Overexpressed and purified HtrA exhibited pronounced proteolytic activity, which was abolished by mutating Ser205 to alanine in the predicted active center of HtrA. These data demonstrate that H. pylori secretes HtrA as an active protease, which might represent a novel candidate target for therapeutic intervention strategies.
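
    A minimal sketch of the kind of secretome screen the abstract describes, assuming Biopython and a local FASTA copy of the H. pylori proteome ("hp_proteome.fasta" is a placeholder); the keyword filter and the is_predicted_secreted() heuristic, a crude stand-in for a dedicated signal-peptide predictor, are illustrative assumptions rather than the authors' actual pipeline.

```python
# Hedged sketch: shortlist candidate secreted proteases from a proteome FASTA.
# "hp_proteome.fasta", the keyword list, and the secretion heuristic are
# illustrative placeholders, not the study's actual inputs or methods.
from Bio import SeqIO

PROTEASE_KEYWORDS = ("protease", "peptidase", "htra")  # assumed annotation filter

def is_predicted_secreted(seq: str) -> bool:
    """Crude stand-in for a signal-peptide predictor: looks for a
    hydrophobic stretch within the first 30 residues."""
    hydrophobic = sum(aa in "AILMFWV" for aa in seq[:30])
    return hydrophobic >= 12  # arbitrary threshold chosen for the sketch

candidates = []
for record in SeqIO.parse("hp_proteome.fasta", "fasta"):
    annotated = any(k in record.description.lower() for k in PROTEASE_KEYWORDS)
    if annotated and is_predicted_secreted(str(record.seq)):
        candidates.append(record.id)

print(len(candidates), "candidate secreted proteases:", candidates)
```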

    Domain Organization of Long Signal Peptides of Single-Pass Integral Membrane Proteins Reveals Multiple Functional Capacity

    Targeting signals direct proteins to their extra- or intracellular destinations, such as the plasma membrane or cellular organelles. Here we investigated the structure and function of exceptionally long signal peptides encompassing at least 40 amino acid residues. We discovered a two-domain organization (“NtraC model”) in many long signals from vertebrate precursor proteins. Accordingly, long signal peptides may contain an N-terminal domain (N-domain) and a C-terminal domain (C-domain) with different signal or targeting capabilities, separated by a presumably turn-rich transition area (tra). Individual domain functions were probed in cellular targeting experiments with fusion proteins containing parts of the long signal peptide of the human membrane protein shrew-1 and secreted alkaline phosphatase as a reporter. As predicted, the N-domain of the fusion protein alone acted as a mitochondrial targeting signal, whereas the C-domain alone functioned as an export signal. Selective disruption of the transition area in the signal peptide impaired the export efficiency of the reporter protein. Altogether, the results of these cellular targeting studies provide a proof of principle for our NtraC model and highlight the particular functional importance of the predicted transition area, which critically affects the rate of protein export. In conclusion, the NtraC approach enables the systematic detection and prediction of cryptic targeting signals present in one coherent sequence, and provides a structurally motivated basis for decoding the functional complexity of long protein targeting signals.
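
    As a rough illustration of the two-domain idea, the sketch below splits a long signal peptide into putative N-domain, transition area (tra), and C-domain by locating the window richest in turn-promoting residues; the residue set, window size, and threshold are assumptions made for illustration, not the published NtraC detection procedure.

```python
# Hedged sketch: partition a long signal peptide (>= 40 residues) into putative
# N-domain / transition area (tra) / C-domain. The turn-propensity heuristic
# (Gly/Pro/Ser/Asn density in a sliding window) is an illustrative assumption.
TURN_RESIDUES = set("GPSN")  # assumed turn-promoting residues
WINDOW = 6                   # assumed transition-area width

def split_ntrac(signal_peptide: str) -> dict:
    if len(signal_peptide) < 40:
        raise ValueError("NtraC organization was described for signals of >= 40 aa")
    # Pick the window richest in turn-promoting residues as the candidate tra.
    best_start, best_score = 0, -1
    for i in range(len(signal_peptide) - WINDOW + 1):
        score = sum(aa in TURN_RESIDUES for aa in signal_peptide[i:i + WINDOW])
        if score > best_score:
            best_start, best_score = i, score
    return {
        "N-domain": signal_peptide[:best_start],
        "tra": signal_peptide[best_start:best_start + WINDOW],
        "C-domain": signal_peptide[best_start + WINDOW:],
    }
```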

    Asthmakontrolle auf fünf Stufen [Asthma control in five steps]


    Document collections, mobilized regulations, and the making of customary law at the end of the Middle Ages

    Using late medieval examples from Switzerland, this paper argues that the emergence of formally organized archives around 1500 was part of an important shift in how documents could be deployed. This shift, however, was not away from an oral and toward a literate culture, as some earlier studies have argued, but rather away from seeing documents as testimony that reminded a community of past authoritative actors, and toward relating the texts of documents to other texts, that is, to contexts. The shift took place largely through the appropriation of methods for using and organizing written material that had been developed in the realms of scholastic theology and liturgy, and their application to secular lordship and administration. These methods provided new models for organizing collections of parchments and papers into connected archives, and gave rise to new forms of text collection, such as reorganized versions of law books (Spiegel, Coutumiers) containing new search tools such as tables of contents (capitulationes) and indices (abecedaria). Individual charters and scattered legal norms were also organized into textus–glossae structures in larger and smaller administrative units. In the Swiss case, the contextualization of legal texts was accompanied by an increased attribution of authority to ‘custom’ in general, because the community-oriented attribution of meaning found in earlier usage was lost. Ultimately, recasting individual documents as parts of larger textual contexts increased the power of rulers and ushered in an age of lawyers and of archives.

    Utilising urban context recognition and machine learning to improve the generalisation of buildings

    The introduction of automated generalisation procedures into map production systems requires that generalisation systems be capable of processing large amounts of map data in acceptable time, and that their cartographic quality be comparable to that of traditional map products. With respect to these requirements, we examine two complementary approaches intended to improve the generalisation systems currently in use by national topographic mapping agencies. Our focus is particularly on self-evaluating systems, taking as an example those that build on the multi-agent paradigm. The first approach aims to improve cartographic quality by utilising cartographic expert knowledge relating to spatial context. More specifically, we introduce expert rules for the selection of generalisation operations based on a classification of buildings into five urban structure types: inner city, urban, suburban, rural, and industrial and commercial areas. The second approach utilises machine learning techniques to extract heuristics that reduce the search space and hence the time in which a good cartographic solution is reached. Both approaches are tested individually and in combination for the generalisation of buildings from map scale 1:5 000 to the target scale of 1:25 000. Our experiments show improvements in both efficiency and effectiveness, and provide evidence that the two approaches complement each other: a combination of expert and machine-learnt rules gives better results than either approach alone. Both approaches are sufficiently general to be applicable to self-evaluating, constraint-based systems other than multi-agent systems, and to feature classes other than buildings. Remaining problems stem from the difficulty of formalising cartographic quality by means of constraints for controlling the generalisation process.
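
    A minimal sketch of the rule-lookup step described above, assuming the five urban structure types named in the abstract; the operator sequences themselves are illustrative assumptions, since the abstract does not state the actual expert rules used by the mapping agencies.

```python
# Hedged sketch: choose candidate generalisation operations for a building
# from its urban structure type, pruning the search space an agent explores.
# The operator sequences are assumed for illustration only.
from enum import Enum

class StructureType(Enum):
    INNER_CITY = "inner city"
    URBAN = "urban"
    SUBURBAN = "suburban"
    RURAL = "rural"
    INDUSTRIAL_COMMERCIAL = "industrial and commercial"

# Expert rules (assumed): which operators to try, and in which order.
OPERATOR_RULES = {
    StructureType.INNER_CITY: ["aggregate", "simplify"],
    StructureType.URBAN: ["simplify", "enlarge"],
    StructureType.SUBURBAN: ["displace", "simplify"],
    StructureType.RURAL: ["enlarge", "displace"],
    StructureType.INDUSTRIAL_COMMERCIAL: ["simplify", "aggregate"],
}

def operations_for(structure_type: StructureType) -> list:
    """Return the candidate operator sequence for a building's context,
    narrowing the search before constraint-based evaluation."""
    return OPERATOR_RULES[structure_type]

# Example: a suburban building would be considered for displacement first.
print(operations_for(StructureType.SUBURBAN))
```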