
    Processing strategies. QLIF subproject 5: Development of a framework for the design of minimum and low input processing strategies that guarantee food quality and safety

    Organic processing standards already prohibit many chemicals, additives and preservatives that are used in the processing of conventional foods. Yet the underlying principles of organic food processing are frequently debated with regard to aspects such as environmentally friendly processing, minimal use of additives, sensory quality, and transportation. It is therefore essential to develop a framework or code of practice that can be used to determine whether novel processing strategies are compatible with organic processing standards and principles, as well as with consumer demands and expectations concerning the quality characteristics of processed food. In the present subproject, part of the research comprised an assessment of alternative sanitising protocols under controlled laboratory conditions in the processing of fresh-cut lettuce and mixtures of fresh-cut vegetables. These studies found that, with regard to ecological aspects, ozone is a good alternative to existing disinfectants, such as chlorine, in the organic field. Further research assessed processing technologies that may improve the nutritional composition of dairy products, such as an increased content of conjugated linoleic acids (CLA).

    Standard Model physics at the LHC

    In this contribution to the volume "From My Vast Repertoire -- The Legacy of Guido Altarelli" I discuss selected topics of Standard Model (SM) physics at the LHC and their connection to the work of Guido Altarelli. An overarching connection is given by Guido's fundamental contributions to our understanding of the strong force, parton distribution functions and their evolution, and techniques for calculating theoretical predictions in perturbative Quantum Chromodynamics. The increasingly precise LHC measurements of diverse processes can be confronted with SM predictions built on these foundations to test the SM and contribute to our knowledge of proton structure. The LHC experiments have also begun to perform precise measurements of SM parameters, such as the mass of the W boson, and have measured a variety of processes sensitive to interactions of multiple electroweak bosons. The discovery of the Higgs boson in 2012 and the measurements of its properties are crucial for establishing the nature of the mechanism of electroweak symmetry breaking, and the status of these studies is summarised. (24 pages, 13 figures)

    Complex systems in the history of American English

    Kretzschmar 2009 has demonstrated that language in use (speech, as opposed to the linguistic systems usually described by linguists) satisfies the conditions for complex systems as defined in sciences such as physics, evolutionary biology, and economics. This finding has strong methodological consequences for the study of the history of American English. This paper discusses implications for the initial formation of American English and its varieties, with reference to Schneider 2007, as the product of random interactions between speakers of different input varieties of English. It also considers the westward expansion of American dialects, with reference to Kretzschmar 1996, as an effect of proximity, especially along settlement routes. Finally, it describes how sociolinguistic discussions of more recent change should also be understood as occurring within the different intersecting scales of complex systems of speech in America.

    African American voices in Atlanta

    Survey research in Atlanta suggests that the usual national generalizations about race and language need to be examined in the light of local evidence. The Linguistic Atlas of the Gulf States preserves recordings of interviews with a number of African Americans from the 1970s, setting a historical baseline for the community. A contemporary random-sample study of African Americans in Atlanta showed that our speakers were highly variable in their vowel production. Not only did they fail to match national generalizations, but they appeared to have more of Labov's "Southern Shift" than the local non-African-American speakers who were supposed to be characterized by it. Only a minority of speakers show "mean" behavior for the whole set of vowels. Still, black/white speech relations in the Atlanta metro area create perceptions such that a child from a historic African American neighborhood in Roswell had to "learn how to talk hood" to fit in with children from the Atlanta public schools. And Atlanta, with its central place in the hip-hop community alongside New York and Los Angeles, maintains an identity on the national scene with roots in local speech. History and contemporary evidence combine to show that African American voices in Atlanta belong to a complex system in which speakers can be themselves in their neighborhoods, while at the same time they participate in historical and national trends.

    Approaches Used in Organic and Low Input Food Processing – Impact on Food Quality and Safety. Results of a Delphi survey from an expert consultation in 13 European countries.

    The overall objective of the subproject on processing, of which the Delphi expert survey was an important task, is "to develop a framework for the design of 'minimum' and 'low input' processing strategies, which guarantee food quality and safety." It supports the overall aim of the integrated QLIF project (Quality of Low-Input Food): improving quality, ensuring safety and reducing costs along the European organic and "low input" food supply chains through research, dissemination and training activities. The work was carried out as a two-step Delphi survey. In the first round, 250 experts in 13 European countries were asked to respond to a standardised questionnaire in October and November 2004; the second round ran from March to May 2005. The survey was designed so that the most important and currently discussed aspects of organic food processing were taken up. 120 experts from 13 countries (Austria, Belgium, Czech Republic, Denmark, Finland, France, Germany, Great Britain, Italy, Netherlands, Slovakia, Spain, and Switzerland) answered the first round, and 83 experts from 13 countries answered the second round. Based on experience from other EU projects (Hamm et al. 2002), the countries were classified according to their stage of organic market development.

    Complex systems and the history of the English language

    Complexity theory (Mitchell 2009, Kretzschmar 2009) is something that historical linguists not only can use but should use in order to improve the relationship between the speech we observe in historical settings and the generalizations we make from it. Complex systems, as described in physics, ecology, and many other sciences, are made up of massive numbers of components interacting with one another, and this interaction results in self-organization and emergent order. For speech, the "components" of a complex system are all of the possible variant realizations of linguistic features as they are deployed by human agents, speakers and writers. The order that emerges in speech is simply the fact that our use of words and other linguistic features is significantly clustered in the spatial, social, and textual groups in which we actually communicate. Order emerges from such systems by means of self-organization, but the order that arises from speech is not the same as what linguists study under the rubric of linguistic structure. In both texts and regional/social groups, the frequency distribution of features follows the same pattern: an asymptotic hyperbolic curve (or "A-curve"). Formal linguistic systems, grammars, are thus not the direct result of the complex system, and historical linguists must use complexity to mediate between the language production observed in the community and the grammars we describe. The history of the English language does not proceed as regularly as clockwork, and an understanding of complex systems helps us to see why and how, and suggests what we can do about it. First, the scaling property of complex systems tells us that there are no representative speakers, so our observation of any small group of speakers is unlikely to represent any group at a larger scale, and limited evidence is the necessary condition of many of our historical studies. The fact that underlying complex distributions follow the 80/20 rule, i.e. 80% of the word tokens in a data set will be instances of only 20% of the word types, while the other 80% of the word types will amount to only 20% of the tokens, gives us an effective tool for estimating the status of historical states of the language. Such a frequency-based technique is opposed to the typological "fit" technique, which relies on a few texts that can be reliably located in space and which may not account for the crosscutting effects of text type, another dimension in which the 80/20 rule applies. Besides issues of sampling, the frequency-based approach also affects how we can think about change. The A-curve immediately translates to the S-curve now used to describe linguistic change, and explains why "change" cannot reasonably be considered a qualitative shift. Instead, we can use the model of "punctuated equilibrium" from evolutionary biology (e.g., see Gould and Eldredge 1993), which suggests that multiple changes occur simultaneously and compete, rather than the older idea of "phyletic gradualism" in evolution that corresponds to the traditional method of historical linguistics. The Great Vowel Shift, for example, is a useful overall generalization, but complex systems and punctuated equilibrium explain why we should not expect it ever to be "complete" or to appear in the same form in different places. These applications of complexity can help us to understand and interpret our existing studies better, and suggest how new studies in the history of the English language can be made more valid and reliable.
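    The 80/20 pattern described in this abstract can be sketched numerically. The following is a minimal illustration under an idealized assumption, a Zipf frequency curve in which token frequency is proportional to 1/rank, rather than actual Linguistic Atlas or corpus counts:

    ```python
    # Sketch of the "A-curve" / 80/20 claim: under an idealized Zipf
    # distribution, the most frequent 20% of word types account for
    # roughly 80% of all word tokens. Hypothetical data, not Atlas counts.

    def top_share(n_types: int, top_fraction: float = 0.2) -> float:
        """Fraction of all tokens covered by the most frequent
        `top_fraction` of word types, assuming frequency ~ 1/rank."""
        freqs = [1.0 / rank for rank in range(1, n_types + 1)]
        k = int(n_types * top_fraction)
        return sum(freqs[:k]) / sum(freqs)

    if __name__ == "__main__":
        # With 1,000 word types, the top 20% of types cover ~78-79%
        # of tokens, close to the 80/20 rule the abstract describes.
        print(f"{top_share(1000):.3f}")
    ```

    The exact share depends on vocabulary size and the Zipf exponent, so this only shows that a hyperbolic frequency curve naturally produces the asymmetry the abstract relies on.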

    Creation of regions for dialect features using a cellular automaton

    An issue in dialect research has been how to make generalizations from survey data about where some dialect feature might be found. Pre-computational methods included drawing isoglosses or using shadings to indicate areas where an analyst expected a feature to be found. The use of computers allowed faster plotting of the locations where any given feature had been elicited, and also allowed statistical techniques from technical geography to be used to estimate regions where particular features might be found. However, using the computer did not make the analysis less subjective than isoglosses, and statistical methods from technical geography have turned out to be of limited use. We have prepared a cellular automaton (CA) for use with data collected for the Linguistic Atlas Project that can address the problems involved in this type of data visualization. The CA plots the locations where survey data were elicited, and then through the application of rules creates an estimate of the spatial distributions of selected features. The application of simple rules allows the CA to create objective and reproducible estimates based on the data it was given, without the use of statistical methods.
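    The general idea, seeding survey locations on a grid and applying a simple neighbour rule to grow a region estimate, can be sketched as follows. The rule used here (a cell turns on when at least two of its eight neighbours are on) and the grid layout are illustrative assumptions, not the Linguistic Atlas Project's actual rule set:

    ```python
    # Minimal cellular-automaton sketch: 1 = survey site where the
    # feature was elicited, 0 = unknown. Each step fills in cells with
    # two or more "on" neighbours (an assumed rule), so a deterministic,
    # reproducible region grows around the seeded survey points.

    def step(grid):
        rows, cols = len(grid), len(grid[0])
        new = [row[:] for row in grid]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 1:
                    continue  # already part of the estimated region
                on = sum(
                    grid[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))
                    if (rr, cc) != (r, c)
                )
                if on >= 2:
                    new[r][c] = 1
        return new

    def estimate_region(grid, iterations=3):
        for _ in range(iterations):
            grid = step(grid)
        return grid

    if __name__ == "__main__":
        # Three nearby survey sites where the feature was recorded:
        seed = [[0] * 6 for _ in range(5)]
        for r, c in [(1, 1), (2, 2), (1, 3)]:
            seed[r][c] = 1
        for row in estimate_region(seed):
            print("".join("#" if cell else "." for cell in row))
    ```

    Because the rule is fixed and the update is synchronous, the same input always yields the same region, which is the reproducibility property the abstract contrasts with hand-drawn isoglosses.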