
    How Knowledge Grows

    An argument that the development of scientific practice and the growth of scientific knowledge are governed by Darwin's evolutionary model of descent with modification. Although scientific investigation is influenced by our cognitive and moral failings, as well as by all of the factors impinging on human life, the historical development of scientific knowledge has trended toward an increasingly accurate picture of an increasing number of phenomena. Taking a fresh look at Thomas Kuhn's 1962 work, The Structure of Scientific Revolutions, in How Knowledge Grows Chris Haufe uses evolutionary theory to explain both why scientific practice develops the way it does and how scientific knowledge expands. This evolutionary model, claims Haufe, helps to explain what is epistemically special about scientific knowledge: its tendency to grow in both depth and breadth. Kuhn showed how intellectual communities achieve consensus in part by discriminating against ideas that differ from their own and by isolating themselves intellectually from other fields of inquiry and broader social concerns. These same characteristics, says Haufe, determine a biological population's degree of susceptibility to modification by natural selection. He argues that scientific knowledge grows, even across generations of variable groups of scientists, precisely because its development is governed by Darwinian evolution. Indeed, he supports the claim that this susceptibility to modification through natural selection helps to explain the epistemic power of certain branches of modern science. In updating and expanding the evolutionary approach to scientific knowledge, Haufe provides a model for thinking about science that acknowledges the historical contingency of scientific thought while showing why we nevertheless should trust the results of scientific research when it is the product of certain kinds of scientific communities.

    Working materials on the reception history of Weimar Classicism from 1933 to 1945: National Socialist instrumentalization, and hope and consolation for the persecuted; accompanying material for the exhibition „Anne Frank - Eine Geschichte für heute" ("Anne Frank - A History for Today")

    As we can gather from her diary, Anne's interest was directed above all toward contemporary Dutch literature. Her father's wish to acquaint the children with the dramas of Goethe and Schiller is documented by the diary entry quoted, but what this reading meant to Anne herself remains a matter of speculation. We do know from other sources, however, that many of those persecuted, expelled, imprisoned, and tormented by the National Socialists drew hope and found consolation in the works of the "classics" and their humanist message. At the same time, the regime appropriated this cultural heritage as propaganda for its nationalist and racist aims. The fact that the exhibition „Anne Frank - Eine Geschichte für heute" is now being shown in Weimar, the "city of the classics", in whose immediate vicinity lies the Buchenwald concentration camp, a place whose name has become a synonym for National Socialist terror, can therefore also be taken as an occasion to trace more closely these two highly contrasting strands of the reception history of Weimar Classicism, and in particular of the works of Goethe and Schiller, in the years from 1933 to 1945. The present working materials provide exemplary texts, with commentary, for this purpose.

    Analysis of the lightweight-construction potential of parametric fractal honeycombs under static loading using numerical methods

    For the planned exploitation of patent EP2114755B1 for a "lightweight construction with a fractally structured support structure", numerical calculations of the deformation and the von Mises equivalent stress of stiffened plates under a static surface load were carried out. The stiffening structure was a honeycomb structure fractally subdivided over several levels, whose shape is modelled on the shell geometry of diatoms. On the basis of a parametrically variable honeycomb structure, parameter studies were performed on a stiffened plate and on a sandwich plate with a fractal honeycomb core. It was shown that fractal honeycombs reduce the maximum deformation and the maximum equivalent stress compared with plates carrying conventional honeycombs of equal mass. Especially for small plate thicknesses, the fractal levels produced large reductions in deformation and equivalent stress. Moreover, with an increasing number of fractal levels, the influence of the honeycombs' design parameters on deformation and equivalent stress decreased. A software setup was built for the weight optimization of a fractally stiffened plate and of a sandwich plate with a fractal honeycomb core. Optimizations were carried out using an evolution strategy and an Adaptive Response Surface Method in order to demonstrate the suitability of parametric fractal honeycombs as a lightweight construction method. For both models, the evolution strategy achieved the better result.
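    The abstract mentions an evolution strategy among the optimization methods. As an illustration of the general principle only (the study optimized plate designs against a finite-element model, and its exact algorithm settings are not stated here), a minimal (1+1) evolution strategy with the classic 1/5 success rule might look like this:

```python
import random

def one_plus_one_es(objective, x0, sigma=0.5, iterations=200, seed=42):
    """Minimise `objective` with a (1+1) evolution strategy.

    Each step perturbs the current design with Gaussian noise; the mutant
    replaces its parent only if it improves the objective. The step size
    sigma is adapted with the 1/5 success rule every 20 steps.
    """
    rng = random.Random(seed)
    x, fx = list(x0), objective(x0)
    successes = 0
    for i in range(1, iterations + 1):
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = objective(cand)
        if fc < fx:                         # accept only strict improvements
            x, fx, successes = cand, fc, successes + 1
        if i % 20 == 0:                     # 1/5 rule: grow sigma if >20% success
            sigma *= 1.5 if successes > 4 else 0.5
            successes = 0
    return x, fx

# Stand-in objective: distance of two hypothetical design parameters from a
# target point. A real run would evaluate the finite-element plate model.
best, value = one_plus_one_es(lambda v: sum((vi - 1.0) ** 2 for vi in v),
                              [5.0, -3.0])
```

    In the study's setting the objective would score the mass or stiffness of a candidate honeycomb parameterization rather than this toy quadratic.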

    Automated Theorem Proving for General Game Playing

    While automated game playing systems like Deep Blue perform excellently within their domain, handling a different game or even a slight change of rules is impossible without intervention by the programmer. Considered a great challenge for Artificial Intelligence, General Game Playing is concerned with developing techniques that enable computer programs to play arbitrary, possibly unknown n-player games given nothing but the game rules in a tailor-made description language. A key to success in this endeavour is the ability to reliably extract hidden game-specific features from a given game description automatically. An informed general game player can efficiently play a game by exploiting structural game properties to choose the currently most appropriate algorithm, to construct a suitable heuristic, or to apply techniques that reduce the search space. In addition, an automated method for property extraction can assist in discovering specification bugs during game design by providing information about the mechanics of the currently specified game description. The recent extension of the description language to games with incomplete information and elements of chance further induces the need to detect game properties involving player knowledge at several stages of the game. In this thesis, we develop a formal proof method for the automatic acquisition of rich game-specific invariance properties. To this end, we first introduce a simple yet expressive property description language to address knowledge-free game properties which may involve arbitrary finite sequences of successive game states. We specify a semantics based on state transition systems over the Game Description Language, and develop a provably correct formal theory which allows one to show the validity of game properties, with respect to this semantics, across all reachable game states. Our proof theory does not require visiting every single reachable state. Instead, it applies an induction principle on the game rules based on the generation of answer set programs, allowing any off-the-shelf answer set solver to be applied to practically verify invariance properties even in complex games whose state space cannot be fully explored. To account for the extension of the description language to games with incomplete information and elements of chance, we extend our induction method, again provably correctly, to properties involving player knowledge. An extensive evaluation shows its practical applicability even in complex games.
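    The induction principle the abstract describes can be illustrated in a much simplified form. The thesis compiles game rules into answer set programs and hands them to an ASP solver; the sketch below shows only the underlying proof idea (a base case over initial states plus a step case over the transition relation), which never has to enumerate the set of reachable states:

```python
def holds_by_induction(states, initial, transitions, invariant):
    """Verify an invariance property by induction instead of exhaustive search.

    Base case: the invariant holds in every initial state.
    Step case: every successor of ANY invariant-satisfying state (reachable
    or not) again satisfies the invariant. This is sound but incomplete:
    a step-case failure may involve only unreachable states.
    """
    if not all(invariant(s) for s in initial):
        return False
    return all(invariant(t)
               for s in states if invariant(s)
               for t in transitions.get(s, []))

# Toy "game": a score counter starting at 0 that may gain 1 or 2 points,
# with moves defined so the score can never exceed 3.
transitions = {0: [1, 2], 1: [2, 3], 2: [3], 3: []}
never_exceeds_three = holds_by_induction(range(4), [0], transitions,
                                         lambda s: s <= 3)
```

    The names and the explicit transition dictionary are illustrative assumptions; in the thesis the step case is discharged symbolically by an answer set solver, so no per-state enumeration like the one above is needed.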

    Maturity based approach for ISMS Governance

    Information security is an integral element of fiduciary duty. Its purpose is to protect an organization's valuable resources, such as information. Information security is also a subset of IT governance and must be managed within an Information Security Management System (ISMS). The key elements of the operation of an ISMS are its processes. Current research focuses on the economics and cost-benefit analysis of information security investment with respect to single measures protecting information; ISMS processes themselves are not in its focus. In fact, no specific ISMS process framework yet exists that clearly differentiates between ISMS processes and the security measures controlled by them, or that describes the ISMS processes and their interaction. To be cost-effective, ISMS processes, as well as their maturity levels, need to be aligned with the implementing organization and its mission. Given limited resources and the need to use them efficiently, not every ISMS process should be established and operated at the same level of maturity. Since business alignment and cost-effectiveness are both important for the successful operation of an ISMS, research contributions must address both problems: the ISMS processes themselves and the determination of their target maturity levels. The overall objective of this doctoral thesis is therefore to make the appropriateness of an ISMS transparent and to avoid unnecessary costs of information governance, which remains a major problem for many organizations. The thesis fills this research gap by proposing an ISMS process framework based on a set of ISMS processes agreed upon in existing applicable standards such as the ISO 27000 series, COBIT, and ITIL. Within the framework, the identified processes are described and their interactions and interfaces are specified. This framework helps practitioners focus on the operation of the ISMS rather than on individual measures and controls, strengthening the systemic character of the ISMS and the perception, among the relevant roles, of the ISMS as a management system consisting of processes. For efficient use of the framework, a method is proposed to determine the individually necessary maturity level of each ISMS process.

    Modeling sparse connectivity between underlying brain sources for EEG/MEG

    We propose a novel technique to assess functional brain connectivity in EEG/MEG signals. Our method, called Sparsely-Connected Sources Analysis (SCSA), can overcome the problem of volume conduction by modeling neural data with the following ingredients: (a) the EEG is assumed to be a linear mixture of correlated sources following a multivariate autoregressive (MVAR) model, (b) the demixing is estimated jointly with the source MVAR parameters, (c) overfitting is avoided by using the Group Lasso penalty. This approach allows us to extract the appropriate level of cross-talk between the extracted sources, and in this manner we obtain a sparse data-driven model of functional connectivity. We demonstrate the usefulness of SCSA with simulated data and compare it to a number of existing algorithms, with excellent results.
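    As an illustration of ingredient (a) only, the sketch below simulates a two-source VAR(1) process and recovers its coefficient matrix by ordinary least squares. It is an illustrative toy, not SCSA itself: it omits the joint demixing estimation and the Group Lasso penalty, and all names are assumptions of this sketch.

```python
import random

def simulate_var1(A, n, noise=0.1, seed=0):
    """Simulate a 2-dimensional VAR(1) process x_t = A x_{t-1} + e_t."""
    rng = random.Random(seed)
    x = [[0.0, 0.0]]
    for _ in range(n - 1):
        p = x[-1]
        x.append([A[0][0] * p[0] + A[0][1] * p[1] + rng.gauss(0.0, noise),
                  A[1][0] * p[0] + A[1][1] * p[1] + rng.gauss(0.0, noise)])
    return x

def fit_var1(x):
    """Least-squares estimate of the 2x2 coefficient matrix A.

    Each row a_i of A solves the normal equations G a_i = r_i, where G is
    the Gram matrix of the lagged states; the 2x2 systems are solved with
    Cramer's rule.
    """
    G = [[0.0, 0.0], [0.0, 0.0]]
    R = [[0.0, 0.0], [0.0, 0.0]]   # R[i][j] = sum_t prev[j] * cur[i]
    for prev, cur in zip(x, x[1:]):
        for a in range(2):
            for b in range(2):
                G[a][b] += prev[a] * prev[b]
                R[a][b] += prev[b] * cur[a]
    det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    return [[(R[i][0] * G[1][1] - R[i][1] * G[1][0]) / det,
             (R[i][1] * G[0][0] - R[i][0] * G[0][1]) / det]
            for i in range(2)]

# Off-diagonal entry: source 2 drives source 1, but not vice versa.
A_true = [[0.8, 0.1], [0.0, 0.7]]
A_hat = fit_var1(simulate_var1(A_true, 500))
```

    In SCSA the sparsity pattern of the (penalized) off-diagonal coefficients is what encodes the directed cross-talk between sources.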

    GEANT4 Simulation of Detector Properties in the MOLLER Experiment

    To explore the existence of new physics beyond the scope of the electroweak theory, international collaborations of nuclear physicists have constructed several precision-measurement experiments. One of these is the MOLLER experiment, a low-energy parity violation experiment that will utilize the 12 GeV upgrade of Jefferson Lab's CEBAF accelerator. The motivation of this experiment is to measure the parity-violating asymmetry of Møller scattering in a liquid hydrogen target. This measurement would allow for a more precise determination of the electron's weak charge and the weak mixing angle. While still in its planning stages, the MOLLER experiment requires a detailed simulation framework in order to determine how the project should be run in the future. The simulation framework for MOLLER, called remoll, is written in C++ and uses the GEANT4 framework and libraries. It has recently been updated to include the full detector geometry of the experiment. The viability of the detector geometry in the real experiment will rely on a detailed analysis of the detector's properties through simulation. The first part of this analysis assesses the impact of particle crosstalk in the design.

    Auctions for Renewable Energy Support


    Co-Creation and Sustainable Urban Planning: Who Co-Creates Sustainable Mobility Solutions at the Neighbourhood Level? Experiences from the Horizon 2020 Project “Sunrise”

    Co-creation is applied as a key concept to develop, implement, assess, and facilitate learning about new ways to address urban mobility challenges at the neighbourhood level in the HORIZON 2020 project SUNRISE ("Sustainable Urban Neighbourhoods - Research and Implementation Support in Europe"). SUNRISE's objective is to contribute to sustainable urban development by stimulating co-creative processes and problem solutions in neighbourhoods in the field of new mobility concepts and new forms of mobility. Towards this aim, six cities (Bremen, Budapest, Jerusalem, Malmö, Southend-on-Sea, Thessaloniki) are fostering comprehensive collaborative processes with various actors in specific neighbourhoods, with the explicit mandate to implement sustainable mobility solutions. The involvement of different actors is an important aspect of, and a challenge for, co-creation processes. On the one hand, the involvement of residents and other stakeholders in sustainable urban planning is seen as promising in terms of achieving better results and improving the adaptability of socio-ecological systems. On the other hand, questions often arise such as: "who is participating?", "how can different actors be reached?", and "what results can be achieved with co-creation?". This paper provides answers to these questions based on experiences from co-creation processes in the SUNRISE project. After defining and embedding the term co-creation in planning theory, the paper gives an overview of the actors involved in the co-creation processes in SUNRISE, the co-creation activities carried out, and the mobility solutions developed at the neighbourhood level. Finally, the challenges of involving various actors in co-creative processes and the opportunities for co-creation when planning sustainable mobility solutions at the neighbourhood level are discussed.