1,582 research outputs found

    Chatbots for Modelling, Modelling of Chatbots

    Full text link
    Unpublished doctoral thesis defended at the Universidad Autónoma de Madrid, Escuela Politécnica Superior, Departamento de Ingeniería Informática. Date of defense: 28-03-202

    Ratio und similitudo: die vernunftkonforme Argumentation im Dialogus des Petrus Alfonsi

    Full text link
    Unlike the religious-polemical works of earlier authors, which centre on exegetical discussion, Petrus Alfonsi argues in his Dialogus (written around 1110) equally on the basis of auctoritas (the Bible) and of ratio. This contribution discusses how Petrus Alfonsi conceptualises and puts into practice reason-based argumentation. In one passage, Petrus Alfonsi specifies three sources of rational knowledge. This statement is interpreted through a close reading and by drawing on its source, the work Emunoth we-Deoth by the Jewish philosopher Saadia Gaon. There Petrus Alfonsi distinguishes spontaneous knowledge through the senses, deductive argumentation based on generally accepted premises (necessariae rationes), and similitudo, which can be understood as evidence-based argumentation. In the Dialogus, Petrus Alfonsi only rarely argues from premises; time and again one finds argumentation based on observable phenomena. Petrus frequently presents insights from natural philosophy, which he illustrates with examples from nature, and for this procedure he also employs the term similitudo.

    A Formal Engineering Approach for Interweaving Functional and Security Requirements of RESTful Web APIs

    Get PDF
    RESTful Web API adoption has become ubiquitous, with REST APIs proliferating across almost all domains as modern web applications embrace the micro-service architecture. This vibrant and expanding adoption of APIs funnels an increasing amount of data through systems that require proper access management to ensure that web assets are secured. A RESTful API provides data over HTTP, interacting with databases and other services, and must preserve its security properties. Practitioners currently face two major challenges in developing high-quality, secure RESTful APIs. First, REST is not a protocol; it is a set of guidelines that define how web resources are designed and accessed over HTTP endpoints, stipulating how related resources should be structured using hierarchical URIs and how specific, well-defined actions on those resources should be represented using different HTTP verbs. Although security has always been critical in the design of RESTful APIs, there are no clear formal models that take a secure-by-design approach and interweave both functional and security requirements. The second challenge is how to effectively use a model-driven approach to construct precise requirements and design specifications so that the security of a RESTful API is treated as a concern that cuts across functionality rather than being confined to individual, isolated operations. This thesis proposes a novel technique that encourages a model-driven approach to specifying and verifying an API's functional and security requirements with the practical formal method SOFL (Structured Object-Oriented Formal Language). Our proposed approach provides a generic six-step model-driven process for designing security-aware APIs using the concepts of domain models, domain primitives, the Ecore metamodel and SOFL. The first step generates a flat file with the API's resource listings: we extract resource definitions from input RESTful API documentation written in RAML using an existing RAML parser, and the output is a flat file representing the API resources defined in the RAML input file. This step is fully automated. The second step automatically constructs an API resource graph that serves as a blueprint for creating the target API domain model. The input for this step is the flat file generated in step 1 and the output is a directed graph (digraph) of API resources. We rely on an algorithm we created that takes a list of lists of API resource nodes and the defined API root resource node as input, and constructs a digraph highlighting all the API resources as output. In step 3, we use the generated digraph as a guide to manually define the API's initial domain model, with an aggregate root corresponding to the root node of the input digraph and the remaining nodes corresponding to domain model entities. In effect, the digraph generated in step 2 is a bare-bones representation of the target domain model; what is missing at this stage is the distinction between containment and reference relationships between entities. The resulting domain model describes the entire ecosystem of the modeled API in terms of the Domain-Driven Design concepts of aggregates, aggregate root, entities, entity relationships, value objects and aggregate boundaries.
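
As an illustration of step 2, here is a minimal sketch of how a digraph of API resources could be derived from a flat listing of hierarchical URI paths. The class name `ResourceGraphBuilder`, the use of URI segments as node identifiers and the salon-booking paths are assumptions made for this example only; the abstract does not give the actual algorithm, so this is not the thesis's implementation.

```java
import java.util.*;

// Minimal sketch: derive a digraph of API resources from flat URI paths.
// The parent/child rule (one edge per URI segment) is an illustrative
// assumption, not the algorithm defined in the thesis.
public class ResourceGraphBuilder {

    /** Builds an adjacency map: parent resource -> set of child resources. */
    public static Map<String, Set<String>> build(String root, List<String> paths) {
        Map<String, Set<String>> digraph = new LinkedHashMap<>();
        digraph.put(root, new LinkedHashSet<>());
        for (String path : paths) {
            String parent = root;
            for (String segment : path.split("/")) {
                if (segment.isEmpty()) continue;           // skip the leading slash
                digraph.computeIfAbsent(segment, k -> new LinkedHashSet<>());
                if (!segment.equals(parent)) {
                    digraph.get(parent).add(segment);      // edge: parent -> child
                }
                parent = segment;                          // descend one level
            }
        }
        return digraph;
    }

    public static void main(String[] args) {
        // Hypothetical resource listing for a salon-booking API.
        List<String> resources = List.of("/salons", "/salons/{salonId}/bookings",
                "/salons/{salonId}/services");
        System.out.println(build("api", resources));
        // {api=[salons], salons=[{salonId}], {salonId}=[bookings, services], ...}
    }
}
```
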
The fourth step, which takes the newly defined domain model as input, involves a threat modeling process using Attack Defense Trees (ADTrees) to identify potential security vulnerabilities in our API domain model together with their countermeasures. Countermeasures that can enforce secure constructs on the attributes and behavior of their associated domain entities are modeled as domain primitives. Domain primitives are distilled versions of value objects with proper invariants; these invariants enforce security constraints on the behavior of their associated entities in our API domain model. The output of this step is a refined domain model in which the additional security invariants from the threat modeling process are defined as domain primitives. This fourth step achieves our first, implicit interweaving of functional and security requirements. The fifth step involves creating an Ecore metamodel that describes the structure of our API domain model: taking the refined domain model as input, we create the Ecore metamodel to which it corresponds. Specifically, this step encompasses the structural modeling of our target RESTful API; the structural model describes the possible resource types, their attributes and relations, as well as their interfaces and representations. The sixth and final step involves behavioral modeling. The input for this step is the Ecore metamodel from step 5 and the output is a formal, security-aware RESTful API specification in the SOFL language. Our goal here is to define RESTful API behaviors consisting of actions corresponding to the HTTP verbs GET, POST, PUT, DELETE and PATCH. For example, a CreateAction creates a new resource, an UpdateAction provides the capability to change the value of attributes, and a ReturnAction allows for response definition, including the representation and all metadata. To achieve behavioral modeling, we transform our API methods into SOFL processes and take advantage of their expressive nature to define our modeled API behaviors. We achieve the interweaving of functional and security requirements by injecting boolean formulas into the post-conditions of SOFL processes. To verify whether the interwoven functional and security requirements implement all expected functions correctly and satisfy the desired security constraints, we can optionally perform specification testing. Since implicit specifications do not prescribe implementation algorithms but are expressed as predicate expressions over the pre- and post-conditions of a given specification, we can substitute all the variables involved in a process with concrete values of their types and evaluate the results as truth values, true or false. When conducting specification testing, we apply the SOFL process animation technique to obtain the set of concrete values of the output variables for each functional scenario of a process. We analyse the test results by comparing the evaluation results with an analysis criterion, i.e., a predicate expression representing the properties to be verified. If the evaluation results are consistent with the predicate expression, the analysis shows consistency between the process specification and its associated requirement. We generate the test cases for both input and output variables based on the user requirements.
The generated test cases are usually based on test targets, which are predicate expressions such as the pre- and post-conditions of a process. When testing a process specification for conformance to its associated service operation, we only need to observe the execution results of the process by providing concrete input values to all of its functional scenarios and analysing their defining conditions relative to the user requirements. We present an empirical case study validating the practicality and usability of our model-driven formal engineering approach by applying it to the development of a Salon Booking System. A total of 32 services covering the functionality provided by the Salon Booking System API were developed. We defined process specifications for the API services together with their respective security requirements; the security requirements were injected in the threat modeling and behavioral modeling phases of our approach. We tested the interweaving of functional and security requirements in the specifications generated by our approach by running tests against the original RAML specifications. Failed tests occurred in cases where an injected security measure, such as a required object-level access control check, was not respected, i.e., object-level access control was not checked. Our generated SOFL specification correctly rejects such a case by returning an appropriate error message, whereas the original RAML specification incorrectly accepts the request because it is unaware of the measure. We further demonstrate a technique for generating SOFL specifications from a domain model via model-to-text transformation, which semi-automates the generation of the SOFL formal specification in step 6 of our proposed approach. The technique isolates the dynamic and static sections of the generated specifications, which makes it possible to preserve the static sections of the target specifications while updating the dynamic sections in response to changes in the underlying domain model of the RESTful API being designed. Specifically, our contribution is a systematic model-driven formal engineering approach for the design and development of secure RESTful web APIs. The proposed approach offers a six-step methodology covering both structural and behavioral modeling of APIs with a focus on security. The most distinguished merit of the model-to-text transformation is its use of the API's domain model, together with the metamodel to which the domain model corresponds, as the foundation for generating formal SOFL specifications that represent the API's functional and security requirements. Doctor of Science, Hosei University.
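
To make the notion of a domain primitive more concrete, the sketch below wraps a booking identifier so that no invalid value can ever be constructed; the constructor-enforced invariant plays the role of the security constraint attached to the entity. The class name `BookingId` and the UUID rule are hypothetical, and in the thesis the corresponding constraint would be expressed as a pre- or post-condition of a SOFL process rather than as Java code.

```java
import java.util.UUID;

// Sketch of a domain primitive: a distilled value object whose invariant is
// enforced at construction time, so no invalid instance can exist.
// The name BookingId and the "must be a UUID" rule are illustrative only.
public final class BookingId {

    private final UUID value;

    private BookingId(UUID value) {
        this.value = value;
    }

    /** Factory enforcing the invariant: the id must be a well-formed UUID. */
    public static BookingId of(String raw) {
        if (raw == null || raw.isBlank()) {
            throw new IllegalArgumentException("booking id must not be empty");
        }
        try {
            return new BookingId(UUID.fromString(raw));
        } catch (IllegalArgumentException e) {
            throw new IllegalArgumentException("booking id must be a valid UUID", e);
        }
    }

    public UUID value() {
        return value;
    }

    @Override
    public String toString() {
        return value.toString();
    }
}
```

An operation signature that accepts a `BookingId` instead of a raw `String` carries the constraint with the type, which is one way to read the "implicit interweaving" of functional and security requirements described above.
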

    Development and implementation of in silico molecule fragmentation algorithms for the cheminformatics analysis of natural product spaces

    Get PDF
    Computational methodologies that extract specific substructures, like functional groups or molecular scaffolds, from input molecules can be grouped under the term "in silico molecule fragmentation". They can be used to investigate what specifically characterises a heterogeneous compound class, like pharmaceuticals or Natural Products (NP), and in which aspects such classes are similar or dissimilar. The aim is to determine what specifically characterises NP structures in order to transfer patterns favourable for bioactivity to drug development. As part of this thesis, the first algorithmic approach to in silico deglycosylation, the removal of glycosidic moieties for the study of aglycones, was developed in the form of the Sugar Removal Utility (SRU) (Publication A). The SRU has also proven useful for investigating NP glycoside space; for this purpose it was applied to one of the largest open NP databases, COCONUT (COlleCtion of Open Natural prodUcTs) (Publication B). A contribution was made to the Chemistry Development Kit (CDK) by developing the open Scaffold Generator Java library (Publication C). Scaffold Generator can extract different scaffold types and dissect them into smaller parent scaffolds following the scaffold tree or scaffold network approach. Publication D describes the OngLai algorithm, the first automated method to identify homologous series in input datasets, group the member structures of each series, and extract their common core. To support the development of new fragmentation algorithms, the open Java rich-client graphical user interface application MORTAR (MOlecule fRagmenTAtion fRamework) was developed as part of this thesis (Publication E). MORTAR allows users to quickly execute the steps of importing a structural dataset, applying a fragmentation algorithm, and visually inspecting the results in different ways. All software developed as part of this thesis is freely and openly available (see https://github.com/JonasSchaub).
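
For readers unfamiliar with scaffold extraction, the short example below shows the general idea using the CDK's built-in `MurckoFragmenter`. It is a plain CDK usage sketch under the assumption that a recent CDK release is on the classpath; it does not call the Scaffold Generator or SRU libraries developed in the thesis, and the test molecule (diazepam) is chosen arbitrarily.

```java
import org.openscience.cdk.fragment.MurckoFragmenter;
import org.openscience.cdk.interfaces.IAtomContainer;
import org.openscience.cdk.silent.SilentChemObjectBuilder;
import org.openscience.cdk.smiles.SmilesParser;
import org.openscience.cdk.tools.manipulator.AtomContainerManipulator;

// Generic CDK example of scaffold (Bemis-Murcko framework) extraction.
// This is NOT the thesis's Scaffold Generator; it only illustrates the concept.
public class ScaffoldExample {
    public static void main(String[] args) throws Exception {
        SmilesParser parser = new SmilesParser(SilentChemObjectBuilder.getInstance());
        IAtomContainer mol = parser.parseSmiles(
                "CN1C(=O)CN=C(c2ccccc2)c2cc(Cl)ccc21"); // diazepam as a test molecule
        AtomContainerManipulator.percieveAtomTypesAndConfigureAtoms(mol); // prepare atom types
        MurckoFragmenter fragmenter = new MurckoFragmenter(true, 6); // frameworks only, >= 6 atoms
        fragmenter.generateFragments(mol);
        for (String framework : fragmenter.getFrameworks()) {
            System.out.println(framework); // SMILES of the Murcko framework(s)
        }
    }
}
```
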

    A precise, General, Non-Invasive and Automatic Speed Estimation Method for MCSA Steady-State Diagnosis and Efficiency Estimation of Induction Motors in the 4.0 Industry

    Full text link
    Thesis by compendium. There are two crucial aspects when operating induction motors in industry: efficiency estimation (to minimize energy consumption) and diagnosis (to avoid untimely outages and reduce maintenance costs). To estimate the motor's efficiency, it is necessary to measure voltages and currents. Hence, it is convenient and very useful to use the same current to also diagnose the motor (Motor Current Signature Analysis: MCSA). In this regard, the most suitable MCSA technique is the one based on locating fault harmonics in the spectrum of the stator line current under steady state, as this is the operating condition of most induction motors in industry. Since the frequency of these harmonics depends on the speed, it is essential to know this magnitude with precision, as this makes it possible to correctly locate the fault harmonics and, therefore, reduce the chances of false positives/negatives. In turn, accurate speed information also allows the mechanical power to be calculated with precision, which results in a more accurate estimation of the motor's performance. Finally, to adapt to the needs of the 4.0 Industry, where large numbers of motors are continuously monitored, the speed must not only be obtained very accurately, but also non-invasively, automatically (without the need for an expert) and for any induction motor. In this regard, since precise speed measurement through a shaft sensor is invasive and expensive, Sensorless Speed Estimation (SSE) techniques become the best option. The first part of this thesis conducts a thorough analysis of the families of SSE techniques present in the technical literature. As demonstrated therein, techniques based on Rotor Slot Harmonics (RSHs) and on Rotational Frequency Sideband Harmonics (RFSHs) are the most promising, as they are potentially the only ones that can meet all the aforementioned requirements. However, as also shown in this part, up to this thesis there had always been a trade-off between accuracy (characteristic of RSHs) and general applicability (characteristic of RFSHs). The second part, and core of this thesis, presents a methodology that ends this trade-off, thus providing the first precise, general, non-invasive and automatic speed estimation method for MCSA steady-state diagnosis and efficiency estimation of induction motors operating in a 4.0 Industry context.
This is achieved by developing a novel RSH-based technique that, for the first time in the technical literature, eliminates the need to know or estimate the number of rotor slots, which had so far prevented these techniques from being generally applicable. The technique also provides a reliable and automatic procedure to locate the RSH family from among the large number of significant harmonics present in the spectrum of the line current of an induction motor. Also automatically, and without the help of an expert, the technique is able to determine the parameters needed to estimate the speed from the RSHs, using only measurements taken during the motor's normal operation at steady state. The methodology is validated using motors with different characteristics and supply conditions, through simulations, laboratory tests and 105 industrial motors. Furthermore, a real industrial application case is shown, in which the speed estimation algorithm is implemented in a continuous motor condition monitoring system based on MCSA; this eventually leads to the discovery of a new fault in deep-well submersible motors: the wear of the end-rings. Finally, a second direct application derived from the reliable and automatic procedure to detect RSHs is presented: the use of these harmonics to diagnose early-stage inter-turn faults in induction motors of deep-well submersible pumps. Bonet Jara, J. (2023). A precise, General, Non-Invasive and Automatic Speed Estimation Method for MCSA Steady-State Diagnosis and Efficiency Estimation of Induction Motors in the 4.0 Industry [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/194269
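
For background, the rotor-slot-harmonic frequencies that the abstract refers to are usually written in the MCSA literature as a function of the supply frequency, the slip and the number of rotor slots; the textbook expressions below are quoted only as context and use the conventional notation, which need not match the thesis's own formulation.

```latex
% Standard rotor slot harmonic (RSH) frequencies from the MCSA literature
% (generic background; notation is the conventional one, not the thesis's):
%   f_s : supply frequency        s : slip            p : pole pairs
%   R   : number of rotor slots   k = 1, 2, ...       \nu = 1, 3, 5, ... (time-harmonic order)
\[
  f_{\mathrm{RSH}} = f_s \left[ \frac{k R}{p}\,(1 - s) \pm \nu \right],
  \qquad
  n = \frac{60\, f_s\, (1 - s)}{p} \ \text{rpm}
\]
% Locating an RSH in the current spectrum yields the slip s and hence the speed n,
% which is why removing the need to know (or estimate) R is the key step that makes
% the RSH-based approach generally applicable.
```
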

    Fictional Practices of Spirituality I: Interactive Media

    Get PDF
    "Fictional Practices of Spirituality" provides critical insight into the implementation of belief, mysticism, religion, and spirituality into worlds of fiction, be it interactive or non-interactive. This first volume focuses on interactive, virtual worlds - may that be the digital realms of video games and VR applications or the imaginary spaces of life action role-playing and soul-searching practices. It features analyses of spirituality as gameplay facilitator, sacred spaces and architecture in video game geography, religion in video games and spiritual acts and their dramaturgic function in video games, tabletop, or LARP, among other topics. The contributors offer a first-time ever comprehensive overview of play-rites as spiritual incentives and playful spirituality in various medial incarnations

    Occupant-Centric Simulation-Aided Building Design: Theory, Application, and Case Studies

    Get PDF
    This book promotes occupants as a focal point for the design process

    The Cryogenic AntiCoincidence detector for Athena X-IFU

    Get PDF
    Athena is an ESA project for a space telescope for X-ray astrophysics. Its scientific goal is to study the Universe by measuring the evolution of baryonic matter in large-scale structures, such as the warm-hot intergalactic medium, as well as in energetic compact objects. Because most of the baryonic component of the Universe is locked up in hot gas at temperatures of about a million degrees, and because of the extreme energetics of the processes close to the event horizon of black holes, understanding the hot and energetic Universe requires space-based observations in the X-ray band. The topic calls for spatially resolved X-ray spectroscopy and deep wide-field X-ray spectral imaging with capabilities far beyond those of current observatories such as XMM-Newton and Chandra. The observatory will be a 12-meter fixed-focus telescope with two instruments: the innovative X-ray Integral Field Unit (X-IFU), based on cryogenic detectors, and the Wide Field Imager (WFI). These two instruments combine the high spectral resolution of X-IFU with the high spatial resolution of WFI to achieve the scientific goals, covering a measurement band from 0.5 to 10 keV. X-IFU is based on Transition Edge Sensors (TES) cooled to 50 mK, which exploit the metal-superconductor transition. These can provide the required energy resolution while offering exceptional efficiency compared to the spectrometers on the current generation of X-ray observatories. Since the telescope will operate in an environment rich in cosmic rays, it would otherwise be impossible to separate source signals from the background on the X-ray detector. In X-IFU, this problem will be solved by an active anticoincidence layer, which makes it possible to achieve the scientific goals for the spectroscopy of faint or distant sources. The work done in this thesis focused on the anticoincidence detector, one of the core parts of the instrument: its purpose is to reduce the signal background by about two orders of magnitude, and it will be positioned only 1 mm below the spectrometer. The Demonstration Model (DM) of the detector has been studied, realized and tested, with particular interest in improving the understanding and technology of the microfabrication of superconducting devices. The detector is fabricated using optical microlithography together with PLD, electron-beam evaporation, and RF-sputtering film deposition systems. The DM active area consists of 96 Ir/Au TES films connected in parallel with superimposed Nb strip lines, insulated with a SiO film, and four heaters on a Si absorber. The pixel is freestanding and attached to a gold frame by four Si beams. The frame is needed to provide strong coupling to the cryostat, since the operating point is below 1 K, and the heaters and the beams are needed to control the decoupling of the active area. Measurements are performed at temperatures around 0.1 K (the theoretical operating point of X-IFU) in a dilution cryostat, reading signals from radiation sources such as Am-241 at 60 keV or Fe-55 at 5 keV. The very low impedance of TES sensors requires a SQUID to read the output signal. In addition, some structural models of the detector were fabricated and vibrated to understand the structural characteristics and to test the response to the stresses that the detector will experience during launch. Variations of the detector were studied to test its spectroscopic capabilities and to measure its thermal characteristics.
To better understand the overall signal generation inside the absorber, a model and simulation of the phononic distribution of the athermal transient was developed. Finally, the detector was tested in conjunction with the NASA spectrometer to verify its anticoincidence performance.
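
As generic background on how a TES converts deposited energy into a measurable signal, the standard lumped electro-thermal balance from the TES literature is reproduced below; it is textbook material quoted for orientation only, not the athermal-phonon model developed in this thesis.

```latex
% Standard lumped TES electro-thermal balance (generic background only):
%   C      : heat capacity of TES + absorber        T_b    : bath temperature
%   G      : thermal conductance to the bath        R(T,I) : transition resistance
%   P(t)   : power deposited by an event            V      : (quasi-)constant voltage bias
\[
  C\,\frac{dT}{dt} = \frac{V^{2}}{R(T,I)} + P(t) - G\,\bigl(T - T_{b}\bigr)
\]
% Under voltage bias the feedback is negative: a temperature rise increases R,
% reduces the Joule heating V^2/R and pulls the sensor back onto the transition;
% the resulting current change is what the SQUID reads out.
```
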

    Offene-Welt-Strukturen: Architektur, Stadt- und Naturlandschaft im Computerspiel

    Get PDF
    What role do algorithms play in the construction of images and the depiction of world and weather in computer games? How does the design of spaces, levels and topographies influence players' decisions and behaviour? Is Brutalism the first genuine architectural style of computer games? What significance do landscape gardens and national parks have in structuring game worlds? How is nature depicted in times of climate change? Especially over the last 20 years, digital game worlds have adapted features of the physical, real world more meticulously than ever. Through elaborate production processes and complex visualisation strategies, this approximation to our everyday world is always produced in dependence on game mechanics and worldliness. As the example of open-world games makes particularly clear, the adoption of certain world views and pictorial traditions carries ideological implications that go far beyond the narrative conventions transferred from other media formats on which research has focused so far. With his theory of architecture as a medial hinge, the author shows that digital game worlds have medium-specific properties that could not previously be grasped and still awaited investigation. By interweaving concepts from media studies, game studies, philosophy, architectural theory, human geography, landscape theory and art history, among others, Bonner develops a transdisciplinary theoretical model and, with the analytical methods derived from it, makes it possible for the first time to understand and name the complex structure of today's computer games, from indie games to AAA open worlds. With "Offene-Welt-Strukturen", the architectonics of digital game worlds becomes comprehensively accessible.