
    Nuclear architecture and dynamics of stably integrated retroviral sequences


    Communication Processes in Agro-environmental Policy Development and Decision-Making: Case Study Sachsen-Anhalt

    The development of a decision support approach with regard to agro-environmental programmes is part of a larger research project on agricultural transformation and structural change processes. The research objectives are improved communication processes and an enhanced quality of political decision-making. The investigation is based on the assumption that the success of agro-environmental programmes depends largely on their acceptance by all major stakeholders. This implies an early integration of varying interests in the decision-making process. Introducing participatory approaches into a bureaucratic setting poses particular problems. In order to achieve greater efficiency and effectiveness, the introduction of interactive modelling approaches has to be coupled with communication processes which increase transparency and allow for consensual decision-making.
    Keywords: communication processes, agro-environmental programmes, decision-making support, conflict resolution, participation and acceptance, rural areas, agricultural extension, sustainable agriculture, structural change, transformation processes, Environmental Economics and Policy

    On the necessity and a generalized conceptual model for the consideration of large strains in rock mechanics

    This contribution presents a generalized conceptual model for the finite element solution of quasi-static isothermal hydro-mechanical processes in (fractured) porous media at large strains. A frequently used averaging procedure, known as the Theory of Porous Media, serves as the background for the complex multifield approach presented here. Within this context, a consistent representation of the weak formulation of the governing equations (i.e., overall balance equations for mass and momentum) in the reference configuration of the solid skeleton is preferred. The time discretization and the linearization are performed for the individual variables and nonlinear functions representing the integrands of the weak formulation instead of applying these conceptual steps to the overall nonlinear system of weighted residuals. Constitutive equations for the solid phase deformation are based on the multiplicative split of the deformation gradient, allowing the adaptation of existing approaches for technical materials and biological tissues to rock materials in order to describe various inelastic effects, growth and remodeling in a thermodynamically consistent manner. The presented models will be a feature of the next version of the scientific open-source finite element code OpenGeoSys, developed by an international developer and user group and coordinated by the authors.
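The multiplicative split mentioned in the abstract is conventionally written as follows; the labels for the elastic and inelastic parts are an assumed standard notation, not taken from the abstract itself:

```latex
% Multiplicative split of the deformation gradient F into an elastic
% part F_e and an inelastic part F_i (assumed standard convention):
F = F_{e}\, F_{i}, \qquad
J = \det F = (\det F_{e})(\det F_{i}) > 0
```

In formulations of this kind, inelastic effects such as growth or remodeling are typically collected in the inelastic part, while the constitutive stress response is expressed through the elastic part.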

    Conversational hierarchy for interaction with virtual assistant

    User interfaces for conversations, e.g., with virtual assistants, typically model conversation as a back-and-forth exchange between participants, e.g., without regard to the type or topic of conversation. Under such a user interface, each filter, refinement, or query by the user triggers a new inline block or a full-page refresh, making the user interface cumbersome to use. This disclosure describes techniques that model conversation as a hierarchy, e.g., with major and minor conversational turns. The user interface is designed around this hierarchy, and a flow between the front-end and the back-end of conversations informs user interactions and client-side page interactions.
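The hierarchy of major and minor turns described above might be represented as a small tree, so that a refinement attaches to the topic it modifies instead of starting a new inline block. This is a minimal sketch; the class and method names are illustrative assumptions, not from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class Turn:
    """One conversational turn; minor turns (filters, refinements)
    nest under the major turn that opened the topic."""
    text: str
    kind: str = "major"          # "major" = new topic, "minor" = refinement
    children: list = field(default_factory=list)

    def refine(self, text):
        """Attach a minor turn to this topic instead of a new inline block."""
        child = Turn(text, kind="minor")
        self.children.append(child)
        return child

# A query plus two refinements stays one hierarchical unit, so the UI
# can update the existing block in place rather than do a full-page refresh.
root = Turn("restaurants near me")
root.refine("only open now")
root.refine("sort by rating")
```

Because the refinements are children of the original query, the client can re-render just that subtree when a minor turn arrives.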

    On the term and concepts of numerical model validation in geoscientific applications

    Modeling and numerical simulation of the coupled physical and chemical processes observed in the subsurface are the only options for long-term analyses of complex geological systems. This contribution discusses some more general aspects of the (dynamic) process modeling for geoscientific applications, including reflections about the slightly different understanding of the terms model and model validation in different scientific communities, and about the term and methods of model calibration in the geoscientific context. Starting from the analysis of observations of a certain part of the perceived reality, the process of model development comprises the establishment of the physical model characterizing relevant processes in a problem-oriented manner, and subsequently the mathematical and numerical models. Considering the steps of idealization and approximation in the course of model development, Oreskes et al. [1] state that process and numerical models can neither be verified nor validated in general. Rather, the adequacy of models with specific assumptions and parameterizations made during model set-up can be confirmed. If the adequacy of process models with observations can be confirmed using lab as well as field tests and process monitoring, the adequacy of numerical models can be confirmed using numerical benchmarking and code comparison. Model parameters are intrinsic elements of process and numerical models, in particular constitutive parameters. As they are often not directly measurable, they have to be established by solving inverse problems based on an optimal numerical adaptation of observation results. In addition, numerical uncertainty analyses should be an obligatory part of numerical studies for critical real-world applications.
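The inverse-problem step mentioned above (estimating a non-measurable constitutive parameter from observations) can be sketched for the simplest possible case: a least-squares fit of a hydraulic conductivity K in Darcy's law q = -K dh/dx. The linear model, the variable names, and the numbers are illustrative assumptions only.

```python
# Minimal sketch of model calibration as an inverse problem: estimate
# a constitutive parameter K from flux observations q_i = -K * g_i,
# where g_i are observed head gradients. Closed-form least squares:
# K = sum(-q_i * g_i) / sum(g_i^2).

def calibrate_conductivity(gradients, fluxes):
    """Least-squares estimate of K from paired observations (g_i, q_i)."""
    num = sum(-q * g for q, g in zip(fluxes, gradients))
    den = sum(g * g for g in gradients)
    return num / den

gradients = [0.10, 0.20, 0.30, 0.40]       # observed head gradients dh/dx
true_K = 2.5                               # "unknown" parameter (synthetic)
fluxes = [-true_K * g for g in gradients]  # synthetic noise-free observations
K_est = calibrate_conductivity(gradients, fluxes)
```

Real calibrations involve nonlinear forward models and noisy data, so iterative optimization and the uncertainty analyses the abstract calls for replace this closed-form solution.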

    Agent-Oriented Coupling of Activity-Based Demand Generation with Multiagent Traffic Simulation

    The typical method to couple activity-based demand generation (ABDG) and dynamic traffic assignment (DTA) is time-dependent origin-destination (O-D) matrices. With that coupling method, the individual traveler's information gets lost: delays on one trip do not affect later trips. However, it is possible to retain the full agent information from the ABDG by writing out all agents' plans instead of the O-D matrix. A plan is a sequence of activities, connected by trips. Because that information typically is already available inside the ABDG, this is fairly easy to achieve. The multiagent simulation (MATSim) takes such plans as input. It iterates between the traffic flow simulation (sometimes called network loading) and the behavioral modules. The currently implemented behavioral modules are route finding and time adjustment. Activity resequencing and activity dropping are conceptually clear but not yet implemented. Such a system will react to a time-dependent toll by possibly rearranging the complete day; in consequence, it goes far beyond DTA (which just does route adaptation). This paper reports on the status of the current Berlin implementation. The initial plans are taken from an ABDG originally developed by Kutter; to the authors' knowledge, this is the first time traveler-based information (and not just O-D matrices) is taken from an ABDG and used in MATSim. The simulation results are compared with real-world traffic counts from about 100 measurement stations.
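The plan structure described above (a sequence of activities connected by trips) can be sketched as a small data structure that makes the key point explicit: with per-agent plans, a delay on one trip shifts every later arrival, which an O-D matrix cannot express. The names and the timing rule are illustrative assumptions, not MATSim's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    kind: str        # e.g., "home", "work", "shop"
    end_time: float  # planned end of the activity, in hours

@dataclass
class Trip:
    mode: str
    duration: float  # travel time, in hours

def arrival_times(plan, start=0.0):
    """Chain activities and trips; a delayed trip shifts all later arrivals."""
    t, arrivals = start, []
    for act, trip in plan:
        t = max(t, act.end_time) + trip.duration  # cannot leave before activity ends
        arrivals.append(t)
    return arrivals

plan = [(Activity("home", 7.0), Trip("car", 0.5)),
        (Activity("work", 17.0), Trip("car", 0.75))]
# arrival_times(plan) -> [7.5, 17.75]
```

If the first trip takes longer than planned, the second arrival moves accordingly, which is exactly the per-traveler coupling that time-dependent O-D matrices lose.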

    Wireless Networks In-the-Loop: Software Radio as the Enabler

    A software architecture to rapidly develop and test radio networks in virtual and physical environments is proposed. Radio network terminals are developed in software and run on generic hardware to maximize reconfigurability. Due to the software nature of the radio terminals, radio networks can be simulated in a virtual environment, where physical channels are emulated by software entities. Without any changes to the code base, the same waveform can also be run in a real, physical environment. This feature is used to rapidly switch between real and virtual networks, thus bridging the gap between simulation and physical reality. Aspects of the proposed system are implemented and demonstrated with the GNU Software Radio framework.
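The "same code base, two environments" idea can be sketched with a common channel interface: the waveform code is identical, and only the channel object decides whether samples pass through a software-emulated channel or real hardware. Class and method names are illustrative assumptions, not the GNU Radio API.

```python
class EmulatedChannel:
    """Virtual environment: apply a software channel model (here, flat gain)."""
    def __init__(self, gain=0.5):
        self.gain = gain
    def transmit(self, samples):
        return [self.gain * s for s in samples]

class HardwareChannel:
    """Physical environment: hand samples to a radio front end (stubbed here)."""
    def transmit(self, samples):
        raise NotImplementedError("would drive the RF hardware in a real setup")

def run_waveform(channel, samples):
    # The waveform is written once; swapping the channel object switches
    # between simulation and physical operation without code changes.
    return channel.transmit(samples)

received = run_waveform(EmulatedChannel(gain=0.5), [1.0, -1.0, 2.0])
# received == [0.5, -0.5, 1.0]
```

Switching to the physical network is then a one-line change of the channel object, which mirrors the rapid switching between real and virtual networks the abstract describes.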

    Quantitative comparison of DNA detection by GFP-lac repressor tagging, fluorescence in situ hybridization and immunostaining

    Background: GFP-fusion proteins and immunostaining are methods broadly applied to investigate the three-dimensional organization of cells and cell nuclei, the latter often studied in addition by fluorescence in situ hybridization (FISH). Direct comparisons of these detection methods are scarce, however.
    Results: We provide a quantitative comparison of all three approaches. We make use of a cell line that contains a transgene array of lac operator repeats which are detected by GFP-lac repressor fusion proteins. Thus we can detect the same structure in individual cells by GFP fluorescence, by antibodies against GFP and by FISH with a probe against the transgene array. Anti-GFP antibody detection was repeated after FISH. Our results show that while all four signals obtained from a transgene array generally showed qualitative and quantitative similarity, they also differed in details.
    Conclusion: Each of the tested methods revealed particular strengths and weaknesses, which should be considered when interpreting respective experimental results. Despite the required denaturation step, FISH signals in structurally preserved cells show a surprising similarity to signals generated before denaturation.

    Large Deviations for Random Matricial Moment Problems

    We consider the moment space $\mathcal{M}_n^{K}$ corresponding to $p \times p$ complex matrix measures defined on $K$ ($K=[0,1]$ or $K=\mathbb{D}$). We endow this set with the uniform law. We are mainly interested in large deviations principles (LDP) when $n \rightarrow \infty$. First we fix an integer $k$ and study the vector of the first $k$ components of a random element of $\mathcal{M}_n^{K}$. We obtain an LDP in the set of $k$-arrays of $p \times p$ matrices. Then we lift a random element of $\mathcal{M}_n^{K}$ into a random measure and prove an LDP at the level of random measures. We end with an LDP on Carathéodory and Schur random functions. These last functions are well connected to the above random measure. In all these problems, we take advantage of the so-called canonical moments technique by introducing new (matricial) random variables that are independent and have explicit distributions. (Comment: 34 pages)
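In the canonical-moments literature, the moment space appearing above is typically defined as follows; this is an assumed standard convention, not spelled out in the abstract:

```latex
% n-th moment space of normalized p x p matrix measures on K
% (assumed convention from the canonical-moments framework):
\mathcal{M}_n^{K} = \left\{ \Big( \int_K x \,\mathrm{d}\mu(x),\ \ldots,\ \int_K x^n \,\mathrm{d}\mu(x) \Big) \;:\; \mu \ \text{a normalized } p \times p \ \text{matrix measure on } K \right\}
```

Under this convention, the "first $k$ components" studied in the abstract are the first $k$ matrix moments of the random measure.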