
    Dissecting regional heterogeneity and modeling transcriptional cascades in brain organoids

    Over the past decade, there has been a rapid expansion in the development and use of brain organoid models, enabling three-dimensional, in vivo-like views of fundamental neurodevelopmental features of corticogenesis in health and disease. Nonetheless, the methods used for generating cortical organoid fates exhibit widespread heterogeneity across different cell lines. Here, we show that a combination of dual SMAD and WNT inhibition (Triple-i protocol) establishes a robust cortical identity in brain organoids, while other widely used derivation protocols are inconsistent with respect to regional specification. To measure this heterogeneity, we employ single-cell RNA sequencing (scRNA-Seq), which samples the gene expression profiles of thousands of cells in an individual sample. However, to draw meaningful conclusions from scRNA-Seq data, technical artifacts must be identified and removed. In this thesis, we present a method to detect one such artifact: empty droplets that do not contain a cell and consist mainly of free-floating mRNA from the sample. Furthermore, from their expression profiles, cells can be ordered along a developmental trajectory that recapitulates their progression as they differentiate. Based on this ordering, we model gene expression with a Bayesian inference approach to measure transcriptional dynamics within differentiating cells. This enables the ordering of genes along transcriptional cascades, statistical testing for differences in gene expression changes, and the measurement of potential regulatory gene interactions. We apply this approach to cortical neural stem cells differentiating into cortical neurons via an intermediate progenitor cell type in brain organoids, providing a detailed characterization of the endogenous molecular processes underlying neurogenesis.
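    As a rough illustration of the kind of Bayesian inference of transcriptional dynamics described above (a minimal sketch, not the thesis's actual model), the following Python snippet fits a sigmoid activation profile of a single gene along a pseudotime ordering using a random-walk Metropolis sampler; the simulated cells, parameter names, and priors are all hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Simulated data: one gene switching on along pseudotime (hypothetical example,
# not data from the thesis).
t = np.sort(rng.uniform(0.0, 1.0, 300))             # pseudotime of 300 cells
true_mu, true_k, true_lo, true_hi = 0.5, 15.0, 0.5, 4.0
expr = true_lo + (true_hi - true_lo) / (1.0 + np.exp(-true_k * (t - true_mu)))
expr += rng.normal(0.0, 0.3, size=t.shape)          # measurement noise

def log_likelihood(theta):
    """Gaussian log-likelihood of a sigmoid expression profile along pseudotime."""
    mu, k, lo, hi, sigma = theta
    if sigma <= 0.0:
        return -np.inf
    pred = lo + (hi - lo) / (1.0 + np.exp(-k * (t - mu)))
    resid = expr - pred
    return -0.5 * np.sum(resid**2) / sigma**2 - t.size * np.log(sigma)

# Random-walk Metropolis over (activation midpoint, steepness, low/high plateaus,
# noise sd); flat improper priors for simplicity, apart from requiring sigma > 0.
theta = np.array([0.4, 5.0, 1.0, 3.0, 1.0])
step = np.array([0.02, 0.5, 0.05, 0.05, 0.02])      # proposal scales
samples = []
ll = log_likelihood(theta)
for i in range(20000):
    prop = theta + rng.normal(0.0, step)
    ll_prop = log_likelihood(prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    if i >= 5000:                                    # discard burn-in
        samples.append(theta.copy())

mu_post = np.array(samples)[:, 0]
print(f"posterior activation midpoint: {mu_post.mean():.3f} "
      f"(95% interval {np.percentile(mu_post, 2.5):.3f} to {np.percentile(mu_post, 97.5):.3f})")

    Ordering genes by the posterior estimates of their activation midpoints along pseudotime is one simple way such per-gene fits can be arranged into a transcriptional cascade.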

    Updating structural wind turbine blade models via invertible neural networks

    Wind turbine rotor blades are large and complex composite structures that are exposed to exceptionally high extreme and fatigue loads. These can result in damage causing severe downtime or repair costs. It is thus of utmost importance that the blades are carefully designed, including uncertainty analyses, in order to produce safe, reliable, and cost-efficient wind turbines. An accurate reliability assessment should already start during the design and manufacturing phases. Recent developments in digitalization give rise to the concept of a digital twin, which replicates a product and its properties in a digital environment. Model updating is a technique that helps adapt the digital twin to the measured characteristics of the real structure. Current model updating techniques are most often based on heuristic optimization algorithms, which are computationally expensive, can only deal with a relatively small parameter space, or do not estimate the uncertainty of the computed results. The objective of this thesis is to present a computationally efficient model updating method that recovers parameter deviations while accounting for uncertainties and a high-fidelity rotor blade model. A validated, fully parameterized model generator is used to perform physics-informed training of a conditional invertible neural network. The trained network represents a surrogate of the inverse physical model, which can then be used to recover model parameters from the structural responses of the blade. All presented generic model updating applications show excellent results, predicting the posterior distribution of the significant model parameters accurately.
    Bundesministerium für Wirtschaft und Klimaschutz/Energietechnologien (BMWi)/0324032C, 0324335B/E
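    For intuition only, the sketch below implements a single conditional affine coupling layer in plain numpy, the building block from which conditional invertible neural networks are commonly assembled; the dimensions, subnet weights, and variable names are arbitrary placeholders rather than the thesis's trained blade model.

import numpy as np

rng = np.random.default_rng(1)

# Tiny conditional affine coupling layer (RealNVP-style), illustrating only the
# exact invertibility that a cINN surrogate relies on; all values are placeholders.
D_PARAM, D_COND, D_HIDDEN = 4, 3, 16            # blade parameters, measured responses, hidden units
W1 = rng.normal(0, 0.1, (D_PARAM // 2 + D_COND, D_HIDDEN))
W2 = rng.normal(0, 0.1, (D_HIDDEN, D_PARAM))    # produces log-scale s and shift t

def subnet(x1, cond):
    """Small MLP computing s and t from the untouched half plus the condition."""
    h = np.tanh(np.concatenate([x1, cond], axis=-1) @ W1)
    out = h @ W2
    return out[..., : D_PARAM // 2], out[..., D_PARAM // 2 :]   # s, t

def forward(x, cond):
    x1, x2 = x[..., : D_PARAM // 2], x[..., D_PARAM // 2 :]
    s, t = subnet(x1, cond)
    return np.concatenate([x1, x2 * np.exp(s) + t], axis=-1)

def inverse(y, cond):
    y1, y2 = y[..., : D_PARAM // 2], y[..., D_PARAM // 2 :]
    s, t = subnet(y1, cond)                     # y1 == x1, so s and t are recoverable
    return np.concatenate([y1, (y2 - t) * np.exp(-s)], axis=-1)

x = rng.normal(size=(5, D_PARAM))               # hypothetical blade parameter vectors
c = rng.normal(size=(5, D_COND))                # hypothetical measured structural responses
z = forward(x, c)
print(np.allclose(inverse(z, c), x))            # True: the mapping is exactly invertible

    Because each coupling layer is exactly invertible given its conditioning input, a trained stack of such layers can be run in reverse for a measured structural response, which is what allows plausible model parameters, and hence a posterior distribution, to be sampled.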

    Demand Response in Smart Grids

    The Special Issue “Demand Response in Smart Grids” includes 11 papers on a variety of topics. The success of this Special Issue demonstrates the relevance of demand response programs and events in the operation of power and energy systems, both at the distribution level and at the wider power system level. This reprint addresses the design, implementation, and operation of demand response programs, with a focus on methods and techniques to achieve optimized operation as well as on the electricity consumer.

    .NET/C# instrumentation for search-based software testing

    C# is one of the most widely used programming languages. However, to the best of our knowledge, there has been no work in the literature aimed at enabling search-based software testing techniques for applications running on the .NET platform, such as those written in C#. In this paper, we propose a search-based approach and an open-source tool to enable white-box testing of C# applications. The approach is integrated with .NET bytecode instrumentation in order to collect code coverage at runtime during the search. In addition, by taking advantage of the branch distance metric, we define heuristics to better guide the search, e.g., by estimating how close an execution is to covering a branch in the source code. To empirically evaluate our technique, we integrated our tool into the EvoMaster test generation tool and conducted experiments on three .NET RESTful APIs as case studies. Results show that our technique significantly outperforms gray-box testing tools in terms of code coverage.
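    As a language-agnostic illustration of the branch distance idea mentioned above (a sketch of the classic Korel/Tracey-style heuristic, not the paper's .NET instrumentation or the EvoMaster API), the following Python snippet computes how far a relational predicate is from being satisfied and normalizes the value so that distances from different branches are comparable.

# A minimal sketch of the branch-distance heuristic used in search-based testing;
# constants and function names are illustrative only.
K = 1.0  # small positive constant added when the predicate is false

def branch_distance(op, lhs, rhs):
    """How far the operands are from making the predicate `lhs op rhs` true (0 = true)."""
    if op == "==":
        return abs(lhs - rhs)
    if op == "!=":
        return 0.0 if lhs != rhs else K
    if op == "<":
        return 0.0 if lhs < rhs else (lhs - rhs) + K
    if op == "<=":
        return 0.0 if lhs <= rhs else (lhs - rhs) + K
    if op == ">":
        return 0.0 if lhs > rhs else (rhs - lhs) + K
    if op == ">=":
        return 0.0 if lhs >= rhs else (rhs - lhs) + K
    raise ValueError(f"unsupported operator: {op}")

def normalized(distance):
    """Map the raw distance into [0, 1) so different branches are comparable."""
    return distance / (distance + 1.0)

# Example: an input of 7 is 'closer' to covering the branch `x == 10` than 42 is.
print(normalized(branch_distance("==", 7, 10)))   # 0.75
print(normalized(branch_distance("==", 42, 10)))  # ~0.97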

    Computer Aided Verification

    This open access two-volume set, LNCS 13371 and 13372, constitutes the refereed proceedings of the 34th International Conference on Computer Aided Verification, CAV 2022, which was held in Haifa, Israel, in August 2022. The 40 full papers presented together with 9 tool papers and 2 case studies were carefully reviewed and selected from 209 submissions. The papers are organized in the following topical sections. Part I: invited papers; formal methods for probabilistic programs; formal methods for neural networks; software verification and model checking; hyperproperties and security; formal methods for hardware, cyber-physical, and hybrid systems. Part II: probabilistic techniques; automata and logic; deductive verification and decision procedures; machine learning; synthesis and concurrency. This is an open access book.

    Automated Approaches for Program Verification and Repair

    Formal methods techniques, such as verification, analysis, and synthesis, allow programmers to prove properties of their programs, or to automatically derive programs from specifications. Making such techniques usable requires care: they must provide useful debugging information, be scalable, and enable automation. This dissertation presents automated analysis and synthesis techniques to ease the debugging of modular verification systems and to allow easy access to constraint solvers from functional code. Further, it introduces machine-learning-based techniques to improve the scalability of off-the-shelf syntax-guided synthesis solvers, and techniques to reduce the burden on network administrators writing and analyzing firewalls. We describe the design and implementation of a symbolic execution engine, G2, for non-strict functional languages such as Haskell. We extend G2 to both debug and automate the process of modular verification, and give Haskell programmers easy access to constraint solvers via a library named G2Q. Modular verifiers, such as LiquidHaskell, Dafny, and ESC/Java, allow programmers to write and prove specifications of their code. When a modular verifier fails to verify a program, it is not necessarily because of an actual bug in the program: when verifying a function f, modular verifiers consider only the specification of a called function g, not the actual definition of g. Thus, a modular verifier may fail to prove a true specification of f if the specification of g is too weak. We present a technique, counterfactual symbolic execution, to aid in the debugging of modular verification failures. The approach uses symbolic execution to find concrete counterexamples, in the case of an actual inconsistency between a program and a specification, and abstract counterexamples, in the case that a function specification is too weak. Further, a technique based on a counterexample-guided inductive synthesis (CEGIS) loop is introduced to fully automate the process of modular verification, by using the found counterexamples to automatically infer the needed function specifications. The counterfactual symbolic execution and automated specification inference techniques are implemented in G2 and evaluated on existing LiquidHaskell errors and programs. We also leveraged G2 to build a library, G2Q, which allows writing constraint-solving problems directly as Haskell code. Users of G2Q can embed specially marked Haskell constraints (Boolean expressions) into their normal Haskell code, while marking some of the variables in the constraint as symbolic. Then, at runtime, G2Q automatically derives values for the symbolic variables that satisfy the constraint, and returns those values to the outside code. Unlike other constraint-solving solutions, such as directly calling an SMT solver, G2Q uses symbolic execution to unroll recursive function definitions, and guarantees that the use of G2Q constraints preserves type correctness. We further consider the problem of synthesizing functions via a class of tools known as syntax-guided synthesis (SyGuS) solvers. We introduce a machine-learning-based technique to preprocess SyGuS problems and reduce the space that the solver must search for a solution. We demonstrate that the technique speeds up an existing SyGuS solver, CVC4, on a set of SyGuS benchmarks. Finally, we describe techniques to ease the analysis and repair of firewalls. Firewalls are widely deployed to manage network security.
    However, firewall systems provide only a primitive interface, in which the specification is given as an ordered list of rules. This makes it hard to manually track and maintain the behavior of a firewall. We introduce a formal semantics for iptables firewall rules via a translation to first-order logic with uninterpreted functions and linear integer arithmetic, which allows firewalls to be encoded into a decidable logic. We then describe techniques to automate the analysis and repair of firewalls using SMT solvers, based on user-provided specifications of the desired behavior. We evaluate this approach with real-world case studies collected from StackOverflow users.
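    To illustrate the flavor of such an encoding (a toy sketch, not the thesis's actual iptables semantics or case studies), the Python snippet below encodes a small ordered rule list into the Z3 SMT solver (pip install z3-solver) and checks a user-style specification; the rule fields, the rules themselves, and the property are invented for the example.

from z3 import Int, And, Or, Not, Implies, Solver, sat

src_ip, dst_port = Int("src_ip"), Int("dst_port")

# First-match semantics over a hypothetical ordered rule list:
#   1. ACCEPT if dst_port == 22 and src_ip in [10, 20)   (trusted subnet, SSH)
#   2. DROP   if dst_port == 22                           (everyone else, SSH)
#   3. ACCEPT otherwise
rule1 = And(dst_port == 22, src_ip >= 10, src_ip < 20)
rule2 = dst_port == 22
accepted = Or(rule1, And(Not(rule1), Not(rule2)))

# Desired specification: no packet from outside [10, 20) ever reaches port 22.
spec = Implies(And(dst_port == 22, Or(src_ip < 10, src_ip >= 20)), Not(accepted))

s = Solver()
s.add(Not(spec))                     # search for a counterexample packet
if s.check() == sat:
    print("violation:", s.model())
else:
    print("firewall satisfies the specification")

    If the check is satisfiable, the model returned by the solver is a concrete packet witnessing the violation, which is the kind of counterexample that can drive automated repair.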

    Optical In-Process Measurement Systems

    Information is key, which means that measurements are key. For this reason, this book provides unique insight into state-of-the-art research on optical measurement systems. Optical systems are fast and precise, and the ongoing challenge is to enable optical principles for in-process measurements. Presented within this book is a selection of promising optical measurement approaches for real-world applications.

    ECOS 2012

    The 8-volume set contains the proceedings of the 25th International Conference ECOS 2012, held in Perugia, Italy, from June 26th to June 29th, 2012. ECOS is an acronym for Efficiency, Cost, Optimization and Simulation (of energy conversion systems and processes), summarizing the topics covered at ECOS: Thermodynamics, Heat and Mass Transfer, Exergy and Second Law Analysis, Process Integration and Heat Exchanger Networks, Fluid Dynamics and Power Plant Components, Fuel Cells, Simulation of Energy Conversion Systems, Renewable Energies, Thermo-Economic Analysis and Optimisation, Combustion, Chemical Reactors, Carbon Capture and Sequestration, Building/Urban/Complex Energy Systems, Water Desalination and Use of Water Resources, Energy Systems: Environmental and Sustainability Issues, System Operation/Control/Diagnosis and Prognosis, Industrial Ecology.