
    Towards Net Zero Greenhouse Gas Emissions in the Energy and Chemical Sectors in Switzerland and Beyond - A Review

    In today's societies, climate-damaging and finite fossil resources such as oil and natural gas serve a dual purpose as energy source and as carbon source for chemicals and plastics. To address this finite availability and to meet international climate goals, a transition to a renewable energy and raw material basis is inevitable, and it represents a highly complex task. In this review, we assess possible technology paths for Switzerland to reach these goals. First, we provide an overview of Switzerland's current energy demand and discuss possible renewable technologies as well as proposed scenarios to defossilize the current energy system. Here, electric vehicles and heat pumps are key technologies, while photovoltaics in particular replace nuclear power to deliver clean electricity. The production of chemicals also consumes fossil resources; for Switzerland, the oil demand embodied in imported, domestically used chemicals and plastics corresponds to around 20% of the current energetic oil demand. We therefore additionally summarize technologies and visions for a sustainable chemical sector based on the renewable carbon sources biomass, CO2 and recycled plastics. As biomass is the most versatile renewable energy and carbon source, albeit one of limited availability, aspects of and proposed strategies for its optimal use are discussed.

    Biochemical Conversion Processes of Lignocellulosic Biomass to Fuels and Chemicals – A Review

    Lignocellulosic biomass – such as wood, agricultural residues or dedicated energy crops – is a promising renewable feedstock for the production of fuels and chemicals that is available at large scale and low cost without directly competing with food production. Its biochemical conversion in a sugar-platform biorefinery comprises three main unit operations that are illustrated in this review: the physico-chemical pretreatment of the biomass, the enzymatic hydrolysis of the carbohydrates to a fermentable sugar stream by cellulases and, finally, the fermentation of the sugars by suitable microorganisms to the target molecules. Special emphasis in this review is put on the technology, commercial status and future prospects of the production of second-generation fuel ethanol, as this process has received the most research and development effort so far. Despite significant advances, high enzyme costs are still a hurdle for large-scale, competitive lignocellulosic ethanol production. This could be overcome by a strategy termed 'consolidated bioprocessing' (CBP), in which enzyme production, enzymatic hydrolysis and fermentation are integrated in one step, either by utilizing one genetically engineered superior microorganism or by creating an artificial co-culture. Insight is provided into both CBP strategies for the production of ethanol as well as of advanced fuels and commodity chemicals.
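    To put the sugar-platform route in concrete terms, the sketch below works out the standard stoichiometric ceiling for ethanol from glucan (glucose fermented to two ethanol and two CO2); the biomass composition and process yields in the example are assumed for illustration and are not taken from the review.

```python
# Stoichiometric ceiling for ethanol from glucan via the sugar platform:
# glucan --(hydrolysis, +H2O)--> glucose --(fermentation)--> 2 ethanol + 2 CO2.
# Real processes reach only a fraction of this ceiling.

M_GLUCAN = 162.14    # g/mol per anhydroglucose unit in glucan
M_GLUCOSE = 180.16   # g/mol
M_ETHANOL = 46.07    # g/mol

def max_ethanol_per_g_glucan() -> float:
    """Maximum grams of ethanol obtainable per gram of glucan."""
    glucose_per_glucan = M_GLUCOSE / M_GLUCAN        # hydrolysis adds one water
    ethanol_per_glucose = 2 * M_ETHANOL / M_GLUCOSE  # ~0.511 g/g
    return glucose_per_glucan * ethanol_per_glucose  # ~0.568 g/g

# Assumed example: 1 t biomass at 40% glucan, 90% hydrolysis yield,
# 90% of the theoretical fermentation yield.
glucan_kg = 1000 * 0.40
ethanol_kg = glucan_kg * max_ethanol_per_g_glucan() * 0.90 * 0.90
print(f"ceiling: {max_ethanol_per_g_glucan():.3f} g ethanol / g glucan")
print(f"example: {ethanol_kg:.0f} kg ethanol per tonne biomass")
```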

    Optimization of Quantum Optical Metrology Systems

    It can be said that all of humanity's efforts can be understood as a problem of optimization. We each have a natural sense of what is 'good' or 'bad', and thus our actions tend towards maximizing, or optimizing, some notion of good and minimizing those things we perceive as bad or undesirable. Within the sciences, the greatest form of good is knowledge. It is this pursuit of knowledge that leads not only to life-saving innovations and technology, but also to furthering our understanding of our natural world and driving our philosophical pursuits. The principal method of obtaining knowledge in the sciences is by performing measurements; the simple act of comparing one attribute of a system to a known standard and recording the observed value is how all scientific progress is made. The act of performing measurements is in fact so important that there is an entire field of study surrounding it: metrology. One critical component of metrology is the development of new techniques to perform measurements, or alternative measurement schemes that are more optimal in some way. This is where there is room to exploit quantum physics to improve our techniques: we can perform quantum metrology. In quantum mechanics we routinely deal with the smallest, weakest, most delicate of systems. Quantum properties are inherently very sensitive to their environment; this of course makes them highly intolerant of noise, but it also makes them great resources for performing sensitive measurements. Quantum metrology concerns itself with utilizing quantum phenomena to extract more information from the natural world than is possible by conventional, or classical, means. To perform optimal measurements, these quantum systems must of course be optimal by some metric. Performing the 'optimal' measurement requires several ingredients. First, we need the optimal tools or instrumentation. In quantum mechanical language, this means we need the optimal probe state. Then, we need to optimize the interaction of our instrumentation with the system we wish to interrogate so that we can extract the desired information. This translates to needing the best possible interaction between the probe state and the system in question; in other words, we need to optimize the evolution of the probe. Finally, we must take care to extract as much information as possible at the output; we must not neglect any information present in the evolved probe state. The entire quantum metrology process can thus be summarized as: probe state preparation, probe state evolution, and evolved state detection. These elements make up the basis of this thesis. Within, I will discuss several works in which I optimize the performance of systems that implement these metrology elements. Specifically, I will first discuss one such system in which I optimize the probe interrogation of the phase, i.e. perform phase estimation, in a boson-sampling device. Then, I will show some strategies to progressively build up highly useful Fock states starting from single photons as a resource. Lastly, I show how to utilize quantum properties of quantum light (specifically, squeezed light) to accurately calibrate a single-photon detector without the need for a reference detector.
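    As a numerical illustration of the classical-versus-quantum gap mentioned above, the sketch below compares the textbook phase-uncertainty scalings for N probe photons: the shot-noise limit reached by classical probes and the Heisenberg limit reachable with nonclassical probes. This is standard quantum-metrology background, not a result of the thesis.

```python
import math

# Phase-estimation uncertainty with N probe photons (ideal, lossless case):
#   shot-noise limit (classical / coherent probes):          dphi ~ 1/sqrt(N)
#   Heisenberg limit (entangled probes, e.g. NOON states):   dphi ~ 1/N
for n in (10, 100, 10_000, 1_000_000):
    sql = 1 / math.sqrt(n)
    heisenberg = 1 / n
    print(f"N={n:>9}: shot-noise dphi={sql:.2e}, "
          f"Heisenberg dphi={heisenberg:.2e}, gain={sql / heisenberg:.0f}x")
```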

    Computing multi-scale organizations built through assembly

    The ability to generate and control assembling structures built over many orders of magnitude is an unsolved challenge of engineering and science. Many of the presumed transformational benefits of nanotechnology and robotics rest directly on this capability. There are still significant theoretical difficulties associated with building such systems, though technology is rapidly making the needed tools available in chemical, electronic, and robotic domains. In this thesis, a simulated, general-purpose computational prototype is developed which is capable of unlimited assembly and controlled by external input, as well as an additional prototype which, when assembled into structures, can emulate any other computing device. These devices are entirely finite-state and distributed in operation. Because of these properties and the unique ability to form structures of unlimited size and unlimited computational power, the prototypes represent a novel and useful blueprint on which to base scalable assembly in other domains. A new assembling model of Computational Organization and Regulation over Assembly Levels (CORAL) is also introduced, providing the necessary framework for this investigation. The strict constraints of the CORAL model allow only a single type of assembling unit and distributed control, and ensure that units cannot be reprogrammed: all reprogramming is done via assembly. Multiple units are instead structured into aggregate computational devices using a procedural or developmental approach. Well-defined comparison of computational power between levels of organization is ensured by the structure of the model. By eliminating ambiguity, the CORAL model provides a pragmatic answer to open questions regarding a framework for hierarchical organization. Finally, a comparison between the designed prototypes and units evolved using evolutionary algorithms is presented as a platform for further research into novel scalable assembly. Evolved units are capable of recursive pairing under the control of a signal, a primitive form of unlimited assembly, and do so via symmetry-breaking operations at each step. The results provide heuristic evidence for a required minimal threshold of complexity, and challenges and limitations of the approach are identified for future evolutionary studies.
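    The claim that structures built from identical finite-state units can grow in computational power with assembly can be illustrated with a toy example (a generic sketch, not the CORAL model itself): a chain of identical one-bit units acts as a ripple counter whose aggregate state space grows as 2^k with chain length k, even though each unit alone has only two states.

```python
# Illustrative sketch: identical finite-state one-bit units assembled into
# a chain. A single unit has two states; a chain of k units behaves as a
# 2^k-state counter, so capacity grows with assembly, not with the unit.

class Unit:
    """A one-bit finite-state unit; toggles on input and emits a carry."""
    def __init__(self) -> None:
        self.bit = 0

    def pulse(self) -> bool:
        self.bit ^= 1
        return self.bit == 0    # carry when wrapping 1 -> 0

def tick(chain: list[Unit]) -> None:
    for unit in chain:          # propagate the pulse along the assembly
        if not unit.pulse():    # stop when no carry is produced
            break

chain = [Unit() for _ in range(4)]   # 4 identical units -> 16 aggregate states
for _ in range(10):
    tick(chain)
print([u.bit for u in chain])        # 10 in binary, LSB first: [0, 1, 0, 1]
```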

    Semirings of Evidence

    In traditional justification logic, evidence terms have the syntactic form of polynomials, but they are not equipped with the corresponding algebraic structure. We present a novel semantic approach to justification logic that models evidence by a semiring. Justification terms can then be interpreted as polynomial functions on that semiring. This provides an adequate semantics for evidence terms and clarifies the role of variables in justification logic. Moreover, the algebraic structure makes it possible to compute with evidence. Depending on the chosen semiring, this can be used to model trust, probabilities, cost, etc. Last but not least, the semiring approach seems promising for obtaining a realization procedure for modal fixed-point logics.
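    A minimal sketch of the semiring idea follows (a hypothetical interface, not the paper's formalism): the same evidence polynomial, say x·y + z, is evaluated differently depending on which semiring supplies the operations, yielding trust, cost, or confidence readings of the same term.

```python
# Hypothetical sketch: one evidence polynomial, three semirings.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass(frozen=True)
class Semiring:
    add: Callable[[Any, Any], Any]  # combines alternative evidence (+)
    mul: Callable[[Any, Any], Any]  # combines joint evidence (*)
    zero: Any
    one: Any

boolean = Semiring(lambda a, b: a or b, lambda a, b: a and b, False, True)  # trust
tropical = Semiring(min, lambda a, b: a + b, float("inf"), 0.0)             # cost
viterbi = Semiring(max, lambda a, b: a * b, 0.0, 1.0)                       # confidence

def evaluate(s: Semiring, x: Any, y: Any, z: Any) -> Any:
    """Evaluate the evidence polynomial x*y + z over the chosen semiring."""
    return s.add(s.mul(x, y), z)

print(evaluate(boolean, True, False, True))   # trust: True
print(evaluate(tropical, 2.0, 3.0, 7.0))      # cheapest evidence: 5.0
print(evaluate(viterbi, 0.9, 0.8, 0.5))       # most confident: 0.72
```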

    Large-Scale MIMO Detection for 3GPP LTE: Algorithms and FPGA Implementations

    Large-scale (or massive) multiple-input multiple-output (MIMO) is expected to be one of the key technologies in next-generation multi-user cellular systems, based, for example, on the upcoming 3GPP LTE Release 12 standard. In this work, we propose, to the best of our knowledge, the first VLSI design enabling high-throughput data detection in single-carrier frequency-division multiple access (SC-FDMA)-based large-scale MIMO systems. We propose a new approximate matrix inversion algorithm relying on a Neumann series expansion, which substantially reduces the complexity of linear data detection. We analyze the associated error, and we compare its performance and complexity to those of an exact linear detector. We present corresponding VLSI architectures, which perform exact and approximate soft-output detection for large-scale MIMO systems with various antenna/user configurations. Reference implementation results for a Xilinx Virtex-7 XC7VX980T FPGA show that our designs are able to achieve more than 600 Mb/s for a 128-antenna, 8-user 3GPP LTE-based large-scale MIMO system. We finally provide a performance/complexity trade-off comparison using the presented FPGA designs, which reveals that the detector circuit of choice is determined by the ratio between BS antennas and users, as well as by the desired error-rate performance. Comment: To appear in the IEEE Journal of Selected Topics in Signal Processing.
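    The Neumann-series idea can be sketched in a few lines (illustrative only; the dimensions and noise level below are assumed, not taken from the paper). With many more base-station antennas than users, the regularized Gram matrix is diagonally dominant, so a short series around its diagonal approximates the inverse needed for linear detection.

```python
import numpy as np

# Neumann-series approximate inversion for linear detection in large-scale
# MIMO. Split A = D + E (diagonal + off-diagonal); when D dominates,
#   A^-1 ~= sum_{k=0..K} (-D^-1 E)^k D^-1.
rng = np.random.default_rng(0)
B, U, N0, K = 128, 8, 0.1, 2   # BS antennas, users, noise variance, order

H = (rng.standard_normal((B, U)) + 1j * rng.standard_normal((B, U))) / np.sqrt(2)
A = H.conj().T @ H + N0 * np.eye(U)    # regularized Gram matrix (MMSE)

D_inv = np.diag(1.0 / np.diag(A))      # exact inverse of the diagonal part
E = A - np.diag(np.diag(A))            # off-diagonal part
M = -D_inv @ E

A_inv_approx = D_inv.copy()
term = D_inv
for _ in range(K):                     # accumulate M^k D^-1 for k = 1..K
    term = M @ term
    A_inv_approx += term

A_inv = np.linalg.inv(A)
err = np.linalg.norm(A_inv_approx - A_inv) / np.linalg.norm(A_inv)
print(f"relative error of order-{K} Neumann approximation: {err:.2e}")
```

    The error of the truncated series shrinks as the antenna/user ratio grows, which matches the paper's observation that this ratio determines the detector circuit of choice.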

    Steam Explosion Pretreatment of Beechwood. Part 2: Quantification of Cellulase Inhibitors and Their Effect on Avicel Hydrolysis

    Steam explosion is a well-known process to pretreat lignocellulosic biomass in order to enhance sugar yields in enzymatic hydrolysis, but pretreatment conditions have to be optimized individually for each material. In this study, we investigated how the results of a pretreatment optimization procedure are influenced by the chosen reaction conditions in the enzymatic hydrolysis. Beechwood was pretreated by steam explosion and the resulting biomass was subjected to enzymatic hydrolysis at glucan loadings of 1% and 5%, employing either washed solids or the whole pretreatment slurry. For enzymatic hydrolysis in both reaction modes at a glucan loading of 1%, the glucose yields increased markedly with increasing severity and, at identical severities, with increasing pretreatment temperature; maximal values were reached at a pretreatment temperature of 230 °C. However, the optimal severity was 5.0 for enzymatic hydrolysis of washed solids, but only 4.75 for whole-slurry enzymatic hydrolysis. When the glucan loading was increased to 5%, glucose yields hardly increased for pretreatment temperatures between 210 and 230 °C at a given severity, and a pretreatment temperature of 220 °C was sufficient under these conditions. Consequently, it is important to choose the conditions of the enzymatic hydrolysis reaction precisely when aiming to optimize the pretreatment conditions for a certain biomass.
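    The trade-off between pretreatment temperature and time is usually captured by a severity factor; assuming the common Overend–Chornet definition log10(R0) = log10(t · exp((T - 100)/14.75)), with t in minutes and T in °C (whether this study uses exactly this form is an assumption), the sketch below shows how one severity maps to different temperature/time pairs, which is what allows temperatures to be compared "at identical severities".

```python
import math

# Severity factor as commonly defined in the steam-explosion literature
# (Overend-Chornet): log10(R0) = log10( t * exp((T - 100) / 14.75) ),
# t in minutes, T in deg C. Numbers below are illustrative only.

def severity(t_min: float, temp_c: float) -> float:
    return math.log10(t_min * math.exp((temp_c - 100.0) / 14.75))

def time_for_severity(log_r0: float, temp_c: float) -> float:
    """Residence time (min) needed to reach a target severity at temp_c."""
    return 10.0 ** log_r0 / math.exp((temp_c - 100.0) / 14.75)

for temp in (210, 220, 230):
    print(f"{temp} C: severity 4.75 in {time_for_severity(4.75, temp):6.1f} min, "
          f"severity 5.00 in {time_for_severity(5.00, temp):6.1f} min")
```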

    Engineering of ecological niches to create stable artificial consortia for complex biotransformations

    The design of controllable artificial microbial consortia has attracted considerable interest in recent years as a way to capitalize on their inherent advantages over monocultures, such as the distribution of the metabolic burden by division of labor, modularity, and the ability to convert complex substrates. One promising approach to controlling consortium composition, function and stability is the provision of defined ecological niches fitted to the specific needs of the consortium members. In this review, we discuss recent examples of the creation of metabolic niches by biological engineering of resource partitioning and syntrophic interactions. Moreover, we introduce a complementary process engineering approach that provides defined spatial niches with differing abiotic conditions (e.g. O2, temperature, light) in stirred-tank reactors harboring biofilms. This enables the co-cultivation of microorganisms with non-overlapping abiotic requirements and the control of the strain ratio in consortia characterized by substrate competition.