
    Co-authorship networks in Swiss political research

    Co-authorship is an important indicator of scientific collaboration. Co-authorship networks are composed of sub-communities, and researchers can gain visibility by connecting these insulated subgroups. This article presents a comprehensive co-authorship network analysis of Swiss political science. Three levels are addressed: disciplinary cohesion and structure at large, communities, and the integrative capacity of individual researchers. The results suggest that collaboration exists across geographical and language borders even though different regions focus on complementary publication strategies. The subfield of public policy and administration has the highest integrative capacity. Co-authorship is a function of several factors, most importantly being in the same subfield. At the individual level, the analysis identifies researchers who belong to the “inner circle” of Swiss political science and who link different communities. In contrast to previous research, the analysis is based on the full set of publications of all political researchers employed in Switzerland in 2013, including past publications.
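    The "integrative capacity" of individual researchers can be made concrete with a toy graph. The sketch below (pure Python; the researcher labels and edges are invented, not the paper's data) finds cut vertices: authors whose removal disconnects otherwise insulated sub-communities.

```python
from collections import defaultdict, deque

# Hypothetical toy co-authorship network: nodes are researchers,
# edges indicate at least one joint publication (labels are invented).
edges = [
    ("A", "B"), ("B", "C"), ("A", "C"),  # sub-community 1
    ("D", "E"), ("E", "F"), ("D", "F"),  # sub-community 2
    ("C", "D"),                          # broker edge linking the two groups
]

adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def is_connected(adj, removed=None):
    """BFS reachability check, optionally ignoring one node."""
    nodes = [n for n in adj if n != removed]
    seen, queue = {nodes[0]}, deque([nodes[0]])
    while queue:
        n = queue.popleft()
        for m in adj[n]:
            if m != removed and m not in seen:
                seen.add(m)
                queue.append(m)
    return len(seen) == len(nodes)

# Cut vertices: researchers whose removal disconnects the network,
# i.e. the extreme case of high "integrative capacity" between subgroups.
brokers = sorted(n for n in adj if not is_connected(adj, removed=n))
print(brokers)  # → ['C', 'D']
```

    Betweenness centrality would rank brokers more finely; cut vertices are simply the limiting case where a single researcher is the only link between two communities.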

    From Social Simulation to Integrative System Design

    As the recent financial crisis showed, there is today a strong need to gain an "ecological perspective" on all relevant interactions in socio-economic-techno-environmental systems. For this, we suggest setting up a network of Centers for Integrative Systems Design, which would be able to run all potentially relevant scenarios, identify causality chains, explore feedback and cascading effects for a number of model variants, and determine the reliability of their implications (given the validity of the underlying models). They would be able to detect possible negative side effects of policy decisions before they occur. The Centers belonging to this network of Integrative Systems Design Centers would each focus on a particular field, but they would be part of an attempt to eventually cover all relevant areas of society and economy and integrate them within a "Living Earth Simulator". The results of all research activities of such Centers would be turned into informative input for political Decision Arenas. For example, Crisis Observatories (for financial instabilities, shortages of resources, environmental change, conflict, spreading of diseases, etc.) would be connected with such Decision Arenas for the purpose of visualization, in order to make complex interdependencies understandable to scientists, decision-makers, and the general public.

    Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c

    The Application of Robust Regression to a Production Function Comparison – the Example of Swiss Corn

    The adequate representation of crop response functions is crucial for agri-environmental modeling and analysis. So far, the evaluation of such functions has focused on the comparison of different functional forms. This perspective is expanded in this article by considering an alternative regression method. This is motivated by the fact that exceptional crop yield observations (outliers) can cause misleading results if least squares regression is applied. We show that such outliers are adequately treated if robust regression is used instead. The example of simulated Swiss corn yields shows that the use of robust regression narrows the range of optimal input levels across different functional forms and reduces the potential costs of misspecification.

    Keywords: production function estimation, production function comparison, robust regression, crop response
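    The effect the abstract describes can be reproduced in a few lines. The sketch below uses illustrative numbers (not the paper's simulated corn yields) and contrasts an ordinary least squares slope with a Theil–Sen estimate, one standard robust alternative: a single exceptional observation distorts the former but barely moves the latter.

```python
import numpy as np

# Illustrative sketch (not the paper's model): fit y ≈ a*x + b where one
# "crop yield" observation is a gross outlier.
rng = np.random.default_rng(0)
x = np.arange(10.0)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 10)  # true slope is 2
y[9] += 30.0  # one exceptional yield observation (outlier)

# Ordinary least squares slope: pulled away from 2 by the outlier.
ols_slope = np.polyfit(x, y, 1)[0]

# Theil–Sen robust slope: median over all pairwise slopes.
slopes = [(y[j] - y[i]) / (x[j] - x[i])
          for i in range(len(x)) for j in range(i + 1, len(x))]
ts_slope = np.median(slopes)

print(round(ols_slope, 2), round(ts_slope, 2))
```

    The robust estimate stays near the true slope of 2, while least squares is dragged upward by the single outlier; this is the mechanism behind the narrowed range of optimal input levels reported above.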

    Marcel Grossmann and his contribution to the general theory of relativity

    This article reviews the biography of the Swiss mathematician Marcel Grossmann (1878-1936) and his contributions to the emergence of the general theory of relativity. The first part is his biography, while the second part reviews his collaboration with Einstein in Zurich, which resulted in the Einstein-Grossmann theory of 1913. This theory is a precursor version of the final theory of general relativity, with all the ingredients of that theory except for the correct gravitational field equations. Their collaboration is analyzed in some detail with a focus on the question of exactly what role Grossmann played in it.

    Comment: 52pp, 7 figs, to appear in Proceedings of 13th Marcel Grossmann meeting; revised version with some minor stylistic emendations

    Transformations of High-Level Synthesis Codes for High-Performance Computing

    Specialized hardware architectures promise a major step in performance and energy efficiency over the traditional load/store devices currently employed in large-scale computing systems. The adoption of high-level synthesis (HLS) from languages such as C/C++ and OpenCL has greatly increased programmer productivity when designing for such platforms. While this has enabled a wider audience to target specialized hardware, the optimization principles known from traditional software design are no longer sufficient to implement high-performance codes. Fast and efficient codes for reconfigurable platforms are thus still challenging to design. To alleviate this, we present a set of optimizing transformations for HLS, targeting scalable and efficient architectures for high-performance computing (HPC) applications. Our work provides a toolbox for developers, where we systematically identify classes of transformations, the characteristics of their effect on the HLS code and the resulting hardware (e.g., increasing data reuse or resource consumption), and the objectives that each transformation can target (e.g., resolving interface contention or increasing parallelism). We show how these can be used to efficiently exploit pipelining, on-chip distributed fast memory, and on-chip streaming dataflow, allowing for massively parallel architectures. To quantify the effect of our transformations, we use them to optimize a set of throughput-oriented FPGA kernels, demonstrating that our enhancements are sufficient to scale up parallelism within the hardware constraints. With the transformations covered, we hope to establish a common framework for performance engineers, compiler developers, and hardware developers to tap into the performance potential offered by specialized hardware architectures using HLS.

    Strategies to mitigate greenhouse gas and nitrogen emissions in Swiss agriculture: the application of an integrated sector model

    Environmental impacts of agricultural production, such as greenhouse gas (GHG) and nitrogen emissions, are of major concern for scientists and policy makers throughout the world. Global agricultural activities account for about 60% of nitrous oxide and about 50% of methane emissions. From a global perspective, methane and nitrous oxide constitute crucial GHGs: they contribute substantially to climate change due to their high global warming potential compared to carbon dioxide. Emissions of these gases depend on the extent of agricultural production and the applied technologies. Therefore, the analysis of potential mitigation opportunities is challenging and requires an integrated approach that links agricultural economic perspectives to environmental aspects. In view of this, a mathematical programming model has been developed which enables the assessment of cost-effective strategies for mitigating GHG and nitrogen emissions in the agricultural sector in Switzerland. This model is applied to improve understanding of the agricultural sector and its behavior under changing conditions in technology and policy. The presented recursive-dynamic model mimics the structure and interdependencies of Swiss agriculture and links that framework to core sources of GHG and nitrogen emissions. Calculated results for evaluation and application indicate that the employed flexibility constraints provide a feasible approach to sufficiently validate the described model. Recursive-dynamic elements additionally enable adequate modeling of both an endogenous development of livestock dynamics and investments in buildings and machinery, also taking sunk costs into account. The presented findings reveal that the specified model approach is suitable for accurately estimating agricultural structure and GHG and nitrogen emissions within a tolerable range. The model performance can therefore be described as sufficiently robust and satisfactory. Thus, the model described here appropriately represents strategies for GHG and nitrogen abatement in Swiss agriculture. The results indicate that there are limits to the ability of Swiss agriculture to contribute substantially to the mitigation of GHG and nitrogen emissions: only a limited level of mitigation is available through technical approaches, and these approaches come at high cost.

    Keywords: resource use, environmental economics, greenhouse gas emission, nitrogen emission, integrated modeling
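    The core logic of cost-effective mitigation can be sketched far more simply than the paper's recursive-dynamic sector model: order candidate measures by marginal abatement cost and adopt them until a target is met. The measure names and figures below are invented for illustration only.

```python
# Illustrative sketch only: greedy marginal-abatement-cost ordering, not the
# paper's mathematical programming model. All figures are invented.
# Each measure: (name, abatement in kt CO2-eq, cost in million CHF).
measures = [
    ("feed additives",     40.0, 12.0),
    ("manure management",  25.0,  5.0),
    ("fertilizer timing",  15.0,  2.0),
    ("herd reduction",     60.0, 48.0),
]

def cheapest_plan(measures, target):
    """Adopt measures in order of ascending cost per kt abated
    until the abatement target is reached."""
    plan, abated, cost = [], 0.0, 0.0
    for name, abatement, c in sorted(measures, key=lambda m: m[2] / m[1]):
        if abated >= target:
            break
        plan.append(name)
        abated += abatement
        cost += c
    return plan, abated, cost

plan, abated, cost = cheapest_plan(measures, target=70.0)
print(plan, abated, cost)
```

    The steep rise in cost for the last measures in such an ordering is exactly the pattern behind the abstract's conclusion that only limited mitigation is available through technical approaches, and at high cost.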

    Data-driven Flood Emulation: Speeding up Urban Flood Predictions by Deep Convolutional Neural Networks

    Computational complexity has been the bottleneck of applying physically-based simulations to large urban areas at high spatial resolution for efficient and systematic flooding analyses and risk assessments. To address this issue of long computation times, this paper proposes that the prediction of maximum water depth rasters can be treated as an image-to-image translation problem, where the results are generated from input elevation rasters using information learned from data rather than by conducting simulations, which can significantly accelerate the prediction process. The proposed approach was implemented by a deep convolutional neural network trained on flood simulation data of 18 designed hyetographs on three selected catchments. Multiple tests with both designed and real rainfall events were performed, and the results show that the flood predictions by the neural network require only 0.5% of the time of physically-based approaches, with promising accuracy and generalization ability. The proposed neural network can also potentially be applied to different but related problems, including flood prediction for urban layout planning.
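    The image-to-image framing can be illustrated with a single (untrained) convolution layer: an elevation raster goes in, a same-sized raster comes out. A real emulator stacks many such layers and learns the weights from simulated flood data; the raster values and placeholder kernel below are invented.

```python
import numpy as np

# Minimal sketch of the image-to-image idea in pure NumPy: one 3x3
# convolution with zero padding maps an elevation raster to an output
# raster of the same shape. All values are illustrative.
def conv2d_same(raster, kernel):
    """'Same' 2D convolution with zero padding."""
    k = kernel.shape[0]
    pad = k // 2
    padded = np.pad(raster, pad)
    out = np.zeros_like(raster, dtype=float)
    h, w = raster.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + k, j:j + k] * kernel)
    return out

elevation = np.array([[3.0, 2.0, 1.0],
                      [3.0, 1.0, 1.0],
                      [3.0, 2.0, 2.0]])
kernel = np.full((3, 3), 1.0 / 9.0)  # smoothing kernel as placeholder weights

depth_proxy = conv2d_same(elevation, kernel)
print(depth_proxy.shape)  # → (3, 3): output raster matches input resolution
```

    Once trained, evaluating such a network is a fixed number of array operations, which is why inference is orders of magnitude faster than running the physical simulation.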

    Icequakes coupled with surface displacements for predicting glacier break-off

    A hanging glacier at the east face of the Weisshorn (Switzerland) broke off in 2005. We were able to monitor and measure surface motion and icequake activity for 25 days, up to three days prior to the break-off. The analysis of seismic waves generated by the glacier during the rupture maturation process revealed four types of precursory signals of the imminent catastrophic rupture: (i) an increase in seismic activity within the glacier, (ii) a decrease in the waiting time between two successive icequakes, (iii) a change in the size-frequency distribution of icequake energy, and (iv) a modification in the structure of the waiting-time distributions between two successive icequakes. Moreover, it was possible to demonstrate the existence of a correlation between the seismic activity and the log-periodic oscillations of the surface velocities superimposed on the global acceleration of the glacier during the rupture maturation. Analysis of the seismic activity led us to the identification of two regimes: a stable phase with diffuse damage, and an unstable and dangerous phase characterized by a hierarchical cascade of rupture instabilities in which large icequakes are triggered.

    Comment: 16 pages, 7 figures
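    Two of the precursors listed above are straightforward to compute from an event catalogue: the waiting times between successive icequakes (precursor ii) and the exponent of the size-frequency energy distribution (precursor iii). The sketch below uses a small synthetic catalogue, not the Weisshorn data.

```python
import math

# Synthetic event catalogue (invented numbers, for illustration only).
event_times = [0.0, 5.0, 9.0, 12.0, 14.0, 15.5, 16.5, 17.2]  # hours
energies = [1.2, 3.5, 1.1, 8.0, 2.2, 15.0, 1.4, 4.1]         # arbitrary units

# (ii) Waiting times between successive icequakes: a shrinking mean
# signals accelerating activity toward failure.
waits = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
mean_wait = sum(waits) / len(waits)

# (iii) Size-frequency exponent via the maximum-likelihood (Hill)
# estimator for a power law p(E) ~ E^(-alpha), E >= E_min; tracking
# this exponent over time reveals changes in the distribution.
e_min = min(energies)
alpha = 1.0 + len(energies) / sum(math.log(e / e_min) for e in energies)

print(round(mean_wait, 2), round(alpha, 2))  # → 2.46 1.99
```

    In a monitoring setting, both quantities would be recomputed over a sliding time window, so that a systematic drift marks the transition from the stable to the unstable regime.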

    Cosmology in doubly coupled massive gravity: constraints from SNIa, BAO and CMB

    Massive gravity in the presence of a doubly coupled matter field via an effective composite metric yields an accelerated expansion of the universe. It has recently been shown that the model admits stable de Sitter attractor solutions and could be used as a dark energy model. In this work, we perform a first analysis of the constraints imposed by the SNIa, BAO and CMB data on the massive gravity model with the effective composite metric and show that all the background observations are mutually compatible with the model at the one-sigma level.

    Comment: 7 pages, 4 figures