
    Double polarisation experiments in meson photoproduction

    One of the remaining challenges within the standard model is to gain a good understanding of QCD in the non-perturbative regime. A key step towards this aim is baryon spectroscopy, which investigates the spectrum and the properties of baryon resonances. Photoproduction experiments provide essential access to resonances with small πN partial widths. Partial wave analyses need to be performed to extract the contributing resonances, and a complete experiment is required to unambiguously determine the contributing amplitudes. This involves the measurement of carefully chosen single and double polarisation observables. In a joint endeavour by MAMI, ELSA, and Jefferson Laboratory, a new generation of experiments with polarised beams, polarised proton and neutron targets, and 4π particle detectors has been performed in recent years. Many results of unprecedented quality were recently published by all three laboratories and included by the various partial wave analysis groups in their analyses, leading to substantial improvements, e.g. a more precise determination of resonance parameters. An overview of recent results is given, with an emphasis on results from the CBELSA/TAPS experiment, and their impact on our understanding of the nucleon excitation spectrum is discussed.
    Comment: 6 pages, 5 figures, Proceedings of MESON2016. arXiv admin note: text overlap with arXiv:1601.0132

    Industry, firm, year, and country effects on profitability: Evidence from a large sample of EU food processing firms

    This paper analyzes the variance in accounting profitability within the European food industry. Based on a large panel data set, the variance in return on assets (ROA) is decomposed into year, country, industry, and firm effects. We further include all possible interactions between year, country, and industry and discuss the theoretical foundations for these effects. After singling out the significant effect classes in a nested ANOVA with a carefully designed rotation of the order in which effects are introduced, we determine effect magnitudes using components of variance (COV). Our results show that firm characteristics are far more important than industry structure in determining the level of economic return within the food industry. Year and country effects, as well as their interactions, were weak or insignificant, indicating that macroeconomics and trade theory offer little potential to explain performance differentials. While neither national nor industry-specific cycles were significant, EU-wide fluctuations contributed significantly to explaining differences in performance, suggesting that economic cycles in the EU are by and large synchronized.
    Keywords: variance components, abnormal profit, EU-27, MBV, RBV, comparative advantage, Agribusiness, Agricultural Finance, Financial Economics, Industrial Organization, Marketing
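    The decomposition this abstract describes can be illustrated with a small numpy sketch on synthetic data. All numbers below are invented, and the simple group-mean estimator is only a crude stand-in for the paper's nested ANOVA with components of variance:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical panel: 200 firms observed over 10 years, 8 (sub)industries.
    n_firms, n_years, n_inds = 200, 10, 8
    industry = rng.integers(0, n_inds, n_firms)      # each firm belongs to one industry
    firm_eff = rng.normal(0.0, 0.040, n_firms)       # persistent firm effect (dominant)
    ind_eff = rng.normal(0.0, 0.010, n_inds)         # industry-structure effect
    year_eff = rng.normal(0.0, 0.005, n_years)       # EU-wide business-cycle effect

    # Simulated ROA panel (firms x years) = sum of effects + idiosyncratic noise.
    roa = (firm_eff[:, None] + ind_eff[industry][:, None]
           + year_eff[None, :] + rng.normal(0.0, 0.020, (n_firms, n_years)))

    # Crude components-of-variance estimates from group means
    # (a proper nested ANOVA would correct for sampling error).
    year_comp = roa.mean(axis=0).var()               # variance across year means
    ind_means = np.array([roa[industry == k].mean() for k in range(n_inds)])
    ind_comp = ind_means.var()                       # variance across industry means
    firm_comp = roa.mean(axis=1).var() - ind_comp    # firm-mean variance net of industry
    ```

    With effects generated this way, the firm component dominates the industry and year components, mirroring the paper's qualitative finding that firm characteristics matter far more than industry structure.
    
    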

    Mathematics and Statistics in the Social Sciences

    Over the years, mathematics and statistics have become increasingly important in the social sciences. A look at history quickly confirms this claim. At the beginning of the 20th century most theories in the social sciences were formulated in qualitative terms, while quantitative methods did not play a substantial role in their formulation and establishment. Moreover, many practitioners considered mathematical methods to be inappropriate and simply unsuited to foster our understanding of the social domain. Notably, the famous Methodenstreit also concerned the role of mathematics in the social sciences. Here, mathematics was considered to be the method of the natural sciences, from which the social sciences had to be separated during the period of maturation of these disciplines. All this changed by the end of the century. By then, mathematical, and especially statistical, methods were standardly used, and their value in the social sciences became relatively uncontested. The use of mathematical and statistical methods is now ubiquitous: almost all social sciences rely on statistical methods to analyze data and form hypotheses, and almost all of them use (to a greater or lesser extent) a range of mathematical methods to help us understand the social world. A further indication of the increasing importance of mathematical and statistical methods in the social sciences is the formation of new subdisciplines and the establishment of specialized journals and societies. Indeed, subdisciplines such as Mathematical Psychology and Mathematical Sociology emerged, and corresponding journals such as The Journal of Mathematical Psychology (since 1964), The Journal of Mathematical Sociology (since 1976), Mathematical Social Sciences (since 1980), as well as the online journals Journal of Artificial Societies and Social Simulation (since 1998) and Mathematical Anthropology and Cultural Theory (since 2000), were established.
What is more, societies such as the Society for Mathematical Psychology (since 1976) and the Mathematical Sociology Section of the American Sociological Association (since 1996) were founded. Similar developments can be observed in other countries. The mathematization of economics set in somewhat earlier (Vazquez 1995; Weintraub 2002). However, the use of mathematical methods in economics started booming only in the second half of the last century (Debreu 1991). Contemporary economics is dominated by the mathematical approach, although a certain style of doing economics has come increasingly under attack in the last decade or so. Recent developments in behavioral economics and experimental economics can also be understood as a reaction against the dominance (and limitations) of an overly mathematical approach to economics. There are similar debates in other social sciences. It is, however, important to stress that problems of one method (such as axiomatization or the use of set theory) can hardly be taken as a sign of the bankruptcy of mathematical methods in the social sciences tout court. This chapter surveys mathematical and statistical methods used in the social sciences and discusses some of the philosophical questions they raise. It is divided into two parts. Sections 1 and 2 are devoted to mathematical methods, and Sections 3 to 7 to statistical methods. As several other chapters in this handbook provide detailed accounts of various mathematical methods, our remarks about the latter will be rather short and general. Statistical methods, on the other hand, will be discussed in depth.

    Assessment of the microbial community in the cathode compartment of a plant microbial fuel cell

    Introduction: In plant microbial fuel cells (plant-MFCs), living plants and microorganisms form an electrochemical unit able to produce clean and sustainable electricity from solar energy. It is reasonable to assume that, besides the bacteria in the anode compartment, the cathode compartment also plays a crucial role in a stable, high-current-producing plant-MFC. In this study we aim to identify the dominant bacterial species in the cathode compartment of the plant-MFC.

    Uncertainty Modelling of Laser Scanning Point Clouds Using Machine-Learning Methods

    Terrestrial laser scanners (TLSs) are a standard method for 3D point cloud acquisition due to their high data rates and resolutions. In certain applications, such as deformation analysis, modelling the uncertainties in the 3D point cloud is crucial. This study models the systematic deviations in laser scan distance measurements as a function of various influencing factors using machine-learning methods. A reference point cloud is recorded using a laser tracker (Leica AT 960) and a handheld scanner (Leica LAS-XL) to investigate the uncertainties of the Z+F Imager 5016 under laboratory conditions. From 49 TLS scans, a wide range of data is obtained, covering various influencing factors. The processes of data preparation, feature engineering, validation, regression, prediction, and result analysis are presented. The results of traditional machine-learning methods (multiple linear and nonlinear regression) are compared with eXtreme gradient boosted trees (XGBoost). It is demonstrated that the systematic deviations of the distance measurement can be modelled with a coefficient of determination of 0.73, making it possible to calibrate the distance measurement and thereby improve the laser scan measurements. An independent TLS scan is used to demonstrate the calibration results.
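    The regression-based calibration idea can be sketched in a few lines of numpy. This is not the paper's actual pipeline: the influencing factors, the deviation model, and all magnitudes are invented, and an extra polynomial feature stands in for a flexible learner such as XGBoost:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 2000
    # Hypothetical influencing factors: distance [m], incidence angle [rad], intensity.
    dist = rng.uniform(1.0, 60.0, n)
    ang = rng.uniform(0.0, 1.2, n)
    inten = rng.uniform(0.1, 1.0, n)

    # Synthetic systematic distance deviation [m] with a nonlinear angle term + noise.
    dev = 0.2e-3 * dist + 1.5e-3 * ang**2 - 0.5e-3 * inten + rng.normal(0, 0.3e-3, n)

    def r2(y, yhat):
        """Coefficient of determination."""
        return 1.0 - ((y - yhat) ** 2).sum() / ((y - y.mean()) ** 2).sum()

    # Multiple linear regression (design matrix with intercept).
    X_lin = np.column_stack([np.ones(n), dist, ang, inten])
    beta, *_ = np.linalg.lstsq(X_lin, dev, rcond=None)
    r2_lin = r2(dev, X_lin @ beta)

    # Quadratic angle feature as a simple stand-in for a more flexible learner.
    X_nl = np.column_stack([X_lin, ang**2])
    beta_nl, *_ = np.linalg.lstsq(X_nl, dev, rcond=None)
    r2_nl = r2(dev, X_nl @ beta_nl)

    # Calibration: subtract the predicted systematic deviation from the measurement.
    calibrated = dev - X_nl @ beta_nl
    ```

    The more flexible model attains at least the linear model's coefficient of determination, and subtracting the predicted deviation shrinks the residual scatter, which is the essence of the calibration the abstract reports.
    
    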

    EU School Fruit Scheme: Strengthening Local Businesses

    The EU School Fruit Scheme (SFS) provides children with fruits and vegetables (F&V), aiming to promote F&V consumption among European school children. A further objective of the program is to stabilize the fruit and vegetable market in the EU. The program varies between EU countries and, for some countries such as Germany, even between the participating federal states. This paper concentrates on the specific situation in North Rhine-Westphalia (NRW), Germany. Our research therefore aims to map and analyze the situation of companies involved in the SFS in NRW, to reveal the social and economic driving forces for those companies to engage in the SFS, and to identify the networks that have developed as well as the factors that lead to success for those companies. For this purpose, quantitative data on commodity flows of delivered goods and the logistics are combined with case studies gained from qualitative interviews. The results show that companies involved in the school fruit scheme range from small farms and one-man retail businesses to large multinational retail companies. According to our findings, especially small and medium-sized enterprises (SMEs) benefit from the SFS in NRW. In urban areas some firms have developed relationships with more than 30 schools, leading to high turnover from their engagement in the SFS. In general, supply relationships vary in economic characteristics (variety, product value, origin) as well as in social attributes (motivation, social embedding).

    Fitting Terrestrial Laser Scanner Point Clouds with T-Splines: Local Refinement Strategy for Rigid Body Motion

    T-splines have recently been introduced to represent objects of arbitrary shape using a smaller number of control points than conventional non-uniform rational B-splines (NURBS) or B-spline representations in computer-aided design, computer graphics, and reverse engineering. They are flexible in representing complex surface shapes and economical in terms of parameters, as they enable local refinement. This property is a great advantage when dense, scattered, and noisy point clouds, such as those from a terrestrial laser scanner (TLS), are approximated by least-squares fitting. Unfortunately, when assessing the goodness of fit of the surface approximation on a real dataset, only a noisy point cloud is available: (i) a low root mean squared error (RMSE) can be linked to overfitting, i.e., a fitting of the noise, and should be avoided, and (ii) a high RMSE is synonymous with a lack of detail. To judge the approximation reliably, the reference surface should be known entirely: this can be achieved by printing a mathematically defined T-splines reference surface in three dimensions (3D) and modeling the artefacts induced by the 3D printing. Once the object is scanned under different configurations, it is possible to assess the goodness of fit of the approximation for a noisy and potentially gappy point cloud and to compare it with the traditional but less flexible NURBS. The advantages of T-splines local refinement open the door to further applications within a geodetic context, such as rigorous statistical testing of deformation. Two different scans from a slightly deformed object were approximated; we found that more than 40% of the computational time could be saved, without affecting the goodness of fit of the surface approximation, by using the same mesh for the two epochs.
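    The overfitting point the abstract makes, that a low RMSE against the noisy scan can hide a poor fit to the true surface, is easy to demonstrate on a toy problem. The sketch below is a deliberately simplified 1D analogue: a Fourier basis of varying size stands in for spline bases with and without refinement, and the "reference surface" is a known sine curve rather than a printed T-spline object:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    x = np.linspace(0.0, 1.0, 60)
    truth = np.sin(2 * np.pi * x)                   # known reference shape (1D analogue)
    scan = truth + rng.normal(0.0, 0.1, x.size)     # simulated noisy scan of it

    def fourier_design(m):
        """Fourier basis with m harmonics: a stand-in for spline basis functions."""
        cols = [np.ones_like(x)]
        for j in range(1, m + 1):
            cols += [np.cos(2 * np.pi * j * x), np.sin(2 * np.pi * j * x)]
        return np.column_stack(cols)

    def rmse_pair(m):
        """Least-squares fit; RMSE vs the noisy scan and vs the known reference."""
        A = fourier_design(m)
        coef, *_ = np.linalg.lstsq(A, scan, rcond=None)
        pred = A @ coef
        rmse_scan = np.sqrt(np.mean((pred - scan) ** 2))
        rmse_truth = np.sqrt(np.mean((pred - truth) ** 2))
        return rmse_scan, rmse_truth

    coarse = rmse_pair(2)    # few parameters
    fine = rmse_pair(20)     # many parameters: lower scan RMSE, but fits the noise
    ```

    The over-parameterised fit reports the lower RMSE against the noisy data while being further from the known reference, which is exactly why the paper needs a fully known printed reference surface to judge the approximation.
    
    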

    Using Least-Squares Residuals to Assess the Stochasticity of Measurements—Example: Terrestrial Laser Scanner and Surface Modeling

    Terrestrial laser scanners (TLS) capture a large number of 3D points rapidly, with high precision and spatial resolution. These scanners are used for applications as diverse as modeling architectural or engineering structures and high-resolution mapping of terrain. The noise of the observations cannot be assumed to be white: besides being heteroscedastic, correlations between observations are likely to appear due to the high scanning rate. Unfortunately, while the variance can sometimes be modeled based on physical or empirical considerations, the correlations are more often neglected. Trustworthy knowledge of the noise structure is, however, mandatory to avoid overestimating the precision of the point cloud and, potentially, failing to detect deformation between scans recorded at different epochs with statistical testing strategies. TLS point clouds can be approximated with parametric surfaces, such as planes, using the Gauss–Helmert model, or with the newly introduced T-splines surfaces. In both cases, the goal is to minimize the squared distance between the observations and the approximating surface in order to estimate parameters such as the normal vector or the control points. In this contribution, we show how the residuals of the surface approximation can be used to derive the correlation structure of the noise of the observations. We estimate the correlation parameters using the Whittle maximum likelihood and use simulations and real data to validate our methodology. Using the least-squares adjustment as a "filter of the geometry" paves the way for the determination of a correlation model for many sensors recording 3D point clouds.
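    The residuals-to-correlation idea can be sketched end to end with numpy. This is an illustrative toy, not the paper's setup: the "surface" is a straight line, the correlated noise is AR(1) with an assumed coefficient of 0.6, and the Whittle likelihood is minimised by a simple grid search rather than a proper optimiser:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, phi_true = 4000, 0.6

    # Simulate AR(1)-correlated measurement noise on a straight-line "surface".
    eps = np.empty(n)
    eps[0] = rng.normal()
    for t in range(1, n):
        eps[t] = phi_true * eps[t - 1] + rng.normal()
    x = np.linspace(0.0, 1.0, n)
    obs = 2.0 + 0.5 * x + 0.002 * eps            # observations = geometry + noise

    # Least-squares fit of the geometry; the residuals carry the noise structure.
    A = np.column_stack([np.ones(n), x])
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    res = obs - A @ coef

    # Whittle likelihood: periodogram of the residuals vs the AR(1) spectral density.
    freqs = 2 * np.pi * np.arange(1, n // 2) / n
    I = np.abs(np.fft.rfft(res)[1:n // 2]) ** 2 / (2 * np.pi * n)

    def whittle_nll(phi):
        g = 1.0 / (1.0 - 2.0 * phi * np.cos(freqs) + phi**2)   # AR(1) spectral shape
        s2 = np.mean(2 * np.pi * I / g)                        # profiled innovation variance
        return freqs.size * np.log(s2) + np.sum(np.log(g))

    grid = np.linspace(-0.95, 0.95, 381)
    phi_hat = grid[np.argmin([whittle_nll(p) for p in grid])]   # estimated correlation
    ```

    The least-squares fit "filters" the geometry, and the Whittle estimate recovers the correlation coefficient of the simulated noise from the residual periodogram, which is the validation-by-simulation step the abstract describes.
    
    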