60 research outputs found

    Genetic Optimization Using Derivatives: The rgenoud Package for R

    genoud is an R function that combines evolutionary algorithm methods with a derivative-based (quasi-Newton) method to solve difficult optimization problems. genoud may also be used for optimization problems for which derivatives do not exist. genoud solves problems that are nonlinear or perhaps even discontinuous in the parameters of the function to be optimized. When the function to be optimized (for example, a log-likelihood) is nonlinear in the model's parameters, the function will generally not be globally concave and may have irregularities such as saddlepoints or discontinuities. Optimization methods that rely on derivatives of the objective function may be unable to find any optimum at all. Multiple local optima may exist, so that there is no guarantee that a derivative-based method will converge to the global optimum. On the other hand, algorithms that do not use derivative information (such as pure genetic algorithms) are for many problems needlessly poor at local hill climbing. Most statistical problems are regular in a neighborhood of the solution. Therefore, for some portion of the search space, derivative information is useful. The function supports parallel processing on multiple CPUs on a single machine or a cluster of computers.
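The hybrid strategy described above, global exploration by an evolutionary algorithm combined with derivative-based local refinement, can be sketched in a few lines of Python. This is a toy illustration of the idea only, not the rgenoud API; the objective, operators, and all parameter settings below are invented for the example.

```python
import math
import random

def f(x):
    # Multimodal toy objective: many local minima, global minimum near x ~ -0.51
    return x * x + 10.0 * math.sin(3.0 * x)

def numeric_grad(func, x, h=1e-6):
    # Central-difference derivative, for when analytic gradients are unavailable
    return (func(x + h) - func(x - h)) / (2.0 * h)

def local_descent(func, x, steps=200, lr=0.01):
    # Derivative-based refinement (a crude stand-in for a quasi-Newton step)
    for _ in range(steps):
        x -= lr * numeric_grad(func, x)
    return x

def hybrid_optimize(func, pop_size=40, gens=30, lo=-10.0, hi=10.0, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=func)                      # rank by fitness (minimization)
        elite = pop[: pop_size // 4]            # selection: keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = 0.5 * (a + b) + rng.gauss(0.0, 0.5)   # crossover + mutation
            children.append(child)
        pop = elite + children
        pop[0] = local_descent(func, pop[0])    # refine the incumbent best
    return min(pop, key=func)

best = hybrid_optimize(f)
```

The evolutionary phase keeps the search from stalling in a local basin, while the descent step exploits the regularity near a solution, which is the division of labour the abstract argues for.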

    Benford's law predicted digit distribution of aggregated income taxes: the surprising conformity of Italian cities and regions

    The yearly aggregated tax income data of all (more than 8000) Italian municipalities are analyzed over a period of five years, from 2007 to 2011, to test conformity with Benford's law, a counter-intuitive phenomenon observed in large tabulated data sets in which numbers with smaller leading digits occur more often than those with larger ones. This is done in anticipation that large deviations from Benford's law will be found, in view of tax evasion supposedly being widespread across Italy. Contrary to expectations, we show that the overall tax income data for all these years are in excellent agreement with Benford's law. Furthermore, we also analyze the data of Calabria, Campania and Sicily, the three Italian regions known for a strong mafia presence, to see if there are any marked deviations from Benford's law. Again, we find that all yearly data sets for Calabria and Sicily agree with Benford's law, whereas only the 2007 and 2008 data show departures from the law for Campania. These results are again surprising in view of the underground and illegal nature of the economic activities of the mafia, which contribute significantly to tax evasion. Some hypotheses for the observed conformity are presented. (18 pages, 5 tables, 4 figures, 61 references; to appear in the European Physical Journal)
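Benford's law itself is easy to state: the probability of leading digit d is log10(1 + 1/d), so a 1 leads about 30.1% of the time and a 9 only about 4.6%. A minimal Python sketch of checking a sample against this prediction (illustrative only, not the authors' methodology; powers of 2 are used as a classic Benford-conforming sample):

```python
import math
from collections import Counter

def benford_expected():
    # P(d) = log10(1 + 1/d) for leading digit d = 1..9
    return {d: math.log10(1.0 + 1.0 / d) for d in range(1, 10)}

def leading_digit(x):
    # First nonzero decimal digit of a positive number
    for ch in str(abs(x)):
        if ch in "123456789":
            return int(ch)

def chi_square_stat(values):
    # Pearson chi-square distance between observed leading-digit
    # counts and the Benford prediction
    n = len(values)
    obs = Counter(leading_digit(v) for v in values)
    exp = benford_expected()
    return sum((obs.get(d, 0) - n * exp[d]) ** 2 / (n * exp[d])
               for d in range(1, 10))

sample = [2 ** k for k in range(1, 500)]   # geometric growth follows Benford
stat = chi_square_stat(sample)
```

A small statistic indicates conformity; in practice the statistic would be compared against a chi-square distribution with 8 degrees of freedom.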

    Complete Higgs sector constraints on dimension-6 operators

    Constraints on the full set of Standard Model dimension-6 operators have previously used triple-gauge couplings to complement the constraints obtainable from Higgs signal strengths. Here we extend previous analyses of the Higgs sector constraints by including information from the associated production of Higgs and massive vector bosons (H+V production), which excludes a direction of limited sensitivity allowed by partial cancellations in the triple-gauge sector measured at LEP. Kinematic distributions in H+V production provide improved sensitivity to dimension-6 operators, as we illustrate here with simulations of the invariant mass and pT distributions measured by D0 and ATLAS, respectively. We provide bounds from a global fit to a complete set of CP-conserving operators affecting Higgs physics.

    The effective Standard Model after LHC Run I

    We treat the Standard Model as the low-energy limit of an effective field theory that incorporates higher-dimensional operators to capture the effects of decoupled new physics. We consider the constraints imposed on the coefficients of dimension-6 operators by electroweak precision tests (EWPTs), applying a framework for the effects of dimension-6 operators on EWPTs that is more general than the standard S, T formalism, and use measurements of Higgs couplings and the kinematics of associated Higgs production at the Tevatron and LHC, as well as triple-gauge couplings at the LHC. We highlight the complementarity between EWPTs, Tevatron and LHC measurements in obtaining model-independent limits on the effective Standard Model after LHC Run 1. We illustrate the combined constraints with the example of the two-Higgs-doublet model.

    On species distribution modelling, spatial scales and environmental flow assessment with Multi Layer Perceptron Ensembles: A case study on the redfin barbel (Barbus haasi; Mertens, 1925)

    Inconsistent performance of Species Distribution Models (SDMs), which may depend on several factors such as the initial conditions or the applied modelling technique, is one of the greatest challenges in ecological modelling. To overcome this problem, ensemble modelling combines the forecasts of several individual models. A commonly applied ensemble modelling technique is the Multi Layer Perceptron (MLP) Ensemble, which was envisaged in the 1990s. However, despite its potential for ecological modelling, it has received little attention in the development of SDMs for freshwater fish. Although this approach originally included all the developed MLPs, Genetic Algorithms (GAs) now allow selection of the optimal subset of MLPs and thus substantial improvement of model performance. In this study, MLP Ensembles were used to develop SDMs for the redfin barbel (Barbus haasi; Mertens, 1925) at two different spatial scales: the micro scale and the meso scale. Finally, the potential of the MLP Ensembles for environmental flow (e-flow) assessment was tested by linking model results to hydraulic simulation. MLP Ensembles with candidate selection based on GAs outperformed both the optimal single MLP and the ensemble of the whole set of MLPs. The micro scale model complemented previous studies, showing high suitability of relatively deep areas with coarse substrate and corroborating the need for cover and the rheophilic nature of the redfin barbel. The meso scale model highlighted the advantages of using cross-scale variables, since elevation (a macro scale variable) was selected in the optimal model. Although the meso scale model also demonstrated that redfin barbel selects deep areas, it partially contradicted the micro scale model, because velocity had a clearer positive effect on habitat suitability and redfin barbel showed a preference for fine substrate in the meso scale model. Although the meso scale model suggested an overall higher habitat suitability of the test site, this did not result in a notably higher minimum environmental flow. Our results demonstrate that MLP Ensembles are a promising tool in the development of SDMs for freshwater fish species and are proficient in e-flow assessment.

    This study was funded by the Spanish Ministry of Economy and Competitiveness through the project SCARCE (Consolider-Ingenio 2010 CSD2009-00065). We thank the Confederacion Hidrografica del Jucar (Spanish Ministry of Agriculture, Food and Environment), especially the Office for Water Planning and Teodoro Estrela, for the data provided to develop the SDMs. Finally, we would like to thank TECNOMA S.A. for the development of the hydraulic model in the Mijares River and all the people who participated in the field data collection. Muñoz Mas, R.; Martinez-Capel, F.; Alcaraz-Hernández, J.D.; Mouton, A. (2017). On species distribution modelling, spatial scales and environmental flow assessment with Multi Layer Perceptron Ensembles: A case study on the redfin barbel (Barbus haasi; Mertens, 1925). Limnologica, 62. https://doi.org/10.1016/j.limno.2016.09.004
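The GA-based candidate selection described above can be illustrated schematically: encode membership of each model in the ensemble as one bit, and evolve bit masks to minimize the error of the averaged prediction. In the Python sketch below, trivial linear predictors stand in for trained MLPs, and every name and setting is invented for illustration; it is not the authors' implementation.

```python
import random

def make_pool(n_models, rng):
    # Toy stand-in for a pool of trained MLPs: each "model" is a noisy
    # approximation of the true response y = 0.5 * x + 0.2
    pool = []
    for _ in range(n_models):
        a = 0.5 + rng.gauss(0.0, 0.3)   # slope error
        b = 0.2 + rng.gauss(0.0, 0.2)   # intercept error
        pool.append(lambda x, a=a, b=b: a * x + b)
    return pool

def ensemble_mse(mask, pool, xs, ys):
    # Validation error of the mean prediction of the selected subset
    chosen = [m for m, keep in zip(pool, mask) if keep]
    if not chosen:
        return float("inf")
    err = 0.0
    for x, y in zip(xs, ys):
        pred = sum(m(x) for m in chosen) / len(chosen)
        err += (pred - y) ** 2
    return err / len(xs)

def ga_select(pool, xs, ys, pop_size=30, gens=40, seed=1):
    rng = random.Random(seed)
    n = len(pool)
    pop = [[rng.random() < 0.5 for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda m: ensemble_mse(m, pool, xs, ys))
        nxt = pop[: pop_size // 4]                 # elitism
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[: pop_size // 2], 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]              # one-point crossover
            i = rng.randrange(n)
            child[i] = not child[i]                # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda m: ensemble_mse(m, pool, xs, ys))

rng = random.Random(0)
pool = make_pool(20, rng)
xs = [i / 10.0 for i in range(11)]
ys = [0.5 * x + 0.2 for x in xs]
best_mask = ga_select(pool, xs, ys)
```

The GA can favour subsets whose individual errors cancel in the average, which is why a selected subset can beat both the single best model and the full ensemble.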

    Election Forensics Toolkit DRG Center Working Paper

    There is an acute need for methods of detecting and investigating fraud in elections, because the consequences of electoral fraud are grave for democratic stability and quality. When the electoral process is compromised by fraud, intimidation, or even violence, elections can become corrosive and destabilizing—sapping support for democratic institutions; inflaming suspicion; and stimulating demand for extra-constitutional means of pursuing political agendas, including violence. Accurate information about irregularities can help separate false accusations from evidence of electoral malfeasance. Accurate information about the scope of irregularities can also provide a better gauge of election quality. Finally, accurate information about the geographic location of malfeasance—the locations where irregularities occurred and how they cluster—can allow election monitors and pro-democracy organizations to focus attention and resources more efficiently and to substantiate their assessments of electoral quality.

    Election forensics is an emerging field in which scholars use a diverse set of statistical tools—including techniques similar to those developed to detect financial fraud—to analyze numerical electoral data and detect where patterns deviate from those that should occur naturally, following demonstrated mathematical principles. Numbers that humans have manipulated present patterns that are unlikely to occur if produced by a natural process, such as free and fair elections or normal commercial transactions. These deviations suggest either that the numbers were intentionally altered or that other factors, such as a range of normal strategic voting practices, influenced the electoral results. The greater the number of statistical tests that identify patterns deviating from what is expected to occur naturally, the more likely it is that the deviation results from fraud rather than legal strategic voting.

    Through a Research and Innovation Grant funded by USAID's Center of Excellence on Democracy, Human Rights, and Governance under the Democracy Fellows and Grants Program, a research team from the University of Michigan, led by Professors Walter Mebane and Allen Hicken, built an innovative online tool, the Election Forensics Toolkit, that allows researchers and practitioners to conduct complex statistical analysis on detailed, localized data produced through the electoral process. The Election Forensics Toolkit presents results in a variety of ways—including detailed country maps showing "hot spots" of potential fraud—that allow practitioners to see not only where electoral fraud may have occurred but also the probability that the disturbances the statistical analyses detect in the election data are attributable to fraud rather than to other cultural or political influences, such as gerrymandering or the geographic distribution of voting constituencies, among others.

    The team also produced two publications under the DFG grant: a Guide to Election Forensics and a more detailed Election Forensics Toolkit DRG Center Working Paper. The Guide provides a general introduction to election forensics as a field, while the DRG Center Working Paper presents in detail the results of applying election forensics to specific elections in Afghanistan, Albania, Bangladesh, Cambodia, Kenya, Libya, South Africa, and Uganda.
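One family of tests used in election forensics examines digit distributions of precinct-level vote counts, for example against the Benford second-digit distribution, whose mean is about 4.187 for conforming data. The Python sketch below is a hedged illustration of that single idea, not the Toolkit's implementation; the example counts are made up.

```python
import math
from collections import Counter

def second_digit_expected():
    # Benford second-digit probabilities:
    #   P(d2) = sum over d1 = 1..9 of log10(1 + 1/(10*d1 + d2))
    return {d2: sum(math.log10(1.0 + 1.0 / (10 * d1 + d2))
                    for d1 in range(1, 10))
            for d2 in range(10)}

def second_digit(count):
    # Second digit of a vote count, or None for single-digit counts
    s = str(count)
    return int(s[1]) if len(s) >= 2 else None

def second_digit_profile(counts):
    # Observed second-digit frequencies of (e.g.) precinct-level vote counts
    digits = [d for d in (second_digit(c) for c in counts) if d is not None]
    obs = Counter(digits)
    return {d: obs.get(d, 0) / len(digits) for d in range(10)}

expected = second_digit_expected()
expected_mean = sum(d * p for d, p in expected.items())   # about 4.187

# Hypothetical precinct counts, for illustration only
profile = second_digit_profile([152, 103, 387, 2210, 98, 764, 5321, 1109, 233, 47])
```

In practice such a test is only one signal among many; as the text stresses, deviations can also reflect legal strategic voting, so multiple tests are combined before drawing conclusions.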