
    The Socio-Economic Correlates of Party Reform


    Theory of Finite Pseudoalgebras

    Conformal algebras, recently introduced by Kac, encode an axiomatic description of the singular part of the operator product expansion in conformal field theory. The objective of this paper is to develop the theory of "multi-dimensional" analogues of conformal algebras. They are defined as Lie algebras in a certain "pseudotensor" category instead of the category of vector spaces. A pseudotensor category (as introduced by Lambek, and by Beilinson and Drinfeld) is a category equipped with "polylinear maps" and a way to compose them. This allows for the definition of Lie algebras, representations, cohomology, etc. An instance of such a category can be constructed starting from any cocommutative (or, more generally, quasitriangular) Hopf algebra H. The Lie algebras in this category are called Lie H-pseudoalgebras. The main result of this paper is the classification of all simple and all semisimple Lie H-pseudoalgebras which are finitely generated as H-modules. We also start developing the representation theory of Lie pseudoalgebras; in particular, we prove analogues of the Lie, Engel, and Cartan-Jacobson Theorems. We show that the cohomology theory of Lie pseudoalgebras describes extensions and deformations and is closely related to Gelfand-Fuchs cohomology. Lie pseudoalgebras are closely related to solutions of the classical Yang-Baxter equation, to differential Lie algebras (introduced by Ritt), and to the Hamiltonian formalism in the theory of nonlinear evolution equations. As an application of our results, we derive a classification of simple and semisimple linear Poisson brackets in any finite number of indeterminates. Comment: 102 pages, 7 figures, AMS LaTeX
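
    In the "one-dimensional" case H = C[\partial], Lie H-pseudoalgebras reduce to Kac's Lie conformal algebras, so the following standard axioms (from the conformal-algebra literature, not quoted from this paper) give a concrete picture of what the pseudoalgebra bracket generalizes. A Lie conformal algebra is a C[\partial]-module with a \lambda-bracket satisfying
    [\partial a_\lambda b] = -\lambda [a_\lambda b], \quad [a_\lambda \partial b] = (\partial + \lambda)[a_\lambda b]   (sesquilinearity),
    [b_\lambda a] = -[a_{-\lambda-\partial} b]   (skew-symmetry),
    [a_\lambda [b_\mu c]] = [[a_\lambda b]_{\lambda+\mu} c] + [b_\mu [a_\lambda c]]   (Jacobi identity).
    Roughly speaking, for a general cocommutative Hopf algebra H the bracket of a Lie H-pseudoalgebra L instead takes values in (H \otimes H) \otimes_H L, and analogues of the same three axioms are imposed in the pseudotensor category.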

    The Architecture of MEG Simulation and Analysis Software

    MEG (Mu to Electron Gamma) is an experiment dedicated to the search for the \mu^+ \rightarrow e^+ \gamma decay, which is strongly suppressed in the Standard Model but predicted in several supersymmetric extensions of it at an accessible rate. MEG is a small-size experiment (approximately 50-60 physicists at any time) with a life span of about 10 years. The limited human resources available, in particular in the core offline group, emphasized the importance of reusing software and exploiting existing expertise. Great care has been devoted to providing a simple system that hides implementation details from the average programmer. That allowed many members of the collaboration to contribute to the development of the experiment's software with limited programming skills. The offline software is based on two frameworks: REM, in FORTRAN 77, used for the event generation and detector simulation package GEM, based on GEANT 3; and ROME, in C++, used in the readout simulation Bartender and in the reconstruction and analysis program Analyzer. Event display in the simulation is based on the GEANT 3 graphics libraries and, in the reconstruction, on the ROOT graphics libraries. Data are stored in different formats at various stages of the processing. The frameworks include utilities for input/output, database handling and format conversion that are transparent to the user. Comment: Presented at the IEEE NSS, Knoxville, 2010. Revised according to referee's remarks. Accepted by European Physical Journal Plus

    Abundance and American democracy: a test of dire predictions

    The American political system was severely tested in the 1970s and it is not yet obvious that the system's response to those tests was adequate. Some scholars have argued that the confusion we witnessed in energy, environmental and economic policies was symptomatic of even worse situations to come. Their consensus is that our style of democratic politics is incapable of dealing with the problems we increasingly face. Consequently, they predict that democracy's days are numbered. Furthermore, many Americans sense that the "joy ride" may be over, and that our economy may be hard pressed to maintain standards, much less continue its historic growth. One poll showed a 34 percent increase, since 1977, in respondents who believe "The United States is in deep and serious trouble," and a well-known economist, employing the terminology of game theory, has suggested that ours has become a "Zero-Sum society."

    Lexical and discursive analysis in Internet news

    Proceedings and papers of the 28th Fórum Acadêmico de Letras, held on 23-25 August 2017 at the Universidade Federal da Integração Latino-Americana (Unila) and the Universidade Estadual do Oeste do Paraná (Unioeste), under the theme: research in language and literature programs in a context of languages and cultures in contact. This article investigates two news items published on the internet, with the aim of analyzing and comparing the lexical elements used in the texts, including the words employed, their meanings, and the senses they convey to the interlocutor. The analyses are grounded in Lexicology and Text Linguistics, showing that the use of one expression rather than another is not accidental; on the contrary, each expression serves a specific function related to the process of referencing. The analysis considered the text as a whole, the author's argumentative purpose, and the context of production that characterizes the discourse, since the text is marked by coherence, the chaining of ideas, and textual progression, its main characteristic being a communicability that depends on both production and interpretation.

    Congressional Voting and Ecological Issues


    Irredundant Triangular Decomposition

    Triangular decomposition is a classic, widely used and well-developed way to represent algebraic varieties, with many applications. In particular, there exist sharp degree bounds for a single triangular set in terms of intrinsic data of the variety it represents, and powerful randomized algorithms for computing triangular decompositions using Hensel lifting in the zero-dimensional case and for irreducible varieties. However, in the general case, most of the algorithms computing triangular decompositions produce embedded components, which makes it impossible to directly apply the intrinsic degree bounds. This, in turn, is an obstacle to efficiently applying Hensel lifting due to the higher degrees of the output polynomials and the lower probability of success. In this paper, we give an algorithm to compute an irredundant triangular decomposition of an arbitrary algebraic set W defined by a set of polynomials in C[x_1, x_2, ..., x_n]. Using this irredundant triangular decomposition, we were able to give intrinsic degree bounds for the polynomials appearing in the triangular sets and to apply Hensel lifting techniques. Our decomposition algorithm is randomized, and we analyze the probability of success.
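
    A toy illustration (ours, not the paper's): in C[x, y] with variable order x < y, the algebraic set defined by {x^2 - 1, (x - 1) y} has the irredundant triangular decomposition
    T_1 = { x - 1 }  (y free),    T_2 = { x + 1, y },
    whose zero sets are the line x = 1 and the single point (-1, 0). Appending a further triangular set such as { x - 1, y } would make the decomposition redundant, since its zero set is contained in that of T_1; removing such superfluous components is precisely what allows intrinsic degree bounds to be applied to each triangular set.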

    The first version of the Buffered Large Analog Bandwidth (BLAB1) ASIC for high luminosity collider and extensive radio neutrino detectors

    Future detectors for high luminosity particle identification and ultra-high-energy neutrino observation would benefit from a digitizer capable of recording sensor elements with high analog bandwidth and large record depth, in a cost-effective, compact and low-power way. A first version of the Buffered Large Analog Bandwidth (BLAB1) ASIC has been designed based upon the lessons learned from the development of the Large Analog Bandwidth Recorder and Digitizer with Ordered Readout (LABRADOR) ASIC. While the LABRADOR ASIC has been very successful and forms the basis of a generation of new, large-scale radio neutrino detectors, its limited sampling depth is a major drawback. A prototype has been designed and fabricated with 65k-deep sampling at multi-GSa/s operation. We present test results and directions for future evolution of this sampling technique. Comment: 15 pages, 26 figures; revised, accepted for publication in NIM
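
    A back-of-the-envelope comparison (our numbers, not the paper's): reading "65k" as 2^16 = 65,536 samples and assuming a nominal 2 GSa/s sampling rate, the buffer spans
    65,536 samples / (2 x 10^9 Sa/s) ≈ 33 μs
    of continuous waveform, directly addressing the limited sampling depth cited above as the LABRADOR ASIC's main drawback.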

    Bose-Einstein Condensation in a CO_2-laser Optical Dipole Trap

    We report on the achievement of Bose-Einstein condensation of a dilute atomic gas based on trapping atoms in tightly confining CO_2-laser dipole potentials. Quantum degeneracy of rubidium atoms is reached by direct evaporative cooling in both crossed- and single-beam trapping geometries. At the heart of these all-optical condensation experiments is the ability to obtain high initial atomic densities in quasistatic dipole traps by laser cooling techniques. Finally, we demonstrate the formation of a condensate in the field-insensitive m_F=0 spin projection only. This suppresses fluctuations of the chemical potential from stray magnetic fields. Comment: 8 pages, 5 figures
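
    For context, the quasistatic dipole potential relevant here is the standard textbook relation (not a result of this paper)
    U_dip(r) = - \alpha_stat I(r) / (2 \epsilon_0 c),
    where \alpha_stat is the atom's static polarizability and I(r) the local laser intensity. Because the CO_2-laser frequency (wavelength 10.6 μm) lies far below any rubidium resonance, the trap depth simply tracks the focused intensity, which is what makes tightly confining, high-density quasistatic traps suitable for direct evaporative cooling.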

    Understanding Sample Generation Strategies for Learning Heuristic Functions in Classical Planning

    We study the problem of learning good heuristic functions for classical planning tasks with neural networks, based on samples that are states with their cost-to-goal estimates. It is well known that the quality of the learned model depends on the quality of the training data. Our main goal is to better understand the influence of sample generation strategies on the performance of a greedy best-first heuristic search guided by a learned heuristic function. In a set of controlled experiments, we find that two main factors determine the quality of the learned heuristic: the regions of the state space included in the samples and the quality of the cost-to-goal estimates. These two factors are also interdependent: having perfect estimates of cost-to-goal is insufficient if an unrepresentative part of the state space is included in the sample set. Additionally, we study the effects of restricting samples to only include states that could be evaluated when solving a given task, and the effects of adding samples with high-value estimates. Based on our findings, we propose practical strategies to improve the quality of learned heuristics: three strategies that aim to generate more representative states and two strategies that improve the cost-to-goal estimates. Our resulting neural network heuristic has higher coverage than a basic satisficing heuristic. Also, compared to a baseline learned heuristic, our best neural network heuristic almost doubles the mean coverage and can increase it for some domains by more than six times. Comment: 27 pages
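
    A minimal sketch of the pipeline described above (our toy construction, not the authors' code or benchmark setup): states of a hypothetical grid domain are sampled by backward random walks from the goal, labeled with cost-to-goal estimates, used to fit a small regressor, and the learned function then guides greedy best-first search.

    # A hedged sketch, not the authors' implementation: toy 2-D grid domain,
    # backward random walks from the goal as the sample generator, a
    # scikit-learn MLP as the learned heuristic, and plain greedy
    # best-first search (GBFS).
    import heapq
    import random
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    GOAL = (0, 0)
    MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

    def neighbors(s):
        return [(s[0] + dx, s[1] + dy) for dx, dy in MOVES]

    def sample_backward_walks(n_walks=200, walk_len=30):
        """Label each visited state with the number of steps walked from the goal.
        These labels may overestimate the true cost-to-goal (estimate quality is
        exactly one of the two factors highlighted in the abstract)."""
        samples = [(GOAL, 0)]
        for _ in range(n_walks):
            s, cost = GOAL, 0
            for _ in range(walk_len):
                s = random.choice(neighbors(s))
                cost += 1
                samples.append((s, cost))
        return samples

    def train_heuristic(samples):
        X = np.array([s for s, _ in samples], dtype=float)
        y = np.array([c for _, c in samples], dtype=float)
        model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        model.fit(X, y)
        return lambda s: float(model.predict(np.array([s], dtype=float))[0])

    def gbfs(start, h, max_expansions=10000):
        """Greedy best-first search guided only by the learned heuristic h."""
        frontier = [(h(start), start, [start])]
        seen = {start}
        for _ in range(max_expansions):
            if not frontier:
                return None
            _, s, path = heapq.heappop(frontier)
            if s == GOAL:
                return path
            for t in neighbors(s):
                if t not in seen:
                    seen.add(t)
                    heapq.heappush(frontier, (h(t), t, path + [t]))
        return None

    if __name__ == "__main__":
        h = train_heuristic(sample_backward_walks())
        plan = gbfs((7, -5), h)
        print("plan length:", None if plan is None else len(plan) - 1)

    In this toy setting both factors from the abstract are easy to probe: restricting the walks changes which regions of the state space are represented in the samples, and the walk-length labels control the quality of the cost-to-goal estimates.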