816 research outputs found

    Principles of response surface methodology for logistic models

    In this PhD thesis we address some principles for the study of Response Surface Methodology (abbreviated as RSM) for binary responses (Bernoulli and binomial distributions), modelled through Generalized Linear Models (abbreviated as GLM). Our starting point is the classical approach to RSM, in the context of linear normal models and, in particular, the seminal work of Box and Wilson (1951). The research question around which this work is built is: "How could experimenters proceed when the nature of the process does not follow the classical assumptions of normality and linearity?". Connecting this question with the current state of the art in RSM, a second question is: "What could a sequential strategy look like for learning about the operation of a system with binary responses, when a given objective is pursued?". To explore these questions on a sufficiently solid methodological footing, we rely on GLMs. These models, first presented and formulated in the work of Nelder and Wedderburn (1972), are the tool we chose in order to develop a systematically applicable methodology for finding suitable models that fit binary responses. As a particular strategy we consider the one in which the experimenter has a fixed number of observations available, which we call the "fixed-budget strategy". The objective is then to quantify the information gained about the process once the whole budget has been used. In all cases, our plan is to use families of sequentially chained two-level factorial designs.
    Our study begins by defining a family of strategies for exploring a process represented by a theoretical binary response surface, characterized by three variables: a value w, bounded between 0 and 1, used to define the first experimentation center point; a second variable L, the range of variation of the factors; and, when new design points are tried, a value S, the "jump", which is the distance separating one design center from the next. A design strategy is thus characterized by the values L, S and w. Starting from the response surface considered to best approximate the real process, the goal is to find, through simulation, the levels of w, L and S that perform best under two design-selection criteria: (a) one based on the determinant of the Fisher information matrix (which we call the "amount of information" criterion), and (b) one based on the value of the theoretical surface evaluated at the best conditions obtained from the fitted model (which we call the "proximity to the maximum" criterion). To this end, we wrote programs in the R language (www.r-project.org), a powerful and flexible environment for programming and statistics. The thorough literature review of both topics (RSM and GLM), together with the design of ad hoc software tools, offers a novel and original approach that can serve as a starting point for further research into the link between these two methodologies and their application to practical problems, on the basis of objective criteria that can support decision making.
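The "amount of information" criterion described above can be made concrete: for a logistic model, the Fisher information of a design is X'WX with W = diag(p(1-p)) at the design points, and the criterion is its determinant (the classical D-criterion). A minimal sketch, using NumPy rather than the thesis's R code, with hypothetical coefficients:

```python
import numpy as np

def fisher_information(X, beta):
    """Fisher information X'WX for a logistic model, where
    W = diag(p*(1-p)) is evaluated at the design points."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))  # success probabilities
    w = p * (1.0 - p)                    # Bernoulli variances
    return (X * w[:, None]).T @ X

# A 2^2 factorial design in coded units, with an intercept column.
X = np.array([[1.0, -1.0, -1.0],
              [1.0,  1.0, -1.0],
              [1.0, -1.0,  1.0],
              [1.0,  1.0,  1.0]])

beta = np.array([0.0, 0.5, 0.5])   # illustrative coefficients, not from the thesis
M = fisher_information(X, beta)
d_value = np.linalg.det(M)         # larger determinant = more informative design
```

Comparing `d_value` across candidate (w, L, S) settings is one way to rank strategies under this criterion; note that, unlike the linear-normal case, the result depends on the unknown coefficients through W.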

    Sequential experimental designs for logistic regression models

    When the usual assumptions of normality and constant variance do not hold (e.g. in Bernoulli or binomial processes), the choice of appropriate designs poses difficulties for experimenters, especially when a sequential exploration of the process is pursued. This paper is based on De Zan (2006), where two criteria for evaluating design strategies are proposed, both taking the amount of information as the main evaluation tool. One takes into account the information contained in the fitted model, while the other explores the information contained in the best experimental conditions found from the fitted model. A simulated example of how these strategies work, implemented in R, is also given.
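The sequential exploration evaluated by these criteria can be sketched in a few lines: simulate Bernoulli runs on a two-level factorial around a center, then move the center by a jump S toward the best observed point, and score the new center against the theoretical surface ("proximity to the maximum"). The surface, sample sizes and step rule below are illustrative assumptions, not the paper's exact settings, and the sketch uses NumPy rather than the original R code:

```python
import numpy as np

rng = np.random.default_rng(0)

def true_surface(x1, x2):
    # Hypothetical theoretical probability surface with its maximum at (1, 1).
    return 1.0 / (1.0 + np.exp(-(2.0 - (x1 - 1.0)**2 - (x2 - 1.0)**2)))

# 2^2 factorial design around a center point with factor range L.
center, L = np.array([0.0, 0.0]), 1.0
design = center + L * np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], float)
p = true_surface(design[:, 0], design[:, 1])
y = rng.binomial(20, p)            # 20 Bernoulli runs per design point

# Move the center by jump S toward the best observed design point.
S = 0.5
best = design[np.argmax(y / 20.0)]
new_center = center + S * (best - center)

# "Proximity to the maximum": the true surface evaluated at the candidate.
criterion = true_surface(*new_center)
```

Iterating this step until the fixed budget of runs is exhausted, and tracking `criterion`, gives one simple way to compare (L, S) choices by simulation.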

    Phase Diagram of the Two-Flavor Schwinger Model at Zero Temperature

    We examine the phase structure of the two-flavor Schwinger model as a function of the $\theta$-angle and the two masses, $m_1$ and $m_2$. In particular, we find interesting effects at $\theta=\pi$: along the $SU(2)$-invariant line $m_1 = m_2 = m$, in the regime where $m$ is much smaller than the charge $g$, the theory undergoes logarithmic RG flow of the Berezinskii-Kosterlitz-Thouless type. As a result, in this regime there is a non-perturbatively small mass gap $\sim e^{-A g^2/m^2}$. The $SU(2)$-invariant line lies within a region of the phase diagram where the charge conjugation symmetry is spontaneously broken and whose boundaries we determine numerically. Our numerical results are obtained using the Hamiltonian lattice gauge formulation that includes the mass shift $m_\text{lat} = m - g^2 a/4$ dictated by the discrete chiral symmetry. Comment: 7 pages, 3 figures; v2 minor improvements, refs added

    An optimal rewiring strategy for cooperative multiagent social learning

    Multiagent coordination is a key problem in cooperative multiagent systems (MASs). It has been widely studied in both the fixed-agent repeated-interaction setting and the static social learning framework. However, two aspects of the dynamics of real-world MASs are currently neglected. First, the network topology can change dynamically during the course of interaction. Second, the interaction utilities can differ between each pair of agents and are usually unknown before interaction. Both issues increase the difficulty of coordination. In this paper, we consider multiagent social learning in a dynamic environment in which agents can alter their connections and interact with randomly chosen neighbors whose utilities are unknown beforehand. We propose an optimal rewiring strategy that selects the most beneficial peers so as to maximize the accumulated payoff over long-run interactions. We empirically demonstrate the effects of our approach in a variety of large-scale MASs.
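One simple reading of "rewiring toward the most beneficial peers" is a greedy rule: estimate each agent's value by the sample mean of past interaction payoffs and swap the worst current neighbor for the best candidate. This is only a minimal sketch of that idea, not the paper's optimal strategy (which would also weigh rewiring costs and exploration), and all names and values below are illustrative:

```python
def rewire(payoff_history, current_neighbors, candidates):
    """Greedy rewiring sketch: replace the worst current neighbor with the
    candidate whose estimated (sample-mean) payoff is highest.
    payoff_history maps agent id -> list of observed interaction payoffs."""
    def estimate(agent):
        h = payoff_history.get(agent, [])
        return sum(h) / len(h) if h else 0.0  # unknown agents default to 0
    worst = min(current_neighbors, key=estimate)
    best_candidate = max(candidates, key=estimate)
    if estimate(best_candidate) > estimate(worst):
        current_neighbors = [n for n in current_neighbors if n != worst]
        current_neighbors.append(best_candidate)
    return current_neighbors

history = {"a": [1.0, 0.5], "b": [0.2], "c": [2.0, 1.8]}
neighbors = rewire(history, ["a", "b"], ["c"])
# "b" (mean 0.2) is dropped in favor of "c" (mean 1.9)
```

With utilities unknown beforehand, a practical variant would add an exploration bonus (bandit-style) to `estimate` instead of scoring unseen agents at 0.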

    Evaluation of temperature distribution for bone drilling considering aging factor

    Bone drilling is a routine operation in surgeries such as neurosurgery and orthopedics. However, excessive drilling temperature may cause severe thermal damage to bone tissue, so determining the drilling temperature of bone tissue can reduce the harm caused by thermal damage. In this paper, a time-varying temperature-field simulation model of bone drilling, based on the Johnson-Cook model, was set up in ABAQUS and validated against experiments drilling the cortical bone of fresh bovine femoral shafts. The relative error between experimental and theoretical values was within 7.67%, showing good consistency. Furthermore, an aging factor is considered in evaluating the temperature field of bone drilling. The results show that the drilling temperature near the bone-drill area increases significantly, and that the drilling temperature of cortical bone decreases sharply with radial distance and exhibits a hysteresis lag in the axial distribution. The aging factor mainly affects the peak drilling temperature, which tends to increase with age: the peak drilling temperature in the elderly (70 y) was up to 6.8% higher than in the young (20 y), indicating that the elderly are more prone to excessive drilling temperature. Therefore, special attention should be paid to temperature control for elderly bone tissue.
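The Johnson-Cook constitutive model underlying the simulation gives the flow stress as a product of strain-hardening, strain-rate and thermal-softening terms. A minimal sketch of the standard formula follows; the parameter values are purely illustrative placeholders, not the bone-specific constants used in the paper:

```python
import math

def johnson_cook_stress(strain, strain_rate, T, A, B, n, C, m,
                        T_ref=20.0, T_melt=1200.0, rate_ref=1.0):
    """Johnson-Cook flow stress:
    sigma = (A + B*eps^n) * (1 + C*ln(rate/rate_ref)) * (1 - T*^m),
    where T* = (T - T_ref)/(T_melt - T_ref) is the homologous temperature."""
    T_star = (T - T_ref) / (T_melt - T_ref)
    return ((A + B * strain**n)
            * (1.0 + C * math.log(strain_rate / rate_ref))
            * (1.0 - T_star**m))

# Illustrative (not bone-specific) parameter values, stress in MPa:
sigma = johnson_cook_stress(strain=0.1, strain_rate=100.0, T=37.0,
                            A=50.0, B=100.0, n=0.2, C=0.03, m=1.0)
```

In a thermal drilling simulation, this stress feeds the plastic-work heat source, which is why the temperature field depends on the (age-dependent) material constants.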

    Graphene re-knits its holes

    Nano-holes, etched under an electron beam at room temperature in single-layer graphene sheets as a result of their interaction with metal impurities, are shown to heal spontaneously by filling up with either non-hexagon, graphene-like, or perfect hexagon 2D structures. Scanning transmission electron microscopy was employed to capture the healing process and study the re-grown structure atom by atom. A combination of these nano-scale etching and re-knitting processes could lead to new graphene tailoring approaches. Comment: 11 pages, 4 figures

    Statistical Study of the Relationship Between Ion Upflow and Field-Aligned Current in the Topside Ionosphere for Both Hemispheres During Geomagnetic Disturbed and Quiet Time

    A statistical study of ion upflow and field-aligned currents (FACs) has been performed in the topside ionosphere of both hemispheres for magnetically quiet and disturbed times, using DMSP satellite observations from 2010-2013. Distributions in MLT/MLat reveal that ion upflow occurrence shows a dawn-dusk asymmetry that matches well with the Region 1 FACs. In addition, the highest-occurrence regions are near noon and within the midnight auroral disturbance area, corresponding to the dayside cusp and nightside auroral disturbance regions, respectively. Both the ion upflow occurrence and FAC regions expand equatorward to a wider area during disturbed times.