13 research outputs found

    Finding largest small polygons with GloptiPoly

    A small polygon is a convex polygon of unit diameter. We are interested in small polygons which have the largest area for a given number of vertices n. Many instances are already solved in the literature, namely for all odd n, and for n = 4, 6 and 8. Thus, for even n ≄ 10, instances of this problem remain open. Finding these largest small polygons can be formulated as nonconvex quadratic programming problems which can challenge state-of-the-art global optimization algorithms. We show that a recently developed technique for global polynomial optimization, based on a semidefinite programming approach to the generalized problem of moments and implemented in the public-domain Matlab package GloptiPoly, can successfully find the largest small polygons for n = 10 and n = 12, significantly improving on existing results in the field. When coupled with accurate convex conic solvers, GloptiPoly can provide numerical guarantees of global optimality, as well as rigorous guarantees relying on interval arithmetic.
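    The nonconvex quadratic program behind this abstract can be written down directly: maximize the polygon's area subject to all pairwise vertex distances being at most 1. The following is a minimal Python sketch of that formulation for n = 6, using a local solver (scipy's SLSQP) rather than GloptiPoly's SDP moment relaxation, so it carries no global-optimality certificate; the variable names and the regular-hexagon starting point are illustrative choices, not taken from the paper.

```python
# Illustrative local solve of the "largest small polygon" problem for n = 6.
# GloptiPoly solves this globally via SDP relaxations of the moment problem;
# here we only run a local gradient-based solver on the same nonconvex QP.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

n = 6

def area(x):
    # shoelace formula; x holds the 2D vertex coordinates, flattened
    p = x.reshape(n, 2)
    return 0.5 * np.sum(p[:, 0] * np.roll(p[:, 1], -1)
                        - p[:, 1] * np.roll(p[:, 0], -1))

def pair_dists(x):
    # all pairwise vertex distances; each must stay <= 1 (unit diameter)
    p = x.reshape(n, 2)
    i, j = np.triu_indices(n, k=1)
    return np.linalg.norm(p[i] - p[j], axis=1)

# start from the regular hexagon of unit diameter (area ~ 0.6495)
t = 2 * np.pi * np.arange(n) / n
x0 = np.column_stack([0.5 * np.cos(t), 0.5 * np.sin(t)]).ravel()

cons = NonlinearConstraint(pair_dists, -np.inf, 1.0)
res = minimize(lambda x: -area(x), x0, constraints=[cons], method="SLSQP")
print(area(res.x))
```

    For n = 6 the regular hexagon is known not to be optimal: Graham's hexagon has area ≈ 0.674981. A local run may or may not escape the regular-hexagon stationary point, which is exactly why the global certificates provided by the moment/SDP approach matter for the open even-n cases.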

    A História da Alimentação: balizas historiogråficas

    The authors set out to sketch an overview of the History of Food, not as a new epistemological branch of the discipline, but as a developing field of specialized practices and activities, including research, training, publications, associations, academic meetings, etc. A brief account of the conditions under which this field took shape is preceded by a panorama of food studies and related topics in general, organized along five approaches (the biological, the economic, the social, the cultural, and the philosophical), as well as by an identification of the most relevant contributions of Anthropology, Archaeology, Sociology, and Geography. In order to survey the vast and multiform historical bibliography, it was organized according to morphological criteria. Several important topics then receive separate treatment: famine, food and the religious domain, the European discoveries and the worldwide diffusion of foods, taste and gastronomy. The article closes with a brief critical assessment of the Brazilian historiography on the subject.

    Map representation and diagnostic performance of the standard 12-lead ECG

    No full text
    The diagnostic information contained in the standard 12-lead electrocardiogram was assessed by comparing the classification results produced by the standard leads for various clinical settings, such as normal versus myocardial infarction or versus left ventricular hypertrophy, to those achieved by 120-lead data or body surface potential maps (BSPMs). Separately, optimal signal leads were extracted from the BSPM by ranking all leads as a function of their ability to reconstruct the BSPM. Ranking was achieved by deriving eigenvalues from the covariance matrix calculated from all leads and corresponding measurements. Thus, while comparing the results from the standard leads (diagnostic leads) to those from the original raw map data, a comparison was also performed with respect to the best signal leads, namely the four best and the eight best. From the results observed for all bi- and multigroup classifications, it appeared that the diagnostic yield of the 12 standard leads matched that obtained with a number of signal leads lying between 4 and 8. This indicated that a large overlap still existed between the leads composing the 12-lead ECG (in fact, only 8 independent leads). Another interesting observation resulted from this investigation: although classifiers (discriminating variables) used for classification were identical, whether they originated from the raw standard leads (derived from the raw maps) or from standard leads reconstructed with four or eight signal leads, reconstructed measurements performed better than original measurements. This paradox can be explained by looking at the respective F values. Indeed, since increased F values result from higher ratios between the difference of group means and the composite variance from the pooled groups, higher differences and/or smaller variances produce larger ratios and, hence, better group separations. © 1995, Churchill Livingstone Inc. All rights reserved.
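    The core pipeline described in this abstract, eigenanalysis of the lead covariance matrix followed by reconstruction of the full map from a few selected leads, can be sketched in a few lines of numpy. The abstract does not spell out the exact selection rule, so the scoring step below (ranking leads by their energy in the dominant eigenvectors) is an illustrative variant, and the synthetic BSPM data is invented for the demonstration.

```python
# Illustrative covariance/eigenvalue lead ranking and map reconstruction.
# The selection heuristic and the synthetic data are assumptions for the demo.
import numpy as np

rng = np.random.default_rng(0)
n_leads, n_samples, k = 120, 400, 8

# synthetic "BSPM": a few latent cardiac sources mixed into 120 surface leads
sources = rng.standard_normal((3, n_samples))
mixing = rng.standard_normal((n_leads, 3))
bspm = mixing @ sources + 0.05 * rng.standard_normal((n_leads, n_samples))

cov = np.cov(bspm)                      # lead-by-lead covariance matrix
evals, evecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
top = evecs[:, -3:]                     # eigenvectors of the 3 largest eigenvalues
score = (top ** 2).sum(axis=1)          # each lead's energy in the dominant subspace
best = np.argsort(score)[-k:]           # the k highest-scoring "signal leads"

# reconstruct the full 120-lead map from the k selected leads by least squares
coef, *_ = np.linalg.lstsq(bspm[best].T, bspm.T, rcond=None)
recon = (bspm[best].T @ coef).T
err = np.linalg.norm(recon - bspm) / np.linalg.norm(bspm)
print(f"relative reconstruction error with {k} leads: {err:.3f}")
```

    With a low-dimensional underlying source model, a handful of well-chosen leads reconstructs the full map to within the noise floor, which is the redundancy argument the paper makes about the 12-lead ECG.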

    Estimating ECG distributions from small numbers of leads

    No full text
    The utility of body surface potential mapping to improve interpretation of electrocardiographic information lies in the presentation of thoracic surface distributions to characterize underlying electrophysiology less ambiguously than that afforded by conventional electrocardiography. Localized cardiac disease or abnormal electrophysiology presents itself electrocardiographically on the body surface in a manner in which pattern plays an important role for identifying or characterizing these abnormalities. Thus, in myocardial infarction, transient myocardial ischemia, Wolff-Parkinson-White syndrome, or ventricular ectopy, observation of electrocardiographic potential patterns, their extrema, and their magnitudes permits localization and quantization of the abnormal activity. Conventional electrocardiography assesses pattern information incompletely and does not use information of distribution extrema locations or magnitudes. Thus, increases or decreases in the magnitudes of electrocardiographic features (ST-segment potential displacement, amplitude, or morphology of Q, R, S, or T waves) associated with changes in cardiac sources (ischemia, infarction, conduction abnormalities, etc.) as measured from fixed leads have a high likelihood of being misinterpreted if the distribution itself is changing. In this study, the authors demonstrate the utility of estimating distributions from small numbers of optimally selected leads, including conventional leads, to reduce uncertainty in the interpretation of electrocardiographic information. This issue is highly relevant when thresholds are used to detect significance of potential levels (exercise testing, detection of myocardial infarction, and continuous monitoring to assess ST-segment changes). Significance of this work lies in improved detection and characterization of abnormal electrophysiology using conventional or enhanced leadsets and methods to estimate thoracic potential distributions.
    © 1995, Churchill Livingstone Inc. All rights reserved.
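    The estimation step this abstract relies on is typically a linear transformation, learned from training maps, that takes the few measured leads to the full thoracic distribution. The sketch below illustrates that idea on synthetic data; the subset size, the random lead choice, and the latent-source model are all assumptions for the demonstration, not details from the paper.

```python
# Illustrative limited-lead estimation of a full potential distribution:
# learn a linear map (subset leads -> all leads) on training maps, then
# apply it to held-out maps measured only at the subset.
import numpy as np

rng = np.random.default_rng(1)
n_leads, n_train, n_test = 120, 300, 50
subset = rng.choice(n_leads, size=8, replace=False)   # stand-in "optimal" leads

# synthetic training/test maps driven by shared latent cardiac sources
mix = rng.standard_normal((n_leads, 4))
train = mix @ rng.standard_normal((4, n_train)) + 0.05 * rng.standard_normal((n_leads, n_train))
test = mix @ rng.standard_normal((4, n_test)) + 0.05 * rng.standard_normal((n_leads, n_test))

# learn the linear estimator: full map ~= T applied to subset measurements
T, *_ = np.linalg.lstsq(train[subset].T, train.T, rcond=None)   # shape (8, 120)

est = (test[subset].T @ T).T            # estimate all 120 leads of unseen beats
err = np.linalg.norm(est - test) / np.linalg.norm(test)
print(f"relative estimation error on held-out maps: {err:.3f}")
```

    The design choice matters clinically: because the estimator returns a full distribution rather than isolated amplitudes, threshold-based criteria (e.g. ST-segment levels) can be evaluated at the distribution's actual extrema instead of at fixed electrode sites.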

    Dynamic many-process applications on many-tile embedded systems and HPC clusters: The EURETILE programming environment and execution platform

    No full text
    In the next decade, a growing number of scientific and industrial applications will require power-efficient systems providing unprecedented computation, memory, and communication resources. A promising paradigm foresees the use of heterogeneous many-tile architectures. The resulting computing systems are complex: they must be protected against several sources of faults and critical events, and application programmers must be provided with programming paradigms, software environments and debugging tools adequate to manage such complexity. The EURETILE (European Reference Tiled Architecture Experiment) consortium conceived, designed, and implemented: 1- an innovative many-tile, many-process dynamic fault-tolerant programming paradigm and software environment, grounded on a lightweight operating system generated by an automated software synthesis mechanism that takes into account the architecture and application specificities; 2- a many-tile heterogeneous hardware system, equipped with a high-bandwidth, low-latency, point-to-point 3D-toroidal interconnect. The inter-tile interconnect processor is equipped with an experimental mechanism for systemic fault-awareness; 3- a full-system simulation environment, supported by innovative parallel technologies and equipped with debugging facilities. We also designed and coded a set of application benchmarks representative of requirements of future HPC and Embedded Systems, including: 4- a set of dynamic multimedia applications and 5- a large-scale simulator of neural activity and synaptic plasticity. The application benchmarks, compiled through the EURETILE software tool-chain, have been efficiently executed on both the many-tile hardware platform and on the software simulator, up to a complexity of a few hundred software processes and hardware cores.
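    The 3D-toroidal point-to-point interconnect mentioned above has a simple addressing property: tile coordinates wrap in each dimension, so every tile has exactly six direct neighbors regardless of its position. The function below is a generic sketch of that neighbor computation, not code from the EURETILE tool-chain.

```python
# Illustrative neighbor addressing on a 3D torus: coordinates wrap modulo the
# mesh size in each dimension, giving every tile exactly six face neighbors.
def torus_neighbors(x, y, z, dims):
    """Return the 6 face neighbors of tile (x, y, z) on a torus of size dims."""
    nx, ny, nz = dims
    return [
        ((x + 1) % nx, y, z), ((x - 1) % nx, y, z),
        (x, (y + 1) % ny, z), (x, (y - 1) % ny, z),
        (x, y, (z + 1) % nz), (x, y, (z - 1) % nz),
    ]

# a corner tile wraps around to the opposite faces
print(torus_neighbors(0, 0, 0, (4, 4, 4)))
```

    The wrap-around is what keeps worst-case hop counts low and uniform, which is one reason toroidal topologies are favored for low-latency tile-to-tile communication.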