
    Automatic 3D Object Segmentation in Multiple Views using Volumetric Graph-Cuts

    We propose an algorithm for automatically obtaining a segmentation of a rigid object in a sequence of images that are calibrated for camera pose and intrinsic parameters. Until recently, the best segmentation results have been obtained by interactive methods that require manual labelling of image regions. Our method requires no user input but instead relies on the camera fixating on the object of interest during the sequence. We begin by learning a model of the object’s colour from the image pixels around the fixation points. We then extract image edges and combine these with the object colour information in a volumetric binary MRF model. The globally optimal segmentation of 3D space is obtained by a graph-cut optimisation. From this segmentation an improved colour model is extracted, and the whole process is iterated until convergence. Our first finding is that the fixation constraint, which requires that the object of interest is more or less central in the image, is enough to determine what to segment and to initialise an automatic segmentation process. Second, we find that by performing a single segmentation in 3D, we implicitly exploit a 3D rigidity constraint, expressed as silhouette coherency, which significantly improves silhouette quality over independent 2D segmentations. We demonstrate the validity of our approach by providing segmentation results on real sequences.
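
    A minimal, self-contained sketch of the iterate-and-re-estimate loop described above, run on a toy voxel grid: Gaussian colour costs give the unary terms of a binary MRF, a graph cut labels the volume, and the object colour model is refitted from the result. The PyMaxflow library, the 1-D Gaussian colour model, the synthetic volume and all weights are illustrative assumptions, not the authors' multi-view implementation.

    ```python
    import numpy as np
    import maxflow  # pip install PyMaxflow

    rng = np.random.default_rng(0)

    # Toy volume of per-voxel "colour" values: a brighter cube (the object)
    # embedded in darker background noise.
    vol = rng.normal(0.2, 0.1, size=(32, 32, 32))
    vol[8:24, 8:24, 8:24] += 0.6

    def colour_cost(x, mean, var):
        """Squared-error colour cost under a 1-D Gaussian model (the full NLL
        also has a log-variance term, omitted to keep capacities non-negative)."""
        return (x - mean) ** 2 / (2.0 * var)

    # Initial object colour model from voxels around the "fixation point" (centre).
    obj = vol[12:20, 12:20, 12:20].ravel()
    fg_mean, fg_var = obj.mean(), obj.var() + 1e-6

    for _ in range(3):  # iterate: graph cut, then re-estimate the colour model
        bg_mean, bg_var = vol.mean(), vol.var() + 1e-6
        cost_fg = colour_cost(vol, fg_mean, fg_var)   # cost of labelling a voxel "object"
        cost_bg = colour_cost(vol, bg_mean, bg_var)   # cost of labelling it "background"

        g = maxflow.Graph[float]()
        nodes = g.add_grid_nodes(vol.shape)
        g.add_grid_edges(nodes, weights=0.5, symmetric=True)  # smoothness between neighbours
        g.add_grid_tedges(nodes, cost_bg, cost_fg)            # unary (data) terms
        g.maxflow()
        seg = g.get_grid_segments(nodes)     # boolean array: the two sides of the cut

        # Decide which side of the cut is the object by comparing it to the
        # current foreground model, then refit the model from those voxels.
        side_a, side_b = vol[seg], vol[~seg]
        if side_a.size == 0 or side_b.size == 0:
            break                            # degenerate cut; keep previous model
        obj = side_a if abs(side_a.mean() - fg_mean) < abs(side_b.mean() - fg_mean) else side_b
        fg_mean, fg_var = obj.mean(), obj.var() + 1e-6

    print("estimated object voxels:", obj.size)
    ```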

    Higher-rank Numerical Ranges and Kippenhahn Polynomials

    We prove that two n-by-n matrices A and B have their rank-k numerical ranges $\Lambda_k(A)$ and $\Lambda_k(B)$ equal to each other for all k, $1\le k\le \lfloor n/2\rfloor+1$, if and only if their Kippenhahn polynomials $p_A(x,y,z)\equiv\det(x\,\operatorname{Re}A+y\,\operatorname{Im}A+zI_n)$ and $p_B(x,y,z)\equiv\det(x\,\operatorname{Re}B+y\,\operatorname{Im}B+zI_n)$ coincide. The main tools for the proof are the Li-Sze characterization of higher-rank numerical ranges, Weyl's perturbation theorem for eigenvalues of Hermitian matrices, and Bezout's theorem for the number of common zeros of two homogeneous polynomials.
    Comment: 16 pages, 1 figure
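
    A small numerical sketch (not part of the paper): since the Kippenhahn polynomial is just $\det(x\,\operatorname{Re}A+y\,\operatorname{Im}A+zI_n)$, two matrices' polynomials can be compared by evaluating them on a batch of random (x, y, z) points; the helper names below are ours.

    ```python
    import numpy as np

    def kippenhahn(A, x, y, z):
        """Evaluate p_A(x, y, z) = det(x*Re(A) + y*Im(A) + z*I_n)."""
        n = A.shape[0]
        re = (A + A.conj().T) / 2          # Hermitian part Re A
        im = (A - A.conj().T) / (2j)       # Hermitian part Im A
        return np.linalg.det(x * re + y * im + z * np.eye(n)).real

    def polynomials_coincide(A, B, trials=200, tol=1e-8, seed=0):
        """Compare p_A and p_B on random real points (both are degree-n homogeneous)."""
        rng = np.random.default_rng(seed)
        for _ in range(trials):
            x, y, z = rng.standard_normal(3)
            if abs(kippenhahn(A, x, y, z) - kippenhahn(B, x, y, z)) > tol:
                return False
        return True

    # Sanity check: a unitary conjugate of A has the same Kippenhahn polynomial.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
    B = Q @ A @ Q.conj().T
    print(polynomials_coincide(A, B))   # expected: True
    ```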

    Multiobjective evolutionary algorithms for multivariable PI controller design

    A multiobjective optimisation engineering design (MOED) methodology for PI controller tuning in multivariable processes is presented. The MOED procedure is a natural approach to multiobjective problems in which several requirements and specifications must be fulfilled. An algorithm based on the differential evolution technique and spherical pruning is used for this purpose. To evaluate the methodology, a multivariable control benchmark is used. The obtained results validate the MOED procedure as a practical and useful technique for parametric controller tuning in multivariable processes.
    This work was partially supported by the FPI-2010/19 grant and the project PAID-06-11 from the Universitat Politecnica de Valencia, and the projects DPI2008-02133, TIN2011-28082 and ENE2011-25900 from the Spanish Ministry of Science and Innovation.
    Reynoso Meza, G.; Sanchís Saez, J.; Blasco Ferragud, FX.; Herrero Durá, JM. (2012). Multiobjective evolutionary algorithms for multivariable PI controller design. Expert Systems with Applications, 39(9), 7895-7907. https://doi.org/10.1016/j.eswa.2012.01.111
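
    A minimal sketch of a multiobjective differential evolution loop with dominance-based replacement and a final Pareto filter; the spherical pruning step and the multivariable PI benchmark from the paper are not reproduced, and the two-objective cost function below is a placeholder.

    ```python
    import numpy as np

    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (minimisation)."""
        return np.all(a <= b) and np.any(a < b)

    def pareto_filter(F):
        """Indices of the non-dominated rows of F."""
        return [i for i, fi in enumerate(F)
                if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i)]

    def mo_de(cost, bounds, pop_size=40, gens=100, F_w=0.5, CR=0.9, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = bounds[:, 0], bounds[:, 1]
        X = rng.uniform(lo, hi, size=(pop_size, len(lo)))
        Fx = np.array([cost(x) for x in X])
        for _ in range(gens):
            for i in range(pop_size):
                a, b, c = X[rng.choice(pop_size, 3, replace=False)]
                mutant = np.clip(a + F_w * (b - c), lo, hi)     # DE/rand/1 mutation
                cross = rng.random(len(lo)) < CR
                cross[rng.integers(len(lo))] = True             # keep at least one mutant gene
                trial = np.where(cross, mutant, X[i])
                ft = cost(trial)
                if dominates(ft, Fx[i]):                        # greedy, dominance-based replacement
                    X[i], Fx[i] = trial, ft
        idx = pareto_filter(Fx)
        return X[idx], Fx[idx]

    # Toy usage: a trade-off between two quadratic objectives, standing in for
    # the controller-tuning objectives of the paper.
    cost = lambda x: np.array([np.sum(x**2), np.sum((x - 1)**2)])
    pareto_x, pareto_f = mo_de(cost, np.array([[-2.0, 2.0]] * 3))
    print(len(pareto_f), "non-dominated solutions")
    ```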

    Using the ResearchEHR platform to facilitate the practical application of the EHR standards

    Possibly the most important requirement to support co-operative work among health professionals and institutions is the ability to share EHRs in a meaningful way, and it is widely acknowledged that standardization of data and concepts is a prerequisite to achieve semantic interoperability in any domain. Different international organizations are working on the definition of EHR architectures, but the lack of tools that implement them hinders their broad adoption. In this paper we present ResearchEHR, a software platform whose objective is to facilitate the practical application of EHR standards as a way of reaching the desired semantic interoperability. This platform is suitable not only for developing new systems but also for increasing the standardization of existing ones. The work reported here describes how the platform allows for the edition, validation, and search of archetypes, converts legacy data into normalized archetype extracts, is able to generate applications from archetypes and, finally, transforms archetypes and data extracts into other EHR standards. We also describe how ResearchEHR has made possible the application of the CEN/ISO 13606 standard in a real environment and the lessons learnt from this experience. © 2011 Elsevier Inc.
    This work has been partially supported by the Spanish Ministry of Science and Innovation under Grants TIN2010-21388-C02-01 and TIN2010-21388-C02-02, and by the Health Institute Carlos III through the RETICS Combiomed, RD07/0067/2001. Our most sincere thanks to the Hospital of Fuenlabrada in Madrid, including its Medical Director Pablo Serrano together with Marta Terron and Luis Lechuga, for their support and work during the development of the medications reconciliation project.
    Maldonado Segura, JA.; Martínez Costa, C.; Moner Cano, D.; Menárguez-Tortosa, M.; Boscá Tomás, D.; Miñarro Giménez, JA.; Fernández-Breis, JT.... (2012). Using the ResearchEHR platform to facilitate the practical application of the EHR standards. Journal of Biomedical Informatics, 45(4), 746-762. doi:10.1016/j.jbi.2011.11.004
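
    As an illustration of the archetype idea only (not the CEN/ISO 13606 object model and not the ResearchEHR API), the sketch below expresses a toy "archetype" as a set of constraints and validates a legacy record after normalising it into the expected fields; all field names and thresholds are invented for the example.

    ```python
    from dataclasses import dataclass
    from typing import Any, Callable

    @dataclass
    class Constraint:
        path: str                       # where the value lives in the data extract
        check: Callable[[Any], bool]    # predicate the value must satisfy
        message: str

    # Toy "archetype": constraints a blood-pressure extract must satisfy.
    BLOOD_PRESSURE_ARCHETYPE = [
        Constraint("systolic", lambda v: isinstance(v, (int, float)) and 0 < v < 300,
                   "systolic must be a number in (0, 300) mmHg"),
        Constraint("diastolic", lambda v: isinstance(v, (int, float)) and 0 < v < 200,
                   "diastolic must be a number in (0, 200) mmHg"),
        Constraint("units", lambda v: v == "mmHg", "units must be mmHg"),
    ]

    def validate(extract: dict, archetype: list) -> list:
        """Return the list of constraint violations (an empty list means the extract conforms)."""
        errors = []
        for c in archetype:
            if c.path not in extract:
                errors.append(f"missing field: {c.path}")
            elif not c.check(extract[c.path]):
                errors.append(c.message)
        return errors

    # Normalise a legacy record into the archetype's fields, then validate it.
    legacy = {"sys_bp": 128, "dia_bp": 82}
    extract = {"systolic": legacy["sys_bp"], "diastolic": legacy["dia_bp"], "units": "mmHg"}
    print(validate(extract, BLOOD_PRESSURE_ARCHETYPE))   # [] -> conforms
    ```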

    Cosmic Microwave Background anisotropies from second order gravitational perturbations

    This paper presents a complete analysis of the effects of second order gravitational perturbations on Cosmic Microwave Background anisotropies, taking explicitly into account scalar, vector and tensor modes. We also consider the second order perturbations of the metric itself, obtaining them, for a universe dominated by a collisionless fluid, in the Poisson gauge, by transforming the known results in the synchronous gauge. We discuss the resulting second order anisotropies in the Poisson gauge, and analyse the possible relevance of the different terms. We expect that, in the simplest scenarios for structure formation, the main effect comes from the gravitational lensing by scalar perturbations, which is known to give a few percent contribution to the anisotropies at small angular scales.
    Comment: 15 pages, revtex, no figures. Version to be published in Phys. Rev.
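
    For reference, a commonly used form of the perturbed line element in the Poisson gauge with scalar, vector and tensor modes kept to second order; sign and factor-of-1/2 conventions differ between papers and are not taken from this abstract.

    ```latex
    % Poisson-gauge line element with scalar (phi, psi), vector (omega_i) and
    % tensor (chi_ij) modes, expanded to second order (conventions vary).
    ds^2 = a^2(\tau)\left\{ -\left(1 + 2\phi\right)\,d\tau^2
          + 2\,\omega_i\,d\tau\,dx^i
          + \left[\left(1 - 2\psi\right)\delta_{ij} + \chi_{ij}\right] dx^i\,dx^j \right\},
    \qquad
    \phi = \phi^{(1)} + \tfrac{1}{2}\phi^{(2)}, \quad
    \psi = \psi^{(1)} + \tfrac{1}{2}\psi^{(2)} .
    ```

    Here ω_i is transverse (∂^i ω_i = 0) and χ_ij is transverse and traceless; when the first-order perturbations are purely scalar, both start at second order, which is why the vector and tensor contributions discussed above appear only beyond linear order.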

    Phytoplankton absorption, photosynthetic parameters, and primary production off Baja California: summer and autumn

    To estimate ocean primary production at large space and time scales, it is necessary to use models combined with ocean-color satellite data. Detailed estimates of primary production are typically done at only a few representative stations. To get survey-scale estimates of primary production, one must introduce routinely measured Chlorophyll-a (Chl-a) into models. For best precision, models should be based on accurate parameterizations developed from optical and photosynthesis data collected in the region of interest. To develop regional model parameterizations, ¹⁴C-bicarbonate was used to estimate in situ primary production and photosynthetic parameters (α*, P*_m, and E_k) derived from photosynthesis-irradiance (P-E) experiments from IMECOCAL cruises to the southern California Current during July and October 1998. The P-E experiments were done for samples collected from the 50% surface light depth, for which we also determined particle and phytoplankton absorption coefficients (a_p, a_φ, and a*_φ). Physical data collected during both surveys indicated that the 1997-1998 El Niño was abating during the summer of 1998, with a subsequent transition to the typical California Current circulation and coastal upwelling conditions. Phytoplankton Chl-a and in situ primary production were elevated at coastal stations for both surveys, with the highest values during summer. Phytoplankton specific absorption coefficients in the blue peak (a*_φ(440)) ranged from 0.02 to 0.11 m² (mg Chl-a)⁻¹, with the largest values in offshore surface waters. In general, a*_φ was lower at depth compared to the surface. P-E samples were collected at the 50% light level, which was usually in the surface mixed layer. Using α* and spectral absorption, we estimated maximum photosynthetic quantum yields (φ_max; mol C/mol quanta). φ_max values were lowest in offshore surface waters, with a total range of 0.01-0.07. Mean values of φ_max for July and October were 0.011 and 0.022, respectively. In July, P*_m was approximately double and α* was about 1.4 times the values for October. Since the P-E samples were generally within the upper mixed layer, these tendencies in the photosynthetic parameters are attributed to deeper mixing of this layer during October, when the mean mixed layer for the photosynthesis stations was 35 m compared to a mean of 10 m in July. Application of a semi-analytical model using mean values of P-E parameters determined at the 50% light depth provided good agreement with ¹⁴C in situ estimates at the discrete 50% light depth and for the water-column integrated primary production.
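
    A short sketch of how P*_m and α* can be estimated from a single P-E experiment by least-squares fitting of an exponential saturation model, with E_k = P*_m/α*; the model choice follows common practice rather than this paper's methods section, and the irradiance/uptake numbers below are placeholders, not cruise data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def pe_curve(E, P_m, alpha):
        """Chlorophyll-specific photosynthesis vs irradiance, exponential
        saturation model (no photoinhibition term)."""
        return P_m * (1.0 - np.exp(-alpha * E / P_m))

    # Irradiance levels (umol quanta m^-2 s^-1) and measured 14C uptake rates
    # (mg C (mg Chl-a)^-1 h^-1) -- placeholder numbers for the example.
    E = np.array([10, 25, 50, 100, 200, 400, 800, 1200], dtype=float)
    P = np.array([0.4, 0.9, 1.7, 2.8, 3.8, 4.3, 4.5, 4.6])

    (P_m, alpha), _ = curve_fit(pe_curve, E, P, p0=[4.0, 0.04])
    E_k = P_m / alpha   # light-saturation parameter
    print(f"P*_m = {P_m:.2f}, alpha* = {alpha:.3f}, E_k = {E_k:.0f}")
    ```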

    Comparison of design concepts in multi-criteria decision-making using level diagrams

    [EN] In this work, we address the evaluation of design concepts and the analysis of multiple Pareto fronts in multi-criteria decision-making using level diagrams. Such an analysis is relevant when two (or more) design concepts with different design alternatives lie in the same objective space but describe different Pareto fronts. The problem can therefore be stated as a Pareto front comparison between two (or more) design concepts that differ only in their relative complexity, implementation issues, or the theory applied to solve the problem at hand. Such an analysis helps the decision maker gain better insight into a conceptual solution and decide whether the use of a complex concept is justified instead of a simple one. The approach is validated on a set of multi-criteria decision-making benchmark problems. © 2012 Elsevier Inc. All rights reserved.
    This work was partially supported by the FPI-2010/19 Grant and Project PAID-06-11 from the Universitat Politecnica de Valencia, and by Projects ENE2011-25900, TIN2011-28082 (Spanish Ministry of Science and Innovation) and GV/2012/073, PROMETEO/2012/028 (Generalitat Valenciana).
    Reynoso Meza, G.; Blasco Ferragud, FX.; Sanchís Saez, J.; Herrero Durá, JM. (2013). Comparison of design concepts in multi-criteria decision-making using level diagrams. Information Sciences, 221(1), 124-141. https://doi.org/10.1016/j.ins.2012.09.049
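
    A minimal sketch of the level-diagram construction used for such comparisons: each objective is normalised over the union of the fronts being compared, and every Pareto point gets a single "level" (here its 2-norm in normalised objective space), so fronts from different design concepts can be overlaid axis by axis. The norm choice and the two synthetic fronts below are illustrative assumptions.

    ```python
    import numpy as np

    def level_diagram(fronts, norm=2):
        """fronts: list of (n_i, m) arrays of objective vectors (minimisation).
        Returns, per front, the normalised objectives and the level (norm) per point."""
        allF = np.vstack(fronts)
        lo, hi = allF.min(axis=0), allF.max(axis=0)
        out = []
        for F in fronts:
            Fn = (F - lo) / np.where(hi > lo, hi - lo, 1.0)   # per-objective normalisation
            out.append((Fn, np.linalg.norm(Fn, ord=norm, axis=1)))
        return out

    # Two "design concepts" approximating the same trade-off with different quality.
    t = np.linspace(0, 1, 25)
    concept_a = np.column_stack([t, 1 - t])             # closer to the ideal point
    concept_b = np.column_stack([t, 1.2 - t]) + 0.05    # a shifted, worse front

    for name, (Fn, levels) in zip("AB", level_diagram([concept_a, concept_b])):
        print(f"concept {name}: best level = {levels.min():.3f}")
    # A lower minimum level indicates points closer to the (normalised) ideal point.
    ```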

    On the dynamic adaptation of language models based on dialogue information

    We present an approach to dynamically adapt the language models (LMs) used by a speech recognizer that is part of a spoken dialogue system. We have developed a grammar generation strategy that automatically adapts the LMs using the semantic information that the user provides (represented as dialogue concepts), together with information about the intentions of the speaker (inferred by the dialogue manager and represented as dialogue goals). We carry out the adaptation as a linear interpolation between a background LM and one or more of the LMs associated with the dialogue elements (concepts or goals) addressed by the user. The interpolation weights between those models are automatically estimated on each dialogue turn, using measures such as the posterior probabilities of concepts and goals, estimated as part of the inference procedure that determines the actions to be carried out. We propose two approaches to handle the LMs related to concepts and goals. In the first, we estimate an LM for each of them; in the second, we apply several clustering strategies to group together those elements that share common properties and estimate an LM for each cluster. Our evaluation shows how the system can estimate a dynamic model adapted to each dialogue turn, which improves speech recognition performance (by up to 14.82% relative) and, in turn, leads to improvements in both the language understanding and the dialogue management tasks.
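
    A minimal sketch of the per-turn interpolation step: the background LM keeps a reserved weight and the remaining mass is split among concept/goal LMs in proportion to their posteriors on the current turn. The toy unigram LMs, the concept/goal names and the 0.5 background weight are placeholders, not the paper's recogniser or its estimates.

    ```python
    from collections import defaultdict

    def interpolate(background, component_lms, posteriors, background_weight=0.5):
        """background and component_lms[name] are dicts word -> P(word);
        posteriors[name] is the posterior of that concept/goal on this turn."""
        total = sum(posteriors.values()) or 1.0
        lambdas = {name: (1.0 - background_weight) * p / total
                   for name, p in posteriors.items()}
        adapted = defaultdict(float)
        for w, p in background.items():
            adapted[w] += background_weight * p          # background contribution
        for name, lam in lambdas.items():
            for w, p in component_lms[name].items():
                adapted[w] += lam * p                    # concept/goal contributions
        return dict(adapted)

    background = {"the": 0.4, "a": 0.3, "flight": 0.2, "hotel": 0.1}
    component_lms = {
        "concept:DEPARTURE": {"flight": 0.6, "the": 0.4},
        "goal:BOOK_HOTEL":   {"hotel": 0.7, "a": 0.3},
    }
    posteriors = {"concept:DEPARTURE": 0.8, "goal:BOOK_HOTEL": 0.2}  # from this turn
    adapted = interpolate(background, component_lms, posteriors)
    print(adapted["flight"], adapted["hotel"])   # "flight" is boosted on this turn
    ```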

    Effect of storage conditions on furosine formation in milk-cereal based baby foods

    [EN] The effect of storage for 9 months at 25, 30 and 37 °C on furosine formation in three milk-cereal based baby foods was studied to evaluate development of the Maillard reaction. Furosine was measured by HPLC-UV. Immediately after the manufacturing process, furosine contents were 310-340 mg/100 g protein, and at the 9th month of storage they were 426-603 mg/100 g protein. Storage time and temperature produced a significant increase (p < 0.05) in furosine content during storage. Furosine contents were higher in the sample containing honey than in those without honey. Interactions (p < 0.05) between storage time and temperature or type of sample were found. A predictive model equation for the evolution of furosine during storage, explaining 80% of the variability in furosine content, was obtained. The blockage of lysine during storage, calculated using the furosine and total lysine contents, ranged from 9.5% to 18.1% for the analysed baby foods. © 2007 Elsevier Ltd. All rights reserved.
    L. Bosch is the holder of a grant from the Spanish Ministry of Education and Science. Thanks are due to the Generalitat Valenciana for the financial support given to Bionutest (group 03/003), and also to Hero España S.A. for providing the samples and for financial help.
    Bosch, L.; Alegría, A.; Farré, R.; Clemente Marín, G. (2008). Effect of storage conditions on furosine formation in milk-cereal based baby foods. Food Chemistry, 107(4), 1681-1686. doi:10.1016/j.foodchem.2007.09.051
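
    A short sketch of one way to fit a simple predictive model of furosine content as a linear function of storage time, temperature and their interaction by ordinary least squares; the measurement arrays are placeholders to be replaced with the assay values (the abstract reports only the initial and 9-month ranges), and this is not necessarily the model form used in the paper.

    ```python
    import numpy as np

    # months of storage, storage temperature (degC), furosine (mg/100 g protein)
    t    = np.array([0,  0,  0,  9,  9,  9], dtype=float)       # placeholder
    temp = np.array([25, 30, 37, 25, 30, 37], dtype=float)       # placeholder
    furo = np.array([320, 320, 320, 430, 500, 600], dtype=float)  # placeholder

    # Design matrix: intercept, time, temperature, time x temperature interaction.
    X = np.column_stack([np.ones_like(t), t, temp, t * temp])
    beta, *_ = np.linalg.lstsq(X, furo, rcond=None)

    def predict(months, deg_c):
        """Predicted furosine (mg/100 g protein) under the fitted linear model."""
        return beta @ np.array([1.0, months, deg_c, months * deg_c])

    print(f"predicted furosine after 6 months at 30 degC: {predict(6, 30):.0f} mg/100 g protein")
    ```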