
    Decay of the false vacuum with multiple scalar fields

    As in boiling super-heated liquids, the decay of a false vacuum is a first-order phase transition. A local ground state decays to an energetically more favorable minimum of lower energy due to thermal and quantum fluctuations of the fields. In this work, we present an efficient semi-analytic method that computes the decay rate of such a state for any number of scalar fields and space-time dimensions. It is based on a collection of an arbitrary number of linear segments that describe a potential with several minima. The exact solutions of the field evolution for each segment are combined into a complete description of the bounce field configuration, which provides the leading contribution to the decay rate. By increasing the number of segments, one obtains the bounce action up to the desired precision. The resulting matching equations are solved semi-analytically, and the generalization to more fields is computed iteratively via linear analytic perturbations. Based on this construction, we provide a robust and user-friendly Mathematica package, named FindBounce, that implements our method. As it preserves the semi-analytic structure of the method, its computational time grows linearly with the number of fields and segments. We present several applications and comparisons with other tools, for which the typical running time is roughly below 1 (2) seconds for 10 (20) fields, with 0.5% accuracy of the action. Finally, we describe a procedure that computes subleading contributions to the decay rate for any smooth potential, and we extend it to include potentials with discontinuous first derivatives. As a consequence, we exhibit an exact one-loop decay rate for a real and a complex scalar field in a bi-quartic potential with two tree-level minima. We compute the product of eigenvalues, remove the translational zero modes, and renormalize the divergences with the zeta-function formalism. We end up with a complete decay rate in closed form.
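The first step of the method, approximating the potential by linear segments, can be illustrated with a toy sketch. This is not the FindBounce algorithm itself; the potential below is a hypothetical quartic with two non-degenerate minima, chosen only to show that the piecewise-linear approximation error shrinks as the number of segments grows, mirroring how the bounce action converges with more segments.

```python
def V(phi):
    # Hypothetical quartic potential with a false and a true minimum.
    return 0.25 * phi**4 - 0.5 * phi**2 + 0.1 * phi

def piecewise_linear(phi, nodes):
    # Evaluate the linear interpolation of V through the given nodes.
    for x0, x1 in zip(nodes, nodes[1:]):
        if x0 <= phi <= x1:
            t = (phi - x0) / (x1 - x0)
            return (1 - t) * V(x0) + t * V(x1)
    raise ValueError("phi outside node range")

def max_error(n_segments, lo=-1.5, hi=1.5, samples=300):
    # Maximum deviation of the segmented potential from the exact one.
    nodes = [lo + (hi - lo) * i / n_segments for i in range(n_segments + 1)]
    pts = [lo + (hi - lo) * i / samples for i in range(samples + 1)]
    return max(abs(V(p) - piecewise_linear(p, nodes)) for p in pts)

# Refining the segmentation reduces the approximation error.
print(max_error(10), max_error(40))
```

In the actual method, the field equations are solved exactly on each linear segment and matched at the nodes, so refining the segmentation tightens the bounce action in the same way this interpolation error tightens here.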

    The economics of leisure and unpaid work

    No abstract available.

    Coverage of Hybrid Terrestrial-Satellite Location in Mobile Communications

    This work studies the improvement in service coverage obtained by three different ways of hybridising (terrestrial and satellite) triangulation location methods for cellular networks. Though the authors assume that terrestrial cellular networks use Enhanced Observed Time Difference (E-OTD) in 2G or Observed Time Difference Of Arrival (OTDOA) in 3G, and that the satellite GNSS uses the Assisted Global Positioning System (A-GPS), their analysis can easily be generalized to any other triangulation method. A simple analytical model is presented and used to evaluate the service coverage of each approach. The numerical results show how hybridisation yields a substantial improvement and allows an easy balance between traffic and geographical coverage. Peer reviewed.
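The intuition behind hybrid coverage can be sketched with a minimal model. This is an assumption-laden simplification, not the paper's analytical model: if the terrestrial method (e.g. OTDOA) and the satellite method (e.g. A-GPS) succeed independently with probabilities p_t and p_s, the hybrid service is covered whenever at least one of them works.

```python
def hybrid_coverage(p_t, p_s):
    # Probability that at least one of two independent positioning
    # methods succeeds: the complement of both failing.
    return 1.0 - (1.0 - p_t) * (1.0 - p_s)

# Hypothetical urban scenario: good terrestrial coverage (0.9) but poor
# satellite visibility (0.5) due to street canyons.
print(round(hybrid_coverage(0.9, 0.5), 6))  # 0.95
```

Even under this crude independence assumption, the combined coverage exceeds either method alone, which is the qualitative effect the paper quantifies with its more detailed model.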

    A Voxelized Fractal Descriptor for 3D Object Recognition

    Currently, state-of-the-art methods for 3D object recognition rely on a deep-learning pipeline. Nonetheless, these methods require a large amount of data that is not easy to obtain. In addition, most of them exploit particular features of the datasets, such as the objects being CAD models, to create rendered representations; this does not work in real life, where 3D sensors provide point clouds. We propose a novel global descriptor for point clouds that takes advantage of the fractal dimension of the objects. Our approach introduces many benefits: it is agnostic to the point density of the sample, the number of points in the input cloud, the sensor of choice, and noise up to a certain level, and it works on real-life point cloud data provided by commercial sensors. We tested our descriptor for 3D object recognition using ModelNet, a well-known dataset for that task. Our approach achieves 92.84% accuracy on ModelNet10 and 88.74% accuracy on ModelNet40.
    This work was supported in part by the Spanish Government, with FEDER funds, under Grant PID2019-104818RB-I00, and in part by the Spanish grants for Ph.D. studies ACIF/2017/243 and FPU16/00887.
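The core idea of a fractal-dimension descriptor can be sketched with classic box counting; the paper's voxelized descriptor is more elaborate, so this is only an illustration of the underlying quantity: count occupied voxels at several scales and fit the slope of log(count) against log(1/size).

```python
import math

def occupied_voxels(points, voxel_size):
    # Number of distinct voxels containing at least one 3D point.
    return len({tuple(int(math.floor(c / voxel_size)) for c in p)
                for p in points})

def fractal_dimension(points, sizes):
    # Least-squares slope of log N(s) versus log(1/s): the box-counting
    # estimate of the fractal dimension.
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(occupied_voxels(points, s)) for s in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Sanity check: points along a straight segment should give a dimension
# close to 1, independently of how densely the segment is sampled.
line = [(i / 1000.0, 0.0, 0.0) for i in range(1000)]
print(fractal_dimension(line, [0.2, 0.1, 0.05, 0.025]))
```

Because the estimate depends on occupied voxels rather than raw points, it is insensitive to sampling density, which hints at why a descriptor built on it can tolerate varying point counts and sensors.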

    Entity Identification Problem in Big and Open Data

    Big and Open Data provide great opportunities for businesses to enhance their competitive advantage if utilized properly. However, during the past few years of research on Big and Open Data processing, we have encountered a major challenge in entity identification reconciliation when trying to establish accurate relationships between entities from different data sources. In this paper, we present our innovative Intelligent Reconciliation Platform and Virtual Graphs solution that addresses this issue. With this solution, we are able to efficiently extract Big and Open Data from heterogeneous sources and integrate them into a common analysable format. Further enhanced with the Virtual Graphs technology, entity identification reconciliation is processed dynamically to produce more accurate results at system runtime. Moreover, we believe that our technology can be applied to a wide diversity of entity identification problems in several domains, e.g., e-Health, cultural heritage, and company identities in the financial world.
    Ministerio de Ciencia e Innovación TIN2013-46928-C3-3-
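A toy version of entity identification reconciliation gives a feel for the problem; this sketch is not the paper's Virtual Graphs platform, just the simplest baseline: normalise names from heterogeneous sources and compare their token sets with Jaccard similarity.

```python
def normalise(name):
    # Lowercase, replace punctuation with spaces, and split into tokens,
    # so "ACME Corp." and "Acme Corp" produce the same token set.
    cleaned = "".join(ch if ch.isalnum() or ch.isspace() else " "
                      for ch in name.lower())
    return frozenset(cleaned.split())

def jaccard(a, b):
    # Similarity of two token sets: |intersection| / |union|.
    return len(a & b) / len(a | b)

def same_entity(name_a, name_b, threshold=0.5):
    # Declare a match when the token overlap exceeds the threshold.
    return jaccard(normalise(name_a), normalise(name_b)) >= threshold

print(same_entity("ACME Corp.", "Acme Corp"))          # True
print(same_entity("ACME Corp.", "Globex Industries"))  # False
```

Real reconciliation must also resolve conflicting attributes, transliterations, and relationships between records, which is where a dynamic, graph-based approach like the one described above goes beyond this kind of static string matching.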

    Par3DNet: Using 3DCNNs for Object Recognition on Tridimensional Partial Views

    Deep learning-based methods have proven to be the best performers for object recognition, both in images and in tridimensional data. Nonetheless, when it comes to 3D object recognition, authors tend to convert the 3D data to images and then perform the classification. Despite its accuracy, this approach has some issues. In this work, we present a deep-learning pipeline for object recognition that takes a point cloud as input and provides classification probabilities as output. Our proposal is trained on synthetic CAD objects and performs accurately when fed with real data provided by commercial sensors. Unlike most approaches, our method is specifically trained to work on partial views of the objects rather than on a full representation, since a full representation is not what commercial sensors capture. We trained our proposal on the ModelNet10 dataset and achieved 78.39% accuracy. We also tested it by adding noise to the dataset and against a number of datasets and real data, with high success.
    This work has been funded by the Spanish Government grant TIN2016-76515-R for the COMBAHO project, supported with FEDER funds. It has also been supported by the Spanish grants for Ph.D. studies ACIF/2017/243 and FPU16/00887.

    A quality management based on the Quality Model life cycle

    Managing quality is a hard and expensive task that involves the execution and control of processes and techniques. For good quality management, it is important to know the current state and the objective to be achieved. It is essential to work with a Quality Model that specifies the purposes of managing quality. QuEF (Quality Evaluation Framework) is a framework to manage quality in MDWE (Model-Driven Web Engineering). This paper proposes managing quality with a focus on the Quality Model life cycle. The purpose is to converge toward continuous quality improvement by reducing effort and time.
    Ministerio de Ciencia e Innovación TIN2010-20057-C03-02; Ministerio de Ciencia e Innovación TIN 2010-12312-E; Junta de Andalucía TIC-578

    An Extension of NDT to Model Entity Reconciliation Problems

    Within the development of software systems, web applications may be the most widespread at present, owing to the many advantages they provide, such as multi-platform support, speed of access, and not requiring extremely powerful hardware. Because so many web applications are being developed, the volume of information generated daily is enormous. In managing all this information, the entity reconciliation problem appears, which consists of identifying objects that refer to the same real-world entity. This paper proposes a solution to this problem from a web perspective. To this end, the NDT methodology has been taken as a reference and extended with new activities, artefacts, and documents to cover this problem.
    Ministerio de Economía y Competitividad TIN2013-46928-C3-3-R; Ministerio de Economía y Competitividad TIN2016-76956-C3-2-R; Ministerio de Economía y Competitividad TIN2015-71938-RED