104 research outputs found

    Knowledge management: the issue of multimedia contents

    Knowledge Management is a very important topic in business and in academic research. There are many fields of application for knowledge management, including cognitive science, sociology, management science, information science, knowledge engineering, artificial intelligence, and economics. Many studies on different aspects of Knowledge Management have been published, becoming common in the early 1990s. In this work, we represent knowledge through a mixed iterative approach that applies both top-down and bottom-up analyses of the knowledge domain to be represented: these are typical approaches for this kind of problem. Here they are applied iteratively, so that successive refinements yield an efficient formalization able to represent the domain knowledge of interest. We start from the concept of the "domain knowledge base": the fundamental body of knowledge available on a domain is the knowledge valuable to its users. To represent and manage this knowledge, we need to define a formalization and codification of the knowledge in the domain. After this formalization, the knowledge can be managed through knowledge repositories. In this thesis, we present four different formalizations and management approaches for multimedia knowledge, using our proposed approach: 1. user-generated content from popular platforms (Flickr, YouTube, etc.); 2. audio recordings of a linguistic corpus and the information added to that corpus through annotations; 3. knowledge associated with construction processes; 4. descriptions and reviews of Italian wines. The most important result of this thesis is that it makes this dispersed knowledge available and manageable. In the current market, exploiting existing knowledge is a mainstream business, but in order to exploit it, one must be able to manage it first.
As evidence of this importance, the studies discussed in this thesis led not only to about ten scientific publications but, above all, to a number of industrial research projects in partnership with ICT companies, one of which has a total value above one million Euros.

    Incidence of predatory journals in computer science literature

    Purpose One of the main tasks of a researcher is to properly communicate the results obtained. The choice of the journal in which to publish the work is therefore very important. However, not all journals have characteristics suitable for the correct dissemination of scientific knowledge. Some publishers turn out to be unreliable and, in exchange for payment, publish whatever researchers propose. Such untrustworthy journals are known as "predatory journals". The purpose of this paper is to analyse the incidence of predatory journals in the computer science literature and to present a tool developed for this purpose. Design/methodology/approach The authors focused their attention on the editors, universities and publishers involved in this kind of publishing process. The starting point of their research is the list of scholarly open-access publishers and open-access stand-alone journals created by Jeffrey Beall. Specifically, they analysed the presence of predatory journals in the search results obtained from Google Scholar in the engineering and computer science fields. They also studied how this incidence changed over time in articles published between 2011 and 2015. Findings The analysis shows that the phenomenon of predatory journals somewhat decreased in 2015, probably due to greater awareness of the risks to the authors' reputation. Originality/value The authors focused on the computer science field, using a specific sample of queries, and developed software to automatically submit queries to the search engine and detect predatory journals using Beall's list.
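The abstract does not describe the tool's internals, so the following is only an illustrative sketch of the general idea: flag a search result as predatory when its venue string matches an entry from a Beall-style blacklist, and aggregate the incidence per publication year. All names here (PREDATORY_PUBLISHERS, is_predatory, incidence_by_year) are hypothetical stand-ins, not the authors' actual software.

```python
# Hypothetical sketch of predatory-journal detection over search results.
# The tiny blacklist below stands in for Beall's list; it is NOT the real list.
PREDATORY_PUBLISHERS = {
    "science publishing group",
    "omics international",
}

def is_predatory(venue: str, blacklist=PREDATORY_PUBLISHERS) -> bool:
    """Return True if the venue string mentions a blacklisted publisher."""
    v = venue.lower()
    return any(pub in v for pub in blacklist)

def incidence_by_year(results, blacklist=PREDATORY_PUBLISHERS):
    """results: iterable of (year, venue) pairs harvested from search queries.
    Returns the fraction of flagged results per year."""
    totals, flagged = {}, {}
    for year, venue in results:
        totals[year] = totals.get(year, 0) + 1
        if is_predatory(venue, blacklist):
            flagged[year] = flagged.get(year, 0) + 1
    return {y: flagged.get(y, 0) / n for y, n in totals.items()}
```

A real implementation would also need fuzzy matching (predatory journals often imitate legitimate titles) and care in parsing venue strings out of search-engine result pages.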

    Primal-dual fixed-point algorithms applied to the Ridge Regression problem

    In this work we propose algorithms for solving a general primal-dual fixed-point formulation applied to the Ridge Regression problem. We study the primal formulation for regularized least-squares problems, in particular with the L2 norm, known as Ridge Regression, and then describe convex duality for that class of problems. Our strategy was to consider the primal and dual formulations together and to minimize the duality gap between them. We establish the primal-dual fixed-point algorithm, named SRP, and a reformulation of this method, the main contribution of the thesis, which proved more efficient and robust, called the acc-SRP method, or accelerated version of the SRP method. The theoretical study of the algorithms was carried out by analysing the spectral properties of the associated iteration matrices. We prove the linear convergence of the algorithms and present numerical examples comparing two variants of each proposed algorithm. We also show that our best method, acc-SRP, has excellent numerical performance on very ill-conditioned problems when compared with the conjugate gradient method, which makes it computationally more attractive.
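The SRP and acc-SRP iterations are not specified in the abstract, so no attempt is made to reproduce them here. As background, the sketch below only illustrates the ridge regression problem itself and the fact that its primal and dual solution routes coincide (zero duality gap at the optimum), which is the relationship the thesis exploits; the function names are illustrative.

```python
import numpy as np

def ridge_primal(A, b, lam):
    """Primal route: x = (A^T A + lam*I)^{-1} A^T b, minimizing
    ||Ax - b||^2 + lam*||x||^2 over x."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def ridge_dual(A, b, lam):
    """Dual route: alpha = (A A^T + lam*I)^{-1} b, then x = A^T alpha."""
    m = A.shape[0]
    alpha = np.linalg.solve(A @ A.T + lam * np.eye(m), b)
    return A.T @ alpha

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
xp = ridge_primal(A, b, 0.1)
xd = ridge_dual(A, b, 0.1)
# At the optimum the duality gap vanishes, so both routes agree:
assert np.allclose(xp, xd)
```

The agreement follows from the identity (A^T A + lam*I)^{-1} A^T = A^T (A A^T + lam*I)^{-1}; primal-dual methods like those in the thesis iterate on both formulations at once and drive the gap between them to zero.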

    Science with the Einstein Telescope: a comparison of different designs

    The Einstein Telescope (ET), the European project for a third-generation gravitational-wave detector, has a reference configuration based on a triangular shape consisting of three nested detectors with 10 km arms, where each arm hosts a `xylophone' configuration made of an interferometer tuned toward high frequencies and an interferometer tuned toward low frequencies and operating at cryogenic temperature. Here, we examine the scientific perspectives under possible variations of this reference design. We perform a detailed evaluation of the science case for a single triangular-geometry observatory, and we compare it with the results obtained for a network of two L-shaped detectors (either parallel or misaligned) located in Europe, considering different choices of arm length for both the triangle and the 2L geometries. We also study how the science output changes in the absence of the low-frequency instrument, for both the triangle and the 2L configurations. We examine a broad class of simple `metrics' that quantify the science output, related to compact binary coalescences, multi-messenger astronomy and stochastic backgrounds, and we then examine the impact of different detector designs on a more specific set of scientific objectives. Comment: 197 pages, 72 figures.

    Science case study and scientific simulations for the enhanced X-ray Timing Polarimetry mission, eXTP

    The X-ray astronomy mission eXTP (enhanced X-ray Timing Polarimetry) is designed to study matter under extreme conditions of density, gravity and magnetism. Its primary goals are the determination of the equation of state (EoS) of matter at supranuclear density, the physics of extremely strong magnetic fields, and the study of accretion in the strong-field gravity (SFG) regime. Primary targets include isolated and binary neutron stars, strong magnetic-field systems such as magnetars, and stellar-mass and supermassive black holes. In this paper we report on key observations and simulations with eXTP for the primary objectives involving accretion in the SFG regime and the determination of the NS EoS.

    Report from Working Group 3: Beyond the standard model physics at the HL-LHC and HE-LHC

    This is the third of the five chapters of the final report [1] of the Workshop on Physics at HL-LHC, and perspectives on HE-LHC [2]. It is devoted to the study of the potential, in the search for Beyond the Standard Model (BSM) physics, of the High Luminosity (HL) phase of the LHC, defined as 3 ab−1 of data taken at a centre-of-mass energy of 14 TeV, and of a possible future upgrade, the High Energy (HE) LHC, defined as 15 ab−1 of data at a centre-of-mass energy of 27 TeV. We consider a large variety of new-physics models, both in a simplified-model fashion and in a more model-dependent one. A long list of contributions from the theory and experimental (ATLAS, CMS, LHCb) communities have been collected and merged together to give a complete, wide, and consistent view of future prospects for BSM physics at the considered colliders. On top of the usual standard candles, such as supersymmetric simplified models and resonances, considered for the evaluation of future collider potentials, this report contains results on dark matter and dark sectors, long-lived particles, leptoquarks, sterile neutrinos, axion-like particles, heavy scalars, vector-like quarks, and more. Particular attention is placed, especially in the study of the HL-LHC prospects, on the detector upgrades, the assessment of future systematic uncertainties, and new experimental techniques. The general conclusion is that the HL-LHC, on top of extending the present LHC mass and coupling reach by 20–50% in most new-physics scenarios, will also be able to constrain, and potentially discover, new physics that is presently unconstrained. Moreover, compared to the HL-LHC, the reach in most observables will generally more than double at the HE-LHC, which may represent a good candidate future facility for a final test of TeV-scale new physics.

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb−1 of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV and assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.

    Myocyte membrane and microdomain modifications in diabetes: determinants of ischemic tolerance and cardioprotection
