
    The big five: Discovering linguistic characteristics that typify distinct personality traits across Yahoo! answers members

    Indexed in Scopus. This work was partially supported by the FONDECYT project "Bridging the Gap between Askers and Answers in Community Question Answering Services" (11130094), funded by the Chilean Government. In psychology, it is widely believed that five big factors determine the different personality traits: Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness. In recent years, researchers have begun to examine how these factors are manifested across social networks such as Facebook and Twitter. However, to the best of our knowledge, other kinds of social networks, such as social/informational question-answering communities (e.g., Yahoo! Answers), have been left unexplored. This work therefore explores several predictive models to automatically recognize these factors across Yahoo! Answers members. As a means of devising powerful generalizations, these models were combined with assorted linguistic features. Since we could not ask community members to volunteer to take the personality test, we built a study corpus by conducting a discourse analysis based on deconstructing the test into 112 adjectives. Our results reveal that it is plausible to lessen the dependency upon answered tests, and that the effective models for distinct factors are sharply different. Also, sentiment analysis and dependency parsing proved fundamental for dealing with extraversion, agreeableness, and conscientiousness. Furthermore, medium and low levels of neuroticism were found to be related to initial stages of depression and anxiety disorders. © 2018 Lithuanian Institute of Philosophy and Sociology. All rights reserved. https://www.cys.cic.ipn.mx/ojs/index.php/CyS/article/view/275
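    The adjective-based corpus construction described above can be illustrated with a minimal sketch. The adjective lists and the normalized-count scoring rule below are illustrative placeholders, not the paper's actual 112-adjective lexicon or its predictive models:

    ```python
    # Minimal sketch of adjective-based trait scoring over a user's text.
    # The adjective sets here are hypothetical stand-ins, not the 112
    # adjectives obtained by deconstructing the personality test.
    TRAIT_ADJECTIVES = {
        "extraversion": {"outgoing", "talkative", "energetic"},
        "agreeableness": {"kind", "cooperative", "warm"},
        "conscientiousness": {"organized", "careful", "diligent"},
        "neuroticism": {"anxious", "moody", "tense"},
        "openness": {"curious", "imaginative", "creative"},
    }

    def trait_scores(text: str) -> dict:
        """Count trait-marker adjectives, normalized by token count."""
        tokens = text.lower().split()
        n = max(len(tokens), 1)
        return {
            trait: sum(tokens.count(adj) for adj in adjs) / n
            for trait, adjs in TRAIT_ADJECTIVES.items()
        }

    scores = trait_scores("I am a curious and imaginative person always curious")
    ```

    A real pipeline would feed such per-member feature vectors, together with sentiment and dependency-parse features, into the trained classifiers.
    
    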

    Compact union of disjoint boxes: An efficient decomposition model for binary volumes

    This paper presents in detail the Compact Union of Disjoint Boxes (CUDB), a decomposition model for binary volumes that has been recently but briefly introduced. This model is an improved version of a previous model called Ordered Union of Disjoint Boxes (OUDB). We show here several desirable features that this model has versus OUDB, such as fewer unitary basic elements (boxes) and thus better efficiency in some neighborhood operations. We present algorithms for conversion to and from other models, and for basic computations such as area (2D) or volume (3D). We also present an efficient algorithm for connected-component labeling (CCL) that does not follow the classical two-pass strategy. Finally, we present an algorithm for collision (or adjacency) detection in static environments. We test the efficiency of CUDB versus existing models with several datasets.
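    One reason disjoint-box decompositions make area/volume computation cheap is that no overlap correction is needed: the measure of the union is simply the sum of the box measures. A minimal sketch, assuming an axis-aligned (mins, maxs) corner representation for each box (the paper's internal representation may differ):

    ```python
    # Sketch of area (2D) / volume (3D) computation on a union of
    # disjoint axis-aligned boxes. Because the boxes are pairwise
    # disjoint by construction, the union's measure is just the sum
    # of individual box measures -- no inclusion-exclusion needed.
    def box_measure(box):
        """Measure of one box given as (mins, maxs) corner tuples."""
        mins, maxs = box
        m = 1
        for lo, hi in zip(mins, maxs):
            m *= hi - lo
        return m

    def union_measure(boxes):
        """Total measure of a disjoint-box decomposition."""
        return sum(box_measure(b) for b in boxes)

    # Two disjoint 2D boxes: a 2x3 box and a 1x4 box.
    boxes = [((0, 0), (2, 3)), ((5, 0), (6, 4))]
    total = union_measure(boxes)  # 6 + 4 = 10
    ```
    
    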

    Estado del arte sobre Frameworks de infraestructura para Big Data

    Nowadays, Big Data analysis has become a great challenge for educational, governmental, and commercial organizations because of the large amount of data to be processed, which makes it very difficult to carry out the various analysis processes with conventional database and analytics tools. Big Data trends bring with them a large number of tools and applications developed specifically to support the growth of this technology for data analysis; some of these work together in already-implemented architecture solutions, on which much of this monograph is based, synthesizing the information needed to generate a proposal for a Big Data architecture. This work aims to show the infrastructure components necessary to support Big Data analysis, based on solutions implemented by the best-known providers, taking as reference models, diagrams, and software tools aimed at the different deployments that can be generated from specific requirements, and obtaining as a result a solution for a Big Data architecture using the characteristics of the scenarios proposed in this work.

    Facial Geometry Identification through Fuzzy Patterns with RGBD Sensor

    Automatic human facial recognition is an important and complicated task; it is necessary to design algorithms capable of recognizing the constant patterns in the face and of using computing resources efficiently. In this paper we present a novel algorithm to recognize the human face in real time; the system's input is the depth and color data from the Microsoft Kinect™ device. The algorithm recognizes patterns/shapes in the point cloud topography. The template of the face is based on facial geometry; forensic theory classifies the human face with respect to constant patterns: cephalometric points, lines, and areas of the face. The topography, relative position, and symmetry are directly related to the craniometric points. The similarity between a point cloud cluster and a pattern description is measured by a fuzzy pattern theory algorithm. Face identification is composed of two phases: the first phase calculates the face pattern hypothesis from the facial points, configuring each point shape and its related location in the areas and lines of the face. Then, in the second phase, the algorithm performs a search on these face point configurations.

    A bibliometric overview of University-business collaboration between 1980 and 2016

    Bibliometrics is a research field that analyses bibliographic material from a quantitative point of view. Aiming to provide a comprehensive overview, this study scrutinises the academic literature on university-business collaboration and technology transfer research for the period after the Bayh-Dole Act (1980-2016). The study employs the Web of Science as the main database from which information is collected. Bibliometric indicators such as number of publications, citations, productivity, and the H-index are used to analyse the results. The main findings are displayed in the form of tables and are further discussed. The focus is on the identification of the most relevant journals in this area, the most cited papers, the most prolific authors, leading institutions, and countries. The results show that the USA, England, Spain, Italy, and the Netherlands are highly active in this area. Scientific production tends to fall within the research areas of business and economics, engineering, or public administration, and is mainly published in journals such as Research Policy, Technovation, and the Journal of Technology Transfer.
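    Of the bibliometric indicators mentioned, the H-index has the least obvious definition: it is the largest h such that the author (or journal, or institution) has h publications with at least h citations each. A minimal sketch of the computation:

    ```python
    # Sketch of the H-index: sort citation counts in descending order
    # and find the largest rank h whose paper still has >= h citations.
    def h_index(citations):
        cites = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(cites, start=1):
            if c >= rank:
                h = rank
            else:
                break
        return h

    h = h_index([10, 8, 5, 4, 3])  # 4 papers with >= 4 citations -> 4
    ```
    
    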

    Herramientas forenses de software libre

    The rise in computer crime and its impact on society has stimulated the creation of a set of tools and the training of technical personnel in the area of computer forensics, all with the goal of tackling this problem. Commercial software companies and the open-source software community respond to this need with a series of programs that provide new functionality and more sophisticated tools, among which ENCASE, HELIX, CAINE, and DEFT stand out. But why choose a free-software tool? The use of open source plays a prominent role in the education of future forensic analysts, since it allows them to understand in depth the techniques used to reconstruct evidence, to examine the code, to understand the relationship between binary images and the relevant data structures, and, in the course of this process, to create new and improved forensic software tools at low cost. This work demonstrates computer forensics using the free-software tools of the Backtrack 4 r2 distribution, and assesses students' knowledge of it in the computer science program at the Recinto Universitario Rubén Darío of the Universidad Nacional Autónoma de Nicaragua, UNAN-Managua. First, we present some of the techniques hackers use to break into remote systems, with the goal of adopting the mindset of intruders, who identify their target, analyze how to obtain information, then process it and plan their attacks on that target, which may be a company, an organization, an educational institution, or any other entity with a computer center connected to the internet.
Additionally, we show some small intrusion tests against such a target. If attackers manage to enter the system, they must be as cautious as possible so as not to be discovered; they must erase their tracks and hide in order to get in without problems. Sometimes hackers compromise a system that is close to the main target in order to get as close as possible; that is, they may use one system as a bridge, and this bridge can serve for attacks or for access.

    Variable neighborhood search for solving the DNA fragment assembly problem

    The fragment assembly problem consists of building the DNA sequence from several hundreds (or even thousands) of fragments obtained by biologists in the laboratory. This is an important task in any genome project, since the accuracy of the remaining phases depends on the result of this stage. In addition, real instances are very large, so efficiency is also a very important issue in the design of fragment assemblers. In this paper, we propose two Variable Neighborhood Search variants for solving the DNA fragment assembly problem. These algorithms are specifically adapted to the problem, the difference between them being the optimization criterion (fitness function). One of them maximizes Parsons' fitness function (which only considers the overlap among the fragments), and the other estimates the variation in the number of contigs during a local search move, in order to minimize the number of contigs. The results show that no direct relation exists between these functions (in several cases, even opposite values are produced), although for the tested instances both variants find similar and very good results, with the second option significantly reducing the running time. VIII Workshop de Agentes y Sistemas Inteligentes. Red de Universidades con Carreras en Informática (RedUNCI).
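    The overlap-only fitness criterion can be sketched as follows: score a candidate fragment ordering by summing the pairwise overlap between consecutive fragments. The simple suffix/prefix scorer below is an illustrative stand-in for the alignment-based overlap scores real assemblers use, and is not the paper's exact formulation:

    ```python
    # Sketch of a Parsons-style overlap fitness for a fragment layout.
    def overlap(a: str, b: str) -> int:
        """Length of the longest suffix of a that is a prefix of b."""
        for k in range(min(len(a), len(b)), 0, -1):
            if a.endswith(b[:k]):
                return k
        return 0

    def parsons_fitness(order, fragments):
        """Fitness to maximize: total overlap of adjacent fragments."""
        return sum(
            overlap(fragments[order[i]], fragments[order[i + 1]])
            for i in range(len(order) - 1)
        )

    frags = ["ACGTAC", "TACGGA", "GGATTT"]
    score = parsons_fitness([0, 1, 2], frags)  # overlaps 3 + 3 = 6
    ```

    A VNS variant would then explore permutations of the fragment indices, accepting moves that increase this score (or, in the second variant, that reduce the estimated number of contigs).
    
    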

    FPGA-Based Digital filters using Bit-Serial arithmetic

    This paper presents an efficient method for the implementation of digital filters targeting FPGA architectures. The traditional approach is based on the application of general-purpose multipliers. However, multipliers implemented in FPGA architectures do not allow economical digital filters to be constructed. For this reason, multipliers are replaced by lookup tables and adder-subtractors, which use bit-serial arithmetic. Lookup tables can be of considerable size in high-order filters, so interconnection techniques are used to construct high-order filters from a set of low-order filters. The paper presents several examples confirming that these techniques allow a reduction in logic-cell utilization for filter implementations based on the bit-serial arithmetic concept. II Workshop de Arquitecturas, Redes y Sistemas Operativos. Red de Universidades con Carreras en Informática (RedUNCI).
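    The lookup-table + adder scheme described above is commonly known as distributed arithmetic: for an N-tap filter, a 2^N-entry LUT stores every partial sum of the coefficients, and the inner product is accumulated one input bit-plane at a time with shift-adds instead of multipliers. A minimal software sketch for unsigned samples (the paper's hardware handles signed data and word growth, which this toy omits):

    ```python
    # Sketch of distributed arithmetic: LUT + bit-serial shift-add
    # replacing the general-purpose multipliers of an FIR inner product.
    def build_lut(coeffs):
        """LUT[p] = sum of the coefficients selected by the bits of p."""
        n = len(coeffs)
        return [
            sum(c for k, c in enumerate(coeffs) if (p >> k) & 1)
            for p in range(1 << n)
        ]

    def da_inner_product(coeffs, samples, bits=8):
        """Bit-serial inner product of coeffs with unsigned samples."""
        lut = build_lut(coeffs)
        acc = 0
        for b in range(bits):  # one bit-plane per clock in hardware
            addr = 0
            for k, x in enumerate(samples):
                addr |= ((x >> b) & 1) << k  # gather bit b of each sample
            acc += lut[addr] << b  # shift-add replaces a multiplier
        return acc

    y = da_inner_product([3, 1, 2], [5, 7, 2])  # 3*5 + 1*7 + 2*2 = 26
    ```

    The LUT growth with filter order is exactly why the paper partitions high-order filters into interconnected low-order sections: a 2^N-entry table quickly becomes impractical as N grows.
    
    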

    Solving constrained optimization using a T-Cell artificial immune system

    In this paper, we present a novel model of an artificial immune system (AIS), based on the process that T cells undergo. The proposed model is used for solving constrained (numerical) optimization problems. The model operates on three populations: virgin, effector, and memory cells, each of which has a different role. The model also dynamically adapts the tolerance factor in order to improve the exploration capabilities of the algorithm. We further develop a new mutation operator which incorporates knowledge of the problem. We validate our proposed approach with a set of test functions taken from the specialized literature, and we compare our results with Stochastic Ranking (an approach representative of the state of the art in the area) and with a previously proposed AIS. VIII Workshop de Agentes y Sistemas Inteligentes. Red de Universidades con Carreras en Informática (RedUNCI).
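    The dynamically adapted tolerance factor can be sketched as follows: feasibility of an equality constraint is judged against a tolerance that shrinks over the run, so early search explores loosely and later search is strict. The toy objective, constraint, and geometric decay schedule below are illustrative assumptions, not the paper's exact rule or populations:

    ```python
    # Sketch of a shrinking tolerance factor for equality constraints
    # in a constrained-optimization search loop. The schedule and the
    # single-candidate loop are illustrative; the paper's model evolves
    # three populations (virgin, effector, memory) of T cells.
    import random

    def is_feasible(x, eps):
        """Toy equality constraint: x[0] + x[1] == 1 within tolerance eps."""
        return abs(x[0] + x[1] - 1.0) <= eps

    def mutate(x, step):
        return [xi + random.uniform(-step, step) for xi in x]

    def search(generations=50, eps0=1.0, decay=0.9):
        random.seed(0)  # deterministic for the example
        best = [0.5, 0.5]  # feasible starting point
        eps = eps0
        for _ in range(generations):
            cand = mutate(best, step=0.1)
            # objective: minimize x[0]^2 + x[1]^2 subject to the constraint
            if is_feasible(cand, eps) and (
                sum(c * c for c in cand) < sum(b * b for b in best)
            ):
                best = cand
            eps *= decay  # tighten the tolerance each generation
        return best, eps
    ```

    Loosening feasibility early lets the search cross infeasible regions between feasible islands; tightening it later forces convergence onto (near-)feasible solutions.
    
    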