7 research outputs found

    Skeletons for Distributed Topological Computation

    Parallel implementation of topological algorithms is highly desirable, but the challenges, from restructuring algorithms around independent threads through to runtime load balancing, have proven formidable. This problem, made all the more acute by the diversity of hardware platforms, has led to new kinds of implementation platforms for computational science, with sophisticated runtime systems managing and coordinating large thread counts to keep processing elements heavily utilized. While simpler and more portable than direct management of threads, these approaches still entangle program logic with resource management. Similar highly parallel runtime systems have also been developed for functional languages. Here, however, language support for higher-order functions allows a cleaner separation between the algorithm and 'skeletons' that express generic patterns of parallel computation. We report results on using this technique to develop a distributed version of the Joint Contour Net, a generalization of the Contour Tree to multifields. We present performance comparisons against a recent Haskell implementation using shared-memory parallelism, and initial work on a skeleton for distributed-memory implementation that uses a novel strategy to reduce inter-process communication overheads.
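
    The key idea, separating algorithm logic from a reusable pattern of parallel coordination, can be sketched outside Haskell as well. The minimal Python sketch below illustrates the skeleton concept only (it is not the paper's Haskell or Joint Contour Net code): a "farm" skeleton owns all process management, while the algorithm supplies nothing but pure functions.

```python
# Minimal illustration of algorithmic skeletons: a "farm" (parallel map)
# and a reduction. The skeletons own all parallelism and scheduling;
# the algorithm passes in pure functions and never touches threads or
# processes. This is a sketch of the general idea, not the paper's
# Haskell implementation.
from concurrent.futures import ProcessPoolExecutor
from functools import reduce

def farm(worker, tasks, n_procs=4):
    """Skeleton: apply `worker` to each task in parallel."""
    with ProcessPoolExecutor(max_workers=n_procs) as pool:
        return list(pool.map(worker, tasks))

def reduce_parts(combine, parts):
    """Skeleton: fold partial results into a single result."""
    return reduce(combine, parts)

# --- Algorithm code: no resource management in sight -----------------
def analyse_block(block):      # stand-in for real per-block work,
    return sum(block)          # e.g. building a local data fragment

def merge(a, b):               # stand-in for joining adjacent fragments
    return a + b

if __name__ == "__main__":
    blocks = [list(range(i, i + 100)) for i in range(0, 1000, 100)]
    partial = farm(analyse_block, blocks)
    print(reduce_parts(merge, partial))
```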

    Automatic Humor Evaluation

    The aim of this thesis is to create a system for automatic humor evaluation. The system predicts the funniness and the category of input given in English. The core of the work is the construction of a classifier and the training of the model on purpose-built datasets to obtain the best possible results. The classifier architecture is based on neural networks. The system also includes a web user interface for communicating with the user. The result is a web application connected to the classifier that allows user input to be evaluated and user feedback to be collected.
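
    As a rough illustration of the kind of architecture described, a neural classifier with a shared text encoder and two output heads, one regressing funniness and one predicting a humor category, might look like the following PyTorch sketch. The layer sizes, vocabulary handling, and head design are assumptions made for illustration, not the thesis's actual model:

```python
# Hypothetical sketch of a two-headed humor classifier. All dimensions
# and the encoder choice are illustrative assumptions, not the
# architecture used in the thesis.
import torch
import torch.nn as nn

class HumorClassifier(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=128, n_categories=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.GRU(embed_dim, 128, batch_first=True)
        self.funniness_head = nn.Linear(128, 1)            # regression head
        self.category_head = nn.Linear(128, n_categories)  # classification head

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        _, h = self.encoder(x)      # final hidden state: (1, batch, 128)
        h = h.squeeze(0)
        return self.funniness_head(h).squeeze(-1), self.category_head(h)

model = HumorClassifier()
tokens = torch.randint(1, 20000, (8, 32))   # batch of 8 dummy sentences
funniness, category_logits = model(tokens)
print(funniness.shape, category_logits.shape)  # [8] and [8, 5]
```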

    Complexity Analysis of Random Convex Hulls

    In this thesis, we give new results on the average size of convex hulls made of points chosen in a convex body. This size is known when the points are chosen uniformly (and independently) in a convex polytope or in a sufficiently "smooth" convex body, and also when the points are chosen independently according to a centered Gaussian distribution. In the first part of this thesis, we introduce a technique that gives new results when the points are chosen arbitrarily in a convex body and then perturbed by random noise. This kind of analysis, called smoothed analysis, was initially developed by Spielman and Teng in their study of the simplex algorithm. For an arbitrary set of points in a ball, we obtain lower and upper bounds on this smoothed complexity, for uniform perturbations in a ball (in arbitrary dimension) and for Gaussian perturbations in dimension 2. The asymptotic behavior of the expected size of the convex hull of uniformly random points in a convex body is polynomial for a "smooth" body and polylogarithmic for a polytope. In the second part, we construct a convex body such that the expected size of the convex hull of points chosen uniformly in it oscillates between these two behaviors as the number of points increases. In the last part, we present an algorithm to efficiently generate a random convex hull of points chosen uniformly and independently in a disk, and we compute its average time and space complexity. The algorithm can generate a random convex hull without explicitly generating all the points. It has been implemented in C++ and integrated into the CGAL library.
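
    The "smooth body" regime mentioned above can be checked empirically. The naive Python sketch below (not the thesis's CGAL algorithm, which avoids generating all the points) estimates the expected hull size of n uniform points in a disk and compares it with growth on the order of n^(1/3), as expected for a smooth planar convex body:

```python
# Naive Monte Carlo estimate of the expected convex-hull size of n
# points drawn uniformly in a disk. Unlike the thesis's algorithm,
# this generates every point explicitly; it only illustrates the
# ~n^(1/3) growth expected for a smooth convex body in the plane.
import numpy as np
from scipy.spatial import ConvexHull

def uniform_disk(n, rng):
    r = np.sqrt(rng.random(n))          # sqrt gives uniform area density
    theta = 2 * np.pi * rng.random(n)
    return np.column_stack((r * np.cos(theta), r * np.sin(theta)))

rng = np.random.default_rng(0)
for n in (10**3, 10**4, 10**5):
    sizes = [len(ConvexHull(uniform_disk(n, rng)).vertices)
             for _ in range(20)]
    print(f"n={n:>6}  mean hull size={np.mean(sizes):6.1f}  "
          f"n^(1/3)={n ** (1/3):6.1f}")
```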

    Deployment and Analysis of the Application Acceleration Environment for Xilinx Accelerated Hardware Platforms (Vitis)

    Over the last 75 years, technological growth has exploded thanks to the development of numerous software and hardware solutions. The two are intrinsically related: increasingly demanding software requires higher-performance hardware, while advances in hardware enable software solutions with ever higher requirements. This has not only extended the scope of computing to practically every sector globally, but has also enabled the emergence of new technologies with very demanding processing needs, both in computational complexity and in the volume of information to be processed. This translates into services that must process large amounts of information in short times, or that by their very nature simply need to minimise processing time. Examples include Big Data analysis, Machine Learning, Time-Sensitive Networking, and the Internet of Things, among many others.
    These technologies and their requirements have driven the emergence of electronic devices that transform traditionally slower and less efficient applications into applications adapted to this new high-performance environment. Moreover, they make it possible to create new applications directly on this type of device. These devices enable what is known as hardware acceleration, introducing a new paradigm for the development of high-performance applications. Despite its great versatility and potential, the technology is still immature, and an in-depth study and analysis is needed to establish the first steps towards this environment for developing and deploying high-performance applications, so that it can be extended to the different technological sectors that could benefit from it. This project was therefore created to study and deploy the development environment for this type of solution, and to analyse the different development paths and methodologies, visualising their results, in order to determine the possibilities of this promising technology. For this purpose, the Xilinx Vitis development environment is used, implementing the solutions on Xilinx Alveo acceleration cards.

    Symbols Purely Mechanical: Language, Modernity, and the Rise of the Algorithm, 1605–1862

    In recent decades, scholars in both Digital Humanities and Critical Media Studies have encountered a disconnect between algorithms and what are typically thought of as "cultural" concerns. In Digital Humanities, researchers employing algorithmic methods in the study of literature have faced what Alan Liu has called a "meaning problem": a difficulty in reconciling computational results with traditional forms of interpretation. Conversely, in Critical Media Studies, some thinkers have questioned the adequacy of interpretive methods as means of understanding computational systems. This dissertation offers a historical account of how this disconnect came into being by examining the attitudes toward algorithms that existed in the three centuries prior to the development of the modern computer. Bringing together the histories of semiotics, poetics, and mathematics, I show that the present divide between algorithmic and interpretive methods results from a cluster of assumptions about historical change that developed in the eighteenth and nineteenth centuries, and that implicates attempts to give meaning to algorithms in the modern narrative of technological progress. My account organizes the early-modern discourse on algorithms into three distinct intellectual traditions that arose in successive periods. The first tradition, which reached its peak in the mid-seventeenth century, held that the correspondence between algorithm and meaning was guaranteed by divine providence, making algorithms a potential basis for a non-arbitrary mode of representation applicable to any field of knowledge, poetics as well as mathematics. A second tradition, most influential from the last decades of the seventeenth century to around 1800, denied that the correspondence between algorithm and meaning was pre-ordained and sought instead to create this correspondence by altering the ways people think. Finally, starting in the Romantic period, algorithms and culture came to be viewed as operating autonomously from one another, an intellectual turn that, I argue, continues to inform the way people view algorithms in the present day. By uncovering this history, this dissertation reveals some of the tacit assumptions that underlie present debates about the interface between computation and culture. The reason algorithms present humanists with a meaning problem, I argue, is that cultural and technical considerations now stand in different relations to history: culture is seen as arising from collective practices that lie beyond the control of any individual, whereas the technical details of algorithms are treated as changeable at will. It is because of this compartmentalization, I maintain, that the idea of progress plays such a persistent role in discussions of digital technologies; like the Modernist avant-garde, computing machines have license to break with established semantic conventions and thus to lead culture in new directions. As an alternative to this technocratic arrangement, I call for two complementary practices: a philology of algorithms that resituates them in history, and a poetic approach to computation that embraces misalignments between algorithm and meaning.

    Self-definition and modernization: Identity between individualism and globalization

    This thesis examines the complex interaction between individual self-definition and the sociocultural and macrosocial contexts of contemporary society, characterized by intricate dynamics of modernization, individualization, and globalization. Employing the Twenty Statements Test (TST), developed by Kuhn and McPartland (1954), on a sample of individuals from Chile, Spain, South Africa, the United States, and the United Kingdom, it analyzes how various sociodemographic variables, along with the country of origin of the responses, influence individuals' self-definitions. Methodologically, the research confronts the challenge of coding and analyzing TST responses, given their open-ended, textual nature. It uses the three-dimensional coding system for the TST proposed by Escobar (1988), which classifies content along three dimensions: reference, attribute, and sense. Sense is the content individuals express in their statements, reference is the entity or set of entities explicitly mentioned by the subject, and attribute is the set of descriptive adjectives defining the person. To explore the advantages and limitations of different techniques for coding and classifying open-ended responses, manual, semi-automatic, and automatic coding are employed, and the capability and quality of the methods are compared. Semi-automatic coding with dictionaries stands out, demonstrating superior performance for this type of response. The empirical findings indicate that individualization, modernization, and globalization shape the way individuals define themselves, suggesting a complex interaction between self-definition and the macrosocial context. Notable differences in self-definition are observed according to a country's degree of individualization: individuals from the more individualized countries (the United Kingdom and the United States) tend to give fewer and shorter self-definitions, with a greater propensity for doubts, evasions, and indefiniteness in their responses, whereas responses from Chile, Spain, and South Africa make greater reference to family and social roles. A trend towards subconsensual self-definition is nevertheless observed across all the countries studied. In parallel, elements such as gender, level of education, size of the place of residence, and age play crucial roles in identity construction. The relationship between the work environment and self-definition is also examined, highlighting the importance of social structures in identity formation. This study deepens our understanding of how the socioeconomic and cultural changes of modernity affect self-perception, underscoring the variability of self-definition across cultural contexts. The results emphasize the need for integrative approaches that consider both global and local influences on identity construction. This research contributes significantly to the understanding of self-definition in a globalized context, offering valuable perspectives for fields such as sociology, social psychology, and education, and laying the groundwork for future research at the intersection of these disciplines.
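
    To make the semi-automatic dictionary coding concrete, the short Python sketch below codes open-ended TST-style statements against a keyword dictionary and routes unmatched statements to a human coder. The category labels and keyword lists are invented for illustration; they are not Escobar's (1988) actual coding scheme:

```python
# Illustrative sketch of semi-automatic dictionary coding of TST
# statements. The keyword lists and category labels are made-up
# examples, not the Escobar (1988) scheme itself; unmatched statements
# fall through to a human coder, which is what makes the procedure
# "semi-automatic".
CODING_DICTIONARY = {
    "family_role": ["mother", "father", "daughter", "son", "wife", "husband"],
    "social_role": ["student", "teacher", "friend", "citizen", "worker"],
    "trait":       ["happy", "shy", "honest", "curious", "anxious"],
}

def code_statement(statement):
    """Return (category, keyword) if a dictionary entry matches, else None."""
    words = statement.lower().split()
    for category, keywords in CODING_DICTIONARY.items():
        for kw in keywords:
            if kw in words:
                return category, kw
    return None  # no match: route to manual coding

responses = ["I am a mother", "I am curious", "I am someone who hesitates"]
for r in responses:
    print(r, "->", code_statement(r) or "needs manual coding")
```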