
    Gap analysis of the Praxis market

    Praxis is a platform that allows students to search internship offers submitted by companies, research labs, and higher education institutions. The success of the platform depends on its adoption both by students, who use keywords to find internships, and by organizations, which need candidates to fill their vacancies. Two problems were detected with the platform: the high number of searches without results and the low percentage of internships with candidates. This work explores text mining techniques with the objective of creating a tool to analyse the discrepancy between supply and demand, helping to identify market gaps and possible improvements to the search engine. The final solution allows Praxis administrators to intuitively analyse the available data on a dashboard. Additionally, the analysis revealed limitations in the existing search engine, so an improved one was created. Software engineering best practices were followed and the defined objectives were achieved.
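
    As a minimal sketch of the kind of supply/demand comparison described above, one can contrast the terms used in student search queries with those appearing in internship offers; the data, stopword list, and function names below are illustrative assumptions, not the thesis's actual pipeline.

    import re
    from collections import Counter

    STOPWORDS = {"a", "an", "and", "at", "de", "em", "for", "in", "of", "the"}

    def term_counts(texts):
        """Lowercased word counts across a list of texts, minus stopwords."""
        counts = Counter()
        for text in texts:
            counts.update(w for w in re.findall(r"[a-zà-ÿ]+", text.lower())
                          if w not in STOPWORDS)
        return counts

    # Toy stand-ins for the platform's search logs and internship offers.
    queries = ["machine learning internship", "bioinformatics", "web development"]
    offers = ["Web development internship at ACME", "Data engineering, remote"]

    demand, supply = term_counts(queries), term_counts(offers)

    # Terms students search for that no offer mentions are candidate market
    # gaps and likely sources of zero-result searches.
    gaps = {term: n for term, n in demand.items() if term not in supply}
    print(sorted(gaps.items(), key=lambda kv: -kv[1]))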

    Integrating GRASS GIS and Jupyter Notebooks to facilitate advanced geospatial modeling education

    Open education materials are critical for the advancement of open science and the development of open-source software. These accessible and transparent materials provide an important pathway for sharing both standard geospatial analysis workflows and advanced research methods. Computational notebooks allow users to share live code with in-line visualizations and narrative text, making them a powerful interactive teaching tool for geospatial analytics. Specifically, Jupyter Notebooks are quickly becoming a standard format in open education. In this article, we introduce a new GRASS GIS package, grass.jupyter, that enhances the existing GRASS Python API to allow Jupyter Notebook users to easily manage and visualize GRASS data, including spatiotemporal datasets. While there are many Python-based geospatial libraries available for use in Jupyter Notebooks, GRASS GIS has extensive geospatial functionality, including support for multi-temporal analysis and dynamic simulations, making it a powerful teaching tool for advanced geospatial analytics. We discuss the development of grass.jupyter and demonstrate how the package facilitates teaching open-source geospatial modeling with a collection of Jupyter Notebooks designed for a graduate-level geospatial modeling course. The open education notebooks feature spatiotemporal data visualizations, hydrologic modeling, and spread simulations such as the spread of invasive species and urban growth.
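
    A minimal sketch of the notebook workflow the package enables, assuming GRASS GIS is installed with the grass executable on PATH and the North Carolina sample dataset unpacked under ~/grassdata; the paths and map names are illustrative assumptions, not taken from the article.

    import subprocess
    import sys
    from pathlib import Path

    # Ask the GRASS binary where its Python packages live and make them
    # importable (a common bootstrapping step before importing grass.jupyter).
    sys.path.append(
        subprocess.check_output(["grass", "--config", "python_path"],
                                text=True).strip()
    )

    import grass.jupyter as gj

    # Start a GRASS session bound to this notebook kernel.
    session = gj.init(str(Path.home() / "grassdata"), "nc_spm_08", "PERMANENT")

    # Render a raster with a vector overlay inline in the notebook.
    m = gj.Map()
    m.d_rast(map="elevation")              # elevation raster from the sample data
    m.d_vect(map="streams", color="blue")  # overlay the stream network
    m.show()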

    Proofs Versus Experiments: Wittgensteinian Themes Surrounding the Four-Color Theorem

    The Four-Colour Theorem (4CT) proof, presented to the mathematical community in a pair of papers by Appel and Haken in the late 1970s, provoked a series of philosophical debates. Many conceptual points of these disputes still require some elucidation. After a brief presentation of the main ideas of Appel and Haken’s procedure for the proof and a reconstruction of Thomas Tymoczko’s argument for the novelty of 4CT’s proof, we shall formulate some questions regarding the connections between the points raised by Tymoczko and some Wittgensteinian topics in the philosophy of mathematics, such as the importance of surveyability as a criterion for distinguishing mathematical proofs from empirical experiments. Our aim is to show that the “characteristic Wittgensteinian invention” (Mühlhölzer 2006) – the strong distinction between proofs and experiments – can shed some light on the conceptual confusions surrounding the Four-Colour Theorem.

    From Reproducible Research to Artificial Intelligence

    The studies presented here all make use of advanced data processing and of the “reproducible research” approach described in the introduction of this thesis. The first study (see Publication 1), an animal experiment from basic biomedical science, uses hierarchical modelling and a formalized analysis approach. The following three studies come from clinical research; here, a comprehensive program library was developed that enables semi-automatic analyses of retrospective and prospective cohorts that are agile and yet meet the highest scientific standards. All tables, figures, and analytical results in these publications were produced directly with the developed program library. Publication 5 presents an approach to real-time quality control for surgical procedures. Building such a system enables the use of big data methods and of the open-source program library rcusum that we published (see Publication 5). The final study describes the development and validation of a big data and AI platform for real-time prediction of surgical complications during the postoperative phase in the intensive care unit (Publication 6). In summary, the data processing and analytics methods used in this cumulative thesis enable both the production of robust and rapid research and the development of innovative risk stratification tools. Both benefit patients, directly and indirectly, and medical progress in general.
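
    The rcusum library itself is written in R; as a language-neutral illustration of the CUSUM technique that underlies such real-time quality control, the following Python sketch monitors a binary complication indicator. All rates, thresholds, and data are illustrative assumptions, not values from the publications.

    from math import log

    def cusum_chart(outcomes, p0=0.05, p1=0.10, limit=3.5):
        """Tabular CUSUM over binary outcomes (1 = complication, 0 = none).

        p0 is the acceptable complication rate, p1 the unacceptable rate to
        detect, and limit the decision threshold; all values are illustrative.
        """
        # Log-likelihood-ratio weights for a failure and a success under p1 vs p0.
        w_fail = log(p1 / p0)
        w_success = log((1 - p1) / (1 - p0))
        score, chart, alarms = 0.0, [], []
        for i, y in enumerate(outcomes):
            score = max(0.0, score + (w_fail if y else w_success))
            chart.append(score)
            if score >= limit:
                alarms.append(i)  # evidence the rate has shifted towards p1
                score = 0.0       # restart monitoring after the alarm
        return chart, alarms

    # Example: a quiet run of procedures followed by a cluster of complications.
    chart, alarms = cusum_chart([0] * 30 + [1, 0, 1, 1, 0, 1, 1, 1])
    print("alarms at procedure indices:", alarms)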

    Psychopower and Ordinary Madness: Reticulated Dividuals in Cognitive Capitalism

    Despite the seemingly neutral vantage of using nature for widely-distributed computational purposes, neither post-biological nor post-humanist teleology simply concludes with the real "end of nature" as entailed in the loss of the specific ontological status embedded in the identifier "natural." As evinced by the ecological crises of the Anthropocene—of which the 2019 Brazil Amazon rainforest fires are only the most recent—our epoch has transfixed the “natural order" and imposed entropic artificial integration, producing living species that become “anoetic,” made to serve as automated exosomatic residues, or digital flecks. I further develop Gilles Deleuze’s description of control societies to upturn Foucauldian biopower, replacing its spatio-temporal bounds with the exographic excesses of psycho-power; culling and further detailing Bernard Stiegler’s framework of transindividuation and hyper-control, I examine how becoming-subject is predictively facilitated within cognitive capitalism and what Alexander Galloway terms “deep digitality.” Despite the loss of material vestiges qua virtualization—which I seek to trace in an historical review of industrialization to postindustrialization—the drive-based and reticulated "internet of things" facilitates a closed loop from within the brain to the outside environment, such that the aperture of thought is mediated and compressed. The human brain, understood through its material constitution, is susceptible to total datafication’s laminated process of “becoming-mnemotechnical,” and, as neuroplasticity is now a valid description for deep learning and neural nets, we are privy to the rebirth of the once-discounted metaphor of the “cybernetic brain.” Probing algorithmic governmentality while posing noetic dreaming as both technical and pharmacological, I seek to analyze how spirit is blithely confounded with machine-thinking’s gelatinous cognition, as prosthetic organ-adaptation becomes probabilistically molded, networked, and agentially inflected (rather than simply externalized).

    Game of Templates. Deploying and (re-)using Virtualized Research Environments in High-Performance and High-Throughput Computing

    The Virtual Open Science Collaboration Environment project worked on different use cases to evaluate the steps necessary for virtualization or containerization, especially when considering the external dependencies of digital workflows. Virtualized Research Environments (VREs) can both help to broaden the user base of an HPC cluster like NEMO and offer new forms of packaging scientific workflows as well as managing software stacks. The eResearch initiative on VREs sponsored by the state of Baden-Württemberg provided the necessary framework for both the researchers of various disciplines and the providers of (large-scale) compute infrastructures to define future operational models of HPC clusters and scientific clouds. In daily operations, VREs running on virtualization or containerization technologies such as OpenStack or Singularity help to disentangle the responsibilities for the software stacks needed to fulfill a certain task. Nevertheless, the reproduction of VREs, as well as the provisioning of research data to be computed and stored afterwards, creates several challenges that need to be solved beyond the traditional scientific computing models.