Invisible Pixels Are Dead, Long Live Invisible Pixels!
Privacy on the world wide web has deteriorated ever since the 1990s. The
tracking of browsing habits by third parties has been at the center of this
deterioration. Web cookies and so-called web beacons have been the classical
means of implementing third-party tracking. Given the introduction of more
sophisticated tracking technologies and other fundamental transformations,
the classical image-based web beacon might be expected to have lost its
appeal. Based on a sample of over thirty thousand images collected from
popular websites, this paper shows that such an assumption is a fallacy:
classical 1 x 1 images are still commonly used for third-party tracking on
the contemporary world wide web. While ad-blockers seem unable to fully block
these classical image-based tracking beacons, the paper further demonstrates
that even limited information can be used to accurately distinguish
third-party 1 x 1 images from other images. An average classification
accuracy of 0.956 is reached in the empirical experiment. With these results
the paper contributes to the ongoing attempts to better understand the lack
of privacy on the world wide web, and the means by which the situation might
eventually be improved.

Comment: Forthcoming in the 17th Workshop on Privacy in the Electronic
Society (WPES 2018), Toronto, Canada
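The paper's actual classifier is not reproduced in the abstract; as a minimal illustrative sketch, a rule-based filter over basic image metadata could flag candidate tracking pixels. The feature names and rule below are assumptions for illustration, not the features or model used in the paper:

```python
# Sketch: flag likely tracking beacons from basic image metadata.
# The feature set (dimensions, third-party origin, query string) is an
# assumption for illustration, not the paper's feature set.

def is_candidate_tracking_pixel(width, height, is_third_party, has_query_params):
    """Heuristically flag an image as a likely 1x1 tracking beacon."""
    tiny = width <= 1 and height <= 1          # classical 1x1 beacon size
    return tiny and (is_third_party or has_query_params)

images = [
    {"width": 1,   "height": 1,   "is_third_party": True,  "has_query_params": True},
    {"width": 300, "height": 250, "is_third_party": True,  "has_query_params": False},
    {"width": 1,   "height": 1,   "is_third_party": False, "has_query_params": False},
]

flags = [is_candidate_tracking_pixel(**img) for img in images]
print(flags)  # a 300x250 ad image and a first-party 1x1 spacer are not flagged
```

A real classifier would of course learn such rules from labelled data rather than hard-code them; the point is that even coarse metadata carries a strong signal.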
Propagating semantic information in biochemical network models
Background: To enable automatic searches, alignments, and model combination, the elements of systems biology models need to be compared and matched across models. Elements can be identified by machine-readable biological annotations, but assigning such annotations and matching non-annotated elements is tedious work and calls for automation.

Results: A new method called "semantic propagation" allows the comparison of model elements based not only on their own annotations, but also on the annotations of surrounding elements in the network. One may either propagate feature vectors describing the annotations of individual elements, or propagate quantitative similarities between elements from different models. Based on semantic propagation, we align partially annotated models and find annotations for non-annotated model elements.

Conclusions: Semantic propagation and model alignment are included in the open-source library semanticSBML, available on SourceForge. Online services for model alignment and for annotation prediction can be used at http://www.semanticsbml.org.
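The core idea, letting an element's comparison draw on its neighbours' annotations, can be sketched as iterative blending of feature vectors over the network. The toy network, vectors, and blending rule below are invented for illustration and are not the semanticSBML implementation:

```python
# Sketch of feature-vector propagation on a small network (assumed toy data,
# not the semanticSBML algorithm): each node's annotation vector is blended
# with the mean of its neighbours' vectors.

def propagate(features, edges, alpha=0.5, steps=1):
    """Blend each node's vector with its neighbours' mean, `steps` times."""
    neighbours = {n: [] for n in features}
    for a, b in edges:
        neighbours[a].append(b)
        neighbours[b].append(a)
    for _ in range(steps):
        new = {}
        for n, vec in features.items():
            nb = neighbours[n]
            if nb:
                mean = [sum(features[m][i] for m in nb) / len(nb)
                        for i in range(len(vec))]
                new[n] = [alpha * v + (1 - alpha) * m for v, m in zip(vec, mean)]
            else:
                new[n] = vec[:]
        features = new
    return features

# Toy model: an un-annotated reaction "R1" sits between two annotated species.
feats = {"ATP": [1.0, 0.0], "R1": [0.0, 0.0], "ADP": [0.0, 1.0]}
edges = [("ATP", "R1"), ("R1", "ADP")]
out = propagate(feats, edges, alpha=0.5, steps=1)
print(out["R1"])  # R1 now carries a blend of its neighbours' annotations
```

After one step the un-annotated element has a non-zero vector inherited from its context, which is what makes matching of non-annotated elements across models possible.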
Artificial neural networks for reducing the dimensionality of gene expression data
Book synopsis: Bioinformatics and Computational Intelligence are undoubtedly remarkably fast-growing fields of research and real-world application, with enormous potential for current and future developments. "Bioinformatics using Computational Intelligence Paradigms" contains recent theoretical approaches and guiding applications of biologically inspired information processing systems (Computational Intelligence) against the background of bioinformatics. This carefully edited monograph combines the latest results of Bioinformatics and Computational Intelligence and offers promising cross-fertilisation and interdisciplinary work between these growing fields.
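The technique named in the title, a neural network compressing high-dimensional gene expression data into a few latent dimensions, can be sketched with a toy linear autoencoder. The data, dimensions, and training settings below are invented for illustration and are not drawn from the book:

```python
import numpy as np

# Toy linear autoencoder sketch (invented data): compress 8-dimensional
# "expression profiles" into 2 latent dimensions and train by gradient
# descent to reconstruct the input.

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))           # 50 samples, 8 "genes"

W_enc = rng.standard_normal((8, 2)) * 0.1  # encoder weights: 8 -> 2
W_dec = rng.standard_normal((2, 8)) * 0.1  # decoder weights: 2 -> 8
lr = 0.01

def loss(X, W_enc, W_dec):
    """Mean squared reconstruction error."""
    R = X @ W_enc @ W_dec
    return float(np.mean((X - R) ** 2))

initial = loss(X, W_enc, W_dec)
for _ in range(200):
    Z = X @ W_enc                          # latent codes, shape (50, 2)
    R = Z @ W_dec                          # reconstruction, shape (50, 8)
    E = R - X                              # reconstruction error
    W_dec -= lr * (Z.T @ E) / len(X)       # descend on decoder weights
    W_enc -= lr * (X.T @ (E @ W_dec.T)) / len(X)  # and on encoder weights
final = loss(X, W_enc, W_dec)
print(initial, final)                      # error drops as training proceeds
```

A linear autoencoder like this one recovers essentially the same subspace as PCA; the nonlinear networks discussed in this literature add hidden activations to capture structure PCA cannot.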
Evolutionary search for improved path diagrams
A path diagram relates observed, pairwise variable correlations to a functional structure which describes the hypothesized causal relations between the variables. Here we combine path diagrams, heuristics, and evolutionary search into a system which seeks to improve existing gene regulatory models. Our evaluation shows that once a correct model has been identified it yields a lower prediction error than incorrect models, indicating the overall feasibility of this approach. However, with smaller samples the observed correlations gradually become more misleading, and the evolutionary search increasingly converges on suboptimal models. Future work will incorporate publicly available sources of experimentally verified biological facts to computationally suggest model modifications which might improve a model's fitness.
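The mutate-and-select loop at the heart of such a search can be sketched over candidate edge sets. The fitness function below (edge overlap with a hidden "true" model) is a deliberately simplified stand-in for the paper's prediction error, and the gene names are invented:

```python
import random

# Illustrative evolutionary search over candidate regulatory structures
# (not the paper's system): models are sets of directed edges, and fitness
# here is a stand-in (overlap with a hidden true model) for prediction error.

random.seed(0)
GENES = ["A", "B", "C", "D"]
ALL_EDGES = [(a, b) for a in GENES for b in GENES if a != b]
TRUE_MODEL = {("A", "B"), ("B", "C"), ("C", "D")}

def fitness(model):
    """Higher is better: matching edges minus spurious ones."""
    return len(model & TRUE_MODEL) - len(model - TRUE_MODEL)

def mutate(model):
    """Toggle one randomly chosen edge in or out of the model."""
    edge = random.choice(ALL_EDGES)
    return model ^ {edge}

model = set()                    # start from an empty hypothesis
for _ in range(500):
    candidate = mutate(model)
    if fitness(candidate) >= fitness(model):
        model = candidate        # keep neutral or improving mutations

print(sorted(model))
```

With noisy, small-sample correlations the real fitness landscape is far less benign than this toy one, which is exactly why the paper observes convergence to suboptimal models as samples shrink.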
Open Source Software Resilience Framework
Part 2: OSS Projects Validity

An Open Source Software (OSS) project can be utilized either as is, to serve specific needs at the application level, or at the source-code level, as part of another software system, serving as a component, a library, or even an autonomous third-party dependency. Several OSS quality models provide metrics to measure specific aspects of a project, such as its structural quality. Although other dimensions, such as community health and activity, software governance principles, or license permissiveness, are taken into account, there is no universally accepted OSS assessment model. In this work we propose an evaluation approach based on adapting the City Resilience Framework to OSS, with the aim of providing a strong theoretical basis for evaluating OSS projects.
Hackers on Forking
All open source licenses allow the copying of an existing body of code for use as the basis of a separate development project. This practice is commonly known as forking the code. This paper presents the results of a study in which 11 programmers were interviewed about their opinions on the right to fork and the impact of forking on open source software development. The results show a general consensus among the programmers regarding both the favourable and unfavourable aspects of the right to fork. Interestingly, while all programmers noted potential downsides to the right to fork, all saw it as an integral component of open source software, and a right that must not be infringed regardless of circumstance or outcome.