
    Ghera: A Repository of Android App Vulnerability Benchmarks

    Security of mobile apps affects the security of their users. This has fueled the development of techniques to automatically detect vulnerabilities in mobile apps and help developers secure their apps, specifically in the context of the Android platform, owing to its openness and ubiquity. Despite a slew of research efforts in this space, there is no comprehensive repository of up-to-date and lean benchmarks that contain most of the known Android app vulnerabilities and, consequently, can be used to rigorously evaluate both existing and new vulnerability detection techniques and to help developers learn about Android app vulnerabilities. In this paper, we describe Ghera, an open-source repository of benchmarks that capture 25 known vulnerabilities in Android apps (as pairs of exploited/benign and exploiting/malicious apps). We also present desirable characteristics of vulnerability benchmarks and repositories that we uncovered while creating Ghera. Comment: 10 pages. Accepted at PROMISE'1
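
    The Ghera abstract describes each benchmark as a pair of a vulnerable ("benign") app and an app that exploits it. As a purely hypothetical sketch of how such a pair-structured repository could be used to score a detection tool, the Python below walks assumed per-benchmark directories and reports recall; the directory layout, file names, and the detect callable are illustrative assumptions, not part of Ghera.

        # Hypothetical sketch: scoring a vulnerability detector against a repository
        # organised as one directory per benchmark, each holding a vulnerable
        # ("benign") app. Paths and the detect() callable are assumptions.
        from pathlib import Path

        def evaluate(repo_root: str, detect) -> None:
            """detect(apk_path) -> bool, True if the tool flags a vulnerability."""
            found = missed = 0
            for bench in sorted(Path(repo_root).iterdir()):
                vulnerable = bench / "benign.apk"   # app containing the known vulnerability
                if not vulnerable.exists():
                    continue
                if detect(vulnerable):
                    found += 1                      # known vulnerability reported
                else:
                    missed += 1                     # known vulnerability missed
            total = found + missed
            if total:
                print(f"recall over {total} benchmarks: {found / total:.2f}")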

    An extensive analysis of efficient bug prediction configurations

    Background: Bug prediction helps developers steer maintenance activities towards the buggy parts of a software system. There are many design aspects to a bug predictor, each of which has several options: the software metrics, the machine learning model, and the response variable. Aims: These design decisions should be made judiciously, because an improper choice in any of them might lead to wrong, misleading, or even useless results. We argue that bug prediction configurations are intertwined and thus need to be evaluated in their entirety, in contrast to the common practice in the field where each aspect is investigated in isolation. Method: We use a cost-aware evaluation scheme to evaluate 60 different bug prediction configuration combinations on five open source Java projects. Results: We find that the best choices for building a cost-effective bug predictor are change metrics mixed with source code metrics as independent variables, Random Forest as the machine learning model, and the number of bugs as the response variable. Combining these configuration options results in the most efficient bug predictor across all subject systems. Conclusions: We demonstrate strong evidence for the interplay among bug prediction configurations and provide concrete guidelines for researchers and practitioners on how to build and evaluate efficient bug predictors.
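
    As a minimal, illustrative sketch of the configuration this abstract reports as most cost-effective (change metrics mixed with source code metrics as independent variables, Random Forest as the model, and the number of bugs as the response variable), the Python below trains such a predictor and ranks classes for cost-aware inspection. The dataset file, column names, and the 20% inspection budget are assumptions, not the paper's setup.

        # Minimal sketch (not the paper's code): change + source code metrics as
        # features, Random Forest as the model, number of bugs as the response.
        import pandas as pd
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        df = pd.read_csv("class_metrics.csv")    # hypothetical per-class dataset
        change_metrics = ["revisions", "bug_fixes", "authors", "loc_added", "loc_removed"]
        code_metrics = ["loc", "wmc", "cbo", "rfc", "dit"]
        X = df[change_metrics + code_metrics]    # mixed independent variables
        y = df["num_bugs"]                       # number of bugs as response

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)

        # Cost-aware view: rank classes by predicted bug count and report how many
        # actual bugs fall inside a fixed inspection budget (top 20% of classes).
        ranked = X_te.assign(pred=model.predict(X_te), bugs=y_te).sort_values("pred", ascending=False)
        budget = ranked.head(int(0.2 * len(ranked)))
        print("bugs covered by top 20%:", budget["bugs"].sum(), "/", y_te.sum())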

    Promises of Law: The Unlawful Dispossession of Japanese Canadians

    This article is about the origins, betrayal, and litigation of a promise of law. In 1942, while it ordered the internment of over twenty-one thousand Canadians of Japanese descent, the Canadian government enacted orders in council authorizing the Custodian of Enemy Property to seize all real and personal property owned by Japanese Canadians living within coastal British Columbia. Demands from the Japanese-Canadian community and concern from within the corridors of government resulted in amendments to those orders stipulating that the Custodian held that property as a “protective” trust and would return it to Japanese Canadians at the conclusion of the war. That is not what happened. In January 1943, a new order in council authorized the sale of all property seized from Japanese Canadians. The trust abandoned, a promise broken, the Custodian sold everything. This article traces the promise to protect property from its origins in the federal bureaucracy and demands on the streets to its demise in Nakashima v Canada, the Exchequer Court decision that held that the legal promise carried no legal consequence. We argue that the failure of the promise should not obscure its history as a product of multi-vocal processes, community activism, conflicting wartime pressures, and competing conceptions of citizenship, legality, and justice. Drawing from a rich array of archival sources, our article places the legacy of the property loss of Japanese Canadians at the disjuncture between law as a blunt instrument capable of gross injustice and its role as a social institution of good faith

    WhiteHaul: an efficient spectrum aggregation system for low-cost and high capacity backhaul over white spaces

    We address the challenge of backhaul connectivity for rural and developing regions, which is essential for universal fixed/mobile Internet access. To this end, we propose to exploit the TV white space (TVWS) spectrum for its attractive properties: low cost, abundance in under-served regions, and favorable propagation characteristics. Specifically, we propose a system called WhiteHaul for the efficient aggregation of the TVWS spectrum tailored for the backhaul use case. At the core of WhiteHaul are two key innovations: (i) a TVWS conversion substrate that can efficiently handle multiple non-contiguous chunks of TVWS spectrum using multiple low-cost 802.11n/ac cards but with a single antenna; (ii) novel use of MPTCP as a link-level tunnel abstraction and its use for efficiently aggregating multiple chunks of the TVWS spectrum via a novel uncoupled, cross-layer congestion control algorithm. Through extensive evaluations using a prototype implementation of WhiteHaul, we show that: (a) WhiteHaul can aggregate almost the whole of the TV band with 3 interfaces and achieve nearly 600 Mbps TCP throughput; (b) the WhiteHaul MPTCP congestion control algorithm provides an order-of-magnitude improvement over state-of-the-art algorithms for typical TVWS backhaul links. We also present additional measurement- and simulation-based results to evaluate other aspects of the WhiteHaul design.
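
    As a toy illustration of why uncoupled per-subflow congestion control suits this aggregation setting, the Python below models each non-contiguous TVWS chunk as an independent bottleneck, runs a separate AIMD loop per MPTCP subflow, and sums the resulting rates. The chunk capacities, loss probabilities, and AIMD constants are illustrative assumptions, not WhiteHaul's implementation.

        # Toy model: one AIMD-controlled subflow per TVWS chunk; with uncoupled
        # control the tunnel's throughput is the sum of the per-chunk rates.
        import random

        chunks = [          # (capacity in Mbps, random-loss probability) -- assumed values
            (180.0, 0.001),
            (220.0, 0.002),
            (200.0, 0.001),
        ]

        def aimd_rate(capacity, loss, rounds=5000):
            """Average offered rate (Mbps) of one subflow under a simple AIMD loop."""
            rate, total = 1.0, 0.0
            for _ in range(rounds):
                if rate > capacity or random.random() < loss:
                    rate /= 2.0        # multiplicative decrease on (congestion) loss
                else:
                    rate += 0.5        # additive increase per round
                total += min(rate, capacity)
            return total / rounds

        per_chunk = [aimd_rate(c, p) for c, p in chunks]
        print("per-chunk Mbps:", [round(r, 1) for r in per_chunk])
        print("aggregate Mbps:", round(sum(per_chunk), 1))   # subflows add up when uncoupled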

    Scholarly Impact: a Bibliometric and Altmetric study of the Journal of Community Informatics

    Demonstrating scholarly impact is a matter of growing importance. This paper reports on a bibliometric and altmetric analysis of the Journal of Community Informatics (JOCI). Besides the bibliometric analysis, the study also examined JOCI article-level metrics by comparing usage metrics (article views), alternative metrics (Mendeley readership), and traditional citation metrics (Google Scholar citations). The main contribution is to provide more insight into the metrics that could influence citation impact in Community Informatics research. Furthermore, the study used article-level metrics data to identify, compare, and rank the most impactful papers published in JOCI over a 12-year period.
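
    As an illustrative sketch of the kind of article-level comparison this abstract describes (relating article views, Mendeley readership, and Google Scholar citations, then ranking the most impactful papers), the Python below computes rank correlations over an assumed per-article export; the CSV file and its column names are assumptions, not the study's data.

        # Illustrative sketch: correlate usage, altmetric, and citation counts per
        # article, then rank papers by citations. The input file is hypothetical.
        import pandas as pd
        from scipy.stats import spearmanr

        articles = pd.read_csv("joci_article_metrics.csv")   # hypothetical export
        for a, b in [("views", "gs_citations"), ("mendeley_readers", "gs_citations")]:
            rho, p = spearmanr(articles[a], articles[b])
            print(f"Spearman({a}, {b}) = {rho:.2f} (p = {p:.3f})")

        # Rank the most impactful papers over the 12-year window by citations.
        top10 = articles.sort_values("gs_citations", ascending=False).head(10)
        print(top10[["title", "gs_citations"]])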