
    Faulty Metrics and the Future of Digital Journalism

    This report explores the industry of Internet measurement and its impact on news organizations working online. It investigates this landscape through a combination of documentary research and interviews with measurement companies, trade groups, advertising agencies, media scholars, and journalists from national newspapers, regional papers, and online-only news ventures.

    Smart Search: A Firefox Add-On to Compute a Web Traffic Ranking

    Search engine results are typically ordered according to some notion of the importance of a web page as well as the relevance of its content to a query. Web page importance is usually calculated from graph-theoretic properties of the web. Another common technique is to measure the traffic that goes to a particular web page, as recorded by a browser toolbar. Currently, several traffic ranking tools, such as www.alexa.com, www.ranking.com, and www.compete.com, provide analytics such as the number of users who visit a website. Alexa computes the traffic rank for a website from two factors: the number of users who view the website and the number of pages viewed. The Alexa toolbar, however, is not open source. The main goal of our project was to create a Smart Search Firefox add-on for the Yioop search engine, an open-source search engine developed by my project advisor, Dr. Chris Pollett. This add-on would provide similar analytic data to the Yioop search engine, but in a transparent, open-source way. With the results received from the Smart Search toolbar extension, the Yioop search engine refines its search results and provides user-centric search results. Eventually, users would benefit from these better search results.
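
    The abstract above describes ranking sites from two toolbar-derived signals: how many distinct users viewed a site and how many pages they viewed. The sketch below shows one minimal way such a ranking could be computed; the record format, field names, and the geometric-mean scoring rule are illustrative assumptions, not details taken from the Smart Search add-on or Yioop.

        from collections import defaultdict
        from math import sqrt

        # Hypothetical toolbar records: one (user_id, site) pair per page view.
        page_views = [
            ("u1", "example.org"), ("u1", "example.org"), ("u2", "example.org"),
            ("u1", "yioop.com"), ("u3", "yioop.com"),
            ("u2", "news.example"),
        ]

        def traffic_rank(views):
            """Rank sites by combining unique visitors (reach) and total page views."""
            reach = defaultdict(set)   # site -> users who viewed it
            hits = defaultdict(int)    # site -> total page views
            for user, site in views:
                reach[site].add(user)
                hits[site] += 1
            # The geometric mean is one simple way to combine the two factors.
            score = {site: sqrt(len(reach[site]) * hits[site]) for site in hits}
            return sorted(score, key=score.get, reverse=True)

        print(traffic_rank(page_views))
        # -> ['example.org', 'yioop.com', 'news.example']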

    Digital Transitions: Nonprofit Investigative Journalism: Evaluation Report on the Center for Public Integrity

    Summarizes outcomes of a one-year grant to CPI to transform itself into a leader in digital nonprofit journalism. Examines CPI's track record, use of new tools and methods, capacity as an effective and credible online presence, and areas for improvement.

    A user-oriented network forensic analyser: the design of a high-level protocol analyser

    Network forensics is becoming an increasingly important tool in the investigation of cyber and computer-assisted crimes. Unfortunately, whilst much effort has been devoted to developing computer forensic file system analysers (e.g. EnCase and FTK), the same focus has not been given to Network Forensic Analysis Tools (NFATs). The single biggest barrier to effective NFATs is handling large volumes of low-level traffic and being able to extract and interpret forensic artefacts and their context – for example, being able to extract and render application-level objects (such as emails, web pages and documents) from the low-level TCP/IP traffic, but also to understand how these applications and artefacts are being used. Whilst some studies and tools are beginning to achieve object extraction, results to date are limited to basic objects. No research has focused upon analysing network traffic to understand the nature of its use – not simply the fact that a person requested a web page, but how long they spent on the application and what interactions they had whilst using the service (e.g. posting an image, or engaging in an instant-message chat). This additional layer of information can give an investigator a far richer and more complete understanding of a suspect's activities. To this end, this paper presents an investigation into the ability to derive high-level application usage characteristics from low-level network traffic metadata. The paper presents three application scenarios – web surfing, communications and social networking – and demonstrates that it is possible to derive user interactions (e.g. page loading, chatting and file sharing) within these systems. The paper goes on to present a framework that builds upon this capability to provide a robust, flexible and user-friendly NFAT offering access to a greater range of forensic information in a far easier format.
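
    To make the idea of deriving usage characteristics from traffic metadata concrete, the sketch below classifies per-flow summaries into coarse user interactions. The flow-record fields, port checks, and byte/duration thresholds are illustrative assumptions only, not the heuristics or framework proposed in the paper.

        from dataclasses import dataclass

        @dataclass
        class Flow:
            """Per-connection metadata distilled from low-level packet captures."""
            duration: float   # seconds
            dst_port: int
            bytes_out: int    # client -> server
            bytes_in: int     # server -> client

        def classify(flow: Flow) -> str:
            """Map one flow's metadata onto a coarse user interaction."""
            if flow.dst_port in (80, 443):
                if flow.bytes_out > 500_000 and flow.bytes_out > flow.bytes_in:
                    return "upload (e.g. posting an image)"
                if flow.bytes_in > 50_000 and flow.duration < 10:
                    return "page load"
                if flow.duration > 60 and max(flow.bytes_in, flow.bytes_out) < 50_000:
                    return "instant-message chat"
            return "unclassified"

        flows = [
            Flow(2.1, 443, 4_000, 180_000),      # short inbound burst: page load
            Flow(300.0, 443, 20_000, 25_000),    # long, low-volume exchange: chat
            Flow(8.0, 443, 2_000_000, 9_000),    # large outbound transfer: upload
        ]
        for f in flows:
            print(classify(f))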

    The National Dialogue on the Quadrennial Homeland Security Review

    Six years after its creation, the Department of Homeland Security (DHS) undertook the first Quadrennial Homeland Security Review (QHSR) to inform the design and implementation of actions to ensure the safety of the United States and its citizens. This review, mandated by the Implementing Recommendations of the 9/11 Commission Act of 2007, represents the first comprehensive examination of the nation's homeland security strategy. The QHSR includes recommendations addressing the nation's long-term strategy and priorities for homeland security and guidance on the department's programs, assets, capabilities, budget, policies, and authorities. Rather than set policy internally and implement it in a top-down fashion, DHS undertook the QHSR in a new and innovative way, engaging tens of thousands of stakeholders and soliciting their ideas and comments at the outset of the process. Through a series of three-week-long, web-based discussions, stakeholders reviewed materials developed by DHS study groups, submitted and discussed their own ideas and priorities, and rated or "tagged" others' feedback to surface the most relevant ideas and the most important themes deserving further consideration. Key findings: the recommendations included (1) DHS should enhance its capacity for coordinating stakeholder engagement and consultation efforts across its component agencies; (2) DHS and other agencies should create special procurement and contracting guidance for acquisitions that involve creating or hosting web-based engagement platforms such as the National Dialogue; and (3) DHS should begin future stakeholder engagements by crafting quantitative metrics or indicators to measure such outcomes as transparency, community-building, and capacity.
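
    The Dialogue's mechanism of rating and tagging submissions to surface the most relevant ideas can be pictured with a small aggregation sketch. The example ideas, the rating scale, and the ordering rule below are hypothetical; the report does not specify the platform's actual scoring rules.

        from collections import Counter

        # Hypothetical submissions: each idea carries stakeholder ratings and tags.
        ideas = {
            "regional fusion centers": {"ratings": [5, 4, 5, 3], "tags": ["info-sharing", "capacity"]},
            "open procurement guidance": {"ratings": [4, 4], "tags": ["transparency"]},
            "volunteer surge corps": {"ratings": [2, 3, 2], "tags": ["capacity"]},
        }

        def surface(ideas, top=2):
            """Order ideas by mean rating, breaking ties by how often they were tagged."""
            def key(name):
                item = ideas[name]
                return (sum(item["ratings"]) / len(item["ratings"]), len(item["tags"]))
            return sorted(ideas, key=key, reverse=True)[:top]

        theme_counts = Counter(tag for item in ideas.values() for tag in item["tags"])
        print(surface(ideas))              # ideas deserving further consideration
        print(theme_counts.most_common())  # recurring themes across the dialogue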

    DON'T FEED THE TROLLS!: Managing troublemakers in magazines' online communities

    “Trolling” and other negative behaviour on magazine websites is widespread, ranging from subtly provocative behaviour to outright abuse. Publishers have sought to develop lively online communities with high levels of user-generated content. Methods of building sites have developed quickly, but methods of managing them have lagged behind, and some publishers have consequently felt overwhelmed by the size and behaviour of the communities they have created. This paper considers the reasons behind trolling and the tools digital editors have developed to manage their communities, taking up the role of Zygmunt Bauman's gardeners in what they sometimes refer to as “walled gardens” within the Internet's wild domains. Interviews were conducted with online editors at the front line of site management at Bauer, Giraffe, IPC, Natmags, RBI and the Times. The article shows how publishers are designing sites that encourage constructive posting and taking a more active part in site management. Web 2.0 and the spread of broadband, which have made management of fast-growing communities difficult, may themselves bring positive change: as uploading material becomes technically easier, “ordinary” citizens can outnumber those who, lacking social skills or with little regard for social norms, originally made the Internet their natural habitat.

    Issues Related to the Emergence of the Information Superhighway and California Societal Changes, IISTPS Report 96-4

    The Norman Y. Mineta International Institute for Surface Transportation Policy Studies (IISTPS) at San José State University (SJSU) conducted this project to review the continuing development of the Internet and the Information Superhighway. Emphasis was placed on an examination of the impact on commuting and working patterns in California, and an analysis of how public transportation agencies, including Caltrans, might take advantage of the new communications technologies. The document reviews the technology underlying the current Internet “structure” and examines anticipated developments. It is important to note that much of the research for this limited-scope project was conducted during 1995, and the topic is evolving so rapidly that some information is almost automatically “dated.” The report also examines how transportation agencies are basically similar in structure and function to other business entities, and how they can continue to utilize the emerging technologies to improve internal and external communications. As part of a detailed discussion of specific transportation agency functions, it is noted that the concept of a “Roundtable Forum,” growing out of developments in Concurrent Engineering, can provide an opportunity for representatives from multiple jurisdictions to utilize the Internet for more coordinated decision-making. The report also includes an extensive analysis of demographic trends in California in recent years, such as commuting and recreational activity, and identifies how the emerging technologies may affect future changes.