
    Certification and Voting Software: Position Statement

    Computers are important in every aspect of modern life. Automatic tabulating machines are designed to be the most consistent and reliable counting approach yet invented. Still, questions of reliability, security, and auditability persist. Ken Thompson and others have shown that, like other carelessly composed processes, computer programs can harbor potentially criminal activity. To be useful for voting, software must simplify and improve the ability to record and report voter intentions. Best practices must be used in creating such critical software to guard against bugs and malware. Even though malware can be hidden in any program, there are ways to assure that it is not affecting the operation of the software. First, test vectors must allow the software to be tested in every conceivable situation. Second, demonstrations can be arranged to show that the software is running correctly when it is actually used. Third, computers can produce multiple records to confirm that they have performed correctly.
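    The third safeguard, multiple independently produced records, amounts to cross-checking tallies that should agree. Below is a minimal sketch of such a check; the record names and counts are illustrative, not taken from the paper.

```python
# Minimal sketch: cross-check independently produced records of the same
# election, one of the assurance methods described above. Counts are
# illustrative only.
from collections import Counter

def tallies_agree(records: dict[str, Counter]) -> bool:
    """Return True only if every independent record yields the same tally."""
    tallies = list(records.values())
    return all(t == tallies[0] for t in tallies[1:])

records = {
    "electronic": Counter({"Candidate A": 412, "Candidate B": 390}),
    "paper":      Counter({"Candidate A": 412, "Candidate B": 390}),
    "audio":      Counter({"Candidate A": 412, "Candidate B": 389}),
}

if not tallies_agree(records):
    print("Records disagree; investigate before certifying the result.")
```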

    The Voter Verifiable Audio Audit Transcript Trail (VVAATT)

    The debate about verifiable voter audit trails has prematurely narrowed into two camps: those who categorically deny the need for any backup records and those who advocate the Voter Verifiable Paper Audit Trail (VVPAT). As jurisdictions and watchdog groups prepare to define their election processes, we should consider all alternatives. This paper describes a new approach for a verification audit trail, the Voter Verifiable Audio Audit Transcript Trail (VVAATT), which improves the recount process because it produces a transcript of ballots that can be counted either by hand or by computer (or by both methods). Because it allows voters to confirm selections as they proceed, rather than after the fact, it simultaneously reduces adjacency errors (i.e., inadvertent selections of candidates whose names appear next to the desired choices). The audio transcript format makes it beneficially difficult for individual votes to be accidentally or maliciously separated out from the larger group. The system is also inexpensive and easy to implement using current technologies.
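    To illustrate the machine-countable side of such a transcript, here is a minimal sketch; the comma-separated line format is an assumption for illustration, not the format specified in the paper.

```python
# Minimal sketch: machine-count a VVAATT-style transcript in which each line
# records one confirmed selection as "ballot_id,contest,choice".
# The transcript format is assumed for illustration.
from collections import Counter, defaultdict

def count_transcript(lines):
    """Tally confirmed selections per contest."""
    tallies = defaultdict(Counter)
    for line in lines:
        ballot_id, contest, choice = line.strip().split(",")
        tallies[contest][choice] += 1
    return tallies

transcript = [
    "0001,Governor,Candidate A",
    "0002,Governor,Candidate B",
    "0003,Governor,Candidate A",
]
for contest, counts in count_transcript(transcript).items():
    print(contest, dict(counts))
```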

    Testimony on Voter Verification

    In the past five years, following the 2000 Florida election fiasco, the voting technologies used in the United States have undergone a significant change. The use of direct recording electronic (DRE) voting machines has increased and provided great opportunities for advances in accessibility and voting user interface design. Auditing elections is an important step. Demonstrating that a computer program in an optical scan or DRE system is collecting votes correctly must be done by testing and might be improved by redundant media created by separate means (electronic or physical). One audit trail proposal is the Voter Verified Paper Audit Trail (VVPAT). The VVPAT system adds a printer to a machine and adds an extra step to the end of the voting process, in which the voter prints out and approves a paper receipt of their vote. We have introduced the idea of a voter verified audio audit transcript trail (VVAATT). A VVAATT system adds audio feedback to the voting process. The audio feedback is recorded, and the recording serves as an audit trail for the election.

    Processes Can Improve Electronic Voting: A Case Study of An Election

    Across the United States, I have personally watched hundreds of precincts vote since 2001. Most recently, I traveled to Reno/Sparks, Nevada to observe the rollout of the Sequoia direct recording electronic voting systems with verifiable paper trail printers on September 7, 2004. This experience was also enriched by the members of the Secretary of State of California’s poll-watching effort, who invited me to join them to watch the election progress at eleven different polling places, which together represent almost forty different precincts. At each polling place I conducted interviews with poll workers and election officials as well as exit polls of voters. The California team was composed of election officials from all over the state, whose broad experience provided helpful context and insights. What I witnessed at the election both encouraged and horrified me. The paper “receipts” were less confusing than I feared they would be. Poll workers and voters alike showed an eagerness to “get it right,” even when the new technology required them to endure some amount of initial frustration. However, things went gravely wrong when workers did not have adequate time to set up or test equipment; when, in the pressure of the moment, procedures were ignored or forgotten and, instead, solutions were improvised; or when no standard policy existed to guide election officials in proper protocol. In my experience, such problems are not unique to that election, to the Sequoia electronic voting machines, or to the paper trail audit system. Indeed, the shortcomings I encountered in Nevada resemble those I have seen in precincts throughout the country and with every kind of voting system. Luckily, most of these problems can easily be solved if we focus on improving both training and process. We can learn from our mistakes. Toward that end, a detailed account of my day in Reno follows.

    Study Shows Ballot Design and Voter Preparation Could Have Eliminated Sarasota Florida Voting Errors

    Election results showed extremely inconsistent voting rates for two high-profile races in the Florida counties of Sarasota, Charlotte, and Sumter on November 7, 2006. The expected rate of missing selections (undervotes) for these races was around 1.5 percent. The second race on the ballot in Sarasota -- Congressional District 13 for the House of Representatives (CR13) -- was missing 13.7 percent of voter selections. The fourth race in Charlotte and Sumter -- Attorney General -- was missing 21 percent of selections. Both of these races were between only two candidates and appeared on the ballot page next to a large race that had seven candidates.
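    For readers unfamiliar with the metric, the undervote (missing-selection) rate is simply the fraction of ballots cast with no recorded choice in a race. A minimal worked example follows; the ballot totals are illustrative, not the official county figures.

```python
# Minimal sketch: computing an undervote (missing-selection) rate.
# Ballot counts below are illustrative, not the official Sarasota totals.
def undervote_rate(ballots_cast: int, selections_recorded: int) -> float:
    """Percentage of ballots with no selection recorded in a given race."""
    return 100.0 * (ballots_cast - selections_recorded) / ballots_cast

print(f"{undervote_rate(100_000, 86_300):.1f}%")  # 13.7%, the rate reported for CR13
print(f"{undervote_rate(100_000, 98_500):.1f}%")  # 1.5%, roughly the expected rate
```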

    AI for the Generation and Testing of Ideas Towards an AI Supported Knowledge Development Environment

    New systems employ Machine Learning to sift through large knowledge sources, creating flexible Large Language Models. These models discern context and predict sequential information in various communication forms. Generative AI, leveraging Transformers, generates textual or visual outputs mimicking human responses. It proposes one or more contextually feasible solutions for a user to contemplate. However, generative AI does not currently support traceability of ideas, a useful feature provided by search engines, which indicate the origin of information. The narrative style of generative AI has gained positive reception; people learn from stories. Yet early ChatGPT efforts had difficulty with truth, references, calculations, and aspects like accurate maps. Current capabilities such as referencing locations and linking to apps seem to be better served by the link-centric search methods we've used for two decades. Deploying truly believable solutions extends beyond simulating contextual relevance as done by generative AI. Combining the creativity of generative AI with the provenance of internet sources in hybrid scenarios could enhance internet usage. Generative AI output, viewed as a draft, stimulates thinking by offering alternative ideas for final versions or actions. Scenarios for information requests are considered. We discuss how generative AI can boost idea generation by eliminating human bias, and we describe how search can verify facts, logic, and context. The user evaluates the generated ideas for selection and use. This paper introduces a system for knowledge workers, Generate And Search Test, enabling individuals to efficiently create solutions that previously required the collaboration of top experts.
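    A minimal sketch of how the generate-then-verify workflow described above could be wired together follows. The functions generate_drafts and search_evidence are hypothetical placeholders (e.g., a call to a language model and a call to a search engine), not the paper's Generate And Search Test implementation or any specific API.

```python
# Minimal sketch of a generate-then-verify loop. generate_drafts() and
# search_evidence() are hypothetical stand-ins supplied by the caller.
from typing import Callable

def generate_and_search_test(
    question: str,
    generate_drafts: Callable[[str], list[str]],   # e.g. wraps a generative model
    search_evidence: Callable[[str], list[str]],   # e.g. wraps a search engine
) -> list[tuple[str, list[str]]]:
    """Pair each generated draft with sources a person can use to vet it."""
    drafts = generate_drafts(question)
    return [(draft, search_evidence(draft)) for draft in drafts]

# The user, not the system, makes the final selection:
# for draft, sources in generate_and_search_test("...", llm_fn, search_fn):
#     review(draft, sources)
```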

    The Technology of Access: Allowing People of Age to Vote for Themselves


    An Active Approach to Voting Verification

    As our voting systems have come to rely more deeply on computer technology, there have been great opportunities to improve the voting process. However, computer scientists and the general public have recently become wary of the amount of trust we place in the computers running our elections. Many proposals for audit systems to monitor our elections have been created. One popular audit system is the voter verified paper audit trail (VVPAT). Another, more recent proposal is the voter verified audio audit transcript trail (VVAATT). To compare these two systems, we conducted a user study in which we purposely added errors to the audit trail to see whether voters would be able to find them. Our results showed that voters found many more errors using the VVAATT system than they did with the VVPAT system.

    Visual Secrets: A recognition-based security primitive and its use for boardroom voting

    This paper presents and evaluates a new security primitive in the form of non-transferable “visual secrets”. We show how they can be used in the design of voting systems. More specifically, we introduce a receipt-free, low-tech, visually verifiable boardroom voting system that is built for simplicity and can serve as a teaching tool to introduce people to verifiable voting.