be given the values in advance and that the values be kept secret. At the start of an election day, the poll worker would follow the procedures outlined in Figure 1. As illustrated, the poll worker enters a random challenge using a keypad. In response to this challenge, the voting machine displays a checksum value and a hash value as depicted in Steps d and g. The poll worker verifies that the checksum is identical to the expected response, based on the value on the scratch-off card. The poll worker also verifies that the time taken by the voting machine to compute the checksum is below a preset threshold and that the hash of the software is correct. If all verification steps succeed, the machine's software would be considered authentic.
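The poll worker's decision procedure can be sketched as follows. All constant values here are hypothetical placeholders: in practice the expected checksum and software hash would come from the sealed scratch-off card and the election authority, and the timing threshold would be calibrated to the machine.

```python
import hmac

# Hypothetical stand-ins for the values on the scratch-off card.
EXPECTED_CHECKSUM = "4f2a91c3"
EXPECTED_HASH = "9d1e7702"
TIME_THRESHOLD_SECONDS = 4.0  # preset bound on checksum computation time

def verify_machine(reported_checksum: str,
                   reported_hash: str,
                   elapsed_seconds: float) -> bool:
    """Return True only if every verification step succeeds:
    the checksum matches, the computation finished under the
    threshold, and the software hash is correct."""
    checksum_ok = hmac.compare_digest(reported_checksum, EXPECTED_CHECKSUM)
    timing_ok = elapsed_seconds < TIME_THRESHOLD_SECONDS
    hash_ok = hmac.compare_digest(reported_hash, EXPECTED_HASH)
    return checksum_ok and timing_ok and hash_ok
```

Note that a single failed check (a wrong checksum, a slow response, or a wrong hash) is enough to reject the machine, matching the all-steps-must-succeed rule above.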
3.2 The Attestation Process
We now explain the process by which our system securely attests itself. The attestation occurs in two distinct phases:

1. Shortly after the machine is powered on, the bootloader computes a checksum over all its running code and reports the result to the user. If the response is verified to be correct, this dynamically establishes a root of trust in the bootloader.

2. The now trusted bootloader computes a standard cryptographic hash over the disk image about to be booted. It then reports this hash to the user, who can verify it and thus validate that the software being booted is authentic.
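The second phase is a conventional hash computation. A minimal sketch, assuming SHA-256 (the text says only "a standard cryptographic hash" and does not name one), might look like:

```python
import hashlib

def hash_disk_image(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a cryptographic hash (here SHA-256, as an assumed
    choice) over the disk image about to be booted, reading it in
    chunks so arbitrarily large images can be processed."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

The user compares the reported digest against the published digest of the authentic disk image; any modification to the image changes the digest.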
Figure 2: Proposed architecture (bootloader, voting machine, display: 1) challenge, 2) self-checksum, 3) checksum to display, 4) compute hash, 5) hash to display)

The attestation process is depicted in Figure 2, and we explain it in more detail below. Since the process of hashing is straightforward, we limit our discussion to the self-checksum and the attestation process as a whole.

To compute the self-checksum, we leverage Pioneer, the work of Seshadri, Luk, Shi, Perrig, van Doorn, and Khosla. It is a challenge-response-based tool originally designed for enforcing untampered code execution, and it produces checksums intended for verification by remote computers. Although we refer the reader to the original paper for its details, we briefly describe Pioneer in the context of our work.

Pioneer can simply be thought of as an optimally implemented function over a challenge and the state of a specific range of memory that contains its code. It reads from pseudo-random addresses within the specified memory range and incorporates their values into both its output (the checksum) and the pseudo-randomness itself. Notice that in this process, Pioneer reads and attests its own code.

The general idea behind Pioneer is that some authority will run the authentic code of interest (in our case, the honest bootloader) and can provide the correct function outputs corresponding to the "good" state of memory. These outputs can then be verified against the outputs of executions on other machines, and in this way, it operates like a standard checksum.

The critical property of Pioneer that distinguishes it from a regular checksum is its
speed. It was designed with the intention of preventing any code from computing the Pioneer function faster than its legitimate implementation does when computing a legitimate checksum. Thus, an adversary may be able to write malicious code other than Pioneer that gives the same checksums for the same challenges, but that code should not run as fast as Pioneer. The verification process then makes use of this fact by requiring verification of the Pioneer execution time as well as the correctness of its checksum. It is in this way that Pioneer can dynamically establish a root of trust in itself.

The major difference between the original Pioneer and our use of Pioneer is that we designed our version to be human verifiable, an idea that was first explored independently by Franklin, Luk, Seshadri, and Perrig [12, 13]. The original Pioneer provides a means of ensuring code execution on remote machines and is thus designed to run very quickly. As a result, the malicious and honest Pioneer execution times were distinguishable only by machines. In our work, we drastically increase the number of checksum iterations executed by Pioneer to force the best known attack version to run approximately 3 seconds slower than the legitimate one. With the aid of a clock, a human challenging our modified Pioneer can independently verify its integrity. We describe