Abstract
An electronic voting protocol provides cast-as-intended verifiability if the voter can verify that her encrypted vote contains the voting options that she selected. Several protocols with cast-as-intended verifiability have been proposed in the literature, but all of them have drawbacks in terms of either usability or security. In this paper, we propose a new voting scheme with cast-as-intended verifiability which allows the voter to audit the vote to be cast, while providing measures against coercion by allowing the voter to create fake proofs of the content of her vote. We provide an efficient implementation and formally analyze its security properties.
References
Wombat voting system (2015). http://www.wombat-voting.com/
Adida, B.: Helios: web-based open-audit voting. In: van Oorschot, P.C. (ed.) USENIX Security Symposium, pp. 335–348. USENIX Association (2008)
Bayer, S., Groth, J.: Efficient zero-knowledge argument for correctness of a shuffle. In: Pointcheval, D., Johansson, T. (eds.) EUROCRYPT 2012. LNCS, vol. 7237, pp. 263–280. Springer, Heidelberg (2012). doi:10.1007/978-3-642-29011-4_17
Bellare, M., Rogaway, P.: Random oracles are practical: a paradigm for designing efficient protocols. In: Proceedings of CCS 1993, NY, USA, pp. 62–73 (1993)
Bernhard, D., Cortier, V., Galindo, D., Pereira, O., Warinschi, B.: A comprehensive analysis of game-based ballot privacy definitions. IACR Cryptology ePrint Archive 2015, 255 (2015)
Bernhard, D., Pereira, O., Warinschi, B.: On necessary and sufficient conditions for private ballot submission. IACR Cryptology ePrint Archive 2012, 236 (2012)
Brassard, G., Chaum, D., Crépeau, C.: Minimum disclosure proofs of knowledge. J. Comput. Syst. Sci. 37(2), 156–189 (1988)
Chaum, D.: Surevote: Technical report (2001). http://www.iavoss.org/mirror/wote01/pdfs/surevote.pdf
Chaum, D., Pedersen, T.P.: Wallet databases with observers. In: Brickell, E.F. (ed.) CRYPTO 1992. LNCS, vol. 740, pp. 89–105. Springer, Heidelberg (1993). doi:10.1007/3-540-48071-4_7
Chen, X., Wu, Q., Zhang, F., Tian, H., Wei, B., Lee, B., Lee, H., Kim, K.: New receipt-free voting scheme using double-trapdoor commitment. Inf. Sci. 181(8), 1493–1502 (2011)
Cortier, V., Galindo, D., Glondu, S., Izabachène, M.: Election verifiability for helios under weaker trust assumptions. In: Kutyłowski, M., Vaidya, J. (eds.) ESORICS 2014. LNCS, vol. 8713, pp. 327–344. Springer, Cham (2014). doi:10.1007/978-3-319-11212-1_19
Cramer, R., Gennaro, R., Schoenmakers, B.: A secure and optimally efficient multi-authority election scheme. In: Fumy, W. (ed.) EUROCRYPT 1997. LNCS, vol. 1233, pp. 103–118. Springer, Heidelberg (1997). doi:10.1007/3-540-69053-0_9
Damgård, I.: Commitment schemes and zero-knowledge protocols. In: Damgård, I.B. (ed.) EEF School 1998. LNCS, vol. 1561, pp. 63–86. Springer, Heidelberg (1999). doi:10.1007/3-540-48969-X_3
Damgård, I., Mikkelsen, G.L.: Efficient, robust and constant-round distributed RSA key generation. In: Micciancio, D. (ed.) TCC 2010. LNCS, vol. 5978, pp. 183–200. Springer, Heidelberg (2010). doi:10.1007/978-3-642-11799-2_12
ElGamal, T.: A public key cryptosystem and a signature scheme based on discrete logarithms. In: Blakley, G.R., Chaum, D. (eds.) CRYPTO 1984. LNCS, vol. 196, pp. 10–18. Springer, Heidelberg (1985). doi:10.1007/3-540-39568-7_2
Fiat, A., Shamir, A.: How to prove yourself: practical solutions to identification and signature problems. In: Odlyzko, A.M. (ed.) CRYPTO 1986. LNCS, vol. 263, pp. 186–194. Springer, Heidelberg (1987). doi:10.1007/3-540-47721-7_12
Gharadaghy, R., Volkamer, M.: Verifiability in electronic voting - explanations for non security experts. In: Proceedings of EVOTE 2010. LNI, vol. 167, pp. 151–162. GI (2010)
Gjøsteen, K.: Analysis of an internet voting protocol. IACR Cryptology ePrint Archive 2010, 380 (2010)
Guasch, S., Morillo, P.: How to challenge and cast your e-vote. IACR Cryptology ePrint Archive (2016). To be published
Jakobsson, M., Sako, K., Impagliazzo, R.: Designated verifier proofs and their applications. In: Maurer, U. (ed.) EUROCRYPT 1996. LNCS, vol. 1070, pp. 143–154. Springer, Heidelberg (1996). doi:10.1007/3-540-68339-9_13
Juels, A., Catalano, D., Jakobsson, M.: Coercion-resistant electronic elections. In: Proceedings of WPES 2005, pp. 61–70. ACM (2005)
Krawczyk, H., Rabin, T.: Chameleon hashing and signatures. IACR Cryptology ePrint Archive 1998, 010 (1998)
Moran, T., Naor, M.: Receipt-free universally-verifiable voting with everlasting privacy. In: Dwork, C. (ed.) CRYPTO 2006. LNCS, vol. 4117, pp. 373–392. Springer, Heidelberg (2006). doi:10.1007/11818175_22
Neff, C.A.: Practical high certainty intent verification for encrypted votes (2004)
Pedersen, T.P.: A threshold cryptosystem without a trusted party. In: Davies, D.W. (ed.) EUROCRYPT 1991. LNCS, vol. 547, pp. 522–526. Springer, Heidelberg (1991). doi:10.1007/3-540-46416-6_47
Ryan, P.Y.A., Teague, V.: Pretty good democracy. In: Christianson, B., Malcolm, J.A., Matyáš, V., Roe, M. (eds.) Security Protocols 2009. LNCS, vol. 7028, pp. 111–130. Springer, Heidelberg (2013). doi:10.1007/978-3-642-36213-2_15
Sako, K., Kilian, J.: Receipt-free mix-type voting scheme. In: Guillou, L.C., Quisquater, J.-J. (eds.) EUROCRYPT 1995. LNCS, vol. 921, pp. 393–403. Springer, Heidelberg (1995). doi:10.1007/3-540-49264-X_32
Santis, A.D., Persiano, G.: Zero-knowledge proofs of knowledge without interaction (extended abstract). In: FOCS, pp. 427–436. IEEE Computer Society (1992)
Schnorr, C.P.: Efficient identification and signatures for smart cards. In: Brassard, G. (ed.) CRYPTO 1989. LNCS, vol. 435, pp. 239–252. Springer, New York (1990). doi:10.1007/0-387-34805-0_22
Schnorr, C.P., Jakobsson, M.: Security of signed ElGamal encryption. In: Okamoto, T. (ed.) ASIACRYPT 2000. LNCS, vol. 1976, pp. 73–89. Springer, Heidelberg (2000). doi:10.1007/3-540-44448-3_7
Smyth, B., Frink, S., Clarkson, M.R.: Computational election verifiability: definitions and an analysis of helios and JCJ. IACR Cryptology ePrint Archive 2015, 233 (2015)
Wikström, D.: A commitment-consistent proof of a shuffle. IACR Cryptology ePrint Archive 2011, 168 (2011)
A Security Definitions and Analysis Results
A.1 Definitions
In this section we define the notions of ballot privacy and cast-as-intended verifiability for an electronic voting scheme such as that described in Sect. 3. We take the definitions from [5] as a basis and adapt them to the particularities of our scheme. Other definitions, such as strong consistency or strong correctness, are available in the full version [19].
Ballot Privacy. It is defined by means of two experiments, between which an adversary has to distinguish. In each experiment the adversary has indirect access to a ballot box which receives the ballots created by honest voters, as well as ballots cast by the adversary itself on behalf of corrupted voters. For honest voters, we let the adversary choose two possible votes which they will use to create their ballots. Which of the two votes is used for the ballot that goes to a specific ballot box depends on the experiment that is taking place.
At the end of the experiment, the adversary is presented with the result of tallying the ballot box, which is the same regardless of the experiment. As noted in [5], revealing the true tally in each experiment would easily allow the adversary to distinguish between both ballot boxes. Additionally, for the votes cast by honest voters, we provide the resulting encryption proof data to the adversary in order to model a coercer which uses it to learn something about the vote. We will use a simulation functionality to generate fake proofs when required.
\(\mathsf {Exp}_{\mathcal {A}, V}^{\mathsf {priv}, \beta }\):

1. Setup phase: The challenger \(\mathcal {C}\) sets up two empty bulletin boards \(\mathsf {BB}_0\) and \(\mathsf {BB}_1\) and runs the \(\mathsf {Setup}(1^\lambda )\) algorithm to obtain the election key pair \((pk, sk)\) and the empty list of credentials \(\mathsf {ID}\). \(\mathcal {A}\) is given access to \(\mathsf {BB}_0\) when \(\beta =0\) and to \(\mathsf {BB}_1\) when \(\beta =1\).
2. Registration phase: The adversary may make the following query:
   - \(\mathcal {O}\)Register(\(\mathtt {id}\)): \(\mathcal {A}\) provides an identity \(\mathtt {id}\not \in \mathsf {ID}\). \(\mathcal {C}\) runs \(\mathsf {Register}(1^\lambda , \mathtt {id})\), provides the voter key pair \((pk_{\mathtt {id}}, sk_{\mathtt {id}})\) to \(\mathcal {A}\), and adds \((\mathtt {id}, pk_{\mathtt {id}})\) to \(\mathsf {ID}\).
3. Voting phase: The adversary may make the following types of queries:
   - \(\mathcal {O}\)VoteLR(\(\mathtt {id}, v_0, v_1\)): this query models the votes cast by honest voters (see the sketch after this experiment). \(\mathcal {A}\) provides an identity \(\mathtt {id}\in \mathsf {ID}\) and two possible votes \(v_0, v_1 \in V\). The challenger \(\mathcal {C}\) does the following:
     - It picks the corresponding \(pk_{\mathtt {id}}\) from \(\mathsf {ID}\) and executes \(\mathsf {CreateVote}(v_0, pk_{\mathtt {id}})\) and \(\mathsf {CreateVote}(v_1, pk_{\mathtt {id}})\), which produce the ballots \(b^0\) and \(b^1\) respectively, together with their encryption proof data \(\sigma _0\) and \(\sigma _1\).
     - Then it runs \(\mathsf {CastVote}(b^0, sk_{\mathtt {id}}, \mathtt {id})\) and \(\mathsf {CastVote}(b^1, sk_{\mathtt {id}}, \mathtt {id})\) to obtain the authenticated ballots \(b^0_a\) and \(b^1_a\), and executes \(\mathsf {ProcessBallot}(\mathsf {BB}_0, b^0_a)\) and \(\mathsf {ProcessBallot}(\mathsf {BB}_1, b^1_a)\). If both processes return 1, the ballot boxes \(\mathsf {BB}_0\) and \(\mathsf {BB}_1\) are updated with \(b^0_a\) and \(b^1_a\) respectively. Otherwise, \(\mathcal {C}\) stops and returns \(\perp \).
     - Finally, \(\mathcal {C}\) executes \(\mathsf {FakeProof}(b^\beta , sk_{\mathtt {id}}, pk_{\mathtt {id}}, v_{\overline{\beta }})\) and provides \(\sigma _\beta \) and the simulated encryption proof data \(\sigma _\beta ^*\) to \(\mathcal {A}\).
   - \(\mathcal {O}\)Cast(\(b_a\)): this query models the votes cast by corrupted voters. \(\mathcal {A}\) provides an authenticated ballot \(b_a\), and \(\mathcal {C}\) executes \(\mathsf {ProcessBallot}\) with \(b_a\) and each ballot box. If both algorithms return 1, both ballot boxes are updated with \(b_a\). Otherwise, \(\mathcal {C}\) halts and neither ballot box is updated.
4. Counting phase: The challenger runs \(\mathsf {Tally}(\mathsf {BB}_0, sk)\) and obtains the result r and the tally proof \(\varPi \), which are provided to \(\mathcal {A}\) in case \(\beta =0\). In case \(\beta =1\), \(\mathcal {C}\) runs \(\mathsf {SimProof}(\mathsf {BB}_1, r)\) to obtain \(\varPi ^*\) and provides \((r, \varPi ^*)\) to \(\mathcal {A}\).
5. Output: The output of the experiment is the guess of the adversary for the bit \(\beta \).
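To make the challenger's bookkeeping in the \(\mathcal {O}\)VoteLR query concrete, the following Python sketch shows one possible way to write it. The scheme object and its methods CreateVote, CastVote, ProcessBallot and FakeProof, as well as boards and keys, are assumed, hypothetical bindings with the signatures used above; this is an illustration of the experiment, not an implementation of the scheme of Sect. 3.

```python
# Sketch of the challenger's handling of an OVoteLR(id, v0, v1) query in the
# ballot privacy experiment Exp^{priv,beta}. `scheme` (exposing the algorithms
# of Sect. 3), `boards` = [BB_0, BB_1] and `keys[id]` = (pk_id, sk_id) are
# assumed, hypothetical objects.

def vote_lr(scheme, boards, keys, beta, voter_id, v0, v1):
    pk_id, sk_id = keys[voter_id]

    # One ballot per candidate vote, each with its encryption proof data.
    b0, sigma0 = scheme.CreateVote(v0, pk_id)
    b1, sigma1 = scheme.CreateVote(v1, pk_id)

    # Authenticate both ballots and process each into its own ballot box.
    b0_a = scheme.CastVote(b0, sk_id, voter_id)
    b1_a = scheme.CastVote(b1, sk_id, voter_id)
    if scheme.ProcessBallot(boards[0], b0_a) != 1 or \
       scheme.ProcessBallot(boards[1], b1_a) != 1:
        return None  # the challenger stops and returns ⊥
    boards[0].append(b0_a)
    boards[1].append(b1_a)

    # The adversary receives the real proof data for b^beta together with a
    # fake proof claiming b^beta contains the other vote, so it always sees
    # one proof for v0 and one for v1 and cannot tell which is simulated.
    b_beta, sigma_beta = ((b0, sigma0), (b1, sigma1))[beta]
    v_other = (v0, v1)[1 - beta]
    sigma_fake = scheme.FakeProof(b_beta, sk_id, pk_id, v_other)
    return sigma_beta, sigma_fake
```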
We say that a voting protocol as defined in Sect. 3 has ballot privacy if there exists an algorithm \(\mathsf {SimProof}\) such that for any probabilistic polynomial time (p.p.t.) adversary \(\mathcal {A}\), the following advantage is negligible in the security parameter \(\lambda \):
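This is the standard distinguishing advantage between the \(\beta = 0\) and \(\beta = 1\) experiments; with an illustrative name \(\mathsf {Adv}^{\mathsf {priv}}_{\mathcal {A}}\) for it, it can be written as
\[ \mathsf {Adv}^{\mathsf {priv}}_{\mathcal {A}} = \bigl | \Pr [\mathsf {Exp}_{\mathcal {A}, V}^{\mathsf {priv}, 0} = 1] - \Pr [\mathsf {Exp}_{\mathcal {A}, V}^{\mathsf {priv}, 1} = 1] \bigr |. \]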
Cast-as-Intended Verifiability. A voting system is cast-as-intended verifiable if a corrupt voting device is unable to cast, on behalf of a voter, a vote for a voting option different from the one chosen by the voter, without being detected. In our definition we consider an adversary who posts ballots in the bulletin board on behalf of honest and corrupt voters. Honest voters follow the protocol and perform some validations before approving the ballot to be cast; corrupt voters give their approval without any prior verification.
\(\mathsf {Exp}_{\mathcal {A}, V}^{\mathsf {CaI}}\):

1. Setup phase: The challenger \(\mathcal {C}\) runs the \(\mathsf {Setup}(1^\lambda )\) algorithm and provides the election key pair \((pk, sk)\) to \(\mathcal {A}\). Then it publishes the empty lists of voter credentials \(\mathsf {ID}_h\) and \(\mathsf {ID}_c\) such that \(\mathsf {ID}= (\mathsf {ID}_h \cup \mathsf {ID}_c)\). Finally \(\mathcal {A}\) is given read access to \(\mathsf {BB}\).
2. Registration phase: The adversary may make the following queries:
   - \(\mathcal {O}\)RegisterHonest(\(\mathtt {id}\)): \(\mathcal {A}\) provides an identity \(\mathtt {id}\not \in \mathsf {ID}\). The challenger \(\mathcal {C}\) runs \(\mathsf {Register}(1^\lambda , \mathtt {id})\) and adds \((\mathtt {id}, pk_{\mathtt {id}})\) to \(\mathsf {ID}_h\).
   - \(\mathcal {O}\)RegisterCorrupt(\(\mathtt {id}\)): \(\mathcal {A}\) provides an identity \(\mathtt {id}\not \in \mathsf {ID}\). The challenger \(\mathcal {C}\) runs \(\mathsf {Register}(1^\lambda , \mathtt {id})\) and adds \((\mathtt {id}, pk_{\mathtt {id}})\) to \(\mathsf {ID}_c\).
3. Voting phase: The adversary may make the following types of queries:
   - \(\mathcal {O}\)VoteHonest(\(\mathtt {id}, v_i, b, \sigma \)): this query models the votes cast by honest voters. \(\mathcal {A}\) provides an identity \(\mathtt {id}\in \mathsf {ID}_h\), a ballot b, encryption proof data \(\sigma \) and the voting option \(v_i\). The challenger \(\mathcal {C}\) runs \(\mathsf {AuditVote}(v_i, b, \sigma , pk_{\mathtt {id}})\) and provides \(sk_{\mathtt {id}}\) to \(\mathcal {A}\) only if the result is 1.
   - \(\mathcal {O}\)VoteCorrupt(\(b, \mathtt {id}\)): this query models the votes cast by corrupted voters. \(\mathcal {A}\) provides a ballot b and an identity \(\mathtt {id}\in \mathsf {ID}_c\). \(\mathcal {C}\) answers with \(sk_{\mathtt {id}}\).
4. Output: The adversary submits an authenticated ballot \(b_a' = (\mathtt {id}', b', \psi ')\). The output of the experiment is 1 if the following conditions hold (see the sketch after the experiment):
   - \(\mathtt {id}' \in \mathsf {ID}_h\),
   - \(\mathsf {ProcessBallot}(\mathsf {BB},b_a')=1\),
   - \(\mathsf {VerifyVote}(\mathsf {BB}, b',\mathtt {id}')=1\),
   - \(\mathsf {Extract}(b_a'; sk)\ne v_i'\), where \(v_i'\) is the voting option submitted by the adversary in the corresponding \(\mathcal {O}\)VoteHonest query.
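The output condition can be expressed compactly as a predicate over the adversary's final ballot. The sketch below is a Python illustration of this check; scheme, BB, sk, ID_h, honest_votes (the option \(v_i\) approved in each \(\mathcal {O}\)VoteHonest query) and the method names are assumed, hypothetical bindings for the algorithms used above.

```python
# Sketch of the output computation of the cast-as-intended experiment
# Exp^{CaI}: returns 1 iff the adversary's authenticated ballot
# b_a' = (id', b', psi') satisfies the four winning conditions. All inputs
# are assumed, hypothetical objects mirroring the notation of the experiment.

def cai_output(scheme, BB, sk, ID_h, honest_votes, b_a_prime):
    id_p, b_p, psi_p = b_a_prime
    if id_p not in ID_h:                          # 1. id' is an honest identity
        return 0
    if scheme.ProcessBallot(BB, b_a_prime) != 1:  # 2. the ballot box accepts it
        return 0
    if scheme.VerifyVote(BB, b_p, id_p) != 1:     # 3. the voter's check passes
        return 0
    # 4. the option extracted with sk differs from the option v_i' that the
    #    honest voter approved in her OVoteHonest query.
    return 1 if scheme.Extract(b_a_prime, sk) != honest_votes[id_p] else 0
```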
We say that a voting protocol as defined in Sect. 3 has cast-as-intended verifiability if, given an \(\mathsf {Extract}\) algorithm for which the protocol is consistent with respect to \(\rho \), the following advantage is negligible in the security parameter \(\lambda \) for any probabilistic polynomial time (p.p.t.) adversary \(\mathcal {A}\):
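Here the advantage is simply the probability that the experiment outputs 1; with an illustrative name \(\mathsf {Adv}^{\mathsf {CaI}}_{\mathcal {A}}\), it can be written as
\[ \mathsf {Adv}^{\mathsf {CaI}}_{\mathcal {A}} = \Pr [\mathsf {Exp}_{\mathcal {A}, V}^{\mathsf {CaI}} = 1]. \]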
A.2 Security Analysis Results
In this section we state the results of our security analysis; the complete analysis is available in the full version of this paper [19].
Theorem 1
Let \((\mathsf {Gen_e}, \mathsf {Enc}, \mathsf {Dec})\) be an NM-CPA secure encryption scheme and \((\mathsf {GenCRS}, \mathsf {NIZKProve}, \mathsf {NIZKVerify}, \mathsf {NIZKSimulate})\) a NIZKPK which provides zero-knowledge. Then the protocol presented in Sect. 3 satisfies the ballot privacy property.
Theorem 2
Let \((\mathsf {Gen_e}, \mathsf {Enc}, \mathsf {Dec}, \mathsf {EncVerify})\) be a probabilistic encryption scheme, \((\mathsf {GenCRS}, \mathsf {NIZKProve}, \mathsf {NIZKVerify}, \mathsf {NIZKSimulate})\) a NIZKPK which is sound and \((\mathsf {Gen_s}, \mathsf {Sign}, \mathsf {SignVerify})\) an unforgeable signature scheme. Then the protocol presented in Sect. 3 satisfies the cast-as-intended verifiability property.