novaderrik
novaderrik SuperDork
3/3/12 1:12 p.m.

Bender Bending Rodriguez got elected to the DC school board by some hackers that were "invited" to try to hack into the election computers..

http://www.pcworld.com/article/251187/hackers_elect_futuramas_bender_to_the_washington_dc_school_board.html

notice how "secure" it was - it still had the default admin user name and password.. and the Iranians were already in there playing around when these hackers were in there playing around..

http://www.youtube.com/watch?v=SRnq-PFboMI

T.J.
T.J. SuperDork
3/3/12 1:42 p.m.

I LOL'ed when I read that. I wish it surprised me the least little bit. I have believed for years that we are allowed to vote, but we really don't have much say in who gets elected.

Not going to comment on the Iranian or Chinese intrusions, but also not surprised in the least that they were in there digging around.

patgizz
patgizz GRM+ Memberand SuperDork
3/3/12 5:03 p.m.

that's awesome. the U of M finally figured out how to do something besides lose to ohio state.

i'm very discouraged by the electronic voting system.

madmallard
madmallard HalfDork
3/3/12 5:12 p.m.

And some people -still- want the government to regulate content on these networks when they have no idea how they work?

Keith
Keith GRM+ Memberand SuperDork
3/3/12 5:19 p.m.

Good. Systems of this importance need to be properly tested and should really be open source so all vulnerabilities can be laid bare. Badly designed system (injection attack) and terrible implementation (default passwords).
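The kind of hole Keith is talking about can be sketched in a few lines (a hedged illustration, not the actual DC pilot code - the command, function names, and filenames here are all invented):

```python
import subprocess

def encrypt_ballot_unsafe(filename):
    # VULNERABLE: the filename is pasted into a shell command string, so a
    # "ballot" named something like 'x.$(reboot).pdf' gets its embedded
    # command executed by the shell.
    cmd = f"echo encrypting {filename}"
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout

def encrypt_ballot_safe(filename):
    # SAFE: the filename is passed as one argument in a list; no shell ever
    # parses it, so $(...) stays inert text.
    return subprocess.run(["echo", "encrypting", filename],
                          capture_output=True, text=True).stdout

tricky_name = "ballot.$(echo pwned).pdf"
print(encrypt_ballot_unsafe(tricky_name))  # the shell expands $(echo pwned)
print(encrypt_ballot_safe(tricky_name))    # the literal filename, untouched
```

Same input, two very different outcomes - which is why "it builds a command from user input" should fail review on sight.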

I, for one, welcome our new robotic overlord.

peter
peter Reader
3/3/12 5:33 p.m.

One of my professors in grad school was (is) actually very involved in voting security issues and made quite a name for himself in the field. Google Avi Rubin Diebold.

As an exercise in his security class (grad level) he had us break into teams and write a voting machine, from scratch. Each team wrote two versions: one good, one evil. The game was to pass out the source code to these machines to other students and have them decide if the version they were given was good or evil.

Ours was... slightly ingenious. Complete control over voting results. Except for one small, insignificant programming "error", the source code was clean. What wasn't clean was the first several bytes of an image we used in the voting machine. The programming error unpacked those bytes, replaced the "clean" code with the evil code, and the installer never knew the difference. In the entire class, one, count it, one person found that we were doing something sneaky. By sheer luck. And he was never 100% sure what we did. PhD candidate in computer science, big name schools. Smart cookie. Fooled all the rest, including employees of a certain 3-letter agency (granted, these were the short-bus employees, but...).
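A toy sketch of the general trick (every name and number here is invented for illustration - this is not the class project's code): the tally routine contains an innocent-looking read of an image's leading bytes, and only the "evil" path interprets them.

```python
# Hypothetical "image" asset: a payload integer smuggled in ahead of what
# looks like a PNG magic number. The clean reviewer sees an image file.
FAKE_IMAGE = bytes([3, 0, 0, 0]) + b"\x89PNG"

def count_votes(votes, image=FAKE_IMAGE, evil=False):
    """Tally a list of candidate names."""
    tally = {}
    for v in votes:
        tally[v] = tally.get(v, 0) + 1
    if evil:
        # The deliberate "programming error": reading header bytes as a
        # little-endian integer and quietly adding it to one candidate.
        bias = int.from_bytes(image[:4], "little")
        tally["A"] = tally.get("A", 0) + bias
    return tally

print(count_votes(["A", "B", "B"]))             # {'A': 1, 'B': 2}
print(count_votes(["A", "B", "B"], evil=True))  # {'A': 4, 'B': 2}
```

The point of the exercise: the rigging logic never appears in the reviewable source as anything more than a plausible-looking byte read, which is exactly why source inspection alone missed it.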

Don't kid yourself, this stuff is hard. But there are people who have very good ideas on how to make it reasonably secure*. Unfortunately they're professors and grad students, not rich corporations with strong lobbying arms. So their ideas will never make it to production, while crap written by code monkeys who don't know the first thing about security gets approved by politicians who got free champagne from some slimy sales weenie.

  • edit - "reasonably" typically does not include internet-enabled, let alone from a computer not directly controlled by the software designer.
peter
peter Reader
3/3/12 5:38 p.m.
Keith wrote: Good. Systems of this importance need to be properly tested and should really be open source so all vulnerabilities can be laid bare. Badly designed system (injection attack) and terrible implementation (default passwords). I, for one, welcome our new robotic overlord.

You posted while I was writing my thesis. Open source is good, but as above, it needs to be very, very thoroughly vetted by people who know what the berkeley they're looking at.

There are many people who claim that skill, but far fewer who actually have it. And for most of those, looking at code all day is not what they want to be doing. So you have to pay them well.

Security is one of those things companies don't like paying for. Lots of money goes in, but it's very hard to convince the customer that the improved product is worth the added cost. They'd rather have a shiny new button.

It's like safety in a car. Only even less visible.

madmallard
madmallard HalfDork
3/3/12 5:48 p.m.
peter wrote: Security is one of those things companies don't like paying for. Lots of money goes in, but it's very hard to convince the customer that the improved product is worth the added cost. They'd rather have a shiny new button.

see if Sony and Playstation 3 owners still feel that way.

Keith
Keith GRM+ Memberand SuperDork
3/4/12 12:19 a.m.
peter wrote:
Keith wrote: Good. Systems of this importance need to be properly tested and should really be open source so all vulnerabilities can be laid bare. Badly designed system (injection attack) and terrible implementation (default passwords). I, for one, welcome our new robotic overlord.
You posted while I was writing my thesis. Open source is good, but as above, it needs to be very, very thoroughly vetted by people who know what the berkeley they're looking at. There are many people who claim that skill, but far fewer who actually have it. And for most of those, looking at code all day is not what they want to be doing. So you have to pay them well. Security is one of those things companies don't like paying for. Lots of money goes in, but it's very hard to convince the customer that the improved product is worth the added cost. They'd rather have a shiny new button. It's like safety in a car. Only even less visible.

That's the advantage of publishing your code. People who are motivated by more than just a paycheck can check it. Or people who are being paid by your opponents. Voting machines in particular are going to have a lot of very motivated smart people checking them out.

By open source, I didn't mean it should be created by a bunch of volunteers. I meant that the code should be made public and available for scrutiny. Poor choice of words on my part. It's why standard cryptographic algorithms are published before they're adopted.

peter
peter Reader
3/4/12 11:21 a.m.
Keith wrote: That's the advantage of publishing your code. People who are motivated by more than just a paycheck can check it. Or people who are being paid by your opponents. Voting machines in particular are going to have a lot of very motivated smart people checking them out. By open source, I didn't mean it should be created by a bunch of volunteers. I meant that the code should be made public and available for scrutiny. Poor choice of words on my part. It's why standard cryptographic algorithms are published before they're adopted.

I understood what you were saying, but I don't think this is like the crypto-algo competitions. Those were legitimate scientific battles, with tenure, grants, PhDs, and more hanging in the balance.

I think the science of voting machines is dead as a doornail. It's been a while since I was in the field, but as I recall, even then there were legit, accepted ways to do it.

Without scientific interest, you've got to rely on hobbyists to vet your code. Sometimes, like OpenBSD, that works great. But it's a massive effort, especially sorting the good analysts from the idiots. And once you do that, it usually boils down to a handful of talented people who take an agonizingly long time to get anything done (with good reason).

And don't forget - in my example above, our source code was 99.9% clean (it may have been cleaner than that... I forget how much code we actually wrote). An entire class, whose grades (and thus future) depended on it, missed the single mistyped character. No one ever saw the evil code, but that was the stuff that got loaded onto the voting machine.

It's not just about source code, it's about the underlying operating systems, the compiler, the hardware, the top-to-bottom system. That's a E36 M3-ton to analyze.

Of course, the old ways of doing this - pen and paper, punch cards, those clunky old booths with the levers - have many of the same flaws and more. You can throw Florida 2000 at me, but it's much, much harder to compromise* a heterogeneous population of physical systems than a single electronic one.

  • and by this I mean that while it's easy to steal a ballot box full of votes, it's harder to change all of those votes without anyone noticing (yes, Florida, hanging chads, confusing arrows...I get it).
Keith
Keith GRM+ Memberand SuperDork
3/4/12 1:12 p.m.
peter wrote: Without scientific interest, you've got to rely on hobbyists to vet your code. Sometimes, like OpenBSD, that works great. But it's a massive effort, especially sorting the good analysts from the idiots. And once you do that, it usually boils down to a handful of talented people who take an agonizingly long time to get anything done (with good reason). And don't forget - in my example above, our source code was 99.9% clean (it may have been cleaner than that... I forget how much code we actually wrote). An entire class, whose grades (and thus future) depended on it, missed the single mistyped character. No one ever saw the evil code, but that was the stuff that got loaded onto the voting machine.

I dunno, seems to me that with the truckloads of money that get spent to win an election (legitimately or otherwise), it wouldn't be hobbyists looking at the code for a voting machine. There would be a lot of high-paid people doing it, with real stakes. Your class was only motivated by a grade, which was one of several for the year. Imagine if you'd been motivated by having to live under a political system you didn't like.

1988RedT2
1988RedT2 SuperDork
3/4/12 1:23 p.m.

I think it's funny and sad to see government working so hard to do everything "electronically." Instead of making processes and information more accurate and secure, it's having precisely the opposite effect.

peter
peter Reader
3/4/12 1:46 p.m.
Keith wrote: I dunno, seems to me that with the truckloads of money that get spent to win an election (legitimately or otherwise), it wouldn't be hobbyists looking at the code for a voting machine. There would be a lot of high-paid people doing it, with real stakes. Your class was only motivated by a grade, which was one of several for the year. Imagine if you'd been motivated by having to live under a political system you didn't like.

I mistook your "motivated by more than a paycheck" statement then. I thought you meant do-good volunteers. Yes, you can throw money at this, and there are companies that will look at code for you. But even the bad ones cost tons. My original statement was that you'd have to do just that (I think). And no voting machine company is going to do that when they can just lobby to have their stuff certified by a company that they own, or can simply buy an approval from.

The original Diebold machine was approved by somebody. And boy was it a crock.

ADA-compliance is motivating a lot of the move to electronic voting. There's a lot of money to be made in selling a $200 computer and a computer program for tens of thousands of dollars a pop.

I've been hacking a medical device recently. It's amazing that even in incredibly important fields, the computer science/engineering side of things is incredibly cheap and utter crap. Expensive medical device, great sales side, electronic and software behaviors that were designed by a drunk monkey.

Keith
Keith GRM+ Memberand SuperDork
3/4/12 5:23 p.m.

See, I believe that before a voting machine gets approved, the code should be made public so it can be reviewed by all interested parties. And whether the problems are found by a volunteer leftist nutjob or by a team of professionals, they should be found. Sure, the lobbyists will cry. But I'm okay with that.

The original Diebold wasn't put under public scrutiny. And yes, that worked out so well.
