AdaCamp: San Francisco June 8-9

(Posted for friends)

AdaCamp is a conference dedicated to increasing women’s participation in open technology and culture: open source software, Wikipedia-related projects, open data, open geo, fan fiction, remix culture, and more. The conference will be held June 8 and 9 in San Francisco.

There will be two tracks at the conference: one for people who identify as significantly female, and one (likely on just the Saturday) for allies and supporters of all genders.

Attendance to AdaCamp is by invitation following application. Please spread the word and apply. Travel assistance applications are due April 12; all other applications are due April 30. (Yes, the application process looks kind of daunting, but it’s worth it!)

Details at

My AusCert Gala talk

At AusCert, I had the privilege to share the gala dinner stage with LaserMan and Axis of Awesome, and talk about a few security lessons from Star Wars.

I forgot to mention onstage that I’ve actually illustrated all eight of the Saltzer and Schroeder principles, and collected them up as a single page. That is “The Security Principles of Saltzer and Schroeder, Illustrated with Scenes from Star Wars”. Enjoy!

We Robot: The Conference

This looks like it has the potential to be a very interesting event:

A human and robotic hand reaching towards each other, reminiscent of Da Vinci

The University of Miami School of Law seeks submissions for “We Robot” – an inaugural conference on legal and policy issues relating to robotics to be held in Coral Gables, Florida on April 21 & 22, 2012. We invite contributions by academics, practitioners, and industry in the form of scholarly papers or presentations of relevant projects.

We seek reports from the front lines of robot design and development, and invite contributions for works-in-progress sessions. In so doing, we hope to encourage conversations between the people designing, building, and deploying robots, and the people who design or influence the legal and social structures in which robots will operate.

Robotics seems increasingly likely to become a transformative technology. This conference will build on existing scholarship exploring the role of robotics to examine how the increasing sophistication of robots and their widespread deployment everywhere from the home, to hospitals, to public spaces, and even to the battlefield disrupt existing legal regimes or require rethinking of various policy issues.

They’re still looking for papers at: I encourage you to submit a paper on who will get successfully sued when the newly armed police drones turn out to be no more secure than Predators, with their viruses and unencrypted connections. (Of course, maybe the malware was just spyware.) Bonus points for entertainingly predicting quotes from the manufacturers about how no one could have seen that coming. Alternately, what will happen when the riot-detection algorithms decide that policemen who’ve covered their barcodes are the rioters, and open fire on them?

The possibilities for emergent chaos are nearly endless.

The 1st Software And Usable Security Aligned for Good Engineering (SAUSAGE) Workshop

National Institute of Standards and Technology
Gaithersburg, MD USA
April 5-6, 2011

Call for Participation

The field of usable security has gained significant traction in recent years, evidenced by the annual presentation of usability papers at the top security conferences, and security papers at the top human-computer interaction (HCI) conferences. Evidence is growing that significant security vulnerabilities are often caused by security designers’ failure to account for human factors. Despite growing attention to the issue, these problems are likely to continue until the underlying development processes address usable security.

See for more details.

Hacker Hide and Seek

Core Security’s Ariel Waissbein has been building security games for a while now. They were kind enough to send a copy of their “Exploit” game after I released Elevation of Privilege. [Update: I had confused Ariel Futoransky and Ariel Waissbein, because Waissbein wrote the blog post. Sorry!] At Defcon, he and his colleagues will be running a more capture-the-flag sort of game, titled “Hide and seek the backdoor:”

For starters, a backdoor is said to be a piece of code intentionally added to a program to grant remote control of the program (or the host that runs it) to its author, while at the same time remaining difficult for anybody else to detect.

But this last aspect of the definition actually limits its usefulness, as it implies that the validity of the backdoor’s existence is contingent upon the victim’s failure to detect it. It does not provide any clue at all into how to create or detect a backdoor successfully.

A few years ago, the CoreTex team did an internal experiment at Core and designed the Backdoor Hiding Game, which mimics the old game Dictionary. In this new game, the game master provides a description of the functionalities of a program, together with the setting where it runs, and the players must then develop programs that fulfill these functionalities and have a backdoor. The game master then mixes all these programs with one that he developed and has no backdoors, and gives these to the players. Then, the players must audit all the programs and pick the benign one.
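
To make the setup concrete, here is a minimal sketch (my own hypothetical example, not code from Core’s game) of the kind of backdoor a player might try to smuggle past auditors:

```c
#include <string.h>

/* Hypothetical example of a hidden backdoor: this looks like a routine
 * password comparison, but the length passed to strncmp comes from the
 * attacker-controlled input. An empty password compares zero bytes and
 * always "succeeds", as does any prefix of the real password. */
int check_password(const char *entered, const char *stored)
{
    return strncmp(entered, stored, strlen(entered)) == 0;
}
```

An auditor scanning for magic strings or suspicious extra branches can easily read past this, because the flaw lives in an argument rather than in a statement, which is exactly what makes the hiding game hard.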

First, I think this is great, and I look forward to seeing it. I do have some questions. What elements of the game can we evaluate and how? A general question we can ask is “Is the game for fun or to advance the state of the art?” (Both are ok and sometimes it’s unclear until knowledge emerges from the chaos of experimentation.) His blog states “We discovered many new hiding techniques,” which is awesome. Games that are fun and advance the state of the art are very hard to create. It’s a seriously cool achievement.

My next question is, how close is the game to the reality of secure software development? How can we transfer knowledge from one to the other? The rules seem to drive backdoors into most of the code: if every player’s program works, (n-1) of every n programs contain one. That’s a far higher incidence of backdoors than exists in the wild. I’m assuming that the code will all be custom, and thus short enough to create and audit in a game, which also leads to a higher concentration of backdoors per line of code. That different concentration will reward different techniques from those that would scale to a million lines of code.

More generally, do we know how to evaluate hiding techniques? Do hackers playing a game create the same sort of backdoors as disgruntled employees or industrial spies? Because of this contest and the Underhanded C Contests, we have two corpuses of backdoored code. However, I’m not aware of any corpus of deployed backdoor code which we could compare.

So anyway, I look forward to seeing this game at Defcon, and in the future, more serious games for information security.

Previously, I’ve blogged about the Underhanded C contest here and here.

SOUPS Keynote & Slides

This week, the annual Symposium on Usable Privacy and Security (SOUPS) is being held on the Microsoft campus. I delivered a keynote, entitled “Engineers Are People Too:”

In “Engineers Are People, Too” Adam Shostack will address an often invisible link in the chain between research on usable security and privacy and delivering that usability: the engineer. All too often, engineers are assumed to have infinite time and skills for usability testing and iteration. They have time to read papers, adapt research ideas to the specifics of their product, and still ship cool new features. This talk will bring together lessons from enabling Microsoft’s thousands of engineers to threat model effectively, share some new approaches to engineering security usability, and propose new directions for research.

A fair number of people have asked for the slides, and they’re here: Engineers Are People Too.