2008/01/11

Warumungu Norms, Privacy, Facebook, and Useful Friction

We could learn something from the Warumungu. Wendy Seltzer's Mukurtu Digital Archiving: digital "restrictions" done right is about DRM, freedom, and controls; I think it's also about privacy. What's private, and what's public, and what's semi-private are culturally determined no less than the Warumungu rules around who is allowed to see what artifacts:
...the Warumungu have a set of protocols around objects and representations of people that restrict access to physical objects and photographs. Only elders may see or authorize viewing of sacred objects; other objects may be restricted by family or gender. Images of the deceased shouldn’t be viewed, and photographs are often physically effaced. When the Warumungu archive objects or images, they want to implement the same sort of restrictions.
With an interesting twist:
People can also print images or burn CDs and thus allow the images to circulate more widely to others who live on outstations or in other areas. In fact, one of the top priorities in Mukurtu's development was that it needed to allow people to take things with them; printing and burning were necessary to ensure circulation of the materials.
What, then, prevents people from violating these norms?
Because the Mukurtu protocol-restrictions support community norms, rather than oppose them, the system can trust its users to take objects with them. If a member of the community chooses to show a picture to someone the machine would not have, his or her interpretation prevails — the machine doesn't presume to capture or trump the nuance of the social protocol.
People, relationships, and norms are fuzzy and messy, so maybe it's reasonable that a system dealing with them is fuzzy and messy too. What Mukurtu does is put enough useful friction in the way of disclosure to give community norms a chance to operate. You can't email an image out to a mailing list, but you can print it and show it to a reasonably small number of people at a time. The point is not to control distribution perfectly, but to give human-scale trust mechanisms a chance to operate.

Who owns the data?

L'Affaire Scoble raised the question, who owns relationship data? Dare Obasanjo argues that his contact data is his, not Robert's. And he wants Facebook to enforce this.

I'd argue that we should un-ask the ownership question. As long as we're talking about ownership, we're heading down the road toward the DRM that has worked out so well for the music business. I'd like to talk instead about community norms, and about what kind of useful friction we should be thinking about in the purely digital realm to give community norms a chance to operate. Reputation and portable identity are part of this, as are things like limited access (e.g., OAuth), rate limits, soft constraints, and user-centric norm enforcement. (What would happen if the people on Robert's friends list were simply informed, in real time, that he was copying their data for an unknown purpose?)
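To make "useful friction" concrete, here is a minimal sketch of what rate-limited export plus real-time notification might look like. Everything here is hypothetical — the class name, the callback, and the limits are illustrative, not any real Facebook or OAuth API; the point is only that the mechanism slows bulk copying without forbidding it, and tells the affected people it happened.

```python
import time
from collections import deque

class ExportGate:
    """Hypothetical 'useful friction' for contact data: exports are
    rate-limited over a sliding window, and each affected contact is
    notified in real time that their data was copied."""

    def __init__(self, max_exports, per_seconds, notify):
        self.max_exports = max_exports
        self.per_seconds = per_seconds
        self.notify = notify     # callback(contact, message): tell the contact
        self.history = deque()   # timestamps of recent successful exports

    def export_contact(self, requester, contact, now=None):
        now = time.time() if now is None else now
        # Forget exports that have aged out of the sliding window.
        while self.history and now - self.history[0] > self.per_seconds:
            self.history.popleft()
        if len(self.history) >= self.max_exports:
            return False  # friction: slow down, not a permanent prohibition
        self.history.append(now)
        self.notify(contact, f"{requester} copied your contact data")
        return True
```

The design choice mirrors the Mukurtu print-and-burn idea: a human showing photos one at a time is inherently rate-limited, and a bulk scrape is not — the gate restores that human-scale pacing, while the notification gives the community, rather than the machine, the chance to enforce its norms.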

(Nick Carr has a great post on this subject as well.)