The more social we get, the more we expose about ourselves. That's a fact of life that antedates social media, the Internet, or computers. From this perspective, social media are just a new way to reveal ourselves, sometimes by choice, other times not.
For the PM in charge of product requirements or release checklists, working on a social media product requires standing your ground on security and privacy issues. I won't belabor the unfortunate launch of Google Buzz further, except to make one last point. To avoid the sort of morass in which Google now finds itself, someone in PM needs the authority to insist on external testing, use case analysis, red team exercises: some way, any way, to avoid a class action lawsuit.
Unfortunately, very few PMs are experts in security and privacy. Historically, compliance and risk management haven't loomed over the tech industry the way they do in pharmaceuticals and other tightly regulated verticals. The Section 508 requirements of the Rehabilitation Act, an important box to check before releasing a new product, are the closest many PMs get to anything resembling regulation.
That situation may change. Security specialists in tech companies deal with technical issues like, "How vulnerable are we to cross-site attacks?" Normally, they are not the people who think through the capabilities of the product to see if there is some way that product might compromise security or privacy. Even if they volunteered for the job, it's not feasible to bring them into the product development cycle early enough to make a difference. Nor will they be able to stay engaged to monitor how the features inevitably change over the course of development.
Therefore, security, risk management, privacy, and related issues inevitably fall into the laps of PMs. Social media products raise questions about who can pry into your personal information, or how those details might become public without your permission or knowledge. Other products create different risks. For example, the Strange Case of the Voyeuristic Laptops at a Philadelphia-area high school raises the question: is the company that produced the software used to spy on students liable at all for the abuse of its technology? Does circumventing an operating system's security features create other points of legal exposure?
I've yet to meet a PM who loves dealing with these kinds of issues. Who wants to be the person who, in a release meeting, presents the 15 potential security problems that need to be addressed before the product ships? Particularly if, just before you, someone gave a brilliant demo of how incredibly cool the product will be, if only it could get into the hands of customers. And, of course, success is defined as the absence of problems, not positive gains that everyone can celebrate. ("Woohoo! Three product releases without a lawsuit!") Sadly, I don't see a way of avoiding this responsibility.
[Cross-posted at The Heretech.]