I've been following a discussion on a private security-related mailing list about what constitutes sensitive information. What's interesting is that many participants seem to have completely missed the point about privacy, as opposed to security.
The whole thing started with a query as to whether a person's date of birth could be considered sensitive or confidential if combined with other personally identifiable information. The thread then meandered around various topics including privacy, identity theft and authentication. It pretty much ended with various contributors suggesting that any security professional worth his (or her) salt should not need to ask this question, and in particular, the last person to ask it of should be a lawyer. One contributor - who had better remain nameless - stated outright, "No slight intended on lawyers, but if anybody who considers themselves to be an information security professional needs to rely on lawyers to tell them what sensitive information is and how they are to protect it then we are all doomed!".
Bzzt! Thank you for playing, Anonymous! Please bring on the next contestant.
Infosec professionals need to be aware of the law in this area, as in many others. And, the law being what it is, the best approach is to consult a lawyer. Certainly, one should have some degree of familiarity with the relevant local law; for me, here in Australia, this is the Privacy Act 1988 (Cth), which defines personal information that is subject to the Act, as well as "sensitive information" which is subject to additional safeguards.
The Act defines personal information as "information or an opinion (including information or an opinion forming part of a database), whether true or not, and whether recorded in a material form or not, about an individual whose identity is apparent, or can reasonably be ascertained, from the information or opinion."
It further defines "sensitive information", which requires specific treatment under the Act as:
"(a) information or an opinion about an individual's:
(i) racial or ethnic origin; or
(ii) political opinions; or
(iii) membership of a political association; or
(iv) religious beliefs or affiliations; or
(v) philosophical beliefs; or
(vi) membership of a professional or trade association; or
(vii) membership of a trade union; or
(viii) sexual preferences or practices; or
(ix) criminal record;
that is also personal information; or
(b) health information about an individual; or
(c) genetic information about an individual that is not otherwise health information."
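To make the Act's two-tier distinction concrete, here is a minimal sketch (in Python) of how a system might separate fields that fall into the Act's sensitive categories from ordinary personal information. The field names and the simple name-based matching are my own assumptions for illustration; this is not a compliance tool, and real classification would need legal review.

```python
# Illustrative sketch only: mapping hypothetical record fields to the
# "sensitive information" categories of the Privacy Act 1988 (Cth).
# Field names and classification logic are assumptions, not legal advice.

SENSITIVE_CATEGORIES = {
    "ethnicity": "racial or ethnic origin",
    "political_party": "membership of a political association",
    "religion": "religious beliefs or affiliations",
    "union_membership": "membership of a trade union",
    "criminal_record": "criminal record",
    "health_conditions": "health information",
    "genetic_profile": "genetic information",
}

def classify_fields(record: dict) -> dict:
    """Split a record's fields into 'sensitive' (extra safeguards apply)
    and ordinary 'personal' information, per the sketch above."""
    sensitive = {k: v for k, v in record.items() if k in SENSITIVE_CATEGORIES}
    personal = {k: v for k, v in record.items() if k not in SENSITIVE_CATEGORIES}
    return {"sensitive": sensitive, "personal": personal}

record = {"name": "Jane Citizen", "religion": "Buddhist", "postcode": "2000"}
result = classify_fields(record)
```

The point of the split is operational: the "sensitive" bucket would trigger the Act's additional handling requirements (e.g. restrictions on collection and use), while the rest remains subject to the ordinary personal-information rules.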
If you're anything like I was when I first read that list, you were probably surprised by some of the items on it; on mature reflection, though, you'll probably come to agree, and perhaps think of additional elements that should be added. All of which suggests that our intuitive understanding of privacy is often incomplete.
Notions of privacy vary enormously around the world; furthermore, there is often a considerable gap between what an enterprise thinks it ought to know about its customers/clients/employees and what those individuals would like the enterprise to know and do with what it knows.
The essential difference between security/confidentiality and privacy is who has control: for security in general, the owner of the information has control, but for privacy, the subject of the information has control. As an individual, the subject clearly has little influence, and certainly no authority, over what an enterprise information owner does; hence the mechanism by which subjects collectively exert that control is legislation.
Security professionals tend to focus on how to assure the confidentiality, integrity and availability of the information in the systems in their care. We put a lot of effort into making sure the bad guys can't get access to our information. But privacy legislation is written to make sure that we aren't the bad guys - that we don't collect information we shouldn't, and that we don't use information in ways we shouldn't. Sometimes that poses ethical conflicts, when our employers think it would be a good idea to collect or aggregate personal information contrary to legislation; in that case, we have to advise against it and insist on compliance with the law.
For the individual, the decision to disclose personal information is a trust decision. In some cases - when dealing with other individuals, for example - we are able to rely on their benevolence. But massive corporations, by and large, are not benevolent and not possessed of individual free will. We have to rely much more on their competence, their integrity in the sense of willingness to be bound to privacy compliance, and their ability to resist security breaches. Individuals are therefore forced to rely on what is sometimes called deterrence-based trust - the existence of legal sanctions which ensure that penalties for breach of trust will exceed any potential benefits from opportunistic behaviour.
An excellent example, the Icelandic use of population genomics databases, illustrates the deeper complexities: most individuals have given virtually no thought to the privacy implications of releasing their DNA for research purposes, but fortunately medical and legal ethicists have been thinking about it and proposing additional safeguards. (Thanks to Graciela Pataro for this example.)
So to say that security professionals don't need to consult a lawyer is disingenuous. Winging it, and assuming that defending our systems against external threats is all that's required, simply isn't enough.
Privacy Act 1988 (Cth), as amended, Commonwealth Government of Australia. Available online at http://www.comlaw.gov.au/comlaw/management.nsf/lookupindexpagesbyid/IP200401860
Herman T. Tavani, "The Case of DeCODE Genetics, Inc.", in Chapter 1, "Ethics at the Intersection of Computing and Genomics", in Herman T. Tavani (ed.), Ethics, Computing and Genomics, Jones & Bartlett Publishers, 2005. ISBN 0763736201, 9780763736200. Available online at http://books.google.com.au/books?id=wlrPaPRshesC&pg=PA15&dq=the+case+of+DECODE+Genetics+Inc#v=onepage&q=the%20case%20of%20DECODE%20Genetics%20Inc&f=false