The following is a continuation of Facebook Privacy: Canadian Privacy Laws and How Facebook is Changing
The following are excerpts from the privacy commission's report detailing where the commission found Facebook in breach of Canadian law.
Section 1 – The Use of Dates of Birth (DOBs)
55. In sum, with respect to its collection of DOB, I find Facebook to be in contravention of the above-cited principles, most notably Principles 4.2.3 and 4.3.2.
56. In my preliminary report, I recommended that Facebook
(1) revise the pop-up phrase “a means of preserving the integrity of the site” so as to more clearly capture the true purpose intended and make it more understandable to users;
(3) revise its site literature wherever appropriate, including pop-ups on the registration page, so as to clearly define what it means by profile information and to clearly dispel the notion that “hiding” DOBs from a profile means exempting them from use in targeted advertising; and
(4) indicate, in the pop-up in which it specifies the purposes for collection of DOBs, that DOBs are collected also for the purpose of targeted advertising. Facebook should likewise specify any other purposes for which it intends to use or disclose users' DOBs.
Section 2 – The Pre-selection of Privacy Settings
98. To conclude, I find that Facebook's notification efforts relating to privacy settings fail to meet a reasonable standard in the circumstances, as envisaged in Principles 4.2.3 and 4.3.2. In particular, Facebook needs to do more to ensure that new users can make informed decisions about controlling access to their personal information when registering. Facebook has given its users tools to control their personal information.
99. In my preliminary report, I recommended that Facebook
(1) make user profiles inaccessible to search engines by default;
(2) change the default setting for photo albums to “Your Networks and Friends”;
(3) provide a link to the privacy settings at registration, accompanied by a means whereby users can inquire and be informed specifically about the meaning of the term “privacy settings” and can be notified that Facebook has preselected the settings and that the settings can be changed according to the users' preferences; and
(4) provide users who join networks after registration with the same notification as received by users who join networks at registration.
100. In response, Facebook has taken a holistic approach to meeting our Office's concerns relating to privacy settings. The company intends to implement the following two significant changes in the near future: (1) It will introduce a “Privacy Wizard”, whereby users will be able to select a low, medium, or high privacy setting. This selection will dictate more granular default settings. Notably, users who choose the “high” setting will not be included in public search listings. Facebook maintains that its new Privacy Wizard and emphasis on per-object privacy (see below) will meet the purpose of assuring that users have made a fully informed choice about whether their information is made available in any way to search engines.
(2) It will also implement a per-object privacy tool, whereby users will be given “an easily configurable setting on every piece of content that they will be able to configure at the time of uploading or other sharing. In a matter of weeks, the changes that are in testing will allow users to choose privacy settings on individual photos and pieces of content such as status updates.” Our Office infers from this that Facebook intends to extend its notification practice in respect of photo albums to other types of information.
101. Facebook has also stated that it is conducting preliminary testing on a revised registration flow that will provide more information on privacy settings.
102. As for our fourth recommendation, Facebook has agreed to implement the appropriate measure.
In this section the commission rules that Facebook's default privacy settings allow too much personal information to be publicly accessible, and rules that profiles must be preset to deny a great deal of public access. In other words, your information should start private, and you should then have to make it public, not vice versa.
Facebook agrees and takes it a step further, instituting a complete privacy system that includes a privacy wizard at signup and granular content control, so users can set privacy settings on a per-object basis.
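The per-object idea is easy to picture in code. The sketch below is purely illustrative (the class names and visibility tiers are invented, not Facebook's actual implementation); it shows the principle the commission asked for: the privacy setting travels with each piece of content and defaults to a restrictive value, so the user must act to widen access rather than to narrow it.

```python
from enum import Enum

class Visibility(Enum):
    """Hypothetical visibility tiers, from most to least restrictive."""
    ONLY_ME = 1
    FRIENDS = 2
    NETWORKS_AND_FRIENDS = 3
    EVERYONE = 4

class ContentItem:
    """A single piece of content (photo, status update, etc.).

    The privacy setting is attached to the object itself and defaults
    to a restrictive tier, so information starts private and the user
    must choose to make it public -- not vice versa.
    """
    def __init__(self, owner, data, visibility=Visibility.FRIENDS):
        self.owner = owner
        self.data = data
        self.visibility = visibility  # chosen per object, e.g. at upload time

    def visible_to(self, viewer_tier):
        # viewer_tier is the viewer's relationship to the owner,
        # expressed as the tier they fall into (FRIENDS for a friend,
        # EVERYONE for a stranger). Content is visible only if the
        # viewer's tier is within the object's visibility setting.
        return viewer_tier.value <= self.visibility.value
```

Under this model a status update uploaded with the default setting is visible to friends but hidden from the public, and making it world-readable requires an explicit per-object choice.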
Section 3 – Explanation of the use of personal information for advertising purposes
140. In my preliminary report, I recommended that Facebook
142. Facebook objected in principle to recommendation 2 above on grounds that it was opposed to interruptive notices that disrupt the user experience. Nevertheless, the company agreed to configure its systems so as to “allow users who are particularly privacy sensitive to discover more information easily about site operations and to provide feedback on their concerns to Facebook.”
The commission is concerned with Facebook's insufficient explanation of the difference between Facebook Ads and Social Ads, as well as with what it sees as an insufficient explanation of how personal information is used for advertising purposes.
Section 4 – Third Party Applications
200. When I speak of limits to access, and especially when I consider the vast amounts of Facebook users' personal information potentially available to large numbers of application developers, I believe something much more substantial in the way of safeguards is required. Specifically, I mean technological safeguards that will not simply forbid, but effectively prevent, developers' unauthorized access to personal information that they do not need.
202. I find that Facebook does not have adequate safeguards in place to prevent unauthorized access to users' personal information by application developers
203. On the question of consent, I find Facebook's manner of seeking consent to be problematic in two ways.
204. First, the consent language that Facebook uses is excessively broad. […] Facebook is in effect telling users that whenever they add an application, they must consent to allowing access to almost anything and everything that the developer asks for. In my view, consent obtained on such a basis is meaningless. In the circumstances, the user's meaningful consent to the collection and use of specified information should be sought at each instance of a user's adding an application.
205. Second, technically, application developers' receipt of users' personal information through the Facebook API may be considered not only a collection by the developer, but also a disclosure by Facebook. Accordingly, Facebook has an obligation to ensure that users consent to such disclosure of their personal information. However, given Facebook's platform as it relates to third-party applications, Facebook can meet this obligation by taking reasonable measures to ensure and verify that application developers are obtaining meaningful consent on behalf of Facebook.
207[…] Facebook should take further steps to ensure that developers are well aware of the requirement to do so and that they comply with it. For one thing, Facebook should feature the requirement prominently in the Platform Guidelines and other instructions to developers, as well as in the SRR. For another, the company should develop a means of monitoring applications to ensure that developers are complying with the requirement to obtain consent. The company might even consider providing developers with a means of explaining to users what information they need and why (possibly by adjusting the current template so as to provide space for such an explanation).
208. Another consent-related concern that I have is the fact that no specific consent is sought from users for the disclosure of their personal information to applications when their friends and fellow network members add applications. Facebook maintains that, through its privacy settings, users have an extensive ability to choose whether or not they will interact with any particular Facebook application and to block any particular application and opt-out of all Facebook applications in a simple way. However true this statement may be in theory, I would note that users' “ability to choose” would depend on their being knowledgeable about developers' practice of accessing and using third-party information when friends add applications. I would also note that the only way users can control the exposure of their personal information to application developers when their friends and fellow network members add applications is either to opt out of all applications altogether or to block specific applications. Moreover, the latter option would effectively require them to guess which of the more than 350,000 applications their friends and fellow network members are likely to add.
211. In my preliminary report, I recommended that Facebook consider and implement measures
(1) to limit application developers' access to user information not required to run a specific application;
(2) whereby users would in each instance be informed of the specific information that an application requires and for what purpose;
(3) whereby users' express consent to the developer's access to the specific information would be sought in each instance; and
(4) to prohibit all disclosures of personal information of users who are not themselves adding an application.
212. In response, Facebook raised objections as noted in my findings above and in effect declined to implement the recommendations.
The privacy commission had serious problems with the Facebook application model, stating that technological limits on access to user information through third-party applications are required. Of particular note was the use of third-party applications that could gather information about friends of users.
The commission also states that the language used to explain this access is insufficient, and as such does not legally obtain the consent of the user.
Finally, the commission recommends that Facebook implement stricter third-party guidelines and a monitoring system to ensure developers obey them, including denying developers access to information they do not require for their application, informing users of the information a third-party application will use, and preventing all use of information about users who have not installed an application.
Facebook initially objects outright, declining the recommendations. However, on August 24th Facebook released a press release stating that it is: “Increasing the understanding and control a user has over the information accessed by third-party applications. Specifically, Facebook will introduce a new permissions model that will require applications to specify the categories of information they wish to access and obtain express consent from the user before any data is shared. In addition, the user will also have to specifically approve any access to their friends' information, which would still be subject to the friend's privacy and application settings.”
So in the end, Facebook has essentially agreed.
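The permissions model described in the press release can also be sketched in a few lines. All names below are invented for illustration; the sketch only captures the three commitments in the quote: an application must declare the categories it wants up front, nothing is shared without the user's express, category-specific consent, and friend data additionally requires the installing user's approval and remains subject to the friend's own settings.

```python
class App:
    """A third-party application that must declare, in advance,
    the categories of information it wishes to access."""
    def __init__(self, name, requested_categories):
        self.name = name
        self.requested = set(requested_categories)

class User:
    def __init__(self, name, data):
        self.name = name
        self.data = data                  # category -> value
        self.consents = {}                # app name -> approved categories
        self.allow_friend_access = set()  # apps allowed to reach friend data via this user

    def approve(self, app, categories):
        """Express consent: the user approves a subset of what the app
        requested; anything the app did not declare stays off-limits."""
        self.consents[app.name] = set(categories) & app.requested

def fetch(app, user, category):
    """Share a category only with prior, category-specific consent."""
    if category not in user.consents.get(app.name, set()):
        raise PermissionError(
            f"{user.name} has not approved '{category}' for {app.name}")
    return user.data[category]

def fetch_friend(app, installer, friend, category):
    """Friend data requires BOTH the installer's specific approval of
    friend access AND the friend's own privacy/application settings."""
    if app.name not in installer.allow_friend_access:
        raise PermissionError("installer has not approved friend access")
    return fetch(app, friend, category)  # friend's own consent still applies
```

The key design point, and the one the commission pressed for, is that consent is checked per category at each access rather than granted wholesale when the application is added.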
Section 6 – Collection of Personal Information from Sources Other than Facebook
249. In my preliminary report, I recommended that Facebook develop, institute, and inform users of a retention policy whereby the personal information of users who have deactivated their accounts will be deleted from Facebook's servers after a reasonable length of time.
250. I also suggested, as best practice in the interest of clarity for users, that Facebook
(1) include an account deletion option, as well as an explanation thereof as distinct from account deactivation, on its users' Account Settings pages; and
251. In response to my recommendation, Facebook objected on the following grounds:
“… [A] majority of deactivating users reactivate within weeks, and those who reactivate on a longer timeframe are generally expecting their social connections to be intact when they return. Because the option to delete data is present for users, and because of interdependencies on certain data, setting a firm date for erasing a user's information without clear direction from them in this context would be inappropriate.”
252. The Act is clear that organizations must retain personal information only for as long as necessary to fulfil the organization's purposes, that organizations should develop guidelines and implement procedures with respect to the retention of personal information, and that such guidelines should include minimum and maximum retention periods. While I acknowledge that the length of time an organization may retain personal information may vary depending on the circumstances, I do not consider it either necessary or reasonable in the present circumstances for Facebook to retain personal information indefinitely in deactivated accounts.
“Individuals who wish to deactivate their Facebook account may do so on the My Account page. Removed information may persist in backup copies for a reasonable period, but will not be generally available to members of Facebook. Individuals who wish to delete their accounts may use the attached form to submit their account for the deletion process, which may take several weeks to complete processing.”
Facebook refuses to implement a deletion option on the account settings page, but agrees to implement an explanation of deactivation and a form that users can submit to “submit their account for the deletion process”.
Section 7b – Accounts of Deceased Users
276. I find therefore that, with respect to informing individuals of its practice of account memorialization, Facebook is in contravention of Principles 4.2.1, 4.2.3, 4.3.2, and 4.8.
281. In my preliminary report, I recommended that Facebook
(2) provide, and notify users of, a means whereby they may opt out of Facebook's intended use of their personal information for the purpose of memorializing accounts.
282. In response, Facebook has in effect declined to implement either recommendation, on the following grounds:
“We still do not believe that retaining data for the purpose of allowing users to remember their friends constitutes another use under PIPEDA, and in any event users are perfectly capable of using other means to express their wishes in this area. We also believe that it would be inappropriate to create a standard for handling information in this case that would be at variance with existing legal norms for the disposition of estate property.”
Facebook also noted that services around access to digital assets in the event of death are carried out by private vendors.
284. I will not insist upon Facebook's implementation of my second recommendation. My first, however, remains. I would strongly urge Facebook to reconsider it.
The commission finds that memorialization of accounts is improperly explained, and requests that Facebook provide a better explanation and an option to opt out of memorialization.
Section 8 – Personal Information of Non-users
308. The “Invite New Friends” email invitation feature is also an activity by Facebook. Facebook maintains that it provides this service for the use of its users, but clearly the service also helps Facebook gain new members and thereby increase its ability to generate revenue.
309. In my view, therefore, Facebook should assume some responsibility for seeking consent in these contexts. The question is, what kind of responsibility?
311. I continue to believe that responsibility for consent should begin to apply at the point in the tagging process where Facebook actively solicits non-users' email addresses from users with the intention of using them for purposes of its own.
312. Furthermore, Principle 4.3 states that the knowledge and consent of the individual are required. For situations where one party collects from a second party the personal information of a third, our Office has determined in previous cases that, depending on the circumstances, it may be deemed incumbent on the second party (in this case, the Facebook user) to directly obtain the consent from the third (in this case, the non-user). We have also determined in such cases that the first party (in this case, Facebook), though not responsible for directly obtaining consent, must nevertheless take reasonable measures to ensure that consent is obtained by the second party. In other words, the first party must exercise due diligence to ensure that the requirement for consent is met.
317. In my preliminary report, I recommended that Facebook
(1) consider and implement measures to address our concerns about nonusers' lack of knowledge of, and consent to, their being tagged in photographs;
(2) consider and implement measures to improve its invitation feature so as to address our Office's concerns about non-users' lack of knowledge and consent to Facebook's collection, use, and retention of their email addresses; and
(3) set a reasonable time limit on the retention of non-users' email addresses for purposes of tracking invitation history and the success of the referral program.
318. In response to my first and second recommendations, Facebook declined to implement on the following grounds:
“… Facebook believes we continue to provide significantly greater notice to nonusers as to the presence of any information about them on our site than does any other site on the web. If a nonuser wishes to block further notifications, we honor that request, and data is otherwise retained at the direction of the user who uploaded it initially, making action Facebook would take to delete the data inappropriate without an intervening action by the person who uploaded it in the first place.”
319. As to the practice of tagging non-users, Facebook commented as follows:
“With regard to photographs in particular, Facebook's tagging infrastructure offers users more notice than they get on other websites as to the presence of a photograph they may want to review. While on most sites a picture of an individual can be uploaded and they may have no idea of its presence, Facebook provides a means for them to be notified and to get in touch with the person who uploaded the photo if they have an objection. For non-users, this can be done by adding an e-mail address to a tag. Furthermore, we have designed the tagging infrastructure to allow removal of tags by the individual tagged, and for blocking of further emails if the recipient so desires.”
320. Over all, Facebook has argued that non-user data is the responsibility of the user who uploads it, that the photo tagging and invitation features constitute personal uses by users themselves, and that Facebook provides non-users with better notice than any other website about the presence of their data on the site.
321. As was also the case with my other recommendation relating to retention, Facebook made no direct response to my third recommendation above.
The commission notes that Facebook has a financial interest in collecting the email addresses of potential users, and as such shares some responsibility when those addresses are collected without the consent of the third party. It recommends that Facebook implement measures to inform users that they must have permission, and to penalize users who invite people without it.
Section 10 – Monitoring for Anomalous Activity
368. While I do not find the practice to be unreasonable or inappropriate in itself, in consideration of the Principles cited above I am concerned that Facebook is not making a reasonable effort to document it and inform users of it.
“To improve the security of the site, Facebook uses a variety of technological systems to detect and address anomalous activity that may be undertaken by users. This may on occasion result in a temporary or permanent suspension of some functions for some users on the Facebook service.”