The research paper begins by addressing the existing, problematic view of privacy.
It argues that there are two types of privacy that most people see as one. The first type,
privacy as a subject, is the ordinary social privacy that users enforce against each other.
The second type, termed ‘privacy as an object’, concerns personal information that is no
longer seen by other users but, rather, by big data algorithms. The researchers claim that by
separating the two types of privacy, they can effectively show how social media companies
have left exploitable blind spots in user privacy. Social media platforms largely address
subjective privacy, that is, privacy among users, while concealing a loophole in objective
privacy, where user data interacts with third parties and big data algorithms (Heyman, Wolf
& Pierson, 2014). The three researchers examine the settings that Facebook and Twitter give
their users to control privacy, and then evaluate the design of these settings against Donald
Norman’s principles for human-centered interfaces.
The three researchers begin by briefly explaining the two privacy perspectives they
set out. They say that on social media platforms there are two flows of information, even
though users might think there is only one. The first and obvious flow is between users;
here, account owners can determine what can be viewed or accessed by others (Heyman,
Wolf & Pierson, 2014). However, in some instances users cannot really keep their posts
from reaching unintended audiences. If a Facebook user posts a picture and sets it to be
visible to her Facebook friends only, there is no control to bar those friends from further
broadcasting the picture. Courts have clarified this issue, holding that one forfeits privacy
interests in any information one chooses to post on Facebook.
In 2017, Mark Sableman made a contribution specific to this issue of the limited
privacy controls that users have. He gave a real-life illustration involving a court case
brought by a Facebook user called Chelsea Charney (Sableman, 2017). The court dismissed
her argument that, because she had set a post to be visible to Friends of Friends, she had
protected her privacy or had acquired semi-privacy for the post (Sableman, 2017). The court
reasoned that while Chelsea was able to select her Facebook friends, she was not in a
position to select the friends of her friends (Sableman, 2017). The post was therefore
available to hundreds or thousands of people she did not know. The court also leaned on the
legal precedent that one should have no expectation of privacy for information voluntarily
given to third parties (Sableman, 2017). This contribution illustrates that users should be
wary of the semi-private settings offered by social media companies, which provide only
limited control over what other users can see on one’s profile.
The second flow of information goes to third parties and to the social media
platforms themselves (Heyman, Wolf & Pierson, 2014). Here users have no control
whatsoever: whatever they publish, whether public or private, is streamed directly to both.
This, the researchers say, is the most threatening part of social media (Heyman, Wolf &
Pierson, 2014). These third parties see users as mere assets and are in a position to collect
all of the users’ personal data without being barred or limited by any settings.
The three researchers refer to earlier research conducted by the European
Commission, the Eurobarometer. This survey of Europeans studied their opinions toward
sharing data and the effects those opinions had on their data-sharing habits on different
platforms. The European Commission found that 44% of respondents worried about their
data being collected without their consent, 38% feared it being shared without their
knowledge, 32% feared the threat of identity theft, and 28% feared it would be used for
advertising purposes (Heyman, Wolf & Pierson, 2014). All of these were distinct threats that
the European Commission obtained directly from users in the region, and none of them
could be controlled using the basic privacy settings that social media platforms provided.
This further exemplified the two types of privacy: one that is controllable and one that is at
the mercy of the platform and third parties.
Heyman, Wolf and Pierson (2014) adopt a definition of privacy as the right of
individuals to decide what information about them can be given to others. They say that
information disclosure must be governed by rules that delimit with whom the information
can be shared. According to the researchers, the privacy settings provided by social media
platforms are supposed to enforce these rules of disclosure. However, the platforms are said
to have taken advantage of being the system designers and severely limited the control that
users have over their data privacy. They have played an unfair game with the affordances of
the privacy settings. The researchers describe an affordance as the relationship between an
object and a user with regard to the properties of the object; a designer is in a position to
limit the functional properties of an object as they make it. This is exactly what social media
platforms have done: they have been clever in severely limiting the kind of privacy that a
user can control. A user can impose privacy controls against other users, but not against
third-party apps or the platforms themselves. These platforms have not provided any means
for a user to control the access of data by third parties.