parties, yet they have been given access to more confidential data.
The three researchers take the time to explain the two types of privacy in more detail.
The first one, privacy as a subject, describes users as actors who make and manage
their online identities (Heyman, Wolf & Pierson, 2014). This type of privacy concerns only
the flow of information between people. It is the one that social media platforms have
genuinely tried to support by ensuring that users can control what others can see. To this end, the user has been
given an active role in managing this privacy by tweaking a few settings. A user
can limit who can view his or her profile, block users from sending messages, and
change these settings later on. The three researchers acknowledge, however, that this type of
privacy has its own problems, which are quite different from those of the other type. First,
the idea that a person creates an online persona using whatever data he or she pleases
introduces issues such as identity theft, the creation of false identities, and the inability to
control which aspects of private and public information are accessible to different audiences.
Users have to make tradeoffs between privacy and identity, and too much privacy would mean
seclusion, which is not why people join social media sites.
The second type of privacy, privacy as an object, is said to be the most controversial
(Heyman, Wolf & Pierson, 2014). The three explain that a user's data is stored in a database
containing the records of millions of other users. This data is referred to as big data, and it
can be mined with tools powered by complex algorithms through a process called Knowledge
Discovery in Databases. The trio confirms that this is the data from which social media
platforms make their money. The platforms collect and store virtually all the data a user
creates on them and use it to classify users into profiles. These profiles are then used to
bundle users into consumer groups whose likely purchases can be predicted. The sole purpose
of this profiling and bundling is to extract commercial value from the user data.
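To make this profiling and bundling concrete, the following is a minimal illustrative sketch, not taken from Heyman, Wolf and Pierson, of how user records might be grouped into advertiser-facing segments by age bracket, education, and location; all field names and values are hypothetical.

```python
from collections import defaultdict

# Toy user records standing in for the kind of profile data the review describes.
# All fields and values are hypothetical.
users = [
    {"id": 1, "age": 24, "education": "bachelor", "city": "Boston"},
    {"id": 2, "age": 31, "education": "master",   "city": "Boston"},
    {"id": 3, "age": 22, "education": "bachelor", "city": "Austin"},
    {"id": 4, "age": 45, "education": "phd",      "city": "Austin"},
]

def segment_key(user):
    """Bundle a user into an advertiser-facing segment:
    age bracket, education level, and location."""
    if user["age"] < 30:
        bracket = "18-29"
    elif user["age"] < 50:
        bracket = "30-49"
    else:
        bracket = "50+"
    return (bracket, user["education"], user["city"])

# Group user ids by segment; each segment stands in for an audience
# that could be offered to advertisers seeking that kind of consumer.
segments = defaultdict(list)
for user in users:
    segments[segment_key(user)].append(user["id"])

for key, ids in segments.items():
    print(key, ids)
```

Real platforms apply far more elaborate data-mining pipelines to the same end, but the grouping step conveys the basic idea of turning individual records into sellable audience segments.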
Users are effectively transformed into commodities and sold to companies seeking a
particular audience. That audience may be people within a certain age bracket, with a certain
level of education, living in a given place, and so on. There have been heated debates around
this type of privacy. Users complain that their privacy rights are violated from the moment
they are put into these databases. Social media
platforms are not fully transparent about the type of data they collect, how it is used, and
which third parties have access to this information. The three researchers describe users as
unpaid laborers: they take the time to create online personalities and generate the content
that keeps other users returning to the platforms, yet the platforms repay them by handing all
of their data over to third parties, with the users having no say over what is shared.
The three conclude by looking at Norman’s views of affordances. They say that social
media platforms should focus on using various affordances in the privacy settings to
empower users rather than disempower them (Heyman, Wolf & Pierson, 2014). They define
empowerment in this context as being in control of one’s life or having mastery over one’s
affairs. They share a low opinion of the current affordances that these platforms give their
users. They refer to earlier research which found that many people were still unable to change
their privacy settings due to a lack of knowledge or uncertainty. This, they say, makes it even
harder for users to control their subjective privacy. Secondly, users are said to have no
control at all over their objective privacy; this is where they are treated as commodities and
sold to different third parties. The three researchers share the results of their research on
Facebook, Twitter, and LinkedIn and call for the inclusion of more, but simpler, privacy
settings on the platforms. They also call for default settings that promote privacy.
In support of the recommendations listed above, John Drake proposed further ways to
protect social media privacy. In his 2016 research, he recommends that businesses avoid
violating other people's privacy merely for short-term goals (Drake, 2016). Drake's
recommendation may have been directed at the small businesses his research focused on, but
it can be extended to social media platforms. His research covered businesses with a tendency
to violate job applicants' social media privacy by intruding on their profiles to gather
and analyze data that they can use in hiring or termination decisions (Drake, 2016). A
comparable practice is already taking place in many social media companies, which invade
users' privacy simply to gather information they can use for advertising purposes. As Drake
(2016) warns, this serves a short-term goal and may have negative implications in the long
term. For instance, if users decide to abandon social media platforms associated with privacy
violations, the long-term business of those platforms will be lost. With the emergence of an
increasingly privacy-aware generation of Internet users, it is not far-fetched to imagine a
point where Facebook will have lost many users if it continues on its current path.
Drake (2016) also recommends observing user privacy even when users do not have a
distinct right to it. This is important because in many countries users are not granted an
explicit right to privacy and there is no constitutional provision for it. Drake is driving a
more ethical debate that encourages businesses to respect users' privacy even without an
obligation to do so. This recommendation ties in with the previous one: if social media
companies fail to respect users' privacy, there is no guarantee that those users will remain on
the platforms in the long term. A startup might create a privacy-aware platform that quickly
gains adoption, draining users away from the privacy-violating social media platforms. Lastly,
as a long-term solution, Drake (2016) says that there has to be respect for privacy and the
enforcement of individual rights. This recommendation brings into the picture the role of
legislators in different countries and regions. It is high time that legislation was enacted to
protect individual privacy. Privacy needs to be treated as an inalienable right unless there are
legal grounds to limit it. It is currently taken for granted, and even though some legislators
have taken up the issue, many others are yet to react. Social media platforms need
to be regulated when it comes to the boundary between their business interests and the
privacy interests of their users. Drake seemingly understands this, hence his recommendation.