
Universiteit van Amsterdam

Corporate strategy and organization design

Assignment 1 - Team 3

Fixing Facebook

Authors:
Alexander Post, 11607149
Marnix van der Linde, 12414166
Thom van Assen, 12486493
Christopher Liu, 11416726

2022-2023
Stakeholder analysis

Freeman (1984) defines a stakeholder as "any group or individual who can affect or is affected by the achievement of the firm's objectives" (p. 25). According to Freeman (1984), stakeholders can be divided into three groups: owners, primary stakeholders, and secondary stakeholders. Owners are those who bring the most important resources to the firm; they supply capital or equity and have a say in how the firm is run. They are residual claimants who bear risk and are vulnerable because of that risk, while holding the formal rights to the firm. Primary stakeholders are individuals, groups, or entities, such as employees, customers, suppliers, or creditors, that have a contractual relationship with the firm. They are an important group of stakeholders that define the business and are vital to its continued existence. They bear little risk but are vulnerable to the firm due to their market transaction-based relationship. Secondary stakeholders are those with no direct contractual connection to the company and no claim on earnings, yet they do bear risk and are exposed because of this. Their relationship with the firm is not formally governed in any way, since there is no direct connection to it. These third parties, or secondary stakeholders, are affected, for example, by corporate misconduct.

Mark Zuckerberg is the owner of Facebook and has suffered the most notable negative consequences. Users were concerned about their privacy, which led to reputational damage for Facebook and for Zuckerberg; he bore significant risk. Facebook itself, in this sense, is also considered an owner. However, Zuckerberg is responsible for Facebook, whereas the reverse does not hold. Facebook received a fine of 5 billion dollars from the FTC. In addition, users stopped using Facebook, and the firm had to comply with new regulations as a result of the misconduct. Jones (1995) states that ethical principles such as trust, trustworthiness, and cooperativeness can result in significant competitive advantage. Conversely, Facebook provides empirical evidence that a lack of ethical principles can result in the loss of competitive advantage.

Cambridge Analytica is a primary stakeholder: it had a contractual relationship with Facebook (users gave their consent, but did not know how their data would be used). The arrangement, however, violated Facebook's terms of service, while the data remained accessible to Cambridge Analytica, which bore little risk because the data belonged to Facebook's users. The consequence for Cambridge Analytica was increased brand awareness, although of a negative kind.

Ted Cruz is a primary stakeholder, as he actively purchased data for targeted advertising with the goal of influencing elections and voters. He assumed that these purchases carried little risk, as it was not publicly known that Cambridge Analytica had access to the data. Donald Trump, after Ted Cruz lost the election, copied Ted Cruz's approach with the same goal; he is therefore also a primary stakeholder. The consequence for both Ted Cruz and Donald Trump was reputational damage.

We identify the 87 million Facebook users whose data was accessed through Cambridge Analytica as secondary stakeholders; their data was used without consent. The consequences that befell this group due to the Cambridge Analytica scandal were the invasion of their privacy and a lack of respect for their interests. U.S. voters are also considered secondary stakeholders, because Ted Cruz's presidential campaign (and later Trump's campaign) held data from millions of Facebook profiles and used it to target advertisements in order to influence the election. As a consequence, many vulnerable voters were manipulated through the misuse of this data: they were enticed to vote for the campaigns' candidate or discouraged from voting for the opponent, which distorted the election results (Isaak & Hanna, 2018). The U.S. Senate is a secondary stakeholder group, although it could also be viewed as a primary stakeholder. The U.S. Senate could directly affect Facebook through regulation; in the Cambridge Analytica scandal, however, it was the party affected, through manipulated votes. It was vulnerable in this case because no regulation prevented online channels from influencing voters and, ultimately, the Senate.

Governance problems of Facebook

To identify the key corporate governance problems at Facebook and give recommendations on how to solve them, it is useful to explain the concept of corporate governance first. According to L'Huillier (2014), who compared different theories of corporate governance, there are multiple definitions of the term. For example, Millstein (1993) stated that corporate governance "is the mechanism through which the managers' control is monitored and held to fairly enhance corporate profit and shareholder gain". Another view is that corporate governance "is deemed a systematic provision of some measure of control over the actions of agents such as managers and subcontractors" (Boston et al., 1996). The definition that applies most closely to the case of Facebook is given by the Financial Reporting Council (2018): "The purpose of 'good' corporate governance is to facilitate (1) effective, (2) entrepreneurial, and prudent management that can deliver the (3) long-term success of the company."

One of Facebook's corporate governance problems can be found in its board structure and composition. According to the Financial Reporting Council (2018), an effective board is characterized by a strong chair and a board of directors with a balance of abilities, backgrounds, experience, and knowledge. This indicates that each director should have the ability to add value, even while being "thinly informed, under resourced, and boundedly motivated" (Gilson & Gordon, 2019). The size of the board is determined by the firm's scope and complexity. Looking at Facebook's board, its composition may not reflect the necessary knowledge and experience. Following the significant Cambridge Analytica scandal in 2016, the CEO's long-term board advisors turned away from the company, and it is possible that this board restructuring brought in new board members with less experience (Riess, 2022). This issue was also raised in an interview with Mark Zuckerberg, when he was asked whether Facebook had "just become too big and too vast and too consequential for normal corporate governance structures" (Yoffie & Fisher, 2019).

To solve this corporate governance problem, it is important to develop a governance structure for content and community that prioritizes community needs over short-term shareholder interests. Furthermore, a key component of such a governance structure would be a system for appealing content decisions, possibly overseen by an independent board (Yoffie & Fisher, 2019).

Another corporate governance issue that Facebook faces is a conflict of interest. Mark Zuckerberg controls a majority of the voting power and is focused on financial growth and revenue streams, while the users of Facebook want a safe platform where they can be assured that their personal information is not used for the wrong purposes and their privacy is protected. Facebook said that it would only share users' personal information with the people selected in their settings, but allowed itself to share personal information with third-party business partners (Wang et al., 2011). This way of operating is one of the reasons that, as of 2019, Facebook had over seven million advertisers and had more than tripled its average revenue per user compared to 2015 (Yoffie & Fisher, 2019).
The conflict of interest is also visible in the case of misinformation and fake news, which is most concerning in the context of politics. An example is the 2016 presidential election, in which critics argued that fake information about the election was spread on the platform and contributed to the victory of Donald Trump. Facebook would have earned money from spreading this fake news and would thereby have neglected its corporate social responsibility, although Zuckerberg issued a statement saying that the proportion of fake news was very small (Yoffie & Fisher, 2019).

A recommendation for Facebook would be to increase its level of transparency. Being transparent towards users about who gets access to their data and what can be done with it can help build trust and is an improvement for multiple stakeholders. Facebook also needs to make sure that its directors have all the information they need. In the long term, it has to evolve its business model into one that is focused on trust, which would mean making data security and privacy as important as monetization (Burt, 2021).


Reference list

Boston, J., Martin, J., Pallot, J., & Walsh, P. (1996). Public management: The New Zealand model. Oxford University Press.

Burt, I. (2021). Can Facebook ever be fixed? Harvard Business Review. https://hbr.org/2019/04/can-facebook-ever-be-fixed

Financial Reporting Council. (2018). The UK Corporate Governance Code. FRC.

Financial Reporting Council. (2018). Board diversity reporting. FRC.

Freeman, R. E. (1984). Strategic management: A stakeholder approach. Pitman.

Gilson, R. J., & Gordon, J. N. (2019). The rise of agency capitalism and the role of shareholder activists in making it work. Journal of Applied Corporate Finance, 31(1), 8-22.

Isaak, J., & Hanna, M. J. (2018). User data privacy: Facebook, Cambridge Analytica, and privacy protection. Computer, 51(8), 56-59.

Jones, T. M. (1995). Instrumental stakeholder theory: A synthesis of ethics and economics. Academy of Management Review, 20(2), 404-437.

L'Huillier, B. M. (2014). What does "corporate governance" actually mean? Corporate Governance.

Millstein, I. M. (1993). The evolution of the certifying board. The Business Lawyer, 1485-1497.

Wang, N., Xu, H., & Grossklags, J. (2011). Third-party apps on Facebook: Privacy and the illusion of control. In Proceedings of the 5th ACM Symposium on Computer Human Interaction for Management of Information Technology (pp. 1-10).

Yoffie, D., & Fisher, D. (2019). Fixing Facebook: Fake news, privacy, and platform governance. Harvard Business School.
