Accountable Tech: Will the US take a leaf out of the Indian Playbook?
2024 is a decisive year for democracy and the liberal order. Nearly 1.8 billion citizens of India and the United States, who together constitute almost a fourth of the world’s population, will elect their governments in the same year. This will be the first such instance in a world increasingly mediated and intermediated by platforms, which will be crucial actors shaping individual choices, voter preferences, and indeed, outcomes at these hustings. It is, therefore, important to recognise these platforms as actors and not just benign intermediaries.
Prime Minister Modi’s government, especially in its second term, has approached digital regulation with the objective of establishing openness, trust and safety, and accountability. In June this year, the Union Minister of State for Electronics and Information Technology, Rajeev Chandrasekhar, invited public inputs on the draft amendments to the IT Rules 2021, with an ‘open, safe and trusted, accountable internet’ as the central area of focus.
This Indian aspiration for Accountable Tech must be an imperative for all liberal and open societies if we are to enrich the public sphere, promote innovation and inclusive participation, and indeed, defend democracy itself. If we fail to act now and act in unison, we could end up perverting the outcomes in 2024. Runaway platforms and cowboy capitalism are the big dangers to the sanctity of our elections and to citizens’ acceptance of political outcomes. India has clearly seen the need for such accountability and is striving to make large tech companies answerable to the geographies they serve. The latest to recognise its importance is the United States of America.
On 8 September 2022, the White House convened a Listening Session on Tech Platform Accountability ‘with experts and practitioners on the harms that tech platforms cause and the need for greater accountability’. The session ‘identified concerns in six key areas: competition; privacy; youth mental health; misinformation and disinformation; illegal and abusive conduct, including sexual exploitation; and algorithmic discrimination and lack of transparency.’ Hopefully, this session will lead to a more contemporary regulatory and accountability framework that aligns with what is underway in India.
Private censorship, unaccountable conversations hosted by intermediaries, and the propagation of polarised views all constitute a clear and present danger to democracies, and certainly to India and the US, which are among the most plural, open, and loud digital societies. Digital India is indeed going to be ground zero for how heterogeneous, diverse, and open societies co-exist online and in the real world. The efforts of the Indian government to put together sensible regulation may actually benefit many more geographies and communities. If India can create a model that works in its complex human terrain, variants of it would be applicable across the world.
It must also be understood that there is no single approach to managing platforms, even though there may be a wider, shared urge to promote openness, trust and safety, and accountability. The regulations that flow from this ambition are necessarily going to be contextual and country-specific.
Hence, it is important that India, the US, and other large digital hubs coordinate and collaborate with each other to defend these universal principles even as they institute their own region-specific regulations. For instance, policy architecture in the US will focus on managing platforms and technology companies operating under American law and consistent with America’s constitutional ethos. India, on the other hand, has the onerous task of ensuring that these same corporations adhere to Indian law and India’s own constitutional ethic.
India and the US lead the free world in social media users. As of January 2022, India had 329.65 million Facebook users, 23.6 million Twitter users, and 487.5 million WhatsApp users (June 2021), while the US had 179.65 million Facebook users, 76.9 million Twitter users, and 79.6 million WhatsApp users (June 2021). The online world is no longer a negligible part of society. Most people online see the medium as adding to their agency and are keen to use it to further their views and influence others’ thinking. Many of them are also influencers in their own localities. What transpires online now has a population-scale impact: the mainstream media takes its cues from it, and social media trends define the next day’s headlines and the debates on primetime television.
Thus, a casual approach to managing content on these platforms is no longer feasible and will have deleterious consequences, as recent developments have shown. Intermediary liability, which sought to insulate platforms from societal expectations, needs to be transformed into a notion of intermediary responsibility. It must now become a positive and proactive accountability agenda in which platforms become a part of responsible governance and responsible citizenship.
Predictable regulation is also good for business, and policy arbitrage harms corporate planning; so platforms, too, have a stake in making their boardrooms and leadership accountable. They must make their codes and designs contextual and stop hiding behind algorithmic decision-making that threatens to harm everyone, including their own future growth prospects. And this must be the ambition as we head into 2024, the year when technology could decide the fate of the free world.
Big Tech and the State: The necessity of regulating tech giants
The scramble for gold on the Internet has transferred control of vast swathes of cyberspace to a very small and select group: Big Tech. This has made ‘significant social media intermediaries’ highly profitable ad businesses that have grown amid non-existent privacy laws and weak intermediary liability laws. They make the market, grow the market, and shape market rules. No ad business, indeed no business in history, not even Big Oil or Big Tobacco, has held so much power over consumers and the economy. This perverse power is, perhaps, the single biggest challenge that nations and peoples will have to grapple with. Accountable Tech must be India’s leitmotif in 2023 as it presides over the G-20, and a robust digital republic its sovereign mission as it turns 75 next year. This will need sensible politics, sophisticated policies, and a return to first principles.
Concentration of wealth is a competition issue and an economic policy question. Left unregulated, it brings about inequality in income and opportunity. But concentration of power over discourse, over what is promoted, shared or suppressed, should be more worrying. Safe-harbour provisions in the United States, along with self-regulation principles, have allowed Big Tech to cherry-pick what is to be acted on and what is to be ignored, effectively making it the arbiter of permissible speech. For example, anti-vaccine Twitter users have thrived during the pandemic, while, at times, less dangerous actors have had their posts labelled. In January, Angela Merkel, Chancellor of Germany, denounced the de-platforming of then US President Donald Trump by Twitter. “The right to freedom of opinion is of fundamental importance,” Merkel’s chief spokesperson, Steffen Seibert, said. “Given that, the Chancellor considers it problematic that the President’s accounts have been permanently suspended.”
The issue here is not whether Merkel agreed or disagreed with Trump’s tweets. The question is: who censors him, how, and with what process and level of transparency? For the Chancellor, and for many others, Twitter cannot choose for itself when it seeks to be a provider of public goods and when it is a private messenger eligible for intermediary protections. When governments around the world describe digital connectivity as a ‘utility’, information lines cannot be disrupted by religious, cultural or ideological filters. Like water, electricity and roads, significant social media platforms will have to serve all, even those their management and owners disapprove of.
The instances when utilities (say, electricity and water) are denied or disconnected are specific, rare and regulated. Even in the information age, only the state and its three pillars have this right. Global Big Tech is not part of this constitutional arrangement. There are checks and balances in place, with legal recourse available to all within the state and to external actors as well. Any alternative to this constitutional setup would be akin to legitimising foreign influence operations in domestic affairs. At the extreme, for a country that is almost perpetually in election mode, it would be tantamount to election interference. This may seem like hyperbole, but it is closer to the truth than we suspect. For instance, if an electoral candidate makes an incendiary speech on a physical stage, the Election Commission, law enforcement agencies and the judiciary act against him, not the private company that has set up the stage or the power utility that has provided an electricity connection to the microphone. Is the online equivalent being honoured by Big Tech?
Regulation of Big Tech across democratic setups
Australia gets this. In February, it passed the News Media Bargaining Code. The code encourages intermediary tech firms to negotiate deals with media outlets, effectively mandating that Facebook and Google pay news firms for content. The law was passed after a protracted battle between the Australian government and social media firms. It escalated when Facebook removed content of certain Australian news agencies, several official government handles, emergency services, and civil society organisations from its platform. Prime Minister Scott Morrison held firm: “These actions will only confirm the concerns … about the behaviour of Big Tech companies who think they are bigger than governments and that the rules should not apply to them.”
Canada, too, is making moves to curtail the wealth and discourse monopoly currently enjoyed by Big Tech. Just this week, Canadian lawmakers passed Bill C-10, which seeks to regulate the kind of content media streaming services prioritise on their platforms. The Bill, which is yet to be passed by the Senate, aims to bring digital streaming platforms on par with traditional broadcasting services; the latter are obligated to increase the visibility and “discoverability” of Canadian content, and to set aside part of their profits to support a fund that promotes original Canadian productions.
Across the pond from the Americas, the European Union is also actively working towards mitigating the risks posed by the monopoly of Big Tech. Margrethe Vestager, Vice President of the European Commission for A Europe Fit for the Digital Age, has stated that tech giants, “have the power to guide our political debates, and to protect—or undermine—our democracy.” In December 2020, Vestager and her office tabled the Digital Services Act (DSA), which seeks a systemic assessment of the varied social, economic and constitutional risks posed by the services provided by Big Tech.
The most decisive move yet has come from Poland, which has proposed a law to ‘limit’ the censorship tendencies of the tech giants. Soon after the deplatforming of Donald Trump by Twitter, Prime Minister Mateusz Morawiecki wrote on Facebook: “Algorithms or the owners of corporate giants should not decide which views are right and which are not. There can be no consent to censorship.” The proposed law provides a special mechanism for those whose content or profiles have been blocked or deleted by social media platforms: they can complain directly to the platform, which is obligated to respond within 24 hours. After a review by a specially constituted “Freedom of Speech Council”, deleted content can be restored by order. If platforms do not comply, they can face a heavy fine of up to 50 million zloty (US$13.4 million).
Regulatory Frameworks in India
India, too, must take some tough calls. The vision of Digital India has advanced: from only four unicorn companies in 2014, India added 12 in 2020 alone. Regulation must keep pace with this economic and social reality. It is absolutely critical that the Personal Data Protection (PDP) Bill, currently being examined by a Joint Parliamentary Committee, be brought forth and enacted as law. Without the umbrella framework of the PDP Bill, India’s regulation of Big Tech will be ad hoc, and may be misconstrued as a political instrument.
With respect to regulating intermediaries, the Indian government initiated a public consultation process in December 2018, inviting submissions from the public to the Ministry of Electronics and Information Technology. A spectrum of civic, industry and academic actors participated. The rules were notified in February 2021, with clear compliance requirements to be met within three months. Yet the reaction of Big Tech platforms has been to delay, stall and obfuscate compliance.
It is high time that the actions of these companies were subject to systematic and rigorous Parliamentary oversight; but for that to happen, legislation is needed. Indian law and policy are rooted in our Constitutional principles. Indian policies on digital governance are no different, but they now need the imprimatur of Parliament to truly be effective. And should there be questions and grievances regarding the scope and constitutionality of the law, the courts of India will be the ultimate judge.
The objective of regulatory frameworks is to safeguard public interest, even (or perhaps especially) if it involves eroding the bottom lines of powerful vested interests. To once again quote the EU Commission’s Margrethe Vestager (in an intervention at a technology policy panel at the Raisina Dialogue earlier this year), regulating Big Tech “is a job, not a popularity contest”.
Perhaps the real limitation is one of our imagination. In our minds, Silicon Valley is forever a happy, sunshine place led by geeky, long-haired wunderkinds in t-shirts and flip-flops. The reality is that Big Tech’s instincts today are driven by a single-minded sense of territoriality and a collective impatience with different governance systems. For them, their ‘code is law’, and it is universal. That is at the crux of it.