The emergence of new technologies has digitalised markets, societies and nations. Once perceived as a strength, this proliferation of technology is now also a vulnerability. It has made tech-governance more political and social, and less about the traditional modes of regulation such as permissions, standards and tariffs.
India is among the most technology-adept nations, a function of its people’s comfort with IT products and services as well as its late-mover advantage. It must now engage with a spectrum of evolving needs around law and regulation. This is necessary to accelerate population-scale opportunities and address widespread risks.
Three sets of issues emerge here – understanding the nature of technology-linked risks; assessing the challenges to governance; and being imaginative in embracing new modes of regulation.
Let us begin with the risks, which themselves are creations of enhanced democratic access. For example, in roughly two decades India has added over a billion mobile phone subscribers, with over 50 per cent of them now using smartphones. This is transformational and unprecedented.
Improved access is credited with enabling financial inclusion, efficiency in education and healthcare, and fostering local e-commerce as well as global trade. However, a large user base is also a double-edged sword. As a result, corrective interventions need to be nimble and at digital velocity and population scale. Legacy regulation is simply ineffective.
This is best illustrated by the problems plaguing social media platforms. A 2021 study found a high rate of social media misinformation in India, and attributed this in part to the country’s higher Internet penetration rate, driven by smartphones. In June and July 2021 alone, Facebook received 1,504 user complaints in India – with a significant proportion of these related to bullying, harassment or sexually inappropriate content. Concerns are also emerging across other digital ecosystems, such as online gambling and crypto-assets. The mobile phone is a communication device, a crime scene and also an unsafe personal space.
Several state-level laws regulate or entirely prohibit betting and gambling. However, research suggests India is among the top five countries in terms of income potential from online gambling, and that the domestic online casino market may grow by 22 per cent each year. Residents of states such as Maharashtra, Telangana and Karnataka are among the most frequent visitors to online gambling websites. The market for illegal betting and gambling in India is highly lucrative, with some estimating its value at USD 150 billion.
Offshore gambling websites often channel black money, engage in illicit transactions and launder wealth through financial intermediaries. Their operators are invariably based outside India, which makes it difficult to enforce the writ of the state. Recent investigations by bodies like the Enforcement Directorate have revealed instances of locals being hired to open bank accounts and trade through various online wallets, revealing gaps in due diligence mechanisms.
For the digital economy to flourish, it is important to evolve approaches that help resolve systemic and structural risks. It is time to reassess what is good, what is bad and what is ugly in this new digital landscape. Online gaming and online gambling must not be conflated. Similarly, blockchain and sensible DeFi must not be clubbed with predatory crypto-gaming. After all, if we don’t embrace disruptive technology markets through sensible regulation, others will. A failure to capitalise may see India lose key avenues for economic growth and investment. India’s risk environment will then be shaped by external jurisdictions, some inimical to the country.
For instance, there are approximately 15 million crypto-asset investors in India, with total holdings of INR 400 billion. However, the regulatory and policy uncertainty has compelled crypto-asset entrepreneurs and exchanges to look to operate in more favourable markets. Exchanges such as Cryptokart, Koinex and ZebPay have exited the Indian market. ZebPay, for instance, is now headquartered in Singapore. In late 2021, many crypto-asset founders in India were considering moving their businesses to either the UAE or Singapore.
What we need today is new thinking and a new imagination of the digital world as not merely a virtual extension of the real, but an entirely different paradigm.
By banning cryptocurrencies altogether, nations such as China have missed the bus. India must leverage its position as the world’s third-fastest-growing technology hub and seize the opportunity created by Beijing’s command-and-control ethos, which is antithetical to innovation. India can and should become a global norms shaper in tech.
Tech regulation at population scale is akin to writing a new constitution for a digital nation. There needs to be a clear-eyed understanding of what is legal, what is illegal, and what may be illegal and yet requires regulation to serve and protect users and citizens.
To use a real-world analogy: since the 1990s, many countries, including India, have consistently distributed condoms and undertaken safety campaigns among sex workers without legalising prostitution, and have made safe syringes available to drug users without legalising drug use. For governments and regulators, the role is no longer one of a gatekeeper with the ability to prevent or permit activities online; it is becoming more of an ecosystem shaper and reducer of public bads.
By taxing cryptocurrency assets but not recognising these as legal tender, India has shown some welcome flexibility. It would do well to retain this nimbleness and become a co-curator of relatively safe tech platforms, services and products of the future that respond to Indian jurisdiction, rather than off-shoring the production of risks along with the rewards.
In the Indo-Pacific and beyond, China’s growth in capabilities and political authoritarianism are now threatening to alter how we engage with technology and digital domains. China believes it has the right to access other nations’ information and networks without offering up access to its own. This is not a simple techno-mercantilism. There is a single purpose to China’s deepening investments in existing and future technologies: furthering the agenda of the Chinese Communist Party (CCP).
For Beijing, technology is about both national security and ideology. Under Xi Jinping, it will use the information age to rewrite every assumption of the postwar period. Countries outside China must join together to seek open, safe and inclusive technology and digital platforms and products.
There are five main ways in which we can shape national, regional and global engagement with our digital world. These must also drive the purpose and direction of the Quad countries (the United States, Australia, Japan and India) as they strive to create a technology and digital partnership in the Indo-Pacific.
‘China tech’ was for the CCP initially about managing the social contract within China. Now, the CCP is weaponising and gaming other nations’ democracies, public spheres and open systems. It is creating a digital insurgency that allows it to delegitimise its opponents on their own political turf. This goes beyond episodic interference in elections. The CCP uses American forums such as Twitter and Facebook to critique the domestic and foreign policy of nations such as India. Wolf warriors seek to shape the information space internationally while China and the CCP remain protected behind the Great Firewall. The unimpeded global access China is allowed under some perverse notion of free speech must be questioned; internet propaganda endorsed by authoritarian regimes cannot and should not go unchecked. As a first step, the world will have to embrace a political approach to repel the digital encroachments we are witnessing. The European Union offers a model – just as its General Data Protection Regulation sought to rein in the US technology giants, we need laws that limit China’s access to the public spheres of open societies, thereby curtailing its global influence.
Today, all digital (silk) roads lead to Beijing. Many developing countries rely on China for their technology sectors. From control over rare earths and key minerals to monopoly over manufacturing, China commands the digital spigot. The Quad countries and others in the Indo-Pacific must seek and encourage diversification. Affordable, accessible products and innovations must emerge in the digital space. From resilient supply chains to diversity of ownership, a whole new approach is needed to prevent the perverse influence of any single actor. This is the second way to shape global patterns of digital engagement.
The Chinese under Xi have embraced the dangerous essence of the Chinese phrase ‘borrowing a boat to go out to the sea’. The CCP has essentially borrowed all our boats to further its agenda.
Universities in the developed world, their media, their public institutions and even their technology companies are serving and responding to missives from the Middle Kingdom. Many journalists have exposed the Western media’s promiscuous entanglements with a Beijing that artfully co-opts them into its propaganda effort. In the digital age, this cannot be ignored. Countries will soon be faced with a digital fait accompli – signing on to Pax Sinica. As a third way to enhance engagement, it is time to protect liberal institutions from their own excesses.
China has attempted to internationalise its currency with the launch of its own digital currency. After banning financial institutions and payment companies from providing crypto-related services in May, China launched a crackdown on computer-powered crypto mining in June, and a blanket ban on all crypto transactions and mining in September, clearing the way for its digital renminbi (digital RMB). With the development of its own central bank digital currency, the Chinese government will now have the power to track spending in real time. It will have access to the entire digital footprint of a citizen or a company. This will provide Beijing with an unprecedented vault of data, which it can use to exercise control over technology companies and individuals.
The rise of China’s digital RMB has the potential to challenge the status of the American greenback. For decades, the US dollar has been the world’s dominant reserve currency. Yet countries such as Iran, Russia and Venezuela have already begun using the Chinese yuan for trade-related activities or replacing the dollar with the yuan as reference currency. A centrally controlled digital RMB may also give Beijing new leverage over the ‘Impossible Trinity’ – the proposition that no country can simultaneously maintain free capital flow, a fixed exchange rate and an independent monetary policy. It is a matter of time before it uses currency as part of its wider geopolitical plans. And with its past experiments with many countries on ‘trade in local currency’, it will have the capacity to create disruptions in the global monetary system. This can only be countered with two measures: one, depoliticising the existing dollar-led currency arrangements (the tendency to weaponise the SWIFT system – a giant messaging network used by banks and other financial institutions to transmit secure information – and to employ ad-hoc economic sanctions); and two, investing in the economic future of the emerging economies that currently depend on China.
Lastly, China is seeking technological domination not only terrestrially but also in outer space. China has invested considerably in space technology and engages in counterspace activities. These include suspected interference in satellite operations, both through cyberattacks and ground-based lasers. There are growing fears that Chinese technologies developed for ostensibly peaceful uses, such as remote satellite repair and cleaning up debris, could be employed for nefarious ends. Inadequate space governance mechanisms are an opportunity for the Quad to develop situational awareness in the space realm to track and counter such activities, and to develop a new set of norms for space governance.
The Quad’s agenda is prescribed by China’s actions. It will have to be a political actor and have the capacity to challenge China in the information sphere and the technology domain. It will need to be a normative power and develop ideas and ideals that are attractive to all.
From codes and norms for financial technologies to the code of conduct for nations and corporations in cyberspace and outer space, the Quad has the responsibility and opportunity to write the rules for our common digital future.
The Quad will also have to be an economic actor and build strategic capacities and assets in the region and beyond. It will have to secure minerals, diversify supply chains and create alternatives that ensure the digital lifelines are not disrupted.
Most importantly, the Quad will need to be an attractive partner for others to work with. This is its best means to counter China’s dangerous influence.
This commentary originally appeared in The Sydney Dialogue.
World Economic Forum, 16th April 2018
Three of the fastest-growing applications of artificial intelligence (AI) today are a manifestation of patriarchal stereotypes – the booming sexbots industry, the proliferation of autonomous weapon systems, and the increasing popularity of mostly female-voiced virtual assistants and carers. The machines of tomorrow are likely to be either misogynistic, violent or servile.
Sophia, the first robot to be granted citizenship, has called for women’s rights in Saudi Arabia and declared her desire to have a child all in the span of one month. Other robots are mere receptacles for abuse.
The Guardian in 2017 reported that the sex tech industry, including smart sex toys and virtual-reality porn, is estimated to be worth a whopping $30 billion. This is just the beginning. The industry is well on its way to launching female sex robots with custom-made genitals and even heating systems, all in the quest to create a satisfying sexual experience.
The advent of sexually obedient machines – which are designed to never say no – is problematic not just because of the presumption that women can be replaced, but because the creators and users largely tend to be heterosexual men. The news of a sex robot being molested, broken and soiled at an electronics festival in Austria last year should hardly come as a surprise.
While one could argue that the use of sexbots could reduce the abuse and rape of women in real life, it is undeniable that these machines are created to serve some men’s perverse needs. The machine represents a new wave of objectification, one that could potentially exacerbate violence against actual women.
The arrival of lethal autonomous weapon systems (LAWS) or “killer robots”, on the other hand, threatens to de-humanize both the male and female victims of war.
On the one hand, autonomous weapons could reduce the number of humans involved in combat and even decrease casualties. Killer robots could ultimately bring down the human cost of wars. On the other hand, they could downplay the human consequences of combat — and indeed, violence itself — and lower the threshold for armed conflict.
Several states are calling for a pre-emptive ban of killer robots, afraid that they might lead to an AI arms race, one that could raise the risk of a violent clash. Should machines be allowed to make life and death decisions? Or should this choice stay in the hands of humans, however fallible they may be?
Finally, the development of voice assistants and Internet of Things (IoT) devices for the care of children and senior citizens is a market that is expected to expand rapidly in the next decade. The presumption that machines cannot replicate the emotional intelligence that care workers possess is increasingly being challenged with the emergence of smart tracking devices and health monitors that can observe and predict behaviour.
These innovations no doubt share their underlying technology with other voice-driven platforms such as virtual assistants – platforms often designed to mimic servility and subservience. It is no coincidence that voice assistants are often designed to sound and act feminine – take Apple’s Siri (in its original avatar) and Amazon’s Alexa. The more recent development of a ‘male’ option for voice assistants does not change the overall picture of male dominance and female servitude in AI.
When chatbots and voice assistants are fed on a diet of data assembled by male coders, machines perpetuate inequities found in the real world. This can have unintended consequences.
Given the general uncertainty surrounding the impact of AI on the real world, the responsibility of creators as well as broader communities merits all the more attention.
Today, machines reflect regressive, patriarchal ideas that have proven to be harmful to society. If this continues, technology may no longer usher us into a post-gender world. In fact, like all bad doctrines that have held communities back, biased codes may just institutionalize damaging behaviour.
Perhaps the involvement of more women and marginalized communities in the creation of AI agents could deliver the equity that we desire in future machines, and prevent the development of more patriarchal technology. If the machine is patriarchal, do we remove the more systemic condition of patriarchy or reduce reliance on the machine altogether? Both are easier said than done.
To build an equitable world, which will be inhabited by women, men and machines, the global community needs to script norms around the fundamental purpose, principles of design and ethics of deployment of AI, today.
Autonomous systems cannot be driven by technological determinism that plagues Silicon Valley – instead their design should be shaped by multiethnic, multicultural and multi-gendered ethos. AI and its evolution, more importantly, needs to serve much larger constituencies with access to benefits being universally available.
The administration of AI applications cannot be left to the market alone. Experience tells us that the market often fails and is regularly compromised by perversion and greed. History teaches us that when governments control, constrain and constrict innovation, they produce aberrant outcomes that are far from ideal. Norms developed by communities, instead, provide a workaround. We must promote norms that manage these technologies, make them available to those who need them most, and ensure a gendered development of this space led by a multistakeholder community that includes voices from outside the Atlantic consensus.
Samir Saran, Vice-President, Observer Research Foundation (ORF)
Madhulika Srikumar, Junior Fellow, Observer Research Foundation
New technologies are radically transforming our idea of community – and subsequently statehood. What will future states look like? And what should they look like? Disaster researcher Malka Older, author of the highly acclaimed cyberpunk thrillers “Infomocracy” and “Null States”, will discuss digital governance with Shoshana Zuboff. Since the early 1980s, Zuboff’s career has been devoted to the study of the rise of the digital and its individual, organizational and social consequences. She coined the term “surveillance capitalism” and is now working on her book “Master or Slave? The Fight for the Soul of Our Information Civilization”. This final discussion about new forms of digitalized governance and its impact on the individual will be moderated by one of the world’s leading experts on cyber security, Samir Saran, Vice President of the Observer Research Foundation in Delhi.
Economic Times, August 29, 2017
The Supreme Court’s verdict affirming the fundamental right to privacy should not come as news to technology companies. The court merely codifies what should have been an article of faith for Internet platforms and businesses: the user’s space is private, into which companies, governments or non-state actors must first knock to enter.
The technical architecture of Aadhaar and its associated ecosystem, too, will now be tested before a legal standard determined by the court. But GoI should see this judgment for what it is: a silver lining. The verdict bears enough hints to suggest the court sees the merits in a biometrics-driven authentication platform.
In fact, Justice DY Chandrachud points to the possibility of better governance through big data, highlighting that it could encourage “innovation and the spread of knowledge”, and prevent “the dissipation of social welfare benefits”. The court’s words should spur GoI to create a ‘privacy-compliant Aadhaar’.
But this requires systematic thinking on the part of its architects. The private sector, too, will have to put ‘data integrity’ and privacy at the core of their consumer offerings and engagement.
For starters, GoI must account for Aadhaar’s biggest shortcomings — its centralised design and proliferating linkages. A centralised database creates a single, and often irreversible, point of failure. GoI must decentralise the Aadhaar database.
Second, Aadhaar must be a permission-based system with the freedom to opt in or out, not just from the unique identification (UID) database but from the many services linked to it. This must be a transparent, accessible and user-friendly process.
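The permission-based model described above can be sketched in code. The following is a minimal illustration only — the class and method names are hypothetical, not the actual UID architecture — showing the core property such a system needs: every service linkage requires an explicit opt-in, defaults to deny, and can be revoked by the user at any time.

```python
class ConsentRegistry:
    """Hypothetical sketch of a permission-based linkage registry."""

    def __init__(self):
        # user_id -> set of services the user has explicitly opted in to
        self._links = {}

    def opt_in(self, user_id: str, service: str) -> None:
        # Linkage exists only after an explicit, recorded opt-in.
        self._links.setdefault(user_id, set()).add(service)

    def opt_out(self, user_id: str, service: str) -> None:
        # Revocation must be as easy as consent; discard() is a no-op
        # if the link was never created.
        self._links.get(user_id, set()).discard(service)

    def is_authorised(self, user_id: str, service: str) -> bool:
        # Default-deny: no recorded consent means no access.
        return service in self._links.get(user_id, set())


registry = ConsentRegistry()
registry.opt_in("user-001", "payments")
print(registry.is_authorised("user-001", "payments"))  # True
registry.opt_out("user-001", "payments")
print(registry.is_authorised("user-001", "payments"))  # False
```

The design choice worth noting is the default-deny rule: absence of a consent record is treated as refusal, which is what distinguishes an opt-in system from the proliferating, assumed linkages the article criticises.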
With a ‘privacy-compliant’ Aadhaar, GoI would not merely be adhering to the Supreme Court verdict, but also be on the verge of offering the world’s most unique governance ecosystem. Take Beijing’s efforts, for instance.
In 2015, the Chinese government unveiled a national project to digitise its large, manufacturing-intensive economy and to create a digital society. The ‘Internet-plus’ initiative aimed for the complete ‘informationisation’ of social and economic activity, and harvest the data collected to better provide public and private services to citizens.
China has no dearth of capital or ICT infrastructure. But the ‘Internet-plus’ initiative has struggled to take off in any significant way. The project suffered from a fundamental flaw: Beijing believed that by gathering information — from personally identifiable data to more complex patterns of user behaviour — the State would emerge as the arbiter of future economic growth, consumption patterns and, indeed, social or political agendas.
If a project like Aadhaar is to succeed, its underlying philosophy must be premised on two goals: first, to increase trust and confidence in India’s digital economy among its booming constituency of Internet users; and second, to ensure that innovations in digital platforms also result in increased access to economic and employment opportunities.
A privacy-compliant Aadhaar creates trust between the individual and the State, allowing the government to redefine its approach to delivering public services. The Aadhaar interface, on which the Unified Payments Interface (UPI) and other innovations rely, could well generate a ‘polysemic’ model of social security, where the same suite of applications caters to multiple needs such as digital authentication, cashless transfers, financial inclusion through a Universal Basic Income, skills development and health insurance.
But such governance models should not be based on a relationship of coercion or compulsion. It is heartening that India’s political class has embraced the court verdict.
A key reform missing in current debates about the UID platform is GoI’s accountability for its management. Aadhaar, to this end, should have a chief privacy officer who will be able to assess complaints, audit and investigate potential breaches of privacy with robust autonomy.
A privacy-compliant Aadhaar, with a bottom-of-the-pyramid financial architecture, would inspire confidence in other emerging markets to also adopt the platform, with Indian assistance. Companies and platforms must internalise that black-box commitments to privacy and data integrity may no longer suffice. These commitments must be articulated at the level of the board, and communicated to each user that engages with them. Overseers of data integrity must be appointed to engage with users and regulators in major localities.
The writer is Commissioner, Global Commission on the Stability of Cyberspace