Guiding the Future of Tech Policy: An Agenda for the New Government in India

Shashank Mohan, Sukriti / Jul 8, 2024

Today, Indians rely on the internet to receive government wages, obtain subsidies, and seek employment through gig work. Beyond these uses, the internet is a vital channel for underrepresented communities to express themselves and access knowledge. These are not new assertions, but they are worth re-emphasizing, as future regulatory guardrails for the internet will have an unprecedented impact on all spheres of society. While the government’s push to make the internet open, trustworthy, accountable, and secure is noteworthy, it must make rules that have clear objectives, are framed after wide consultation, and serve the public interest.

Clear objectives and robust public consultations

Tech policy solutions are often devised as quick responses to an identified or presumed harm but remain de-linked from clear goals. The government’s recent attempts to regulate generative AI models illustrate this. The Ministry of Information Technology’s AI advisory required tech companies to seek government permission before deploying “under-tested or unreliable” AI products, but the requirement was dropped in a revised advisory after pushback from industry. The revised advisory also required watermarking of AI-generated content to address concerns around deepfakes, even though watermarking is widely regarded as an ineffective solution to that problem. Impacted stakeholders were not consulted in this process, and the government has yet to address the concerns raised by companies and civil society. If one of the advisory’s goals was to tackle deepfakes online, it failed to do so.

Similarly, in a move now before the Bombay High Court, the government amended its rules to require online platforms to remove content flagged as misinformation by a government-appointed fact-check unit. After the High Court declined to grant an interim stay on the setting up of the fact-check unit, the Supreme Court stayed the move on appeal, citing its impact on online free speech. Despite the amendment’s negative effect on free speech, the government has maintained in court hearings that such a move is essential to combat harmful misinformation, without considering more effective measures such as countering misinformation with counter-speech or issuing official clarifications. Defining clear goals and weighing multiple approaches to a specific challenge will lead to robust, future-proof policymaking.

Regulation also cannot serve its purported objective without an adequate understanding of the subject matter; pre-legislative consultations and stakeholder feedback serve to fill this gap. Recently, however, bills have been drafted and introduced before Parliament without transparent public consultation. Consultation in policymaking is essential to address the information asymmetries that prevail across stakeholders. Stakeholder inputs should be collected, synthesized in reports or white papers, and made publicly available to enable engagement throughout legislative and policy processes.

Recent consultation efforts around the data protection law, the rules for internet platforms, and the draft digital competition law have been mired in opacity. Stakeholder inputs and identities have not been disclosed, and invite-only meetings with industry and civil society have become the norm. Such practices damage trust and dilute accountability. A cogent strategy for tech regulation must be built on robust public consultations.

In the public interest

As technology affects a cross-section of society, it is crucial that tech policy align with the public interest. Welfare mechanisms may be geared towards benefitting the public and still fail to serve its interests. For instance, in 2020, the government mandated blanket use of the contact-tracing app Aarogya Setu during the pandemic, even as the app raised a multitude of concerns around efficacy, data privacy, and consent. In response to the concerns raised, the government released a protocol in the form of an executive order containing guidelines for data processing and sharing, all without prior consultation, clarification, or legislative backing.

In another example, India’s recently enacted data protection law omitted the standard exemption for journalistic entities that protects their activities from arbitrary privacy claims. The government has not clarified why this provision, which appeared in previous drafts, was removed from the law. Balancing journalistic speech with privacy considerations is a critical public interest question that merits statutory intervention. Similarly, the rule requiring services like WhatsApp to enable tracing of the first originator of a message not only undermines citizens’ general privacy rights but is also, as multiple observers have pointed out, unenforceable given WhatsApp’s end-to-end encryption. The government must explain why such policy measures are essential and preferable to less rights-infringing alternatives that could meet the intended goals.

According to a recent report, the Ministry of Information and Broadcasting (MIB) has proposed to bring “professional content creators” within the ambit of the Broadcasting Services (Regulation) Bill. However, there is no clarity on how this category will be defined and qualified. Concerns have been raised about the implications of such regulation for free speech and diversity of viewpoints, both of which directly bear on the public interest.


India’s G20 presidency helped the world recognize it as a leader in tech deployment and innovation. However, policy decisions not firmly rooted in the public interest will create an unsafe digital ecosystem, hamper innovation, and discourage private investment. Trust and accountability help build better regulation and safer user experiences online. The government must invest in building relationships with civil society, academia, and subject-matter experts. To position India as a global leader, especially in the Global South, the new government must set clear objectives, conduct comprehensive consultations, and keep the public interest paramount.


Shashank Mohan
Shashank Mohan is a Programme Manager at the Centre for Communication Governance at National Law University Delhi. His work is primarily focused on privacy and data protection, data governance, intermediary liability, and e-governance.
Sukriti works as an Analyst at the Centre for Communication Governance at National Law University Delhi (CCG). She is interested in data protection and privacy, platform governance, and issues of digital rights and free speech.