Mr. Brijesh Singh, Principal Secretary, Government of Maharashtra, on algorithms, cyber operations, and the lessons for India's security
Happy to be among such a fine group of professionals. I find many old friends here in the audience. I was reading about Rotary, and Rotary talks about truth, peace, and fairness. Now, these are the three things under attack in this world of post-truth today. And how did this happen? How did we land here?
I’ll start with a strange biological phenomenon of the grasshopper and locust. If you look at the grasshopper, it’s a very solitary insect, green, a small insect. It doesn’t like to interact much with others. And then you look at the locust. This is like the bouncer variety, right? It’s got stripes, it’s big and it’s bulky. Now the irony is that they are the same individual, right? The locust is the grasshopper, and the grasshopper is the locust.
How does it happen that a small, solitary insect gets transformed into this huge thing which goes on to destroy agriculture and, within a day, eat tons of material? What happens is that when grasshoppers are put in a small confinement and start touching each other, there's a blip in their brain, and a chemical called serotonin is released. So every time grasshoppers bump into each other, there's a blip of serotonin. In some time this reaches a threshold, a biological switch flips in the brain, and the grasshopper starts transforming into this transformer variety, which is so dangerous.
Now, you understand where I'm going with this. Our brains are so welded to this machine in our hands that we're getting the same blips. With us, it's dopamine. Now, your phones and social media hyper-personalise, right? They know everything about you: what time you eat, what time you get up, what stage of life you are at, how you spend, what you are going to purchase, and so on. And since you are the product, this information is sold and utilised.
And these algorithms are getting developed more and more. The irony is that this is okay for e-commerce, this is okay for entertainment. But as soon as it enters the arena of politics, it finds people with similar views, it creates an iconography, it creates an urgency. So, all the aspects of mobilisation, which earlier could take years for a protest, happen in a matter of days. Look at country after country in the so-called Gen Z protests. Just because there are woods, it doesn't mean that somebody should light them up.
Every democracy has grievances. See, in every democracy, the margin between the ruling and the losing party is hardly two to three per cent; nearly 50 per cent of the people didn't want the government that is in place. That's the nature of democracy. And dissent is the very soul of democracy. However, if you weaponise this dissent, then you're looking at collapse.
See, if you start from Israel, and then Iran, Afghanistan, Pakistan — then you cross India, Myanmar, Bangladesh, and up to China, there's hardly any democracy. There's only one democracy on this side of the world, and we are one of the largest and oldest democracies. It is time for us to understand what is happening with social media algorithms and to protect this democracy.
And I'm very happy that I'm getting a chance to talk to this group of professionals, because you would understand that somewhere, the language you speak and the language your children speak are very different. Your views and their views on anything political are very different, right? Let us say, the young are supporting Mamdani now, the mayor of New York. Why are children in Mumbai so bothered about the mayor of New York? I don't understand. They probably don't know who is going to be the mayor of Mumbai or the Vice President of India, but they are bothered about Mamdani. Why? Because their mind, their brain, is getting wired. They are getting this information continuously.
So, I recently said at a cybersecurity seminar for children: kids, we are sorry we abandoned parenting to this device. And now we are unhappy that this device is parenting our children, shaping their ideologies and interests.
In any scenario… for example, if you analyse Nepal, there were grievances, and there are grievances everywhere. These grievances get picked up by social media platforms. You'll be amused to know that on social media platforms, anger is promoted five times more than peace. Sadness is promoted only when weaponised. So social media algorithms are primed to take a grievance and multiply it, and generally you get a feed called "For you", right? It understands who you are. Whether you are a right-wing supporter or a woke left-wing supporter, you will get only that kind of feed. It is tailored to you. It has formed a digital twin of you. It knows.
And as soon as there's a grievance, it takes that grievance and starts spreading it to similar people. Take Nepal, for example. A group of people sat in one place, systematically created messages, and started sending them. The algorithm thought something natural was happening, so it took them and multiplied them. As soon as it did, they were on everybody's feed, along with particular memes. The Nepal thing started with the sound of a gunshot: a very small seven-second clip, and later on, people came to know that it was AI-generated.
But that clip angered the youth so much that this grasshopper who was playing on his phone suddenly became the locust. He came out on the street with Molotov cocktails. They went down to burn the Prime Minister’s house and the ex-Prime Minister’s wife died in that. This is the danger we are facing everywhere in democracies. This doesn’t happen in totalitarian states, except when they are probably collapsing, let’s say, like Iran.
See, technology is being used on both sides, but probably this genie is out of our hands because none of these algorithms are under control. The platforms are opaque. They don’t tell you what algorithms they are using. They don’t tell you what they are promoting. And if you legitimately talk to them about takedown, they start calling you enemies of democracy.
So, once I was in a debate with a Google representative on TV. He said that there are so many takedown requests from India and this is undemocratic. I said, hold on. We function under the scheme of the Constitution, where even for the right to speech or freedom of speech, there are conditions. Freedom of speech doesn’t mean that I can speak anything anywhere all the time. There are limitations to every right. No right is absolute because rights exist in a framework.
We exist in a constitutional framework, and if there's a request from a government for a takedown, it is a legitimate request under due process of law, which is auditable, transparent, and appealable. I said, tell me, what is your mandate? Under which scheme of the Constitution is Google or Twitter or Facebook operating an algorithm? If there's algorithmic transparency, are you telling anybody what you are promoting and what you are not promoting?
You'll be surprised to know, and my friend from Taiwan would probably support me on this, that TikTok shows separate feeds to people in America and people in China. In China, you get content about nation building, working hard, and paying attention to your body and your mind. And in the US, they are going mad creating some viral dance move or something like that.
So the thing is that the kind of attention we pay to devices differs greatly across age groups. We are living in absolutely different worlds, especially the younger generation. They are on different social media apps, which we probably don't use, and they are digital natives, so their minds are much more welded into them. And imagine this: it is a grand scheme of puppetry, where somebody can decide that if they want to create chaos in a country, they can simply change the algorithm. Just like tweaking a dial, and there would be riots.
Let me tell you something very surprising. There is a CIA venture capital firm, okay, not the NSA, not the US government, a CIA venture capital firm, and it has seeded a lot of companies, beginning with Google. The name of that CIA venture firm is In-Q-Tel. In-Q-Tel is the firm behind Palantir, Android, Recorded Future, Google, and many other companies like this.
So the US does its job very properly. It has a kind of narrative control of the world. It does this very properly, and it's in their national interest to do so. What I intend to say here is that we, as a country, have to understand that we cannot let our youth and our minds be controlled by algorithms run from somewhere that neither has a mandate here nor is aligned with our national interests.
Now, coming to cybersecurity, look at how Nicolás Maduro, the President of Venezuela, was picked up. Before 150 helicopters went in, the radar systems went blind. Imagine this: the radars didn't see 150 helicopters coming in. The Russian S-300 and S-400 batteries, which we also use, were paralysed; they didn't fire a single missile. And lo and behold, the whole power grid also went down. This was all done remotely, as a cyber operation.
So cyber, rather than just being for intelligence gathering or recon, has become the main weapon in any conflict. Imagine that tomorrow you go to fight a country and nothing works; everything is shut down. And China, if one treats China as an adversary, has exceptional cyber capabilities. The thing is that it's easier to cultivate these capabilities in a non-democratic country.
So I was talking to kids in Pune yesterday, and I said, if I go to the accountant and say that I want to develop malware to bring down power grids, nobody's going to allow me. In a democracy, the auditor and the accountant will not allow me to create malware like this. But in China, they have specific groups: somebody working on routers, somebody working on industrial control systems, and so on. In the US, they discovered that there were backdoors even in the F-35 chips.
So probably somebody like China could have a remote control of the world, and they could wilfully shut things down the way the US did in Venezuela. So I think cybersecurity, rather than being a business question, is an existential question, and it has today become the main weapon of war. And in this theatre, thought has to be given at all levels.
Wonderful work is being done on the fintech side by the RBI as a regulator. The RBI is really on top of its game, giving the right directions and getting things implemented, because digital systems work on trust. And today, in terms of volume and number of transactions, UPI is much larger than Visa, MasterCard, everybody combined.
So, imagine this: if trust in this digital system goes away, our financial system will collapse. And there are constant attacks. You'd be amazed to know that the attacks on our critical infrastructure number in lakhs per minute, and they're geopolitically oriented too: as our terms with other countries change, you see the nature of the attacks changing.
It is upon us to urge the government and industry bodies to create policies which ensure resilience: resilience of society in terms of what algorithms are being fed to our people, resilience of our infrastructure, of our industry, of our financial system. And I think it's a shared responsibility.
It's slightly technical, but it is time to co-create policy; for every policy there should be consultation. And you see more and more that the government now works in a policy-sandbox kind of way, where it issues a policy, asks people for feedback, and modifies it along the way. I'm seeing this again and again: wherever there are critical policy issues or laws, there is a lot of public consultation.
So, the whole point here was to say that we created technology, but technology has become a sovereign. It is a genie out of the bottle, and coupled now with AI, it can do anything. The other day, a friend of mine made a WhatsApp video call to me with me speaking from the other side, and it looked real. He said, okay, give me five minutes and I can clone your voice also; I just need a ten-second clip of your voice and you'll be speaking on the other side. Imagine this.
There have been multiple cases where businesses have been impacted, where people have made Zoom calls with the chief executive apparently speaking from the other side, giving commands. Two companies, one from Hong Kong and one from the UK, lost 30 million dollars each. And this happened about three years ago, with people impersonating executives on a live Zoom call.
So, this is where technology is going. There are great advantages to technology, but somewhere, as a society, we should sit down and build guardrails, ethical guardrails. And I think that is where we can use the principles of Rotary. Thank you so much.
ROTARIANS ASK
Brijesh, that was fabulous, really eye-opening. One question: Australia is experimenting with a social media ban for children. Do you think something like that would work for India?
I'll tell you something very surprising. I had gone to a kids' workshop, and there must have been 1,000 children in the room, and the organisers had given them these thumbs-up and thumbs-down cards. So this was the first question I asked. I said, Australia banned social media. Should it be done for you? Unbelievably, most of them said that it should be banned. I was very surprised. I didn't expect this. They said, we don't want it.
But what has generally been seen after bans is that technological bans are difficult to implement and workarounds are very simple. If you geofence, they're going to use VPNs. TikTok was banned in India, but kids then asked their parents for VPNs. Otherwise, there's a greater danger that you'll have to implement a universal authentication regime, which will spark fears of surveillance, because you'll have to authenticate everybody. Or children will start asking their parents to make fake accounts and impersonate them.
So banning, I think, is a difficult idea, but there definitely needs to be some kind of control, and that has to come through platform accountability. And I think that is where we are failing. The platforms are not accountable. If you try to upload a video with any copyrighted music, it doesn't allow you. But other things it allows. It allows the overthrow of governments, but it doesn't allow you copyrighted music.
So, I think it's for the platforms to take a serious look. Country after country has had its parliament speak to these platforms, but it seems they've become supranational and follow their own foreign policy, and somewhere we will need to build laws.
So, in 2021 India created a law, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, which set up a three-tier system of self-regulation. So that if, let us say, somebody's pictures have been morphed and circulated, they can go directly and appeal to the platform, and the platform has a responsibility to bring them down within a particular time frame. So it's not that India has started controlling the platforms; it has created a mechanism where they are accountable. And that mechanism, I think, is working well. Thank you.
Moderator
I have two questions for you. First, how secure is WhatsApp's encryption? They keep saying, even we don't know what you're writing and saying. Is that true in real life? And second, how much cyber intelligence was used in the recent India-Pakistan skirmish?
So, on your first part, you have to understand encryption. End-to-end encryption means that when I send a message to you, only you or I can read it. Even if someone mounts what is called a man-in-the-middle attack, let's say intercepting everything that goes from my device to your device, they would not be able to decrypt it. Even WhatsApp is not able to decrypt it. So in that way it is secure.
But see, metadata is available. Metadata, in the sense of who called whom, who sent a message to whom, is available. The content of the message is not available. However, if one of the devices is obtained, then the decryption keys are on the device. So let’s say in case of any proceeding, if one of the parties anywhere hands over a device, then that device can be decoded, and all the messages can be obtained, and used in a court of law.
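The property he describes, that an interceptor sees the traffic but cannot decrypt it while a seized device gives everything up, can be illustrated with a toy Diffie-Hellman key exchange. This is a minimal sketch for illustration only: real messengers such as WhatsApp use the Signal protocol with elliptic-curve keys, not these tiny parameters.

```python
import secrets

# Toy public parameters; everyone, including an interceptor, knows these.
# (Far too small for real security; illustration only.)
P = 2**127 - 1   # a Mersenne prime
G = 5

# Each device keeps a private key; only the public half ever goes over the wire.
alice_priv = secrets.randbelow(P - 2) + 2
bob_priv = secrets.randbelow(P - 2) + 2
alice_pub = pow(G, alice_priv, P)   # this is all a man-in-the-middle sees
bob_pub = pow(G, bob_priv, P)

# Each end combines its own private key with the other's public value,
# and both arrive at the same shared secret used to encrypt messages.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

# A passive interceptor who recorded alice_pub and bob_pub would have to
# solve the discrete-logarithm problem to recover a private key. But
# anyone who obtains a device gets its private key, and with it the
# shared secret, which is exactly the "seized device" case above.
```

The design point is the asymmetry: the network carries only public values, so interception yields metadata, not content, while physical access to either endpoint yields the keys.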
But no interception is possible. This is what we in law enforcement classically call the "going dark" problem. Let's say a group of scamsters or paedophiles is using WhatsApp: we don't know what they're doing. We have absolutely no idea of the content because it's encrypted, unless and until you have access to a device, which is very difficult and happens only at a later stage. So investigation is becoming more and more difficult.
It preserves privacy, which is absolutely essential for various things, but it is also a double-edged sword, as it creates issues.
Now, about Operation Sindoor, I think the cyber component was more in the disinformation. Just after the Pahalgam attack, within four hours, there were thousands of Pakistani accounts tweeting that it was an insider job. Now, two or three months back, Twitter changed its policies and started showing the real location of accounts. You can go to a profile and find out where it is located; Twitter has added this new functionality.
And then you found that much of the dissent in India, right from the farmers' protest to various other things, was being run from Pakistan, because they can impersonate us. Many of the most popular handles were actually Pakistani. But see, in a non-democratic country you can do this; you can have the money for it and set something like this up.
A democracy is like a large vegetarian dinosaur. It doesn’t have fangs, it cannot bite, it cannot eat up anybody, so it can only have a thick skin.
Mr. Singh, it’s comforting to know that there are erudite people like you working for the Maharashtra Government who are well-versed in cybersecurity and hacking. I’m just curious, as Principal Secretary to the Chief Minister of Maharashtra, is this your core responsibility, or is this partial responsibility?
So actually, I'm Principal Secretary, and I handle Information and Public Relations, one department. I was Principal Secretary to Eknath Shinde, and there I was handling that department as well as my present role.
But in my earlier stint with Mr. Fadnavis, I was handling the Department of Information and Public Relations, and I was also handling cyber. Maharashtra Cyber was created at that time, and Maharashtra was the first state to create forty-seven cyber labs in one go.
So the Chief Minister had the vision at that time. In 2015, he gave us 1,000 crores and said, go and get me the best system. We toured the world: we met the FBI in Sacramento, we went to Estonia, to France, to Israel, everywhere. And we created forty-seven cyber labs which, at the time, were better than anybody else's in the world, because our procurement was fresh.
And just one small thing. Everybody has a very bad opinion of the police because of the movies, where we are represented as simpletons and idiots who arrive after the crime has happened. Now everybody asked, how are you going to train your people, because this is expensive equipment? You're giving a crore rupees' worth of equipment to a rural district like Jalna, Bhandara or Gondia.
So I took a call. I said, I'm not going to train anybody. People use Android phones; everybody uses end-to-end encryption, disappearing messages, status changes. For using a machine which has a menu, why do you need training?
So, I did an experiment: I gave all my equipment to my people for three months to play with. I said, go play. You will not believe it: within three months my constables were pros. And see, though the general requirement for the constabulary is supposed to be 12th pass, we are getting good-quality graduates. Within three months, these boys could sit in a room and talk to anybody.
Not only that, they were finding bugs and making feature requests, and within months of the equipment being installed, they were using it without any training. So it was very heartening to see that people could learn on their own.
Hello. I'm Rajni from Australia, actually a Rotarian, but visiting. I work in cybersecurity, so I do understand, and it's so heartening to see the brilliant minds here in India working on this.
My question will come a bit later, but I just wanted to comment on that mandate you spoke about, Australia being the first country to ban social media for children under 16. When those feedback and policy sessions were happening, in fact, most of the parents said no, because the pressure would then also come on them to keep the kids entertained. So that was one of the other social aspects of this, which I think would apply here too.
And I actually consult for the Australian Government's Department of Defence on data privacy impacts and all of that. You mentioned a policy mandate; what about India implementing a financial mandate? Because all of this comes from the platforms. Is there any sort of shift or move to implement those controls? Like, say, Google, Facebook or TikTok facing financial penalties if they do not implement content controls. That kind of a track.
Yes, so the 2021 rules account for that.
One last thing I would like to say, probably for the benefit of this gathering: whatever kind of business you're running, please have a look at the Digital Personal Data Protection Act. It applies to everybody, the penalties are very severe, and the rules are out.
And if you're collecting any kind of personal data for your business, any name, any phone number, anything, then if there's a breach, there will be severe penalties. So please go and talk to whoever can explain to you the liabilities possible under the Act. And let me tell you, this is very serious.