DRDO Creates A Facial Recognition Technology That Can See Through Disguises And Masks

One group was pleased when Covid made everyone wear masks: criminals. With faces harder to recognise, it became easier for them to blend into the crowd.

The threat this posed prompted discussion of a facial recognition system that could identify faces behind masks or other disguises during the pandemic.

The National Crime Records Bureau even published a tender for such a system last year, but there has been no word on whether the technology was built or put into use.

It turns out, though, that India’s top defence research agency, the Defence Research and Development Organisation (DRDO), has in fact built such a system.

The Face Recognition System under Disguise (FRSD) claims to recognise faces even when they are hidden under various “disguises including face masks, beards, moustaches, wigs, sunglasses, head scarves, monkey-caps, hats, etc.”

In a recent paper titled “AI in Defence,” the Ministry of Defence (MoD) described the FRSD, developed for the Indian Army, along with three other facial recognition systems built by MoD-affiliated organisations.

Because these technologies may be used not just in military operations but also in civilian settings, it is vital to explain how and why they are being employed.

FRSD

The FRSD uses algorithms rather than human eyes to identify individuals in grainy, low-resolution security camera feeds.

The MoD paper stated that “the algorithm can also be employed by security organisations for effective face search across massive archives.”

The system can be installed for real-time video surveillance in restricted or secure areas. According to the paper, it can also be used in public settings to identify anti-social elements.

For identification, it considers various lighting situations, face shadows, crowd occlusions, and other factors.

“‘Face recognition in the field’ on surveillance camera feeds is a challenging problem to address because of the cameras’ low resolution. With the added complexity of facial disguises, crowd occlusions, and varying illumination, the problem becomes even more difficult to address,” the MoD paper states.

The system was created by DRDO with the idea that it should scale across servers and graphics processing units.

People counting, geo-fencing, fire detection, and collision detection are just a few of the other surveillance applications that come with the system’s adaptable video analytics suite.
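Geo-fencing, one of the analytics listed above, reduces to testing whether a tracked object's coordinates fall inside a defined boundary. The paper does not describe DRDO's implementation; a minimal sketch of such a check, using an illustrative polygon, might look like this:

```python
def inside_geofence(point, polygon):
    """Ray-casting point-in-polygon test: count how many polygon edges a
    horizontal ray from the point crosses; an odd count means inside."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Only edges that straddle the point's y-coordinate can be crossed
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical restricted zone (a 10x10 square) and two tracked positions
zone = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(inside_geofence((5, 5), zone))   # True: inside the zone, raise an alert
print(inside_geofence((15, 5), zone))  # False: outside the zone
```

A real deployment would run this per detected object per frame, with the fence polygon drawn by an operator on the camera view.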

Project Seeker

A facial recognition system called Project Seeker was created by organisations under the MoD.

According to the MoD study, it was created and used by the Indian Army for population monitoring, surveillance, and garrison security.

The field-ready system can gather intelligence from numerous sources without internet connectivity, be configured remotely, and be deployed anywhere.

It can be used to ensure “state-of-the-art security” at civilian establishments as well as in “disturbed” areas for ongoing surveillance and monitoring.

The Seeker system, according to the study, is a self-contained, AI-based facial recognition, surveillance, monitoring, and analysis system for identifying and tracking terrorist threats and for ongoing surveillance and monitoring of troubled areas.

For increased security, it was stated that the technology may be used at “important military” or “civilian establishments.”

The Army seeks to follow the movements of terrorists and “anti-national” groups using intelligence data from numerous sources.

According to the paper, the Army wants to gain “psychological domination on threats and anti-national groups.” It also explains how the technology will benefit the country.

It is significant that the term “anti-national” has no statutory or legal definition.

Robot at the border

In addition to Project Seeker, the Indian Army has also created Silent Sentry, a 3D-printed rail-mounted robot with full facial recognition capabilities that can be deployed on fences and the Anti-Infiltration Obstacle System (AIOS).

The WiFi-connected robot has built-in artificial intelligence to detect human presence and recognise faces.

“An AI programme using object recognition analyses the video feed the robot sends over. The software automatically detects movement and human presence, generates an audio warning, and stores the images with a time and date log,” the study states.
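The motion-detection step in a pipeline like this is often done by simple frame differencing: flag motion when enough pixels change between consecutive frames. The study gives no implementation detail, so the thresholds and frame format below are assumptions for illustration only:

```python
def motion_detected(prev_frame, curr_frame, pixel_thresh=25, ratio_thresh=0.01):
    """Compare two grayscale frames (lists of rows of 0-255 ints) and
    report motion when the fraction of changed pixels exceeds ratio_thresh."""
    changed = total = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            total += 1
            if abs(p - c) > pixel_thresh:
                changed += 1
    return changed / total > ratio_thresh

# Two toy 4x4 frames: a bright patch appears in the second frame
frame_a = [[0] * 4 for _ in range(4)]
frame_b = [[0] * 4 for _ in range(4)]
frame_b[1][1] = frame_b[1][2] = 200
print(motion_detected(frame_a, frame_b))  # True: 2 of 16 pixels changed
print(motion_detected(frame_a, frame_a))  # False: identical frames
```

A detection here would then trigger the downstream steps the paper describes: the audio warning, the timestamped image log, and the face recognition pass.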

When a human is detected, a background face recognition algorithm is engaged and attempts to identify the person against a database of recorded information. The facial-feature data is then stored in the database.
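The matching step described above is typically done by comparing a face embedding (a numeric vector produced by a neural network) against stored vectors and accepting the closest match above a similarity threshold. The paper does not describe the Army's actual representation, so the vectors and threshold here are made up for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(query, database, threshold=0.8):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical 4-dimensional embeddings (real systems use 128+ dimensions)
db = {"person_a": [0.9, 0.1, 0.0, 0.4], "person_b": [0.1, 0.8, 0.5, 0.2]}
print(identify([0.88, 0.12, 0.05, 0.38], db))  # person_a: close match
print(identify([0.0, 0.0, 1.0, 0.0], db))      # None: no confident match
```

The threshold is the critical knob: set it too low and the system misidentifies strangers; set it too high and it misses genuine matches, which is the accuracy trade-off discussed later in this article.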

Driver fatigue monitoring system

The MoD-affiliated public sector company BEML Ltd has created a facial recognition-based driver fatigue monitoring system.

According to the report, “assessment of driver weariness under critical settings is a vital instrument, especially in the Armed Forces.”

According to the study, the technology can tell when a driver starts to get sleepy while the car is moving.

A camera inside the vehicle continuously films the driver, and an algorithm analyses the video to determine whether the driver’s eyes are open or closed.

Drowsiness is detected using the percentage of eyelid closure over the pupil over time (PERCLOS) algorithm, which continuously monitors for symptoms, taking into account physical signs such as yawning, drooping eyelids, closed eyes, and prolonged blinks.
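PERCLOS is conventionally computed as the fraction of time within a sliding window during which the eyes are mostly closed (often defined as 80%+ eyelid closure), with an alert raised when that fraction exceeds a tuned cutoff. BEML's parameters are not published; the window and thresholds below are illustrative assumptions:

```python
def perclos(closure_series, closed_thresh=0.8):
    """Fraction of frames in the window where eyelid closure is at least 80%.
    closure_series: per-frame closure fractions, 0.0 (fully open) to 1.0 (shut)."""
    closed = sum(1 for c in closure_series if c >= closed_thresh)
    return closed / len(closure_series)

def drowsy(closure_series, alert_level=0.15):
    """Flag fatigue when eyes are mostly closed for over 15% of the window."""
    return perclos(closure_series) > alert_level

# Two hypothetical 10-frame windows of eyelid-closure measurements
alert_window = [0.1, 0.9, 1.0, 0.95, 0.2, 1.0, 0.1, 0.9, 0.1, 0.05]
ok_window = [0.1, 0.2, 0.1, 0.0, 0.3, 0.1, 0.2, 0.1, 0.0, 0.1]
print(perclos(alert_window), drowsy(alert_window))  # 0.5 True
print(perclos(ok_window), drowsy(ok_window))        # 0.0 False
```

In a deployed system, the per-frame closure values would come from the eye-state classifier running on the in-cab camera feed, and the window would slide continuously over the last minute or so of driving.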

Reliability

Despite how impressive these technologies may seem, they are all ultimately dependent on the implemented software and algorithms.

Given that these systems are known to be prone to error, how trustworthy are they?

Because faces are difficult to detect accurately, there are worries about misidentification.

“Technology for facial recognition is unreliable. It produces inaccurate findings. The accuracy would drop even further with masks, which can cover half the face,” said Anushka Jain, associate counsel at the Internet Freedom Foundation (IFF).

For instance, in a 2018 test, Amazon’s Rekognition facial recognition software falsely matched 28 members of the US Congress with individuals who had been arrested for a crime.

Jain gave the example of two siblings who could be misidentified while wearing masks.

“It’s entirely possible that two siblings wearing masks may have similar upper-half appearances. They could be misidentified. Communities might potentially become the target of this,” Jain said.

The study makes no mention of the FRSD technology’s accuracy.

“Any decision made based on inaccurate information could have serious repercussions. There are drawbacks to facial recognition as a method and in application. Therefore, the Armed Forces must further examine and filter the collected data before it is used,” said Kritika Seth, founding partner of Victoriam Legalis – Advocates & Solicitors.

Privacy

Over the years, civil society organisations and digital rights activists have scrutinised the use of facial recognition technology by state governments and the Centre for governance and policing, concerned about the invasion of privacy.

Even as the technology sees increasingly frequent use, Seth of Victoriam Legalis raised concerns about the data-gathering procedures and whether their use is consistent with the Right to Privacy judgement.

“There is no statutory framework that mandates transparency in the gathering of data for the aforementioned purposes. As held in Justice K.S. Puttaswamy (Retd.) versus Union of India, the secrecy surrounding the use of personal data may constitute a violation of the right to privacy,” Seth said.

“In addition, the Army is eager to keep an eye on social media sites. Such surveillance will overlap with state surveillance already in place and may not fall under the purview of the armed forces’ role,” she added.

Legality

Data acquired through facial recognition systems is classified as “sensitive personal data” under the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (the “SPDI Rules”), according to Siddharth Suresh, partner at DSK Legal.

However, he said the law carves out exceptions allowing government agencies to collect and use such data without the subject’s consent, on the underlying assumption that such use serves the general public and national security.

The recently notified Criminal Procedure (Identification) Act, 2022 now allows authorities to collect and share biometric data.
