Anushka is the latest example of the enduring frugal innovation seen in Indian engineering colleges like KIET.
When most science fiction fans think of humanoid robots, they usually imagine something from an Isaac Asimov story or a terrifyingly lifelike construct like Dolores from the television show Westworld. With the rise of generative AI and current social robots such as Sophia, built by Hanson Robotics, a future with fully realised robots no longer appears remote. The route to that reality, however, can be just as exciting.
That’s why it was difficult for me to pass up the opportunity to engage with a humanoid robot developed in a small lab on an engineering college campus in India’s heartland.
Anushka is a humanoid robot created by a team of students and professors at the Krishna Institute of Engineering and Technology (KIET) in Ghaziabad, Uttar Pradesh. In her current version, she is largely intended to greet visitors and answer their questions with pertinent information. However, her creators envision her as more than a robot receptionist, with possible applications in healthcare and consulting.
Anushka caused a bit of a splash in the media when she was originally introduced in March 2024, as she appears to be the first humanoid robot with autonomous movement developed in north India. The assertion that she was designed in accordance with Vedic principles also raised some questions.
Anushka sits tethered to a monitor, next to a table crowded with microcontrollers and 3D-printed parts. Dr Manoj Goel, joint director of KIET, said the robot was built on a budget of Rs 2 lakh, a fraction of the Rs 7-8 million generally required to manufacture humanoid robots. Some of the components, he told me, even came from a local scrap yard.
Anushka’s face has 3D-printed parts, and the flexible silicone skin was created by Madame Tussauds in India. Notably, her facial features were modelled on those of a late French princess and then refined with generative AI. The entire project took approximately a year and a half to complete.
The master-slave configuration
The underlying technology that enables Anushka to imitate human motions uses a master-slave architecture: an i7 CPU acts as the brain, directing a network of microcontrollers and servo motors that drive her hands, neck, jaw, eyes, and so on.
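To picture that architecture, here is a minimal sketch of what such a master-slave loop could look like, assuming the PC talks to the slave microcontrollers over serial links; the port names, joint groups and command format are illustrative stand-ins, not details the team has shared.

```python
# Illustrative sketch of a master-slave control loop: a master process on the
# PC sends target servo angles to slave microcontrollers over serial links.
# Port names, baud rate and the text command format are assumptions.
import serial

# One serial link per slave microcontroller (e.g. one board per joint group)
slaves = {
    "neck": serial.Serial("/dev/ttyUSB0", 115200, timeout=1),
    "jaw":  serial.Serial("/dev/ttyUSB1", 115200, timeout=1),
    "hand": serial.Serial("/dev/ttyUSB2", 115200, timeout=1),
}

def send_pose(group: str, servo_id: int, angle_deg: float) -> None:
    """Ask a slave board to move one servo to the given angle."""
    angle = max(0.0, min(180.0, angle_deg))          # clamp to a safe range
    command = f"{servo_id}:{angle:.1f}\n".encode()   # hypothetical protocol
    slaves[group].write(command)

# Example gesture: nod the neck while opening the jaw to 'speak'
send_pose("neck", servo_id=1, angle_deg=100)
send_pose("jaw",  servo_id=0, angle_deg=35)
```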
Anushka uses natural language processing (NLP) to answer queries accurately. Voice commands, captured by a microphone hidden behind her necklace, are converted into digital signals. A program written in Python uses NLP to turn them into meaningful data, and the robot then fetches the required answer from a database containing 500 terabytes of information acquired from OpenAI, the company behind the AI chatbot ChatGPT.
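As a rough illustration of that listen-transcribe-answer flow, the sketch below uses the open-source speech_recognition library and a tiny in-memory lookup as stand-ins for the team's actual Python code and 500-terabyte database.

```python
# Rough sketch of the listen -> transcribe -> answer pipeline described above.
# The tiny "knowledge base" here is an illustrative placeholder only.
import speech_recognition as sr

knowledge_base = {
    "admissions": "The admissions office is on the ground floor of Block A.",
    "library": "The library is open from 9 am to 8 pm on weekdays.",
}

def answer(query: str) -> str:
    # Naive keyword lookup standing in for the real NLP and database layer.
    for keyword, reply in knowledge_base.items():
        if keyword in query.lower():
            return reply
    return "Sorry, I do not have information about that yet."

recognizer = sr.Recognizer()
with sr.Microphone() as source:               # the mic hidden in the necklace
    audio = recognizer.listen(source)         # capture the visitor's question
text = recognizer.recognize_google(audio)     # speech-to-text
print(answer(text))                           # reply to be spoken or displayed
```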
The humanoid is also equipped with computer vision trained for facial recognition. Thanks to a high-resolution, 30-megapixel webcam, Anushka can recognise a person standing 10 metres away. However, it might take the robot two or three sessions to accurately recognise someone as a person she has met before, the team said.
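The recognition step could plausibly be built on off-the-shelf tools; the sketch below uses OpenCV and the open-source face_recognition library as stand-ins, which is an assumption rather than a description of the team's implementation.

```python
# Simplified sketch of webcam-based recognition of returning visitors,
# using the open-source face_recognition library (an assumed approach).
import face_recognition
import cv2

known_encodings = []   # face embeddings of people Anushka has already met
known_names = []

camera = cv2.VideoCapture(0)                  # the high-resolution webcam
ret, frame = camera.read()
rgb_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

for encoding in face_recognition.face_encodings(rgb_frame):
    matches = face_recognition.compare_faces(known_encodings, encoding)
    if True in matches:
        name = known_names[matches.index(True)]
        print(f"Welcome back, {name}!")
    else:
        # Unfamiliar face: remember it so a later session can recognise it,
        # which is why recognition may take two or three visits to settle.
        known_encodings.append(encoding)
        known_names.append(f"visitor_{len(known_names) + 1}")
camera.release()
```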
Self-awareness in humanoid robots
Currently, humanoid robots like Anushka cannot accurately be described as self-aware, since Artificial General Intelligence (AGI) has not yet been achieved. But that doesn't mean she lacks any intelligence whatsoever.
“There are four stages of artificial intelligence at work here: one is when she listens to you, another is when she watches and understands your image via computer vision, the third is when she uses natural language processing to communicate with you, and the final stage commands the servo motors to work in sync. Each level of intelligence is pivotal as together, they ensure that everything functions smoothly,” the team said.
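Read as a pipeline, those four stages could be kept in sync in a single control loop along these lines; the stage functions below are hypothetical stubs standing in for the speech, vision, NLP and servo code sketched earlier.

```python
# A bird's-eye sketch of how the four stages could run in lockstep in one
# control loop. Each stage function is a hypothetical placeholder.
import time

def listen() -> str:
    return "where is the library?"        # stage 1: microphone input (stubbed)

def see() -> str:
    return "visitor_1"                    # stage 2: computer vision (stubbed)

def understand(query: str) -> str:
    return "The library is in Block B."   # stage 3: NLP reply (stubbed)

def actuate(reply: str) -> None:
    print(f"[servos] lip-sync and gesture while saying: {reply}")  # stage 4

for _ in range(3):                        # a few iterations of the main loop
    query, visitor = listen(), see()
    actuate(understand(query))            # the stages must stay in sync
    time.sleep(0.5)
```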
Engineers and scientists undoubtedly face several technical challenges in making realistic-looking humanoid robots. But an ethical dilemma they could be forced to confront has to do with the uncanny valley. Simply put, 'uncanny valley' describes the creepy feeling you get when a robot looks almost, but not quite, like an actual human. The term was first used to refer to the dip in likability viewers experience when they see a human-like robot or CGI character on-screen, according to a report by Gizmodo.