Reading Time: 6 minutes
INTRODUCTION

Recent advancements in computation, robotics, machine learning, communication, and miniaturization technologies are turning futuristic visions of compassionate intelligent devices into reality. Of these, AI is the most far-reaching technological advancement, with the potential to fundamentally alter our lives. The combination of AI and robotics is resulting in the development of several intelligent devices that could meet the increasing demands of humankind.

Background

In the mid-1980s, robots were largely confined to factories and laboratories, where they were used for tasks such as lifting objects. Later, Honda started a humanoid robotics program, which produced the P3, a robot that could walk, wave and shake hands. Since then, researchers have been making progress on robots that cater to the varying needs of humans in their day-to-day lives. An emerging need is companionship that evokes emotional engagement and helps combat loneliness. Pet animals have catered to this need for a long time. However, while pets are known stress busters, the high cost and effort involved in caring for them have triggered interest in robotic pets that can engage and interact with humans. Yet interfacing with emotions has always been a challenge: it requires connecting human physiological, physical and cognitive functions with intelligent devices. This brings us to the concept of robotic pets – emotional, ‘human’-like robot pets that engage and interact like a human, while being comparatively easy to manage.

Robotic pets are man-made machines that resemble common pets. Their features and actions are very close to those of real pet animals such as dogs, cats and birds. Though existing robot pets can purr, walk, run and respond to petting, it remains a challenge to develop robots that truly look and behave as real pets do. However, with advancements in robotics and AI, recent interactive robotic pets have become better substitutes, resembling real pets in both looks and behavior.

Progress So Far

PARO, a therapeutic seal robot, was designed by Takanori Shibata of the Intelligent Systems Research Institute of Japan’s AIST. Its development began in 1993, but it was first exhibited to the public in late 2001. It was an advanced interactive robot with five kinds of sensors – tactile, light, audio, temperature and posture – with which it could perceive people and its environment. It provided the benefits of animal therapy to patients in environments such as hospitals or extended-care facilities, where live animals could pose difficulties with treatment or logistics. The benefits included reducing stress in patients and their caregivers, stimulating interaction between patients and caregivers, producing a positive psychological effect on patients, improving their relaxation and motivation, and improving their socialization with each other and with caregivers.


AIBO (Artificial Intelligence Robot) is a series of robotic pets designed and manufactured by Sony. The idea dates back to 1994, when Toshitada Doi, Sony Corporation’s lead engineer, enlisted artificial intelligence expert Masahiro Fujita to develop a robot with sophisticated sensors. The prototype was announced in mid-1998, and the first consumer model was introduced in 1999. Nearly two decades later, in 2018, Sony unveiled the ‘AIBO First Litter Edition’, the sixth generation of its robot dog. The biggest difference from previous models was the cloud-based AI engine, which, together with a powerful on-board computer and advanced image sensors, made AIBO smarter and more life-like. AIBO was equipped with infrared sensors and cameras for eyes that allowed it to judge distance and avoid walking into objects. It also had sensors on its head and in its paws, and an antenna for a tail.

“Born” with deep-learning artificial intelligence (AI), the sixth-generation AIBO can detect and respond to its owner’s facial expressions and voice commands, becoming smarter with time. The bot learns tricks via the My AIBO app, which can also be used to customize the pup’s gender, eye color and voice.


SpotMini from Boston Dynamics is the smallest of the company’s robot menagerie and is all-electric. In 1998, through the RHex robot program, Boston Dynamics developed an oversized insect-like robot. Then came an intimidating robot named BigDog in 2007, followed by Atlas and Spot, a smaller dog-like robot. SpotMini’s main frame had a quick-disconnect battery, stereo cameras in the front, side cameras and a “butt cam”. It could also be upgraded with a series of attachments on the top, including an articulated arm.


Boston Dynamics is ready to produce its robots on a commercial scale. Incidentally, Jeff Bezos of Amazon made a public appearance that was nothing short of a scene lifted from a sci-fi movie: he walked alongside a SpotMini at Amazon’s MARS 2018 conference. Going by this, one can expect a robotic pet from Boston Dynamics in the future.


Golden Pup from Ageless Innovation LLC is a life-like robotic dog that looks, moves and sounds like the real thing. It was developed to bring joy, fun, play and comfort to seniors, with a focus on improving the quality of life of seniors, their families and caregivers.


The Golden Pup’s features include life-like movements of the head, tail and eyes, and realistic fur. The dog barks and has a simulated heartbeat. In addition, it contains built-in sensors that respond to motion and touch.

CHiP, a robot dog from WowWee, is an intelligent, affectionate robot dog with advanced sensors and smart accessories. WowWee was founded in Hong Kong as an independent research, development and manufacturing company focusing on cutting-edge technologies. Artificial intelligence algorithms enable CHiP’s adaptive personality: CHiP follows commands and adapts to its user’s preferences.


The WowWee CHiP is controlled by a wearable watch controller that can be used to activate games such as fetch and soccer, or to make the robot follow its owner around. Infrared sensors located around its head give 360-degree coverage, and built-in Bluetooth connects it to the supplied smart bracelet that the owner wears.

Buddy, the emotional robot from Blue Frog Robotics, a French company founded in 2014, has a range of emotions that it expresses naturally throughout the day, based on its interactions with family members. The company developed BUDDY for communication, interaction and emotional response.

iCat, an emotionally intelligent user-interface robot developed by Philips Research in the Netherlands, can be used as a game buddy for children or as a TV assistant, or play many other roles. iCat is a 38cm tall cat-like robot character whose head contains 13 servomotors that can move the head and different parts of the face, including eyebrows, eyes, eyelids, and mouth. The servomotors can generate facial expressions, which give the robot socially intelligent features. Through a camera in the nose, iCat can recognize objects and faces using computer vision techniques. Each foot contains a microphone that can identify sounds, recognize speech, and determine the direction of the sound source. A speaker in the bottom can play sounds and connected speech. iCat can control domestic appliances through connection to a home network, and can obtain information by connecting to the internet. iCat can sense touch through sensors in its feet. It can communicate information encoded by coloured light through multi-color LEDs in its feet and ears. Different modes of operation such as sleeping, awake, busy and listening can be indicated through the LEDs in its ears.
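To picture how the ear LEDs might signal the operation modes listed above, here is a minimal Python sketch. The colour assignments and the set_ear_leds helper are illustrative assumptions, not Philips’ actual iCat software.

    # Illustrative sketch only -- not Philips' iCat firmware.
    # Maps the operating modes mentioned above to hypothetical ear-LED colours.
    MODE_COLOURS = {
        "sleeping":  (0, 0, 64),       # dim blue (assumed)
        "awake":     (0, 255, 0),      # green (assumed)
        "busy":      (255, 128, 0),    # orange (assumed)
        "listening": (255, 255, 255),  # white (assumed)
    }

    def set_ear_leds(colour):
        """Placeholder for the robot's LED driver (assumed interface)."""
        print(f"Ear LEDs set to RGB {colour}")

    def indicate_mode(mode: str) -> None:
        """Show the current operating mode through the ear LEDs."""
        if mode not in MODE_COLOURS:
            raise ValueError(f"Unknown mode: {mode}")
        set_ear_leds(MODE_COLOURS[mode])

    indicate_mode("listening")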

Interesting patents

Some recent patent applications are presented below; the inventions they describe could be an indicator of things to come.

WO2017166991A1 from Shen Zhen Kuang-Chi Hezhong Technology Ltd. deals with a robot that has a blinking function. The upper and lower eyelids are connected to a driving device disposed on a fixing seat, with an eyeball positioned between the upper and lower eyelids. The eyeball, made of an acrylic material, is connected to a camera fixed in the head casing. A micro motor can be programmed to set the robot’s blink frequency, turn-around angle and eyelid positions, so that the robot with the blinking function is more intelligent, more entertaining and more biomimetic, allowing the pet robot to better meet people’s demands.
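As a rough illustration of how a programmable blink controller of this kind might behave, the sketch below drives a hypothetical eyelid motor at a configurable blink frequency. The EyelidMotor class and the timing values are assumptions for illustration, not the patented design.

    import time

    class EyelidMotor:
        """Hypothetical stand-in for the micro motor driving the eyelids."""
        def close_eyelids(self):
            print("eyelids closed")
        def open_eyelids(self):
            print("eyelids opened")

    def blink_loop(motor, blinks_per_minute=30.0, blink_duration=0.2, cycles=2):
        """Blink at a programmed frequency (assumed behaviour, not the patent's code)."""
        interval = 60.0 / blinks_per_minute  # seconds between blink starts
        for _ in range(cycles):
            motor.close_eyelids()
            time.sleep(blink_duration)
            motor.open_eyelids()
            time.sleep(interval - blink_duration)

    blink_loop(EyelidMotor())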


WO2018006370A1 from Shenzhen Gowild Robotics Co., Ltd. deals with a virtual 3D interaction method for a robot: multi-modal user information is acquired, interactive content is generated and converted into machine code recognizable by the robot, and the robot produces output according to that content. The robot’s forms of expression are thereby diversified and humanized, and the user experience of interacting with the robot is improved; the output modes include couple interaction, cross interaction and pet interaction.

WO2018190250A1 from Groove X Inc. deals with a robot with a flexible outer skin that covers the head and body. The outer skin includes a stretchable base material and an elastic mounting portion made of a flexible, elastic material. The robot is an autonomous-behavior type that determines its behavior and gestures based on the external environment and its internal state. The external environment is recognized by various sensors such as cameras and thermo-sensors, while the internal state is quantified as various parameters expressing the robot’s emotion. The technical idea of the outer-skin structure can be applied to other humanoid robots, pet robots and the like.

WO2018147506A1 from Cho Hyeon Hong deals with a robot dog for the visually impaired, capable of safely guiding its user on the basis of a movement map generated by receiving external information and detecting the movement state. The robot dog is mobile so as to assist the activities of visually impaired people, and comprises a communication unit capable of communicating with an external server that provides information, a state confirmation unit in communication with a portable electronic device carried by the user, an environment detection unit, a geographical feature detection unit, a movement determination unit, and a driving control unit that can actuate the robot dog according to the user’s command to guide them to a predetermined place.
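The claim describes the robot as a set of cooperating units. The skeleton below is one hedged way to picture that decomposition in code; every class and method name here is invented for illustration and does not come from the patent.

    from dataclasses import dataclass, field

    @dataclass
    class GuideDogRobot:
        """Illustrative decomposition of the units listed in WO2018147506A1 (names assumed)."""
        destination: str = ""
        movement_map: dict = field(default_factory=dict)

        def communication_unit(self) -> dict:
            # Would fetch route information from an external server.
            return {"route": ["crosswalk", "bus stop"]}

        def environment_detection_unit(self) -> list:
            # Would report nearby obstacles from on-board sensors.
            return ["curb ahead"]

        def movement_determination_unit(self) -> list:
            # Combines server data and sensed environment into a movement plan.
            route = self.communication_unit()["route"]
            obstacles = self.environment_detection_unit()
            plan = [step for step in route if step not in obstacles]
            self.movement_map["plan"] = plan
            return plan

        def driving_control_unit(self, command: str) -> None:
            # Actuates the robot toward the commanded destination.
            self.destination = command
            print(f"Guiding user to {command} via {self.movement_determination_unit()}")

    GuideDogRobot().driving_control_unit("bus stop")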


JP2018094683 from Menicon Co. Ltd. deals with a watcher-type pet robot that, on detecting a specific external stimulus, moves to a predetermined position set in advance and performs an imaging operation. This makes it possible for a family living apart, for example, to confirm that elderly residents living alone are going about their daily activities.
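A rough sketch of that watcher behaviour might look like the following; the function names, stimulus list and preset “watch position” are assumptions for illustration, not the patented implementation.

    # Illustrative watcher-robot loop (all names are assumptions, not the patent's design).
    WATCH_POSITION = (2.0, 1.5)  # preset position in the home, set in advance

    def stimulus_detected(sensor_event: str) -> bool:
        """Return True for the specific external stimuli the robot is set to react to."""
        return sensor_event in {"doorbell", "loud noise", "motion in hallway"}

    def move_to(position) -> None:
        print(f"Moving to preset position {position}")

    def capture_image() -> str:
        # Placeholder for the imaging operation; the result could be shared with family.
        return "image_of_resident.jpg"

    def on_sensor_event(sensor_event: str):
        if stimulus_detected(sensor_event):
            move_to(WATCH_POSITION)
            return capture_image()
        return None

    print(on_sensor_event("doorbell"))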

Scientific Literature

A few scientific publications in this interesting area are presented below:

Haptic Communication between Humans and Robots, a joint publication from Intelligent Robotics and Communication Labs, Kyoto and Dept. of Adaptive Machine Systems, Osaka University, Japan.

Tactile sensors are embedded in a soft skin covering the robot’s entire body, allowing it to detect a communication partner’s position and posture even when the vision sensor cannot observe the person. The robot statistically builds a map describing the relationships between its tactile information and human positions/postures, based on records of haptic interaction captured by the tactile sensors and a motion-capturing system during communication, and then estimates its communication partner’s position/posture from the tactile sensor outputs.
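One simple way to picture such a statistical map is a nearest-neighbour lookup from tactile readings to previously recorded partner postures. The toy sketch below makes that assumption; the actual system is trained on motion-capture records and far richer sensor data.

    import math

    # Toy training records: (tactile sensor activations, partner posture label).
    # In the paper these come from tactile sensors plus a motion-capturing system.
    records = [
        ((0.9, 0.1, 0.0), "hugging from the front"),
        ((0.0, 0.8, 0.7), "leaning on the right shoulder"),
        ((0.1, 0.0, 0.9), "touching the back"),
    ]

    def estimate_posture(tactile_reading):
        """Estimate the partner's posture from tactile outputs by nearest neighbour."""
        nearest = min(records, key=lambda r: math.dist(r[0], tactile_reading))
        return nearest[1]

    print(estimate_posture((0.85, 0.2, 0.05)))  # -> "hugging from the front"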


The soft skin consists of three layers. The outside layer is made of thin silicone rubber (thickness: 5 mm), and the middle layer is made of thick silicone rubber (thickness: 10 mm), these silicone rubber layers being used to achieve human-like softness. The silicone rubber, moreover, possesses sense of touch and friction on its surface similar to that of human skin. The thick silicone rubber also absorbs the physical noise made by the robot’s actuators. The inner layer is made of urethane foam, which insulates against heat from inside the robot and has a surface friction different from human skin. The tactile sensor elements are film-type piezoelectric polymer sensors inserted between the thin and thick silicone rubber layers and consist of polyvinylidene fluoride (PVDF) and sputtered silver.

Intelligent emotion and behavior based on topological consciousness and adaptive resonance theory in a companion robot (Department of Mechanical Information Science and Technology, Kyushu Institute of Technology, Iizuka, Japan)

This publication proposes an artificial topological consciousness that uses a synthetic neurotransmitter and motivation, including a biologically inspired emotion system. The focus is on three aspects:

  • The organization of behavior, including internal-state emotion, within the phylogenetic consciousness-based architecture
  • A method whereby the robot can have empathy toward its human user’s expressions of emotion, and
  • A method that enables the robot to select a facial expression in response to the human user, providing instant human-like ‘emotion’ and emotional intelligence (EI) through a biologically inspired topological online method to express, for example, encouragement or delight (a simple sketch of such expression selection follows).
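As a minimal sketch of the expression-selection idea in the last point, the snippet below maps a detected user emotion to an empathic facial expression. The mapping and function names are assumptions for illustration; the paper’s adaptive-resonance machinery is not reproduced here.

    # Minimal sketch of empathic expression selection (assumed mapping; not the
    # paper's adaptive-resonance implementation).
    EMPATHY_MAP = {
        "sad":     "encouraging smile",
        "happy":   "delighted",
        "angry":   "calm, attentive",
        "neutral": "friendly",
    }

    def select_expression(detected_user_emotion: str) -> str:
        """Pick a facial expression that responds empathically to the user's emotion."""
        return EMPATHY_MAP.get(detected_user_emotion, "friendly")

    print(select_expression("sad"))  # -> "encouraging smile"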

Robotic and Sensor Technologies for Mobility in Older People (Ortelio Ltd., UK, Complex Unit of Geriatrics, Department of Medical Sciences, IRCCS, Italy, The Bio-Robotics Institute, Italy and ICT, Innovation & Research Unit, IRCCS, Italy)

In this publication, two categories of research were identified:

  • Sensor technologies for mobility analysis, and
  • Robots for supporting older people with mobility limitations.

Exploiting the potential of such technology will depend on a number of factors, including raising awareness of available technologies and their utility, promoting accessibility and affordability, and overcoming barriers to acceptance and use. Further research on sensor and robotic technologies is required to improve their integration into current elder-care practice.

Start-Up Activities

The activities of a few start-ups in the area of Pet Robotics are given below:

Consequential Robotics’ MiRo is equipped with six sensors; its head can nod and rotate, its ears move, its eyes blink and its tail wags. It offers functions of a real pet animal – for instance, it can watch the world through its bionic eyes and can seek out and respond to physical interaction. Consequential Robotics, in partnership with a range of UK and international firms, designs and develops next-generation robotic systems. Interestingly, the third-generation MiRo has been re-engineered for education and is available to schools to accelerate STEM learning. MiRo has been used for research by universities and research labs across the UK and around the globe since its launch in early 2017. With its accessible architecture and biomimetic morphology, MiRo is an attractive platform for research in fields such as biomimetic robotics, human-robot interaction and robotics in general.

ROOBO, founded in 2014, is a fast-growing hardware and AI start-up headquartered in Beijing. It unveiled a prototype of its product, a “pet robot” called Domgy. Domgy rolls around rather than walking on four feet. It can navigate normal obstacles in a home, or make its way down a shallow step or two, but it is not meant for rugged or steep terrain. Domgy usually returns to its charging station automatically when its battery runs low. Domgy’s pupils turn to hearts to show love and happiness, and it giggles; it can also growl, whine or respond with a good emotional range, for example if it is jostled or yelled at. Domgy uses ROOBO’s proprietary artificial intelligence and facial recognition systems to identify family members, greet and entertain them and follow their rules. A 5-megapixel camera in Domgy’s head can also serve as a monitor of sorts, alerting family members when it encounters a stranger in the house, which makes it a “good guard”. ROOBO created the artificial intelligence that powers Domgy, while the device and its operating system were originally designed by Innovative Play Lab (IPL) in Korea. ROOBO is also an investor in IPL, which was earlier funded by the Korean government.
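Domgy’s “greet family, alert on strangers” behaviour can be sketched roughly as below. The face-recognition step is a placeholder function; ROOBO’s proprietary AI stack is not shown, and all names here are assumptions.

    # Rough sketch of Domgy's greet-or-alert behaviour (all names are assumptions).
    KNOWN_FAMILY = {"alice", "bob"}

    def recognize_face(camera_frame) -> str:
        """Placeholder for ROOBO's proprietary face recognition; returns a person ID."""
        return camera_frame.get("person", "unknown")

    def on_person_detected(camera_frame) -> str:
        person = recognize_face(camera_frame)
        if person in KNOWN_FAMILY:
            return f"Greeting {person} with heart-shaped pupils"
        return "Stranger detected: sending an alert to family members"

    print(on_person_detected({"person": "alice"}))
    print(on_person_detected({"person": "intruder"}))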

Groove X, a Japanese robotics start-up, developed Lovot, a pint-sized companion robot pet that uses artificial intelligence (AI) to interact with its surroundings. The wheeled machine, resembling a penguin with cartoonish human eyes, can change its outfits and communicates in squeaks. Packed with sensors, it is designed to mimic affection by becoming warm to the touch, going to “sleep” when cuddled or following users when called.

The area of robotic pets is thus set to redefine the terms ‘companionship’ and ‘humanity’. In the future, robotic pets are likely to partner with humans in caregiving and also provide joy and affection to childless couples and single-sex parents. This is likely to create huge market opportunities, particularly in countries with aging populations and high disposable incomes.

References
  1. http://www.parorobots.com/
  2. https://spectrum.ieee.org/automaton/robotics/home-robots/sony-aibo-robot-dog-is-coming-to-america
  3. https://www.sony.net/SonyInfo/News/Press/201711/17-105E/index.html
  4. http://www.wionews.com/world/jeff-bezos-takes-a-walk-with-his-new-dog-at-tech-conference-36653
  5. http://www.vvdailypress.com/news/20181011/seniors-increasingly-taking-to-robotic-pets
  6. https://wowwee.com
  7. https://buddytherobot.com/en/buddy-the-emotional-robot/
  8. https://ercim-news.ercim.eu/en67/special-theme-embedded-intelligence/icat-a-friendly-robot-that-helps-children-and-grown-ups
  9. http://robots.stanford.edu/isrr-papers/draft/miyashita-final.pdf
  10. https://www.sciencedirect.com/science/article/abs/pii/S2212683X1630072X
  11. https://www.iris.sssup.it/retrieve/handle/11382/518957/32961/IP027%20-%20Rejuven.pdf
  12. http://consequentialrobotics.com/about/
  13. http://www.roobo.com/en/
  14. https://www.engadget.com/2019/01/07/lovot-groove-x-robot-adorable/

Disclaimer:

  • This document has been created for educational and instructional purposes only
  • Copyrighted materials used have been specifically acknowledged
  • We claim the right of fair use as ascertained by the author

AUTHOR

Ms. Fathima Zehra
Dr. P. Lakshmi Santhi