That’s Not A Droid, That’s My Girlfriend

March 10, 2014


Osamu Kozaki’s life in Tokyo is, by his own admission, often a lonely one. The 35-year-old, an engineer who designs industrial robots, has had few relationships with women in his life. Those few have almost always gone badly.

So when Kozaki’s girlfriend, Rinko Kobayakawa, sends him a message, his day brightens up. The relationship started more than three years ago, when Kobayakawa was a prickly 16-year-old working in her school library, a quiet girl who shut out the world with a pair of earphones that blasted punk music.


Kozaki sums up Kobayakawa’s personality with one word: tsundere – a popular term in Japan’s otaku geek culture for a certain feminine ideal. It refers to the kind of girl who starts out hostile but whose heart gradually grows warmer. And that’s what has happened; over time, Kobayakawa has changed. These days, she spends much of her day sending affectionate missives to her boyfriend, inviting him on dates, or seeking his opinion when she wants to buy a new dress or try a new hairstyle.

But while Kozaki has aged, Kobayakawa has not. After three years, she’s still 16. She always will be. That’s because she is a simulation; Kobayakawa only exists inside a computer.
Kozaki’s girlfriend has never been born. She will never die. Technically, she has never lived. She may be deleted, but Kozaki would never let that happen.


Because he’s in love.


Kozaki is one of hundreds of thousands of Japanese who have bought Love Plus, a game released on the Nintendo DS in 2009, which is intended to simulate the experience of high-school romance with one of three pre-programmed teen girl characters. For a sizable number of loyal male gamers, it has become something more: a relationship that, if not entirely like dating a real woman, comes close, at least as a source of affection.
“I really do love her,” Kozaki explains, when he and two of his friends meet with me in a coffee shop in Akihabara, the Tokyo neighbourhood at the centre of Japan’s otaku culture. Kozaki fully expects the game to be a lifelong commitment. “If someone were to ask me to stop, I don’t think I could do it,” he says.

Kozaki recounts what happened when an updated version of the game came out, which meant he had to move his saved data across to the new version. He couldn’t bear the idea of two simultaneous versions of his virtual girlfriend existing, so he asked a friend to delete the old saved data for him. It was — almost — as if he had arranged for someone to be murdered, he says. “I cried when he pushed that delete button,” he says, acknowledging that it sounds strange. “It was as if I crossed a border line from reality.”


One of his friends, Yutaka Masano, 37, feels the same about the possibility of losing his girlfriend, who is also the Kobayakawa character. “I can’t imagine how I’d feel if I lost the data. My mind would go blank, I wouldn’t be able to think at all,” he says.
Both men, along with another friend, 39-year-old Nobuhito Sugiye, can articulate a philosophical basis for their affection and their fear of loss. That is, for them these computer girls possess the same tamashii — spirits — that devotees of Japanese animism, or Shinto, believe can inhabit all things, from rocks and streams to humans.
“Everything is equal. We have no borders between robots and people,” Kozaki explains.


“In the foreign stories, robots are always the enemies. In Japan, they’re our friends.”

This is the future Japan is working towards. The country is not alone in pursuing cutting-edge work in robotics and artificial intelligence (AI). The United States, in particular, has led the development of military robotics, sparking often anguished debates over the ethical issues posed by drones that can spy and kill at great distances from both their targets and controllers. Similarly, American-born AI systems such as Apple’s virtual personal assistant, Siri, and Watson, the IBM computer that in 2011 defeated human champions on the quiz show Jeopardy!, and even Google’s search engine, have produced stunning results.
But Japanese robotic engineering has a fundamentally different approach. While most of the world’s robotics development has been focussed on creating largely impersonal machines, to work or kill on behalf of humans, robots in Japan are being specifically developed to be personable, sociable and endearing — to be friends rather than slaves.


Kozaki and his friends obsessing over AI girlfriends may seem bizarre now. But if roboticists working in Japan have their way, these men may prove to be trendsetters. In the near future, as the line between humans and machines blurs, many of us may well develop affection for — or even fall in love with — robots.

And for that we can thank Shinto, which, in one way or another, colours the belief system of nearly every Japanese person. Add to that Japan’s ageing population, the country’s anti-immigrant xenophobia, the Second World War and Astro Boy.


MORE THAN ANY COUNTRY ON EARTH, Japan is getting old. Already, 23 per cent of the population is over the age of 65. By 2050, it is estimated that two out of five Japanese will be elderly, and the population as a whole will have contracted by tens of millions.
Most of the world’s other developed countries can respond to their greying populations simply by importing people — increasing immigration. But public sentiment in Japan, which has one of the world’s most homogenous cultures, is largely against this option. The country has long been averse to large-scale immigration, and although it has, in recent decades, actively tried to recruit foreign aged-care nurses, these efforts have not been enough to meet demand.


Instead, Japan’s government and corporations have poured significant amounts of money into developing often cuddly-looking robots that can perform aged-care duties. The government is aiming to have robots working widely in Japanese homes by 2018, with the main emphasis on machines that can help lift the bed-ridden elderly, monitor the senile, assist people with going to the toilet, and generally aid people’s mobility. Japan’s Ministry of Economy, Trade and Industry will spend 3.3 billion yen, or about AUD 34 million, in the next financial year on the research and development of such service robots. Universities and companies are also kicking in large amounts of research funding.


After Japan was defeated in the Second World War, its American occupiers left the country with a new constitution that renounced Japan’s right to wage war. Today the country still officially has no military (although it does have a sizeable and well-equipped “Self-Defence Force”). In the post-war period Japan’s universities and its big conglomerates mainly turned their focus to a technology-driven economic recovery. Mitsubishi, for example, went from making Zero fighter planes to making affordable family cars. And over time, an army of industrial robots came to underpin Japan’s manufacturing miracle, before the economy went into a tailspin in the early 1990s.

Thus, Japanese culture has, for more than half a century, been infused with a belief in redemption by robot. While American science fiction has long obsessed over the risk of robots rebelling against humanity — think 2001: A Space Odyssey or the Terminator movies — the robots of Japan’s popular culture have been far friendlier. Astro Boy came to life in manga form in 1952, as a thinking, feeling robot child powered by nuclear energy — just seven years after the atomic bombings of Hiroshima and Nagasaki. There have been countless such characters since.


But Japan’s actual robot success story has been mixed when it comes to producing the ultimate dream: advanced human-like robots. Honda has received a constant stream of publicity for its costly ASIMO, a humanoid robot billed as the most advanced of its kind. The product of nearly three decades of development, the latest version of ASIMO, unveiled in 2011, can run, hop on one leg, climb stairs and pour drinks, all on a lithium-ion battery that lasts for one hour. But this is clearly still far from being a robot that people could use in their homes. When the Fukushima nuclear crisis struck in 2011, many Japanese expected the country’s robots to play a role in the response. In the end, the role they played was far smaller than hoped; Western-developed robots proved better suited to the job.


The most promising aged-care robots now in development are much less exciting than science fiction’s promise of realistic androids. Instead, they reflect the narrow goals the government has for them: that they be able to perform a fairly limited range of tasks, to help, rather than replace, human caregivers.
But that doesn’t mean research into realising the android dream isn’t proceeding apace. Slowly, and in piecemeal fashion, things are coming together. It’s just that such robots need a certain kind of smarts.
If you’re going to put a robot in a situation that involves dealing closely with people, it needs to be able to think about how humans move, gesture, communicate and feel, explains Professor Toyoaki Nishida, who is researching artificial intelligence at Kyoto University.

Imagine putting a fully autonomous robot in a room with a frail elderly person: one misjudged move by the robot could cause an injury. And how useful is a robot nurse that can’t recognise an important gesture like a wince, the pointing of a finger or a wave for help? Assuming you can design a robot with these capabilities, once you put it in a room with dozens of individuals, all darting in different directions and sending different cues, things get even more difficult.
Overcoming this problem actually presents a key opportunity to create artificial intelligence that starts to calculate and anticipate what humans can do, Nishida explains.


“So far the artificial intelligence people have been working just on [simulating] the mind — on programs, software. That sort of thing has caused a lot of difficulty [for] building a mind, deep artificial intelligence, because without a body it’s very hard for us to give intelligence to things,” he says. “Our physical environment is much more complicated than a chess board.”
Nishida leads a small international team focussed on the problems that come with human-robot interactions. It’s deeply immersive work. In one approach, Nishida’s researchers enter a specially designed chamber equipped with motion sensors and screens relaying a 360-degree view of the outside environment; the chamber is in turn linked to a robot. The researcher inside the chamber sees, on the screens, what the robot sees through its camera. And when the human moves, so does the robot. The aim, says Nishida, is that by observing how a human operating a robot body interacts with the world, researchers will be able to compile data on the complexities of how humans interact with their environment — including other people. The whole process involves a huge array of sensors and the analysis of massive amounts of collected information.

And yet getting a robot to move well is the easy part. By far the greatest challenge in developing AI is to create machines that can converse, Nishida says. That requires AIs that can comprehend speech, understand the layers of meaning behind it, and then formulate responses that make sense. A misstep at any stage can lead to the AI giving the wrong response, or not responding at all.
It’s a problem one of Nishida’s team, Yasser Mohammad, calls “going off the script”. You can only program a robot to deal with a certain number of situations in ways that make it seem convincingly intelligent. Throw up an unexpected situation and it immediately becomes clear that you are dealing with a machine.


The solution, Mohammad explains, is to throw away the hard coding — an approach that’s also being used in the West, but in which Japan is a key player.


“Our long-term goal is we want robots to learn natural behaviours the same way a child learns,” he says. Mohammad and his colleagues have developed robots that can do very little at first, but which learn how to do physical tasks through repeated instruction by humans. At first, they are slow to pick up tasks, but after several tries they make rapid progress. It’s a process that Mohammad, an Egyptian, likens to his own children learning Japanese.
Ultimately, he says, “If you go off the script, the robot will be able to go off the script with you.” When robots reach that stage of development, their behaviour will be hard to distinguish from that of a human. It’s also the point at which a big philosophical question kicks in: Is this thing actually conscious?

“The maximum we can achieve is behavioural intelligence,” Mohammad says. In other words, we can create a robot that behaves like a living, thinking being. “It’s up to you to decide if someone is actually inside or not.”
The idea may seem a little far-fetched, but in fact it takes very little for a person to see human qualities in a machine.
Watching demonstration videos at Nishida’s lab, I experience a moment of surprising emotional impact. In one clip, a human points down to an object on a bench in order to get a robot to interact with it. The robot, a fairly unimpressive-looking pile of nuts and bolts, doesn’t yet know what a pointing motion means.
But it does know to follow a human’s gaze. Taking its cue from the person, the robot dips its head and glances at the bench. Instantly, the robot no longer seems to be a lifeless machine. It has interacted with a human in a way that looks as if it is responding to his desires or interest in something. The sense that they have an emotional connection, a shared interest, is inescapable. Even though I know it’s not true, my human interpretation of what I see is telling me otherwise, and I feel a twinge of empathy for the robot.
This sort of response is inevitable, says David Levy, a British chess master who has written extensively on AI. Humans, especially children, actually have a strong tendency to form attachments to objects, including computers.
Even when the technology is relatively rudimentary, some people can become very emotionally invested in it. In the 1990s, many people formed almost obsessive relationships with the Tamagotchi, a virtual pet that is nothing more than a bleeping plastic egg with an LCD screen. The pet’s constant need for food and care prompted a strong nurturing response in some people.

Levy believes that the more robots come to respond to human emotional desires — the need for company, for empathy and nurturing — the more we’ll become emotionally attached to them.
“As to humans forming close emotional relationships with robots, I believe it will be about 40 years until people in large numbers are falling in love with robots and even marrying them — but in smaller numbers it will happen earlier,” he says.
The appearance of the machines, as well as their artificial intelligence, plays an important role in our tendency to respond to them. Japanese robotics design has seen a profusion of attempts to make machines relatable. Some have focused on plush toy-like cuteness, as in the case of Paro, a fluffy seal intended to comfort sick children and the elderly by responding to their caresses and emotional states.


Others have attempted to imitate human appearance itself.
IN PERSON – IF INDEED PERSON IS THE RIGHT WORD – Geminoid F can appear strikingly beautiful in a way that doesn’t translate to photos or video. Her hair is smooth and glossy, and falls across her delicately translucent, pale silicone face. As she sits in the chair at Osaka University, she runs through an idling program of random motions. She blinks, fidgets and makes small, distracted movements with her lips. When she turns her head and looks at you with her plastic eyes, the effect is thrilling and unsettling. It feels as if you’re being stared at, a little too intently, by an attractive stranger.


Geminoid F is one of the latest in a series of machines designed by engineers led by Hiroshi Ishiguro, a roboticist who has gained a measure of international fame for creations made to realistically resemble people. In 2000, Ishiguro started with a small, child-like robot, Repliee R1, modelled on his own daughter. He soon moved on, creating a series of four geminoids modelled on himself (the word geminoid denotes a robot copy of a real person), as well as three robots modelled on other people.

Ishiguro’s goal is to overcome the “Uncanny Valley”, a term coined in 1970 by the Japanese roboticist Masahiro Mori. It refers to a problem that occurs in both robot design and 3D animation: that is, the more human-like you make something, the more appealingly familiar it becomes, until you cross a certain line and the thing looks and acts too much like a human — then it suddenly becomes terribly creepy. A jerky movement or a dead-eyed stare can render a wonderfully humanoid robot suddenly repulsive.
Western robotics, with its emphasis on military and industrial applications, has been less concerned with this problem. But it’s an essential challenge for the kind of intimate roles envisioned for Japanese robots.


“My goal is to understand what human is,” Ishiguro says. “By making a copy of a human, we can understand humans.”
This work involves obsessive study of the quirks and mannerisms of human subjects so that they can be replicated in a robot. When it came to creating a copy of himself, Ishiguro had to pass the job on to colleagues. But after a lifetime of recognising his own face in a mirror, he says seeing his robotic clone seems more like meeting a long-lost twin brother than meeting himself.
“It doesn’t look like a mirror [image], therefore I can’t accept the geminoid face is my face. It’s a bit confusing.”
Ishiguro’s work is very much focussed on outside appearance. The geminoids are not built to communicate or move autonomously. Instead, they’re a vehicle for exploring how close to human-like robots can become. Geminoid F, based on an anonymous female model, has already been used in performances, including a travelling “robot theatre” that made its way to Australia last year. And, in an experiment, Geminoid F replaced the human receptionist at a company’s front desk, greeting visitors as they entered. Only 20 per cent of people noticed something was amiss, Ishiguro says.

At one stage late in our interview, after I have switched off the video camera and Ishiguro is standing behind Geminoid F, I ask him if I can see beneath her skin. He fiddles with the seam at the back of her skull before saying no. But his hand lingers, and I notice he is tenderly stroking her hair.


I ask Ishiguro whether he’s started having feelings for the robot.
“Maybe. [It’s] confusing. We are working so many years together. I'm very sure my students, some of them are loving this humanoid,” he says. “The relationship is very human.”


One day, Ishiguro says, when the parts of one of their geminoids wear out, it will be time to dispose of it. When that happens, he and his students will hold a memorial service.


The confusing emotional questions that come with human-robot interaction have given rise to an emerging sub-discipline, dubbed “lovotics”. An academic journal of the same name was launched this year, aimed at addressing the question of how robots can enrich human emotional lives.


Adrian David Cheok, an Australian who is now a professor at Keio University in Tokyo, is one of the journal’s founders. The way he sees it, the internet has already helped bring people closer together. But it’s an experience limited by the fact that the internet currently only interacts with two of our senses: sight and sound. Anyone who has been brought back to childhood by a smell, or been comforted by a hug or touch — in other words, pretty much everyone — knows how powerful such senses can be.

“Actually, physically it’s also been shown that the smell and taste senses are directly connected to the limbic system of our brain. The limbic system of our brain is responsible for emotion and memory. Unlike the visual sense, which basically gets processed by visual cortex and then the frontal lobe, which is higher-order, logical part, we have direct connection between smell and taste and the emotional and memory part of our brain,” Cheok says.


“So much of our lives now is online, but still I think a lot of us will agree it’s so different than meeting someone face-to-face. You have all these different physical communications that we can’t capture now through an audio/visual screen,” he says. “Essentially I’m really interested in [whether we can] merge all of our five senses of human communication with the internet — with the virtual world. That’s what I call ‘mixed reality’.”


Robotics plays a key role in making that a reality, through what is known as telepresence. Basically, it means transmitting actions into a robotic surrogate somewhere else. This can be fairly simple, Cheok says. Cheok and his students have already developed a ring worn on the finger that can deliver a gentle squeeze from a loved one, via a smart phone app. A student of Cheok’s has recently commercially released a vest that can transmit hugs, which is proving useful for calming autistic children. Cheok’s engineers are working on systems to transmit taste, via electrical impulses to the tongue, as well as smell, either via electrical stimulation or the release of chemicals.
The goal further down the track will be the creation of robotic avatars — representations or embodiments of people, though not necessarily made to look like them. To start with, these will be soft, fluffy and not particularly complex. For example, we could transmit our presence into a pillow or teddy bear. But as the endeavours of such scientists as Hiroshi Ishiguro progress, the creation of human-like surrogates will become possible.


“We’re definitely getting there… The rate of change of technology is exponential. What before maybe we thought would take 50 years now takes 5 or 10. I don’t think it’s going to be very far off when we have humanoid robots. They may be expensive at first,” Cheok says.

“I think at that stage, we can have virtual avatars; virtual robots which then, for example, [let you] be in Tokyo or Sydney and give a conference in Los Angeles. You don’t have to fly there. Your robot can be there.”
If there’s one major obstacle in the way of Japan’s projected robo-utopia, it’s the country’s economic situation. Japan has been in a state of economic malaise for more than two decades, and memories of the robot-supported boom years are fading. Neither the companies likely to do the research nor the Japanese government are as flush with cash as they used to be.


One of Japan’s major strengths — its peacenik constitution — has also proved to be a weakness. In the United States, the massive military-industrial complex has marshalled resources to create some truly impressive machinery; drones, for example, have been developed to meet guaranteed demand from government agencies. In Japan, however, there is little co-ordination between different institutions and industries, explains Nishida of Kyoto University.


“People are just interested in working on small parts of the problem, rather than looking at the whole,” Nishida says. While some work on artificial intelligence, others are focussed on the outer physical appearance of robots. With co-ordination and plenty of funding, a fairly complete intelligent android could be built within the next decade or two, he says. Under current conditions, it will probably take longer.
But the consensus is that such robots are coming, and that they will most likely be made first in Japan.
Cheok, of Keio University, says he’s not convinced we’ll produce thinking, feeling, conscious robots until at least the middle of the century, if at all. But he is certain we’re heading towards a loving technological future.


Thanks to their Shinto beliefs, the Japanese have fewer cultural barriers standing in the way of forming close emotional bonds with machines. But as robots become smarter and better looking, he says, many more people of other cultures will become ensnared.
“I think the thing is that we already develop bonding with not very intelligent beings. As a kid you might have kept a pet hamster or pet mouse. They’re not actually so intelligent. But I think that a kid can even cry when the hamster dies,” he says.
“I'm not a biologist. I don't know why we developed empathy but I’m sure there’s an important evolutionary reason why we developed empathy. That empathy doesn’t just stop at human beings. We can develop empathy for small creatures and animals. I don’t think the leap is very far where you can develop empathy for robots.”

(Source: http://www.theglobalmail.org/feature/thats-not-a-droid-thats-my-girlfriend/560/)