Category Archives: Robots

THE AFFECTIVE EFFECTS OF LEARNING WITH HUMANOID ROBOTS

After reading an article in El País about using humanoid robots to help children with autism understand emotions, I started to dig deeper into the affective effects of learning with humanoid robots, and came across Milo.

Milo is considered a revolution amongst educators, therapists, and parents of children with autism. Milo motivates children who find human interaction difficult by helping them practise interaction and social skills. Because the robot shows no emotional judgement, it promotes communication in a context where emotional dysregulation usually creates anxiety and blocks children from socialising comfortably.

I find the results achieved with Milo extremely positive with respect to my own interest in the use of humanoid robots for language teaching and learning. Recent research has demonstrated that autistic children working with Milo and a therapist are engaged 70-80% of the time, compared to as little as 3-10% with ‘traditional approaches’.

It appears that the key element in increasing communication between autistic children and Milo is trust. Looking further into the use of humanoid robots for learning in South Korea, I found a lot of parallels. Research carried out by Han et al. (2005) demonstrates that learning with IROBI was more effective than learning carried out with computers, and the robot was rated excellent for language learning. Learners were able to build up a relationship with the humanoid robots that they hadn’t managed to construct with the computers they had previously used. They considered the robot a teaching assistant, and its humanoid form made them feel at ease, which in turn promoted learning; in fact, “robots were effective in inducing motivation and enhancing achievement” (Han, 2012).

Ultimately, then, the emotional effect of learning can have a huge impact on the learner, and while humanoid robots are not humans per se, they are playing a huge role in the socially situated learning and social development of learners with low confidence and trust issues.

Speech synthesis, voice recognition and humanoid robots

Speech synthesis, or the artificial production of human speech, had been around long before the Daleks on Doctor Who. Apparently, the first speech-generating device was prototyped in the UK in 1960, in the shape of a sip-and-puff typewriter controller, the POSSUM. Wolfgang von Kempelen preceded all of this with a speaking machine built of leather and wood that had great significance in the early study of phonetics. Today, text-to-speech computers and synthesisers are widely used by those with speech impediments to facilitate communication.

Speech-to-text systems became more prominent thanks to IBM’s Tangora, a voice-activated typewriter that held a remarkable 20,000-word vocabulary by the mid-1980s. Nowadays speech-to-text has advanced phenomenally, with the Dragon Dictation iOS software being a highly favoured choice. Our world is increasingly dominated by voice automation, from customer service menus by phone to personal assistants like Siri. Voice and speech recognition has also been used for identification purposes by banks since 2014.

I’m curious how these systems work: how they are programmed, what corpus is used, and which accents are taken into consideration. Why? Because robots fascinate me, and I wonder if it will be possible to “humanize” digital voices to such an extent that humanoid robots will appear more human than ever because of their voice production and recognition capabilities. It seems a far cry from the days of Speak & Spell, the kids’ speech synthesiser of the 80s, but it is looking increasingly probable as advances in AI develop.
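To give a flavour of how early speech recognisers worked (nothing like the neural models behind Siri today), many matched incoming audio features against stored word templates using dynamic time warping. Here is a minimal sketch in Python, with invented 1-D “feature” sequences standing in for real acoustic vectors:

```python
# Toy illustration of dynamic time warping (DTW), the template-matching
# idea behind early speech recognisers. Real systems align sequences of
# acoustic feature vectors; the numbers below are invented.

def dtw_distance(a, b):
    """Cost of the cheapest monotonic alignment between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # step both
    return cost[n][m]

def recognise(utterance, templates):
    """Pick the template word whose feature sequence aligns most cheaply."""
    return min(templates, key=lambda w: dtw_distance(utterance, templates[w]))

# Hypothetical stored templates for two words.
templates = {"yes": [1, 3, 4, 4, 2], "no": [2, 2, 5, 7, 7]}
print(recognise([1, 3, 3, 4, 2], templates))  # a slightly warped "yes"
```

The warping is what lets the same word be recognised whether it is spoken quickly or drawn out, which is exactly why accent and corpus choice matter so much.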

Developments have gone as far as Hiroshi Ishiguro’s Geminoid HI-1 android prototype humanoid robot. Ishiguro is a roboticist at Osaka University, Japan, who created a Geminoid robot in 2010 that is a life-size replica of himself. He used silicone rubber, pneumatic actuators, powerful electronics, and hair from his own scalp.

Geminoid is basically a doppelganger droid controlled by a motion-capture interface. It can imitate Ishiguro’s body and facial movements, and it can reproduce his voice in sync with his motion and posture. Ishiguro hopes to develop the robot’s human-like presence to such a degree that he could use it to teach classes remotely, lecturing from home while the Geminoid interacts with his classes at Osaka University.

You can see a demonstration of Geminoid here


From CALL, to ICALL, to MALL, to RALL

From CALL, to ICALL, to MALL, to RALL, oh how we’ve moved on!

The pioneering drill-and-practice CALL (Computer Assisted Language Learning) programmes that dominated the style of learning in the 60s and 70s have witnessed many changes. The 1980s brought about the first radical change in the form of ICALL (Intelligent Computer Assisted Language Learning), where NLP (Natural Language Processing) helps computers understand the structure of human language in order to be able to generate it from a computational data structure.
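As a crude illustration of what “generating language from a computational data structure” means, here is a toy grammar expansion in Python. The grammar and words are invented for the example and bear no relation to any real ICALL system:

```python
# A tiny context-free grammar stored as a dict: each symbol maps to a list
# of possible productions. expand() walks the structure recursively and
# emits words, i.e. it generates language from a data structure.

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["student"], ["robot"]],
    "V":  [["greets"]],
}

def expand(symbol, choose=lambda options: options[0]):
    """Expand a grammar symbol into a flat list of terminal words."""
    if symbol not in GRAMMAR:          # not a rule, so it is a word
        return [symbol]
    production = choose(GRAMMAR[symbol])
    words = []
    for part in production:
        words.extend(expand(part, choose))
    return words

print(" ".join(expand("S")))  # -> "the student greets the student"
```

Swapping the deterministic `choose` for a random one would produce varied sentences from the same structure; real ICALL systems add far richer grammars plus analysis of the learner’s own output.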

Dramatic shifts in our electronic environment have resulted in mobile technology permeating our learning environments, and MALL (Mobile Assisted Language Learning) is becoming ever more popular as educators incorporate smartphones and tablets into their teaching practice. This form of mobile technology also extends to RALL (Robot Assisted Language Learning).

Humanoid robots are already being used for educational purposes and language learning in the US, Japan, and Korea. Japan and the US are using robots as peer tutors, while Korea is using them as teaching assistants and “friends” to generate motivation and increase learning achievement. In the US and Korea the robots use visual instructional materials, while in Japan the interaction is gesture- and voice-based. Unsurprisingly, RALL is already in full swing in Korea. iRobiQ is an example of an anthropomorphised robot that has been developed with a face and, like Pepper, a tablet interface attached to its chest. The advantage of iRobiQ is its emphasis on education and language learning, whereas Pepper has been created for companionship.

So just how long will it be before we start hanging out with humanoid robots in our staff rooms and teaching institutions I wonder?!

Companion robots

Selfishly this week I have been trying to sneak humanoid robots into every aspect of my teaching. I am totally smitten with them, so I am reading extensively on the subject.

One topic of discussion that naturally arose from some of the conversations I had with my students was that of companion robots. In Japan, companion robots are filling many social gaps, the primary one created by the country’s rapidly ageing population.

Another kind of hope robots can offer on the spectrum of companionship is that of pets, like PARO the fluffy baby seal. PARO is marketed as a therapeutic robot, claimed to reduce stress and improve relationships between patients and their caregivers. Other robotic pets include AIBO the robot dog, created by Sony in 1999. They are perfect pet solutions without the maintenance a domestic pet requires.

Perhaps the most intriguing kind of companionship robots can offer is romantic companionship. Long gone are the days of computers waiting for humans to provide a sense of significance; the humanoid robots of today are able to meet our gaze, track our motions, speak to us and recognise us. While for some this immediately raises ethical issues, for others robots could provide all the comforts of companionship without the obligation of commitment, or the perfect partner that is there when you want them but can be switched off! Pepper is the first humanoid robot to be adopted in Japanese households, and you can read more about Pepper here

Asimo, Pepper and Robots

My first encounter with bioengineered or biorobotic androids was back in 1992, courtesy of Ridley Scott’s replicants in Blade Runner.

Today, with the avalanche of digital learning platforms, apps, AI, VR, and AR, we are being flooded with practices of imitation learning, while at the same time adaptive learning seeks to personalise learning experiences.

Neural computation has been written about and researched since the 1940s, and I was reminded of these technological advances on a recent trip to Japan, birthplace of Honda’s famous Asimo.

Can robots really replace humans, and in what capacity? Is it possible to perfectly clone the human biological neural network with artificial neural networks or neurodes? If so, to what extent, and what place do humanoid robots have in society? How will this affect teaching and learning, and in which contexts? These are questions that I am going to research this year, so I will be sharing my ideas here.

In the meantime, it’s Happy New Year from me and Happy New Year from Pepper.