Category Archives: Robots

Nao (Softbank robotics) – Robots that write

After spending time with Nao (Softbank Robotics) in February, I am not in the slightest bit surprised that one of his many skills is the ability to write any word he is asked to, spelling the word aloud as he writes. Through speech recognition programming the robot is able to perform many tasks, but writing is a profound tool that can help those with literacy skill deficiencies, and of course those wanting to learn a language. It is another interesting feature that will support my current research.
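The recognise-then-spell behaviour described above can be sketched in a few lines. This is a simplified stand-in, not Softbank's actual implementation: the action strings and the `spelling_actions` function are my own invention for illustration, standing in for the robot's real speech and writing commands.

```python
# Hypothetical sketch of Nao's recognise-then-spell loop: a recognised
# word is turned into a sequence of "say" and "write" actions, one per
# letter, after announcing the whole word. The action format is invented.

def spelling_actions(word: str) -> list[str]:
    """Turn a recognised word into the robot's say/write action sequence."""
    actions = [f"say:{word}"]             # announce the whole word first
    for letter in word.upper():
        actions.append(f"say:{letter}")   # spell each letter aloud...
        actions.append(f"write:{letter}") # ...while tracing it on paper
    return actions

print(spelling_actions("cat"))
```

On the real robot, each `say` action would go to the text-to-speech module and each `write` action to a stored stroke plan for that letter; the point here is only the word-to-action decomposition.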

Automation: friend or foe?


The debate regarding automation is becoming increasingly charged as technology permeates ever more sectors of society. While users scoff at self-checkout tills in supermarkets, I'm not entirely convinced that shop assistants in the UK can offer a better service. Reflecting on a recent trip, I felt even more alienated as the people "serving" me looked utterly perplexed when challenged to engage in conversation beyond stating the price and asking for my method of payment. I usually leave the till disappointed and question whether a robot would in fact be capable of offering me better customer service, because it would be programmed to do so.

A recent white paper published by the Association for Advancing Automation (April 2017) puts forward several arguments regarding career sustainability and changing job titles as tasks evolve and shift more heavily towards automation. Automation is nothing new; society has relied on machines since the Industrial Revolution. What is new is the way society needs to adapt and make the appropriate changes to develop the skills required to support these evolving technological advances.

Many argue that robots will deprive manual labour workers of their employment opportunities; I would argue very differently and believe there is room for both manual and automated labour. Unimate, the industrial robot installed on a General Motors assembly line in 1961, was considered a welcome relief from the heavy-duty lifting and welding work deemed unpleasant and dangerous by the blue-collar workers who had previously carried it out. In today's society many of the most advanced robots continue to be those designed for industrial purposes, as automation seems to provide an attractive technological solution to rising labour costs in societies like China, South Korea and Japan, where there is still a strong emphasis on production. Many in fact see a clear correlation between automation and manufacturing and claim it could save the manufacturing industry in China.

Robots have also had a considerable impact on white-collar jobs, or knowledge workers. Robots that replace white-collar workers have woven their way into society in many contexts: in some societies, autonomous humanoid robots are already replacing shop assistants and bank tellers, which demonstrates the societal trend towards using robots in place of human workers in white-collar jobs. While the predicted abundance of robots in society and their effect on the white-collar labour force is perceived as a threat by many, I do not share that view. Automated machines have been integrated into our lives without a second thought, providing quick solutions in many contexts. Long gone are the days of queuing at the bank during banking hours to withdraw cash, or queuing to buy a train ticket. These machines are considered unobtrusive and their existence goes unchallenged, yet they are replacing white-collar workers. When the machine takes on a humanoid form, however, the convenience is often perceived as a threat. Maybe this is due to a lack of confidence that humans can carry out a task as efficiently as a robot, and, to return to the beginning of this post, maybe that explains the apparent decline in customer service skills nowadays.

Twenty years ago we could never have imagined the impact of digital technologies on society. Maybe we need to embrace the automation age and consider the career prospects as new careers and industries based around automation continue to grow. Instead of creating a skills gap, perhaps we should consider training options that embrace automation and the changes it has created in our society, irrespective of the sector we work in. Research and development investment in technology will continue, and that includes automation. I prefer to make the necessary changes to be prepared for what comes next, and to be served by humanoid robot shop assistants that are guaranteed to smile, be courteous, and ask if everything is okay and whether I would like any further help. But that is just me personally.



The value of learning human values from robots

Lately, I have been questioning the human-robot relationship, the natural reactions we as humans have towards humanoid robots, and the value of learning human values from robots.

While being perfectly aware that these robots are not living beings, recent interactions with Erica and Nao have made me realise that whether the interaction is with an android or a humanoid, my emotional reactions towards them are the same.

When touching Erica’s hand I was careful to place my hand on hers gently, refraining from any sudden movements that might startle her, just as I would with a human whose hand I was placing mine upon for the first time. Interacting with Nao, I was careful not to take his hand too firmly in mine as I walked him along the worktop, for fear of hurting or damaging him.

What is this inherent ‘care’ that the human brain automatically takes on when interacting with humanoid robots? A research sample from studies carried out by Hiroshi Ishiguro demonstrates that human interaction patterns with androids parallel those with humans, and the evidence suggests that it is the ‘humanness’ of the robot that provokes this subconscious reaction.


Android robots for language learning

I have recently become extremely interested in research carried out by Ishiguro regarding human responses to android robots. Using the Total Turing Test, it was possible to determine that subjects were unable to identify android robots when flash-exposed to them for one or two seconds while given the task of remembering the colour of the clothing the android was wearing. For me this demonstrates that the brain believes what it sees but is also influenced by what it wants to see. With respect to language learning and the use of androids, studies have demonstrated that the lack of emotion in androids supports learning in individuals with autism, because the androids do not respond emotionally to the subjects they are interacting with. This highlights important parallels with inhibition in language learning and the subconscious facial gestures teachers often make in response to learner performance. One raised eyebrow is enough for learners to become aware that something they said was incorrect, and they will react by losing their train of thought, pausing for correction, or stopping what they were saying altogether. Remove the teacher's facial gesture from the equation and the learner will probably continue to speak. Perhaps androids can offer a different solution to this problem.

Androids – Erica – Ishiguro – Geminoid

I have just returned from my annual trip to Japan, which has proved to be extremely insightful. I had the great pleasure of meeting Prof Ishiguro in Osaka and the opportunity to see some of his current research in action.

Ishiguro: Through his research, it is possible to gain a sense of Ishiguro’s motivation for creating android robots. He argues that society itself is responsible for shaping humans; therefore, by using a combination of computers, motors, and sensors, he is able to create androids capable of mimicking humans. Synergistic androids are created that, with exposure to language and human-robot interaction (HRI), are able to develop a personality, making them as human as any other being that depends on exposure to language, society, others and interaction to shape who they are and who they become. In addition, robotic research enables us to gain further insights into the activities of the human brain, and therefore a greater understanding of cognitive neuroscience. In this way robots reflect the activity of the human mind in a way that permits this understanding.

Robots in Japan: Japanese citizens openly accept robots and autonomous systems into their society, so they do not feel the need to distinguish between them and humans. Robots are considered beings like any other and take an active part in society: in theatre productions, and as caregivers, companions and shop assistants.

Erica: Erica, one of Ishiguro’s projects designed as a research platform for an autonomous conversational android, uses voice recognition software to interact with humans. Unfortunately my Japanese is not proficient enough to have successfully interacted with her myself, but here is a short clip of her talking with one of Ishiguro’s research students.

Intelligent microphones: Ishiguro is also working on intelligent microphones that would permit scheduled turn-taking among robots, thereby relieving the pressure on humans to partake in the interaction. From a pedagogical perspective this is a very interesting development for language training and for the education of people with communication disorders such as autism.
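To make the scheduled turn-taking idea concrete, here is a minimal sketch. The round-robin policy and the `TurnScheduler` class are my own assumptions for illustration; Ishiguro's actual scheduling logic is not public in this post.

```python
from collections import deque

# A toy round-robin turn scheduler among several speakers, sketching the
# "intelligent microphone" idea: turns are allocated in a fixed rotation,
# so the human participant only has to speak at predictable intervals.

class TurnScheduler:
    def __init__(self, speakers):
        self._queue = deque(speakers)

    def next_turn(self):
        """Return whose turn it is, then rotate them to the back."""
        speaker = self._queue[0]
        self._queue.rotate(-1)
        return speaker

sched = TurnScheduler(["Erica", "Nao", "human"])
turns = [sched.next_turn() for _ in range(5)]
print(turns)  # the human is only called on every third turn
```

A real system would of course gate turns on the microphone's speech-activity detection rather than a fixed rotation, but the queue captures the core idea of relieving conversational pressure.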

Geminoid: When asked about his students' reactions to learning with Geminoid, Ishiguro reported that they were all positive. Japanese communication etiquette is an inherent part of the country’s culture. By teaching his classes via a tele-operated android doppelgänger, Ishiguro confirms that students feel less intimidated about asking questions and extending their enquiry, which they may not otherwise do with the professor himself. Ishiguro also reports positive learning outcomes in European contexts (with 13 different nationalities) and is working alongside several European companies to bring these positive learning outcomes to a wider variety of contexts and nationalities.

The current goal for me is to get my Japanese to a proficient enough level to be able to reap the rewards of HRI myself.

Enhancing Emotional Facial Expressiveness on NAO: Pluggable Eyebrows

Apparently we say a lot with our eyes. I woke this morning to a short video, which you can watch here, of Nao, one of my favourite robots, who now has the option of pluggable eyebrows. It is believed that enhancing facial expressiveness can enhance emotional expression, so Nao is now able to express anger, sadness and delight more effectively. Nao was the first humanoid robot created by Softbank Robotics, in 2006, and has been continuously evolving since its release onto the market. It stands at only 58 cm tall and was designed as an interactive companion robot. While the new pluggable eyebrow option may not appear a revelation to many, it is yet another step towards giving humanoid robots a more human-like guise by providing them with the option of expressing emotion.



The first robot

I recently found out that the first robot was not in fact invented by the Japanese, as I had presumed, but by Leonardo da Vinci in 1515! Here is a clip of a modern-day replica of the robot. I find it fascinating that robotics dates back this far; in fact, da Vinci sketched plans for a humanoid robot as early as 1495.

(Image: Leonardo's Robots by Mario Taddei, p. 189)

Da Vinci’s mechanical lion was presented as the star gift in a pageant in honour of the new king of France in 1515. He also designed a mechanical knight, able to bend its legs, move its arms and hands, turn its head and open its mouth. It also had ‘talking’ capabilities, created using an internal automatic drum roll, and is sometimes claimed to be the first ‘programmable’ machine.


After reading an article in El País about using humanoid robots to help children with autism understand emotions, I started to dig deeper into the affective effects of learning with humanoid robots, and came across Milo.

Milo is considered a revolution amongst educators, therapists, and parents of children with autism. Milo is able to motivate children who find human interaction difficult by helping them practise interaction and social skills. The absence of emotion promotes communication in a context where emotional dysregulation usually creates anxiety and blocks children from comfortably socialising.

I find the results achieved with Milo extremely positive with respect to my own interest in the use of humanoid robots for language teaching and learning. Recent research has demonstrated that autistic children working with Milo and a therapist are engaged 70-80% of the time, compared to as little as 3-10% with ‘traditional approaches’.

It appears that the key element in increasing communication between autistic children and Milo is trust. Looking further into the use of humanoid robots for learning in South Korea, I found many parallels. Research carried out by Han et al. (2005) demonstrates that learning with IROBI was more effective than learning carried out with computers, and it was rated excellent for language learning. Learners were able to build a relationship with the humanoid robots that they had not managed to construct with the computers they had previously used. They considered the robot a teaching assistant, and its humanoid form made them feel at ease, which in turn promoted learning; in fact, “robots were effective in inducing motivation and enhancing achievement” (Han, 2012).

Ultimately, then, the emotional effect of learning can have a huge impact on the learner, and while humanoid robots are not humans per se, they are playing a huge role in the socially situated learning and social development of learners with low confidence and trust issues.

Speech synthesis, voice recognition and humanoid robots

Speech synthesis, or the artificial production of human speech, had been around long before the Daleks on Doctor Who. Apparently, the first speech-generating device was prototyped in the UK in 1960, in the shape of a sip-and-puff typewriter controller, the POSSUM. Wolfgang von Kempelen preceded all of this with a speaking machine built in leather and wood that had great significance in the early study of phonetics. Today, text-to-speech computers and synthesisers are widely used by those with speech impediments to facilitate communication.

Speech-to-text systems became more prominent thanks to the IBM typewriter Tangora, which held a remarkable 20,000-word vocabulary by the mid-1980s. Nowadays speech-to-text has advanced phenomenally, with the Dragon Dictation iOS software being a highly favoured choice. Our world is increasingly dominated by voice automation, from customer service menus by phone to personal assistants like Siri. Voice and speech recognition has also been used for identification purposes by banks since 2014.

I’m curious how these systems work: how they are programmed, what corpus is used and which accents are taken into consideration. Why? Because robots fascinate me, and I wonder whether it will be possible to “humanize” digital voices to such an extent that humanoid robots will appear more human than ever because of their voice production and recognition capabilities. It seems a far cry from the days of Speak & Spell, the kids’ speech synthesizer of the 80s, but it is looking increasingly probable as advances in AI develop.
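One small piece of the accent question can be shown with a toy example: the front end of a synthesiser maps written words to phoneme strings, and that mapping differs per accent. The lexicon below is a hypothetical two-word sample with rough, illustrative phoneme spellings, not a real pronunciation dictionary.

```python
# Toy illustration of why accent matters in a speech pipeline: the same
# written word maps to different phoneme strings depending on which
# accent's lexicon the synthesiser consults. Phoneme spellings are
# rough approximations, invented for illustration.

ACCENT_LEXICON = {
    "en-GB": {"tomato": "t-uh-m-AA-t-oh", "dance": "d-AA-n-s"},
    "en-US": {"tomato": "t-uh-m-EY-t-oh", "dance": "d-AE-n-s"},
}

def to_phonemes(word: str, accent: str) -> str:
    """Look up a word's phoneme string in the chosen accent's lexicon."""
    return ACCENT_LEXICON[accent][word.lower()]

print(to_phonemes("tomato", "en-GB"))  # t-uh-m-AA-t-oh
print(to_phonemes("tomato", "en-US"))  # t-uh-m-EY-t-oh
```

Real systems use large pronunciation dictionaries plus letter-to-sound models for words the dictionary misses, which is exactly where corpus choice and accent coverage come in.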

Developments have gone as far as Hiroshi Ishiguro’s Geminoid HI-1 android prototype humanoid robot. Ishiguro is a roboticist at Osaka University, Japan, who in 2010 created a Geminoid robot that is a life-size replica of himself. He used silicone rubber, pneumatic actuators, powerful electronics, and hair from his own scalp.

The Geminoid is basically a doppelgänger droid controlled by a motion-capture interface. It can imitate Ishiguro’s body and facial movements, and it can reproduce his voice in sync with his motion and posture. Ishiguro hopes to develop the robot’s human-like presence to such a degree that he could use it to teach classes remotely, lecturing from home while the Geminoid interacts with his classes at Osaka University.
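The core of a motion-capture teleoperation loop is simpler than it sounds: captured operator joint angles are clamped to the android's mechanical limits before being sent as actuator targets. The joint names and limits below are invented for illustration; they are not Geminoid's real specification.

```python
# Bare-bones sketch of one step of a motion-capture teleoperation loop:
# each captured operator angle is clamped into the robot's safe range
# before being issued as an actuator target. Joints/limits are invented.

JOINT_LIMITS = {            # degrees, hypothetical values
    "head_yaw": (-45.0, 45.0),
    "jaw_open": (0.0, 20.0),
}

def to_actuator_targets(captured: dict) -> dict:
    """Clamp each captured angle into the robot's mechanical range."""
    targets = {}
    for joint, angle in captured.items():
        lo, hi = JOINT_LIMITS[joint]
        targets[joint] = max(lo, min(hi, angle))
    return targets

# An over-rotated head capture gets limited to the robot's 45-degree stop.
print(to_actuator_targets({"head_yaw": 60.0, "jaw_open": 5.0}))
```

In a real system this mapping runs many times per second, alongside retargeting (operator skeleton to robot skeleton) and smoothing, with the operator's voice streamed in parallel.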

You can see a demonstration of the Geminoid here






From CALL, to ICALL, to MALL, to RALL

From CALL, to ICALL, to MALL, to RALL, oh how we’ve moved on!

The pioneering drill-and-practice CALL (Computer Assisted Language Learning) programmes that dominated the style of learning in the 1960s and 1970s have witnessed many changes. The 1980s brought about the first radical change in the form of ICALL (Intelligent Computer Assisted Language Learning), where NLP (Natural Language Processing) helps computers understand the structure of human language in order to generate it from a computational data structure.
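For anyone who never met the early systems, the drill-and-practice pattern amounts to very little code: present a prompt, compare the learner's answer against one expected form, give binary feedback. The drill items and function below are invented examples of that classic pattern.

```python
# Minimal sketch of a 1960s-style drill-and-practice CALL exercise:
# exact-match marking with right/wrong feedback and no error diagnosis,
# which is precisely the limitation ICALL's NLP later addressed.

DRILL = [
    ("I ___ (go) to school yesterday.", "went"),
    ("She ___ (see) the film last week.", "saw"),
]

def check_answer(expected: str, given: str) -> str:
    """Classic CALL feedback: string comparison, nothing more."""
    return "Correct!" if given.strip().lower() == expected else "Try again."

print(check_answer("went", "went"))  # Correct!
print(check_answer("saw", "seen"))   # Try again.
```

The contrast with ICALL is that an NLP-based tutor could parse "seen" as a past-participle confusion and respond with targeted feedback, rather than a flat "Try again."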

Dramatic shifts in our electronic environment have resulted in mobile technology navigating our learning environment, and MALL (Mobile Assisted Language Learning) is becoming ever more popular as educators incorporate smartphones and tablets into their teaching practice. This form of mobile technology also extends to RALL (Robot Assisted Language Learning).

Humanoid robots are already being used for educational purposes and language learning in the US, Japan, and Korea. Japan and the US are using robots as peer tutors, while Korea is using them as teaching assistants and “friends” to generate motivation and increase learning achievement. In the US and Korea the robots use visual instructional materials, while in Japan the interaction is gesture- and voice-based. Unsurprisingly, RALL is already in full swing in Korea. iRobiQ is an example of an anthropomorphized robot that has been developed with a face and a tablet interface attached to its chest, like Pepper. The advantage of iRobiQ is its emphasis on education and language learning, whereas Pepper has been created for companionship.

So just how long will it be before we start hanging out with humanoid robots in our staff rooms and teaching institutions I wonder?!