Enhancing Emotional Facial Expressiveness on NAO: Pluggable Eyebrows

Apparently we say a lot with our eyes. I woke this morning to a short video, which you can watch here, of Nao, one of my favourite robots, who now has the option of pluggable eyebrows. Enhancing facial expressiveness is believed to improve emotional expression, so Nao can now express anger, sadness and delight more effectively. Nao was first released in 2006 by Aldebaran Robotics (now SoftBank Robotics) and has been continuously evolving since its release onto the market. It stands at only 58 cm tall and was designed as an interactive companion robot. While the new pluggable eyebrow option may not seem a revelation to many, it is another step towards giving humanoid robots a more human-like guise by providing them with a means of expressing emotion.

The first robot

I recently found out that the first robot was not in fact invented by the Japanese, as I had presumed, but by Leonardo da Vinci in 1515! Here is a clip of a modern-day replica of the robot. I find it fascinating that robotics dates back this far; in fact, da Vinci sketched plans for a humanoid robot as early as 1495.

(Image: Leonardo's Robots by Mario Taddei, p. 189)

Da Vinci’s mechanical lion was presented as the star gift in a pageant in honour of the new king of France in 1515. He also designed a mechanical knight, able to bend its legs, move its arms and hands, turn its head and open its mouth. It also had ‘talking’ capabilities, created using an internal automatic drum roll, and is often claimed to be the first ‘programmable’ computer.

Reflections on giving a plenary at Innovate ELT

As a teacher, I don’t give standing in front of a class a second thought; it’s what I do. The students “want” to learn English or study skills, so I do what I can to help them. Giving a talk at a conference is slightly different: while the people who attend your talk want to listen to what you have to say, they have based their decision on a very short description of your talk, often only 60 words long. The attendees are a mixture of professionals in the field: teachers, materials writers, ELT consultants, editors, academic directors, and the list goes on. You deliver your talk, hope your message has reached some of the people in the room and that they will take something away, you check the conference programme for the next talk you’d like to attend, and off you go!

On Friday 6th May I was given the opportunity to give my first plenary at the Innovate ELT conference in Barcelona. Needless to say, talking to a smaller group of people who have chosen to listen to you is quite different from addressing the entire crowd of conference attendees.

The chance to put my neck on the line in front of the entire conference was made possible by its format, which allowed speakers to apply to give a plenary. This is quite unusual for a conference, but it was a prospect I jumped at, so I applied and was accepted.

Despite being very nervous prior to standing on the edge of a 2-metre wall in front of a sea of people, I chose to put myself in this position because I believe it’s good to put ourselves out of our comfort zone every now and again, and challenge ourselves. I attended some great talks at Innovate ELT that gave me plenty of new ideas to consider, but along with making new friends and catching up with others, giving a plenary was probably the highlight for me on a personal level because it was exciting and pushed me to do something I had never done before. My only regret is not having a recording so I can look back, reflect and learn from my experience. Next time!


IATEFL 2016 Birmingham – Instant messaging with learners – Talk summary

Here is a summary of the talk I gave at IATEFL 2016 in Birmingham, ‘Instant messaging with learners: chilled out chatroom or creepy treehouse?’

The talk was based on research I carried out while teaching a 10-week pre-sessional course to a group of 16 multilingual post-graduate students. I set out 4 learning objectives that I wanted to achieve during the course, but I needed a strategy to meet them.

The 4 learning objectives were:

  1. Boost student motivation for academic writing.
  2. Increase student collaboration.
  3. Establish a sense of community in the classroom.
  4. Improve academic writing skills.

I chose to harness the affordances of students’ smartphones as a learning tool: using a cross-platform mobile instant messaging app, I set up a class group on WeChat. I chose this app specifically because students could sign up using their Facebook accounts, so there was no exchange of telephone numbers or infringement of privacy.

I wanted to draw a connection between the constant tweeting, social media updates and instant messages my students were writing all day long and academic writing, while also trying to bring a motivational and fun element to the learning.

The students were guided through a range of activities both inside and outside the classroom and were required to share their ideas and collaborate with peers as they interacted using WeChat.

Synonym race

Using a selection of high frequency words in academic English, I sent words one by one to the group. Each time I sent a word, the first group of students to reply with a synonym got a point. This motivated students to think quickly, and added a fun element to the activity. It also helped widen their lexical range and they were able to refer back to the messages during the course to find and use lexis they needed.

Lecture summary

This was a collaborative writing task where students shared their notes from the weekly lecture and wrote a summary together in no more than 100 words. The four groups wrote their summaries on WeChat and sent them to the group. Each group read the other groups’ summaries and noted any inaccuracies, points they wanted to question, and things they liked. Each group then read their summary aloud to the whole class, and as they did so, any student could call out ‘stop’ at anything they wanted to question, and it was discussed openly. Because the writing had been carried out collaboratively, no student felt singled out or undermined.

TED talk summary of main ideas

For homework, I gave the students a TED talk to watch in their own time over the weekend. I asked them to post a summary of the main points of personal interest, supporting their points with evidence and reasons (one of the tenets of academic writing). They sent their summaries to the WeChat group and I moderated them. This was not time-consuming for me, and it gave the students a sense of ownership over their learning because they were free to do it at any time during the weekend that suited them. This worked well, so I continued it throughout the course to help the students develop their listening and summarising skills and their ability to justify their choices.

Feedback

I asked the students to complete an anonymous questionnaire to gain an insight into the usefulness of the class instant messaging group from the learners’ perspective.

Here is a summary of the most common responses the students gave:

  • The activities were fun and interesting and transformed a task we dreaded into something we enjoyed.
  • I communicate more with my classmates and learnt from them.
  • I compared my work with our classmates and adopted a competitive approach to impress them (This was reflected in the quality of the writing the students produced).
  • I feel the gap between the teacher and me has narrowed, because she is a part of the chat group.
  • I feel more confident to write now, so I am more motivated also.

The benefits of having a class chat are:

  1. It is student-centred, interactive and communicative.
  2. It creates dialogue amongst students and nurtures a social atmosphere.
  3. It increases motivation and shifts it from extrinsic to intrinsic.
  4. It encourages sharing and extends learning.
  5. It creates a personalised learning platform that students can refer to both inside and outside the classroom.

If you decide to try out any of the activities mentioned, please let me know how it went!


IATEFL Birmingham 2016 – Instant messaging with learners: chilled out chatroom or creepy treehouse?

The post-conference buzz is still racing around in my head after seeing so many talks: new ideas, old ideas, different takes on current ideas, all while trying frantically to catch fleeting moments to talk with friends and make new ones in between a very packed IATEFL programme in Birmingham.

All in all, a great conference, and for those of you who didn’t manage to see my talk, “Instant messaging with learners: chilled out chatroom or creepy treehouse?”, the British Council have kindly sent me a recording which you can watch here.

I look forward to the next IATEFL as I’m sure we all do!

“Many educators have embraced the use of mobile technologies and instant messaging with learners. But inviting learners to connect with their teacher on social media can provoke horror: what some have called the creepy treehouse syndrome. In this talk, I present contexts where I used IM and the outcomes. Cautious of creating a creepy treehouse syndrome, I trod extremely carefully.”


The Affective Effects of Learning with Humanoid Robots

After reading an article in El País about using humanoid robots to help children with autism understand emotions, I started to dig deeper into the affective effects of learning with humanoid robots, and came across Milo.

Milo is considered a revolution amongst educators, therapists, and parents of children with autism. He is able to motivate children who find human interaction difficult by helping them practise interaction and social skills. The absence of emotion promotes communication in a context where emotional dysregulation usually creates anxiety and blocks children from socialising comfortably.

I find the results achieved with Milo extremely positive with respect to my own interest in the use of humanoid robots for language teaching and learning. Recent research has demonstrated that autistic children working with Milo and a therapist are engaged 70-80% of the time, compared to as little as 3-10% with ‘traditional approaches’.

It appears that the key element in increasing communication between autistic children and Milo is trust. Looking further into the use of humanoid robots for learning in South Korea, I found a lot of parallels. Research carried out by Han et al. (2005) demonstrates that learning with IROBI was more effective than learning carried out with computers, and it was rated excellent for language learning. Learners were able to build up a relationship with the humanoid robots that they hadn’t managed to construct with the computers they had previously used. They considered the robot a teaching assistant, and its humanoid form made them feel at ease, which in turn promoted learning; in fact, “robots were effective in inducing motivation and enhancing achievement” (Han, 2012).

Ultimately, then, the emotional effect of learning can have a huge impact on the learner, and while humanoid robots are not human per se, they are playing a huge role in the socially situated learning and social development of learners with low confidence and trust issues.

Speech synthesis, voice recognition and humanoid robots

Speech synthesis, or the artificial production of human speech, had been around long before the Daleks on Doctor Who. Apparently, the first speech-generating device was prototyped in the UK in 1960, in the shape of a sip-and-puff typewriter controller, the POSSUM. Wolfgang von Kempelen preceded all of this with a speaking machine built in leather and wood that had great significance in the early study of phonetics. Today, text-to-speech computers and synthesisers are widely used by those with speech impediments to facilitate communication.
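Von Kempelen’s machine worked by physically shaping resonances, and formant synthesis builds on the same idea: a vowel sound can be roughly approximated by summing sine waves at the vowel’s formant frequencies. As a minimal, illustrative sketch (the frequencies below are textbook-style approximations I’ve chosen for the example, not values from any real synthesiser), the following Python script writes half a second of an ‘ah’-like tone to a WAV file using only the standard library:

```python
# Minimal formant-style synthesis sketch: approximate a sustained "ah"
# vowel by summing sine waves at illustrative formant frequencies.
import math
import struct
import wave

RATE = 16000          # samples per second
DURATION = 0.5        # seconds
F0 = 120              # fundamental (glottal pitch) in Hz
FORMANTS = [(700, 1.0), (1200, 0.5), (2600, 0.25)]  # (Hz, relative amplitude)

samples = []
for n in range(int(RATE * DURATION)):
    t = n / RATE
    # Sum the fundamental and the formant components.
    value = 0.3 * math.sin(2 * math.pi * F0 * t)
    for freq, amp in FORMANTS:
        value += 0.2 * amp * math.sin(2 * math.pi * freq * t)
    # Clamp to [-1, 1] and scale to 16-bit signed integers.
    samples.append(int(max(-1.0, min(1.0, value)) * 32767))

with wave.open("vowel.wav", "wb") as wav:
    wav.setnchannels(1)      # mono
    wav.setsampwidth(2)      # 16-bit
    wav.setframerate(RATE)
    wav.writeframes(struct.pack("<%dh" % len(samples), *samples))
```

It sounds robotic and buzzy, which is rather the point: the gap between this and a natural human voice is exactly what modern synthesis research works to close.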

Speech-to-text systems became more prominent thanks to the IBM experimental typewriter Tangora, which held a remarkable 20,000-word vocabulary by the mid-1980s. Nowadays speech-to-text has advanced phenomenally, with the Dragon Dictation iOS software being a highly favoured choice. Our world is increasingly dominated by voice automation, from customer service choices by phone to personal assistants like Siri. Voice and speech recognition have also been used for identification purposes by banks since 2014.
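Tangora itself used hidden Markov models, but an even simpler technique behind many early isolated-word recognisers is template matching with dynamic time warping (DTW), which tolerates differences in speaking rate. The sketch below is a toy illustration on made-up one-dimensional ‘feature’ sequences of my own invention (real recognisers compare multi-dimensional acoustic features frame by frame):

```python
# Toy isolated-word recognition by dynamic time warping (DTW):
# compare an "utterance" (a sequence of feature values) against stored
# word templates and pick the closest match.

def dtw_distance(a, b):
    """Classic DTW: minimal cumulative distance aligning sequence a to b."""
    INF = float("inf")
    cost = [[INF] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # step both
    return cost[len(a)][len(b)]

# Invented feature templates for two words.
templates = {
    "yes": [1.0, 3.0, 5.0, 3.0, 1.0],
    "no":  [5.0, 4.0, 2.0, 1.0, 1.0],
}

def recognise(utterance):
    """Return the template word with the smallest DTW distance."""
    return min(templates, key=lambda w: dtw_distance(utterance, templates[w]))

# A slightly time-stretched "yes" still matches the "yes" template.
print(recognise([1.0, 2.0, 3.0, 5.0, 5.0, 3.0, 1.0]))  # → yes
```

The warping step is what lets the same word spoken slowly or quickly still land on the right template, a problem plain point-by-point comparison cannot handle.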

I’m curious how these systems work: how they are programmed, what corpus is used and which accents are taken into consideration. Why? Because robots fascinate me, and I wonder if it will be possible to “humanize” digital voices to such an extent that humanoid robots will appear more human than ever because of their voice production and recognition capabilities. It seems like a far cry from the days of Speak & Spell, the kids’ speech synthesiser of the 80s, but it is looking increasingly probable as advances in AI develop.

Developments have gone as far as Hiroshi Ishiguro’s Geminoid HI-1 android prototype humanoid robot. Ishiguro is a roboticist at Osaka University, Japan, who created a Geminoid robot in 2010 that is a life-size replica of himself. He used silicone rubber, pneumatic actuators, powerful electronics, and hair from his own scalp.

The Geminoid is basically a doppelganger droid controlled by a motion-capture interface. It can imitate Ishiguro’s body and facial movements, and it can reproduce his voice in sync with his motion and posture. Ishiguro hopes to develop the robot’s human-like presence to such a degree that he could use it to teach classes remotely, lecturing from home while the Geminoid interacts with his classes at Osaka University.

You can see a demonstration of the Geminoid here.


Reflective Practice

Exams Catalunya: Using the principles of Reflective Practice to improve oral skills.

The notion of reflective practice, where teachers and students spend time reflecting on their performance in order to develop clear goals for improvement, is a very powerful one. Too often, learners repeat activities in class without any focused idea of what they want to change or improve.

This seminar looks at using technology to give learners instant feedback on their performance in speaking activities and explores how they can be guided towards formulating learning objectives to extend their learning and oral skills. This practice helps motivate students in an engaging way and encourages them to take ownership of their learning and develop their confidence and learner autonomy.

The slides from the seminar can be viewed here:

Exams Catalunya

From CALL, to ICALL, to MALL, to RALL

From CALL, to ICALL, to MALL, to RALL, oh how we’ve moved on!

The pioneering drill-and-practice CALL (Computer Assisted Language Learning) programmes that dominated the style of learning in the 1960s and 1970s have witnessed many changes. The 1980s brought about the first radical change in the form of ICALL (Intelligent Computer Assisted Language Learning), where NLP (Natural Language Processing) helps computers understand the structure of human language in order to generate it from a computational data structure.
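To give a feel for what ‘generating language from a computational data structure’ means, here is a toy sketch: a tiny context-free grammar, stored as a Python dictionary and expanded recursively into a sentence. The grammar and vocabulary are invented for the example and are nothing like the scale of a real ICALL system:

```python
# Toy illustration of generating language from a data structure:
# a tiny context-free grammar, expanded recursively from "S".
import random

GRAMMAR = {
    "S":  [["NP", "VP"]],                      # sentence = noun phrase + verb phrase
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["student"], ["robot"], ["teacher"]],
    "V":  [["helps"], ["greets"]],
}

def generate(symbol="S"):
    """Expand a grammar symbol into a list of words."""
    if symbol not in GRAMMAR:          # terminal: an actual word
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for part in expansion:
        words.extend(generate(part))
    return words

print(" ".join(generate()))  # e.g. "the robot helps the teacher"
```

Every sentence this grammar produces is structurally well formed, which is precisely the property early ICALL systems exploited when drilling grammar patterns.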

Dramatic shifts in our electronic environment have resulted in mobile technology permeating our learning environment, and MALL (Mobile Assisted Language Learning) is becoming ever more popular as educators incorporate smartphones and tablets into their teaching practice. This form of mobile technology also extends to RALL (Robot Assisted Language Learning).

Humanoid robots are already being used for educational purposes and language learning in the US, Japan, and Korea. Japan and the US are using robots as peer tutors, while Korea is using them as teaching assistants and “friends” to generate motivation and increase learning achievement. In the US and Korea the robots use visual instructional materials, while in Japan the interaction is gesture- and voice-based. Unsurprisingly, RALL is already in full swing in Korea. iRobiQ is an example of an anthropomorphised robot that has been developed with a face and, like Pepper, a tablet interface attached to its chest. The advantage of iRobiQ is its emphasis on education and language learning, whereas Pepper was created for companionship.

So just how long will it be before we start hanging out with humanoid robots in our staff rooms and teaching institutions I wonder?!

Companion robots

Selfishly this week I have been trying to sneak humanoid robots into every aspect of my teaching. I am totally smitten with them, so I am reading extensively on the subject.

One topic of discussion that arose naturally from some of the conversations I had with my students was that of companion robots. In Japan, companion robots are filling many social gaps, the primary one being the needs of a rapidly ageing population.

Another kind of hope robots can offer in the spectrum of companionship is that of pets, like PARO, the fluffy baby seal. PARO is marketed as a therapeutic robot, claimed to reduce stress and improve relationships between patients and their caregivers. Other pets include AIBO, the robot dog created by Sony in 1999: perfect pet solutions without the maintenance a domestic pet requires.

Perhaps the most intriguing kind of companionship robots can offer is romantic companionship. Long gone are the days of computers waiting for humans to provide a sense of significance; the humanoid robots of today are able to meet our gaze, track our motions, speak to us and recognise us. While for some this immediately raises ethical issues, for others, robots could provide all the comforts of companionship without the obligation of commitment, or the perfect partner that is there when you want them but can be switched off! Pepper is the first humanoid robot to be adopted in Japanese households, and you can read more about Pepper here.