Pepper, the social robot, opened production fair

Social robots that in various ways are able to interact and relate to humans are trending. At the University of Skövde, several ongoing research projects are studying how the skills of social robots can be put to use, for example within industry. On May 15th 2018, Pepper, the University of Skövde’s robot, and an eight-year-old girl, Rut, opened one of Sweden’s largest industry trade fairs. They cut the ribbon together, as a symbol of the next generation’s industry coworkers.

Pepper was invited to Elmia Production Fair on May 15-18 to open the event, and was described as “the latest within technology and industry”. Together with his colleagues, Erik Billing, lecturer in Information Technology at the University of Skövde, has programmed Pepper’s movements and expressions.
– Pepper and eight-year-old Rut opened the fair by untying a ribbon. Afterwards, Pepper was available for visitors who wanted to say hello and learn more about social robots and our various research projects in this field at the University of Skövde, says Erik Billing. He also mentioned the robot’s various applications beyond acting as an opening speaker.

Studying the human-robot interaction in various fields

In the research project AIR, a joint effort with the RISE research institute, Örebro University and Halmstad University, funded by the KK-foundation, the University of Skövde is studying how humans can interpret and interact with various autonomous systems. Some areas being studied, in addition to social robots, are industrial robots and autonomous traffic systems, that is, driver-less cars.
– By studying these three areas, we hope to gain a better understanding of how humans react to and interact with the three systems. Our goal is to create a tight collaboration between humans and robots, where the robot is able to adjust to human actions and intentions. For the industry, we hope that this will open up new ways of producing goods while contributing to a better work environment for machine operators.

The development of social robots such as Pepper has just begun, and the fields of application will most likely expand in the future, says Erik Billing.
– Technology from social robots, like how they establish eye contact, process human speech or gestures, is likely to be very important for the future of industry.

European initiative develops better healthcare solutions for patients with dementia

People with a dementia diagnosis often struggle with daily routines, for example remembering to take their medication. In a European initiative called Remind, organisations from nine different countries collaborate to develop technical solutions that can help patients with dementia and their relatives in everyday life.

Remind is a European project where universities and companies collaborate to generate and share knowledge about health care solutions for patients with dementia.

Ubaid Ur Rehman is a PhD student from Kyung Hee University in South Korea, one of the partner universities in the Remind project. He visited Halmstad University as a guest researcher for three months at the beginning of 2018. Ubaid Ur Rehman has developed an application that, based on data from an intelligent home and a smartphone, can determine when it is a good time to remind a patient with dementia to take his or her medication.
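The article does not describe the application’s decision logic. As a rough, hypothetical sketch of how smart-home and smartphone data might be combined into a “good moment to remind” decision (the function name, activity labels and thresholds are all invented for illustration, not taken from the Remind project):

```python
def good_time_to_remind(activity, phone_screen_on, hours_since_last_dose,
                        min_interval_hours=8):
    """Hypothetical rule-based check: remind only when the person is
    likely to be receptive and a dose is actually due.

    activity: current activity label from smart-home sensors,
              e.g. "sleeping", "cooking", "watching_tv", "idle".
    phone_screen_on: True if the smartphone is actively being used.
    hours_since_last_dose: time since the last recorded intake.
    """
    # Never interrupt sleep or a potentially hazardous activity.
    if activity in ("sleeping", "cooking"):
        return False
    # No reminder needed if the previous dose was taken recently.
    if hours_since_last_dose < min_interval_hours:
        return False
    # Prefer moments when the person's attention is easy to get.
    return phone_screen_on or activity in ("idle", "watching_tv")
```

For example, `good_time_to_remind("idle", False, 9.5)` would trigger a reminder, while a person who is asleep or who took their medication an hour ago would be left alone.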

– The main purpose of the project Remind is to generate and share knowledge between the involved organisations, which come from both industry and academia, says Ubaid Ur Rehman.


Towards the same goal

Remind was initiated by Anita Sant’Anna from Halmstad University and Chris Nugent from Ulster University in Northern Ireland. Chris Nugent was until recently a visiting professor at Halmstad University, focusing on development of mobile and pervasive computing solutions to support ambient assisted living.

Anita Sant’Anna, Assistant Professor at the School of Information Technology at Halmstad University, is one of the initiators of the Remind project. Photo: JOACHIM BRINK

– A growing elderly population requires new health technology solutions and smart home environments. The Remind project gives us the opportunity to share ideas, information and results with others so that we can work efficiently towards the same goal, says Anita Sant’Anna, Assistant Professor at the School of Information Technology at Halmstad University.

Martin Cooney, social robotics researcher at Halmstad University, is the local coordinator of Remind:

– Halmstad University is contributing to the project with competence in Artificial Intelligence. I think Ubaid’s activity recognition is a very good example of this. We have also, for example, made a system that uses a thermal camera to detect recently performed activities, like taking medicine.

– Aside from the research, I think the project also offers nice opportunities for networking, by sending people to different countries, and connecting academia to industry, says Martin Cooney.
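The thermal-camera system is not described in any technical detail here. One hedged way to picture it: an object that was just handled, such as a medicine bottle, shows up warmer than the ambient background in a thermal image, so a recent activity can be flagged by comparing a region’s temperature to the rest of the frame. The thresholds, region format and temperatures below are purely illustrative assumptions, not the project’s actual method:

```python
import numpy as np

def recently_touched(thermal_frame, region, ambient_delta=2.0):
    """Hypothetical check: a region of a thermal image (temperatures
    in degrees Celsius) counts as recently touched if its mean
    temperature exceeds the frame's ambient (median) temperature
    by at least ambient_delta degrees."""
    r0, r1, c0, c1 = region
    ambient = float(np.median(thermal_frame))
    return float(thermal_frame[r0:r1, c0:c1].mean()) > ambient + ambient_delta

# Illustrative frame: a 20 °C background with a warm patch where a
# medicine bottle was just handled.
frame = np.full((10, 10), 20.0)
frame[2:4, 2:4] = 26.0
print(recently_touched(frame, (2, 4, 2, 4)))  # → True
```

Using the frame median as the ambient estimate keeps a small warm patch from skewing the baseline, which is why the warm region stands out against it.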

About Remind

Remind has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement. The project started in 2017 and will end in 2020.


  • Beneficiaries
    • Ulster University (UK)
    • University of Jaén (Spain)
    • National University of Ireland Galway (Ireland)
    • Luleå University of Technology (Sweden)
    • Halmstad University (Sweden)
    • University of Florence (Italy)
    • Associazione NOVILUNIO Onlus (Italy)
    • I+ srl (Italy)
    • KARDE AS (Norway)
    • Swedish Adrenaline (Sweden)
    • The Ageing social Lab Foundation (Spain)
  • Other organisations
    • University Medical Center Groningen (Netherlands)
    • Kyung Hee University (Korea)
    • Corporación Universidad de la Costa (Colombia)
    • Universidad Nacional Abierta y a Distancia (Colombia)

Knowledge exchange will be achieved through collaborative and hands-on activities where participants develop and/or implement something together within the following areas:

  • Signal and data analysis for healthcare
  • Context and behaviour modelling
  • User-centred reminding technologies

A final project demonstrator will be developed by the end of the project period, in June 2020.

Text: LOUISE WANDEL, Halmstad University

Film: IDA FRIDVALL and LOUISE WANDEL,  Halmstad University

Emotional art by social robot Baxter

Halmstad University’s social robot Baxter can create paintings that express human emotions. Last year, Baxter’s paintings successfully participated in an international robot art competition. The emotions were transferred to the robot by brain waves from a famous Swedish artist, Peter Wahlbeck. This year, the robot art has reached the next level – Baxter has learned to paint as a performance, showing emotions itself and infusing these feelings into artwork.

Winning sixth place in last year’s international robot art competition encouraged the Halmstad research team to further develop the social robot Baxter’s emotional artwork. Robot art has also become a compulsory part of the Master’s Programme in Embedded and Intelligent Systems at Halmstad University.

– Last year, Baxter painted with a frozen, blank face. Only the paintings expressed emotions, so it was like a computer printer just printing images. This year we wanted the robot itself to show emotions, says social robotics researcher Martin Cooney.


Robots need to be given “life”

Martin Cooney and his fellow researchers at the School of Information Technology want to be more creative and innovative in their development of social robotics, and robot art is one example of this.

– Robots in factories might follow rigid programs: do this, do that. But that’s not how humans work. In order for social robots to successfully interact with and help humans, they need to be given “life”, complexity and randomness. The robot has to be able to both sense human emotions and express feelings of its own – just like humans do, says Martin Cooney.

Six paintings from Halmstad University have been sent to the 2018 International Robotic Art Competition. A team of 24 Master’s students created the backgrounds of all paintings by using random movements of small platooning robots. The foreground was painted by Baxter, programmed to show a certain emotion through its ‘face’ and sound.

Martin Cooney and Baxter at last year’s robot art competition, when Baxter created paintings that express human emotions. This year, Baxter has learned how to show its own feelings through art.

Text: Louise Wandel

Photo: Hanna Carmvall

Meet Baxter – a social robot at your service

Baxter lives at Halmstad University, where he is in training to understand and interpret people’s needs and emotions in order to help them in different ways. Baxter can also make sure the food is healthy while you are cooking, or help you if you fall.

The base of the robot, “Ridgeback”, is from Clearpath Robotics. The top, “Baxter”, is from Rethink Robotics. The robot was purchased with support from the Swedish Knowledge Foundation for the SIDUS AIR project, which focuses on action and intention recognition in human interaction with autonomous systems.



Baxter impressed judges in Robot Art Competition

200 works of robot art, divided between 38 teams from 10 different countries, competed in the international Robot Art Competition. The winners were announced earlier this week, and the social robot Baxter from Halmstad University and his emotional paintings came in sixth place.

HEARTalion (Halmstad University Emotional Art Robot) placed in the top six of the Robot Art Competition, receiving high acclaim for the results of the Master’s project in which the social robot Baxter is trained to pick up on and interpret human emotions through art.

“If this body of work was exhibited at a gallery and I was told that the artist aimed to capture emotion through colour, composition, and textures – I would buy it (says one of our professional judges). The bold brush strokes, cool or warm templates to match the emotional quality expressed, it all made sense – but felt alive. Loved them”, reads the jury motivation.

A breakthrough for unique research

Dan Koon, one of the artists who have been coaching Baxter and his team during this project, sees it as a breakthrough for unique research.

“This should make the team realise that it is really onto something. They have made a breakthrough not only in art but also in giving psychological aid to many people. I hope the strong showing inspires them to continue their valuable work.”

The hard work paid off

Martin Cooney, researcher in Social Robotics at Halmstad University and supervisor for the Master’s project, is proud of the team’s work, which earned sixth place and a prize sum of $2,000.

“I think it’s really cool that our robot’s work was well received and that we will even get a bit of money, also because our team members worked hard and were pretty serious about this idea of somehow trying to do something with art and technology that could potentially help someone in the near future,” says Martin, who also sends his thanks to the people involved in the competition.

“I’m grateful to the organiser of the competition for giving us this great opportunity to share a small part of our dreams, to the students who fought to put things together, to the artists who shared their wisdom, and to all the people who watched Peter painting with the robot, who voted for us in the competition, who talked and thought about what we are doing, or otherwise supported us.”

A team with much to celebrate

Soon the students, Sowmya Vaikundham Narasimman and Daniel Westerlund, will defend their Master’s thesis, and the robot team will continue to look at new possibilities for the social robot, whose sole goal is helping people and making them feel better.

For the team, who came in ahead of, among others, MIT in the Robot Art Competition, cake is on the menu.

“There has also been some talk about celebrating, maybe with some cake or pizza, which I think should definitely be followed up on”, says Martin Cooney.




Social robot in training to express human feelings through art

Can a robot sense your feelings? Right now 198 works of robot art are competing in a Robot Art Competition in the USA. The winners are selected partly by Facebook likes. A unique contribution to the competition comes from Halmstad University, where the robot Baxter has attempted to interpret human emotions through painting, coached by artists Peter Wahlbeck and Dan Koon. 

Usually Baxter lives at the School of Information Technology at Halmstad University, where he is in training to understand and interpret people’s needs and emotions in order to help them feel better. But one day a few weeks ago, Baxter took part in a public event well suited for a social robot. People had gathered on campus to see Baxter try to read, interpret and draw the feelings of well-known local artist Peter Wahlbeck, assisted by artist and author Dan Koon. They have both been part of Baxter’s training by coaching the robot research team in an ongoing Master’s thesis project where the robot is trained to pick up on feelings and express them through painting.

Baxter, who had taken the name “Rob Boss” for the art competition, met his audience blindfolded.

“We did it to show that this is not about the robot reading facial expressions or body language. A person can smile without being happy. So we want to take it one step further by getting the robot to sense emotions through brain waves, to understand how a person truly feels,” says Martin Cooney, researcher in social robotics at Halmstad University.

Transferring emotions through brain waves

With a sensor measuring brain waves attached to his head, Peter Wahlbeck proclaimed himself the guinea pig for the event. The emotions he was trying to convey to the robot were decided by chance, and they turned out to be two extremes: happiness and misery. Peter Wahlbeck produced his happy thoughts by thinking of a hot sunny summer day on the beach. Misery and irritation were evoked by remembering a recent event:

“I thought about something I had ordered online that turned out to not be what I expected, which was very irritating at the time!”

The robot, reading Peter Wahlbeck’s emotions, painted its impressions with a bright happy yellow and a dark blue and black. Even though the robot chose colours to represent the “right” emotions, Dan Koon was disappointed after the public event, having seen the robot perform at a more advanced level before.

“Baxter had stage fright today, the robot has done better on other occasions. But we still think that the event has proved what a large potential this research has,” says Dan Koon.

“It is positive that the robot chose the colours matching the emotions, according to the colour scheme we have been using. It makes for a promising future,” says Martin Cooney.
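The colour scheme itself is not published in the article. A minimal sketch of the idea, assuming the brain-wave sensor is reduced to a single valence score in [-1, 1] (the score ranges, thresholds and RGB values below are invented for illustration, echoing only the bright-yellow versus dark-blue contrast described above):

```python
def valence_to_colour(valence):
    """Hypothetical mapping from an emotional valence score
    (-1.0 = misery, +1.0 = happiness) to an RGB paint colour."""
    if not -1.0 <= valence <= 1.0:
        raise ValueError("valence must be in [-1, 1]")
    if valence >= 0.3:
        return (255, 220, 0)    # bright, happy yellow
    if valence <= -0.3:
        return (10, 20, 80)     # dark blue, shading towards black
    return (128, 128, 128)      # neutral grey for ambiguous readings
```

The neutral band in the middle is one way to avoid the robot committing to an emotion on a weak or noisy reading, which matters when a live EEG signal fluctuates during a performance.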

The public event was not the first time Dan Koon and Peter Wahlbeck painted with Baxter – and the images entered in the Robot Art Competition are from those other occasions.



Research environment granted funding from the KK-foundation to study human interaction with autonomous systems

For humans, it is natural and relatively easy to interpret other people’s social signals and movement patterns, but when it comes to interacting with technical systems, like a robot or a driver-less car, there may be some challenges. A strong, distributed research environment has been granted funding from the KK-foundation to study human interaction with autonomous systems more closely.

In today’s society, there are already several so called autonomous systems, for example lawn mowers, household and industry robots, and cars that can park by themselves. But what exactly happens when autonomous systems try to interpret us humans, and how can we interpret their behaviour?

The KK-foundation has granted 27 million Swedish crowns over a period of four years to researchers at Örebro University, the University of Skövde, Halmstad University and Viktoria Swedish ICT for a joint effort in a strong distributed research environment to study the interactions between humans and autonomous systems. Tom Ziemke, Professor of Cognitive Science at the University of Skövde, is the lead researcher.

Using a map

– This is a multidisciplinary project, where reciprocity in the interaction is important. Interacting with a self-propelled lawn mower may seem simple, but what about the interaction when a driver-less car takes a human to work? For a successful interaction, humans need to be able to interpret the artificial systems, and the systems need to be able to interpret us. This all has to do with trust and safety, for example when driver-less cars actually operate in traffic, says Tom Ziemke, Professor at the University of Skövde.

– One example of what we will be working with in order to facilitate human-robot communication is outlines of maps. That is, how can a robot make use of human knowledge by using a map or a plan drawn on a piece of paper? says Martin Magnusson, researcher at Örebro University.

The researchers are also going to look at how robots can interpret human intentions by observing what they are doing – for example if they are sitting, walking, or standing in line – to enable a more intuitive interaction.
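The article does not say how such intention inference would work. One hedged sketch is a simple rule that maps coarse observations, posture, movement speed and context, to a likely intention (all labels, parameters and thresholds here are invented for illustration):

```python
def infer_intention(posture, speed_m_s, near_queue):
    """Hypothetical intention guess from coarse observations of a person.

    posture: "sitting", "standing", or "walking"
    speed_m_s: observed movement speed in metres per second
    near_queue: True if the person stands close to a line of people
    """
    if posture == "sitting":
        return "resting"
    if posture == "standing" and near_queue:
        return "waiting_in_line"
    if posture == "walking" and speed_m_s > 1.2:
        return "heading_somewhere"   # likely in a hurry; robot keeps clear
    return "browsing"
```

A robot using such a guess could, for instance, avoid cutting across the path of someone labelled "heading_somewhere", or refrain from approaching people queuing, which is the kind of intuitive interaction the project aims for.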

Instructions are not necessary

– That is, finding ways to bypass explicit instructions to the robot. Regarding information going the other direction, that is for the robot to behave in a manner that is easily understood by humans, we will study how a robot can communicate its plans and what it has been able to understand from its surroundings by projecting images on close objects, says Martin Magnusson.

– Another important step is to observe how robots can move in a way that is non-obstructive to humans, for example by teaching the robots what routes others use, to avoid the “traffic”.

The research environment will commence in the spring of 2015, and continue until 2019. The KK-foundation is a university funding entity with the purpose of strengthening Sweden’s competitiveness.

Text: Linda Harradine