The researcher who wants to teach machines to see

When will we be able to teach robots to see like humans do? That is, to really interpret their surroundings and then make rational decisions. Research has not yet come that far, but Josef Bigun, Professor of Signal Analysis, wants to be involved in developing the first visually intelligent machine.

Vision is perhaps a human’s most important sense. We can manage without it, but it is with our eyes that most of us see and interpret the world.

“Our vision uses a great deal of our brain’s capacity. We need to process all the visual data and then put them into context. It really applies to anything we want to do. Our vision is so essential to our lives that we hardly think about what we would do without it.”

Josef Bigun is a Professor of Signal Analysis and Intelligent Systems. His research is about biometrics, that is, technical means for computers to identify and collect information about us and our characteristics. It could be eye scanning, DNA, fingerprints or recognising a walking style. This involves teaching computers visual intelligence, and eventually teaching a robot to recognise images or patterns.

Everything can be used for good or bad purposes. But I think intelligent systems really will be able to help us in the future. I think it’s legitimate to be skeptical, that is why we should have better technology, make laws about using personal data and have good education about computers and artificial intelligence.

– Josef Bigun

Josef Bigun, Professor at Halmstad University
Josef Bigun is a Professor of Signal Analysis and Intelligent Systems. His research is about biometrics, that is technical means for computers to identify and collect information about us and our characteristics.

“We already have computers with different types of intelligence, but it’s not sophisticated. Currently, there are no robots with a visual intelligence so well developed that they can draw conclusions like we humans do.”

Human vision is still largely superior to that of robots, but succeeding in making a robot copy the way we see the world is what fascinates and motivates Josef Bigun.

“It’s a big challenge to teach a robot to do what we’ve learnt since we were children,” he says.

Halmstad stands out

It was a bit of a coincidence that Josef Bigun ended up at Halmstad University. After graduating with an MSc in Engineering from Linköping University, he continued with a PhD in Computer Science at the same university. After ten years as a researcher at the Swiss Federal Institute of Technology, he returned to Sweden and took up a position at Halmstad University, which has become one of the premier institutions in Sweden for research in visual intelligence and, in particular, biometric identification.

The collaboration within the CAISR (Center for Applied Intelligent Systems Research) research group is important. The researchers discuss results, share ideas and collaborate with many international colleagues. In 2016, CAISR organised the International Conference on Biometrics, with over 200 participants from around the world. It is one of the oldest and most respected conferences on current research in biometric identification.

The research field of artificial intelligence is very active and has attracted much attention in the media in recent years. While challenging, it is primarily a lot of fun, Josef Bigun thinks.

“There’s a constant stream of new results published and things happening all the time. It has been like that for the past 40 years.”

Produce knowledge and solve problems

Josef Bigun’s goal as a researcher is clear. He would like to leave a better world, he says, and as a researcher he wants to contribute to continually improving both the environment and our lives.

“I don’t try to solve all the problems myself, but within my niche I try to see how we can contribute. For example, with cameras that help with recycling and benefit the environment, or by improving future transport with autonomous vehicles. I believe in the role of knowledge; new knowledge will help solve several problems.”

We are constantly building on our knowledge – it’s like an artist who keeps working on a piece of art.

– Josef Bigun

Josef Bigun, Halmstad University
According to Josef Bigun, the best thing about working as a researcher is producing new knowledge.

He also hopes that his research will be useful for the robots of the future:

“Robots will need to help with different things in the future, where we won’t have enough human resources. Our society is still becoming more and more automated; what’s left to automate is often linked to visual intelligence.”

For example, robots can help in healthcare, but the technology can also be used with other types of services.

“As an example, cameras and sensors could become intelligent and beep if you haven’t turned off the stove. Or they could be used to check if a passenger has shown his ticket on a bus.”

A research field that sparks controversy

But it is not always easy to work with issues related to visual intelligence and intelligent computers. Sometimes the research is met with criticism and skepticism – or fear. Josef Bigun, however, argues that a future with visually intelligent robots is a bright one.

“Everything can be used for good or bad purposes. But I think intelligent systems really will be able to help us in the future. I think it’s legitimate to be skeptical; that is why we should have better technology, make laws about using personal data and have good education about computers and artificial intelligence. It’s much like cars: traffic rules, better vehicles and infrastructure, and driving licence training make traffic safer. We don’t ban cars, and the same goes for technology – by looking at the benefits and acting on several fronts we can minimise the risks.”

The best thing about working as a researcher is producing new knowledge, Josef Bigun thinks; and it’s something he can always do working in the research group CAISR.

“We are constantly developing new knowledge. It’s difficult to point to something that was particularly important in 2017. However, 2017 was better than all previous years because we are constantly building on our knowledge – it’s like an artist who keeps working on a piece of art”, he says.


The text was originally written for CAISR annual report 2017.

Pepper, the social robot, opened production fair

Social robots that can interact with and relate to humans in various ways are trending. At the University of Skövde, several ongoing research projects study how the skills of social robots can be put to use, for example within industry. On May 15th 2018, Pepper, the University of Skövde’s robot, and an eight-year-old girl, Rut, opened one of Sweden’s largest industry trade fairs. They cut the ribbon together, as a symbol of the next generation of industry co-workers.

Pepper was invited to the Elmia Production Fair on May 15–18 to open the event, and was described as “the latest within technology and industry”. Together with his colleagues, Erik Billing, lecturer in Information Technology at the University of Skövde, has programmed Pepper’s movements and expressions.
– Pepper and eight-year-old Rut opened the fair by cutting a ribbon. Afterwards, Pepper was available for visitors who wanted to say hello and learn more about social robots and our various research projects in this field at the University of Skövde, says Erik Billing, who also mentions the robot’s various applications beyond acting as an opening speaker.

Studying human-robot interaction in various fields

In the research project AIR – a joint effort with the RISE research institute, Örebro University and Halmstad University, funded by the KK-foundation – the University of Skövde is studying how humans can interpret and interact with various autonomous systems. In addition to social robots, the areas being studied include industrial robots and autonomous systems for traffic, that is, driver-less cars.
– By studying these three areas, we hope to gain a better understanding of how humans react to and interact with these systems. Our goal is to create a tight collaboration between humans and robots, where the robot is able to adjust to human actions and intentions. For industry, we hope that this will open up new ways of producing goods while contributing to a better work environment for machine operators.

The development of social robots such as Pepper has just begun, and the fields of application will most likely expand in the future, says Erik Billing.
– Technology from social robots, like how they establish eye contact, process human speech or gestures, is likely to be very important for the future of industry.

Emotional art by social robot Baxter

Halmstad University’s social robot Baxter can create paintings that express human emotions. Last year, Baxter’s paintings successfully participated in an international robot art competition. The emotions were transferred to the robot by brain waves from a famous Swedish artist, Peter Wahlbeck. This year, the robot art has reached the next level – Baxter has learned to paint as a performance, showing emotions itself and infusing these feelings into artwork.

Winning sixth place in last year’s international robot art competition encouraged the Halmstad research team to further develop the social robot Baxter’s emotional artwork. Robot art has also become a compulsory part of the Master’s Programme in Embedded and Intelligent Systems at Halmstad University.

– Last year, Baxter painted with a frozen, blank face. Only the paintings expressed emotions, so it was like a computer printer just printing images. This year we wanted the robot too to show emotions, says social robotics researcher Martin Cooney.

In order for social robots to successfully interact with and help humans, they need to be given “life”, complexity and randomness.

– Martin Cooney

Robots need to be given “life”

Martin Cooney and his fellow researchers at the School of Information Technology want to be more creative and innovative in their development of social robotics, and robot art is one example of this.

– Robots in factories might follow rigid programs – do this, do that. But that’s not how humans work. In order for social robots to successfully interact with and help humans, they need to be given “life”, complexity and randomness. The robot has to be able to both sense human emotions and express feelings of its own – just like humans do, says Martin Cooney.

Six paintings from Halmstad University have been sent to the 2018 International Robotic Art Competition. A team of 24 Master’s students created the backgrounds of all paintings by using random movements of small platooning robots. The foreground was painted by Baxter, programmed to show a certain emotion through its ‘face’ and sound.

Martin Cooney and Baxter at last year’s robot art competition, when Baxter created paintings that express human emotions. This year, Baxter has learned how to show its own feelings through art.

Text: Louise Wandel

Photo: Hanna Carmvall

Meet Baxter – a social robot at your service

Baxter lives at Halmstad University, where he is in training to understand and interpret people’s needs and emotions in order to help them in different ways. Baxter can also make himself useful when you are cooking, making sure the food is healthy, or help you if you fall.

The base of the robot, “Ridgeback”, is from Clearpath Robotics. The top, “Baxter”, is from Rethink Robotics. The robot was purchased with support from the Swedish Knowledge Foundation for the SIDUS AIR project, which focuses on action and intention recognition in human interaction with autonomous systems.



Baxter impressed judges in Robot Art Competition

200 works of robot art divided between 38 teams from 10 different countries competed in the international Robot Art Competition. The winners were announced earlier this week and the social robot Baxter from Halmstad University, and his emotional paintings, came in sixth place.

HEARTalion (Halmstad University Emotional Art Robot) placed in the top six of the Robot Art Competition, receiving high acclaim for the results of the Master’s project, in which the social robot Baxter is trained to pick up on and interpret human emotions through art.

“If this body of work was exhibited at a gallery and I was told that the artist aimed to capture emotion through colour, composition, and textures – I would buy it (says one of our professional judges). The bold brush strokes, cool or warm templates to match the emotional quality expressed, it all made sense – but felt alive. Loved them”, reads the jury motivation.

A breakthrough for unique research

Dan Koon, one of the artists who have been coaching Baxter and his team during this project, sees it as a breakthrough for unique research.

“This should make the team realise that it is really onto something. They have made a breakthrough not only in art but also in giving psychological aid to many people. I hope the strong showing inspires them to continue their valuable work.”

The hard work paid off

Martin Cooney, researcher in Social Robotics at Halmstad University and supervisor of the Master’s project, is proud of the team’s work, which earned sixth place and a prize sum of $2,000.

“I think it’s really cool that our robot’s work was well received and that we will even get a bit of money, also because our team members worked hard and were pretty serious about this idea of somehow trying to do something with art and technology that could potentially help someone in the near future”, says Martin, also sending his thanks to the people involved in the competition.

“I’m grateful to the organiser of the competition for giving us this great opportunity to share a small part of our dreams, to the students who fought to put things together, to the artists who shared their wisdom, and to all the people who watched Peter painting with the robot, who voted for us in the competition, who talked and thought about what we are doing, or otherwise supported us.”

A team with much to celebrate

Soon the students, Sowmya Vaikundham Narasimman and Daniel Westerlund, will defend their Master’s thesis, and the robot team will continue to look at new possibilities for the social robot, whose single goal is helping people and making them feel better.

For the team, who came in ahead of, among others, MIT in the Robot Art Competition, cake is on the menu.

“There has also been some talk about celebrating, maybe with some cake or pizza, which I think should definitely be followed up on”, says Martin Cooney.




Social robot in training to express human feelings through art

Can a robot sense your feelings? Right now, 198 works of robot art are competing in a Robot Art Competition in the USA, with the winners selected partly by Facebook likes. A unique contribution comes from Halmstad University, where the robot Baxter has attempted to interpret human emotions through painting, coached by artists Peter Wahlbeck and Dan Koon.

Usually, Baxter lives at the School of Information Technology at Halmstad University, where he is in training to understand and interpret people’s needs and emotions in order to help them feel better. But one day a few weeks ago, Baxter took part in a public event well suited for a social robot. People had gathered on campus to see Baxter try to read, interpret and draw the feelings of the well-known local artist Peter Wahlbeck, assisted by artist and author Dan Koon. Both have been part of Baxter’s training, coaching the robot research team in an ongoing Master’s thesis project in which the robot is trained to pick up on feelings and express them through painting.

Baxter, who had taken the name “Rob Boss” for the art competition, met his audience blindfolded.

“We did it to show that this is not about the robot reading facial expressions or body language. A person can smile without being happy. So we want to take it one step further by getting the robot to sense emotions through brain waves, to understand how a person truly feels,” says Martin Cooney, researcher in social robotics at Halmstad University.

Transferring emotions through brain waves

With a sensor measuring brain waves attached to his head, Peter Wahlbeck volunteered as the guinea pig for the event. The emotions he would try to convey to the robot were decided by chance, and they turned out to be two extremes – happiness and misery. Peter Wahlbeck produced his happy thoughts by thinking of a hot, sunny summer day on the beach. Misery and irritation were evoked by remembering a recent event:

“I thought about something I had ordered online that turned out to not be what I expected, which was very irritating at the time!”

Reading Peter Wahlbeck’s emotions, the robot painted its impressions in a bright, happy yellow and in dark blue and black. Even though the robot chose colours representing the “right” emotions, Dan Koon was disappointed after the public event, having seen the robot perform at a more advanced level before.

“Baxter had stage fright today, the robot has done better on other occasions. But we still think that the event has proved what a large potential this research has,” says Dan Koon.

“It is positive that the robot chose the colours matching the emotions, according to the colour scheme we have been using. It makes for a promising future,” says Martin Cooney.
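The colour scheme the researchers mention can be thought of as a simple lookup from a sensed emotion to a paint colour. The sketch below is purely hypothetical – the emotion labels, the valence scale, and the RGB values are illustrative assumptions, not the team’s actual implementation – but it shows the idea of turning a brain-wave-derived emotion estimate into Baxter’s yellow-versus-dark palette:

```python
# Hypothetical sketch of an emotion-to-colour scheme like the one described:
# a scalar "valence" estimate (e.g. derived from a brain-wave sensor) is
# mapped to a paint colour - bright yellow for happiness, dark blue for misery.
# All labels and RGB values here are illustrative assumptions.

EMOTION_COLOURS = {
    "happiness": (255, 221, 0),   # bright, happy yellow
    "misery": (20, 30, 80),       # dark blue, near black
}

def pick_colour(valence: float) -> tuple:
    """Choose a paint colour from an estimated valence in [-1, 1]."""
    return EMOTION_COLOURS["happiness"] if valence >= 0 else EMOTION_COLOURS["misery"]
```

In practice, the mapping could be continuous rather than binary, blending between palette entries as the valence estimate varies.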

The public event was not the first time Dan Koon and Peter Wahlbeck painted with Baxter – and the images entered in the Robot Art Competition are from those other occasions.



Research environment granted funding from the KK-foundation to study human interaction with autonomous systems

For humans, it is natural and relatively easy to interpret other people’s social signals and movement patterns, but when it comes to interacting with technical systems, like a robot or a driver-less car, there may be some challenges. A strong, distributed research environment has been granted funding from the KK-foundation to study human interaction with autonomous systems more closely.

In today’s society, there are already several so-called autonomous systems, for example lawn mowers, household and industrial robots, and cars that can park by themselves. But what exactly happens when autonomous systems try to interpret us humans, and how can we interpret their behaviour?

The KK-foundation has granted 27 million Swedish crowns over a period of four years to researchers at Örebro University, the University of Skövde, Halmstad University and Viktoria Swedish ICT for a joint effort in a strong distributed research environment to study the interactions between humans and autonomous systems. Tom Ziemke, Professor of Cognitive Science at the University of Skövde, is the lead researcher.

Using a map

– This is a multidisciplinary project, where reciprocity in the interaction is important. Interacting with a self-propelled lawn mower may seem simple, but what about the interaction when a driver-less car takes a human to work? For a successful interaction, humans need to be able to interpret the artificial systems, and the systems need to be able to interpret us. This all has to do with trust and safety, for example when driver-less cars actually operate in traffic, says Tom Ziemke, Professor at the University of Skövde.

– One example of what we will be working with in order to facilitate human-robot communication is outlines of maps. That is, how can a robot make use of human knowledge by using a map or a plan drawn on a piece of paper? says Martin Magnusson, researcher at Örebro University.

The researchers are also going to look at how robots can interpret human intentions by observing what they are doing – for example if they are sitting, walking, or standing in line – to enable a more intuitive interaction.

Instructions are not necessary

– That is, finding ways to bypass explicit instructions to the robot. Regarding information going the other direction, that is for the robot to behave in a manner that is easily understood by humans, we will study how a robot can communicate its plans and what it has been able to understand from its surroundings by projecting images on close objects, says Martin Magnusson.

– Another important step is to observe how robots can move in a way that is non-obstructive to humans, for example by teaching the robots what routes others use, to avoid the “traffic”.

The research environment will commence in the spring of 2015, and continue until 2019. The KK-foundation is a university funding entity with the purpose of strengthening Sweden’s competitiveness.

Text: Linda Harradine