
Driverless shuttles: the latest from two European projects

Autonomous vehicles must be well-integrated into public transport systems if they are to take off in Europe’s cities, say researchers. Image credit – Keolis

By Julianna Photopoulos

Jutting out into the sea, the industrial port area of Nordhavn in Denmark’s capital, Copenhagen, is currently being transformed into a futuristic waterfront city district made up of small islets. It’s billed as Scandinavia’s largest metropolitan development project and, when complete, will have living space for 40,000 people and workspace for another 40,000.

At the moment, Nordhavn is only served by a nearby S-train station and bus stops located near the station. There are no buses or trains running within the development area, although there are plans for an elevated metro line, and parking will be discouraged in the new neighbourhood. This is a great opportunity for autonomous vehicles (AVs) to operate as a new public transport solution, connecting this area more efficiently, says Professor Dimitri Konstantas at the University of Geneva in Switzerland.

‘We believe that AVs will become the new form of transport in Europe,’ he said. ‘We want to prove that autonomous vehicles are a sustainable, viable and environmental solution for urban and suburban public transportation.’

Prof. Konstantas is coordinating a project called AVENUE, which aims to do this in four European cities. In Nordhavn, the team plans to roll out autonomous shuttles on a loop with six stops around the seafront. They hope to have them up and running in two years. But once in place, the Nordhavn plan may provide a glimpse of how AV-based public transportation systems could work in the future.

Prof. Konstantas envisages these eventually becoming an on-demand, door-to-door service, where people can get picked up and go where they want rather than follow predetermined itineraries and fixed bus stops.

In Nordhavn, AVENUE will test and implement an autonomous ‘mobility cloud’, currently under development, to link the shuttles with existing public transport, such as the nearby train station. An on-demand service will ultimately allow passengers to access the available transport with a single app, says Prof. Konstantas.

Integrating autonomous shuttles into the wider transport system is vital if they are to take off, says Guido Di Pasquale from the International Association of Public Transport (UITP) in Brussels, Belgium.

‘Autonomous vehicles have to be deployed as fleets of shared vehicles, fully integrated and complementing public transport,’ he said. ‘This is the only way we can ensure a sustainable usage of AVs in terms of space occupancy, traffic congestion and the environment.’

Single service

Di Pasquale points to a concept known as Mobility-as-a-Service (MaaS) as a possible model for future transport systems. MaaS combines public and private transport and lets users plan, manage and pay for trips as a single service through an online account. Uber, UbiGo in Sweden and Transport for Greater Manchester in the UK, for example, are exploring MaaS so that users can get from one destination to another by combining modes of transport and booking the journey as one trip, choosing the option that best suits them in terms of cost, time and convenience.
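As a toy illustration of the booking-as-one-trip idea (the data fields and itineraries below are invented for illustration, not any real MaaS API), a planner might score candidate multi-leg itineraries against the user's preferred criterion:

```python
# Toy sketch of a MaaS-style trip chooser (illustrative only; not a real
# operator API). Each candidate itinerary combines several transport legs;
# the user picks by cost, total time or number of transfers.

def choose_trip(itineraries, preference="cost"):
    """Return the itinerary that minimises the preferred criterion."""
    def score(trip):
        if preference == "cost":
            return sum(leg["price"] for leg in trip)
        if preference == "time":
            return sum(leg["minutes"] for leg in trip)
        return len(trip)  # fewest transfers

    return min(itineraries, key=score)

itineraries = [
    [{"mode": "bus", "price": 2.0, "minutes": 25},
     {"mode": "train", "price": 4.5, "minutes": 15}],
    [{"mode": "shared car", "price": 9.0, "minutes": 22}],
]

cheapest = choose_trip(itineraries, "cost")  # bus + train, 6.50 total
fastest = choose_trip(itineraries, "time")   # shared car, 22 minutes
```

A real MaaS platform would add live timetables, booking and payment behind the same single-account interface, but the selection step reduces to a comparison like this one.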

Di Pasquale coordinates a project called SHOW, which aims to deploy more than 70 automated vehicles in 21 European cities to assess how they can best be integrated with different wider transport systems and diverse users’ needs. They are testing combinations of AV types, from shuttles to cars and buses, in real-life conditions over the next four years. During this time, he expects the project’s AVs to transport more than 1,500,000 people and 350,000 containers of goods. ‘SHOW will be the biggest ever showcase and living lab for AV fleets,’ he said.

He says that most of the cities involved have tested autonomous last-mile vehicles in the past and are keen to include them in their future sustainable urban mobility plans.

However, rolling out AVs requires overcoming city-specific challenges, such as demonstrating safety.

‘Safety and security risks have restricted the urban use of AVs to dedicated lanes and low speed — typically below 20km/h,’ explained Di Pasquale. ‘This strongly diminishes their usefulness and efficiency, as in most city environments there is a lack of space and a high cost to keep or build such dedicated lanes.’

It could also deter users. ‘For most people, a speed barely faster than walking is not an attractive solution,’ he said.

We want to prove that autonomous vehicles are a sustainable, viable and environmental solution for urban and suburban public transportation.

Prof. Dimitri Konstantas, University of Geneva, Switzerland

Di Pasquale hopes novel technology will make higher speeds and mixed traffic safer, and ensure fleets operate safely by monitoring and controlling them remotely.

Each city participating in SHOW will use autonomous vehicles in various settings, including mixed and dedicated lanes, at various speeds and in different weather conditions. For safety and regulatory reasons, all of them will have a driver present.

The objective is to make the vehicles fully autonomous, without the need for a driver, and to optimise the service so as to encourage people to shift from car ownership to shared services, according to Di Pasquale. ‘This would also make on-demand and last-mile services sustainable in less densely populated areas or rural areas,’ he said.

Authorisation

But the technical issues of making the vehicle autonomous are only a part of the challenge.

There’s also the issue of who pays for it, says Di Pasquale. ‘AVs require sensors onboard, as well as adaptations to the physical and digital infrastructure to be deployed,’ he explained. ‘Their market deployment would require cities to drastically renew their fleets and infrastructures.’

SHOW’s pilots are scheduled to start two years from now, as each city has to prepare by obtaining the necessary permits and getting the vehicles and technology ready, says Di Pasquale.

Getting authorisation to operate in cities is one of the biggest hurdles. City laws and regulations differ everywhere, says Prof. Konstantas.

AVENUE is still awaiting city licences to test in Nordhavn, despite a national law passed on 1 July 2017 allowing AVs to be tested in public areas. Pilots are currently under way in Lyon, France, and in Luxembourg. In Geneva, the team has obtained the required licences, and the world’s first on-demand AV public transportation service will be rolled out on a 69-bus-stop circuit this summer.

AVENUE’s initial results show that cities need to make substantial investments to deploy AVs and to benefit from this technology. The legal and regulatory framework in Europe will also need to be adapted for smooth deployment of services, says Prof. Konstantas.

Both he and Di Pasquale hope their work can pave the way to convince operators and authorities to invest in fleets across Europe’s cities.

‘Depending on the willingness of public authorities, this can take up to four years until we see real, commercially sustainable AV-based public transportation services on a large scale in Europe,’ said Prof. Konstantas.

The research in this article was funded by the EU.

This post, ‘Driverless shuttles: what are we waiting for?’, was originally published on Horizon: the EU Research & Innovation magazine | European Commission.

Robotic arms and temporary motorisation – the next generation of wheelchairs

An automated wheelchair with an exoskeleton arm is designed to help people with varying forms of disability carry out daily tasks independently. Image credit – AIDE, Universidad Miguel Hernandez

By Julianna Photopoulos

Next-generation wheelchairs could incorporate brain-controlled robotic arms and rentable add-on motors in order to help people with disabilities more easily carry out daily tasks or get around a city.

Professor Nicolás García-Aracil from the Universidad Miguel Hernández (UMH) in Elche, Spain, has developed an automated wheelchair with an exoskeleton robotic arm to use at home, as part of a project called AIDE.

It uses artificial intelligence to extract relevant information from the user, such as their behaviour, intentions and emotional state, and also analyses its environmental surroundings, he says.

The system, which is based on an arm exoskeleton attached to a robotised wheelchair, is designed to help people living with various degrees and forms of disability carry out daily tasks such as eating, drinking and washing, on their own and at home. While sitting in the wheelchair, the user wears the robotised arm, which helps them grasp objects and bring them close. And because the whole system is connected to the home automation system, they can ask the wheelchair to move in a specific direction or go into a particular room.

Its mechanical wheels are made to move in narrow spaces, ideal for home use, and the system can control the environment remotely – for example, switching lights on and off, using the television or making and answering phone calls. What’s more, it can anticipate the person’s needs.

‘We can train artificially intelligent algorithms to predict what the user wants to do,’ said Prof. García-Aracil. ‘Maybe the user is in the kitchen and wants a drink. The system provides their options (on a monitor) so they can control the exoskeleton to raise the glass and drink.’

Multimodal system

The technology isn’t simple. As well as the exoskeleton robotic arm attached to the robotic wheelchair, the chair has a small monitor and uses various sensors, including two cameras to recognise the environment, voice control, eye-tracking glasses to recognise objects, and sensors that capture brain activity, eye movements and signals from muscles.

Depending on each person’s needs and disabilities, the multiple devices are used accordingly. For example, someone with a severe disability such as a cervical spinal cord injury, who wouldn’t otherwise be able to use voice control, could use the brain activity and eye movement sensors combined.

The user wears a cap fitted with electrodes that records the brain activity controlling the exoskeleton hand’s movement, explains Prof. García-Aracil. So when the user imagines closing their hand around an object, for example, the exoskeleton arm actually does it for them. This technology is called brain-neural-computer interaction (BNCI), in which brain – as well as muscle – activity can be recorded and used to interact with an electronic device.

But the system can sometimes make mistakes so there is an abort signal, says Prof. García-Aracil. ‘We use the horizontal movement of the eye, so when you move your eyes to the right you trigger an action, but when you move your eyes to the left you abort that action,’ he explains.
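The trigger/abort rule Prof. García-Aracil describes can be sketched as a simple decision function. This is an illustrative guess at the logic, not the AIDE system’s actual code; the threshold and the normalised gaze signal are assumptions:

```python
# Hypothetical sketch of the eye-movement safeguard described above:
# a horizontal gaze shift to the right confirms a pending action, a shift
# to the left aborts it. Signal range and threshold are made up.

def interpret_gaze(horizontal_offset, threshold=0.3):
    """Map a normalised horizontal eye offset (-1 to 1) to a command."""
    if horizontal_offset > threshold:
        return "trigger"   # eyes moved right: execute the pending action
    if horizontal_offset < -threshold:
        return "abort"     # eyes moved left: cancel the pending action
    return "idle"          # no clear movement: do nothing

# e.g. a strong rightward glance confirms, a leftward glance cancels
command = interpret_gaze(0.6)
```

In a real BNCI pipeline the offset would come from the eye-tracking glasses after calibration and filtering, but the confirm/abort decision reduces to a threshold test like this.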

The AIDE prototype was successfully tested last year by 17 people with disabilities including acquired brain injury (ABI), multiple sclerosis (MS), and spinal cord injury (SCI), at the Cedar Foundation in Belfast, Northern Ireland. Its use was also demonstrated at UMH in Elche, with the user asking to be taken to the cafeteria, then asking for a drink, and drinking it with the help of the exoskeletal arm.

Now more work needs to be carried out to make the system easier to use, cheaper and ready for the market, says Prof. García-Aracil.

But it’s not just new high-tech wheelchairs that can increase the functionality for users. Researchers on the FreeWheel project are developing a way of adding motorised units to existing wheelchairs to improve their utility in urban areas.

‘Different settings have different challenges,’ said project coordinator Ilaria Schiavi at IRIS SRL in Torino, Italy. For example, someone with a wheelchair may struggle to go uphill or downhill without any physical assistance whilst outdoors. But this system could allow people using wheelchairs to have an automated wheelchair experience regardless of whether they are indoors or outdoors, she says.

Rentable

The motorised units would attach to manual wheelchairs people already own, helping them move around more easily and independently, Schiavi explains. The units could either be rented for short periods and tailored to the location – an indoor or outdoor environment – or bought, in which case they would be completely personalised to the individual.

The researchers are also developing an app for the user which would include services such as ordering a bespoke device to connect the wheelchair and the unit, booking the unit, controlling it, and planning a journey within urban areas for shopping or sightseeing.

‘You have mobility apps that allow you to book cars, for example. Our app will allow the owner of a wheelchair to firstly subscribe to the service, which would include buying a customised interface to use between their own wheelchair and the motorising unit they have booked,’ said Schiavi.

‘A simple customised interface will allow wheelchair users to motorise their exact device, as it is used by them, at a reasonable cost.’

Customisation is made possible through additive manufacturing (AM) technologies, she says. AM technologies build 3D objects by adding materials, such as metal or plastic, layer-by-layer.

Schiavi and her colleagues are exploring various uses for the motorised units, and next year the team plans to test the system with mobility-impaired people in both Greece and Italy. They hope that, once developed, the units will be made available like city bicycles in public spaces such as tourist attractions or shopping centres.

The research in this article was funded by the EU.

A ‘cookbook’ for vehicle manufacturers: Getting automated parts to talk to each other

Automated, networked truck convoys could save fuel and cut down on driving time. Image credit – MAN Truck & Bus

By Sandrine Ceurstemont

Semi-autonomous cars are expected to hit the roads in Europe next year with truck convoys following a few years later. But before different brands can share the roads, vehicle manufacturers need to agree on standards for automated functions.

Automation will increasingly allow vehicles to take over certain aspects of driving. However, automated functions are still being fine-tuned, for example to ensure smooth transitions when switching between the human driver and driverless mode.

Standards also need to be set across different car manufacturers, which is one of the goals of a project called L3Pilot. Although each brand can maintain some unique features, automated functions that help with navigating traffic jams, parking and motorway and urban driving must be programmed to do the same thing.

‘It’s like if you rent a car today, your expectation is that it has a gear shift, it has pedals, it has a steering wheel and so on,’ said project coordinator Aria Etemad from Volkswagen Group Research in Wolfsburg, Germany. ‘The approaches and the interfaces to the human driver are the same.’

To get the same functions from different brands to operate in an identical way, the team is creating a code of practice. This will result in a checklist for developers to run through when creating a self-driving function. ‘It’s like a cookbook for how to design and develop automated driving functions,’ said Etemad.

So far, the project team, which includes representatives from 13 vehicle manufacturers, has been conducting initial tests to make sure each company’s technology works. Cars are equipped with several sensors as well as cameras and computers, which need to be properly calibrated to deal with real-world traffic.

The next step is to test the automated functions on public roads to ensure that the vehicles are ready. The tests will begin this month. Volunteer drivers will be chosen from diverse backgrounds, including different ages and genders. ‘We plan, all in all, to have 1,000 drivers using 100 vehicles in 10 different countries,’ Etemad said.

Transition

The technologies being trialled will cover a wide range of situations from overtaking on motorways to driving through urban intersections. Long journeys are also planned to see how people are able to transition back to driving after a long time in automated driving mode.

‘We really want to understand if the way we have designed our systems is the way drivers expect them to behave,’ said Etemad.

The team will also be investigating other aspects of automated driving such as the effect on traffic flow and CO2 emissions. Self-driving features are likely to make driving more efficient due to connectivity between vehicles and infrastructure, for example, although research so far has shown mixed results due to other contributing factors such as more people choosing to drive.

The first automated functions, which should be available in the next year, are likely to be for motorway driving, according to Etemad. Parking functions are likely to come to market next followed by automated urban driving, which is much more complex due to additional elements such as pedestrians and cyclists moving around.

‘There will be a good contribution from the project with results about long-term automated driving and a general understanding of people and functions that will impact how these systems are developed,’ said Etemad.

Automated functions are of interest for trucks too, where networked vehicles driving in a convoy could help save fuel, cut down on driving time or help with traffic flow. Truck platooning involves several trucks linking up through a wireless connection when they are close by so that they can share information and use automated functions to drive together as one. But so far, the concept has mostly been demonstrated in trucks of the same brand and in very few cases, with two brands.

‘Each automotive manufacturer tries it out and develops the technique within their own factory,’ said Dr Marika Hoedemaeker, a senior project manager at the Netherlands Organisation for Applied Scientific Research (TNO) in Helmond.

Truck platooning

Hoedemaeker and her project partners are now trying to break new ground by developing truck platooning that works across different brands as part of a project called ENSEMBLE. ‘Now we’re going to show that we can do this together with all the European truck manufacturers,’ said Dr Hoedemaeker.

The first phase, which the team has just completed, involved coming up with specifications that need to be implemented by all manufacturers. For example, braking and speed keeping will be automated whereas steering won’t be.

They’ve also come up with guidelines for how a convoy will respond when it enters a zone with a new speed limit or passes a toll gate, for example. ‘It’s not only communicating with the platoon but also with the outside world,’ she said.

The team is also keen to gauge the impact that a platoon will have on traffic. They would like to calculate its effect on traffic flow and whether it would in fact reduce congestion on the roads. Driving simulators will also be used to see how other drivers react when they encounter an automated truck convoy. ‘Will it change their traffic behaviour or will they never drive in the right lane anymore? There are lots of questions around this other traffic behaviour as well,’ said Dr Hoedemaeker.

Once specifications have been implemented in the trucks, they will start to test platoons on dedicated grounds then on public roads. Since trucks cross borders quite often, they will have to respect laws in different countries which vary across EU member states. The minimum following distance between vehicles, for example, differs from country to country.

In a final showcase event in May 2021, trucks from seven different brands, such as Daimler and Volvo, will drive together in one or more convoys across national borders, most likely to a key goods transport destination such as a large European port.

Following this deployment on European roads, Dr Hoedemaeker expects the first generation of platooning trucks to start being manufactured and sold a year after the project ends in 2021.

Since platoons are being developed worldwide, the standards created during the project could also be adopted more widely.

‘I think there is potential that the rest of the world could say they (Europe) already thought about the standards so we can use these and not do the whole thing over again,’ she said.

The research in this article was funded by the EU.

From robotic companions to third thumbs, machines can change the human brain

Cozmo robots and their corresponding tablets are being distributed to participants to take home so that they can interact with them for a week for an experiment being carried out by social robotics professor Emily Cross. Image credit – Ruud Hortensius and Emily Cross

By Frieda Klotz

People’s interactions with machines, from robots that throw tantrums when they lose a colour-matching game against a human opponent to the bionic limbs that could give us extra abilities, are not just revealing more about how our brains are wired – they are also altering them.

Emily Cross is a professor of social robotics at the University of Glasgow in Scotland who is examining the nature of human-robot relationships and what they can tell us about human cognition.

She defines social robots as machines designed to engage with humans on a social level – from online chatbots to machines with a physical presence, for example, those that check people into hotel rooms.

According to Prof. Cross, as robots can be programmed to perform and replicate specific behaviours, they make excellent tools for shedding light on how our brains work, unlike humans, whose behaviour varies.

‘The central tenet of my questions is: can we use human-robot interaction to better understand the flexibility and fundamental mechanisms of social cognition and the human brain?’ she said.

Brain imaging shows that a sad, happy or neutral robotic expression will engage the same parts of the brain as a human face with similar expressions.

Through their project called Social Robots, Prof. Cross and her team are using neural decoding techniques to probe the extent to which human feelings towards a robot change depending on how it behaves.

Tantrums

When the robots used in the project lose a game, they alternate between throwing tantrums or appearing dejected. ‘So far, people actually find it really funny when the robot gets angry,’ she said. ‘But people do respond to them quite strongly and that’s really interesting to see.’

Having robots as colleagues has been shown to affect humans in complex ways. Researchers at the University of Washington found that when soldiers used robots in bomb disposal, they developed emotional attachments towards them and felt frustration, anger or sadness if their robot was destroyed.

Prof. Cross says that from an evolutionary perspective, this doesn’t make sense. ‘We care about people and perhaps animals that might help us or hurt us,’ she said. ‘But with machines it’s a bit more of a mystery and understanding how far we can push that (to develop social relationships with machines) is a really, really fascinating question.’

It’s important to understand these dynamics since, as she points out, robots are already working as companions in nursing homes or even as tutors in early childhood education. Home care and education are prime areas of social robotics research, with R&D efforts focusing on adults suffering from dementia and young children.

Ten-hour rule

Typically, studies on such groups observe interactions over a relatively short time-span. They rarely exceed what Prof. Cross describes as a ten-hour rule, beyond which study participants tend to get bored of their robotic toys. But her team is looking at how feelings towards robots evolve over time.

As part of the project, the researchers send a palm-sized Cozmo robot home with study participants and instruct them to interact with it every day for a week by playing games or introducing it to their friends and pets. The participants’ brains are imaged at the start and end of that period to track changes.

‘If we’re going to have robots in our home environment, if they’re going to be in our schools teaching our kids across weeks, if not years, if they’re going to be peoples’ social companions, we want to know a lot more than just what happens after ten hours’ (of exposure),’ she said.

‘We want to know how people’s social bonds and relationships to robots change across many, many more hours.’

With such technologies set to become a bigger part of our future, other studies are investigating how the brain reacts to a different kind of robot – wearable robotic limbs that augment the body, providing extra abilities.

Wearables could have social and healthcare benefits. For instance, a third arm could assist surgeons to carry out procedures more safely rather than relying on human assistants, enable people to complete their household chores much faster or help construction workers.

But even as the technology capabilities develop apace, Dr Tamar Makin, a neuroscientist at University College London, UK, is exploring what it would take for the brain to accept and operate a robotic appendage as part of the body, through a five-year project called Embodied Tech.

Additional thumb

In order to understand how the brain deals with an extra body part, Dr Makin’s team asks participants to wear an additional opposable thumb for a week. Created by a designer named Dani Clode, the thumb is controlled by pressure sensors worn on the big toes.

Product designer Dani Clode created a prosthetic opposable thumb for people to wear as an extra digit. Video credit: Dani Clode

With the additional thumb, the augmented hand almost has the capabilities of two hands, giving people extra capacity to carry out actions. The question is what effect that has on the brain.

The study is still underway but preliminary results indicate that the presence of an extra thumb alters the brain’s internal map of what the biological hand looks like. Scans show that the brain represents the fingers as collapsing onto each other, away from the thumb and index finger.

This mirrors what happens in diseases like dystonia, when the representation of fingers begins to merge – for instance, when musicians use their fingers excessively – and causes cramp-like pain. The same effect could theoretically cause pain in the wearer of an extra thumb.

‘One important interim message we have is that there are potential costs, not just benefits, to using augmentation technology,’ said Dr Makin.

She believes that the newness of human augmentation means there are lots of unanswered questions but it’s vital to explore the challenges of wearable robotics in order to fully realise the promises, such as multitasking or safer working conditions.

‘I feel like we have a responsibility to gain a much better understanding of how having good control of an additional body part is going to change the representation of the body parts you already have.’

The research in this article was funded by the European Research Council.

Robots are being programmed to adapt in real time

In trials, the ResiBot robot learned to walk again in less than two minutes after one of its legs was removed. Image credit – Antoine Cully / Sorbonne University

By Gareth Willmer

This research is part of a field of work that is building machines able to provide real-time help using only limited data as input. Standard machine-learning algorithms often need to process thousands of possibilities before deciding on a solution, which may be impractical in pressurised scenarios where fast adaptation is critical.

After Japan’s Fukushima nuclear disaster in 2011, for example, robots were sent into the power plant to clear up radioactive debris in conditions far too dangerous for humans. The problem, says robotics researcher Professor Jean-Baptiste Mouret, is that the robots kept breaking down or came across hazards that stopped them in their tracks.

As part of the ResiBots initiative, he is designing lower-cost robots that can last long periods without needing constant human maintenance for breakages and that are better at overcoming unexpected obstacles.

The ResiBots team is using what it refers to as micro-data learning algorithms, which can help robots adapt before one’s eyes, in a similar way to how animals react to problems. An injured animal, for example, will often find a way to keep moving, even if it doesn’t know exactly what the problem is.

In contrast, most current robots self-diagnose a problem before working out a way to overcome it, says Prof. Mouret, principal investigator at ResiBots and a senior researcher at the Inria research centre in France.

‘We’re trying to shortcut this by finding a way for them to react without necessarily having developed an understanding of what’s wrong,’ he said.

Rather than self-diagnosing, the aim for these robots is to learn in a proactive way by trial and error what alternative actions they can take. This could help them overcome difficulties and stop them from shutting down in situations such as disaster scenarios like Fukushima, said Prof. Mouret.

This may not be full artificial intelligence, but Prof. Mouret points out that having knowledge of everything is not essential for getting a robot to work.

‘We’re not trying to solve everything,’ he said. ‘I’m more interested in how they can adapt – and, in fact, adapting to what’s happening is some of what makes animals intelligent.’

Simulated childhood

In one of the most promising approaches developed in the ResiBots project, the robots have a simulated childhood, in which they learn different ways to move their body using an algorithm that searches ahead of time to collect examples of useful behaviours. 

This means that when seeking a way to move, the robots need to choose from one of about 13,000 behaviours rather than an estimated 10⁴⁷ options that standard algorithms could select from. And the aim is for them to try only a handful of these before finding one that works.

Most of ResiBots’ tests are currently carried out on a six-legged robot that must find new ways to move after having one or more legs removed. In the latest trials, Prof. Mouret said, the robots learned to walk within one to two minutes of having a leg removed, meaning they generally needed to test fewer than 10 behaviours before finding one that worked.
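Greatly simplified, the adaptation loop resembles trying behaviours from the pre-computed repertoire, most promising first, until one performs well enough on the damaged robot. The sketch below is a hypothetical illustration with a tiny hand-made repertoire; the real ResiBots work uses far more sophisticated selection over the learned behaviour map:

```python
# Simplified sketch of micro-data adaptation: test pre-learned behaviours
# on the real (damaged) robot, best predicted first, and stop as soon as
# one performs well enough. Repertoire and "robot" are invented.

def adapt(repertoire, try_on_robot, good_enough=0.7, max_trials=10):
    """Return the best (behaviour, speed) found within the trial budget."""
    ranked = sorted(repertoire, key=lambda b: b["predicted_speed"],
                    reverse=True)
    best = None
    for behaviour in ranked[:max_trials]:
        speed = try_on_robot(behaviour)        # one real-world trial
        if best is None or speed > best[1]:
            best = (behaviour, speed)
        if speed >= good_enough:               # good enough: stop searching
            break
    return best

# Hand-made repertoire: behaviours that rely on leg 3 fail once it breaks.
repertoire = [
    {"id": 0, "uses_leg_3": True,  "predicted_speed": 0.95},
    {"id": 1, "uses_leg_3": False, "predicted_speed": 0.80},
    {"id": 2, "uses_leg_3": True,  "predicted_speed": 0.60},
    {"id": 3, "uses_leg_3": False, "predicted_speed": 0.40},
]

def try_on_robot(behaviour):
    # Simulated damage: any gait using the broken leg achieves zero speed.
    return 0.0 if behaviour["uses_leg_3"] else behaviour["predicted_speed"]

behaviour, speed = adapt(repertoire, try_on_robot)
# The top-ranked gait fails on the broken leg; the second-ranked one works,
# so the robot adapts after only two trials.
```

The key property mirrored here is that the robot never diagnoses the fault; it simply discovers, in a handful of trials, a behaviour that still works.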

Test robots can learn to overcome a broken leg in under two minutes. Video credit – Horizon

In total, the researchers are working on half a dozen robots at varying levels of complexity, including a child-like humanoid robot known as iCub. Though the much more complex iCub is not yet being used in many trials, the team hopes to use it more over time.

‘Humanoids have the potential of being highly versatile and adapting well to environments designed for humans,’ said Prof. Mouret. ‘For instance, nuclear power plants have doors, levers and ladders that were designed for people.’

There are, however, some big challenges still to overcome, including the fact that a robot needs to be moved back to its starting position once a limb is removed, rather than being able to carry on from the injury site towards the target.

Safety

There are also wider safety issues involving such robots – for example, ensuring that they do not harm earthquake survivors while rescuing them, particularly if the robot is learning by trial and error, said Prof. Mouret.

He believes it will be at least four or five years before such a robot could be used in the field, but is hopeful that the techniques can eventually be employed in all types of robot – not just those for disaster situations, but in the home and other scenarios.

But it’s not just mechanics that can help robots navigate the real world. Robots may also adapt better if they can more strongly connect language to reality. 

Professor Gemma Boleda of the Universitat Pompeu Fabra in Spain has a background in linguistics, and her team is trying to link research in this field to artificial intelligence to help machines better understand the world around them, as part of a project called AMORE.

It’s something that could be useful for making technology such as GPS more intelligent. For example, when driving in a car, the GPS system could specify that you turn right where ‘the big tree’ is, distinguishing it from several other trees.

Prof. Boleda says this has been hard to do in the past because of the difficulty of modelling the way humans link language with reality.

‘In the past, language had largely been represented out of context,’ said Prof. Boleda.

AMORE’s aim is to get computers to understand words and concepts in a real-world context rather than as individual words in isolation, she says. For instance, a robot would learn to connect the phrase ‘this dog’ with an actual dog in the room, representing both the words and the real-world entities.
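One common way to realise this kind of grounding is to represent both words and perceived entities as vectors in the same space, then resolve a referring expression to the most similar entity in the scene. The sketch below is a deliberately tiny illustration with hand-made vectors and invented entity names; AMORE's actual models learn their representations from data rather than having them written by hand.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hand-made "embeddings" for words (features: furry, barks, sits-on-desk).
word_vectors = {"dog": [0.9, 0.8, 0.0], "laptop": [0.0, 0.0, 0.9]}

# Entities currently perceived in the room, e.g. from a vision model.
scene = {"entity_1": [0.85, 0.75, 0.1],   # an actual dog
         "entity_2": [0.05, 0.0, 0.95]}  # a laptop on the desk

def resolve(phrase):
    """Link a phrase like 'this dog' to the most similar entity in the scene."""
    noun = phrase.split()[-1]
    vec = word_vectors[noun]
    return max(scene, key=lambda e: cosine(vec, scene[e]))

print(resolve("this dog"))     # -> entity_1
```

The phrase is thus connected to a concrete entity in the room, rather than remaining an isolated word, which is the shift the project describes.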

‘The crux of these models is that they are able to learn their own representations from data,’ she added. ‘Before, researchers had to tell the machine what the world looked like.’

Giving machines a better understanding of the world around them will help them do ‘more with less’ in terms of the amount of data they need and get better at predicting outcomes, Prof. Boleda said.

It could also help with the issue of having enough physical space on devices like mobile phones for the next wave of intelligent applications.

‘I am working with language, but this problem of needing a lot of data is a problem that plagues many other domains of artificial intelligence,’ said Prof. Boleda. ‘So if I develop methods that can do more with less, then these can also be applied elsewhere.’

The research in this article was funded by the EU’s European Research Council.

Drones and satellite imaging to make forest protection pay

Better tracking of forest data will make the climate change reporting process easier for countries who want compensation for protecting their carbon stock. Image credit – lubasi, licensed under CC BY-SA 2.0

By Steve Gillman

Every year 7 million hectares of forest are cut down, chipping away at the 485 gigatonnes of carbon dioxide (CO2) stored in trees around the world, but low-cost drones and new satellite imaging could soon protect these carbon stocks and help developing countries get paid for protecting their trees.

‘If you can measure the biomass you can measure the carbon and get a number which has value for a country,’ said Pedro Freire da Silva, a satellite and flight system expert at Deimos Engenharia, a Portuguese technology company.

International financial institutions, such as the World Bank and the European Investment Bank, provide developing countries with economic support to keep forests’ carbon stocks intact through the UN REDD+ programme.

The more carbon a developing country can show it keeps in its forests, the more money the government could get, which would give them a greater incentive to protect these lands. But, according to Silva, these countries often lack the tools to determine the exact amount of carbon stored in their forests and that means they could be missing out on funding.

‘If you have a 10% error in your carbon stock (estimation), that can have a financial value,’ he said, adding that it also takes governments a lot of time and energy to collect the relevant data about their forests.

To address these challenges, a project called COREGAL developed automated low-cost drones that map biomass. They put a special sensor on drones that fly over forests and analyse Global Positioning System (GPS) and Galileo satellite signals as they bounce back through a tree canopy, which then reveals the biomass density of an area and, in turn, the carbon stocks.

‘The more leaves you have, the more power (from GPS and Galileo) is lost,’ said Silva, who coordinated the project. This means that when the drone picks up weaker satellite navigation readings, there is more biomass below.

‘If you combine this data with satellite data we get a more accurate map of biomass than either would (alone),’ he added.
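In outline, the two steps Silva describes — converting signal loss into a biomass estimate, and fusing that with a satellite-derived estimate — might look like the sketch below. The calibration constant, signal levels and uncertainties are all invented for illustration; real biomass retrieval requires site-specific calibration campaigns.

```python
# Hypothetical calibration: tonnes/ha of biomass per dB of signal loss.
CAL_TONNES_PER_DB = 12.0

def biomass_from_attenuation(open_sky_db, under_canopy_db):
    """More power lost through the leaves -> more biomass below the drone."""
    loss_db = open_sky_db - under_canopy_db
    return max(loss_db, 0.0) * CAL_TONNES_PER_DB

def fuse(drone_est, drone_var, sat_est, sat_var):
    """Inverse-variance weighting: trust the less uncertain estimate more."""
    w_drone = 1.0 / drone_var
    w_sat = 1.0 / sat_var
    return (w_drone * drone_est + w_sat * sat_est) / (w_drone + w_sat)

# A 6 dB drop under the canopy maps to 72 t/ha under this toy calibration.
drone = biomass_from_attenuation(open_sky_db=-125.0, under_canopy_db=-131.0)
combined = fuse(drone, drone_var=9.0, sat_est=80.0, sat_var=36.0)
print(f"drone estimate: {drone:.0f} t/ha, fused estimate: {combined:.1f} t/ha")
```

Because the fused value weights each source by its uncertainty, the combined map is more accurate than either the drone or the satellite estimate alone, which is the point Silva makes above.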

Sentinels

The project trialled its drone prototype in Portugal, with Brazil in mind as the target end user, as the country is on the frontline of global deforestation. According to Brazilian government data, an area about five times the size of London was destroyed between August 2017 and July this year.

COREGAL’s drones could end up enabling countries such as Brazil to access more from climate funds, in turn creating a stronger incentive for governments to protect their forests. Silva also believes the drones could act as a deterrent against illegal logging.

‘If people causing deforestation know that there are (drone) flight campaigns or people going to the field to monitor forests it can demotivate them,’ he said. ‘It is like a sentinel system.’

In the meantime, governments in other developing countries still need the tools to help them fight deforestation. According to Dr Thomas Häusler, a forest and climate expert at GAF, a German earth observation company, the many drivers of deforestation make it very difficult to sustainably manage forests.

‘(Deforestation) differs between regions and even in regions you have different drivers,’ said Dr Häusler. ‘In some countries they (governments) give concessions for timber logging and companies are (then) going to huge (untouched forest) areas to selectively log the highest value trees.’

Logging like this is occurring in Brazil, central Africa and Southeast Asia. When it happens, Dr Häusler says, it can cause huge collateral damage because loggers leave behind roads that local populations then use to access previously untouched forests, which they convert for agriculture or harvest for fuelwood.

Twenty percent of Earth's forest cover is roughly 863,000 km². Image credit - Horizon

Demand for timber and agricultural produce from developed countries can also drive deforestation in developing countries because their governments see the forest as a source of economic development and then allow expansion.

With such social, political and economic dependency, it can be difficult, and expensive, for governments to implement preventative measures. According to Dr Häusler, to protect untouched forests these governments should be compensated for fighting deforestation.

‘To be compensated you need strong (forest) management and observation (tools),’ said Dr Häusler, who is also the coordinator of EOMonDis, a project developing an Earth-observation-based forest monitoring system that aims to support governments.

Domino effect

The EOMonDis team combines high-resolution data from the European Sentinel satellites, available every five days through Copernicus, the EU’s Earth observation system, with data from the North American Landsat-8 satellite.

Automated processing using special algorithms generates detailed maps on the current and past land use and forest situation to identify the carbon-rich forest areas. The project also has access to satellite data going as far back as the 1970s which can be used to determine how much area has been affected by deforestation.
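At its core, measuring deforested area from two such maps is a pixel-by-pixel comparison. The toy sketch below shows only that final step on an invented 3×3 grid with an assumed pixel size; in practice the hard part is the earlier stage of classifying multispectral satellite imagery into reliable forest/non-forest maps.

```python
PIXEL_AREA_HA = 1.0      # assume each pixel covers 1 hectare (100 m x 100 m)

past    = [[1, 1, 1],    # 1 = forest, 0 = non-forest (invented example maps)
           [1, 1, 0],
           [0, 1, 1]]
present = [[1, 0, 1],
           [1, 0, 0],
           [0, 0, 1]]

def deforested_hectares(before, after, pixel_ha=PIXEL_AREA_HA):
    """Count pixels that flipped from forest to non-forest and convert to area."""
    lost = sum(1 for row_b, row_a in zip(before, after)
                 for b, a in zip(row_b, row_a) if b == 1 and a == 0)
    return lost * pixel_ha

print(deforested_hectares(past, present))   # -> 3.0
```

Summed over a whole country, the same comparison against archive imagery is what lets analysts say how much area has been lost since a given date.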

As with COREGAL, these maps and the information they contain are used to put a value on the most carbon-rich forest areas, meaning countries can access more money from international financial institutions. The project is almost finished, and the team hopes soon to have a commercially viable system ready for use.

‘The main focus is the climate change reporting process for countries who want compensation in fighting climate change,’ said Dr Häusler. ‘We can support this process by showing the current land-use situation and show the low and high carbon stocks.’

Another potential user of this system is the international food industry, which sells products containing commodities linked to deforestation, such as palm oil, cocoa, meat and dairy. In response to social pressure, and to their own contribution to the problem, some of these big companies have committed to zero deforestation in their supply chains.

‘When someone (a company) is declaring land as zero deforestation, or that palm plantations fit into zero deforestation, they have to prove it,’ said Dr Häusler. ’And a significant result (from the project) is we can now prove that.’

Dr Häusler says the system will help civil society and NGOs who want to make sure industry or governments are behaving themselves as well as allow the different groups to make environmentally sound decisions when choosing land for different purposes.

‘We can show everybody – the government, NGO stakeholders, but also the industry – how to better select the areas they want to use.’

The research in this article was funded by the EU. If you liked this article, please consider sharing it on social media.

Models of dinosaur movement could help us build stronger robots and buildings

Researchers are using computer simulations to estimate how 11 different species of extinct archosaurs, such as Batrachotomus, might have moved. Image credit – John Hutchinson

By Sandrine Ceurstemont

From about 245 to 66 million years ago, dinosaurs roamed the Earth. Although well-preserved skeletons give us a good idea of what they looked like, the way their limbs worked remains a bigger mystery. But computer simulations may soon provide a realistic glimpse into how some species moved and inform work in fields such as robotics, prosthetics and architecture.

John Hutchinson, a professor of evolutionary biomechanics from the Royal Veterinary College in Hertfordshire, UK, and his colleagues are investigating the locomotion of the earliest, small dinosaurs, as part of the five-year-long Dawndinos project which began in 2016.

‘These dinosaurs have been hugely neglected,’ Prof. Hutchinson said. ‘People – including me – have mostly been studying the celebrity dinosaurs like T. rex.’

About 225 million years ago, during the late Triassic period, these small dinosaurs were in the minority, whereas the bigger crocodile-like animals that lived alongside them were more numerous and diverse. Dinosaurs somehow went on to thrive while most other animals from that period became extinct.

Compared to their quadrupedal, heavy-built contemporaries, what stands out about these early dinosaurs is that they had an erect posture and could, at least intermittently, walk on two limbs. One theory is that their style of locomotion gave them a survival edge.

‘The idea of this project is to test that idea,’ Prof. Hutchinson said.

The team has started to develop computer simulations to estimate how 11 different species of extinct archosaurs – the group of animals that includes crocodiles, birds, their relatives and dinosaurs – might have moved. They will focus on five different types of motion: walking, running, turning, jumping and standing.

Simulations

To test whether their simulations are accurate, the researchers plan to give the same treatment to their living relatives – crocodiles and birds – as well. They will then compare the results to actual measurements of motion to determine how good their computer models of extinct animals are.

‘It will be the first time we ground-truth (test with empirical evidence) these methods very rigorously with the best possible data we can get,’ Prof. Hutchinson said.

So far, they’ve modelled the movement of a Mussaurus – an early cousin of giant plant-eating sauropod dinosaurs such as Brontosaurus. The Mussaurus was much smaller and researchers wanted to see whether it moved on four legs like its larger relatives. The first reconstructions of the animal had it on four legs because it had quite big arms, said Prof. Hutchinson.

Using scans of well-preserved fossils from Argentina, they were able to produce new models of its movement. Prof. Hutchinson and his team found that it was in fact bipedal. It couldn’t have walked on four legs since the palms of its front limbs faced inwards and the forearm joints weren’t capable of rotating downwards. Therefore, it wouldn’t have been able to plant its front legs on the ground.

‘It wasn’t until we put the bones together in a 3D environment and tried playing with their movements that it became clear to us that this wasn’t an animal with very mobile arms and hands,’ Prof. Hutchinson said.

After modelling the large forearm of the Mussaurus, the Dawndinos team realised that it could not be used for walking. Video courtesy: John Hutchinson

Robotics

The simulations produced during the project could be useful for zoologists. But they could have less obvious applications too, for example, helping to improve how robots move, according to Prof. Hutchinson.

Accurate models are needed to replicate the motion of animals, which robotics researchers often take inspiration from. Mimicking a crocodile, for example, could be of interest to create a robot that can both swim and walk on land.

Prof. Hutchinson also regularly gets contacted by film and documentary makers who are interested in using his simulations to create realistic animations. ‘It’s hard to make bigger, or unusual, animals move correctly if the physics isn’t right,’ Prof. Hutchinson said.

Understanding the locomotion of the very largest dinosaurs is the aim of a project being undertaken by palaeobiology researcher Alexandra Houssaye and her colleagues from France’s National Centre for Scientific Research and the National Museum of Natural History in Paris. Through their Gravibone project, which began last year, they want to pin down the limb bone adaptations that allow large animals to carry a heavy skeleton.

‘We really want to understand what (bone features) are linked to being massive,’ Dr Houssaye said.

Massive

So far, research has shown that the long bones in the limbs of bigger animals are more robust than those of smaller animals. But this general trend has only been superficially observed. The outer and inner bone structures have adapted over time to help support animals’ weight. For example, whereas smaller terrestrial animals have hollow limb bones, massive ones like elephants, rhinos and hippos have connective tissue in the middle.

Among the largest animals and their ancestors there are also other differences. The limb bones of modern rhinos, for example, are short and heavy. But their prehistoric relative Indricotherium, the largest land mammal that ever lived, had a less stocky skeleton. ‘It’s interesting to see that the biggest didn’t have the most massive (frame),’ Dr Houssaye said.

The team is studying both living and extinct animals, focussing on elephants, rhinos, hippos, prehistoric mammals and dinosaurs such as sauropods – a group that includes the biggest terrestrial animals of all time.

So far, they have compared the ankle bones of horses, tapirs and rhinos, and fossils of rhinos’ ancestors. They found that for animals of the same mass there were differences depending on whether they were short and stout or had longer limbs. In less stocky animals, the two ankle bones tended to be more distinct, whereas they were more strongly connected in massively built animals, probably to reinforce the joint.

‘It’s not only the mass (of the animal) but how the mass is distributed on the body,’ said Dr Houssaye. ‘For us that was interesting.’

3D modelling

Their next step will be to scan different limb bones and analyse their inner structure. They will also use 3D modelling to figure out, for example, how much load different parts of the bones can bear.

The results from the project could help make more efficient prosthetics for people and animals, Dr Houssaye said. Designers will be able to better understand how different features of limb bones, such as thickness and orientation, relate to their strength, enabling them to create materials that are lighter but more resistant. 

Similarly, Dr Houssaye has also had interest from the construction industry which is looking for new types of materials and more effective building techniques. Pillars supporting heavy buildings, for example, could be made using less material by improving their inner structure instead.

‘How a skeleton adapts (to heavy weight) has implications for construction,’ Dr Houssaye said. ‘(Architects) are trying to create structures that are able to support heavy weight.’

The research in this article was funded by the European Research Council. If you liked this article, please consider sharing it on social media.

Memory-jogging robot to keep people sharp in ‘smart’ retirement homes

Sensors placed throughout a retirement home helped the ENRICHME robot to keep track of the movements and activities of residents taking part in the project’s trial. Image credit – ENRICHME

By Steve Gillman

Almost a fifth of the European population are over 65 years old, but while quality of life for this age bracket is better than ever before, many will at some point suffer from a decline in their mental abilities.

Without adequate care, cognitive abilities can decline more quickly, yet with the right support people can live longer, healthier and more independent lives. Researchers working on a project called ENRICHME have attempted to address this by developing a robotic assistant to help improve mental acuity.

‘One of the big problems of mild cognitive impairment is temporary memory loss,’ said Dr Nicola Bellotto from the School of Computer Science at the University of Lincoln in the UK and one of the principal investigators of ENRICHME.

‘The goal of the project was to assist and monitor people with cognitive impairments and offer basic interactions to help a person maintain their cognitive abilities for longer.’

The robot moves around a home providing reminders about medication as well as offering regular physical and mental exercises – it can even keep tabs on items that are easily misplaced.

A trial was conducted with the robot in three retirement homes in England, Greece and Poland. At each location the robot helped one or more residents, and was linked to sensors placed throughout the building to track the movements and activities of those taking part.

‘All this information was used by the robot,’ said Dr Bellotto. ‘If a person was in the bedroom or kitchen the robot could rely on the sensors to know where the person is.’

The robot was also kitted out with a thermal camera so it could measure a person’s temperature in real time, allowing it to estimate their respiration and heart rates. This could reveal whether someone was experiencing high levels of stress during a particular activity, prompting the robot to act accordingly.
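As a rough illustration of how a vital sign can fall out of a thermal signal: breathing periodically warms the skin around the nose, so counting the oscillations in that temperature over time yields a respiration rate. The signal below is synthetic and the frame rate is an assumption; a real system would read an actual thermal camera and need far more robust signal processing.

```python
import math

SAMPLE_HZ = 10                     # assumed thermal camera frame rate
BREATHS_PER_MIN = 15               # ground truth used to synthesise the signal

# One minute of synthetic nose-area temperatures oscillating with breathing.
samples = [34.0 + 0.3 * math.sin(2 * math.pi * (BREATHS_PER_MIN / 60) * t / SAMPLE_HZ - 0.3)
           for t in range(60 * SAMPLE_HZ)]

def respiration_rate(signal, hz):
    """Count rising mean-crossings of the signal -> cycles per minute."""
    mean = sum(signal) / len(signal)
    rising = sum(1 for a, b in zip(signal, signal[1:]) if a < mean <= b)
    minutes = len(signal) / hz / 60.0
    return rising / minutes

print(round(respiration_rate(samples, SAMPLE_HZ)))   # prints 15
```

The same peak-counting idea, applied to subtler temperature fluctuations, is one way a robot could flag unusually fast breathing as a possible sign of stress.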

This approach is based around a principle called ambient assisted living, which combines technology and human carers to improve support for older people. In ENRICHME’s case, Dr Bellotto said their robots could be operated by healthcare professionals to provide tailored care for elderly patients.

The users involved in the trials showed a high level of engagement with the robot they were living with – even naming it Alfie in one case – and also provided good feedback, said Dr Bellotto. But he added that the robot is still a few years away from being rolled out to the wider public.

‘Some of the challenges we had were how to approach the right person in these different environments because sometimes a person lives with multiple people, or because the rooms are small and cluttered and it is simply not possible for the robot to move safely from one point to another,’ he said.

Dr Bellotto and his team are now applying for funding for new projects to solve the remaining technical problems, which hopefully one day will help them take the robot one step closer to commercialisation.

This type of solution would help increase people’s physical and psychological wellbeing, which could help to reduce public spending on care for older people. In 2016 the total cost of ageing in the EU was 25% of GDP, and this figure is expected to rise in the coming decades.

Ageing populations

‘One of the big challenges we have in Europe is the high number of elderly people that the public health system has to deal with,’ said Professor María Inés Torres, a computer scientist from the University of the Basque Country in Spain.

Older people can fall into bad habits for a variety of reasons, such as being unable to go for walks and cook healthy meals because of physical limitations. The loss of a loved one can also lead to reduced socialising, exercising or eating well. These unhealthy habits are all exacerbated by depression caused by loneliness, and with 32% of people aged over 65 living alone in Europe, this is a significant challenge to overcome.

‘If you are able to decrease the correlation between depression and age, so keeping people engaged with life in general and social activities, these people aren’t going to visit the doctor as much,’ said Prof. Torres, who is also the coordinator of EMPATHIC, a project that has developed a virtual coach to help assist elderly people to live independently.

The coach will be available on smart devices such as phones or tablets and is focused on engaging with someone to help them keep up the healthier habits they may have had in the past.

‘The main goal is that the user reflects a little bit and then they can agree to try something,’ said Prof. Torres.

For instance, the coach may ask users if they would like to go to the local market to prepare their favourite dinner, and then turn it into a social activity by inviting a friend to come along. This approach addresses the three key factors behind the deterioration of older people’s health, said Prof. Torres: poor nutrition, lack of physical activity and loneliness.

For the coach to be effective the researchers have to build a personal profile for each user, as every person is different and requires specific suggestions relevant to them. They do this by building a database for each person over time and combining it with dialogue designed around the user’s culture.

The researchers are testing the virtual coach on smart devices with 250 older people in three areas – Spain, France and Norway – who are already providing feedback on what works and what doesn’t, which will increase the chances of the virtual coach actually being used.

By the end of the project in 2020, the researchers hope to have a prototype ready for the market, but Prof. Torres insisted that it will not replace healthcare professionals. Instead she sees the smart coach as another tool to help older people live a more independent life – and in doing so reduce the pressure on public healthcare systems.

The research in this article was funded by the EU. If you liked this article, please consider sharing it on social media.

Garbage-collecting aqua drones and jellyfish filters for cleaner oceans

An aqua drone developed by the WasteShark project can collect litter in harbors before it gets carried out into the open sea. Image credit – WasteShark

By Catherine Collins

The cost of sea litter in the EU has been estimated at up to €630 million per year. It is mostly composed of plastics, which take hundreds of years to break down in nature, and has the potential to affect human health through the food chain because plastic waste is eaten by the fish that we consume.

‘I’m an accidental environmentalist,’ said Richard Hardiman, who runs a project called WASTESHARK. He says that while walking at his local harbour one day he stopped to watch two men struggle to scoop litter out of the sea using a pool net. Their inefficiency bothered Hardiman, and he set about trying to solve the problem. It was only when he delved deeper into the issue that he realised how damaging marine litter, and plastic in particular, can be, he says.

‘I started exploring where this trash goes – ocean gyres (circular currents), junk gyres, and they’re just full of plastic. I’m very glad that we’re now doing something to lessen the effects,’ he said.

Hardiman developed an unmanned robot, an aqua drone that cruises around urban waters such as harbours, marinas and canals, eating up marine litter like a Roomba of the sea. The waste is collected in a basket which the WasteShark then brings back to shore to be emptied, sorted and recycled.

The design of the autonomous drone is modelled on a whale shark, the ocean’s largest known fish. These giant filter feeders swim around with their mouths open and lazily eat whatever crosses their path.

The drone is powered by rechargeable electric batteries, ensuring that it doesn’t pollute the environment through oil spillage or exhaust fumes, and it is relatively silent, avoiding noise pollution. It produces zero carbon emissions and moves quite slowly, allowing fish and birds simply to swim away when it gets too close for comfort.

‘We’ve tested it in areas of natural beauty and natural parks where we know it doesn’t harm the wildlife,’ said Hardiman. ‘We’re quite fortunate in that, all our research shows that it doesn’t affect the wildlife around.’

WasteShark’s autonomous drone is modelled on a whale shark. Credit – RanMarine Technology

WasteShark is one of a number of new inventions designed to tackle the problem of marine litter. A project called CLAIM is developing five different kinds of technology, one of which is a plasma-based tool called a pyrolyser. 

Useful gas

CLAIM’s pyrolyser will use heat treatment to break down marine litter to a useful gas. Plasma is basically ionised gas, capable of reaching very high temperatures of thousands of degrees. Such heat can break chemical bonds between atoms, converting waste into a type of gas called syngas.

The pyrolyser will be mounted onto a boat collecting floating marine litter – mainly large items of plastic which, if left in the sea, will decay into microplastic – so that the gas can then be used as an eco-friendly fuel to power the boat, or to provide energy for heating in ports.

Dr Nikoleta Bellou of the Hellenic Centre for Marine Research, one of the project coordinators of CLAIM, said: ‘We know that we humans are actually the key drivers for polluting our oceans. Unlike organic material, plastic never disappears in nature and it accumulates in the environment, especially in our oceans. It poses a threat not only to the health of our oceans and to the coasts but to humans, and has social, economic and ecological impacts.’

The researchers chose areas in the Mediterranean and Baltic Seas to act as their case studies throughout the project, and will develop models that can tell scientists which areas are most likely to become litter hotspots. A range of factors influence how littered a beach may be – it’s not only affected by litter louts in the surrounding area but also by circulating winds and currents which can carry litter great distances, dumping the waste on some particular beaches rather than others.

CLAIM’s other methods to tackle plastic pollution include a boom – a series of nets criss-crossing a river that catches all the large litter that would otherwise travel to the sea. The nets are then emptied and the waste is collected for treatment with the pyrolyser. There have been problems with booms in the past, when bad weather conditions cause the nets to overload and break, but CLAIM will use automated cameras and other sensors that could alert relevant authorities when the nets are full.

Microplastics

Large plastic pieces that can be scooped out of the water are one thing, but tiny particles known as microplastics that are less than 5mm wide pose a different problem. Scientists on the GoJelly project are using a surprising ingredient to create a filter that prevents microplastics from entering the sea – jellyfish slime.

The filter will be deployed at wastewater treatment plants, a known source of microplastics. The method has already proven successful in the lab, and GoJelly is now planning to scale up the biotechnology for industrial use.

Dr Jamileh Javidpour of the GEOMAR Helmholtz Centre for Ocean Research Kiel, who coordinates the project, said: ‘We have to be innovative to stop microplastics from entering the ocean.’

The GoJelly project kills two birds with one stone – tackling the issue of microplastics while simultaneously addressing the problem of jellyfish blooms, where the creatures reproduce in high enough levels to blanket an area of ocean.

Jellyfish are one of the most ancient creatures on the planet, having swum in Earth’s oceans since before the time of the dinosaurs. On the whole, due to a decline in natural predators and changes in the environment, they are thriving. When they bloom, jellyfish can sting swimmers and disrupt fisheries.

Fishermen often throw caught jellyfish back into the sea as a nuisance but, according to Dr Javidpour, jellyfish can be used much more sustainably. Not only can their slime be used to filter out microplastics, they can also be used as feed for aquaculture, for collagen in anti-ageing products, and even in food.

In fact, part of the GoJelly project involves producing a cookbook, showing people how to make delicious dishes from jellyfish. While Europeans may not be used to cooking with jellyfish, in many Asian cultures they are a daily staple. However, Dr Javidpour stresses that the goal is not to replace normal fisheries.

‘We are mainly ecologists, we know the role of jellyfish as part of a healthy ecosystem,’ she said. ‘We don’t want to switch from classical fishery to jellyfish fishery, but it is part of our task to investigate if it is doable, if it is sustainable.’

The research in this article has been funded by the EU.

We want to end the de-industrialisation of Europe – Prof. Jürgen Rüttgers

Artificial intelligence software technologies are very important for European industry, says Prof. Jürgen Rüttgers. Image credit – Dirk Vorderstraße, licensed under CC BY 2.0

By Rex Merrifield

Prof. Jürgen Rüttgers leads the High Level Group on Industrial Technologies, which on 24 April released a report called Re-finding industry – Defining Innovation, making recommendations on EU research and innovation priorities for industry in the next funding programme.

Your report says that AI should be designated as a key enabling technology in the next funding programme, which means it is classed as a priority policy area. What does this mean in practice?

‘Artificial intelligence is something relatively new and a field of strong competition, not only in the world, but also in Europe. It is therefore essential to find a common way ahead to encourage our research. Europe has long experience with identifying key enabling technologies and through these we can organise research and innovation, so we find the best answers from new technologies to benefit our industries.

‘That is why we are defining not only a new innovation policy, but also a policy to re-find industry. We want to end the de-industrialisation of Europe of recent years and, by setting these technologies as priorities, we can find a strong way ahead for industry.

‘If we do it well, and I am sure that is possible, we also have the possibility in Europe to recover jobs we have been losing abroad. That is an idea that is very closely tied with the idea of strong productivity growth. But we are aiming not only for economic growth. Among our recommendations is that the European Union and Member States aim for inclusive growth and sustainable protection of our planet. That is essential to our idea for the future.’


What other technologies are crucial to Europe’s industry?

‘Industry is central to Europe’s economy. It contributes to Europeans’ prosperity and provides jobs to 36 million people in Europe – one in five jobs. So the new key enabling technologies for the 21st century need to put us in front in the global competition between Europe, the United States and China. We look at these technologies in three areas.

‘It is very important for the EU to prioritise production technologies. These include advanced manufacturing technologies, advanced materials and nanotechnologies, and life science.

‘And in the 21st century it is essential to encourage digital technologies – micro- and nano-electronics and photonics.

‘Thirdly, we also need to advance cyber technologies and central to this is artificial intelligence, along with cyber security and connectivity.

‘If you look at data generation and handling, big data analytics, machine learning and deep learning, robots and virtual agents, you can see that artificial intelligence software technologies are very important for European industry in this century.’

What impact do you hope that your report will have on research output, industry and for the wider public?

‘Central to the great challenges in the 21st century is the transition from the industrial society to the knowledge society. In this society, we have a new production factor – knowledge, which is a most important resource.

‘We also face globalisation, through communication without borders. In the future, no economy can be organised only in a national way.

‘The knowledge society will be accelerated by the widespread digitisation of the economy and society, leading to this process being called the ‘digital revolution’. It is comparable in scope and impact to the industrial revolution of some 200 years ago.

‘Many people fear that this change is not good for them, so it is necessary to have open discussion in all European Member States about this great change. If we manage it well, we will have more jobs, more productivity growth and will be at the front of international competition. But we will only have a chance to achieve this future if we have a strong research ecosystem.’

What else needs to happen to encourage innovation?

‘The European policy for the future is research and innovation policy. It is currently under-financed and for a new system of innovation to grow strongly, we must also have a bigger budget. I believe a most important decision for the European Council in coming months is the debate over the budget for 2021-2027.

‘And we also know what is necessary to do over the next decade, together with all citizens in Europe, to achieve success. I believe we also need a clear message to the people of Europe, so that they will agree.’

Your proposals emphasise sustainability, job creation, fighting inequalities and support to democracy. Could you explain the importance of these elements in research and innovation?

‘In research and innovation policy there is not only an argument for jobs, but also the argument that it is good to live in a united Europe with these values.

‘Young people must see they have a chance to work, have good education, to have good vocational or academic jobs. And we need to make sure there is a chance for more new jobs in new start-ups, in new small and medium enterprises, in new industries and in a new innovation system. If we do that, we will see great acceptance among European Union citizens. They will accept new things, not fear them. The reason is that they know we have an inclusive society and not only a very successful economy.’

If you liked this article, please consider sharing it on social media.

Sherlock Drones – automated investigators tackle toxic crime scenes

Using drones to gather information and samples from a hazardous scene can help incident commanders make critical decisions. Image credit – ROCSAFE

by Anthony King

Crimes that involve chemical, biological, radiological or nuclear (CBRN) materials pose a deadly threat not just to the target of the attack but to innocent bystanders and police investigators. These crimes often involve unusual circumstances or are terrorist-related incidents, such as an assassination attempt or poisons sent through the mail.

In the recent notorious case of poisoning in the UK city of Salisbury in March 2018, a number of first responders and innocent bystanders were treated in hospital after two victims of chemical poisoning were found unconscious on a park bench. One policeman who attended the scene became critically ill after apparent exposure to a suspected chemical weapon, said to be a nerve agent known as Novichok. Police said a total of 21 people required medical care after the incident.

Past examples of rare but toxic materials at crime scenes include the 2001 anthrax letter attacks in the US and the Tokyo subway sarin gas attack in 1995. Following the radioactive poisoning of the Russian former spy, Alexander Litvinenko in London, UK, in 2006, investigators detected traces of the toxic radioactive material polonium in many locations around the city.

Despite these dangers, crime scene investigators must begin their forensic investigations immediately. European scientists are developing robot and remote-sensing technology to provide safe ways to assess crime or disaster scenes and begin gathering forensic evidence.

Harm’s way

‘We will send robots into harm’s way instead of humans,’ explained Professor Michael Madden at the National University of Ireland Galway, who coordinates a research project called ROCSAFE. ‘The goal is to improve the safety of crime scene investigators.’

The ROCSAFE project, which ends in 2019, will deploy remote-controlled aerial and ground-based drones equipped with sensors to assess the scene of a CBRN event without exposing investigators to risk. This will help to determine the nature of the threat and gather forensics.

In the first phase of a response, a swarm of drones with cameras will fly into an area to allow investigators to view it remotely. Rugged sensors on the drones will check for potential CBRN hazards. In the second phase, ground-based robots will roll in to collect evidence, such as fingerprint or DNA samples.

The ROCSAFE aerial drone could assess crime or disaster scenes such as that of a derailed train carrying radioactive material. It will deploy sensors for immediate detection of radiation or toxic agents and collect air samples to test later in a laboratory. Meanwhile, a miniature lab-on-a-chip on the drone will screen returned samples for the presence of viruses or bacteria, for instance.

An incident command centre usually receives a huge volume of information in a short space of time – including real-time video and images from the scene. Commanders need to process a lot of confusing information in an extreme situation quickly and so ROCSAFE is also developing smart software to lend a hand.

Rare events

‘These are rare events. This is nobody’s everyday job,’ said Prof. Madden. ‘We want to use artificial intelligence and probabilistic reasoning to reduce cognitive load and draw attention to things that might be of interest.’

As an example, image analysis software might flag an area of damaged vegetation as a possible chemical spill and recommend that samples be taken. Information such as this could be presented to the commander on a screen in the form of a clickable map, in a way that makes their job easier.

Sometimes, the vital evidence itself could be contaminated. ‘There may be some physical evidence we need to collect – a gun or partly exploded material or a liquid sample,’ said Prof. Madden. ‘The robot will pick up, tag and bag the evidence, all in a way that will stand up in court.’ The researchers are constructing a prototype six-wheeled robot for this task that is about 1.5-metres-long and that can handle rough terrain.

Helping forensic teams to deal with hazardous evidence is a new forensic toolbox called GIFT CBRN. The GIFT researchers have devised standard operating procedures on how to handle, package and analyse toxins such as ricin, which is deadly even in minute amounts.

When the anthrax powder attacks took place in the US in 2001, the initial response of the security services was slow, partly because of the unprecedented situation. GIFT scientists have drawn up ‘how to’ guides for investigators, so they can act quickly in response to an incident.

‘We want to find the bad guys quickly so we can stop them and arrest those involved,’ said Ed van Zalen at the Netherlands Forensic Institute, who coordinated the GIFT project.

Nerve agents

As well as containment, GIFT devised sensing technology such as a battery-powered boxed device that can be brought to a crime scene to identify nerve agents like sarin within an hour or two. This uses electrophoresis, a chemical technique that identifies charged molecules by applying an electric field and analysing their movements. Usually, samples must be collected and returned to the lab for identification, which takes far longer.

Additionally, they developed a camera to detect radioactive material that emits potentially damaging radiation called alpha particles. This form of radiation is extremely difficult to detect; even a Geiger counter – used to detect most radiation – cannot pick it up. Polonium, the substance used to murder Litvinenko, emits alpha particles that cause radiation poisoning, and because such an attack was unprecedented, the initial failure to detect it slowed the police investigation.

‘That detector would have been very helpful at the time of the Litvinenko case,’ said van Zalen.

The research in this article is funded by the EU.

Swarming drones could help fight Europe’s megafires

Nearly 700,000 hectares of land in the EU were destroyed by forest fires between January and September 2017. Image credit – CC0 Public Domain
by Rob Coppinger

Swarms of firefighting drones could one day be deployed to tackle hugely destructive megafires that are becoming increasingly frequent in the Mediterranean region because of climate change, arson and poor landscape management.

It’s one of a number of initiatives looking at how best to fight large fires from the air – a challenge that’s becoming more and more common.

A 2017 report on forest fires by the EU’s Joint Research Centre said that the year would ‘likely be remembered as one of the most devastating wildfire seasons in Europe since records began’, after the destruction of nearly 700,000 hectares of land in the EU by early September.

Such fires are dangerous not only for people who live in the area but also for the crews of people whose job it is to put the fires out. But using intelligent robots to scout the area and drop water can allow humans to stand further back from the danger zone, only looking at the drones’ data to make decisions from the safety of a command and control centre.

Because drones can fly day or night and gain rapid access to previously inaccessible urban or rural fires, they can help to save the lives of both the public and first responders.

Torrential

Multiple autonomous drones dropping 600 litres of water every minute during the night while other unmanned vehicles refill to repeat the attack on a raging fire is the vision of Spanish company Drone Hopper. Despite this torrential approach, ‘we are not meant to be competitors with the airplanes and helicopters, we want to be complementary,’ Drone Hopper’s chief executive officer, Pablo Flores, said.

Their drone uses heat cameras to locate the fire, analyse it, send back the data, and identify what type of fire it is. At just over a metre and a half in length, it can be deployed from an aircraft, as well as a ground vehicle.

The drone can hover directly over a specific burning area like a helicopter, but it uses many propellers rather than a single rotor. Once over its target, it will release its liquid cargo as a mist designed specifically for the fire type identified.

A mist is good at fighting fire because it cools the area by evaporation and it blocks the transfer of heat to anything flammable nearby. To create the right type of mist, the Drone Hopper uses a proprietary magnetic system and the jet wash from its many propellers to direct the released water and nebulise it.

Flores wants to offer his drone, which is still in development, to local authorities for firefighting. ‘They can’t buy a $30 million airplane, but can have this platform and have their own means to (tackle a fire).’ He says that the Drone Hopper UAV costs a fifth as much per litre as a water tanker aircraft.

Spanish company Drone Hopper wants to create a fleet of drones that can drop 600 litres of water every minute during the night. Image credit – Drone Hopper

But, there is a regulatory obstacle. At the moment, it hasn’t been proven that drones can reliably act autonomously, so national rules generally require each one to have a human remote pilot.

Dr Nazim Kemal Ure, an assistant professor in the aerospace department at Istanbul Technical University in Turkey, said: ‘With multiple autonomous systems many things can be achieved much more quickly.’

Swarming

He is developing a way of coordinating drones that he hopes could contribute to a change in regulation. By the end of the year, Dr Ure expects to be field testing autonomous drones and their swarming algorithms, developed under the DUF project.

Like Drone Hopper, ‘we are detecting the fire by using vision,’ Dr Ure explained. In initial testing the image processing will not be done by the drones, but eventually in real-world flight-testing the algorithms will be installed onboard.

His drones would fly over a burning area and, by examining the vegetation and wind direction and other factors, predict the fire’s spread and direction. With that information they would then precisely drop retardant to stop the fire.
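The prediction step Dr Ure describes – combining vegetation and wind direction to anticipate where a fire will spread – can be pictured with a toy cellular-automaton model. This is an illustrative sketch only; the DUF project's actual algorithms are not described in the article, and the grid, probabilities and `step` function below are assumptions:

```python
import random

def step(grid, wind, fuel, p_base=0.3, wind_boost=0.4):
    """One step of a toy fire-spread model on a grid.

    grid:  2D list, cells are 'unburnt', 'burning' or 'burnt'
    wind:  (dy, dx) unit direction the wind blows towards
    fuel:  2D list of fuel loads in [0, 1] (vegetation density)
    """
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for y in range(rows):
        for x in range(cols):
            if grid[y][x] != 'burning':
                continue
            nxt[y][x] = 'burnt'  # a burning cell is consumed this step
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx_ = y + dy, x + dx
                    if (dy, dx) == (0, 0) or not (0 <= ny < rows and 0 <= nx_ < cols):
                        continue
                    if grid[ny][nx_] != 'unburnt':
                        continue
                    # Ignition is more likely downwind and in dense vegetation
                    p = p_base * fuel[ny][nx_]
                    if (dy, dx) == wind:
                        p += wind_boost
                    if random.random() < p:
                        nxt[ny][nx_] = 'burning'
    return nxt
```

Running `step` repeatedly yields a predicted burn front; a drone planner could then target retardant drops at the cells most likely to ignite next.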

Dr Ure added that further flight-testing may see cooperation with the Turkish government’s Ministry of Forestry and involve a controlled fire.

However, there is work yet to be done to improve the computer-generated fire images in the simulated environment they are using for training the artificial intelligence. ‘Our models are, in the graphical parts, not state-of-the-art,’ said Dr Ure. He wants to have ‘hyper-realistic’ fire for the drones’ vision analysis software to learn from. For him, that will help ensure the drones will operate well in the real world.

And in the real world, Dr Ure sees many other applications. ‘I would like to extend this algorithm to other scenarios such as search and rescue and planetary exploration,’ he said.

Types of forest fires
Ground fires occur 25 to 50 cm underground and move slowly, burning through peat and roots. They are notoriously difficult to put out and, if the conditions are right, they can smoulder through the winter and then move above ground in spring.

Surface fires move at a speed of between 3 and 300 metres per minute but burn only the lower vegetation and leave the trees unaffected. Of all the fires, they cause the least damage and are usually easy to put out.

Ladder fires climb up the taller trees and engulf smaller vegetation. Vines and invasive plants help the fire gain momentum.

Crown fires burn trees all the way to the canopy and are the hottest and most dangerous of wildfires. They can spread quickly and are very difficult to put out, partly due to the height of the flames. They can spread beyond natural firebreaks such as rivers through a process called spotting, where wind or hot air carries a piece of burning wood elsewhere and starts a new fire.

The research in this article is funded by the EU. If you liked this article, please consider sharing it on social media.

More info
Drone Hopper
DUF

Robots and workers of the world, unite!

Robots in the workforce will give rise to new jobs for humans, including safety engineers, robot specialists and augmented reality experts, according to researchers. Image credit – ‘FANUC robots’, by Mixabest – Own work, CC BY-SA 3.0

Robots are already changing the way we work – particularly in factories – but worries that they will steal our jobs are only part of the picture, as new technologies are also opening up workplace opportunities for workers and are likely to create new jobs in the future.

Last year, the BBC reported that up to 800 million workers worldwide could lose their jobs to robotic automation by 2030. This statistic, from a McKinsey Global Institute study, led to countless headlines asking, will robots take your job?

The study found that robots will eliminate some jobs, but also create new ones. As the field develops, European roboticists are busy investigating how factory robots could create new opportunities for workers in manufacturing jobs.

The MANUWORK project is collaborating with non-profit group Lantegi Batuak in Spain, which helps to incorporate people with disabilities into the world of work.

The project scientists are testing a number of assistive technologies including augmented reality (AR) displays that can help workers with disabilities in complex tasks such as the assembly and wiring of electrical cabinets. The display shows step-by-step wiring instructions to the worker.

‘We want to introduce a robot to work with them in the assembly process, to indicate cable connections or do some quick quality checks,’ said Dr Kosmas Alexopoulos at the University of Patras in Greece, who coordinates the project. This kind of human-robot collaboration technology could allow greater involvement of people with disabilities in manufacturing roles across Europe.

Predictable

Predictable physical work in factories plays to the strengths of robots, but even the most modern factories are not run entirely by machines.

European car manufacturing is leading the way in adopting robotics. Last year, the number of robots in French car factories rose by 22% to 1,400 units, with around 9 robots for every 100 workers.

Robots are used heavily for the assembly of cars, mainly for body welding and positioning of large metal parts. Later, the skilled human labour comes in to create the interior. ‘Assembling car interiors is complex and it requires the skills of a human to perform,’ explained Dr Sotiris Makris, industrial robotics expert at the University of Patras.

His industrial robots lab in Patras took a leading role in a project called ROBO-PARTNER, which aimed to safely mix people with robots operating in the same workspace to perform a car assembly task. Under normal circumstances, close man-machine teamwork is not possible because robots and humans are kept apart for safety reasons. The human worker brings intelligence and fine skills, while their robot buddy delivers super-human strength and precision.

Allowing robots and humans to work together adds flexibility. – Prof. Björn Hein, Karlsruhe Institute of Technology, Germany

The team took a real-life scenario as a starting point. In an automobile factory, the rear axle of a vehicle – which can weigh over 50 kilograms (kg) – must be brought into position by a worker. They must secure the part and then fit drum brake components, which also involves heavy lifting and manipulation of flexible wire parts.

‘The worker carries the drums, mounts them and must screw them in place. It is quite a stressful process, both physically and cognitively,’ said Dr Makris.

The project developed a robot, capable of lifting 130 kg, to move the axle and brake parts into place for the human worker. Importantly, the worker remains in the vicinity as the robot – essentially a powerful arm that can move around and grip and lift items – manoeuvres the car parts into position.

Strenuous

At present, one particular 14-kg part for the rear wheel must be lifted 500 times in a single shift by car assembly workers, but the robot could shoulder this strenuous work. This would mean physical strength would no longer be a reason for someone not to do this job.

Video cameras, sensors and adjustable safety zones prevent the robot from moving dangerously close to the worker. The worker interacts with the robot using a control panel, but also uses virtual reality (VR) glasses that visualise the next task for the worker. The completion of each task is signalled using a smart watch and the human remains in charge.

Dr Makris is optimistic that such research will allow people and robots to work side-by-side. There will be new jobs too. Factories of the future will create new roles, such as safety engineers, robot specialists and AR experts, Dr Makris predicted.

Robots are also being used to transform warehouses for online businesses, which are currently divided into two areas. In the centre, wheeled robots shift shelves around and move items to the perimeter, where human workers are waiting. The workers and robots are kept apart for safety. A laser barrier detects any unauthorised entries by people, which will cause the system to shut down. In the case of a robot failure, the warehouse has to be stopped so a service technician can do repairs.

But what if robots and people could safely tango together in warehouses? This is the vision of SafeLog, a project that is developing a flexible warehouse system where humans and automated guided vehicles safely share the same space.

‘With SafeLog, the service technician could just walk in the warehouse and fix the robot, so the warehouse could stay online,’ explained Professor Björn Hein, robotics engineer at the Karlsruhe Institute of Technology, Germany, and coordinator of the project. Right now there are usually only a few dozen robots, so downtime is not a significant issue, but as robot numbers increase it could become a problem.

Safety vest

SafeLog is creating a safety vest to be worn by warehouse workers. This wirelessly transmits the worker’s location and can be detected by robots nearby, so that they will sense if a human is too close and stop.

For coordination, SafeLog scientists are developing algorithms that track and predict the movements of people using the vest and robots. This would be used by a fleet management system, the silicon brain that will safely guide hundreds of robots around a warehouse.
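The core safety rule – halt any robot that comes too close to a tracked vest – can be sketched in a few lines. This is a hypothetical illustration of the idea, not SafeLog's fleet management system; the function name, positions and radius are all invented:

```python
import math

def safe_commands(robots, vests, safety_radius=2.0):
    """Return per-robot commands: 'stop' if any worker's vest is within
    the safety radius, otherwise 'go'.

    robots, vests: dicts mapping an id to an (x, y) position in metres.
    """
    commands = {}
    for rid, (rx, ry) in robots.items():
        too_close = any(
            math.hypot(rx - vx, ry - vy) <= safety_radius
            for vx, vy in vests.values()
        )
        commands[rid] = 'stop' if too_close else 'go'
    return commands
```

A real fleet manager would run a check like this continuously against predicted, not just current, positions, so that robots slow down before a worker actually enters the safety zone.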

The workers will wear smart glasses that will warn them about objects close by and help guide them to items in the warehouse.

‘We intend to make it possible for warehouses to become even bigger, because allowing robots and humans to work together adds flexibility and makes it easier to extend,’ said Prof. Hein.

The SafeLog system will be put through its paces in a real warehouse in early 2019.

All research in this article is funded by the EU.

‘Earworm melodies with strange aspects’ – what happens when AI makes music

A new AI machine creates new music from songs it’s fed, mimicking their style. Image credit – FlowMachines

by Kevin Casey

The first full-length mainstream music album co-written with the help of artificial intelligence (AI) was released on 12 January and experts believe that the science behind it could lead to a whole new style of music composition.

Popular music has always been fertile ground for technological innovation. From the electric guitar to the studio desk, laptops and the wah-wah pedal, music has the ability to absorb new inventions with ease.

Now, the release of Hello World, the first entire studio album co-created by artists and AI could mark a watershed in music composition.

Stemming from the FlowMachines project, funded by the EU’s European Research Council, the album is the fruit of the labour of 15 artists, music producer Benoit Carré, aka Skygge, and creative software designed by computer scientist and AI expert François Pachet.

Already Belgian pop sensation Stromae and chart-topping Canadian chanteuse Kiesza have been making waves with the single Hello Shadow.

The single Hello Shadow, featuring Stromae and Kiesza, is taken from the AI-co-written album, Hello World. Video credit – SKYGGE MUSIC

The software works by using neural networks – artificial intelligence systems that learn from experience by forming connections over time, thereby mimicking the biological networks of people’s brains. Pachet describes its basic job as ‘to infer the style of a corpus (of music) and generate new things’.

A musician firstly provides ‘inspiration’ to the software by exposing it to a collection of songs. Once the system understands the style required it outputs a new composition.

‘The system analyses the music in terms of beats, melody and harmony,’ said Pachet, ‘and then outputs an original piece of music based on that style.’
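As a rough illustration of ‘inferring the style of a corpus and generating new things’, a first-order Markov chain over notes captures the idea in miniature. This is a deliberately simplified sketch, not the FlowMachines software; the function names and toy corpus are invented:

```python
import random
from collections import defaultdict

def learn_style(melodies):
    """Build a first-order Markov model: for each note, record which
    notes follow it anywhere in the corpus."""
    transitions = defaultdict(list)
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            transitions[a].append(b)
    return transitions

def generate(transitions, start, length):
    """Generate a new melody in the learned style by sampling a
    successor note by note; stop early if a note has no successors."""
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:
            break
        melody.append(random.choice(options))
    return melody
```

FlowMachines goes far beyond this – notably by generating under user constraints, so a musician can pin down the parts they want to keep – but the learn-then-sample loop is the common starting point.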

Creative workflow

The design challenge with this software was to make it adapt to the creative workflow of musicians without becoming a nuisance.

‘The core (problem) was how to do that so that (it) takes into account user constraints. Why? Because if you compose music, actually you never do something from scratch from A to Z,’ said Pachet.

He outlines a typical scenario where the AI software generates something and only parts of it are useful: the musician wants to keep those parts, drop the rest and generate new sounds from the retained partial output. It’s a complex requirement, in other words.

‘Basically, the main contribution of the project was to find ways to do that, to do that well and to do that fast,’ said Pachet. ‘It was really an algorithmic problem.’ As creative workers driven by intuition, musicians need direct results to maintain their momentum. A clunky tool with ambivalent results would not last long in a creative workflow.

Pachet is satisfied that his technical goal has been met and that the AI will generate music ‘quickly and under user constraints’.

After years of development and refinement, the AI music tool now fits on a laptop of the kind found in any recording studio, anywhere. In the hands of music producer Carré, the application became the creative tool that built Hello World.

Computer scientist and AI expert François Pachet created a system that co-writes music. Image credit – Kevin Casey/ Horizon

Collaboration

As a record producer, Carré collaborated closely with the artists in the studio to write and produce songs. So, as the resident musical expert, can Carré say if this is a new form of music?

‘It’s not a new form of music,’ he said, ‘It’s a new way to create music.’

Carré said he believes the software could lead to a new era in composition. ‘Every time there is a new tool there is a new kind of compositional style. For this project we can see that there is a new kind of melody that was created.’ He describes this as ‘earworm melodies with strange aspects’.

He also says that the process is a real collaboration between human and machine. The system creates original compositions that are then layered into songs in various forms, whether as a beat, a melody or an orchestration. During the process, artists such as Stromae are actively involved in deciding which of the musical fragments the AI provides to include, and how.

‘You can recognise all the artists because they have made choices that are their identity, I think,’ said Carré.

Pachet concurs. ‘You know in English you say every Lennon needs a McCartney – so that’s the kind of stuff we are aiming at. We are not aiming at autonomous creation. I don’t believe that’s interesting, I don’t believe it’s possible actually, because we have no clue how to give a computer a sense of agency, a sense that something is going somewhere, (that) it has some meaning, a soul, if you want.’

The album’s title, Hello World, reflects the expression commonly used the very first time someone runs a new computer program or starts a website, as proof that it is working. Carré believes that Hello World is just the first step and that the software signals the start of a whole new way of composing.

‘Maybe not next year, but in five years there will be a new set of tools that helps creators to make music,’ said Carré.


More info

FlowMachines

Max Order web comic

Robotic bugs train insects to be helpers

Robots help ants with daily chores so they can be accepted into the colony. Image credit – Dr Bertrand Collignon

by Aisling Irwin

Tiny mobile robots are learning to work with insects in the hope the creatures’ sensitive antennae and ability to squeeze into small spaces can be put to use serving humans.

With a soft electronic whirr, a rather unusual looking ant trundles along behind a column of its arthropod comrades as they march off to fetch some food.

While the little insects begin ferrying tiny globules of sugar back home, their mechanical companion bustles forward to effortlessly pick up the entire container and carry it back to the nest.

It is a dramatic demonstration of how robots can be introduced and accepted into insect societies.

But the research, which is being conducted as part of the EU-funded CyBioSys project, could be an important step towards using robots to subtly control, or work alongside, animals or humans.

‘The idea is to be able to solve (a) problem with a better solution than they (the robots and insects) can produce individually,’ said Dr Bertrand Collignon, who is leading the research at the École Polytechnique Fédérale de Lausanne, in Switzerland.

The robots, which ‘live’ with the ants, learn that food has been discovered via a camera mounted inside the nest. The camera alerts the robots when it detects an increasing number of ants departing – a sign that food has been found.

The robots – reprogrammed off-the-shelf Thymio bots managed by simple Raspberry Pi computers – then use sensors to follow the columns of exiting ants. Once the ants have led their robotic counterparts to their discovery, the robots take over, using their superior muscle power to lug it home.

Dr Collignon described this as a ‘cyber-biological system’, which improves both on the natural order and on what robots could achieve on their own. By getting ants and robots to collaborate, each community plays to its strengths, he says.
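The nest-camera trigger – flag a likely food discovery when the rate of departing ants spikes above the recent baseline – could be sketched like this. The class below is an illustrative assumption, not the CyBioSys code:

```python
from collections import deque

class DepartureMonitor:
    """Flags a likely food discovery when the number of ants leaving
    the nest per time step rises well above the recent baseline."""

    def __init__(self, window=10, factor=3.0):
        self.counts = deque(maxlen=window)   # recent departures per step
        self.factor = factor                 # spike threshold multiplier

    def update(self, departures):
        """Record this step's departure count; return True when it
        exceeds `factor` times the baseline average."""
        baseline = sum(self.counts) / len(self.counts) if self.counts else 0.0
        self.counts.append(departures)
        return len(self.counts) > 1 and baseline > 0 and departures > self.factor * baseline
```

On a Raspberry Pi, a loop like this fed by per-frame ant counts from the nest camera would give the Thymio bots their cue to start following the outbound column.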

‘The ants are good at exploring the environment very efficiently, with many scouts patrolling the vicinity of the nest at the same time,’ said Dr Collignon, who is a Marie Skłodowska-Curie action fellow. ‘But individual ants are not able to transport large amounts of food and some can get lost between the food and the nest.’

‘By getting ants and robots to collaborate, each community plays to its strengths.’

Dr Bertrand Collignon, École Polytechnique Fédérale de Lausanne, Switzerland

Robots are like pack animals in comparison, carrying an order of magnitude more food than an ant can, and accomplishing in a few minutes what would have taken the ants hours.

Dr Collignon believes it is the first project to consider an insect swarm as a biosensor and then embed in a robot the ability to extract data from the colony.

But he also believes this research could be combined with other work teaching robots to communicate with animals. Instead of relying on top-down instructions — like a shepherd dog herding sheep — this would work by subtly influencing them from a position as one of the group.

As many social insects such as ants and bees can form aggressive colonies that normally do not respond well to outsiders, influencing them from within may offer a new approach.

In a previous EU-funded project, LEURRE, a team pioneered the creation of small mobile robots that could interact with cockroaches and influence their collective behaviour.

When kept in a pen together, cockroaches will gradually gather under the same dark shelter. They achieve this simply by following two rules: stay close to other cockroaches, and head for somewhere dark.

But when the researchers released small robots into the pen programmed with slightly different rules — stay close to other cockroaches but prefer a lighter refuge — in time the cockroaches, along with the robots, gathered in the lighter shelter instead.
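The two rules, and the tipping effect of robots with a modified preference, can be reproduced in a toy simulation. This is a sketch inspired by the description above, not LEURRE's actual model; for simplicity the robots here are stationary occupants of the light shelter, and all parameters are invented:

```python
import random

def simulate(n_roaches, n_robots, steps=300, dark_pref=3.0, social=0.5, seed=1):
    """Toy shelter-choice model. Each roach repeatedly picks the shelter
    maximising (darkness bonus + social bonus per neighbour). Robots are
    simplified as fixed neighbours in the light shelter.

    Returns (roaches_in_dark, roaches_in_light) after `steps` updates.
    """
    random.seed(seed)
    roaches = [random.choice(['dark', 'light']) for _ in range(n_roaches)]
    for _ in range(steps):
        i = random.randrange(n_roaches)       # one roach reconsiders its shelter
        scores = {}
        for shelter in ('dark', 'light'):
            mates = sum(1 for j, s in enumerate(roaches) if j != i and s == shelter)
            if shelter == 'light':
                mates += n_robots             # robots count as neighbours
            scores[shelter] = social * mates + (dark_pref if shelter == 'dark' else 0.0)
        roaches[i] = max(scores, key=scores.get)
    return roaches.count('dark'), roaches.count('light')
```

With no robots the roaches settle in the dark shelter; add enough light-dwelling robots and the social attraction outweighs the darkness preference, pulling the whole group into the light – the same qualitative outcome the LEURRE experiments observed.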

Dr Collignon believes that the two types of robotic work – collaboration and communication – could find applications in search and rescue, exploring environments too dangerous or inaccessible for humans. Eventually, small animals could be used to get into restricted environments such as collapsed buildings.

Integrating artificial systems such as robots into more natural ones – such as a warehouse full of chickens – could lead to new solutions for controlling animal behaviour on farms. An example might be preventing deadly mass panics among intensively reared animals, using robots that can detect the early signs of an impending stampede and divert it by behaving in a different way.

‘The first step is to be able to track what natural agents are doing and react appropriately to that,’ he said. ‘That’s already a tricky thing. Once you have sensed what nature is doing, you can then interact. The robotic agent can do what it has been designed for and then act on the system.’


More info
CyBioSys
