Foldable, organic and easily broken down: Why DNA is the material of choice for nanorobots

DNA origami is a technique that allows scientists to create 3D bots made from DNA. Image credit – Daniele Adami, licensed under CC BY 2.0

By Anthony King

Doctors know that we need smarter medicines to target the bad guys only. One hope is that tiny robots on the scale of a billionth of a metre can come to the rescue, delivering drugs directly to rogue cancer cells. To make these nanorobots, researchers in Europe are turning to the basic building blocks of life – DNA.

Today robots come in all shapes and sizes. One of the strongest industrial robots can lift cars weighing over two tons. But materials such as silicon are not so suitable at the smallest scales.

‘While you can make really small patterns in solid silicon, you can’t really make it into mechanical devices below 100 nanometres,’ says Professor Kurt Gothelf, a chemist and DNA nanotechnologist at Aarhus University in Denmark. That’s where DNA comes in. ‘The diameter of the DNA helix is only two nanometres,’ says Prof. Gothelf. A red blood cell, by comparison, is about 6,000 nanometres across.

Lego

Dr Tania Patiño, a nanotechnologist at the University of Rome in Italy, says DNA is like Lego. ‘You have these tiny building blocks and you can put them together to create any shape you want,’ she explained. To continue the analogy, DNA comes in four different coloured blocks, and each colour will only pair up opposite one particular partner. This makes their behaviour predictable.

Once you string a line of DNA blocks together, another line will pair up opposite. Scientists have learnt how to string DNA together in such a way that they introduce splits and bends. ‘By clever design, you branch out DNA strands so that you now have three dimensions,’ said Prof. Gothelf. ‘It is very easy to predict how it folds.’

Dr Patiño is developing self-propelled DNA nanorobotics in her project, DNA-Bots. ‘DNA is highly tuneable,’ she said. ‘We can have software that shows us which sequences produce which shape. This is not possible with other materials at this tiny scale.’

While DNA nanorobots are a long way from being used in people, with Prof. Gothelf saying that ‘we won’t see any medicines based on this in the next ten years,’ progress is being made in the lab. Already, scientists can obtain a long string of DNA from a virus and then use software to design shorter stretches of DNA that pair with the string and bend it into a desired shape. ‘This amazing technique is called DNA origami,’ said Prof. Gothelf. It allows scientists to create 3D bots made from DNA.
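
To give a flavour of the design step described above, the sketch below is a purely illustrative Python toy: a folding ‘staple’ is derived as the reverse complement of two stretches of a scaffold sequence, so it binds both at once and pulls them together. The sequences and the toy_staple helper are invented for the example; real origami design software also has to handle strand orientation, crossover geometry and binding energies.

    # Toy illustration only: Watson-Crick pairing (A-T, G-C) makes a staple's
    # binding predictable, which is what lets software plan the fold.
    PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def reverse_complement(seq):
        """Return the strand that pairs opposite seq."""
        return "".join(PAIR[base] for base in reversed(seq))

    def toy_staple(scaffold, region_a, region_b):
        """Join the reverse complements of two scaffold regions into one staple."""
        return reverse_complement(scaffold[region_b]) + reverse_complement(scaffold[region_a])

    scaffold = "ATGCGTACCGGATTACGCTAGGCTTAACG"   # stand-in for a ~7,000-base viral scaffold
    staple = toy_staple(scaffold, slice(0, 8), slice(20, 28))
    print(staple)   # when this staple binds, the two scaffold regions are drawn together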

In an early breakthrough, Prof. Gothelf’s research lab made a DNA box with a lid that opened. Later, another group built a barrel-shaped robot that could open when it recognised cancer proteins, and release antibody fragments. This strategy is being pursued so that one day a DNA robot might approach a tumour, bind to it and release its killer cargo.

‘With nanorobots we could have more specific delivery to a tumour,’ said Dr Patiño. ‘We don’t want our drugs to be delivered to the whole body.’ She is in the lab of Professor Francesco Ricci, which works on DNA devices for the detection of antibodies and delivery of drugs.

Meanwhile, the network Prof. Gothelf heads up, DNA-Robotics, is training young scientists to make parts for DNA robotics that can perform certain actions. Prof. Gothelf is working on a ‘bolt and cable’ that resembles a handbrake on a bike, where force in one place makes a change in another part of the DNA robot. A critical idea in the network is to ‘plug and play,’ meaning that any parts built will be compatible in a future robot.

This has the potential to make a completely new generation of drugs.

Prof. Kurt Gothelf, Aarhus University, Denmark

Bloodstream

As well as carrying out specific functions, most robots can move. DNA robots are too minuscule to swim against our bloodstream, but it is still possible to engineer useful little engines into them using enzymes.

Dr Patiño previously developed a DNA nanoswitch that could sense the acidity of its environment. Her DNA device also worked as a self-propelling micromotor thanks to the enzyme urease, which reacted with urea – a molecule commonly found in our bodies – and acted as a power source. ‘The chemical reaction can produce sufficient energy to generate movement,’ said Dr Patiño.

Movement is important to get nanorobots to where they need to be. ‘We could inject these robots in the bladder and they harvest the chemical energy using urease and move,’ said Dr Patiño. In future such movement ‘will help them to treat a tumour or a disease site with more efficiency than passive nanoparticles, which cannot move.’ Recently, Dr Patiño and others reported that nanoparticles fitted with nanomotors spread out more evenly than immobile particles when injected into the bladders of mice.

Rather than swim through blood, nanobots might be able to pass through barriers in our body. Most problems in delivering drugs are due to biological barriers such as mucosal layers, notes Dr Patiño. The barriers are there to impede germs, but often block drugs too. Dr Patiño’s self-propelled DNA robots might change these barriers’ permeability or simply motor on through them.

Stability

Nanoparticles can be expelled from a patient’s bladder, but this option isn’t as easy elsewhere in the body, where biodegradable robots that self-destruct might be necessary. DNA is an ideal material, as it is easily broken down inside of us. But this can also be a downside, as the body might quickly chew up a DNA bot before it gets the job done. Scientists are working on coating or camouflaging DNA and strengthening chemical bonds to boost stability.

One other potential downside is that naked pieces of DNA can be viewed by the immune system as signs of bacterial or viral foes. This may trigger an inflammatory reaction. As yet, no DNA nanobot has ever been injected into a person. Nonetheless, Prof. Gothelf is confident that scientists can get around these problems.

Indeed, stability and immune reaction were obstacles that the developers of mRNA vaccines – which deliver genetic instructions into the body inside a nanoparticle – had to overcome. ‘The Moderna and the Pfizer (BioNTech) vaccines (for Covid-19) have a modified oligonucleotide strand that is formulated in a nano-vesicle, so it is close to being a small nanorobot,’ said Prof. Gothelf. He foresees a future where DNA nanorobots deliver drugs to exactly where they are needed. For example, a drug could be attached to a DNA robot with a special linker that gets cut by an enzyme found only inside certain cells, ensuring that the drug is set free at a precise location.

But DNA robotics is not just for nanomedicine. Prof. Gothelf is mixing organic chemistry with DNA nanobots to transmit light along a wire that is just one molecule in width. This could further miniaturise electronics. DNA bots could also assist manufacturing at the smallest scales, because they can place molecules at mind-bogglingly tiny but precise distances from one another.

For now though, DNA robotics for medicine is what most scientists dream about. ‘You could make structures that are much more intelligent and much more specific than what is possible today,’ said Prof. Gothelf. ‘This has the potential to make a completely new generation of drugs.’

The research in this article was funded by the EU.

Driverless shuttles: the latest from two European projects

Autonomous vehicles must be well-integrated into public transport systems if they are to take off in Europe’s cities, say researchers. Image credit – Keolis

By Julianna Photopoulos

Jutting out into the sea, the industrial port area of Nordhavn in Denmark’s capital, Copenhagen, is currently being transformed into a futuristic waterfront city district made up of small islets. It’s billed as Scandinavia’s largest metropolitan development project and, when complete, will have living space for 40,000 people and workspace for another 40,000.

At the moment, Nordhavn is only served by a nearby S-train station and bus stops located near the station. There are no buses or trains running within the development area, although there are plans for an elevated metro line, and parking will be discouraged in the new neighbourhood. This is a great opportunity for autonomous vehicles (AVs) to operate as a new public transport solution, connecting this area more efficiently, says Professor Dimitri Konstantas at the University of Geneva in Switzerland.

‘We believe that AVs will become the new form of transport in Europe,’ he said. ‘We want to prove that autonomous vehicles are a sustainable, viable and environmental solution for urban and suburban public transportation.’

Prof. Konstantas is coordinating a project called AVENUE, which aims to do this in four European cities. In Nordhavn, the team plans to roll out autonomous shuttles on a loop with six stops around the seafront. They hope to have them up and running in two years. But once in place, the Nordhavn plan may provide a glimpse of how AV-based public transportation systems could work in the future.

Prof. Konstantas envisages these eventually becoming an on-demand, door-to-door service, where people can get picked up and go where they want rather than relying on predetermined itineraries and bus stops.

In Nordhavn, AVENUE will test and implement an autonomous ‘mobility cloud’, currently under development, to link the shuttles with existing public transport, such as the nearby train station. An on-demand service will ultimately allow passengers to access the available transport with a single app, says Prof. Konstantas.

Integrating autonomous shuttles into the wider transport system is vital if they are to take off, says Guido Di Pasquale from the International Association of Public Transport (UITP) in Brussels, Belgium.

‘Autonomous vehicles have to be deployed as fleets of shared vehicles, fully integrated and complementing public transport,’ he said. ‘This is the only way we can ensure a sustainable usage of AVs in terms of space occupancy, traffic congestion and the environment.’

Single service

Di Pasquale points to a concept known as Mobility-as-a-Service (MaaS) as a possible model for future transport systems. This model combines both public and private transport. It allows users to create, manage and pay for trips as a single service with an online account. For example, Uber, UbiGo in Sweden and Transport for Greater Manchester in the UK are exploring MaaS to let users get from one destination to another by combining modes of transport and booking them as one trip, based on their preferred balance of cost, time and convenience.
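
As a rough illustration of the kind of choice a MaaS app makes (and not the actual logic of Uber, UbiGo or Transport for Greater Manchester), the sketch below ranks candidate door-to-door itineraries by a weighted mix of cost, time and convenience. The weights and example itineraries are invented.

    from dataclasses import dataclass

    @dataclass
    class Itinerary:
        legs: tuple          # e.g. ("walk", "autonomous shuttle", "train")
        cost_eur: float
        minutes: float
        transfers: int       # rough stand-in for convenience

    def score(it, w_cost=1.0, w_time=0.2, w_transfer=1.5):
        # Lower is better; the weights encode the traveller's preferences.
        return w_cost * it.cost_eur + w_time * it.minutes + w_transfer * it.transfers

    options = [
        Itinerary(("walk", "bus"), cost_eur=2.0, minutes=35, transfers=0),
        Itinerary(("shuttle", "train"), cost_eur=4.5, minutes=22, transfers=1),
        Itinerary(("ride-hail",), cost_eur=12.0, minutes=15, transfers=0),
    ]

    best = min(options, key=score)
    print(best.legs)   # the chosen combination, booked and paid for as one trip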

Di Pasquale coordinates a project called SHOW, which aims to deploy more than 70 automated vehicles in 21 European cities to assess how they can best be integrated with different wider transport systems and diverse users’ needs. They are testing combinations of AV types, from shuttles to cars and buses, in real-life conditions over the next four years. During this time, he expects the project’s AVs to transport more than 1,500,000 people and 350,000 containers of goods. ‘SHOW will be the biggest ever showcase and living lab for AV fleets,’ he said.

He says that most of the cities involved have tested autonomous last-mile vehicles in the past and are keen to include them in their future sustainable urban mobility plans.

However, rolling out AVs requires overcoming city-specific challenges, such as demonstrating safety.

‘Safety and security risks have restricted the urban use of AVs to dedicated lanes and low speed — typically below 20km/h,’ explained Di Pasquale. ‘This strongly diminishes their usefulness and efficiency, as in most city environments there is a lack of space and a high cost to keep or build such dedicated lanes.’

It could also deter users. ‘For most people, a speed barely faster than walking is not an attractive solution,’ he said.

We want to prove that autonomous vehicles are a sustainable, viable and environmental solution for urban and suburban public transportation.

Prof. Dimitri Konstantas, University of Geneva, Switzerland

Di Pasquale hopes novel technology will make higher speeds and mixed traffic safer, and will keep fleets operating safely by monitoring and controlling them remotely.

Each city participating in SHOW will use autonomous vehicles in various settings, including mixed and dedicated lanes, at various speeds and types of weather. For safety and regulation reasons, all of them will have a driver present.

The objective is to make the vehicle fully autonomous without the need for a driver as well as optimise the service to encourage people to make the shift from ownership of cars to shared services, according to Di Pasquale. ‘This would also make on-demand and last-mile services sustainable in less densely populated areas or rural areas,’ he said.

Authorisation

But the technical issues of making the vehicle autonomous are only a part of the challenge.

There’s also the issue of who pays for it, says Di Pasquale. ‘AVs require sensors onboard, as well as adaptations to the physical and digital infrastructure to be deployed,’ he explained. ‘Their market deployment would require cities to drastically renew their fleets and infrastructures.’

SHOW’s pilots are scheduled to start two years from now, as each city has to prepare by obtaining the necessary permits and getting the vehicles and technology ready, says Di Pasquale.

Getting authorisation to operate in cities is one of the biggest hurdles. City laws and regulations differ everywhere, says Prof. Konstantas.

AVENUE is still awaiting city licences to test in Nordhavn, despite a national law passed on 1 July 2017 allowing AVs to be tested in public areas. Currently, the project has pilots taking place in Lyon, France, and in Luxembourg. In Geneva, the team has managed to get the required licences, and the world’s first on-demand AV public transportation service will be rolled out on a 69-bus-stop circuit this summer.

AVENUE’s initial results show that cities need to make substantial investments to deploy AVs and to benefit from this technology. The legal and regulatory framework in Europe will also need to be adapted for smooth deployment of services, says Prof. Konstantas.

Both he and Di Pasquale hope their work can pave the way to convince operators and authorities to invest in fleets across Europe’s cities.

‘Depending on the willingness of public authorities, this can take up to four years until we see real, commercially sustainable AV-based public transportation services on a large scale in Europe,’ said Prof. Konstantas.

The research in this article was funded by the EU.

Robotic arms and temporary motorisation – the next generation of wheelchairs

An automated wheelchair with an exoskeleton arm is designed to help people with varying forms of disability carry out daily tasks independently. Image credit – AIDE, Universidad Miguel Hernandez

by Julianna Photopoulos

Next-generation wheelchairs could incorporate brain-controlled robotic arms and rentable add-on motors in order to help people with disabilities more easily carry out daily tasks or get around a city.

Professor Nicolás García-Aracil from the Universidad Miguel Hernández (UMH) in Elche, Spain, has developed an automated wheelchair with an exoskeleton robotic arm to use at home, as part of a project called AIDE.

It uses artificial intelligence to extract relevant information from the user, such as their behaviour, intentions and emotional state, and also analyses its environmental surroundings, he says.

The system, which is based on an arm exoskeleton attached to a robotised wheelchair, is designed to help people living with various degrees and forms of disability carry out daily functions such as eating, drinking and washing up, on their own and at home. While the user sits in the wheelchair, they wear the robotised arm to help them grasp objects and bring them close. And because the whole system is connected to the home automation system, they can ask the wheelchair to move in a specific direction or go into a particular room.

Its mechanical wheels are made to move in narrow spaces, ideal for home use, and the system can control the environment remotely – for example, switching lights on and off, using the television or making and answering phone calls. What’s more, it can anticipate the person’s needs.

‘We can train artificially intelligent algorithms to predict what the user wants to do,’ said Prof. García-Aracil. ‘Maybe the user is in the kitchen and wants a drink. The system provides their options (on a monitor) so they can control the exoskeleton to raise the glass and drink.’

Multimodal system

The technology isn’t simple. As well as the exoskeleton robotic arm attached to the robotic wheelchair, the chair has a small monitor and uses various sensors, including two cameras to recognise the environment, voice control, eye-tracking glasses to recognise objects, and sensors that capture brain activity, eye movements and signals from muscles.

Depending on each person’s needs and disabilities, the multiple devices are used accordingly. For example, someone with a severe disability such as a cervical spinal cord injury, who wouldn’t otherwise be able to use voice control, could use the brain activity and eye movement sensors combined.

The user wears a cap filled with electrodes to record the brain activity that controls the exoskeleton hand’s movement, explains Prof. García-Aracil. So when the user imagines closing their hand around an object, for example, the exoskeleton arm actually does it for them. This technology is called brain-neural-computer interaction (BNCI), where brain – as well as muscle – activity can be recorded and used to interact with an electronic device.

But the system can sometimes make mistakes so there is an abort signal, says Prof. García-Aracil. ‘We use the horizontal movement of the eye, so when you move your eyes to the right you trigger an action, but when you move your eyes to the left you abort that action,’ he explains.
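
The trigger/abort rule Prof. García-Aracil describes can be pictured with a minimal sketch like the one below, assuming a normalised horizontal eye-movement signal is already available from the system’s sensors. The threshold values are hypothetical, not those used in AIDE.

    RIGHT_THRESHOLD = 0.5    # normalised gaze shift to the right
    LEFT_THRESHOLD = -0.5    # normalised gaze shift to the left

    def decide(gaze_samples):
        """Return 'trigger', 'abort' or 'wait' for a short window of gaze readings."""
        mean_shift = sum(gaze_samples) / len(gaze_samples)
        if mean_shift > RIGHT_THRESHOLD:
            return "trigger"   # eyes moved right: carry out the proposed action
        if mean_shift < LEFT_THRESHOLD:
            return "abort"     # eyes moved left: cancel it
        return "wait"

    print(decide([0.7, 0.8, 0.6]))     # -> 'trigger'
    print(decide([-0.6, -0.7, -0.9]))  # -> 'abort'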

The AIDE prototype was successfully tested last year by 17 people with disabilities including acquired brain injury (ABI), multiple sclerosis (MS), and spinal cord injury (SCI), at the Cedar Foundation in Belfast, Northern Ireland. Its use was also demonstrated at UMH in Elche, with the user asking to be taken to the cafeteria, then asking for a drink, and drinking it with the help of the exoskeletal arm.

Now more work needs to be carried out to make the system easier to use, cheaper and ready for the market, says Prof. García-Aracil.

But it’s not just new high-tech wheelchairs that can increase the functionality for users. Researchers on the FreeWheel project are developing a way of adding motorised units to existing wheelchairs to improve their utility in urban areas.

‘Different settings have different challenges,’ said project coordinator Ilaria Schiavi at IRIS SRL in Torino, Italy. For example, someone with a wheelchair may struggle to go uphill or downhill without any physical assistance whilst outdoors. But this system could allow people using wheelchairs to have an automated wheelchair experience regardless of whether they are indoors or outdoors, she says.

Rentable

The motorised units would attach to manual wheelchairs people already have in order to help them move around more easily and independently, Schiavi explains. These could either be rented for short periods of time and tailored to the location – an indoor or outdoor environment – or bought, in which case they would be completely personalised to the individual.

The researchers are also developing an app for the user which would include services such as ordering a bespoke device to connect the wheelchair and the unit, booking the unit, controlling it, and planning a journey within urban areas for shopping or sightseeing.

‘You have mobility apps that allow you to book cars, for example. Our app will allow the owner of a wheelchair to firstly subscribe to the service, which would include buying a customised interface to use between their own wheelchair and the motorising unit they have booked,’ said Schiavi.

‘A simple customised interface will allow wheelchair users to motorise their exact device, as it is used by them, at a reasonable cost.’

Customisation is made possible through additive manufacturing (AM) technologies, she says. AM technologies build 3D objects by adding materials, such as metal or plastic, layer-by-layer.

Schiavi and her colleagues are exploring various uses for the motorised units and, next year, the team plans to test the system with mobility-impaired people in both Greece and Italy. They hope that, once developed, the units will be made available like city bicycles in public spaces such as tourist attractions or shopping centres.

The research in this article was funded by the EU.

A ‘cookbook’ for vehicle manufacturers: Getting automated parts to talk to each other

Automated, networked truck convoys could save fuel and cut down on driving time. Image credit – MAN Truck & Bus

by Sandrine Ceurstemont
Semi-autonomous cars are expected to hit the roads in Europe next year with truck convoys following a few years later. But before different brands can share the roads, vehicle manufacturers need to agree on standards for automated functions.

Automation will increasingly allow vehicles to take over certain aspects of driving. However, automated functions are still being fine-tuned, for example, to ensure smooth transitions when switching between the human driver and driverless mode.

Standards also need to be set across different car manufacturers, which is one of the goals of a project called L3Pilot. Although each brand can maintain some unique features, automated functions that help with navigating traffic jams, parking and motorway and urban driving must be programmed to do the same thing.

‘It’s like if you rent a car today, your expectation is that it has a gear shift, it has pedals, it has a steering wheel and so on,’ said project coordinator Aria Etemad from Volkswagen Group Research in Wolfsburg, Germany. ‘The approaches and the interfaces to the human driver are the same.’

To get the same functions from different brands to operate in an identical way, the team is creating a code of practice. This will result in a checklist for developers to run through when creating a self-driving function. ‘It’s like a cookbook for how to design and develop automated driving functions,’ said Etemad.

So far, the project team, which includes representatives from 13 vehicle manufacturers, has been conducting initial tests to make sure each company’s technology works. Cars are equipped with several sensors as well as cameras and computers, which need to be properly calibrated to deal with real-world traffic.

The next step is to test the automated functions on public roads to ensure that the vehicles are ready. The tests will begin this month. Volunteer drivers will be chosen from diverse backgrounds, including different ages and genders. ‘We plan, all in all, to have 1,000 drivers using 100 vehicles in 10 different countries,’ Etemad said.

Transition

The technologies being trialled will cover a wide range of situations from overtaking on motorways to driving through urban intersections. Long journeys are also planned to see how people are able to transition back to driving after a long time in automated driving mode.

‘We really want to understand if the way we have designed our systems is the way drivers expect them to behave,’ said Etemad.

The team will also be investigating other aspects of automated driving such as the effect on traffic flow and CO2 emissions. Self-driving features are likely to make driving more efficient due to connectivity between vehicles and infrastructure, for example, although research so far has shown mixed results due to other contributing factors such as more people choosing to drive.

The first automated functions, which should be available in the next year, are likely to be for motorway driving, according to Etemad. Parking functions are likely to come to market next followed by automated urban driving, which is much more complex due to additional elements such as pedestrians and cyclists moving around.

‘There will be a good contribution from the project with results about long-term automated driving and a general understanding of people and functions that will impact how these systems are developed,’ said Etemad.

Automated functions are of interest for trucks too, where networked vehicles driving in a convoy could help save fuel, cut down on driving time or improve traffic flow. Truck platooning involves several trucks linking up through a wireless connection when they are close by so that they can share information and use automated functions to drive together as one. But so far, the concept has mostly been demonstrated in trucks of the same brand and, in very few cases, with two brands.

‘Each automotive manufacturer tries it out and develops the technique within their own factory,’ said Dr Marika Hoedemaeker, a senior project manager at the Netherlands Organisation for Applied Scientific Research (TNO) in Helmond.

Truck platooning

Hoedemaeker and her project partners are now trying to break new ground by developing truck platooning that works across different brands as part of a project called ENSEMBLE. ‘Now we’re going to show that we can do this together with all the European truck manufacturers,’ said Dr Hoedemaeker.

The first phase, which the team has just completed, involved coming up with specifications that need to be implemented by all manufacturers. For example, braking and speed keeping will be automated whereas steering won’t be.

They’ve also come up with guidelines for how a convoy will respond when it enters a zone with a new speed limit or passes a toll gate, for example. ‘It’s not only communicating with the platoon but also with the outside world,’ she said.
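
As a very rough sketch of the kind of rule such guidelines might translate into (this is not ENSEMBLE’s actual specification), a following truck could cap its speed at the new zone’s limit while keeping a time-based gap to the truck in front. The 0.8-second time gap and the other numbers below are illustrative.

    def follower_target_speed(leader_speed_kmh, zone_limit_kmh):
        """Speed a follower should hold once the platoon enters a new zone."""
        return min(leader_speed_kmh, zone_limit_kmh)

    def desired_gap_m(speed_kmh, time_gap_s=0.8, min_gap_m=10.0):
        """Distance to keep to the truck in front at a given speed."""
        speed_ms = speed_kmh / 3.6
        return max(min_gap_m, speed_ms * time_gap_s)

    # Platoon crosses into an 80 km/h zone while the leader is doing 85 km/h:
    v = follower_target_speed(85, 80)
    print(v, "km/h, gap", round(desired_gap_m(v), 1), "m")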

The team is also keen to gauge the impact that a platoon will have on traffic. They would like to calculate its effect on traffic flow and whether it would in fact reduce congestion on the roads. Driving simulators will also be used to see how other drivers react when they encounter an automated truck convoy. ‘Will it change their traffic behaviour or will they never drive in the right lane anymore? There are lots of questions around this other traffic behaviour as well,’ said Dr Hoedemaeker.

Once specifications have been implemented in the trucks, they will start to test platoons on dedicated grounds then on public roads. Since trucks cross borders quite often, they will have to respect laws in different countries which vary across EU member states. The minimum following distance between vehicles, for example, differs from country to country.

In a final showcase event in May 2021, trucks from seven different brands, such as Daimler and Volvo, will drive together in one or more convoys across national borders, most likely to a key goods transport destination such as a large European port.

Following this deployment on European roads, Hoedemaeker expects the first generation of platooning trucks to start being manufactured and sold a year after the project ends in 2021.

Since platoons are being developed worldwide, the standards created during the project could also be adopted more widely.

‘I think there is potential that the rest of the world could say they (Europe) already thought about the standards so we can use these and not do the whole thing over again,’ she said.

The research in this article was funded by the EU.

From robotic companions to third thumbs, machines can change the human brain

Cozmo robots and their corresponding tablets are being distributed to participants to take home so that they can interact with them for a week for an experiment being carried out by social robotics professor Emily Cross. Image credit – Ruud Hortensius and Emily Cross
By Frieda Klotz

People’s interactions with machines, from robots that throw tantrums when they lose a colour-matching game against a human opponent to the bionic limbs that could give us extra abilities, are not just revealing more about how our brains are wired – they are also altering them.

Emily Cross is a professor of social robotics at the University of Glasgow in Scotland who is examining the nature of human-robot relationships and what they can tell us about human cognition.

She defines social robots as machines designed to engage with humans on a social level – from online chatbots to machines with a physical presence, for example, those that check people into hotel rooms.

According to Prof. Cross, as robots can be programmed to perform and replicate specific behaviours, they make excellent tools for shedding light on how our brains work, unlike humans, whose behaviour varies.

‘The central tenets to my questions are, can we use human-robot interaction to better understand the flexibility and fundamental mechanisms of social cognition and the human brain,’ she said.

Brain imaging shows that a sad, happy or neutral robotic expression will engage the same parts of the brain as a human face with similar expressions.

Through their project called Social Robots, Prof. Cross and her team are using neural decoding techniques to probe the extent to which human feelings towards a robot change depending on how it behaves.

Tantrums

When the robots used in the project lose a game, they alternate between throwing tantrums and appearing dejected. ‘So far, people actually find it really funny when the robot gets angry,’ she said. ‘But people do respond to them quite strongly and that’s really interesting to see.’

Having robots as colleagues has been shown to affect humans in complex ways. Researchers at the University of Washington found that when soldiers used robots in bomb disposal, they developed emotional attachments towards them and felt frustration, anger or sadness if their robot was destroyed.

Prof. Cross says that from an evolutionary perspective, this doesn’t make sense. ‘We care about people and perhaps animals that might help us or hurt us,’ she said. ‘But with machines it’s a bit more of a mystery and understanding how far we can push that (to develop social relationships with machines) is a really, really fascinating question.’

It’s important to understand these dynamics since, as she points out, robots are already working as companions in nursing homes or even as tutors in early childhood education. Home care and education are prime areas of social robotics research, with R&D efforts focusing on adults suffering from dementia and young children.

Ten-hour rule

Typically, studies on such groups observe interactions over a relatively short time-span. They rarely exceed what Prof. Cross describes as a ten-hour rule, beyond which study participants tend to get bored of their robotic toys. But her team is looking at how feelings towards robots evolve over time.

As part of the project, the researchers send a palm-sized Cozmo robot home with study participants and instruct them to interact with it every day for a week by playing games or introducing it to their friends and pets. The participants’ brains are imaged at the start and end of that period to track changes.

‘If we’re going to have robots in our home environment, if they’re going to be in our schools teaching our kids across weeks, if not years, if they’re going to be peoples’ social companions, we want to know a lot more than just what happens after ten hours’ (of exposure),’ she said.

‘We want to know how people’s social bonds and relationships to robots change across many, many more hours.’

With such technologies set to become a bigger part of our future, other studies are investigating how the brain reacts to a different kind of robot – wearable robotic limbs that augment the body, providing extra abilities.

Wearables could have social and healthcare benefits. For instance, a third arm could assist surgeons to carry out procedures more safely rather than relying on human assistants, enable people to complete their household chores much faster or help construction workers.

But even as the technology’s capabilities develop apace, Dr Tamar Makin, a neuroscientist at University College London, UK, is exploring what it would take for the brain to accept and operate a robotic appendage as part of the body, through a five-year project called Embodied Tech.

Additional thumb

In order to understand how the brain deals with an extra body part, Dr Makin’s team asks participants to wear an additional opposable thumb for a week. Created by a designer named Dani Clode, the thumb is controlled by pressure sensors worn on the big toes.

Product designer Dani Clode created a prosthetic opposable thumb for people to wear as an extra digit. Video credit: Dani Clode

With the additional thumb, the augmented hand almost has the capabilities of two hands, giving people extra capacity to carry out actions. The question is what effect that has on the brain.

The study is still underway but preliminary results indicate that the presence of an extra thumb alters the brain’s internal map of what the biological hand looks like. Scans show that the brain represents the fingers as collapsing onto each other, away from the thumb and index finger.

This mirrors what happens in diseases like dystonia, when the representation of fingers begins to merge – for instance, when musicians use their fingers excessively – and causes cramp-like pain. The same effect could theoretically cause pain in the wearer of an extra thumb.

‘One important interim message we have is that there are potential costs, not just benefits, to using augmentation technology,’ said Dr Makin.

She believes that the newness of human augmentation means there are lots of unanswered questions but it’s vital to explore the challenges of wearable robotics in order to fully realise the promises, such as multitasking or safer working conditions.

‘I feel like we have a responsibility to gain a much better understanding of how having good control of an additional body part is going to change the representation of the body parts you already have.’

The research in this article was funded by the European Research Council.

Robots are being programmed to adapt in real time

In trials, the ResiBot robot learned to walk again in less than two minutes after one of its legs was removed. Image credit – Antoine Cully / Sorbonne University

By Gareth Willmer

Robots that can adapt on the spot when something goes wrong are part of a field of work that is building machines able to provide real-time help using only limited data as input. Standard machine-learning algorithms often need to process thousands of possibilities before deciding on a solution, which may be impractical in pressurised scenarios where fast adaptation is critical.

After Japan’s Fukushima nuclear disaster in 2011, for example, robots were sent into the power plant to clear up radioactive debris in conditions far too dangerous for humans. The problem, says robotics researcher Professor Jean-Baptiste Mouret, is that the robots kept breaking down or came across hazards that stopped them in their tracks.

As part of the ResiBots initiative, he is designing lower-cost robots that can last long periods without needing constant human maintenance for breakages and that are better at overcoming unexpected obstacles.

The ResiBots team is using what it refers to as micro-data learning algorithms, which can help robots adapt in front of one’s eyes in a similar way to how animals react to problems. An injured animal, for example, will often find a way to continue moving, even if it doesn’t know exactly what the problem is.

In contrast, most current robots self-diagnose a problem before working out a way to overcome it, says Prof. Mouret, principal investigator at ResiBots and a senior researcher at the Inria research centre in France.

‘We’re trying to shortcut this by finding a way for them to react without necessarily having developed an understanding of what’s wrong,’ he said.

Rather than self-diagnosing, the aim for these robots is to learn in a proactive way by trial and error what alternative actions they can take. This could help them overcome difficulties and stop them from shutting down in situations such as disaster scenarios like Fukushima, said Prof. Mouret.

This may not be full artificial intelligence, but Prof. Mouret points out that having knowledge of everything is not essential for getting a robot to work.

‘We’re not trying to solve everything,’ he said. ‘I’m more interested in how they can adapt – and, in fact, adapting to what’s happening is some of what makes animals intelligent.’

Simulated childhood

In one of the most promising approaches developed in the ResiBots project, the robots have a simulated childhood, in which they learn different ways to move their body using an algorithm that searches ahead of time to collect examples of useful behaviours. 

This means that when seeking a way to move, the robots need to choose from one of about 13,000 behaviours rather than an estimated 10⁴⁷ options that standard algorithms could select from. And the aim is for them to try only a handful of these before finding one that works.

Most of the ResiBots tests are currently being carried out on a six-legged robot that seeks to find new ways to move after having one or more legs removed. In the latest trials, Prof. Mouret said the robots learned to walk in one to two minutes after one of their legs was taken off, meaning they generally need to test fewer than 10 behaviours before finding one that works.
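
The idea of a pre-computed behaviour repertoire plus a short trial-and-error loop can be sketched as below. This is a deliberately simplified toy, not the project’s actual algorithm (which, among other things, updates its predictions after every trial): the robot tries the behaviour that looked best in simulation, measures how well it really performs after the damage, and stops as soon as one is good enough. All numbers are invented.

    import random

    random.seed(0)

    # Pretend repertoire: behaviour id -> performance predicted in simulation.
    repertoire = {f"gait_{i}": random.uniform(0.0, 1.0) for i in range(13000)}

    def execute_on_damaged_robot(behaviour_id):
        """Stand-in for a real trial: damage makes some gaits underperform."""
        idx = int(behaviour_id.split("_")[1])
        penalty = 0.6 if idx % 3 == 0 else 0.05
        return max(0.0, repertoire[behaviour_id] - penalty)

    GOOD_ENOUGH = 0.85
    trials = 0
    for behaviour in sorted(repertoire, key=repertoire.get, reverse=True):
        trials += 1
        if execute_on_damaged_robot(behaviour) >= GOOD_ENOUGH:
            print(f"found a working gait ({behaviour}) after {trials} trials")
            break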

Test robots can learn to overcome a broken leg in under two minutes. Video credit – Horizon

In total, the researchers are working on half a dozen robots at varying levels of complexity including a child-like humanoid robot known as iCub. Though the much more complex iCub is not yet being used in many trials, the team hopes to do so more over time.

‘Humanoids have the potential of being highly versatile and adapting well to environments designed for humans,’ said Prof. Mouret. ‘For instance, nuclear power plants have doors, levers and ladders that were designed for people.’

There are, however, some big challenges still to overcome, including the fact that a robot needs to be moved back to its starting position once a limb is removed, rather than being able to carry on from the injury site towards the target.

Safety

There are also wider safety issues involving such robots – for example, ensuring that they do not harm earthquake survivors while rescuing them, particularly if the robot is learning by trial and error, said Prof. Mouret.

He believes it will be at least four or five years before such a robot could be used in the field, but is hopeful that the techniques can eventually be employed in all types of robot – not just those for disaster situations, but in the home and other scenarios.

But it’s not just mechanics that can help robots navigate the real world. Robots may also adapt better if they can more strongly connect language to reality. 

Professor Gemma Boleda at the Universitat Pompeu Fabra in Spain has a background in linguistics, and her team is trying to link research in this field to artificial intelligence to help machines better understand the world around them, as part of a project called AMORE.

It’s something that could be useful for making technology such as GPS more intelligent. For example, when driving in a car, the GPS system could specify that you turn right where ‘the big tree’ is, distinguishing it from several other trees.

Prof. Boleda says this has been hard to do in the past because of the difficulty of modelling the way humans link language with reality.

‘In the past, language had largely been represented out of context,’ said Prof. Boleda.

AMORE’s aim is to get computers to understand words and concepts in a real-world context rather than as individual words in isolation, she says. For instance, a robot would learn to connect the phrase ‘this dog’ with an actual dog in the room, representing both the words and the real-world entities.

‘The crux of these models is that they are able to learn their own representations from data,’ she added. ‘Before, researchers had to tell the machine what the world looked like.’
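
A toy version of that grounding step might look like the sketch below: candidate entities in a scene and a referring phrase are both represented as feature vectors, and the entity with the highest similarity wins. Here the vectors are hand-made for illustration; the point Prof. Boleda makes is precisely that AMORE’s models learn such representations from data rather than having them written by hand.

    import math

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm if norm else 0.0

    # Hand-made features: [dog-ness, tree-ness, nearness-to-speaker]
    scene = {
        "dog_by_the_door": [0.9, 0.0, 0.8],
        "dog_across_road": [0.9, 0.0, 0.1],
        "big_tree":        [0.0, 1.0, 0.6],
    }

    phrase_this_dog = [1.0, 0.0, 0.9]   # 'this' biases towards nearby entities

    best = max(scene, key=lambda name: cosine(phrase_this_dog, scene[name]))
    print(best)   # -> dog_by_the_door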

Giving machines a better understanding of the world around them will help them do ‘more with less’ in terms of the amount of data they need and get better at predicting outcomes, Prof. Boleda said.

It could also help with the issue of having enough physical space on devices like mobile phones for the next wave of intelligent applications.

‘I am working with language, but this problem of needing a lot of data is a problem that plagues many other domains of artificial intelligence,’ said Prof. Boleda. ‘So if I develop methods that can do more with less, then these can also be applied elsewhere.’

The research in this article was funded by the EU’s European Research Council.

Drones and satellite imaging to make forest protection pay

Better tracking of forest data will make the climate change reporting process easier for countries who want compensation for protecting their carbon stock. Image credit – lubasi, licensed under CC BY-SA 2.0

by Steve Gillman
Every year, 7 million hectares of forest are cut down, chipping away at the 485 gigatonnes of carbon dioxide (CO2) stored in trees around the world. But low-cost drones and new satellite imaging could soon protect these carbon stocks and help developing countries get paid for protecting their trees.

‘If you can measure the biomass you can measure the carbon and get a number which has value for a country,’ said Pedro Freire da Silva, a satellite and flight system expert at Deimos Engenharia, a Portuguese technology company.

International financial institutions, such as the World Bank and the European Investment Bank, provide developing countries with economic support to keep forests’ carbon stocks intact through the UN REDD+ programme.

The more carbon a developing country can show it keeps in its forests, the more money the government could get, which would give them a greater incentive to protect these lands. But, according to Silva, these countries often lack the tools to determine the exact amount of carbon stored in their forests and that means they could be missing out on funding.

‘If you have a 10% error in your carbon stock (estimation), that can have a financial value,’ he said, adding that it also takes governments a lot of time and energy to collect the relevant data about their forests.

To address these challenges, a project called COREGAL developed automated low-cost drones that map biomass. They put a special sensor on drones that fly over forests and analyse Global Positioning System (GPS) and Galileo satellite signals as they bounce back through a tree canopy, which then reveals the biomass density of an area and, in turn, the carbon stocks.

‘The more leaves you have, the more power (from GPS and Galileo) is lost,’ said Silva, who coordinated the project. This means that when the drone picks up weaker satellite navigation readings, there is more biomass below.

‘If you combine this data with satellite data we get a more accurate map of biomass than either would (alone),’ he added.
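
A heavily simplified sketch of that relationship is below: the canopy’s attenuation of the satellite signal (in decibels) is converted into a relative biomass index. The calibration constant is made up for illustration; COREGAL’s real processing fuses these readings with satellite imagery and ground data to produce calibrated biomass maps.

    def attenuation_db(open_sky_power_dbm, under_canopy_power_dbm):
        """How much GPS/Galileo signal power the canopy absorbed, in decibels."""
        return open_sky_power_dbm - under_canopy_power_dbm

    def biomass_index(loss_db, db_per_index_unit=2.5):
        """More signal loss -> more leaves and branches -> higher relative biomass."""
        return max(0.0, loss_db / db_per_index_unit)

    loss = attenuation_db(open_sky_power_dbm=-128.0, under_canopy_power_dbm=-136.0)
    print(f"{loss:.1f} dB lost, biomass index {biomass_index(loss):.1f}")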

Sentinels

The project trialled its drone prototype in Portugal, with Brazil in mind as the target end user as it is on the frontline of global deforestation. According to Brazilian government data, an area about five times the size of London was destroyed between August 2017 and July this year.

COREGAL’s drones could end up enabling countries such as Brazil to access more money from climate funds, in turn creating a stronger incentive for governments to protect their forests. Silva also believes the drones could act as a deterrent against illegal logging.

‘If people causing deforestation know that there are (drone) flight campaigns or people going to the field to monitor forests it can demotivate them,’ he said. ‘It is like a sentinel system.’

In the meantime, governments in other developing countries still need the tools to help them fight deforestation. According to Dr Thomas Häusler, a forest and climate expert at GAF, a German earth observation company, the many drivers of deforestation make it very difficult to sustainably manage forests.

‘(Deforestation) differs between regions and even in regions you have different drivers,’ said Dr Häusler. ‘In some countries they (governments) give concessions for timber logging and companies are (then) going to huge (untouched forest) areas to selectively log the highest value trees.’

Logging like this is occurring in Brazil, central Africa and Southeast Asia. When it happens, Dr Häusler says, it can cause huge collateral damage, because loggers leave behind roads that local populations use to access previously untouched forests, which they then convert for agriculture or harvest for fuel wood.

Twenty percent of Earth's forest cover is roughly 863,000 km². Image credit - Horizon

Demand for timber and agricultural produce from developed countries can also drive deforestation in developing countries because their governments see the forest as a source of economic development and then allow expansion.

With such social, political and economic dependency, it can be difficult, and expensive, for governments to implement preventative measures. According to Dr Häusler, to protect untouched forests these governments should be compensated for fighting deforestation.

‘To be compensated you need strong (forest) management and observation (tools),’ said Dr Häusler, who is also the coordinator of EOMonDis, a project developing an Earth-observation-based forest monitoring system that aims to support governments.

Domino effect

The EOMonDis team combines high-resolution data from the European Sentinel satellites, available every five days through Copernicus, the EU’s Earth observation system, with data from the North American Landsat-8 satellite.

Automated processing using special algorithms generates detailed maps on the current and past land use and forest situation to identify the carbon-rich forest areas. The project also has access to satellite data going as far back as the 1970s which can be used to determine how much area has been affected by deforestation.

As with COREGAL, these maps and the information they contain put a value on the most carbon-rich forest areas, meaning countries can access more money from international financial institutions. The project is almost finished, and the team hopes soon to have a commercially viable system ready for use.

‘The main focus is the climate change reporting process for countries who want compensation in fighting climate change,’ said Dr Häusler. ‘We can support this process by showing the current land-use situation and show the low and high carbon stocks.’

Another potential user of this system is the international food industry, which sells products containing commodities linked to deforestation such as palm oil, cocoa, meat and dairy. In response to their contribution to the problem, and to social pressure, some of these big companies have committed to zero deforestation in their supply chains.

‘When someone (a company) is declaring land as zero deforestation, or that palm plantations fit into zero deforestation, they have to prove it,’ said Dr Häusler. ’And a significant result (from the project) is we can now prove that.’

Dr Häusler says the system will help civil society and NGOs who want to make sure industry or governments are behaving themselves as well as allow the different groups to make environmentally sound decisions when choosing land for different purposes.

‘We can show everybody – the government, NGO stakeholders, but also the industry – how to better select the areas they want to use.’

The research in this article was funded by the EU.

Models of dinosaur movement could help us build stronger robots and buildings

Researchers are using computer simulations to estimate how 11 different species of extinct archosaurs such as the batrachotomus might have moved. Image credit: John Hutchinson

By Sandrine Ceurstemont

From about 245 to 66 million years ago, dinosaurs roamed the Earth. Although well-preserved skeletons give us a good idea of what they looked like, the way their limbs worked remains a bigger mystery. But computer simulations may soon provide a realistic glimpse into how some species moved and inform work in fields such as robotics, prosthetics and architecture.

John Hutchinson, a professor of evolutionary biomechanics from the Royal Veterinary College in Hertfordshire, UK, and his colleagues are investigating the locomotion of the earliest, small dinosaurs, as part of the five-year-long Dawndinos project which began in 2016.

‘These dinosaurs have been hugely neglected,’ Prof. Hutchinson said. ‘People – including me – have mostly been studying the celebrity dinosaurs like T. rex.’

About 225 million years ago, during the late Triassic period, these small dinosaurs were in the minority, whereas the bigger crocodile-like animals that lived alongside them were more numerous and diverse. Dinosaurs somehow went on to thrive while most other animals from that period became extinct.

Compared to their quadrupedal, heavy-built contemporaries, what stands out about these early dinosaurs is that they had an erect posture and could, at least intermittently, walk on two limbs. One theory is that their style of locomotion gave them a survival edge.

‘The idea of this project is to test that idea,’ Prof. Hutchinson said.

The team has started to develop computer simulations to estimate how 11 different species of extinct archosaurs – the group of animals that includes crocodiles, birds, their relatives and dinosaurs – might have moved. They will focus on five different types of motion: walking, running, turning, jumping and standing.

Simulations

To test whether their simulations are accurate, the researchers plan to give the same treatment to their living relatives – crocodiles and birds – as well. They will then compare the results to actual measurements of motion to determine how good their computer models of extinct animals are.

‘It will be the first time we ground-truth (test with empirical evidence) these methods very rigorously with the best possible data we can get,’ Prof. Hutchinson said.

So far, they’ve modelled the movement of a Mussaurus – an early cousin of giant plant-eating sauropod dinosaurs such as Brontosaurus. The Mussaurus was much smaller and researchers wanted to see whether it moved on four legs like its larger relatives. The first reconstructions of the animal had it on four legs because it had quite big arms, said Prof. Hutchinson.

Using scans of well-preserved fossils from Argentina, they were able to produce new models of its movement. Prof. Hutchinson and his team found that it was in fact bipedal. It couldn’t have walked on four legs since the palms of its front limbs faced inwards and the forearm joints weren’t capable of rotating downwards. Therefore, it wouldn’t have been able to plant its front legs on the ground.

‘It wasn’t until we put the bones together in a 3D environment and tried playing with their movements that it became clear to us that this wasn’t an animal with very mobile arms and hands,’ Prof. Hutchinson said.

After modelling the large forearm of the Mussaurus, the Dawndinos team realised that it could not be used for walking. Video courtesy: John Hutchinson

Robotics

The simulations produced during the project could be useful for zoologists. But they could have less obvious applications too, for example, helping to improve how robots move, according to Prof. Hutchinson.

Accurate models are needed to replicate the motion of animals, which robotics researchers often take inspiration from. Mimicking a crocodile, for example, could be of interest to create a robot that can both swim and walk on land.

Prof. Hutchinson also regularly gets contacted by film and documentary makers who are interested in using his simulations to create realistic animations. ‘It’s hard to make bigger, or unusual, animals move correctly if the physics isn’t right,’ Prof. Hutchinson said.

Understanding the locomotion of the very largest dinosaurs is the aim of a project being undertaken by paleobiology researcher Alexandra Houssaye and her colleagues from France’s National Centre for Scientific Research and the National Museum of Natural History in Paris. Through their Gravibone project, which began last year, they want to pin down the limb bone adaptations that allow large animals to carry a heavy skeleton.

‘We really want to understand what (bone features) are linked to being massive,’ Dr Houssaye said.

Massive

So far, research has shown that the long bones in the limbs of bigger animals are more robust than those of smaller animals. But this general trend has only been superficially observed. The outer and inner bone structures have adapted over time to help support animals’ weight. For example, whereas smaller terrestrial animals have hollow limb bones, massive ones like elephants, rhinos and hippos have connective tissue in the middle.

Among the largest animals and their ancestors there are also other differences. The limb bones of modern rhinos, for example, are short and heavy. But their prehistoric relative Indricotherium, the largest land mammal that ever lived, had a less stocky skeleton. ‘It’s interesting to see that the biggest didn’t have the most massive (frame),’ Dr Houssaye said.

The team is studying both living and extinct animals, focussing on elephants, rhinos, hippos, prehistoric mammals and dinosaurs such as sauropods – a group that includes the biggest terrestrial animals of all time.

So far, they have compared the ankle bones of horses, tapirs and rhinos, and fossils of rhinos’ ancestors. They found that for animals of the same mass there were differences depending on whether they were short and stout or had longer limbs. In less stocky animals, the two ankle bones tended to be more distinct, whereas they were more strongly connected in those that were massively built, probably to reinforce the articulation.

‘It’s not only the mass (of the animal) but how the mass is distributed on the body,’ said Dr Houssaye. ‘For us that was interesting.’

3D modelling

Their next step will be to scan different limb bones and analyse their inner structure. They will also use 3D modelling to figure out how much weight different parts of the bones can handle in different spots, for example.

The results from the project could help make more efficient prosthetics for people and animals, Dr Houssaye said. Designers will be able to better understand how different features of limb bones, such as thickness and orientation, relate to their strength, enabling them to create materials that are lighter but more resistant. 

Similarly, Dr Houssaye has also had interest from the construction industry which is looking for new types of materials and more effective building techniques. Pillars supporting heavy buildings, for example, could be made using less material by improving their inner structure instead.

‘How a skeleton adapts (to heavy weight) has implications for construction,’ Dr Houssaye said. ‘(Architects) are trying to create structures that are able to support heavy weight.’

The research in this article was funded by the European Research Council.

Memory-jogging robot to keep people sharp in ‘smart’ retirement homes

Sensors placed throughout a retirement home helped the ENRICHME robot to keep track of the movements and activities of residents taking part in the project’s trial. Image credit – ENRICHME

by Steve Gillman

Almost a fifth of the European population are over 65 years old, but while quality of life for this age bracket is better than ever before, many will at some point suffer from a decline in their mental abilities.

Without adequate care, cognitive abilities can decline quicker, yet with the right support people can live longer, healthier and more independent lives. Researchers working on a project called ENRICHME have attempted to address this by developing a robotic assistant to help improve mental acuity.

‘One of the big problems of mild cognitive impairment is temporary memory loss,’ said Dr Nicola Bellotto from the School of Computer Science at the University of Lincoln in the UK and one of the principal investigators of ENRICHME.

‘The goal of the project was to assist and monitor people with cognitive impairments and offer basic interactions to help a person maintain their cognitive abilities for longer.’

The robot moves around a home providing reminders about medication as well as offering regular physical and mental exercises – it can even keep tabs on items that are easily misplaced.

A trial was conducted with the robot in three retirement homes in England, Greece and Poland. At each location the robot helped one or more residents, and was linked to sensors placed throughout the building to track the movements and activities of those taking part.

‘All this information was used by the robot,’ said Dr Bellotto. ‘If a person was in the bedroom or kitchen the robot could rely on the sensors to know where the person is.’

The robot was also kitted out with a thermal camera so it could measure a person’s temperature in real time, allowing it to estimate their breathing and heart rates. This could reveal whether someone was experiencing high levels of stress related to a particular activity and prompt the robot to act accordingly.
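
As an illustration of how a periodic breathing signal might be pulled out of thermal-camera data, here is a minimal sketch: it assumes we already have a per-frame mean temperature around the nostrils (exhaled air warms the region) and simply looks for the dominant frequency in a typical breathing band. The function name, sampling rate and band limits are assumptions for the example, not details of the ENRICHME system.

```python
import numpy as np

def breathing_rate(temps, fs, low=0.1, high=0.7):
    """Estimate breaths per minute from a nostril-region temperature signal.

    temps : 1-D array of mean temperatures per thermal frame
    fs    : camera frame rate in Hz
    """
    temps = temps - np.mean(temps)                  # remove the baseline temperature
    spectrum = np.abs(np.fft.rfft(temps))
    freqs = np.fft.rfftfreq(len(temps), d=1.0 / fs)
    band = (freqs >= low) & (freqs <= high)         # plausible adult breathing band
    dominant = freqs[band][np.argmax(spectrum[band])]
    return dominant * 60.0                          # Hz -> breaths per minute

# Synthetic check: 15 breaths per minute (0.25 Hz) sampled at 8 frames per second for 60 s.
fs = 8.0
t = np.arange(0, 60, 1.0 / fs)
signal = 34.0 + 0.3 * np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)
print(f"estimated rate: {breathing_rate(signal, fs):.1f} breaths per minute")
```

Heart rate would need a finer signal, but the same frequency-analysis idea applies.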

This approach is based around a principle called ambient assisted living, which combines technology and human carers to improve support for older people. In ENRICHME’s case, Dr Bellotto said their robots could be operated by healthcare professionals to provide tailored care for elderly patients.

The users involved in the trials showed a high level of engagement with the robot they were living with – even naming it Alfie in one case – and also provided good feedback, said Dr Bellotto. But he added that the robot is still a few years away from being rolled out to the wider public.

‘Some of the challenges we had were how to approach the right person in these different environments because sometimes a person lives with multiple people, or because the rooms are small and cluttered and it is simply not possible for the robot to move safely from one point to another,’ he said.

Dr Bellotto and his team are now applying for funding for new projects to solve the remaining technical problems, which they hope will one day take the robot a step closer to commercialisation.

This type of solution would help increase people’s physical and psychological wellbeing, which could help to reduce public spending on care for older people. In 2016 the total cost of ageing in the EU was 25% of GDP, and this figure is expected to rise in the coming decades.

Ageing populations

‘One of the big challenges we have in Europe is the high number of elderly people that the public health system has to deal with,’ said Professor María Inés Torres, a computer scientist from the University of the Basque Country in Spain.

Older people can fall into bad habits for a variety of reasons, such as being unable to go for walks or cook healthy meals because of physical limitations. The loss of a loved one can also lead to reduced socialising, exercising or eating well. These unhealthy habits are all exacerbated by depression caused by loneliness, and with 32% of people aged over 65 living alone in Europe, this is a significant challenge to overcome.

‘If you are able to decrease the correlation between depression and age, so keeping people engaged with life in general and social activities, these people aren’t going to visit the doctor as much,’ said Prof. Torres, who is also the coordinator of EMPATHIC, a project that has developed a virtual coach to help assist elderly people to live independently.

The coach will be available on smart devices such as phones or tablets and is focused on engaging with someone to help them keep up the healthier habits they may have had in the past.

‘The main goal is that the user reflects a little bit and then they can agree to try something,’ said Prof. Torres.

For instance, the coach may ask users if they would like to go to the local market to prepare their favourite dinner and then turn it into a social activity by inviting a friend to come along. This type of approach addresses the three key areas that cause older people’s health to deteriorate, said Prof. Torres: poor nutrition, lack of physical activity and loneliness.

For the coach to be effective the researchers have to build a personal profile for each user, as every person is different and requires specific suggestions relevant to them. They do this by building a database for each person over time and combining it with dialogue designed around the user’s culture.
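
The kind of per-user record this implies can be sketched very simply. The profile fields and suggestion rules below are hypothetical examples of the general idea (a database built up over time, driving tailored prompts), not the EMPATHIC design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserProfile:
    """Hypothetical per-user record a virtual coach might build up over time."""
    name: str
    language: str
    favourite_meals: List[str] = field(default_factory=list)
    walks_this_week: int = 0
    social_contacts_this_week: int = 0

def suggest(profile: UserProfile) -> str:
    """Pick a gentle prompt targeting company, activity or nutrition, in that order."""
    if profile.social_contacts_this_week == 0:
        return f"{profile.name}, how about inviting a friend over this week?"
    if profile.walks_this_week < 3:
        return f"{profile.name}, would you like a short walk to the market today?"
    if profile.favourite_meals:
        return f"How about cooking {profile.favourite_meals[0]} this evening?"
    return "Is there something you used to enjoy that we could plan together?"

maria = UserProfile("Maria", "es", favourite_meals=["paella"],
                    walks_this_week=1, social_contacts_this_week=0)
print(suggest(maria))   # social contact is lowest, so the coach suggests company first
```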

The researchers are testing the virtual coach on smart devices with 250 older people in three areas – Spain, France and Norway – who are already providing feedback on what works and what doesn’t, which will increase the chances of the virtual coach actually being used.

By the end of the project in 2020, the researchers hope to have a prototype ready for the market, but Prof. Torres insisted that it will not replace healthcare professionals. Instead she sees the smart coach as another tool to help older people live a more independent life – and in doing so reduce the pressure on public healthcare systems.

The research in this article was funded by the EU.

Garbage-collecting aqua drones and jellyfish filters for cleaner oceans

An aqua drone developed by the WasteShark project can collect litter in harbours before it gets carried out into the open sea. Image credit – WasteShark

By Catherine Collins

The cost of sea litter in the EU has been estimated at up to €630 million per year. It is mostly composed of plastics, which take hundreds of years to break down in nature, and has the potential to affect human health through the food chain because plastic waste is eaten by the fish that we consume.

‘I’m an accidental environmentalist,’ said Richard Hardiman, who runs a project called WASTESHARK. He says that while walking at his local harbour one day he stopped to watch two men struggle to scoop litter out of the sea using a pool net. Their inefficiency bothered Hardiman, and he set about trying to solve the problem. It was only when he delved deeper into the issue that he realised how damaging marine litter, and plastic in particular, can be, he says.

‘I started exploring where this trash goes – ocean gyres (circular currents), junk gyres, and they’re just full of plastic. I’m very glad that we’re now doing something to lessen the effects,’ he said.

Hardiman developed an unmanned robot, an aqua drone that cruises around urban waters such as harbours, marinas and canals, eating up marine litter like a Roomba of the sea. The waste is collected in a basket which the WasteShark then brings back to shore to be emptied, sorted and recycled.

The design of the autonomous drone is modelled on a whale shark, the ocean’s largest known fish. These giant filter feeders swim around with their mouths open and lazily eat whatever crosses their path.

It’s powered by rechargeable electric batteries, so it doesn’t pollute the environment with oil spills or exhaust fumes, and it is relatively quiet, avoiding noise pollution. It produces zero carbon emissions and moves quite slowly, allowing fish and birds to simply swim away when it gets too close for comfort.

‘We’ve tested it in areas of natural beauty and natural parks where we know it doesn’t harm the wildlife,’ said Hardiman. ‘We’re quite fortunate in that, all our research shows that it doesn’t affect the wildlife around.’

WasteShark’s autonomous drone is modelled on a whale shark. Image credit – RanMarine Technology

WasteShark is one of a number of new inventions designed to tackle the problem of marine litter. A project called CLAIM is developing five different kinds of technology, one of which is a plasma-based tool called a pyrolyser. 

Useful gas

CLAIM’s pyrolyser will use heat treatment to break down marine litter into a useful gas. Plasma is essentially ionised gas, capable of reaching temperatures of thousands of degrees. Such heat can break the chemical bonds between atoms, converting waste into a type of gas called syngas.

The pyrolyser will be mounted onto a boat collecting floating marine litter – mainly large items of plastic which, if left in the sea, will decay into microplastic – so that the gas can then be used as an eco-friendly fuel to power the boat, or to provide energy for heating in ports.

Dr Nikoleta Bellou of the Hellenic Centre for Marine Research, one of the project coordinators of CLAIM, said: ‘We know that we humans are actually the key drivers for polluting our oceans. Unlike organic material, plastic never disappears in nature and it accumulates in the environment, especially in our oceans. It poses a threat not only to the health of our oceans and to the coasts but to humans, and has social, economic and ecological impacts.’

The researchers chose areas in the Mediterranean and Baltic Seas as case studies for the project, and will develop models that can tell scientists which areas are most likely to become litter hotspots. A range of factors influence how littered a beach may be – it is affected not only by litter louts in the surrounding area but also by winds and currents, which can carry litter great distances and dump the waste on some beaches rather than others.
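
To give a flavour of how such hotspot models work, here is a toy drift calculation: litter particles are carried by a surface current plus a small fraction of the wind (the ‘windage’), and where they end up hints at which stretches of coast will collect the waste. The velocities, windage factor and release point are invented for the example; CLAIM’s real models resolve currents and winds that vary in space and time.

```python
import numpy as np

def drift(positions, current, wind, hours, windage=0.03, dt=1.0):
    """Advance floating litter positions with a simple, uniform advection rule.

    positions : (N, 2) array of x, y coordinates in km
    current   : (2,) surface current velocity in km/h
    wind      : (2,) wind velocity in km/h; only a small fraction drags the litter
    """
    positions = positions.astype(float).copy()
    velocity = np.asarray(current) + windage * np.asarray(wind)
    for _ in range(int(hours / dt)):
        positions += velocity * dt      # every particle drifts with the same velocity
    return positions

# Toy example: 1,000 litter particles released near a harbour mouth.
rng = np.random.default_rng(0)
start = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(1000, 2))
after = drift(start, current=[0.4, 0.1], wind=[20.0, -5.0], hours=48)
print("mean displacement after 48 h:", after.mean(axis=0).round(1), "km")
```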

CLAIM’s other methods to tackle plastic pollution include a boom – a series of nets criss-crossing a river that catch the large litter that would otherwise travel to the sea. The nets are then emptied and the waste is collected for treatment with the pyrolyser. There have been problems with booms in the past, when bad weather caused the nets to overload and break, but CLAIM will use automated cameras and other sensors to alert the relevant authorities when the nets are full.

Microplastics

Large plastic pieces that can be scooped out of the water are one thing, but tiny particles known as microplastics that are less than 5mm wide pose a different problem. Scientists on the GoJelly project are using a surprising ingredient to create a filter that prevents microplastics from entering the sea – jellyfish slime.

The filter will be deployed at wastewater treatment plants, a known source of microplastics. The method has already proven successful in the lab, and GoJelly is now planning to scale up the biotechnology for industrial use.

Dr Jamileh Javidpour of the GEOMAR Helmholtz Centre for Ocean Research Kiel, who coordinates the project, said: ‘We have to be innovative to stop microplastics from entering the ocean.’

The GoJelly project kills two birds with one stone – tackling the issue of microplastics while simultaneously addressing the problem of jellyfish blooms, where the creatures reproduce in high enough levels to blanket an area of ocean.

Jellyfish are among the most ancient creatures on the planet, having swum in Earth’s oceans since before the time of the dinosaurs. On the whole, thanks to a decline in natural predators and changes in the environment, they are thriving. When they bloom, jellyfish can sting swimmers and disrupt fisheries.

Fishermen often throw caught jellyfish back into the sea as a nuisance but, according to Dr Javidpour, jellyfish can be used much more sustainably. Not only can their slime be used to filter out microplastics, they can also be used as feed for aquaculture, for collagen in anti-ageing products, and even in food.

In fact, part of the GoJelly project involves producing a cookbook, showing people how to make delicious dishes from jellyfish. While Europeans may not be used to cooking with jellyfish, in many Asian cultures they are a daily staple. However, Dr Javidpour stresses that the goal is not to replace normal fisheries.

‘We are mainly ecologists, we know the role of jellyfish as part of a healthy ecosystem,’ she said. ‘We don’t want to switch from classical fishery to jellyfish fishery, but it is part of our task to investigate if it is doable, if it is sustainable.’

The research in this article has been funded by the EU.
