EU’s future cyber-farms to utilise drones, robots and sensors

Farmers could protect the environment and cut down on fertiliser use with swarms of drones. Image credit – ‘Aerial View – Landschaft Markgräflerland’ by Taxiarchos228 is licensed under CC 3.0 Unported
by Anthony King

Bee-based maths is helping teach swarms of drones to find weeds, while robotic mowers keep hedgerows in shape.

‘We observe the behaviour of bees. We gain knowledge of how the bees solve problems and with this we obtain rules of interaction that can be adapted to tell us how the robot swarms should work together,’ said Vito Trianni at the Institute of Cognitive Sciences and Technologies of the Italian National Research Council.

Honeybees, for example, run on an algorithm to allow them to choose the best nest site, even though no bee knows the full picture.

Trianni runs an EU-funded research project known as SAGA, which is using the power of robotic groupthink to keep crops weed free.

‘We can use low-cost robots and low-cost cameras. They can even be prone to error, but thanks to the cooperation they will be able to generate precise maps at centimetre scales,’ said Trianni.

‘They will initially spread over the field to inspect it at low resolution, but will then decide on areas that require more focus,’ said Trianni. ‘They can gather together in small groups closer to the ground.’

Importantly the drones make these decisions themselves, as a group.

Next spring, a swarm of the quadcopters will be released over a sugar beet field. They will stay in radio contact with each other and use algorithms learnt from the bees to cooperate and put together a map of weeds. This will then allow for targeted spraying of weeds or their mechanical removal on organic farms.
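The bee-derived group decision Trianni describes can be illustrated with a toy model. The sketch below is not SAGA's actual algorithm – the site qualities, scouting rate and recruitment rule are all invented – but it shows how agents that only ever interact in pairs, with no global view, can still converge on the best option as a group:

```python
import random

def swarm_decide(qualities, n_agents=30, steps=3000, seed=1):
    """Toy 'best-of-n' choice by pairwise interaction. Each agent is
    committed to a site (an index into `qualities`, each in 0..1) or
    uncommitted (None). Committed agents recruit uncommitted ones in
    proportion to site quality; agents committed to rival sites knock
    each other back to uncommitted (cross-inhibition)."""
    rng = random.Random(seed)
    opinions = [None] * n_agents
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        a, b = opinions[i], opinions[j]
        if a is None:
            # uncommitted agents occasionally scout a random site
            site = rng.randrange(len(qualities))
            if rng.random() < 0.1 * qualities[site]:
                opinions[i] = site
        elif b is None:
            # recruitment: better sites recruit more reliably
            if rng.random() < qualities[a]:
                opinions[j] = a
        elif a != b:
            # cross-inhibition: a conflicting signal uncommits the listener
            opinions[j] = None
    committed = [o for o in opinions if o is not None]
    return max(set(committed), key=committed.count) if committed else None

print(swarm_decide([0.3, 0.9, 0.5]))  # most runs settle on site 1, the best
```

Cross-inhibition of this kind is the mechanism honeybee swarms are thought to use when choosing nest sites: no individual compares every option, yet the colony usually picks the best one.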

Today the most common way to control weeds is to spray entire fields with herbicides. Smarter spraying will save farmers money, but it will also lower the risk of weeds developing resistance to the agrichemicals. And there will be an environmental benefit from spraying less herbicide.

Co-ops

Swarms of drones for mapping crop fields offer a service to farmers, while farm co-ops could even buy swarms themselves.

‘There is no need to fly them every day over your field, so it is possible to share the technology between multiple farmers,’ said Trianni. A co-op might buy 20 to 30 drones, but adjust the size of the swarm to the farm.

The drones weigh 1.5 kilos each and fly for around 20-30 minutes. For large fields, the drone swarms could operate in relay teams, with drones landing and being replaced by others.
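The relay arithmetic is simple to sketch. Assuming a ground turnaround of around 40 minutes for landing and battery swapping – a figure not given in the article – a back-of-the-envelope fleet sizing might look like this:

```python
import math

def relay_fleet_size(airborne, endurance_min, turnaround_min):
    """Drones needed to keep `airborne` drones aloft continuously when
    each flies for `endurance_min` minutes and then spends
    `turnaround_min` minutes on the ground (landing, battery swap)."""
    cycle = endurance_min + turnaround_min
    return airborne * math.ceil(cycle / endurance_min)

# Article figure: ~25 min endurance; the 40 min turnaround is assumed.
print(relay_fleet_size(airborne=5, endurance_min=25, turnaround_min=40))  # 15
```

With those assumptions, keeping five drones continuously over a field takes a fleet of fifteen – within the 20 to 30 drones Trianni suggests a co-op might buy.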

It’s the kind of technology that is ideally suited to today’s large-scale farms, as is another remote technology that combines on-the-ground sensor information with satellite data to tell farmers how much nitrogen or water their fields need.

Wheat harvested from a field in Boigneville, 100 km south of Paris, France, in August this year will have been grown with the benefit of this data, as part of a pilot being run by an EU-funded project known as IOF2020, which involves over 70 partners and around 200 researchers.

‘Sensors are costing less and less, so at the end of the project we hope to have something farmers or farm cooperatives can deploy in their fields,’ explained Florence Leprince, a plant scientist at Arvalis – Institut du végétal, the French arable farming institute which is running the wheat experiment.

‘This will allow farmers to be more precise and not overuse nitrogen or water.’ Florence Leprince, Arvalis – Institut du végétal, France

Adding too much nitrogen to a crop field costs farmers money, but it also has a negative environmental impact. Surplus nitrogen leaches from soils and into rivers and lakes, causing pollution.

The sensor data is needed because satellite pictures can indicate how much nitrogen is in a crop, but not in the soil. The sensors will add that detail, in a way that farmers will find easy to use.
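As an illustration of how the two data sources might complement each other – the formula and every number below are invented, not IOF2020's actual model – a recommendation could subtract both the nitrogen already in the crop (seen from space) and the nitrogen still in the soil (seen by ground sensors) from a target:

```python
def nitrogen_topup(target_kg_ha, crop_n_satellite, soil_n_sensor):
    """Apply only the shortfall: subtract nitrogen already in the crop
    (estimated from satellite imagery) and nitrogen still in the soil
    (from ground sensors) from a target uptake, never going below zero."""
    return max(0.0, target_kg_ha - crop_n_satellite - soil_n_sensor)

print(nitrogen_topup(200.0, 120.0, 50.0))  # 30.0 kg/ha left to apply
```

Without the soil sensors, the 50 kg/ha already in the ground would be invisible and risk being applied again – the surplus that ends up in rivers and lakes.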

It’s a similar story for the robotic hedge trimmer being developed by a separate group of researchers. All the farmer or groundskeeper needs to do is mark which hedge needs trimming.

‘The user will sketch the garden, though not too accurately,’ said Bob Fisher, computer vision scientist at Edinburgh University, UK, and coordinator of the EU-funded TrimBot2020 project. ‘The robot will go into the garden and come back with a tidied-up sketch map. At that point, the user can say go trim that hedge, or mark what’s needed on the map.’

This autumn will see the arm and the robot base assembled together, while the self-driving bot will be set off around the garden next spring.

More info:
SAGA (part of ECHORD Plus Plus)
IOF2020
TrimBot2020

Digital symbiosis lets robot co-workers predict human behaviour

Robot co-workers could help out with repetitive jobs and heavy lifting by reacting to human actions. Image credit – Italian Institute of Technology

by Anthony King
Stephen Hawking and Elon Musk fear that the robotic revolution may already be underway, but automation isn’t going to take over just yet – first, machines will work alongside us.

Robots across the world help out in factories by taking on heavy lifting or repetitive jobs, but the walking, talking kind may soon collaborate with people, thanks to European robotics researchers building prototypes that anticipate human actions.

‘Ideally robots should be able to sense interactional forces, like carrying a table with someone,’ said Francesco Nori, who coordinates the EU-funded An.Dy project which aims to advance human-robot collaboration. ‘(Robots) need to know what the human is about to do and what they can do to help.’

In any coordinated activity, whether dancing or lifting a table together, timing is crucial and that means a robot needs to anticipate before a person acts.

‘Today, robots just react – half a second of anticipation might be enough,’ said Nori, who works at the Italian Institute of Technology, renowned for its humanoid robot iCub, which will be educated in human behaviour using data collected during the An.Dy project.

The data will flow from a special high-tech suit that lies at the heart of the project – the AndySuit. This tight suit is studded with sensors to track movement, acceleration of limbs and muscle power as a person performs actions alone or in combination with a humanoid robot.

A special high-tech suit known as the AndySuit allows a person to perform actions alongside a robot. Image credit – Italian Institute of Technology

This sends data to a robot similar to iCub so that it can recognise what the human is doing and predict the next action just ahead of time. The collaborative robot – also known as a cobot – would then be programmed to support the worker.
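A minimal stand-in for this kind of anticipation is plain extrapolation: track a point on the body, estimate its velocity, and project half a second ahead – the horizon Nori mentions. The project itself is developing learned models of whole-body dynamics; this hypothetical sketch only shows the idea:

```python
def anticipate(positions, dt=0.1, horizon=0.5):
    """Predict where a tracked point (e.g. a wrist marker from the
    AndySuit) will be `horizon` seconds ahead, by extrapolating the
    velocity between its last two samples taken `dt` seconds apart."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return (x1 + vx * horizon, y1 + vy * horizon)

# A hand moving steadily right at 1 m/s is predicted 0.5 m further on.
print(anticipate([(0.0, 0.0), (0.1, 0.0)]))  # approximately (0.6, 0.0)
```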

‘The robot would recognise a good posture and a bad posture and would work so that it gives you an object in the right way to avoid injury,’ explained Nori, adding that the cobot would adapt its own actions to maximise the comfort of the human.

The robot’s capabilities will come from its library of pre-programmed models of human movement, but also from internal sensors and a mobile phone app. Special sensors that communicate with the iCub are also being developed for the AndySuit, but at the moment it is more appropriate for the robotics lab rather than a factory floor.

To get the robot and AndySuit closer to commercialisation it will be tested in three different scenarios. First, in a workspace where a person works beside a cobot. Second, when a person wears an exoskeleton, which could be useful for workers who must lift heavy loads and can be assisted by a robust metal skeleton around them.

A third scenario will be where a humanoid robot offers assistance and could take turns performing tasks. In this situation, the robot would look like the archetypal sci-fi robot – think Sonny from the film I, Robot.

Silicon sidekick

A human-like prototype robot will also reach out a helping hand to support technicians, under a separate EU-funded project called SecondHands, led by Ocado Technology in the UK.

Ocado runs giant automated warehouses that fulfil grocery orders. Its warehouse in Hatfield, north of London, is the size of several football fields and must be temporarily shut down for regular maintenance.

Duncan Russell, research coordinator at Ocado Technology, explained: ‘Parts need to be cleaned and parts need replacing. The robot system is being designed to help the technicians with those tasks.’

While the technician stands on a ladder, a robot below would watch what they are doing and provide the next tool or piece of equipment when asked.

‘The robot will understand instructions in regular language – it will be cleverer than you might expect,’ said Russell. ‘Ask it to pass the screw driver, and it will respond asking whether you meant the one on the table or in the toolbox.’
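The clarifying-question behaviour Russell describes could be sketched as follows – the tool inventory and the matching logic here are invented for illustration, not SecondHands' actual language system:

```python
def fetch(request, tools):
    """Hand over a uniquely matching tool, or ask a clarifying
    question when several tools match the spoken request."""
    matches = [t for t in tools if t["name"] == request]
    if len(matches) == 1:
        return f"Passing the {request} from the {matches[0]['place']}."
    if matches:
        options = " or the ".join(t["place"] for t in matches)
        return f"Did you mean the one on the {options}?"
    return f"I cannot find a {request}."

tools = [{"name": "screwdriver", "place": "table"},
         {"name": "screwdriver", "place": "toolbox"},
         {"name": "wrench", "place": "table"}]
print(fetch("screwdriver", tools))  # Did you mean the one on the table or the toolbox?
print(fetch("wrench", tools))       # Passing the wrench from the table.
```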

The robot will feature capabilities straight from the Inspector Gadget cartoon series. An extendable torso will allow it to move upwards, and telescopic limbs will give it a reach of more than three metres.

‘The arm span is 3.1 metres and the torso is around 1.8 metres, which gives it a dynamic reach. This will allow it to offer assistance to technicians up on a ladder,’ said Russell.

This futuristic scenario is being brought to reality by research partners around Europe. Robotics experts at Karlsruhe Institute of Technology in Germany have built a wheeled prototype robot. The plan is for a bipedal robot to be tested in the Ocado robots lab in Hatfield, and for it to be transferred to the warehouse floor for a stint with a real technician.

Karlsruhe is also involved in teaching the robot natural language and, together with the Swiss Federal Institute of Technology in Lausanne, it is developing a grasping hand so the helper robot can wield tools with care. The vision system of this silicon sidekick is being developed by researchers at University College London, UK.

The handy robotic helper could also cross-check the maintenance person’s work, offering a reminder if a particular step is missed, for example.

‘The technician will get more done and faster, so that the shutdown times for maintenance can be shortened,’ said Russell.

More info:
An.Dy
SecondHands

Artificial skin could allow robots to feel like we do

Researchers are running tests on pig skin to better understand how skin behaves and pave the way for bioengineering applications. Image credit – Dr Aisling Ni Annaidh at University College Dublin

Artificial skin with sensing capabilities beyond those of humans, and a better understanding of skin tissue, could pave the way for robots that can feel, smart transplants and even cyborgs.

Few people would immediately recognise the skin as our bodies’ largest organ, but the adult human has on average two square metres of it. It’s also one of the most important organs and is full of nerve endings that provide us with instant reports of temperature, pressure and pain.

So far the best attempts to copy this remarkable organ have resulted in experimental skin with sensor arrays that, at best, can only measure one particular stimulus.

But the SmartCore project, funded by the EU’s European Research Council and based at Graz University of Technology (TU Graz) in Austria, hopes to create a material that responds to multiple stimuli. Doing so requires working at the nanoscale — where one nanometre represents a billionth of a metre — to create embedded arrays of minuscule sensors that could be 2 000 times more sensitive than human skin.

Principal investigator Dr Anna Maria Coclite, an assistant professor at TU Graz’s Institute for Solid State Physics, says the project aims to create a nanoscale sensor which can pick up temperature, humidity and pressure — not separately, but as an all-in-one package.

‘They will be made of a smart polymer core which expands depending on the humidity and temperature, and a piezoelectric shell, which produces an electric current when pressure is applied,’ she said.

These smart cores would be sandwiched between two similarly tiny nanoscale grids of electrodes which sense the electrical charges given off when the sensors ‘feel’ and then transmit this data.
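A toy forward model makes the design concrete. Every coefficient below is invented; the code only mirrors the structure described – core expansion driven by temperature and humidity, shell charge driven by pressure plus the squeeze of the expanding core:

```python
def core_response(temp_c, humidity, pressure_kpa):
    """One hypothetical SmartCore sensel: the polymer core's expansion
    tracks temperature and humidity; the piezoelectric shell's charge
    tracks applied pressure plus the squeeze from the expanding core.
    All coefficients are made up for illustration."""
    expansion = 0.02 * temp_c + 0.05 * humidity   # arbitrary units
    charge = 0.10 * pressure_kpa + 0.3 * expansion
    return expansion, charge

print(core_response(25.0, 0.5, 10.0))
```

Note that in this toy version two readings must account for three stimuli, so the inverse problem is underdetermined – one way to see why distinguishing between the different senses is the project's primary challenge.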

If the team can surmount the primary challenge of distinguishing between the different senses, the first prototype should be ready in 2019, opening the door for a range of test uses.

Robots

Dr Coclite says the first applications of a successful prototype would be in robotics since the artificial skin they’re developing has little in common with our fleshy exterior apart from its ability to sense.

‘The idea is that it could be used in ways, like robotic hands, that are able to sense temperatures,’ said Dr Coclite. ‘Or even things that can be sensed on even a much smaller scale than humans can feel, i.e. robotic hands covered in such an artificial skin material that is able to sense bacteria.’

Moreover, she says the polymers used to create smart cores are so flexible that a successful sensor could potentially be modified in the future to sense other things like the acidity of sweat, which could be integrated into smart clothes that monitor your health while you’re working out.

And perhaps, one day, those who have lost a limb or suffered burns could also benefit from such multi-stimuli sensing capabilities in the form of a convincingly human artificial skin.

‘It would be fantastic if we could apply it to humans, but there’s still lots of work that needs to be done by scientists in turning electronic pulses into signals that could be sent to the brain and recognised,’ said Dr Coclite.

She also says that even once a successful prototype is developed, possible cyborg use in humans would be at least a decade away — especially taking into account the need to test for things like toxicity and how human bodies might accept or reject such materials.

Getting a grip

But before any such solutions are possible, we must learn more about biological tissue mechanics, says Professor Michel Destrade, host scientist of the EU-backed SOFT-TISSUES project, funded by the EU’s Marie Skłodowska-Curie actions.

Prof. Destrade, an applied mathematician at the National University of Ireland Galway, is supporting Marie Skłodowska-Curie fellow Dr Valentina Balbi in developing mathematical models that explain how soft tissue like eyes, brains and skin behave.

‘For example, skin has some very marked mechanical properties,’ said Prof. Destrade. ‘In particular its stretch in the body — sometimes you get a very small cut and it opens up like a ripe fruit.’

This is something he has previously researched with acoustic testing, which uses non-destructive sound waves to investigate tissue structure, instead of chopping up organs for experimentation.

And in SOFT-TISSUES’ skin research, the team hopes to use sound waves and modelling as a cheap and immediate means of finding the tension of skin at any given part of the body for any given person.
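In the simplest idealisation – treating skin as a stretched membrane, which real, elastic and anisotropic skin is not, so this is emphatically not the project's model – surface-wave speed and tension are linked by c = √(T/μ), and a measured wave speed yields a tension estimate. The areal density figure below is a placeholder:

```python
def skin_tension(wave_speed_ms, areal_density=0.02):
    """Tension per unit length (N/m) from surface-wave speed, using the
    idealised membrane relation c = sqrt(T / mu), i.e. T = mu * c**2.
    `areal_density` (kg/m^2) is a placeholder value, not measured data."""
    return areal_density * wave_speed_ms ** 2

print(skin_tension(5.0))  # 0.5 N/m with these toy numbers
```

The appeal of the acoustic approach is visible even in this caricature: speed is measurable non-destructively, and tension follows from a formula rather than from cutting into the tissue.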

‘This is really important to surgeons, who need to know in which direction they should cut skin to avoid extensive scarring,’ explained Prof. Destrade. ‘But also for the people creating artificial skin to know how to deal with mismatches in tension when they connect it to real skin.

‘If you are someone looking to create artificial skin and stretch it onto the body, then you need to know which is the best way to cut and stretch it, the direction of the fibres needed to support it and so on.’

Dr Balbi reports that the biomedical industry has a real hunger for knowledge provided by mathematical modelling of soft tissues — and especially for use in bioengineering.

She says such knowledge could be useful in areas like cancer research into brain tumour growth and could even help improve the structure of lab-grown human skin as an alternative to donor grafts.

More info:
SmartCore
SOFT-TISSUES


Swarms of smart drones to revolutionise how we watch sports

Researchers are looking for ways to connect drones together in swarms to capture sports events. Image credit — Flickr/ Ville Hyvönen

by Joe Dodgshun
Drone innovators are transforming the way we watch events, from football matches and boat races to music festivals.

Anyone who has watched coverage of a festival or sports event in the last few years will probably have witnessed commercial drone use — in the form of breathtaking aerial footage.

But a collaboration of universities, research institutes and broadcasters is looking to take this to the next level by using a small swarm of intelligent drones.

The EU-funded MULTIDRONE project seeks to create teams of three to five semi-automated drones that can react to and capture unfolding action at large-scale sports events.

Project coordinator Professor Ioannis Pitas, of the University of Bristol, UK, says the collaboration aims to have prototypes ready for testing by its media partners Deutsche Welle and Rai – Radiotelevisione Italiana within 18 months.

‘Deutsche Welle has two potential uses lined up – filming the Rund um Wannsee boat race in Berlin, Germany, and also filming football matches with drones instead of normal cameras – while Rai is interested in covering cycling races,’ said Prof. Pitas.

‘We think we have the potential to offer a much better film experience at a reduced cost compared to helicopters or single drones, producing a new genre in drone cinematography.’

But before they can chase the leader of the Tour de France, MULTIDRONE faces the hefty challenge of creating AI that allows its drones to safely carry out a mission as a team.

Prof. Pitas says safety is the utmost priority, so the drones will include advanced crowd avoidance mechanisms and the ability to make emergency landings.

And it’s not just safety in the case of bad weather, a flat battery or a rogue football.

‘Security of communications is important as a drone could otherwise be hijacked, not just undermining privacy but also raising the possibility that it could be used as a weapon,’ said Prof. Pitas.

The early project phase will have a strong focus on ethics to prevent any issues around privacy.

‘People are sensitive about drones and about being filmed and we’re approaching this in three ways — trying to avoid shooting over private spaces, getting consent from the athletes being followed, and creating mechanisms that decide which persons to follow and blur other faces.’

If they can pull it off, he predicts a huge boost for the European entertainment industry and believes it could lead to much larger drone swarms capable of covering city-wide events.

Drones-on-demand
According to Gartner research, sales of commercial-use drones are set to jump from 110 000 units in 2016 to 174 000 this year. Although 2 million toy drones were snapped up last year for USD 1.7 billion, the commercial market dwarfed this at USD 2.8 billion.

Aside from pure footage, drones have also proven their worth in research, disaster response, construction and even in monitoring industrial assets.

One company trying to open up the market to those needing a sky-high helping hand is Integra Aerial Services, a young drones-as-a-service company.

An offshoot of Danish aeronautics firm Integra Holding Group, INAS was launched in 2014 thanks to an EU-backed feasibility study.

INAS has more than 25 years of experience in aviation and used its knowledge of the sector’s legislation to shape a business model targeting heavier, more versatile drones weighing up to 25 kilogrammes. And they have already been granted a commercial drone operating license by the Danish Civil Aviation Authority.

These bigger drones have far more endurance than typical toy drones, which can weigh anywhere from 250 grams to several kilos. INAS CEO Gilles Fartek says their bigger size means they can carry multiple sensors, thus collecting all the needed data in one fell swoop, instead of across multiple flights.

For example, one of their drones flies a LIDAR (Light Detection and Ranging) instrument over Greenland to measure ice thickness as an indicator of climate change, but could also carry a 100 megapixel, high-definition camera.

While INAS spends most of the Arctic summer running experiments from the remote host Station Nord in Greenland, Fartek says they’re free to use the drones for different projects in other seasons, mostly in areas of environmental research, mapping and agricultural monitoring.

‘You can’t match the quality of data for the price, but drone-use regulations in Europe are still quite complicated and make between-country operations almost impossible,’ said Fartek.

‘The paradox is that you have an increasing demand for such civil applications across Europe and even in institutional areas like civil protection and maritime safety where they cannot use military drones.’

A single European sky
These issues, and more, should soon be addressed by SESAR, the project which coordinates all EU research and development activities in air traffic management. SESAR plans to deploy a harmonised approach to European airspace management by 2030 in order to meet a predicted leap in air traffic.

Recently SESAR unveiled its blueprint outlining how it plans to make drone use in low-level airspace safe, secure and environmentally friendly. It hopes the plan will be ready by 2019, paving the way for an EU drone services market by safely integrating highly automated or autonomous drones into low-level airspace of up to 150 metres.

Modelled after manned aviation traffic management, the plan will include registration of drones and operators, provide information for autonomous drone flights and introduce geo-fencing to limit areas where drones can fly.
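At its simplest, geo-fencing is a point-in-polygon test of a drone’s position against registered no-fly zones. The sketch below illustrates the idea only; it is not SESAR’s specification, and the zone coordinates and function names are invented for the example.

```python
# Minimal geo-fencing sketch: ray-casting point-in-polygon test.
# A no-fly zone is a polygon of (x, y) vertices; a drone position
# is rejected if it falls inside any registered zone.

def inside(point, polygon):
    """Return True if point lies inside the polygon (ray casting)."""
    x, y = point
    hits = 0
    n = len(polygon)
    for i in range(n):
        (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
        # Does a rightward horizontal ray from the point cross this edge?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                hits += 1
    return hits % 2 == 1

# A hypothetical rectangular no-fly zone (e.g. around an airfield)
airport_zone = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]

def flight_allowed(position, zones):
    """A position is allowed only if it is outside every zone."""
    return not any(inside(position, z) for z in zones)

print(flight_allowed((0.5, 0.5), [airport_zone]))  # inside the zone
print(flight_allowed((2.0, 2.0), [airport_zone]))  # well outside
```

A real system would of course work on geodetic coordinates and altitude limits, and push zone updates to the drone in flight.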

The Issue
Emerging drone sectors range from delivery services and industrial data collection to infrastructure inspection, precision agriculture, transportation and logistics.

The market for drone services is expected to grow substantially in the coming years with an estimated worth of EUR 10 billion by 2035.

To support high-potential small- and medium-sized enterprises (SMEs), the European Commission has allocated EUR 3 billion over the period 2014-2020. A further EUR 17 billion was set aside under the Industrial Leadership pillar of the EU’s current research funding programme Horizon 2020.

More info
MULTIDRONE

INAS

A robotic doctor is gearing up for action

A new robot under development can send information on the stiffness, look and feel of a patient to a doctor located kilometres away. Image credit: Accrea

A robotic doctor that can be controlled hundreds of kilometres away by a human counterpart is gearing up for action. Getting a check-up from a robot may sound like something from a sci-fi film, but scientists are closing in on this real-life scenario and have already tested a prototype.

‘The robot at the remote site has different force, humidity and temperature sensors, all capturing information that a doctor would get when they are directly palpating (physically examining) a patient,’ explains Professor Angelika Peer, a robotics researcher at the University of the West of England, UK.

Prof. Peer is also the project coordinator of the EU-funded ReMeDi project, which is developing the robotic doctor to allow medical professionals to examine patients over huge distances.

Through a specially designed surface mounted on a robotic arm, stiffness data from the patient’s abdomen is relayed to the doctor, allowing them to feel what the remote robot feels. This is made possible by a tool called a haptic device, which has a soft, skin-like surface that recreates the sense of touch through force feedback and changes in shape.

During the examination, the doctor sits at a desk facing three screens, one showing the doctor’s hand on the faraway patient and a second for teleconferencing with the patient, which will remain an essential part of the exchange.

The third screen displays a special capability of the robot doctor – ultrasonography. This is a medical technique that sends sound pulses into a patient’s body to create a window into the patient. It reveals areas of different densities in the body and is often used to examine pregnant women.

Ultrasonography is also important for flagging injuries or disease in organs such as the heart, liver, kidneys or spleen and can find indications for some types of cancer, too.

‘The system allows a doctor from a remote location to do a first assessment of a patient and make a decision about what should be done, whether to transfer them to hospital or undergo certain treatments,’ said Prof. Peer.

The robot currently resides in a hospital in Poland but scientists have shown the prototype at medical conferences around the world. And they have already been approached by doctors from Australia and Canada where it can take several hours to transfer rural patients to a doctor’s office or hospital.

With the help of a robot, a doctor can talk to a patient, manoeuvre robotic arms, feel what the robot senses and get ultrasounds. Image credit: ReMeDi

‘This is to support an initial diagnosis. The human is still in the loop, but this allows them to perform an examination remotely,’ said Prof. Peer.

Telemedicine

The ReMeDi project could speed up a medical exam and save time for patients and clinics. Another EU-funded project – United4Health (U4H) – looks at a different technology that could be used to remotely diagnose or treat people.

‘We need to transform how we deliver health and care,’ said Professor George Crooks, director of the Scottish Centre for Telehealth & Telecare, UK, which provides services via telephone, web and digital television and coordinates U4H.

This approach is crucial as Europe faces an ageing population and a rise in long-term health conditions like diabetes and heart disease. Telemedicine empowers these types of patients to take steps to help themselves at home, while staying in touch with medical experts via technology. Previous studies showed those with heart failure can be successfully treated this way.

These patients were given equipment to monitor their vital signs and send data back to a hospital. A trial in the UK comparing this self-care group to the standard-care group showed a reduction in mortality, hospital admissions and bed days, says Prof. Crooks.

A similar result was seen at the demonstration sites of the U4H project, which tested the telemedicine approach in 14 regions for patients with heart failure, diabetes and chronic obstructive pulmonary disease (COPD). Diabetic patients in Scotland kept in touch with the hospital using text messages, while some COPD patients used video consultations.

Prof. Crooks stresses that it is not all about the electronics – what matters is the service wrapped around the technology, which makes it acceptable and easy to use for patients and clinical teams.

‘It can take two or three hours out of your day to go along to a 15 minute medical appointment and then to be told to keep taking your medication. What we do is, by using technology, patients monitor their own parameters, such as blood sugar in the case of diabetes, how they are feeling, diet and so on, and then they upload these results,’ said Prof. Crooks.

‘It doesn’t mean you never go to see a doctor, but whereas you might have gone seven or eight times a year, you may go just once or twice.’

Crucially, previous research has shown these patients fare better and the approach is safe.

‘There can be an economic benefit, but really this is about saving capacity. It frees up healthcare professionals to see the more complex cases,’ said Prof. Crooks.

It also empowers patients to take more responsibility for their health and results in fewer unplanned visits to the emergency room.

‘Patient satisfaction rates were well over 90 %,’ said Prof. Crooks.

Robots offer the elderly a helping hand

Humanoid robots under development can be programmed to detect changes in an elderly person’s preferences and habits. Image credit: GrowMeUp

by Helen Massy-Beresford

Low birth rates and higher life expectancies mean that people over 65 years old will account for 28.7 % of Europe’s population by 2080, according to Eurostat, the EU’s statistics arm.

It means the age-dependency ratio – the proportion of the elderly compared with the number of workers – will almost double from 28.8 % in 2015 to 51 % in 2080, straining healthcare systems and national budgets.

Yet there’s hope marching over the horizon, in the form of robots.

The creators of one humanoid robot under development for the elderly say it can understand people’s actions and learn new behaviours in response, even though it is devoid of arms.

Robots can be programmed to understand an elderly person’s preferences and habits and to detect changes in behaviour: if a yoga devotee misses a class, for example, the robot will ask why, while if an elderly person falls it will automatically alert caregivers or emergency services.

Yet there’s still a way to go before these devices will be able to bring out a tray of tea and biscuits when visitors drop by, according to its creator.

At the moment there are things the robot can perform perfectly in the lab but that still present challenges, says Dr Luís Santos from the University of Coimbra in Portugal, who has been developing the technology as part of an EU-funded research project known as GrowMeUp.

The proportion of elderly people is expected to almost double by 2080, so researchers are looking to robots to see if they can help care for the aging population. Image credit: GrowMeUp

‘There is a mismatch between what elderly people want and what science and technology can provide – some of them are expecting robots to do all types of household activities, engage them in everyday gossip or physically interact with them as another human would do,’ says Dr Santos.

The team is working on making the robot’s dialogue as natural and intuitive as possible and on improving its ability to safely navigate an older person’s home using a low-cost laser and a camera. A second prototype will be tested with elderly people in the coming months. Yet Dr Santos foresees that these devices are still at least four to six years away from commercialisation.

Revolution

He sees robotics as just a part of a wider revolution underway in how societies care for the elderly, with connectivity and augmented reality also playing a role.

‘In the future, elderly care will also be very focused on information and communications technologies – for example virtual access to doctors or care institutions and 24/7 monitoring in a non-invasive way are likely to become standard,’ he said.

Yet researchers believe that keeping the technology unobtrusive is key – no wearable devices or cumbersome cameras cluttering up people’s homes.

Dr Maria Dagioglou from the National Centre of Scientific Research ‘Demokritos’ in Greece, said: ‘We wanted to avoid a Big Brother scenario, so data privacy is important but also dignity.’

She is looking at ways to integrate robotics technology into a smart home equipped with connected devices, automation and sensors, as part of the EU-funded RADIO project.

Researchers are figuring out ways of putting robots in homes for virtual access to healthcare and constant monitoring, yet that are also non-invasive. Image credit: RADIO

Dr Stasinos Konstantopoulos, the scientific manager of the RADIO project, added: ‘All monitoring happens as the user interacts with the system to control the house, for example, to regulate the temperature, and to ask the robot to run errands, like finding misplaced items.’

Users interact with the equipment via a tablet or smartphone. The system, which should take only a day to install, can monitor elements of an elderly person’s day-to-day life, efficiently processing and managing data to allow medical professionals to keep track of and assess their level of independence via smartphone notifications.

‘It’s a constant safety net in case something starts to be worrying,’ said Dr Dagioglou.

The goal of innovations like this is to allow people to live independently for longer.

A crucial element of this is finding ways for older people to keep up their activity levels, and this is an area where robots could really come into their own.

Dr Luigi Palopoli at the University of Trento in Italy said: ‘Our robot pushes them to do their exercise, to go out and about; it extracts information on their interests and on their fears and makes them part of a network.’

Barriers

‘We want to tear down the emotional barriers that make them stay at home and degrade the quality of their life,’ he said.

As part of the EU-funded Acanto project, he is developing a robot called FriWalk, following on from the progress made during a previous EU-funded project, the DALi project.

The team has worked hard to make the FriWalk robot look energetic and appealing and to ensure it offers useful services like carrying small items or giving directions.

With prototypes built, the researchers will start clinical trials in Spain in the next few months, as well as public demonstrations of the FriWalk in museums and other public spaces.

Researchers will start clinical trials in Spain of a robotic prototype designed to help elderly people to exercise. Image credit: Acanto

Further ahead, Dr Palopoli hopes for interest from established manufacturers and start-ups to bring the FriWalk technology to the market.

If you liked this article, you may also want to read:

See all the latest robotics news on Robohub, or sign up for our weekly newsletter.

Choreographing automated cars could save time, money and lives

If you take humans out of the driving seat, could traffic jams, accidents and high fuel bills become a thing of the past? As cars become more automated and connected, attention is turning to how to best choreograph the interaction between the tens or hundreds of automated vehicles that will one day share the same segment of Europe’s road network.

It is one of the most keenly studied fields in transport – how to make sure that automated cars get to their destinations safely and efficiently. But the prospect of having a multitude of vehicles taking decisions while interacting on Europe’s roads is leading researchers to design new traffic management systems suitable for an era of connected transport.

The idea is to ensure that traffic flows as smoothly and efficiently as possible, potentially avoiding the jams and delays caused by human behaviour.

‘Travelling distances and time gaps between vehicles are crucial,’ said Professor Markos Papageorgiou, head of the Dynamic Systems & Simulation Laboratory at the Technical University of Crete, Greece. ‘It is also important to consider things such as how vehicles decide which lane to drive in.’

Prof. Papageorgiou’s TRAMAN21 project, funded by the EU’s European Research Council, is studying ways to manage the behaviour of individual vehicles, as well as highway control systems.

For example, the researchers have been looking at how adaptive cruise control (ACC) could improve traffic flows. ACC is a ‘smart’ system that speeds up and slows down a car as necessary to keep up with the one in front. Highway control systems using ACC to adjust time gaps between cars could help to reduce congestion.

‘It may be possible to have a traffic control system that looks at the traffic situation and recommends or even orders ACC cars to adopt a shorter time gap from the car in front,’ Prof. Papageorgiou said.

‘So during a peak period, or if you are near a bottleneck, the system could work out a gap that helps you avoid the congestion and gives higher flow and higher capacity at the time and place where this is needed.’
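The trade-off Prof. Papageorgiou describes can be sketched with a generic constant time-gap controller, where the desired spacing grows with speed. This is a textbook-style model, not the TRAMAN21 design; the gains, limits and starting conditions below are illustrative assumptions.

```python
# Minimal sketch of a constant time-gap adaptive cruise controller.
# Desired spacing behind the leader: d_des = d0 + time_gap * v.
# A traffic control system could shorten time_gap near a bottleneck
# to pack vehicles closer and raise capacity.

def acc_accel(gap, v, v_leader, time_gap, d0=5.0, k_gap=0.23, k_vel=0.74):
    """Acceleration command (m/s^2) from spacing and speed errors."""
    desired_gap = d0 + time_gap * v
    a = k_gap * (gap - desired_gap) + k_vel * (v_leader - v)
    return max(-3.0, min(2.0, a))  # comfort/actuator limits

def simulate(time_gap, steps=600, dt=0.1):
    """Follower behind a steady 25 m/s leader; returns (gap, speed)."""
    gap, v, v_leader = 60.0, 20.0, 25.0
    for _ in range(steps):
        a = acc_accel(gap, v, v_leader, time_gap)
        v = max(0.0, v + a * dt)
        gap += (v_leader - v) * dt
    return gap, v

# At equilibrium the spacing settles at d0 + time_gap * 25, so a
# shorter commanded time gap yields a smaller steady-state spacing.
print(simulate(1.5))
print(simulate(1.0))
```

Running the two cases shows the follower matching the leader’s 25 m/s in both, but settling roughly 12.5 m closer with the 1.0 s gap, which is exactly the capacity gain a control centre would be after.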

Variable speed limits

TRAMAN21, which runs to 2018, has been running tests on a highway near Melbourne, Australia, and is currently using variable speed limits to actively intervene in traffic to improve flows.

An active traffic management system of this kind could help even when relatively few vehicles on the highway have sophisticated automation. But Prof. Papageorgiou believes that self-driving vehicle systems must be robust enough to communicate with each other even when there is no overall traffic control system.

‘Schools of fish and flocks of birds do not have central controls, and the individuals base their movement on the information from their own senses and the behaviour of their neighbours,’ Prof. Papageorgiou said.

‘In theory this could also work in traffic flow, but there is a lot of work to be done if this is to be perfected. Nature has had a long head-start.’

One way of managing traffic flow is platooning – scheduling trucks to meet up and drive in convoy on the highway. Magnus Adolfson from Swedish truckmaker Scania AB, who coordinated the EU-funded COMPANION project, says that platooning – which has already been demonstrated on Europe’s roads – can also reduce fuel costs and accidents.

The three-year project tested different combinations of distances between trucks, speeds and unexpected disruptions or stoppages.

Fuel savings

In tests with three-vehicle platoons, researchers achieved fuel savings of 5 %. And by keeping radio contact with each other, the trucks can also reduce the risk of accidents.

‘About 90 percent of road accidents are caused by driver error, and this system, particularly by taking speed out of the driver’s control, can make it safer than driving with an actual driver,’ Adolfson said.

The COMPANION project also showed the benefits of close communication between vehicles to reduce the likelihood of braking too hard and causing traffic jams further back.
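The safety case for radio contact comes down to reaction delay: if every truck brakes as hard as its predecessor, the follower only needs enough gap to cover the distance it travels before it starts braking. The numbers and margin below are illustrative assumptions, not COMPANION’s models.

```python
# Rough kinematic sketch: minimum safe following gap for a platoon
# member, assuming equal braking ability front and rear. The follower
# travels an extra (speed * reaction_delay) before it matches the
# leader's braking, so that is the gap it must be able to give up.

def min_safe_gap(speed, reaction_delay, margin=2.0):
    """Smallest gap (m) that avoids collision in an emergency stop."""
    return speed * reaction_delay + margin

# Reacting by sight (~1.5 s driver reaction) vs. radio broadcast of
# the leader's brake command (~0.1 s link latency), at 25 m/s.
print(min_safe_gap(25.0, 1.5))  # by sight: ~39.5 m
print(min_safe_gap(25.0, 0.1))  # with radio: ~4.5 m
```

The order-of-magnitude difference is why platooning trucks can follow closely enough to cut aerodynamic drag without raising the crash risk.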

‘There is enough evidence to show that using such a system can have a noticeable impact, so it would be good to get it into production as soon as possible,’ Adolfson said. The researchers have extended their collaboration to working with the Swedish authorities on possible implementation.

Rutger Beekelaar, a project manager at Dutch-based research organisation TNO, says that researchers need to demonstrate how automated cars can work safely together in order to increase their popularity.

‘Collaboration is essential to ensure vehicles can work together,’ he said. ‘We believe that in the near future, there will be more and more automation in traffic, in cars and trucks. But automated driving is not widely accepted yet.’

To tackle this, Beekelaar led a group of researchers in the EU-funded i-GAME project, which developed technology that uses wireless communication that contributes to managing and controlling automated vehicles.

They demonstrated these systems in highway conditions at the 2016 Grand Cooperative Driving Challenge in Helmond, in the Netherlands, which put groups of real vehicles through their paces to demonstrate cooperation: safely negotiating an intersection crossing and merging with another column of traffic.

Beekelaar says that their technology is now being used in other European research projects, but that researchers, auto manufacturers, policymakers, and road authorities still need to work together to develop protocols, systems and standardisation, along with extra efforts to address cyber security, ethics and particularly the issue of public acceptance.

Split-second decisions: Navigating the fine line between man and machine

Level 3 automation, where the car handles all aspects of driving with the driver on standby, is being tested in Sweden. Image courtesy of Volvo cars

Today’s self-driving car isn’t exactly autonomous – the driver has to be able to take over in a pinch, and therein lies the roadblock researchers are trying to overcome. Automated cars are hurtling towards us at breakneck speed, with all-electric Teslas already running limited autopilot systems on roads worldwide and Google trialling its own autonomous pod cars.

However, before we can reply to emails while being driven to work, we have to have a foolproof way to determine when drivers can safely take control and when it should be left to the car.

‘Even in a limited number of tests, we have found that humans are not always performing as required,’ explained Dr Riender Happee, from Delft University of Technology in the Netherlands, who is coordinating the EU-funded HFAuto project to examine the problem and potential solutions.

‘We are close to concluding that the technology always has to be ready to resolve the situation if the driver doesn’t take back control.’

But in these car-to-human transitions, how can a computer decide whether it should hand back control?

‘Eye tracking can indicate driver state and attention,’ said Dr Happee. ‘We’re still to prove the practical usability, but if the car detects the driver is not in an adequate state, the car can stop in the safety lane instead of giving back control.’

Next level

It’s all a question of the level of automation. According to the scale of US-based standards organisation SAE International, Level 1 automation already exists in the form of automated braking and self-parking.

Levels 4 and 5 automation, where you punch in the destination and sit back for a nap, are still on the horizon.

But we’ll soon reach Level 3 automation, where drivers can hand over control in situations like motorway driving and let their attention wander, as long as they can safely intervene when the car asks them to.

HFAuto’s 13 PhD students have been researching this human-machine transition challenge since 2013.

Backed with Marie Skłodowska-Curie action funding, the students have travelled Europe for secondments, to examine carmakers’ latest prototypes, and to carry out simulator and on-road tests of transition takeovers.

Alongside further trials of their transition interface, HFAuto partner Volvo has already started testing 100 highly automated Level 3 cars on Swedish public roads.

Another European research group is approaching the problem with a self-driving system that uses external sensors together with cameras inside the cab to monitor the driver’s attentiveness and actions.

Blink

‘Looking at what’s happening in the scene outside of the cars is nothing without the perspective of what’s happening inside the car,’ explained Dr Oihana Otaegui, head of the Vicomtech-IK4 applied research centre in San Sebastián, Spain.

She coordinates the work as part of the EU-funded VI-DAS project. The idea is to avoid high-risk transitions by monitoring factors like a driver’s gaze, blinking frequency and head pose — and combining this with real-time on-road factors to calculate how much time a driver needs to take the wheel.
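One way to picture that calculation is to fold the monitoring signals into a single readiness score and scale the warning lead time accordingly. The VI-DAS models are not described in the article, so the weighting, thresholds and time range below are invented purely for illustration.

```python
# Hedged sketch: estimate how much warning a driver needs before a
# handover, from gaze, blink rate and head pose. All weights and
# ranges are illustrative assumptions, not the VI-DAS design.

def readiness(gaze_on_road, blink_rate_hz, head_yaw_deg):
    """Score in [0, 1]; higher means the driver can take over sooner."""
    gaze_term = max(0.0, min(1.0, gaze_on_road))       # fraction of time on road
    blink_term = max(0.0, 1.0 - blink_rate_hz / 1.0)   # fast blinking ~ drowsy
    pose_term = max(0.0, 1.0 - abs(head_yaw_deg) / 60.0)  # head turned away
    return (gaze_term + blink_term + pose_term) / 3.0

def takeover_lead_time(score, t_min=4.0, t_max=12.0):
    """Seconds of warning before the driver must have the wheel."""
    return t_max - score * (t_max - t_min)

alert = readiness(gaze_on_road=0.9, blink_rate_hz=0.2, head_yaw_deg=5.0)
distracted = readiness(gaze_on_road=0.2, blink_rate_hz=0.7, head_yaw_deg=40.0)
print(takeover_lead_time(alert))       # short lead time for an alert driver
print(takeover_lead_time(distracted))  # longer warning, or stop in safety lane
```

If the score falls below some floor, the sensible fallback is the one Dr Happee describes: do not hand over at all, and stop in the safety lane instead.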

Its self-driving system uses external cameras as affordable sensors, collecting data for the underlying artificial intelligence system, which tries to understand road situations like a human would.

VI-DAS is also studying real accidents to discern challenging situations where humans fail and using this to help train the system to detect and avoid such situations.

The group aims to have its first interface prototype working by September, with iterated prototypes appearing at the end of 2018 and 2019.

Dr Otaegui says the system could have potential security sector uses given its focus on creating artificial intelligence perception in any given environment, and hopes it could lead to fully automated driving.

‘It could even go down the path of Levels 4 and 5, depending on how well we can teach our system to react — and it will indeed be improving all the time we are working on this automation.’

The question of transitions is so important because it has an impact on liability – who is responsible in the case of an accident.

It’s clear that Level 2 drivers can be held liable if they cause a fender bender, while carmakers will take the rap once Level 4 is deployed. However, with Level 3 transitions, liability remains a burning question.

HFAuto’s Dr Happee believes the solution lies in specialist insurance options that will emerge.

‘Insurance solutions are expected (to emerge) where a car can be bought with risk insurance covering your own errors, and those which can be blamed on carmakers,’ he said.

Yet it goes further than that. Should a car choose to hit pedestrians in the road, or swerve into the path of an oncoming lorry, killing its occupants?

‘One thing coming out of our discussions is that no one would buy a car which will sacrifice its owner for the lives of others,’ said Dr Happee. ‘So it comes down to making these as safe as possible.’

The five levels of automation:

  1. Driver Assistance: the car can either steer or regulate speed on its own.
  2. Partial Automation: the vehicle can handle both steering and speed selection on its own in specific controlled situations, such as on a motorway.
  3. Conditional Automation: the vehicle can be instructed to handle all aspects of driving, but the driver needs to be on standby to intervene if needed.
  4. High Automation: the vehicle can be instructed to handle all aspects of driving, even if the driver is not available to intervene.
  5. Full Automation: the vehicle handles all aspects of driving, all the time.

Airborne tech is coming to the rescue

L’Aquila, Abruzzo, Italy. A government office disrupted by the 2009 earthquake. Source: Wikipedia Commons

by Fintan Burke

‘There’s no real way to determine how safe it is to approach a building, and what is the safest route to do that,’ said Dr Angelos Amditis of the Institute of Communication and Computer Systems, Greece. ‘Now for the first time you can do that in a much more structured way.’

Dr Amditis coordinates the EU-funded RECONASS project, which is using drones and satellite technology to help emergency workers in post-disaster scenarios.

It’s part of a new arsenal of airborne disaster response technologies which is giving a bird’s eye view of disaster zones to help save lives.

The system developed by RECONASS places wireless positioning tags into a building’s structure, along with strain, temperature and acceleration sensors, either when it is first built or through retrofitting. Then, after a disaster strikes, drones are deployed to image the building’s exterior and match the sensor data to these images to build an accurate representation of the damage.

This allows the system to precisely calculate potential areas of structural weakness in the damaged building, and it can then be paired with satellite data to get an overview of the damage in an area.

‘From one side we have 3D virtual images showing the status of the building … produced by ground sensors’ input, and from the other side we have the 3D real images from the drones and satellite views,’ said Dr Amditis.


Dr Amditis and colleagues are hoping to test the system in September by detonating explosives in a mock three-storey building in Sweden to investigate how well the drone technology works.

The end goal is for emergency services to use the drone technology to provide information about the structural state of tagged buildings within the first crucial hours after the disaster.

To boost rescue services’ ability to respond quickly in the event of a crisis, researchers in Spain are working on how to help emergency helicopter pilots successfully navigate through a disaster area by giving them more precise and reliable information about their position.

The EU-funded 5 LIVES project is building a system that uses both the newly launched Galileo Global Navigation Satellite System (GNSS) from the European Commission and the European Geostationary Navigation Overlay Service (EGNOS), which reports on the reliability and accuracy of data from satellites.

At the moment, most European helicopters use visual navigation, meaning they cannot fly in poor visibility such as fog, cloud or bad weather.

While more and more are beginning to use GNSS technology, the system has been known to be affected by slight changes in the earth’s atmosphere. There is also no way of knowing how accurate the information is.

‘With conventional GNSS technology positioning, there are a lot of external effects that might negatively affect the precision of positioning,’ said project manager Santi Vilardaga of Barcelona-based Pildo Labs.

In order to encourage the adoption of GNSS technology, the 5 LIVES system overlays information from EGNOS to confirm the precision and accuracy of satellite data.

‘Different parameters that affect the displacement of the signal or the propagation of the satellite signal through the atmosphere are collected through a network of ground-based stations and then transferred to the geostationary satellites that then broadcast the corrections to the EGNOS users,’ he said.

More flexible

This extra precision and confidence in the data allows emergency helicopters to become more flexible. ‘They can operate at night, and they can operate in poor meteorological conditions without the need for added technology on the ground,’ said Vilardaga.

They are also designing drone-based search and rescue operations. Drones can be used to fly around disaster areas, using on-board cameras to continue the search for survivors even when weather conditions make it unsafe for human pilots to do so, explains Vilardaga.

The team is now ensuring that the drones satisfy existing European regulation and their new positioning technology is up to standard for a demonstration of their search and rescue procedures next year.

The need for a new way to combat natural disasters from the air is being driven in part by a new type of crisis – so-called mega-fires. These once-rare fires, which are known to spread at a rate of more than three acres per second, are forest fires large enough to spread between fire-fighting jurisdictions.

A variety of factors such as drought, climate change and the practice of preventing smaller fires from developing naturally have all been blamed for the increasing frequency of mega-fires.

To defend against these fires, the EU-funded Advanced Forest Fire Fighting project is trying to improve both firefighting logistics and technology.

One method involves developing a pellet made of extinguishing materials which can be dropped from a greater height, meaning pilots are less likely to encounter risks from smoke or heat from the fire.

Another area under development is centralising the information from satellite data, drones and ground reports into what is being called a core expert engine to help firefighters to prioritise procedures.

‘The commander can have at his disposal just one aircraft with a pellet,’ said project coordinator Cecilia Coveri of Finmeccanica SpA, Italy, ‘and in case the pellet is the best way to extinguish a fire … this simulation can help the commander to take this decision.’

The project begins its first trials in Athens during the summer in cooperation with Greek naval forces, with additional testing of the risk analysis and simulation tools to follow.

