
Shaping the UK’s future with smart machines: Findings from four ThinkIns with academia, industry, policy, and the public

The UK Robotics Growth Partnership (RGP) aims to set the conditions for the UK to become a global leader in Robotics and Autonomous Systems, whilst delivering a smarter, safer, more prosperous, sustainable and competitive country. The aim is for smart machines to become ubiquitous, woven into the fabric of society: in every sector, every workplace, and at home. Done right, this could increase productivity and improve quality of life. It could help us meet Net Zero targets, and support workers as their roles transition away from menial tasks.

One thing that’s striking is that although robots hold so much potential, the technology is not yet ready. The COVID-19 crisis has made this very clear. If it had been ready, we could have deployed robots at scale to sanitise hospitals, enable doctor-patient communication through telepresence, or connect patients with loved ones. Robots could have produced, organised, and delivered much-needed samples, tests, PPE, medicine, and food across the UK. And many businesses could be reopening with a robotic interface. Robots could have powered a low-touch economy, where activities continue even when humans can’t be in close physical contact, driving recovery and resilience.

For the past year, we’ve been thinking about this at the RGP. How could we have done things better? What would it take to make an armada of disinfecting robots for a COVID-19 pop-up hospital (called a Nightingale hospital in the UK)? Ideally we would have been able to log into a digital twin of the hospital, port in a model of a robot platform from a database, work up a solution in the virtual world, and readily port it to an actual testbed to demonstrate its function in the physical world. We could then trial the solution in a living lab, maybe a dedicated Nightingale, before scaling up the solution to other hospitals. Others developing telepresence robots could follow the same methodology, checking that their solutions are interoperable, and working within the same virtual and physical environments.

We have many pieces of the puzzle in the UK: great research and industry, plus government buy-in. What we need now is to bring them together.

To explore this further, we have spent the last few months hosting ThinkIns with Tortoise Media to gather feedback from academia, industry, policy, and the public. You can read all the blog posts and watch the videos here:

The future of smart machines: reflections from academia https://ukrgp.org/the-future-of-smart-machines-reflections-from-academia/

Building an ecosystem to make useful robots https://ukrgp.org/building-an-ecosystem-to-make-useful-robots/

Musings with the public about their future with smart machines https://ukrgp.org/musings-with-the-public-about-their-future-with-smart-machines/

Keeping up with the pace of change – positioning the UK as a leader in smart machines https://ukrgp.org/keeping-up-with-the-pace-of-change-positioning-the-uk-as-a-leader-in-smart-machines/

Below are some preliminary findings.

From digital twins to living labs

In his blog, James Kell from Rolls-Royce says “The UK is small enough to collaborate well, but big enough to be a global leader. But to be successful we will need new tools, in particular better, cheaper digital twins – synthetic environments where we can develop and test new approaches before we test them in real world environments on our $35m engines.”

Professor Samia Nefti-Meziani from Salford University made a similar comment: “Society needs better tools to support a sustainable future, with a new network of synthetic environments to build and fine-tune new technologies, to ensure they work in the real world and to reduce the time from their inception to deployment from years to months. Digital platforms and accessible, open-source software tools will empower SMEs, academics and the public sector to engage and benefit from these new solutions.”

Collaboration across academia and industry

As Samia highlights, “Collaboration was a recurring theme in the ThinkIns, with academic and industry partnerships essential to ensure we target the most pressing challenges and drive innovation in the sectors that need its solutions. As smart machines become more capable and cheaper, their adoption and development within the UK business ecosystem will broaden across sectors and applications.”

Government support to unlock incentives

James continued, “Collaboration needs coordination: Government is critical to convening and leading, creating new ways and incentives to work better together. The new tools will only equip our researchers and SMEs to accelerate product development, validation and speed to market if they can trust each other and all both contribute and benefit. We need new ways for big industry (companies like mine) to have their challenges understood and find new partners to work with, to learn together to develop solutions and put them in place quicker. And if we join up the academics and link across our innovation infrastructure and existing test areas, we will accelerate the adoption of smart machines and unleash the multiple benefits they bring.”  

The human element

‘Taking the public with us’ is critical to mass adoption, says Samia. “Key will be:

– Involving the public in co-creating research and industry ambitions, to help them understand and engage with what RAS can offer

– Engaging with those who distrust RAS, to understand their concerns and gain their confidence

– Improving RAS education and lifelong learning, so those with interest and capability can be trained in RAS and directly involved in shaping their future.”

“The sector must ‘show its workings’ and be clear of the problems and challenges to prioritise. Ensuring standards and protocols are developed to protect the input of the public and the quality of the outputs is vital to buy-in in the long-term.”

David Bisset, a Robotics Consultant, commented on the Public session, highlighting that “Smart Machines are already with us, cars, aeroplanes, vacuum cleaners we don’t call them robots but they all use that technology. To make them work requires many skills; industrial design, AI people, sensor experts and interaction designer… and many more. At a human level we need to be able to trust, to know it’s built right and safe.”

He highlights the issue of “Tech Wash” mentioned by the public. “Is ‘smart machine’ just some clever rebranding? The needless selling of technology as a solution to every senior manager’s need to outshine their peers? We need to stop and think about the consequences of forcing through technology driven organisational change without evidence and stop needless disruption. We need to know these things will work!”

Overall, to make smart machines a success, we need to bring the discussion to a human level, to where this makes a difference to people.

Bringing it all together

Rob Buckingham, Head of RACE at the UK Atomic Energy Authority commented on the policy ThinkIn “Robotics includes both tools that are physically discrete from us and physical augmentation. In either case the interface between person and machine is going to be a field of rapid development driven at least in part by gaming and zooming.

The much bigger part is the informed discussion with people, with society, about the world we want to live in. Are we Canute (spoiler alert – it doesn’t end well) or are we the voice of sustainable democracy that values both people and nature?

I think Living Labs are going to sit at the heart of this… physical places where we explore the issues and opportunities together. In my field of nuclear, mock-ups have always been sensible because experimenting with the real thing is only allowed in exceptional circumstances. My hope is that we will invest in many Living Labs around the country that enable us, collectively, to explore the benefits and unintended consequences of our creativity. Of course, we might expect all of the Living Labs to be connected by data and the management of data; indeed we might expect common tech platforms and digital models of ‘nearly everything’ to be one of the highest value spin-offs.”

Updated: All the #ICRA2020 plenary and keynote videos


ICRA 2020, one of the main international robotics conferences, is happening online this year due to COVID-19. That means there is loads of free content you can view from home. It’s a great way to see what’s happening in the field straight from those pushing the state of the art.

Plenaries and Keynotes are being broadcast from June 1 to 15 at 1PM UTC on IEEE.TV. We’ve embedded all the talks below, and will keep updating throughout the conference. Check out the online programme for more great content, including workshops and tutorials.

Plenary Panel (Monday, June 1, 1PM UTC)
Chair: Wolfram Burgard

COVID-19: How Can Roboticists Help?

Moderator: Ken Goldberg, UC Berkeley

Panellists:
Robin Murphy, Texas A&M, USA
Brad Nelson, ETH Zurich, CH
Richard Voyles, Purdue, USA
Kris Hauser, UIUC, USA
Antonio Bicchi, I-RIM (Italian Institute of Robotics and Intelligent Machines), Italy
Andra Keay, Silicon Valley Robotics, USA
Gangtie Zheng, Tsinghua U, PRC
Ayanna Howard, Georgia Tech, USA
Kirsten Thurow, CELISCA Rostock, Germany
Helen Greiner, ASAALT, USA
Howie Choset, CMU, USA
Guang-Zhong Yang, Shanghai Jiao Tong U, PRC

Plenaries
Lydia E. Kavraki: Planning in Robotics and Beyond (Tuesday June 2, 1PM UTC)
Yann LeCun: Self-Supervised Learning & World Models (Wednesday June 3, 1PM UTC)
Jean-Paul Laumond: Geometry of Robot Motion: from the Rolling Car to the Rolling Man (Thursday June 4, 1PM UTC)

Keynotes
Allison Okamura: Haptics for Humans in a Physically Distanced World (Monday June 8, 1PM UTC)
Kerstin Dautenhahn: Human-Centred Social Robotics: Autonomy, Trust and Interaction Challenges (Tuesday June 9, 1PM UTC)
Pieter Abbeel: Can Deep Reinforcement Learning from pixels be made as efficient as from state? (Wednesday June 10, 1PM UTC)
Jaeheung Park: Compliant Whole-body Control for Real-World Interactions (Thursday June 11, 1PM UTC)
Cordelia Schmid: Automatic Video Understanding (Friday June 12, 1PM UTC)
Cyrill Stachniss: Robots in the Fields: Directions Towards Sustainable Crop Production (Monday June 15, 1PM UTC)
Toby Walsh: How long before Killer Robots? (Tuesday June 16, 1PM UTC)
Hajime Asama: Robot Technology for Super Resilience – Remote Technology for Response to Disasters, Accidents, and Pandemics (Wednesday June 17, 1PM UTC)

COVID-19 robotics resources: ideas for roboticists, users, and educators

Thessaloniki, Greece – April 6, 2020: Drone with recorded message informs citizens of Thessaloniki to stay home to be protected from the coronavirus.

Robots could have a role to play in the COVID-19 response, whether automating laboratory research, helping with logistics, disinfecting hospitals, supporting education, or allowing carers, colleagues, and loved ones to connect using telepresence. Yet many of these solutions are still in development or early deployment. The hope is that accelerating their translation could make a difference.

This page aims to compile some resources for roboticists who are able to help, users who need robots for COVID-19 applications, and people who want to learn about robotics while on lockdown.


This is not an exhaustive resource page, and we will regularly be updating the content. Please send pointers to sabine.hauert@robohub.org. Check AIhub.org for a similar resource page related to AI. Read More

Human Robot Interaction conference launches online


The 15th Annual ACM/IEEE International Conference on Human-Robot Interaction – HRI 2020 – was meant to take place in Cambridge, UK. Instead, it launches online today. You can follow the latest happenings on Twitter and YouTube. Check here for a list of all the papers.

The theme of this year’s conference is “Real World Human-Robot Interaction,” reflecting on recent trends in the HRI community toward creating and deploying systems that can facilitate real-world, long-term interaction between robots and users.

Check back here, and the official HRI website, for new links and videos.

20+ holiday robot videos

Thanks to all those that sent us their holiday videos. Here’s a selection of 20+ videos to get you into the spirit this season.

Let’s kick off with a full Christmas robot story (10min including bloopers). You can also watch the short version here.

Congrats on making it this far. As a final treat, here’s 9 hours of relaxing sounds of a cat on a robot hoovering around a Christmas tree and presents.

Did we miss your video? Send it to sabine.hauert@robohub.org.

#IROS2019 videos and exhibit floor – update

The 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (#IROS2019) was held in Macau earlier this month. The theme this year was “robots connecting people”.

For those who couldn’t make it in person, or couldn’t possibly see everything, IROS launched IROS TV. You can catch all 28 videos here, or watch the three summary videos below.


And here’s my quick tour of the exhibit floor.

Finally, follow #IROS2019 or @IROS2019MACAU on Twitter.

Did you publish at IROS? Send your stories to me and we’ll get them on Robohub.

#IROS2019 videos and exhibit floor

The 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (#IROS2019) is being held in Macau this week. The theme this year is “robots connecting people”.

For those who can’t make it in person, or can’t possibly see everything, IROS is launching IROS TV. The first two episodes are below, but you can have a look at the 22 videos already produced here.

And here’s a quick tour of the exhibit floor.

Finally, follow #IROS2019 or @IROS2019MACAU on Twitter.

Sabine (@sabinehauert) from Robohub will be on site; please share your IROS stories and latest publications with her.

#IROS2019 live coverage and IROS TV

The 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (#IROS2019) is being held in Macau this week. The theme this year is “robots connecting people”.

The conference accepted 1,127 papers for oral presentation, 148 late breaking news posters, and 41 workshops and tutorials.

For those who can’t make it in person, or can’t possibly see everything, IROS is launching IROS TV, an onsite conference television channel featuring a new episode daily that is screened around the conference venue and online.

The TV shows profile the research of scientists, educators, and practitioners in robotics, and provide an opportunity to learn about advances in robotics.

You can also follow #IROS2019 or @IROS2019MACAU on Twitter.

Sabine (@sabinehauert) from Robohub will be on site; please share your IROS stories and latest publications with her.

Live coverage of #ICRA2019

The IEEE International Conference on Robotics and Automation (ICRA) is being held this week in Montreal, Canada. It’s one of the top venues for roboticists and attracts over 4,000 attendees.

Andra, Audrow, Lauren, and Lilly are on the ground so expect lots of great podcasts, videos with best-paper nominees, and coverage in the weeks and months ahead.

For a taste of who is presenting, here is the schedule of keynotes.

It also looks like you can navigate the program, read abstracts, and watch spotlight presentations by following these instructions.

Finally, check latest tweets at #ICRA2019.

Growing bio-inspired shapes with a 300-robot swarm

Artistic photo taken by Jerry H. Wright showing a hand-made shape generated following an emergent Turing pattern (displayed using the LEDs). The trajectory of one of the moving robots can be seen through long exposure. Jerry also used a filter to see the infrared communication between the robots (white light below the robots reflected on the table). Reprinted with permission from AAAS.

Work by I. Slavkov, D. Carrillo-Zapata, N. Carranza, X. Diego, F. Jansson, J. Kaandorp, S. Hauert, J. Sharpe

Our work, published today in Science Robotics, describes how we grow fully self-organised shapes using a swarm of 300 coin-sized robots. The work was led by James Sharpe at EMBL and the Centre for Genomic Regulation (CRG) in Barcelona, together with my team at the Bristol Robotics Laboratory and the University of Bristol.

Here’s a video summarising the results, or you can read the paper here:

Self-organised shapes

Nature is capable of producing impressive functional shapes throughout embryonic development. Broadly, there are two ways to form these shapes.

1) Top-down control. Cells have access to information about their position through some coordinate system, for example generated through molecular gradients. Cells use this information to decide their fate, which ultimately creates the shape. There are beautiful examples of this strategy being used for robot swarms; check here for work by Rubenstein et al. (video).

2) Local self-organisation. Cells generate reaction-diffusion systems, such as those described by Alan Turing, resulting in simple periodic patterns. Cells can use these patterns to decide their fate and the resulting shape.

We use the second strategy. Here’s how it works.

Patterning

We start from a swarm of 300 closely packed robots in a disc, each running the same code. Each robot stores two morphogens, u and v, which you can think of as virtual chemical signals. Morphogen u activates itself and the other morphogen v, whereas v inhibits itself and the other morphogen u – this is the ‘reaction’ network. ‘Diffusion’ of u and v happens through communication from robot to robot. Symmetry breaking caused by the ‘reaction-diffusion’ system results in spots emerging on the swarm (or stripes if we change the parameters!). Areas with high levels of morphogens are shown in green – that’s what we call a “Turing spot”.
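To make the patterning step concrete, here is a minimal sketch of a reaction-diffusion system on a ring of simulated robots. This is an illustration, not the controller from our paper: the coefficients, clamping range, robot count, and ring topology are all assumptions chosen so that the standard Turing instability conditions hold (in particular, the inhibitor v must diffuse much faster than the activator u).

```python
import numpy as np

# Minimal sketch (not the paper's exact equations): a clamped linear
# activator-inhibitor system on a ring of "robots", where diffusion
# is implemented as exchange with each robot's two ring neighbours.
N = 100                           # number of robots on the ring (assumed)
rng = np.random.default_rng(0)
u = 0.01 * rng.random(N)          # activator: promotes both u and v
v = 0.01 * rng.random(N)          # inhibitor: suppresses both u and v

a, b, c, d = 1.0, 2.0, 2.0, 2.0   # illustrative reaction coefficients
Du, Dv = 1.0, 40.0                # inhibitor diffuses much faster
dt = 0.01                         # small step keeps the scheme stable

def laplacian(x):
    # discrete diffusion: compare each robot to its two neighbours
    return np.roll(x, 1) + np.roll(x, -1) - 2 * x

for _ in range(4000):
    du = a * u - b * v + Du * laplacian(u)   # u activates u and v
    dv = c * u - d * v + Dv * laplacian(v)   # v inhibits u and v
    u = np.clip(u + dt * du, 0.0, 5.0)       # clamping bounds the growth
    v = np.clip(v + dt * dv, 0.0, 5.0)

spots = u > 2.5   # robots with high activator levels form the "spots"
print(int(spots.sum()), "of", N, "robots are in high-activator regions")
```

Diffusion-driven instability amplifies the small random initial noise into alternating high-u and low-u regions, the one-dimensional analogue of the spots on the swarm.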

Image adapted from Slavkov, I., Zapata D. C. et al., Science Robotics (2018).

Tissue movement
In biology, cells may die or multiply depending on their patterning. As we can’t do either of those things with robots, we simply move robots from areas where they are no longer needed to areas of growth. The general idea is that robots that are on the edge of the swarm, and are not in a Turing spot, move along the edge of the swarm until they are near a spot. This causes protrusions to grow at the locations of the Turing spots.
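The movement rule described above boils down to a simple per-robot decision. The sketch below is our own illustration of that logic; the state fields, threshold value, and action names are hypothetical and not taken from the paper's controller.

```python
from dataclasses import dataclass

@dataclass
class RobotState:
    on_edge: bool       # robot detects it sits on the swarm boundary
    morphogen_u: float  # local activator level from the patterning stage

SPOT_THRESHOLD = 2.5    # assumed level above which a robot is "in a spot"

def next_action(state: RobotState) -> str:
    """Edge robots outside any Turing spot circulate along the boundary;
    everyone else holds position, so material accumulates at the spots."""
    if state.on_edge and state.morphogen_u < SPOT_THRESHOLD:
        return "follow_edge"
    return "stay"
```

Because interior robots and robots inside spots stay put while the remaining edge robots keep circulating, the swarm's mass gradually redistributes toward the spots, growing the protrusions.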

Image adapted from Slavkov, I., Zapata D. C. et al., Science Robotics (2018).

Following these simple rules, we are able to grow shapes in a repeatable manner, although all the shapes are slightly different. If you watch the video, you’ll see that these shapes look quite organic. We did over 20 experiments with large robot swarms, each one taking about 3 hours.

Image adapted from Slavkov, I., Zapata D. C. et al., Science Robotics (2018).

Because the rules are so simple, and only rely on local information, we get adaptability and robustness for free.

Adaptability
First, as the shape grows, the Turing spots move, showing that the patterning adapts to the shape of the swarm, and that the shape further adapts to the patterning. Second, we can easily change the starting configuration of the swarm (a smaller number of robots, or a ‘rectangular’ starting condition) and the shape still forms.

Image adapted from Slavkov, I., Zapata D. C. et al., Science Robotics (2018).

Robustness
Chopping off a protrusion causes the robots to regrow it, or to reallocate themselves to other protrusions in the swarm. Splitting the swarm causes it to self-heal.

Image adapted from Slavkov, I., Zapata D. C. et al., Science Robotics (2018).

Potential for real world applications
While inspiration was taken from nature to grow the swarm shapes, the goal is ultimately to build large robot swarms for real-world applications. Imagine hundreds or thousands of tiny biodegradable robots growing shapes to explore a disaster environment after an earthquake or fire, or sculpting themselves into a dynamic 3D structure, such as a temporary bridge that could automatically adjust its size and shape to fit any building or terrain. There is still a long way to go, however, before we see such swarms outside the laboratory.

Team
James Sharpe (EMBL Barcelona) led the Swarm-Organ project, which was initiated at the Centre for Genomic Regulation (CRG) when Sharpe was a group leader there. Sabine Hauert (Bristol Robotics Laboratory and University of Bristol) was the key senior collaborator. Other collaborators were Fredrik Jansson (currently employed at Centrum Wiskunde & Informatica – CWI) and Jaap Kaandorp (University of Amsterdam – UvA).


Paper

You can read more in the paper Slavkov, I., Zapata D. C. et al., Science Robotics (2018).

Funding
The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7) under grant agreement n° 601062, and the EPSRC Centre for Doctoral Training in Future Autonomous and Robotic Systems (FARSCOPE) at the Bristol Robotics Laboratory.
