
Exoskeleton walks out into the real world

For years, the Stanford Biomechatronics Laboratory has captured imaginations with their exoskeleton emulators—lab-based robotic devices that help wearers walk and run faster, with less effort. Now, these researchers will turn heads out in the "wild" with their first untethered exoskeleton, featured in a paper published Oct. 12 in Nature.

Measuring perception in AI models

Perception – the process of experiencing the world through senses – is a significant part of intelligence. And building agents with human-level perceptual understanding of the world is a central but challenging task, which is becoming increasingly important in robotics, self-driving cars, personal assistants, medical imaging, and more. So today, we’re introducing the Perception Test, a multimodal benchmark using real-world videos to help evaluate the perception capabilities of a model.


This unique burrowing robot was inspired by the Pacific mole crab

The unassuming Pacific mole crab, Emerita analoga, is about to make some waves. UC Berkeley researchers have debuted a unique robot inspired by this burrowing crustacean that may someday help evaluate the soil of agricultural sites, collect marine data and study soil and rock conditions at construction sites.

Fuzzy Logic and CAIRE, on the Way to Automating the Painting Process

Fuzzy Logic's Repplix software module allows a robotic system to learn trajectories from a single gesture, without any specific training. With the help of a portable learning device, an operator can transfer their professional know-how to the robot simply by executing their task.

50 women in robotics you need to know about 2022

Our Women in Robotics list turns 10 this year, and we are delighted to introduce you to another amazing “50 women in robotics you need to know about” as we also celebrate Ada Lovelace Day. We have now profiled more than 300 women and non-binary people making important contributions to robotics since the list began in 2013. This year our 50 come from robotics companies (small and large), self-driving car companies, governments, research organizations and the media. The list spans the globe, with honorees hailing from the EU, UK, USA, Australia, China, Turkey, India and Kenya. A number of them work at household-name organizations such as NASA, ABB, GE, Toyota and the Wall Street Journal. As the number of women on the list grows, so does the combined global impact of their efforts, increasing the visibility of women in the field who might otherwise go unrecognized. We publish this list to counter the unconscious perception that women aren’t making significant contributions. We encourage you to use our lists to find women for keynotes, panels and interviews, to cite their work, and to include them in curricula.

The role models these 50 women represent are diverse, ranging from emeritus to early career stage. Role models are important. Countess Ada Lovelace, the world’s first computer programmer and an extraordinary mathematician, faced an uphill battle in the days when women were not encouraged to pursue a career in science. Fast forward 200 years and there are still not enough women in science, technology, engineering or math (STEM). One key reason is clear: the lack of visible female role models. That is why we continue to run our women in robotics photo challenge, showcasing real women building real robots. Women in STEM need to be equally represented at conferences and keynotes, on magazine covers, and in stories about technology. Although this is starting to change, the change is not happening quickly enough. You can help: spread the word and use this resource to inspire others to consider a career in robotics. As you will see, there are many different ways the women we profile are making a difference.

We hope you are inspired by these profiles, and if you want to work in robotics too, please join us at Women in Robotics. We are now a 501(c)(3) non-profit organization, but even so, this post wouldn’t be possible if not for the hard work of volunteers and the Women in Robotics Board of Directors.

Want to keep reading? There are more than 300 other stories on our 2013 to 2021 lists (and their updates):

Please share this and cite Women in Robotics as the author. Why not nominate a woman or non-binary person working in robotics for inclusion next year?

ep.361: Recycling: An Opaque Industry, with Areeb Malik

The recycling industry in the United States is a for-profit industry: it makes money by taking recyclable material, refining it, and reselling it to companies for less than it would cost to produce the material from scratch.

If you look at the demand side of the recycling industry, multi-billion-dollar companies like Coca-Cola and PepsiCo are incentivized to buy recycled goods and reduce their materials costs.

If you look at the supply side, ~300 million tons of trash are generated annually in the United States. Estimates suggest that up to 75% of that is recyclable.

On paper, it seems clear that maximizing the amount of trash the US recycles is in everyone’s interest. One issue, though: less than a third of that trash actually ends up recycled.
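For a rough sense of the scale involved, here is a back-of-the-envelope sketch using only the figures quoted above; treating “less than a third” as an upper bound on what actually gets recycled makes the resulting gap a conservative estimate.

```python
# Back-of-the-envelope estimate of the US recycling gap,
# based only on the figures quoted in the episode summary above.

trash_tons = 300e6          # ~300 million tons of trash generated annually in the US
recyclable_share = 0.75     # up to 75% of that is estimated to be recyclable
recycled_share = 1 / 3      # "less than a third" actually recycled (upper bound)

recyclable_tons = trash_tons * recyclable_share   # ~225 million tons that could be recycled
recycled_tons = trash_tons * recycled_share       # under ~100 million tons actually recycled

print(f"Recyclable:         ~{recyclable_tons / 1e6:.0f}M tons/year")
print(f"Actually recycled:  <{recycled_tons / 1e6:.0f}M tons/year")
print(f"Unrealized gap:     >{(recyclable_tons - recycled_tons) / 1e6:.0f}M tons/year")
```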

Areeb, co-founder of Glacier, breaks down the multi-layered reasons why the recycling industry cannot handle this volume of trash, and what Glacier is doing to address it.

Areeb Malik

Areeb Malik is the co-founder of Glacier, and he is on a personal mission to fight climate change and extract value from the $123B worth of recyclables that fill landfills and oceans. Before founding Glacier, Areeb was a software engineer at Facebook, where he used machine learning and computer vision to build new product features.


A system for automating robot design inspired by the evolution of vertebrates

Researchers at Kyoto University and Nagoya University in Japan have recently devised a new, automatic approach for designing robots that could simultaneously improve their shape, structure, movements, and controller components. This approach, presented in a paper published in Artificial Life and Robotics, draws inspiration from the evolution of vertebrates, the broad category of animals that possess a backbone or spinal column, which includes mammals, reptiles, birds, amphibians, and fishes.