Researchers at the Electronics and Telecommunications Research Institute (ETRI) in Korea have recently developed a deep learning-based model that could allow robots to produce engaging nonverbal social behaviors, such as hugging or shaking someone's hand. Their model, presented in a paper pre-published on arXiv, can actively learn new context-appropriate social behaviors by observing interactions among humans.
Are you looking for a gift for the women in robotics in your life? Or the up-and-coming women in robotics in your family? Perhaps these suggestions from our not-for-profit Women in Robotics organization will inspire! We hope these are also good suggestions for non-binary people in robotics, and I personally reckon they are ideal for men in the robotics community too. It’s all about the robotics, eh!
Plus OMG it’s less than 50 days until 2023!!! So we’re going to do a countdown with a social media post every day until Dec 31st featuring one of the recent ‘50 women in robotics you need to know about 2022’. It’s in a random order and today we have…
Follow us on Twitter, Facebook, LinkedIn, Pinterest or Instagram to find out who we’re featuring each day!
Holiday gift ideas
Visit the Women in Robotics store for t-shirts, mugs, drink bottles, notebooks, stickers, tote bags and more!
From Aniekan @_aniekan_
From @mdn_nrbl
From Vanessa Van Decker @VanessaVDecker
From Andra @robotlaunch
Do you have a great robot gift idea?
More sophisticated robots are on the way, accelerating a drive to ensure they help workers rather than take their place.
Here, Knight Optical – a leading supplier of metrology-tested, high-precision optical components – investigates the fascinating metaverse and explores some of the most recent advances.
Most artificial intelligence (AI) researchers now believe that writing computer code that can capture the nuances of situated interactions is impossible. Instead, modern machine learning (ML) researchers have focused on learning about these types of interactions from data. To explore these learning-based approaches and quickly build agents that can make sense of human instructions and safely perform actions in open-ended conditions, we created a research framework within a video game environment. Today, we’re publishing a paper [INSERT LINK] and a collection of videos showing our early steps in building video game AIs that can understand fuzzy human concepts – and therefore can begin to interact with people on their own terms.
Researchers at MIT have taken significant steps toward creating robots that could practically and economically assemble nearly anything, including things much larger than themselves, from vehicles to buildings to larger robots.
Amoebae are single-cell organisms. Through self-organization they can form complex structures, and they do so purely through local interactions: if food is plentiful, they disperse evenly through a culture medium, but if food becomes scarce, they emit a messenger molecule known as cyclic adenosine monophosphate (cAMP). This chemical signal induces the amoebae to gather in one place and form a multicellular aggregation. The result is a fruiting body.
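To make the mechanism concrete, here is a minimal, hypothetical one-dimensional simulation (a sketch, not code from any study): starving agents deposit a signal that evaporates and diffuses, and each agent climbs the local signal gradient. Those two local rules alone are enough to make an evenly spread population collapse into a few aggregates.

```python
import random

# Hypothetical 1-D sketch of cAMP-style aggregation: starving agents
# deposit signal; the signal evaporates and diffuses; agents climb
# the local gradient. Parameters are illustrative, not from any study.
WORLD, STEPS = 60, 300
DECAY, DEPOSIT = 0.9, 1.0

signal = [0.0] * WORLD
agents = [random.randrange(WORLD) for _ in range(40)]

def climb(pos):
    """Move one cell toward the stronger neighboring signal."""
    left, right = signal[(pos - 1) % WORLD], signal[(pos + 1) % WORLD]
    if right > left:
        return (pos + 1) % WORLD
    if left > right:
        return (pos - 1) % WORLD
    return pos  # no gradient: stay put

for _ in range(STEPS):
    for pos in agents:               # starving agents emit signal
        signal[pos] += DEPOSIT
    signal = [                       # evaporation + nearest-neighbor diffusion
        DECAY * (0.5 * signal[i]
                 + 0.25 * signal[(i - 1) % WORLD]
                 + 0.25 * signal[(i + 1) % WORLD])
        for i in range(WORLD)
    ]
    agents = [climb(pos) for pos in agents]

print(sorted(agents))  # positions tend to collapse into a few tight clusters
```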
Volunteer customers will be able to order and receive beverages at designated drop-off points in PIT's Airside Terminal. Ottobot from Ottonomy is the first fully autonomous robot for deliveries inside airports, and it is currently deployed at several of them.
Learning how to build upon knowledge by tapping 30 years of computer vision research
Researchers from Hanyang University and Inha University have demonstrated collective behavior in swarms of swimming robots that can deliver thousands of cargoes.
IEC 61131-3 is the de facto standard in the industrial automation industry. It encompasses five programming languages: Ladder Diagram, Function Block Diagram, Structured Text, Instruction List, and Sequential Function Chart. It makes designing simple programs straightforward, and it does this very well.
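All five languages share the same execution model: a PLC runtime repeatedly reads inputs, evaluates the program, and writes outputs. As a rough illustration only, the sketch below emulates that cyclic scan in Python; the start/stop motor latch (the classic seal-in circuit) is a hypothetical example, not code from the standard itself.

```python
# Minimal Python sketch of the cyclic scan model shared by the
# IEC 61131-3 languages: read inputs, evaluate logic, write outputs.
# The start/stop latch below is a hypothetical example, not taken
# from the standard.

def scan_cycle(inputs: dict, state: dict) -> dict:
    """One PLC scan evaluating a start/stop latch with a seal-in contact."""
    # Motor runs if START is pressed or it was already running,
    # unless STOP is pressed.
    run = (inputs["start"] or state["motor"]) and not inputs["stop"]
    state["motor"] = run          # latched state survives to the next scan
    return {"motor_contactor": run}

state = {"motor": False}
print(scan_cycle({"start": True,  "stop": False}, state))  # latches on
print(scan_cycle({"start": False, "stop": False}, state))  # stays on
print(scan_cycle({"start": False, "stop": True},  state))  # drops out
```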