All posts by Oliver Mitchell


Is the green new deal sustainable?

This week Washington DC was abuzz with news that had nothing to do with the occupant of The White House. A group of progressive legislators in the House of Representatives, led by Alexandria Ocasio-Cortez, introduced “The Green New Deal.” The resolution, citing the warnings of the Intergovernmental Panel on Climate Change and the alarming Fourth National Climate Assessment, aims to reduce global “greenhouse gas emissions from human sources of 40 to 60 percent from 2010 levels by 2030; and net-zero global emissions by 2050.” While the bill largely targets the transportation industry, many proponents suggest that it would be more impactful, and healthier, to curb America’s insatiable appetite for animal agriculture.

In a recent BBC report, “Food production accounts for one-quarter to one-third of all anthropogenic greenhouse gas emissions worldwide, and the brunt of responsibility for those numbers falls to the livestock industry.” The average US family “emits more greenhouse gases because of the meat they eat than from driving two cars,” quipped Professor Tim Benton of the University of Leeds. “Most people don’t think of the consequences of food on climate change. But just eating a little less meat right now might make things a whole lot better for our children and grandchildren,” sighed Benton.

Americans continue to chow down more than 26 billion pounds of meat a year, distressing environmentalists who assert that the status quo is unsustainable. While a worldwide shift to veganism could cut food-related greenhouse gas emissions by as much as 70%, it is not foreseeable that 7 billion people would instantly change their diets to save the planet. Robotics, and even more so artificial intelligence, are now being embraced by venture-backed entrepreneurs to artificially grow meat alternatives as creative gastronomic replacements.


Chilean startup Not Company (NotCo) built a machine learning platform named Giuseppe to search for animal ingredient substitutes. NotCo founder Matias Muchnick explains, “Giuseppe was created to understand molecular connections between food and the human perception of taste and texture.” While Muchnick did not disclose his techniques, he revealed to Business Insider that the company has hired teams of food and data scientists to classify ingredients into bits for Giuseppe. Muchnick explains that the AI then begins the work of processing the “data regarding how the brain works when it’s given certain flavors, when you taste salty, umami, [or] sweet.” Today, the company has a line of egg and milk alternatives on the shelves, including “Not Mayo,” “Not Cheese,” “Not Yogurt,” and “Not Milk.” The NotCo website states that this is only the first step in a larger scheme for the deep learning algorithm: “NotCo currently has a very ambitious development plan for Giuseppe, which includes the generation of new databases with information of a different nature, such as production processes and other molecular properties of food, in such a way that Giuseppe gets closer and closer to be the most advanced chef and food scientist in the world.”
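
NotCo has not published how Giuseppe actually works, but the idea Muchnick describes, mapping molecular data to perceived taste and texture and then proposing plant-based stand-ins, can be illustrated with a toy nearest-neighbor search. Everything in the sketch below (the ingredients, feature names, and values) is invented for illustration; it shows the general technique, not NotCo’s model.

```python
# Hypothetical sketch: finding plant-based substitutes for an animal ingredient
# by nearest-neighbor search over made-up "molecular/taste" feature vectors.
# Feature values and ingredient list are illustrative, not NotCo data.
import math

FEATURES = ["fat", "protein", "umami", "sweetness", "viscosity"]

ANIMAL_TARGET = {"egg_yolk": [0.8, 0.5, 0.4, 0.1, 0.7]}

PLANT_CANDIDATES = {
    "chickpea_brine": [0.1, 0.3, 0.3, 0.1, 0.5],
    "lupin_protein":  [0.2, 0.8, 0.4, 0.1, 0.4],
    "pea_protein":    [0.2, 0.7, 0.3, 0.1, 0.3],
    "coconut_cream":  [0.9, 0.1, 0.1, 0.3, 0.8],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rank_substitutes(target, candidates):
    """Return candidates sorted by similarity to the target profile."""
    return sorted(candidates.items(), key=lambda kv: distance(target, kv[1]))

target = ANIMAL_TARGET["egg_yolk"]
for name, vec in rank_substitutes(target, PLANT_CANDIDATES):
    print(f"{name:15s} distance={distance(target, vec):.2f}")
```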

NotCo competes in a growing landscape of other animal substitute upstarts. Hampton Creek, which recently rebranded as JUST, also offers an array of dairy and egg alternatives from plant-based ingredients. The ultimate test for all these companies is creating meat in a petri dish. When responding to the challenge, JUST announced, “Through a first-of-its-kind partnership, JUST will develop cultured Wagyu beef using cells from Toriyama prized cows. Then, Awano Food Group (a premier international supplier of meat and seafood) will market and sell the meat to clients exactly how they do today with conventionally produced Toriyama Wagyu.” Today, a handful of companies, many ironically backed by livestock corporations, are also tackling the $90 billion cellular agriculture market, including Mosa Meat, Impossible Burger, Beyond Meat, and Memphis Meats. Mosa, backed by Google founder Sergey Brin, unveiled the first synthetic burger in 2013 at a staggering cost of nearly a half million dollars.

While costs are declining, cultured meat is still too expensive to supplement the American diet, especially when $1 still buys a fast food dinner. The key to mass acceptance is attacking the largest pain point in the lab: acquiring enough genetic material from bovine tissue. Currently, the cost of such serums is close to $1,000 an ounce, and they are not exactly cruelty free as they are derived from animals. Many clean meat founders are proudly vegan, with the implicit goal of replacing animal ingredients altogether. In order to accomplish this task, companies like JUST have invested in building robust AI and robotic systems to automatically scour the globe for plant-based alternatives. “Over 300,000 species are in the plant kingdom. That’s over 18 billion proteins, 108 million lipids, and 4 million polysaccharides. It’s an abundance almost entirely unexplored, until now,” exclaims their website. The company boasts that it is on the verge of major discoveries: “The more we explore, the more data we gather along the way. And the faster we’ll find the answers. It’s almost impossible to look at the data and say, ‘Here’s a pattern. Here’s an answer.’ So, we have to come up with algorithms to rank the materials and give downstream experiments a recommendation. In this way, we’re using data to increase the probability of discoveries.”
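
JUST’s description amounts to ranking plant materials so that lab experiments can be prioritized. A minimal sketch of that kind of ranking, scoring hypothetical candidate proteins on predicted functional properties and short-listing the top few for testing, might look like the following; all names, scores, and weights are invented.

```python
# Hypothetical sketch of ranking plant proteins to prioritize lab experiments.
# Property scores and weights are invented for illustration.
import heapq

# predicted functional properties on a 0-1 scale (hypothetical values)
candidates = {
    "mung_bean_8S":    {"gelling": 0.9, "emulsification": 0.7, "abundance": 0.6},
    "canola_napin":    {"gelling": 0.5, "emulsification": 0.8, "abundance": 0.8},
    "sorghum_kafirin": {"gelling": 0.3, "emulsification": 0.4, "abundance": 0.9},
    "fava_legumin":    {"gelling": 0.7, "emulsification": 0.6, "abundance": 0.7},
}

WEIGHTS = {"gelling": 0.5, "emulsification": 0.3, "abundance": 0.2}

def score(props):
    """Weighted sum of predicted functional properties."""
    return sum(WEIGHTS[k] * v for k, v in props.items())

def shortlist(candidates, k=2):
    """Return the k highest-scoring candidates for downstream experiments."""
    return heapq.nlargest(k, candidates.items(), key=lambda kv: score(kv[1]))

for name, props in shortlist(candidates):
    print(f"recommend testing {name} (score={score(props):.2f})")
```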


The next few years will unearth major breakthroughs; Mosa has already announced it will have an affordable product on the shelves by 2021. To accomplish this task, the company turned to Merck’s corporate venture arm, M Ventures, and Bell Food Group to lead its previous financing round. Last July, Forbes reported that the strategic partnerships are critical to Mosa’s vision of mass-producing meat. According to Mosa’s founder, Mark Post, “Merck’s experience with cell cultures is very attractive from a strategic standpoint. Cell production is key to scaling cultured meat production, as they still need to figure out how to get cells to grow more rapidly and at higher numbers. In short, new technology needs to be developed. That’s where companies like Merck can lend a hand.” In addition to leveraging the conglomerate’s expertise in the lab, food powerhouse Bell Food Group provides a huge distribution advantage. Already, Lorenz Wyss, CEO of Bell Food Group, excitedly predicts, “Meat demand is soaring and in the future it won’t be met by livestock agriculture alone. We believe this technology can become a true alternative for environment-conscious consumers, and we are delighted to bring our know-how and expertise of the meat business into this strategic partnership with Mosa Meat.”

While the Green New Deal has been met with skepticism, the twin forces of climate change and technology are steaming ahead. Today, we have the computational and mechatronic power to turn back the tides of destruction and implant positive change across the planet, quite possibly starting with scaling back animal agriculture. Even Winston Churchill commented in 1931, “We shall escape the absurdity of growing a whole chicken in order to eat the breast or wing, by growing these parts separately under a suitable medium.”

Are our food sources and AgTech networks under attack? Learn more at the next RobotLab on “Cybersecurity & Machines” with John Frankel of ffVC and Guy Franklin of SOSA on February 12th in New York City, RSVP Today!

The metaphysical impact of automation

Earlier this month, I crawled into Dr. Wendy Ju‘s autonomous car simulator to explore the future of human-machine interfaces at Cornell Tech’s Tata Innovation Center. Dr. Ju recently moved to the Roosevelt Island campus from Stanford University. While in California, the roboticist became famous for videos capturing people’s reactions to self-driving cars, using students disguised as “ghost drivers” in car-seat costumes. Professor Ju’s work raises serious questions about the metaphysical impact of docility.

Last January, Toyota Research published a report on the neurological effects of speeding. The team displayed images and videos of sports cars racing down highways that produced spikes in brain activity. The study states, “we hypothesized that sensory inputs during high-speed driving would activate the brain reward system. Humans commonly crave sensory inputs that give rise to pleasant sensations, and abundant evidence indicates that the craving for pleasant sensations is associated with activation within the brain reward system.” The brain reward system is directly correlated with the body’s release of dopamine via the ventral tegmental area (VTA). The findings confirmed that levels of activity in the VTA “were stronger in the fast condition than in the slow condition.” Essentially, speeding (which most drivers engage in regardless of laws) is addictive, as the brain rewards such aggressive behaviors with increased levels of dopamine.

As we relegate more driving to machines, the roads are in danger of becoming highways of strung-out dopamine junkies craving new ways to get their fix. Self-driving systems could lead to a marketing battle for in-cabin services pushed by manufacturers, software providers, and media/Internet companies. As an example, Apple filed a patent in August for “an augmented-reality powered windshield system.” This comes two years after Ford filed a similar patent for a display or “system for projecting visual content onto a vehicle’s windscreen.” Both of these filings, along with a handful of others, indicate that the race to capture rider mindshare will be critical to driving the adoption of robocars. Strategy Analytics estimates this “passenger economy” could generate $7 trillion by 2050. Commuters who spend 250 million hours a year in the car are seen by these marketers as a captive audience for new ways to fill dopamine-deprived experiences.

I predict that at next month’s Consumer Electronics Show (CES) in-cabin services will be the lead story coming out of Las Vegas. For example, last week Audi announced a new partnership with Disney to develop innovative ways to entertain passengers. Audi calls the in-cabin experience “The 25th Hour,” which will be further unveiled at CES. Providing a sneak peek into its meaning, CNET interviewed Nils Wollny, head of Audi’s digital business strategy. According to Wollny, the German automobile manufacturer approached Disney 18 months ago to forge a relationship. Wollny explains, “You might be familiar with their Imagineering division [Walt Disney Imagineering], they’re very heavy into building experiences for customers. And they were highly interested in what happens in cars in the future.” He continues, “There will be a commercialization or business approach behind it [for Audi]. I’d call it a new media type that isn’t existing yet that takes full advantage of being in a vehicle. We created something completely new together, and it’s very technologically driven.” When illustrating this vision to CNET’s Roadshow, Wollny directed the magazine to Audi’s fully autonomous concept car design that “blurs the lines between the outside world and the vehicle’s cabin.” This is accomplished by turning windows into screens with digital overlays that simultaneously show media while the outside world rushes by at 60 miles per hour.

Self-driving cars will be judged not by the speed of their engines, but by the comfort of their cabins. Wollny’s description is reminiscent of the marketing efforts of social media companies that succeeded in turning an entire generation into screen addicts. Facebook’s founding president, Sean Parker, admitted recently that the social network was built with the strategy of consuming “as much of your time and conscious attention as possible.” To accomplish this devious objective, Parker confesses that the company exploited a “vulnerability in human psychology.” When you like something or comment on a friend’s photo, Parker boasted, “we… give you a little dopamine hit.” The mobile economy has birthed dopamine experts such as Ramsay Brown, cofounder of Dopamine Labs, which promises app designers increased levels of “stickiness” by aligning game play with the player’s cerebral reward system. Using machine learning, Brown’s technology monitors each player’s activity to deliver the most optimal spike of dopamine. New York Times columnist David Brooks said it best: “Tech companies understand what causes dopamine surges in the brain and they lace their products with ‘hijacking techniques’ that lure us in and create ‘compulsion loops’.”
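
Dopamine Labs’ methods are not public; the “compulsion loop” Brooks describes is commonly explained as a variable-ratio reward schedule, in which rewards arrive unpredictably so the next one always feels imminent. The toy scheduler below sketches that mechanism with invented probabilities; it is not the company’s algorithm.

```python
# Toy sketch of a variable-ratio reward schedule, the mechanism behind
# "compulsion loops": rewards arrive unpredictably, which keeps users engaged.
# Probabilities are invented for illustration.
import random

def should_reward(unrewarded_actions, base_prob=0.15, streak_bonus=0.02):
    """Decide whether to deliver a reward after a user action.

    The chance of a reward rises slowly with consecutive unrewarded
    actions, so a payoff always feels just around the corner.
    """
    prob = min(1.0, base_prob + streak_bonus * unrewarded_actions)
    return random.random() < prob

unrewarded = 0
for action in range(1, 21):
    if should_reward(unrewarded):
        print(f"action {action:2d}: reward!  (after {unrewarded} dry actions)")
        unrewarded = 0
    else:
        unrewarded += 1
```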

The promise of automation is to free humans from dull, dirty, and dangerous chores. The flip side, many espouse, is that artificial intelligence could make us too reliant on technology, idling society. Already, semi-autonomous systems are being cited as a cause of workplace accidents. Andrew Moll of the United Kingdom’s Chamber of Shipping warned that greater levels of automation, by outsourcing decision making to computers, have led to higher levels of maritime collisions. Moll pointed to a recent spate of seafaring incidents: “We have seen increasing integration of ship systems and increasing reliance on computers.” He elaborated that “Humans do not make good monitors. We need to set alarms and alerts, otherwise mariners will not do checks.” Moll exclaimed that technology is increasingly making workers lazy, as many feel a “lack of meaning and purpose” and are suffering from mental fatigue, which is leading to a rise in workplace injuries. “Seafarers would be tired and demotivated when they get to port,” cautioned Moll. These observations are not isolated to shipping; the investigation into the recent fatality in Uber’s autonomous taxi program in Arizona faulted safety-driver fatigue as one of the main causes of the tragedy. In the Pixar movie WALL-E, the future is so automated that humans have lost all motivation to leave their mobile lounge chairs. To avoid this dystopian vision, successful robotic deployments will have to strike the right balance of augmenting the physical, while providing cerebral stimulation.

To better understand the automation landscape, join us at the next RobotLab event on “Cybersecurity & Machines” with John Frankel of ffVC and Guy Franklin of SOSA on February 12th in New York City, RSVP Today!

The end of parking as we know it

A day before snow hindered New York commuters, researchers at the University of Iowa and Princeton identified the growth of urbanization as the leading cause for catastrophic storm damage. Wednesday’s report stated that the $128 billion wake of Hurricane Harvey was 21 times greater due to the population density of Houston, one of America’s fastest growing cities. This startling statistic is even more alarming in light of a recent UN study which reported that 70% of the projected 9.7 billion people in the world will live in urban centers by 2050. Superior urban management is one of the major promises of autonomous systems and smart cities.

Today, one of the biggest headaches for civil planners is the growth of traffic congestion and demand for parking, especially considering cars are among the most inefficient and expensive assets owned by Americans. According to the Governors Highway Safety Association, the average private car is parked 95% of the time. Billions of dollars of real estate in America is dedicated to parking; in a city like Seattle, 40% of the land is consumed by it. Furthermore, INRIX analytics estimates that more than $70 billion is spent by Americans looking for parking, with the average driver wasting $345 a year in time, fuel and emissions. “Parking pain costs much more: New Yorkers spend 107 hours a year looking for parking spots at a cost of $2,243 per driver,” states INRIX.

This month I spoke with Dr. Anuja Sonalker about her plan to save Americans billions of dollars from parking. Dr. Sonalker is the founder of STEER Technologies, a full-service auto-valet platform that is providing autonomous solutions to America’s parking pain. The STEER value proposition uses a sensor array that easily connects to a number of popular automobile models, seamlessly controlled by one’s smartphone. As Dr. Sonalker explains, “Simply put STEER allows a vehicle user to pull over at a curb (at certain destinations), and with a press of a button let the vehicle go find a parking spot and park for you. When it’s time to go, simply summon the vehicle and it comes back to get you.” An added advantage of STEER is its ability to conserve space, as cars can be parked much closer together since computers don’t need to open doors.
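
STEER has not published an API, but the flow Dr. Sonalker describes (pull over, press a button, the car parks itself, summon it later) can be sketched as a simple state machine. The states, method names, and lot data below are hypothetical.

```python
# Hypothetical sketch of the park/summon flow Dr. Sonalker describes:
# drop off at a curb, the car finds a spot, then returns when summoned.
# States and transitions are invented for illustration.
from enum import Enum, auto

class ValetState(Enum):
    AT_CURB = auto()
    SEARCHING = auto()
    PARKED = auto()
    RETURNING = auto()

class ValetCar:
    def __init__(self):
        self.state = ValetState.AT_CURB

    def go_park(self, lot_spots):
        """Driver presses 'park': claim a free spot and drive to it."""
        self.state = ValetState.SEARCHING
        spot = next((s for s, free in lot_spots.items() if free), None)
        if spot is None:
            raise RuntimeError("no free spots in the service lot")
        lot_spots[spot] = False          # spot is now occupied
        self.state = ValetState.PARKED
        return spot

    def summon(self):
        """Driver presses 'summon': car returns to the curb."""
        assert self.state == ValetState.PARKED
        self.state = ValetState.RETURNING
        self.state = ValetState.AT_CURB

car = ValetCar()
spots = {"A1": False, "A2": True, "A3": True}
print("parked at", car.go_park(spots))
car.summon()
print("car state:", car.state.name)
```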

Currently, STEER is piloting its technology near its Maryland headquarters. In describing her early success, Dr. Sonalker boasts, “We have successfully completed testing various scenarios under different weather, and lighting conditions at malls, train stations, airports, parks, construction sites, downtown areas. We have also announced launch dates in late 2019 with the Howard Hughes Corporation to power the Merriweather district – a 4.9 Million square foot new smart development in Columbia, MD, and the BWI airport.” The early showing of STEER’s performance is the result of a proprietary product built for all seasons and topographies. “Last March, we demonstrated live in Detroit under a very fast snowstorm system. Within less than an hour the ground was covered in 2+ inches of snow,” describes Dr. Sonalker. “No lane markings were visible any more, and parking lines certainly were not visible. The STEER car successfully completed its mission to ‘go park’, driving around the parking lot, recognizing other oncoming vehicles, pacing itself accordingly and locating and manoeuvring itself into a parking spot among other parked vehicles in that weather.”

In breaking down the STEER solution, Dr. Sonalker expounds, “The technology is built with a lean sensor suite, its cost equation is very favorable to both after market and integrated solutions for consumer ownership.” She further clarifies, “From a technology stand point both solutions are identical in the feature they provide. The difference lies in how the technology is integrated into the vehicle. For after market, STEER’s technology runs on additional hardware that is retrofitted to the vehicle. In an integrated solution STEER’s technology would be housed on an appropriate ECU driven by vehicle economics and architecture, but with a tight coupling with STEER’s software. The coupling will be cross layered in order to maintain the security posture.” Unlike many self-driving applications that rely heavily on LIDAR (Light Detection And Ranging), STEER uses location mapping of predetermined parking structures along with computer vision. I pressed Dr. Sonalker about her unusual setup, “Yes, it’s true we don’t use LIDAR. You see, STEER started from the principle of security-led design which is where we start from a minimum design, minimum attack surface, maximum default security.”

I continued my interview with Dr. Sonalker to learn how she plans to roll out the platform: “In the long term, we expect to be a feature on new vehicles as they roll out of the assembly line. 2020-2021 seems to be happening based on our current OEM partner targets. Our big picture vision is that people no longer have to think about what to do with their cars when they get to their destination. The equivalent effect of ride sharing – your ride ends when you get off. There will be a network of service points that your vehicle will recognize and go park there until you summon it for use again.” STEER’s solution is part of a growing fleet of new smart city initiatives cropping up across the automation landscape.

At last year’s Consumer Electronics Show, German auto supplier Robert Bosch GmbH unveiled its new crowd-sourcing parking program called “community-based parking.” Using a network of cameras and sensors to identify available parking spots, Bosch’s cloud network automatically directs cars to the closest spot. This is part of Bosch’s larger urban initiative, as the company’s president Mike Mansuetti says, “You could say that our sensors are the eyes and ears of the connected city. In this case, its brain is our software. Of Bosch’s nearly 400,000 associates worldwide, more than 20,000 are software engineers, nearly 20% of whom are working exclusively on the IoT. We supply an open software platform called the Bosch IoT Suite, which offers all the functions necessary to connect devices, users, and companies.”
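
Bosch has not detailed the matching logic behind community-based parking, but the behavior described, crowd-sourced sensors reporting open spots and the cloud steering each car to the nearest one, reduces to a nearest-neighbor assignment. A minimal sketch with invented coordinates:

```python
# Minimal sketch of crowd-sourced spot matching: sensors report open spots,
# and each arriving car is directed to the closest one. Coordinates invented.
import math

open_spots = {           # spot id -> (x, y) in arbitrary city-grid units
    "garage_12": (2.0, 3.5),
    "street_47": (0.5, 1.0),
    "lot_3":     (4.0, 0.5),
}

def closest_spot(car_position, spots):
    """Return the id of the open spot nearest to the car."""
    return min(spots, key=lambda s: math.dist(car_position, spots[s]))

car = (1.0, 1.2)
spot = closest_spot(car, open_spots)
print("directing car to", spot)
del open_spots[spot]     # spot is now claimed and removed from the pool
```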

As the world grapples with a population explosion exacerbated by cars strangling city centers, civil engineers are challenging technologists to reimagine urban communities. Today, most cars sit dormant, and when used they run at one-fifth capacity, with typical trips less than a mile from one’s home (easily accessible on foot or bike). In the words of Dr. Sonalker, “All autonomous technologies will lead to societal change. AV Parking will result in more efficient utilization of existing spaces, fitting more in the same spaces, better use of underutilized remote lots, and frankly, even shunting parking off to further remote locations and using prime space for more enjoyable activities.”

Join the next RobotLab forum discussing “Cybersecurity & Machines” to learn how hackers are attacking the ecosystem of smart cities and autonomous vehicles with John Frankel of ffVC and Guy Franklin of SOSA on February 12th in New York City, RSVP Today!

Accessing the power of quantum computing, today

Two weeks ago, I participated in a panel at the BCI Summit exploring the impact of quantum computing. As a neophyte to the subject, I marveled at the riddle posed by Grover’s Algorithm. Imagine you are assigned to find a contact in a phonebook with a billion names, but all you are given is a telephone number. A classical, binary computer would have to check the entries one at a time, up to a billion operations, while a quantum computer running Grover’s search needs only on the order of the square root of that number of queries, roughly 0.003% of the classical work.
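
The numbers behind that riddle follow from Grover’s quadratic speedup: an unstructured search over N entries needs on the order of √N quantum queries rather than N classical lookups. A quick back-of-the-envelope check (ordinary Python arithmetic, not a quantum implementation) reproduces the roughly 0.003% figure:

```python
# Back-of-the-envelope check of Grover's quadratic speedup for the
# phonebook riddle: ~sqrt(N) quantum queries vs. N classical lookups.
import math

N = 1_000_000_000                      # names in the phonebook
classical_ops = N                      # worst case: check every entry
grover_queries = math.isqrt(N)         # on the order of sqrt(N) oracle calls

print(f"classical lookups : {classical_ops:,}")
print(f"Grover queries    : ~{grover_queries:,}")
print(f"ratio             : {grover_queries / classical_ops:.6%}")  # ~0.003%
```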


The race for robot clairvoyance

This week a Harvard Business School student challenged me to name a startup capable of producing an intelligent robot – TODAY! At first I did not understand the question, as artificial intelligence (AI) is an implement like any other in a roboticist’s toolbox. The student persisted; she demanded to know if I thought that the current co-bots working in factories could one day evolve to perceive the world like humans. It’s a good question that I didn’t appreciate at the time, as robots are best deployed for specific repeatable tasks, even with deep learning systems. By contrast, mortals comprehend their surroundings (and other organisms) using a sixth sense: intuition.


As an avid tennis player, I also enjoyed meeting Tennibot this week. The autonomous ball-gathering robot sweeps the court like a Roomba sucking up dust off a rug. In order to accomplish this task without knocking over players, it navigates around the cage utilizing six cameras on each side. This is a perfect example of the type of job that an unmanned system excels at performing, freeing up athletes from wasting precious court time on tedious cleanup. Yet Tennibot, at the end of the day, is a dumb appliance. While it gobbles up balls quicker than any person, it is unable to discern the quality of the game or the health of the players.

No one expects Tennibot to save Roger Federer’s life, but what happens when a person has a heart attack inside a self-driving car on a two-hour journey? While autonomous vehicles are packed with sensors to identify and safely steer around cities and highways, few are able to perceive human intent. As Ann Cheng of Hyundai explains, “We [drivers] think about what that other person is doing or has the intent to do. We see a lot of AI companies working on more classical problems, like object detection [or] object classification. Perceptive is trying to go one layer deeper—what we do intuitively already.” Hyundai joined Jim Adler’s Toyota AI Ventures this month in investing in Perceptive Automata, an “intuitive self-driving system that is able to recognize, understand, and predict human behavior.”


As stated in Adler’s Medium post, Perceptive’s technology uses “behavioral science techniques to characterize the way human drivers understand the state-of-mind of other humans and then train deep learning models to acquire that human ability. These deep learning models are designed for integration into autonomous driving stacks and next-generation driver assistance systems, sandwiched between the perception and planning layers. These deep learning, predictive models provide real-time information on the intention, awareness, and other state-of-mind attributes of pedestrians, cyclists and other motorists.”
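
Perceptive Automata’s models are proprietary; the post only tells us that they sit between the perception and planning layers and emit state-of-mind estimates for pedestrians, cyclists, and motorists. The schematic sketch below shows what such an interface could look like, with a crude heuristic standing in for the learned model; every class, field, and threshold here is hypothetical.

```python
# Schematic sketch of an intent-estimation stage sandwiched between
# perception and planning in a driving stack. Names and the scoring
# heuristic are hypothetical, not Perceptive Automata's model.
from dataclasses import dataclass

@dataclass
class DetectedPerson:          # output of the perception layer
    track_id: int
    facing_road: bool
    distance_to_curb_m: float

@dataclass
class StateOfMind:             # extra signal handed to the planner
    track_id: int
    crossing_intent: float     # 0 = no intent, 1 = about to cross
    aware_of_vehicle: float    # 0 = unaware, 1 = clearly aware

def estimate_state_of_mind(person: DetectedPerson) -> StateOfMind:
    """Stand-in for a learned model: a crude heuristic on perception output."""
    intent = 0.9 if (person.facing_road and person.distance_to_curb_m < 1.0) else 0.2
    awareness = 0.7 if person.facing_road else 0.3
    return StateOfMind(person.track_id, intent, awareness)

def plan_speed(base_speed_mps: float, minds: list) -> float:
    """Planner slows down when anyone nearby looks likely to step out."""
    risk = max((m.crossing_intent for m in minds), default=0.0)
    return base_speed_mps * (1.0 - 0.5 * risk)

people = [DetectedPerson(1, True, 0.4), DetectedPerson(2, False, 3.0)]
minds = [estimate_state_of_mind(p) for p in people]
print(f"adjusted speed: {plan_speed(13.4, minds):.1f} m/s")
```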

While Perceptive Automata is creating “predictive models” for outside the vehicle, few companies are focused on the conditions inside the cabin. The closest implementations are a number of eye-tracking cameras that alert occupants to distracted driving. While these technologies observe the general conditions of passengers, they rely on direct eye contact to distinguish between emotions (fatigue, excitability, stress, etc.), which is impossible if one is passed out. Furthermore, none of these vision systems have the ability to predict human actions before they become catastrophic.

Isaac Litman, formerly of Mobileye, understands full well the dilemma computer vision systems present in delivering on the promise of autonomous travel. In speaking with Litman this week about his newest venture, Neteera, he declared that in today’s automotive landscape “the only unknown variable is the human.” Unfortunately, the recent wave of Tesla and Uber autopilot crashes has glaringly illustrated the importance of tracking the attention of vehicle occupants when handing off between autopilot systems and human drivers. Litman further explains that Waymo and others are collecting data on occupant comfort, as AI-enabled drivers have reportedly led to high levels of nausea from driving too consistently. Litman describes this as the indigestion problem, clarifying that after eating a big meal one may want to drive more slowly than on an empty stomach. In the future, Litman professes, autonomous cars will be marketed “not by the performance of their engines, but on the comfort of their rides.”


Litman’s view is further endorsed by the patent application filed this summer by Apple’s Project Titan team for developing “Comfort Profiles” for autonomous driving. According to AppleInsider, the application “describes how an autonomous driving and navigation system can move through an environment, with motion governed by a number of factors that are set indirectly by the passengers of the vehicle.” The Project Titan system would utilize a fusion of sensors (LIDAR, depth cameras, and infrared) to monitor the occupants’ “eye movements, body posture, gestures, pupil dilation, blinking, body temperature, heart beat, perspiration, and head position.” The application details how the data would integrate into the vehicle systems to automatically adjust the acceleration, turning rate, performance, suspension, traction control and other factors to the personal preferences of the riders. While Project Titan is taking a first step toward developing an autonomous comfort system, Litman notes that it is limited by the inherent shortcomings of vision-based systems, which are susceptible to light, dust, line of sight, condensation, motion, resolution, and safety concerns.
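
The filing describes occupant signals feeding back into the vehicle’s motion settings. As a rough illustration of that idea (not the method in Apple’s application), a comfort profile can be reduced to a mapping from measured stress indicators to gentler acceleration and cornering limits. Note that most of the monitored signals come from camera-style sensing, which is precisely the weakness Litman points to.

```python
# Rough illustration of a "comfort profile": occupant stress indicators
# scale back acceleration and cornering limits. Thresholds and limits
# are invented; this is not the method in Apple's patent filing.
def comfort_adjusted_limits(heart_rate_bpm, blink_rate_hz, preferred_style="normal"):
    """Return (max_accel m/s^2, max_lateral_accel m/s^2) for the planner."""
    base = {"relaxed": (1.5, 1.8), "normal": (2.5, 2.8), "sporty": (3.5, 3.8)}
    max_a, max_lat = base[preferred_style]

    stress = 0.0
    if heart_rate_bpm > 100:       # elevated heart rate -> drive more gently
        stress += 0.3
    if blink_rate_hz > 0.6:        # rapid blinking as a crude anxiety proxy
        stress += 0.2

    scale = max(0.5, 1.0 - stress)
    return max_a * scale, max_lat * scale

print(comfort_adjusted_limits(heart_rate_bpm=112, blink_rate_hz=0.7))
```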

Unlike vision sensors, Neteera is a cost-effective micro-radar on a chip that leverages its own network of proprietary algorithms to provide “the first contact free vital sign detection platform.” Its FDA-level accuracy is being utilized not only by the automotive sector, but by healthcare systems across the United States for monitoring such elusive conditions as sleep apnea and sudden infant death syndrome. To date, the challenge of monitoring vital signs through micro-skin motion in the automotive industry has been the displacement caused by a moving vehicle. However, Litman’s team has developed a patent-pending “motion compensation algorithm” that tracks “quasi-periodic signals in the presence of massive random motions,” providing near perfect accuracy.
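
Neteera’s motion-compensation algorithm is patent-pending and unpublished. The underlying problem, recovering a quasi-periodic vital sign from a displacement signal buried in noise, is often approached by looking for a spectral peak inside the physiological frequency band, as in the synthetic-data sketch below (not Neteera’s method):

```python
# Minimal sketch: recover a breathing-like quasi-periodic signal from noisy
# chest-displacement data by finding the spectral peak in the physiological
# band. Synthetic data; this is not Neteera's motion-compensation algorithm.
import numpy as np

fs = 50.0                                   # sample rate, Hz
t = np.arange(0, 30, 1 / fs)                # 30 s of samples
true_rate_hz = 0.25                         # 15 breaths per minute
displacement = (0.5 * np.sin(2 * np.pi * true_rate_hz * t)   # chest motion
                + 2.0 * np.random.randn(t.size))              # cabin noise

spectrum = np.abs(np.fft.rfft(displacement - displacement.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

band = (freqs > 0.1) & (freqs < 0.7)        # plausible respiration band
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated respiration rate: {peak_hz * 60:.1f} breaths/min")
```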


While the automotive industry races to launch fleets of autonomous vehicles, Litman estimates that the most successful players will be the ones that install empathic engines into their machines’ framework. Unlike the crowded field of AI and computer vision startups that are enabling robocars to safely navigate city streets, Neteera’s “intuition on a chip” is probably one of the only mechatronic ventures that actually reports on the psychological state of drivers and passengers. Litman’s innovation has wider societal implications as social robots begin to augment humans in the workplace and support the infirm and elderly in coping with the fragility of life.

As scientists improve artificial intelligence, it is still unclear what the reaction will be from ordinary people to such “emotional” robots. In the words of writer Adam Williams, “Emotion is something we reserve for ourselves: depth of feeling is what we use to justify the primacy of human life. If a machine is capable of feeling, that doesn’t make it dangerous in a Terminator-esque fashion, but in the abstract sense of impinging on what we think of as classically human.”

Educating the workforce of the future, in the age of accelerations

I have two kids in college, and one of my biggest concerns is that the knowledge they have labored hard to acquire will become obsolete by the time they graduate. Our age is driven by the hypersonic accelerations of technology and data, forcing innovative educators to create new pedagogical systems that empower students today with the skills to lead tomorrow.
The new CyberNYC initiative announced last week by the City of New York is just one example of this growing partnership between online platforms and traditional academia in the hope of fostering a new generation of wage earners.

The goal of CyberNYC is to train close to 5% of the city’s working population to become “cyber specialists.” In order to accomplish this lofty objective, the NYCEDC forged an educational partnership with CUNY, NYU, Columbia, Cornell Tech, and iQ4. One of the most compelling aspects of the partnership is the advanced degree program offered by CUNY and Facebook, enabling students to achieve a master’s in computer science in just a year through the online educational hosting site edX, which also enables users to stack credentials from other universities.

As Anant Agarwal, CEO of edX, explains, “The workplace is changing more rapidly today than ever before and employers are in need of highly-developed talent. Meanwhile, college graduates want to advance professionally, but are realizing they do not have the career-relevant skills that the modern workplace demands. EdX recognizes this mismatch between business and education for learners, employees and employers. The MicroMasters initiative provides the next level of innovation in learning to address this skills gap by creating a bridge between higher education and industry to create a skillful, successful 21st-century workforce.”

Realizing that not everyone is cut out for higher education, the Big Apple is also working to create boot camps to upskill existing tech operators in a matter of weeks with industry-specific cyber competencies. Fullstack Academy is leading the effort to create a catalogue of intensive boot camps throughout the boroughs. LaGuardia Community College (LAGCC) is also providing free preparatory courses for adults with minimum computing proficiency in order to qualify for Fullstack’s programs. Most importantly, LAGCC will act as a liaison to CyberNYC’s corporate partners to match graduates with open positions.

In 2012, Sebastian Thrun famously declared that “access to higher education should be a basic human right.” Thrun, who left his position running Google X to “democratize education” worldwide by launching a free online open-university platform, Udacity, is now transforming the learning paradigm. The German inventor is no stranger to innovation; in 2011 he unveiled at the TED Conference one of the first self-driving cars, a project inspired by losing his best friend in a car accident. Similar to CyberNYC’s fast-track master’s in computer science, Udacity has teamed up with AT&T and Georgia Tech to offer similar degrees for less than $7,000 (compared to $26,860 for an on-campus program).

In the words of AT&T’s Chief Executive Randall Stephenson, “We believe that high-quality and 100 percent online degrees can be on par with degrees received in traditional on-campus settings, and that this program could be a blueprint for helping the United States address the shortage of people with STEM degrees, as well as exponentially expand access to computer science education for students around the world.”

In 2003, Reid Hoffman launched LinkedIn, the first business social network. Today, there are more than half a billion profiles (resumes) posted on the site. Last March, Hoffman sat down with University of California President (and former Secretary of Homeland Security) Janet Napolitano to discuss the future of education. The leading advocate for entrepreneurship explained that he believes everyone should be “in permanent beta,” constantly consuming information. Hoffman states that this is the only way an individual, and a society, will be able to compete in a world driven by data and artificial intelligence. Universities like the UC system, Hoffman suggests, should move toward a cross-disciplinary system. As Hoffman espouses, “What we’re actually in fact primarily teaching is that learning how to learn as you get to new areas, not areas where it’s necessarily the apprenticeship model, which is we teach you this thing and you know how to do this one thing. You know how to do this thing really well, but actually, in fact, you’re going to be crossing domains. That’s how I would somewhat shift the focus overall in terms of thinking about it.”

In his book “Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations,” Thomas Friedman quotes Nest Labs’ founder, Tony Fadell, as asserting that the future economy rests on businesses’ ability to turn “AI into IA,” or “Intelligent Assistants.” Friedman specifically singled out LinkedIn as one of these IAs, creating human networks that amplify people in finding the best opportunities and the most in-demand skills. In order to utilize this IA, Hoffman advised Napolitano’s audience to be versatile: “As opposed to thinking about this as a career ladder, a career escalator, to think of it as more of a career jungle gym, that you’re actually going to be changing around in terms industries. The exact shape of certain different job professions will change, and that you need to be adaptive with that.” He continued, “I do think that the notion that is still too often preached, which is you go to college, you discover your calling, and that’s your job for the next 50 years, that’s gone.” The key to harnessing this trend, says Hoffman, is “to constantly be learning and to be learning new things. Some of them by taking ongoing classes but some of them also by doing, and talking to people and finding out what the relevant things are, and then tracking what’s going on.”

All these efforts are not happening fast enough in the United States to fill the current gap between the 6.9 million job openings and the number of unemployed. While the unemployment rate is at a forty-year low, with 6.2 million workers out of work, there is still a significant disparity with more open job listings than job seekers. The primary reason cited by business leaders across the nation is that the current class of applicants lacks the versatility of skills required for the modern workplace, resulting in the push towards full automation. Eric Schmidt, former Executive Chairman of Google and Alphabet, claims that “Today we all live and work in a new era, the Internet Century, where technology is roiling the business landscape and the pace of change is accelerating.” This Internet Century (and by extension cloud computing and unmanned systems) requires a new type of worker, which he affectionately calls the “smart creative,” first and foremost an “adaptive learner.”

The deficiency of graduating “smart creatives” could be the reason why America, though almost at full employment, is still producing historically low output, resulting in stagnant wages. Mark Zandi, Moody’s Chief Economist, explains, “Wage growth feels low by historical standards and that’s largely because productivity growth is low relative to historical standards. Productivity growth between World War II and up through the Great Recession was, on average, 2 percent per annum. Since the recession 10 years ago, it’s been 1 percent.” The virtuous efforts of CyberNYC, and other grassroots initiatives, are only the first of many steps towards the complete restructuring of America’s educational framework to nurture a culture of smart creatives that are in permanent beta.

New York: The gateway to industry 4.0

As Hurricane Florence raged across the coastline of North Carolina, 600 miles north the 174th Attack Wing National Guard base in Syracuse, New York was on full alert. Governor Cuomo had just hung up with Defense Secretary Mattis to ready the airbase’s MQ-9 drone force to “provide post-storm situational awareness for the on-scene commanders and emergency personnel on the ground.” Suddenly, the entire country turned to the Empire State as the epicenter for unmanned search & rescue operations.


Located a few miles from the 174th is the Genius NY Accelerator, which boasts the largest competition for unmanned systems in the world. Previous winners of its one-million-dollar prize include AutoModality and FotoKite. One of Genius’ biggest financial backers is Empire State Development (ESD). Last month, I moderated a discussion in New York City between Sharon Rutter of the ESD, Peter Kunz of Boeing HorizonX and Victor Friedberg of FoodShot Global. These three investors spanned the gamut of early stage funders of autonomous machines. I started our discussion by asking if they think New York is poised to take a leading role in shaping the future of automation. While Kunz and Friedberg shared their own perspectives as corporate and social impact investors respectively, Rutter singled out one audience participant in particular as representing the future of New York’s innovation venture scene.

Andrew Hong of ff Venture Capital sat quietly in front of the presenters, yet his firm has been loudly reshaping the Big Apple’s approach to investing in mechatronics for almost a decade (with the ESD as a proud limited partner). Founded in 2008 by John Frankel, formerly of Goldman Sachs, ff has deployed capital in more than 100 companies with market values of over $6 billion. As the original backer of crowd-funding site Indiegogo, ff could be credited as a leading contributor to a new suite of technologies. As Frankel explains, “We like hardware if it is a vector to selling software, as recurring models based on services lead to better economics for us than one-off hardware sales.” In the spirit of fostering greater creativity for artificial intelligence software, ff collaborated with New York University in 2016 to start the NYU/ffVC AI NexusLab, the country’s first AI accelerator program between a university and a venture fund, which culminated in the Future Labs AI Summit in 2017. Frankel describes how this technology is influencing the future of autonomy: “As we saw that AI was coming into its own we looked at AI application plays and that took us deeper into cyber security, drones and robotics. In addition, both drones and robotics benefited as a byproduct of the massive investment into mobile phones and their embedded sensors and radios. Thus we invested in a number of companies in the space (Skycatch, PlusOne Robotics, Cambrian Intelligence and TopFlight Technologies) and continue to look for more.”

Recently, ffVC bolstered its efforts to support the growth of an array of cognitive computing systems by opening a new state-of-the-art headquarters in the Empire State Building and expanding its venture partner program. In addition to providing seed capital to startups, ffVC has distinguished itself for more than a decade by augmenting technical founders with robust back-office services, especially accounting and financial management. Last year, ff also widened its industry venture partner program with the addition of Dr. Kathryn Hume to its network. Dr. Hume is probably best known for her work as the former president of Fast Forward Labs, a leading advisor to Fortune 500 companies on utilizing data science and artificial intelligence. I am pleased to announce that I have decided to join Dr. Hume and the ff team as a venture partner to widen their network in the robotics industry. I share Frankel’s vision that today we are witnessing “massive developments in AI and ML that had led to unprecedented demand for automation solutions across every industry.”

 

ff’s commitment is not an isolated example across the Big Apple but part of a growing invigorated community of venture capitalists, academics, inventors, and government sponsors. In a few weeks, New York City Economic Development Corporation (NYCEDC) will officially announce the winner of a $30 million investment grant to boost the city’s cybersecurity ecosystem. CyberNYC will include a new startup accelerator, city-wide programming, educational curricula, up-skilling/job placement, and a funding network for home-grown ventures. As NYCEDC President and CEO James Patchett explains, “The de Blasio Administration is investing in cybersecurity to both fuel innovation, and to create new, accessible pathways to jobs in the industry. We’re looking for big-thinking proposals to help us become the global capital of cybersecurity and to create thousands of good jobs for New Yorkers.” The Mayor’s office projects that its initiative will create 100,000 new jobs over the next ten years, enabling NYC to fully maximize the opportunities of an autonomous world.

The inspiration for CyberNYC could probably be found in the sands of the Israeli desert town of Beer Sheva. In the past decade, this Bedouin city in the Holy Land has been transformed from tents into a high-tech engine for cybersecurity, remote sensing and automation technologies. At the center of this oasis is Cyber Labs, a government-backed incubator created by Jerusalem Venture Partners (JVP). Next week, JVP will kick off its New York City “Hub” with a $1 million competition called “New York Play” to bridge the opportunities between Israeli and NYC entrepreneurship. In the words of JVP’s Chairman and Founder, Erel Margalit, “JVP’s expansion to New York and the launch of New York Play are all about what’s possible. As New York becomes America’s gateway for international collaboration and innovation, JVP, at the center of the ‘Startup Nation,’ will play a significant role boosting global partnerships to create solutions that better the world and drive international business opportunities.”

Looking past the skyscrapers, I reflect on Margalit’s image of New York as a “Gateway” to the future of autonomy.  Today, the wheel of New York City is turning into a powerful hub, connected throughout America’s academic corridor and beyond, with spokes shooting in from Boston, Pittsburgh, Philadelphia, Washington DC, Silicon Valley, Europe, Asia and Israel. The Excelsior State is pulsing with entrepreneurial energy fostered by the partnerships of government, venture capital, academia and industry. As ff VC’s newest venture partner, I personally am excited to play a pivotal role in helping them harness the power of acceleration for the benefit of my city and, quite possibly, the world.

Come learn how New York’s Retail industry is utilizing robots to drive sales at the next RobotLab on “Retail Robotics” with Pano Anthos of XRC Labs and Ken Pilot, formerly President of Gap on October 17th, RSVP today.

WAZE for drones: expanding the national airspace

Sitting in New York City, looking up at the clear June skies, I wonder if I am staring at an endangered phenomenon. According to many in the Unmanned Aircraft Systems (UAS) industry, skylines across the country will soon be filled with flying cars, quadcopter deliveries, emergency drones, and other robo-flyers. Moving one step closer to this mechanically-induced hazy future, General Electric (GE) announced last week the launch of AiRXOS, a “next generation unmanned traffic” management system.

Managing the National Airspace is already a political football, with the Trump Administration proposing to privatize the air traffic control division of the Federal Aviation Administration (FAA) and take its controller workforce of 15,000 off the government’s books. The White House argues that this would enable the FAA to modernize and adopt “NextGen” technologies to speed commercial air travel. While this budgetary line item is debated in the halls of Congress, one certainty inside the FAA is that the National Airspace (NAS) will have to expand to make room for an increasing volume of commercial and recreational traffic, the majority of which will be unmanned.

Ken Stewart, the General Manager of AiRXOS, boasts, “We’re addressing the complexity of integrating unmanned vehicles into the national airspace. When you’re thinking about getting a package delivered to your home by drone, there are some things that need to be solved before we can get to that point.” The first step for the new division of GE is to pilot the system in a geographically-controlled airspace. To accomplish this task, DriveOhio’s UAS Center invested millions in the GE startup. Accordingly, the first test deployment of AiRXOS will be conducted over a 35 mile stretch of Ohio’s Interstate 33 by placing sensors along the road to detect and report on air traffic. GE states that this trial will lay the foundation for the UAS industry. As Alan Caslavka, president of Avionics at GE Aviation, explains, “AiRXOS is addressing the rapid changes in autonomous vehicle technology, advanced operations, and in the regulatory environment. We’re excited for AiRXOS to help set the standard for autonomous and manned aerial vehicles to share the sky safely.”

Stewart whimsically calls his new air traffic control platform WAZE for drones. Like the popular navigation app, AiRXOS provides drone operators with real-time flight-planning data to automatically avoid obstacles, other aircraft, and route around inclement weather. The company also plans to integrate with the FAA to streamline regulatory communications with the agency. Stewart explains that this will speed up authorizations as today, “It’s difficult to get [requests] approved because the FAA hasn’t collected enough data to make a decision about whether something is safe or not.” 
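
AiRXOS has not published its routing logic, but the Waze-like behavior described, planning a flight path that stays clear of obstacles, other aircraft, and bad weather, can be illustrated with a simple grid search that treats those hazards as blocked cells. The hazard map below is invented:

```python
# Toy sketch of Waze-style drone routing: plan a path over a grid while
# treating known hazards (weather cells, other traffic, no-fly zones) as
# blocked. BFS on an invented hazard map; not AiRXOS's routing logic.
from collections import deque

GRID = [                       # 0 = clear airspace, 1 = hazard to avoid
    [0, 0, 0, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0],
]

def plan_route(start, goal):
    """Breadth-first search for a shortest hazard-free path."""
    rows, cols = len(GRID), len(GRID[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and GRID[nr][nc] == 0 \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no hazard-free route

print(plan_route(start=(0, 0), goal=(3, 4)))
```
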
NASA is a key partner in counseling the FAA on integrating commercial UAS into the NAS. Charged with removing the “technical and regulatory barriers that are limiting the ability for civil UAS to fly in the NAS” is Davis Hackenberg of NASA’s Armstrong Flight Research Center. Last year, we invited Hackenberg to present his UAS vision to RobotLabNYC. Hackenberg shared with the packed audience NASA’s multi-layered approach to parsing the skies for a wide range of aircraft, including high-altitude long-endurance flights, commercial airliners, small recreational craft, quadcopter inspections, drone deliveries and urban aerial transportation. Recently the FAA instituted a new regulation mandating that aircraft be equipped with Automatic Dependent Surveillance-Broadcast (ADS-B) systems by January 1, 2020. The FAA calls such equipment “foundational NextGen technology that transforms aircraft surveillance using satellite-based positioning,” essentially connecting human-piloted craft to computers on the ground and, quite possibly, in the sky. Many believe this is a critical step towards delivering on the long-awaited promise of the commercial UAS industry with autonomous beyond-visual-line-of-sight flights.

I followed up this week with Hackenberg about the news of AiRXOS and the new FAA guidelines. He explained, “For aircraft operating in an ADS-B environment, testing the cooperative exchange of information on position and altitude (and potentially intent) still needs to be accomplished in order to validate the accuracy and reliability necessary for a risk-based safety case.” Hackenberg continued to describe how ADS-B might not help low altitude missions, “For aircraft operating in an environment where many aircraft are not transmitting position and altitude (non-transponder equipped aircraft), developing low cost/weight/power solutions for DAA [Detect and Avoid] and C2 [Command and Control Systems] is critical to ensure that the unmanned aircraft can remain a safe distance from all traffic. Finally, the very low altitude environment (package delivery and air taxi) will need significant technology development for similar DAA/C2 solutions, as well as certified much more (e.g. vehicles to deal with hazardous weather conditions).” The Deputy Project Manager then shared with me his view of the future, “In the next five years, there will be significant advancements in the introduction of drone deliveries. The skies will not be ‘darkened,’ but there will likely be semi-routine service to many areas of the country, particularly major cities. I also believe there will be at least a few major cities with air taxi service using optionally piloted vehicles within the 10-year horizon. Having the pilot onboard in the initial phase may be a critical stepping-stone to gathering sufficient data to justify future safety cases. And then hopefully soon enough there will be several cities with fully autonomous taxi service.”
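
Hackenberg’s DAA (detect and avoid) requirement boils down to keeping every unmanned aircraft a safe distance from all other traffic. A minimal sketch of such a separation check over broadcast position reports follows; the tracks and thresholds are invented for illustration.

```python
# Minimal sketch of a detect-and-avoid (DAA) separation check: flag any
# pair of aircraft closer than a safety bubble. Tracks and thresholds
# are invented for illustration.
import itertools, math

SAFE_HORIZONTAL_M = 500.0
SAFE_VERTICAL_M = 100.0

tracks = {                     # id -> (x_m, y_m, altitude_m)
    "drone_A":  (0.0,    0.0,  120.0),
    "drone_B":  (350.0, 200.0, 150.0),
    "heli_N12": (5000.0, 800.0, 300.0),
}

def conflicts(tracks):
    """Yield pairs violating both horizontal and vertical separation."""
    for (a, pa), (b, pb) in itertools.combinations(tracks.items(), 2):
        horiz = math.hypot(pa[0] - pb[0], pa[1] - pb[1])
        vert = abs(pa[2] - pb[2])
        if horiz < SAFE_HORIZONTAL_M and vert < SAFE_VERTICAL_M:
            yield a, b, horiz, vert

for a, b, h, v in conflicts(tracks):
    print(f"ALERT: {a} and {b} separated by {h:.0f} m horizontally / {v:.0f} m vertically")
```
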
Last month, Uber ambitiously declared at its Elevate Summit that its aerial ride-hail program will begin shuttling humans by 2023. Uber plans to deploy electric vertical take-off and landing (eVTOL) vehicles throughout major metropolitan areas. “Ultimately, where we want to go is about urban mobility and urban transport, and being a solution for the cities in which we operate,” says Uber CEO Dara Khosrowshahi. Uber has been cited by many civil planners as the primary cause of increased urban congestion. Its eVTOL plan, called uberAIR, is aimed at alleviating terrestrial vehicle traffic by offsetting commutes with autonomous air taxis centrally located on rooftops throughout city centers.

One of Uber’s first test locations for uberAIR will be Dallas-Fort Worth, Texas. Tom Prevot, Uber’s Director of Engineering for Airspace Systems, describes the company’s effort to design a Dynamic Skylane Network of virtual lanes for its eVTOLs to travel: “We’re designing our flight paths essentially to stay out of the scheduled air carriers’ flight paths initially. We do want to test some of these concepts of maybe flying in lanes and flying close to each other but in a very safe environment, initially.” To accomplish these objectives, Prevot’s group signed a Space Act Agreement with NASA to determine the requirements for its aerial ride-share network. Using Uber’s data, NASA is already simulating small-passenger flights around the Texas city to identify potential risks to an already crowded airspace.

After the Elevate conference, media reports hyped the imminent arrival of flying taxis. Rodney Brooks (considered by many the godfather of robotics) responded with a tweet: “Headline says ‘prototype’, story says ‘concept’. This is a big difference, and symptomatic of stupid media hype. Really!!!” Dan Elwell, the FAA’s Acting Administrator, was much more subdued in his opinion of how quickly the technology will arrive: “Well, we’ll see.”

Editor’s Note: This week we will explore regulating unmanned systems further with Democratic Presidential Candidate Andrew Yang and New York State Assemblyman Clyde Vanel at the RobotLab forum on “The Politics Of Automation” in New York City. 

WAZE for drones: expanding the national airspace

Sitting in New York City, looking up at the clear June skies, I wonder if I am staring at an endangered phenomena. According to many in the Unmanned Aircraft Systems (UAS) industry, skylines across the country soon will be filled with flying cars, quadcopter deliveries, emergency drones, and other robo-flyers. Moving one step closer to this mechanically-induced hazy future, General Electric (GE) announced last week the launch of AiRXOS, a “next generation unmanned traffic” management system.

Managing the National Airspace is already a political football with the Trump Administration proposing privatizing the air-control division of the Federal Aviation Administration (FAA), taking its controller workforce of 15,000 off the government’s books. The White House argues that this would enable the FAA to modernize and adopt “NextGen” technologies to speed commercial air travel. While this budgetary line item is debated in the halls of Congress, one certainty inside the FAA is that the National Airspace (NAS) will have to expand to make room for an increased amount of commercial and recreational traffic, the majority of which will be unmanned.

Ken Stewart, the General Manager of AiRXOS, boasts, “We’re addressing the complexity of integrating unmanned vehicles into the national airspace. When you’re thinking about getting a package delivered to your home by drone, there are some things that need to be solved before we can get to that point.” The first step for the new division of GE is to pilot the system in a geographically-controlled airspace. To accomplish this task, DriveOhio’s UAS Center invested millions in the GE startup. Accordingly, the first test deployment of AiRXOS will be conducted over a 35 mile stretch of Ohio’s Interstate 33 by placing sensors along the road to detect and report on air traffic. GE states that this trial will lay the foundation for the UAS industry. As Alan Caslavka, president of Avionics at GE Aviation, explains, “AiRXOS is addressing the rapid changes in autonomous vehicle technology, advanced operations, and in the regulatory environment. We’re excited for AiRXOS to help set the standard for autonomous and manned aerial vehicles to share the sky safely.”

Stewart whimsically calls his new air traffic control platform WAZE for drones. Like the popular navigation app, AiRXOS provides drone operators with real-time flight-planning data to automatically avoid obstacles, other aircraft, and route around inclement weather. The company also plans to integrate with the FAA to streamline regulatory communications with the agency. Stewart explains that this will speed up authorizations as today, “It’s difficult to get [requests] approved because the FAA hasn’t collected enough data to make a decision about whether something is safe or not.” 
Screen Shot 2018-06-08 at 6.04.57 PM.png
NASA is a key partner in counseling the FAA in integrating commercial UAS into the NAS. Charged with removing the “technical and regulatory barriers that are limiting the ability for civil UAS to fly in the NAS” is Davis Hackenberg of NASA’s Armstrong Flight Research Center. Last year, we invited Hackenberg to present his UAS vision to RobotLabNYC. Hackenberg shared with the packed audience NASA’s multi-layered approach to parsing the skies for a wide-range of aircrafts, including: high altitude long endurance flights, commercial airliners, small recreational craft, quadcopter inspections, drone deliveries and urban aerial transportation. Recently the FAA instituted a new regulation mandating that all aircrafts be equipped with Automatic Dependent Surveillance-Broadcast (ADS-B) systems by January 1, 2020. The FAA calls such equipment “foundational NextGen technology that transforms aircraft surveillance using satellite-based positioning,” essentially connecting human-piloted craft to computers on the ground and, quite possibly, in the sky. Many believe this is a critical step towards delivering on the long-awaited promise of the commercial UAS industry with autonomous beyond visual line of sight flights.

I followed up this week with Hackenberg about the news of AiRXOS and the new FAA guidelines. He explained, “For aircraft operating in an ADS-B environment, testing the cooperative exchange of information on position and altitude (and potentially intent) still needs to be accomplished in order to validate the accuracy and reliability necessary for a risk-based safety case.” Hackenberg continued to describe how ADS-B might not help low altitude missions, “For aircraft operating in an environment where many aircraft are not transmitting position and altitude (non-transponder equipped aircraft), developing low cost/weight/power solutions for DAA [Detect and Avoid] and C2 [Command and Control Systems] is critical to ensure that the unmanned aircraft can remain a safe distance from all traffic. Finally, the very low altitude environment (package delivery and air taxi) will need significant technology development for similar DAA/C2 solutions, as well as certified much more (e.g. vehicles to deal with hazardous weather conditions).” The Deputy Project Manager then shared with me his view of the future, “In the next five years, there will be significant advancements in the introduction of drone deliveries. The skies will not be ‘darkened,’ but there will likely be semi-routine service to many areas of the country, particularly major cities. I also believe there will be at least a few major cities with air taxi service using optionally piloted vehicles within the 10-year horizon. Having the pilot onboard in the initial phase may be a critical stepping-stone to gathering sufficient data to justify future safety cases. And then hopefully soon enough there will be several cities with fully autonomous taxi service.”
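
To make the detect-and-avoid idea concrete, here is a toy Python sketch that screens ADS-B-style traffic reports against an unmanned aircraft’s own position using great-circle distance. The message fields and the half-nautical-mile and 500-foot buffers are illustrative assumptions on my part, not FAA separation standards or NASA’s DAA logic.

```python
# Toy detect-and-avoid check against ADS-B-style traffic reports. The message
# fields and the 0.5 NM / 500 ft thresholds are illustrative assumptions, not
# FAA separation standards or NASA's DAA logic.
from math import radians, sin, cos, asin, sqrt

NM_PER_KM = 0.539957

def haversine_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * asin(sqrt(a)) * 6371.0 * NM_PER_KM

def conflicts(ownship, traffic, horiz_nm=0.5, vert_ft=500):
    """Return traffic reports inside both the horizontal and vertical buffer."""
    hits = []
    for t in traffic:
        d = haversine_nm(ownship["lat"], ownship["lon"], t["lat"], t["lon"])
        if d < horiz_nm and abs(ownship["alt_ft"] - t["alt_ft"]) < vert_ft:
            hits.append((t["icao"], round(d, 2)))
    return hits

ownship = {"lat": 40.0, "lon": -83.0, "alt_ft": 400}
traffic = [
    {"icao": "A1B2C3", "lat": 40.004, "lon": -83.001, "alt_ft": 600},
    {"icao": "D4E5F6", "lat": 40.2, "lon": -83.3, "alt_ft": 3500},
]
print(conflicts(ownship, traffic))   # only the nearby, co-altitude target is flagged
```
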
Last month at its Elevate Summit, Uber ambitiously declared that its aerial ride-hail program will begin shuttling humans by 2023. Uber plans to deploy electric vertical take-off and landing (eVTOL) vehicles throughout major metropolitan areas. “Ultimately, where we want to go is about urban mobility and urban transport, and being a solution for the cities in which we operate,” says Uber CEO Dara Khosrowshahi. Uber has been cited by many civil planners as a primary cause of increased urban congestion. Its eVTOL plan, called uberAIR, aims to alleviate terrestrial vehicle traffic by offsetting commutes with autonomous air taxis operating from rooftops throughout city centers.

One of Uber’s first test locations for uberAIR will be Dallas-Fort Worth, Texas. Tom Prevot, Uber’s Director of Engineering for Airspace Systems, describes the company’s effort to design a Dynamic Skylane Network of virtual lanes for its eVTOLs to travel: “We’re designing our flight paths essentially to stay out of the scheduled air carriers’ flight paths initially. We do want to test some of these concepts of maybe flying in lanes and flying close to each other but in a very safe environment, initially.” To accomplish these objectives, Prevot’s group signed a Space Act Agreement with NASA to determine the requirements for its aerial ride-share network. Using Uber’s data, NASA is already simulating small-passenger flights around the Texas metroplex to identify potential risks to an already crowded airspace.

After the Elevate conference, media reports hyped the imminent arrival of flying taxis. Rodney Brooks (considered by many the godfather of robotics) responded with a tweet: “Headline says ‘prototype’, story says ‘concept’. This is a big difference, and symptomatic of stupid media hype. Really!!!” Dan Elwell, the FAA’s Acting Administrator, was far more subdued in his assessment of how quickly the technology will arrive: “Well, we’ll see.”

Editor’s Note: This week we will explore regulating unmanned systems further with Democratic Presidential Candidate Andrew Yang and New York State Assemblyman Clyde Vanel at the RobotLab forum on “The Politics Of Automation” in New York City. 

New battery technology is accelerating autonomy and saving the environment

If the robotics world had a celebrity, it would be Spot Mini of Boston Dynamics. Last month at the Robotics Summit in Boston, the mechanical dog strutted onto the floor of the Westin Hotel trailed by hundreds of flickering iPhones. Marc Raibert first unveiled his metal menagerie almost a decade ago with a video of BigDog. Today, Mini is the fulfillment of his mission in a sleeker, smarter, and environmentally friendlier robo-canine package than its gas-burning ancestor.


Since the early 1990s, machines have relied on rechargeable lithium-ion batteries for power. However, these storage cells (inside most cell phones, and now Spot Mini) are dangerously combustible, easily degraded, and very expensive. One of the best examples of the instability of lithium-ion is the Samsung Note 7 handset recall, after exploding units wreaked havoc on consumers. The design flaw ended up costing Samsung $6.2 billion and even prompted the Federal Aviation Administration (FAA) to issue an advisory after panicked flyers saw cellphones overheat. Exploding batteries are not limited to Samsung but plague the entire lithium-ion appliance ecosystem, including e-cigarettes, hoverboards, toys, and electric vehicles.


While Marc Raibert was showing off his latest mechanical creation, across the river in Woburn, Mass., Ionic Materials was opening its new 30,000-square-foot lab. The hot startup grabbed headlines months ago with a $65 million venture capital investment from the new Renault-Nissan-Mitsubishi Alliance, Total Energy Ventures, and Sun Microsystems co-founder Bill Joy. However, the real news story is its revolutionary solid-state lithium battery technology, which is cheaper, less flammable, and longer lasting. The technology came out of research by Dr. Michael Zimmerman of Tufts University that was originally aimed at improving the performance of existing lithium-ion batteries. Unlike lithium-ion batteries, which contain a flammable liquid electrolyte, Zimmerman’s invention deploys a solid polymer electrolyte that prevents short-circuiting. Ionic’s plastic electrolyte not only prevents explosive gases from escaping, but also enables the battery to be constructed with higher-energy-density materials, such as pure lithium-metal anodes.

Zimmerman first unveiled his innovation on PBS NOVA last year, where host David Pogue tested it by poking the solid-state lithium cell with a screwdriver and scissors. Typically, such abuse would short-circuit a cell with liquid electrolyte and ignite it, but Zimmerman’s battery did not even heat up and kept working. Pogue then noted, “If you can use lithium metal rather than lithium ions, you get five to ten times the energy density. That means ten days on a charge instead of one day on a charge, a thousand miles on a charge of your car instead of 200 miles.”
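
Pogue’s arithmetic is easy to check with rough numbers. Assuming today’s lithium-ion cells store on the order of 250 Wh/kg (a ballpark figure, not Ionic’s data), a five- to ten-fold jump in energy density scales a 200-mile pack to roughly 1,000 to 2,000 miles:

```python
# Back-of-envelope check on Pogue's claim, using rough, assumed figures:
# ~250 Wh/kg for today's lithium-ion cells and a 5-10x multiple for a
# lithium-metal anode with a solid electrolyte. Not Ionic Materials' numbers.
li_ion_wh_per_kg = 250
baseline_range_miles = 200  # the article's example EV range

for m in (5, 10):
    cell_density = li_ion_wh_per_kg * m
    print(f"{m}x density -> ~{cell_density} Wh/kg, "
          f"~{baseline_range_miles * m} miles on the same battery mass")
# 5x  -> ~1250 Wh/kg, ~1000 miles  (matches the 'thousand miles' figure)
# 10x -> ~2500 Wh/kg, ~2000 miles
```
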

The promise of Ionic’s technology helps alleviate “range anxiety,” the fear of running out of charge without a power source nearby. The future of robots, and especially autonomous vehicles, relies heavily on investments in charging infrastructure that can rival oil’s. Today, it takes 75 minutes to fully recharge the 7,104 lithium-ion cells inside a Tesla at one of its 5,000 supercharging stations, compared to 15 minutes at the pump at more than 165,000 gas stations throughout America. Realizing the shortcomings of switching to electric, Sweden is making country-wide investments to accelerate adoption. Last month, Stockholm opened the first stretch of roadway capable of charging vehicles while they drive.
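
The gap is easier to see as range added per minute. The quick calculation below uses the article’s 75-minute and 15-minute figures plus assumed ranges of roughly 250 miles per full charge and 400 miles per tank; the range numbers are illustrative, not Tesla or EPA specifications.

```python
# Rough comparison of "range added per minute" implied by the article's
# 75-minute supercharge vs. a 15-minute gas stop. The range figures
# (~250 miles per full charge, ~400 miles per tank) are assumptions for
# illustration, not Tesla or EPA numbers.
ev_full_range_miles, ev_charge_minutes = 250, 75
ice_tank_range_miles, ice_fill_minutes = 400, 15

ev_rate = ev_full_range_miles / ev_charge_minutes     # ~3.3 miles/min
ice_rate = ice_tank_range_miles / ice_fill_minutes    # ~26.7 miles/min
print(f"EV:  {ev_rate:.1f} miles of range per minute")
print(f"Gas: {ice_rate:.1f} miles of range per minute "
      f"(~{ice_rate / ev_rate:.0f}x faster)")
```
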


Markus Fischer, spokesperson for state-owned energy company Vattenfall, describes the appeal: “Such roads will allow (electric vehicles) to move long distances without big, costly and heavy batteries. The investment cost per kilometer is estimated to be less than that of using overhead lines, as is the impact on the landscape.” Currently, only 1.2 miles of electric rail has been laid, but it is already charging trucks making deliveries to the airport. Gunnar Asplund, CEO of Elways, the maker of the road’s electric rail, boasted, “The technology offers infinite range — range anxiety disappears. Electrified roads will allow smaller batteries and can make electric cars even cheaper than fossil fuel ones.”

At the Robotics Summit in Boston, I spoke with Dr. Doug Schmidt of electric battery provider Acumentrics about the Swedish technology. Dr. Schmidt explained that most conductive charging platforms similar to Elways speed the degradation of lithium-ion batteries. Israeli startup Phinergy offers an alternative to lithium for electric vehicles with its proprietary aluminum-air batteries, which produce energy through a reaction between oxygen and aluminum, using water as the electrolyte. A few years ago, Phinergy powered a Renault car for over a thousand kilometers topped up with just tap water. Now the company has partnered with China-based Yunnan Aluminium to begin manufacturing batteries for China’s growing electric automobile market. According to last month’s press release, the joint venture “will introduce the world’s leading aluminum-air battery technology, relying on [Yunnan Aluminium’s] green and clean water and aluminum resources.” The statement further detailed that the initial annual output will be 2,500 units. Phinergy’s website promotes wider use cases, including industrial robots and other unmanned systems.
China has been leading the world in alternative energy development. Last year, Pittsburgh-based Aquion was acquired out of bankruptcy for $9.16 million by Juline-Titans, an affiliate of China Titans Energy Technology Group. Aquion, a once high-flying startup that raised more than $190 million from such notable investors as Bill Gates, Kleiner Perkins Caufield & Byers, and Nick and Joby Pritzker, is now in the process of moving its operations to Asia. Similar to Phinergy, Aquion utilizes the most renewable of resources: water. Its patented “Aqueous Hybrid Ion” technology stores clean energy using salt water. However, it comes at a cost in weight: unlike lithium batteries that are light enough to fit in one’s pocket, saltwater batteries are considerably heavier. The company’s products are uniquely positioned for future power grids, with the promise of weaning the world off fossil fuels.
Today, fewer than 5% of lithium-ion batteries are recycled. The environmental costs could not be higher, with toxic gases leaking from old batteries. Rising battery demand is also leading to a variety of unintended consequences, such as depleting the world’s natural reserves of lithium and cobalt and increasing water pollution from mineral extraction. While turning the tide of climate change depends greatly on ending the global dependency on oil, replacing it with a greener alternative is crucial. Promising inventions are not only developing new energy paradigms, but also recycling old ones in innovative ways. British startup Aceleron is reusing dead electric car batteries for home energy storage. In the words of Amrit Chandan of Aceleron, “It takes so much energy to extract these materials from the ground. If we don’t re-use them we could be making our environmental problems worse. There’s going to be a storm of electric vehicle batteries that will reach the end of their life in a few years, and we’re positioning ourselves to be ready for it.”

Climate change and unmanned systems will be discussed in greater detail at the next RobotLab on “The Politics Of Automation,” June 13th @ 6pm in NYC, with Democratic Presidential Candidate Andrew Yang and New York State Assemblyman Clyde Vanel.

 


Automating window washing

Three and a half years ago, I stood on the corner of West Street and gasped as two window washers clung to life at the end of a rope a thousand feet above. By the time rescue crews reached the men on the 69th floor of 1 World Trade Center, they were close to passing out from dangling upside down. Every day, risk-taking men and women hook their bodies to metal scaffolds and ascend to deadly heights for $25 an hour. Ramone Castro, a window washer of three decades, said it best: “It is a very dangerous job. It is not easy going up there. You can replace a machine but not a life.” Castro’s statement sounds like an urgent call to action for robots.

One of the promises of automation is replacing tasks that are too dangerous for humans. Switzerland-based Serbot believes that high-rise facade cleaning is one of those jobs ripe for disruption. In 2010, it was first reported that Serbot contracted with the city of Dubai to automatically clean its massive glass skyline. Utilizing its GEKKO machine, the Swiss company has demonstrated a cleaning rate of over 400 square meters an hour, 15 times faster than a professional washer. GEKKO leverages a unique suction technology that enables the massive Roomba-like device to be suspended from the roof and adhere to the curtain wall regardless of weather conditions or architectural features. Serbot offers both semi-autonomous and fully autonomous versions of its GEKKOs, with options for retrofitting existing roof systems. It is unclear how many robots are actually deployed in the marketplace; however, Serbot recently announced the cleaning of the architecturally challenging Festo Automation Center in Germany (shown below).


According to the press release, “The entire building envelope is cleaned automatically: by a robot, called GEKKO Facade, which sucks on the glass facade. This eliminates important disadvantages of conventional cleaning: no disturbance of the user by cleaning personnel, no risky working in a gondola at high altitude, no additional protection during the cleaning phase, etc.” Serbot further states that its autonomous system cleaned the 8,600-square-meter structure within a couple of days via an intelligent platform that plans a route across the entire glass facade.
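
Serbot’s published numbers roughly check out. At the claimed 400 square meters per hour, the 8,600-square-meter Festo facade works out to about 21 robot-hours, or a couple of working days, versus hundreds of hours for a single washer working at one-fifteenth the rate; the 8-hour shift length below is my own assumption.

```python
# Sanity check on Serbot's published figures: ~400 m^2/hour for GEKKO,
# described as ~15x a professional washer, applied to the ~8,600 m^2 Festo
# facade. The 8-hour shift length is assumed; everything else is from the text.
facade_m2 = 8600
gekko_rate = 400              # m^2 per hour (Serbot's figure)
human_rate = gekko_rate / 15  # ~27 m^2 per hour implied for one washer

robot_hours = facade_m2 / gekko_rate   # ~21.5 hours -> "a couple of days" of shifts
human_hours = facade_m2 / human_rate   # ~322 hours for a single washer
print(f"GEKKO: ~{robot_hours:.0f} hours (~{robot_hours / 8:.1f} 8-hour shifts)")
print(f"One washer: ~{human_hours:.0f} hours (~{human_hours / 8:.0f} shifts)")
```
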


Parallel to the global trend of urbanization, skyscraper construction is at an all-time high. Demand for glass facade materials and maintenance services is close to surpassing $200 billion worldwide. With New York City at the center of the construction boom, Israeli startup Skyline Robotics recently joined the New York accelerator ICONYC labs. This week, I had the opportunity to ask Skyline founder and CEO Yaron Schwarcz about the move. Schwarcz proudly said, “So far we are deployed in Israel only and are working exclusively with one of the top 5 cleaning companies. Joining ICONYC was definitely a step forward, as a rule we only move forward, we believe that ICONIC can and will help us connect with the best investors and help us grow in the NY market.”

While Serbot requires building owners to purchase its proprietary suction cleaning system, Skyline’s machine, called Ozmo, integrates seamlessly with existing equipment. Schwarcz explains, “We use the existing scaffold of the building in contrast to GEKKO’s use of suction. The use of the arms is to copy the human arms which is the only way to fully maintain the entire building and all its complexity. The Ozmo system is not only a window cleaner, it’s a platform for all types of facade maintenance. Ozmo does not need any humans on the rig, never putting people in danger.” Schwarcz further shared with me the results of early case studies in Israel in which Ozmo cleaned an entire vertical glass building in 80 hours with one supervisor remotely controlling the operation from the ground, adding that it took “no breaks.”

While Serbot and Skyline offer an optimistic view of the future, past efforts have been met with skepticism. In a 2014 New York Times article, written days after the two window washers almost fell to their deaths, the paper concluded that “washing windows is something that machines still cannot do as well.” The Times interviewed building exterior consultant Craig S. Caulkins, who stated then, “Robots have problems.” Caulkins said the setback for automation has been the quality of work, citing numerous examples of dirty window corners. “If you are a fastidious owner wanting clean, clean windows so you can take advantage of that very expensive view that you bought, the last thing you want to see is that gray area around the rim of the window,” exclaimed Caulkins. Furthermore, New York City’s window washers are represented by a very active labor union, S.E.I.U. Local 32BJ. The fear of robots replacing their members could lead to citywide protests and strikes. The S.E.I.U. 32BJ press office did not return calls for comment.

High-rise window washing in New York is very much part of the folklore of the Big Apple. One of the best-selling local children’s books, “Window Washer: At Work Above the Clouds,” profiles the former Twin Towers cleaner Roko Camaj. In 1995, Camaj predicted that “Ten years from now, all window washing will probably be done by a machine.” Unfortunately, Camaj never lived to see the innovations of GEKKO and Ozmo, as he perished in the Towers on September 11th.


Automating high-risk professions will be explored further on June 13th @ 6pm in NYC with Democratic Presidential Candidate Andrew Yang and New York Assemblyman Clyde Vanel at the next RobotLab on “The Politics Of Automation” – Reserve Today!

Drones as first responders

In a basement at New York University in 2013, Dr. Sergei Lupashin wowed a room of one hundred leading technology enthusiasts with one of the first indoor Unmanned Aerial Vehicle (UAV) demonstrations. During his presentation, Dr. Lupashin of ETH Zurich attached a dog leash to an aerial drone while declaring to the audience that “there has to be another way” of flying robots safely around people. Lupashin’s creativity eventually led to the invention of Fotokite and one of the most successful Indiegogo campaigns.

Since Lupashin’s demo, close to a hundred providers, from innovative startups to aftermarket kits, now offer tethers to restrain unmanned flying vehicles. Probably the best-known enterprise solution is CyPhy Works, which has raised more than $30 million. Last August, during President Trump’s visit to his golf course in New Jersey, the Department of Homeland Security (DHS) deployed CyPhy’s tethered drones to patrol the perimeter. In a statement by DHS about its “spy in the sky” program, the agency explained: “The Proof of Concept will help determine the potential future use of tethered Unmanned Aircraft System (sUAS) in supporting the Agency’s protective mission. The tethered sUAS used in the Proof of Concept is operated using a microfilament tether that provides power to the aircraft and the secure video from the aircraft to the Operator Control Unit (OCU).” CyPhy’s systems are currently being utilized to provide a birds-eye view to police departments and military units for a number of high-profile missions, including the Boston Marathon.

Fotokite, CyPhy, and others have proved that tethered machines offer huge advantages over traditional remote-controlled or autonomous UAVs, removing the regulatory, battery, and payload restrictions that limit lengthy missions. This past week Genius NY, the largest unmanned systems accelerator, awarded one million dollars to Fotokite for its latest enterprise line of leashed drones. The company clinched the competition after demonstrating how its drones can fly continuously for up to twenty-four hours, providing real-time video feeds autonomously and safely above large population centers. Fotokite’s Chief Executive, Chris McCall, announced that the funds will be utilized to fulfill a contract with one of the largest fire truck manufacturers in the United States. “We’re building an add-on to fire and rescue vehicles and public safety vehicles to be added on top of for instance a fire truck. And then a firefighter is able to pull up to an emergency scene, push a button and up on top of the fire truck this box opens up, a Fotokite flies up and starts live streaming thermal and normal video down to all the firefighters on the ground,” boasted McCall.


Fotokite is not the only tethered-drone company marketing to firefighters; Latvian-born startup Aerones is attaching firehoses to its massive multi-rotor unmanned aerial vehicles. Aerones claims to have successfully built a rapid-response UAV that can climb up to a thousand feet within six minutes to extinguish fires from the air. This gives first responders a reach close to ten times that of traditional fire ladders. The Y Combinator startup offers municipalities two models: a twenty-eight-propeller version that can carry up to 441 pounds to a height of 984 feet and a thirty-six-propeller version that ferries over 650 pounds of equipment to heights over 1,600 feet. However, immediate interest in the Aerones solution is coming from industrial clients such as wind farms. “Over the last two months, we’ve been very actively talking to wind turbine owners,” says Janis Putrams, CEO of Aerones. “We have lots of interest and letters of intent in Texas, Spain, Turkey, South America for wind turbine cleaning. And in places like Canada, the Nordic and Europe for de-icing. If the weather is close to freezing, ice builds up, and they have to stop the turbine.” TechCrunch reported last March that the company moved its sales operations to Silicon Valley.

The emergency response industry is also looking to other aerial solutions to tackle its most difficult challenges. For over a year, Zipline has been successfully delivering blood for critical transfusions to the most remote areas of Africa. The company announced earlier this month that it has filed with the FAA to begin testing later this year in America. This is welcome news for the USA’s rural health centers, which are saddled with exploding costs, staff shortages, and crumbling infrastructure. In a Fast Company article about Zipline, the magazine reported that “Nearly half of rural providers already have a negative operating margin. As rural residents–who tend to be sicker than the rest of the country–have to rely on the smaller clinics that remain, drones could ensure that those clinics have access to necessary supplies. Blood products spoil quickly, and outside major hospitals, it’s common not to have the right blood on hand for a procedure. Using the drones would be faster, cheaper, and more reliable than delivering the supplies in a van or car.”

Keller Rinaudo, Zipline’s Chief Executive, describes, “There’s a lot that [the U.S.] can be doing better. And that’s what we think is ultimately the promise of future logistics and automated logistics. It’s not delivering tennis shoes or pizza to someone’s backyard. It’s providing universal access to healthcare when people need it the most.”

To date, Zipline has flown over 200,000 miles autonomously, delivering 7,000 units of blood throughout Rwanda. To prepare for its US launch, the company re-engineered its entire platform to bolster its delivery capabilities. Rinaudo explains, “In larger countries, you’re going to need distribution centers and logistics systems that are capable of doing millions of deliveries a day rather than hundreds or thousands.” The new UAV is a small fixed-wing plane called the Zip that can soar at close to 80 miles per hour, enabling life-saving supplies such as blood, organ donations, or vaccines to be delivered in a matter of minutes.
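
The “matter of minutes” claim is simple cruise-speed arithmetic. At the article’s roughly 80 miles per hour, the sketch below converts a few illustrative one-way distances into flight times; the distances are my examples, not Zipline’s published service radius.

```python
# Rough delivery-time math for a fixed-wing drone cruising near the article's
# ~80 mph figure. The 10-, 20-, and 50-mile distances are illustrative,
# not Zipline's published service radius.
cruise_mph = 80
for one_way_miles in (10, 20, 50):
    minutes = one_way_miles / cruise_mph * 60
    print(f"{one_way_miles:>2} miles out -> ~{minutes:.0f} minutes to the drop point")
# 10 miles -> ~8 min, 20 miles -> ~15 min, 50 miles -> ~38 min
```
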


As I prepare to speak at Xponential 2018 next month, I am inspired by these innovators who turn their mechanical inventions into life-saving solutions. Many would encourage Rinaudo and others to focus their energies on seemingly more profitable sectors such as e-commerce delivery and industrial inspections. However, Rinaudo retorts that “Healthcare logistics is a way bigger market and a way bigger problem than most people realize. Globally it’s a $70 billion industry. The reality is that there are billions of people who do not have reliable access to healthcare and a big part of that is logistics. As a result of that, 5.2 million kids die every year due to lack of access to basic medical products. So Zipline’s not in a rush to bite off a bigger problem than that.”

The topic of utilizing life-saving technology will be discussed at the next RobotLab event on “The Politics Of Automation,” with Democratic Presidential Candidate Andrew Yang and New York Assemblyman Clyde Vanel on June 13th @ 6pm in NYC – RSVP Today

SXSW 2018: Protect AI, robots, cars (and us) from bias

As Mark Hamill humorously shared the behind-the-scenes stories of “Star Wars: The Last Jedi” with a packed SXSW audience, two floors below on the exhibit floor Universal Robots recreated General Grievous’ famed lightsaber battles. The battling machines were steps away from a twelve-foot dancing Kuka robot and an automated coffee dispensary. Somehow the famed interactive festival, known for its late-night drinking, dancing, and concerts, had a very mechanical feel this year. Everywhere, debates ensued between utopian tech visionaries and dystopia-fearing humanists.

Even my panel on “Investing In The Autonomy Economy” took a very social turn when discussing the opportunities for using robots to serve the growing aging population. Eric Daimler (formerly of the Obama White House) raised concerns about AI bias affecting the well-being of seniors. Agreeing, Dan Burstein (partner at Millennium Tech Value Partners) nervously expressed that ‘AI is everywhere, in everything, and the USA has no other way to care for this exploding demographic except with machines.’ Daimler explained that “AI is very good at perception, just not context;” until this is solved, it could be a very dangerous problem worldwide.

Last year at a Google conference on the relationship between humans and AI, the company’s senior vice president of engineering, John Giannandrea, warned, “The real safety question, if you want to call it that, is that if we give these systems biased data, they will be biased. It’s important that we be transparent about the training data that we are using, and are looking for hidden biases in it, otherwise we are building biased systems.” Similar to Daimler’s anxiety about AI and healthcare, Giannandrea exclaimed that “If someone is trying to sell you a black box system for medical decision support, and you don’t know how it works or what data was used to train it, then I wouldn’t trust it.”


One of the most famous illustrations of how quickly human bias influences computer actions is Tay, Microsoft’s customer service chatbot on Twitter. It took only twenty-four hours for Tay to develop a Nazi persona, leading to more than ninety thousand hate-filled tweets. Tay swiftly calculated that hate on social media equals popularity. In explaining its failed experiment, Microsoft stated via email to Business Insider: “The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We’re making some adjustments to Tay.”

While Tay’s real impact was benign, it raises serious questions about the implications of embedding AI into machines and society. In its Pulitzer Prize-finalist investigation, ProPublica uncovered that a widely distributed US criminal justice software package called Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) was racially biased in scoring the risk that convicted felons would recommit crimes. ProPublica discovered that black defendants in Florida “were far more likely than white defendants to be incorrectly judged to be at a higher rate of recidivism” by the AI. Northpointe, the company that created COMPAS, released its own report disputing ProPublica’s findings, but it refused to pull back the curtain on its training data, keeping the algorithms hidden in a “black box.” In a statement released to the New York Times, Northpointe’s spokesperson argued, “The key to our product is the algorithms, and they’re proprietary. We’ve created them, and we don’t release them because it’s certainly a core piece of our business.”
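
The core of ProPublica’s analysis is a disparity check that is easy to illustrate: compare false positive rates, the share of people who did not reoffend but were scored high risk, across groups. The toy records below are invented for illustration; COMPAS’s actual scores and features remain proprietary.

```python
# Minimal sketch of the kind of disparity check ProPublica ran on COMPAS:
# compare false positive rates across groups. The records below are made up;
# COMPAS's actual scores and features are proprietary and not reproduced here.
from collections import defaultdict

# Each record: (group, predicted_high_risk, actually_reoffended)
records = [
    ("A", True, False), ("A", True, True), ("A", False, False), ("A", True, False),
    ("B", False, False), ("B", True, True), ("B", False, False), ("B", False, True),
]

false_pos = defaultdict(int)   # predicted high risk, did not reoffend
negatives = defaultdict(int)   # everyone who did not reoffend
for group, predicted_high, reoffended in records:
    if not reoffended:
        negatives[group] += 1
        if predicted_high:
            false_pos[group] += 1

for group in sorted(negatives):
    rate = false_pos[group] / negatives[group]
    print(f"group {group}: false positive rate = {rate:.0%}")
# A large gap between groups is the signal ProPublica reported.
```
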

The dispute between Northpointe and ProPublica raises the question of transparency and the need for auditing of data by an independent arbitrator to protect against bias. Cathy O’Neil, a former Barnard professor and analyst at D.E. Shaw, thinks a lot about safeguarding ordinary Americans from biased AI. In her book, Weapons of Math Destruction, she cautions that corporate America is too willing to hand over the wheel to algorithms without fully assessing the risks or implementing any oversight. “[Algorithms] replace human processes, but they’re not held to the same standards. People trust them too much,” declares O’Neil. Understanding the high stakes and the lack of regulatory oversight by the current federal government, O’Neil left her high-paying Wall Street job to start a software auditing firm, O’Neil Risk Consulting & Algorithmic Auditing. In an interview with MIT Technology Review last summer, O’Neil expressed frustration that companies are more interested in the bottom line than in protecting their employees, customers, and families from bias: “I’ll be honest with you. I have no clients right now.”

Most of the progress in deconstructing “black boxes” is happening today at the US Department of Defense. DARPA has been funding the research of Dr. David Gunning to develop Explainable Artificial Intelligence (XAI). Understanding its own AI, and that of foreign governments, could be a huge advantage for America’s cyber military units. At the same time, like many DARPA-funded projects, civilian applications could offer societal benefits. According to the program statement Gunning published online, XAI aims to “produce more explainable models, while maintaining a high level of learning performance (prediction accuracy); and enable human users to understand, appropriately trust, and effectively manage the emerging generation of artificially intelligent partners.” XAI plans to work with developers and user interface designers to foster “useful explanation dialogues for the end user,” so operators know when to trust or question the AI-generated data.
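
One simple flavor of explainability is permutation importance: shuffle one input at a time and watch how much the model’s accuracy drops. The sketch below is a generic illustration of that idea with a made-up model and synthetic data, not DARPA’s or Gunning’s actual XAI methods.

```python
# A toy illustration of one "explainability" technique: permutation importance.
# Shuffle one input feature at a time and see how much the model's accuracy
# drops. This is a generic XAI idea, not DARPA's or Dr. Gunning's actual method.
import random
random.seed(0)

# Hypothetical model: flags a case when a weighted sum of features crosses 0.5.
def model(x):
    return (0.8 * x[0] + 0.1 * x[1] + 0.05 * x[2]) > 0.5

# Synthetic dataset where feature 0 carries most of the signal.
data = [[random.random() for _ in range(3)] for _ in range(500)]
labels = [model(x) for x in data]

def accuracy(rows):
    return sum(model(x) == y for x, y in zip(rows, labels)) / len(rows)

baseline = accuracy(data)   # 1.0 by construction
for f in range(3):
    shuffled_col = [x[f] for x in data]
    random.shuffle(shuffled_col)
    perturbed = [x[:f] + [v] + x[f + 1:] for x, v in zip(data, shuffled_col)]
    drop = baseline - accuracy(perturbed)
    print(f"feature {f}: accuracy drop {drop:.2f}")
# The biggest drop points at the feature the model leans on most.
```
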

Besides DARPA, many large technology companies and universities are starting to create think tanks, conferences, and policy groups to develop standards for testing AI bias. The results have been startling, ranging from computer vision systems that misidentify people of color, to gender bias in hiring-management software, to blatant racism in natural language processing systems, to security robots that have run over children they failed to recognize. As an example of how training data affects outcomes, when Google first released its image-processing software, the AI labeled photos of African Americans as “gorillas” because the engineers failed to feed enough examples of minorities into the neural network.

Ultimately, artificial intelligence reflects the people who program it, as every human being brings their own experiences that shape personal biases. According to Kathleen Walch, host of the AI Today podcast, “If the researchers and developers developing our AI systems are themselves lacking diversity, then the problems that AI systems solve and training data used both become biased based on what these data scientists feed into AI training data.” Walch advocates that hiring for diversity can bring “about different ways of thinking, different ethics and different mindsets. Together, this creates more diverse and less biased AI systems. This will result in more representative data models, diverse and different problems for AI solutions to solve, and different use cases feed to these systems if there is a more diverse group feeding that information.”
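
Walch’s point lends itself to a very basic audit: count how each group is represented in the training labels and flag those that fall far below parity. The dataset and the fifty-percent-of-parity cutoff below are illustrative assumptions.

```python
# A quick representation audit of the sort Walch's argument implies: count how
# each group appears in the training labels and flag those far below parity.
# The dataset and the 50%-of-parity cutoff are illustrative assumptions.
from collections import Counter

training_labels = ["group_a"] * 800 + ["group_b"] * 150 + ["group_c"] * 50
counts = Counter(training_labels)
parity = len(training_labels) / len(counts)   # equal share per group

for group, n in counts.items():
    flag = "UNDER-REPRESENTED" if n < 0.5 * parity else "ok"
    print(f"{group}: {n} examples ({n / len(training_labels):.0%}) {flag}")
```
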

Before leaving SXSW, I attended a panel hosted by the IEEE on “Algorithms, Unconscious Bias & AI,” remarkably led entirely by female panelists, including one person of color. Hiring bias became a big theme of their discussion. Following the talk, I hopped into my Uber and pleasantly rode to the airport reflecting on a statement made earlier in the day by John Krafcik, Chief Executive of Waymo. Krafcik boasted that Waymo’s mission is to build “the world’s most experienced driver.” I just hope that the training data is not from New York City cabbies.

Healthcare’s regulatory AI conundrum

It was the last question of the night, and it hushed the entire room. An entrepreneur expressed his aggravation with the FDA’s antiquated regulatory environment for AI-enabled devices to Dr. Joel Stein of Columbia University. Stein, a leader in rehabilitative robotic medicine, sympathized with the startup founder, knowing full well that tomorrow’s exoskeletons will rely heavily on machine intelligence. Nodding her head in agreement, Kate Merton of JLabs shared the sentiment; her employer, Johnson & Johnson, has partnered with Google to revolutionize the operating room through embedded deep learning systems. In many ways this exchange encapsulated RobotLab this past Tuesday, where our topic was “The Future Of Robotic Medicine”: the paradox of software-enabled therapeutics offering a better quality of life alongside the societal, technological, and regulatory challenges ahead.

To better understand the frustration expressed at RobotLab, a review of the Food & Drug Administration (FDA) policies on medical devices and software is required. Most devices fall within criteria established in the 1970s: a “build and freeze” model whereby a filed product does not change over time, which currently excludes therapies that rely on neural networks and deep learning algorithms that evolve with use. Charged with modernizing this regulatory environment, the Obama Administration established a Digital Health Program tasked with implementing new regulatory guidance for software and mobile technology. This initiative eventually led Congress to pass the 21st Century Cures Act (“Cures Act”) in December 2016. An important aspect of the Cures Act is its provisions for digital health products, medical software, and smart devices. The legislators singled out AI for its unparalleled ability to support human decision making, referred to as “Clinical Decision Support” (“CDS”), with examples like Google and IBM Watson. Last year, the FDA built on the Cures Act with a new framework, the Digital Health Innovation Action Plan. These steps have been changing the FDA’s attitude toward mechatronics, updating its traditional approach to devices to include software and hardware that iterate with cognitive learning. The Action Plan states that “an efficient, risk-based approach to regulating digital health technology will foster innovation of digital health products.” In addition, the FDA has been offering technology partners the ability to file a Digital Health Software Pre-Certification (“Pre-Cert”) to fast-track the evaluation and approval process; current Pre-Cert pilot participants include Apple, Fitbit, Samsung, and other leading technology companies.
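
The tension between “build and freeze” and continuously learning software can be sketched in a few lines of code: a cleared, locked model refuses post-market changes, while an adaptive one accepts updates but fingerprints and logs every change for auditors. The versioning scheme here is my own invention for illustration, not the FDA’s Pre-Cert mechanics.

```python
# A schematic contrast between the FDA's traditional "build and freeze" model
# and an adaptive algorithm that keeps learning after clearance. The versioning
# and audit-log scheme here is an invented illustration, not the Pre-Cert program.
import hashlib, json, datetime

class ClearedModel:
    def __init__(self, weights, locked=True):
        self.weights = list(weights)
        self.locked = locked
        self.cleared_hash = self._fingerprint()
        self.audit_log = []

    def _fingerprint(self):
        return hashlib.sha256(json.dumps(self.weights).encode()).hexdigest()[:12]

    def update(self, new_weights):
        if self.locked:
            raise RuntimeError("Locked device: changes require a new regulatory filing.")
        self.weights = list(new_weights)
        self.audit_log.append({
            "time": datetime.datetime.utcnow().isoformat(),
            "fingerprint": self._fingerprint(),
        })

frozen = ClearedModel([0.2, 0.5, 0.3], locked=True)
adaptive = ClearedModel([0.2, 0.5, 0.3], locked=False)
adaptive.update([0.25, 0.45, 0.3])   # allowed, but every change is logged
print(frozen.cleared_hash, len(adaptive.audit_log))
```
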

Another way for AI and robotic devices to receive approval from the FDA is through the De Novo premarket review pathway. According to the FDA’s website, the De Novo program is designed for “medical devices that are low to moderate risk and have no legally marketed predicate device to base a determination of substantial equivalence.” Many computer vision systems fall into the De Novo category, providing “triage” software that efficiently identifies disease markers based on training data of radiology images. As an example, last month the FDA approved Viz.ai’s software, a new type of “clinical decision support software designed to analyze computed tomography (CT) results that may notify providers of a potential stroke in their patients.”

Dr. Robert Ochs of the FDA’s Center for Devices and Radiological Health explains, “The software device could benefit patients by notifying a specialist earlier thereby decreasing the time to treatment. Faster treatment may lessen the extent or progression of a stroke.” The Viz.ai algorithm has the ability to change the lives of the nearly 800,000 annual stroke victims in the USA. The platform will enable clinicians to quickly identify patients at risk of stroke by analyzing thousands of CT brain scans for blood vessel blockages and then automatically sending alerts via text message to neurovascular specialists. Viz.ai promises to streamline the diagnosis process by cutting the traditional time it takes for radiologists to review, identify, and escalate high-risk cases to specialists.
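
Conceptually, the workflow the FDA describes is a score-and-notify loop: score each incoming CT study, and page the on-call neurovascular specialist when the score crosses a threshold. The scoring function, threshold, and alert routing below are invented placeholders; Viz.ai’s actual model and infrastructure are proprietary.

```python
# Schematic of a triage-and-notify workflow like the one the FDA describes for
# Viz.ai: score each scan, and page a specialist when the score crosses a
# threshold. The scoring function, threshold, and contact routing are all
# invented for illustration; Viz.ai's model and alerting stack are proprietary.
def lvo_score(scan_features):
    """Stand-in for a CT image model; returns a 0-1 suspicion score."""
    return min(1.0, 0.6 * scan_features["vessel_occlusion_signal"]
                    + 0.4 * scan_features["perfusion_deficit_signal"])

def triage(worklist, notify, threshold=0.8):
    for scan in worklist:
        score = lvo_score(scan["features"])
        if score >= threshold:
            notify(f"Suspected LVO, patient {scan['patient_id']} (score {score:.2f})")

worklist = [
    {"patient_id": "P-001", "features": {"vessel_occlusion_signal": 0.9,
                                         "perfusion_deficit_signal": 0.8}},
    {"patient_id": "P-002", "features": {"vessel_occlusion_signal": 0.2,
                                         "perfusion_deficit_signal": 0.1}},
]
triage(worklist, notify=print)   # in production this would text the on-call specialist
```
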

Dr. Chris Mansi, Viz.ai CEO, says, “The Viz.ai LVO Stroke Platform is the first example of applied artificial intelligence software that seeks to augment the diagnostic and treatment pathway of critically unwell stroke patients. We are thrilled to bring artificial intelligence to healthcare in a way that works alongside physicians and helps get the right patient, to the right doctor at the right time.” According to the FDA’s statement, Mansi’s company “submitted a study of only 300 CT scans that assessed the independent performance of the image analysis algorithm and notification functionality of the Viz.ai Contact application against the performance of two trained neuro-radiologists for the detection of large vessel blockages in the brain. Real-world evidence was used with a clinical study to demonstrate that the application could notify a neurovascular specialist sooner in cases where a blockage was suspected.”

Viz.ai joins a market for AI diagnosis software that is growing rapidly and projected to eclipse $6 billion by 2021 (Frost & Sullivan), a compound annual growth rate of roughly forty percent since 2014. According to the study, AI has the ability to reduce healthcare costs by nearly half while improving outcomes for a third of all US healthcare patients. However, diagnosis software is only part of the AI value proposition; adding learning algorithms throughout the entire healthcare ecosystem could deliver new levels of quality of care.

At the same time, the demand for AI treatment is taking its toll on an underfunded FDA, which is having difficulty keeping up with new filings to review computer-aided therapies from diagnosis to robotic surgery to invasive therapeutics. In addition, many companies are currently unable to afford the seven-figure investment required to file with the FDA, leading to missed opportunities to find cures for the most plaguing diseases. The Atlantic reported last fall on a Canadian company, Cloud DX, that is still waiting for approval of its AI software that analyzes coughing audio waveforms to detect lung-based diseases (i.e., asthma, tuberculosis, and pneumonia). Cloud DX’s founder, Robert Kaul, shared with the magazine, “There’s a reason that tech companies like Google haven’t been going the FDA route [of clinical trials aimed at diagnostic certification]. It can be a bureaucratic nightmare, and they aren’t used to working at this level of scrutiny and slowness.” It took Cloud DX two years and close to a million dollars to achieve the basic ISO 13485 certification required to begin filing with the agency. Kaul questioned, “How many investors are going to give you that amount of money just so you can get to the starting line?”

Last month, Rani Therapeutics raised $53 million to begin clinical trials for its new robotic pill. Rani’s solution could usher in a new paradigm of needle-free therapy, whereby drugs are mechanically delivered to the exact site of infection. Unfortunately, innovations like Rani’s are getting backlogged amid a shortage of knowledgeable examiners able to review the clinical data. Bakul Patel, the FDA’s new Associate Center Director for Digital Health, says one of his top priorities is hiring: “Yes, it’s hard to recruit people in AI right now. We have some understanding of these technologies. But we need more people. This is going to be a challenge.” Patel is cautiously optimistic: “We are evolving… The legacy model is the one we know works. But the model that works continuously—we don’t yet have something to validate that. So the question is [as much] scientific as regulatory: How do you reconcile real-time learning [with] people having the same level of trust and confidence they had yesterday?”

As I concluded my discussion with Stein, I asked whether he thought disabled people will eventually commute to work wearing robotic exoskeletons as easily as they do today in electric wheelchairs. He answered that it could come within the next decade, if society changes its mindset on how we distribute and pay for such therapies. To quote the President, “Nobody knew health care could be so complicated.”
