Why think, or do, for yourself?



From building cars to sorting mail, robots’ workload has expanded dramatically; by taking on human tasks, they are becoming part of the workforce, and even of the family.

Today’s robots build our cars and sort our mail. Tomorrow’s automated devices already are beginning to prepare our meals, keep company with our elders and replace us in thousands of mundane — and not so mundane — jobs. 

We are on the verge of an avalanche of virtual reality and related technology that will transform not only the business world but the quality, pace and structure of daily life for millions worldwide.

With startling effect, our machines increasingly will think for us. “Big data,” the vast stores of information that computers gather into complex databases, will be kept in the limitless spheres of “the cloud.”

And the “deep learning” capability of artificial-intelligence software, able to learn from its own mistakes and experiences to improve functionality, will progressively eliminate the need for humans to perform thousands of tasks.

The marriage of robotics with artificial intelligence/virtual-reality technologies has obliterated the ceiling of possibilities.

BREEDING ROBOTS AT HOME, WORK
“Breeding Robots,” one of our Trends Research Institute’s top trends for 2016, will usher in a new era of automated intelligence with virtually no boundaries for the application and usefulness of these new technologies.

Simply put, robotic and related technologies no longer need to be programmed with explicit instructions; this technology learns through trial and error. And that opens up a world of usages.

The investment opportunities will be as plentiful as the robots themselves. From 2015 through 2019, the global market for automation and related services will almost double from $71 billion to more than $135 billion.

The reasons: the bottom line and efficiency. Getting the product to the consumer with the least amount of human labor at the fastest speed will drive this growth.

Already, those same reasons are beginning to transform the fast-food industry. Take a look at Eatsa, a San Francisco restaurant start-up with national ambitions that’s going robotic. It’s a fast-food model to follow.

At Eatsa, you order by pressing buttons at an automated kiosk. Key in your name, swipe your plastic card (no cash accepted), and within four minutes your order is delivered through the restaurant’s rear wall into a cubby resembling an oversized post office box with your name projected on the door.

Although Eatsa’s front end is fully automated, there are still three very human cooks toiling in the kitchen.

Eatsa is emblematic of the industry’s growing reliance on artificial intelligence. Wendy’s, for example, is installing ordering kiosks in its 6,000 US restaurants, although the company leaves it to each franchisee whether to adopt them. But that, too, may end someday.

Financially, the move makes sense to burger chains Hardee’s and Carl’s Jr. The chains will invest steadily in automating customer interactions, in part to defend against the push for a $15-an-hour minimum wage and fast food’s high rate of employee turnover. Unlike kiosks, people also take sick days and sometimes screw up a customer’s order.

Another plus: People under age 40 often prefer dealing with digital devices instead of people. Andrew Puzder, CEO of Hardee’s and Carl’s Jr., reports seeing customers queue at a restaurant kiosk to order while human order-takers stand idle at the counter.

However, Puzder isn’t ready to automate the kitchen, where human cooks will still be needed to respond to special orders.

Or maybe not.

ROBOTS AS COOKS?
Momentum Machines, another San Francisco firm, has unveiled a burger-making robot. Taking up just 24 square feet of kitchen space, it grinds meat at one end, forms a patty and drops it in its oven. While the meat is cooking to the customer’s order, the machine opens a bun, shreds fresh lettuce and adds the condiments requested. It then assembles the burger, wraps it and sends it down a delivery chute. The company claims the machine can spit out 400 burgers an hour and replace up to three fast-food fry cooks who might cost upwards of $100,000 a year in wages, training and other expenses. To prove it works, Momentum will soon open its own burger joint in its hometown.

For those who prefer food that doesn’t come in a paper sack, automation also is going gourmet. A four-man team of MIT students is preparing to take their robotic Spyce Kitchen commercial. At present, the unit offers a choice of five one-bowl dishes, including chickpea coconut curry on couscous and shrimp andouille jambalaya.

Orders are placed through a smartphone app or kiosk that allows diners to tailor seasonings and other ingredients to taste. A conveyor brings ingredients to a pot, which mixes and cooks the dish, then returns the used pot to a dishwasher. The unit can make two meals at a time, takes up just 20 square feet, and, inventors say, can sling half as much food as a full-size quick-meal restaurant.

The Spyce Kitchen is awaiting Food and Drug Administration approval.

AUTOMATED INTELLIGENCE EXPLODING
But edibles aren’t the only facet of retail being transformed by a new generation of automated intelligence. Smart devices are at work from the loading dock to the showroom. 

Midea, a Chinese appliance maker, sees a big future for robotics on the production line. In July, it became the largest shareholder of Kuka, a German robotics company that produces automated systems for a wide range of production-line tasks.

Perry Kramer, vice president of the Boston Retail Partners consulting firm, sees heavy investments in warehouse automation. Showroom tech will follow as customers become used to robots that recognize them by name, know their purchase preferences and lead shoppers to their usual products. The retail workforce will shrink, Kramer says, but remaining workers will be paid better for the technical expertise needed to keep the digital showrooms running smoothly.

Glimpsing that future, Google in recent years has bought seven robotics companies with specialties in gripping, lifting and moving things. A recent patent application reveals a vision of a fully automated warehouse in which robotic arms unload a truck and place goods on self-driving carts. The carts drive to a central location where other robots sort the items and yet more robots carry the sorted stock to shelves. The entire process is coordinated by a central computer that keeps track of what each robot is doing and where each is so they don’t bump into each other.
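The coordination problem the patent describes, a central computer tracking every robot so none collide, can be pictured as a simple reservation table: a cart may only move into a grid cell no other robot currently holds. A minimal sketch (the class and data layout are hypothetical, not Google’s design):

```python
class WarehouseCoordinator:
    """Toy central coordinator: each robot must reserve a grid cell
    before moving into it, so two robots never occupy the same cell."""

    def __init__(self):
        self.occupied = {}  # cell (x, y) -> id of the robot holding it

    def request_move(self, robot_id, cell):
        # Grant the move only if the cell is free or already ours.
        holder = self.occupied.get(cell)
        if holder is not None and holder != robot_id:
            return False  # another robot holds the cell; caller must wait
        # Release this robot's previous cell, then claim the new one.
        for c, r in list(self.occupied.items()):
            if r == robot_id:
                del self.occupied[c]
        self.occupied[cell] = robot_id
        return True

coord = WarehouseCoordinator()
assert coord.request_move("cart-1", (2, 3))      # free cell: granted
assert not coord.request_move("cart-2", (2, 3))  # taken: denied
assert coord.request_move("cart-1", (2, 4))      # cart-1 moves on
assert coord.request_move("cart-2", (2, 3))      # now free: granted
```

A real system layers route planning and deadlock avoidance on top, but the core invariant is the same: one cell, one robot.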

Staples, The Gap and other retail chains already rely on sophisticated robot integration in their warehouses.

Take a look at Amazon’s distribution centers. Instead of humans wandering aisles picking up a book here and an electric toothbrush there, robots do the lifting and sorting. They hoist racks containing the ordered products and trundle them to the person packing the shipment. That employee then selects the needed items. What took hours of person-time now happens in minutes.
Not satisfied with its army of more than 30,000 robots, Amazon in 2015 sponsored its “Amazon Picking Challenge.” It offered $25,000 for the robot best able to pick individual items out of bins on a warehouse shelf — one of the last remaining tasks humans do in company warehouses. Although contestants failed miserably in their attempts to beat humans’ order-picking time, recent advances have brought nimble-fingered bots within striking distance of typical human speeds.

ROBOTS IN THE SHOWROOM
Smart robots also are entering stores’ showrooms. Simbe Robotics has created Tally, a 6-foot-tall kiosk on wheels. Tally patrols store aisles, scanning each shelf and signaling employees when an item is out of stock, running low, or even when an item is in the wrong shelf space. Such lapses cost retailers an estimated $450 billion worldwide annually. Using an autopilot system, Tally deftly navigates around people, shopping carts and merchandise displays so it can operate throughout the retail day.
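Tally’s core task, comparing what the camera sees on each shelf with what the store’s planogram says should be there, amounts to a simple reconciliation. A sketch of the idea (the data shapes and function are illustrative, not Simbe’s actual software):

```python
def shelf_report(planogram, scanned):
    """Compare expected shelf contents against a scan.
    planogram: dict mapping shelf slot -> expected product
    scanned:   dict mapping shelf slot -> product seen (None = empty)
    Returns the slots that are out of stock or hold the wrong item."""
    out_of_stock = [slot for slot in planogram if scanned.get(slot) is None]
    misplaced = [slot for slot, sku in planogram.items()
                 if scanned.get(slot) is not None and scanned[slot] != sku]
    return {"out_of_stock": out_of_stock, "misplaced": misplaced}

planogram = {"A1": "cereal", "A2": "oatmeal", "A3": "granola"}
scanned = {"A1": "cereal", "A2": None, "A3": "cereal"}
report = shelf_report(planogram, scanned)
# Slot A2 is empty; slot A3 holds the wrong product.
```

The hard part in practice is the computer vision that produces the `scanned` map; once shelf contents are recognized, flagging the $450 billion in lapses is bookkeeping.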

The Israel-based Imagine Technologies is fielding its similar “Retail One” system. Pittsburgh’s Bossa Nova Robotics recently raised $14 million in a private placement to test its own version of an inventory-scanning robot in what it describes as “five of the world’s leading retail chains.”

Lowe’s Home Improvement Centers has taken the next step by introducing its “OSHbot” floor-walker robots in select stores. Using facial-recognition software, OSHbot spots a human customer, rolls up, introduces itself and asks how it can help. The customer could say, “I need a light bulb,” and a screen mounted on OSHbot’s chest will display a list of options – fluorescent, spot, outdoor and so on. The customer presses the item of choice and OSHbot leads the customer to the item. If you need more nails like the one you brought with you, OSHbot can scan the item, identify it and take you to it.

If you have several items to pick up, you may soon be trailed by a Budgee, a small, rolling robot with a basket to hold your purchases while you carry the cart’s homing device. Five Elements Robotics, Budgee’s creator, is said to be partnering with Walmart to create a smart shopping cart that recognizes your purchases as you deposit them, totals your bill and someday will be able to communicate with an automated checkout counter where your cart will be automatically unloaded and purchases bagged.

Hointer, a Seattle-based clothing store founded by Amazon’s former head of supply-chain technology, is thinking beyond individual carts and kiosks. Based on the theory that men don’t like to shop, the store uses artificial intelligence to speed the entire experience.

The customer downloads the Hointer app to a smartphone. He selects from items displayed in the store, chooses the size and color, and robots deliver the selections to a fitting room within 30 seconds.

Meanwhile, the app directs the customer to the right cubicle and places the items in the customer’s virtual shopping cart. Clothes that don’t fit are robotically whisked away, removed from the shopping cart and better sizes delivered. The customer can swipe a payment card in the fitting room or as he leaves the store. And he can rate his satisfaction with the items he saw and bought. Hointer sends that data to the supplier so the upstream supply chain can know what’s selling and why.

WHY CARRY SHOPPING BAGS?
In the near future, you won’t even have to carry your own bags. Purchases can be brought to your car by a smart cart from Starship Technologies, an Estonian tech firm. Its delivery carts use cameras and GPS systems to find you. Your purchases – groceries, a prescription, baby formula – are locked in a compartment that can be opened with a code the store sends to your smartphone.

Starship’s cart is a form of personal assistant, perhaps the fastest-growing category of consumer robots. This carry-and-deliver system will be a big hit among the aging population, and that’s a global trend: post-World War II Baby Boomers, for example, are turning 70 this year.

In Japan, where 40 percent of the population will be 65 or older by 2040, the government is heavily investing in the development of artificial intelligence and robotics.

Pepper, a 3-foot-tall humanoid robot from Japanese tech giant Softbank, has a tablet in its chest to help it communicate with people. The maker says Pepper has a built-in “emotional engine” that can steadily improve its empathy and responsiveness based on its experiences with people. Each Pepper is linked to a cloud-based databank that enables all Peppers to absorb what each learns.

Like Pepper, Sota – a foot-tall, teddy-bear-like bot – engages in simple conversations with humans, monitors and reports a person’s vital signs and can operate internet-linked lights, televisions and other appliances. Palro, from FujiSoft, is more capable, with more than 350 different programs. It can carry out simple conversations to monitor an elder’s emotional affect for signs of depression or dementia. And it can play games such as trivia contests with humans to help firm up memory and cognitive function.

At the University of Denver, Mohammad Mahoor’s engineering lab has worked with psychologists to develop a doll-like robot into a companion for children with autism. For these children, perceiving and processing another person’s gestures, voice tone, facial expressions and words is overwhelming. Nao, the companion robot, can be programmed with limited movements and verbal expressions that help a child with autism relax, come out of the condition’s psychological shell and begin to respond.

The robot can be programmed to progressively engage the child in more complex interactions as the child is ready, eventually helping the child relate to and interact with other humans. Just as important, the robot never loses patience or becomes exasperated.

But the rest of us need friends, too.

PERSONAL ASSISTANTS
A new army of “robot buddies” is entering the market to keep our schedules, remind us of tasks and read our emails to us when our hands are doing other things. Most of them also can read news to us.

Jibo, the creation of a team of Boston-based entrepreneurs, is the shape and size of a tabletop fan. It will be priced at around $800 when it begins shipping soon. The device can draw from its library of 14,000 phrases to cobble together sentences and carry out conversations. It’s a compatible and entertaining robot for children as well. Jibo can intone pleasure, surprise or sadness, while interpreting words and emotions. Jibo owners also can program the bot to do more specialized tasks or functions.

France’s Blue Frog Robotics makes a robot named Buddy. It has a face like a TV screen to express its reaction to what it hears or sees. Cubic, created by Cubic Robotics, is literally a small cube that sits on a table. It will chat with you, tell jokes and alert you that the movie you’ve said you’re interested in is at the local theater. Robotbase’s personal robot is a wheeled pedestal with a screen that projects an age- and gender-appropriate face. It is studded with sensors that can control your home’s lights, temperature and security system. This bot can even stream you live videos from home when you’re away.

But this type of technology isn’t limited to the home, workplace or fast-food restaurant.

THE DIGITAL CHAUFFEUR
As other robotics start to break through, the quest for the perfect self-driving vehicle continues at full speed. In fact, the United States just gave its support to help drive the driverless trend. And, in addition to automakers, tech giants such as Intel are entering the field. Google has partnered with Fiat Chrysler. Qualcomm and Nvidia are creating their own vehicle platforms. Start-ups focused on niche aspects of automated driving are multiplying like bunnies.
The market for self-driving vehicle technology is estimated to reach as much as $100 billion annually by 2020, and it will grow even larger as the technology expands beyond cars to virtually anything that moves on land or water.

A key motivation is the ability to give self-driving vehicles a more reliable “eye” to locate things around them.

Tesla’s vehicles detect obstacles by using radar. Radar sends out radio waves that bounce back when they hit a solid object, alerting the system that something is there. But radar struggles to detect people, dogs and other living things in the roadway because their soft, water-rich bodies reflect radio waves only weakly.

In contrast, Google’s self-driving prototype car uses LIDAR, which substitutes invisible laser beams for radio waves. Trouble is, LIDAR doesn’t see well in fog or mist. Also, it has to shoot laser pulses in all directions to orient itself. That makes conventional LIDAR systems unwieldy.

The ideal autopilot system would integrate the two detection methods, which would allow a car to self-drive more safely. However, LIDAR systems have been too elaborate and expensive — as much as $75,000 — to add to a vehicle.
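That integration can be pictured as a simple fusion rule: detections the two sensors agree on are confirmed, and anything seen by either sensor alone is still kept, since each covers the other’s blind spots. A deliberately simplified sketch (the representation and tolerances are invented for illustration):

```python
def fuse_detections(radar_hits, lidar_hits, match_tolerance_m=1.0):
    """Merge obstacle detections from two sensors.
    Each hit is a (distance_m, bearing_deg) tuple. Detections at the
    same bearing within match_tolerance_m of each other are treated as
    one obstacle; anything seen by only one sensor is kept anyway,
    because radar works in fog where LIDAR fails, and LIDAR sees the
    soft targets radar misses."""
    fused = list(radar_hits)
    for lh in lidar_hits:
        matched = any(lh[1] == rh[1] and abs(lh[0] - rh[0]) < match_tolerance_m
                      for rh in radar_hits)
        if not matched:
            fused.append(lh)
    return fused

radar = [(20.0, 0.0)]                 # metal car ahead: strong radar return
lidar = [(20.3, 0.0), (8.0, -5.0)]   # same car, plus a pedestrian radar missed
obstacles = fuse_detections(radar, lidar)
# The fused list holds two obstacles: the car and the pedestrian.
```

Production systems fuse probabilistically, with tracking over time, but the complementary-coverage logic is the same.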
But Quanergy, a Sunnyvale, Calif., firm, may have licked the problem.

In January, it announced it has created a $250 LIDAR system that sends and receives a million pulses per second with no moving parts and is small enough to tuck into any vehicle. Its development partners include Mercedes Benz and Hyundai. Quanergy plans to start shipping units before 2017.

The company should have lots of customers. It’s not only cars that are learning to drive themselves.

SELF-DRIVING EVERYTHING
On its corporate campus in Dearborn, Mich., Ford has deployed its automated “Dynamic Shuttle” to move people around. The shuttle also is a platform to test advances in Ford’s self-driving technologies, including hailing the shuttle through a cellphone app. Mercedes’ automated “Future Bus” drives itself in special street lanes, can stop itself at and leave from bus stops, and dodges pedestrians and other obstacles.

The bus relies on “CityPilot” automation, an offshoot of Mercedes’ “HighwayPilot” system that the company is developing to enable large trucks to drive themselves along freeways. Truck makers are motivated by the vision of “platooning” – a row of semis driving nose-to-tail to reduce aerodynamic drag, all controlled by one driver in the lead truck. To achieve that, the trucks’ sensing, turning and braking systems would have to instantly inter-communicate. That achievement is years away.
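The follower trucks in such a platoon would each run a gap-keeping controller fed by the lead truck’s broadcast speed: speed up when the gap opens, brake when it closes or the leader slows. A toy proportional controller, with all gains and distances invented for illustration:

```python
def follower_accel(own_speed, gap, lead_speed,
                   target_gap=10.0, k_gap=0.5, k_speed=0.8):
    """Proportional gap-keeping for a platooning follower truck.
    own_speed and lead_speed in m/s; gap is the distance in metres to
    the truck ahead. Returns a commanded acceleration in m/s^2:
    positive when the gap is too wide or the leader pulls away,
    negative when the gap shrinks or the leader brakes."""
    return k_gap * (gap - target_gap) + k_speed * (lead_speed - own_speed)

# Leader has braked to 22 m/s: the follower is told to decelerate.
a = follower_accel(own_speed=25.0, gap=10.0, lead_speed=22.0)
assert a < 0
# Gap has opened to 14 m at equal speeds: the follower closes it.
assert follower_accel(25.0, 14.0, 25.0) > 0
```

The hard engineering is not this control law but the instant, fail-safe communication the article notes: at highway speed, even a tenth of a second of lag eats several metres of that 10-metre gap.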

But that idea of connectedness is another key theme in tomorrow’s vehicles. For example, a car that needs a parking space could broadcast its need and another car might respond: “I’m just leaving a space a block north of you.” Cars also could broadcast trouble alerts to dealers or your mechanic so he could have the appointment scheduled and parts on hand without you having to phone him.

Companies such as Ford already are working on ways to connect your home and vehicle.

Your car could access your smartphone’s calendar and schedule the repair in the mechanic’s datebook. It also will be able to sense when you’re close to home and automatically send a signal to turn on lights and the audio player so you enter your warmly lit home to the sounds of your favorite song. Ford is integrating its vehicle connectivity system with Amazon’s “Echo” personal assistant and voice-command device. Before you drive, you could ask Echo from inside your home to report how many more miles your electric car will go before it needs to be recharged.

THE ROAD AHEAD
Highways as well as cars are acquiring their own form of automated intelligence. For example, the California Department of Transportation has just completed work on its I-80 “Smart Corridor” near San Francisco. Algorithms control on-ramp traffic lights that serve as valves to let cars onto the freeway, allowing efficient, smooth and integrated traffic flows on the highway and side streets. Electronic signs let drivers know when a snarl lies ahead and route them onto side streets; signs on those streets direct drivers along the most efficient route back to the highway, and those streets get more “green light time” to speed the journey.
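The on-ramp “valves” work on a feedback principle much like the widely deployed ALINEA ramp-metering rule: admit more cars when freeway occupancy is below a target, fewer when it is above. A minimal sketch, with the gain and targets invented for illustration rather than taken from Caltrans:

```python
def next_metering_rate(current_rate, measured_occupancy,
                       target_occupancy=0.22, gain=70.0,
                       min_rate=200.0, max_rate=1800.0):
    """One step of an ALINEA-style feedback ramp meter.
    Rates are in vehicles/hour; occupancy is the fraction of time a
    freeway loop detector is covered (0-1). If the freeway is emptier
    than the target, the ramp admits more cars; if it is more
    congested, the ramp admits fewer. The result is clamped so the
    ramp never shuts completely or floods the freeway."""
    rate = current_rate + gain * (target_occupancy - measured_occupancy) * 100
    return max(min_rate, min(max_rate, rate))

# Congested freeway (30% occupancy): the metering rate drops.
assert next_metering_rate(900.0, 0.30) < 900.0
# Light traffic (10% occupancy): the ramp opens up.
assert next_metering_rate(900.0, 0.10) > 900.0
```

Each correction is small, which is why the corridor’s gains are measured in minutes rather than in dramatic leaps; the payoff comes from applying them every cycle, all day, across every ramp.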

The $79 million project is expected to shave three to five minutes off morning and evening rush-hour congestion. That doesn’t sound like a lot, but it’s a beginning. And it will return as much as $12 million yearly in commuters’ time and fuel saved. – TJ
