Researchers teach an AI how to dribble

While this animated fellow looks like something out of NBA 2K18, it’s really an AI that’s learning how to dribble in real time. The AI starts out fumbling the ball a bit, and by cycle 95 it’s able to do some real Harlem Globetrotters stuff. In short, what you’re watching is a human-like avatar learning a very specialized human movement.

To do this, researchers at Carnegie Mellon and DeepMotion, Inc. created a “physics-based, real-time method for controlling animated characters that can learn dribbling skills from experience.” The system, which uses “deep reinforcement learning,” can use motion capture data to learn basic movements.

“Once the skills are learned, new motions can be simulated much faster than real-time,” said CMU professor Jessica Hodgins.

Once the avatar learns a basic movement, advanced movements come more easily, including dribbling between the legs and crossovers.

From the release:

A physics-based method has the potential to create more realistic games, but getting the subtle details right is difficult. That’s especially so for dribbling a basketball because player contact with the ball is brief and finger position is critical. Some details, such as the way a ball may continue spinning briefly when it makes light contact with the player’s hands, are tough to reproduce. And once the ball is released, the player has to anticipate when and where the ball will return.

The program learned the skills in two stages — first it mastered locomotion and then learned how to control the arms and hands and, through them, the motion of the ball. This decoupled approach is sufficient for actions such as dribbling or perhaps juggling, where the interaction between the character and the object doesn’t have an effect on the character’s balance. Further work is required to address sports, such as soccer, where balance is tightly coupled with game maneuvers, Liu said.
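For the curious, here’s what that decoupled, two-stage training looks like in rough code form. To be clear, this is a toy Python sketch and not the researchers’ implementation: the environments, the rewards, and the little evolution-strategies optimizer standing in for deep reinforcement learning are all made up for illustration. The only thing it shares with the real system is the structure: train locomotion first, then freeze it and train ball control on top.

```python
# Toy sketch of two-stage, decoupled training (illustrative only; not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

def rollout(policy_weights, env_step, obs_dim, horizon=100):
    """Run one episode with a noisy linear policy; return the total reward."""
    obs = np.full(obs_dim, 0.5)
    total = 0.0
    for _ in range(horizon):
        action = policy_weights @ obs + rng.normal(scale=0.1, size=policy_weights.shape[0])
        obs, reward = env_step(obs, action)
        total += reward
    return total

def train(env_step, obs_dim, act_dim, iters=50, pop=16, sigma=0.05, lr=0.02):
    """Tiny evolution-strategies optimizer standing in for deep RL."""
    w = np.zeros((act_dim, obs_dim))
    for _ in range(iters):
        noises = [rng.normal(scale=sigma, size=w.shape) for _ in range(pop)]
        scores = np.array([rollout(w + n, env_step, obs_dim) for n in noises])
        scores = (scores - scores.mean()) / (scores.std() + 1e-8)
        w += lr * sum(s * n for s, n in zip(scores, noises)) / (pop * sigma)
    return w

# Stage 1: locomotion only. The reward stands in for "track the reference (mocap) motion."
def locomotion_env(obs, action):
    next_obs = 0.9 * obs + 0.1 * action
    return next_obs, -np.sum((next_obs - 1.0) ** 2)

legs = train(locomotion_env, obs_dim=4, act_dim=4)

# Stage 2: freeze the locomotion policy and train arm/hand control for the ball.
def ball_env(obs, action):
    body = 0.9 * obs[:4] + 0.1 * (legs @ obs[:4])   # frozen stage-1 policy drives the body
    ball = obs[4:] + 0.1 * action                   # arm actions nudge the ball
    reward = -np.sum((ball - body.mean()) ** 2)     # stands in for "keep the ball in hand"
    return np.concatenate([body, ball]), reward

arms = train(ball_env, obs_dim=6, act_dim=2)
print("stage-1 return:", round(rollout(legs, locomotion_env, 4), 1))
print("stage-2 return:", round(rollout(arms, ball_env, 6), 1))
```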

The system could pave the way for smarter online avatars and even translate into physical interactions with the real world.


Tapping into the power grid could predict the morning traffic

Why is there traffic? This eternal question haunts civic planners, fluid dynamics professors, and car manufacturers alike. But just counting the cars on the road won’t give you a sufficient answer: you need to look at the data behind the data. In this case, CMU researchers show that electricity usage may be key to understanding movement around the city.

The idea that traffic and electricity use might be related makes sense: the times you turn the lights and stereo on and off indicate when you’re home to stay, when you’re sleeping, when you’re likely to leave for work or return, and so on.

“Our results show that morning peak congestion times are clearly related to particular types of electricity-use patterns,” explained Sean Qian, who led the study.

They looked at electricity usage from 322 households over 79 days, training a machine learning model on that usage and the patterns within it. The model learned to associate certain patterns with increases in traffic — so for instance, when a large number of households have a dip in power use earlier than usual, it might mean that the next day will see more traffic when all those early-to-bed people are also early to rise.
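To make that concrete, the setup is roughly a supervised learning problem: evening electricity-use profiles in, next-morning congestion out. Here’s a minimal Python sketch with synthetic data and an off-the-shelf regressor; none of the numbers, feature choices, or the model type come from the study itself.

```python
# Illustrative sketch only: predict next-morning congestion from evening electricity use.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_days, n_households, n_hours = 79, 322, 24

# Synthetic hourly usage (kWh) per household per day, averaged into one profile per day.
usage = rng.gamma(shape=2.0, scale=0.5, size=(n_days, n_households, n_hours))
daily_profile = usage.mean(axis=1)

# Synthetic "ground truth": an earlier-than-usual evening dip in usage means
# more traffic the next morning (the kind of pattern described above).
evening_dip_hour = daily_profile[:, 18:24].argmin(axis=1)
congestion = 1.0 - 0.1 * evening_dip_hour + rng.normal(scale=0.05, size=n_days)

X_train, X_test, y_train, y_test = train_test_split(
    daily_profile, congestion, test_size=0.25, random_state=0
)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("R^2 on held-out days:", round(model.score(X_test, y_test), 3))
```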

The researchers report that their predictions of morning traffic patterns were more accurate using this model than predictions using actual traffic data.

Notably, all that’s needed is the electricity usage, Qian said, not demographic data: “It requires no personally identifiable information from households. All we need to know is when and how much someone uses electricity.”

Interestingly, the correlation goes the other way as well, and traffic patterns could be used to predict electricity demand. A few fewer brownouts would be welcome during a heat wave like this summer’s, so I say the more data the better.

There are many factors like this that indicate the dynamics of a living city — not just electricity use but water use, mobile phone connections, the response to different kinds of weather, and more. Traffic is only one result of a city struggling to operate at maximum capacity, and all these data feed into each other.

The current study was limited to a single electricity provider and apparently other providers are loath to share their data — so there’s still a lot of room to grow here as the value of that data becomes more apparent.

Qian et al. published their research in the journal Transportation Research.

Knitting machines power up with computer-generated patterns for 3D shapes

At last, a use for that industrial knitting machine you bought at a yard sale! Carnegie Mellon researchers have created a method that generates knitting patterns for arbitrary 3D shapes, opening the possibility of “on-demand knitting.” Think 3D printing, but softer.

The idea is actually quite compelling for those of us who are picky about our knitwear. How often have we picked up a knit cap, glove, or scarf only to find it too long, too short, too tight, too loose, etc.?

If you fed your sartorial requirements (a 3D mesh) into this system from James McCann and students at CMU’s Textiles Lab, it could quickly spit out a pattern that a knitting machine could follow easily yet is perfectly suited to your purposes.
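To give a flavor of what “a pattern a knitting machine could follow” means, here’s a deliberately simple Python sketch that turns a beanie-shaped surface of revolution into per-row stitch counts. This is not the Textiles Lab system, which handles arbitrary 3D meshes and real machine constraints; the gauge numbers and the hat shape are invented for illustration.

```python
# Toy example: convert a simple 3D shape (a beanie) into row-by-row stitch counts.
# Gauge and shape are assumptions for illustration, not real machine parameters.
import math

STITCHES_PER_CM = 1.8   # horizontal gauge (assumed)
ROWS_PER_CM = 2.4       # vertical gauge (assumed)

def beanie_radius(height_cm, total_height_cm=20.0, max_radius_cm=9.0):
    """Radius of the hat at a given height: cylindrical body, rounded crown."""
    crown_start = 0.6 * total_height_cm
    if height_cm <= crown_start:
        return max_radius_cm
    t = (height_cm - crown_start) / (total_height_cm - crown_start)
    return max_radius_cm * math.sqrt(max(0.0, 1.0 - t ** 2))

def shape_to_rows(total_height_cm=20.0):
    """Return the stitch count for each row, knitting in the round bottom-up."""
    n_rows = int(total_height_cm * ROWS_PER_CM)
    rows = []
    for r in range(n_rows):
        circumference = 2 * math.pi * beanie_radius(r / ROWS_PER_CM, total_height_cm)
        rows.append(max(4, round(circumference * STITCHES_PER_CM)))
    return rows

rows = shape_to_rows()
print(f"cast on {rows[0]} stitches; {len(rows)} rows total")
for i in range(1, len(rows)):
    delta = rows[i] - rows[i - 1]
    if delta:
        print(f"row {i}: {'decrease' if delta < 0 else 'increase'} {abs(delta)} stitches")
```

The real system has to do much more than this, of course, scheduling increases, decreases and yarn transfers so the machine never breaks or jams, which is exactly the careful part mentioned below.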

This has to be done carefully — the machines aren’t the same as human knitters, obviously, and a poorly configured pattern might lead to yarn breaking or jamming the machine. But it’s a lot better than having to build that pattern purl by purl.

With a little more work, “Knitting machines could become as easy to use as 3D printers,” McCann said in a CMU news release.

Of course, it’s unlikely you’ll have one of your own. But maker spaces and designer ateliers (I believe that’s the term) will be more likely to have one if it’s this easy to create new and perfectly sized garments with them.

McCann and his team will be presenting their research at SIGGRAPH this summer.

Aurora hires SpaceX’s Jinnah Hosein, opens SF and Pittsburgh offices

Self-driving technology company Aurora has made some key moves on its leadership team and overall company growth: It’s bringing on SpaceX’s now-former head of software engineering, Jinnah Hosein, to lead its own software engineering team in a VP role. The autonomous software provider is also opening two new offices, one in San Francisco and another in Pittsburgh, in addition to its existing HQ in Palo Alto.

Bringing on Hosein is a huge move for Aurora, which will now have additional senior leadership to help direct and organize its growing engineering team, according to Aurora co-founder Chris Urmson. Hosein’s background includes his time as VP of Software Engineering at SpaceX, where he spent the past four years and oversaw projects including the recent successful Falcon Heavy launch. Before that, he was Director of Software Engineering at Google, working on Google Cloud, site reliability and other software projects.

“It’s a pretty incredible set of experiences he has,” Urmson said. “We’re just excited about him bringing that leadership capability, that experience in building both cloud and incredibly reliable software to our team and working with the rest of the folks here.”

Hosein also briefly oversaw Tesla’s software operations alongside SpaceX’s, serving as acting VP of Tesla’s Autopilot Software before Tesla hired Apple’s Chris Lattner for the role. Urmson says that Hosein’s proven track record launching rockets and organizing software projects of that complexity is more important to Aurora than any brief time he may have spent on Autopilot, however.

Aurora is also opening two new physical offices and testing locations, as mentioned, including the San Francisco one that Urmson says will be a welcome relief to some of their employees currently commuting south to Palo Alto, as well as a way to attract more talent looking to work in the city proper. The Pittsburgh office gives them a new testbed where they can prove their tech in adverse winter driving conditions, and it also puts them in close proximity to Carnegie Mellon and Pittsburgh’s robotics talent pool.

“When you combine that, between the offices we have in the South Bay, the San Francisco test areas that we’ll now have more access to and the Pittsburgh test areas, we have a pretty exciting diversity of test environments and places to operate,” Urmson added.

Aurora has already announced partnerships with Volkswagen, Hyundai, Byton and more, and recently added LinkedIn founder Reid Hoffman and Index Ventures’ Mike Volpi to its board.