Sony AI driver achieves ‘superhuman’ race time 
    2022-02-14  08:53    Shenzhen Daily

AI agents have bested humans at many games, from chess to Go to poker. Now, the machines can claim a new high score on the classic racing video game series “Gran Turismo.”

Sony announced last week that its researchers have developed an AI driver named GT Sophy that is “reliably superhuman,” able to beat top human drivers in “Gran Turismo Sport” in back-to-back laps. Experts in both video game racing and AI say GT Sophy’s success is a significant breakthrough, with the agent showing mastery of tactics and strategy.

“Outracing human drivers so skilfully in a head-to-head competition represents a landmark achievement for AI,” writes Stanford automotive professor J. Christian Gerdes in an editorial in the scientific journal Nature that accompanies a paper describing the work. “GT Sophy’s success on the track suggests that neural networks might one day have a larger role in the software of automated vehicles than they do today.”

GT Sophy was trained using a method known as reinforcement learning: essentially a form of trial-and-error in which the AI agent is thrown into an environment with no instructions and rewarded for hitting certain goals. In the case of GT Sophy, Sony’s researchers say they had to craft this “reward function” extremely carefully: for example, fine-tuning penalties for collisions in order to shape a driving style that was aggressive enough to win but that didn’t lead to the AI simply bullying other racers off the road.
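The reward shaping described above can be sketched in a few lines. This is a hypothetical illustration, not Sony's actual reward function: the function name, penalty values, and inputs are invented to show how a collision penalty might be tuned so the agent races hard without shoving opponents off the road.

```python
def shaped_reward(progress_m, collided, at_fault, off_course):
    """Illustrative shaped reward: pay for forward progress, penalize
    collisions (more when the agent caused them) and leaving the track.
    All weights here are made-up examples, not GT Sophy's real values."""
    reward = progress_m  # meters gained along the track this step
    if collided:
        # A penalty tuned too low lets the agent bully other cars off
        # the road; too high makes it overly timid in close racing.
        reward -= 5.0 if at_fault else 1.0
    if off_course:
        reward -= 2.0
    return reward
```

In reinforcement learning, the agent maximizes the long-run sum of such rewards, so small changes to these penalty weights can shift its whole driving style.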

Using reinforcement learning, GT Sophy was able to navigate around a racetrack with just a few hours of training and “within a day or two” was faster than 95 percent of drivers in its training dataset. After some 45,000 total hours of training, GT Sophy was able to achieve superhuman performance on three tracks. (For “Gran Turismo Sport” players, the tracks in question were Dragon Trail Seaside, Lago Maggiore GP, and Circuit de la Sarthe.)

A common concern when testing AI agents against humans is that machines have a number of innate advantages, like perfect recall and fast reaction times. Sony’s researchers note that GT Sophy does have some advantages compared to human players, like a precise map of the course with coordinates of track boundaries and “precise information about the load on each tire, slip angle of each tire, and other vehicle state.” But, they say, they accounted for two particularly important factors: action frequency and reaction time.

GT Sophy’s inputs were capped at 10 Hz, compared to a theoretical maximum human input of 60 Hz. This sometimes led to human drivers displaying “much smoother actions” at high speeds, write the researchers. For reaction times, GT Sophy was able to respond to events in the game environment in 23-30 ms, faster than the estimated best reaction time of professional athletes, 200-250 ms. To compensate, the researchers added artificial delay, training GT Sophy with reaction times of 100 ms, 200 ms, and 250 ms. Even so, they report, “all three of these tests achieved a superhuman lap time.” (SD-Agencies)
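One simple way to impose an artificial reaction time like the one described above is to buffer observations so the policy always acts on a slightly stale view of the world. This is a minimal sketch under assumed names (`DelayedAgent`, `policy`); the paper does not say Sony implemented it this way. At the agent's 10 Hz action rate, one buffered step corresponds to 100 ms of delay.

```python
from collections import deque

class DelayedAgent:
    """Wraps a policy so it acts on observations from `delay_steps`
    ticks ago, approximating a human-like reaction time.
    Illustrative sketch only; names and structure are assumptions."""

    def __init__(self, policy, delay_steps):
        self.policy = policy
        # Keep the current observation plus `delay_steps` older ones.
        self.buffer = deque(maxlen=delay_steps + 1)

    def act(self, observation):
        self.buffer.append(observation)
        # buffer[0] is the oldest retained observation; until the buffer
        # fills, the agent simply reacts to the earliest one it has seen.
        delayed_obs = self.buffer[0]
        return self.policy(delayed_obs)
```

With `delay_steps=2` at 10 Hz, the agent responds to the game state from 200 ms earlier, matching one of the tested human-like delays.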

Copyright by Shenzhen Press Group; unauthorized reproduction prohibited. Copyright 2010-2020, All Rights Reserved.
Shenzhen Daily E-mail:szdaily@126.com