The Department of Energy is preparing to use the massive computing power of its national laboratories to tackle a daily scourge of American life: traffic jams.
The effort is aimed at more than just improving motorists’ moods. If it works, it could cut U.S. transportation fuel consumption by up to 20% and reduce auto emissions.
A second goal is to recover as much as $100 billion in lost worker productivity by unsnarling rush hour traffic jams in U.S. cities over the next 10 years.
Two years ago, the National Renewable Energy Laboratory in Golden, Colo., selected Chattanooga, Tenn. (population 182,799), as the guinea pig for its first traffic-cutting experiment.
The city, nestled among the hills and ridges of the southeastern corner of the state, is ranked among the nation’s top 20 most traffic-congested cities.
The first step for NREL scientists was to build a detailed computer model, or what the lab calls a “digital twin,” of the city’s traffic patterns to isolate problems and then explore solutions to its snarled rush hours.
“Chattanooga provided an ideal microcosm of conditions and opportunities to work with an exceptional roster of municipal and state partners,” explained John Farrell, who manages the vehicle technologies program for NREL.
“Eventually, the plan is to apply these solutions to larger metropolitan areas and regional corridors across the country.”
The national goal is to save the 3.3 billion gallons of fuel wasted each year, to recover the 8.8 billion hours of lost productivity and to cut the surges in emissions produced by cars idling in traffic jams.
In a typical (non-COVID-19) year, according to NREL, a driver spends 46 hours “stuck behind the wheel.”
As Farrell explained in an interview, the key to the two-year study of Chattanooga’s traffic was Eagle, NREL’s latest supercomputer, which can perform 8 quadrillion calculations per second. The computer’s technology and the accumulating knowledge of how to apply it, he pointed out, are a combination that didn’t exist five years ago.
The machine helped NREL apply “machine learning,” a process that can sift through huge amounts of data and quickly recognize patterns that might otherwise take humans weeks or even months to pick out. Chattanooga’s traffic provided mountains of data.
The scientists worked with a long list of partners. Those included the city’s transportation department, three universities, the Tennessee and Georgia departments of transportation, and several trucking companies such as FedEx Corp.
More than 500 data sources were factored into the research. Among them: satellites in the U.S. Global Positioning System, automated cameras, radar detectors, weather stations, city records showing where cars were located and visual observations.
“One of the areas where machine learning struggles is that it can tell you the what but not the why” of trouble-prone areas, added Farrell. The first “what” that Eagle pounced on was Shallowford Road, a multilane highway that feeds drivers from the suburbs into the city.
The researchers found that four traffic signal controllers along Shallowford were timed to handle very heavy traffic, which meant that in the middle of the day drivers found themselves stopped by a parade of red lights.
One of their “congestion relief” moves was to adjust the signals’ timing to match actual traffic, which resulted in more green lights and a thumping 16% decrease in fuel use for vehicles on the highway.
With one move, they had brought an area close to the fuel-saving target they had set for the whole city. Before that, their approach had been all computer-based theories. Farrell called the result “shocking and gratifying at the same time.”
“We convinced ourselves we had a viable route to get there,” he said, referring to a citywide congestion solution.
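The article does not describe the algorithm NREL actually used to retime the Shallowford Road signals, but the general idea — giving each approach green time in proportion to the traffic it is actually carrying, rather than a fixed peak-hour plan — can be illustrated with a toy sketch. All function and parameter names here are hypothetical.

```python
# Illustrative sketch only: this is NOT NREL's method, just the general
# principle of matching a signal's green splits to measured demand.

def green_splits(counts, cycle_s=90, min_green_s=10):
    """Divide a fixed signal cycle (seconds) among approaches in
    proportion to their measured vehicle counts, guaranteeing each
    approach a minimum green time."""
    total = sum(counts.values())
    # Seconds left over after every approach gets its minimum green.
    spare = cycle_s - min_green_s * len(counts)
    splits = {}
    for approach, n in counts.items():
        share = (n / total) if total else 1 / len(counts)
        splits[approach] = min_green_s + share * spare
    return splits

# Midday traffic: busy main road, nearly empty side streets.
midday = {"main_nb": 60, "main_sb": 55, "side_eb": 10, "side_wb": 5}
print(green_splits(midday))
```

Under a peak-hour plan, the side streets would keep their long greens all day, stopping midday drivers on the main road; here the quiet approaches fall back toward the minimum and the main road gets the spare time.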
The next two years will be devoted to a regional traffic solution, one that includes other highways leading into the Chattanooga area. The researchers will study the behavior of truck traffic and how it may respond to decongestion changes in Chattanooga.
The changes will include “dynamic speed limits” that adjust to move traffic more effectively and tinkering with the timing of the stoplights that control the ramps leading onto freeways.
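The article names “dynamic speed limits” without explaining how they work. A common, simple scheme — offered here only as an illustrative sketch, with hypothetical names and thresholds, not as NREL’s design — lowers the posted limit as measured traffic density approaches a congested level, smoothing flow so vehicles keep moving instead of stopping and starting.

```python
# Illustrative sketch only: a linear step-down of the posted limit as
# traffic density nears a congestion threshold. All values are made up.

def dynamic_limit(density_veh_per_km, free_limit=70, min_limit=40,
                  critical_density=25):
    """Return a posted speed limit (mph) that falls linearly from the
    free-flow limit toward a floor as density nears the critical value,
    rounded to 5 mph steps for signage."""
    if density_veh_per_km <= 0:
        return free_limit
    frac = min(density_veh_per_km / critical_density, 1.0)
    limit = free_limit - frac * (free_limit - min_limit)
    return max(min_limit, round(limit / 5) * 5)

# Empty road keeps the full limit; a road at or past the critical
# density gets the floor value.
print(dynamic_limit(0), dynamic_limit(12.5), dynamic_limit(30))
```

Slowing traffic slightly before a bottleneck sounds counterintuitive, but it can raise throughput by preventing the shockwaves of braking that hard stops send backward through a queue — the same stop-and-go behavior Farrell identifies as especially costly for trucks.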
“Freight is one of the major contributors to gridlock, not just there but all across the country,” noted Farrell, who explained that making trucks repeatedly stop and start, especially in hilly country, causes more fuel losses and climate-changing emissions than cars produce.
According to NREL, more than 11 billion tons of freight is transported annually across U.S. highways, amounting to more than $32 billion worth of goods each day.
By around October 2022, NREL and its partners hope they will have a model that can be applied to other cities and their surrounding regions. Denver and Atlanta are among the most likely cities where the traffic may be modeled next.
Farrell said he hopes the Chattanooga-based models and experiences can be adapted to bring answers to other urban congestion-based problems within a year.
“The impact on climate change is just one of many problems we can help reduce,” he said. “When you address congestion issues, you improve safety as well.”
While COVID-19 recently has cut the amount of time lost in traffic jams by around 26%, Farrell said that once the nation recovers, the time lost sitting in traffic jams likely will return to the national norm, which runs between 40 and 50 hours a year.
That turns into 8.8 billion hours in lost productivity. “The impact of minimizing that turns out to be a very large number because it will affect a lot of people,” added Farrell, who predicted it may inspire more rapid tests and still faster supercomputers.
Reprinted from E&E News with permission from POLITICO, LLC. Copyright 2021. E&E News provides essential news for energy and environment professionals.