YEARS OF RESEARCH have shown that teenagers need their sleep. Yet high schools often start very early in the morning. Starting them later in Boston would require tinkering with elementary and middle school schedules, too — a Gordian knot of logistics, pulled tight by the weight of inertia, that proved impossible to untangle.
Until the computers came along.
Last year, the Boston Public Schools asked MIT graduate students Sébastien Martin and Arthur Delarue to build an algorithm that could do the enormously complicated work of changing start times at dozens of schools — and rerouting the hundreds of buses that serve them.
Martin, a bright-eyed Frenchman, and Delarue, a thin, bespectacled American, had already done some impressive bus route optimization for the district, allowing officials to take 50 vehicles off the road at a savings of some $5 million. And they took to the new project with gusto, working 14- and 15-hour days to meet a tight deadline — and occasionally waking up in the middle of the night to feed new information to a sprawling MIT data center.
The machine they constructed was a marvel. Sorting through 1 novemtrigintillion options — that’s 1 followed by 120 zeroes — the algorithm landed on a plan that would trim the district’s $100 million-plus transportation budget while shifting the overwhelming majority of high school students into later start times.
The potential benefits were striking: Sleep-deprived teens are at increased risk for poor academic performance, binge drinking, and suicide. And district officials were thrilled about the prospect of more shut-eye for their oldest students.
“This is a very exciting evening,” Tommy Chang, then superintendent, told the Globe in December, just before a key school committee vote. “This is a problem that nobody thought we could solve, and we are going to solve it tonight.”
The algorithm was poised to put Boston on the leading edge of a digital transformation of government. In New York, officials were using a regression analysis tool to focus fire inspections on the most vulnerable buildings. And in Allegheny County, Pa., computers were churning through thousands of health, welfare, and criminal justice records to help identify children at risk of abuse.
The potential, says Stephen Goldsmith, a former mayor of Indianapolis who now runs the Data-Smart City Solutions project at Harvard University, is enormous: “more effective utilization of public resources, more individuals helped, more problems preempted.”
While elected officials tend to legislate by anecdote and oversimplify the choices that voters face, algorithms can chew through huge amounts of complicated information. The hope is that they’ll offer solutions we’ve never imagined — much as Google Maps, when you’re stuck in traffic, puts you on an alternate route, down streets you’ve never traveled.
Dataphiles say algorithms may even allow us to filter out the human biases that run through our criminal justice, social service, and education systems. And the MIT algorithm offered a small window into that possibility. The data showed that schools in whiter, better-off sections of Boston were more likely to have the school start times that parents prize most — between 8 and 9 a.m. The mere act of redistributing start times, if aimed at solving the sleep deprivation problem and saving money, could bring some racial equity to the system, too.
Or, the whole thing could turn into a political disaster.
As the graduate students and the district dug into the work, they confronted some baseline realities: only a quarter of high school students started school after 8 a.m.; fewer than half of BPS parents were happy with their children's start times; and the district was using 600 buses to transport kids to school at a cost of tens of millions of dollars.
Officials faced a number of difficult trade-offs. They could calibrate start times to minimize the number of buses required and save as much money as possible. But fewer parents would be happy with the results.
The district could start every high schooler after 8 a.m., and give more elementary and middle school parents start times they’d be happy with. But that would require more buses and higher costs.
Ultimately, officials picked one of the thousands of solutions the algorithm generated, a plan that attempted to balance several goals: student health, cost savings, and parental happiness. But opponents said the formula had some big blind spots.
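The balancing act the district faced is, at bottom, a weighted multi-objective optimization. Here is a minimal sketch of the idea in Python; the weights, plan names, and numbers are invented for illustration and are not the district's data or the MIT team's actual objective function.

```python
# Hypothetical sketch of scoring candidate schedules on several goals.
# Weights and candidate numbers are invented for illustration; they are
# not the district's data or the MIT team's actual formula.

def score(plan, weights):
    """Combine normalized objectives into one weighted score (higher is better)."""
    return (weights["health"] * plan["teens_after_8am"]       # share of HS students starting after 8 a.m.
            + weights["cost"] * plan["savings_fraction"]      # share of the bus budget saved
            + weights["parents"] * plan["parents_satisfied"]) # share of parents happy with start times

candidates = [
    {"name": "min-cost", "teens_after_8am": 0.40, "savings_fraction": 0.20, "parents_satisfied": 0.45},
    {"name": "all-late", "teens_after_8am": 0.94, "savings_fraction": 0.02, "parents_satisfied": 0.55},
    {"name": "balanced", "teens_after_8am": 0.85, "savings_fraction": 0.10, "parents_satisfied": 0.65},
]

weights = {"health": 0.4, "cost": 0.3, "parents": 0.3}
best = max(candidates, key=lambda plan: score(plan, weights))
print(best["name"])  # with these weights, the balanced plan wins
```

Shift the weights toward cost and the min-cost plan wins instead. The arithmetic is trivial; the hard part, as Boston learned, is agreeing on the weights.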
District officials expected some pushback when they released the new school schedule on a Thursday night in December, with plans to implement in the fall of 2018. After all, they’d be messing with the schedules of families all over the city.
But no one anticipated the crush of opposition that followed. Angry parents signed an online petition and filled the school committee chamber, turning the plan into one of the biggest crises of Mayor Marty Walsh’s tenure. The city summarily dropped it. The failure would eventually play a role in the superintendent’s resignation.
It was a sobering moment for a public sector increasingly turning to computer scientists for help in solving nagging policy problems. What had gone wrong? Was it a problem with the machine? Or was it a problem with the people — both the bureaucrats charged with introducing the algorithm to the public, and the public itself?
AS ALGORITHMS HAVE taken on a larger role in state and local governance, they’ve attracted mounting criticism.
Books with titles such as “Weapons of Math Destruction” and “Automating Inequality” have warned that formulas, though capable of stamping out human biases, can also replicate and supersize them.
Take the crime prediction software police departments use to deploy officers and equipment. It relies, in part, on past interactions with law enforcement. But people of color are picked up for “nuisance crimes” at disproportionate rates.
The data, in other words, are biased. And if the software uses them to recommend a heavier police presence in black and Latino neighborhoods, that can lead to more arrests for the sort of low-level crimes that go unpunished in other places. Those arrests are then fed into the algorithm, and the cycle continues.
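The feedback loop critics describe can be captured in a toy simulation. Everything below is invented to illustrate the mechanism: the neighborhood names, the counts, and the "detection rate" are hypothetical, and underlying crime is assumed identical in both places.

```python
# Toy simulation of the feedback loop critics describe. Neighborhood names,
# counts, and the detection rate are all invented; underlying crime is
# assumed identical in both neighborhoods.

def redeploy(arrest_counts, patrols_total=10):
    """Allocate patrols in proportion to past recorded arrests."""
    total = sum(arrest_counts.values())
    return {hood: round(patrols_total * n / total) for hood, n in arrest_counts.items()}

def patrol_effect(patrols, detection_rate=2):
    """More patrols mean more low-level arrests recorded, regardless of true crime."""
    return {hood: p * detection_rate for hood, p in patrols.items()}

# Start with a modest historical disparity in recorded arrests.
arrests = {"neighborhood_A": 60, "neighborhood_B": 40}
for _ in range(3):
    patrols = redeploy(arrests)
    arrests = {h: arrests[h] + n for h, n in patrol_effect(patrols).items()}

print(arrests)  # the gap in recorded arrests grows each cycle: 20, 24, 28, 32
```

The record-driven allocation keeps sending more patrols to the neighborhood with more recorded arrests, so the gap in the data widens even though the simulation gives both neighborhoods identical underlying crime.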
What’s particularly insidious about this kind of bias, critics say, is that it’s cloaked in the “neutrality” of machines. The computer, the average user figures, must be right.
That leads to the second major complaint: Too often, the algorithms are black boxes. Bureaucrats don’t have the time or know-how to construct them, so they turn to private vendors who have a proprietary interest in keeping their formulas secret. Mayors and superintendents can insist on transparency, of course, but they may have trouble getting anyone to bid on their projects.
Applying the algorithms
[Interactive: an accompanying tool let readers experiment with start times on a school-by-school basis, weighing goals such as minimizing cost, honoring parents' preferences, or balancing savings with later start times.]
By some measures, the MIT algorithm stacks up pretty well against the standard established by critics. Martin and Delarue aren’t private vendors. They’re graduate students who are willing to share their work. Even now, they’re working to publish their algorithm in an academic journal.
Of course, publication at the time of the controversy would have been ideal. But as Rutgers law professor Ellen P. Goodman argues, an “open source” algorithm isn’t the be-all and end-all of transparency.
“It’s both too much, and too little,” she explains. Too much, because most people can’t make sense of an algorithm, and too little, because the formula doesn’t tell you enough about the data it’s crunching.
Meaningful transparency, she says, requires a clear explanation of what an algorithm aims to do, which factors it considers, and how much weight it gives them.
The Boston Public Schools took some notable steps in that direction. Just before the release of the new bell times, the school committee laid out the algorithm’s four guiding principles: increase the number of high school students starting school after 8 a.m.; decrease the number of elementary school students dismissed after 4 p.m., so they wouldn’t have to travel home in the dark; accommodate the needs of special education students wherever possible; and generate transportation savings that could be reinvested in the schools.
But in retrospect, it’s clear that the school officials who communicated with the public about the algorithm fell short in at least one crucial respect.
Big districts stagger their start times so a single fleet of buses can serve every school: dropping off high school students early in the morning, then circling back to get the elementary and middle school kids.
If you’re going to push high school start times back, then you’ve probably got to move a lot of elementary and middle schools into earlier time slots. The district knew that going in, and officials dutifully quizzed thousands of parents and teachers at every grade level about their preferred start times.
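The fleet-reuse logic behind tiered start times can be sketched simply: a bus can cover several schools only if their morning runs don't overlap in time. The schools, times, and durations below are hypothetical.

```python
# Sketch of the fleet-reuse logic behind tiered start times. Each run is a
# (start_hour, end_hour) interval; a bus can take a new run only after its
# previous one ends. All times here are hypothetical.

def buses_needed(runs):
    """Minimum buses = peak number of overlapping runs (sweep-line count)."""
    # Sort events by time; at a tie, run endings (-1) sort before
    # starts (+1), so a bus can be reused with zero turnaround.
    events = sorted((t, delta) for start, end in runs for t, delta in ((start, 1), (end, -1)))
    concurrent = peak = 0
    for _, delta in events:
        concurrent += delta
        peak = max(peak, concurrent)
    return peak

# Staggered tiers: the high school runs end before the elementary runs begin,
# so two buses can cover four schools.
staggered = [(6.5, 7.25), (6.5, 7.25), (7.5, 8.25), (7.5, 8.25)]
# Everyone starts late: all four runs overlap, so each needs its own bus.
all_late = [(7.5, 8.25), (7.5, 8.25), (7.5, 8.25), (7.5, 8.25)]

print(buses_needed(staggered), buses_needed(all_late))  # 2 4
```

Collapse every school into the same late window and the required fleet multiplies, which is why later high school starts had to push elementary starts earlier.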
But they never directly confronted constituents with the sort of dramatic change the algorithm would eventually propose: shifting start times at some elementary schools by two hours or more.
Hundreds of families were facing a shift from a 9:30 a.m. start to 7:15 a.m. And for many, that was intolerable. They’d have to make major changes to work schedules or even quit their jobs. And because their kids would have to go to bed so early, they’d miss out on valuable family time in the evening.
These were the concerns at the heart of an uprising that had parents buttonholing the mayor at a Christmas tree lighting in West Roxbury and waving “Families over Algorithms” signs at a packed school committee meeting. “We are infuriated,” one father declared, as the crowd erupted in applause.
Martin and Delarue, the MIT students, were at that meeting. They were sympathetic. Maybe the process had gone wrong somewhere. Maybe the district hadn’t adequately engaged the parents.
They couldn’t help but notice, though, that most of the critics hailed from wealthier sections of the city. The MIT team had built a formula that promised racial equity — distributing the best school start times more evenly across white, black, and brown neighborhoods. And here were the most active, politically connected families in the city trying to spike it.
Sure, the district officials who pushed the algorithm into the public sphere had made some mistakes. But what about the people who were receiving it? Were they really doing the right thing? Were they being fair-minded?
“I recognize if you have two or three children who need to be picked up at a particular time, and you organize your life around that, and it’s drastically changing — I recognize how hard that is,” says Dimitris Bertsimas, the MIT students’ adviser. “But you put zero weight on the rest of society?”
The moral equation wasn’t quite that simple, though. It’s not clear the white parents who packed the school committee knew much, if anything, about the racial disparities the algorithm attempted to remedy. And a couple of days after the school committee meeting, the NAACP and the Lawyers’ Committee for Civil Rights and Economic Justice actually came out against the plan.
Even if the algorithm promised to reduce inequities, the upheaval involved — with nearly 85 percent of the district getting new start times — would hit black and brown families especially hard, the groups argued.
“We know our parents of color are disproportionately likely to have lower-wage jobs that will make it harder for them to change schedules to meet the new demands of BPS, let alone pay more money for additional child care after school,” said Matt Cregor, education project director at the Lawyers’ Committee, in an interview with the Globe.
The opposition had crested. Soon, the algorithm would crash.
IN THE END, the school start time quandary was more political than technical.
The MIT algorithm had done all the city could reasonably ask. It had sorted through more possibilities than any human being could hope to contemplate. And it had come up with a solution no bureaucrat had ever mustered.
But it was people who made the final call. People with competing interests and a mish-mash of motivations. This was a fundamentally human conflict, and all the computing power in the world couldn’t solve it.
If anything, the algorithm fueled the conflict and made the choices stark. Before the district commissioned the formula, few parents had thought about the interplay between high school start times and teenage sleep deprivation. And even fewer understood that starting high school later would mean sending younger kids to school earlier.
But even if the algorithm flopped, says Goldsmith, the former Indianapolis mayor who now runs Data-Smart City Solutions at Harvard, it was worth pursuing. “We live in an inherently political world,” he says, “and sometimes, politics are going to trump science. But if the science can illuminate the disparities, that’s better than continuing in ignorance.”
Once a problem has been identified, it’s hard to forget about it. And even if it’s pushed aside now, it can be picked up again later. That’s certainly what the MIT students are hoping for. Maybe, they say, if school officials took up the algorithm again — and if they better engaged parents — they could come up with a better system than the one they have now. Imperfect, perhaps. But substantially better. “With the algorithm,” says Martin, “it’s so easy to improve, a lot, on the things that matter.”
The district, stung by last year’s blow-up, seems unlikely to change bell times in the near future. But perhaps Martin and Delarue can take the idea elsewhere. Last year, even after everything went sideways in Boston, some 80 school districts from around the country reached out to the whiz kids from MIT, eager for the algorithm to solve their problems.
Data analysis by Sébastien Martin and Arthur Delarue, PhD students at the MIT Operations Research Center.