So, Were We Wrong About San Diego Unified's Grad Rate? (Hint: No.)
San Diego Unified wrote that a new report on its graduation rate proved “allegations a local news outlet raised about the district’s graduation rate … are false.” The report did not prove any of VOSD’s findings false. In some cases, it added important context to issues we’ve been reporting for years.
The communications team at San Diego Unified School District is quite excited about a new report from UC San Diego’s San Diego Education Research Alliance.
The report is an update on the class of 2016, which the group had studied before. That cohort was significant because it was the first to face new, more stringent graduation requirements. Read Maya Srikrishnan’s breakdown of the report’s findings.
The researchers’ earlier warnings about how far behind that class was startled us. So, last year, when the district proclaimed the class had a 91 percent graduation rate, we wanted to understand how that happened.
Mario Koran took on the quest, which went in a number of different directions, many of which school district officials did not appreciate.
When the new report came out, the district pounced. In a press release, the district wrote: “In addition to analyzing graduation rates and student performance, the report looked into allegations a local news outlet raised about the district’s graduation rate. Data analysis included in the report helps confirm the allegations are false.”
I’m taking the leap to assume that by “a local news outlet,” they mean Voice of San Diego.
This kind of charge is very serious to a news organization, and it’s one I found startling. First, I think what we did was a genuine service. We found things out about why students had left the school district. Wouldn’t the district appreciate that?
So I reviewed the report and our coverage to see if we had written anything false that needed to be addressed.
I started with what we reported.
Essentially, Koran’s reporting focused on four factors that add context to the graduation rate.
First, Koran revealed that the rate did not include students from the class of 2016 who were in charter schools. It also didn’t include students who left district-managed schools after their freshman year for charter schools or other avenues.
In other words, the graduation rate is not the number of students who graduated, divided by the number who started high school. I, for one, did not know that. The term graduation rate would seem to mean the percentage of kids who graduate from high school after starting.
It does not.
As we’ve pointed out numerous times, there is nothing nefarious about the district excluding these students from its calculations – indeed, that’s how they’re required to do it. And yet simply stating the fact that students who left district high schools don’t count toward the graduation rate has enraged the district. As recently as Thursday, its spokeswoman was still dwelling on it:
— Maureen Magee (@MaureenMagee) September 7, 2017
The second thing Koran discovered is that many of the students who had left district-managed schools for charter schools had been struggling significantly. Had they stayed in district-managed schools, the graduation rate would have been lower. Relatedly, and importantly, he heard from people who said students were encouraged by the district to leave. We have variously described that as them being “counseled to leave” or “pushed out.”
The third point Koran examined was how much the district relied on online courses to help kids reach graduation requirements.
The fourth significant point he made was that students and teachers discussed and demonstrated just how easy those courses were to cheat.
That’s it. That’s what we reported over several months.
Did the new report prove any of those points wrong?
What exactly does the graduation rate measure?
On our first finding, the report did not refute anything we wrote. It actually added a lot of data to the context we laid out.
The report found that 80 percent of the class of 2016 graduated with a standard diploma. That’s not 91 percent.
This represents meaningful improvement for some students, given that only 7 of 10 students were on track at the end of grade 11. Put differently, 1 in 3 students who were off track to graduate on time did graduate by summer 2016.
The district does not count the kids who left its schools for charters, even charters in the same district. That’s fine. It’s methodologically sound. We have never said otherwise.
But it is nonetheless interesting and important that so many students from the class of 2016 left district-managed schools for charter schools.
Were the students who left struggling? Yes.
This is very clear in the report.
Of those students who were off track at the end of grade 11, roughly one-third graduated on time, roughly one-third left district-managed schools, either enrolling in charter schools in the district or leaving altogether, and roughly one-third dropped out or stayed enrolled but failed to graduate by summer of 2016.
Notably, almost all of those leaving district-managed schools were off track to complete the a-g requirement on time …
The report also found what we did: The grade point average of the kids who left was very low, just 1.49.
So, to review: One out of 10 students from the class of 2016 left a district-managed high school for a charter school. They were, for the most part, struggling. The district got eight out of nine of the rest to graduate. This is better than researchers thought it would be, but there’s a lot of room for improvement.
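The arithmetic behind that summary can be checked with a quick sketch. The cohort of 100 students below is hypothetical, scaled from the percentages cited above (1 in 10 left for charters; 80 percent of the class earned a standard diploma):

```python
# Illustrative arithmetic on a hypothetical cohort of 100 students,
# scaled from the reported percentages (not actual enrollment counts).
cohort = 100
left_for_charters = cohort // 10          # 1 in 10 left district-managed schools
remaining = cohort - left_for_charters    # 90 students stayed
graduated = 80                            # 80 percent of the class got a standard diploma

share_of_remaining = graduated / remaining
print(f"{graduated}/{remaining} = {share_of_remaining:.0%} of those who stayed graduated")
# about 89 percent, i.e., roughly eight out of nine
```

That is where “eight out of nine of the rest” comes from: 80 graduates out of the 90 students who stayed.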
Finally, did district staff encourage, counsel or push struggling students to leave?
The district acknowledged recently that doing so had long been the practice and that it was taking steps to change it.
The researchers said they didn’t investigate whether that was happening. They did, though, specifically try to figure out if the new requirements had led more students to leave, which might indicate that more students were counseled out.
They found there was no apparent increase in the number of kids counseled to leave, at least based on a data comparison of previous years.
But we never reported the new graduation requirements had provoked more to be counseled out. We simply found it quite newsworthy that students would be counseled to leave for charter schools – especially considering the rather intense competition for enrollment. The district itself had said pushing students who were struggling out would be morally wrong.
All the more reason it was so interesting that the school board president would admit it was something they did.
Now, is counseling a student to leave different from “pushing” them out? We rarely used that word in our coverage (though I did personally). I think telling a student and his or her parent they should leave a school qualifies as a “push,” whether anyone physically pushed them out the door or not.
The researchers recommend that the district do a better job of tracking the kids who leave.
Were online classes crucial to the graduation rate? Yes.
Koran examined how the district relied on a new company and its “credit-recovery courses.”
The impact of this program is not in dispute.
Additionally, the district contracted with Edgenuity to provide approved a-g courses online, and multiple district officials have told us that this strategy helped many students to catch up in grade 12.
Koran’s story dug into the Edgenuity courses and controversy nationwide about similar programs and their academic rigor.
But there is no serious contention about this point. District officials have spoken openly for the last few years about the centrality of online courses to their efforts to help students meet grad requirements.
It’s interesting that by using programs similar to those online charter schools were themselves using, the district helped some struggling students catch up.
Those courses are easy to cheat.
In two articles, Koran explored tips he had gotten about how easy it was for students to game the online courses. In the first article, students showed him how easy it was to cheat. And Koran explained how the courses work.
In the second piece, Koran heard from many educators and students about how vulnerable the courses were to cheating. They said cheating was “pervasive.”
Rather than investigate these claims, the district aimed its fury over these statements at us.
But the researchers admitted they “can neither verify nor dismiss the claims of cheating on online tests.”
They did examine how students fared, and many of them did not do well on the online courses. From the report:
The median grade was a C, and about 1 in 9 of the course enrollments led to a failing grade. The grades are typically quite low, suggesting that students did not necessarily find these courses to be easy. That by itself, though, does not disprove the allegations cited above.
Cheating does still take some effort.
Our reporting on this was necessarily anecdotal and therefore reasonably vulnerable to criticism. I stand by its newsworthiness. We had witnesses and students offering firsthand examples of what they did. But we did not have data on how widespread cheating was.
If the district is not interested in what these students and teachers told us, that’s fine.
But nothing in the report proved anything we wrote was false.
“I think Mario’s articles have raised some concerns in the community about these courses, and we have some ideas for the district to alleviate those,” Julian Betts, executive director of the San Diego Education Research Alliance, told Srikrishnan.
I take the accuracy of our reporting very seriously. If the district is going to accuse “a local news outlet” of writing something false, it has an obligation to explain exactly what was contradicted.
The district has not done that.