TFA Teachers Perform Well in a New Study — But Teacher Experience Still Matters

Before I dive into Mathematica's new, positive research on Teach for America, a major caveat: Past studies of TFA suggest its recruits are more effective at teaching math than other subjects, and this study looks only at math. Across the board, it is easier for schools and teachers to raise math test scores than literacy scores. That's because most kids encounter math only at school, while in reading and writing, middle-class kids get a huge boost from vocabulary- and book-rich home environments.

Here we go.

The study design: Mathematica compared the performance of 136 TFA math teachers and 153 Teaching Fellows math teachers in 11 unnamed school districts to the performance of "matched" math teachers from other training programs, working within the same school buildings and with similar low-income student populations. Student outcomes were measured using end-of-school-year standardized tests. TFAers and Fellows were not compared to one another, in part because they tend to work at different schools.

The big takeaway: TFA math teachers outperformed non-TFA math teachers in their schools by .06 standard deviations in middle school and .13 standard deviations in high school. The talking point will be that this is the equivalent of an additional 2.6 months of learning per school year. But it's important to realize this represents a relatively modest improvement in student achievement. For the average child in this study, who scored at just the 27th percentile in math compared to her peers across the country, having a TFA teacher would help her move up to about the 30th percentile, still a long way from grade-level math proficiency.
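
For readers who want to see where that percentile figure comes from, here is a rough back-of-the-envelope conversion. It assumes test scores are approximately normally distributed, the conventional assumption behind effect-size-to-percentile translations; the .06 and .13 effect sizes and the 27th-percentile starting point are the study's numbers, and the rest is purely illustrative.

    # Back-of-the-envelope: convert an effect size in standard deviations
    # into a percentile move, assuming roughly normally distributed scores.
    from statistics import NormalDist

    normal = NormalDist()      # standard normal distribution
    start_percentile = 0.27    # the study's average student

    for label, effect_sd in [("middle school", 0.06), ("high school", 0.13)]:
        start_z = normal.inv_cdf(start_percentile)        # z-score at the 27th percentile
        new_percentile = normal.cdf(start_z + effect_sd)  # shift up by the effect size
        print(f"{label}: {start_percentile:.0%} -> {new_percentile:.0%}")

    # Prints roughly "27% -> 29%" for middle school and "27% -> 31%" for high
    # school, consistent with the reported move from about the 27th to the
    # 30th percentile.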

Teaching Fellows teachers, who tend to be career-changers, not recent grads, performed similarly to their non-Fellows peers. They were slightly less effective than traditionally certified teachers, but more effective than teachers who came from non-elite alternative certification routes.

Teacher experience still matters: The bias against first-year teachers is borne out in the data. The students of second-year teachers outperformed the students of first-year teachers by .08 standard deviations, a larger gap than the average one (.07) between the students of TFA and non-TFA teachers. And even though TFA recruits did well in this study, that doesn't mean teachers reach their pinnacle after two years on the job. To the contrary, the researchers found that for teachers with at least five years of experience, each additional year of work was associated with a statistically significant increase of .005 standard deviations in student achievement. Interestingly, during years 2, 3, and 4 of teaching, there is no observable improvement. So this study shows a big leap in effectiveness from year 1 to year 2, a flat line for a few years, and then slow, steady year-to-year improvement after year 5.
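
Read literally, those coefficients sketch a trajectory like the one below. This is only an illustration of the shape of the reported numbers, not the study's model; in particular, the exact functional form after year 5 is my assumption.

    # Illustration only: the effectiveness trajectory implied by the reported
    # coefficients, relative to a first-year teacher, in student-test standard
    # deviations. The functional form is an assumption, not the study's model.
    def implied_gain(years_of_experience: int) -> float:
        gain = 0.0
        if years_of_experience >= 2:
            gain += 0.08                               # big leap from year 1 to year 2
        if years_of_experience >= 5:
            gain += 0.005 * (years_of_experience - 4)  # slow, steady growth after year 5
        return gain

    for year in (1, 2, 4, 5, 10, 15):
        print(year, round(implied_gain(year), 3))
    # 1 0.0, 2 0.08, 4 0.08, 5 0.085, 10 0.11, 15 0.135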

College selectivity is not a magic cure-all: Are TFA teachers successful because they hail from elite colleges? Maybe not, this study suggests. Teachers here who attended selective institutions did not outperform other teachers, regardless of whether or not they participated in TFA or the Teaching Fellows. That finding is in line with new data from New York City linking student achievement back to the colleges teachers attended. In that study, NYU and Columbia grads were not significantly more effective than graduates of Hofstra or CUNY.

It doesn't matter much what teachers majored in: One of the big critiques of traditional teacher education is that not enough teachers have college degrees in the subjects they teach. But in this study, traditional teachers were actually more likely than TFA or Teaching Fellows teachers to have majored in math. That coursework didn't necessarily help them become better teachers.

And teachers' own test scores are not all that predictive: TFAers and Fellows had higher standardized math test scores themselves, scoring an average of 17 to 22 points higher than their counterparts, perhaps because they were much more likely to have attended academically selective colleges, which require good test scores for admission. The relationship between teachers' own test scores and student achievement remains murky, however. The researchers conclude that at the high school level, higher teacher test scores are associated with slightly better student outcomes, but that there is no relationship between teacher and student test scores at the middle school level.

Coursework is distracting: When a teacher is taking night courses, as all first-year TFA teachers do to meet state certification requirements, student achievement declines.

So, why are TFA teachers successful? If it isn't college selectivity or their higher test scores in math, what's the theory of change? After observing TFA's summer training institute this July, I'd guess that there are two major factors. First, TFA teachers are incredibly mission-driven and optimistic. They actively choose to teach in low-income schools and they are selected because they believe closing the achievement gap is not only important, but possible. This inspires them to work hard. (Of course, many non-TFA teachers have these characteristics, as well, and also tend to be great at their jobs.) Second, TFA's training emphasizes data tracking of student outcomes and the importance, specifically, of raising standardized test scores. That could lead to the students of TFA teachers getting more test-prep and hearing more messages about why performing well on tests is important.

Update: The researchers tell Dylan Matthews that although they used the results of high-stakes state exams to measure student outcomes in the middle-school grades, at the high school level, the tests they used were completely new to the teachers, so they couldn't have prepared students for them. I'd still make the point that the students of TFA teachers may be more likely to take testing seriously, for the reasons I outline above.

Don't forget race and class: Of TFA's 2012 class of recruits, 62 percent are white. But the TFA sample in this study was a whopping 89 percent white, while the demographics of the non-TFA comparison teachers were starkly different: only 30 percent white. The student population, meanwhile, was 80 percent low-income children of color. As I research my book, sources across the country are telling me, anecdotally, that urban districts are losing teachers of color, especially African American teachers. Given what we know about the importance of same-race role models for minority students, and how this, too, can affect achievement and school culture, it's important to gather more information on how well districts and teacher training programs are doing at putting teachers of color in front of students of color.

9 thoughts on “TFA Teachers Perform Well in a New Study — But Teacher Experience Still Matters”

  1. Anthony Cody

    Mission driven? Having worked as a mentor for perhaps a dozen TFA corps members in Oakland, I would say this is an apt term. And the mission, as far as TFA is concerned, is higher test scores.

    Most of the TFA teachers with whom I worked were instructed to put large posters on their wall that exhorted their students to higher test performance. Student data was often tracked on wall posters as well.

    I saw a first-year teacher who had low scores on her tests get the advice of her TFA data coach to change her daily instruction so that it more closely resembled the tests. Soon, she was giving daily worksheets with multiple-choice and short-answer questions. Her students' test scores rose, but after a month or two, they were bored out of their skulls and climbing the walls. In her second year, she shifted and embraced a project-based learning approach and was able to get much more engagement. Unfortunately, the year after that, she left to go to medical school.

    That is the key problem with TFA. As you point out, experience matters. People become more effective over time on a variety of levels, many of which are not measured by test scores. But so long as test scores are the only yardstick we are using, TFA may look good. If we were to choose basketball players on the basis of height alone, I might make the team alongside Arne Duncan. But there are many more aspects to being a winning ball player, just as there are many more aspects to teaching.

  2. Concerned

    Honest question, not snarky -
    On a multiple choice test, how many more questions did TFA students get correct compared to traditional? What is 2.6 months of learning in multiple choice question terms?

  3. Concerned

    I think my question was a bit confusing. What I’m trying to get at is: For students with TFA teachers, how many more multiple-choice questions did they answer correctly compared to students with traditional teachers? Did they get, let’s say, 55 of 65 multiple-choice questions correct while the other students got 47 correct? What are we talking about here with months of learning?

  4. Ellie Herman

    This study poses questions about gender as well. Why was the comparison group so heavily female (79%, compared with a 59% average for teachers nationwide and 60% in the TFA group)? When so many at-risk students are male, isn’t it widely believed that they may perform better in general with a male teacher, for reasons that have nothing to do with teaching technique? Why, then, focus on students with female math teachers in the comparison group? When the results are so minuscule, what if this gender discrepancy had an effect?

    Above all, what does a slight bump in math scores on a single test even prove? Does a child’s ability to bubble in answers at the command of a teacher in any way prepare him or her for college or for life? Does a 6- or 7-answer difference in test scores represent actual learning, or just the teacher’s enthusiasm for the test?

  5. Ed_Act

    I think it’s important here to raise the substantially higher turnover of TFA teachers. If only 15% of TFA teachers remain in the low-income school they started in, either because they burn out or because they never intended for teaching to be a career, then there are significant long-term negatives that far outweigh these short-term and small gains in “student achievement”.

  6. mpledger

    The report says “This difference in math scores was equivalent to an increase in student achievement from the 27th to the 30th percentile. This difference also translated into an additional 2.6 months of school for the average student nationwide.”

    On a 65-question test (assuming 1 mark for a correct answer), the ordinary teacher’s kids would have scored 17.6 and the TFA teacher’s kids would have scored 19.5.

    There is an interesting study done at a military college where they had complete control of student assignment to a math class. They randomised the students to either new or experienced teachers and then randomised them again the second year to other teachers. Students with a new teacher in the first year got better scores in the first year, but students with an experienced teacher in the first year got better scores in the second year – it was assumed that the experienced teachers gave them deeper learning that played out as they advanced.
    link to nber.org

  7. I Teach in Philly

    It is important to note that TFA teachers do not teach courses beyond Algebra 2. To suggest that they are better overall math teachers is misleading.

    If a TFAer didn’t major in math, it’s unlikely that they have the range of knowledge that a real teacher has; beyond teaching formulas (perfect for multiple choice testing!) TFAers do not encourage higher order thinking skills. Their sole goal is to generate high test scores, not understanding of underlying concepts or the practical applications of advanced math.

    Value as instructors? ok for first and second year of high school, as long as they stick to fundamentals. However if a student asks questions beyond what is supplied in the script, he or she will be out of luck.
