How Fast Should Phonics Be Taught?

Last week, I wrote an article on the topic of systematic phonics. In that article, I discussed how research shows that phonics programs with a scope and sequence outperform phonics programs without one. However, there has been very little definitive research on what that scope and sequence should look like. One issue of particular interest to me is pacing. As a classroom teacher, I have always preferred a much faster phonics pace, as I believe that, given enough instructional time, students tend to be more capable than we assume. Moreover, decoding requires the acquisition of multiple letter-sound correspondences, and I would hypothesize that a faster pace of instruction would allow students to decode sooner. In my previous article, I reviewed a very well-controlled RCT on the topic by Vadasy et al., which showed that a faster scope and sequence did not increase total learning by a statistically significant amount, but did increase student automaticity, as can be seen in the graph below.

However, there has been no meta-analysis on the topic to date, and the research is not definitive. That said, I did write a large-scale phonics meta-analysis for this blog earlier this year, and I decided to revisit that database to see if I could control for this factor. (If you want to read the original article, click here.) Elizabeth Brown, a writer for Teaching by Science, was also interested in the topic and helped me with the recoding. We coded each phonics study for the average number of letter-sound correspondences (GPCs) taught each month. One caveat: most intervention programs used an individualized pace, as opposed to a set pace. Personally, I would have hypothesized that a faster scope and sequence or an individualized scope and sequence would outperform. An individualized scope and sequence seems to make intuitive sense, as it allows the teacher to adapt to the needs of their specific students.


In total, 41 experimental or quasi-experimental phonics studies were included. (A downloadable copy of the study database is included in the references section.) The overall results can be seen below.

The correlation coefficient would seem to suggest that, on average, there is a slight statistical benefit to a faster scope and sequence compared to a slower one. However, there also appears to be an upper limit to this benefit of 15 GPCs per month, while teaching 5 or fewer GPCs per month can slow the academic progress of students. This data was initially surprising to me. 15 GPCs per month is a slower pace than I have personally used in my own practice as a teacher. I also find it very surprising that an individualized approach would underperform, as I would think teaching to the individual needs of students would be ideal, something I have written about at length in my books. That said, when I ran my Pearson correlation tests, phonics pace was not the most influential factor. Sample size showed a negative correlation of -.28 (p = 0.5), and study duration showed a negative correlation of -.45 (p = 0), meaning that the larger and longer a study was, the lower the effect size that was found.
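As an illustration of how these Pearson tests work, here is a minimal sketch using `scipy.stats.pearsonr`. The study-level numbers below are invented purely for demonstration; they are not the actual values from this database.

```python
import numpy as np
from scipy import stats

# Hypothetical study-level data, invented for illustration: each pair is
# (study duration in weeks, effect size found by that study).
duration = np.array([6, 8, 10, 12, 16, 20, 24, 30, 36, 48], dtype=float)
effect = np.array([0.80, 0.72, 0.65, 0.60, 0.55, 0.48, 0.45, 0.40, 0.35, 0.30])

# Pearson's r measures the strength of the linear relationship;
# the p-value tests the null hypothesis of zero correlation.
r, p = stats.pearsonr(duration, effect)
print(f"r = {r:.2f}, p = {p:.4f}")
```

With a negative r, longer studies are associated with smaller effect sizes, which is the pattern described above.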


However, there is a logical and common explanation for these results. Realistically, this is a dosage analysis: we are testing whether increasing the phonics dose increases reading results. It is a well-documented phenomenon within dosage research in other fields that there is often an ideal range for a dosage, rather than a linear relationship. For a hypothetical example, taking a moderate amount of vitamin D might provide more health benefits than taking a smaller amount of vitamin D, or none at all. However, that does not mean that taking too much vitamin D won't make you sick. This type of phenomenon is often depicted with a curved line graph, and indeed, if we graph these results on a line, we get a curved result.
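A curved dose-response relationship like this can be sketched by fitting a quadratic rather than a straight line. The pace and effect-size numbers below are hypothetical, invented only to illustrate the shape: a second-degree polynomial fit produces an inverted U that peaks at an intermediate dose.

```python
import numpy as np

# Hypothetical (GPCs-per-month, effect-size) pairs, invented for
# illustration -- not the actual studies in this database.
pace = np.array([2, 4, 6, 8, 10, 12, 15, 18, 23, 28], dtype=float)
effect = np.array([0.10, 0.18, 0.35, 0.42, 0.50, 0.55, 0.58, 0.52, 0.40, 0.30])

# Fit a second-degree polynomial: a curved (inverted-U) dose-response,
# rather than a straight line.
a, b, c = np.polyfit(pace, effect, deg=2)

# The fitted parabola peaks where its derivative is zero: pace = -b / (2a).
peak_pace = -b / (2 * a)
print(f"Estimated peak of the dose-response curve: {peak_pace:.1f} GPCs/month")
```

A negative leading coefficient confirms the inverted-U shape; the peak marks the dose beyond which additional GPCs stop helping.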

As can be seen in the graph above, the typical curved relationship holds for phonics dosage as well. That said, in the above analysis I did not control for core vs. intervention-based instruction, so I redid the analysis for each type of instruction.


Core Instruction Results:

When intervention studies were removed from the analysis, the results were much more meaningful. The speed of the scope and sequence became the factor most strongly correlated with study results (r = .52, p = 0.001). Comparatively, the r for sample size was -.32 (p = .01) and the r for duration was -.49 (p = 0.0001). Again, a scope and sequence of 6-15 GPCs per month outperformed. However, there were no studies with a scope and sequence between 6 and 9 GPCs per month. These results also showed negligible benefits for teaching 1 or fewer GPCs per week, suggesting that scopes and sequences that teach below 6 GPCs per month are very inefficient. That said, there were only 17 studies within this sub-analysis. It would be ideal to see these results replicated with a much higher number of total studies.


Intervention Instruction Results:

The intervention results look consistent with the classroom results at first glance. However, teaching 5 or fewer GPCs per month no longer showed a negligible result, and more importantly, the correlation between speed and results was no longer statistically significant. This suggests that the ideal speed of instruction within an intervention setting depends on the needs of the student. This sub-analysis only included 25 studies and should ideally be replicated with a greater number of studies in the sample.


Does Study Quality Change This Analysis?

One of the biggest criticisms of meta-analysis is that higher-quality studies tend to show lower results. To control for this factor, I conducted a quality-based regression analysis and a speed-based regression analysis. (I have attached downloadable copies of these regression analyses in the references section.) The quality-based regression analysis controlled for 5 quality factors: experimental design, fidelity, measurement standardization, sample size, and duration. I then ranked each study in the meta-analysis based on its quality, on a scale of 0-5: 5 meaning that the study was a large-scale, longitudinal RCT that controlled for fidelity and used a standardized measurement; 0 meaning that the study was a small-scale, short-term, quasi-experimental study that did not control for fidelity and used a custom-made measurement. Then I conducted a Pearson analysis to see how the quality of a study correlated with its results. There was a negative correlation of -.36 (p = 0) for quality, meaning that there was a small but highly statistically significant negative correlation between study quality and results. However, this correlation was still lower than the correlation for sequence speed in core instruction. Moreover, the confidence intervals found in the quality-based regression analysis were considerably larger than those found in the regression analysis based on speed of instruction. This difference suggests that the pace of phonics instruction influenced results more consistently than did study quality or sample size. Pearson correlation tests were also conducted on student age and the extent of multi-sensory instruction used; these results were negligible.
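To illustrate what comparing the confidence intervals of two regressions looks like in practice, here is a minimal sketch using `scipy.stats.linregress`. The quality scores, paces, and effect sizes below are invented for illustration only; the real analyses are in the downloadable files.

```python
import numpy as np
from scipy import stats

# Hypothetical study-level data, invented for illustration: a 0-5 quality
# score and an instructional pace (GPCs/month) for ten studies, each
# regressed against the effect size those studies found.
quality = np.array([0, 1, 1, 2, 2, 3, 3, 4, 4, 5], dtype=float)
pace = np.array([3, 5, 6, 8, 10, 11, 12, 13, 14, 15], dtype=float)
effect = np.array([0.90, 0.70, 0.80, 0.50, 0.60, 0.50, 0.45, 0.50, 0.40, 0.35])

res_quality = stats.linregress(quality, effect)
res_pace = stats.linregress(pace, effect)

# Approximate 95% confidence intervals for each slope (+/- 1.96 standard
# errors); a wider interval means the predictor's influence is estimated
# less consistently across studies.
ci_quality = (res_quality.slope - 1.96 * res_quality.stderr,
              res_quality.slope + 1.96 * res_quality.stderr)
ci_pace = (res_pace.slope - 1.96 * res_pace.stderr,
           res_pace.slope + 1.96 * res_pace.stderr)
print("quality slope 95% CI:", ci_quality)
print("pace slope 95% CI:", ci_pace)
```

Comparing the widths of the two intervals is what the paragraph above describes: the narrower interval belongs to the predictor whose relationship with the results is more consistent.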


Prior to writing this article, I hypothesized that there would be a direct correlation between the speed of instruction and the effect size found. My hypothesis was mostly incorrect. That said, on average a faster scope and sequence outperforms a slower scope and sequence. However, diminishing returns are found for scopes and sequences that teach more than 15 letter-sound correspondences per month. This analysis would suggest that there is both a lower-limit and an upper-limit pace for what is most effective for teaching phonics. The upper limit appears to be 15 GPCs per month and the lower limit appears to be 6 GPCs per month. However, the detriment caused by teaching phonics too slowly appeared to be much worse than that caused by teaching phonics too quickly. Moreover, within an intervention setting, the ideal pace depends on the needs of the individual. Some students in need of intervention instruction may require a much slower rate of instruction.



None of the authors of this analysis have used the programs discussed in this article. Coding was therefore based on study details, scopes and sequences posted online, and other teachers' testimony. This analysis has not been peer-reviewed. Some effect sizes were underpowered and/or appeared random when compared with their confidence intervals. No studies in this analysis looked at the 16-20 GPCs-per-month range, as all studies at the top of the range taught 23 or more GPCs per month. More research is needed to better flesh out the precise ranges of effectiveness.


One confounding factor, pointed out to me by Dr. Holly Lane, is that studies which use faster scopes and sequences might be using more of an interspersed or spiraling scope, meaning that they consistently review previously taught GPCs, whereas slower scopes and sequences might aim to teach to complete mastery. This could mean that some of these results reflect mastery vs. interspersed teaching, rather than the pace of instruction.


Written by Nathaniel Hansford, Elizabeth Brown, and Joshua King

Last Edited 2022-12-13



Hansford, N. (2022). A Meta-Analysis and Literature Review of Language Programs. Teaching by Science. Retrieved from <>.


Hansford, N. (2022). Should Phonics be Systematic? Teaching by Science. Retrieved from <>.


Vadasy, P. F., &amp; Sanders, E. A. (2020). Introducing grapheme-phoneme correspondences (GPCs): Exploring rate and complexity in phonics instruction for kindergarteners with limited literacy skills. Reading and Writing: An Interdisciplinary Journal. Advance online publication.

For a downloadable copy of the meta-analysis database click here: 

For a downloadable copy of the quality regression analysis click here:

For a downloadable copy of the speed regression analysis click here:
