I was reading Reddit when I came upon this. A change in methodology means preliminary tornado reports are inflated compared with those of prior years.
[T]he NWS changed its policy regarding duplicate severe weather reports this year.
It should be noted that duplicate preliminary reports are NO LONGER purged from the data set. As a consequence, it is not possible to compare this preliminary data to previous years’ data. In the past, possible duplicate reports were culled based on time and proximity. This is not the case anymore.
The old rule of thumb was that ~15% of preliminary reports were duplicates. Based on the preliminary vs. final numbers so far this season, it is possible that up to 65% of these preliminary reports are duplicates.
On March 8, 2011, the SPC removed space/time filtering on incoming National Weather Service (NWS) Local Storm Reports (LSRs). This filtering had been used by SPC in an attempt to reduce duplicate reports and limit artificially inflated initial estimates of severe weather events when many reports arrived for the same event. Space/time filtering is no longer being applied to decoded NWS LSRs and this approach is consistent with NWS storm-based verification methods.
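The kind of space/time filter described above can be sketched in a few lines. This is only an illustration of the general idea, removing reports that fall close in both time and distance to an already-accepted one; the thresholds, field names, and data layout here are my assumptions, not SPC's actual parameters or code.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Report:
    minutes: float  # report time, minutes since an arbitrary epoch
    lat: float      # latitude in degrees
    lon: float      # longitude in degrees

def _dist_km(a: Report, b: Report) -> float:
    """Great-circle (haversine) distance between two reports, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (a.lat, a.lon, b.lat, b.lon))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def filter_duplicates(reports, window_min=15.0, radius_km=10.0):
    """Keep a report only if no already-kept report lies within BOTH
    `window_min` minutes and `radius_km` km of it.

    The 15-minute / 10-km thresholds are illustrative guesses, not the
    values SPC actually used before March 8, 2011.
    """
    kept = []
    for r in sorted(reports, key=lambda r: r.minutes):
        if not any(abs(r.minutes - k.minutes) <= window_min
                   and _dist_km(r, k) <= radius_km
                   for k in kept):
            kept.append(r)
    return kept
```

Under a filter like this, two reports of the same tornado phoned in minutes apart from nearby locations collapse into one; with the filter removed, both survive into the preliminary count, which is why the post-March-2011 preliminary totals run higher than earlier years'.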
You probably already know that the number of weak tornadoes reported has gone up over time, as storm 'chasing' means fewer of them go unobserved.