OK so I like math. Although I'm not a statistician, I **am** pretty good with math (which is lucky for anyone in the USA haha bc in a few short months I will be using said math to prepare intravenous drug compounds for hospitalized patients... tl;dr if I sucked at this, it would suck WAAAY worse to be you bwahahah XD)

OK first off, if you compare the OVERALL numbers from s1-s7, although s7 (in red) looks like it's a bit low, there doesn't seem to be much difference ... right? *Right*???? And, truth be told, the only ~statistically significant~ differences (e.g., where s7 really comparatively sucks balls) are when it's directly compared to s1. But ... this chart is a colorful mess. A colorful MEANINGLESS mess, because all I'm showing you is a bunch of lines without analysis. Six would probably proudly wear this chart as a coat it's so fugly. Anyway, this is usually the data people are looking at when they glance at the ratings and shrug it off as being "not all that different." That shrug is not accurate, as we're about to see.

So let's clean it up a bit!!! To simplify things, I'm gonna compare apples to apples. All RTD-era episodes are accounted for by the blue line (*"You said BLUE!!" ... "I said NOT blue!!!"*), the beginning of the Moffat era (s5-6) is the green line, and s7 is the red line. NOW things start to look interesting!!

*Look at the pretty lines and numbers!!!* Here's what they mean: see the dotted lines with the equations? Those are ~trendlines~ for the graph. Basically, a trendline tells you, on average, where the hell your data is going. See the equations? Those tell you how fast viewers are flocking to your show (or, alternatively, turning it off bc it sucks and going to read fanfic or something lol idk). And see the (sorrysorry tiny font I knowww) "R^2" value? That tells you how well the trendline actually fits the data - formally, it's the fraction of the variation in the data that the trendline explains (lol @ those evil, untrustworthy trend bitches). The closer to "1" the better, and these are all pretty freaking close to one, which means the trends are pretty strong. (So anyone who tries to respond and say it's meaningless - look at the R^2 value and hush lol).
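If you wanna play along at home, here's roughly how a trendline and its R^2 get computed. The viewership numbers below are totally made up (NOT the real BBC figures) - they're just shaped like a season that dips in the middle and recovers:

```python
import numpy as np

# Hypothetical per-episode viewership in millions -- made-up numbers,
# NOT the real BBC figures, just a season that dips and recovers.
episode = np.arange(1, 14)  # episodes 1..13
viewers = np.array([9.6, 8.9, 8.4, 8.0, 7.8, 7.7, 7.8,
                    8.0, 8.3, 8.7, 9.1, 9.5, 10.2])

# Fit a quadratic trendline y = a*x^2 + b*x + c (least squares) --
# the same kind of fit a spreadsheet's "polynomial trendline" does.
a, b, c = np.polyfit(episode, viewers, 2)
fitted = a * episode**2 + b * episode + c

# R^2 = 1 - (residual sum of squares) / (total sum of squares):
# the fraction of the wiggle in the data that the trendline accounts for.
ss_res = np.sum((viewers - fitted) ** 2)
ss_tot = np.sum((viewers - viewers.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"y = {a:+.3f}x^2 {b:+.3f}x {c:+.3f}")
print(f"R^2 = {r_squared:.3f}")
```

Because the fake data above is nicely parabola-shaped, the R^2 comes out close to 1 - which is exactly the "can I trust this trend?" check from the chart.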

So what does this mean? Again, this isn't a calculus class, so I'll skip the lecture on how to calculate derivatives and try not to make this too boring (BUT CALCULUS IS SUPER COOL AND YOU SHOULD LOVE IT GUYZ), but essentially the first number - the coefficient on the x^2 term - is saying "this is how fast viewers are coming/going," and in particular its sign tells you whether the curve is bending back up (viewers coming home) or bending down (viewers bailing).
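To make that concrete, here's a tiny sketch of what the sign of that leading coefficient does. The coefficients below are invented for illustration - they are NOT the actual fitted equations from the charts - and the derivative of the trendline (dy/dx = 2ax + b) at the last episode tells you which way the audience is headed by the finale:

```python
# Hypothetical quadratic trendlines y = a*x^2 + b*x + c for three eras.
# These coefficients are MADE UP for illustration, not the real fits;
# x = episode number, y = viewers in millions.
trends = {
    "RTD era (s1-4)": ( 0.05, -0.60, 9.5),  # a > 0: dips, then climbs back
    "Moffat (s5-6)":  ( 0.01, -0.15, 8.2),  # a near 0: mostly flat
    "s7":             (-0.03,  0.10, 8.0),  # a < 0: curving downward
}

for era, (a, b, c) in trends.items():
    # Derivative of the trendline: dy/dx = 2ax + b.
    # Evaluated at episode 13, it says which way viewers are headed
    # by the finale: positive = arriving, negative = leaving.
    slope_at_finale = 2 * a * 13 + b
    direction = "gaining" if slope_at_finale > 0 else "losing"
    print(f"{era}: slope at finale = {slope_at_finale:+.2f} -> {direction} viewers")
```

Same shape as the argument in the post: a positive a means the curve eventually turns back up, a negative a means it just keeps sinking.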

And this is where the RTD era is strong, s5-6 are a bit weaker, and s7 is in trouble. For the RTD era, the first number shows that yeah, viewers were coming and going - but that there was a general trend back up. For s5-6, there are fewer people coming and going. And for s7, the number is negative - that means there is a trend of people leaving. How reliable is this? Well, back to the R-squared thingy I was telling you about - it's pretty freaking close to 1, so the trend is pretty tight.

One of the big weaknesses here is that premieres and season finales tend to have more viewers, so in this next graph, I simply removed the premieres and finales (which meant I had to remove mainly s6-7 episodes from the data pool bc of the split seasons). Taking away those premiere/finale bumps in viewership makes s7 look even worse - the trend is even MORE negative now!!!! And s5-6 has a much flatter line too ... viewers were pretty stagnant. Again, the RTD era had some swings, but at the end of the day, viewers were coming home. That's not happening for the past few years, especially this year.
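Here's a sketch of that premiere/finale trick with made-up s7-shaped numbers (again, NOT the real figures). For simplicity this uses the slope of a straight-line fit, since the bumps at each end of a season are exactly the thing a trendline can mistake for a recovery:

```python
import numpy as np

# Hypothetical s7-shaped numbers in millions -- MADE UP, not real ratings:
# a big premiere, a slow slide, and a finale bump propping up the average.
episode = np.arange(1, 14)
viewers = np.array([9.5, 8.1, 7.8, 7.6, 7.5, 7.3, 7.2,
                    7.1, 6.9, 6.8, 6.7, 6.6, 8.4])

def slope(x, y):
    """Slope of a linear least-squares trendline (viewers per episode)."""
    return np.polyfit(x, y, 1)[0]

slope_all = slope(episode, viewers)               # premiere + finale included
slope_mid = slope(episode[1:-1], viewers[1:-1])   # premiere + finale dropped

print(f"with bumps:    {slope_all:+.3f} million viewers/episode")
print(f"bumps removed: {slope_mid:+.3f} million viewers/episode")
```

With data shaped like this, dropping the premiere and finale makes the fitted slope even more negative - the same effect described above: the bumps were hiding how steady the mid-season bleed actually is.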

So what can we make out of all of this? Tl;dr, the numbers aren't good. And they're getting not-gooder by the season.

(AND CALCULUS ROCKS AND YOU SHOULD TOTALLY LOVE IT!!)