Aside from cable’s growth in the 1960s and ’70s, the television industry has faced very little disruption since it first emerged in the wake of WWII, Lucille Ball’s zany antics notwithstanding.
Ohhhh, that crazy redhead.
But as Nobel laureate and ceremony-snubber Bob Dylan can tell you, the times, they are a-changin’. The twenty-first century brought the internet and a flurry of other technology to the world of TV, and the only people looking back are the advertisers. Tech has helped usher in a new Golden Age for the smaller-than-silver screen: compelling long-form stories, richly developed and fully rounded characters, on-demand viewing with more choices, and actual measurement of those choices, enabling viewer-responsive programming.
Thanks to technology, we are right in the middle of the greatest television experience ever, by much more than a mile. Here’s how it all happened.
The first major change came, as such changes often do, very quietly. Digital photography hit the scene in 1990, but the prohibitive cost of the cameras—especially if you wanted film-quality shots—kept them from having much of an immediate impact. Big production houses and studios were the only users for about a decade, and the digital learning curve offset a fair amount of the cost savings that came along with ditching film.
Then consumer digital cameras took off around the turn of the millennium, which made the technology mass-producible and a whole lot more lucrative. That drove improvements and price reductions, as it always does, and now you can shoot an entire feature film on your iPhone and have it come out looking fine. You still have to light it properly, but LEDs are making that more affordable, too, and the same goes for digital audio equipment. Generally speaking, today’s digital effects are also cheaper than practical effects, and a lot more versatile.
All told, more filmmakers can do more things without running into prohibitive budget demands. From low-key indie web series shot on smartphones to top-tier shows like Game of Thrones—which, with its dragons, enormous crowd scenes, and other demands would be more expensive than a Lannister’s bar mitzvah if it weren’t for the power of digital—the arrival of affordable digital videography ushered in an explosion of storytelling; that means more voices on the scene and more choice for viewers.
And yeah, a lot of it is utter shit. Also, yes, a top-of-the-line professional digital video kit will still easily set you back six figures. But a $2,000 package is perfectly passable if you know how to use it, and costs are still falling while quality continues to improve. Without the affordability that digital, well, affords, we wouldn’t have seen the number of original scripted television shows double from 2009 to 2015, with roughly 500 estimated for 2017—a 275% increase since 2002.
I’m not saying digital videography is responsible for all of this growth, but it’s a huge factor, and so is most of the other tech we’re about to look at. Especially streaming.
The fact that Netflix, Amazon, and HBO GO/NOW (and…Hulu?) have drastically changed the way TV is watched is no longer news. Pointing it out is about as trite as it gets. But acknowledging that people can now watch what they want, when they want, virtually anywhere they want is just the beginning. It’s the downstream effects of streaming that are shaping the way TV is made.
Take this year’s breakout Netflix hit Stranger Things, for example. The creators have basically said Season 1 is an eight-hour movie, with a consistent plot arc that plays out across all eight episodes. You don’t get problem-conflict-resolution in each episode the way you typically do in TV shows, and it isn’t simply that new cliffhangers are set up at the end of each episode à la Breaking Bad or The Walking Dead (both of which have seen a significant bump in viewers due to their Netflix releases following initial runs on AMC). Stranger Things is more like what a miniseries used to be, and the same can be said for House of Cards, Marco Polo, Narcos, and other Netflix hits, as well as HBO’s Westworld and, of course, Game of Thrones.
Shows like this wouldn’t attract the audiences they do if people had to sit down in front of their TVs at the same time every week to tune in. You miss one episode and you’re out of the loop, and the odds are good that you’ll stop watching rather than being satisfied with reading a recap. In fact, one wonders if cable’s early entries in the extended-plot series—things like Breaking Bad and The Walking Dead—would have found sticky audiences without DVR recording and the first on-demand streaming services offered by cable companies. Viewers’ ability to catch up (and binge on) shows on their terms makes watching these season-long plot arcs far more practical.
Without a sticky audience, shows don’t get made. Streaming, quite directly, is changing the way TV is written. You can even see it as The Walking Dead marches on. Though the plot has always formed one long arc with little mini-arcs along the way, episodes in the later seasons are far less self-contained. It takes four or five episodes to resolve each conflict now, while the first few seasons put certain storylines to bed every episode or two. AMC has cottoned on to how people are watching, and they’re shifting the pace and direction of the show as a result.
Of course, that wouldn’t be possible without the “cottoning on” part, which brings us to our next tech shift.
Streaming services—and on-demand services from cable, and the website-based viewing options increasingly being offered by AMC and the like—do more than offer convenience to viewers. They offer data to content producers. Netflix and others have been famously unwilling to share viewer numbers with the public, but there’s no question that they are tracking those numbers internally. For second-run shows—ones that originally air on traditional cable and broadcast TV—you can bet that streaming views are shared with the production companies that own the shows, and that this factors into the deals they strike as they sell each show and each season.
You can also bet that this info translates directly into programming choices, driving creation in pursuit of the biggest audiences and the biggest deals. That’s how TV has always worked, after all, except that traditional means of measuring viewership relied on estimates that have often been accused of severe inaccuracy. There’s no estimating here: Netflix, Amazon, and the rest know exactly what shows you’re watching, when you’re watching them, and how you’re watching them. They know how many episodes you’re likely to binge at a time, they know if audience drop-off spikes after a certain episode, or even after a certain scene, and thanks to “second screens” and the internet at large, they and all TV producers/distributors know a whole lot more.
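To make that concrete, here’s a toy sketch, with entirely invented data, of the kind of per-episode retention calculation a streaming service could run against its viewing logs. A sharp drop between two episodes is exactly the kind of “drop-off spike” described above. (This is an illustration of the idea, not how Netflix actually does it.)

```python
from collections import defaultdict

# Each record: (viewer_id, episode_number), meaning that viewer finished
# that episode of some hypothetical show. All data here is made up.
viewing_log = [
    ("v1", 1), ("v1", 2), ("v1", 3),
    ("v2", 1), ("v2", 2),
    ("v3", 1),
    ("v4", 1), ("v4", 2), ("v4", 3),
]

def retention_by_episode(log):
    """Return {episode: fraction of episode-1 viewers who reached it}."""
    viewers = defaultdict(set)
    for viewer, episode in log:
        viewers[episode].add(viewer)
    base = len(viewers[1]) or 1  # avoid dividing by zero on empty logs
    return {ep: len(v) / base for ep, v in sorted(viewers.items())}

# A sharp drop between two consecutive episodes flags where viewers bailed.
for ep, rate in retention_by_episode(viewing_log).items():
    print(f"Episode {ep}: {rate:.0%} of starters still watching")
```

With this sample log, half the audience is gone by episode 3, which is the kind of signal that, at scale, could send a writers’ room scrambling.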
“Online reputation monitoring” is all the buzz in the marketing and PR world, and the TV industry can apply the same principles even more proactively: monitoring searches and social media activity surrounding shows, characters, and celebrities to see what’s working and what isn’t. I’ve never seen a single episode of Arrow, but thanks to it cropping up in several random Reddit threads, I know that Arrow and Felicity finally hooked up after several seasons of “the internet crowd” pushing for it. According to my anecdotal experience on Reddit, the show also suffered for it, but the point is that the makers were definitely listening and trying to retain and grow their audience based on direct audience opinion.
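For a rough sense of what that kind of listening looks like under the hood, here’s a minimal sketch: count posts that mention the characters you’re tracking and keep a crude positive/negative word tally. The posts, keyword lists, and scoring are all invented for illustration; real tools like NetBase use far more sophisticated natural language processing.

```python
import re

# Toy word lists for crude sentiment scoring (invented for this example).
POSITIVE = {"love", "great", "ship"}
NEGATIVE = {"hate", "boring", "worst"}

# Made-up social posts about a made-up fandom debate.
posts = [
    "I love the Oliver and Felicity storyline",
    "this season is boring, worst arc yet",
    "still ship those two so hard",
]

def score_buzz(posts, keywords=("oliver", "felicity")):
    """Return (posts mentioning the tracked characters, net sentiment)."""
    mentions = net = 0
    for post in posts:
        words = set(re.findall(r"[a-z']+", post.lower()))
        if words & set(keywords):
            mentions += 1
        net += len(words & POSITIVE) - len(words & NEGATIVE)
    return mentions, net

mentions, net = score_buzz(posts)
```

Scale that up to millions of posts, weight by reach and recency, and you have a live dashboard telling a showrunner which storylines the internet crowd is rooting for.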
The cornucopia of data the internet offers up to TV makers can influence content decisions in ways large and small, from plot shifts to character changes to actors’ salaries to, in extreme cases, which shows live and die. Jon Stewart’s internet-famous appearance on CNN’s Crossfire in late 2004 was almost certainly a factor in the show’s cancellation in January of 2005. Though Stewart’s episode saw a significant spike in viewership, his decimation of the news program’s debate premise and of the two co-hosts, especially Tucker Carlson, quickly went viral and proved too sticky to be washed off.
The internet has also done a fair bit of growing in the decade since, just as TV has been going through its transition, and there is no shortage of startups that have tried to take advantage of that fact.
BlueFin Labs, bought by Twitter in 2013 for a reported $90 million, promised analytics that dove far deeper than hashtags to provide TV show affinity and engagement data to advertisers and brands—and presumably, if everyone got smart about it, to TV creators as well.
GetGlue, the startup behind the Telfie social media app, wants you to “check in” to what you’re watching, discover new shows, join TV-centric discussions, and so on, all so it can sell data to those same advertisers/brands/creators.
NetBase uses natural language processing to sift through social media and deliver consumer information on a scary scale, offering a whole lot more than TV audience insights (but the potential is there).
Comcast’s See It was another Twitter-integrated tool that let users “search, discover, collect, watch, record and receive reminders for their favorite programming,” though the largely unknown and unused platform was shuttered two weeks ago.
When most viewers think about interactive TV, if they think about it at all, they’re probably thinking about live tweeting during contest shows and awards ceremonies. I promise, the people behind the screen are thinking about it on another level entirely, and it’s shaping the pixels you end up watching.
The Future of Tech and TV
Advertisers still haven’t managed to take true advantage of the popularity of streaming services, though with Netflix’s increasingly original-heavy programming push it wouldn’t surprise me if commercials started making a comeback in one way or another. Maybe Netflix will sell its data to advertisers, and the next time I go online after binging on Narcos I’ll be bombarded with moustache wax ads. Maybe Netflix is doing that already. I’d be amazed if Amazon wasn’t customizing its on- and off-site ads based on my Prime Instant Video viewing habits.
What’s less clear is how changing audience demographics and viewing habits, which are themselves shaped by our technology, will change the processes and decisions involved in TV creation. Will the generation of young punks who prefer fake prank videos and douchey shirtless guys doing “social experiments” eventually grow up and start watching well-crafted narratives? Or will their coming of age be the death knell for scripted TV’s resurgence? Will the death of the Baby Boomer generation render traditional TV completely obsolete? Or will choice inundation cause a viewer backlash and a return to time-specific, provider-selected content?
At the end of the day, as long as they keep making Care Bears & Cousins, I don’t really care.
Daniel A. Guttenberg is an Atlanta-based writer who fell into the startup world by accident and has been gleefully treading water ever since. He will be survived by his beard and his legacy of procrastination.