Why do I bother watching nature documentaries? I know they’re rubbish, and yet each time I’m like “maybe this one won’t suck”
Right off the bat it’s projecting vicious intent onto nature. Nature isn’t just shit that happens, oh no, it’s A BRUTAL WAR OF DYNASTIES!!!1!!!
“Look at this centipede from the Devonian, but invertebrates wouldn’t be the ones to win the game of survival” WHAT DO YOU MEAN? WHAT GAME? INVERTS ARE STILL HERE, THEY’RE THE MOST COMMON AND THRIVING LIFEFORM ON THE PLANET.
And of course the whole thing chooses to fixate on competition and ignore how much of nature revolves around cooperation and symbiosis.
I am begging the media (especially media that sells itself as educational) to stop speaking about nature the same way a 1930s German pseudoscientist would.
I like documentaries in general but the quality has dipped a lot over time
There is a vast difference between American documentaries and those made in Europe, and the same could be said about TV in general.
Not that I watch much TV now, I haven’t in over a decade