I get that they have to dramatize nature to get ratings, but these documentaries project our fucked-up societal biases onto nature like crazy.
Every single one is like "DANGER, SURVIVAL, DEATH, DANGER, CONSTANT STRUGGLE, PAIN, DOMINATION, VIOLENCE!"
Yes, violence is a part of nature, but it's just one part. Most animals spend the bulk of their time conserving energy: sleeping, grazing, resting. Even lions only hunt once or twice a week and sleep the rest of the time.
"IS THIS BABY SEAL GENETICALLY SUPERIOR ENOUGH TO PASS THE TEST OF LIFE?!?! YES, HE BOOTSTAPPED HIMSELF THEREFORE HE HAS PROVEN HE HAS THE RIGHT TO LIVE!"
What a nihilistic, unscientific conclusion to jump to.
"LOOK AT THESE TWO BABY LIONS PRACTICING KILLING AND DOMINATING!"
No, my dude, they're just playing... you know, forming bonds like all social species do? Entertaining themselves because all animals with a brain get bored? Nope, they must be doing it for BRUTAL VIOLENT DOMINANCE reasons.
Again, I'm not saying nature is all rainbows and unicorn farts, but these documentaries give a false impression of the wild. Where are the documentaries about the tight-knit, loving families of elephants? The surprisingly attentive parenting of crocodiles? The beautiful cooperation that happens between species? The spiders that form societies? The spiders that are vegetarian?
Ugh, nature is far more beautiful, complicated, and nuanced than Westoids think. But no, we can't help but project our own unconscious fascist bias onto it.
I know Animal Planet was like this too, but it seems like Netflix has leaned hard into the "nature as grimdark drama" angle.