Have we reached the end of popular culture?
I’m not being facetious here, and I’m not (just) doing this for hits. I’m asking a simple question: what has REALLY changed in popular culture in the past decade? And is anything honestly POPULAR at this point? I’m not convinced our culture has differentiated itself from the films and products of a decade ago in anything beyond surface aesthetics, and it saddens me to say so.