It does feel to me as though romance novels are EVERYWHERE right now. Every week, a new romance bookstore opens; the New York Times, Entertainment Weekly, even The Guardian all review romance. Bookstores that previously didn’t carry romance now have whole sections. Don’t even get me started on TikTok.
So, does this mean the cultural bias against love stories has substantially ebbed? I’m curious what you think. I’ll share my own views in the comments, too!