I couldn't source it yet, but I've been thinking about this quote by
Albert Camus:
"Fiction is the lie through which we tell the truth". I rarely read fiction anymore. Despite that, I am aware that the stories and the myths we are told determine how we see the world and our place in it. Is fiction the best way to tell the truth? Or, is fiction simply the exploitation of emotion?
More often than not, fiction does not tell the truth. It could, but it usually doesn't. Fiction relieves the author of the pressure of "getting it 100% right" and allows the author to explore subjects and ideas that would be impossible to know for certain. Surely fiction could point to the truth, even more so than journalism, non-fiction, biography, and history. And that's what's happening.
These days, journalism, history and "statements of fact" are more dubious in their "facts" than fiction is. Take this statement by Vice President
Joe Biden:
"No law-abiding citizen in the United States of America has any fear that their constitutional rights will be infringed in any way. None. Zero." So-called
news, whether in print or on cable networks, reports selective
factoids while avoiding all inconvenient facts, neatly fitted into a predetermined
fictitious narrative. What is on the news is not news.
Why does any of this matter? If the truth matters to you, then these questions matter. If not, they don't.