Nearly every American is disillusioned with America.
I mean, let’s face it: as both a reality and a concept, the old girl has gradually yet steadily lost whatever luster she once had.
We’ve been pitted against one another, increasingly so, for quite some time now, and that divide only grows wider by the day. The left sees an incoming fascist takeover by the right, and the right sees the left as destroying its precious white-washed, white-bread way of life.
The fact that this is not a win-win situation for anyone is starting to sink in.
I seem to remember something someone said long ago about how “united we stand, divided we fall.”
Ring any bells?