The United States of America is a “Christian” nation only in the sense that the majority of its inhabitants self-identify as Christian. But make no mistake, this nation was not founded on Christian ideals. It was founded on greed, built on the backs of slaves, and maintained through the continued subjugation of Black people. The American brand of Christianity has been used as a tool to justify all manner of evil and deception. True Christianity has been demonstrated by the kind of Americans who fought for and defended the rights of others, and too many times those true children of God have been martyred. God is real, though, and I believe wholeheartedly that one day this nation will face a reckoning for all its past and present sins, and maybe then this nation will finally be held to account for its stated ideals. Until then, we cannot lose hope. We have to continue to try to persuade, have the hard conversations with people, speak the truth, denounce the lies and injustice, organize, vote, and be the light in a dark world the way God wants us to. We have to fight this hate with love.