When we talk about “whitewashing” or “Americanizing” Christianity, we are talking about a gospel that teaches people that White American culture is normal and ideal.