Jan 18, 2024
So, if Nikki Haley believes America has never been a racist country, how does she explain her own acknowledgment that she experienced racism growing up? And what kind of history books and teachers did she have that left out so much: the genocide of Native Americans, our shameful history of owning and abusing slaves, Jim Crow, the lynching of Black people, the need for the Civil Rights Movement, and more?