The majority of American adults believe that, more than 150 years after the 13th Amendment ended slavery, its legacy still affects how black people are treated in American society today. According to a recent Pew Research Center study, more than four in ten respondents say the country has not made enough progress toward racial equality. There is also some doubt that black people will ever enjoy equal rights with white people.
Racism has come to stay, as far as I can tell. Take South Africa, for example: to this day, black people are treated like slaves even in their own land. White people are only deceiving us by publicly claiming to fight racism, which is not true. I was watching a YouTube video titled "What do you think of Africa?" and it was upsetting; only a few people were interviewed, but the answers they gave were disgusting, because they all held negative views of Africa. If their elites were really fighting to eradicate racism, they would not say such things about Africa. They only become our friends when they see something good in us.