Did White America give Black Americans the Christian Bible?
The answer is certainly YES: White Americans gave Black Americans the Christian Bible! Is Christianity a White man’s religion? The answer is certainly YES: Christianity is a White man’s religion! How did Black Americans become Christians? White Americans made us Black Americans Christians.
The damage that White America has done to our spirits, souls, and minds, only God can fix!
The Deacon and Ms. Bride are two examples of the mental damage White America has inflicted on our race: they believe they are ancient Biblical Jewish people. They are damn stupid and ignorant!
@ Harry
I must commend you for providing me some intelligence on Steve Williams, and as you can see, I have him pinned down this morning!
Yours Truly
Yaiqab Saint