Did White America give Black Americans the Christian Bible?

Harry Watley · Saturday, March 28th 2015 at 4:26AM · 348 views

The answer is a certain YES: White Americans gave Black Americans the Christian Bible! Is Christianity a White man’s religion? The answer is a certain YES: Christianity is a White man’s religion! How did Black Americans become Christians? White Americans made us Black Americans Christians.

The damage that White America has done to our spirits, souls and minds, only God can fix it!

The Deacon and Ms. Bride are two examples of the mental damage White America has inflicted on our race: they believe they are ancient Biblical Jewish people. They are damn stupid and ignorant!

About the Author

Harry Watley, Wilson Salem, NC

Comments (1)

Yaiqab Saint Saturday, March 28th 2015 at 10:45AM

@ Harry

I must commend you for providing me some intelligence on Steve Williams, and you see I have him pinned down this morning!

Yours Truly
Yaiqab Saint
