Bo Jacobs · 3 months ago
White Americans are realizing the United States is everything Black Americans have been telling them it is for centuries.