White Christians have set the tone for this country since its founding. But that's changing in profound ways. For one thing, white Christians no longer make up a majority of the nation. As the cultural and religious ground shifts under them, we'll see how their influence is changing.
Guest:
- Robert Jones is the author of The End of White Christian America. He’s also the founding CEO of the Public Religion Research Institute.