In a nation where, increasingly, belief in God cannot be assumed, and where Christianity is losing more and more of its sway in public discourse, what does membership in a church offer? Or, to put it another way, how might we say that church matters?
I’m curious how faith leaders might answer these questions, because I recently ran across a very difficult sort of answer.