I had a conversation with my pastor a few weeks ago. I normally have casual chats with him about Christian faith and theology. He is a very good person and always encourages me. So I was quite alarmed when he said he wants Christendom to end everywhere in the world, especially in America.
The church I attend is on the more liberal side, and I suppose the pastor is too. From my conversations with him, I gather he is concerned about the atrocities committed by the British and other Western colonial powers in the name of Christianity.
I am just concerned about his real intention behind the words "I want Christendom to end in America." I personally believe we can't interpret the Bible through political lenses such as liberal, conservative, Democrat, etc. I also think that overly liberal thinking deviates from true Christian teaching and is more prone to adapting to postmodern culture.
I would appreciate some opinions or advice.