Should Christians Engage The Culture?
Posted by Job on December 18, 2008
I commonly hear it asserted by pastors and theologians whom I respect that Christians should engage the culture. So, I am asking these questions of those of you who are wise and learned in things concerning God’s Word and commandments.
1. Does the Bible, especially the New Testament, though I will accept the Old Testament (no dominion or covenant theology interpretations of Old Testament scriptures, please), contain verses that command saints to engage the culture?
2. Does the Bible contain any examples of saints engaging the culture?
3. If the Bible does not contain any commandments or examples of saints engaging the culture, should Christians do it anyway?
4. What other things that the Bible neither commands nor gives examples of saints doing should Christians engage themselves in?
5. What things that the Bible neither commands nor gives examples of saints doing should Christians NOT engage themselves in, and why?
6. What is the goal of engaging the culture? Is it to spread the gospel? To oppose and restrain evil? Or is it to promote or preserve specific cultural norms?
7. Should Christians engage the culture only in cultures and populations that are majority or historically Christian? Or should Christians also engage the culture in areas where they are tiny and persecuted minorities with little or no Christian history?
8. Where should we focus our efforts? A. On primarily good, functional, prosperous cultures, to keep them from getting worse? B. Or on primarily wretched, dysfunctional, violent, impoverished cultures, to make them better?
9. What is the primary way or method that Christians should use to engage the culture, especially in the case of 8B?
10. If Christians successfully engage and change the culture, to whose will and glory is it, especially in the case of 8A? Is it to the will and glory of Christians (man) or of God?
Thoughtful, sincere discussion and debate from Bible-believing Christians would be appreciated.