I was just wondering if any of you have heard of Dominionism and your thoughts on it. I understand that it includes groups such as "Joel's Army", "Latter Day Reign" and the like. What's your opinion of these groups?
Dominionism is nothing new. It existed long ago in a doctrine called Manifest Destiny, the notion that America was destined by God to stretch from coast to coast, spreading Christianity and civilization across the continent. When society's conscience began to ache from all the killing and enslaving, Manifest Destiny was a sort of salve that warlike types used to excuse themselves. The same ideas drove Spain into South America.
The difference is that while dominionists are not warmongers, they are trying to change society through political means while claiming all the while that it is spiritual. This represents a lack of faith in Christ, whose work is in the realm of ideas, not of politics. They think that by outlawing sin they can improve society, but what they are actually outlawing is faith.