Women's Role in Religions and Philosophy

It seems that most major world religions state that women should obey men, and that a man has the right to punish his wife if she disobeys.

First there is Christianity. The Bible, even in the New Testament, tells wives to submit to their husbands and to be subject to them. Then there is Islam. We all know the controversy there: one verse in the Quran even says that men may beat their wives if they rebel or threaten the family order. Even Hinduism, a religion open-minded toward many beliefs and with relatively little dogma, suggests in the Epic Lore that a woman should worship her husband as a god and do exactly what he says, even at the cost of her own well-being. And so on.

Why do you think so many of these major religions teach women to obey men, and does that teaching have any meaning in today's world? Personally, I think a man should be the leader of the family, but he has no right to force his wife to do something she strongly disagrees with or something that doesn't make sense. The man may have the final decision, but he must listen to what his wife has to say.