A Sociologist Talks about How the Appearance of Christianity Improved the Lives of Women

One common criticism of Christianity is that the religion denigrates women, particularly because some Christian churches do not allow women to serve in leadership roles. Moreover, many churches assign women a wholly subservient role, both in the church and in marriage, which critics argue makes the religion sexist. In his book, […]
