Definition of Feminism
Feminism: the belief in the social, economic, and political equality of the sexes.
It is also defined as the belief that women should be allowed the same rights, power, and opportunities as men and be treated in the same way. Although it largely originated in the West, feminism is manifested worldwide and is represented by various institutions committed to activity on behalf of women's rights and interests.