Women—particularly minority women and poor women—have been pushed to America's margins for centuries. It is a difficult and sobering truth. For most of our history, men have controlled access to wealth creation (via education), wealth growth (via real estate ownership), and the levers of the political system (via voting rights). With these powerful tools in hand, they built a society designed to cater to their own needs, wants, and desires. Whether implicitly or explicitly, men have deprived women of the resources they need to flourish, or even to survive.
Though many historical hurdles have been removed, policies and social norms remain that make life harder for women in the workforce and the public square. The United States remains the only developed country unwilling to mandate any paid leave for new parents, and in many cases women are still paid less than men for comparable work. Pregnancy discrimination claims are near all-time highs, and studies show that peers view working mothers as less committed to their jobs than working fathers. Society continues to expect women to bear the responsibility of nurturing children and to sacrifice other pursuits and passions to "tend to their homes."