Gender Pay

Women are vital to the workforce of the United States, and they deserve equal pay for equal work alongside their male counterparts. When women earn fair wages, the benefits extend beyond individual households to the broader economy and society as a whole.