Women’s Rights in the Workplace


Today, women are legally entitled to the same rights in the workplace as men. Women have worked in some capacity in the United States since the country was founded. During World War I, with so many men sent off to war, women entered the workforce in greater numbers, filling positions that had previously been held only by men. Even so, the women in these jobs did not have the rights that men had. It was not until the passage of Title VII of the Civil Rights Act in 1964 that women began to gain legal protections in the workplace. Title VII states that “employers may not discriminate against people on the basis of race, color, religion, sex or national origin.” In other words, employers are not allowed to exclude qualified women from available jobs.

Even with Title VII in effect for over 50 years, we still see remnants of workplace discrimination. Issues such as the wage gap, the “glass ceiling,” and pregnancy discrimination continue to work against women in the workplace. Many women who are pregnant or breastfeeding are fired or pushed out of their jobs. According to the American Civil Liberties Union, “this practice is often rooted in the stereotype that women should be mothers, not workers, and it is reinforced by workplace policies modeled on traditional male norms.”

Women’s rights in the workplace include the right to work while pregnant, the right not to be sexually harassed, and the right not to be forced to work in a hostile environment where a woman endures sexual comments, touching, or materials. Additionally, a woman has the right to work in a place where she does not fear losing her job if she refuses to comply with sexual advances. Any woman who feels she does not have these rights in her workplace is urged to report it to a supervisor or to the Equal Employment Opportunity Commission.