Introduction
Gender roles in America have changed since colonization, and they are still changing today. From colonization through the 1820s, gender roles were sharply divided, with men and women holding separate legal rights. From the 1820s to the 1900s, gender roles began to shift as women's rights expanded. From the 1900s to the present, the roles of men and women have grown increasingly similar. Although women historically lacked the civil rights granted to men, American women in the 21st century have finally been granted equal rights protected by law.