I think most people would agree that women still have reasons to be discontented with the supposed equality between the sexes. It seems to me that the typical woman's role (the idea that women are weaker and should raise the children, clean the house, and so on) is something that was created hundreds of years ago. The fact that this ideal has lasted so long makes me wonder: will women ever be able to move past it? Even today, with a growing number of stay-at-home dads and working women, housework still carries feminine connotations. Is it possible for women to ever truly be equal with men? Or was the damage done centuries ago too much to overcome? What do you guys think?