Musings and Whiteboard Shots

Wednesday, March 4, 2015

Gender Roles: Have They Changed?

For centuries, men were the ones who led societies and families and did the work that kept the people around them alive. Men have clearly been the dominant sex throughout history, but that seems to be changing, not quickly, but changing nonetheless. In recent years, women have begun doing things long considered "guy things," like heading million-dollar companies, defending our country, and working jobs that require manual labor. With these changes under way, and equality between the sexes being discussed now more than ever, is male dominance starting to decline? It does look like it is heading in that direction, but not completely. One place men still dominate is the workforce: they hold more government positions, make up a larger share of the military, and lead more of the top companies across the United States. Maybe one day, possibly soon, men and women will share dominance in all aspects of life, but we have not reached that point just yet.
