Are Women Taking Over the World?
Heisenberg 2012/08/23 17:00:00
Are women taking over the world? According to The End of Men, an upcoming book by Atlantic editor Hanna Rosin, they're getting pretty close -- and in many cases, they're leaving their men behind. Today, The End of Men reports, women earn almost 60 percent of all bachelor's degrees, hold more than half of all managerial and professional jobs, and are soaring in the ranks of medical, law, and business schools.