Oh my god. Do women “choose” to go into lower-paying careers, or is it that “women’s work” is seen as less important and thus less worthy? These days, there aren’t that many jobs that are downright dangerous. Mostly the divide is in dealing with humans. Any job that requires taking care of humans — e.g. teaching, nursing, elder care, social work — is seen as “women’s work” and thus is paid less. Were men suddenly to decide to flood the daycare or elementary school market, wages and benefits would go up.