I recall reading a while back about how Brigham Young encouraged women to become doctors, as they were natural healers. I was reminded of this as I lay on my stomach getting stitches in my leg in the ER last week. The doctor tending to me was a woman.
Thinking back over the past few years, I've had quite a few experiences with female doctors. I've given birth to two children in recent years, so I've been to clinics and hospitals more than most people in a relatively short time frame, what with pre-natal check-ups, labour and delivery, then having to take my son to the ER on two occasions. It seemed to me that most of the younger doctors were women. My regular physician often has student doctors, who were mostly women. I wondered if it was my imagination or if there really were more women becoming doctors. I looked into it, and apparently, women are dominating medical schools across the board.
Based on my experiences, women do make better doctors than men. The short answer as to why I think so is that they're nicer. I sense that male doctors are more interested in looking at test results than listening to their patients, and that they see things in very black-and-white terms. They act like authority figures rather than public servants, which is how I think they should consider themselves. I also think it would be more comfortable for women patients to have women doctors. I know I feel this way. Plus, I insist that my healthcare professionals not only tolerate my stand-up routine, but like it. I like my current doctor, but I still wish he were a woman. He'd make a funny-looking woman, but then again, he makes a funny-looking man, so I guess it doesn't matter.
Regardless of my personal belief that women are superior physicians, there seems to be some worry over the increasing number of women graduating from med school. One concern is that the pendulum has swung too far and that now men are being discriminated against. I doubt it. My hunch is that women honestly find themselves more drawn to the profession. I think women become doctors because they think they'll enjoy it and be good at it, whereas men become doctors because the income, title, and status in the community appeal to them — hence why they act like authority figures.
The other worry is that women are more likely to work part-time, so staffing problems will become an issue when all the older, male doctors retire. My answer to this is that there should be efforts made to make full-time work more achievable and appealing.
Now, perhaps you've read this and think I'm being a little hard on male doctors. Maybe so, but experience has led me to prefer female doctors. What can I say? I think they're just plain ole better at it.