When television started, commercials and shows cast only women in the role of nurses, and the media effectively branded nursing as a "women's job." The show Nurses, a sitcom that ran from 1991 to 1994, focused on female nurses doing their jobs; the nurses were women while the doctors were men.
It wasn't until around the turn of the 21st century that things started changing in the media, with shows where men were nurses and women were doctors. Grey's Anatomy, a popular ABC show, focuses on a group of female doctors alongside male doctors. It also features a few male nurses, reflecting the growing number of men entering the profession.
Another great show that features male nurses is Code Black. Over the years, women are no longer portrayed only as nurses, and men are no longer portrayed only as doctors. The media has helped show that men like me can go into nursing without worrying about being judged for it.
Statistics show that in 1970 about 2.7% of registered nurses were men. By 2011 that share had risen to 9.6%, an increase of nearly seven percentage points, more than tripling over four decades.
https://www.census.gov/people/io/files/Men_in_Nursing_Occupations.pdf
In past centuries, nurses were uneducated, held low status, and were considered cheap and unskilled. What people don't realize is that gender shouldn't matter in the profession; what really matters is how skilled someone is in practice.

The media has portrayed medicine falsely in many ways, and Grey's Anatomy is one of the earliest examples of this false image. I understand that Grey's Anatomy's producers must sometimes sacrifice accuracy to engage their audience and boost ratings, but I dislike the way the show glosses over the residents' lives and often presents false medical information. By portraying residents' lives as exciting, dramatic, and romantic, the show makes real-world residency look far less difficult than it actually is.

Although nursing is now a common career choice for men, many false details and stereotypes are still associated with being a male nurse. The benefits of becoming a nurse may outweigh the challenges, such as workplace stress and the demands of developing the skills a difficult profession requires, but there are still reasons men hesitate to pursue a nursing career. Even today, despite the growth of healthcare, nursing is still perceived as a less prestigious field.