By Patricia Fontejon
When I was little, I already knew what I wanted to do in the future. Growing up in a household where everyone was in the medical field, it was difficult not to be pushed into that arena. It’s what my dad, my mom, and even my distant older cousin expected of me. Everyone assumed I would follow in my parents’ footsteps and end up with a job that entails helping people.
“Are you going to be a nurse when you grow up?” my aunts asked me at every family gathering. Whether I knew them or not, and no matter how old I was, they took one look at me and assumed my future would be as a nurse rather than a doctor. It was a stereotype imposed on me simply because I was a girl.
My family had the same expectations for my brother, but they always presumed that he would be a doctor, regardless of his lack of interest in the medical field. According to my family, my brother would aspire to do greater things than be a nurse. What about me?
In America, it is common for people to assume that the majority of nurses are women. In many podcasts and shows, being a nurse is cast as a “woman’s job.” This idea can be traced back at least to WWII, when women were recruited to nurse injured soldiers.
According to the Association of American Medical Colleges, around 38 percent of doctors are women, and almost half of current medical students are female. So why are women traditionally viewed as nurses, rather than doctors?
Both doctors and nurses are integral parts of the healthcare system. Doctors typically diagnose and treat illnesses, prescribe medications, perform surgeries, and manage complex medical conditions. Nurses play an equally vital role by providing direct patient care and assisting doctors with procedures. Nurses generally follow the orders and treatment plans prescribed by doctors, although they have some autonomy in certain areas, such as administering medications and carrying out specific procedures.
In terms of salary, doctors generally earn more than nurses due to their longer education and training. According to the Bureau of Labor Statistics, the median annual wage for physicians and surgeons in the United States was at least $208,000 in 2020, compared to $75,330 for registered nurses.
Both professions are equally important, but it is worth considering that, whether a woman is a nurse or a doctor, she is often expected to juggle more duties than a man in the same position. Women are expected to give up their careers, or put them on pause, once they have a child; it happened to my mom and even my grandma. Men, on the other hand, are expected to keep pursuing their careers regardless of barriers such as poor mental or physical health.
Now that I’m older, I am indeed meeting my family’s expectation that I enter the medical field; however, I will strive to be a doctor when I grow up. I want not only to help people but also to prove to my family that women do not have to stay behind the scenes, the place nurses are too often relegated to.
Nursing isn’t an easy job, and the medical field would collapse without nurses. For many women, being a nurse is a fantastic choice and a fulfilling career. For me, however, I want to take on the extra challenge of becoming a doctor in a male-dominated field.