Bikini medicine refers to the medical practice, research, and funding that focuses solely on the female breasts and reproductive system.
Today, we speak with a leader in the field of women’s health. When you hear that term, you could be forgiven for thinking of reproductive health. Some people have even called it bikini medicine because, for so long, the focus of research and medical practice seemed to be almost exclusively on the areas of the body you’d cover with a bikini.
Bikini medicine is an informal term for the tendency of some medical professionals, when treating women, to focus on health issues relating to the breasts and genital areas rather than on a woman’s overall health.
Bikini medicine has been criticized by advocacy groups because it does not take the health of the entire body into consideration. As such, it can be unhealthy, and even dangerous, for women.