Why are my employer’s religious beliefs more important than mine, is what I want to know. How is my employer being able to tell MY INSURANCE COMPANY what I can and can’t do with MY health choices not an infringement on my most basic of rights? Why is MY EMPLOYER’S opinion on my medical choices being held up on a higher pedestal than my fucking DOCTOR’S OPINION?
This is all bullshit. I am sick and tired of fighting this fight. There is NO goddamn reason why hormonal contraceptives should be THIS fucking controversial in 2014. Women are losing their most basic rights. The conservatives on the court and in political office want women to be thrown back into the 1950s. Why don’t they just take away our right to vote, too?
I don’t understand how this is actually happening. I don’t understand how there are people just letting this happen to us.
I mean, we’re now at the point where if you want to avoid a pregnancy, hormonal contraceptives are harder and harder to get. If you need the morning after pill, you’re at the whim of the “conscience” of the pharmacist who may or may not sell it to you. If you get pregnant, not only is abortion incredibly hard to obtain, BUT you can’t even expect to be safely escorted into the clinic to get one because there are no buffer zones.
Does anyone else notice a trend? ALL of these barriers exist because of the feelings of OTHER people or because of insurance companies. ALL of these problems are due to people interfering in someone’s health care where they have absolutely NO fucking business. Absolutely NONE of these take into account the religious beliefs, rights, or bottom line of the actual HUMAN BEING they affect.
I’ve said this before, but I’ll say it again: if this were ANY OTHER FORM of health care (diabetic care, elective plastic surgery, brain surgery, cardiology, ophthalmology, radiology, nutrition, etc.), this would all be considered someone’s own personal business between them and their doctor. NOT your employer’s business. NOT the business of the violent, yelling “sidewalk counselor” outside the clinic.