I'm not sure what feminism actually is, or exactly when it started, but it seems to have come to light in our lifetime.

I wondered if anyone has more specific info, and how it has affected their lives.

It seemed like I was suddenly expected not only to be a wife and mother, but to hold a job as well. I certainly didn't see the males taking on any additional work. Seemed like a male conspiracy to extract more labor from us unsuspecting females, if you ask me.

How do you ladies see it?

smile