13/08/22 06:42
Originally posted by alonso:
We've been hearing for a couple of generations now, at least, from feminist writers and influencers, going back to the Simone de Beauvoirs and, more in our time, the Germaine Greers, haranguing society about injustice to women in many ways. And of course they were right, and gradually the merit, if you like, of their cause came to be accepted, and all kinds of adjustments and legal frameworks were put in place to rectify things and create a balance.

But I look around one day and I see that nearly all the journalists on TV are women; I think there would be more female than male journalists overall. Most teachers are women. In fact, I suggest that when it comes to influencers, there are increasingly more women than men. Are men being overlooked in recruitment for many power positions? I've got a daughter who's an executive, so not that there's anything wrong with what we've got now.

However, the other thing that strikes me more and more is that younger women in positions of influence are becoming like most men used to be, i.e. aggressive (which might be too strong a word, but in many cases it isn't). Is this where we're headed? A society where women have become the dominant holders or determiners of power and influence? We're not there yet, but I seem to see a trend, and will it end well? Does anybody else see this trend, and is it going to be good for society? There are many implications.
I often ask young women: if you were running a small business and balancing the books was your focus, and you were told you had to employ young women entitled to two years' maternity leave on full pay, would you accept this? You wouldn't be allowed to hire anyone else and would have to hold their job for them. Most say that is acceptable. How can they be respected as leaders when their biases so obviously get in the way of their thinking?