So, some say, all men talk dirty about women. What does that prove? Does it make you feel good? Do you feel empowered because you think you have a right to do so? As a woman, I feel that such a mindset and behavior is a betrayal. A man's role is to protect and uphold his partner. Men who are truly secure in their position and authority wouldn't demean women. Only those who hate women or fear rejection would.