Isn't it sad that in America, boys are taught not to be leaders, that it's okay to do whatever they want? Isn't it sad that they are not taught how to look after their "woman"?
The other half is sad too: women are taught to be feminists (I'm talking extreme) and to totally disrespect men.
I tell ya, what is this world coming to? The way society is teaching young and old alike, it is teaching failure.