Masculinity in America has never been under attack the way it is today. We have reached the point where the term itself is considered toxic or offensive by many. American men are conflicted about their role in society. The message that has proliferated across our nation is that masculinity is, by nature, bad and is the root cause of many of the problems plaguing our society. Everything from racism to pedophilia has been blamed on "toxic masculinity." Some colleges and universities now offer classes on how to overcome, or be delivered from, this very "threatening" phenomenon called "masculinity."
If men take up the biblical mandates ordained by their Creator—no matter their color, nationality, station, upbringing, or education—a new vision can be cast and carried out, one that restores a civil and prosperous America for all.