The university has the image of being a haven of sorts, a place where ideas can be explored and a person can grow intellectually and personally, regardless of gender or race/ethnicity. Although universities have historically been dominated by men, women have made great gains in the last few decades. Yoder (1999) reports that women earned more than half of all bachelor's and master's degrees and about 40% of all Ph.D.s and professional degrees. More and more women are entering faculty and other professional positions at universities. Does this mean that universities are havens for women, places where we can grow personally and professionally without encountering the barriers that face professional women in business or other work environments?