I don't know which feminists you have in mind, but they do not strike me as a group likely to take a very positive view of masculinity. On the contrary, feminists tend to argue that our notions of what it is to be a man and a woman are linked, that they are based far more on culture than on nature, and that in any progressive social development both notions need to be questioned and eventually replaced by something better.
Response by Oliver Leaman
Sometimes I read feminists who say that their mission has nothing to do with emasculating men and that they think masculinity is wonderful. I am perplexed, since I don't know what this masculinity thing is or why it should matter. What is masculinity, and why should it matter to anyone whether it stays or goes?