If technology advances to the point of recreating the world almost perfectly in a simulated reality, would there be anything wrong with choosing to live there?
As usual, the answer will depend on your ethical theory. For instance, some forms of utilitarianism might require that you enter the Matrix if doing so would maximize happiness (e.g., because you'd be much happier, outweighing any unhappiness you might cause to people in the 'real world' by being hooked up to the machine). Indeed, Robert Nozick used his Experience Machine thought experiment (a philosophical forerunner of The Matrix) to argue that there must be something wrong with utilitarianism, precisely because he thought we would not (and should not) hook up to the machine, in which our happiness would not be based on real actions and accomplishments. (There's some interesting experimental work on whether, and why, people say they would or would not hook up.)
For various reasons (not just utilitarian ones), I think everything depends on what you would be leaving behind and what you would be doing in the Matrix. I'm not sure what you meant when you wrote that we should "assume there is a moral disparity between someone with/without family, friends, attachments." But I take it that there would be nothing wrong with entering the Matrix for a person who would not thereby betray her obligations to others or to herself (e.g., if she had no family or friends, was on the verge of suicide, or could do no work that would help society). But there would be something wrong for the person who would be betraying such obligations to others (perhaps also including obligations to develop her own abilities, create a meaningful life, etc.).
But note that there may be Matrix setups that allow people to develop their abilities (creating art or learning new skills--like flying!), to help other people (who are interacting with them in the Matrix), to fulfill obligations, and so on. It may be that people in online gaming communities already form real friendships and really help other people, even in a world that is, in many ways, unreal.