The word "metaphysics" wasn't actually used by Aristotle, but was a label applied later to a set of writings that seemed to come "after" rather than "beyond" in some reasonable ordering of topics. What we philosophers call "metaphysics" these days covers a broad swath of territory and includes a variety of questions about the nature of things (What is causation? Is everything physical? Is there such a thing as free will?…).
Some of the claims made by New Age thinkers would count as metaphysical by any reasonable accounting (for example: the idea that there is a non-physical "astral plane" or that matter is created by mind.) The belief that certain kinds of stones have healing properties, on the other hand, seems straightforwardly empirical and testable; not metaphysical in any obvious sense.
The main complaint philosophers have with New Age "metaphysical" claims is that they aren't subjected to analysis and critical assessment. They seem mostly based on traditions in need of evaluation or uncritical intuition or appeals to authorities that we don't have good reason to accept. At least, this is what I found some years back when I made a sympathetic attempt to understand various strands of New Age thought. Philosophers who do metaphysics take for granted that they need to argue for what they say and take account of objections and alternative views. And that gives rise to an obvious complaint: metaphysics, as a branch of philosophy, tries to go about its business with intellectual rigor and care. If the term comes mainly to be associated with an enterprise that lacks intellectual standards, then philosophers would reasonably see this as unfortunate. But while philosophers might wish that New Age thinkers would pick a different word, that horse has long since left the barn!
I'd add: if there were good reason to believe that certain New Age claims were true, that would be surprising, but it would be interesting and worth talking about. At least in my experience, however, many proponents of New Age ideas don't see the need to give reasons. And therein lies the most important difference with philosophers.
I think there's rather more that can be said here (and, for what it's worth, I don't actually agree that "words mean what we use them to mean").
We probably need to distinguish a couple different things here. One kind of case is that of idiom. These are linguistic expressions, like "kicked the bucket", whose meaning has nothing to do with the component words. These sorts of phrases are really just single words, but long ones, and there are good tests for when you have an idiom. Note, e.g., that I cannot say "The bucket was kicked by John" and have it mean the same as "John kicked the bucket", where the latter is the idiomatic use meaning "John died".
It might well be that "not bad" in this kind of case is an idiom, but the case seems to me to have many features of a case of implicature. Here's a standard kind of example. Suppose Professor Jones writes a letter of recommendation for Mr Smith. The letter says:
To whom it may concern:
Smith has excellent handwriting and was never late for class.
Yours, Prof Jones
Now Jones certainly hasn't said that Smith isn't qualified for whatever the letter was supposed to recommend him for. But he's made his view pretty clear. Why? Well, there's a story to be told about that. Jones knows what kind of letter he's expected to write, but he's totally failed to do that. Why? The obvious thought is that Jones is just saying something positive, and that's all he's got to say that's positive. So you can infer what Jones actually thinks from what he does say and the situation in which it's said.
Here's another kind of case. Suppose I say, "Most of the students passed the test". I do not, when I say that, also say that not all of them did. You can see that because I could continue, "In fact, all of them did". But if I don't continue that way, then you might reasonably infer that not all of them did. Why? Because, if all of them did, I could just have said so. But I didn't. Now, maybe in certain circumstances, that wouldn't be relevant. Maybe all that matters is that most of them passed and, if so, then you shouldn't draw that inference. But in a normal case, you could draw it, and reasonably so.
This kind of example illustrates what's known as the "conversational maxim of quantity", which says, more or less: When you say something, say the most informative thing you can say on the topic, given the general parameters of the conversation.
The case of "not bad" is like that. If the salad were delicious, then you could have said so. Indeed, it's reasonable to suppose you would have said so. But you didn't say so, so it must not really be delicious. Rather, it's merely okay. Minimally not bad. That's it.
Like any real experiment, a thought experiment (or analogy, case study, or example), in order to be valid evidence for some position, has to be conceived of as being repeatable. So my thought experiment should be compelling on its own terms, and not because of some special context that makes it compelling. Only then will the thought experiment (or whatever) have validity beyond that context. 'All things being equal' is thus akin to the notion of controlling variables.
Great question. I would think, though, that the onus would be on the skeptic raising the question to give solid reasons why communication would NOT be possible. Already on earth we find different cultures with different languages, different "habits," different ways of thinking and ethics, and in fact even different 'types of logic' apparently (or so some cultural psychologists have argued at various points). But why should that make some form of communication impossible? True, the process of translating between such languages might be more complicated than it would be if more were shared; and true, more miscommunication might well occur during the translations, due to all sorts of pragmatic reasons. But that seems very far removed from holding a strong claim such as "communication is impossible." Absent such an explicit argument, and given the kind of counterevidence different earth cultures already present, I would have to go with a "yes" answer to your question! (By the way, Donald Davidson famously argued, in "On the Very Idea of a Conceptual Scheme," that the claim that there could be other conceptual schemes different from "ours" doesn't make sense -- such schemes would not be translatable into ours, but the idea of untranslatable languages also doesn't make sense ... Pretty relevant to your concerns -- check it out!)
A valuable paper on this topic, written by a psychologist, but with many discussions of Descartes's and Spinoza's views on these issues, is:
Gilbert, Daniel T. (1991), "How Mental Systems Believe", American Psychologist 46(2) (February): 107-119
(online at http://www.wjh.harvard.edu/~dtg/Gillbert%20(How%20Mental%20Systems%20Believe).PDF)
Briefly, Gilbert argues that (his interpretation of) Spinoza's view that believing is part of understanding and that one must believe a proposition before one can reject it is psychologically more valid than (his interpretation of) Descartes's view that believing or disbelieving a proposition must psychologically and logically come after understanding it.
Most philosophers are still very much interested in, and try to engage regularly in, live discussions with others. You won't find many of us claiming, for example, that teaching philosophy can effectively be done remotely. The direct exchange of ideas and the interplay of active minds in an immediate person-to-person context still seems to most of us to be critical.
On the other hand, writing and reading make ideas more available than does simply speaking. You don't have to have known Kant to read his works and be engaged with and influenced by his thought. Reading and writing are the skills of our globalized age, and they allow us to transcend time by "speaking" to those whom we can never meet, because of distance in space or time. By reading my colleagues' work (before I meet them in person), I can get to know which of them I want to speak with in person. So as important as speaking, listening and such are, writing and reading offer distinct advantages that we would be very much impoverished without!
I confess I'm puzzled by Prof. Heck's reply. He defends the following three assumptions:
(1) If you understand a proposition, then you also understand its negation.
(2) It is necessary, if you are to believe a proposition, to understand it.
(3) It's perfectly possible to believe a proposition and not understand its negation.
I interpret those assumptions as follows:
(1*) Understanding P entails understanding not-P.
(2*) Believing P entails understanding P.
(3*) Believing P doesn't entail understanding not-P.
(1*)-(3*) imply a contradiction: Believing P does and doesn't entail understanding not-P. If so, then (1)-(3) imply everything (if I've interpreted them correctly). I also don't see how the falsity of (3) implies that we would always have to believe contradictions. If (3) is false, then believing P entails understanding not-P; I don't see how any unwelcome consequences follow from that.
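For what it's worth, the contradiction can be displayed as a short chain of entailments (the abbreviations $\mathrm{Bel}$ for "believes" and $\mathrm{Und}$ for "understands" are mine, introduced just to make the steps explicit):

```latex
\begin{align*}
\mathrm{Bel}(P) &\Rightarrow \mathrm{Und}(P)        && \text{by (2*)}\\
\mathrm{Und}(P) &\Rightarrow \mathrm{Und}(\neg P)   && \text{by (1*)}\\
\therefore\ \mathrm{Bel}(P) &\Rightarrow \mathrm{Und}(\neg P) && \text{by transitivity of entailment}
\end{align*}
```

The conclusion of the chain is precisely the denial of (3*), so (1*)-(3*) cannot all be true together.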
PLEASE NOTE: (3) above was taken from Professor Heck's original posting, which he has since amended. [Alexander George on 6/6/2014.]
I would say that there is not, in any general sense, a right or wrong way to use a word. There are various generalizations about how people actually use words that are captured in dictionaries. Dictionaries tell us how people do use words, not how they should use them. There is no such thing as the correct use of the term 'game', or any other term. If a person uses a word intending to mean something by it that is not in line with the dictionary, there is nothing wrong with that per se. We do it all the time, for many reasons, many of them very good. We just need to be careful not to be misunderstood. And we might use a word thinking we are using it in line with the dictionaries and be wrong about that.
Nobel Prize-winning physicist Richard Feynman said, "People often complain of the unwarranted extension of the ideas of particles, paths etc. into the atomic realm." He responds to the complaint that the extension is unwarranted: "Not so at all; there is nothing unwarranted about the extension. We must, and we should, and we always do extend as far as we can beyond what we already know, beyond those ideas that we have already obtained ... it is the only way to make progress" (The Character of Physical Law). He could have said the same thing about using the words 'particle' etc. in a new way. Calling something 'a particle' when it isn't a particle in the usual sense can be a vital way to make progress.
Extending the use of terms in the way Feynman discusses is vital to the progress of knowledge. But of course we have to try our best to be careful when we do this, so as not to create confusion.