Moody's argument is imaginative, but not exactly rigorous: it is painfully difficult to identify his premises, and the inferences from them to the conclusion that conscious inessentialism (CI) is false. Charitable exegesis yields the following overarching reasoning:
Of course, Moody intimates a chain of inference which is supposed to establish (1). At the core of this sub-argument is the claim that if (CI) is true, then zombies are conceptually, or logically, possible. Once one accepts the logical possibility of beings who, behaviorally, pass for human persons, while at the same time lacking consciousness (= ``something it is like'' to be them), one must---by Moody's lights---accept the logical possibility of zombies evolving in another world, and, in turn, the logical possibility of a group of these creatures paying us a visit. All of this I regard as unexceptionable. In fact, I think (CI) itself is false---but for reasons having nothing to do with Moody's argument. I have articulated these reasons elsewhere; put synoptically here, the problem for (CI) is that there is mental activity that crucially involves phenomenal, subjective, ``what it's like to be'' consciousness. My favorite example is the mental activity involved in composing belletristic fiction; specifically, the adoption, by an author of such fiction, of the point of view of a character therein. Now I know that Moody tells us that
\begin{quote}
Increasingly, scientists are finding that what happens in consciousness is not essential for understanding mental functioning. We recognize each other, solve problems, use language, and although all these things have `conscious accompaniments' it seems that the real work is not done consciously at all.
\end{quote}
But this is sanguinity unsupported by the evidence. The fact is, highly creative behavior, whether it be the production of belletristic narrative or the discovery of a startling theorem, is currently inexplicable from the standpoint of computation-based cognitive psychology---and part of the proof of this is the complete absence of any computational artifacts that accomplish the tasks in question, despite AI's concerted efforts to create them.
Nonetheless, for the sake of argument I am prepared to pretend that (CI) is near and dear to my heart. Under this assumption, does Moody do damage to the thesis? I don't think so: