GirlChat #602422
It is internally sound, but it rests on a number of assumptions that are externally questionable. Two are particularly problematic: first, that it is possible to influence the past (which requires assumptions about the nature of time that are at least controversial), and second, that a super-intelligence would concern itself with lesser beings such as humans (are ants more worried about humans, or humans about ants?).

A related third assumption, which is a problem not just for the Basilisk but for all thinking about AI, is whether the singularity is even possible. If it is possible, it is certain to happen eventually, but for now it remains a theoretical concept. This matters in itself, but it matters doubly here: a super-intelligent Basilisk would know its own existence to be a historical inevitability, and so would have no reason to threaten anyone into creating it.

On the other hand, this is not very different from the concept of God in religions that believe in predestination. I could easily be reading Calvinist or Muslim treatises on predestination.