That was a really interesting article. One thing I wonder about, though: the AI has to get its concepts from somewhere. And some of the concepts it has make it almost human-like; specifically, it has not only conceptions but, like humans, MISconceptions as well. Could this reveal that it's not an AI at all, but a human posing as one? And wouldn't that bring us full circle? Humans create software that tries to imitate humans, and then humans try to imitate AI.