xAI is hard.

It’s hard for many reasons. A major reason is that, as an area of study, xAI is overloaded. Consider the following questions that you might ask of an AI system:

  1. Why did you do that?
  2. How did you do that?
  3. If I did X instead of Y, would your behavior change?
  4. Wait a minute, I asked you the same question 5 minutes ago and got a totally different answer. What gives?
  5. I think you’re wrong, can you convince me otherwise?
  6. I don’t believe you, can you convince me otherwise?
  7. I don’t trust you, can you convince me otherwise? (note that this is different from the previous question)
  8. I don’t trust me (or I don’t trust my data). Can you help?

The list goes on and on and on. What question are you asking when you want an “explanation” from AI? Better yet, what answer are you looking for?