In this series of articles, we provide a brief dictionary of terms surrounding data science, including AI, machine learning, and deep learning.
So, what is Explainable AI in Data Science in 2020?
Explainable (interpretable) AI models address a well-recognized problem: as we build newer and more innovative applications on top of neural networks, the question “How do they work?” becomes more and more pressing.
Opening the black box to enable transparency matters because, in many cases, we don’t really know why an AI model makes the choices it does. And as models grow more complex, producing an interpretable version of them becomes correspondingly harder.
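To make this concrete, here is a minimal sketch of one common model-agnostic interpretability technique: permutation importance. The idea is to treat the model as a black box, shuffle one input feature at a time, and measure how much the model's accuracy drops; features whose shuffling hurts the most are the ones the model relies on. The `model_predict` function, dataset, and feature weights below are all hypothetical stand-ins invented for illustration, not anything from a specific library or the article itself.

```python
import random

def model_predict(row):
    # Hypothetical "black box": in practice this would be a trained
    # neural network. Here, a hand-made linear rule stands in for it,
    # where feature 0 dominates and feature 1 is nearly irrelevant.
    return 1 if 2.0 * row[0] + 0.1 * row[1] > 1.0 else 0

# Tiny synthetic dataset (made up for this sketch).
data = [[0.9, 0.2], [0.1, 0.8], [0.7, 0.5], [0.2, 0.1],
        [0.8, 0.9], [0.3, 0.4], [0.95, 0.05], [0.05, 0.95]]
# Labels come from the model itself, so baseline accuracy is perfect.
labels = [model_predict(r) for r in data]

def accuracy(rows):
    correct = sum(model_predict(r) == y for r, y in zip(rows, labels))
    return correct / len(rows)

def permutation_importance(feature_idx, n_repeats=50, seed=0):
    # Shuffle one feature column repeatedly and average the accuracy drop.
    rng = random.Random(seed)
    baseline = accuracy(data)
    total_drop = 0.0
    for _ in range(n_repeats):
        column = [row[feature_idx] for row in data]
        rng.shuffle(column)
        perturbed = [row[:] for row in data]
        for row, value in zip(perturbed, column):
            row[feature_idx] = value
        total_drop += baseline - accuracy(perturbed)
    return total_drop / n_repeats

for i in range(2):
    print(f"feature {i}: importance = {permutation_importance(i):.3f}")
```

Running this shows a clearly positive importance score for feature 0 and a near-zero score for feature 1, matching what we know about the hidden rule. That is the appeal of such techniques: even without access to a model's internals, they give us a partial answer to "why did it decide that?"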