Today, we’re announcing the launch of ‘Slice and Explain’™, an industry-first integrated AI Analytics Workflow powered by Explainable AI, expanding Fiddler’s industry-leading AI explanations.
Explainable AI, until recently a research topic, is now mainstream. But ML practitioners still struggle to apply it to get meaningful insights from their AI models, detect potential ML bias, debug customer complaints, and analyze overall performance.
Slice and Explain (S&E) was developed to mirror the ‘Drill-down Model Analysis’ paradigm familiar to data scientists and business analysts. In this paradigm, the user begins at the global dataset level, with global explanations and data insights, to get a sense of which inputs most affect the overall model output (e.g., debt-to-income ratio is the top indicator of lending risk).
Using these automated insights and their domain knowledge of the data, the user then drills down to understand how the model behaves for a specific region, or slice, of the data (e.g., the number of accounts opened in the past 24 months is more important for lending declines) or for a group of slices (e.g., loans made in CA vs. NY).
Finally, the user can further drill down to an individual row or “instance” to understand the local drivers of that model prediction and view it in the context of another prediction, its enclosing slice, or the global dataset.
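To make the three levels of this drill-down concrete, here is a minimal sketch using pandas and the open-source SHAP library as stand-ins; the dataset, column names, and model are hypothetical and are not Fiddler’s actual implementation.

```python
# Illustrative global -> slice -> instance drill-down on a hypothetical
# lending dataset; Fiddler's own explainers are not shown here.
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

df = pd.read_csv("loans.csv")  # hypothetical dataset with a binary "declined" label
features = ["debt_to_income", "accts_opened_24m", "loan_amount", "state_ca"]
model = GradientBoostingClassifier().fit(df[features], df["declined"])

explainer = shap.TreeExplainer(model)

# 1. Global level: which inputs drive the model across the whole dataset?
global_shap = explainer.shap_values(df[features])
global_importance = pd.DataFrame(global_shap, columns=features).abs().mean()
print(global_importance.sort_values(ascending=False))  # e.g. debt_to_income on top

# 2. Slice level: repeat the analysis on a region of the data (here, declines).
declines = df[df["declined"] == 1]
slice_shap = explainer.shap_values(declines[features])
slice_importance = pd.DataFrame(slice_shap, columns=features).abs().mean()
print(slice_importance.sort_values(ascending=False))  # e.g. accts_opened_24m rises

# 3. Instance level: local drivers of a single prediction.
row = declines[features].iloc[[0]]
print(explainer.shap_values(row))  # per-feature contributions for this one row
```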
S&E’s comprehensive AI analysis workflow is anchored by a single intuitive interface. It provides a SQL query text box to identify the slice, a data viewer for the slice, and explanation visualizations for the slice data.
S&E Analysis is driven by queries written in SQL, the most widely used analytics language, enabling quick drill-downs. The rich visualizations from these analyses can then be easily captured and shared with other AI stakeholders via the Fiddler dashboard.
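As an illustration of what a slice-defining query might look like, the sketch below runs a SQL query over the same hypothetical dataframe using DuckDB; the table and column names are illustrative, not Fiddler’s actual schema.

```python
# Hypothetical slice query in the spirit of S&E's SQL-driven drill-down,
# executed locally with DuckDB over a pandas DataFrame.
import duckdb
import pandas as pd

loans = pd.read_csv("loans.csv")  # same hypothetical dataset as above

# Identify a slice: declined California loans with a high debt-to-income ratio.
slice_df = duckdb.query("""
    SELECT *
    FROM loans
    WHERE state = 'CA'
      AND declined = 1
      AND debt_to_income > 0.4
""").to_df()

# The resulting slice can be fed to the same explainer used at the global level.
print(len(slice_df), "rows in slice")
```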
S&E is available today for all our customers on Fiddler’s on-premises and cloud offerings. Request a demo.