An Ontology-Enabled Approach For User-Centered and Knowledge-Enabled Explanations of AI Systems
Subjects: cs.AI, cs.LG
Release Date: October 23, 2024
Authors: Shruthi Chari
Affiliation: Rensselaer Polytechnic Institute

| Setting | Competency Question | Answer | SPARQL Query Length | Property Restrictions Accessed? | Inference Required? | Filter Statements? |
|---|---|---|---|---|---|---|
| System Design | Q1. Which AI model(s) is/are capable of generating this explanation type (e.g., trace-based)? | Knowledge-based systems; machine learning model: decision trees | 8 | Yes | No | No |
| System Design | Q2. What example questions have been identified for counterfactual explanations? | "What other factors about the patient does the system know of?"; "What if the major problem was a fasting plasma glucose?" | 4 | No | No | No |
| System Design | Q3. What are the components of a scientific explanation? | Generated by an AI task; based on a recommendation; and based on evidence from a study or a basis from the scientific method | 2 | Yes | No | No |
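
Each row above maps a competency question to the SPARQL query that answers it over the explanation ontology. To illustrate what a query that must traverse OWL property restrictions looks like (the rows with "Property Restrictions Accessed? = Yes"), here is a minimal sketch for Q1. The prefix URI and the class and property names (`eo:TraceBasedExplanation`, `eo:isGeneratedBy`) are illustrative assumptions, not necessarily the ontology's exact terms.

```sparql
# Hypothetical sketch of competency question Q1: which AI models can
# generate trace-based explanations? Prefixes and term names below are
# assumptions for illustration, not the paper's exact vocabulary.
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
PREFIX eo:   <https://example.org/explanation-ontology#>

SELECT DISTINCT ?aiModel
WHERE {
  # Walk the OWL property restrictions attached to the explanation class,
  # rather than looking for direct triples on instances.
  eo:TraceBasedExplanation rdfs:subClassOf ?restriction .
  ?restriction a owl:Restriction ;
               owl:onProperty     eo:isGeneratedBy ;
               owl:someValuesFrom ?aiModel .
}
```

The shape of the query reflects the table's "Property Restrictions Accessed?" column: because the capability information is encoded as OWL restrictions on the explanation class rather than as direct assertions, the query matches the `owl:Restriction` node itself and reads off its `owl:someValuesFrom` filler, with no inference or FILTER statements required.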