International Journal of Innovative Research in Engineering & Multidisciplinary Physical Sciences
E-ISSN: 2349-7300 | Impact Factor: 9.907

A Widely Indexed Open Access Peer Reviewed Online Scholarly International Journal


Explainable AI (XAI): Methods and Techniques to Make Deep Learning Models More Interpretable and Their Real-World Implications

Authors: Gaurav Kashyap

DOI: https://doi.org/10.5281/zenodo.14382747

Short DOI: https://doi.org/g8vbhn

Country: USA



Abstract: The goal of the developing field of explainable artificial intelligence (XAI) is to make complex AI models, especially deep learning (DL) models, which are frequently criticized as "black boxes," more interpretable. As deep learning is adopted across more industries, understanding how these models make decisions is becoming crucial for accountability, fairness, and trust. This paper offers a thorough analysis of the strategies and techniques used to improve the interpretability of deep learning models, including model-specific strategies, post-hoc explanations, and hybrid approaches. We examine the trade-offs among interpretability, accuracy, and computational complexity, and draw attention to the difficulties of applying XAI in high-stakes domains such as healthcare, finance, and autonomous systems. The paper concludes by outlining the practical implications of XAI for ethical AI deployment, regulatory compliance, and decision-making.
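To make the abstract's notion of a post-hoc explanation concrete, the following is a minimal, hypothetical sketch (not from the paper): a model-agnostic perturbation method that estimates each input's local influence on a black-box model by finite-difference sensitivity. The `black_box` function and all names here are illustrative assumptions standing in for an opaque deep learning model.

```python
import math

# Hypothetical "black-box" model standing in for a trained DL network:
# a logistic score over two features with unequal weights.
def black_box(x1, x2):
    return 1.0 / (1.0 + math.exp(-(2.0 * x1 - 0.5 * x2)))

def perturbation_importance(f, x, eps=1e-4):
    """Post-hoc, model-agnostic explanation: perturb each feature
    slightly and measure how much the model's output shifts."""
    base = f(*x)
    sensitivities = []
    for i in range(len(x)):
        perturbed = list(x)
        perturbed[i] += eps
        sensitivities.append((f(*perturbed) - base) / eps)
    return sensitivities

# Explain one prediction: the first feature (weight 2.0) should
# dominate the second (weight -0.5) in local influence.
scores = perturbation_importance(black_box, (0.3, 0.7))
```

This is the simplest member of the perturbation family the abstract alludes to; practical tools such as LIME or SHAP refine the same idea with local surrogate models and principled attribution.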

Keywords: Explainable AI (XAI), Deep Learning (DL), Decision Trees, Rule-Based Models, Linear Models.


Paper Id: 231825

Published On: 2023-07-05

Published In: Volume 11, Issue 4, July-August 2023

Cite This: Explainable AI (XAI): Methods and Techniques to Make Deep Learning Models More Interpretable and Their Real-World Implications - Gaurav Kashyap - IJIRMPS Volume 11, Issue 4, July-August 2023. DOI 10.5281/zenodo.14382747
