
Business insights using RAG–LLMs: A review and case study

Arslan, Muhammad; Munawar, Saba; Cruz, Christophe

Abstract

As organizations increasingly rely on diverse data sources like invoices and surveys, efficient Information Extraction (IE) is crucial. Natural Language Processing (NLP) enhances IE through tasks such as Named Entity Recognition (NER), Relation Extraction (RE), Event Extraction (EE), Term Extraction (TE), and Topic Modeling (TM). However, implementing these methods requires significant expertise, which smaller organizations often lack. Large Language Models (LLMs), powered by Generative Artificial Intelligence (GenAI), can address this by performing multiple IE tasks without extensive development costs, yet they may struggle with domain-specific accuracy. Integrating Retrieval-Augmented Generation (RAG) with LLMs improves precision by incorporating external data. Despite this potential, research on RAG-LLM applications in the business domain is limited. This article reviews Business IE systems, explores RAG-LLM applications across disciplines, and presents a case study demonstrating how RAG-LLMs can enhance business insights, offering scalable, cost-effective solutions.
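
For readers unfamiliar with the retrieve-then-generate pattern the abstract refers to, the minimal Python sketch below illustrates the general idea. It is not taken from the article: the toy business documents, the TF-IDF retriever, and the call_llm placeholder are assumptions used only to show how retrieved context is folded into an LLM prompt before generation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus standing in for heterogeneous business documents (invoices, surveys).
documents = [
    "Invoice 1042: ACME Corp, total 12,400 EUR, due 2024-08-01.",
    "Survey result: 72% of customers rated support as 'good' or better.",
    "Invoice 1043: Globex Ltd, total 3,150 EUR, due 2024-09-15.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (TF-IDF cosine similarity)."""
    matrix = TfidfVectorizer().fit_transform(docs + [query])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return [docs[i] for i in scores.argsort()[::-1][:k]]

def build_prompt(query: str, context: list[str]) -> str:
    """Augment the user question with the retrieved context before calling an LLM."""
    context_block = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{context_block}\n\nQuestion: {query}"

query = "Which invoices are due before September?"
prompt = build_prompt(query, retrieve(query, documents))
# answer = call_llm(prompt)  # call_llm is a hypothetical stand-in for any LLM client
print(prompt)
```

In practice the sparse TF-IDF retriever would typically be replaced by a dense vector store over the organization's own documents, which is what gives a RAG-LLM its domain-specific grounding.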

Journal Article Type: Article
Acceptance Date: Sep 21, 2024
Online Publication Date: Oct 3, 2024
Deposit Date: Dec 24, 2024
Publicly Available Date: Jan 2, 2025
Journal: Journal of Decision Systems
Print ISSN: 1246-0125
Electronic ISSN: 2116-7052
Publisher: Taylor & Francis
Peer Reviewed: Peer Reviewed
DOI: https://doi.org/10.1080/12460125.2024.2410040
Public URL: https://uwe-repository.worktribe.com/output/13294855
Publisher URL: https://doi.org/10.1080/12460125.2024.2410040
