10 Minute Martech

Eudald Camprubí: Accelerate Your AI-Readiness

Episode Summary

Eudald Camprubí, Co-Founder of Nuclia (acquired by Progress), breaks down what agentic RAG actually is—and why it’s becoming foundational for marketers who need secure, hallucination-free AI that runs on their own data.

Episode Notes

Eudald Camprubí, Co-Founder of Nuclia and Progress Software Fellow, joins Sara to demystify agentic RAG—what it is, why it matters, and how it’s transforming the way marketers deliver hyper-personalized, contextual experiences at scale.

Eudald explains how retrieval-augmented generation allows organizations to ask questions of their own knowledge—securely, privately, and without the hallucinations that plague general-purpose LLMs. 

Notable quotes from Eudald:
“RAG is a way for organizations to ask questions to their own knowledge and get answers—while keeping data security and privacy in place.”

“We’re not using general knowledge. We’re using a very small context we know is useful to answer the user’s question—and this is how we avoid hallucinations.”

“Humans are changing how they want to access information. Everything is about conversations and getting answers in the right context.”

Links & Resources:
Connect with Sara: linkedin.com/in/sara-faatz-b67213
Connect with Eudald: linkedin.com/in/eudaldcamprubi
Learn about Progress® Agentic RAG platform: progress.com/agentic-rag
Learn more about Progress: www.progress.com

Timestamps:
00:45 – Understanding AI and Avoiding Hallucinations
01:20 – The Concept of Agentic RAG
02:40 – Combating Hallucinations in AI
03:50 – Marketing Use Cases for Agentic RAG
06:05 – Small Language Models vs. Large Language Models
07:15 – Future of AI and Agentic RAG
09:40 – Inspiration Behind Nuclia and Progress Agentic RAG

Episode Transcription

0:00:00.1 Sara Faatz: I'm Sara Faatz, and I lead community and awareness at Progress. This is 10 Minute Martech.

0:00:05.0 Eudald Camprubí: ChatGPT, they're using all publicly available data, and then for the model it's very easy to somehow hallucinate. The way RAG works is that there is a previous step, which is making your data AI-ready. We are not using general knowledge, we are just using a very small context that we know can be useful to answer the user's question. And this is how we avoid hallucinations.

0:00:28.3 Sara Faatz: That's Eudald Camprubí, co-founder and creator of Nuclia, now Progress Agentic RAG, and a Progress Software Fellow. Let's get started. So every guest we've had on the show this year has talked about AI in some way, shape, or form. Right? I mean, it's no surprise: the proliferation and democratization of AI have fundamentally changed human behavior, and they've caused marketers and martech teams around the world to, at a minimum, alter their playbooks, but in many cases throw them out completely. I usually start by asking my guests what keeps them up at night, and I will get to that. However, I'd like to start this episode a little differently. I'd like to ask you to share with our listeners, in the simplest of terms, what is Agentic RAG, and what is the impact it is having, or could have, on the marketing and martech landscape today?

0:01:19.1 Eudald Camprubí: So RAG stands for Retrieval-Augmented Generation, and there's a very, very easy way to explain to everyone what it really means. Let's take ChatGPT as an example. Basically, when you ask a question, the knowledge is already there and you get the answer. And this sometimes seems like magic. It's not magic, but it seems like magic. So what if you would like to have exactly the same behavior, but instead of asking a question and getting the answer on top of publicly available data, you would like to get the answer on top of your internal data, on top of the information that you own? Basically, RAG is this. It's a way for organizations to be able to ask questions to their own knowledge and get outputs, get answers, while making sure that they keep data security and data privacy in place. So if we plug agents together with RAG, we get Agentic RAG. And that means that we can use our internal knowledge together with agents that are able to augment the quality of the information that we have. They are able to auto-classify, summarize, create your own knowledge graph, and a lot of different things. But basically, this is what Agentic RAG means.
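To make the "agents that augment your knowledge" idea concrete, here is a minimal sketch of an ingestion-time agent that auto-classifies and summarizes each document before it is indexed. The labels, keyword rules, and truncation-based summary are invented stand-ins for illustration, not how any particular product does it; a real pipeline would use language models for both steps.

```python
# Hypothetical sketch of an ingestion agent that enriches documents
# before indexing: auto-classification plus a summary per document.

LABELS = {
    "pricing": {"price", "cost", "discount"},
    "support": {"ticket", "refund", "warranty"},
}

def classify(text):
    words = set(text.lower().split())
    # Keep every label whose keywords appear in the document.
    return [label for label, keys in LABELS.items() if words & keys] or ["general"]

def summarize(text, max_words=8):
    # Stand-in for an LLM-generated summary: just truncate the text.
    return " ".join(text.split()[:max_words]) + "..."

def enrich(doc):
    """What an agentic pipeline might attach to each document at ingest time."""
    return {"text": doc, "labels": classify(doc), "summary": summarize(doc)}

record = enrich("Customers can open a support ticket to request a refund.")
```

The point of the sketch is the shape of the output: every document enters the index carrying machine-generated metadata (labels, summary, and in a real system knowledge-graph links), which is what later makes retrieval precise.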

0:02:40.2 Sara Faatz: How do you combat the problem of hallucinations? People are afraid of that, right, with your general LLMs. Do you still have that same exposure when you're using an Agentic RAG solution?

0:02:52.8 Eudald Camprubí: When you ask a question to ChatGPT, it's using all publicly available data, and then for the model it's very easy to somehow hallucinate. The way RAG works is that there is a previous step before asking questions on top of your knowledge, which is making your data AI-ready. And this means a lot of things I'm not going to explain now. But this process of making data AI-ready also means indexing the data and generating vectors, embeddings, and a knowledge graph on top of your data. And when a user asks a question, what we are doing is going to the database where we have your information and retrieving the specific parts of your knowledge that we can use to answer the question. We are not using general knowledge, we are just using a very small context that we know can be useful to answer the user's question. And this is how we avoid hallucinations.
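The retrieval step described above can be sketched in a few lines. This is a toy illustration with invented data: real systems compare vector embeddings, while simple word overlap stands in for similarity here. The key idea survives the simplification: only the top-ranked chunks, not the whole corpus and not general knowledge, are handed to the model as context.

```python
import re

def tokens(text):
    # Normalize to lowercase words, stripping punctuation.
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def score(question, chunk):
    q = tokens(question)
    return len(q & tokens(chunk)) / len(q)  # fraction of question words in chunk

def retrieve(question, chunks, top_k=2):
    """Return only the top_k most relevant chunks: the small context
    sent to the model instead of general knowledge."""
    return sorted(chunks, key=lambda c: score(question, c), reverse=True)[:top_k]

# Invented internal "knowledge base" for illustration.
knowledge = [
    "Our refund policy allows returns within 30 days of purchase.",
    "The annual marketing conference takes place in Barcelona.",
    "Support tickets are answered within 24 hours on business days.",
]

context = retrieve("What is the refund policy?", knowledge)
prompt = "Answer ONLY from this context:\n" + "\n".join(context)
```

Constraining the prompt to the retrieved context is precisely the anti-hallucination mechanism Eudald describes: the model never needs to reach for general knowledge.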

0:03:50.3 Sara Faatz: Very cool. Now I know that there are a lot of organizations that are implementing or embedding this kind of solution right now. From a marketing perspective, what are some of the use cases that come to mind, or that you've seen, that are impactful or interesting to the marketing persona?

0:04:06.6 Eudald Camprubí: So again, because of ChatGPT, we're starting to see a big shift in how end users want to consume data, consume knowledge. We see that everything is about conversations, everything is about getting the answer in the right place, within the right context. So we're starting to see a lot of customers using Agentic RAG to deliver hyper-personalized answers to users, or even to provide a full conversational experience, even when they're trying to sell a product. So humans are really changing the way they want to access information. And we can see this more and more every day.

0:04:53.7 Sara Faatz: We've been talking about personalization for 20-plus years. And I think that this, to me, is the first time where, and correct me if I'm wrong, the agentic portion of Agentic RAG is one of the key reasons why you now have the ability to actually see that become a reality. Right? That we could actually have hyper-personalization, and we could have dynamic experiences that are contextual and conversational, which is really, really compelling.

0:05:22.9 Eudald Camprubí: It is, it is. And especially, and this is forward thinking about what is coming in the next few months, we really believe that not LLMs but small language models are going to have a crucial role in the agentic industry, because we are going to be able to fine-tune small language models based on your specific behavior. So a CMS like Sitefinity will be able to provide extremely hyper-personalized information to each user, different information based on the existing knowledge, to make sure that you can increase the conversion rate, or really deliver the outputs, the answers, that users are looking for.

0:06:03.9 Sara Faatz: That's great. So for our listeners who may not be familiar with the term small language model, what is the difference between an LLM and an SLM?

0:06:11.8 Eudald Camprubí: So an LLM, basically, is a large language model. It's a language model trained with vast amounts of data. So they are super huge models that have to run in the cloud, consuming a lot of energy, and they are meant to serve general-purpose use cases. Small language models are the same concept. They are models that are able to generate content, but they are way smaller. They are models that, instead of running in the cloud, can run on a device, or in a website, or on your phone. The thing is that when you ask a question of a large language model, you can generate an image, or you can generate a video, or you can answer any kind of question. But for the end user this is not useful. You want small language models that are able to understand what the user wants and just deliver the information, and they just need the information that the user wants, nothing else. So small language models are the reduced version of a large language model, able to generate answers for customers.

0:07:13.1 Sara Faatz: Fast-forward a year from now: what do you think we'd be talking about, from either an Agentic RAG perspective or just an AI perspective in general? Do you have any thoughts on where the technology is going, and where, as an industry, businesses may take advantage of it?

0:07:31.0 Eudald Camprubí: This is very risky, Sara, so forgive me.

0:07:33.0 Sara Faatz: I know.

0:07:34.7 Eudald Camprubí: Even in a year's time, when you listen to what I answer now, I'm sure I will be wrong. But what we anticipate is going to happen is that we are going to start hearing a lot about MCP, the Model Context Protocol. MCP is a way for the existing software that we use every day, instead of exposing the APIs that we all know, to use language models to talk to other software. So the combination of agents, MCPs, and small language models is going toward really starting to have autonomous agents. So agents that, because they will have access to data and they understand user behavior, will be able to autonomously do some pretty complex tasks. Besides that, I'm not 100% confident that AGI, artificial general intelligence, will be here. But I think that in a year we'll start realizing that AGI is not what we think it is. AGI will be a way to basically orchestrate all the artificial intelligence that we are using in our daily lives.

0:08:49.9 Eudald Camprubí: So we are using AI everywhere: when we go to Instagram, to YouTube, when we go to any webpage, or when we ask our personal questions to a language model. So, for me, AGI will shift to a concept which is an orchestrator of the different AIs that we are using in our daily lives, one that will be able to ultra-personalize the information that we need and the way we want to consume it.

0:09:19.4 Sara Faatz: That's great. Can you explain again what AGI means? Just for those listeners who might not know what that is.

0:09:27.6 Eudald Camprubí: The basic definition of AGI is the kind of intelligence that will be able to do things autonomously, without human intervention. So it's science fiction today.

0:09:39.2 Sara Faatz: Yeah, very cool. Kind of going back to the beginning: what inspired you to create Nuclia, which is now Progress Agentic RAG? What was the inspiration for that?

0:09:48.6 Eudald Camprubí: So we started Nuclia together, Ramon, my co-founder, and I. And when we started the company, we had a single mission, which is helping humans find the information they need when they need it. We ourselves were heavy users of the Internet and document management systems, and we were always suffering, always frustrated at not being able to find information. We knew it was on the Internet, but we were struggling to get it. So we decided that we wanted to change that, and we wanted to help humans find the information they need to do their job. And that was the main leitmotif in starting the company.

0:10:30.5 Sara Faatz: That's great. I love that there's a human element to something that's so very technical and automated. That's pretty amazing. Along those lines, who are you following for inspiration or information?

0:10:43.0 Eudald Camprubí: That's a great question. I'm not a fan of having certain people that I continuously follow. What I love to do to understand and to get inspired is to talk with people. And that means not only my colleagues, or colleagues in the industry, but especially customers, because most of the functionalities that we've developed, and most of the use cases that we see built on top of the platform, come from talking to customers, understanding their needs, and understanding how they think about the future of their industry. And this is my source of inspiration, way more than AI gurus or LinkedIn gurus who sometimes write about things they are not really experts on. So customers, conversations with colleagues, and, yeah, casual conversations, probably. This is what inspires me the most.

0:11:43.8 Sara Faatz: Well, thank you so much, Eudald. It's been wonderful talking to you today. I'm a big fan and really excited to see where the technology continues to grow.

0:11:52.1 Eudald Camprubí: Thank you very much, Sara.

0:11:54.1 Sara Faatz: Listeners, thanks for tuning in. Make sure you like and subscribe wherever you get your podcasts. Until next time, I'm Sara Faatz and this is 10 Minute Martech.

[music]