In this insightful talk, Jerry Liu, founder and CEO of LlamaIndex, explores the future of knowledge assistants, moving beyond simple single-LLM prompt calls. He begins by outlining the prevalent enterprise use cases for AI, including document processing, knowledge search, and question answering, then emphasizes the limitations of traditional retrieval-augmented generation (RAG) systems and advocates a more sophisticated approach to building knowledge assistants.

He introduces advanced data and retrieval modules, highlighting how high-quality data processing drives the performance of AI applications: effective parsing, chunking, and indexing are necessary for AI systems to accurately understand and respond to complex queries. Moving beyond basic RAG, he presents agentic RAG, which incorporates advanced query flows and multi-agent task solving, allowing for greater specialization and efficiency in handling diverse tasks.

Liu announces the launch of Llama Agents, a framework that treats agents as microservices communicating through a single API. He concludes by inviting developers to explore the potential of multi-agent systems in creating robust, production-grade knowledge assistants, paving the way for a more integrated and intelligent future in AI.
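
To make the parsing, chunking, and indexing point concrete, here is a minimal sketch of a basic ingest-and-query pipeline, assuming the llama_index.core API (module paths, chunk sizes, the ./docs directory, and the sample question are illustrative, and default embedding/LLM settings such as an OpenAI key in the environment are assumed):

```python
# Minimal parse -> chunk -> index -> query pipeline (illustrative paths and parameters).
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.node_parser import SentenceSplitter

# Parse: load raw documents (PDFs, text files, etc.) from a local folder.
documents = SimpleDirectoryReader("./docs").load_data()

# Chunk: split documents into overlapping nodes so retrieval returns
# focused, context-sized passages rather than whole files.
splitter = SentenceSplitter(chunk_size=512, chunk_overlap=64)
nodes = splitter.get_nodes_from_documents(documents)

# Index: embed the nodes into a vector index for semantic retrieval.
index = VectorStoreIndex(nodes)

# Query: the basic RAG flow -- retrieve top-k nodes, then synthesize an answer.
query_engine = index.as_query_engine(similarity_top_k=5)
print(query_engine.query("Summarize our onboarding policy changes."))
```

The quality of the answers here depends directly on how well the parsing and chunking preserve document structure, which is the point Liu stresses about data processing.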
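
One common reading of agentic RAG is to expose several query engines as tools and let an agent plan which to call and in what order. The sketch below assumes the ReActAgent and QueryEngineTool interfaces from llama_index.core; the document folders, index names, and question are placeholders:

```python
# Agentic RAG sketch: an agent routes and decomposes questions across
# multiple retrieval tools instead of one fixed retrieve-then-answer pass.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool
from llama_index.llms.openai import OpenAI

# Two illustrative knowledge bases, each with its own index.
hr_index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./hr_docs").load_data())
finance_index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./finance_docs").load_data())

hr_tool = QueryEngineTool.from_defaults(
    query_engine=hr_index.as_query_engine(),
    name="hr_policies",
    description="Answers questions about HR and hiring policies.",
)
finance_tool = QueryEngineTool.from_defaults(
    query_engine=finance_index.as_query_engine(),
    name="finance_reports",
    description="Answers questions about quarterly financial reports.",
)

# The agent plans multi-step query flows: it can call one tool, combine
# results from both, or break a complex question into sub-questions.
agent = ReActAgent.from_tools([hr_tool, finance_tool], llm=OpenAI(), verbose=True)
print(agent.chat("Did headcount growth last quarter stay within what the hiring policy allows?"))
```

This is the sense in which agentic RAG enables specialization: each tool stays narrow and well-described, while the agent handles routing and task decomposition.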
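
For the agents-as-microservices idea behind Llama Agents, a sketch in the spirit of the framework's early alpha interface follows; the class names and parameters here are recalled assumptions rather than a confirmed API, so check the project repository before relying on them:

```python
# Multi-agent "agents as microservices" sketch. Class names and parameters are
# assumptions based on the announced llama-agents alpha and may have changed.
from llama_agents import (
    AgentService,
    AgentOrchestrator,
    ControlPlaneServer,
    SimpleMessageQueue,
    LocalLauncher,
)
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

def lookup_policy(topic: str) -> str:
    """Toy tool standing in for a real retrieval call."""
    return f"Policy notes for {topic}..."

policy_agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=lookup_policy)], llm=OpenAI()
)
summary_agent = ReActAgent.from_tools([], llm=OpenAI())

# A shared message queue plus a control plane that routes tasks between services.
message_queue = SimpleMessageQueue()
control_plane = ControlPlaneServer(
    message_queue=message_queue,
    orchestrator=AgentOrchestrator(llm=OpenAI()),
)

# Each agent runs as its own service and communicates only through the queue/API,
# so services can be developed, deployed, and scaled independently.
policy_service = AgentService(
    agent=policy_agent,
    message_queue=message_queue,
    description="Looks up internal policy documents.",
    service_name="policy_agent",
)
summary_service = AgentService(
    agent=summary_agent,
    message_queue=message_queue,
    description="Synthesizes final answers for the user.",
    service_name="summary_agent",
)

# Run everything in-process for local testing.
launcher = LocalLauncher([policy_service, summary_service], control_plane, message_queue)
print(launcher.launch_single("What does the travel policy say about per diems?"))
```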