Sunday, May 26, 2024

Dataiku Enables Generative AI-powered Chat Across the Enterprise


Dataiku Answers brings secure, tailored, and scalable conversational AI to the enterprise.

Dataiku, the platform for Everyday AI, has announced Dataiku Answers, a new way for data teams to build Generative AI-powered chat using retrieval-augmented generation (RAG) at enterprise scale. With Dataiku Answers, teams can select their large language model (LLM) of choice, feed it their organization’s proprietary data through RAG techniques, and build tailored AI chatbots for all departments across their organization.

Many companies have blocked access to public LLMs like ChatGPT due to security and compliance risks, preventing employees from taking advantage of Generative AI for day-to-day work. Even when employees do have access, mainstream LLMs cannot query an organization’s proprietary data, making their answers unreliable and considerably limiting the enterprise value of chat applications. Dataiku Answers addresses these challenges by enabling data teams to easily build RAG-based chatbots fueled by proprietary content, delivering faster insights into enterprise data and knowledge. The result is a set of secure, governed, and scalable conversational AI chatbots that contextualize information and produce relevant, reliable answers to an organization’s unique questions.
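For readers unfamiliar with the mechanics, the RAG pattern described above has three steps: embed the organization’s documents, retrieve the passages most similar to the user’s question, and ground the LLM’s prompt in that retrieved context. The sketch below is a deliberately minimal, self-contained illustration of those steps; it uses a toy bag-of-words similarity in place of a real embedding model and does not reflect any actual Dataiku API.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; production systems use learned
    # embedding models and a vector store such as Pinecone.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank proprietary documents by similarity to the query; keep top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Ground the chatbot's answer in the retrieved company content.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Travel expenses must be filed within 30 days of the trip.",
    "The cafeteria is open from 8am to 3pm on weekdays.",
    "VPN access requires a hardware token issued by IT.",
]
print(build_prompt("How do I file travel expenses?", docs))
```

The prompt produced this way is what gets sent to the chosen LLM, which is how a generic model ends up answering questions about a specific company’s policies.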

“Every organization can and should be using Generative AI to streamline operations and work smarter, and data leaders need to be able to build these applications with the right level of transparency and reliability to mitigate risk at the right speed,” said Sophie Dionnet, Global VP, Product and Business Solutions, at Dataiku. “With Dataiku Answers, we take the conversational experience of ChatGPT and the accuracy afforded by RAG to equip data leaders with the enterprise-grade security, control, and visibility required for smart and responsible innovation.”

Because Dataiku Answers sits on top of the Dataiku LLM Mesh framework, data teams can connect to preferred LLM vendors like Anthropic, AWS Bedrock, Azure, Databricks, Google Vertex AI, OpenAI, and more, as well as vector stores like Pinecone, to build their AI chatbots. Alternatively, they can use self-hosted LLMs. From there, they easily build RAG pipelines to give the chatbot access to proprietary content so that its answers are accurate and tailored to the organization. The Dataiku LLM Mesh has dedicated components for AI service routing, personally identifiable information (PII) screening, and LLM response moderation, as well as performance and cost tracking, allowing for trusted GenAI deployment.
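The PII-screening and routing components mentioned above follow a common gateway pattern: a prompt is scrubbed of sensitive identifiers before it is forwarded to whichever provider is configured. The sketch below illustrates that pattern only; the regexes, function names, and `send` callable are illustrative assumptions, not the Dataiku LLM Mesh API, and real screens use NER models and far broader pattern sets.

```python
import re

# Toy PII screen: redact email addresses and phone-like numbers before a
# prompt leaves the enterprise boundary.
PII_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"), "[PHONE]"),
]

def screen_pii(prompt):
    # Replace each matched identifier with a neutral placeholder.
    for pattern, placeholder in PII_PATTERNS:
        prompt = pattern.sub(placeholder, prompt)
    return prompt

def route_to_llm(prompt, send):
    # Gateway step: screen first, then forward to the configured provider
    # via the injected `send` callable (here a stand-in for a vendor SDK).
    return send(screen_pii(prompt))

# Stand-in "provider" that just echoes what it receives.
print(route_to_llm("Contact jane.doe@example.com or 555-123-4567", lambda p: p))
```

Centralizing this step in one gateway, rather than in each chatbot, is what gives data teams the single point of visibility and control over usage that the article describes.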


“Data leaders have been asking how they can deploy RAG-powered chatbots more easily throughout their organizations. Dataiku Answers already addresses this need for more than 20 global retailers, manufacturers, and other organizations. In a matter of a few weeks, we have seen them meet a variety of needs from opening a company-wide LLM chat to their corporate documents down to creating domain-specific chatbots for investor relations, procurement, sales, and other teams,” said Dionnet. “Employees can ask questions just as they would of ChatGPT and feel confident they are getting reliable responses. Meanwhile, data teams get complete visibility and control over usage, costs, and quality. Everyone wins.”
