Nvidia presents AI chatbot for PCs

Imagine a more private and personalized ChatGPT that can answer questions about the data you have stored locally on your PC. Try Nvidia's new 'Chat with RTX' demo now.

On Tuesday, Nvidia released a technology demo called "Chat with RTX." The software is free to download and lets users run open-source large language models - including Mistral and Meta Platforms' Llama 2 - to ask questions about their own files and documents.

After a user points "Chat with RTX" to a folder on their computer containing .txt, .pdf and Microsoft Word documents, they can ask the chatbot questions about information contained in the files.
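Under the hood, this follows the general retrieval-augmented generation pattern: index the documents in the chosen folder, pull out the passages most relevant to a question, and pass them as context to a locally running language model. The sketch below illustrates that general pattern only; it is not Nvidia's code, and the local_llm() stand-in, the folder name and the sample question are hypothetical placeholders.

    from pathlib import Path

    def load_documents(folder: str) -> dict[str, str]:
        """Read all .txt files in the folder into memory (keyed by file name)."""
        docs = {}
        for path in Path(folder).glob("*.txt"):
            docs[path.name] = path.read_text(encoding="utf-8", errors="ignore")
        return docs

    def retrieve(docs: dict[str, str], question: str, top_k: int = 3) -> list[str]:
        """Naive keyword retrieval: rank documents by word overlap with the question."""
        terms = set(question.lower().split())
        ranked = sorted(
            docs.items(),
            key=lambda item: len(terms & set(item[1].lower().split())),
            reverse=True,
        )
        return [text for _, text in ranked[:top_k]]

    def local_llm(prompt: str) -> str:
        """Hypothetical placeholder for a locally hosted model (e.g. Mistral or Llama 2)."""
        return f"[model answer based on a prompt of {len(prompt)} characters]"

    def answer(folder: str, question: str) -> str:
        """Build a prompt from the most relevant local files and query the local model."""
        docs = load_documents(folder)
        context = "\n---\n".join(retrieve(docs, question))
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
        return local_llm(prompt)

    if __name__ == "__main__":
        print(answer("./my_documents", "What does the report say about revenue?"))

Because retrieval and generation both happen on the local machine, no document content leaves the PC at any point, which is the privacy argument Nvidia makes below.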


The Nvidia team explained: "Since Chat with RTX runs locally on Windows RTX PCs and workstations, results are delivered quickly - and the user's data stays on the device. Rather than relying on cloud-based LLM services, Chat with RTX lets users process sensitive data on a local PC without the need to share it with a third party or have an Internet connection."

Nvidia says the current version of the software handles direct informational queries well, but is less suited to questions that require reasoning across the entire set of files. The chatbot's answers on a given topic also improve as it is given more file content covering that topic.

Chat with RTX" requires Windows 10 or Windows 11, along with an Nvidia GeForce RTX 30 Series GPU or 40 Series GPU with at least 8 GB of RAM.

Read more at Nvidia here.
