You can run Nvidia Chat with RTX AI LLM on your Windows PC

You’ve probably noticed that generative AI tools like Google Gemini and ChatGPT are making their way into most of the technology we use every day. These tools rely on massive Large Language Models, or LLMs: neural networks trained on huge amounts of human-created data so that they can generate realistic text, images, or video.

However, you don’t need a cloud app to access these LLMs: you can run them on your own computer. You can take advantage of everything these models offer while you’re offline, and you don’t have to hand over your prompts and conversations to Google or OpenAI either.
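
If you just want a feel for what running an LLM locally involves, here is a minimal sketch using the open-source Hugging Face transformers library with a small example model; this is a generic illustration, not the stack Chat with RTX itself uses.

```python
# Minimal local text generation with Hugging Face transformers.
# Assumes `pip install transformers torch`; after the first model download,
# everything runs on your own machine and nothing is sent to a cloud service.
from transformers import pipeline

# Load a small open model locally (downloaded once, then cached on disk).
generator = pipeline("text-generation", model="distilgpt2")

# Generate a short completion.
result = generator("Running a language model locally means", max_new_tokens=40)
print(result[0]["generated_text"])
```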

Now, Nvidia has launched its own local LLM application, called Chat with RTX, which harnesses the power of its RTX 30- and 40-series graphics cards. If you have one of these GPUs, you can install a generative AI chatbot right on your computer and customize it to your own needs.

Before you get started, make sure you have the latest drivers for your Nvidia GPU. The GeForce Experience app on your PC will help you with this. Next, head over to the Chat with RTX download page. To run the tool, you'll need Windows 11, at least 16GB of RAM, and a GeForce RTX 30- or 40-series GPU (or an RTX Ampere or Ada Generation GPU) with at least 8GB of VRAM.
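
If you're not sure whether your PC meets these requirements, a quick sketch like the one below (assuming Python is installed and the nvidia-smi tool that ships with Nvidia's drivers is on your PATH) will report your GPU model, VRAM, and driver version; your installed RAM is visible in Task Manager under the Performance tab.

```python
# Report GPU name, total VRAM, and driver version via nvidia-smi.
# This is an optional convenience check, not part of the Chat with RTX installer.
import subprocess

output = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total,driver_version",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# Only look at the first GPU if the system has more than one.
name, vram, driver = [field.strip() for field in output.splitlines()[0].split(",")]
print(f"GPU: {name}")
print(f"VRAM: {vram}")            # Chat with RTX needs at least 8GB (8192 MiB)
print(f"Driver version: {driver}")
```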