Want to run a local LLM? How much memory does your NVIDIA RTX have?

Want to run LLMs locally? The new NVIDIA Chat with RTX requires 8GB of VRAM. Other language models require even more.

You can use the Windows utility 'dxdiag' to see how much memory your NVIDIA RTX card has.

  1. Press Windows+R, or click the Windows key and type in the search bar.
  2. Enter dxdiag
  3. Click the Display tab, or in my case Display 1
Look for the Display Memory (VRAM) value. I have 12,086MB, roughly 12GB, so I'm good to go.
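If you have the NVIDIA driver installed, you can get the same number from the command line with the bundled `nvidia-smi` tool instead of clicking through dxdiag. A minimal Python sketch, assuming `nvidia-smi` is on your PATH (the parsing helper is separated out so it can be checked against a sample line):

```python
import subprocess

def total_vram_mb(smi_output: str) -> int:
    # Parse the output of:
    #   nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits
    # which is one number in MiB per GPU, e.g. "12288" for a 12GB card.
    return int(smi_output.strip().splitlines()[0])

def query_vram_mb() -> int:
    # nvidia-smi ships with the NVIDIA driver on both Windows and Linux.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return total_vram_mb(out)

if __name__ == "__main__":
    mb = query_vram_mb()
    print(f"Total VRAM: {mb} MiB ({mb / 1024:.0f} GiB)")
```

Note that nvidia-smi reports MiB, so the number won't exactly match the marketing "12GB" figure, just like dxdiag's 12,086MB doesn't.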


Machine learning models suddenly made me care how much VRAM my NVIDIA graphics card has. I can never remember the exact model, let alone how much VRAM it has. Now I have this handy article to remind me how to find the numbers I need.


Revision History

2024 02 created

