Want to run a local LLM? How much memory does your NVIDIA RTX have?
Want to run LLMs locally? The new NVIDIA Chat with RTX requires 8 GB of VRAM, and other language models require even more.
You can use the Windows utility 'dxdiag' to see how much memory your NVIDIA RTX card has:
- Press Windows-R, or click the Windows key and type in the search bar.
- Enter dxdiag.
- Click on the Display tab (in my case, Display 1).
We're looking for the amount of Display Memory (VRAM). I have 12,086 MB, or roughly 12 GB, so I'm good to go.
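If you'd rather skip the GUI, NVIDIA's `nvidia-smi` command-line tool can report the same number. As a sketch, the snippet below parses one line of its CSV output into a name and a VRAM total; the sample line is an assumption standing in for whatever your own card reports (run `nvidia-smi --query-gpu=name,memory.total --format=csv,noheader` to get the real one).

```python
def parse_vram(line):
    """Split one CSV line from nvidia-smi into (card name, VRAM in MiB)."""
    name, mem = [field.strip() for field in line.split(",")]
    return name, int(mem.split()[0])

# Sample output line -- an assumption, replace with your card's actual output:
sample = "NVIDIA GeForce RTX 3060, 12288 MiB"

name, mib = parse_vram(sample)
print(f"{name}: {mib} MiB ({mib / 1024:.1f} GiB)")
```

Note that `nvidia-smi` reports in MiB (powers of 1024), which is why a "12 GB" card shows up as 12288 MiB there but a slightly different number in dxdiag.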
Machine learning models suddenly made me care how much VRAM my NVIDIA graphics card has. I can never remember the exact model, let alone how much VRAM it has. Now I have this handy article to remind me how to find the numbers I need.
Revision History
2024 02 created