getting hands on ai
It may sound strange to someone over 50, but I bought my first good graphics card for evaluating LLMs. The entry level is an RTX 3090 with 24 GB of VRAM. That is still far too little, but at least models in the 30B to 70B range are now within reach, and some nice things can be done with them, as long as you take your time.
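Why 24 GB is "too little" yet the 30B to 70B range is still "within reach" comes down to quantization arithmetic. A rough back-of-the-envelope sketch (the 20% overhead factor for activations and KV cache is an assumption, not a measurement):

```python
def vram_gib(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate for loading model weights.

    overhead=1.2 is an assumed ~20% margin for activations and KV cache.
    """
    bytes_for_weights = params_billion * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 2**30

# A 30B model at 4-bit quantization fits in 24 GiB ...
print(f"30B @ 4-bit: ~{vram_gib(30, 4):.1f} GiB")
# ... while a 70B model at 4-bit does not, so layers spill to
# system RAM and inference gets slow ("take your time").
print(f"70B @ 4-bit: ~{vram_gib(70, 4):.1f} GiB")
```

The numbers are estimates only; real memory use depends on the quantization format, context length, and runtime.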
I am simultaneously evaluating the commercial offerings of Claude (the app), Claude Code, and the Anthropic API, plus Mistral and OpenAI, and locally mainly Ollama and vLLM. Everything is bundled behind LiteLLM as a proxy, with Open WebUI as the frontend. Additionally, Claude serves as my daily driver, and Claude Code as a comparison against tools like Aider.
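The setup above, local and hosted backends behind one proxy, can be sketched as a LiteLLM `config.yaml`; the specific model names and aliases here are assumptions for illustration, not my actual configuration:

```yaml
model_list:
  # Local model served by Ollama on its default port
  - model_name: local-llama        # alias exposed by the proxy (assumption)
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434
  # Hosted model via the Anthropic API (reads ANTHROPIC_API_KEY from the env)
  - model_name: claude             # alias exposed by the proxy (assumption)
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
```

Open WebUI can then be pointed at the proxy's OpenAI-compatible endpoint, so every backend appears as just another model in one dropdown.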
LastMod 2025-04-25