Widely known topics: I use Lumo by Proton for these because of its privacy guarantees.
Niche topics (e.g. Nix or bike-related stuff) and math: I use LMArena in side-by-side mode with powerful LLMs like gemini-2.5 or claude-opus and compare their results.
From time to time I use LMArena with smaller models like qwen-30b or claude-haiku, just to stay up to date on how they perform. In the long run I want to run such models locally for privacy reasons, but for now they are either too weak or too slow on my devices.