Now with local, privacy-respecting AI available, I started playing around with it.
What I did:
- Installed Ollama with Docker (probably the easiest way, and it should work on all platforms)
- Searched for and installed some basic models (the "AI").
- Customized a prompt to make the AI take on a specific role ("as a grandma", "as an engineer", "as a copywriter"); see the sketch right after this list.
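
One way to set such a role is a system message sent along with the question. Here's a minimal sketch of what that looks like against Ollama's local HTTP API in Python (this assumes the default port 11434, the `requests` package, and a placeholder model name and persona, not necessarily my exact setup):

```python
import requests

# Ollama's local chat endpoint (default port); nothing leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/chat"

def ask(persona: str, question: str, model: str = "openchat") -> str:
    """Send a question to a local model, with a system prompt setting the persona."""
    response = requests.post(
        OLLAMA_URL,
        json={
            "model": model,
            "stream": False,  # return one complete answer instead of a token stream
            "messages": [
                {"role": "system", "content": persona},
                {"role": "user", "content": question},
            ],
        },
        timeout=300,  # local inference can take a while
    )
    response.raise_for_status()
    return response.json()["message"]["content"]

print(ask("You are a patient engineer explaining things to a beginner.",
          "Why does my Docker container lose its data on restart?"))
```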
I'm currently running OpenChat 3.5 7B. It needs 5 GB of disk storage and usually around 2 GB of RAM on my device.
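
If you want to check how much disk space the installed models actually take, the local API also has a listing endpoint. A minimal sketch, again assuming the default port and that the reported size is in bytes:

```python
import requests

# List locally installed models and their on-disk size.
resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()

for model in resp.json().get("models", []):
    size_gb = model["size"] / (1024 ** 3)  # size is reported in bytes
    print(f"{model['name']}: {size_gb:.1f} GB on disk")
```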
The pros so far:
- no connection to the outside world (= privacy-respecting)
- no SEO bullshit, in contrast to using a search engine
- I can use natural language instead of optimized search terms
The cons so far:
- slower than using a search engine
- older training data (AFAIR March 2023)