Using an LLM to query my notes #

[Image: robot asks a question holding a book, image from Stable Diffusion]

I occasionally play with LLMs (large language models). Last time I tried Simon Willison's llm software to search over my notes. Today I tried amaiya/onprem to ask questions about my notes. The documentation is pretty good, but I made a few changes for my configuration, so I decided to write them down here. I'm running this on my Mac M1, not on my Linux or Windows machines.
First step, get the thing installed:
cd ~/Projects/machine-learning/
mkdir onprem
cd onprem
# need to use python 3.9 through 3.11 for torch support; 3.12 doesn't support it yet
# check the version I have installed
python3 --version
# if python3 isn't a reasonable version, then install or choose a different python
# before doing the next step
python3.11 -m venv venv
source venv/bin/activate
# install onprem itself, which also installs torch
pip install onprem
mkdir data
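With the environment set up, the basic flow is to create an LLM, ingest the notes from the data directory created above, and then ask a question. This is a minimal sketch following the onprem README; the exact API and default model can vary between versions, and the question is just a placeholder for whatever I'd ask about my own notes:

# minimal sketch based on the onprem README; defaults may differ by version
from onprem import LLM

llm = LLM()                 # downloads a default local model on first use
llm.ingest("./data")        # index the notes I copied into the data folder
# placeholder question; result is a dict with the answer and source documents
result = llm.ask("What did I write about map generation?")
print(result["answer"])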
Labels: howto