Using an LLM to query my notes
![robot asks a question holding a book](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhh2X78XIFWnmvwx-Y-9Zj4wn6K7tFbTdrcDkT5fp6hECaQp-T06s04hg-LfMC13V3FnlZ-30IZC0xXmIH0lKBM26AQeFeeLOI01O0iirCgQxPPKBkInV8Kfylxrl2h9_LvXb9FGW4GElHZZPoCsxmJ4fpXU38vzFolKm63XK6TSfM_wWeYPTgu/s1600/robot%20asks%20a%20question%20holding%20a%20book_631681.png)
*robot asks a question holding a book* — image from Stable Diffusion

I occasionally play with LLMs (large language models). Last time I tried Simon Willison's llm tool to search over my notes. Today I tried amaiya/onprem to ask questions about my notes. The documentation is pretty good, but I made a few changes for my configuration, so I decided to write them down here. I'm running this on my Mac M1, not on my Linux or Windows machines.
First step, get the thing installed:
```shell
cd ~/Projects/machine-learning/
mkdir onprem
cd onprem
# need to use python 3.9 through 3.11 for torch support; 3.12 doesn't support it yet
# check the version I have installed
python3 --version
# if python3 isn't a reasonable version, then install or choose a different python
# before doing the next step
python3.11 -m venv venv
source venv/bin/activate
# install onprem itself, which also installs torch
pip install onprem
mkdir data
```
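With onprem installed and a data folder created, the next step in the project's documentation is to ingest documents and then ask questions about them. A minimal sketch, assuming onprem's documented `LLM` API and that the notes live in the `data` folder created above (the first call downloads a default model, so this is slow the first time):

```python
from onprem import LLM

# Assumption: LLM() with no arguments uses onprem's default model,
# downloading it on first use.
llm = LLM()

# Index the notes placed in ./data (created during installation above)
llm.ingest("./data")

# Ask a question; per the onprem docs, the result is a dict that
# includes the generated answer and the source documents it drew from
result = llm.ask("What did I write about backups?")
print(result["answer"])
```

The question string here is just a placeholder; substitute anything your notes actually cover.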
Labels: howto