I’ve been experimenting with using LLMs locally for generating datasets to test Harper against.
I might write a blog post about the technique (which I am grandiosely calling “LLM-assisted fuzzing”), but I’m going to make you wait.
I’ve written a little tool called ofc that lets you insert Ollama into your bash scripts.
I think it’s pretty neat, since it makes some genuinely cool things very easy to do.
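The general pattern here — treating an LLM as just another text filter in a shell pipeline — can be sketched with a stub. Note this is not ofc’s actual interface: `mangle` below is a deterministic stand-in for a model call, so the sketch runs offline (with a local model you might swap its body for something like `ollama run llama3.2 "Add a typo: $1"`).

```shell
# Sketch of the "LLM as a shell filter" pattern, NOT ofc's real interface.
# `mangle` is a deterministic stub standing in for a model call; it
# introduces a predictable "typo" so the example works without a model.
mangle() {
  echo "$1" | sed 's/their/thier/g'
}

# e.g. generate a flawed sentence to test a grammar checker against:
mangle "They took their time."
# -> They took thier time.
```

In the real workflow you would loop this over many prompts to build a corpus of intentionally broken sentences for Harper to catch.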
3 Comments
wunderwuzzi23
Beware of ANSI escape codes, through which the LLM might hijack your terminal — aka Terminal DiLLMa.
https://embracethered.com/blog/posts/2024/terminal-dillmas-p…
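One mitigation is to strip escape sequences from model output before it reaches the terminal. A minimal sketch — not a complete sanitizer (it drops CSI sequences and stray ESC bytes, but e.g. OSC payload text would survive as plain text):

```shell
# Minimal sketch: strip ANSI CSI sequences (ESC [ ... letter) and any
# stray ESC bytes from untrusted LLM output before printing it.
# NOT a complete sanitizer; it just illustrates the idea.
sanitize() {
  esc=$(printf '\033')
  sed -e "s/${esc}\[[0-9;?]*[a-zA-Z]//g" -e "s/${esc}//g"
}

printf 'hi \033[31mred\033[0m world\n' | sanitize
# -> hi red world
```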
TheDong
I feel like the incumbent for running llm prompts, including locally, on the cli is llm: https://github.com/simonw/llm?tab=readme-ov-file#installing-…
How does this compare?
zoobab
I did a similar curl script to ask questions to Llama3 hosted at DuckDuckGo:
https://github.com/zoobab/curlduck