Show HN: Clippy – 90s UI for local LLMs by felixrieseberg
Clippy lets you run a variety of large language models (LLMs)
locally on your computer while sticking with a user interface of the
1990s. It’s a love letter and homage to the late, great Clippy – and
the visual design created by Microsoft in that era.
- Simple, familiar, and classic chat interface.
22 Comments
Jagerbizzle
Man do I ever miss this UI design. Nice work!
kuberwastaken
Is it insane that I tried to make a version of this exactly a week ago!? This is freakin awesome, congratulations!
dehrmann
Can you add narration in Gilbert Gottfried's voice?
https://www.youtube.com/watch?v=tu_Pzuwy-JY
alkh
Great job! Having Ollama support would be useful as well [1]!
[1] https://github.com/ollama/ollama
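For anyone curious what that would take, here is a minimal sketch of talking to a locally running Ollama server from Node/TypeScript, assuming the default port 11434; the model name is just a placeholder for whatever you have pulled:

    // Minimal sketch: query a locally running Ollama server (default http://localhost:11434).
    // "llama3.2" is a placeholder; substitute any model you have pulled with `ollama pull`.
    async function askOllama(prompt: string): Promise<string> {
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model: "llama3.2", prompt, stream: false }),
      });
      if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
      const data = await res.json();
      return data.response; // full completion text when stream is false
    }

    askOllama("Hi Clippy, how do I write a cover letter?").then(console.log);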
rvz
[flagged]
ale42
Great idea and design, thanks for this! I've been hoping to see something like this for a while :-D
I hope that one day a non-Electron app (to minimize resource usage when idle) will also appear!
talkinghead
yes yes yes!!!
rangerelf
This is a thing of beauty, thank you!! :-D
tasn
I love the terrible font rendering! Is it a special font, or some CSS?
nullchan
Pretty sure Clippy is trademarked. Had the same idea but did not go through with it because of the TM.
basketbla
Pretty fantastic follow-up to https://www.latent.space/p/clippy-v-anton
_pdp_
Super cool. Serious 90s vibes. I also tried to make a super Clippy here: https://chatbotkit.com/examples/super-clippy I think I matched the color scheme perfectly, but it doesn't have the same feeling as the original.
rafram
This is cool, but does no one even look at what libraries they're shipping anymore? I mean, why does this Clippy-style LLM interface bundle:
– A JavaScript implementation of the Jinja templating language
– A full GitHub API client
– A library that takes a string and tells you if it's a valid npm package name
– A useless shim for the JavaScript Math module
And 119 other libraries? This thing would have taken up 10% of the maximum disk space available on a Windows 95 FAT16 volume.
Aardwolf
It's weird that when Clippy was new I found it to be everything that's wrong with UI design, and today I'm nostalgic for it.
concerndc1tizen
I hope people realize that this is an easy way to get a virus.
Don't install third party software except from highly trusted sources.
animanoir
[dead]
mkgeorge7
Question for the devs in here… something I've been thinking about a lot recently. So I see that OP linked out to a public GitHub repo… but when downloading the actual bundle, what's a quick way for me to determine that what I'm installing on my Mac is actually the same as what's in the public repo? It's always seemed like a loophole to me, ready for (potential) exploitation.
>> Ship project.
>> Link out Github repo on the static site somewhere
>> Gain trust instantly as users presume the public repo is what's used behind the scenes
Disclaimer: I'm a web dev and don't know a single thing about native macOS software.
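One partial answer, as a hedged sketch: if the project publishes a checksum with each release (or you build the app from the public repo yourself and hash your own artifact), you can at least confirm that the downloaded bytes match a known build. The filename and digest below are hypothetical placeholders; fully closing the loophole needs reproducible builds, which Electron apps don't always have.

    // Sketch: compare a downloaded artifact's SHA-256 against a known digest.
    // "Clippy.dmg" and EXPECTED are hypothetical placeholders, not real release values.
    import { createHash } from "node:crypto";
    import { readFileSync } from "node:fs";

    const EXPECTED = "<digest published with the release, or from your own local build>";
    const actual = createHash("sha256").update(readFileSync("Clippy.dmg")).digest("hex");

    console.log(actual === EXPECTED ? "hashes match" : `mismatch: got ${actual}`);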
GuinansEyebrows
BonziBuddy next?
ayaros
I love your website design.
aligundogdu
This is such an amazing piece of work, truly impressive! Hats off to you. If it supports Ollama and local LLMs too, it'll be absolutely unbeatable!
givemeethekeys
Like a phoenix, it rises from the ashes...
jl6
IIRC, Clippy's most famous feature was interrupting you to offer advice. The advice was usually basic/useless/annoying, hence Clippy's reputation, but a powerful LLM could actually make the original concept work. It would not be simply a chatbot that responds to text, but rather would observe your screen, understand it through a vision model, and give appropriate advice. Things like "did you know there's an easier way to do what you're doing". I don't think the necessary trust exists yet to do this using public LLM APIs, nor does the hardware to do it locally, but crack either of those and I could see ClipGPT being genuinely useful.
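A rough sketch of that loop, assuming a local multimodal model served by something like Ollama (llava is just an example) and a screenshot already saved to disk by some other component; deciding when to interrupt, and earning the user's trust, are the genuinely hard parts:

    // Sketch of the proactive-Clippy loop: screenshot -> local vision model -> one-line tip.
    // Assumes Ollama is serving a multimodal model (e.g. llava); the screenshot path is a placeholder.
    import { readFileSync } from "node:fs";

    async function adviseOnScreen(screenshotPath: string): Promise<string> {
      const image = readFileSync(screenshotPath).toString("base64");
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "llava",
          prompt:
            "Look at this screenshot. If there is a clearly easier way to do what the user is doing, explain it in one sentence. Otherwise reply with 'nothing to add'.",
          images: [image],
          stream: false,
        }),
      });
      const data = await res.json();
      return data.response;
    }

    // Run occasionally (e.g. on an idle timer), not on every keystroke.
    adviseOnScreen("screen.png").then(console.log);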