Home | Tabby — no wait, no em-dash allowed; keep original separator. Home | Tabby: This is actually pretty cool and useful. Just tried this on my Mac locally, of course, and it seems to have quite good utility. What would be interesting for me would be to train it on my own code and many projects 🙂
Most of the models you can run locally have such a small training set that they aren't worth it. They're more like the Markov chains from the subreddit simulator days.
There is one called Orca that seems promising and will be released as OSS soon. It's benchmarking at numbers comparable to OpenAI's GPT-3.5.
But remember, an LLM is only a very good autocomplete.
@prologic@twtxt.net The hackathon project I did recently used OpenAI and embedded the retrieved info into the prompt. Basically, I would search for the top 3 most relevant results, feed them into the prompt, and the AI would summarize them to answer the user's question.
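Roughly, the flow looked like the sketch below. This is a minimal illustration, not the actual hackathon code: `search_index` and its `search()` method are hypothetical stand-ins for whatever search backend was used, and the model name and prompt wording are assumptions.

```python
import openai

def answer_question(question, search_index):
    # Hypothetical search call: fetch the top 3 most relevant documents.
    top_docs = search_index.search(question, limit=3)

    # Embed the retrieved text into the prompt as context.
    context = "\n\n".join(doc.text for doc in top_docs)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

    # Ask the model to summarize the context into an answer.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response["choices"][0]["message"]["content"]
```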