In this video I show how I was able to install an open source Large Language Model (LLM) called h2oGPT on my local computer for …
Would you say this is still the best chatbot to run locally?!
I agree with you on the LLMs… eventually you will probably even have certifications, like AI solutions engineer… different flavors of LLMs, just like the different flavors of Linux… Every small to medium company will want their own private AI setup when they see the benefits.
Is it uncensored?
The latest version doesn't seem to have the data sources option. Was it taken out?
Thanks
Can it review code? I mean, checking for vulnerabilities in files and folders of code.
What hardware are you running? Your output is crazy fast for being offline.
Can the process be altered to use the CPU, and would there be an advantage to this?
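CPU-only inference is possible in principle; the advantage is simply that no GPU is required, at the cost of much slower generation. Below is a minimal sketch using the Hugging Face transformers library directly rather than h2oGPT's own launcher, so the exact flags h2oGPT exposes for CPU mode may differ; the checkpoint name is only an example.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "h2oai/h2ogpt-4096-llama2-7b-chat"  # example model id; substitute the checkpoint you actually downloaded
tok = AutoTokenizer.from_pretrained(name)
# Without a device_map or .to("cuda"), the weights stay on the CPU.
model = AutoModelForCausalLM.from_pretrained(name, torch_dtype=torch.float32)

inputs = tok("Hello, can you summarise this video?", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=40)
print(tok.decode(out[0], skip_special_tokens=True))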
As of a while ago, vectorizing a document required it to be sent to OpenAI. Has this changed?
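Document vectorization can also be done entirely on the local machine. The sketch below uses the sentence-transformers library purely as an illustration of local embedding; the model name is only an example and not necessarily the embedding model h2oGPT ships with.

from sentence_transformers import SentenceTransformer

# Downloads the embedding model once, then runs fully offline on your machine.
embedder = SentenceTransformer("all-MiniLM-L6-v2")  # example model, roughly 80 MB

chunks = [
    "First paragraph of the document.",
    "Second paragraph of the document.",
]
vectors = embedder.encode(chunks)  # NumPy array, shape (2, 384) for this model
print(vectors.shape)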
Sounds interesting. Installation for Linux assumes a .deb apt package manager. It also seems to have a dependency on X11. If it ever comes out as an AppImage self-executable Linux package, I may come back to explore it further. I also could not install the older torch 2.1.2. The current version is 2.2.0, and I could not find an archive with the older version.
I'm running Mixtral (the latest model from Mistral AI) offline on my own computer right now, using LM Studio.
Has anyone got the UI to work without the one-click installer?
I get the model to run, but if I enter the commands that you show to run it with the UI, I get a missing library error.
Is there a step missing, or is the repo missing something?
Excuse me, I am new to this field and may have limited knowledge. Could you kindly inform me which software or program the terminal was opened in? Are there any that are commonly used to operate a terminal? Your expertise would be greatly appreciated.
Enter an instruction: hello
ERICERICERICERICERICERICERICERICERICERICERICERICERICERICERICERICERICERICERICERICERICERIC
I am getting this type of output. Any idea?
Keep up the good work. I'm very interested.
Hi. Nice job, and thanks for sharing it! I'm having trouble while working in local mode. The chatbot works perfectly, even when I'm disconnected from the internet. The problem is that I cannot drag any documents in to make my model work on this data. Any help, please? I'm currently testing it with a CPU model on Windows (I know it'll be slow, but it is initially for test purposes).
Very clear explanation of the program. Great video. I wonder how people create these open source programs and still can put food on the table. They must have day jobs.
Not uncensored
And if you don't have a GPU, DON'T WASTE YOUR TIME WITH THIS.
How do I change the theme from dark to light?
Mac Intel install tutorial, please.
Brilliant content! How would my humble RTX 3070 fare with a local LLM?
Hi, have you tried finetuning an LLM on your laptop (100% offline)?
We have the problem that we need results in foreign languages, but when we do this, the results of the AI models get worse. Is there a practical solution to this problem?
Interesting, thanks for the video. Please, how do we customise the UI?
Let me guess, doesn’t work with superior AMD cards.