Local LLMs integrated into the OS could be such a blessing for people like me. I've already noticed I use ChatGPT far more than Google search when I need a concept explained quickly and succinctly. In fact, "define ______" and "what is ______?" were two of my most frequent search queries up until last year.
Now I just ask ChatGPT. When things don't make sense, I ask it to explain like I'm five or ten, and it works wonderfully 80% of the time. Having a local LLM capable of doing this at your fingertips will make this even smoother!
Exactly. Recently, the Arc Search app (actually a browser) released a feature that lets you pinch a page, and it will summarize it for you. Although these are simple use cases of generative AI, they give you a glimpse of where we are heading.