I have some genuine hope for this plus the semantic web. Have quick general questions answered by the LLM, and also use it to generate vector search results (or a knowledge graph built from Wikidata) over the rest of the content on the internet, so that if you want to dig deeper it can ingest a specific source's data (or route to a model with that info already embedded), or just return it to you for your personal reading.
Pretty exciting tbh, and hopefully all open source, open data, on local or distributed systems!
At least all the pieces are moving to make that possible.
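Roughly the kind of routing I mean, as a sketch: quick questions go straight to the LLM, and a "dig deeper" flag pulls in the underlying sources from a vector index or Wikidata's knowledge graph instead. `llm_answer()` and `vector_search()` here are hypothetical stubs standing in for a local/open model and a local document index; only the Wikidata SPARQL endpoint is a real service.

```python
# Sketch of the routing idea: answer quick questions with an LLM, fall back to
# retrieval (vector index or Wikidata's knowledge graph) when digging deeper.
import requests

WIKIDATA_SPARQL = "https://query.wikidata.org/sparql"


def llm_answer(question: str) -> str:
    # Hypothetical stub for a call to a local/open model.
    return f"(LLM draft answer for: {question})"


def vector_search(question: str, k: int = 5) -> list[str]:
    # Hypothetical stub for similarity search over an index of source documents.
    return [f"(document {i} relevant to: {question})" for i in range(k)]


def wikidata_lookup(sparql: str) -> list[dict]:
    # Wikidata exposes a public SPARQL endpoint that returns JSON results.
    resp = requests.get(
        WIKIDATA_SPARQL,
        params={"query": sparql, "format": "json"},
        headers={"User-Agent": "semantic-search-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]["bindings"]


def answer(question: str, dig_deeper: bool = False) -> dict:
    result = {"quick_answer": llm_answer(question)}
    if dig_deeper:
        # Hand back the underlying sources for personal reading instead of
        # stopping at the summarised answer.
        result["sources"] = vector_search(question)
    return result


if __name__ == "__main__":
    print(answer("What is the semantic web?"))
    print(answer("What is the semantic web?", dig_deeper=True))
    # Example of pulling structured facts from the knowledge graph side:
    print(wikidata_lookup(
        "SELECT ?itemLabel WHERE { ?item wdt:P31 wd:Q5 . "
        "SERVICE wikibase:label { bd:serviceParam wikibase:language 'en'. } } LIMIT 3"
    ))
```

None of this depends on anything proprietary, so the open source / open data / local-or-distributed version of it seems genuinely within reach.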