Category: AI

  • RAG models

    This topic can quickly grow extensive because of its complexity. My aim here is to document my experiment using llamaindex, ollama, and chroma_db for RAG. In essence, RAG means conversing with an AI about documents that have been processed for it. My focus is not on the intricacies of processing documents into vector stores, but on engaging with the… (a rough sketch of such a pipeline follows this list)

  • Running LLM locally

    For those already in the loop, this might be old news, but it remains a crucial question for anyone still climbing the learning curve. Q: How can I run a large language model (LLM) on my local machine? A fair follow-up is why one would even consider this. Sure, there are readily available sandbox environments online… (a minimal local-setup sketch also follows this list)
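
For the RAG post above, here is a rough sketch of the kind of pipeline it refers to: llamaindex wired to a local ollama model, with chroma_db as the persistent vector store. This is not the post's actual code; the llama-index 0.10+ module layout, the model names (llama3, nomic-embed-text), and the ./docs and ./chroma_db paths are illustrative assumptions.

```python
import chromadb
from llama_index.core import (
    Settings,
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
)
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.chroma import ChromaVectorStore

# Point llamaindex at a locally running Ollama server for both the chat
# model and the embedding model (model names are illustrative).
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Persist the vectors in a local chroma_db collection.
chroma_client = chromadb.PersistentClient(path="./chroma_db")
collection = chroma_client.get_or_create_collection("my_docs")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Ingest the documents once, then converse with them via a query engine.
documents = SimpleDirectoryReader("./docs").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

query_engine = index.as_query_engine()
print(query_engine.query("Summarize the processed documents."))
```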
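
Likewise, for the local-LLM post, a minimal sketch of the simplest answer to the question it poses: a model served by a local Ollama instance, queried from Python. The model name and prompt are assumed for illustration, not taken from the post.

```python
import ollama

# Chat with a model served by a local Ollama instance.
# Assumes the server is running and the model has already been pulled,
# e.g. with `ollama pull llama3` on the command line.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why run an LLM locally?"}],
)
print(response["message"]["content"])
```

The same model can also be used interactively from the terminal with `ollama run llama3`.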