Category: AI

  • DSL and Your Application

    Introduction Let us start by defining “Domain-Specific Language (DSL)”: a specialized programming language tailored to a specific problem domain, designed to express solutions in a more intuitive and concise way than general-purpose languages. There are many common examples of Domain-Specific Languages (DSLs). A DSL is fundamental to intent-driven development; it…
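
    To make the definition concrete, here is a small illustrative sketch (not taken from the post) of an embedded DSL in TypeScript: a tiny fluent query builder whose expressions read close to the domain intent they describe. All names here (`Query`, `where`, `orderBy`) are invented for the example.

    ```typescript
    // A tiny embedded DSL: a fluent query builder.
    // All names are illustrative, not part of any real library.

    type Comparison = { field: string; op: "=" | ">" | "<"; value: string | number };

    class Query {
      private conditions: Comparison[] = [];
      private ordering?: string;

      constructor(private table: string) {}

      where(field: string, op: "=" | ">" | "<", value: string | number): this {
        this.conditions.push({ field, op, value });
        return this;
      }

      orderBy(field: string): this {
        this.ordering = field;
        return this;
      }

      // Render the intent as SQL-like text; a real implementation would execute it.
      toString(): string {
        const where = this.conditions
          .map((c) => `${c.field} ${c.op} ${JSON.stringify(c.value)}`)
          .join(" AND ");
        const order = this.ordering ? ` ORDER BY ${this.ordering}` : "";
        return `SELECT * FROM ${this.table}${where ? ` WHERE ${where}` : ""}${order}`;
      }
    }

    // The "domain language" reads close to the intent it expresses:
    const overdue = new Query("invoices").where("status", "=", "overdue").orderBy("dueDate");
    console.log(overdue.toString());
    // SELECT * FROM invoices WHERE status = "overdue" ORDER BY dueDate
    ```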

  • UI needs to change

    Context This post focuses on enterprise-scale applications, though the principles can apply more broadly. For now, however, let’s assume the scope is large-scale enterprise applications. Traditional UI design patterns for such applications revolve around a few core principles and areas of functionality: 1. Navigation Enterprise applications often feature complex navigation structures…

  • Ollama, the gift that keeps on giving.

    For the past few weeks, I’ve been developing a web component that wraps Ollama’s features, making it easy to integrate into web applications. Although the feature set isn’t complete yet—I’d like to add Retrieval-augmented generation (RAG) functionality—the core implementation with Ollama is working smoothly. In this post, I’d like to highlight a few insights and…
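
    For orientation, here is a minimal sketch of the idea, not the post’s actual component: a custom element that posts a prompt to a locally running Ollama server. The element name, markup, and model choice are assumptions for illustration; only the `/api/generate` endpoint and its request shape come from Ollama’s documented REST API.

    ```typescript
    // A minimal custom element wrapping Ollama's generate endpoint.
    // Element name and internals are illustrative assumptions;
    // the /api/generate request shape follows Ollama's REST API.

    class OllamaPrompt extends HTMLElement {
      connectedCallback(): void {
        this.innerHTML = `
          <textarea placeholder="Ask something..."></textarea>
          <button>Send</button>
          <pre></pre>`;
        this.querySelector("button")!.addEventListener("click", () => this.send());
      }

      private async send(): Promise<void> {
        const prompt = this.querySelector("textarea")!.value;
        const output = this.querySelector("pre")!;
        output.textContent = "thinking...";

        // Non-streaming call to a locally running Ollama server.
        const res = await fetch("http://localhost:11434/api/generate", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ model: "llama3", prompt, stream: false }),
        });
        const data = await res.json();
        output.textContent = data.response;
      }
    }

    customElements.define("ollama-prompt", OllamaPrompt);
    ```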

  • Process API in AI tooling

    I am currently working on the third iteration of the process API, this time in Rust. The first version, written in JavaScript, aimed to expose user-defined processes without requiring users to write code. This introduced the concept of intent-driven development, where users define the intent and the process API handles the execution. This approach has…
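
    As a rough illustration of what “defining intent without writing code” might look like, a process could be described declaratively and handed to a small runner. The schema and names below are my own assumption for the sketch, not the actual process API.

    ```typescript
    // Hypothetical shape of a declarative process definition: the user states
    // intent as data; a runner decides how each step is executed.
    // None of these names come from the actual process API.

    type Step =
      | { type: "fetch"; url: string; as: string }
      | { type: "transform"; input: string; expression: string; as: string }
      | { type: "notify"; channel: string; message: string };

    interface ProcessDefinition {
      id: string;
      steps: Step[];
    }

    const dailyReport: ProcessDefinition = {
      id: "daily-report",
      steps: [
        { type: "fetch", url: "https://example.com/api/sales", as: "sales" },
        { type: "transform", input: "sales", expression: "sum(amount)", as: "total" },
        { type: "notify", channel: "email", message: "Total: {{total}}" },
      ],
    };

    // A runner interprets the intent; the user never writes the execution logic.
    async function run(def: ProcessDefinition): Promise<void> {
      for (const step of def.steps) {
        console.log(`executing ${step.type} step`, step);
        // ...dispatch to the appropriate executor here...
      }
    }

    run(dailyReport);
    ```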

  • RAG models

    This topic can quickly become extensive due to its complexity. My aim here is to document my experiment using llamaindex, ollama, and chroma_db for RAG. In essence, RAG involves conversing with an AI about processed documents. My focus lies not in the intricacies of document processing into vector stores, but rather in engaging with the…
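
    For context only, here is a bare-bones version of the retrieve-then-chat loop, using nothing but Ollama’s `/api/embeddings` and `/api/chat` endpoints with an in-memory cosine-similarity lookup. The model names are assumptions, and this does not reproduce the post’s actual llamaindex and chroma_db pipeline, which handles chunking, storage, and retrieval far more thoroughly.

    ```typescript
    // Minimal RAG loop against a local Ollama server: embed documents, retrieve
    // the closest one for a question, then ask the chat model with that context.
    // Illustration only; llamaindex + chroma_db are not used here.

    const OLLAMA = "http://localhost:11434";

    async function embed(text: string): Promise<number[]> {
      const res = await fetch(`${OLLAMA}/api/embeddings`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
      });
      return (await res.json()).embedding;
    }

    function cosine(a: number[], b: number[]): number {
      const dot = a.reduce((s, x, i) => s + x * b[i], 0);
      const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
      return dot / (norm(a) * norm(b));
    }

    async function ask(question: string, documents: string[]): Promise<string> {
      // Embed the corpus and the question, then pick the most similar document.
      const docVectors = await Promise.all(documents.map(embed));
      const qVector = await embed(question);
      const scores = docVectors.map((v) => cosine(v, qVector));
      const context = documents[scores.indexOf(Math.max(...scores))];

      // Hand the retrieved context to the chat model.
      const res = await fetch(`${OLLAMA}/api/chat`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "llama3",
          stream: false,
          messages: [
            { role: "system", content: `Answer using this context:\n${context}` },
            { role: "user", content: question },
          ],
        }),
      });
      return (await res.json()).message.content;
    }
    ```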

  • Running LLM locally

    For those already in the loop, this might be old news, but it remains useful for those still climbing the learning curve. Q: How can I run a Large Language Model (LLM) on my local machine? There’s a fair question as to why one would even consider this. Sure, there are readily available sandbox environments online…