Our AI Stack: Tools and Frameworks We Use
A transparent look at the AI tools, frameworks, and infrastructure AVARC Solutions uses to build intelligent applications for clients.
Introduction
Clients often ask us which AI tools and frameworks we use. It is a fair question — the tools your development partner chooses determine the reliability, cost, and maintainability of the system they build. Proprietary lock-in, immature libraries, and poor abstraction layers can haunt a project for years.
Here is a transparent look at our current AI stack as of late 2025. We update this regularly because the AI tooling landscape evolves fast, and sticking to outdated tools is as risky as chasing every new framework.
Language Models: Provider Diversity by Design
We do not commit to a single LLM provider. Our systems are designed to work with OpenAI, Anthropic, Google Gemini, and open-source models through a unified abstraction layer. This lets us route different tasks to the best model for the job: Claude for nuanced reasoning and long context, GPT-4o for multimodal tasks, and smaller open-source models for classification and extraction tasks where cost matters most.
The Vercel AI SDK is our primary interface for model interaction. It provides a consistent API across providers, streaming support out of the box, and structured output generation that integrates cleanly with TypeScript. For agent workflows that require multi-step reasoning and tool use, we layer LangChain or custom orchestration logic on top.
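To make the routing idea concrete, here is a minimal sketch. The task categories and model identifiers are illustrative, not our production configuration; the point is that application code asks for a capability, not a provider:

```typescript
// Illustrative task categories; a real system typically has more.
type Task = "reasoning" | "multimodal" | "classification" | "extraction";

// Hypothetical model identifiers standing in for whatever the provider
// SDKs expect (an Anthropic, OpenAI, or open-source model id).
const MODEL_FOR_TASK: Record<Task, string> = {
  reasoning: "claude-sonnet",    // nuanced reasoning, long context
  multimodal: "gpt-4o",          // image + text input
  classification: "llama-small", // cheap open-source model
  extraction: "llama-small",
};

// Route a task to a model id; the caller passes the id to the SDK of choice.
export function routeTask(task: Task): string {
  return MODEL_FOR_TASK[task];
}
```

With the Vercel AI SDK, the returned id would typically be wrapped in a provider factory before being handed to a call like `generateText`, so swapping providers stays a one-line change.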
Vector Storage and Retrieval
For vector storage, we default to Supabase with pgvector because it keeps embeddings co-located with application data in a single PostgreSQL database. This eliminates the operational overhead of managing a separate vector database while delivering excellent query performance for collections up to several million vectors.
For larger-scale deployments or use cases requiring advanced features like hybrid search with built-in reranking, we use Pinecone or Weaviate. The choice depends on the client requirements, but our application code abstracts the vector store behind a retrieval interface so switching providers is a configuration change, not a rewrite.
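The retrieval interface described above can be sketched as follows. The names are illustrative; a toy in-memory backend shows the shape of the abstraction without any external service, and pgvector, Pinecone, or Weaviate adapters would implement the same interface:

```typescript
// Every backend implements this, so switching stores is config-only.
interface RetrievedDoc { id: string; score: number }

interface VectorStore {
  query(embedding: number[], topK: number): Promise<RetrievedDoc[]>;
}

// Toy in-memory backend using cosine similarity, useful for local tests.
export class InMemoryStore implements VectorStore {
  constructor(private docs: { id: string; embedding: number[] }[]) {}

  async query(embedding: number[], topK: number): Promise<RetrievedDoc[]> {
    const cosine = (a: number[], b: number[]): number => {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] ** 2;
        nb += b[i] ** 2;
      }
      return dot / (Math.sqrt(na) * Math.sqrt(nb));
    };
    return this.docs
      .map((d) => ({ id: d.id, score: cosine(embedding, d.embedding) }))
      .sort((x, y) => y.score - x.score)
      .slice(0, topK);
  }
}
```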
Development and Deployment Infrastructure
Our AI applications are built with Next.js and TypeScript, deployed on Vercel with edge functions for low-latency inference routing. Background jobs like document processing, embedding generation, and batch report creation run on dedicated serverless functions with longer timeout limits.
For observability, we use Langfuse to trace every AI interaction: inputs, outputs, latency, token usage, and cost per call. This gives us production visibility that is essential for debugging, cost optimization, and quality monitoring. Evaluation pipelines run nightly against curated test sets to catch model regressions before they reach users.
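The cost-per-call tracking boils down to multiplying the token counts the provider reports by that model's prices. A minimal sketch, with made-up prices rather than any provider's actual rates:

```typescript
// Hypothetical per-million-token prices in USD; real rates vary by model
// and change over time, so in practice these come from configuration.
const PRICE_PER_MTOK = { input: 3.0, output: 15.0 };

// Compute the cost of a single call from reported token usage.
export function costOfCall(inputTokens: number, outputTokens: number): number {
  return (
    (inputTokens / 1_000_000) * PRICE_PER_MTOK.input +
    (outputTokens / 1_000_000) * PRICE_PER_MTOK.output
  );
}
```

Attached as metadata to each trace, numbers like this make it straightforward to aggregate spend per feature, per client, or per model when optimizing costs.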
Why We Avoid Hype-Driven Tooling
The AI ecosystem launches a new framework every week. Most of them will not exist in a year. We evaluate new tools against three criteria: production readiness (are real companies using it at scale?), maintenance trajectory (is it backed by a sustainable team or company?), and abstraction quality (does it simplify without hiding essential complexity?).
This discipline means we sometimes adopt tools later than the bleeding edge, but it also means our clients never end up maintaining software built on an abandoned framework. Boring technology choices that reliably work in production are more valuable than exciting ones that break.
Conclusion
Our AI stack is opinionated but not rigid. We choose tools that are production-proven, well-abstracted, and provider-agnostic wherever possible. The goal is to build systems that our clients can maintain and evolve long after the initial engagement ends. Get in touch if you want to understand how our stack would apply to your specific use case.
AVARC Solutions
AI & Software Team