FlowDevs Tech Stack: A Comprehensive Guide to Agile AI-Assisted Development

Stability in software development usually comes at the cost of speed, but the FlowDevs tech stack proves you can have both by enforcing a strict governance protocol requiring two votes for any structural change. Before any new tool or update is implemented, it requires alignment and approval from key stakeholders, ensuring that our foundation remains solid even as we move fast. This balance allows us to leverage cutting-edge AI-assisted development without introducing chaos into our production environments.
The Evolution of Vibe Coding
Modern development has shifted from accurate typing to accurate prompting, a methodology we call vibe coding. This approach uses AI interaction to accelerate front-end and full-stack generation. Our primary engine for this is Lovable, which handles the heavy lifting of interface generation. However, we do not rely on a single tool. We support a robust list of approved AI assistants including Codex via OpenAI, the Gemini CLI, GitHub Copilot, and Claude Code.
By defining a specific set of tools, we ensure that every developer is speaking the same language. Whether we are building a quick prototype or a complex enterprise application, the underlying AI assistance remains consistent, predictable, and secure.
Standardizing Intelligence with Agent Skills
One of the biggest challenges in AI-assisted development is ensuring the AI understands specific internal rules and frameworks. To solve this, we treat knowledge as a module. We utilize skills.sh, an open agent skills ecosystem, to inject procedural knowledge directly into our AI agents.
Instead of hoping the AI guesses the right folder structure or coding convention, we explicitly install these skills. For example, a developer can run a simple command like npx skills add owner/repo to give their agent the exact instructions needed for a task. This is critical when switching between different architectural paths. If a project requires a high-performance backend, we pull in skills optimized for Supabase. If we are working strictly within a Next.js environment, we pull in Vercel-specific skills. This ensures the AI acts as a specialized expert rather than a generalist.
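The per-path skill selection described above can be sketched as a small shell helper. The repo names below are hypothetical placeholders (only `owner/repo` comes from the text); a real setup would substitute the actual skill repositories for each architectural path.

```shell
#!/bin/sh
# Map an architectural path to the skills repo the agent should load.
# Repo names are hypothetical placeholders, not real packages.
skills_for() {
  case "$1" in
    supabase) echo "flowdevs/supabase-skills" ;;  # high-performance backend path
    nextjs)   echo "flowdevs/vercel-skills" ;;    # Next.js / Vercel-specific path
    *)        echo "owner/repo" ;;                # generic placeholder from the docs
  esac
}

# Print the install command rather than running it.
echo "npx skills add $(skills_for supabase)"
```

Because the mapping lives in one place, switching a project between paths means changing one argument rather than remembering which skills to install by hand.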
Core Infrastructure and Web Architecture
We believe in fitting the architecture to the business need, not the other way around. Our stack offers two primary deployment paths for web applications built with Lovable, as well as a custom option for specialized needs.
- Path 1: Standard. This is our cost-effective route. Both the frontend and cloud hosting are managed entirely within Lovable. It is perfect for projects where budget is the primary constraint.
- Path 2: Vibe-Optimized. When generation speed matters more than raw cost savings, we pair Lovable with Supabase. This allows for rapid iteration and "vibe coding" workflows, though it incurs higher infrastructure costs.
- Custom Path. For projects requiring hosting outside the Lovable ecosystem, we utilize Azure paired with Bun. It is important to note that we explicitly do not support Vercel or Netlify for these custom deployments to maintain strict control over our environments.
Regardless of the path chosen, GitHub remains our single source of truth. Every line of code must be synced to GitHub to ensure version control and disaster recovery.
Connecting Context with MCP Configurations
The Model Context Protocol (MCP) is the bridge that allows our AI tools to securely access our development environment. Without MCP, an AI is blind to the specific context of a project. We configure MCP servers in Codex to give it direct access to our databases, issue trackers, and secret managers.
To make this work, we use a global feature flag in our configuration files to enable remote clients. This setup allows for powerful integrations:
- Linear Integration. By connecting Codex to Linear, developers can create issues, track sprints, and query project documentation without ever leaving their code editor.
- Firebase and Supabase. We configure these servers to allow the AI to interact with Firestore, manage storage, and query database schemas. This turns the IDE into a command center where the database is just as accessible as the code files.
- 1Password. Security is paramount: secrets should never be hardcoded. We use the 1Password MCP to automate credential retrieval. However, because secrets pass through the MCP context in plaintext, it is strictly used for automated, disposable credentials and never for banking or primary personal accounts.
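A setup along these lines might look like the configuration fragment below. This is an illustrative sketch, not a verified reference: the flag name, server names, commands, and endpoint are assumptions and should be checked against the current Codex configuration docs before use.

```toml
# ~/.codex/config.toml -- illustrative sketch; key and flag names are assumptions.

# Global feature flag enabling remote MCP clients (the "global feature flag" above).
experimental_use_rmcp_client = true

# Issue tracking: lets the agent create issues and query docs in Linear.
[mcp_servers.linear]
command = "npx"
args = ["-y", "mcp-remote", "https://mcp.linear.app/sse"]

# Database context: schema queries and storage management.
[mcp_servers.supabase]
command = "npx"
args = ["-y", "@supabase/mcp-server-supabase"]
# Secret reference resolved at runtime via 1Password, never hardcoded.
env = { SUPABASE_ACCESS_TOKEN = "op://vault/item/token" }
```

Each `mcp_servers` entry simply names a local command the editor can launch, which is what makes the database and issue tracker as reachable as the code files.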
These configurations transform a standard text editor into a fully aware development environment where the AI understands not just code, but the entire project lifecycle.
Mobile and Local Development Standards
For applications that live outside the browser, we maintain a robust environment using Flutter for mobile development and C#, built in VS Code, for local desktop applications. Security in these environments is enforced strictly; all local applications must be signed using the official FlowDevs code signing certificate.
This rigor extends to our project management and operations. We use Linear for issue tracking, utilizing its MCP integration to keep AI in the loop. We use Microsoft Loop for all client-facing documentation, ensuring transparency. By standardizing these operational tools, we ensure that the speed of our development never outpaces our ability to manage it.
Frequently Asked Questions
What is the governance rule for changing the tech stack?
Any changes or additions to the FlowDevs tech stack require a minimum of two votes to pass. Currently, this requires alignment and approval from both Justin and Caleb before implementation begins.
What is "vibe coding"?
Vibe coding is our term for AI-assisted development where the focus shifts from manual coding to guiding AI generators. We primarily use Lovable for this, supported by assistants like Codex and GitHub Copilot.
Why are Vercel and Netlify unsupported?
For custom hosting outside of the Lovable ecosystem, we standardize on Azure + Bun. Vercel and Netlify are explicitly unsupported to maintain a consistent, controlled hosting environment that aligns with our specific infrastructure requirements.
How do agents learn new frameworks?
We use skills.sh to inject specific skills into our AI agents. This allows us to load best practices for specific technologies, such as Supabase or React, ensuring the AI follows our internal guidelines.
Check out this post on Techne Blog.