— Portfolio · 2026

Badrinarayanan Rangarajan.

Enterprise AI Leader · Fractional CTO · Agentic Systems · Edge-to-Cloud · Two-time Founder · Researcher

Eighteen years of building things that ship — from flight computers on nano-UAVs to multi-agent AI running across enterprise GPU clusters. I have been a team member, product manager, designer, founder and architect — and now I am returning to where the next decade actually gets built: research that becomes a venture. Today I am architecting that future inside one of the world's largest enterprises — and shipping it. Not a consumer of AI tools — a builder of AI systems that others use.

PORTRAIT Badrinarayanan Rangarajan
BR · 2026 Bengaluru
01 — Now

Building production agentic AI at automotive scale.

As Technology Program Manager for Enterprise AI at Mercedes-Benz R&D India, I architect the systems that let autonomous agents safely operate inside legacy enterprise infrastructure — an architectural problem the industry has not yet solved.

Enterprise Agentic AI · Brownfield Deployments

Mercedes-Benz R&D India · April 2023 → Present · Bengaluru

I lead the architecture of multi-agent systems with autonomous task decomposition, dynamic tool selection and real-time workflow adaptation — deployed across 60+ engineering workflows in brownfield environments where the legacy stack cannot be rewritten.

The hard part is not the model. It is what happens when an agent fails inside a 30-year-old enterprise system. So we built meta-cognitive recovery patterns — agents that reformulate strategy on failure, without a human stepping in.

Beyond the cloud, I architected onboard ML for in-cabin sensing — INT8/FP16 models compressed for automotive gateways, ISO 26262 and GDPR compliant, working alongside the ADAS team.

LangGraph · AutoGen · CrewAI · vLLM · Ray · DeepSpeed · SHAP / LIME · VLMs · ISO 26262
35–40%
Reduction in development & testing cycles via AI-driven tooling
60+
Engineers monthly trained & shipping GenAI features into live products
10×4
GPUs across 4 nodes — distributed inference orchestrated end-to-end
100%
Of trained engineers shipping AI to production

Two platforms reshaping how software ships.

Architecting the layers where autonomous agents take the keyboard — without breaking the systems that already run the business.

01

AURA · Agentic Unified Reactive Agile Platform

Multi-agent SDLC platform with autonomous task decomposition, dynamic tool selection and real-time workflow adaptation across legacy enterprise toolchains. Meta-cognitive recovery patterns let agents reformulate strategy on failure — without a human stepping in. Built around the Model Context Protocol and Agent-to-Agent protocol, with multi-LLM model routing, vector retrieval over corporate knowledge bases, and distributed inference across multi-GPU clusters via vLLM + Ray. Brownfield-first.

PROTOCOLS · MCP · A2A · LangGraph · vLLM + Ray · multi-LLM routing
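The multi-LLM routing piece can be sketched as a simple priority-with-failover loop. This is an illustrative Python sketch, not AURA's implementation; `Provider`, `route` and the stub callables are assumptions introduced for the example:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Provider:
    name: str
    call: Callable[[str], str]  # prompt -> completion

def route(prompt: str, providers: list[Provider]) -> str:
    """Try providers in priority order; fall through to the next on failure."""
    errors = []
    for p in providers:
        try:
            return p.call(prompt)
        except Exception as exc:
            errors.append((p.name, exc))
    raise RuntimeError(f"all providers failed: {errors}")

def flaky(prompt: str) -> str:   # stand-in for a timed-out provider
    raise TimeoutError("primary model unavailable")

def stable(prompt: str) -> str:  # stand-in for a healthy provider
    return f"ok: {prompt}"

print(route("plan the release", [Provider("primary", flaky),
                                 Provider("fallback", stable)]))
# → ok: plan the release
```

The design choice the sketch captures: routing failures are collected, not swallowed, so the caller can distinguish "a provider answered" from "every provider failed".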
02

The autonomous production line.

Seven role-based agents running in parallel where they can and in handoff where they must: requirement analysis → coding ⇄ data + unit testing → review → merge → end-to-end validation with self-healing. The requirement-stage Blueprint persists immutably through to validation, so every downstream agent verifies against the original intent. End-to-end Playwright tests trigger recursive loop-backs into the coding, data or unit-test phases when implementation drifts. Zero-touch from backlog ticket to shipped feature.

LOOP · Requirement → Code → Test → Review → Merge → E2E → Self-heal
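The loop-back mechanic can be pictured as a small driver that re-enters an earlier phase when end-to-end validation points at drift. A hypothetical Python sketch, not the production agents; `run_pipeline`, `e2e_validate` and the stage stubs are assumptions:

```python
def run_pipeline(blueprint, stages, e2e_validate, max_loops=3):
    """stages: ordered (name, fn) pairs; fn(blueprint, artifact) -> artifact.
    e2e_validate returns the stage name to loop back into, or None once
    the artifact matches the original Blueprint intent."""
    names = [name for name, _ in stages]
    artifact, start, loops = None, 0, 0
    while True:
        for name, stage in stages[start:]:
            artifact = stage(blueprint, artifact)
        target = e2e_validate(blueprint, artifact)
        if target is None:
            return artifact             # zero-touch: ticket to shipped feature
        loops += 1
        if loops > max_loops:
            raise RuntimeError("loop-back budget exhausted; escalate")
        start = names.index(target)     # recursive loop-back into that phase

# toy run: validation flags drift once, forcing one loop-back into "code"
drifted = {"flag": True}
stages = [("code", lambda bp, a: (a or "") + "c"),
          ("test", lambda bp, a: a + "t")]
validate = lambda bp, a: "code" if drifted.pop("flag", False) else None
print(run_pipeline("BP", stages, validate))
# → ctct
```

The immutable blueprint is threaded into every stage call, mirroring the idea that downstream agents verify against the original intent rather than the previous agent's output.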

And in the open, shipping the same thesis.

Three open-source repositories where the autonomous-SDLC, declarative-agent and on-device-AI patterns live in public — written in Rust and C++, deployable from a laptop to a cluster.

zenith · your AI engineering team

Nine specialised agents — Architect, Developer, Tester, Reviewer, DevOps, Research and a 3-stage Playwright pipeline that plans, generates and self-heals browser tests. Three-provider failover across Gemini, Claude and Azure OpenAI; every action gated through an interactive approval dashboard; 207 tests passing. Runs entirely on your own hardware — the open-source articulation of the same autonomous-production-line pattern.

agentfile · Dockerfile, for agents

A declarative format for portable AI agents — one file defines tools, memory, multi-model orchestration, pipelines and the whole service layer. Ten LLM providers (Claude, GPT-4, Gemini, Grok, Groq, Bedrock, vLLM, Ollama, Azure, Vertex). Run locally, serve as HTTP, or deploy to Kubernetes through the official Helm chart and AgentDeployment CRD. SPAWN · PARALLEL · AGGREGATE for multi-agent topologies; MEMORY episodic across sessions.
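The SPAWN · PARALLEL · AGGREGATE topology can be illustrated generically. A hypothetical Python sketch of the pattern only, not agentfile's actual syntax or runtime; the stub agents stand in for model-backed workers:

```python
from concurrent.futures import ThreadPoolExecutor

def spawn_parallel_aggregate(task, agents, aggregate):
    """SPAWN one worker per agent, run them in PARALLEL, then
    AGGREGATE the ordered results into a single answer."""
    with ThreadPoolExecutor(max_workers=len(agents)) as pool:
        results = list(pool.map(lambda agent: agent(task), agents))
    return aggregate(results)

# stub agents standing in for model-backed workers
agents = [lambda t: f"researcher: {t}", lambda t: f"reviewer: {t}"]
print(spawn_parallel_aggregate("summarise ticket", agents, " | ".join))
# → researcher: summarise ticket | reviewer: summarise ticket
```

`pool.map` preserves input order, so the aggregation step sees results in the same order the agents were declared.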

KLLM · kernel-level language models

A C++/Rust framework that injects language models directly at the kernel level — pseudo-level layering, on-device small LMs, a finite-state machine for state and recovery, and Sentinel AI for real-time monitoring, jailbreak prevention and self-healing. Cognitive re-modelling of the mobile OS for context-aware, privacy-preserving on-device intelligence; targets Android, Ubuntu Touch and web-OS. The edge-AI thesis I have carried since the flight-computer days, now applied to the operating system itself.

02 — Current Research

Cognitive Integration Intelligence.

The greenfield-AI era is ending. Ninety-plus percent of enterprise value still lives inside legacy systems that cannot be rewritten — and the agents being built today execute without observing, reset between calls, and treat failure as terminal. I am developing a methodology that closes those three gaps at once: AI that learns the way an experienced engineer does, inside systems that refuse to change.

DIAGRAM Venn: Legacy Enterprise (systems · data · process) × Human Cognition (expertise · intuition) × Agentic AI Frameworks (LLMs · tools · memory) — pairwise overlaps: tribal knowledge, standard RAG, reasoning agents; centre: Cognitive Integration Intelligence
The intersection where brownfield reality, human expertise, and agentic AI actually converge.
01

Observe, don't read.

Real expertise is behavioural, not documented. The senior engineer knows to check a log timestamp before a config file — that sequence never appears in an SOP. My agents shadow workflows and capture the implicit ordering that makes decisions correct.

REPLACES · Document ingestion + RAG over PDFs
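One way to picture the "observe" step: mine the ordering of shadowed actions instead of reading documents. A toy Python sketch; the event names and bigram mining are illustrative assumptions, not the production observer:

```python
from collections import Counter

def mine_orderings(sessions, n=2):
    """Count action n-grams across shadowed sessions; the dominant
    ordering is the implicit expertise no SOP ever records."""
    grams = Counter()
    for actions in sessions:
        grams.update(zip(*(actions[i:] for i in range(n))))
    return grams.most_common()

# two shadowed sessions of the same senior engineer (hypothetical events)
sessions = [
    ["check_log_timestamp", "open_config", "edit_config"],
    ["check_log_timestamp", "open_config", "restart_service"],
]
print(mine_orderings(sessions)[0])
# → (('check_log_timestamp', 'open_config'), 2)
```

The top bigram is exactly the "log before config" sequence described above: behavioural, recoverable only by observation.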
02

Accumulate, don't reset.

Current agents start every session with amnesia. A Rust-backed persistent memory layer lets the system carry forward what it learned yesterday — per-engineer, per-workflow, per-failure-mode — so expertise compounds instead of resetting with each prompt.

REPLACES · Stateless per-call context windows
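The accumulation semantics can be shown in a few lines. The real layer is Rust-backed; this Python `MemoryStore` is only a toy sketch of the carry-forward behaviour, with paths and keys invented for the example:

```python
import json
import pathlib
import tempfile

class MemoryStore:
    """Toy accumulator keyed by engineer/workflow: lessons recorded in
    one session are re-loaded in the next, so expertise compounds."""
    def __init__(self, path):
        self.path = pathlib.Path(path)
        self.data = json.loads(self.path.read_text()) if self.path.exists() else {}

    def record(self, engineer, workflow, lesson):
        self.data.setdefault(f"{engineer}/{workflow}", []).append(lesson)
        self.path.write_text(json.dumps(self.data))  # persist immediately

    def recall(self, engineer, workflow):
        return self.data.get(f"{engineer}/{workflow}", [])

# two separate "sessions" against the same file: the second remembers the first
path = pathlib.Path(tempfile.mkdtemp()) / "cii_memory.json"
MemoryStore(path).record("asha", "deploy", "check log timestamp before config")
print(MemoryStore(path).recall("asha", "deploy"))
# → ['check log timestamp before config']
```

Each `MemoryStore(path)` construction simulates a fresh session; the recalled lesson crossing that boundary is the point of the sketch.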
03

Recognise failure as data.

When an agent fails inside a 30-year-old stack, that failure is the most information-dense signal in the system. Meta-cognitive recovery reformulates strategy from the failure trace itself — turning what breaks production into what teaches the agent.

REPLACES · Retry loops & human escalation
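The pattern reduces to: classify the failure trace, then reformulate rather than retry. A minimal Python sketch; `classify`, the strategy table and the stub strategies are assumptions for illustration, not the deployed recovery system:

```python
def attempt_with_recovery(task, strategies, classify, max_attempts=4):
    """On failure, classify the trace and switch to the strategy the
    failure signature points at; never a blind retry of the same plan."""
    strategy, tried, last = strategies["default"], set(), None
    for _ in range(max_attempts):
        try:
            return strategy(task)
        except Exception as exc:
            tried.add(strategy)
            last = exc
            signature = classify(exc)             # failure trace -> signature
            strategy = strategies.get(signature, strategies["default"])
            if strategy in tried:                 # nothing new left to try
                raise
    raise RuntimeError("attempt budget exhausted") from last

def direct(task):    # stub plan that hits a slow backend
    raise TimeoutError("backend slow")

def batched(task):   # stub reformulated plan
    return f"done: {task} via batching"

strategies = {"default": direct, "slow_backend": batched}
classify = lambda exc: "slow_backend" if isinstance(exc, TimeoutError) else "default"
print(attempt_with_recovery("sync-parts-db", strategies, classify))
# → done: sync-parts-db via batching
```

The `tried` set is what makes this recovery rather than a retry loop: revisiting an already-failed strategy escalates instead of spinning.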

Edge AI Benchmark · Cognitive Integration across constrained hardware.

Jetson Orin · Google Coral · Raspberry Pi · 6× RTX A6000
EDGE AI BENCHMARK · COGNITIVE INTEGRATION — Edge AI Benchmark Report: Cognitive Integration across constrained hardware (Jetson Orin, Google Coral, Raspberry Pi, 6× RTX A6000). Gemma 4 · INT8 · Ollama. +5 pp accuracy across math/code/reasoning; ~20% INT8 latency reduction; 3/5 sites deployed with the cognitive layer vs 0/5 standard.

Source: internal benchmark shared in the Cognitive Integration Intelligence field-note series on LinkedIn.

◆ Hardware Matrix
NVIDIA Jetson Orin Automotive-class edge · full CII stack, INT8, real-time throughput
Google Coral TPU-only · distilled memory layer, quantised observer models
Raspberry Pi CPU-only fallback · failure-mode recorder + async sync
6× RTX A6000 Training & distributed inference · observer fleet supervision
Rust-based persistent memory layer. The accumulator runs below the model — a small, fast, typed store that carries observation traces, failure signatures, and engineer-specific decision patterns across sessions. This is the piece that makes the rest of the stack behave like an engineer with tenure rather than a prompt.
FIELD NOTES

Three essays that assemble the thesis.

Published on LinkedIn over the past months — the shape of the methodology, told in three parts.

Essay 01

The greenfield AI hype is over.

"Ninety percent of enterprise value is trapped inside legacy systems that cannot be rewritten. The next decade of AI is not about building on blank slates — it is about learning to operate inside the ones that already run the world."

READ ON LINKEDIN →
Essay 02

Stop building AI that executes.

"An agent that executes without observing is a faster way to be wrong. Real expertise is behavioural — captured in what order a senior engineer checks things, not in any document they've written."

READ ON LINKEDIN →
Essay 03

The problem with tribal knowledge.

"Tribal knowledge is not a documentation failure — it is a behavioural one. You cannot write down what you don't know you know. So we stopped trying to extract it and started observing it instead."

READ ON LINKEDIN →
03 — Evolution

From line-of-code engineer to founder, architect, researcher.

A non-linear path on purpose. Each role was a deliberate widening of perspective — banking systems taught me scale and reliability, drones taught me physics and real-time constraints, founding taught me the brutal economics of deep tech, and Mercedes is teaching me what it takes to land AI inside a 140-year-old enterprise.

PHASE 01Foundations · Engineer at scale
2003 — 2009 · Phase: Team member → Tech lead
Consultant · Tata Consultancy Services
SEI Investments · Bank of America · Citi Cards · Philadelphia & Jacksonville, USA
  • Technical lead and application manager for payment systems and fraud detection at major US financial institutions.
  • Six years in the US — learned what it takes to ship software that handles real money, real consequences, real users.
2011 — 2012 · Phase: First-time founder
Founding Member · UooLabs
Singapore
  • Designed and prototyped Android device-management components using OSGi & OMA-DM.
  • First taste of zero-to-one. Set the trajectory.
PHASE 02Researcher · Designer · Product Manager
2013 — 2015 · Phase: Researcher & PM
Researcher & Product Manager · Nanyang Technological University
Air Traffic Management Research Institute · Singapore
  • Designed a Pixhawk-based flight computing platform for Temasek Labs to identify signals from unknown terrain.
  • Built ML algorithms and parallel processing pipelines for large-scale data; published peer-reviewed work on cognitive and metacognitive systems — the PBL-McRBFN and McCIT2FIS classifier lineage that runs through five papers (see Recognition).
  • Secured research funding through proposals to A*Star, MINDEF, MOE.
PHASE 03Founder · Chief architect of a deep-tech venture
2015 — 2017 · Phase: CTO & co-founder
Co-founder, Chief Architect & CTO · SwarmX
Singapore
  • Built and scaled a 25-member team; served 5 enterprise clients including DNVGL and Singapore Police Force.
  • Designed the entire technology stack — ARM-based flight computers, precision landing, deep-learning analytics platform, cloud-based fleet management.
  • Owned marketing, commercialization and the architectural roadmap end-to-end.
2018 — 2019 · Phase: Independent researcher
Researcher · Republic Polytechnic
Singapore
  • Developed a low-cost ARM-based drone navigation system as a potential lidar replacement; designed an indoor UAV for warehousing.
  • Tested at Toyota facility; led customer engagement and commercialization.
PHASE 04Founder again · Defense-grade deep tech
2020 — 2022 · Phase: Co-founder & Technopreneur-in-Residence
Technopreneur-in-Residence · ARTPARK & Indian Institute of Science
Co-founder of Vishwa Dynamics · Bengaluru, India
  • Founded a deep-tech venture; built a 15-member R&D team; developed Omnipilot — a modular computing platform for autonomous land & aerial systems.
  • Designed cognitive decision-making architecture for Omnipilot — an early agentic system enabling autonomous navigation, failure recovery and real-time strategy adaptation in GPS-denied environments.
  • Led product lifecycle for nano-UAVs with custom AI flight computers, ARM Edge AI systems, and acoustic counter-UAV defense.
  • Secured INR 1.8 Cr government grant from the Ministry of Heavy Industries; showcased at Aero India 2023 with DRDO as primary customer.
PHASE 05Enterprise AI architect · The current chapter
2023 — Present · Phase: Technology Program Manager
Technology Program Manager, Enterprise AI · Mercedes-Benz R&D India
Bengaluru, India
  • Architecting agentic AI systems at automotive scale — 35–40% reduction in dev cycles, 60+ engineers shipping AI monthly.
  • Pioneered meta-cognitive recovery patterns for brownfield agentic deployment — agents that reformulate strategy on failure.
  • Established Responsible AI framework: explainability (SHAP, LIME), audit trails, bias mitigation.
  • Authoring the AURA agentic SDLC platform and an autonomous production line for software delivery — full description in the Now section.
04 — Ventures

Two ventures. Both deep tech. Both built from zero.

Across SwarmX (Singapore) and Vishwa Dynamics (Bengaluru), I have built engineering organizations of 15 and 25 people, owned the P&L, closed enterprise sales with regulated customers, and shipped hardware to defense agencies. The lessons compound.

IMAGE OmniPilot platform capabilities · Vishwa Dynamics
2020 — 2022 · Bengaluru · CO-FOUNDER

Vishwa Dynamics

Modular autonomy for land & aerial systems

Out of ARTPARK & IISc, we built Omnipilot — a modular AI flight computing platform with a cognitive decision-making layer that worked in GPS-denied environments. Defense-grade, nano-UAV class, with an acoustic counter-UAV stack alongside.

IMAGE OmniPilot product card · Aero India 2023
IMAGE Custom flight computer · Vishwa Dynamics
₹1.8 Cr · Government grant secured (Ministry of Heavy Industries)
15 · R&D team built ground-up
DRDO · Primary customer · Aero India 2023
Edge AI · ARM-based, mission-critical compute
PRECISION LANDING · 2015 SwarmX precision landing — raised platform with landing guidance
2015 — 2017 · Singapore · CO-FOUNDER · CTO

SwarmX

Energy asset management with autonomous UAVs

Architected the entire stack — flight computers, precision landing, the deep-learning analytics platform, and cloud-based fleet management for varied operational verticals. Enterprise GTM into regulated customers.

25 · Engineers across hardware, ML, cloud
5 · Enterprise clients (DNVGL, SG Police Force)
5–25 · UAV fleets managed across deployments
Full Stack · Hardware → ML → Cloud → GTM
05 — Selected Work

Since 2013 — the work in pictures and in motion.

A non-exhaustive visual archive starting from the NTU years — the hardware I built, the demos we flew, the dashboards we shipped. The grid below holds the stills; the reel that follows it holds the field tapes and lab demos from the OmniPilot, LGSDP, NTU and SwarmX eras.

FEATURED OmniPilot capabilities — fully indigenized drones, no-code flying, drone app store, cloud simulator
Omnipilot Flight Computer
Vishwa Dynamics · Modular AI autonomy
2021
BLR
PRODUCT OmniPilot — AI-driven autonomous navigation system
Aero India 2023
DRDO showcase
2023
HARDWARE Custom flight controller board · Vishwa Dynamics
Nano-UAV
Custom AI flight computer
2022
DETECTION AI-based drone detection — bounding box tracking on land and naval scenes
Counter-UAV Defense
EO/IR & acoustic detection
2022
ARCHITECTURE Cognitive modeling for indoor navigation — multi-modal sensor fusion through meta-cognitive reasoning
Cognitive Architecture
Agentic decision layer
2021
UTM UAV traffic management — conflict resolution dashboard
SwarmX Platform
Fleet analytics & conflict resolution
2016
ENTERPRISE
Brownfield agentic AI · Mercedes-Benz · architecture under NDA
Enterprise Agentic AI
Mercedes-Benz · brownfield deployment
2025–26
▶ FIELD
Urban-farming POC
Vision-only navigation · 2025 refresh
2025
CLUSTER
10 × 4 GPU cluster · vLLM · Ray · distributed inference
GPU Orchestration
vLLM · Ray · 10 GPUs ×4
2024
PLATFORM Pixhawk-based flight platform · IISc AIRL
NTU ATMRI
Flight platform · Temasek Labs
2014
▶ WAREHOUSE
Indoor warehouse POC
Lidar-free navigation · 2025 refresh
2025
RESEARCH
Meta-cognitive networks · PBL-McRBFN · McCIT2FIS · 5 peer-reviewed papers
Meta-Cognitive ML
McRBFN · McCIT2FIS · research
2014–15
// field tapes · lab demos
▶ OMNIPILOT
OmniPilot · platform demo
Vishwa Dynamics · ARTPARK era
2021
▶ FIELD
UAV field clip · I
UAV portfolio · multi-employer
2014–22
▶ PRECISION LANDING
Precision landing · UAV portfolio
Stabilised descent · multi-platform
2014–22
▶ FIELD
UAV field clip · III
UAV portfolio · multi-employer
2014–22
▶ AI · DEFENSE
AI in Defense · I
IISc presentation · 2021
2021
▶ PRECISION LANDING
Precision landing · SwarmX
Raised platform · landing guidance
2015
▶ AI · DEFENSE
AI in Defense · III
IISc presentation · 2021
2021
▶ LGSDP · WAREHOUSE
LGSDP warehouse · I
Lidar-free navigation · field
2019
▶ LGSDP · WAREHOUSE
LGSDP warehouse · II
Lidar-free navigation · field
2019
▶ LGSDP · SIMULATION
LGSDP simulation
Warehouse simulator · 2019
2019
▶ LAB
SAS recording
Lab capture · 2019
2019
▶ PRECISION LANDING
Precision landing · IISc
Infrastructure-aided · hobby drone
2021
▶ PRECISION LANDING
Precision landing · NTU
Landing on a moving vehicle
2017
06 — Capabilities

A rare full-stack across the AI value chain.

Most enterprise AI leaders have never written firmware. Most embedded engineers have never sized a GPU cluster. I have done both — and the leverage is in the seam between them.

The seam between edge and cloud is where deep-tech entrepreneurship lives.

10+ years building AI at the edge — flight computers, automotive gateways, embedded SOCs — and 10+ years architecting at the cloud and enterprise layer. The combination is what lets a venture defensibly own a vertical.

This is not a list of buzzwords. Each of these I have shipped to a paying customer, a regulated environment, or a peer-reviewed publication.

⊹ AI
Agentic Architecture
Multi-agent orchestration, LangGraph / AutoGen / CrewAI, meta-cognitive recovery, brownfield deployment
⊹ ML
Cognitive Systems
Meta-cognitive RBFN, McCIT2FIS, PBL-McRBFN — published research on cognitive architectures
⊹ INF
Distributed Inference
vLLM, Ray, Triton, DeepSpeed — orchestrating GPU fleets across nodes
⊹ EDGE
Edge AI & Embedded
ARM SOC/SOM, INT8/FP16 quantization, automotive gateways, ISO 26262 compliance
⊹ ROBO
Autonomous Systems
UAV flight computing, GPS-denied navigation, precision landing, swarm coordination
⊹ CV
Vision & VLMs
Real-time interior sensing, object detection, vision-language models for ADAS contexts
⊹ MLOPS
Cloud & MLOps
Multi-cloud AI infrastructure, CI/CD for ML, DeepEval benchmarking, MLOps automation
⊹ RAI
Responsible AI
SHAP / LIME explainability, audit trails, bias mitigation, GDPR-compliant deployment
⊹ BIZ
Build & Sell
P&L ownership, VC raise, enterprise & government sales, cross-cultural team building
07 — Recognition

Patents, papers, podiums.

A research record alongside a builder's record — which is the rarer combination, and exactly what an MS by Research demands.

Patents 04

  • System, method & station for docking unmanned vehicles · UAV infrastructure
  • System, method & server for managing stations & vehicles · Fleet management
  • Indoor intelligent edge analytics platform for warehousing · IP disclosure
  • Intelligent computing system for robotics · IP disclosure

Published Research 07+

  • Applied Soft Computing · Elsevier · Journal · 2015
  • IEEE ISSNIP · Singapore · Conference · 2014
  • IEEE ICCAR & Vision · Singapore · Conference · 2014
  • IEEE Fuzzy Systems · Turkey · Conference · 2015
  • IEEE ICCCIP · India · Conference · 2015
  • Complete gallery below with cover previews & PDFs

Selected Talks 08

  • Mercedes-Benz Enterprise Architect Conference · India · 2024
  • Mercedes-Benz Technology Forums · India · 2023
  • Startup-India: Journey through a hardware startup · ISB · 2022
  • Coexistence of Robots and Humans · Cilre · India · 2022
  • IEEE International Conference on Fuzzy Systems · Turkey · 2015
  • IEEE 13th Int'l Conference on Control Automation Robotics & Vision · Singapore · 2014

Published research, in covers.

05 SHOWN · 07+ TOTAL
08 — The next chapter

Where my curiosity is pointing now.

A few open threads — surfaced from everything I have built so far across UAVs, automotive AI, robotics, biomedical signals and factory floors. Not a closed list, and not the only directions that interest me — just the ones I am actively reading, prototyping and writing about today, and the ones the next venture is most likely to grow out of.

Thread · 01

Compute at the constrained edge.

The interesting deep-tech problems are bounded — by power, latency, weight, cost, certification. I have spent a decade designing AI inside those constraints, and that is where I want to keep going deeper.

Edge AI · Embedded SOC/SOM · Quantized Inference · Mission-Critical Compute · Sensor Fusion
Already running through my work
  • Mercedes-Benz: INT8/FP16 quantized CV/ML on automotive gateways — ISO 26262 & GDPR compliant onboard ML.
  • Vishwa Dynamics: Omnipilot — custom AI flight computer for nano-UAVs; ARM-based edge compute, mission-critical, defense-grade.
  • SwarmX: ARM flight systems with precision landing, deployed across fleets of 5–25 UAVs in regulated environments.
  • NTU / Temasek Labs: Pixhawk-based flight computing platform for signal identification in unknown terrain.
  • Republic Polytechnic / Toyota: low-cost ARM navigation explored as a lidar replacement for indoor warehousing UAVs.
  • Two granted patents and two filed IP disclosures around edge analytics platforms and intelligent computing for robotics.
Thread · 02

Autonomy as a cognitive problem, not a control problem.

Most autonomous systems break the moment the world stops matching the model. The interesting question is how a system reformulates strategy on failure — meta-cognitive recovery, dynamic re-planning, adaptive control. The industry has not yet solved this.

Agentic AI · Meta-Cognitive Learning · Adaptive Control · GPS-Denied Autonomy · Multi-Agent Orchestration
Already running through my work
  • Mercedes-Benz: pioneered meta-cognitive recovery patterns for brownfield agentic deployment; multi-agent orchestration with autonomous task decomposition and dynamic tool selection.
  • Vishwa Dynamics: designed the cognitive decision-making architecture for Omnipilot — autonomous navigation, failure recovery and real-time strategy adaptation in GPS-denied environments.
  • Published research on PBL-McRBFN, McCIT2FIS, McRBFN — a body of work on meta-cognitive learning algorithms across vision and biomedical signals.
  • Speaker on cognitive systems and robot-human coexistence (IEEE Turkey 2015, ICCAR & Vision Singapore 2014, Cilre 2022).
Thread · 03

Multi-parameter sensing × real-time decisions.

Real-world data is noisy, non-stationary, multi-modal. The hardest part of useful AI is the seam between rich physical sensing and the next decision the system has to make. I want to keep working on that seam — where signal becomes action.

Multi-Parameter ML · Hybrid Edge-Cloud · Distributed Inference · Noisy / Non-Stationary Signals · Spatio-Temporal Data
Already running through my work
  • Mercedes-Benz: hybrid edge-cloud AI for in-cabin sensing, vision-language models for ADAS, and distributed GPU inference across vLLM / Ray / DeepSpeed clusters.
  • SwarmX: deep-learning analytics platform on UAV-collected data; cloud-based fleet management across varied operational verticals.
  • Published multi-parameter ML on biomedical signals — EEG seizure detection (McCIT2FIS), structural MRI biomarker discovery for ADHD, compressed-domain action recognition (Applied Soft Computing, Elsevier).
  • Republic Polytechnic / Toyota: indoor sensing UAV for warehousing; led commercialization and customer engagement.
  • Responsible-AI framework with SHAP, LIME, audit trails — because turning sensor data into operational decisions demands explainability.
Founded
2 ventures
SwarmX · Vishwa Dynamics
Funded
₹1.8 Cr
Govt. of India · MoHI grant
Published
7+ papers
Cognitive systems · Robotics · CV
09 — Education
2025 — Ongoing
M.S., Electrical & Computer Engineering
University of Colorado Boulder
2022 — 2023
Executive Business Administration
Indian Institute of Management, Bangalore
1998 — 2002
B.E., Computer Science & Engineering
Visvesvaraya Technological University

Let's talk about what's next.