I Was Below Average in School — Here's What Changed Everything
I used to score below 40% in most subjects — not because I lacked capability, but because I was studying out of fear.
The shift happened when curiosity replaced pressure. This is that story, and the mindset that changed my entire career.
A deep-dive into Retrieval-Augmented Generation. How I built semantic search with OpenSearch,
connected Amazon Bedrock LLMs, and deployed the whole system on AWS ECS Fargate — with real token cost breakdowns and architecture decisions explained.
Before Python, before AI, there was Java. The verbosity, the OOP principles, the cryptic NullPointerExceptions —
Java taught me how to think like a programmer. And its strictness made me a better engineer than any "easy first language" ever could.
It was 2 AM. A YouTube video about Netflix recommendations. The realization that machines could learn patterns
from data — without being explicitly programmed. That one video rewired my career path.
Curiosity Is My Biggest Advantage — I Always Ask Why
Most engineers learn HOW to use transformers. I obsessed over WHY they work. That habit of going one level deeper
than most — into cosine similarity, async event loops, token economics — is what led to the ~99% token reduction at Idyllic.
Don't follow tutorials blindly. Build real things. Understand fundamentals before frameworks.
Embrace errors as teachers. If I could send a message back to my first-year self, this would be it.
Why Consistency Matters More Than Talent in Programming
I was never the smartest person in the room. But I showed up every day. Consistency got me the DSA lead role,
the internship, the production AI systems. Talent gets you started — consistency gets you there.
Start with a real problem, not a tutorial. Build something broken, then fix it. Find the 20% that does 80% of the work.
Applied this to AWS Bedrock, OpenSearch, n8n, Docker, and Kubernetes — all within months.
The tutorial trap is real. 10 hours of watching vs 1 hour of building something broken and fixing it.
Every project I built — JARVIS, the booking system, the Chrome extension — taught me 10× more than any course.
JARVIS — a voice assistant I built with zero guidance, bad code, and complete obsession. It taught me
API integration, state handling, prompt engineering basics, and one crucial lesson: imperfect projects build more confidence than perfect tutorials.
The shift from academic coding (solve this LeetCode problem) to real-world problem solving (this Mutt needs a booking system).
Real users, real deadlines, real maintenance — they force better architecture, better code, better engineering.
Building a RIS Application for Medical Imaging Systems
Building a Radiology Information System — DICOM handling, patient data management, imaging report generation.
The responsibility of writing software in healthcare, and how real constraints shaped every architecture decision.
Before any internship, before any course — I built a voice-activated AI assistant from scratch.
Speech recognition, text-to-speech, Wikipedia API, task automation. The project that got me hired.
Building integration-smoke-test: A Python Tool for API Diagnostics
How I built and published a lightweight Python library to PyPI — testing API connectivity, diagnosing DNS issues,
timeout failures, and endpoint errors with a single import. Open source, production-ready.
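To make the failure modes concrete, here is the kind of check such a tool performs, sketched with the standard library's socket module. To be clear, this is not integration-smoke-test's actual API, just an illustration of separating DNS failures from connection failures:

```python
import socket

def diagnose_host(host, port=443, timeout=3.0):
    """Illustrative connectivity check (not the library's real API):
    distinguishes a DNS resolution failure from an unreachable endpoint."""
    try:
        # Resolve first, so a DNS problem is reported as exactly that.
        addr = socket.getaddrinfo(host, port)[0][4]
    except socket.gaierror as e:
        return f"DNS failure for {host}: {e}"
    try:
        # Then attempt a TCP connection with an explicit timeout.
        with socket.create_connection(addr[:2], timeout=timeout):
            return f"{host}:{port} reachable"
    except OSError as e:
        return f"{host}:{port} unreachable: {e}"

print(diagnose_host("nonexistent.invalid"))  # reserved TLD, always a DNS failure
```

Separating the resolution step from the connect step is what turns "the request failed" into an actionable diagnosis.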
From Code to Containers: What Docker and Kubernetes Taught Me
From "it works on my machine" chaos to containerized, orchestrated deployments on AWS ECS.
Why containerization is now part of my day-one architecture decisions on every production project.
Building a Real-Time Collaborative Code Editor with WebSockets
Multiple users, one editor, zero conflicts. The full architecture of my WebSocket-powered collaborative
coding platform — room management, operational transforms, conflict resolution, and Socket.IO under the hood.
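The conflict-resolution idea behind operational transforms fits in a few lines. This is a toy, insert-only transform for illustration, not the platform's actual implementation: when two users insert concurrently, the later-applied operation's position is shifted past the text the other user already inserted.

```python
def transform_insert(op_a, op_b):
    """Transform insert op_a against a concurrently applied insert op_b.
    Ops are (position, text) pairs; toy insert-only OT."""
    pos_a, text_a = op_a
    pos_b, text_b = op_b
    if pos_a >= pos_b:
        # op_b's text now sits before op_a's position, so shift right.
        pos_a += len(text_b)
    return (pos_a, text_a)

doc = "hello"
op_a = (5, "!")    # user A appends "!"
op_b = (0, ">> ")  # user B concurrently prefixes ">> "

# Apply B, then A transformed against B: both users converge on the same text.
op_a2 = transform_insert(op_a, op_b)
doc = doc[:op_b[0]] + op_b[1] + doc[op_b[0]:]
doc = doc[:op_a2[0]] + op_a2[1] + doc[op_a2[0]:]
print(doc)  # >> hello!
```

Real OT also handles deletes and same-position tie-breaking, but the shift-by-concurrent-insert rule above is the core of "zero conflicts".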
Building a Booking System That Cut Manual Work by 90%
Flask, MySQL, automated WhatsApp confirmations. This is the full technical breakdown of the
Raghavendra Swami Mutt booking system — from 30+ hours/week of manual work to near-zero.
Building a Reliable AI Email Triage Pipeline with n8n
How I built a hybrid AI + rules email triage system using n8n — LLM classification, urgency scoring,
validation, and structured output. Why probabilistic AI needs deterministic guardrails to be production-safe.
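The "deterministic guardrails" idea is easy to show in miniature. A hypothetical sketch follows; the label set, JSON shape, and function name are mine, not the post's schema. The point is that the LLM's output is treated as untrusted input and verified before anything downstream acts on it:

```python
import json

ALLOWED_LABELS = {"urgent", "normal", "low"}  # hypothetical label set

def parse_triage_output(raw):
    """Deterministic guardrail around a probabilistic classifier.
    The LLM is asked for JSON like {"label": "urgent", "score": 0.9};
    we verify shape and values instead of trusting the model."""
    data = json.loads(raw)
    label = data.get("label")
    score = data.get("score")
    if label not in ALLOWED_LABELS:
        raise ValueError(f"Unexpected label: {label!r}")
    if not isinstance(score, (int, float)) or not 0.0 <= score <= 1.0:
        raise ValueError(f"Score out of range: {score!r}")
    return label, float(score)

print(parse_triage_output('{"label": "urgent", "score": 0.93}'))  # ('urgent', 0.93)
```

A rejected output can be retried or routed to a fallback rule, which is what makes the probabilistic part safe to run unattended.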
I Built an AI Chrome Extension That Fills Job Forms Automatically
Tired of refilling the same job application forms? I built an AI Chrome Extension that reads the page,
understands the context, and fills fields automatically — tailored to each job description.
When Junior College Vector Math Saved My Real-World AI Project
The dot product and cosine similarity from 11th-grade math became essential for production RAG on AWS OpenSearch.
Semantic search IS linear algebra — and the moment I realized that led to the ~99% token reduction achievement.
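That connection fits in a few lines. Here is the 11th-grade formula applied to toy embedding vectors (the vectors are made up for illustration; a real system would get them from an embedding model):

```python
import math

def cosine_similarity(a, b):
    # Dot product over the product of magnitudes: the classroom formula.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": semantically similar texts point in similar directions.
query = [0.9, 0.1, 0.3]
doc_close = [0.8, 0.2, 0.4]
doc_far = [0.1, 0.9, 0.0]

print(cosine_similarity(query, doc_close) > cosine_similarity(query, doc_far))  # True
```

A vector store like OpenSearch ranks documents by exactly this kind of score, just over thousands of high-dimensional vectors at once.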
One Prompt Mistake Cost Me Hours: The Empty List Debugging Story
Sending an empty list in my prompt caused the LLM to return empty lists every time. Hours of debugging,
thinking it was a model issue. Lesson: LLMs follow examples literally. Always validate inputs before LLM calls.
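The lesson generalizes to a simple pre-flight check. This is a hypothetical sketch (the helper and its rules are mine, not the original code): fail fast before spending tokens, because the model will happily mirror an empty example back at you.

```python
def validate_prompt_inputs(examples):
    """Fail fast before the LLM call instead of debugging its echo later."""
    if not examples:
        raise ValueError("Refusing to prompt with an empty example list: "
                         "the model will mirror it and return empty output.")
    if any(not ex for ex in examples):
        raise ValueError("One of the few-shot examples is empty.")
    return examples

# The guard runs before any tokens are spent:
try:
    validate_prompt_inputs([])
except ValueError as e:
    print(f"Caught before the LLM call: {e}")
```

An explicit ValueError at the call site is hours cheaper than inferring the same bug from a model that keeps returning `[]`.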
XGBoost + SHAP: Building an Explainable House Price Predictor
Not a black-box model — users see exactly which features drive the predicted price. Full breakdown of
feature engineering, XGBoost tuning, SHAP value visualization, and serving predictions via Flask REST API.
From hackathon winner to production system — AWS ECS Fargate, OpenSearch vector search, Amazon Bedrock LLMs.
The complete journey of taking a prototype and deploying it as a real-world AI service that handles semantic resume matching.
JARVIS 2.0 — Building an Autonomous Multi-Agent AI System
Evolving from a simple voice assistant to a full autonomous multi-agent system. Supervisor agent architecture,
specialized sub-agents, LLM tool calling — the exact pattern I now use in production at Idyllic Services.
How I Replaced Fragile If-Else Chains with a Supervisor Agent
Brittle if-else routing in agentic AI breaks every time you add a new intent. I replaced the whole thing
with a supervisor agent pattern — achieving ~99% token reduction and ~10× throughput at Idyllic Services.
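As an illustration of the pattern (a toy sketch, not Idyllic's implementation): instead of a growing if-else chain, a supervisor maps intents to registered sub-agents, so adding a new intent is one registration rather than another branch.

```python
class Supervisor:
    """Toy supervisor: routes requests to specialized sub-agents by intent.
    In production the intent would come from an LLM classification call;
    here it's passed in directly to keep the sketch self-contained."""

    def __init__(self):
        self._agents = {}

    def register(self, intent, agent):
        # Adding an intent is one line here, not a new elif branch.
        self._agents[intent] = agent

    def route(self, intent, payload):
        agent = self._agents.get(intent)
        if agent is None:
            raise KeyError(f"No agent registered for intent {intent!r}")
        return agent(payload)

supervisor = Supervisor()
supervisor.register("search", lambda p: f"searching for {p}")
supervisor.register("booking", lambda p: f"booking {p}")

print(supervisor.route("search", "python jobs"))  # searching for python jobs
```

The token win comes from the same shape: each sub-agent carries only its own narrow prompt and tools, instead of one monolithic prompt describing every possible branch.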