I find signal
in noisy systems

I'm Akaash. I build AI systems that make sense of messy, real-world data — turning chaotic procurement workflows into reusable infrastructure across 50+ consulting engagements at Kearney. I like creating things that cut through noise and surface what actually matters.

Approach

How I think about problems

01

Repetitive chaos becomes reusable infrastructure

Most teams solve the same messy data problem over and over — different client, same fire drill. I build systems that absorb that chaos once and turn it into something teams can run themselves. Harmonization, categorization, cleanup — not as one-off scripts, but as services anyone can trigger.

02

Institutional knowledge should be queryable

The best information in any organization is trapped in spreadsheets, old decks, and people's heads. I build agents that make that knowledge searchable and usable — supplier intelligence from 1.5M records, benchmarking from curated datasets — with evidence you can actually trace back.
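The "evidence you can trace back" idea can be sketched in a few lines: a toy keyword search over a hypothetical mini-corpus (the document IDs and snippets below are invented for illustration), where every answer carries a pointer back to its source rather than a bare claim:

```python
# Hypothetical mini-corpus: each record keeps its source ID so answers stay traceable.
DOCS = [
    {"id": "deck-2021-q3.pptx#s12", "text": "Supplier Acme holds a 3-year logistics contract"},
    {"id": "benchmarks.xlsx#row88", "text": "Median logistics rate benchmark is 4.2 percent"},
]

def search(query: str, docs=DOCS):
    """Toy keyword retrieval: score by term overlap, return evidence with its source."""
    terms = set(query.lower().split())
    scored = []
    for d in docs:
        overlap = len(terms & set(d["text"].lower().split()))
        if overlap:
            scored.append((overlap, d))
    scored.sort(key=lambda t: -t[0])  # best-matching evidence first
    return [{"evidence": d["text"], "source": d["id"]} for _, d in scored]

results = search("logistics benchmark")
```

The real systems use embeddings and agents rather than word overlap, but the contract is the same: no answer without a source attached.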

03

Build on real data, not slide decks

I embed with client teams and build on their actual data and workflows. PoCs aren't demos — they're decision tools that make problems visible. When something works, it turns into a real workstream, not a one-off win that gets forgotten.

Selected Work

Signal found, impact delivered

$1B+

addressable spend unlocked

Finding $1B+ in hidden spend across 50 companies

A PE firm needed to find sourcing opportunities across ~50 portfolio companies, but the spend data was a mess — inconsistent schemas, different naming conventions, nothing joinable. I proved the approach on one company first using semantic clustering and an LLM-as-judge validation step, then scaled it portfolio-wide. The result: $1B+ in previously invisible addressable spend, and a reusable harmonization layer the team keeps using for new portfolio additions.

Start small, prove it works, then scale — and make sure the method outlives the project.
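The core pattern here — cluster messy labels, then have a judge validate each proposed merge — can be sketched roughly as below. This is a toy illustration, not the production system: string similarity stands in for embedding similarity, a trivial heuristic stands in for the LLM judge, and all category names are invented:

```python
from difflib import SequenceMatcher

# Hypothetical spend-category labels from two portfolio companies' files.
RAW_CATEGORIES = [
    "IT - Software Licenses",
    "Software licences (IT)",
    "Temp Labor",
    "Temporary Labour",
    "Office Supplies",
]

def similarity(a: str, b: str) -> float:
    """Stand-in for embedding cosine similarity."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def cluster(labels, threshold=0.6):
    """Greedy single-link clustering: join a label to the first cluster it matches."""
    clusters = []
    for label in labels:
        for c in clusters:
            if any(similarity(label, member) >= threshold for member in c):
                c.append(label)
                break
        else:
            clusters.append([label])
    return clusters

def judge_merge(members):
    """LLM-as-judge placeholder: in production an LLM is asked whether the grouped
    labels truly name the same category and returns a verdict with a rationale.
    Here a trivial heuristic stands in: only multi-member merges need validation."""
    return len(members) > 1

clusters = cluster(RAW_CATEGORIES)
validated = [c for c in clusters if judge_merge(c)]
```

The validation step is what makes the harmonization trustworthy at scale: clustering proposes, the judge disposes, and singletons pass through untouched.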

1.5M

supplier records centralized

Turning 1.5M scattered supplier records into a queryable asset

Every consulting team was rebuilding supplier normalization from scratch — same cleanup, different project, zero reuse. I reframed it as a data product problem: centralize the cleaned annotations, map everything to a shared taxonomy, and build an ingestion flow that handles messy spreadsheet dumps with alias resolution. Now ~1.5M suppliers live in one compounding intelligence layer that every team draws from.

The best systems compound — every project that uses them makes them better for the next one.
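The alias-resolution step in an ingestion flow like this can be sketched as follows — a toy version with an invented alias table and suffix list, where unresolved names feed a review queue whose output grows the table over time:

```python
import re

# Hypothetical alias table: normalized alias -> canonical supplier ID.
ALIASES = {
    "acme": "SUP-001",
    "acme corporation": "SUP-001",
    "globex": "SUP-002",
}

# Trailing legal suffixes to strip before lookup (illustrative list).
_SUFFIX = re.compile(r"\s+(corporation|corp|inc|llc|ltd|gmbh|co)\.?$")

def normalize(raw: str) -> str:
    """Build a canonical lookup key: lowercase, drop punctuation and legal suffixes."""
    key = re.sub(r"[^\w\s]", " ", raw.lower())
    key = re.sub(r"\s+", " ", key).strip()
    return _SUFFIX.sub("", key)

def resolve(raw: str, aliases=ALIASES):
    """Map a messy supplier string to a canonical ID, or flag it for review."""
    key = normalize(raw)
    if key in aliases:
        return aliases[key], None
    return None, key  # unresolved: routed to a review queue, then back into the table

resolved, pending = resolve("ACME Corp.")  # resolves to "SUP-001"
```

Every resolved alias makes the next messy spreadsheet dump cheaper to ingest — which is the compounding effect described above.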

~$500K

annual efficiency lever

A PoC that uncovered ~$500K in process waste

At a ~$100B retailer, purchase requisition (PR) review logic was scattered across teams and tables — a manual process nobody had questioned. I built a PoC that automated the review checklists on their real data, but the bigger discovery was what the automation exposed: process gaps that were invisible when everything was manual. The PoC triggered a dedicated redesign workstream and surfaced ~$500K in annual efficiency gains.

Sometimes the value of building something isn't the thing itself — it's what becomes visible once you build it.
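That kind of checklist automation can be sketched minimally — the PR records and rules below are invented for illustration, but the point carries: once the checks are encoded, every exception, and every gap in the process, becomes explicit instead of buried in someone's manual routine:

```python
# Hypothetical purchase-requisition records pulled from scattered tables.
PRS = [
    {"id": "PR-1001", "amount": 12_000, "approver": "j.doe", "po_ref": None},
    {"id": "PR-1002", "amount": 800, "approver": None, "po_ref": "PO-77"},
]

# Review checklist as named rules; each returns True when a PR needs attention.
CHECKS = [
    ("missing approver", lambda pr: pr["approver"] is None),
    ("no linked PO", lambda pr: pr["po_ref"] is None),
    ("over 10k without PO", lambda pr: pr["amount"] > 10_000 and pr["po_ref"] is None),
]

def review(prs=PRS):
    """Run every rule against every PR; return the exceptions a human would chase."""
    findings = []
    for pr in prs:
        for name, rule in CHECKS:
            if rule(pr):
                findings.append((pr["id"], name))
    return findings
```

Running the checklist over real data is what surfaced the gaps: the exception list is not just a work queue, it is a map of where the process itself is broken.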