Ontologies: What They Are, Why They Matter Now
I first understood the word "ontology" in theory in June 2024. It took another quarter before I saw it used in a POC. Only in Q1 2025, when we actively started pushing agents to production, did my understanding become firm, thanks to actual implementations.
Context graphs have made the concept of ontology wildly popular. Outside of Palantir and knowledge management circles, it's not a commonly used term in new-age AI communities. Ontologist has been a job title for a while, though, and regulated industries like pharma and healthcare have dedicated knowledge management teams.
The question really is — what's an ontology, why does it matter now, and how should you think about using it?
What's an Ontology?
Non-ELI5 version: a formal, machine-readable model of the concepts, attributes, relationships, and axioms within a domain, enabling shared understanding and reasoning. Sounds dense. Let's break it down.
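To make that definition less abstract, here is a minimal sketch of those four building blocks in plain Python. The class names and the Metformin example are mine, chosen for illustration; this is not any standard ontology library's API.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Concept:
    """A thing in the domain, e.g. a drug or a condition."""
    name: str
    attributes: tuple = ()  # properties instances of this concept can carry

@dataclass(frozen=True)
class Relationship:
    """A typed, directed edge between two concepts."""
    source: str
    relation: str  # e.g. "treats", "contraindicated_in"
    target: str

@dataclass
class Ontology:
    concepts: dict = field(default_factory=dict)
    relationships: list = field(default_factory=list)
    axioms: list = field(default_factory=list)  # rules, e.g. "anything that treats a Condition is a Drug"

    def add_concept(self, concept: Concept):
        self.concepts[concept.name] = concept

    def relate(self, source: str, relation: str, target: str):
        self.relationships.append(Relationship(source, relation, target))

# A two-concept toy domain.
onto = Ontology()
onto.add_concept(Concept("Metformin", attributes=("dosage_mg",)))
onto.add_concept(Concept("Type2Diabetes"))
onto.relate("Metformin", "treats", "Type2Diabetes")
```

Real implementations use standards like OWL or RDF rather than ad-hoc classes, but the four layers — concepts, attributes, relationships, axioms — are the same.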
Anatomy of an Ontology
Six building blocks — each adds a layer of machine-readable intelligence
Interactive Explainer
Built a quick interactive tool to show this instead of just explaining it. Try the Ontology vs Taxonomy Explainer →
Ontologies vs Taxonomies
Taxonomy (hierarchical categorization) is the far better understood concept. Think of the category menu on Amazon: how products are classified and organized. An ontology also captures the relationships between those concepts. As code.
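The difference is easy to show in code. A taxonomy is a pure parent-child tree; an ontology keeps that tree but adds typed relationships between concepts. The categories and relations below are invented for illustration.

```python
# Taxonomy: hierarchical categorization only (parent -> children).
taxonomy = {
    "Health": ["Medications", "Conditions"],
    "Medications": ["Metformin"],
    "Conditions": ["Type 2 Diabetes"],
}

# Ontology: the same concepts, plus typed edges carrying domain meaning.
ontology_edges = [
    ("Metformin", "is_a", "Medications"),           # the taxonomy is still in there...
    ("Metformin", "treats", "Type 2 Diabetes"),     # ...but so are cross-branch relationships
    ("Metformin", "member_of_class", "Biguanides"),
]

def relations_of(term):
    """Everything the ontology asserts about a term; a taxonomy can only answer 'is_a'."""
    return [(rel, target) for source, rel, target in ontology_edges if source == term]
```

Asking the taxonomy about Metformin tells you where it sits in the menu. Asking the ontology tells you what it does.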
Why Are They Popular Now?
Simply put — human teams route around incomplete information all the time. We know which concept means what in context without it being explicit. We fill gaps with judgment. AI agents don't have that. They take documents and words literally. They don't know that Metformin treats diabetes — they just know those words appear near each other in training data. That's correlation, not meaning.
Same Word, Different Meaning
Fig 1. "Discharge" — Patient leaving the hospital after treatment. A discharge summary documents medications, follow-up appointments, and care instructions.
"Exposure" — Contact with a pathogen or harmful substance. Requires isolation protocols and notification chains.
"Pipeline" — Clinical research: the sequence of drug candidates from discovery through FDA approval.
Ontologies give AI the relationship structure that humans carry in their heads. That's why everyone's suddenly talking about them.
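One way an ontology supplies that structure is by binding a surface word to a concept only in context. A minimal sketch; the sense table and concept names are hypothetical, invented for illustration.

```python
# Hypothetical sense table: (term, context) -> ontology concept.
SENSES = {
    ("discharge", "clinical"):   "PatientDischargeEvent",
    ("discharge", "facilities"): "EffluentDischarge",
    ("pipeline",  "research"):   "DrugDevelopmentPipeline",
    ("pipeline",  "data_eng"):   "ETLPipeline",
}

def resolve(term, context):
    """Map a surface word to a concept given the working context; None if unknown."""
    return SENSES.get((term.lower(), context))
```

An agent grounded this way reasons about `PatientDischargeEvent`, a node with defined relationships, instead of the ambiguous string "discharge".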
Sounds Great. What's the Catch?
Most teams never get to ontologies. The ETL pipelines are built. The pilot has already shipped. Prompt iterations are ongoing to fix hallucinations. The instinct is to bolt the ontology on at the end, layered on top of your existing data. That's backwards.
Don't LLMs Already Know This Stuff?
Sort of. But not in the way you need. LLMs learn patterns. They know that "Metformin" and "diabetes" frequently appear together. That's statistical correlation. It's not the same as knowing that Metformin treats Type 2 Diabetes, that it's contraindicated in patients with kidney failure, that it belongs to the biguanide drug class.
An LLM might get it right. It might not. Ask the same question differently and you might get a different answer. No guarantee of consistency, no logical structure underneath.
Using ontologies isn't about training LLMs. It's about extending a formal structure to LLMs for reasoning.
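To make the contrast concrete, here is the kind of deterministic check a formal structure permits and co-occurrence statistics do not. The triples and the patient conditions are invented for illustration.

```python
# Explicit, typed facts instead of word co-occurrence.
TRIPLES = {
    ("Metformin", "treats", "Type2Diabetes"),
    ("Metformin", "contraindicated_in", "KidneyFailure"),
    ("Metformin", "member_of", "Biguanides"),
}

def safe_to_prescribe(drug, patient_conditions):
    """Deterministic rule: never suggest a drug contraindicated for any of the patient's conditions."""
    return not any(
        (drug, "contraindicated_in", condition) in TRIPLES
        for condition in patient_conditions
    )

safe_to_prescribe("Metformin", {"Type2Diabetes"})                   # True
safe_to_prescribe("Metformin", {"Type2Diabetes", "KidneyFailure"})  # False
```

Ask this check the same question a thousand times and you get the same answer, because it reads explicit facts rather than sampling from learned correlations.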
Correlation vs Reasoning
Fig 2. "Metformin" and "diabetes" frequently appear together in training data. Statistical co-occurrence. High probability of association.
Ask differently and you might get a different answer. No guarantee of consistency. No logical structure underneath.
Where Should You Start?
Short answer — start with the domain ontology. Not your enterprise data.
Enterprise data is dynamic: it evolves, new knowledge gets added, older information is sunsetted. But the underlying meaning doesn't change. The domain ontology captures relationships that exist in your field regardless of your specific company. Then you introduce your enterprise data on top of that foundation.
Most teams hit this wall and try to fix it with more data. More documents. More context. Longer prompts. Unfortunately, that's the wrong fix. The foundation needs the ontology. And the ontology needs to come first, not last.
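In code terms, the layering might look like this: the domain ontology is a stable schema of classes and relations, and your enterprise records are instances validated against it. All class, field, and record names below are illustrative assumptions, not a real system.

```python
# Stable domain layer: changes rarely, and holds regardless of which company uses it.
DOMAIN = {
    "classes": {"Drug", "Condition"},
    "relations": {("Drug", "treats", "Condition")},
}

# Dynamic enterprise layer: your own data, mapped onto the domain classes.
enterprise_records = [
    {"id": "SKU-1041", "class": "Drug", "label": "Metformin 500mg"},
    {"id": "DX-201", "class": "Condition", "label": "Type 2 Diabetes"},
]

def validate(record):
    """Reject enterprise data that doesn't fit a class the domain ontology defines."""
    return record["class"] in DOMAIN["classes"]

all(validate(r) for r in enterprise_records)  # True: every record grounds in the domain
```

The enterprise layer can churn daily; the domain layer underneath gives every new record a fixed meaning to attach to.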
Go Deeper
For a deeper look at how ontologies enable and govern reasoning, read the Neuro-Symbolic AI Practitioner's Taxonomy →

Vivek Khandelwal
2X founder who has built multiple companies over the last 15 years. He bootstrapped iZooto to multi-millions in revenue. He graduated from IIT Bombay and has deep experience across product marketing and GTM strategy. He mentors early-stage startups at Upekkha and SaaSBoomi's SGx program. At CogniSwitch, he leads all things marketing, business development, and partnerships.