Quick Summary
- What: API Doc Lie Detector automatically tests your API documentation against the live API to find discrepancies, outdated endpoints, and incorrect response formats.
The Great API Documentation Charade
Let's be honest: most API documentation is written by people who either A) haven't touched the actual API since the initial commit, B) are writing what they wish the API did, or C) are actively trying to gaslight developers into thinking their integration problems are user error. The result? Documentation that's about as reliable as a weather forecast from a psychic octopus.
Consider the classic scenario: you're integrating with a payment API. The docs proudly proclaim "easy 5-minute setup!" Three hours later, you're deep in Stack Overflow threads trying to figure out why the "optional" currency field is actually required, why the example response shows a neat JSON object but you're getting XML soup, and why the "GET /transactions" endpoint returns a 404 with the helpful message "Resource not found" (spoiler: it's actually at /api/v1/transactions, not /v2/transactions like the docs say).
This isn't just annoying—it's expensive. Developers waste countless hours playing "documentation detective," trying to reverse-engineer what the API actually does from error messages and trial-and-error. It's like being given a treasure map where X marks a spot that's actually a sewage treatment plant, and the compass points south-north-west.
From Fiction to Fact-Checking
I built API Doc Lie Detector after my third consecutive day of debugging an integration where the documentation was so wrong it should have come with a "for entertainment purposes only" disclaimer. The premise is beautifully simple: if documentation is supposed to describe reality, why not test it against reality?
The tool works by taking your API documentation (OpenAPI/Swagger specs work beautifully) and actually hitting the endpoints it describes. It compares what the documentation says will happen with what actually happens. Does the endpoint exist? Does it return the status codes promised? Does the response structure match the examples? Are the parameters actually accepted?
What makes this genuinely useful (beyond the cathartic pleasure of catching documentation in a lie) is that it turns subjective frustration into objective data. Instead of "this documentation sucks," you get "this documentation has a 42% truthiness score with 18 discrepancies across 7 endpoints." Suddenly, you have something concrete to show your PM or the API team: "Here's exactly where our documentation doesn't match reality."
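To make the idea concrete, here's a minimal sketch of how a percentage-style truthiness score could be computed from per-endpoint results. This is an illustration under assumptions, not the tool's actual scoring algorithm (see the GitHub repo for that); the `endpointResults` shape and equal weighting per endpoint are mine.

```javascript
// Hypothetical sketch: turn per-endpoint results shaped like
// { endpoint, lies: [...] } into an overall truthiness percentage.
// Equal weighting per endpoint is an assumption for illustration.
function truthinessScore(endpointResults) {
  if (endpointResults.length === 0) return 100;
  const truthful = endpointResults.filter(r => r.lies.length === 0).length;
  return Math.round((truthful / endpointResults.length) * 100);
}

const results = [
  { endpoint: "/users", lies: [] },
  { endpoint: "/orders", lies: ["Response schema doesn't match documentation"] },
];
console.log(truthinessScore(results)); // 50
```

One truthful endpoint out of two yields a 50% score, which is exactly the kind of concrete number you can put in front of a PM.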
How to Stop Trusting and Start Verifying
Getting started is easier than explaining to your manager why the integration is delayed because the API docs were fiction. First, install it:
npm install -g api-doc-lie-detector
# or
pip install api-doc-lie-detector
Then point it at your OpenAPI spec and your API base URL:
lie-detect --spec ./openapi.yaml --base-url https://api.example.com
Here's a snippet from the core verification logic that shows how beautifully simple the concept is:
async function verifyEndpoint(endpoint, baseUrl) {
  const url = `${baseUrl}${endpoint.path}`;
  const response = await fetch(url, {
    method: endpoint.method,
    headers: endpoint.headers
  });

  const lies = [];

  // Check whether the endpoint actually exists. Bail out early:
  // a 404 body usually isn't JSON, so don't try to parse it.
  if (response.status === 404) {
    lies.push(`Endpoint ${endpoint.path} returns 404 (does not exist)`);
    return { endpoint, lies, truthiness: false };
  }

  // Validate the actual response structure against the documented schema
  const actualSchema = inferSchema(await response.json());
  if (!schemasMatch(actualSchema, endpoint.expectedSchema)) {
    lies.push(`Response schema doesn't match documentation`);
  }

  return { endpoint, lies, truthiness: lies.length === 0 };
}
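The snippet above leans on two helpers, `inferSchema` and `schemasMatch`. Here's one plausible minimal version of each, assuming schemas are represented as flat objects mapping field names to type strings; the real implementation on GitHub is presumably richer (nested objects, arrays of objects, formats), so treat this as a sketch of the idea only.

```javascript
// Infer a flat { fieldName: typeName } schema from a JSON response body.
function inferSchema(body) {
  const schema = {};
  for (const [key, value] of Object.entries(body)) {
    schema[key] = Array.isArray(value) ? "array" : typeof value;
  }
  return schema;
}

// Check that every documented field exists in the response with the
// documented type. Extra undocumented fields are tolerated here.
function schemasMatch(actual, expected) {
  return Object.entries(expected).every(
    ([key, type]) => actual[key] === type
  );
}

const actual = inferSchema({ id: 42, name: "Ada", tags: ["x"] });
console.log(schemasMatch(actual, { id: "number", name: "string" })); // true
console.log(schemasMatch(actual, { id: "string" })); // false
```

The design choice worth noting: tolerating extra fields keeps the check focused on what the docs promised, rather than flagging every harmless addition as a lie.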
Check out the full source code on GitHub for the complete implementation, including the truthiness scoring algorithm and diff report generation.
Features That Call BS on Your Docs
- Automated Endpoint Testing: Actually hits every documented endpoint with appropriate HTTP methods and validates responses. No more "trust me bro" documentation.
- Schema Validation: Compares actual response structures against documented examples and schemas. Catches those "oops, we changed the field name but forgot to update the docs" moments.
- Truthiness Scoring: Generates a percentage score showing how accurate your documentation actually is. Perfect for shaming teams into better practices (or at least providing objective metrics).
- Diff Reports: Creates detailed reports showing exactly where documentation diverges from reality, complete with before/after comparisons and specific discrepancies.
- Parameter Validation: Tests whether documented parameters are actually accepted, whether required parameters are truly required, and whether optional parameters are actually optional.
- Authentication Testing: Verifies that documented authentication methods actually work (because nothing's worse than auth docs that lead to perpetual 401s).
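The required-parameter check from the list above is conceptually simple: call the endpoint without the supposedly required parameter and see whether the API objects. Here's a hedged sketch of that probe; the function name, options shape, and injected `fetchFn` are my inventions for illustration (and so the logic can be exercised without a live API), not the tool's actual code.

```javascript
// Hypothetical probe: call the endpoint WITHOUT a documented-required
// parameter. A 4xx means the docs told the truth; a success means they lied.
// `fetchFn` is injected so the probe can be tested against a stub.
async function probeRequiredParam({ fetchFn, baseUrl, path, paramName, otherParams = {} }) {
  const qs = new URLSearchParams(otherParams).toString();
  const url = `${baseUrl}${path}${qs ? `?${qs}` : ""}`;
  const response = await fetchFn(url);
  if (response.status >= 400 && response.status < 500) {
    return null; // rejected as documented: "required" was the truth
  }
  return `"${paramName}" is documented as required, but ${path} accepted a request without it`;
}

// Usage against a stubbed API that genuinely requires the parameter:
const strictApi = async () => ({ status: 400 });
probeRequiredParam({
  fetchFn: strictApi,
  baseUrl: "https://api.example.com",
  path: "/transactions",
  paramName: "currency"
}).then(lie => console.log(lie)); // logs null: the docs were honest
```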
The End of Documentation Wishful Thinking
API Doc Lie Detector won't write your documentation for you, but it will tell you when your documentation is lying to you. In a world where APIs are the glue between services, accurate documentation isn't a nice-to-have—it's table stakes. This tool turns the subjective art of "bad documentation" into the objective science of "documentation with 37 specific inaccuracies."
The real magic happens when you integrate this into your CI/CD pipeline. Imagine: every time your API changes, your documentation gets automatically fact-checked. No more drift between what you ship and what you document. No more developers wasting hours debugging integration issues that stem from outdated docs. Just beautiful, verified, accurate documentation that actually helps people use your API.
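As a sketch, a CI job might look something like the following GitHub Actions workflow. This is hypothetical: it assumes the `lie-detect` CLI from the install section exits non-zero when discrepancies are found (so the job fails on drift), and `STAGING_API_URL` is a placeholder secret you'd configure yourself. Check the project's README for the real recommended setup.

```yaml
# Hypothetical CI job: fact-check the docs on every push.
# Assumes `lie-detect` exits non-zero when discrepancies are found.
name: fact-check-docs
on: [push]
jobs:
  lie-detect:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm install -g api-doc-lie-detector
      - run: lie-detect --spec ./openapi.yaml --base-url ${{ secrets.STAGING_API_URL }}
```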
Try it out: https://github.com/BoopyCode/api-doc-lie-detector. Your future self (and the developers integrating with your API) will thank you. And if your documentation scores below 50%... well, maybe don't show that part to your boss just yet.
Because in the battle between documented fiction and API reality, it's time reality had some automated backup.