If documentation is part of your product, what kind of experience should you provide? And how should docs adapt when your primary audience includes both humans and LLMs?
Hello, I’m Allen, technical documentation engineer at Dify. Starting in April 2025, I led a comprehensive migration of Dify’s documentation from GitBook to Mintlify. This wasn’t just a platform switch—it was a complete infrastructure rebuild designed for the AI era, where both developers and LLMs need to quickly understand and interact with our platform. Here’s what changed. At first glance, the visual differences might seem subtle—but the impact runs much deeper.
Before (GitBook): the original documentation interface
After (Mintlify): the current documentation interface

Why Documentation Infrastructure Matters in the AI Era

The real question isn’t whether the migration was worth it—it’s whether you’re willing to optimize your docs for AI consumption. Because here’s the thing: how knowledge systems organize information is fundamentally changing. Before AI, all software documentation was designed exclusively for humans. Layout mattered. Progressive disclosure mattered. Visual hierarchy mattered. Even SEO required careful attention to HTML structure, metadata, and sitemaps to help search engines surface your content. But LLMs don’t care about elegant typography or beautiful layouts. They care about comprehensive, well-structured content that provides rich context. While humans scan for visual cues, AI models parse for semantic meaning and completeness.

How Information Discovery Is Evolving

Think about when users need documentation: when they hit a problem. The traditional path looked like this:
Traditional Discovery: User encounters issue → Searches Google → Finds documentation page → Reads through content → Implements solution
This created three primary user flows:
  1. Finding content through search engines
  2. Browsing documentation hierarchy
  3. Using in-doc search functionality
Stripe’s documentation arguably perfected this approach. Any team member can quickly find what they need, and the docs even generate personalized test code for logged-in users.
Stripe documentation example
In the LLM era, the path has shifted:
AI-Powered Discovery: User asks LLM → LLM provides solution → User gives feedback → LLM offers detailed guidance
Users no longer need to piece together information from multiple pages—they get instant, contextual answers.

The Rise of Generative Engine Optimization

Large Language Models are transforming SEO. This has created an entirely new field: Generative Engine Optimization (GEO). Research predicts LLM-driven search traffic will jump from 0.25% of all searches in 2024 to 10% by the end of 2025.
Research source: AI SEO Study 2024
For technical documentation, this shift will revolutionize how customers discover information and succeed with products. What content will LLMs prioritize? The patterns they’ve learned from processing vast amounts of internet content. As LLMs become sophisticated content consumers—and eventually develop Agent capabilities to interact with the physical world—the content produced by technical writers will significantly shape how AI understands and recommends products. To be truly LLM-friendly, technical documentation needs one critical capability: automatically generating llms.txt files.
llms.txt is a structured Markdown file that summarizes your most important content in a format optimized for LLM consumption, free from HTML clutter, JavaScript, or ads.
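The format itself is simple. Here is a minimal sketch of what an llms.txt for our docs might look like; the page titles and URLs are illustrative, not the actual generated file:

```markdown
# Dify

> Dify is an open-source platform for developing and operating LLM applications.

## Documentation

- [Introduction](https://docs.dify.ai/en/introduction): what Dify is and what it can do (illustrative link)
- [Workflow](https://docs.dify.ai/en/workflow): building applications with the visual workflow editor (illustrative link)

## API Reference

- [Chat Messages](https://docs.dify.ai/en/api/chat-messages): sending conversation messages via the API (illustrative link)
```

Following the llms.txt proposal, the file opens with an H1 project name and a one-line blockquote summary, then lists curated links by section, so an LLM can pull a clean, HTML-free index of the docs in a single request.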
Beyond LLM compatibility, developer tool documentation should naturally provide an excellent interactive experience. API documentation should support live testing and debugging. This established our migration goals:
  1. LLM-Optimized Content Make our product more likely to surface in AI responses across search engines and intelligent assistants, improving overall discoverability.
  2. Complete API Documentation Provide comprehensive API docs with live debugging capabilities, helping developers understand integration methods quickly and boosting adoption.
  3. AI-Powered Documentation Chat Enable natural language queries about our documentation, reducing support burden while improving user experience.

In short:
  • LLM-Optimized: Ensure documentation works well with AI models
  • API-First: Complete API docs with debugging functionality
  • AI Chat Ready: Support conversational documentation experiences

Research Before Rebuild

1. Identifying GitBook’s Limitations

Dify’s documentation had grown with GitBook since our open-source launch. But as we scaled, fundamental problems became impossible to ignore. Here’s the straw that broke the camel’s back:
The root issue: GitBook targets business users without programming skills, prioritizing GUI-based editing over docs-as-code workflows. It supports IDE integration, but there is no way to preview the rendered output locally.
GitBook isn’t built for docs-as-code.
With over 600 articles and complex internal linking, we needed a platform designed for scale and sustainability.

2. Comprehensive Platform Evaluation

I believe documentation is part of the product. Any major infrastructure change requires thorough evaluation, especially during AI transformation. I researched and tested every major documentation platform, creating this comparison for our team:
| Feature | GitBook | Mintlify | Starlight | Docusaurus | Nextra |
|---|---|---|---|---|---|
| ⭐️ Local preview | ❌ | ✅ | ✅ | ✅ | ✅ |
| ⭐️ Version control | ✅ (with plugins) | ✅ | ✅ | ✅ | ✅ |
| ⭐️ Multi-language | ✅ | ✅ | ✅ | ✅ | ✅ |
| ⭐️ API documentation | ✅ (buggy) | ✅ | ✅ (needs integration) | ✅ (needs plugins) | ✅ (custom) |
| Visual editing | ✅ | ✅ | ❌ | ❌ | ❌ |
| Cost | Free / $79+ | Free / $150/month | Open source | Open source | Open source |
| Who uses it | NordVPN, Raycast | Cursor, Anthropic, Perplexity | Astro projects | Various industries | Open source projects |
After testing, I narrowed it down to Mintlify and Docusaurus.
  • Mintlify excelled in visual design and API auto-generation, with commercial-grade polish out of the box.
Mintlify interface example
  • Docusaurus offered strong extensibility but required React development skills for customization.
Docusaurus interface example
Considering maintenance overhead, out-of-the-box functionality, and AI integration potential, Mintlify won. Their blog also provides excellent insights on AI-era documentation strategy.
Mintlify blog insights

Execution: Tackling Migration Challenges

1. Team Alignment and Learning

After choosing Mintlify, I immediately set up a demo environment for our technical writing team. Through multiple presentations and hands-on sessions, I built team consensus around the new platform. The migration became an intensive learning experience. I thoroughly studied Mintlify’s documentation (which showcased their technical writing expertise) to understand all supported functions, component styles, and content organization methods. One key difference: Mintlify uses .mdx instead of traditional .md files. This format combines Markdown simplicity with JSX capabilities, enabling richer interactions and component usage. With AI assistance, format conversion became manageable rather than a barrier. What mattered more was future extensibility and intelligent tool integration.
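To make the difference concrete, here is a small, hypothetical .mdx page sketch; the frontmatter fields and the Tip callout follow common Mintlify usage, while the page content itself is only illustrative:

```mdx
---
title: "Getting Started"
description: "A hypothetical page used only to illustrate the .mdx format"
---

Regular Markdown still works here: headings, **bold text**, lists, and links.

{/* JSX-style components can be mixed directly into the prose */}
<Tip>
  Callouts like this render as styled components instead of plain text.
</Tip>
```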
GitBook syntax example
Mintlify syntax example

2. Solving Core Infrastructure Problems

As I dove deeper into the migration, I identified several critical issues that needed systematic solutions.

Problem 1: Image Management Chaos

Our original system lacked unified image hosting. Images were scattered throughout the repository with inconsistent naming and no CDN acceleration. GitBook’s private hosting also caused loading delays across different regions.
Solution: Automated Image Infrastructure
I built a Pico + S3 image hosting service with:
  • Automatic standardized naming to prevent conflicts
  • Separation from the main code repository
  • Global CDN acceleration for fast loading
  • Streamlined upload experience for writers
Internal image hosting interface
For legacy images, I had an intern write automated migration scripts, solving what could have been the most tedious part of the migration.
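A minimal sketch of that kind of migration script might look like the following: scan the Markdown files for local image references, upload each file to S3 under a standardized name, and rewrite the reference to the CDN URL. The bucket name, CDN domain, and naming scheme here are placeholders, not our real configuration:

```python
import hashlib
import re
from pathlib import Path

import boto3  # AWS SDK; assumes credentials are configured in the environment

BUCKET = "example-docs-assets"           # placeholder bucket name
CDN_BASE = "https://assets.example.com"  # placeholder CDN domain
IMAGE_PATTERN = re.compile(r"!\[([^\]]*)\]\((?!https?://)([^)]+)\)")  # local images only

s3 = boto3.client("s3")

def migrate_images(doc_path: Path) -> None:
    """Upload every locally referenced image and rewrite the link in place."""
    text = doc_path.read_text(encoding="utf-8")

    def replace(match: re.Match) -> str:
        alt, rel_path = match.group(1), match.group(2)
        image_file = (doc_path.parent / rel_path).resolve()
        if not image_file.exists():
            return match.group(0)  # leave broken references for manual review
        # Standardized name: content hash plus original suffix, so duplicates collapse
        digest = hashlib.sha256(image_file.read_bytes()).hexdigest()[:16]
        key = f"docs/{digest}{image_file.suffix}"
        s3.upload_file(str(image_file), BUCKET, key)
        return f"![{alt}]({CDN_BASE}/{key})"

    doc_path.write_text(IMAGE_PATTERN.sub(replace, text), encoding="utf-8")

for md_file in Path("docs").rglob("*.md*"):  # covers .md and .mdx
    migrate_images(md_file)
```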

Problem 2: Inconsistent Syntax Formatting

GitBook’s syntax was unintuitive and error-prone. Compare these approaches for creating an info callout. In GitBook:

```
{% hint style="info" %}
The name Dify comes from **D**o **I**t **F**or **Y**ou.
{% endhint %}
```
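The same callout in Mintlify is just an MDX component; using Mintlify’s standard Info callout, it looks roughly like this:

```mdx
<Info>
  The name Dify comes from **D**o **I**t **F**or **Y**ou.
</Info>
```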
GitBook’s {% %} syntax is not only unreadable but breaks in most Markdown environments.
Solution: AI-Powered Conversion
I built a Dify application to automatically convert documentation syntax. By providing syntax correspondence rules as context, the AI accurately transformed content while maintaining semantic meaning. Try the MD to MDX Assistant.
MD to MDX conversion tool
With 600+ articles, cross-references were everywhere. Links used inconsistent formats—some complete URLs, others relative paths. Different writers had different habits, creating migration complexity. I designed a Python-based workflow (sketched after the list below):
  1. Automatically identify all link references
  2. Suggest replacement paths with intelligent recommendations
  3. Human confirmation for accuracy
  4. Batch execution of verified replacements
This preserved AI efficiency while ensuring accuracy through human oversight.
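Here is a minimal sketch of that workflow, assuming a flat mapping from old GitBook paths to new Mintlify paths; the file layout, mapping entries, and review step are simplified illustrations rather than the actual script:

```python
import re
from pathlib import Path

# Illustrative mapping from legacy GitBook paths to new Mintlify paths;
# in practice this table was derived from the old and new site structures.
PATH_MAP = {
    "/getting-started/readme": "/en/introduction",
    "https://docs.dify.ai/getting-started/readme": "/en/introduction",
}

LINK_PATTERN = re.compile(r"\[([^\]]+)\]\(([^)]+)\)")  # [text](target)

def propose_replacements(doc: Path) -> list[tuple[str, str]]:
    """Steps 1-2: identify link targets and suggest a replacement for each."""
    proposals = []
    for _, target in LINK_PATTERN.findall(doc.read_text(encoding="utf-8")):
        normalized = target.rstrip("/").lower()
        if normalized in PATH_MAP and PATH_MAP[normalized] != target:
            proposals.append((target, PATH_MAP[normalized]))
    return proposals

def apply_confirmed(doc: Path, confirmed: list[tuple[str, str]]) -> None:
    """Step 4: batch-apply only the replacements a human has approved."""
    text = doc.read_text(encoding="utf-8")
    for old, new in confirmed:
        text = text.replace(f"]({old})", f"]({new})")
    doc.write_text(text, encoding="utf-8")

for md_file in Path("docs").rglob("*.md*"):
    # Step 3: print proposals for human review; approved pairs feed apply_confirmed.
    for old, new in propose_replacements(md_file):
        print(f"{md_file}: {old} -> {new}")
```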

3. Smooth Launch Execution

After solving infrastructure problems, I manually verified every page and link before launch. I considered A/B testing with gradual rollout but decided simplicity was better for a documentation project. The goal: stable, accessible, zero user disruption.
On launch day, I monitored key pages in real time. No failures occurred, and most users didn’t even notice the platform had changed. Visual consistency between systems and proper redirect handling enabled an “invisible” migration (a redirect config sketch appears at the end of this section).
Post-launch, we deployed API documentation with DevRel team support, completing our structured documentation system.
API documentation interface
We also added automated feedback modules across all pages, creating a positive cycle of content publishing, user feedback, and iterative improvement.
User feedback module
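Mintlify’s site configuration supports a redirects list of source/destination pairs, which is the natural place for this kind of redirect handling; a minimal sketch with hypothetical paths (not our actual mapping) looks like this:

```json
{
  "redirects": [
    { "source": "/getting-started/readme", "destination": "/en/introduction" },
    { "source": "/advanced-features/agents", "destination": "/en/guides/agents" }
  ]
}
```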

The AI-First Documentation Experience

This migration validated a key insight: choosing the right platform isn’t just about features—it’s about finding a partner that shares your vision. Mintlify’s rapid innovation proved this point. When MCP became a hot topic in March 2025, Mintlify immediately supported documentation MCP functionality. I quickly wrote deployment guides to help users create dedicated documentation Q&A services:

Dify Docs MCP Guide

Learn how to deploy dedicated documentation Q&A services
The documentation Q&A experience was impressive—truly conversational AI within documentation.
MCP documentation Q&A in action
Shortly after, Mintlify launched built-in AI chat functionality. Users can now interact with AI directly within pages for real-time help with complex content or code explanations.
Full document AI query
Real-time AI code help
This was the perfect example of PLG (Product-Led Growth) collaboration. Dify focuses on production-grade AI application platforms, while Mintlify specializes in technical documentation. Both bring deep expertise to their domains, creating genuine value at the intersection without superficial marketing.
Today, Mintlify powers documentation for leading AI companies like Anthropic, Cursor, and Perplexity. Its evolution proves that technical products should be beautiful, useful, and scalable.
As a technical documentation engineer, I’ve learned that our role extends far beyond writing good content. We’re documentation product owners, responsible for trustworthy content, smooth experiences, and sustainable user journeys. In a world of information overload and time constraints, when everyone achieves excellence within their capabilities, the entire system—and world—naturally improves.

References

How Generative Engine Optimization is Reshaping Docs