# explain_plan works with FTS queries (#2466)
## Summary

Fixes issue #2465 where FTS explain plans only showed a basic `LanceScan` instead of detailed execution plans with FTS query details, limits, and offsets.

## Root Cause

The `FTSQuery::explain_plan()` and `analyze_plan()` methods were missing the `.full_text_search()` call before calling explain/analyze plan, causing them to operate on the base query without FTS context.

## Changes

- **Fixed** `explain_plan()` and `analyze_plan()` in `src/query.rs` to call `.full_text_search()`
- **Added comprehensive test coverage** for FTS explain plans with limits, offsets, and filters
- **Updated existing tests** to expect correct behavior instead of buggy behavior

## Before/After

**Before (broken):**

```
LanceScan: uri=..., projection=[...], row_id=false, row_addr=false, ordered=true
```

**After (fixed):**

```
ProjectionExec: expr=[id@2 as id, text@3 as text, _score@1 as _score]
  Take: columns="_rowid, _score, (id), (text)"
    CoalesceBatchesExec: target_batch_size=1024
      GlobalLimitExec: skip=2, fetch=4
        MatchQuery: query=test
```

## Test Plan

- [x] All new FTS explain plan tests pass
- [x] Existing tests continue to pass
- [x] FTS queries now show proper execution plans with MatchQuery, limits, filters

Closes #2465

🤖 Generated with [Claude Code](https://claude.ai/code)

## Summary by CodeRabbit

- **Tests**
  - Added new test cases to verify explain plan output for full-text search, vector queries with pagination, and queries with filters.
- **Bug Fixes**
  - Improved the accuracy of explain plan and analysis output for full-text search queries, ensuring the correct query details are reflected.
- **Refactor**
  - Enhanced the formatting and hierarchical structure of execution plans for hybrid queries, providing clearer and more detailed plan representations.

---

Co-authored-by: Claude <noreply@anthropic.com>
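For context, the user-visible effect of this fix can be sketched from a query builder. The change itself lives in the Rust crate's `src/query.rs`; the snippet below is a hypothetical illustration from the Python SDK, assuming `explain_plan()`, `offset()`, and `create_fts_index()` are exposed there with matching semantics (the table name, data, and columns are invented for the example).

```python
# Hypothetical sketch: inspecting an FTS query plan from the Python SDK.
# Assumes explain_plan() and offset() are available on the query builder;
# exact availability and signatures may vary by SDK version.
import lancedb

db = lancedb.connect("./lancedb-demo")
table = db.create_table(
    "fts_demo",
    data=[{"id": i, "text": f"test document {i}"} for i in range(10)],
)
table.create_fts_index("text")  # may require the optional FTS/tantivy extra

query = table.search("test", query_type="fts").offset(2).limit(4)

# Before the fix, the plan showed only a bare LanceScan; after it, the plan
# includes MatchQuery, GlobalLimitExec (skip/fetch), Take, and the final
# projection, matching the "After (fixed)" output above.
print(query.explain_plan(verbose=True))
```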
# The Multimodal AI Lakehouse
The ultimate multimodal data platform for AI/ML applications.
Built on top of the Lance columnar format, LanceDB is designed for fast, scalable, production-ready vector search. You can store, index, and search petabytes of multimodal data and vectors with ease, making LanceDB a central place for developers to build, train, and analyze their AI workloads.
Demo: Multimodal search by keyword, by vector, or with SQL
Star LanceDB to get updates!
## Key Features
- Fast Vector Search: Search billions of vectors in milliseconds with state-of-the-art indexing.
- Comprehensive Search: Support for vector similarity search, full-text search, and SQL (see the sketch after this list).
- Multimodal Support: Store, query and filter vectors, metadata and multimodal data (text, images, videos, point clouds, and more).
- Advanced Features: Zero-copy access and automatic versioning, so you can manage versions of your data without extra infrastructure, plus GPU support for building vector indexes.
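As a rough illustration of these features, here is a minimal sketch using the Python SDK; the table name, schema, and sample data are invented for the example, and the FTS index may require the optional full-text-search extra depending on your version.

```python
# Minimal sketch: vector search, full-text search, SQL-style filtering,
# and automatic versioning via the Python SDK. Names and data are made up.
import lancedb

db = lancedb.connect("./lancedb-demo")  # a local directory acts as the database

table = db.create_table(
    "docs",
    data=[
        {"id": 1, "text": "a photo of a red truck", "vector": [0.2, 0.9]},
        {"id": 2, "text": "a cat sleeping on a couch", "vector": [0.8, 0.1]},
    ],
)

# Vector similarity search: nearest neighbours of a query vector.
nearest = table.search([0.25, 0.85]).limit(1).to_list()

# Full-text search over the `text` column (needs an FTS index first).
table.create_fts_index("text")
keyword_hits = table.search("red truck", query_type="fts").limit(5).to_list()

# SQL-style filtering, combinable with vector search.
filtered = table.search([0.25, 0.85]).where("id > 1").limit(5).to_list()

# Every write creates a new table version automatically.
print(table.version)
```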
## Products
- Open Source & Local: 100% open source, runs locally or in your cloud. No vendor lock-in.
- Cloud and Enterprise: Production-scale vector search with no servers to manage. Complete data sovereignty and security.
## Ecosystem
- Columnar Storage: Built on the Lance columnar format for efficient storage and analytics.
- Seamless Integration: Python, Node.js, Rust, and REST APIs, with native Python and JavaScript/TypeScript support.
- Rich Ecosystem: Integrations with LangChain 🦜️🔗, LlamaIndex 🦙, Apache Arrow, Pandas, Polars, DuckDB, and more on the way (see the sketch below).
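For instance, because query results come back as Arrow data, they drop straight into Pandas, Polars, and DuckDB. A hedged sketch, reusing the `docs` table from the earlier snippet (the `duckdb` and `polars` packages are separate installs):

```python
# Sketch: moving LanceDB query results into the Arrow ecosystem.
import duckdb
import lancedb
import polars as pl

db = lancedb.connect("./lancedb-demo")
table = db.open_table("docs")  # table created in the earlier example

arrow_table = table.search([0.25, 0.85]).limit(10).to_arrow()  # pyarrow.Table
pandas_df = table.search([0.25, 0.85]).limit(10).to_pandas()   # pandas.DataFrame
polars_df = pl.from_arrow(arrow_table)                         # polars.DataFrame

# DuckDB can scan a pyarrow.Table referenced by a local variable name.
print(duckdb.sql("SELECT id, text FROM arrow_table WHERE id >= 1").fetchall())
```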
## How to Install
Follow the Quickstart doc to set up LanceDB locally.
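For a local start with the Python SDK, the gist is `pip install lancedb`, then connect to a directory. A minimal, illustrative sketch (paths and names are arbitrary):

```python
# Quickstart sketch: install with `pip install lancedb`, then:
import lancedb

db = lancedb.connect("./my-lancedb")  # database lives in a local directory

# Create a table from plain Python dicts; the schema is inferred.
table = db.create_table(
    "quickstart",
    data=[{"vector": [1.0, 2.0], "item": "foo"}, {"vector": [3.0, 4.0], "item": "bar"}],
)

# Search for the nearest neighbour of a query vector.
print(table.search([1.0, 2.1]).limit(1).to_pandas())
```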
## API & SDK

We also support Python, TypeScript, and Rust SDKs:
| Interface | Documentation |
|---|---|
| Python SDK | https://lancedb.github.io/lancedb/python/python/ |
| Typescript SDK | https://lancedb.github.io/lancedb/js/globals/ |
| Rust SDK | https://docs.rs/lancedb/latest/lancedb/index.html |
| REST API | https://docs.lancedb.com/api-reference/introduction |
## Join Us and Contribute
We welcome contributions from everyone, whether you're a developer, a researcher, or just someone who wants to help out!
If you have any suggestions or feature requests, please feel free to open an issue on GitHub or discuss it on our Discord server.
Check out the GitHub Issues if you would like to work on features that are planned for the future.
