## Summary

- Fixed a flaky Node.js integration test for the mirrored store functionality
- Converted callback-based `fs.readdir()` calls to `fs.promises.readdir()` with proper async/await
- Used unique temporary directories to prevent cross-test interference
- Updated test expectations to match the current IVF-PQ index file structure

## Problem

The mirrored store integration test was failing randomly in CI with errors like:

- `expected 2 to equal 1` at various assertion points
- `done() called multiple times`

## Root Causes Identified

1. **Race conditions**: Mixing callback-based filesystem operations with async functions created timing issues where assertions ran before filesystem operations completed
2. **Test isolation**: Multiple tests shared the same temp directory (`tmpdir()`), so one test could see files written by another
3. **Outdated expectations**: IVF-PQ indexes now create 2 files (`auxiliary.idx` + `index.idx`) instead of 1, but the test expected only 1

## Solution

- Replace all `fs.readdir()` callbacks with `fs.promises.readdir()` and `await`
- Use `fs.promises.mkdtemp()` to create a unique temporary directory for each test run
- Update index file count expectations from 1 to 2 files to match current Lance behavior
- Add descriptive assertion labels for easier debugging

## Analysis

The mirroring implementation in `MirroringObjectStore::put_opts` is synchronous: it awaits writes to both the secondary (local) and primary (S3) stores before returning. The test failures were caused by the callback/async pattern mismatch and the test isolation issues, not by asynchronous mirroring behavior.

## Test plan

- [x] Local tests run without timing-based failures
- [x] Integration tests with AWS credentials pass in CI

This resolves the flaky failures, including the `expected 2 to equal 1` assertions and `done() called multiple times` errors seen in CI runs.
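Taken together, those changes give the test roughly the shape sketched below. This is an illustrative sketch rather than the actual diff: the table and directory names are hypothetical, and it assumes a mocha/chai setup (consistent with the `done() called multiple times` and `expected 2 to equal 1` messages above).

```ts
import { promises as fs } from "node:fs";
import * as os from "node:os";
import * as path from "node:path";
import { expect } from "chai";

describe("mirrored store", function () {
  it("mirrors data and index files to the secondary store", async function () {
    // A unique temp directory per run, instead of sharing os.tmpdir() across tests.
    const dir = await fs.mkdtemp(path.join(os.tmpdir(), "lancedb-mirror-"));

    // ... open the table against a mirrored store rooted at `dir`, insert rows,
    //     and build the IVF-PQ index (elided) ...

    // Awaited listings: assertions only run after the filesystem calls complete,
    // so no callback can race the end of the test.
    const indexDirs = await fs.readdir(path.join(dir, "my_table.lance", "_indices"));
    expect(indexDirs.length, "one index directory").to.equal(1);

    const indexFiles = await fs.readdir(
      path.join(dir, "my_table.lance", "_indices", indexDirs[0])
    );
    // IVF-PQ now writes two files: auxiliary.idx and index.idx.
    expect(indexFiles.length, "auxiliary.idx and index.idx").to.equal(2);
  });
});
```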
The Multimodal AI Lakehouse
How to Install ✦ Detailed Documentation ✦ Tutorials and Recipes ✦ Contributors
The ultimate multimodal data platform for AI/ML applications.
LanceDB is designed for fast, scalable, and production-ready vector search. It is built on top of the Lance columnar format, so you can store, index, and search over petabytes of multimodal data and vectors with ease. LanceDB gives developers a central place to build, train, and analyze their AI workloads.
Demo: Multimodal Search by Keyword, Vector, or SQL
Star LanceDB to get updates!
Key Features:
- Fast Vector Search: Search billions of vectors in milliseconds with state-of-the-art indexing.
- Comprehensive Search: Support for vector similarity search, full-text search, and SQL (see the sketch after this list).
- Multimodal Support: Store, query and filter vectors, metadata and multimodal data (text, images, videos, point clouds, and more).
- Advanced Features: Zero-copy data access and automatic versioning let you manage versions of your data without extra infrastructure, plus GPU support for building vector indexes.
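For instance, vector search and SQL-style filtering can be combined in a single query. The snippet below is a hedged sketch using the `@lancedb/lancedb` TypeScript SDK; the table name, columns, and filter are invented for illustration, so check the SDK docs for the authoritative API.

```ts
import * as lancedb from "@lancedb/lancedb";

async function filteredSearch() {
  const db = await lancedb.connect("data/sample-lancedb");
  const table = await db.openTable("products"); // hypothetical existing table

  // Nearest-neighbour search over the vector column, narrowed by a SQL predicate.
  const results = await table
    .search([0.12, 0.34, 0.56])                   // query vector
    .where("category = 'shoes' AND price < 100")  // SQL-style metadata filter
    .limit(5)
    .toArray();

  console.log(results);
}

filteredSearch();
```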
Products:
- Open Source & Local: 100% open source, runs locally or in your cloud. No vendor lock-in.
- Cloud and Enterprise: Production-scale vector search with no servers to manage. Complete data sovereignty and security.
Ecosystem:
- Columnar Storage: Built on the Lance columnar format for efficient storage and analytics.
- Seamless Integration: Python, Node.js, Rust, and REST APIs for easy integration, with native Python and JavaScript/TypeScript support.
- Rich Ecosystem: Integrations with LangChain 🦜️🔗, LlamaIndex 🦙, Apache Arrow, Pandas, Polars, DuckDB and more on the way.
How to Install:
Follow the Quickstart doc to set up LanceDB locally.
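As a rough picture of what the quickstart walks through, here is a minimal local example. It is a sketch that assumes the `@lancedb/lancedb` TypeScript package and its `connect`/`createTable`/`search` methods; the data and table name are made up for illustration.

```ts
import * as lancedb from "@lancedb/lancedb";

async function quickstart() {
  // Connect to (or create) a local database directory.
  const db = await lancedb.connect("data/sample-lancedb");

  // Create a table directly from plain JavaScript objects.
  const table = await db.createTable("vectors", [
    { id: 1, vector: [0.1, 0.2], text: "hello" },
    { id: 2, vector: [1.1, 1.2], text: "world" },
  ]);

  // Vector similarity search, returning the two nearest rows.
  const results = await table.search([0.1, 0.3]).limit(2).toArray();
  console.log(results);
}

quickstart();
```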
API & SDK: We also provide Python, TypeScript, and Rust SDKs:
| Interface | Documentation |
|---|---|
| Python SDK | https://lancedb.github.io/lancedb/python/python/ |
| Typescript SDK | https://lancedb.github.io/lancedb/js/globals/ |
| Rust SDK | https://docs.rs/lancedb/latest/lancedb/index.html |
| REST API | https://docs.lancedb.com/api-reference/introduction |
Join Us and Contribute
We welcome contributions from everyone, whether you're a developer, a researcher, or just someone who wants to help out.
If you have any suggestions or feature requests, please open an issue on GitHub or discuss it on our Discord server. Check out the GitHub Issues if you would like to work on features that are planned for the future.
