Compare commits

...

18 Commits

Author SHA1 Message Date
rmeng
24526bda4c patch 2024-05-15 13:44:27 -04:00
Cory Grinstead
055efdcdb6 refactor(nodejs): use biomejs instead of eslint & prettier (#1304)
I've been noticing a lot of friction with the current toolchain for
`/nodejs`, particularly with the usage of eslint and prettier.

[Biome](https://biomejs.dev/) is an all-in-one formatter & linter that
replaces the need for two separate tools that can potentially clash with
one another.

I've been using it in the
[nodejs-polars](https://github.com/pola-rs/nodejs-polars) repo for quite
some time & have found it much more pleasant to work with.

---

One other small change included in this PR:

Use [ts-jest](https://www.npmjs.com/package/ts-jest) so we can run our
tests without having to rebuild the TypeScript code first.
2024-05-14 11:11:18 -05:00
Cory Grinstead
bc582bb702 fix(nodejs): add better error handling when missing embedding functions (#1290)
Note: running the default lint command `npm run lint -- --fix` seems to
have made a lot of unrelated changes.
2024-05-14 08:43:39 -05:00
Will Jones
df9c41f342 ci: write down breaking change policy (#1294)
* Enforce conventional commit PR titles
* Add automatic labelling of PRs
* Write down breaking change policy.

Left for another PR:
* Validation of breaking change version bumps. (This is complicated due
to separate releases for Python and the other packages.)
2024-05-13 10:25:55 -07:00
Raghav Dixit
0bd6ac945e Documentation : Langchain doc bug fix (#1301)
nav bar update
2024-05-13 20:56:34 +05:30
Raghav Dixit
c9d5475333 Documentation: Langchain Integration (#1297)
Integration doc update
2024-05-13 10:19:33 -04:00
asmith26
3850d5fb35 Add ollama embeddings function (#1263)
Following the docs
[here](https://lancedb.github.io/lancedb/python/python/#lancedb.embeddings.openai.OpenAIEmbeddings)
I've been trying to use ollama embedding via the OpenAI API interface,
but unfortunately I couldn't get it to work (possibly related to
https://github.com/ollama/ollama/issues/2416).

Given the popularity of ollama I thought it could be helpful to have a
dedicated Ollama Embedding function in lancedb.

Very much welcome any thoughts on this or my code, etc. Thanks!
2024-05-13 13:09:19 +05:30
Lance Release
b37c58342e [python] Bump version: 0.6.12 → 0.6.13 2024-05-10 16:15:13 +00:00
Lance Release
a06e64f22d Updating package-lock.json 2024-05-09 22:46:19 +00:00
Lance Release
e983198f0e Updating package-lock.json 2024-05-09 22:12:17 +00:00
Lance Release
76e7b4abf8 Updating package-lock.json 2024-05-09 21:14:47 +00:00
Lance Release
5f6eb4651e Bump version: 0.4.19 → 0.4.20 2024-05-09 21:14:30 +00:00
Bert
805c78bb20 chore: bump lance to v0.10.18 (#1287)
https://github.com/lancedb/lance/releases/tag/v0.10.18
2024-05-09 17:06:26 -03:00
QianZhu
4746281b21 fix rename_table api and cache pop (#1283) 2024-05-08 13:41:18 -07:00
Aman Kishore
7b3b6bdccd Remove semvar strict dependancy (#1253) 2024-05-08 11:16:15 -07:00
Ryan Green
37e1124c0f chore: upgrade lance to 0.10.17 (#1280) 2024-05-08 09:56:48 -02:30
Lance Release
93f037ee41 Updating package-lock.json 2024-05-07 20:50:44 +00:00
Lance Release
e4fc06825a Updating package-lock.json 2024-05-07 20:09:25 +00:00
50 changed files with 9922 additions and 9029 deletions

View File

@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 0.4.19
+current_version = 0.4.20
 commit = True
 message = Bump version: {current_version} → {new_version}
 tag = True

.github/labeler.yml vendored Normal file
View File

@@ -0,0 +1,33 @@
version: 1
appendOnly: true
# Labels are applied based on conventional commits standard
# https://www.conventionalcommits.org/en/v1.0.0/
# These labels are later used in release notes. See .github/release.yml
labels:
  # If the PR title has an ! before the : it will be considered a breaking change
  # For example, `feat!: add new feature` will be considered a breaking change
  - label: breaking-change
    title: "^[^:]+!:.*"
  - label: breaking-change
    body: "BREAKING CHANGE"
  - label: enhancement
    title: "^feat(\\(.+\\))?!?:.*"
  - label: bug
    title: "^fix(\\(.+\\))?!?:.*"
  - label: documentation
    title: "^docs(\\(.+\\))?!?:.*"
  - label: performance
    title: "^perf(\\(.+\\))?!?:.*"
  - label: ci
    title: "^ci(\\(.+\\))?!?:.*"
  - label: chore
    title: "^(chore|test|build|style)(\\(.+\\))?!?:.*"
  - label: Python
    files:
      - "^python\\/.*"
  - label: Rust
    files:
      - "^rust\\/.*"
  - label: typescript
    files:
      - "^node\\/.*"

.github/release.yml vendored Normal file
View File

@@ -0,0 +1,25 @@
# TODO: create separate templates for Python and other releases.
changelog:
  exclude:
    labels:
      - ci
      - chore
  categories:
    - title: Breaking Changes 🛠
      labels:
        - breaking-change
    - title: New Features 🎉
      labels:
        - enhancement
    - title: Bug Fixes 🐛
      labels:
        - bug
    - title: Documentation 📚
      labels:
        - documentation
    - title: Performance Improvements 🚀
      labels:
        - performance
    - title: Other Changes
      labels:
        - "*"

.github/workflows/dev.yml vendored Normal file
View File

@@ -0,0 +1,81 @@
name: PR Checks

on:
  pull_request_target:
    types: [opened, edited, synchronize, reopened]

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

jobs:
  labeler:
    permissions:
      pull-requests: write
    name: Label PR
    runs-on: ubuntu-latest
    steps:
      - uses: srvaroa/labeler@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  commitlint:
    permissions:
      pull-requests: write
    name: Verify PR title / description conforms to semantic-release
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-node@v3
        with:
          node-version: "18"
      # These rules are disabled because Github will always ensure there
      # is a blank line between the title and the body and Github will
      # word wrap the description field to ensure a reasonable max line
      # length.
      - run: npm install @commitlint/config-conventional
      - run: >
          echo 'module.exports = {
            "rules": {
              "body-max-line-length": [0, "always", Infinity],
              "footer-max-line-length": [0, "always", Infinity],
              "body-leading-blank": [0, "always"]
            }
          }' > .commitlintrc.js
      - run: npx commitlint --extends @commitlint/config-conventional --verbose <<< $COMMIT_MSG
        env:
          COMMIT_MSG: >
            ${{ github.event.pull_request.title }}
            ${{ github.event.pull_request.body }}
      - if: failure()
        uses: actions/github-script@v6
        with:
          script: |
            const message = `**ACTION NEEDED**
            Lance follows the [Conventional Commits specification](https://www.conventionalcommits.org/en/v1.0.0/) for release automation.
            The PR title and description are used as the merge commit message.\
            Please update your PR title and description to match the specification.
            For details on the error please inspect the "PR Title Check" action.
            `
            // Get list of current comments
            const comments = await github.paginate(github.rest.issues.listComments, {
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: context.issue.number
            });
            // Check if this job already commented
            for (const comment of comments) {
              if (comment.body === message) {
                return // Already commented
              }
            }
            // Post the comment about Conventional Commits
            github.rest.issues.createComment({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: context.issue.number,
              body: message
            })
            core.setFailed(message)

View File

@@ -52,8 +52,7 @@ jobs:
           cargo fmt --all -- --check
           cargo clippy --all --all-features -- -D warnings
           npm ci
-          npm run lint
-          npm run chkformat
+          npm run lint-ci
   linux:
     name: Linux (NodeJS ${{ matrix.node-version }})
     timeout-minutes: 30

View File

@@ -10,9 +10,12 @@ repos:
     rev: v0.2.2
     hooks:
       - id: ruff
-  - repo: https://github.com/pre-commit/mirrors-prettier
-    rev: v3.1.0
+  - repo: local
     hooks:
-      - id: prettier
+      - id: local-biome-check
+        name: biome check
+        entry: npx biome check
+        language: system
+        types: [text]
         files: "nodejs/.*"
         exclude: nodejs/lancedb/native.d.ts|nodejs/dist/.*

View File

@@ -14,10 +14,10 @@ keywords = ["lancedb", "lance", "database", "vector", "search"]
 categories = ["database-implementations"]

 [workspace.dependencies]
-lance = { "version" = "=0.10.16", "features" = ["dynamodb"] }
-lance-index = { "version" = "=0.10.16" }
-lance-linalg = { "version" = "=0.10.16" }
-lance-testing = { "version" = "=0.10.16" }
+lance = { "version" = "=0.10.18", "features" = ["dynamodb"] }
+lance-index = { "version" = "=0.10.18" }
+lance-linalg = { "version" = "=0.10.18" }
+lance-testing = { "version" = "=0.10.18" }
 # Note that this one does not include pyarrow
 arrow = { version = "51.0", optional = false }
 arrow-array = "51.0"
@@ -29,7 +29,7 @@ arrow-arith = "51.0"
 arrow-cast = "51.0"
 async-trait = "0"
 chrono = "0.4.35"
-half = { "version" = "=2.3.1", default-features = false, features = [
+half = { "version" = "=2.4.1", default-features = false, features = [
   "num-traits",
 ] }
 futures = "0"

View File

@@ -119,7 +119,7 @@ nav:
       - Polars: python/polars_arrow.md
       - DuckDB: python/duckdb.md
       - LangChain:
-          - LangChain 🔗: https://python.langchain.com/docs/integrations/vectorstores/lancedb/
+          - LangChain 🔗: integrations/langchain.md
           - LangChain JS/TS 🔗: https://js.langchain.com/docs/integrations/vectorstores/lancedb
       - LlamaIndex 🦙: https://docs.llamaindex.ai/en/stable/examples/vector_stores/LanceDBIndexDemo/
       - Pydantic: python/pydantic.md

View File

@@ -206,6 +206,44 @@ print(actual.text)
 ```

+### Ollama embeddings
+
+Generate embeddings via the [ollama](https://github.com/ollama/ollama-python) python library. More details:
+
+- [Ollama docs on embeddings](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-embeddings)
+- [Ollama blog on embeddings](https://ollama.com/blog/embedding-models)
+
+| Parameter | Type | Default Value | Description |
+|---|---|---|---|
+| `name` | `str` | `nomic-embed-text` | The name of the model. |
+| `host` | `str` | `http://localhost:11434` | The Ollama host to connect to. |
+| `options` | `ollama.Options` or `dict` | `None` | Additional model parameters listed in the documentation for the [Modelfile](./modelfile.md#valid-parameters-and-values), such as `temperature`. |
+| `keep_alive` | `float` or `str` | `"5m"` | Controls how long the model will stay loaded in memory following the request. |
+| `ollama_client_kwargs` | `dict` | `{}` | kwargs that can be passed to the `ollama.Client`. |
+
+```python
+import lancedb
+from lancedb.pydantic import LanceModel, Vector
+from lancedb.embeddings import get_registry
+
+db = lancedb.connect("/tmp/db")
+func = get_registry().get("ollama").create(name="nomic-embed-text")
+
+class Words(LanceModel):
+    text: str = func.SourceField()
+    vector: Vector(func.ndims()) = func.VectorField()
+
+table = db.create_table("words", schema=Words, mode="overwrite")
+table.add([
+    {"text": "hello world"},
+    {"text": "goodbye world"}
+])
+
+query = "greetings"
+actual = table.search(query).limit(1).to_pydantic(Words)[0]
+print(actual.text)
+```
+
 ### OpenAI embeddings

 LanceDB registers the OpenAI embeddings function in the registry by default, as `openai`. Below are the parameters that you can customize when creating the instances:
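The optional parameters documented in the new Ollama table above can be passed straight to `create()`. A minimal hedged sketch (the values here are illustrative, not defaults recommended by this PR):

```python
from lancedb.embeddings import get_registry

# Parameter names come from the Ollama table above; values are made up for illustration.
func = get_registry().get("ollama").create(
    name="nomic-embed-text",
    host="http://localhost:11434",   # the default Ollama endpoint
    options={"temperature": 0.0},    # forwarded as ollama.Options
    keep_alive="10m",                # keep the model loaded longer than the 5m default
)
```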

View File

@@ -13,7 +13,7 @@ Get started using these examples and quick links.
 | Integrations | |
 |---|---:|
 | <h3> LlamaIndex </h3>LlamaIndex is a simple, flexible data framework for connecting custom data sources to large language models. Llama index integrates with LanceDB as the serverless VectorDB. <h3>[Lean More](https://gpt-index.readthedocs.io/en/latest/examples/vector_stores/LanceDBIndexDemo.html) </h3> |<img src="../assets/llama-index.jpg" alt="image" width="150" height="auto">|
-| <h3>Langchain</h3>Langchain allows building applications with LLMs through composability <h3>[Lean More](https://python.langchain.com/docs/integrations/vectorstores/lancedb) | <img src="../assets/langchain.png" alt="image" width="150" height="auto">|
+| <h3>Langchain</h3>Langchain allows building applications with LLMs through composability <h3>[Lean More](https://lancedb.github.io/lancedb/integrations/langchain/) | <img src="../assets/langchain.png" alt="image" width="150" height="auto">|
 | <h3>Langchain TS</h3> Javascript bindings for Langchain. It integrates with LanceDB's serverless vectordb allowing you to build powerful AI applications through composibility using only serverless functions. <h3>[Learn More]( https://js.langchain.com/docs/modules/data_connection/vectorstores/integrations/lancedb) | <img src="../assets/langchain.png" alt="image" width="150" height="auto">|
 | <h3>Voxel51</h3> It is an open source toolkit that enables you to build better computer vision workflows by improving the quality of your datasets and delivering insights about your models.<h3>[Learn More](./voxel51.md) | <img src="../assets/voxel.gif" alt="image" width="150" height="auto">|
 | <h3>PromptTools</h3> Offers a set of free, open-source tools for testing and experimenting with models, prompts, and configurations. The core idea is to enable developers to evaluate prompts using familiar interfaces like code and notebooks. You can use it to experiment with different configurations of LanceDB, and test how LanceDB integrates with the LLM of your choice.<h3>[Learn More](./prompttools.md) | <img src="../assets/prompttools.jpeg" alt="image" width="150" height="auto">|

View File

@@ -0,0 +1,92 @@
# Langchain
![Illustration](../assets/langchain.png)
## Quick Start
You can load your document data using Langchain's loaders; this example uses `TextLoader` and `OpenAIEmbeddings` as the embedding model.
```python
import os
from langchain.document_loaders import TextLoader
from langchain.vectorstores import LanceDB
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import CharacterTextSplitter
os.environ["OPENAI_API_KEY"] = "sk-..."
loader = TextLoader("../../modules/state_of_the_union.txt") # Replace with your data path
documents = loader.load()
documents = CharacterTextSplitter().split_documents(documents)
embeddings = OpenAIEmbeddings()
docsearch = LanceDB.from_documents(documents, embeddings)
query = "What did the president say about Ketanji Brown Jackson"
docs = docsearch.similarity_search(query)
print(docs[0].page_content)
```
## Documentation
In the example above, the `LanceDB` vector store is created with the `from_documents()` method, a `classmethod` that returns an initialized instance of the class.
You can also use the `LanceDB.from_texts(texts: List[str], embedding: Embeddings)` classmethod; see the sketch after the next code block.
The exhaustive list of parameters for the `LanceDB` vector store is:
- `connection`: (Optional) `lancedb.db.LanceDBConnection` connection object to use. If not provided, a new connection will be created.
- `embedding`: Langchain embedding model.
- `vector_key`: (Optional) Column name to use for vectors in the table. Defaults to `'vector'`.
- `id_key`: (Optional) Column name to use for ids in the table. Defaults to `'id'`.
- `text_key`: (Optional) Column name to use for text in the table. Defaults to `'text'`.
- `table_name`: (Optional) Name of your table in the database. Defaults to `'vectorstore'`.
- `api_key`: (Optional) API key to use for LanceDB cloud database. Defaults to `None`.
- `region`: (Optional) Region to use for LanceDB cloud database. Only for LanceDB Cloud, defaults to `None`.
- `mode`: (Optional) Mode to use for adding data to the table. Defaults to `'overwrite'`.
```python
db_url = "db://lang_test" # url of db you created
api_key = "xxxxx" # your API key
region="us-east-1-dev" # your selected region
vector_store = LanceDB(
    uri=db_url,
    api_key=api_key,  # (don't include for local API)
    region=region,    # (don't include for local API)
    embedding=embeddings,
    table_name='langchain_test'  # optional
)
```
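A hedged sketch of the `from_texts()` classmethod mentioned above, reusing the `embeddings` instance from the Quick Start (the sample texts are illustrative):

```python
# from_texts() builds the table directly from raw strings plus an embedding model.
vector_store = LanceDB.from_texts(
    ["LanceDB is a serverless vector database", "Langchain builds LLM apps through composability"],
    embedding=embeddings,
    table_name='langchain_test'  # optional; defaults to 'vectorstore'
)
docs = vector_store.similarity_search("vector database")
print(docs[0].page_content)
```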
### Methods
To add texts and store respective embeddings automatically:
##### add_texts()
- `texts`: `Iterable` of strings to add to the vectorstore.
- `metadatas`: Optional `list[dict()]` of metadatas associated with the texts.
- `ids`: Optional `list` of ids to associate with the texts.
```python
vector_store.add_texts(texts=['test_123'], metadatas=[{'source': 'wiki'}])

# Additionally, to explore the table you can load it into a DataFrame or save it to a CSV file:
tbl = vector_store.get_table()
print("tbl:", tbl)
pd_df = tbl.to_pandas()
pd_df.to_csv("docsearch.csv", index=False)
# you can also create a new vector store object using an older connection object:
vector_store = LanceDB(connection=tbl, embedding=embeddings)
```
For index creation, make sure your table has enough data in it. An ANN index is usually not needed for datasets of ~100K vectors or fewer. For large-scale (>1M) or higher-dimensional vectors, it is beneficial to create an ANN index.
##### create_index()
- `col_name`: `Optional[str] = None`
- `vector_col`: `Optional[str] = None`
- `num_partitions`: `Optional[int] = 256`
- `num_sub_vectors`: `Optional[int] = 96`
- `index_cache_size`: `Optional[int] = None`
```python
# for creating a vector index
vector_store.create_index(vector_col='vector', metric='cosine')

# for creating a scalar index (for non-vector columns)
vector_store.create_index(col_name='text')
```
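For larger tables, the tuning knobs listed above can be passed explicitly. A hedged sketch (the values are illustrative; the defaults come from the parameter list above):

```python
# for tuning an ANN index on a large (>1M vector) table
vector_store.create_index(
    vector_col='vector',
    metric='cosine',
    num_partitions=512,   # default 256; more partitions for bigger datasets
    num_sub_vectors=96,   # default 96
    index_cache_size=256
)
```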

View File

@@ -8,6 +8,7 @@ excluded_globs = [
   "../src/embedding.md",
   "../src/examples/*.md",
   "../src/integrations/voxel51.md",
+  "../src/integrations/langchain.md",
   "../src/guides/tables.md",
   "../src/python/duckdb.md",
   "../src/embeddings/*.md",

node/package-lock.json generated
View File

@@ -1,12 +1,12 @@
 {
   "name": "vectordb",
-  "version": "0.4.19",
+  "version": "0.4.20",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "vectordb",
-      "version": "0.4.19",
+      "version": "0.4.20",
       "cpu": [
         "x64",
         "arm64"
@@ -52,11 +52,11 @@
         "uuid": "^9.0.0"
       },
       "optionalDependencies": {
-        "@lancedb/vectordb-darwin-arm64": "0.4.19",
-        "@lancedb/vectordb-darwin-x64": "0.4.19",
-        "@lancedb/vectordb-linux-arm64-gnu": "0.4.19",
-        "@lancedb/vectordb-linux-x64-gnu": "0.4.19",
-        "@lancedb/vectordb-win32-x64-msvc": "0.4.19"
+        "@lancedb/vectordb-darwin-arm64": "0.4.20",
+        "@lancedb/vectordb-darwin-x64": "0.4.20",
+        "@lancedb/vectordb-linux-arm64-gnu": "0.4.20",
+        "@lancedb/vectordb-linux-x64-gnu": "0.4.20",
+        "@lancedb/vectordb-win32-x64-msvc": "0.4.20"
       },
       "peerDependencies": {
         "@apache-arrow/ts": "^14.0.2",
@@ -333,6 +333,66 @@
         "@jridgewell/sourcemap-codec": "^1.4.10"
       }
     },
+    "node_modules/@lancedb/vectordb-darwin-arm64": {
+      "version": "0.4.20",
+      "resolved": "https://registry.npmjs.org/@lancedb/vectordb-darwin-arm64/-/vectordb-darwin-arm64-0.4.20.tgz",
+      "integrity": "sha512-ffP2K4sA5mQTgePyARw1y8dPN996FmpvyAYoWO+TSItaXlhcXvc+KVa5udNMCZMDYeEnEv2Xpj6k4PwW3oBz+A==",
+      "cpu": [
+        "arm64"
+      ],
+      "optional": true,
+      "os": [
+        "darwin"
+      ]
+    },
+    "node_modules/@lancedb/vectordb-darwin-x64": {
+      "version": "0.4.20",
+      "resolved": "https://registry.npmjs.org/@lancedb/vectordb-darwin-x64/-/vectordb-darwin-x64-0.4.20.tgz",
+      "integrity": "sha512-GSYsXE20RIehDu30FjREhJdEzhnwOTV7ZsrSXagStzLY1gr7pyd7sfqxmmUtdD09di7LnQoiM71AOpPTa01YwQ==",
+      "cpu": [
+        "x64"
+      ],
+      "optional": true,
+      "os": [
+        "darwin"
+      ]
+    },
+    "node_modules/@lancedb/vectordb-linux-arm64-gnu": {
+      "version": "0.4.20",
+      "resolved": "https://registry.npmjs.org/@lancedb/vectordb-linux-arm64-gnu/-/vectordb-linux-arm64-gnu-0.4.20.tgz",
+      "integrity": "sha512-FpNOjOsz3nJVm6EBGyNgbOW2aFhsWZ/igeY45Z8hbZaaK2YBwrg/DASoNlUzgv6IR8cUaGJ2irNVJfsKR2cG6g==",
+      "cpu": [
+        "arm64"
+      ],
+      "optional": true,
+      "os": [
+        "linux"
+      ]
+    },
+    "node_modules/@lancedb/vectordb-linux-x64-gnu": {
+      "version": "0.4.20",
+      "resolved": "https://registry.npmjs.org/@lancedb/vectordb-linux-x64-gnu/-/vectordb-linux-x64-gnu-0.4.20.tgz",
+      "integrity": "sha512-pOqWjrRZQSrLTlQPkjidRii7NZDw8Xu9pN6ouVu2JAK8n81FXaPtFCyAI+Y3v9GpnYDN0rvD4eQ36aHAVPsa2g==",
+      "cpu": [
+        "x64"
+      ],
+      "optional": true,
+      "os": [
+        "linux"
+      ]
+    },
+    "node_modules/@lancedb/vectordb-win32-x64-msvc": {
+      "version": "0.4.20",
+      "resolved": "https://registry.npmjs.org/@lancedb/vectordb-win32-x64-msvc/-/vectordb-win32-x64-msvc-0.4.20.tgz",
+      "integrity": "sha512-5J5SsYSJ7jRCmU/sgwVHdrGz43B/7R2T9OEoFTKyVAtqTZdu75rkytXyn9SyEayXVhlUOaw76N0ASm0hAoDS/A==",
+      "cpu": [
+        "x64"
+      ],
+      "optional": true,
+      "os": [
+        "win32"
+      ]
+    },
     "node_modules/@neon-rs/cli": {
       "version": "0.0.160",
       "resolved": "https://registry.npmjs.org/@neon-rs/cli/-/cli-0.0.160.tgz",

View File

@@ -1,6 +1,6 @@
 {
   "name": "vectordb",
-  "version": "0.4.19",
+  "version": "0.4.20",
   "description": " Serverless, low-latency vector database for AI applications",
   "main": "dist/index.js",
   "types": "dist/index.d.ts",
@@ -88,10 +88,10 @@
     }
   },
   "optionalDependencies": {
-    "@lancedb/vectordb-darwin-arm64": "0.4.19",
-    "@lancedb/vectordb-darwin-x64": "0.4.19",
-    "@lancedb/vectordb-linux-arm64-gnu": "0.4.19",
-    "@lancedb/vectordb-linux-x64-gnu": "0.4.19",
-    "@lancedb/vectordb-win32-x64-msvc": "0.4.19"
+    "@lancedb/vectordb-darwin-arm64": "0.4.20",
+    "@lancedb/vectordb-darwin-x64": "0.4.20",
+    "@lancedb/vectordb-linux-arm64-gnu": "0.4.20",
+    "@lancedb/vectordb-linux-x64-gnu": "0.4.20",
+    "@lancedb/vectordb-win32-x64-msvc": "0.4.20"
   }
 }

View File

@@ -27,23 +27,23 @@ import {
   RecordBatch,
   makeData,
   Struct,
-  Float,
+  type Float,
   DataType,
   Binary,
   Float32
-} from 'apache-arrow'
-import { type EmbeddingFunction } from './index'
-import { sanitizeSchema } from './sanitize'
+} from "apache-arrow";
+import { type EmbeddingFunction } from "./index";
+import { sanitizeSchema } from "./sanitize";

 /*
  * Options to control how a column should be converted to a vector array
  */
 export class VectorColumnOptions {
   /** Vector column type. */
-  type: Float = new Float32()
+  type: Float = new Float32();

-  constructor (values?: Partial<VectorColumnOptions>) {
-    Object.assign(this, values)
+  constructor(values?: Partial<VectorColumnOptions>) {
+    Object.assign(this, values);
   }
 }
@@ -60,7 +60,7 @@ export class MakeArrowTableOptions {
    * The schema must be specified if there are no records (e.g. to make
    * an empty table)
    */
-  schema?: Schema
+  schema?: Schema;

   /*
    * Mapping from vector column name to expected type
@@ -80,7 +80,9 @@ export class MakeArrowTableOptions {
    */
   vectorColumns: Record<string, VectorColumnOptions> = {
     vector: new VectorColumnOptions()
-  }
+  };
+
+  embeddings?: EmbeddingFunction<any>;

   /**
    * If true then string columns will be encoded with dictionary encoding
@@ -91,10 +93,10 @@ export class MakeArrowTableOptions {
    *
    * If `schema` is provided then this property is ignored.
    */
-  dictionaryEncodeStrings: boolean = false
+  dictionaryEncodeStrings: boolean = false;

-  constructor (values?: Partial<MakeArrowTableOptions>) {
-    Object.assign(this, values)
+  constructor(values?: Partial<MakeArrowTableOptions>) {
+    Object.assign(this, values);
   }
 }
@@ -193,59 +195,68 @@ export class MakeArrowTableOptions {
  * assert.deepEqual(table.schema, schema)
  * ```
  */
-export function makeArrowTable (
+export function makeArrowTable(
   data: Array<Record<string, any>>,
   options?: Partial<MakeArrowTableOptions>
 ): ArrowTable {
-  if (data.length === 0 && (options?.schema === undefined || options?.schema === null)) {
-    throw new Error('At least one record or a schema needs to be provided')
+  if (
+    data.length === 0 &&
+    (options?.schema === undefined || options?.schema === null)
+  ) {
+    throw new Error("At least one record or a schema needs to be provided");
   }

-  const opt = new MakeArrowTableOptions(options !== undefined ? options : {})
+  const opt = new MakeArrowTableOptions(options !== undefined ? options : {});
   if (opt.schema !== undefined && opt.schema !== null) {
-    opt.schema = sanitizeSchema(opt.schema)
+    opt.schema = sanitizeSchema(opt.schema);
+    opt.schema = validateSchemaEmbeddings(opt.schema, data, opt.embeddings);
   }
-  const columns: Record<string, Vector> = {}
+
+  const columns: Record<string, Vector> = {};
   // TODO: sample dataset to find missing columns
   // Prefer the field ordering of the schema, if present
-  const columnNames = ((opt.schema) != null) ? (opt.schema.names as string[]) : Object.keys(data[0])
+  const columnNames =
+    opt.schema != null ? (opt.schema.names as string[]) : Object.keys(data[0]);
   for (const colName of columnNames) {
-    if (data.length !== 0 && !Object.prototype.hasOwnProperty.call(data[0], colName)) {
+    if (
+      data.length !== 0 &&
+      !Object.prototype.hasOwnProperty.call(data[0], colName)
+    ) {
       // The field is present in the schema, but not in the data, skip it
-      continue
+      continue;
     }
     // Extract a single column from the records (transpose from row-major to col-major)
-    let values = data.map((datum) => datum[colName])
+    let values = data.map((datum) => datum[colName]);
     // By default (type === undefined) arrow will infer the type from the JS type
-    let type
+    let type;
     if (opt.schema !== undefined) {
       // If there is a schema provided, then use that for the type instead
-      type = opt.schema?.fields.filter((f) => f.name === colName)[0]?.type
+      type = opt.schema?.fields.filter((f) => f.name === colName)[0]?.type;
       if (DataType.isInt(type) && type.bitWidth === 64) {
         // wrap in BigInt to avoid bug: https://github.com/apache/arrow/issues/40051
         values = values.map((v) => {
           if (v === null) {
-            return v
+            return v;
           }
-          return BigInt(v)
-        })
+          return BigInt(v);
+        });
       }
     } else {
       // Otherwise, check to see if this column is one of the vector columns
       // defined by opt.vectorColumns and, if so, use the fixed size list type
-      const vectorColumnOptions = opt.vectorColumns[colName]
+      const vectorColumnOptions = opt.vectorColumns[colName];
       if (vectorColumnOptions !== undefined) {
-        type = newVectorType(values[0].length, vectorColumnOptions.type)
+        type = newVectorType(values[0].length, vectorColumnOptions.type);
       }
     }
     try {
       // Convert an Array of JS values to an arrow vector
-      columns[colName] = makeVector(values, type, opt.dictionaryEncodeStrings)
+      columns[colName] = makeVector(values, type, opt.dictionaryEncodeStrings);
     } catch (error: unknown) {
       // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
-      throw Error(`Could not convert column "${colName}" to Arrow: ${error}`)
+      throw Error(`Could not convert column "${colName}" to Arrow: ${error}`);
     }
   }
@@ -260,97 +271,116 @@ export function makeArrowTable (
     // To work around this we first create a table with the wrong schema and
     // then patch the schema of the batches so we can use
     // `new ArrowTable(schema, batches)` which does not do any schema inference
-    const firstTable = new ArrowTable(columns)
-    // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
-    const batchesFixed = firstTable.batches.map(batch => new RecordBatch(opt.schema!, batch.data))
-    return new ArrowTable(opt.schema, batchesFixed)
+    const firstTable = new ArrowTable(columns);
+    const batchesFixed = firstTable.batches.map(
+      // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
+      (batch) => new RecordBatch(opt.schema!, batch.data)
+    );
+    return new ArrowTable(opt.schema, batchesFixed);
   } else {
-    return new ArrowTable(columns)
+    return new ArrowTable(columns);
   }
 }

 /**
  * Create an empty Arrow table with the provided schema
  */
-export function makeEmptyTable (schema: Schema): ArrowTable {
-  return makeArrowTable([], { schema })
+export function makeEmptyTable(schema: Schema): ArrowTable {
+  return makeArrowTable([], { schema });
 }

 // Helper function to convert Array<Array<any>> to a variable sized list array
-function makeListVector (lists: any[][]): Vector<any> {
+function makeListVector(lists: any[][]): Vector<any> {
   if (lists.length === 0 || lists[0].length === 0) {
-    throw Error('Cannot infer list vector from empty array or empty list')
+    throw Error("Cannot infer list vector from empty array or empty list");
   }
-  const sampleList = lists[0]
-  let inferredType
+  const sampleList = lists[0];
+  let inferredType;
   try {
-    const sampleVector = makeVector(sampleList)
-    inferredType = sampleVector.type
+    const sampleVector = makeVector(sampleList);
+    inferredType = sampleVector.type;
   } catch (error: unknown) {
     // eslint-disable-next-line @typescript-eslint/restrict-template-expressions
-    throw Error(`Cannot infer list vector. Cannot infer inner type: ${error}`)
+    throw Error(`Cannot infer list vector. Cannot infer inner type: ${error}`);
   }
   const listBuilder = makeBuilder({
-    type: new List(new Field('item', inferredType, true))
-  })
+    type: new List(new Field("item", inferredType, true))
+  });
   for (const list of lists) {
-    listBuilder.append(list)
+    listBuilder.append(list);
   }
-  return listBuilder.finish().toVector()
+  return listBuilder.finish().toVector();
 }

 // Helper function to convert an Array of JS values to an Arrow Vector
-function makeVector (values: any[], type?: DataType, stringAsDictionary?: boolean): Vector<any> {
+function makeVector(
+  values: any[],
+  type?: DataType,
+  stringAsDictionary?: boolean
+): Vector<any> {
   if (type !== undefined) {
     // No need for inference, let Arrow create it
-    return vectorFromArray(values, type)
+    return vectorFromArray(values, type);
   }
   if (values.length === 0) {
-    throw Error('makeVector requires at least one value or the type must be specfied')
+    throw Error(
+      "makeVector requires at least one value or the type must be specfied"
+    );
   }
-  const sampleValue = values.find(val => val !== null && val !== undefined)
+  const sampleValue = values.find((val) => val !== null && val !== undefined);
   if (sampleValue === undefined) {
-    throw Error('makeVector cannot infer the type if all values are null or undefined')
+    throw Error(
+      "makeVector cannot infer the type if all values are null or undefined"
+    );
   }
   if (Array.isArray(sampleValue)) {
     // Default Arrow inference doesn't handle list types
-    return makeListVector(values)
+    return makeListVector(values);
   } else if (Buffer.isBuffer(sampleValue)) {
     // Default Arrow inference doesn't handle Buffer
-    return vectorFromArray(values, new Binary())
-  } else if (!(stringAsDictionary ?? false) && (typeof sampleValue === 'string' || sampleValue instanceof String)) {
+    return vectorFromArray(values, new Binary());
+  } else if (
+    !(stringAsDictionary ?? false) &&
+    (typeof sampleValue === "string" || sampleValue instanceof String)
+  ) {
     // If the type is string then don't use Arrow's default inference unless dictionaries are requested
     // because it will always use dictionary encoding for strings
-    return vectorFromArray(values, new Utf8())
+    return vectorFromArray(values, new Utf8());
   } else {
     // Convert a JS array of values to an arrow vector
-    return vectorFromArray(values)
+    return vectorFromArray(values);
   }
 }

-async function applyEmbeddings<T> (table: ArrowTable, embeddings?: EmbeddingFunction<T>, schema?: Schema): Promise<ArrowTable> {
+async function applyEmbeddings<T>(
+  table: ArrowTable,
+  embeddings?: EmbeddingFunction<T>,
+  schema?: Schema
+): Promise<ArrowTable> {
   if (embeddings == null) {
-    return table
+    return table;
   }
   if (schema !== undefined && schema !== null) {
-    schema = sanitizeSchema(schema)
+    schema = sanitizeSchema(schema);
   }

   // Convert from ArrowTable to Record<String, Vector>
   const colEntries = [...Array(table.numCols).keys()].map((_, idx) => {
-    const name = table.schema.fields[idx].name
+    const name = table.schema.fields[idx].name;
     // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
-    const vec = table.getChildAt(idx)!
-    return [name, vec]
-  })
-  const newColumns = Object.fromEntries(colEntries)
+    const vec = table.getChildAt(idx)!;
+    return [name, vec];
+  });
+  const newColumns = Object.fromEntries(colEntries);

-  const sourceColumn = newColumns[embeddings.sourceColumn]
-  const destColumn = embeddings.destColumn ?? 'vector'
-  const innerDestType = embeddings.embeddingDataType ?? new Float32()
+  const sourceColumn = newColumns[embeddings.sourceColumn];
+  const destColumn = embeddings.destColumn ?? "vector";
+  const innerDestType = embeddings.embeddingDataType ?? new Float32();
   if (sourceColumn === undefined) {
-    throw new Error(`Cannot apply embedding function because the source column '${embeddings.sourceColumn}' was not present in the data`)
+    throw new Error(
+      `Cannot apply embedding function because the source column '${embeddings.sourceColumn}' was not present in the data`
+    );
   }

   if (table.numRows === 0) {
@@ -358,45 +388,60 @@ async function applyEmbeddings<T> (table: ArrowTable, embeddings?: EmbeddingFunc
       // We have an empty table and it already has the embedding column so no work needs to be done
       // Note: we don't return an error like we did below because this is a common occurrence. For example,
       // if we call convertToTable with 0 records and a schema that includes the embedding
-      return table
+      return table;
     }
     if (embeddings.embeddingDimension !== undefined) {
-      const destType = newVectorType(embeddings.embeddingDimension, innerDestType)
-      newColumns[destColumn] = makeVector([], destType)
+      const destType = newVectorType(
+        embeddings.embeddingDimension,
+        innerDestType
+      );
+      newColumns[destColumn] = makeVector([], destType);
     } else if (schema != null) {
-      const destField = schema.fields.find(f => f.name === destColumn)
+      const destField = schema.fields.find((f) => f.name === destColumn);
       if (destField != null) {
-        newColumns[destColumn] = makeVector([], destField.type)
+        newColumns[destColumn] = makeVector([], destField.type);
       } else {
-        throw new Error(`Attempt to apply embeddings to an empty table failed because schema was missing embedding column '${destColumn}'`)
+        throw new Error(
+          `Attempt to apply embeddings to an empty table failed because schema was missing embedding column '${destColumn}'`
+        );
       }
     } else {
-      throw new Error('Attempt to apply embeddings to an empty table when the embeddings function does not specify `embeddingDimension`')
+      throw new Error(
+        "Attempt to apply embeddings to an empty table when the embeddings function does not specify `embeddingDimension`"
+      );
    }
   } else {
     if (Object.prototype.hasOwnProperty.call(newColumns, destColumn)) {
-      throw new Error(`Attempt to apply embeddings to table failed because column ${destColumn} already existed`)
+      throw new Error(
+        `Attempt to apply embeddings to table failed because column ${destColumn} already existed`
+      );
     }
     if (table.batches.length > 1) {
-      throw new Error('Internal error: `makeArrowTable` unexpectedly created a table with more than one batch')
+      throw new Error(
+        "Internal error: `makeArrowTable` unexpectedly created a table with more than one batch"
+      );
     }
-    const values = sourceColumn.toArray()
-    const vectors = await embeddings.embed(values as T[])
+    const values = sourceColumn.toArray();
+    const vectors = await embeddings.embed(values as T[]);
     if (vectors.length !== values.length) {
-      throw new Error('Embedding function did not return an embedding for each input element')
+      throw new Error(
+        "Embedding function did not return an embedding for each input element"
+      );
     }
-    const destType = newVectorType(vectors[0].length, innerDestType)
-    newColumns[destColumn] = makeVector(vectors, destType)
+    const destType = newVectorType(vectors[0].length, innerDestType);
+    newColumns[destColumn] = makeVector(vectors, destType);
   }

-  const newTable = new ArrowTable(newColumns)
+  const newTable = new ArrowTable(newColumns);
   if (schema != null) {
-    if (schema.fields.find(f => f.name === destColumn) === undefined) {
-      throw new Error(`When using embedding functions and specifying a schema the schema should include the embedding column but the column ${destColumn} was missing`)
+    if (schema.fields.find((f) => f.name === destColumn) === undefined) {
+      throw new Error(
+        `When using embedding functions and specifying a schema the schema should include the embedding column but the column ${destColumn} was missing`
+      );
     }
-    return alignTable(newTable, schema)
+    return alignTable(newTable, schema);
   }
-  return newTable
+  return newTable;
 }
@@ -417,21 +462,24 @@ async function applyEmbeddings<T> (table: ArrowTable, embeddings?: EmbeddingFunc
  * embedding columns. If no schema is provded then embedding columns will
  * be placed at the end of the table, after all of the input columns.
  */
-export async function convertToTable<T> (
+export async function convertToTable<T>(
   data: Array<Record<string, unknown>>,
   embeddings?: EmbeddingFunction<T>,
   makeTableOptions?: Partial<MakeArrowTableOptions>
 ): Promise<ArrowTable> {
-  const table = makeArrowTable(data, makeTableOptions)
-  return await applyEmbeddings(table, embeddings, makeTableOptions?.schema)
+  const table = makeArrowTable(data, makeTableOptions);
+  return await applyEmbeddings(table, embeddings, makeTableOptions?.schema);
 }

 // Creates the Arrow Type for a Vector column with dimension `dim`
-function newVectorType <T extends Float> (dim: number, innerType: T): FixedSizeList<T> {
+function newVectorType<T extends Float>(
+  dim: number,
+  innerType: T
+): FixedSizeList<T> {
   // Somewhere we always default to have the elements nullable, so we need to set it to true
   // otherwise we often get schema mismatches because the stored data always has schema with nullable elements
-  const children = new Field<T>('item', innerType, true)
-  return new FixedSizeList(dim, children)
+  const children = new Field<T>("item", innerType, true);
+  return new FixedSizeList(dim, children);
 }
@@ -441,17 +489,17 @@ function newVectorType <T extends Float> (dim: number, innerType: T): FixedSizeL
  *
  * `schema` is required if data is empty
  */
-export async function fromRecordsToBuffer<T> (
+export async function fromRecordsToBuffer<T>(
   data: Array<Record<string, unknown>>,
   embeddings?: EmbeddingFunction<T>,
   schema?: Schema
 ): Promise<Buffer> {
   if (schema !== undefined && schema !== null) {
-    schema = sanitizeSchema(schema)
+    schema = sanitizeSchema(schema);
   }
-  const table = await convertToTable(data, embeddings, { schema })
-  const writer = RecordBatchFileWriter.writeAll(table)
-  return Buffer.from(await writer.toUint8Array())
+  const table = await convertToTable(data, embeddings, { schema, embeddings });
+  const writer = RecordBatchFileWriter.writeAll(table);
+  return Buffer.from(await writer.toUint8Array());
 }
@@ -461,17 +509,17 @@ export async function fromRecordsToBuffer<T> (
  *
  * `schema` is required if data is empty
  */
-export async function fromRecordsToStreamBuffer<T> (
+export async function fromRecordsToStreamBuffer<T>(
   data: Array<Record<string, unknown>>,
   embeddings?: EmbeddingFunction<T>,
   schema?: Schema
 ): Promise<Buffer> {
   if (schema !== null && schema !== undefined) {
-    schema = sanitizeSchema(schema)
+    schema = sanitizeSchema(schema);
   }
-  const table = await convertToTable(data, embeddings, { schema })
-  const writer = RecordBatchStreamWriter.writeAll(table)
-  return Buffer.from(await writer.toUint8Array())
+  const table = await convertToTable(data, embeddings, { schema });
+  const writer = RecordBatchStreamWriter.writeAll(table);
+  return Buffer.from(await writer.toUint8Array());
 }
@@ -482,17 +530,17 @@ export async function fromRecordsToStreamBuffer<T> (
  *
  * `schema` is required if the table is empty
  */
-export async function fromTableToBuffer<T> (
+export async function fromTableToBuffer<T>(
   table: ArrowTable,
   embeddings?: EmbeddingFunction<T>,
   schema?: Schema
 ): Promise<Buffer> {
   if (schema !== null && schema !== undefined) {
-    schema = sanitizeSchema(schema)
+    schema = sanitizeSchema(schema);
   }
-  const tableWithEmbeddings = await applyEmbeddings(table, embeddings, schema)
-  const writer = RecordBatchFileWriter.writeAll(tableWithEmbeddings)
-  return Buffer.from(await writer.toUint8Array())
+  const tableWithEmbeddings = await applyEmbeddings(table, embeddings, schema);
+  const writer = RecordBatchFileWriter.writeAll(tableWithEmbeddings);
+  return Buffer.from(await writer.toUint8Array());
 }
@@ -503,49 +551,87 @@ export async function fromTableToBuffer<T> (
  *
  * `schema` is required if the table is empty
  */
-export async function fromTableToStreamBuffer<T> (
+export async function fromTableToStreamBuffer<T>(
   table: ArrowTable,
   embeddings?: EmbeddingFunction<T>,
   schema?: Schema
 ): Promise<Buffer> {
   if (schema !== null && schema !== undefined) {
-    schema = sanitizeSchema(schema)
+    schema = sanitizeSchema(schema);
   }
-  const tableWithEmbeddings = await applyEmbeddings(table, embeddings, schema)
-  const writer = RecordBatchStreamWriter.writeAll(tableWithEmbeddings)
-  return Buffer.from(await writer.toUint8Array())
+  const tableWithEmbeddings = await applyEmbeddings(table, embeddings, schema);
+  const writer = RecordBatchStreamWriter.writeAll(tableWithEmbeddings);
+  return Buffer.from(await writer.toUint8Array());
 }

-function alignBatch (batch: RecordBatch, schema: Schema): RecordBatch {
-  const alignedChildren = []
+function alignBatch(batch: RecordBatch, schema: Schema): RecordBatch {
+  const alignedChildren = [];
   for (const field of schema.fields) {
     const indexInBatch = batch.schema.fields?.findIndex(
       (f) => f.name === field.name
-    )
+    );
     if (indexInBatch < 0) {
       throw new Error(
         `The column ${field.name} was not found in the Arrow Table`
-      )
+      );
     }
-    alignedChildren.push(batch.data.children[indexInBatch])
+    alignedChildren.push(batch.data.children[indexInBatch]);
   }
   const newData = makeData({
     type: new Struct(schema.fields),
     length: batch.numRows,
     nullCount: batch.nullCount,
     children: alignedChildren
-  })
-  return new RecordBatch(schema, newData)
+  });
+  return new RecordBatch(schema, newData);
 }

-function alignTable (table: ArrowTable, schema: Schema): ArrowTable {
+function alignTable(table: ArrowTable, schema: Schema): ArrowTable {
   const alignedBatches = table.batches.map((batch) =>
     alignBatch(batch, schema)
-  )
-  return new ArrowTable(schema, alignedBatches)
+  );
+  return new ArrowTable(schema, alignedBatches);
 }

 // Creates an empty Arrow Table
-export function createEmptyTable (schema: Schema): ArrowTable {
-  return new ArrowTable(sanitizeSchema(schema))
+export function createEmptyTable(schema: Schema): ArrowTable {
+  return new ArrowTable(sanitizeSchema(schema));
+}
+
+function validateSchemaEmbeddings(
+  schema: Schema<any>,
+  data: Array<Record<string, unknown>>,
+  embeddings: EmbeddingFunction<any> | undefined
+) {
+  const fields = [];
+  const missingEmbeddingFields = [];
+
+  // First we check if the field is a `FixedSizeList`
+  // Then we check if the data contains the field
+  // if it does not, we add it to the list of missing embedding fields
+  // Finally, we check if those missing embedding fields are `this._embeddings`
+  // if they are not, we throw an error
+  for (const field of schema.fields) {
+    if (field.type instanceof FixedSizeList) {
+      if (data.length !== 0 && data?.[0]?.[field.name] === undefined) {
+        missingEmbeddingFields.push(field);
+      } else {
+        fields.push(field);
+      }
+    } else {
+      fields.push(field);
+    }
+  }
+
+  if (missingEmbeddingFields.length > 0 && embeddings === undefined) {
+    console.log({ missingEmbeddingFields, embeddings });
+    throw new Error(
+      `Table has embeddings: "${missingEmbeddingFields
+        .map((f) => f.name)
+        .join(",")}", but no embedding function was provided`
+    );
+  }
+
+  return new Schema(fields);
 }

View File

@@ -12,19 +12,20 @@
// See the License for the specific language governing permissions and // See the License for the specific language governing permissions and
// limitations under the License. // limitations under the License.
import { type Schema, Table as ArrowTable, tableFromIPC } from 'apache-arrow' import { type Schema, Table as ArrowTable, tableFromIPC } from "apache-arrow";
import { import {
createEmptyTable, createEmptyTable,
fromRecordsToBuffer, fromRecordsToBuffer,
fromTableToBuffer, fromTableToBuffer,
makeArrowTable makeArrowTable
} from './arrow' } from "./arrow";
import type { EmbeddingFunction } from './embedding/embedding_function' import type { EmbeddingFunction } from "./embedding/embedding_function";
import { RemoteConnection } from './remote' import { RemoteConnection } from "./remote";
import { Query } from './query' import { Query } from "./query";
import { isEmbeddingFunction } from './embedding/embedding_function' import { isEmbeddingFunction } from "./embedding/embedding_function";
import { type Literal, toSQL } from './util' import { type Literal, toSQL } from "./util";
import { type HttpMiddleware } from './middleware'
import { type HttpMiddleware } from "./middleware";
const { const {
databaseNew, databaseNew,
@@ -48,14 +49,18 @@ const {
tableAlterColumns, tableAlterColumns,
tableDropColumns tableDropColumns
// eslint-disable-next-line @typescript-eslint/no-var-requires // eslint-disable-next-line @typescript-eslint/no-var-requires
} = require('../native.js') } = require("../native.js");
export { Query } export { Query };
export type { EmbeddingFunction } export type { EmbeddingFunction };
export { OpenAIEmbeddingFunction } from './embedding/openai' export { OpenAIEmbeddingFunction } from "./embedding/openai";
export { convertToTable, makeArrowTable, type MakeArrowTableOptions } from './arrow' export {
convertToTable,
makeArrowTable,
type MakeArrowTableOptions
} from "./arrow";
const defaultAwsRegion = 'us-west-2' const defaultAwsRegion = "us-west-2";
export interface AwsCredentials { export interface AwsCredentials {
accessKeyId: string accessKeyId: string
@@ -128,19 +133,19 @@ export interface ConnectionOptions {
readConsistencyInterval?: number readConsistencyInterval?: number
} }
function getAwsArgs (opts: ConnectionOptions): any[] { function getAwsArgs(opts: ConnectionOptions): any[] {
const callArgs: any[] = [] const callArgs: any[] = [];
const awsCredentials = opts.awsCredentials const awsCredentials = opts.awsCredentials;
if (awsCredentials !== undefined) { if (awsCredentials !== undefined) {
callArgs.push(awsCredentials.accessKeyId) callArgs.push(awsCredentials.accessKeyId);
callArgs.push(awsCredentials.secretKey) callArgs.push(awsCredentials.secretKey);
callArgs.push(awsCredentials.sessionToken) callArgs.push(awsCredentials.sessionToken);
} else { } else {
callArgs.fill(undefined, 0, 3) callArgs.fill(undefined, 0, 3);
} }
callArgs.push(opts.awsRegion) callArgs.push(opts.awsRegion);
return callArgs return callArgs;
} }
export interface CreateTableOptions<T> { export interface CreateTableOptions<T> {
@@ -173,56 +178,56 @@ export interface CreateTableOptions<T> {
* *
* @see {@link ConnectionOptions} for more details on the URI format. * @see {@link ConnectionOptions} for more details on the URI format.
*/ */
export async function connect (uri: string): Promise<Connection> export async function connect(uri: string): Promise<Connection>;
/** /**
* Connect to a LanceDB instance with connection options. * Connect to a LanceDB instance with connection options.
* *
* @param opts The {@link ConnectionOptions} to use when connecting to the database. * @param opts The {@link ConnectionOptions} to use when connecting to the database.
*/ */
export async function connect ( export async function connect(
opts: Partial<ConnectionOptions> opts: Partial<ConnectionOptions>
): Promise<Connection> ): Promise<Connection>;
export async function connect ( export async function connect(
arg: string | Partial<ConnectionOptions> arg: string | Partial<ConnectionOptions>
): Promise<Connection> { ): Promise<Connection> {
let opts: ConnectionOptions let opts: ConnectionOptions;
if (typeof arg === 'string') { if (typeof arg === "string") {
opts = { uri: arg } opts = { uri: arg };
} else { } else {
const keys = Object.keys(arg) const keys = Object.keys(arg);
if (keys.length === 1 && keys[0] === 'uri' && typeof arg.uri === 'string') { if (keys.length === 1 && keys[0] === "uri" && typeof arg.uri === "string") {
opts = { uri: arg.uri } opts = { uri: arg.uri };
} else { } else {
opts = Object.assign( opts = Object.assign(
{ {
uri: '', uri: "",
awsCredentials: undefined, awsCredentials: undefined,
awsRegion: defaultAwsRegion, awsRegion: defaultAwsRegion,
apiKey: undefined, apiKey: undefined,
region: defaultAwsRegion region: defaultAwsRegion
}, },
arg arg
) );
} }
} }
if (opts.uri.startsWith('db://')) { if (opts.uri.startsWith("db://")) {
// Remote connection // Remote connection
return new RemoteConnection(opts) return new RemoteConnection(opts);
} }
const storageOptions = opts.storageOptions ?? {}; const storageOptions = opts.storageOptions ?? {};
if (opts.awsCredentials?.accessKeyId !== undefined) { if (opts.awsCredentials?.accessKeyId !== undefined) {
storageOptions.aws_access_key_id = opts.awsCredentials.accessKeyId storageOptions.aws_access_key_id = opts.awsCredentials.accessKeyId;
} }
if (opts.awsCredentials?.secretKey !== undefined) { if (opts.awsCredentials?.secretKey !== undefined) {
storageOptions.aws_secret_access_key = opts.awsCredentials.secretKey storageOptions.aws_secret_access_key = opts.awsCredentials.secretKey;
} }
if (opts.awsCredentials?.sessionToken !== undefined) { if (opts.awsCredentials?.sessionToken !== undefined) {
storageOptions.aws_session_token = opts.awsCredentials.sessionToken storageOptions.aws_session_token = opts.awsCredentials.sessionToken;
} }
if (opts.awsRegion !== undefined) { if (opts.awsRegion !== undefined) {
storageOptions.region = opts.awsRegion storageOptions.region = opts.awsRegion;
} }
// It's a pain to pass a record to Rust, so we convert it to an array of key-value pairs // It's a pain to pass a record to Rust, so we convert it to an array of key-value pairs
const storageOptionsArr = Object.entries(storageOptions); const storageOptionsArr = Object.entries(storageOptions);
@@ -231,8 +236,8 @@ export async function connect (
opts.uri, opts.uri,
storageOptionsArr, storageOptionsArr,
opts.readConsistencyInterval opts.readConsistencyInterval
) );
return new LocalConnection(db, opts) return new LocalConnection(db, opts);
} }
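As a usage sketch of the two `connect` overloads above — the package import name (`vectordb`), the S3 URI, and the region are assumptions, not taken from this diff:

```ts
import { connect, type Connection } from "vectordb"; // assumed package name

async function main(): Promise<void> {
  // Overload 1: a bare URI; a "db://" prefix would route to RemoteConnection.
  const local: Connection = await connect("data/sample-lancedb");

  // Overload 2: partial ConnectionOptions; unset fields are filled with the
  // defaults shown above (e.g. awsRegion falls back to defaultAwsRegion).
  const remote: Connection = await connect({
    uri: "s3://my-bucket/lancedb", // hypothetical bucket
    awsRegion: "us-east-1",
  });
}
```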
/** /**
@@ -533,7 +538,11 @@ export interface Table<T = number[]> {
* @param data the new data to insert * @param data the new data to insert
* @param args parameters controlling how the operation should behave * @param args parameters controlling how the operation should behave
*/ */
mergeInsert: (on: string, data: Array<Record<string, unknown>> | ArrowTable, args: MergeInsertArgs) => Promise<void> mergeInsert: (
on: string,
data: Array<Record<string, unknown>> | ArrowTable,
args: MergeInsertArgs
) => Promise<void>
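A hedged sketch of calling this upsert, assuming `tbl` is an open `Table` and the row shape is hypothetical:

```ts
// Rows whose "id" matches an existing row are updated in full; source rows
// with no match are inserted. Per MergeInsertArgs, passing a filter string
// instead of `true` restricts which matched rows get updated.
await tbl.mergeInsert(
  "id",
  [{ id: 1, vector: [0.1, 0.2], label: "updated" }],
  { whenMatchedUpdateAll: true, whenNotMatchedInsertAll: true },
);
```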
/** /**
 * List the indices on this table. * List the indices on this table.
@@ -558,7 +567,9 @@ export interface Table<T = number[]> {
* expressions will be evaluated for each row in the * expressions will be evaluated for each row in the
* table, and can reference existing columns in the table. * table, and can reference existing columns in the table.
*/ */
addColumns(newColumnTransforms: Array<{ name: string, valueSql: string }>): Promise<void> addColumns(
newColumnTransforms: Array<{ name: string, valueSql: string }>
): Promise<void>
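For illustration — assuming `tbl` is an open `Table` with numeric columns `x` and `y` (hypothetical names):

```ts
// Each entry adds one column whose value is computed per row from a SQL
// expression that may reference existing columns.
await tbl.addColumns([
  { name: "norm", valueSql: "sqrt(x * x + y * y)" },
  { name: "seen", valueSql: "false" }, // constant default for every row
]);
```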
/** /**
* Alter the name or nullability of columns. * Alter the name or nullability of columns.
@@ -699,23 +710,23 @@ export interface IndexStats {
* A connection to a LanceDB database. * A connection to a LanceDB database.
*/ */
export class LocalConnection implements Connection { export class LocalConnection implements Connection {
private readonly _options: () => ConnectionOptions private readonly _options: () => ConnectionOptions;
private readonly _db: any private readonly _db: any;
constructor (db: any, options: ConnectionOptions) { constructor(db: any, options: ConnectionOptions) {
this._options = () => options this._options = () => options;
this._db = db this._db = db;
} }
get uri (): string { get uri(): string {
return this._options().uri return this._options().uri;
} }
/** /**
* Get the names of all tables in the database. * Get the names of all tables in the database.
*/ */
async tableNames (): Promise<string[]> { async tableNames(): Promise<string[]> {
return databaseTableNames.call(this._db) return databaseTableNames.call(this._db);
} }
/** /**
@@ -723,7 +734,7 @@ export class LocalConnection implements Connection {
* *
* @param name The name of the table. * @param name The name of the table.
*/ */
async openTable (name: string): Promise<Table> async openTable(name: string): Promise<Table>;
/** /**
* Open a table in the database. * Open a table in the database.
@@ -734,23 +745,20 @@ export class LocalConnection implements Connection {
async openTable<T>( async openTable<T>(
name: string, name: string,
embeddings: EmbeddingFunction<T> embeddings: EmbeddingFunction<T>
): Promise<Table<T>> ): Promise<Table<T>>;
async openTable<T>( async openTable<T>(
name: string, name: string,
embeddings?: EmbeddingFunction<T> embeddings?: EmbeddingFunction<T>
): Promise<Table<T>> ): Promise<Table<T>>;
async openTable<T>( async openTable<T>(
name: string, name: string,
embeddings?: EmbeddingFunction<T> embeddings?: EmbeddingFunction<T>
): Promise<Table<T>> { ): Promise<Table<T>> {
const tbl = await databaseOpenTable.call( const tbl = await databaseOpenTable.call(this._db, name);
this._db,
name,
)
if (embeddings !== undefined) { if (embeddings !== undefined) {
return new LocalTable(tbl, name, this._options(), embeddings) return new LocalTable(tbl, name, this._options(), embeddings);
} else { } else {
return new LocalTable(tbl, name, this._options()) return new LocalTable(tbl, name, this._options());
} }
} }
@@ -760,32 +768,32 @@ export class LocalConnection implements Connection {
optsOrEmbedding?: WriteOptions | EmbeddingFunction<T>, optsOrEmbedding?: WriteOptions | EmbeddingFunction<T>,
opt?: WriteOptions opt?: WriteOptions
): Promise<Table<T>> { ): Promise<Table<T>> {
if (typeof name === 'string') { if (typeof name === "string") {
let writeOptions: WriteOptions = new DefaultWriteOptions() let writeOptions: WriteOptions = new DefaultWriteOptions();
if (opt !== undefined && isWriteOptions(opt)) { if (opt !== undefined && isWriteOptions(opt)) {
writeOptions = opt writeOptions = opt;
} else if ( } else if (
optsOrEmbedding !== undefined && optsOrEmbedding !== undefined &&
isWriteOptions(optsOrEmbedding) isWriteOptions(optsOrEmbedding)
) { ) {
writeOptions = optsOrEmbedding writeOptions = optsOrEmbedding;
} }
let embeddings: undefined | EmbeddingFunction<T> let embeddings: undefined | EmbeddingFunction<T>;
if ( if (
optsOrEmbedding !== undefined && optsOrEmbedding !== undefined &&
isEmbeddingFunction(optsOrEmbedding) isEmbeddingFunction(optsOrEmbedding)
) { ) {
embeddings = optsOrEmbedding embeddings = optsOrEmbedding;
} }
return await this.createTableImpl({ return await this.createTableImpl({
name, name,
data, data,
embeddingFunction: embeddings, embeddingFunction: embeddings,
writeOptions writeOptions
}) });
} }
return await this.createTableImpl(name) return await this.createTableImpl(name);
} }
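A sketch of how this overload resolution looks from the caller's side (the table name and rows are illustrative; the `vectordb` import is an assumption):

```ts
import { WriteMode, connect } from "vectordb"; // assumed package name

const db = await connect("data/sample-lancedb");

// The third argument satisfies isWriteOptions, so it is treated as
// WriteOptions; an EmbeddingFunction in the same position would be picked
// up by isEmbeddingFunction instead.
const tbl = await db.createTable(
  "vectors",
  [{ id: 1, vector: [0.1, 0.2] }],
  { writeMode: WriteMode.Overwrite },
);
```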
private async createTableImpl<T>({ private async createTableImpl<T>({
@@ -801,27 +809,27 @@ export class LocalConnection implements Connection {
embeddingFunction?: EmbeddingFunction<T> | undefined embeddingFunction?: EmbeddingFunction<T> | undefined
writeOptions?: WriteOptions | undefined writeOptions?: WriteOptions | undefined
}): Promise<Table<T>> { }): Promise<Table<T>> {
let buffer: Buffer let buffer: Buffer;
function isEmpty ( function isEmpty(
data: Array<Record<string, unknown>> | ArrowTable<any> data: Array<Record<string, unknown>> | ArrowTable<any>
): boolean { ): boolean {
if (data instanceof ArrowTable) { if (data instanceof ArrowTable) {
return data.data.length === 0 return data.data.length === 0;
} }
return data.length === 0 return data.length === 0;
} }
if (data === undefined || isEmpty(data)) { if (data === undefined || isEmpty(data)) {
if (schema === undefined) { if (schema === undefined) {
throw new Error('Either data or schema needs to be defined') throw new Error("Either data or schema needs to be defined");
} }
buffer = await fromTableToBuffer(createEmptyTable(schema)) buffer = await fromTableToBuffer(createEmptyTable(schema));
} else if (data instanceof ArrowTable) { } else if (data instanceof ArrowTable) {
buffer = await fromTableToBuffer(data, embeddingFunction, schema) buffer = await fromTableToBuffer(data, embeddingFunction, schema);
} else { } else {
// data is Array<Record<...>> // data is Array<Record<...>>
buffer = await fromRecordsToBuffer(data, embeddingFunction, schema) buffer = await fromRecordsToBuffer(data, embeddingFunction, schema);
} }
const tbl = await tableCreate.call( const tbl = await tableCreate.call(
@@ -830,11 +838,11 @@ export class LocalConnection implements Connection {
buffer, buffer,
writeOptions?.writeMode?.toString(), writeOptions?.writeMode?.toString(),
...getAwsArgs(this._options()) ...getAwsArgs(this._options())
) );
if (embeddingFunction !== undefined) { if (embeddingFunction !== undefined) {
return new LocalTable(tbl, name, this._options(), embeddingFunction) return new LocalTable(tbl, name, this._options(), embeddingFunction);
} else { } else {
return new LocalTable(tbl, name, this._options()) return new LocalTable(tbl, name, this._options());
} }
} }
@@ -842,69 +850,69 @@ export class LocalConnection implements Connection {
* Drop an existing table. * Drop an existing table.
* @param name The name of the table to drop. * @param name The name of the table to drop.
*/ */
async dropTable (name: string): Promise<void> { async dropTable(name: string): Promise<void> {
await databaseDropTable.call(this._db, name) await databaseDropTable.call(this._db, name);
} }
withMiddleware (middleware: HttpMiddleware): Connection { withMiddleware(middleware: HttpMiddleware): Connection {
return this return this;
} }
} }
export class LocalTable<T = number[]> implements Table<T> { export class LocalTable<T = number[]> implements Table<T> {
private _tbl: any private _tbl: any;
private readonly _name: string private readonly _name: string;
private readonly _isElectron: boolean private readonly _isElectron: boolean;
private readonly _embeddings?: EmbeddingFunction<T> private readonly _embeddings?: EmbeddingFunction<T>;
private readonly _options: () => ConnectionOptions private readonly _options: () => ConnectionOptions;
constructor (tbl: any, name: string, options: ConnectionOptions) constructor(tbl: any, name: string, options: ConnectionOptions);
/** /**
* @param tbl * @param tbl
* @param name * @param name
* @param options * @param options
* @param embeddings An embedding function to use when interacting with this table * @param embeddings An embedding function to use when interacting with this table
*/ */
constructor ( constructor(
tbl: any, tbl: any,
name: string, name: string,
options: ConnectionOptions, options: ConnectionOptions,
embeddings: EmbeddingFunction<T> embeddings: EmbeddingFunction<T>
) );
constructor ( constructor(
tbl: any, tbl: any,
name: string, name: string,
options: ConnectionOptions, options: ConnectionOptions,
embeddings?: EmbeddingFunction<T> embeddings?: EmbeddingFunction<T>
) { ) {
this._tbl = tbl this._tbl = tbl;
this._name = name this._name = name;
this._embeddings = embeddings this._embeddings = embeddings;
this._options = () => options this._options = () => options;
this._isElectron = this.checkElectron() this._isElectron = this.checkElectron();
} }
get name (): string { get name(): string {
return this._name return this._name;
} }
/** /**
* Creates a search query to find the nearest neighbors of the given search term * Creates a search query to find the nearest neighbors of the given search term
* @param query The query search term * @param query The query search term
*/ */
search (query: T): Query<T> { search(query: T): Query<T> {
return new Query(query, this._tbl, this._embeddings) return new Query(query, this._tbl, this._embeddings);
} }
/** /**
* Creates a filter query to find all rows matching the specified criteria * Creates a filter query to find all rows matching the specified criteria
* @param value The filter criteria (like SQL where clause syntax) * @param value The filter criteria (like SQL where clause syntax)
*/ */
filter (value: string): Query<T> { filter(value: string): Query<T> {
return new Query(undefined, this._tbl, this._embeddings).filter(value) return new Query(undefined, this._tbl, this._embeddings).filter(value);
} }
where = this.filter where = this.filter;
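Both entry points return a `Query` builder; a typical chain might look like the following (the builder's `limit`/`execute` methods come from elsewhere in the library, and the column names are hypothetical):

```ts
// Nearest-neighbour search with a post-filter.
const hits = await tbl
  .search([0.1, 0.2])
  .filter("label = 'updated'")
  .limit(5)
  .execute();

// Filter-only scan: no query vector involved.
const rows = await tbl.filter("id > 100").execute();
```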
/** /**
* Insert records into this Table. * Insert records into this Table.
@@ -912,16 +920,19 @@ export class LocalTable<T = number[]> implements Table<T> {
* @param data Records to be inserted into the Table * @param data Records to be inserted into the Table
* @return The number of rows added to the table * @return The number of rows added to the table
*/ */
async add ( async add(
data: Array<Record<string, unknown>> | ArrowTable data: Array<Record<string, unknown>> | ArrowTable
): Promise<number> { ): Promise<number> {
const schema = await this.schema const schema = await this.schema;
let tbl: ArrowTable
let tbl: ArrowTable;
if (data instanceof ArrowTable) { if (data instanceof ArrowTable) {
tbl = data tbl = data;
} else { } else {
tbl = makeArrowTable(data, { schema }) tbl = makeArrowTable(data, { schema, embeddings: this._embeddings });
} }
return tableAdd return tableAdd
.call( .call(
this._tbl, this._tbl,
@@ -930,8 +941,8 @@ export class LocalTable<T = number[]> implements Table<T> {
...getAwsArgs(this._options()) ...getAwsArgs(this._options())
) )
.then((newTable: any) => { .then((newTable: any) => {
this._tbl = newTable this._tbl = newTable;
}) });
} }
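For example — with the `embeddings` option now forwarded to `makeArrowTable`, a configured embedding function can fill in vector columns that the records omit:

```ts
// Append plain records; they are converted through makeArrowTable using the
// table's schema (and its embedding function, if one was configured).
await tbl.add([
  { id: 2, vector: [0.3, 0.4] },
  { id: 3, vector: [0.5, 0.6] },
]);
```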
/** /**
@@ -940,14 +951,14 @@ export class LocalTable<T = number[]> implements Table<T> {
* @param data Records to be inserted into the Table * @param data Records to be inserted into the Table
* @return The number of rows added to the table * @return The number of rows added to the table
*/ */
async overwrite ( async overwrite(
data: Array<Record<string, unknown>> | ArrowTable data: Array<Record<string, unknown>> | ArrowTable
): Promise<number> { ): Promise<number> {
let buffer: Buffer let buffer: Buffer;
if (data instanceof ArrowTable) { if (data instanceof ArrowTable) {
buffer = await fromTableToBuffer(data, this._embeddings) buffer = await fromTableToBuffer(data, this._embeddings);
} else { } else {
buffer = await fromRecordsToBuffer(data, this._embeddings) buffer = await fromRecordsToBuffer(data, this._embeddings);
} }
return tableAdd return tableAdd
.call( .call(
@@ -957,8 +968,8 @@ export class LocalTable<T = number[]> implements Table<T> {
...getAwsArgs(this._options()) ...getAwsArgs(this._options())
) )
.then((newTable: any) => { .then((newTable: any) => {
this._tbl = newTable this._tbl = newTable;
}) });
} }
/** /**
@@ -966,26 +977,26 @@ export class LocalTable<T = number[]> implements Table<T> {
* *
* @param indexParams The parameters of this Index, @see VectorIndexParams. * @param indexParams The parameters of this Index, @see VectorIndexParams.
*/ */
async createIndex (indexParams: VectorIndexParams): Promise<any> { async createIndex(indexParams: VectorIndexParams): Promise<any> {
return tableCreateVectorIndex return tableCreateVectorIndex
.call(this._tbl, indexParams) .call(this._tbl, indexParams)
.then((newTable: any) => { .then((newTable: any) => {
this._tbl = newTable this._tbl = newTable;
}) });
} }
async createScalarIndex (column: string, replace?: boolean): Promise<void> { async createScalarIndex(column: string, replace?: boolean): Promise<void> {
if (replace === undefined) { if (replace === undefined) {
replace = true replace = true;
} }
return tableCreateScalarIndex.call(this._tbl, column, replace) return tableCreateScalarIndex.call(this._tbl, column, replace);
} }
/** /**
* Returns the number of rows in this table. * Returns the number of rows in this table.
*/ */
async countRows (filter?: string): Promise<number> { async countRows(filter?: string): Promise<number> {
return tableCountRows.call(this._tbl, filter) return tableCountRows.call(this._tbl, filter);
} }
/** /**
@@ -993,10 +1004,10 @@ export class LocalTable<T = number[]> implements Table<T> {
* *
* @param filter A filter in the same format used by a sql WHERE clause. * @param filter A filter in the same format used by a sql WHERE clause.
*/ */
async delete (filter: string): Promise<void> { async delete(filter: string): Promise<void> {
return tableDelete.call(this._tbl, filter).then((newTable: any) => { return tableDelete.call(this._tbl, filter).then((newTable: any) => {
this._tbl = newTable this._tbl = newTable;
}) });
} }
/** /**
@@ -1006,55 +1017,65 @@ export class LocalTable<T = number[]> implements Table<T> {
* *
* @returns * @returns
*/ */
async update (args: UpdateArgs | UpdateSqlArgs): Promise<void> { async update(args: UpdateArgs | UpdateSqlArgs): Promise<void> {
let filter: string | null let filter: string | null;
let updates: Record<string, string> let updates: Record<string, string>;
if ('valuesSql' in args) { if ("valuesSql" in args) {
filter = args.where ?? null filter = args.where ?? null;
updates = args.valuesSql updates = args.valuesSql;
} else { } else {
filter = args.where ?? null filter = args.where ?? null;
updates = {} updates = {};
for (const [key, value] of Object.entries(args.values)) { for (const [key, value] of Object.entries(args.values)) {
updates[key] = toSQL(value) updates[key] = toSQL(value);
} }
} }
return tableUpdate return tableUpdate
.call(this._tbl, filter, updates) .call(this._tbl, filter, updates)
.then((newTable: any) => { .then((newTable: any) => {
this._tbl = newTable this._tbl = newTable;
}) });
} }
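The two accepted argument shapes in practice (column names hypothetical):

```ts
// UpdateArgs: literal values, each converted to SQL via toSQL().
await tbl.update({ where: "id = 1", values: { label: "seen" } });

// UpdateSqlArgs: raw SQL expressions, passed through verbatim.
await tbl.update({ where: "id = 1", valuesSql: { views: "views + 1" } });
```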
async mergeInsert (on: string, data: Array<Record<string, unknown>> | ArrowTable, args: MergeInsertArgs): Promise<void> { async mergeInsert(
let whenMatchedUpdateAll = false on: string,
let whenMatchedUpdateAllFilt = null data: Array<Record<string, unknown>> | ArrowTable,
if (args.whenMatchedUpdateAll !== undefined && args.whenMatchedUpdateAll !== null) { args: MergeInsertArgs
whenMatchedUpdateAll = true ): Promise<void> {
let whenMatchedUpdateAll = false;
let whenMatchedUpdateAllFilt = null;
if (
args.whenMatchedUpdateAll !== undefined &&
args.whenMatchedUpdateAll !== null
) {
whenMatchedUpdateAll = true;
if (args.whenMatchedUpdateAll !== true) { if (args.whenMatchedUpdateAll !== true) {
whenMatchedUpdateAllFilt = args.whenMatchedUpdateAll whenMatchedUpdateAllFilt = args.whenMatchedUpdateAll;
} }
} }
const whenNotMatchedInsertAll = args.whenNotMatchedInsertAll ?? false const whenNotMatchedInsertAll = args.whenNotMatchedInsertAll ?? false;
let whenNotMatchedBySourceDelete = false let whenNotMatchedBySourceDelete = false;
let whenNotMatchedBySourceDeleteFilt = null let whenNotMatchedBySourceDeleteFilt = null;
if (args.whenNotMatchedBySourceDelete !== undefined && args.whenNotMatchedBySourceDelete !== null) { if (
whenNotMatchedBySourceDelete = true args.whenNotMatchedBySourceDelete !== undefined &&
args.whenNotMatchedBySourceDelete !== null
) {
whenNotMatchedBySourceDelete = true;
if (args.whenNotMatchedBySourceDelete !== true) { if (args.whenNotMatchedBySourceDelete !== true) {
whenNotMatchedBySourceDeleteFilt = args.whenNotMatchedBySourceDelete whenNotMatchedBySourceDeleteFilt = args.whenNotMatchedBySourceDelete;
} }
} }
const schema = await this.schema const schema = await this.schema;
let tbl: ArrowTable let tbl: ArrowTable;
if (data instanceof ArrowTable) { if (data instanceof ArrowTable) {
tbl = data tbl = data;
} else { } else {
tbl = makeArrowTable(data, { schema }) tbl = makeArrowTable(data, { schema });
} }
const buffer = await fromTableToBuffer(tbl, this._embeddings, schema) const buffer = await fromTableToBuffer(tbl, this._embeddings, schema);
this._tbl = await tableMergeInsert.call( this._tbl = await tableMergeInsert.call(
this._tbl, this._tbl,
@@ -1065,7 +1086,7 @@ export class LocalTable<T = number[]> implements Table<T> {
whenNotMatchedBySourceDelete, whenNotMatchedBySourceDelete,
whenNotMatchedBySourceDeleteFilt, whenNotMatchedBySourceDeleteFilt,
buffer buffer
) );
} }
/** /**
@@ -1083,16 +1104,16 @@ export class LocalTable<T = number[]> implements Table<T> {
* uphold this promise can lead to corrupted tables. * uphold this promise can lead to corrupted tables.
* @returns * @returns
*/ */
async cleanupOldVersions ( async cleanupOldVersions(
olderThan?: number, olderThan?: number,
deleteUnverified?: boolean deleteUnverified?: boolean
): Promise<CleanupStats> { ): Promise<CleanupStats> {
return tableCleanupOldVersions return tableCleanupOldVersions
.call(this._tbl, olderThan, deleteUnverified) .call(this._tbl, olderThan, deleteUnverified)
.then((res: { newTable: any, metrics: CleanupStats }) => { .then((res: { newTable: any, metrics: CleanupStats }) => {
this._tbl = res.newTable this._tbl = res.newTable;
return res.metrics return res.metrics;
}) });
} }
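A minimal invocation; with no arguments the defaults described in the (elided) doc comment apply:

```ts
// Reclaim space held by old table versions. The units of olderThan are
// documented in the doc comment elided from this hunk.
const stats = await tbl.cleanupOldVersions();
console.log(stats); // CleanupStats summarising what was removed
```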
/** /**
@@ -1106,62 +1127,64 @@ export class LocalTable<T = number[]> implements Table<T> {
* for most tables. * for most tables.
* @returns Metrics about the compaction operation. * @returns Metrics about the compaction operation.
*/ */
async compactFiles (options?: CompactionOptions): Promise<CompactionMetrics> { async compactFiles(options?: CompactionOptions): Promise<CompactionMetrics> {
const optionsArg = options ?? {} const optionsArg = options ?? {};
return tableCompactFiles return tableCompactFiles
.call(this._tbl, optionsArg) .call(this._tbl, optionsArg)
.then((res: { newTable: any, metrics: CompactionMetrics }) => { .then((res: { newTable: any, metrics: CompactionMetrics }) => {
this._tbl = res.newTable this._tbl = res.newTable;
return res.metrics return res.metrics;
}) });
} }
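For instance, using the `targetRowsPerFragment` option documented further down in this diff (the value is illustrative):

```ts
// Rewrite many small fragments into fewer large ones.
const metrics = await tbl.compactFiles({ targetRowsPerFragment: 1_000_000 });
console.log(metrics); // CompactionMetrics for the operation
```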
async listIndices (): Promise<VectorIndex[]> { async listIndices(): Promise<VectorIndex[]> {
return tableListIndices.call(this._tbl) return tableListIndices.call(this._tbl);
} }
async indexStats (indexUuid: string): Promise<IndexStats> { async indexStats(indexUuid: string): Promise<IndexStats> {
return tableIndexStats.call(this._tbl, indexUuid) return tableIndexStats.call(this._tbl, indexUuid);
} }
get schema (): Promise<Schema> { get schema(): Promise<Schema> {
// empty table // empty table
return this.getSchema() return this.getSchema();
} }
private async getSchema (): Promise<Schema> { private async getSchema(): Promise<Schema> {
const buffer = await tableSchema.call(this._tbl, this._isElectron) const buffer = await tableSchema.call(this._tbl, this._isElectron);
const table = tableFromIPC(buffer) const table = tableFromIPC(buffer);
return table.schema return table.schema;
} }
// See https://github.com/electron/electron/issues/2288 // See https://github.com/electron/electron/issues/2288
private checkElectron (): boolean { private checkElectron(): boolean {
try { try {
// eslint-disable-next-line no-prototype-builtins // eslint-disable-next-line no-prototype-builtins
return ( return (
Object.prototype.hasOwnProperty.call(process?.versions, 'electron') || Object.prototype.hasOwnProperty.call(process?.versions, "electron") ||
navigator?.userAgent?.toLowerCase()?.includes(' electron') navigator?.userAgent?.toLowerCase()?.includes(" electron")
) );
} catch (e) { } catch (e) {
return false return false;
} }
} }
async addColumns (newColumnTransforms: Array<{ name: string, valueSql: string }>): Promise<void> { async addColumns(
return tableAddColumns.call(this._tbl, newColumnTransforms) newColumnTransforms: Array<{ name: string, valueSql: string }>
): Promise<void> {
return tableAddColumns.call(this._tbl, newColumnTransforms);
} }
async alterColumns (columnAlterations: ColumnAlteration[]): Promise<void> { async alterColumns(columnAlterations: ColumnAlteration[]): Promise<void> {
return tableAlterColumns.call(this._tbl, columnAlterations) return tableAlterColumns.call(this._tbl, columnAlterations);
} }
async dropColumns (columnNames: string[]): Promise<void> { async dropColumns(columnNames: string[]): Promise<void> {
return tableDropColumns.call(this._tbl, columnNames) return tableDropColumns.call(this._tbl, columnNames);
} }
withMiddleware (middleware: HttpMiddleware): Table<T> { withMiddleware(middleware: HttpMiddleware): Table<T> {
return this return this;
} }
} }
@@ -1184,7 +1207,7 @@ export interface CompactionOptions {
*/ */
targetRowsPerFragment?: number targetRowsPerFragment?: number
/** /**
 * The maximum number of rows per group. Defaults to 1024. * The maximum number of rows per group. Defaults to 1024.
*/ */
maxRowsPerGroup?: number maxRowsPerGroup?: number
/** /**
@@ -1284,21 +1307,21 @@ export interface IvfPQIndexConfig {
*/ */
index_cache_size?: number index_cache_size?: number
type: 'ivf_pq' type: "ivf_pq"
} }
export type VectorIndexParams = IvfPQIndexConfig export type VectorIndexParams = IvfPQIndexConfig;
/** /**
* Write mode for writing a table. * Write mode for writing a table.
*/ */
export enum WriteMode { export enum WriteMode {
/** Create a new {@link Table}. */ /** Create a new {@link Table}. */
Create = 'create', Create = "create",
/** Overwrite the existing {@link Table} if presented. */ /** Overwrite the existing {@link Table} if presented. */
Overwrite = 'overwrite', Overwrite = "overwrite",
/** Append new data to the table. */ /** Append new data to the table. */
Append = 'append', Append = "append",
} }
/** /**
@@ -1310,14 +1333,14 @@ export interface WriteOptions {
} }
export class DefaultWriteOptions implements WriteOptions { export class DefaultWriteOptions implements WriteOptions {
writeMode = WriteMode.Create writeMode = WriteMode.Create;
} }
export function isWriteOptions (value: any): value is WriteOptions { export function isWriteOptions(value: any): value is WriteOptions {
return ( return (
Object.keys(value).length === 1 && Object.keys(value).length === 1 &&
(value.writeMode === undefined || typeof value.writeMode === 'string') (value.writeMode === undefined || typeof value.writeMode === "string")
) );
} }
/** /**
@@ -1327,15 +1350,15 @@ export enum MetricType {
/** /**
* Euclidean distance * Euclidean distance
*/ */
L2 = 'l2', L2 = "l2",
/** /**
* Cosine distance * Cosine distance
*/ */
Cosine = 'cosine', Cosine = "cosine",
/** /**
* Dot product * Dot product
*/ */
Dot = 'dot', Dot = "dot",
} }
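Tying `IvfPQIndexConfig` and `MetricType` together, a hedged sketch — `column`, `num_partitions`, and `num_sub_vectors` are fields of the config elided from this hunk, and all values are illustrative:

```ts
import { MetricType } from "vectordb"; // assumed package name

await tbl.createIndex({
  type: "ivf_pq",
  column: "vector", // assumed field name, elided above
  metric_type: MetricType.Cosine,
  num_partitions: 256, // assumed, elided above
  num_sub_vectors: 16, // assumed, elided above
});
```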

@@ -32,7 +32,7 @@ import {
Bool, Bool,
Date_, Date_,
Decimal, Decimal,
DataType, type DataType,
Dictionary, Dictionary,
Binary, Binary,
Float32, Float32,
@@ -74,12 +74,12 @@ import {
DurationNanosecond, DurationNanosecond,
DurationMicrosecond, DurationMicrosecond,
DurationMillisecond, DurationMillisecond,
DurationSecond, DurationSecond
} from "apache-arrow"; } from "apache-arrow";
import type { IntBitWidth, TimeBitWidth } from "apache-arrow/type"; import type { IntBitWidth, TimeBitWidth } from "apache-arrow/type";
function sanitizeMetadata( function sanitizeMetadata(
metadataLike?: unknown, metadataLike?: unknown
): Map<string, string> | undefined { ): Map<string, string> | undefined {
if (metadataLike === undefined || metadataLike === null) { if (metadataLike === undefined || metadataLike === null) {
return undefined; return undefined;
@@ -90,7 +90,7 @@ function sanitizeMetadata(
for (const item of metadataLike) { for (const item of metadataLike) {
if (!(typeof item[0] === "string" || !(typeof item[1] === "string"))) { if (!(typeof item[0] === "string" || !(typeof item[1] === "string"))) {
throw Error( throw Error(
"Expected metadata, if present, to be a Map<string, string> but it had non-string keys or values", "Expected metadata, if present, to be a Map<string, string> but it had non-string keys or values"
); );
} }
} }
@@ -105,7 +105,7 @@ function sanitizeInt(typeLike: object) {
typeof typeLike.isSigned !== "boolean" typeof typeLike.isSigned !== "boolean"
) { ) {
throw Error( throw Error(
"Expected an Int Type to have a `bitWidth` and `isSigned` property", "Expected an Int Type to have a `bitWidth` and `isSigned` property"
); );
} }
return new Int(typeLike.isSigned, typeLike.bitWidth as IntBitWidth); return new Int(typeLike.isSigned, typeLike.bitWidth as IntBitWidth);
@@ -128,7 +128,7 @@ function sanitizeDecimal(typeLike: object) {
typeof typeLike.bitWidth !== "number" typeof typeLike.bitWidth !== "number"
) { ) {
throw Error( throw Error(
"Expected a Decimal Type to have `scale`, `precision`, and `bitWidth` properties", "Expected a Decimal Type to have `scale`, `precision`, and `bitWidth` properties"
); );
} }
return new Decimal(typeLike.scale, typeLike.precision, typeLike.bitWidth); return new Decimal(typeLike.scale, typeLike.precision, typeLike.bitWidth);
@@ -149,7 +149,7 @@ function sanitizeTime(typeLike: object) {
typeof typeLike.bitWidth !== "number" typeof typeLike.bitWidth !== "number"
) { ) {
throw Error( throw Error(
"Expected a Time type to have `unit` and `bitWidth` properties", "Expected a Time type to have `unit` and `bitWidth` properties"
); );
} }
return new Time(typeLike.unit, typeLike.bitWidth as TimeBitWidth); return new Time(typeLike.unit, typeLike.bitWidth as TimeBitWidth);
@@ -172,7 +172,7 @@ function sanitizeTypedTimestamp(
| typeof TimestampNanosecond | typeof TimestampNanosecond
| typeof TimestampMicrosecond | typeof TimestampMicrosecond
| typeof TimestampMillisecond | typeof TimestampMillisecond
| typeof TimestampSecond, | typeof TimestampSecond
) { ) {
let timezone = null; let timezone = null;
if ("timezone" in typeLike && typeof typeLike.timezone === "string") { if ("timezone" in typeLike && typeof typeLike.timezone === "string") {
@@ -191,7 +191,7 @@ function sanitizeInterval(typeLike: object) {
function sanitizeList(typeLike: object) { function sanitizeList(typeLike: object) {
if (!("children" in typeLike) || !Array.isArray(typeLike.children)) { if (!("children" in typeLike) || !Array.isArray(typeLike.children)) {
throw Error( throw Error(
"Expected a List type to have an array-like `children` property", "Expected a List type to have an array-like `children` property"
); );
} }
if (typeLike.children.length !== 1) { if (typeLike.children.length !== 1) {
@@ -203,7 +203,7 @@ function sanitizeList(typeLike: object) {
function sanitizeStruct(typeLike: object) { function sanitizeStruct(typeLike: object) {
if (!("children" in typeLike) || !Array.isArray(typeLike.children)) { if (!("children" in typeLike) || !Array.isArray(typeLike.children)) {
throw Error( throw Error(
"Expected a Struct type to have an array-like `children` property", "Expected a Struct type to have an array-like `children` property"
); );
} }
return new Struct(typeLike.children.map((child) => sanitizeField(child))); return new Struct(typeLike.children.map((child) => sanitizeField(child)));
@@ -216,47 +216,47 @@ function sanitizeUnion(typeLike: object) {
typeof typeLike.mode !== "number" typeof typeLike.mode !== "number"
) { ) {
throw Error( throw Error(
"Expected a Union type to have `typeIds` and `mode` properties", "Expected a Union type to have `typeIds` and `mode` properties"
); );
} }
if (!("children" in typeLike) || !Array.isArray(typeLike.children)) { if (!("children" in typeLike) || !Array.isArray(typeLike.children)) {
throw Error( throw Error(
"Expected a Union type to have an array-like `children` property", "Expected a Union type to have an array-like `children` property"
); );
} }
return new Union( return new Union(
typeLike.mode, typeLike.mode,
typeLike.typeIds as any, typeLike.typeIds as any,
typeLike.children.map((child) => sanitizeField(child)), typeLike.children.map((child) => sanitizeField(child))
); );
} }
function sanitizeTypedUnion( function sanitizeTypedUnion(
typeLike: object, typeLike: object,
UnionType: typeof DenseUnion | typeof SparseUnion, UnionType: typeof DenseUnion | typeof SparseUnion
) { ) {
if (!("typeIds" in typeLike)) { if (!("typeIds" in typeLike)) {
throw Error( throw Error(
"Expected a DenseUnion/SparseUnion type to have a `typeIds` property", "Expected a DenseUnion/SparseUnion type to have a `typeIds` property"
); );
} }
if (!("children" in typeLike) || !Array.isArray(typeLike.children)) { if (!("children" in typeLike) || !Array.isArray(typeLike.children)) {
throw Error( throw Error(
"Expected a DenseUnion/SparseUnion type to have an array-like `children` property", "Expected a DenseUnion/SparseUnion type to have an array-like `children` property"
); );
} }
return new UnionType( return new UnionType(
typeLike.typeIds as any, typeLike.typeIds as any,
typeLike.children.map((child) => sanitizeField(child)), typeLike.children.map((child) => sanitizeField(child))
); );
} }
function sanitizeFixedSizeBinary(typeLike: object) { function sanitizeFixedSizeBinary(typeLike: object) {
if (!("byteWidth" in typeLike) || typeof typeLike.byteWidth !== "number") { if (!("byteWidth" in typeLike) || typeof typeLike.byteWidth !== "number") {
throw Error( throw Error(
"Expected a FixedSizeBinary type to have a `byteWidth` property", "Expected a FixedSizeBinary type to have a `byteWidth` property"
); );
} }
return new FixedSizeBinary(typeLike.byteWidth); return new FixedSizeBinary(typeLike.byteWidth);
@@ -268,7 +268,7 @@ function sanitizeFixedSizeList(typeLike: object) {
} }
if (!("children" in typeLike) || !Array.isArray(typeLike.children)) { if (!("children" in typeLike) || !Array.isArray(typeLike.children)) {
throw Error( throw Error(
"Expected a FixedSizeList type to have an array-like `children` property", "Expected a FixedSizeList type to have an array-like `children` property"
); );
} }
if (typeLike.children.length !== 1) { if (typeLike.children.length !== 1) {
@@ -276,14 +276,14 @@ function sanitizeFixedSizeList(typeLike: object) {
} }
return new FixedSizeList( return new FixedSizeList(
typeLike.listSize, typeLike.listSize,
sanitizeField(typeLike.children[0]), sanitizeField(typeLike.children[0])
); );
} }
function sanitizeMap(typeLike: object) { function sanitizeMap(typeLike: object) {
if (!("children" in typeLike) || !Array.isArray(typeLike.children)) { if (!("children" in typeLike) || !Array.isArray(typeLike.children)) {
throw Error( throw Error(
"Expected a Map type to have an array-like `children` property", "Expected a Map type to have an array-like `children` property"
); );
} }
if (!("keysSorted" in typeLike) || typeof typeLike.keysSorted !== "boolean") { if (!("keysSorted" in typeLike) || typeof typeLike.keysSorted !== "boolean") {
@@ -291,7 +291,7 @@ function sanitizeMap(typeLike: object) {
} }
return new Map_( return new Map_(
typeLike.children.map((field) => sanitizeField(field)) as any, typeLike.children.map((field) => sanitizeField(field)) as any,
typeLike.keysSorted, typeLike.keysSorted
); );
} }
@@ -319,7 +319,7 @@ function sanitizeDictionary(typeLike: object) {
sanitizeType(typeLike.dictionary), sanitizeType(typeLike.dictionary),
sanitizeType(typeLike.indices) as any, sanitizeType(typeLike.indices) as any,
typeLike.id, typeLike.id,
typeLike.isOrdered, typeLike.isOrdered
); );
} }
@@ -454,7 +454,7 @@ function sanitizeField(fieldLike: unknown): Field {
!("nullable" in fieldLike) !("nullable" in fieldLike)
) { ) {
throw Error( throw Error(
"The field passed in is missing a `type`/`name`/`nullable` property", "The field passed in is missing a `type`/`name`/`nullable` property"
); );
} }
const type = sanitizeType(fieldLike.type); const type = sanitizeType(fieldLike.type);
@@ -489,7 +489,7 @@ export function sanitizeSchema(schemaLike: unknown): Schema {
} }
if (!("fields" in schemaLike)) { if (!("fields" in schemaLike)) {
throw Error( throw Error(
"The schema passed in does not appear to be a schema (no 'fields' property)", "The schema passed in does not appear to be a schema (no 'fields' property)"
); );
} }
let metadata; let metadata;
@@ -498,11 +498,11 @@ export function sanitizeSchema(schemaLike: unknown): Schema {
} }
if (!Array.isArray(schemaLike.fields)) { if (!Array.isArray(schemaLike.fields)) {
throw Error( throw Error(
"The schema passed in had a 'fields' property but it was not an array", "The schema passed in had a 'fields' property but it was not an array"
); );
} }
const sanitizedFields = schemaLike.fields.map((field) => const sanitizedFields = schemaLike.fields.map((field) =>
sanitizeField(field), sanitizeField(field)
); );
return new Schema(sanitizedFields, metadata); return new Schema(sanitizedFields, metadata);
} }

@@ -1,3 +0,0 @@
**/dist/**/*
**/native.js
**/native.d.ts

@@ -1 +0,0 @@
.eslintignore

@@ -43,29 +43,20 @@ npm run test
### Running lint / format ### Running lint / format
LanceDb uses eslint for linting. VSCode does not need any plugins to use eslint. However, it LanceDb uses [biome](https://biomejs.dev/) for linting and formatting. If you are using VSCode you will need to install the official [Biome](https://marketplace.visualstudio.com/items?itemName=biomejs.biome) extension.
may need some additional configuration. Make sure that eslint.experimental.useFlatConfig is To manually lint your code you can run:
set to true. Also, if your vscode root folder is the repo root then you will need to set
the eslint.workingDirectories to ["nodejs"]. To manually lint your code you can run:
```sh ```sh
npm run lint npm run lint
``` ```
LanceDb uses prettier for formatting. If you are using VSCode you will need to install the To automatically fix all fixable issues:
"Prettier - Code formatter" extension. You should then configure it to be the default formatter
for typescript and you should enable format on save. To manually check your code's format you
can run:
```sh ```sh
npm run chkformat npm run lint-fix
``` ```
If you need to manually format your code you can run: If your workspace root is not set to the `nodejs` directory, the extension unfortunately will not work, but you can still run the linting and formatting commands manually.
```sh
npx prettier --write .
```
### Generating docs ### Generating docs

@@ -13,32 +13,26 @@
// limitations under the License. // limitations under the License.
import { import {
convertToTable,
fromTableToBuffer,
makeArrowTable,
makeEmptyTable,
} from "../dist/arrow";
import {
Field,
FixedSizeList,
Float16,
Float32,
Int32,
tableFromIPC,
Schema,
Float64,
type Table,
Binary, Binary,
Bool, Bool,
Utf8,
Struct,
List,
DataType, DataType,
Dictionary, Dictionary,
Int64, Field,
FixedSizeList,
Float, Float,
Precision, Float16,
Float32,
Float64,
Int32,
Int64,
List,
MetadataVersion, MetadataVersion,
Precision,
Schema,
Struct,
type Table,
Utf8,
tableFromIPC,
} from "apache-arrow"; } from "apache-arrow";
import { import {
Dictionary as OldDictionary, Dictionary as OldDictionary,
@@ -46,14 +40,20 @@ import {
FixedSizeList as OldFixedSizeList, FixedSizeList as OldFixedSizeList,
Float32 as OldFloat32, Float32 as OldFloat32,
Int32 as OldInt32, Int32 as OldInt32,
Struct as OldStruct,
Schema as OldSchema, Schema as OldSchema,
Struct as OldStruct,
TimestampNanosecond as OldTimestampNanosecond, TimestampNanosecond as OldTimestampNanosecond,
Utf8 as OldUtf8, Utf8 as OldUtf8,
} from "apache-arrow-old"; } from "apache-arrow-old";
import { type EmbeddingFunction } from "../dist/embedding/embedding_function"; import {
convertToTable,
fromTableToBuffer,
makeArrowTable,
makeEmptyTable,
} from "../lancedb/arrow";
import { type EmbeddingFunction } from "../lancedb/embedding/embedding_function";
// eslint-disable-next-line @typescript-eslint/no-explicit-any // biome-ignore lint/suspicious/noExplicitAny: skip
function sampleRecords(): Array<Record<string, any>> { function sampleRecords(): Array<Record<string, any>> {
return [ return [
{ {
@@ -438,7 +438,7 @@ describe("when using two versions of arrow", function () {
new OldField("ts_no_tz", new OldTimestampNanosecond(null)), new OldField("ts_no_tz", new OldTimestampNanosecond(null)),
]), ]),
), ),
// eslint-disable-next-line @typescript-eslint/no-explicit-any // biome-ignore lint/suspicious/noExplicitAny: skip
]) as any; ]) as any;
schema.metadataVersion = MetadataVersion.V5; schema.metadataVersion = MetadataVersion.V5;
const table = makeArrowTable([], { schema }); const table = makeArrowTable([], { schema });

@@ -14,11 +14,13 @@
import * as tmp from "tmp"; import * as tmp from "tmp";
import { Connection, connect } from "../dist/index.js"; import { Connection, connect } from "../lancedb";
describe("when connecting", () => { describe("when connecting", () => {
let tmpDir: tmp.DirResult; let tmpDir: tmp.DirResult;
beforeEach(() => (tmpDir = tmp.dirSync({ unsafeCleanup: true }))); beforeEach(() => {
tmpDir = tmp.dirSync({ unsafeCleanup: true });
});
afterEach(() => tmpDir.removeCallback()); afterEach(() => tmpDir.removeCallback());
it("should connect", async () => { it("should connect", async () => {

@@ -14,7 +14,11 @@
/* eslint-disable @typescript-eslint/naming-convention */ /* eslint-disable @typescript-eslint/naming-convention */
import { connect } from "../dist"; import {
CreateKeyCommand,
KMSClient,
ScheduleKeyDeletionCommand,
} from "@aws-sdk/client-kms";
import { import {
CreateBucketCommand, CreateBucketCommand,
DeleteBucketCommand, DeleteBucketCommand,
@@ -23,11 +27,7 @@ import {
ListObjectsV2Command, ListObjectsV2Command,
S3Client, S3Client,
} from "@aws-sdk/client-s3"; } from "@aws-sdk/client-s3";
import { import { connect } from "../lancedb";
CreateKeyCommand,
ScheduleKeyDeletionCommand,
KMSClient,
} from "@aws-sdk/client-kms";
// Skip these tests unless the S3_TEST environment variable is set // Skip these tests unless the S3_TEST environment variable is set
const maybeDescribe = process.env.S3_TEST ? describe : describe.skip; const maybeDescribe = process.env.S3_TEST ? describe : describe.skip;
@@ -63,9 +63,10 @@ class S3Bucket {
// Delete the bucket if it already exists // Delete the bucket if it already exists
try { try {
await this.deleteBucket(client, name); await this.deleteBucket(client, name);
} catch (e) { } catch {
// It's fine if the bucket doesn't exist // It's fine if the bucket doesn't exist
} }
// biome-ignore lint/style/useNamingConvention: we don't control s3's api
await client.send(new CreateBucketCommand({ Bucket: name })); await client.send(new CreateBucketCommand({ Bucket: name }));
return new S3Bucket(name); return new S3Bucket(name);
} }
@@ -78,27 +79,32 @@ class S3Bucket {
static async deleteBucket(client: S3Client, name: string) { static async deleteBucket(client: S3Client, name: string) {
// Must delete all objects before we can delete the bucket // Must delete all objects before we can delete the bucket
const objects = await client.send( const objects = await client.send(
// biome-ignore lint/style/useNamingConvention: we don't control s3's api
new ListObjectsV2Command({ Bucket: name }), new ListObjectsV2Command({ Bucket: name }),
); );
if (objects.Contents) { if (objects.Contents) {
for (const object of objects.Contents) { for (const object of objects.Contents) {
await client.send( await client.send(
// biome-ignore lint/style/useNamingConvention: we don't control s3's api
new DeleteObjectCommand({ Bucket: name, Key: object.Key }), new DeleteObjectCommand({ Bucket: name, Key: object.Key }),
); );
} }
} }
// biome-ignore lint/style/useNamingConvention: we don't control s3's api
await client.send(new DeleteBucketCommand({ Bucket: name })); await client.send(new DeleteBucketCommand({ Bucket: name }));
} }
public async assertAllEncrypted(path: string, keyId: string) { public async assertAllEncrypted(path: string, keyId: string) {
const client = S3Bucket.s3Client(); const client = S3Bucket.s3Client();
const objects = await client.send( const objects = await client.send(
// biome-ignore lint/style/useNamingConvention: we don't control s3's api
new ListObjectsV2Command({ Bucket: this.name, Prefix: path }), new ListObjectsV2Command({ Bucket: this.name, Prefix: path }),
); );
if (objects.Contents) { if (objects.Contents) {
for (const object of objects.Contents) { for (const object of objects.Contents) {
const metadata = await client.send( const metadata = await client.send(
// biome-ignore lint/style/useNamingConvention: we don't control s3's api
new HeadObjectCommand({ Bucket: this.name, Key: object.Key }), new HeadObjectCommand({ Bucket: this.name, Key: object.Key }),
); );
expect(metadata.ServerSideEncryption).toBe("aws:kms"); expect(metadata.ServerSideEncryption).toBe("aws:kms");
@@ -137,6 +143,7 @@ class KmsKey {
public async delete() { public async delete() {
const client = KmsKey.kmsClient(); const client = KmsKey.kmsClient();
// biome-ignore lint/style/useNamingConvention: we don't control the KMS api
await client.send(new ScheduleKeyDeletionCommand({ KeyId: this.keyId })); await client.send(new ScheduleKeyDeletionCommand({ KeyId: this.keyId }));
} }
} }

@@ -16,18 +16,18 @@ import * as fs from "fs";
import * as path from "path"; import * as path from "path";
import * as tmp from "tmp"; import * as tmp from "tmp";
import { Table, connect } from "../dist";
import { import {
Schema,
Field, Field,
Float32,
Int32,
FixedSizeList, FixedSizeList,
Int64, Float32,
Float64, Float64,
Int32,
Int64,
Schema,
} from "apache-arrow"; } from "apache-arrow";
import { makeArrowTable } from "../dist/arrow"; import { Table, connect } from "../lancedb";
import { Index } from "../dist/indices"; import { makeArrowTable } from "../lancedb/arrow";
import { Index } from "../lancedb/indices";
describe("Given a table", () => { describe("Given a table", () => {
let tmpDir: tmp.DirResult; let tmpDir: tmp.DirResult;

nodejs/biome.json (new file, 136 lines)
@@ -0,0 +1,136 @@
{
"$schema": "https://biomejs.dev/schemas/1.7.3/schema.json",
"organizeImports": {
"enabled": true
},
"files": {
"ignore": [
"**/dist/**/*",
"**/native.js",
"**/native.d.ts",
"**/npm/**/*",
"**/.vscode/**"
]
},
"formatter": {
"indentStyle": "space"
},
"linter": {
"enabled": true,
"rules": {
"recommended": false,
"complexity": {
"noBannedTypes": "error",
"noExtraBooleanCast": "error",
"noMultipleSpacesInRegularExpressionLiterals": "error",
"noUselessCatch": "error",
"noUselessThisAlias": "error",
"noUselessTypeConstraint": "error",
"noWith": "error"
},
"correctness": {
"noConstAssign": "error",
"noConstantCondition": "error",
"noEmptyCharacterClassInRegex": "error",
"noEmptyPattern": "error",
"noGlobalObjectCalls": "error",
"noInnerDeclarations": "error",
"noInvalidConstructorSuper": "error",
"noNewSymbol": "error",
"noNonoctalDecimalEscape": "error",
"noPrecisionLoss": "error",
"noSelfAssign": "error",
"noSetterReturn": "error",
"noSwitchDeclarations": "error",
"noUndeclaredVariables": "error",
"noUnreachable": "error",
"noUnreachableSuper": "error",
"noUnsafeFinally": "error",
"noUnsafeOptionalChaining": "error",
"noUnusedLabels": "error",
"noUnusedVariables": "error",
"useIsNan": "error",
"useValidForDirection": "error",
"useYield": "error"
},
"style": {
"noNamespace": "error",
"useAsConstAssertion": "error",
"useBlockStatements": "off",
"useNamingConvention": {
"level": "error",
"options": {
"strictCase": false
}
}
},
"suspicious": {
"noAssignInExpressions": "error",
"noAsyncPromiseExecutor": "error",
"noCatchAssign": "error",
"noClassAssign": "error",
"noCompareNegZero": "error",
"noControlCharactersInRegex": "error",
"noDebugger": "error",
"noDuplicateCase": "error",
"noDuplicateClassMembers": "error",
"noDuplicateObjectKeys": "error",
"noDuplicateParameters": "error",
"noEmptyBlockStatements": "error",
"noExplicitAny": "error",
"noExtraNonNullAssertion": "error",
"noFallthroughSwitchClause": "error",
"noFunctionAssign": "error",
"noGlobalAssign": "error",
"noImportAssign": "error",
"noMisleadingCharacterClass": "error",
"noMisleadingInstantiator": "error",
"noPrototypeBuiltins": "error",
"noRedeclare": "error",
"noShadowRestrictedNames": "error",
"noUnsafeDeclarationMerging": "error",
"noUnsafeNegation": "error",
"useGetterReturn": "error",
"useValidTypeof": "error"
}
},
"ignore": ["**/dist/**/*", "**/native.js", "**/native.d.ts"]
},
"javascript": {
"globals": []
},
"overrides": [
{
"include": ["**/*.ts", "**/*.tsx", "**/*.mts", "**/*.cts"],
"linter": {
"rules": {
"correctness": {
"noConstAssign": "off",
"noGlobalObjectCalls": "off",
"noInvalidConstructorSuper": "off",
"noNewSymbol": "off",
"noSetterReturn": "off",
"noUndeclaredVariables": "off",
"noUnreachable": "off",
"noUnreachableSuper": "off"
},
"style": {
"noArguments": "error",
"noVar": "error",
"useConst": "error"
},
"suspicious": {
"noDuplicateClassMembers": "off",
"noDuplicateObjectKeys": "off",
"noDuplicateParameters": "off",
"noFunctionAssign": "off",
"noImportAssign": "off",
"noRedeclare": "off",
"noUnsafeNegation": "off",
"useGetterReturn": "off"
}
}
}
}
]
}

@@ -1,28 +0,0 @@
/* eslint-disable @typescript-eslint/naming-convention */
// @ts-check
const eslint = require("@eslint/js");
const tseslint = require("typescript-eslint");
const eslintConfigPrettier = require("eslint-config-prettier");
const jsdoc = require("eslint-plugin-jsdoc");
module.exports = tseslint.config(
eslint.configs.recommended,
jsdoc.configs["flat/recommended"],
eslintConfigPrettier,
...tseslint.configs.recommended,
{
rules: {
"@typescript-eslint/naming-convention": "error",
"jsdoc/require-returns": "off",
"jsdoc/require-param": "off",
"jsdoc/require-jsdoc": [
"error",
{
publicOnly: true,
},
],
},
plugins: jsdoc,
},
);

@@ -13,25 +13,25 @@
// limitations under the License. // limitations under the License.
import { import {
Field,
makeBuilder,
RecordBatchFileWriter,
Utf8,
type Vector,
FixedSizeList,
vectorFromArray,
type Schema,
Table as ArrowTable, Table as ArrowTable,
RecordBatchStreamWriter, Binary,
DataType,
Field,
FixedSizeList,
type Float,
Float32,
List, List,
RecordBatch, RecordBatch,
makeData, RecordBatchFileWriter,
RecordBatchStreamWriter,
Schema,
Struct, Struct,
type Float, Utf8,
DataType, type Vector,
Binary, makeBuilder,
Float32, makeData,
type makeTable, type makeTable,
vectorFromArray,
} from "apache-arrow"; } from "apache-arrow";
import { type EmbeddingFunction } from "./embedding/embedding_function"; import { type EmbeddingFunction } from "./embedding/embedding_function";
import { sanitizeSchema } from "./sanitize"; import { sanitizeSchema } from "./sanitize";
@@ -85,6 +85,7 @@ export class MakeArrowTableOptions {
vectorColumns: Record<string, VectorColumnOptions> = { vectorColumns: Record<string, VectorColumnOptions> = {
vector: new VectorColumnOptions(), vector: new VectorColumnOptions(),
}; };
embeddings?: EmbeddingFunction<unknown>;
/** /**
* If true then string columns will be encoded with dictionary encoding * If true then string columns will be encoded with dictionary encoding
@@ -208,6 +209,7 @@ export function makeArrowTable(
const opt = new MakeArrowTableOptions(options !== undefined ? options : {}); const opt = new MakeArrowTableOptions(options !== undefined ? options : {});
if (opt.schema !== undefined && opt.schema !== null) { if (opt.schema !== undefined && opt.schema !== null) {
opt.schema = sanitizeSchema(opt.schema); opt.schema = sanitizeSchema(opt.schema);
opt.schema = validateSchemaEmbeddings(opt.schema, data, opt.embeddings);
} }
const columns: Record<string, Vector> = {}; const columns: Record<string, Vector> = {};
// TODO: sample dataset to find missing columns // TODO: sample dataset to find missing columns
@@ -287,8 +289,8 @@ export function makeArrowTable(
// then patch the schema of the batches so we can use // then patch the schema of the batches so we can use
// `new ArrowTable(schema, batches)` which does not do any schema inference // `new ArrowTable(schema, batches)` which does not do any schema inference
const firstTable = new ArrowTable(columns); const firstTable = new ArrowTable(columns);
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
const batchesFixed = firstTable.batches.map( const batchesFixed = firstTable.batches.map(
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
(batch) => new RecordBatch(opt.schema!, batch.data), (batch) => new RecordBatch(opt.schema!, batch.data),
); );
return new ArrowTable(opt.schema, batchesFixed); return new ArrowTable(opt.schema, batchesFixed);
@@ -313,7 +315,7 @@ function makeListVector(lists: unknown[][]): Vector<unknown> {
throw Error("Cannot infer list vector from empty array or empty list"); throw Error("Cannot infer list vector from empty array or empty list");
} }
const sampleList = lists[0]; const sampleList = lists[0];
// eslint-disable-next-line @typescript-eslint/no-explicit-any // biome-ignore lint/suspicious/noExplicitAny: skip
let inferredType: any; let inferredType: any;
try { try {
const sampleVector = makeVector(sampleList); const sampleVector = makeVector(sampleList);
@@ -337,7 +339,7 @@ function makeVector(
values: unknown[], values: unknown[],
type?: DataType, type?: DataType,
stringAsDictionary?: boolean, stringAsDictionary?: boolean,
// eslint-disable-next-line @typescript-eslint/no-explicit-any // biome-ignore lint/suspicious/noExplicitAny: skip
): Vector<any> { ): Vector<any> {
if (type !== undefined) { if (type !== undefined) {
// No need for inference, let Arrow create it // No need for inference, let Arrow create it
@@ -648,3 +650,41 @@ function alignTable(table: ArrowTable, schema: Schema): ArrowTable {
export function createEmptyTable(schema: Schema): ArrowTable { export function createEmptyTable(schema: Schema): ArrowTable {
return new ArrowTable(sanitizeSchema(schema)); return new ArrowTable(sanitizeSchema(schema));
} }
function validateSchemaEmbeddings(
schema: Schema,
data: Array<Record<string, unknown>>,
embeddings: EmbeddingFunction<unknown> | undefined,
) {
const fields = [];
const missingEmbeddingFields = [];
// First we check if the field is a `FixedSizeList`
// Then we check if the data contains the field
// if it does not, we add it to the list of missing embedding fields
// Finally, we check if those missing embedding fields are `this._embeddings`
// if they are not, we throw an error
for (const field of schema.fields) {
if (field.type instanceof FixedSizeList) {
if (data.length !== 0 && data?.[0]?.[field.name] === undefined) {
missingEmbeddingFields.push(field);
} else {
fields.push(field);
}
} else {
fields.push(field);
}
}
if (missingEmbeddingFields.length > 0 && embeddings === undefined) {
console.log({ missingEmbeddingFields, embeddings });
throw new Error(
`Table has embeddings: "${missingEmbeddingFields
.map((f) => f.name)
.join(",")}", but no embedding function was provided`,
);
}
return new Schema(fields);
}
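To make the new validation concrete, a minimal sketch (the embedding function is declared but hypothetical):

```ts
import { Field, FixedSizeList, Float32, Schema, Utf8 } from "apache-arrow";
import { makeArrowTable } from "./arrow";
import type { EmbeddingFunction } from "./embedding/embedding_function";

declare const myEmbedFn: EmbeddingFunction<unknown>; // hypothetical

const schema = new Schema([
  new Field("text", new Utf8(), false),
  new Field(
    "vector",
    new FixedSizeList(2, new Field("item", new Float32(), true)),
    true,
  ),
]);

// "vector" is a FixedSizeList absent from the data and no embedding function
// is supplied, so validateSchemaEmbeddings throws the error built above.
try {
  makeArrowTable([{ text: "hello" }], { schema });
} catch (e) {
  console.error(e);
}

// With an embedding function, the missing field is dropped from the
// validated schema and is expected to be produced by the embeddings later.
makeArrowTable([{ text: "hello" }], { schema, embeddings: myEmbedFn });
```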

@@ -12,10 +12,10 @@
// See the License for the specific language governing permissions and // See the License for the specific language governing permissions and
// limitations under the License. // limitations under the License.
import { Table as ArrowTable, Schema } from "apache-arrow";
import { fromTableToBuffer, makeArrowTable, makeEmptyTable } from "./arrow"; import { fromTableToBuffer, makeArrowTable, makeEmptyTable } from "./arrow";
import { ConnectionOptions, Connection as LanceDbConnection } from "./native"; import { ConnectionOptions, Connection as LanceDbConnection } from "./native";
import { Table } from "./table"; import { Table } from "./table";
import { Table as ArrowTable, Schema } from "apache-arrow";
/** /**
* Connect to a LanceDB instance at the given URI. * Connect to a LanceDB instance at the given URI.

@@ -12,8 +12,8 @@
// See the License for the specific language governing permissions and // See the License for the specific language governing permissions and
// limitations under the License. // limitations under the License.
import { type EmbeddingFunction } from "./embedding_function";
import type OpenAI from "openai"; import type OpenAI from "openai";
import { type EmbeddingFunction } from "./embedding_function";
export class OpenAIEmbeddingFunction implements EmbeddingFunction<string> { export class OpenAIEmbeddingFunction implements EmbeddingFunction<string> {
private readonly _openai: OpenAI; private readonly _openai: OpenAI;

@@ -12,14 +12,14 @@
 // See the License for the specific language governing permissions and
 // limitations under the License.
-import { RecordBatch, tableFromIPC, Table as ArrowTable } from "apache-arrow";
+import { Table as ArrowTable, RecordBatch, tableFromIPC } from "apache-arrow";
+import { type IvfPqOptions } from "./indices";
 import {
   RecordBatchIterator as NativeBatchIterator,
   Query as NativeQuery,
   Table as NativeTable,
   VectorQuery as NativeVectorQuery,
 } from "./native";
-import { type IvfPqOptions } from "./indices";

 export class RecordBatchIterator implements AsyncIterator<RecordBatch> {
   private promisedInner?: Promise<NativeBatchIterator>;
   private inner?: NativeBatchIterator;
@@ -29,7 +29,7 @@ export class RecordBatchIterator implements AsyncIterator<RecordBatch> {
     this.promisedInner = promise;
   }

-  // eslint-disable-next-line @typescript-eslint/no-explicit-any
+  // biome-ignore lint/suspicious/noExplicitAny: skip
   async next(): Promise<IteratorResult<RecordBatch<any>>> {
     if (this.inner === undefined) {
       this.inner = await this.promisedInner;
@@ -56,7 +56,9 @@ export class QueryBase<
   QueryType,
 > implements AsyncIterable<RecordBatch>
 {
-  protected constructor(protected inner: NativeQueryType) {}
+  protected constructor(protected inner: NativeQueryType) {
+    // intentionally empty
+  }

   /**
    * A filter statement to be applied to this query.
@@ -150,7 +152,7 @@ export class QueryBase<
     return new RecordBatchIterator(this.nativeExecute());
   }

-  // eslint-disable-next-line @typescript-eslint/no-explicit-any
+  // biome-ignore lint/suspicious/noExplicitAny: skip
   [Symbol.asyncIterator](): AsyncIterator<RecordBatch<any>> {
     const promise = this.nativeExecute();
     return new RecordBatchIterator(promise);
@@ -368,7 +370,7 @@ export class Query extends QueryBase<NativeQuery, Query> {
    * a default `limit` of 10 will be used. @see {@link Query#limit}
    */
   nearestTo(vector: unknown): VectorQuery {
-    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    // biome-ignore lint/suspicious/noExplicitAny: skip
     const vectorQuery = this.inner.nearestTo(Float32Array.from(vector as any));
     return new VectorQuery(vectorQuery);
   }


@@ -21,60 +21,60 @@
 // and so we must sanitize the input to ensure that it is compatible.
 import {
-  Field,
-  Utf8,
-  FixedSizeBinary,
-  FixedSizeList,
-  Schema,
-  List,
-  Struct,
-  Float,
+  Binary,
   Bool,
+  DataType,
+  DateDay,
+  DateMillisecond,
+  type DateUnit,
   Date_,
   Decimal,
-  DataType,
+  DenseUnion,
   Dictionary,
-  Binary,
-  Float32,
-  Interval,
-  Map_,
   Duration,
-  Union,
-  Time,
-  Timestamp,
-  Type,
-  Null,
+  DurationMicrosecond,
+  DurationMillisecond,
+  DurationNanosecond,
+  DurationSecond,
+  Field,
+  FixedSizeBinary,
+  FixedSizeList,
+  Float,
+  Float16,
+  Float32,
+  Float64,
   Int,
-  type Precision,
-  type DateUnit,
   Int8,
   Int16,
   Int32,
   Int64,
+  Interval,
+  IntervalDayTime,
+  IntervalYearMonth,
+  List,
+  Map_,
+  Null,
+  type Precision,
+  Schema,
+  SparseUnion,
+  Struct,
+  Time,
+  TimeMicrosecond,
+  TimeMillisecond,
+  TimeNanosecond,
+  TimeSecond,
+  Timestamp,
+  TimestampMicrosecond,
+  TimestampMillisecond,
+  TimestampNanosecond,
+  TimestampSecond,
+  Type,
   Uint8,
   Uint16,
   Uint32,
   Uint64,
-  Float16,
-  Float64,
-  DateDay,
-  DateMillisecond,
-  DenseUnion,
-  SparseUnion,
-  TimeNanosecond,
-  TimeMicrosecond,
-  TimeMillisecond,
-  TimeSecond,
-  TimestampNanosecond,
-  TimestampMicrosecond,
-  TimestampMillisecond,
-  TimestampSecond,
-  IntervalDayTime,
-  IntervalYearMonth,
-  DurationNanosecond,
-  DurationMicrosecond,
-  DurationMillisecond,
-  DurationSecond,
+  Union,
+  Utf8,
 } from "apache-arrow";
 import type { IntBitWidth, TKeys, TimeBitWidth } from "apache-arrow/type";
@@ -228,7 +228,7 @@ function sanitizeUnion(typeLike: object) {
   return new Union(
     typeLike.mode,
-    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    // biome-ignore lint/suspicious/noExplicitAny: skip
     typeLike.typeIds as any,
     typeLike.children.map((child) => sanitizeField(child)),
   );
@@ -294,7 +294,7 @@ function sanitizeMap(typeLike: object) {
   }
   return new Map_(
-    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    // biome-ignore lint/suspicious/noExplicitAny: skip
     typeLike.children.map((field) => sanitizeField(field)) as any,
     typeLike.keysSorted,
   );
@@ -328,7 +328,7 @@ function sanitizeDictionary(typeLike: object) {
   );
 }

-// eslint-disable-next-line @typescript-eslint/no-explicit-any
+// biome-ignore lint/suspicious/noExplicitAny: skip
 function sanitizeType(typeLike: unknown): DataType<any> {
   if (typeof typeLike !== "object" || typeLike === null) {
     throw Error("Expected a Type but object was null/undefined");


@@ -13,6 +13,8 @@
 // limitations under the License.
 import { Schema, tableFromIPC } from "apache-arrow";
+import { Data, fromDataToBuffer } from "./arrow";
+import { IndexOptions } from "./indices";
 import {
   AddColumnsSql,
   ColumnAlteration,
@@ -20,8 +22,6 @@
   Table as _NativeTable,
 } from "./native";
 import { Query, VectorQuery } from "./query";
-import { IndexOptions } from "./indices";
-import { Data, fromDataToBuffer } from "./arrow";

 export { IndexConfig } from "./native";

 /**
@@ -186,7 +186,7 @@ export class Table {
    */
   async createIndex(column: string, options?: Partial<IndexOptions>) {
     // Bit of a hack to get around the fact that TS has no package-scope.
-    // eslint-disable-next-line @typescript-eslint/no-explicit-any
+    // biome-ignore lint/suspicious/noExplicitAny: skip
     const nativeIndex = (options?.config as any)?.inner;
     await this.inner.createIndex(nativeIndex, column, options?.replace);
   }


@@ -1,18 +1,12 @@
 {
   "name": "@lancedb/lancedb-darwin-arm64",
-  "version": "0.4.19",
-  "os": [
-    "darwin"
-  ],
-  "cpu": [
-    "arm64"
-  ],
+  "version": "0.4.20",
+  "os": ["darwin"],
+  "cpu": ["arm64"],
   "main": "lancedb.darwin-arm64.node",
-  "files": [
-    "lancedb.darwin-arm64.node"
-  ],
+  "files": ["lancedb.darwin-arm64.node"],
   "license": "Apache 2.0",
   "engines": {
     "node": ">= 18"
   }
 }


@@ -1,18 +1,12 @@
 {
   "name": "@lancedb/lancedb-darwin-x64",
-  "version": "0.4.19",
-  "os": [
-    "darwin"
-  ],
-  "cpu": [
-    "x64"
-  ],
+  "version": "0.4.20",
+  "os": ["darwin"],
+  "cpu": ["x64"],
   "main": "lancedb.darwin-x64.node",
-  "files": [
-    "lancedb.darwin-x64.node"
-  ],
+  "files": ["lancedb.darwin-x64.node"],
   "license": "Apache 2.0",
   "engines": {
     "node": ">= 18"
   }
 }


@@ -1,21 +1,13 @@
 {
   "name": "@lancedb/lancedb-linux-arm64-gnu",
-  "version": "0.4.19",
-  "os": [
-    "linux"
-  ],
-  "cpu": [
-    "arm64"
-  ],
+  "version": "0.4.20",
+  "os": ["linux"],
+  "cpu": ["arm64"],
   "main": "lancedb.linux-arm64-gnu.node",
-  "files": [
-    "lancedb.linux-arm64-gnu.node"
-  ],
+  "files": ["lancedb.linux-arm64-gnu.node"],
   "license": "Apache 2.0",
   "engines": {
     "node": ">= 18"
   },
-  "libc": [
-    "glibc"
-  ]
+  "libc": ["glibc"]
 }


@@ -1,21 +1,13 @@
 {
   "name": "@lancedb/lancedb-linux-x64-gnu",
-  "version": "0.4.19",
-  "os": [
-    "linux"
-  ],
-  "cpu": [
-    "x64"
-  ],
+  "version": "0.4.20",
+  "os": ["linux"],
+  "cpu": ["x64"],
   "main": "lancedb.linux-x64-gnu.node",
-  "files": [
-    "lancedb.linux-x64-gnu.node"
-  ],
+  "files": ["lancedb.linux-x64-gnu.node"],
   "license": "Apache 2.0",
   "engines": {
     "node": ">= 18"
   },
-  "libc": [
-    "glibc"
-  ]
+  "libc": ["glibc"]
 }


@@ -1,18 +1,12 @@
 {
   "name": "@lancedb/lancedb-win32-x64-msvc",
   "version": "0.4.14",
-  "os": [
-    "win32"
-  ],
-  "cpu": [
-    "x64"
-  ],
+  "os": ["win32"],
+  "cpu": ["x64"],
   "main": "lancedb.win32-x64-msvc.node",
-  "files": [
-    "lancedb.win32-x64-msvc.node"
-  ],
+  "files": ["lancedb.win32-x64-msvc.node"],
   "license": "Apache 2.0",
   "engines": {
     "node": ">= 18"
   }
 }

15579 nodejs/package-lock.json generated

File diff suppressed because it is too large.


@@ -1,6 +1,6 @@
 {
   "name": "@lancedb/lancedb",
-  "version": "0.4.19",
+  "version": "0.4.20",
   "main": "./dist/index.js",
   "types": "./dist/index.d.ts",
   "napi": {
@@ -18,19 +18,16 @@
   },
   "license": "Apache 2.0",
   "devDependencies": {
-    "@aws-sdk/client-s3": "^3.33.0",
     "@aws-sdk/client-kms": "^3.33.0",
+    "@aws-sdk/client-s3": "^3.33.0",
+    "@biomejs/biome": "^1.7.3",
+    "@jest/globals": "^29.7.0",
     "@napi-rs/cli": "^2.18.0",
     "@types/jest": "^29.1.2",
     "@types/tmp": "^0.2.6",
-    "@typescript-eslint/eslint-plugin": "^6.19.0",
-    "@typescript-eslint/parser": "^6.19.0",
     "apache-arrow-old": "npm:apache-arrow@13.0.0",
     "eslint": "^8.57.0",
-    "eslint-config-prettier": "^9.1.0",
-    "eslint-plugin-jsdoc": "^48.2.1",
     "jest": "^29.7.0",
-    "prettier": "^3.1.0",
     "shx": "^0.3.4",
     "tmp": "^0.2.3",
     "ts-jest": "^29.1.2",
@@ -45,33 +42,26 @@
   "engines": {
     "node": ">= 18"
   },
-  "cpu": [
-    "x64",
-    "arm64"
-  ],
-  "os": [
-    "darwin",
-    "linux",
-    "win32"
-  ],
+  "cpu": ["x64", "arm64"],
+  "os": ["darwin", "linux", "win32"],
   "scripts": {
     "artifacts": "napi artifacts",
-    "build:debug": "napi build --platform --dts ../lancedb/native.d.ts --js ../lancedb/native.js dist/",
+    "build:debug": "napi build --platform --dts ../lancedb/native.d.ts --js ../lancedb/native.js lancedb",
     "build:release": "napi build --platform --release --dts ../lancedb/native.d.ts --js ../lancedb/native.js dist/",
-    "build": "npm run build:debug && tsc -b && shx cp lancedb/native.d.ts dist/native.d.ts",
+    "build": "npm run build:debug && tsc -b && shx cp lancedb/native.d.ts dist/native.d.ts && shx cp lancedb/*.node dist/",
    "build-release": "npm run build:release && tsc -b && shx cp lancedb/native.d.ts dist/native.d.ts",
-    "chkformat": "prettier . --check",
+    "lint-ci": "biome ci .",
     "docs": "typedoc --plugin typedoc-plugin-markdown --out ../docs/src/js lancedb/index.ts",
-    "lint": "eslint lancedb __test__",
-    "lint-fix": "eslint lancedb __test__ --fix",
+    "lint": "biome check . && biome format .",
+    "lint-fix": "biome check --apply-unsafe . && biome format --write .",
     "prepublishOnly": "napi prepublish -t npm",
-    "test": "npm run build && jest --verbose",
+    "test": "jest --verbose",
     "integration": "S3_TEST=1 npm run test",
     "universal": "napi universal",
     "version": "napi version"
   },
   "dependencies": {
-    "openai": "^4.29.2",
-    "apache-arrow": "^15.0.0"
+    "apache-arrow": "^15.0.0",
+    "openai": "^4.29.2"
   }
 }


@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 0.6.12
+current_version = 0.6.13
 commit = True
 message = [python] Bump version: {current_version} → {new_version}
 tag = True


@@ -1,6 +1,6 @@
 [project]
 name = "lancedb"
-version = "0.6.12"
+version = "0.6.13"
 dependencies = [
     "deprecation",
     "pylance==0.10.12",
@@ -10,7 +10,7 @@ dependencies = [
     "tqdm>=4.27.0",
     "pydantic>=1.10",
     "attrs>=21.3.0",
-    "semver>=3.0",
+    "semver",
     "cachetools",
     "overrides>=0.7",
 ]
@@ -80,6 +80,7 @@ embeddings = [
     "boto3>=1.28.57",
     "awscli>=1.29.57",
     "botocore>=1.31.57",
+    "ollama",
 ]

 azure = ["adlfs>=2024.2.0"]


@@ -16,6 +16,7 @@ from .bedrock import BedRockText
 from .cohere import CohereEmbeddingFunction
 from .gemini_text import GeminiText
 from .instructor import InstructorEmbeddingFunction
+from .ollama import OllamaEmbeddings
 from .open_clip import OpenClipEmbeddings
 from .openai import OpenAIEmbeddings
 from .registry import EmbeddingFunctionRegistry, get_registry


@@ -0,0 +1,69 @@
# Copyright (c) 2023. LanceDB Developers
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from functools import cached_property
from typing import TYPE_CHECKING, List, Optional, Union

from ..util import attempt_import_or_raise
from .base import TextEmbeddingFunction
from .registry import register

if TYPE_CHECKING:
    import numpy as np


@register("ollama")
class OllamaEmbeddings(TextEmbeddingFunction):
    """
    An embedding function that uses Ollama

    https://github.com/ollama/ollama/blob/main/docs/api.md#generate-embeddings
    https://ollama.com/blog/embedding-models
    """

    name: str = "nomic-embed-text"
    host: str = "http://localhost:11434"
    options: Optional[dict] = None  # type = ollama.Options
    keep_alive: Optional[Union[float, str]] = None
    ollama_client_kwargs: Optional[dict] = {}

    def ndims(self):
        return len(self.generate_embeddings(["foo"])[0])

    def _compute_embedding(self, text):
        return self._ollama_client.embeddings(
            model=self.name,
            prompt=text,
            options=self.options,
            keep_alive=self.keep_alive,
        )["embedding"]

    def generate_embeddings(
        self, texts: Union[List[str], "np.ndarray"]
    ) -> List["np.array"]:
        """
        Get the embeddings for the given texts

        Parameters
        ----------
        texts: list[str] or np.ndarray (of str)
            The texts to embed
        """
        # TODO retry, rate limit, token limit
        embeddings = [self._compute_embedding(text) for text in texts]
        return embeddings

    @cached_property
    def _ollama_client(self):
        ollama = attempt_import_or_raise("ollama")
        # ToDo explore ollama.AsyncClient
        return ollama.Client(host=self.host, **self.ollama_client_kwargs)
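
For context, a minimal usage sketch of the new function through the embedding registry (not part of the diff; it assumes `lancedb` with the embeddings extras and the `ollama` package installed, a running local Ollama server that has pulled `nomic-embed-text`, and illustrative table and path names):

```python
import lancedb
from lancedb.embeddings import get_registry
from lancedb.pydantic import LanceModel, Vector

# "ollama" is the alias registered by @register("ollama") above
func = get_registry().get("ollama").create(name="nomic-embed-text")

class Words(LanceModel):
    text: str = func.SourceField()  # source column: embedded automatically on ingest
    vector: Vector(func.ndims()) = func.VectorField()  # ndims() probes the model once

db = lancedb.connect("/tmp/ollama-demo")  # illustrative path
tbl = db.create_table("words", schema=Words)
tbl.add([{"text": "hello world"}, {"text": "goodbye world"}])

# the query string is embedded with the same function before the vector search
print(tbl.search("greeting").limit(1).to_list())
```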


@@ -37,7 +37,7 @@ import pyarrow as pa
 import pydantic
 import semver

-PYDANTIC_VERSION = semver.Version.parse(pydantic.__version__)
+PYDANTIC_VERSION = semver.parse_version_info(pydantic.__version__)

 try:
     from pydantic_core import CoreSchema, core_schema
 except ImportError:
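
This pairs with the relaxed `semver` pin in `pyproject.toml` above: `semver.Version.parse` exists only on the python-semver 3.x line, while `parse_version_info` is also available in 2.x, so the unpinned dependency works either way. A small sketch of the resulting version gate, under that assumption:

```python
import pydantic
import semver

# parse_version_info works on python-semver 2.x and 3.x;
# semver.Version.parse would require the 3.x line.
PYDANTIC_VERSION = semver.parse_version_info(pydantic.__version__)

if PYDANTIC_VERSION.major >= 2:
    # e.g. take the pydantic_core code path guarded by try/except above
    print("pydantic v2 code path")
else:
    print("pydantic v1 code path")
```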


@@ -285,7 +285,7 @@ class RemoteDBConnection(DBConnection):
         self._client.post(
             f"/v1/table/{name}/drop/",
         )
-        self._table_cache.pop(name)
+        self._table_cache.pop(name, default=None)

     @override
     def rename_table(self, cur_name: str, new_name: str):
@@ -300,9 +300,9 @@ class RemoteDBConnection(DBConnection):
         """
         self._client.post(
             f"/v1/table/{cur_name}/rename/",
-            json={"new_table_name": new_name},
+            data={"new_table_name": new_name},
         )
-        self._table_cache.pop(cur_name)
+        self._table_cache.pop(cur_name, default=None)
         self._table_cache[new_name] = True

     async def close(self):
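
The `pop(..., default=None)` change makes dropping or renaming a table idempotent with respect to the cache: popping a name that was never cached (or was already evicted) no longer raises `KeyError`. A tiny illustration with a plain mapping:

```python
cache = {"users": True}

cache.pop("users", None)  # removes the entry and returns it
cache.pop("users", None)  # second call returns None instead of raising KeyError

# Note: dict.pop only accepts the default positionally; MutableMapping
# subclasses (such as the cachetools caches) also accept the default=
# keyword, which is the form used in the diff above.
```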


@@ -45,7 +45,9 @@ except Exception:

 @pytest.mark.slow
-@pytest.mark.parametrize("alias", ["sentence-transformers", "openai", "huggingface"])
+@pytest.mark.parametrize(
+    "alias", ["sentence-transformers", "openai", "huggingface", "ollama"]
+)
 def test_basic_text_embeddings(alias, tmp_path):
     db = lancedb.connect(tmp_path)
     registry = get_registry()

44 release_process.md Normal file

@@ -0,0 +1,44 @@
# Release process

There are five total packages we release. Three are the `lancedb` packages
for Python, Rust, and Node.js. The other two are the legacy `vectordb`
packages for Rust and Node.js.

The Python package is versioned and released separately from the Rust and Node.js
ones. For Rust and Node.js, the release process is shared between `lancedb` and
`vectordb` for now.

## Breaking changes

We try to avoid breaking changes, but sometimes they are necessary. When there
are breaking changes, we will increment the minor version. (This is valid
semantic versioning because we are still in `0.x` versions.)

When a PR makes a breaking change, the PR author should mark the PR using the
conventional commit markers: either an exclamation mark after the type
(such as `feat!: change signature of func`) or `BREAKING CHANGE` in the
body of the PR. A CI job will add a `breaking-change` label to the PR, which is
what will ultimately be used by CI to determine whether the minor version should
be incremented.

A CI job will validate that if a `breaking-change` label is added, the minor
version is incremented in the `Cargo.toml` and `pyproject.toml` files. The only
exception is if it has already been incremented since the last stable release.

**It is the responsibility of the PR author to increment the minor version when
appropriate.**

Some things that are considered breaking changes:

* Upgrading `lance` to a new minor version. Minor version bumps in Lance are
  considered breaking changes during `0.x` releases. This can change behavior
  in LanceDB.
* Upgrading a dependency pin that is in the Rust API. In particular, upgrading
  `DataFusion` and `Arrow` are breaking changes. Changing dependencies that are
  not exposed in our public API is not considered a breaking change.
* Changing the signature of a public function or method.
* Removing a public function or method.

We do make exceptions for APIs that are marked as experimental. These are APIs
that are under active development and not in major use. These changes should not
receive the `breaking-change` label.
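
The label/version validation described above could look roughly like the following (a hypothetical sketch, not the actual CI job; only the `breaking-change` label name and the version files come from the text, everything else is assumed):

```python
# Hypothetical sketch of the breaking-change version check -- not the real CI job.
import re
import sys

def major_minor(version: str) -> tuple[int, int]:
    major, minor, *_ = version.split(".")
    return int(major), int(minor)

def read_version(path: str) -> str:
    # assumes a line like: version = "0.6.13"
    with open(path) as f:
        match = re.search(r'^version\s*=\s*"([^"]+)"', f.read(), re.MULTILINE)
    if match is None:
        sys.exit(f"no version found in {path}")
    return match.group(1)

def check(labels: set[str], last_stable: str, current: str) -> None:
    # tuple comparison: a major bump also satisfies the minor-bump requirement
    if "breaking-change" in labels and major_minor(current) <= major_minor(last_stable):
        sys.exit("breaking-change label set, but the minor version was not bumped")

# e.g. check(pr_labels, "0.6.13", read_version("pyproject.toml"))
check({"breaking-change"}, last_stable="0.6.13", current="0.7.0")
```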


@@ -1,6 +1,6 @@
 [package]
 name = "lancedb-node"
-version = "0.4.19"
+version = "0.4.20"
 description = "Serverless, low-latency vector database for AI applications"
 license.workspace = true
 edition.workspace = true


@@ -1,6 +1,6 @@
 [package]
 name = "lancedb"
-version = "0.4.19"
+version = "0.4.20"
 edition.workspace = true
 description = "LanceDB: A serverless, low-latency vector database for AI applications"
 license.workspace = true