
Set up tests for multiple storage backends

To run the integration tests, copy .env.example to .env in the project root folder and change the values as needed.
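For example, from the project root:

cp .env.example .env
# then edit .env and fill in the settings for the backend you want to test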

Take S3 for example. You need to set your S3 bucket, region, access key ID, and secret access key:

# Settings for s3 test
GT_S3_BUCKET=S3 bucket
GT_S3_REGION=S3 region
GT_S3_ACCESS_KEY_ID=S3 access key id
GT_S3_ACCESS_KEY=S3 secret access key
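For illustration only, a filled-in section of .env might look like the following; the bucket name, region, and credentials here are placeholders, not real values:

# Settings for s3 test
GT_S3_BUCKET=my-greptimedb-test-bucket
GT_S3_REGION=us-west-2
GT_S3_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
GT_S3_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx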

Run

Execute the following command in the project root folder:

cargo test integration
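If you want to scope the run to this crate only, cargo's package selection can be combined with the same filter; this assumes the package is named tests-integration, matching the directory name:

cargo test -p tests-integration integration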

Test s3 storage:

cargo test s3

Test oss storage:

cargo test oss

Test azblob storage:

cargo test azblob

Set up tests with Kafka WAL

To run the integration tests, copy .env.example to .env in the project root folder and change the values as needed:

GT_KAFKA_ENDPOINTS=localhost:9092

Set up Kafka standalone

cd tests-integration/fixtures/kafka
docker compose -f docker-compose-standalone.yml up
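If you prefer to keep Kafka running in the background while executing the tests, detached mode works as well (a convenience, not a required step):

docker compose -f docker-compose-standalone.yml up -d

Once the broker is reachable at the endpoint configured in .env, run the integration tests from the project root as described above.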