# Setup tests for multiple storage backends

To run the integration tests, please copy `.env.example` to `.env` in the project root folder and change the values as needed.

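For example, creating the local configuration from the project root might look like this (a minimal sketch, assuming a Unix-like shell):

    # copy the template, then edit the values you need
    cp .env.example .env
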
Take S3 for example. You need to set your S3 bucket, region, access key ID, and secret access key:

    # Settings for s3 test
    GT_S3_BUCKET=S3 bucket
    GT_S3_REGION=S3 region
    GT_S3_ACCESS_KEY_ID=S3 access key id
    GT_S3_ACCESS_KEY=S3 secret access key

## Run

Execute the following command in the project root folder:

    cargo test integration

Test s3 storage:

    cargo test s3

Test oss storage:

    cargo test oss

Test azblob storage:

    cargo test azblob

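The OSS and Azblob tests read their settings from `.env` in the same way as the S3 test. As a sketch only, the settings follow the same pattern; the variable names below are assumptions, so treat `.env.example` in the project root as the authoritative list:

    # Settings for oss test (variable names assumed, check .env.example)
    GT_OSS_BUCKET=OSS bucket
    GT_OSS_ACCESS_KEY_ID=OSS access key id
    GT_OSS_ACCESS_KEY=OSS secret access key

    # Settings for azblob test (variable names assumed, check .env.example)
    GT_AZBLOB_CONTAINER=AzBlob container
    GT_AZBLOB_ACCOUNT_NAME=AzBlob account name
    GT_AZBLOB_ACCOUNT_KEY=AzBlob account key
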
# Setup tests with Kafka wal

To run the integration tests, please copy `.env.example` to `.env` in the project root folder and change the values as needed, e.g.:

    GT_KAFKA_ENDPOINTS=localhost:9092

## Setup Kafka standalone

    cd tests-integration/fixtures
    docker compose -f docker-compose-standalone.yml up kafka
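
Once the broker is listening on the endpoint configured above, run the integration tests as before (`cargo test integration`). When you are finished, the same compose file can stop the standalone broker; this is plain Docker Compose usage, not a project-specific script:

    # stop and remove the standalone kafka container when finished
    docker compose -f docker-compose-standalone.yml down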