Setup tests for multiple storage backends
To run the integration tests, please copy .env.example to .env in the project root folder and change the values as needed.
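For example, from the project root (a minimal sketch; adjust the path if you run it from a different directory):
cp .env.example .env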
Take S3 for example. You need to set your S3 bucket, region, access key ID, and secret access key:
# Settings for s3 test
GT_S3_BUCKET=S3 bucket
GT_S3_REGION=S3 region
GT_S3_ACCESS_KEY_ID=S3 access key id
GT_S3_ACCESS_KEY=S3 secret access key
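The oss and azblob tests read analogous settings from the same .env file. The variable names below are only a hypothetical sketch that follows the S3 naming pattern; check .env.example for the authoritative names:
# Hypothetical settings for oss test (see .env.example for the real variable names)
GT_OSS_BUCKET=OSS bucket
GT_OSS_ACCESS_KEY_ID=OSS access key id
GT_OSS_ACCESS_KEY=OSS secret access key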
Run
Execute the following command in the project root folder:
cargo test integration
Test s3 storage:
cargo test s3
Test oss storage:
cargo test oss
Test azblob storage:
cargo test azblob
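These invocations rely on cargo's test-name filter, so only tests whose names contain the given substring are run. To see test output while debugging, the standard libtest flag can be appended, for example:
cargo test s3 -- --nocapture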
Setup tests with Kafka WAL
To run the integration tests, please copy .env.example to .env in the project root folder and change the values as needed. You need to set the Kafka endpoints:
GT_KAFKA_ENDPOINTS=localhost:9092
Setup Kafka standalone
cd tests-integration/fixtures/kafka
docker compose -f docker-compose-standalone.yml up
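Once the broker is up, the Kafka WAL tests can be selected with the usual name filter; the filter string below is an assumption, so adjust it to the actual test names:
cargo test kafka
To run the broker in the background instead, append -d to the docker compose command above and tear it down afterwards with:
docker compose -f docker-compose-standalone.yml down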