# Setup tests for multiple storage backends

To run the integration tests, copy `.env.example` to `.env` in the project root folder and change the values as needed.

Take S3 as an example. You need to set your S3 bucket, region, access key ID, and secret access key:
```shell
# Settings for s3 test
GT_S3_BUCKET=S3 bucket
GT_S3_REGION=S3 region
GT_S3_ACCESS_KEY_ID=S3 access key id
GT_S3_ACCESS_KEY=S3 secret access key
```
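For instance, a filled-in `.env` might look like the sketch below. All values are placeholders (the key pair is the well-known AWS documentation example), not real credentials:

```shell
# Example .env entries for the s3 test -- placeholder values only
GT_S3_BUCKET=my-greptimedb-test-bucket
GT_S3_REGION=us-west-2
GT_S3_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
GT_S3_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```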
## Run

Execute the following command in the project root folder:

```shell
cargo test integration
```
Test s3 storage:

```shell
cargo test s3
```

Test oss storage:

```shell
cargo test oss
```

Test azblob storage:

```shell
cargo test azblob
```
# Setup tests with Kafka wal

To run the integration tests, copy `.env.example` to `.env` in the project root folder and change the values as needed:

```shell
GT_KAFKA_ENDPOINTS=localhost:9092
```
## Setup kafka standalone

```shell
cd tests-integration/fixtures
docker compose -f docker-compose.yml up kafka
```
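Once the container is up, a quick way to sanity-check that the broker accepts connections on port 9092 is a TCP probe. This sketch assumes bash (it relies on bash's `/dev/tcp` feature) and the default port published by the compose file:

```shell
# Probe localhost:9092 for up to ~10s and report whether Kafka answers
if timeout 10 bash -c 'until (echo > /dev/tcp/localhost/9092) 2>/dev/null; do sleep 1; done'; then
  echo "kafka is reachable on localhost:9092"
else
  echo "kafka is not reachable on localhost:9092"
fi
```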
# Setup tests with etcd TLS

This guide explains how to set up and test TLS-enabled etcd connections in GreptimeDB integration tests.
## Quick Start

TLS certificates are already at `tests-integration/fixtures/etcd-tls-certs/`.

- Start TLS-enabled etcd:

  ```shell
  cd tests-integration/fixtures
  docker compose up etcd-tls -d
  ```

- Start all services (including etcd-tls):

  ```shell
  cd tests-integration/fixtures
  docker compose up -d --wait
  ```
## Certificate Details

The checked-in certificates include:

- `ca.crt` - Certificate Authority certificate
- `server.crt` / `server-key.pem` - Server certificate for the etcd-tls service
- `client.crt` / `client-key.pem` - Client certificate for connecting to etcd-tls

The server certificate includes SANs for `localhost`, `etcd-tls`, `127.0.0.1`, and `::1`.
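You can confirm those SANs yourself with the standard `openssl` CLI (the `-ext` flag requires OpenSSL 1.1.1 or newer; the certificate path below follows the fixtures layout described above):

```shell
# Print the Subject Alternative Name extension of the server certificate
CERT=${CERT:-tests-integration/fixtures/etcd-tls-certs/server.crt}
if [ -f "$CERT" ]; then
  openssl x509 -in "$CERT" -noout -ext subjectAltName
else
  echo "certificate not found: $CERT"
fi
```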
## Regenerating Certificates (Optional)

If you need to regenerate the etcd certificates:

```shell
# Regenerate certificates (overwrites existing ones)
./scripts/generate-etcd-tls-certs.sh

# Or generate in a custom location
./scripts/generate-etcd-tls-certs.sh /path/to/cert/directory
```
If you need to regenerate the MySQL and PostgreSQL certificates:

```shell
# Regenerate certificates (overwrites existing ones)
./scripts/generate_certs.sh

# Or generate in a custom location
./scripts/generate_certs.sh /path/to/cert/directory
```
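After regenerating, a quick sanity check is to verify that the new server certificate chains back to the CA. This sketch assumes the standard `openssl verify` subcommand and the etcd certificate layout described above:

```shell
# Verify the server certificate against the CA; prints "<path>: OK" on success
CERT_DIR=${CERT_DIR:-tests-integration/fixtures/etcd-tls-certs}
if [ -f "$CERT_DIR/ca.crt" ]; then
  openssl verify -CAfile "$CERT_DIR/ca.crt" "$CERT_DIR/server.crt"
else
  echo "certificates not found in $CERT_DIR"
fi
```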
**Note**: The checked-in certificates are for testing purposes only and should never be used in production.