Commit

fix dead links
nicecui committed Jan 3, 2025
1 parent 0ff93bf commit fd0aef2
Showing 3 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion blog/release-0-7-2.md
@@ -6,7 +6,7 @@ date: 2024-04-08

Release date: April 08, 2024

- This is a patch release, containing a critial bug fix to avoid wrongly delete data files ([#3635](https://github.com/GreptimeTeam/greptimedb/pull/3635)).
+ This is a patch release, containing a critical bug fix to avoid wrongly delete data files ([#3635](https://github.com/GreptimeTeam/greptimedb/pull/3635)).

**It's highly recommended to upgrade to this version if you're using v0.7.**

2 changes: 1 addition & 1 deletion docs/user-guide/ingest-data/for-observerbility/kafka.md
@@ -76,7 +76,7 @@ A pipeline processes the logs into structured data before ingestion into Greptim
### Logs with JSON format

For logs in JSON format (e.g., `{"timestamp": "2024-12-23T10:00:00Z", "level": "INFO", "message": "Service started"}`),
- you can use the built-in [`greptime_identity`](/logs/manage-pipelines.md#greptime_identity) pipeline for direct ingestion.
+ you can use the built-in [`greptime_identity`](/user-guide/logs/manage-pipelines.md#greptime_identity) pipeline for direct ingestion.
This pipeline creates columns automatically based on the fields in your JSON log message.

Simply configure Vector's `transforms` settings to parse the JSON message and use the `greptime_identity` pipeline as shown in the following example:
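
As a rough illustration of the setup this excerpt refers to, here is a minimal Vector configuration sketch: it reads JSON logs from Kafka, parses the message body with a `remap` transform, and forwards the result to GreptimeDB through the `greptime_identity` pipeline. The broker address, topic, group ID, table name, and the sink option names (`endpoint`, `dbname`, `table`, `pipeline_name`) are illustrative assumptions, not taken from this commit; check the Vector `greptimedb_logs` sink documentation for the exact settings.

```toml
# Hypothetical Vector configuration: consume JSON logs from Kafka,
# parse the payload, and send it to GreptimeDB with the built-in
# greptime_identity pipeline. All names and endpoints are examples.

[sources.kafka_logs]
type = "kafka"
bootstrap_servers = "kafka:9092"    # assumed broker address
group_id = "vector-greptime"        # assumed consumer group
topics = ["app_logs"]               # hypothetical topic

[transforms.parse_json]
type = "remap"
inputs = ["kafka_logs"]
# Expand the raw message string into structured fields so that
# greptime_identity can map each JSON field to a column.
source = '''
. = parse_json!(.message)
'''

[sinks.greptimedb]
type = "greptimedb_logs"
inputs = ["parse_json"]
endpoint = "http://localhost:4000"  # GreptimeDB HTTP endpoint
dbname = "public"
table = "demo_logs"                 # hypothetical table name
pipeline_name = "greptime_identity"
```

Parsing the message first matters because the Kafka source delivers the payload as a single `message` string; once it is expanded into fields, `greptime_identity` can create one column per JSON key, as described above.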
@@ -75,7 +75,7 @@ A pipeline processes logs into structured data before they are written to GreptimeDB.
### Logs in JSON format

For logs in JSON format (e.g., `{"timestamp": "2024-12-23T10:00:00Z", "level": "INFO", "message": "Service started"}`),
- you can use the built-in [`greptime_identity`](/logs/manage-pipelines.md#greptime_identity) pipeline to ingest logs directly.
+ you can use the built-in [`greptime_identity`](/user-guide/logs/manage-pipelines.md#greptime_identity) pipeline to ingest logs directly.
This pipeline automatically creates columns based on the fields in the JSON log message.

You only need to configure Vector's `transforms` settings to parse the JSON message and use the `greptime_identity` pipeline, as shown in the following example:
