Hello.
I have a simple scenario: I have multiline input that I want to save as one record. Unfortunately, I cannot get it to work.
Environment:
```
2019-06-17 13:10:06 +0000 [info]: starting fluentd-1.5.2 pid=7 ruby="2.5.5"
2019-06-17 13:10:06 +0000 [info]: spawn command to main: cmdline=["/usr/bin/ruby", "-Eascii-8bit:ascii-8bit", "/usr/bin/fluentd", "-c", "/fluentd/etc/fluent.conf", "-p", "/fluentd/plugins", "--under-supervisor"]
2019-06-17 13:10:07 +0000 [info]: gem 'fluent-plugin-concat' version '2.3.0'
2019-06-17 13:10:07 +0000 [info]: gem 'fluent-plugin-elasticsearch' version '3.5.2'
2019-06-17 13:10:07 +0000 [info]: gem 'fluent-plugin-grok-parser' version '2.5.1'
2019-06-17 13:10:07 +0000 [info]: gem 'fluent-plugin-kafka' version '0.9.6'
2019-06-17 13:10:07 +0000 [info]: gem 'fluent-plugin-kubernetes_metadata_filter' version '2.2.0'
2019-06-17 13:10:07 +0000 [info]: gem 'fluentd' version '1.5.2'
2019-06-17 13:10:07 +0000 [info]: adding match pattern="xxx" type="stdout"
2019-06-17 13:10:07 +0000 [info]: adding filter pattern="es" type="concat"
2019-06-17 13:10:07 +0000 [info]: adding match pattern="es" type="copy"
2019-06-17 13:10:07 +0000 [warn]: #0 [out_es] Detected ES 7.x or above: `_doc` will be used as the document `_type`.
2019-06-17 13:10:07 +0000 [info]: adding source type="forward"
2019-06-17 13:10:07 +0000 [info]: #0 starting fluentd worker pid=15 ppid=7 worker=0
2019-06-17 13:10:07 +0000 [info]: #0 listening port port=24224 bind="0.0.0.0"
2019-06-17 13:10:07 +0000 [info]: #0 fluentd worker is now running worker=0
```
This is the config I thought would work.
Config:
```
<source>
  @type forward
  port 24224
  bind 0.0.0.0
  tag es
</source>

<match xxx>
  @type stdout
</match>

<filter es>
  @type concat
  key log
  stream_identity_key container_id
  flush_interval 3s
  #n_lines 3
  #multiline_start_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}
  #multiline_end_regexp /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}
  multiline_start_regexp /^2019/
  #multiline_end_regexp /^2019
</filter>
```
Test case:
```
docker run --log-driver=fluentd --log-opt tag="docker.{{.ID}}" ubuntu echo '2019-06-17 12:41:44,827 - Hello Fluentd!
XXXXXXXXXXXXXX
YYYYYYYYYYYY'
```
Current fluentd output:
```
2019-06-17 13:10:12 +0000 [warn]: #0 dump an error event: error_class=Fluent::Plugin::ConcatFilter::TimeoutError error="Timeout flush: es:281f3b726ee01d1df25e67084c329969523e03533a1490eeb4b9654ba7b36f0d" location=nil tag="es" time=2019-06-17 13:10:12.377438826 +0000 record={"log"=>"2019-06-17 12:41:44,827 - Hello Fluentd!\nXXXXXXXXXXXXXX\nYYYYYYYYYYYY", "container_id"=>"281f3b726ee01d1df25e67084c329969523e03533a1490eeb4b9654ba7b36f0d", "container_name"=>"/blissful_yonath", "source"=>"stdout"}
2019-06-17 13:10:12 +0000 [info]: #0 Timeout flush: es:281f3b726ee01d1df25e67084c329969523e03533a1490eeb4b9654ba7b36f0d
```
I thought it would merge these three lines without any errors, but it doesn't. Unfortunately, it raises a timeout.
Do you know how I can fix it so that these three lines are combined into one fluentd record?
You can use the `timeout_label` parameter to handle the timeout. See the README.md carefully.
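For reference, here is a minimal sketch of that pattern, adapted from the usage shown in the plugin's README. Because only `multiline_start_regexp` is set, buffered lines are flushed only when the next start line arrives; a short-lived container like the one in your test case never produces one, so the `flush_interval` timer fires and the record is dumped as an error event. Setting `timeout_label` re-routes the timed-out record so it is emitted normally. The `@NORMAL` label name and the `stdout` output are just placeholders for your real pipeline:

```
<filter es>
  @type concat
  key log
  stream_identity_key container_id
  flush_interval 3s
  multiline_start_regexp /^2019/
  # On flush timeout, emit the buffered record to this label
  # instead of raising ConcatFilter::TimeoutError.
  timeout_label @NORMAL
</filter>

# Send normally concatenated events to the same label so both
# paths reach the same outputs.
<match es>
  @type relabel
  @label @NORMAL
</match>

<label @NORMAL>
  <match es>
    @type stdout
  </match>
</label>
```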