Closed
Description
Suppose I have 300 million events in my local Splunk instance. I wrote a chunked streaming search command that reads records from Splunk and then writes them back to Splunk. I got this error:
ChunkedExternProcessor - Failure writing result chunk, buffer full. External process possibly failed to read its stdin.
This seems like a very common issue when handling large-scale data. Maybe it's a bug?
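For context, the "buffer full" message suggests Splunk could not write input because the external process stopped reading its stdin. Below is a minimal, hedged sketch (my own illustration, not SDK code) of the reader side of chunked protocol v2: the process must keep draining chunks like this, where each chunk starts with a header of the form `chunked 1.0,<metadata_length>,<body_length>`.

```python
import io
import re

def read_chunk(ifile):
    """Read one chunk of Splunk's chunked protocol (v2) from ifile.

    The transport header looks like b'chunked 1.0,<metadata_len>,<body_len>\n',
    followed by that many bytes of JSON metadata and that many bytes of body.
    Returns (metadata_bytes, body_bytes), or None at end of input.
    """
    header = ifile.readline()
    if not header:
        return None  # EOF: Splunk closed our stdin
    m = re.match(rb'chunked\s+1\.0,(\d+),(\d+)', header)
    if m is None:
        raise ValueError('invalid chunk header: %r' % header)
    meta_len, body_len = int(m.group(1)), int(m.group(2))
    metadata = ifile.read(meta_len)
    body = ifile.read(body_len)
    return metadata, body
```

If the command blocks between chunks (for example, while doing its own slow writes back to Splunk) instead of looping on reads like this, Splunk's write buffer toward the process can fill up and produce exactly this error.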
Another issue: when I write a large amount of data to Splunk, I sometimes also get this exception:
File=search_command.py, Line=1016, IOError at "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/internals.py", line 820 : [Errno 32] Broken pipe
Traceback:
File "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/search_command.py", line 801, in _process_protocol_v2
self._execute(ifile, None)
File "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/streaming_command.py", line 54, in _execute
SearchCommand._execute(self, ifile, self.stream)
File "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/search_command.py", line 869, in _execute
self._record_writer.write_records(process(self._records(ifile)))
File "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/internals.py", line 528, in write_records
write_record(record)
File "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/internals.py", line 651, in _write_record
self.flush(partial=True)
File "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/internals.py", line 774, in flush
self._write_chunk(metadata, self._buffer.getvalue())
File "/Applications/Splunk/etc/apps/searchcommands_app/bin/packages/splunklib/searchcommands/internals.py", line 820, in _write_chunk
write(body)
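The `[Errno 32] Broken pipe` at the final `write(body)` means the process on the other end of stdout (the Splunk search process) exited or stopped reading. As a hedged sketch, the write could translate EPIPE into a clean shutdown instead of an unhandled traceback; the function name and behavior here are my own illustration, not the SDK's actual handling:

```python
import errno
import sys

def write_chunk_body(ofile, body):
    """Write one chunk body to the output stream.

    A broken pipe (Errno 32) means the downstream reader has gone away,
    so there is nothing useful left to send; exit cleanly rather than
    letting the OSError propagate as a traceback.
    """
    try:
        ofile.write(body)
        ofile.flush()
    except OSError as e:
        if e.errno == errno.EPIPE:
            # Splunk closed our stdout; stop producing output.
            sys.stderr.write('downstream reader closed the pipe; exiting\n')
            raise SystemExit(1)
        raise
```

This does not fix whatever caused Splunk to stop reading, but it would turn the noisy IOError in the log into a deliberate exit.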
I thought you might want to know about this.