Export Files Throws Exception #920
Comments
Hmmm that's strange, did you try several times? I'll try to reproduce on one of my orgs :)
Yes, I have tried it quite a few times and it still occurs. Thanks for the response!
@SegalDaniel I tried to reproduce but didn't succeed :/ Can you please share your file export configuration? Example with my working test case:
@nvuillam appreciate the effort! Unfortunately I'm not able to make this work properly with different query versions.
When using a filtered query on the Account level, the Node error is thrown as follows:
[sfdx-hardis] scripts/files/AccountsAttachmentsProd
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
Node.js v22.11.0
When using a non-filtered query on the Account level, the CSV parsing error is thrown as follows:
[sfdx-hardis] 🦙 Please select a files workspace to EXPORT
Look up in VsCode ⬆️
@SegalDaniel the issue seems to appear at the first call to the Bulk API, so I suspect an authorization issue. Can you please check that your user has the following permissions?
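In the meantime, a quick way to see whether the very first Bulk API query is rejected for this user, outside of sfdx-hardis, could look like the sketch below. It assumes the jsforce-style `conn.bulk.query` API, and the `SF_INSTANCE_URL` / `SF_ACCESS_TOKEN` environment variables are placeholders for however you obtain a session:

```ts
// Standalone check: does a plain Bulk API query work for this user?
// Assumes jsforce (bulk.query API); SF_INSTANCE_URL / SF_ACCESS_TOKEN are placeholders.
import { Connection } from "jsforce";

async function checkBulkAccess(): Promise<void> {
  const conn = new Connection({
    instanceUrl: process.env.SF_INSTANCE_URL,
    accessToken: process.env.SF_ACCESS_TOKEN,
  });

  const records: unknown[] = [];
  await new Promise<void>((resolve, reject) => {
    conn.bulk
      .query("SELECT Id FROM Account LIMIT 10")
      .on("record", (rec: unknown) => records.push(rec))
      .on("error", reject) // an authorization problem should surface here, before any file download
      .on("end", resolve);
  });
  console.log(`Bulk query OK, ${records.length} records returned`);
}

checkBulkAccess().catch((err) => {
  console.error("Bulk API call failed:", err);
  process.exit(1);
});
```

If this fails right away, the problem is likely on the permission side rather than in the export logic itself.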
Hi,
I'm using the export files feature to export Account attachments from my org, but I'm receiving an unhandled 'error' event at runtime.
[sfdx-hardis] Processing parent records chunk #1 on 12 (1000 records) ...
throw er; // Unhandled 'error' event
^
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
at _write (node:internal/streams/writable:489:11)
at Writable.write (node:internal/streams/writable:510:10)
at DuplexifiedStream.ondata (node:internal/streams/readable:1009:22)
at DuplexifiedStream.emit (node:events:518:28)
at addChunk (node:internal/streams/readable:561:12)
at readableAddChunkPushByteMode (node:internal/streams/readable:512:3)
at Readable.push (node:internal/streams/readable:392:5)
at DuplexifiedStream.readStream (/Users/.local/share/sf/node_modules/@jsforce/jsforce-node/lib/util/stream.js:81:18)
at PassThrough.<anonymous> (/Users/.local/share/sf/node_modules/@jsforce/jsforce-node/lib/util/stream.js:64:18)
at PassThrough.emit (node:events:518:28)
at emitReadable (node:internal/streams/readable:834:12)
at process.processTicksAndRejections (node:internal/process/task_queues:89:21)
Emitted 'error' event on PassThrough instance at:
at PassThrough.onerror (node:internal/streams/readable:1028:14)
at PassThrough.emit (node:events:518:28)
at emitErrorNT (node:internal/streams/destroy:170:8)
at emitErrorCloseNT (node:internal/streams/destroy:129:3)
at process.processTicksAndRejections (node:internal/process/task_queues:90:21) {
code: 'ERR_STREAM_WRITE_AFTER_END'
}
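For what it's worth, ERR_STREAM_WRITE_AFTER_END is what Node raises when something keeps writing to a writable stream after end() has been called on it. A minimal standalone reproduction of the failure mode (illustration only, nothing to do with sfdx-hardis code) is:

```ts
// Minimal reproduction of ERR_STREAM_WRITE_AFTER_END (illustration only):
// the stream is ended, then another chunk is written to it.
import { PassThrough } from "node:stream";

const out = new PassThrough();
out.on("error", (err: NodeJS.ErrnoException) => {
  // Same error code as in the stack trace above
  console.error(err.code); // -> ERR_STREAM_WRITE_AFTER_END
});

out.end("first chunk");
out.write("second chunk"); // write after end() -> 'error' event on the stream
```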
In addition to that, when trying to perform this on a larger number of records, I get a CSV parsing error, without being able to identify the problematic record(s).
Bulk Query error: Error: Invalid Closing Quote: got "I" at line 8051 instead of delimiter, record delimiter, trimable character (if activated) or comment.
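If it helps narrow this down, that parser message usually means a quoted field contains an unescaped double quote, so the parser sees a stray character right after what it took to be the closing quote. A rough script like the one below is how I would look for the suspicious record around line 8051; the file name is a placeholder (not an actual sfdx-hardis artifact name), and the odd-quote heuristic can give false positives for fields with embedded newlines:

```ts
// Rough heuristic scan of the Bulk API CSV output for lines whose double quotes
// are unbalanced, which is typically the record that breaks the parser.
// "bulkQueryResult.csv" is a placeholder path.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

async function findUnbalancedQuoteLines(csvPath: string): Promise<void> {
  const rl = createInterface({ input: createReadStream(csvPath) });
  let lineNumber = 0;
  for await (const line of rl) {
    lineNumber++;
    const quoteCount = (line.match(/"/g) ?? []).length;
    if (quoteCount % 2 !== 0) {
      console.log(`Line ${lineNumber} has an odd number of quotes: ${line}`);
    }
  }
}

findUnbalancedQuoteLines("bulkQueryResult.csv").catch(console.error);
```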
Any ideas? :)