Export Files Throws Exception #920

Open
SegalDaniel opened this issue Dec 10, 2024 · 5 comments

@SegalDaniel

Hi,

I'm using the export files feature to export Account attachments for my org, but I'm getting an unhandled 'error' event at runtime.

[sfdx-hardis] Processing parent records chunk #1 on 12 (1000 records) ...
throw er; // Unhandled 'error' event
^

Error [ERR_STREAM_WRITE_AFTER_END]: write after end
at _write (node:internal/streams/writable:489:11)
at Writable.write (node:internal/streams/writable:510:10)
at DuplexifiedStream.ondata (node:internal/streams/readable:1009:22)
at DuplexifiedStream.emit (node:events:518:28)
at addChunk (node:internal/streams/readable:561:12)
at readableAddChunkPushByteMode (node:internal/streams/readable:512:3)
at Readable.push (node:internal/streams/readable:392:5)
at DuplexifiedStream.readStream (/Users/.local/share/sf/node_modules/@jsforce/jsforce-node/lib/util/stream.js:81:18)
at PassThrough.<anonymous> (/Users/.local/share/sf/node_modules/@jsforce/jsforce-node/lib/util/stream.js:64:18)
at PassThrough.emit (node:events:518:28)
at emitReadable (node:internal/streams/readable:834:12)
at process.processTicksAndRejections (node:internal/process/task_queues:89:21)
Emitted 'error' event on PassThrough instance at:
at PassThrough.onerror (node:internal/streams/readable:1028:14)
at PassThrough.emit (node:events:518:28)
at emitErrorNT (node:internal/streams/destroy:170:8)
at emitErrorCloseNT (node:internal/streams/destroy:129:3)
at process.processTicksAndRejections (node:internal/process/task_queues:90:21) {
code: 'ERR_STREAM_WRITE_AFTER_END'
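
For context, ERR_STREAM_WRITE_AFTER_END is Node's generic error for writing to a Writable stream after end() has been called. A minimal sketch (plain Node streams, not the sfdx-hardis or jsforce code) that triggers the same error class:

import { PassThrough } from "node:stream";

// Minimal reproduction of ERR_STREAM_WRITE_AFTER_END:
// once end() has been called, any further write() is rejected.
const stream = new PassThrough();
stream.on("error", (err) => console.error(err)); // without this listener, Node crashes with "Unhandled 'error' event"

stream.write("first chunk");
stream.end();               // closes the writable side
stream.write("late chunk"); // -> Error [ERR_STREAM_WRITE_AFTER_END]: write after end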

In addition to that, when trying to run this on a larger number of records, I got a CSV parsing error, without being able to identify the problematic record(s).

Bulk Query error: Error: Invalid Closing Quote: got "I" at line 8051 instead of delimiter, record delimiter, trimable character (if activated) or comment.

Any ideas? :)

@nvuillam
Member

Hmmm, that's strange, did you try several times?
The bulk query error makes me think of a connection issue ^^

I'll try to reproduce on one of my orgs :)

@SegalDaniel
Author

SegalDaniel commented Dec 16, 2024

Yes, I have tried it quite a few times and it still occurs.
I was thinking it might be a connection issue as well, but I saw that the number of records & chunks is calculated correctly.
This issue happens consistently, with different versions of the query.
Maybe the error occurs while JSforce is querying the data via the Bulk API and writes to a stream that was already closed.

Thanks for the response!
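
For illustration, this is roughly how a Bulk API v2 query record stream is consumed with jsforce 2.x. The function below is a hedged sketch (not the actual sfdx-hardis code, and the helper name is made up); it mainly shows where an 'error' listener on the record stream would surface a failure like the one above instead of crashing the process with an unhandled 'error' event:

import { Connection } from "@jsforce/jsforce-node";

// Hedged sketch: run a Bulk API v2 query and collect record Ids,
// routing stream failures through the returned promise.
async function bulkQueryIds(conn: Connection, soql: string): Promise<string[]> {
  const ids: string[] = [];
  const recordStream = await conn.bulk2.query(soql);
  return new Promise((resolve, reject) => {
    recordStream
      .on("record", (record: any) => ids.push(record.Id))
      .on("error", (err: Error) => reject(err)) // e.g. the ERR_STREAM_WRITE_AFTER_END above
      .on("end", () => resolve(ids));
  });
}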

@nvuillam
Member

nvuillam commented Dec 16, 2024

@SegalDaniel I tried to reproduce but didn't succeed :/

Can you please share your file export configuration?

Example with my working test case:

🦙 This export of files could run on 19758 records, in 20 chunks, and consume up to 41 API calls on the 291318 remaining API calls. Do you want to proceed ?

Config:
{
  "sfdxHardisLabel": "Opp",
  "sfdxHardisDescription": "",
  "soqlQuery": "SELECT Id,Name FROM Opportunity",
  "fileTypes": "all",
  "outputFolderNameField": "Name",
  "outputFileNameFormat": "id_title",
  "overwriteParentRecords": true,
  "overwriteFiles": false
}

Result:
[sfdx-hardis] API limit: 307000
[sfdx-hardis] API used before process: 15682
[sfdx-hardis] API used after process: 15683
[sfdx-hardis] API calls remaining for today: 291317
[sfdx-hardis] Total SOQL requests: 42
[sfdx-hardis] Total parent records found: 19758
[sfdx-hardis] Total parent records with files: 0
[sfdx-hardis] Total parent records ignored because already existing: 0
[sfdx-hardis] Total files downloaded: 5624
[sfdx-hardis] Total file download errors: 9
[sfdx-hardis] Total file skipped because of type constraint: 0
[sfdx-hardis] Total file skipped because previously downloaded: 0
[sfdx-hardis] Successfully exported files from project scripts\files\Opp from org [email protected]  
[sfdx-hardis] hardis:org:files:export execution time 1:08:03.538

@SegalDaniel
Author

SegalDaniel commented Dec 17, 2024

@nvuillam appreciate the effort!

Unfortunately, I'm not able to make this work properly with any variation of the query.

When using a filtered query at the Account level, the Node error is thrown as follows:

{
  "sfdxHardisLabel": "Account Attachments - Prod",
  "sfdxHardisDescription": "",
  "soqlQuery": "SELECT Id FROM Account WHERE CustomField__c = 'Test' AND SecondCustomField__c = 'Test'",
  "fileTypes": "all",
  "outputFolderNameField": "Id",
  "outputFileNameFormat": "id_title",
  "overwriteParentRecords": true,
  "overwriteFiles": true
}

[sfdx-hardis] scripts/files/AccountsAttachmentsProd
[sfdx-hardis] SOQL REST: SELECT COUNT() FROM Account WHERE CustomField__c = 'Test' AND SecondCustomField__c = 'Test' on https://.my.salesforce.com
[sfdx-hardis] 🦙 This export of files could run on 2269 records, in 2 chunks, and consume up to 5 API calls on the 4186737 remaining API calls. Do you want to proceed ? Look up in VsCode ⬆️
[sfdx-hardis] {"value":true}
[sfdx-hardis] Use --startchunknumber command line argument if you do not want to start from first chunk
[sfdx-hardis] [BulkApiV2] SELECT Id FROM Account WHERE CustomField__c = 'Test' AND SecondCustomField__c = 'Test'
🌓 [BulkApiV2] Bulk Query: SELECT Id FROM Account WHERE CustomField__c = 'Test' AND SecondCustomField__c = 'Test'
node:events:496
throw er; // Unhandled 'error' event
^

Error [ERR_STREAM_WRITE_AFTER_END]: write after end
at _write (node:internal/streams/writable:489:11)
at Writable.write (node:internal/streams/writable:510:10)
at DuplexifiedStream.ondata (node:internal/streams/readable:1009:22)
at DuplexifiedStream.emit (node:events:518:28)
at addChunk (node:internal/streams/readable:561:12)
at readableAddChunkPushByteMode (node:internal/streams/readable:512:3)
at Readable.push (node:internal/streams/readable:392:5)
at DuplexifiedStream.readStream (/Users/daniel.segal/.local/share/sf/node_modules/@jsforce/jsforce-node/lib/util/stream.js:81:18)
at PassThrough.<anonymous> (/Users/daniel.segal/.local/share/sf/node_modules/@jsforce/jsforce-node/lib/util/stream.js:64:18)
at PassThrough.emit (node:events:518:28)
at emitReadable (node:internal/streams/readable:834:12)
at process.processTicksAndRejections (node:internal/process/task_queues:89:21)
Emitted 'error' event on PassThrough instance at:
at PassThrough.onerror (node:internal/streams/readable:1028:14)
at PassThrough.emit (node:events:518:28)
at emitErrorNT (node:internal/streams/destroy:170:8)
at emitErrorCloseNT (node:internal/streams/destroy:129:3)
at process.processTicksAndRejections (node:internal/process/task_queues:90:21) {
code: 'ERR_STREAM_WRITE_AFTER_END'
}

Node.js v22.11.0

When using a non-filtered query at the Account level, the CSV parsing error is thrown as follows:

{
  "sfdxHardisLabel": "Account Attachments - Prod",
  "sfdxHardisDescription": "",
  "soqlQuery": "SELECT Id FROM Account",
  "fileTypes": "all",
  "outputFolderNameField": "Id",
  "outputFileNameFormat": "id_title",
  "overwriteParentRecords": true,
  "overwriteFiles": true
}

[sfdx-hardis] 🦙 Please select a files workspace to EXPORT Look up in VsCode ⬆️
[sfdx-hardis] {"value":"scripts/files/AccountsAttachmentsProd"}
[sfdx-hardis] 🦙 Do you want to use default configuration for Account Attachments - Prod ? Look up in VsCode ⬆️
[sfdx-hardis] {"value":true}
[sfdx-hardis] Exporting files from [AccountsAttachmentsProd]: Account Attachments - Prod ...
[sfdx-hardis] scripts/files/AccountsAttachmentsProd
[sfdx-hardis] SOQL REST: SELECT COUNT() FROM Account on https://my.salesforce.com
[sfdx-hardis] 🦙 This export of files could run on 608498 records, in 608 chunks, and consume up to 1217 API calls on the 4187917 remaining API calls. Do you want to proceed ? Look up in VsCode ⬆️
[sfdx-hardis] {"value":true}
[sfdx-hardis] Use --startchunknumber command line argument if you do not want to start from first chunk
[sfdx-hardis] [BulkApiV2] SELECT Id FROM Account
🌗 [BulkApiV2] Bulk Query: SELECT Id FROM Account
[sfdx-hardis] Bulk Query error: Error: Invalid Closing Quote: got "I" at line 14825 instead of delimiter, record delimiter, trimable character (if activated) or comment
✖ [BulkApiV2] Bulk query error: Invalid Closing Quote: got "I" at line 14825 instead of delimiter, record delimiter, trimable character (if activated) or comment
Error (CSV_INVALID_CLOSING_QUOTE): Invalid Closing Quote: got "I" at line 14825 instead of delimiter, record delimiter, trimable character (if activated) or comment
[sfdx-hardis] Bulk Query error: AbortError: The operation was aborted
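
For reference, the CSV_INVALID_CLOSING_QUOTE code appears to come from the csv-parse library that parses the Bulk API CSV result. A small hedged sketch (the sample data is made up) showing how the same error class is produced when a closing quote is immediately followed by an unexpected character, which can happen if a response is truncated or corrupted mid-record:

import { parse } from "csv-parse/sync";

// Hypothetical sample: on line 3 the closing quote is followed by "I" instead of a
// delimiter, which reproduces the CSV_INVALID_CLOSING_QUOTE error class seen above.
const malformed = 'Id,Name\n"001xx000003DGb2AAG","Acme"\n"001xx000003DGb3AAG"Id,"Globex"\n';

try {
  parse(malformed, { columns: true });
} catch (err: any) {
  console.error(err.code);    // CSV_INVALID_CLOSING_QUOTE
  console.error(err.message); // Invalid Closing Quote: got "I" at line 3 ...
}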

@nvuillam
Member

@SegalDaniel the issue seems to appear at the first call to the Bulk API, so I suspect an authorization issue.

Can you please check that your user has the following permissions?

[Screenshot: required user permissions]
