Topics

#fabric #fabric-sdk-node

soumya nayak
 

Hi Team,

I have a question regarding bulk upload of data from a CSV file to the blockchain.

  •  
    This is the code in my chaincode (Go) file:
    for i := 0; i < count; i++ {
        property := Property{PropertyId: strconv.Itoa(startPropId), LegalDescription: "xyz" + strconv.Itoa(startPropId), EntityId: "DBS"}

        // Set the timestamp before marshaling, so it is included in the stored bytes.
        loc, _ := time.LoadLocation("America/Los_Angeles")
        t := time.Now().In(loc)
        property.TimeStamp = t.Format(time.RFC3339)

        propertiesAsBytes, _ := json.Marshal(property)
        APIstub.PutState(property.PropertyId, propertiesAsBytes)
        startPropId++
    }
  •  
    When I upload a CSV of 10K records, it creates just one block containing all 10K records.
  •  
    But the moment I give 15K or 20K records in the CSV file, it throws the error below:
    •  
      2019-08-07T15:25:05.115Z - error: [Transaction]: _validatePeerResponses: No valid responses from any peers. 1 peer error responses:
      peer=peer1, status=500, message=failed to execute transaction 0ef8897d6cd2fb3b623afd160d1d8aeef0bf2482a6cd383201e10483e01c3397: error sending: timeout expired while executing transaction
      (node:78791) UnhandledPromiseRejectionWarning: Error: No valid responses from any peers. 1 peer error responses:
      peer=peer1, status=500, message=failed to execute transaction 0ef8897d6cd2fb3b623afd160d1d8aeef0bf2482a6cd383201e10483e01c3397: error sending: timeout expired while executing transaction
      at Transaction._validatePeerResponses (/home/ranjan/LDBCApp/node_modules/fabric-network/lib/transaction.js:227:10)
      at Transaction.submit (/home/ranjan/LDBCApp/node_modules/fabric-network/lib/transaction.js:145:33)
      at <anonymous>
      (node:78791) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
      (node:78791) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

  •  
    Also, sometimes I have gotten the error below:
    UnhandledPromiseRejectionWarning: Error: Failed to send transaction successfully to the orderer status:SERVICE_UNAVAILABLE
    at BasicCommitHandler._commit (/home/ranjan/LDBCApp/node_modules/fabric-client/lib/impl/BasicCommitHandler.js:120:23)
    at <anonymous>
    at process._tickCallback (internal/process/next_tick.js:189:7)
    (node:70577) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 5)
    (node:70577) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

    So, based on these parameters in configtx.yaml, I am not understanding how the block is getting created:
    BatchTimeout: 2s
    BatchSize:
      MaxMessageCount: 10
      AbsoluteMaxBytes: 99 MB
      PreferredMaxBytes: 512 KB

    Please let me know how we should go about this and what the best way of doing bulk uploads is, as I have millions of records to upload to the blockchain first from a SQL Server database.

    Regards,
    Soumya

soumya nayak
 

Hi Team,

Any update or ideas on the above?

Regards,
Soumya

Gari Singh
 

From the code snippet, I assume you are sending a single CSV file to your chaincode, then looping through it and calling PutState for each line/record in the CSV file.
I do not think you are actually getting a block with 10K transactions. From a Fabric perspective, this is a single transaction. That's why you get a single block with a single Fabric transaction which then includes all of the PutStates in the writeset.

For the error with 15/20K records, it looks like you are hitting a timeout: either the chaincode execution timeout itself (set as a property of the peer) or a timeout on the invoke call in the SDK.
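[Editor's sketch of the peer-side knob mentioned above, assuming a 1.4-era Fabric peer; the 300s value is illustrative:]

```shell
# Peer side: allow longer-running chaincode invocations.
# Overrides chaincode.executetimeout in core.yaml (default 30s).
export CORE_CHAINCODE_EXECUTETIMEOUT=300s
```

On the SDK side, the exact knob depends on the version in use; for example, the 1.4 fabric-network Gateway accepts an eventHandlerOptions.commitTimeout connect option, and fabric-client reads a request-timeout setting.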


-----------------------------------------
Gari Singh
Distinguished Engineer, CTO - IBM Blockchain
IBM Middleware
550 King St
Littleton, MA 01460
Cell: 978-846-7499
garis@...
-----------------------------------------


soumya nayak
 

Hi Gari,

Is the block getting created because of the BatchTimeout: 2s parameter?

So what is the best way to do a bulk upload for millions of records? Creating a million blocks, with one record per block, would take a lot of time.

Thanks and Regards,
Soumya

Gari Singh
 

Oh ... right ... in your case it would be the BatchTimeout which would force creation of blocks.

You can do things the way you were doing them, you'll just need to find the right combination of timeout settings which work.
You also need to be wary of the 100MB limit currently enforced for payloads in Fabric and break apart your millions of records into smaller batches.

