javascript - Import from MySQL to ArangoDB consistently incomplete (own Node.js script)
In order to migrate a MySQL table to an ArangoDB collection, I wrote a naive Node.js script.
This works quite well, except that records are always lost, as if the connection had been closed too early. Which documents end up missing seems random; however, it is always the same amount:
- The source has 68,750 records.
- 68,682 (-68) documents are created in ArangoDB.
var mysql = require('mysql');
var arango = require('arango');

var docs = [];

function processRow (row, connection) {
    if (docs.length < 1000 && row !== false) {
        docs.push(row);
    } else {
        // pause the MySQL stream while the batch is written to ArangoDB
        connection.pause();
        db.import.importJSONData("target_collection", JSON.stringify(docs, function (key, value) {
            // drop NULLs and empty/whitespace-only strings
            if (value == null || (typeof value === "string" && !value.trim())) {
                return undefined;
            } else {
                return value;
            }
        }), {
            createCollection: true,
            waitForSync: false
        }, function (err, ret) {
            docs = [];
            connection.resume();
            if (row === false) process.exit();
        });
    }
}

var connection = mysql.createConnection({
    host: 'localhost',
    user: 'root',
    password: ''
});

var db = arango.Connection("http://localhost:8529/my_database");

connection.connect();

var query = connection.query('SELECT * FROM my_database.source_table');
var i = 0;

query
    .on('error', function (err) {
        console.log(err);
    })
    .on('result', function (row) {
        i++;
        if (i % 1000 == 0) console.log(i);
        processRow(row, connection);
    })
    .on('end', function () {
        processRow(false, connection); // false signals the last batch
    });
An alternative implementation that I wrote, which uses a Transform stream, actually imports even fewer records (68,744) into the target collection, although it logs every n-th record right up to the last record of the source.
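That Transform-stream variant is not shown here; purely as an illustration, a batching transform of that kind might look like the sketch below (the BatchStream name and details are assumptions, not the actual code). Note that _flush is exactly where a final partial batch can be silently dropped:

var stream = require('stream');
var util = require('util');

// Collects incoming row objects into batches of `batchSize`
// and emits each batch as a single array downstream.
function BatchStream (batchSize) {
    stream.Transform.call(this, { objectMode: true });
    this.batchSize = batchSize;
    this.batch = [];
}
util.inherits(BatchStream, stream.Transform);

BatchStream.prototype._transform = function (row, enc, done) {
    this.batch.push(row);
    if (this.batch.length >= this.batchSize) {
        this.push(this.batch);
        this.batch = [];
    }
    done();
};

// Runs once when the source ends; forgetting to emit the
// remaining partial batch here silently loses the tail records.
BatchStream.prototype._flush = function (done) {
    if (this.batch.length > 0) this.push(this.batch);
    done();
};

connection.query('SELECT * FROM my_database.source_table')
    .stream()
    .pipe(new BatchStream(1000))
    .on('data', function (batch) {
        // hand `batch` to db.import.importJSONData as in the script above
    });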
Am I missing something?
A counter variable confirms that all 68,750 records are read, and there is no source record that is completely empty (all columns NULL), because at the very least the primary key is an integer (I have also tried it without the custom JSON.stringify replacer).
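That counter check is not shown either; a minimal sketch of what it could look like (assuming the query object from the script above, and that attaching additional listeners to it is acceptable):

var count = 0;
query
    .on('result', function (row) {
        count++;
        // a record only counts as completely empty if every column is NULL
        var allNull = Object.keys(row).every(function (col) {
            return row[col] === null;
        });
        if (allNull) console.log('empty record at row ' + count);
    })
    .on('end', function () {
        console.log('read ' + count + ' records'); // 68,750 in this case
    });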
Solution:
There is a minor error in your processRow function. You invoke it for every row and push the rows into the docs array. When it is invoked while docs already holds 1000 rows, the batch is written to ArangoDB and you continue with the next row, and here is the error: the row that triggered the flush is never stored in docs at any point. This also explains the constant deficit: over 68,750 rows the full batch is flushed 68 times, and each flush drops exactly one row, hence 68 missing documents. One possible fix:
db.import.importJSONData("target_collection", JSON.stringify(docs, function (key, value) {
    if (value == null || (typeof value === "string" && !value.trim())) {
        return undefined;
    } else {
        return value;
    }
}), {
    createCollection: true,
    waitForSync: false
}, function (err, ret) {
    docs = [row]; // <-- insert the row that triggered this flush
    connection.resume();
    if (row === false) process.exit();
});
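Beyond that one-line fix, a variant that avoids the off-by-one altogether is to push the row before checking the batch size. This is only a sketch under the same assumptions as the original script (the custom replacer is omitted for brevity):

function processRow (row, connection) {
    if (row !== false) docs.push(row); // every row is stored first
    if (docs.length >= 1000 || row === false) {
        connection.pause();
        db.import.importJSONData("target_collection", JSON.stringify(docs), {
            createCollection: true,
            waitForSync: false
        }, function (err, ret) {
            docs = [];
            connection.resume();
            if (row === false) process.exit();
        });
    }
}

Pushing first keeps the invariant simple: every real row lands in docs exactly once, no matter when the flush happens.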