I am trying to import 50 thousand records, and the import is very slow.
I noticed it imports 100 rows at a time.
When I run the same import on another machine, it imports 500 rows at a time.
What could be the reason my machine imports only 100 rows at a time?
How can we increase it?
I don't have an answer for this, but I just wanted to chime in and say that I would also like to import more rows at a time, and I agree the import is very slow. I have a process that imports 300,000 rows, and it takes 20 minutes.
-Greg
The default block size is 500 rows. However, if you have a lot of columns, which increases the row length, processing 500 rows at a time can cause the app to run out of memory. So when there are more than 50 columns, we drop the block size to 100. We plan to improve this area in 3.0. For now, you can manually adjust this in the import template: open it in Notepad and change BlockInsertValue="100" to a different value.
Yes, I found this to be true.
If the table has more than 50 columns, it inserts 100 rows at a time.
It took me a couple of days to figure out how to change the value of **BlockInsertValue**, and it's simple…
First, save the import as an import template file (.tim) by checking the checkbox on the last page of the import wizard. Then open the .tim file in Notepad and search for BlockInsertValue; you will find BlockInsertValue="100". Change the value to 500, i.e. BlockInsertValue="500", and save the file. When you execute the template, the import engine will insert the data 500 rows per block instead of 100.
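In case anyone has to do this for several templates, here is a small sketch (Python) that automates the same find-and-replace you would do in Notepad. It assumes the .tim file is plain text and that the setting is written exactly as BlockInsertValue="100"; the file name and the UTF-8 encoding are my own assumptions, so adjust them to your setup.

```python
import re
import sys

# Hypothetical path to a saved import template -- change to your own .tim file.
TEMPLATE_PATH = "my_import.tim"
NEW_BLOCK_SIZE = "500"

# Read the template as plain text (encoding is an assumption; Notepad-edited
# files may also be ANSI or UTF-16, in which case adjust the encoding).
with open(TEMPLATE_PATH, "r", encoding="utf-8") as f:
    content = f.read()

# Replace whatever block size is currently set (e.g. "100") with the new value.
updated, count = re.subn(
    r'BlockInsertValue="\d+"',
    f'BlockInsertValue="{NEW_BLOCK_SIZE}"',
    content,
)

if count == 0:
    sys.exit("BlockInsertValue not found -- check that this is the right template file.")

with open(TEMPLATE_PATH, "w", encoding="utf-8") as f:
    f.write(updated)

print(f"Updated {count} occurrence(s) of BlockInsertValue to {NEW_BLOCK_SIZE}.")
```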