Hi Everyone,
We are using the Javelin bulk import functionality to load high volumes of data into an Oracle database, and it works very well.
The problem is that the target table has a unique index on it.
When I run the load job twice, the bulk import 1. disables the unique index, 2. loads the duplicate data, and 3. leaves the index in an unusable state because of the duplicate records.
This then causes errors when inserting or deleting records.
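As context for what I am looking for, here is a rough sketch of the kind of pre-load duplicate check I have in mind. The row format and key columns are only placeholders; I do not know whether Javelin exposes a hook like this, which is exactly what I am asking about:

```python
def dedupe_rows(rows, key_columns):
    """Keep only the first row for each unique-key combination.

    rows: list of dicts, one per record to be loaded
    key_columns: the columns covered by the target table's unique index
    """
    seen = set()
    unique_rows = []
    for row in rows:
        key = tuple(row[col] for col in key_columns)
        if key not in seen:          # skip any repeat of a key already loaded
            seen.add(key)
            unique_rows.append(row)
    return unique_rows

# Rerunning the same batch should not produce extra rows:
batch = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": "b"},
    {"id": 1, "name": "a"},  # duplicate from the second run
]
deduped = dedupe_rows(batch, ["id"])
print(len(deduped))  # 2
```

Ideally something like this would run inside the bulk upload itself, before the rows reach the target table, so the unique index is never violated.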
Is there any way to do a duplicate check while loading data into the target table using bulk upload in Javelin?
------------------------------
Thanks
Rajesh
------------------------------