Transform and Cleanse - Remove Duplicates and Load to Local Storage - Not Working?

Greetings,

I have been having a consistent problem with Transform and Cleanse ever since the new Transform and Cleanse interface was introduced. I kept hoping it would clear up with subsequent releases, or better yet, that I would come to understand what was going on, but I am still having the problem.

It has to do with the Find Duplicates feature and the removal of duplicate rows when the remainder is subsequently moved to Local Storage, especially when working with a large number of records, anywhere from 300,000 rows to over a million. I have the same problem whether I use the new Transform and Cleanse interface or the Legacy Transform and Cleanse.

When I select the key column to search for duplicate records, choose to keep only the first occurrence and discard the rest (Exclude all but one), and then move the data to Local Storage, it does not work correctly in the latest versions (3.6, 3.7, and 3.8). The import to Local Storage starts, but as soon as it reaches the first duplicate (say, around row 75,000), it errors out and stops with a pop-up saying a duplicate key record was found, even though I chose Apply and Apply All on the Remove Duplicates filter.
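To be clear about what I expect: with Exclude all but one selected, the rows arriving in Local Storage should be equivalent to a keep-first-occurrence pass over the key column. A rough sketch of that behavior in Python with pandas (the file names and the CUST_ID key column are just placeholders for my actual data):

    import pandas as pd

    # Load the exported rows (CUST_ID stands in for the key column).
    df = pd.read_csv("export.csv")

    # Keep only the first occurrence of each key value, which is what
    # I understand "Exclude all but one" is supposed to do.
    deduped = df.drop_duplicates(subset="CUST_ID", keep="first")

    # Write the cleaned rows out for loading into Local Storage.
    deduped.to_csv("export_deduped.csv", index=False)

That pass never leaves a duplicate key behind, so the subsequent load should not be able to hit a duplicate-key error.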

If I take the same data and mimic the same process in Toad Data Point 3.4, it works as expected, although I do have to open Local Storage in 3.7 first, and then open Local Storage in 3.4 and do it from there.

This is very frustrating, as I would like to cease using 3.4 altogether.

Am I doing something wrong? Is there an improper setting, or…? It may just be me, as I would have assumed others would have run into the same issue by now.

Any help you can give me would be much appreciated.

Thanks!!

Richard

That definitely is not right. Is there any way you can send me 100,000 or so rows of your data so I can reproduce the issue? This is obviously a data-driven error. You can send it to debbie.peabody@quest.com.

We are not able to reproduce this issue. Some sample data and more information are needed. You can send it to me or open a support ticket.