Select to File: Cannot access a closed file error

We are getting a "Cannot access a closed file" error exporting a large volume of data to csv using Select to File . The physical file size is 739,851 KB which is being overwritten each time. It's not really clear what's happening or if it's impacting the ability to write everything in its entirety to a file. Wondering what the limits of this object are or how to mitigate the surrounding circumstances.

First of all, upgrade to the latest version of TDP. This was a known issue in previous versions, and it may have been fixed in the latest release (I can't verify right now).

If you're still getting this issue even in the latest version, and assuming you have the Professional Edition of TDP, a workaround might be to save the data to your Local Storage and then export from there.

I think I see what happened. We were using v5 and downloaded v5.1, but it installed a new instance rather than upgrading the existing one, and that led to some confusion. We'll have to validate the install and try again. Thanks.

The installer should upgrade the existing version by default, unless a different install directory was given. That said, I'd be interested to see if your issue abates once you get the installs squared away... :slight_smile:

I don't recall the option, but it did install to a 5.1 directory. So all the existing processes need to point to the new toad.exe, I presume.

There is one other issue. I've got a Teradata table set up to import raw data from a file that TDP exports to. The table has an identity key set to NO CYCLE, so by default it generates the ID when one isn't present. If I attempt to load the table without including the ID, TDP doesn't know what to do and the CPU spikes. So I included a ROW_NUMBER in the query output to make sure the ID exists; it gets written to the file and is unique. When I attempt to import the file, it encounters a duplication error, and the rows specified in the error change every time, yet the ID values in the file being imported are unique. The ID is just there to ensure there is no duplication in the table, but it's still running into problems even though the PK is unique.
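The export query looks roughly like this (a sketch only; the source table and the ORDER BY column are hypothetical, and the output columns mirror the table definition below):

SELECT
    ROW_NUMBER() OVER (ORDER BY src.create_date) AS id,  -- generates a unique ID for every exported row
    src.user_id,
    src.user_guid,
    src.server_name,
    src.event_name,
    src.create_date,
    src.user_portals,
    src.profile_id
FROM some_db.event_source src;  -- hypothetical source table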

CREATE SET TABLE db.log_tmp ,FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      ID NUMBER(18,0) GENERATED ALWAYS AS IDENTITY (NO CYCLE),
      USER_ID VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      USER_GUID VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
      SERVER_NAME VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      EVENT_NAME VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
      CREATE_DATE DATE FORMAT 'YYYY-MM-DD' NOT NULL,
      USER_PORTALS VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      PROFILE_ID INTEGER NOT NULL
     )
UNIQUE PRIMARY INDEX (ID);


Not knowing much about Teradata, I won't be much help, I'm afraid. Are you re-importing to the table fresh every time? Or does the import append to existing rows in the table? What happens if you define the ID column with the CYCLE option?

In this case I'm purging the temp table, which I could let TDP rebuild, but sometimes I run into data-type conversion issues with TDP and have to use Unicode. This isn't one of those cases, but I was encountering duplication and decided to maintain the table structure so I could control how the records are added. At this point I've made the table MULTISET instead of SET to see if I can get all the records in. I'm not sure why the ID wasn't being accepted as unique, so I'm going to loosen the restrictions on the temp table (which gets loaded into a production table) and see what happens.

As for CYCLE, I opted for the first version (NO CYCLE).
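If it still rejects rows, a further-loosened version of the staging table might look like this (a sketch only, not tested; it assumes the duplicate errors come from the SET/unique-index checks, and that Teradata's GENERATED BY DEFAULT, unlike GENERATED ALWAYS, accepts the ID values supplied in the file):

CREATE MULTISET TABLE db.log_tmp ,FALLBACK ,  -- MULTISET: no duplicate-row check on load
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      ID NUMBER(18,0) GENERATED BY DEFAULT AS IDENTITY (NO CYCLE),  -- accepts IDs supplied by the import
      USER_ID VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      USER_GUID VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
      SERVER_NAME VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      EVENT_NAME VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC,
      CREATE_DATE DATE FORMAT 'YYYY-MM-DD' NOT NULL,
      USER_PORTALS VARCHAR(100) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL,
      PROFILE_ID INTEGER NOT NULL
     )
PRIMARY INDEX (ID);  -- non-unique primary index: no uniqueness enforcement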

We have version 5.1 up and running but are still running into this error intermittently. We are simply using a Select to File, and when it tries to write to the file it encounters the error. The only workaround might be to create the file beforehand; however, that worked earlier for one file and then failed for a second. It seems like it's not severing the connection to the file correctly, or something lingering from the last export is causing it to fail.

We just made another attempt and it wrote to the file but still failed, so it completed a portion of the export but not the whole thing, and it retrieved fewer records than last time. It's not an obvious issue to resolve on this end.

We killed TDP, renamed the output file, and reran, and it exported. I suspect it's not severing the connection somewhere, which killing the entire app resolves.

Any idea why this is still happening in v5.1? I'm trying to export a large data set, which takes a while, and then it kills the export. The only upside is that it did write a portion to file, but that has to be cleaned up and the process resumed from a different point.

Does this require a continue on error?

I'm hoping that one of the Dev team members can chime in here... I'll see if I can't get one of them to take a closer look at how TDP is managing Teradata connections. Meanwhile, it's probably a good idea to officially open a ticket with our Support team.

This appears to be a filestream error and nothing to do with Teradata. If you're developing with .NET (and even if you're not), there are more than enough examples online of how to resolve it. I think the only way we can solve it for now is to kill TDP and remove the last file that was opened, so the filestream is severed, and then re-export. A bit of a time waste, I'm afraid.

How does one go about opening a ticket? This problem is a deal-breaker for us. For some reason we had less trouble with the earlier version, but we really need a solution to the file export in order to keep using TDP.

I was really hoping the Export Wizard would work better than Select to File, but apparently not.


Disappointing.

It does partial exports only.


Hi,
at https://support.quest.com/my-account you'll see a list of your registered products. There, click the number of open cases, and you'll see a "Submit an SR" link.

I think our support expired at the end of the year, so someone else might need to put a ticket in that the "cannot access a closed file" error has not been resolved and occurs with both the Select to File and Export Wizard objects. I was up most of the night trying to deal with this issue manually.

Does this in any way have to do with block size?

Cannot access a closed file.
Inner exception:
System.ObjectDisposedException - Cannot access a closed file.
at Quest.Toad.Workflow.Activities.Database.ExportWizardActivity.Execute(ActivityExecutionContext executionContext)

Hi all,
Are there any commits in the export query? I don't know if it will help, but with Oracle, if a COMMIT ends the query used in the Export Wizard, it fails!
I'm using the latest version of TDP and regularly export billions of rows to CSV...
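For example (the table name is made up), this fails in the Export Wizard because of the trailing COMMIT:

SELECT * FROM orders;
COMMIT;

whereas the bare SELECT on its own exports fine.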

No commits, but good to know. Thanks.