TDP conversion issue

Just to give you some history: I was able to insert the initial 400 rows without any issue; however, when I modified the WHERE clause to insert more, it started giving me errors in array DML. Is there a particular log file where I could take a look at the detailed error?

It is unclear which method you are using; I assume the Import Wizard. The wizard uses Oracle array processing, and when Oracle throws an error it does not report which row of data failed.

To debug, let’s do the following to narrow it down.

  1. When you enter the query for the SQL Server source data, order the data and limit it to 200 rows (see the sketch after this list).

  2. On the Oracle side, do you have a predefined table, or are you letting the wizard create a new one? If the wizard is creating the table, be sure to change the two character column sizes to VARCHAR2(4000); the Import Wizard sets column sizes based only on sample rows.

  3. If you still get the error on the 200 rows, can you execute the same SQL in the Editor, save the file as a Toad Editor File (*.tef), and send it to me? That will give me a better sample of your data to replicate the issue with.
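For steps 1 and 2, a minimal sketch might look like this (it borrows the Message source table and MSG target from the DDL posted later in this thread; treat the names as placeholders if your objects differ):

-- Step 1 (SQL Server side): order the source data and cap it at 200 rows.
SELECT TOP 200 *
FROM dbo.[Message]
ORDER BY [ID];

-- Step 2 (Oracle side): if the wizard created the target table from
-- sample rows, widen the character columns afterwards.
ALTER TABLE USRMBR.MSG MODIFY (SUBJECT VARCHAR2(4000));
ALTER TABLE USRMBR.MSG MODIFY (SMSMESSAGE VARCHAR2(4000));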

Yes, I am using the Import Wizard. I followed your plan of action and selected 198 rows from the SQL Server table to insert into an existing Oracle table, but I am still getting an error, though a different one. (I am attaching the SQL Server and Oracle table DDL statements below.)

SQL Server Table Definition:

CREATE TABLE [dbo].[Message](
    [ID] [bigint] IDENTITY(6,10) NOT FOR REPLICATION NOT NULL,
    [Subject] nvarchar NULL,
    [Body] [ntext] NULL,
    [SendTime] NOT NULL,
    [CreatedTime] NOT NULL,
    [CreatedByUserID] [int] NOT NULL,
    [LastUpdatedTime] NOT NULL,
    [LastUpdatedByUserID] [int] NOT NULL,
    [IsBroadcastMessage] [bit] NULL,
    [MessageTypeId] [bigint] NOT NULL,
    [SMSMessage] varchar NULL,
    PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

Oracle Table Definition:

CREATE TABLE USRMBR.MSG
(
    MSG_ID NUMBER(27) NOT NULL,
    SUBJECT VARCHAR2(4000),
    BODY CLOB,
    SENDTIME DATE,
    CREATEDTIME DATE,
    CREATEDBYUSERID NUMBER(27),
    LASTUPDATEDTIME DATE,
    LASTUPDATEDBYUSERID NUMBER(27),
    MESSAGETYPEID NUMBER(27),
    SMSMESSAGE VARCHAR2(4000)
)

If you look at the table definitions, all I am changing is the ntext datatype to CLOB in Oracle.

Error that I am receiving:

4:37:05 PM Thread (1) Connecting to :MIGRATEDB (SYS), SYS.
4:37:05 PM Thread (51) Import Started [1/28/2015 4:37:05 PM]
4:37:05 PM Thread (51) Processing “Embedded Query” into “USRMBR.MSG”
4:37:05 PM Thread (51) Heterogeneous Import 1/28/2015 4:37:05 PM
4:37:05 PM Thread (51) Importing 1 out of 1 files
4:37:05 PM Thread (51) Import embedded query
4:37:05 PM Thread (51) Change Schema to USRMBR
4:37:05 PM Thread (51) Inserting row 198 into table MSG
4:37:11 PM Thread (51) Import canceled
4:37:11 PM Thread (51) Failed importing 0 rows on step of
Failed inserting from row #0 to #198 without detailed row number info : Value length exceeds the parameter size
4:37:11 PM Thread (51) Import canceled and inserted rows were rolled back.

4:37:11 PM Thread (51)
Failed inserting from row #0 to #198 without detailed row number info : Value length exceeds the parameter size
4:37:11 PM Thread (51) Import Finished [1/28/2015 4:37:11 PM]
4:37:11 PM Thread (51) Import time: 00:00:05.7625762


Thanks,

Imran-



Try NCLOB instead of CLOB on the Oracle side and see if that makes a difference.
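For example (a sketch only: to my knowledge Oracle does not allow ALTER TABLE ... MODIFY to change a CLOB column to NCLOB in place, so the simplest route is to recreate the table, which costs nothing here since the import reloads the data anyway):

-- Recreate the target table with BODY as NCLOB instead of CLOB.
DROP TABLE USRMBR.MSG;

CREATE TABLE USRMBR.MSG
(
    MSG_ID NUMBER(27) NOT NULL,
    SUBJECT VARCHAR2(4000),
    BODY NCLOB, -- was CLOB
    SENDTIME DATE,
    CREATEDTIME DATE,
    CREATEDBYUSERID NUMBER(27),
    LASTUPDATEDTIME DATE,
    LASTUPDATEDBYUSERID NUMBER(27),
    MESSAGETYPEID NUMBER(27),
    SMSMESSAGE VARCHAR2(4000)
);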

Hi Greg, I just tried that and am getting the same error as above. (I wish I knew which column’s value length is exceeding the limit during the import.)

4:54:11 PM Thread (1) Connecting to :MIGRATEDB (SYS), SYS.
4:54:11 PM Thread (31) Import Started [1/28/2015 4:54:11 PM]
4:54:11 PM Thread (31) Processing “Embedded Query” into “USRMBR.MSG”
4:54:11 PM Thread (31) Heterogeneous Import 1/28/2015 4:54:11 PM
4:54:11 PM Thread (31) Importing 1 out of 1 files
4:54:11 PM Thread (31) Import embedded query
4:54:12 PM Thread (31) Change Schema to USRMBR
4:54:12 PM Thread (31) Inserting row 198 into table MSG
4:54:16 PM Thread (31) Import canceled
4:54:16 PM Thread (31) Failed importing 0 rows on step of
Failed inserting from row #0 to #198 without detailed row number info : Value length exceeds the parameter size
4:54:16 PM Thread (31) Import canceled and inserted rows were rolled back.
4:54:16 PM Thread (31)
Failed inserting from row #0 to #198 without detailed row number info : Value length exceeds the parameter size
4:54:16 PM Thread (31) Import Finished [1/28/2015 4:54:16 PM]
4:54:16 PM Thread (31) Import time: 00:00:05.7395739

Row 198 is where your problem is. Try running a SELECT with DATALENGTH(Body) to see what lengths you are getting. Note that the DATALENGTH function includes trailing spaces, unlike the LEN function.
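For example (column names taken from your DDL above; filter and order it the same way as your import query so the row positions line up):

-- Byte lengths of each candidate column. DATALENGTH counts bytes,
-- so ntext/nvarchar values report two bytes per character.
SELECT TOP 200
       [ID],
       DATALENGTH([Subject])    AS subject_bytes,
       DATALENGTH([Body])       AS body_bytes,
       DATALENGTH([SMSMessage]) AS sms_bytes
FROM dbo.[Message]
ORDER BY [ID];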

The max DATALENGTH of the Body column is 202400340 (about 193 MB) and the min is 12. Row #198 has a DATALENGTH of 408.

It’s not like I am completely dead in the water; I have had success migrating a few tables from SQL Server to Oracle. However, the errors below led me to open a service ticket to resolve this in a timely fashion. Is there a way we can drill down further into the error and find the root cause?

Error # 1:

4:54:16 PM Thread (31) Failed importing 0 rows on step of Failed inserting from row #0 to #198 without detailed row number info : Value length exceeds the parameter size

Error # 2:

7:18:34 PM Thread (52) Failed inserting from row #0 to #399 without detailed row number info : ORA-24381: error(s) in array DML Do you want to cancel the import?

Error # 3:

4:37:11 PM Thread (51) Failed inserting from row #0 to #198 without detailed row number info : Value length exceeds the parameter size

Error # 4:

9:14:30 PM Thread (36) Error importing data, please check file format options: Exception of type ‘System.OutOfMemoryException’ was thrown.

Thanks,

Well, if we are really trying to track down a data-driven error, we will need to switch to row-by-row inserts. Set up an ODBC connection or an Oracle Direct Connect and run the same import. That method does not use array processing, so it will error at the row level and give more precise information. Once we find out which data errors are occurring, we can probably go back to the faster method.
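As a quick pre-screen before switching methods, a query along these lines can flag rows whose character columns would overflow the Oracle targets. The thresholds are assumptions: they treat VARCHAR2(4000) as a 4000-byte limit, which the Oracle database character set and BYTE/CHAR length semantics can lower in practice:

-- Rows whose Subject or SMSMessage would not fit in VARCHAR2(4000).
SELECT [ID],
       DATALENGTH([Subject])    AS subject_bytes,
       DATALENGTH([SMSMessage]) AS sms_bytes
FROM dbo.[Message]
WHERE DATALENGTH([Subject]) > 8000     -- nvarchar: 2 bytes/char, so > 4000 characters
   OR DATALENGTH([SMSMessage]) > 4000; -- varchar: 1 byte/char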

The size of those large columns makes simple importing difficult. Is it crucial to have all of the data? Could you select just the first 4000 characters and insert them into a smaller column? It depends on what you need the data for. If you truly need to cross-import all of the CLOB values, you will need to go back to Data Compare and take much smaller chunks.
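A minimal sketch of that truncation, assuming the first 4000 characters are enough: CAST the ntext column to nvarchar(max) so ordinary string functions apply, then use the result as the import source query.

-- Source query that keeps only the leading 4000 characters of Body.
SELECT [ID],
       [Subject],
       LEFT(CAST([Body] AS nvarchar(max)), 4000) AS Body,
       [SMSMessage]
FROM dbo.[Message]
ORDER BY [ID];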

I am now chasing down a small data-comparison issue and getting the following error. (Seems like TDP doesn’t like me at all... :) )

The message is: Unable to cast object of type ‘Quest.Toad.Oracle.Proxy.OracleProxyConnection’ to type ‘Devart.Data.Oracle.OracleConnection’.

Any clue? I was just adding a WHERE clause on the target side (Oracle), and the error above appears when validating it. On the source side (SQL Server), validating the WHERE clause does not produce this type of error.

Thanks

Imran–

Are you using a 64-bit Oracle client? It looks like we are stumbling on that. Data Compare does not support the 64-bit Oracle client. As a quick workaround you could try Oracle Direct Connect.

Yes, we are using a 64-bit OS. Is there a way to work around the error below? I am only importing 400 records of data but it still fails. By the way, I have downloaded and installed the beta version of TDP 3.7.52:

9:14:30 PM Thread (36) Error importing data, please check file format options: Exception of type ‘System.OutOfMemoryException’ was thrown.

But are you using a 64-bit Oracle client? We are a 32-bit app and do support a 64-bit client, but Data Compare doesn’t play nicely with it. Can you check whether you are using the 64-bit client or the 32-bit one?

Yes, we are using the 64-bit client, and I was able to convert a few tables without any issue. There are just three more tables to convert where the datatype is CLOB. Any other suggestions? I appreciate it.

Let me review the data on the support case.