
Failing Data integration jobs

Jesse Bakker

Hi,

We have an import job that imports item types from a custom database table. We use the DynamicwebProvider as the source and the ItemProvider as the destination.

With small test data sets this works fine, but with the real data (46k records) it fails after about 30 minutes with the exception below.

Is there a way to fix this? See the attached job xml.

The solution is running on 9.7.4.

2020-01-27 09:56:46.847: Starting job - Import ReviewData 2 of 2.
2020-01-27 09:56:49.349: Starting reading item type Review.
2020-01-27 09:56:50.278: Finished reading item type Review.
2020-01-27 10:32:46.727: Starting import item type Review
2020-01-27 10:32:46.769: Job Failed with the following message: The operation is not valid for the state of the transaction.
at System.Transactions.TransactionState.EnlistPromotableSinglePhase(InternalTransaction tx, IPromotableSinglePhaseNotification promotableSinglePhaseNotification, Transaction atomicTransaction, Guid promoterType)    
at System.Transactions.Transaction.EnlistPromotableSinglePhase(IPromotableSinglePhaseNotification promotableSinglePhaseNotification, Guid promoterType)    
at System.Transactions.Transaction.EnlistPromotableSinglePhase(IPromotableSinglePhaseNotification promotableSinglePhaseNotification)    
at System.Data.SqlClient.SqlInternalConnection.EnlistNonNull(Transaction tx)    
at System.Data.SqlClient.SqlInternalConnection.Enlist(Transaction tx)    
at System.Data.ProviderBase.DbConnectionInternal.ActivateConnection(Transaction transaction)   
at System.Data.ProviderBase.DbConnectionPool.PrepareConnection(DbConnection owningObject, DbConnectionInternal obj, Transaction transaction)    
at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, UInt32 waitForMultipleObjectsTimeout, Boolean allowCreate, Boolean onlyOneCheckConnection, DbConnectionOptions userOptions, DbConnectionInternal& connection)    
at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal& connection)    
at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)    
at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)   
at System.Data.SqlClient.SqlConnection.TryOpenInner(TaskCompletionSource`1 retry)    
at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)   
at System.Data.SqlClient.SqlConnection.Open()    
at Dynamicweb.Data.DatabaseConnectionProvider.CreateConnection()    
at Dynamicweb.Content.Items.Queries.Repository.Update(IEnumerable`1 items, ItemContext context, Boolean synchronizePages)    
at Dynamicweb.DataIntegration.Providers.ItemDestinationWriter.UpdateItems(String itemType, List`1 items)    
at Dynamicweb.DataIntegration.Providers.ItemDestinationWriter.ImportItems()    
at Dynamicweb.DataIntegration.Providers.ItemProvider.ItemProvider.RunJob(Job job)
2020-01-27 10:33:04.696: Finished job - Import ReviewData 2 of 2.
2020-01-27 10:33:04.730: Batch failed.

 


Replies

 
Dmitriy Benyuk (Dynamicweb Employee)

Hi Jesse,
it looks like there is a problem with two transactions running at the same time on the same tables. Are you running just one data integration job at a time?
Do you have other scheduled tasks or scheduled data integration jobs that are importing data at the same time this data integration job is running?
Regards, Dmitrij

 
Jesse Bakker

Hi Dmitrij,

No, it is the only job running, and it is triggered manually without any scheduled tasks in place.

Also, this is on my local machine, so there is no other activity on the application or database at the same time.

 
Dmitriy Benyuk (Dynamicweb Employee)

Hi Jesse,
could you edit the ConnectionString property in the GlobalSettings file:
/GLOBALSETTINGS/SYSTEM/DATABASE/CONNECTIONSTRING
add the Enlist=false option, then restart the website and run the job again?
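For reference, Enlist=false tells SqlClient not to auto-enlist pooled connections in an ambient System.Transactions transaction, which is what the EnlistNonNull frame in the stack trace above is attempting. A minimal sketch of the resulting setting (element casing should follow your existing GlobalSettings file; server, database, and credentials are placeholders):

<Globalsettings>
  <System>
    <Database>
      <!-- Placeholder values; the relevant part is Enlist=false at the end -->
      <ConnectionString>Server=localhost;Database=MyDatabase;User Id=user;Password=secret;Enlist=false</ConnectionString>
    </Database>
  </System>
</Globalsettings>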
Regards, Dmitrij

 
Jesse Bakker

Thanks Dmitrij,

That solved the initial exception, but now it throws another exception:

2020-01-27 16:44:05.265: Starting job - Import ReviewData 2 of 2.
2020-01-27 16:44:06.697: Starting reading item type Review.
2020-01-27 16:44:07.488: Finished reading item type Review.
2020-01-27 17:23:43.581: Starting import item type Review
2020-01-27 17:24:02.810: Finished import item type Review
2020-01-27 17:24:12.066: Start update item usages
2020-01-27 17:24:12.066: Finish update item usages
2020-01-27 17:34:03.067: Job Failed with the following message: The transaction has aborted.
at System.Transactions.TransactionStateAborted.BeginCommit(InternalTransaction tx, Boolean asyncCommit, AsyncCallback asyncCallback, Object asyncState)    
at System.Transactions.CommittableTransaction.Commit()    
at System.Transactions.TransactionScope.InternalDispose()    
at System.Transactions.TransactionScope.Dispose()    
at Dynamicweb.DataIntegration.Providers.ItemProvider.ItemProvider.RunJob(Job job)
2020-01-27 17:34:13.130: Finished job - Import ReviewData 2 of 2.
2020-01-27 17:34:13.130: Batch failed.
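Looking at the timestamps, the commit aborts roughly ten minutes after the import step finishes (17:24 -> 17:34), which happens to match the default System.Transactions maximum timeout: maxTimeout defaults to 00:10:00 and can only be raised machine-wide in machine.config. If that cap is what aborts the commit, raising it would look roughly like the snippet below; this is an assumption, not a confirmed fix.

<!-- machine.config, inside <configuration>; 00:10:00 is the default -->
<system.transactions>
  <machineSettings maxTimeout="01:00:00" />
</system.transactions>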
 
Dmitriy Benyuk (Dynamicweb Employee)

Hi Jesse,
can you try to update the ItemProvider.dll in the website bin folder with the one attached in the zipped folder and check if that works?
Regards, Dmitrij

 
Jesse Bakker

Thanks Dmitrij,

I tried it again with the new DLL, but unfortunately it didn't solve the problem. See the logs below:

2020-01-28 09:37:00.963: Starting job - Import ReviewData 2 of 2.
2020-01-28 09:37:02.269: Starting reading item type Review.
2020-01-28 09:37:03.218: Finished reading item type Review.
2020-01-28 10:24:20.845: Starting import item type Review
2020-01-28 10:25:13.685: Finished import item type Review
2020-01-28 10:25:22.641: Start update item usages
2020-01-28 10:25:22.641: Finish update item usages
2020-01-28 10:35:25.870: Job Failed with the following message: The transaction has aborted.
at System.Transactions.TransactionStateAborted.BeginCommit(InternalTransaction tx, Boolean asyncCommit, AsyncCallback asyncCallback, Object asyncState)    
at System.Transactions.CommittableTransaction.Commit()    
at System.Transactions.TransactionScope.InternalDispose()    
at System.Transactions.TransactionScope.Dispose()    
at Dynamicweb.DataIntegration.Providers.ItemProvider.ItemProvider.RunJob(Job job)
2020-01-28 10:35:53.772: Finished job - Import ReviewData 2 of 2.
2020-01-28 10:35:53.805: Batch failed.
 
Dmitriy Benyuk (Dynamicweb Employee)

Hi Jesse,
I cannot get it reproduced. Does this happen only for this specific item type import? Does it only happen when there are thousands of item records to import?
Regards, Dmitrij

 
Jesse Bakker

Hi Dmitrij,

I tried the import with a smaller set (7,000 records); that worked fine again. Then I tried a somewhat bigger set (10,000 records); that one failed again with the same log as in my previous post.

We have a different job in the same solution that works fine. It has a similar construction but far fewer records, only around 900.

 
Nicolai Pedersen

Hi Jesse

The provider can probably not handle that many records in one go. You might need to cut the import file into smaller pieces and run two or more jobs...

BR Nicolai

 
Jesse Bakker

Hi Nicolai,

If it were a file, we could split it into chunks. But unfortunately the source is a database table that is read by the DynamicwebProvider.

So I'm not sure how we could run this as multiple jobs. Any suggestions?

 
Dmitriy Benyuk (Dynamicweb Employee)

Hi Jesse,
you could consider exporting the data from your database table to an XML file using a Dynamicweb -> XML provider job, then splitting that file and setting up an XML -> Dynamicweb provider job to import the data; a sketch of the splitting step follows below.
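For what it's worth, here is a minimal C# sketch of the splitting step. It assumes a flat export layout of <items><item>...</item></items>; the element names, file names, and chunk size are placeholders, since the actual Dynamicweb XML export schema may differ:

using System;
using System.Collections.Generic;
using System.Xml;
using System.Xml.Linq;

class SplitExport
{
    const int ChunkSize = 5000; // records per output file (placeholder)

    static void Main()
    {
        var chunk = new List<XElement>();
        int fileIndex = 0;

        using var reader = XmlReader.Create("ReviewExport.xml");
        reader.MoveToContent();
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
            {
                // XNode.ReadFrom consumes the element and advances the reader,
                // so no extra Read() is needed in this branch.
                chunk.Add((XElement)XNode.ReadFrom(reader));
                if (chunk.Count == ChunkSize)
                    WriteChunk(chunk, ++fileIndex);
            }
            else
            {
                reader.Read();
            }
        }
        if (chunk.Count > 0)
            WriteChunk(chunk, ++fileIndex);
    }

    static void WriteChunk(List<XElement> chunk, int index)
    {
        // Wrap each chunk in its own root element so every output file
        // is a well-formed document the import job can consume.
        new XDocument(new XElement("items", chunk)).Save($"ReviewExport_{index}.xml");
        chunk.Clear();
    }
}

Each output file can then be fed to its own import job, or to one job run once per file by the scheduled task.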
Regards, Dmitrij

 
Jesse Bakker

Hi Dmitrij,

This is a job that needs to run automatically on a regular basis (each night), so splitting it up into multiple jobs with multiple files is not a workable solution for us.

Do you maybe have another option we can try?

 
Dmitriy Benyuk (Dynamicweb Employee)

Hi Jesse,
You can also try to split your export job into several jobs, each exporting the items with its own filtering conditions, for example as sketched below.
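For example, assuming the source table has a numeric key column such as ReviewId (a placeholder name), the conditions could partition the table into ranges: one job with ReviewId <= 15000, a second with ReviewId > 15000 AND ReviewId <= 30000, and a third with ReviewId > 30000, so that each job stays below the record count at which the transaction aborts.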
Regards, Dmitrij

 
