Is there any reason for the class Dynamicweb.Data.Providers.DynamicwebBulkInsertDestinationWriter to be internal instead of public?
If possible, it would be nice to reuse some of the functionality in a custom destination provider.
What do you need to accomplish? You could create a custom provider that inherits from DynamicwebProvider, which makes use of DynamicwebBulkInsertDestinationWrite internally. However, I'm not really sure why you would need to. What is it that you need to do that you cannot accomplish by using the default providers?
/Morten
Hi Morten,
Thanks for your reply.
My goal is to create a custom provider that inherits from DynamicwebProvider. That custom provider has some extra parameters controlling which rows to delete on the target table. To do that, I need to override the RunJob method. But most of the functionality in that method is what I want to reuse in my new RunJob method, and that is not possible because the DynamicwebProvider uses that internal class.
The following code shows the original DynamicwebProvider RunJob method and where I need to put my custom code:
public override bool RunJob(Job job, string logFile)
{
    OrderTablesByConstraints(job, Connection);
    var writers = new List<DynamicwebBulkInsertDestinationWriter>();
    SqlTransaction sqlTransaction = null;
    try
    {
        SetupLogging(logFile);
        if (Connection.State != ConnectionState.Open)
            Connection.Open();
        foreach (Mapping mapping in job.Mappings)
        {
            if (mapping.Active)
            {
                Logger.Log("Starting import to temporary table for " + mapping.DestinationTable.Name + ".");
                using (var reader = job.Source.GetReader(mapping))
                {
                    var writer = new DynamicwebBulkInsertDestinationWriter(mapping, Connection, DeactivateMissingProducts, RemoveMissingAfterImport, Logger);
                    while (!reader.IsDone())
                        writer.Write(reader.GetNext());
                    writer.FinishWriting();
                    writers.Add(writer);
                }
                Logger.Log("Finished import to temporary table for " + mapping.DestinationTable.Name + ".");
            }
        }
        sqlTransaction = Connection.BeginTransaction();
        if (DeactivateMissingProducts)
            HandleProductsWriter(writers);
        foreach (DynamicwebBulkInsertDestinationWriter writer in writers)
        {
            writer.MoveDataToMainTable(sqlTransaction);
            writer.DeleteExcessFromMainTable(Shop, sqlTransaction);
        }
        sqlTransaction.Commit();
        ABasePovider.KillAll();
        if (UpdateSearchIndexAfterImport)
        {
            Logger.Log("Starting index update.");
            IndexManager.Current.UpdateIndex("Products", true);
            Logger.Log("Index update completed.");
        }
    }
    catch (Exception ex)
    {
        Logger.Log("Import job failed: " + ex.Message);
        if (sqlTransaction != null)
            sqlTransaction.Rollback();
        return false;
    }
    finally
    {
        foreach (var writer in writers)
        {
            writer.Close();
        }
        job.Source.Close();
        Connection.Dispose();
    }
    return true;
}
This code shows where I need to make the changes:

foreach (DynamicwebBulkInsertDestinationWriter writer in writers)
{
    writer.MoveDataToMainTable(sqlTransaction);
    // Not used in my provider
    //writer.DeleteExcessFromMainTable(Shop, sqlTransaction);
    // My custom code for deleting specific rows
    deleteTableRows(writer, sqlTransaction, newParameter);
}
Do you need to delete previously imported rows that are no longer in the source, but keep anything that has been added manually or through other imports? I discussed that scenario with Morten Snedker some time ago, but he didn't seem very keen on adding this functionality to the standard provider.
If what I described is what you are trying to do then I don't think you need access to the writer or transaction at all. Standard deletion of excess rows will only be executed if "delete missing rows" has been enabled under configuration settings, so you can leave that unchecked and implement your own logic for deletion.
You can implement a custom provider that inherits from DynamicwebProvider, override RunJob, call RunJob on the base class, and then delete excess rows after a successful import.
public override bool RunJob(Job job, string logFile)
{
    var importSuccess = base.RunJob(job, logFile);
    var success = false;
    if (importSuccess)
    {
        foreach (var mapping in job.Mappings)
        {
            var sourceTable = mapping.SourceTable;
            var destinationTable = mapping.DestinationTable;
            //TODO: Delete excess rows.
        }
        success = true;
    }
    return success;
}
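As a sketch of what that TODO might look like: assuming you can identify "excess" rows by some column (the `ImportBatchId` column and the `DeleteExcessRows` helper below are hypothetical, not part of the Dynamicweb API; adjust the predicate to whatever marks rows as stale in your schema), a parameterized DELETE via plain ADO.NET could do the deletion after the base import succeeds:

```csharp
using System.Data.SqlClient;

// Hypothetical helper: removes rows in the destination table that do not
// belong to the current import batch. ImportBatchId is an assumed column;
// replace the WHERE clause with whatever identifies your excess rows.
private void DeleteExcessRows(SqlConnection connection, string destinationTable, int importBatchId)
{
    // The table name cannot be a SQL parameter, so it must come from a
    // trusted/whitelisted source (e.g. mapping.DestinationTable.Name).
    var sql = "DELETE FROM [" + destinationTable + "] WHERE ImportBatchId <> @batchId";
    using (var command = new SqlCommand(sql, connection))
    {
        command.Parameters.AddWithValue("@batchId", importBatchId);
        command.ExecuteNonQuery();
    }
}
```

Calling a helper like this from the foreach loop in the override above keeps the standard "delete missing rows" setting unchecked while still cleaning up rows your own import owns.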