Developer forum

Getting value for Job.LastSuccessfulRun

Imar Spaanjaars Dynamicweb Employee

Hi there,

I looked at the implementation of Job.LastSuccessfulRun and found this code (decompiled):

    public DateTime? LastSuccessfulRun
    {
      get
      {
        if (Job._logFiles.Value.Any())
        {
          var dataIntegrationLogFilePattern = new Regex(Regex.Escape(Task.MakeSafeFileName(this.Name)) + "\\d{4}\\d{2}\\d{2}-\\d{2}\\d{2}\\d{2}\\d+.log");
          foreach (var fileInfo in Job._logFiles.Value
            .Where(file => dataIntegrationLogFilePattern.IsMatch(file.Name))
            .OrderByDescending(file => file.CreationTime))
          {
            if (File.ReadLines(fileInfo.FullName).Last().EndsWith("Batch succeeded.", StringComparison.OrdinalIgnoreCase))
              return fileInfo.CreationTime;
          }
        }
        return null;
      }
    }

This code loops through the log files, newest first, and returns the creation time of the first one whose last line ends with "Batch succeeded." I can see these log files when I run the job manually. However, when the job runs from a scheduled task, the log file is different from what I see then. It ends like this:

2022-03-15 06:06:46.415: Update products information finished.
2022-03-15 06:06:51.075: Job succeeded.

whereas the one from running the job directly looks like this:

2022-02-19 10:05:59.206: Update products information finished.
2022-02-19 10:06:06.659: Job succeeded.
2022-02-19 10:13:00.917: Finished job - 0. Import from NAV - EcomProvider.
2022-02-19 10:13:00.933: Batch succeeded.
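So, unless I'm misreading the decompiled getter, the success check boils down to something like this (the helper and its name are mine, just to illustrate):

```csharp
using System;
using System.Linq;

public static class JobLogInspector
{
    // Mirrors the check in the decompiled getter: a log only counts as a
    // successful run when its LAST line ends with "Batch succeeded."
    // A scheduled-task log that ends in "Job succeeded." does not match.
    public static bool IsCountedAsSuccessful(string[] logLines)
    {
        if (logLines == null || logLines.Length == 0)
            return false;

        return logLines.Last().EndsWith("Batch succeeded.", StringComparison.OrdinalIgnoreCase);
    }
}
```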

Is that a correct analysis? And if so, would it mean that LastSuccessfulRun is unreliable, as it effectively means "last successful manual run" rather than "last successful run from anywhere, including a scheduled task"?

I am trying to find out when the job last ran in order to optimize the import.

Thanks,

Imar


Replies

 
Matthias Sebastian Sort Dynamicweb Employee

Hi Imar,

You are correct.

That property is only used by the EndpointProvider to do delta-replication, which is why it looks for "Batch succeeded"; we only have batch imports of OData for now.

BR

Matthias Sort

 
Imar Spaanjaars Dynamicweb Employee

Not entirely sure I get that. In my case it *is* a batch import too, just run from a scheduled task instead of manually. What am I missing?

 
Matthias Sebastian Sort Dynamicweb Employee

Scheduled tasks are a wrapper around the job/batch imports, and therefore they skip the DW UI step that adds the "Batch succeeded" line (that happens inside Dynamicweb.DataIntegration.Integration.AddToQueue).

 
Imar Spaanjaars Dynamicweb Employee

Exactly. Which is kind of weird considering the property is called Job.LastSuccessfulRun, no? That name would imply the date and time the job last ran successfully, irrespective of the context it ran in.

Is there another way to find out when a job last ran? I need to know it within a TableScript and currently use mapping.Job.LastSuccessfulRun inside ProcessInputRow.

Thanks!

Imar

 
Matthias Sebastian Sort Dynamicweb Employee

That is true. :)

We have noted it down as a feature request.

But what is the scenario you want to solve? We also have Job.LastRun, which returns the LastWriteTime of the latest log file.

BR

Matthias Sort

 
Imar Spaanjaars Dynamicweb Employee

I have a table script that looks at the last successful run date and time of the job. For each row, it compares that to an incoming date and time called ErpUpdateDate. When the last run date is greater than the incoming ErpUpdateDate, we have already imported that row, so we can skip it from the import using a filter. We need this to reduce the number of rows imported: we have notification subscribers running against ProductUpdated that synchronize data to an external system, and ProductUpdated fires regardless of whether the product actually changed, so every product is considered dirty on every run. By filtering out unchanged rows at the job level, we greatly reduce the number of products affected.
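For reference, the skip decision in that table script is essentially this (ShouldSkip is my own helper name; the surrounding TableScript/ProcessInputRow plumbing is left out):

```csharp
using System;

public static class ImportFilter
{
    // A row was already imported when its ErpUpdateDate is not newer than the
    // job's last successful run; in that case we can filter it out.
    public static bool ShouldSkip(DateTime? lastSuccessfulRun, DateTime erpUpdateDate)
    {
        if (!lastSuccessfulRun.HasValue)
            return false; // never ran successfully yet: import everything

        return erpUpdateDate <= lastSuccessfulRun.Value;
    }
}
```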

LastRun is nice, but it doesn't take into account whether the job succeeded.

I may just write my own log file with "Batch succeeded" after the job runs for the time being ;-)
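Something along these lines, I suppose (all names and the marker-file path are made up; the point is just to record a timestamp only on runs we know succeeded, and read it back later):

```csharp
using System;
using System.Globalization;
using System.IO;
using System.Linq;

public static class SuccessMarker
{
    private const string TimestampFormat = "yyyy-MM-dd HH:mm:ss.fff";

    // Append our own "Batch succeeded." line after a run we know succeeded.
    public static void Record(string markerPath)
    {
        File.AppendAllText(markerPath,
            $"{DateTime.Now.ToString(TimestampFormat)}: Batch succeeded.{Environment.NewLine}");
    }

    // Read the most recent recorded timestamp back; null when nothing was recorded yet.
    public static DateTime? LastRecorded(string markerPath)
    {
        if (!File.Exists(markerPath))
            return null;

        var line = File.ReadLines(markerPath)
            .LastOrDefault(l => l.EndsWith("Batch succeeded.", StringComparison.OrdinalIgnoreCase));
        if (line == null)
            return null;

        // The timestamp is everything before the first ": " (colon-space);
        // the colons inside the time itself are not followed by a space.
        var stamp = line.Substring(0, line.IndexOf(": ", StringComparison.Ordinal));
        return DateTime.ParseExact(stamp, TimestampFormat, CultureInfo.InvariantCulture);
    }
}
```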

Thanks!

Imar

 
