Dynamicweb Setup

The Dynamicweb Integration Framework is a collection of components for transferring data and maintaining data consistency between a Dynamicweb solution and a remote system.

The integration data-flow is as follows:

  • The Dynamicweb solution makes a request for data in XML format
  • The request is relayed to the remote system by the Dynamicweb Connector service
  • The plugin/code unit reacts by extracting data from the remote system
  • The data is wrapped in an XML format understood by Dynamicweb (or - for passive plugins - returned as is, and transformed using XSLT on the Dynamicweb side)
  • The XML is returned via the Dynamicweb Connector service to the Dynamicweb solution as a response

This article covers the main activities which happen on the Dynamicweb side of an integration project:

  • Creating data integration jobs for placing data in Dynamicweb & doing the initial data import
  • Creating batch integration activities which request updated data on a schedule
  • (Optional) Installing and configuring the live integration add-in

We also present a set of recommended configurations for each feature in Integration Framework v2, which should help you get off to a good start.

Before starting work on the Dynamicweb side of an integration project, you must mock, retrieve, or receive an XML file with data per feature included in the integration. We recommend you retrieve the data using the Dynamicweb Connector TestTool.

A data integration job is a set of instructions for taking data and placing it correctly in the Dynamicweb database. It has two components:

  • A source provider which matches the data source – usually an XML file
  • A destination provider matching the data destination

You can read more about data integration jobs and providers in the Data Integration Module documentation.

In principle, features which share a source provider and a destination provider can use the same integration job and batch integration activity.

In reality, separate jobs are used per feature, as this allows you to schedule jobs at an interval which matches the feature and the solution. For instance, currencies are typically not updated very often, whereas prices and stock levels are updated frequently.

When starting an integration project, you likely won’t have any ‘real’ data in the system to begin with. So, for each feature you must do an initial data import to populate the database with data from the remote system – we recommend using the Dynamicweb Connector TestTool to produce XML files with the correct data.
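For reference, the XML files consumed by the XML Provider follow a simple tables/table/item/column structure, similar to the shipping XML example near the end of this article. Below is a minimal, hand-made sketch of what a Currencies.xml file might look like under that assumption; CurrencyCode and CurrencyLanguageId are the primary keys used by the currencies job described later, while CurrencyName and CurrencyRate are hypothetical columns included only for illustration – the actual column set depends on what your end point returns.

  <?xml version="1.0" encoding="utf-8"?>
  <tables>
    <table tableName="EcomCurrencies">
      <item table="EcomCurrencies">
        <column columnName="CurrencyCode"><![CDATA[EUR]]></column>
        <column columnName="CurrencyLanguageId"><![CDATA[LANG1]]></column>
        <!-- CurrencyName and CurrencyRate are hypothetical columns for illustration -->
        <column columnName="CurrencyName"><![CDATA[Euro]]></column>
        <column columnName="CurrencyRate"><![CDATA[100.00]]></column>
      </item>
    </table>
  </tables>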

Once you have retrieved or received that data, do the following:

  • Upload the appropriate XML file to the /Files/System/Integration folder on the solution
  • Create a new integration job matching the feature
    • Select the XML Provider as source provider
    • Select the appropriate XML file as the source file
    • Click Next and select the appropriate destination provider
    • Check any relevant checkboxes – see the Integration Provider articles for details
    • Click Next
    • Select which source tables to import
    • Map them to an appropriate target table (this usually happens automatically)
    • Click Next
    • Name the job
  • Review the column mappings for each table in the job
  • Save
  • Run the job

For each job, review the results thoroughly. Any custom data on the remote side must be handled on the Dynamicweb side as well, and placed in either a standard field or a custom field of the appropriate data type.

Remember, you have access to a number of useful standard features such as context-sensitive values, conditionals, scripting, etc. 

A batch integration activity is a scheduled task which does three things when executed:

  • Connects to a remote system – directly or via the DynamicwebConnector service – and submits requests for data using a scheduled task add-in (a sample request is shown after this list)
  • Receives a response in XML from the remote system and saves it to disk
  • Triggers an integration job, which processes the XML file and places the data in the Dynamicweb database
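As an example of the request submitted in the first step, here is the request used for the currencies feature later in this article, formatted with whitespace for readability; each feature simply substitutes its own table element (Languages, Manufacturers, Customers, and so on):

  <GetEcomData>
    <tables>
      <Currencies type="all"/>
    </tables>
  </GetEcomData>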

To create a batch integration activity:

  • Go to Settings > Integration > Integration Framework Batch
  • Click Add in the toolbar and fill in the general information (Figure 3.1)
Figure 3.1 Creating a new batch integration activity

You must:

  • Name the task
  • Select a begin and end time
  • Specify an interval – once, hourly, weekly, etc.
  • Use the Type dropdown to select and configure an appropriate scheduled task add-in

A scheduled task add-in connects to a remote system and submits a request for data whenever the scheduled task is run. Each add-in comes with a set of parameters which are required by the remote system and/or Dynamicweb.

Out of the box, we supply the following scheduled task add-ins for batch integration tasks:

Name | Use with | Notes
Active Directory Import data add-in | Microsoft Active Directory | Used to import users and validate user credentials
Export data add-in | Any remote system | Used to e.g. export order data
Import data add-in | Any remote system | Basic add-in for users & data (products, prices). Used with Integration Framework v.1
Import Data from Multiple Requests Add-in | Any remote system | Used to request data from a ConnectorService Add-in with multiple end-points. For custom integrations.
Import data with custom request add-in | Any remote system | Used to submit custom requests against e.g. a modified remote system plugin. Otherwise very basic.
Import data with paging add-in | Any remote system | Used to connect to remote systems with a plugin that can react to paging requests, reimport requests, etc. This is true for our AX2012R2 and D365 plugins.
Import Perfion Images Add-in | Perfion | Retrieves image data from a Perfion solution by querying the Perfion API Service
Sync queued orders using Live Integration | Any remote system with Live Integration and queued orders enabled | Used to sync queued Live Integration orders with the remote system, for scenarios where the connection has been unavailable for a while

In general, system-specific add-ins connect to remote systems which are passive, and where the returned data must be transformed on the Dynamicweb side of the integration.

For the Integration Framework v2 code units, we recommend that you use the Import data with paging add-in as it has been specifically developed for this purpose.

 

This section provides you with an overall description of each feature in Integration Framework v2, including the typical:

  • Purpose & use case
  • Integration job settings
  • Batch integration task settings

As such, this can be seen as a hands-on guide to implementing the various integration features.

With the Import Currencies Job and Task you can bring over currencies from your remote system to Dynamicweb.

Since currencies usually don’t change much during the lifetime of a project, you typically run this job at the beginning of the project to bring over the initial currencies and then don’t schedule it to run frequently. If, however, the exchange rates are managed in your remote system, you can schedule the task to run as often as required to get up to date rates in Dynamicweb.

Typical integration job:

Setting | Value | Comments
Name | Import Currencies |
Source provider | XML provider |
Source | Single file, Currencies.xml |
Destination provider | Dynamicweb provider | No additional settings required on the destination provider
Index update? | No |
Table mapping | EcomCurrencies |
Primary keys | CurrencyCode, CurrencyLanguageId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Currencies |
Recommended task type | Import data with paging add-in | You can use other types as well, as paging isn’t needed for small data sets such as currencies.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><Currencies type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Currencies |
Reimport data | Not used | The current end points always return all currency records in each request.
Set checkpoint to Now | Not used | The current setup doesn’t track deltas for currencies.
Custom modifier | Default |
Enable paging | No |
Page size (default 1000) | Not used |
Timeout between paging requests in seconds | Not used | Requires paging
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | None | Not needed when not paging
Enable batch certification | No | Not needed when not paging

With the Import Languages Job and Task you can bring over languages from your remote system to Dynamicweb. You don’t have to do this, and you can easily create the languages directly in Dynamicweb. However, if you have many languages in your remote system, or they have specific IDs and settings you want to match in Dynamicweb, you can use the Languages end point and its associated job and task to import the data to Dynamicweb.

Since languages typically don’t change much during the lifetime of a project, you usually run this job at the beginning of the project to bring over the initial languages, and then don’t schedule it to run frequently. And if you only have one or two languages, you can skip this whole end point and create the languages directly in the backend.

Typical integration job:

Setting | Value | Comments
Name | Import Languages |
Source provider | XML provider |
Source | Single file, Languages.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | No |
Table mapping | EcomLanguages |
Primary keys | CurrencyCode, LanguageId |

Typical scheduled task settings:

Setting | Value | Comments
Name | Import Languages |
Recommended task type | Import data with paging add-in | You can use other types as well, as paging isn’t needed for a small data set such as languages.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><Languages type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Languages |
Reimport data | Not used | The current end points always return all language records in each request.
Set checkpoint to Now | Not used | The current setup doesn’t track deltas for languages.
Custom modifier | Default |
Enable paging | No |
Page size (default 1000) | Not used |
Timeout between paging requests in seconds | Not used | Requires paging
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | None | Not needed when not paging
Enable batch certification | No | Not needed when not paging

With the Import Units Job and Task you can bring over packaging units from your remote system to Dynamicweb.

The available units and groups usually don’t change much during the lifetime of a project so you typically run this job at the beginning of the project to bring over the initial units, and then don’t schedule it to run frequently. If, however, the units are managed in your remote system and change frequently, you can schedule the task to run as often as required to get up to date units in Dynamicweb.

Typical integration job:

Setting | Value | Comments
Name | Import Units |
Source provider | XML provider |
Source | Single file, Units.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | No |
Table mapping | EcomVariantOptions, EcomVariantGroups |
Primary keys | EcomVariantOptions: VariantOptionId, VariantOptionLanguageId. EcomVariantGroups: VariantGroupId, VariantGroupLanguageId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Units |
Recommended task type | Import data with paging add-in | You can use other types as well, as paging isn’t supported for units anyway.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><Units type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Units |
Reimport data | Not used | The current end points always return all unit records in each request.
Set checkpoint to Now | Not used | The current setup doesn’t track deltas for units.
Custom modifier | Default |
Enable paging | No |
Page size (default 1000) | Not used |
Timeout between paging requests in seconds | Not used | Requires paging
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | None | Not needed when not paging
Enable batch certification | No | Not needed when not paging

With the Import Manufacturers Job and Task you can bring over manufacturers from your remote system to Dynamicweb. Products can then be associated with these manufacturers, which enables you to show manufacturer data (such as a name, web site and phone number) along with the product.

In many systems, manufacturers change from time to time, but not very often. Synchronizing once a day will be sufficient in most cases; increase the frequency if you need more up-to-date manufacturer data in Dynamicweb (at the expense of increased pressure on your remote system).

Typical integration job:

Setting | Value | Comments
Name | Import Manufacturers |
Source provider | XML provider |
Source | Single file, Manufacturers.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | No |
Table mapping | EcomManufacturers |
Primary keys | ManufacturerId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Manufacturers |
Recommended task type | Import data with paging add-in | For a smaller number of manufacturers (say, fewer than 1,000 or so) you can choose another task type and skip paging. For larger sets, this task type is recommended as it supports paging.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><Manufacturers type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Manufacturers |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only changed manufacturers are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | Default (1000) |
Timeout between paging requests in seconds | 5–10 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

With the Import Customers Job and Task you can bring over customers from your remote system to Dynamicweb. Imported customers can then place orders. Impersonation is currently not implemented, which means that a Customer / Company is the same as a User / Login. This will change in future versions of the Integration Framework.

In many systems, customers change on a regular basis. New ones are added, old ones archived, shipping addresses change, and so on. Therefore, you want to create a scheduled task that imports customers on a regular basis. How often you need to run the job depends on the actual change frequency of your data. Typically, once an hour or several times a day is sufficient, but you can schedule the job to run more often if needed, at the expense of increased pressure on your remote system.

Typical integration job:

Setting | Value | Comments
Name | Import Customers |
Source provider | XML provider |
Source | Single file, Customers.xml |
Destination provider | User provider | Turn on “Generate passwords for users” if you want users to be able to log in – you can then use the Email configuration section to send users a password reset link. To distinguish customers from other users in the system, it’s recommended to create a specific user group and assign that group as the Destination group on the User Provider. Consider also whether “Use email for username” should be enabled.
Table mapping | AccessUser, AccessUserAddress |
Primary keys | AccessUser: AccessUserExternalId. AccessUserAddress: AccessUserAddressUId, AccessUserAddressUserId | By combining AccessUserAddressUId and AccessUserAddressUserId you can have non-unique address records (like a sequential number from 1 to N) for each customer – see the sketch after this table.
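A rough sketch of how a single customer with one address might appear in the imported XML is shown below, assuming the tables/table/item/column structure used elsewhere in the framework. Only the key columns come from the table above; AccessUserName, AccessUserAddressCity, and the way the address row references its customer are illustrative assumptions – check the actual output of your Customers end point.

  <tables>
    <table tableName="AccessUser">
      <item table="AccessUser">
        <column columnName="AccessUserExternalId"><![CDATA[CUST-1001]]></column>
        <!-- hypothetical example column -->
        <column columnName="AccessUserName"><![CDATA[Acme Inc.]]></column>
      </item>
    </table>
    <table tableName="AccessUserAddress">
      <item table="AccessUserAddress">
        <!-- address rows are keyed per customer: a sequential AccessUserAddressUId per AccessUserAddressUserId -->
        <column columnName="AccessUserAddressUserId"><![CDATA[CUST-1001]]></column>
        <column columnName="AccessUserAddressUId"><![CDATA[1]]></column>
        <!-- hypothetical example column -->
        <column columnName="AccessUserAddressCity"><![CDATA[Copenhagen]]></column>
      </item>
    </table>
  </tables>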

Typical scheduled task:

Setting | Value | Comments
Name | Import Customers |
Recommended task type | Import data with paging add-in | Especially with a large customer base it’s important to select this task type, as it supports paging, meaning you get smaller chunks of data at a time.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><Customers type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Customers |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only changed customers are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | 250–1000 | Depends on the number of addresses your customers have on average. With a few addresses per customer, 1000 works well; with tens of addresses or more, choose a lower number.
Timeout between paging requests in seconds | 5–10 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

With the Import Product Groups Job and Task you can bring over product groups from your remote system to Dynamicweb. Imported groups are used to categorize your products in the backend and frontend (although the two don’t need to match).

It’s pretty common for product hierarchies to change in the remote system. Therefore, you should schedule this task to run on a frequent basis. For most systems, once an hour might be enough. You can decrease or increase this frequency, but be aware of the overhead you cause to the remote system if you run jobs too often.

Typical integration job:

Setting | Value | Comments
Name | Import Product Groups |
Source provider | XML provider |
Source | Single file, ProductGroups.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | No |
Table mapping | EcomGroups |
Primary keys | GroupId, GroupLanguageId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Product Groups |
Recommended task type | Import data with paging add-in |
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><ProductGroups type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Product Groups |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only changed product groups are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | Default (1000) | A good default for most systems, but you can increase or decrease this based on your requirements.
Timeout between paging requests in seconds | 5–10 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

With the Import Products Job and Task you can bring over ecommerce products from your remote system to Dynamicweb. These products can then be assigned to the product groups as imported using the Import Product Groups job and task, as well as to other groups that have not been imported (to create a marketing focused product hierarchy for example).

It’s pretty common for the products to change very frequently in the remote system. Therefore, you should schedule this task on a frequent basis. For most systems, once an hour might be enough. You can decrease or increase this frequency but be aware of the overhead you cause to the remote system if you run jobs too often. If highly up-to-date data is required you could go as far as to schedule this every five minutes. Be sure you have turned paging and delta tracking on (also on the end point) so you only process the changes, not the entire set.

Depending on how often you run this task, you can also enable rebuilding the index on the associated data job. If you run this task often, you must use a partial update build to avoid constantly rebuilding the index. If you’re only importing a few times a day and you have a small data set, you could choose to do a full rebuild of the index.

As an alternative to rebuilding the index as part of the integration job, you can also schedule the index build separately. This improves availability of the index (and therefore of the entire web site) at the expense of accuracy of your data.

Typical integration job:

Setting | Value | Comments
Name | Import Products |
Source provider | XML provider |
Source | Single file, Products.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | Yes | See discussion in the introduction to this job
Table mapping | EcomProducts, EcomProductCategoryFieldValue | Note: if you have custom fields you want to bring over, create them in Dynamicweb and make them available in the Products end point in the remote system. Then update the mapping to include these fields (see the sketch after this table).
Primary keys | EcomProducts: ProductId, ProductVariantId, ProductLanguageId. EcomProductCategoryFieldValue: FieldValueFieldCategoryId, FieldValueFieldId, FieldValueProductId, FieldValueProductLanguageId, FieldValueProductVariantId |
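As a sketch of the custom-field note in the table above: once a custom field has been created in Dynamicweb, exposed by the Products end point, and added to the column mappings, it simply travels as an extra column on each EcomProducts item. The ProductBrandCode column below is a hypothetical example – use the system name of the field you actually created:

  <table tableName="EcomProducts">
    <item table="EcomProducts">
      <column columnName="ProductId"><![CDATA[PROD-100]]></column>
      <column columnName="ProductVariantId"><![CDATA[]]></column>
      <column columnName="ProductLanguageId"><![CDATA[LANG1]]></column>
      <!-- hypothetical custom field column -->
      <column columnName="ProductBrandCode"><![CDATA[ACME]]></column>
    </item>
  </table>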

 

Typical scheduled task:

Setting | Value | Comments
Name | Import Products |
Recommended task type | Import data with paging add-in | For small product sets, you could also use another task type. However, for most systems it’s recommended to use the Import data with paging add-in because of its paging capabilities.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><Product type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Products |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only changed products are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | Default (1000) | A good default for most systems, but you can increase or decrease this based on your requirements. If you have a large number of custom product fields or product category fields you should lower this number to decrease pressure on the remote system and keep the size of the transferred XML down.
Timeout between paging requests in seconds | 5–10 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

With the Import Related Products Job and Task you can bring over related products from your remote system to Dynamicweb.

Related products tend to change very frequently in the remote system, so you usually want to schedule this task on a frequent basis. For most systems, once an hour might be enough. You can decrease or increase this frequency but be aware of the overhead you cause to the remote system if you run jobs too often. If highly up-to-date data is required you could go as far as to schedule this every five minutes. Be sure you have turned paging and delta tracking on (also on the end point) so you only process the changes, not the entire set.

For a discussion on index rebuilds and when to use them, see the Products feature.

Typical integration job:

Setting | Value | Comments
Name | Import Related Products |
Source provider | XML provider |
Source | Single file, RelatedProducts.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | Yes | See the discussion in the Products feature
Table mapping | EcomProductsRelated |
Primary keys | ProductRelatedGroupId, ProductRelatedProductId, ProductRelatedProductRelId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Related Products |
Recommended task type | Import data with paging add-in | For small product sets, you could also use another task type. However, for most systems it’s recommended to use the Import data with paging add-in because of its paging capabilities.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><RelatedProducts type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Related Products |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only changed related products are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | Default (1000) | A good default for most systems, but you can increase or decrease this based on your requirements. If you have a large number of custom product fields or product category fields you should lower this number to decrease pressure on the remote system and keep the size of the transferred XML down.
Timeout between paging requests in seconds | 5–10 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

With the Import Prices Job and Task you can bring over the current prices for your ecommerce products from your remote system to Dynamicweb. The setup of the endpoint, data integration job and scheduled task is very similar to those for products. The big difference is that the prices endpoint has been optimized for speed, so you can run it more frequently: it only returns the data needed to update a price and doesn’t bring back other data such as the product name, description, custom fields, etc.

It’s pretty common for product prices to change very frequently in the remote system. Therefore, you should schedule this task on a frequent basis. For most systems, once an hour might be enough. You can decrease or increase this frequency but be aware of the overhead you cause to the remote system if you run jobs too often. If highly up-to-date data is required you could go as far as to schedule this every five minutes. Be sure you have turned paging and delta tracking on (also on the end point) so you only process the changes, not the entire set. Also, if you need prices to be more accurate than this, consider a Price Provider which fetches prices from the remote system in real-time.

For a discussion on index rebuilds and when to use them, see the Products feature.

Typical integration job:

Setting | Value | Comments
Name | Import Product Prices |
Source provider | XML provider |
Source | Single file, ProductsPrices.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | Yes |
Table mapping | EcomProducts |
Primary keys | ProductId, ProductVariantId, ProductLanguageId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Product Prices |
Recommended task type | Import data with paging add-in | For small product sets, you could also use another task type. However, for most systems it’s recommended to use the Import data with paging add-in because of its paging capabilities.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><ProductPrices type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Product Prices |
Reimport data | No |
Set checkpoint to Now | No |
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | 2000 | A good default for most systems, but you can increase or decrease this based on your requirements.
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

With the Import Product Stock Job and Task you can bring over the current stock levels for your ecommerce products from your remote system to Dynamicweb. The setup of the endpoint, data integration job and scheduled task is very similar to those for products. The big difference is that the stock endpoint – just like the Product Prices endpoint – has been optimized for speed, so you can run it more frequently: it only returns the data needed to update stock levels and doesn’t bring back other data such as the product name, description, custom fields, etc.

Stock changes constantly and therefore needs to be brought back from the remote system frequently (especially when products are sold through other channels as well). Therefore, you should schedule this task on a frequent basis. For most systems, once an hour might be enough. You can decrease or increase this frequency but be aware of the overhead you cause to the remote system if you run jobs too often. If highly up-to-date data is required you could go as far as to schedule this every five minutes. Be sure you have turned paging and delta tracking on (also on the end point) so you only process the changes, not the entire set. Also, if you need stock to be more accurate than this, consider real-time stock checks during checkout.

For a discussion on index rebuilds and when to use them, see the Products feature.

If stock levels aren’t needed in the index (for example, you show products regardless of their stock level), you might not need an index update at all.

Typical integration job:

Setting | Value | Comments
Name | Import Product Stock |
Source provider | XML provider |
Source | Single file, ProductsStock.xml |
Destination provider | Ecom provider | No additional settings required on the destination provider
Index update? | Yes | See the discussion in the Products feature
Table mapping | EcomProducts |
Primary keys | ProductId, ProductVariantId, ProductLanguageId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Product Stock |
Recommended task type | Import data with paging add-in | For small product sets, you could also use another task type. However, for most systems it’s recommended to use the Import data with paging add-in because of its paging capabilities.
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><ProductStock type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Product Stock |
Reimport data | No |
Set checkpoint to Now | No |
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | 2000 | A good default for most systems, but you can increase or decrease this based on your requirements.
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

With the Import Orders Job and Task you can bring over up-to-date information about your orders from your remote system to Dynamicweb. This is often used to bring over information such as the track and trace code from the shipping provider. Another common scenario is to import orders that were never placed through Dynamicweb, so the customer can see a full order history of web orders and other orders in the customer center.

Most historical orders don’t change anymore, so if you use paging and deltas on orders you can run this task pretty frequently and still not bring over a lot of data. Once an hour should be good for most systems, but you can increase or decrease the frequency depending on the number of orders placed per day and how quickly order information needs to be updated.

Typical integration job:

Setting | Value | Comments
Name | Import Orders |
Source provider | XML provider |
Source | Single file, Orders.xml |
Destination provider | Order provider | No additional settings required on the destination provider
Table mapping | EcomOrders, EcomOrderLines |
Primary keys | EcomOrders: OrderId. EcomOrderLines: OrderLineId |

Typical scheduled task:

Setting | Value | Comments
Name | Import Orders |
Recommended task type | Import data with paging add-in |
Web service URL | Default |
Security key | Default |
Request XML | <GetEcomData><tables><SalesHeaders type="all"/></tables></GetEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Import Orders |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only changed orders are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes |
Page size (default 1000) | 200 | A good default for most systems, but you can increase or decrease this based on your requirements.
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

The change tracking mechanism also enables you to delete or disable data in Dynamicweb when it gets deleted in the remote system. The standard implementation handles true deletes, but you can modify the remote system’s behavior to also send over “soft deletes” – for example, a product that is deactivated.

Deletes are currently supported on the following end points:

  • Currencies
  • Manufacturers
  • Customers
  • Products
  • Orders
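Deleted records are requested with a GetDeletedEcomData request instead of GetEcomData. The individual request for each end point is listed in the tables below; as the Delete Products and Delete Orders tasks show, related tables can be combined in a single request, for example:

  <GetDeletedEcomData>
    <tables>
      <Products />
      <ProductGroups />
    </tables>
  </GetDeletedEcomData>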

Although deletes are supported for currencies, it’s uncommon to use this end point, as you’re unlikely to delete currencies on a regular basis. You’re probably better off handling it manually if you ever need to, which also gives you more control over the process.

Job settings:

Setting | Value | Comments
Name | Delete Currencies |
Source provider | XML provider |
Source | Single file, DeleteCurrencies.xml |
Destination provider | Dynamicweb provider | Enable: “Delete incoming rows”
Table mapping | Currencies <> EcomCurrencies |
Primary keys | EcomCurrencies: CurrencyCode, CurrencyLanguageId |

Scheduled task settings:

Setting | Value | Comments
Name | Delete Currencies |
Recommended task type | Import data with paging add-in | For small data sets like currencies, you could also use another task type.
Web service URL | Default |
Security key | Default |
Request XML | <GetDeletedEcomData><tables><Currencies /></tables></GetDeletedEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Delete Currencies |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only deleted currencies are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | No |
Page size (default 1000) | | Not needed when not paging
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | None | Not needed when not paging
Enable batch certification | No | Not needed when not paging

Depending on how frequently you delete manufacturers, you can schedule this task to run anywhere from once an hour to once a day.

Job settings:

Setting | Value | Comments
Name | Delete Manufacturers |
Source provider | XML provider |
Source | Single file, DeleteManufacturers.xml |
Destination provider | Dynamicweb provider | Enable: “Delete incoming rows”
Table mapping | Manufacturers <> EcomManufacturers |
Primary keys | EcomManufacturers: ManufacturerId |

Scheduled task settings:

Setting | Value | Comments
Name | Delete Manufacturers |
Recommended task type | Import data with paging add-in |
Web service URL | Default |
Security key | Default |
Request XML | <GetDeletedEcomData><tables><Manufacturers /></tables></GetDeletedEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Delete Manufacturers |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only deleted manufacturers are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes | For small sets of changes, you could turn off paging.
Page size (default 1000) | |
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

Depending on how frequently you delete customers, you can schedule this task to run anywhere from a few times per hour to a few times per day.

Job settings:

Setting | Value | Comments
Name | Delete Customers |
Source provider | XML provider |
Source | Single file, DeleteCustomers.xml |
Destination provider | Dynamicweb provider | Enable: “Delete incoming rows”
Table mapping | Users <> AccessUser |
Primary keys | AccessUser: AccessUserExternalId |

Scheduled task settings:

Setting | Value | Comments
Name | Delete Customers |
Recommended task type | Import data with paging add-in |
Web service URL | Default |
Security key | Default |
Request XML | <GetDeletedEcomData><tables><Customers /></tables></GetDeletedEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Delete Customers |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only deleted customers are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes | For small sets of changes, you could turn off paging.
Page size (default 1000) | |
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

Depending on how frequently you delete products, you can schedule this task to run anywhere from a few times per hour to a few times per day.

Job settings:

Setting | Value | Comments
Name | Delete Products |
Source provider | XML provider |
Source | Single file, DeleteProducts.xml |
Destination provider | Dynamicweb provider | Enable: “Delete incoming rows”
Table mapping | Products <> EcomProducts, ProductGroups <> EcomGroups |
Primary keys | EcomProducts: ProductId, ProductVariantId, ProductLanguageId. EcomGroups: GroupId, GroupLanguageId |

Scheduled task settings:

Setting | Value | Comments
Name | Delete Products |
Recommended task type | Import data with paging add-in |
Web service URL | Default |
Security key | Default |
Request XML | <GetDeletedEcomData><tables><Products /><ProductGroups /></tables></GetDeletedEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Delete Products |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only deleted products are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes | For small sets of changes, you could turn off paging.
Page size (default 1000) | |
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

Depending on how frequently you delete orders, you can schedule this task to run a few times a day. Be careful with this job as it deletes the order history in Dynamicweb.

Job settings:

Setting | Value | Comments
Name | Delete Orders |
Source provider | XML provider |
Source | Single file, DeleteOrders.xml |
Destination provider | Dynamicweb provider | Enable: “Delete incoming rows”
Table mapping | Orders <> EcomOrders, OrderLines <> EcomOrderLines |
Primary keys | EcomOrders: OrderId. EcomOrderLines: OrderLineId |

Scheduled task settings:

Setting | Value | Comments
Name | Delete Orders |
Recommended task type | Import data with paging add-in |
Web service URL | Default |
Security key | Default |
Request XML | <GetDeletedEcomData><tables><OrderLines /><Orders /></tables></GetDeletedEcomData> |
Erp process request timeout in minutes (default 30) | Default |
Import activity | Delete Orders |
Reimport data | True/False | Set to True during initial development and loading of data to ensure a fresh copy of the data is retrieved. Set to False in production so only deleted orders are brought back.
Set checkpoint to Now | True/False | Set this to True if you want to signal to the remote system that you’re caught up with importing changes. This is useful when you accidentally run an import with “Reimport data” (which starts a full reimport) or don’t care about recent changes but just want to see new changes come in.
Custom modifier | Default |
Enable paging | Yes | For small sets of changes, you could turn off paging.
Page size (default 1000) | |
Timeout between paging requests in seconds | 2–5 seconds | Increase to lower the pressure on your remote system; decrease to improve throughput at the cost of pressure on the remote system.
Maximum times to repeat a request (between failed or same responses) | 3 | Increase or decrease depending on the stability and response times of your remote system.
Repeat until condition | Empty XML response or batch id |
Enable batch certification | Yes |

A Live Integration is an extension of a batch integration, which makes it possible to make real-time requests for data to the remote system.

This allows you to render e.g. live or customer-specific prices and live stock levels, or to have an order total calculated in the remote system and returned.

The data flow is somewhat dependent on the feature(s) implemented:

  • When a page with live prices and stock is rendered:
    • A list of product IDs or product numbers is sent to the remote system along with the ID of the current user
    • The plugin on the remote system then extracts prices and stock states for that user, and returns them
    • The products in memory are rendered with the returned prices – no data is saved in the database
  • When an order/shopping cart is calculated in the remote system:
    • Cart information (product IDs, quantities, user information) is sent to the remote system
    • The remote system calculates the cart with live prices, discounts, etc.
    • The updated cart is returned to Dynamicweb and saved to the database, then shown to the user
    • If an order is created in the remote system, the remote ID is also returned and saved, to ensure data consistency between the two systems

As such, a live integration requires that a functioning batch integration is in place – you can’t request real-time information from nothing, so you need e.g. a ProductID and a UserID to request a customer-specific price, and the product must exist in the database before it can be shown in frontend.

To work with live integration, you must add it to your solution manually:

  • Download the Integration v2 LiveIntegration dll from the Downloads area
  • Place it in the bin-folder of the solution – like all bin-folder modifications, this causes an IIS reset
  • Log in again

The Live Integration add-in is now available from Settings > Integration > Integration Framework Live.

It consists of 6 sections, which are used to configure different aspects of a live integration:

  1. General – contains connectivity and other overall settings
  2. Products – contains settings related to synchronizing product related information
  3. Orders – contains settings that define how carts and orders are handled
  4. Users – contains settings related to synchronizing users
  5. Notifications – defines how and when notifications are sent
  6. Logs – configures the logging behavior of the connector

These are described in more detail below. 

The General section (Figure 24.1) contains settings related to connectivity and other overall settings.

Figure 24.1 General parameters

Setting | Notes
Web Service URL | Enter the URL to the web service exposed by the Dynamicweb Connector
Security key | Enter the security key/secret from the Dynamicweb Connector service config file
Connection timeout |
Shop | Select a shop for which the live integration is active – or set to Any to use it on all shops
Number format culture | Select an appropriate number format culture to use when parsing information from the remote system

The Products section (Figure 25.1) contains settings related to retrieving live product information.

Figure 25.1 Products parameters

Setting | Notes
Enable live prices | When enabled, prices are looked up in the remote system when shown in the frontend
Lazy load product info | Skips loading prices and other product info from the remote system unless getproductinfo=true is in the request. This can be used for lazy loading scenarios using AJAX.
Include product custom fields in request | Whether to send the custom fields for a product to the ERP when requesting price and product information. If the ERP doesn’t need this information to determine the correct return values, leave this off for performance reasons.
Product information cache level | Sets caching to either page or session, depending on how important it is to continually update the information
Use product number in price calculation |
Retry request for the product information | Whether or not the connector should retry the request for product information in case of a failure.
Include variants in the product information request |

The Orders section (Figure 26.1) contains settings that define how carts and orders are handled.

Figure 26.1 Orders parameters

Setting | Notes
Cart communication type | Defines if and how the connector communicates with the ERP. The available options are: None – no carts or orders are sent to the ERP; Full – the connector communicates with the ERP for cart calculation as well as for full, completed orders; OnlyOnOrderComplete – Dynamicweb manages the cart, and only the completed order is sent to the ERP.
Queue orders and allow payments if no connection | Queues orders if there is no connection – they will be processed once the connection is reestablished, provided that the appropriate batch job has been created (see the previous section).
Include order custom fields in request | Includes order fields in the request
Include order line custom fields in request | Includes order line fields in the request
Include parts order lines in request |
Text for discount order lines | Discounts returned by a remote system do not contain a description for showing in the cart and on an order – this allows you to define one manually
Order state after export succeeded | Set an order state to apply to the order after a successful export
Order state after export failed | Set an order state to apply to the order after a failed export
Order cache level | Sets caching to either page or session, depending on how important it is to continually update the information
Do not process ledger order | If using the Ledger functionality, this prevents ledger orders from being processed by the live integration

The Notifications section (Figure 27.1) defines how and when email notifications are sent when the live integration cannot connect to the remote system.

Figure 27.1 Notifications parameters

Setting | Description
Notification recipient e-mail | Enter an email address
Notification e-mail template | Select an email template
Notification e-mail subject | Enter an email subject
Notification e-mail sender name | Specify a sender name
Notification e-mail sender e-mail | Specify a sender email
Notification sending frequency | Set a frequency – how often do you want to receive these emails?

The Logs section (Figure 28.1) allows you to configure the logging behavior of the live integration – this is particularly useful for debugging purposes, but can also be used for production.

Figure 28.1 The Logs section

Setting | Description
Log file max size | Set a max size in MB for the log file
Keep all log files |
Log general errors |
Log connection errors |
Log response errors |
Log request and response content |

Once the Live Integration add-in has been configured, you can click the Test Connection button on the Live Integration add-in toolbar to access a (barebones) version of the DynamicwebConnector TestTool (Figure 29.1).

Figure 29.1 Testing connectivity

Using that, you can connect to the DynamicwebConnector web service and test requests & responses from the Dynamicweb side of the integration - provided, of course, that the connection is up and running.

You can also examine trace information.

When you use the Live Integration, you can have the remote system calculate shipping fees and/or discounts for your orders.

If you do, you must configure the associated providers in Dynamicweb:

  • Live Shipping Provider
  • Live Discount Provider

The reason is fairly technical; communication with the remote system can happen at various stages of a user’s interaction with the website, including stages where the discount and shipping fee data is not normally available. Instead of requesting the data again – and negatively impacting performance – the Live Integration stores the data in session state using the cart/order ID, and the Live Shipping Provider and Live Discount Provider use the data from session state to calculate the correct fees and discounts.

The Live Shipping & Live Discount providers are available once the Live Integration dll has been installed, as outlined above.

The Live Shipping Provider applies shipping fees returned by the remote system to carts and orders in Dynamicweb.

Please note that you’re not forced to have the remote system calculate the shipping fees. You can also use Dynamicweb’s built-in shipping providers to calculate the fee directly or with the help of an external system such as UPS or FedEx. When you configure one of the built-in providers, the fee is calculated by Dynamicweb and submitted to the remote system; any fees returned by the remote system are ignored.

To configure the Live Shipping Provider:

  1. Go to Settings > Ecommerce > Orders > Shipping
  2. Click New shipping on the toolbar
  3. Provide a name, and an optional description and icon
  4. Under Countries, make sure All Countries is selected
    1. Since your remote system handles the fee calculation, Dynamicweb should not limit the availability of the provider to specific countries.
  5. Set both Default fee and No fee for purchases over to zero
    1. These rules should be handled by the remote system, not by Dynamicweb
  6. Under Shipping provider, select Live integration shipping fee provider as the type
  7. Under Fee settings, select Use fee rules from provider
  8. Click Save and close in the toolbar

If you’re changing the behavior in the remote system or creating your own implementation, you should know that the values for shipping are stored in the XML as follows:

XML
<?xml version="1.0" encoding="utf-8"?>
<tables>
  <table tableName="EcomOrders">
    <item table="EcomOrders">
      <!-- Either "OrderShippingMethodName" or "OrderShippingMethodId" can be used -->
      <column columnName="OrderShippingMethodName"><![CDATA[Truck]]></column>
      <column columnName="OrderShippingFee"><![CDATA[10.00]]></column>
    </item>
  </table>
</tables>
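
As the comment above indicates, the shipping method can also be identified by ID instead of by name. A minimal sketch of that variant is shown below, assuming a purely hypothetical shipping method ID:

XML
<?xml version="1.0" encoding="utf-8"?>
<tables>
  <table tableName="EcomOrders">
    <item table="EcomOrders">
      <!-- Hypothetical shipping method ID, used for illustration only -->
      <column columnName="OrderShippingMethodId"><![CDATA[SHIP1]]></column>
      <column columnName="OrderShippingFee"><![CDATA[10.00]]></column>
    </item>
  </table>
</tables>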

The Live Discount Provider applies the discounts returned by the remote system to carts and orders in Dynamicweb.

Unlike shipping fees, you can’t have Dynamicweb calculate the discounts and still use the remote system to calculate the remainder of the order, such as customer-specific prices. If you need to support that scenario, you’ll have to modify the source code for the live integration framework.

To configure the Live Discount Provider:

  1. Go to Ecommerce > Discounts (not Order discounts, which is a different engine).
    1. If you don’t see the Discounts node, contact your partner account manager to have it added to your license
  2. Click New sales discount in the toolbar
  3. Provide a name and an optional description, make sure the discount is active, and clear the end date so the discount does not expire after some time
  4. Under Discount type, choose Live integration discount – this causes most of the settings to disappear. This is intentional.
  5. Under User > Used by select either All or Authenticated users, or select particular users and user groups, as appropriate
  6. Under Shop, select a shop, or leave it set to All to have the discount provider kick in for all orders in all shops.
  7. Under Country, select one or more specific countries, or leave it set to All to have the discount provider kick in for all countries.
  8. Leave Minimum order price set to 0 – this business rule should be applied by the remote system instead.
  9. Click Save and close on the toolbar.

If you’re changing the behavior in the remote system or creating your own implementation, you should know that the values for discount are stored in the XML as follows:

XML
<?xml version="1.0" encoding="utf-8"?>
<tables>
  …
  <table tableName="EcomOrderLines">
    …
    <item table="EcomOrderLines">
      <column columnName="OrderLineOrderId"><![CDATA[ORDER386]]></column>
      <column columnName="OrderLineId"><![CDATA[ORDER386_1]]></column>
      <column columnName="OrderLineParentLineId"><![CDATA[]]></column>
      <column columnName="OrderLineProductId"><![CDATA[]]></column>
      <column columnName="OrderLineProductVariantId"><![CDATA[]]></column>
      <column columnName="OrderLineProductNumber"><![CDATA[]]></column>
      <column columnName="OrderLineProductName"><![CDATA[Order discount]]></column> <!-- Discount name -->
      <column columnName="OrderLineQuantity"><![CDATA[1]]></column>
      <column columnName="OrderLineType"><![CDATA[1]]></column> <!-- Orderline type -->
      <column columnName="OrderLinePriceWithoutVat"><![CDATA[-98.00]]></column>
      <column columnName="OrderLineUnitPriceWithoutVat"><![CDATA[-98.00]]></column>
      <column columnName="OrderLinePriceWithVat"><![CDATA[-101.92]]></column>
      <column columnName="OrderLineUnitPriceWithVat"><![CDATA[-101.92]]></column>
      <column columnName="OrderLinePriceVat"><![CDATA[-3.92]]></column>
      <column columnName="OrderLineUnitPriceVat"><![CDATA[-3.92]]></column>
      <column columnName="OrderLineTypeName"><![CDATA[Discount]]></column>
      <column columnName="OrderLineBom"><![CDATA[FALSE]]></column>
      <column columnName="OrderLineBomItemId"><![CDATA[]]></column>
      <column columnName="OrderLineGiftCardCode"><![CDATA[]]></column>
      <column columnName="OrderLineIsGiftCardDiscount"><![CDATA[FALSE]]></column>
      <column columnName="OrderLineFieldValues"><![CDATA[]]></column>
    </item>
  </table>
</tables>

Here is a sample XML for a product discount:

XML
<item table="EcomOrderLines">
  <column columnName="OrderLineOrderId"><![CDATA[CART431]]></column>
  <column columnName="OrderLineId"><![CDATA[OL30975_1]]></column>
  <column columnName="OrderLineParentLineId"><![CDATA[OL30975]]></column>
  <column columnName="OrderLineProductId"><![CDATA[D0003]]></column>
  <column columnName="OrderLineProductVariantId"><![CDATA[]]></column>
  <column columnName="OrderLineProductNumber"><![CDATA[D0003]]></column>
  <column columnName="OrderLineProductName"><![CDATA[Order line discount]]></column> <!-- Discount name -->
  <column columnName="OrderLineQuantity"><![CDATA[1]]></column>
  <column columnName="OrderLineType"><![CDATA[3]]></column> <!-- Orderline type -->
  <column columnName="OrderLinePriceWithoutVat"><![CDATA[-5.00]]></column>
  <column columnName="OrderLineUnitPriceWithoutVat"><![CDATA[-5.00]]></column>
  <column columnName="OrderLinePriceWithVat"><![CDATA[-5.20]]></column>
  <column columnName="OrderLineUnitPriceWithVat"><![CDATA[-5.20]]></column>
  <column columnName="OrderLinePriceVat"><![CDATA[-0.20]]></column>
  <column columnName="OrderLineUnitPriceVat"><![CDATA[-0.20]]></column>
  <column columnName="OrderLineTypeName"><![CDATA[ProductDiscount]]></column>
  <column columnName="OrderLineBom"><![CDATA[FALSE]]></column>
  <column columnName="OrderLineBomItemId"><![CDATA[]]></column>
  <column columnName="OrderLineGiftCardCode"><![CDATA[]]></column>
  <column columnName="OrderLineIsGiftCardDiscount"><![CDATA[FALSE]]></column>
  <column columnName="OrderLineFieldValues"><![CDATA[]]></column>
</item>

Note: If the OrderLineProductName XML field is empty, the value defined in the Orders section of the Live Integration configuration will be used.
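
For illustration, a discount line returned without a name could look like the abbreviated sketch below (only a subset of the columns shown above is included); in that case, the Text for discount order lines value configured in the Orders section is used as the description shown in the cart and on the order:

XML
<item table="EcomOrderLines">
  <column columnName="OrderLineOrderId"><![CDATA[ORDER386]]></column>
  <column columnName="OrderLineId"><![CDATA[ORDER386_1]]></column>
  <!-- Empty discount name: the configured "Text for discount order lines" is used instead -->
  <column columnName="OrderLineProductName"><![CDATA[]]></column>
  <column columnName="OrderLineType"><![CDATA[1]]></column>
  <column columnName="OrderLinePriceWithVat"><![CDATA[-101.92]]></column>
</item>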

Usually, when a user completes a checkout in the frontend, the order is immediately transferred to the remote system. This allows the remote system to start processing orders as soon as they arrive.

However, there are two scenarios where this doesn’t happen:

  1. The system encounters an error while sending the order, e.g. because the remote system is down
  2. You turned off live synchronization of orders in the Live Integration settings, by setting Cart communication type to ‘None’

Either way, you now have two options for getting these orders to the remote system:

  1. Manually, using a button on the ribbon bar
  2. Automatically, using the Sync Queued Orders task.

For both options, the following criteria must be true for an order to be eligible for synchronization:

  • OrderComplete – must be true
  • OrderDeleted – must be false
  • OrderIntegrationOrderID – must be null or empty. When it’s not, it means the order has already been sent to the remote system.
  • OrderIsExported – must be null or 0. When it’s not, it means the order has already been sent to the remote system.
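
Purely as an illustration – using the same EcomOrders XML notation as the samples in this article, and a hypothetical order ID – an order eligible for synchronization would have column values along these lines:

XML
<item table="EcomOrders">
  <!-- Hypothetical order, shown for illustration only -->
  <column columnName="OrderId"><![CDATA[ORDER386]]></column>
  <column columnName="OrderComplete"><![CDATA[True]]></column>
  <column columnName="OrderDeleted"><![CDATA[False]]></column>
  <column columnName="OrderIntegrationOrderID"><![CDATA[]]></column>
  <column columnName="OrderIsExported"><![CDATA[0]]></column>
</item>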

You can manually send an un-synchronized order to the remote system using the Transfer to ERP button in the Live Integration section of the Ribbon bar (Figure 34.1). 

Figure 34.1 Manually submitting an order to a remote system

Only orders which have not been submitted before can be transferred. This is determined by looking at the IntegrationOrderId property of the order; if it’s not null or empty, the order cannot be submitted.

To send orders from the order list follow these steps:

  1. Click Ecommerce > Orders
  2. In the order list, select one or more orders you want to submit
  3. Switch to the Live Integration tab of the ribbon bar
  4. Click Transfer to ERP

After a short while you either get a notification that the orders have been transferred correctly, or you get an error message. The button is also available from the order details page.

In addition to live synchronization of orders and manually submitting them, you can also send orders in batch using a scheduled task. This is useful in a few scenarios:

  1. You frequently encounter connection issues with live sync, and you want to submit the orders as soon as you can after the first failure.
  2. You want to minimize the checkout time for the user by not immediately sending the order. A scheduled task can then send the order shortly after the user has completed it.
  3. You don’t want to submit orders as they arrive, but send them in batch at certain intervals to lower the pressure on the remote system.

Your reasons for using scheduled order sync have some impact on how you want to configure the scheduled task. If you want to optimize the checkout time for the user and skip live integration during checkout, you can use this scheduled task to send the orders shortly after they have been completed. In that case, schedule the task to run often, such as every 5 or 10 minutes.

If, on the other hand, you disabled live integration to, say, decrease pressure on your remote system during business hours, you want your scheduled task to respect that as well. In that case, you could schedule the task to run once a day, outside business hours. Make sure you set “Maximum orders to process in each execution” to a high enough number that all new orders are transferred correctly – for example, if the shop typically receives a few hundred orders a day and the task only runs once a night, the maximum must be well above that daily volume.

To create the task to submit orders in batch follow these steps:

  1. Go to Settings > Integration > Integration Framework Batch
  2. Create a new batch integration activity and:
    1. Name it
    2. Set a start and end time and a repetition interval
  3. For Type, select the Sync queued orders using Live Integration add-in (Figure 35.1)
Figure 35.1 Sync Queued Orders using Live Integration
  4. Configure the Queued Orders section according to the settings below:

  • Finished for X minutes – Defines the minimum number of minutes an order must have been completed before it is sent. This helps prevent submitting orders that are still being processed by Dynamicweb or a payment gateway. Defaults to 5 minutes.
  • Maximum orders to process in each execution – Defines the maximum number of orders sent per run of the scheduled task. A lower number means lower pressure on your remote system at the expense of latency. Defaults to 25.
  • Shop – Defines the shop for which orders should be sent. Defaults to Any, which means orders for all shops are sent.
  • Order States – Defines the state an order must be in to be sent to the remote system. Defaults to none selected, which means all orders are sent.
  • Exclude recurring order templates – When selected, only orders with OrderIsRecurringOrderTemplate set to 0 or null are sent. This lets you skip sending templates and only send real orders.

As with all other scheduled tasks, you can also set up notifications if you want to be notified of the task’s execution and its result.