Developer forum


Query and Facets behavior on deleted fields

Nuno Aguiar (Dynamicweb Employee)

Hi,

 

Every now and again, our integration team adds or removes fields (including some Rapido fields) during the development of a project. But when a removed field is still referenced in a Query expression OR in a Facet, the result in the frontend is that it throws an error.

 

We were hoping the Repository could be a bit friendlier by skipping the expression or the facet instead of breaking.
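To illustrate the requested behavior, here is a minimal sketch (hypothetical Python, not a Dynamicweb API): when a query is evaluated, conditions that reference fields that no longer exist are skipped and logged, rather than raising an error.

```python
# Hypothetical sketch of the requested "skip instead of break" behavior.
# `expressions` is a list of conditions, each referencing a field name;
# `known_fields` is the set of fields that still exist in the repository.
def prune_expressions(expressions, known_fields, log):
    kept = []
    for expr in expressions:
        if expr["field"] in known_fields:
            kept.append(expr)
        else:
            # Instead of failing the whole query, record the skip.
            log.append(f"Skipped condition on deleted field '{expr['field']}'")
    return kept
```

The same idea would apply to facets: a facet bound to a missing field is simply omitted from the result, with a log entry so the problem is still visible to developers.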

 

A use case scenario: if you delete a field in the backend, you now need to search through all queries (and potentially the queries in PIM) to get everything working again.

 

Could this experience be improved?

 

Best Regards,

Nuno Aguiar


Replies

 
Martin Vang

Hi Nuno,

I see where you're going, but I don't think it's that simple. What you are suggesting is that we leave a lot of bad queries etc. lying around in the system. What about the next time the query is opened and the user wants to save a small change? They then need to consider all the fields that have previously been deleted and work out whether those fields were important (e.g. deleted without the person who deleted them fully understanding what they were doing), or whether the entire condition can be safely removed. No small task.

The important part here is that if you delete a field that is in use, you're doing something you don't fully understand. We cannot fix for that. Example:

PIM has a query whose field is used to build a Structure Query, and you accidentally delete Level 1: should we just show an empty tree with no exceptions? That would destroy the entire workflow for the PIM editors... Or what if you delete a field that is used in completion rules? Now all completion values go up (yay?).

TLDR; You should never delete fields that are in use. Just stop using them and leave the fields alone. Maybe write the field into a list of Obsolete Fields that you remove in bulk, at a time when you're committed to doing a complete test of the site.
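The "Obsolete Fields" convention described above could look something like this sketch (hypothetical Python; all names are illustrative, not part of any Dynamicweb API): fields are flagged obsolete instead of being deleted, and the whole list is purged in one batch when a full site test is planned.

```python
# Sketch of the "mark obsolete, purge in bulk" convention.
class FieldRegistry:
    def __init__(self, fields):
        # Every field starts out active.
        self.fields = dict.fromkeys(fields, "active")

    def mark_obsolete(self, name):
        # Stop using the field, but keep it so queries/facets don't break.
        if name in self.fields:
            self.fields[name] = "obsolete"

    def purge_obsolete(self):
        # Delete all obsolete fields at once, at a time reserved
        # for a complete test of the site.
        removed = [n for n, s in self.fields.items() if s == "obsolete"]
        for n in removed:
            del self.fields[n]
        return removed
```

The point of the batch purge is that the (unavoidable) full regression test is paid once, instead of after every individual deletion.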

Does this explanation change your mind regarding the feature request, or do you still think it's a good idea?

BR

Martin

 
Nuno Aguiar (Dynamicweb Employee)

Hi Martin,

 

Your explanation makes a lot of sense, and our team is torn on the best approach. There are certainly pros and cons to both.

 

Your suggestion doesn't prevent us from actually deleting the field itself. What you're saying is that by convention we should never delete fields, OR we need to do a very extensive manual audit to determine whether we can delete any - also not easy in code, I know. So where does this leave us?

 

Our current pain point is that throughout the lifespan of a project - and working more and more with integrated solutions where we need to create field definitions dynamically - fields will have to be deleted. Our integration team can very "efficiently" do that, and the database has no bad or old data lying around. But then a number of things break and we're chasing our tails:

  • Was this a needed field after all, OR just a dummy Query in PIM?
  • Why did the product search on the site suddenly stop working? Who changed what?

 

We also don't want to deliver a brand new site with a bunch of bad and dummy data that we're "scared" to delete in case it breaks something. It's kind of like giving a new car owner a dirty car.

 

So I don't have a good recommendation either at this point, because I do understand where you are coming from. But hopefully you now also have a better understanding that the current behavior costs us time, raises questions, and (when customers are involved) erodes some faith in the process when things break for "no apparent reason".

 

Maybe a future idea is to do better field-deletion validation through the API and check all known usages of a field to see if it's being used? That doesn't prevent deletion in Integration, but if we have that, we can incorporate some extra logic and logging in DataIntegration jobs to say that Field X was being used in Query Y.
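A usage check of the kind suggested above could be sketched roughly like this (hypothetical Python; the data shapes are assumptions, not Dynamicweb structures): before deleting a field, scan the known consumers (queries and facets) and report every place the field is referenced.

```python
# Hypothetical pre-deletion usage check: returns every known consumer
# that references the field, so a delete can warn or refuse.
def find_usages(field_name, queries, facets):
    usages = []
    for q in queries:
        # A query references the field in one of its expressions.
        if field_name in q["fields"]:
            usages.append(("query", q["name"]))
    for f in facets:
        # A facet is bound directly to one field.
        if field_name == f["field"]:
            usages.append(("facet", f["name"]))
    return usages
```

An integration job could call such a check before each deletion and log "Field X was being used in Query Y" instead of silently breaking the frontend.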

 

Some food for thought I guess.

 

Best Regards,

Nuno Aguiar

 
Martin Vang
Original message by Nuno Aguiar posted on 04/03/2021 14:13:28:

Our current pain point is that throughout the lifespan of a project - and working more and more with integrated solutions where we need to create field definitions dynamically - fields will have to be deleted. Our integration team can very "efficiently" do that, and the database has no bad or old data lying around. But then a number of things break and we're chasing our tails:

Hi Nuno,

I understand your pains. I really, really do... But I would like to challenge the bolded text, or at least the assumption that you "need to delete the bad data"! Here's what I did when I worked as a consultant, where I primarily did imports of semi-big (by Danish standards) datasets:

1. Set up a Test site to test the imports on, and keep this "bad data site" separate from all other work (I really cannot stress how important and time-saving this is if you have these kinds of problems!)

2. Rerun the imports on Staging/Production once things are ready for production (and pray a bit, of course, because environments are never exactly the same...)

3. Live with the "bad data" by hiding it from the users where possible

I think it's much worse to see breaking sites than to see a field that we "just ignore". I never had any trouble explaining to the customer why the few "bad data" fields were present, and the customers who really felt strongly about removing them were happy to pay me to do so as a separate follow-up project.

Maybe doing things this way can help alleviate your pains? I just don't see the use in Dynamicweb doing a reliable data-dependency validation check during deletion of fields (especially during data integration runs!). It would kill all performance and take us a year to complete, which is probably too long to spend compared to the other work we could be doing.

I hope you don't mind the pushback on your workflow. I actually think, on a personal level, that the described feature request would be really fun to do. :)

BR

Martin

 
Nuno Aguiar (Dynamicweb Employee)

Hi Martin,

 

I don't mind the pushback at all. It's an enriching experience to hear how other people do it. My thread actually comes from a set of experiences the US team has had; I am just the spokesperson.

 

We've already been doing steps 1 through 3 as you describe. But taking that next step to clean it all up is what we're working/struggling towards. I understand the technological challenges and the performance impact of doing data-dependency validation, but isn't that what technology is supposed to help us with? Repetitive and/or error-prone work that humans suck at? Anyway, I did not expect a quick answer or turnaround. I'm happy to plant the seed, and I'll take this back to the team.

 

On the fun side of things, if you can imagine, we're re-architecting a lot on our end and will have up to 7 environments with CI/CD running, multiple teams, and the whole shebang. Pretty cool, and things are in sync.

 

 

As always thank you for the thorough explanation. I really appreciate it.

 

Best Regards,

Nuno Aguiar

 
