Duplicate Checking On Save in Sugar (Part 2)

By Adam Chodoff • February 2nd, 2016
Audiences: Developers

In Part 1 of our blog post, we listed the default logic used for checking duplicates on save in some of the default modules. In this post, we’ll detail the process for changing that logic.

Modifying these defaults is relatively easy—we simply need to extend the vardefs file for the module in question and add in our custom filter.

To do this, you first want to map out the filter you’d like to use. As an example, let’s say that we want to change the default logic on Accounts so that a record is flagged as a potential duplicate if

The name of the existing record starts with the name of the new record, and

The billing_address_postalcode of the existing record equals the billing_address_postalcode of the new record;

or

The field custom_field_c on the existing record equals the custom_field_c field on the new record.

Now that we’ve got that sketched out, let’s construct the vardefs extension file. We will put it in the file custom/Extension/modules/Accounts/Ext/Vardefs/customduplicatecheck.php. We’ll need to set two arrays, one for the filter logic itself and one for the sorting logic:
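A sketch of that file, modeled on the duplicate_check metadata that stock Sugar modules define in their vardefs (the field names and the and/or grouping here follow our example; treat the exact structure as a starting point and verify it against your Sugar version):

```php
<?php
// custom/Extension/modules/Accounts/Ext/Vardefs/customduplicatecheck.php
// Values prefixed with '$' (e.g. '$name') are placeholders that Sugar
// replaces with the corresponding field value from the record being saved.

$dictionary['Account']['duplicate_check'] = array(
    'enabled' => true,
    'FilterDuplicateCheck' => array(
        // A record is a potential duplicate if its name starts with the
        // new record's name AND its postal code matches, OR if its
        // custom_field_c matches.
        'filter_template' => array(
            array(
                '$or' => array(
                    array(
                        '$and' => array(
                            array('name' => array('$starts' => '$name')),
                            array('billing_address_postalcode' => array('$equals' => '$billing_address_postalcode')),
                        ),
                    ),
                    array('custom_field_c' => array('$equals' => '$custom_field_c')),
                ),
            ),
        ),
        // Matches on fields earlier in this list contribute more to a
        // potential duplicate's ranking than matches on later fields.
        'ranking_fields' => array(
            array('in_field_name' => 'name', 'dupe_field_name' => 'name'),
            array('in_field_name' => 'billing_address_postalcode', 'dupe_field_name' => 'billing_address_postalcode'),
            array('in_field_name' => 'custom_field_c', 'dupe_field_name' => 'custom_field_c'),
        ),
    ),
);
```

After adding the file, run a Quick Repair and Rebuild from the Admin panel so the extension is compiled into the module’s vardefs.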

The structure here is not hard to understand, but it can be confusing to look at. One way to think about it is that each logical operator has the same structure as the filter as a whole. That is, the filter_template array is an array containing exactly one array. That array must have a logical operator (‘$and’ or ‘$or’) as its key, and another array as its value.

That inner array then contains a number of arrays, each of which has exactly one key, which is either a logical operator or a field name. If the key is a field name, its value must be an array containing a comparison operator as the key, and the field being compared against as the value. If the key is a logical operator, its value must be a valid filter in its own right (e.g., you should be able to take any array with ‘$or’ as its key and use it by itself as the filter_template array).
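Concretely, the recursion described above means both of the following are valid filter entries (the field names come from our running example; the ‘$starts’ and ‘$equals’ operators follow Sugar’s filter syntax):

```php
<?php
// A field-name key maps to array(comparison operator => value):
$leaf = array('name' => array('$starts' => '$name'));

// A logical-operator key maps to an array of sub-filters, each of
// which has the same shape. This whole array could itself serve as
// the single element of filter_template:
$branch = array(
    '$or' => array(
        array('name' => array('$starts' => '$name')),
        array('custom_field_c' => array('$equals' => '$custom_field_c')),
    ),
);
```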

The ‘ranking_fields’ array is simpler to understand. It serves as a way to sort the records returned as potential duplicates. Once the potential duplicates are found, they are ranked according to the order of this array. That is, a duplicate record is given a score based on how many matches it contains (a match being where the value of the field specified as ‘in_field_name’ matches the value of the field specified as ‘dupe_field_name’) and where those matches rank on this list. In our example, the ‘name’ field scores higher than ‘custom_field_c’, so in most cases records that match on the name and postal code will be listed before those that match only on custom_field_c. It should be noted that the fields you’d like to sort on do not necessarily need to be part of your filter.

Note that this is only part of the story behind the structure here. What’s actually going on is that the filter you’ve built is eventually passed on to the FilterApi class (the same class that handles filters built via list views, for example), which does most of the work and returns the results. If your needs fall outside the scope of simple comparison operators, you’ll need to do some more legwork, either by extending the FilterApi class or by writing an entirely separate duplicate-checking method.

However, by combining the above method with Sugar’s calculated fields, you should be able to write some fairly deep logic.

Adam Chodoff
Application Specialist at UpCurve Cloud