Tip of the Week – Best Practices for Salesforce Development


When it comes to developing and implementing a Salesforce-based system, there are undoubtedly best practices (and worst practices). Worst practices might range from writing code in Pig Latin to rubbing melted chocolate on your keyboard. But what are best practices? “A best practice is an industry-wide agreement that standardizes the most efficient and effective way to accomplish the desired outcome. They generally consist of a technique, method, or process.”

Not only do best practices represent the most effective processes and techniques, they also allow any and all users to work with the system without breaking its structure or format. Not a bad idea, right? With the new availability of Salesforce DX, best practices are becoming even more important. DX is all about focusing on better and more effective ways to do development work. A few quick examples of best practices include test automation, audit trails, and the ability to roll back changes. In the end, following these standards is good for everyone involved, and as DX becomes more widely used, increasingly essential.
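To make the test automation example a little more concrete, here is a minimal sketch of what an automated Apex unit test might look like. The class and method under test (OpportunityDiscountService and applyDiscount) are hypothetical names invented purely for illustration; what matters is the general pattern of creating your own test data, wrapping the call in Test.startTest()/Test.stopTest(), and asserting on the outcome.

@isTest
private class OpportunityDiscountServiceTest {

    @isTest
    static void appliesTenPercentDiscount() {
        // Test methods see no org data by default, so create what the test needs.
        Opportunity opp = new Opportunity(
            Name = 'Test Opportunity',
            StageName = 'Prospecting',
            CloseDate = Date.today().addDays(30),
            Amount = 1000
        );
        insert opp;

        // startTest/stopTest give the code under test its own set of governor limits.
        Test.startTest();
        // Hypothetical service class used only to illustrate the pattern.
        OpportunityDiscountService.applyDiscount(opp.Id, 10);
        Test.stopTest();

        // Assert on the outcome, not just on the absence of errors.
        opp = [SELECT Amount FROM Opportunity WHERE Id = :opp.Id];
        System.assertEquals(900, opp.Amount,
            'A 10% discount should reduce 1000 to 900');
    }
}

Tests like this one are what make the other best practices (audit trails, rollback ability) safe to rely on, because they catch regressions before a change ever reaches production.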

The team at SalesforceBen has put together a great article (below) that will take you in-depth on these and some other Salesforce best practices!

Salesforce Best Practices

-Ryan and the CloudMyBiz Team

Tip of the Week – Solving for Salesforce Data Loader Errors


Using the Salesforce Data Loader is pretty much mandatory when you are working with large numbers of records and need to perform a mass upload or update. For optimal results, Salesforce recommends the Data Loader when you are working with 50,000 to 5 million records. If you have fewer, the standard import tools work perfectly well, and if you are working with more, third-party data apps are the best way to go.

When running Data Loader, you can schedule these big loads in batches and let automation process them in the background. There is a catch, though: the larger the batch size, the greater the chance of running into issues like CPU timeouts, memory (heap) limits, or query limits in orgs with lots of automation. Basically, these are errors that result from too much data being processed all at once. Considering how essential a Data Loader process can be when working with a large volume of records, how can you avoid hitting these errors and still process your uploads?

One solution is to manually drop the Data Loader batch size from the default of 200 down to 1. Whoa, that’s a big leap, right? By shrinking the batch size, each transaction handles far less data at once, so you can easily avoid the errors described above. However, this solution does have a few drawbacks to consider.
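If you run these scheduled loads from the command line, the batch size lives in the process-conf.xml file that drives each job. The fragment below is only a rough sketch, not a complete working configuration: the bean id, file paths, and credentials are placeholders, and the configuration keys (such as sfdc.loadBatchSize) should be double-checked against the Data Loader Guide linked below for your version of the tool. In the desktop app, the equivalent value is the Batch Size field in the Settings dialog.

<!-- Illustrative fragment of a Data Loader process-conf.xml bean.
     The id, paths, and credentials are placeholders; confirm the exact
     keys against the Data Loader Guide for your Data Loader version. -->
<bean id="contactUpdateProcess"
      class="com.salesforce.dataloader.process.ProcessRunner"
      singleton="false">
  <description>Nightly contact update run with a reduced batch size.</description>
  <property name="name" value="contactUpdateProcess"/>
  <property name="configOverrideMap">
    <map>
      <entry key="sfdc.endpoint" value="https://login.salesforce.com"/>
      <entry key="sfdc.username" value="integration.user@example.com"/>
      <entry key="sfdc.password" value="encrypted-password-here"/>
      <entry key="sfdc.entity" value="Contact"/>
      <entry key="process.operation" value="update"/>
      <entry key="process.mappingFile" value="C:\dataloader\contactMap.sdl"/>
      <entry key="dataAccess.type" value="csvRead"/>
      <entry key="dataAccess.name" value="C:\dataloader\contacts.csv"/>
      <!-- Drop the batch size from the default of 200 down to 1 so each
           transaction stays small enough to clear the org's automation. -->
      <entry key="sfdc.loadBatchSize" value="1"/>
    </map>
  </property>
</bean>

A job configured this way trades speed for reliability, which is exactly the trade-off covered next.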

First, there are daily batch limits, and for larger datasets you may not be able to process every record without hitting that limit. Second, smaller batch sizes mean your total upload will take longer. For example, a 100,000-record load needs 100,000 batches at a batch size of 1, versus only 500 at the default size of 200. How much longer varies depending on your automation and the number of records, but it generally won’t be a crazy large increase. In the end, if you have a lot of processes running every day, lowering your batch size is a handy trick to help you avoid overloading the system while still being able to process all of your data.

Read the Data Loader Guide from Salesforce

-Ryan and the CloudMyBiz Team