CPQ Specialist Certification Guide

My Experience

This was the toughest exam of my Salesforce career, and I passed it on the retake. My first attempt came after a lot of preparation, but I was unlucky and didn't get through. On my second attempt, I planned my study around the exam guide and was able to crack it. Below are some details to help you find relevant articles and clear the exam. Please note that I took the SU20 version of the exam.

1. Exam Outline

https://trailhead.salesforce.com/help?article=Salesforce-Certified-CPQ-Specialist-Exam-Guide#outline

From the above link, it's evident that the exam covers the topics below, with the following weightage.

Topic                                          Weightage (%)   # of Questions
CPQ Platform                                   23              14
Bundle Configurations                          17              10
Pricing                                        16              10
Quote Templates                                7               4
Product Selection                              7               4
Orders, Contracts, Amendments, and Renewals    15              9
Products                                       11              7
Approvals                                      4               2

More details here: https://quip.com/OFC4APDlwMfO

2. CPQ 211 Training.

The next step is to understand CPQ in detail, and the CPQ 211 training helps here. The trainer articulates the topics in layman's terms, and it is the best training available on the internet.

https://salesforce.vidyard.com/watch/5tWx7yjh2N6PaEQkzXp7ug

3. CPQ Fast Path Training.

Fast Path trainings focus on specific exam topics, so I would advise going through the training below.

https://share.vidyard.com/watch/TVaHEPmdmBRXjvamDcZ8B4

4. Trailheads

Complete the Trailhead trails/modules/trailmixes below to get the required hands-on practice.

  1. https://trailhead.salesforce.com/users/strailhead/trailmixes/prepare-for-your-salesforce-cpq-specialist-credential – This one is official.
    1. https://trailhead.salesforce.com/content/learn/modules/salesforce-cpq-order-generation – Specific to the Order module.
    2. https://trailhead.salesforce.com/content/learn/projects/simplify-home-security-subscription-renewals – On subscription renewals.

5. Partner Learning Camp.

If you have access to the Partner Learning Camp, make use of the training there. It is again Fast Path-style training, and I would recommend watching the videos a day before the exam. The videos uploaded by Aman Jain are concise and crisp and will make the concepts solid in your mind.

https://sfdc.co/CPQPLC

SFDC ANT Deployments using Azure Pipelines

Introduction

Azure DevOps is a very powerful application that provides Git-based repos and pipelines to automate tasks relevant to almost any IT process. In this blog, we dive into the use of Azure Pipelines for Salesforce developers to carry out deployment activities. You can use this blog to understand how to push the metadata in your repo into a Salesforce org. This blog uses an Azure Repo to store the metadata and ANT-based deployments. You could also link a GitHub/Bitbucket repo into an Azure pipeline and run the deployments from there. Instead of ANT-based deployments, SFDX can also be leveraged, but that would need different pipeline tasks to support it. So let us get started retrieving and deploying metadata using ADO pipelines.

Prerequisites

We'll start with the assumption that you have good experience with the Salesforce ANT Migration Tool, because the Azure Pipeline we'll build uses this migration tool in the backend. You can start by cloning this repo from my GitHub here. Sign up for a free Azure DevOps account to try this out from your personal Azure dev org. You could also, if your project (at work) allows, create a new repo or branch with the files from GitHub and set up the pipeline to do the deployments.
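
If you want to sanity-check the migration tool locally before wiring it into the pipeline, you can run the same build file the pipeline will use. This is only a sketch: it assumes the repo layout used later in the pipeline (MyDevWorks/build.xml with a retrieve target) and that ANT plus the ant-salesforce.jar are already set up on your machine.

cd <your local clone>
ant -buildfile MyDevWorks/build.xml retrieve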

Process Diagram

From the above diagram, one can understand how the pipeline is configured to run. These are the steps/tasks in the pipeline file. Let us see what each step does:

  1. Checkout from the repo: This task checks out the entire repo into the virtual machine environment. (Azure Pipelines runs on a virtual machine.)
  2. Run ANT scripts: This is a standard pipeline task that is configured to work in one of the following ways:
    1. Retrieve – to just retrieve from your source org.
    2. Deploy – to deploy the metadata from the repo to the target org.
    3. Both – to do a retrieve and a deployment in a single run of the pipeline.
  3. Push to Repo: This task commits the files retrieved from the source org on the Azure virtual machine and pushes them to the repo.

The pipeline that I've created here is a dynamic one: it accepts the ANT target from a variable (BUILD_TYPE) that the user sets just before running the pipeline, so they can choose between retrieve/deploy/both. If you are comfortable with build.xml, you know that with multiple targets that use different sets of usernames/passwords, or with multiple pipelines linked to each other, you can automate the complete retrieve-and-deploy process from one org to another. In the example in my GitHub, you may notice that I am retrieving from and deploying to the same org. I have explained how this could be done between orgs in the video.

The Master File

Now let's look at the pipeline file in detail. I've also explained the below YML file in detail in the video.

# SFDC Retrieve and Deploy sample

trigger: none

pool:
  vmImage: 'ubuntu-latest'

steps:
- checkout: self

- script: |
    echo "Build Type: $(BUILD_TYPE)"
  displayName: 'Confirming Variables'

- task: Ant@1
  inputs:
    buildFile: 'MyDevWorks/build.xml'
    options: 
    targets: 'retrieve'
    publishJUnitResults: false
    javaHomeOption: 'JDKVersion'
  displayName: 'Retrieving from Source Org'
  condition: or(eq(variables['BUILD_TYPE'], 'retrieve'),eq(variables['BUILD_TYPE'], 'both'))

- task: Ant@1
  inputs:
    buildFile: 'MyDevWorks/build.xml'
    options: 
    targets: 'deployCode'
    publishJUnitResults: false
    javaHomeOption: 'JDKVersion'
  displayName: 'Deploy to Target Org'
  condition: or(eq(variables['BUILD_TYPE'], 'deploy'),eq(variables['BUILD_TYPE'], 'both'))

- script: |
    echo ***ANT steps completed successfully!***
    echo ***Starting copy from VM to repo***
    git config --local user.email "myemail@gmail.com"
    git config --local user.name "Rohit"
    git config --local http.extraheader "AUTHORIZATION: bearer $(System.AccessToken)"
    git add --all
    git commit -m "commit after extract"
    git remote rm origin
    git remote add origin https://<YOUR_REPO_TOKEN>@<YOUR_REPO_URL>
    git push -u origin HEAD:master
  displayName: 'Push to Repo'
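
The conditions above read BUILD_TYPE from a pipeline variable that is set at queue time. If you would rather declare the choice inside the YAML itself, Azure Pipelines runtime parameters are an alternative; the following is only a sketch of that option, not part of the repo above:

parameters:
- name: BUILD_TYPE
  displayName: 'Build Type (retrieve / deploy / both)'
  type: string
  default: 'retrieve'
  values:
  - retrieve
  - deploy
  - both

# The Ant tasks would then check the parameter instead of the variable, for example:
# condition: or(eq('${{ parameters.BUILD_TYPE }}', 'retrieve'), eq('${{ parameters.BUILD_TYPE }}', 'both'))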

Summing Up

With this setup, if your project employs a DevOps strategy, you can easily create pull requests to your target branch. You don't need to go through the pain of retrieving the metadata using ANT on your local machine, committing it locally, and pushing it to the remote using GitHub Desktop/TortoiseGit/SourceTree.

Always on Cloud. Retrieve and Deploy just by using a browser.

#Giveback #StaySafe

Automate SFDC Data Export Using ADO

Data export has been a hot topic ever since the inception of Salesforce, and there are a lot of tools that help you automate this task. Most of these tools write the extract either to a local drive or to a cloud server. How about a data extract that lands directly in your repo? Yes, you heard it right: it's possible. It has been possible for a long time, but since Azure DevOps (ADO) pipelines became popular, it has become much easier to implement. The same setup that I'll be explaining could be modified a bit to run from Docker or Jenkins as well. For now, let's focus our discussion on setting up this task on ADO.

Process Flow

ado_process

Setup Dataloader

The Data Loader ships with a command-line interface. Command Line Data Loader lets you run the Data Loader from the command line, driven by a process-conf.xml file that holds the details of the tasks to be performed. Install the latest version of Data Loader from your Salesforce org along with the Zulu OpenJDK. Salesforce Data Loader uses this JDK, and the PATH variable must be set for it on your machine to run and test it locally. For the ADO setup, I'll explain further down how we can install this JDK when we run the job.

Encrypt your password using the encrypt.bat file as outlined in the official documentation. Also, set up the process-conf.xml file in the samples folder. In this example, I've used two beans (that's what they are called in the Command Line Data Loader), one for the Account extract and another for the Contact extract.
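
Before wiring this into ADO, it is worth testing the beans locally from a command prompt. A minimal sketch, assuming Data Loader is unpacked under build/dataLoaderApp (the layout used later in the pipeline) and that the beans are named accountExtract and contactExtract as in the YML further down; the conf path is a placeholder for your local clone:

cd build\dataLoaderApp\bin
process.bat "C:\<your-local-clone>\build\dataLoaderApp\samples\conf" "accountExtract"
process.bat "C:\<your-local-clone>\build\dataLoaderApp\samples\conf" "contactExtract"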

Create YML Script

Now it's time to create the YML file. This is the file the ADO job picks up to perform the actions we define in it. Create an empty YML file, add the code below, and save it.

# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger: none
pool:
  vmImage: 'windows-latest'
  
steps:
- task: JavaToolInstaller@0
  inputs:
    versionSpec: '13'
    jdkArchitectureOption: 'x64'
    jdkSourceOption: 'LocalDirectory'
    jdkFile: 'build/setups/zulu13.29.9-ca-jdk13.0.2-win_x64.zip'
    jdkDestinationDirectory: '/builds/binaries/externals'
    cleanDestinationDirectory: true
- script: |
    mkdir extractFiles
    cd build/dataLoaderApp/bin
    echo ******Starting Customer Extract.....*******
    echo -----------------------------------
    echo Extracting Account...
    echo -----------------------------------
    call process.bat "D:/a/1/s/build/dataLoaderApp/samples/conf" "accountExtract"
    echo --------------------------------------------------------
    echo Account extraction completed successfully!
    echo --------------------------------------------------------    
  displayName: 'Account Extract'
- script: |
    cd build/dataLoaderApp/bin
    
    echo ------------------------
    echo Extracting Contact...
    echo ------------------------
    call process.bat "D:/a/1/s/build/dataLoaderApp/samples/conf" "contactExtract"
    echo ----------------------------------------------
    echo Contact  extraction completed successfully!
    echo ----------------------------------------------  
  displayName: 'Contact Extract'
- script: |
    echo ***All extracts successful!***
    echo ***Starting copy from VM to repo***
    git config --local user.email "youremail@email.com"
    git config --local user.name "Rohit"
    git config --local http.extraheader "AUTHORIZATION: bearer $(System.AccessToken)"
    git add extractFiles/*.csv
    git commit -m "commit after extract"
    git remote rm origin
    git remote add origin <Repo URL>
    REM Replace the username with the password in the URL, in the format https://<password>@dev.azure.com/..../../.../
    git push -u origin HEAD:master
  displayName: 'Push to Repo'
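
A note on the hardcoded D:/a/1/s path in the process.bat calls above: that is simply where the hosted Windows agent checks out the repo. If you would rather not hardcode it, the predefined $(Build.SourcesDirectory) variable points to the same location, so the call could be written as below (same bean name, just the path swapped):

call process.bat "$(Build.SourcesDirectory)/build/dataLoaderApp/samples/conf" "accountExtract"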

Setup ADO Pipeline

Now it's time to move on to Git and set up the pipeline. To keep within the scope of this blog, I'm not going into the details of ADO and pipelines; let's focus on the Data Loader automation part. ADO can work with any Git repo, and in this tutorial, we'll use an Azure Repo itself.

There is a free version of Azure DevOps that you can sign up for, and in this tutorial, I'll use my personal Azure instance.

Get yours by visiting here. Choose Sign Up and create an account. After that, log in to your Azure DevOps and follow the steps below:

  1. Create a new repo.
  2. Initialize the repo with a readme file.
  3. Clone the repo to your local machine.
  4. Merge the below files/folders into it:
    1. YML file
    2. Data Loader folder
    3. Zulu OpenJDK zip
  5. Commit the changes.
  6. Push to the remote (a sketch of the Git commands follows this list).
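
For steps 3 to 6, here is a minimal sketch of the Git commands; the repo URL and folder name are placeholders, and the YML file, the dataLoaderApp folder, and the Zulu OpenJDK zip should be copied into the working copy before staging:

git clone <your Azure repo URL>
cd <your repo folder>
git add .
git commit -m "Add data export pipeline files"
git push -u origin master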

Now you have the required files on your branch/repo, and it's time to create a pipeline job. Open your ADO project and click on Pipelines.

ado_pipeline

Follow the below steps:

  • Choose New Pipeline.
  • Choose ‘Azure Repos Git’
ado_git
  • Select your repo.
ado_repo
  • Choose existing pipelines YAML file.
ado_pipeline
  • Enter YML file path
ado_loc
  • Choose Continue at the bottom.
  • At this point you can preview the YML file.
  • Choose Save.
  • Click on Run Pipeline to run the job.

You can see the job status by selecting the job. Once the job has run successfully, you can see the extracted files in the extractFiles folder in the repo.

ado_files

Conclusion

You saw how the files were extracted and committed to the repo. An ADO job assigns the agent that you specify in the YML and runs the scripts/tasks in that VM environment. In this example we have used a Windows VM image, because the Command Line Data Loader works only in a Windows environment. This job was run manually; to schedule it, for example to run on the first of every month, you need to add a schedule trigger with a CRON expression. I will have this covered in the upcoming video.

schedules:
- cron: "0 10 1 * *"
  displayName: First of Month 10AM Build
  branches:
    include:
    - master
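
One thing worth knowing about scheduled triggers: by default, a scheduled run fires only when the branch has new commits since the last successful scheduled run. If the export should run every month regardless of code changes, the schedule can be marked to always run, as in this variant of the block above:

schedules:
- cron: "0 10 1 * *"
  displayName: First of Month 10AM Build
  branches:
    include:
    - master
  always: true  # run even when there are no new commits since the last scheduled run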

Cheers.

#GiveBack

Authenticate SFDX using JWT

Being late to understanding SFDX, I wasn't sure what its capabilities were, and on a normal development project, I don't think there is enough opportunity to work with SFDX. Luckily, I got a chance to work on a few DevOps setups for my client and got hands-on with the Salesforce Developer Experience – SFDX.

I'm not going into the details of what SFDX is and its capabilities, as those are covered by fellow bloggers. Instead, I will focus on how you can authenticate SFDX with an org of your choice. And again, there are blogs on this as well. So what's the focus here?

In this blog, let's forget about terms like scratch org, Dev Hub, etc. Instead, we'll make sure SFDX works for any "type" of org. I'm using my Developer Edition, and the same rules apply to a production or sandbox instance. Hmm… that's a lot of prologue. Let's get started.

Prerequisites

The below tools must be installed on your machine:

  • SFDX
  • OpenSSL

Flow in this tutorial

jwt_token_process

Setup SSL Certificate

This setup is required only for the purpose of this tutorial, as what we will generate is a self-signed certificate. A self-signed certificate is not recommended for use in a production instance; for a real project and application, you should go with a CA-signed certificate. Don't worry about the jargon, it is explained almost everywhere on the internet. You could contact your 'Digital Security' team to procure a CA-signed certificate; they would provide you with the required certificate and its key.

Install OpenSSL, if you don't have it on your system, using the below link; choose the version for your OS and don't choose the Light version. After the installation, restart your machine and verify that the OpenSSL PATH variable is set. This is the part I struggled with the most, as getting an OpenSSL binary was the toughest part of this journey.

https://slproweb.com/products/Win32OpenSSL.html
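
A quick way to confirm OpenSSL is reachable from your PATH after the install (assuming a standard setup):

openssl version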

Create a folder ‘JWT’ in a directory of your choice, navigate to it on the command line, and run the below commands one after the other.

openssl genrsa -des3 -passout pass:x -out server.pass.key 2048
openssl rsa -passin pass:x -in server.pass.key -out server.key
openssl req -new -key server.key -out server.csr

At this point, you need to enter a few details that will be taken into consideration while generating the certificate. After you finish entering the details, you will be back at the CLI prompt. Enter the below command and hit Enter.

openssl x509 -req -sha256 -days 365 -in server.csr -signkey server.key -out server.crt

Now you can see four files in the folder, of which two are needed: server.key and server.crt. We will upload the server.crt file when we create a connected app and pass the server.key along when we make the connection – in this case through the sfdx command.

openssl_output
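
If you want to double-check what was generated (optional), the certificate contents can be inspected with:

openssl x509 -in server.crt -text -noout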

Setup Connected App

Time to log in to Salesforce. Log in to your Developer Edition and create a connected app. Check 'Enable OAuth Settings' and 'Use digital signatures'. Your app should have details as in the below screenshots. Upload the server.crt file under the digital signature setting.

connected_app
connect_app_policy

Run SFDX Commands

All set. Now it's time to test the connection using the sfdx auth command. Run the below sfdx command; I've kept the server.key file at C:\JWT\server.key.

sfdx force:auth:jwt:grant -u <username> -f C:\JWT\server.key -i <consumerkey> -r https://login.salesforce.com
sfdx_auth_done

As you can see from the above image, SFDX got authenticated using JWT. The command used the key with which the certificate was generated and connected to Salesforce using the consumer key of the connected app that holds that certificate. Be very careful with the key file, as it effectively holds the keys to your org.
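
To confirm the authorization stuck, a quick check could be the following; the username is the same one passed to the grant command:

sfdx force:org:list
sfdx force:org:display -u <username>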

Walkthrough