Thursday, May 14, 2015

Trailhead: New Modules and Guided Learning

New options for learning Salesforce emerge all the time.  As an update to my earlier article on How to Learn Salesforce, Trailhead has recently been updated with new modules.  There is now a set of modules covering the new Lightning technologies: Connect, App Builder, Components and Process Builder.

This is a great set of modules if you want to get your hands dirty with guided practice.  I especially liked the Lightning Connect module.  The idea of being able to connect to external sources with OData is intriguing.  However, without an OData source of your own, it is hard to try it out in practice.  This module solves that issue by giving you a source to connect with.  The best part is that once you finish the exercises, you can experiment on your own.

Another new module is Reports and Dashboards.  I have always had an affinity for reporting and analytics, as it is where I started my professional career.  One of the challenges with trying to learn reporting is that of context.  You can learn the mechanics of a tool, but without business requirements it lacks substance.  The Reports and Dashboards module helps provide background, making the learning more meaningful.

Perhaps the most exciting development is guided learning paths to blaze your own trail.  When Trailhead was just a few modules, it was easy to pick and choose what you wanted.  Now, with more content, finding what you are looking for can be more difficult.  If you are a new admin, sifting through the more advanced developer options can be a challenge.  Now there is no need to worry, as Guided Paths exist for the following:

With each track you get an overview of the course time remaining and the points you have earned.  Personally, I have made it through most of the content, with a couple of modules left in the Intermediate Developer track.  I am also excited about what may come.  Looking at the track naming and interjecting a little bit of speculation, I am looking forward to the advanced topics for Admins and Developers.  What have you enjoyed most about Trailhead?

Sunday, May 10, 2015

Loading Analytics Cloud Directly from Excel

Since my previous post on Combining Salesforce and External Data, it is now even easier to load the example sales data.  With the release of Salesforce Wave Connector for Excel, no longer do you need to export data to a comma separated value (csv) file first.  Let’s take a look at using the connector to directly populate Salesforce Analytics Cloud from our Excel file.

Salesforce Wave Connector for Excel Installation

First, you want to add the Salesforce Wave Connector for Excel from the Office store, which is a free install.  Alternatively, you can install from directly within Excel.  Using Excel 2013, my steps were the following:

  • Go to the Insert tab
  • Click Apps for Office, See All, then Featured Apps
  • Search for Wave Connector, Select the result and click Trust It
  • A side-bar opens, and you supply your Microsoft account details

With the Wave Connector installed, it’s time to specify the Salesforce credentials to your Analytics Cloud environment.  You must also grant permission to the Wave Connector, like any OAuth app.  You can check out the help page if you need additional details on installing the connector.


Uploading Data from Excel

Now we need to specify the Excel data to load.  Simply select a range of columns and rows and provide a name to use in Analytics Cloud for your dataset. You can also review the number of Columns and Rows (32,767 is the row limit per import) as well as preview the Column Names and Data types. 
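If you are not sure whether your sheet fits under that limit, a quick check outside of Excel can save a failed upload. Here is a rough Python sketch (the sample CSV and helper name are just for illustration):

```python
import csv
import io

WAVE_ROW_LIMIT = 32767  # rows per import via the Wave Connector

def check_row_limit(csv_text, limit=WAVE_ROW_LIMIT):
    """Return (data_rows, fits) for CSV text that includes a header row."""
    rows = sum(1 for _ in csv.reader(io.StringIO(csv_text)))
    data_rows = max(rows - 1, 0)  # don't count the header
    return data_rows, data_rows <= limit

sample = "Account,Product,Amount\nAcme,Widget,100\nGlobex,Gadget,250\n"
print(check_row_limit(sample))  # → (2, True)
```

If your data exceeds the limit, you would need to split it across multiple uploads.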

If you want to change a data type, look for a column that is hyperlinked.  Clicking it cycles through the supported data types (text and date for dimensions, numeric for measures).


Then simply click Submit Data and the magic happens.  The upload progress is reported and a summary of columns and rows displayed.  You can then jump to Analytics Cloud or Import Another Dataset.



Head over to your Analytics Cloud environment and look for your new dataset.  The import may take a few minutes to complete, so you may have to refresh the page.  Our new SalesDataGenerator dataset is now ready to be analyzed through lenses and dashboards.


Additional Notes

  • If you need to change Analytics Cloud environments, use the drop-down next to your login in the Wave Connector for Excel to Log Out and supply new credentials
  • Administrators, take note of additional steps if you are using Office 365, outlined in the help files
  • Use the ? next to Get Started in the Wave Connector for Excel for additional details and troubleshooting tips.  Excel in Office 365 users need to remember to Bind Current Selection when uploading data
  • Remember you are limited to 20 datasets per day from external sources, and each upload from the Wave Connector for Excel counts towards this limit

Sunday, May 3, 2015

How to Combine Salesforce and External Data in Analytics Cloud

It is easy to build a dataset from Salesforce data in the Analytics Cloud. It is also rather easy to create a dataset from an external file, as long as you don’t have to change the metadata file. How to combine two sets of data together in the Analytics Cloud is less obvious. With a little bit of dataflow manipulation, you can create a new, composite dataset of both sources.  Here is a guide to help you through the process.

External Data

First, you want to load your external data from a CSV (comma-separated value) file by creating a new dataset. I have created a sample Sales Data file that contains sales by account broken down by product and time period. I have also included my Sales Data.json metadata file for reference. 

In this case, I did not have to make any changes from the system generated mappings. It is always a good idea to check your date format and values, as this is a common source of error. Now select Create Dataset and your Sales Data is in the Analytics Cloud.
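Since date formats are such a common source of error, it can be worth scanning the CSV before creating the dataset. A small Python sketch (the column name and date format here are assumptions for illustration):

```python
import csv
import io
from datetime import datetime

def bad_date_rows(csv_text, column, fmt="%m/%d/%Y"):
    """Return the row numbers whose date column doesn't parse with fmt."""
    reader = csv.DictReader(io.StringIO(csv_text))
    bad = []
    for row_num, row in enumerate(reader, start=2):  # row 1 is the header
        try:
            datetime.strptime(row[column], fmt)
        except ValueError:
            bad.append(row_num)
    return bad

sample = "Account,Period\nAcme,01/15/2015\nGlobex,2015-01-20\n"
print(bad_date_rows(sample, "Period"))  # → [3]
```

Any rows flagged can be fixed in the source file, or you can adjust the date format in the metadata to match.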

You can then start exploring the data by creating Lenses and Dashboards. What if you want to see this data linked with your Account details from Salesforce in order to analyze by state or account type?


Salesforce Data

Loading Salesforce data is a snap using the Dataset Builder user interface. In this example, I am only interested in pulling Account information, to keep it simple. I am going to add just the Account object and a few select fields to my dataset (Account ID, Account Name, Account Type, Industry and Region).

Next, select Create Dataset, provide a Dataset Name and use Create to save. This will add the associated JSON entries to the dataflow for this dataset. The data will not be loaded until the next scheduled run of the dataflow, currently set for once a day, or a manual run. Don’t start the dataflow yet, as we are going to have some fun first.


Dataflow JSON

We have avoided JSON manipulation so far, but that is about to change. You cannot combine external data with Salesforce data in the UI, but we can use previously loaded datasets in the dataflow. Start by downloading the dataflow JSON definition (SalesEdgeEltWorkflow.json) and open it in your preferred editor. I use JSON Editor, but would be willing to switch if you have suggestions on a better tool.

Here is my dataflow JSON that shows the Account load.
  "101_Load_Account": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Account",
      "fields": [
        { "name": "Id" },
        { "name": "Name" },
        { "name": "Type" },
        { "name": "Industry" },
        { "name": "Region__c" }
      ]
    }
  },
  "102_Register_Account": {
    "action": "sfdcRegister",
    "parameters": {
      "source": "101_Load_Account",
      "alias": "Account",
      "name": "Account"
    }
  }
For simplicity, I have removed the other flows for this example.  By default, the workflow steps are labeled with numbers; I have added additional text to the names to help distinguish them.  Your dataflow JSON may look different, but what is important is to find the Account load from Salesforce.

This will be an sfdcDigest step, with an object of Account and the fields we selected; mine is named 101_Load_Account.  sfdcDigest pulls the data from Salesforce.  Next, find the step with an sfdcRegister action that has the same source name as the Account load.  Mine is named 102_Register_Account, and it makes the dataset accessible in the Analytics Cloud under the name supplied.
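If your dataflow has grown large, you can also locate the digest/register pair programmatically rather than eyeballing the JSON. A rough Python sketch (the helper name is mine, and the flow below is trimmed to the Account steps):

```python
import json

def find_account_steps(flow, obj="Account"):
    """Return the (digest, register) step names for a given Salesforce object."""
    digest = next(name for name, step in flow.items()
                  if step["action"] == "sfdcDigest"
                  and step["parameters"]["object"] == obj)
    register = next(name for name, step in flow.items()
                    if step["action"] == "sfdcRegister"
                    and step["parameters"]["source"] == digest)
    return digest, register

flow = json.loads("""{
  "101_Load_Account": {"action": "sfdcDigest",
      "parameters": {"object": "Account", "fields": [{"name": "Id"}, {"name": "Name"}]}},
  "102_Register_Account": {"action": "sfdcRegister",
      "parameters": {"source": "101_Load_Account", "alias": "Account", "name": "Account"}}
}""")
print(find_account_steps(flow))  # → ('101_Load_Account', '102_Register_Account')
```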

Now, we are going to add a couple of additional steps. First, we need to add an edgemart transformation. This allows us to load a previously created dataset into the dataflow.
  "200_Sales_Data": {
    "action": "edgemart",
    "parameters": {
      "alias": "Sales_Data"
    }
  }

If you don’t recall the alias name of your dataset, you can always use the lens.apexp page in your org for reference - https://<pod>.  My step is named 200_Sales_Data and my alias was Sales_Data.  Now the Sales Data file we loaded earlier is available within our dataflow.

With the external data available, we can use an augment transformation to combine it with the account data. This is my step 201_Combine_Sales_Account with an augment action.  We are going to make the Sales Data our left object and Account the right, specifying the join condition via left key (Account from the Sales file) and right key (Name from the Account file). 

All of the columns from the Sales Data dataset are automatically included.  We also need to specify the columns to include from the right (the Account dataset) via right_select, as well as the relationship prefix name to use, Sales in this case.

  "201_Combine_Sales_Account": {
    "action": "augment",
    "parameters": {
      "relationship": "Sales",
      "left": "200_Sales_Data",
      "left_key": [ "Account" ],
      "right": "101_Load_Account",
      "right_key": [ "Name" ],
      "right_select": [ "Type", "Industry", "Region__c" ]
    }
  },
  "202_Register_Account_Sales_Dataset": {
    "action": "sfdcRegister",
    "parameters": {
      "source": "201_Combine_Sales_Account",
      "alias": "SalesWithAccounts",
      "name": "Sales With Accounts"
    }
  }
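If you prefer scripting to hand-editing, the change amounts to merging three new entries into the downloaded definition. A minimal Python sketch (the flow is trimmed to the Account steps, and the step names match my examples):

```python
import json

# Downloaded dataflow, trimmed to the Account steps for the example
flow = json.loads("""{
  "101_Load_Account": {"action": "sfdcDigest",
      "parameters": {"object": "Account", "fields": [{"name": "Id"}, {"name": "Name"}]}},
  "102_Register_Account": {"action": "sfdcRegister",
      "parameters": {"source": "101_Load_Account", "alias": "Account", "name": "Account"}}
}""")

# Merge in the edgemart, augment, and register steps
flow.update({
    "200_Sales_Data": {"action": "edgemart", "parameters": {"alias": "Sales_Data"}},
    "201_Combine_Sales_Account": {"action": "augment", "parameters": {
        "relationship": "Sales",
        "left": "200_Sales_Data", "left_key": ["Account"],
        "right": "101_Load_Account", "right_key": ["Name"],
        "right_select": ["Type", "Industry", "Region__c"]}},
    "202_Register_Account_Sales_Dataset": {"action": "sfdcRegister", "parameters": {
        "source": "201_Combine_Sales_Account",
        "alias": "SalesWithAccounts", "name": "Sales With Accounts"}},
})

updated = json.dumps(flow, indent=2)  # write this back out before uploading
print(len(flow))  # → 5
```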

Save the dataflow JSON file and go back to the Data Monitor.  Before you upload, you may want to save a backup of the previous dataflow in case you need to revert to the original.  Then upload the dataflow JSON file and start the dataflow. Once the dataflow finishes without errors, you have a brand new dataset of external and Salesforce data.
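Before uploading, a quick sanity check that every step reference still points at a real step can catch typos early. A rough Python sketch (the helper and the trimmed flow are just for illustration):

```python
import json

def validate_dataflow(flow):
    """Flag any step whose source/left/right points at a step name
    that doesn't exist in the dataflow."""
    problems = []
    for name, step in flow.items():
        params = step.get("parameters", {})
        for key in ("source", "left", "right"):
            ref = params.get(key)
            if ref is not None and ref not in flow:
                problems.append(f"{name}: {key} refers to missing step {ref}")
    return problems

broken = json.loads("""{
  "202_Register": {"action": "sfdcRegister",
      "parameters": {"source": "201_Combine", "alias": "x", "name": "x"}}
}""")
print(validate_dataflow(broken))
# → ['202_Register: source refers to missing step 201_Combine']
```

An empty list means every reference resolves, so the upload is at least structurally consistent.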


Final Touches

With our new dataset, we can put together lenses and dashboards of sales activity, over time by various account attributes such as account type, key status, etc.


With the ability to load edgemarts into dataflows, we are not limited to combining Salesforce and external data. We could also load two sets of external data into the dataflow and combine them into one final dataset. This is a very powerful feature of Analytics Cloud.


External Data Generation Notes

The Account Names are specific to my environment and may not match yours. I created a simple Sales Data Generator Excel Template that will generate random, sample data. Simply replace the Account Name on the Account tab. If you want to match on a different key to Salesforce, replace the values in Account Name with the desired data. Adding additional columns to the right of Account Name is fine.

The Dynamic tab uses a random number between one and the number of rows with a Row ID to find an Account. You can add more or fewer Accounts; just be sure to have Row ID values for all valid rows. The same approach is used for Product.

The Period is generated as a random date between a Start and End date specified on the Control tab. To change the date range, simply enter the desired start and end dates. Similarly, Price is selected between the Low and High Price on the Control tab. Change the Control values for a different price range.
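If you would rather generate the sample rows in code than in Excel, the Dynamic/Control tab logic can be sketched in Python (the names and ranges below are just illustrative):

```python
import random
from datetime import date, timedelta

def generate_sales_rows(accounts, products, start, end, low, high, n, seed=1):
    """Pick a random account, product, period and price per row, mirroring
    the spreadsheet's random Row ID lookups and Control-tab ranges."""
    rng = random.Random(seed)  # a fixed seed makes the output repeatable
    span = (end - start).days
    rows = []
    for _ in range(n):
        rows.append({
            "Account": rng.choice(accounts),
            "Product": rng.choice(products),
            "Period": start + timedelta(days=rng.randint(0, span)),
            "Price": round(rng.uniform(low, high), 2),
        })
    return rows

rows = generate_sales_rows(["Acme", "Globex"], ["Widget", "Gadget"],
                           date(2015, 1, 1), date(2015, 3, 31), 10.0, 500.0, 5)
print(len(rows))  # → 5
```

Writing the result out with the csv module then gives you the same Sales_Data.CSV you would get from the Paste Special step.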

Finally, as the Dynamic tab changes on each change, you will want to create a static copy of the data for repeatability. Copy the columns from Dynamic and then head to the Sales tab. Perform a Paste Special of Values with Number Formatting. Now you can save the tab as Sales_Data.CSV and you are ready to go with your own custom data.