Lessons Learned: Building a Google Data Studio Partner Connector

NewPath Consulting is adopting Google Data Studio as a custom reporting solution for Wild Apricot and related analytics systems like Google Analytics. Google Data Studio lets you connect, visualize and share powerful reports and dashboards with anyone who has a network connection. A web browser is all you need to view and edit a report; there is no complicated or expensive desktop software to install. There are several good short videos for Google Data Studio, but this longer walk-through with screenshots is really great if you want a deeper understanding of the product. Google Data Studio launched in beta in 2016 and was released to the public in the fall of 2018. At the Google Cloud Next 2018 conference, there were some really cool demos of Google Data Studio and Google’s multi-terabyte BigQuery database. To date, it is still a completely free service that Google makes available to anyone with a Google account.

A fair warning: there is a lot of documentation about Google Data Studio, the connector library, and a set of developer resources. Take your time to digest it, as it is constantly evolving and being improved. Recently we found Google’s codelabs, which include a 30-minute introduction to building community connectors. That code walk-through uses a basic, non-authenticated API and assumes the developer has quite a bit of background knowledge about the underlying principles of REST API architecture, API authentication and, most importantly, data modeling.

We built an approved Partner connector called Wild Apricot Reports Manager to get familiar with the capabilities of Google Data Studio and the Wild Apricot API. Community connectors are very powerful as they let you use data that is stored in any REST-API enabled app including thousands of SaaS applications available today. Community connectors can also fetch data through JDBC APIs, flat files (e.g. CSV, JSON, XML), and Google Apps Script Services. Community connectors work like a “data pipeline” between Google Data Studio and your favorite web service or app. Google Data Studio sends a request to your SaaS app through the community connector and the connector returns the requested data, so you can analyze and visualize it in Google Data Studio.

Google Data Studio Community Connector Architecture

Google Data Studio Community Connector Overview

You can build and deploy community connectors using Google Apps Script, a JavaScript-based rapid application development platform that makes it fast and easy to create business applications that integrate with G Suite. Google Apps Script is Google’s solution for scripting and automating its online suite of products. The best analogy is Visual Basic for Applications, the Microsoft-designed programming language from the 1990s that automates Microsoft Office and the rest of their desktop suite. Google Apps Script is designed and maintained for all of Google’s apps and services, so it is very “online” friendly and can automate much of Google’s web-based product line.

This blog post covers the things we learned between the summer of 2018 and the fall of 2019 about building a partner connector:

  1. Understand how the REST API works
  2. Create and test the basic community connector code
  3. Start with the simplest API endpoint
  4. Identify the dimensions and metrics in each API endpoint
  5. Test and deploy your community connector
  6. Use the Community Connector API reference
  7. The partner connector submission process is rigorous

Understand how a REST API works for your app or service

To analyze data with Google Data Studio, you need to understand the REST API of the app whose data you wish to analyze. The REST API’s underlying data structures will be used extensively by your connector. Wild Apricot’s API, like any other REST API, is intended for use by developers with technical expertise, so it is important to get the technical documentation for your app’s REST API. The Wild Apricot API help documentation is an excellent example. Doing simple tests on your app’s REST API can help you see much more clearly what your connector will see.
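For example, a throwaway Apps Script function like the sketch below lets you log one raw response and inspect exactly what your connector will have to work with. The endpoint URL is a placeholder, and authentication is omitted; substitute whatever your app’s API documentation specifies.

// Quick exploratory test: fetch one endpoint and log the raw response.
// The URL below is a placeholder; add whatever authentication your API requires.
function exploreApi() {
  var url = 'https://api.example.com/v2/Accounts'; // hypothetical endpoint
  var response = UrlFetchApp.fetch(url, { muteHttpExceptions: true });
  Logger.log('HTTP status: ' + response.getResponseCode());
  Logger.log(response.getContentText()); // the JSON your connector will parse
}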

You will also need to understand and implement the correct API authentication mechanism for your REST API. Thankfully, Wild Apricot supports a simple API key, so that is what our connector uses for now. Most partner connectors require a more secure authentication method (such as OAuth 2) that does not pass the API key as a configuration parameter.

Here are the most common approaches to REST API authentication:

No Authentication

Some APIs don’t require any authentication so you can simply call the endpoint (‘url’ in the call below). This is common for open data sets that are non-proprietary or don’t contain any private information about people.

var response = UrlFetchApp.fetch(url);

Authentication in the query parameter

A web app that stores any personal data or transactional information will have an authentication step to access the API. The simplest protocol to use is the API key in a query parameter of the REST API endpoint URL. Here’s an example code snippet where the API_KEY variable contains a uniquely created API key. You can think of this key as a password to your API, and it should not be stored in the code or in plain text. This key is generated (and can be revoked) in the web app. Wild Apricot supports the creation of one or more API keys.

var response = UrlFetchApp.fetch(url + '&api_key=' + API_KEY);

Authentication via HTTP headers

Another REST API authentication method involves passing the API key in an HTTP header rather than in the URL. This can be considered more secure, especially when the calls are made over SSL. Some REST APIs only allow authentication via HTTP headers. The code snippet below passes the API key inside the advanced parameters of the UrlFetchApp call.

// Note: many APIs expect the Basic credentials to be base64-encoded
// (e.g. Utilities.base64Encode(user + ':' + API_KEY)); check your API's documentation.
var params = {
  'headers': {
    'Authorization': 'Basic ' + API_KEY
  }
};

var response = UrlFetchApp.fetch(url, params);

OAuth Authentication

Finally, the most secure and widely supported REST API authentication method uses the OAuth 2 protocol. This is a multi-step process in which you first validate your credentials and are then granted an access token to access the API.

Google has published an Apps Script OAuth 2 library that handles a lot of the mechanics of this process.
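A typical service definition with that library looks something like the sketch below. The endpoint URLs, scope, client ID and secret are all placeholders for whatever your API’s OAuth 2 provider specifies.

// Hypothetical OAuth 2 service built with Google's apps-script-oauth2 library.
// All URLs, the scope and the credentials below are placeholders.
function getOAuthService() {
  return OAuth2.createService('exampleApi')
      .setAuthorizationBaseUrl('https://example.com/oauth/authorize')
      .setTokenUrl('https://example.com/oauth/token')
      .setClientId(CLIENT_ID)
      .setClientSecret(CLIENT_SECRET)
      .setCallbackFunction('authCallback')   // handles the redirect back from the provider
      .setPropertyStore(PropertiesService.getUserProperties())
      .setScope('read');
}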

Once you’ve received an access token, you can call the API and send the access token in the headers object, like so:

var headers = {
    "Authorization": "Bearer " + service.getAccessToken()
  };

Create and test the basic community connector code

There are many code samples that can serve as good starting points in the Google Data Studio Community Connectors GitHub repository. You will use the Google Apps Script editor to edit the set of files that comprise your connector. A community connector uses four function calls to drive its functionality: getAuthType(), getConfig(), getSchema() and getData(). Learn them and love them! Remember that you have complete access to JavaScript for manipulating the REST API data and doing basically anything you want before your connector returns it to Google Data Studio. The connector code can be used to validate configuration inputs, create the schema and add extra dimensions or metrics beyond what the REST API returns, as well as much else.
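A minimal, heavily simplified skeleton of those four functions, using the same object-literal response format as the snippets later in this post, might look like this (the configuration parameter shown is purely illustrative):

// Skeleton connector (illustrative only). Data Studio calls these four
// functions to authenticate, configure, describe and fetch your data.
function getAuthType() {
  // How users authenticate: NONE, KEY, USER_PASS or OAUTH2.
  return { type: 'KEY' };
}

function getConfig(request) {
  // The configuration screen shown when a new data source is created.
  return {
    configParams: [
      { type: 'TEXTINPUT', name: 'apiKey', displayName: 'API key' }
    ]
  };
}

function getSchema(request) {
  // The dimensions and metrics this connector exposes (see the Account schema below).
  return { schema: [] };
}

function getData(request) {
  // Fetch rows from the REST API and map them onto the requested fields.
  return { schema: [], rows: [] };
}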

We published the code under an open-source licence for the Wild Apricot Reports Manager (WARM) on GitHub so you can use it as a starting point. You can see how we did some of the more interesting things like validate the configuration parameters and use date ranges for the Invoices and Auditlog endpoints. There is also a way to call multiple endpoints and combine the data in one set of Google Data Studio dimensions and metrics.

The purpose of the connector is to authorize access to your API (if required), collect any configuration parameters, expose the appropriate dimensions and metrics (the schema) and populate that schema with real-time data from your app in Google Data Studio. The configuration parameters, along with the manifest file, allow you to collect a variety of information such as API keys, which endpoint you wish to use and other configuration details. There is also a mechanism, the setAllowOverride option, that allows configuration parameters to be changed after the data source has been added to a report. This can be very useful, especially when you wish to use alternate API keys with one data source!
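As an example of the override mechanism, here is an illustrative getConfig() written against the Community Connector service (described further below). The parameter IDs and labels are assumptions, not the actual WARM configuration.

// Illustrative getConfig() using the Community Connector service, with an
// API key parameter that report editors are allowed to override.
var cc = DataStudioApp.createCommunityConnector();

function getConfig(request) {
  var config = cc.getConfig();

  config.newTextInput()
      .setId('apiKey')
      .setName('Wild Apricot API key')
      .setAllowOverride(true);   // lets the value be changed after it is added to a report

  config.newSelectSingle()
      .setId('endpoint')
      .setName('API endpoint')
      .addOption(config.newOptionBuilder().setLabel('Accounts').setValue('accounts'));

  return config.build();
}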

Start with the simplest API endpoint

We started our development with the Accounts API endpoint. It doesn’t have any custom fields and describes the Wild Apricot account information such as the Account ID, domain name and account name. This endpoint is a very simple introduction to the Wild Apricot API.

Here’s a redacted JSON response from this call, which we can expose to Google Data Studio. Every API response in JSON format provides a very similar structure of name/value pairs; in certain cases there will be nested arrays within this structure. It is really important to understand that every successful REST API call should respond with JSON like this:

[
  {
    "Id": 221748,
    "Url": "https://api.wildapricot.org/v2.1/Accounts/221748",
    "PrimaryDomainName": "NewpathConsulting.wildapricot.org",

    ...

    "Currency": {
      "Name": "Canadian Dollar",
      "Code": "CAD",
      "Symbol": "$"
    },

    ...

    "Name": "NewPath Consulting",
    "ContactLimitInfo": {
      "CurrentContactsCount": 100,
      "BillingPlanContactsLimit": 100
    }
  }
]

Identify the dimensions and metrics in each API endpoint

There are only two kinds of data fields in Google Data Studio: dimensions and metrics. Dimensions are things you want to measure, or that serve as ways to categorize your data; metrics are numbers that measure the things contained in dimensions. A Google Data Studio report is simply a combination of visualizations, each made up of one or more dimensions and zero or more metrics. Your connector must identify which parts of the REST API response are dimensions and which are metrics, so that your data source is readily usable by your reporting analysts. In Wild Apricot, an invoice number is a dimension, whereas the invoice total amount is a metric.

Here’s a sample piece of code from the Wild Apricot community connector that defines the Account schema. Note that the conceptType key/value pair defines whether the data field will be a dimension or a metric. In this case there are three dimensions (Account Number, Account Domain and Account Name) that will be available to any report.

account: [
  {
    name: "Id",
    label: "Account Number",
    dataType: "NUMBER",
    semantics: {
      conceptType: "DIMENSION"
    }
  },
  {
    name: "PrimaryDomainName",
    label: "Account Domain",
    dataType: "STRING",
    semantics: {
      conceptType: "DIMENSION",
      semanticType: "URL"
    }
  },
  {
    name: "Name",
    label: "Account Name",
    dataType: "STRING",
    semantics: {
      conceptType: "DIMENSION"
    }
  }
]
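To tie the schema back to the JSON response shown earlier, here is a sketch of a legacy-style getData() for the Account endpoint. It assumes the schema array above is stored in a variable called accountSchema and that url and params come from the authentication snippets earlier; the published WARM code on GitHub is the full working version.

// Sketch of getData() for the Account schema above (assumes accountSchema,
// url and params are defined as in the earlier snippets).
function getData(request) {
  var response = UrlFetchApp.fetch(url, params);
  var account = JSON.parse(response.getContentText())[0]; // the API returns an array

  // Keep only the fields the report actually requested.
  var requestedFields = request.fields.map(function (field) { return field.name; });
  var dataSchema = accountSchema.filter(function (field) {
    return requestedFields.indexOf(field.name) > -1;
  });

  // Build one row whose values line up with the returned schema.
  var row = { values: dataSchema.map(function (field) { return account[field.name]; }) };

  return { schema: dataSchema, rows: [row] };
}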

Test and deploy your community connector

When you test your community connector you will be creating data sources and reports. A community connector has to be “installed” as a data source before it can be used. The deployment mechanism in Google Apps Script generates the links from which a data source can be created. Note that the Google Data Studio developer docs assume you know how to use the Google Apps Script development environment. Each community connector must have a manifest contained in the appsscript.json file. This file defines much of the connector’s metadata, including the support contact for the connector, the logo and other descriptive information.

To publish a connector for testing, go to Publish > Deploy from manifest… in the Apps Script editor.

[Screenshot: Deploy from manifest… in the Apps Script editor]

By building and managing deployments, you can control the exact code version your users use. You can also maintain multiple deployments for production, testing and development.

You can create a new deployment with the red Create button. A new deployment is essentially a “fork” of your connector that can be deployed for use through a link. Within each deployment you manage a “snapshot” of your code base that can be deployed for testing or to the public. For WARM we have a UAT (user acceptance testing) deployment that we use internally for testing; when we are ready to publish, we create a new version under the Production deployment.

To create a new version, just click “Edit” next to the deployment you wish to update and select New under the Version drop-down. You can also add some notes to the version.

To install the connector and create a new data source, click the Google Data Studio icon to the right of Get ID, which reveals a link that can be opened (and even shared) for other Google Data Studio users to try. This is how a community connector is distributed. Just select the deployment you wish to install and deploy it; you can install multiple versions of your data connectors under the same Google account.

[Screenshot: installing a deployment as a Google Data Studio data source]

Debugging can be done in various ways, but we normally use the console.log function call to print debugging messages to the console or to Google’s Stackdriver logging system. You can see the output of these messages from the execution of the connector in the My Executions section of the G Suite Developer Hub.
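As a simple illustration, logging the incoming request object at the top of getData() is often enough to diagnose configuration or schema problems (the body below is just a stub):

function getData(request) {
  // Log the incoming request; the output appears under My Executions
  // (Stackdriver logging) in the G Suite Developer Hub.
  console.log('getData request: ' + JSON.stringify(request));

  // ... fetch and return real data as shown earlier ...
  return { schema: [], rows: [] };
}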

Note that once a connector has been deployed to your account you should see the connector appear in your Partner Connectors list, although your account will be the only one that sees community connectors deployed this way. See the partner connector submission process below to publish your connector to the public without needing to deploy using a link. In the screenshot below we have both a Production and a UAT deployment installed for use.

[Screenshot: Production and UAT deployments listed under Partner Connectors]

Use the Community Connector API reference

We started to build the Wild Apricot connector in summer 2018 from the Google Data Studio community connector repository, before the Community Connector API reference was published. In early 2019 we began redesigning the connector code around the Community Connector API reference, and we will continue with this approach because it is more compact, easier to read and exposes a variety of interesting features. The Community Connector API reference will also be supported by Google with new connector functionality as the platform evolves. It will also make it easier for us to develop a custom fields feature that dynamically loads any custom common and membership fields for the Members endpoint, as well as any future endpoints with custom data that can be defined by the Wild Apricot administrator (such as event registration form fields).

The partner connector submission process is rigorous

Partner connectors are built and supported by Google Data Studio developer partners and are tested and evaluated by Google’s partner program for inclusion in the partner data connector list. One main benefit of partner connectors is that they are usable by anyone who tries Google Data Studio, without having to install a community connector by link. To promote a community connector into a partner connector, it must be reviewed and pass the requirements of the partner connector submission process. Here are a few gotchas that we had to ensure we built into the connector to pass the review:

  • the connector name must be 28 characters or less
  • provide a complete project manifest file (appsscript.json) with a valid set of key/value pairs (tip: here is the WARM manifest file, and see the illustrative sketch after this list)
  • get an approved pull request/merge of a new organization and source into the Google Data Studio official registry of community connectors
  • get an approved OAuth consent screen in a Google Cloud Platform project, which will allow your connector to be installed by any Google user without security warnings
  • any template report, if provided in the manifest file, must operate without any errors
  • the connector code must validate any configuration parameters that are provided and present errors as appropriate
  • you may be asked to submit a screencast of your production connector being added to a new Google account (not the one you used to develop the connector), showing all the functionality of your connector, including OAuth authorization
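For reference, the Data Studio portion of a project manifest is a small JSON block. The abbreviated sketch below is illustrative only, with placeholder URLs and text; the actual WARM manifest linked above is the authoritative example.

{
  "dataStudio": {
    "name": "Wild Apricot Reports Manager",
    "company": "NewPath Consulting",
    "companyUrl": "https://example.com",
    "logoUrl": "https://example.com/warm-logo.png",
    "supportUrl": "https://example.com/support",
    "description": "Report on Wild Apricot accounts, contacts, invoices and more."
  }
}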

Before your users authenticate, the OAuth consent screen allows them to choose whether to grant access to their private data, and gives them a link to your terms of service and privacy policy.

Good luck on your Google Data Studio community connector journey!

And if you would like to get in touch with us about our Google Data Studio expertise, please contact us.

P.S. The Google Data Studio Resources dashboard (built in Google Data Studio!) is a fantastic way to search through hundreds of GDS articles, videos and other resources.

About the author

Alex is a pioneer in using the cloud to meet the needs of small and medium-sized businesses (SMBs) and membership-based organizations. He has a BSc in computer science from the University of Michigan and has worked as a product manager at two Internet startups. Alex is a father of two and plays the trumpet for fun. He is the founder and president of the University of Michigan Alumni Club of Toronto.