
How to use the Desktop Cloud Automation Service

All Analytics Canvas users have access to the Desktop Cloud Automation service to schedule workflows. As a cloud-hosted service, it is subject to a quota that is shared with other Analytics Canvas products, including the Looker Studio Partner Data Connector and Analytics Canvas Online.

Overview of Desktop Cloud Automation

Desktop Cloud Automation is a cost-effective way to automate workflows created with Analytics Canvas Desktop. Any files created with the Desktop application can be automated, provided all of their data sources can be accessed by the Analytics Canvas Cloud.

Generally, if you are using APIs, the process is straightforward. If you use reference files, such as an Excel template, they will have to be uploaded to the cloud. If you access a database, you will need to permit our cloud servers to read from, or write to, your database.

Preparing your workflows for Cloud Automation

Cloud Automation means that you upload your Canvas file to the cloud, where it runs on our Analytics Canvas Cloud servers. The servers are not synced with or accessible by your local machine; you are placing your files onto the servers where they will run.

As such, your local directory is not accessible, nor can data be written back to your desktop directly. Instead, Canvas must connect to your data sources through remote channels, such as APIs for file-sharing services and remotely accessible databases.

  • Inputs from databases must use a username and password, as NT login will not work remotely from outside your firewall.
  • Canvas Cloud must have permission to access your databases. To grant it, you must allowlist our IP addresses; contact support@analyticscanvas.com to get the list.
  • If an input file is not connected by an API, it will remain static and will only change when the package is manually synced.
  • Output files, such as Text, CSV, Excel, Tableau Data Extract (TDE), or Tableau Hyper files, must be connected to one of the four supported file-delivery APIs: Amazon S3, Google Cloud Storage, Dropbox, or Google Drive. This way the file can be delivered from our cloud to you.
  • Tableau users have the option of having their files uploaded directly to Tableau Online or Tableau Server.
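The first bullet above is worth a quick illustration. Analytics Canvas manages database connections itself, but the sketch below (hypothetical helper, not part of the product) shows why SQL authentication is required: a Windows/NT "trusted" login is tied to your local domain and cannot be used from the Canvas Cloud, so the connection must carry an explicit username and password.

```python
# Illustration only: Analytics Canvas handles connections internally.
# A trusted-connection string such as
#   "Driver={...};Server=myserver;Database=mydb;Trusted_Connection=yes;"
# relies on NT login and will fail from outside your firewall.

def cloud_connection_string(server: str, database: str,
                            user: str, password: str) -> str:
    """Build an ODBC-style SQL Server connection string using SQL auth."""
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server={server};Database={database};"
        f"Uid={user};Pwd={password};"
    )
```

The user named here should be one your database administrator has created for Canvas Cloud, restricted to the allowlisted IP addresses.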

Creating a Canvas Cloud Automation Package

To have canvases run automatically in the cloud, you create a Canvas Automation Package. This package contains one or more canvases (represented by .ACC files) that are run in order.

It is recommended that you put all of the .ACC files that you want to run together, along with any data files they need, into a single folder on your machine.

To begin, click the Cloud Automation button on the main Canvas toolbar, or click Automation > Cloud Automation from the main menu.


If you haven't already, you will be asked to log in with your Analytics Canvas Cloud account.

Click on "New Package".


Then click on the "Add Canvas" button to select which canvas or canvases you want to run. You can pick multiple .ACC canvas files at a time.


You can then reorder the selected canvases by dragging them, remove any unwanted canvases by clicking the "X" button to the left, or add more canvases as needed using the "Add Canvas" button. Once you have the canvases you want, in the order you want, press Next.


After this, you will need to provide credentials for any databases. You will also be shown which credentials are being used for API sources and can change them if needed. Remember that for Google Analytics or Google Docs, all of the canvases in the package will use the same credential, so it must have access to everything that the canvases need.

If you are using data sources such as Google Analytics, Google Docs, Dropbox, or Google Drive, or destinations like Data Studio or BigQuery, then Canvas will handle all the credentials for you. If you use databases or other sources that require a login, you may be prompted for it at this time so that the canvas can be run on a schedule.

To save the passwords locally so you do not have to re-enter them, select "generate password file". This generates an obfuscated password file in the folder with the .ACC file. Keep this file safe, as it contains your credentials.


Press Next, and you can then define any schedule you want the package to run on. A schedule is not required; a package can also be run on demand.


You can also have notification emails sent when packages are run. More importantly, you can have an email sent when a package fails. It is recommended that you use this option: it is best to receive an email only when there is a problem.


On the error handling and logging tab, you have options for how you want any failed canvases to be handled.

If the canvases in the package rely on the previous canvases running, select "stop batch file immediately if a canvas fails" - this is the default.

However, if you are running a number of canvases that are not related, you can select "run all canvases even if canvases fail".
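The difference between the two modes can be sketched as follows. This is an illustration of the behavior described above, not the actual Canvas Cloud scheduler code:

```python
# Sketch of the two error-handling modes (illustration only).
#   stop_on_failure=True  -> "stop batch file immediately if a canvas fails"
#   stop_on_failure=False -> "run all canvases even if canvases fail"

def run_package(canvases, stop_on_failure=True):
    """Run (name, callable) pairs in order; record "ok"/"failed" per canvas."""
    results = {}
    for name, run in canvases:
        try:
            run()
            results[name] = "ok"
        except Exception:
            results[name] = "failed"
            if stop_on_failure:
                break  # later canvases are never attempted
    return results
```

In the default mode, a canvas that depends on an earlier failed canvas is simply never run; in the second mode, every canvas is attempted regardless of earlier failures.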

Once these settings are as desired, press Update, and the package will be published to the server. You will see a green "In Sync" indicator for each canvas, which tells you that there have been no changes to the local canvas. You can change the local canvas file as you wish; the file in the cloud will not be updated until you sync it.


If you want to run your package at a time other than the scheduled time, just click the "Run Now" button.

If you run a package just before it is scheduled to run, and it is still running when the scheduled time arrives, the scheduler will skip the scheduled run and log it as "Skipped". The same package cannot run twice at the same time: once a package is running, all users will see that it is running, and the "Run Now" button will be disabled.
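The skip rule above amounts to a simple overlap check, which might look like this (a sketch of the described behavior, not the actual scheduler):

```python
# Sketch of the overlap rule: a scheduled run is skipped (and logged as
# "Skipped") whenever the same package is already running.

def on_schedule_tick(package_running: bool, log: list) -> bool:
    """Return True if a new run was started, False if it was skipped."""
    if package_running:
        log.append("Skipped")   # package still running; do not overlap
        return False
    log.append("Started")
    return True
```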

If you run into any issues with cloud automation, contact support@analyticscanvas.com.
