Request to use Google Cloud Platform (GCP)

by Andreas Voniatis, Founder and Fractional SEO Consultant.

Clients that are large companies, publicly listed or otherwise, often have strict protocols in place to protect company security and prevent ‘shadow IT’. We’ve put together a guide should you need to justify to your own IT department why you’d like your own Google Cloud Platform (GCP) environment to work with our systems. It can easily be adapted for anything else you’d like to do with GCP, Amazon Web Services (AWS) or any other cloud computing provider.

Why we are requesting permission to use Google Cloud Platform (GCP)

We are requesting permission to use Google Cloud Platform (GCP) because we would like to automate our Organic Search / Search Engine Optimisation (SEO) reporting, and Artios (our proposed software vendor) provides APIs that, at this time, work exclusively on GCP.

What we will use GCP for

We will use GCP to extract and store the data from all of our SEO and website analytics tools using the Artios API. The data will be stored in GCP’s data warehouse product, known as Google BigQuery (GBQ).

The data held in GBQ will be available for querying, so that we can put together datasets to power the charts in our data visualisation platforms (a sketch of such a query follows the list below), such as:

  • Google Data Studio
  • Tableau
  • Microsoft Power BI
  • QlikView
  • And any other platform that uses SQL and connects to GCP
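
Below is a minimal sketch of what that querying could look like, assuming Python and the google-cloud-bigquery client library; the project ID, dataset, table and column names are illustrative placeholders rather than the real schema, which will depend on what the Artios API loads into GBQ.

    from google.cloud import bigquery

    # Connects with the project's default GCP credentials.
    # "our-seo-project" and the dataset/table/column names are hypothetical.
    client = bigquery.Client(project="our-seo-project")

    sql = """
        SELECT keyword, AVG(position) AS avg_position
        FROM `our-seo-project.seo_warehouse.rankings`
        WHERE snapshot_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
        GROUP BY keyword
        ORDER BY avg_position
    """

    # Run the query and print one row per keyword; a dashboard tool would
    # consume the same result set through its own GBQ/SQL connector.
    for row in client.query(sql).result():
        print(row.keyword, row.avg_position)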

Data sources

Data sources will include aggregated data from:

  • Google Analytics 
  • Google Search Console
  • Ranking data from a third party tool
  • Website crawling data from a third party tool
  • Backlinks data from a third party tool
  • Social media data from a third party tool
  • CRM data

Why our own cloud stack and not Artios

Artios could host all of our data to power our dashboards, like any SEO software vendor. However, we would like to have full control and independence over our data going forward.

In our own Google stack, the Artios APIs will deploy in minutes and start pulling and processing data, so that we can unify our data and power our marketing dashboards.

Who will have access to this data

Initially, Artios will be given an IAM account at “Project Editor” level so that they can deploy the APIs. This access will be removed once the dashboards are operational. Artios may have access restored on occasion for new release updates, or at our request for system training or to help us maximise our usage of the system.

Most of the time, only our marketing team will have access, at levels that reflect seniority and responsibility (a sketch of how these role grants could be managed follows the list below). For example:

  • Requester (me) – Project Owner
  • Manager – BigQuery Editor
  • Analyst – BigQuery Editor
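
Below is a non-authoritative sketch of how those grants could be managed, assuming Python and the Cloud Resource Manager API; the same changes could equally be made in the Cloud Console or with the gcloud CLI. The project ID and member addresses are hypothetical placeholders, and the exact predefined role names (for example, mapping “BigQuery Editor” to roles/bigquery.dataEditor) would be confirmed during setup.

    from googleapiclient import discovery

    # Hypothetical project ID; calls authenticate with default GCP credentials.
    PROJECT_ID = "our-seo-project"

    crm = discovery.build("cloudresourcemanager", "v1")

    # Read the project's current IAM policy.
    policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()

    # Grant temporary Project Editor access to Artios for the API deployment,
    # and BigQuery Data Editor access to the manager and analyst.
    # The email addresses are placeholders.
    policy.setdefault("bindings", []).extend([
        {"role": "roles/editor", "members": ["user:artios-deployer@example.com"]},
        {"role": "roles/bigquery.dataEditor",
         "members": ["user:manager@example.com", "user:analyst@example.com"]},
    ])

    # Write the updated policy back. Removing Artios's access later is the
    # reverse: drop that binding from the policy and call setIamPolicy again.
    crm.projects().setIamPolicy(
        resource=PROJECT_ID, body={"policy": policy}
    ).execute()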

What resources will be used

The resources to be used are:

  • Compute Engine – operating the Artios API
  • BigQuery – storing the data for querying and powering our dashboards

Costs

For 1,000 keywords, the approximate usage cost will be no more than $50 per month, virtually all of which will be Compute Engine.
