Future-Proof Your Analytics: A Guide to Storing and Analyzing Universal Analytics Data
Google has officially announced that the Universal Analytics (UA) platform will be discontinued on July 1, 2024. This was a much-anticipated update ever since the original sunset announcement in March 2022, which stated that standard UA properties would stop processing data beginning July 1, 2023.
Users will still be able to access previously processed data in their UA property for at least six months. For users continuing to rely on the Google Analytics platform, the next steps include not only upgrading to GA4 but also solving for the long-term storage of the historical data their UA property has been collecting.
There are a few checkpoints businesses using standard UA properties need to complete before the platform officially retires in 2024. These checkpoints are designed to help users create a migration plan and learn to navigate the new GA4 product. A recent update from Google outlines these checkpoints for those who haven't yet migrated to GA4, as well as for those who were using UA to understand and define audiences in Google Ads. For late adopters, Google Analytics will automatically create a GA4 property for those using the audience feature and will add the GA4 audiences to the relevant ad group or campaign. The update also outlines suggested ways to export UA data; as anticipated, these approaches require some technical expertise or extensive resources.
Before we dive into how to store historical data, it is important to understand why we are doing it and what results we can expect. Historical data enables you to track progress and improvement over time, which provides key insights and the ability to make important strategic decisions. This also helps you understand patterns and seasonality, which is an important part of building predictions and future trends. While exporting and storing historical information may seem like a straightforward and tactical step, it is an important aspect of data management, which is a crucial component of digital transformation. A thought-through data management approach enables organizations to leverage data as a strategic asset to drive innovation, improve customer experiences, and gain a competitive advantage.
The most crucial step in data management is to define clear goals and metrics. Before you start storing data, you must make sure to have a clear understanding of the metrics and KPIs you want to store/track. This will help you determine what data to export, how to align with the new data model, and how to organize it. Once the strategy is established, it will serve as the guidepost to define the requirements and understand which solution will be most helpful.
There are several options available to users for storing and accessing their UA data in the long term. Each of these solutions has certain limitations, which we have outlined below, along with notes on how technically or resource intensive each will be.
- Export individual reports into the following formats: CSV, TSV, TSV for Excel, Excel (XLSX), Google Sheets, PDF
- Apart from being a time-intensive process, some of the other limitations of exporting reports manually include:
- Only 2 dimensions at a time, e.g., Date and Users
- A maximum of 5,000 rows per export
- Issue of data sampling
- Custom Reports can help circumvent the dimension limitation, expanding to up to five dimensions. However, the issue of data sampling will persist if you gather reports for a date range that exceeds the 5,000-row limit. Google Sheets currently has an add-on that can automate some of the process, with the ability to specify the dimensions and metrics needed.
- Use Google Sheets Add-on: The Google Analytics Spreadsheet add-on makes it easier for users to access, visualize, share, and manipulate their data in Google Sheets. It is user-friendly and requires no coding or technical expertise, making it accessible to users of all levels. Users can create custom queries that include the specific metrics and dimensions they need, providing more flexibility than pre-built reports. It can be set up to automatically refresh the data in Google Sheets, saving time and allowing for easy collaboration and sharing within teams. While this is a very user-friendly and relatively seamless process, it does have a few limitations:
- The add-on is limited to exporting up to 10,000 rows of data per query, which may not be sufficient for large datasets.
- Limited functionality: The add-on does not support all the features available in the Analytics interface, such as custom reports.
- Use Google Analytics Query Explorer: Google Analytics Query Explorer is an interface that lets users construct API queries to gather data from their Google Analytics account. It is a great way for non-technical users to export and explore data across various dimensions and metrics. Data can be exported as a TSV (which can be converted to CSV format) or retrieved via an API query. An advantage of Query Explorer is that, unlike the in-product Google Analytics export, it does not have the 5,000-row limitation, which allows users to work with larger data sets.
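Under the hood, Query Explorer builds requests against the legacy Core Reporting API (v3). As a rough sketch of what such a request looks like, the helper below only assembles the request URL; the view ID is a hypothetical placeholder, and actually executing the request would additionally require an OAuth access token.

```python
from urllib.parse import urlencode

# Base endpoint of the (legacy) Core Reporting API v3 --
# the same API that Query Explorer constructs requests for.
BASE_URL = "https://www.googleapis.com/analytics/v3/data/ga"

def build_query(view_id, start_date, end_date, metrics, dimensions):
    """Assemble a Core Reporting API v3 request URL.

    view_id is a hypothetical placeholder; substitute your own
    UA view (profile) ID.
    """
    params = {
        "ids": f"ga:{view_id}",
        "start-date": start_date,
        "end-date": end_date,
        "metrics": ",".join(metrics),
        "dimensions": ",".join(dimensions),
        "max-results": 10000,  # the v3 API allows up to 10,000 rows per request
    }
    return f"{BASE_URL}?{urlencode(params)}"

url = build_query(
    "12345678", "2023-01-01", "2023-12-31",
    metrics=["ga:sessions", "ga:users"],
    dimensions=["ga:date"],
)
```

The resulting URL mirrors the parameter fields (ids, start-date, end-date, metrics, dimensions) that Query Explorer exposes in its form.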
- Exporting to BigQuery (available only to Analytics 360 users): Google BigQuery is a cloud-based data warehouse that allows users to store and analyze large amounts of data. By exporting their UA data to BigQuery, users can continue to access and analyze their data using custom SQL queries and third-party data visualization tools. To export data to BigQuery, users will need to set up a BigQuery project and create a dataset. They can then use the Google Analytics BigQuery Export schema to export their Universal Analytics data to the dataset. Once the data is exported, users can query it using SQL and visualize it using tools like Data Studio or Tableau.
- While native BigQuery export is available only to Google Analytics 360 users, Google Sheets offers a way to connect to BigQuery. This means pulling data into Google Sheets first and then setting up a BigQuery instance to ingest that data.
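Once UA data lands in BigQuery, the export schema stores one `ga_sessions_YYYYMMDD` table per day, which is typically queried with a table wildcard. The sketch below only composes such a query; the project and dataset names are hypothetical placeholders, and running the SQL would require the google-cloud-bigquery client with valid credentials.

```python
# Sketch of a query against the UA BigQuery export schema, which stores one
# ga_sessions_YYYYMMDD table per day of data.
def sessions_by_date_query(project, dataset, start, end):
    """Return SQL totalling sessions and pageviews per day over a date range.

    project and dataset are hypothetical placeholders; replace them with
    your own BigQuery export destination.
    """
    return f"""
        SELECT
          date,
          SUM(totals.visits)    AS sessions,
          SUM(totals.pageviews) AS pageviews
        FROM `{project}.{dataset}.ga_sessions_*`
        WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'
        GROUP BY date
        ORDER BY date
    """

sql = sessions_by_date_query("my-project", "my_ua_export", "20230101", "20231231")

# Executing it would look roughly like this (requires credentials):
#   from google.cloud import bigquery
#   rows = bigquery.Client().query(sql).result()
```

The `_TABLE_SUFFIX` filter limits the wildcard scan to the requested date range, which keeps query costs down on large exports.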
- Using the Google Analytics Reporting API to export data: This allows users to integrate analytics data with other applications and build complex reporting. The biggest advantage of using the Reporting API is that users can connect applications that might not have direct integration options with Google Analytics. However, technical expertise is required to implement the API calls and connect them to business applications.
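The Reporting API v4 accepts batched report requests as JSON. As a sketch, the helper below builds a `reportRequests` body for a daily sessions-and-users report; the view ID is a hypothetical placeholder, and sending the request requires an authorized client (for example, google-api-python-client with OAuth credentials).

```python
def build_report_request(view_id, start_date, end_date):
    """Build a Reporting API v4 batchGet body for daily sessions and users.

    view_id is a hypothetical placeholder for your UA view (profile) ID.
    """
    return {
        "reportRequests": [
            {
                "viewId": view_id,
                "dateRanges": [{"startDate": start_date, "endDate": end_date}],
                "metrics": [
                    {"expression": "ga:sessions"},
                    {"expression": "ga:users"},
                ],
                "dimensions": [{"name": "ga:date"}],
                "pageSize": 10000,  # v4 maximum rows per page
            }
        ]
    }

body = build_report_request("12345678", "2023-01-01", "2023-12-31")

# With google-api-python-client, the body would be sent as:
#   analytics = build("analyticsreporting", "v4", credentials=creds)
#   response = analytics.reports().batchGet(body=body).execute()
```

Because v4 caps each page at 10,000 rows, larger date ranges are fetched by repeating the request with the `pageToken` returned in each response.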
- Utilizing a program such as R/Python: Building on the solution above, you can use a programming language like R or Python to export historical UA data by connecting to the Analytics Reporting API. These programs allow users to retrieve data directly from the UA servers. Although this does require technical expertise, it offers users more flexibility, scalability, and automation than other methods. It can also enable long-term storage and integration with other analytical tools, enabling more sophisticated analysis and reporting.
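Once responses come back from the Reporting API, a script typically flattens the nested JSON into tabular rows for long-term storage. A minimal sketch, assuming the v4 batchGet response shape (the CSV writer and field names here are illustrative, not part of any Google library):

```python
import csv

def response_to_rows(response):
    """Flatten a Reporting API v4 batchGet response into a list of dicts."""
    report = response["reports"][0]
    header = report["columnHeader"]
    dims = header.get("dimensions", [])
    mets = [m["name"] for m in header["metricHeader"]["metricHeaderEntries"]]
    rows = []
    for row in report["data"].get("rows", []):
        # Pair each dimension/metric name with its value in this row.
        record = dict(zip(dims, row.get("dimensions", [])))
        record.update(zip(mets, row["metrics"][0]["values"]))
        rows.append(record)
    return rows

def write_csv(rows, path):
    """Persist the flattened rows to CSV for long-term storage."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
```

From here the same rows can just as easily be loaded into a database or a data warehouse instead of CSV, which is where the flexibility of a scripted export pays off.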
Regardless of the option chosen, companies must ensure that their UA data is stored securely and in compliance with applicable data protection regulations. Storage and maintenance of data present a lot of opportunities; however, one of the main limitations is the effort required to store and process it. Storing large amounts of data can be time-consuming and resource intensive. It is important to define the "why" of this data management process to set businesses up for success and enable them to make data-informed decisions.
By choosing the right method for storing and processing data and considering how it can be combined with GA4 data in the future, businesses can make the most of their analytics data and use it to inform their business decisions.
Photo Credit: Uriel SC | Unsplash