QA / Debugging

To QA/debug the event-based tracking implementation, install the Chrome plugin Event Based web analytics: https://chrome.google.com/webstore/detail/gddl-inspector/glemmkkipabbechhnlmfiddhebdmandc

Go to Chrome Developer Tools > tab EBTAnalyticsQA > page events

Testing

Using this approach allows us to ensure online data quality in several ways. With the addition of the attribution layer, we can parse web pages for the analytics data attributes defined on them and verify these against a reference set. This can be done in a visually attractive way, giving content managers, developers and analysts direct insight into the content and quality of the analytics attributes.
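As a minimal sketch of this verification step, the snippet below scans a page for elements carrying an analytics data attribute and compares the parsed values against a reference set. The attribute name (data-analytics) and the shape of the reference set are assumptions made for illustration; they are not prescribed by the framework itself.

```javascript
// Sketch: verify analytics data attributes on the page against a reference set.
// The attribute name "data-analytics" and the reference format are assumptions.
function auditAnalyticsAttributes(referenceSet) {
  const findings = [];

  document.querySelectorAll('[data-analytics]').forEach((el) => {
    let parsed;
    try {
      // Assumption: the attribute holds a JSON payload describing the element.
      parsed = JSON.parse(el.getAttribute('data-analytics'));
    } catch (e) {
      findings.push({ element: el, status: 'invalid-json' });
      return;
    }

    const expected = referenceSet[parsed.id];
    if (!expected) {
      findings.push({ element: el, status: 'not-in-reference', actual: parsed });
    } else if (JSON.stringify(expected) !== JSON.stringify(parsed)) {
      findings.push({ element: el, status: 'mismatch', expected, actual: parsed });
    }
  });

  return findings; // Feed this into a visual overlay or a test report.
}
```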

The event chain mechanism allows us to listen, with custom code, to events passing through the chain, and to verify whether they should appear, whether they are well formatted and whether they contain the correct information. These mechanisms are easy to implement in automated test suites for both unit and end-to-end testing. This allows us to lock in quality control of online data capturing and makes it part of the normal development process. The information contained in the events need not be limited to the attributed values. If back-end data passes through the front-end and needs to be added to the Digital Data, it can easily be added to the event data structure from front-end JavaScript code. The same applies to localStorage and/or cookie data that should be passed along.
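The sketch below shows one way such a listener could look in a test suite. It assumes the event chain dispatches CustomEvents named gddl:event on the document; the event name and the required fields are illustrative assumptions, not part of a fixed specification.

```javascript
// Sketch: collect events passing the chain and validate their structure in tests.
// The event name "gddl:event" and the required fields are assumptions.
const capturedEvents = [];

document.addEventListener('gddl:event', (e) => {
  capturedEvents.push(e.detail);
});

// Validation helper usable from a unit or end-to-end test.
function validateEvent(event, requiredFields = ['event', 'component', 'action']) {
  const missing = requiredFields.filter((field) => !(field in event));
  return { valid: missing.length === 0, missing };
}

// In a test: assert that the expected event appeared and is well formed.
// const result = validateEvent(capturedEvents.find((e) => e.event === 'product-click'));
// assert(result.valid, 'Missing fields: ' + result.missing.join(', '));
```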

We have already referred to QA a few times. Working in a standardized way is one of the pillars of the GDDL framework, and this is exactly what is needed for automated QA testing: standards.

A lot of quality issues can be tackled by defining clear roles and responsibilities. Within our framework this is taken care of by defining the responsibilities of the developer and the (technical) web analyst: the former is responsible for all code that runs on the web page itself, the latter for all code that runs in the tag manager.

Using the event-driven GDDL approach, once the configuration is stable, all of the specific scripting for sending events and translating them to the data layer can be loaded directly from the web pages themselves rather than from the TMS. This allows the IT part of the implementation to be put into the QA chain in a controllable way.
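A minimal sketch of such a page-hosted translation script is shown below. It reuses the assumed gddl:event CustomEvent name from the earlier example and targets a GTM-style window.dataLayer; both names are assumptions used for illustration.

```javascript
// Sketch: page-hosted glue that translates GDDL events into data layer pushes,
// so this code can be versioned and QA'd with the site instead of living in the TMS.
// "gddl:event" and the dataLayer payload shape are assumptions.
window.dataLayer = window.dataLayer || [];

document.addEventListener('gddl:event', (e) => {
  const payload = e.detail || {};
  window.dataLayer.push({
    event: payload.event || 'gddl.generic',
    component: payload.component,
    action: payload.action,
    // Back-end, localStorage or cookie data can be merged in here as well.
    ...payload.data,
  });
});
```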

As already mentioned, Humix will take ownership of the tag manager. By limiting the number of parties that can publish tags and having someone who truly knows the tag manager configuration, we will already drastically reduce the number of erroneous implementations.

Another quality measure that we will put in place is a reduction of the number of tag manager containers and their included tags. By centralizing most of the tracking, we will decrease the effort it takes to maintain the configuration.

D’Ieteren suggested ObservePoint as a QA audit tool in the RFP. However, we advise first investigating the possibilities of including the implementation part of the GDDL in the QA process of your development team. In addition, we are working on an end-to-end testing project that can check whether the GDDL is correctly implemented at URL level and whether the expected events are emitted correctly throughout the framework. This project is based on an existing testing framework, Nightwatch + Cucumber, running on the Selenium-based WebDriver architecture to simulate different browsers.
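To give an idea of what such an end-to-end check could look like, here is a Nightwatch-style test sketch. The test URL, the digitalData object shape and the field names are assumptions for illustration, and the Cucumber step definitions used in the actual project are omitted.

```javascript
// Sketch: Nightwatch test that checks the data layer on a given URL.
// The URL and the "digitalData" structure are placeholders/assumptions.
module.exports = {
  'GDDL is present and populated on the page': function (browser) {
    browser
      .url('https://www.example.com/') // placeholder URL under test
      .waitForElementVisible('body', 5000)
      .execute(function () {
        // Runs in the browser: inspect the data layer object.
        return {
          hasDataLayer: typeof window.digitalData === 'object',
          pageName: window.digitalData && window.digitalData.page
            ? window.digitalData.page.pageInfo.pageName
            : null
        };
      }, [], function (result) {
        browser.assert.ok(result.value.hasDataLayer, 'digitalData object is present');
        browser.assert.ok(!!result.value.pageName, 'pageName is populated');
      })
      .end();
  }
};
```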
