Adobe’s performanceTiming plugin, with some improvements and an explanation

As Page Performance (rightfully) gets more and more attention, I’ve been hearing more and more questions about the Performance Timing plugin from Adobe Consulting. Adobe does have public documentation for this plugin, but I think it deserves a little more explanation, as well as some discussion of gotchas and potential enhancements.

How It Works

Adobe’s Page Performance plugin is actually just piggybacking on built-in browser functionality: your browser has already determined at what time your content started loading and at what time it stopped loading. You can see this in a JavaScript Console by looking at performance.timing:
[Screenshot: performance.timing output in the JavaScript console]

This shows a timestamp (in milliseconds since Jan 1, 1970, which the internet considers the beginning of time) for when the current page hit certain loading milestones.
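
If you want to poke at this yourself, you can paste something like the following into the console (the values in the comments are just illustrative):

// A few of the milestones the browser has already recorded for the current page
var t = performance.timing;
console.log(t.navigationStart); // e.g. 1556048745659
console.log(t.domInteractive);  // e.g. 1556048746300
console.log(t.loadEventEnd);    // e.g. 1556048746779 (reads as 0 until the page finishes loading)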

Adobe’s plugin looks at that performance timing data, compares the various milestone timestamps against each other, then does some math to put them into nice, easy-to-read seconds. For instance, my total load time would be the number of seconds between navigationStart and loadEventEnd:

1556048746779 (loadEventEnd) – 1556048745659 (navigationStart) = 1120 milliseconds, or 1.12 seconds.
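
The same math, using the browser’s own numbers instead of my hard-coded ones:

var t = performance.timing;
var totalLoadSeconds = (t.loadEventEnd - t.navigationStart) / 1000; // 1.12 in my example above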

Additionally, if I choose to, I can have the plugin grab information from the browser’s built-in performance.getEntries(), store it in session storage (not a cookie, because it can be a long list), and put it into the variable of your choice (usually a listVar or list prop) on the next page. These entries show you, for EACH FILE on the page, how long it took to load.
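
You can preview that raw data in the console, too; each entry exposes the fields the plugin captures (this is the browser API itself, not the plugin):

// One entry per resource the page loaded, plus entries like first-paint
performance.getEntries().forEach(function(entry) {
  console.log(entry.name, entry.startTime, entry.duration, entry.initiatorType);
});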

Unfortunately, if I’m sending my analytics page view beacon while the page is still loading, then the browser can’t tell me when “domComplete” happened… because it hasn’t happened yet! So the plugin writes all these values to a cookie, then on your NEXT beacon reads them back and puts them into numeric events that you define when you set the plugin up. This means you won’t get values on the first page of a visit, and the values for the last page of a visit will never be sent in (the plugin may collect the data, but it has no follow-up beacon to send it in on); the same goes for single-page visits, which is why your Performance Timing Instances metric may look significantly different from your Page Views metric. It also means you don’t want to break these metrics down by page, but rather by PREVIOUS page, which is why this plugin is so often rolled out alongside the getPreviousValue plugin.
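
Conceptually, the hand-off works something like this simplified sketch (the storage key is hypothetical; the actual plugin writes the timing values to an s_ptc cookie, keeps the entries list in sessionStorage, and maps everything into the events you configure):

// Page A: once the load event has completely finished, stash the timing for later
window.addEventListener('load', function() {
  setTimeout(function() { // wait a tick so loadEventEnd has been populated
    var t = performance.timing;
    sessionStorage.setItem('perfTotalLoad', String(t.loadEventEnd - t.navigationStart));
  }, 0);
});

// Page B (or the next beacon): read the stashed value, send it in, then clear it
var stashed = sessionStorage.getItem('perfTotalLoad');
if (stashed !== null) {
  sessionStorage.removeItem('perfTotalLoad');
  // ...attach the value to a numeric event on this beacon...
}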

What It Captures

Out of the box, the plugin captures all of the following into events:

  • Redirect Timing (seconds from navigationStart to fetchStart- should be zero if there was no redirect)
  • App Cache Timing (seconds from fetchStart to domainLookupStart)
  • DNS Timing (seconds from domainLookupStart to domainLookupEnd)
  • TCP Timing (seconds from connectStart to connectEnd)
  • Request Timing (seconds from connectEnd to responseStart)
  • Response Timing (seconds from responseStart to responseEnd)
  • Processing Timing (seconds from domLoading to loadEventStart)
  • onLoad Timing (seconds from loadEventStart to loadEventEnd)
  • Total Page Load Time (seconds from navigationStart to loadEventEnd)
  • Instances (for calculated metric- otherwise you only really get the aggregated seconds, which is fairly meaningless if your traffic fluctuates)

Which gets you reporting that looks like this:
[Screenshot: example report showing aggregated seconds per page]

…Which, to be honest, isn’t that useful, because it shows the aggregated number of seconds. The fact that our product page took 1.3 million seconds in redirect timing in this reporting period means nothing without some context. That’s why that last metric, “instances”, exists: you can turn any of the first 9 metrics into a calculated metric that shows you the AVERAGE number of seconds in each phase of the page load:

[Screenshot: calculated metric definition]
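
For example, an “Average Total Page Load Time” calculated metric is simply the event divided by the instances (the event numbers here match the example configuration later in this post):

Average Total Page Load Time = Total Page Load Time (event9) / Performance Timing Instances (event11)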

This gives me a much more useful report, so I can start seeing which pages take the longest to load:

[Screenshot: example report with the average-seconds calculated metrics]

As you can see, the calculated metric can use either the “Time” format or the “Decimal” format, depending on your preference.

Performance Entries

As mentioned previously, the plugin can also capture your performance entries (that is, a list of ALL of the resources a page loaded, like images and JS files) and put them into a listVar or prop of your choice. This returns a list, delimited by “!”, where each value has a format that includes:

The name of the resource (ignoring query params) | at what second in the page load this resource started loading | how long it took for that resource to finish loading | resource type (img, script, etc).

For example, on my blog, I might see it return something like this:

https://digitaldatatactics.com/beaconParser/index.html|0.0|0.9|navigation!https://www.digitaldatatactics.com/utility/spiffy.css|0.2|0.1|link!https://digitaldatatactics.com/beaconParser/decoder.css|0.2|0.1|link!https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js|0.2|0.1|script!https://digitaldatatactics.com/beaconParser/js/varInfo.js|0.2|0.1|script!https://digitaldatatactics.com/beaconParser/js/HBvarInfo.js|0.2|0.1|script!https://digitaldatatactics.com/beaconParser/js/wsse.js|0.2|0.1|script!https://digitaldatatactics.com/beaconParser/js/apiConfig.js|0.2|0.2|script!https://digitaldatatactics.com/beaconParser/js/apiRetrieve.js|0.2|0.1|script!https://digitaldatatactics.com/beaconParser/js/decode.js|0.2|0.1|script!https://digitaldatatactics.com/beaconParser/js/print.js|0.2|0.1|script!https://digitaldatatactics.com/images/NarrowBanner.png|0.2|0.3|img!https://digitaldatatactics.com/beaconParser/images/trash.png|0.2|0.5|img!https://assets.adobedtm.com/launch-EN3911ddbdce3b4c4697e2d3c903e9cfc5.min.js|0.4|0.1|script!https://digitaldatatactics.com/beaconParser/images/background-trans.png|0.4|0.3|css!first-paint|0.5|0.0|undefined!first-contentful-paint|0.5|0.0|undefined!https://assets.adobedtm.com/extensions/EP4c3fcccffd524251ae198bf677f3b6e9/AppMeasurement.min.js|0.5|0.0|script!https://jenniferkunz.d1.sc.omtrdc.net/b/ss/jenniferkunztestdev/1/JS-2.12.0-L9SG/s44127282881639|0.7|0.2|img

From this, I can see every file that is used on my page and how long it took to load (and yes, it is telling me that the last resource to load was my analytics beacon, which started 0.7 seconds into my page loading and took 0.2 seconds to complete). This is a LOT of information, and at a bare minimum it will make your analytics beacons very long (you can pretty much accept that most of your beacons are going to become POST requests rather than GET requests), but it can be useful for seeing whether certain files are consistently slowing down your page load times.
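
If you ever want to sanity-check one of these values outside of Adobe Analytics (say, in a data feed export), splitting it back apart is straightforward. This helper is purely illustrative and not part of the plugin:

// Split a captured value back into objects: entries are delimited by "!", fields by "|"
function parsePerformanceEntries(value) {
  return value.split('!').filter(Boolean).map(function(entry) {
    var fields = entry.split('|');
    return {
      name: fields[0],                        // resource URL (query params already stripped)
      startSeconds: parseFloat(fields[1]),    // seconds into the page load when it started
      durationSeconds: parseFloat(fields[2]), // seconds it took to finish loading
      initiatorType: fields[3]                // img, script, link, etc.
    };
  });
}
// e.g. parsePerformanceEntries('https://example.com/app.js|0.2|0.1|script');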

An Enhancement: Time to Interaction

Unfortunately, the version of the plugin most folks use omits one performance timing metric that many believe is the most critical: Time to DomInteractive. As this helpful site states:

  • Page Load Time is the time in which it takes to download the entire content of a web page and to stabilize.
  • Time to Interactive is the amount of time in which it takes for the content on your page to become functional and ready for the user to interact with once the content has stabilized.

In other words, Page Load Time might include the time it takes for a lot of background activity to go on, which may not necessarily stop the user from interacting with the site. If your page performance goal is the best possible user experience, then Time to Interactive should be a key metric in measuring that. So, how do we track it? It already exists in that performance.timing object, so I tweaked the existing plugin code to include it. I can then create a calculated metric off of that (Time to Interactive/Page Performance Instances), and you can see it tells a very different story for this site than Total Page Load Time did:
[Screenshot: report showing the Time to Interactive calculated metric]

9.49 seconds DOES sound like a pretty awful experience, but all three of these top pages had a much lower (and much more consistent) number of seconds before the user could start interacting with the page.
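
Under the hood, the only thing my version of the plugin adds is one more comparison against performance.timing; you can preview the number it will report with something like this (my version measures from connectStart to domInteractive, to mirror the other metrics):

// The extra milestone my tweaked plugin captures
var t = performance.timing;
var timeToInteractive = (t.domInteractive - t.connectStart) / 1000;
console.log(timeToInteractive.toFixed(2) + ' seconds until the DOM became interactive');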

Basic Implementation

There are three parts to setting up the code for this plugin: before doPlugins (configuration), during doPlugins (execution), and after doPlugins (definition).

Configuration

First, before doPlugins, you need to configure your usage by setting s.pte and s.ptc:

s.pte = 'event1,event2,event3,event4,event5,event6,event7,event8,event9,event10,event11';
s.ptc = false; //this should always be set to false for when your library first loads

In my above example, here is what each event will set:

  • event1= Redirect Timing (seconds from navigationStart to fetchStart- should be zero if there was no redirect)- set as Numeric Event
  • event2= App Cache Timing (seconds from fetchStart to domainLookupStart)- set as Numeric Event
  • event3= DNS Timing (seconds from domainLookupStart to domainLookupEnd)- set as Numeric Event
  • event4= TCP Timing (seconds from connectStart to connectEnd)- set as Numeric Event
  • event5= Request Timing (seconds from connectEnd to responseStart)- set as Numeric Event
  • event6= Response Timing (seconds from responseStart to responseEnd)- set as Numeric Event
  • event7= Processing Timing (seconds from domLoading to loadEventStart)- set as Numeric Event
  • event8= onLoad Timing (seconds from loadEventStart to loadEventEnd)- set as Numeric Event
  • event9= Total Page Load Time (seconds from navigationStart to loadEventEnd)- set as Numeric Event
  • event10= Total Time to Interaction (seconds from connectStart to domInteractive)- set as Numeric Event. NOTE- THIS IS ONLY ON MY VERSION OF THE PLUGIN, OTHERWISE SKIP TO INSTANCES
  • event11= Instances – set as Counter Event

I’d also need to make sure those events are enabled in my Report Suite with the correct settings (everything should be a Numeric Event, with the exception of instances, which should be a Counter Event).

Execution

Within doPlugins, I just need to run the s.performanceTiming function. If I don’t want to capture performance entries (which is reasonable- not everyone has the listVars to spare, and it can return a VERY long value that can be difficult to get insight out of), then I fire the function without any arguments:

s.performanceTiming();

If I DO want those performance entries, then I add the name of that variable as an argument:

s.performanceTiming("list3")

Also, you’re going to want to be capturing Previous Page Name into a prop or eVar if you aren’t already:

s.prop1=s.getPreviousValue(s.pageName,'gpv_pn');

(If you are already capturing Previous Page Name into a variable, you don’t need to capture it separately just for this plugin- you just need to be capturing it once somewhere).
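
Putting the execution pieces together, doPlugins might look something like this (prop1 and list3 are just the example variables from above; substitute whatever variables you’ve actually mapped):

s.usePlugins = true;
s.doPlugins = function(s) {
  // capture previous page name so the timing events can be attributed to the page they describe
  s.prop1 = s.getPreviousValue(s.pageName, 'gpv_pn');
  // run the plugin; only pass a variable name if you want the performance entries captured
  s.performanceTiming("list3");
};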

Definition

Finally, where I have all of my plugin code, I need to add the plugin definitions. You can get Adobe’s version from their documentation, or if you want it with Time To Interactive, you can use my version:

/* Plugin: Performance Timing Tracking - 0.11 BETA - with JKunz's changes for Time To Interaction. 
Can you guess which line I changed ;)?*/
s.performanceTiming=new Function("v",""
+"var s=this;if(v)s.ptv=v;if(typeof performance!='undefined'){if(perf"
+"ormance.timing.loadEventEnd==0){s.pi=setInterval(function(){s.perfo"
+"rmanceWrite()},250);}if(!s.ptc||s.linkType=='e'){s.performanceRead("
+");}else{s.rfe();s[s.ptv]='';}}");
s.performanceWrite=new Function("",""
+"var s=this;if(performance.timing.loadEventEnd>0)clearInterval(s.pi)"
+";try{if(s.c_r('s_ptc')==''&&performance.timing.loadEventEnd>0){try{"
+"var pt=performance.timing;var pta='';pta=s.performanceCheck(pt.fetc"
+"hStart,pt.navigationStart);pta+='^^'+s.performanceCheck(pt.domainLo"
+"okupStart,pt.fetchStart);pta+='^^'+s.performanceCheck(pt.domainLook"
+"upEnd,pt.domainLookupStart);pta+='^^'+s.performanceCheck(pt.connect"
+"End,pt.connectStart);pta+='^^'+s.performanceCheck(pt.responseStart,"
+"pt.connectEnd);pta+='^^'+s.performanceCheck(pt.responseEnd,pt.respo"
+"nseStart);pta+='^^'+s.performanceCheck(pt.loadEventStart,pt.domLoad"
+"ing);pta+='^^'+s.performanceCheck(pt.loadEventEnd,pt.loadEventStart"
+");pta+='^^'+s.performanceCheck(pt.loadEventEnd,pt.navigationStart);pta+='^^'+s.performanceCheck(pt.domInteractive, pt.connectStart);"
+"s.c_w('s_ptc',pta);if(sessionStorage&&navigator.cookieEnabled&&s.pt"
+"v!='undefined'){var pe=performance.getEntries();var tempPe='';for(v"
+"ar i=0;i-1?pe[i].name.split('?')[0]:pe[i].name;tempPe+='|'+(Math.round(pe["
+"i].startTime)/1000).toFixed(1)+'|'+(Math.round(pe[i].duration)/1000"
+").toFixed(1)+'|'+pe[i].initiatorType;}sessionStorage.setItem('s_pec"
+"',tempPe);}}catch(err){return;}}}catch(err){return;}");
s.performanceCheck=new Function("a","b",""
+"if(a>=0&&b>=0){if((a-b)<60000&&((a-b)>=0)){return((a-b)/1000).toFix"
+"ed(2);}else{return 600;}}");
s.performanceRead=new Function("",""
+"var s=this;if(performance.timing.loadEventEnd>0)clearInterval(s.pi)"
+";var cv=s.c_r('s_ptc');if(s.pte){var ela=s.pte.split(',');}if(cv!='"
+"'){var cva=s.split(cv,'^^');if(cva[1]!=''){for(var x=0;x<(ela.lengt"
+"h-1);x++){s.events=s.apl(s.events,ela[x]+'='+cva[x],',',2);}}s.even"
+"ts=s.apl(s.events,ela[ela.length-1],',',2);}s.linkTrackEvents=s.apl"
+"(s.linkTrackEvents,s.pte,',',2);s.c_w('s_ptc','',0);if(sessionStora"
+"ge&&navigator.cookieEnabled&&s.ptv!='undefined'){s[s.ptv]=sessionSt"
+"orage.getItem('s_pec');sessionStorage.setItem('s_pec','',0);}else{s"
+"[s.ptv]='sessionStorage Unavailable';}s.ptc=true;");
/* Remove from Events 0.1 - Performance Specific, 
removes all performance events from s.events once being tracked. */
s.rfe=new Function("",""
+"var s=this;var ea=s.split(s.events,',');var pta=s.split(s.pte,',');"
+"try{for(x in pta){s.events=s.rfl(s.events,pta[x]);s.contextData['ev"
+"ents']=s.events;}}catch(e){return;}");
/* Plugin Utility - RFL (remove from list) 1.0*/
s.rfl=new Function("l","v","d1","d2","ku",""
+"var s=this,R=new Array(),C='',d1=!d1?',':d1,d2=!d2?',':d2,ku=!ku?0:"
+"1;if(!l)return'';L=l.split(d1);for(i=0;i-1){C=L[i].split(':');C[1]=C[0]+':'+C[1];L[i]=C[0];}if(L[i"
+"].indexOf('=')>-1){C=L[i].split('=');C[1]=C[0]+'='+C[1];L[i]=C[0];}"
+"if(L[i]!=v&&C)R.push(C[1]);else if(L[i]!=v)R.push(L[i]);else if(L[i"
+"]==v&&ku){ku=0;if(C)R.push(C[1]);else R.push(L[i]);}C='';}return s."
+"join(R,{delim:d2})");

You’ll also need to have the s.apl and s.split utilities in your plugin code.

You can see a full example of what your plugins code might look like, as well as a deobfuscated picking-apart of the plugin, on our GitHub.

Performance Entries Classifications

If you ARE capturing Performance Entries in a listVar, I recommend setting up 5 classifications on that listVar:

  • Resource/File
  • Starting Point
  • Duration
  • Duration- Bucketed (if desired)
  • Resource Type

Then set up a Classification Rule, using this regex string as the basis:

^(.+)\|(.+)\|(.+)\|(.+)
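
To see how the four capture groups line up with the classifications (the bucketed duration comes from additional rules against the Duration value), you can test the regex against a single entry; this snippet is just for illustration, since the real mapping happens in the Classification Rule Builder:

var entry = 'https://digitaldatatactics.com/utility/spiffy.css|0.2|0.1|link';
var match = entry.match(/^(.+)\|(.+)\|(.+)\|(.+)/);
// match[1] = Resource/File, match[2] = Starting Point,
// match[3] = Duration, match[4] = Resource Type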

In our git repo, I have a full list of the classification rules and regex I used, including how to bucket the durations so you get less granular values like “0.5-1.0 seconds”, which can give you a report like this:

[Screenshot: report with bucketed duration classifications]

Implications for Single Page Apps

Unfortunately, this plugin will NOT be able to tell you how long a “virtual page” on a single page app (SPA) takes to load, because it relies on the performance.timing info, which is tied to when the initial DOM loads. This isn’t to say you can’t deploy it on a Single Page App- you may still get some good data, but the data will be tied to when the overall app loads. Take this user journey for example, where the user navigates through a SPA to Page C, then refreshes the page:

[Diagram: example flow of how this plugin would work in a SPA]

As you can see, we’d only get performanceTiming entries twice- once on Page A and once on the refreshed Page C. Even without the “virtual pages”, it may still be worth tracking- especially since a SPA may have a lot of upfront loading on the initial DOM. But it’s not going to tell the full story about how much time the user is spending waiting for content to load.

You can still try to measure performance for state changes/”virtual pages” on a SPA, but you’ll need to work with your developers to figure out a good point to start measuring (is it when the user clicks on the link that takes them to the next page? Or when the URL change happens?) and at what point to stop measuring (is there a certain method or API call that brings in content? Is there a “loading” icon you can piggyback on to detect the end?). Make sure if you start going this route (which could be pretty resource intensive), you ask yourselves what you can DO with the data: if you find out that it takes an average of 2.5 seconds to get from virtual page B to virtual page C, what would your next step be? Would developers be able to speed up that transition if the data showed them the current speed was problematic?
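
If you do go down that route, the building blocks themselves are simple; the hard part is agreeing on those start and end points. This is only a rough, hypothetical sketch (both the data-spa-link attribute and the “virtualPageReady” event are assumptions standing in for whatever hooks your app actually provides):

var virtualPageStart;

// Start the clock when the route change begins (here, a click on a link your router handles)
document.addEventListener('click', function(e) {
  if (e.target.closest && e.target.closest('a[data-spa-link]')) {
    virtualPageStart = performance.now();
  }
});

// Stop the clock wherever your app signals that the new content is ready
document.addEventListener('virtualPageReady', function() {
  if (virtualPageStart !== undefined) {
    var seconds = ((performance.now() - virtualPageStart) / 1000).toFixed(2);
    // send "seconds" in with the virtual page view beacon, e.g. as a numeric event
    virtualPageStart = undefined;
  }
});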

Use the Data

Finally, it’s important to make sure that after you’ve implemented the plugin, you set aside some time to gather insights and make recommendations. I find that this plugin is often used to just “check a box”- it’s nice to know you have it implemented in case anyone ever wants it, but once it is implemented, it often goes ignored. It is good to have in place sooner rather than later, because questions about page performance often only come up after a change to the site, and you’ll want a solid baseline already in place. For instance, if you’re migrating from DTM to Launch, you might want to roll this plugin out in DTM well in advance of your migration so that afterwards, you can see the effect the migration had on page performance. Consider setting a calendar event for 2 weeks after any major site change to remind you to go and look at how it affected the user experience.

 


Published by Jenn Kunz

Jenn is an industry expert on Adobe Analytics, Implementation, Tag Management, and Data Layers. Her favorite video game is currently Horizon Zero Dawn, her favorite board game is currently Ex Libris, and her favorite books are the Stormlight Archive by Brandon Sanderson. She is based out of Marietta, Georgia.

Join the Conversation

9 Comments

  1. Nice one and great explanation. Small question: this solution heavily depends on cookies; does Apple Safari’s recent ITP 2.2 have any impact on this?

    1. I’ll admit I haven’t examined this extremely closely, but I believe that since the cookie usage is really just to get the data from page A to page B (it doesn’t need to persist multiple days or between visits), it’s not as affected as some of the other things we use cookies for in Analytics.

  2. Hi Jenn

    Have you seen any significant outliers with this implementation? We get a small share of big positive/negative performance metrics (less than 2.5%), but can’t seem to associate them with any specific journey.

    1. Yes, there are always some outliers, and usually it’s specific to the user’s internet connection and not your page performance. For instance, my AT&T U-verse connection at home has random hiccups sometimes, where Page A and Page B will load fine, then Page C randomly times out (or close to it), but reloads fine when I refresh. OR users may just have consistently stellar or really awful connection speeds.
      For instance, on a site where I’ve implemented this, when I load the page I just got a total load time of 5.06 seconds. If I throttle my connection to mimic a slow 3G connection, then that number just became 17.17 seconds. If you want to play around with how the user experience changes based on the user’s connection, you can use “throttling” on the Network tab in Chrome’s developer console.
      [Screenshot: where to find the throttling option in the Network tab in Chrome’s dev console]

  3. I’ve not looked at performance reporting before.
    What conditional formatting have people used? So, your “upper limit”, “midpoint” and “lower limit” values.

  4. Your plugin code shows html entities e.g. &amp > which causes js errors when attempting to copy/paste.

  5. Hi!
    Great run-through of this extremely powerful plugin!
    One note/question: as you mention, the asset level data is stored in a sessionStorage item, but this method is largely incompatible with subdomains; asset level data stored on “mywebsite.com” will not be available on “support.mywebsite.com” and vice versa. The getPreviousValue on the other hand is compatible with such cross-domain navigation as the cookie will likely be set on “.mywebsite.com”.
    Therefore, when correlating the asset reporting with pages, I’d suggest using the “navigation” entry in the asset list, by simply setting it via a classification (e.g. “Asset Page”) – this way, we’re 100% certain of which page the assets were present on at the time. If you’re correlating asset performance with the “previous page” (from getPreviousValue), you will see invalid or missing data when the user navigates between subdomains.
    Would you agree with this caveat and the workaround suggested? Would love to hear your thoughts on “best practice” implementation and reporting on cross domain navigation hits.
    Happy to elaborate if needed. Thanks for taking your time 🙂
