Lately I’ve been thinking about how the Web could benefit from a Standardized System of Measurements, especially in how we discuss performance.

I was pleased to find this video where Paul Irish and a consortium of folks discuss how to measure User Experience (delivery, paint, jank, etc.) in Progressive Web Apps.

Collectively defining our own standards

While those big brains think on how to measure real-world performance¹, maybe there’s something us “normies” can do in the field.

I’m beginning to believe that front-end developers need to normalize how we perceive and test our sites, and, to put it bluntly, that standard must be Mobile First. I believe using “Chrome on my quad-core desktop” as a baseline skews our perception of performance the wrong way.

If the quasi-scientific Web Community were to collectively agree (yes, I realize this will never happen) to use the same testing environments, like Science, it could help eliminate vanity metrics and produce better comparisons and better practices. Here’s my proposal, a starting point, with Mobile First as a guiding principle.

  • Throttled to 3G Good/Fast (~750 kb/s with ~40 ms RTT)
    • This emulates still-common 3G signal and bad wifi (aka, lie-fi).
    • Fiddler, Charles, or Web Inspector can throttle to 3G.
    • WebPageTest can run at 3G (see the scripted throttling sketch after this list).
  • Tested in Chrome on actual Nexus 5X
    • Android has ~80% of global mobile market share.
    • Nexus 5X is probably a good example of a “median” Android device.
    • USB remote debugging works from both Mac and Windows.
    • Chrome is currently the most advanced mobile browser.
    • Real devices have CPU, GPU, and memory limitations that our quad-core super computers don’t. Alex Russell talks about this in his Google I/O 2016 talk.
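
If you want that throttling as part of an automated run rather than a DevTools checkbox, here’s a minimal sketch using Puppeteer’s raw DevTools Protocol access. The profile mirrors the bullet above; the 1.5 Mb/s download figure is my assumption taken from Chrome’s own “Good 3G” preset, and the URL is a placeholder.

```js
// A sketch, not part of the proposal itself: throttle a headless
// Chrome session to the 3G Good profile above via the DevTools
// Protocol. The 1.5 Mb/s download figure is assumed from Chrome's
// "Good 3G" preset; https://example.com is a placeholder.
const puppeteer = require("puppeteer");

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Raw DevTools Protocol session for the network emulation command.
  const client = await page.target().createCDPSession();
  await client.send("Network.enable");
  await client.send("Network.emulateNetworkConditions", {
    offline: false,
    latency: 40,                                 // ~40 ms RTT
    downloadThroughput: (1.5 * 1024 * 1024) / 8, // bytes per second
    uploadThroughput: (750 * 1024) / 8,          // bytes per second, ~750 kb/s
  });

  await page.goto("https://example.com");
  // ...collect your measurements here...
  await browser.close();
})();
```

This drives the same Network.emulateNetworkConditions command that DevTools throttling exposes, so scripted runs should line up with what you see when throttling by hand.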

All measurements would be based on those two scenarios. You could build and test your websites with that little device tethered out to the side. This could be accompanied by a second tier of test scenarios that, in the interest of simplicity, are optional:

  • Tested on Edge w/ multitouch screen
    • Default browser for Windows 10
    • Windows has ~80% of global desktop market share
    • 10-point touch via Pointer Events
    • Historically lacking in feature support
  • Tested in Safari
    • Default browser for MacOS and iOS
    • Slower annual release cycle
    • Currently scores lowest of the major browsers on HTML5Test
  • Tested on Opera Mini
    • Harshest of all proxy browsers
    • Simulation of when things fail (see the cut-the-mustard sketch after this list)
    • Ignores some CSS, no fonts, low JavaScript support
    • No asynchronous JavaScript
    • Arbitrary JavaScript execution timeout
  • Tested in Firefox
    • Dependable “median” rendering and functionality
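
Opera Mini’s limitations in particular suggest a defensive pattern worth naming: the BBC-style “cut the mustard” check, where the core HTML/CSS experience stands on its own and JavaScript enhancement is gated behind feature tests. A minimal sketch (the /js/enhanced.js path is hypothetical):

```js
// A minimal "cut the mustard" sketch (my addition, not the author's):
// the core HTML/CSS experience stands on its own, and the enhanced
// bundle only loads when the browser passes the feature tests. If a
// proxy browser ignores the script or times it out, nothing breaks.
// "/js/enhanced.js" is a hypothetical bundle path.
if ("querySelector" in document &&
    "localStorage" in window &&
    "addEventListener" in window) {
  var script = document.createElement("script");
  script.src = "/js/enhanced.js";
  script.async = true;
  document.head.appendChild(script);
}
```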

Together these cover the high and low end of bandwidth, the high and low end of browser feature support, and some of the high and low end of global market share. I’ve also tried to limit this to an “affordable” number of devices.

I don’t think this replaces a robust device lab or a browser support matrix based on your own company’s stats, but I do think it’s an honest mirror to reflect our work. And covering these bases would make the edge cases more tolerable.

You can develop your sites however you want, but by giving ourselves limitations we can do away with vanity stats and get an honest look at how our servers, HTML, CSS, and JavaScript frameworks are performing.
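
For the honest look itself, the page can report its own paint timings while tethered to that real device. A small sketch using the standard PerformanceObserver and Paint Timing APIs (entries only appear in browsers that support them):

```js
// Log the page's own paint timings, e.g. while remote-debugging a
// tethered device over USB. Standard PerformanceObserver + Paint
// Timing APIs; entries only appear in browsers that support them.
var observer = new PerformanceObserver(function (list) {
  list.getEntries().forEach(function (entry) {
    // entry.name is "first-paint" or "first-contentful-paint"
    console.log(entry.name + ": " + Math.round(entry.startTime) + " ms");
  });
});
observer.observe({ entryTypes: ["paint"] });
```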

I don’t actually own a Nexus 5X, but if enough people find this reasonable, I’d gladly pony up the money for one. Heck, maybe Google can cut us an awesome discount if enough people are interested.

  1. A neat quote from someone in that meeting: “We’ve done a ton of magic to make scrolling awesome because scrolling is the primary interaction. Also tapping links. Those are the built-in interactions the Web gives you… If we scroll and tap on a link, and nothing happens, what have we won?” And I love that. Let’s not break those things.