Documentation / Upgrade


Upgrading is just updating to the new version. There are a couple of important things that have changed.

Graphite keys

The keys in Graphite have a new structure. The reason is that we want a generic solution where the same dashboards can be used for any site, and we want to follow the new data structure. If the old data is important to you, run two instances in parallel for a while, one with 3.x and one with 4.0, each with its own dashboards. Once you have built up history, switch to the 4.0 dashboards.
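If you go the parallel route, one way to keep the two data sets apart is to give the 4.0 instance its own namespace. A sketch, using flag names from the CLI mapping in this document (the hostname, URL, and namespace are placeholders):

```shell
# 3.x keeps writing the old key structure
sitespeed.io -u https://www.example.com --graphiteHost graphite.example.com

# 4.0 run in parallel, writing the new key structure under its own
# namespace so the old and new keys never mix
sitespeed.io https://www.example.com \
  --graphite.host graphite.example.com \
  --graphite.namespace sitespeed_io_v4
```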

CLI mapping

A lot has changed in the CLI, and the easiest way to see what you can use is to run with --help. You can also check this mapping.
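For example, assuming sitespeed.io 4.x is installed (e.g. globally via npm):

```shell
# Print every available 4.0 option with its description and default value
sitespeed.io --help

# Confirm which version you are running
sitespeed.io --version
```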

| 3.x | 4.0 | Description |
|-----|-----|-------------|
| -u &lt;URL&gt;, --url &lt;URL&gt; | N/A | The start URL that will be used when crawling. |
| -f &lt;FILE&gt;, --file &lt;FILE&gt; | N/A | The path to a plain text file with one URL on each row. Each URL will be analyzed. |
| --sites &lt;FILE&gt; | N/A | The path to a plain text file with one URL on each row. You can use the parameter multiple times to point out many files. |
| -V, --version | -V, --version | Display the version. |
| --silent | N/A | Only output info in the logs, not to the console. |
| -v, --verbose | -v, --verbose | Enable verbose logging. |
| --noColor | N/A | Don't use colors in console output. [false] |
| -d &lt;INTEGER&gt;, --deep &lt;INTEGER&gt; | --crawler.depth, -d | How deep to crawl. [1] |
| -c &lt;KEYWORD&gt;, --containInPath &lt;KEYWORD&gt; | N/A | Only crawl URLs that contain this in the path. |
| -s &lt;KEYWORD&gt;, --skip &lt;KEYWORD&gt; | N/A | Do not crawl pages that contain this in the path. |
| -t &lt;NOOFTHREADS&gt;, --threads &lt;NOOFTHREADS&gt; | N/A | The number of threads/processes that will analyze pages. [5] |
| --name &lt;NAME&gt; | N/A | Give your test a name; it will be added to all HTML pages. |
| --memory &lt;INTEGER&gt; | N/A | How much memory the Java process will have (in mb). [256] |
| -r &lt;DIR&gt;, --resultBaseDir &lt;DIR&gt; | --outputFolder &lt;DIR&gt; | The result base directory, the base dir where the results end up. [sitespeed-result] |
| --outputFolderName | --outputFolder | By default the folder name is a date of format yyyy-mm-dd-HH-MM-ss. |
| --suppressDomainFolder | N/A | Do not use the domain folder in the output directory. |
| --userAgent &lt;USER-AGENT&gt; | --browsertime.userAgent &lt;USER-AGENT&gt; | The full User Agent string, default is Chrome for Mac OS X. [userAgent\|ipad\|iphone]. [Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.120 Safari/537.36] |
| --viewPort &lt;WidthxHeight&gt; | --browsertime.viewPort &lt;WidthxHeight&gt; | The view port, the page viewport size WidthxHeight, like 400x300. [1280x800] |
| -y &lt;FILE&gt;, --yslow &lt;FILE&gt; | N/A | The compiled YSlow file. Use this if you have your own rules. |
| --headless | N/A | Choose which backend to use for headless. [phantomjs\|slimerjs] [phantomjs] |
| --ruleSet &lt;RULE-SET&gt; | --plugins.load, --plugins.disable | Which ruleset to use. [] |
| --limitFile &lt;PATH&gt; | --config | The path to the limit configuration file. |
| --basicAuth &lt;USERNAME:PASSWORD&gt; | --browsertime.preScript, --preScript | Basic auth user & password. |
| -b &lt;BROWSER&gt;, --browser &lt;BROWSER&gt; | --browsertime.browser, -b, --browser | Choose which browser to use to collect timing data. Use multiple browsers in a comma separated list. (firefox\|chrome\|headless) |
| --connection | --browsertime.connectivity.profile, -c, --connectivity | Limit the speed by simulating connection types. Choose between mobile3g, mobile3gfast, cable, native. [cable] |
| --waitScript | --browsertime.pageCompleteCheck | Supply a JavaScript that decides when a browser run is finished. Use it to fetch timings happening after the loadEventEnd. [if (window.performance && window.performance.timing){ return ((window.performance.timing.loadEventEnd > 0) && ((new Date).getTime() - window.performance.timing.loadEventEnd > 2000));} else { return true;}] |
| --customScripts | --browsertime.postScript, --postScript | The path to an extra script folder with scripts that will be executed in the browser. |
| --seleniumServer &lt;URL&gt; | --browsertime.selenium.url &lt;URL&gt; | Configure the path to the Selenium server when fetching timings using browsers. If not configured, the supplied NodeJS/Selenium version is used. |
| --btConfig &lt;FILE&gt; | N/A | Additional BrowserTime JSON configuration as a file. |
| --profile &lt;desktop\|mobile&gt; | --browsertime.connectivity.profile, -c, --connectivity | Choose between testing for desktop or mobile. Testing for desktop will use desktop rules & user agents and vice versa. [desktop] |
| -n &lt;NUMBEROFTIMES&gt;, --no &lt;NUMBEROFTIMES&gt; | --browsertime.iterations &lt;NUMBEROFTIMES&gt;, -n &lt;NUMBEROFTIMES&gt; | The number of times to test each URL when fetching timing metrics. [3] |
| --screenshot | N/A | Take screenshots for each page (using the configured view port). |
| --junit | --budget.output | Create JUnit output to the console. |
| --tap | --budget.output | Create TAP output to the console. |
| --skipTest &lt;ruleid1,ruleid2,...&gt; | N/A | A comma separated list of rules to skip when generating JUnit/TAP/budget output. |
| --testData | N/A | Choose which data to test when generating TAP/JUnit output or testing a budget. Default is all available. [rules,page,timings,wpt,gpsi] [all] |
| --budget &lt;FILE&gt; | --budget &lt;FILE&gt; | A file containing the web perf budget rules. |
| -m &lt;NUMBEROFPAGES&gt;, --maxPagesToTest &lt;NUMBEROFPAGES&gt; | --crawler.maxPages &lt;NUMBEROFPAGES&gt;, -m &lt;NUMBEROFPAGES&gt; | The max number of pages to test. Default is no limit. |
| --storeJson | N/A | Store all collected data as JSON. |
| -p &lt;PROXY&gt;, --proxy &lt;PROXY&gt; | N/A | |
| --cdns &lt;host1,host2&gt; | N/A | A comma separated list of additional CDNs. |
| --postTasksDir &lt;DIR&gt; | --browsertime.postScript &lt;DIR&gt;, --postScript &lt;DIR&gt; | The directory where you have your extra post tasks. |
| --boxes &lt;box1,box2&gt; | N/A | The boxes shown on the site summary page. |
| -c &lt;column1,column2&gt;, --columns &lt;column1,column2&gt; | N/A | The columns shown on the detailed page summary table. |
| --configFile &lt;PATH&gt; | --config &lt;PATH&gt; | The path to a config.json file; if it exists, all other input parameters will be overridden. |
| --aggregators &lt;PATH&gt; | N/A | The path to a directory with extra aggregators. |
| --collectors &lt;PATH&gt; | N/A | The path to a directory with extra collectors. |
| --graphiteHost &lt;HOST&gt; | --graphite.host &lt;HOST&gt; | The Graphite host. |
| --graphitePort &lt;INTEGER&gt; | --graphite.port &lt;INTEGER&gt; | The Graphite port. [2003] |
| --graphiteNamespace &lt;NAMESPACE&gt; | --graphite.namespace &lt;NAMESPACE&gt; | The namespace of the data sent to Graphite. [] |
| --graphiteData | --metrics.filter | Choose which data to send to Graphite by a comma separated list. Default all data is sent. [summary,rules,pagemetrics,timings,requests,domains] [all] |
| --graphiteUseQueryParameters | --graphite.includeQueryParams | Choose whether to use query parameters from the URL in the Graphite keys. |
| --graphiteUseNewDomainKeyStructure | N/A | Use the updated domain section when sending data to Graphite, e.g. "www.sitespeed.io" becomes "http.www_sitespeed_io" (issue #651). |
| --gpsiKey | --gpsi.key | Your Google API key; configure it to also fetch data from Google Page Speed Insights. |
| --noYslow | N/A | Set to true to turn off collecting metrics using YSlow. |
| --html | --plugins.disable html | Create HTML reports. Defaults to true. Set no-html to disable HTML reports. [true] |
| --wptConfig &lt;FILE&gt; | --webpagetest.location, --webpagetest.connectivity, --webpagetest.runs | WebPageTest configuration, see the runTest method. |
| --wptScript &lt;FILE&gt; | --webpagetest.script &lt;FILE&gt; | WebPageTest scripting. Every occurrence of {{{URL}}} will be replaced with the real URL. |
| --wptCustomMetrics &lt;FILE&gt; | --webpagetest.custom &lt;FILE&gt; | Fetch metrics from your page using JavaScript. |
| --wptHost &lt;DOMAIN&gt; | --webpagetest.host &lt;DOMAIN&gt; | The domain of your WebPageTest instance. |
| --wptKey &lt;KEY&gt; | --webpagetest.key &lt;KEY&gt; | The API key if running on the public WebPageTest instances. |
| --requestHeaders &lt;FILE&gt;\|&lt;HEADER&gt; | N/A | Any request headers to use, a file or a header string with JSON form of {"name":"value","name2":"value"}. Not supported for WPT & GPSI. |
| --postURL &lt;URL&gt; | N/A | The full URL where the result JSON will be sent by POST. Warning: testing many pages can make the result JSON massive. |
| --phantomjsPath &lt;PATH&gt; | N/A | The full path to the phantomjs binary, to override the supplied version. |
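As a worked example of applying the mapping above (the URL and values are placeholders), a typical 3.x invocation translates like this:

```shell
# 3.x: crawl one level deep, test each page 5 times in Chrome on a cable profile
sitespeed.io -u https://www.example.com -d 1 -b chrome -n 5 --connection cable

# Roughly equivalent 4.0 invocation, with the URL given as a
# positional argument instead of -u
sitespeed.io https://www.example.com \
  --crawler.depth 1 \
  --browsertime.browser chrome \
  --browsertime.iterations 5 \
  --browsertime.connectivity.profile cable
```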


With the new container you don't need to tell it what to start; just run:

$ docker run --privileged --rm -v "$(pwd)":/ sitespeedio/