VORTEX 2018
Mon 16 - Sat 21 July 2018, Amsterdam, Netherlands
co-located with ECOOP and ISSTA 2018
Wed 18 Jul 2018 14:50 - 15:10 at Hanoi - JavaScript & Dynamic Behaviour

Node.js brought JavaScript from the browser to the server, and NPM, its associated code repository, is now the largest in the world. However, despite its popularity, existing JavaScript benchmarks are not representative of the JavaScript ecosystem. For example, Google recently deprecated its Octane micro-benchmark and now plans to use real-world workloads to optimize its V8 JavaScript engine instead [1]. In this talk, we present our initial steps towards building a benchmark that: 1) is representative of the Node.js ecosystem, and 2) lends itself to static and dynamic analysis.

To address our first goal, we start by downloading metadata for all GitHub projects that use NPM, collecting metrics such as: number of dependencies, number of open and closed issues, number of stars, number of topics, project size, number of commits, number of contributors, and the proportion of project-specific code versus dependency code. We then infer the probability distribution of each metric (e.g., normal, log-normal, exponential) and sample projects in a way that preserves the original distributions. To address our second goal, we further select projects that have working test suites, pre-download all their dependencies, and build a harness that runs the tests of every project. We then package a test environment as a container or virtual machine with specific versions of Node.js, web browsers, and databases pre-installed, to minimize variation across users of our benchmark.
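As a rough illustration of the sampling step (a sketch, not the authors' actual pipeline), the code below fits a log-normal to one metric by maximum likelihood and draws a stratified sample over quantile bins so that the selected projects follow that metric's empirical distribution. The data layout, bin count, and function names are assumptions made for this example.

```js
// Illustrative sketch only: stratified sampling of projects by one metric so
// that the sampled subset preserves the metric's empirical distribution.
// projects: [{ name: 'left-pad', stars: 1200, commits: 57, ... }, ...]
function stratifiedSample(projects, metric, sampleSize, bins = 10) {
  // Sort by the metric and split into equal-frequency (quantile) strata.
  const sorted = [...projects].sort((a, b) => a[metric] - b[metric]);
  const perBin = Math.ceil(sorted.length / bins);
  const strata = [];
  for (let i = 0; i < sorted.length; i += perBin) {
    strata.push(sorted.slice(i, i + perBin));
  }
  // Draw from each stratum in proportion to its share of the population,
  // so the sample's histogram follows the original distribution.
  const sample = [];
  for (const stratum of strata) {
    const k = Math.round(sampleSize * (stratum.length / sorted.length));
    for (let j = 0; j < k && stratum.length > 0; j++) {
      const idx = Math.floor(Math.random() * stratum.length);
      sample.push(stratum.splice(idx, 1)[0]);
    }
  }
  return sample;
}

// Fit a log-normal by maximum likelihood: mean and standard deviation of the
// log-transformed metric values (positive values only).
function fitLogNormal(values) {
  const logs = values.filter(v => v > 0).map(Math.log);
  const mu = logs.reduce((s, x) => s + x, 0) / logs.length;
  const sigma = Math.sqrt(logs.reduce((s, x) => s + (x - mu) ** 2, 0) / logs.length);
  return { mu, sigma };
}
```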

During the talk, we will explain how we collected various metrics from GitHub, how we inferred their distributions, and how we sampled projects. We will also share our experience building an initial, executable benchmark suite of 50 applications, and highlight the challenges we foresee in expanding it. Finally, we will discuss two potential uses of the benchmark, dynamic analyses for performance and for security, and the additional metrics that we might need to consider.
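To make the harness step concrete, here is a minimal sketch, assuming one pre-downloaded project per directory with its dependencies already installed. The directory layout, timeout, and use of `npm test` are assumptions for illustration, not the benchmark's actual implementation.

```js
// Illustrative harness sketch: run each project's own test suite and record
// pass/fail status and wall-clock time. Assumes ./projects/<name> directories
// with dependencies already installed (node_modules present).
const { execSync } = require('child_process');
const { readdirSync } = require('fs');
const path = require('path');

const root = path.join(__dirname, 'projects');
const results = [];

for (const dir of readdirSync(root)) {
  const cwd = path.join(root, dir);
  const start = Date.now();
  let status = 'pass';
  try {
    // Run the project's own test suite; stdio is captured so one noisy
    // project does not drown out the harness output.
    execSync('npm test', { cwd, stdio: 'pipe', timeout: 10 * 60 * 1000 });
  } catch (err) {
    status = 'fail';
  }
  results.push({ project: dir, status, seconds: (Date.now() - start) / 1000 });
}

console.table(results);
```

Invoking each suite through `npm test` keeps the harness agnostic to the test framework a project uses, which matters when the suite spans 50 heterogeneous applications.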

Wed 18 Jul

Displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna

14:00 - 15:30
JavaScript & Dynamic Behaviour (BenchWork) at Hanoi
14:00
30m
Benchmarking WebKit
BenchWork
14:30
20m
Analyzing Duplication in JavaScript
BenchWork
Petr Maj (Czech Technical University), Celeste Hollenbeck (Northeastern University, USA), Shabbir Hussain (Northeastern University), Jan Vitek (Northeastern University)
14:50
20m
Building a Node.js Benchmark: Initial Steps
BenchWork
Petr Maj (Czech Technical University), François Gauthier (Oracle Labs), Celeste Hollenbeck (Northeastern University, USA), Jan Vitek (Northeastern University), Cristina Cifuentes (Oracle Labs)
15:10
20m
A Micro-Benchmark for Dynamic Program Behaviour
BenchWork
Li Sui (Massey University, New Zealand), Jens Dietrich (Massey University), Michael Emery (Massey University), Amjed Tahir (Massey University), Shawn Rasheed (Massey University)