Building a VR data visualization with Statcast batted ball data

Virtual reality (VR) and its proliferation into our lives is a popular topic right now, greeted with healthy doses of both excitement and skepticism. I for one am bullish on VR! Having been fortunate to play with an HTC Vive in the office, I find that even mundane things, like mini golf, become mind-blowing experiences!

An exciting use case for VR is data visualization (although a more precise description would be promoting insight and intuition about data; visualization is a powerful way of achieving this, but not necessarily the only way). Three-dimensional data visualizations have a long history of being maligned by data visualization academics and practitioners, and I think it's safe to say the consensus view is that they are gimmicky, misleading, and harmful to the goal of accurately representing data. Making compelling data visualizations in VR will mean more than taking a bar chart or scatter plot and putting it in 3D. What's new and different about VR, in my view, is that one gets a visceral sense of expansiveness, and successful VR data visualization will take advantage of this, giving us new ways of interacting with and experiencing data. Many people are thinking about this very interesting challenge and opportunity, but no one has quite cracked it yet.

A few prominent examples of VR data visualizations:

3d Nasdaq roller coaster from the WSJ

d3 and aframe roller coaster from Ian Johnson

A tour through England, showing (simulated) dislike of Piers Morgan for each town

For my visualization I'm using batted ball data from MLBAM and Statcast, obtained from this Statcast-Modeling R package. The layout is similar to one I built with d3.js some months back, which looks like this:

In this visualization the circles show hits (blue) and outs (red) at their landing positions according to HITf/x. The grids on the right-hand side use mouseover to let the user filter in any of the launch-angle / launch-speed, launch-angle / hang-time, and launch-speed / hang-time planes. The fully interactive version of this visualization is available on my github page.
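The plane filtering described above amounts to a rectangular window in one of the two-dimensional planes. A minimal sketch, with illustrative field names (`launchAngle`, `launchSpeed`) and a hypothetical `window` shape that are my own, not the post's actual code:

```javascript
// Keep only the batted balls whose launch angle and launch speed both fall
// inside a small rectangular window of the launch-angle / launch-speed plane.
// The other two planes (with hang time) work the same way with different fields.
function filterBalls(balls, window) {
  return balls.filter((b) =>
    b.launchAngle >= window.angle[0] && b.launchAngle <= window.angle[1] &&
    b.launchSpeed >= window.speed[0] && b.launchSpeed <= window.speed[1]);
}

// Example: a window around fly balls hit hard.
const sample = [
  { launchAngle: 20, launchSpeed: 95 },
  { launchAngle: 50, launchSpeed: 80 },
];
const kept = filterBalls(sample, { angle: [10, 30], speed: [90, 110] });
```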


The main feature of extending this to VR is using the device orientation to control the filtering in the launch-angle / launch-speed plane. The user changes launch angle by tilting their head up and down, and launch speed by tilting their head left to right. There is also a mouseover fallback for users on desktop, i.e. moving the mouse up the screen simulates tilting one's head up and down, and moving the mouse left to right simulates tilting the head left to right. In addition, I tilted the plane of the field down into more of a perspective view, and I pop the filtered batted balls in the vertical direction for additional highlighting. The end result looks like the image below; the fully interactive version is available at
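The head-tilt control boils down to rescaling two orientation angles into the data's domains. A hedged sketch, assuming plausible launch-angle and launch-speed ranges and a hypothetical `applyFilter` callback; none of these names come from the actual implementation:

```javascript
// Illustrative data domains for the filter centers.
const ANGLE_RANGE = [-30, 60];   // launch angle, degrees (assumed)
const SPEED_RANGE = [40, 115];   // launch speed, mph (assumed)

const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));

// Pure mapping: head tilt (beta = up/down, gamma = left/right, in degrees)
// -> filter centers in data units.
function tiltToFilter(beta, gamma) {
  const tBeta = clamp((beta + 90) / 180, 0, 1);
  const tGamma = clamp((gamma + 90) / 180, 0, 1);
  return {
    launchAngle: ANGLE_RANGE[0] + tBeta * (ANGLE_RANGE[1] - ANGLE_RANGE[0]),
    launchSpeed: SPEED_RANGE[0] + tGamma * (SPEED_RANGE[1] - SPEED_RANGE[0]),
  };
}

// Wire to the browser when available; mousemove plays the same role on desktop.
if (typeof window !== 'undefined') {
  window.addEventListener('deviceorientation', (e) => {
    if (e.beta === null || e.gamma === null) return;
    applyFilter(tiltToFilter(e.beta, e.gamma)); // applyFilter: hypothetical
  });
  window.addEventListener('mousemove', (e) => {
    // Mouse position stands in for head tilt: up the screen = tilt up.
    const beta = 180 * (0.5 - e.clientY / window.innerHeight);
    const gamma = 90 * (e.clientX / window.innerWidth - 0.5);
    applyFilter(tiltToFilter(beta, gamma));
  });
}
```

Keeping the mapping pure makes the orientation and mouse paths share one code path, which is roughly how the fallback behaves in the visualization.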


There are a number of technical details about building this visualization that may be interesting, but a full description of those is beyond the scope of this post. In short, the visualization was built using three.js, with a BufferGeometry and a custom shader to render the points as a particle system; I map the output of the DeviceOrientation controls from the three.js examples to the launch angle and launch speed filters. I welcome any additional technical questions in comments or the contact form.
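For readers curious about the particle system, the core idea is packing one entry per batted ball into a flat typed array, which is the layout BufferGeometry expects. A minimal sketch under assumed field names (`x`, `y` for the landing position, in feet); the three.js wiring is shown only in outline:

```javascript
// Pack landing positions into a flat Float32Array: [x0, y0, z0, x1, y1, z1, ...].
// All points start on the field plane (y = 0); a vertex shader can then "pop"
// the filtered balls upward, as described in the post.
function toPositionBuffer(balls) {
  const positions = new Float32Array(balls.length * 3);
  balls.forEach((b, i) => {
    positions[3 * i] = b.x;      // field x (feet)
    positions[3 * i + 1] = 0;    // height above the field plane
    positions[3 * i + 2] = b.y;  // field depth (feet)
  });
  return positions;
}

// With three.js loaded, the buffer plugs into a Points object roughly like:
//
//   const geometry = new THREE.BufferGeometry();
//   geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
//   const material = new THREE.ShaderMaterial({ vertexShader, fragmentShader });
//   scene.add(new THREE.Points(geometry, material));
```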

