
NAB Perspectives: ChyronHego’s Prince Goes Behind the Scenes of MLBAM’s Statcast

April 24th, 2015 Posted in Headlines By Jason Dachman

NAB could not have arrived at a more fitting moment for ChyronHego: the show came on the heels of MLB Opening Day, just as MLBAM was ramping up Statcast, the player-tracking system ChyronHego was integral in developing, at all 30 MLB ballparks.

ChyronHego highlighted the cutting-edge MLBAM Statcast and its TRACAB image-based tracking system at NAB 2015 and continues to work with the league on its development. In addition, ChyronHego will look to build on its role with MLB Statcast and expand into more sports properties (TRACAB is already used on a variety of soccer, rugby, and cricket events).

ChyronHego also launched VistaCam, an all-new broadcast-graphics solution that provides an ultra-high-resolution panoramic video-wall backdrop for in-studio shows. Other announcements included the launch of the Lyric64 graphics-creation and -playout solution and updates to the Virtual Placement and Paint virtual-graphics platforms.

VP of Business Development Kevin Prince at ChyronHego’s booth at NAB 2015

SVG sat down with ChyronHego VP of Business Development Kevin Prince at the show to go behind the scenes of Statcast’s development with MLBAM and to discuss how TRACAB can expand to more sports properties and the potential role of VistaCam at regional sports networks.

Can you provide an update on the launch of the MLBAM player-tracking system?
We are just finishing the installation of all 31 parks. It’s 31 because we have our Salt River Fields research facility [in Scottsdale, AZ]. The final checkout of all the parks happened [April 14], when all 30 teams had played a home game. Until you’ve been through an actual game, you have no idea whether everything’s fully operational. It’s just the nature of the business.

We have been launching the data-acquisition portion, which acquires the data of all the players and umpires on the field. We know exactly where they are in XYZ space 30 times a second, every second. We are gathering that data and feeding it into MLBAM’s black box, which then does all the derivative analysis for things like pop time, reaction time, arm strength, or whatever they want to get out of that information.
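To make that description concrete, here is a minimal Python sketch of what a 30-times-a-second positional feed could look like to downstream software; the sample structure, field names, and timing helper are illustrative assumptions, not ChyronHego’s or MLBAM’s actual data model.

```python
# Hypothetical sketch of a 30 Hz positional feed; names and fields are
# illustrative assumptions, not ChyronHego's or MLBAM's actual schema.
from dataclasses import dataclass

FRAME_RATE_HZ = 30  # positions reported 30 times a second, every second


@dataclass
class TrackingSample:
    frame: int       # frame index since the start of acquisition
    object_id: str   # player, umpire, or ball identifier
    x: float         # position in field coordinates ("XYZ space"), meters
    y: float
    z: float


def seconds_between(frame_a: int, frame_b: int) -> float:
    """Elapsed time between two samples; derivative metrics such as pop
    time or reaction time reduce to frame differences like this, combined
    with event detection in the analysis layer."""
    return (frame_b - frame_a) / FRAME_RATE_HZ
```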

What you saw at the All-Star Game last year, when we first did the original Statcast, was a couple of bits and pieces that were done live but with fixed cameras. So it’s one fixed camera, high home, and you show people running around the bases and things like that. And we had the home-run tracker for the Home Run Derby. But that was a special implementation of this entire system, definitely not an everyday thing. It was all based on real data, but it was hand-manipulated over the video because the rest of the process wasn’t in place at that point.

Now that we have implemented all the tracking and can move that data to a central location for MLBAM to distribute, Phase 2 is the visualization of all that data in a real-time environment. We’ve recently been developing Replay Builder, which sits on the data bus, receives all the data, and then visualizes it [in real time].

We have also developed a technology, sprung out of our Virtual Placement system, that we are internally calling SceneTracker. The manifestation is through-the-lens camera tracking. So there are no instrumented cameras, and we can take up to three cameras into the system. We do a quick sweep to calibrate the cameras at the beginning of the game, and that sweep takes into account the curvature, or pitch, of the field, which is not a problem in baseball. We take into account all the lens distortion. By looking at various elements in the image, at various zoom ratios, we can track the image with correct lens distortion, and therefore we can map all of the data we’ve gathered through tracking over that video.
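Mapping tracked positions onto an uninstrumented broadcast camera is, at its core, a camera-projection problem. The Python sketch below (using OpenCV) illustrates only the general geometry once a calibration sweep has yielded a camera’s pose, focal length, and lens-distortion coefficients; every value is a placeholder, and this is not ChyronHego’s SceneTracker implementation.

```python
# Project one tracked field position (meters, XYZ) into a calibrated
# broadcast camera's image. All calibration values below are placeholders.
import numpy as np
import cv2

camera_matrix = np.array([[1800.0,    0.0, 960.0],   # focal length / principal point
                          [   0.0, 1800.0, 540.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])  # lens-distortion terms
rvec = np.zeros(3)                    # camera rotation from the calibration sweep
tvec = np.array([0.0, 10.0, 60.0])    # camera position relative to the field

player_xyz = np.array([[27.4, 0.0, 1.0]])  # one tracked player position

image_points, _ = cv2.projectPoints(player_xyz, rvec, tvec,
                                    camera_matrix, dist_coeffs)
u, v = image_points[0, 0]
print(f"Draw the overlay at pixel ({u:.0f}, {v:.0f})")
```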

What other applications will the MLBAM tracking system enable in addition to Statcast?
Baseball is such a situational game. Since we are collecting the ball data through TrackMan, the instant the ball leaves the pitcher’s hand, we tag everybody for their correct positions. In addition to determining if there is a shift in play, that means we can break every play down into its individual pieces. We can do searches on the data for individual plays, based on timecode or on functions, and then derive new data from that: all the curveballs or, say, all the steal attempts or pickoff attempts, things of that nature. So that becomes the basis of driving the database: being able to have a search engine in place, where we can develop things like heat maps and whatever historical information we want. That is where breaking it up into plays becomes a very important part.
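As a hypothetical sketch of that play-level search idea: each pitch becomes a record keyed by timecode and tagged with events, so a query like “all the steal attempts against curveballs” reduces to a filter. The schema below is an illustrative assumption, not MLBAM’s.

```python
# Illustrative play records and a simple search over them; field names
# are assumptions, not MLBAM's actual database schema.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple


@dataclass
class Play:
    timecode_in: str        # timecode when the ball leaves the pitcher's hand
    timecode_out: str       # timecode when the play is dead
    pitch_type: str         # e.g. "curveball"
    events: List[str] = field(default_factory=list)  # e.g. ["steal_attempt"]
    positions: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)


def search(plays: List[Play],
           pitch_type: Optional[str] = None,
           event: Optional[str] = None) -> List[Play]:
    """Return plays matching an optional pitch type and/or tagged event."""
    return [p for p in plays
            if (pitch_type is None or p.pitch_type == pitch_type)
            and (event is None or event in p.events)]

# e.g. search(all_plays, pitch_type="curveball", event="steal_attempt")
```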

Also, we can actually extract [the video of every play] from all three broadcast cameras running into our environment, so you end up with 18-20 minutes of action. Because we can search for those videos, you can go back to each play and review them with the three camera angles. Those can be sent back to the coaches for analysis or can be given to the broadcasters for SportsCenter or another [news program]. And, of course, we have all the tracking data associated with that, so they can do their own [analysis graphics] on that play. So this is going to be quite an expansive system.

Do you see this system expanding to other sports beyond baseball?
Extremely relevant question, one that has been in the back of my head for a long time. The way I describe it, the process of acquiring data to visualize it is just like a sandwich. We’re the two slices of bread: we acquire the data, and we visualize it. MLBAM or the leagues become the analysis people in between. I think that’s a really interesting model for us to push forward on.

First, the next phase of our process [with MLB] is to go to the teams and get into the minor leagues with player tracking. I think the Replay Builder may have a role there for coaching aspects or possibly for streaming [games]; it could be a good tool to help get them onto the online environment. And, hopefully, we can start gathering all the information that stays with a player from when they start in the minors through when they reach the majors.

After that, one area that’s really interesting to us is football, both at the NFL level and, in particular, at the college level. While we may not be doing the optical-based tracking, we don’t particularly care where the tracking information comes from. We just want to be able to take that data, map it to the camera positions, and then visualize it for the receiver of that information. For football, it becomes really easy for us to look at that and break everything up into plays. It’s easy to recognize the line of scrimmage, so you can see the start of the play, see where the play finished, and reset based on timecode. That becomes a really interesting coaching tool, and, obviously, it becomes very interesting to put that in the public domain for analysis of the game.
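As a toy illustration of that segmentation idea, the sketch below splits a timecoded ball feed into plays by watching for the ball to start and stop moving; the threshold and the feed format are assumptions, since, as Prince notes, the tracking source itself could come from elsewhere.

```python
# Toy play segmentation for football from a timecoded ball-speed feed.
# The 0.5 m/s threshold and the (timecode, speed) format are assumptions.
from typing import Iterable, List, Tuple


def segment_plays(ball_track: Iterable[Tuple[str, float]],
                  speed_threshold: float = 0.5) -> List[Tuple[str, str]]:
    """Return (start, end) timecode pairs, one per play."""
    plays, start = [], None
    for timecode, speed in ball_track:
        if start is None and speed > speed_threshold:
            start = timecode                 # snap: ball leaves the line of scrimmage
        elif start is not None and speed <= speed_threshold:
            plays.append((start, timecode))  # ball at rest: play is over
            start = None
    return plays
```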

We already have this Replay Builder scenario working for soccer, cricket, rugby, and other applications. I see in-stadium visualizations for the fan experience as a great opportunity as well. I think that, by being on the visualization end of the data we’re collecting, we have got quite a substantial market there to continue to develop this.

Can you tell me a bit about the VistaCam release here at NAB?
VistaCam is an offshoot of all the player-tracking technology. We use exactly the same camera arrays that we do for our player tracking; only it’s one camera [unit], and we put four cameras in as opposed to three. What we get out of that is a very wide, high-resolution, panoramic view of the stadium, and the objective is just to bring that back over the Internet at about 20 Mbps. You can put that on your monitor walls in the back of the studio and make it look as if you’re in the stadium, with the backdrop of the entire stadium there.
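For readers unfamiliar with the technique, composing one wide frame from several camera heads is a standard stitching problem; the Python/OpenCV sketch below shows only the general approach. VistaCam’s actual pipeline, and the encoder settings behind the roughly 20-Mbps contribution feed, were not detailed, and the filenames here are placeholders.

```python
# Generic four-camera panorama stitch with OpenCV; filenames are placeholders,
# and this is not VistaCam's actual pipeline.
import cv2

frames = [cv2.imread(f"cam{i}.png") for i in range(4)]  # one frame per camera head

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("stadium_backdrop.png", panorama)  # source for the studio video wall
else:
    raise RuntimeError(f"Stitching failed with status {status}")
```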

We are trying to make the studio presentation a bit more immersive and, possibly, cut down some of the onsite costs that a studio might have. I see the [RSNs] liking it a lot. We could make it look as if the studio is right in the press box looking out onto the field.