
Artificial intelligence steps up to the plate: Transforming the business of sports photography

Pre-trained AI, combined with deep learning, identifies images of Japanese baseball players in one-eighth of the time

Sports stars are among the most photographed people on the planet today. Their on-field performance, style, gestures, and facial expressions are almost continuously captured digitally for fans, the media, commercial use, and, ultimately, posterity.

It’s not unusual for thousands of pictures to be shot from all angles at any professional encounter nowadays. So, a typical season is likely to produce virtual mountains of images for major clubs and competitions in most sports.

Now, professional baseball in Japan is turning to artificial intelligence and the cloud to handle the magnitude of what has been a laborious and time-consuming task – photo management.

Sports photos can have immediate, lasting, and lucrative value – but only if they are kept in well-organized, cataloged collections that can be accessed efficiently. IMAGE WORKS – a service of iconic Japanese photo giant Fujifilm – manages Nippon Professional Baseball’s (NPB) cloud-based Content Images Center (CIC).

Here, curators sort the images, identify the players in each one, and tag the pictures with that information. It sounds simple, but the volume of imagery now being produced is huge, and the usual way of managing it is simply not keeping up.

To understand why, let’s look at the special place baseball holds in modern Japan, where it has been a wildly popular game since the 1930s. While its rules differ slightly from those of America’s favorite pastime, the NPB is to Japan what Major League Baseball (MLB) is to the United States. The NPB consists of two top professional leagues: the Central League and the Pacific League. Each has six teams, and each holds 146 games a season, playing on most days of the week from March to October. Each league then holds its own playoffs, followed by the seven-game Nippon Series Championship between the two league champions – a spectacle similar to the World Series in the United States.

The automatic player name-tagging function can often identify players even in images that do not show their faces.

There is a steady deluge of images from across the country for much of the year, with about 3,000 images shot at each game. After the crowds have left the stadiums, curators from each team typically select about 300 photographs. They then spend around four hours manually identifying players and tagging each picture with that information.

That sort of timing can be a problem in our fast-paced world. Demand for images is usually at its highest in real time or near real time – that is, during or immediately after each game. Fans and media can quickly lose interest in content from a past game once a new one begins. So, not only is the job of player image identification massive, it also needs to be done fast.

Now AI has stepped up to the plate. Developers from Fujifilm and Microsoft Japan have devised a solution: an automatic player name-tagging function that identifies and tags images much faster than people can, and in greater volumes.

Since June 2018, it has been in a trial focused on just five baseball teams – including the Hiroshima Toyo Carp, which has won the Central League championship eight times and the Nippon Series three times. The trial was such a success that the function will be used for all NPB teams in the 2019 season.

Its photo analysis capabilities are based on pre-trained AI from Microsoft Cognitive Services and the Microsoft Cognitive Toolkit deep learning framework. Specifically, facial recognition through the Microsoft Cognitive Services Face API is combined with a custom determination model built on the Microsoft Cognitive Toolkit.
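
To make the facial-recognition half of such a pipeline concrete, here is a minimal sketch in Python against the Face API’s documented REST endpoints. The endpoint, subscription key, and the “npb-players” PersonGroup of enrolled player faces are assumptions for illustration; the article does not describe Fujifilm’s actual implementation.

```python
import requests

# Hypothetical values – substitute a real resource endpoint, key, and person group.
ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com"
SUBSCRIPTION_KEY = "<your-face-api-key>"
PERSON_GROUP_ID = "npb-players"  # assumed: a PersonGroup trained with player faces

HEADERS = {"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
           "Content-Type": "application/json"}

def identify_players(image_url):
    """Detect faces in one photo, then match them against the player PersonGroup."""
    # Step 1: detect faces and collect their temporary faceIds
    detect = requests.post(f"{ENDPOINT}/face/v1.0/detect",
                           headers=HEADERS, json={"url": image_url})
    detect.raise_for_status()
    face_ids = [face["faceId"] for face in detect.json()]
    if not face_ids:
        return []  # no visible face – fall back to the pose/scene model

    # Step 2: identify the detected faces against the trained PersonGroup
    identify = requests.post(f"{ENDPOINT}/face/v1.0/identify",
                             headers=HEADERS,
                             json={"faceIds": face_ids,
                                   "personGroupId": PERSON_GROUP_ID,
                                   "maxNumOfCandidatesReturned": 1})
    identify.raise_for_status()
    return [result["candidates"][0]["personId"]
            for result in identify.json() if result["candidates"]]
```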

This enables the classification of images into four types: batting, pitching, fielding, and base running. Often, it can also determine a player’s name when his face is not visible in an angled or side shot. Combining Azure Durable Functions, the automatic player name-tagging function, and a final manual check by people has reduced overall processing time from the traditional four hours to just 30 minutes.
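
The article does not spell out how Azure Durable Functions is used, but a fan-out/fan-in orchestration is a natural fit for tagging a few hundred selected photos per game in parallel. The sketch below, in Python with a hypothetical “TagPlayers” activity, shows that general shape rather than the production code.

```python
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # Input: the list of selected image URLs for one game (roughly 300 photos)
    image_urls = context.get_input()

    # Fan out: run the hypothetical "TagPlayers" activity for each image in parallel
    tasks = [context.call_activity("TagPlayers", url) for url in image_urls]

    # Fan in: gather all tagging results, which then go to a final manual check
    results = yield context.task_all(tasks)
    return results

main = df.Orchestrator.create(orchestrator_function)
```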

A sample of the IMAGE WORKS baseball photo collection

Through its development stages, Microsoft Japan provided a ResNet neural network model from Microsoft Research, the company’s research and development arm. It also held several hackathons with Fujifilm Software, the developer of IMAGE WORKS. Repeated verification exercises saw player recognition accuracy rates jump to over 90%.
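
CNTK’s published transfer-learning examples give a sense of how a pre-trained ResNet can be adapted to a narrow task like classifying play types. The sketch below follows that pattern; the model file name, node names, and four-class head are assumptions for illustration, not details disclosed by the project.

```python
import cntk as C
from cntk.logging.graph import find_by_name

# Assumed names, following CNTK's public transfer-learning examples
BASE_MODEL = "ResNet18_ImageNet_CNTK.model"  # a pre-trained ResNet released with CNTK
FEATURE_NODE = "features"                    # input node of the pre-trained network
LAST_HIDDEN = "z.x"                          # node just before the original classifier
NUM_CLASSES = 4                              # batting, pitching, fielding, base running

def build_play_type_classifier():
    base_model = C.load_model(BASE_MODEL)
    feature_node = find_by_name(base_model, FEATURE_NODE)
    last_node = find_by_name(base_model, LAST_HIDDEN)

    # Clone the convolutional trunk with frozen weights behind a placeholder input
    trunk = C.combine([last_node.owner]).clone(
        C.CloneMethod.freeze, {feature_node: C.placeholder(name="features")})

    # Attach a new four-way classification head for the play types
    image = C.input_variable((3, 224, 224), name="image")
    z = C.layers.Dense(NUM_CLASSES, activation=None, name="play_type")(trunk(image))
    return z, image
```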

“With the power of Azure and deep learning, we have been able to create an AI solution that makes our photo service so much more efficient and faster. And that is good for our customers,” said Riki Sato, Team Leader of the Advanced Solution Group at IMAGE WORKS. His colleague Daichi Hayata hailed the collaboration between the IMAGE WORKS team and Microsoft Japan. “This was the first time we have dealt with deep learning, and we could do it with basic knowledge,” he said.

Fujifilm Imaging Systems now plans to widen the function’s use to amateur baseball leagues and then to other sports. It might also be applied to content needs outside the sports world. And the company is looking at video analysis through Azure Video Indexer.

Microsoft Japan is committed to helping companies and organizations embrace digital transformation with AI and is considering how to use this combination of pre-trained AI and a customizable deep learning framework in other fields, such as medicine.