Cost-effective, compact remote production with Sony’s Camera Remote SDK

TBS Television and WOWOW are channels that go beyond traditional broadcasting, focusing on expanding into new media channels such as online streaming and enriching their content. The new technology development divisions of both channels teamed up to create a compact and cost-effective remote production system that other companies can also adopt. They have already built up significant operational experience, covering everything from online live streaming to terrestrial live broadcasts from overseas.

In an interview with Mr. Fujimoto from TBS Television’s Media Technology Division, Innovative Technology Design Section, and Mr. Shintaro Ishimura from WOWOW’s Technology Department, Technology Planning Section, we explore the revolutionary system’s features, the role of the Camera Remote SDK, and their outlook.

How we helped
The Challenge
  • Creating a cost-effective and compact remote production system
The Solution
  • Introduction of Sony’s α and Cinema Line cameras and the Camera Remote SDK
The Outcome
  • Remote control of cameras in the US from a studio in Japan
  • Excellent image quality and a significantly higher level of fine-tuning and freedom in settings through the Camera Remote SDK
  • Lens zoom controllable via the camera, a particular strength of Sony

Could you share your experience of using the Camera Remote SDK?

Ishimura: The budget scale for our content production varies significantly depending on the project. In the past, when production costs exceeded the budget, we often had no choice but to halt production altogether. By reducing production costs through the incorporation of new technology, we aimed to revive content ideas we had previously abandoned, thereby expanding the range of our programs. This led us to the concept of “remote production”, where we minimize on-site personnel and control operations remotely. TBS and WOWOW had jointly developed a unique technology called “Live Multi-Viewing” (LMV), which enables the transmission of multi-stream video at low bandwidth and low latency. By leveraging LMV, we believed we could achieve a novel form of remote production over common public networks at an affordable cost.

Mr. Ishimura

Fujimoto: However, simply converting existing workflows to remote production had its limits in terms of cost reduction. Drawing on my experience in production technology for terrestrial programs, I knew that you could, for instance, achieve remote production by attaching a ¥10 million remote camera head to a ¥10 million camera, but that approach wouldn’t fulfil our original goal of reducing production costs. The crucial question was how to build a compact and cost-effective system.

Ishimura: Even before we arrived at the concept of remote production, we had been working on software-based production. We chose to build the system around PCs and software-based switchers, controlling cost-effective cameras through our own remote-control software. This became our primary focus.

Mr. Fujimoto

Could you tell us why you chose the α series?

Fujimoto: Initially, we used cameras from other manufacturers because of their affordability and available SDKs. In terms of usability, however, we encountered several inconveniences, including slow autofocus, only a single setting mode, and the inability to fine-tune white balance. The cameras also had low sensitivity, which made it difficult to achieve satisfactory image quality.

Ishimura: In sports broadcasts, adding lighting is not feasible. Whether indoors or outdoors, during evening matches or night games, the illumination was insufficient. The cameras we previously used also had a limited dynamic range: in scenes containing both sunlight and shadow, exposing for the dark areas washed out the highlights, while exposing for the bright areas crushed the shadows to black. Having used α and Cinema Line cameras for other purposes, we were aware of α’s high sensitivity, wide dynamic range, stability, and reliability. The extensive range of lenses available for the α E-mount, including electric zoom lenses, perfectly matched our requirements.

Fujimoto: I knew that α had excellent image quality and high sensitivity. We had already been using α and Cinema Line cameras in music programs and dramas, and I had been considering their use for remote production. That’s when Sony released the ‘Camera Remote SDK’, which supports camera control from Windows, macOS, and Linux. While many cameras can be controlled from Windows and macOS, no other cameras supported Linux, the operating system we use for control. This became the decisive factor for adoption.

Could you explain the specific operations of remote control?

Ishimura: For the camera itself, we use the Camera Remote SDK to manage settings such as white balance, ISO sensitivity, shutter speed, autofocus, lens aperture, and zoom. To take on the role of a remote camera head that pans and tilts the camera, we use an electric gimbal originally designed for handheld shooting, and control the camera angle through the gimbal’s SDK. All of this runs on an NVIDIA Jetson, a Linux-compatible board computer that also handles video encoding and network transmission, with one of these computers paired with each camera. In tennis coverage, our main use case, we deploy three to five cameras, each with the same basic configuration.
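
As a rough illustration of the set-up Mr. Ishimura describes, the sketch below models one such per-camera control node in Python. The CameraNode class, the set_property calls, and the property names are hypothetical placeholders, not the actual Camera Remote SDK or gimbal APIs; the point is the division of roles: exposure, focus, and zoom are set through the camera, while framing is handled by the gimbal acting as a remote camera head.

```python
# Hypothetical sketch of one per-camera control node (one board computer per camera),
# assuming generic `camera` and `gimbal` wrapper objects. Property names are
# illustrative and do not correspond to actual Camera Remote SDK identifiers.

from dataclasses import dataclass


@dataclass
class ExposureSettings:
    white_balance_kelvin: int   # e.g. 5600
    iso: int                    # e.g. 800
    shutter_speed: str          # e.g. "1/500"
    aperture: str               # e.g. "F4"
    autofocus: bool


class CameraNode:
    """Applies remote settings to one camera and relays pan/tilt to its gimbal."""

    def __init__(self, camera, gimbal):
        self.camera = camera    # hypothetical wrapper over the camera control SDK
        self.gimbal = gimbal    # hypothetical wrapper over the gimbal SDK

    def apply_exposure(self, s: ExposureSettings) -> None:
        # Each setting named in the interview maps to one remote-settable property.
        self.camera.set_property("WhiteBalance", s.white_balance_kelvin)
        self.camera.set_property("ISO", s.iso)
        self.camera.set_property("ShutterSpeed", s.shutter_speed)
        self.camera.set_property("Aperture", s.aperture)
        self.camera.set_property("FocusMode", "AF" if s.autofocus else "MF")

    def set_zoom(self, position: float) -> None:
        # Electric zoom is driven through the camera body, not a separate lens motor.
        self.camera.set_property("ZoomPosition", position)

    def point(self, pan_deg: float, tilt_deg: float) -> None:
        # The electric gimbal stands in for a conventional remote camera head.
        self.gimbal.move_to(pan=pan_deg, tilt=tilt_deg)
```

In the actual system, three to five of these nodes run on NVIDIA Jetson boards, each also handling encoding and network transmission of its camera’s video.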

Sony camera at a sports stadium
Sony camera pointing at a tennis court

Could you share some examples of actual on-site use of Sony’s cameras and the Camera Remote SDK?

Fujimoto: In July 2022, we deployed the system for the live terrestrial broadcast of a track and field competition. We remotely controlled two α7S III cameras set up in a stadium in Oregon, USA, from the TBS Television headquarters in Tokyo, and projected the transmitted 4K images onto the LED wall of the XR (Extended Reality) studio as a background. We opted for the FE 12-24mm F2.8 GM lens, capturing the entire stadium at an ultra-wide angle of about 120 degrees. The 4K images stayed clear even at the periphery, something made possible by the performance of α’s full-frame image sensor. Everyone in the studio was visibly excited by the strong sense of presence.

Additionally, we had been conducting remote shooting for tennis matches since 2019. Starting with the streaming of the national high school tennis tournament in March 2022, we switched to using α7 IV and α7S III cameras with Sony’s Camera Remote SDK. We opted for the E PZ 18-105mm F4 G OSS lens, which supports electric zoom and is designed for video shooting. In December 2022, we also introduced the Cinema Line camera ILME-FX30 and used it for the live streaming of a junior tennis tournament held in Ehime.

Why did you choose the ILME-FX30?

Fujimoto: Firstly, the fact that it has an APS-C-sized image sensor is significant: it makes telephoto shooting easier and gives a deeper depth of field, which suits sports shooting. The second deciding factor was the reasonable price, which fit within our budget even including the cost of the lens. Additionally, as part of the Cinema Line, the FX30 is designed for long-duration video shooting, with considerations such as heat dissipation, which is reassuring. The image quality is, of course, impeccable, and the autofocus tracking performance is outstanding; there is nothing more we could ask for. In February 2023, for a men’s international tennis competition, we also added the Cinema Line camera FX6 and controlled it remotely through the Camera Remote SDK. The FX6 was operated manually by on-site camera operators, while we kept its colour and exposure consistent with the other, unmanned cameras. One of the strengths of the Camera Remote SDK is its compatibility with a wide range of camera models. In March 2023, we also introduced the Cinema Line camera FX3.

What were your impressions of using the Camera Remote SDK?

Fujimoto: I was personally involved in developing the dedicated remote-control software. Compared with the other company’s SDK that we initially used, Sony’s SDK stood out for its significantly higher level of fine-tuning and freedom in settings. Moreover, the ability to control the lens zoom via the camera is a strong point of Sony. Updates are frequent, with features and sample code added continuously, and more updates mean increased motivation for developers. Within the Cinema Line, there is also the FR7, with an integrated electric pan/tilt head and remote-control support, which we are considering using as well.

Could you share your future plans?

Fujimoto: In the future, we are considering deployment on vehicles. For example, in marathon broadcasts, we used to deploy a dedicated OB truck with a substantial vibration-isolation device so the camera could be controlled inside the vehicle. With our system, the camera can be controlled remotely, so it can be mounted on ordinary cars or motorcycles. We also aim to broaden the use of this system in terrestrial program production.

Ishimura: We are also experimenting with AI-based automation of camera work to further minimize the number of people involved in operations. We built this system with a focus on cost-effectiveness so that it can be accessible to a wide range of professionals and companies. While continuously enhancing the system, we actively aim to offer these services and systems to external parties.

From left to right, Mr. Fujimoto and Mr. Ishimura