Aldec 4K UltraHD Imaging Solutions powered by the TySOM-3-ZU7EV board (Zynq UltraScale+ MPSoC)

Given the constantly growing resolution requirements of both image data producers (e.g. CMOS image sensors, live online video streams) and display panels, 4K UltraHD (2160p) imaging has become today's de facto standard. Applications include, but are not limited to, consumer electronics, medical, automotive and professional A/V broadcast, and the list keeps growing. While specialized desktop solutions have enough processing power to handle UltraHD image computations, especially with access to cloud data services, embedded edge processing remains challenging, particularly when low power consumption must be considered.

Aldec meets this challenge with its Zynq UltraScale+ MPSoC-based TySOM-3-ZU7EV prototyping board, which provides all the connectivity and processing power necessary to acquire, process and display high-resolution 4K image data at rates up to 60 FPS. Aldec clients do not have to choose between network channel bandwidth and image data quality: the 4K processing bundle satisfies both requirements by providing two separate Linux-based solutions, each focused on a different type of image broadcasting:

• Solution #1: lossless raw image data (QSFP+ transfer)
• Solution #2: low-bitrate encoded image data (TCP/IP over Ethernet transfer; Video Codec Unit - VCU encoding/decoding)

Figure 1: Aldec's 4K Imaging Solution Overview

Each of these solutions also includes a single-board version, in which image data simply runs from an image source (e.g. a 4K-capable camera) to a display (HDMI or DP) connected to the same TySOM board.
Lossless Raw Image Data Broadcast

The main goal of this project is to demonstrate the transmission of untouched, high-bandwidth image data between source and destination devices over a high-speed QSFP+ connection, backed by the on-board QSFP+ peripheral connectivity and on-chip serial transceivers running Xilinx's high-speed point-to-point Aurora communication protocol. Given the internal pixel data representation, the resulting bandwidth for a 2160p@60Hz data rate is about 1 GB/s, a requirement that widely used 1 Gbit Ethernet cannot meet.

Main hardware components and features used in the design:

• Leopard Imaging LI-IMX274MIPI-FMC (v1.1) based on the Sony IMX274 imager as the video source device;
• 4K-capable monitor with HDMI/DP interfaces as the video display device;
• Zynq UltraScale+ built-in hardware blocks: DP 1.2a controller (up to 2160p@30Hz), *Mali GPU;
• Zynq UltraScale+ PL-side soft IP blocks: MIPI CSI-2 RX SS, HDMI 2.0 TX SS (up to 2160p@60Hz), Aurora TX/RX SS;
• QSFP+ compliant copper or optical cable.

* The Mali GPU is used in the single-board GUI version only.

Figure 2: Lossless Image Dataflow

Single board 4K pass through (Zynq MPSoC Base TRD)

The Linux demo application for the single-board design is provided either as a GUI-less command-line application or as a more advanced Qt-based GUI application with additional user controls for the Sony IMX274 image sensor. Both applications support displaying the image on HDMI (up to 2160p@60) or DP (up to 2160p@30) monitors. The Qt-based demo application uses the built-in Mali GPU for graphics hardware acceleration under the hood.

Dual-board 4K QSFP+ pass through

Board-to-board communication uses the low-level, GT-based Aurora communication protocol, which makes it possible to transmit large amounts of image data while meeting low-latency and high-bandwidth requirements.
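The ~1 GB/s figure quoted above can be sanity-checked with a quick back-of-the-envelope calculation. This is a sketch assuming a 2-bytes-per-pixel internal representation (e.g. YUV 4:2:2); the actual on-chip pixel format may differ:

```python
# Back-of-the-envelope raw bandwidth for 2160p@60Hz video.
# Assumption: 2 bytes per pixel (e.g. YUV 4:2:2); the design's
# internal pixel representation may differ.
WIDTH, HEIGHT = 3840, 2160   # 4K UltraHD (2160p)
FPS = 60
BYTES_PER_PIXEL = 2

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
bytes_per_second = bytes_per_frame * FPS

print(f"{bytes_per_second / 1e9:.2f} GB/s")        # ~1 GB/s
print(f"{bytes_per_second * 8 / 1e9:.1f} Gbit/s")  # ~8 Gbit/s, far beyond 1 Gbit Ethernet
```

At roughly 8 Gbit/s, the stream saturates a 1 Gbit Ethernet link eight times over, which is why the design falls back on QSFP+ and Aurora instead.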
Moreover, it also reduces overall system cost by replacing the expensive proprietary multi-gigabit Ethernet implementations typically used for high-bandwidth data transfers. Similarly to the single-board version, the dataflow is controlled by a Linux user-space application, and each side has to be configured separately.

Low Bitrate Encoded Image Data Broadcast

Unlike raw image broadcast, encoded image data is well suited to transfer over industry-standard Gigabit Ethernet networks. The on-chip VCU is responsible for memory-to-memory video encoding and decoding tasks, providing a low bitrate with acceptable image quality. The VCU supports both of the most popular video compression standards, H.264/AVC (Advanced Video Coding) and H.265/HEVC (High Efficiency Video Coding), up to a 2160p@60 data rate. The whole image dataflow assumes the pixel data is represented in the NV12 format, a two-plane version of YUV 4:2:0.

Main hardware components and features used in the design:

• Leopard Imaging LI-IMX274MIPI-FMC (v1.1) based on the Sony IMX274 imager as the video source device;
• 4K-capable monitor with HDMI/DP interfaces as the video display device;
• Zynq UltraScale+ built-in hardware blocks: DP 1.2a controller (up to 2160p@30Hz), VCU, **Mali GPU, PS 1Gbit Ethernet controller;
• Zynq UltraScale+ PL-side soft IP blocks: MIPI CSI-2 RX SS, HDMI 2.0 TX SS (up to 2160p@60Hz);
• USB 3.0 or SATA HDD as non-volatile memory storage for File I/O.

** The Mali GPU is used in the single-board GUI version with a DP display only.

There are six predefined presets used to configure the VCU encoder and decoder hardware: AVC (Low, Medium, High) and HEVC (Low, Medium, High), where the target bitrates for Low/Medium/High are 10/30/60 Mbit/s respectively, for both the AVC and HEVC standards. The remaining compression settings (e.g. Profile, Rate control, GoP, etc.) are the same for all presets.
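To see what the NV12 assumption and the six presets mean in practice, the sketch below computes the raw NV12 bandwidth of a 2160p@60 stream and the compression factor each preset's target bitrate implies (the preset bitrates 10/30/60 Mbit/s come from the text above; everything else is simple arithmetic):

```python
# NV12 (two-plane YUV 4:2:0) frame size and the compression implied by
# the VCU presets. NV12 stores a full-resolution Y (luma) plane followed
# by an interleaved UV (chroma) plane at half the luma size.
WIDTH, HEIGHT, FPS = 3840, 2160, 60

y_plane = WIDTH * HEIGHT          # 1 byte per luma sample
uv_plane = WIDTH * HEIGHT // 2    # 4:2:0 chroma: half the luma plane size
frame_bytes = y_plane + uv_plane  # 1.5 bytes per pixel overall

raw_mbit_s = frame_bytes * FPS * 8 / 1e6
print(f"raw NV12 2160p@60: {raw_mbit_s:.0f} Mbit/s")

# Target bitrates of the Low/Medium/High presets (Mbit/s), per the text:
presets = {"Low": 10, "Medium": 30, "High": 60}
for name, mbit in presets.items():
    print(f"{name} preset: ~{raw_mbit_s / mbit:.0f}x compression")
```

Even the High preset therefore compresses the raw NV12 stream by roughly two orders of magnitude, which is what brings a 4K stream within reach of 1 Gbit Ethernet.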
Figure 3: Compressed Image Dataflow

Single board 4K pass through (Zynq MPSoC VCU TRD)

The Linux demo application for the single-board VCU design is provided either as a GUI-less command-line application or as a more advanced Qt-based GUI application with additional user controls for the VCU. Both applications support displaying the image on a DP monitor (up to 2160p@30); note that HDMI (up to 2160p@60) is supported only by the GUI-less application. In both cases the user application operates on top of the GStreamer framework with the OpenMAX Integration Layer and the VCU CtrlSW middleware. The following VCU dataflow modes are supported:

• Record: the VCU encoder is used to store live image data to a .mp4 file on disk storage (Camera only);
• Display: the VCU encoder/decoder is used in a loop to pass image data from the camera to a display. In the File I/O case, only the decoder part of the VCU is used;
• Stream: camera image data is passed through the VCU encoder and then transmitted using the TCP/IP protocol over the network to a specific IP/Port. In the File I/O case, the .mp4 file data bypasses the VCU, since it is already encoded for Ethernet transfer.
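As a rough illustration of the three dataflow modes, the sketch below assembles gst-launch-1.0 pipeline strings of the kind a GStreamer/OpenMAX application on this platform might use. The element names (v4l2src, omxh264enc, omxh264dec, mp4mux, kmssink, tcpclientsink) are standard GStreamer and gst-omx plugins; the device path, host, port and file name are placeholders, and the shipped demo applications may build their pipelines differently:

```python
# Hedged sketches of gst-launch-1.0 pipelines for the three VCU modes.
# /dev/video0, capture.mp4 and 192.168.1.100:5000 are placeholders only.
CAM = "v4l2src device=/dev/video0"  # assumed V4L2 capture node for the camera
CAPS = "video/x-raw,format=NV12,width=3840,height=2160,framerate=60/1"

pipelines = {
    # Record: encode live camera data and mux it into an .mp4 file
    "record": f"{CAM} ! {CAPS} ! omxh264enc ! h264parse ! mp4mux ! "
              "filesink location=capture.mp4",
    # Display: encode/decode loop from the camera to a DRM/KMS display
    "display": f"{CAM} ! {CAPS} ! omxh264enc ! omxh264dec ! kmssink",
    # Stream: encode camera data and send it to a specific IP/Port over TCP
    "stream": f"{CAM} ! {CAPS} ! omxh264enc ! h264parse ! mpegtsmux ! "
              "tcpclientsink host=192.168.1.100 port=5000",
}

for mode, cmd in pipelines.items():
    print(f"{mode}: gst-launch-1.0 {cmd}")
```

The display pipeline round-trips the data through both halves of the VCU on purpose, mirroring the encoder/decoder loop the Display mode describes.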