# Introduction

While so far the most common way of representing the visual component of the world has been to take the output of a camera, compress it for transmission and storage using one of the MPEG video coding standards and eventually decode it and present it on 2D displays, there are now more and more devices that capture and present 3D representations of the world.

A point cloud is a set of points in a 3D space, each with associated data, such as the two acquisition angles (phi and theta), colour, material properties and/or other attributes. Point clouds can be used to reconstruct an object or a scene as a composition of such points. Point clouds can be captured using multiple cameras and depth sensors in various setups, and may be made up of thousands to billions of points in order to represent realistically reconstructed scenes.
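As a rough illustration (not part of the standard, and with field names of our own choosing), such a point set can be held as parallel arrays of per-point positions and attributes:

```python
import numpy as np

# Hypothetical in-memory layout for a tiny point cloud:
# one row per point, XYZ position plus an RGB colour attribute.
num_points = 4
positions = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])                                  # shape (N, 3) coordinates
colors = np.array([
    [255, 0, 0],
    [0, 255, 0],
    [0, 0, 255],
    [255, 255, 255],
], dtype=np.uint8)                  # shape (N, 3), 8-bit RGB per point

assert positions.shape == (num_points, 3)
assert colors.shape == (num_points, 3)

# A larger scene is then simply the union (concatenation) of such
# point sets, e.g. the same object placed at two heights.
scene = np.concatenate([positions, positions + [0.0, 0.0, 2.0]])
print(scene.shape)  # (8, 3)
```

Realistic captures replace these few rows with millions of points, which is what motivates the compression effort described below.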

As compression technologies are needed to reduce the amount of data required to represent a point cloud, MPEG is planning to develop a Point Cloud Compression standard. The standard targets lossy compression for use in real-time communications and lossless compression for GIS, CAD and cultural-heritage applications, and aims to provide efficient geometry and attribute compression, scalable/progressive coding, coding of sequences of point clouds captured over time, and random access to subsets of the point cloud.

The acquisition of Point Clouds is outside of the scope of this standard.

# Test Material Datasets

Below is a list of the 3D point cloud and mesh content sequences to be used, organized in sections based on the data characteristics. All datasets will be uploaded to the MPEG Content repository.

# Summary

We report unexpected results for the BD-rate metric when evaluating the ultra-high-definition sequences from SVT. The behaviour appears to be due to the polynomial curve fitting and the high-frequency noise in the sequences. As this metric is widely used by Question 6/16, we report our results for discussion. We also discuss an alternative method for calculating the average rate difference.

# 2 Reported Observations

The table below shows the results of coding the SVT sequence "IntoTree" (resolution 3840x2160) with an IBBP structure using JM 13.0. We first report results using the configurations from the VCEG common test conditions [2], denoted "JM NO Slice". Additionally, we report data with slices enabled in the JM, denoted "JM with 2MB row per slice". This second set of data points serves to illustrate the unexpected behaviour that we observe.

Computing the Bjøntegaard difference over the above data points yields BD-rate and BD-PSNR values for the comparison between the two configurations of -16.32% and -0.04 dB, respectively. This is significantly different from what is expected: a BD-rate of -16.32% traditionally corresponds to a BD-PSNR much larger in magnitude than -0.04 dB. We plot the R-D performance in Fig. 1. Again, the R-D curve does not seem representative of the reported BD-rate measurement.

Fig. 1 RD plot IntoTree sequence

To understand this unexpected result, we plot the polynomial curve used to compute the BD-rate. This is shown in Fig. 2, where the curves in red and blue are the fitted polynomial curves, and the dashed lines are the linear connection of the data points. Inspecting the figure, we see that the interpolation produces results that are not consistent with the behaviour of most coding technology. Specifically, consider the points corresponding to 37 dB and 38 dB: the bit-rate corresponding to the 38 dB point is substantially lower than the bit-rate corresponding to the 37 dB point.
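For reference, the BD-rate computation of [1] can be sketched as follows. This is our own minimal re-implementation (the function and variable names are ours): a third-order polynomial is fitted to PSNR versus log10(bit-rate) for each configuration, and the gap between the two fits is averaged over the overlapping PSNR range.

```python
import numpy as np

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    """Average bit-rate difference (%) of test vs. reference, per [1].

    Fits a 3rd-order polynomial to (PSNR, log10(rate)) for each curve
    and integrates the difference over the overlapping PSNR range.
    """
    log_r1 = np.log10(rates_ref)
    log_r2 = np.log10(rates_test)

    # Third-order polynomial fit in the log-rate domain; with noisy
    # high-resolution data this fit can become non-monotone, which is
    # the anomaly discussed in the text.
    p1 = np.polyfit(psnr_ref, log_r1, 3)
    p2 = np.polyfit(psnr_test, log_r2, 3)

    # Integrate over the PSNR interval common to both curves.
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    int1 = np.polyint(p1)
    int2 = np.polyint(p2)
    avg_log_diff = ((np.polyval(int2, hi) - np.polyval(int2, lo))
                    - (np.polyval(int1, hi) - np.polyval(int1, lo))) / (hi - lo)

    # Convert the mean log10-rate gap back to a percentage difference.
    return (10.0 ** avg_log_diff - 1.0) * 100.0

# Synthetic sanity check: a curve needing 10% more rate at every PSNR
# should come out at roughly +10% BD-rate.
psnr = [32.0, 34.0, 36.0, 38.0]
r1 = [1000.0, 1500.0, 2300.0, 3500.0]
r2 = [1.1 * r for r in r1]
print(round(bd_rate(r1, psnr, r2, psnr), 2))  # ~10.0
```

The percentage is meaningful only insofar as the fitted polynomials track the true R-D curves; when the fit loops back as in Fig. 2, the integrated gap no longer reflects the visual R-D comparison.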

As additional information, we report that Piecewise Cubic Hermite Interpolating Polynomial (PCHIP) interpolation seems to fit the data points well. The cubic interpolation has the advantage of preserving shape and respecting monotonicity, and it is straightforward to calculate the average difference between two cubic-interpolated curves. The cubic-fitted curves for the data sets in Table 1 are shown in Fig. 3. Using this cubic fit, the average rate difference between the two sets of data points is 3.74%. For "well behaved" sequences, the cubic interpolation yields results very close to the BD differences.
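The shape-preserving alternative can be sketched with a Fritsch-Carlson monotone cubic Hermite interpolant, a common PCHIP construction. This is our own simplified implementation under that assumption, not the exact code used to obtain the 3.74% figure; in particular, the end-slope rule here is a plain one-sided choice.

```python
import numpy as np

def pchip_slopes(x, y):
    """Fritsch-Carlson slopes for a monotone cubic Hermite interpolant."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    h = np.diff(x)                      # interval widths
    d = np.diff(y) / h                  # secant slopes
    m = np.zeros_like(y)
    # Interior slopes: weighted harmonic mean of adjacent secants,
    # forced to zero at local extrema so the curve never overshoots.
    for k in range(1, len(x) - 1):
        if d[k - 1] * d[k] <= 0:
            m[k] = 0.0
        else:
            w1 = 2 * h[k] + h[k - 1]
            w2 = h[k] + 2 * h[k - 1]
            m[k] = (w1 + w2) / (w1 / d[k - 1] + w2 / d[k])
    # Simple one-sided end slopes (library PCHIPs use a fancier rule).
    m[0], m[-1] = d[0], d[-1]
    return m

def pchip_eval(x, y, m, xq):
    """Evaluate the cubic Hermite interpolant at query points xq."""
    x, y, xq = np.asarray(x, float), np.asarray(y, float), np.asarray(xq, float)
    k = np.clip(np.searchsorted(x, xq) - 1, 0, len(x) - 2)
    h = x[k + 1] - x[k]
    t = (xq - x[k]) / h
    h00 = (1 + 2 * t) * (1 - t) ** 2    # Hermite basis functions
    h10 = t * (1 - t) ** 2
    h01 = t ** 2 * (3 - 2 * t)
    h11 = t ** 2 * (t - 1)
    return h00 * y[k] + h10 * h * m[k] + h01 * y[k + 1] + h11 * h * m[k + 1]

# R-D use: interpolate PSNR -> log10(rate) for each configuration,
# then average the gap between the two curves over a dense PSNR grid.
psnr = [32.0, 34.0, 36.0, 38.0]
log_rate = np.log10([1000.0, 1500.0, 2300.0, 3500.0])
m = pchip_slopes(psnr, log_rate)
grid = np.linspace(32.0, 38.0, 200)
fit = pchip_eval(psnr, log_rate, m, grid)
```

Because the slopes satisfy the Fritsch-Carlson monotonicity condition, the fitted bit-rate can never decrease as PSNR increases between data points, which is exactly the anomaly the polynomial fit exhibits in Fig. 2.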

Fig. 2 Polynomial curve fitting used in BD rate calculation

Fig. 3 Cubic Interpolation used in alternative rate calculation

# 3 Conclusions

We have observed that, for some higher-resolution sequences, the R-D performance may not be well modelled by the polynomial curve fitting used in the BD calculation. Specifically, we have observed "non-normal" or "unexpected" results for the sequences DucksTakeOff, IntoTree and OldTownCross in the SVT test sequence set.

# 4 References

[1] G. Bjøntegaard, "Calculation of Average PSNR Differences between RD curves", ITU-T SG 16/Q.6 13th VCEG Meeting, Austin, Texas, USA, document VCEG-M33 (available at http://ftp3.itu.int/av-arch/video-site), April 2001.

[2] T. Tan, G. Sullivan, T. Wedi, "Recommended Simulation Common Conditions for Coding Efficiency Experiments Revision 2", ITU-T SG 16/Q.6, document VCEG-AH10.

**IPR Statement:** Sharp may have IPR relating to the technology described in this contribution and, conditioned on reciprocity, is prepared to grant licenses under reasonable and non-discriminatory terms as necessary for implementation of the resulting ITU-T Recommendation (per box 2 of the ITU-T/ITU-R/ISO/IEC patent statement and licensing declaration form).