View Issue Details

ID: 0000355
Project: JVT JM H.264/AVC reference software
Category: encoder
View Status: public
Last Update: 2015-07-08 19:23
Reporter: andrewK
Assigned To: Karsten Suehring
Priority: high
Severity: major
Reproducibility: always
Status: acknowledged
Resolution: open
Platform: Desktop PC
OS: Windows
OS Version: Win7
Product Version: JM 18.6
Summary: 0000355: MaxMvsPer2Mb constraint violated per Table A-1 (number of MVs per two consecutive MBs exceeds the limit)
Description: Dear Karsten, Experts,

I have just encoded some 1080p video content at different frame rates using JM 19.0. After checking the bitstreams for conformance, I get a number of messages: 'Number of MVs per two consecutive MBs exceeded limit. Max is 16, found 19.' I suppose this refers to a violation of the MaxMvsPer2Mb constraint in Table A-1 (Level limits). As expected, this tends to happen at lower QP values, when better video quality is achievable.
Thanks!

PS: the example bitstream is type30QPB202020.h264, but it can't be uploaded here due to the 5000k size limit; I will try to send it directly to Karsten.
Tags: No tags attached.

Activities

andrewK

2015-07-07 16:58

reporter  

MaxMvsPer2Mb_issue.bmp (2,588,598 bytes)

Alexis Michael Tourapis

2015-07-07 19:27

developer   ~0000636

The JM software currently can only consider a "soft" constraint for this level limitation. That is, it is applied by restricting the allowed block types appropriately, e.g. by disabling PSliceSearch4x4/BSliceSearch4x4 or even larger partitions. The software does not have an "internal" limiting scheme that would count the partitions. This also follows common practice, where people tend not to use such very small partitions given their other drawbacks.
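For reference, the "soft" constraint described above amounts to turning off the small-partition searches in the encoder configuration. A minimal encoder.cfg fragment might look like the following (the PSliceSearch*/BSliceSearch* parameter names follow recent JM releases; whether 8x8 also needs disabling depends on the level and prediction mode, so treat this as a sketch rather than a guaranteed fix):

```text
# Disable sub-8x8 partition searches in P and B slices so that a
# macroblock produces at most 4 motion partitions per prediction list.
PSliceSearch8x4  = 0
PSliceSearch4x8  = 0
PSliceSearch4x4  = 0
BSliceSearch8x4  = 0
BSliceSearch4x8  = 0
BSliceSearch4x4  = 0
# If the limit is still hit (e.g. with bi-prediction), 8x8 can be
# disabled as well via PSliceSearch8x8 / BSliceSearch8x8.
```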

andrewK

2015-07-08 18:03

reporter   ~0000639

Thanks Alexis for your clarifications. So basically you're saying that once you disable all partition sizes smaller than 8x8, you'll never exceed the limits specified in Table A-1. Yes, encoding higher-resolution video (e.g. >=960x540) normally doesn't need these partitions to achieve the required high-quality bitstream. I wonder what caused these limits to be put there in the first place: reducing memory requirements and/or computational complexity? Formally this is quite unfortunate and potentially harms interoperability for some systems, if someone follows the reference implementation exactly, as it itself breaks the official specification.

Karsten Suehring

2015-07-08 18:12

administrator   ~0000640

Right, the limit was introduced to reduce memory access and complexity. When interpolation is done on the fly, too many motion vectors may require too many different sub-pel positions.

A decoder that can decode streams encoded by the reference software should be fine, because it has higher capabilities than the spec requires.

On the other hand, a stream encoded with an encoder that does not follow the constraint may not be decodable with a compliant decoder.

We have been pointing out missing profile/level checks for years. Our resources are constrained, so if there are no contributions, there will be no fixes.

andrewK

2015-07-08 18:22

reporter   ~0000641

>>On the other hand, a stream encoded with an encoder that does not follow the constraint may not be decodable with a compliant decoder.
Yes, exactly. Thanks Karsten!

Alexis Michael Tourapis

2015-07-08 19:23

developer   ~0000642

As Karsten has said, this was introduced to reduce memory access and "decoder" complexity. As a note, this check only counts partitions, not actual motion vectors, and it is quite possible that your partitions share the same motion vectors, or that they use integer samples (thus not requiring sub-pixel interpolation). It also complicates the design of encoders, and to some extent the design of this constraint was a bit shortsighted (at least in my opinion; others may disagree) and could have been done in much more elegant ways. But it is what it is.

A somewhat "easy" way to fix this is to add a few counter variables at different places in the software. The primary one contains the number of partitions used by the previous macroblock; this sets the limit on the partitions that can be tested in the current MB. Every subpartition that is evaluated/decided would then update the related RD-decision counters for subsequent subpartition decisions. It is quite important to also leave a budget of "1" for the next macroblock, since otherwise, if you use up your budget, you may end up forcing that macroblock to intra.

Special handling is also needed for MBAFF partitions, assuming you care about interlace.
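The budget scheme sketched above could look roughly like the following. This is a hypothetical helper, not actual JM code; the function name and the simplification of counting one MV per partition are my own assumptions, and MAX_MVS_PER_2MB is the Table A-1 value for Level 3.1 and above:

```c
#include <assert.h>

/* Table A-1 MaxMvsPer2Mb for Level >= 3.1 (the value reported
 * as violated above). Lower levels allow more (e.g. 32). */
#define MAX_MVS_PER_2MB 16

/* Given how many motion vectors the previous macroblock used,
 * return how many the current MB may spend. One MV is always
 * reserved for the next MB so that exhausting the budget never
 * forces the following macroblock to intra coding. */
static int mv_budget_for_current_mb(int prev_mb_mvs)
{
    int budget = MAX_MVS_PER_2MB - prev_mb_mvs - 1; /* keep 1 in reserve */
    return budget < 1 ? 1 : budget; /* an inter MB needs at least one MV */
}
```

The mode-decision loop would then skip (or stop evaluating) any partitioning whose MV count exceeds this budget, and after the MB is coded, its actual MV count becomes prev_mb_mvs for the next iteration.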

Issue History

Date Modified Username Field Change
2015-07-07 16:58 andrewK New Issue
2015-07-07 16:58 andrewK File Added: MaxMvsPer2Mb_issue.bmp
2015-07-07 19:27 Alexis Michael Tourapis Note Added: 0000636
2015-07-08 15:20 Karsten Suehring Assigned To => Karsten Suehring
2015-07-08 15:20 Karsten Suehring Status new => acknowledged
2015-07-08 18:03 andrewK Note Added: 0000639
2015-07-08 18:12 Karsten Suehring Note Added: 0000640
2015-07-08 18:22 andrewK Note Added: 0000641
2015-07-08 19:23 Alexis Michael Tourapis Note Added: 0000642