Conversation

@y-guyon
Contributor

@y-guyon y-guyon commented Jun 18, 2025

Based on #2599.

Note that "(Default: psnr)" was incorrect.

@maryla-uc
Contributor

With this PR, passing -a sharpness=0 does nothing, because tune=iq is applied afterwards and overrides it.
The same happens if you manually set -a sharpness=0 -a tune=iq. It only works as expected with -a tune=iq -a sharpness=0.

@juliobbv-p

@maryla-uc good catch! tune iq sets up many optimized defaults (including --sharpness), so the codec control call to set tune iq should happen before the -a parameters.
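
For illustration, here is a minimal sketch of that ordering, not the actual codec_aom.c code (the helper name and parameters are made up): the default tuning is applied via aom_codec_control() first, and the user's -a key/value pairs via aom_codec_set_option() afterwards, so an explicit sharpness value is not clobbered by the tune=iq defaults.

#include "aom/aom_codec.h"
#include "aom/aomcx.h"

// Hypothetical helper: apply the default tuning before any user-supplied options.
static aom_codec_err_t applyTuningThenUserOptions(aom_codec_ctx_t * codec,
                                                  int defaultTuning, // e.g. AOM_TUNE_IQ
                                                  const char * const * keys,
                                                  const char * const * values,
                                                  int count)
{
    // Tune first: AOM_TUNE_IQ adjusts several settings behind the scenes
    // (sharpness, deltaq-mode, ...), so it must not run after the user options.
    aom_codec_err_t err = aom_codec_control(codec, AOME_SET_TUNING, defaultTuning);
    if (err != AOM_CODEC_OK) {
        return err;
    }
    // User "-a key=value" options last, so they override the tuning defaults.
    for (int i = 0; i < count; ++i) {
        err = aom_codec_set_option(codec, keys[i], values[i]);
        if (err != AOM_CODEC_OK) {
            return err;
        }
    }
    return AOM_CODEC_OK;
}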

@y-guyon
Contributor Author

y-guyon commented Jul 23, 2025

@maryla-uc good catch! tune iq sets up many optimized defaults (including --sharpness), so the codec control call to set tune iq should happen before the -a parameters.

Done.

@castano

castano commented Oct 3, 2025

Any idea what's holding up this PR?

@y-guyon
Contributor Author

y-guyon commented Oct 3, 2025

Any idea what's holding up this PR?

There are a few consequences of TUNE=IQ to take into account before enabling it by default:

  • For a given quality setting, TUNE=IQ does not produce the same file size and visual distortion as TUNE=SSIM, so the default encoding settings may produce noticeably different output after this change.
  • TUNE=IQ is slower to encode. One option would be to raise the default speed from 6 to 7, but that could in turn affect output file size and decoding time.

@juliobbv-p

@castano another thing worth mentioning: tune IQ is already considered production quality (on libaom 3.13.1 and newer). If a project doesn't depend on specific tune ssim behavior, it can switch to tune IQ right away.

For example, The Guardian has been using tune IQ for a few months, and we've received positive feedback from the site's administrators reporting overall better image quality and consistency. Tune IQ's improvement in bit allocation can by itself be seen as a "bug fix".

I wouldn't wait for this PR to be merged as a sign of feature "readiness". On the contrary, we're just trying to be super thorough in understanding the inherent differences in bitrate allocation and encode time with tune IQ across images, compared to tune SSIM.

@castano

castano commented Oct 4, 2025

I think it makes sense to evaluate the consequences of enabling tune IQ before making it the default choice, but it would be helpful to first add support for tune IQ as a user option, and only later make it the default after more users have had the chance to evaluate it and provide feedback.

I have clients interested in using tune IQ, but they won't maintain a custom build of libavif with modifications that they have to merge on every update.

Similarly, tools that use libavif won't offer this option until it's officially supported by libavif, so not having this exposed also delays its upstream adoption.

@juliobbv-p

juliobbv-p commented Oct 5, 2025

@castano you can enable tune IQ in avifenc with the following parameter: -a c:tune=iq. This is the official way to switch between different tunings, so it's guaranteed to never change.

avifenc in turn passes the parameter on to libavif as a key ("tune") / value ("iq") pair, as follows:

static avifBool avifCodecSpecificOptionsAdd(avifCodecSpecificOptions * options, const char * keyValue)
{
    const char * value = strchr(keyValue, '=');
    if (value) {
        // Keep the parts on the left and on the right of the equal sign,
        // but not the equal sign itself.
        const size_t keyLength = value - keyValue;
        return avifCodecSpecificOptionsAddKeyValue(options, keyValue, keyLength, value + 1);
    }
    // Pass in a non-NULL, empty string. Codecs can use the mere existence of a key as a boolean value.
    return avifCodecSpecificOptionsAddKeyValue(options, keyValue, strlen(keyValue), "");
}

avifenc.c has the necessary code to talk to libavif, including passing in codec-specific options.
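
For a program using libavif directly rather than through avifenc, here is a minimal sketch of the same thing via the public API, assuming a recent libavif where avifEncoderSetCodecSpecificOption() returns an avifResult (the function name configureEncoderForTuneIq is made up for this example):

#include <stdio.h>
#include "avif/avif.h"

// Request tune=iq for the underlying AV1 encoder, like "-a tune=iq" in avifenc.
static void configureEncoderForTuneIq(avifEncoder * encoder)
{
    const avifResult result = avifEncoderSetCodecSpecificOption(encoder, "tune", "iq");
    if (result != AVIF_RESULT_OK) {
        fprintf(stderr, "Failed to set tune=iq: %s\n", avifResultToString(result));
    }
}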

@castano

castano commented Oct 6, 2025

Hmm... I haven't tried avifenc, but I didn't see the iq option in the aomOptionDefs struct, so I assumed that would not work without additional changes:

https://github.com/AOMediaCodec/libavif/blob/40cb2c1fbff1e14b6c81816b73a364490b12ffb6/src/codec_aom.c#L397C39-L397C49

Ah, I see it gets routed to aom_codec_set_option, never mind!

@wantehchang
Collaborator

Ignacio: The aomOptionDefs struct is only used with older versions of libaom that don't have the aom_codec_set_option() function. In that case, the feature detection macro HAVE_AOM_CODEC_SET_OPTION is not defined.

For testing this pull request, please use the latest version of libaom, v3.13.1 if you can. Thanks!
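
To make the dispatch concrete, here is a rough sketch of the pattern, not the exact codec_aom.c code (avifApplyAomOption is a placeholder name): with a recent libaom the string pair goes straight to aom_codec_set_option(), otherwise only the options listed in the aomOptionDefs table can be applied through aom_codec_control().

#include "aom/aom_codec.h"
#include "avif/avif.h"

// Placeholder helper illustrating the HAVE_AOM_CODEC_SET_OPTION dispatch.
static avifBool avifApplyAomOption(aom_codec_ctx_t * codec, const char * key, const char * value)
{
#if defined(HAVE_AOM_CODEC_SET_OPTION)
    // Recent libaom: forward the raw key/value pair, e.g. ("tune", "iq").
    return aom_codec_set_option(codec, key, value) == AOM_CODEC_OK;
#else
    // Older libaom: the option must be found in the aomOptionDefs table and mapped
    // to its aom_codec_control() id. The table lookup is omitted in this sketch.
    (void)codec;
    (void)key;
    (void)value;
    return AVIF_FALSE;
#endif
}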

@y-guyon y-guyon changed the title from "Use AOM_TUNE_IQ by default" to "Use AOM_TUNE_IQ by default for YUV" on Jan 5, 2026
@y-guyon y-guyon requested a review from wantehchang January 16, 2026 10:56
Co-authored-by: Vincent Rabaud <vrabaud@google.com>
#if defined(AOM_HAVE_TUNE_IQ) && defined(AOM_USAGE_ALL_INTRA) && defined(ALL_INTRA_HAS_SPEEDS_7_TO_9)
// AOM_TUNE_IQ is favored for its low perceptual distortion on luma and chroma samples.
// AOM_TUNE_IQ sets --deltaq-mode=6 which can only be used in all intra mode.
if (image->matrixCoefficients != AVIF_MATRIX_COEFFICIENTS_IDENTITY && (addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE)) {
Collaborator


IMPORTANT: If the addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE test is intended to detect whether all intra mode is used, ideally we should wait until we have set the aomUsage variable and then test aomUsage == AOM_USAGE_ALL_INTRA.

This requires moving some code around and possibly saving some variables, such as aomUsage and useTuneIq, in the codec->internal struct.
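
For reference, a simplified sketch of what deciding aomUsage first could look like (hypothetical; the speed-related conditions from the patch and the codec->internal bookkeeping are omitted):

#include "aom/aom_encoder.h"
#include "aom/aomcx.h"
#include "avif/avif.h"

// Hypothetical: pick the usage first, then derive the default tuning from it.
static int avifChooseDefaultTuning(const avifImage * image, avifAddImageFlags addImageFlags)
{
    unsigned int aomUsage = AOM_USAGE_GOOD_QUALITY;
#if defined(AOM_USAGE_ALL_INTRA)
    if (addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE) {
        aomUsage = AOM_USAGE_ALL_INTRA;
    }
#endif
#if defined(AOM_HAVE_TUNE_IQ) && defined(AOM_USAGE_ALL_INTRA)
    // AOM_TUNE_IQ sets --deltaq-mode=6, which is only valid in all intra mode.
    if (image->matrixCoefficients != AVIF_MATRIX_COEFFICIENTS_IDENTITY && aomUsage == AOM_USAGE_ALL_INTRA) {
        return AOM_TUNE_IQ;
    }
#endif
    (void)image;
    (void)aomUsage;
    return AOM_TUNE_SSIM;
}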

Would you like me to take a stab at this and attach a patch to this pull request?

Contributor Author


Would you like me to take a stab at this and attach a patch to this pull request?

Thank you for the proposal. Could you do that in a separate pull request, so that this PR only switches TUNE=IQ on by default?

Collaborator


Yes, we can reorder the code in a separate pull request.

Optional: If the addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE check is intended to detect whether all intra mode is used rather than whether we are encoding a still image, it would be good to use a local variable whose name clarifies the purpose of the check:

            avifBool useAllIntra = (addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE) != 0;
            if (image->matrixCoefficients != AVIF_MATRIX_COEFFICIENTS_IDENTITY && useAllIntra) {

@wantehchang
Collaborator

Yannis: Sorry about the late review. I missed your review request.

Collaborator

@wantehchang wantehchang left a comment


LGTM. Thanks!

} else {
libavifDefaultTuneMetric = AOM_TUNE_SSIM;
#if defined(AOM_HAVE_TUNE_IQ)
#if !defined(AOM_USAGE_ALL_INTRA) || !defined(ALL_INTRA_HAS_SPEEDS_7_TO_9)
Collaborator


This check looks strange here, especially the check for ALL_INTRA_HAS_SPEEDS_7_TO_9. The reason for this check is not obvious until we get to line 721 below:

            if (aomCpuUsed >= 7) {
#if defined(AOM_USAGE_ALL_INTRA) && defined(ALL_INTRA_HAS_SPEEDS_7_TO_9)
                if (!(addImageFlags & AVIF_ADD_IMAGE_FLAG_SINGLE)) {
                    aomUsage = AOM_USAGE_REALTIME;
                }
#else
                aomUsage = AOM_USAGE_REALTIME;
#endif
            }

If you want to avoid reordering the code now, I suggest we delete this check. This check is always false in libaom releases or on the libaom main branch.


printf(" sharpness=S : Bias towards block sharpness in rate-distortion optimization of transform coefficients in 0..7. (Default: 0)\n");
printf(" tune=METRIC : Tune the encoder for distortion metric, one of 'psnr', 'ssim' or 'iq'.\n");
printf(" (Default for color: ssim, default for alpha: psnr)\n");
printf(" (Default for color: still images (libaom v3.12.0+): iq, otherwise: ssim; default for alpha: psnr)\n");
Collaborator


Should we change "still images" to "still non-RGB images"?
