[Bug 2044852] [NEW] libgcrypt < 1.10.2 returns wrong sha3 hashes for inputs > 4 GiB

Tobias Heider 2044852 at bugs.launchpad.net
Mon Nov 27 20:59:39 UTC 2023


Public bug reported:

[ Impact ]

SHA3 produces wrong results for inputs larger than 4 GiB.

[ Test Plan ]

Calculate the SHA3 hash of an input file larger than 4 GiB and compare
it with the output of another implementation such as OpenSSL (see the
reproducer sketch below).

Expected behavior: same output
Actual behavior: different output
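
A minimal reproducer sketch, assuming libgcrypt's gcry_md API on a
64-bit Linux host; error handling is omitted and names are
illustrative. The key point is to pass the whole file to
gcry_md_write() in a single call, since feeding it in smaller chunks
masks the bug:

    /* build (assumed invocation): gcc repro.c -o repro -lgcrypt */
    #include <fcntl.h>
    #include <gcrypt.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(int argc, char **argv)
    {
        gcry_md_hd_t hd;
        struct stat st;
        unsigned char *data, *digest;
        unsigned int i;
        int fd;

        if (argc != 2)
            return 1;
        gcry_check_version(NULL);

        fd = open(argv[1], O_RDONLY);
        fstat(fd, &st);
        /* Map the whole file so it can be hashed with one write. */
        data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);

        gcry_md_open(&hd, GCRY_MD_SHA3_256, 0);
        gcry_md_write(hd, data, st.st_size);   /* single > 4 GiB write */
        digest = gcry_md_read(hd, GCRY_MD_SHA3_256);

        for (i = 0; i < gcry_md_get_algo_dlen(GCRY_MD_SHA3_256); i++)
            printf("%02x", digest[i]);
        putchar('\n');

        gcry_md_close(hd);
        munmap(data, st.st_size);
        close(fd);
        return 0;
    }

Compare the printed digest with, e.g.:

    openssl dgst -sha3-256 bigfile

On an affected libgcrypt the two digests differ once the file exceeds
4 GiB.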

[ Where problems could occur ]

People relying on the previously broken hash values might be surprised
by the new, correct result.  The impact should be low, since SHA3 from
libgcrypt is not widely used, especially not with inputs of this size.

[ Other Info ]

From the upstream bug report:

The SHA3 functions give wrong results for inputs larger than 4 GiB
because the length argument, originally a size_t, is handled as an
unsigned int in keccak_write, which leads to integer overflows. This
does not happen if the data is fed into md_write in smaller chunks.
More information and reproducers are available from Clemens in the
attached bug.
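
The underlying truncation can be shown in isolation. A minimal sketch,
not libgcrypt's actual code (the function name is invented; it only
mirrors the unsigned int parameter described above):

    #include <stdio.h>
    #include <stddef.h>

    /* Bug pattern: the length parameter should be size_t. */
    static void keccak_write_like(unsigned int inlen)
    {
        printf("bytes the hash actually sees: %u\n", inlen);
    }

    int main(void)
    {
        size_t inlen = 5ULL * 1024 * 1024 * 1024;   /* 5 GiB input */
        /* On a 64-bit platform the high 32 bits are silently dropped:
         * 5 GiB mod 2^32 = 1 GiB, so 4 GiB of input are never hashed. */
        keccak_write_like(inlen);
        return 0;
    }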

The fix that should solve the problem (passing the length as a size_t
throughout) is now available on GitLab:
https://gitlab.com/redhat-crypto/libgcrypt/libgcrypt-mirror/-/merge_requests/6
Comments are welcome.
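
Until a fixed library is in place, callers can sidestep the overflow by
feeding data in chunks well below 4 GiB. A sketch of that workaround
(the chunk size and helper name are arbitrary choices, not from
upstream):

    #include <gcrypt.h>
    #include <stddef.h>

    #define CHUNK (64UL * 1024 * 1024)   /* 64 MiB per write */

    static void md_write_chunked(gcry_md_hd_t hd,
                                 const unsigned char *data, size_t len)
    {
        while (len > 0) {
            size_t n = len < CHUNK ? len : CHUNK;
            gcry_md_write(hd, data, n);   /* stays far below 4 GiB */
            data += n;
            len -= n;
        }
    }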

I considered updating some of the hash tests to capture this issue, but
have not yet found a simple way to do so, so I will leave it to you to
decide whether a regression test is needed here.

Upstream Bug: https://dev.gnupg.org/T6217
Upstream Fix: https://dev.gnupg.org/rC9c828129b2058c3f36e07634637929a54e8377ee

** Affects: libgcrypt20 (Ubuntu)
     Importance: Undecided
         Status: New

-- 
You received this bug notification because you are a member of Ubuntu
Foundations Bugs, which is subscribed to libgcrypt20 in Ubuntu.
https://bugs.launchpad.net/bugs/2044852
