One of the first tasks I was given at my new job was building a new REST API. One of the requirements was generating random strings and hashing them (not for passwords, but something similar).
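The requirement boils down to something like the sketch below, using only the standard library. The function name and the SHA-256 stand-in are my own assumptions for illustration, not the project's actual code (the real project used Django's password hashers, which is where the trouble started):

```python
import hashlib
import secrets

def make_card_token() -> tuple[str, str]:
    """Generate an unpredictable random token and a hash to store alongside it.

    Plain SHA-256 is a stand-in here; the actual project used one of
    Django's configurable password hashers.
    """
    token = secrets.token_urlsafe(32)  # ~43-char URL-safe random string
    digest = hashlib.sha256(token.encode()).hexdigest()
    return token, digest

token, digest = make_card_token()
```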
So I naturally started working on it and figured it would be easy. After all, Django provides a bunch of hashers out of the box.
When we started testing locally, i.e. local server and network, we saw a pretty noticeable lag handling a request for about one thousand cards. I tried optimizing my code here and there and saw only insignificant improvements. Rewriting some database queries as raw SQL instead of going through the ORM didn't help either.
So I started benchmarking, and oh boy, it was the hashing algorithm: PBKDF2 in particular, since it is Django's default password hasher.
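This slowness is by design: PBKDF2 applies its underlying hash hundreds of thousands of times per call to make brute-forcing expensive. You can reproduce the per-hash cost with the standard library alone; the iteration count below is an assumption in the ballpark of recent Django defaults (older versions used far fewer):

```python
import hashlib
import time

# Assumed iteration count, roughly matching recent Django defaults.
ITERATIONS = 600_000

start = time.perf_counter()
digest = hashlib.pbkdf2_hmac("sha256", b"some-card-token", b"salt1234", ITERATIONS)
elapsed = time.perf_counter() - start
print(f"one PBKDF2-SHA256 hash ({ITERATIONS} iterations): {elapsed:.3f}s")
```

At a tenth of a second or more per hash, thousands of cards per request add up fast.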
It took 1111.366 seconds (18 minutes and 31 seconds) to create 10,000 cards, and the cost scaled almost perfectly linearly from there:

- adding 2,000 cards took 226.852 seconds (3 minutes and 47 seconds): 20.411% of the time for 20% of the cards;
- adding 1,000 cards took 118.977 seconds (1 minute and 59 seconds): 10.705% of the time for 10% of the cards;
- adding 500 cards took 56.03 seconds: 5.041% of the time for 5% of the cards.
Lo and behold, when I switched to Argon2, the time dropped dramatically. With each call generating 100 cards on the same system, one call took 0.1797 seconds, ten calls took 1.6952 seconds, and 100 calls (the same 10,000 cards) took only 16.9347 seconds.
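In Django the switch is just a settings change, assuming the Argon2 backend is installed (`pip install "django[argon2]"`, which pulls in argon2-cffi). A minimal `settings.py` fragment:

```python
# settings.py
# Whichever hasher is listed first is used for new hashes; the rest
# remain so Django can still verify values hashed under old settings.
PASSWORD_HASHERS = [
    "django.contrib.auth.hashers.Argon2PasswordHasher",
    "django.contrib.auth.hashers.PBKDF2PasswordHasher",
    "django.contrib.auth.hashers.PBKDF2SHA1PasswordHasher",
    "django.contrib.auth.hashers.ScryptPasswordHasher",
]
```

Keeping PBKDF2 in the list means existing hashes keep validating while new ones use Argon2.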
A HUGE improvement: for the same 10,000 cards, Argon2 was roughly 65 times faster than PBKDF2.
Use Argon2 for hashing whenever possible.
P.S.: Tests were performed against PostgreSQL on a PC spec'ed with an Intel Core i3-9100 CPU and a Samsung 860 EVO SSD.