[Day 457 bug] PNG deflate bug

I felt like debugging today, so I took a crack at it. While I'm pretty sure Casey can find it faster than I did, I just want to post the solution in case it takes him longer than he is willing to spend on it.
But since giving the solution might deprive us of the learning experience of seeing Casey debug this kind of stuff, I'm putting it in spoiler tags ;)

The PeekBits function does not take the IDAT chaining into account; it should keep consuming bytes while 'Buf->ContentsSize || Buf->First', not just while 'Buf->ContentsSize'.
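To make the fix concrete, here is a minimal sketch of what that loop condition change looks like. The struct layout, field names beyond ContentsSize/First, and the exact bit-buffer mechanics are my assumptions for illustration, not the actual Handmade Hero source:

```c
#include <stddef.h>
#include <stdint.h>

// Hypothetical, simplified types modeled on the decoder described above;
// only ContentsSize and First come from the post, the rest is assumed.
typedef struct stream_chunk stream_chunk;
struct stream_chunk
{
    size_t ContentsSize;
    uint8_t *Contents;
    stream_chunk *Next;
};

typedef struct
{
    size_t ContentsSize;   // bytes left in the current IDAT chunk
    uint8_t *Contents;     // read cursor into the current chunk
    stream_chunk *First;   // remaining chunks in the IDAT chain
    uint32_t BitBuf;       // bits fetched but not yet returned
    uint32_t BitCount;     // number of valid bits in BitBuf
} streaming_buffer;

// Refill the bit buffer until at least BitCount bits are available.
// The fix: keep going while the current chunk has bytes OR more IDAT
// chunks remain in the chain -- not just while the current chunk has bytes.
static void PeekBits(streaming_buffer *Buf, uint32_t BitCount)
{
    while((Buf->BitCount < BitCount) &&
          (Buf->ContentsSize || Buf->First))   // was: Buf->ContentsSize only
    {
        if(Buf->ContentsSize == 0)
        {
            // Current IDAT chunk exhausted: advance to the next one.
            stream_chunk *Next = Buf->First;
            Buf->Contents = Next->Contents;
            Buf->ContentsSize = Next->ContentsSize;
            Buf->First = Next->Next;
        }
        else
        {
            uint32_t Byte = *Buf->Contents++;
            --Buf->ContentsSize;
            Buf->BitBuf |= (Byte << Buf->BitCount);
            Buf->BitCount += 8;
        }
    }
}
```

With only the old condition, the loop silently stops at the end of the first chunk and the decoder reads stale bits, which matches the underflow symptom described below.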

Happy coding!
Mox

As per request of Miblo on stream, here is a little write-up about my debugging experience, so you can compare it to the stream.

Since this is not my own source code, I first wanted to get more familiar with it. I started out looking at the normal decoding, just stepping through with F10. I focused on whether the Filter Type byte of each line came out correctly and whether the LitLen and Distance values seemed reasonable. To check this I used the Memory window.

After getting comfortable and seeing reasonable values for the first few lines, I checked out PNGLengthExtra and PNGDistExtra; no joy there, just like on stream.

At this point I let it run to the assertion and inspected the Memory window. I saw huge runs of f3f3f3f3...
So I tried to find at what point that run started. Just scrolling through the Memory view and calculating it based on the row length, I found something like row 529. So I set a conditional breakpoint to stop just before that row.
Until then I had blackboxed the HuffmanDecode function, but now it became suspect, so I dove deeper into it. I saw that the buffer being used was getting close to its end, with only a few bytes left, and continued stepping to see what happened when it ran out. Breakpoints at the place that sets up the next IDAT block were never hit, so I had finally found the source of the bug.

Always funny that once you know the bug, you also see that you could have spotted it far sooner. If I had looked at the state of CompData at the time of the assertion, I could have seen the underflow there and then. But I guess that is always the case with hindsight ;)

In the end, I think stepping through the code and just looking at the memory was a good but slow way to get a feel for the code. It seemed that Casey was trying a looser approach to get a better feel for the problem before actually looking at more code. But not having written the code myself made it more difficult to pinpoint the exact area until I had familiarized myself with it.

Looking back at the debugging, a good learning point for me is that I might have gone too deep at first, and maybe should have taken a more global overview to find a good avenue of attack. But then again, analyzing new code does take time.

Hope somebody finds this alternative debugging experience interesting!

Happy coding!
Mox