vidc: Adjust core timeout based on input frame size

The video core is dropping certain IDR frames due to a timeout,
which results in video corruption. Computing the core timeout
from the input frame size and programming it for each input
frame fixes the issue.

Change-Id: I75d5039bc09f9be6a3028461ee4a2f13064bf53a
CRs-fixed: 370570
Signed-off-by: Rajeshwar Kurapaty <rkurapat@codeaurora.org>
5 files changed