Hello,
Probably this question belongs on the kernelnewbies
list, but I think I will get a more accurate answer here.
I am doing some optimization in kernel video driver
code to reduce the latency from the time a buffer
is submitted to the time it is displayed.
Userspace:
t1 = gettimeofday()
Kernelspace:
t2 = do_gettimeofday()
Using debugfs I copy t2 to userspace and
compute the difference:
latency = t2 - t1
Would this be accurate? I know there is a context
switch involved, among other things, but this is the
method I came up with. Is there a better way?
System: ARM (Android)
Kernel: 3.10
On Fri, Apr 4, 2014 at 3:02 PM, noman pouigt <[email protected]> wrote:
> Hello,
>
> Probably this question belongs on the kernelnewbies
> list, but I think I will get a more accurate answer here.
>
> I am doing some optimization in kernel video driver
> code to reduce the latency from the time a buffer
> is submitted to the time it is displayed.
>
> Userspace:
> t1 = gettimeofday()
>
> Kernelspace:
> t2 = do_gettimeofday()
>
> Using debugfs I copy t2 to userspace and
> compute the difference:
>
> latency = t2 - t1
>
> Would this be accurate? I know there is a context
> switch involved, among other things, but this is the
> method I came up with. Is there a better way?
>
I'm not sure whether it is accurate or not,
but from the code, the gettimeofday syscall defined in kernel/time.c
also uses the kernel's do_gettimeofday API and then copies the result
to userspace, so it should not be very different from your debugfs usage.
As for a better way, could the vDSO gettimeofday fulfill your need?
You may refer to: https://lwn.net/Articles/585203/
Thanks,
Lei