Hello,
I am trying to write a SQL query that will give me the difference between two timestamps. The data type of the column is TIMESTAMP.
The query is as below
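Something along these lines (the table and column names are just simplified placeholders for my actual ones; both columns are TIMESTAMPs, and in Oracle subtracting one TIMESTAMP from another gives an INTERVAL DAY TO SECOND):

    -- my_events, start_ts and end_ts are placeholder names
    SELECT t.end_ts - t.start_ts AS elapsed
      FROM my_events t
     WHERE t.event_id = 1;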
This gives me the result "0 0:0:0.92349000".
If we instead take just the seconds and fractional-seconds (FF) parts of the two timestamps and do a numeric calculation, as in the SQL below -
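Roughly like this (again with the same placeholder names; TO_CHAR with the 'SS.FF' format pulls out only the seconds and fractional-seconds portion of each timestamp):

    -- subtract only the SS.FF portions, converted to numbers
    SELECT TO_NUMBER(TO_CHAR(t.end_ts, 'SS.FF'))
         - TO_NUMBER(TO_CHAR(t.start_ts, 'SS.FF')) AS elapsed_seconds
      FROM my_events t
     WHERE t.event_id = 1;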
This returns "0.09", which appears to be correct.
I understand that the second SQL is a numeric subtraction whereas the first one is a timestamp subtraction, but the difference in milliseconds should be consistent between the two, shouldn't it?
Or is there something here that I am not understanding?
Thanks and Regards,
Amit