
Convert unix timestamps to GMT

 
Harathi Rao
Ranch Hand
Posts: 42
Hi All,

I have an issue with converting Unix timestamps into GMT format.
I have used this piece of code ...

I get the output as 1970-01-11 21:16:44.4,
whereas the actual output should be March 7th 1999, 11:00:00.

Where am I going wrong?
Kindly help me in resolving this problem.

Thanks in advance
Harathi Rao
 
Alvaro Arce
Greenhorn
Posts: 2
Which is normal:
the timestamp is the number of milliseconds since January 1st, 1970.
So if you divide your timestamp by 1000, then by 3600, then by 24, you'll get something like 10 days since January 1st, 1970, so the date you got is the correct one.
Maybe you were thinking in seconds and not in milliseconds.
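A minimal sketch of the seconds-vs-milliseconds difference. The value 920804400 is a hypothetical stand-in for an epoch-seconds timestamp (it corresponds to 1999-03-07 11:00:00 GMT), since the original code and value weren't posted:

```java
import java.util.Date;

public class EpochDemo {
    public static void main(String[] args) {
        // Hypothetical epoch-seconds value: 1999-03-07 11:00:00 GMT
        long epochSeconds = 920804400L;

        Date wrong = new Date(epochSeconds);         // interpreted as milliseconds: lands in January 1970
        Date right = new Date(epochSeconds * 1000L); // seconds converted to milliseconds

        System.out.println("As millis:  " + wrong);
        System.out.println("As seconds: " + right);
    }
}
```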
 
William Brogden
Author and all-around good cowpoke
Rancher
Posts: 13078
You need to specify that the constant is a long, otherwise Java thinks it is an int and truncates the value, producing the unexpected date.

Bill
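Bill's point can be illustrated with a short sketch: without the L suffix, the multiplication is done in int arithmetic and overflows before the result is widened to long (920804400 here is a hypothetical epoch-seconds value, not the poster's actual one):

```java
public class OverflowDemo {
    public static void main(String[] args) {
        int seconds = 920804400; // hypothetical epoch-seconds value

        long wrong = seconds * 1000;  // int * int overflows first, then the wrapped value is widened
        long right = seconds * 1000L; // 1000L forces the multiplication into long arithmetic

        System.out.println(wrong); // a wrapped value nowhere near the real millisecond count
        System.out.println(right); // 920804400000
    }
}
```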
 
Alvaro Arce
Greenhorn
Posts: 2
No,
the problem definitely comes from the fact that he was assuming the Timestamp was in seconds, when it is actually in milliseconds.
He only needs to multiply his timestamp by 1000.
 
Harathi Rao
Ranch Hand
Posts: 42
Thanks Alvaro Arce and Bill for your replies. I am now able to get the required results. All I did was multiply the timestamp by 1000, and I also took care to declare the value as a long.

Thanks once again
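Putting both fixes together, here is a self-contained sketch that formats an epoch-seconds value in GMT. The input 920804400 is a hypothetical value corresponding to March 7th 1999, 11:00:00 GMT:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class EpochToGmt {
    public static void main(String[] args) {
        long epochSeconds = 920804400L; // hypothetical: 1999-03-07 11:00:00 GMT

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("GMT")); // format in GMT, not the JVM's default zone

        // Multiply by 1000L so the arithmetic is done in long, then build the Date
        System.out.println(fmt.format(new Date(epochSeconds * 1000L)));
        // prints 1999-03-07 11:00:00
    }
}
```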
 
Peter Chase
Ranch Hand
Posts: 1970
Originally posted by William Brogden:
You need to specify that the constant is a long, otherwise Java thinks it is an int and truncates the value, producing the unexpected date.


Surely the Java compiler isn't dumb enough to silently truncate an oversized literal number ... is it?
 
William Brogden
Author and all-around good cowpoke
Rancher
Posts: 13078
A quick test shows that Java 1.5 will indeed reject an oversize int literal with a compile-time error. I can't recall if earlier versions did or not.
Bill
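For reference, a small sketch of the two cases under discussion: an oversize int literal is rejected at compile time, while overflow in int arithmetic compiles silently (the values are hypothetical stand-ins):

```java
public class LiteralDemo {
    public static void main(String[] args) {
        // long bad = 920804400000;   // does not compile: "integer number too large"
        long good = 920804400000L;    // the L suffix makes it a long literal

        // Overflow in int arithmetic, by contrast, compiles without complaint:
        long overflowed = 920804400 * 1000; // wraps around int range before the assignment
        System.out.println(good);
        System.out.println(overflowed);
    }
}
```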
 