UDP DatagramSocket.receive() glitches

 
Omer Gilad
Greenhorn
Posts: 3
Hello,
I have a thread that constantly polls a UDP connection and posts the received data to another thread. It uses a connected DatagramSocket, reads a packet (always the same size), gets the data, and posts it to a message queue (not the Android one; my own implementation).
The problem happens mainly over a 3G connection, not over Wi-Fi.

Code looks like:

protected void work() throws IOException {
    byte[] buffer = new byte[SOME_BUFFER_SIZE];

    // Read one UDP packet (blocks until a datagram arrives)
    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
    long startTime = System.currentTimeMillis();
    socket.receive(packet);
    long elapsed = System.currentTimeMillis() - startTime;
    Log.out.d("UDPReader: time for socket read: " + elapsed);

    // Post the data to the consumer thread
    messageQueue.post(packet.getData());
}
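
One aside about the snippet: packet.getData() returns the whole backing array, not just the received bytes. My packets are always the same size, so it's fine here, but if a datagram could ever be shorter than the buffer, the safer general form copies the received range (using java.util.Arrays):

byte[] payload = Arrays.copyOfRange(packet.getData(),
        packet.getOffset(), packet.getOffset() + packet.getLength());
// Copies exactly the received bytes; a datagram may be shorter
// than the backing buffer.
messageQueue.post(payload);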

My server side sends the UDP packets at a constant rate (about one every 80 ms), and I expect more or less that rate on the receiving side for streaming purposes.
The problem: in the beginning all is fine (the log shows a good interval, around 80 ms), then there is a huge gap (about half a second), then some packets arrive with no interval at all (as if they had been buffered somewhere), and then everything is normal for the rest of the operation.
So the effect for me is a glitch at the beginning of the stream, after which everything is back to normal indefinitely.

I verified that my reader thread isn't blocking anywhere else, and that the measured elapsed time really covers just the socket read itself, so the log prints are reliable (as seen in the code). I also used Wireshark on the server side to verify the constant packet rate.
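One caveat about the measurement itself: System.currentTimeMillis() follows the wall clock, which the device may adjust at any time, so for interval timing the monotonic System.nanoTime() is safer. The same timing done that way (a minimal sketch):

long start = System.nanoTime();
socket.receive(packet);
// nanoTime() is monotonic, so the difference is a reliable interval.
long elapsedMs = (System.nanoTime() - start) / 1000000L;
Log.out.d("UDPReader: time for socket read: " + elapsedMs);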
So the two remaining options are:
1. A network issue - but that would happen over the whole duration, not just at the beginning, right? It always glitches like mad at the start, then everything is normal.
2. A platform issue - is there anything about the DatagramSocket implementation on Android that might cause this? It looks as if there is some initial buffering capacity that gets increased after that first glitch, so it never happens again...

I also checked:
1. CPU usage during the glitch and after (using Traceview): nothing unusual.
2. The DatagramSocket receive buffer size: it seems to default to 100,000+ bytes, which is far more than my packet size (see the sketch below).
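
The receive buffer can also be requested explicitly before the first read, which would be one way to test the "initial capacity" theory. A minimal sketch (LOCAL_PORT is a placeholder for the port in use, and the OS may clamp the requested size):

DatagramSocket socket = new DatagramSocket(LOCAL_PORT);
// Request a larger kernel receive buffer before any packets arrive.
socket.setReceiveBufferSize(256 * 1024);
// The OS may grant less than requested, so read back the actual value.
Log.out.d("UDPReader: SO_RCVBUF = " + socket.getReceiveBufferSize());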

I would appreciate any help =)
 
William Brogden
Author and all-around good cowpoke
Posts: 13078
Might be a rush of object creation. Does your queue mechanism have everything created before the first packet?
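
For illustration, a pre-allocating shape for the reader (just a sketch, reusing the names from your snippet): the buffer and packet are created once, so the steady-state loop allocates nothing except the posted payload copy.

private final byte[] buffer = new byte[SOME_BUFFER_SIZE];
private final DatagramPacket packet = new DatagramPacket(buffer, buffer.length);

protected void work() throws IOException {
    // receive() shrinks the packet length to the last datagram's size,
    // so reset it before every read.
    packet.setLength(buffer.length);
    socket.receive(packet);
    // Copy the payload so the shared buffer can be reused immediately.
    messageQueue.post(Arrays.copyOf(packet.getData(), packet.getLength()));
}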

Bill
 
Omer Gilad
Greenhorn
Posts: 3
Hmm, yes - and even if not... my problem lies in the call to DatagramSocket.receive() itself. How could object creation possibly interfere with that?
 
William Brogden
Author and all-around good cowpoke
Posts: 13078
Is it just one call to receive() that takes this long to return, or are there several?

How many packets are actually lost?

Bill
 
Omer Gilad
Greenhorn
Posts: 3
1. It is one call. Several reads complete with good timing (~80 ms, matching the sending rate), then one call takes ~0.5 s, then several calls take 0 ms (which looks like buffering), and then everything is back to normal for the rest of the time. Note that the time is measured only around DatagramSocket.receive(), nothing else...
2. No packets seem to be lost; it's just the huge interval that causes a glitch. I have jitter protection inside my message queue (the one that receives those packets; roughly the shape sketched below), but a one-time 0.5 s gap is too much for normal network behavior.
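
For context, the jitter protection is roughly this shape (a simplified sketch with illustrative names, not my actual code): each packet becomes deliverable a fixed delay after arrival, which smooths out bursts and absorbs gaps up to that delay.

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

class JitterBuffer {
    // Delivery delay; arrival-time variance below this is absorbed.
    private static final long DELAY_MS = 200;

    private final DelayQueue<Entry> queue = new DelayQueue<Entry>();

    private static class Entry implements Delayed {
        final byte[] payload;
        final long deliverAtNanos;

        Entry(byte[] payload) {
            this.payload = payload;
            this.deliverAtNanos = System.nanoTime()
                    + TimeUnit.MILLISECONDS.toNanos(DELAY_MS);
        }

        public long getDelay(TimeUnit unit) {
            return unit.convert(deliverAtNanos - System.nanoTime(),
                    TimeUnit.NANOSECONDS);
        }

        public int compareTo(Delayed other) {
            long diff = getDelay(TimeUnit.NANOSECONDS)
                    - other.getDelay(TimeUnit.NANOSECONDS);
            return diff < 0 ? -1 : (diff > 0 ? 1 : 0);
        }
    }

    // Reader thread: called once per received packet.
    void post(byte[] payload) {
        queue.put(new Entry(payload));
    }

    // Consumer thread: blocks until the oldest packet is due.
    byte[] take() throws InterruptedException {
        return queue.take().payload;
    }
}

A one-time 0.5 s gap blows straight through a buffer sized for ~80 ms jitter, which is why it shows up as a glitch.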

Is there any possible way to debug this inside DatagramSocket?
 
"How many licks ..." - I think all of this dog's research starts with these words. Tasty tiny ad:
The Low Tech Laboratory Movie Kickstarter is LIVE NOW!
https://www.kickstarter.com/projects/paulwheaton/low-tech
reply
    Bookmark Topic Watch Topic
  • New Topic