
Omer Gilad

Greenhorn
since Dec 28, 2009

Recent posts by Omer Gilad

1. It's one call. Several reads complete with good timing (~80 ms, matching the send rate), then one call takes ~0.5 s, then several calls return in 0 ms (which looks like buffering), and then everything is back to normal for the rest of the session. Note that the time is measured around DatagramSocket.receive() only, nothing else...
2. No packets seem to be lost; it's just the huge interval that causes a glitch. I have jitter protection inside my message queue (which receives those packets), but a one-off 0.5 s gap is too much for normal network behavior.

Is there any possible way to debug this inside DatagramSocket?
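If it helps, here's a stripped-down sketch of how I'd isolate the receive timing outside the app (the port and the thresholds here are made up; my real code runs inside the reader thread):

import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class ReceiveTimer {
    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket(5000); // placeholder port
        byte[] buffer = new byte[1500];
        long last = System.nanoTime();
        while (true) {
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            socket.receive(packet); // blocks until a datagram arrives
            long now = System.nanoTime();
            long gapMs = (now - last) / 1000000L;
            last = now;
            // Anything far from the ~80 ms send rate is worth logging.
            if (gapMs > 200 || gapMs < 10) {
                System.out.println("unusual inter-arrival gap: " + gapMs + " ms");
            }
        }
    }
}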
14 years ago
Hmm, yes, and even if not... my problem lies in the call to DatagramSocket.receive(); how could this possibly interfere?
14 years ago
Hello,
I have a thread that constantly polls a UDP connection and posts the received data to another thread. It uses a connected DatagramSocket, reads a packet (always the same size), extracts the data, and posts it to a message queue (not the Android one; my own implementation).
The problem happens mainly over a 3G connection, not Wi-Fi.

Code looks like:

protected void work() {
    byte[] buffer = new byte[SOME_BUFFER_SIZE];

    // Read one UDP packet (always the same fixed size)
    DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
    long startTime = System.currentTimeMillis();
    socket.receive(packet); // only the blocking read is timed
    long elapsed = System.currentTimeMillis() - startTime;
    Log.out.d("UDPReader: time for socket read: " + elapsed); // my logging wrapper

    // Post the received data to my message queue
    messageQueue.post(packet.getData());
}
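A side note on the snippet: packet.getData() returns the whole backing array, not just the bytes that actually arrived, so this relies on the fixed packet size exactly filling SOME_BUFFER_SIZE. If the sizes could ever differ, I'd copy out only packet.getLength() bytes before posting:

messageQueue.post(java.util.Arrays.copyOf(packet.getData(), packet.getLength()));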

My server side sends the UDP packets at a constant rate (about one every 80 ms), and I expect roughly that rate on the receiving end for streaming purposes.
The problem: in the beginning all is fine (the log shows a good interval, around 80 ms), then there is a huge gap (about half a second), then some packets arrive with no interval at all (as if they had been buffered somewhere), and then everything stays normal for the rest of the run.
So the effect for me is a glitch at the beginning of the stream, after which everything is back to normal indefinitely.

I verified that my reader thread isn't blocking anywhere else and that the measured elapsed time really covers just the read itself, so the log prints are reliable (as seen in the code). I also used Wireshark on the server side to verify a constant packet rate.
So the two remaining options are:
1. A network issue - but that would happen for the whole duration, not just at the beginning, right? It always glitches like mad at the start, then everything is normal.
2. A platform issue - is there anything about the DatagramSocket implementation on Android that might cause this? It seems as if there's some initial buffering capacity that gets increased after that first glitch, so it doesn't happen again... (see the sketch below for how I'd tell buffering from loss)
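To tell the two apart, one thing I'm considering (assuming I add a 4-byte big-endian sequence number at the start of each packet on the server, which my real protocol doesn't have yet) is checking sequence continuity on arrival: if the post-glitch burst is contiguous, the packets were buffered rather than lost:

import java.nio.ByteBuffer;

public class SequenceChecker {
    private int expected = -1;

    // Call once per received packet. Assumes the server writes a
    // big-endian int sequence number into the first four bytes.
    public void check(byte[] data, long arrivalMillis) {
        int seq = ByteBuffer.wrap(data, 0, 4).getInt();
        if (expected != -1 && seq != expected) {
            System.out.println("sequence gap: expected " + expected
                    + ", got " + seq + " at t=" + arrivalMillis + " ms");
        }
        expected = seq + 1;
    }
}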

I also checked:
1. CPU usage during and after the glitch (using Traceview) - nothing unusual.
2. The DatagramSocket receive buffer size - it seems to default to 100,000+ bytes, which is far more than my packet size (checked as shown below).
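This is how I read the buffer size, and how I'd request a bigger one (the value passed to setReceiveBufferSize() is only a hint to the OS, and the 256 KB figure is just an example):

import java.net.DatagramSocket;

public class BufferCheck {
    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket();
        System.out.println("default SO_RCVBUF: " + socket.getReceiveBufferSize());
        // The requested size is a hint; the OS may clamp it.
        socket.setReceiveBufferSize(256 * 1024);
        System.out.println("after request: " + socket.getReceiveBufferSize());
        socket.close();
    }
}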

I would appreciate any help =)
14 years ago