Write a Java code segment (not a complete program) that opens a datagram socket to receive a datagram of up to 1000 bytes, with a 5-second timeout. If the timeout expires, the message "timed out" should be displayed on the screen.
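One possible sketch is shown below. The local port number (7777) is an arbitrary assumption, since the exercise does not specify one; any free port would do.

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.SocketTimeoutException;

    // Port 7777 is an arbitrary choice for illustration.
    try (DatagramSocket socket = new DatagramSocket(7777)) {
        socket.setSoTimeout(5000);            // 5-second timeout, given in milliseconds
        byte[] buffer = new byte[1000];       // room for a datagram of up to 1000 bytes
        DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
        socket.receive(packet);               // blocks until a datagram arrives or the timeout fires
        // ... process packet.getData() / packet.getLength() here ...
    } catch (SocketTimeoutException e) {
        System.out.println("timed out");      // the required message on timeout
    } catch (IOException e) {
        e.printStackTrace();                  // any other socket error
    }

Note that SocketTimeoutException must be caught before IOException, since it is a subclass of IOException; reversing the two catch clauses would not compile.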
What happens if the timeout is set to 0?
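For reference: the Java API documentation for DatagramSocket.setSoTimeout specifies that a timeout of zero is interpreted as an infinite timeout. In that case receive() blocks indefinitely until a datagram arrives, no SocketTimeoutException is ever thrown, and "timed out" is never printed.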