The application I am developing is a typical client-server app using
Java sockets over TCP/IP. I am using a BufferedInputStream and a byte array of size 1024. I will be transferring multiple files one after the other. Information about each file is sent as a header whose fields are separated by delimiters; from this header I extract the file name and the file size. The header length itself follows a convention of my own, which I prepend to the byte stream before sending the file.
When I run both ends on the same machine I get fairly consistent results for the file size, but running the client and server on different machines results in data corruption, with a significant increase in the number of bytes received. I cannot understand why this happens when the application is the same and is supposed to use the same TCP/IP stack whichever machine it runs on. Can somebody suggest a solution?