Hey all, I have written a class, designed specifically for Unix systems, which dumps a specified number of bytes from the /dev/random character device to a specified binary file. Unfortunately, I discovered that while Java will read and copy 128 bytes of data, after that the stream puts out zeros. I.e., if I request a 256-byte file, I get 128 bytes of random data followed by 128 bytes of zeros.
Further complicating the matter, the exact same code, pointed at the /dev/urandom device, performs flawlessly. I have checked the entropy pool, and there is more than enough entropy to fill the request. This behavior makes me think it is a limitation of the system rather than of Java, but I cannot find any reference to the problem online.
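For what it's worth, the symptom described (the first chunk random, the rest zero) is consistent with a single `InputStream.read(byte[])` call returning fewer bytes than requested and the return value being ignored, which leaves the tail of the buffer at its zero-initialized default; /dev/random in particular is prone to short reads. Since my actual class isn't shown here, below is only a sketch of a short-read-safe loop. The `readFully` helper name and the test stream are my own illustration (the stand-in stream caps each read at 128 bytes to mimic what I'm seeing), not my real code:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFully {

    // Keep calling read() until the buffer is full or the stream ends.
    // A single read() may legally return fewer bytes than requested.
    static byte[] readFully(InputStream in, int count) throws IOException {
        byte[] buf = new byte[count];
        int off = 0;
        while (off < count) {
            int n = in.read(buf, off, count - off);
            if (n < 0) {
                throw new IOException("stream ended after " + off + " bytes");
            }
            off += n;
        }
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for new FileInputStream("/dev/random"): a stream that
        // never returns more than 128 bytes per read() call.
        byte[] data = new byte[256];
        java.util.Arrays.fill(data, (byte) 7);
        InputStream choppy = new ByteArrayInputStream(data) {
            @Override
            public synchronized int read(byte[] b, int off, int len) {
                return super.read(b, off, Math.min(len, 128));
            }
        };
        byte[] out = readFully(choppy, 256);
        // With the loop, the second half of the buffer is no longer zeros.
        System.out.println(out[200]); // prints 7
    }
}
```

If the read loop isn't the issue, checking the return value of each `read()` call would at least confirm whether the device is delivering short reads.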
Any help would be greatly appreciated!