Stephan van Hulst wrote:Hi Jimmy.
You don't encrypt/decrypt strings. Encryption works with binary data. If you need to decrypt a hex string, you need to decode it first. After decoding, you decrypt the binary data. Finally, you encode the binary data to a string again. The same goes for the key: keys are binary data too, so you need to decode the key string first.
Furthermore, you need to know what padding scheme is used to create blocks and keys of the correct size for the algorithm. Just knowing the algorithm and the cipher mode isn't enough.
Roughly, this is what it should look like:
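(The code from the original post isn't preserved in this excerpt. Below is a minimal sketch of the decode/decrypt/encode pipeline it described, assuming DES in ECB mode with PKCS#5 padding and Java 17's java.util.HexFormat; the transformation string, class, and method names are illustrative, not the original code.)

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.util.HexFormat;

public class HexDecryptSketch {

    static String decrypt(String keyHex, String messageHex) throws Exception {
        // 1. Decode the hex strings into binary data.
        byte[] key = HexFormat.of().parseHex(keyHex);
        byte[] message = HexFormat.of().parseHex(messageHex);

        // 2. Decrypt the binary data (algorithm/mode/padding assumed here).
        Cipher cipher = Cipher.getInstance("DES/ECB/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "DES"));
        byte[] plain = cipher.doFinal(message);

        // 3. Encode the binary result back into a hex string.
        return HexFormat.of().formatHex(plain);
    }
}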
Stephan van Hulst wrote:Do you have an example of the key and encrypted message? Do you know what message was encrypted?
Stephan van Hulst wrote:PKCS#5 requires a multiple of 8 bytes, so the key isn't valid using that padding scheme. As a matter of fact, it can't be unambiguously decoded to a byte array at all, because it has an odd number of hex digits, and every byte takes exactly two digits. Is the final 'A' at the end a mistake?
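For illustration, a hex string can only decode to whole bytes when it has an even number of digits; Java 17's java.util.HexFormat (the class name HexDemo here is made up) rejects odd-length input outright:

import java.util.HexFormat;

public class HexDemo {
    public static void main(String[] args) {
        // "ABC" has three hex digits; the trailing digit can't form a whole byte.
        try {
            HexFormat.of().parseHex("ABC");
        } catch (IllegalArgumentException e) {
            System.out.println("Can't decode: " + e.getMessage());
        }
        // "ABCD" decodes cleanly into the two bytes 0xAB and 0xCD.
        System.out.println(HexFormat.of().parseHex("ABCD").length + " bytes");
    }
}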
Stephan van Hulst wrote:Because you're encoding the binary data to UTF-8, and the original message probably wasn't UTF-8.
Are you trying to decrypt some message given to you, or are you trying to reverse your own encryption process?
Stephan van Hulst wrote:I understand.
The reason you're getting garbage with UTF-8 is that I interpreted "FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF" as hexadecimal, not as UTF-8. Again, hexadecimal is an encoding, and it's different from UTF-8, ASCII, or Base64.
Do you know what an encoding is? Please take a look at this article: http://www.joelonsoftware.com/articles/Unicode.html
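To make the difference concrete, here is a small snippet (assuming Java 17's java.util.HexFormat; the class name is made up) showing that the same string yields entirely different bytes depending on which encoding you assume:

import java.nio.charset.StandardCharsets;
import java.util.HexFormat;

public class EncodingDemo {
    public static void main(String[] args) {
        String message = "FFFF";

        // Treated as hexadecimal, it decodes to two bytes, both 0xFF.
        byte[] asHex = HexFormat.of().parseHex(message);
        System.out.println(asHex.length + " bytes: " + HexFormat.of().formatHex(asHex));

        // Treated as UTF-8 text, it becomes four bytes: the character code of 'F' (0x46) four times.
        byte[] asUtf8 = message.getBytes(StandardCharsets.UTF_8);
        System.out.println(asUtf8.length + " bytes: " + HexFormat.of().formatHex(asUtf8));
    }
}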
First I decoded the hexadecimal to a binary message, encrypted it, and encoded it to hexadecimal again. In order to get the original message, you have to reverse this process by decoding the hexadecimal, decrypting the message, and then encoding it to hexadecimal again, not UTF-8.
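As a sketch of that round trip, assuming DES/ECB with PKCS#5 padding and a made-up key (not values from the actual exchange):

import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.util.HexFormat;

public class RoundTrip {
    public static void main(String[] args) throws Exception {
        byte[] key = HexFormat.of().parseHex("0123456789ABCDEF"); // illustrative 8-byte DES key
        String messageHex = "FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF";

        // Forward: decode hex, encrypt the binary data, encode the result to hex.
        Cipher cipher = Cipher.getInstance("DES/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "DES"));
        String encryptedHex = HexFormat.of().formatHex(
                cipher.doFinal(HexFormat.of().parseHex(messageHex)));

        // Reverse: decode hex, decrypt the binary data, encode to hex again.
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "DES"));
        String decryptedHex = HexFormat.of().formatHex(
                cipher.doFinal(HexFormat.of().parseHex(encryptedHex)));

        System.out.println(decryptedHex.equalsIgnoreCase(messageHex)); // prints "true"
    }
}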
Stephan van Hulst wrote:Good job. Can you show your full solution?