A fun thing I recently learned about Large Language Models (LLMs) is that they understand base64, a simple encoding of text. Here’s a demonstration: the base64 encoding of "What is 2 + 3?" is V2hhdC...
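For reference, the full encoding can be reproduced with Python's standard library:

```python
import base64

# Encode the prompt "What is 2 + 3?" as base64
encoded = base64.b64encode(b"What is 2 + 3?").decode("ascii")
print(encoded)  # V2hhdCBpcyAyICsgMz8=

# Decoding round-trips back to the original text
decoded = base64.b64decode(encoded).decode("ascii")
print(decoded)  # What is 2 + 3?
```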
Attached is a pretty cool article covering it. This is something I never would have thought of before.
That’s not the LLM that understands your encoded string; it’s simply a preprocessing filter that recognizes the signature of a base64-encoded string, decodes it, and passes it back to the LLM.
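A minimal sketch of the kind of filter this comment describes might look like the following. Note this is purely hypothetical: no major LLM provider documents such a preprocessing step, and the function name and heuristic here are illustrative assumptions.

```python
import base64
import re

# Hypothetical filter: detect a base64-looking prompt by its "signature"
# (base64 alphabet, optional padding, length a multiple of 4) and decode
# it before the text reaches the model.
B64_RE = re.compile(r"^[A-Za-z0-9+/]+={0,2}$")

def maybe_decode_base64(text: str) -> str:
    """Return the decoded text if it looks like base64, else return it unchanged."""
    if len(text) % 4 == 0 and B64_RE.match(text):
        try:
            return base64.b64decode(text, validate=True).decode("utf-8")
        except (ValueError, UnicodeDecodeError):
            pass  # not valid base64/UTF-8 after all; fall through
    return text

print(maybe_decode_base64("V2hhdCBpcyAyICsgMz8="))  # What is 2 + 3?
print(maybe_decode_base64("plain question"))        # plain question
```

One weakness of such a heuristic is that ordinary words like "test" also match the base64 alphabet, so a real filter would need more context than a regex.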
It’s the LLM left to its own devices.