Converts a single digit into an old-school seven-segment ASCII string.
See my digital clock example below.
```
 _       _   _       _   _   _   _   _ 
| |   |  _|  _| |_| |_  |_    | |_| |_|
|_|   | |_   _|   |  _| |_|   | |_|  _|
```
For example, a clock reading 14:23:57 renders as:

```
           _   _     _   _ 
  | |_| .  _|  _| . |_    |
  |   | . |_   _| .  _|   |
```
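The golfed 140-byte source isn't reproduced in this chunk, so here is a readable sketch of the same segment mapping. The names `GLYPHS` and `renderClock` are mine, not the gist's, and the layout is assumed from the charts above.

```javascript
// Per-character glyphs: three rows of segments per digit, plus a ':'
// column drawn as dots in the lower two rows.
// GLYPHS and renderClock are illustrative names, not the gist's API.
var GLYPHS = {
  '0': [' _ ', '| |', '|_|'],
  '1': ['   ', '  |', '  |'],
  '2': [' _ ', ' _|', '|_ '],
  '3': [' _ ', ' _|', ' _|'],
  '4': ['   ', '|_|', '  |'],
  '5': [' _ ', '|_ ', ' _|'],
  '6': [' _ ', '|_ ', '|_|'],
  '7': [' _ ', '  |', '  |'],
  '8': [' _ ', '|_|', '|_|'],
  '9': [' _ ', '|_|', ' _|'],
  ':': [' ', '.', '.']
};

// Render each of the three rows across all characters, then stack them.
function renderClock(time) {
  return [0, 1, 2].map(function (row) {
    return time.split('').map(function (ch) {
      return GLYPHS[ch][row];
    }).join(' ');
  }).join('\n');
}

console.log(renderClock('14:23:57'));
```

The actual 140-byte entry presumably packs this lookup table far more compactly; the sketch just shows the mapping it has to encode.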
See the 140byt.es site for a showcase of entries (built itself using 140-byte entries!), and follow @140bytes on Twitter.
To learn about byte-saving hacks for your own code, or to contribute what you've learned, head to the wiki.
140byt.es is brought to you by Jed Schmidt, with help from Alex Kloss. It was inspired by work from Thomas Fuchs and Dustin Diaz.
Hey @tsaniel,
I can't get this to work.
This is what my test output looks like:
So, everything is fine except the number 4.
... but as soon as I copy and paste the resulting characters, they somehow get converted:
This might be a Mac OS quirk or some other UTF-8 issue.
And what about the "compatible" version? Does the other one not work in all browsers?