Hacking with Swift -> All ASCII chars in UTF-8 string? 💩

Unicode is great! It’s so great that it solves all of our localized character issues. But what if you’re working with a low-level accessory that requires ASCII-only characters as valid input? Let’s say you were to throw in some 💩. How would that system deal with it? It may produce unexpected results and really mess up your day.

Fortunately, we high-aptitude programmers can write validation functions that filter out characters whose UTF-8 bytes don’t have a valid ASCII representation. The gist below offers a function where we turn each Character into its own String and then read that String as a sequence of UTF-8 bytes. If the number of bytes is greater than 1, or the value of that single byte is greater than 127, then the character is not valid ASCII.
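
Here’s a minimal sketch of that check (the function name `isValidASCII` is my own; the original gist may be structured differently): each Character is wrapped in a String, its UTF-8 bytes are collected, and any multi-byte sequence or byte above 127 fails validation.

```swift
import Foundation

/// Returns true when every character in the string has a single-byte
/// UTF-8 representation inside the ASCII range (0...127).
func isValidASCII(_ input: String) -> Bool {
    for character in input {
        // Turn the Character into its own String so we can read its UTF-8 bytes.
        let bytes = Array(String(character).utf8)

        // More than one byte, or a byte above 127, means the character
        // has no ASCII representation.
        if bytes.count > 1 || bytes[0] > 127 {
            return false
        }
    }
    return true
}

print(isValidASCII("Hello, accessory!"))  // true
print(isValidASCII("💩"))                 // false
```

Note that the byte value check only needs an upper bound, since a UTF-8 byte is an unsigned `UInt8` and can never be negative.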
