How do I measure string length vs character count with Unicode in Swift?

In Swift, the length of a string can be measured in two different ways: count on a String returns the number of Characters (extended grapheme clusters — what a user perceives as single characters), while the utf8 view's count returns the encoded size in bytes. The distinction matters with Unicode, because a single perceived character (an emoji, for instance) can be built from several Unicode scalars and occupy many bytes. Here's how to do it:

// Example in Swift: character count vs. byte length
let stringWithUnicode = "👨‍👩‍👧 Family" // A string with a family emoji (a ZWJ sequence of three scalars joined by two zero-width joiners)
let characterCount = stringWithUnicode.count // Number of Characters (grapheme clusters)
let byteLength = stringWithUnicode.utf8.count // Number of UTF-8 bytes
print("Character Count: \(characterCount)") // Outputs: Character Count: 8
print("Byte Length (UTF-8): \(byteLength)") // Outputs: Byte Length (UTF-8): 25
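Beyond utf8, Swift exposes two more views of the same string — unicodeScalars and utf16 — and each reports a different length for the same text. As a sketch (the string literal here is an assumed example, not from the original code):

```swift
// Comparing the lengths reported by Swift's four string views.
// The family emoji is one grapheme cluster, but five Unicode scalars
// (three person scalars joined by two zero-width joiners).
let family = "👨‍👩‍👧 Family"

print(family.count)                // Characters (grapheme clusters): 8
print(family.unicodeScalars.count) // Unicode scalar values: 12
print(family.utf16.count)          // UTF-16 code units: 15
print(family.utf8.count)           // UTF-8 bytes: 25
```

The utf16 count is also what Objective-C's NSString length returns, so it is the one to use when interoperating with Foundation APIs that take UTF-16 offsets.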
