The PROPER way to convert a String to a Custom Type in Swift

Some of the types built into Swift allow you to convert a String value into a specific type. Let's say we have a string that contains a number. We can convert that string to an Int or a Double:

let stringValue = "15"
let intValue = Int(stringValue) ?? 0
let doubleValue = Double(stringValue) ?? 0

Because a String can contain a value that is not a number, e.g. "Hey, can I get your number?", the conversion to a numeric type can fail and the initializer will return nil.
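
For example, trying to convert that sentence will simply yield nil:

let notANumber = "Hey, can I get your number?"
let failedConversion = Int(notANumber) // nil - the string does not describe an Int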

All of this is possible because Int and Double conform to the LosslessStringConvertible protocol.
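
For reference, the protocol itself is tiny; its declaration in the standard library looks roughly like this (the comment is paraphrased):

public protocol LosslessStringConvertible: CustomStringConvertible {
  // Creates an instance from its textual representation, if possible.
  init?(_ description: String)
}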

LosslessStringConvertible

The LosslessStringConvertible protocol has been designed to standardize conversion from a String to a custom type that conforms to it. Before we see it in action, let's define our custom type:

struct Vector2D {
  let x: Int
  let y: Int
  
  static let zero = Vector2D(x: 0, y: 0)
  
  init(x: Int, y: Int) {
    self.x = x
    self.y = y
  }
}

The structure above represents a simple 2D vector. We can create an instance by passing explicit values, or by using a static helper:

let origin = Vector2D(x: 0, y: 0)
let alsoOrigin = Vector2D.zero

Now, let's say we received the vector coordinates as a string - 0;0 - where the X and Y values are separated by a semicolon ;. At the moment, our Vector2D is not able to create an instance from such a string. Let's change that by conforming to the LosslessStringConvertible protocol:

extension Vector2D: LosslessStringConvertible {
    init?(_ description: String) {
        let coordinates = description.split(separator: ";")
        guard coordinates.count == 2, let x = Int(coordinates[0]), let y = Int(coordinates[1]) else {
            return nil
        }
    
        self.init(x: x, y: y)
    }
    
    var description: String {
        return "\(x);\(y)"
    }
}

First, we need to define the init?(_ description: String) initializer. The initialization logic is quite simple: we split the string on the semicolon ;, which should give us an array of the values between the semicolons. Then we verify that we have exactly two values and that both can be converted to an Int. If something is wrong with those values we return nil; otherwise we instantiate the Vector2D structure.
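
A quick sketch of how those failure paths behave (the input strings here are just made up for illustration):

let valid = Vector2D("3;4")          // Vector2D(x: 3, y: 4)
let tooManyParts = Vector2D("1;2;3") // nil - more than two components
let notNumbers = Vector2D("a;b")     // nil - the components are not Ints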

The LosslessStringConvertible protocol inherits from the CustomStringConvertible protocol and requires us to implement the description property. Conforming to the CustomStringConvertible protocol allows us to convert our custom type to a String, which gives us a two-way conversion - from String to Vector2D and from Vector2D to String:

let stringVector = "0;0"
let anotherOrigin = Vector2D(stringVector) ?? .zero
let alsoStringVector = anotherOrigin.description
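
Since Vector2D now also conforms to CustomStringConvertible, the same textual form shows up wherever Swift asks for a description, for example in string interpolation (the variable names below are just illustrative):

let vector = Vector2D(x: 1, y: 2)
let interpolated = "position: \(vector)"         // "position: 1;2"
let describedVector = String(describing: vector) // "1;2"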

Why use LosslessStringConvertible?

Some might ask why we should even bother to conform to the LosslessStringConvertible protocol. We can achieve the same result just by creating a convenience initializer:

extension Vector2D {
    init?(_ string: String) {
       // ...
    }
}

This is a completely fine solution, but LosslessStringConvertible gives us not only standardization across the codebase, but also a little bit of flexibility. Let's take a look at this extension:

extension Array where Element == String {
    func compactMap<T: LosslessStringConvertible>() -> [T] {
        return self.compactMap({ T($0) })
    }
}

The code above extends an Array - one that contains String elements - with a special compactMap function. This function converts each string value to a type that conforms to the LosslessStringConvertible protocol, so we can do something like this:

let vectorsAsString = [
    "0;0",
    "1;1",
    "2;2"
]

let vectors: [Vector2D] = vectorsAsString.compactMap()
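
Because the custom compactMap builds on the standard compactMap, any string that cannot be parsed is simply dropped from the result. A small sketch (the invalid entry is made up for illustration):

let mixedInput = ["0;0", "not a vector", "5;7"]
let parsedVectors: [Vector2D] = mixedInput.compactMap() // the invalid entry is skipped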

And what's really cool is that this extension method works with all the types that conform to LosslessStringConvertible, e.g.:

let stringValues = ["0", "1", "2"]
let intValues: [Int] = stringValues.compactMap()
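
The same goes for any other standard type that already conforms, Bool for instance (assuming the same Array extension from above):

let boolStrings = ["true", "false", "maybe"]
let boolValues: [Bool] = boolStrings.compactMap() // [true, false]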


