There are multiple notations for defining colors. The hex notation is probably my favorite because it is easy to store as a string or pass it around as a numerical value. In Mastering Core Data With Swift, I use it to store a color in a SQLite database.
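For example, the same color can be expressed both ways. The values below are purely illustrative:

let hexString = "#FF6347"        // a color stored as a string
let hexValue: UInt32 = 0xFF6347  // the same color as an unsigned integer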
Unfortunately, Apple's frameworks don't work well with hex color values. How do you convert a hex color value to a UIColor instance? And how do you extract a hex color value from a UIColor instance?
Fire up Xcode and create a playground if you want to follow along. At the top of the playground, add an import statement for the UIKit framework.
import UIKit
From Hex to UIColor in Swift
I would like to start by converting a hex value to a UIColor instance. Create an extension for the UIColor class and define a convenience initializer with a parameter of type String. Notice that the initializer is failable.
extension UIColor {

    // MARK: - Initialization

    convenience init?(hex: String) {}

}
We first need to sanitize the string that is passed to the initializer. We trim whitespace and newlines, and we also remove the # symbol if present. It isn't going to help us.
var hexSanitized = hex.trimmingCharacters(in: .whitespacesAndNewlines)
hexSanitized = hexSanitized.replacingOccurrences(of: "#", with: "")
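To see the sanitizing in action, feed it a messy string. The input below is purely illustrative:

let raw = " #12FF10 "
var sanitized = raw.trimmingCharacters(in: .whitespacesAndNewlines)
sanitized = sanitized.replacingOccurrences(of: "#", with: "")
// sanitized is now "12FF10"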
We then declare a slew of helper variables. The first helper variable we define will store the hexadecimal value as an unsigned integer.
var rgb: UInt32 = 0
Next, we declare a variable for the red, green, and blue color components of the color as well as one for the alpha value. Notice that we set the a variable to 1.0, not 0.0.
var r: CGFloat = 0.0
var g: CGFloat = 0.0
var b: CGFloat = 0.0
var a: CGFloat = 1.0
The last helper is a constant that stores the length of the sanitized string for convenience.
let length = hexSanitized.count
The next line may seem a bit foreign. We use an instance of the Scanner class, formerly NSScanner, to scan the string for an unsigned value. It is important that you invoke scanHexInt32(_:), not scanInt32(_:). The scanHexInt32(_:) method scans the string for a hexadecimal representation, which is what we need.
guard Scanner(string: hexSanitized).scanHexInt32(&rgb) else { return nil }
Notice that we pass the rgb helper variable by reference, hence the & symbol. The scanHexInt32(_:) method returns true if the Scanner instance was able to find a match. If scanHexInt32(_:) returns false, the initialization fails and we return nil.
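A quick sanity check in a playground illustrates what the Scanner instance does. The input is illustrative:

var value: UInt32 = 0
let success = Scanner(string: "12FF10").scanHexInt32(&value) // true
// value is now 0x12FF10, or 1244944 in decimal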
Based on the length of the string, we extract the red, green, and blue color components of the value stored in rgb. If the length of the string is equal to 8, it also contains the alpha value.
if length == 6 {
    r = CGFloat((rgb & 0xFF0000) >> 16) / 255.0
    g = CGFloat((rgb & 0x00FF00) >> 8) / 255.0
    b = CGFloat(rgb & 0x0000FF) / 255.0
} else if length == 8 {
    r = CGFloat((rgb & 0xFF000000) >> 24) / 255.0
    g = CGFloat((rgb & 0x00FF0000) >> 16) / 255.0
    b = CGFloat((rgb & 0x0000FF00) >> 8) / 255.0
    a = CGFloat(rgb & 0x000000FF) / 255.0
} else {
    return nil
}
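To make the bit masking concrete, this is what happens for the illustrative hex string "12FF10", that is, rgb equal to 0x12FF10 (values rounded):

// (rgb & 0xFF0000) >> 16 = 0x12 = 18   → r = 18 / 255 ≈ 0.071
// (rgb & 0x00FF00) >> 8  = 0xFF = 255  → g = 255 / 255 = 1.0
// (rgb & 0x0000FF)       = 0x10 = 16   → b = 16 / 255 ≈ 0.063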
That was the hard part. Creating the UIColor instance with the values stored in the helper variables is easy and should look familiar.
self.init(red: r, green: g, blue: b, alpha: a)
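With the initializer in place, a quick test in a playground should behave like this (the hex strings are illustrative):

UIColor(hex: "#FF0000")  // red
UIColor(hex: "FF0000AC") // red with alpha 0xAC / 255 ≈ 0.675
UIColor(hex: "FF00")     // nil, the string is neither six nor eight characters long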
From UIColor to Hex in Swift
To convert a UIColor instance to a hex value, we define a convenience method, toHex(alpha:). The method accepts one parameter of type Bool, which indicates whether the alpha value should be included in the string that is returned from the method.
func toHex(alpha: Bool = false) -> String? {}
We first need to extract the color components. The CGColor equivalent of the UIColor instance gives us easy access to the color components.
guard let components = cgColor.components, components.count >= 3 else {
    return nil
}
Because the components property is of type [CGFloat]?, we need to safely unwrap its value. We also make sure it contains a minimum of three components.
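Keep in mind that colors created in a grayscale color space have only two components, white and alpha, which means toHex(alpha:) returns nil for them. A quick illustration, assuming UIKit's standard color definitions:

UIColor.orange.cgColor.components // [1.0, 0.5, 0.0, 1.0]
UIColor.white.cgColor.components  // [1.0, 1.0]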
Extracting the individual color components is trivial, as you can see below.
let r = Float(components[0])
let g = Float(components[1])
let b = Float(components[2])
var a = Float(1.0)
If the components constant also contains the alpha value, we store it in the variable a.
if components.count >= 4 {
    a = Float(components[3])
}
With the components and alpha value extracted from the UIColor instance, we create the resulting string. Notice that we multiply the values of the color components and the alpha value by 255 and convert them to an integer using the lroundf(_:) function.
if alpha {
    return String(format: "%02lX%02lX%02lX%02lX", lroundf(r * 255), lroundf(g * 255), lroundf(b * 255), lroundf(a * 255))
} else {
    return String(format: "%02lX%02lX%02lX", lroundf(r * 255), lroundf(g * 255), lroundf(b * 255))
}
The string format specifier may need an explanation. You can break it down into several components.

- % defines the format specifier
- 02 pads the value with zeros to a minimum width of two
- l casts the value to an unsigned long
- X prints the value in hexadecimal notation (0-9 and A-F)
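Plugging in the component values from the earlier bit masking example makes the format concrete. The numbers are illustrative:

String(format: "%02lX%02lX%02lX", lroundf(0.071 * 255), lroundf(1.0 * 255), lroundf(0.063 * 255))
// "12FF10"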
Because the notation with the alpha value excluded is most common, we also implement a computed property that invokes the toHex() method we implemented earlier.
var toHex: String? {
    return toHex()
}
That's it. It isn't rocket science, but it isn't trivial either. This is what the UIColor extension looks like in the wild.
let green = UIColor(hex: "12FF10")
let greenWithAlpha = UIColor(hex: "12FF10AC")

UIColor.blue.toHex
UIColor.orange.toHex()
UIColor.brown.toHex(alpha: true)
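Assuming UIKit's standard color definitions, the last three lines evaluate to "0000FF", "FF8000", and "996633FF" respectively.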
But it looks much better in a playground.
import UIKit

extension UIColor {

    // MARK: - Initialization

    convenience init?(hex: String) {
        var hexSanitized = hex.trimmingCharacters(in: .whitespacesAndNewlines)
        hexSanitized = hexSanitized.replacingOccurrences(of: "#", with: "")

        var rgb: UInt32 = 0

        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 1.0

        let length = hexSanitized.count

        guard Scanner(string: hexSanitized).scanHexInt32(&rgb) else { return nil }

        if length == 6 {
            r = CGFloat((rgb & 0xFF0000) >> 16) / 255.0
            g = CGFloat((rgb & 0x00FF00) >> 8) / 255.0
            b = CGFloat(rgb & 0x0000FF) / 255.0
        } else if length == 8 {
            r = CGFloat((rgb & 0xFF000000) >> 24) / 255.0
            g = CGFloat((rgb & 0x00FF0000) >> 16) / 255.0
            b = CGFloat((rgb & 0x0000FF00) >> 8) / 255.0
            a = CGFloat(rgb & 0x000000FF) / 255.0
        } else {
            return nil
        }

        self.init(red: r, green: g, blue: b, alpha: a)
    }

    // MARK: - Computed Properties

    var toHex: String? {
        return toHex()
    }

    // MARK: - From UIColor to String

    func toHex(alpha: Bool = false) -> String? {
        guard let components = cgColor.components, components.count >= 3 else {
            return nil
        }

        let r = Float(components[0])
        let g = Float(components[1])
        let b = Float(components[2])
        var a = Float(1.0)

        if components.count >= 4 {
            a = Float(components[3])
        }

        if alpha {
            return String(format: "%02lX%02lX%02lX%02lX", lroundf(r * 255), lroundf(g * 255), lroundf(b * 255), lroundf(a * 255))
        } else {
            return String(format: "%02lX%02lX%02lX", lroundf(r * 255), lroundf(g * 255), lroundf(b * 255))
        }
    }
}
I'd like to thank @devarty for making a great suggestion to improve the name of the toHex(alpha:) method.