The UserDefaults class makes it trivial to store small pieces of data and persist them across application sessions. The problem is that only strings, numbers, Date objects, and Data objects can be stored in the user's defaults database. How do you store an enum in the user's defaults database in Swift? I answer that question in this post.
Storing the Raw Value
It isn't possible to store an enum in the user's defaults database. We need to convert the value to a type that is supported by the defaults system. The easiest solution is to ask the enum for its raw value and store that value in the user's defaults database. Let me show you how that works.
Let's assume we are building a weather application and we give the user the option to toggle between degrees Fahrenheit and degrees Celsius. We start by defining an enum named TemperatureNotation. The raw value type of the enum is Int. The TemperatureNotation enum defines two cases, fahrenheit and celsius.
import Foundation

enum TemperatureNotation: Int {
    case fahrenheit
    case celsius
}
If we try to store a TemperatureNotation object in the user's defaults database, a runtime exception is thrown. This isn't surprising because the defaults system only supports strings, numbers, Date objects, and Data objects. This is true for iOS, tvOS, macOS, iPadOS, and watchOS.
The solution is surprisingly simple. We ask the enum for its raw value and store that raw value in the user's defaults database. This is what that looks like.
import Foundation

enum TemperatureNotation: Int {
    case fahrenheit
    case celsius
}

// Write/Set Raw Value in User Defaults
UserDefaults.standard.set(TemperatureNotation.fahrenheit.rawValue, forKey: "temperatureNotation")
Retrieving the Enum From User Defaults
To read or get the enum from the defaults system, we first need to ask the user's defaults database for the raw value. This code snippet should look familiar.
import Foundation

enum TemperatureNotation: Int {
    case fahrenheit
    case celsius
}

// Write/Set Raw Value in User Defaults
UserDefaults.standard.set(TemperatureNotation.fahrenheit.rawValue, forKey: "temperatureNotation")

// Read/Get Raw Value from User Defaults
let rawValue = UserDefaults.standard.integer(forKey: "temperatureNotation")
We use the raw value to create a TemperatureNotation object.
import Foundation

enum TemperatureNotation: Int {
    case fahrenheit
    case celsius
}

// Write/Set Raw Value in User Defaults
UserDefaults.standard.set(TemperatureNotation.fahrenheit.rawValue, forKey: "temperatureNotation")

// Read/Get Raw Value from User Defaults
let rawValue = UserDefaults.standard.integer(forKey: "temperatureNotation")

// Create Temperature Notation
let temperatureNotation = TemperatureNotation(rawValue: rawValue)
There is one problem, though. The init?(rawValue:) initializer is failable. It only returns a valid TemperatureNotation object if the raw value is equal to 0 or 1. We use the nil-coalescing operator to default to fahrenheit should the initialization fail.
// Create Temperature Notation
let temperatureNotation = TemperatureNotation(rawValue: rawValue) ?? .fahrenheit
You can also use this approach if the raw value is of another type as long as the type is supported by the defaults system.
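For example, the same pattern works with a String raw value. The snippet below is a sketch using a hypothetical Theme enum of my own invention; note that string(forKey:) returns an optional, so we also fall back to a default raw value when the key has never been set.

```swift
import Foundation

// Hypothetical enum with a String raw value
enum Theme: String {
    case light
    case dark
}

// Write/Set Raw Value in User Defaults
UserDefaults.standard.set(Theme.dark.rawValue, forKey: "theme")

// Read/Get Raw Value from User Defaults
// string(forKey:) returns nil if no value is stored for the key
let rawValue = UserDefaults.standard.string(forKey: "theme") ?? Theme.light.rawValue

// Create Theme, defaulting to light should the initialization fail
let theme = Theme(rawValue: rawValue) ?? .light
```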
Using an Extension for User Defaults
Even though the above code snippets are simple and easy to understand, they become tedious if you need to access the value in several places in your project. That is why I often use an extension for the UserDefaults class. I explain this technique in another post.
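As a rough sketch of what such an extension could look like (the property name temperatureNotation is my own choice, not from that post), a computed property can hide the raw value conversion from the call site:

```swift
import Foundation

enum TemperatureNotation: Int {
    case fahrenheit
    case celsius
}

extension UserDefaults {
    // Computed property that wraps the raw value conversion
    var temperatureNotation: TemperatureNotation {
        get {
            // integer(forKey:) returns 0 if no value is stored,
            // which maps to the fahrenheit case
            let rawValue = integer(forKey: "temperatureNotation")
            return TemperatureNotation(rawValue: rawValue) ?? .fahrenheit
        }
        set {
            set(newValue.rawValue, forKey: "temperatureNotation")
        }
    }
}

// Write/Set Temperature Notation
UserDefaults.standard.temperatureNotation = .celsius

// Read/Get Temperature Notation
let temperatureNotation = UserDefaults.standard.temperatureNotation
```

The call site no longer needs to know the key, the raw value type, or the default value.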