Gamya


Decimals in Swift

🌸 Swift Decimal Numbers✨

When you work with decimal numbers like 3.14, 0.5, or 9.99, Swift uses floating-point numbers, usually of type Double.

Decimals are used for things like ratings, percentages, and measurements.


🔹 Creating Decimal Numbers

If a number literal contains a decimal point (.), Swift infers it as a Double:

let animeRating = 4.5
let bloomGrowth = 0.75

Swift automatically chooses Double for decimal values.
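
If you'd rather be explicit instead of relying on inference, you can write the type annotation yourself. Here is a minimal sketch (the constant names are just illustrative); Float is Swift's smaller, lower-precision floating-point type:

let preciseRating: Double = 4.5   // explicitly a Double
let compactRating: Float = 4.5    // Float keeps fewer digits of precision than Double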


🔹 Decimal Precision Gotcha

Floating-point numbers are not always perfectly accurate:

let value = 0.1 + 0.2
print(value)

This prints:

0.30000000000000004

This happens because floating-point values are stored as binary approximations, and 0.1 and 0.2 have no exact binary representation.
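
If that tiny error matters, for example when comparing values or handling money, two common workarounds are comparing with a small tolerance or using Foundation's Decimal type with its string initializer. A minimal sketch (names and tolerance chosen just for illustration):

import Foundation

let value = 0.1 + 0.2

// Compare with a small tolerance instead of ==
print(abs(value - 0.3) < 0.000001)   // true

// Decimal stores base-10 digits, so this sum is exact
let a = Decimal(string: "0.1")!
let b = Decimal(string: "0.2")!
print(a + b)                         // 0.3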


🔹 Int vs Double (Type Safety)

Swift does not allow you to mix Int and Double in the same expression:

let episodes = 12
let duration = 2.0

// ❌ Error
// let total = episodes + duration

You must convert explicitly:

let total1 = Double(episodes) + duration   // 14.0
let total2 = episodes + Int(duration)      // 14

This is called type safety.
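
One more thing to watch with these conversions: going from Double to Int simply drops the decimal part, it does not round. A quick sketch (the value is just for illustration):

let episodeLength = 23.9

print(Int(episodeLength))             // 23  (truncated, not rounded)
print(Int(episodeLength.rounded()))   // 24  (round first, then convert)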


🔹 How Swift Chooses the Type

Swift decides the type based on the value:

let d1 = 3.14   // Double
let d2 = 3.0    // Double
let i1 = 3      // Int

Once set, a variable cannot change its type.
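
If you're ever unsure what Swift inferred, type(of:) will tell you. A tiny sketch reusing the constants above:

print(type(of: d1))   // Double
print(type(of: d2))   // Double
print(type(of: i1))   // Int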


🔹 Decimal Math

Decimals support the same basic arithmetic operators as integers (with one exception, shown after the example below):

var flowerRating = 4.0

flowerRating *= 2   // 8.0
flowerRating -= 1   // 7.0
flowerRating /= 2   // 3.5

print(flowerRating) // prints 3.5
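The one exception: the remainder operator % is not defined for Double. If you need a remainder from a decimal value, use truncatingRemainder(dividingBy:) instead. A small sketch (the value is just for illustration):

let watchTime = 7.5

// watchTime % 2.0 would not compile for a Double
print(watchTime.truncatingRemainder(dividingBy: 2.0))   // 1.5
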

🌟 Wrap Up

Decimals in Swift use Double by default.

They’re powerful and fast, but not always perfectly precise—which is why Swift keeps Int and Double strictly separate 🌸✨
