How can I deal with operations between Int and CGFloat?
In Swift you cannot multiply two numbers of different types (Int, Double, CGFloat, etc.) directly. The width of a CGRect is a CGFloat while an array's count is an Int, so one of the operands has to be converted before the multiplication. Here's a working example:
let myArray: [Int] = [1, 2, 3]
let rect = CGRect(x: 0, y: 0, width: 100, height: 100)
let total = rect.size.width * CGFloat(myArray.count)
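Note that since Swift 5.5 (SE-0307) the compiler also converts between CGFloat and Double implicitly, so a Double-based version of the same calculation compiles there as well. A minimal sketch, assuming a Swift 5.5+ toolchain (the variable names are just for illustration):
import CoreGraphics

let numbers = [1, 2, 3]
let frame = CGRect(x: 0, y: 0, width: 100, height: 100)

// Swift 5.5+ only: CGFloat and Double interoperate implicitly,
// so a CGFloat width can be multiplied by a Double without an explicit cast.
let totalAsDouble: Double = frame.size.width * Double(numbers.count)
print(totalAsDouble) // 300.0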
Swift does not allow operations between two numbers of different types. Therefore, before multiplying your array.count (an Int) by your width (a CGFloat), you'll have to cast it to CGFloat.
Fortunately, Swift provides a simple CGFloat initializer, init(_:), that creates a new CGFloat from an Int. This initializer has the following declaration:
init<Source>(_ value: Source) where Source : BinaryInteger
Creates a new value, rounded to the closest possible representation.
The Swift 5 Playground sample code below shows how to perform your calculation by using CGFloat's initializer:
import UIKit
import CoreGraphics
// Set array and view
let array = Array(1...3)
let rect = CGRect(x: 0, y: 0, width: 100, height: 100)
let view = UIView(frame: rect)
// Perform operation
let width = view.bounds.width
let total = width * CGFloat(array.count)
print(total) // prints: 300.0
First, let's create some demo data:
import UIKit

let array: NSArray = ["a", "b", "c"] //you could use a Swift array, too
let view = UIView() //just some view
Now, everything else works almost the same way as in Obj-C; just note that Obj-C's CGRectGetWidth() is the width property in modern Swift:
let width: CGFloat = view.bounds.width
or simply
let width = view.bounds.width //type of the variable is inferred
and total:
let total = width * CGFloat(array.count)
Note that we have to add a CGFloat cast for array.count. Obj-C would implicitly cast NSUInteger to CGFloat, but Swift has no implicit casts, so we have to add an explicit one.
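To make the difference explicit, here is a minimal sketch of the failing and the working line side by side (the variable names are illustrative, and the exact compiler error text varies between Swift versions):
import UIKit

let demoArray: NSArray = ["a", "b", "c"]
let demoView = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
let demoWidth = demoView.bounds.width                 // CGFloat

// let broken = demoWidth * demoArray.count           // error: '*' cannot be applied to CGFloat and Int
let demoTotal = demoWidth * CGFloat(demoArray.count)  // explicit cast makes both operands CGFloat
print(demoTotal)                                      // 300.0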