I've got two Int values (they have to be Ints), and I want the result of an equation with them to round to the nearest whole number:
var Example = Int()
var secondExample = Int()
Example = (secondExample / 7000)
This equation makes the variable Example always round down to the lower value. Say, for example, that the numbers are the following:
var Example = Int()
var secondExample : Int = 20000
Example = (20000 / 7000)
20000 / 7000 equals 2.857…, but the variable Example displays 2.
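To see the behavior concretely: Swift's integer division always truncates toward zero, so the fractional part of the true quotient is simply dropped.

```swift
// Swift integer division truncates toward zero: the fractional
// part of the true quotient (about 2.857 here) is discarded.
let quotient = 20000 / 7000
print(quotient) // prints "2"
```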
How can I make Example round to the closest whole number without changing it to a Double?
Best Answer
For nonnegative integers, the following function gives the desired result in pure integer arithmetic:
Examples: 20000 / 7000 rounds to 3, and 10000 / 7000 rounds to 1.

The idea is that n/d rounded to the nearest integer is (n + d/2)/d rounded toward zero, and rounding toward zero is exactly what integer division does.
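The answer's code was not preserved in this copy; a minimal sketch of such a function, assuming nonnegative operands (the name divideAndRound is illustrative), could look like this:

```swift
// Round num/den to the nearest integer using only integer
// arithmetic: adding den/2 to the numerator before the truncating
// division turns round-toward-zero into round-to-nearest.
// Sketch for nonnegative operands only.
func divideAndRound(_ num: UInt, _ den: UInt) -> UInt {
    return (num + den / 2) / den
}

divideAndRound(20000, 7000) // 3 (true quotient ≈ 2.857)
divideAndRound(10000, 7000) // 1 (true quotient ≈ 1.429)
```

Note that `num + den / 2` can itself overflow for numerators near `UInt.max`, which is why an overflow-safe variant is worth having for the general case.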
And here is a possible implementation for arbitrarily signed integers which also does not overflow:
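One way such a signed, overflow-safe version could be sketched (the exact shape is an assumption; the key move is comparing quotient-and-remainder magnitudes instead of forming num + den/2):

```swift
// Round num/den to the nearest integer (halves away from zero),
// valid for the full signed range.
func divideAndRound(_ num: Int, _ den: Int) -> Int {
    let q = num / den // truncated quotient
    let r = num % den // remainder, carries the sign of num
    // Round away from zero when 2*|r| >= |den|. The comparison is
    // rewritten as |r| >= |den| - |r| and done on UInt magnitudes,
    // so it cannot overflow (even for Int.min).
    if r.magnitude >= den.magnitude - r.magnitude {
        return (num < 0) == (den < 0) ? q + 1 : q - 1
    }
    return q
}
```

Because `r.magnitude` is strictly less than `den.magnitude`, the subtraction never wraps, and no intermediate value exceeds the operands' own magnitudes.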
(Based on @user3441734's updated solution, so we have a reference cycle between our answers now :)
There is also an ldiv function, which computes both the quotient and the remainder of a division in one call, so the last function could also be implemented in terms of it. (I did not test which version is faster.)
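A sketch of that variant, assuming the same rounding rule as above (ldiv comes from the C standard library, exposed via Darwin on Apple platforms and Glibc on Linux):

```swift
#if canImport(Darwin)
import Darwin
#else
import Glibc
#endif

// Same rounding as before, but the quotient and remainder come
// from a single ldiv call instead of separate / and % operations.
func divideAndRound(_ num: Int, _ den: Int) -> Int {
    let d = ldiv(num, den) // d.quot = quotient, d.rem = remainder
    if d.rem.magnitude >= den.magnitude - d.rem.magnitude {
        return (num < 0) == (den < 0) ? d.quot + 1 : d.quot - 1
    }
    return d.quot
}
```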