Convert rune to int?
Why don't you just do string(rune) and parse the result? Something like this:
s:="12345678910"
var factor,sum int
for i,x:=range s{
if i%2==0{
factor=1
}else{
factor=3
}
xstr:=string(x) //x is rune converted to string
xint,_:=strconv.Atoi(xstr)
sum+=xint*factor
}
fmt.Println(sum)
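Note that this discards the error returned by strconv.Atoi. If the string is not guaranteed to contain only digits, it may be worth checking it. A minimal sketch, using made-up input "12a45" to show the non-digit case:

package main

import (
    "fmt"
    "strconv"
)

func main() {
    s := "12a45" // hypothetical input with a non-digit in the middle
    sum := 0
    for _, x := range s {
        n, err := strconv.Atoi(string(x))
        if err != nil {
            // x is not a decimal digit; skip it here (or report the error).
            continue
        }
        sum += n
    }
    fmt.Println(sum) // 12 = 1+2+4+5
}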
The problem is simpler than it looks. You convert a rune value to an int value with int(r). But your code implies you want the integer value of the ASCII (or UTF-8) representation of the digit, which you can trivially get with r - '0' as a rune, or int(r - '0') as an int. Be aware that out-of-range runes (anything outside '0' through '9') will corrupt that logic; a defensive check is sketched after the example below. For example, sum += (int(c) - '0') * factor:
package main

import (
    "fmt"
    "strconv"
    "unicode/utf8"
)

func main() {
    s := "9780486653556"
    var factor, sum1, sum2 int
    for i, c := range s[:12] {
        if i%2 == 0 {
            factor = 1
        } else {
            factor = 3
        }
        // sum1: encode the rune back to a string, then parse it with strconv.Atoi.
        buf := make([]byte, 1) // one byte is enough for an ASCII digit
        _ = utf8.EncodeRune(buf, c)
        value, _ := strconv.Atoi(string(buf))
        sum1 += value * factor
        // sum2: subtract '0' directly from the rune's code point.
        sum2 += (int(c) - '0') * factor
    }
    fmt.Println(sum1, sum2)
}
Output:
124 124
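If the input might contain something other than ASCII digits, a range check avoids the corruption mentioned above. A minimal sketch, assuming you want to stop on bad input; the digitValue helper and the corrupted sample string are just illustrative, not part of the answers above:

package main

import "fmt"

// digitValue returns the numeric value of an ASCII digit rune,
// or an error if the rune is outside '0'..'9'.
func digitValue(r rune) (int, error) {
    if r < '0' || r > '9' {
        return 0, fmt.Errorf("not a decimal digit: %q", r)
    }
    return int(r - '0'), nil
}

func main() {
    var sum int
    for i, c := range "97804x665355" { // deliberately corrupted input
        factor := 1
        if i%2 != 0 {
            factor = 3
        }
        v, err := digitValue(c)
        if err != nil {
            fmt.Println("bad input:", err)
            return
        }
        sum += v * factor
    }
    fmt.Println(sum)
}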