Reproducing Rump’s example


#1

Hi everyone.

I'm new to Julia and am trying to reproduce a tricky floating-point arithmetic case study with the BigFloat type: "Rump's example", described in

https://tel.archives-ouvertes.fr/tel-00657843v2/document

page 27. The output of the C+MPFR code provided in the document and the output of my Julia code are completely different:

Precision |               C + MPFR | Julia + BigFloat
      23 |  1.1726038455963134    |  1.284406e+09
      24 | -6.3382530011411471e29 | -6.338253e+29
      53 | -1.1805916207174114e21 | -1.1805916207161269e+21
      54 |  1.1726039400531785    |  1.28440611600000000e+09
      55 |  1.1726039400531786    |  1.28440611600000000e+09
      56 |  1.1726039400531786    |  1.28440611600000000e+09
      57 |  1.1726039400531786    |  1.284406116000000000e+09
      58 |  3.6893488147419103e19 |  3.689348814870350925e+19
      59 | -1.8446744073709552e19 | -1.84467440724251455e+19
      60 | -1.8446744073709552e19 | -1.8446744072425145504e+19
      62 |  1.1726039400531786    | -2.305843007929287836e+18
      63 |  1.1529215046068469e18 |  1.152921505891253092e+18
      65 |  1.1726039400531786    |  1.28440611600000000000e+09
      70 |  1.1726039400531786    |  1.2844061160000000000000e+09
      80 |  1.1726039400531786    |  1.2844061160000000000000000e+09
      85 |  2.7487790694517260e11 |  2.76162313060000000000000000e+11
      90 |  1.7179869185172603e10 |  9.8743407080000000000000000000e+09
     100 |  1.1726039400531786    |  1.2844061160000000000000000000000e+09
     110 |  1.1726039400531786    |  1.2844061160000000000000000000000000e+09
     120 |  1.1726039400531786    |  1.2844061160000000000000000000000000000e+09
     122 | -8.2739605994682137e-1 |  1.2844061140000000000000000000000000000e+09

To my understanding, BigFloat uses MPFR as a back-end, so the results should be the same. But the results above differ completely (except at single (24-bit) and double (53-bit) precision).
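As an aside, here is a minimal check (not part of the original post, just a sketch) that BigFloat's precision really is controlled by the MPFR working precision set via setprecision:

```julia
# Each BigFloat records the MPFR precision it was created with;
# setprecision(BigFloat, n) changes it for subsequently created values.
setprecision(BigFloat, 53)
x = BigFloat(1) / BigFloat(3)
println(precision(x))  # 53
```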

Can someone help me understand this result? As I am new to Julia, I may have coded this wrong, so here is the Julia code I used:

function rump(T)
    a::T = parse(T, "77617.0")
    b::T = parse(T, "33096.0")
    b2 = b*b
    b4 = b2*b2
    b6 = b4*b2
    b8 = b4*b4
    a2 = a*a
    firstexpr = 11*a2*b2 - b6 - 121*b4 - 2
    f = parse(T, "333.75")*b6 + a2 * firstexpr + parse(T, "5.5")*b8 + (a/(parse(T, "2.0")/b))
    println(" Precision is ", f.prec , " Result is ", f)
end


function main()
    prec = parse(Int, ARGS[1])
    setprecision(BigFloat, prec)
    rump(BigFloat)
end
main()

Thanks for your help!


#2

Try it with ArbFloats.jl

If you get the same results as with MPFR, you can be pretty sure the problem has nothing to do with the number types and must be in your code.


#3

As a side comment, I think you can safely drop the parse calls in the function body and use type constructors instead:

function rump(T)
    a = T(77617.0)
    b = T(33096.0)
    ...
end

#4

I think you have the operation wrong: it should be * instead of /:

f = ... (a/(parse(T, "2.0") * b))

By the way, you can simply write

f = 333.75*b6 + a2 * firstexpr + 5.5*b8 + a / (2 * b)

since all those numbers are represented exactly in Float64.
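In case it helps, here is a minimal sketch of the corrected computation (`rump_fixed` is my own name, not from the thread). At 200 bits all intermediate terms are exact, so the result matches the value the document's MPFR column shows at precision 122:

```julia
# Rump's expression with the corrected term a / (2 * b).
# The literals 333.75, 5.5, 77617.0 and 33096.0 are all exact in
# Float64, so promoting them to BigFloat loses nothing.
function rump_fixed(a::T, b::T) where {T}
    b2 = b * b
    b4 = b2 * b2
    b6 = b4 * b2
    b8 = b4 * b4
    a2 = a * a
    return 333.75 * b6 + a2 * (11 * a2 * b2 - b6 - 121 * b4 - 2) +
           5.5 * b8 + a / (2 * b)
end

f = setprecision(BigFloat, 200) do
    rump_fixed(BigFloat(77617.0), BigFloat(33096.0))
end
println(f)  # ≈ -0.8273960599468214, as in the MPFR column at precision 122
```

At low precision the huge polynomial terms cancel catastrophically, which is the whole point of Rump's example; with enough bits the polynomial part collapses exactly to -2 and only a/(2b) ≈ 1.1726 remains.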


#5

Is that true? This creates a Float64 and then converts it to a BigFloat, while the original created a BigFloat from a decimal string. Couldn't that produce different bits?


#6

You are probably right @avik, thanks for the correction.


#7

Both those numbers are exactly representable in Float64, so they are exact in BigFloat too:

julia> BigFloat("77617.0") - BigFloat(77617.0)
0.000000000000000000000000000000000000000000000000000000000000000000000000000000e+04

whereas it is different for, say, 77617.1, which has no exact representation in Float64:

julia> BigFloat("77617.1") - BigFloat(77617.1)
-5.820766091346740722656249999999999999999999999999999999999999773608023029332191e-12

That said, this works only because those specific numbers happen to be exactly representable.
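One quick way to check this (a sketch using Base's exact float-to-Rational conversion, not code from the thread): a value that is exact in Float64 converts to a rational with a small power-of-two denominator, while 77617.1 converts to the exact bits of the nearest double, not 776171/10:

```julia
# Rational{Int}(x) recovers the exact value stored in the Float64 bits,
# so a small power-of-two denominator means the decimal literal was
# exactly representable.
println(Rational{Int}(333.75))   # 1335//4
println(Rational{Int}(5.5))      # 11//2
println(Rational{Int}(77617.0))  # 77617//1
println(Rational{Int}(77617.1))  # not 776171//10: the stored double differs
```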


#8

Damn, you're right!

The results are much better this way. Using Float64 literals as constants also works. I just got paranoid because things weren't working well.

My bad. Thanks @mzaffalon.