Explanation of int/double casting and conversion in C#

Posted by cad on Stack Overflow
I wrote some calculation code (a really simplified example of which I copied below) like CASE 2 and got bad results. I refactored it like CASE 1 and it worked fine. I know there is an implicit cast in CASE 2, but I'm not sure of the full reason. Could anyone explain exactly what's happening below?

    //CASE 1, result 5.5
    double auxMedia = (5 + 6);
    auxMedia = auxMedia / 2;

    //CASE 2, result 5.0
    double auxMedia1 = (5 + 6) / 2;

    //CASE 3, result 5.5
    double auxMedia3 = (5.0 + 6.0) / 2.0;

    //CASE 4, result 5.5
    double auxMedia4 = (5 + 6) / 2.0;

My guess is that in CASE 2, (5 + 6) / 2 is evaluated entirely in integer arithmetic: both operands are int, so the division truncates to 5, and only then is the result implicitly converted to double as 5.0.

CASE 3 and CASE 4 also fix the problem, as does the explicit cast shown below.
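For reference, here is a minimal, self-contained sketch (the class and variable names are hypothetical, not from the original post) showing that casting one operand to double before dividing also forces floating-point division, since a double / int expression promotes the int operand to double:

    using System;

    class CastDemo
    {
        static void Main()
        {
            int sum = 5 + 6;                        // int + int => int (11)
            double intDivision = sum / 2;           // int / int truncates to 5, then 5 -> 5.0
            double castDivision = (double)sum / 2;  // double / int => double division: 5.5

            Console.WriteLine(intDivision);         // prints 5
            Console.WriteLine(castDivision);        // prints 5.5
        }
    }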
