Using typedefs (or #defines) on built in types - any sensible reason?

Posted by jb on Stack Overflow See other posts from Stack Overflow or by jb
Published on 2009-05-13T23:14:15Z Indexed on 2010/04/04 9:13 UTC

Well, I'm doing some Java - C integration, and throughout the C library weird type mappings are used (there are more of them ;)):

#define CHAR        char                    /*  8 bit signed int            */
#define SHORT       short                   /* 16 bit signed int            */
#define INT         int                     /* "natural" length signed int  */
#define LONG        long                    /* 32 bit signed int            */
typedef unsigned    char    BYTE;           /*  8 bit unsigned int          */
typedef unsigned    char    UCHAR;          /*  8 bit unsigned int          */
typedef unsigned    short   USHORT;         /* 16 bit unsigned int          */
typedef unsigned    int     UINT;           /* "natural" length unsigned int*/

Is there any legitimate reason not to use them? It's not like char is going to be redefined anytime soon.

I can think of:

  1. Writing platform/compiler portable code (size of type is underspecified in C/C++)
  2. Saving space and time on embedded systems - if you loop over an array shorter than 256 elements on an 8-bit microprocessor, writing:

     for(uint8_t ii = 0; ii < len; ii++)
    

    will give a measurable speedup.
