Program using read() entering an infinite loop

Posted by Soham on Stack Overflow, 2011-02-18.

void ReadBinary(char *infile, HXmap* AssetMap)
{
    int fd;
    size_t bytes_read, bytes_expected = 100000000*sizeof(char);
    char *data;

    if ((fd = open(infile, O_RDONLY)) < 0)
        err(EX_NOINPUT, "%s", infile);

    if ((data = malloc(bytes_expected)) == NULL)
        err(EX_OSERR, "data malloc");

    bytes_read = read(fd, data, bytes_expected);

    if (bytes_read != bytes_expected)
        printf("Read only %d of %d bytes %d\n", \
            bytes_read, bytes_expected, EX_DATAERR);

    /* ... operate on data ... */
    printf("\n");
    int i = 0;
    int counter = 0;
    char ch = data[0];
    char message[512];
    Message* newMessage;
    while (i != bytes_read)
    {
        while (ch != '\n')
        {
            message[counter] = ch;
            i++;
            counter++;
            ch = data[i];
        }
        message[counter] = '\n';
        message[counter+1] = '\0';
        //---------------------------------------------------
        newMessage = (Message*)parser(message);
        MessageProcess(newMessage, AssetMap);
        //--------------------------------------------------
        //printf("idNUM %e\n",newMessage->idNum);
        free(newMessage);
        i++;
        counter = 0;
        ch = data[i];
    }
    free(data);
}

Here, I have allocated 100 MB of data with malloc and passed a file that is big enough (not 500 MB; about 926 KB in size). When I pass small files, it reads and exits like a charm, but when I pass a big enough file, the program executes up to some point and then just hangs. I suspect it has either entered an infinite loop, or there is a memory leak.
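As an aside, part of that "Read only ... bytes" message is expected: read() returns at most what the file holds, so a 926 KB file can never fill a 100 MB buffer, and a single read() call is in any case allowed to return fewer bytes than requested. A minimal sketch of a looping read, assuming the goal is just to get the whole file into data (the read_all helper and its name are mine, not from the post):

#include <unistd.h>      /* read()  */
#include <sys/types.h>   /* ssize_t */

/* Hypothetical helper, not part of the original code: keep calling read()
 * until the buffer is full, end of file is reached, or an error occurs,
 * because a single read() may legally return fewer bytes than requested. */
static size_t read_all(int fd, char *buf, size_t want)
{
    size_t total = 0;
    ssize_t n;

    while (total < want && (n = read(fd, buf + total, want - total)) > 0)
        total += (size_t)n;     /* n == 0 means EOF, n < 0 means error */

    return total;               /* bytes actually placed in buf */
}

Used as bytes_read = read_all(fd, data, bytes_expected);, the value of bytes_read is then exactly the region of data that the scanning loop is allowed to touch.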

EDIT For better understanding, I stripped away all the unnecessary function calls and checked what happens when the program is given a large file as input. I have attached the modified code.

void ReadBinary(char *infile, HXmap* AssetMap)
{
    int fd;
    size_t bytes_read, bytes_expected = 500000000*sizeof(char);
    char *data;

    if ((fd = open(infile, O_RDONLY)) < 0)
        err(EX_NOINPUT, "%s", infile);

    if ((data = malloc(bytes_expected)) == NULL)
        err(EX_OSERR, "data malloc");

    bytes_read = read(fd, data, bytes_expected);

    if (bytes_read != bytes_expected)
        printf("Read only %d of %d bytes %d\n", \
            bytes_read, bytes_expected, EX_DATAERR);

    /* ... operate on data ... */
    printf("\n");
    int i = 0;
    int counter = 0;
    char ch = data[0];
    char message[512];
    while (i <= bytes_read)
    {
        while (ch != '\n')
        {
            message[counter] = ch;
            i++;
            counter++;
            ch = data[i];
        }
        message[counter] = '\n';
        message[counter+1] = '\0';
        i++;
        printf("idNUM \n");
        counter = 0;
        ch = data[i];
    }
    free(data);
}

What it looks like is that it prints a whole lot of idNUMs and then, poof, a segmentation fault.

I think this is an interesting behaviour, and to me it looks like there is some problem with memory.
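If the suspicion is a memory problem, one cheap way to confirm where it starts is to assert on both indices inside the loops before each ch = data[i]. A hypothetical helper (check_bounds is my name, not something from the post):

#include <assert.h>
#include <stddef.h>

/* Hypothetical guard, not in the posted code: if either condition fails,
 * the scan has left the region filled by read() or is about to overrun the
 * 512-byte message buffer, and assert() stops the program at that exact
 * iteration. */
static void check_bounds(size_t i, size_t bytes_read,
                         size_t counter, size_t message_size)
{
    assert(i < bytes_read);              /* still inside bytes actually read */
    assert(counter < message_size - 1);  /* room left for '\n' and '\0'      */
}

A call such as check_bounds(i, bytes_read, counter, sizeof message); would make the program stop at the first out-of-range access instead of crashing somewhere later.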

FURTHER EDIT I changed the condition back to i != bytes_read and it gives no segmentation fault. When I check for i <= bytes_read, it blows past the limits in the inner loop (courtesy of gdb).
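That gdb observation fits a scan with no bound on i: with i <= bytes_read the outer loop still runs when i == bytes_read, ch is then taken from data[bytes_read], which read() never filled, and the inner loop goes hunting for a '\n' through uninitialized memory while counter runs past the 512-byte message array. A minimal sketch of the same scan with explicit bounds on both indices (my rewrite, not the code from the post):

#include <stdio.h>

/* Sketch only: both the index into data and the index into message are
 * checked on every step, so the scan cannot walk past the bytes actually
 * read or overflow message[512], even when the last record has no
 * trailing '\n'. */
static void scan_lines(const char *data, size_t bytes_read)
{
    char message[512];
    size_t i = 0;

    while (i < bytes_read) {
        size_t counter = 0;

        while (i < bytes_read && data[i] != '\n' &&
               counter < sizeof(message) - 2)
            message[counter++] = data[i++];

        message[counter]     = '\n';
        message[counter + 1] = '\0';
        /* hand message to parser() / MessageProcess() here */
        printf("line of %zu bytes\n", counter);

        if (i < bytes_read && data[i] == '\n')
            i++;                      /* step over the newline itself */
    }
}

With bounds like these, the choice between i != bytes_read and i <= bytes_read stops mattering, because the scan simply ends at the last byte that read() delivered.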
