Search Results

Search found 616 results on 25 pages for 'fopen'.

  • using fopen on uploaded file with php

    - by Patrick
    I'm uploading a file and then attempting to use fgetcsv to do things with it. My script is below. Everything was working fine before I added the file upload portions. Now it's not showing any errors, it's just not displaying the uploaded data. I'm getting my "file uploaded" notice, and the file is saving properly, but the counter for the number of entries shows 0.

        if(isset($_FILES[csvgo])){
            $tmp = $_FILES[csvgo][tmp_name];
            $name = "temp.csv";
            $tempdir = "csvtemp/";
            if(move_uploaded_file($tmp, $tempdir.$name)){
                echo "file uploaded<br>";
            }
            $csvfile = $tempdir.$name;
            $fh = fopen($csvfile, 'r');
            $headers = fgetcsv($fh);
            $data = array();
            while (!feof($fh)) {
                $row = fgetcsv($fh);
                if (!empty($row)) {
                    $obj = new stdClass;
                    foreach ($row as $i => $value) {
                        $key = $headers[$i];
                        $obj->$key = $value;
                    }
                    $data[] = $obj;
                }
            }
            fclose($fh);
            $c = 0;
            foreach($data AS $key){
                $c++;
                $phone = $key->PHONE;
                $fax = $key->Fax;
                $email = $key->Email;
                // do things with the data here
            }
            echo "$c entries <br>";
        }
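
    A hedged sketch of the same upload-and-parse flow, with the array keys quoted and a check at every step so that an empty result prints a reason instead of silently counting 0. The csvtemp/ directory and field layout come from the question; everything else here is an assumption, not the poster's code.

        <?php
        // Hypothetical sketch: same flow with explicit error checks.
        if (isset($_FILES['csvgo'])) {
            $dest = 'csvtemp/temp.csv';
            if (!move_uploaded_file($_FILES['csvgo']['tmp_name'], $dest)) {
                die('upload failed, error code ' . $_FILES['csvgo']['error']);
            }
            $fh = fopen($dest, 'r');
            if ($fh === false) {
                die("could not open $dest");
            }
            $headers = fgetcsv($fh);
            if ($headers === false) {
                die('no header row: file is empty or not readable as CSV');
            }
            $data = array();
            while (($row = fgetcsv($fh)) !== false) {
                if ($row === array(null)) {
                    continue;                                // fgetcsv returns array(null) for blank lines
                }
                // assumes each row has as many fields as the header row
                $data[] = array_combine($headers, $row);     // header => value pairs
            }
            fclose($fh);
            echo count($data) . " entries<br>";
        }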

    Read the article

  • C programming fopen

    - by Pedro
        #include <stdio.h>
        #include <stdlib.h>

        typedef struct aluno{
            char cabecalho[60];
            char info[100];
            int n_alunos;
            char dados[100];
            char curso[100];
            int numero;
            char nome[100];
            char e_mail[100];
            int n_disciplinas;
            int nota;
        }ALUNO;

        void cabclh(ALUNO alunos[],int a){
            FILE *fp;
            int i;
            for(i=0;i<100;i++){
                fp=fopen("trabalho.txt","r");
            }
            if(fp==NULL){
                printf("Erro ao abrir o ficheiro\n");
            }
            while(!feof(fp)){
                fgets(alunos[i].cabecalho,100,fp);
                printf("%s\n",alunos[i].cabecalho);
            }
        }
        fclose(fp);
        }

    what is wrong here? main:

        int main(int argc, char *argv[]){
            ALUNO alunos[100];
            int aluno;
            int b;
            cabclh(aluno,b);
            system("PAUSE");
            return 0

    Read the article

  • close file with fopen() but file still in use

    - by Marco
    Hi all, I've got a problem with deleting/overwriting a file with my program, which is also being used (read) by my program. The problem seems to be that because my program is reading data from the file (output.txt), it puts the file in an 'in use' state, which makes it impossible to delete or overwrite the file. I don't understand why the file stays 'in use', because I close the file after use with fclose(). This is my code:

        bool bBool = true;
        while(bBool){
            //Run myprogram.exe to generate (a new) output.txt

            //Create file pointer and open file
            FILE* pInputFile = NULL;
            pInputFile = fopen("output.txt", "r");

            //
            //then I do some reading using fscanf()
            //

            //And when I'm done reading I close the file using fclose()
            fclose(pInputFile);

            //The next step is deleting the output.txt
            if( remove( "output.txt" ) == -1 ){
                //ERROR
            }else{
                //Successful
            }
        }

    I use fclose() to close the file, but the file remains in use by my program until my program is totally shut down. What is the solution to free the file so it can be deleted/overwritten? In reality my code isn't a loop without an end ;) Thanks in advance! Marco

    Read the article

  • How to get the contents of a site that uses HTTPS

    - by cashmoney
    An example of a site using SSL (HTTPS): https://www.eb2a.com

    1 - I tried to get its content using file_get_contents, but it does not work and gives an error:

        <?php
        $contents = file_get_contents("https://www.eb2a.com/");
        echo $contents;
        ?>

    2 - I tried to use fopen, but it does not work and gives an error:

        <?php
        $url = 'https://www.eb2a.com/';
        $contents = fopen($url, 'r');
        echo "$contents";
        ?>

    3 - I tried to use cURL, but it does not work and gives a BLANK PAGE:

        function cURL($url, $ref, $header, $cookie, $p){
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
            curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
            curl_setopt($ch, CURLOPT_REFERER, $ref);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
            if ($p) {
                curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
                curl_setopt($ch, CURLOPT_POST, 1);
                curl_setopt($ch, CURLOPT_POSTFIELDS, $p);
            }
            $result = curl_exec($ch);
            curl_close($ch);
            if ($result){
                return $result;
            }else{
                return '';
            }
        }
        $file = cURL('https://www.eb2a.com/','https://www.eb2a.com/',0,0,null);
        echo $file;

    Does anyone have any idea?
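
    Not from the original post: a hedged sketch of the cURL attempt that reports the underlying error instead of returning an empty string, so a blank page turns into a message saying why the HTTPS request failed (for example, a missing OpenSSL extension or a certificate problem).

        <?php
        // Hypothetical sketch: same GET request, but surface curl_error() on failure.
        $ch = curl_init('https://www.eb2a.com/');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);   // keep verification on where possible
        $body = curl_exec($ch);
        if ($body === false) {
            echo 'cURL error ' . curl_errno($ch) . ': ' . curl_error($ch);
        } else {
            echo $body;
        }
        curl_close($ch);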

    Read the article

  • Print the first line of a file C programming

    - by Pedro
        void cabclh(){
            FILE *fp;
            char *val, aux;
            int i=0;
            char *result, cabeca[60];
            fp=fopen("trabalho.txt","r");
            if(fp==NULL){
                printf("ERROR\n");
                return ;
            }
            val=(char*)calloc(aux, sizeof(char));
            while(fp='\n'){
                fgets(cabeca,60,fp);
                printf("%s\n",cabeca);
            }
            fclose(fp);
            free(fp);
        }

    I want to open a file and print the first line. The problem here is in while(fp='\n'); what am I doing wrong? How can I make a function that recognizes the first char from a file, like:

        FILE *arq;
        char info[20];
        arq=fopen("trabalho.txt","r");
        if(fp==NULL){
            printf("ERROR\n");
            return ;
        }
        if(fp[0]='-'){   //check if the first element is a '-'
            printf("It's info\n");
        }

    Read the article

  • How to check which files I fopen'ed on Windows?

    - by Igor Oks
    I received a "Too many open files" error when I tried to call fopen (C++, Windows XP). It probably happened because somewhere in my program I open files without closing them. Is there a way on Windows to see a list of all open file descriptors (or all files that my program has fopen'd)?

    Read the article

  • How do I open a file in such a way that if the file doesn't exist it will be created and opened automatically?

    - by snakile
    Here's how I open a file for writing ("w+"):

        if( fopen_s( &f, fileName, "w+" ) != 0 ) {
            printf("Open file failed\n");
            return;
        }
        fprintf_s(f, "content");

    If the file doesn't exist, the open operation fails. What's the right way to call fopen if I want the file to be created automatically when it doesn't already exist? EDIT: If the file does exist, I would like fprintf to overwrite the file, not append to it.

    Read the article

  • [php] How to read only the last 5 lines of a txt file

    - by safaali
    Hello, I have a file named "file.txt" that is updated by adding lines to it. I am reading it with this code:

        $fp = fopen("file.txt", "r");
        $data = "";
        while(!feof($fp)) {
            $data .= fgets($fp, 4096);
        }
        echo $data;

    and a huge number of lines appears. I just want to echo the last 5 lines of the file. How can I do that? Thanks in advance.
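
    For a file that fits comfortably in memory, one common approach (a sketch, not part of the original post) is to let file() split it into lines and slice off the tail:

        <?php
        // Hypothetical sketch: read all lines, keep only the last five.
        // Fine for small and medium files; a very large file would call for
        // reading backwards in chunks instead.
        $lines = file('file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        echo implode("\n", array_slice($lines, -5)), "\n";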

    Read the article

  • Can "\Device\NamedPipe\\Win32Pipes" handles cause "Too many open files" error?

    - by Igor Oks
    Continuing from this question: When I am trying to call fopen on Windows, I get a "Too many open files" error. I tried to analyze how many open files I have, and it doesn't seem like too many. But when I ran Process Explorer, I noticed that I have many open handles with similar names: "\Device\NamedPipe\Win32Pipes.00000590.000000e2", "\Device\NamedPipe\Win32Pipes.00000590.000000e3", etc. I see that the number of these handles is exactly equal to the number of iterations that my program executed before it returned "Too many open files" and stopped. I am looking for an answer: what are these handles, and could they actually cause the "Too many open files" error? In my program I am loading files from a remote drive and I am creating TCP/IP connections. Could one of these operations create these handles?

    Read the article

  • Writing a new line to file in PHP

    - by James P
    My code:

        $i = 0;
        $file = fopen('ids.txt', 'w');
        foreach ($gemList as $gem) {
            fwrite($file, $gem->getAttribute('id') . '\n');
            $gemIDs[$i] = $gem->getAttribute('id');
            $i++;
        }
        fclose($file);

    For some reason, it's writing \n as a literal string, so the file looks like this:

        40119\n40122\n40120\n42155\n36925\n45881\n42145\n45880

    From Googling, I'm told to use \r\n, but \r is a carriage return, which doesn't seem to be what I want. I just want the file to look like this:

        40119
        40122
        40120
        42155
        36925
        45881
        42145
        45880

    Thanks.
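
    In PHP, escape sequences such as \n are only interpreted inside double-quoted strings; in single quotes they are written out literally. A hedged sketch of the same loop with that one change ($gemList is the hypothetical node list from the question):

        <?php
        // Hypothetical sketch: "\n" in double quotes is a real line break;
        // '\n' in single quotes is a backslash followed by the letter n.
        $gemIDs = array();
        $file = fopen('ids.txt', 'w');
        foreach ($gemList as $gem) {
            $id = $gem->getAttribute('id');
            fwrite($file, $id . "\n");   // or $id . PHP_EOL for the platform line ending
            $gemIDs[] = $id;
        }
        fclose($file);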

    Read the article

  • Getting the title of a page in PHP

    - by Francesc
    Hi. When I want to get the title of a remote website, I use this script:

        function get_remotetitle($urlpage) {
            $file = @fopen(($urlpage),"r");
            $text = fread($file,16384);
            if (preg_match('/<title>(.*?)<\/title>/is',$text,$found)) {
                $title = $found[1];
            } else {
                $title = 'Title N/A';
            }
            return $title;
        }

    But when I parse a website title with accents, I get "?". Yet if I look in phpMyAdmin, I see the accents correctly. What's happening?
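
    The "?" characters usually point to a charset mismatch between the fetched page and whatever encoding the output or database connection expects. A hedged sketch (not the poster's function) that normalizes the fetched HTML to UTF-8 before extracting the title, assuming the mbstring extension is available:

        <?php
        // Hypothetical sketch: fetch the page, convert it to UTF-8, then pull <title>.
        function get_remote_title_utf8($url) {
            $html = file_get_contents($url);
            if ($html === false) {
                return 'Title N/A';
            }
            $enc = mb_detect_encoding($html, array('UTF-8', 'ISO-8859-1', 'Windows-1252'), true);
            if ($enc !== false && $enc !== 'UTF-8') {
                $html = mb_convert_encoding($html, 'UTF-8', $enc);
            }
            if (preg_match('/<title>(.*?)<\/title>/is', $html, $m)) {
                // Decode entities such as &eacute; as well.
                return html_entity_decode(trim($m[1]), ENT_QUOTES, 'UTF-8');
            }
            return 'Title N/A';
        }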

    Read the article

  • What's the best way to write to more files than the kernel allows open at a time?

    - by Elpezmuerto
    I have a very large binary file and I need to create separate files based on the id within the input file. There are 146 output files and I am using cstdlib and fopen and fwrite. FOPEN_MAX is 20, so I can't keep all 146 output files open at the same time. I also want to minimize the number of times I open and close an output file. How can I write to the output files effectively? I also must use the cstdlib library due to legacy code.

    Read the article

  • How do I effectively write to 146 output files in C++ using cstdlib library

    - by Elpezmuerto
    I have a very large binary file and I need to create separate files based on the id within the input file. There are 146 output files and I am using cstdlib and fopen and fwrite. FOPEN_MAX is 20, so I can't keep all 146 output files open at the same time. I also want to minimize the number of times I open and close an output file. How can I write to the output files effectively? I also must use the cstdlib library due to legacy code.

    Read the article

  • How to create custom filenames in C?

    - by eSKay
    Please see this piece of code:

        #include<stdio.h>
        #include<string.h>
        #include<stdlib.h>

        int main()
        {
            int i = 0;
            FILE *fp;
            for(i = 0; i < 100; i++)
            {
                fp = fopen("/*what should go here??*/","w");
                //I need to create files with names: file0.txt, file1.txt, file2.txt etc
                //i.e. file{i}.txt
            }
        }

    Read the article

  • Debug assertion failed

    - by Kolt
    I have a program that runs correctly if I start it manually. However, if I add a registry key to start it automatically during startup, I get this error: Debug assertion failed (str!=null), fprintf.c line:55. I tried adding Sleep(20000) before anything happens, but I get the same error. Here's the code:

        main()
        {
            FILE* filetowrite;
            filetowrite = fopen("textfile.txt", "a+");
            writefunction(filetowrite);
        }

        int writefunction(FILE* filetowrite)
        {
            fprintf(filetowrite, "%s", "\n\n");
            ...
        }

    I also tried passing the filename as a char* and opening it in writefunction(), but I get the same error.

    Read the article

  • Streaming a local file from PHP while it's being written to by a cURL process

    - by Fahim
    I am creating a simple proxy server for my website. Why I am not using mod_proxy and mod_cache is a different discussion. Here's the code:

        shell_exec("nohup curl --create-dirs -o {$write_path} {$source_url} > /dev/null 2> /dev/null & echo $!");
        sleep(1);

        $read_speed = 65.5; # 65.5 kb/s download rate

        $handle = fopen($write_path, "rb");

        $content_type = select_meta_item($headers, 'Content-Type');
        $file_size = select_meta_item($headers, 'Content-Length');

        send_headers($content_type, $file_size);
        flush();

        while (!feof($handle)) {
            echo fread($handle, round($read_speed * 1024));
            flush();
            sleep(1);
        }

        fclose($handle);

    Streaming an MP3 doesn't work using this method. It plays in Chrome, but not in Firefox. Initially I'll be using this to stream MP3 files through Long Tail's JW Player. If it all works out, I'll also be using this to send ZIP files.
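
    One thing to watch when reading a file that another process is still writing: feof() reports end-of-file as soon as the reader catches up with the writer, even though more data is on the way. A hedged sketch that keeps going until the announced Content-Length has been sent, reusing the hypothetical $write_path and $file_size from the question:

        <?php
        // Hypothetical sketch: stream until $file_size bytes have been sent instead
        // of trusting feof() while curl is still appending to the file.
        $sent = 0;
        $handle = fopen($write_path, 'rb');
        while ($sent < $file_size) {
            clearstatcache(true, $write_path);            // refresh the cached filesize()
            $available = filesize($write_path) - $sent;   // written so far but not yet sent
            if ($available <= 0) {
                sleep(1);                                 // let the download get ahead
                continue;
            }
            $chunk = fread($handle, min($available, 65536));
            if ($chunk === '' || $chunk === false) {
                sleep(1);
                continue;
            }
            echo $chunk;
            flush();
            $sent += strlen($chunk);
        }
        fclose($handle);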

    Read the article

  • Reading a .dat file as "rb" read binary

    - by donpal
    I have a web-accessible php script that accesses a folder above the webroot (not web accessible) called \folder\. This is done by setting the path to \folder\ in .htaccess in the usual way, so that \folder\ becomes part of the project. \folder\ contains:

    - a .php script (communicates with the web-accessible script inside the webroot)
    - some .inc files (used by the .php in the same folder, above the webroot)
    - a .dat file (used by the .inc in the same folder, above the webroot)

    All files are accessible to each other as needed:

    - the web-accessible php inside the webroot can communicate with the php above the webroot
    - the php above the webroot can communicate with the inc in the same folder

    But the inc above the webroot can't communicate with the dat in the same folder, and I have no idea why that's the case. The inc myinc.inc is supposed to open the dat mydat.dat in the same folder like this:

        fopen('mydat.dat', "rb");

    but I get an error that no file called mydat.dat exists inside \folder\myinc.inc. Of course it does not; the .dat is a sibling of the .inc and is not supposed to be inside it. Why is php expecting to find the .dat file inside the .inc? The stranger thing is that if I move the .dat into the web-accessible folder, it becomes readable. Any ideas why php is trying to find the .dat inside the .inc?
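
    A relative path passed to fopen() is resolved against the running script's working directory, not against the directory the .inc file lives in, which is one common cause of this kind of symptom. A hedged sketch that anchors the path to the .inc file's own directory:

        <?php
        // Hypothetical sketch inside myinc.inc: build an absolute path from the
        // directory of the .inc itself, so the caller's working directory no longer matters.
        $datPath = __DIR__ . '/mydat.dat';   // dirname(__FILE__) on PHP < 5.3
        $fh = fopen($datPath, 'rb');
        if ($fh === false) {
            die("could not open $datPath");
        }
        // ... read the .dat as before ...
        fclose($fh);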

    Read the article

  • Default input and output buffering for fopen'd files?

    - by Evan Teran
    So a FILE stream can have both input and output buffers. You can adjust the output stream using setvbuf (I am unaware of any method to play with the input buffer size and behavior). Also, by default the buffer is BUFSIZ (not sure if this is a POSIX or C thing). It is very clear what this means for stdin/stdout/stderr, but what are the defaults for newly opened files? Are they buffered for both input and output? Or perhaps just one? If it is buffered, does output default to block or line mode?

    Read the article

  • Using a function found in a different file in a loop

    - by Anders
    This question is related to BuddyPress and is a follow-up to this question. I have a .csv file with 790 rows and 3 columns, where the first column is the group name, the second is the group description and the last (third) is the slug. As far as I've been told, I can use this code:

        <?php
        $groups = array();
        if (($handle = fopen("groupData.csv", "r")) !== FALSE) {
            while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
                $group = array(
                    'group_id'     => 'SOME ID',
                    'name'         => $data[0],
                    'description'  => $data[1],
                    'slug'         => groups_check_slug(sanitize_title(esc_attr($data[2]))),
                    'date_created' => gmdate( "Y-m-d H:i:s" ),
                    'status'       => 'public'
                );
                $groups[] = $group;
            }
            fclose($handle);
        }
        foreach ($groups as $group) {
            groups_create_group($group);
        }

    with http://www.nomorepasting.com/getpaste.php?pasteid=35217 which is called bp-groups.php. The thing is that I can't make it work. I've created a new file with the code written above called groupgenerator.php, uploaded the .csv file to the same folder and opened groupgenerator.php in my browser. But I get this error:

        Fatal error: Call to undefined function groups_check_slug() in

    What am I doing wrong?
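
    A standalone script has no WordPress (and therefore no BuddyPress) loaded, which is the usual reason a function like groups_check_slug() comes up undefined. A hedged sketch that bootstraps WordPress first; the wp-load.php path and the plugin being active are assumptions, not facts from the post:

        <?php
        // Hypothetical sketch: load WordPress (and the BuddyPress plugin with it)
        // before calling any groups_* functions from a standalone script.
        require_once __DIR__ . '/wp-load.php';   // adjust to your install layout

        if (!function_exists('groups_create_group')) {
            die('BuddyPress is not active');
        }

        if (($handle = fopen('groupData.csv', 'r')) !== false) {
            while (($data = fgetcsv($handle, 1000, ',')) !== false) {
                groups_create_group(array(
                    'name'         => $data[0],
                    'description'  => $data[1],
                    'slug'         => groups_check_slug(sanitize_title($data[2])),
                    'date_created' => gmdate('Y-m-d H:i:s'),
                    'status'       => 'public',
                ));
            }
            fclose($handle);
        }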

    Read the article

  • How to open a file and remove the last line?

    - by sologhost
    I am looking to open up a file and grab the last line in the file where the line = "?>", which is the closing tag for a php document. Then I want to append data into it and add back in the "?>" as the very last line. I've been trying a few approaches, but I'm not having any luck. Here's what I've got so far, as I am reading from a zip file. Though I know this is all wrong, I'm just needing some help with this please...

        // Open for reading is all we can do with zips and is all we need.
        if (zip_entry_open($zipOpen, $zipFile, "r"))
        {
            $fstream = zip_entry_read($zipFile, zip_entry_filesize($zipFile));
            $fp = fopen($curr_lang_file, 'r+b');
            while (!feof($fp))
            {
                $output = fgets($fp, 16384);
                if (trim($output) == '?>')
                    break;
                fwrite($fp, $output);
            }
            fclose($fp);
            file_put_contents($curr_lang_file, $fstream, FILE_APPEND);
        }

    $curr_lang_file is a filepath string to the actual file that needs to have the fstream appended to it, but only after we remove the last line that equals '?>'.
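
    One way this is often handled without rewriting the file line by line (a hedged sketch, not from the post): read the whole file, strip a trailing closing tag, append the new code, and put the tag back at the end.

        <?php
        // Hypothetical sketch: drop a trailing "?>" (and any whitespace after it),
        // append $fstream, then restore the closing tag as the very last line.
        $code = file_get_contents($curr_lang_file);
        $code = preg_replace('/\?>\s*$/', '', $code);   // remove the final "?>" if present
        $code .= $fstream . "\n?>";                     // re-add the closing tag last
        file_put_contents($curr_lang_file, $code);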

    Read the article

  • How to open a file and remove the last line?

    - by sologhost
    I am looking to open up a file and grab the last line in the file where the line = "?>", which is the closing tag for a php document. Then I want to append data into it and add back in the "?>" as the very last line. I've been trying a few approaches, but I'm not having any luck. Here's what I've got so far, as I am reading from a zip file. Though I know this is all wrong, I'm just needing some help with this please...

        // Open for reading is all we can do with zips and is all we need.
        if (zip_entry_open($zipOpen, $zipFile, "r"))
        {
            $fstream = zip_entry_read($zipFile, zip_entry_filesize($zipFile));
            $fp = fopen($curr_lang_file, 'r+b');
            while (!feof($fp))
            {
                $output = fgets($fp, 16384);
                if (trim($output) == '?>')
                    break;
                fwrite($fp, $output);
            }
            fclose($fp);
            file_put_contents($curr_lang_file, $fstream, FILE_APPEND);
        }

    $curr_lang_file is a filepath string to the actual file that needs to have the fstream appended to it, but only after we remove the last line that equals '?>'.

    Read the article

  • Editing a remote file on-the-fly with PHP

    - by user275074
    Hi, I have a requirement to edit a remote text file on the fly; its content currently stands at ~1 MB. I have tried a couple of approaches and both seem to be clunky or hog memory, which I can't rely on. Thinking it out logically, what I'm trying to achieve is:

    1. FTP to a remote server.
    2. Download a copy of the file for backup purposes and store it somewhere locally.
    3. Open the remote file and add the necessary lines required.
    4. Remove lines from the remote file as per an array of un-required data generated from the local server.

    Is this possible? I've managed to code steps 1 and 2, but I'm having difficulty with 3 and 4. The way I'm doing it now is to use fgets and return the whole string. Really, I don't want to do this, as it involves manipulating and re-generating the whole string (and it's large) and then re-inserting it between two markers in the remote file. Is there no way of manipulating the lines of text in the file on the fly?
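
    Plain FTP has no way to edit a remote file in place; the usual pattern is download, edit the lines locally, and upload the result. A hedged sketch of steps 3 and 4 with hypothetical host, credentials, paths and $unwanted data (a ~1 MB file fits in memory comfortably):

        <?php
        // Hypothetical sketch: fetch the file over FTP, edit it line by line
        // locally, then upload the edited copy back over the original.
        $conn = ftp_connect('ftp.example.com');            // hypothetical host
        ftp_login($conn, 'user', 'pass');                  // hypothetical credentials
        ftp_pasv($conn, true);

        $local = 'work_copy.txt';
        ftp_get($conn, $local, 'remote/file.txt', FTP_ASCII);

        $unwanted = array('line to drop 1', 'line to drop 2');   // hypothetical data
        $lines = file($local, FILE_IGNORE_NEW_LINES);
        $lines = array_diff($lines, $unwanted);                  // step 4: remove lines
        $lines[] = 'a new line to add';                          // step 3: add lines

        file_put_contents($local, implode("\n", $lines) . "\n");
        ftp_put($conn, 'remote/file.txt', $local, FTP_ASCII);
        ftp_close($conn);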

    Read the article

  • MATLAB - Delete elements of binary files without loading entire file

    - by Doresoom
    This may be a stupid question, but Google and MATLAB documentation have failed me. I have a rather large binary file (10 GB) that I need to open and delete the last forty million bytes or so. Is there a way to do this without reading the entire file to memory in chunks and printing it out to a new file? It took 6 hours to generate the file, so I'm cringing at the thought of re-reading the whole thing. EDIT: The file is 14,440,000,000 bytes in size. I need to chop it to 14,400,000,000.

    Read the article

  • unbuffered I/O in Linux

    - by stuck
    I'm writing lots and lots of data that will not be read again for weeks. As my program runs, the amount of free memory on the machine (displayed with 'free' or 'top') drops very quickly, while the amount of memory my app uses does not increase, and neither does the amount of memory used by other processes. This leads me to believe the memory is being consumed by the filesystem's cache. Since I do not intend to read this data for a long time, I'm hoping to bypass the system's buffers, such that my data is written directly to disk. I don't have dreams of improving perf or being a super ninja; my hope is to give a hint to the filesystem that I'm not going to be coming back for this memory any time soon, so don't spend time optimizing for those cases.

    On Windows I've faced similar problems and fixed them using FILE_FLAG_NO_BUFFERING|FILE_FLAG_WRITE_THROUGH: the machine's memory was not consumed by my app and the machine was more usable in general. I'm hoping to duplicate the improvements I've seen, but on Linux. On Windows there is the restriction of writing in sector-sized pieces; I'm happy with that restriction for the amount of gain I've measured. Is there a similar way to do this in Linux?

    Read the article
