Search Results

Search found 3953 results on 159 pages for 'overlapped io'.


  • Python text file processing speed issues

    - by Anonymouslemming
    Hi all, I'm having a problem with processing a largeish file in Python. All I'm doing is f = gzip.open(pathToLog, 'r') for line in f: counter = counter + 1 if (counter % 1000000 == 0): print counter f.close This takes around 10m25s just to open the file, read the lines and increment this counter. In perl, dealing with the same file and doing quite a bit more (some regular expression stuff), the whole process takes around 1m17s. Perl Code: open(LOG, "/bin/zcat $logfile |") or die "Cannot read $logfile: $!\n"; while (<LOG>) { if (m/.*\[svc-\w+\].*login result: Successful\.$/) { $_ =~ s/some regex here/$1,$2,$3,$4/; push @an_array, $_ } } close LOG; Can anyone advise what I can do to make the Python solution run at a similar speed to the Perl solution? I've tried just uncompressing the file and dealing with it using open instead of gzip.open, but that made a very small difference to the overall time.

    Read the article

  • What is the fastest way to write hundreds of files to disk using C#?

    - by Ehsan
    My program should write hundreds of files to disk, received from external resources (network). Each file is a simple document that I currently store with a GUID as its name in a specific folder, but creating, writing, and closing hundreds of files is a lengthy process. Is there any better way to store this number of files to disk? I've come to a solution, but I don't know if it is the best. First, I create 2 files: one of them is like an allocation table and the second one is a huge file storing all the content of my documents. But reading from this file would be a nightmare; maybe a memory-mapped file technique could help. Could working with 30 GB or more create a problem?
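
    A minimal C sketch of the allocation-table idea described above (not the poster's code; the file names and record layout are made up): append every document to one data file and record its offset and length in a small index, so a later read can seek straight to the document it wants. Reading back is then an fseek to the stored offset plus an fread of the stored length; a memory-mapped view of the data file is one alternative for the read side.

        #include <stdio.h>

        /* Hypothetical sketch: one data file holds every document back to back,
           and a text index records "offset length" per document. */
        int append_document(FILE *data, FILE *index, const char *doc, size_t len)
        {
            if (fseek(data, 0, SEEK_END) != 0)
                return -1;
            long offset = ftell(data);                 /* where this document starts */
            if (fwrite(doc, 1, len, data) != len)
                return -1;
            fprintf(index, "%ld %zu\n", offset, len);  /* one index line per document */
            return 0;
        }

        int main(void)
        {
            FILE *data = fopen("documents.dat", "ab");   /* made-up file names */
            FILE *index = fopen("documents.idx", "a");
            if (data == NULL || index == NULL)
                return 1;
            append_document(data, index, "example document body", 21);
            fclose(data);
            fclose(index);
            return 0;
        }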

    Read the article

  • What would happen if the same file were being read and appended to at the same time (Python programming)?

    - by Shane
    I'm writing a script using two separate threads, one doing file reading and the other doing appending; both threads run fairly frequently. My question is: if one thread happens to read the file while the other is in the middle of appending a string such as "This is a test" into the file, what would happen? I know that if you are appending a smaller-than-buffer string, no matter how frequently you read the file in other threads, there would never be an incomplete line such as "This i" appearing in the file you read; I mean the OS would either do: append "This is a test" - read info from the file; or: read info from the file - append "This is a test" to the file; and this would never happen: append "This i" - read info from the file - append "s a test". But if "This is a test" is big enough (assuming it's a bigger-than-buffer string), the OS can't do the append in one operation, so the appending job would be divided into two: first append "This i" to the file, then append "s a test". In that situation, if I happen to read the file in the middle of the whole appending operation, would I get this result: append "This i" - read info from the file - append "s a test", which means I might read a file that includes an incomplete string?

    Read the article

  • How can I determine if a file is read-only for my process on *nix?

    - by user109078
    Using the stat function, I can get the read/write permissions for owner, group, and other, but this isn't what I want. I want to know the read/write permissions of a file for my process (i.e. the application I'm writing). The owner/group/other bits are only helpful if I know whether my process is running as the owner or group of the file... so maybe that's the solution, but I'm not sure of the steps to get there.
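
    A minimal C sketch of one common answer (the default path test.txt is made up): access(2) asks the kernel whether the calling process may read or write the path, using the real user and group IDs; faccessat(2) with AT_EACCESS does the same check against the effective IDs, which is usually what "for my process" means.

        #include <stdio.h>
        #include <unistd.h>
        #include <fcntl.h>

        int main(int argc, char **argv)
        {
            const char *path = argc > 1 ? argv[1] : "test.txt";   /* made-up default */

            /* access() answers "can this process (by its real IDs) read/write?" */
            if (access(path, R_OK) == 0)
                printf("%s is readable\n", path);
            if (access(path, W_OK) != 0)
                printf("%s is read-only for this process\n", path);

            /* faccessat with AT_EACCESS uses the effective IDs instead */
            if (faccessat(AT_FDCWD, path, W_OK, AT_EACCESS) != 0)
                printf("%s is read-only for the effective user\n", path);
            return 0;
        }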

    Read the article

  • basic file input using C

    - by user1781966
    So I'm working on learning how to do file I/O, but the book I'm using is terrible at teaching how to receive input from a file. Below is their example of how to receive input from a file, but it doesn't work. I have copied it word for word, and it should loop through a list of names until it reaches the end of the file (or so they say in the book), but it doesn't. In fact, if I leave the while loop in there, it doesn't print anything. #include <stdio.h> #include <conio.h> int main() { char name[10]; FILE*pRead; pRead=fopen("test.txt", "r"); if (pRead==NULL) { printf("file cannot be opened"); }else printf("contents of test.txt"); fscanf(pRead,"%s",name); while(!feof(pRead)) { printf("%s\n",name); fscanf(pRead, "%s", name); } getch(); } Even online, every beginner's tutorial I see does some variation of this, but I can't seem to get it to work even a little bit.
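
    For comparison, a minimal sketch of the usual idiom (not the book's code): check the return value of fscanf rather than feof, and only read from the stream after verifying that fopen succeeded. It assumes the same test.txt in the program's working directory.

        #include <stdio.h>

        int main(void)
        {
            char name[64];
            FILE *pRead = fopen("test.txt", "r");   /* file assumed to be in the working directory */

            if (pRead == NULL) {
                printf("file cannot be opened\n");
                return 1;
            }
            printf("contents of test.txt:\n");
            /* fscanf returns the number of items converted; stop when it is no longer 1 */
            while (fscanf(pRead, "%63s", name) == 1)
                printf("%s\n", name);
            fclose(pRead);
            return 0;
        }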

    Read the article

  • Get license file from a folder in C# project

    - by daft
    I have a license file that I need to access at runtime in order to create PDF files. After I have created the in-memory PDF, I need to call a method on that PDF to set the license, like this: pdf.SetLicense("pathToLicenseFileHere"); The license file is located in the same project as the .cs file that creates the PDF, but it is in a separate folder. I cannot get this simple thing to behave correctly, which makes me a bit sad, since it really shouldn't be that hard. :( I try to set the path like this: string path = @"\Resources\File.lic"; But it just isn't working out for me.

    Read the article

  • Program crashes after trying to use a recently created file. C#

    - by Jason T.
    So here is my code: if (!File.Exists(pathName)) { File.Create(pathName); } StreamWriter outputFile = new StreamWriter(pathName,true); Whenever I run the program the first time, the file at that path gets created. However, once I get to the StreamWriter line my program crashes because it says my file is in use by another process. Is there something I'm missing between the File.Create and the StreamWriter statements?

    Read the article

  • Memory mapping of files and system cache behavior in WinXP

    - by Canopus
    Our application is memory intensive and deals with reading a large number of disk files. The total load can be more than 3 GB. There is a custom memory manager that uses memory-mapped files to read such a huge amount of data. The files are mapped into the process memory space only when needed, and with this the process memory is well under control. But what we observe is that, with memory mapping, the system cache keeps growing until it occupies the available physical memory. This leads to the slowing down of the entire system. My question is: how do I prevent the system cache from hogging the physical memory? I attempted to remove the file buffering (by using FILE_FLAG_NO_BUFFERING), but with this the read operations take a considerable amount of time and slow down the application performance. How can I achieve scalability without sacrificing much performance? What are the common techniques used in such cases? I don't have a good understanding of the WinXP OS caching behavior. Any good links explaining it would also be helpful.
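
    As a rough illustration of the windowed-mapping approach the post describes (a hypothetical C sketch, not the application's memory manager; the file name and window size are made up): map one bounded view at a time and unmap it before moving on, so only a fixed-size window of the file is mapped at any moment. This bounds what the process maps, though the standby file cache itself remains under the operating system's control.

        #include <windows.h>

        int main(void)
        {
            const DWORD viewSize = 64 * 1024 * 1024;   /* 64 MB window, a multiple of the allocation granularity */
            HANDLE hFile = CreateFileA("big.dat", GENERIC_READ, FILE_SHARE_READ,
                                       NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
            if (hFile == INVALID_HANDLE_VALUE) return 1;

            LARGE_INTEGER size;
            GetFileSizeEx(hFile, &size);
            HANDLE hMap = CreateFileMappingA(hFile, NULL, PAGE_READONLY, 0, 0, NULL);
            if (hMap == NULL) { CloseHandle(hFile); return 1; }

            for (LONGLONG off = 0; off < size.QuadPart; off += viewSize) {
                DWORD len = (DWORD)((size.QuadPart - off < viewSize) ? (size.QuadPart - off) : viewSize);
                const char *p = (const char *)MapViewOfFile(hMap, FILE_MAP_READ,
                                                            (DWORD)(off >> 32), (DWORD)off, len);
                if (p == NULL) break;
                /* ... process p[0..len) ... */
                UnmapViewOfFile(p);            /* release this window before mapping the next */
            }
            CloseHandle(hMap);
            CloseHandle(hFile);
            return 0;
        }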

    Read the article

  • Best directory to store application data with read/write rights for all users?

    - by Wodzu
    Hi guys. Until Windows Vista I was saving my application data into the directory where the program was located. The most common place was "C:\Program Files\MyApplication". As we know, under Vista and later the common user doesn't have rights to write under the "Program Files" folder. So my first idea was to save the application data under the "All Users\Application Data" folder. But it seems that this folder has writing restrictions too! So to sum up, my requirements are: the folder should exist on Windows XP and later Microsoft systems; all users of the system should have read/write/create rights to this folder, its subfolders, and its files; and I want to have only one copy of the file(s) for all users. Thanks for your time.
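
    A minimal Win32 C sketch of the usual answer (the application subfolder name is made up): CSIDL_COMMON_APPDATA resolves to the machine-wide application data folder on XP and later ("All Users\Application Data" on XP, "C:\ProgramData" on Vista and later). Note that files created there by one user are not writable by other users by default, so an installer typically creates the subfolder once and loosens its ACL.

        #include <windows.h>
        #include <shlobj.h>
        #include <string.h>
        #include <stdio.h>

        /* link with shell32.lib */
        int main(void)
        {
            char path[MAX_PATH];

            /* machine-wide application data folder, available on XP and later */
            if (SHGetFolderPathA(NULL, CSIDL_COMMON_APPDATA, NULL, SHGFP_TYPE_CURRENT, path) != S_OK)
                return 1;

            strcat(path, "\\MyApplication");       /* hypothetical application subfolder */
            CreateDirectoryA(path, NULL);          /* harmless if it already exists */
            printf("shared data folder: %s\n", path);
            return 0;
        }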

    Read the article

  • fopen() fails to open stream: permission denied, yet permissions should be valid

    - by about blank
    So, I have this error: Warning: fopen(/path/to/test-in.txt) [function.fopen]: failed to open stream: Permission denied. Performing ls -l in the directory where test-in.txt is produces the following output: -rw-r--r-- 1 $USER $USER 1921 Sep 6 20:09 test-in.txt -rw-r--r-- 1 $USER $USER 0 Sep 6 20:08 test-out.txt In order to get past this, I decided to perform the following: chgrp -R www-data /path/to/php/webroot And then did: chmod g+rw /path/to/php/webroot Yet, I still get this error when I run my PHP 5 script to open the file. Why is this happening? I've tried this using LAMP as well as Cherokee through CGI, so the server can't be the problem. Is there a solution of some sort? Edit: I'll also add that I'm just developing via localhost right now. Update - PHP fopen() line: $fullpath = $this->fileRoot . $this->fileInData['fileName']; $file_ptr = fopen( $fullpath, 'r+' ); I should also mention I'd like to stick with Cherokee if possible. What's the deal with setting file permissions for Apache/Cherokee?

    Read the article

  • unbuffered I/O in Linux

    - by stuck
    I'm writing lots and lots of data that will not be read again for weeks. As my program runs, the amount of free memory on the machine (displayed with 'free' or 'top') drops very quickly, while the amount of memory my app uses does not increase - neither does the amount of memory used by other processes. This leads me to believe the memory is being consumed by the filesystem's cache. Since I do not intend to read this data for a long time, I'm hoping to bypass the system's buffers so that my data is written directly to disk. I don't have dreams of improving performance or being a super ninja; my hope is to give a hint to the filesystem that I'm not going to be coming back for this memory any time soon, so don't spend time optimizing for those cases. On Windows I've faced similar problems and fixed them using FILE_FLAG_NO_BUFFERING|FILE_FLAG_WRITE_THROUGH - the machine's memory was not consumed by my app and the machine was more usable in general. I'm hoping to duplicate the improvements I've seen, but on Linux. On Windows there is the restriction of writing in sector-sized pieces; I'm happy with this restriction for the amount of gain I've measured. Is there a similar way to do this in Linux?
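
    A minimal Linux C sketch of the "hint" variant (the file name and sizes are made up): write normally, then use posix_fadvise with POSIX_FADV_DONTNEED to tell the kernel the written pages will not be needed again, so it can drop them from the page cache. The closest direct analogue of FILE_FLAG_NO_BUFFERING is opening with O_DIRECT, which carries alignment restrictions similar to the Windows sector-size rule.

        #include <fcntl.h>
        #include <unistd.h>
        #include <string.h>

        int main(void)
        {
            int fd = open("bulk.out", O_WRONLY | O_CREAT | O_TRUNC, 0644);
            if (fd < 0)
                return 1;

            char buf[4096];
            memset(buf, 'x', sizeof buf);
            for (int i = 0; i < 1024; i++)
                if (write(fd, buf, sizeof buf) < 0)   /* ... lots and lots of data ... */
                    break;

            fsync(fd);                                        /* get the pages written out first */
            posix_fadvise(fd, 0, 0, POSIX_FADV_DONTNEED);     /* length 0 = to end of file */
            close(fd);
            return 0;
        }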

    Read the article

  • Reading in data from a file into an array

    - by Sam
    If I have an options file along the lines of this: size = 4 data = 1100010100110010 And I have a 2D size * size array that I want to populate with the values in data, what's the best way of doing it? To clarify, for the example above I'd want an array like this: int[4][4] array = {{1,1,0,0}, {0,1,0,1}, {0,0,1,1}, {0,0,1,0}}. (Not real code, but you get the idea.) Size can really be any number though. I'm thinking I'd have to read in the size, malloc an array, and then maybe read in a string full of data, then loop through each char in the data, cast it to an int and stick it in the appropriate index? But I really have no idea how to go about it, and I have been searching for a while with no luck. Any help would be cool! :)
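
    A minimal C sketch of exactly that plan, assuming the options file looks like the example (the file name options.txt is made up): read the size, allocate size rows of size ints, then read the data digits one character at a time.

        #include <stdio.h>
        #include <stdlib.h>

        int main(void)
        {
            FILE *f = fopen("options.txt", "r");    /* hypothetical file name */
            if (f == NULL) return 1;

            int size;
            if (fscanf(f, " size = %d", &size) != 1) return 1;
            if (fscanf(f, " data = ") == EOF) return 1;   /* skip the "data = " prefix */

            int **grid = malloc(size * sizeof *grid);     /* error handling elided */
            for (int r = 0; r < size; r++) {
                grid[r] = malloc(size * sizeof **grid);
                for (int c = 0; c < size; c++) {
                    int ch = fgetc(f);
                    grid[r][c] = ch - '0';                /* '0'/'1' character -> 0/1 integer */
                }
            }
            fclose(f);

            for (int r = 0; r < size; r++) {              /* print to verify */
                for (int c = 0; c < size; c++) printf("%d", grid[r][c]);
                printf("\n");
            }
            for (int r = 0; r < size; r++) free(grid[r]);
            free(grid);
            return 0;
        }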

    Read the article

  • Read a file with 2048 bytes

    - by Suresh S
    Guys, I have a file which has only one line. The file has no special encoding; it is a simple text file with a single line. For every 2048 bytes in the line, there are 13 records of 151 bytes each (13*151 = 1963 bytes of records + 85 bytes of empty space), and similarly for the next 2048 bytes. What is the best file I/O to use? I am thinking of reading 2048 bytes from the file and storing them in an array: while (offset < fileLength &&(numRead=in.read(recordChunks, offset,alength)) >= 0) { } How can I get only 2048 bytes at a time from the read statement? I am getting an IndexOutOfBoundsException.

    Read the article

  • Reading a simple Avro file from HDFS

    - by John Galt... who
    I am trying to do a simple read of an Avro file stored in HDFS. I found out how to read it when it is on the local file system.... FileReader reader = DataFileReader.openReader(new File(filename), new GenericDatumReader()); for (GenericRecord datum : fileReader) { String value = datum.get(1).toString(); System.out.println("value = " + value); } reader.close(); My file is in HDFS, however. I cannot give the openReader a Path or an FSDataInputStream. How can I simply read an Avro file in HDFS? EDIT: I got this to work by creating a custom class (SeekableHadoopInput) that implements SeekableInput. I "stole" this from "Ganglion" on GitHub. Still, it seems like there should be a Hadoop/Avro integration path for this. Thanks

    Read the article

  • How to stop SocketIOClient from reconnecting on Android?

    - by erginduran
    My problem is reconnection. I call SocketIOClient.connect(..) in a background service. I close the service when the internet connection goes off, and I re-start the service when the connection is back on. How do I turn off this reconnection? I don't want SocketIOClient to reconnect. This is my code: ConnectCallback mConnectCallback = new ConnectCallback() { @Override public void onConnectCompleted(Exception ex, SocketIOClient client) { if (ex != null) { ex.printStackTrace(); return; } client.setReconnectCallback(new ReconnectCallback() { @Override public void onReconnect() { // TODO Auto-generated method stub } }); client.setDisconnectCallback(new DisconnectCallback() { @Override public void onDisconnect(Exception arg0) { // TODO Auto-generated method stub } }); client.setErrorCallback(new ErrorCallback() { @Override public void onError(String arg0) { // TODO Auto-generated method stub } }); client.on("event", new EventCallback() { @Override public void onEvent(JSONArray jsonArray, Acknowledge acknowledge) { ///bla bla } }); ScreenChat.mClient = client; } };

    Read the article

  • Calling a class in Java after editing the file used as a source for a table

    - by user2892290
    I'm currently working on a project, I'll try to subrscibe first. I save data into text file, that I use as a source for browser of that data. The browser is based on table that contains the data. I have to rewrite the source file everytime I delete or edit data. That's where the problem comes in. After deleting or editing data I call a method to create the table again, but the table never creates. Is it possibly made by editing the file and calling the method right after that? If I restart my app the table is successfully created with right data. Take in note that I don't get any error message. This is the method I use for loading data from source file: try (BufferedReader input1 = new BufferedReader(new FileReader("./src/data.src"))) { int lines = 0; while (input1.read() != -1) { if (!(input1.readLine()).equals("")) { lines++; } } input1.close(); if (lines == 0) { JOptionPane.showMessageDialog(null, "No data to load, create a note first!"); new Writer().build(frame); } else { try (BufferedReader input = new BufferedReader(new FileReader("./src/data.src"))) { Game[] g = new Game[lines]; String currentLine; String[] help; int counter = 0; while (lines > 0) { currentLine = input.readLine(); help = currentLine.split("#"); g[counter] = new Game(help[0],help[1], help[2], help[3], help[4], help[5], help[6], help[7], help[8], help[9]); counter++; lines--; } input.close(); final JButton bButton = new backButton().create(frame, mPanel); build(g, frame, bButton); mPanel.add(panel); mPanel.add(panel2); mPanel.add(searchPanel); mPanel.add(bButton); bButton.addActionListener(new ActionListener() { @Override public void actionPerformed(ActionEvent e) { frame.setCursor(Cursor.getPredefinedCursor(Cursor.WAIT_CURSOR)); panel.removeAll(); frame.setCursor(Cursor.getDefaultCursor()); } }); mPanel.setPreferredSize(new Dimension(1000, 750)); panel.setBorder(new EmptyBorder(10, 10, 10, 10)); frame.setLayout(new FlowLayout()); frame.add(mPanel); frame.pack(); JMenuBar menuBar = new Menu().create(frame, mPanel); frame.setJMenuBar(menuBar); frame.setVisible(true); Rectangle rec = GraphicsEnvironment.getLocalGraphicsEnvironment().getMaximumWindowBounds(); int width = (int) rec.getWidth(); int height = (int) rec.getHeight(); frame.setBounds(1, 3, width, height); frame.addComponentListener(new ComponentAdapter() { @Override public void componentMoved(ComponentEvent e) { frame.setLocation(1, 3); } }); And this is the method I use for creating the table: String[][] tableData = new String[g.length][9]; for (int i = 0; i < tableData.length; i++) { tableData[i][0] = g[i].getChampion(); tableData[i][1] = g[i].getRole(); tableData[i][2] = g[i].getEnemy(); tableData[i][3] = g[i].getDifficulty(); tableData[i][4] = g[i].getResult(); tableData[i][5] = g[i].getScore(); tableData[i][6] = g[i].getGameType(); tableData[i][7] = g[i].getPoints(); tableData[i][8] = g[i].getLeague(); } final JLabel searchLabel = new JLabel("Search for champion played."); final JButton searchButton = new JButton("Search"); final JTextField searchText = new JTextField(20); frame.setTitle("LoL Notepad - reading your notes"); JTable table = new JTable(tableData, columnNames); final JScrollPane scrollPane = new JScrollPane(table); scrollPane.setPreferredSize(new Dimension(980, 500)); panel2.setPreferredSize(new Dimension(1000, 550)); panel2.setVisible(false); panel2.setBorder(new EmptyBorder(10, 10, 10, 10)); panel3.setVisible(false); panel.setLayout(new FlowLayout()); panel.add(scrollPane); searchPanel.add(searchLabel); searchPanel.add(searchText); 
searchPanel.add(searchButton); searchButton.addActionListener(new ActionListener() { @Override public void actionPerformed(ActionEvent e) { try { frame.setCursor(Cursor.getPredefinedCursor(Cursor.WAIT_CURSOR)); search(g, searchText.getText(), frame, bButton); frame.setCursor(Cursor.getDefaultCursor()); } catch (IOException ex) { Logger.getLogger(Reader.class.getName()).log(Level.SEVERE, null, ex); } } }); table.addMouseListener(new MouseAdapter() { @Override public void mousePressed(MouseEvent e) { if (e.getClickCount() == 1) { JTable target = (JTable) e.getSource(); panel.setVisible(false); searchPanel.setVisible(false); bButton.setVisible(false); int row = target.getSelectedRow(); specific(row, g, frame, bButton); } } });

    Read the article

  • How to create custom filenames in C?

    - by eSKay
    Please see this piece of code: #include<stdio.h> #include<string.h> #include<stdlib.h> int main() { int i = 0; FILE *fp; for(i = 0; i < 100; i++) { fp = fopen("/*what should go here??*/","w"); //I need to create files with names: file0.txt, file1.txt, file2.txt etc //i.e. file{i}.txt } }
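
    A minimal sketch of the usual answer (the line written into each file is made up): format the name into a char buffer with snprintf, then hand that buffer to fopen.

        #include <stdio.h>

        int main(void)
        {
            char filename[32];
            FILE *fp;

            for (int i = 0; i < 100; i++) {
                /* build "file0.txt", "file1.txt", ... into a buffer, then open it */
                snprintf(filename, sizeof filename, "file%d.txt", i);
                fp = fopen(filename, "w");
                if (fp == NULL) continue;
                fprintf(fp, "this is file number %d\n", i);
                fclose(fp);
            }
            return 0;
        }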

    Read the article

  • File Operations in Android NDK

    - by EnderX
    I am using the Android NDK to make an application primarily in C for performance reasons, but it appears that file operations such as fopen do not work correctly in Android. Whenever I try to use these functions, the application crashes. How do I create/write to a file with the Android NDK?
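
    fopen itself is available under the NDK; the usual catch is that the process may only write inside directories it owns. A hedged sketch (the helper name and log file name are made up): pass the app's internal files directory - Context.getFilesDir() on the Java side - down to native code and build paths beneath it, e.g. /data/data/<package>/files/log.txt.

        #include <stdio.h>

        /* Hypothetical helper: files_dir is the app's internal files directory,
           obtained on the Java side and passed to native code via JNI. */
        int write_log(const char *files_dir)
        {
            char path[512];
            snprintf(path, sizeof path, "%s/log.txt", files_dir);

            FILE *fp = fopen(path, "w");
            if (fp == NULL)
                return -1;                 /* e.g. the directory is not writable */
            fprintf(fp, "hello from native code\n");
            fclose(fp);
            return 0;
        }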

    Read the article

  • Using wildcards when listing directories in Python

    - by user248237
    How can I use wildcards like '*' when getting a list of files inside a directory in Python? For example, I want something like: os.listdir('foo/*bar*/*.txt') which would return a list of all the files ending in .txt in directories that have bar in their name inside the foo parent directory. How can I do this? Thanks.

    Read the article

  • What is a good way of coding a file processing program, which accepts multisource data in Java

    - by jjepsuomi
    I'm making a data processing system, which currently uses CSV data as its input and output format. In the future I might want to add support for, for example, database, XML, etc. input and output formats. How should I design my program so that it would be easy to add support for new types of data sources? Should I simply make, for example, an abstract data class (which would contain the basic file processing methods) and then inherit from this class for the database, XML, etc. cases? Hope my question is clear =) In other words, my question is: "How do I design a file processing system which can be easily updated to accept input data from different sources (database, XML, Excel, etc.)?".

    Read the article

  • How do I open a file in such a way that if the file doesn't exist it will be created and opened automatically?

    - by snakile
    Here's how I open a file for writing+ : if( fopen_s( &f, fileName, "w+" ) !=0 ) { printf("Open file failed\n"); return; } fprintf_s(f, "content"); If the file doesn't exist the open operation fails. What's the right way to fopen if I want to create the file automatically if the file doesn't already exist? EDIT: If the file does exist, I would like fprintf to overwrite the file, not to append to it.
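
    For reference, a minimal sketch using the same MSVC-style secure CRT calls as the question (output.txt is a made-up name): mode "w+" is documented to create the file when it is missing and truncate it when it exists, which matches "create automatically, overwrite rather than append" ("a+" would also create it, but writes would always go to the end). If an open in "w+" mode still fails, the cause is usually the path or permissions rather than the mode string.

        #include <stdio.h>

        int main(void)
        {
            FILE *f;

            /* "w+" creates the file if it does not exist and truncates it if it does */
            if (fopen_s(&f, "output.txt", "w+") != 0) {
                printf("Open file failed\n");
                return 1;
            }
            fprintf_s(f, "content");
            fclose(f);
            return 0;
        }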

    Read the article

  • Unable to write to a text file

    - by chrissygormley
    Hello, I am running some tests and need to write to a file. When I run the tests, open(file, 'r+') does not write to the file. The test script is below: class GetDetailsIP(TestGet): def runTest(self): self.category = ['PTZ'] try: # This run's and return's a value result = self.client.service.Get(self.category) mylogfile = open("test.txt", "r+") print >>mylogfile, result result = ("".join(mylogfile.readlines()[2])) result = str(result.split(':')[1].lstrip("//").split("/")[0]) mylogfile.close() except suds.WebFault, e: assert False except Exception, e: pass finally: if 'result' in locals(): self.assertEquals(result, self.camera_ip) else: assert False When this test runs, no value has been entered into the text file and a value is returned in the variable result. I have also tried mylogfile.write(result). If the file does not exist it claims the file does not exist and doesn't create one. Could this be a permission problem where Python is not allowed to create a file? I have made sure that all other reads of this file are closed, so the file should not be locked. Can anyone offer any suggestion as to why this is happening? Thanks

    Read the article

  • [NSData dataWithContentsOfFile:path] doesn't work

    - by Felics
    Hello, I have the following code to read a binary file: NSString* file = [NSString stringWithUTF8String:fileName]; NSString* filePath = resource ? [[NSBundle mainBundle] pathForResource:file ofType:nil] : [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0] stringByAppendingPathComponent: file]; NSData* fileData = [NSData dataWithContentsOfFile:filePath]; Here "fileName" and "resource" are parameters of the load function; "resource" indicates whether the file is located in the application bundle or in Documents. Sometimes this code works well and sometimes it doesn't. As far as I can see, the problem is random: I can run the code 10 times in a row and it works fine, and after that it gives me nil data without any modification. Does anybody know what the problem could be? Could it be related to the file extension or file name? Thank you. PS: I use this code on the iPhone Simulator and the file exists in the application bundle.

    Read the article

  • read chars from a file - c#

    - by Saskaaa
    How do I read an array of numbers from a file? I mean, how do I read chars from a file? Sorry for the bad English. upd: yes, I can :) the file is just: "1 2 3 4 5 6 7 8" etc. I just do not know how to read chars from a file.

    Read the article
