Search Results

Search found 6654 results on 267 pages for 'socket io'.


  • best practices question: How to save a collection of images and a Java object in a single file?

    - by Richard
    Hi all, I am making a Java program that has a collection of flash-card-like objects. I store the objects in a JTree composed of DefaultMutableTreeNodes. Each node has a user object attached to it with a few string/native data type parameters. However, I also want each of these objects to have an image (typical formats: jpg, png, etc.). I would like to be able to store all of this information, including the images and the tree data, to disk in a single file so the file can be transferred between users and the entire tree, including the images and parameters for each object, can be reconstructed. I had not approached a problem like this before, so I was not sure what the best practices were. I found XMLEncoder (http://java.sun.com/j2se/1.4.2/docs/api/java/beans/XMLEncoder.html) to be a very effective way of storing my tree and the native data type information. However, I couldn't figure out how to save the image data itself inside the XML file, and I'm not sure it is possible since the data is binary (so restricted characters would be invalid). My next thought was to associate a hash string instead of an image with each user object, and then gzip together all of the images, with the hash strings as the names, and the XML-encoded tree in the same compressed file. That seemed really contrived though. Does anyone know a good approach for this type of issue? Thanks!
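
    A minimal sketch of the "bundle everything in one compressed file" idea described above (class and entry names here are hypothetical, not from the article): the XMLEncoder output goes into one zip entry and each image into its own entry, keyed by its hash, so a single file carries the whole tree.

        import java.io.*;
        import java.util.Map;
        import java.util.zip.*;

        public class CardBundleWriter {
            // Write the XML-encoded tree plus all images into a single zip file.
            public static void write(File out, byte[] treeXml, Map<String, byte[]> imagesByHash) throws IOException {
                try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(out))) {
                    zip.putNextEntry(new ZipEntry("tree.xml"));      // XMLEncoder output
                    zip.write(treeXml);
                    zip.closeEntry();
                    for (Map.Entry<String, byte[]> e : imagesByHash.entrySet()) {
                        zip.putNextEntry(new ZipEntry("images/" + e.getKey())); // image bytes, named by hash
                        zip.write(e.getValue());
                        zip.closeEntry();
                    }
                }
            }
        }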

    Read the article

  • Read a file with 2048 bytes

    - by Suresh S
    Guys, I have a file which has only one line. The file has no encoding; it is a simple text file with a single line. For every 2048 bytes in the line there is a new record of 151 bytes (in total 13 × 151 bytes = 1963 bytes of records + 85 bytes of empty space), and similarly for the next 2048 bytes. What is the best file I/O approach to use? I am thinking of reading 2048 bytes from the file and storing them in an array:
        while (offset < fileLength && (numRead = in.read(recordChunks, offset, alength)) >= 0) {
        }
    How can I get only 2048 bytes at a time from the read statement? I am getting an IndexOutOfBoundsException.
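
    A sketch of one way to do this (not the asker's code; the file name is made up): read in fixed 2048-byte blocks, looping because a single read() call may return fewer bytes than requested.

        import java.io.*;

        public class BlockReader {
            static final int BLOCK = 2048;

            public static void main(String[] args) throws IOException {
                byte[] block = new byte[BLOCK];
                try (InputStream in = new BufferedInputStream(new FileInputStream("records.dat"))) {
                    int filled;
                    while ((filled = readBlock(in, block)) > 0) {
                        // process 'filled' bytes: 13 records of 151 bytes, then padding
                    }
                }
            }

            // Fill 'buf' completely unless end of file is reached; return the number of bytes read.
            private static int readBlock(InputStream in, byte[] buf) throws IOException {
                int off = 0;
                while (off < buf.length) {
                    int n = in.read(buf, off, buf.length - off);
                    if (n < 0) break;   // end of file
                    off += n;
                }
                return off;             // may be less than BLOCK at end of file
            }
        }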

    Read the article

  • read chars from a file - c#

    - by Saskaaa
    How do I read an array of numbers from a file? I mean, how do I read chars from a file? Sorry for the bad English. Update: yes, I can :) The file is just "1 2 3 4 5 6 7 8" and so on. I just do not know how to read chars from a file.
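
    A minimal sketch in C#, assuming the file holds whitespace-separated integers such as "1 2 3 4 5 6 7 8" (the file name is made up):

        using System;
        using System.IO;
        using System.Linq;

        class ReadNumbers
        {
            static void Main()
            {
                string text = File.ReadAllText("numbers.txt");
                int[] numbers = text
                    .Split((char[])null, StringSplitOptions.RemoveEmptyEntries) // split on any whitespace
                    .Select(int.Parse)
                    .ToArray();
                Console.WriteLine(string.Join(", ", numbers));
            }
        }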

    Read the article

  • StreamReader not working as expected

    - by Jon Preece
    Hi, I have written a simple utility that loops through all C# files in my project and updates the copyright text at the top. For example, a file may look like this:
        //Copyright My Company, © 2009-2010
    The program should update the text to look like this:
        //Copyright My Company, © 2009-2011
    However, the code I have written results in this:
        //Copyright My Company, � 2009-2011
    Here is the code I am using:
        public bool ModifyFile(string filePath, List<string> targetText, string replacementText)
        {
            if (!File.Exists(filePath)) return false;
            if (targetText == null || targetText.Count == 0) return false;
            if (string.IsNullOrEmpty(replacementText)) return false;

            string modifiedFileContent = string.Empty;
            bool hasContentChanged = false;

            //Read in the file content
            using (StreamReader reader = File.OpenText(filePath))
            {
                string file = reader.ReadToEnd();

                //Replace any target text with the replacement text
                foreach (string text in targetText)
                    modifiedFileContent = file.Replace(text, replacementText);

                if (!file.Equals(modifiedFileContent))
                    hasContentChanged = true;
            }

            //If we haven't modified the file, don't bother saving it
            if (!hasContentChanged) return false;

            //Write the modifications back to the file
            using (StreamWriter writer = new StreamWriter(filePath))
            {
                writer.Write(modifiedFileContent);
            }

            return true;
        }
    Any help/suggestions are appreciated. Thanks!
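
    A sketch of one likely fix (an assumption, not the article's accepted answer): read and write the file with an explicit encoding so the © character survives the round trip; the path and replacement strings below are only placeholders.

        using System.IO;
        using System.Text;

        class CopyrightUpdater
        {
            static void Main()
            {
                string path = "Example.cs";
                Encoding enc = Encoding.UTF8;              // or whatever encoding the source files actually use
                string content = File.ReadAllText(path, enc);
                string updated = content.Replace("2009-2010", "2009-2011");
                if (updated != content)
                    File.WriteAllText(path, updated, enc); // write back with the same encoding
            }
        }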

    Read the article

  • File Operations in Android NDK

    - by EnderX
    I am using the Android NDK to make an application primarily in C for performance reasons, but it appears that file operations such as fopen do not work correctly in Android. Whenever I try to use these functions, the application crashes. How do I create/write to a file with the Android NDK?
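
    A sketch, not the article's answer: fopen() itself is available in the NDK, but the path must point somewhere the app is allowed to write, such as the directory returned by Context.getFilesDir() and passed down through JNI. The parameter name below is an assumption.

        #include <stdio.h>

        /* files_dir is assumed to be the app's internal files directory, passed in from Java */
        int write_log(const char *files_dir)
        {
            char path[512];
            snprintf(path, sizeof(path), "%s/app_log.txt", files_dir);

            FILE *fp = fopen(path, "w");   /* create/truncate for writing */
            if (fp == NULL)
                return -1;                 /* bad path or no permission   */

            fputs("hello from native code\n", fp);
            fclose(fp);
            return 0;
        }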

    Read the article

  • FileNotFoundException

    - by Pratik
    I am trying to read a file in a servlet. I am using the Eclipse IDE. I get a FileNotFoundException if I provide a relative file name:
        List<String> ls = new ArrayList<String>();
        Scanner input = new Scanner(new File("Input.txt"));
        while (input.hasNextLine()) {
            ls.add(input.nextLine());
        }
    The same code works if I put the absolute path like this:
        Scanner input = new Scanner(new File("F:/Spring and other stuff/AjaxDemo/src/com/pdd/ajax/Input.txt"));
    The Java file and the text file are in the same folder. Does it search for the text file in some other folder?
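
    A sketch of one common fix (an assumption, not the article's answer): load the file through the class loader, because new File("Input.txt") is resolved against the servlet container's working directory rather than the folder holding the source file.

        import java.io.InputStream;
        import java.util.ArrayList;
        import java.util.List;
        import java.util.Scanner;

        public class InputLoader {
            public List<String> readLines() {
                List<String> ls = new ArrayList<String>();
                // Input.txt must be on the classpath, in the same package as this class
                InputStream in = InputLoader.class.getResourceAsStream("Input.txt");
                try (Scanner input = new Scanner(in)) {
                    while (input.hasNextLine()) {
                        ls.add(input.nextLine());
                    }
                }
                return ls;
            }
        }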

    Read the article

  • Best directory to store application data with read/write rights for all users?

    - by Wodzu
    Hi guys. Until Windows Vista I was saving my application data into the directory where the program was located, most commonly "C:\Program Files\MyApplication". As we know, under Vista and later a standard user doesn't have rights to write under the "Program Files" folder. So my first idea was to save the application data under the "All Users\Application Data" folder, but it seems that this folder has writing restrictions too! So to sum up, my requirements are: the folder should exist under Windows XP and later Microsoft systems; all users of the system should have read/write/create rights to this folder, its subfolders and files; and I want to have only one copy of the file(s) for all users. Thanks for your time.
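
    A sketch of one direction (an assumption, not the article's answer): the common application data folder (CSIDL_COMMON_APPDATA) maps to "All Users\Application Data" on XP and "C:\ProgramData" on Vista and later. Note that by default only the creator of a file there may modify it, so an installer typically creates the application's subfolder and relaxes its ACL.

        #include <windows.h>
        #include <shlobj.h>
        #include <stdio.h>

        int main(void)
        {
            char path[MAX_PATH];
            /* resolve the machine-wide application data folder */
            if (SUCCEEDED(SHGetFolderPathA(NULL, CSIDL_COMMON_APPDATA, NULL, SHGFP_TYPE_CURRENT, path)))
            {
                printf("Shared application data folder: %s\n", path);
            }
            return 0;
        }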

    Read the article

  • How to get proper alignment when printing to file

    - by user1067334
    I have this structure, the elements of which I need to write to a text file:
        struct Stage3ADisplay {
            int nSlot;
            char *Item;
            char *Type;
            int nIndex;
            unsigned char attributesMD[17]; // the last character is \0
            unsigned char contentsMD[17];   // only for regular files - the last character is \0
        };

        buffer = malloc(sizeof(Stage3ADisplayVar[nIterator]->nSlot) +
                        sizeof(Stage3ADisplayVar[nIterator]->Item) +
                        sizeof(Stage3ADisplayVar[nIterator]->Type) +
                        sizeof(Stage3ADisplayVar[nIterator]->nIndex) +
                        sizeof(Stage3ADisplayVar[nIterator]->attributesMD) +
                        sizeof(Stage3ADisplayVar[nIterator]->contentsMD) + 1);

        sprintf(buffer, "%d %s %s %d %x %x",
                Stage3ADisplayVar[nIterator]->nSlot,
                Stage3ADisplayVar[nIterator]->Item,
                Stage3ADisplayVar[nIterator]->Type,
                Stage3ADisplayVar[nIterator]->nIndex,
                Stage3ADisplayVar[nIterator]->attributesMD,
                Stage3ADisplayVar[nIterator]->contentsMD);
    How do I make sure the rows in the file are properly aligned? Thank you.
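
    A sketch of one approach (the field widths here are made up): fixed-width printf conversions keep the columns aligned - '-' left-justifies and the number gives the minimum field width; the two MD fields are printed with %s since they are NUL-terminated.

        #include <stdio.h>

        struct Stage3ADisplay {
            int nSlot;
            const char *Item;
            const char *Type;
            int nIndex;
            unsigned char attributesMD[17];
            unsigned char contentsMD[17];
        };

        /* print one record as a fixed-width row */
        void print_row(FILE *out, const struct Stage3ADisplay *d)
        {
            fprintf(out, "%4d %-20s %-10s %6d %-16s %-16s\n",
                    d->nSlot, d->Item, d->Type, d->nIndex,
                    (const char *)d->attributesMD,
                    (const char *)d->contentsMD);
        }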

    Read the article

  • How can chunks be allocated in a node.js stream in object mode all at once?

    - by Quentin Engles
    I can see how buffers and strings can be sent as chunks, but I'm having a problem thinking about how streams should be dealt with when working in object mode. Say I have a byte stream from an HTTP request message. I want to take that message, parse it, and then transform it into one big object. I already know how to parse the message. What I'm wondering is: if the message is big, so it has many chunks, but I want to make one object for the output, how can I make sure the data event waits for the whole thing? Is this just a matter of not using the push method until the chunked data has finished being sent? That would then restrict the stream's data output to a smaller object, which I think I'm fine with for now. As an added condition, the larger data will be reduced in size after the transform.
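
    A sketch of the "buffer until the end" idea (an assumption, not the article's answer): a Transform in object mode that collects every incoming chunk and only calls push() once, from _flush(), after the source has ended.

        const { Transform } = require('stream');

        class Collect extends Transform {
          constructor() {
            super({ readableObjectMode: true }); // input: bytes/strings, output: one object
            this.chunks = [];
          }
          _transform(chunk, encoding, callback) {
            this.chunks.push(chunk);             // hold on to the data, emit nothing yet
            callback();
          }
          _flush(callback) {
            const body = Buffer.concat(this.chunks).toString();
            this.push({ body });                 // message parsing would happen here
            callback();
          }
        }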

    Read the article

  • Unable to write to a text file

    - by chrissygormley
    Hello, I am running some tests and need to write to a file. When I run the tests, open(file, 'r+') does not write to the file. The test script is below:
        class GetDetailsIP(TestGet):
            def runTest(self):
                self.category = ['PTZ']
                try:
                    # This runs and returns a value
                    result = self.client.service.Get(self.category)
                    mylogfile = open("test.txt", "r+")
                    print >>mylogfile, result
                    result = ("".join(mylogfile.readlines()[2]))
                    result = str(result.split(':')[1].lstrip("//").split("/")[0])
                    mylogfile.close()
                except suds.WebFault, e:
                    assert False
                except Exception, e:
                    pass
                finally:
                    if 'result' in locals():
                        self.assertEquals(result, self.camera_ip)
                    else:
                        assert False
    When this test runs, no value has been written to the text file, and a value is returned in the variable result. I have also tried mylogfile.write(result). If the file does not exist, it claims the file does not exist and doesn't create one. Could this be a permission problem where Python is not allowed to create a file? I have made sure that all other reads of this file are closed, so the file should not be locked. Can anyone offer any suggestion why this is happening? Thanks

    Read the article

  • [NSData dataWithContentsOfFile:path] doesn't work

    - by Felics
    Hello, I have the following code to read a binary file:
        NSString* file = [NSString stringWithUTF8String:fileName];
        NSString* filePath = resource ? [[NSBundle mainBundle] pathForResource:file ofType:nil]
                                      : [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0] stringByAppendingPathComponent:file];
        NSData* fileData = [NSData dataWithContentsOfFile:filePath];
    "fileName" and "resource" are parameters of the load function. "resource" is used to indicate whether the file is located in the application bundle or in Documents. Sometimes this code works well and sometimes it doesn't. As far as I can see, the problem is random: I can run the code 10 times in a row and it works fine, and after that it gives me nil data without any modification. Does anybody know what the problem could be? Could it be related to the file extension or file name? Thank you. PS: I use this code on the iPhone Simulator and the file exists in the application bundle.

    Read the article

  • Python f.write() at beginning of file?

    - by kristus
    I'm doing it like this now, but I want it to write at the beginning of the file instead:
        f = open('out.txt', 'a') # or 'w'?
        f.write("string 1")
        f.write("string 2")
        f.write("string 3")
        f.close()
    so that the contents of out.txt will be:
        string 3
        string 2
        string 1
    and not (like this code does):
        string 1
        string 2
        string 3
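
    A minimal sketch (not the asker's code): a file cannot be prepended to in place, so read whatever is already there, then rewrite the file with the new text in front.

        import os

        def prepend(path, text):
            old = ''
            if os.path.exists(path):
                with open(path, 'r') as f:
                    old = f.read()
            with open(path, 'w') as f:      # rewrite the whole file
                f.write(text + old)

        prepend('out.txt', "string 1\n")
        prepend('out.txt', "string 2\n")
        prepend('out.txt', "string 3\n")    # out.txt now reads: string 3, string 2, string 1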

    Read the article

  • Python text file processing speed issues

    - by Anonymouslemming
    Hi all, I'm having a problem with processing a largish file in Python. All I'm doing is:
        f = gzip.open(pathToLog, 'r')
        for line in f:
            counter = counter + 1
            if (counter % 1000000 == 0):
                print counter
        f.close
    This takes around 10m25s just to open the file, read the lines and increment the counter. In Perl, dealing with the same file and doing quite a bit more (some regular expression stuff), the whole process takes around 1m17s. Perl code:
        open(LOG, "/bin/zcat $logfile |") or die "Cannot read $logfile: $!\n";
        while (<LOG>) {
            if (m/.*\[svc-\w+\].*login result: Successful\.$/) {
                $_ =~ s/some regex here/$1,$2,$3,$4/;
                push @an_array, $_
            }
        }
        close LOG;
    Can anyone advise what I can do to make the Python solution run at a similar speed to the Perl solution? I've tried just uncompressing the file and dealing with it using open instead of gzip.open, but that made a very small difference to the overall time.
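
    One thing worth trying (an assumption, not the article's fix): mimic the Perl version by letting an external zcat do the decompression and reading its output through a pipe, instead of decompressing in-process with the gzip module.

        import subprocess

        counter = 0
        proc = subprocess.Popen(['zcat', pathToLog], stdout=subprocess.PIPE)
        for line in proc.stdout:            # read the decompressed stream line by line
            counter += 1
            if counter % 1000000 == 0:
                print(counter)
        proc.stdout.close()
        proc.wait()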

    Read the article

  • Scanner's Read Line returning NoSuchElementException

    - by Brian
    This is my first time using Stack Overflow. I am trying to read a text file which consists of a single number on the first line:
        try {
            Scanner s = new Scanner(new File("HighScores.txt"));
            int temp = Integer.parseInt(s.nextLine());
            s.close();
            return temp;
        } catch (FileNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    However, I get an error:
        java.util.NoSuchElementException: No line found
            at java.util.Scanner.nextLine(Unknown Source)
            at GameStart.getHighScore(GameStart.java:334)
            at GameStart.init(GameStart.java:82)
            at sun.applet.AppletPanel.run(Unknown Source)
            at java.lang.Thread.run(Unknown Source)
    I know that HighScores.txt is not empty, so why is this problem occurring? I also tried using BufferedReader, and BufferedReader.readLine() returned null.

    Read the article

  • How do I open a file in such a way that if the file doesn't exist it will be created and opened automatically?

    - by snakile
    Here's how I open a file for "w+" access:
        if (fopen_s(&f, fileName, "w+") != 0)
        {
            printf("Open file failed\n");
            return;
        }
        fprintf_s(f, "content");
    If the file doesn't exist, the open operation fails. What's the right way to fopen if I want the file to be created automatically when it doesn't already exist? EDIT: If the file does exist, I would like fprintf to overwrite the file, not append to it.
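
    For reference, a small sketch of the standard mode behaviour (an assumption about the cause, not the article's answer): "w+" creates the file when it does not exist and truncates it when it does, which matches the overwrite requirement in the edit, whereas "r+" is the mode that fails on a missing file. The path below is made up.

        #include <stdio.h>

        int main(void)
        {
            FILE *f = NULL;
            /* "w+" = read/write, create if missing, truncate if present */
            if (fopen_s(&f, "output.txt", "w+") != 0 || f == NULL)
            {
                printf("Open file failed\n");
                return 1;
            }
            fprintf_s(f, "content");
            fclose(f);
            return 0;
        }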

    Read the article

  • Read char from txt file in C++

    - by Jack in the Box
    I have a program that will read the number of rows and columns from a txt file. The program also has to read the contents of a 2D array from the same file. Here is the txt file:
        8 20
        * * *** ***
    8 and 20 are the number of rows and columns respectively. The spaces and asterisks are the contents of the array, Array[8][20]. For example, Array[0][1] = '*'. I managed to read 8 and 20 as follows:
        ifstream myFile;
        myFile.open("life.txt");
        if (!myFile)
        {
            cout << endl << "Failed to open file";
            return 1;
        }
        myFile >> rows >> cols;
        myFile.close();
        grid = new char*[rows];
        for (int i = 0; i < rows; i++)
        {
            grid[i] = new char[cols];
        }
    Now, how do I assign the spaces and the asterisks to the fields in the array? I hope you got the point.
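
    A sketch of one way to fill the grid (an assumption: the dimensions line is followed by 'rows' lines of up to 'cols' characters; the stream is kept open instead of being reopened).

        #include <fstream>
        #include <iostream>
        #include <limits>
        #include <string>
        using namespace std;

        int main()
        {
            ifstream myFile("life.txt");
            if (!myFile) { cout << "Failed to open file" << endl; return 1; }

            int rows, cols;
            myFile >> rows >> cols;
            myFile.ignore(numeric_limits<streamsize>::max(), '\n'); // skip the rest of the first line

            char **grid = new char*[rows];
            string line;
            for (int i = 0; i < rows; i++)
            {
                grid[i] = new char[cols];
                getline(myFile, line);                    // one row of spaces and asterisks
                for (int j = 0; j < cols; j++)
                    grid[i][j] = (j < (int)line.size()) ? line[j] : ' ';
            }

            cout << "grid[0][1] = '" << grid[0][1] << "'" << endl;
            return 0;
        }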

    Read the article

  • unbuffered I/O in Linux

    - by stuck
    I'm writing lots and lots of data that will not be read again for weeks - as my program runs, the amount of free memory on the machine (displayed with 'free' or 'top') drops very quickly, while the amount of memory my app uses does not increase - and neither does the amount of memory used by other processes. This leads me to believe the memory is being consumed by the filesystem's cache - since I do not intend to read this data for a long time, I'm hoping to bypass the system's buffers, such that my data is written directly to disk. I don't have dreams of improving performance or being a super ninja; my hope is to give a hint to the filesystem that I'm not going to be coming back for this memory any time soon, so don't spend time optimizing for those cases. On Windows I've faced similar problems and fixed them using FILE_FLAG_NO_BUFFERING|FILE_FLAG_WRITE_THROUGH - the machine's memory was not consumed by my app and the machine was more usable in general. I'm hoping to duplicate the improvements I've seen, but on Linux. On Windows there is the restriction of writing in sector-sized pieces; I'm happy with this restriction for the amount of gain I've measured. Is there a similar way to do this in Linux?
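
    A sketch of one Linux-side option (an assumption, not the article's answer): keep normal buffered writes, but tell the kernel the pages will not be needed again so the page cache can drop them; flushing first means the pages are clean and can actually be freed.

        #include <fcntl.h>
        #include <unistd.h>

        void write_and_drop(int fd, const void *buf, size_t len)
        {
            if (write(fd, buf, len) < 0)                      /* normal buffered write */
                return;
            fsync(fd);                                        /* make the pages clean */
            posix_fadvise(fd, 0, 0, POSIX_FADV_DONTNEED);     /* hint: evict this file's cached pages */
        }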

    Read the article

  • Windows equivalent of inb(), outb(), low level i/o

    - by Sebastian Dwornik
    I have some Linux code that monitors our hardware by collecting temperatures, voltages and fan speeds from the motherboard using the inb(), outb(), inl(), etc. low-level I/O functions. My challenge is to port that code over to run under Windows as a simple console app, but I am puzzled about which functions Win32 (or .NET) provides that would allow me to access direct memory-mapped ports. I don't want to write a system driver either. My Windows tool preference is VS2008 (fyi). Is this possible?

    Read the article

  • What is the fastest way to write hundreds of files to disk using C#?

    - by Ehsan
    My program should write hundreds of files to disk, received from external resources (the network). Each file is a simple document that I currently store with a GUID as its name in a specific folder, but creating, writing and closing hundreds of files is a lengthy process. Is there any better way to store this number of files on disk? I've come up with a solution, but I don't know if it is the best. First, I create two files: one of them acts like an allocation table and the second is a huge file storing all the content of my documents. But reading from this file would be a nightmare; maybe a memory-mapped file technique could help. Could working with 30 GB or more create a problem?

    Read the article

  • Saving MP3 playlist to file

    - by Northernen
    Hello. I am making my own crude MP3 player, and I now have a JList which I have populated with a number of files in the form of MP3 objects (displayed on the frame using a DefaultListModel). I would now like the opportunity to save this JList to a file on disk. How would I go about doing this? I'm very new to programming and Java, so help is greatly appreciated.

    Read the article

  • write cache and write sequence order

    - by excanoe
    OK, here I have a somewhat weird question: let's say we have some binary file (.log) and a sequence of write operations, for example log1, log2, log3, each with some block size n (raw data). Question: can I be sure that the log1, log2 and log3 sequences will be written in the correct order to ONE file, even if there are a few cache levels (disk hardware and OS level)? Update: I am very interested in what happens to the order of the records (not the records themselves) if there is a software or hardware failure (a reboot or some other reason). Update: there can be some percentage of write failures, but the main question is: will the write order stay correct?

    Read the article

  • How to load/save C++ class instance (using STL containers) to disk

    - by supert
    I have a C++ class representing a hierarchically organised data tree which is very large (~GB, basically as large as I can get away with in memory). It uses an STL list to store information at each node plus iterators to other nodes. Each node has only one parent, but 0-10 children. Abstracted, it looks something like this:
        struct node {
        public:
            node_list_iterator parent;              // iterator to a single parent node
            double node_data_array[X];
            map<int,node_list_iterator> children;   // iterators to child nodes
        };

        class strategy {
        private:
            list<node> tree;    // hierarchically linked list of nodes
            struct some_other_data;
        public:
            void build();       // build the tree
            void save();        // save the tree to disk
            void load();        // load the tree from disk
            void use();         // use the tree
        };
    I would like to implement load() and save() to disk, and it should be fairly fast. However, the obvious problems are: I don't know the size in advance; the data contains iterators, which are volatile; and my ignorance of C++ is prodigious. Could anyone suggest a pure C++ solution please?

    Read the article

  • using wild card when listing directories in python

    - by user248237
    How can I use wildcards like '*' when getting a list of files inside a directory in Python? For example, I want something like:
        os.listdir('foo/*bar*/*.txt')
    which would return a list of all the files ending in .txt in directories that have bar in their name inside of the foo parent directory. How can I do this? Thanks.
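
    A minimal sketch: the glob module in the standard library does exactly this kind of wildcard expansion.

        import glob

        matches = glob.glob('foo/*bar*/*.txt')
        print(matches)   # every .txt file in foo's subdirectories whose names contain 'bar'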

    Read the article
