Search Results

Search found 12367 results on 495 pages for 'disk io'.

Page 85/495 | < Previous Page | 81 82 83 84 85 86 87 88 89 90 91 92  | Next Page >

  • Execute an Application On The Server Using JavaScript

    - by Nathan Campos
    I have an application on my server called leaf.exe that takes two arguments, inputfile and outputfile, and is run like this example: pnote.exe input.pnt output.txt. The executable and the input file are in the same directory as my home page file. I need JavaScript to be able to run the application like that, so I want to know how I could do this.

    Read the article

  • Performance of fopen vs stat

    - by Alex Marshall
    Hello, I'm writing several C programs for an embedded system where every bit of performance we can squeeze out will matter. Part of that is accessing log files. When determining if a file exists, is there any performance difference between using open/fopen and stat? I've been using stat on the assumption that it only has to do a quick check against the file system, whereas fopen would have to actually gain access to a file and manipulate internal data structures before returning. Is there any merit to this?

    Read the article

  • Android USB Host Communication

    - by Kip Russell
    I'm working on a project that uses the USB host capabilities in Android 3.2. I'm suffering from a deplorable lack of knowledge and talent regarding USB/serial communication in general, and I'm also unable to find any good example code for what I need to do. I need to read from a USB communication device. For example, when I connect via PuTTY on my PC I enter:

        >GO

    and the device starts spewing out data for me (pitch/roll/temp/checksum), e.g.:

        $R1.217P-0.986T26.3*60
        $R1.217P-0.986T26.3*60
        $R1.217P-0.987T26.3*61
        $R1.217P-0.986T26.3*60
        $R1.217P-0.985T26.3*63

    I can send the initial 'GO' command from the Android device, at which time I receive an echo of 'GO'. Then nothing else on any subsequent reads. How can I: 1) send the 'GO' command, and 2) read the stream of data that results? The USB device I'm working with has the following interfaces (endpoints):

        Device Class: Communication Device (0x2)
        Interface #0  Class: Communication Device (0x2)
            Endpoint #0  Direction: Inbound (0x80)  Type: Interrupt (0x3)  Poll Interval: 255  Max Packet Size: 32  Attributes: 000000011
        Interface #1  Class: Communication Device Class (CDC) (0xa)
            Endpoint #0  Address: 129  Number: 1  Direction: Inbound (0x80)   Type: Bulk (0x2)  Poll Interval: 0  Max Packet Size: 32  Attributes: 000000010
            Endpoint #1  Address: 2    Number: 2  Direction: Outbound (0x0)   Type: Bulk (0x2)  Poll Interval: 0  Max Packet Size: 32  Attributes: 000000010

    I'm able to deal with permissions, connect to the device, find the correct interface and assign the endpoints. I'm just having trouble figuring out which technique to use to send the initial command and read the ensuing data. I've tried different combinations of bulkTransfer and controlTransfer with no luck. Thanks. I'm using interface #1, as seen below:

        public AcmDevice(UsbDeviceConnection usbDeviceConnection, UsbInterface usbInterface) {
            Preconditions.checkState(usbDeviceConnection.claimInterface(usbInterface, true));
            this.usbDeviceConnection = usbDeviceConnection;
            UsbEndpoint epOut = null;
            UsbEndpoint epIn = null;
            // look for our bulk endpoints
            for (int i = 0; i < usbInterface.getEndpointCount(); i++) {
                UsbEndpoint ep = usbInterface.getEndpoint(i);
                Log.d(TAG, "EP " + i + ": " + ep.getType());
                if (ep.getType() == UsbConstants.USB_ENDPOINT_XFER_BULK) {
                    if (ep.getDirection() == UsbConstants.USB_DIR_OUT) {
                        epOut = ep;
                    } else if (ep.getDirection() == UsbConstants.USB_DIR_IN) {
                        epIn = ep;
                    }
                }
            }
            if (epOut == null || epIn == null) {
                throw new IllegalArgumentException("Not all endpoints found.");
            }
            AcmReader acmReader = new AcmReader(usbDeviceConnection, epIn);
            AcmWriter acmWriter = new AcmWriter(usbDeviceConnection, epOut);
            reader = new BufferedReader(acmReader);
            writer = new BufferedWriter(acmWriter);
        }
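
    A minimal sketch of one way to drive the two bulk endpoints follows. It is not from the question: the class name, timeouts, packet count, and the assumption that the device is a CDC-ACM serial adapter (which often needs DTR asserted via a SET_CONTROL_LINE_STATE control transfer before it will send data) are all illustrative.

        import android.hardware.usb.UsbDeviceConnection;
        import android.hardware.usb.UsbEndpoint;

        public final class CdcCommandSketch {

            // Send a text command on the bulk OUT endpoint, then poll the bulk IN endpoint.
            public static String sendAndRead(UsbDeviceConnection conn, UsbEndpoint epOut, UsbEndpoint epIn) {
                // Many CDC-ACM devices stay silent until DTR/RTS are raised
                // (request 0x22 = SET_CONTROL_LINE_STATE); the interface index 0 is an assumption.
                conn.controlTransfer(0x21, 0x22, 0x3, 0, null, 0, 1000);

                byte[] cmd = "GO\r\n".getBytes();
                conn.bulkTransfer(epOut, cmd, cmd.length, 1000);          // write the command

                byte[] buf = new byte[epIn.getMaxPacketSize()];
                StringBuilder data = new StringBuilder();
                for (int i = 0; i < 50; i++) {                            // read a bounded number of packets
                    int n = conn.bulkTransfer(epIn, buf, buf.length, 1000);
                    if (n > 0) {
                        data.append(new String(buf, 0, n));
                    }
                }
                return data.toString();
            }
        }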

    Read the article

  • Ruby Thread with "watchdog"

    - by Sergio Campamá
    I'm implementing a Ruby server for handling sockets created by GPRS modules. The problem is that when a module powers down, there's no indication that its socket closed. I'm using threads to handle multiple sockets with the same server. What I'm asking is this: is there a way to use a timer inside a thread, reset it after every socket input, and have it close the thread if it hits the timeout? Where can I find more information about this? EDIT: code example that doesn't detect the socket closing:

        require 'socket'

        server = TCPServer.open(41000)

        loop do
          Thread.start(server.accept) do |client|
            puts "Client connected"
            begin
              loop do
                line = client.readline
                open('log.txt', 'a') { |f| f.puts line.strip }
              end
            rescue
              puts "Client disconnected"
            end
          end
        end

    Read the article

  • Loop over a file and write the next line if a condition is met

    - by 111078384259264152964
    Having a hard time fixing this or finding any good hints about it. I'm trying to loop over one file, modify each line slightly, and then loop over a different file. If a line in the second file starts with a line from the first, then the following line in the second file should be written to a third file.

        #!/usr/bin/env python
        with open('ids.txt', 'rU') as f:
            with open('seqres.txt', 'rU') as g:
                for id in f:
                    id = id.lower()[0:4] + '_' + id[4]
                    with open(id + '.fasta', 'w') as h:
                        for line in g:
                            if line.startswith('' + id):
                                h.write(g.next())

    All the correct files appear, but they are empty. Yes, I am sure the if has true cases. :-) "seqres.txt" has lines with an ID number in a certain format, each followed by a line of data. "ids.txt" has lines with the ID numbers of interest in a different format. I want each line of data with an interesting ID number in its own file. Thanks a million to anyone with a little advice!

    Read the article

  • Interface for reading variable length files with header and footer.

    - by John S
    I could use some hints or tips for a decent interface for reading files with special characteristics. The files in question have a header (~120 bytes), a body (1 byte - 3 GB) and a footer (4 bytes). The header contains information about the body, and the footer is only a simple CRC32 value of the body. I use Java, so my idea was to extend the InputStream class and add a constructor such as public MyInStream(InputStream in), in which I immediately read the header and then direct the overridden read()s at the body. The problem is that I can't give the user of the class the CRC32 value until the whole body has been read. Because the file can be 3 GB large, keeping it all in memory is a bad idea, and reading it all into a temporary file is going to be a performance hit if there are many small files. I don't know how large the file is, because the InputStream doesn't have to be a file - it could be a socket. Looking at it again, maybe extending InputStream is a bad idea. Thank you for reading the confused thoughts of a tired programmer. :)
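
    One way to keep the stream shape while still exposing the header and the body checksum is a FilterInputStream that reads the header in its constructor and updates a CRC32 as bytes pass through. The sketch below is only an illustration of that idea: the 120-byte header size comes from the question, but the class layout is made up, and it assumes the caller knows (from the header) where the body ends so the 4-byte footer is not fed into the checksum.

        import java.io.FilterInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.util.zip.CRC32;

        public class HeaderBodyStream extends FilterInputStream {
            private final byte[] header = new byte[120];   // assumed fixed-size header
            private final CRC32 crc = new CRC32();

            public HeaderBodyStream(InputStream in) throws IOException {
                super(in);
                int off = 0;                               // read the whole header up front
                while (off < header.length) {
                    int n = in.read(header, off, header.length - off);
                    if (n < 0) throw new IOException("Truncated header");
                    off += n;
                }
            }

            @Override
            public int read() throws IOException {
                int b = super.read();
                if (b >= 0) crc.update(b);                 // checksum every body byte as it is read
                return b;
            }

            @Override
            public int read(byte[] buf, int off, int len) throws IOException {
                int n = super.read(buf, off, len);
                if (n > 0) crc.update(buf, off, n);
                return n;
            }

            public byte[] header() { return header.clone(); }

            // Only meaningful once the caller has consumed the body (and stopped before the footer).
            public long bodyCrc() { return crc.getValue(); }
        }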

    Read the article

  • Creating and writing file from a FileOutputStream in Java

    - by Althane
    Okay, so I'm working on a project where I use a Java program to initiate a socket connection between two classes (a FileSender and a FileReceiver). My basic idea was that the FileSender would look like this:

        try {
            writer = new DataOutputStream(connect.getOutputStream());
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        // While we have bytes to send
        while (filein.available() > 0) {
            // We write them out to our buffer
            writer.write(filein.read(outBuffer));
            writer.flush();
        }
        // Then close our filein
        filein.close();
        // And then our socket
        connect.close();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();

    The constructor contains code that checks that the file exists, that the socket is connected, and so on. Inside my FileReceiver is this, all inside a try-catch block:

        input = recvSocket.accept();
        BufferedReader br = new BufferedReader(new InputStreamReader(input.getInputStream()));
        FileOutputStream fOut = new FileOutputStream(filename);
        String line = br.readLine();
        while (line != null) {
            fOut.write(line.getBytes());
            fOut.flush();
            line = br.readLine();
        }
        System.out.println("Before RECV close statements");
        fOut.close();
        input.close();
        recvSocket.close();
        System.out.println("After RECV clsoe statements");

    So what I'm trying to do is have the FileSender read in the file, convert it to bytes, send it and flush it out; the FileReceiver then reads in the bytes, writes them to the file, flushes, and continues waiting for more. I make sure to close all the things that I open, so... here comes the weird part. When I try to open the created text file in Eclipse, it tells me "An SWT error has occured ... recommended to exit the workbench... see .log for more details." Another window pops up saying "Unhandled event loop exception, (no more handles)". However, if I open the sent text file in notepad2, I get ThisIsASentTextfile - which is good (well, minus the fact that there should be line breaks, but I'm working on that...). Does anyone know why this is happening? And while we're checking, how do I add the line breaks? (And is this a particularly bad way to transfer files over Java without using some other libraries?)
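
    The line breaks disappear because readLine() strips the line terminators and fOut never writes them back; a Reader/Writer pair is also the wrong tool for arbitrary file contents. A common alternative is to copy raw bytes on both sides. The sketch below only illustrates that receiving loop, with made-up class, method and variable names:

        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.Socket;

        public class ByteCopyReceiverSketch {
            // Copy the socket's byte stream straight into the file, so newlines
            // (and any binary content) arrive exactly as they were sent.
            public static void receive(Socket socket, String filename) throws IOException {
                try (InputStream in = socket.getInputStream();
                     OutputStream out = new FileOutputStream(filename)) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {   // -1 means the sender closed its end
                        out.write(buf, 0, n);
                    }
                }
            }
        }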

    Read the article

  • shell_exec() Doesn't Show The Output

    - by Nathan Campos
    I'm building a PHP site that uses shell_exec() like this:

        $file = "upload/" . $_FILES["file"]["name"];
        $output = shell_exec("leaf $file");
        echo "<pre>$output</pre>";

    where leaf is a program located in the same directory as my script. But when I run this script on the server, I get nothing at all. What is wrong?

    Read the article

  • Writing the output of IEnumerable<XElement> to an XML File

    - by Googler
    Hi folks, this is my code to read two XML files and merge their elements. I want to write the output to an XML file. This is my code:

        IEnumerable<XElement> list0 = doc.Descendants(node1).Concat(doc2.Descendants(node2));
        foreach (XElement el in list0)
            Console.WriteLine(el);

    Instead of writing to the console I need to write to an XML file; the output is also in XML format. How do I achieve this? Can anyone please give me code or a method to achieve this?

    Read the article

  • Cannot write to SD card -- canWrite is returning false

    - by Fizz
    Sorry for the ambiguous title, but I'm doing the following to write a simple string to a file:

        try {
            File root = Environment.getExternalStorageDirectory();
            if (root.canWrite()) {
                System.out.println("Can write.");
                File def_file = new File(root, "default.txt");
                FileWriter fw = new FileWriter(def_file);
                BufferedWriter out = new BufferedWriter(fw);
                String defbuf = "default";
                out.write(defbuf);
                out.flush();
                out.close();
            } else
                System.out.println("Can't write.");
        } catch (IOException e) {
            e.printStackTrace();
        }

    But root.canWrite() seems to be returning false every time. I am not running this on an emulator; I have my Android Eris plugged into my computer via USB and I run the app on the phone via Eclipse. Is there a way of giving my app permission so this doesn't happen? Also, this code seems to create the file default.txt, but what if it already exists - will it just open it for writing, or do I have to catch something like a FileAlreadyExists exception (if such an exception exists) and then open and write to it? Thanks for any help, guys.
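
    Two things are commonly behind this, offered here as a hedged note rather than a confirmed diagnosis: the app needs the WRITE_EXTERNAL_STORAGE permission declared in AndroidManifest.xml, and the external storage must not be mounted on the PC as a USB drive while the app runs (in that state it is unavailable to the phone). On the second question, new FileWriter(file) simply creates the file or truncates an existing one - no "already exists" exception is thrown; pass true as a second argument to append instead. A small check along these lines:

        import android.os.Environment;

        public final class StorageCheckSketch {
            // MEDIA_MOUNTED means the external storage is present and writable;
            // MEDIA_MOUNTED_READ_ONLY (or the card being mounted on a PC) is not enough.
            public static boolean externalStorageWritable() {
                return Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState());
            }
        }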

    Read the article

  • mmap() for large file I/O?

    - by Boatzart
    I'm creating a utility in C++ to be run on Linux which can convert videos to a proprietary format. The video frames are very large (up to 16 megapixels), and we need to be able to seek directly to exact frame numbers, so our file format uses libz to compress each frame individually, and append the compressed data onto a file. Once all frames are finished being written, a journal which includes meta data for each frame (including their file offsets and sizes) is written to the end of the file. I'm currently using ifstream and ofstream to do the file i/o, but I am looking to optimize as much as possible. I've heard that mmap() can increase performance in a lot of cases, and I'm wondering if mine is one of them. Our files will be in the tens to hundreds of gigabytes, and although writing will always be done sequentially, random access reads should be done in constant time. Any thoughts as to whether I should investigate this further, and if so does anyone have any tips for things to look out for? Thanks!

    Read the article

  • How can I find file system concurrency issues?

    - by krosenvold
    I have an application running on Linux, and I find myself wanting Windows (!). The problem is that every 1000 runs or so I hit concurrency problems that are consistent with concurrent reading/writing of files. I am fairly sure this behaviour would be prohibited by file locking under Windows, but I don't have a sufficiently fast Windows box to check. There is simply too much file access (too much data) to expect strace to work reliably - the sheer volume of output is likely to change the problem, too. It also happens on different files every time. Ideally I would like to change/reconfigure the Linux file system to be more restrictive (as in fail-fast) with respect to concurrent access. Are there any tools or settings I can use to achieve this?

    Read the article

  • display records which exist in file2 but not in file1

    - by Phoenix
    Log file1 contains records of customers (name, id, date) who visited yesterday; log file2 contains records of customers (name, id, date) who visited today. How would you display the customers who visited yesterday but not today? The constraint is: don't use an auxiliary data structure, because the files contain millions of records (so no hashes). Is there a way to do this using Unix commands?

    Read the article

  • C++ file input/output search

    - by Brian J
    Hi, I took the following code from a program I'm writing to check a user-generated string against a dictionary, as well as other validation. My problem is that although my dictionary file is referenced correctly, the program gives the default "no dictionary found". I can't see clearly what I'm doing wrong here; if anyone has any tips or pointers it would be appreciated. Thanks.

        //variables for checkWordInFile
        #define gC_FOUND 99
        #define gC_NOT_FOUND -99

        static bool certifyThat(bool condition, const char* error) {
            if(!condition) printf("%s", error);
            return !condition;
        }

        //method to validate a user generated password following password guidelines.
        void validatePass()
        {
            FILE *fptr;
            char password[MAX+1];
            int iChar,iUpper,iLower,iSymbol,iNumber,iTotal,iResult,iCount;

            //shows user password guidelines
            printf("\n\n\t\tPassword rules: ");
            printf("\n\n\t\t 1. Passwords must be at least 9 characters long and less than 15 characters. ");
            printf("\n\n\t\t 2. Passwords must have at least 2 numbers in them.");
            printf("\n\n\t\t 3. Passwords must have at least 2 uppercase letters and 2 lowercase letters in them.");
            printf("\n\n\t\t 4. Passwords must have at least 1 symbol in them (eg ?, $, £, %).");
            printf("\n\n\t\t 5. Passwords may not have small, common words in them eg hat, pow or ate.");

            //gets user password input
        get_user_password:
            printf("\n\n\t\tEnter your password following password rules: ");
            scanf("%s", &password);

            iChar   = countLetters(password,&iUpper,&iLower,&iSymbol,&iNumber,&iTotal);
            iUpper  = countLetters(password,&iUpper,&iLower,&iSymbol,&iNumber,&iTotal);
            iLower  = countLetters(password,&iUpper,&iLower,&iSymbol,&iNumber,&iTotal);
            iSymbol = countLetters(password,&iUpper,&iLower,&iSymbol,&iNumber,&iTotal);
            iNumber = countLetters(password,&iUpper,&iLower,&iSymbol,&iNumber,&iTotal);
            iTotal  = countLetters(password,&iUpper,&iLower,&iSymbol,&iNumber,&iTotal);

            if(certifyThat(iUpper >= 2, "Not enough uppercase letters!!!\n")
                || certifyThat(iLower >= 2, "Not enough lowercase letters!!!\n")
                || certifyThat(iSymbol >= 1, "Not enough symbols!!!\n")
                || certifyThat(iNumber >= 2, "Not enough numbers!!!\n")
                || certifyThat(iTotal >= 9, "Not enough characters!!!\n")
                || certifyThat(iTotal <= 15, "Too many characters!!!\n"))
                goto get_user_password;

            iResult = checkWordInFile("dictionary.txt", password);
            if(certifyThat(iResult != gC_FOUND, "Password contains small common 3 letter word/s."))
                goto get_user_password;

            iResult = checkWordInFile("passHistory.txt", password);
            if(certifyThat(iResult != gC_FOUND, "Password contains previously used password."))
                goto get_user_password;

            printf("\n\n\n Your new password is verified ");
            printf(password);

            //writing password to passHistory file.
            fptr = fopen("passHistory.txt", "w"); // create or open the file
            for( iCount = 0; iCount < 8; iCount++)
            {
                fprintf(fptr, "%s\n", password[iCount]);
            }
            fclose(fptr);

            printf("\n\n\n");
            system("pause");
        }//end validatePass method

        int checkWordInFile(char * fileName, char * theWord)
        {
            FILE * fptr;
            char fileString[MAX + 1];
            int iFound = -99;

            //open the file
            fptr = fopen(fileName, "r");
            if (fptr == NULL)
            {
                printf("\nNo dictionary file\n");
                printf("\n\n\n");
                system("pause");
                return (0); // just exit the program
            }

            /* read the contents of the file */
            while( fgets(fileString, MAX, fptr) )
            {
                if( 0 == strcmp(theWord, fileString) )
                {
                    iFound = -99;
                }
            }
            fclose(fptr);
            return(0);
        }//end of checkWordInFile

    Read the article

  • Write physical table to csv file

    - by urema
    Hi, I was wondering if anyone knows how to write an actual table/grid to a CSV file. I don't mean the content of the table/grid; I mean the actual grid lines and so on - headers, axes. Thanks greatly in advance. U.

    Read the article

  • Determine which process (b)locks a file, programmatically (under Windows >= XP)

    - by fred-hh
    How can I programmatically determine, from a process P, which other process P' has a lock on a file that prevents P from recreating that file? I know there are tools to do that, but how do they achieve it? (Context: a batch program that runs overnight fails because of a locked file. Running an admin tool the next day may be too late to get useful information, so it would be nice if the batch program itself were able to determine the culprit.) EDIT: added complexity: the file resides on a DFS, and P' might not run on the same machine as P (but it might). A solution that works locally would still be a good beginning.

    Read the article

  • Read file from root directory folder using filestream

    - by SurajSing
    There are two image files in a folder that I have to use in my program. I have used AppDomain.CurrentDomain.BaseDirectory + "Path and file name", but this resolves to my bin directory, which I don't want. I want to read from a folder named "resource" in the root directory, where I have saved my files, and load the images from there. What is the code for that - how do I read from the root directory in a Windows Forms application?

    Read the article

  • Java Client-Server problem when sending multiple files

    - by Jim
    Client:

        public void transferImage() {
            File file = new File(ServerStats.clientFolder);
            String[] files = file.list();
            int numFiles = files.length;
            boolean done = false;
            BufferedInputStream bis;
            BufferedOutputStream bos;
            int num;
            byte[] byteArray;
            long count;
            long len;
            Socket socket = null;
            while (!done) {
                try {
                    socket = new Socket(ServerStats.imgServerName, ServerStats.imgServerPort);
                    InputStream inStream = socket.getInputStream();
                    OutputStream outStream = socket.getOutputStream();
                    System.out.println("Connected to : " + ServerStats.imgServerName);
                    BufferedReader inm = new BufferedReader(new InputStreamReader(inStream));
                    PrintWriter out = new PrintWriter(outStream, true /* autoFlush */);
                    for (int itor = 0; itor < numFiles; itor++) {
                        String fileName = files[itor];
                        System.out.println("transfer: " + fileName);
                        File sentFile = new File(fileName);
                        len = sentFile.length();
                        len++;
                        System.out.println(len);
                        out.println(len);
                        out.println(sentFile);
                        // SENDFILE
                        bis = new BufferedInputStream(new FileInputStream(fileName));
                        bos = new BufferedOutputStream(socket.getOutputStream());
                        byteArray = new byte[1000000];
                        count = 0;
                        while (count < len) {
                            num = bis.read(byteArray);
                            bos.write(byteArray, 0, num);
                            count++;
                        }
                        bos.close();
                        bis.close();
                        System.out.println("file done: " + itor);
                    }
                    done = true;
                } catch (Exception e) {
                    System.err.println(e);
                }
            }
        }

    Server:

        public static void main(String[] args) {
            BufferedInputStream bis;
            BufferedOutputStream bos;
            int num;
            File file = new File(ServerStats.serverFolder);
            if (!(file.exists())) {
                file.mkdir();
            }
            try {
                int i = 1;
                ServerSocket socket = new ServerSocket(ServerStats.imgServerPort);
                Socket incoming = socket.accept();
                System.out.println("Spawning " + i);
                try {
                    try {
                        if (!(file.exists())) {
                            file.mkdir();
                        }
                        InputStream inStream = incoming.getInputStream();
                        OutputStream outStream = incoming.getOutputStream();
                        BufferedReader inm = new BufferedReader(new InputStreamReader(inStream));
                        PrintWriter out = new PrintWriter(outStream, true /* autoFlush */);
                        String length2 = inm.readLine();
                        System.out.println(length2);
                        String filename = inm.readLine();
                        System.out.println("Filename = " + filename);
                        out.println("ACK: Filename received = " + filename);
                        // RECEIVE and WRITE FILE
                        byte[] receivedData = new byte[1000000];
                        bis = new BufferedInputStream(incoming.getInputStream());
                        bos = new BufferedOutputStream(new FileOutputStream(ServerStats.serverFolder + "/" + filename));
                        long length = (long) Integer.parseInt(length2);
                        length++;
                        long counter = 0;
                        while (counter < length) {
                            num = bis.read(receivedData);
                            bos.write(receivedData, 0, num);
                            counter++;
                        }
                        System.out.println(counter);
                        bos.close();
                        bis.close();
                        File receivedFile = new File(filename);
                        long receivedLen = receivedFile.length();
                        out.println("ACK: Length of received file = " + receivedLen);
                    } finally {
                        incoming.close();
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            } catch (IOException e1) {
                e1.printStackTrace();
            }
        }

    The code is something I found and have slightly modified, but I am having problems transferring multiple images to the server. Output on the client:

        run ServerQueue.Client
        Connected to : localhost
        transfer: Picture 012.jpg
        1312743
        java.lang.ArrayIndexOutOfBoundsException
        Connected to : localhost
        transfer: Picture 012.jpg
        1312743

    I can't seem to get it to transfer multiple images, and I think both sides crash or something, because the file never finishes transferring.
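
    One pattern that avoids the mixed Reader/stream bookkeeping above is a simple length-prefixed protocol: write the file name and its exact byte count, then the raw bytes, and have the receiver consume exactly that many bytes before looping for the next file. The following is only a sketch of that idea with invented class and method names, not a fix verified against the poster's full project:

        import java.io.DataInputStream;
        import java.io.DataOutputStream;
        import java.io.EOFException;
        import java.io.File;
        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;

        class LengthPrefixedTransfer {
            // Sender side: name, exact length, then the raw bytes.
            static void sendFile(DataOutputStream out, File f) throws IOException {
                out.writeUTF(f.getName());
                out.writeLong(f.length());
                try (InputStream in = new FileInputStream(f)) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) out.write(buf, 0, n);
                }
                out.flush();
            }

            // Receiver side: read exactly the advertised number of bytes, then return,
            // leaving the stream positioned at the start of the next file's header.
            static void receiveFile(DataInputStream in, File dir) throws IOException {
                String name = in.readUTF();
                long remaining = in.readLong();
                try (OutputStream out = new FileOutputStream(new File(dir, name))) {
                    byte[] buf = new byte[8192];
                    while (remaining > 0) {
                        int n = in.read(buf, 0, (int) Math.min(buf.length, remaining));
                        if (n == -1) throw new EOFException("stream ended early");
                        out.write(buf, 0, n);
                        remaining -= n;
                    }
                }
            }
        }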

    Read the article

  • PHP: What is an efficient way to parse a text file containing very long lines?

    - by Shaun
    I'm working on a parser in PHP which is designed to extract MySQL records out of a text file. A particular line might begin with a string corresponding to which table the records (rows) need to be inserted into, followed by the records themselves. The records are delimited by a backslash and the fields (columns) are separated by commas. For the sake of simplicity, let's assume that we have a table representing people in our database, with the fields First Name, Last Name, and Occupation. Thus, one line of the file might be as follows:

        [People] = "\Han,Solo,Smuggler\Luke,Skywalker,Jedi..."

    where the ellipsis (...) could be additional people. One straightforward approach might be to use fgets() to extract a line from the file and use preg_match() to extract the table name, records, and fields from that line. However, let's suppose that we have an awful lot of Star Wars characters to track - so many, in fact, that this line ends up being 200,000+ characters/bytes long. In such a case, taking the above approach to extract the database information seems a bit inefficient: you have to first read hundreds of thousands of characters into memory, then read back over those same characters to find regex matches. Is there a way, similar to the Java Scanner class's next(String pattern) method when the Scanner is constructed from a file, that allows you to match patterns in-line while scanning through the file? The idea is that you don't have to scan through the same text twice (to read it from the file into a string, and then to match patterns) or store the text redundantly in memory (in both the file-line string and the matched patterns). Would this even yield a significant increase in performance? It's hard to tell exactly what PHP or Java are doing behind the scenes.
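
    For reference, the Java mechanism the question alludes to looks roughly like the sketch below: a Scanner over a file pulls tokens through a fixed-size buffer, so a 200,000-character "line" never has to be held (or scanned) twice. The file name, delimiter handling and the database step are illustrative only.

        import java.io.File;
        import java.io.IOException;
        import java.util.Scanner;

        public class RecordScanSketch {
            public static void main(String[] args) throws IOException {
                try (Scanner sc = new Scanner(new File("dump.txt"))) {
                    sc.useDelimiter("\\\\");                 // records are separated by a literal backslash
                    while (sc.hasNext()) {
                        String record = sc.next();           // e.g. "Han,Solo,Smuggler" (the first token also
                                                             // carries the "[People] = \"" prefix, not handled here)
                        String[] fields = record.split(",");
                        // ...insert fields into the appropriate table here...
                    }
                }
            }
        }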

    Read the article

  • Need a way to determine if a file is done being written to.

    - by Khorkrak
    The situation I'm in is this: there's a process that's writing to a file, and sometimes the file is rather large, say 400-500 MB. I need to know when it's done writing. How can I determine this? If I look in the directory I'll see the file there, but it might not be finished being written. Plus, this needs to be done remotely - on the same internal LAN but not on the same computer; typically the process that wants to know when the file writing is done runs on a Linux box, while the process writing the file, and the file itself, are on a Windows box. No, Samba isn't an option. XML-RPC communication to a service on that Windows box is an option, as is using SNMP to check, if that's viable. Ideally the solution works on either Linux or Windows - meaning it is OS independent - and works for any type of file. Good enough: it works only on Windows but can be driven through some library or whatever that can be accessed with Python, or it works only for PDF files. My current best idea is to periodically open the file in question from some process on the Windows box and look at the last bytes, checking for the PDF end tag and accounting for the EOL differences, because the file may have been created on Linux or Windows.
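
    A rough illustration of that "look at the last bytes" idea follows. It assumes the file really is a PDF (PDF files end with a %%EOF marker, possibly followed by CR/LF bytes that differ between OSes) and uses an arbitrary tail size and path; pairing it with a size-stability check (same length on two polls a few seconds apart) is a common extra safeguard.

        import java.io.IOException;
        import java.io.RandomAccessFile;

        public class PdfTailCheckSketch {
            // Returns true if the PDF end-of-file marker appears in the last few bytes.
            public static boolean looksFinished(String path) throws IOException {
                try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
                    long len = raf.length();
                    int tail = (int) Math.min(32, len);      // enough room for "%%EOF" plus stray EOL bytes
                    raf.seek(len - tail);
                    byte[] buf = new byte[tail];
                    raf.readFully(buf);
                    return new String(buf, "ISO-8859-1").contains("%%EOF");
                }
            }
        }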

    Read the article

  • C# - Problem while listing directories - DirectoryNotFoundException

    - by HoNgOuRu
    I'm getting a DirectoryNotFoundException. Here is the code:

        string directorio = "D:\MUSICA\La Trampa - El Mísero Espiral De Encanto";
        DirectoryInfo dir = new DirectoryInfo(directorio);
        DirectoryInfo[] dirs = dir.GetDirectories();   // <-- this is the line where I get the exception

    I believe it's caused when it tries to parse the accented part of that string, "Mísero". The directory "D:\MUSICA\La Trampa - El Mísero Espiral De Encanto" exists - I can see it, and it has some files in it. Is there any way to pass this string in the correct way? Thanks.

    Read the article

  • Write a file in UTF-8 using FileWriter (Java)?

    - by user1280970
    I have the following code; however, I want it to write the file as UTF-8 so it can handle foreign characters. Is there a way of doing this? Is there some parameter I need to pass? I would really appreciate your help with this. Thanks.

        try {
            BufferedReader reader = new BufferedReader(new FileReader("C:/Users/Jess/My Documents/actresses.list"));
            writer = new BufferedWriter(new FileWriter("C:/Users/Jess/My Documents/actressesFormatted.csv"));
            while ((line = reader.readLine()) != null) {
                // If the line starts with a tab then we just want to add a movie
                // using the current actor's name.
                if (line.length() == 0)
                    continue;
                else if (line.charAt(0) == '\t') {
                    readMovieLine2(0, line, surname.toString(), forename.toString());
                }
                // Else we've reached a new actor
                else {
                    readActorName(line);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        }
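
    FileWriter (before Java 11) has no encoding parameter and always uses the platform default, so the usual route is an OutputStreamWriter over a FileOutputStream with an explicit charset. A minimal sketch of that substitution, keeping the paths from the question but with a trivial copy loop and an assumed Latin-1 input encoding:

        import java.io.BufferedReader;
        import java.io.BufferedWriter;
        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.io.OutputStreamWriter;
        import java.nio.charset.StandardCharsets;

        public class Utf8WriterSketch {
            public static void main(String[] args) throws IOException {
                try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                             new FileInputStream("C:/Users/Jess/My Documents/actresses.list"),
                             StandardCharsets.ISO_8859_1));          // assumption about the source encoding
                     BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(
                             new FileOutputStream("C:/Users/Jess/My Documents/actressesFormatted.csv"),
                             StandardCharsets.UTF_8))) {             // output is written as UTF-8
                    String line;
                    while ((line = reader.readLine()) != null) {
                        writer.write(line);
                        writer.newLine();
                    }
                }
            }
        }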

    Read the article

  • How do I turn an array of bytes back into a file and open it automatically with C#?

    - by Ace Grace
    Hi, I am writing some code to add file attachments to an application I am building. I have Add and Remove working, but I don't know where to start to implement Open. I have an array of bytes (from a table field) and I don't know how to make it open automatically. For example, if I have an array of bytes which is a PDF, how do I get my app to automatically open Acrobat (or whatever the currently assigned application for the extension is) using C#?

    Read the article
