Search Results

Search found 3856 results on 155 pages for 'io'.


  • Hibernate JDBCConnectionException: Communications link failure and java.io.EOFException: Can not read response from server

    - by Marc
    I get a quite well-known error using the MySQL JDBC driver: JDBCConnectionException: Communications link failure, java.io.EOFException: Can not read response from server. This is caused by the wait_timeout parameter in my.cnf. So I decided to use a c3p0 connection pool along with Hibernate. Here is what I added to hibernate.cfg.xml:
        <property name="hibernate.connection.provider_class">org.hibernate.connection.C3P0ConnectionProvider</property>
        <property name="c3p0.min_size">10</property>
        <property name="c3p0.max_size">100</property>
        <property name="c3p0.timeout">1000</property>
        <property name="c3p0.preferredTestQuery">SELECT 1</property>
        <property name="c3p0.acquire_increment">1</property>
        <property name="c3p0.idle_test_period">2</property>
        <property name="c3p0.max_statements">50</property>
    idle_test_period is voluntarily low for test purposes. Looking at the MySQL logs I can see the "SELECT 1" query being sent to the server at regular intervals, so that part works. Unfortunately I still get this EOF exception within my app if I wait longer than wait_timeout seconds (set to 10 for test purposes). I'm using Hibernate 4.1.1 and mysql-jdbc-connector 5.1.18. So what am I doing wrong? Thanks, Marc.
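
    One thing that may help isolate whether the pool or the Hibernate wiring is at fault: a minimal standalone c3p0 sketch (JDBC URL, user and password are placeholders, not taken from the question) that validates every connection on checkout, so a connection killed by wait_timeout is silently replaced rather than handed to the application:

        import com.mchange.v2.c3p0.ComboPooledDataSource;
        import java.sql.Connection;

        public class PoolCheck {
            public static void main(String[] args) throws Exception {
                ComboPooledDataSource ds = new ComboPooledDataSource();
                ds.setDriverClass("com.mysql.jdbc.Driver");         // driver class for Connector/J 5.1
                ds.setJdbcUrl("jdbc:mysql://localhost:3306/test");  // placeholder URL
                ds.setUser("user");
                ds.setPassword("password");
                ds.setPreferredTestQuery("SELECT 1");
                ds.setIdleConnectionTestPeriod(2);      // seconds, matching the test setting above
                ds.setTestConnectionOnCheckout(true);   // validate before every hand-out
                try (Connection c = ds.getConnection()) {
                    System.out.println("Got connection: " + c.getMetaData().getURL());
                }
                ds.close();
            }
        }

    testConnectionOnCheckout costs one preferredTestQuery round trip per checkout; testing on check-in or relying purely on the idle test period is cheaper but leaves a window in which a dead connection can still be handed out.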


  • File IO with Streams - Best Memory Buffer Size

    - by AJ
    I am writing a small IO library to assist with a larger (hobby) project. A part of this library performs various functions on a file, which is read / written via the FileStream object. On each StreamReader.Read(...) pass, I fire off an event which will be used in the main app to display progress information. The processing that goes on in the loop is varied, but is not too time consuming (it could just be a simple file copy, for example, or may involve encryption...). My main question is: What is the best memory buffer size to use? Thinking about physical disk layouts, I could pick 2k, which would cover a CD sector size and is a nice multiple of a 512 byte hard disk sector. Higher up the abstraction tree, you could go for a larger buffer which could read an entire FAT cluster at a time. I realise that with today's PCs I could go for a more memory-hungry option (a couple of MiB, for example), but then I increase the time between UI updates and the user perceives a less responsive app. As an aside, I'm eventually hoping to provide a similar interface to files hosted on FTP / HTTP servers (over a local network / fastish DSL). What would be the best memory buffer size for those (again, a best-case trade-off between perceived responsiveness and performance)?
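
    For what it's worth, the cheapest way to settle this is usually to measure it on the target hardware: time the same copy with a handful of buffer sizes and see where the curve flattens. The question is about .NET streams, but the shape of the measurement is the same in any language; a rough Java sketch, with placeholder file names:

        import java.io.*;

        public class BufferSizeBench {
            public static void main(String[] args) throws IOException {
                File src = new File("input.bin");   // placeholder test file
                for (int size : new int[] {2 * 1024, 8 * 1024, 64 * 1024, 1024 * 1024}) {
                    long start = System.nanoTime();
                    long total = 0;
                    try (InputStream in = new FileInputStream(src);
                         OutputStream out = new FileOutputStream("copy.bin")) {
                        byte[] buffer = new byte[size];
                        int read;
                        while ((read = in.read(buffer)) != -1) {
                            out.write(buffer, 0, read);
                            total += read;   // a progress event would be fired here
                        }
                    }
                    long ms = (System.nanoTime() - start) / 1000000L;
                    System.out.println(size + "-byte buffer: " + total + " bytes in " + ms + " ms");
                }
            }
        }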


  • Rapid Opening and Closing System.IO.StreamWriter in C#

    - by ccomet
    Suppose you have a file that you are programmatically logging information into with regards to a process. Kinda like your typical debug Console.WriteLine, but due to the nature of the code you're testing, you don't have a console to write onto so you have to write it somewhere like a file. My current program uses System.IO.StreamWriter for this task. My question is about the approach to using the StreamWriter. Is it better to open just one StreamWriter instance, do all of the writes, and close it when the entire process is done? Or is it a better idea to open a new StreamWriter instance to write a line into the file, then immediately close it, and do this for every time something needs to be written in? In the latter approach, this would probably be facilitated by a method that would do just that for a given message, rather than bloating the main process code with excessive amounts of lines. But having a method to aid in that implementation doesn't necessarily make it the better choice. Are there significant advantages to picking one approach or the other? Or are they functionally equivalent, leaving the choice on the shoulders of the programmer?
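
    The question is about C#'s StreamWriter, but the trade-off is the same on any platform; a small Java sketch of the two styles (file name and class names are invented for illustration), mainly to show where the cost difference comes from:

        import java.io.*;

        public class LogStyles {
            private final PrintWriter keptOpen;

            public LogStyles(File file) throws IOException {
                // style 1: one writer for the whole run, flushed after each line
                keptOpen = new PrintWriter(new BufferedWriter(new FileWriter(file, true)));
            }

            public void logKeepOpen(String message) {
                keptOpen.println(message);
                keptOpen.flush();   // data reaches the OS even if the process dies later
            }

            // style 2: open in append mode, write one line, close again
            public static void logOpenClose(File file, String message) throws IOException {
                try (PrintWriter out = new PrintWriter(new BufferedWriter(new FileWriter(file, true)))) {
                    out.println(message);
                }
            }

            public void close() {
                keptOpen.close();
            }
        }

    The open-per-message style pays a file open/close (plus buffer setup) for every line, which tends to dominate once logging is frequent; the keep-open style only pays a flush, at the price of holding the handle for the lifetime of the process.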


  • Need some help understanding IO Statistics

    - by Abe Miessler
    I have a query that has a very costly INDEX SEEK operation in the execution plan. In order to track down the cause I ran it with SET STATISTICS IO ON. In the problem section it gave the following statistics:
    Table '#TempStudents_Enrollment2_____________________________________000000004D5F'. Scan count 0, logical reads 60, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#TempRace2______________________________________________000000004D58'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'RefRace'. Scan count 120, logical reads 240, physical reads 1, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'RefFedEnctyRaceCatg'. Scan count 18, logical reads 36, physical reads 2, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#43B0BA0F'. Scan count 1, logical reads 60, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#42BC95D6'. Scan count 1, logical reads 60, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#41C8719D'. Scan count 1, logical reads 60, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#40D44D64'. Scan count 1, logical reads 60, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#LEA2_________________________________________________000000004D56'. Scan count 1, logical reads 60, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#39332B9C'. Scan count 1, logical reads 60, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#School2________________________________________________000000004D57'. Scan count 1, logical reads 29164, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#GenderKey______________________________________________000000004D5A'. Scan count 1, logical reads 29164, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#LangAcqKey_____________________________________________000000004D5B'. Scan count 1, logical reads 29164, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#TransferCatKey___________________________________________000000004D5C'. Scan count 1, logical reads 29164, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#ResCatKey______________________________________________000000004D5D'. Scan count 1, logical reads 29164, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'RPT_SnapShot_1_4_StuPgm_Denorm'. Scan count 2344954, logical reads 4992518, physical reads 16, read-ahead reads 8, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#3FE0292B'. Scan count 1, logical reads 2344954, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table 'RPT_SnapShot_1_4_StuEnrlmt_Denorm'. Scan count 20, logical reads 87679, physical reads 0, read-ahead reads 87425, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    Table '#GradeKey_______________________________________________000000004D59'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
    What should I look for in here when I'm trying to improve the performance? The line with over 2 million for the scan count looked suspicious to me, but I really don't know. Does anyone see anything here that I should look into in more detail?


  • Problem with creating a deterministic finite automaton (DFA) - Mercury

    - by Jabba The hut
    I would like to simulate a deterministic finite automaton (DFA) in Mercury, but I'm stuck in several places. Formally, a DFA is described by the following characteristics: a set of states S, an input alphabet Σ, a transition function S × Σ → S, a start state s ∈ S, and a set of accepting final states F ⊆ S. A DFA always starts in the start state. It then reads the characters of the input one by one; based on the current input character and the current state, it moves to a new state. These transitions are defined in the transition function. If the DFA is in one of its accepting final states after reading the last character, the input is accepted; otherwise it is rejected. The figure shows a DFA that accepts strings in which the number of zeros is a multiple of three. State 1 is the initial state, and also the only accepting state. For each input character the corresponding arc is followed to the next state. Link to Figure

    What must be done:
    - A type "mystate" which represents a state. Each state has a number which is used for identification.
    - A type "transition" that represents a possible transition between states. Each transition has a source_state, an input_character, and a final_state.
    - A type "statemachine" that represents the entire DFA. In the solution, the DFA must have the following properties: the set of all states, the input alphabet, a transition function represented as a set of possible transitions, a set of accepting final states, and the current state of the DFA.
    - A predicate "init_machine(statemachine :: out)" which unifies its argument with the DFA shown in the figure. The current state of the DFA is set to its initial state, namely 1. The input alphabet of the DFA is composed of the characters '0' and '1'.
    - A user can enter text, which is checked by the DFA. The program keeps running until the user types Ctrl-D to simulate an EOF. If the user enters characters that are not in the input alphabet of the DFA, an error message is shown and the program closes. (pred require)

    Example:
        Enter a sentence: 0110
        String is not ok!
        Enter a sentence: 011101
        String is not ok!
        Enter a sentence: 110100
        String is ok!
        Enter a sentence: 000110010
        String is ok!
        Enter a sentence: 011102
        Uncaught exception Mercury: Software Error: Character does not belong to the input alphabet!

    What I have so far:

        :- module dfa.
        :- interface.
        :- import_module io.
        :- pred main(io.state::di, io.state::uo) is det.
        :- implementation.
        :- import_module int,string,list,bool.

    1.
        :- type mystate ---> state(int).
    2.
        :- type transition ---> trans(source_state::mystate, input_character::bool, final_state::mystate).
    3. (error, finale_state and current_state and input_character)
        :- type statemachine ---> dfa(list(mystate),list(input_character),list(transition),list(final_state),current_state(mystate))
    4. missing a lot
        :- pred init_machine(statemachine :: out) is det.
        %init_machine(statemachine(L_Mystate,0,L_transition,L_final_state,1)) :-   <- probably fault
    5. not perfect
        main(!IO) :-
            io.write_string("\nEnter a sentence: ", !IO),
            io.read_line_as_string(Input, !IO),
            (   Invoer = ok(StringVar),
                S1 = string.strip(StringVar),
                (if S1 = "mustbeabool"
                 then io.write_string("Sentence is Ok! ", !IO)
                 else io.write_string("Sentence is not Ok!.", !IO)),
                main(!IO)
            ;   Invoer = eof
            ;   Invoer = error(ErrorCode),
                io.format("%s\n", [s(io.error_message(ErrorCode))], !IO)
            ).

    Hope you can help me. Kind regards
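
    The Mercury-specific parts aside, the automaton itself is easy to cross-check in another language; a small Java sketch of the same machine (states are numbered 0-2 here, with 0 playing the role of state 1 in the figure), offered only as a reference for the expected accept/reject behaviour on the sample inputs:

        import java.util.Scanner;

        public class ZerosMod3Dfa {
            // TRANSITION[state][symbol]: symbol 0 for '0', 1 for '1'.
            // The state is simply "number of zeros read so far, modulo 3".
            private static final int[][] TRANSITION = {
                {1, 0},   // from state 0: a '0' moves to state 1, a '1' stays
                {2, 1},
                {0, 2},
            };

            static boolean accepts(String input) {
                int state = 0;   // start state, also the only accepting state
                for (char c : input.toCharArray()) {
                    if (c != '0' && c != '1') {
                        throw new IllegalArgumentException("Character does not belong to the input alphabet!");
                    }
                    state = TRANSITION[state][c - '0'];
                }
                return state == 0;
            }

            public static void main(String[] args) {
                Scanner in = new Scanner(System.in);
                while (in.hasNextLine()) {   // loop ends on EOF (Ctrl-D)
                    String line = in.nextLine().trim();
                    System.out.println(accepts(line) ? "String is ok!" : "String is not ok!");
                }
            }
        }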


  • Mongodump on GridFS is killing the host IO

    - by Raphael
    I'm trying to make a mongodump from our production MongoDB while production is running. We have three production instances: one regular MongoDB, one with only a few GB of data on GridFS, and one with a larger amount of data on GridFS. All MongoDB instances are running version 2.4.9 on an Ubuntu 10.04 virtual server. I use a mongodump command to export the databases to another server. Unfortunately our machines are virtually hosted in a "low performance" datacenter (VMware based), so when I try to export the large GridFS db, the disk IO hits 100% (and 50% of the CPU starts waiting for IO too). This has a very negative impact on the production applications because db access time increases excessively, making the applications unusable. I'm looking for a way to throttle the mongodump so the export goes slower but is lighter on the hardware resources, allowing the applications to keep running with acceptable performance. Has anyone had a similar scenario?


  • Questions related to writing your own file downloader using multiple threads in Java

    - by Shekhar
    Hello In my current company, i am doing a PoC on how we can write a file downloader utility. We have to use socket programming(TCP/IP) for downloading the files. One of the requirements of the client is that a file(which will be large in size) should be transfered in chunks for example if we have a file of 5Mb size then we can have 5 threads which transfer 1 Mb each. I have written a small application which downloads a file. You can download the eclipe project from http://www.fileflyer.com/view/QM1JSC0 A brief explanation of my classes FileSender.java This class provides the bytes of file. It has a method called sendBytesOfFile(long start,long end, long sequenceNo) which gives the number of bytes. import java.io.File; import java.io.IOException; import java.util.zip.CRC32; import org.apache.commons.io.FileUtils; public class FileSender { private static final String FILE_NAME = "C:\\shared\\test.pdf"; public ByteArrayWrapper sendBytesOfFile(long start,long end, long sequenceNo){ try { File file = new File(FILE_NAME); byte[] fileBytes = FileUtils.readFileToByteArray(file); System.out.println("Size of file is " +fileBytes.length); System.out.println(); System.out.println("Start "+start +" end "+end); byte[] bytes = getByteArray(fileBytes, start, end); ByteArrayWrapper wrapper = new ByteArrayWrapper(bytes, sequenceNo); return wrapper; } catch (IOException e) { throw new RuntimeException(e); } } private byte[] getByteArray(byte[] bytes, long start, long end){ long arrayLength = end-start; System.out.println("Start : "+start +" end : "+end + " Arraylength : "+arrayLength +" length of source array : "+bytes.length); byte[] arr = new byte[(int)arrayLength]; for(int i = (int)start, j =0; i < end;i++,j++){ arr[j] = bytes[i]; } return arr; } public static long fileSize(){ File file = new File(FILE_NAME); return file.length(); } } Second Class is FileReceiver.java - This class receives the file. Small Explanation what this file does This class finds the size of the file to be fetched from Sender Depending upon the size of the file it finds the start and end position till the bytes needs to be read. It starts n number of threads giving each thread start,end, sequence number and a list which all the threads share. Each thread reads the number of bytes and creates a ByteArrayWrapper. ByteArrayWrapper objects are added to the list Then i have while loop which basically make sure that all threads have done their work finally it sorts the list based on the sequence number. then the bytes are joined, and a complete byte array is formed which is converted to a file. 
Code of File Receiver package com.filedownloader; import java.io.File; import java.io.IOException; import java.util.ArrayList; import java.util.Collections; import java.util.Comparator; import java.util.List; import java.util.zip.CRC32; import org.apache.commons.io.FileUtils; public class FileReceiver { public static void main(String[] args) { FileReceiver receiver = new FileReceiver(); receiver.receiveFile(); } public void receiveFile(){ long startTime = System.currentTimeMillis(); long numberOfThreads = 10; long filesize = FileSender.fileSize(); System.out.println("File size received "+filesize); long start = filesize/numberOfThreads; List<ByteArrayWrapper> list = new ArrayList<ByteArrayWrapper>(); for(long threadCount =0; threadCount<numberOfThreads ;threadCount++){ FileDownloaderTask task = new FileDownloaderTask(threadCount*start,(threadCount+1)*start,threadCount,list); new Thread(task).start(); } while(list.size() != numberOfThreads){ // this is done so that all the threads should complete their work before processing further. //System.out.println("Waiting for threads to complete. List size "+list.size()); } if(list.size() == numberOfThreads){ System.out.println("All bytes received "+list); Collections.sort(list, new Comparator<ByteArrayWrapper>() { @Override public int compare(ByteArrayWrapper o1, ByteArrayWrapper o2) { long sequence1 = o1.getSequence(); long sequence2 = o2.getSequence(); if(sequence1 < sequence2){ return -1; }else if(sequence1 > sequence2){ return 1; } else{ return 0; } } }); byte[] totalBytes = list.get(0).getBytes(); byte[] firstArr = null; byte[] secondArr = null; for(int i = 1;i<list.size();i++){ firstArr = totalBytes; secondArr = list.get(i).getBytes(); totalBytes = concat(firstArr, secondArr); } System.out.println(totalBytes.length); convertToFile(totalBytes,"c:\\tmp\\test.pdf"); long endTime = System.currentTimeMillis(); System.out.println("Total time taken with "+numberOfThreads +" threads is "+(endTime-startTime)+" ms" ); } } private byte[] concat(byte[] A, byte[] B) { byte[] C= new byte[A.length+B.length]; System.arraycopy(A, 0, C, 0, A.length); System.arraycopy(B, 0, C, A.length, B.length); return C; } private void convertToFile(byte[] totalBytes,String name) { try { FileUtils.writeByteArrayToFile(new File(name), totalBytes); } catch (IOException e) { throw new RuntimeException(e); } } } Code of ByteArrayWrapper package com.filedownloader; import java.io.Serializable; public class ByteArrayWrapper implements Serializable{ private static final long serialVersionUID = 3499562855188457886L; private byte[] bytes; private long sequence; public ByteArrayWrapper(byte[] bytes, long sequenceNo) { this.bytes = bytes; this.sequence = sequenceNo; } public byte[] getBytes() { return bytes; } public long getSequence() { return sequence; } } Code of FileDownloaderTask import java.util.List; public class FileDownloaderTask implements Runnable { private List<ByteArrayWrapper> list; private long start; private long end; private long sequenceNo; public FileDownloaderTask(long start,long end,long sequenceNo,List<ByteArrayWrapper> list) { this.list = list; this.start = start; this.end = end; this.sequenceNo = sequenceNo; } @Override public void run() { ByteArrayWrapper wrapper = new FileSender().sendBytesOfFile(start, end, sequenceNo); list.add(wrapper); } } Questions related to this code 1) Does file downloading becomes fast when multiple threads is used? In this code i am not able to see the benefit. 2) How should i decide how many threads should i create ? 
3) Are there any open-source libraries which do that? 4) The file which the receiver receives is valid and not corrupted, but the checksum (I used FileUtils of commons-io) does not match. What's the problem? 5) This code gives an out-of-memory error when used with a large file (above 100 MB) because of the byte arrays which are created. How can I avoid that? I know this is very bad code, but I had to write it in one day :-). Please suggest any other good way to do this. Thanks, Shekhar
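
    On questions 2 and 5 in particular, a common pattern is to give each worker a fixed byte range, let it write straight into a pre-sized output file at its own offset, and wait on a CountDownLatch instead of spinning on list.size(). A rough Java sketch of that shape, using a local file as a stand-in for the socket transfer (paths, chunk count and buffer size are made up):

        import java.io.*;
        import java.util.concurrent.*;

        public class ChunkedCopy {
            public static void main(String[] args) throws Exception {
                final File source = new File("C:\\shared\\test.pdf");   // stand-in for the remote side
                final File target = new File("C:\\tmp\\test.pdf");
                final long size = source.length();
                int threads = 5;
                long chunk = (size + threads - 1) / threads;

                try (RandomAccessFile out = new RandomAccessFile(target, "rw")) {
                    out.setLength(size);   // pre-size so workers can seek anywhere
                }

                ExecutorService pool = Executors.newFixedThreadPool(threads);
                final CountDownLatch done = new CountDownLatch(threads);
                for (int i = 0; i < threads; i++) {
                    final long start = i * chunk;
                    final long end = Math.min(start + chunk, size);
                    pool.submit(new Runnable() {
                        public void run() {
                            try (RandomAccessFile in = new RandomAccessFile(source, "r");
                                 RandomAccessFile out = new RandomAccessFile(target, "rw")) {
                                in.seek(start);
                                out.seek(start);
                                byte[] buf = new byte[64 * 1024];
                                long remaining = end - start;
                                while (remaining > 0) {
                                    int read = in.read(buf, 0, (int) Math.min(buf.length, remaining));
                                    if (read == -1) break;
                                    out.write(buf, 0, read);
                                    remaining -= read;
                                }
                            } catch (IOException e) {
                                throw new RuntimeException(e);
                            } finally {
                                done.countDown();
                            }
                        }
                    });
                }
                done.await();   // blocks instead of spinning on list.size()
                pool.shutdown();
                System.out.println("Copied " + size + " bytes in " + threads + " chunks");
            }
        }

    Because no worker ever holds more than a 64 KB buffer, the 100 MB out-of-memory problem from question 5 goes away, and the latch removes the busy-wait loop.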


  • Java textfile I/O problem

    - by KáGé
    Hello, I have to make a torpedo game for school with a toplist for it. I want to store it in a folder structure near the JAR: /Torpedo/local/toplist/top_i.dat, where the i is the place of that score. The files will be created at the first start of the program with this call: File f; f = new File(Toplist.toplistPath+"/top_1.dat"); if(!f.exists()){ Toplist.makeToplist(); } Here is the toplist class: package main; import java.awt.Color; import java.io.BufferedReader; import java.io.File; import java.io.FileNotFoundException; import java.io.FileReader; import java.io.FileWriter; import java.io.IOException; import java.io.PrintWriter; import java.text.SimpleDateFormat; import java.util.Calendar; import java.util.prefs.Preferences; import javax.swing.JFrame; import javax.swing.JOptionPane; import javax.swing.JTextArea; public class Toplist { static String toplistPath = "./Torpedo/local/toplist"; //I know it won't work this easily, it's only to get you the idea public static JFrame toplistWindow = new JFrame("Torpedó - [TOPLISTA]"); public static JTextArea toplist = new JTextArea(""); static StringBuffer toplistData = new StringBuffer(3000); public Toplist() { toplistWindow.setSize(500, 400); toplistWindow.setLocationRelativeTo(null); toplistWindow.setResizable(false); getToplist(); toplist.setSize(400, 400); toplist.setLocation(0, 100); toplist.setColumns(5); toplist.setText(toplistData.toString()); toplist.setEditable(false); toplist.setBackground(Color.WHITE); toplistWindow.setLayout(null); toplistWindow.setVisible(true); } public Toplist(Player winner) { //this is to be done yet, this will set the toplist at first and then display it toplistWindow.setLayout(null); toplistWindow.setVisible(true); } /** * Creates a new toplist */ public static void makeToplist(){ new File(toplistPath).mkdir(); for(int i = 1; i <= 10; i++){ File f = new File(toplistPath+"/top_"+i+".dat"); try { f.createNewFile(); } catch (IOException e) { JOptionPane.showMessageDialog(new JFrame(), "Fájl hiba: toplista létrehozása", "Error", JOptionPane.ERROR_MESSAGE); } } } /** * If the score is a top score it inserts it into the list * * @param score - the score to be checked */ public static void setToplist(int score, Player winner){ BufferedReader input = null; PrintWriter output = null; int topscore; for(int i = 1; i <= 10; i++){ try { input = new BufferedReader(new FileReader(toplistPath+"/top_"+i+",dat")); String s; topscore = Integer.parseInt(input.readLine()); if(score > topscore){ for(int j = 9; j >= i; j--){ input = new BufferedReader(new FileReader(toplistPath+"/top_"+j+".dat")); output = new PrintWriter(new FileWriter(toplistPath+"/top_"+(j+1)+".dat")); while ((s = input.readLine()) != null) { output.println(s); } } output = new PrintWriter(new FileWriter(toplistPath+"/top_"+i+".dat")); output.println(score); output.println(winner.name); if(winner.isLocal){ output.println(Torpedo.session.remote.name); }else{ output.println(Torpedo.session.remote.name); } output.println(Torpedo.session.mapName); output.println(DateUtils.now()); break; } } catch (FileNotFoundException e) { JOptionPane.showMessageDialog(new JFrame(), "Fájl hiba: toplista frissítése", "Error", JOptionPane.ERROR_MESSAGE); } catch (IOException e) { JOptionPane.showMessageDialog(new JFrame(), "Fájl hiba: toplista frissítése", "Error", JOptionPane.ERROR_MESSAGE); } finally { if (input != null) { try { input.close(); } catch (IOException e) { JOptionPane.showMessageDialog(new JFrame(), "Fájl hiba: toplista frissítése", "Error", 
JOptionPane.ERROR_MESSAGE); } } if (output != null) { output.close(); } } } } /** * This loads the toplist into the buffer */ public static void getToplist(){ BufferedReader input = null; toplistData = null; String s; for(int i = 1; i <= 10; i++){ try { input = new BufferedReader(new FileReader(toplistPath+"/top_"+i+".dat")); while((s = input.readLine()) != null){ toplistData.append(s); toplistData.append('\t'); } toplistData.append('\n'); } catch (FileNotFoundException e) { JOptionPane.showMessageDialog(new JFrame(), "Fájl hiba: toplista betöltése", "Error", JOptionPane.ERROR_MESSAGE); } catch (IOException e) { JOptionPane.showMessageDialog(new JFrame(), "Fájl hiba: toplista betöltése", "Error", JOptionPane.ERROR_MESSAGE); } } } /** * * @author http://www.rgagnon.com/javadetails/java-0106.html * */ public static class DateUtils { public static final String DATE_FORMAT_NOW = "yyyy-MM-dd HH:mm:ss"; public static String now() { Calendar cal = Calendar.getInstance(); SimpleDateFormat sdf = new SimpleDateFormat(DATE_FORMAT_NOW); return sdf.format(cal.getTime()); } } } The problem is, that it can't access any of the files. I've tried adding them to the classpath and at least six different variations of file/path handling I found online but nothing worked. Could anyone tell me what do I do wrong? Thank you.
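
    One likely culprit is that "./Torpedo/local/toplist" is resolved against the JVM's current working directory, which is rarely what you expect when the jar is started by double-clicking. A small Java sketch of resolving and creating a per-user data directory instead (the ".torpedo" location under user.home is just an assumed convention, not something from the assignment):

        import java.io.File;
        import java.io.IOException;

        public class ToplistStorage {
            // Assumption: a per-user data directory is acceptable for the toplist files.
            private static final File DIR = new File(System.getProperty("user.home"), ".torpedo/toplist");

            public static File topFile(int place) throws IOException {
                if (!DIR.exists() && !DIR.mkdirs()) {
                    throw new IOException("Could not create " + DIR.getAbsolutePath());
                }
                File f = new File(DIR, "top_" + place + ".dat");
                if (!f.exists() && !f.createNewFile()) {
                    throw new IOException("Could not create " + f.getAbsolutePath());
                }
                return f;
            }

            public static void main(String[] args) throws IOException {
                // Creates the ten slots once and prints where they actually live on this machine.
                for (int i = 1; i <= 10; i++) {
                    System.out.println(topFile(i));
                }
            }
        }

    Independently of the path question, two details in the pasted code look suspicious: getToplist() sets toplistData = null and then calls append on it, and setToplist() opens "top_"+i+",dat" with a comma where a dot was intended.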


  • Handling file upload in a non-blocking manner

    - by Kaliyug Antagonist
    The background thread is here Just to make objective clear - the user will upload a large file and must be redirected immediately to another page for proceeding different operations. But the file being large, will take time to be read from the controller's InputStream. So I unwillingly decided to fork a new Thread to handle this I/O. The code is as follows : The controller servlet /** * @see HttpServlet#doPost(HttpServletRequest request, HttpServletResponse * response) */ protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException { // TODO Auto-generated method stub System.out.println("In Controller.doPost(...)"); TempModel tempModel = new TempModel(); tempModel.uploadSegYFile(request, response); System.out.println("Forwarding to Accepted.jsp"); /*try { Thread.sleep(1000 * 60); } catch (InterruptedException e) { // TODO Auto-generated catch block e.printStackTrace(); }*/ request.getRequestDispatcher("/jsp/Accepted.jsp").forward(request, response); } The model class package com.model; import java.io.IOException; import java.util.concurrent.ExecutionException; import java.util.concurrent.Future; import javax.servlet.http.HttpServletRequest; import javax.servlet.http.HttpServletResponse; import com.utils.ProcessUtils; public class TempModel { public void uploadSegYFile(HttpServletRequest request, HttpServletResponse response) { // TODO Auto-generated method stub System.out.println("In TempModel.uploadSegYFile(...)"); /* * Trigger the upload/processing code in a thread, return immediately * and notify when the thread completes */ try { FileUploaderRunnable fileUploadRunnable = new FileUploaderRunnable( request.getInputStream()); /* * Future<FileUploaderRunnable> future = ProcessUtils.submitTask( * fileUploadRunnable, fileUploadRunnable); * * FileUploaderRunnable processed = future.get(); * * System.out.println("Is file uploaded : " + * processed.isFileUploaded()); */ Thread uploadThread = new Thread(fileUploadRunnable); uploadThread.start(); } catch (IOException e) { // TODO Auto-generated catch block e.printStackTrace(); } /* * catch (InterruptedException e) { // TODO Auto-generated catch block * e.printStackTrace(); } catch (ExecutionException e) { // TODO * Auto-generated catch block e.printStackTrace(); } */ System.out.println("Returning from TempModel.uploadSegYFile(...)"); } } The Runnable package com.model; import java.io.File; import java.io.FileInputStream; import java.io.FileNotFoundException; import java.io.FileOutputStream; import java.io.IOException; import java.io.InputStream; import java.nio.ByteBuffer; import java.nio.channels.Channels; import java.nio.channels.ReadableByteChannel; public class FileUploaderRunnable implements Runnable { private boolean isFileUploaded = false; private InputStream inputStream = null; public FileUploaderRunnable(InputStream inputStream) { // TODO Auto-generated constructor stub this.inputStream = inputStream; } public void run() { // TODO Auto-generated method stub /* Read from InputStream. 
If success, set isFileUploaded = true */ System.out.println("Starting upload in a thread"); File outputFile = new File("D:/06c01_output.seg");/* * This will be changed * later */ FileOutputStream fos; ReadableByteChannel readable = Channels.newChannel(inputStream); ByteBuffer buffer = ByteBuffer.allocate(1000000); try { fos = new FileOutputStream(outputFile); while (readable.read(buffer) != -1) { fos.write(buffer.array()); buffer.clear(); } fos.flush(); fos.close(); readable.close(); } catch (FileNotFoundException e) { // TODO Auto-generated catch block e.printStackTrace(); } catch (IOException e) { // TODO Auto-generated catch block e.printStackTrace(); } System.out.println("File upload thread completed"); } public boolean isFileUploaded() { return isFileUploaded; } } My queries/doubts : Spawning threads manually from the Servlet makes sense to me logically but scares me coding wise - the container isn't aware of these threads after all(I think so!) The current code is giving an Exception which is quite obvious - the stream is inaccessible as the doPost(...) method returns before the run() method completes : In Controller.doPost(...) In TempModel.uploadSegYFile(...) Returning from TempModel.uploadSegYFile(...) Forwarding to Accepted.jsp Starting upload in a thread Exception in thread "Thread-4" java.lang.NullPointerException at org.apache.coyote.http11.InternalInputBuffer.fill(InternalInputBuffer.java:512) at org.apache.coyote.http11.InternalInputBuffer.fill(InternalInputBuffer.java:497) at org.apache.coyote.http11.InternalInputBuffer$InputStreamInputBuffer.doRead(InternalInputBuffer.java:559) at org.apache.coyote.http11.AbstractInputBuffer.doRead(AbstractInputBuffer.java:324) at org.apache.coyote.Request.doRead(Request.java:422) at org.apache.catalina.connector.InputBuffer.realReadBytes(InputBuffer.java:287) at org.apache.tomcat.util.buf.ByteChunk.substract(ByteChunk.java:407) at org.apache.catalina.connector.InputBuffer.read(InputBuffer.java:310) at org.apache.catalina.connector.CoyoteInputStream.read(CoyoteInputStream.java:202) at java.nio.channels.Channels$ReadableByteChannelImpl.read(Unknown Source) at com.model.FileUploaderRunnable.run(FileUploaderRunnable.java:39) at java.lang.Thread.run(Unknown Source) Keeping in mind the point 1., does the use of Executor framework help me in anyway ? package com.utils; import java.util.concurrent.Future; import java.util.concurrent.ScheduledThreadPoolExecutor; public final class ProcessUtils { /* Ensure that no more than 2 uploads,processing req. are allowed */ private static final ScheduledThreadPoolExecutor threadPoolExec = new ScheduledThreadPoolExecutor( 2); public static <T> Future<T> submitTask(Runnable task, T result) { return threadPoolExec.submit(task, result); } } So how should I ensure that the user doesn't block and the stream remains accessible so that the (uploaded)file can be read from it?
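
    The stack trace matches the request InputStream being read after doPost has returned: the container owns that stream and recycles it with the request. One way out, sketched below in plain servlet Java (pool size and temp-file naming are arbitrary), is to drain the body to a temp file while the request is still open and hand the file, not the stream, to a background executor; the upload itself still happens inside doPost, but the slow SEG-Y processing no longer does:

        import java.io.*;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class UploadServlet extends HttpServlet {
            private final ExecutorService worker = Executors.newFixedThreadPool(2);

            protected void doPost(HttpServletRequest request, HttpServletResponse response)
                    throws ServletException, IOException {
                // 1. Copy the body to a temp file while the request is still open.
                final File temp = File.createTempFile("upload-", ".seg");
                try (InputStream in = request.getInputStream();
                     OutputStream out = new FileOutputStream(temp)) {
                    byte[] buf = new byte[64 * 1024];
                    int read;
                    while ((read = in.read(buf)) != -1) {
                        out.write(buf, 0, read);
                    }
                }
                // 2. Defer the slow processing; it works on the temp file, not the request.
                worker.submit(new Runnable() {
                    public void run() {
                        process(temp);   // hypothetical long-running step
                    }
                });
                request.getRequestDispatcher("/jsp/Accepted.jsp").forward(request, response);
            }

            private void process(File file) {
                // placeholder for the real SEG-Y handling
            }
        }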


  • Why does my program not read full files?

    - by user593395
    I have written code in Java to read the content of a file. But it is working for small line of file only not for more than 1000 line of file. Please tell me me what error I have made in the below program. program: import java.io.DataInputStream; import java.io.DataOutputStream; import java.io.File; import java.io.FileInputStream; import java.io.FileNotFoundException; import java.io.FileOutputStream; import java.util.regex.Matcher; import java.util.regex.Pattern; public class aaru { public static void main(String args[]) throws FileNotFoundException { File sourceFile = new File("E:\\parser\\parse3.txt"); File destinationFile = new File("E:\\parser\\new.txt"); FileInputStream fileIn = new FileInputStream(sourceFile); FileOutputStream fileOut = new FileOutputStream(destinationFile); DataInputStream dataIn = new DataInputStream(fileIn); DataOutputStream dataOut = new DataOutputStream(fileOut); String str=""; String[] st; String sub[]=null; String word=""; String contents=""; String total=""; String stri="";; try { while((contents=dataIn.readLine())!=null) { total = contents.replaceAll(",",""); String str1=total.replaceAll("--",""); String str2=str1.replaceAll(";","" ); String str3=str2.replaceAll("&","" ); String str4=str3.replaceAll("^","" ); String str5=str4.replaceAll("#","" ); String str6=str5.replaceAll("!","" ); String str7=str6.replaceAll("/","" ); String str8=str7.replaceAll(":","" ); String str9=str8.replaceAll("]","" ); String str10=str9.replaceAll("\\?",""); String str11=str10.replaceAll("\\*",""); String str12=str11.replaceAll("\\'",""); Pattern pattern = Pattern.compile("\\s+", Pattern.CASE_INSENSITIVE | Pattern.DOTALL | Pattern.MULTILINE); Matcher matcher = pattern.matcher(str12); //boolean check = matcher.find(); String result=str12; Pattern p=Pattern.compile("^www\\.|\\@"); Matcher m=p.matcher(result); stri = m.replaceAll(" "); int i; int j; st=stri.split("\\."); for(i=0;i<st.length;i++) { st[i]=st[i].trim(); /*if(st[i].startsWith(" ")) st[i]=st[i].substring(1,st[i].length);*/ sub=st[i].split(" "); if(sub.length>1) { for(j=0;j<sub.length-1;j++) { word = word+sub[j]+","+sub[j+1]+"\r\n"; } } else { word = word+st[i]+"\r\n"; } } } System.out.println(word); dataOut.writeBytes(word+"\r\n"); fileIn.close(); fileOut.close(); dataIn.close(); dataOut.close(); } catch(Exception e) { System.out.print(e); } } }
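
    Two things stand out in the posted program: DataInputStream.readLine is deprecated and byte-oriented, and word is grown by string concatenation inside the loop, which gets very slow once the file has many lines. A possible reshape (same input and output paths as the question; the single regex is only an approximation of the original chain of replaceAll calls) that writes each line's result as it is produced instead of accumulating one huge String:

        import java.io.*;

        public class CleanLines {
            public static void main(String[] args) throws IOException {
                BufferedReader in = new BufferedReader(new FileReader("E:\\parser\\parse3.txt"));
                BufferedWriter out = new BufferedWriter(new FileWriter("E:\\parser\\new.txt"));
                try {
                    String line;
                    while ((line = in.readLine()) != null) {
                        // roughly the same character stripping as the original, collapsed into one regex
                        String cleaned = line.replaceAll("[,;&^#!/:\\]?*']|--|\\bwww\\.|@", " ");
                        for (String sentence : cleaned.split("\\.")) {
                            String[] words = sentence.trim().split("\\s+");
                            StringBuilder pairs = new StringBuilder();
                            if (words.length > 1) {
                                for (int j = 0; j < words.length - 1; j++) {
                                    pairs.append(words[j]).append(',').append(words[j + 1]).append("\r\n");
                                }
                            } else if (!words[0].isEmpty()) {
                                pairs.append(words[0]).append("\r\n");
                            }
                            out.write(pairs.toString());   // write as we go
                        }
                    }
                } finally {
                    in.close();
                    out.close();
                }
            }
        }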


  • Write asynchronously to file in perl

    - by Stefhen
    Basically I would like to: Read a large amount of data from the network into an array into memory. Asynchronously write this array data, running it thru bzip2 before it hits the disk. repeat.. Is this possible? If this is possible, I know that I will have to somehow read the next pass of data into a different array as the AIO docs say that this array must not be altered before the async write is complete. I would like to background all of my writes to disk in order as the bzip2 pass is going to take much longer than the network read. Is this doable? Below is a simple example of what I think is needed, but this just reads a file into array @a for testing. use warnings; use strict; use EV; use IO::AIO; use Compress::Bzip2; use FileHandle; use Fcntl; my @a; print "loading to array...\n"; while(<>) { $a[$. - 1] = $_; } print "array loaded...\n"; my $aio_w = EV::io IO::AIO::poll_fileno, EV::WRITE, \&IO::AIO::poll_cb; aio_open "./out", O_WRONLY || O_NONBLOCK, 0, sub { my $fh = shift or die "error while opening: $!\n"; aio_write $fh, undef, undef, $a, -1, sub { $_[0] > 0 or die "error: $!\n"; EV::unloop; }; }; EV::loop EV::LOOP_NONBLOCK;


  • In Haskell, I want to read a file and then write to it. Do I need strictness annotation?

    - by Steve
    Hi, Still quite new to Haskell.. I want to read the contents of a file, do something with it possibly involving IO (using putStrLn for now) and then write new contents to the same file. I came up with: doit :: String -> IO () doit file = do contents <- withFile tagfile ReadMode $ \h -> hGetContents h putStrLn contents withFile tagfile WriteMode $ \h -> hPutStrLn h "new content" However this doesn't work due to laziness. The file contents are not printed. I found this post which explains it well. The solution proposed there is to include putStrLn within the withFile: doit :: String -> IO () doit file = do withFile tagfile ReadMode $ \h -> do contents <- hGetContents h putStrLn contents withFile tagfile WriteMode $ \h -> hPutStrLn h "new content" This works, but it's not what I want to do. The operation in I will eventually replace putStrLn might be long, I don't want to keep the file open the whole time. In general I just want to be able to get the file content out and then close it before working with that content. The solution I came up with is the following: doit :: String -> IO () doit file = do c <- newIORef "" withFile tagfile ReadMode $ \h -> do a <- hGetContents h writeIORef c $! a d <- readIORef c putStrLn d withFile tagfile WriteMode $ \h -> hPutStrLn h "Test" However, I find this long and a bit obfuscated. I don't think I should need an IORef just to get a value out, but I needed "place" to put the file contents. Also, it still didn't work without the strictness annotation $! for writeIORef. I guess IORefs are not strict by nature? Can anyone recommend a better, shorter way to do this while keeping my desired semantics? Thanks!


  • Where to put a text file I want to use in Eclipse?

    - by Jayomat
    hi there, I need to read a text file when I start my program. I'm using eclipse and started a new java project. In my project folder I got the "src" folder and the standard "JRE System Library"... I just don't know where to put the text file. I cannot use a "hard coded" path because the text file needs to be included with my app... I also tried to create a new folder like "Files" and use the pathe "Files/staedteliste.txt".. doesn't work too... I use the following code to read the file, but I get this error: Error:java.io.FileNotFoundException:staedteliste.txt(No such file or directory) import java.io.BufferedReader; import java.io.FileReader; import java.io.IOException; import java.util.ArrayList; public class Test { ArrayList<String[]> values; public static void main(String[] args) { // TODO Auto-generated method stub loadList(); } public static void loadList() { BufferedReader reader; String zeile = null; try { reader = new BufferedReader(new FileReader("staedteliste.txt")); zeile = reader.readLine(); ArrayList<String[]> values = new ArrayList<String[]>(); while (zeile != null) { values.add(zeile.split(";")); zeile = reader.readLine(); } System.out.println(values.size()); System.out.println(zeile); } catch (IOException e) { System.err.println("Error :"+e); } } } thx for help!
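
    If the file is meant to ship with the application, the usual approach is to put it inside the source folder so it ends up on the classpath (and later inside the jar) and load it as a resource instead of through a filesystem path. A sketch assuming staedteliste.txt sits directly under src and is UTF-8 encoded:

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.InputStreamReader;
        import java.util.ArrayList;

        public class CityListLoader {
            public static ArrayList<String[]> loadList() throws IOException {
                ArrayList<String[]> values = new ArrayList<String[]>();
                // Looks on the classpath, so it works both from Eclipse and from inside a jar.
                InputStream in = CityListLoader.class.getResourceAsStream("/staedteliste.txt");
                if (in == null) {
                    throw new IOException("staedteliste.txt not found on the classpath");
                }
                BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
                try {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        values.add(line.split(";"));
                    }
                } finally {
                    reader.close();
                }
                return values;
            }

            public static void main(String[] args) throws IOException {
                System.out.println(loadList().size() + " entries loaded");
            }
        }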


  • System.IO.FileLoadException

    - by Raj G
    Hi All, I have got this error when using Enterprise Library 3.1 May 2007 version. We are developing a product and have a common lib directory beneath the Subversion Trunk directory <\Trunk\Lib\ into which we put all the third party DLLs. Inside this we have Microsoft\EnterpriseLibrary\v3.1 in which we have copied all the dlls from \Program Files\Microsoft Enterprise Library May2007\bin. Everything was working properly until one of the developers installed the source code on this machine. There were some dlls copied at the end of the source code installation and once that was done, he is not able to run the project anymore. He always gets this error 'Microsoft.Practices.EnterpriseLibrary.Data, Version=3.1.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)' What is the problem here? I thought that when the source code was installed it was just supposed to build everything and copy in the bin directory within the source code parent directory. Also we have copied the Dlls from Microsoft Enterprise Library May 2007\bin directory into our product development directory and references into our project with a copylocal flag set to true. Can anyone help me out here RK


  • ASP.Net Application Trust Medium File IO Outside Virtual Directory

    - by Trey Gramann
    I am trying to determine how suicidal this is... I have a hosting environment where a custom ASP.Net CMS application needs to access the files in the root folder of a website, even though the application itself lives in a virtual folder so it can be shared across many sites. I can modify the Medium trust on the server, and came up with this... <IPermission class="FileIOPermission" version="1" Read="$AppDir$;$AppDir$\.." Write="$AppDir$;$AppDir$\.." Append="$AppDir$;$AppDir$\.." PathDiscovery="$AppDir$;$AppDir$\.."/> Oddly enough, it works. Yes, I understand it is doing this for all the apps. I am a bit at a loss as to easy ways to test what else is being exposed. Feels dangerous. Opinions?


  • Java without gc - io

    - by Dan
    Hi Guys I would like to run a Java program with garbage collection switched off. Managing memory in my own code is not so difficult. However the program needs quite a lot of I/O. Is there any way (short of using JNI for all I/O operations) that I could achieve this using pure Java? Thanks Daniel
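
    Whatever the collector ends up doing, steady-state I/O can be kept almost allocation-free by preallocating a direct buffer and reusing it with NIO channels, so the I/O path itself stops feeding the garbage collector. A sketch of that pattern (file names are placeholders):

        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.nio.ByteBuffer;
        import java.nio.channels.FileChannel;

        public class NoAllocCopy {
            public static void main(String[] args) throws IOException {
                // One direct buffer, allocated up front and reused for every read/write.
                ByteBuffer buffer = ByteBuffer.allocateDirect(1 << 20);
                try (FileChannel in = new FileInputStream("input.bin").getChannel();
                     FileChannel out = new FileOutputStream("output.bin").getChannel()) {
                    while (in.read(buffer) != -1) {
                        buffer.flip();
                        while (buffer.hasRemaining()) {
                            out.write(buffer);
                        }
                        buffer.clear();
                    }
                }
            }
        }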


  • Why does System.Web.Hosting.ApplicationHost.CreateApplicationHost throw System.IO.FileNotFoundException?

    - by Scott Langham
    I saw something about needing to have the assembly available for the type of the first argument passed to the function. I think it is, but I can't figure out what I am missing. This code is in a service. I was running the service under the 'NETWORK SERVICE' user account; when I changed the account to the one for the session I was logged on with, it worked OK. But what's the difference, and how can I get it to work for the NETWORK SERVICE user?


  • VB.Net IO performance

    - by CFP
    Having read this page, I can't believe that VB.Net has such terrible performance when it comes to I/O. Is this still true today? How does the .Net Framework 2.0 perform in terms of I/O (that's the version I'm targeting)?


  • On MacOSX, in a C++ program, what guarantees can I have on file IO

    - by anon
    I am on MacOSX. I am writing a multi-threaded program. One thread does logging. The non-logging threads may crash at any time. What conventions should I adopt in the logger, and what guarantees can I have? I would prefer a solution where, even if I crash during part of a write, previous writes still go to disk, and when reading back the log I can figure out "ah, I wrote 100 complete entries, then I crashed on the 101st". Thanks!
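
    The question is about C++ on Mac OS X, but the record-format side of it is language-neutral: write each entry as one self-delimiting record (a newline-terminated line is the simplest), force it to the OS before carrying on, and on recovery treat a trailing partial line as the write that was interrupted. An analogous Java sketch of that convention (class name and charset are assumptions); how far force() actually pushes the data toward the platters still depends on the OS and the drive:

        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.nio.ByteBuffer;
        import java.nio.channels.FileChannel;
        import java.nio.charset.Charset;

        public class CrashSafeLog {
            private final FileChannel channel;

            public CrashSafeLog(String path) throws IOException {
                channel = new FileOutputStream(path, true).getChannel();   // append mode
            }

            public void append(String entry) throws IOException {
                // One complete record per call; the trailing '\n' marks it as finished.
                ByteBuffer record = ByteBuffer.wrap((entry + "\n").getBytes(Charset.forName("UTF-8")));
                while (record.hasRemaining()) {
                    channel.write(record);
                }
                channel.force(false);   // push the data to the OS/device before returning
            }

            public void close() throws IOException {
                channel.close();
            }
        }

    On the read side, a last line with no trailing newline is exactly the "crashed on the 101st" case and can be discarded or flagged.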


  • Logging hurts MySQL performance - but, why?

    - by jimbo
    I'm quite surprised that I can't see an answer to this anywhere on the site already, nor in the MySQL documentation (section 5.2 seems to have logging otherwise well covered!) If I enable binlogs, I see a small performance hit (subjectively), which is to be expected with a little extra IO -- but when I enable a general query log, I see an enormous performance hit (double the time to run queries, or worse), way in excess of what I see with binlogs. Of course I'm now logging every SELECT as well as every UPDATE/INSERT, but, other daemons record their every request (Apache, Exim) without grinding to a halt. Am I just seeing the effects of being close to a performance "tipping point" when it comes to IO, or is there something fundamentally difficult about logging queries that causes this to happen? I'd love to be able to log all queries to make development easier, but I can't justify the kind of hardware it feels like we'd need to get performance back up with general query logging on. I do, of course, log slow queries, and there's negligible improvement in general usage if I disable this. (All of this is on Ubuntu 10.04 LTS, MySQLd 5.1.49, but research suggests this is a fairly universal issue)


  • getline at a certain line for file IO

    - by BSchlinker
    Is there any way to use getline to read a specific line within a file? For instance, to immediately read line #20? It seems inefficient to do any kind of loop that reads and discards the earlier lines. I know about fseek, but there is no guarantee that the records will be the same length on each line. I imagine this is simply what is required in order to find lines: after all, to know when the end of a line has been reached, it needs to find the line-break character, so it makes sense for it to need to read each line. Just wondering if there is any quicker method.
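
    One pass is indeed unavoidable with variable-length lines, but it only has to be paid once: record the byte offset at which each line starts, and every later access is a single seek. A Java sketch of that index-then-seek idea (file name is a placeholder; RandomAccessFile.readLine is byte-oriented, which is fine for an illustration):

        import java.io.IOException;
        import java.io.RandomAccessFile;
        import java.util.ArrayList;
        import java.util.List;

        public class LineIndex {
            public static void main(String[] args) throws IOException {
                RandomAccessFile file = new RandomAccessFile("records.txt", "r");   // placeholder name
                List<Long> offsets = new ArrayList<Long>();
                offsets.add(0L);
                // One sequential pass to remember where each line starts.
                while (file.readLine() != null) {
                    offsets.add(file.getFilePointer());
                }
                // Afterwards, line 20 (1-based) is a single seek away.
                int wanted = 20;
                if (wanted <= offsets.size() - 1) {
                    file.seek(offsets.get(wanted - 1));
                    System.out.println(file.readLine());
                }
                file.close();
            }
        }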


  • error with io stream

    - by Alexander
    What is the problem with the last two statements in the code?

        #include <iostream>
        using namespace std;

        int main()
        {
            cout << "2 + 4 = " << 2 + 4 << endl;
            cout << "2 * 4 = " << 2 * 4 << endl;
            cout << "2 | 4 = " << 2 | 4 << endl;
            cout << "2 & 4 = " << 2 & 4 << endl;

    What should I do to fix this?


  • Non blocking IO call from Django controller from a Windows service

    - by Anders
    Hi all, I have a CherryPy server with a Django application running as a Windows service. Inside a controller I need to make a call to wmic; the problem is that, so far, I have only been able to implement it as a blocking operation. Does anyone have any recommendation for a non-blocking approach, so that at least more than one person at a time can access this controller and extract information from wmic? Thanks in advance, Anders


  • Haskell IO russian symbols

    - by Anton
    Hi. I am trying to process a file written with Russian characters. When I read the text and then write it back to a file, I get something like: "\160\192\231\229\240\225\224\233\228\230\224\237" How can I get normal characters? Thanks


  • C# Asynchronous Network IO and OutOfMemoryException

    - by The.Anti.9
    I'm working on a client/server application in C#, and I need to get Asynchronous sockets working so I can handle multiple connections at once. Technically it works the way it is now, but I get an OutOfMemoryException after about 3 minutes of running. MSDN says to use a WaitHandler to do WaitOne() after the socket.BeginAccept(), but it doesn't actually let me do that. When I try to do that in the code it says WaitHandler is an abstract class or interface, and I can't instantiate it. I thought maybe Id try a static reference, but it doesnt have teh WaitOne() method, just WaitAll() and WaitAny(). The main problem is that in the docs it doesn't give a full code snippet, so you can't actually see what their "wait handler" is coming from. its just a variable called allDone, which also has a Reset() method in the snippet, which a waithandler doesn't have. After digging around in their docs, I found some related thing about an AutoResetEvent in the Threading namespace. It has a WaitOne() and a Reset() method. So I tried that around the while(true) { ... socket.BeginAccept( ... ); ... }. Unfortunately this makes it only take one connection at a time. So I'm not really sure where to go. Here's my code: class ServerRunner { private Byte[] data = new Byte[2048]; private int size = 2048; private Socket server; static AutoResetEvent allDone = new AutoResetEvent(false); public ServerRunner() { server = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp); IPEndPoint iep = new IPEndPoint(IPAddress.Any, 33333); server.Bind(iep); Console.WriteLine("Server initialized.."); } public void Run() { server.Listen(100); Console.WriteLine("Listening..."); while (true) { //allDone.Reset(); server.BeginAccept(new AsyncCallback(AcceptCon), server); //allDone.WaitOne(); } } void AcceptCon(IAsyncResult iar) { Socket oldserver = (Socket)iar.AsyncState; Socket client = oldserver.EndAccept(iar); Console.WriteLine(client.RemoteEndPoint.ToString() + " connected"); byte[] message = Encoding.ASCII.GetBytes("Welcome"); client.BeginSend(message, 0, message.Length, SocketFlags.None, new AsyncCallback(SendData), client); } void SendData(IAsyncResult iar) { Socket client = (Socket)iar.AsyncState; int sent = client.EndSend(iar); client.BeginReceive(data, 0, size, SocketFlags.None, new AsyncCallback(ReceiveData), client); } void ReceiveData(IAsyncResult iar) { Socket client = (Socket)iar.AsyncState; int recv = client.EndReceive(iar); if (recv == 0) { client.Close(); server.BeginAccept(new AsyncCallback(AcceptCon), server); return; } string receivedData = Encoding.ASCII.GetString(data, 0, recv); //process received data here byte[] message2 = Encoding.ASCII.GetBytes("reply"); client.BeginSend(message2, 0, message2.Length, SocketFlags.None, new AsyncCallback(SendData), client); } }

