Search Results

Search found 40479 results on 1620 pages for 'binary files'.

Page 681/1620

  • Make two servers talk to each other

    - by Maksim
    I have an application written in GWT and hosted on Google AppEngine/Java. In this application the user will have an option to upload video/audio/text files to the server. Those files could be big, up to 1 GB or so, and because GAE/J does not support large files I have to use another server to store them. This would be easy to implement if there were no cross-domain security restrictions in browsers. So what I'm thinking is to make the GAE server talk to my server (Glassfish, or any other Java server if needed) to tell it the URL of the file and, if possible, send the status of the uploaded file (what percentage has been uploaded) so I can show the status on the client's screen. Here is what I'm thinking of doing: when the user loads the GWT page that is hosted on GAE/J, he/she will upload the file to my server, then my server will send a response back to GAE, and GAE will send a response to the client. If this scenario is possible, what would be the best way to implement the GAE-to-Glassfish conversation?
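    A minimal sketch of one way to wire up the GAE-to-Glassfish conversation (an illustration, not a prescribed design): have the storage server expose a small status endpoint and poll it from GAE with java.net.HttpURLConnection, which App Engine backs with its URLFetch service. The host name, the /uploadStatus path and the plain-text percentage response are invented for the example.
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Hypothetical GAE/J helper: asks the storage server (Glassfish or any other
    // Java server) for the upload status of a file so the GWT client can poll it.
    public class UploadStatusClient {
        public String fetchStatus(String fileId) throws IOException {
            URL url = new URL("http://storage.example.com/uploadStatus?fileId=" + fileId);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(10000);
            conn.setReadTimeout(10000);
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                return in.readLine(); // e.g. "42", meaning 42% uploaded
            } finally {
                conn.disconnect();
            }
        }
    }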

    Read the article

  • Can't get my head around background workers in C#

    - by Connel
    I have written an application that syncs two folders together. The problem with the program is that it stops responding whilst copying files. A quick search of Stack Overflow told me I need to use something called a background worker. I have read a few pages on the net about this but find it really hard to understand, as I'm pretty new to programming. Below is the code for my application - how can I simply put all of the File.Copy(....) calls into their own background worker (if that's even how it works)? Below are the button click event that runs the sub-procedure, and the sub-procedure in which I wish to use a background worker for all the File.Copy lines.
    Button event:
    protected virtual void OnBtnSyncClicked (object sender, System.EventArgs e)
    {
        //sets running boolean to true
        booRunning = true;
        //sets progress bar to 0
        prgProgressBar.Fraction = 0;
        //resets values used by progressbar
        dblCurrentStatus = 0;
        dblFolderSize = 0;
        //tests if user has entered the same folder for both target and destination
        if (fchDestination.CurrentFolder == fchTarget.CurrentFolder)
        {
            //creates message box
            MessageDialog msdSame = new MessageDialog(this, DialogFlags.Modal, MessageType.Error, ButtonsType.Close, "You cannot sync two folders that are the same");
            //sets message box title
            msdSame.Title = "Error";
            //sets response type
            ResponseType response = (ResponseType) msdSame.Run();
            //if user clicks on close button or closes window then close message box
            if (response == ResponseType.Close || response == ResponseType.DeleteEvent)
            {
                msdSame.Destroy();
            }
            return;
        }
        //tests if user has entered a target folder that is an extension of the destination folder
        //or if user has entered a destination folder that is an extension of the target folder
        if (fchTarget.CurrentFolder.StartsWith(fchDestination.CurrentFolder) || fchDestination.CurrentFolder.StartsWith(fchTarget.CurrentFolder))
        {
            //creates message box
            MessageDialog msdContains = new MessageDialog(this, DialogFlags.Modal, MessageType.Error, ButtonsType.Close, "You cannot sync a folder with one of its parent folders");
            //sets message box title
            msdContains.Title = "Error";
            //sets response type and runs message box
            ResponseType response = (ResponseType) msdContains.Run();
            //if user clicks on close button or closes window then close message box
            if (response == ResponseType.Close || response == ResponseType.DeleteEvent)
            {
                msdContains.Destroy();
            }
            return;
        }
        //gets folder size of target folder
        FileSizeOfTarget(fchTarget.CurrentFolder);
        //gets folder size of destination folder
        FileSizeOfDestination(fchDestination.CurrentFolder);
        //runs SyncTarget procedure
        SyncTarget(fchTarget.CurrentFolder);
        //runs SyncDestination procedure
        SyncDestination(fchDestination.CurrentFolder);
        //informs user process is complete
        prgProgressBar.Text = "Finished";
        //sets running bool to false
        booRunning = false;
    }
    Sync sub-procedure:
    protected void SyncTarget (string strCurrentDirectory)
    {
        //string array of all the directories in directory
        string[] staAllDirectories = Directory.GetDirectories(strCurrentDirectory);
        //string array of all the files in directory
        string[] staAllFiles = Directory.GetFiles(strCurrentDirectory);
        //loop over each file in directory
        foreach (string strFile in staAllFiles)
        {
            //string of just the file's name and not its path
            string strFileName = System.IO.Path.GetFileName(strFile);
            //string containing directory in target folder
            string strDirectoryInsideTarget = System.IO.Path.GetDirectoryName(strFile).Substring(fchTarget.CurrentFolder.Length);
            //inform user as to what file is being copied
            prgProgressBar.Text = "Syncing " + strFile;
            //tests if file does not exist in destination folder
            if (!File.Exists(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName))
            {
                //if file does not exist copy it to destination folder, the true below means overwrite if file already exists
                File.Copy(strFile, fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName, true);
            }
            //tests if file does exist in destination folder
            if (File.Exists(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName))
            {
                //long (number) that contains date of last write time of target file
                long lngTargetFileDate = File.GetLastWriteTime(strFile).ToFileTime();
                //long (number) that contains date of last write time of destination file
                long lngDestinationFileDate = File.GetLastWriteTime(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName).ToFileTime();
                //tests if target file is newer than destination file
                if (lngTargetFileDate > lngDestinationFileDate)
                {
                    //if it is newer then copy file from target folder to destination folder
                    File.Copy(strFile, fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget + "/" + strFileName, true);
                }
            }
            //gets current file size
            FileInfo FileSize = new FileInfo(strFile);
            //sets file's filesize to dblCurrentStatus and adds it to current total of files
            dblCurrentStatus = dblCurrentStatus + FileSize.Length;
            double dblPercentage = dblCurrentStatus / dblFolderSize;
            prgProgressBar.Fraction = dblPercentage;
        }
        //loop over each folder in target folder
        foreach (string strDirectory in staAllDirectories)
        {
            //string containing directories inside target folder but not any higher directories
            string strDirectoryInsideTarget = strDirectory.Substring(fchTarget.CurrentFolder.Length);
            //tests if directory does not exist inside destination folder
            if (!Directory.Exists(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget))
            {
                //if directory does not exist create it
                Directory.CreateDirectory(fchDestination.CurrentFolder + "/" + strDirectoryInsideTarget);
            }
            //run sync on all files in directory
            SyncTarget(strDirectory);
        }
    }
    Any help will be greatly appreciated as after this the program will pretty much be finished :D

    Read the article

  • Apache RewriteRule with a RewriteMap variable substitution for the VAL argument to environment variable

    - by Eric
    I have an Apache server that serves up binary files to an application (not a browser). The application making the request wants the HTTP Content-MD5 header in HEX format. The default and only option within Apache is Base64: if I add "ContentDigest on" to my VirtualHost, I get this header in Base64. So I wrote a Perl script, md5digesthex.pl, that gives me exactly what I want, MD5 in HEX format, but I'm struggling with the RewriteRule to get my server to send the result. Here is my current rewrite recipe:
    RewriteEngine on
    RewriteMap md5inhex prg:/www/download/md5digesthex.pl
    RewriteCond %{REQUEST_URI} ^/download/(.*)
    RewriteRule ^(.*) %{REQUEST_URI} [E=HASH:${md5inhex:$1}]
    Header set Content-MD5 "%{HASH}e" env=HASH
    The problem is that I can't seem to set the HASH environment variable based on the output of the md5inhex map function. It appears this behavior is not supported, and I'm at a loss as to how to formulate this...

    Read the article

  • Parsing plain Win32 PE File (Exe/DLL) in .NET

    - by Usman
    I need to parse a plain Win32 DLL/EXE, get all imports and exports from it, and show them on the console or in a GUI (say WinForms). Is it possible to parse a Win32 DLL/EXE in C#/.NET, read its export table and import table, and get managed types from it? It is an unmanaged PE (.NET doesn't allow you to convert unmanaged PE files to managed .NET assemblies; it only generates managed assemblies from COM libraries). So how do I parse the export and import tables of PE files and get all the methods (signatures) from them in managed form? (e.g. if char* is an argument, it should be displayed as IntPtr)

    Read the article

  • How to apply Ubuntu patch for rpcbind?

    - by Linda
    I am currently running Ubuntu 12.04.1 Desktop and would like to apply this patch: https://launchpad.net/ubuntu/+source/rpcbind/0.2.0-7ubuntu1.2 My current rpcbind version is here:
    # aptitude show rpcbind
    Package: rpcbind
    State: installed
    Automatically installed: yes
    Version: 0.2.0-7ubuntu1.1
    As you can see on the patch page, I'd like to patch to this version:
    Version: 0.2.0-7ubuntu1.2
    However, based on the downloadable files on the patch page, I'm not sure where to start.
    (directory structure of the original rpcbind source)
    # find rpcbind-0.2.0 -type d
    rpcbind-0.2.0
    rpcbind-0.2.0/src
    rpcbind-0.2.0/man
    (directory structure of the patch download)
    # find debian -type d
    debian
    debian/patches
    debian/source
    [EDIT] I've figured out how to apply the individual patches in the patches directory:
    # patch -p1 < ../debian/patches/01-usage-fix.patch
    patching file src/rpcbind.c
    (and so on for each patch file)
    ... but I'm not sure what to do with the patch-related files in the root debian folder. Any help here? Thanks in advance, Linda

    Read the article

  • git: Is it possible to save the packed objects of a dry run and push them later?

    - by shovavnik
    I'm trying to push a bunch of commits that contain a lot of code and a few thousand MP3 and PDF files besides (ranging from 5-40 MB each). Git successfully packs the objects:
    C:\MyProject> git push
    Counting objects: 7582, done.
    Delta compression using up to 2 threads.
    Compressing objects: 100% (7510/7510), done.
    But it fails to send the push for some as yet unknown reason. The problem is that it takes a very long time to repack the files (I'm on a battery-powered laptop and it took about 20 minutes to pack). So I guess my question can be phrased thus: Is it possible to save the packed objects created in a dry run? Once saved, is it possible to push those packed objects and avoid repacking? I looked it up in the git manual and elsewhere and couldn't find anything conclusive. Any help or pointers are appreciated.

    Read the article

  • Visualize irregular data in VTK

    - by aaron berry
    I have irregular data: the x dimension is 384, the y dimension 256 and the z dimension 64. These coordinates are stored in 3 separate binary files, and I have a data file holding a data value for each of these points. I want to know how such data can be represented so it is easily visualized in VTK. Until now we were using AVS, which has .fld files that can read such data easily. I don't know how to do it in VTK. I would appreciate any pointers in this direction.

    Read the article

  • How to publish an InfoPath form (which is full trust and has code-behind code) in SharePoint?

    - by JanardhanReddy
    Hi all, I created an InfoPath form which has C# code, and I set the security option to 'full trust' so it can access the InfoPath object model; it should open in the browser. Finally I published the InfoPath form to a SharePoint site (as admin-approved). But when I try to open it, it does not open and gives an error: 'InfoPath cannot create a new or blank form. InfoPath cannot open the form. To fix this problem, contact your system administrator.' The error details give the following message: 'The form template is trying to access files and settings on your computer. InfoPath cannot grant access to these files and settings because the form template is not fully trusted. For a form to run with full trust, it must be installed or digitally signed with a certificate.' Please give me a solution.

    Read the article

  • Best Design pattern for social media file transfer

    - by Onema
    Our system would like our clients to link their accounts with different social media sites like YouTube, Vimeo, Facebook, MySpace and so on. One of the benefits we would like to give the user is to transfer, update and delete files they have uploaded to our sites and transfer them to the social media sites mentioned above. These files could be videos, images or audio. We started thinking about using a strategy pattern, as all of these sites share a common process (authentication, connection, use the API to transfer/edit/delete the file), but we soon realized that it may not work, as we may want to use some of the extended functionality that is specific to each service (e.g. associate a YouTube video with a channel, or upload images to a specific album on Facebook, and much, much more...). My question is: what would be the best structural design pattern to use for this scenario?
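    One hedged sketch of a structure that keeps the shared flow in a strategy while leaving room for service-specific extras (all type and method names below are invented for illustration): a core provider interface plus small optional "capability" interfaces that concrete providers may also implement, so callers can probe for the extras without bloating the common contract.
    // Core strategy: the process every social media site shares.
    public interface MediaProvider {
        void authenticate(String accountToken);
        String upload(byte[] content, String fileName); // returns the remote id
        void update(String remoteId, byte[] newContent);
        void delete(String remoteId);
    }

    // Optional, service-specific extras live in their own small interfaces.
    interface SupportsChannels {
        void assignToChannel(String remoteId, String channelId);
    }

    interface SupportsAlbums {
        void addToAlbum(String remoteId, String albumId);
    }

    // A concrete provider combines the core strategy with whatever extras it offers.
    class HypotheticalVideoProvider implements MediaProvider, SupportsChannels {
        @Override public void authenticate(String accountToken) { /* OAuth handshake */ }
        @Override public String upload(byte[] content, String fileName) { return "vid-123"; }
        @Override public void update(String remoteId, byte[] newContent) { /* re-upload */ }
        @Override public void delete(String remoteId) { /* API delete call */ }
        @Override public void assignToChannel(String remoteId, String channelId) { /* extra */ }
    }

    // Client code probes for a capability instead of forcing it into MediaProvider.
    class TransferService {
        void publish(MediaProvider provider, byte[] data, String name, String channelId) {
            String id = provider.upload(data, name);
            if (provider instanceof SupportsChannels) {
                ((SupportsChannels) provider).assignToChannel(id, channelId);
            }
        }
    }
    Whether this reads better than one fat interface with no-op defaults is a design choice; the capability interfaces simply make the per-service extended functionality explicit.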

    Read the article

  • Reporting services - custom library is not working after installing report on production Server

    - by niao
    Greetings, I created a report which uses a custom library created by me. I've copied these libraries to the following folders:
    c:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\ReportServer\bin\
    c:\Program Files\Microsoft Visual Studio 8\Common7\IDE\PrivateAssemblies\
    Everything works fine when I run the report using Visual Studio. When I install it on the production server (where these DLLs were also copied), the following error is returned:
    Failed to load expression host assembly. Details: The type initializer for 'MyParserForReportingServices.MyParser' threw an exception. (rsErrorLoadingExprHostAssembly)
    Can someone please help me?

    Read the article

  • JDBC query to Oracle

    - by Harish
    Hi, we are planning to migrate our DB to Oracle. We need to manually check that each piece of embedded SQL works in Oracle, as a few may follow different SQL rules. Now my need is very simple. I need to browse through files which may contain queries like this:
    String sql = "select * from test where name="+test+"and age="+age;
    There are nearly 1000 files, and each file has different kinds of queries like this, from which I have to pluck out the query alone, which I have done through a Unix script. But I need to convert these Java-based queries to Oracle-compatible queries, i.e.
    select * from test where name="name" and age="age"
    Basically I need to check the syntax of the queries this way. I have seen something like this in TOAD, but I have more than 1000 files and can't manually change each one. Is there a way? I will explain more if the question is not clear.
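    If the goal is just to find out which extracted queries Oracle will reject, one hedged option (a sketch, not the only approach) is to let Oracle's own parser judge each statement by wrapping it in EXPLAIN PLAN, which parses the SQL without executing it. This covers queries and DML but not DDL, and it assumes the placeholder values have already been substituted; the file name and connection details below are made up.
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.sql.Statement;

    // Reads one extracted query per line from queries.txt and reports the ones
    // Oracle's parser rejects.
    public class OracleSyntaxCheck {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "password");
                 Statement stmt = con.createStatement();
                 BufferedReader in = new BufferedReader(new FileReader("queries.txt"))) {
                String sql;
                while ((sql = in.readLine()) != null) {
                    try {
                        stmt.execute("EXPLAIN PLAN FOR " + sql); // parses, does not run
                    } catch (SQLException e) {
                        System.out.println("FAILS: " + sql + " -> " + e.getMessage());
                    }
                }
            }
        }
    }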

    Read the article

  • Accessing a network shared folder with a username and password string in VB.NET

    - by Irene
    I am using the following code to read the details from a network folder which is restricted to only one user:
    Shell("net use q: \\serveryname\foldername /user:admin pwrd", AppWinStyle.Hide, True, 10000)
    Process.Start(path)
    Shell("net use q: /delete")
    When I run this to open any PDF or JPG or any other file except Word/Excel/PowerPoint, everything works fine. The problem comes only when I access a Word file. In step one, I am granting permission to access the Word file. In step two, the Word file is opened. In step three, I am deleting the q: drive. The problem is that the Word file is still open, so I get a DOS window saying that "some connections are still connected or searching some folders, do you want to force disconnect". Please help: how can I access a Word file (editable files), providing a username and password from the code, while at the same time the user should not have direct access to any other folders?

    Read the article

  • Git: How do I push a project that was downloaded from source?

    - by JZ
    I worked with a graphic designer who did not clone from my GitHub account. He downloaded the project from source rather than using the command "git clone". Since he pulled his files, a month has gone by, and I want to do the following tasks: create a new branch, push the graphic designer's project into that branch, and merge his branch with master. I've tried following the GitHub forking guide without much luck; when I attempt to push the files into a new branch I get an error:
    fatal: Not a git repository (or any of the parent directories): .git
    How do I do this?

    Read the article

  • Localization of strings in static lib

    - by AO
    I have a project that uses a static library (SL). In that SL, there are a couple of strings I'd like to localize, and the project includes all of the localization files. The localization works just fine when storing all text translations in the same file. The thing is that I'd like to separate the SL strings from the other strings. I have tried to put two different *.strings files (Localizable.strings and Localizable2.strings) in the language folder of interest, but that did not work. I have also tried to use two *.strings files with the same name (Localizable.strings) but with different paths. It didn't work either. It seems that only one localization file is supported, right? Could anyone suggest a good way of doing this? I'm using SDK 3.2 beta 2.

    Read the article

  • BindException / too many open files while using HttpClient under load

    - by Langali
    I have got 1000 dedicated Java threads where each thread polls a corresponding URL every second.
    public class Poller {
        public static Node poll(Node node) {
            GetMethod method = null;
            try {
                HttpClient client = new HttpClient(new SimpleHttpConnectionManager(true));
                ......
            } catch (IOException ex) {
                ex.printStackTrace();
            } finally {
                method.releaseConnection();
            }
        }
    }
    The threads are run every second:
    for (int i = 0; i < 1000; i++) {
        MyThread thread = threads.get(i); // threads is a static field
        if (thread.isAlive()) {
            // If the previous thread is still running, let it run.
        } else {
            thread.start();
        }
    }
    The problem is that if I run the job every second, I get random exceptions like these:
    java.net.BindException: Address already in use
    INFO httpclient.HttpMethodDirector: I/O exception (java.net.BindException) caught when processing request: Address already in use
    INFO httpclient.HttpMethodDirector: Retrying request
    But if I run the job every 2 seconds or more, everything runs fine. I even tried shutting down the instance of SimpleHttpConnectionManager() using shutDown() with no effect. If I do netstat, I see thousands of TCP connections in TIME_WAIT state, which means they have been closed and are clearing up. So to limit the number of connections, I tried using a single instance of HttpClient and using it like this:
    public class MyHttpClientFactory {
        private static MyHttpClientFactory instance = new MyHttpClientFactory();
        private MultiThreadedHttpConnectionManager connectionManager;
        private HttpClient client;

        private MyHttpClientFactory() {
            init();
        }

        public static MyHttpClientFactory getInstance() {
            return instance;
        }

        public void init() {
            connectionManager = new MultiThreadedHttpConnectionManager();
            HttpConnectionManagerParams managerParams = new HttpConnectionManagerParams();
            managerParams.setMaxTotalConnections(1000);
            connectionManager.setParams(managerParams);
            client = new HttpClient(connectionManager);
        }

        public HttpClient getHttpClient() {
            if (client != null) {
                return client;
            } else {
                init();
                return client;
            }
        }
    }
    However, after running for exactly 2 hours it starts throwing 'too many open files' and eventually cannot do anything at all:
    ERROR java.net.SocketException: Too many open files
    INFO httpclient.HttpMethodDirector: I/O exception (java.net.SocketException) caught when processing request: Too many open files
    INFO httpclient.HttpMethodDirector: Retrying request
    I should be able to increase the number of connections allowed and make it work, but I would just be prolonging the evil. Any idea what the best practice is for using HttpClient in a situation like the above? Btw, I am still on HttpClient 3.1.
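    A hedged sketch of the usual HttpClient 3.1 pattern for this kind of load (illustrative values, not the only fix - the OS file-descriptor limit and ephemeral port range matter too): share one client backed by a MultiThreadedHttpConnectionManager, bound the pool, read the response body, and always release the method in finally so pooled connections are reused instead of leaking sockets.
    import java.io.IOException;
    import org.apache.commons.httpclient.HttpClient;
    import org.apache.commons.httpclient.MultiThreadedHttpConnectionManager;
    import org.apache.commons.httpclient.methods.GetMethod;
    import org.apache.commons.httpclient.params.HttpConnectionManagerParams;

    // One shared client for all polling threads; limits and timeouts are examples.
    public final class SharedPoller {
        private static final HttpClient CLIENT;
        static {
            MultiThreadedHttpConnectionManager mgr = new MultiThreadedHttpConnectionManager();
            HttpConnectionManagerParams params = mgr.getParams();
            params.setDefaultMaxConnectionsPerHost(2); // each thread polls its own URL
            params.setMaxTotalConnections(1000);
            params.setConnectionTimeout(5000);
            params.setSoTimeout(5000);
            CLIENT = new HttpClient(mgr);
        }

        public static String poll(String url) {
            GetMethod method = new GetMethod(url);
            try {
                CLIENT.executeMethod(method);
                // Consuming the body lets the pooled connection be reused.
                return method.getResponseBodyAsString();
            } catch (IOException ex) {
                return null;
            } finally {
                method.releaseConnection();
            }
        }
    }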

    Read the article

  • Is it possible to implement any kind of file upload recovery / resumption in a browser?

    - by Pete
    The project is a servlet to which people can upload files via, at present, HTTP POST. This is accompanied by Web page(s) providing a front-end to trigger the upload. We have more or less complete control over the servlet, and the Web pages, but don't want to impose any restrictions on the client beyond being a reasonably modern browser with Javascript. No Java applets etc. Files may potentially be large, and a possible use case is mobile devices on less reliable networks. Some people on the project are demanding the ability to resume an upload if the network connection goes down. I don't think this is possible with plain HTTP and Javascript in a browser, but I'd love to be proved wrong. Any suggestions?

    Read the article

  • A tool to find and fix incomplete source code documentation

    - by Pekka
    I have several finished, older PHP projects with a lot of includes that I would like to document in javadoc/phpDocumentor style. While working through each file manually, and being forced to do a code review alongside the documenting, would be the best thing, I am, simply due to time constraints, interested in tools to help me automate the task as much as possible. The tool I am thinking about would ideally have the following features:
    - Parse a PHP project tree and tell me where there are undocumented files, classes, and functions/methods (i.e. elements missing the appropriate docblock comment)
    - Provide a method to half-way easily add the missing docblocks by creating the empty structures and, ideally, opening the file in an editor (internal or external, I don't care) so I can put in the description.
    - Optional: automatic recognition of parameter types, return values and such. But that's not really required.
    The language in question is PHP, though I could imagine that a C/Java tool might be able to handle PHP files after some tweaking. Thanks for your great input!

    Read the article

  • How to use the Visual Studio 2010 Add-In Manager

    - by Jeremy Thompson
    Hi, sorry for such a simple question, but how do I use the Add-In Manager in VS2010? I want to add this "SmartPaster" add-in: http://inedo.com/Downloads/SmartPaster.aspx or http://www.mediafire.com/?mzyjamytnlq What do I do with these 3 files to get them listed in the Add-In Manager dialog? SmartPaster2010.AddIn, SmartPaster2010.dll, SmartPaster2010.xml Edit: http://msdn.microsoft.com/en-us/library/19dax6cz.aspx says: "To install the add-in on another computer, the .addin file must be placed in a location where Visual Studio checks for add-ins. These locations are listed in the Options dialog box, in the Environment node, on the Add-in/Macros Security page." I went to Tools > Options > Environment > Add-In/Macro Security, checked some paths, put the add-in files in a couple of these directories and restarted VS2010, but still no luck!

    Read the article

  • Increase file upload size limit in IIS6

    - by JustFoo
    Is there any other place besides the metabase.xml file where the file upload size can be modified? I am currently running a staging server with IIS6, and it is set up to allow uploading of files up to 20 MB. This works perfectly fine. I have a new production server where I am trying to set up this same size limit. So I edited the metabase.xml file and set it to 20971520. Then I restarted IIS and that didn't work, so I then restarted the entire server; that also didn't work. I can upload files around 2 MB, so it is definitely allowing file sizes larger than the standard 200 KB default size. I try uploading a 5 MB file and my upload.aspx page completely crashes. Is it possible there is something else I need to configure? The production server is located on a server farm; could there be some limits set on their end? Thanks

    Read the article

  • Why is WPO (whole-program optimization) not producing any improvement in my program size? (FPC 2.4.0)

    - by Gregory Smith
    I use FPC 2.4.0 for WinXP (binary from the official page); I also tried the same version compiled from source on my computer. I run something like this:
    I:\pascal\fpc-2.4.0.source\fpc-2.4.0\compiler\ppc386 -FWserver-1.wpo -OWsymbolliveness -CX -XX -Xs- -al -Os -oServer1.o Server
    I:\pascal\fpc-2.4.0.source\fpc-2.4.0\compiler\ppc386 -FWserver-2.wpo -OWsymbolliveness -Fwserver-1.wpo -Owsymbolliveness -CX -XX -Xs- -al -Os -oServer2.o Server
    ..(up to 100 times)
    but I always get the same .wpo files and the same .o sizes (the .s assembly files change intermittently). I also note (through compiler messages) that unused variables are still alive. I also tried -OWall -owall. What am I doing wrong?

    Read the article

  • Reviving a deleted file for use in my workspace

    - by John Cowan
    Greetings. We run Perforce with several users. Each user has their own development website that shows files in their workspace. This is great for making and viewing changes to webpages before submitting them. Some time ago, we deleted a few pages in Perforce. I would like to revive these pages, but not make them visible on our live site. I want to view them in my workspace and on my dev site, but I do not want to push them out to our live server. In the "depot" tab of my P4 client, I can see the deleted files. I cannot see them in the "Workspace" tab of my client. How can I revive them for use in my workspace, but not make them live to the world? I'm not a P4 admin, so I could use a little guidance. Thanks for any help.

    Read the article

  • Bootstrapper with custom package

    - by JF
    I'm currently developing a bootstrapper to deploy one of my VSTO add-ins. I thus created a prerequisites list before compiling it with MSBuild, but I also need to test for and install the otkloadr.dll fix (KB907417). At first I used a custom bootstrapper package, but the package directory and files must be included with my deployment if I want to use it. In fact I really want to have a very light setup kit, with only the setup.exe and the addin.msi files... Is there a way to use a custom bootstrapper package embedded into the setup.exe? If not, is there a standard bootstrapper package which includes the KB907417 fix?

    Read the article

  • Indy 10 FTP empty list

    - by Lobuno
    Hello! I have been receiving reports from some of my users that, when using idFTP.List() against some servers (MS FTP), the listing is received as empty (no files) when in reality there are (non-hidden) files in the current directory. Might this be a case of a missing parser? The funny thing is, when I use the program to get the list from MY server (MS FTP on W2003) everything seems OK, but on some servers I've been hitting this problem. Using the latest Indy 10 on D2010. Any ideas?

    Read the article

  • Can FileInputStream.available() fool me?

    - by Tom Brito
    The FileInputStream.available() javadoc says: Returns an estimate of the number of remaining bytes that can be read (or skipped over) from this input stream without blocking by the next invocation of a method for this input stream. The next invocation might be the same thread or another thread. A single read or skip of this many bytes will not block, but may read or skip fewer bytes. In some cases, a non-blocking read (or skip) may appear to be blocked when it is merely slow, for example when reading large files over slow networks. I'm not sure about this check:
    if (new FileInputStream(xmlFile).available() == 0)
    Can I rely on empty files always returning zero?
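    For the specific "is this file empty?" question, a small hedged sketch of two more direct checks than available(), which the javadoc only promises as an estimate (note that the snippet above also never closes the stream):
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;

    public class EmptyFileCheck {
        // Option 1: ask the filesystem for the size; no stream needed.
        // (length() also returns 0 if the file does not exist.)
        static boolean isEmptyByLength(File xmlFile) {
            return xmlFile.length() == 0;
        }

        // Option 2: try to read one byte; -1 on the first read means the file is empty.
        static boolean isEmptyByRead(File xmlFile) throws IOException {
            try (FileInputStream in = new FileInputStream(xmlFile)) {
                return in.read() == -1;
            }
        }
    }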

    Read the article
