Search Results

Search found 89549 results on 3582 pages for 'large file support'.

  • SVN: Checking out a large project over slow connection

    - by far
    Hello, I am new to SVN. I want to check out a very large project over a slow connection, which takes ages to download. I have zipped versions of the project on both the remote server and my local machine, and they are identical. Is there an easy and quick way to sync my local project with the remote server without a full checkout? Thanks

  • Robust Large File Transfer with WCF

    - by Sharov
    I want to transfer big files (1GB) over unreliable transport channels. When the connection is interrupted, I don't want to restart the file transfer from the beginning. I can partially store the file in a temp table and record the last position read, so that when the connection is re-established I can request that the upload continue from that position. Is there a best practice for this kind of thing? I'm currently using a chunking channel.
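    A minimal sketch of the resume bookkeeping described above, shown in Java rather than as WCF/C# code; the UploadChannel interface, the InMemoryChannel stand-in, and the 64KB chunk size are assumptions for illustration only, not part of any real service contract:

        import java.io.ByteArrayOutputStream;
        import java.io.IOException;
        import java.io.RandomAccessFile;

        // Minimal sketch of resumable, chunked upload bookkeeping (not WCF-specific).
        // UploadChannel and InMemoryChannel are hypothetical stand-ins for whatever
        // operations the real chunking channel exposes.
        public class ResumableUploadSketch {

            interface UploadChannel {
                long committedOffset();                 // bytes the server has durably stored
                void append(byte[] chunk, int length);  // append the next chunk at that offset
            }

            // Toy in-memory "server" so the sketch runs end to end.
            static class InMemoryChannel implements UploadChannel {
                private final ByteArrayOutputStream stored = new ByteArrayOutputStream();
                public long committedOffset() { return stored.size(); }
                public void append(byte[] chunk, int length) { stored.write(chunk, 0, length); }
            }

            static void upload(String path, UploadChannel channel, int chunkSize) throws IOException {
                try (RandomAccessFile file = new RandomAccessFile(path, "r")) {
                    file.seek(channel.committedOffset());  // resume where the server left off
                    byte[] buffer = new byte[chunkSize];
                    int read;
                    while ((read = file.read(buffer)) != -1) {
                        channel.append(buffer, read);      // on failure: reconnect, re-query offset, retry
                    }
                }
            }

            public static void main(String[] args) throws IOException {
                upload(args[0], new InMemoryChannel(), 64 * 1024);
            }
        }

    The point of the sketch is that the client never tracks progress itself: after every reconnect it asks the channel for the committed offset and seeks there before sending the next chunk.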

  • How can I asynchronously monitor a file in Perl?

    - by Hussain
    I am wondering if it is possible, and if so how, to create a Perl script that constantly monitors a file/db and then calls a subroutine to perform text processing when the file changes. I'm pretty sure this would be possible using sockets, but this needs to be used for a webchat application on a site running on a shared host, and I'm not so sure sockets would be allowed on it. The basic idea is:
    - create a listener for a chat file/database
    - when the file is updated with a new message, call a subroutine
    - the called subroutine will send the new message back to the browser to be displayed
    Thanks in advance.

  • PHP efficiency question: Database call vs. File Write vs. Calling a C++ executable

    - by JP19
    Hi, what I wish to achieve is to log all information about each and every visit to every page of my website (IP address, browser, referring page, etc.). This is easy to do. What I am interested in is doing it in a way that causes minimum runtime overhead in the PHP scripts. Which approach is best, efficiency-wise?
    1) Log all information to a database table
    2) Write to a file (from PHP directly)
    3) Call a C++ executable that will write this info to a file in parallel [so the script can continue execution without waiting for the file write to occur ...... is this even possible]
    I may be optimizing unnecessarily/prematurely, but any thoughts or ideas on this would be appreciated. (I think the efficiency of file writes/logging can really become a concern if I have, say, 100 visits per minute...) Thanks & Regards, JP

  • Easiest way to split up a large controller file

    - by timpone
    I have a Rails controller file that is too large (~900 lines - api_controller). I'd like to split it up into something like this:
    api_controller.rb
    api_controller_item_admin.rb
    api_controller_web.rb
    I don't want to split it into multiple controllers. What would be the preferred way to do this? Could I just require the new parts at the end, like:
    require './api_controller_item_admin'
    require './api_controller_web'

  • Hadoop: Processing large serialized objects

    - by restrictedinfinity
    I am working on an application that processes (and merges) several large Java serialized objects (on the order of GBs) using the Hadoop framework. Hadoop distributes the blocks of a file across different hosts, but since deserialization requires all of the blocks to be present on a single host, this is going to hurt performance drastically. How can I deal with this situation, where the blocks cannot be processed individually, unlike text files?
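    One commonly suggested workaround, sketched below rather than taken from the question, is to pack each serialized object as a single record in a Hadoop SequenceFile, so the framework always hands a whole object to one mapper instead of splitting it across raw HDFS blocks. A rough Java sketch, assuming the classic SequenceFile.createWriter(fs, conf, path, keyClass, valueClass) API with BytesWritable values holding the serialized bytes:

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Paths;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.io.BytesWritable;
        import org.apache.hadoop.io.SequenceFile;
        import org.apache.hadoop.io.Text;

        // Sketch: pack each serialized object into one SequenceFile record so a
        // whole object always reaches a single mapper intact.
        public class PackObjects {
            public static void main(String[] args) throws IOException {
                Configuration conf = new Configuration();
                FileSystem fs = FileSystem.get(conf);
                Path out = new Path(args[0]);                        // e.g. a .seq path on HDFS

                SequenceFile.Writer writer = SequenceFile.createWriter(
                        fs, conf, out, Text.class, BytesWritable.class);
                try {
                    for (int i = 1; i < args.length; i++) {           // each arg: a local serialized object
                        // Note: for multi-GB objects, reading the whole object into memory
                        // is itself expensive; this sketch only shows the packaging step.
                        byte[] bytes = Files.readAllBytes(Paths.get(args[i]));
                        writer.append(new Text(args[i]), new BytesWritable(bytes));
                    }
                } finally {
                    writer.close();
                }
            }
        }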

  • I am unable to upload a file to my localhost folder in PHP

    - by Nauman khan
    Hi, I have the following code:

        <form enctype="multipart/form-data" action="upload.php" method="POST">
          Please choose a file: <input name="uploaded" type="file" /><br />
          <input type="submit" value="Upload" />
        </form>
        <?php
        $target = "upload/";
        $target = $target . basename($_FILES['uploaded']['name']);
        $ok = 1;
        if (move_uploaded_file($_FILES['uploaded']['tmp_name'], $target)) {
            echo "The file " . basename($_FILES['uploadedfile']['name']) . " has been uploaded";
        } else {
            echo "Sorry, there was a problem uploading your file.";
        }
        ?>

    My page is at http://localhost/nausal/upload.php. Although I have created a folder named upload in the site, I get the following errors:

        Notice: Undefined index: uploaded in C:\wamp\www\Nausal\upload.php on line 15
        Notice: Undefined index: uploaded in C:\wamp\www\Nausal\upload.php on line 17
        Sorry, there was a problem uploading your file.

    Please help me, I am very new to PHP. :(

  • Typical Service Response Time for software vendors [closed]

    - by Miky D
    I'm trying to find out what standard service/tech-support response times are expected of a software vendor. We're being asked by a customer to enter into an agreement covering technical support for a software application that we're selling. Basically, I'm interested in the typical turn-around times (i.e. time to respond, time to resolution) based on the severity of the issue. I'm also interested in the financial structure of such agreements: e.g. charge per incident, a bundle with unlimited incidents per customer, etc. Any information or suggestions of where to find such information (even examples from other software vendors' websites) would be greatly appreciated!

  • large databases in sqlite - file size considerations?

    - by Gj
    I'm using an SQLite db, which is very convenient and seems to meet all of my needs at this point. Currently my db size is <50MB, but I now need to add a new table to store large text blobs, which will cause the db to grow to as much as 5GB within the next year. Would SQLite be able to deal with a 5GB db size? Any caveats, compared with, say, MySQL?

  • Does aria2 support writing small files in batches?

    - by Jon
    I'm using aria2 to download 8 million JPEGs from Flickr. Each image is about 100KB. I have the URLs of these images in a txt file, in this format: http://farm2.staticflickr.com/1070/1151334893_5a8e7f77f4.jpg I'm wondering whether aria2 supports writing small files in batches; say, writing 100 images to disk only once all of them have been downloaded into memory, rather than writing each file as soon as its download finishes. I think writing in batches would better protect my hard disk. Or do you have other software or open-source code to recommend?

  • Auto-resize large images with JavaScript?

    - by Yegor
    I have an application that allows people to post images on each other's profiles with BB code. The problem is, some people post very large images, which cover other parts of the site when viewed. How can I scale down images, client-side, so they are no bigger than x by y dimensions?

  • Web Programming language for very large lists?

    - by behrk2
    Hello, in your experience, what is the best web programming language for handling sorting and comparison of very large lists (i.e. tens of thousands of email addresses)? I am most familiar with PHP. I think it could get the job done, but I'm unsure about other languages and whether there might be a better suitor. Thanks!
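    For a sense of scale, tens of thousands of strings is a small workload in any mainstream language; the rough Java sketch below (synthetic addresses, with the counts chosen arbitrarily) sorts and de-duplicates 50,000 of them in memory, and PHP or Python would cope just as easily:

        import java.util.Set;
        import java.util.TreeSet;

        // Rough illustration only: a TreeSet keeps the addresses sorted and
        // de-duplicated as they are added.
        public class EmailSortDemo {
            public static void main(String[] args) {
                Set<String> unique = new TreeSet<>();           // sorted + de-duplicated
                for (int i = 0; i < 50_000; i++) {
                    unique.add("user" + (i % 40_000) + "@example.com");
                }
                System.out.println("distinct addresses: " + unique.size()); // prints 40000
            }
        }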

  • How to check for a dynamically created file in Java?

    - by Moev4
    I have an application where I need to check for a file that may be created dynamically during execution; I will give up after some MAX time if the file has not yet shown up. I wanted to know if there is a more efficient method in Java of checking for the file other than polling for it and sleeping for X seconds in between. If not, what would be the most efficient way of doing this?
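    One option, assuming Java 7 or later, is the java.nio.file WatchService: block on directory-creation events with a timeout instead of sleep-and-poll, and give up once the MAX time has elapsed. A minimal sketch; the directory, file name, and 30-second timeout below are placeholders, not taken from the question:

        import java.nio.file.Files;
        import java.nio.file.FileSystems;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.nio.file.StandardWatchEventKinds;
        import java.nio.file.WatchEvent;
        import java.nio.file.WatchKey;
        import java.nio.file.WatchService;
        import java.util.concurrent.TimeUnit;

        // Sketch: wait for a file to appear by blocking on directory-creation events
        // rather than sleeping and re-checking; gives up after maxWaitMillis.
        public class WaitForFile {

            static boolean waitForFile(Path dir, String fileName, long maxWaitMillis) throws Exception {
                try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
                    dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);
                    if (Files.exists(dir.resolve(fileName))) {
                        return true;                        // already there before we started watching
                    }
                    long deadline = System.currentTimeMillis() + maxWaitMillis;
                    while (true) {
                        long remaining = deadline - System.currentTimeMillis();
                        if (remaining <= 0) {
                            return false;                   // MAX time exceeded, give up
                        }
                        WatchKey key = watcher.poll(remaining, TimeUnit.MILLISECONDS);
                        if (key == null) {
                            return false;                   // timed out waiting for any event
                        }
                        for (WatchEvent<?> event : key.pollEvents()) {
                            if (fileName.equals(event.context().toString())) {
                                return true;                // the awaited file was created
                            }
                        }
                        key.reset();                        // keep watching for further events
                    }
                }
            }

            public static void main(String[] args) throws Exception {
                boolean found = waitForFile(Paths.get("/tmp"), "expected.dat", 30_000);
                System.out.println(found ? "file appeared" : "gave up");
            }
        }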

  • PayPal Mass Pay fails when large transactions are made

    - by Sid
    I am using the PayPal Mass Pay feature, but I am unable to make payments greater than $20. When I attempt to make larger transactions (say, for $120), I get an error saying the account has insufficient funds. My account has more than the requisite amount to make the payment. I am trying to find a solution, as there is no documentation that says anything about a per-payment limit in the Mass Pay API. I would appreciate any help on this.
