Search Results

Search found 40663 results on 1627 pages for 'huge files'.

Page 106/1627 | < Previous Page | 102 103 104 105 106 107 108 109 110 111 112 113  | Next Page >

  • Backup script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. My script prints the proper command, but when the command is run from within the script it produces an error; if the same command is run manually, everything works... ??? Here is the script, based on one easily found with Google:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="7743E14E"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'   # Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"
        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    When run, I receive the error:

        Command line error: Expected 2 args, got 6
        Enter 'duplicity --help' for help screen.

    Any help you could offer would be greatly appreciated.
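    The failure pattern (works when pasted, breaks from the script) usually points at how the shell re-splits the expanded output of the exclude function into arguments. A minimal sketch of one way around that, assuming duplicity's --exclude-filelist option; the paths and key ID are reused from the post, the temp-file handling is an assumption:

        #!/bin/bash
        # write one pattern per line and hand duplicity the list, so no shell re-quoting is needed
        EXCLUDE_LIST=$(mktemp)
        find /home/jason -size +100M > "$EXCLUDE_LIST"
        duplicity --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" \
            --exclude-filelist "$EXCLUDE_LIST" "$SOURCE" "$DEST"
        rm -f "$EXCLUDE_LIST"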

    Read the article

  • Executed PHP files are stale until "touched" (Symlinked NFS mount as web root)

    - by mmattax
    We have a PHP application served by 3 web servers (running Nginx and Apache). Each web server's document root is a symlinked directory that points to an NFS mount. For example: web01 has an NFS mount at /data/webapp, which is symlinked to /home/webapp, and Apache serves content from /home/webapp/www. We also use APC for our PHP opcode cache. When we deploy code, we SCP an archive file to the NFS server and extract it. Since upgrading to RedHat 6, when we deploy our code the web servers execute "stale" PHP files until touch is run on the PHP files. We thought APC might be causing the problem, but the issue persists even after clearing the opcode cache. Any ideas on how to diagnose why the stale PHP code is being executed?
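    A minimal diagnosis sketch for this kind of staleness, assuming shell access to one web server and to the NFS server (the hostnames and sample file are hypothetical):

        # compare what the web server sees through the symlink with what is on the export
        stat /home/webapp/www/index.php
        md5sum /home/webapp/www/index.php
        ssh nfs01 "md5sum /data/webapp/www/index.php"
        # if the checksums differ until the file is touched, the NFS attribute/lookup cache
        # is the likely culprit; remounting with lookupcache=none or a small actimeo value
        # is one way to test that theory before blaming APC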

    Read the article

  • Delete specific files after installation using Visual Studio setup project

    - by Vadiklk
    I have this problem. I want to build an installer for my C# solution that will be placed in a folder with other installation folders and files that need to be copied to the installed folder. That part is easy: I just copy them into the folder I create, using the folder structure I want. Now, I also want to install another program and run an .exe file I've created to unzip some files for me. For that I need to copy 2 .exe files and 2 DLLs (for the exes) to the folder I'm installing to, and create 2 custom actions that use them. That I've managed to do. After that I want to delete those 4 extra files, since the user doesn't need them and shouldn't even be aware they are there. How do I do that? I couldn't find a way in the built-in setup project preferences, and I don't know how to make a custom installer class. A bonus question: how do I make the other installer (one of the .exe files is just a plain installer) install quietly to any path? I don't want the user to see an installer pop out of my program's installer. Thanks!

    Read the article

  • Copy files between two Windows machines on separate domains

    - by Simon
    I need to copy several database backups between two computers. The source computer, which initiates the copy, is a Windows 2000 PC and a member of domain1. The destination machine is running Windows 2000 Server and is a member of domain2. The machines are on separate networks, physically connected via a firewall. The files are currently copied over SSH, with http://sshwindows.sourceforge.net/ installed on the destination machine. There is no need to encrypt the contents during the copy, but the passwords should not be sent in the clear. I am looking for a way to copy the files without having to install a server on the destination. I specifically need help with how to set up the permissions and which ports would need to be opened on the firewall.

    Read the article

  • Problem with script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. If I run the script, duplicity gives an error; however, if I copy and paste the same command generated by the script, everything works... Here is the script:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="gpgkey"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g'   # Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"
        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    When the script is run I get the error:

        Command line error: Expected 2 args, got 6

    Where am I going wrong??
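    Since the quoting tends to get mangled once the command substitution output is re-split by the shell, a sketch of an alternative that collects the arguments in a bash array instead of echoing quoted text (only the array handling is new; the values are reused from the post):

        # collect --exclude arguments without any manual quoting or escaping
        EXCLUDES=()
        while IFS= read -r FILE; do
            EXCLUDES+=( --exclude "**${FILE##/*/}" )
        done < <(find /home/jason -size +100M)
        duplicity --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "${EXCLUDES[@]}" "$SOURCE" "$DEST"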

    Read the article

  • Use a folder of XML files as data source for NHibernate

    - by Bart Van Eyndhoven
    I'm going to start writing NUnit tests for a few classes in my project. A number of these classes use data retrieved through NHibernate from a SQL Server 2008 database. The part of the program I'm about to test is very specific (and complicated), so I have made a folder of XML files which, combined, could reproduce the database structure: each XML file corresponds to a table in the database, and the data in the XML files is consistent with the database. Is there a way to use this folder of XML files as a data source for NHibernate? That is, can I use NHibernate to retrieve my test data (which I have chosen specifically) instead of data from the database? That way I could usefully test this component without corrupting the (test) database for future tests.

    Read the article

  • Grand Central Strategy for Opening Multiple Files

    - by user276632
    I have a working implementation using Grand Central Dispatch queues that (1) opens a file and computes an OpenSSL DSA hash of it on "queue1", and (2) writes the hash out to a new "sidecar" file for later verification on "queue2". I would like to open multiple files at the same time, but based on some logic that doesn't choke the OS by having hundreds of files open and exceeding the hard drive's sustainable throughput. Photo-browsing applications such as iPhoto or Aperture seem to open and display multiple files, so I'm assuming this can be done. I'm assuming the biggest limitation will be disk I/O, as the application can (in theory) read and write multiple files simultaneously. Any suggestions? TIA

    Read the article

  • Format Factory not working for mov files

    - by LanguaFlash
    I have attempted multiple destination formats with no success. I am using Format Factory 2.30. I drag and drop a .mov file into FF, select "all to mpg" (or any other format), then click Start. It thinks for a couple of seconds and then jumps to 100%. When I open the resulting file I get no sound or picture from the video. These .mov files were created with a Canon SX10 camera. (It is my brother's camera, so I'm not sure of the model.) Any suggestions? TMPGEnc is able to convert the files with the QTReader plugin, so the video isn't corrupted or anything. Thanks. Jeff

    Read the article

  • FTP transfer timeouts while uploading small files

    - by Hamed Momeni
    I have this problem: when I need to transfer some files (mostly small files, < 100KB), the connections time out. Well, actually it uploads one file and fails on the next, until my client reconnects to the server, and then the same thing happens over and over again. I googled the problem and some said that switching from passive mode to active mode could solve it, but it didn't work for me. Even continuously pinging the server to keep the connection alive was to no avail. P.S. I have root access to the server. Update: I'm running ProFTPD on a CentOS VPS. I tried a few clients (FireFTP, FileZilla), all with the same problem.

    Read the article

  • Checkout repo from SVN but use local files to populate

    - by aidan
    I have an SVN server on our development server, and I release to our production server using rsync. It's not ideal, but it has worked so far. Anyway, I've finally got the SVN client installed on the production server and I want to start using it to copy files from development to production. My problem is this: I don't want to check out all the data from development when I already have it on the production server. Is there a way to "check out" a repository but use the files that are already on the production server (and force it to assume they are the head versions, for example)? Thanks.
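    A sketch of one approach, assuming an SVN client of 1.5 or newer, where checkout accepts --force and keeps files that are already in place (the URL and path are placeholders):

        cd /var/www/production
        svn checkout --force http://dev-server/svn/project/trunk .
        svn status          # anything that differs from the repository shows up as modified
        svn revert -R .     # optional: make the working copy match the repository exactly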

    Read the article

  • Mercurial Messing Up csproj Files?

    - by alphadogg
    I am using Hg to manage and merge code with three other developers involved in a VS2008 project. We do have an .hgignore file that ignores a fair number of files that don't need to be tracked, such as *.pdb, *.obj, etc. However, we do track .csproj files. Periodically, files seem to go missing after a merge: we get build errors and have to relocate files which were in the project folders but not in the .csproj file. Eventually, during a merge conflict, I noticed that Hg sometimes merges incorrectly. Here's a screenshot below. The actual conflict that requires manual intervention is lower in the file, but in this section Hg incorrectly replaces DirectoryTasks.cs with a new, different file called ReportTasks.cs, when in fact both should be added. How do people manage to avoid this?

    Read the article

  • Log Files from bash script output

    - by neildeadman
    I have a script that runs (this works fine). I'd like to produce log files from its output and still show it on screen. I have this command, which creates three files, from this blog:

        ((./fk.sh 2>&1 1>&3 | tee errors.log) 3>&1 1>&2 | tee output.log) 2>&1 | tee final.log

    This does exactly what I want it to. My only issue is that I create files in my script and copy them somewhere, and I'd like to copy these log files there too, which I can't do while the script is running. I also wanted to make it easier for any user to run my script, so I created another script to run this one. According to this post (see the last post), I can put a . before the script name and then use variables assigned in the called script from the first script. It doesn't seem to work, though, and I can't figure out why or find alternative methods. Can anyone help?
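    A sketch of a wrapper that copies the logs once the pipeline has finished; fk.sh and the log names come from the post, while the handoff file is an assumption (the pipes run the inner script in subshells, so even sourcing it with a leading dot cannot pass variables back to the caller):

        #!/bin/bash
        ((./fk.sh 2>&1 1>&3 | tee errors.log) 3>&1 1>&2 | tee output.log) 2>&1 | tee final.log
        # hypothetical handoff: fk.sh writes its copy destination to /tmp/fk_dest
        LOG_DEST=$(cat /tmp/fk_dest 2>/dev/null)
        [ -d "$LOG_DEST" ] && cp errors.log output.log final.log "$LOG_DEST"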

    Read the article

  • Use html files from another project in ASP.NET MVC

    - by Stacey
    I know that I can use normal HTML files in ASP.NET MVC; however, I have a situation where several (more than 20) HTML files are needed for static display. This is fine and good, but I really don't want them cluttering the MVC project, since none of them will have controller actions. Is there any way to load up a second project and use static HTML files from it within ASP.NET MVC?

    Read the article

  • Apache Prepending Header Information to ALL FILES

    - by Michael Robinson
    We're in the middle of setting up new servers and have been having some odd problems with Apache. Apache is prepending text that looks like this to all files:

        $15plðI‚‚?E?ðA™@?@??yeÔ|~Ÿ²?PγZ" zS€?8i³?? ,ÀŠ{ÿB
        HTTP/1.1 200 OK
        Date: Mon, 02 Feb 2009 22:28:05 GMT
        Server: Apache/2.2.3 (CentOS)
        Last-Modified: Mon, 02 Feb 2009 22:28:05 GMT
        ETag: W/"1238007d-2224e-fe617f40"
        Accept-Ranges: bytes
        Content-Length: 139854
        Connection: close
        Content-Type: application/x-javascript

    The file I copied the text above from is the Prototype library JS file, as loaded from our server. I've searched but couldn't find much about this problem; maybe I don't know what I'm searching for... Anyway, if anyone has seen this behaviour before, could they please let me know either 1) how to fix it so that this content is not prepended to all files, or 2) where to look for further help. Thanks

    Read the article

  • php, user-uploaded files, version control, and website deployment

    - by user151841
    I have a website whose code I update regularly, and I keep it in version control. When I want to deploy a new version of the site, I do an export and then symlink the served directory name to the directory of the deployment. There is a place where users can upload files, and I noticed once that, after I had deployed a new version, the user files were gone! Of course, I hadn't added them to the repository, and since the served site came from an export, they weren't uploaded into a version-controlled directory anyway. PHP doesn't yet have integrated SVN functionality, so I couldn't do much programmatically with user-uploaded files. My solution was to create an additional website, files.website.com, which sits in a directory parallel to the served website and is served out of a directory that is under version control. That way the uploads don't get obliterated when I upgrade the website. From time to time, I manually add uploaded files to the SVN project, delete user-deleted ones, and commit the new version. I'm working on a shell script to run from cron to do this, but shell scripting isn't my forte, so it's on the back burner as it's not a pressing need. Is there a better way to do this?
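    A minimal cron sketch of the sync step, assuming a standard svn command-line client; the working-copy path is hypothetical, and paths with spaces would need more careful handling:

        #!/bin/bash
        cd /var/www/files.website.com/uploads || exit 1
        svn add --force . > /dev/null                          # schedule any new user uploads
        svn status | awk '/^!/ {print $2}' | xargs -r svn rm   # schedule files users deleted
        svn commit -m "automated sync of user-uploaded files"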

    Read the article

  • Error when trying to access Shared files from iMac via smb

    - by SatheeshJM
    I used to access all my Windows XP shared files on my Mac using Finder -> Window -> Connect to Server. Now, all of a sudden, an error crops up when I try to connect: "There was a problem connecting to the server '192.168.1.*'. The server may not exist or it is unavailable at this time. Check the server name or IP address, check your internet connection and then try again." How can I get rid of this error and access my shared files from my Mac? P.S. My network connection is fine.

    Read the article

  • Restart of Master Postgres DB with unconsumed WAL files

    - by Douglas Sellers
    We have a situation where walmanager is being used to ship WAL files between a master and a slave Postgres database. The slave machine has failed and has had to be rebuilt, which has caused a lot of unconsumed WAL files to build up on the master. If the Postgres master is restarted while there are 24 hours' worth of unconsumed WAL files hanging around, will the master be affected at all, or will it start cleanly?

    Read the article

  • Vantec NexStar NAS Enclosure - Writing large files

    - by peter
    Hi, I have one of these "Vantec NexStar LX - NST-475LX-BK" drive enclosures. It is a NAS drive. When I write a file to the device using eSATA or an SMB share, I cannot write files over 2GB; I think this is because the drive is formatted with FAT32. But when I access the device using FTP it doesn't matter: I can write files of any size. For example, I wrote a 30GB file to it last night. Does this make any sense? Why? I guess the most important thing for me is data integrity.

    Read the article

  • Header unset Server not working for static files

    - by Sam Lee
    I'm trying to unset the "Server" field in the response headers. I do this using Header unset Server, and that works fine for requests handled by mod_perl. However, for requests to /static I use Apache to serve static files, and for some reason, when these files are loaded directly in the browser, the Server field is not removed. How can I go about fixing this? The relevant parts of my httpd.conf:

        LoadModule headers_module modules/mod_headers.so
        Header unset Server

        <VirtualHost *:80>
            <Location />
                SetHandler modperl
                PerlResponseHandler MyHandler
            </Location>

            Alias /static/ /home/site/static/
            <Location /static>
                SetHandler None
            </Location>
        </VirtualHost>
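    A quick way to narrow down where the field is coming from is to compare the two code paths from the shell; a sketch, with a hypothetical hostname and static file:

        curl -sI http://www.example.com/ | grep -i '^Server:'               # mod_perl path
        curl -sI http://www.example.com/static/app.js | grep -i '^Server:'  # static file path
        # if only the static response still carries the banner, a common explanation is that
        # the Apache core adds the header after mod_headers has run; ServerTokens Prod and
        # ServerSignature Off (or mod_security's SecServerSignature) shrink it in that case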

    Read the article

  • How to update application files using patching?

    - by Marek
    I am not interested in any auto-update solution, such as ClickOnce or the MS Updater Block. For anyone feeling the urge to ask why not: I am already using these and there is nothing wrong with them; I would just like to learn about any efficient alternatives. I would like to publish patches, i.e. small differences that modify the existing files of the deployment with the smallest possible delta. Not only code needs to be patched, but also resource files. Patching the running code can be accomplished by maintaining two separate, synchronized copies of the deployment (no on-the-fly changes to the running executable are required). The application itself can be xcopy-deployed (to avoid MSI auto-correcting the modified files or breaking ClickOnce signatures). I would like to learn how to handle different versions of patches. For example, one patch is issued that fixes an error, and later another patch fixes a second error in the same file; users may have any combination of these installed when a third patch arrives. With text files this may be easy to implement, but what about executable files? (Native Win32 code vs. .NET: any difference?) If that problem is too hard or unsolvable for executables, I would at least like to learn whether there is a solution that implements simple patching with serial revisions: in order to install revision 5, the user must have all previous revisions installed, to ensure the validity of the deployment. Are there any existing solutions that accomplish this? NOTE: There are a few questions on SO that may seem like duplicates, but none with a good answer. This question is about the Windows platform, preferably .NET.
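    For the binary-delta part of the question, a generic, Unix-flavoured sketch of producing and applying a delta with the bsdiff/bspatch tools; this is only an illustration of the technique, not a packaged solution for the setup described above, and the file names are placeholders:

        # on the build machine: compute the delta between two releases
        bsdiff app_v1.exe app_v2.exe app_v1_to_v2.patch
        # on the client, against the idle copy of the deployment: rebuild v2 from v1 + patch
        bspatch app_v1.exe app_v2.exe.new app_v1_to_v2.patch
        sha256sum app_v2.exe.new    # verify against a published hash before swapping copies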

    Read the article

  • ACL and moving files in Nautilus

    - by MyOnlyEye
    When I move files from a private home directory (e.g. /home/jack) to a shared directory (e.g. /home/shared-school), Nautilus carries the original file's permissions over into the shared directory and ignores the default ACL that I've set on the /home/shared-school directory (e.g. setfacl -R -m d:g:school:rwx /home/shared-school). Is it possible to force Nautilus to change the ACL on a file that is moved or copied, or at least not to ignore the ACL of the directory into which the files are moved or copied?
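    A sketch of a workaround that re-applies the ACL after the fact, reusing the group and path from the post (the inotify part is an assumption and requires inotify-tools):

        # fix anything already moved in with its old permissions, and keep the default ACL
        setfacl -R -m g:school:rwx /home/shared-school
        setfacl -R -m d:g:school:rwx /home/shared-school
        # or react to moves as they happen:
        # inotifywait -m -r -e moved_to --format '%w%f' /home/shared-school |
        #     while read -r f; do setfacl -m g:school:rwx "$f"; done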

    Read the article

  • How to choose my own filename format for subscribed podcast files?

    - by meomaxy
    I subscribe to several podcasts where the filenames of the downloaded MP3 files follow no particular pattern. When I copy the directory of accumulated MP3 files onto my MP3 player, the files play in alphabetical order. What I really want is to play the files chronologically by release date. I currently use iTunes on Windows XP to download the files. What I do now is manually rename the files, adding the date in YYYYMMDD format to the start of each filename, so that an alphabetical listing of the files corresponds to their chronological order when I listen to them later in PocketTunes on my Palm Centro. Is there some way to get the release date into the filename automatically? If so, I could automate or possibly skip the renaming step. I would switch from iTunes to something else if that would solve my problem. The file creation time on my local disk isn't a reliable indicator, because sometimes I download a few days' worth of content at one time and the files don't necessarily get downloaded in chronological order.

    Read the article
