Search Results

Search found 37650 results on 1506 pages for 'files'.

  • Data inaccessible from WHS drive

    - by Eakraly
    Hi all, I had a Windows Home Server machine that crashed. Before doing anything else I decided to get the data off it, so I took the hard drive and connected it to a Windows 7 PC. The result: I cannot access almost any file! I can see the directory structure, and I can open small files like .txt and .ini, but bigger files like .iso and video files are a no-go. The same goes for Ubuntu and OS X: I can see the files and even copy them, but the copies are corrupted. Any ideas what the problem is?

  • Using git with cgit for decentralized/centralized development

    - by polemon
    I plan to use git for hosting my projects on my server. I've read about cgit and git-daemon, and I've more or less decided to use those tools. But general use is still kind of confusing to me. What do I need to set up on the server to push my files to it? And when the files on the server are newer than the files on my computer, how do I merge them? Also, say I develop on two computers; how do I merge from one computer to the other? And when two people are working on the same project, how do they merge their local repos with one another? As you can probably tell, I come from SVN, but I've worked with Mercurial, and now I'd like to try git.
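
    For reference, a minimal sketch of the push/pull cycle this involves (the paths, host, and branch names below are illustrative, not from the question):

        # on the server: create a bare repository to push to
        git init --bare /srv/git/project.git

        # on each computer: clone once, then commit/pull/push as you work
        git clone ssh://user@server/srv/git/project.git
        cd project
        git commit -a -m "describe the change"   # record changes locally
        git pull origin master                   # fetch the server's newer history and merge it
        git push origin master                   # publish the merged result

    Two people on the same project follow the same cycle against the shared bare repository; pull is what merges their histories.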

  • php.ini settings change not taking effect for large file uploads

    - by user51347
    My server was just reprovisioned, and my application, which uploads large (100M+) files, now breaks after the re-installation. The symptom is quite consistent: smaller files (8 MB in my tests) upload just fine, while larger files cut off at very close to the same point every time on a given computer. A file that fails at 26% will fail at just about the same point every time, and one that takes 1:40 to fail will fail within 2 seconds of that every time. I have set my php.ini settings extravagantly:

        post_max_size = 512M
        upload_max_filesize = 512M
        max_input_time = 3600
        max_execution_time = 3600

    Is there possibly a setting at the Apache level which would override PHP?
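
    One Apache-level directive that caps uploads independently of php.ini is LimitRequestBody; a hedged example of what to look for in httpd.conf or the vhost (mod_security's SecRequestBodyLimit is another candidate if that module is loaded):

        # httpd.conf / vhost: reject request bodies larger than 512 MB (0 = unlimited)
        LimitRequestBody 536870912

    That said, a failure at a consistent elapsed time rather than a consistent byte count would point more at a timeout (e.g. Apache's Timeout directive) than at a size limit.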

  • Store and encrypt data over the Internet

    - by sotsec
    I am trying to build a system where I can access my files remotely. I want to set up an external hard drive or a NAS that I will access over the internet, and I want every file stored on that system to be encrypted. Could you please suggest the best way of doing that? In other words, what is the best way to access your files remotely with maximum safety, while the space the files occupy is protected against theft by encryption? Thank you.

  • Writing to network share failed

    - by Unreason
    I have Outlook files stored on a network share and accessed by clients directly. Outlook is version 2003, the clients run Windows XP, and the server is Windows Server 2003. The files are quite big, at around 3 GB. One common problem is that I get 'delayed write failed', and this happens only on these PST files. When it happens I have to run scanpst.exe to fix the PST file. I did not find any entries in the event logs that I could relate to the issue. What would you suggest changing to fix the issue, or where should I look to diagnose it further? EDIT: No loss on ping, and ping times are within normal range for a LAN.

  • Folder sync application which can sync over the Internet (the other machine specified by an IP)?

    - by Adal
    I need to sync some folders between two Windows 7 machines. While they are connected to the same LAN, they can't see each other over Windows networking, since sharing is disabled on both of them (for security reasons). Do you know any sync app which can work over IP? The folder I need to sync has 500,000 files in it (80 GB in total), so the sync app should be pretty efficient. At the moment I copy the files from one machine to the other over FTP, but it takes forever, since a separate connection is opened for each file. Alternatively, do you know an app which can efficiently transfer a large number of files between two machines over the Internet?
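
    If an rsync port can run on both machines (an assumption; e.g. under Cygwin, or a packaged build such as cwRsync), a single SSH connection with delta transfer avoids the per-file FTP connection cost. A minimal sketch, with the IP and paths as placeholders:

        # -a preserve attributes, -z compress, --delete mirror deletions on the target
        rsync -az --delete /cygdrive/c/data/ user@192.168.1.20:/cygdrive/c/data/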

  • How to reduce the size (disk space) of Windows 8?

    - by humanityANDpeace
    This question is about what I can do to reduce the disk space that Windows 8 uses.

    Background: at present, with only one program installed (MS Access 2007), about 15 GB of my hard disk is used, and I have little space to spare (it's a 17 GB partition on an SSD). I would like solutions such as:

    - removing files that are not really needed (drivers for hardware the system does not actually have)
    - removing help files that are not really needed (documentation)
    - pagefile.sys (assuming I have 4 GB of RAM and no real need for swapping)
    - hiberfil.sys (used for hibernate and sleep; I need that, though I would regain about 4 GB of space)

    Ideally I would delete the files I am least likely to need, though I have no good idea where to start. Since my hardware setup will not change, I would be willing to delete all the drivers that Windows 8 ships for hardware I do not have. In short: what are the ways to reduce the space that Windows 8 uses?
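
    Two commands that reclaim space on Windows 8, offered as a hedged starting point (run from an elevated command prompt); the first applies only if hibernation turns out to be dispensable after all:

        rem delete hiberfil.sys (frees roughly the size of installed RAM)
        powercfg /h off
        rem compact the WinSxS component store by removing superseded components
        Dism /Online /Cleanup-Image /StartComponentCleanup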

  • Remove folder structure from archive, ignore folders while archiving, and fix a tar error

    - by Michael
    I am trying to make a script to back up each of my Plesk hosts to individual files, and I am having three problems:

    1. I would like to remove the folder structure from the archive; the tar is 3 folders deep.
    2. I am getting this error: tar: Removing leading `/' from member names
    3. I need my archive to ignore folders named "catch", because I don't need them in my archive.

    The code:

        FILES=/var/www/vhosts/*
        FNAME=""
        for f in $FILES
        do
            FNAME=`basename $f`
            tar cfv "/root/backup/ftp/$FNAME.tar" $f
        done

    Sample output:

        tar: Removing leading `/' from member names
        /var/www/vhosts/mydomain.com/
        /var/www/vhosts/mydomain.com/conf
        /var/www/vhosts/mydomain.com/etc/
        /var/www/vhosts/mydomain.com/etc/group
        /var/www/vhosts/mydomain.com/etc/termcap
        /var/www/vhosts/mydomain.com/etc/passwd
        /var/www/vhosts/mydomain.com/usr/
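
    A hedged rewrite of the loop that addresses all three points: -C makes tar change into the vhosts directory and archive relative paths (which removes both the deep folder structure and the leading-slash warning), and --exclude skips anything named "catch":

        for f in /var/www/vhosts/*; do
            FNAME=$(basename "$f")
            tar -C /var/www/vhosts --exclude='catch' -cvf "/root/backup/ftp/$FNAME.tar" "$FNAME"
        done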

  • Manage Kickstart library with Puppet

    - by Tim Brigham
    I maintain a library of different Kickstart configurations, mostly for CentOS 5 and 6. It has recently gotten to the point where I want to deduplicate as much of this information as possible. I am aware of a couple of options out there which can dynamically generate Kickstart files, but I am not interested in going that route at this point unless I really have to. I would like to create my Kickstart files using a template along the following lines:

        deploy1-centos5.erb
        ....
        install=http://.../$arch/...
        repo=http://.../$arch/...
        ....

    My output naming scheme is "deploy1-centos5-x86_64". I'd like to be able to create several Kickstart files from a given template: one for 32-bit, one for 64-bit, one for ppc, etc. This would work perfectly if I could set the value of arch each time the template is called to create a file. What is the readiest way to address this?

  • Suggestions for hosted file sharing services

    - by Jon
    Before I pose my question, I will give some insight into my scenario:

    - I work for a small business (cost is an important factor).
    - Our bandwidth is limited and would not support an in-house FTP server.
    - We need to share files (mostly PDF, InDesign, and Illustrator documents) with our clients, and as we expand, we are finding that our current locally-hosted FTP solution is too slow and is becoming a detriment to our sales team.

    What we need is a remotely hosted solution for sharing files with our clients, specifically with the following features:

    - more than 100 GB of secure storage;
    - the ability to give each client unique login credentials granting access to a personalized directory or folder, while limiting access to the other files on the server;
    - a relatively simple web-based UI for clients with limited computer knowledge.

    We have considered a dedicated remote server and web-based services (box.net, yousendit.com, onehub.com, filesanywhere.com), but I am unsure of the direction we should take. Have I left another solution out? What would you suggest? Thanks in advance.

  • Our company has hundreds of thousands of photos; how do we store and browse/find them efficiently?

    - by tobefound
    We currently store our photos in a structure like this:

        folder\1\10000 - 19999.JPG|ORF|TIF (10 000 files)
        folder\2\20000 - 29999.JPG|ORF|TIF (10 000 files)
        etc...

    They are stored on four different 2 TB D-Link NASes, attached and shared on our office network (\\nas1, \\nas2, and so on). The problems:

    1. When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor: the list takes a long time to generate in the Explorer window, even with icons turned off.
    2. Initial access to the NAS itself is sometimes slow.

    SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100,000+ files in one single folder shouldn't be a problem, but we don't dare go there now that we are seeing problems at the 10K level. All input greatly appreciated, /T

  • Persistently retrying and resuming downloads with curl

    - by Svish
    I'm on a Mac and have a list of files I would like to download from an FTP server. The connection is a bit buggy, so I want the download to retry and resume if the connection drops. I know I can do this with wget, but unfortunately Mac OS X doesn't come with wget, and to install it (unless I have missed something) I would need to install Xcode and MacPorts first, which I would like to avoid. curl is available, though, but I don't really know how it works or how to use it. If I have a list of files in a text file (one full path per line, like ftp://user:pass@server/dir/file1), how can I use curl to download all those files? And can I get curl to never give up, retrying indefinitely and resuming downloads where it left off?
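
    A minimal sketch, assuming the list lives in urls.txt (one URL per line): -O saves each file under its remote name, -C - resumes any partial download, and the until loop restarts curl whenever it exits non-zero, so a dropped connection simply triggers another resume attempt:

        while read -r url; do
            until curl -O -C - --retry 10 --retry-delay 5 "$url"; do
                echo "transfer interrupted, resuming $url ..." >&2
                sleep 5
            done
        done < urls.txt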

  • rsync generates far more traffic than expected

    - by user109459
    I use rsync to back up one of my servers, which holds 4 GB of files. When I transfer these files, the traffic is not the expected 4 GB; it is a lot higher, about 60 GB. I checked the traffic on my server, on the backup server, and on the router, and all three report 60 GB of traffic, yet at the end rsync says that it transferred only 4 GB. Another problem is that I can't debug it, because the problem occurs randomly.

  • Searching Multiple Terms

    - by nevets1219
    I know that grep -E 'termA|termB' files lets me search multiple files for termA OR termB. What I would like to do instead is search for termA AND termB. They do not have to be on the same line, as long as the two terms exist within the same file; essentially a "search within results" feature. I know I can pipe the results of one grep into another, but that seems slow when going over many files:

        grep -l "termA" * | xargs grep -l "termB" | xargs grep -E -H -n --color "termA|termB"

    Hopefully the above isn't the only way to do this. It would be extra nice if this could work on Windows (I have Cygwin) and Linux. I don't mind installing a tool to perform this task.
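
    One single-pass alternative, sketched assuming GNU awk (for nextfile): it prints the names of files containing both terms, then hands those to the same colorizing grep as above:

        gawk 'FNR == 1 { a = b = 0 }
              /termA/  { a = 1 }
              /termB/  { b = 1 }
              a && b   { print FILENAME; nextfile }' * \
          | xargs grep -E -H -n --color "termA|termB"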

  • ffmpeg with full HTML5 conversion support

    - by user58542
    I need to know which libraries ffmpeg requires to convert any format of audio and video into the HTML5-supported formats, with the best configuration for converting audio and video files. For audio files I need mp3 and Ogg support; for video files I need FLV, H.264, Ogg Theora, and VP8 (WebM). I'm using Debian with the deb-multimedia repository. I need the list of packages required for these target formats (converting from any format), and any configuration needed, whether installing from package management or compiling the latest ffmpeg from its repository. Thanks a lot.
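
    A hedged sketch of the conversions themselves, assuming an ffmpeg built with libmp3lame, libvorbis, libtheora, libvpx, and libx264 (ffmpeg -codecs shows what a given build actually supports; input names are placeholders):

        # audio targets
        ffmpeg -i input.avi -acodec libmp3lame output.mp3
        ffmpeg -i input.avi -acodec libvorbis output.ogg

        # video targets
        ffmpeg -i input.avi -vcodec flv -acodec libmp3lame output.flv                    # FLV
        ffmpeg -i input.avi -vcodec libx264 -acodec aac -strict experimental output.mp4  # H.264
        ffmpeg -i input.avi -vcodec libtheora -acodec libvorbis output.ogv               # Ogg Theora
        ffmpeg -i input.avi -vcodec libvpx -acodec libvorbis output.webm                 # VP8/WebM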

  • Copying large file from SD to HDD via USB failing on Ubuntu

    - by Kent Boogaart
    Hi, I'm attempting to copy some large files from my camera (Canon EOS 500D) to my laptop, which runs 64-bit Ubuntu 9.04, with the two devices connected over USB. For most files it is simply a matter of Ctrl-C and Ctrl-V, and I have done this successfully many times with both photos and small movies (e.g. 180 MB). However, when I attempt this with very large files (e.g. 3 GB), the copy starts with a lot of activity on both the camera and the laptop, but after 10 minutes or so the camera is automatically unmounted and the copy fails to complete. I have read that this might be because the device is not mounted as a mass-storage device, but I cannot see any obvious way to change this behavior. Can anyone offer any direction here? I'll get a USB card reader if necessary, but I'd prefer to be able to just plug my camera in. Thanks

  • rsync filtering

    - by biomed
    I use an rsync command to sync two directories, remote and local. The command (used in a Python script) is:

        os.system('rsync --verbose --progress --stats --recursive '
                  '--copy-links --times --include="*/" --include="*good_name*.good_ext*" '
                  '--exclude-from "/myhome/mydir/src/rsync.exclude" '
                  '%s %s' % (remotepath, localpath))

    I want to exclude certain directories that contain the same kinds of files I also want to include: I want to include, recursively, any_dir_name/any_file_name.good, but I want to exclude any and all files that are under bad_dir_name/. I used --exclude-from, and here is my exclude file:

        *
        /*.bad_dir_name/

    Unfortunately it doesn't work. I suspect it may have something to do with --include="*/", but if I remove that, the command doesn't sync any files at all. Thanks for the help.
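
    For what it's worth, rsync filter rules are order-sensitive: the first rule that matches a path wins, so an exclude for the unwanted directory has to appear before the --include="*/" that would otherwise pull it in. A hedged sketch with the filters inlined in place of the exclude file (the final --exclude='*' plays the role of the lone * in that file, excluding everything not matched earlier; paths are placeholders):

        rsync --verbose --progress --stats --recursive --copy-links --times \
              --exclude='bad_dir_name/' \
              --include='*/' --include='*good_name*.good_ext*' --exclude='*' \
              user@remote:/srcpath/ /localpath/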

  • How to export or view audio file references in a PDF?

    - by redshift
    I have an interactive PDF file that is over 90 pages long. Each page is a map with city names, and each city name carries a Spanish pronunciation of that city in a .wav file. There are about 10-15 audio files per map, which comes out to 1,000+ audio files in total. Is there a way to extract/export a list of the sound file names associated with each map? I tried saving the PDF as HTML, but it only exported images and text; because the audio files are embedded in the PDF, the file names did not carry over to the HTML file. Any other ideas? I need to see which audio file goes with which map/page.
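
    A rough command-line sketch of one way to get a per-page list, assuming pdftk is available and the .wav file names appear uncompressed somewhere in each page's objects (strings will miss names inside compressed streams, so treat the output as a first pass; "maps.pdf" is a placeholder name):

        # split the PDF into one file per page, then scrape each page for .wav references
        pdftk maps.pdf burst output page_%03d.pdf
        for p in page_*.pdf; do
            printf '%s: ' "$p"
            strings "$p" | grep -o '[A-Za-z0-9_./-]*\.wav' | sort -u | tr '\n' ' '
            echo
        done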

  • Transferring files over SSH on Linux while maintaining permissions

    - by jbolt
    I need to transfer files over SSH to another server. The file structures are identical on both sides. I have used scp -r, but that does not retain the original file/directory permissions. rsync does the job of keeping the permissions intact, but it does not delete the files on the destination side when I want to overwrite them because of changes. I know rsync will copy the changes when the source files are newer, but I need it to just copy everything regardless of date (i.e. replace the destination directory with the one I am moving) without having to shell into the destination first and manually delete the directory. I heard tar can do this, but I cannot seem to get it to work without errors. The syntax is:

        tar -cf - /directory/directory | ssh host.name "tar -xf - -C /destination_directory"

    Any help would be appreciated.
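
    Since the goal is "replace the destination with the source, permissions included", rsync can also do that directly; a minimal sketch: -a preserves permissions and ownership, --delete removes destination files that no longer exist on the source, and -I (--ignore-times) copies every file regardless of its timestamp:

        rsync -aI --delete /directory/directory/ host.name:/destination_directory/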

  • Solaris NFS: user permissions

    - by cjavapro
    I am very new to NFS and would like to make sure I am clear. If the NFS server shares a directory rw, and all the files in the directory have permissions 700 with user/group root/root, then on the client you would have to log in as root to see them. Is this correct? I am aware that a non-root user on the client could make a direct connection to override this (as in: don't use the mount, just use an NFS client hack). It really seems like anyone who has access to the client machine should have access to the files, and that the client machine should ignore permissions; only the server should handle permissions. Am I correct in my understanding? Is it normal to have this type of layout? Is there a way to ignore the permissions on the client side?

  • Unix/Linux permissions setup for shared hosting with Apache

    - by weiyin
    I'm in the process of setting up a server from a clean CentOS 5 install. What is the best permission structure (users, groups, Unix permissions) for running a single instance of Apache for multiple users? Ideally, it should satisfy these requirements:

    - Each user's websites are stored in a subdirectory of their home directory.
    - Users can edit files and permissions.
    - Apache can read the websites of all users.
    - No user can read the website files of other users.

    Bonus question: how do you add PHP and/or Perl and/or Ruby to Apache without allowing any user to access another user's files?
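
    One conventional layout, sketched for CentOS 5 where Apache runs as the group apache (the user name is illustrative): group-execute on the home directory lets Apache traverse it without letting other users in, and the setgid bit keeps new files in the web directory owned by the apache group:

        useradd -m alice
        chgrp apache /home/alice
        chmod 710 /home/alice                  # apache may traverse; other users may not
        mkdir /home/alice/public_html
        chown alice:apache /home/alice/public_html
        chmod 2750 /home/alice/public_html     # alice edits, apache reads, others are blocked

    For the bonus question, a layout like this is usually paired with something like suexec or suPHP, so scripts execute as the owning user rather than as the shared Apache user.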

  • Find RARs with duplicate content

    - by Scott McClenning
    I need a utility to find RAR files that contain duplicate data (i.e. files within the RARs that hash the same but could have different names). I can open the RARs and see that the CRCs are the same, but I was hoping for a more automated process that would work in bulk (hundreds of files). Hashing each RAR as a whole won't help, because the file contained within could have a different name, or the archive could be compressed at a different level. If needed, a utility that extracts the contents of the RARs and then compares them would work, but that is not preferred. I would prefer a free utility for Windows, but a paid utility, or a utility for Linux, would be acceptable.
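
    On the Linux side, a rough sketch of the bulk approach, assuming unrar's technical listing (unrar lt) prints a CRC32 field for each archived file: collect every (CRC, archive) pair, then flag CRC values that appear more than once (matching CRCs mark candidates for duplication, not proof):

        for f in *.rar; do
            unrar lt "$f" | awk -v arc="$f" '/CRC32/ { print $2, arc }'
        done | sort | awk 'seen[$1]++ { print "possible duplicate:", $0 }'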

  • Tomcat SSL integration issue

    - by small_ticket
    Hi all, I've bought a wildcard SSL certificate from a company: I sent them the CSR file, and they sent me back two certificate files, namely CA.txt and com_sertificate. I've searched the web and found some tutorials about Tomcat and SSL, but I cannot get anywhere with these two files; all the tutorials mention different files that I don't have. (I asked the company I bought the certificates from about this, but they said they have no knowledge of Tomcat integration.) Does anyone have an idea about this? P.S. I'm using Ubuntu 8.04 Server, Java 1.6, and Tomcat 6.
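
    Assuming the CSR was generated with keytool (so there is an existing keystore holding the private key; the keystore name and aliases below are illustrative), the usual sequence is to import the CA chain first and then the domain certificate under the same alias as the key pair:

        keytool -import -trustcacerts -alias root   -keystore keystore.jks -file CA.txt
        keytool -import -trustcacerts -alias tomcat -keystore keystore.jks -file com_sertificate

    The HTTPS connector in Tomcat's server.xml then points at that keystore via its keystoreFile and keystorePass attributes. If the CSR was instead generated with OpenSSL, the key and certificates would first need to be bundled into a PKCS#12 file (openssl pkcs12 -export) and referenced with keystoreType="PKCS12".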

  • Smart Auto-completion in SVN (and other programs!)

    - by Jimmy
    When I type "svn add path/to/somefile..." and tab to autocomplete, the system should ONLY complete files/directories that are NOT under currently under SVN control. Likewise, when I commit, remove or resolve files, the tab completion should only complete files that are relevant to what I'm doing. This is especially important in SVN where I can waste thousands of keystrokes typing long path and file names, but it of applies to other programs. I know bash has a bash_completion file that can be used to programatically alter this behaviour but I've not found a decent example of SVN completion which actually completes file names rather than SVN command names. My question is: Does anyone have such a setup? Does anyone use a different shell or tool that does something similar? Has anyone given this any thought?

  • Computer won't start when connecting SATA HDD

    - by vlad
    I just bought a new HDD some time ago, and recently I bought another SATA cable so I could have both HDDs connected at the same time and transfer all the files from the older one to the newer. When I connected both of them, the computer started up; I tapped F2 to enter the BIOS and make sure the drive was detected, and then after 5-10 seconds the computer instantly shut down. Since this incident, my computer won't start at all when the SATA power connector is attached to the old HDD. If I connect only my new drive, it works without any problem. If I connect only my old drive, or both the old drive and the new one, and then push the power button, the CPU fan rotates about 5 degrees and then stops, and that's it: the computer doesn't start, and there is no sound from the PC speaker either. Is there any way to recover my files from the old HDD and transfer them to my new one? I do not want to use the old HDD any more; I only need to save all my files.
