Search Results

Search found 37883 results on 1516 pages for 'sparse files'.


  • Does anything bad happen if I change /etc/ldap/slapd.d/cn=config.ldif manually?

    - by HVNSweeting
    Since version 2.3, OpenLDAP has used a configuration engine called slapd-config, which is said to make all LDAP configuration changeable on the fly. This is the header of /etc/ldap/slapd.d/cn=config.ldif: # AUTO-GENERATED FILE - DO NOT EDIT!! Use ldapmodify. I've changed data in that file and in some other files which have the same header, and after restarting slapd my changes took effect. Does anything else happen if I change those files manually? If I don't need on-the-fly changes, should I edit those files manually instead of using ldapmodify? Which application generates those files, and when? NOTE: I'm using openldap-2.4.28 on Ubuntu 12.04.
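
    For comparison, the supported route is to push the same change through ldapmodify over the local ldapi socket; a minimal sketch, where olcLogLevel is just an illustrative attribute:

        # loglevel.ldif (olcLogLevel is only an example attribute)
        dn: cn=config
        changetype: modify
        replace: olcLogLevel
        olcLogLevel: stats

        # apply it through the cn=config backend over the local socket
        ldapmodify -Y EXTERNAL -H ldapi:/// -f loglevel.ldif

    slapd then rewrites the corresponding file under /etc/ldap/slapd.d/ itself, which is why the header warns against hand edits.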

    Read the article

  • Apache doesn't send the MP3 header even when using the file's direct address

    - by user1728307
    Apache doesn't send the MP3 header even when I use the file's direct address, which means I can play the files with Flash audio players on my web pages, but when I try to download a file from its direct address on my server I get "Error 101 (net::ERR_CONNECTION_RESET): The connection was reset", or sometimes I get a file with an .mp3 extension that is just 13 bytes in size; opening that file in gedit/Notepad shows only: <html></html> I don't have any problem with PHP files and images, but MP3 files are never sent to the browser for download or playback. I added this line to httpd.conf: AddType audio/mpeg .mp3 but it makes no difference! Thanks in advance.
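
    A quick way to see what Apache actually returns for such a file is to request only the response headers; a minimal check, with a hypothetical URL:

        curl -I http://example.com/music/song.mp3

    A correctly served MP3 should come back with Content-Type: audio/mpeg and the file's real Content-Length, rather than a 13-byte HTML body.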

    Read the article

  • Using git with cgit for decentralized/centralized development

    - by polemon
    I plan to use git for hosting my projects on my server. I've read about cgit and git-daemon, and I have more or less decided to use those tools. But general use is still somewhat confusing to me. What do I need to set up on the server in order to push my files onto it? And when the files on the server are newer than the files on my computer, how do I merge them? Also, say I develop on two computers: how do I merge from one computer to the other? And when two people are working on the same project, how do they merge their local repos with one another? As you can probably tell by now, I come from SVN, but I've worked with Mercurial, and now I'd like to try out git.
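
    For reference, a minimal sketch of the usual server-plus-clients workflow, with hypothetical paths and host names:

        # on the server: create a bare repository to push into
        git init --bare /srv/git/myproject.git

        # on each workstation: register the server and exchange work over SSH
        git remote add origin user@server:/srv/git/myproject.git
        git push origin master      # publish local commits
        git pull origin master      # fetch and merge whatever is newer on the server

    Every machine (yours or a collaborator's) pulls before pushing, and git performs the merge locally; cgit then only needs read access to /srv/git to publish the repositories over HTTP.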

    Read the article

  • Writing to network share failed

    - by Unreason
    I have Outlook files stored on a network share and accessed by the clients directly. Outlook is version 2003, the clients are Windows XP and the server is 2003. The files are quite big, at around 3GB. One problem that happens regularly is that I get 'delayed write failed', and it happens only on these PST files. When it does, I have to run scanpst.exe to fix the PST file. I did not find any entries in the event logs that I could relate to the issue. What would you suggest changing to fix the issue, or where should I look to diagnose it further? EDIT: No loss on ping, and ping times are within the normal range for a LAN.

    Read the article

  • Store and encrypt data over the Internet

    - by sotsec
    I am trying to build a system that will let me access my files remotely. I want to set up an external hard drive or a NAS that I will access over the Internet, and I want every file stored on that system to be encrypted. Could you please suggest the best way of doing that? In other words, what is the best way to access your files remotely with maximum safety, while at the same time the space the files occupy is protected against theft (encryption)? Thank you.

    Read the article

  • vsftpd per-group configuration

    - by roqs
    I want to configure vsftpd in a per-group fashion instead of a per-user configuration. Is that possible? Suppose I have two groups, groupA and groupB; my goal is: users in groupA have permission (rwx) to all files in directory dir1, users in groupB have permission (rwx) to all files in directory dir2, and all users of the system have permission (rwx) to all files in directory dir3. For example:

    ftp@test:/home/ftp# ls -l
    drwxrwxr-x 16 root groupA 4096 Jun  3 10:45 dir1
    drwxrwxr-x  2 root groupB 4096 Jun  3 10:56 dir2
    drwxrwxr-x  8 root users  4096 Jun  3 11:01 dir3

    How can I do that with vsftpd?
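
    A minimal sketch of the underlying Unix side of that layout, assuming local system users and a vsftpd that simply honours ordinary file permissions (the user names are illustrative):

        groupadd groupA && groupadd groupB
        usermod -aG groupA alice            # alice is a hypothetical groupA member
        usermod -aG groupB bob              # bob is a hypothetical groupB member
        chown root:groupA /home/ftp/dir1 && chmod 2775 /home/ftp/dir1
        chown root:groupB /home/ftp/dir2 && chmod 2775 /home/ftp/dir2
        chown root:users  /home/ftp/dir3 && chmod 2775 /home/ftp/dir3

    The setgid bit (the leading 2) keeps files created inside each directory in that directory's group, so group members retain rwx access without any per-user vsftpd configuration.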

    Read the article

  • Folder sync application which can sync over the Internet (the other machine specified by an IP)?

    - by Adal
    I need to sync some folders between two Windows 7 machines. While they are connected to the same LAN, they can't see each other over Windows networking, since sharing is disabled on both of them (for security reasons). Do you know of any sync app which can work over IP? The folder I need to sync has 500,000 files in it (80 GB in total), so the sync app should be pretty efficient. At the moment I copy the files from one machine to the other over FTP, but it takes forever, since a separate connection is opened for each file. Alternatively, maybe you know of an app which can efficiently transfer a large number of files between two machines over the Internet?

    Read the article

  • Hard disk failure. Can I recover my "moved" folders?

    - by Doug
    I am in the process of moving all my files from an old laptop to a new one. I just moved 11GB of data from my old laptop to an external hard drive, and upon moving it from there to the new laptop, the external drive is reporting a CRC error (Data Error, Cyclic Redundancy Check). Now I am looking for a way to recover the files I moved off my old laptop (not the external drive); I understand they are just marked as free space that could potentially be overwritten. I was getting ready to try out GetDataBack, but it says to install it on a healthy Windows system and attach the drive that needs recovery as an external drive. However, I don't want to turn off my computer without first getting the okay, since it is in this "moved" state. Please help! What can I do to recover the moved files? I haven't touched the computer since the move. What can I use to recover them?

    Read the article

  • Suggestions for hosted file sharing services

    - by Jon
    Before I pose my question, I will give some insight into my scenario: I work for a small business (cost is an important factor), our bandwidth is limited and would not support an in-house FTP server, and we need to share files (mostly PDF, InDesign and Illustrator documents) with our clients. As we expand, we are finding that our current locally-hosted FTP solution is too slow and is becoming a detriment to our sales team. What we need is a remotely hosted solution for sharing files with our clients, specifically with the following features: more than 100GB of secure storage; the ability to distribute unique login credentials to clients, granting access to a personalized directory or folder while limiting access to other files on the server; and a relatively simple web-based UI for clients with limited computer knowledge. We have considered a dedicated remote server and web-based services (box.net, yousendit.com, onehub.com, filesanywhere.com), but I am unsure of the direction we should be taking. Have I left another solution out? What would you suggest? Thanks in advance.

    Read the article

  • php.ini settings change not taking effect for large file uploads

    - by user51347
    My server was just reprovisioned, and my application, which uploads large (100MB+) files, now breaks after the re-installation. The symptom is quite consistent: smaller files (8MB in my tests) upload just fine, while larger files cut off at very nearly the same point every time on a given computer. A file that fails at 26% will fail at just about the same point every time, and one that takes 1:40 to fail will fail within 2 seconds of that every time. I have set my php.ini settings extravagantly: post_max_size = 512M, upload_max_filesize = 512M, max_input_time = 3600, max_execution_time = 3600. Is there possibly a setting at the Apache level which would override PHP?
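
    One thing worth ruling out is a request-body cap applied before PHP ever sees the upload; a minimal check, assuming a standard Apache layout (paths are illustrative):

        # look for a body-size limit in Apache core or mod_security configuration
        grep -Ri 'LimitRequestBody\|SecRequestBodyLimit' /etc/httpd/ /etc/apache2/ 2>/dev/null

        # the CLI view of the limits; the web SAPI may use a different php.ini,
        # so a phpinfo() page is the definitive confirmation
        php -i | grep -E 'post_max_size|upload_max_filesize|max_input_time'

    LimitRequestBody (Apache core) and SecRequestBodyLimit (mod_security) both abort uploads at a fixed byte count regardless of what php.ini allows.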

    Read the article

  • Remove folder structure from archive, ignore folder while archiving and fix error

    - by Michael
    I am trying to write a script to back up each of my Plesk hosts to an individual file, and I am having two problems: I would like to remove the folder structure from the archive (the tar is three folders deep), and I am getting this error: tar: Removing leading `/' from member names. I also need the archive to ignore folders named "catch", because I don't need them in my archive. The code:

    FILES=/var/www/vhosts/*
    FNAME=""
    for f in $FILES
    do
        FNAME=`basename $f`
        tar cfv "/root/backup/ftp/$FNAME.tar" $f
    done

    Sample output:

    tar: Removing leading `/' from member names
    /var/www/vhosts/mydomain.com/
    /var/www/vhosts/mydomain.com/conf
    /var/www/vhosts/mydomain.com/etc/
    /var/www/vhosts/mydomain.com/etc/group
    /var/www/vhosts/mydomain.com/etc/termcap
    /var/www/vhosts/mydomain.com/etc/passwd
    /var/www/vhosts/mydomain.com/usr/
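
    A minimal sketch of one way to address all three points at once, assuming GNU tar (the paths mirror the layout above):

        for f in /var/www/vhosts/*; do
            FNAME=$(basename "$f")
            # -C makes the archive paths relative to the vhost directory, which both
            # flattens the folder structure and avoids the leading-'/' warning;
            # --exclude skips any directory or file named "catch"
            tar -cvf "/root/backup/ftp/$FNAME.tar" --exclude='catch' -C "$f" .
        done

    With -C "$f" . the archive members are stored as ./conf, ./etc, and so on, instead of carrying the full /var/www/vhosts/... prefix.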

    Read the article

  • Manage Kickstart library with Puppet

    - by Tim Brigham
    I maintain a library of different kickstart configurations, mostly for CentOS 5 and 6. It has recently gotten to the point where I want to deduplicate as much of this information as possible. I am aware of a couple of options out there that can dynamically generate kickstart files; I am not interested in those at this point unless I really need to go that route. I would like to create my kickstart files from a template along the following lines:

    deploy1-centos5.erb
    ....
    install=http://.../$arch/...
    repo=http://.../$arch/...
    ....

    My output naming schema is "deploy1-centos5-x86_64". I'd like to be able to create several kickstart files from a given template: one for 32-bit, one for 64-bit, one for ppc, and so on. This would work perfectly if I could readily set the value of arch each time the template is called to create a file. What is the most straightforward way to address this?
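
    Puppet aside, a minimal shell sketch of the same idea, assuming the template keeps a literal $arch placeholder as shown above:

        for arch in i386 x86_64 ppc; do
            # substitute the placeholder and follow the deploy1-centos5-<arch> naming schema
            sed "s/\$arch/$arch/g" deploy1-centos5.erb > "deploy1-centos5-$arch.cfg"
        done

    Within Puppet, the equivalent is a defined type that takes the architecture as a parameter and renders the same template once per architecture.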

    Read the article

  • Persistently retrying and resuming downloads with curl

    - by Svish
    I'm on a Mac and have a list of files I would like to download from an FTP server. The connection is a bit flaky, so I want the download to retry and resume if the connection drops. I know I can do this with wget, but unfortunately Mac OS X doesn't come with wget, and to install it (unless I have missed something) I would need to install Xcode and MacPorts first, which I would like to avoid. curl is available, though, but I don't really know how it works or how to use it. If I have a list of files in a text file (one full path per line, like ftp://user:pass@server/dir/file1), how can I use curl to download all those files? And can I get curl to never give up, that is, retry indefinitely and resume downloads where they left off?
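
    A minimal sketch of doing this with curl alone, assuming the URLs are in urls.txt, one per line:

        while read -r url; do
            # -C - resumes a partial download, -O keeps the remote file name;
            # the until loop restarts curl whenever it exits non-zero (dropped connection)
            until curl -C - -O "$url"; do
                echo "retrying $url" >&2
                sleep 5
            done
        done < urls.txt

    Recent curl versions also offer --retry N, but the until loop is what gives the never-give-up behaviour.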

    Read the article

  • Our company has hundreds of thousands of photos; how do we store and browse/find them efficiently?

    - by tobefound
    We currently store our photos in a structure like this:

    folder\1\10000 - 19999.JPG|ORF|TIF (10,000 files)
    folder\2\20000 - 29999.JPG|ORF|TIF (10,000 files)
    etc...

    They are stored on four different 2TB D-Link NASes attached and shared on our office network (\\nas1, \\nas2, and so on). Problems: 1) When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor: the listing takes a long time to generate in an Explorer window, even with icons turned off. 2) Initial access to the NAS itself is sometimes slow. SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100,000+ files in a single folder shouldn't be a problem, but we don't dare go there now that we are seeing problems at the 10K level. All input greatly appreciated, /T

    Read the article

  • Searching Multiple Terms

    - by nevets1219
    I know that grep -E 'termA|termB' files lets me search multiple files for termA OR termB. What I would like to do instead is search for termA AND termB. They do not have to be on the same line, as long as both terms exist within the same file; essentially a "search within results" feature. I know I can pipe the results of one grep into another, but that seems slow when going over many files: grep -l "termA" * | xargs grep -l "termB" | xargs grep -E -H -n --color "termA|termB" Hopefully the above isn't the only way to do this. It would be extra nice if this could work on Windows (I have Cygwin) and Linux. I don't mind installing a tool to perform this task.
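
    A minimal variation on that same pipeline, assuming GNU grep (as in Cygwin and most Linux distributions); -Z and -0 make it safe for file names containing spaces, and -r recurses instead of relying on *:

        grep -rlZ "termA" . | xargs -0 grep -lZ "termB" | xargs -0 grep -EHn --color "termA|termB"

    The logic is unchanged: the first grep narrows the search to files containing termA, the second narrows those to files that also contain termB, and the last prints the matching lines from the survivors.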

    Read the article

  • rsync generates far more traffic than expected

    - by user109459
    I use rsync to back up one of my servers, which holds 4GB of files. When I transfer these files, the traffic isn't the expected 4GB; it is a lot higher, about 60GB. I checked the traffic on my server, on the backup server and on the router, and all three report about 60GB of traffic. Yet at the end rsync says that it only transferred 4GB. Another problem is that I can't debug this, because the problem occurs randomly.

    Read the article

  • How to reduce the size (disk space) of Windows 8?

    - by humanityANDpeace
    This question is about what I can do to reduce the amount of disk space Windows 8 uses. Background: at present, with only one program installed (MS Access 2007), about 15GB of my hard disk space is used, and I have little space left (it is a 17GB partition on an SSD). I would like solutions along these lines: removing files that are not really needed (drivers for hardware the system does not actually have); help files that are not really needed (documentation); pagefile.sys (assuming I have 4GB of RAM and no real need for swapping); hiberfil.sys (used for hibernate and sleep, which I do need, although removing it would regain about 4GB of space). Ideally I would delete mostly files that I would most likely never need, though I have no good idea where to start. Since my hardware setup will not change, I would be willing to delete all the drivers that Windows 8 ships for hardware I do not have. The question is about ways to reduce the space that Windows 8 uses.
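
    For reference, a minimal sketch of the standard knobs for the two large files mentioned above, run from an elevated command prompt and only where the feature really is expendable:

        :: deletes hiberfil.sys by disabling hibernation (and with it hybrid sleep)
        powercfg /hibernate off

        :: trims superseded components from the WinSxS folder
        Dism /Online /Cleanup-Image /StartComponentCleanup

    The page file is resized or moved under System Properties > Advanced > Performance settings > Virtual memory rather than by deleting pagefile.sys directly.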

    Read the article

  • Copying large file from SD to HDD via USB failing on Ubuntu

    - by Kent Boogaart
    I'm attempting to copy some large files from my camera (Canon EOS 500D) to my laptop, which is running 64-bit Ubuntu 9.04, using USB to connect the two devices. For most files it is simply a matter of Ctrl-C and Ctrl-V, and I have done this successfully many times with both photos and small movies (e.g. 180MB). However, when I attempt this with very large files (e.g. 3GB), the copy seems to start with a lot of activity on both the camera and the laptop, but after 10 minutes or so the camera is automatically unmounted and the copy fails to complete. I have read that this might be due to the device not mounting as a mass storage device, but I cannot see any obvious way to change this behaviour. Can anyone offer any direction here? I'll get a USB card reader if necessary, but I'd prefer to be able to just plug my camera in. Thanks

    Read the article

  • ffmpeg with full HTML5 conversion support

    - by user58542
    I need to know which libraries ffmpeg requires to convert audio and video from any format into the formats needed for HTML5, with the best configuration for converting audio and video files. For audio I need to support MP3 and Ogg; for video I need to support FLV, H.264, Ogg Theora and VP8 (WebM). I'm using Debian with the deb-multimedia repository. I need the list of packages required to produce these formats (from any input format), and also any configuration needed, whether installing from package management or compiling the latest ffmpeg from its repository. Thanks a lot.
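
    For reference, the encoders usually behind those targets are libmp3lame (MP3), libvorbis and libtheora (Ogg), libx264 (H.264) and libvpx (WebM); a minimal sketch of the conversions, assuming an ffmpeg build compiled with those encoders enabled and hypothetical input files:

        ffmpeg -i input.wav -c:a libmp3lame output.mp3
        ffmpeg -i input.wav -c:a libvorbis output.ogg
        ffmpeg -i input.avi -c:v libx264 -c:a aac -strict experimental output.mp4
        ffmpeg -i input.avi -c:v libtheora -c:a libvorbis output.ogv
        ffmpeg -i input.avi -c:v libvpx -c:a libvorbis output.webm

    An "Unknown encoder" error from any of these means the corresponding library was not enabled in the installed ffmpeg build.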

    Read the article

  • rsync filtering

    - by biomed
    I use an rsync command to sync two directories, remote to local. The command (used in a Python script) is:

    os.system('rsync --verbose --progress --stats --recursive\
        --copy-links --times --include="*/" --include="*good_name*.good_ext*"\
        --exclude-from "/myhome/mydir/src/rsync.exclude"\
        %s %s'%(remotepath,localpath))

    I want to exclude certain directories that contain the same kind of files I also want to include: I want to include, recursively, any_dir_name/any_file_name.good, but I want to exclude any and all files that are in bad_dir_name/. I used --exclude-from, and here is my exclude-from file:

    *
    /*.bad_dir_name/

    Unfortunately it doesn't work. I suspect it may have something to do with --include="*/", but if I remove that the command doesn't sync any files at all. Thanks for the help.
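
    rsync evaluates include/exclude rules in the order they are given and the first matching rule wins, so a minimal sketch of an ordering that matches the intent above (the bad_dir_name pattern mirrors the question and is otherwise hypothetical):

        rsync --verbose --progress --stats --recursive --copy-links --times \
            --exclude="bad_dir_name/" \
            --include="*/" --include="*good_name*.good_ext*" \
            --exclude="*" \
            remote:/path/ /local/path/

    Because the directory exclusion comes before --include="*/", rsync never descends into bad_dir_name directories, while the trailing --exclude="*" drops every file that was not explicitly included.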

    Read the article

  • How to export or view audio file references in a PDF?

    - by redshift
    I have an interactive PDF file that is over 90 pages long. Each page is a map with city names, and each city name carries a Spanish pronunciation of that city in a .wav file. There are about 10-15 audio files for each map, which comes out to 1000+ audio files. Is there a way to extract/export a list of the sound file names associated with each map? I tried saving the PDF as HTML, but it only exported images and text; because the audio files are embedded in the PDF, the file names did not carry over to the HTML file. Any other ideas? I need to see which audio file goes with which map/page.

    Read the article

  • Solaris NFS: user permissions

    - by cjavapro
    I am very new to NFS, so I would like to make sure I understand this correctly. If the NFS server shares a directory rw, and all the files in the directory have permissions 700 with user/group root/root, then on the client you would have to log in as root to see them. Is this correct? I am aware that a non-root user on the client could make a direct connection to get around this (that is, not use the mount but an NFS client hack). It really seems as though anyone who has access to the client machine should have access to the files, and the client machine should be ignoring permissions; only the server should handle permissions. Am I correct in my understanding? Is it normal to have this type of layout? Is there a way to ignore the permissions on the client side?

    Read the article

  • Transferring files over SSH on Linux while maintaining permissions

    - by jbolt
    I need to transfer files to another server over SSH. The file structures are identical on both sides. I have used scp -r, but that does not retain the original file/directory permissions. rsync does keep the permissions intact, but it does not delete files on the destination side when I want to overwrite them because of changes. I know rsync will write the changes when the source files are newer, but I need it to simply copy everything regardless of date (i.e. replace the destination directory with the one I am moving) without having to shell into the destination first and manually delete the directory. I heard tar can do this, but I cannot seem to get it to work without errors. The syntax is: tar -cf - /directory/directory | ssh host.name tar -xf - C /destination_directory Any help would be appreciated.
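
    A minimal sketch of the corrected pipe, assuming GNU tar on both ends; the missing dash before C is what breaks the command quoted above:

        # archive relative to the source directory, extract preserving permissions
        tar -C /directory/directory -cf - . | ssh host.name "tar -xpf - -C /destination_directory"

        # rsync can achieve the same in one step: -a keeps permissions and
        # --delete removes destination files that no longer exist on the source
        rsync -a --delete /directory/directory/ host.name:/destination_directory/

    The -p on the extracting tar restores the permissions recorded in the archive.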

    Read the article

  • Unix/Linux permissions setup for shared hosting with Apache

    - by weiyin
    I'm in the process of setting up a server from a clean CentOS 5 install. What is the best permission structure (users, groups, Unix permissions) for running a single instance of Apache for multiple users? Ideally, it should satisfy these requirements: each user's websites are stored in a subdirectory of their home directory; users can edit files and permissions; Apache can read the websites of all users; and no user can read the website files of other users. Bonus question: how do I add PHP and/or Perl and/or Ruby to Apache without allowing any user to access another user's files?
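
    One common arrangement is a private group per user with the Apache user added to it, so Apache can traverse the tree while other users cannot; a minimal sketch for a single hypothetical user, alice:

        useradd alice                       # on CentOS this also creates the private group 'alice'
        usermod -aG alice apache            # let the apache user read through alice's group
        chmod 750 /home/alice               # group (apache) may enter, other users may not
        mkdir -p /home/alice/public_html
        chmod 2750 /home/alice/public_html  # setgid keeps new files in the 'alice' group

    For the bonus question, mod_php runs all scripts as the apache user, so per-user isolation for scripting languages generally requires suexec/suPHP or per-user FastCGI (e.g. PHP-FPM) pools instead.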

    Read the article

  • Find RARs with duplicate content

    - by Scott McClenning
    I need a utility to find RAR files that contain duplicate data (i.e. files within the RARs that hash the same but could have different names). I can open the RARs and see that the CRCs are the same, but I was hoping for a more automated process that would work in bulk (hundreds of files). Hashing the RAR file as a whole won't help, because the file contained within could have a different name, or the archive could be compressed at a different level. If needed, a utility that extracts the contents of the RARs and then compares them would work, but that is not preferred. I would prefer a free utility for Windows, but a paid utility or a utility for Linux would be acceptable.
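
    Lacking a dedicated tool, a minimal Linux-side sketch of the CRC-comparison idea, assuming a recent unrar whose technical listing (unrar vt) prints a CRC32: field per archive member:

        for f in *.rar; do
            # fingerprint each archive by its sorted member CRCs, ignoring the member names
            sig=$(unrar vt "$f" | awk '/CRC32/ {print $2}' | sort | md5sum | cut -d' ' -f1)
            echo "$sig  $f"
        done | sort | uniq -w32 -D

    Archives printed together share exactly the same set of member checksums, which catches renamed duplicates without extracting anything.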

    Read the article
