Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.


  • How do ulimit -n and /proc/sys/fs/file-max differ?

    - by bantic
    I notice that on a new CentOS image I just booted up on EC2, the default ulimit is 1024 open files, but /proc/sys/fs/file-max is set to 761,408, and I'm wondering how these two limits work together. I'm guessing that ulimit -n is a per-user limit on the number of file descriptors, while /proc/sys/fs/file-max is system-wide? If that's the case, say I've logged in twice as the same user -- does each login session have its own limit of 1024 open files, or is it a limit of 1024 open files shared across those sessions? And is there much performance impact to setting your max file descriptors to a very high number, if your system isn't ever opening very many files?
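
    For reference, a quick way to inspect both limits side by side (note that ulimit -n actually applies per process, not per user):

        # soft and hard per-process limits on open files (current shell)
        ulimit -Sn
        ulimit -Hn
        # kernel-wide ceiling, plus current usage (allocated, unused, max)
        cat /proc/sys/fs/file-max
        cat /proc/sys/fs/file-nr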

    Read the article

  • How to use Massren with Vim?

    - by Remonn
    I want to use the massren renamer from within Vim in order to rename files inside the text editor. First, I ran this command in Vim to set Vim as massren's default editor:

        :! c:\tools\massren_renamer\massren.exe --config editor "C:\Progra~2\Vim\vim74\gvim.exe --remote-silent-tab"

    Then I tried to rename a number of text files with this command in Vim:

        :%! c:\tools\massren_renamer\massren.exe d:\t*.txt

    but the list of files doesn't open at all in Vim. What did I do wrong?
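
    One possible culprit (an assumption, not confirmed by the post): gvim returns to the shell immediately unless started in the foreground, so massren may believe editing finished before the file list ever appears. Configuring a blocking editor with gvim's -f (foreground) flag would rule that out:

        :! c:\tools\massren_renamer\massren.exe --config editor "C:\Progra~2\Vim\vim74\gvim.exe -f"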

    Read the article

  • Good process/software for organizing photos past/present

    - by Matthew
    So I have tons of photos taken all the time. I have a lot from years past that I never went through (meaning deleting duplicates, etc.). I've got a new PC with Windows 7, and I'm wondering what a good process is to organize those photos. They're in folders that really have no meaning (people used to put them in a folder wherever -- even the desktop or somewhere else, not just the My Pictures folder). I'm going to keep all pictures in the "My Pictures" folder from now on. I've used Picasa from Google, and it works great. Is this the recommended free software for this? What process do I use to move the old pictures over into new "organized" folders? Lately in Picasa, when I import off my camera card, I just select something that names the folder after the date the pictures were taken. Is this advised? Just give me ideas on how to stay organized with photos. Should I tag them too? Should I rename the files? Keep in mind I have over 16,000 photos to go through, so it can't be anything too thorough.

    Read the article

  • Server not accepting uploads

    - by Tatu Ulmanen
    I'm having a strange problem with my VPS: I can download files from it, I can use PuTTY to connect to it, and everything behaves normally. But sometimes, when I try to upload a file to the server or save a file via SFTP, the connection inexplicably fails. I am using jEdit to edit files remotely via SFTP. When it works, it works fine. When it doesn't, I get an error message:

        Cannot save: java.io.IOException: inputstream is closed
        Cannot save: java.io.IOException: 4:

    I can see that a temporary save file (#file.php#save#) is created on the server with a filesize of 0. So the connection works, but when it comes to sending the actual data, something fails. The same thing happens with WinSCP, but the error is different:

        Copying file fatally failed. Copying files to remote side failed.

    And I can always browse the server with PuTTY without a problem. I see nothing abnormal in any log files. Auth.log shows this when I try to save:

        sshd[32638]: Accepted password for - from - port 62272 ssh2
        sshd[32638]: pam_unix(sshd:session): session opened for user - by (uid=0)
        sshd[32640]: subsystem request for sftp
        sshd[32638]: pam_unix(sshd:session): session closed for user -

    When I wait for a while (say, an hour), everything works fine again. It can't be a temporary ban, as I am still allowed to connect to the server, right? I know this may not be enough info to solve the problem, but I am grateful for any clues or bits of information that might help. What are the possible causes for this kind of behaviour, what log files can I check for clues, etc.? I'm running out of ideas!
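
    One way to gather more evidence (a generic SSH debugging sketch, not specific to this VPS): run a transfer with full client-side verbosity, and temporarily raise sshd's log level on the server:

        # client side: watch the SFTP protocol exchange fail in real time
        sftp -vvv user@host
        # server side: set "LogLevel DEBUG3" in /etc/ssh/sshd_config,
        # reload sshd, then retry the upload and check auth.log again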

    Read the article

  • How to avoid sshfs freezing?

    - by Andreas Hagen
    So the issue is this: I've installed sshfs on Ubuntu 12.04 and I'm trying to connect to a couple of remote servers. Initially the mount seems successful. Sometimes Gnome even picks it up and displays the "new device found" box at the bottom of the screen. But from here on, not much works -- or at least not any more. The first couple of times I connected it seemed to work fine, and I was able to transfer some files. Then I disconnected using fusermount -u <folder>, and after reconnecting a little later the trouble started. Now, after executing

        sshfs -o ServerAliveInterval=15 -o reconnect -C -o workaround=all -o idmap=user root@<host>:/ <folder>

    the shell just freezes when I change directory into the mount point. Strangely, ls -al <folder> works when listing just the root of the remote system, but nothing more. Also, every file explorer I've tried freezes just like cd <folder>. Since it did work the first time, it seemed like some kind of zombie thread or process was hanging around my system, so I have tried rebooting, but no luck. sshfs -V gives this:

        SSHFS version 2.3
        FUSE library version: 2.8.6
        fusermount version: 2.8.6
        using FUSE kernel interface version 7.12

    So yeah, any ideas?
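
    To see where the hang occurs (a general sshfs debugging approach, keeping the <host> and <folder> placeholders from the question): run sshfs in the foreground with debug output, then trigger the cd from another terminal and watch which request stalls:

        sshfs -f -o sshfs_debug root@<host>:/ <folder>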

    Read the article

  • apache/nginx html file size limit

    - by Daniel
    When serving HTML files to a user's browser, where can I raise the size limit? I want to send extremely large HTML files to users via Apache and nginx. The files are being truncated in both Apache and nginx -- what setting determines the maximum file size?
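
    On the nginx side, a minimal sketch of directives that are sometimes involved when large responses get cut short (an assumption that buffering or timeouts are at fault; the values are illustrative, not recommendations):

        sendfile       on;          # hand static files off to the kernel
        output_buffers 2 512k;      # response buffers used when sendfile is off
        send_timeout   300s;        # stalled clients are cut off past this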

    Read the article

  • Windows Vista - overlay icon with two people

    - by abcdefghijkl
    I had to save data from my hard disk to an external drive (under Linux), and after reinstalling Windows Vista (and copying the files back) there is a strange overlay icon with two people. How do I get rid of this? At first I thought the files might be shared, but they are not. The user is the owner of all those files and they are accessible to everyone. Any ideas what Vista is trying to tell me with these icons, and how do I get rid of them?

    Read the article

  • Got Hacked. Want to understand how.

    - by gaoshan88
    Someone has, for the second time, appended a chunk of JavaScript to a site I help run. This JavaScript hijacks Google AdSense, inserting their own account number and sticking ads all over. The code is always appended, always in one specific directory (one used by a third-party ad program), affects a number of files in a number of directories inside this one ad dir (20 or so), and is inserted at roughly the same overnight time. The AdSense account belongs to a Chinese website (located in a town not an hour from where I will be in China next month -- maybe I should go bust heads... kidding, sort of). By the way, here is the info on the site: http://serversiders.com/fhr.com.cn

    So, how could they append text to these files? Is it related to the permissions set on the files (ranging from 755 to 644)? To the webserver user (it's on MediaTemple, so it should be secure, yes)? I mean, even if a file has permissions set to 777, I still can't just add code to it at will... how might they be doing this? Here is a sample of the actual code for your viewing pleasure (and as you can see, not much to it -- the real trick is how they got it in there):

        <script type="text/javascript"><!--
        google_ad_client = "pub-5465156513898836";
        /* 728x90_as */
        google_ad_slot = "4840387765";
        google_ad_width = 728;
        google_ad_height = 90;
        //-->
        </script>
        <script type="text/javascript"
          src="http://pagead2.googlesyndication.com/pagead/show_ads.js">
        </script>

    Since a number of folks have mentioned it, here is what I have checked (and by checked I mean I looked around the time the files were modified for any weirdness, and I grepped the files for POST statements and directory traversals):

        access_log (nothing around the time except normal (i.e. excessive) msn bot traffic)
        error_log (nothing but the usual file-does-not-exist errors for innocuous-looking files)
        ssl_log (nothing but the usual)
        messages_log (no FTP access in here except for me)
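
    A generic way to widen that search (the paths and timestamps below are placeholders -- adjust them to the actual ad directory and the overnight modification window):

        # list everything under the ad dir modified during the suspect window
        find /path/to/ad/dir -newermt "2010-05-20 00:00" ! -newermt "2010-05-20 06:00" -ls
        # look for POSTs that touched the same directory around that time
        grep "POST" /var/log/httpd/access_log | grep "/path/to/ad/dir"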

    Read the article

  • Windows: How to add batch-script action to Right Click menu

    - by ervingsb
    I have a few programs that create temp files, backup files, or similar unimportant files. For example, GVim for Windows by default creates a backup file named filename.txt~. I sometimes need to clean up a dir and remove all these files. I have made a simple .bat file for this. However, it is cumbersome to have to start up cmd, navigate to the folder, and run the script -- especially since this is a script that I would like to run often on various folders. And I do not want to copy the script to multiple folders, as this would be a maintenance nightmare. So I was thinking that the best solution would be to add a Right Click menu item that allows me to run the script, so that I can right-click a folder in Explorer, click Cleanup, and have my script run on that folder. So my question is: How do I add a right-click menu action that runs a custom batch script?
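
    A sketch of the usual registry-based approach (the Cleanup key name and the script path are hypothetical -- substitute your own):

        reg add "HKCU\Software\Classes\Directory\shell\Cleanup" /ve /d "Cleanup"
        reg add "HKCU\Software\Classes\Directory\shell\Cleanup\command" /ve /d "\"C:\scripts\cleanup.bat\" \"%1\""

    The "%1" receives the folder that was right-clicked, so the batch script can cd to it before deleting anything.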

    Read the article

  • I need a few minutes of dedicated server a week, but not for hosting, just to convert ogg etc

    - by talkingnews
    I'm completely happy with my web hosting; it's just that I need to do one little thing they won't allow, and that's run an instance of SoX to convert about 30 MP3s to Ogg files, in various directories, a couple of times a week, done automatically in response to the detected upload of an MP3. Probably looking at a minute of server time over the whole week. I've had unhelpful suggestions on other forums like "why not leave your home PC on 24 hours a day and then use all your ISP bandwidth to do this", which doesn't work for me. I know that I can host files on, say, Amazon S3, but is there something similar for my needs? All it would need to do is: wget/FTP the MP3 files, convert them to Ogg, and FTP the files back to my hosting. Of course, none of this would be needed if there were such a thing as a compiled binary of SoX (or any MP3-to-Ogg converter) for CentOS which I could upload without needing root access, but I've given up asking for that -- though I'm always open to suggestions!
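
    For what it's worth, the conversion step itself is a one-liner per file; a minimal sketch of the weekly job (hypothetical directory names, and it assumes a SoX build compiled with MP3 support):

        for f in incoming/*.mp3; do
            sox "$f" "${f%.mp3}.ogg"
        done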

    Read the article

  • backing up ntfs disk using rsync on ubuntu

    - by user70366
    For a long time I was using Windows. I have a separate drive I use to keep copies of my media files, photos, etc., which I periodically back up to an external drive. In Windows I used SyncToy to do this. After my Windows stopped booting, I decided to switch to Linux (Ubuntu 10.10). That seems to be going fine, but now I want to back up my drive to the external drive like before. Mostly the two drives will already be the same, with maybe about 10 GB of extra files added. So I tried to use rsync to synchronise the two drives like this:

        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup /media/FREECOM\ HDD/Backup

    The problem is that the dry run indicates that every file on the drive will be copied, not just the files I have recently added. What is the correct command to sync two NTFS drives under Ubuntu so that files that already exist don't get copied again? Thanks.
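
    One thing worth checking first (standard rsync behaviour, not specific to NTFS): without a trailing slash on the source, rsync copies the Backup directory itself into the destination, creating Backup/Backup and recopying everything. A sketch with the slashes added, plus a size-only comparison as a fallback if NTFS timestamps still disagree:

        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup/ "/media/FREECOM HDD/Backup/"
        # fallback: ignore times entirely and compare by size alone
        rsync --dry-run -rvl --size-only /media/Antonio1TB/Backup/ "/media/FREECOM HDD/Backup/"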

    Read the article

  • How can I install VLC on RHEL 6.3?

    - by holddame
    I'm having a problem installing VLC on Red Hat 6.3. When I try yum install vlc, all goes well until it shows me this at the end:

        Error: Package: vlc-2.0.3-6.el6.x86_64 (linuxtech-release)
               Requires: libminizip.so.1()(64bit)
        Error: Package: liblrdf-0.5.0-2.el6.x86_64 (linuxtech-release)
               Requires: ladspa
        Error: Package: libffado-2.1.0-0.8.20120325.svn2088.el6.x86_64 (linuxtech-release)
               Requires: libconfig++.so.8()(64bit)

    Also, I can't use yum update. I'm running on a 32-bit processor and I don't know what's wrong.

    OK, I've installed live555 and tried again; nothing really changed. Here is my yum whatprovides *BasicUsageEnvironment output:

        live555-devel-0-0.34.2012.01.25.el6.x86_64 : Development files for live555.com streaming libraries
        Repo        : linuxtech-release
        Matched from:
        Filename    : /usr/include/BasicUsageEnvironment

        live555-devel-0-0.34.2012.01.25.el6.i686 : Development files for live555.com streaming libraries
        Repo        : linuxtech-release
        Matched from:
        Filename    : /usr/include/BasicUsageEnvironment

        live555-devel-0-0.27.2010.04.09.el6.rf.x86_64 : Development files for live555.com streaming libraries
        Repo        : rpmforge
        Matched from:
        Filename    : /usr/include/BasicUsageEnvironment

        live555-devel-0-0.27.2012.02.04.el6.rf.x86_64 : Development files for live555.com streaming libraries
        Repo        : rpmforge
        Matched from:
        Filename    : /usr/include/BasicUsageEnvironment
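
    A generic next step (standard yum usage, nothing specific to these repos): ask yum what provides each of the missing dependencies before retrying the install:

        yum whatprovides 'libminizip.so.1()(64bit)' ladspa 'libconfig++.so.8()(64bit)'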

    Read the article

  • Shortcut to "printer and faxes" on another computer

    - by Doltknuckle
    I have a print server running Windows Server 2008 that has about 50 printers on it. In Windows XP, I was able to connect to the server using the UNC name and make a shortcut to the "Printers and Faxes" folder. (For the record, I know that it isn't really a folder, but that's outside the scope of this question.) I have recently switched to Windows 7 and I find that the jump lists are really useful. One of the things I want to do is make it easy to connect to that server's "Printers and Faxes" folder. I would like to use something like a shortcut that I can open to go immediately to that location. The problem is that Windows 7 doesn't have a way to create a shortcut like you could in Windows XP. There is a button on the toolbar that says "View remote printers" which sends you to the correct folder, but I'd like to avoid having to type out the server name. I also can't use the "View network" link in Windows Explorer: our organization has over 6,000 machines, and viewing the network lists all of them. This is all about saving time by using the minimum number of mouse clicks and key presses in normal operation. Does anyone have any suggestions?

    Read the article

  • Filesystem to quickly get recent modifications

    - by liori
    Hello, I've got a relatively big filesystem (ext4) with lots of small files, and I'd like to back it up. Making frequent full backups is not feasible for me, so I want a way to make differential/incremental backups (differential preferred). But... this is a laptop, and scanning for changed files takes a lot of time. My questions: 1) Is it possible to get a list of files changed since some date from ext4's journal? I know it wasn't designed with this idea in mind, and it might be too small for bigger timespans, but maybe it is somehow possible? 2) Is it possible to monitor filesystem modifications and maintain a list of changed files reliably? I think I could use inotify, but this might be too slow to monitor the full filesystem and might be unreliable. (By reliable I mean: either I get all modifications since the last backup, with nothing missing from the list, or an error message.) The laptop runs Debian unstable.
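
    For comparison, the two usual baselines (generic commands, with /home standing in for the real paths): a timestamp scan with find, and a live watch with inotifywait from inotify-tools:

        # brute force: everything modified since the last backup date
        find /home -xdev -type f -newermt "2011-01-01" > changed-files.txt
        # live monitoring; note inotify cannot replay events missed while it wasn't running
        inotifywait -m -r -e modify,create,delete,move --format '%w%f' /home >> changes.log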

    Read the article

  • What is the best drive cleaner?

    - by allindal
    What is the best drive "cleaner" application, an application that deletes roaming, temp. and different useless caches. Something similar to CCleaner, but more powerful. I need it to delete more than the basic stuff. Like duplications of complex files or redundancies, (example... for every game there's the DirectX suite) without deleting program essential files, obviously. I know most of this has to do with my selection of these programs, but I haven't seen anything that lets me select types of files to delete, not just specific files.

    Read the article

  • Does moving a file outside NTFS lose data in alternate data streams?

    - by jay
    I have a lot of files on a machine running Windows Server 2008 which I want to move to a Fedora machine. How can I keep the attributes stored in, for example, media files (date taken, rating, length, etc.) while transferring them outside the realm of NTFS's alternate data streams? I'm aware that similar metadata exists in other file systems, but what happens when you move these files? And what's the best way to retain the metadata in other file systems?
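
    Before moving anything, it's easy to see which files actually carry alternate data streams (dir /r is built into cmd on Server 2008; the path is a placeholder):

        dir /r D:\media\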

    Read the article

  • Windows Server 2003 (as workstation) unable to write to Samba fileshares

    - by remyhorton
    The setup is a Samba fileserver under Linux, which I am trying to access from a Windows Server 2003 box that has been reconfigured as a workstation. I can log onto the fileshares and can copy/delete files, but opening a file and then writing to it fails. Renaming files also fails, with an error about requiring a filename. Dragging and dropping files onto XEmacs gives me a message about copying from the network zone, and once open the file is read-only. Any ideas what is wrong? I suspect it is a miscommunication of security details, as the folder security options are all unchecked (checking them has no effect). I know it is not a problem with Samba itself, as Windows 2000, Windows XP, and Nautilus (under Linux) can all access/edit fileshare files fine using the same userid/password. I am not using domain logins.
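
    Two quick server-side checks (standard Samba tooling; the share, user, and file names are placeholders): dump the effective share settings, then attempt a scripted write as the same user:

        testparm -s /etc/samba/smb.conf
        smbclient //server/share -U user -c 'put /tmp/testfile.txt testfile.txt'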

    Read the article

  • How can I change Nautilus's delete behavior?

    - by Alex
    I want to make it so that Nautilus requires me to press a key combination to delete files, so that I do not accidentally delete files on a network share with no confirmation again. Ideally I would make the behavior identical to OS X's Finder, so that I press Ctrl+Backspace to delete files.

    Read the article

  • Audio splitting and noise removal on Windows

    - by pts
    My mother has about 100 hours of audio in a mix of MP3 and WAV files, the digitized versions of her vinyl records. Each file contains about 5 songs with a few seconds of (noisy) pause between them. My mother needs software for Windows XP with which she can listen to the files, find the gaps manually, split the files at the gaps she finds, reduce the noise on each song, and export the songs to individual MP3 files. My mother has very limited computer skills and affinity, and she doesn't speak English. The simpler the software, the better for her, even if the noise reduction is worse than with a more sophisticated but more complicated program. I'd prefer free software, freeware, or shareware (which can do all of the above). Please recommend something much simpler than Audacity. The software should guide the user through the process, always showing the next few available steps, and be intuitive in the sense that there are only a few allowed actions and it's obvious what they are and how to activate them. Which software would you recommend?

    Read the article

  • Windows 7: L10N mechanics

    - by John Sonderson
    I have a localized version of Windows 7. I can't figure out where Windows gets the names for files and directories on the system. For instance, consider the following (default) files:

        > cd C:\Users\Public\Pictures\Sample Pictures
        > dir
        Chrysanthemum.jpg
        Desert.jpg
        ...

    When I view these files in the default file explorer, I see these names:

        Crisantemo.jpg
        Deserto.jpg
        ...

    This seems to imply that each file can somehow be assigned a localized name somewhere. However, I cannot figure out how. I would appreciate it if someone could shed some light on this issue. Thanks.

    UPDATE/EDIT: The desktop.ini file in the folder containing Chrysanthemum.jpg contains the entries below. The .dll files used to translate the various resources are unfortunately not human-readable, and I have no clue how they could be generated to translate other, user-created files, but they serve the purpose and solve the mystery which led to this post. Thanks.

        [LocalizedFileNames]
        Chrysanthemum.jpg=@%systemroot%\system32\SampleRes.dll,-101
        Desert.jpg=@%systemroot%\system32\SampleRes.dll,-102
        Hydrangeas.jpg=@%systemroot%\system32\SampleRes.dll,-103
        Jellyfish.jpg=@%systemroot%\system32\SampleRes.dll,-104
        Koala.jpg=@%systemroot%\system32\SampleRes.dll,-105
        Tulips.jpg=@%systemroot%\system32\SampleRes.dll,-106
        Lighthouse.jpg=@%systemroot%\system32\SampleRes.dll,-107
        Penguins.jpg=@%systemroot%\system32\SampleRes.dll,-108
        [.ShellClassInfo]
        LocalizedResourceName=@%SystemRoot%\system32\shell32.dll,-21805

    Read the article

  • rpmbuild gives seg fault

    - by Deepti Jain
    I am trying to build an RPM using the rpmbuild tool. I have source code whose build produces around 30 GB of binaries. The software I am packaging has dozens of executables. When I copy only the binaries of a single executable (e.g. init) into the package, my RPM builds successfully. But when I dump the entire build into the RPM, rpmbuild does everything but gives a seg fault at the end. Here is my spec file:

        # This is a sample spec file for wget
        %define _topdir /root/mywget
        %define name source
        %define release 1
        %define version 1.12
        %define _builddir /root/mywget/BUILD/glenlivet
        %define _buildrootdir /root/mywget/BUILDROOT
        %define _buildroot /root/mywget/BUILDROOT
        %define _sourcedir /root/mywget/SOURCES

        BuildRoot: %{_buildroot}
        Summary: GNU source
        License: GPL
        Name: %{name}
        Version: %{version}
        Release: %{release}
        Source: %{name}-%{version}.tar.gz
        Prefix: /usr
        Group: Development/Tools

        %description
        The GNU sample program downloads files from the Internet using the command-line.

        %prep
        %setup -q -n glenlivet

        %build
        cd %{_builddir}
        make all

        %install
        rm -rf %{_buildrootdir}
        mkdir -p %{_buildrootdir}/bin
        cp -p -r %{_builddir}/build/obj-x64/* %{_buildrootdir}/bin/

        %files
        %defattr(-,root,root)
        /bin/*

    If I only copy some of the binaries (say, one utility and its dependent binaries) it works fine. But when I try to copy the entire build, I get a seg fault. The seg fault comes after rpmbuild has executed the %prep, %build, and %install sections. rpmbuild also processes my source file:

        Processing files: source-1.12-1
        Finding Provides:
        Finding Requires:
        Finding Supplements:
        Provides:......
        Requires:......
        Checking for unpackaged file(s): /usr/lib/rpm/check-files /root/mywget/BUILDROOT
        Checking for unpackaged file(s): /usr/lib/rpm/check-files /root/mywget/BUILDROOT
        Segmentation fault

    Any clue what is going wrong, or where rpmbuild fails? Thanks in advance.
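
    One avenue worth checking (an assumption based on RPM's payload format limits, not confirmed for this rpm version): classic cpio-based RPM payloads cap out at a few GB, so a 30 GB payload may simply overflow them. Measuring the staged buildroot and rerunning with verbose output would help narrow down where it dies:

        du -sh /root/mywget/BUILDROOT
        rpmbuild -bb -vv source.spec 2>&1 | tail -n 50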

    Read the article

  • Run disk error check on NTFS file?

    - by paulius_l
    I have a feeling that my system hard drive is dying, and a benchmark seems to confirm it. Here is the benchmark of my system hard drive during low system activity (screenshot omitted), and here is the benchmark of the backup drive (screenshot omitted). Furthermore, there are some files I just can't touch, because I get CRC errors and the hard drive activity spikes to 100% with operating speeds of less than 1 MB/s while working with those files. I haven't yet tried swapping the SATA cable, as I have read that can cause such problems. Anyway, I would like to run some tests on the specific clusters where the files I am interested in are stored. I don't want to run the full chkdsk because it takes a very long time. I would like to find either a utility that runs the disk check directly on the clusters where a file belongs, or a pair of utilities where one tells me the cluster locations and the other can check just those locations. How do I check and possibly fix disk errors where the files I am interested in are stored? Edit: S.M.A.R.T. info (screenshot omitted).
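
    Two possibly relevant built-ins (standard Windows tools, though neither checks a single file in place; the cluster number 12345 is a placeholder): fsutil can map a cluster number back to the file occupying it, and chkdsk without /r skips the slow surface scan:

        fsutil volume querycluster C: 12345
        chkdsk C: /f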

    Read the article

  • App to convert from ANSI to UTF8 on windows [closed]

    - by antoniocs
    Possible Duplicate: Batch-convert files for encoding or line ending under Windows

    Hey! I have many files encoded in the ANSI (ISO-8859-1) format and I want to convert them to UTF-8. I am converting them one by one using Notepad++, but I was wondering if there is an application that will convert them all (I have many files) in a quick and easy way. Does anyone know of an app that will do this? (A free app would be great.) Thanks
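
    If a command-line route is acceptable, iconv can batch-convert (its availability on Windows via GnuWin32 or Cygwin is an assumption about your setup); a sketch for a .bat file, writing results into a separate folder so the originals survive:

        mkdir converted
        for %%f in (*.txt) do iconv -f ISO-8859-1 -t UTF-8 "%%f" > "converted\%%f"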

    Read the article

  • Discrepancy in file size on disk and ls output

    - by smokinguns
    I have a script that checks for gzipped files larger than 1 MB and outputs the files along with their sizes as a report. This is the code:

        myReport=`ls -ltrh "$somePath" | egrep '\.gz$' | awk '{print $9,"=>",$5}'`
        # Count files that exceed 1MB
        oversizeFiles=`find "$somePath" -maxdepth 1 -size +1M -iname "*.gz" -print0 | xargs -0 ls -lh | wc -l`
        if [ $oversizeFiles -eq 0 ]; then
            status="PASS"
        else
            status="CHECK FAILED. FOUND FILES GREATER THAN 1MB"
        fi
        echo -e $status"\n"$myReport

    The problem is that the ls command reports the file sizes as 1.0MB in the report, but the status is "FAIL" because the $oversizeFiles variable's value is 2. I checked the file sizes on disk and the 2 files are 1.1MB. Why this discrepancy? How should I modify the script so that it generates an accurate report? BTW, I'm on a Mac. Here is what the man page for find says on my Mac OS X:

        -size n[ckMGTP]
            True if the file's size, rounded up, in 512-byte blocks is n. If n is
            followed by a c, then the primary is true if the file's size is n bytes
            (characters). Similarly, if n is followed by a scale indicator, the
            file's size is compared to n scaled as:
                k    kilobytes (1024 bytes)
                M    megabytes (1024 kilobytes)
                G    gigabytes (1024 megabytes)
                T    terabytes (1024 gigabytes)
                P    petabytes (1024 terabytes)
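
    One possible adjustment (keeping the script's structure; whether it matches the intended report depends on where exactly you draw the 1 MB line): use the c suffix from that man page to compare in exact bytes rather than rounded 1M units, so the count and the displayed sizes use the same arithmetic:

        # count files strictly larger than 1 MiB (1048576 bytes)
        oversizeFiles=`find "$somePath" -maxdepth 1 -size +1048576c -iname "*.gz" | wc -l`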

    Read the article

  • Windows Web Server in DataCenter Authenticate with AD in Office

    - by Viper Venom
    Hi, we would like to put a file server in a datacenter to allow users to upload/download files when they are at home. We have hundreds of users and would like to let them authenticate against the existing AD in our office. Basically, I will set up the IIS server to let users list various directories on the file server based on their user group. For example, Group A will be able to list D:\Files\A and Group B will be able to list D:\Files\B, etc. After some initial study, I found that a PPTP-based site-to-site VPN might fit our need for the authentication part, but I still don't have any idea how to let the users upload files to the server. Are there any suggestions, such as a better option for doing this (either the authentication or the upload part), or any areas I need to be careful about? Thank you in advance.
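
    For the IIS side, a sketch of switching a site to Windows (AD) authentication with appcmd ("Default Web Site" is a placeholder for the real site name; run from an elevated prompt):

        %windir%\system32\inetsrv\appcmd set config "Default Web Site" -section:system.webServer/security/authentication/windowsAuthentication /enabled:true
        %windir%\system32\inetsrv\appcmd set config "Default Web Site" -section:system.webServer/security/authentication/anonymousAuthentication /enabled:false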

    Read the article
