Search Results

Search found 37650 results on 1506 pages for 'files'.


  • Break a hard link of a file in use

    - by Stebi
    I used hard links to merge duplicated files on my SSD (space is still precious) and now have a weird problem. Common files like msvcr110.dll got hard linked. Now I want to delete a program which has this file in its installation directory, but I can't, because this file (in another location) is used by a currently running application (I don't know which one) and Windows won't let me delete a file that's in use. I can rename the file, but it still points to the same data, so it's not possible to delete it. Is there any way to break the hard link of a file which is currently in use? For now I move those files to a trash folder so I can delete the directory structure of the program being removed, but I'd like to get rid of this leftover (although it doesn't take much space, as it's a hard link).
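    One possible approach, not from the original post: since the in-use link can still be renamed, it can be parked in the trash folder and scheduled for removal at the next reboot via the Win32 MoveFileEx call. A minimal Python sketch; the leftover path is hypothetical and administrator rights are assumed:

        # Sketch: schedule a parked, in-use hard link for deletion at the next reboot.
        # Windows-only; requires administrator rights. The path below is hypothetical.
        import ctypes

        MOVEFILE_DELAY_UNTIL_REBOOT = 0x4  # from winbase.h

        def delete_on_reboot(path):
            # A null target plus MOVEFILE_DELAY_UNTIL_REBOOT tells Windows to remove
            # the source path during the next boot, before anything can open it.
            if not ctypes.windll.kernel32.MoveFileExW(path, None,
                                                      MOVEFILE_DELAY_UNTIL_REBOOT):
                raise ctypes.WinError()

        delete_on_reboot(r"C:\Trash\msvcr110.dll")  # hypothetical parked leftover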

    Read the article

  • Directory comparison in Meld but ignoring changes that only involve file timestamp?

    - by creamcheese
    I'm using Meld to compare two directories of source code on Ubuntu. However, because all of the files in one of the directories have been 'touched' so that all of their timestamps were updated, Meld is showing them as different, even though the contents of the files have not changed. But I'm only trying to find files that have different content. I don't see an option to get Meld just to look at changed contents. Any ideas for how to do this in Meld or is there a better GUI directory comparison tool for Ubuntu?
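    As a stopgap while looking for the right GUI option, a minimal standard-library sketch (not part of the original question) that reports only files whose contents differ, ignoring timestamps; the directory names are placeholders:

        # Sketch: report files whose contents differ between two trees, byte for byte,
        # so timestamp-only changes are ignored. Directory names are placeholders.
        import filecmp
        import os

        def content_diffs(dir_a, dir_b, rel=""):
            diffs = []
            dc = filecmp.dircmp(dir_a, dir_b)
            for name in dc.common_files:
                a, b = os.path.join(dir_a, name), os.path.join(dir_b, name)
                if not filecmp.cmp(a, b, shallow=False):  # full content comparison
                    diffs.append(os.path.join(rel, name))
            for sub in dc.common_dirs:
                diffs += content_diffs(os.path.join(dir_a, sub),
                                       os.path.join(dir_b, sub),
                                       os.path.join(rel, sub))
            return diffs

        print("\n".join(content_diffs("project_old", "project_new")))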

    Read the article

  • How to download a url as a file?

    - by Michelle
    A website url has "hidden" some mp3 files by embedding them as shockwave files, as follows:

        <span class="caption"><!-- Odeo player --><embed src="http://odeo.com/flash/audio_player_tiny_gray.swf" quality="high" name="audio_player_tiny_gray" align="middle" allowScriptAccess="always" wmode="transparent" type="application/x-shockwave-flash" flashvars="valid_sample_rate=true external_url=http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3" pluginspage="http://www.macromedia.com/go/getflashplayer"></embed></span>

    How can I download the files for off-line listening? I've found two methods:
    1. The StackOverflow method: create a new local HTML file with just the links, e.g. <a href="http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3">Sunday Edition 25Nov2008</a>, open the file in the browser, then right-click the link and Save Link As.
    2. The SuperUser method: install the Firefox add-on Iget (be sure to use the right version for your Firefox version), then Tools > Downloads and enter the url in the field.
    Are there any other ways?
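    A third route, sketched with Python's standard library rather than the browser; the URL is the one quoted in the flashvars above:

        # Sketch: download the mp3 referenced in the embed's flashvars directly.
        import urllib.request

        urls = [
            "http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3",
        ]

        for url in urls:
            filename = url.rsplit("/", 1)[-1]          # e.g. sundayeditionstream_20081125_9524.mp3
            print("Fetching", url, "->", filename)
            urllib.request.urlretrieve(url, filename)  # saves next to the script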

    Read the article

  • Sharepoint 2007 - Transaction log full

    - by Kenny Bones
    So I have this SharePoint 2007 site that is basically trash. I'm supposed to just toss it, but I'm in need of copying all of the data in form of traditional files and folders from certain projects. And since the transaction log is full, it's so damn slow. Even opening SharePoint takes up to 15 minutes, or it won't open at all. Copying of files is extremely slow. So I'm in need of a quick fix here. Just to be able to copy out some files and folders. I don't need to fix the problem per se. What can I do to fix it temporarily to be able to copy out the data?

    Read the article

  • How to INF mod: Replacing 32bit dlls with 64bits

    - by Nime Cloud
    I've got a driver setup for 32-bit: an INF file and an x86 folder with two 32-bit dlls. I need to replace these 32-bit dll files with 64-bit ones. I simply overwrote the 32-bit files, but no luck. How can I make a 64-bit version of the driver? Update: I tried the original setup files on 32-bit Windows XP; setup asks for WdfCoinstaller01009.dll, and I simply browse & point to the file from somewhere on XP.

        ;-------------- WDF Coinstaller installation
        [DestinationDirs]
        CoInstaller_CopyFiles = 11

        [silabser.Dev.NT.CoInstallers]
        AddReg=CoInstaller_AddReg
        CopyFiles=CoInstaller_CopyFiles

        [CoInstaller_CopyFiles]
        WdfCoinstaller01009.dll

        [SourceDisksFiles]
        WdfCoinstaller01009.dll=1

        [CoInstaller_AddReg]
        HKR,,CoInstallers32,0x00010000, "WdfCoinstaller01009.dll,WdfCoInstaller"

        [silabser.Dev.NT.Wdf]
        KmdfService = silabser, silabser_wdfsect

        [silabser_wdfsect]
        KmdfLibraryVersion = 1.9

    Read the article

  • Installing software at large organisation

    - by CJ7
    I have to give an MSI to a large organisation for it to install (presumably via GPO) on to some of their workstations that are running XP. I already know that my application is not allowed to write to the application folder. I realise it should be writing to the AppData folder. The organisation has allocated a folder on a server for the database files and other configuration files. This folder is referenced by UNC naming and not by mapped drives. My question is: based on normal practices, is my app likely to have the rights to create a sub-folder on this server? Is my app likely to have the rights to create files on this server?
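    Rights on such shares vary with site policy, so rather than assuming, the application can probe at startup; a minimal sketch with a hypothetical UNC path:

        # Sketch: check at startup whether the current account may create a sub-folder
        # and a file under the allocated UNC share. The share path is hypothetical.
        import os
        import uuid

        SHARE = r"\\fileserver\appdata\myapp"

        def can_write(base):
            probe = os.path.join(base, "probe_" + uuid.uuid4().hex)
            try:
                os.mkdir(probe)                              # sub-folder creation right?
                with open(os.path.join(probe, "t.tmp"), "w") as fh:
                    fh.write("ok")                           # file creation right?
                os.remove(os.path.join(probe, "t.tmp"))
                os.rmdir(probe)
                return True
            except OSError:
                return False

        print("Write access to", SHARE, ":", can_write(SHARE))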

    Read the article

  • How do I reset the $PATH variable on Mac OS X?

    - by Neil
    I've messed up my path variable, and now some apps that I run raise errors saying Command Not Found (error 127) for commands like 'date' and 'sleep'. These commands work fine when executed directly in the shell. I'm guessing this has something to do with a malformed $PATH variable, and I need to know how to reset it. I've deleted ~/.bashrc, ~/.bash_profile, /etc/bash.bashrc, and ~/.profile. What other files could hold my $PATH? Is there some simpler way to reset the $PATH than digging into the myriad files that could hold it? Note: this path problem only affects my user. I made a test user on my system, and the path was fine, back to normal.

    Read the article

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived and there is some form of corruption, it will only hurt a smaller number of files, whereas if you archive, let's say, all of your documents, even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system but not be subject to data corruption?
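    Either way, per-file checksums stored next to the data let corruption be detected and localized; a minimal standard-library sketch, with the root folder as a placeholder:

        # Sketch: write a SHA-256 manifest for a tree so later corruption can be
        # detected file by file, archived or not. The root folder is a placeholder.
        import hashlib
        import os

        def write_manifest(root, manifest="checksums.sha256"):
            with open(manifest, "w") as out:
                for dirpath, _dirs, files in os.walk(root):
                    for name in sorted(files):
                        path = os.path.join(dirpath, name)
                        digest = hashlib.sha256()
                        with open(path, "rb") as fh:
                            for chunk in iter(lambda: fh.read(1 << 20), b""):
                                digest.update(chunk)
                        out.write(digest.hexdigest() + "  " + path + "\n")

        write_manifest("documents")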

    Read the article

  • Upload a directory recursively to an FTP server

    - by Nicolas Raoul
    I am writing a Linux shell script to copy a local directory to a remote server (removing any existing files). Local server: ftp and lftp commands are available, no ncftp or any graphical tools. Remote server: only accessible via FTP. No rsync, SSH, or FXP. I am thinking about listing local and remote files to generate an lftp script and then running it. Is there a better way? Note: uploading only modified files would be a plus, but is not required.
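    Since lftp is available, its reverse mirror mode (mirror -R, with options such as --delete and --only-newer) should cover this in one command. Failing that, a minimal ftplib sketch; host, credentials and paths are placeholders, and it does not delete remote files that were removed locally:

        # Sketch: upload a local tree to an FTP server with the standard library.
        # Host, credentials and paths are placeholders; existing remote files are
        # overwritten but obsolete ones are not deleted.
        import ftplib
        import os

        def upload_tree(ftp, local_dir, remote_dir):
            try:
                ftp.mkd(remote_dir)
            except ftplib.error_perm:
                pass  # remote directory probably exists already
            for name in os.listdir(local_dir):
                local_path = os.path.join(local_dir, name)
                remote_path = remote_dir + "/" + name
                if os.path.isdir(local_path):
                    upload_tree(ftp, local_path, remote_path)
                else:
                    with open(local_path, "rb") as fh:
                        ftp.storbinary("STOR " + remote_path, fh)

        with ftplib.FTP("ftp.example.com", "user", "password") as ftp:
            upload_tree(ftp, "local_site", "/public_html")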

    Read the article

  • Windows 7 Slow Searching

    - by Guy Thomas
    I have a new Windows 7 machine with twice as much RAM and a faster processor than my old Windows Server 2008 R2 Machine. I am disappointed that searching amongst my 10,000 image files takes twice as long on my new Windows 7 machine. Both machines have their own copy of these same files. In other respects e.g. opening my huge Outlook files, the new machine is faster. The Windows Search Service has started. And I set indexing on the image folder about 3 days ago. Any ideas why I suffer from this poor index / search experience? Other than adding / removing folders, is there anything I can do to tweak indexing?

    Read the article

  • FTP script download from linux to windows

    - by user53864
    I'm using the following FTP script on Windows XP to download zip files from Ubuntu cloud servers. A zip file is created every day on the Ubuntu servers and I download it to Windows via this FTP script. I run the script manually every day, as I have to edit the last line (mget /usr/backup_02-11-2010.zip) to match today's date. I want to edit this script so that, when scheduled, it downloads only today's zip file at the scheduled time without needing to be edited every day. The date appended to the zip files is in the format dd-mm-yyyy. Need help...

        open server-ip-here
        username-here
        user-password-here
        lcd C:\Backup\files
        bin
        hash
        prompt
        mget /usr/backup_02-11-2010.zip
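    One way to avoid the daily edit, sketched in Python rather than a Windows ftp script: build today's filename from the date at run time. Host, credentials and paths are the placeholders used in the script above; scheduled via Task Scheduler, this removes the manual step entirely.

        # Sketch: download today's backup_dd-mm-yyyy.zip without editing anything.
        # Host, credentials and paths are the placeholders from the ftp script.
        import datetime
        import ftplib

        today = datetime.date.today().strftime("%d-%m-%Y")
        remote_path = "/usr/backup_%s.zip" % today
        local_path = r"C:\Backup\files\backup_%s.zip" % today

        with ftplib.FTP("server-ip-here", "username-here", "user-password-here") as ftp:
            with open(local_path, "wb") as out:
                ftp.retrbinary("RETR " + remote_path, out.write)
        print("Saved", local_path)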

    Read the article

  • Do large folder sizes slow down IO performance?

    - by Aaron
    We have a Linux server process that writes a few thousand files to a directory, deletes the files, and then writes a few thousand more files to the same directory without deleting the directory. What I'm starting to see is that the process doing the writing is getting slower and slower. My question is this: the directory size of the folder has grown from 4096 to over 200000, as seen in this output of ls -l:

        root@ad57rs0b# ls -l 15000PN5AIA3I6_B
        total 232
        drwxr-xr-x 2 chef chef 233472 May 30 21:35 barcodes

    On ext3, can these large directory sizes slow down performance? Thanks. Aaron

    Read the article

  • Is Windows Media Player able to play DTS audio?

    - by rolgae
    I'm trying to play DTS audio with Windows Media Player 12 on Windows 7. For an MPEG-TS file with video and DTS audio, only the video is played. A file containing only a DTS audio stream is rejected. But: WMP is able to play the DTS audio stream of a DVD. So, is Windows Media Player able to play DTS audio or not? And if so: how do I make it play my DTS files? I did not find any good resources on the supported codecs, just things like "WMP can play .mpg files, ...". VLC is able to play all of the above files. I do not want to install third-party codec packs, that's not the question!

    Read the article

  • RewriteRule applying pattern even though 1 of the RewriteCond's failed

    - by BHare
    #www. domain . tld
    RewriteCond %{HTTP_HOST} (?:.*\.)?([^.]+)\.(?:[^.]+)$
    RewriteCond /home/%1/ -d
    RewriteRule ^(.+) %{HTTP_HOST}$1
    RewriteRule (?:.*\.)?([^.]+)\.(?:[^.]+)/media/(.*)$ /home/$1/client/media/$2 [L]
    RewriteRule (?:.*\.)?([^.]+)\.(?:[^.]+)/(.*)$ /home/$1/www/$2 [L]

    Here is the rewritelog output:

    #(4) RewriteCond: input='tfnoo.mydomain.org' pattern='(?:.*\.)?([^.]+)\.(?:[^.]+)$' [NC] => matched
    #(4) RewriteCond: input='/home/mydomain/' pattern='-d' => not-matched
    #(3) applying pattern '(?:.*\.)?([^.]+)\.(?:[^.]+)/media/(.*)$' to uri 'http://www.mydomain.org/files/images/logo.png'
    #(3) applying pattern '(?:.*\.)?([^.]+)\.(?:[^.]+)/(.*)$' to uri 'http://www.mydomain.org/files/images/logo.png'
    #(2) rewrite 'http://www.mydomain.org/files/images/logo.png' -> '/home/mydomain/www/logo.png'

    If you look at the second (4) line, the request failed the -d (directory exists) condition, which is correct: mydomain has no directory under /home/. Therefore it should never be rewritten, at least according to my understanding that all RewriteRules are subject to the preceding RewriteConds as logical ANDs.

    Read the article

  • Apache htaccess with mod_expires Not Working for certain directories

    - by keyboarddrummer
    I have a Joomla site on which I am trying to enable caching using mod_expires. I have the .htaccess in the root of the site and have added the options found on the page http://www.pactsoftware.nl/tools/joomla-optimization.html. Using the PageSpeed extension in Chrome, prior to adding this to my .htaccess, my site scored a 55 (caching was at the top, listing a lot of images, CSS, and JS files). After these directives, it scores 70, with caching in the yellow, but it still lists some image files (some are two directories deep and the rest are four). I checked for any other .htaccess files in the Joomla root, but none are between those folders and the root. It is almost as if the .htaccess only works in that one directory, not the subfolders. I have tried putting a .htaccess in each affected subdirectory, but it does not work. Does anyone have any ideas?

    Read the article

  • what does it mean for MalwareBytes to find malicious registry keys but nothing else?

    - by EndangeringSpecies
    I have a machine that is obviously infected, and when I ran MalwareBytes it told me that it found some "malicious" registry keys (surprisingly enough these contained file paths to currently non-existent javascript files). But that's it: a full scan did not uncover any malicious files or malicious hidden processes in memory. Like, maybe, the (hidden?) process that for whatever reason periodically injects keystrokes (hotkeys?) into whatever window is currently open. Then on another, not obviously infected, machine it found a "malware.trace" registry key, but again no files or processes etc. How does this jibe with people's experience with MalwareBytes? Does it usually find registry-key symptoms of an infection but nothing else? Or is it common to have no infection but some malicious registry keys in place anyway?

    Read the article

  • Panic Transmit file upload

    - by 1ndivisible
    I've ditched Coda and bought Transmit. I'm a little confused by the file uploading. I have exactly the same folder structure remotely and locally, but if I right-click a file and choose Upload "SomeFileName.html", the file is always uploaded into the root of the remote site, even if the file is in a folder. If I choose to upload a file at assets/images/some_image.png I would expect it to be uploaded to the same folder on the remote server, not the root. Coda dealt with this perfectly and also told me which files had been modified and needed uploading. Transmit doesn't seem to do either of these things. So my questions are: How can I upload a file to the same path on the remote server without having to drag and drop? Is there any way to have Transmit mark edited files or upload only edited files? [There is no tag for Transmit, so if someone with more rep could make and add one, that would be grand]

    Read the article

  • ffmpeg volume parameter format

    - by tanon
    ffmpeg's -vol parameter is confusing me.

        256 => normal (I guess meaning the same as the input volume, no change)
        512 => double the volume (read this somewhere)

    So what do I use for 3 times the volume? 1.5 times the volume? Basically, let's say I have the max sound amplitudes (Audacity levels) in 3 files as: 0.8, 0.6, 0.9. I want to amplify the first two files so that max = 0.9 in all files. What -vol parameters would I use?
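    If -vol is indeed linear with 256 as unity gain (so 512 = 2x, 384 = 1.5x, 768 = 3x), the values for the three files follow directly; a small sketch with hypothetical file names:

        # Sketch: assuming -vol is a linear gain with 256 = unchanged volume,
        # compute the value that raises each file's peak to 0.9. File names are
        # hypothetical; peaks are the Audacity levels from the question.
        peaks = {"a.mp3": 0.8, "b.mp3": 0.6, "c.mp3": 0.9}
        TARGET = 0.9

        for name, peak in peaks.items():
            vol = round(256 * TARGET / peak)   # 288, 384 and 256 respectively
            print("ffmpeg -i %s -vol %d out_%s" % (name, vol, name))

    On newer ffmpeg builds the volume audio filter (e.g. -af volume=1.5) replaces the deprecated -vol option.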

    Read the article

  • Is Software Raid1 Using mdadm with a Local Hard Disk and GNDB Possible?

    - by Travis
    I have multiple webservers which use many small files to create dynamic web pages. Caching the web pages isn't an option. The webserver also performs writes, so I need a synchronous filesystem. I'm looking to maximise performance, as it's my understanding that small files are the weakness (to varying degrees) of a cluster filesystem over ethernet. Currently I'm using CentOS 5.5, 64-bit. Since it's only about 300MB of data, I'm looking at mdadm RAID-1 with the GNBD device and a local hard disk, using the "--write-mostly" option so the reads are done from the local hard disk. Is this possible? If so, is there any advantage to making it a tmpfs disk instead of a local hard disk? Or will the files on the local hard disk just get cached in RAM anyway, so I won't see a performance gain by using tmpfs, assuming there's enough RAM available?

    Read the article

  • Puzzled about PHP file permission and shared webhosting - what are some explanations?

    - by extrakun
    I have this issue with different web hosting, in particular upload scripts which can only upload to a folder if it has 777 permissions (which is risky). On the test server (on a different webhost), 755 works well. On another web host, log files generated by PHP file functions sometimes cannot be written to, but other files are mysteriously unaffected (for instance, the log files for the entire week are 655 and they work well, but just today's log file doesn't work unless it is set to 777). I am more of an application developer than a server backend expert, so these behaviours puzzle me to no end. Why are they happening? What can be done?

    Read the article

  • How to set default permissions for automounted FAT drives in Ubuntu

    - by piman
    I've got many FAT32 drives that I'd like to mount in Ubuntu such that they have permission mode 700 for directories and 600 for all other files. By default, they get 755 for all files, which is not particularly useful since almost no non-directories should be executable, and it screws up version control repos hosted on the drives. "Back in the day" I would have had the drives listed in /etc/fstab with the umask/dmask I want, and there was no such thing as a default. These days, drives automount under their volume names. Which is great, except now I have no idea how to set the default. I have tried changing the /system/storage/default_options/vfat/mount_options gconf key with no apparent effect. It was 077 initially, but the mounted drive reflected a default of 022; changing it and re-inserting the drives resulted in the files still having permission bits of 755.

    Read the article

  • How to determine main movie DVD track before ripping via mencoder

    - by Ampp3
    Maybe there's a simple answer for this, but when looking at the files on a DVD (IFOs, VOBs,etc), is there a way to easily determine the longest/main track? I'm trying to automate the process of finding the main movie track on a DVD and am running into issues. I thought this could be done by finding the BIGGEST track (look through VTS_XX_N.VOB files, where XX is the track number, and find the track with the largest filesize (sum sizes of VOB files for that track)), but apparently that isn't correct. One DVD had track 7 as the largest track (by my method), but mencoder didn't produce the correct output with this track, but worked with track 9 instead. Am I missing something? EDIT: I've heard of the utility 'lsdvd' for getting track information, but I was hoping to avoid compiling this, and use a basic method instead (ie: what I tried above). Does anyone have any idea WHY my idea didn't work?
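    The size heuristic itself is easy to script; a sketch of the approach described above (the VIDEO_TS path is a placeholder, and discs with large extras can still fool it):

        # Sketch: sum VOB sizes per title set (VTS_XX_*.VOB) and report the largest,
        # i.e. the heuristic described in the question. The path is a placeholder.
        import glob
        import os
        import re
        from collections import defaultdict

        def title_set_sizes(video_ts):
            sizes = defaultdict(int)
            for vob in glob.glob(os.path.join(video_ts, "VTS_*_*.VOB")):
                m = re.match(r"VTS_(\d{2})_\d+\.VOB$", os.path.basename(vob), re.I)
                if m:
                    sizes[m.group(1)] += os.path.getsize(vob)
            return dict(sizes)

        sizes = title_set_sizes("/media/dvd/VIDEO_TS")
        for title, size in sorted(sizes.items(), key=lambda kv: -kv[1]):
            print("VTS %s: %.1f MB" % (title, size / 1e6))

    Title duration (which lsdvd reports) tends to identify the main feature more reliably than total size alone, which may be why the size-only guess picked the wrong track here.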

    Read the article

  • MacOS X 10.6 Portable Home Directory sync fails due to FileSync agent crashing

    - by tegbains
    On one of our cleanly installed Mac Pro machines running Mac OS X 10.6.6, connected to our Mac OS X 10.6.6 Server, syncing data using Portable Home Directories fails. It seems to be due to the FileSync agent crashing during the home sync. We get -41 and -8062 errors, which we suspect indicate that there is too much data or that the FileSync agent can't read the files. The user is the owner of the files and can read/write all of the files.

        < Logout 0:: [11/02/04 13:10:42.751] Error -41 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2. (source = NO)
        < Logout 0:: [11/02/04 13:10:42.758] Error -8062 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2/[email protected]. (source = NO)
        < Logout 1:: [11/02/04 13:10:42.758] -[DeepCopyContext deepCopyError:sourceError:sourceRef:]: error = -8062, wasSource = NO: return shouldContinue = NO

    Read the article

  • problem with MySQL installation : template configuration file cannot be found

    - by user35389
    Trying to install MySQL onto a Windows XP machine. While going through the installation steps (in the "MySQL Server Instance Config. Wizard"), I get to a point where the window reads:

        MySQL Server Instance Configuration (bold header)
        Choose the configuration for the server instance.
        Ready to execute...
          o Prepare configuration
          o Write configuration file
          o Start service
          o Apply security settings (this line is greyed out)
        Please press [Execute] to start the configuration.
        [ Back ] [ Execute ] [ Cancel ]

    So I press Execute, and then a red X appears at the second step, "Write configuration file", and at the bottom, where it originally said "Please press [Execute] to start the configuration.", it now says:

        The template configuration file cannot be found at C:\Program Files\MySQL\MySQL Server 5.0\bin\my-template.cnf

    I'm unsure what this means, but I cancelled the config wizard and looked in the directory that had been created (C:\Program Files\MySQL\MySQL Server 5.0). There are some configuration settings files, and there are 4 folders: bin, data, Docs, share.

    Read the article

  • Renaming debian package

    - by Tabiko
    I'm trying to build a customized version of the nginx package for Debian/Ubuntu which has a different set of modules from the default version. What would be the fastest way to modify the debian/ structure, and which files, if I wanted to rename the package from 'nginx' to 'my-nginx', for example? I've got the source deb package unpacked; which files would I need to modify in the nginx-1.4.5/debian/ directory (holding the control, rules, etc. files) to have buildpackage generate a my-nginx-1.4.5.deb package instead of an nginx-1.4.6.deb package? I appreciate your help!

    Read the article
