Search Results

Search found 37883 results on 1516 pages for 'sparse files'.

Page 352/1516 | < Previous Page | 348 349 350 351 352 353 354 355 356 357 358 359  | Next Page >

  • Can I disable this Windows (XP) Security Warning?

    - by FumbleFingers
    I recently reformatted my hard drive and reinstalled Windows XP (I know I'll have to take the plunge and commit to Win8 "real soon now", but I'm just not quite ready for the upheaval yet! :) I used to use WinRar (and later, when I got fed up with the "nag" messages, 7-Zip), but I haven't installed either of them in my new configuration, so I must be using the built-in XP facility when I open *.zip files. For years, I've been opening downloaded *.zip archives and using "drag & drop" to copy the contents to a File Explorer window open on the folder where I want the files to end up (usually My Documents\Downloads). But now I find that when I "drop" the file(s), I get a pop-up Windows Security Warning saying "Are you sure you want to copy or move files to this folder? You should only move or copy files from locations that you trust." Can anyone explain why I'm getting this message, and is there any (reasonably easy, please! :) way to suppress it? Since I've already put the *.zip file on my computer, it seems a bit late to ask if I trust it. (Thus far, the files in question have always been plain text, so it's not a matter of dodgy programs, etc.) Apologies for the low quality image - I don't have the appropriate tools or knowledge to do any better, and it doesn't help that my "PrtScr" screen capture has included what would have been on my second monitor (TV) if it had been turned on. If you can't read it, trust me - I have copied the text verbatim.

    Read the article

  • Active Directory Support Folder Redirection AND Portable Home Directories?

    - by Robert F
    Does anyone here know if Active Directory will support the use of both Windows Folder Redirection and Mac OS X's Portable Home Directories for synchronizing a user's files to a remote share? I want to synchronize my user's files with a remote share as a way of backing up their data. This is fairly straightforward if a user has only a Windows computer or only a Mac computer. However, will Active Directory support a situation in which a user has both types of computers or they have a Mac on which they're running Windows within Parallels? If I configure a remote share via Group Policies for their Windows files and then configure a different share for their Mac files via ADUC, when they change a file on either computer, will AD know which computer the file was changed on and synchronize that file with the appropriate remote folder? Thanks!

    Read the article

  • Free converter for JPEG to PDF

    - by Codeslayer
    Hello all, Is there any free converter available which will convert multiple JPEG files and dump them into a single PDF file? The reason for this is that I have multiple scanned JPEG files which I want to dump as a single PDF. I am not scanning the images as TIFF files because they take a huge amount of space. Thanx.
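
    If a command-line route is acceptable, a minimal sketch using either img2pdf or ImageMagick (both free; the file names are placeholders):

    ```bash
    # img2pdf embeds the JPEGs losslessly, without re-encoding them
    img2pdf scan01.jpg scan02.jpg scan03.jpg -o scans.pdf

    # ImageMagick also works, but re-encodes the images
    convert scan01.jpg scan02.jpg scan03.jpg scans.pdf
    ```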

    Read the article

  • Using symbolic links with git

    - by Alfredo Palhares
    I used to have my system configuration files all in one directory for better management, but now I need to use some version control on it. The problem is that git doesn't understand symbolic links that point outside of the repository, and I can't invert the roles (having the real files in the repository and the symbolic links on their proper paths) since some files are read before the kernel loads. I think that I could use unison to sync the files between the repo and their real paths, but it's just not practical. And hard links will probably be broken. Any ideas?
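
    One workaround often used for this (a sketch only; the paths are hypothetical) is to drop the symlinks entirely and track the real files in place, using a bare repository whose work tree points at the directory that holds the configuration:

    ```bash
    # Keep the repository itself out of the way
    git init --bare /root/sysconfig.git

    # Convenience alias: a git that treats /etc as its work tree
    alias cfg='git --git-dir=/root/sysconfig.git --work-tree=/etc'

    # Don't list the thousands of untracked files under /etc
    cfg config status.showUntrackedFiles no

    # Track only the files you care about, in their real locations
    cfg add /etc/fstab /etc/default/grub
    cfg commit -m "Initial import of tracked configuration files"
    ```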

    Read the article

  • What virus renames all images to EXE?

    - by user29373
    I have a virus that renames all jpg file extensions to EXE and hides the original files in the same folder! I can see the hidden files with FarManager, but I cannot see them in Windows Explorer (even with the "show hidden files" option enabled). How can I restore them to their original file extension? Is there any tool to scan the converted files and restore them to their original extensions? What is the virus called, and how can I remove it manually?

    Read the article

  • Slow file operations, possible I/O error but chkdsk says OK

    - by mikolajek
    I've recently faced a strange problem. I use the Directory Opus file manager, which suddenly started to report an I/O error when trying to copy files onto one of my disks. Windows Explorer did copy those files, but it was extremely slow. I ran chkdsk on this drive (ca. 300 GB) and it took over two days to complete! However, it reported no errors. I ran Hard Disk Sentinel, which says the drive is OK. But the files still read and write extremely slowly! Can anyone advise me what to do? I have spare space and am copying my files there for backup now, but should I get rid of this disk? Or can I try to "heal" it somehow? Many thanks for your ideas!

    Read the article

  • DFS-R (2008 and R2) 2 node server cluster, all file writes end in conflictAndDeleted

    - by Andrew Gauger
    Both servers in a 2-server cluster are reporting event 4412 20,000 times per day. If I watch the ConflictAndDeleted folder I can observe files appearing and disappearing. Users report that files saved by peers at the same location are overwriting each other. The configuration began with a single server; then DFS-R was set up using the 2008 R2 wizard, which created the share on the second server. DFS-N was set up independently. Windows users have drives mapped using a domain-based namespace (\\domain.com\share). Mac users are pointed directly to the new server share created by DFS-R. It is PC users reporting most of the lost files, but there have also been two reports from Mac users about files reverting.

    Read the article

  • Wget - if / else download condition?

    - by Kai
    I want wget to prefer a certain file type over another if the files have the same basename. For example: if foo.ogg is available, don't download foo.mp3. The way I use wget so far to crawl/automatically download (if anyone is interested): wget -Dfoo.com -I /folder/ -r -l 1 -nc -A.ogg,.mp3 -i http://www.foo.com/folder/ but this, of course, gets me .mp3 AND .ogg files. It often also gets me image files like .png, which I didn't want in the first place and which are discarded afterwards. Any ideas? (Syntax explanation: -D: download only from this domain; -I: download only from this subfolder of the domain; -r: recursive (follow links and directory structure); -l 1: follow only 1 link deep; -nc: no clobber = download only if the file doesn't exist; -A: accept/download only *.ogg and *.mp3 (the html files fetched for crawling are discarded); -i: download URL/starting point)
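
    wget itself has no if/else between file types; one workaround (a rough sketch) is to let the crawl fetch both extensions and then delete every .mp3 that has an .ogg twin with the same basename:

    ```bash
    # Crawl as before, accepting both extensions
    wget -D foo.com -I /folder/ -r -l 1 -nc -A .ogg,.mp3 http://www.foo.com/folder/

    # Afterwards, prefer .ogg: drop any .mp3 whose .ogg counterpart was downloaded
    find . -type f -name '*.ogg' | while read -r f; do
        rm -f -- "${f%.ogg}.mp3"
    done
    ```

    This still spends bandwidth on the unwanted .mp3 files, but it keeps the wget invocation simple.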

    Read the article

  • File upload permission problem IIS 7

    - by krish
    I am unable to upload files to a website hosted under IIS 7. I have already given write permissions to "IUSR_websitename" and set the property in web.config as well. I am able to upload files without logging in to the application, at the time of user registration. But once logged in to the application, if I upload files, I get an "Access denied" error. Please help me.

    Read the article

  • allowing index access only with .htaccess

    - by YsoL8
    Hello I have this in my .htaccess file, in the site root:
        Options -Indexes
        <directory ../.*>
        Deny from all
        </directory>
        <Files .htaccess>
        order allow,deny
        deny from all
        </Files>
        <Files index.php>
        Order allow,deny
        allow from all
        </Files>
    What I'm trying to achieve is to block folder and file access to anything that isn't called index.php, regardless of which directory is accessed. I have the folder part working perfectly and the deny from all rule is working as well - but my attempt to allow access to index.php is failing. Basically could someone tell me how to get it working?
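
    For what it's worth, <Directory> sections are not allowed inside .htaccess files at all (they only work in the main server configuration), so that part of the snippet is suspect. A sketch of one Apache 2.2-style way to express the intent (deny by default, re-allow only index.php):

    ```apache
    Options -Indexes

    # Deny everything in this directory and below by default
    Order Deny,Allow
    Deny from all

    # Re-allow only index.php
    <Files "index.php">
        Allow from all
    </Files>
    ```

    Note that this also blocks stylesheets, scripts and images, which would need their own <Files> or <FilesMatch> exceptions.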

    Read the article

  • How can I diff two Redhat Linux servers?

    - by Stuart Woodward
    I have two servers that should have the same setup except for known differences. By running: find / \( -path /proc -o -path /sys -o -path /dev \) -prune -o -print | sort > allfiles.txt I can get a list of all the files on one server and compare it against the list of files on the other server. This will show me the differences in the names of the files that reside on the servers. What I really want to do is run a checksum on all the files on both servers and compare them, to also find where the contents differ. e.g. find / \( -path /proc -o -path /sys -o -path /dev \) -prune -o -print | xargs /usr/bin/sha1sum Is this a sensible way to do this? I was thinking that rsync already has most of this functionality, but can it be used to provide the list of differences?
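
    A sketch of both routes (host names are placeholders): build a checksum list on each server and diff the two lists, or let an rsync dry run with checksum comparison itemize the paths that differ:

    ```bash
    # On each server: hash every regular file, sorted by path
    find / \( -path /proc -o -path /sys -o -path /dev \) -prune -o -type f -print0 \
        | xargs -0 sha1sum | sort -k 2 > /tmp/sums-$(hostname).txt

    # Copy both lists onto one machine, then:
    diff /tmp/sums-server1.txt /tmp/sums-server2.txt

    # Alternative: an rsync dry run that compares by checksum and lists differing paths
    rsync -anic --exclude=/proc --exclude=/sys --exclude=/dev / root@server2:/
    ```

    Either way, checksumming every file is I/O heavy, so expect it to take a while on a full system.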

    Read the article

  • Current alternative to the old CHECKSUM program

    - by faulty
    I'm looking for an application that does md5/sha hash checks on specific files/folders periodically and stores an index file per folder for future verification. I remember such applications existed in DOS days, to detect files infected by viruses. The main purpose of this is to detect corrupted copies of backups, as I understand that consumer-grade hardware is not 100% error free when doing backups or file transfers from device to device. The hashes can also be used to generate a list of changed files for backup. Most of the software I can find only hashes files manually. EDIT: Windows-based application, preferably a shell extension so I can right-click on a folder and checksum/verify all files in that folder. Even better if it can integrate with a backup/sync program like BeyondCopy.
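
    Although the question asks for a Windows shell extension, the underlying operation is small enough to script; a minimal sketch with GNU coreutils (usable on Windows through Git Bash or Cygwin):

    ```bash
    # Write an index of hashes for every file under the current folder
    find . -type f ! -name SHA1SUMS -print0 | xargs -0 sha1sum > SHA1SUMS

    # Later, verify the folder against the stored index (prints only mismatches)
    sha1sum --check --quiet SHA1SUMS
    ```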

    Read the article

  • Batch convert divx to iPhone format

    - by Kelsey
    I am looking for free software to do batch conversions of DivX video files to iPhone format. I have read the thread: http://superuser.com/questions/5784/looking-to-convert-video-to-iphone-format Handbrake works well for single files, but it has very little customization with regards to file names, and the batch functionality is not very good (or at least I can't get it to work very easily). Can anyone recommend a good batch converter? Even a script for Handbrake to batch-process everything in a specific directory would be useful.
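
    A sketch of a HandBrakeCLI loop over one directory (the preset name differs between HandBrake versions, so treat it as a placeholder):

    ```bash
    # Convert every .avi in the current directory to an iPhone-friendly MP4
    for f in *.avi; do
        HandBrakeCLI -i "$f" -o "${f%.avi}.mp4" --preset="iPhone & iPod Touch"
    done
    ```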

    Read the article

  • USB flash drive unreadable after a few minutes on Windows 8

    - by B Sharp
    I recently got a new computer with Windows 8. I have a number of large backup files I am moving from my old PC to my new one. I was able to successfully copy the files to a couple of 16 GB flash drives. When I try to copy the files to my new computer, the process starts just fine, but after copying about 4 GB of files, the copy stops. If I look at the drive in Explorer, the drive is there, but I just get a busy cursor that stays indefinitely if I click it. If I unplug the flash drive and plug it back in, everything is fine again... for a couple of minutes. I've tried copying from both flash drives with the same result. I've also tested this on the USB 2.0 and 3.0 ports on my motherboard with the same result (the drives are USB 2.0). It's also puzzling that this is happening, since I previously used one of the same flash drives to install Windows 8 on this computer in the first place without any difficulty.

    Read the article

  • How Can I Make Apache Stop Serving ALL Unknown File Types (like .php~)?

    - by user223304
    I am coming from IIS and moving to Apache, and recently found out that Apache by default serves up files with an unknown file extension as plain text. This can be an issue if a user uses certain programs that back up .php files as .php~. The .php~ file then becomes completely readable simply by navigating to it in a browser. To make matters worse, these .php~ files are often considered 'hidden' in the Linux environment, so some users may not even know they exist. Bots have been built around this fact that scour the internet looking for popular backup file names and extracting potentially sensitive information from them. I already know how to stop serving .php~ files or any other specific file extension. I also know not to use editors that save backup files like this. My question is: how can I stop this default Apache behavior of serving up ANY non-MIME file type at all? I just don't like this behavior and would like to stop it. I don't want it serving up .aspx~, .html~, .bob, .carl, files with no extension, or anything else that is not a real MIME type. I know that I can probably use a directive to first deny access to all file types and then add the ones I want to serve one by one, but I'm wondering if there's an easier/quicker way. Thanks for any help.
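
    For reference, a sketch of the deny-by-default idea already mentioned in the question, expressed with Apache 2.2-style directives in the relevant <Directory> block or .htaccess (extend the whitelist of extensions as needed):

    ```apache
    # Refuse to serve anything by default...
    Order Deny,Allow
    Deny from all

    # ...then re-allow only whitelisted extensions
    <FilesMatch "\.(php|html?|css|js|png|jpe?g|gif|ico)$">
        Allow from all
    </FilesMatch>
    ```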

    Read the article

  • How to index filenames, sizes and basic information for every file on a network?

    - by Antoine
    I have several machines; most of them are Linux and one of them runs Mac OS X. Each machine has several internal hard drives, and I also have a few external hard drives. How can I reliably find files with this setup? The external drives are not always plugged in, but the files don't move often. Ideally I would like to be able to search the metadata given by the 'file' command, and move files over the network.
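
    One low-tech option (a sketch; the paths are placeholders and GNU find is assumed, which on Mac OS X means installing findutils) is to dump name, size and date for every file on each drive into a per-machine text index that can be collected centrally and searched with grep:

    ```bash
    mkdir -p ~/file-index

    # Run per machine / per mounted drive
    find /mnt/external1 -type f -printf "$(hostname)\t%s\t%TY-%Tm-%Td\t%p\n" \
        > ~/file-index/$(hostname)-external1.txt

    # Search the combined indexes later
    grep -i 'holiday.*\.jpg' ~/file-index/*.txt
    ```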

    Read the article

  • File doesn't exist in Linux although 'locate' finds it in the Terminal

    - by Mazen Ayman
    I'm a bit new to the unix/linux environment, but I have a small problem. I'm using "locate" to find the path of a file I need. It gives me a path, but the file doesn't exist at that path, like this:
        locate test1.txt
        /home/user/test files/text1.txt
        /home/user/test1.txt~
    The "test files" directory is where I keep the file; I copied it to the home directory once but then deleted it, and I have no idea why locate keeps telling me there is still a tmp file for it. It is worth mentioning that I used the command locate test1.txt~ | xargs -n1 rm to remove that tmp file, but maybe that is what caused the problem. I tried showing hidden files and checking for temp files, but didn't find it either. Any clue what happened?
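
    For context: locate searches a prebuilt database rather than the live filesystem, so deleted files keep showing up until that database is rebuilt. A quick check:

    ```bash
    # Rebuild the locate database (normally refreshed by a daily cron job)
    sudo updatedb

    # Query again; the stale entries should be gone
    locate test1.txt
    ```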

    Read the article

  • Deploying website content via Subversion

    - by Johann
    we have recently set up a new development infrastructure and process for one of our clients. This involves the strict use of Subversion as a central source code repository. The svn repositories contain a separate branch for code on the live system (/branches/live/). The repositories are used for PHP content (mainly WordPress blogs), but in future they may hold ASP code as well. Bonus points for a solution which works more or less the same way with ASP code on Windows Server 2008 R2. We have two servers: one staging system and one live system. The staging system is updated regularly with the code of the trunk. The live system is updated manually. Each webroot on the servers is a working copy of either the trunk (staging system) or the live branch (live system). The current workflow is: developing on the dev's box - commit into the trunk - auto-deploy on the staging system - testing on the staging system - merging into /branches/live/ - manual deployment on the live system. This works very well for one-way changes; however, we have trouble on every WordPress (or plugin) update: the WP update process removes the directories and unpacks the archive of the new version. This removes the svn admin area as well, which produces a lot of errors. We could switch to SVN 1.7 with a single, global admin area, but this would only solve one part of the problem. So far, we have done the update via the WP GUI, restored the svn admin area, added/removed the files and committed the changes to the trunk. After testing, we had to do basically the same thing on the live server (except the commit; we just reverted the changes and merged the new files from the staging system to the live system). I'm currently thinking of the following: the htdocs of each website is an svn export; each website has an svn working copy beside the htdocs directory; and a script "replays" the changes from htdocs into the working copy after an update in WP (rsync'ing the changed files to the working copy, rsync'ing and svn add'ing new files, and finally svn delete'ing the removed files). The script would have to exclude some files (like wp-config.php, uploads/temp directories, etc.). Are there better ways to do this? Unfortunately, a complete CI server is out of scope due to time and budget limitations.
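
    A rough sketch of such a replay script (the paths are placeholders; it assumes GNU tools and ignores filenames containing spaces):

    ```bash
    #!/bin/sh
    HTDOCS=/var/www/site/htdocs    # webroot, updated by WordPress
    WC=/var/www/site/wc            # Subversion working copy kept beside it

    # Mirror changed and new files from the webroot into the working copy
    rsync -a --delete \
        --exclude=.svn --exclude=wp-config.php --exclude=wp-content/uploads/ \
        "$HTDOCS/" "$WC/"

    cd "$WC" || exit 1

    # Schedule additions and deletions so the working copy matches the webroot
    svn status | awk '$1 == "?" {print $2}' | xargs -r svn add
    svn status | awk '$1 == "!" {print $2}' | xargs -r svn delete
    svn commit -m "Replay WordPress update from htdocs"
    ```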

    Read the article

  • JS Worm : how to find the entry point

    - by Cédric Girard
    Hi, my site is flagged as dangerous by Google / StopBadware.org, and I found this in several js/html files: <script type="text/javascript" src="http://oployau.fancountblogger.com:8080/Gigahertz.js"></script> <!--a0e2c33acd6c12bdc9e3f3ba50c98197--> I cleaned several files and restored a backup, but how can I work out how the worm was installed? What can I look for in the log files? This server, a CentOS 5 box, is only used as an Apache server, with our own programs, a TikiWiki and a Drupal installed. Thanks Cédric
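
    A sketch of one starting point (the log paths are CentOS defaults; adjust them to the actual vhost configuration): note when the infected files were last modified, then search the Apache logs around those times for the injected domain and for suspicious POSTs to PHP scripts:

    ```bash
    # When were the infected files last touched?
    find /var/www -name '*.js' -o -name '*.html' | xargs ls -l --time-style=full-iso

    # Any direct references to the injected domain in the logs?
    grep -r 'fancountblogger' /var/log/httpd/

    # POST requests to PHP scripts (a common upload/backdoor pattern)
    awk '$6 ~ /POST/ && $7 ~ /\.php/' /var/log/httpd/access_log
    ```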

    Read the article

  • Patch msp into msi package

    - by Kvad
    The latest update of Windows Live Messenger is an msp added to the package. I want to patch an msp into an msi. Reference download: http://wl.dlservice.microsoft.com/download/8/3/D/83D75746-DF04-45E9-8374-BD31B9419128/en/wlsetup-all.exe I extracted all the msi and msp files from this. (To get the msp and msi files I did the following: (1) use Resource Hacker to open up wlsetup-all.exe; (2) in the left-hand tree, browse to PACKAGE; (3) right-click PACKAGE and save the PACKAGE resources to a new temp folder, e.g. D:\temp\package.rc - this outputs a whole lot of .bin files; (4) these are just cab files, so do a mass rename with "ren *.bin *.cab"; (5) once done, select all the cab files and extract them to a new subfolder \extracted - in \extracted you will see all the msi, msp and 7z files you need.) I tried to apply the msp directly with no result: msiexec /p messenger.msp /a messenger.msi I also tried doing an admin install, with nothing being extracted.

    Read the article

  • Alternatives to MusicBrainz Picard - audio-fingerprinting tagging services [closed]

    - by Journeyman Geek
    Possible Duplicate: Auto-tagging MP3s I'm currently using MusicBrainz Picard to tag files that don't have any usable tagging information. For some reason, currently none of the files I am trying to ID seem to be identified, despite some of them being by artists who are almost certainly on MusicBrainz. So I'm looking for something that uses an alternate database and does the same thing - identify unknown music files from an audio fingerprint. Windows is preferred, but I would be fine with any of the big 3 OSes (Windows/Linux/OS X).

    Read the article

  • How to check that image sizes are ok or not

    - by Goran
    I have many image files in one folder. Mostly they are jpg, but there are some png, bmp, gif and tif files as well (jpg is the most important if the others are not possible). There are also many xml files in the same folder with the same names as the image files. So there is something like this: 1.jpg, 1.xml, a.png, a.xml, 3g.bmp, 3g.xml... All xml files have only one line: <IMGRES WIDTH="1234" HEIGHT="567" /> (of course the numbers are not the same). I am looking for some easy way to check all the image sizes and confirm whether they match the xml or not. So I want output like: imagename, xmlwidth x xmlheight, realwidth x realheight, match/error. Is this possible in only Windows with no other software installed?
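
    Not with a stock Windows install alone; but if installing one small tool is acceptable, a sketch using ImageMagick's identify under a Unix-style shell (e.g. Git Bash) produces roughly the requested output:

    ```bash
    # Compare each XML sidecar's declared size against the real image size
    for xml in *.xml; do
        img=$(ls "${xml%.xml}".* 2>/dev/null | grep -viE '\.xml$' | head -n 1)
        declared=$(grep -oE '[0-9]+' "$xml" | paste -sd x -)
        real=$(identify -format '%wx%h' "$img" 2>/dev/null)
        [ "$declared" = "$real" ] && status=match || status=error
        echo "$img, $declared, $real, $status"
    done
    ```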

    Read the article
