Search Results

Search found 40479 results on 1620 pages for 'binary files'.

Page 397/1620 | < Previous Page | 393 394 395 396 397 398 399 400 401 402 403 404  | Next Page >

  • Find and Replace String in filenames

    - by shekhar
    I have thousands of files with no specific extensions. What I need to do is search for a string in the filename and replace it with another string, then search for a second string and replace it with another, and so on. In other words, I have multiple strings to replace with other strings, for example:

    "abc" in a filename replaced with "def" (the string "abc" may appear in many files)
    "jkl" in a filename replaced with "srt" (the string "jkl" may appear in many files)
    "pqr" in a filename replaced with "xyz" (the string "pqr" may appear in many files)

    I am currently using an Excel macro to get the file names into Excel, preserving the original names in one column and building the desired names in another column, and then creating a batch file of the form:

        rename Path\OriginalName1 NewName1
        rename Path\OriginalName2 NewName2

    The problem with this procedure is that it takes a lot of time because there are so many files, and as I am using Excel 2003 there is a limit on the number of rows as well. I need a script that works like:

        replacestr abc with def
        replacestr pqr with xyz

    in a single directory. Would it be better to do this in a Unix script?
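
    A minimal bash sketch of the mass rename; the directory and the abc/def, jkl/srt, pqr/xyz pairs are placeholders taken from the examples above:

        #!/bin/bash
        # Apply several substring substitutions to every filename in one directory.
        cd /path/to/dir || exit 1
        for f in *; do
            new=${f//abc/def}
            new=${new//jkl/srt}
            new=${new//pqr/xyz}
            # only rename when something actually changed
            [ "$f" != "$new" ] && mv -- "$f" "$new"
        done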

    Read the article

  • Deploy Windows 7 Backup set to Windows 8

    - by Matthias
    Situation: We have a laptop here that's completely fubar, i.e. the hard drive is filled to the brim with bad sectors. Luckily, backups have been made using the built-in Windows 7 backup feature. This produces folders named Backup Set 2012-11-09 003009, containing folders like Backup Files 2012-11-09 003009, containing zip files like Backup files 1, 2, 3, ... Our brand new laptop comes with Windows 8. Now: can we, using the standard backup and restore feature in Windows 8, restore all the documents, music, etc. from the Windows 7 backup files? Thanks. (FYI: we also took a normal backup of all the documents just to be sure, of course. I'm just curious what would happen. I would test it out, but the new laptop hasn't arrived yet and I wanted to make sure my efforts would not be in vain.)

    Read the article

  • Apache can't access /assets directory (OS X 10.6 Snow Leopard)

    - by Doug Kaye
    I know this will turn out to be something really stoopid, but I can't find it. Everything was great until I upgraded to OS X 10.6.2 (Snow Leopard) and the supplied Apache 2.2.13. I've replaced all the httpd conf files with my own that were previously working just fine. Everything is great except for one thing: Apache returns a 404 error for any requests to /assets/*. If I rename the directory from 'assets' to anything else, it works fine. I'm going crazy trying to find out why it's sensitive to the string 'assets'. I have no .htaccess files. All permissions have been checked. I've scoured all conf files (including vhosts) for anything that might cause this and haven't found it. Is there any reason why Apache would treat 'assets' differently from anything else? Is there anywhere to check other than the conf and .htaccess files?
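
    A quick diagnostic sketch, assuming the stock Snow Leopard config location (the path is an assumption):

        # search every loaded Apache config for anything that treats "assets" specially
        grep -rin "assets" /private/etc/apache2/ 2>/dev/null
        # confirm which vhost and DocumentRoot actually answer the request
        apachectl -S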

    Read the article

  • Differential backup missing moved folders (flawed archive attribute logic)

    - by Max
    Recently I've discovered that my backup system is flawed: there are situations where various files/folders are missed. I back up from a local disk to a network NAS. I use Cobian Backup, and I have set up the backup software to create one full backup every week and one differential backup every day. Now, the backup software (to my knowledge any backup software works this way) decides which files go into the differential backup by looking at the file archive attribute: if the attribute is set, the file goes into the backup. When you move a file to a new location on a Windows system, the archive attribute gets set and the file is included in the backup, and that's fine... but when you move an entire folder, no archive attribute is set, neither on the folder nor on any of the files inside it, so the moved folder isn't included in the differential backup! So if you have a full backup plus a differential backup and you moved folders around, it's impossible to reconstruct the original file/folder structure from the full+differential backup, because the backup software didn't include the moved folders in the differential backup. So my differential backups are useless... Why does Windows set the archive attribute when moving a file, but not when moving a folder? How can I deal with this issue? Is there a way to create a differential backup that works as it's supposed to? Doing a full backup every day is not practical, because the changed data is about 0.1% per day (by using a differential backup I can keep 4 weeks of file history without using too much disk space).

    Read the article

  • How to convert tar files from GNU format to pax format

    - by nosid
    On the one hand I have a lot of tar files created in GNU format, and on the other hand I have a tool that only supports pax (aka POSIX) format. I am looking for an easy way to convert the existing tar files to pax format - without extracting them to the file system and re-creating the archives. GNU tar supports both formats; however, I haven't found an easy way to do the conversion. How can I convert the existing GNU tar files to pax?
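
    One possible sketch, assuming bsdtar (libarchive) is available: bsdtar can read the members of an existing archive directly via its @archive syntax and write them back out in a different format, so nothing is extracted to the file system:

        # read old.tar (GNU format) and rewrite its members as a pax-format archive
        bsdtar -cf new.tar --format=pax @old.tar

    Afterwards, running tar -tvf new.tar should list the same members as the original archive.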

    Read the article

  • Nginx + Wordpress Multisite 3.4.2 + subdirectories + static pages and permalinks

    - by UrkoM
    I am trying to set up WordPress Multisite, using subdirectories, with Nginx, php5-fpm, APC, and Batcache. Like many other people, I am getting stuck on the rewrite rules for permalinks. I have followed these two guides, which seem to be as official as you can get:

    http://evansolomon.me/notes/faster-wordpress-multisite-nginx-batcache/
    http://codex.wordpress.org/Nginx#WordPress_Multisite_Subdirectory_rules

    It is partially working: http://blog.ssis.edu.vn works, and http://blog.ssis.edu.vn/umasse/ works. But other permalinks, like these two to a post or to a static page, don't work:

    http://blog.ssis.edu.vn/umasse/2008/12/12/hello-world-2/
    http://blog.ssis.edu.vn/umasse/sample-page/

    They either take you to a 404 error or to some other blog! Here is my configuration:

        server {
            listen 80 default_server;
            server_name blog.ssis.edu.vn;
            root /var/www;
            access_log /var/log/nginx/blog-access.log;
            error_log /var/log/nginx/blog-error.log;

            location / {
                index index.php;
                try_files $uri $uri/ /index.php?$args;
            }

            # Add trailing slash to */wp-admin requests.
            rewrite /wp-admin$ $scheme://$host$uri/ permanent;

            # Add trailing slash to */username requests
            rewrite ^/[_0-9a-zA-Z-]+$ $scheme://$host$uri/ permanent;

            # Directives to send expires headers and turn off 404 error logging.
            location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
                expires 24h;
                log_not_found off;
            }

            # this prevents hidden files (beginning with a period) from being served
            location ~ /\. {
                access_log off;
                log_not_found off;
                deny all;
            }

            # Pass uploaded files to wp-includes/ms-files.php.
            rewrite /files/$ /index.php last;
            if ($uri !~ wp-content/plugins) {
                rewrite /files/(.+)$ /wp-includes/ms-files.php?file=$1 last;
            }

            # Rewrite multisite '.../wp-.*' and '.../*.php'.
            if (!-e $request_filename) {
                rewrite ^/[_0-9a-zA-Z-]+(/wp-.*) $1 last;
                rewrite ^/[_0-9a-zA-Z-]+.*(/wp-admin/.*\.php)$ $1 last;
                rewrite ^/[_0-9a-zA-Z-]+(/.*\.php)$ $1 last;
            }

            location ~ \.php$ {
                # Forbid PHP on upload dirs
                if ($uri ~ "uploads") {
                    return 403;
                }
                client_max_body_size 25M;
                try_files $uri =404;
                fastcgi_pass unix:/var/run/php5-fpm.sock;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include /etc/nginx/fastcgi_params;
            }
        }

    Any ideas are welcome! Have I done something wrong? I have disabled Batcache to see if it makes any difference, but still no go.

    Read the article

  • Mounting an FTP as a virtual disk (FTPDrive analogue)

    - by axk
    FTPDrive has been a great utility for me, but it does not support 64-bit Windows 7. The feature of FTPDrive that is useful to me is accessing files on an FTP server as local files without pre-downloading, so that I can preview and watch movies from a local FTP server without waiting for the full movie to be downloaded first. Do you know of any software which allows accessing files over FTP without pre-downloading? Thanks!

    Read the article

  • How to install/configure ffmpeg to compress mp4 videos for flash player delivery?

    - by Andrew Fulton
    We have a Flash web app that creates interactive video, and we are using ffmpeg to do some compression/resizing when a user "publishes" their project. The user can upload flv files and mp4 files, both of which play fine in the Flash UI before publishing. After publishing, the flv files work fine, but the mp4 files will not play in the Flash player: audio will play but video won't. The mp4 files will play fine if I download them and play them in the QuickTime player, but if I attempt to open them in the Adobe Media Player it reports "The media file does not contain a supported video track". If I open the Movie Inspector in QuickTime, it tells me that the original file is "h264" video and the ffmpeg-processed ones are "mpeg-4". I have tried forcing it to h264 by adding flags like -f h264 and -vcodec h264, but I get a screenful of errors (no frame, illegal POC type, sps_id out of range) ending with Could not find codec parameters (Video: h264). h264 will show up if I run ffmpeg -formats and ffmpeg -codecs, and as I said the files play fine in QuickTime. Is there anything else I need to do to convince the Flash player to play them? Is there anything else I need to tell you about the server that will help?
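
    For reference, a hedged sketch of the kind of re-encode that targets the H.264 decoder in the Flash player, assuming an ffmpeg build with libx264 (in ffmpeg the encoder is named libx264; "h264" is only a decoder) and placeholder file names; older builds may additionally demand an encoding preset via -vpre:

        # re-encode the video track as H.264 and copy the audio track unchanged
        ffmpeg -i input.mp4 -vcodec libx264 -acodec copy output.mp4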

    Read the article

  • Mass renaming, *nix version

    - by Paolo B.
    I was looking for a way to rename a huge number of similarly-named files, much like this one (a Windows-related question), except that I'm using *nix (Ubuntu and FreeBSD, separately). Just to sum up: using the shell (Bash, csh, etc.), how do I mass-rename a number of files such that, for example, the following files:

    Beethoven - Fur Elise.mp3
    Beethoven - Moonlight Sonata.mp3
    Beethoven - Ode to Joy.mp3
    Beethoven - Rage Over the Lost Penny.mp3

    will be renamed like these?

    Fur Elise.mp3
    Moonlight Sonata.mp3
    Ode to Joy.mp3
    Rage Over the Lost Penny.mp3

    The reason I want to do this is that this collection of files will go under a directory named "Beethoven" (i.e. the filenames' prefix), and having this information in the filename itself would be redundant.
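
    A minimal sketch in bash (csh syntax would differ), using the prefix and extension from the example above:

        # strip the "Beethoven - " prefix from every matching file in the current directory
        for f in "Beethoven - "*.mp3; do
            mv -- "$f" "${f#Beethoven - }"
        done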

    Read the article

  • Can't change permission/ownership/group of external hard drive on Ubuntu.

    - by MikeN
    I have an external hard drive connected to my Linux box. I wanted to set up a web server to access files on it, but the permissions on all files and directories on the drive are "rwx" for the owner, which is my local login, and the group is the "root" group. I need the files readable by the apache user. I was trying to set all files with "chmod a+rwx -R *", but this doesn't do anything (it gives no errors, it just has no effect). I tried changing the group to my user group using "chgrp", but that won't work either; it gives me errors that I lack permission even when I run all those commands with sudo! What's up with this hard drive??? "sudo chmod a+rwx *" should work on anything, right?
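
    A hedged diagnostic sketch: this is the classic symptom of a filesystem (FAT/NTFS) that does not store Unix permissions, so ownership and mode are fixed at mount time and chmod/chgrp have no effect. The device, mount point, and group names below are assumptions:

        # see what filesystem the drive is mounted as and with which options
        mount | grep /media
        # if it is vfat or ntfs, remount it with the ownership you want instead of using chmod
        sudo umount /media/external
        sudo mount -t ntfs-3g -o uid=youruser,gid=www-data,umask=002 /dev/sdb1 /media/external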

    Read the article

  • Deleting another user's directories from my own

    - by kwatford
    I am a non-root user, and have made a directory into which other users in my group can write. The directory is setgid, so files and directories within it have the same group. I can delete files placed into this directory, but if a user creates a subdirectory with files in it, I can't seem to delete those. Is there something special I can do (other than, say, bothering the user in question or the sysadmin about it) to get rid of this subdirectory?
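
    A small sketch of the check involved, with placeholder paths: removing a non-empty subdirectory means deleting its contents first, and that requires write and execute permission on the subdirectory itself, not just on your parent directory:

        # show the permissions on the shared directory and on the offending subdirectory
        ls -ld shared/ shared/their-subdir/
        # this will fail on the subdirectory's contents unless the subdirectory is group-writable
        rm -r shared/their-subdir/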

    Read the article

  • Fix Video timelines

    - by Josh
    So, I have been going through and ripping all of my DVDs, and it seems that the way to get the highest quality is to have DVD Shrink decrypt, rip, and decompress the DVDs. After that I usually end up with a high-quality (large) set of .vob files in a classic DVD structure. Then I use a Python script that I wrote to automate the process of finding the title sequence, combining all of the title sequence's .vob files into one file (similar to the "copy /b" command in Windows), and then changing the extension to .mpg (a more widely supported format than .vob). This allows me to get a high-quality rip in about 40 min. The problem comes in playing the files. I need all of the ripped DVDs to play on my media computer using Windows Media Center, but Windows Media Center (and VLC for that matter) thinks that the video files are anywhere from 5 min to 0 min long. That is not a problem in itself (the video will still play all the way through), but if you want to pause it, when it is unpaused the video starts all the way over (also, fast forward and rewind don't work). I suspect that something is wrong with the way the timeline is encoded in the video file; various forums on the internet recommended using VirtualDub to fix the errors. But when I try to open the file, VirtualDub says that the file is not in MPEG-1 encoding and may be MPEG-2. Is there any way to fix this? PS: I am aware that there was a similar question, but it hasn't had any activity for 2 months and is dealing more with wmv files.
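
    A hedged alternative sketch to concatenating the raw .vob files byte-for-byte: let ffmpeg remux the concatenated stream so the container gets a consistent timeline. File names are placeholders, and -fflags +genpts asks ffmpeg to regenerate presentation timestamps:

        # concatenate the title's VOBs and remux them without re-encoding
        cat VTS_01_1.VOB VTS_01_2.VOB VTS_01_3.VOB | ffmpeg -fflags +genpts -i - -vcodec copy -acodec copy movie.mpg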

    Read the article

  • Robocopy hiding folders on backup drives

    - by Neil Barnwell
    I have a backup batch file that uses Robocopy to back up my files:

        robocopy "C:\" "G:\Default\RoboCopyBackup\C" /XF Pagefile.sys /XD "System Volume Information" "Recycler" "Temporary Internet Files" "Installer Cache" "Temp" /E /R:1 /W:0 /TEE /XJ

    This should create a folder structure on the external backup drive like so: G:\Default\RoboCopyBackup\C\... However, G: appears totally empty. What is weird is that the folders and files are there! If I type the above path into the address bar, I see all the files and folders! Can anyone help me work out why? I think it might be some NTFS-based ownership/permissions thing, but I'm not sure.

    Read the article

  • Problem installing WordPress

    - by Hajloo
    I am trying to install WordPress on a Windows client with WebPI, which is provided by Microsoft. I had to stop the installation process 3 times and install the PHP and MySQL extensions manually, but every time I continued setup with WebPI it finally showed me a success message. But when I try to view the installed WordPress on my client, I see this: "Your PHP installation appears to be missing the MySQL extension which is required by WordPress." I asked about it on Stack Overflow here but I couldn't get the right answer. I installed everything under C:\Program Files\, so these are the locations:

        C:\Program Files\MySQL\MySQL Server 5.1
        C:\Program Files\Php
        C:\Program Files\ext

    MySQL root password: admin
    WordPress database: wordpress
    WordPress database password: 123

    Here is my php.ini

    Read the article

  • nginx can't see MySQL

    - by user135235
    I have a fully working Joomla 2.5.6 install driven by a local MySQL server, but I'd like to test nginx to see if it's a faster web serving experience than Apache.

    PHP 5.4.6 (PHP54w)
    CentOS 6.2
    Joomla 2.5.6
    PHP54w-fpm.i386 (FastCGI process manager)
    php -m shows the mysql and mysqli modules loaded

    Nginx seems to have installed fine via yum; it can process a PHP info file via FastCGI perfectly OK (http://37.128.190.241/php.php), but when I stop Apache, start nginx instead and visit my site I get: "Database connection error (1): The MySQL adapter 'mysqli' is not available." I've tried adjusting my Joomla configuration.php to use mysql instead of mysqli, but I get the same basic error, only this time "Database connection error (1): The MySQL adapter 'mysql' is not available", of course! Can anyone think what the problem might be, please? I did try explicitly setting extension = mysqli.so and extension = mysql.so in my php.ini to try and force the issue (despite php -m showing they were both successfully loaded anyway) - no difference. I have a pretty standard nginx default.conf:

        server {
            listen 80;
            server_name www.MYDOMAIN.com;
            server_name_in_redirect off;
            access_log /var/log/nginx/localhost.access_log main;
            error_log /var/log/nginx/localhost.error_log info;
            root /var/www/html/MYROOT_DIR;
            index index.php index.html index.htm default.html default.htm;

            # Support Clean (aka Search Engine Friendly) URLs
            location / {
                try_files $uri $uri/ /index.php?q=$uri&$args;
            }

            # deny running scripts inside writable directories
            location ~* /(images|cache|media|logs|tmp)/.*\.(php|pl|py|jsp|asp|sh|cgi)$ {
                return 403;
                error_page 403 /403_error.html;
            }

            location ~ \.php$ {
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                include fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include /etc/nginx/fastcgi.conf;
            }

            # caching of files
            location ~* \.(ico|pdf|flv)$ {
                expires 1y;
            }
            location ~* \.(js|css|png|jpg|jpeg|gif|swf|xml|txt)$ {
                expires 14d;
            }
        }

    Snip of output from phpinfo under nginx:

        Server API                                FPM/FastCGI
        Virtual Directory Support                 disabled
        Configuration File (php.ini) Path         /etc
        Loaded Configuration File                 /etc/php.ini
        Scan this dir for additional .ini files   /etc/php.d
        Additional .ini files parsed              /etc/php.d/curl.ini, /etc/php.d/fileinfo.ini, /etc/php.d/json.ini, /etc/php.d/phar.ini, /etc/php.d/zip.ini

    Snip of output from phpinfo under Apache:

        Server API                                Apache 2.0 Handler
        Virtual Directory Support                 disabled
        Configuration File (php.ini) Path         /etc
        Loaded Configuration File                 /etc/php.ini
        Scan this dir for additional .ini files   /etc/php.d
        Additional .ini files parsed              /etc/php.d/curl.ini, /etc/php.d/fileinfo.ini, /etc/php.d/json.ini, /etc/php.d/mysql.ini, /etc/php.d/mysqli.ini, /etc/php.d/pdo.ini, /etc/php.d/pdo_mysql.ini, /etc/php.d/pdo_sqlite.ini, /etc/php.d/phar.ini, /etc/php.d/sqlite3.ini, /etc/php.d/zip.ini

    It seems that with Apache, PHP is loading substantially more additional .ini files, including ones relating to MySQL (mysql.ini, mysqli.ini, pdo_mysql.ini), than with nginx. Any ideas how I get nginx to also load these additional .ini files? Thanks in advance, Steve
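
    A small diagnostic sketch, hedged: the commands below only compare what the CLI scans against what is on disk, and check which package owns the MySQL .ini files that the Apache output lists. Package and path names come from the question; anything else is an assumption:

        # what the CLI SAPI scans and loads
        php --ini
        php -m | grep -i mysql
        # which .ini files actually exist in the scanned directory
        ls /etc/php.d/
        # which package installed the MySQL-related .ini files that show up under Apache
        rpm -qf /etc/php.d/mysqli.ini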

    Read the article

  • Mounting an Amazon EC2 instance on Mac OS X

    - by hinghoo
    What are some methods for transferring files to and from Amazon EC2 instances? I'm looking for solutions/tools for editing files as well as copying files to EC2 instances from both Mac and Windows. For example, what are some solutions for mounting a drive from an instance locally? Generally, what other methods are out there?
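
    One possible sketch for the Mac side, assuming an SSHFS/FUSE setup is installed and that the host name, key, user, and paths below are placeholders:

        # mount a directory from the instance locally over SSH
        mkdir -p ~/ec2-www
        sshfs -o IdentityFile=~/.ssh/my-key.pem ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:/var/www ~/ec2-www
        # plain copies work the same way with scp
        scp -i ~/.ssh/my-key.pem localfile.txt ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com:/tmp/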

    Read the article

  • LED Display control software

    - by user978733
    My university bought a big new LED display from a Chinese manufacturer. What I want to do is show some visualizations with music (like Windows Media Player, Winamp, iTunes, etc. do); I'd just drag the application window onto the screen. But the main problem is that the software that controls the display (called "Led Vision") doesn't support showing application windows: it only shows a limited set of file types, such as PowerPoint presentations, video files, picture files, etc. Now the question is, where can I find visualization video files, something created in After Effects that looks like this:

    Read the article

  • What kinds of protections against viruses does Linux provide out of the box for the average user?

    - by ChocoDeveloper
    I know others have asked this, but I have other questions related to it. In particular, I'm concerned about the damage a virus can do to the user himself (his files), not to the OS in general nor to other users of the same machine. This question came to my mind because of that ransomware virus that is encrypting machines all over the world and then asking the user to send a payment in Bitcoin if he wants to recover his files. I have already received and opened the email that is supposed to contain the virus, so I guess I didn't do too badly because nothing happened. But would I have survived if I had opened the attachment and it had been aimed at Linux users? I guess not. One of the advantages is that files are not executable by default right after downloading them. Is that just a bad default in Windows that could be fixed with a proper configuration? As a Linux user, I thought my machine was pretty secure by default, and I was even told that I shouldn't bother installing an antivirus. But I have read some people saying that the most important (or only?) difference is that Linux is just less popular, so almost no one writes viruses for it. Is that right? What else can I do to be safe from this kind of ransomware virus? Not automatically executing random files from unknown sources seems to be more than enough, but is it? I can't think of many other things a user can do to protect his own files (not the OS, not other users), because he has full permissions on them.

    Read the article

  • Why does running "$ sudo chmod -R 664 . " cause me to get access denied on all affected directories?

    - by Codemonkey
    I have a project folder which has messy permissions on all files. I've had the bad tendency of setting everything to octal permissions 777 because it solved all non-security-related issues. Then FTP uploads, files created by text editors, etc. have their own sets of permissions, making everything a mess. I've decided to pull myself together and start using permissions the way they were meant to be used. I figured 664 was a good default for all my files and folders, and I'd just remove permissions for others on private files and add +x to executable files. The second I changed my project folder to 664, however:

        $ sudo chmod -R 664 .
        $ ls
        ls: cannot open directory .: Permission denied

    Which makes no sense to me. I have read/write permissions, and I'm the owner of the project folder. The leftmost part of ls -l in my project folder looks like this:

        -rw-rw-r-- 1 codemonkey codemonkey ...
        drw-rw-r-- 5 codemonkey codemonkey ...
        -rw-rw-r-- 1 codemonkey codemonkey ...
        -rw-rw-r-- 1 codemonkey codemonkey ...
        drw-rw-r-- 3 codemonkey codemonkey ...
        -rw-rw-r-- 1 codemonkey codemonkey ...
        -rw-rw-r-- 1 codemonkey codemonkey ...
        -rw-rw-r-- 1 codemonkey codemonkey ...
        drw-rw-r-- 4 codemonkey codemonkey ...
        drw-rw-r-- 5 codemonkey codemonkey ...

    I assume this has something to do with the permissions on the directories, but what?
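
    A short sketch of one common way out, hedged as an option rather than the only fix: directories need the execute (search) bit to be entered, which is exactly what 664 strips; the capital X in symbolic mode adds execute only to directories (and to files that are already executable):

        # restore sane defaults: 664 for regular files, 775 for directories
        chmod -R u=rwX,g=rwX,o=rX .
        # or handle files and directories separately
        find . -type f -exec chmod 664 {} +
        find . -type d -exec chmod 775 {} +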

    Read the article

  • Google search engine

    - by kourosh
    I am working on a Google box, something like this: http://mytwentyfive.com/blog/wp-content/uploads/byme/Google%20Search%20Appliances.jpg I am pointing the crawler to a folder where there are HTML files. Previously the crawler was crawling and indexing the files, but right now it finds the pattern or the folder without following any of the HTML files within the folder. I have tried everything I could think of, but can't come up with anything else. Can someone help? Thanks

    Read the article

  • Ubuntu Software RAID 0 on AWS Does Not Survive Reboot

    - by Eric J.
    I'm experimenting with creating a software RAID 0 device from 4 EBS volumes on Ubuntu 9.10 running on Amazon AWS, following this guide: http://alestic.com/2009/06/ec2-ebs-raid The device appears and works (and according to SysBench is 3.5x faster than a regular attached EBS volume). The problem is, when I reboot the instance, all files on the RAID device are gone. The device is available and mounted where expected, but contains no files. I am able to write new files to it, which survive until the next reboot.
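
    For context, a hedged sketch of the creation plus the persistence steps a setup like this usually needs so the array and its filesystem come back after a reboot; device names, filesystem, and mount point are assumptions, not values from the guide:

        # create the striped array across the four attached EBS volumes
        sudo mdadm --create /dev/md0 --level=0 --raid-devices=4 /dev/sdh1 /dev/sdh2 /dev/sdh3 /dev/sdh4
        # record the array so it is reassembled at boot
        sudo sh -c 'mdadm --detail --scan >> /etc/mdadm/mdadm.conf'
        sudo update-initramfs -u
        # and make the mount itself persistent
        echo '/dev/md0 /data xfs noatime 0 0' | sudo tee -a /etc/fstab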

    Read the article

  • Loads of memory in "standby" on Windows Server 2008 R2

    - by Jaap
    In our SharePoint farm, our Web Front End servers all have loads of memory in "standby" mode, meaning very little is available for our IIS worker process. We have 32 GB of RAM in each of the boxes, and standby memory will creep up to about 28 GB, whereas the IIS worker process only seems to be using about 2 GB. Also, we've seen the machine use the swap file extensively while this memory was in standby, so I am starting to think that this memory in standby mode is preventing IIS from using it, forcing it to swap to disk and causing more performance problems. I used Sysinternals RamMap to identify what is being kept in memory, and it was able to tell me that almost everything in standby memory is of type "Mapped File". When I sort the files listed under the file summary tab in RamMap by file size, the largest files (around a few hundred meg each) are IIS log files and SharePoint log files. I would like to understand which process is loading these files into standby memory and why they are not being released. When I do an iisreset, it does not release the memory. Any ideas? Thanks!

    Read the article

  • If I use a video format converter to change a movie from AVI to MKV, will quality stay the same?

    - by Matt
    I know they're both container formats and that what matters is the actual codec used, but what I don't know is whether video converting software will do anything to change the codec, or whether it just repackages it. The reason I need to know is that I have several .avi files with subtitle files, and I want to turn them into .mkv so I can attach the subs and not need a second subtitle file anymore. Will my new .mkv files be identical in video and audio quality to the original .avi files?
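
    A remux-only sketch, assuming placeholder file names and an external SRT subtitle file: because the audio and video streams are copied rather than re-encoded, quality is untouched and only the container changes:

        # copy the audio and video streams into an MKV and add the subtitles as a track
        ffmpeg -i movie.avi -i movie.srt -map 0 -map 1 -c copy -c:s srt movie.mkv

    mkvmerge from the MKVToolNix package does the same kind of lossless repackaging if ffmpeg's subtitle handling gives trouble.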

    Read the article

  • Mavericks permission issues with Windows Server deduplicated shares

    - by dmohlmaster
    We have a number of 10.9-10.9.3 (Mavericks) machines installed throughout our facility. Much of the user content is pulled from shares stored on our Windows Server 2012 file servers with deduplication enabled. I have found that files which are newly written or unoptimized can be accessed without issue - read, written, modified, etc. Once a file gets optimized/deduplicated and Windows adds the P and L attributes (sparse and symlink), the Macs running Mavericks begin to have access issues. Once the files get deduplicated, users begin receiving read access errors when copying files (see error 1 below). This happens when copying to folders within the current folder tree or copying somewhere on the local system. If you 'stop' the copy operation and retry a few more times, it may eventually work for the specific instance but fail again later. I am, however, able to copy these files without issue via the terminal. Other systems running 10.7 do not experience the same issues and are able to access file server resources without issue. Many of the systems having issues are newer and thus not able to be downgraded to 10.8 or 10.7. I have tried Finder replacements such as Pathfinder, but the results are the same. I know this is at least similar to the issues many Mac users are already experiencing and posting about, but I haven't seen it directly linked to deduplication and the attributes written by Windows Server. Has anyone seen this issue? Have any solutions been found?

    Error 1, when copying files after the P/L attributes have been set by deduplication:

        One or more items can't be copied to "Foler" because you don't have permissions to read them.

    Via system.log, I am also seeing the following error when accessing these deduplicated file shares. The reparse point tag listed below is "IO_REPARSE_TAG_DEDUP". Reported error:

        smbfs_nget: filename.ext - unknown reparse point tag 0x80000013
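
    A small terminal-side sketch for comparing what works with what the GUI chokes on; the path is a placeholder and the flags are standard BSD ls options on OS X (-O shows file flags, -@ shows extended attributes):

        # copy via the terminal, which per the above succeeds where Finder fails
        cp -p /Volumes/share/somefolder/file.ext ~/Desktop/
        # inspect flags and extended attributes on an optimized vs. an unoptimized file
        ls -lO@ /Volumes/share/somefolder/file.ext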

    Read the article
