Search Results

Search found 85647 results on 3426 pages for 'file write'.


  • "Path Not Found" when attempting to write to a sub folder within a mapped drive

    - by Adam
    We have an interesting issue with one of our server shares, or possibly with our Windows 7 desktops. When our users try to save files in a subfolder of a mapped drive on our DC, either via copy/paste or through an application, they receive an error saying "Path not found". They can, however, browse this folder and open files from it, which is why the "Path Not Found" error doesn't seem to stack up in my opinion. Users can also save files fine in the root folder of the mapped drive; only subfolders appear to be affected. Which users and machines are affected seems to be random: an affected user can log on to a different machine and save in subfolders of the same mapped drive without trouble. Event Viewer hasn't been much help either. Currently, the only solution we have found is to re-image the affected machines, which solves the issue. Our servers are Server 2008 R2 with Windows 7 Pro desktops. Any help/pointers/suggestions would be greatly appreciated.

    Read the article

  • Looping through a batch file only if the response is "everything is okay"

    - by PeanutsMonkey
    I have a batch file that loops through the contents of a directory and compresses the files in it as follows:

      for %%a in (c:\data\*.*) do if "%%~xa" == "" "C:\Program Files\7-Zip\7za.exe" a -tzip -mx9 "%%a.zip" "%%a"

    Since I am using 7-Zip to compress the files, it prints the message "everything is okay" when it has successfully compressed a file, and it then moves on to the next file, if any. What I would like it to do is the following:

    1. Only move to the next file if the response is "everything is okay".
    2. If the response is anything but "everything is okay", log the error.
    3. Since an error has occurred, attempt to compress the file again.
    4. Once it has succeeded (i.e. "everything is okay"), move on to the next file.

    Steps 3 and 4 should occur a maximum of 3 times before it gives up and moves on to the next file. How can I achieve this?
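    A minimal batch sketch of that retry logic, relying on 7-Zip's exit code (0 on success) instead of parsing its output; the log path c:\data\error.log is an assumption:

      @echo off
      rem Compress each extensionless file, retrying failures up to 3 times.
      for %%a in (c:\data\*.*) do if "%%~xa" == "" call :compress "%%a"
      goto :eof

      :compress
      set /a tries=0
      :retry
      "C:\Program Files\7-Zip\7za.exe" a -tzip -mx9 "%~1.zip" "%~1"
      if not errorlevel 1 goto :eof
      set /a tries+=1
      echo %date% %time% attempt %tries% failed for %~1 >> c:\data\error.log
      if %tries% lss 3 goto retry
      goto :eof

    Checking the exit code avoids depending on the exact wording of 7-Zip's success message.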

    Read the article

  • Config file (App.config) does not update on new installation

    - by Muhammad Kashif Nadeem
    I am creating the setup for my project using Visual Studio 2008, and I am facing a problem with the installation. If I uninstall the old setup (application) and then install the new one, the config file (App.config) is updated (it is certainly a new file); but if I install the new setup without uninstalling the old one, the config file is not updated. By config file I mean MyProject.exe.config. Why does the config file behave this way? Should it not be updated when the new setup is installed? Is it possible to delete the old config file and copy in the one from the new setup? Is there a way to forcefully update only the config file during installation? Thanks for your help!

    Read the article

  • NTFS write speed really slow (<15MB/s) on Ubuntu

    - by Zulakis
    When copying large files or testing write speed with dd, the maximum write speed I can get is about 12-15MB/s on drives using the NTFS filesystem. I tested multiple drives (all connected using SATA), which all got write speeds of 100MB/s+ on Windows or when formatted with ext4, so it's not an alignment or drive issue. top shows high CPU usage for the mount.ntfs process.

    AMD dual-core processor (2.2 GHz)
    Kernel version: 3.5.0-23-generic
    Ubuntu 12.04
    ntfs-3g version: both 2012.1.15AR.1 (the Ubuntu default version) and 2013.1.13AR.2

    How can I fix the write speed?
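    For reference, a dd write test of the kind described might look like this (the mount point /mnt/ntfs is an assumption; conv=fdatasync makes dd flush to disk before reporting a rate):

      dd if=/dev/zero of=/mnt/ntfs/testfile bs=1M count=512 conv=fdatasync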

    Read the article

  • Mac can't write to Samba share

    - by David
    I have a Samba share that works fine for PCs, but we have a Mac user who seems to be able only to edit and rename existing files; he cannot add new files. Any ideas? Here is the share setup:

      path = /media/freeagent/officeshare
      read only = No
      guest ok = Yes
      writeable = yes
      public = yes

    Read the article

  • What could cause the file command in Linux to report a text file as data?

    - by Jonah Bishop
    I have a couple of C++ source files (one .cpp and one .h) that are being reported as type data by the file command in Linux. When I run the file -bi command against these files, I'm given this output (the same for each file): application/octet-stream; charset=binary. Each file is clearly plain text (I can view them in vi). What's causing file to misreport the type of these files? Could it be some sort of Unicode thing? Both of these files were created in Windows-land (using Visual Studio 2005), but they're being compiled in Linux (it's a cross-platform application). Any ideas would be appreciated. Update: I don't see any null characters in either file. I found some extended characters in the .cpp file (in a comment block) and removed them, but file still reports the same encoding. I've tried forcing the encoding in SlickEdit, but that didn't seem to have an effect. When I open the file in vim, I see a [converted] line as soon as the file opens. Perhaps I can get vim to force the encoding?
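    One way to chase this down is to look at the raw bytes and then re-encode; the source encoding below is an assumption to be replaced with whatever the bytes suggest:

      # A UTF-16 BOM shows up as ff fe or fe ff in the first bytes;
      # stray high bytes (>= 0x80) elsewhere also make file say "data":
      hexdump -C myfile.cpp | head

      # Re-encode to UTF-8 once the source encoding is known:
      iconv -f WINDOWS-1252 -t UTF-8 myfile.cpp > myfile.utf8.cpp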

    Read the article

  • Likewise: joined Active Directory but cannot write to shares

    - by Aron Rotteveel
    I have never used a Linux system in an AD environment before, and I am trying to join my laptop running Ubuntu to our Active Directory (the DC is a Windows Server 2008 machine) using likewise-open. Using the GUI wizard, I have joined the domain, and I can mount network shares using CIFS. Problem: I only have read access to our fileserver. What more is needed for AD to recognize me as a user with the appropriate rights? Any help is appreciated.

    Read the article

  • Motherboard jumper setting: BIOS flash write protection

    - by Wesley
    I have an ECS P4M800PRO-M478 motherboard and I'm just setting up the jumpers right now, of which there are only two sets. One is the CLR_CMOS jumper, which is set to Normal, of course. The other is called BIOS_WP, and it controls whether BIOS flash writing is protected or unprotected. Which setting should I use, and would it affect any BIOS flashes in the future?

    Read the article

  • rename/delete a folder from multipart rar file

    - by kikio
    Hello. I have a question (I sent it once before). I have a multipart RAR file whose parts contain:

      file.part01.rar: myfolder (a folder), data.cab
      file.part02.rar: myfolder (a folder), data.cab
      file.part03.rar: myfolder (a folder), data.cab
      file.part04.rar: difffolder (a folder), anfolder (a folder), data.cab
      file.part05.rar: myfolder (a folder), data.cab

    I want to extract it, so I right-click on "file.part01.rar" and select "Extract to ...". It extracts 3 files, but at part 4 WinRAR says: "CRC error. This file is corrupt." I think the problem is the folder names in part04.rar. Is there any way to rename the folders in part04.rar, and to move "data.cab" from "anfolder" to "difffolder"? I really need this; it's urgent! Thank you.

    Read the article

  • Testing for disk write

    - by Montecristo
    I'm writing an application that stores lots of images (size <5MB) on an ext3 filesystem. After some searching here on Server Fault, I have decided on a directory structure like this:

      000/000/000000001.jpg
      ...
      236/519/236519107.jpg

    This structure will allow me to save up to 1'000'000'000 images, as I'll store a maximum of 1'000 images in each leaf directory. I've created it, and from a theoretical point of view it seems fine to me (though I have no experience with this), but I want to find out what will happen when the directories start to fill with files. A question about creating this structure: is it better to create it all in one go (it takes approx 50 minutes on my PC), or should I create directories as they are needed? From a developer's point of view I think the first option is better (no extra waiting time for the user), but from a sysadmin's point of view, is this OK? I thought I could test as if the filesystem were already under the running application: I'll write a script that saves images as fast as it can, monitoring the following:

    1. How much time does it take for an image to be saved when there is little or no space used?
    2. How does this change as the space starts to be used up?
    3. How much time does it take for an image to be read from a random leaf? Does this change much when there are lots of files?

    Does launching this command make any sense at all: sync; echo 3 | sudo tee /proc/sys/vm/drop_caches? Is that the only thing I have to do to get a clean start if I want to start my tests over again? Do you have any suggestions or corrections?
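    A minimal shell sketch of the measurement loop described above (sample.jpg, the log file name, and the pre-built tree are assumptions; the id-to-path mapping follows the 1'000-per-leaf layout):

      #!/bin/sh
      # Copy a representative image into the nested layout, timing each write.
      i=1
      while [ "$i" -le 1000000 ]; do
          top=$(printf '%03d' $(( i / 1000000 )))
          mid=$(printf '%03d' $(( (i / 1000) % 1000 )))
          start=$(date +%s%N)                      # GNU date, nanoseconds
          cp sample.jpg "$top/$mid/$(printf '%09d' "$i").jpg"
          end=$(date +%s%N)
          echo "$i $(( (end - start) / 1000000 )) ms" >> write-times.log
          i=$(( i + 1 ))
      done

    Plotting write-times.log afterwards shows whether save time degrades as the leaves fill up.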

    Read the article

  • How to restore from file using Symantec NetBackup 7.5

    - by Tony
    I have an install of Symantec NetBackup 7.5 and I want to restore the server from a NetBackup image file. The file was created using NetBackup before I arrived. We had a hardware failure that corrupted this server and it needed to be rebuilt, and now we want to restore from this image file. I can't for the life of me figure out how to restore from that file. I've installed the NetBackup application, but it can't find the file when I use the restore command within the application. If I double-click the file, it opens the application and then gives me the same "can't find any NetBackup files" error. I also can't simply drag the file into the NetBackup window. Any advice on how to restore from this file would be appreciated, thank you.

    Read the article

  • SQL Error (1064) when importing data from SQL file

    - by mejpark
    I have a MySQL database which was originally set up with the default latin1 character set and latin1_swedish_ci collation. I used the database like this for some time, until I noticed strange characters on my production web site, which is powered by a database exported from my development machine. At that point, I changed the default character set of the database and tables to utf8 and the collation to utf8_unicode_ci, converted the latin1 data inside each table to utf8 (using the 'convert data' option), and exported the database as a single SQL file using HeidiSQL. When the resulting SQL file is opened in Notepad++, several characters are rendered incorrectly. For example, en dashes are displayed as – and e with acute accent (é) is displayed as é. I changed the encoding of the file from ANSI to UTF-8 (using the encoding menu option in Notepad++) and the offending characters are rendered correctly. I saved the new UTF-8-encoded SQL file and attempted to import the contents into the MySQL database on my production server. The import fails with the following error:

      /* SQL Error (1064): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '?# -------------------------------------------------------- # Host: ' at line 1 */
      /* Error with snippets directory: The specified path was not found */

    The head of the SQL file:

      # --------------------------------------------------------
      # Host: 127.0.0.1
      # Server version: 5.1.33-community
      # Server OS: Win32
      # HeidiSQL version: 6.0.0.3773
      # Date/time: 2011-04-20 09:48:36
      # --------------------------------------------------------

    It chokes on the first line of the file, which is commented out. Why is this happening? I didn't have a problem loading data from SQL files until I changed the character set and collation of the database. I came up with an ugly workaround by performing the following steps:

    1. Export the database as a single SQL file using HeidiSQL.
    2. Open the resulting file in Notepad++ and convert it from ANSI to UTF-8 encoding.
    3. Create a new empty file in Notepad++, paste in the UTF-8 text, and save the file normally.

    What am I missing here?
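    One thing worth checking: Notepad++'s plain "UTF-8" conversion adds a byte-order mark, and three BOM bytes in front of the first # are a classic cause of a 1064 error at line 1 (the server sees ?#). A quick check and fix from a shell on the import host, assuming GNU sed:

      head -c 3 dump.sql | hexdump -C          # ef bb bf means a BOM is present
      sed -i '1s/^\xEF\xBB\xBF//' dump.sql     # strip it before importing

    Notepad++'s "Convert to UTF-8 without BOM" option achieves the same thing on the Windows side.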

    Read the article

  • What is the difference between the BIN files generated by ImgBurn and UltraISO

    - by user275517
    I have a CD that I would like to generate a BIN file from (with a CUE file to accompany it). I used ImgBurn and UltraISO to generate two BIN files. However, I have found that the BIN files generated by these programs are not identical (they have different file sizes). So, what is the difference between the BIN files these programs produce, and which one should I use to back up the CD? The same applies to ISO file generation by these two programs: the file sizes do not match.

    Read the article

  • Granting read-write rights to my web application on VPS

    - by davykiash
    I am currently testing a bulk CSV import feature of my web application, and I came across this error:

      The given destination is not writeable

    My application is Zend-based and uses the MVC structure:

      application
      -- uploads
      library
      -- Zend
      public
      -- index.php

    What Ubuntu command do I execute to safely grant the necessary rights to the uploads folder in my web application?
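    A common sketch for this on Ubuntu, assuming Apache runs as www-data and the application lives under /var/www/myapp (both assumptions; adjust to the real paths and web server user):

      # Give the web server ownership of just the uploads folder,
      # letting owner and group write while others can only read:
      sudo chown -R www-data:www-data /var/www/myapp/application/uploads
      sudo chmod -R 775 /var/www/myapp/application/uploads

    Restricting this to the uploads directory, rather than the whole docroot, keeps the rest of the code read-only to the web server.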

    Read the article

  • Git push write access for deployment denied

    - by Stepchik
    I get a strange issue when I try to git push; git clone and commit work fine:

      W access for my_project DENIED to deploy_my_project_

    My gitolite.conf:

      repo my_project
          R   = deploy_my_project_111
          RW+ = my_name

    I wonder why git push is performed as the wrong user (deploy_project_111), which has only read access. The error is intermittent. Twice I have had to change my RSA key (the keys themselves had not changed) and restart the computer. Maybe my computer is doing something wrong.
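    Gitolite identifies users purely by the SSH key the client offers, so a push can silently go out as a different user if ssh picks another key. Gitolite's built-in info command shows which user the server thinks you are (git@yourserver is an assumed remote; substitute your host):

      # Prints "hello <user>..." plus the repos and permissions for that key:
      ssh git@yourserver info

      # See which identity files ssh actually offers during the connection:
      ssh -v git@yourserver 2>&1 | grep -i offering

    If info greets you as deploy_my_project_111, an ~/.ssh/config entry pinning IdentityFile for that host usually fixes it.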

    Read the article

  • Removing write permission on home and public_html on CentOS/cPanel

    - by user5858
    I'm running sites under two cPanel accounts on my VPS, managed through WHM, using the DSO PHP handler and the Apache web server. After recent intrusion attempts, I've chowned $HOME and the public_html folder to root with permission 555. I'm on a CentOS VPS with cPanel, running CMS software such as Joomla, Drupal, and MyBB, and these will not be affected by the change. Some files, such as error_log, will not be created, but at least hackers will not be able to place any malicious code within the home folder or the public_html folder. Will this cause any problems for my VPS installation or server-side processes?

    Read the article

  • Gifsicle: How to set it to not overwrite the original GIF file if the resulting modified GIF file is larger than the original?

    - by galacticninja
    About Gifsicle: Gifsicle is a command-line tool for creating, editing, and getting information about GIF images and animations. One of its features is (from its website):

      Optimize your animations! This stores only the changed portion of each frame, and can radically shrink your GIFs. You can also use transparency to make them even smaller. Gifsicle's optimizer is pretty powerful, and usually reduces animations to within a couple bytes of the best commercial optimizers.

    I call Gifsicle through this .BAT file in the right-click 'Send to' menu:

      @echo off
      :compressFile
      "C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe" --batch -V -O3 %1%
      echo.
      echo.
      SHIFT
      if exist %1% goto compressFile
      PAUSE

    However, for this animated GIF file (http://i.minus.com/i7WdodY5Zwot3.gif), optimizing with Gifsicle via the above commands produces a larger GIF file, and Gifsicle overwrites the original with the larger result. Initial file size: 7.57 MiB (7,942,886 bytes). After running through the above commands: 7.64 MiB (8,017,622 bytes). Is there a way to prevent Gifsicle from overwriting the original file if its output file is larger than the original, while still overwriting the original file if the output file is smaller?

    Details:
    OS: Windows 7
    Gifsicle version: 1.63, from the binary provided here: http://www.lcdf.org/gifsicle/
    Gifsicle manual
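    As far as I can tell, Gifsicle itself doesn't compare sizes, but the wrapper script can: write the optimized output to a temporary file with -o instead of modifying in place with --batch, then keep whichever file is smaller. A hedged batch sketch (the .tmp suffix is arbitrary):

      @echo off
      :compressFile
      "C:\Programs\Compression Scripts\gifsicle\bin\gifsicle.exe" -O3 "%~1" -o "%~1.tmp"
      for %%f in ("%~1") do set orig=%%~zf
      for %%f in ("%~1.tmp") do set new=%%~zf
      rem Keep the optimized copy only when it is actually smaller.
      if %new% lss %orig% (move /y "%~1.tmp" "%~1") else (del "%~1.tmp")
      SHIFT
      if exist "%~1" goto compressFile
      PAUSE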

    Read the article

  • I am trying to write an htaccess file that performs authentication and redirects authenticated users to a subdirectory

    - by racl101
    This is what I have so far, but I can't get the RewriteCond and RewriteRule right:

      RewriteEngine On
      RewriteCond %{LA-U:REMOTE_USER} (\d{3})$
      RewriteRule !^%1 http://subdomain.mydomain.com/%1 [R,L]
      AuthName "My Domain Protected Area"
      AuthType Basic
      AuthUserFile /path/to/my/.htpasswd
      Require valid-user

    This is what I intend the RewriteCond and RewriteRule to say: "If the REMOTE_USER has a username ending in 3 digits, capture the three digits that match, and if the URL being requested does not start with those 3 captured digits, redirect to the subdirectory whose name equals those three digits." In other words, if a user named 'johnny202' is authenticated and requests any directory other than http://subdomain.mydomain.com/202/, he should be redirected to http://subdomain.mydomain.com/202/. The only thing I can think of that is wrong is the first instance of '%1'.
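    That suspicion is right: %N backreferences are only expanded in a RewriteCond's TestString and in a RewriteRule's substitution, never inside the match patterns, so !^%1 is compared literally. One untested sketch that works around this by joining the URI and the user into a single TestString and using a regex backreference (\1) inside that one pattern:

      RewriteEngine On
      # Fails (and so allows the redirect) when the URI does not already
      # start with the 3 digits that end the username:
      RewriteCond %{REQUEST_URI}@%{LA-U:REMOTE_USER} !^/(\d{3})/.*@.*\1$
      # Capture the user's digits last so %1 is defined for the substitution:
      RewriteCond %{LA-U:REMOTE_USER} (\d{3})$
      RewriteRule ^ http://subdomain.mydomain.com/%1/ [R,L]

    This is a sketch only; mod_rewrite's LA-U lookahead and negated-pattern behavior are worth verifying with RewriteLog before relying on it.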

    Read the article

  • rkhunter warns of inode changes but no file modification date changes

    - by Nicholas Tolley Cottrell
    I have several systems running CentOS 6 with rkhunter installed, and a daily cron job that runs rkhunter and reports back via email. I very often get reports like:

      ---------------------- Start Rootkit Hunter Scan ----------------------
      Warning: The file properties have changed:
      File: /sbin/fsck
      Current inode: 6029384  Stored inode: 6029326
      Warning: The file properties have changed:
      File: /sbin/ip
      Current inode: 6029506  Stored inode: 6029343
      Warning: The file properties have changed:
      File: /sbin/nologin
      Current inode: 6029443  Stored inode: 6029531
      Warning: The file properties have changed:
      File: /bin/dmesg
      Current inode: 13369362  Stored inode: 13369366

    From what I understand, rkhunter will usually also report a changed hash and/or modification date on the scanned files, so this leads me to think that there is no real change. My question: is there some other activity on the machine that could make the inode change (running ext4), or is this really yum making regular (~once a week) changes to these files as part of normal security updates?
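    A way to confirm the yum theory before trusting the warning (the util-linux-ng name is an example; rpm -qf prints the real owner):

      # Which package owns the file, and does its content still verify?
      rpm -qf /sbin/fsck            # e.g. util-linux-ng on CentOS 6
      rpm -V $(rpm -qf /sbin/fsck)

      # Check yum's history for a recent update that shipped the file:
      yum history list all | head

      # If the changes turn out to be legitimate updates, re-baseline rkhunter:
      rkhunter --propupd

    rpm -V printing nothing means the installed file still matches the package, which points at an update (new file, new inode) rather than tampering.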

    Read the article

  • Terminal Server/Win2K3: Users can't write to their My Documents or Temp folders

    - by Tim Sullivan
    I have a situation where a number of users run our software on a Terminal Services machine on Windows Server 2003. I've removed most permissions from the Users group, but made sure they all have the appropriate permissions for their own application folders, as well as for their Documents and Settings folders. For some reason, even though everything seems to be set up properly, users can't create or delete files in their My Documents, Temp, or other Documents and Settings folders. What could be going on? I thought this was going to be straightforward, but clearly it's not! :-) Thanks for any help!

    Read the article

  • java -version doesn't write to stdout?

    - by Zárate
    Hi there. Either I'm doing something silly or Sun is. How come something like:

      java -version > version.txt

    still prints to stdout and leaves version.txt empty? I've checked the exit code, and it's 0, so it's not that it's writing to stderr. I need this because I'm building a test-environment tool and want to check whether the version of Java is adequate. I was planning to capture that version output, but now I'm stuck. I'm on OS X Leopard, Java version 1.6.0_20. Any ideas? Cheers, Juan
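    In fact the JVM writes the -version banner to stderr regardless of the exit code (a zero exit status says nothing about which stream was used). Redirecting stream 2 shows this:

      # Capture the banner by redirecting stderr instead of stdout:
      java -version 2> version.txt
      cat version.txt

      # Or fold stderr into stdout for a pipeline:
      java -version 2>&1 | head -n 1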

    Read the article

  • How to read and write Mac drives on Windows

    - by Svish
    I have some external hard drives that are Mac OS Extended (Journaled) formatted. What software can you recommend for working with those drives under Windows? Do you have any experience with this? It would be best if the software were free, but it doesn't have to be. I hope someone can help!

    Read the article

  • Alternative Windows Offline Files + Windows Backup + Previous Version Setup

    - by Herson
    Currently our documents are all hosted on a Windows 7 box. Users access the files over a Windows share, and the documents are available offline (a Windows 7 feature). The documents are backed up daily by the Windows 7 Backup and Restore utility, and users can access previous versions of a file (from the backups) using Windows Explorer's "Previous Versions" feature. This setup is currently working well, except for the following:

    1. We would prefer to have access to hourly versions of each file, not daily.
    2. The previous-versions mechanism is tied to the backup mechanism. Windows 7 performs a full backup every week and an incremental backup every day, and the previous versions of a file are whatever is available in those backups. If you have 20GB of documents and want to maintain at least three (3) years of history, you will use at minimum 3 years * 52 weeks * 20GB, or about 3TB, even if there are few changes to the documents. That is a pretty inefficient use of space.
    3. Looking up previous versions of a file is very slow (tens of minutes). This is probably related to the previous issue: Windows has to traverse all of its backups.

    I am considering using SVN plus TortoiseSVN with auto-commit/auto-update. It would have the following advantages:

    1. Backups are easy and also capture the whole history of each document (just back up the repository).
    2. Previous versions can be created frequently; I think the svn commit/update cycle can be run every two minutes or so.
    3. Users can sync over the net.

    However, I can see the following issues:

    1. More conflicts than in the original setup, because multiple users can now edit the same file even when both are online, i.e. able to reach the SVN repo. Users could of course lock a file before editing, but that would mean they have to adjust.
    2. Delay in propagation of file changes. With Windows 7 file sharing, changes made by one online user are instantly visible to other online users. With the SVN setup, changes propagate only when users run the svn add/commit/update sequence, so the delay will probably be a few minutes. This workflow will no longer work: "Hi, I just edited document X, can you have a quick look?"

    I would like to ask the community's opinion on alternative setups, or improvements to the above setups to work out the kinks.
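    For concreteness, the auto-commit pass being considered could be a scheduled batch job along these lines (C:\docs and the schedule are assumptions; run it from Task Scheduler against each user's working copy):

      @echo off
      rem Pick up new files, publish local edits, pull in everyone else's.
      cd /d C:\docs
      svn add --force --quiet .
      svn commit --quiet -m "autocommit %date% %time%"
      svn update --quiet --accept postpone

    --accept postpone leaves conflicts in the working copy for the user to resolve, which is where the locking discipline mentioned above comes in.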

    Read the article

  • Overwriting output to a text file

    - by Naveen Gamage
    I'm trying to write the wget command's output to a text file, but it always appends to the file.

      #!/bin/sh
      download() {
          local url=$1
          echo -n " "
          wget --progress=dot $url 2>&1 | grep --line-buffered "%" | \
              sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
          echo " DONE"
      }

      file="$1"
      echo -n "Downloading $file:"
      download "$file" > file.log

    I tried using >, but it won't work. Where am I going wrong?
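    For reference, the two redirection operators behave differently, which is easy to check in a scratch directory:

      # '>' truncates the target file before writing; '>>' appends to it:
      echo first  > file.log    # file.log now contains only "first"
      echo second >> file.log   # file.log now contains "first" and "second"

    So download "$file" > file.log should truncate file.log on each run; if the file keeps growing across runs, something else (another process, or a wrapper around the script) is appending to it.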

    Read the article
