Search Results

Search found 38755 results on 1551 pages for 'locked files'.


  • Eclipse Juno Switch Editor in Order

    - by inspectorG4dget
    In case it matters: OS: Mac OS X Lion (10.7.4); Eclipse: Juno, Build id: 20120614-1722. I have several files open in my Eclipse workspace as tabs. The default shortcuts for the previous and next editor are ⌘F6 and ⌘⇧F6. I know how to change these shortcuts; that's not the issue. What I want is to switch between editors in the order in which they appear in the tab bar. Currently, the editors cycle in order of last used/viewed. So, if I have three files open (A, B and C, in that order), I'm currently editing A, and I edited B last, then the shortcut for "Previous Editor" takes me to B instead of C (and vice versa). Is there any way for me to get this behaviour out of Eclipse (and if so, how)? Thank you

    Read the article

  • External Hard Drive needs format problem

    - by Saher
    I recently bought a new ADATA Classic external hard drive (500GB). I had transferred around 29GB of data onto it by the time I installed my new Windows 7 operating system. After some work with the hard drive (copying / deleting ... files), I disconnected it for some reason, and now it won't open again; Windows asks me to format it. I don't want to format the hard drive, because there is important data on it that I need... Is there a way I can retrieve my data? Is the Recover My Files program from GetData a good choice? Part 2 of my question: why might such a thing happen (requiring a format before the drive will open)? Is it a problem with the hard drive itself, or just a corrupted file or folder? Thanks,

    Read the article

  • Optimize Windows file access over network

    - by Djizeus
    At my company I frequently need to access shared files over a Windows network. These files are located on the other side of the planet, so I guess the file share goes through some kind of VPN over the Internet, but I don't control this and it is supposed to be "transparent" to me. However, it is extremely slow: displaying the contents of a directory in the file explorer takes about 10s. Even over the Internet, I did not expect that retrieving a list of file names would take that long. Are there any settings to optimize this from my Windows XP workstation, or is it mostly down to the way the network is configured? The only thing I have found so far is to cache all file names, whereas by default only short file names are cached (http://support.microsoft.com/kb/843418).

    Read the article

  • Use System Restore to rescue lost user profile in Win XP?

    - by im_chc
    Hi! My Windows XP account profile has recently been "reset". Many app settings are lost; for example, the "recent projects" list in VS 2005 is empty. There are probably lots of other things that have been painfully lost without me even knowing! What can I do? Can I retrieve the app settings from System Restore? I don't have much confidence in this utility, even though I think that restoring to a point when the profile still worked and then backing up C:\Documents and Settings (is that where all the app-setting files are located?) should work... Is it reliable to restore to a previous restore point and then go back to the latest one afterwards? I've googled System Restore, and it looks like all the utility does is back up some physical files and restore them when you perform a System Restore. That sounds quite safe, but I'm still uncomfortable with it. Thanks for your help in advance!

    Read the article

  • What statistics app should I use for my website?

    - by Camran
    I have my own server (with root access). I need statistics on the users who visit my website, etc. I have looked at an app called Webalizer... Is this a good choice? I run apache2 on an Ubuntu 9 system. If you know of any good statistics apps for servers, please let me know. And a follow-up question: all statistics are saved in log files, right? So how large would these log files become? The possibility to split them would be good; I don't know if this is possible with Webalizer though...

    Read the article

  • Can Visual Studio track the "size" or "severity" of my changes in TFS?

    - by anaximander
    I'm working on a sizeable project using VS2012 and TFS (also 2012, I think - I didn't set up the server). A lot of my recent tasks have required making very small changes to a lot of files, so I'm quite used to seeing a lot of items in my Pending Changes list. Is there a way to have VS and/or TFS track how much has been changed and let me know when the differences are becoming significant? Similarly, is there a way to quickly highlight where the major changes are when you get the latest version from TFS? It'd really help with tracking down where certain changes have been made without having to go through and compare every file - the difference highlighting tool might be nice, but when you have to use it on a dozen files to find the block you're looking for, you start to wonder if there's a faster way...

    Read the article

  • Change Windows 7 Explorer's Details Pane limits

    - by Paul
    For some reason, MS decided to completely kill the status bar's functionality in Win7 (and maybe Vista, but I don't know for sure). I have tried all possible options such as Classic Shell and so on. Basically, the one thing I miss most is seeing at a glance the total size of my selected files. I know I can press Alt+Enter or whatever, but that's not the point. The point is that the so-called 'details' pane stops providing details if more than 15 files are selected! WTH? I cannot understand the reason behind such an arbitrary limit, which doesn't seem to be user-configurable at all. Anyway, what I'm looking for is a way to change that limit, either via the registry or otherwise. Is this at all possible?

    Read the article

  • Increase Volume of an MKV Video from Linux Terminal

    - by The How-To Geek
    I've got a large number of .MKV video files which all seem to play at a very low volume - I end up having to turn the TV up all the way to hear them, which is really irritating when I switch to another channel and wake the dead because it's so loud. What I'm looking for is a command-line method to increase the volume (so I can run it on all of them quickly) that would hopefully work regardless of the audio codec in use in any particular file. (I don't mind hard-coding the output audio codec though.) For reference, I'm using Ubuntu 9.04 on my server, and the files are being played back with Boxee on a Mac Mini, but the volume problem is the same on Windows too.
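
    In case it helps, one common approach is to process each file with ffmpeg, copying the video stream untouched and re-encoding only the audio with a gain filter. Below is a minimal Python sketch of that idea; the 6 dB gain, the directory path, and the AAC output codec are assumptions to adjust, and it presumes a reasonably recent ffmpeg is installed on the server.

        import subprocess
        from pathlib import Path

        GAIN_DB = 6                          # assumed gain; tune per collection
        SRC_DIR = Path("/srv/media/videos")  # placeholder directory

        for src in SRC_DIR.rglob("*.mkv"):
            dst = src.with_name(src.stem + ".louder.mkv")
            # Copy the video stream as-is; apply a volume filter while re-encoding the audio.
            subprocess.run([
                "ffmpeg", "-i", str(src),
                "-c:v", "copy",
                "-af", f"volume={GAIN_DB}dB",
                "-c:a", "aac",
                str(dst),
            ], check=True)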

    Read the article

  • Best practice for administering a (hadoop) cluster

    - by Alex
    Dear all, I've recently been playing with Hadoop. I have a six-node cluster up and running with HDFS, and have run a number of MapReduce jobs. So far, so good. However, I'm now looking to do this more systematically and with a larger number of nodes. Our base system is Ubuntu, and the current setup has been administered using apt (to install the correct Java runtime) and ssh/scp (to propagate the various conf files). This is clearly not scalable over time. Does anyone have any experience of good systems for administering (possibly slightly heterogeneous: different disk sizes, different numbers of CPUs on each node) Hadoop clusters automagically? I would consider diskless boot, but I imagine that with a large cluster, getting it up and running might be bottlenecked on the machine serving the OS. Or some form of distributed Debian apt to keep the machines' native environments synchronised? And how do people successfully manage the conf files across a number of (potentially heterogeneous) machines? Thanks very much in advance, Alex

    Read the article

  • large RAID 10 vs small RAID1

    - by user116399
    The machine will store and serve millions of small files (<15Kb each), and all those files require a total storage space of 400G. Considering the exact same SATA hard drive maker and model, in the exact same environment (OS, CPU, RAM, RAID controller, etc...), which one of the setups below would be faster? A) RAID 1 with 2 drives of 2T each, making a total storage of 2T. B) RAID 10 with 4 drives of 2T each, making a total storage of 4T. [EDIT]: I'm aware RAID 10 is faster than RAID 1. But the larger the disk, at least in theory, the longer it will take to do seeks/writes. So, will the performance gain of RAID 10 be outweighed by the "drag" caused by the larger disk area when seek/write operations happen?

    Read the article

  • Add folder name to beginning of filename - getting multiple renames

    - by Flibble Wibble
    I've used dbenham's excellent response to the question of how to add the folder name to the beginning of a filename in a cmd script:

        @echo off
        pushd "Folder"
        for /d %%D in (*) do (
            for %%F in ("%%~D\*") do (
                for %%P in ("%%F\..") do (
                    ren "%%F" "%%~nxP_%%~nxF"
                )
            )
        )
        popd

    What I'm finding is that, seemingly at random (though it probably isn't), the script runs through several child folders and renames correctly, but then it reaches a folder where it gets stuck in a loop and starts adding the folder name repeatedly to the files inside. I have 90,000 files in 300 folders to rename this weekend. Any chance you can guess the cause? PS: Is there a maximum number of files that is acceptable in each folder?
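
    One plausible cause (not confirmed here) is that the plain for %%F loop enumerates the directory while files are being renamed, so on some file systems an already-renamed file is picked up again and prefixed a second time. As a workaround, here is a hedged sketch of the same rename done in Python: it snapshots each folder's listing up front and skips names that already carry the prefix. The root path is a placeholder.

        from pathlib import Path

        ROOT = Path(r"C:\path\to\Folder")   # placeholder for the folder used in the batch script

        for subdir in (d for d in ROOT.iterdir() if d.is_dir()):
            prefix = subdir.name + "_"
            # list() takes a snapshot, so renames cannot be re-enumerated mid-loop
            for f in list(subdir.iterdir()):
                if f.is_file() and not f.name.startswith(prefix):
                    f.rename(subdir / (prefix + f.name))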

    Read the article

  • Asymmetric encryption of a directory

    - by ftiaronsem
    Hello all, I am currently wondering whether it is possible to apply asymmetric encryption to a directory in Linux. I would like to achieve the following: write log files to /var/log/secret, and have everything written to /var/log/secret instantly encrypted with a public RSA key (or something similar). The encryption programs I know of, e.g. eCryptfs, do not support asymmetric encryption of files, at least as far as I know (correct me if I am wrong). Therefore I am asking here whether you know of any way to implement this. Thanks in advance
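
    Not from the original thread, but one common pattern that comes close is to keep writing plain log files and encrypt each finished (e.g. rotated) file with the recipient's public key via GnuPG, deleting the plaintext afterwards. It does not encrypt data the instant it is written, only once a file is closed. Below is a hedged Python sketch of that idea; the directory, the key id, and the file-selection pattern are placeholders, and it assumes gpg is installed with the public key already imported.

        import subprocess
        from pathlib import Path

        LOG_DIR = Path("/var/log/secret")   # placeholder directory
        RECIPIENT = "logs@example.com"      # placeholder key id

        for log in LOG_DIR.glob("*.log.1"): # only rotated files no longer being written to
            encrypted = log.parent / (log.name + ".gpg")
            # Encrypt to the recipient's public key; only the private-key holder can decrypt.
            subprocess.run(
                ["gpg", "--batch", "--yes", "--encrypt",
                 "--recipient", RECIPIENT, "--output", str(encrypted), str(log)],
                check=True,
            )
            log.unlink()                    # remove the plaintext once encrypted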

    Read the article

  • Is there a tool for verifying the contents of a Zip archive against the source directory's contents?

    - by Basil
    Here's the scenario: I create a ZIP archive using some GUI package like WinZip, 7-Zip or whatever by right-clicking on a directory "somename" and selecting "Compress to archive 'somename.zip'". When the archive is completed, I open it and discover that some files don't exist in the archive (for reasons yet unknown). I want to find all the files that are missing from the archive without having to extract it to another directory, do a directory diff, and so on. So: is there a tool (GUI or command-line, standalone or built into a compressor, for Windows or Linux, I don't care) that can walk through an archive and compare its contents against a directory on the filesystem?
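
    For what it's worth, a short script can do this kind of check directly, because a zip archive carries a complete name list. Below is a hedged Python sketch that reports files present on disk but absent from the archive; the paths are placeholders, and it compares names only, not contents.

        import zipfile
        from pathlib import Path

        ARCHIVE = Path("somename.zip")   # placeholder paths
        SOURCE = Path("somename")

        with zipfile.ZipFile(ARCHIVE) as zf:
            # Drop trailing slashes so directory entries do not confuse the comparison.
            in_zip = {name.rstrip("/") for name in zf.namelist()}

        for path in sorted(p for p in SOURCE.rglob("*") if p.is_file()):
            rel = path.relative_to(SOURCE).as_posix()
            # Archives may store names with or without the top-level folder, so try both.
            if rel not in in_zip and f"{SOURCE.name}/{rel}" not in in_zip:
                print("missing from archive:", path)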

    Read the article

  • Losing file permissions after rebooting Windows 7

    - by SMTF
    I have a user directory full of files that are not accessible, permission-wise, to the user whose home directory it is. Said user can't even run Explorer: it gives an error complaining that it does not have permission to access required files. I have tried various ways to give the user permission to his home directory, and things are fine until the machine is rebooted; then the permissions revert to the previous state and the problem returns. I followed the solution outlined here, and again things worked until I rebooted the machine. I'm in this mess because I replaced a corrupted user profile as outlined here. The original user and the new replacement are (were) both admin accounts. In case it is relevant, I will mention that the Users directory is not on the C volume but on a D volume on the same machine. Any insight is appreciated.

    Read the article

  • How to enable winhlp on Windows 7 64-bit?

    - by BGM
    Greetings! I just discovered that winhlp32.exe won't run on Windows 7 64-bit. I can't run the application, and I can't open .hlp files either (.chm files open fine). How do I make this work? I have downloaded the Microsoft fix here and restarted my computer, but to no avail. I can see the file winhlp32.exe in my c:\windows directory, but I cannot run it. When I do run it, I get Windows' own "Help and Support" window entitled "Why can't I get Help from this program?", which sends me to the link above! How can I make it work?

    Read the article

  • What will Time Machine do when

    - by Joel Budgor
    When Time Machine says "I will delete the oldest files first", does it mean this literally? Here is a theoretical example. Source drive: 300 GB, consisting of one 280 GB file and one 1 GB file. Backup drive: 300 GB. The initial backup will back up both files, using 281 GB. If I modify the 1 GB file 21 times, what will Time Machine do when I run out of room on the backup drive: delete the original 280 GB file because it is the oldest file, or delete the oldest version of the file I have modified 21 times? I hope it would delete the oldest version of the file I have modified 21 times, but I want to be sure. Thanks, Joel Budgor

    Read the article

  • Write permissions denied on linked tables between MS Access 2003 and 2007

    - by STEVE KING
    We are in the process of switching over to Access 2007. We have numerous data tables in Access 2003 files. In one case, the user has 2007 on his PC and opened the front end in 2007. No problems. When the user is done, he clicks a button that executes a macro full of update queries. The macro reaches the first query and halts, and we get a message saying we do not have permission to write to this linked table (2003 format). There were no security files involved. We re-linked from 2007: same problem. LAN permissions were OK. I wound up having to import the tables into the front end in order for the user to be able to do his job.

    Read the article

  • Prioritize file sharing performance in Windows Server 2008

    - by cmbrnt
    I've got a server running Windows Server 2008, and use it mainly for sharing files throughout the domain from a number of disks. It's running on VMware ESXi 4.0, in case that matters. My problem is that when I log in to the server to check user permissions etc., access to the files on the remote disks almost grinds to a halt. I haven't been able to measure the speeds, but I would guess it slows down to about 100kB/s as soon as I log in. This is on a gigabit network, and the problem is the same for all users, even the ones connected to the same switch as the server. I've assigned 2 GB of RAM to the server and reserved 1.5 GHz of processor power for it. I don't have to do anything special on the server for this slowdown to occur. How can I make sure file sharing is prioritized on the server, so that no matter what applications I'm running, file sharing always works properly? Could this be a VMware issue?

    Read the article

  • Windows 8 on iSCSI with LIO target: thin provisioning

    - by LubosD
    I have installed Windows 8.1 on an iSCSI target. This target is provided by Linux LIO and is backed by a sparse file. One of the reasons I created such an installation was thin provisioning: in other words, when I free disk space on Windows, LIO should punch holes into the file, thus freeing storage space on the Linux server as well. I have checked my kernel's sources, and the SCSI UNMAP command really is supported for file-backed targets. On the other hand, deleting files on Windows doesn't lower the amount of space taken by the backing file on Linux (checked with du); the backing file actually sometimes grows even more. Some sources on Google say Win8 should support UNMAP/DISCARD on iSCSI, but even in Wireshark I only see ordinary read and write commands when files are being deleted. Any way to fix or troubleshoot this?

    Read the article

  • Laptop crashes when connecting to external harddisk

    - by Gnot
    I recently had a problem with my laptop: when I booted up the machine, I would get a SMART failure error message, and when I pressed F1 to continue, it would take a very long time to boot and then come back to the same error message again. Thinking that my hard disk was dying, I bought a new hard disk and installed it in my laptop, so now my laptop is fine. However, I need to recover data from that old hard disk, so I bought an external hard disk enclosure, placed the old hard disk in the enclosure, and connected it to my laptop via USB. The first few times I connected it, I could see the files on the old hard disk and managed to copy some files over, although the transfer was extremely slow. But now, whenever I connect the old hard disk, my laptop crashes and re-boots after a few minutes. Do you think my old hard disk is dead beyond repair? Or can you offer some help here? Any assistance would be appreciated!

    Read the article

  • Permission denied when trying to execute a binary burned to a CD-R

    - by user16654
    On an Ubuntu 9.10 (Karmic Koala) machine, I burned a CD from the command prompt using: cdrecord -v speed=16 dev=0,1,0 /FPS.iso. The CD now contains an executable and some files. I tested the CD by loading it into another machine (Red Hat 5.3), and when I try to run the program I get the following message: bash: ./FPS1_1: Permission denied. I can open other files, like text documents (the executable also comes with shared libraries). I realized I had burned the CD as root, so I burned another one as another user, but I still have the same problem. How can I remove this permission restriction, or what is the problem? P.S. The image was in / if that helps.

    Read the article

  • How could I compress a folder into split archives (individual ZIPs)?

    - by Shiki
    I have to compress folders into ZIP packages, but the size is limited: only about 10-15 MB may be used per package. Every major application comes with a "Split archive to..." option, which does what I want... except that I can't uncompress the parts one by one. (You need them all, and then use the .7z, .rar, or .zip file to uncompress.) Here is an example: FolderX is 35 MB. That makes 4 packages, 4 zip files. The normal split function would give me: folderx.zip, folderx.zip.001, folderx.zip.002, folderx.zip.003. What I really need is: folderx_1.zip, folderx_2.zip, folderx_3.zip, folderx_4.zip (individually uncompressable files/packs). I can code this into an app myself, but it's a waste of time if such a utility already exists.
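
    In case the "code it myself" route ends up being necessary, here is a hedged Python sketch: it walks a folder and starts a new, self-contained zip whenever the current one would exceed a size budget. The 15 MB limit and the paths are placeholders, it assumes no single file exceeds the budget, and it approximates archive size by the files' uncompressed sizes.

        import zipfile
        from pathlib import Path

        SOURCE = Path("FolderX")      # placeholder folder to package
        LIMIT = 15 * 1024 * 1024      # rough per-archive budget, in bytes

        part, used, zf = 0, 0, None
        for f in sorted(p for p in SOURCE.rglob("*") if p.is_file()):
            size = f.stat().st_size
            # Open a new archive if none is open yet or the budget would be exceeded.
            if zf is None or used + size > LIMIT:
                if zf:
                    zf.close()
                part += 1
                zf = zipfile.ZipFile(f"{SOURCE.name.lower()}_{part}.zip", "w",
                                     zipfile.ZIP_DEFLATED)
                used = 0
            zf.write(f, f.relative_to(SOURCE))
            used += size
        if zf:
            zf.close()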

    Read the article

  • Is there a way to create a copy-on-write copy of a directory?

    - by BCS
    I'm thinking of a situation where I would have something that creates a copy of a directory, tweaks a few files, and then does some processing on the result. This would be done fairly often, maybe a few dozen times a day. (The exact use case is testing patch submissions: dupe the code, patch it, build/test/report/etc.) What I'm looking for could be done by creating a new directory structure and populating it with hard links from the original. However, this only works if all the tools you use delete and recreate files rather than edit them in place. Is there a way to have the file system do copy-on-write for a file? Note: I'm aware that many file systems use COW at a block level (all updates are done via writes to new blocks), but this is not what I want.
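
    One possibility, offered here as a suggestion rather than anything from the original thread: reflink-capable file systems such as Btrfs expose exactly this kind of per-file copy-on-write, and GNU cp can request it. A minimal sketch, assuming the tree lives on such a file system and that the paths are placeholders:

        import subprocess

        # --reflink=always makes cp fail loudly if the file system cannot do COW copies,
        # so a silent fallback to a full physical copy never happens unnoticed.
        SRC = "/srv/code/baseline"        # hypothetical pristine checkout
        DST = "/srv/code/patch-test-42"   # hypothetical per-patch working copy
        subprocess.run(["cp", "-a", "--reflink=always", SRC, DST], check=True)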

    Read the article

  • One subdomain is not working

    - by BFTrick
    Hello there, My main domain works just fine (www.example.com), and a subdomain set up by another developer works as well (sub1.example.com). But when I try to set up another subdomain, I go through the process and everything seems to work: the software creates the default files where the subdomain files should go. But when I try to browse there, it doesn't work. My host uses Plesk for all of the hosting tasks. What do you think the problem is? I doubt it is some sort of cache issue, because I had problems on my phone, which I tried after having problems on the PC. Maybe Plesk needs time to set this up for some reason? I have used cPanel before, and that works instantly.

    Read the article
