Search Results

Search found 97876 results on 3916 pages for 'user folder'.


  • Backup Dropbox to Amazon Glacier

    - by joekr
    I'm using Dropbox for backup, which means I keep all my files in my Dropbox folder (encrypted using encfs, but that should not be relevant). I like this solution because it is automatic and keeps copies of my files on several machines at different locations. The only thing I can see going wrong is Dropbox having some sort of bug that tells all my machines to delete the files. So currently I back up the Dropbox folder to an external hard drive. With Amazon Glacier it seems affordable to automate backup snapshots of my Dropbox. What I am looking for is a tool that will do this for me - the best-case scenario would be that files go from Dropbox (using their API) directly to Amazon, as uploading the ~80GB from my home connection would take forever... Thanks!
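
    A minimal sketch of automating this, assuming the Dropbox contents are first mirrored onto an always-on machine (an EC2 instance would keep the ~80GB off the home connection) and then archived with the AWS CLI; the vault name, staging path, and timestamp file are all illustrative:

        # Mirror the Dropbox folder into a staging area, then push anything new into a
        # Glacier vault (vault name is made up). Bootstrap once with: touch ~/.last-glacier-run
        rsync -a --ignore-existing ~/Dropbox/ ~/glacier-staging/
        find ~/glacier-staging -type f -newer ~/.last-glacier-run -print0 |
        while IFS= read -r -d '' f; do
            aws glacier upload-archive --account-id - --vault-name dropbox-backup \
                --archive-description "$f" --body "$f"
        done
        touch ~/.last-glacier-run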

    Read the article

  • Run Bash Script on Another Server

    - by psce
    I want to run the commands one by one to change the names of directories on the servers. When I run the script, the directories are renamed on server 1, but on server 2 the directories are not found. What could the error be in the script? The script:

        #!/bin/bash
        mach_directory=/home/user/example
        erase_dir1=cache
        erase_dir2=tmp
        for i in {0..10}
        do
          user=user
          server=$(ssh $user@server$i hostname)
          ssh $user@$server find $mach_directory -type d -name $erase_dir1 ! -path "*Admin/$erase_dir1*" -print0 | while IFS= read -r -d '' file ; do mv "$file" "${file}_$(date +%d%m%Y)"; done
          ssh $user@$server find $mach_directory -type d -name $erase_dir2 ! -path "*Admin/$erase_dir2*" -print0 | while IFS= read -r -d '' file ; do mv "$file" "${file}_$(date +%d%m%Y)"; done
        done
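
    For what it's worth, a plausible cause offered as a hedged guess: the pipe after each ssh command is parsed by the local shell, so find runs on the remote host but the while/mv loop runs on the machine the script is started from, which would explain why only server 1 ends up with renamed directories. A sketch with the whole pipeline quoted so it executes remotely:

        #!/bin/bash
        # Hedged sketch: run the entire find | while pipeline on the remote host
        # by passing it to ssh as one quoted command string.
        mach_directory=/home/user/example
        for i in {0..10}; do
          user=user
          server=$(ssh "$user@server$i" hostname)
          for dir in cache tmp; do
            ssh "$user@$server" "find $mach_directory -type d -name $dir ! -path '*Admin/$dir*' -print0 |
              while IFS= read -r -d '' file; do mv \"\$file\" \"\${file}_\$(date +%d%m%Y)\"; done"
          done
        done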

    Read the article

  • MySQL mistake with grant option

    - by John Tate
    From reading the MySQL documentation, I am unsure whether creating a user WITH GRANT OPTION gives them the power to create users and grant privileges, or to change the privileges on other users' databases. I have been creating databases for users like this:

        CREATE DATABASE user;
        USE user;
        GRANT ALL PRIVILEGES ON *.* TO 'user'@'localhost' IDENTIFIED BY 'password' WITH GRANT OPTION;

    Is this the best way of doing it, or have I just given my users too much control? They are people I am hosting sites for; thankfully, at this point they are trustworthy. I use quotas. Edit: I have realized I have been granting users access to all databases. This is obviously a mistake; I should be using this:

        GRANT ALL PRIVILEGES ON database.* TO 'user'@'localhost' IDENTIFIED BY 'password';

    What is the simplest way to revoke privileges for every user except root, so I can quickly end this catastrophic rookie mistake?
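
    A hedged way to script the cleanup, assuming the root account can log in with the mysql command-line client and that no other administrative accounts need to keep their grants; the generated statements land in a file so they can be reviewed before being applied:

        # Build REVOKE statements for every account except root, inspect them, then run them.
        mysql -u root -p -N -B -e \
          "SELECT CONCAT('REVOKE ALL PRIVILEGES, GRANT OPTION FROM ''', User, '''@''', Host, ''';')
             FROM mysql.user WHERE User <> 'root';" > revoke.sql
        less revoke.sql                          # sanity-check before trusting a sketch like this
        mysql -u root -p < revoke.sql
        mysql -u root -p -e "FLUSH PRIVILEGES;"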

    Read the article

  • Troubleshooting the ntfs-loop-xen combination in Wubi-based GRUB on Ubuntu

    - by Registered User
    Here is the situation: I installed Ubuntu on a laptop using Wubi, inside the Windows 7 drive (the laptop is not mine). Everything has worked perfectly so far without any problem. We are trying to set up a Xen (virtualization) environment on this laptop. After setting everything up cleanly, I needed to boot with the following GRUB entry:

        menuentry "Xen Linux 2.6.32.27" {
            insmod ntfs
            set root='(hd0,2)'
            loopback loop0 /ubuntu/disks/root.disk
            set root=(loop0)
            multiboot /boot/xen.gz
            module /boot/vmlinuz-2.6.32.27 dummy=dummy root=/dev/sda2 loop=/ubuntu/disks/root.disk ro console=tty0
            module /boot/initrd.img-2.6.32.27
        }

    I got these errors:

        error: file not found
        error: unknown command 'multiboot'
        error: unknown command 'module'
        error: unknown command 'module'

    To dig into the issue further, I rebooted the machine, went to the GRUB command prompt, and manually entered each of the commands from the entry above. When I reached grub> insmod multiboot, I got the message error: file not found. It looks like this Wubi + GRUB setup has just enough modules to use the loopback file on NTFS, but the ACTUAL /boot directory is on the loopback device, NOT on ntfs (hd0,2). Therefore any attempt to read files from (hd0,2) simply won't work, because the files aren't there. I need the insmod multiboot step and the multiboot and module commands, which are available in GRUB on a normal install without Wubi. But since the laptop is not mine, I am not allowed to repartition it and have to make this work as it is, while the normal kernel still boots. How can I get the multiboot module in this Wubi-based install?

    Read the article

  • Can you link an NTFS junction point to a directory on network-attached storage?

    - by Zachary Burt
    I'm using Windows, and I want to use Dropbox to back up a folder outside my Dropbox directory. So I want to create a junction point from my target directory to my Dropbox folder. According to the Wikipedia article on NTFS junction points, which the Dropbox answer links to: "Junction points can only link to directories on a local volume; junction points to remote shares are unsupported." I am looking to link to a directory on network-attached storage, which would not be a local volume, I believe. What should I do?

    Read the article

  • Windows 7: Explorer throws "there is no such file" errors when creating, renaming, or deleting folders

    - by Shiki
    First of all, I'm asking the moderators to summarize my problem in the title and modify it accordingly. Thanks in advance, and sorry, my English is not that good. The problem: it has happened on two of my machines already, and once it happens there are many problems with Explorer. I'll try to describe it as precisely as I can. Basically, when you create a new folder, you have to hit the Enter key TWICE, because first it throws an error that "there is no such file..". Deleting behaves the same way, and so does renaming. Here is an example: I tried adding a "2" to the end of a folder name, hit Enter, and got the error. If I press Cancel, nothing happens; if I press Retry, the popup disappears and the folder name finally changes. I can't come up with an explanation for this. I'm using BitDefender (registered, full protection) and Windows 7 x86 Ultimate retail (boxed, English version). The other PC had the same anti-virus protection, but runs Windows 7 x64. Both systems are activated and fully updated. I don't remember installing anything new on the PCs, and all the software on them is legal - nothing cracked. It just happened, so to say. I have already reinstalled the other PC because I thought I had messed something up and it was simply FUBAR; however, I don't want to reinstall my laptop as well. Any ideas are welcome. (Today I had a strange bug: Windows Explorer was using over 1 GB of memory at 100% CPU. I killed it and launched a new explorer.exe, and that was all - nothing changed, but it may be worth mentioning. The other PC never had this problem.) Isn't there a registry fix for this or something? :/

    Read the article

  • How can I have 2 users working on the same PC at the same time?

    - by Sharon Cook
    I have a PC that has its own IP address and can be connected to by certain external PCs through our firewall. User A has an RDP connection from, say, Germany directly to the PC - his IP address is allowed through our firewall to connect to it. He now wants User B to connect at the same time, so that User B can see what User A is doing on the screen and maybe take over the screen to provide his own input. I know that you cannot have two RDP connections to the same session at the same time, but what would be the easiest solution to this? I want User A to keep his RDP connection, but I am unsure what to suggest so that User B can see what is going on at the same time. The users are not happy to use RealVNC, etc.

    Read the article

  • How to stream audio and video files, but use any media player on Windows (without using Windows file sharing)

    - by RamyenHead
    I want to access and play media files on machine S (Windows XP) from machine C (Windows XP). Using Windows file sharing (the "share this folder" stuff), if it worked, I would share the folder containing the media files on machine S, and I would be able to play the media files while sitting in front of C, using any media player I want - Windows somehow ensures that the remote files behave like local files. But Windows file sharing won't work for me; is there any alternative? If the two machines were both Linux, I would install an SSH server on S and use Nautilus from C to access and play the media files. The reason I can't use Windows file sharing is that my campus uses two different subnets, S and C are on different subnets, and it seems that the firewall governing the whole campus network doesn't allow file sharing between subnets. I tried changing the Windows Firewall settings on S to allow C in, and it still wouldn't work, so it must be the other firewall.

    Read the article

  • Mail Merge in Microsoft Word with images from SharePoint

    - by Ian Turner
    Is there any way of doing a mail merge in Microsoft Word 2007 taking data, including images, from a SharePoint site? It's a bit crude, but I've managed to merge text by taking the data off the SharePoint site as an Excel sheet and then merging that. My problem is what to do with the images. I can set up references to the images in the SharePoint site; however, all I can find is a way of mail merging when the images are in the same folder as the document you are trying to merge, and I can't find a sensible automated way to pull these images together into one folder.

    Read the article

  • How to rename a database without first stopping the SQL instance to flush connections

    - by John Galt
    Is there a way to force a database into single-user mode so a script can be run to rename databases? Currently I have to restart the SQL instance (to force off any connections from a web app, etc.), and then I can run this script:

        USE master
        go
        sp_dboption MDS, "single user", true
        go
        sp_dboption StagingMDS, "single user", true
        go
        sp_renamedb MDS, LastMonthMDS
        go
        sp_renamedb StagingMDS, MDS
        go
        sp_dboption LastMonthMDS, "single user", false
        go
        sp_dboption MDS, "single user", false
        go

    After this script runs, I can restart IIS for my web app, and it can connect to the new production database. All of the above works well, and we've been doing it for years, but now we've upgraded to SQL 2008, and the SQL 2008 instance also hosts other databases that support other web apps. So, rather than restarting the whole SQL instance to enable single-user mode on the two databases, is there a less intrusive way of accomplishing this? Thanks.
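
    If the goal is just to avoid bouncing the whole instance, a hedged sketch of the usual SQL 2008 route is ALTER DATABASE, which rolls back connections to one database at a time; something like the following, run from Management Studio or sqlcmd, assuming nothing else needs those two databases while it runs:

        -- Kick connections off and take each database single-user, then rename and reopen.
        ALTER DATABASE MDS          SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        ALTER DATABASE StagingMDS   SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        ALTER DATABASE MDS          MODIFY NAME = LastMonthMDS;
        ALTER DATABASE StagingMDS   MODIFY NAME = MDS;
        ALTER DATABASE LastMonthMDS SET MULTI_USER;
        ALTER DATABASE MDS          SET MULTI_USER;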

    Read the article

  • What tools can be used to download all images in a webpage?

    - by bobo
    I would like to download all the images in a web page. The tool should be smart enough to examine the CSS and JavaScript files in the page source to look for the images. Ideally, it should also replicate the folder hierarchy, saving the images in the correct folder. For example, the web page may have some images for menu items stored in images/menu/, and background images may be stored in images/bg/. Is there such a tool that you know of? (Preferably on Windows, but Linux is still OK.) Many thanks to you all.
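
    One hedged candidate is wget, which runs on both Windows and Linux; with --page-requisites it pulls down the images, stylesheets, and scripts a page needs and keeps the server's folder layout. A minimal sketch with a placeholder URL (note that wget parses CSS for image references but does not execute JavaScript, so images referenced only from scripts may be missed):

        # Fetch the page plus everything needed to render it, mirroring the remote
        # directory structure (images/menu/, images/bg/, ...) under the local folder.
        wget --page-requisites --convert-links --adjust-extension --span-hosts --no-parent \
             http://example.com/somepage.html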

    Read the article

  • Weird execution of ruby/git executables in Windows

    - by Frexuz
    Something strange has happened: I can't run some command-line executables in Windows anymore. Steps: open cmd, then run an executable such as ruby -v or git -h. When I do that, a new command prompt opens, runs that command (I think - it's too fast to see), and instantly closes again. I've managed to print-screen the new command prompt, and it shows that it's running inside this path: C:\Documents and Settings\Administrator\Local Settings\Temp\3582-490. Inside this folder is the executable I'm trying to run: if I run ruby, then ruby.exe is in there; if I run git, then git.exe is in there. And the folder is emptied in between, so there is always just one .exe file.

    Read the article

  • Windows XP autostart process as administrator

    - by Zulakis
    I am looking for a way to autostart a certain program with administrator rights when a user who has only user rights logs on. I already tried using Task Scheduler, but it didn't work out because you have to enter a username in the format machine\user, and our PXE image-deployment system automatically patches the machine names, so the entered machine\user stopped working. UPDATE: the runas.exe command does not seem appropriate for this task either. When using /user:machinename\Administrator with /savecred, the saved credential is invalid after imaging. One user suggested using .\Administrator or localhost\Administrator, but neither worked on my XP SP3 machines.

    Read the article

  • Windows ACL inheritance issues for FTP server and automated tools

    - by Martin Sall
    I have set up Cerberus FTP Server. By default, the Cerberus FTP service runs under the SYSTEM account. I also have some console applications which run as scheduled tasks; they run under a dedicated "Utilities" user account which has "Log on as batch job" permissions. These console applications take uploaded FTP files, process them, and then move them to a dedicated archive folder. The problem is that my console apps throw security exceptions when trying to access the uploaded files. I tried giving the "Utilities" account Full Control on the ftproot folder and checked the "Replace all child object permissions with inheritable permissions from this object" checkbox, but that affects only the current files; when new files are uploaded, they again are not accessible by my "Utilities" account. I tried another way and put the Cerberus FTP service under the "Utilities" account. Then I also needed to give the "Utilities" account permissions on the Cerberus Data folder in ProgramData. Still no luck - after this change, Cerberus' internal SOAP web service stopped working (although everything else seems to work). I need that SOAP service to be available, so running Cerberus FTP under the "Utilities" account seems not to be an option, unless I find out what else I need to set up for that account to stop Cerberus from complaining. I guess Cerberus uploads files to some temporary folder, so the files get their permissions from that folder and keep those permissions even after being moved to the ftproot. What would be the right solution, granting the Cerberus FTP server and the "Utilities" account the minimal permissions needed to access the contents of the ftproot folder?
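
    A hedged first step is to grant the "Utilities" account an inheritable ACE on the ftproot, so that files created there in the future pick it up as well; the path is illustrative. (OI) applies the entry to new files, (CI) to new subfolders, M is modify, and /T re-applies it to everything already present:

        icacls "D:\ftproot" /grant Utilities:(OI)(CI)M /T

    If, as suspected, Cerberus uploads to a temporary folder and then moves files into the ftproot, the moved files can keep their original ACLs, in which case the same inheritable grant on that temporary folder (or having the console apps reset ACLs on the files they pick up) would be the next thing to try.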

    Read the article

  • How can I automatically move files based on their name?

    - by Pasha
    I have 13 folders containing scanned photographs. Some photographs have been renamed to the date on which they were taken, resulting in a YYYY.MM.DD.tif name; it could potentially be YYYY.MM.DD (###).tif, where ### is just a number. Others are just named IMG_###.tif. I would like to move the files with the YYYY.MM.DD name into a YYYY\MM\DD folder structure. While the files are being moved, I would also like to append the original folder name to the end of the file name. So, a file 01\2012.06.26 (1).tif should end up as 2012\06\26\2012.06.26 (1) - 01.tif. Is there a Windows tool that can help me with this, or do I need to resort to writing a custom app?
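
    For what it's worth, a hedged sketch of the logic in a Unix-style shell (e.g. Cygwin or Git Bash on Windows), purely to illustrate the renaming rule; it assumes the numbered source folders (01, 02, ...) sit under the current directory:

        # Move YYYY.MM.DD[ (n)].tif files into YYYY/MM/DD/, appending the source folder name.
        re='^([0-9]{4})\.([0-9]{2})\.([0-9]{2})( \([0-9]+\))?$'
        for f in [0-9][0-9]/*.tif; do
            dir=${f%%/*}                     # source folder, e.g. 01
            name=$(basename "$f" .tif)       # e.g. "2012.06.26 (1)"
            if [[ $name =~ $re ]]; then
                y=${BASH_REMATCH[1]} m=${BASH_REMATCH[2]} d=${BASH_REMATCH[3]}
                mkdir -p "$y/$m/$d"
                mv "$f" "$y/$m/$d/$name - $dir.tif"   # 2012/06/26/2012.06.26 (1) - 01.tif
            fi
        done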

    Read the article

  • Is there a way for Windows 7 to show remaining disk space in the status bar?

    - by Matt Thompson
    This is really driving me nuts. I do a lot of moving media files to and from USB drives, and I am constantly looking at the status bar to see how much space remains on a drive. It's quick, and doesn't involve any clicking. At least, that's what I used to do using Windows XP. Is there a way to get the status bar in Windows 7 to behave in the same way? I saw in a Wikipedia article that some features have been removed from Windows 7, including these two that seem to be affecting me the most: the size of any selected item and the free disk space are no longer shown on the status bar, and when no items are selected in a folder, neither the details pane nor the status bar shows the total size of the files in the folder. Are there any plug-ins or registry tweaks that can restore this functionality? If not, what is the quickest way to see the remaining space on a drive without having to click on something and leave the directory you are working in?

    Read the article

  • I'm running Ubuntu notebook on a USB drive with 1GB persistence. Where are my XAMPP files?

    - by CDeanMartin
    I just began running Ubuntu Notebook on my 2GB USB stick, with 1GB persistence. After I installed XAMPP, I rebooted to make sure the persistence was working. It worked! My XAMPP installation works, and it comes up when I type localhost into the browser. The sample apps work, including the ones using MySQL. But I can't find the application files! I am used to the vast labyrinth of files in Windows Explorer, and looking through 'Files and Folders' in GNOME, there seem to be only 7 or 8 folders in the entire OS. What is really going on here? I looked through them all; where is XAMPP? When I code PHP with a WAMP stack, all the .php files go in the 'www' folder. What is the Linux equivalent of the www folder?
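
    For reference, XAMPP for Linux installs as a self-contained tree under /opt/lampp rather than spreading across the usual system folders, and its document root - the equivalent of the www folder - is /opt/lampp/htdocs. A quick check from a terminal, assuming a default install:

        # Where XAMPP for Linux lives, and where .php files served at http://localhost/ go.
        ls /opt/lampp
        ls /opt/lampp/htdocs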

    Read the article

  • Copying only the contents of a folder, and not the folder itself, on Linux

    - by WebDevHobo
    Using the cp command, one can copy files and folders on Linux. I want to make a new user and copy the contents of the skeleton folder to their home directory. I use this command: cp -r /etc/skel/ /home/testuser/ However, this only creates a skel folder inside testuser. The idea is that the contents of the /etc/skel folder are copied to /home/testuser, not that a folder is made in /home/testuser with those contents. I've checked the cp man page, but nothing there really seemed like the solution to me. Is there a way to do this, or do the files really need to be copied manually, one by one?
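
    Two hedged ways to do it with GNU cp: a trailing /. makes the source mean "everything inside skel", and -T (--no-target-directory) tells cp not to create a skel subdirectory in the destination:

        # Copy the contents of /etc/skel (including dotfiles) straight into the home directory.
        cp -r /etc/skel/. /home/testuser/

        # Equivalent, using GNU cp's --no-target-directory option.
        cp -rT /etc/skel /home/testuser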

    Read the article

  • Media player only works as administrator?

    - by Jeremy
    It seems I can only get Media Player 12 to work as administrator. If I run it normally (I am in the administrators group on my local PC), right-click on Music, and choose Manage Music Library, Media Player will sit and think for five or so seconds and then just not do anything - no dialog, no error. If I run it as administrator, I can get into the Manage Music Library dialog and add a public folder containing my music. I've even tried granting everyone access to the public folder. One thing to note is that I have recently set up a domain controller and added my PC to the domain. With my local account I never noticed this problem, but I've since created a domain account and am now seeing this issue. I can't find much difference between the local and domain accounts - both are in the administrators group. Why would WMP require "Run as administrator"? OS: Windows 7 64-bit.

    Read the article

  • Running an rsync sweep before initializing lsyncd for synchronizing instances on EC2

    - by chrisallenlane
    My company uses several EC2 servers that scale up and down according to the load we're receiving on our sites at any given moment. For the sake of our discussion here, we're running four instances: master.ourdomain.com - the file-syncing "hub" of the webservers - and www1/www2/www3.ourdomain.com - three webservers which turn on or off as dictated by load. I'm using lsyncd to keep all of the webservers in sync, and for the most part it's working quite well. We're using a two-way syncing scheme, such that each webserver syncs against master, and master syncs against each webserver. Thus the webservers are kept in sync even though they aren't syncing against each other directly. I'm having one problem that I'm having a hard time solving, though. It occurs under these circumstances: changes are made on master (perhaps after we've pushed new code) while some of the redundant webservers are sleeping, and then a sleeping webserver wakes up to absorb load. Under that circumstance, I would like the following to happen: first, the newly woken webserver should sync its file structure - one way - against master, to bring its web application code up to date; then, and only then, should it begin pushing changes in its file structure back to master. Unfortunately, currently, when a sleeping server is started, lsyncd pushes changes back to master before updating its own codebase, thus overwriting new code with old. So, before lsyncd starts, I'd like to be able to synchronize the webserver's code against master's, perhaps by running a simple one-way rsync between the two machines. We're running lsyncd v2, and I've tried to make this happen by using the "bash" configuration options documented in the lsyncd manual. My configuration file looks like this:

        settings = {
            logfile = "/home/user/log/lsyncd/log.txt",
            statusFile = "/home/user/log/lsyncd/status.txt",
            maxProcesses = 2,
            nodaemon = false,
        }

        bash = {
            onStartup = "rsync [email protected]:/home/user/www /home/user/www"
        }

        sync{
            default.rsyncssh,
            source="/home/user/www/",
            host="[email protected]",
            targetdir="/home/user/www/",
            rsyncOpts="-ltus",
            excludeFrom="/home/user/conf/lsyncd/exclude"
        }

    (I've obviously redacted that file somewhat to protect the identities of the guilty.) Simply put, though, this just isn't working. How else might I approach this problem? I was looking at the --delete-after option in man rsync, but I don't think that does what I'm looking for. Are there any suggestions about how I should approach this? Thanks for lending your time and expertise. Chris
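
    One hedged approach is to do the inbound sync outside lsyncd entirely: run a one-way rsync from master in whatever starts lsyncd at boot (an init script, rc.local, or the instance's user-data script), and only launch lsyncd once that rsync has finished. A sketch, with the master host, paths, and config-file location as placeholders:

        #!/bin/sh
        # Pull master's current code down first (one way), then start lsyncd so the
        # freshly woken webserver only ever pushes changes made after this point.
        # Drop --delete if files that exist only on this webserver must survive.
        rsync -ltus --delete user@master.ourdomain.com:/home/user/www/ /home/user/www/ && \
            lsyncd /home/user/conf/lsyncd/lsyncd.conf.lua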

    Read the article

  • Apache unable to write to files and folders on Fedora 16

    - by mickburkejnr
    I've recently installed Fedora 16 on a new PC, and I intend to use it for developing my websites. I've set up Apache to host multiple development sites on the machine. Right now, though, I am trying to install a PHP framework (Symfony2) and I'm unable to install it onto the web server. It comes back with an error saying that it's unable to write to the cache folder on the server. I have checked and modified the folder so that it is writable, but the error keeps being displayed. What am I doing wrong?
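
    A hedged sketch of the usual suspects on Fedora: Symfony2's app/cache and app/logs directories need to be writable by the Apache user, and Fedora's SELinux policy must also label them as content httpd may write to. Paths assume the project is checked out under the vhost's document root:

        # Let the apache user write to Symfony2's cache and log directories...
        sudo chown -R apache:apache app/cache app/logs
        sudo chmod -R u+rwX app/cache app/logs

        # ...and tell SELinux that httpd may write there (Fedora enforces this by default).
        sudo chcon -R -t httpd_sys_rw_content_t app/cache app/logs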

    Read the article

  • How to recover a www directory and included files in Ubuntu 9.04

    - by Al Mubarak
    Hi, I'm using Ubuntu 9.04 for Drupal development. This morning I accidentally removed my www folder, and that folder held a great many of my web development documents. I restarted my system right after it happened, and I have installed some recovery software such as gpart. Is there any possibility of recovering my www directory and its files? It contains most of my web development work, and I'm very worried about it - please let me know as soon as possible. Thanks very much in advance.

    Read the article

  • How can I launch a RemoteApp on Windows Server from the server itself at startup

    - by Rusted
    I have Windows Server 2008 R2 with RDS and a custom desktop (GUI) application installed on the server. The app is started as a RemoteApp on the server by a user from his desktop computer (or, sometimes, from a notebook over VPN). Some details about the environment: the server automatically shuts down every evening and automatically powers on every morning (this is a requirement); the desktop application does some precalculation/precaching on startup, which can take a lot of time; and the application has some memory leaks, so I can't use hibernate instead of shutdown. When the user launches this app from his computer, he can't start working with it until the app finishes pre-initialization. Is there any way to start the RemoteApp session at server startup (without an actual user logon), so that the user could connect to that session from his computer later? I don't want to involve the user's computer to make it work. I have tried to do it with a Windows startup script, but had no luck - starting an RDP session requires an actual user session.

    Read the article

  • What is Finder doing when it has a spinner in the bottom-right?

    - by rspeicher
    Whenever I open a folder on a remote network share that hasn't been opened since I last booted, Finder displays an animated spinner in the bottom-right for about 30 seconds and won't display any of the folder's contents until it's done doing whatever it's doing. This makes browsing the share painfully slow for seemingly no reason. Why is it doing this? I should note that I've disabled .DS_Store files littering network shares using defaults write com.apple.desktopservices DSDontWriteNetworkStores true, so maybe that's why. I'm kind of hoping it's something else, though.

    Read the article
