Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.


  • Where do Outlook folders go when moved?

    - by balexandre
    I have several external users' mailboxes open in my Outlook profile, and I accidentally moved a folder that I now can't find anywhere. Action: I clicked a folder and dragged it into another one. Result: the moved folder is nowhere to be found. The mailbox is opened in Outlook 2010 against Exchange 2010, on an AD network. Given that I have admin rights over the network, where can I retrieve that missing folder? Attempts: the folder I need was moved accidentally, but to reproduce the problem I created a test folder named poi and dragged it the same way, and it also went missing. I also rebooted the client machine and accessed the same mailbox from OWA, with no luck either time. Any ideas on how I can retrieve the missing folder and its emails?
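
    A sketch that may help locate a folder dragged into the wrong parent, assuming access to the Exchange 2010 Management Shell (the mailbox alias jdoe and the folder name poi are placeholders): list every folder path in the mailbox and filter on the name.

        Get-MailboxFolderStatistics -Identity jdoe |
            Where-Object { $_.Name -like "*poi*" } |
            Select-Object FolderPath, ItemsInFolder

    The FolderPath column shows where the folder ended up, after which it can be dragged back in Outlook or OWA.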

    Read the article

  • How to rename and move files according to directory names?

    - by Shan
    I have a bunch of directories, each containing a file with the same name. I want to move these files to another directory and at the same time rename them with the directory name, so that they are distinguishable and are not overwritten. EDIT: All the directories are in the same parent directory. The destination is one directory on the system, which could be anything. We read each directory, take the given file from it, rename it to exactly the directory name, and put it in the destination. An important constraint is that the name of the file is given and that file exists in all of the directories. The directories might contain other files besides the given one. Thanks a lot.
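
    A minimal bash sketch of one way to do this (the parent directory, the destination and the common file name data.txt are all placeholders):

        #!/bin/bash
        # Move the common file out of every immediate subdirectory,
        # prefixing it with the subdirectory's name.
        src=/path/to/parent    # directory holding all the subdirectories (placeholder)
        dest=/path/to/dest     # destination directory (placeholder)
        file=data.txt          # the given file name (placeholder)

        for d in "$src"/*/; do
            name=$(basename "$d")
            [ -f "$d$file" ] && mv "$d$file" "$dest/${name}_$file"
        done

    Swap mv for cp to keep the originals in place.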

    Read the article

  • Does anyone know why rsync would keep sending the files over and over again?

    - by beagleguy
    I'm trying to use rsync to back up some files, about half a TB. It's now in a state where it keeps sending the same files every time it runs. For example:

        rsync -av /data/source/* user@host:/data/dest
        sending incremental file list
        source/file1.txt
        source/file2.txt

    I then verify those files are copied over... then the next time it runs it does the same thing:

        rsync -av /data/source/* user@host:/data/dest
        sending incremental file list
        source/file1.txt
        source/file2.txt

    Any idea why it's getting stuck on these files? I've tried to wipe the whole dest directory out and start over, but no luck. Thanks.
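
    One way to see why rsync thinks the files still differ (a general diagnostic sketch, not specific to this setup) is a dry run with itemized changes; the flag string printed for each file shows whether size, modification time, permissions or ownership triggered the transfer:

        rsync -avin /data/source/ user@host:/data/dest/

    If the flags show only a timestamp difference (e.g. >f..t......), the destination filesystem may not be preserving mtimes, and --size-only or -c (checksum) can work around it.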

    Read the article

  • How can I maximally compress .gz files in Nautilus?

    - by Takkat
    When selecting Compress... from the right-click context menu in Nautilus I am able to quickly compress files to the .gz format. However, by default Nautilus does not use maximum compression. Can I make Nautilus use maximum compression, like gzip -9? Using gconftool or gconf-editor to set the compression_level for File Roller to maximum seems like the right approach, but unfortunately it does not have the desired effect and does not produce maximally compressed files. As this is the expected way to set compression levels, a bug report has been filed upstream. Any ideas for a workaround are welcome.
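
    Until Nautilus/File Roller honours that setting, a sketch of a terminal workaround at the highest compression level (file and directory names are placeholders):

        gzip -9 report.log                         # replaces report.log with report.log.gz
        gzip -9k report.log                        # -k keeps the original (gzip 1.6 and later)
        tar cf - mydir/ | gzip -9 > mydir.tar.gz   # whole directory at level 9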

    Read the article

  • Problem displaying HTML files' icons in Windows 7?

    - by Mohammad
    I have Windows 7 x64 (build 7600). I just installed Firefox 3.6 and set it as my default browser; as soon as I did, all of my HTML files lost their icons! Could you please guide me on how to fix the HTML files' icons while Firefox 3.6 is my default browser? Thanks. P.S.: Whenever I set IE8 back as my default browser, the HTML files' icons are restored.
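
    If the file association itself is intact and only the icon is stale, one common workaround (a generic Explorer icon-cache reset, not specific to Firefox, so it may or may not apply here) is:

        taskkill /IM explorer.exe /F
        del /A /Q "%LocalAppData%\IconCache.db"
        start explorer.exe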

    Read the article

  • Backing up files on Ubuntu for a reinstall. Will there be problems with permissions?

    - by adam
    I have some very important files I want to back up before I reinstall, going back to Ubuntu 9.04 from 9.10 (it's causing me all sorts of problems). The files' total size is small, so I'm just going to copy them over to Dropbox. I'm wondering: when I reinstall Ubuntu and copy them back, will there be any issues with the permissions of those files, given that the old user account which created them and the new user I'll set up on the fresh install will be different?
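
    If the ownership does come back wrong after restoring (for example because the new account ends up with a different UID), a one-line sketch to reclaim the files (the user name and path are placeholders):

        sudo chown -R newuser:newuser /home/newuser/restored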

    Read the article

  • How can I protect files on my NGiNX server?

    - by Jean-Nicolas Boulay Desjardins
    I am trying to protect files (of multiple types) on my server with nginx and PHP. Basically I want people to have to sign in to the website before they can access static files like images. Dropbox does this very well: it forces you to sign in before you can access any static file you put on its servers. I thought about using the nginx Perl module and writing a Perl script that checks the session to see whether the user is signed in before giving access to a static file. I would prefer to use PHP, because all my code runs under PHP and I am not sure how to check a session created by PHP from Perl. So basically my question is: how can I protect static files of any type so that they can only be reached by a user who has signed in and has a valid session created by a PHP script?
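
    One common pattern for this (a sketch of the general technique, not tied to the poster's layout) is nginx's X-Accel-Redirect: every download URL is routed to a small PHP script that starts the PHP session, checks the login, and on success returns an X-Accel-Redirect header pointing into an internal-only location, so nginx itself streams the file while unauthenticated requests never reach it. The paths and the FastCGI address below are placeholders.

        # files under /protected/ can only be served via X-Accel-Redirect, never directly
        location /protected/ {
            internal;
            alias /var/www/private_files/;
        }

        # the PHP gatekeeper; on success it replies with: header('X-Accel-Redirect: /protected/photo.jpg');
        location = /download.php {
            include fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass 127.0.0.1:9000;
        }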

    Read the article

  • How can I send super large files directly to another computer over the Internet for free?

    - by Cruise
    I regularly need to transfer very large files (30 GB) to my friend - financial statistics. I don't have any problem with bandwidth: it is very broad here. I did some research in the area, so:

    1. I would not use FTP, as it is very tricky to get it working behind a NAT.
    2. I would not use Skype/MSN/ICQ, as they are not designed for file transfer and underperform on huge files.
    3. I would not use file-sharing services, as I would need to pay for big files (30 GB is a problem here) and I don't like keeping any piece of my data on a third-party server.

    So, I need some smart tool that will do what I need: sending files directly browser-to-browser, not browser-server-browser. Is that so complex? Is there some web application on the Internet that can do this?

    Read the article

  • File storage service that allows clients to upload large files to my account?

    - by deceze
    Can anyone recommend an online file storage service which fulfills these requirements?

    - I can create an account
    - I can invite clients to upload files into my account
    - clients do not need to register to be able to upload
    - clients must not be able to see anything but their own files, or must not see any files at all (they only get a drop box)
    - only I can access the uploaded files; everything is non-public
    - the service is multi-lingual

    I just need clients to be able to send me potentially large files in a dead simple manner online, that's all. No registration step to go through, no software to download, no syncing or sharing. No setting up of individual folders and permissions for each individual client. No copying and pasting of links (a la Mediafire, Rapidshare etc).

    Read the article

  • On Linux, how can I make a list of files that are owned by a particular owner and then fix the group and owner?

    - by Stuart Woodward
    I have a deep and complex file system where some files have been accidentally written by root. I want to change the ownership of those files back to the original owner in one go. I am playing with commands like:

        find /folder -type f | xargs ls -l | grep "root root"

    but a lot of garbage comes out too. I want to make a list first and then, after confirmation, change only the files in that list.
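
    A sketch of a two-step approach (the path, owner and group are placeholders): let find do the owner matching itself instead of grepping ls output, review the list, then feed it to chown.

        # step 1: build and review the list of root-owned files
        find /folder -type f -user root > /tmp/root-owned.txt
        less /tmp/root-owned.txt

        # step 2: once confirmed, hand the list to chown (GNU xargs, newline-delimited)
        xargs -d '\n' chown originaluser:originalgroup < /tmp/root-owned.txt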

    Read the article

  • 7ZIP - Command Line Compression | Can Never Keep it Simple

    - by OneTwoYou
    I've been Googling for a few hours on how to compress just a file inside a directory and I can't find anything. I found how to compress a folder in general. Now I wish to know how I can compress a folder within a folder, together with a file. Current code:

        7zG.exe a -tzip "test.zip" dontcompressme/compressme/new.txt
        pause

    As you can see above, I don't want to compress the first folder, but only the second and whatever is within that folder. I have 7zG.exe sitting in the main folder and I have some files that are three folders in, but I don't know how to compress only those. Here is my directory list:

        Folder One (don't compress)
          Folder Two (don't compress)
            Folder Three (okay to compress)
              Document One.txt (okay to compress)
              Document Two.txt (okay to compress)
              Index.html (okay to compress)

    Does anyone know how I can do this in the simplest way ever invented by man? Because whenever I go to a website from Google, it walks through all these methods of compressing a folder but never does it the way I want. It makes me kind of upset that I can't get a simple and straightforward answer. Thank you if you answer my question.
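
    7-Zip stores whatever relative path you hand it, so a batch sketch (using the folder names from the listing above and assuming the script and 7zG.exe sit in the main folder) is to change into the parents you want left out before adding:

        @echo off
        rem Archive "Folder Three" and its contents without the two parent folders in the stored paths.
        pushd "Folder One\Folder Two"
        "%~dp07zG.exe" a -tzip "%~dp0test.zip" "Folder Three"
        popd
        pause

    If even "Folder Three" should not appear in the archive paths, pushd one level deeper and add * instead.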

    Read the article

  • Lubuntu 14.04 Problem starting lxsession-default-apps

    - by user278179
    I have one problem: I can't run lxsession-default-apps on Lubuntu 14.04 because it just tells me "The database is updating, please wait". If I run lxsession-default-apps from a terminal, I get this output:

        ** Message: utils.vala:30: config_path_directory: /home/USER/.config/lxsession-default-apps
        ** Message: desktop-files-backend.vala:171: test config_path: /home/USER/.config/lxsession-default-apps/settings.conf
        ** Message: desktop-files-backend.vala:237: Scanning folder: /usr/share/applications
        ** Message: desktop-files-backend.vala:278: Start scanning
        ** Message: desktop-files-backend.vala:257: Scanning folder: /usr/share/app-install/desktop
        ** Message: desktop-files-backend.vala:278: Start scanning
        Error: list_files failed: No such file or directory
        ** Message: desktop-files-backend.vala:333: Finishing scanning
        ** Message: desktop-files-backend.vala:189: Signal finish scanning with mode: write
        ** Message: desktop-files-backend.vala:333: Finishing scanning

    Any help would be appreciated. Thanks. Regards.
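
    The log suggests the scan dies because /usr/share/app-install/desktop does not exist (it is normally provided by the app-install-data package). Assuming that diagnosis is right, a sketch of the commonly reported workaround is to create the directory, or install the package, and retry:

        sudo mkdir -p /usr/share/app-install/desktop
        # or: sudo apt-get install app-install-data
        lxsession-default-apps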

    Read the article

  • Why are some recovery tools still able to find deleted files after I purge the Recycle Bin, defrag the disk and zero-fill free space?

    - by Ivan
    As far as I understand it, when I delete a file (without using the Recycle Bin), its record is removed from the file system's table of contents (FAT/MFT/etc.), but the contents of the disk sectors which were occupied by the file remain intact until those sectors are reused to write something else. When I use some sort of erased-file recovery tool, it reads those sectors directly and tries to rebuild the original file. In this case, what I can't understand is why recovery tools are still able to find deleted files (albeit with a reduced chance of rebuilding them) after I defragment the drive and overwrite all the free space with zeros. Can you explain this? I thought zero-overwritten deleted files could only be found with special forensic-lab magnetic scanning hardware, and that those complex wiping algorithms (overwriting free space multiple times with random and non-random patterns) only make sense to prevent such a physical scan from succeeding; but in practice it seems that a plain zero-fill is not enough to wipe all traces of deleted files. How can this be?

    Read the article

  • What to do with ca.crt, name.crt, name.key, name.ovpn files?

    - by tipu
    I was given these four files to access the office's VPN server. I am on Ubuntu 12.04 and am unsure how to begin using them. I tried the VPN connection tab under Network Connections, but the imported files didn't specify a username and it forced me to enter one, so attempting to connect didn't yield any results. What am I supposed to do with these four files to connect to the VPN?
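
    Those four files are a standard OpenVPN client bundle: the CA certificate, your client certificate, your client key, and the .ovpn config that ties them together. A sketch for testing from a terminal, assuming all four files sit in the same directory:

        sudo apt-get install openvpn
        cd ~/vpn-files                 # wherever the four files were saved (placeholder)
        sudo openvpn --config name.ovpn

    For a Network Manager setup instead, installing network-manager-openvpn-gnome should let the import dialog understand the .ovpn file; with certificate-only authentication no username is required.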

    Read the article

  • Accessing a web page folder with .htaccess in it via Apache WebDAV?

    - by pingo
    I have set up WebDAV access so that an external user can upload the content of his web page to his folder on my server, which Apache serves to the web. This way he can update his web page via WebDAV. The problem is that the user requires a .htaccess file, and of course the .htaccess breaks WebDAV, probably because it overrides settings (new files can no longer be uploaded via WebDAV once the .htaccess below exists). I am running Apache 2.2.17 and this is my WebDAV config:

        Alias /folderDAV "d:/wamp/www/somewebsite/"
        <Location /folderDAV>
            Order Allow,Deny
            Allow from all
            Dav On
            AuthType Digest
            AuthName DAV-upload
            AuthUserFile "D:/wamp/passtore/user.passwd"
            AuthDigestProvider file
            require valid-user
        </Location>

    This config is part of my naive attempt at fixing the problem. The idea was to define an alias to the web page folder with WebDAV enabled there, and then set AllowOverride to None so that the .htaccess would have no effect. Of course I then found out that the AllowOverride directive is not valid inside <Location>. The .htaccess file looks like this:

        #opencart settings
        Options +FollowSymlinks
        Options -Indexes
        <FilesMatch "\.(tpl|ini)">
            Order deny,allow
            Deny from all
        </FilesMatch>
        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)\?*$ index.php?_route_=$1 [L,QSA]
        ErrorDocument 403 /403.html
        deny from 1.1.1.1/19
        allow from 2.2.2.2

    What would be the solution here? I would like to have the web page accessible from the web but at the same time be able to access and modify it via Apache's WebDAV (with digest auth). How would I do that? If possible, I would also like a solution that permits the .htaccess to exist, so that the user still has the power to set up access rules for his web page.
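
    One hedged sketch of a way around the <Location>/AllowOverride limitation (assuming a second port for uploads is acceptable; untested against this exact WAMP setup) is to serve DAV from its own virtual host, where a <Directory> block can legally carry AllowOverride None, so the .htaccess keeps working for normal web traffic but is ignored for uploads:

        Listen 8080
        <VirtualHost *:8080>
            DocumentRoot "d:/wamp/www/somewebsite"
            <Directory "d:/wamp/www/somewebsite/">
                Dav On
                AllowOverride None
                Order Allow,Deny
                Allow from all
                AuthType Digest
                AuthName DAV-upload
                AuthUserFile "D:/wamp/passtore/user.passwd"
                AuthDigestProvider file
                Require valid-user
            </Directory>
        </VirtualHost>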

    Read the article

  • How can I configure Samba to share (read/write) any folder with root permissions?

    - by Mike Toews
    I have a CentOS 5 VirtualBox guest on a Win7 x64 host. I am attempting to set up a read/write share of a directory owned by root with my Windows host using Samba, but I'm having no luck after running around in circles. To simplify matters I've disabled the firewall (/etc/init.d/iptables stop). As security and permissions are irrelevant for this purpose, I'd rather not have to set up another Unix user/group/password. Here is the output from testparm:

        Load smb config files from /etc/samba/smb.conf
        rlimit_max: rlimit_max (1024) below minimum Windows limit (16384)
        Processing section "[Guest Share]"
        Loaded services file OK.
        Server role: ROLE_STANDALONE

    and the source of /etc/samba/smb.conf:

        [global]
            workgroup = WRKGRP
            netbios name = SMBSERVER
            security = SHARE
            load printers = No

        [Guest Share]
            comment = Guest access share
            path = /root/src
            read only = No
            guest ok = Yes

    Running /etc/init.d/smb restart shows an OK status. However, on my Windows host I can see the share on the guest at \\IPv4, but I cannot go into "Guest Share": "The network name cannot be found". That is a common error message, with a likely cause: the user you are trying to access the share with does not have sufficient permissions to access the path for the share; both read (r) and traverse (x) should be possible. Am I trying to use root as a passwordless Samba guest? I'd like to; is it possible? How can I configure Samba to share (read/write) any folder with root permissions?
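
    Since a guest connection runs as the unprivileged guest account (usually nobody), which cannot even traverse /root, one hedged sketch (reasonable for a disposable VM, very unsafe anywhere else) is to have Samba perform all file operations for this share as root:

        [Guest Share]
            comment = Guest access share
            path = /root/src
            read only = No
            guest ok = Yes
            force user = root
            force group = root

    force user and force group are standard smb.conf parameters; restart Samba afterwards with /etc/init.d/smb restart. An alternative that avoids running the share as root is simply to share a directory outside /root that the guest account can read.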

    Read the article

  • Can I use a wildcard to denote subdirectories as opposed to just files in the Windows Command Prompt

    - by Dinosaurus
    I know I can use a wildcard to list the files in a single directory:

        dir *.java

    However, does anyone know if it is possible to denote a subdirectory with a wildcard as well? I would like to do something like:

        dir classes/*/*.java

    where it would list all the Java files in every subdirectory beneath the classes directory. So, if there is:

        classes/cs1100/
        classes/cs1200/
        classes/cs1500/

    it would list all the Java files within these. Note, I'm not using this specifically for the dir command, but for another command-line tool that accepts a list of files. But if it works for dir, it should work in my other program as well.
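
    cmd.exe does not expand a wildcard in the middle of a path, but two sketches produce the same list (the classes path is taken from the question): dir /S /B recurses and prints bare full paths, while for /R walks the tree when each file has to be handed to another program individually.

        dir /S /B classes\*.java

        for /R classes %f in (*.java) do @echo %f

    Inside a batch file the loop variable is written %%f instead of %f; note that both forms recurse to any depth, not just one level down.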

    Read the article

  • Why does Windows Media Center try to open zip files?

    - by gpryatel
    Notes: the OS is Windows 7 and the browser is the latest Firefox. After saving a zip file to the desktop, Windows Media Center opens up. I looked around its configuration settings but could not find anything related to zip files. How do I turn that off? Also, I don't know if this should be a separate question or not: unless I right-click and choose Save Link As... for zip files, I don't get a Firefox dialog asking what to do with the file (Open/Save). The files get saved to some place like C:\Users\namegoeshere\AppData. This only happens on the Win7 computer. I looked around in Firefox's settings for saving files, and I do have 'ask me where to download...' enabled. I can get more exact path names when I get home.

    Read the article

  • What Scripting Program would you choose to recover deleted and missing files?

    - by Steven Graf
    For a private project I'm looking for a command-line tool to scan for and recover files. I'm working on GNOME 3 (but I could also change my OS if it helps me reach my goal), and the tool must be able to find and recover files on attached devices with formats such as NTFS, FAT32, Mac OS Extended (HFS+) and ext3. Is there a single command-line tool or script that covers all of them, or do I need to use different programs to reach my goal? Can you recommend command-line tools for these kinds of tasks? Is anyone willing and able to show me some examples and teach me further?
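
    One widely used pair of command-line tools that covers NTFS, FAT32, HFS+ and ext3 is TestDisk (lost partitions, file-table based undelete) and PhotoRec (file-signature carving). A sketch assuming a Debian/Ubuntu-style system and that the target disk shows up as /dev/sdb (placeholder):

        sudo apt-get install testdisk           # the package ships both testdisk and photorec
        sudo testdisk /dev/sdb                  # interactive: analyse partitions, undelete where supported
        sudo photorec /d ~/recovered/ /dev/sdb  # carve recoverable files into ~/recovered/

    Always recover onto a different disk than the one being scanned, so the lost data is not overwritten.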

    Read the article

  • 403 Forbidden serving static files from VirtualBox shared folder with nginx (Ubuntu 10.04LTS guest, Windows 7 host)

    - by Chris Pratt
    I'm working on a local development VM and trying to test serving my site with gunicorn, with nginx as a reverse proxy for static resources only. With user nginx; in nginx.conf, the site loads minus its static resources, and requesting a static resource individually returns a 403 Forbidden error. For background: the static resources live in a shared folder under /media/sf_work. All files are owned by root:vboxsf (the VirtualBox default). My user account on the system has been added to the vboxsf group, and I have full access to the shared folder. For comparison, I tried changing the nginx.conf user to my own account. In that scenario the static files did load, but the homepage itself then gave a 403 Forbidden error. So I added the nginx user to the vboxsf group, after which everything gave a 403 Forbidden error. After further investigation it seems that if the nginx.conf user is in any group, the result is a 403 Forbidden. Any idea what could possibly be going on here?
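
    Shared folders mounted by the Guest Additions default to root:vboxsf with no access for other users, and a freshly added group membership only takes effect for processes started after re-login. A hedged workaround sketch (the share name work matches the auto-mounted /media/sf_work; the mode values are placeholders) is to mount the share explicitly with ownership and permissions the nginx worker can use:

        sudo umount /media/sf_work
        sudo mount -t vboxsf -o uid=$(id -u nginx),gid=$(id -g nginx),dmode=0755,fmode=0644 work /media/sf_work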

    Read the article

  • How do I change the default FTP folder in Mac OS X 10.6?

    - by Wild_Eep
    I'm running WordPress 2.9.1 from a Mac running 10.6.3. WordPress is installed to the /Library/WebServer/Documents folder. WordPress has a feature called AutoUpdate. Clicking an autoupdate button will download and install updated versions of the WordPress software, or third-party plugin tools. It's a convenient way to keep things up to date. WordPress uses FTP to download the files. I've enabled FTP and set up a user account and opened the requisite ports in my firewall for FTP traffic. This doesn't seem to be enough for my self-hosted installation, though. I'm sure this feature was originally designed for someone who has access to a remote shared webserver, and that it's merely a configuration challenge related to the FTP setup. I feel that if I can adjust the initial directory that the FTP service presents to the AutoUpdate feature, everything else will work properly. So, my question is, how do I adjust what folder is presented when a given user connects to a Mac running 10.6.3 via FTP?
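
    Mac OS X's bundled ftpd drops a connecting account into that account's home directory, so one hedged approach (the wpftp account name is a placeholder, and changing the home directory affects that account everywhere, not just for FTP) is to point the FTP user's home at the WordPress folder:

        sudo dscl . -create /Users/wpftp NFSHomeDirectory /Library/WebServer/Documents

    WordPress's AutoUpdate can then be given that account, and the FTP root it sees will be the WordPress install itself.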

    Read the article

  • Why is it good to have website content files on a separate drive rather than the system (OS) drive?

    - by Jeffrey
    I am wondering what benefit I would gain from moving all website content files from the default inetpub directory on C: to something like D:\wwwroot. By default IIS creates a separate application pool for each website, and I am using the built-in user and group (IURS) as the authentication method. I've made sure each site directory has the appropriate permission settings, so I am not sure what benefits I would gain. Some of the environment settings are as below:

        VMware
        Windows 2008 R2 x64
        IIS 7.5
        C:\inetpub\site1
        C:\inetpub\site2

    Also, as this article (moving the iis7 inetpub directory to a different drive) points out, I'm not sure it's worth the trouble to migrate the files to a different drive:

        PLEASE BE AWARE OF THE FOLLOWING: WINDOWS SERVICING EVENTS (I.E. HOTFIXES AND SERVICE PACKS) WOULD STILL REPLACE FILES IN THE ORIGINAL DIRECTORIES. THE LIKELIHOOD THAT FILES IN THE INETPUB DIRECTORIES HAVE TO BE REPLACED BY SERVICING IS LOW BUT FOR THIS REASON DELETING THE ORIGINAL DIRECTORIES IS NOT POSSIBLE.

    Read the article

  • Why are the 'libgnomevfs' files under /usr/include/gnome-vfs-2.0?

    - by George Edison
    Most applications, including the gnomevfs headers themselves, expect the files to be under /usr/include/libgnomevfs, but Ubuntu puts them under /usr/include/gnome-vfs-2.0/libgnomevfs. Why? The package I'm referring to is called libgnomevfs2. Inside /usr/include/gnome-vfs-2.0/libgnomevfs/gnome-vfs.h we find:

        #include <libgnomevfs/gnome-vfs-acl.h>
        #include <libgnomevfs/gnome-vfs-address.h>
        #include <libgnomevfs/gnome-vfs-async-ops.h>
        #include <libgnomevfs/gnome-vfs-cancellation.h>
        ...

    meaning that even the headers themselves expect the files to be in that location, so nothing that includes this file will compile. Am I missing something, or is this a glitch?
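
    The versioned gnome-vfs-2.0 directory is the include prefix the library expects consumers to add through pkg-config rather than hard-coding /usr/include, which is why the relative #include <libgnomevfs/...> form still resolves. A sketch of how a build is meant to pick it up (the demo.c name is a placeholder):

        pkg-config --cflags gnome-vfs-2.0
        # typically prints something like: -I/usr/include/gnome-vfs-2.0 ...

        gcc demo.c $(pkg-config --cflags --libs gnome-vfs-2.0) -o demo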

    Read the article
