Search Results

Search found 3168 results on 127 pages for 'directories'.

Page 101/127

  • Overriding HOMEDRIVE and HOMEPATH as a Windows 7 user

    - by MikeC
    My employer has an Active Directory group policy that sets my Windows 7 laptop's HOMEDRIVE to "M:" (a mapped network drive) and my HOMEPATH to "\". Since I have read-only permissions on the root of that shared drive, I cannot create files or directories in my Windows home directory. My attempts to work with the IT department have been unsuccessful. Is there a way for me to globally change these environment variables at boot or login time? I need all applications to use alternate values (such as "C:" and "\Users\myname"). I have some installed utilities (like gvim and others) that store preference files in the user's home directory.

    IMPORTANT: Changing these variables under System Properties / Environment Variables does not work. I have tried setting them as both User and System variables (including a reboot). Typing SET HOME in a DOS window clearly shows that my settings are ignored. Using "Start in" in a Windows shortcut will not solve this either, as I need things like Explorer context menu items (like "Edit with Vim") to operate correctly. I do have admin rights on this company laptop, but I am not a Win7 guru. Back in the day, a boot script would have solved this in a minute. Is it even possible today? Thanks.

  • Best shortcut in Total Commander

    - by life-warrior
    So, what's your favourite TC shortcut or shortcut combination? Which ones do you use, and for what purpose? Among my most often used:
    * Ctrl-Left (or Ctrl-Right) - open the archive or folder under the cursor in the opposite tab.
    * Ctrl-Shift-Enter, Alt-F8, Ctrl-X - copy the full file path to the clipboard.
    * Shift-F6, Shift-End (if needed), Ctrl-C - copy only the file name, without the path.
    * Select files, Ctrl-M - multi-rename; for example, remove "DVDrip" from file names.
    * Ctrl-\ - go to the root directory.
    * Ctrl-D - go to the directory with the highlighted letter specified. For example, name a downloads directory "&Downloads" in favourites, and the letter after the ampersand will be highlighted.
    * Alt-F7, feed to listbox, Ctrl-A, Mark (menu) - Save selection to file - creates a file listing everything inside the current directory, with full paths.
    * Ctrl-[3-6] - sort files by name (3), extension (4), date (5), size (6). For example: sort by name when you need movies and soundtracks with the same name but different extensions grouped together; sort by extension when you need to find EXEs in the Windows directory; sort by date when you need to find the latest file downloaded into your dir; sort by size when you need to delete the largest files to free space.

  • tab complete not working for vim in particular directory - ubuntu 12.04

    - by user1160958
    I am working on a Ruby on Rails app. All of a sudden, command-line tab completion stopped working for vim - only for files, and only for the vim command (it works for other commands: ls, rm, etc.). After further investigation, this only occurs in one specific directory: the root directory of my Rails app. If I go into a subdirectory of the app, or any other directory on my machine, tab completion works again, and it works in the root directory of any other Rails app. I tried renaming the directory, and copying the contents of the directory to another directory, and neither helped. When I type vim /path/to/file/ and then hit tab to see a list of files in that directory, only other directories show up, not files. I am using Ubuntu 12.04. I also tried re-installing vim, rebooting, and removing ~/.viminfo (there was no vimrc file) - that didn't work either. Any help would be appreciated!
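
    On Ubuntu, tab completion for vim is usually supplied by the bash-completion package, which installs a filtering completion spec rather than plain filename completion; a spec misbehaving in one directory is a plausible culprit. A quick way to test that theory (the commands are standard bash builtins; that the spec is actually the cause is an assumption):

        # show the completion spec currently bound to vim
        complete -p vim

        # drop it for this shell session; bash falls back to plain
        # filename completion, so retry tab completion now
        complete -r vim

    If completion works after complete -r, the installed spec is at fault and can be overridden in ~/.bashrc.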

  • How can I avoid permission denied errors when attempting to deploy a rails app with capistrano?

    - by joshee
    Total noob here. I'm attempting to deploy an app through Capistrano, and I'm getting relentless permission denied errors when I run cap deploy:update. Seemingly at least some of these errors are due to missing directories that trigger a "Permission Denied" error. (I'm doing setup as root just temporarily.)

        set :user, 'root'
        set :domain, 'domainname.com'
        set :application, 'appname'

        # adjust if you are using RVM, remove if you are not
        $:.unshift(File.expand_path('./lib', ENV['rvm_path']))
        require "rvm/capistrano"
        set :rvm_ruby_string, '1.9.2'

        # file paths
        set :repository, "ssh://[email protected]/~/git/appname.git"
        set :deploy_to, "/var/rails/appname"

        # distribute your applications across servers (the instructions below put them
        # all on the same server, defined above as 'domain', adjust as necessary)
        role :app, domain
        role :web, domain
        role :db, domain, :primary => true

        set :deploy_via, :remote_cache
        set :scm, 'git'
        set :branch, 'master'
        set :scm_verbose, true
        set :use_sudo, false
        set :rails_env, :production

        namespace :deploy do
          desc "cause Passenger to initiate a restart"
          task :restart do
            run "touch #{current_path}/tmp/restart.txt"
          end

          desc "reload the database with seed data"
          task :seed do
            run "cd #{current_path}; rake db:seed RAILS_ENV=#{rails_env}"
          end
        end

        after "deploy:update_code", :bundle_install

        desc "install the necessary prerequisites"
        task :bundle_install, :roles => :app do
          run "cd #{release_path} && bundle install"
        end

    Here's my result:

        ** [domainname.com :: out] Cloning into '/var/rails/appname/shared/cached-copy'...
        ** [domainname.com :: err] Permission denied, please try again.
        ** [domainname.com :: err] Permission denied, please try again.
        ** [domainname.com :: err] Permission denied (publickey,gssapi-with-mic,password).
        ** [domainname.com :: err] fatal: The remote end hung up unexpectedly

    I'm able to ssh without a password, so I'm not sure about that publickey error. By the way, if I run cap deploy:update without set :deploy_via, :remote_cache, here's my result:

        ** [domainname.com :: out] Cloning into '/var/rails/appname/releases/20120326204237'...
        ** [domainname.com :: err] Permission denied, please try again.
        ** [domainname.com :: err] Permission denied, please try again.
        ** [domainname.com :: err] Permission denied (publickey,gssapi-with-mic,password).
        ** [domainname.com :: err] fatal: The remote end hung up unexpectedly
        command finished

    Thanks a lot for your help with this.
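
    The "Permission denied (publickey,...)" line comes from the deploy server trying to authenticate to the git host, not from your workstation. One common remedy (hedged: this assumes the key that can reach the git host lives on your workstation) is SSH agent forwarding, which Capistrano 2 enables with ssh_options[:forward_agent] = true in deploy.rb. The idea can be tested from a plain shell first:

        # load your key into an agent on your workstation
        eval "$(ssh-agent)"
        ssh-add ~/.ssh/id_rsa

        # -A forwards the agent to the deploy server; if forwarding works,
        # a clone performed by the server can use your local key
        ssh -A root@domainname.com \
            "git ls-remote ssh://[email protected]/~/git/appname.git"

    If ls-remote succeeds over the forwarded agent, adding the forward_agent option to the recipe should let the remote cache clone as well.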

  • Does this exist: a standardized way of documenting a file-system structure

    - by eegg
    At work, I'm in charge of maintaining the organization of a whole lot of varied data on a standard file-system. Part of this is coming up with sensible classification (by similarity, need, read/write access, etc), but the bigger part is actually documenting it: what documents/files/media should go where, what should not be in this directory, "for something slightly different, see ../../other-dir", etc. At the moment, I've documented this using a plaintext file filing.txt in every directory I want to document. If someone is unsure what's meant to be in any directory, they read that file. This works alright, but it seems odd that I have this primitive custom solution to a problem that any maintainer of a non-trivial directory structure must experience. Every company I've known of, for example, has some kind of shared file-system where agreed terminology for categorization is important. In my experience, people just have to learn what's what by trial-and-error and experimentation.

    So allow me to propose a better solution, and hopefully you can tell me if it exists. Any directory on any filesystem can have a hidden plaintext file named .filing. Its contents are descriptive human language. It uses some markup like Markdown, with little more than bold, italic, and (relative) hyperlinks to other directories. Now a suitably-enabled file browser will check for a file named .filing whenever it displays a directory. If it exists, its contents are parsed and displayed in an unobtrusive pane near the directory-path widget. Any links therein can be clicked, and the user will be taken to the target directory of that link.

    I think that the effort of implementing such a standard would pay back many times over in usability gains. We would have, say, plugins for Nautilus, Konqueror, etc. It could be used to display directory information in the standard file lists served by web servers. And so on. So, question: does such a thing exist? If not, why not? Do people think it's a worthwhile idea?
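
    Until a file browser supports something like this, a shell can approximate it. A minimal sketch (the .filing name and behaviour here are just the convention proposed above, not an existing standard):

        # print a directory's .filing notes whenever you cd into it
        cd() {
            builtin cd "$@" || return
            [ -f .filing ] && cat .filing
        }

    Dropped into ~/.bashrc, this at least gives the trial-and-error learners an automatic hint each time they enter a documented directory.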

  • FTP issues with Windows 2003 box and Filezilla

    - by vanhornRF
    We've set up a Windows 2003 server with IIS 6, and the FTP server is FileZilla. I'm not a sysadmin by any means, and neither is my other developer. I'm trying to connect to the FTP from a Mac with the Cyberduck FTP client, running OS X 10.6.3. The problem is that I can connect to the FTP address, but when I do, it hangs for a minute or two trying to list the directories. The address points at the webroot folder, so it's only trying to list the pertinent folders for the site we're working on. Eventually, after a minute or two, it goes through and lists everything, and then if you try to open a file or another folder, it hangs again before eventually listing the contents. My question: is there some stupid default setting we're missing? Is this a common occurrence? Like I said, I'm not a sysadmin at all, and I'm sure I'm missing some valuable details for anyone who reads this, so please fire away if you can help and need more info - I'll do my best to provide it. Thanks!
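
    A listing that stalls and then completes is the classic signature of a blocked FTP data connection: the control channel works, but the separate connection used for directory listings has to time out before a fallback succeeds, often because a firewall blocks the passive port range or the active-mode connection back to the client. A hedged quick test from the Mac using curl (the hostname is a placeholder):

        # passive mode: the client opens the data connection to the server
        curl -v --ftp-pasv ftp://ftp.example.com/

        # active mode: the server connects back to the client
        curl -v --ftp-port - ftp://ftp.example.com/

    If one mode lists instantly and the other hangs, the fix is on the firewall side, e.g. defining and opening a passive port range in the FileZilla Server settings and the Windows firewall.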

  • Backup solution to backup terabytes and lots of static files on linux server?

    - by user28679
    Which backup tool or solution would you use to back up terabytes and lots of files on a production Linux server? Note that the files are all different and almost never modified, and usage is mostly adding files, so the data volume is today 3 TB, growing all the time at around +15 GB/day.

    Please do not reply rsync. Basic Unix tools are not enough: rsync does not keep history, and rdiff-backup miserably fails from time to time and screws up the history. Moreover, these are all file-based backups, which generate a lot of IO wait just to browse directories and call stat(). But I guess, except for R1Soft CDP, there is no way around that.

    We tried R1Soft CDP backup, which is block-level backup, and it proved good and efficient for all our other servers, but it systematically fails on the server with 3 terabytes and gazillions of files. The R1Soft and datacenter engineers have been playing a game of hot potato for more than two months now... and there is still no backup except regular rsync. We have never tried the big commercial solutions, except R1Soft CDP, since it was provided as an optional service by the datacenter hosting our servers.
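
    Given that the pain point is per-file scanning, snapshot-based replication at the filesystem level is one family of options worth naming. A hedged sketch, assuming the data could be migrated onto ZFS (pool and host names are hypothetical; this is a direction, not a drop-in fix for the existing volume):

        # take a point-in-time snapshot; this is O(1), no tree walk
        zfs snapshot tank/data@2012-06-02

        # ship only the blocks changed since yesterday's snapshot
        zfs send -i tank/data@2012-06-01 tank/data@2012-06-02 | \
            ssh backuphost zfs receive backup/data

    Each received snapshot is browsable history on the backup host, and nothing ever stats the gazillion files individually.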

  • Host Primary Domain from a subfolder

    - by TandemAdam
    I am having a problem making a subdirectory act as the public_html for my main domain, and getting a solution that works with that domain's subdirectories too. My hosting allows me to host multiple sites, which are all working great. I have set up a subfolder under my ~/public_html/ directory called /domains/, where I create a folder for each separate website, e.g.:

        public_html
          domains
            websiteone
            websitetwo
            websitethree
            ...

    This keeps my sites nice and tidy. The only issue was getting my "main domain" to fit into this system. It seems my main domain is somehow tied to my account (or to Apache, or something), so I can't change the "document root" of this domain. I can define the document roots for any other domains ("Addon Domains") that I add in cPanel, no problem. But the main domain is different.

    I was told to edit the .htaccess file to redirect the main domain to a subdirectory. This seemed to work great, and my site works fine on its home/index page. The problem I'm having is that if I navigate my browser to, say, the images folder (just for example) of my main site, like this:

        www.yourmaindomain.com/images/

    then it seems to ignore the redirect and shows the entire server directory in the URL, like this:

        www.yourmaindomain.com/domains/yourmaindomain/images/

    It still actually shows the correct "Index of /images" page and the list of all my images. Here is an example of the .htaccess file I am using:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^(www.)?yourmaindomain.com$
        RewriteCond %{REQUEST_URI} !^/domains/yourmaindomain/
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /domains/yourmaindomain/$1

        RewriteCond %{HTTP_HOST} ^(www.)?yourmaindomain.com$
        RewriteRule ^(/)?$ domains/yourmaindomain/index.html [L]

    Does this .htaccess file look correct? I just need to make my main domain behave like an addon domain, with its subdirectories adhering to the redirect rules.

  • IIS6 Sending a 404 for ".exe" files.

    - by Tracker1
    Recently a bunch of files I had set up for download via IIS 6's web server stopped working. They are a number of setup files ending in ".exe" and were working prior to a few months ago. I have the file permissions set properly, and even enabled directory browsing in IIS to confirm that the paths are correct. I'm not sure if it is related, but directories containing a period stopped working as well, e.g.:

        ~/download/ApplicationName/0.9/AppName-setup-0.9.123b2.exe

    When I rename the directory to, say, 0_9, the browsing works, but the file itself still delivers a 404 message from IIS. For now, I've set up FileZilla FTP for anonymous access to these files, but I would prefer to continue using IIS. I've considered creating an HTTP handler to serve the .exe files, but would really prefer a configuration solution. I just can't figure out why it isn't working, as all the settings are correct: the directory is set up for read access, "Everyone" has read permissions on the files themselves, and directory browsing (aside from the folder "0.9" to "0_9" rename) shows the files.

  • Is it possible to change "working directory" of XeTeX?

    - by Herbert Sitz
    Using XeTeX, many working files get created in the process of producing the PDF, and they litter the directory where my main .tex file is. Is it possible to change the working directory of XeTeX so that it stores all these scratch files in some other directory, out of the way?

    There is a previous question on Superuser.com that discusses a utility that cleans up the working files by deleting them after they're produced: http://superuser.com/questions/95712/how-to-avoid-littering-ones-tex-directories-with-intermediate-files. That solution doesn't work for me since I'm using XeTeX, but it also seems like it would be preferable to simply designate a "scratch" directory where all working files are saved. I haven't been able to find any info on how to do that, though. Is there a way?

    (My question is prompted partly by the fact that I often work with files in a directory that is shared using Dropbox, so it creates a lot of unnecessary traffic if files are getting created and destroyed willy-nilly. I don't know if it affects speed in any way, but having a separate working directory that is not shared/replicated by Dropbox would be a cleaner solution, even if I could use the method suggested in the earlier thread.)
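
    TeX Live's command-line engines accept an -output-directory option that redirects the .aux/.log/.toc clutter (and the PDF) somewhere else; a minimal sketch, with main.tex standing in for the real file name:

        # keep all intermediate files, and the PDF, under ./build
        mkdir -p build
        xelatex -output-directory=build main.tex

    MiKTeX additionally offers -aux-directory, which diverts only the scratch files while the PDF stays put; whether your XeTeX distribution supports that second flag is worth checking before relying on it.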

  • Directory tree in a Resource without extraction...

    - by Corelgott
    Hi all, I am looking for a way to store a complete directory, including subdirectories, in an application's resources and use it without extracting it.

    Details: We would like to use GeckoFX (Gecko as a C# component) in one of our applications. GeckoFX needs the XULRunner and needs to find its folder structure. We also have some other data that I would prefer not to extract onto the customer's PC - at least not onto something persistent like an HDD.

    Getting the complete directory into the resources is not that big a deal: compress it to one file and done. But using it without writing it to disk is something else. I have a strong dislike of temp folders and such things. Would anything like a RAM drive be possible - some part of RAM being mounted? Does something like this even exist as a lib, or would it only be possible with a device driver? Any thoughts on this? Thanks in advance! Corelgott

  • How do I batch-downsize images on linux, while keeping small images small?

    - by Gabriel
    I have a whole lot of photos, and it's time to clean up the mess and free some disk space. I know mogrify is great for batch-resizing things down. The problem is, in some directories I have small images mixed in with the big ones. I'd like to batch-downsize all the big ones but not upsize the small ones.

    As an example, I have a directory with pictures of tens of megabytes each, around 3000x2000 pixels. Some of them I have already downsized so I could email them; they may be 1024x768. I'd like to downsize the big ones to 1600x1200, a disk-space-to-quality tradeoff I like. But then, with mogrify or convert, the small ones would be upsized, which would be a waste of disk space. I found some tricky ways to use identify with cut and some scripting to filter the small pics out and mogrify the others, but man, there's got to be a way to tell mogrify not to upsize my pics... How? Is there some other tool better suited?
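
    ImageMagick can do this by itself: the > qualifier on a geometry makes the resize apply only to images larger than the given size, so small pictures pass through untouched. A short sketch (quote the geometry so the shell doesn't read > as a redirect; the *.jpg glob is just an example):

        # shrink anything bigger than 1600x1200, never enlarge
        mogrify -resize '1600x1200>' *.jpg

    mogrify overwrites files in place, so trying it on a copy of one directory first is a sensible precaution.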

  • Apache Name-based Virtual Hosts - configuring httpd.conf file

    - by Dave
    Hi there. I am running a web app on Tomcat at the following location on my server:

        /var/tomcat/webapps/SoccerApp

    I am looking to update the Tomcat httpd.conf file with the following virtual host:

        <VirtualHost *:80>
            DocumentRoot /var/tomcat/webapps/SoccerApp/MyTeam
            ServerName www.mysoccerapp.com
        </VirtualHost>

    This gives me a 404 error, as the directory MyTeam does not exist. However, my application behaves in a way that uses this URL directory as the name of the soccer team for which to display data, so it will never be a physical folder on the server. Nonetheless, I would like www.mysoccerapp.com to resolve to webapps/SoccerApp/MyTeam, even though the directory isn't there. Does this make any sense? Any ideas on how to get this working? At the end of the day, I want to do the following:

        www.teamone.com -> runs /webapps/SoccerApp/TeamOne
        www.teamtwo.com -> runs /webapps/SoccerApp/TeamTwo

    ...where TeamOne and TeamTwo are not physical directories, but merely processed by my SoccerApp application as the current soccer team to display data for. Many many thanks! Dave.

  • Linux disk usage analyser that acts like symlinks are real files

    - by Rory
    I am using git-annex, an extension to the DVCS git, which is designed for handling large files. It makes heavy use of symlinks: the actual large files are moved to the .git/annex directory, and the original files are symlinked to there. I am running out of disk space and need to clean up and see what's using all my space.

    Usually I'd use a disk usage tool like ncdu, Baobab or Filelight. However, they treat a symlink as essentially empty, and only count the file it points to as using any space. Which means that when I use git-annex, they show no space used in the main directories and lots of space used in the .git/annex directory. This is not helpful.

    Is there any (graphical or ncurses-based) disk usage programme for Linux (apt-get installable would be easiest) that is capable (through options or not) of counting a symlink as using up the space that the original file uses? Many have options for different behaviour for hard links, so it makes sense that some might handle symlinks similarly.

    (I know counting symlinks as using space has flaws, like counting the same space twice, broken symlinks, etc. But that's OK for my purposes.)
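
    If a plain number is enough while hunting for a prettier tool, GNU du already supports this: -L dereferences symlinks, so each link is charged the size of its target. A small sketch (the repository path is a placeholder):

        # per-subdirectory totals with symlinks followed, biggest last
        du -Lh --max-depth=1 ~/my-annex-repo | sort -h

    The same double-counting caveat from the question applies: a file reachable both directly and through a link is counted twice.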

  • How to repair damaged repository (which has a centralized .svn directory)?

    - by Heinrich Ulbricht
    I recently upgraded my TortoiseSVN installation to version 1.7.1. This forced me to upgrade my working copy as well. The upgrade removed all (but one) of the .svn directories from all subdirectories, leaving only one in the root. Now, out of the blue (of course; I suspect my antivirus software), there is an error when I, for example, try to clean up the working copy. I am also not able to commit anything. The error message when cleaning up is:

        Cleanup failed to process the following paths:
        C:\svn
        Can't open file 'C:\svn\.svn\pristine\73\73bcc5fa7819f84f56b81dfa0236f0aac7b7d404.svn-base':
        The system cannot find the file specified.

    I traced the error to the presence of one directory within the working copy. If I rename it, then everything works; when it is present, I get the error. I also deleted it and checked it out again. No change, the error persists. With previous versions I could repair damage in the .svn easily: just delete the offending folder and check out again. I cannot do this anymore because now the .svn dir is centralized. What could I do to repair my working copy?
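
    With 1.7's single centralized .svn, the old delete-and-update trick is gone, but the brute-force equivalent still works: rebuild the metadata with a fresh checkout and carry local modifications over. A hedged sketch using command-line Subversion (the repository URL and paths are placeholders; rsync assumes a Cygwin/msys-style shell on the Windows box - any copy tool that can skip the .svn folder does the job):

        # fresh working copy with an intact pristine store
        svn checkout http://server/repo/trunk /c/svn-fresh

        # copy working files, but not the broken metadata, onto it
        rsync -a --exclude=.svn /c/svn/ /c/svn-fresh/

        # confirm that only your real local edits show up
        cd /c/svn-fresh && svn status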

  • How do I fix error 1303 during TI Connect install?

    - by smoth190
    I recently purchased a TI-84 Plus graphing calculator, and I'm trying to install the TI Connect software in order to connect the calculator to my computer via the USB cable. Unfortunately, I'm getting this error while trying to install the program:

        Error 1303. The installation has insufficient privileges to access this directory:
        E:\Data\Timothy\Documents\MyTIData. The installation cannot continue. Log on as
        administrator or contact your system administrator.

    However, my account is the only account on my PC, and it has administrative privileges. I've also tried running the installer with Run as Administrator, but with no luck. If I create the folder MyTIData manually, I receive this error:

        Error 1317. An error occurred while attempting to create the directory:
        E:\Data\Timothy\Documents\MyTIData

    I've reapplied the security settings on the E:\Data folder (and all its subdirectories), giving my account Full Control. I've also gone into Computer Management and given SYSTEM full privileges for the entire disk. I've also logged out, logged back in, restarted, etc., but still no luck.

    Now, I should mention that my Documents folder is not at the default location. My C: disk is a 90 GB SSD, so I moved all my personal data onto the extra storage disk (which is ~1 TB). I don't know if that is causing the issue, but it can't hurt to throw it out there.

    So why can't I install this program? Google'ing the problem brings up this error for various other installers (such as Visual Studio and Microsoft Office), but nothing for TI Connect. All the solutions are the same - give the folder full privileges - but I've already done this! I've also tried running the installer with and without the calculator plugged in, but it didn't change anything. In the prompt that contains the error, repeatedly clicking Retry, or waiting a few moments before clicking Retry, also produces no result.

  • Apache directory authorization bug (clicking cancel gives access to partial content)

    - by s4uadmin
    I've got a minor problem (the site is not high priority), but still a very interesting one. I have an Apache root domain in "/var/www/" wherein other sites live, and I have foo.example.com forwarding to "/var/www/foo-example" (a WordPress site).

    The problem is that when you go to foo.example.com, you are prompted to enter credentials. If you hit cancel, it gives you the access denied page - as expected. But when you go to the server's direct IP (which gives you the default index page) and hit cancel when prompted for credentials, it just keeps giving you the login screen, and after pressing cancel a few more times it serves up a (perhaps cached) bare HTML part of the page.

    How do I prevent this from happening? Perhaps this is a bug... Even if I blocked access to the root directory, going to ip/foo-example would still do this. And I want to keep all the directories within the www directory, or at least all in the same place. Thanks.

    PS: here is my configuration:

        <VirtualHost *:80>
            DocumentRoot /var/www/wp-xxxxxxx/
            ServerName beta.xxxxxxxxx.nl
            <Directory "/var/www/wp-xxxxxxxxx/">
                Options +Indexes
                AuthName "xxxxxxxx Beta Site"
                AuthType Basic
                require valid-user
                Satisfy all
                AuthBasicProvider file
                AuthUserFile /var/www/wp-xxxxxxx/.htxxxxxxxxx
                order deny,allow
                allow from all
            </Directory>
            ServerAdmin [email protected]
            ServerAlias beta.xxxxxxx.nl
        </VirtualHost>

  • How to fix IE starting when I try to start Opera?

    - by Scott Leis
    I have a problem that started today: trying to run any of the three versions of Opera I have installed causes Internet Explorer to start instead. The Opera versions I have installed are 9.64, 10.10, and 10.63. My OS is Windows Vista with the latest critical/important updates.

    The behaviour is the same regardless of whether I double-click a shortcut to Opera, double-click Opera.exe in one of the install directories, or run Opera.exe with the full path from Start > Run. The only workaround I've found is to right-click Opera.exe or a shortcut and click "Run as administrator". This starts Opera, and it appears to work normally.

    Opera seems to be the only program affected; I've checked Firefox and a few other (non-browser) programs, and they work normally. When IE starts instead of Opera, two instances of iexplore.exe are started, and they both crash before I can do anything else. IE also crashes if I try to start it from its own shortcut, but I don't know if that only started today, since I hadn't used IE for a few weeks.

    Does anyone know what might cause this and how to fix it? It's possible my PC has a virus, but BitDefender (for which I have automatic updates enabled) hasn't detected anything.

  • Partitioning & Linux

    - by Zac
    Every tutorial on Linux-based partitioning schemes (or just partitioning in general) will tell you that a PC can have either 4 primary partitions, or 3 primaries and 1 extended. They will all also tell you that Linux (in my case, Ubuntu) can be installed on either. It's also come to my attention that it is not atypical for FHS directories such as usr/, tmp/, etc/, home/ or var/ to be mounted separately on other partitions. Several questions I am unable to find the answers to, purely for my own edification:

    (1) By "PC", are we really talking about common PC disk types, like IDE or SATA? I guess I'm wondering why PC users are limited to 4 primaries or 3 primaries + 1 extended.

    (2) I'm choking on some basic OS concepts: it is said that a partition can be mounted by a file system or an OS. So I assume this means I can somehow instruct Ubuntu to mount one partition, and then have, say, ReiserFS mounted on another partition? How?

    (3)(a) What about creating swap partitions? Is there too much of a good thing with swap partitioning? If I have 4 GB RAM on a 320 GB disk, what should my swap partition size be, and why?

    (3)(b) Are swap files the only way to create swap space? Wouldn't a Linux partitioning utility allow me to define a partition as being for virtual memory only?

    (4) Why are partitions limited to being "mounted" by just OSes and file systems? Why couldn't I write a program to take up its own, say, 512 MB partition, and then have it invoked or used by an OS installed on another partition?

    Thanks for shedding any light here... not critical that I know this stuff, but it's got me thinking incessantly. And when I think incessantly, I...can't......sleep....
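
    On (3)(b): a partition really can be dedicated to swap without any swap file involved; partitioning tools mark it with the swap type, and Linux initializes it directly. A brief sketch, assuming /dev/sda5 is an otherwise unused partition (the device name is a placeholder - running mkswap on the wrong partition destroys its contents):

        # write a swap signature to the partition, then enable it
        sudo mkswap /dev/sda5
        sudo swapon /dev/sda5

        # verify, listing the sizes of all active swap areas
        swapon -s

    An /etc/fstab entry ("/dev/sda5 none swap sw 0 0") makes it permanent across reboots.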

  • Apache Virtual Host with directory aliases

    - by brechtvhb
    I'm trying to set up a dynamic virtual host in Apache with a directory alias pointing to a different path for every domain. Here's what I'm trying to achieve. Say I have 2 domains:

        * www.domain1.com
        * www.domain2.com

    I want both to point to the same index.php file (C:/cms/index.php). Now the hard part... I want directories, or certain file types, to point to a different path for each domain. Example:

        * www.domain1.com/layout    -> C:/store/www.domain1.com/layout
        * www.domain2.com/layout    -> C:/store/www.domain2.com/layout
        * www.domain1.com/image.png -> C:/store/www.domain1.com/image.png
        * www.domain2.com/image.png -> C:/store/www.domain2.com/image.png

    However, the admin directory should point to the same path again for all sites:

        * www.domain1.com/admin -> C:/cms/admin
        * www.domain2.com/admin -> C:/cms/admin

    Is there a way to achieve this kind of behaviour in Apache 2.2 without having to create a VirtualHost entry for each new domain?

  • How can I prevent Apache from asking for credentials on non SSL site

    - by Scott
    I have a web server with several virtual hosts. Some of those hosts have an associated ssl site. I have a DirectoryMatch directive in my main config file which requires basic authentication to any directory with secured as part of the directory path. On sites that have an SSL site, I have a rewrite rule (located in the non ssl config for that site), that redirects to the SSL site, same uri. The problem is the http (80) site asks for credentials first, and then the https (443) site asks for credentials again. I would like to prevent the http site from asking and thus avoid the potential for someone entering credentials and having them sent in clear text. I know I could move the DirectoryMatch down to the specific site, and just put the auth statement in the SSL config, but that would introduce the possibility of forgetting to protect critical directories when creating new sites. Here are the pertinent declarations: httpd.conf (all sites): <DirectoryMatch "_secured_"> AuthType Basic AuthName "+ + + Restrcted Area on Server + + +" AuthUserFile /home/websvr/.auth/std.auth Require valid-user </DirectoryMatch> site.conf (specific to individual site) <DirectoryMatch "_secured_"> RewriteEngine On RewriteRule .*(_secured_.*) https://site.com/$1 </DirectoryMatch> Is there a way to leave DirectoryMatch in the main config file and prevent the request for authorization from the http site? Running Apache 2 on Ubuntu 10.04 server from the default package. I have AllowOverride set to none - I prefer to handle things in the config files instead of .htaccess.

  • xauth, ssh and missing home directory

    - by flolo
    We have several servers, and normally everything works fine, except now... we are getting new air conditioning installed. This takes 36 hours, and for that time almost all servers are shut down; only 2 servers remain up for the most important tasks (i.e. accepting incoming email, delivering some important websites, login server). Everybody was informed that if they needed data from their home directories, they should fetch it before the shutdown.

    Long story short: someone realized that he has to run a certain program on one of the servers. No problem - he can log in remotely to our login server and run the program there without a home directory (the binaries are local, and the necessary information can be copied to /tmp). That works like a charm until... the user needs to run a GUI program.

    I can find no easy way to make it run. Usually ssh -Y honk@loginserver is enough, but now the home directory is missing and ssh is not able to copy the cookies into ~/.Xauthority (as the file server with the home directories is down). Paranoid as all sysadmins are, all X servers listen only locally, not on TCP ports, so no direct remote X connection is possible. The SSH config is watertight - i.e. there is no way to set environment variables.

    My problem is that the proxy MIT cookie generated by ssh gets lost because ~/.Xauthority doesn't exist. If I could retrieve it somehow, I could re-enter it into a .Xauthority in /tmp. The only other option which came to my mind (besides changing the config) is making a tunnel (netcat, or better ssh) from the remote host to the login server and copying the cookie manually (not sure if the TCP/Unix-domain-socket stuff works as expected).

    Any good suggestions (for the future - now our servers are already up)?
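
    Since sshd's xauth step only needs a writable home at login time, one workaround (hedged: the path is a placeholder, and this needs root on the login server) is to point the stranded account at a throwaway home directory for the duration:

        # on the login server, give the account a temporary home
        sudo mkdir -p /tmp/home-honk
        sudo chown honk /tmp/home-honk
        sudo usermod -d /tmp/home-honk honk

    A fresh ssh -Y honk@loginserver can then write its proxy cookie to /tmp/home-honk/.Xauthority, and X11 forwarding works as usual; once the file server is back, usermod -d restores the original home.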

  • Moving web files to /home/user/ gives permission denied using apache

    - by Maaz
    I recently created some Linux users on my machine, and their respective directories were created in the form /home/my_user, so I decided to treat each user as one of my websites. I moved all my website files over to this directory, like so: /home/my_user/public_html/. I edited the virtual host in my httpd.conf and changed the document root, so this is how that looks:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot "/home/my_user/public_html"
            ServerName mywebsite.com
            ServerAlias www.mywebsite.com
            ErrorLog "/var/log/httpd/mywebsite/error_log"
            CustomLog "/var/log/httpd/mywebsite/access_log" common
        </VirtualHost>

    This virtual host configuration was working perfectly fine with my older document root path, /var/www/html/mywebsite/public_html, but after changing it to the new location I am getting a permission denied error, even though I followed the instructions here: http://stackoverflow.com/questions/14427808/you-dont-have-permission-error-in-apache-in-centos

    Even after following the above instructions, when I run the following command:

        sudo -u apache ls /home/my_user/public_html

    the server responds with:

        ls: cannot open directory /home/my_user/public_html: Permission denied

    I no longer get a permission denied error when I try to access my site; however, I am now redirected to the default Apache page instead of my website. I am not exactly sure what's wrong any more. If anyone has an idea, it would be great if you guys could help out!
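
    Two usual suspects fit these symptoms, sketched below for checking (paths from the question; the SELinux step only applies on CentOS/RHEL-style systems and is an assumption, not confirmed by the post):

        # 1. the apache user needs search (x) permission on every
        #    directory leading to the document root
        chmod 711 /home/my_user
        chmod 755 /home/my_user/public_html

        # 2. on SELinux systems, content outside /var/www also needs
        #    a web-readable security context
        sudo chcon -R -t httpd_sys_content_t /home/my_user/public_html

    After that, sudo -u apache ls /home/my_user/public_html should succeed; if the browser still lands on the default page, the vhost itself isn't matching (check that NameVirtualHost/ServerName are in effect, e.g. with apachectl -S).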

  • How to host multiple FLEX applications in IIS7

    - by Devtron
    Hello, I manually deploy a FLEX application to my web server (IIS 7). There are two sites/virtual directories: 1) Default, and 2) myFlexApp1, which is where my working FLEX application resides. I now need to deploy a different FLEX application (let's call it myFlexApp2) to the same web server.

    I set up a site for myFlexApp2, and IIS complains about the bindings using port 80, which is already used by myFlexApp1. I have tried giving them separate host names in their binding properties - for example, myFlexApp1.mydomain.com and myFlexApp2.mydomain.com - but I can never get myFlexApp2 to show from an external browser. I was able to get one or the other to display, but never both. Here is what I need:

        myFlexApp1.mydomain.com -- myFlexApp1
        calendar.mydomain.com   -- myFlexApp2
        test.mydomain.com       -- myFlexApp1

    where test.mydomain.com is the default URL. Is this possible? What am I doing wrong? I even tried to edit the hosts file in C:\Windows\System32\drivers\etc, but that didn't work either. How can I serve up two FLEX applications on IIS 7? It shouldn't be this hard!

  • How to set up a PRIVATE vimwiki on Dropbox.com

    - by Zongheng Yang
    Hi everyone, I assume those reading this page know what vimwiki and dropbox.com are and what they are for, so I'll go straight to my confusion.

    The common way of setting up a PRIVATE vimwiki on Dropbox is simply to put your vimwiki directories under the Dropbox folder (but not under Dropbox/Public/, because that would be PUBLIC). Dropbox allows viewing HTML directly with a dropbox.com/* URL: for example, an index.html can be accessed by the URL https://dl-web.dropbox.com/get/Wiki/html/index.html?w=bfead71a, with a specified string, ?w=bfead71a, appended after the file name. Hence, if index.html contains a reference to A.html, located in the same folder index.html is in, A.html has to be accessed using some URL like https://dl-web.dropbox.com/get/Wiki/html/A.html?w=SPECIFIED_STRING. But it seems impossible to hack vimwiki to correct the hrefs in the converted HTML files this way.

    Is there an approach that resolves this problem? I hope I make myself clear. If you have any questions, please ask me for further explanation. Thank you!
