Search Results

Search found 23480 results on 940 pages for 'directory structure'.

Page 169/940

  • Where should the web server root directory go in Linux?

    - by Xeoncross
    I see that Apache and Nginx both use /var/www as their web root, but that directory is not covered by the Filesystem Hierarchy Standard. I also see some servers with the web root in /home/username/www. So where does the web root go? Or rather, where should it go most of the time on a typical web server?
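
    A minimal sketch of one common alternative, assuming an FHS-minded layout: the standard reserves /srv for data served by the system, so a per-site tree under /srv/www is a defensible choice. The example.com path and the www-data user below are illustrative, not prescriptive.

        # Create a per-site document root under /srv and hand it to the web server user.
        sudo mkdir -p /srv/www/example.com/public_html
        sudo chown -R www-data:www-data /srv/www/example.com
        # Then point the vhost at it, e.g.  DocumentRoot /srv/www/example.com/public_html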

    Read the article

  • Is it possible for a directory to get unlinked while its contained files remain?

    - by Walkerneo
    I used to wonder why deleting directories via PHP or the shell wasn't as easy as clicking Delete in Windows. I realize now that deleting a file is really just unlinking it, i.e. removing the directory entry that points at its data, so to delete a directory you must first unlink all the files inside it. Is it ever possible for the directory's own entry to be removed, but not those of the files inside it? Do operating systems periodically check for files that can no longer be reached in the file system?
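
    A quick shell demonstration of the unlinking idea, assuming a Linux box with bash and coreutils: removing a name only drops the directory entry, while the data stays reachable through any handle that is still open.

        tmp=$(mktemp -d)
        echo "hello" > "$tmp/file"
        exec 3< "$tmp/file"      # keep a file descriptor open on the file
        rm -r "$tmp"             # the directory entry and the name are gone
        cat <&3                  # ...but the data is still readable: prints "hello"
        exec 3<&-                # closing the last reference finally frees it

    Orphans that survive a crash (inodes left with no directory entry) are what fsck sweeps into lost+found, rather than something the OS hunts for continuously.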

    Read the article

  • When I click the address bar folder/directory buttons, is there a way to make them open in a new window?

    - by galacticninja
    In Windows XP, installing the software 'Explorer Breadcrumbs' gives me an address bar similar to Windows 7's (directories are displayed as buttons that you can click to navigate to them). With Explorer Breadcrumbs in Windows XP, I can open a directory from the address bar in a new window by Ctrl-clicking or middle-clicking its button. Is there a way to get this same functionality in Windows 7?

    Read the article

  • How do I prevent Apache from serving the .git directory?

    - by Shoan
    I have started using git to deploy websites for testing. How do I prevent Apache from serving the contents of the .git directory? I tried

        <DirectoryMatch "^/.*/\.svn/">
            Order deny,allow
            Deny from all
        </DirectoryMatch>

    with no success. I know that I can create a .htaccess file in each .git directory and deny access there, but I wanted something I could put into the main config file that makes this global across all websites.
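
    A minimal sketch of a global rule, assuming Apache 2.2-era Order/Deny syntax (as in the attempt above); the Debian-style include path in the comment is illustrative, and the block can equally be pasted straight into the main config. Note the pattern matches .git rather than .svn and does not require a trailing slash.

        # Put this in the main server config, or an included file such as
        # /etc/apache2/conf.d/deny-git.conf on Debian-style layouts.
        <DirectoryMatch "/\.git">
            Order allow,deny
            Deny from all
        </DirectoryMatch>

    Reload Apache afterwards (for example with apachectl graceful) so the new block takes effect.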

    Read the article

  • What file or directory is unique to Plesk and unlikely to change in the future?

    - by Tim Post
    I am writing a program that needs to be able to detect the presence of various web hosting control panels. I don't use Plesk, or have access to a server running Plesk, so I am unable to ascertain what file or directory I could test for to reliably conclude that the system is running Plesk. What file or directory indicates "Yes, this has Plesk installed" that is unlikely to ever change in the future, but would be present on all versions of Plesk out in the wild? Thanks in advance :)
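
    A hedged sketch of one way to probe for it: Plesk's traditional install prefix on Linux is /usr/local/psa (with /opt/psa used or symlinked on some distributions), and a version file lives under it. Treat both paths as assumptions to verify against real Plesk hosts rather than a guaranteed invariant.

        # Report whether this host looks like a Plesk server.
        if [ -f /usr/local/psa/version ] || [ -f /opt/psa/version ]; then
            echo "Plesk detected: $(cat /usr/local/psa/version /opt/psa/version 2>/dev/null | head -n 1)"
        else
            echo "Plesk not detected"
        fi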

    Read the article

  • vsftpd hangs at "150 Here comes the directory listing."

    - by Rikr
    In a vsftpd server environment, sharing various directories from NFS mount points, I can log in without problems, but when I send the first "ls", vsftpd gives me the directory listing:

        lftp [email protected]:~> ls
        -rw-rw-rw-    1 1160     1016          392 Jun 06 09:28 test.gif

    but it does not give me the prompt back (lftp client). In the server log I can see that the last message is: "150 Here comes the directory listing." Why does this happen?
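
    A quick client-side check, assuming the usual culprit: the control connection works, but the passive-mode data connection that carries the listing is blocked by a firewall or NAT. Forcing active mode in lftp, or pinning vsftpd's pasv_min_port/pasv_max_port range and opening it in the firewall, usually isolates the problem. Host and user names below are placeholders.

        # If the listing completes with passive mode off, the passive data ports are being blocked.
        lftp -u someuser -e 'set ftp:passive-mode off; ls; bye' ftp.example.com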

    Read the article

  • How to copy directory from one Linux server to another with a minimum in-between period?

    - by yegor256
    I have a rather big directory on one server (over 4000 files), which I'd like to copy to another server (which contains a previous version of this directory). rsync is the first option, but while it runs it leaves the destination folder in an intermediate state for a rather long period of time (more than a minute). I'd like to do it a bit differently:

        1. gzip the source folder
        2. scp the archive to the destination server
        3. gunzip the file there
        4. delete the archive at the source and at the destination

    What is the best way to accomplish all this?
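
    A sketch of the same idea without staging an archive on disk at either end: stream a compressed tar over ssh into a staging directory, then swap it into place so the destination is only "in between" for the instant of the rename. Host names and paths are illustrative.

        # Copy: pack on the fly, unpack on the far side into a staging directory.
        tar czf - -C /path/to/parent bigdir | \
            ssh user@dest 'mkdir -p /path/to/parent/bigdir.new && tar xzf - -C /path/to/parent/bigdir.new --strip-components=1'

        # Swap: two quick renames instead of a long in-place update.
        ssh user@dest 'mv /path/to/parent/bigdir /path/to/parent/bigdir.old && mv /path/to/parent/bigdir.new /path/to/parent/bigdir'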

    Read the article

  • CMD: How do I delete all the contents of all directories (in the current directory) without deleting the directories themselves?

    - by merlin2011
    For example: I'm in the directory F:\Data. Inside this directory, I have four directories:

        22179
        22915
        23459
        23460

    These directories have various content, including directories and files. I'm trying to run something like

        rmdir /s *\*

    to delete all the contents of these numbered directories while leaving the (now empty) directories themselves. Is there a one-liner that can do this, or do I have to loop through the sub-directories?
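
    A sketch of a cmd.exe one-liner, assuming it is typed at an interactive prompt in F:\Data (double the % signs if it goes into a batch file): the outer loop visits each top-level directory, deletes its plain files, and the inner loop removes its sub-directories, so the numbered directories themselves survive empty.

        for /D %d in (*) do @(del /q "%d\*" 2>nul & for /D %s in ("%d\*") do @rd /s /q "%s")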

    Read the article

  • Spamassassin one-liner to tag & move mail with an X-Spam-Flag: YES to a new directory?

    - by ane
    Say you have a directory with tens of thousands of messages in it, and you want to separate the spam from the non-spam. Specifically, you would like to:

        1. Run spamassassin against the directory, tagging each message with an X-Spam-Flag: YES header if it thinks it is spam
        2. Have a tcsh shell or Perl one-liner grep all mail with that flag and move those mails to /tmp/spam

    What command can you run to accomplish this? For example, some pseudocode:

        /usr/local/bin/spamassassin -eL ./Maildir/cur/* | grep "X-Spam-Flag: YES" | mv %1 /tmp/spam
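
    A sketch in bash along the lines of that pseudocode, done in two passes so the tagging and the moving stay separate; the Maildir path and /tmp/spam come from the question, and the .new suffix is just a scratch name.

        mkdir -p /tmp/spam

        # Pass 1: rewrite each message with spamassassin's headers added (-L sticks to local tests).
        for f in ./Maildir/cur/*; do
            spamassassin -L < "$f" > "$f.new" && mv "$f.new" "$f"
        done

        # Pass 2: move everything that got flagged.
        grep -l '^X-Spam-Flag: YES' ./Maildir/cur/* | xargs -I{} mv {} /tmp/spam/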

    Read the article

  • Free web gallery installation that can use existing directory hierarchy in filesystem?

    - by user1338062
    There are several free software gallery projects (Gallery, Coppermine, etc.), but as far as I know each of them creates a copy of imported images in its own internal storage (be it a directory structure or a database). Is there any gallery software that can keep the existing directory hierarchy of media files (images, videos) as-is and just store their metadata in a database? I guess at least various NAS solutions ship with software like this.

    Read the article

  • error while loading shared libraries; cannot open shared object file: No such file or directory

    - by glitchyme
    The program evince complains that it can't find libfreetype.so.6; however, I clearly have the file and its directory is included in my LD_LIBRARY_PATH. Furthermore, I have another program which uses libfreetype6 and runs just fine. What's going on here?

        jbud@jb-pc ~> evince
        evince: error while loading shared libraries: libfreetype.so.6: cannot open shared object file: No such file or directory
        jbud@jb-pc ~> ldd /usr/bin/evince | grep freetype
            libfreetype.so.6 => /usr/local/lib/libfreetype.so.6 (0x00007f912179d000)
        jbud@jb-pc ~> file /usr/local/lib/libfreetype.so.6
        /usr/local/lib/libfreetype.so.6: symbolic link to `libfreetype.so.6.11.1'
        jbud@jb-pc ~> file /usr/local/lib/libfreetype.so.6.11.1
        /usr/local/lib/libfreetype.so.6.11.1: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=0x21a4b8005e0c9a42af001b35fb984f4e25efc71c, not stripped
        jbud@jb-pc ~> echo $LD_LIBRARY_PATH
        /usr/lib/:/usr/lib64/:/usr/lib/x86_64-linux-gnu/:/usr/local/lib/
        jbud@jb-pc ~> ldd jdrive/jstuff/work/personal/noengine/client | grep freetype
            libfreetype.so.6 => /usr/local/lib/libfreetype.so.6 (0x00007feb5ac89000)
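
    Whatever the specific cause (LD_LIBRARY_PATH is, for instance, ignored for setuid/setgid binaries and can differ in the environment a program is actually launched from), a more robust fix is to register /usr/local/lib with the dynamic linker cache so the library is found without any environment variable. A sketch, assuming a glibc system with an /etc/ld.so.conf.d directory:

        # Make /usr/local/lib known to the runtime linker, then rebuild the cache.
        echo /usr/local/lib | sudo tee /etc/ld.so.conf.d/usr-local-lib.conf
        sudo ldconfig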

    Read the article

  • Apache config file. Redirect permanent gives 403 error

    - by Homunculus Reticulli
    I am changing my domain from foo.com to foobar.org. I used a Redirect permanent in my Apache config file and then restarted Apache. When I try to access the old domain foo.com, I get a 403 error. This is what my Apache config file looks like:

        <VirtualHost *:80>
          ServerName foo.com
          #ServerAlias www.foo.com
          #ServerAdmin [email protected]

          Redirect permanent / http://www.foobar.org/

          DocumentRoot /path/to/project/foo/web
          DirectoryIndex index.php

          # CustomLog with format nickname
          LogFormat "%h %l %u %t \"%r\" %>s %b" common
          CustomLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foo.access.log" common
          LogLevel notice
          ErrorLog "|/usr/bin/cronolog /var/log/apache2/%Y%m.foo.errors.log"

          <Directory />
            Order Deny,Allow
            Deny from all
          </Directory>

          <Files ~ "^\.ht">
            Order allow,deny
            Deny from all
          </Files>

          <Directory /path/to/project/foo/web>
            Options -Indexes -Includes
            AllowOverride All
            Allow from All

            RewriteEngine On
            # We check if the .html version is here (caching)
            RewriteRule ^$ index.html [QSA]
            RewriteRule ^([^.])$ $1.html [QSA]
            RewriteCond %{REQUEST_FILENAME} !-f
            # No, so we redirect to our front end controller
            RewriteRule ^(.*)$ index.php [QSA,L]
          </Directory>

          <Directory /path/to/project/foo/web/uploads>
            Options -ExecCGI -FollowSymLinks -Indexes -Includes
            AllowOverride None
            php_flag engine off
          </Directory>

          Alias /sf /lib/vendor/symfony/symfony-1.3.8/data/web/sf
          <Directory /lib/vendor/symfony/symfony-1.3.8/data/web/sf>
          # Alias /sf /lib/vendor/symfony/symfony-1.4.19/data/web/sf
          # <Directory /lib/vendor/symfony/symfony-1.4.19/data/web/sf>
            Options -Indexes -Includes
            AllowOverride All
            Allow from All
          </Directory>
        </VirtualHost>

    Can anyone spot what I may be doing wrong? The site foobar.org does exist, so I don't know why this error occurs - help?

    Read the article

  • Robots.txt practices with .htaccess redirections (inherits)

    - by Jayhal
    I have a question about how to write robots.txt files for many domains and subdomains with redirects in place. We have a hosting account with a primary domain and several add-on domains. All of our domains and subdomains, including the primary domain, are redirected via .htaccess 301s to their own subdirectories in the primary domain's root directory. I'm confused about how I would write the robots.txt for certain directories.

    First, I wanted to confirm that I am right in understanding that for domains and subdomains, crawlers will look for the crawling rules (robots.txt) in the directory that acts as that URL's root directory. Also, that a directory will not be affected by a robots.txt in its parent directory if the directory has its own domain/subdomain and that URL is the one being accessed by crawlers. (I'm pretty sure, but I wanted to confirm I didn't have a fundamentally flawed understanding of robots.txt.)

    In the original root directory on the account (where the primary domain pointed before the .htaccess rules were put in place), what should the robots.txt contain? When crawlers come to crawl our primary domain, will they look in the original root directory for the robots.txt, or will they use the file in the new subdirectory where all the primary domain's site files are now located? If the former, what should the root's robots.txt include, if anything at all? Would I be right to include a simple 'Disallow: /' for all agents, and then put robots.txt files with more specific instructions in each subdirectory? Would that affect the crawling of the directory where the primary domain is now redirected?

    Any help is greatly appreciated. Thanks!

    Read the article

  • Utilizing Generics to make a Class structure more mutable…

    - by Keith Barrows
    While the ASP.NET GridView control supports automatic paging, I found custom paging faster in several situations. I also found myself rewriting the same code over and over just to add basic sorting capabilities to an ASP.NET GridView, so today I took a little time to encapsulate it all into a class I can use and reuse on any page with a GridView. In fact, it will probably take longer to write this blog entry than it took to encapsulate the functionality... (read more)

    Read the article

  • How do I modify a Perl script to move packages to a different directory based on version? [migrated]

    - by Peter Penzov
    I have this Perl script, which is used to sort packages based on package version:

        #!/usr/bin/perl -w
        #
        # Compare versions of all *.rpm files against the
        # latest packages installed (if installed)
        #
        # Usage:
        #   rpmver.pl
        #
        # This script looks for all *.rpm files.
        #
        use strict;
        use RPM2;

        my $rpm_db = RPM2->open_rpm_db();

        for my $filename (<*.rpm>) {
            my $h = RPM2->open_package( $filename );

            # Ensure we compare against the newest
            # package of the given name.
            my ($installed) = sort { $b <=> $a } $rpm_db->find_by_name($h->name);

            if (not $installed) {
                printf "Package %s not installed.\n", $h->as_nvre;
            } else {
                my ($result) = ($h <=> $installed);
                if ($result < 0) {
                    printf "Installed package %s newer than file %s\n", $installed->as_nvre, $h->as_nvre;
                } else {
                    printf "File %s newer than installed package %s\n", $h->as_nvre, $installed->as_nvre;
                }
            }
        }

    I have a Linux repository with SRPMs. I want to move the packages with the latest version into a different directory, for example latest_packages. How must the script be modified?
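
    A sketch of one possible modification, not a drop-in answer: in the branch where the file is newer than the installed package (and in the "not installed" branch too, if those should count), move the .rpm using the core File::Copy and File::Path modules. The latest_packages name comes from the question; everything else is an assumption about where the move belongs.

        # Near the top of the script:
        use File::Copy qw(move);
        use File::Path qw(make_path);

        my $dest = 'latest_packages';
        make_path($dest) unless -d $dest;

        # ...then, inside the loop, in the branch that prints
        # "File %s newer than installed package %s":
        move($filename, "$dest/$filename")
            or warn "Could not move $filename into $dest/: $!";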

    Read the article

  • How do I structure code and builds for continuous delivery of multiple applications in a small team?

    - by kingdango
    Background: 3-5 developers supporting (and building new) internal applications for a non-software company. We use TFS, although I don't think that matters much for my question. I want to develop a deployment pipeline and adopt continuous integration / deployment techniques. Here's what our source tree looks like right now; we use a single TFS Team Project:

        $/MAIN/src/
        $/MAIN/src/ApplicationA/VSSOlution.sln
        $/MAIN/src/ApplicationA/ApplicationAProject1.csproj
        $/MAIN/src/ApplicationA/ApplicationAProject2.csproj
        $/MAIN/src/ApplicationB/...
        $/MAIN/src/ApplicationC
        $/MAIN/src/SharedInfrastructureA
        $/MAIN/src/SharedInfrastructureB

    My goal (a pretty typical promotion pipeline):

        1. When a code change is made to a given application, I want to be able to build that application and auto-deploy the change to a DEV server. I may also need to build dependencies on shared infrastructure components, and I often have some database scripts or changes as well.
        2. If developer testing passes, I want a manually triggered but automated deploy of that build to a STAGING server where end users will review the new functionality.
        3. Once it's approved by the end users, I want a manually triggered auto-deploy to production.

    Question: How can I best adopt continuous deployment techniques in a multi-application environment? A lot of the advice I see is single-application-specific; how is it best applied to multiple applications? For step 1, do I simply set up a separate Team Build for each application? What's the best approach to accomplishing steps 2 and 3, promoting the latest build to new environments? I've seen this work well with web apps, but what about database changes?

    Read the article

  • What benefits can I get by upgrading my ASP.NET (WebForms) + DAL (EF) + Repository + BLL structure to MVC?

    - by Etienne
    I'm in the process of defining the approach that best fits our needs for a big web application. For now, I'm thinking of going with an ASP.NET architecture with a DAL using Entity Framework, a Repository layer so the BLL never accesses the DAL directly, and a BLL that calls the repository and does whatever manipulation is necessary to prepare data for the presentation layer (.aspx files). I don't plan to use ASP.NET controls; I prefer to keep things simple and lightweight using plain HTML and jQuery UI controls, and to do most of the server calls with jQuery Ajax. Sometimes, when needed, I plan to use handlers (.ashx) to call BLL methods that return JSON or HTML to the client for dynamic content. My solution also has a test project that mocks the Repository with in-memory data, so testing BLL methods does not depend on the database. It may be useful to add that we will build a big application on top of this architecture, with hundreds of tables and stored procedures and a lot of reading from and writing to the database.

    My question is: having this architecture in mind, are there any evident advantages I would gain by using an MVC3 project instead of the described WebForms-based architecture? Do you see any problems in this architecture that may cause us trouble during the next steps of development? I know the MVC pattern from using it in other projects with Django, but the Microsoft MVC implementation looks much more complex and verbose than Django's, which is why I'm hesitating (or waiting for a little push?) right now before jumping into it. We are on a real project with deadlines and don't want to slow down the development process without any real benefits.

    Read the article
