Search Results

Search found 37684 results on 1508 pages for 'msp files'.

  • wsgi - narrow user permissions.

    - by Tomasz Wysocki
    I have the following Apache configuration and my application is working fine:

        <VirtualHost *:80>
            ServerName ig-test.example.com
            WSGIScriptAlias / /home/ig-test/src/repository/django.wsgi
            WSGIDaemonProcess ig-test user=ig-test
        </VirtualHost>

    But I want to protect my files from other users, so I do:

        chown ig-test /home/ig-test/ -R
        chmod og-rwx /home/ig-test/ -R

    And the application stops working:

        (13)Permission denied: /home/ig-test/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

    Is it possible to achieve what I'm doing with WSGI? If I have to give read permissions to some files, that will be fine. But there are files I have to protect (like the file with the DB configuration, or the business logic of the application).
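
    One possible direction (an untested sketch; it assumes the Apache workers run as group www-data, which varies by distribution): the daemon process already runs as ig-test, so the application itself can keep reading the protected DB configuration; only the web server needs to traverse the path and read the WSGI entry point.

        # keep everything private to ig-test first
        chown -R ig-test:ig-test /home/ig-test/
        chmod -R o-rwx /home/ig-test/
        # let the server's group traverse down to the entry point...
        chgrp www-data /home/ig-test /home/ig-test/src /home/ig-test/src/repository
        chmod g+x /home/ig-test /home/ig-test/src /home/ig-test/src/repository
        # ...and read the entry point itself
        chgrp www-data /home/ig-test/src/repository/django.wsgi
        chmod g+r /home/ig-test/src/repository/django.wsgi

    Setting AllowOverride None for that directory tree should also stop Apache from probing for the .htaccess file in the first place.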

    Read the article

  • Clean URLs on Hiawatha

    - by Botto
    I am using the Hiawatha web server and running Drupal on a FastCGI PHP server. The Drupal site uses ImageCache, which requires either private files or clean URLs. The issue I am having with clean URLs is that requests for files are being rewritten to index.php as well. My current config is:

        UrlToolkit {
            ToolkitID = drupal
            RequestURI exists Return
            Match (/files/*) Rewrite $1
            Match ^/(.*) Rewrite /index.php?q=$1
        }

    The above does not work. Drupal's Apache setup is:

        <Directory /var/www/example.com>
            RewriteEngine on
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
        </Directory>
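
    A sketch of one way to express Drupal's rules in Hiawatha's UrlToolkit (untested): Match takes a regular expression, so (/files/*) is not the shell-style glob it resembles, and rewriting to $1 there sends the request back into the toolkit rather than serving the file. The RequestURI exists Return line should already short-circuit requests for files that exist on disk, so something like the following may be enough; also make sure the virtual host actually references the toolkit with UseToolkit = drupal.

        UrlToolkit {
            ToolkitID = drupal
            RequestURI exists Return
            Match ^/files/ Return
            Match ^/(.*) Rewrite /index.php?q=$1
        }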

    Read the article

  • Tell browsers to cache until last modified date changes?

    - by Chad Johnson
    My web site consists of static HTML files which are usually republished once per day, and sometimes more often. I'm using Apache. In the vhost settings for my site, I'd like to tell browsers to cache HTML files indefinitely, until Apache sees that they are modified. So as soon as an HTML file changes, Apache should immediately begin telling browsers it has changed and send the updated file. As soon as a new file is published, browsers should immediately begin receiving it; they should never receive old versions of files. Maybe something like ExpiresByType text/html "modification" with no "plus x days"? Is something like this possible?
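
    An Expires header alone can never quite do this, because the browser decides freshness without contacting the server; what does work is making every response stale immediately, so the browser revalidates with a conditional request and Apache answers 304 Not Modified until the file's mtime changes. Two hedged vhost variants (the first assumes mod_expires, the second mod_headers; syntax worth verifying against your Apache version):

        # mod_expires: expire at the moment of last modification
        ExpiresActive On
        ExpiresByType text/html "modification plus 0 seconds"

        # or mod_headers: force revalidation on every use
        <FilesMatch "\.html$">
            Header set Cache-Control "no-cache, must-revalidate"
        </FilesMatch>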

    Read the article

  • How to fix Windows 2008 R2 BOOTMGR is missing

    - by RichardTheKiwi
    BOOTMGR IS MISSING
    PRESS CTRL+ALT+DEL TO RESTART

    Note: this is a VM on a VMware ESX server, but that should not matter. I put in the 2008 R2 x64 install DVD and can get to recovery, but it lists no operating systems. Clicking on Next brings me to:

        System Recovery Options
        Choose a recovery tool
        Operating system: Unknown or (Unknown) Local Disk
        .....
        Command Prompt

    I start the command prompt, go to C:\ and perform a dir /a. Apart from files I put there myself, these are showing:

        $Recycle.Bin
        Documents and Settings [C:\Users]
        Program Files
        Program Files (x86)
        ProgramData
        Recovery
        System Volume Information
        Temp
        Users
        Windows

    Where to go next? Is it like the NTLDR problem with Windows 2003, where I can just drop a file in there and it will be hunky dory again?
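
    Not quite a drop-in file like NTLDR, but the recovery command prompt ships bootrec.exe for this case; a possible sequence (a sketch, untested against this particular VM):

        bootrec /FixMbr
        bootrec /FixBoot
        bootrec /RebuildBcd
        rem if bootrec finds no installations, bcdboot can regenerate the BCD store:
        bcdboot C:\Windows /s C: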

    Read the article

  • Backup Client/Server Software that Syncs only the Delta?

    - by Urda
    I have a co-located server and a desktop computer. I push small things and large amounts of small files (like my iTunes library) into a JungleDisk cloud. If a few files change there, no big deal; the file gets re-uploaded. For larger files, JungleDisk backup isn't helpful: things like movies and VMware images that change a lot, but that I want backed up. Just not to JungleDisk, since that would cost me even more money. I am looking for a product, closed or open source (preferably open source), that will sync the change, or delta, to my personal server on a schedule. That way I can keep a copy of my larger things without paying JungleDisk a ton more, since they are in the range of many gigabytes. Right now these few items are backed up over FTP and take forever. Both the client and server are Windows environments.
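
    rsync's delta-transfer algorithm does exactly this (only changed blocks cross the wire), and Windows wrappers such as DeltaCopy or cwRsync package both the client and a server-side service. A sketch of the client call, with host, module, and paths made up (cwRsync exposes Windows drives as /cygdrive/<letter>):

        # -a preserve attributes, -v verbose, --partial resume interrupted transfers
        rsync -av --partial "/cygdrive/c/VMs/" backupuser@myserver::vms/

    Wrapped in a Task Scheduler entry, that covers the "on a schedule" part.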

    Read the article

  • Apache stopping downloads part way through

    - by Ben Smiley
    On my site there are some digital files which can be downloaded through a PHP script. The script works fine for small files, but large files (e.g. 115 MB) cannot be downloaded successfully. The connection dies after around 15 minutes, but it's not consistent: sometimes longer, sometimes shorter. I don't think it's a problem with the script timing out, because the download time isn't consistent. Equally, it doesn't seem like a memory-limit problem, because the amount downloaded varies each time. Does anyone know of any Apache- or PHP-related settings which could cause this kind of problem?
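
    Two usual suspects worth checking are Apache's Timeout directive and PHP's max_execution_time; beyond that, a download script that slurps the whole file into memory can die in unpredictable places. A hedged sketch of a chunk-streaming variant (path and headers are placeholders, not taken from the original script):

        <?php
        // Stream the file in small chunks so neither the PHP memory limit
        // nor output buffering has to hold the whole 115 MB at once.
        set_time_limit(0);                       // lift the script time limit
        $path = '/srv/downloads/file.bin';       // hypothetical path
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . filesize($path));
        $fh = fopen($path, 'rb');
        while (!feof($fh)) {
            echo fread($fh, 8192);               // 8 KB at a time
            flush();                             // hand data to Apache promptly
        }
        fclose($fh);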

    Read the article

  • Linux file permissions not being preserved

    - by yellavon
    I am deploying some custom software as root (a necessity for this situation). I set the owner/group to user1:user1 and set all the files to 644 beforehand in the shell, then copy and deploy with Ant. However, when the files get copied over from the deployment directory, the ownership changes back to root and all the files are installed with 666 permissions. This seems to occur whether the file is overwritten or newly created. I believe there is a way to set an option on the cp and mv commands to preserve permissions, but that would be a lot of commands to change. How can I fix this? Is there some setting I can change temporarily for root so the install always preserves the file permissions?
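
    For the shell side, cp -p (or cp -a) preserves mode and ownership in one flag. Ant's <copy> task, though, does not preserve Unix permissions at all, so if Ant does the copying, one workaround (a sketch; the paths are made up) is to reapply attributes after the deploy step:

        # preserve attributes when copying by hand
        cp -a /deploy/stage/. /opt/app/
        # or repair attributes after an Ant copy
        chown -R user1:user1 /opt/app
        find /opt/app -type f -exec chmod 644 {} +

    New files landing as 666 also suggests root's umask is 0 during the install; running the deploy with umask 022 set in that shell is worth trying too.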

    Read the article

  • re: 3ware RAID 10 (4-drive) stripe size suggestions?

    - by dasko
    I looked around on the site but found nothing really concrete on my question. I will have about 120 GB of data in total, made up of 5 MB files (Excel, Word) and about 25 .pst files of about 1.2 GB each. Yes, they use the .pst files over the network; even though that is not recommended, this is a legacy setup without issue, so we will continue to support it for another year or so. I need to know what you think about a stripe size of 256 KB for the RAID 10, based on the above requirements. I did try a benchmark with these settings and it seems alright, without any real issue; I'm just trying to rule out anything I might have missed. Thanks.

    Read the article

  • Can't create a new folder in my video folder

    - by tiki
    My OS is Windows 7 Ultimate 32-bit. My video folder location is C:\Users\User\Documents\Downloads\Video. I can't create a folder in the Video folder or see the files I saved. But I can see the files in the root location of the folder, i.e. by going to the folder from My Computer. I can create and find files when I go to the folder via My Computer > C > Users > User > Documents > Downloads > Video. This is an unusual problem I have never seen. Even as a system administrator I am not able to fix it. Please help me out. Thanks in advance.
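
    When the same directory behaves differently depending on how it is reached, a junction or redirected shell folder is often in the middle. Two read-only checks from an elevated command prompt that might narrow it down (paths copied from the question):

        rem list any junctions/symlinks along the path
        dir /aL C:\Users\User\Documents\Downloads
        rem show the effective permissions on the folder
        icacls "C:\Users\User\Documents\Downloads\Video"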

    Read the article

  • Nginx Restart Issues

    - by heavymark
    All of a sudden, when restarting Nginx I get the following error:

        Restarting nginx: [alert]: could not open error log file: open() "/var/log/nginx/error.log" failed (13: Permission denied)
        2011/02/16 17:20:58 [warn] 23925#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /etc/nginx/nginx.conf:1
        the configuration file /etc/nginx/nginx.conf syntax is ok
        2011/02/16 17:20:58 [emerg] 23925#0: open() "/var/run/nginx.pid" failed (13: Permission denied)
        configuration file /etc/nginx/nginx.conf test failed

    On the front end, part of the site loads, but some files, such as the CSS in particular, are not loading. They exist on the server, but when loading the resources directly in Chrome they say "Oops, this page can't be found." I set up a special group and user to run my Apache files using suEXEC for my domain files. I think the nginx files are owned by root, however, which I'm assuming is the problem; but which nginx file ownerships would I change?
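
    Both "Permission denied" lines, plus the warning about the user directive, point the other way around: nginx is being restarted as a non-root user, so it cannot open its root-owned log and pid files. The master process is meant to start as root and then drop privileges per the user directive. A sketch of the usual check (assumes a sysvinit-style service script):

        # restart via the init script, as root
        sudo /etc/init.d/nginx restart
        # confirm the log directory and pid file are where root left them
        ls -l /var/log/nginx /var/run/nginx.pid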

    Read the article

  • How to monitor a folder for changes, and execute a command if it does, on Windows?

    - by Camilo Martin
    There are similar questions for Linux and Mac, but I'm after a Windows solution here. The problem is as follows: I want to write several (js) script files in a folder, and have a program monitor that folder for file changes and new files being added, and run a command whenever that happens (to compile them all into one single file). The solution has to monitor both file changes and new files being added in a folder, and run a command only if there is any change. It would be best if it were either a built-in solution (like a JScript or VBScript snippet) or something that does not require installation.
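
    PowerShell ships with Windows 7 and can do this with no installation via .NET's FileSystemWatcher. A minimal sketch (the folder, filter, and build command are placeholders):

        # watch a folder for new or changed .js files and rebuild on each event
        $watcher = New-Object System.IO.FileSystemWatcher 'C:\scripts', '*.js'
        $watcher.EnableRaisingEvents = $true
        $action = { cmd /c 'C:\scripts\build.bat' }   # hypothetical compile step
        Register-ObjectEvent $watcher Created -Action $action | Out-Null
        Register-ObjectEvent $watcher Changed -Action $action | Out-Null
        while ($true) { Start-Sleep -Seconds 1 }      # keep the session alive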

    Read the article

  • How to synchronize two folders on two remote Linux virtual machines

    - by Manoj Agarwal
    I have two virtual machines; the host OS is ESXi 3.5 and the guest OS is CentOS 4.6. There are two remotely located ESXi servers, each containing a CentOS 4.6 virtual machine. I want any change I make to any file/folder in one virtual machine to be automatically synchronized to the other remote virtual machine. The synchronization process should be automatic. It should sync only differentials, not perform an entire copy with an overwrite operation. The sync should be intelligent enough to look for what has changed and what has not, and should update only the changed files/folders. Further, there should be some sort of overview and selection for syncing: for example, if it shows that 4 files have changed, it should be possible to sync only two files and leave the other two for the time being. So, some intelligent syncing mechanism for Linux is needed.
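
    The per-file accept/skip requirement sounds like unison, which does bidirectional, rsync-style delta transfers and, in interactive mode, asks about each changed path. A sketch with made-up roots (both machines need compatible unison versions installed):

        # interactive: review each difference and choose per file
        unison /data ssh://centos-b//data
        # unattended: take all non-conflicting changes automatically
        unison -batch /data ssh://centos-b//data

    For one-way, cron-driven sync, plain rsync -a over SSH would also cover the differential part, though without the pick-and-choose step.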

    Read the article

  • Create Virtual Image of Laptop before Formatting

    - by Simon Mark Smith
    I have a three-year-old laptop running Windows XP that I used for business. Although I have not used the laptop in over a year, I now want to re-commission it with Windows 7 and a fresh install. Before I do the fresh install I want to create a virtual image of the laptop that I can keep and potentially run on my desktop machine, should I ever need to access any of the old files/projects that it currently contains. I know that most people will say just copy the files over to your desktop, but my concern is the configuration of the laptop. I used to use it for development and it has older versions of Visual Studio, SQL Server, ActiveX controls, etc. than I currently use, so I really want to preserve the environment, not just the files. So really I am asking: what is the best toolset/method to achieve this? I understand there are free VM tools available, but I have never done this before and would appreciate any help.
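
    One route commonly used for exactly this (a suggestion, not from the original thread): Sysinternals Disk2vhd runs on the live XP machine and snapshots its disks into a VHD, which Windows Virtual PC or VirtualBox can boot and VMware can convert. Roughly:

        rem run on the laptop, writing to an attached external drive (E: assumed)
        disk2vhd.exe * E:\old-laptop.vhd

    Expect Windows activation to complain about the new virtual hardware, and keep the image alongside a plain file copy just in case.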

    Read the article

  • Windows Server Backup 2008 R2 - what is generating all the change data?

    - by bobjandal
    We have a small, relatively idle Windows Server 2008 R2 installation that does basic file sharing and Exchange for about 10 not very active users. When running a Windows Server Backup, the daily incremental data is about 20 GB. This is not coming from users' shared files, nor from changes in their mailbox sizes. The total size of the installation is 249 GB, which is mostly old files. Where is all this data coming from, and how can I reduce it? Online backup of the VHD file from the backup is taking a while because of this daily change. Is there some way I can at least see what files are changing and contributing to this data? Options I can think of but am not sure about:

        1) pagefile churning - although the backup does not include the pagefile, perhaps the changed blocks left behind are included?
        2) logs or something? But the installation size stays the same every day.
        3) should I zero free space using sdelete before backing up, perhaps?
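
    To at least see which files changed recently, a quick PowerShell sweep can help (a sketch; point it at each volume in the backup and adjust the window):

        # 50 largest files modified in the last day
        Get-ChildItem C:\ -Recurse -Force -ErrorAction SilentlyContinue |
            Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
            Sort-Object Length -Descending |
            Select-Object FullName, Length -First 50

    Note that this is file-level only: block churn inside large, constantly-open files (Exchange databases and their transaction logs, for instance) changes the backup delta without changing any file dates a scan like this would flag.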

    Read the article

  • Copying a Windows 8 Users folder with very long paths

    - by bilal.haider
    I was trying to move my "Users" folder in Windows 8 as described here and here. But when I try to copy the folder using xcopy in the Windows installation disk's repair mode, after some files are copied I get "insufficient memory". The files on which the error is given look like C:\Users\Bilal\Application Data\Application Data\Application Data.........Application Data\Application Data..... What is the point of such directories within directories? I also tried copying them using Mini Windows XP, but the problem was there too. I also tried copying using a Parted Magic live CD, but still no luck. So now, how can I move them? Another question: is moving such system files using Linux a good idea? Does it do anything to permissions?
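
    Those endless Application Data\Application Data\... paths are almost certainly not real data: each profile contains hidden "Application Data" junctions (kept for XP compatibility) that point back into the profile, and xcopy follows them in circles until the path length blows up. robocopy can skip junctions outright; a sketch (verify the flags suit a real profile move before relying on it):

        robocopy C:\Users D:\Users /MIR /XJ /COPYALL /R:1 /W:1
        rem /XJ      skip junction points (breaks the recursion)
        rem /COPYALL keep NTFS attributes, ACLs, and ownership

    Copying from Linux, by contrast, tends to lose NTFS ACLs, which matters a lot for profile folders.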

    Read the article

  • Cannot install NetBeans on Gentoo

    - by siebz0r
    I'm trying to install NetBeans on my Gentoo system and I just cannot get it to work. When I compile it, it fails because it cannot locate org.apache.maven.project.MavenProject:

        [nb-javac] class file for org.apache.maven.project.MavenProject not found

    I suspect it has something to do with the installation not being able to find several files; however, those are reported as warnings. The missing files have a version in their filenames, and the files that are on the disk have a version number that is slightly off, which explains why they cannot be found. The full output of the install can be found here: http://pastebin.com/43NS2ktz

    Read the article

  • rsync - How to exclude one .htaccess but not all of them

    - by Cory Gagliardi
    I have an rsync command for copying my files from dev to production. I don't want to copy the .htaccess file that's in the root of the HTML directory, but I do want to copy the few .htaccess files that are in its subdirectories. I'm using the argument --exclude .htaccess, which is stopping all of the .htaccess files from getting copied. The other arguments I'm including are -a --recursive --times --perms. Is it possible to configure rsync to do this? Edit: here is my full command:

        rsync -a --recursive --times --perms \
            --exclude prop_images --exclude tracking --exclude vtours \
            --exclude .htaccess --exclude .htaccess_backup --exclude "*~" \
            /home/user/dev_html/* /home/user/public_html/
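
    rsync patterns without a leading slash match at every depth; a leading slash anchors the pattern to the top of the transfer. A sketch of that change (note the source is given with a trailing slash instead of /*: with the original glob each top-level entry becomes its own transfer root, so the anchor would not mean what it should):

        rsync -a \
            --exclude "/.htaccess" \
            --exclude prop_images --exclude tracking --exclude vtours \
            --exclude .htaccess_backup --exclude "*~" \
            /home/user/dev_html/ /home/user/public_html/

    (-a already implies --recursive, --times, and --perms, so they are dropped here.)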

    Read the article

  • IIS's SMTP Pickup timing

    - by fatcat1111
    I have IIS's SMTP server set up as a closed relay, and it's working nicely. I also have an application that writes EML files. If the EML files are written to a temporary directory, then moved to the server's Pickup directory, email is sent as expected. However, if I have the application write the EML files directly to the Pickup directory, the email will often fail to send. This seems to be a race condition: the server starts processing the EML file as soon as it detects it in Pickup, even though the application hasn't completed writing it. The result is the server considers the EML to be malformed, and it punts it to Badmail. While I very much appreciate the server's earnestness, it seems that I need to dial it back a bit for this scenario. Does anybody know if IIS's SMTP server's polling frequency can be configured? I am using IIS7, Windows Server 2008 R2. The application that writes the EML cannot be modified.
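
    As far as I know there is no exposed polling-interval setting for the Pickup directory (it is change-notification driven), so the usual cure is the write-then-rename pattern: a rename on the same volume is atomic, and the server never sees a half-written file. Since the application itself cannot be modified, one workaround, if its output folder is at least configurable, is a small relay that moves files once they have stopped growing. A PowerShell sketch (the staging path and two-second quiet period are assumptions):

        $stage  = 'C:\inetpub\mailroot\Staging'   # app writes here instead
        $pickup = 'C:\inetpub\mailroot\Pickup'
        while ($true) {
            Get-ChildItem $stage -Filter *.eml |
                Where-Object { $_.LastWriteTime -lt (Get-Date).AddSeconds(-2) } |
                Move-Item -Destination $pickup    # same-volume move is atomic
            Start-Sleep -Seconds 1
        }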

    Read the article

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived, any corruption will hurt only a small number of files, whereas if you archive, let's say, all of your documents, even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system without being subject to data corruption?
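
    A middle ground worth knowing about (an editorial addition, not from the original question): leave the files unarchived and add parity data alongside them, e.g. with par2cmdline, so limited corruption in any file becomes repairable rather than fatal:

        par2 create -r10 documents.par2 *.doc   # 10% redundancy
        par2 verify documents.par2              # detect corruption
        par2 repair documents.par2              # repair from parity blocks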

    Read the article

  • Apache - building extensions with apxs

    - by Brian
    Hello, pardon the newbie question; I haven't worked with manually compiling Apache modules (or anything) before. I am trying to get the mod_concat module going. It seems simple enough: just download the mod_concat.c file and then run:

        apxs -c mod_concat.c

    This is new to me. Does it matter which directory I put mod_concat.c in before running this command? I ran it from my home directory, and I see some new files (mod_concat.la, mod_concat.lo, mod_concat.o, and mod_concat.slo) along with a new subfolder called .libs/ that contains mod_concat.so and some other files. I'm not sure where to go from here; I have a feeling these files were created in the wrong place. Don't I need mod_concat.so to be in my Apache modules directory with the rest? Thanks for the help, Brian
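
    The build directory doesn't matter; apxs -c only compiles in place. Adding -i installs the resulting .so into the server's module directory, and -a activates it by adding the LoadModule line to httpd.conf, so the usual one-shot is:

        apxs -c -i -a mod_concat.c

    (Depending on the distribution this may need root, and apxs itself may live in an apache2-dev or httpd-devel package.)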

    Read the article

  • Can't get powershell to return where results from GCI using ACL

    - by Rossaluss
    I'm trying to get PowerShell to list files in a directory that are older than a certain date and match a certain user. I've got the below script so far, which gives me all the files older than a certain date and lists the directory and who owns them:

        $date=get-date
        $age=$date.AddDays(-30)
        ls '\\server\share\folder' -File -Recurse |
            where {$_.lastwritetime -lt "$age"} |
            select-object $_.fullname,{(Get-ACL $_.FullName).Owner} |
            ft -AutoSize

    However, when I try to use an additional where parameter to select only files owned by a certain user, I get no results at all, even though I know I should, based on the match I'm trying to obtain (as below):

        $date=get-date
        $age=$date.AddDays(-30)
        ls '\\server\share\folder' -File -Recurse |
            where ({$_.lastwritetime -lt "$age"} -and {{(get-acl $_.FullName).owner} -eq "domain\user"}) |
            select-object $_.fullname,{(Get-ACL $_.FullName).Owner} |
            ft -AutoSize

    Am I missing something? Can I not use the get-acl command in a where condition as I've tried to? Any help would be appreciated. Thanks
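
    The second where wraps the two conditions in separate script blocks, so the -and is evaluated before Where-Object ever runs: the inner {(get-acl ...).owner} block is compared as a script block object against "domain\user", which is never equal, so the whole filter collapses to $false. Both tests need to live inside a single { } block, with the Get-Acl call as a plain expression. A corrected sketch:

        $age = (Get-Date).AddDays(-30)
        ls '\\server\share\folder' -File -Recurse |
            where { $_.LastWriteTime -lt $age -and
                    (Get-Acl $_.FullName).Owner -eq 'domain\user' } |
            select-object FullName, @{n='Owner'; e={(Get-Acl $_.FullName).Owner}} |
            ft -AutoSize

    (Select-Object also wants property names or hashtables rather than $_.fullname, which evaluates to nothing at that point in the pipeline.)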

    Read the article

  • Backup Source (non source control)?

    - by acidzombie24
    I back up my code with SVN. I have project files in there; however, I ignore selected things. I also ignore jpg, ogg, etc. Right now I would like to back up everything, but the zip result is 1 GB (I have a lot of code), and I know I can cut the file size down by 60%+. Is there an app I can use which will back up everything except the bin and obj folders, and perhaps keep ogg, json, and jpg files but ignore .svn and .pdb files?
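
    7-Zip's command line can express exactly this with recursive exclude switches (a sketch; the folder and pattern names are taken from the question):

        7z a backup.7z . -xr!bin -xr!obj -xr!.svn -xr!*.pdb

    -xr! excludes the named file or directory at any depth, so every bin/ and obj/ under the tree is skipped while the ogg/json/jpg files still go in.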

    Read the article

  • Google App Engine says "Must authenticate first." while trying to deploy any app

    - by Oleksandr Bolotov
    Google App Engine says "Must authenticate first." while trying to deploy any app:

        me@myhost /opt/google_appengine $ python appcfg.py update ~/sda2/workspace/lyapapam/
        Application: lyapapam; version: 1.
        Server: appengine.google.com.
        Scanning files on local disk.
        Scanned 500 files.
        Scanned 1000 files.
        Initiating update.
        Email: <my_email_was_here>@gmail.com
        Password for <my_email_was_here>@gmail.com:
        Error 401: --- begin server output ---
        Must authenticate first.
        --- end server output ---

    We are getting this message with any application and under any developer account available to us. This is what we have installed:

        App Engine SDK - 1.3.2
        PIL - 1.1.7
        Python - 2.5.5
        pip - 0.6.3
        ssl - 1.15
        wsgiref - 0.1.2

    So, what can it be? Is it a well-known problem?
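
    One frequently suggested fix for stale 401s from appcfg.py (a guess for this SDK version, not a confirmed cause): the tool caches login cookies in ~/.appcfg_cookies, and a stale entry there can make every deploy fail. Deleting it forces a fresh login, and --email makes the account explicit:

        rm ~/.appcfg_cookies
        python appcfg.py --email=<my_email_was_here>@gmail.com update ~/sda2/workspace/lyapapam/

    If the account uses two-step verification, an application-specific password may be needed at the password prompt.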

    Read the article

  • The specified module (mod_h264_streaming) could not be found (Apache2)?

    - by rphello101
    I'm trying to get mod_h264_streaming to work with my Apache2 server. I downloaded a precompiled version of the mod from here. I read here that all I have to do is extract the file to my modules folder, which I did, and add the following to httpd.conf, which I also did:

        LoadModule h264_streaming_module modules/mod_h264_streaming.so
        AddHandler h264-streaming.extensions .mp4

    However, I get this error when I restart Apache:

        Syntax error on line 173 of C:/Program Files (x86)/Apache Group/Apache2/conf/httpd.conf:
        Cannot load C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so into server:
        The specified module could not be found.
        Note the errors or messages above, and press the <ESC> key to exit. 26...

    This is even though the file exists right here: C:\Program Files (x86)\Apache Group\Apache2\modules\mod_h264_streaming.so. Can anyone tell me what I'm doing wrong?

    Read the article
