Search Results

Search found 29591 results on 1184 pages for 'psd into html'.

Page 689/1184

  • Creating a fallback error page for nginx when root directory does not exist

    - by Ruirize
    I have set up an any-domain config on my nginx server to reduce the amount of work needed when I open a new site/domain. This config allows me to simply create a folder in /usr/share/nginx/sites/ with the name of the domain/subdomain and then it just works.™

        server {
            # Catch all domains starting with only "www." and boot them to non "www." domain.
            listen 80;
            server_name ~^www\.(.*)$;
            return 301 $scheme://$1$request_uri;
        }

        server {
            # Catch all domains that do not start with "www."
            listen 80;
            server_name ~^(?!www\.).+;
            client_max_body_size 20M;

            # Send all requests to the appropriate host
            root /usr/share/nginx/sites/$host;
            index index.html index.htm index.php;

            location / {
                try_files $uri $uri/ =404;
            }

            recursive_error_pages on;
            error_page 400 /errorpages/error.php?e=400&u=$uri&h=$host&s=$scheme;
            error_page 401 /errorpages/error.php?e=401&u=$uri&h=$host&s=$scheme;
            error_page 403 /errorpages/error.php?e=403&u=$uri&h=$host&s=$scheme;
            error_page 404 /errorpages/error.php?e=404&u=$uri&h=$host&s=$scheme;
            error_page 418 /errorpages/error.php?e=418&u=$uri&h=$host&s=$scheme;
            error_page 500 /errorpages/error.php?e=500&u=$uri&h=$host&s=$scheme;
            error_page 501 /errorpages/error.php?e=501&u=$uri&h=$host&s=$scheme;
            error_page 503 /errorpages/error.php?e=503&u=$uri&h=$host&s=$scheme;
            error_page 504 /errorpages/error.php?e=504&u=$uri&h=$host&s=$scheme;

            location ~ \.(php|html) {
                include /etc/nginx/fastcgi_params;
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_intercept_errors on;
            }
        }

    However, there is one issue I'd like to resolve: when a request comes in for a domain that has no folder in the sites directory, nginx throws its own internal 500 error page, because /errorpages/error.php doesn't exist under the missing root either. How can I create a fallback error page that will catch these failed requests?
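    A minimal sketch of one possible fix, assuming the shared error handler is moved (or copied) under a directory that exists regardless of which domain was requested: the ^~ prefix location outranks the regex PHP location, so /errorpages/error.php is then always resolved against that fixed root instead of the missing per-domain one. The /usr/share/nginx/default path is a hypothetical name.

        # Serve the shared error handler from a fixed directory, not the per-domain root
        location ^~ /errorpages/ {
            root /usr/share/nginx/default;      # contains errorpages/error.php
            include /etc/nginx/fastcgi_params;
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_intercept_errors off;       # if this too fails, let nginx show its built-in page
        }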

    Read the article

  • Buying a new PC with fast hardware features

    - by Hooshkar
    I am a web designer/developer; I design and code "Premium WordPress Themes" and HTML websites. I wish to buy a new PC with fast hardware, and I will be installing virtual machines on it too. What kind of PC would be best for a professional web designer/developer? Tell me the specs, e.g. processor, motherboard, how much hard drive space, RAM, graphics card, etc. I don't really care about the price.

    Read the article

  • conditional mod_deflate based on headers

    - by Ben K.
    mod_deflate seems pretty sweet. I'd love to turn it on across the board for text/html, but for certain pages I don't want to gzip, since upstream proxies need to be able to inspect the content. I know there's an AddOutputFilterByType directive. Is there any way to combine that with a header check, so that if I see X-NO-COMPRESS: true I skip mod_deflate?
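    A minimal sketch of one way this is often handled, assuming mod_setenvif is loaded and that X-NO-COMPRESS arrives as a request header: mod_deflate skips any request on which the no-gzip environment variable is set, so the header can simply be mapped onto that variable.

        # Compress HTML as usual
        AddOutputFilterByType DEFLATE text/html

        # Skip compression when the request carries "X-NO-COMPRESS: true"
        SetEnvIfNoCase X-NO-COMPRESS true no-gzip dont-vary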

    Read the article

  • How do I turn on basic HTTP-auth for a page in jboss?

    - by Electrons_Ahoy
    I'm setting up a JBoss server for testing some Java code that talks to HTTP servers. That's pretty easy. However, one of the things I'm testing is interfacing with classic "old-school" HTTP-Auth protected pages, and for the life of me I can't figure out how to turn that on in JBoss (and my google-fu seems to have let me down). So, how do I add a basic username and password to a single HTML (or JSP) file in JBoss using HTTP Basic Access Authentication?
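    A minimal sketch of the standard servlet-spec approach, assuming the page is part of a deployed web application: declare a <security-constraint> and a BASIC <login-config> in the app's WEB-INF/web.xml. The role, realm and page names below are hypothetical, and the actual users and roles still have to be wired up in the JBoss security-domain (login module) configuration.

        <!-- WEB-INF/web.xml: require BASIC auth for one page -->
        <security-constraint>
            <web-resource-collection>
                <web-resource-name>Protected page</web-resource-name>
                <url-pattern>/secret.html</url-pattern>
            </web-resource-collection>
            <auth-constraint>
                <role-name>tester</role-name>
            </auth-constraint>
        </security-constraint>

        <login-config>
            <auth-method>BASIC</auth-method>
            <realm-name>TestRealm</realm-name>
        </login-config>

        <security-role>
            <role-name>tester</role-name>
        </security-role>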

    Read the article

  • Win7: modifying incoming HTTP packets from a specific URL automatically

    - by xeross
    Hey, is there an application that can listen in on my PC's HTTP traffic (preferably per process) and modify packets that were requested from a certain URL? So let's say every time I request http://example.tld/test.html it would replace any occurrence of, let's say, "i" with "I". It's a simple example, but still, it's an example. Thanks for your time, Xeross

    Read the article

  • Nginx Rewrite Rule For File Within Folder Not Working

    - by user3620111
    Good evening everyone, or possibly early morning if you are in my neck of the woods. My problem seems trivial, but after several hours of testing, researching and fiddling I can't get this simple nginx rewrite to work. There are several rewrites we need, some with multiple parameters, but I can't even get this simple one-parameter URL to change at all.

    Current: website.com/public/viewpost.php?id=post-title
    Desired: website.com/public/post/post-title

    Can someone kindly point out what I have done wrong? I am baffled / very tired... For testing purposes before we launch we are just using a simple port on the server. Here is that section:

        # Listen on port 7774 for dev test
        server {
            listen 7774;
            server_name localhost;

            root /usr/share/nginx/html/paa;
            index index.php home.php index.html index.htm /public/index.php;

            location ~* /uploads/.*\.php$ {
                if ($request_uri ~* (^\/|\.jpg|\.png|\.gif)$ ) { break; }
                return 444;
            }

            location ~ \.php$ {
                try_files $uri @rewrite =404;
                fastcgi_index index.php;
                include fastcgi_params;
                fastcgi_pass php5-fpm-sock;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_intercept_errors on;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            }

            location @rewrite {
                rewrite ^/viewpost.php$ /post/$arg_id? permanent;
            }
        }

    I have tried countless variations, such as the @rewrite above, and something simpler:

        location / {
            rewrite ^/post/(.*)$ /viewpost.php?id=$1 last;
        }

        location ~ \.php$ {
            try_files $uri =404;
            fastcgi_index index.php;
            include fastcgi_params;
            fastcgi_pass php5-fpm-sock;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_intercept_errors on;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }

    I cannot seem to get anything to work at all; I have tried changing the location and tried multiple rules... Please tell me what I have done wrong. Pause for facepalm. [Relocated from Stack Overflow as per mod suggestion]
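    A minimal sketch of one direction this is often taken, assuming the goal is that visitors use the pretty URL and nginx maps it back onto viewpost.php internally (the /public/post/ prefix is an assumption based on the paths in the question):

        # Map /public/post/<title> onto /public/viewpost.php?id=<title> internally
        location ^~ /public/post/ {
            rewrite ^/public/post/(.+)$ /public/viewpost.php?id=$1 last;
        }

    The permanent rewrite in the @rewrite block above goes in the opposite direction; pushing browsers that still request the .php form over to the pretty URL would need a separate external redirect keyed on $arg_id.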

    Read the article

  • Printing without breaking across pages

    - by jedberg
    I have a page with a bunch of HTML tables that I need to print. Each table should fit on one page, but both Firefox and Safari want to break them over multiple pages and put more than one on a page. Is there any way to force it to not break the tables across pages? I can use any Mac program to print. Thanks.
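    A minimal sketch of the usual CSS approach, assuming the tables can be targeted directly (the bare table selector is an assumption; a class on the tables would work the same way):

        /* Print stylesheet: ask the browser not to split a table across pages */
        @media print {
            table {
                page-break-inside: avoid;
                break-inside: avoid;   /* newer equivalent property */
            }
        }

    Note that a table taller than one printed page will still be split; the hint only prevents breaks where the whole table could have fit on the next page.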

    Read the article

  • Why is my htaccess file preventing access to my MP3 file?

    - by Andrew
    My Zend Framework application has a public directory which contains an htaccess file. If the file isn't found in the public directory, it routes the request through the application. I have an MP3 file within my public directory, but the htaccess file is routing the request through the application! Do you see anything wrong with my htaccess file?

        AddDefaultCharset utf-8
        RewriteEngine on
        RewriteRule ^Resources/.* - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule !\.(js|ico|gif|jpg|png|css|htm|html|php|pdf|doc|txt|swf|xml|mp3)$ /index.php [NC]

    Read the article

  • pretty-printing IP packets

    - by pts
    I'm receiving IP packets using the SLIP protocol, and I'd like to pretty-print them similarly to how tcpdump does it. My program is able to decode the SLIP protocol and produce a single string containing an IP packet if necessary. I couldn't find any relevant tcpdump command-line flags except for -r. The pcap file format it reads is documented at http://www.tcpdump.org/pcap/pcap.html, but it looks a bit too complicated. Is there a Linux tool for pretty-printing raw IP packets?
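    A minimal sketch of one workaround, assuming the Wireshark command-line tools are installed: dump the decoded packet as hex, let text2pcap wrap it in a pcap header with the raw-IP link type, and hand the result to tcpdump -r (packet.bin and packet.pcap are hypothetical file names).

        # packet.bin holds one decoded IP packet as raw bytes
        od -Ax -tx1 -v packet.bin > packet.hex      # hex dump in the format text2pcap expects
        text2pcap -l 101 packet.hex packet.pcap     # link type 101 = LINKTYPE_RAW (raw IP, no Ethernet header)
        tcpdump -nn -vv -r packet.pcap              # pretty-print it like a capture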

    Read the article

  • Where does Chrome store its bookmarks in Ubuntu 11.10?

    - by Alan Wood
    I looked at all the other posts on this but can't find the directory mentioned (~/.config/google-chrome/Default/Bookmarks; it's a JSON file). Being a two-day newbie to Ubuntu/Linux, I would like to know if the location has changed in the latest version or, if not, how I can locate the directory indicated. I have logged in as root and searched for the folder and can't find it, although I imported my bookmarks from an HTML file, so I know that they must be saved somewhere.
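    A minimal sketch of how the file can be found from a terminal, assuming Chrome (or Chromium) has been run at least once by the regular user. The directory is hidden, so a file browser will only show it with hidden files enabled (Ctrl+H in Nautilus), and a search can miss it if it skips hidden paths under /home.

        # Chrome keeps per-user data under the (hidden) ~/.config directory
        ls ~/.config/google-chrome/Default/Bookmarks      # Google Chrome
        ls ~/.config/chromium/Default/Bookmarks           # Chromium build

        # Or search the whole home directory for it
        find ~ -name Bookmarks -path '*chrom*' 2>/dev/null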

    Read the article

  • Apache Virtual host points to main domain

    - by user37143
        Listen 80
        ServerName www.mydomain.com:80
        DocumentRoot "/www/tomcat/webapps"
        Options Indexes FollowSymLinks
        Order allow,deny
        Allow from all
        Options ExecCGI

        NameVirtualHost *:80
        ServerName blog.mydomain.com
        DocumentRoot /www/blog
        DirectoryIndex index.php index.html
        Options All
        AllowOverride All
        Allow from all

    In ssl.conf I have:

        Listen 443

    Now if I access mydomain.com or blog.mydomain.com, both are forwarded to /www/tomcat/webapps. Any idea where I went wrong? I have compiled Apache 2 from source. Should I add a virtual host for mydomain.com too? Thanks, Anpl
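    With name-based virtual hosting, the first matching <VirtualHost> acts as the default for every request none of the others claim, so the usual pattern is one <VirtualHost *:80> block per hostname, including the main domain. A minimal sketch, reusing the paths from the question (the <Directory> permission blocks are left out for brevity):

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName www.mydomain.com
            ServerAlias mydomain.com
            DocumentRoot /www/tomcat/webapps
        </VirtualHost>

        <VirtualHost *:80>
            ServerName blog.mydomain.com
            DocumentRoot /www/blog
            DirectoryIndex index.php index.html
        </VirtualHost>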

    Read the article

  • Transparently rewrite requests to a subdomain.

    - by ptrin
    I would like to rewrite requests to http://www.mysite.com/foo to http://foo.mysite.com without the user's address bar changing. Using IIRF I can do the rewrite, but only if I use the [R] modifier flag, which makes the rewrite a redirect. Is there a way for me to transparently rewrite requests to a subdomain? Here's the rewrite rule I've been testing with:

        RewriteRule ^/foo/(.*)?$ http://foo.mysite.com/index.html?$1 [R,L]

    Read the article

  • gzip compression good or bad?

    - by WarDoGG
    I have a server that currently does a lot of processing in my application, and the target users are those who have a very good internet connection. The output sent from the server is always text/html, and we do not use any media (audio/video), only images (static site images like the logo, etc.). We are experiencing severe performance issues, and I wonder whether I should turn off gzip/mod_deflate on the server so that it avoids compressing the output. Will this improve performance?
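    If it seems worth measuring, a quick way to toggle it, assuming a Debian/Ubuntu-style Apache layout (on other layouts the equivalent is commenting out mod_deflate's LoadModule or its AddOutputFilterByType lines):

        # Disable mod_deflate, benchmark, then re-enable it
        sudo a2dismod deflate
        sudo service apache2 restart

        # ...measure page times and CPU load...

        sudo a2enmod deflate
        sudo service apache2 restart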

    Read the article

  • Nginx SSL redirect for one specific page only

    - by jjiceman
    I read and followed this question in order to configure nginx to force SSL for one page (admin.php for XenForo), and it is working well for a few of the site administrators but not for me. I was wondering if anyone has any advice on how to improve this configuration:

        ...
        ssl_certificate example.net.crt;
        ssl_certificate_key example.key;

        server {
            listen 80 default;
            listen 443 ssl;
            server_name www.example.net example.net;

            access_log /srv/www/example.net/logs/access.log;
            error_log /srv/www/example.net/logs/error.log;

            root /srv/www/example.net/public_html;
            index index.php index.html;

            location / {
                if ( $scheme = https ) {
                    rewrite ^ http://example.net$request_uri? permanent;
                }
                try_files $uri $uri/ /index.php?$uri&$args;
                index index.php index.html;
            }

            location ^~ /admin.php {
                if ( $scheme = http ) {
                    rewrite ^ https://example.net$request_uri? permanent;
                }
                try_files $uri /index.php;
                include fastcgi_params;
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_param HTTPS on;
            }

            location ~ \.php$ {
                try_files $uri /index.php;
                include fastcgi_params;
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_param HTTPS off;
            }
        }
        ...

    It seems that the extra information in the location ^~ /admin.php block is unnecessary; does anyone know of an easy way to avoid duplicate code? Without it, nginx skips the php block and just returns the raw php files. Currently it applies https correctly in Firefox when I navigate to admin.php. In Chrome, it downloads the admin.php page. When returning to the non-https site in Firefox, it does not correctly return to http but stays on SSL. Like I said earlier, this only happens for me; the other admins can go back and forth without a problem. Is this an issue on my end that I can fix? And does anyone know of any ways I could reduce duplicate configuration options? Thanks in advance!

    EDIT: Clearing the cache / cookies seemed to work. Is this the right way to do http/https redirection? I sort of made it up as I went along.
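    On the duplication question, a minimal sketch of one common pattern, assuming the shared FastCGI lines are moved into their own file (the /etc/nginx/php_fastcgi.conf name is hypothetical); each location then only states what actually differs, such as the HTTPS parameter:

        # /etc/nginx/php_fastcgi.conf (shared snippet)
        include fastcgi_params;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;

        # In the server block:
        location ^~ /admin.php {
            if ( $scheme = http ) { rewrite ^ https://example.net$request_uri? permanent; }
            try_files $uri /index.php;
            include /etc/nginx/php_fastcgi.conf;
            fastcgi_param HTTPS on;
        }

        location ~ \.php$ {
            try_files $uri /index.php;
            include /etc/nginx/php_fastcgi.conf;
            fastcgi_param HTTPS off;
        }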

    Read the article

  • URL rewrite rule

    - by vvr
    How do I redirect a page from show.php?id=(15-char string) to show/(15-char string)? I tried the rule below, but it does the reverse: it rewrites /show/(15 chars) to show.php?id=(15 chars).

        RewriteEngine on
        RewriteRule ^/show/([a-zA-Z0-9]{15})$ http://site.com/show.php?id=$1

    The second case is that I have to redirect to another page if &m=true is added to the URL, i.e. show.php?id=(15chars)&m=true should go to html/show.php?id=(15chars).
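    A minimal sketch of the usual two-part pattern for the first case, assuming an .htaccess in the site root: an external redirect sends browsers from the query-string form to the pretty URL, and an internal rewrite maps the pretty URL back onto show.php (the THE_REQUEST condition keeps the pair from looping):

        RewriteEngine on

        # 1) Browser asked for show.php?id=XXXXXXXXXXXXXXX -> redirect to /show/XXXXXXXXXXXXXXX
        RewriteCond %{THE_REQUEST} \s/show\.php\?id=([a-zA-Z0-9]{15})\s [NC]
        RewriteRule ^show\.php$ /show/%1? [R=301,L]

        # 2) /show/XXXXXXXXXXXXXXX -> internally served by show.php?id=XXXXXXXXXXXXXXX
        RewriteRule ^show/([a-zA-Z0-9]{15})$ show.php?id=$1 [L]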

    Read the article

  • Ubuntu server users question

    - by Camran
    I have read this article: https://help.ubuntu.com/9.04/serverguide/C/user-management.html but it doesn't go into depth on the privileges section. I need to know how to set privileges for myself as a user. I am the only user, but I want access to everything without managing my VPS logged in as root, so I am creating a username. Does anybody have a list of privileges, what they mean and how to set them? Thanks
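    A minimal sketch of the usual setup, assuming the new account is called camran (a placeholder): create the user, add it to the administrative group so it can run root-level commands through sudo, and then stop logging in as root. On Ubuntu 9.04 the administrative group is admin; newer releases use the sudo group instead.

        # As root (or with sudo):
        adduser camran              # creates the user, home directory and password
        usermod -aG admin camran    # members of 'admin' may run any command via sudo (use the 'sudo' group on newer releases)

        # Fine-grained privileges live in /etc/sudoers; edit it safely with:
        visudo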

    Read the article

  • Rule of thumb for RAM estimates for static pages? [closed]

    - by IMB
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Web Sites

    I've seen tutorials saying they can run decent websites on 64MB of RAM (Debian/Lighttpd/PHP/MySQL), but it's not clearly defined how many hits or how much traffic a "decent" site gets. Is there a rule of thumb for how much RAM a web server needs? To keep things simple, let's say you're running a site with static content and it's averaging 100,000 hits per hour (HTML + images combined, no MySQL). How much RAM is the minimum requirement for that?
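    As a rough back-of-envelope reading of those numbers (an assumption, not a benchmark): 100,000 hits per hour is only about 28 requests per second, and an event-driven server such as lighttpd or nginx keeps on the order of a few kilobytes of state per open connection, so at that rate the deciding factor is usually whether the static files fit in the OS page cache rather than the request rate. A site whose HTML and images total a few tens of megabytes can sit comfortably inside 64MB; it is PHP and MySQL, once added, that dominate the memory budget.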

    Read the article

  • Updated script for downloading from YouTube

    - by asksuperuser
    I'm not looking for software or a site to download from YouTube, but for an open-source script in bash (or any other language) that is kept up to date, since YouTube often changes the download URL. I've found these, but they seem deprecated: http://linux.byexamples.com/archives/302/how-to-wget-flv-from-youtube/ and http://www.daniweb.com/forums/thread104419.html
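    One actively maintained open-source option (a suggestion, not something from the question) is youtube-dl, a self-updating Python script that tracks YouTube's URL changes. A minimal sketch of typical usage:

        # Install the script and make it executable
        sudo wget https://yt-dl.org/downloads/latest/youtube-dl -O /usr/local/bin/youtube-dl
        sudo chmod a+rx /usr/local/bin/youtube-dl

        # Download a video in the best available single format
        youtube-dl -f best 'https://www.youtube.com/watch?v=VIDEO_ID'

        # Pull in the latest extractor code when YouTube changes things
        youtube-dl -U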

    Read the article

  • best way to save a web page

    - by Remus Rigo
    Hi all, I have tried many ways/software to save a web page (HTML, MHT, DOC, PDF). My favorite software so far was a browser add-on from OmniPage (OCR). What I like about it is that it prints the whole page continuously and doesn't write the HTTP path and page number in the footer of every page, which I find very annoying. Does anyone know of software like this one (freeware or not)? PS: I tried CutePDF.
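    Two command-line possibilities, offered as a sketch rather than something from the question: wget can save a page together with everything it needs for offline viewing, and wkhtmltopdf renders a page to a single continuous PDF without the browser's URL/page-number footer (example.com/article.html is a placeholder URL).

        # Save the page plus its images, CSS and scripts, rewriting links for offline use
        wget --page-requisites --convert-links --adjust-extension --span-hosts \
             --no-directories --directory-prefix=saved-page 'http://example.com/article.html'

        # Or render the page to one PDF
        wkhtmltopdf 'http://example.com/article.html' article.pdf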

    Read the article
