Search Results

  • 404 Error on a file that exists?

    - by Abs
    Hello all, a script makes a GET request to my URL like so:

        http://mydomain.com/cgi-bin/uu_ini_status_audios.pl?tmp_sid=b742be1d131c4d32237a9f1fcdca659e&rnd_id=0.2363453360320319

    However, I get a 404 returned straight away: "The requested URL /cgi-bin/uu_ini_status_audios.pl was not found on this server." But that script exists on my server, I can see the file! It has the correct permissions (I gave it 777 to be sure), it is owned by my apache user, and it is in the apache group. What am I missing? Thanks for any help on this!

    Update: I thought it would have been an htaccess (rewrite) issue, but I don't think so anymore. I tried putting an index.php file in there and accessing it via my URL, but I can't even do that. I tried http://mydomain.com/cgi-bin/index.php and got the same 404 error. I get this in my error log:

        [Tue Sep 14 14:42:49 2010] [error] [client xx.xxx.xx.xxx] script not found or unable to stat: /var/www/vhosts/mydomain.com/cgi-bin

    Access log:

        xx.xxx.xx.xxx - - [14/Sep/2010:14:48:25 +0200] "GET /cgi-bin/index.php HTTP/1.1" 404 475 "-" "Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.9) Gecko/20100824 Firefox/3.6.9 (.NET CLR 3.5.30729)"

    Update 2: My htaccess file:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule ^blog/ - [L]
        RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ [NC]
        RewriteRule ^(.*)$ http://mydomain.com/$1 [R=301,L]
        RewriteRule ^search/(.*)/(.*)/(.*)/(.*) /search.php?searchfor=$1&sortby=$2&page=$3&searchterm=$4
        RewriteRule ^confirmemail/(.*) /confirmemail.php?code=$1
        RewriteRule ^resetpassword/(.*) /resetpassword.php?code=$1
        RewriteRule ^resendconfirmation/(.*) /resendconfirmation.php?userid=$1
        RewriteRule ^categories/ /categories.php
        RewriteRule ^([-_~*a-zA-Z0-9]+)(\/)?$ /memberprofile.php?username=$1
        RewriteRule ^browse/audios/(.*)/(.*)/(.*)/(.*) /audios.php?sortby=$1&filter=$2&page=$3&title=$4
        RewriteRule ^browse/categories/audios/(.*)/(.*)/(.*)/(.*) /categoryaudios.php?sortby=$1&filter=$2&page=$3&title=$4
        RewriteRule ^audios/(.*)/(.*) /playaudio.php?audioid=$1&title=$2
        RewriteRule ^download/audio/(.*)/(.*) /downloadaudio.php?AUDIOID=$1&title=$2
        RewriteRule ^members/audios/(.*)/(.*) /memberaudios.php?pid=$1&username=$2
        RewriteRule ^syndicate/audios/(.*)/(.*) /syndicateaudios.php?filter=$1&title=$2
        </IfModule>

    Update 3:

        [root@mydomain ~]# ls -la /var/www/vhosts/mydomain.com/httpdocs/cgi-bin/
        total 60
        drwxr-xr-x  3 apache root     4096 Sep 14 14:37 .
        drwxr-x--- 20 som    psaserv  4096 Sep 14 14:40 ..
        drwxr-xr-x  2 apache root     4096 Sep  7 03:01 configs
        -rwxrwxrwx  1 apache root        4 Sep 14 14:37 index.php
        -rwxrwxrwx  1 apache apache   6520 Sep  7 03:01 uu_ini_status_audios.pl
        -rwxr-xr-x  1 apache root     3215 Sep  7 03:01 uu_lib_audios.pl
        -rwxr-xr-x  1 apache root    30249 Sep  7 03:01 uu_upload_audios.pl
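
    One detail in the error log stands out: Apache is trying to stat /var/www/vhosts/mydomain.com/cgi-bin, without the httpdocs component, which suggests a ScriptAlias pointing at a directory that does not exist. A minimal sketch of what the vhost would need, with the path taken from the ls output above (the actual config file location depends on the hosting panel, so this is an assumption, not the asker's real config):

        # Hypothetical vhost fragment; point ScriptAlias at the
        # directory that actually holds the scripts
        ScriptAlias /cgi-bin/ /var/www/vhosts/mydomain.com/httpdocs/cgi-bin/
        <Directory "/var/www/vhosts/mydomain.com/httpdocs/cgi-bin">
            Options +ExecCGI
            AddHandler cgi-script .pl
            Order Allow,Deny
            Allow from all
        </Directory>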

  • How do I delete a route in OS X 10.5

    - by authormichael-olsen-craig
    I somehow configured my Mac to route all requests for a particular hostname (sample.com) to the loopback address (127.0.0.1). Now I'm trying to remove this, but can't determine where to do it. There is no entry for it under /etc/hosts. The routing table shows that it is mapping the hostname to the IP address of the Mac. Routing table output below:

        Internet:
        Destination        Gateway            Flags    Refs      Use  Netif  Expire
        default            192.168.2.1        UGSc        4        1  en0
        127                sample.com         UCS         0        0  lo0
        sample.com         sample.com         UH          1     7093  lo0
        169.254            link#4             UCS         0        0  en0
        192.168.2          link#4             UCS         6        0  en0
        192.168.2.1        0:11:22:22:3f:fa   UHLW       20    55565  en0    1070
        192.168.2.15       tsema.org          UHS         0        9  lo0
        192.168.2.255      link#4             UHLWb       4    84777  en0

    Any help would be greatly appreciated!
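
    A sketch of how a host route like this can be removed with the route utility (destination and gateway taken from the table above), plus a resolver cache flush in case the name mapping is cached:

        # Delete the host route that maps sample.com to loopback
        sudo route -n delete sample.com 127.0.0.1
        # Verify it is gone
        netstat -rn | grep sample
        # Flush the directory service cache (10.5 ships dscacheutil)
        sudo dscacheutil -flushcache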

  • How to configure sendmail to relay local user mail to a public host?

    - by Chau Chee Yang
    I am using Linux/Fedora's sendmail as my mail server. The server does not have a public domain name; it connects to the Internet via dial-up. There are a few users on the server. I have successfully configured my sendmail to relay mail to public hosts (via smart_host):

        # mail <user>@gmail.com

    user@gmail.com receives mail from this private host. However, if I send mail to a local user (without a domain name):

        # mail <user>

    all such mail is delivered to my server's mail spool (/var/spool/mail). I wish all mail sent to a local user could instead relay to a public domain that I have registered. Is that possible with sendmail? That is, "mail user1" would send mail to [email protected], and "mail user2" would send mail to [email protected].
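
    Sendmail's LOCAL_RELAY feature is the usual knob for this: mail addressed to unqualified (local) recipients is handed to another host instead of the local spool. A minimal sketch for sendmail.mc, where the relay host name is an assumption:

        dnl # Relay mail for unqualified local recipients to the
        dnl # public domain's mail server instead of /var/spool/mail
        define(`LOCAL_RELAY', `mail.example.com')dnl
        dnl # Rebuild the config and restart sendmail afterwards:
        dnl #   m4 /etc/mail/sendmail.mc > /etc/mail/sendmail.cf
        dnl #   service sendmail restart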

  • Logging into a local StatusNet instance on Apache causes the browser to download a file

    - by DilbertDave
    I've installed StatusNet 0.9.1 on a Windows Server via the WAMP stack and on the whole it seems to be fine. However, when logging in using IE7 or Chrome the browsers invoke a file download, i.e. the File Download dialog is displayed. In IE7 the file is called "notice" with the content below (some parts starred out):

        <?xml version="1.0" encoding="UTF-8"?>
        <OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
            <ShortName>Mumble Notice Search</ShortName>
            <Contact>david.carson@*****.com</Contact>
            <Url type="text/html" method="get" template="http://voice.*****.com/mumble/search/notice?q={searchTerms}"></Url>
            <Image height="16" width="16" type="image/vnd.microsoft.icon">http://voice.*****.com/mumble/favicon.ico</Image>
            <Image height="50" width="50" type="image/png">http://voice.******.com/mumble/theme/cloudy/logo.png</Image>
            <AdultContent>false</AdultContent>
            <Language>en_GB</Language>
            <OutputEncoding>UTF-8</OutputEncoding>
            <InputEncoding>UTF-8</InputEncoding>
        </OpenSearchDescription>

    In Chrome (Linux and Windows!) the file is called "people" and contains similar XML. This is not an issue when logging in using Firefox. This is obviously a configuration issue but I'm not having much luck tracking it down. I tested the previous version of StatusNet on an Ubuntu Server VM on our network and it worked fine for months. Thanks in advance.
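
    A useful first datum is the exact Content-Type header the WAMP stack attaches to those responses, since that is what IE and Chrome base the download decision on; comparing it with what Firefox receives narrows the problem to server config versus client handling. A sketch, with the host name standing in for the starred-out one above:

        # Inspect the response headers for a page that triggers the download
        curl -sI http://voice.example.com/mumble/main/login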

  • Set Postfix to send email but not to receive it

    - by CodeShining
    I'm using Google Apps to handle the personal email addresses for my domain name, and I set up the DNS as Google suggests. All works fine. Now, since I need an SMTP server to send emails from my e-commerce site, I installed Postfix on the server. It works fine when I send email to any outside address, but it fails for addresses on the same domain. Say my domain is example.com and I set Postfix up with example.com: if I try to reset a password using myaccount@example.com, Postfix doesn't send the message and instead reports in mail.log:

        Sep 20 01:09:52 ip-10-54-26-162 postfix/pickup[6809]: B09A3415D8: uid=33 from=<www-data>
        Sep 20 01:09:52 ip-10-54-26-162 postfix/cleanup[6854]: B09A3415D8: message-id=<20120920010952.B09A3415D8@ip-10-54-26-162.eu-west-1.compute.internal>
        Sep 20 01:09:52 ip-10-54-26-162 postfix/qmgr[30978]: B09A3415D8: from=<[email protected]>, size=4234, nrcpt=1 (queue active)
        Sep 20 01:09:52 ip-10-54-26-162 postfix/local[6856]: B09A3415D8: to=<[email protected]>, relay=local, delay=0.01, delays=0.01/0/0/0, dsn=5.1.1, status=bounced (unknown user: "myaccount")

    Of course it cannot find a local user "myaccount", since that account is on Google Apps. How can I tell Postfix to send the email out and not search for a local user?
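
    The log line "relay=local" shows Postfix considers example.com one of its own final destinations, so it looks for a Unix account instead of sending the mail out. The standard fix is to remove the domain from mydestination, so mail for example.com is routed via its MX records (i.e. to Google Apps) like mail for any other remote domain. A minimal sketch, assuming defaults elsewhere in main.cf:

        # /etc/postfix/main.cf
        # List only truly local names here; do NOT include example.com
        mydestination = localhost.$mydomain, localhost
        # Outgoing mail can still identify itself as the domain
        myorigin = example.com

    Then reload with "postfix reload" and re-test.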

  • Turn off gzip for a location in Nginx

    - by Nyxynyx
    How can gzip be turned off for a particular location and all its sub-directories? My main site is at http://mydomain.com and I want to turn gzip off for both http://mydomain.com/foo and http://mydomain.com/foo/bar. gzip is turned on in nginx.conf. I tried turning it off as shown below, but the Response Headers in Chrome's dev tools still show Content-Encoding: gzip. How should gzip/output buffering be disabled properly? My attempt:

        server {
            listen 80;
            server_name www.mydomain.com mydomain.com;

            access_log /var/log/nginx/access.log;
            error_log /var/log/nginx/error.log;

            root /var/www/mydomain/public;
            index index.php index.html;

            location / {
                gzip on;
                try_files $uri $uri/ /index.php?$args;
            }

            location ~ \.php$ {
                fastcgi_pass unix:/var/run/php5-fpm.sock;
                fastcgi_index index.php;
                include fastcgi_params;
                fastcgi_read_timeout 300;
            }

            location /foo/ {
                gzip off;
                try_files $uri $uri/ /index.php?$args;
            }
        }
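
    One plausible reason the gzip off never takes effect: "try_files ... /index.php?$args" performs an internal redirect, so the request is ultimately answered by the "location ~ \.php$" block, which still inherits gzip on from the outer scope. A sketch of one way around it, keeping PHP handling inside the /foo location so its gzip setting applies (untested, and it assumes everything under /foo is served by index.php):

        location /foo/ {
            gzip off;
            try_files $uri @foo_php;
        }
        location @foo_php {
            gzip off;
            fastcgi_pass unix:/var/run/php5-fpm.sock;
            include fastcgi_params;
            fastcgi_param SCRIPT_FILENAME $document_root/index.php;
        }

    If the header still shows gzip after that, check whether PHP itself is compressing (zlib.output_compression in php.ini), since nginx passes that through regardless of its own gzip setting.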

  • Nagios Check_hpjd giving me problems!

    - by Mister IT Guru
    When I run the following:

        [root@host plugins]# ./check_hpjd -H printer1.mydomain.com
        Timeout from host printer1.mydomain.com

    I have net-snmp installed on my system. I noticed that I didn't have net-snmp-utils installed; after installing it I was able to run:

        [root@host plugins]# snmpwalk -Os -c public -v 1 printer1.mydomain.com system
        sysDescr.0 = STRING: HP ETHERNET MULTI-ENVIRONMENT
        sysObjectID.0 = OID: enterprises.11.2.3.9.1
        sysUpTimeInstance = Timeticks: (325408663) 37 days, 15:54:46.63
        sysContact.0 = STRING:
        sysName.0 = STRING: printer1
        sysLocation.0 = STRING:
        sysServices.0 = INTEGER: 72

    So I know that the printer is working as expected, as far as SNMP is concerned. But when I run:

        [root@host plugins]# ./check_hpjd -H printer1.mydomain.com -C public
        Error in packet ()

    I get this error. From what I've tried so far, I know my host can communicate via SNMP and the printer responds via SNMP, so I guess I'm left to look at the plugin, which I will be checking up on. I'm new to SNMP and investigating this with my good friend Google, but I'm on a learning curve here, so please forgive my questions if they sound basic.
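
    Since generic SNMP works, a reasonable next step is to walk the HP JetDirect enterprise subtree that check_hpjd polls; the sysObjectID above (enterprises.11.2.3.9.1) confirms the device identifies as a JetDirect. "Error in packet" frequently means the agent answered noSuchName for one of the requested OIDs, so seeing which OIDs the agent actually exposes narrows it down:

        # Walk the JetDirect subtree (enterprises.11.2.3.9) and compare
        # what the agent exposes with what the plugin asks for
        snmpwalk -Os -c public -v 1 printer1.mydomain.com .1.3.6.1.4.1.11.2.3.9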

  • Setting up DNS in WHM/cPanel

    - by Jon Furmanski
    I don't understand what I'm doing wrong, but I'm sure this is a simple fix. I set up WHM/cPanel for the first time on my VPS and understand how DNS works, for the most part (or so I thought). Under the main domain name I created two nameservers (ns1.maindomain.com and ns2.maindomain.com). I have two IP addresses for my server, so each one points to a unique IP:

        ns1.maindomain.com => 198.x.x.204
        ns2.maindomain.com => 198.x.x.205

    I also set up reverse DNS with my hosting provider. When I enter my two nameservers for another (secondary) domain, GoDaddy states that the nameservers are invalid. Any ideas on why this is, or any configuration in cPanel that needs to be made?
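
    Registrars generally validate custom nameservers against the registry rather than against your zone, so the usual missing piece is registering ns1/ns2 as host records (glue) at the registrar of maindomain.com, not just creating A records in cPanel. A quick sanity check from a shell, with the names from above:

        # Both should return the expected IPs from the public DNS tree;
        # if they do but GoDaddy still refuses them, the host/glue
        # records at maindomain.com's registrar are likely missing
        dig +short ns1.maindomain.com A
        dig +short ns2.maindomain.com A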

  • Downloading multiple files with wget and handling parameters

    - by coure2011
    How can I download multiple files using wget? I also want to rename the files. Here are the commands I'm running one by one (copy/paste into the terminal):

        wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812720774/PS11.rar -O part11.rar
        wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812721094/PS12.rar -O part12.rar
        wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812720804/PS13.rar -O part13.rar
        wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812720854/PS14.rar -O part14.rar
        ... and so on.

    What can I do to download all these files one by one?
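
    Since each URL carries its own file ID, a plain loop over a list of URL/name pairs keeps the downloads sequential; a sketch, where the list file name is an assumption:

        # files.txt holds one "URL output-name" pair per line, e.g.
        #   http://www.filesonic.com/file/812720774/PS11.rar part11.rar
        while read -r url name; do
            wget -c --load-cookies cookies.txt "$url" -O "$name"
        done < files.txt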

  • DKIM-Filter: No Signature Data

    - by Vineet Sharma
    I have installed dkim-filter on Postfix after reading this tutorial: http://www.unibia.com/unibianet/systems-networking/how-setup-domainkeys-identified-mail-dkim-postfix-and-ubuntu-server My email now has a DKIM signature, but it still lands in the spam folder. Here is the header:

        Received-SPF: neutral (google.com: 69.164.193.167 is neither permitted nor denied by best guess record for domain of [email protected]) client-ip=69.164.193.167;
        Authentication-Results: mx.google.com; spf=neutral (google.com: 69.164.193.167 is neither permitted nor denied by best guess record for domain of [email protected]) [email protected]; dkim=hardfail (test mode) [email protected]
        Received: from promote.a2labs.in (localhost [127.0.0.1]) by promote.a2labs.in (Postfix) with ESMTPA id 34858530E8 for <[email protected]>; Mon, 28 Feb 2011 12:23:07 +0530 (IST)
        DKIM-Signature: v=1; a=rsa-sha256; c=simple/simple; d=a2labs.in; s=mail; t=1298875987;
            bh=bo+H1VYPIHMja2u7i1lnzr4k/j4Pe8iSf79bVw94XpI=;
            h=To:Subject:Message-ID:Date:From:Reply-To:MIME-Version:Content-Type:Content-Transfer-Encoding;
            b=nhTdlnUwo0iUJ92ycQzKSRjw5Pfya0DJcJrAc8Mr2hIv8OLpgzBCzdOMWTGqR5nuUmAzgCGYBhYAM2XZwVxo9JG/iz7oYKysmNQnskFx0TRyW3UOkDWcfHcPnCL6Y7fGzZWinmsyjsg47k+mKZg/e8jqlwTAMOPYKkt5pBz7SM0=

    Also, my mail.err file shows:

        Feb 28 12:17:03 ivineet dkim-filter[32181]: 1F788530E1: no signature data
        Feb 28 12:18:02 ivineet dkim-filter[32181]: 432BA530E2: no signature data

    How do I fix it?
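
    Two things stand out in that header: "dkim=hardfail" means the verifier could not validate the signature, and "(test mode)" means the published key still carries the t=y flag. A first check is whether the public key Google sees matches the private key dkim-filter signs with; the selector and domain come from the s= and d= tags above:

        # The signature uses d=a2labs.in and s=mail, so the key record
        # must live here; confirm it resolves and holds the right p= value
        dig +short TXT mail._domainkey.a2labs.in

    Once the signature verifies, dropping t=y from that TXT record takes the key out of test mode.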

  • nginx proxy_pass https redirects to http

    - by Thermionix
    I'm trying to set up nginx to forward requests to several backend services using proxy_pass, but several pages load with 404s. The links on the pages have https:// in front, but they result in an http request, which ends in a 404. I only want these services to be available through https. I've tried varied trailing forward slashes appended to the proxy path and location in proxy.conf, and I've also tried commenting out www.conf (just in case its location blocks caused any conflicts), to no effect. So if a link points to https://example.com/sickbeard/errorlogs in a browser:

        https://example.com/sickbeard/errorlogs   gives a 404 in the browser
        https://example.com/sickbeard/errorlogs/  loads

    nginx error log:

        2011/11/23 14:21:58 [error] 28882#0: *6 "/var/www/sickbeard/errorlogs/recent.html" is not found (2: No such file or directory), client: 192.168.1.99, server: example.com, request: "GET /sickbeard/errorlogs/ HTTP/1.1", host: "example.com"

    Config files. proxy.conf:

        location /sickbeard {
            proxy_pass http://localhost:8081/sickbeard;
            include proxy.inc;
        }
        .... more entries ....

    sites-enabled/main:

        server {
            listen 80;
            include www.conf;
        }
        server {
            listen 443;
            include proxy.conf;
            include www.conf;
            ssl on;
        }

    www.conf:

        root /var/www;
        server_name example.com;
        location / {
            autoindex off;
            allow all;
            rewrite ^/$ /mainsite last;
            location ~* \.(jpg|jpeg|gif|css|png|js|ico)$ {
                expires max;
            }
            location ~ \.php$ {
                fastcgi_index index.php;
                include fastcgi_params;
                if (-f $request_filename) {
                    fastcgi_pass 127.0.0.1:9000;
                }
            }
        }

    proxy.inc:

        proxy_connect_timeout 59s;
        proxy_send_timeout 600;
        proxy_read_timeout 600;
        proxy_buffer_size 64k;
        proxy_buffers 16 32k;
        proxy_pass_header Set-Cookie;
        proxy_redirect off;
        proxy_hide_header Vary;
        proxy_busy_buffers_size 64k;
        proxy_temp_file_write_size 64k;
        proxy_set_header Accept-Encoding '';
        proxy_ignore_headers Cache-Control Expires;
        proxy_set_header Referer $http_referer;
        proxy_set_header Host $host;
        proxy_set_header Cookie $http_cookie;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Host $host;
        proxy_set_header X-Forwarded-Server $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
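
    Since the backends only ever see plain http from the proxy, any absolute URL or redirect they generate starts with http://. Two common remedies, sketched against the proxy.inc above (note it currently sets proxy_redirect off, which disables nginx's default Location rewriting): tell the backend the original scheme, and/or rewrite redirects on the way back:

        # In proxy.inc: pass the client-facing scheme so backends that
        # honor X-Forwarded-Proto can build https:// links themselves
        proxy_set_header X-Forwarded-Proto $scheme;
        # And/or translate absolute http:// redirects from the backend
        proxy_redirect http://example.com/ https://example.com/;

    Most of these apps (Sick Beard and friends) also have their own base-URL/reverse-proxy settings worth aligning with the public https URL.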

  • Point domain to port used by Java app

    - by takeshin
    I have successfully installed the YouTrack issue tracker following the guides at:

        http://confluence.jetbrains.net/display/YTD3/Linux.+YouTrack+JAR+as+a+Service
        http://youtrack.jetbrains.com/issue/JT-7619

    The application is now running at mydomain.com:8080. How do I configure the server so that it runs at youtrack.mydomain.com instead? I've been trying to set up a reverse proxy in Apache, but it didn't work for me.
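
    A minimal name-based reverse-proxy sketch for Apache (it assumes mod_proxy and mod_proxy_http are enabled, a DNS record for youtrack.mydomain.com pointing at the box, and YouTrack listening on 8080 with its base URL set to /):

        <VirtualHost *:80>
            ServerName youtrack.mydomain.com
            ProxyPreserveHost On
            ProxyPass        / http://localhost:8080/
            ProxyPassReverse / http://localhost:8080/
        </VirtualHost>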

  • Nginx and Tomcat 6 proxy pass

    - by Patrick Schneider
    I've got problems configuring nginx as a reverse proxy for a Tomcat application. I want requests to www.example.com/blog to be passed to the Tomcat application. The nginx site:

        server {
            listen 80;
            servername example.com;

            location /blog {
                proxy_pass http://localhost:8080/blog;
                proxy_redirect off;
            }
        }

    Now when I open http://example.com/blog in my browser, it redirects to localhost/blog, which does not work. Likewise:

        curl http://localhost:8080/blog -H "host: example.com/blog" -v

    shows a 302 to localhost/blog. Any ideas?
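
    Tomcat builds that 302 from the Host header it receives, and with no proxy_set_header nginx sends the upstream's own name (localhost:8080), so the redirect points at localhost. A sketch of the usual fix (note also that the directive in the config above should read server_name, not servername):

        location /blog {
            proxy_pass http://localhost:8080/blog;
            # Hand Tomcat the original host so its redirects use it
            proxy_set_header Host $host;
            # Alternatively, instead of proxy_redirect off, translate
            # Tomcat's Location headers on the way back:
            # proxy_redirect http://localhost:8080/ http://example.com/;
        }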

  • Regular expression htaccess RewriteRule

    - by Rick
    I am new to using regular expressions for rewriting URLs in htaccess. I need to redirect mysite.com/123 to mysite.com/, IF a cookie named 'ref' is set. My current htaccess is:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteBase /
            RewriteCond %{HTTP_COOKIE} ref=true [NC]
            RewriteRule ^/([0-9]+)/$ http://www.mysite.com
        </IfModule>

    The goal is that when someone enters the site at mysite.com/111 (some number), they are redirected to the home page of the site after the cookie is set. Be nice... I'm new! ;o)
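
    One likely reason the rule never fires: in per-directory (.htaccess) context the pattern is matched without the leading slash, and the example URL has no trailing slash either, so ^/([0-9]+)/$ can never match /111. A sketch of the adjusted rule:

        RewriteEngine On
        RewriteBase /
        RewriteCond %{HTTP_COOKIE} ref=true [NC]
        # No leading slash in per-directory context; trailing slash optional
        RewriteRule ^[0-9]+/?$ http://www.mysite.com/ [R=302,L]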

  • .htaccess template, suggestions needed

    - by purpler
        # Defaults
        AddDefaultCharset UTF-8
        DefaultLanguage en-US
        FileETag None
        Header unset ETag
        ServerSignature Off
        SetEnv TZ Europe/Belgrade

        # Rewrites
        Options +FollowSymLinks
        RewriteEngine On
        RewriteBase /

        # Redirect to WWW
        RewriteCond %{HTTP_HOST} ^serpentineseo.com
        RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L]

        # Redirect index to root
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/
        RewriteRule ^(.*)index\.html$ /$1 [R=301,L]

        # Cache media files
        ExpiresActive On
        ExpiresDefault A0

        # Month
        <FilesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$">
            Header set Cache-Control "max-age=2592000, public"
        </FilesMatch>

        # Week
        <FilesMatch "\.(css|pdf)$">
            Header set Cache-Control "max-age=604800"
        </FilesMatch>

        # 10 Min
        <FilesMatch "\.(html|htm|txt)$">
            Header set Cache-Control "max-age=600"
        </FilesMatch>

        # Do not cache
        <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
            Header unset Cache-Control
        </FilesMatch>

        # Compress output
        <IfModule mod_deflate.c>
            <FilesMatch "\.(html|js|css)$">
                SetOutputFilter DEFLATE
            </FilesMatch>
        </IfModule>

        # Error Documents
        ErrorDocument 206 /error/206.html
        ErrorDocument 401 /error/401.html
        ErrorDocument 403 /error/403.html
        ErrorDocument 404 /error/404.html
        ErrorDocument 500 /error/500.html

        # Prevent hotlinking
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC]
        RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L]

        # Prevent offline browsers
        RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
        RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
        RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
        RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
        RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
        RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
        RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
        RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
        RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
        RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
        RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
        RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
        RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
        RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
        RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
        RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
        RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
        RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
        RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
        RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
        RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
        RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
        RewriteCond %{HTTP_USER_AGENT} ^Zeus
        RewriteRule ^.*$ http://www.google.com [R,L]

        # Protect against DOS attacks by limiting file upload size
        LimitRequestBody 10240000

        # Deny access to sensitive files
        <FilesMatch "\.(htaccess|psd|log)$">
            Order Allow,Deny
            Deny from all
        </FilesMatch>
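
    One suggestion on the hotlink block above: the replacement image angryman.png itself matches \.(gif|jpg|png)$, so a hotlinked request for it gets redirected to itself in a loop. Excluding it breaks the cycle; a sketch:

        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC]
        # Never re-redirect the replacement image itself
        RewriteCond %{REQUEST_URI} !/images/angryman\.png$ [NC]
        RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L]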

  • iptables rules to allow HTTP traffic to one domain only

    - by Emily
    Hi everyone, I need to configure my machine to allow HTTP traffic to/from serverfault.com only. All other websites, services, and ports should not be accessible. I came up with these iptables rules:

        # Drop everything
        iptables -P INPUT DROP
        iptables -P OUTPUT DROP

        # Now, allow connections to the website serverfault.com on port 80
        iptables -A OUTPUT -p tcp -d serverfault.com --dport 80 -j ACCEPT
        iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT

        # Allow loopback
        iptables -I INPUT 1 -i lo -j ACCEPT

    It doesn't work quite well. After I drop everything and move on to rule 3:

        iptables -A OUTPUT -p tcp -d serverfault.com --dport 80 -j ACCEPT

    I get this error:

        iptables v1.4.4: host/network `serverfault.com' not found
        Try `iptables -h' or 'iptables --help' for more information.

    Do you think it is related to DNS? Should I allow it as well? Or should I just put IP addresses in the rules? Do you think what I'm trying to do could be achieved with simpler rules? How? I would appreciate any help or hints on this. Thanks a lot!
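
    It is indeed DNS: iptables resolves a host name once, when the rule is added, and by that point the OUTPUT policy is already DROP, so the lookup itself is blocked. A sketch of one workable ordering, resolving the addresses up front (note the IPs behind serverfault.com can change over time, so name-based rules are inherently fragile):

        # Allow DNS lookups so names can resolve
        iptables -A OUTPUT -p udp --dport 53 -j ACCEPT
        iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
        # Resolve once, then add one rule per address
        for ip in $(dig +short serverfault.com); do
            iptables -A OUTPUT -p tcp -d "$ip" --dport 80 -j ACCEPT
        done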

  • Sending mail through local MTA while domain MX records point to Google Apps

    - by Assaf
    My domain's email is managed by Google Apps, so domain users get Gmail, Calendar, etc. But I also want to be able to send applicative notifications to users outside the domain via email (e.g. "someone commented on your post", and so on). However, if I try to send email through code I get blocked by Gmail after a few emails. I send marketing email through MailChimp to minimize the risk of appearing as spam to my users (one-click unsubscribe, etc.), but I can't send applicative messages that way. I want to install a local MTA (my server runs Ubuntu), but I'm not sure what anti-spam measures I need to implement so that receiving MTAs don't think it's a spam server. What's stopping anyone from setting up a mail server and sending emails using my domain name? AFAIK it's the DNS records that show the MTA's address actually belongs to the domain, but my understanding of this is rather superficial, so someone please correct me if I'm wrong. What sort of DNS configuration do I need to put in place so that I don't get blacklisted (assuming I don't actually spam anyone)? The MX records already point to Google, and I'd like to keep it that way. So do I just need to define an A record for my internal mail server? Should it show email as coming from a sub-domain, so as not to conflict with the bare domain being managed by Google?

    Edit: Does the following SPF record make sense if I want email from my domain name to be sent by either Google's servers or any server with a DNS name ending in mydomain.com?

        "v=spf1 ptr mx:google.com mx:googlemail.com ~all"

    How should I set up reverse DNS for my server? If I have an A record that points mailsender.mydomain.com to my MTA's IP address, does that mean reverse lookup will only allow emails sent from [email protected]?
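
    On the quoted record specifically: mx:google.com authorizes google.com's own MX hosts, which is not the set of servers that send for Google Apps domains (Google publishes include:_spf.google.com for that), and the ptr mechanism is discouraged because it forces expensive lookups on every receiver. A sketch of a record matching the stated intent, with the sub-domain name an assumption:

        mydomain.com.  IN TXT  "v=spf1 include:_spf.google.com a:mailsender.mydomain.com ~all"

    For reverse DNS, the PTR for the MTA's IP should resolve back to the name the server uses in its HELO (e.g. mailsender.mydomain.com); it constrains the sending host's identity, not which mailbox names may appear in From.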

  • Virtualhost setup, same IP address, different DirectoryIndexes

    - by kaykills
    I am trying to set up two virtual host entries in Apache but I'm not sure how to accomplish what I want to do. I have two domain names, both pointing to the same IP address. I need the DirectoryIndex to be different, which is pretty much the only difference between the entries. I have the following set up:

        <VirtualHost *:80>
            ServerName firstdomain.com
            ServerAdmin email@example.com
            DocumentRoot "/srv/www"
            DirectoryIndex /portals/site/index.html
        </VirtualHost>

        <VirtualHost *:80>
            ServerName seconddomain.com
            ServerAdmin email@example.com
            DocumentRoot "/srv/www"
            DirectoryIndex /portals/site/index_fr.html
        </VirtualHost>

    Not sure what I need to do differently, but the second entry doesn't work. The only real difference is that I need the second domain to point to a different DirectoryIndex. If there is a better way to accomplish this, your help would be appreciated.
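
    For name-based hosting on one address, Apache 2.2 also needs a NameVirtualHost directive matching the VirtualHost argument, or every request falls through to the first vhost; the www variants usually want ServerAlias entries too. A sketch, assuming Apache 2.2-era syntax:

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName firstdomain.com
            ServerAlias www.firstdomain.com
            DocumentRoot "/srv/www"
            DirectoryIndex /portals/site/index.html
        </VirtualHost>

        <VirtualHost *:80>
            ServerName seconddomain.com
            ServerAlias www.seconddomain.com
            DocumentRoot "/srv/www"
            DirectoryIndex /portals/site/index_fr.html
        </VirtualHost>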

  • cURL - Unknown SSL protocol error - OS X 10.9

    - by saq7
    I am trying to use cURL and get the following error on every HTTPS request I make. The error is always the same; HTTP requests work flawlessly. The verbose output is quite useless:

        Saquibs-MacBook-Pro:~ skothawala$ curl https://google.com -vv
        * Adding handle: conn: 0x7fe09b803a00
        * Adding handle: send: 0
        * Adding handle: recv: 0
        * Curl_addHandleToPipeline: length: 1
        * - Conn 0 (0x7fe09b803a00) send_pipe: 1, recv_pipe: 0
        * About to connect() to google.com port 443 (#0)
        *   Trying 74.125.226.129...
        * Connected to google.com (74.125.226.129) port 443 (#0)
        * Unknown SSL protocol error in connection to google.com:-9805
        * Closing connection 0
        curl: (35) Unknown SSL protocol error in connection to google.com:-9805

    I have looked through other answers on this forum and other places on the internet, but haven't found an answer. Most people's issues involve particular servers and the configuration of SSL on those servers. Mine, however, is problematic any time HTTPS is used, with any website. Can someone please suggest what I should be looking into to solve the problem? Could something be misconfigured? What should I be looking for?
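
    The -9805 code appears in Apple's Secure Transport headers as errSSLClosedGraceful, i.e. the connection was closed during the handshake, and since it happens for every site, something in the path is more suspect than the TLS stack itself. A quick way to separate the two is to run the handshake with a different client:

        # Independent TLS handshake test; if this also stalls or drops,
        # look at proxies, antivirus/firewall software, or the network
        # rather than at curl
        openssl s_client -connect google.com:443 -servername google.com

    Also worth checking: the System Preferences proxy settings and the http_proxy/https_proxy environment variables, both of which curl honors.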

  • use nginx proxy_pass and rewrite to hide third party api credentials?

    - by brakertech
    Objective: I'd like to have nginx hide calls to a third-party API. An example request to my server:

        http://example.com/myapi/60601

    should have nginx rewrite and proxy_pass it to:

        http://api.remix.bestbuy.com/v1/stores(area(60601,10))?show=storeId,name&apiKey=redacted_24_character_string

    Code I've tried (this works, but not for the parameters):

        location ^~ /myapi/ {
            rewrite ^/myapi/(.*) /api/check/$1/key/e95fad09aa5091b7734d1a268b53cef5 break;
            proxy_pass http://api.server.com/;
        }
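
    A sketch of one way to map the Best Buy-style URL, capturing the ZIP code in the location regex and rebuilding the full upstream URI (untested against that API; the key never leaves the server):

        # Because the proxy_pass URL contains variables, nginx needs a
        # resolver (or an upstream block) to look up the host at runtime
        resolver 8.8.8.8;
        location ~ ^/myapi/(\d+)$ {
            proxy_pass http://api.remix.bestbuy.com/v1/stores(area($1,10))?show=storeId,name&apiKey=your_24_character_key;
        }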

  • Apache log - file does not exist

    - by Ivan
    I have quite a few of these piling up in the Apache logs every day:

        [Mon Jun 09 20:42:58 2014] [error] [client 180.153.214.181] File does not exist: /home/user/public_html/ajax.googleapis.com, referer: http://www.mysite.com//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js

    I have over 200k visitors per day, but a few of them, like a dozen or so, are generating the above error. I can't figure out what may be causing it. I checked the HTML code and it's all good, so I ran out of ideas.
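
    The double slash in the referer (mysite.com//ajax.googleapis.com/...) is the tell: some client is treating a protocol-relative reference (src="//ajax.googleapis.com/...") as a path relative to the site root. Only a handful of old clients and sloppy bots get this wrong, which would fit a dozen hits out of 200k visitors. A quick way to locate the reference in the site's code:

        # Find protocol-relative references to the Google CDN
        grep -rn 'src="//ajax.googleapis.com' /home/user/public_html/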

  • Seizing naming master from child domain server

    - by meera
    When I am trying to seize the naming master role from my child domain server, I get the following error:

        fsmo maintenance: seize naming master
        Attempting safe transfer of domain naming FSMO before seizure.
        ldap_modify_sW error 0x34(52 (Unavailable).
        Ldap extended error message is 000020AF: SvcErr: DSID-03210380, problem 5002 (UNAVAILABLE), data 8438
        Win32 error returned is 0x20af(The requested FSMO operation failed. The current FSMO holder could not be contacted.)
        Depending on the error code this may indicate a connection, ldap, or role transfer error.
        Transfer of domain naming FSMO failed, proceeding with seizure ...
        Server "win-fb20ixk90mu" knows about 5 roles
        Schema - CN=NTDS Settings,CN=WIN-3918XHC5STU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        Naming Master - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        PDC - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        RID - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
        Infrastructure - CN=NTDS Settings,CN=WIN-FB20IXK90MU,CN=Servers,CN=Default-First-Site-Name,CN=Sites,CN=Configuration,DC=HCL,DC=com
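
    Reading the transcript: the ldap_modify/"could not be contacted" lines come from the safe-transfer attempt that ntdsutil always tries first, after which it proceeds with the seizure, and the role listing at the end already shows WIN-FB20IXK90MU holding the naming master. A quick way to confirm the current holders from a prompt:

        rem Verify which DCs now hold the FSMO roles
        netdom query fsmo
        rem And check replication health afterwards
        repadmin /showrepl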

  • SSH Private Key Not Working in Some Directories

    - by uesp
    I have a strange issue where SSH won't properly connect with a private key if the key file is in certain directories. I've set up the keys on a set of servers, and the following command:

        ssh -i /root/privatekey keyuser@host.com

    works fine; I log in to the given host without being prompted for a password. But this command:

        ssh -i /etc/keyfiles/privatekey keyuser@host.com

    gives me a password prompt. I've narrowed it down: this behavior occurs only in some sub-directories of /etc/. For example, /etc/httpd1/ gives me a password prompt but /etc/httpd/ does not. What I've checked so far:

        All private key files used are identical (copied from the original file).
        The private key file and directories used have identical permissions.
        No relevant error messages in the server/client logs.
        No interesting debug messages from ssh -v (it just seems to skip the key file).
        It happens when connecting to different hosts.

    After more testing, it is not the actual directory name. For example:

        mkdir /etc/test
        cp /root/privatekey /etc/test
        ssh -i /etc/test/privatekey keyuser@host.com   # Results in password prompt
        cp /root/privatekey /etc/httpd                 # Existing directory
        ls -ald test httpd
        # drwxr-xr-x 4 root root 4096 Mar 5 18:25 httpd
        # drwxr-xr-x 2 root root 4096 Mar 5 18:43 test
        ssh -i /etc/httpd/privatekey keyuser@host.com  # Results in *no* prompt
        rm -r test
        cp -R /etc/httpd /etc/test
        ssh -i /etc/test/privatekey keyuser@host.com   # Results in *no* prompt

    I'm sure it's just something simple I've overlooked, but I'm at a loss.
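
    Directory-dependent behavior with identical Unix permissions often points at something outside classic permissions; on distributions where SELinux is enforcing, file contexts can differ in exactly this way between a freshly created directory and a copy of an existing tree, which would match the mkdir-versus-cp -R experiment above. A sketch of how to test that theory:

        # Compare security contexts of the working and failing copies
        ls -Z /etc/httpd/privatekey /etc/test/privatekey
        # Reset the failing file to the default context for its path
        restorecon -v /etc/keyfiles/privatekey
        # Look for recent AVC denials while reproducing the problem
        ausearch -m avc -ts recent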

  • htaccess rewrite to a different folder URL, two index files

    - by Andrew
    I've been searching for a while now and haven't found anything that comes close to what I'm trying to accomplish. Right now my URLs look like www.website.com/something, all handled by the root /index.php. Now I have created plugins within folders: /plugins/PLUGINNAME/index.php. I want URLs like www.website.com/plugins/PLUGINNAME/anything/iwant/here to all be handled by /plugins/PLUGINNAME/index.php and not by the root index.php. Currently www.website.com/plugins/PLUGINNAME/ works, but anything after /PLUGINNAME/xxx falls through to the root /index.php.
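
    One way to do this with mod_rewrite is an .htaccess inside each plugin directory; per-directory rules closest to the requested path run first, so they take precedence over the root rules. A sketch (patterns have no leading slash in per-directory context):

        # /plugins/PLUGINNAME/.htaccess
        RewriteEngine On
        RewriteBase /plugins/PLUGINNAME/
        # Send everything that is not a real file or directory
        # to this plugin's front controller
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^ index.php [L]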
