Search Results

Search found 2680 results on 108 pages for 'soft 404'.

Page 69/108 | < Previous Page | 65 66 67 68 69 70 71 72 73 74 75 76  | Next Page >

  • Configurator Scan not picking up views

    - by mxmissile
    New to Pyramid and Python. I'm trying to get Pyramid's Configurator scan to find my views, but I seem to be missing something: it's not picking up my index view. Here are my files: app.py: from wsgiref.simple_server import make_server from pyramid.config import Configurator if __name__ == '__main__': config = Configurator() config.add_route('home', '/') config.scan() app = config.make_wsgi_app() server = make_server('0.0.0.0', 6543, app) server.serve_forever() and index.py: from pyramid.view import view_config from pyramid.response import Response @view_config(route_name='home') def index(request): print'Incoming request' return Response('<body><h1>Home</h1></body>') It's returning a 404. However, if I remove config.scan() and add the view manually, it works fine: from wsgiref.simple_server import make_server from pyramid.config import Configurator from index import index if __name__ == '__main__': config = Configurator() config.add_route('home', '/') config.add_view(index, route_name='home')
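
    A note on the likely cause, with a minimal sketch (assuming index.py sits next to app.py and is importable): when app.py is run directly its module is __main__, so a bare config.scan() only scans __main__ and never imports index.py, which means the @view_config decorator is never registered. Passing the module name explicitly makes the scan find the decorated view:

        from wsgiref.simple_server import make_server
        from pyramid.config import Configurator

        if __name__ == '__main__':
            config = Configurator()
            config.add_route('home', '/')
            config.scan('index')  # scan the index module explicitly, not just __main__
            app = config.make_wsgi_app()
            server = make_server('0.0.0.0', 6543, app)
            server.serve_forever()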

    Read the article

  • Redirected wikipedia request

    - by Le_Coeur
    Hi people, I need to write a program that redirects http://localhost:8080 to en.wikipedia.org. It seems easy, but I'm having problems (only with Wikipedia; with other sites it works fine). I build the URL for Wikipedia: URL url = new URL("http", "en.wikipedia.org", 80, "/wiki"); then open a URLConnection and extract the headers, but when I call connection.getInputStream() I receive a 404 Not Found. So I tried a hack for the Host header, because in this setup the Host header is localhost:8080; I changed the Host header to Wikipedia and it works, but after requesting http://localhost:8080 in the browser, Wikipedia opens and the URL in the browser changes to en.wikipedia.org, and I want to stay on localhost :)
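
    For what it's worth, a minimal sketch of the outbound fetch (not the original code, and /wiki/Main_Page is just an example path): the key point is to open the connection against en.wikipedia.org directly and let HttpURLConnection set its own Host header, instead of copying the browser's "Host: localhost:8080" header into the forwarded request:

        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        public class WikiFetch {
            // copy the remote response body to the local client's output stream
            static void forward(OutputStream clientOut) throws Exception {
                URL url = new URL("http", "en.wikipedia.org", 80, "/wiki/Main_Page");
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                // deliberately do NOT forward the incoming request's Host header
                System.out.println("Status: " + conn.getResponseCode());
                try (InputStream in = conn.getInputStream()) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        clientOut.write(buf, 0, n);
                    }
                }
            }
        }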

    Read the article

  • RewriteRule in htaccess in subdirectory

    - by Jay
    Windows server, running Apache. In my Apache conf, I have AllowOverride None for the root of a site and then I have a subdirectory set to AllowOverride All: <Directory /> AllowOverride None </Directory> <Directory "/safe/"> AllowOverride All </Directory> However, when I try to set up a rewrite rule in the subdirectory's htaccess file, nothing happens; I just get a 404 page-not-found error. Example: RewriteEngine On RewriteRule (.*) /blah?test=$1 [R=302,NC,NE,L] Rewriting URLs works fine from the root via the Apache conf. I don't understand why the rule is ignored. I don't want to do the URL rewriting within the conf because in this case I may need to change the redirects constantly and don't want to reload the server every time a change is made. I also don't want to affect server performance by enabling htaccess files site-wide, just in the subdirectory where I need them.
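
    One thing worth double-checking, sketched below under the assumption that the document root is C:/www/mysite (a hypothetical path): <Directory> blocks take filesystem paths, not URL paths, so on Windows "/safe/" may not actually match the folder the .htaccess lives in, in which case AllowOverride All never applies and the file is silently ignored:

        <Directory "C:/www/mysite">
            AllowOverride None
        </Directory>

        # match the real on-disk location of the folder that holds the .htaccess
        <Directory "C:/www/mysite/safe">
            AllowOverride All
            Options FollowSymLinks
        </Directory>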

    Read the article

  • Ignore route with a specific parameter

    - by Vivien
    Hello, I have an @Url.Action in which I pass a parameter to the action. When the parameter is null, I would like to just do nothing and stay on the same page. What I have done is the following: routes.MapRoute( null, "Test/View/{Id}", new { controller = "Test", action = "View" }, new { Id = @"\d+" } //id must be numerical ); routes.IgnoreRoute("Test/View/{*pathInfo}"); The action is not executed, but my problem is that I get this: Description: HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable. Please review the following URL and make sure that it is spelled correctly. Requested URL: /Test/View Which I obviously do not want to see. Thanks for your help.
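
    For context: an ignored route is simply handed back to IIS, which is most likely where the 404 above comes from, so routing alone cannot make the request "do nothing". One alternative, sketched below (Model.Id is a hypothetical property, not from the original code), is to render the link only when the parameter is actually present:

        @if (Model.Id != null)
        {
            <a href="@Url.Action("View", "Test", new { Id = Model.Id })">View</a>
        }
        else
        {
            <span>View</span> @* no link when there is nothing to pass *@
        }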

    Read the article

  • Page not rewriting in ExpressionEngine

    - by Andrew
    I recently just launched an ExpressionEngine site and one of the last steps I take is removing index.php from the URL. In the case of this site, the default template group is called "site". Long story short, after removing index.php from the URL, all pages continue to work great with the exception of my contact page, which also lives in the "site" template group. Going to http://example.com/contact/ gives me a 404 while going to http://example.com/site/contact produces the desired result. In past ExpressionEngine site setups (including my own) this has never happened, so does anyone have thoughts on why this might not be working?

    Read the article

  • links in codeigniter

    - by Patrick
    Hi all, I'm experimenting with CodeIgniter but don't really understand how links work. For example, I have a link like this: localhost/ci/welcome/cat/7 My base URL is localhost/ci, so by clicking on this link I would expect the method "cat" of controller "welcome" to be called. This method is very simple: function cat() { echo "just a test."; } Pretty basic - I would expect to see the text on screen, but I just see a 404 page-not-found error. What could be the problem?
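
    Two things to check, sketched below (assuming CodeIgniter 2.x, where controllers extend CI_Controller; 1.x uses Controller instead): out of the box CodeIgniter URLs include index.php, so without a rewrite rule the working form of the link is localhost/ci/index.php/welcome/cat/7, and the trailing URI segment is passed to the method as an argument, so it helps to declare a parameter for it:

        <?php
        // application/controllers/welcome.php (sketch)
        class Welcome extends CI_Controller {

            // /index.php/welcome/cat/7 -> $id receives "7"
            public function cat($id = null)
            {
                echo "just a test, category: " . $id;
            }
        }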

    Read the article

  • Requested URL is changing

    - by user302486
    Hi, I have a website at localhost:82. When I type this into IE, it comes up with a 404 error and the requested URL is localhost:80/wwwroot, which is not at all what I requested. There is no URL rewrite set up. I have tried to set up a tracing rule to see what is happening; however, the instructions at http://learn.iis.net/page.aspx/266/troubleshooting-failed-requests-using-tracing-in-iis-7/ say to look for the "Failed Request Tracing" link, but it doesn't exist in my IIS 7.0, even as administrator. Not sure where to look or why this is changing my requested URL. Any help would be appreciated.

    Read the article

  • How to allow an unnamed user in an SVN authz file?

    - by dtrosset
    I have a Subversion server running with Apache. It authenticates users via LDAP in the Apache configuration and uses SVN authorizations to limit user access to certain repositories. This works perfectly. Apache: DAV svn SVNParentPath /srv/svn SVNListParentPath Off SVNPathAuthz Off AuthType Basic AuthName "Subversion Repository" AuthBasicProvider ldap AuthLDAPBindDN # private stuff AuthLDAPBindPassword # private stuff AuthLDAPURL # private stuff Require valid-user AuthzSVNAccessFile /etc/apache2/dav_svn.authz Subversion: [groups] soft = me, and, all, other, developpers Adding anonymous access from one machine: Now I have a service I want to set up (Rietveld, for code reviews) that needs anonymous access to the repository. As this is a web service, accesses always come from the same server, so I added Apache configuration to allow all accesses from that machine. This did not work until I added an additional line in the authorization file granting read access to user -. Apache: <Limit GET PROPFIND OPTIONS REPORT> Order allow,deny Allow from # private IP address Satisfy Any </Limit> Subversion: [Software:/] @soft = rw - = r # <-- This is the added line Before I added this, all accesses were authenticated and thus had a user name. Now some accesses are done without a user name; I found this - user name in the Apache log files. But is this line equivalent to * = r, which I absolutely do not want to enable, or does it only allow the anonymous, unnamed user (which is allowed access only from the Rietveld server)?
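
    A side note: the - in the Apache access log is just the placeholder printed when no username was supplied, and Subversion 1.5 and later has an explicit authz token for exactly that case. A sketch of the access file using it; unlike * = r, the $anonymous rule matches only unauthenticated requests, so named users keep their normal @soft permissions:

        [groups]
        soft = me, and, all, other, developpers

        [Software:/]
        @soft = rw
        $anonymous = r    # read access for unauthenticated requests only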

    Read the article

  • I can't browse PHP pages on my local server

    - by tibin mathew
    Hi, I can't browse PHP pages on my local server. Before, it was working fine. Now I can't browse PHP pages, though I can browse HTML pages and ASP pages with no problems. When I try to browse a PHP page it just doesn't load. What could the problem be? I am using Windows 2000 Advanced Server and my web server is Tomcat. Please, someone help me. Guys, I'm not getting anything in my browser, it just keeps loading. Nothing shows on that page; I'm not getting a 404 error or anything like that, it simply continues loading.

    Read the article

  • XAMPP - local host problems

    - by jennym
    I have picked up some possible answers to my problem here, but none have so far worked. I have installed XAMPP but get the dreaded HTTP 404 error on both IE and Firefox when asking for 'http://localhost/'. I work with Dreamweaver but cannot get the testing server (XAMPP) to work, although I can connect to the linux/apache remote server. Skype has been uninstalled. Therefore XAMPP is running on port 80 by itself. XAMPP says Apache and Mysql have 'started'. Is there someone that could please help me?

    Read the article

  • Rails 2 and Nginx: https pages can't load css or js (but will load graphics)

    - by Max Williams
    ADMISSION: I've posted this same question on Stack Overflow, before realising it's probably better suited to Super User, but it kind of depends on the answer: if it turns out to be a problem in my nginx config, it's definitely Super User; if it turns out to be a problem in my Rails config (or code) then it's arguably Stack Overflow. I'm adding some https pages to my Rails site. In order to test it locally, I'm running my site under one mongrel_rails instance (on 3000) and nginx. I've managed to get my nginx config to the point where I can actually go to the https pages, and they load. Except, the javascript and css files all fail to load: looking in the Network tab in Chrome web tools, I can see that it is trying to load them via an https url. E.g., one of the non-working file urls is https://cmw-local.co.uk/stylesheets/cmw-logged-out.css?1383759216 I have these set up (or at least think I do) in my nginx config to redirect to the http versions of the static files. This seems to be working for graphics, but not for css and js files. If I click on this in the Network tab, it takes me to the above url, which redirects to the http version. So the redirect seems to be working in some sense, but not when they're loaded by an https page. Like I say, I thought I had this covered in the second try_files directive in my config below, but maybe not. Can anyone see what I'm doing wrong? Thanks, Max. Here's my nginx config - sorry it's a bit lengthy! I think the error is likely to be in the first (ssl) server block: server { listen 443 ssl; keepalive_timeout 70; ssl_certificate /home/max/work/charanga/elearn_container/elearn/config/nginx/certs/max-local-server.crt; ssl_certificate_key /home/max/work/charanga/elearn_container/elearn/config/nginx/certs/max-local-server.key; ssl_session_cache shared:SSL:10m; ssl_session_timeout 10m; ssl_protocols SSLv3 TLSv1; ssl_ciphers RC4:HIGH:!aNULL:!MD5; ssl_prefer_server_ciphers on; server_name elearning.dev cmw-dev.co.uk cmw-dev.com cmw-nginx.co.uk cmw-local.co.uk; root /home/max/work/charanga/elearn_container/elearn; # ensure that we serve css, js, other statics when requested # as SSL, but if the files don't exist (i.e.
any non /basket controller) # then redirect to the non-https version location / { try_files $uri @non-ssl-redirect; } # securely serve everything under /basket (/basket/checkout etc) # we need general too, because of the email/username checking location ~ ^/(basket|general|cmw/account/check_username_availability) { # make sure cached copies are revalidated once they're stale add_header Cache-Control "public, must-revalidate, proxy-revalidate"; # this serves Rails static files that exist without running # other rewrite tests try_files $uri @rails-ssl; expires 1h; } location @non-ssl-redirect { return 301 http://$host$request_uri; } location @rails-ssl { proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Host $http_host; proxy_redirect off; proxy_read_timeout 180; proxy_next_upstream off; proxy_pass http://127.0.0.1:3000; expires 0d; } } #upstream elrs { # server 127.0.0.1:3000; #} server { listen 80; server_name elearning.dev cmw-dev.co.uk cmw-dev.com cmw-nginx.co.uk cmw-local.co.uk; root /home/max/work/charanga/elearn_container/elearn; access_log /home/max/work/charanga/elearn_container/elearn/log/access.log; error_log /home/max/work/charanga/elearn_container/elearn/log/error.log debug; client_max_body_size 50M; index index.html index.htm; # gzip html, css & javascript, but don't gzip javascript for pre-SP2 MSIE6 (i.e. those *without* SV1 in their user-agent string) gzip on; gzip_http_version 1.1; gzip_vary on; gzip_comp_level 6; gzip_proxied any; gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/xml+rss text/javascript; #text/html # make sure gzip does not lose large gzipped js or css files # see http://blog.leetsoft.com/2007/7/25/nginx-gzip-ssl gzip_buffers 16 8k; # Disable gzip for certain browsers. #gzip_disable "MSIE [1-6].(?!.*SV1)"; gzip_disable "MSIE [1-6]"; # blank gif like it's 1995 location = /images/blank.gif { empty_gif; } # don't serve files beginning with dots location ~ /\. 
{ access_log off; log_not_found off; deny all; } # we don't care if these are missing location = /robots.txt { log_not_found off; } location = /favicon.ico { log_not_found off; } location ~ affiliate.xml { log_not_found off; } location ~ copyright.xml { log_not_found off; } # convert urls with multiple slashes to a single / if ($request ~ /+ ) { rewrite ^(/)+(.*) /$2 break; } # X-Accel-Redirect # Don't tie up mongrels with serving the lesson zips or exes, let Nginx do it instead location /zips { internal; root /var/www/apps/e_learning_resource/shared/assets; } location /tmp { internal; root /; } location /mnt{ root /; } # resource library thumbnails should be served as usual location ~ ^/resource_library/.*/*thumbnail.jpg$ { if (!-f $request_filename) { rewrite ^(.*)$ /images/no-thumb.png break; } expires 1m; } # don't make Rails generate the dynamic routes to the dcr and swf, we'll do it here location ~ "lesson viewer.dcr" { rewrite ^(.*)$ "/assets/players/lesson viewer.dcr" break; } # we need this rule so we don't serve the older lessonviewer when the rule below is matched location = /assets/players/virgin_lesson_viewer/_cha5513/lessonViewer.swf { rewrite ^(.*)$ /assets/players/virgin_lesson_viewer/_cha5513/lessonViewer.swf break; } location ~ v6lessonViewer.swf { rewrite ^(.*)$ /assets/players/v6lessonViewer.swf break; } location ~ lessonViewer.swf { rewrite ^(.*)$ /assets/players/lessonViewer.swf break; } location ~ lgn111.dat { empty_gif; } # try to get autocomplete school names from memcache first, then # fallback to rails when we can't location /schools/autocomplete { set $memcached_key $uri?q=$arg_q; memcached_pass 127.0.0.1:11211; default_type text/html; error_page 404 =200 @rails; # 404 not really! Hand off to rails } location / { # make sure cached copies are revalidated once they're stale add_header Cache-Control "public, must-revalidate, proxy-revalidate"; # this serves Rails static files that exist without running other rewrite tests try_files $uri @rails; expires 1h; } location @rails { proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header Host $http_host; proxy_redirect off; proxy_read_timeout 180; proxy_next_upstream off; proxy_pass http://127.0.0.1:3000; expires 0d; } }
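
    One observation, hedged: modern browsers block "active" mixed content, so stylesheets and scripts requested over plain http from an https page are refused outright, while images are merely flagged, which matches the graphics-load-but-css/js-don't symptom; a 301 back to http cannot fix that. An alternative for the ssl server block is sketched below, assuming the Rails static files live under the app's public directory (a guess at the path):

        # serve static assets over https instead of bouncing them to http
        location ~* \.(css|js|png|gif|jpe?g)$ {
            root /home/max/work/charanga/elearn_container/elearn/public;
            add_header Cache-Control "public, must-revalidate, proxy-revalidate";
            expires 1h;
        }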

    Read the article

  • Running PHP in NetBeans

    - by user216112
    I have NetBeans 6.8 with all bundled features. I'm trying to run my PHP file, a binary-search number-guessing page ("Computer guess number by using binary search"): the user enters a hidden number from 1-99, and the script keeps $min_num and $max_num (starting at 0 and 100), prints "Computer guess $guess_num", asks whether the hidden number is Bigger or Smaller, and prints a congratulations message when the computer wins. But the browser gives the error "HTTP 404 NOT FOUND". I think the server has not been set up. What should I do to run it?

    Read the article

  • Servlet mapping for ajax call parameters

    - by Woho87
    Hi guys! <servlet-mapping> <servlet-name>Test</servlet-name> <url-pattern>/GetContacts*</url-pattern> </servlet-mapping> I have a simple problem but can't find any solution on the internet. I have an AJAX call to the URL http://localhost:80/Push/GetContacts?id=23..... The above servlet mapping does not match the AJAX call; it results in an HTTP 404 Not Found. What is the right syntax to make my AJAX call work?
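
    A sketch of a mapping that should match: the servlet spec only allows exact patterns, path prefixes ending in /*, or *.ext extension patterns, and the query string is never part of the match, so an exact pattern already covers /Push/GetContacts?id=23 inside the Push web application:

        <servlet-mapping>
            <servlet-name>Test</servlet-name>
            <url-pattern>/GetContacts</url-pattern>
        </servlet-mapping>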

    Read the article

  • javascript window.location gives me a wrong url path when checking firebug

    - by Elson Solano
    I have a sample website: http://mysite.com/ var host = window.location.protocol+"://"+window.location.hostname; $.ajax({ type:"POST", data: params, url : host+'/forms/get_data.php', success:function(data){ ...othercodeblahblah } }); Why is it that when I check Firebug the URL looks weird? This is the sample output from Firebug: http://mysite.com/mysite.com/forms/get_data.php With this URL it now gives me: "NetworkError: 404 Not Found - http://mysite.com/mysite.com/forms/get_data.php" Shouldn't it output http://mysite.com/forms/get_data.php? Why is it giving me a wrong URL path? Your help would be greatly appreciated and rewarded! Thanks!
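
    For reference, a sketch of the likely fix: window.location.protocol already ends in a colon ("http:"), so concatenating "://" yields the malformed prefix "http:://", which the browser ends up resolving as a relative path against the current page, producing the doubled host. Dropping the extra colon gives a proper absolute URL:

        var host = window.location.protocol + "//" + window.location.hostname;

        $.ajax({
            type: "POST",
            data: params,                        // params as in the snippet above
            url:  host + "/forms/get_data.php",  // -> http://mysite.com/forms/get_data.php
            success: function (data) {
                // handle response
            }
        });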

    Read the article

  • Compiled ASP.NET application can be viewed locally only

    - by cfdev9
    I have a development website running on my local machine. I can access it locally by typing the address http://mycomputer.mynetwork.local/myapp/default.aspx; however, when anybody else tries to browse to it they get an error: Server Error in '/' Application. The resource cannot be found. Description: HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable. Please review the following URL and make sure that it is spelled correctly. I'm using IIS7, ASP.NET 3.5 and the application is pre-compiled. Any hints?

    Read the article

  • FTP/SFTP module for .NET

    - by Angrius
    I am designing an auto downloader using .NET and C#. I was wondering if there is a decent/robust FTP/SFTP module out there that I can use, preferably free of charge. EDIT: I probably should've noted that I am looking for an answer from somebody who has tried several and found one that works well. I currently use the .NET wrapper for libcurl for FTP and it does not behave right. For example, when I connect to the FTP site with FileZilla and provide a path, it finds the directory just fine. With libcurl, I get a 404 page header saying the path cannot be found. I am guessing that FileZilla resolves the 'home' directory correctly, while libcurl puts me at the root (I tried no path, and I had no permission to access that directory). But anyway, I just don't know enough about FTP itself, so I would like something that just works. Thanks!
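
    Not a recommendation for a specific third-party module, but as a baseline sketch: plain FTP (not SFTP) can be done with the framework's built-in FtpWebRequest, which may help rule libcurl quirks in or out (the server address and credentials below are placeholders):

        using System;
        using System.IO;
        using System.Net;

        class FtpListing
        {
            static void Main()
            {
                var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/incoming/");
                request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
                request.Credentials = new NetworkCredential("user", "password");

                using (var response = (FtpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    Console.WriteLine(reader.ReadToEnd());  // raw directory listing
                }
            }
        }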

    Read the article

  • Drupal rendering incomplete views

    - by Paul
    I got tapped to do some quick maintenance on a recently migrated Drupal site. I'm pretty new to Drupal, so hopefully the problem is something that more experienced guys will figure out quickly. The behavior is as follows: the public content works fine as far as I can tell. When I go to log in, the login form renders correctly, but when I post the form with my credentials, I get back a blank page (not a 404; it looks like a 200 to the /user URL, but all that gets rendered is empty head and body tags). If I refresh the page, I get the content of my profile view, but none of the site chrome or CSS. Note that this is not an issue on the site it was migrated from, so it seems like something wasn't copied over correctly. The site's not public, so I can't provide a URL, sorry!
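
    A blank page in Drupal is usually a fatal PHP error with display_errors switched off; a debugging sketch (for the dev copy only) that surfaces the message instead of the white screen, e.g. near the top of index.php or in settings.php:

        error_reporting(E_ALL);
        ini_set('display_errors', TRUE);
        ini_set('display_startup_errors', TRUE);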

    Read the article

  • CodeIgniter login problem in IE, Safari, Chrome?

    - by kamal
    Hi everyone, everything works fine on localhost, but I have a problem logging in from other browsers like Safari and Chrome; in Firefox it's OK. Maybe it is because of .htaccess, or maybe something else. I have searched a lot but come to no solution. My .htaccess file looks like this: <IfModule mod_rewrite.c> RewriteEngine On RewriteBase / RewriteCond %{REQUEST_URI} ^system.* RewriteRule ^(.*)$ /index.php?/$1 [L] RewriteCond %{REQUEST_URI} ^application.* RewriteRule ^(.*)$ /index.php?/$1 [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^(.*)$ index.php?/$1 [L] </IfModule> <IfModule !mod_rewrite.c> ErrorDocument 404 /index.php </IfModule> My server folder hierarchy looks like this: public_html > application, system, css, images, .htaccess. application/config/config.php looks like this: $config['index_page'] = "index.php?"; $config['uri_protocol'] = 'QUERY_STRING'; Any help?

    Read the article

  • I can't get areas working in VS2010

    - by devlife
    I just upgraded from VS2010 RC to RTM. Now my areas aren't working. I have a Profile area with a Home controller and an Action method Index(). If I try: http://localhost:4951/profile I get a 404 error saying that the resource can't be found. If I try http://localhost:4951/profile/home I get the same error. However, if I try http://localhost:4951/profile/home/index then the view is returned. Here is my ProfileAreaRegistration: public class ProfileAreaRegistration : AreaRegistration { public override string AreaName { get { return "Profile"; } } public override void RegisterArea(AreaRegistrationContext context) { context.MapRoute( "Profile_Unlock", "Profile/Unlock/{userID}/{unlockID}", new { controller = "Unlock", action = "Index" }, new { userID = new GuidRouteConstraint(), unlockID = new GuidRouteConstraint() } ); context.MapRoute( "Profile_default", "Profile/{controller}/{action}/{id}", new { action = "Home", id = UrlParameter.Optional } ); } Does anyone know what is going wrong?
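
    The route defaults look like the culprit: the area's default route supplies action = "Home" but no default controller, so /profile has no controller to fall back on and /profile/home looks for a Home action that does not exist, while /profile/home/index supplies everything. A sketch of the registration with both defaults filled in (assuming the intended landing page is HomeController.Index):

        context.MapRoute(
            "Profile_default",
            "Profile/{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional }
        );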

    Read the article

  • Most efficient way for testing links

    - by Burnzy
    I'm currently developing an app that goes through all the files on a server and checks every single href to verify whether it is valid or not. Using a WebClient or an HttpWebRequest/HttpWebResponse feels like overkill because it downloads the whole page each time, which is unnecessary; I only need to check that the link does not return a 404. What would be the most efficient way? A raw Socket seems like a good way of doing it, but I'm not quite sure how that works. Thanks for sharing your expertise!
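
    One lightweight option, sketched below: send a HEAD request with HttpWebRequest so only the status line and headers come back and no page body is downloaded; a 404 surfaces as a WebException. (Some servers refuse HEAD, so a fallback to GET may still be needed for those.)

        using System;
        using System.Net;

        static class LinkChecker
        {
            public static bool IsAlive(string url)
            {
                try
                {
                    var request = (HttpWebRequest)WebRequest.Create(url);
                    request.Method = "HEAD";          // headers only, no page body
                    request.AllowAutoRedirect = true;
                    request.Timeout = 5000;           // milliseconds, tune as needed

                    using (var response = (HttpWebResponse)request.GetResponse())
                    {
                        return response.StatusCode == HttpStatusCode.OK;
                    }
                }
                catch (WebException)
                {
                    return false;  // 404 and other HTTP errors land here
                }
            }
        }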

    Read the article

  • What is the proper way to handle a fully qualified domain in a GET request?

    - by Mark P Neyer
    I'm writing a proxy server. When I use curl to fetch a page, say http://www.foo.com/pants, curl makes the following request: GET /pants HTTP/1.1 When I have curl send that request through my local proxy, curl changes the GET request to: GET http://www.foo.com/pants HTTP/1.1 This change causes the foo.com server to return a 404. Is foo.com broken? Or is the fully qualified domain name only meaningful to proxy servers? Should I always strip http://domain from the requests I send out? Thanks!
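
    A sketch of what a forwarding proxy normally sends on to the origin server: the absolute-URI form is what clients use when talking to a proxy (RFC 2616 §5.1.2), and the proxy typically rewrites it to the path form, carrying the host in the Host header:

        GET /pants HTTP/1.1
        Host: www.foo.com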

    Read the article

  • How Can I optimize this RewriteEngine Code?

    - by Lucki Mile
    My server is overloaded, and the server admin said this issue is caused by the .htaccess file. This is the code: RewriteEngine On RewriteBase /here/ RewriteRule ^top/?$ index.php?mode=top [QSA] RewriteRule ^top/video/?$ index.php?mode=top&cat=vids [QSA] RewriteRule ^top/picture/?$ /index.php?mode=top&cat=pics [QSA] RewriteRule ^random$ index.php?mode=random [QSA] RewriteRule ^random/video/?$ index.php?mode=random&cat=vids [QSA] RewriteRule ^random/picture/?$ index.php?mode=random&cat=pics [QSA] RewriteRule ^new/?$ index.php [QSA] RewriteRule ^new/video/?$ index.php?mode=&cat=vids [QSA] RewriteRule ^new/picture/?$ index.php?mode=&cat=pics [QSA] RewriteRule ^video/([0-9]+)_(.*)$ item.php?cat=vids&id=$1 [QSA] RewriteRule ^picture/([0-9]+)_(.*)$ item.php?cat=pics&id=$1 [QSA] ErrorDocument 404 /item.php

    Read the article

  • Create fake subdirectories with htaccess and php AND keep existing directories as is.

    - by Arseni
    I have a website that already has numerous subdirectories (all existing in the server's filesystem). I want to create new "virtual" sub-dirs with .htaccess, but I only want the .htaccess rule to apply to directories listed in the DB and not existing in the filesystem. I.e., the file system has /dir1/ & /dir2/, and the MySQL database has records for 'dir3' & 'dir4'. And I want: A: mysite.com/dir1/ and mysite.com/dir2/ to display the existing old content. B: mysite.com/dir3/ and mysite.com/dir4/ to display content from MySQL provided by a PHP script via a redirect like mysite.com/myscript.php?dir=dir3. C: mysite.com/dir5/ to display a 404 error (the directory exists neither in the filesystem nor in the database). Basically I want .htaccess to work like this: IF the dir exists in the DB, apply the rewrite rule and show content from myscript.php?dir=DIR; ELSE don't apply any rule. I can create a separate PHP script that returns 0/1 depending on whether a given dir name exists in the DB, but how do I make mod_rewrite get the data from that script? Is it possible with .htaccess/PHP at all?
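
    mod_rewrite cannot query MySQL from a .htaccess file, but the usual pattern, sketched below, is to send every request that is not a real file or directory to the PHP script and let the script itself decide: serve the DB-backed content for dir3/dir4, or emit the 404 header when the name is unknown (dir5). Real directories like /dir1/ never hit the rule because of the !-d condition.

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([^/]+)/?$ myscript.php?dir=$1 [L,QSA]

    Inside myscript.php, an unknown name can then send header('HTTP/1.0 404 Not Found') before printing an error page.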

    Read the article

  • How to nicely inform to the user that an unknown error has happened?

    - by Jaime Soriano
    There are several guidelines for error reporting, usually based on giving the user useful information when he or she does something wrong, but to give this kind of information you need to be handling the error and know that it can happen. There are also tons of articles about designing 404 error pages. But what can you do when it's a new, unhandled error provoked by a failure in the software? Are there guidelines on how to nicely report totally unexpected errors on a web site, such as an unexpected error 500? What header message should be shown in that case? Would something like "Sorry, an unexpected error has occurred" be enough? What information should be given? Should it include mechanisms to help report the failure to the developers? Which ones?

    Read the article

  • Redirecting a large number of URLs with htaccess or php header

    - by Peter
    I have undergone a major website overhaul and now have 5,000+ incoming links from search engines, external sites, bookmark services etc. that lead to dead pages or 404 errors. A lot of the pages have corresponding "permalinks" or a known replacement hierarchy/URL structure. I've started to list the main redirects in .htaccess, or as physical files containing simply a header Location redirect, which is clearly not sustainable! What would be the best method to map all of the old link addresses to their corresponding new addresses (.htaccess, PHP headers, MySQL, a sitemap file), or is it better to leave the broken links alone and wait for search engines etc. to re-index my site? Are there any implications of having a large number of redirect files for this temporary period until links are reset?
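
    One approach that avoids thousands of .htaccess lines, sketched under the assumption of a MySQL table mapping old paths to new ones (table and column names are made up): point ErrorDocument 404 at a small PHP handler that looks the dead path up and issues a 301, falling back to a real 404 when there is no match:

        <?php
        // redirect.php, wired up with "ErrorDocument 404 /redirect.php" in .htaccess
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

        $db = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
        $stmt = $db->prepare('SELECT new_url FROM redirects WHERE old_url = ? LIMIT 1');
        $stmt->execute(array($path));
        $new = $stmt->fetchColumn();

        if ($new !== false) {
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: ' . $new);
        } else {
            header('HTTP/1.1 404 Not Found');
            echo 'Page not found';
        }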

    Read the article
