Search Results

Search found 24376 results on 976 pages for 'site crawler'.

  • How to ensure JavaScript in the browser is always enabled while traversing my Plone site

    - by user956424
    I wish to prevent the Plone site from loading if JavaScript is disabled in the browser. Where exactly do I change the code, and which template/skin should I use? I want to ensure that JavaScript is always enabled while browsing any part of the site. If JavaScript is disabled while browsing, I could redirect to another page that tells the visitor how to enable it, with a link back to the site once it is enabled. I am using Plone 4.1.
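
    A minimal sketch of the redirect idea, independent of Plone specifics: a noscript block placed in the site's main template (main_template in a Plone 4 skin would be one host for it) sends JavaScript-less browsers to a help page. The /enable-js.html URL is a placeholder, not a Plone page that exists.

        <!-- Hedged sketch: browsers with JS enabled skip <noscript>,
             so only JS-disabled visitors are redirected. -->
        <noscript>
          <meta http-equiv="refresh" content="0; url=/enable-js.html" />
        </noscript>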

  • Problem with session-based login after moving relevant files to site root

    - by YsoL8
    Hello. I have a site which I had been testing in a sub-folder of my client's site root. I had no login problems during testing, but after I moved the new site files from the sub-directory to the main site root, I'm losing my logged-in state after almost every page refresh in secure areas. I am running a $_SESSION-based login system that regenerates the session ID on every page load, with a comparison value stored in the MySQL database. Does anyone have suggestions for what could be causing this problem?
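
    One hedged culprit to rule out, assuming the session cookie was first issued while testing under the sub-folder: a cookie scoped to path=/subfolder is never sent to pages at the new site root. Pinning the cookie to / before session_start() eliminates that possibility ('.example.com' is a placeholder for the client's real domain):

        <?php
        // Hedged sketch: make the session cookie cover the whole site,
        // not the old testing sub-folder.
        session_set_cookie_params(
            0,              // lifetime: until the browser closes
            '/',            // path: the entire site
            '.example.com'  // placeholder for the actual domain
        );
        session_start();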

  • ASP.NET 2.0 integrated sites: how to log out the second site programmatically

    - by NBrowne
    Hi, I am working with an ASP.NET 2.0 site (call it site 1) which has an iframe that loads another site (site 2), also an ASP.NET site developed by our team. When you log on to site 1, site 2 is logged in behind the scenes, so that when you click the iframe tab it displays site 2 with the user already logged in (to prevent the user from having to log in twice). The problem I have is that when a user logs out of site 1, we call some cleanup methods to perform FormsAuthentication.SignOut and clean up session variables etc., but at the moment no cleanup is done for the user on site 2. So the issue is that if the user then opens site 2 directly in a browser, it opens with the user still logged in, which is undesired. Can anyone give me some guidance as to the best approach for this? One possible approach I thought of was that on click of the logout button I could call a custom page on site 2 which would do the logout. Code below:

        HttpWebRequest request =
            (HttpWebRequest)WebRequest.Create("http://www.mywebsite.com/Site2Logout.aspx");
        request.Method = "POST";
        HttpCookie cookie =
            HttpContext.Current.Request.Cookies[FormsAuthentication.FormsCookieName];
        Cookie authenticationCookie = new Cookie(
            FormsAuthentication.FormsCookieName,
            cookie.Value,
            cookie.Path,
            HttpContext.Current.Request.Url.Authority);
        request.CookieContainer = new CookieContainer();
        request.CookieContainer.Add(authenticationCookie);
        request.GetResponse();  // issue the logout call to site 2

    The problem I am having with this code is that when I run it and debug on site 2 and check whether the user is authenticated, they are not, which I don't understand, because if I open a browser and browse to site 2 I am still authenticated. Any ideas, or a different direction to take? Please let me know if you need any more info or if something I have said doesn't make sense. Thanks

  • Protecting my web site content from external access

    - by Testadmin
    Hi, I heard that a web site can be fetched externally using cURL with the following code:

        $curl_handle = curl_init();
        curl_setopt($curl_handle, CURLOPT_URL, 'http://example.com');
        $buffer = curl_exec($curl_handle);
        curl_close($curl_handle);

    I want to protect my web site from this kind of external access. I am using PHP. How can I protect my web site? Does anyone know?
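
    For reference, a hedged sketch of the usual (admittedly weak) countermeasure: reject requests whose User-Agent header is empty or mentions curl. Any client can spoof headers, so this is a speed bump rather than a lock; real protection means authentication or rate limiting.

        <?php
        // Hedged sketch: turn away default curl requests by User-Agent.
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        if ($ua === '' || stripos($ua, 'curl') !== false) {
            header('HTTP/1.1 403 Forbidden');
            exit;
        }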

  • Use jQuery on a variable instead of on the DOM? [solved]

    - by Stef
    In jQuery you can do:

        $("a[href$='.img']").each(function(index) {
            alert($(this).attr('href'));
        });

    I want to write a jQuery function which crawls x levels deep into a website and collects all hrefs to GIF images. So when I use the get function to retrieve another page,

        $.get(href, function(data) { });

    I want to be able to do something like

        data.$("a[href$='.img']").each(function(index) { });

    Is this possible?

    UPDATE: Thanks to the answers, I was able to fix this problem:

        function FetchPage(href) {
            $.ajax({
                url: href,
                async: false,
                cache: false,
                success: function(html) {
                    $("#__tmp__").append("<page><name>" + href +
                        "</name><content>" + html + "</content></page>");
                }
            });
        }

    See this zip file for an example of how to use it.
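
    For the record, the idiomatic form of the data.$(...) idea is to wrap the fetched HTML string in $() and query the resulting detached fragment. A hedged sketch targeting the GIF hrefs described above (note that jQuery strips <html>/<head>/<body> when parsing a full page, and top-level anchors need .filter() rather than .find(), so results can vary with the page's structure):

        // Hedged sketch: treat the fetched markup as a detached DOM.
        $.get(href, function(data) {
            $(data).find("a[href$='.gif']").each(function() {
                console.log($(this).attr('href'));
            });
        });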

  • Best way to password protect a site? .htaccess

    - by Mike Lawsom
    I created/edited a .htaccess file and got my site password protected fine. Question though: is there such a thing as a URL key? Maybe I'm wording that incorrectly, but I would like to keep my site hidden, yet be able to send out a specific URL that can view the site. What's the best way to accomplish this? Thanks in advance.
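
    There is no built-in URL key, but here is a hedged sketch of one common workaround with Apache 2.2-style directives: keep Basic Auth for everyone, let requests carrying a known cookie through, and share a link to a tiny page that sets that cookie before redirecting into the site. "SECRET" and the paths are placeholders, not a real credential scheme.

        AuthType Basic
        AuthName "Restricted"
        AuthUserFile /path/to/.htpasswd
        Require valid-user

        # Browsers carrying the magic cookie skip the password prompt.
        SetEnvIf Cookie "sitekey=SECRET" allow_key
        Order Deny,Allow
        Deny from all
        Allow from env=allow_key
        Satisfy Any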

  • How to disable the Emacs site-start files permanently?

    - by elemakil
    When solving this problem I figured out that I need to disable the site-wide init files in order to get my Emacs + CEDET setup running (everything works nicely when starting Emacs with emacs --no-site-file, but is broken without the additional argument). I'd like to disable the site-wide init files permanently, but as I use several different methods to launch Emacs (launcher/panel/terminal), I don't think aliasing it in my .zshrc will work. I need a method to permanently disable all site-start files. Is there an easy way to achieve this? Thanks!
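
    As far as I know there is no environment variable or init-file setting for this, because site-start.el loads before ~/.emacs. One hedged approach is a wrapper script placed ahead of the real binary in $PATH, which every launcher then picks up (the binary path below is a guess for a typical Linux install):

        #!/bin/sh
        # ~/bin/emacs -- hedged wrapper that always suppresses site-start files.
        # Assumes ~/bin precedes /usr/bin in $PATH; adjust the real path as needed.
        exec /usr/bin/emacs --no-site-file "$@"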

  • Help getting PHP sessions to persist after being taken off-site

    - by Rob
    Hi, we're working on a payment gateway system in PHP. We use sessions to store the shopping cart data. The system takes you off-site to the payment processor's site and then returns you to our site. On most hosts we have no issues: when we arrive back at our site, the session data is still there. On the hosts with an issue, turning off PHP's session.referer_check has worked in the past. We don't really want to have to include the PHPSESSID as a GET variable, and hope to avoid that. What I'm looking for is: what other server configurations could cause the session to be killed, and what other workarounds might there be? Thanks for your help.
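
    A hedged checklist of the session settings most often involved when visitors bounce off-site and back; these are values to audit, not a guaranteed fix:

        <?php
        // Settings that commonly drop sessions on return from an external site.
        ini_set('session.referer_check', '');     // off-site referers no longer invalidate the session
        ini_set('session.use_only_cookies', '1'); // rely on the cookie, never PHPSESSID in the URL
        ini_set('session.cookie_lifetime', '0');  // cookie lasts until the browser closes
        session_set_cookie_params(0, '/');        // cookie valid across the whole site
        session_start();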

  • SharePoint: replicating documents between sites according to an external database on a regular basis

    - by kevin
    I want to copy a file from Site A to Site B according to an external DB status, in a timer job. E.g. I have documents in Site A, and I want to move those documents to Site B when the document's status in an external MS SQL database is "approved". I want to do it in a timer job. What would be a good way to do this? I don't know how to use a timer job and a BDC job together. Please advise me. Thanks in advance. BTW, I don't know much about SharePoint.
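
    A hedged sketch of the timer-job half, using the standard SPJobDefinition base class; the site URLs and file path are placeholders, and the external-database lookup (plain ADO.NET is enough inside a timer job, BDC is not required) is reduced to a comment:

        using System;
        using Microsoft.SharePoint;
        using Microsoft.SharePoint.Administration;

        // Hedged sketch: copy "approved" documents from Site A to Site B.
        public class ApprovedDocCopyJob : SPJobDefinition
        {
            public ApprovedDocCopyJob() : base() { }

            public ApprovedDocCopyJob(string name, SPWebApplication webApp)
                : base(name, webApp, null, SPJobLockType.Job) { }

            public override void Execute(Guid targetInstanceId)
            {
                // 1. Query the external MS SQL database for documents
                //    whose status column reads "approved".
                // 2. Copy each one from Site A's library into Site B's.
                using (SPSite siteA = new SPSite("http://server/sites/A"))
                using (SPSite siteB = new SPSite("http://server/sites/B"))
                {
                    SPFile source = siteA.RootWeb.GetFile("Documents/report.docx");
                    siteB.RootWeb.Files.Add("Documents/report.docx",
                                            source.OpenBinary(), true);
                }
            }
        }

    The job would be registered once (typically from a feature receiver) with an SPMinuteSchedule or SPDailySchedule attached.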

  • What CSS attribute stops my site squidging up?

    - by SLC
    I am looking at a website with the standard centered box design (the entire site is centered in a box with a border on each side). As you shrink the window, the border gets smaller but the content stays the same size. This is fine. However, on their site, once you get smaller still, you simply get scrollbars; the content remains untouched. On my site, if you do this, things like text and links start to wrap onto multiple lines and the whole site gets broken. How can I keep it the same even when the viewport is small?
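
    The attribute in question is almost certainly min-width on the centered wrapper: once the viewport is narrower than that value, the browser shows a horizontal scrollbar instead of reflowing the content. A minimal sketch, with a placeholder width and #wrapper standing in for the site's actual container:

        /* Hedged sketch: stop the layout collapsing below its design width. */
        #wrapper {
            min-width: 960px;   /* placeholder: match the design's box width */
            margin: 0 auto;     /* keep the box centered */
        }

    Worth noting that IE6 ignores min-width; a fixed width on the wrapper is the classic fallback there.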

  • Need to crawl images and their containing web pages

    - by Kei Situ
    Hey, I am starting a project and I am wondering about the relationship between the characters in images and the whole web page where the images reside. So first, I want to crawl some images and their web pages, and I need to save the crawl results to local disk for further analysis. I wonder if there is any open source software for this? Thanks! ^_^
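
    One hedged open-source option that needs no code at all: wget can mirror pages together with the images they embed. A sketch (the depth and URL are placeholders):

        # Crawl two levels deep, keeping each page plus its images on disk.
        wget --recursive --level=2 --page-requisites \
             --convert-links --directory-prefix=./crawl \
             http://example.com/

    For larger-scale crawling, Heritrix and Apache Nutch are the usual open-source crawlers to evaluate.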

  • Mark Shuttleworth apologizes for the cease-and-desist against the FixUbuntu.com site and for his remarks about Mir's detractors

    Following the cease-and-desist sent to the FixUbuntu.com site for trademark infringement, Canonical, through its founder Mark Shuttleworth, has offered an apology to Micah Lee, the site's maintainer. As a reminder, Canonical had objected to Micah Lee's use of the "Ubuntu" name and logo, which could "lead to confusion or an association of his site...

  • nginx php-fpm empty site

    - by Katafalkas
    I am writing this after being stuck trying to fix it the whole night. We have recently migrated to a new server. Before the migration we had been running our site on nginx + CGI (script). After the migration we decided to try Apache + mod_php. It was rather terrible, and I would like to migrate back to nginx, but this time I want it with php-fpm (as people say it's cool). So I followed a number of guides and I think I did everything correctly. I also reviewed our old config files for the "server" section and placed it into the new config. I ended up with an empty site when I enter our URL. (By empty I mean a blank page: no text, errors, or anything at all.) In the access log there are some weird entries like:

        123.242.148.54 - - [22/Mar/2012:06:08:11 +0200] "-" 400 0 "-" "-"

    My guess is that php-fpm is not working, but I am not sure how to confirm it. Maybe someone could give some help? I would much appreciate it. My nginx config:

        user nginx;
        worker_processes 12;

        error_log /var/log/nginx/error.log;
        #error_log /var/log/nginx/error.log notice;
        #error_log /var/log/nginx/error.log info;

        pid /var/run/nginx.pid;

        events {
            worker_connections 4096;
        }

        http {
            include /etc/nginx/mime.types;
            #default_type application/octet-stream;

            log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                            '$status $body_bytes_sent "$http_referer" '
                            '"$http_user_agent" "$http_x_forwarded_for"';

            access_log /var/log/nginx/access.log main;

            sendfile on;
            #tcp_nopush on;

            #keepalive_timeout 0;
            keepalive_timeout 65;
            tcp_nodelay on;

            gzip on;
            gzip_disable "MSIE [1-6]\.(?!.*SV1)";

            fastcgi_buffer_size 256k;
            fastcgi_buffers 4 256k;

            server {
                listen 80;
                server_name www.example.com;
                access_log /var/log/nginx/example.access.log;
                error_log /var/log/nginx/error.log;
                root /home/www/example;
                index index.php;
                client_max_body_size 50M;
                #error_page 404 /404.html;

                # phpMyAdmin
                location /phpmyadmin {
                    root /usr/share/;
                    error_log /var/log/nginx/phpmyadmin.log;
                    try_files $uri $uri/ /index.php;
                }

                location ~ ^/phpmyadmin/.*\.php$ {
                    root /usr/share/;
                    error_log /var/log/nginx/phpmyadmin.log;
                    include /etc/nginx/fastcgi_params;
                    fastcgi_pass 127.0.0.1:9000;
                }

                # Munin
                location /monitoring {
                    root /var/www/;
                    auth_basic "Restricted";
                    auth_basic_user_file /etc/nginx/conf.d/monitoring_users;
                    error_log /var/log/nginx/monitoring.log;
                    index index.html;
                }

                # The site
                location / {
                    try_files $uri $uri/ /index.php/?$uri&$args;
                }

                # PHP interpreter
                location ~ \.php {
                    include /etc/nginx/fastcgi_params;
                    fastcgi_pass 127.0.0.1:9000;
                }
            }
        }
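
    A hedged first thing to check: blank pages from nginx + php-fpm are classically caused by php-fpm never receiving the script path, because on some distributions /etc/nginx/fastcgi_params lacks SCRIPT_FILENAME. If that is the case here, adding it to the PHP location should bring the site back:

        location ~ \.php {
            include /etc/nginx/fastcgi_params;
            # Often missing from the stock fastcgi_params file:
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass 127.0.0.1:9000;
        }

    To confirm php-fpm itself is alive, check that something is listening on 127.0.0.1:9000 (netstat -lnp | grep 9000) and watch php-fpm's own error log while you hit the site.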

  • Problem in publishing

    - by girish
    When I publish my .NET website on my domain, opening the site shows a directory listing of the files. Can anyone please tell me how to avoid this and make the website open my home page?
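
    A hedged guess at the cause, assuming IIS7 or later: no default document is configured for the site (or directory browsing has been switched on). In web.config, Default.aspx below is a placeholder for the site's real start page:

        <configuration>
          <system.webServer>
            <directoryBrowse enabled="false" />
            <defaultDocument enabled="true">
              <files>
                <clear />
                <add value="Default.aspx" /> <!-- placeholder start page -->
              </files>
            </defaultDocument>
          </system.webServer>
        </configuration>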

  • IIS6: Web Site presenting the wrong SSL certificate

    - by pcampbell
    Consider an IIS6 installation with multiple Web Sites. Each is intended to be a different subdomain with its own cert (not a wildcard cert). Each has its host header specified properly. foo.example.com - port 443. Require SSL w/128 bit. Working properly! It presents its SSL cert properly to the browser. Configured for a specific IP address. bar.example.com - port 443. Require SSL w/128 bit. Configured for all unassigned addresses. When inspecting the IIS property page, it fully shows the cert for bar.example.com on the View Certificate button. This is a NEW web site that is having cert problems: it's presenting the cert for foo.example.com. Ouch! Question: can you have more than one subdomain, each running as a separate web site with its own SSL cert, on the same port (443)? How would you configure two web sites on the same range of 'all unassigned' addresses for the same port (443)? Update: ignoring the cert error, when browsing to https://bar, the content served is from the https://foo site. When NOT using SSL, browsing to http://bar serves the correct content from bar. Just one address is assigned to this DMZ server.
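
    Hedged background, in case it explains the behavior: IIS6 predates SNI, so for a given IP address and port 443 it can present exactly one certificate; the host header arrives inside the encrypted request and cannot be used to pick a cert, which is also why https://bar falls through to the foo site. The usual fix is a dedicated IP per SSL site. Secure host-header bindings via adsutil.vbs (sketched below, with 2 as a placeholder site ID) only help when the sites can share a wildcard or SAN certificate:

        rem Bind a secure host header to site ID 2 (shared wildcard/SAN cert scenarios only).
        cscript.exe C:\Inetpub\AdminScripts\adsutil.vbs set /w3svc/2/SecureBindings ":443:bar.example.com"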

  • Server configuration advice for new site that could get lots of traffic within 6m

    - by alchemical
    We're setting up a new web 2.0 type site with elements of e-commerce. Budget is kind of tight. Due to the nature of the site and promotions, etc., we expect traffic could ramp up fairly quickly. Looking for advice on a good configuration to start with; we're looking to co-lo with CalPop in downtown LA. We've looked at Dell and ABMX.com, and got a quote from CalPop (they build their own servers, as they also do managed hosting). The price range has been anywhere from about $1200-$3300 per server. We're thinking to start with a web server and a DB server, both with mirrored drives. It would be nice to stay under about $2k per server if possible. The minimum configuration for each would probably be a quad-core with 8GB RAM. We're thinking to run Windows Server 2008 R2 (Web Edition?) and SQL Server 2008. Looking for advice on the best server configurations and/or brands that fit the budget, yet will allow us to scale smoothly as traffic increases. Reliability is also pretty important. Also wondering if a switch/router is necessary or useful to connect the two servers.

  • Domain with www pointing to another site

    - by ntechi
    Recently I set up multiple sites on my VPS, which runs 64-bit CentOS. Currently I have two sites live and each works fine on its own. Now the problem is with the URLs. I have the following sites: http://mbas.co.in and http://u-k.in (mbas was the very first site on the VPS). If I type http://mbas.co.in or http://www.mbas.co.in, both bring up my mbas website. For the second website, if I type http://u-k.in it brings up the u-k website correctly, but if I type http://www.u-k.in it takes me to the mbas website. I have configured my DNS this way (see the image http://i55.tinypic.com/14vlpxl.jpg), and my multi-site config is this:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /var/www/html/www.mbas.co.in
            ServerName mbas.co.in
            ErrorLog logs/mbas.co.in-error_log
            CustomLog logs/mbas.co.in-access_log common
        </VirtualHost>

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /var/www/html/u-k.in
            ServerName u-k.in
            ErrorLog logs/u-k-error_log
            CustomLog logs/u-k-access_log common
        </VirtualHost>
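
    A hedged diagnosis: with name-based virtual hosts, a hostname that matches no ServerName or ServerAlias falls through to the first vhost defined, which here is mbas.co.in, and neither vhost declares its www form. Assuming the DNS record for www.u-k.in exists (or is added), declaring the aliases should fix it:

        # Add the www form of each host as an alias.
        <VirtualHost *:80>
            ServerName mbas.co.in
            ServerAlias www.mbas.co.in
            DocumentRoot /var/www/html/www.mbas.co.in
        </VirtualHost>

        <VirtualHost *:80>
            ServerName u-k.in
            ServerAlias www.u-k.in
            DocumentRoot /var/www/html/u-k.in
        </VirtualHost>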
