Search Results

Search found 14294 results on 572 pages for 'browser modes'.


  • FeedValidator & FeedBurner get 404 when accessing WordPress RSS feeds with custom permalinks enabled

    - by Wazbaur
    I'm helping a friend set up a self-hosted WordPress blog plus FeedBurner, and I'm seeing a problem with the feeds that I'm finding somewhat mysterious. Using the default permalink structure (e.g., ?p=123) everything works as expected; I can follow the feed in Google Reader, navigate to it manually, and set it up in FeedBurner. However, once I switch away from the default permalink structure, FeedBurner and FeedValidator both report that accessing the feed returns HTTP 404, and Google Reader no longer shows new posts (I assume for the same reason), but I can still navigate to the feed in a browser. When I do, nothing appears to be wrong; there is a feed there and it contains all the posts I expect it to have. I've restarted the FeedBurner and Reader set-up from the beginning after changing the link structure, so I don't think they're doing anything silly like looking at the feed at its old address. I've seen people with similar problems in various other places, but there doesn't seem to be a good answer anywhere.

    Read the article

  • Improving Windows Authentication performance on IIS

    - by flalar
    We're struggling with performance issues on an ASP.NET MVC site that uses Windows Authentication. Response time is very slow on the first request to the site, when the user is being authenticated. Further, every time the Authorization header is sent from the browser, the response time increases by many seconds. The same issue occurs for both executed files and static content like CSS and JS. Access to the application is restricted to users within a certain role, and we are now planning to allow access to static files for all authenticated users to see if that helps. The authentication method in use is NTLM. How should we go about pinpointing why authentication degrades performance so drastically?
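
    If that static-files experiment goes ahead, a minimal web.config sketch is below; the path name "Content" is a placeholder for wherever the CSS/JS actually lives, not something taken from the question.

        <location path="Content">
          <system.web>
            <authorization>
              <!-- any authenticated user may fetch static assets; anonymous requests are still denied -->
              <deny users="?" />
              <allow users="*" />
            </authorization>
          </system.web>
        </location>

    Dropping the role check for static assets at least helps isolate whether the NTLM handshake or the per-request authorization is the expensive part.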

    Read the article

  • nginx caching per user agent

    - by Tuinslak
    I'm currently using nginx as a reverse proxy with caching enabled. However, the main site has two different layouts, depending on the user agent (mobile or not). I've tried something similar to this:

        # mobile users
        if ($http_user_agent ~* '(iPhone|iPod|mobile|Android|2.0\ MMP|240x320|AvantGo|BlackBerry|Blazer|Cellphone|Danger|DoCoMo|Elaine/3.0|EudoraWeb|hiptop|IEMobile)') {
            set $iphone_request '1';
        }
        if ($iphone_request = '1') {
            proxy_cache mobile;
        }
        if ($iphone_request = '') {
            proxy_cache site;
        }
        proxy_cache_key "$scheme://$host$request_uri";
        proxy_pass http://real-site.tld;

    However, nginx gives an error stating that proxy_cache can't be used inside an if block. Is there any other way to serve from a different cache depending on the browser? Thanks, Tuinslak
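
    One commonly suggested workaround, sketched below under the assumption that a single cache zone is acceptable, is to classify the user agent once with a map block and fold the result into proxy_cache_key, instead of trying to switch proxy_cache inside an if:

        # in the http {} context
        map $http_user_agent $device {
            default                                             desktop;
            ~*(iPhone|iPod|mobile|Android|BlackBerry|IEMobile)  mobile;
        }

        # in the server/location block that proxies the site
        proxy_cache      site;
        proxy_cache_key  "$device$scheme://$host$request_uri";
        proxy_pass       http://real-site.tld;

    Mobile and desktop responses then live in the same zone but never collide, because the key differs.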

    Read the article

  • Problem running application on Windows Server 2008 instance using Amazon EC2 and WAMP

    - by Siddharth
    I have a basic (small type) Windows Server 2008 instance running on Amazon EC2. I've installed WAMP server onto it and have also loaded my application. I did this using Remote Desktop Connection from my Windows machine. I'm able to run my application locally on the instance, but when I try to access it from my browser using the public DNS name Amazon gave it, I'm unable to do so. My instance has a security group that is configured to allow HTTP, HTTPS, RDP, SSH and SMTP requests on different ports. In fact I have the exact same security group as the one used in this blog: http://howto.opml.org/dave/ec2/ I did almost everything the same as the blog, except for using a different Amazon Machine Image. This is my first time using Amazon EC2, and I can't figure out what I'm doing wrong here.
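
    Assuming the security group really is open, two things worth checking from inside the instance are whether Apache is listening on all interfaces (not just 127.0.0.1) and whether the Windows Firewall on the instance, which is separate from the EC2 security group, allows port 80:

        REM httpd.conf should say "Listen 80" (or "Listen 0.0.0.0:80"), not "Listen 127.0.0.1:80"
        netstat -an | find ":80"

        REM open port 80 in the Windows Server 2008 firewall
        netsh advfirewall firewall add rule name="HTTP 80" dir=in action=allow protocol=TCP localport=80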

    Read the article

  • Remote Desktop client versus web-based access to reports and limited data entry

    - by Voyager
    We have a requirement from management to give our distributors/dealers limited access to our application, so that they can look at their account statements in our books of account and enter their purchase requirements (sales orders for us). We have given a few of them RDC access; they connect to our terminal server and run the reports, which requires a TS client licence for each distributor. Would it be better, more secure and less costly to build a web-based application just for entering the orders and retrieving reports such as pending orders, ledgers, receivables etc.? Also, which is more secure as far as database access is concerned: browser-based access or RDC access? Please answer.

    Read the article

  • VMware Fusion 5 port forwarding

    - by Snap Shot
    I have a service (a node.js express app) running on port 3000 in a CentOS 6.3 guest that I would like to access in a web browser on my Mac Mountain Lion host using VMware Fusion 5 Professional. I am having trouble finding any information about how to do this. I believe I would like to forward the port but I cannot find any information about this using either the GUI or by modifying configuration files. In earlier versions it looks like you might have modified a file called nat.conf but that does not seem to apply to Fusion 5. Has anyone successfully done this? Thank you.
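
    For what it's worth, the NAT configuration seems to have moved rather than disappeared in recent Fusion versions; the sketch below assumes the vmnet8 NAT network and a made-up guest address of 192.168.111.130, both of which would need to be replaced with your own values:

        # /Library/Preferences/VMware Fusion/vmnet8/nat.conf
        [incomingtcp]
        # forward Mac port 3000 to the guest's node.js app
        3000 = 192.168.111.130:3000

    After saving, restarting Fusion (or its networking) should make http://localhost:3000 on the Mac reach the Express app in the guest.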

    Read the article

  • PHP pages are not parsed by Apache on CentOS

    - by infotoknowledge
    I have installed CentOS 5.x, Apache 2.2, PHP 5.3 and MySQL 5.5. I also installed phpMyAdmin. I am able to access phpMyAdmin through the browser without any issues. However, when I create a simple index.php with the phpinfo() function in the default directory, that page is served without PHP parsing. As we all know, phpMyAdmin is a PHP application; it works fine from the same server, yet a simple PHP page in the document root does not. Of course, I tried moving this page into the phpMyAdmin folder and accessing it there, but no success. Please note that I updated the httpd.conf file with the appropriate directives based on the PHP installation guide. Document root: /var/www/html; phpMyAdmin folder: /var/www/html/phpMyAdmin. Any help is appreciated.
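
    A minimal sanity check, assuming PHP is meant to run as mod_php: confirm that httpd actually loads the module and maps .php to it, then restart. Exact file names vary by install, so treat this as a sketch rather than the guide's wording.

        # /etc/httpd/conf/httpd.conf or /etc/httpd/conf.d/php.conf
        LoadModule php5_module modules/libphp5.so
        AddType application/x-httpd-php .php
        DirectoryIndex index.php index.html

        # apply the change
        service httpd restart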

    Read the article

  • Prevent Chrome from automatically opening downloaded PDF and Image files

    - by Phoenix
    When I download a PDF or image in Google Chrome on my Mac, is it possible to prevent Chrome from automatically opening it in my default application for that file type (e.g., Preview)? I notice that Chrome does not do this for other downloaded files such as audio and ZIP archives. I still want to be able to preview files in Chrome; I just want to prevent it from automatically launching my image/PDF viewer application after I download them. For example: I click on a link in an email to a PDF document or an image file. Chrome displays the contents in the browser. I press Cmd-S and save the file to my computer. When the download finishes, the file opens automatically in Preview.app. It's that last step that I would like to bypass.

    Read the article

  • nginx: deny directories and files from being downloaded

    - by YeppThat'sMe
    Gurus, I have a problem and I don't know how to solve it. I am working with Git and Compass/SASS on some projects, and now I want to protect those directories. When I request only the folder, all is fine – I get what I expected, a 403 Forbidden:

        location ~ /\.git {
            deny all;
        }

    But when I request the full path to a config file inside .git, the browser starts to download it. Same scenario with Compass: there is a config.rb file within the project folder which the browser also downloads. How can I prevent this behaviour? How can I deny downloads of specific files?
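
    A sketch of one way to cover both cases: refuse any URI containing a dot-file or dot-directory, and refuse the Compass config by name. With regex locations the first match wins, so these need to sit above any other regex location (a static-extension handler, for example) that could grab the same request first.

        # refuse .git, .gitignore and friends anywhere in the URI
        location ~ /\. {
            deny all;
        }

        # refuse the Compass/SASS configuration explicitly
        location ~ /config\.rb$ {
            deny all;
        }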

    Read the article

  • Getting an error while starting Tomcat?

    - by ram
    The steps of my Tomcat installation are:

        1. cd /home/mpatil/Downloads/
        2. tar zxvf apache-tomcat-6.0.37.tar.gz
        3. cd apache-tomcat-6.0.37/bin
        4. ./startup.sh
        5. tail -f /home/mpatil/Downloads/apache-tomcat-6.0.37/logs/catalina.out

    Command 5 gives these results:

        [root@localhost bin]# tail -f /home/mpatil/Downloads/apache-tomcat-6.0.37/logs/catalina.out
        Nov 08, 2013 12:04:04 PM org.apache.catalina.startup.HostConfig deployDirectory
        INFO: Deploying web application directory docs
        Nov 08, 2013 12:04:04 PM org.apache.coyote.http11.Http11Protocol start
        INFO: Starting Coyote HTTP/1.1 on http-8080
        Nov 08, 2013 12:04:04 PM org.apache.jk.common.ChannelSocket init
        INFO: JK: ajp13 listening on /0.0.0.0:8009
        Nov 08, 2013 12:04:04 PM org.apache.jk.server.JkMain start
        INFO: Jk running ID=0 time=0/115 config=null
        Nov 08, 2013 12:04:04 PM org.apache.catalina.startup.Catalina start
        INFO: Server startup in 3036 ms

    Then I tried http://locahost:8080/ in the browser and nothing comes up. Why? What is wrong with my commands? Please tell me.
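
    A quick way to separate a Tomcat problem from a browser/URL problem is to request the default page from the same shell; note that the address must spell out localhost in full:

        curl -I http://localhost:8080/
        # a healthy Tomcat 6 answers "HTTP/1.1 200 OK" with Server: Apache-Coyote/1.1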

    Read the article

  • Apache rewrite rules behind an nginx proxy

    - by Tuinslak
    Hi, I am running nginx (:80) in front of an Apache webserver (:8080). Nginx config (snippet):

        location / {
            proxy_pass http://www.domain.tld:8080;
            proxy_set_header X-Real-IP $remote_addr;
        }

    If I set localhost instead of www.domain.tld, my browser gets redirected to http://localhost:8080. Apache rewrite rules:

        RewriteEngine On
        Options +FollowSymlinks
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\.
        RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_URI} !\..+$
        RewriteCond %{REQUEST_URI} !/$
        RewriteRule (.*) http://%{HTTP_HOST}/$1/ [L,R=301]
        RewriteCond %{REQUEST_URI} !v2/
        RewriteRule ^(.*)$ v1/$1 [L]

    So far, so good. However, every link (which uses relative paths) appears as http://www.domain.tld:8080/page instead of staying on port 80. Is there any way to solve this through the rewrite rules? I don't want to use absolute paths. Thanks
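
    Rather than fighting it in the rewrite rules, this is usually handled on the nginx side, by passing the original Host header through and rewriting any Location headers Apache emits with its own port; a sketch of the idea, using the same domain name as above:

        location / {
            proxy_pass         http://www.domain.tld:8080;
            proxy_set_header   Host       $host;
            proxy_set_header   X-Real-IP  $remote_addr;
            # map Apache's self-referential :8080 redirects back to port 80
            proxy_redirect     http://www.domain.tld:8080/  http://www.domain.tld/;
        }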

    Read the article

  • Xorg and three button mouse: emulate more than scroll alone

    - by drumfire
    I'm happy with my three-button mouse, but it could have more functions. When I press the middle button (button 2) and move the mouse, I can scroll up/down and left/right. xev shows me that when I drag the mouse with button 2 pressed, it actually emulates button 4 (scroll up), 5 (down), 6 (left) and 7 (right). But can we take this further? For example, I would like to emulate button 8 if I press buttons 1 and 2 together, and button 9 if I press buttons 2 and 3. This would give me the 'back' and 'forward' functionality in my browser. It would also be nice to get more functionality by pressing, for example, Shift+mouse1 or Alt+mouse2; you could have a whole new set of emulated buttons. Can this be done, and if yes, then how?
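
    Chording two physical buttons into a new one is not something plain Xorg offers, but the modifier-plus-button half of the question can be sketched with xbindkeys driving xdotool; both tools are an assumption here, not part of the stock server:

        # ~/.xbindkeysrc
        # Shift + left button acts as browser "back"
        "xdotool key --clearmodifiers alt+Left"
            shift + b:1

        # Shift + right button acts as browser "forward"
        "xdotool key --clearmodifiers alt+Right"
            shift + b:3

    Run xbindkeys once after saving the file (and add it to the session autostart) for the bindings to take effect.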

    Read the article

  • I can't browse PHP pages on my local server

    - by tibin mathew
    Hi, I can't browse PHP pages on my local server; before, it was working fine. I can browse HTML pages and ASP pages with no problems, but when I try to browse a PHP page it never finishes loading. What could the problem be? I am using Windows 2000 Advanced Server and my web server is Tomcat. Please, someone help me. Guys, I'm not getting anything in my browser, it just keeps loading; nothing shows on the page, and I'm not getting a 404 error or anything like that. For example, consider a file located inside a folder named myproject. I can reach as far as http://localhost/projects/myproject, but beyond that I can't browse PHP pages: http://localhost/projects/myproject/index.php just keeps loading and nothing ever appears on the page.

    Read the article

  • Why are PNG-8 files mangled when opened in Photoshop?

    - by Daniel Beardsley
    Why do some 32-bit PNGs open in Photoshop with indexed color and no transparency? For instance, I grabbed a PNG icon file of the Stack Overflow logo at http://blog.stackoverflow.com/wp-content/uploads/icon-so.png. When opening it in Photoshop CS3, Photoshop apparently treats it as indexed color and gets rid of the alpha channel. The image on the right is a screen grab of the icon. Changing the image mode in Photoshop to RGB doesn't change the image at all. I've tried this with a few other PNGs and it seems hit or miss; when viewed in other programs, they display fine. (Left: PNG opened in Photoshop; right: screen grab of the PNG from a browser.) What gives? Does Photoshop not interpret the PNG file format correctly?

    Read the article

  • Auto logon to internet provider

    - by user31673
    We use wireless internet access. The wireless is secured, and to get an IP address you provide the right key; that part is set up just fine. However, on top of that, the first time we access the internet a browser page appears and we have to enter a password; then we have access to the internet for a period of time. How can the password step be automated? I also have a web-based printer that doesn't work (except via USB connection) because it can't get out to the web, and I can't change the router settings. Is there anything I can do to get the printer working and to automate the access?
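
    If the provider's page is a plain HTML form, the login step can sometimes be scripted and rerun whenever access lapses; everything in the sketch below (the portal address and the field name) is a guess that would have to be read off the real login page first:

        # submit the captive-portal form non-interactively (hypothetical URL and field name)
        curl -s -d 'password=YOUR_PASSWORD' http://192.168.1.1/login.cgi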

    Read the article

  • Can't access internet even though everything is working

    - by entity64
    A friend recently upgraded to a new cable internet connection. The modem connects to the router, and various PCs and smartphones belonging to her roommates connect to the router; they don't have any problem accessing the internet. She has Windows 8 and can't access any website (via Wi-Fi or Ethernet). DNS (UDP) is working, DHCP set everything up correctly, Wi-Fi is working, and traceroutes and pings (ICMP) go through with no problem at all. But neither Dropbox nor Skype nor Spotify nor any browser (all TCP) can reach anything. The thing is, though, she can connect through the university Wi-Fi and via a neighbor's Wi-Fi; it's just her home connection. No firewalls are running and the computer is clean - no malware. How can it be that only her home connection won't work while others do?
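
    "ICMP and DNS fine, every TCP connection dead" on a single machine is often an MTU or TCP-offload problem; two things worth trying from an elevated prompt on the Windows 8 box, with the interface name taken from whatever netsh reports for her adapter, are:

        netsh interface ipv4 show subinterfaces
        netsh interface ipv4 set subinterface "Wi-Fi" mtu=1400 store=persistent

        REM disabling receive window auto-tuning is another common workaround
        netsh interface tcp set global autotuninglevel=disabled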

    Read the article

  • Group write permission ignored in Ubuntu

    - by NorthPole
    It's probably my stupidity here, but I'm stuck on this and would appreciate the help. I want my user to have full access to the local Apache root folder, and I also want Apache to have full access to the same folder. What I did was create a new group called DevGroup and add my username and www-data to it; I also changed the permissions to 770 to allow full group access, but now it won't allow me or Apache any kind of access to the folder. Here is what I get with ls:

        drwxrwx--- 12 root DevGroup 4096 Sep 27 17:34 testFolder

    which seems perfect, but when I try to access the folder as my user I get this:

        var/www$ ls testFolder/
        ls: cannot open directory testFolder/: Permission denied

    Also, when I try to access a page in the folder from the browser:

        [Thu Sep 27 17:47:16 2012] [error] [client 127.0.0.1] PHP Fatal error: Unknown: Failed opening required '/var/www/testFolder/foo.php' (include_path='.:/usr/share/php:/usr/share/pear') in Unknown on line 0
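
    One thing worth ruling out before touching the permissions again: group membership is only picked up by sessions started after the user (or www-data) was added, so the shell that prints "Permission denied" may simply not carry DevGroup yet. A quick check:

        id                              # does DevGroup appear in the groups list for this shell?
        newgrp DevGroup                 # start a subshell with the new group active, or log out and back in
        ls /var/www/testFolder/
        sudo service apache2 restart    # so www-data's new membership applies to Apache too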

    Read the article

  • System stuttering caused by hard drive

    - by LukLed
    My system keeps freezing for about 1-2 seconds every time I try to do something. For example, when I enter a URL in the browser, it freezes and starts working again after a few seconds. It is probably related to the hard drive. I installed HD Tune, and while a benchmark is in progress, causing constant disk use, everything works fine in the background; there are no lags. What can be the reason for this issue? My hardware is an Acer Aspire 7740G-6969 running Windows 7.

    Read the article

  • Virtual hosts - best way of dealing with them?

    - by axqe56
    I'm competent at the basics of Apache, PHP and virtual hosting, but have a question about virtual hosting. As far as I'm aware, a HOSTS file can only live in one location, C:/Windows/system32/drivers/etc (this varies in older installs, I believe); I don't think a separate one can be put elsewhere just for Apache virtual hosts while keeping the main HOSTS file for blocking sites etc. I heard about PAC files on Uniform Server's website (http://wiki.uniformserver.com/index.php/Virtual_Hosting:_PAC), but they're browser-specific, aren't they? What's the best way to deal with virtual hosts, other than the HOSTS file? My server isn't currently open to the internet, but what would be the best way to resolve DNS for my virtual-host domains if it were to become forward-facing (i.e. open to the internet)?
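
    For purely local work, the usual pattern is one HOSTS entry per site name plus a matching <VirtualHost> block; a sketch with made-up names and paths:

        # C:/Windows/system32/drivers/etc/hosts
        127.0.0.1    mysite.local

        # Apache, e.g. in httpd-vhosts.conf
        <VirtualHost *:80>
            ServerName   mysite.local
            DocumentRoot "C:/UniServer/www/mysite"
        </VirtualHost>

    If the server later faces the internet, public DNS records for the real domain names take over the job of the HOSTS entries, and the <VirtualHost> blocks stay as they are.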

    Read the article

  • How to download a big file with Chrome on Mac OS X?

    - by Eye of Hell
    If I try to download a big file over an unstable connection or from an unstable server (Xcode 4), Google Chrome simply stops downloading on the first network error, so I have the first 1-2-3 gigabytes of the file and Chrome thinks the download is finished. Unfortunately, I need the entire file, so I need a more advanced download tool like wget. But here comes a problem: most URLs on the web today are not direct URLs but chains of "redirect" pages that use complex JavaScript to generate the next URL and send the browser there. Chrome handles such things fine, but if I give such a URL to wget, it will download some intermediate page as a file - not the file itself but an HTML page with complex redirect JavaScript. Is there any way to get the direct URL out of Chrome, or to somehow discover it, so I can use it with wget? Maybe there's some advanced download manager integrated into Chrome that I just need to enable? I use Mac OS X 10.6.6 and the latest Google Chrome.
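
    One workaround that avoids untangling the redirect chain by hand: let Chrome start the download, copy the resolved address it shows on its Downloads page (the URL the file is actually coming from), and hand that to a client that can resume. A hedged example with curl, URL and filename obviously made up:

        # -L follows any remaining redirects, -C - resumes a partial download
        curl -L -C - -o xcode_4.dmg 'http://adcdownload.example.com/xcode_4.dmg'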

    Read the article

  • How to cd into smb://[email protected] from terminal?

    - by John
    I am using Ubuntu and GNOME on my computer. When I open up the File Browser, on the left-hand rail I see a folder conveniently called "Work Server". When I mouse over it, the caption "smb://[email protected]" appears, and if I click on the folder I can see its contents. Everything is great. But when I open up a terminal/shell and type cd smb://[email protected], I get an error saying the directory doesn't exist. How do I enter this directory via the shell/terminal?
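
    Two common routes, sketched with placeholder server, share and user names since the real ones are hidden in the smb:// caption: browse the share with smbclient, or mount it with CIFS so a plain cd works. GNOME also exposes what Nautilus has already mounted under ~/.gvfs, which is worth a look first.

        # interactive, ftp-like access
        smbclient //fileserver/work -U johnsmith

        # or mount it so the shell can cd into it
        sudo mkdir -p /mnt/work
        sudo mount -t cifs //fileserver/work /mnt/work -o username=johnsmith
        cd /mnt/work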

    Read the article

  • Squid URL rewrites: https to http

    - by bobfran
    I'm exploring some uses of Squid proxy 2.7, and I have seen a good number of examples of URL rewrites that take URLs such as http://somesitename.com and change them to https://somesitename.com. Those examples work great. What I'm wondering, though, is whether it's possible to do the reverse with a Squid URL rewriter, that is, to go from https://somesitename.com to http://somesitename.com. Simply editing the script file that handles the rewrites doesn't seem to do the trick, so I was wondering whether there are certain things I have to configure Squid to do first, and whether what I'm asking is even possible. I have my browser manually set up to use Squid as a proxy for all requests, and I can see HTTPS requests showing up in my Squid access.log file (via the CONNECT method).

    Read the article

  • Intermittent HTTP 401 errors

    - by forthrin
    I am using an intranet solution which requires basic HTTP login. However, there is an intermittent error which requires me to log in again, and then the server says "Forbidden" whether I give the correct login information or not. To add insult to injury, Safari (and Chrome) seems to show the login dialog for every included resource in the HTML, and it's impossible to cancel this modal dialog sequence, so the whole browser is blocked until I've pressed Esc some 30-odd times. After an hour, I may gain access again without having really done anything. My questions: What could cause intermittent 401 errors? Why do the browsers show the login dialog 30 times per page load (presumably once for every included resource in the HTML from the same domain)?

    Read the article

  • How do I now access my site for an installation

    - by user4524
    I have just rented a virtual private server with DirectAdmin. I have an IP address; let's say it's 178.239.60.18. Now I have made a new domain on the server; it resides in a folder called "example". When I want to access this in a browser, I type 178.239.60.18/example or 178.239.60.18:example, but this does not work. What am I doing wrong? When I look at the DNS record, it does say the IP address for example is 178.239.60.18.
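
    Neither 178.239.60.18/example nor 178.239.60.18:example maps onto a DirectAdmin domain directory; with name-based virtual hosting the server chooses the site from the Host header, so the domain has to be requested by name. While the new domain's DNS is still propagating, a hosts-file entry on your own machine lets you test by name (example.com stands in for the real domain, which isn't given in the question):

        # /etc/hosts, or C:\Windows\System32\drivers\etc\hosts
        178.239.60.18    example.com www.example.com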

    Read the article

  • Phishing site uses subdomain that I never registered

    - by gotgenes
    I recently received the following message from Google Webmaster Tools:

        Dear site owner or webmaster of http://gotgenes.com/, [...] Below are one or more example URLs on your site which may be part of a phishing attack: http://repair.gotgenes.com/~elmsa/.your-account.php [...]

    What I don't understand is that I never had a subdomain repair.gotgenes.com, yet visiting it in a web browser serves an actual page. My DNS is FreeDNS, which does not list a repair subdomain. My domain name is registered with GoDaddy, and the nameservers are correctly set to NS1.AFRAID.ORG, NS2.AFRAID.ORG, NS3.AFRAID.ORG, and NS4.AFRAID.ORG. I have the following questions: Where is repair.gotgenes.com actually registered? How was it registered? What action can I take to have it removed from DNS? How can I prevent this from happening in the future? This is pretty disconcerting; I feel like my domain has been hijacked. Any help would be much appreciated.
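
    A quick way to see whether this is a wildcard record rather than a hostname someone explicitly created: resolve the reported name and a random one and compare the answers (plain dig, nothing FreeDNS-specific assumed):

        dig +short repair.gotgenes.com
        dig +short some-random-label-12345.gotgenes.com
        # identical answers point to a wildcard (*) A record; differing answers point to an entry added in the afraid.org account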

    Read the article
