Search Results

Search found 22500 results on 900 pages for 'multiple browsers'.

Page 408/900 | < Previous Page | 404 405 406 407 408 409 410 411 412 413 414 415  | Next Page >

  • The proxy server received an invalid response from an upstream server

    - by chandank
    I have a Tomcat server behind Apache. I am using mod_ssl and a reverse proxy to Tomcat; both are running on their default ports. The full error is as follows: "Proxy Error. The proxy server received an invalid response from an upstream server. The proxy server could not handle the request POST /pages/doeditpage.action. Reason: Error reading from remote server." If I clear the browser cache the error goes away, then comes back after a few attempts. I see the same thing in Chrome/Firefox/IE on Windows; oddly, it works perfectly in Chrome/Firefox on Linux. I've googled a lot and there are a few answers on Stack Overflow, but none of them solve my problem. Is this a server-side problem? So many browsers can't all be wrong at the same time on Windows.
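
    A minimal sketch of the kind of Apache vhost this describes, with the tweaks most often suggested for intermittent "Error reading from remote server" on proxied POSTs. The certificate paths and backend port are placeholders, not taken from the question; treat it as a starting point rather than a known fix.

        # Hypothetical mod_ssl + mod_proxy vhost in front of Tomcat
        <VirtualHost *:443>
            SSLEngine on
            SSLCertificateFile    /etc/ssl/certs/example.crt
            SSLCertificateKeyFile /etc/ssl/private/example.key

            ProxyRequests Off
            ProxyPreserveHost On
            ProxyPass        / http://localhost:8080/
            ProxyPassReverse / http://localhost:8080/

            # Commonly tried workarounds for intermittent proxy read errors:
            SetEnv proxy-nokeepalive 1
            SetEnv proxy-initial-not-pooled 1
            ProxyTimeout 300
        </VirtualHost>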

    Read the article

  • Macbook Pro suddenly lagging video playback + Flash sites

    - by Mathias
    I have a MacBook Pro: OS X Lion, Intel Core 2 Duo, 4 GB RAM, NVIDIA GeForce 8600M GT with 128 MB RAM, Intel X25-M SSD, approximately 4 years old. I've been running Flash sites and playing videos without any problems for years. Then, suddenly, 3 months ago a Flash site like http://thefwa.com started lagging in all browsers; even mouseover animations, anything. Video playback in e.g. VLC and QuickTime is now lagging as well, with the same videos I played before, and I tried installing an older version of VLC without any luck. Playing back video in VLC pushes the CPU to almost 100%, and Flash sites like thefwa.com easily take up 50-60%. It's as if hardware acceleration stopped working, or the GPU lost its magic. UPDATE: the same issue also occurred on Snow Leopard. Has anyone experienced something similar, or do you know what might be wrong?

    Read the article

  • Linux Kernel Packet Forwarding Performance

    - by Bob Somers
    I've been using a Linux box as a router for some time now. Nothing too fancy: just enabling forwarding in the kernel, turning on masquerading, and setting up iptables to poke a few holes in the firewall. Recently a friend of mine pointed out a performance problem: single TCP connections seem to experience very poor performance, and you have to open multiple parallel TCP connections to get decent speed. For example, I have a 10 Mbit internet connection. When I download a file from a known-fast source using something like the DownThemAll! extension for Firefox (which opens multiple parallel TCP connections) I can max out my downstream bandwidth at around 1 MB/s. However, when I download the same file using the built-in download manager in Firefox (which uses only a single TCP connection) it starts fast and then the speed drops until it levels off around 100-350 KB/s. I've checked the internal network and it doesn't seem to have any problems; everything goes through a 100 Mbit switch. I've also run iperf both internally (from the router to my desktop) and externally (from my desktop to a Linux box I own out on the net) and haven't seen any problems; it tops out around 1 MB/s like it should, and Speedtest.net also reports 10 Mbit speeds. The load on the Linux machine is around 0.00, 0.00, 0.00 all the time, and it's got plenty of free RAM. It's an older laptop with a Pentium M 1.6 GHz processor and 1 GB of RAM. The internal network is connected to the built-in Intel NIC and the cable modem is connected to a Netgear FA511 32-bit PCMCIA network card. I think the problem is with the packet forwarding in the router, but I honestly am not sure where the problem could be. Is there anything that would substantially slow down a single TCP stream?
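
    For reference, a minimal sketch of the forwarding/NAT setup the question describes, followed by the kernel TCP options that are commonly checked when a single stream is slow but parallel streams are fine. The interface names (eth0 = cable modem, eth1 = LAN) are assumptions.

        # Kernel forwarding + masquerading (roughly what the question describes)
        sysctl -w net.ipv4.ip_forward=1
        iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
        iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
        iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT

        # Settings worth verifying for single-stream throughput
        sysctl net.ipv4.tcp_window_scaling    # should normally be 1
        sysctl net.ipv4.tcp_sack              # should normally be 1
        sysctl net.core.rmem_max net.core.wmem_max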

    Read the article

  • Did Firefox running on OS X get hacked?

    - by z-buffer
    When I try to do a Google search in Firefox, I can't click on any of the links; they're just regular black text, not hyperlinks. I even tried safe mode and disabling all the plugins. I was running Firefox 12; I installed the current version over it and it's the same thing. Other browsers have not been affected. This is what it looks like. Edit: My firewall was turned off and I had several things running which are potential security holes. I turned on my firewall, closed all unnecessary ports, turned off Home Sharing, and then restarted my computer. After that, Firefox works normally again. What do you think happened?

    Read the article

  • Chroot jail for Nginx and PHP

    - by sqren
    I'm hosting multiple websites on one VPS and want to chroot each website, e.g. /chroot/website1, /chroot/website2. I'm using makejail, which is a high-level tool, for creating the jails and copying in the libraries and dependencies. Easy peasy. Each website will need nginx, PHP and MySQL. For PHP I'm using php5-fpm, which actually supports chroot via configuration, but I'm not using that (maybe I should?). My question is which of the following approaches is better:
    1) Every website gets its own separate instance of nginx, PHP and MySQL. The downside is that each web server + PHP has to listen on a different port, and I also need a "master" nginx web server in front of them, reverse proxying to the chrooted servers behind it. Probably the most secure, but also the most involved.
    2) I don't make any chroot jails manually. I set up one nginx web server that proxies PHP requests to php-fpm on different ports, and I can have multiple php-fpm pools, each with its own chrooted folder. This is quite manageable; however, only PHP will be chrooted, not the actual web server. Is this secure enough? Also, when I tried this option out, it seems I will need to use TCP instead of sockets for connecting to MySQL.
    3) You tell me ;) I'm quite new to chroot jailing, so please correct me if I'm wrong in my assumptions. I've been reading all the tutorials I could find; however, good chroot guides are scarce. Any help or input is much appreciated!
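
    A minimal sketch of option 2: one php-fpm pool per site with chroot enabled, plus the matching nginx location. Pool names, ports and paths are placeholders. Since the MySQL socket lives outside the jail, connecting over 127.0.0.1 (as the question already discovered) is the usual workaround.

        ; /etc/php5/fpm/pool.d/website1.conf -- hypothetical per-site pool
        [website1]
        user = website1
        group = website1
        listen = 127.0.0.1:9001
        chroot = /chroot/website1
        chdir = /

        # Matching nginx location for that vhost; SCRIPT_FILENAME is relative to the chroot
        location ~ \.php$ {
            fastcgi_pass 127.0.0.1:9001;
            fastcgi_param SCRIPT_FILENAME /htdocs$fastcgi_script_name;
            include fastcgi_params;
        }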

    Read the article

  • Finding a backup and synchronization solution

    - by Andrea Zilio
    I'm having difficulty finding a backup and synchronization solution with the following characteristics: cross-platform (Windows, Linux, Mac); offsite backup (so internet backup); data deduplication; transfers only the new/modified bits of modified files; secure (data encrypted before leaving the computer); maintains multiple versions of files (even deleted files); folder synchronization integrated with the backup and across multiple computers connected to the internet (not necessarily on the same LAN). I think the folder sync feature needs a better explanation. The use case is this: you have a desktop PC and a laptop. The desktop PC contains a folder with some files, and this folder is part of the backup (so it was selected to be backed up). The laptop does not contain that folder or those files at all. Then you're abroad with your laptop and you need that folder, so you want to be able to open the backup program, select that folder from the backup and download it to your laptop, keeping it synchronized with the backed-up version. When you then come back home and switch on your desktop PC, you want the folder we're talking about to be updated on the desktop PC as well. Does anyone know of a service with all these features? I've only found SpiderOak to support everything I've mentioned, but I'm not completely satisfied by the time taken to complete a backup: sometimes it seems to hang for minutes for no reason at all, and folder synchronization occurs only after all files are backed up (instead, folder sync should have a separate queue independent of other backup operations, and synchronization should occur frequently, for example every 5 minutes or less, independently of the frequency of normal backup operations).

    Read the article

  • How to change start menu location for Windows 7 Programs

    - by user30994
    I keep my Start Menu in Windows 7 very organized, but every time there's a program update available (Safari, for example), the program recreates its shortcut icons in its default Start Menu location (Safari, for example, recreates its shortcuts in "Start Menu\All Programs\Safari"). So every time I update a program I have to move its Start Menu icons again to keep them organized the way I like. Some programs ask where I would like the Start Menu icons placed, and that works fine, but for the programs that don't ask... is there a way to set a default Start Menu location per program, so that when I update, the shortcuts are placed in the folders I want them in? (Safari, for example, I keep at "Start Menu\All Programs\Web Browsers\Safari.lnk".)

    Read the article

  • Qmail Patching Makes me Nervous

    - by JM4
    We have a system running CentOS 5 with Plesk 8.6 and Qmail. Our primary domain is hosted through Media Temple. When Plesk and Qmail are hosted on a single dedicated virtual server, Qmail reads the primary server IP and domain and reports those when sending emails from the system. Our pages are written in PHP, so we are using the mail() function. While our email goes out to everybody, several enterprise email domains reject it because it shows a different originating IP (our primary server IP and domain) than the domain we list in the 'from' address, and this is not modifiable. Every domain we own does, of course, have its own IP as well underneath our primary server IP. I have seen several places online that provide a patch, specifically one which allows domain bindings: "DomainBindings -- For servers that host multiple domains or have multiple IP addresses assigned to them, it is sometimes useful (or important) to have qmail use a specific IP address for its outgoing mail. By default, qmail uses whatever address the OS chooses for all outbound connections. With this patch, you can specify which address to use. It uses a control file similar to smtproutes to specify the outbound IP address to use, based on the sender's domain (local copy) (pyropus.ca)" Qmail Link. First off, I do not have netqmail installed, so I'll need to find another source; I am also completely unfamiliar with applying patches to qmail. Will I lose email services if I patch? Is it a simple apply-and-use process? Will my existing email accounts and data be restored after the patch? I am very, very new to Unix/Linux, so this does make me a bit nervous, but I am the only person who can make the change and it is one our company has to have. Any ideas?

    Read the article

  • graphics-card makes sound-card produce a buzzing sound

    - by Markus von Broady
    Recently I bought a new GPU, a GeForce GTX 550 Ti, and after installing it I get a strange buzzing sound. It's not always there, just sometimes (mostly when I open a game, but sometimes also in browsers, etc.). It's not a capacitor or a fan, as unplugging the speakers from the sound card makes the 'bzzzzzzz' go away; however, muting Windows doesn't mute this sound. I'm pretty sure it's the fault of the new GPU, but how is this happening, and can I fix it? Could it be an underpowered power supply? I thought of buying a stronger unit, but since everything works and the computer doesn't shut down, I hesitate.

    Read the article

  • VPN only connects to its server!

    - by Eddie
    Hi guys. A while back I bought a Windows 2003 VPS and enabled Routing and Remote Access so that users can make a VPN connection. I turned the firewall off and everything was working fine. But since two days ago, whenever I try to connect to the VPN, the connection itself succeeds without any problem and I can see the connection status, yet it only reaches the server itself: all I can do over the VPN is connect to the server via Remote Desktop, and the only IP I can ping is the server's. I can't open any web pages in browsers or ping any IP addresses besides the server's. I've also rebuilt the server and configured routing and VPN access from scratch, but that doesn't work either. It seems the server fails to route the traffic properly; since I'm sure the firewall has been turned off, I can't figure out the reason. Any idea what's going on? Thanks in advance.

    Read the article

  • SSH tunnel over http proxy with blocked 443 (SSL)

    - by Evgeny Zhulenev
    Is it possible to create an SSH tunnel over an HTTP proxy when HTTPS access is denied? I had this configuration in .ssh\config:
        Host home
            User root
            Hostname *my-home-pc-with-ssh-access-allowed*
            Port 8090
            ProxyCommand corkscrew db-isa-01 8080 %h %p ~/.ssh/.corkscrew-db-isa-auth
            IdentityFile ~/.ssh/id_rsa
    where db-isa-01 is my corporate proxy server. Today the admins blocked all HTTPS access and allowed it only for a few servers on a whitelist. I used this command to create a tunnel: ssh -D 7070 -o 'GatewayPorts yes' -A -q -g -t root@home, and now it doesn't work. As far as I can tell, that's because our proxy denies all HTTPS connections: "Proxy could not open connnection to ***: Proxy Error ( The specified Secure Sockets Layer (SSL) port is not allowed. Forefront TMG is not configured to allow SSL requests from this port. Most Web browsers use port 443 for SSL requests. )" P.S. I use Windows 7 and corkscrew with Cygwin, so Linux-only solutions are not suitable for me.
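
    A sketch of the workaround usually suggested when a proxy only allows CONNECT to port 443: have the home sshd listen on 443 and tunnel to that port instead. The hostnames below are the ones from the question; everything else is an assumption, and it only helps if the admins still permit CONNECT to 443 for arbitrary destinations.

        # On the home machine, in /etc/ssh/sshd_config (sshd can listen on several ports):
        Port 8090
        Port 443

        # In ~/.ssh/config on the work machine, point the proxied connection at 443:
        Host home
            User root
            Hostname my-home-pc-with-ssh-access-allowed
            Port 443
            ProxyCommand corkscrew db-isa-01 8080 %h %p ~/.ssh/.corkscrew-db-isa-auth
            IdentityFile ~/.ssh/id_rsa

        # The dynamic SOCKS tunnel is then created as before:
        ssh -D 7070 -o 'GatewayPorts yes' -A -g -t home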

    Read the article

  • transient DNS issues with certain sites in Snow Leopard

    - by ceejayoz
    I'm having a rather bizarre DNS issue with Snow Leopard. Certain sites, MSNBC.com being the most noticeable for me, have an odd DNS issue in all browsers. After not visiting the site for a while (30 minutes or so), the first attempt to access MSNBC.com results in a DNS error; refreshing 1-5 times resolves the issue until the next ~30-minute period of inactivity. I've seen this on three separate Macs at this point: one from-the-factory Snow Leopard install, two upgrades. Most sites are just fine. I Googled and found other reports of the same thing with MSNBC, but no solutions.

    Read the article

  • Firefox does not load certificate chain

    - by TimWolla
    I'm running lighttpd/1.4.28 (ssl) on Debian Squeeze. I just created a http://startssl.com certificate. It runs fine in all of my browsers (Firefox, Chrome, Opera), but my users are reporting certificate errors in Firefox. I've already nailed it down to the certificate chain failing to load. Certificate in my Firefox: http://i.stack.imgur.com/moR5x.png - Certificate in other users' Firefox: http://i.stack.imgur.com/ZVoIu.png (note the missing StartCom certificates there). I followed this tutorial for embedding the certificate in my lighttpd: https://forum.startcom.org/viewtopic.php?t=719 The relevant parts of my lighttpd.conf look like this:
        $SERVER["socket"] == ":443" {
            ssl.engine  = "enable"
            ssl.ca-file = "/etc/lighttpd/certs/ca-bundle.pem"
            ssl.pemfile = "/etc/lighttpd/certs/www.bisaboard.crt"
        }
    ca-bundle.pem was created like this: cat ca.pem sub.class1.server.ca.pem > ca-bundle.pem (I grabbed the relevant files from http://www.startssl.com/certs/). www.bisaboard.crt was created like this: cat certificate.pem ssl.key > www.bisaboard.crt, where certificate.pem is my StartSSL Class 1 certificate and ssl.key is my SSL key. Do you have any idea why the second Firefox does not correctly load the certificate chain?
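
    One way to check whether the server is actually sending the intermediate certificate, rather than the working browsers filling it in from their own certificate cache, is to inspect the presented chain with openssl. The hostname below is a placeholder for the real site name.

        # Show the chain the server actually presents
        openssl s_client -connect www.example.com:443 -showcerts < /dev/null

        # Expect two entries under "Certificate chain": the site certificate and the
        # StartCom Class 1 intermediate. If only the site certificate appears, the
        # ssl.ca-file chain is not being served, and only browsers that already have
        # the intermediate cached (e.g. from visiting other StartSSL sites) will
        # validate it -- which matches the symptom described.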

    Read the article

  • Auto-detect proxy settings on network

    - by Ali Lown
    I am having problems trying to run web browser software on the local network through the proxy. When running off the profile drive, which is on a network share, the system is unable to auto-detect the proxy settings; when running off the local C: drive, the browsers auto-detect the settings correctly. The error from the browser says it is unable to fetch the proxy configuration file. Is this some form of authentication preventing it from retrieving the settings when running off the network location? P.S. Would this be better off on Super User?
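
    For context, proxy auto-detection normally means the browser fetches a wpad.dat / proxy.pac file over HTTP and evaluates it; a minimal sketch of such a file is below (the proxy host, port and internal domain are placeholders). If the file itself is reachable, the difference between local and network-share launches points at the share's security zone or credentials rather than the PAC file.

        // Hypothetical wpad.dat / proxy.pac
        function FindProxyForURL(url, host) {
            // Go direct for plain/internal hosts, use the proxy for everything else
            if (isPlainHostName(host) || dnsDomainIs(host, ".internal.example"))
                return "DIRECT";
            return "PROXY proxy.example.com:8080";
        }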

    Read the article

  • Clustering/load balancing for cluster unaware applications

    - by AaronLS
    Forgive me if I use any of these terms incorrectly. I am wondering if there is any kind of software that would allow me to "join" two computers together such that a cluster-unaware application could utilize their combined computing resources. By "cluster unaware" I mean an application that isn't designed to share work across multiple servers. My understanding is that clustering is enabled by the specific application's architecture, such that messaging between multiple instances of the application coordinates the sharing of work. Instead, I am looking for something that enables clustering at the OS or virtualization level, so that any application could essentially be clustered. Failing that, I am also wondering about the following scenario: we have three different applications we will call A, B, and C, and two single-core computers. At any given time, any combination of those applications may be CPU intensive. In cases where only two of those apps are very active, one of them could be moved over to a different server: in a nutshell, some sort of dynamic, automatic shuffling of the applications' load. I have heard of virtual machines that can be migrated across physical machines while live, but I am wondering if this can be done automatically in response to an application's or VM's CPU activity.

    Read the article

  • Web browser being selective about the sites that it will visit.

    - by Andrew Doran
    I've been trying to help my father-in-law with this problem but haven't been able to get anywhere. Since the weekend, the web browsers on his computer (Chrome and Internet Explorer on Windows XP) will only let him get to certain sites; for example, he is able to do his online banking but he cannot visit www.bbc.co.uk, www.amazon.co.uk or www.ancestry.com. There is another computer in the house that goes via the same router and can reach all of these sites, which suggests the problem is his machine. I tried running a tracert to www.bbc.co.uk and it gets through, but the web browser hangs with a message that it is waiting for a response. I tried the WinSockFix tool in case it was anything to do with a recent registry change, but that didn't work either. He can't think of anything he recently did on his machine that could have caused the problem. Can anyone help?

    Read the article

  • Use trackball to scroll, zoom, etc

    - by filledvoid
    I've got a Logitech Marble trackball (which is great, by the way). By setting one of the extra buttons as a "middle" mouse button, when I click it, many apps (like browsers) start a "scrolling mode" so that moving the trackball scrolls up and down. Most of the time this is sufficient, but I figure it would be way cooler if I could have several "modes" to do different things like zooming, panning and rotating (particularly in GIMP). Then, when I hold Ctrl, Ctrl+Shift, or some such, it would enter a new mode and the trackball would behave differently. I found a couple of questions similar to this that suggest using AutoHotkey, but I haven't found an example script that does this, nor can I figure out how to track mouse movements within AHK. Any pointers? The similar questions I found: "hotkey for scrollwheel" and "remedy for a no scroll wheel trackball?" Thanks!

    Read the article

  • Can't resolve localhost on Mac OS X Server

    - by iainbeeston
    I have a server running OS X Server 10.5 and it can't resolve localhost to 127.0.0.1. When I try to ping, this is what happens:
        ping localhost
        ping: cannot resolve localhost: Unknown host
    SSH and web browsers get similar results (unknown host). If I use 127.0.0.1 or the IP address assigned on the LAN, all of the above work. Here are the contents of my /etc/hosts file:
        cat /etc/hosts
        ##
        # Host Database
        #
        # localhost is used to configure the loopback interface
        # when the system is booting.  Do not change this entry.
        ##
        127.0.0.1       localhost
        255.255.255.255 broadcasthost
        ::1             localhost
        fe80::1%lo0     localhost
    I have no local DNS service running. Does anyone have any idea why this might be happening or how I can fix it?
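
    A few diagnostic commands for this situation: on 10.5 the /etc/hosts file is consulted through Directory Services rather than read directly by every resolver, so it is worth asking Directory Services what it thinks and flushing its cache. These are standard Mac OS X 10.5 tools; nothing here is specific to the server in the question.

        # Ask Directory Services to resolve the name; this should return 127.0.0.1
        dscacheutil -q host -a name localhost

        # Flush the resolver cache and retest
        dscacheutil -flushcache

        # Confirm the hosts file is readable and check the configured resolvers
        ls -l /etc/hosts
        scutil --dns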

    Read the article

  • PC version of Google Chrome doesn't recognize ".local" domain name

    - by prosseek
    With Bonjour installed on the PC, I can access my Mac server using its ".local" name; for example, I can reach my Mac as "prosseek.local". The problem is that Chrome on the PC doesn't treat ".local" as a host name and opens a search page instead of accessing the Mac server. This issue isn't happening with the other web browsers (Internet Explorer/Firefox) on the PC. What is even weirder is that Chrome seems to recognize the ".local" name sometimes, but not always. How do I solve this issue? Or, how can I teach Chrome that ".local" is part of the host name so that it doesn't redirect to the search page?

    Read the article

  • PEN daemon as load balancer, IIS web logs not showing true requester IPs

    - by Aszurom
    I have a Hercules VMware appliance, which is a micro-Linux VM that runs the pen daemon and acts as a server load balancer. It takes any incoming request on the appliance's IP and routes it out to a number of alternate IPs. The logs of the daemon show the true IPs of the browsers hitting the website, but the logs of the websites themselves (IIS 6 and 7) only show the requester IP as that of the load balancer. The IT manager tells me that when we had a hardware appliance (a ServerIron XL) doing the load balancing, the web logs reflected the requester IPs accurately. Is there any way to get this resolved with the daemon, or will I be digging that out of the closet and plugging it back in?
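
    A hedged note: pen is believed to have an HTTP mode that adds an X-Forwarded-For header carrying the client's real IP; check `man pen` on the appliance to confirm the exact option, as the sketch below assumes it is -H. IIS then still has to be configured to log that header, since by default it only logs the connecting IP, which here is always the balancer. The backend addresses below are placeholders.

        # Hypothetical pen invocation with HTTP mode / X-Forwarded-For enabled
        pen -H 80 10.0.0.11:80 10.0.0.12:80

        # On the IIS side, the X-Forwarded-For value has to be captured separately,
        # e.g. via an ISAPI filter (IIS 6) or a logging module (IIS 7), because the
        # stock W3C "c-ip" log field will keep showing the balancer's address.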

    Read the article

  • Java for 64bit isn't working

    - by Loper324
    I'm having errors with Java left and right; normal Java works just fine. It's things that use the internet for certificates, and Java .jnlp files. I've tried Minecraft Classic: it gives me an error. canirunit: error. Carnegie Learning: error. I've switched browsers and still have these errors; everything is broken. I've turned on "ask me" for unsigned certificates instead of blocking them, and the prompt doesn't pop up. I'd like to know how to reset Java; is that possible? I've reinstalled it and rebooted, and nothing works. Here is an image: Here is the rest of the text: http://pastebin.com/bzByPSbh

    Read the article

  • GoDaddy SSL on Shared Hosting

    - by Jon
    I'm very new to using SSL certificates and I have been trying to install one on a site for a client. He uses shared hosting for multiple domains through GoDaddy, and the site we're working on is not the primary domain. He purchased a UCC certificate for multiple domains and I installed it on the shared hosting account. My thought was that since the domains were under the same hosting account, they would each be protected by the certificate. That was apparently not the case. I checked both domains with an SSL checker: the primary domain checked out, but the domain we wanted the SSL on showed the following error: "None of the common names in the certificate match the name that was entered (www.CLIENTDOMAIN.com). You may receive an error when accessing this site in a web browser." I'm not sure how to fix this. It was just purchased yesterday, so if necessary I guess I could uninstall it or re-key it (???). Is there a way to just change the common name to www.CLIENTDOMAIN.com (the correct domain)?
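
    A hedged way to see which names the installed certificate actually covers: a UCC certificate only protects the names explicitly listed in it (the common name plus the Subject Alternative Names), not every domain on the hosting account, so listing them shows whether www.CLIENTDOMAIN.com simply needs to be added as a SAN through GoDaddy's certificate management and the certificate re-keyed.

        # List the subject and the alternative names of the certificate being served
        openssl s_client -connect www.CLIENTDOMAIN.com:443 < /dev/null 2>/dev/null \
          | openssl x509 -noout -subject -text | grep -A1 "Subject Alternative Name"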

    Read the article

  • display internet usage agreement before users can use internet

    - by Force Flow
    I'm looking for a way to display a usage-agreement page in browsers that users must accept before they are allowed to access the internet. This would be for users on public computers and public/open Wi-Fi. I'm using a SonicWall firewall, which does support this feature; however, there is a ridiculously low character limit that makes it impractical to use. I thought about setting the browsers' homepage to a usage-agreement page, but that can easily be bypassed by navigating somewhere else. Are there any other approaches that may be worth considering? There is currently no server in place on the public network, though I can set one up if need be.

    Read the article

  • Some people suddenly say they can't access my site, but I and all others can

    - by Cain
    I have had this problem many times before and I'm still unable to find out what is causing it. It happened to me some months ago, but it fixed itself and everything went back to normal. Things were working fine for quite a while until two days ago: some people are reporting they can't access my site (they get a 404 error), while I can access it normally and many other users can too. I don't know what the common denominator is, since both groups of people are from the same countries, use the same browsers, OS, etc., so the issue doesn't seem related to that. I have reported this problem before to both my host and my domain registrar, but neither claims responsibility for it. Who, then, is to blame? What can I do to find out what's causing the issue and solve it? Thank you.
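
    A hedged way to narrow this down is to compare what different DNS resolvers return for the domain and what the web server answers when addressed directly, bypassing DNS. The domain and IP below are placeholders, since the question doesn't name them.

        # Compare DNS answers: the local resolver vs. a public one
        dig +short example.com
        dig +short example.com @8.8.8.8

        # Ask the web server directly for the site, bypassing DNS; a 404 here means the
        # server itself doesn't know the vhost, while a 200 points back at DNS/propagation
        curl -s -o /dev/null -w "%{http_code}\n" -H "Host: example.com" http://203.0.113.10/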

    Read the article

  • How to enable the 2 concurrent (+1 console) sessions on Windows Server 2012

    - by Dai
    I have a Windows Server 2012 VM running on Windows Azure. I want to enable the ability to have two simultaneous administrative sessions over Remote Desktop; this is permitted under the EULA for Windows Server 2012 and is NOT the same thing as the fully-blown Terminal Services / RDS feature. In Windows Server 2000 and 2003, multiple concurrent sessions (up to a limit of 2, plus the root/console session) were enabled by default, such that logging in via RDP without logging out first would create a new session rather than reconnecting to the old session. Server 2008 and later use single sessions by default, as this simplifies administration (most people want to reconnect to their old sessions). In Windows Server 2008 R2, you can add the MMC snap-in for Remote Desktop Session Host Configuration, which allows you to re-enable concurrent sessions. However, in Server 2012, after adding the Remote Administration snap-ins from Server Manager, it seems the Remote Desktop Host Configuration snap-in has been removed. How can I re-enable multiple concurrent sessions for Remote Desktop for Administration in Windows Server 2012?
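
    For reference, a hedged sketch of the setting the removed snap-in used to expose: the single-session restriction is a registry value (also exposed through Group Policy), and setting it to 0 is the commonly reported way to get the two concurrent administrative sessions back on Server 2012. Verify against current Microsoft documentation before relying on it.

        REM Allow more than one session per user (0 = multiple sessions, 1 = single session)
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fSingleSessionPerUser /t REG_DWORD /d 0 /f

        REM Equivalent Group Policy setting:
        REM   Computer Configuration > Administrative Templates > Windows Components >
        REM   Remote Desktop Services > Remote Desktop Session Host > Connections >
        REM   "Restrict Remote Desktop Services users to a single Remote Desktop Services session" = Disabled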

    Read the article
