Search Results

Search found 1676 results on 68 pages for 'dr squid'.

Page 3 of 68

  • Disabling the Squid Error pages

    - by Nicholas Smith
    I've just started looking at using Squid for a project and can't see an easy way to disable the Squid error pages (e.g. "Name Error: The domain name does not exist"). We use a custom browser which handles that scenario in its own way, so the Squid error pages override our custom logic. Is it possible to turn them 'off'? I've been through the .conf file and I've found where the error pages are stored, but no real option to disable them. (See the sketch after this entry.)

    Read the article
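
    A note on the question above: Squid has no single switch that turns the error pages off, but a common workaround is to point error_directory at a set of stripped-down templates so clients get an essentially empty response instead of Squid's HTML. A rough sketch, assuming the stock English templates live under /usr/share/squid/errors/en (paths vary by distro and version):

      # copy the stock templates, then blank the ones you want silenced
      mkdir -p /etc/squid/errors-minimal
      cp /usr/share/squid/errors/en/* /etc/squid/errors-minimal/
      echo "" > /etc/squid/errors-minimal/ERR_DNS_FAIL

      # squid.conf: serve the minimal templates instead of the defaults
      error_directory /etc/squid/errors-minimal

    The deny_info directive can likewise replace individual error responses with a redirect, which may suit a custom browser better than an empty body.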

  • dd-wrt and squid proxy

    - by Soe Naung Win
    I am now using a Linksys router with dd-wrt firmware and an off-site Squid proxy (reached over a VPN) for anonymity. The problem is that I have to configure the proxy settings in my browser to use that proxy. What I would like is for all my traffic to pass through the router via the Squid proxy without configuring anything in the browser. I can't use OpenVPN due to port blocking in my country. My Squid proxy currently listens on port 443. (See the sketch after this entry.)

    Read the article
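
    A commonly suggested dd-wrt approach (an assumption, not taken from the linked answer) is to redirect the LAN's web traffic to the remote proxy with a DNAT rule in the router's firewall script. This only covers plain HTTP, and it only works if the remote Squid runs in intercept/transparent mode; an ordinary forward proxy will reject redirected requests. A sketch, with 203.0.113.10 standing in for the proxy's address and br0 as the LAN bridge:

      # dd-wrt firewall script: push LAN port-80 traffic to the remote proxy
      iptables -t nat -A PREROUTING -i br0 -p tcp --dport 80 \
        -j DNAT --to-destination 203.0.113.10:443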

  • DansGuardian/Squid Traffic doesn't get back to user

    - by DKNUCKLES
    I've purchased a Squid appliance that I'm attempting to implement, but the lack of documentation has left me a bit high and dry. Forgive me if this is a silly question; this is my first attempt at implementing Squid. From what I can ascertain, users connect to DansGuardian first on port 8080, where the filtering is done, and it then forwards the request to Squid on port 3128, which sends the traffic out to the Internet. The setup I have is as follows:

      Gateway (MikroTik router): 192.168.88.1
      Squid/DansGuardian:        192.168.88.100
      Client:                    192.168.88.238

      Client --- Gateway --- Proxy --- Internet

    I have set up a simple NAT rule to forward all traffic from the client machine (for testing purposes) to the DansGuardian box. The traffic seems to get there, but netstat -antp on the virtual appliance shows a lot of connections stuck in SYN_RECV, so I gather the traffic is NOT being routed back to the client machine:

      Proto Recv-Q Send-Q Local Address          Foreign Address        State     PID/Program name
      tcp        0      0 0.0.0.0:8080           0.0.0.0:*              LISTEN    -
      tcp        0      0 192.168.88.100:8080    192.168.88.238:55786   SYN_RECV  -
      tcp        0      0 192.168.88.100:8080    192.168.88.238:55787   SYN_RECV  -
      tcp        0      0 192.168.88.100:8080    192.168.88.238:55785   SYN_RECV  -
      tcp        0      0 192.168.88.100:8080    192.168.88.238:55788   SYN_RECV  -
      tcp        0      0 0.0.0.0:10000          0.0.0.0:*              LISTEN    -
      tcp        0      0 0.0.0.0:22             0.0.0.0:*              LISTEN    -

    Is this a routing issue or an issue with the Squid appliance? (See the sketch after this entry.)

    Read the article
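
    The SYN_RECV pattern above is the classic signature of asymmetric return traffic: the redirected packets reach DansGuardian, but its SYN-ACKs go back to the client directly (same subnet) without passing through the router that rewrote the destination, so the client ignores them. A hedged RouterOS sketch that masquerades the redirected traffic so replies return via the gateway (addresses taken from the question):

      /ip firewall nat add chain=dstnat src-address=192.168.88.238 protocol=tcp \
          dst-port=80 action=dst-nat to-addresses=192.168.88.100 to-ports=8080
      /ip firewall nat add chain=srcnat dst-address=192.168.88.100 protocol=tcp \
          dst-port=8080 action=masquerade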

  • Squid/Kerberos authentication with only Linux

    - by user28362
    Hi, I would like to know if it is possible to let a Windows XP machine authenticate to Squid (on Linux) using Kerberos without needing an Active Directory domain. I only want to create a Kerberos ticket on the client side, which should give the client access to Squid (using IE). I have only found tutorials about configuring A.D. with Squid, not an environment with only Linux servers. Thanks.

    Update: The Kerberos setup is done correctly; the proxy and the client can both get tickets. In the browser (FF/IE), I get:

      ERROR Cache Access Denied
      While trying to retrieve the URL: http://www.google.com/
      The following error was encountered:
      * Cache Access Denied.
      Sorry, you are not currently allowed to request: http://www.google.com/ from this cache until you have authenticated yourself.

    In the Kerberos helper log, I get:

      squid_kerb_auth: Got 'YR ElRNTVMTUABBAABAB4IIogAAAAAAAAAAAAAAAAAAAAAFASgDAAAADw==' from squid (length: 59).
      squid_kerb_auth: parseNegTokenInit failed with rc=101
      squid_kerb_auth: received type 1 NTLM token

    This last message is strange, as I didn't configure NTLM. It looks like the browser is using the wrong authentication method. (See the sketch after this entry.)

    Read the article
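
    Not the accepted answer, but the log lines above are what you typically see when the browser falls back from Kerberos to NTLM, usually because it has no usable ticket for the proxy's service principal or because it addresses the proxy by IP instead of the Kerberos hostname. A minimal negotiate setup with the squid_kerb_auth helper might look like this; the helper path and principal are examples, not values from the question:

      # squid.conf: Negotiate/Kerberos authentication
      auth_param negotiate program /usr/lib/squid/squid_kerb_auth -s HTTP/proxy.example.lan@EXAMPLE.LAN
      auth_param negotiate children 10
      auth_param negotiate keep_alive on

      acl kerb_auth proxy_auth REQUIRED
      http_access allow kerb_auth

    Clients must reach the proxy by the FQDN that appears in the service principal, not by its IP address, for the browser to request a Kerberos ticket at all.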

  • squid configuration change to accept http request on LAN

    - by Ratan Kumar
    I installed Squid + DansGuardian to block adult content on my Linux machine (Ubuntu 12.10). Everything worked fine and content is blocked as expected. The problem is that I am also running an Apache server for my LAN (a kind of internal website), but when accessing it via its LAN IP (192.168.0.16), Squid blocks the connection. This is the exact error:

      The following error was encountered while trying to retrieve the URL: http://192.168.0.16/
      Connection to 192.168.0.16 failed.
      The system returned: (113) No route to host
      The remote host or network may be down. Please try the request again.
      Your cache administrator is webmaster.

    Before configuring Squid it was working fine. What changes do I have to make in squid.conf? I tried "acl Safe_ports 80" and "allow_all Safe_ports". I want to know how I can configure it to accept HTTP requests from the LAN again. (See the sketch after this entry.)

    Read the article
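
    The "(113) No route to host" above is generated on the proxy box itself, which points at routing or firewall trouble on that host rather than a Squid ACL decision; still, a common companion change is to let LAN destinations bypass the cache so internal sites are fetched directly. A hedged squid.conf sketch, assuming the LAN is 192.168.0.0/24:

      # treat the local network as a direct, uncached destination
      acl lan_dst dst 192.168.0.0/24
      always_direct allow lan_dst
      cache deny lan_dst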

  • How to cache streaming video and silverlight with squid windows reverse proxy

    - by V. Romanov
    We have an intranet web server running a Silverlight application (ACTUS media monitor, if anyone cares to know). The server is used to record video and stream it to clients through a CDN solution. We want to put a reverse proxy between the server and the CDN provider in order to remove the office network bottleneck that's currently strangling us. I've set up Squid for Windows on a separate machine outside the network using the BasicAccelerator configuration setting. It seems to work as far as the reverse proxy is concerned: requests are forwarded and the application is working, but it doesn't seem to cache anything (no space is used on the drive where Squid is installed). I found no explicit setting to turn caching on in Squid, so I assume it's on by default. Perhaps I need some other trick to make the video and/or Silverlight content cacheable? Any help will be appreciated, and any info you need to help me will be provided at once. Thanks in advance! (See the sketch after this entry.)

    Read the article
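
    Two things commonly stop a Squid reverse proxy from caching media: the default maximum_object_size is far smaller than a video file, and many streaming/CDN responses carry headers that forbid caching. A hedged squid.conf sketch; the sizes, cache path and pattern are illustrative only, not taken from the thread:

      # allow large objects into the cache
      maximum_object_size 1 GB
      cache_dir ufs c:/squid/var/cache 20000 16 256

      # cache typical media/Silverlight objects even when not revalidated
      refresh_pattern -i \.(wmv|mp4|flv|xap)$ 1440 90% 43200

    Checking the response headers (Cache-Control: no-cache/private, Vary) on the media URLs is worth doing before tuning further.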

  • Squid and Webmin - the 'all' acl

    - by Genboy
    In older versions of Squid, you had to define an acl named 'all' yourself:

      acl all src 0.0.0.0/0.0.0.0

    You then use it in rules such as "http_access allow all", "http_access deny all", etc. In Squid 3.0 and above, the 'all' ACL is built in, and you cannot (and need not) define it. However, the Webmin Squid module doesn't seem to know this: when you try to add a rule using all, it doesn't show 'all' in its list of ACLs. How does one get around this? I am using Webmin 1.530 on Debian Lenny; the Squid version is 3.0.STABLE19-1~bpo50+1.

    Read the article

  • Logs being flooded from Squid for having intercepted and authentication enabled together

    - by Horace
    I have done some hefty Googling and I can't seem to find a single solution to the issue I am currently experiencing. Here is a sample configuration from my Squid:

      #
      # DIGEST Auth
      #
      auth_param digest program /usr/sbin/digest_file_auth /etc/squid/digpass
      auth_param digest children 8
      auth_param digest realm LHPROJECTS.LAN Network Proxy
      auth_param digest nonce_garbage_interval 10 minutes
      auth_param digest nonce_max_duration 45 minutes
      auth_param digest nonce_max_count 100
      auth_param digest nonce_strictness on
      # Squid normally listens to port 3128
      http_port 192.168.10.2:3128 transparent
      https_port 192.168.10.2:3128 intercept
      http_port 192.168.10.2:3130

    As noted above, I have three ports defined: two of them are transparent/intercept and one is a regular http port (which I use for authentication). This works rather well in this configuration, but my logs are getting flooded with the entry "authentication not applicable on intercepted requests" whenever a transparent connection is made. So far, I can't find any documentation that describes how to suppress these messages. (See the sketch after this entry.)

    Read the article
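
    Not from the linked thread: the message is logged because an authentication ACL is being evaluated for requests arriving on the intercept ports, where proxy authentication can never work. Rather than suppressing the log, one option is to scope the auth requirement to the explicit port with a myport ACL; the port numbers below match the question's config:

      # evaluated only for connections arriving on the explicit proxy port
      acl explicit_port myport 3130
      acl auth_users proxy_auth REQUIRED

      http_access allow explicit_port auth_users
      # intercepted traffic is handled by source-based rules instead
      http_access allow localnet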

  • Enabling Squid delay pool eats up the entire memory

    - by Supratik
    I am using "squid-3.1.8-1.el5" in my CentOS 5 32 bit system. In normal condition Squid uses 85m - 90m, but when I enable the delay pool parameters the memory usage suddenly rise up 2GB. The memory keeps on increasing until the system is out of resource. The following are my delay pool settings: delay_pools 1 delay_class 1 1 delay_access 1 allow all delay_parameters 1 192000/192000 Is there anything I am missing here or is it a bug with Squid ?

    Read the article

  • Squid 2.7.6 not honoring ACL rules

    - by peppery
    Hello there, I have a /24 block of IP addresses assigned to a single Ubuntu server on which I have been attempting to install Squid. All of the IP addresses are set up correctly (as aliases of eth0) and work as they should; using cURL I can specify an interface and the traffic goes out on the correct address. I would like Squid to take the IP address the request arrived on and proxy the request out on that same IP (e.g. incoming on 123.123.123.1:3128 goes out on 123.123.123.1, .2 on .2, etc.), and I have set up these ACL rules in /etc/squid.conf:

      acl ip1 myip x.x.x.1
      tcp_outgoing_address x.x.x.1 ip1
      acl ip2 myip x.x.x.2
      tcp_outgoing_address x.x.x.2 ip2
      acl ip3 myip x.x.x.3
      tcp_outgoing_address x.x.x.3 ip3

    and so on, as this seems to be the only way to do what I want (from research). However, after much frustration, Squid seems to be ignoring these rules and sending requests out on the default interface. Does anybody have any suggestions? Thanks.

    Read the article

  • PF, load balanced gateways, and Squid

    - by Santa
    Hi, so I have a FreeBSD router running PF and Squid, and it has three network interfaces: two connected to upstream providers (em0 and em1 respectively) and one for the LAN (re0) that we serve. There is some load balancing configured with PF: basically, it routes all traffic to ports 1-1024 through one interface (em0) and everything else through the other (em1). Now, I also have a Squid proxy running on the box that transparently redirects any HTTP request from the LAN to port 3128 on 127.0.0.1. Since Squid re-issues the request to the outside world, it should follow the load balancing rule through em0, no? The problem is that when we tested it (by browsing from a computer on the LAN to http://whatismyip.com), it reports the external IP of the em1 interface! When we turn Squid off, the external IP of em0 is reported, as expected. How do I make Squid behave with the load balancing rule that we have set up? Here are the related settings in /etc/pf.conf:

      ext_if1="em1"  # DSL
      ext_if2="em0"  # T1
      int_if="re0"
      ext_gw1="x.x.x.1"
      ext_gw2="y.y.y.1"
      int_addr="10.0.0.1"
      int_net="10.0.0.0/16"
      dsl_ports = "1024:65535"
      t1_ports = "1:1023"
      ...
      squid=3128
      rdr on $int_if inet proto tcp from $int_net \
          to any port 80 -> 127.0.0.1 port $squid
      pass in quick on $int_if route-to lo0 inet proto tcp \
          from $int_net to 127.0.0.1 port $squid keep state
      ...
      # load balancing
      pass in on $int_if route-to ($ext_if1 $ext_gw1) \
          proto tcp from $int_net to any port $dsl_ports keep state
      pass in on $int_if route-to ($ext_if1 $ext_gw1) \
          proto udp from $int_net to any port $dsl_ports
      pass in on $int_if route-to ($ext_if2 $ext_gw2) \
          proto tcp from $int_net to any port $t1_ports keep state
      pass in on $int_if route-to ($ext_if2 $ext_gw2) \
          proto udp from $int_net to any port $t1_ports

    Thanks! (See the sketch after this entry.)

    Read the article
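
    A guess rather than a confirmed answer: once Squid intercepts a request, the outbound connection originates on the firewall itself, so it is never evaluated by the "pass in on $int_if" route-to rules and simply follows the default route out em1. One way to pull Squid's traffic back onto the policy is to give it a source address on the T1 side and route that address out em0 with a pass out rule, roughly:

      # squid.conf: source outbound requests from em0's (T1) address
      # (y.y.y.100 is a placeholder for the real em0 address)
      tcp_outgoing_address y.y.y.100

      # pf.conf: anything leaving with em0's source address goes out em0
      pass out on $ext_if1 route-to ($ext_if2 $ext_gw2) \
          proto tcp from ($ext_if2) to any keep state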

  • Ubuntu 9.10 and Squid 2.7 Transparent Proxy TCP_DENIED

    - by user298814
    Hi, we've spent the last two days trying to get Squid 2.7 to work with Ubuntu 9.10. The computer running Ubuntu has two network interfaces, eth0 and eth1, with DHCP running on eth1. Both interfaces have static IPs; eth0 is connected to the Internet and eth1 is connected to our LAN. We have followed literally dozens of different tutorials with no success. The tutorial here was the last one we did that actually got us some sort of results: http://www.basicconfig.com/linuxnetwork/setup_ubuntu_squid_proxy_server_beginner_guide. When we try to access a site like seriouswheels.com from the LAN we get the following message on the client machine:

      ERROR: The requested URL could not be retrieved

      Invalid Request error was encountered while trying to process the request:

      GET / HTTP/1.1
      Host: www.seriouswheels.com
      Connection: keep-alive
      User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.307.11 Safari/532.9
      Cache-Control: max-age=0
      Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,/;q=0.5
      Accept-Encoding: gzip,deflate,sdch
      Cookie: __utmz=88947353.1269218405.1.1.utmccn=(direct)|utmcsr=(direct)|utmcmd=(none); __qca=P0-1052556952-1269218405250; __utma=88947353.1027590811.1269218405.1269218405.1269218405.1; __qseg=Q_D
      Accept-Language: en-US,en;q=0.8
      Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3

      Some possible problems are:
      Missing or unknown request method.
      Missing URL.
      Missing HTTP Identifier (HTTP/1.0).
      Request is too large.
      Content-Length missing for POST or PUT requests.
      Illegal character in hostname; underscores are not allowed.

      Your cache administrator is webmaster.

    Below are all the configuration files: /etc/squid/squid.conf, /etc/network/if-up.d/00-firewall, /etc/network/interfaces, /var/log/squid/access.log. Something somewhere is wrong but we cannot figure out where. Our end goal for all of this is to superimpose content onto every page that a client requests on the LAN. We've been told that Squid is the way to do this, but at this point in the game we are just trying to get Squid set up correctly as our proxy. Thanks in advance. (See the sketch after this entry.)

    squid.conf:

      acl all src all
      acl manager proto cache_object
      acl localhost src 127.0.0.1/32
      acl to_localhost dst 127.0.0.0/8
      acl localnet src 192.168.0.0/24
      acl SSL_ports port 443         # https
      acl SSL_ports port 563         # snews
      acl SSL_ports port 873         # rsync
      acl Safe_ports port 80         # http
      acl Safe_ports port 21         # ftp
      acl Safe_ports port 443        # https
      acl Safe_ports port 70         # gopher
      acl Safe_ports port 210        # wais
      acl Safe_ports port 1025-65535 # unregistered ports
      acl Safe_ports port 280        # http-mgmt
      acl Safe_ports port 488        # gss-http
      acl Safe_ports port 591        # filemaker
      acl Safe_ports port 777        # multiling http
      acl Safe_ports port 631        # cups
      acl Safe_ports port 873        # rsync
      acl Safe_ports port 901        # SWAT
      acl purge method PURGE
      acl CONNECT method CONNECT
      http_access allow manager localhost
      http_access deny manager
      http_access allow purge localhost
      http_access deny purge
      http_access deny !Safe_ports
      http_access deny CONNECT !SSL_ports
      http_access allow localhost
      http_access allow localnet
      http_access deny all
      icp_access allow localnet
      icp_access deny all
      http_port 3128
      hierarchy_stoplist cgi-bin ?
      cache_dir ufs /var/spool/squid/cache1 1000 16 256
      access_log /var/log/squid/access.log squid
      refresh_pattern ^ftp: 1440 20% 10080
      refresh_pattern ^gopher: 1440 0% 1440
      refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
      refresh_pattern (Release|Package(.gz)*)$ 0 20% 2880
      refresh_pattern . 0 20% 4320
      acl shoutcast rep_header X-HTTP09-First-Line ^ICY.[0-9]
      upgrade_http0.9 deny shoutcast
      acl apache rep_header Server ^Apache
      broken_vary_encoding allow apache
      extension_methods REPORT MERGE MKACTIVITY CHECKOUT
      cache_mgr webmaster
      cache_effective_user proxy
      cache_effective_group proxy
      hosts_file /etc/hosts
      coredump_dir /var/spool/squid

    access.log:

      1269243042.740 0 192.168.1.11 TCP_DENIED/400 2576 GET NONE:// - NONE/- text/html

    00-firewall:

      iptables -F
      iptables -t nat -F
      iptables -t mangle -F
      iptables -X
      echo 1 | tee /proc/sys/net/ipv4/ip_forward
      iptables -t nat -A POSTROUTING -j MASQUERADE
      iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 3128

    /etc/network/interfaces:

      auto lo
      iface lo inet loopback
      auto eth0
      iface eth0 inet static
          address 142.104.109.179
          netmask 255.255.224.0
          gateway 142.104.127.254
      auto eth1
      iface eth1 inet static
          address 192.168.1.100
          netmask 255.255.255.0

    Read the article
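
    Not a confirmed diagnosis, but the symptoms above (a well-formed "GET / HTTP/1.1" with a Host header rejected as an Invalid Request, and TCP_DENIED/400 with a NONE:// URL in access.log) are what Squid 2.7 produces when intercepted traffic arrives on a port that has not been declared transparent. The iptables script REDIRECTs port 80 to 3128, so the matching one-line change to try in squid.conf is:

      # tell Squid 2.7 this port receives REDIRECTed (intercepted) traffic
      http_port 3128 transparent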

  • Squid 2.7 Stable 8 on Windows 2008

    - by Sadish
    Hi all, I have a Windows 2008 SP2 Active Directory domain with Vista, Windows 2000, XP and Windows 7 clients as members. I installed Squid 2.7 Stable 8 on Windows 2008 SP2 and am trying to configure NTLM-based authentication for Internet surfing. Basically I have defined two domain groups, one allowed Internet access and one denied, and access is granted based on membership. But after trying for over 3 weeks, it seems that the authentication does not happen: the browser keeps asking for a user name and password. I would like to know if there is any solution for this; I'm totally frustrated and unable to move forward. My configuration, as modified from the default squid.conf:

      Line 292:
      auth_param ntlm program c:/squid/libexec/mswin_ntlm_auth.exe
      auth_param ntlm children 5

      Line 626:
      acl localnet proxy_auth REQUIRED src 10.0.0.1/255
      acl InetAllow external win_domain_group InternetUsers
      acl InetDeny external win_domain_group InternetDenyGroup
      http_access allow InetAllow
      http_access deny InetDeny
      (any "acl localnet src" lines are commented out)

      Line 294:
      external_acl_type win_domain_group ttl=120 %LOGIN c:/squid/libexec/mswin_check_lm_group.exe -G

    My Windows 2008 server is running on 192.168.0.203 and the clients are on the 10.0.0.x subnet, for which I need authentication. Please help! (See the sketch after this entry.)

    Read the article
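
    One problem visible in the configuration above (an observation, not the accepted answer): "acl localnet proxy_auth REQUIRED src 10.0.0.1/255" mixes two ACL types and an invalid netmask on a single line, so the authentication ACL is probably never valid. In standard squid.conf syntax each ACL has exactly one type, roughly:

      # separate the source-network ACL from the authentication ACL
      acl localnet src 10.0.0.0/24
      acl AuthUsers proxy_auth REQUIRED

      # require a login from LAN clients, then apply the group-based ACLs
      http_access deny !localnet
      http_access deny !AuthUsers
      http_access deny InetDeny
      http_access allow InetAllow
      http_access deny all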

  • Squid proxy in cent os often disconnected with error : tunnelConnectTimeout(): tunnelState->servers is NULL

    - by Ela
    I am having very frequent Internet disconnection problems with the Squid proxy service. My server config:

      OS: CentOS release 6.3 (Final)
      model name: Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz
      cpu MHz: 1600.000
      Local systems' IP range: 192.168.2.x
      Server IP: 192.168.2.11

    This server is also configured with LAMP for development, a Samba (SMB) file service, and no SVN currently, and it is connected to 14 local Windows machines, basically serving as a central development node. I suspect the Squid proxy, since that is where connections stop, and when I restart the server the Internet starts working again, so something seems wrong with the Squid service only. I can resolve the problem by fully restarting the server, or sometimes by just restarting the Squid proxy, which is totally killing our development. I have attached my cache log file for the error info; a sample:

      2013/07/01 13:25:38| tunnelConnectTimeout(): tunnelState->servers is NULL
      2013/07/01 13:25:41| tunnelConnectTimeout(): tunnelState->servers is NULL
      2013/07/01 13:25:41| tunnelConnectTimeout(): tunnelState->servers is NULL
      2013/07/01 13:25:50| clientProcessRequest: Invalid Request
      2013/07/01 13:26:05| tunnelConnectTimeout(): tunnelState->servers is NULL

    Some help would make our lives easier. Thanks in advance.

    Read the article

  • Squid SSL transparent proxy - SSL_connect:error in SSLv2/v3 read server hello A

    - by larryzhao
    I am trying to set up an SSL proxy for one of my internal servers to visit https://www.googleapis.com using Squid, so that my Rails application on that server can reach googleapis.com via the proxy. I am new to this, so my approach is to set up an SSL transparent proxy with Squid. I built Squid 3.3 on Ubuntu 12.04, generated an SSL key and certificate pair, and configured Squid like this, leaving almost all other configuration at the defaults:

      http_port 443 transparent cert=/home/larry/ssl/server.csr key=/home/larry/ssl/server.key

    The permissions on the directory that holds the key/cert are:

      drwxrwxr-x 2 proxy proxy 4096 Oct 17 15:45 ssl

    Back on my dev laptop, I put "<proxy-server-ip> www.googleapis.com" in /etc/hosts to make the calls go to my proxy server. But when I try it from my Rails application, I get:

      SSL_connect returned=1 errno=0 state=SSLv2/v3 read server hello A: unknown protocol

    I also tried openssl from the CLI:

      openssl s_client -state -nbio -connect www.googleapis.com:443 2>&1 | grep "^SSL"
      SSL_connect:before/connect initialization
      SSL_connect:SSLv2/v3 write client hello A
      SSL_connect:error in SSLv2/v3 read server hello A
      SSL_connect:error in SSLv2/v3 read server hello A

    Where did I go wrong? (See the sketch after this entry.)

    Read the article
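
    A guess at the root cause (not from the linked thread): "unknown protocol" means the client expected a TLS handshake but got plain HTTP back, which is what happens when intercepted HTTPS lands on an http_port. In Squid 3.3, intercepting TLS requires an https_port with ssl-bump, and cert= expects a certificate (.crt/.pem), not a .csr. A rough sketch, with the paths adapted from the question:

      # squid.conf (Squid 3.3): intercept HTTPS and bump it
      https_port 3129 intercept ssl-bump \
          cert=/home/larry/ssl/server.crt key=/home/larry/ssl/server.key \
          generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
      ssl_bump server-first all

    Dynamic certificate generation also needs the ssl_crtd helper configured in builds that include it, and the client has to trust the signing certificate.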

  • maximum number of connections Squid

    - by Isaac
    I have a Squid proxy server that controls all Internet traffic for my network. I need a way to stop users from downloading big files (say 50 MB) on my network. I banned some well-known ports (e.g. torrent), but some downloads are still possible over the HTTP port, and obviously I cannot ban port 80! A simple solution is limiting the maximum number of simultaneous connections for each IP (e.g. 3 connections). That is possible in Squid with this config:

      acl ACCOUNTSDEPT 192.168.5.0/24
      acl limitusercon maxconn 3
      http_access deny ACCOUNTSDEPT limitusercon

    But this solution has a really bad impact on web browsing, because any smart browser fetches different parts of a website over several simultaneous connections to speed things up. With a cap on simultaneous connections, the browser will fail to get some parts and the website will be shown partially, with some parts/images/frames missing. So, can we limit the maximum number of persistent connections instead? I think this policy would work: specify a maximum number of connections that stay alive for more than 10 seconds, but leave the number of simultaneous connections per IP unlimited. How can we implement this policy in Squid, and with which config?

    UPDATE: artifex and Tom Newton suggested a bandwidth-limiting approach to fight downloaders. But bandwidth limiting in Squid has a shortcoming: it is static and cannot change dynamically, so a person gets a limited bandwidth no matter how many people are using the Internet (maybe nobody!). Also, this solution does not stop people from downloading; they can still download, just at a lower speed. But if we find a way to terminate persistent connections (or any connection that stays alive longer than a specific time), downloading big files will become almost impossible (there is always some way!). (See the sketch after this entry.)

    Read the article
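
    Not an answer from the thread, but if the real goal is "no downloads over 50 MB", Squid can enforce that directly with reply_body_max_size, which cuts off response bodies larger than the limit no matter how many connections the client opens. A sketch in Squid 3.x syntax (2.x uses "reply_body_max_size bytes allow|deny acl" instead):

      # refuse to deliver response bodies larger than 50 MB
      reply_body_max_size 50 MB all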

  • Setting up squid proxy server to in turn connect using another proxy server [closed]

    - by AnkurVj
    My institute uses a Squid proxy server whose authentication mechanism requires a username and password to be entered. This means I can log in on only one machine at a time, and Internet access for me is restricted to that machine, but I sometimes need Internet access on multiple machines simultaneously. What previously worked for me was the following: on one of my own machines, A, I set up a Squid proxy server that allowed all local machines without any username and password, and I configured the rest of my machines to use machine A as their proxy server. On machine A I logged into the institute proxy server using my browser. This way I could access the Internet from all my machines by effectively channeling my requests through server A. Recently I lost that machine and its configuration, and now I have tried to set it up again in the same manner. However, I can't seem to remember exactly how I made it work; I keep getting Connection Refused (111) on the other machines. My guess is that my Squid server isn't able to forward requests from the other machines to the actual institute proxy. I could use any help in debugging this problem. I don't want to use alternatives such as SSH tunneling; this solution has worked for me in the past, I just don't remember how to set it up the same way again. (See the sketch after this entry.)

    Read the article
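
    For chaining one Squid through another, the usual building blocks are a cache_peer line pointing at the upstream proxy (with login= credentials if it requires them) plus never_direct so every request is forwarded. A hedged sketch for machine A's squid.conf; the upstream hostname and credentials are placeholders:

      # forward everything to the institute proxy
      cache_peer proxy.institute.example parent 3128 0 no-query default login=myuser:mypassword
      never_direct allow all

      # accept clients from the local network without authentication
      acl localnet src 192.168.0.0/16
      http_access allow localnet
      http_access deny all

    If the other machines get "Connection refused", also check that machine A's http_port is bound to a LAN-reachable address rather than 127.0.0.1 only.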

  • Routing and authenticating all access through squid

    - by Knight Samar
    Hi, I want to route all Internet access in my network through a Squid proxy server and authenticate and log all users. I want this to be a client-independent setting, so that no one needs to change anything in their browsers or on their machines. I have set the network gateway as the proxy server (using DHCP options) so that all traffic is sent to it. I tried using Squid as a transparent proxy, but then it won't authenticate in that mode. I tried using iptables to route all traffic to port 3128, but it won't pop up the authentication dialog box from Squid. I then tried telling DHCP to hand out a WPAD URL to all clients, by placing a WPAD file on a web server, for automatic proxy configuration on the clients. Changes in dhcpd.conf:

      option wpad code 252 =test;
      option wpad "\n\000";
      option wpad "http://192.168.1.5/wpad.dat\n";

    The WPAD file:

      function FindProxyForURL(url, host) {
          return "PROXY squid-server-ip-address:3128 ; DIRECT ";
      }

    But the browsers (different versions of Firefox and IE) seem to ignore it. :( What should I do? (See the sketch after this entry.)

    Read the article
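
    A few observations, offered as assumptions rather than the accepted answer: in the dhcpd.conf above the option type is written "=test" (it should be "= text"), and defining "option wpad" three times means only the last definition takes effect. IE also tends to look for WPAD via DNS (a host named wpad.yourdomain serving /wpad.dat) and only honours DHCP option 252 when "Automatically detect settings" is enabled. A commonly used dhcpd.conf form looks roughly like this:

      # dhcpd.conf: advertise the proxy auto-config URL (option 252)
      option wpad-url code 252 = text;
      option wpad-url "http://192.168.1.5/wpad.dat";

    The wpad.dat file should be served with the MIME type application/x-ns-proxy-autoconfig, and a DNS record for wpad.yourdomain pointing at 192.168.1.5 covers clients that ignore the DHCP option.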

  • Accessing Squid Proxy over internet

    - by user37074
    Hi, I recently finished installing Squid on a VPS I have in the US, and it is working fine locally (I verified this by setting the http_proxy variable and using lynx). I want to access this proxy over the Internet (as an anonymizer) so that I can see how some ads show up for US traffic on my website. I have set up authentication, so abuse is not a problem. However, I am not able to access the proxy over the Internet, even though I have set the following rule in squid.conf:

      http_access allow all

    Is what I want not possible, or am I missing something? Port 3128 is open in the firewall, so that is not the issue, and Squid is listening on 0.0.0.0. Thanks. (See the sketch after this entry.)

    Read the article
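
    A hedged checklist rather than the accepted answer: rule order matters in squid.conf (an "http_access allow all" placed after the default "http_access deny all" never fires), and with authentication enabled the allow rule must reference the auth ACL so Squid sends the 407 challenge to remote clients. A minimal sketch; the helper path and password file are examples:

      # basic auth against a local password file
      auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
      acl password proxy_auth REQUIRED

      http_port 3128
      # this must appear before any broad deny rules
      http_access allow password
      http_access deny all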

  • Using Squid on Debian, Cannot Connect Error

    - by Zed Said
    I am trying to set up Squid on Debian and am getting a connection refused error:

      squidclient http://www.apple.com/ > test
      client: ERROR: Cannot connect to 127.0.0.1:3128: Connection refused

    Here is my config:

      visible_hostname none
      cache_effective_user proxy
      cache_effective_group proxy
      cache_dir ufs /var/spool/squid 2048 16 256
      cache_mem 512 MB
      cache_access_log /var/log/squid/access.log
      emulate_httpd_log on
      strip_query_terms off
      read_ahead_gap 128 Kb
      collapsed_forwarding on
      refresh_stale_hit 30 seconds
      retry_on_error on
      maximum_object_size_in_memory 1 MB
      acl all src 0.0.0.0/0.0.0.0
      acl purgehosts src 127.0.0.1/255.255.255.255
      # Caching static objects in __data is important.
      # Without that, apache processes sit around spooling static objects.
      acl QUERY urlpath_regex /cgi-bin/ /_edit /_admin /_login /_nocache /_recache /__lib /__fudge
      acl PURGE method PURGE
      acl POST method POST
      cache deny QUERY
      cache deny POST
      http_access allow PURGE purgehosts
      http_access deny PURGE
      http_access allow all
      http_port 127.0.0.1:80
      http_port 50.56.206.139:80
      cache_peer 127.0.0.1 parent 80 0 originserver no-query no-digest default
      redirect_rewrites_host_header off
      read_ahead_gap 128 Kb
      shutdown_lifetime 5 seconds

    Any ideas why this is happening? What have I missed? (See the sketch after this entry.)

    Read the article
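
    An observation offered as a guess: the config above only listens on port 80 (127.0.0.1:80 and 50.56.206.139:80), while squidclient connects to 127.0.0.1:3128 by default, hence the refusal. Either point squidclient at the port Squid actually uses, or add a 3128 listener:

      # test against the port the config really listens on
      squidclient -p 80 http://www.apple.com/

      # ...or keep squidclient's default by also listening on 3128 (squid.conf)
      # http_port 127.0.0.1:3128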

  • Web Fonts in Firefox behind Squid

    - by sinping
    It appears that Firefox requests web fonts (specifically Google's) in a different manner than other browsers; they fail to load behind Squid as a result, and I'm trying to isolate why. Using this page as a reference in various browsers, Firefox noticeably displays it differently. In the Squid logs, you see hits like this from Firefox:

      TCP_DENIED/407 3292 GET http://themes.googleusercontent.com/font? - NONE/- text/html

    If you load the same page with IE, you see hits like this:

      TCP_DENIED/407 4389 GET http://themes.googleusercontent.com/static/fonts/federant/v1/C109bUmZeyhh-vIXq9lNfvesZW2xOQ-xsNqO47m55DA.woff - NONE/- text/html

    So why the difference, and how can we make it play nice with Firefox? This is running Squid 3.1.15, but it has also been tested with a few other 3.1.x versions. (See the sketch after this entry.)

    Read the article
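
    A possible explanation, offered as an assumption rather than a confirmed fix: Firefox fetches @font-face resources as anonymous cross-origin requests and does not attach proxy credentials to them, so they die on the 407 challenge where IE's requests eventually succeed. If that is the cause, whitelisting the font hosts ahead of the authentication rules lets the fonts through:

      # allow the Google font hosts without proxy authentication
      acl google_fonts dstdomain themes.googleusercontent.com fonts.googleapis.com fonts.gstatic.com
      http_access allow google_fonts
      # ...existing authenticated http_access rules follow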

  • Squid Proxy Antivirus - Recommendations / Performance

    - by Jon Rhoades
    Due to our users' increasing expertise at downloading viruses and the like, we are investigating adding antivirus scanning to our Squid proxy. A casual Google search reveals several free options and one paid: HAVP, squid-vscan, Viralator, and Safe Squid (commercial). None of the free ones have reached v1 yet, nor do any inspire huge confidence from their websites (although of course I would rather they spent their time on the app than on the website!). Does anybody have experience with any of these (or any other similar apps)? If so, are they suitable for a production network with around 400 concurrent users, and what sort of CPU/RAM requirements do they have?

    Read the article
