Search Results

Search found 10967 results on 439 pages for 'django sites'.

  • Apache: multiple domains handling

    - by cache
    So I use the following scheme to handle multiple sites on my Apache:

        <VirtualHost 192.168.1.100:80>
            # get the server name from the Host: header
            UseCanonicalName Off
            VirtualDocumentRoot /var/www/%0/docs
            VirtualScriptAlias /var/www/%0/cgi-bin
        </VirtualHost>

    Therefore, if a client goes to www.example.com, it will actually point to /var/www/www.example.com/docs/, which is good. However, what if the client goes to example.com? It will point to /var/www/example.com/docs, which is not what we want. So my question is: is there a better scheme for this? Or what should I do to fix the issue? Thanks!
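
    One possible fix - a minimal sketch rather than a confirmed solution, assuming mod_rewrite is loaded - is to canonicalize bare domains to their www. form before the document root is resolved:

        <VirtualHost 192.168.1.100:80>
            UseCanonicalName Off
            # Redirect bare domains (example.com) to www.example.com, so only
            # the /var/www/www.*/docs trees are ever served
            RewriteEngine On
            RewriteCond %{HTTP_HOST} !^www\. [NC]
            RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
            VirtualDocumentRoot /var/www/%0/docs
            VirtualScriptAlias /var/www/%0/cgi-bin
        </VirtualHost>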

  • How to prefer ipv6 over ipv4 only for specific websites?

    - by kria
    I only have IPv6 connectivity via an HE tunnel on my router, so normally I want to prefer IPv4 over IPv6. For some websites, however, I would like to prefer IPv6. Right now I have just set DisabledComponents to 0x20 and hard-coded the IPv6 resolution into my hosts file for the sites I want to access over IPv6. Since these IP addresses change at times, this is not a good solution. Any ideas on how to handle this in a non-clunky way? Some kind of Chrome/Firefox add-on might do the trick, but I couldn't find one for this purpose.
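
    One avenue worth exploring - purely a sketch, and the prefix below is a placeholder - is the Windows prefix policy table, which chooses between IPv4 and IPv6 per destination prefix, so raising the precedence of a specific site's IPv6 prefix should make IPv6 win for those destinations only:

        rem Show the current policy table (run in an elevated prompt)
        netsh interface ipv6 show prefixpolicies
        rem Prefer IPv6 for destinations inside one hypothetical /32
        netsh interface ipv6 add prefixpolicy prefix=2001:db8::/32 precedence=60 label=14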

  • Trying to build a history of popular laptop models

    - by John
    A requirement on a software project is that it should run on typical business laptops up to X years old. However, while I can normally find out when a given model number was sold, I can't find data to do the reverse: for a given year, I want to see which model numbers were released or discontinued. We're talking big-name, popular models like Dell Latitude/Precision/Vostro, ThinkPads, HP, etc. The data for any one model is out there, but getting a timeline is proving hard. Sites like Dell's are (unsurprisingly) geared around current products, and even Wikipedia isn't proving very reliable. You'd think this data must have been collated by manufacturers or enthusiasts, surely?

  • How to report abuse to website hosting company (GoDaddy) [closed]

    - by lgratian
    I'm not sure if this is the right place to ask such a question... Let's say that a website posted a picture of me, without my consent, and I want it to be removed (it's something private and could compromise my career if it's seen by someone who shouldn't see it). I sent them an email asking nicely that they remove it, but they didn't respond and the picture is still there. Using 'Whois' I found that the website is hosted by GoDaddy. Is there a way (an email address, for example) to report to GoDaddy that one of the sites they're hosting is doing something illegal and to force them to remove the photo? I searched their site and found nothing about such a thing. Thanks in advance!

  • Is it possible to open several web pages when the browser is started?

    - by gotqn
    I am searching for a browser option or plugin (ideally available in WebKit browsers, Opera or Firefox - not IE) that opens several web pages when the browser is first started. For example, let's say that I have a settings file in which I have listed the following websites: Google + Gmail, StackOverflow.com, SuperUser.com, dba.stackexchange.com, LinkedIn, etc. When I first start Chrome, all these sites would be opened in new tabs, and because the browser has saved my passwords I would be logged in. I would find this very helpful because it would save me time, and I would not miss anything when I turn my computer back on (for example, forgetting to check my mail).

  • Understanding IUSR_<machine> account

    - by liho1eye
    Namely, how is setting read/write permission for this account different from granting read/write access in IIS (Windows 2003, so it should be IIS 6 if I am not mistaken)? Here is the issue: it looks like we had a security sweep, and as part of that the IUSR account lost write access everywhere. A whole bunch of legacy ASP sites didn't like that at all... My very surface-level understanding is that it is enough to deny write access in the IIS console to protect a website from someone just dropping random files into it, and that IUSR access only affects the application scripts running server side, and thus can safely be given write access back. edit: The applications in question obviously require write access to their own web folders, otherwise this wouldn't be an issue at all. The question is how to configure IIS/the application to both satisfy security and make them work. My first instinct was to change the account used to run the app pool. However, that is already set to NETWORK_SERVICE, and that guy already has full access to the folders in question.
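
    If the legacy apps genuinely need to write to their own folders, a narrowly scoped NTFS grant is safer than restoring write access everywhere. A sketch using cacls - the path and machine name are hypothetical (on 2003 SP2 and later, icacls works too):

        rem Grant the anonymous account Change (write) on one upload folder only
        cacls "D:\Inetpub\legacyapp\uploads" /E /G IUSR_WEBSRV01:C

    The split the asker suspects is broadly right: the IIS console's Write permission governs HTTP write verbs such as PUT, while NTFS ACLs govern what the worker process and anonymous account can do on disk.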

  • Logins with only HTTP - are they as insecure as I'm thinking?

    - by JoeCool1986
    Recently I was thinking about how websites like gmail and amazon use HTTPS during the login process when accessing your account. This makes sense, obviously, since you're typing in your account username and password and you would want that to be secure. However, on Facebook, among countless other websites, their logins are done with simple HTTP. Doesn't that mean that my login name and password are completely unencrypted? Which, even worse, means that all those people who login to their facebooks (or similar sites) at a wifi hotspot in public are susceptible to anyone getting their credentials using a simple packet sniffer (or something similar)? Is it really that easy? Or am I misunderstanding internet security? I'm a software engineer working on some web related stuff, and although at the current time I'm not too involved with the security aspect of our software, I knew I should probably know the answer to this question, since it's extremely fundamental to website security. Thanks!

  • Varnish with multiple hosts/subdomains

    - by jerhinesmith
    I'm new to Varnish, and I'm hoping it already does this "out of the box", but I'd like to clarify before I consider using it in production. Here's my setup: I have multiple sites running off the same machine that vary by subdomain (i.e. user1.example.com, user2.example.com, etc.). Each "site" has a profile picture with the same name (i.e. user1.example.com/profile.png, user2.example.com/profile.png). Will Varnish recognize these as separate resources and cache them accordingly? Or will I need to change something in the VCL to tell it to include the full host URL when looking up cache hits?
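
    Varnish's default vcl_hash does include the Host header, so identical paths on different subdomains are cached as distinct objects. Making that explicit is harmless - a sketch in VCL 3.x syntax, mirroring the built-in default:

        sub vcl_hash {
            hash_data(req.url);
            if (req.http.host) {
                # key the cache on the Host header, keeping subdomains separate
                hash_data(req.http.host);
            } else {
                hash_data(server.ip);
            }
            return (hash);
        }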

  • Where is this error message coming from?

    - by jordanpg
    Recently switched to a new ISP after a move, running Chrome under OSX 10.7. I see the following error when visiting various sites -- no particular pattern -- from time to time. This is the entire message. It is the only thing that appears in my web browser. The problem fixes itself in a few minutes. Probably a lookup error of some sort, but I don't recognize it. What piece of software is serving this message? What is happening? What is this Reference # referencing? Invalid URL The requested URL "/articles/6517181", is invalid. Reference #9.6f200f6c.235618518a.b7e910cf

  • Learn as much as possible about the setup behind a website.

    - by carrier
    Let's say I'm in the process of planning the setup of a website. I study similar sites that offer similar services or might see a similar traffic pattern. Is there a way to determine something about their setup, software and/or hardware? Some things are obvious: if I see .php or .jsp then I already know a bit. But any ideas on how to decipher more? Maybe where the site is hosted, the hardware, the platforms...
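
    A few low-effort probes - just a sketch, with example.com standing in for the target site - that often reveal the server software, framework and hosting company:

        # Response headers frequently leak server and framework names
        curl -sI http://example.com | egrep -i 'server|x-powered-by|via'
        # Whoever announces the IP address usually identifies the host
        whois "$(dig +short example.com | head -n1)"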

  • Cannot access internal network on OSX 10.6.6

    - by cabuki
    Last week, I began having trouble connecting to our internal web servers. Usually a refresh, or switching to a different wireless network, would take care of it, but as of yesterday that wasn't enough. We have an internal DNS server running dnsmasq and a private internal host name (us.lcl). Once I started having more issues with names not resolving, I tried pinging the server. Using the internal host name (s1.us.lcl), it failed. I tried using the IP address, but that also failed. I have no problems accessing external sites, except that it's a bit slower than normal. A reboot yesterday at lunch time, after following the instructions here, seemed to fix the issue, but when I came into the office this morning it had stopped working. As of this posting, I cannot ping, ssh or access the web server using the internal host name or IP address. I'm the only one running 10.6 in my office, and none of my colleagues has this issue.
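
    A diagnostic sketch - the dnsmasq server's IP below is hypothetical - to separate resolver problems from routing problems:

        # Which resolvers OS X is actually using (can differ from /etc/resolv.conf)
        scutil --dns | grep nameserver
        # Ask the internal dnsmasq box directly
        dig s1.us.lcl @192.168.1.5
        # If DNS answers but this fails, the problem is routing, not DNS
        ping -c 3 192.168.1.5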

  • Migration of physical servers to a virtual solution - what do I have to do?

    - by bibarse
    Hello. I'm new to this forum, so please forgive my inexperience and my limited English. I've been a trainee at a company for one month, and my mission is to migrate 3 physical servers to a virtualization solution. The company makes e-learning software, so there is a lot of data such as videos, Flash and compressed (zip) files. Here is some inventory of the servers: OS: Debian and 2 Red Hat; Apache, PHP/MySQL, Sendmail/Dovecot, Webmin with the Virtualmin template to create the web sites dynamically, because there is no sysadmin... The future provider will be responsible for securing, updating and creating the virtual machines (outsourcing), on Red Hat OSes. So I would like your help to choose a virtualization technology (I prefer KVM or Red Hat RHEV; VMware is expensive), to evaluate the hardware needs (allowing for 4 or 5 years of growth), and to draw up a good plan so nothing is forgotten. Thank you for your responses.

  • How to create a Service Connection Point for Exchange (Manually)

    - by Ionoxx
    I'm being cautious here: before I remove anything, I want to be able to put it back. I'm having issues with a domain-joined computer that is using SCP to get Exchange autodiscovery information. It's getting information for the now-unused internal Exchange server through SCP, even though the profile is using Office 365 on another domain. According to this conversation, I can simply remove the object from Active Directory Sites and Services. I want to know how to add it back in should this create more problems, or if we reinstate the Exchange server. Right-clicking on the parent "autodiscover" node doesn't allow me to create a Service Connection Point. Will simply running the cmdlet "Set-ClientAccessServer -Identity servername -AutoDiscoverServiceInternalUri url" be enough to recreate the object? Thank you!
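
    Recording the SCP's current value before deleting it gives an easy way back - a sketch from the Exchange Management Shell, with the server name and URL as placeholders:

        # Record the current SCP URL so it can be restored later
        Get-ClientAccessServer -Identity EXCH01 | Format-List Name,AutoDiscoverServiceInternalUri
        # Recreate or repoint the SCP if the server is reinstated
        Set-ClientAccessServer -Identity EXCH01 -AutoDiscoverServiceInternalUri "https://mail.example.com/Autodiscover/Autodiscover.xml"

    Setting the URI to $null is also commonly cited as the supported way to remove the SCP, avoiding hand edits in Sites and Services.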

  • Web-Server directory permissions

    - by MLS
    Hello all, I would like some help understanding web-server directory permissions (Apache, CentOS, PHP, MySQL). For example, I have multiple sites in /var/www/html, in paths like /var/www/html/www_domainname_com. Inside each site I might have a path like /lib/mysql/ holding things like PHP connection code, database config, etc. What should my permissions be so that someone cannot just browse to that directory? Should I just .htaccess them? I have apache:apache as the owner of all my web directories. Can I prevent someone from crawling certain directories of my web server? I have a robots.txt, but what is to say a crawler obeys it? So to sum up: 1. What is the best owner/permission set for my sensitive files that the web server, PHP or MySQL needs, but that I don't want people browsing to? 2. Can I prevent outright crawling of portions of the site?
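
    A sketch of the simplest Apache-side lockdown (2.2-era syntax, using the example path above):

        # /var/www/html/www_domainname_com/lib/.htaccess
        Order allow,deny
        Deny from all

    Better still, keep credentials outside the document root entirely, so no web-server rule is needed. robots.txt is only a polite request; a deny rule like the one above is what actually stops a crawler from fetching those paths.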

  • Question about server usage, big community platform [closed]

    - by Json
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Web Sites

    I'm working on a community platform written in PHP and MySQL, and I have some questions about server usage; maybe someone can help me out. The community is based on jQuery, with many AJAX requests to update content. It makes 5-10 AJAX (JSON, GET, POST) requests every 5 seconds; the requests fetch user data like notifications and messages via MySQL queries. I wonder how a server will handle this when there are more than 5,000 users online: that would be 50,000 requests every 5 seconds. What kind of server do you need to handle this? Or maybe even more - with 15,000 users online, 150,000 requests every 5 seconds. My web server has the following specs: quad-core Xeon, 2048 MB RAM, 5000 GB traffic. Will it be good enough, and for how many users? Can anyone help me out, or point me to where to find such information and how to make the calculation?
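
    A sketch of the raw arithmetic behind those numbers (sustained averages, ignoring peaks):

        5,000 users  x 10 requests / 5 s = 10,000 requests/s
        15,000 users x 10 requests / 5 s = 30,000 requests/s

    At rates like these the per-request MySQL queries, not raw hardware, are usually the first bottleneck, which is why the capacity-planning question linked above starts from measured load tests rather than specs.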

  • Cannot log on to guest account in Windows 7

    - by Javy
    I'm using Windows 7 Home edition. When I try to create any guest account, it fails to load at login with the error: "The User Profile Service failed the logon. User profile cannot be loaded." I can log in as admin and as my home user with no problems; every guest account that I create fails. I found this in a Microsoft support article: this error may occur if the "Do not logon users with temporary profiles" Group Policy setting is configured. I've tried to find the Group Policy setting and cannot locate it anywhere; some sites indicate I need to upgrade Windows to access it. Is there a way to use guest accounts without upgrading?
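
    For what it's worth, a more common cause of that exact error than the Group Policy setting is a stale profile entry in the registry - a hedged sketch of where to look (the query itself changes nothing):

        rem List profile entries; a SID subkey ending in .bak indicates a broken profile
        reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList"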

  • OOM killer kills MySQL and Apache when there is still a lot[?] of memory

    - by Flyer
    Let me first say that I'm pretty new to *nix systems and even newer to server management. Anyway, I've got a little problem. I have a VPS with 1 GB of memory; the system is Debian 6. I have a few sites running on it, though any real load can only come from one of them. Recently, the OOM killer started killing MySQL, causing WordPress and phpBB to give errors that they can't connect to the MySQL server. The error itself is bad enough, especially when it happens at night and the site stays unavailable until I wake up and restart MySQL. I probably have a bad line in my cron which could be the cause of it all (again, I'm new to this):

        */20 * * * * sync; echo 3 > /proc/sys/vm/drop_caches

    Well, if you need any information, let me know, since I don't really know which information would be useful here. Also, I'd like to know whether it's a bad idea to have the above cron task.
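
    A sketch of how to confirm what the OOM killer saw (log path as on Debian 6):

        # The OOM killer logs its per-process accounting in the kernel log
        grep -iE 'oom|killed process' /var/log/kern.log | tail -n 20
        # How much memory is really free once buffers/cache are discounted
        free -m

    As for the cron line: dropping the page cache only discards reclaimable memory the kernel would free under pressure anyway, so the job hurts performance without preventing (or, by itself, causing) the OOM kills.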

  • Remote connection issue with SQL Server 2005 from Management Studio and services but not IIS

    - by Mallioch
    Here is the situation: I have a Server 2008 box that is trying to connect to a SQL Server 2005 instance. Connections from websites running in the context of IIS work fine to the SQL Server machine using SQL Server authentication. Rockin'. However, using the same connection string, I cannot get a Windows service on the same box to communicate with the SQL Server, nor can I get Management Studio to connect from the same box. IIS: great; other options: not so much. For grins I tried monkeying with the user accounts in the IIS app pools to match that of the service, to get the sites to break, and that hasn't worked, so it doesn't appear to be a user account issue. Since this is happening with two different programs but not with IIS, I'm assuming something is shut down on the SQL Server that needs to allow non-IIS things to connect, but I have no idea what that would be. Any help would be appreciated.
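
    A diagnostic sketch, with the hostname and login as placeholders: forcing the protocol in the connection shows whether TCP or named pipes is the one being refused from that box:

        rem Is the default TCP port reachable at all from this machine?
        telnet SQLHOST 1433
        rem Force TCP with the same SQL login the sites use
        sqlcmd -S tcp:SQLHOST,1433 -U webLogin -P secret -Q "SELECT @@SERVERNAME"
        rem Compare against named pipes
        sqlcmd -S np:\\SQLHOST\pipe\sql\query -U webLogin -P secret -Q "SELECT @@SERVERNAME"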

  • website lookup extremely slow in ubuntu

    - by ubuntulover
    Hi, I have a wireless broadband connection through a router and wireless modem. Everything works fine in Windows. However, in Ubuntu on the same machine, websites seem to take longer to start loading; I think the DNS lookup is slow. HTTPS sites may be slower still, as I just can't log in to Gmail. I am also using a Mercurial repo with a remote origin, and it takes forever (like 5 minutes) to push one small change - I think because it has to communicate over HTTPS multiple times. Should I change my DNS server? I don't have these problems on my work network (they have another DNS server). This happens with the IPv4 settings on automatic (DHCP). When I change it to automatic (DHCP) addresses only and add Google's 8.8.8.8 to the DNS servers, it still takes forever. Why is this happening?
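
    A quick sketch to test the slow-DNS theory, comparing the default resolver (usually the router) against Google's:

        # Query time via the resolver obtained from DHCP
        dig www.google.com | grep 'Query time'
        # Same lookup straight to 8.8.8.8
        dig @8.8.8.8 www.google.com | grep 'Query time'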

  • Synchronizing agent and master Nagios on state change alone

    - by punith
    We have a setup where distributed Nagios instances running at multiple sites report their data back to the main Nagios server. The problem is that they send data back to the main server whether or not there has been a state change in the host or service. Is it possible to configure a slave Nagios to check a service/host every 5 seconds but send data back only when there is a state change? Currently this is implemented with Obsess Over Hosts/Services, which always runs the reporting command. The Nagios version is 3. I am not an administrator but a developer, so I don't know the exact jargon - please bear with me.
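
    The obsession hooks (OCSP/OCHP) fire on every check by design. One hedged alternative - a sketch only, and submit_state_change_via_nsca is a hypothetical command that would wrap send_nsca - is to turn obsessing off and use event handlers, which Nagios 3 runs only when a host or service changes state:

        # nagios.cfg on the distributed node: stop reporting every result
        obsess_over_services=0
        obsess_over_hosts=0

        # service template: report only on state changes
        define service {
            name                   remote-service
            register               0
            event_handler_enabled  1
            event_handler          submit_state_change_via_nsca
        }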

  • Upgraded to Mountain Lion, now 127.0.0.1 is not resolving

    - by Shanimal
    I used to be able to type 127.0.0.1 (or my network IP 10.10.53.32) and it would resolve to my "default" virtual host. 127.0.0.1/~Shanimal and shanimal.dev both resolve to their appropriate folders. localhost and 127.0.0.1 give me a 404 - "Not Found. The requested URL / was not found on this server." Basically, my "It works!" screen no longer works.

        /private/etc/apache2/Shanimal.conf:

        <Directory "/Users/Shanimal/Sites/_www">
            Options Indexes Multiviews
            AllowOverride AuthConfig Limit
            Order allow,deny
            Allow from all
        </Directory>

        hosts:

        127.0.0.1 localhost
        127.0.0.1 shanimal.dev
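
    The Mountain Lion upgrade is known to rewrite /etc/apache2/httpd.conf, which can drop the default-host block entirely. A sketch of the stock configuration to check for (Apple's shipped default, quoted from memory, so treat it as approximate):

        DocumentRoot "/Library/WebServer/Documents"
        <Directory "/Library/WebServer/Documents">
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    It is also worth confirming that the Include line pulling in /private/etc/apache2/other/*.conf (where per-user and custom confs live) survived the upgrade, and running apachectl configtest after any edit.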

  • Windows NLB + IIS - Stops serving pages

    - by Ye Ol Developer
    We are currently running Windows NLB and IIS 7 load-balanced across two servers. What happens is that, randomly and sporadically, the servers stop serving web pages. We have noticed that if we run the sites on a dedicated IP on either of the servers, these issues do not exist; as soon as we switch back to the load-balanced IP, everything goes awry. When the servers stop serving pages, we can still TS into the server and surf the sites internally without issues, or switch to the dedicated IP. However, the internal network cannot even access the files via the load-balanced IP. We are running out of ideas here. Has anyone had a similar problem?

  • What is the most ethically or morally questionable sysadmin task you have been given?

    - by Alex Angas
    In the recent past I was asked to set up a reporting facility for upper management so they can spy on what web sites users are visiting. This was done without any notice given to users. Unfortunately, I have a good friend with some rather unusual tastes who I knew would be caught! He also knew I set up the reporting... To me, the lack of user notification was unethical. What similar experiences have you had that haven't "felt right" and left you questioning what to do? How did you deal with it?

  • Error installing SQL Server 2005 Express on AMD Turion?

    - by Angel
    Hello. Is there some compatibility problem between SQL Server 2005 Express and AMD processors? I read something along those lines on various sites, but found nothing 100% definitive. I am trying to install SQL Server 2005 Express on a computer that runs Windows Vista Home Premium, 32-bit edition, and the CPU is an "AMD Turion Dual-Core RM-70". I tried both the 32-bit and 64-bit installers, just in case, and the process runs fine until the moment the service attempts to start, where it fails with the same error every time. I'm sorry I can't provide more details or the exact message; it was on a client computer to which I have no access at the moment. I hope someone can shed some light on this. Thank you.
