Search Results

Search found 37688 results on 1508 pages for 'site search'.


  • Apache2 Virtual Host broken; displayed default index.html on subdomain, but correct content on www.subdomain

    - by Robert K
    I've got a Linode configured as an Ubuntu 10.04.2 web server with Apache 2.2.14. I have a total of 4 sites, all defined under /etc/apache2/sites-available as virtual hosts. All sites are nearly identical clones in configuration, and all sites but my last work successfully.

    default:
        (www.)exampleadnetwork.com
        (www.)example.com
        reseller.example.com
    trouble:
        client1.example.com

    I keep getting this page when I visit the client1.example.com site:

        It works! This is the default web page for this server.
        The web server software is running but no content has been added, yet.

    In my ports.conf file I have NameVirtualHost correctly set to my IP address on port 80. If I access the "www.sub.example.com" alias the site works! If I access it without the www I see the "It works" page quoted above. Even apache2ctl -S shows that my vhost file parses correctly and is added to the mix. My vhost configuration file is as follows:

        <VirtualHost 127.0.0.1:80>
            ServerAdmin [email protected]
            ServerName client1.example.com
            ServerAlias client1.example.com www.client1.example.com
            DocumentRoot /srv/www/client1.example.com/public_html/
            ErrorLog /srv/www/client1.example.com/logs/error.log
            CustomLog /srv/www/client1.example.com/logs/access.log combined
            <Directory /srv/www/client1.example.com/public_html/>
                Options -Indexes FollowSymLinks
                AllowOverride None
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>

    The other sites are variations of:

        <VirtualHost 127.0.0.1:80>
            ServerAdmin [email protected]
            ServerName example.com
            ServerAlias example.com www.example.com
            DocumentRoot /srv/www/example.com/public_html/
            ErrorLog /srv/www/example.com/logs/error.log
            CustomLog /srv/www/example.com/logs/access.log combined
        </VirtualHost>

    The only site that differs is the other subdomain:

        <VirtualHost 127.0.0.1:80>
            ServerAdmin [email protected]
            ServerName reseller.example.com
            ServerAlias reseller.example.com
            DocumentRoot /srv/www/reseller.example.com/public_html/
            ErrorLog /srv/www/reseller.example.com/logs/error.log
            CustomLog /srv/www/reseller.example.com/logs/access.log combined
        </VirtualHost>

    Filenames are the FQDN without the www. prefix. I've followed this advice, but still cannot access the subdomain properly.
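    One detail worth checking (an observation based on the config shown, not an accepted answer): ports.conf declares NameVirtualHost on the public IP, while every <VirtualHost> block binds to 127.0.0.1:80. When those addresses do not match, Apache 2.2 cannot attach the vhosts to the name-based set and serves the default site instead. A hedged sketch of the matching form:

        # ports.conf
        NameVirtualHost *:80
        Listen 80

        # sites-available/client1.example.com, bound to the same address as NameVirtualHost
        <VirtualHost *:80>
            ServerName client1.example.com
            ServerAlias www.client1.example.com
            DocumentRoot /srv/www/client1.example.com/public_html/
        </VirtualHost>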

    Read the article

  • SharePoint MOSS - Serve HTTP content on an HTTPS page without Mixed Content Warning?

    - by kcb263
    Our "portal-like" SharePoint site is served using HTTPS/SSL. So a user goes to https://web.company.com and sees content and different Web Parts. So far, no problem. The desire now is to have new Web Parts added that either frame HTTP content (such as Weather Bug) or HTTP RSS feeds. The issue that arises is that by doing this, results in a "Mixed Content" warning in the browser. Has anybody successfully been able to implement such a scenario, or one similar to it? The options we have looked at, unsuccessfully, have been: using Apache Reverse Proxy Server mirror an external site Custom Web Parts

    Read the article

  • Rewrite for robots.txt and favicon.ico [closed]

    - by BHare
    I have set up some rules whereby subdomains (my users) will default to where I have located the robots.txt, favicon.ico, and crossdomain.xml. So if a user creates a site, say testing.mywebsite.com, and they don't make their own favicon.ico at testing.mywebsite.com/favicon.ico, it will use the favicon.ico I have in /misc/favicon.ico. This works perfectly, but it doesn't work for the main website. If you attempt to go to mywebsite.com/favicon.ico it will check whether "/" exists, which it does, and then never redirects to /misc/favicon.ico. How can I get it so both instances redirect to /misc/favicon.ico?

        # If crossdomain (openpalace file), favorite icon, or robots.txt doesn't exist on their
        # side, then redirect to the site's, just to have something to go on.
        RewriteCond %{REQUEST_URI} crossdomain.xml$
        RewriteCond ^(.+)crossdomain.xml !-f
        RewriteRule ^(.*)$ /misc/crossdomain.xml [L]

        RewriteCond %{REQUEST_URI} favicon.ico$
        RewriteCond ^(.+)favicon.ico !-f
        RewriteRule ^(.*)$ /misc/favicon.ico [L]

        RewriteCond %{REQUEST_URI} robots.txt$
        RewriteCond ^(.+)robots.txt !-f
        RewriteRule ^(.*)$ /misc/robots.txt [L]
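    One hedged way to make both the subdomains and the main site fall through to /misc (an assumption about intent, since no accepted answer is shown): test %{REQUEST_FILENAME}, the file the request actually maps to, rather than using the pattern string itself as the -f operand:

        # Serve the shared copies whenever the requested file does not exist on disk
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule (?:^|/)favicon\.ico$     /misc/favicon.ico     [L]

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule (?:^|/)robots\.txt$      /misc/robots.txt      [L]

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule (?:^|/)crossdomain\.xml$ /misc/crossdomain.xml [L]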

    Read the article

  • icacls, Network Service, and setting ACLs on Windows Server 2008

    - by Ted
    Setting ACLs on Windows Server 2008 via the command line is giving me some problems. As per http://web2.minasi.com/forum/topic.asp?TOPIC%5FID=26907 I've tried all sorts of variations:

        C:\Windows\system32>icacls "D:\Websites\site.com\Web\bin\*" /grant 'NT Authority\NETWORK SERVICE: (OI) (CI)M'

        C:\Windows\system32>icacls "D:\Websites\site.com\Web\bin\*" /grant "NETWORK SERVICE": (OI) (CI)M

    and all variations in between. However, each try leads to an error such as "Invalid parameter "'NETWORK'"", depending on the variation above. As per http://technet.microsoft.com/en-us/library/cc753525%28WS.10%29.aspx (see the comments), it appears that others have experienced the same issue, where the same command works on Windows 7/Vista/etc. but not on Windows Server 2008. What's the best way to apply permissions for the Network Service account on a directory and/or files via the command line in Windows Server 2008? Especially as there's no way to do multiple file permissions at once via the GUI (see http://serverfault.com/questions/30991/windows-server-2008-change-security-settings-for-multiple-files-at-once).
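    A hedged sketch of the quoting that cmd.exe on Server 2008 normally accepts: double quotes around the account name only, no single quotes, and no spaces inside the permission specification (the path is the one from the question):

        icacls "D:\Websites\site.com\Web\bin" /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)M /T

    The /T switch applies the grant recursively to existing files and subfolders, which also covers the "multiple files at once" requirement.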

    Read the article

  • Server error on large posts

    - by Shirko
    On a large Drupal site, the database server is on a separate machine connected directly to the web server. The web server uses Apache and memcached. The problem is that whenever a post is large, say larger than 10 KB, the server does not respond correctly. I checked both the Apache and MySQL logs but could not find any trace of errors being logged. The error also happens when I use nginx/php5-fpm instead of Apache. Despite this, the large posts are saved, although incorrectly, so that they show up for the admin when I open a new page on the site. I'm really confused and would appreciate your hints to pinpoint the possible sources of this chronic problem.
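    No error trace is given, so this is only an assumption: failures that begin at a specific payload size are usually a size limit somewhere along the PHP/MySQL/proxy path. A sketch of the settings commonly checked in this kind of setup (the directive names are real, the values are placeholders to compare against your own):

        ; php.ini
        post_max_size = 32M
        max_input_vars = 10000

        # my.cnf (relevant when the database is on a separate server)
        max_allowed_packet = 64M

        # nginx (relevant for the nginx/php5-fpm variant)
        client_max_body_size 32m;
        fastcgi_buffers 16 16k;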

    Read the article

  • Restricting permissions to individual documents on SharePoint

    - by wahle509
    Here's what I'm trying to do: I would like to create a list of documents on a site in my company's SharePoint site. Each document should have specific user permissions for viewing and editing it. For example: the list is for performance reports. John has his out there, called "John_PR_09.docx". Only he and his supervisor should have permission to view, edit, or do anything to it. Another employee has hers out there with permissions for only her and her supervisor, and so on. I have tested this out with a document that I removed the groups and users from (since they inherit permissions from the parent) and gave only my own user account permissions to. I then asked someone else to try to open it, and she could; she even wrote "TEST" on the document and saved it. What am I doing wrong? I thought I had stopped it from inheriting permissions from its parent and given only myself rights to edit it.
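    The post doesn't say how inheritance was broken, so purely as an illustration of the intended end state: done in code with the server object model, securing one document for a single user looks roughly like this (the URL, list, and account names are placeholders):

        // using Microsoft.SharePoint;
        using (SPSite site = new SPSite("https://portal.company.com"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists["Performance Reports"];
            SPListItem item = list.Items[0];              // the document to lock down

            // Stop inheriting from the library; false = do not copy the parent's assignments
            item.BreakRoleInheritance(false);

            SPUser user = web.EnsureUser(@"DOMAIN\jsmith");
            SPRoleAssignment assignment = new SPRoleAssignment(user);
            assignment.RoleDefinitionBindings.Add(web.RoleDefinitions["Contribute"]);
            item.RoleAssignments.Add(assignment);
        }

    If the same steps in the UI still let another user edit the file, it is worth re-checking the item's own permissions page for group memberships (for example a site members group she belongs to) that were copied or re-added.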

    Read the article

  • Are you supposed to type '6' with the left hand or the right hand?

    - by Joey Adams
    A few weeks ago, I did a Google Images search for keyboard finger charts to see which fingers I'm supposed to be using to type which keys. According to the charts, '6' is supposed to be typed with the right hand: (as shown on en.wikipedia.org/wiki/Typing) However, today I spotted a split keyboard in a store with the '6' on the left side of the split. Indeed, an image search for split keyboards indicates that this is the norm: (as shown on en.wikipedia.org/wiki/Microsoft_Natural_keyboard) When doing touch typing "correctly", should I go with the finger charts (type 6 with my right hand), or should I go with the split keyboards (type 6 with my left hand)? <troll> Is this just another example of Microsoft not following the standards? </troll>

    Read the article

  • Ways to do simple failover with one server and two IPs

    - by CrassHoppr
    The setup is one server (Windows 2008) at one location with two incoming connections. As the server has to interface with various on-site devices, and will have a small number of incoming connections, a data center is not an option; instead, cable/DSL connections must be used. The goal is that users visit https://service.site.com and are sent to the primary IP address, or to a secondary IP if the primary is down. I've seen advice to use round-robin DNS for this, but caching of an IP for a downed interface is something I'd like to avoid. Is something like this possible with these constraints?
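    For reference, plain round-robin DNS here is just two A records with a short TTL; the zone fragment below is an illustration (addresses are documentation placeholders). How quickly clients abandon a dead address is exactly the caching problem mentioned above, which is why managed DNS failover services that health-check the primary and publish only the live record are often used instead:

        ; zone file fragment
        $TTL 60
        service.site.com.   60   IN   A   203.0.113.10    ; primary (cable)
        service.site.com.   60   IN   A   198.51.100.20   ; secondary (DSL)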

    Read the article

  • How to print all users from a Windows group to a text file?

    - by Tim
    Hello, I'm trying to print all users of a group "Students" to a text file "Students.txt". I'm not in a domain, so this does not work:

        net group "Students" >> students.txt

    because I get the following:

        This command can be used only on a Windows Domain Controller.

    Thank you in advance. If anybody is interested in a VB.NET solution, I've programmed a WinForms solution with a multiline TextBox to copy/paste the members (anyway, thanks for your help):

        Imports System.DirectoryServices  ' first add a reference to it from the .NET tab

        Private Sub PrintGroupMember_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
            Dim students As List(Of DirectoryEntry) = MembersOfGroup("Students")
            For Each user As DirectoryEntry In students
                Me.TextBox1.Text &= user.Name & vbCrLf
            Next
        End Sub

        Public Function MembersOfGroup(ByVal GroupName As String) As List(Of DirectoryEntry)
            Dim members As New List(Of DirectoryEntry)
            Try
                Using search As New DirectoryEntry("WinNT://./" & GroupName & ",group")
                    For Each member As Object In DirectCast(search.Invoke("Members"), IEnumerable)
                        Dim memberEntry As New DirectoryEntry(member)
                        members.Add(memberEntry)
                    Next
                End Using
            Catch ex As Exception
                MessageBox.Show(ex.ToString)
            End Try
            Return members
        End Function
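    Since the original goal was a text file rather than a TextBox, here is a small variation reusing the MembersOfGroup function above (the output path is an assumption):

        Imports System.IO

        ' Write one member name per line to Students.txt
        Dim names As New List(Of String)
        For Each user As DirectoryEntry In MembersOfGroup("Students")
            names.Add(user.Name)
        Next
        File.WriteAllLines("C:\Temp\Students.txt", names.ToArray())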

    Read the article

  • Exim: Change sender address when sending mails out of local network

    - by Esa Varemo
    We have a working Exim setup at a site where users can send and receive mail. We are trying to set up a server to send some warnings and errors by email to an address that is outside the local network. The problem is: the program that sends the mails sends them using the username it runs under and the local hostname of the server. This causes the mails to have a sender of the form [email protected]. Exim sends these mails to the ISP's SMTP server, which rejects them as they have an illegal or unverifiable sender (the internal address). I'm thinking I should configure Exim to rewrite the sender when:

    - the sender's domain is on the local network
    - the receiver's domain is outside the local network

    I tried setting up some kind of rewriting in the Exim config, but did not manage to get it to work. I'd show what I have tried, but I ran out of time on the last visit to the site and had to revert to the original version, losing all the changes I tried.
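    A minimal sketch of the rewrite-rule approach (the local hostname and the replacement address are placeholders; the rule goes in the rewrite section of the Exim configuration):

        begin rewrite

        # Rewrite user@<local hostname> to a routable address on the envelope
        # sender (F) and the From:, Reply-To: and Sender: headers (f, r, s).
        *@monitor1.localdomain    [email protected]    Ffrs

    Because the internal hostname is never a valid sender outside the LAN anyway, rewriting it unconditionally is usually simpler than making the rule depend on the recipient's domain.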

    Read the article

  • Problem with internet connection

    - by vijay.shad
    Hi all, I am working on Windows Vista. I have two internet connections: one is a high-speed wifi connection, and the other is a mobile network connection. There is a very strange problem. When I connect to the wifi connection I am not able to surf the internet (actually, not all of the sites). I am able to search on Google, but when I click on any link in the search results it does not open. However, I am able to surf all the pages in the google.com domain and also all the pages in the stackoverflow domain. I am also not able to go to the page http://repo1.maven.org/maven2. When I am connected via my mobile network, I am able to surf any site. Can you please tell me what might be the problem with my settings?

    Read the article

  • Why is one specific file not showing up in my document library?

    - by Jay Bazuzi
    In my "Documents" library on Windows 7, one file is not showing up in Windows Explorer. When I look in C:\Users\%USERNAME%\Documents\blah\blah all 24 files appear. But when I look in Libraries > Documents > blah > blah only 23 show up. I made a copy of the file and the copy appears. Refresh doesn't help. The "Arrange by" setting defaults to "Name". When I change it to "Folder" the extra file appears, but changing it back to "Name" the file disappears again. How can I make the file appear in all views? Why would it disappear? EDIT: I deleted the Windows Search Index and things seem to be working again. I say it's a bug in the Search Service.

    Read the article

  • Is browser and bot whitelisting a practical approach?

    - by Sn3akyP3t3
    With blacklisting it takes plenty of time to monitor events to uncover undesirable behavior and then take corrective action. I would like to avoid that daily drudgery if possible. I'm thinking whitelisting would be the answer, but I'm unsure if that is a wise approach due to its deny-all, allow-only-a-few nature. My fear is that eventually someone out there will be blocked unintentionally. Even so, whitelisting would also block plenty of undesired traffic to pay-per-use items such as the Google Custom Search API, as well as preserve bandwidth and my sanity. I'm not running Apache, but the idea would be the same, I'm assuming. I would essentially be depending on the User-Agent identifier to determine who is allowed to visit. I've tried to account for accessibility, because some web browsers are geared more toward those with disabilities, although I'm not aware of any specific ones at the moment. The need not to depend on whitelisting alone to keep the site away from harm is fully understood. Other means to protect the site still need to be in place. I intend to have a honeypot, a checkbox CAPTCHA, use of OWASP ESAPI, and blacklisting of previously known bad IP addresses.

    Read the article

  • Will the new Twitter API 1.1 allow hashtag/tweet/trend queries without any authentication, i.e. for a client that does not use a user's account at all?

    - by P5music
    I see that, even when not logged in to Twitter with an account, if I google hashtags or Twitter accounts, Twitter shows them. I think it should also be possible to get those tweets programmatically, but I do not know for sure, so I am asking for confirmation here, especially for the future with the new Twitter API restrictions. I mean, will it be possible to get tweets from hashtags or accounts without logging in to a user account, and so without accessing the user's settings, subscriptions, etc. (because I do not need them), thus not having to respect any per-user token limit? I found these API 1.1 FAQs; do I have to be concerned?

        Will an application have to request user authorization just to make public API calls?
        When API v1.1 is released, user authorization (and access tokens) are required for all API 1.1 requests. In the weeks following release, some methods will require only application-based authentication for certain "userless" contexts.

        Will the Search API require authentication?
        The Search API is now part of the official REST API in version 1.1. In addition to serving results in a format consistent with other Tweet resources, usage will also require authentication.
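    For the "userless" context mentioned in the FAQ, API 1.1 ended up providing application-only authentication: the application exchanges its consumer key and secret for a bearer token and then reads public data with it, with rate limits counted per app rather than per user. A sketch (the key, secret, and token are placeholders):

        # 1. Obtain a bearer token
        curl -s -u "CONSUMER_KEY:CONSUMER_SECRET" \
             -d "grant_type=client_credentials" \
             https://api.twitter.com/oauth2/token

        # 2. Use it for public reads, e.g. the 1.1 Search API
        curl -s -H "Authorization: Bearer BEARER_TOKEN" \
             "https://api.twitter.com/1.1/search/tweets.json?q=%23examplehashtag"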

    Read the article

  • Homepage not showing on Google

    - by MIke Mayberry
    About six weeks ago my homepage (mayberrykayakingdotcodotuk) disappeared from the Google organic search for "kayaking pembrokeshire", despite having been number 2 within a few weeks of its launch last summer. My previous site (www.mikemayberrykayakingdotcodotuk) had been 2nd for about six years and has 301 redirects for all pages to the new site. Google Toolbar still rates the homepage as 3/10, and the domain is still showing in search results, just not the homepage. A little research suggests that this is most likely due to Google treating two pages as identical content (one with www and one without) since the changes to its algorithms around that time, and that the way to fix this is to add some code somewhere. This makes sense to me, as my print advertising doesn't include the www part of the address. I have cPanel access but limited knowledge of web coding, having picked things up as I've gone along and paid for designers etc. when needed. Would someone be able to let me know where I have to go to add the code, and what code I need to add, to redirect the crawlers to one page? Or is there another issue causing this? Thanks in advance.
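    The usual fix for the www/non-www split is a site-wide 301 redirect to whichever form you pick as canonical, plus setting the preferred domain in Google Webmaster Tools. A sketch for the .htaccess in the site root (this assumes Apache with mod_rewrite; example.co.uk stands in for the real domain, and www is chosen as canonical here):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]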

    Read the article

  • RewriteRule in htaccess in subdirectory

    - by Jay
    Windows server, running Apache. In my Apache conf, I have AllowOverride None for the root of a site and then a subdirectory set to AllowOverride All:

        <Directory />
            AllowOverride None
        </Directory>
        <Directory "/safe/">
            AllowOverride All
        </Directory>

    However, when I try to set up a rewrite rule in the subdirectory's .htaccess file, nothing happens; I just get a 404 page not found error. Example:

        RewriteEngine On
        RewriteRule (.*) /blah?test=$1 [R=302,NC,NE,L]

    Rewriting URLs works fine from the root via the Apache conf. I don't understand why the rule is ignored. I don't want to do the URL rewriting within the conf because in this case I may need to change the redirects constantly and don't want to reload the server every time a change is made. I also don't want to affect server performance by enabling .htaccess files site-wide, just in the subdirectory where I need it.
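    One thing that stands out (an observation, not a confirmed answer): <Directory> matches filesystem paths, so on a Windows server a block written as <Directory "/safe/"> probably does not correspond to the actual folder that the /safe URL maps to, which leaves AllowOverride None in force and the .htaccess ignored. A sketch of the shape that normally works (paths are placeholders):

        <Directory "C:/sites/mysite/htdocs">
            AllowOverride None
        </Directory>

        # Use the real on-disk path of the subdirectory
        <Directory "C:/sites/mysite/htdocs/safe">
            AllowOverride All
            Options +FollowSymLinks   # mod_rewrite in .htaccess requires FollowSymLinks (or SymLinksIfOwnerMatch)
        </Directory>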

    Read the article

  • Website is not clickable in Windows XP Internet Explorer 7

    - by c-sharp newbie
    I have installed Windows XP Virtual PC on Windows 7 to test a site that is having issues in IE7 on Windows XP. The website loads up, but you cannot click on hyperlinks; it's like the website has frozen. Is this a browser support issue or an OS issue? Can anyone shed any light on this? Are there any browser tools I can use to spot problems? Sorry if I have been too vague; there's not much else to say, really. Completely lost. Maybe this might help a little. Any guidance is appreciated. UPDATE: I think I know what the problem may be: it's the jQuery UI reference that is causing issues with the site. Has anyone else experienced similar problems? The jQuery library used was jquery-1.8.0.min.js.

    Read the article

  • SSL/https setup for herokuapp.com address rather than my actual domain

    - by new2ruby
    I have a subdomain of my site pointed to a Rails app at mysite.herokuapp.com. I bought a certificate from GoDaddy and seem to have it all set up correctly, so that when I go to http://mysite.herokuapp.com or http://dev.mysite.com it's redirected to https://mysite.herokuapp.com or https://dev.mysite.com. The problem is that when I visit dev.mysite.com, I get the error "Safari can't verify the identity of the website." But when I go to mysite.herokuapp.com, I don't get the error. I wanted this to be set up the other way around, so that dev.mysite.com did not cause the error. I'm not sure where I went wrong. I used dev.mysite.com when generating the key and when setting it up at godaddy.com. Any ideas where I should look? P.S. The old site is hosted at DreamHost and the DNS info is stored there as well, so I created a subdomain there of type CNAME which points to mysite.herokuapp.com.

    Read the article

  • What are some internet trends that you've noticed over the past ~10 years? [closed]

    - by Michael
    I'll give an example of one that I've noticed: the number of web sites that ask for your email address (GOOG ID, YAHOO! ID, etc.) has skyrocketed. I can come up with no legitimate reason for this other than (1) password reset [there are other ways to do this], or (2) to remind you that you have an account there, based upon the time of your last visit. Why does a web site need to know your email address (Google ID, etc.) if all you want to do is...

    - download a file (no legit reason whatsoever)
    - play a game (no legit reason whatsoever)
    - take an IQ test or search a database (no legit reason whatsoever)
    - watch a video or view a picture (no legit reason whatsoever)
    - read a forum (no legit reason whatsoever)
    - post on a forum (mildly legit reason: password reset)
    - newsletter (the only difference between a newsletter and a blog is that you're more likely to forget about the web site than you are to forget about your email address -- the majority of web sites do not send out newsletters, however, so this can't be the justification)
    - post twitter messages or other instant messaging (mildly legit reason: password reset)
    - buy something (mildly legit reasons: password reset + giving you a copy of a receipt that they can't delete, as receipts stored on their server can be deleted)

    On the other hand, I can think of plenty of very shady reasons for asking for this information:

    - so the NSA, CIA, FBI, etc. can very easily track what you do by reading your email or asking GOOG, etc. what sites you used your GOOG ID at
    - to use the password that you provide for your account in order to get into your email account (most people use the same password for all of their accounts), find all of your other accounts in your inbox, and then get into all of those accounts
    - to sell your email address to spammers

    These reasons, I believe, are why you are constantly asked to provide your email address. I can come up with no other explanations whatsoever. Question 1: Can anyone think of any legitimate or illegitimate reasons for asking for someone's email address? Question 2: What are some other interesting internet trends of the past ~10 years?

    Read the article

  • 550 operation not permitted using FTP

    - by monkey_boys
    I'm using FTP to manage some files on a site I run but keep seeing this (truncated) error log:

        Command: DELE calendarpermission.php
        Response: 550 calendarpermission.php: Operation not permitted
        [...]
        Command: DELE button_down.gif
        Response: 550 button_down.gif: Operation not permitted
        Command: CWD /domains/example.com/public_html/admincp
        Response: 250 CWD command successful
        Command: PWD
        Response: 257 "/domains/example.com/public_html/admincp" is the current directory
        Command: RMD control_examples
        Response: 550 control_examples: Operation not permitted
        Command: CWD /domains/example.com/public_html
        Response: 250 CWD command successful
        Command: PWD
        Response: 257 "/domains/example.com/public_html" is the current directory
        Command: RMD admincp
        Response: 550 admincp: Operation not permitted
        Status: Retrieving directory listing...
        Command: PASV
        Response: 227 Entering Passive Mode (122,155,5,50,138,244).
        Command: MLSD
        Response: 150 Opening ASCII mode data connection for MLSD
        Response: 226 Transfer complete
        Status: Directory listing successful
        Status: Set permissions of '/domains/example.com/public_html/admincp' to '777'
        Command: SITE CHMOD 777 admincp
        Response: 550 CHMOD 777 admincp: Operation not permitted

    What do I do to solve this?
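    No answer is included in the excerpt, so purely as an assumption: a blanket 550 "Operation not permitted" on DELE/RMD/CHMOD usually means the FTP account does not own those files (they are often created by the web server or another system user), and no client-side command will change that. A sketch of the server-side check and fix, assuming shell access or a host willing to run it for you (paths and names are placeholders):

        # See who owns the files
        ls -l /home/USERNAME/domains/example.com/public_html/admincp

        # If they belong to another user (e.g. the web server), hand them back to the FTP account
        chown -R ftpuser:ftpgroup /home/USERNAME/domains/example.com/public_html/admincp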

    Read the article

  • What is the best way to handle dynamic content?

    - by user1561753
    So we run a site where there are elements of the interface that could potentially be changed at any moment in the backend. Specifically, we run a web service where certain functions are loaded dynamically. However, there are times when we remove certain functions, and we want the experience to be as seamless for the user as possible. We've considered a few methods of solving this:

    1. Ping the server every few seconds. If the functions are outdated or no longer available, refresh the user's page. While this would work the best, I feel like having that much I/O can't be too good.
    2. When the user clicks a function, if it's outdated or no longer available, alert them in the response and refresh the page. This would also work fairly well.

    I guess I'm more wondering how web apps like Google Docs work, where you have content that has to be synced up across multiple users and that isn't more than a few seconds outdated. Sorry if this isn't the best place to ask this. I figured this was more of a site architecture question and that this might be the place to ask it rather than SO.
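    For what it's worth, apps in the Google Docs mould avoid blind page refreshes by polling (or long-polling, or pushing over a socket) a very small "version" endpoint and only reloading the heavy UI when that version changes. A browser-side sketch of the idea (the endpoint name and interval are assumptions):

        // Poll a lightweight endpoint that reports the current version of the function manifest.
        let knownVersion = null;

        async function checkForUpdates() {
          try {
            const res = await fetch('/api/functions-version', { cache: 'no-store' });
            const { version } = await res.json();
            if (knownVersion === null) {
              knownVersion = version;        // first run: just remember the current version
            } else if (version !== knownVersion) {
              window.location.reload();      // the available functions changed: refresh the page
            }
          } catch (e) {
            // transient network error: try again on the next interval
          }
        }

        setInterval(checkForUpdates, 15000); // every 15 seconds; tune to taste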

    Read the article

  • Building a List of All SharePoint Timer Jobs Programmatically in C#

    - by Damon Armstrong
    One of the most frustrating things about SharePoint is that the difficulty of figuring something out is inversely proportional to the simplicity of what you are trying to accomplish. Case in point: yesterday I wanted to get a list of all the timer jobs in SharePoint. Having never done this, nor having any idea of exactly how to do it off the top of my head, I inquired to Google. I like to think my Google-fu is fair to good, so I normally find exactly what I'm looking for in the first hit. But on the topic of listing all SharePoint timer jobs, all it came up with was a PowerShell command (Get-SPTimerJob) and nothing more. Refined search after refined search continued to turn up nothing. So apparently I am the only person on the planet who needs to get a list of the timer jobs in C#. In case you are the second person on the planet who needs to do this, the code to do so follows:

        SPSecurity.RunWithElevatedPrivileges(() =>
        {
            var timerJobs = new List<SPJobDefinition>();

            foreach (var job in SPAdministrationWebApplication.Local.JobDefinitions)
            {
                timerJobs.Add(job);
            }

            foreach (SPService curService in SPFarm.Local.Services)
            {
                foreach (var job in curService.JobDefinitions)
                {
                    timerJobs.Add(job);
                }
            }
        });

    For reference, you have the two foreach loops because the Central Admin web application doesn't end up being in the SPFarm.Local.Services group, so you have to get it manually from the SPAdministrationWebApplication.Local reference.

    Read the article

  • Worthless Anti-Spam (What can we learn)

    - by smehaffie
    I recently came across a site that had an "anti-spam" field at the bottom of the entry form. The first issue I had with it was that at 1280x800 you could not read the value you were supposed to enter (see below). You tell me: should you enter div, dlv, piv, or plv? But even worse than not being readable at high resolutions is the fact that the programmer who coded it really did not understand what this was used for. An anti-spam (aka CAPTCHA) entry field should not be readable by looking at the HTML DOM (so entry of the value cannot be scripted). In this case the value is simply a disabled text input field that contains the value you need to type. So a hacker would simply need to search for the text input field named "spam2" and then they could flood the site with spam.

        <td>
          <label>
            <input name="spam1" type="text" class="small" id="spam1" size="6" maxlength="3" />
            <input name="spam2" type="text" class="small" id="spam2" value="plv"
                   disabled="disabled" size="6" maxlength="3" />
            * <span class="small">- Anti-SPAM key - please enter matching value</span>
          </label>
        </td>

    There are some things to learn from this example:

    1) Always make sure you understand why you are coding a feature/function for any program you write. Just following the requirements without understanding the "why" will sooner or later come back to bite you. The above example appears to be a case of this.

    2) Always check how the screen appears in different resolutions. In this case it was pretty much unreadable at 1280x800, but you could read it at 800x600 (though most people I know do not have their resolution set that low). Lucky for me, I could "View Source" and get the value I needed to enter.
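    By way of contrast, a minimal sketch of the same check done properly: the expected value lives only in the server-side session, and only a rendering of it (an image, or at least an obfuscated drawing) is ever sent to the browser. PHP is used here purely as an illustration; the field and session names are assumptions:

        <?php
        session_start();

        if ($_SERVER['REQUEST_METHOD'] === 'POST') {
            // Compare the typed value with the one stored when the form was rendered
            if (!isset($_SESSION['spam_key']) || $_POST['spam1'] !== $_SESSION['spam_key']) {
                die('Anti-spam check failed.');
            }
            // ...process the submission...
        }

        // Pick a new expected value for the form about to be rendered; it is never
        // written into the HTML itself, only into the image the user reads it from.
        $_SESSION['spam_key'] = substr(str_shuffle('abcdefghjkmnpqrstuvwxyz'), 0, 3);
        ?>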

    Read the article

  • How to rewrite or redirect an old, missing, or invalid URL to a 404 page

    - by kath
    I recently upgraded a site and almost all URLs have changed. I have redirected all of them (or so I hope), but it is possible that some have slipped by me. Is there a way to somehow catch all invalid URLs and send the user to a certain page? I am using PHP. Thanks so much! The error file is already set in .htaccess, but nothing seems to change. You can see the file below:

        AddHandler application/x-httpd-php5s .php
        ErrorDocument 404 /content/404.php
        <IfModule mod_rewrite.c>
        RewriteEngine on
        RewriteBase /

    Here are two different URLs; the first one is the old one (which I edited) and the second is the edited one:

        #1 old (no longer on the server)
        http://adsbuz.com/vehicles-cars/toyoya/2009-toyota-land-cruiser-gxr-4686.htm

        #2 edited (on the server)
        http://adsbuz.com/vehicles-cars-for-sale/toyoya/2009-toyota-land-cruiser-gxr-4686.htm

    I need only the second one, with vehicles-cars-for-sale, because the other directory has already been modified and is not on the server. But as you can see, after the site name (adsbuz), both vehicles-cars and vehicles-cars-for-sale open the same location. I hope I made myself clear.
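    A hedged sketch of what that .htaccess could look like, assuming the only rename was vehicles-cars to vehicles-cars-for-sale: 301 the old prefix to the new one, and anything that still maps to no real file then falls through to the ErrorDocument already defined:

        AddHandler application/x-httpd-php5s .php
        ErrorDocument 404 /content/404.php

        <IfModule mod_rewrite.c>
            RewriteEngine on
            RewriteBase /

            # Old directory name -> new one (pattern is an assumption based on the two URLs above)
            RewriteRule ^vehicles-cars/(.*)$ /vehicles-cars-for-sale/$1 [R=301,L]
        </IfModule>

    With the old directory gone from the disk and no other rule claiming those paths, requests for URLs that no longer exist should then return 404 and show /content/404.php.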

    Read the article

  • How to whitelist a user agent for nginx?

    - by djb
    I'm trying to figure out how to whitelist a user agent in my nginx conf. All other agents should be shown a password prompt. In my naivety, I tried to put the following before deny all:

        if ($http_user_agent ~* SpecialAgent) {
            allow;
        }

    but I'm told the "allow" directive is not allowed here (!). How can I make it work? A chunk of my config file:

        server {
            server_name site.com;
            root /var/www/site;

            auth_basic "Restricted";
            auth_basic_user_file /usr/local/nginx/conf/htpasswd;

            allow 123.456.789.123;
            deny all;
            satisfy any;

            #other stuff...
        }

    Thanks for any help.
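    One pattern that avoids allow inside if altogether: use a map to derive the auth_basic realm from the User-Agent, turning authentication off for the whitelisted agent. This is a sketch; it relies on auth_basic accepting variables, so check your nginx version, and keep in mind that the User-Agent header is trivial to spoof:

        # in the http{} block
        map $http_user_agent $auth_realm {
            default          "Restricted";
            ~*SpecialAgent   off;            # this agent skips basic auth
        }

        server {
            server_name site.com;
            root /var/www/site;

            auth_basic           $auth_realm;
            auth_basic_user_file /usr/local/nginx/conf/htpasswd;
        }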

    Read the article
