Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 113/216

  • Make Apache encode or replace quotes instead of escaping them?

    - by mplungjan
    In the documentation I read:

        Format Notes
        For security reasons, starting with version 2.0.46, non-printable and other special characters in %r, %i and %o are escaped using \xhh sequences, where hh stands for the hexadecimal representation of the raw byte. Exceptions from this rule are " and \, which are escaped by prepending a backslash, and all whitespace characters, which are written in their C-style notation (\n, \t, etc). In versions prior to 2.0.46, no escaping was performed on these strings so you had to be quite careful when dealing with raw log files.

    This is a problem for Analog, which is still the handiest analyser I use. I get

        .... "GET /somerequest?q=\"quoted string\"&someparm=bla"

    in the logfile, and it is of course flagged as corrupt, since Analog expects

        .... "GET /somerequest?q=%22quoted string%22&someparm=bla"

    or similar. I realise I can pre-process using something like

        perl -p -i.bak -e 's/\\"/%22/g' logfile

    but I'd rather not have to add this step for files that are 50-90 MB zipped per day. Thanks for any pointers.
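
    If the escaping itself can't be switched off, one way to avoid a separate pre-processing pass is Apache's piped logging, so each line is cleaned as it is written. Below is a minimal sketch using the same substitution as the one-liner above, plus one for escaped backslashes; the script path and output filename are assumptions:

        #!/usr/bin/perl
        # fixquotes.pl - piped-log filter (sketch): rewrite Apache's
        # backslash escapes into URL-encoded form as each line arrives.
        use strict;
        use warnings;

        open(my $out, '>>', '/var/log/apache/access_log.clean')
            or die "cannot open log: $!";
        select($out);
        $| = 1;                       # flush per line so entries aren't lost

        while (my $line = <STDIN>) {
            $line =~ s/\\\\/%5C/g;    # escaped backslashes
            $line =~ s/\\"/%22/g;     # escaped double quotes
            print $line;
        }

    It would be hooked up in httpd.conf with something like CustomLog "|/usr/local/bin/fixquotes.pl" combined (an untested sketch, not a drop-in).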

  • Webserver insists on opening "blog1.php" instead of "index.php"

    - by pepoluan
    I'm at my wits' end. I have just ripped out a website and am in the process of rebuilding everything. Previously, the 'home page' of the website was a blog, at the address "www.mydomain.com/blog1.php". After exporting everything, I deleted the whole directory and -- on request -- immediately created a blog/ directory. The idea is to get the blog back up as soon as possible, and temporarily redirect people accessing www.mydomain.com to the blog. Accessing the blog via http://www.mydomain.com/blog/ works. So I put in an index.php file containing a (temporary) redirect to the blog's address. The problem: the server insists on opening blog1.php instead of index.php -- even after we deleted all the files (including .htaccess), and even after putting in a new .htaccess file with the single line DirectoryIndex index.php. The server stubbornly wants blog1.php. Now, the site is on shared webhosting, so I have no actual access to the server; I have to do my work via cPanel. Currently I work around the issue by creating blog1.php, but I really want to know why the server does not revert to opening index.php. Did I perhaps miss some important setting in the byzantine cPanel menu pages?
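
    A sketch of one possible interim workaround, assuming mod_rewrite is available on the host: redirect requests for the bare domain straight to the blog from .htaccess, without relying on DirectoryIndex at all:

        # .htaccess (sketch, untested on this host): send the bare
        # domain to the blog with a temporary redirect
        RewriteEngine On
        RewriteRule ^$ /blog/ [R=302,L]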

  • Why does 301 redirect work for http but not for https?

    - by Tom G
    Through my domain registrar I have set up a domain, essayme.co.uk, to automatically forward to https://google.com. If I go to http://essayme.co.uk it works as expected and redirects me to https://google.com:

        $ curl -i http://essayme.co.uk
        HTTP/1.1 301 Moved Permanently
        Cache-Control: max-age=900
        Content-Type: text/html
        Location: https://google.com
        Server: Microsoft-IIS/7.5
        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET
        Date: Sat, 07 Jun 2014 11:14:16 GMT
        Content-Length: 0
        Age: 0
        Connection: keep-alive

    However, if I go to https://essayme.co.uk it just freezes and times out:

        $ curl -i https://essayme.co.uk
        curl: (7) Failed connect to essayme.co.uk:443; Operation timed out

    What is happening in the second case? (And, if possible, how can I get the redirect to work for https?)

    Problem background/clarification: I don't have an SSL certificate for the essayme.co.uk domain above, but I do for my live domain (let's call it mywebsite.com), and I was seeing the exact same problem on that domain (hence why I'm trying to debug it). Unfortunately I can't experiment with the live domain (as it's live), and I would like to avoid having to buy a second certificate for essayme.co.uk just for debugging (unless absolutely necessary).

    The problem I was seeing: my live domain, mywebsite.com (not its real name), has a valid SSL certificate. Visiting https://www.mywebsite.com displayed the webpage as expected. I had set up forwarding (as in the question above) from the naked domain (mywebsite.com) to https://www.mywebsite.com. Visiting http://mywebsite.com redirected to https://www.mywebsite.com as expected. However, visiting https://mywebsite.com would freeze and time out (as in the question above). I also tried forwarding it to http://www.otherwebsite.com as an experiment (i.e. forwarding to another site that does not use SSL), but the result was the same: visiting http://mywebsite.com redirected to http://www.otherwebsite.com as expected, while visiting https://mywebsite.com would freeze and time out again. So I set up essayme.co.uk as an experiment to try to understand why it doesn't work.
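
    One way to narrow down where the https connection dies, using standard command-line tools:

        # Verbose mode shows whether it is the TCP connection or the
        # TLS handshake that stalls:
        $ curl -v https://essayme.co.uk

        # Ask for a certificate directly; if nothing is listening on
        # port 443 this times out the same way the browser does:
        $ openssl s_client -connect essayme.co.uk:443 -servername essayme.co.uk

    A timeout at the connection stage (before any handshake) would be consistent with a forwarding service that only listens on port 80, in which case no redirect can ever be sent for https:// requests.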

  • Legal responsibility of public posts

    - by Murdock
    Given a public site with no logins: I let people post links to public Facebook profiles, and my site fetches the profile picture and displays it. Would it be OK if I just told people to post only profiles for which they have the owner's permission? Does such a statement exonerate me from copyright infringement and place the burden on the user? Edit: for bonus points, can the statement just be a notice under the button (the one that saves the link) saying "By clicking this button you agree to the terms and conditions", with maybe a link to the terms and conditions?

  • Displaying the first few pages of a PDF on a page = duplicate content?

    - by Ace
    I am embedding Scribd PDFs on my website. These are exam-paper PDFs which are also available on other websites. Since the Scribd document is an embed/iframe, I think Google considers my page to be empty, with no content -- Google doesn't count iframe content as part of the embedding page, right? So I decided to display the first pages of the PDF as text on the page for Google. Then, for user experience, I hide that text and replace it with the Scribd embed code using JavaScript (see the sketch below). I have two worries about this method. Firstly, I am displaying the first pages of the PDF, and the same PDF may be hosted on other websites -- will this be considered duplicate content? Secondly, I am hiding the content and replacing it with the Scribd embed via JavaScript -- is that considered bad by Google?
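
    For concreteness, a minimal sketch of the swap being described; the element ID and embed URL are made up for illustration:

        // Serve a crawlable text excerpt in the HTML, then replace it
        // with the Scribd iframe once the page has loaded.
        document.addEventListener('DOMContentLoaded', function () {
            var excerpt = document.getElementById('pdf-excerpt');          // hypothetical ID
            var frame = document.createElement('iframe');
            frame.src = 'https://www.scribd.com/embeds/12345678/content';  // hypothetical URL
            frame.width = '100%';
            frame.height = '600';
            excerpt.parentNode.replaceChild(frame, excerpt);
        });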

  • Value of links on negative review pages

    - by Sam Healey
    A general assumption with SEO is that more links = higher rankings. What I would like to know is whether Google understands what those links refer to. I.e. if somebody gives a product a good review on their personal blog and links the review to another company's website (which sells the product), would Google take the review/description around the link into consideration? Essentially, would Google know that this link refers to a product -- so that if somebody is looking to buy that product, Google would know to include the linked page because the reviewing page said it sells products, rather than merely having information on them? Then, to take this further: does Google know whether a link is positive or negative? For example, if somebody creates a post saying "do not visit example.com, example.com is bad because of blah blah blah", would Google recognise that the link carries bad feedback, and would it therefore have a negative effect on rankings? Or would Google treat it as just another link and rank the target better?

  • How to handle new domain names?

    - by michael
    I have a new product which I'll call a pen ink reloader. I have a website using my product's name, for example www.inkywink.com, which I want to be found by searches for keywords such as "pen ink", "pen out of ink", "ink for pens" etc., since nobody knows that a pen ink reloader exists. I see that it's quite difficult to get on the front page for these keywords, since they have lots of competition. However, I notice that the exact phrases I want to rank highly for are available as domains. I purchase "www.penink.com" and "www.penoutofink.com", which for argument's sake are highly searched and the perfect keywords to get eyes on my money site, www.inkywink.com. Two questions:

    1. What is my best option for leveraging those names so that they appear near the top of searches and bring traffic to my money site? Do I just 301-redirect them to inkywink.com (as in the sketch below), or should I create a little original content on each with links to my main site?

    2. If I just have them redirected to inkywink.com, am I able to use keywords in the meta tags and headers for each site separately, or do they automatically take on the same headers and tags as the site to which they're redirected?

    Thanks to anyone who can help, as I'm a real newbie to all this.
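
    A sketch of the redirect option, assuming the keyword domains sit on hosting that reads .htaccess (domain names as in the question):

        # Site-wide permanent redirect from a keyword domain to the
        # money site, preserving any path
        RewriteEngine On
        RewriteRule ^(.*)$ http://www.inkywink.com/$1 [R=301,L]

    Note that a domain which only issues 301s serves no pages of its own, so it has no separate meta tags or headers to fill in; that part of question 2 answers itself.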

  • URL hex characters in .htaccess

    - by Steve
    There is an old page with a space in the filename, and it is no longer found on the website. So I need to redirect this page to another page using a 301 redirect in .htaccess. If I place the filename directly into .htaccess (Bouquets%20%26%20Loose.html), the redirect does not work. If I escape the % signs like this (Bouquets\%20\%26\%20Loose.html), the redirect still does not work. How do I get this redirect to work in .htaccess? Thanks.
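
    One hedged pointer: Apache matches mod_alias and mod_rewrite patterns against the already-decoded path, so the pattern needs the literal space and ampersand rather than %20 and %26. A sketch, assuming the .htaccess sits in the document root and with a made-up target page:

        # Quote the pattern because it contains a space; "&" needs no
        # regex escaping, but "." does
        RedirectMatch 301 "^/Bouquets & Loose\.html$" http://example.com/bouquets-and-loose.html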

  • 'Sex' in domain name - is this bad?

    - by user3583
    In short, I am working with a company that does trade shows... one of their new domain names has the word 'sex' in it, completely innocently. EXAMPLE: www.someproductsexpo.com (being 'some' + 'products' + 'expo'). The content is completely inoffensive, and I do not see anything else that would flag either the website or any emails sent from [email protected] as inappropriate. I was just wondering if anyone has experience with domains like this, or comments to add? Thanks

  • Is it good or bad to have dynamic content in page titles and/or descriptions?

    - by Gunjan
    On a local listing website, I append the number of search results found to the description meta tag of the page (not the title, currently), as I think this is valuable for users -- e.g. "Find address, phone numbers, blah blah blah for 21 outlets in locality. Some more stuff after this..." As more places are added to the database, the description for the same page will change frequently. Is this good or bad for SEO? And how about doing the same for title tags?

  • Need suggestions on how to create a website with an encrypted database.

    - by SFx
    Hi guys, I want to create a website where a user enters content (say a couple of sentences) which eventually gets stored in a backend database (maybe MySQL). But before the content leaves the client side, I want it encrypted using something on the client, like maybe JavaScript. The data will travel over the web encrypted but, more importantly, will also be permanently stored in the backend database encrypted. Is JavaScript appropriate for this? Would 256-bit encryption take too long? Also, how do you query an encrypted database later on if you want to pull down the content that a user may have submitted over the past 2 months? I'm looking for tips, suggestions and any pointers you guys may have on how to go about learning about and accomplishing this. Thanks!
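
    On the speed worry: encrypting a couple of sentences with AES-256 is effectively instant in a browser. A minimal sketch of the client-side piece using the standard Web Crypto API (a modern browser API that long postdates this question; illustrative only):

        // AES-256-GCM in the browser (sketch). The hard design problem
        // is key management - if the server must never read the data,
        // the key has to live with the user, not on the server.
        async function encryptText(plainText, key) {
            const iv = crypto.getRandomValues(new Uint8Array(12));  // unique per message
            const data = new TextEncoder().encode(plainText);
            const ciphertext = await crypto.subtle.encrypt(
                { name: 'AES-GCM', iv: iv }, key, data);
            return { iv: iv, ciphertext: ciphertext };  // store both; the IV is not secret
        }

        const key = await crypto.subtle.generateKey(
            { name: 'AES-GCM', length: 256 }, true, ['encrypt', 'decrypt']);

    As for querying: the server can only filter on what it can read, so lookups like "everything this user submitted in the past 2 months" have to run against unencrypted metadata columns (user ID, timestamp); searching inside the encrypted text itself is not possible with this design.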

  • Page appears indexed in Google but not findable for any search terms?

    - by Jeff Atwood
    (Note that I am going to use screenshots here because I suspect writing about this will change the behavior over time.) If you do a Google search for uiviewcontroller best practices, either with or without the quotes, you end up with results like this. Note that none of these pages resolve to the actual Stack Overflow question containing those words in the title. They resolve to either a) sites that are mirroring our creative commons data and correctly pointing back to the source question without nofollow, as properly specified by our attribution requirements, or b) our own internal links to the question, but not the actual question itself. The actual page with the title

        Custom UIView and UIViewController best practices?

    does exist, at this URL:

        http://stackoverflow.com/questions/3300183/custom-uiview-and-uiviewcontroller-best-practices

    and apparently it is present in Google's index! But why does it not appear when we search for uiviewcontroller best practices?

    - We know that Google contains this page in its index.
    - Our search terms match the title of the question.
    - Stack Overflow has much higher pagerank than the other sites that are mirroring this question under Creative Commons.

    I don't get it. What are we doing wrong here?

  • How to rotate an HTML5 canvas as a page background?

    - by Sebastian P.R. Gingter
    Hi, I want to achieve the following: imagine a white sheet of paper on a black desk. Then rotate the paper a little to the left (like, 25 degrees). Now you still have the black desk, and a rotated white box on it. In this rotated white box I want to place non-rotated normal HTML content like text, tables, divs etc. I already have a problem at the very first step: rotating a rectangle. This is my code so far:

        <html>
        <head>
        <script>
        function draw() {
            var canvas = document.getElementById("myCanvas");
            var c = canvas.getContext("2d");
            c.fillStyle = '#00';
            c.fillRect(100, 100, 100, 100);
            c.rotate(20);
            c.fillStyle = '#ff0000';
            c.fillRect(150, 150, 10, 10);
        }
        </script>
        </head>
        <body onload="draw()">
        <canvas id="myCanvas" width="500" height="500"></canvas>
        </body>
        </html>

    With this, I see only a normal black box. Nothing else. I assume there should be a red, rotated box too, but there's nothing. What is the best approach to reach this and to have it as a (scaling) background for my web page?
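
    For what it's worth, the likely culprits in the snippet above: rotate() takes radians, not degrees, so rotate(20) turns the canvas by roughly 1146 degrees; it also rotates the whole coordinate system around the origin (the top-left corner), which puts the second rectangle off-canvas. And '#00' is not a valid colour. A corrected sketch, with illustrative values:

        var canvas = document.getElementById("myCanvas");
        var c = canvas.getContext("2d");
        c.fillStyle = '#000';
        c.fillRect(100, 100, 100, 100);
        c.save();
        c.translate(155, 155);            // pivot at the red square's centre
        c.rotate(25 * Math.PI / 180);     // degrees -> radians
        c.fillStyle = '#ff0000';
        c.fillRect(-5, -5, 10, 10);       // drawn centred on the new origin
        c.restore();                      // later drawing is unaffected

    For the rotated white box holding normal, non-rotated HTML content, note that canvas can't contain HTML at all, so a CSS transform on a div (rotating the box and counter-rotating its contents) may be the more natural tool; the canvas would then only be needed for the black background, if at all.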

  • Consolidating multiple domain names

    - by Mike
    I have a client that has three separately hosted copies of their website, each on a separate domain name. The websites are all essentially the same, bar a few discrepancies caused by badly managed updates in the past. I will soon be launching a completely new website for them, at which point all three domain names are to resolve to the same web server. One domain name will become the default domain name that they refer to in all their literature, and the other two will simply be used as catch-alls for old links, bookmarks, and so on. I would like to know what people consider the best route to achieve this. My plan so far is:

    1. Get the new site up and running on the new webserver.
    2. Change the relevant A record of the default domain name to point to the new webserver.
    3. Either:
       a) keep the existing hosting accounts in operation, and create a list of 301 redirects from old page names on the old sites to new page names on the new site; or
       b) configure CNAME records for the non-default domain names, each pointing to the new webserver, and create a list of 301 redirects on the new site that redirect from old page names to new page names.

    If my understanding is correct, 3a will help to maintain whatever search engine rankings the sites already have (I know it's not going to be perfect), while at the same time informing search engines that the old domain names are no longer in use. What's a good approach to take here?
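
    For the mechanics of option (b), a sketch in Apache terms (an assumed stack; domain and page names are hypothetical): the new server has to distinguish requests by hostname before mapping old page names.

        # Any request arriving under a non-default domain is 301'd to
        # the same path on the default domain
        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\.default-domain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.default-domain.com/$1 [R=301,L]

        # Old page names that changed get explicit mappings
        Redirect 301 /old-page.html http://www.default-domain.com/new-page/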

  • Is a merchant account a requirement for a website to take payments?

    - by calum
    Hi, I have had a quick look but couldn't see anything related. Basically, if we were to accept payments for events on our website via PayPal (essentially a Buy It Now! button), as a business, do we need a merchant account, or will a regular bank account be acceptable? I may have some confusion in terms: my understanding is that you need a merchant account to accept credit card payments, but as we are using PayPal, is this necessary? Thank you for any clarification. Disclaimer - I've read "What are some options for taking payments on my website?" but it doesn't explicitly say whether we require a merchant account or not. Thank you.

  • URL Rewrite http to https EXCEPT files in a specific subfolder

    - by BrettRobi
    I am trying to force all traffic on my web site to use HTTPS, using the URL Rewrite 2.0 module added to IIS 7.5. I got that working, and now have a need to exclude a couple of pages from using SSL. So I need a rule that rewrites all URLs to HTTPS except those referencing this folder. I've been banging my head against the wall on this and am hoping someone can help. I tried creating a rule to match all URLs except those in a nossl subfolder, as in this example:

        <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true">
            <match url="(/nossl/.*)" negate="true" />
            <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
                <add input="{HTTPS}" pattern="off" />
            </conditions>
            <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Found" />
        </rule>

    But this doesn't work. Can anyone help?
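
    A hedged observation: the path tested by <match url> does not include the leading slash, so a pattern anchored on "/nossl/" can never match -- and with negate="true", a never-matching pattern makes the rule fire for every request, including those in the subfolder. A sketch with that adjusted (untested; {REQUEST_URI} also stands in for {R:1}, which has nothing captured when the match is negated):

        <rule name="HTTP to HTTPS redirect" enabled="true" stopProcessing="true">
            <match url="^nossl/" negate="true" />
            <conditions>
                <add input="{HTTPS}" pattern="off" />
            </conditions>
            <action type="Redirect" url="https://{HTTP_HOST}{REQUEST_URI}" redirectType="Found" />
        </rule>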

  • Why is a Joomla-based website that was copied from a live server to localhost not showing pictures and throwing 404 errors?

    - by Darius
    I have copied a Joomla-based website via FTP onto my machine, and I am trying to make it run on my localhost, which is provided by the latest version of XAMPP. I have exported and imported the DB with no problems. I have placed all the files and folders into the htdocs folder, but when I go to localhost/examplesite all I get is the text that is on the front page -- no pictures -- and it displays a 404 error. Do I need to make changes to .htaccess? If so, can someone point me in the right direction? Thanks
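
    Two settings commonly need adjusting after copying a Joomla site into a subfolder (hedged -- whether they apply depends on how the site was configured). Illustrative values:

        # .htaccess: if Joomla's SEF URLs are enabled, RewriteBase must
        # match the new path under htdocs
        RewriteBase /examplesite

    and in configuration.php, a $live_site still set to the old domain will generate broken asset URLs on localhost:

        public $live_site = '';   // blank lets Joomla work out the URL itself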

  • How do I get PayPal or a merchant account for a marketplace-style web site?

    - by Brett G
    I'm having trouble getting approved for a merchant account for my website. Basically I have expert users and regular users. Expert users provide a service through my website, at rates they set themselves. Users purchase the services and pay me; I give 90% to the expert users. I have been told this is factoring. Is the way around this a system like freelancer.com's, where users deposit money into their account and then pay for the services they won? What are the negatives of that system? What about sites like 99designs? They accept credit card payments and then pay the winning designer. How are some sites doing this while I'm having so much trouble getting approved?

  • Transferring local site to shared hosting

    - by Pete
    I'm looking to set up a simple online text-processing tool similar to the Clang demo. The processing program itself is a C++ program which I can modify to provide the desired output I need. Since I use Linux+Perl daily and have used Apache in the past, I'd like to get this working locally first. My two questions are:

    1. Is it possible to do this with only Apache and Perl? I've looked into frameworks for doing this and quickly ran into the paradox of choice.
    2. Will I be able to easily transfer a working local site to a shared hosting service? I want to administer as little as possible. My understanding is that since this needs to run a C++ program, CGI is a requirement, and thus I need to administer the httpd server. Hopefully this doesn't mean a VPS.

    Thanks
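
    On question 1: plain CGI is enough for this shape of problem, and shared hosts that allow custom CGI scripts don't normally require you to administer httpd, though policies vary. A minimal sketch -- the binary path and form field name are made up:

        #!/usr/bin/perl
        # CGI wrapper (sketch): feed form input to a compiled C++
        # program on stdin and return its stdout as plain text.
        use strict;
        use warnings;
        use CGI;
        use File::Temp qw(tempfile);

        my $q = CGI->new;
        my ($fh, $tmpfile) = tempfile(UNLINK => 1);
        print {$fh} scalar($q->param('text') // '');
        close $fh;

        print $q->header('text/plain');
        print scalar qx(/home/user/bin/processor < $tmpfile);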

  • How to 301 redirect from old query string urls to CakePHP Canonical urls?

    - by Daniel Bingham
    I currently have a .htaccess file that looks like this:

        RewriteCond %{QUERY_STRING} ^action=view&item=([0-9]+)$
        RewriteRule ^index\.php$ /index.php?url=item/%1 [R=301]

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ index.php?url=$1 [QSA,L]

    It is meant to 301-redirect my old query-string-based URLs to new CakePHP URLs. This will successfully send users to the correct page. However, Google doesn't seem to like it (see below). I previously tried doing this:

        RewriteCond %{QUERY_STRING} ^action=view&item=([0-9]+)$
        RewriteRule ^index\.php$ /item/%1 [R=301]

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ index.php?url=$1 [QSA,L]

    But that fails: the second rewrite rule doesn't seem to catch the rewritten URL; it goes straight through. Using the first version wouldn't be a problem, except that I suspect it is what is choking up Google. It hasn't indexed my sitemap full of the new URLs. My old sitemap had been fully indexed, and all its URLs are in Google's index, but Google isn't following the redirects from the old URLs to the new ones: I have a 'not followed' error for every one of the query URLs that was in my old sitemap. Am I properly using a 301 redirect here? Is it the weird rewrite rule? What can I do to send both Google and users to the proper page and save my page rank?
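
    Two hedged observations about the second version: the 301 rule has no L flag, so processing continues into the later rules in the same pass; and because its substitution contains no query string, mod_rewrite passes the original action=view&item=... query string through onto the redirect target. A sketch of the rule with both addressed -- the bare trailing "?" discards the old query string:

        RewriteCond %{QUERY_STRING} ^action=view&item=([0-9]+)$
        RewriteRule ^index\.php$ /item/%1? [R=301,L]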

  • Webhosting with custom database choice [closed]

    - by churchill614
    Possible Duplicate: How to find web hosting that meets my requirements?

    I am trying to find somewhere to host a website which uses OrientDB as its database. My budget doesn't stretch to a dedicated server where I can configure everything as I need it. Rather, I am hoping to find somewhere, ideally UK-based, that will allow me to install OrientDB (or install it for me) on a normal shared server. Is anybody able to point me in a good direction for this, please? (Whilst UK is preferable, it is not essential.)

  • Error running Phusion Passenger in standalone mode

    - by msidell
    I'm trying to run standalone Phusion Passenger so that I can run different Ruby RVM configurations on the same host. I already have Ruby and Passenger running fine on this host. I am following the instructions here. When I run standalone Passenger the first time, it appears to successfully install nginx. But then, when it tries to run, I get this error:

        [root@clark directra]# passenger start -a 127.0.0.1 -p 3001 -d --user dweb
        *** ERROR ***
        Could not start Passenger Nginx core:
        nginx: [alert] could not open error log file:
            open() "/tmp/passenger-standalone.16757/logs/error.log" failed
            (2: No such file or directory)
        nginx: [alert] Unable to start the Phusion Passenger watchdog
            (/var/lib/passenger-standalone/3.0.11-x86-ruby1.9.3-linux-gcc4.1.2-1002/support/agents/PassengerWatchdog):
            Permission denied (13) (13: Permission denied)
        Stopping web server... done

    FWIW, /tmp is writeable. Any idea what's wrong?
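
    A guess rather than a verified fix: since the failure is a permission error while starting as root with --user, it may be worth starting Passenger as the target user directly, so everything it creates under /tmp belongs to dweb from the outset:

        $ su - dweb -c 'passenger start -a 127.0.0.1 -p 3001 -d'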

  • 302 Redirect causes garbage at end of Wordpress link in Facebook

    - by Joao
    When I try to link my Wordpress blog on Facebook, the URL doesn't resolve properly. There's garbage appended at the end, and Facebook is not able to retrieve information from the site. This happens with every page, post, or main entry. Here's what happens: http://clarissarezende.com.br/ shows up in Facebook as http://clarissarezende.com.br/UPLcS/ (when I copy/paste the link), and no information about the site shows up in FB. I'm using Wordpress 3.3.1 with ProPhoto 4. Recently I moved the DNS entry on my ISP: the blog is hosted at clarissarezende.com.br/public_html/blog2, and whereas the DNS used to point to public_html, I changed it to point to public_html/blog2. Note that I did not move any Wordpress files. I made what I think are the necessary changes all over Facebook, but still no dice... Any ideas on what can be happening?

  • SEO Keyword Research Help

    - by user5857
    I'm new at SEO and keyword research. I am using Market Samurai as my research tool, and I was wondering if I could ask for your help in identifying the best keyword to target for my niche. I do plan on incorporating all of them into my site, but I wanted to start with one. If you could give me your input on these keywords, I would appreciate it. This is all new to me :) I'm too new to post pictures, but here are my keywords (searches, SEO traffic, and SEO value per day):

        Keyword | Searches | SEO Traffic | PBR | SEO Value | Average PR/Backlinks of Current Top 10
        1       | 730      | 307         | 20% | 2311.33   | 1.9 / 7k-60k
        2       | 325      | 137         | 24% | 822.94    | 2.3 / 7k-60k
        3       | 398      | 167         | 82% | 589.79    | 1.6 / 7k-60k

    I'm wondering if the PBR (phrase-to-broad ratio) value of #1 is too low. It seems like the best value because the SEOV is crazy high -- that's like $70k a month. #3 has the highest PBR, but also the lowest SEOV. #2 doesn't seem worth it because of the PR competition; it might be a little too hard to get onto the first page of Google. I'm wondering which keywords to target, and whether I should be looking at any other metric to see if this is a profitable niche to jump into. Thanks.

  • How to handle multiple domains correctly? [closed]

    - by Eric Itzhak
    Possible Duplicate: Could I buy a domain name to increase traffic to my site like this?

    I have a website with multiple keyword-based domains (2, actually). Both domains are common Google searches for the topic. What I'm interested in is handling the domains so that Google recognizes the second domain with the same page rank as the primary domain, so it will also appear on the first page. My question is: how do I do this correctly to help SEO? Meaning, I want the 2nd domain to be on the first page, because the primary is on the first page for its keyword. Do I simply put a redirect in the index.php file? Or do a 301 redirect? Or change the .htaccess file? Or create a domain pointer in the control panel?
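
    For reference, the index.php option and the 301 option aren't mutually exclusive: a redirect issued from PHP can itself be a permanent one. A sketch, with a hypothetical target domain:

        <?php
        // index.php on the secondary domain: send a 301 to the primary
        header('Location: http://www.primary-domain.com/', true, 301);
        exit;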

    Read the article
