Search Results

Search found 5380 results on 216 pages for 'webmasters'.

  • My Blog is marked as spam because of adBrite

    - by Thala
    I have been blogging for over a month. I never thought about publishing ads on my blog until someone recommended adBrite to me. I copied the widget code and just added it to my blog. Crap, suddenly I got a message that I was stopped from blogging because my blog had been marked as spam. I haven't given up; I just keep sending requests to get my blog back. How do I get my Blogger blog unlocked and no longer considered spam?

  • Is the W3C standard a major factor when Google decides SERP position?

    - by Anonymous12345
    I have a dynamic PHP website whose index page alone has around 800 errors according to the online W3C validator. I have also checked major websites like eBay and Stack Overflow, and they each show around 400 errors. So my first thought is: what good is a validator that always displays errors? Secondly, will the errors affect my SERP ranking? That is, will fixing these errors as well as I can improve my Google search position? Thanks

  • Next lowest value in MySQL Database [migrated]

    - by Justin Edwards
    SELECT * FROM `experience` WHERE `reqexp` <> '4793' ORDER BY 'lvl' DESC LIMIT 1

    Here is what I want to do. I am making an online game for a client, and need a MySQL query that, given an arbitrary experience value, finds the level associated with that amount of experience. In this case, I need to find the next value lower than 4793 that already exists in the database, so I can determine the player's appropriate level. Any ideas?
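
    A corrected sketch, assuming reqexp stores the experience required to reach each lvl: the comparison wants < rather than <> (not-equal), and the ORDER BY column needs backticks rather than single quotes, otherwise it sorts by a constant string instead of the column:

        SELECT *
        FROM `experience`
        WHERE `reqexp` < 4793      -- thresholds below the player's experience
        ORDER BY `reqexp` DESC     -- take the highest qualifying one
        LIMIT 1;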

  • Does it matter that the TTL is higher than I've been told?

    - by Andy
    OK, so the title isn't very descriptive, I know. Basically, I'm trying to configure Outlook.com to use my custom domain. I've followed the steps, made the account, etc., and now I have the DNS settings from Windows Live to configure. I added the MX entries and everything last night, but Windows Live is still saying I need to prove my ownership of the domain. The only thing I can think of is that I had to use a different TTL to the one provided, because my web host only allows a minimum of four hours, whereas Windows Live told me to configure the TTL as one hour. Would that stop anything? By the way, my web host is JustHost (shared hosting).
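
    A TTL only controls how long resolvers cache a record, so a four-hour TTL should not break verification by itself; it just means changes take up to four hours to be seen everywhere. One way to check what the outside world currently sees (a sketch; substitute the real domain for example.com) is to query the records directly, where the number in each answer line is the remaining TTL in seconds:

        dig MX example.com
        dig TXT example.com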

  • Change A record in Hetzner KonsoleH

    - by Charles
    I am a Hetzner level 19 shared hosting client and would like to change the A record for my domain. When I log in to KonsoleH, I can see which IP address it points to at the moment, but not where I can change it. Does anyone know how to do it? Thanks!

    Solution: I called Hetzner support. It was because I had to log in with my client ID, not with my domain. Then I could just click on the A record and edit it.

  • What is a squeeze page?

    - by Steve
    I've been reading a marketing book which suggests building a squeeze page to build an email list. Does this mean one of those long sales-letter-type pages with crummy styling? I'm assuming the styling does not have to be generic, or does it? Or, if the sales letter is not a squeeze page, what is a squeeze page? Is there an easy way to build one, and what considerations should be taken into account when building one?

  • What are my options for Web 2.0 gallery software?

    - by NewProger
    Good day everyone! This is my first post on this site ^_^ My question is: what are my options if I need Web 2.0 gallery software that works in a very similar manner to this removed website? Essentially: users can create an account, then create and manage their own galleries, and all other users and guests can see these galleries, add tags, comments, etc. Basically, a community-driven gallery website. When it comes to "normal" CMSes I know a lot of good options, but I have never worked with any gallery management software. If you can suggest some good (free) solutions, that would be great! Also: if you by any chance know what software is used at this removed website, that would be even better, because that solution suits me best and I would have used that engine, but I can't find any information on what the engine is, or whether it is even public or a privately developed thing. Thanks in advance!

  • How do I map some subdirectories to run alongside a Drupal site?

    - by paradroid
    I have a Drupal site running on Apache using the following vhosts file:

        <VirtualHost xx.xx.xx.xx:80>
            ServerName bananas.net
            ServerAlias www.bananas.net
            DocumentRoot /var/www/drupal/
            RewriteEngine On
            RewriteCond %{HTTP_HOST} !=bananas.net [NC]
            RewriteRule ^(.*)$ http://bananas.net$1 [L,R=301]
            <Directory /var/www/bananas.net/>
                Options -Indexes FollowSymlinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
            CustomLog ${APACHE_LOG_DIR}/access.log combined
            ErrorLog ${APACHE_LOG_DIR}/error.log
        </VirtualHost>

    I set it up some time ago, so I am not sure what the <Directory /var/www/bananas.net/> directive was meant for. That directory is currently empty. With the vhosts file the way it is, does the Directory directive have any effect at all? I want to add some content which is separate from the Drupal site. How do I add sub-directories within /var/www/bananas.net/ which can be accessed alongside the Drupal site running at the root? As they have nothing to do with the Drupal site, I want to keep the files separate, but still serve them from the same domain.
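
    A minimal sketch of one common approach, assuming the separate content lives under /var/www/bananas.net/extras (a hypothetical path): an Alias inside the VirtualHost maps a URL prefix onto that directory, taking precedence over the DocumentRoot, and those requests never touch Drupal's rewrite rules because they never map into /var/www/drupal:

        Alias /extras /var/www/bananas.net/extras
        <Directory /var/www/bananas.net/extras>
            Options -Indexes
            Order allow,deny
            Allow from all
        </Directory>

    As for the existing <Directory /var/www/bananas.net/> block: a Directory section only takes effect when requests actually map to that filesystem path, which is exactly what an Alias would make happen.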

  • Is there a way I can sort traffic by page type based upon URL structure in Google Analytics or Google Webmaster Tools?

    - by Felix
    I have a local business directory site. I'm trying to segment my incoming traffic by page type so that I can find out what percentage of traffic goes exclusively to zip code pages and what percentage goes to city/state-level pages. Basically, I want to filter by URL structure to find out what percentage of total traffic the zip code pages account for. Does Google Tag Manager help with this? Here are the two URL paths: http://www.example.com/ny/new-york/10011/ http://www.example.com/ny/new-york Thanks all!
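
    A sketch of one way to do this in Google Analytics itself, no Tag Manager needed: create an advanced segment (or a filter on the Pages report) whose Page dimension matches a regular expression keyed to the URL structure. Assuming zip pages always end in a five-digit segment as in the examples above, something like:

        ^/[a-z]{2}/[^/]+/[0-9]{5}/$     (zip code pages)
        ^/[a-z]{2}/[^/]+/?$             (city/state pages)

    The percentage then falls out of comparing the segment's pageviews to the site total.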

  • Blatant copyright theft

    - by Tom Gullen
    Found a user on the forum trying to solicit business for his website; a good user reported it and I checked the website out. Firstly, and most dangerously, it's attempting to sell our original software, which is open source. Our open source software is around 15 MB, and he's serving a 50 MB download and trying to sell it for $20. He's also stolen our CSS/images/site design in general, which is all custom built. I attempted to open a reasonable discussion with him, and he responded promptly, saying he would remove the offending materials if he could just have 3 days to sort it out, which I accepted. I'm not sure what his plan was, because everything on that site is offending material. Anyway, he messaged back saying the site was offline, and it was, but it went back online shortly afterwards. It's pretty sickening that someone is selling open source work as their own (the site's about-us page references him as the sole developer, etc.; it's unbelievable to read). I want to shut it down; what are my options? I'm going to contact his domain registrar, web host, and PayPal (that's how he's selling the program). Any other ideas?

  • Search engine bots accessing strange URLs

    - by casasoft
    We have ELMAH enabled on our site and get notified whenever a Page Not Found error is triggered on the website. We recently redesigned the website, so we understand that search engine robots might try to access previously indexed pages and trigger Page Not Found errors. For this reason, we have set up permanent redirects from such previously indexed pages to the respective new pages. The website in question is www.chambercollege.com, and, for example, a previously indexed URL was www.chambercollege.com/special-offers.aspx. This page is no longer accessible, so we created the necessary permanent redirect to the respective new page at www.chambercollege.com/en/content/special-offers-161/. Now we are starting to receive Page Not Found errors from search engine bots (e.g. the MSN bot) trying to access the URL www.chambercollege.com/special-offers.aspx/images/shadow_right.jpg/. Any idea how a search engine could make up that strange URL, and do you have any suggestions on what to do?
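
    One likely cause: relative links such as images/shadow_right.jpg on the old page were resolved by the bot against the old URL as if it were a directory. Since the site is ASP.NET, here is a hedged sketch of a rule for the IIS URL Rewrite module (assuming that module is installed; the rule goes inside <system.webServer><rewrite><rules> in web.config, and leaving the pattern unanchored at the end makes it swallow any junk suffix as well):

        <rule name="old-special-offers" stopProcessing="true">
          <match url="^special-offers\.aspx" />
          <action type="Redirect" url="/en/content/special-offers-161/" redirectType="Permanent" />
        </rule>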

  • I need a flexible VPS like gandi.com [duplicate]

    - by Sharen Eayrs
    This question already has an answer here: How to find web hosting that meets my requirements? (5 answers)

    What are the alternatives? I am not asking which one you recommend; I am simply asking for a list of similar services. That being said, if there is one you've used and are happy with, please let me know. This is what I am considering: http://en.gandi.net/. Some say it's overpriced. I think it's like Amazon AWS. So I need VPS hosting where I can easily change the CPU allocation, etc.

  • How to change my website's appearance in a Facebook wall post?

    - by Lode
    When posting a website link in a Facebook wall post, Facebook fetches some content (title, text, and image) from the website to show to readers. Is there a way I can adjust or propose which content is used or preferred by Facebook? I found someone saying to use <meta property="og:image" content="image.jpg">, but this doesn't seem to have any effect. Maybe Facebook caches these results for a while?
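
    Facebook reads Open Graph meta tags for this. A slightly fuller sketch (all values are placeholders); one common gotcha is that og:image generally needs an absolute URL, which may be why a bare image.jpg has no effect, and Facebook's URL debugger at developers.facebook.com/tools/debug will both show what was scraped and refresh its cache:

        <meta property="og:title" content="Page title" />
        <meta property="og:description" content="Short summary shown under the title." />
        <meta property="og:image" content="http://www.example.com/images/preview.jpg" />
        <meta property="og:url" content="http://www.example.com/page" />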

  • WHM local/external mail server confusion

    - by BWRic
    We host several websites on the same server using WHM, but this seems to confuse mail routing when a client uses external mail servers: the server delivers locally instead. We have our own email accounts hosted on the server. When creating an account for a client on the same server, WHM adds the default entries to the DNS zone for that account. However, this client has their own mail servers elsewhere, and when we send them an email it never reaches that external server; it only ever reaches the local, incorrect one. I realise I can update my DNS to point to the external server, but this means I am copying their settings, and if those change, I will also need to update mine. Are there some settings I can use to force it to use the external servers without having to copy the settings?
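
    A sketch of the usual cPanel-level fix, assuming the affected domain is client-domain.com (a placeholder): cPanel decides between local and remote delivery via /etc/localdomains and /etc/remotedomains, so moving the domain from the first file to the second (equivalently, setting its Email Routing to "Remote Mail Exchanger" in WHM) makes Exim look up the real MX records instead of delivering to the local mailbox:

        sed -i '/^client-domain\.com$/d' /etc/localdomains
        echo 'client-domain.com' >> /etc/remotedomains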

  • What is an easy way to see how often recently added pages are viewed in Google Analytics?

    - by cboettig
    Google Analytics makes it very easy to see the number of views of the most-viewed pages, but I cannot figure out how to see the number of views a particular page has received, or the number of views of recently added pages (e.g. blog posts). Is it possible to sort the pageviews list by the date the page was added? Can this be done without having to externally create a list of recent pages and use the Analytics API?

  • In China. Want to set up my own private proxy. Already have website/webhosting. Help please! n00b with respect to coding/programming, go easy on me [closed]

    - by user1725461
    I am in China and have used Freegate in the past (http://en.wikipedia.org/wiki/Freegate). Recently I've been having too many problems with that and some other web-based proxies I usually use. I have a website hosted in the US which I can access from China. Is there an easy way for me to set up my own secure private proxy? I'm sick of all my internet problems and am looking for a new workable solution. Thank you! PS: I really hope this is the right place for such a question...
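
    One common approach, sketched on the assumption that the US hosting account allows SSH access (username and hostname are placeholders): an SSH connection can double as an encrypted SOCKS proxy, with nothing extra to install on the server:

        ssh -D 1080 -N username@your-us-host.example.com

    Then set the browser to use a SOCKS5 proxy at localhost, port 1080. A shared host that only offers FTP won't work for this; a small VPS would.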

  • Read data from a folder in the main domain folder (cPanel/WHM)

    - by Memphis Raines
    I have defined an account on my cPanel/WHM server and put all my websites under that one hosting account. The account's main domain is domain.com, and all the other websites are add-on domains:

        domain.com
        --folder
        --domain1
        --domain2
        --domain3
        ...

    What I need is for the server to read files from another folder when domain.com is called in the browser. For example, a call to http://domain.com should show us http://domain.com/folder, BUT I don't mean a redirection; I want the server to do this in the background without showing visitors the real path. I couldn't do this with Domain WildCard Redirection because it produced an error. How can I do this? With .htaccess, or something else?
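
    A minimal .htaccess sketch for the account's document root, assuming the target folder is /folder as in the example above; because there is no [R] flag, the rewrite stays internal, so visitors keep seeing the plain domain.com URLs:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/folder/
        RewriteRule ^(.*)$ /folder/$1 [L]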

  • .htaccess redirect to subfolder in different domain, maintaining old domain in the URL

    - by Naoise Golden
    Redirects have been widely discussed and most problems solved, so I am sorry for opening yet another post about this, but none of the snippets I have tried work. I have a WordPress site hosted at http://mydomain.com/clientsdomain.com/wordpress. I would like to temporarily point http://clientsdomain.com/ at the above URL while keeping clientsdomain.com in the address bar. So, for example, http://clientsdomain.com/some/page would serve http://mydomain.com/clientsdomain.com/wordpress/some/page. Is this even possible with .htaccess? Or maybe some configuration or plugin option in WordPress?
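
    One hedged sketch, assuming clientsdomain.com's DNS already points at the same server and that mod_proxy and mod_proxy_http are enabled; the [P] flag turns the rewrite into an internal proxy request, so the address bar keeps showing clientsdomain.com:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?clientsdomain\.com$ [NC]
        RewriteRule ^(.*)$ http://mydomain.com/clientsdomain.com/wordpress/$1 [P,L]

    For the links WordPress itself generates to stay on clientsdomain.com, its siteurl and home settings would also have to point there.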

  • Which iPhone ad API has produced the highest revenue for you?

    - by Kyle Humfeld
    This isn't a technical question, but more of a request for advice and empirical/anecdotal data. I'm nearly done writing a free app for iPhone, and I'm at the stage where I'm going to put ads into the app. I've had mixed success in the past with iAd (their fill rates have been atrocious recently, and their payouts have been cut by about 75% over the past 4 months or so), and would like to know how much ad revenue you, the community, have seen from the various ad APIs you've used for your iPhone apps. This isn't a request for opinion, i.e. which is 'better', only what kinds of numbers you're seeing. I don't need absolute figures, but 'iAd pays x% higher than AdMob, and y% lower than AdSense' would be extremely helpful to me as I make my decision about which ad API to integrate into my app. Also, have you had any experience or success with integrating multiple ad APIs into the same app? That's something I'm considering doing in my current iAd-filled apps (particularly my iPad app, which has yet to receive a single impression after nearly 60,000 requests)... something like:

    1) request from iAd
    2) if that fails, request from AdSense
    3) if that fails, request from AdMob
    4) if that fails, ... etc.

  • Poor backlink profile - search rankings not updated for 2+ months

    - by fistameeny
    I am carrying out some work on a website that is a PR2 with a few good quality, relevant backlinks (PR4-6). It has a presence on Twitter that is updated regularly, a Google Places listing, and listings in some decent directories (Qype etc.). The site was rebuilt in Drupal 7 two months ago, with all the basics done: URL rewriting, an XML sitemap submitted to Google, and, most importantly, good quality, structured content. I've noticed that Google is still showing "old" URLs from the previous version of the site that was ditched 8 weeks ago. I think the site may be penalised under the Penguin update, as a previous SEO company created many low-quality links from link farms/directories. My question is: what is the correct way to deal with this? Bing Webmaster Tools can "disavow" links, and I guess I can attempt to contact the link farms to have the links removed. I've already submitted a request to Google to have the penalty removed, as we're trying to tidy up a bad history. We submit updated sitemaps to Google and Bing daily, and have built some further decent quality, relevant links. Is there anything further I can do?
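
    For the stale URLs specifically, the usual remedy is a permanent (301) redirect from each old URL to its nearest replacement, so that Google both drops the old address and transfers whatever equity it had. A hedged Apache example (both paths are placeholders; in Drupal 7 the Redirect module achieves the same thing):

        Redirect 301 /old-page.html http://www.example.com/new-page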

  • Redirect Google crawler to a different robots.txt via .htaccess

    - by user3474818
    I have googled for the answer all day and still couldn't find one. I have a virtual subdomain, www.static.example.com, which is a mirror of www.example.com; that is, I have just one root folder for the subdomain and the domain alike. I want to serve crawlers a different robots.txt file, robots_static.txt, whenever they request a URL on the .static host, and in that file I will forbid indexing with a Disallow rule. I want to do this because I have duplicate content in Google's search results: the subdomain shows exactly the same content as the main domain. Does anyone know how I could make crawlers see robots_static.txt instead of robots.txt? What I have managed to find so far is this:

        RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
        RewriteRule ^robots\.txt /robots_static.txt [NC,L]

    But when I check in Webmaster Tools, it still sees robots.txt as my robots file instead of robots_static.txt, so it crawls and indexes everything twice. What did I do wrong? Thanks

    EDIT: This is my .htaccess file:

        ##
        # @package Joomla
        # @copyright Copyright (C) 2005 - 2013 Open Source Matters. All rights reserved.
        # @license GNU General Public License version 2 or later; see LICENSE.txt
        ##

        ##
        # READ THIS COMPLETELY IF YOU CHOOSE TO USE THIS FILE!
        #
        # The line just below this section: 'Options +FollowSymLinks' may cause problems
        # with some server configurations. It is required for use of mod_rewrite, but may already
        # be set by your server administrator in a way that dissallows changing it in
        # your .htaccess file. If using it causes your server to error out, comment it out (add # to
        # beginning of line), reload your site in your browser and test your sef url's. If they work,
        # it has been set by your server administrator and you do not need it set here.
        ##

        ## Can be commented out if causes errors, see notes above.
        Options +FollowSymLinks

        ## Mod_rewrite in use.
        RewriteEngine On

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\.
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

        RewriteCond %{HTTP_HOST} ^www.static.*$ [NC]
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*robots\.txt.*\ HTTP/ [NC]
        RewriteRule ^robots\.txt /robots_static.txt [NC,L]

        ## Begin - Rewrite rules to block out some common exploits.
        # If you experience problems on your site block out the operations listed below
        # This attempts to block the most common type of exploit `attempts` to Joomla!
        #
        # Block out any script trying to base64_encode data within the URL.
        RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
        # Block out any script that includes a <script> tag in URL.
        RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
        # Block out any script trying to set a PHP GLOBALS variable via URL.
        RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
        # Block out any script trying to modify a _REQUEST variable via URL.
        RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
        # Return 403 Forbidden header and show the content of the root homepage
        RewriteRule .* index.php [F]
        #
        ## End - Rewrite rules to block out some common exploits.

        ## Begin - Custom redirects
        #
        # If you need to redirect some pages, or set a canonical non-www to
        # www redirect (or vice versa), place that code here. Ensure those
        # redirects use the correct RewriteRule syntax and the [R=301,L] flags.
        #
        ## End - Custom redirects

        ##
        # Uncomment following line if your webserver's URL
        # is not directly related to physical file paths.
        # Update Your Joomla! Directory (just / for root).
        ##
        # RewriteBase /

        RewriteCond %{THE_REQUEST} ^GET.*index\.php [NC]
        RewriteCond %{THE_REQUEST} !/system/.*
        RewriteRule (.*?)index\.php/*(.*) /$1$2 [R=301,L]
        RewriteCond %{THE_REQUEST} ^GET

        ## Begin - Joomla! core SEF Section.
        #
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
        #
        # If the requested path and file is not /index.php and the request
        # has not already been internally rewritten to the index.php script
        RewriteCond %{REQUEST_URI} !^/index\.php
        # and the request is for something within the component folder,
        # or for the site root, or for an extensionless URL, or the
        # requested URL ends with one of the listed extensions
        RewriteCond %{REQUEST_URI} /component/|(/[^.]*|\.(php|html?|feed|pdf|vcf|raw))$ [NC]
        # and the requested path and file doesn't directly match a physical file
        RewriteCond %{REQUEST_FILENAME} !-f
        # and the requested path and file doesn't directly match a physical folder
        RewriteCond %{REQUEST_FILENAME} !-d
        # internally rewrite the request to the index.php script
        RewriteRule .* index.php [L]
        #
        ## End - Joomla! core SEF Section.

        <FilesMatch "\.(ico|pdf|flv|jpg|ttf|jpg|jpeg|png|gif|js|css|swf)$">
        Header set Expires "Wed, 15 Apr 2020 20:00:00 GMT"
        Header set Cache-Control "public"
        </FilesMatch>

        <ifModule mod_headers.c>
        Header set Connection keep-alive
        </ifModule>

        ########## Begin - Remove Etags
        #
        FileETag none
        #
        ########## End - Remove Etags
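
    For what it's worth, a slightly tightened sketch of the same rule (untested): the THE_REQUEST condition is redundant, because the rule pattern already matches only robots.txt, and the dots in the host pattern ought to be escaped. Note also that the rewrite is internal, so Webmaster Tools will keep calling the file robots.txt; the real test is whether fetching http://www.static.example.com/robots.txt returns the contents of robots_static.txt:

        RewriteCond %{HTTP_HOST} ^www\.static\. [NC]
        RewriteRule ^robots\.txt$ /robots_static.txt [L]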

  • How to direct a Network Solutions domain name to an HTML website hosted on Google Drive? [on hold]

    - by Air Conditioner
    To begin with, I wanted to take advantage of HTML, CSS, and so on to build a website that looks and works just as I'd like it to. I looked around for ways to make that work, and soon found a Lifehacker article showing that it's possible to host website files on Google Drive. I made sure that the folder containing the files was shared publicly on the web, and I now have a working Google-Drive-hosted URL for the website. However, I wanted a custom domain, so I registered one with Network Solutions. So now I'm wondering how I should direct my Network Solutions domain to the index.html I'm hosting on Google Drive. Does anyone have an idea?

  • What's the simplest way to create a page with dynamic elements?

    - by ElendilTheTall
    I'm developing a site, part of which lists training courses with dates and prices. Every year the dates and prices change, which would mean lots of manual code editing to update the pages. What I'd like is a database containing the relevant information, which the course pages then reference, so we can just update the database rather than the HTML. My experience is in static design, so what is a simple way to achieve this?
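
    Since even basic shared hosts almost always offer PHP and MySQL, here is a minimal sketch of the usual approach (database name, credentials, and table layout are all hypothetical): a single template page queries a courses table and prints one entry per row, so updating the database updates the page:

        <?php
        // Connect to the database (placeholder credentials).
        $pdo = new PDO('mysql:host=localhost;dbname=site', 'dbuser', 'dbpass');

        // Print one list item per course, soonest date first.
        echo "<ul>\n";
        $rows = $pdo->query('SELECT name, start_date, price FROM courses ORDER BY start_date');
        foreach ($rows as $row) {
            printf("<li>%s (%s): %s</li>\n",
                htmlspecialchars($row['name']),
                htmlspecialchars($row['start_date']),
                htmlspecialchars($row['price']));
        }
        echo "</ul>\n";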

  • Move websites without losing search engine rankings

    - by Nuno
    I have one company with 5 business areas, and each business area has its own website. Recently we decided to merge all these websites into one group website. What is the best way to move these websites without losing SEO value and page rank? What do you think about our decision? We believe that if we have just one website covering the five business areas, it will attract more traffic than we have at this moment.
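
    The standard mechanism is a permanent (301) redirect from every page on each old site to its closest equivalent on the group site; search engines then transfer most of the old pages' ranking signals to the new URLs. A hedged .htaccess sketch for one of the old domains (domain and section names are placeholders):

        RewriteEngine On
        RewriteRule ^(.*)$ http://www.group-site.com/business-area-one/$1 [R=301,L]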

  • How to set up a mail server on Linux only for sending admin/debug emails?

    - by ChocoDeveloper
    I need to send server reports to myself from my remote servers, and I don't mind them going to spam, so I don't need SPF, DKIM, etc. I tried using mailutils to send something like this: uptime | mail -s "uptime" [email protected], but the emails don't go through. In /var/mail/root I received a bounce saying the HELO was invalid. So then I also tried adding -r root@my-ip, and now I don't get any error messages, but I don't receive the email either. How can I do this?
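
    One low-friction option, sketched on the assumption that you have an external mailbox offering authenticated SMTP (every value below is a placeholder): install a send-only relay such as ssmtp and hand all outgoing mail to that smarthost, which sidesteps the HELO and reverse-DNS checks that bounce mail sent directly:

        # /etc/ssmtp/ssmtp.conf (send-only relay; values are placeholders)
        # Deliver mail addressed to root here:
        root=admin@example.com
        # Authenticated smarthost and port:
        mailhub=smtp.example.com:587
        AuthUser=reports@example.com
        AuthPass=secret
        UseSTARTTLS=YES

    After that, the original uptime | mail -s "uptime" ... pipeline goes out through the smarthost unchanged.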
