Search Results

Search found 5380 results on 216 pages for 'webmasters'.


  • How to make this CSS design of words in headings look clean and well designed? [closed]

    - by kacalapy
    I am trying to put lipstick on the pig, and not wearing my UI developer hat often makes this difficult. Can someone suggest nicer alternatives to the code below? This is what I have now:

        <style>
        .FirstLetter:first-letter {font-family: arial; font-size: 14pt; font-weight: bold; color: white; background: blue; border: 1px solid black; padding-top: 8px; padding-left: 8px; padding-bottom: 3px;}
        .Spaced {letter-spacing: 5px; font-family: arial; font-size: 14pt; font-weight: bold;}
        </style>
        <div class="FirstLetter Spaced headerFont">
            Executive Summary
        </div>

    The result of the above code is ugly (a screenshot was attached); I am looking to improve ONLY the header section, where the first letter is blue.
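
    One possible cleanup, as a minimal sketch (the .heading class name is illustrative, not from the original markup): merge the duplicated font declarations into one class and keep the first-letter treatment in a single pseudo-element rule:

        .heading {
            font: bold 14pt arial;      /* shared by all letters */
            letter-spacing: 5px;
        }
        .heading:first-letter {
            color: white;
            background: blue;
            border: 1px solid black;
            padding: 8px 0 3px 8px;     /* top right bottom left */
        }

        <div class="heading">Executive Summary</div>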


  • Is it unwise to blacklist an IP address?

    - by hawbsl
    We have a form on a commercial website which has been abused (but only once or twice) by someone from a particular IP address. A colleague wants to blacklist that IP address from the website. Seems to me that's overkill, and that there's a risk that genuine customers sharing that same IP address would be blacklisted too. I suppose a big part of my question is how many people might be sharing that same IP address and could be affected by our blacklist. I suspect that's a "how long's a piece of string" question but some ballpark answer would be really helpful. We're in the UK if that's significant.


  • CA For A Large Intranet

    - by Tim Post
    I'm managing what has become a very large intranet (over 100 different hosts / services) and will be stepping down from my role in the near future. I want to make things easy for the next victim, er, person who takes my place. All hosts are secured via SSL. This includes various portals, wikis, data entry systems, HR systems and other sensitive things. We're using self-signed certificates, which worked OK in the past but are now problematic: browsers make it harder for users to understand exactly what is going on when a self-signed certificate is encountered, much less accept one, so putting up a new host means 100 phone calls asking what "Add an exception" means. What we were doing is just importing the self-signed certs when we set up a new workstation. This was fine when we only had a dozen to deal with, but now it's just overwhelming. Our I.T. department has classified this as our problem; all we get from them is support for switch and router configurations. Beyond the user having connectivity, everything else is up to the intranet administrators. We have a mix of Ubuntu and Windows workstations. We'd like to set up our own self-signed CA root, which can sign certificates for each host that we deploy on the intranet. Client browsers would of course be told to trust our CA. My question is: would this be dangerous, and would we be better off going with intermediate certificates from someone like VeriSign? Either way, I still have to import the root for the intermediate CA, so I don't really see what the difference is. Other than charging us money, what would VeriSign be doing that we could not, beyond protecting the root CA cert so it can't be used to make forgeries?
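
    A minimal sketch of the self-signed root CA approach with OpenSSL (file names and lifetimes are illustrative assumptions):

        # create the root key and self-signed root certificate (keep these offline)
        openssl genrsa -out rootCA.key 4096
        openssl req -x509 -new -key rootCA.key -sha256 -days 3650 -out rootCA.crt

        # per host: key, CSR, and a certificate signed by the root
        openssl genrsa -out wiki.intranet.key 2048
        openssl req -new -key wiki.intranet.key -out wiki.intranet.csr
        openssl x509 -req -in wiki.intranet.csr -CA rootCA.crt -CAkey rootCA.key \
            -CAcreateserial -out wiki.intranet.crt -sha256 -days 365

    Workstations then import only rootCA.crt once; every host certificate signed by it is trusted automatically, so new hosts no longer trigger "Add an exception" prompts.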


  • Has anyone bought Market Samurai and had a good experience?

    - by ZakGottlieb
    When a piece of marketing software offers an affiliate program, it's hard to ever find an objective review of it, so I thought I might try on Quora. It just boggles my mind that it can cost only $97 flat, when other SEO or keyword research tools like Wordtracker cost almost the same PER MONTH and don't seem to offer much more, if anything... Can anyone explain this, and would anyone recommend Market Samurai WITHOUT posting a link to it in their review? :)


  • Has Microsoft stopped offering the free Internet Explorer Application Compatibility VPC Image for IE 6 testing?

    - by Paul D. Waite
    For some time now, Microsoft has made available free, stripped-down, time-limited Virtual PC images for testing web apps in older versions of IE. The most recent version is here: http://www.microsoft.com/download/en/details.aspx?id=11575 But the XP VPC image has now expired (14th Aug 2011), meaning one can no longer test IE 6 using this method. Have Microsoft made updated XP VPC images available? If not, have they commented on the situation? Do they provide any alternative method to test web apps in IE 6? Update As noted by @PleaseStand, as of 16th Aug 2011, Microsoft has made updated images available that expire on 17th November 2011.


  • How to handle URLs with diacritic characters

    - by user359650
    I am wondering how to handle URLs which correspond to strings containing diacritics (á, ü, ...). I believe what we're seeing mostly are URLs where diacritic characters were converted to their closest ASCII equivalent, for instance Rånades på Skyttis i Ö-vik converted to ranades-pa-skyttis-i-o-vik. However, depending on the language involved, such conversion might be incorrect. For instance in German, ü should be converted to ue and not just u, as seen in the URL below, which represents the string Bayern München as bayern-muenchen: http://www.bundesliga.de/en/liga/clubs/fc-bayern-muenchen/index.php However, I've also noticed that browsers can render non-ASCII characters when they are percent-encoded in the URL, which is the approach Wikipedia has chosen, for instance http://de.wikipedia.org/wiki/FC_Bayern_M%C3%BCnchen which is rendered in the address bar as FC_Bayern_München. Therefore I'm considering the following approach for creating URL slugs:
    (1) convert strings while replacing non-ASCII characters with their recommended ASCII representation: Bayern München -> bayern-muenchen
    (2) also convert strings to percent encoding: Bayern München -> bayern-m%C3%BCnchen
    and create a 301 redirect from version (1) to version (2). Version (1) URLs could be used for marketing purposes (e.g. mywebsite.com/bayern-muenchen), but the URLs that would end up being displayed in the browser bar would be version (2) URLs (e.g. mywebsite.com/bayern-münchen). Can you foresee particular problems with this approach? (Wikipedia is not doing it, and I wonder why, apart from the fact that they don't need to market their URLs.)
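
    A minimal sketch of step (1) in PHP, assuming a hand-maintained, language-aware replacement map (the map shown is illustrative, not exhaustive):

        <?php
        // language-aware transliteration map (German shown; extend per locale)
        $map = array('ä' => 'ae', 'ö' => 'oe', 'ü' => 'ue', 'ß' => 'ss');

        function slugify($text, $map) {
            $text = mb_strtolower($text, 'UTF-8');
            $text = strtr($text, $map);                        // recommended ASCII replacements
            $text = preg_replace('/[^a-z0-9]+/u', '-', $text); // everything else becomes a hyphen
            return trim($text, '-');
        }

        echo slugify('Bayern München', $map); // bayern-muenchen

    Step (2) corresponds to what rawurlencode() produces for the non-ASCII form (münchen becomes m%C3%BCnchen).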


  • folder level 301 redirect without .htaccess

    - by Vinay
    I have a website hosted with Yahoo Small Business, so I don't have access to the .htaccess file. I have around 220 pages in a folder 'mysubfolder' (http://mysite.com/myfolder/mysubfolder), and the website is around 3 years old. I am planning to move all 220 pages from 'mysubfolder' to 'myfolder' (one level up). All the pages in 'mysubfolder' are indexed. What is the best way to do this so that it doesn't affect SEO?
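
    Without .htaccess, one workaround is a per-page 301 in PHP; a minimal sketch, assuming the old locations can be replaced with PHP stubs that keep the same filenames (query strings ignored for brevity):

        <?php
        // stub left at /myfolder/mysubfolder/<page>.php: redirect one level up
        $target = 'http://mysite.com/myfolder/' . basename($_SERVER['PHP_SELF']);
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: ' . $target);
        exit;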


  • Google crawling the site but refusing to index dynamic content

    - by Omeoe
    I am trying to get Google to index an AJAX site (davidelifestyle.com). It's crawlable with JavaScript turned off, and I have also recently implemented the _escaped_fragment_ snapshot mechanism, but all that's indexed is the home page and PDF files that are not directly available from the home page. Also, when I use Fetch as Google in Webmaster Tools, it downloads the dynamic page but does not index it ("Submit to Index" just reloads the page). Any ideas what might be wrong?
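
    For reference, pages that don't use #! URLs opt in to Google's AJAX crawling scheme with a meta tag; a minimal sketch:

        <!-- tells the crawler to re-fetch this page as ?_escaped_fragment_= -->
        <meta name="fragment" content="!">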


  • Does text size and placement on the page have an effect on SEO?

    - by sam
    I was wondering, seeing as Google and others keep trying to get more and more 'human' in terms of rating what's good and what's spam: is it known whether they take into account the size of a heading? I.e. an <h1> whose font size is 40px is going to speak a lot more to the user than a <p> whose font size is 14px. Similarly, does placement factor in? I.e. a 300-word article at the bottom of a landing page (not in the footer, but below the useful content) would just be there for SEO purposes. I know they look at whether you're doing things like text-indent: -9999px and white text on a white background, but what about these more borderline practices that have both legitimate uses and the potential to be spammy?


  • rel="nofollow" SEO impact

    - by Torez
    I saw a technique used where there was a block with three parts:
    1. Image (wrapped in an anchor tag)
    2. Heading (anchor tag with heading text)
    3. Paragraph (regular p tag with synopsis content)
    e.g.

        <li class="block">
            <a rel="nofollow" class="thumb" href="#"><img src="images/placeholder_service_thumbnail.jpg" alt="" /></a>
            <a class="h3" href="#">Good SEO Heading</a>
            <p>Pellentesque habitant morbi tristique senectus et netus et malesuada fames ac turpis egestas. Vestibulum tortor quam, feugiat vitae, ultricies eget, tempor sit amet, ante. Donec eu...</p>
        </li>

    The anchor tag wrapping the image has rel="nofollow" on it. The idea is that the user still has the ability to click the image and go to the details page, but the image link does not pass ranking value; only the heading-text link counts for that specific page. Q: Is this the correct approach? Does this even do anything? What is the best practice?


  • How to execute the MySQL command DELIMITER

    - by user5332
    Hi, I have a huge problem (for me): I need to execute the MySQL command DELIMITER | from PHP, but mysql_query fails with an error. I found that mysql_query doesn't support DELIMITER, because it is a command that only works in the mysql console. Yet when I open phpMyAdmin, there is an option on the SQL tab to change the DELIMITER, and it works, though I don't know how. Could you help me? How is it possible to change the delimiter from PHP? I need it before a CREATE TRIGGER statement that uses several ; characters which must not be interpreted as the end of the command.
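
    For what it's worth, DELIMITER is a client-side command, not SQL, and mysql_query() sends exactly one statement at a time, so the semicolons inside a trigger body need no special delimiter. A minimal sketch (trigger, table and column names are illustrative):

        <?php
        // send the whole CREATE TRIGGER statement as a single query --
        // no DELIMITER needed because nothing splits the string client-side
        $sql = "
        CREATE TRIGGER set_created_at BEFORE INSERT ON my_table
        FOR EACH ROW
        BEGIN
            SET NEW.created_at = NOW();
        END";
        mysql_query($sql) or die(mysql_error());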


  • Why are the custom campaign parameters in Google Analytics so long?

    - by Baumr
    Adding several Google Analytics custom campaign parameters can make URLs very long. For example, in Google's own examples:

        http://www.example.com/?utm_campaign=spring&utm_medium=referral&utm_source=exampleblog
        http://www.example.com/?utm_campaign=spring&utm_medium=email&utm_source=newsletter1
        http://www.example.com/?utm_campaign=spring&utm_medium=email&utm_source=newsletter1&utm_content=toplink

    Are there shorter alternatives that GA will pick up?


  • Google suddenly only indexes https and not http

    - by spender
    So all of a sudden, searches for our site "radiotuna" return an HTTPS result: https://www.google.com/?q=radiotuna#hl=en&safe=off&output=search&sclient=psy-ab&q=radiotuna&oq=radiotuna&gs_l=hp.12...0.0.0.3499.0.0.0.0.0.0.0.0..0.0.les%3B..0.0...1c.LnOvBvgDOBk&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&fp=177c7ff705652ec3&biw=1366&bih=602 We only use HTTPS for the download of two specific files (these URLs are resources used for the autoupdate functionality of an app we distribute). All other parts of the site should be served over HTTP. We wouldn't like to see any other traffic over HTTPS, nor any of our site links appear in search engines as HTTPS, and I'd like to address this. It seems that the following solutions are available:

        serve an HTTPS-specific robots.txt:

            User-agent: *
            Disallow: /

        and/or, at app level, 301 permanent redirect all requests (except the two above) to HTTP if they come in as HTTPS.

    My concern with the robots method is that if, for some reason, Google decided not to index the HTTP pages, disallowing the HTTPS pages might leave Google with nothing to index, with disastrous consequences for our ranking. This means I'm inclined to go with the 301 redirect. Any thoughts?
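
    A minimal sketch of the redirect option for Apache (mod_rewrite assumed; /downloads/ stands in for wherever the two HTTPS-only files live):

        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/downloads/
        RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]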


  • Shared hosting bandwidth limits

    - by mike
    I have a shared hosting account with a 20GB monthly bandwidth limit. I have exceeded my monthly limit, and according to my host my counter is never reset; they say they use a continuous 30-day counter. So, for example, I make payment on the 1st of each month; say I use 20GB in the last week of the month. My bandwidth counter is not reset on the 1st of the new month, and my bandwidth only becomes available again in the last week of the new month. Is this common practice among shared hosting companies? Sounds a bit shady to me. Surely my counter should be reset on the 1st of every month when I make payment, and 20GB of bandwidth should be available from the day payment is made?


  • How to let Google know about dynamic content?

    - by Yaniv
    I'm looking for the best practice to let Google know about a vast number of dynamically created pages. Let's say (I mean, dream) that I'm Facebook and I want Google to index all the users' posts. A sitemap.xml may be the answer, but sitemaps are limited to 50,000 URLs each. I know that I can create 500 sitemaps plus a sitemap index pointing to them, but the index is also limited: 25,000,000 URLs sounds like quite enough at the moment, but could cause problems in the future (e.g. Stack Overflow already has 3 million posts, so sitemaps are probably not the solution for them). The alternative is creating a page with paging and links to all the dynamic data; I guess this is what Stack Overflow did by creating this page: http://stackoverflow.com/questions So I think that option 2 is the answer, but it seems to me that sitemaps might have some added value. What should I do?
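
    For reference, a minimal sketch of a sitemap index in the sitemaps.org schema (file names are illustrative):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
            <sitemap>
                <loc>http://www.example.com/sitemap-posts-1.xml</loc>
            </sitemap>
            <sitemap>
                <loc>http://www.example.com/sitemap-posts-2.xml</loc>
            </sitemap>
        </sitemapindex>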


  • How can I create an SPF record on my 1and1.com hosted domain?

    - by tnorthcutt
    Emails from my domain (hosted at 1and1, and using Google Apps Premier edition) have sporadically been going to recipients' spam folders lately. I did some research, tested, and found out that I do not have an SPF record for my domain. According to this Google Support page, I need to create one. Following the steps on that page is easy until I get to #3: Create a TXT record containing this text: v=spf1 include:_spf.google.com ~all I see no way to create a "TXT record". (A screenshot of the admin panel was attached.)
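
    For illustration, this is what the finished record looks like in BIND zone-file notation (the 1and1 panel's form will differ; example.com stands in for the real domain):

        example.com.    3600    IN    TXT    "v=spf1 include:_spf.google.com ~all"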


  • 301 redirects in main navigation menu of WordPress website - is this okay for SEO?

    - by Lewis Bassett
    I want to allow a client to have a flexible way to configure the navigation menu for his WordPress website. To that end, I have created a parent page called "Navigation", which has child pages for each page to be displayed in the navigation menu. Those pages then get 301 redirected to the actual page that should be served. This means the client can create pages freely, and then set up redirects for them as and when needed. This is a really easy way for him to manage his main menu and it works well. From an SEO point of view, is this okay? Will the pages be indexed fine?
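
    A minimal sketch of the redirect half of such a setup, assuming each "Navigation" child page stores its destination in a hypothetical redirect_to custom field and the page template issues the redirect:

        <?php
        // page template for children of "Navigation" (illustrative, not
        // necessarily the client's mechanism): read the target, send a 301
        $target = get_post_meta(get_the_ID(), 'redirect_to', true);
        if (!empty($target)) {
            wp_redirect(esc_url_raw($target), 301);
            exit;
        }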


  • How to use different php.ini files for different VirtualHosts?

    - by gsingh2011
    I have my site and its staging subdomain running on the same CentOS machine running Apache. The subdomain is created using a VirtualHost, and I use it to find any bugs before I push to production. I want the php.ini file for the staging VirtualHost to be a development one, while the production site uses a production php.ini. How can I configure Apache to use different php.ini files? I don't want to use php_value/php_flag for everything; I'd rather just use the php.ini files I already have available. I've tried creating an .htaccess file that looks like this: SetEnv PHPRC /path/to/php.ini/directory This has no effect, as phpinfo() tells me it's still using /etc/php.ini. I've also tried setting PHPIniDir for both virtual hosts (www and staging), and it complains about seeing the directive twice.
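
    One likely explanation, offered as an assumption: under mod_php the ini file is read once at server startup, so per-vhost PHPRC and PHPIniDir cannot take effect. Running PHP as CGI/FastCGI lets each vhost use its own wrapper; a minimal sketch (paths are illustrative):

        #!/bin/sh
        # FastCGI wrapper for the staging vhost; PHPRC points at the
        # directory holding that vhost's php.ini
        PHPRC=/var/www/staging
        export PHPRC
        exec /usr/bin/php-cgi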


  • Very few visitors on Analytics: incorrect setting?

    - by Akaban
    For quite a long time Analytics has been making me crazy: I have a 2-year-old website, started with Aruba (an Italian provider) and then transferred to Hostgator. It's a WordPress blog plus a MyBB forum, and on both platforms I have the Analytics code in the footer. The problem is that the stats in Analytics are simply ridiculous compared to the numbers reported by Aruba (before) and Hostgator (now). I think the Aruba/Hostgator numbers are correct, simply because the number of daily users connected to the forum alone is higher than the Analytics numbers. I know it's a really confused request, but maybe you can help me understand what the problem is.


  • URL mod_rewrite

    - by Pritam Borkar
    I had an e-commerce website hosted at http://mydomain.com/beta for more than a year; eventually I decided to move the website to the root, http://mydomain.com. I had posted quite a lot of links to forums etc. while my site was hosted in the sub-directory /beta. Is there any way to use mod_rewrite so that all the old links I posted don't return as broken links, now that the site is no longer hosted in /beta and lives at the site root? I did read that mod_rewrite can resolve this issue, but also that it has to be done with care. One note: the site uses friendly URLs.
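
    A minimal sketch of the rewrite, assuming Apache with mod_rewrite enabled and an .htaccess at the new site root:

        RewriteEngine On
        # send any request for the old /beta/ paths to the same path at the root
        RewriteRule ^beta/(.*)$ /$1 [R=301,L]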


  • Cheaper alternatives to 99Designs.com (outsource CSS design)

    - by Chris Smith
    I'm designing my own website as a side project, and I want the site to look professional. (Read: not designed by a programmer.) I don't mind spending a little money to have a professional do it, but design sites like 99designs.com cost way too much (~$500+). Is there a cheaper (~$100 - $200) alternative for getting a designer to improve an existing site? (Things like updating the CSS or suggesting better ways of laying out the navigation.) Or is my best bet trying to pick up a freelancer on Craigslist?


  • .com.au backordered domain: Do I have to return it if the original owner asks for it?

    - by vDog
    I was contacted by the original owner of a domain asking me to give him the domain that I backordered a few weeks ago. The domain was abandoned for about 2 months before I bought it to eliminate competition for my client, but now I am faced with a threat that he will take the matter to court and to auDA (.au Domain Administration). Am I supposed to hand over a domain that I bought legally? I would like to know my rights in this situation.


  • Is there evidence that linking to quality, reputable and popular website helps with ranking?

    - by JVerstry
    Is there any evidence that linking to quality, reputable and popular external websites helps with ranking (directly or indirectly)? Is there an established correlation? Some posts on the web claim it, but without providing any evidence. It is known that if your website links to a bad neighborhood it will harm your reputation and authority, but does the reverse actually help? And does it matter whether the website is young or old in this case? Update: I have found this Moz video revealing a 0.04 correlation with ranking.


  • Can't get into the admin console after migrating to new server

    - by Emerson
    I migrated my WordPress blog to a new server, and everything seemed to be working fine until it started giving me this error when entering the admin area:

        Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 4864 bytes) in /home/neworder/public_html/blog/wp-admin/includes/plugin.php on line 729

    Line 729 has:

        $protected = array( '_wp_attached_file', '_wp_attachment_metadata', '_wp_old_slug', '_wp_page_template' );

    I had installed maintenance-mode, and I suspect this is what broke it. If I remove the plugin, it then gives another error:

        Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 19456 bytes) in /home/neworder/public_html/blog/wp-admin/includes/post.php on line 1158

    And that line has:

        $content .= '<p class="hide-if-no-js">' . esc_html__( 'Remove featured image' ) . '</p>';

    I tried to restore the blog file system from the old server and also to restore the database from the old server (twice), but it still gives me the same error. The blog itself seems to be working fine: http://blog.antinovaordemmundial.com/
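
    For what it's worth, 33554432 bytes is the 32M default PHP memory limit, so the failing lines are simply where the allocator happens to run out rather than the cause. A minimal sketch of one common fix, added near the top of wp-config.php (the 64M value is illustrative and assumes the host permits raising the limit):

        // raise WordPress's PHP memory ceiling from the 32M default
        define('WP_MEMORY_LIMIT', '64M');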


  • Strange SEO Problems

    - by Davey
    I have a Twitter account linked to my WordPress site so that each tweet becomes a new post. I was wondering why my SEO was hurting, and when I looked at the source I was seeing titles like this: Los Conchita&#8217;s on Prince has V&#8230; That is what the source lists as the title of the page. Has anyone else had this problem, and do you know why it is occurring? Thanks! The site is reviewathens.com
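
    &#8217; and &#8230; are the numeric HTML entities for a curly apostrophe and an ellipsis, left undecoded in the title. A minimal sketch of decoding the tweet text before it is used as a post title ($tweet_text is a hypothetical variable):

        <?php
        // decode numeric/named HTML entities so the title renders ' and ... properly
        $title = html_entity_decode($tweet_text, ENT_QUOTES, 'UTF-8');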

