Search Results

Search found 5380 results on 216 pages for 'webmasters'.


  • Is 301 redirect sufficient to solve WWW and HTTP/S duplication?

    - by Thomas Ojo
    I was reading this article - SEO preference for WWW or HTTP:// protocol redirection? Do www websites rank better than non-www? I have the same problem, but I need further help. What about https://? How will that be treated? Is a 301 redirect sufficient to solve the duplication? An SEO company tells me that, if possible, I should not have a redirect at all, but I don't see how that is feasible. Does a permanent redirect, properly done, have any effect on SEO?
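    For reference, a minimal sketch of the usual approach, assuming Apache with mod_rewrite; the hostname and the choice of https://www as the canonical form are placeholders here, not a recommendation from the linked article:

        # canonicalize both host and scheme with a single 301
        RewriteEngine On
        RewriteCond %{HTTPS} off [OR]
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

    A single permanent redirect of this kind is what is generally suggested for consolidating the www/non-www and HTTP/HTTPS variants of a URL.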


  • Are icon fonts bad for SEO?

    - by user359650
    Instead of using <img> tags for your icons, you can use icon fonts on <span> tags (which offer some advantages, such as not having to create a sprite and being able to scale icons up/down without degrading quality). However, by using an icon font you give up the <img> alt attribute (an attribute that can help you with SEO). There is a way to add text to the <span> and hide it, but I wonder whether this is recognized / penalized by Google (as it seems to go against the quality guidelines). Are icon fonts bad for SEO (i.e. by using icon fonts you give up the alt attribute)? Would inserting text in the icon font tag and hiding it with CSS (text-indent: -9999px) be recognized / penalized by Google?
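    For reference, a commonly suggested pattern (a sketch; the class names are invented) keeps the label in the markup but hides it visually with clipping rather than text-indent: -9999px:

        <span class="icon icon-search" aria-hidden="true"></span>
        <span class="visually-hidden">Search</span>

    with CSS along these lines:

        /* keep the text available to crawlers and screen readers,
           but take it out of the visual layout */
        .visually-hidden {
            position: absolute;
            width: 1px; height: 1px;
            overflow: hidden;
            clip: rect(0 0 0 0);
        }

    The clipping variant is usually preferred because large negative text-indent is the classic signature of hidden-text spam.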


  • Want Google to index redirect URLs

    - by Dave Goten
    I'm having issues with users who think that Google Search is the address bar. Some of the sites that link to my site use friendly addresses with 301 redirects to pages that have less friendly URLs. So, for example, entering www.foo.com/bar takes you to www.bar.com/page.php?some-parameters-and-utm-codes-etc. Usually this is done with a 301 redirect in order to carry the SEO value from foo.com over to bar.com and so on, which I believe is standard practice. However, lately more and more people have been searching for www.foo.com/bar instead of going to www.foo.com/bar directly, and because the page /bar is nothing more than a redirect, it has no SEO presence that I know of. Things I've thought of but haven't been able to test, because Google takes forever to update :) (and I'm lazy like that), include:

    Google sitemaps: having the linking sites enter their redirects as entries there. I could see this working if they were the top search entry all the time, and it might appear as a sitelink, but I don't know if that will make the URL itself show up in searches.

    Canonical tags on my pages pointing to the redirects they set up (sketched below). This is a nightmare in itself because of the nature of my pages: one week www.foo.com/bar might go to www.bar.com/pageA.php, the next it might go to www.bar.com/pageB.php, and having to remember to take the canonical tag off pageA so that it doesn't get confused with pageB would be a pain.

    Using 302 redirects -.-

    So I guess the question is: does anyone have any experience or knowledge about this? What should I do to make www.foo.com/bar show up when someone 'searches' for this redirect URL?
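    On the canonical idea, the tag on www.bar.com/pageA.php would look something like this (hostnames taken from the example above); whether Google honors a cross-domain canonical pointing at a URL that is itself just a redirect is exactly the untested part:

        <link rel="canonical" href="http://www.foo.com/bar">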


  • Responsive Design: Which Framework Should I Use? CSS3 & HTML5

    - by Jayhal
    I've been looking for a suitable set of HTML5/CSS3 foundation files to start new projects on. I started off piecing together my own files, but I believe I might be better served by finding a solid, fairly compatible (with me) CSS3/HTML5 framework and then tweaking whatever doesn't suit my own process. I'd love to find something responsive that covers layout, type (horizontal and vertical baselines), form and interface components, and cross-browser issues, and that is preferably built on something more than just a simple CSS reset, but that does rebuild elements consistently across browsers for a clean working slate. Extra features like polyfills are great, as are good documentation and examples. So far, off the top of my head, I know of:

        Skeleton
        1140 Grid
        320 & Up (plus BP)
        HTML5 Boilerplate 2.0 and Mobile
        Inuit.css
        Less Framework
        Fluir
        Perkins.Less
        A few WP themes

    Are there any great ones I don't know about? I work a lot in WP, and something that is easily incorporated (but also stands alone) is ideal. Plugins and a wide feature set, while maintaining the ability to cut things down when needed (flexibility), are a big plus, as is a fast learning curve, since I want to start using whatever I find immediately. What are some of the better options you might recommend? Systems or scripts, plugins, and other related tools are also welcome. Thanks!


  • Unified data source for K2-installed Joomla websites

    - by Özkan ÖZLÜ
    I am responsible for a few of my organization's websites, all running Joomla! 2.5.9 on the same server, with the K2 component for content management. There is a general website (let's say general.org) whose 'Staff' page shows everyone who works at the organization, across all departments. Some of those people and their content also appear on a department website: on education.general.org, the 'Staff' menu item shows only the people who work in the education department. Each website has its own database, which means each person has a separate user account on each site, and a modification on one site does not affect the others. If someone on the education staff changes his profile picture on the education website, he also has to do it on the general website; someone working in two departments has to edit his data three times. Is it possible to merge the records for all websites? In other words, I want everyone to insert/update their data on the general website and have the other websites update automatically.
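    One common approach (a sketch only, untested against K2's own tables) is to point every site at the same database and table prefix in each site's configuration.php, so the sites literally share the #__users and content tables:

        // excerpt from each site's configuration.php (Joomla 2.5)
        public $dbtype   = 'mysqli';
        public $host     = 'localhost';
        public $db       = 'shared_joomla';   // same database on every site
        public $dbprefix = 'gen_';            // same prefix, so #__users is shared

    Sharing everything this way has side effects (extension settings, sessions and per-site options all live in the same tables), so a more surgical alternative is a small user plugin that synchronizes just the user and profile records between the databases.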


  • Source code not matching uploaded HTML file

    - by benhowdle89
    I'm not sure if this is the right place to ask, but I'm having a hugely frustrating problem with Coda and my website (I'm not sure which one is causing the issue). I'm using Coda to make changes to my website; Coda uses built-in FTP to save changes to your web page, so when you hit Save, it uploads the new file. I've been using Coda for months and never had a problem until now. I am making changes to the HTML of my index.php and hitting Save. It successfully uploads the file, but no changes are reflected in the source code in ANY browser. I even logged into cPanel on my website (i.e. www.example.com:2082) and looked at the file: the changes have been made successfully. But in the actual webpage's source code in the browser, no changes. I have tried adding which made no difference. Interestingly, when I make changes to style.css the changes are instant. I have emptied the cache on all of my browsers, but I'm still having the issue. Does this sound like a Coda problem, or has anyone heard of such a thing?
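    That pattern (stale .php output while .css updates instantly) often points at a server-side or proxy cache rather than at Coda. As a diagnostic, a sketch like the following at the top of index.php should rule browser and intermediary caching in or out (plain PHP, nothing Coda-specific, marker text invented):

        <?php
        // ask every cache along the way not to reuse the old copy
        header('Cache-Control: no-store, no-cache, must-revalidate');
        header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');
        echo 'build: ' . date('c'); // if this timestamp updates, caching was the culprit
        ?>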


  • Tab navigation and double content

    - by Guisasso
    I have a website that uses tabs to navigate between pages. For example, page A displays A as the active tab with B and C as background tabs. If the visitor arrives at the website via page B, I would also like to display page D, but not A and C. Question: I know I could just create, say, an index2 for B, so that when the visitor gets to B from A I display tabs A, B and C, and an index1 when the visitor gets to B from D. Is that bad practice? I know duplicate content isn't good, but how else can or should I approach this problem? The tab navigation I designed uses <li> elements and an id, defined on the <body> tag, to mark the active tab.
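    For reference, the body-id technique described above usually looks something like this (ids and class names invented for illustration); the same selector trick can also show or hide whole tabs per section, which avoids duplicating pages like index1/index2:

        <body id="page-b">
          <ul class="tabs">
            <li class="tab-a"><a href="a.html">A</a></li>
            <li class="tab-b"><a href="b.html">B</a></li>
            <li class="tab-d"><a href="d.html">D</a></li>
          </ul>
        </body>

    and in the stylesheet:

        /* highlight B's tab only when the body says we are on page B */
        #page-b .tab-b { font-weight: bold; }
        /* hide tabs that don't belong to this section */
        #page-b .tab-a, #page-b .tab-d { display: none; }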


  • Correct microdata and/or microformats for real estate listings?

    - by Ernests Karlsons
    Given that I run a real estate rentals listing website, what would be the correct microdata or microformats for the listing pages? There is the usual data: address, photos, price, start date, possible end date, the person who is renting it out, a list of amenities, a description, etc. Are there also microformats/microdata that can be used on the listing summary page (e.g., the page that displays all listings in a particular city)?
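    There is no real-estate-specific schema.org type that I know of, so a sketch would combine the generic types; the type and property names below are real schema.org vocabulary, but the pairing of Offer with Residence is my assumption, not an official recipe:

        <div itemscope itemtype="http://schema.org/Offer">
          <span itemprop="name">Two-room flat near the centre</span>
          <span itemprop="price">450</span>
          <meta itemprop="priceCurrency" content="EUR">
          <div itemprop="itemOffered" itemscope itemtype="http://schema.org/Residence">
            <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
              <span itemprop="streetAddress">Brivibas iela 1</span>,
              <span itemprop="addressLocality">Riga</span>
            </div>
          </div>
        </div>

    For the city-level summary page, schema.org/ItemList is the usual suggestion for marking up a list of such listings.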


  • No description for any page on the website is available in Google despite robots.txt allowing crawling

    - by Abhijit
    I seem to have the weirdest Search Engine Optimization issue. I've asked the IT folks at my university, I've asked people on the Joomla forums, and I've been trying to sort it out with Google Webmaster Tools for more than two months, to little avail. I want to know if I have some blatantly wrong configuration somewhere that is preventing search engines from indexing the site. I noticed a similar issue with another website I searched for online (ECEGSA - The University of British Columbia at gsa.ece.ubc.ca), which makes me believe this might be something other people are looking for an answer to. Here are the details: the website in question is http://gsa.ece.umd.edu/. It runs Joomla 2.5.x (latest). The site has been up since around mid-December 2013, and I noticed right from the get-go that it was not being indexed correctly on Google. Specifically, I see the following message when I search for the website on Google: "A description for this result is not available because of this site's robots.txt - learn more." The thing is, from December until around March I used the default Joomla robots.txt file, which is:

        User-agent: *
        Disallow: /administrator/
        Disallow: /cache/
        Disallow: /cli/
        Disallow: /components/
        Disallow: /images/
        Disallow: /includes/
        Disallow: /installation/
        Disallow: /language/
        Disallow: /libraries/
        Disallow: /logs/
        Disallow: /media/
        Disallow: /modules/
        Disallow: /plugins/
        Disallow: /templates/
        Disallow: /tmp/

    Nothing there should stop Google from indexing my website. Even more confusingly, in Google Webmaster Tools, under the "Blocked URLs" tab, when I try many of the links on the site, they all show up as "Allowed". I then tried adding a sitemap and referencing it in the robots.txt file. That did not help: same search result, same behavior in the "Blocked URLs" tab. Additionally, the "Sitemaps" tab now reports an error for several links saying "URL is robotted out" - yet I tried those exact links under "Blocked URLs" and they are allowed! I then tried deleting the robots.txt file. No use; same exact problem. At this point I cannot give a rational explanation for why this is happening, and neither can anyone in the IT department here or on the Joomla forums. Based on what I've explained, does it seem that I have somehow set something incorrectly in robots.txt, in .htaccess, or somewhere else?


  • Parse text file on click - and then display

    - by John R
    I am thinking about a methodology for rapid retrieval of code snippets. I imagine an HTML table set up something like this:

                 one        two       ...  ten
        one                 oneTwo()       oneTen()
        two      twoOne()                  twoTen()
        ...
        ten      tenOne()   tenTwo()

    When a user clicks a function in this HTML table, a snippet of code is shown in another div tag, or perhaps a popup window (I'm open to different solutions). I want to maintain only one PHP file, named utilities.php, containing a class called 'util'. This file & class will hold all the functions referenced in the above table (it is also used on various projects and is functional code). A key idea is that I do not want to update the HTML documentation every time I write or update a function in utilities.php. I should be able to click a function in the table and have PHP open the utilities file, parse out the appropriate function, and display it in an HTML window. Questions: 1) I will be coding this in PHP and JavaScript, but am wondering if similar scripts are available (for all or part) so I don't reinvent the wheel. 2) Quick & easy Ajax suggestions are appreciated too (I'll probably use jQuery, but am rusty). 3) What methodology would you suggest for parsing the functions out of the utilities.php file (I'm not too good with regex)?
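    On question 3, one way to avoid regex entirely is PHP's Reflection API, which can report exactly where a method is defined. A sketch, assuming utilities.php defines the 'util' class as described; the 'fn' request parameter is invented, and a real version should whitelist it against util's actual method names:

        <?php
        require_once 'utilities.php';

        // return the source of one method of a class, using Reflection
        // (instead of regex) to find its start and end lines
        function getMethodSource($class, $method) {
            $ref   = new ReflectionMethod($class, $method);
            $lines = file($ref->getFileName());
            $start = $ref->getStartLine() - 1;    // getStartLine() is 1-based
            $count = $ref->getEndLine() - $start; // through the closing brace
            return implode('', array_slice($lines, $start, $count));
        }

        $fn = isset($_GET['fn']) ? $_GET['fn'] : 'oneTwo';
        echo '<pre>' . htmlspecialchars(getMethodSource('util', $fn)) . '</pre>';
        ?>

    Serving this from a small endpoint and fetching it with jQuery's load() into the display div would cover the Ajax part as well.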


  • Free, specific Ip2Location Database

    - by Andresch Serj
    I am searching for a free database (like a regularly updated XML or CSV file) that maps IP addresses to specific locations. I want more information than just the country: some sort of region or city reference, even if it ends up being a number that makes no sense to me. It doesn't have to be perfectly accurate or always up to date either. It is just to distinguish between user groups, not to monitor or spy on them.


  • What are some potential issues in blocking all incoming requests from the Amazon cloud?

    - by ElHaix
    Recently I, along with the rest of the world, have seen a significant increase in what appears to be scraping from Amazon AWS-related sources. So, simply put, I blocked all incoming requests from the Amazon cloud for our hosted application. I know that some good services/bots are now hosted on the cloud, and I'm wondering if certain IP addresses should be allowed, as they may gather data that would ultimately benefit our site's SEO rankings. -- UPDATE -- I added a feature to block requests from the following hosts: Amazon, Softlayer, ServerDeals, GigAvenue. Since then, I have seen my network traffic decrease (monitored by network-out bytes; average operation is around 10,000,000 bytes), with the drop starting last week when the blocking went in. I've since removed the blocks and will see what the outcome is.
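    For reference, range blocking of this kind is often done in Apache along these lines (a sketch for Apache 2.2 syntax; the CIDR range below is only an example - Amazon publishes its current ranges, and they change regularly):

        # allow everyone except one example EC2 address range
        Order Allow,Deny
        Allow from all
        Deny from 54.144.0.0/14

    Whitelisting known-good crawlers would then mean adding specific Allow lines ahead of the broader Deny ranges.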


  • Switch to https

    - by Mike
    I'm looking to use an .htaccess file with mod_rewrite to switch the protocol from http:// to https:// when someone hits my website. For instance, once someone goes to http://www.mywebsite.com/, I'd like the browser to switch to https://www.mywebsite.com/. The same goes for http://mywebsite.com/ - https://mywebsite.com/. This is the code I've been using, and I've experienced some odd things, so if anyone can tell me whether this is the right way to do it, or has a better way, please share. Thanks in advance.

        RewriteEngine On
        RewriteCond %{SERVER_PORT} !=443
        RewriteRule ^(.*)$ https://www.ebaillv.com/$1 [R=301,L]
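    One common source of odd behavior with the port test is a proxy or load balancer terminating SSL in front of Apache, so port 443 never shows up at the server itself. A variant worth trying (a sketch; the hostname is the placeholder from the question, not the real domain in the rule above):

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://www.mywebsite.com/$1 [R=301,L]

    Checking %{HTTPS} instead of the port also keeps the rule working if the site ever listens for HTTPS on a non-standard port.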


  • SEO disasters moving domain for a high traffic website?

    - by chrism2671
    We're looking at moving our website from http://www.wikijob.co.uk to http://www.wikijob.com/uk as we spread our wings internationally. Our .co.uk website has a PR6 and receives around half a million visitors a month, 40% of them international. The wikijob.com domain, while registered for a while, has not been used or promoted. I am concerned that moving domain could really haemorrhage our traffic and result in a loss of goodwill from Google, even if we use a 301; but equally, if we could transfer that PageRank to the .com domain, it would give us a massive head start around the world. Should we do it, or should we start over with .com and leave .co.uk as is?
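    If the move does go ahead, the standard mechanics are a per-URL 301 from the old host to the new path, plus a change-of-address notification in Google Webmaster Tools. A sketch, assuming Apache in front of the .co.uk site:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?wikijob\.co\.uk$ [NC]
        RewriteRule ^(.*)$ http://www.wikijob.com/uk/$1 [R=301,L]

    Mapping each old URL to its exact new counterpart, rather than redirecting everything to the new front page, is what preserves most of the accumulated link value.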


  • jQuery/AJAX on old Computers/Browsers

    - by Andresch Serj
    I am working on a platform that will have a lot of users in so-called "developing countries", so many of them will be using old computers and old browsers in tiny internet cafes. We want to give them a good user experience and make sure the website loads as fast as possible. The problem is that while jQuery/AJAX can save a lot of requests and time, it also brings along problems of its own:

    - Will the computers be powerful enough to deal with the client-side scripts?
    - Will the old browsers handle jQuery?

    Does anyone have experience with this sort of problem, or know of an article on the topic?
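    One approach that usually comes up here is progressive enhancement: ship a page that works with no script at all, then load the heavier layer only for browsers that can cope. A sketch (the capability test and the bundle file name are placeholders):

        <script>
        // load the jQuery-based enhancements only where the browser
        // supports modern DOM APIs (this test rules out IE8 and older)
        if (document.addEventListener) {
            var s = document.createElement('script');
            s.src = '/js/enhancements.js'; // hypothetical bundle
            document.getElementsByTagName('head')[0].appendChild(s);
        }
        </script>

    Old machines then get the plain, fast page, while capable browsers get the AJAX layer on top.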


  • MCrypt Module, Rijndael-256

    - by WernerCD
    An outside company is redoing our company intranet. During some basic usage I discovered that the "User Edit" screens, with their "Password: *" boxes, show the password in plain text, using type=password on the text box to "hide" it. The passwords are not stored in the database as plain text; they are stored encrypted with the Rijndael-256 cipher using the mcrypt module. I know that if I hash a password with SHA*, the password is unrecoverable, since hashing is one-way. Is the same true of mcrypt's Rijndael-256 encryption? Shouldn't a stored password be unrecoverable? Are they blowing smoke up my rear, or just using the wrong technology?
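    Unlike a one-way hash, Rijndael-256 via mcrypt is reversible by design for anyone holding the key - which is how an edit screen can display the original password at all. A sketch demonstrating the round trip (the key and IV are dummies, and a fixed IV is for illustration only):

        <?php
        $key = str_repeat('k', 32);   // dummy 256-bit key
        $iv  = str_repeat("\0", 32);  // rijndael-256 in CBC mode uses a 32-byte IV
        $ct  = mcrypt_encrypt(MCRYPT_RIJNDAEL_256, $key, 'secret', MCRYPT_MODE_CBC, $iv);
        $pt  = rtrim(mcrypt_decrypt(MCRYPT_RIJNDAEL_256, $key, $ct, MCRYPT_MODE_CBC, $iv), "\0");
        echo $pt; // prints "secret" - the original password comes back
        ?>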


  • How to determine the amount to spend per phrase on Adwords research?

    - by Anonymous -
    My company would like to start a PPC advertising campaign. While I understand the concept and how to set everything up from a technical point of view, this is something I've never done before. Logically, we'd like to test a wide range of keywords that we think would lead to conversions, put together through brainstorming and with some help from Google's External Keyword Tool. A sub-question while I remember: am I correct in thinking that, in Google's keyword tool, keywords with low competition yet high monthly searches are good picks, since fewer advertisers should mean a lower bid per click?

    Is there a common benchmark or process for a first round of keyword testing? Should we wait for 100 clicks on each keyword, see which ones have led to the most sales (or rather, sales that are sustainable at that keyword's cost per click), then drop the ones that aren't converting and move that budget onto the converting keywords? We realistically have a few hundred keywords/phrases we would like to test, but spending $100 per keyword/phrase would make this quite an expensive test. It would be nice to spend $5-10 per phrase, but I don't think the sample size would be large enough to determine anything reliably. Another approach might be to set up all the keywords and keep whichever bring the most sales within x hours/days. What is the common procedure with things like this? I know there is a plethora of companies that specialize in exactly this, but we anticipate doing a lot of it in the future, so it would make sense to do it in-house if at all possible.


  • Lazyloading images and SEO

    - by surpr
    We lazy-load images with a noscript fallback. Should I expect any damage in the SERPs? The site is completely thumbnail-based. Also, should I put a smaller image in the noscript fallback to increase crawlability? We have nearly a million thumbnails, so it's a decision I'm hesitant about. The reason I'm thinking about this in the first place is that we're increasing thumbnail size by about 50%, which will add roughly 10% to the page size.
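    For reference, the pattern in question typically looks like this (attribute names vary by plugin - jQuery Lazy Load uses data-original, for instance; this is a generic sketch with invented paths):

        <img class="lazy" src="placeholder.gif" data-src="/thumbs/1234.jpg" alt="Thumbnail 1234">
        <noscript>
            <!-- what crawlers and script-less browsers see -->
            <img src="/thumbs/1234.jpg" alt="Thumbnail 1234">
        </noscript>

    Since the noscript copy is what the crawler indexes, pointing it at a smaller file than the one users see is the trade-off being weighed here.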


  • How can I redirect everything but the index as 410?

    - by Mikko Saari
    Our site shut down and we need to serve users a 410. We have a small one-page replacement site set up on the same domain and a custom 410 error page. We'd like all page views to be answered with a 410 and sent to the error page, except for the front page, which should serve the new index.html. Here's what is in the .htaccess:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule !^index\.html$ index.html [L,R=410]

    This works, except for one thing: if I type the bare domain name, I get the 410 page. With www.example.com/index.html I see the index page as I should, but plain www.example.com gets the 410. How can I fix this?
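    A request for the bare domain arrives at mod_rewrite as an empty path, which the pattern !^index\.html$ matches, hence the 410. One possible fix (a sketch, untested) is to let the empty path through as well; note that with R=410 the substitution is ignored, so "-" is the conventional placeholder:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        # allow both "" (the bare domain) and "index.html"; 410 everything else
        RewriteRule !^(index\.html)?$ - [L,R=410]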


  • SEO tool is telling me title, description and keywords don't exist, but they do. Where is the problem?

    - by DaveDev
    I'm using the following tool to analyse how "optimal" a site I'm working on is for search engines: http://tools.seobook.com/general/spider-test/. I enter the URL for the site - http://ftmsuat.moneymate.com - into the search bar, and it returns a breakdown of the contents of the page. I'm a little confused by what I see, though. According to the results, the page doesn't have a title, description or keywords. But if you check the source of the page, those elements are definitely there. So I'm wondering: which is wrong, seobook.com or my page?


  • Updating Google sitemap for mobile

    - by dimo414
    I have a series of utilities to generate Google sitemaps for my whole site. These files are massive and slow to build. We want to start telling Google these pages are mobile-crawlable too, by adding them to mobile sitemaps, but the documentation is unclear on whether I need physically different files for my mobile URLs than for my normal ones. If this is my current sitemap:

        <?xml version="1.0" encoding="UTF-8" ?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://mobile.example.com/article100.html</loc>
          </url>
        </urlset>

    can I simply change it to:

        <?xml version="1.0" encoding="UTF-8" ?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
          <url>
            <loc>http://mobile.example.com/article100.html</loc>
            <mobile:mobile/>
          </url>
        </urlset>

    or do I need to create new files with the additional markup, alongside my existing files?


  • Is it costly to leave the Console and Script features enabled in Firebug?

    - by parisminton
    For some time now, I've run Firebug constantly enabled to do quick DOM inspections, leaving the Console and Script panels disabled. I'm just starting to use these two features so I don't have to keep using alerts for testing and debugging: I enable them while I use them and turn them back off when I'm done. I'd like to know whether these particular features can slow things down enough that they shouldn't be left on round the clock. Do they slow down page loads, use inordinate chunks of memory, or something like that? I don't see anything about it in the Firebug wiki.


  • Your Most Popular Sites Screen in IE 10 - Icons not appearing

    - by MJWadmin
    We use the following code to add icons for favicon, tablets, smartphones, Windows 8 tiles and the like:

        <link rel="apple-touch-icon" href="apple-touch-icon.png">
        <link rel="shortcut icon" type="image/x-icon" href="/favicon.ico"/>
        <link rel="apple-touch-icon-precomposed" sizes="144x144" href="apple-touch-icon-144x144-precomposed.png">
        <link rel="apple-touch-icon-precomposed" sizes="114x114" href="apple-touch-icon-114x114-precomposed.png">
        <link rel="apple-touch-icon-precomposed" sizes="72x72" href="apple-touch-icon-72x72-precomposed.png">
        <link rel="apple-touch-icon-precomposed" href="apple-touch-icon-precomposed.png">
        <meta name="msapplication-TileImage" content="apple-touch-icon-144x144-precomposed.png"/>
        <meta name="msapplication-TileColor" content="#17151a"/>

    Unfortunately this doesn't seem to work for IE9 and IE10's "your most popular sites" screen, and Google searches have been unenlightening. Stack uses <link rel="apple-touch-icon" href="apple-touch-icon.png">, which seems to work for it, but not for us. Any clues to a solution appreciated.


  • Does the EU cookie law apply to an EU site that is hosted outside of the EU?

    - by mickburkejnr
    I have been reading up on the EU cookie law, and have also had in-depth conversations about it with my girlfriend, who is a solicitor/lawyer, and with colleagues while building websites. While we are now working towards implementing a way to abide by the law, I have thought of something that no one really knows the answer to, and it has caused a few arguments. It's my understanding that any website in the EU must abide by these cookie laws, which is understandable. However, say I have a .co.uk or .eu domain name pointing to a website that is hosted in America: do I still need to abide by the EU laws even though the website is hosted outside the EU? One person I asked said that because the domain name is .co.uk or .eu (a European TLD), the website is still accountable under EU law. Another said that because the actual website is hosted outside the EU, it doesn't have to bother with the law at all.


  • File access forbidden in htpasswd

    - by Nerd-Herd
    I have been using the htpasswd generated in this question, and it seemed to be working well until recently. Since yesterday, I have not been able to access the newest file created in the ChatLogs folder (named 10_07_2012.txt). The server returns a 403 Forbidden error saying: "Forbidden. You don't have permission to access /ChatLogs/2012/07/10_07_2012.txt on this server." I am still able to access older files (up to 9 July 2012). At first I thought it might be file permissions, but they are the same as on the other nine files in the folder. What could be the problem? Please help.

