Search Results

Search found 5380 results on 216 pages for 'webmasters'.

  • Link tags in iframe widget

    - by john Smith
    I run a rating community site and I'm offering small iframe widgets that show the average rating and some other info. Does it make sense (for visibility, SEO) to add link tags to the widget's head, like: <link rel="alternate" type="application/rss+xml" title="RSS 2.0" href="rssfeed" /> <link rel="index" title="main-profile" href="main-profile"> to create a logical association between the widget and the pages it relates to? How would you do this?

  • How to handle URLs with diacritic characters

    - by user359650
    I am wondering how to handle URLs which correspond to strings containing diacritics (á, ü, ...). I believe what we're mostly seeing are URLs where diacritic characters were converted to their closest ASCII equivalent, for instance Rånades på Skyttis i Ö-vik converted to ranades-pa-skyttis-i-o-vik. However, depending on the language, such a conversion might be incorrect. For instance in German, ü should be converted to ue and not just u, as seen in the URL below, which represents the string Bayern München as bayern-muenchen: http://www.bundesliga.de/en/liga/clubs/fc-bayern-muenchen/index.php I've also noticed that browsers can render non-ASCII characters when they are percent-encoded in the URL, which is the approach Wikipedia has chosen, for instance http://de.wikipedia.org/wiki/FC_Bayern_M%C3%BCnchen, which the browser renders with the ü displayed. Therefore I'm considering the following approach for creating URL slugs:
    (1) convert strings while replacing non-ASCII characters with their recommended ASCII representation: Bayern München → bayern-muenchen
    (2) also convert strings to percent encoding: Bayern München → bayern-m%C3%BCnchen
    (3) create a 301 redirect from version (1) to version (2)
    Version (1) URLs could be used for marketing purposes (e.g. mywebsite.com/bayern-muenchen), but the URLs that would end up being displayed in the browser bar would be version (2) URLs (e.g. mywebsite.com/bayern-münchen). Can you foresee particular problems with this approach? (Wikipedia is not doing it and I wonder why, apart from the fact that they don't need to market their URLs.)
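
    A reader's sketch of the approach above (not from the original post): version (1) via a transliteration table, version (2) via percent encoding, and a 301 from (1) to (2). The tiny character map and the helper names are only illustrative.

        <?php
        // Version (1): ASCII slug using a (deliberately tiny) transliteration map.
        function ascii_slug($title) {
            $map  = array('ü' => 'ue', 'ö' => 'oe', 'ä' => 'ae', 'å' => 'a', 'é' => 'e');
            $slug = strtr(mb_strtolower($title, 'UTF-8'), $map);
            return trim(preg_replace('/[^a-z0-9]+/', '-', $slug), '-');
        }

        // Version (2): keep the non-ASCII letters and percent-encode them.
        function utf8_slug($title) {
            $slug = mb_strtolower($title, 'UTF-8');
            $slug = trim(preg_replace('/[^\p{L}\p{N}]+/u', '-', $slug), '-');
            return rawurlencode($slug); // browsers display this as "bayern-münchen"
        }

        echo ascii_slug('Bayern München'); // bayern-muenchen
        echo "\n";
        echo utf8_slug('Bayern München');  // bayern-m%C3%BCnchen

        // 301 from version (1) to version (2), e.g. in the handler for the ASCII URL:
        // header('Location: /' . utf8_slug('Bayern München'), true, 301);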

  • 301 redirects in main navigation menu of WordPress website - is this okay for SEO?

    - by Lewis Bassett
    I want to allow a client to have a flexible way to configure the navigation menu for his WordPress website. To that end, I have created a parent page called "Navigation", which has child pages for each page to be displayed in the navigation menu. Those pages then get 301 redirected to the actual page that should be served. This means the client can create pages freely, and then set up redirects for them as and when needed. This is a really easy way for him to manage his main menu and it works well. From an SEO point of view, is this okay? Will the pages be indexed fine?
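
    A minimal sketch (not from the post) of one way such redirects could be wired up in WordPress, assuming they are done in the theme's functions.php rather than via a redirect plugin; the meta key _redirect_target is hypothetical.

        <?php
        // Redirect any navigation child page to its real target with a 301.
        add_action('template_redirect', function () {
            if (!is_page()) {
                return;
            }
            $target = get_post_meta(get_the_ID(), '_redirect_target', true);
            if ($target) {
                wp_redirect($target, 301); // permanent redirect, so search engines follow the target
                exit;
            }
        });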

  • A simple message room

    - by webbyJoe
    Can anyone recommend a simple message room (not a chat room) which I can use for private communication between my users? My idea is to grant some users (2-3 at the most) a specific privilege to talk privately in a message room; none of them would be an administrator there. I need these features:
    - admin panel for adding the users allowed to post messages in the room
    - room invisible to anyone except those users
    - filtering of disallowed words
    - Ajax-enabled so that replies appear immediately
    - other typical message room features
    I have Linux hosting, so a PHP message room would be the best option. I thought of using a forum for this, but it's too much work, as a forum is public by nature and I need something private by nature. Any ideas? Thanks in advance.

  • Wordpress Multisite and Google Analytics in subfolders with mapped domains

    - by David
    I have a WordPress multisite using subfolders. The subfolders are mapped to domains, which are set as primary. I'm using the 'Google Analytics Multisite Async' code to track things. From what I can see it's tracking the sites fine (each site gets page hits in Google Analytics), except for the original site in the multisite, whose content overview lists the other domains and the traffic they're getting alongside the original domain's traffic. I don't want to track any traffic for my original site other than what actually goes to it, i.e. I don't want it tracking my other sites in the multisite. For example, domain1.com is my original site and I have lots of other sites in the multisite, let's say domain2.com and domain3.com. In the content overview in Analytics it lists, say, domain2.com as content. Can I tell it to filter these out somehow, either in Analytics or within WordPress? Hopefully I've explained that clearly!
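
    One possible workaround, sketched below and not taken from the post: give every blog in the network its own Google Analytics web property, so the mapped domains never report into the original site's profile. The property IDs and the template snippet are hypothetical; the loader is the standard legacy asynchronous ga.js snippet.

        <?php
        // In the multisite theme (e.g. header.php): pick a property ID per blog.
        $property_ids = array(
            1 => 'UA-XXXXXXX-1', // original site, domain1.com
            2 => 'UA-XXXXXXX-2', // domain2.com
            3 => 'UA-XXXXXXX-3', // domain3.com
        );
        $blog_id = get_current_blog_id();
        $ua      = isset($property_ids[$blog_id]) ? $property_ids[$blog_id] : 'UA-XXXXXXX-1';
        ?>
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', '<?php echo $ua; ?>']);
          _gaq.push(['_trackPageview']);
          (function() {
            var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
          })();
        </script>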

  • Is there a modern (eg NoSQL) web analytics solution based on log files?

    - by Martin
    I have been using AWStats for many years to process my log files, but it lacks many possibilities (like cross-domain reports) and I hate being stuck with extra fields I created years ago. Anyway, I am not going to continue using this script. Is there a modern Apache log analytics solution based on modern storage technologies like NoSQL, or at least one somehow ready to cope with large datasets efficiently? I am primarily looking for something that generates nice sortable and searchable output with a focus on web analytics, before I have to write my own frontends (so graylog2 is not an option). This question is purely about log-file-based solutions.

  • Google's process for publishing/modifying pages [closed]

    - by Glenn Dayton
    I'm assuming that a group of people at Google have control of certain sections of google.com, but how does Google make sure that employees don't accidentally or intentionally sabotage the website? Does Google use Adobe Contribute or some similar product for sharing/publishing the website? Do employees use WebDAV, FTP, SFTP, or SSH to publish the site? Since Google has hundreds of thousands of servers, it probably takes some time for them to update. Do they transmit the new copy of the website to all servers before publishing it at once? This question does not apply to Google editing a database and having a page reflect the database's changes. It applies to employees editing the source code and/or back end of the site.

  • Loading main javascript on every page? Or breaking it up to relevant pages?

    - by Kyle
    I have a 700kb decompressed JS file which is loaded on every page. Before, I had 12 JavaScript files on each page, but to reduce HTTP requests I combined them all into one file. This file is ~130kb gzipped and is served with gzip. However, on the client it is still unpacked and loaded on every page. Is this a performance issue? I've profiled the JavaScript with the Firebug profiler but did not see any issues. The problem/illusion I am facing is that there are jQuery libraries in that file that are sometimes not used on the current page. For example, jQuery DataTables is 200kb compressed and is only loaded on 2 of my website's pages. Another is jqPlot, and that is another 200kb. I now have 400kb of excess code that isn't executed on 80% of the pages. Should I leave everything in one file? Should I take out the jQuery libraries and load only the relevant JS on the current page?
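
    A minimal sketch (not from the post) of the "only load relevant JS" option: keep one core bundle on every page and add the heavy libraries only where they are used. The file names and the $page variable are hypothetical.

        <?php
        $page = 'reports'; // hypothetical identifier for the current page

        $scripts = array('/js/core.min.js'); // jQuery + plugins used site-wide

        if ($page === 'reports') {
            $scripts[] = '/js/jquery.dataTables.min.js'; // ~200 KB, only two pages need it
        }
        if ($page === 'charts') {
            $scripts[] = '/js/jqplot.min.js';            // ~200 KB, chart pages only
        }

        foreach ($scripts as $src) {
            echo '<script src="' . htmlspecialchars($src) . '"></script>' . "\n";
        }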

  • Adding a regular expression in PHP does not work

    - by John Smiith
    I added the pattern ([a-zA-Z0-9\_\-]+), but it does not work. I want to include all CSS files; is there any other way to do this? My code:
    css.php:
    header("Content-type: text/css");
    $css = array(
        '([a-zA-Z0-9\\_\\-]+).css',
    );
    foreach ($css as $css_file) {
        $css_get = file_get_contents($css_file);
        echo $css_get;
    }
    call.php:
    <link href="css.php" rel="stylesheet" type="text/css" />
    I also want to rewrite css.php to css.css so the public sees css.css instead of css.php. How can I do that using a PHP script?
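
    A sketch of one way to do what the question is after (not the poster's code): file_get_contents() expects a filename, not a regular expression, so globbing for *.css and concatenating the matches is a simpler route. The .htaccess rewrite mentioned in the final comment is an assumption about an Apache host.

        <?php
        // css.php — concatenate and serve every .css file in this directory.
        header('Content-Type: text/css');

        foreach (glob(__DIR__ . '/*.css') as $css_file) {
            readfile($css_file);
            echo "\n";
        }

        // To let the public request css.css instead of css.php, one option is an
        // Apache rewrite in .htaccess:  RewriteRule ^css\.css$ css.php [L]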

  • What is the best website width? [on hold]

    - by Salvis Dišlers
    What is the best website width? I can't decide between a 1200px and a 1236px max-width, and whether a 300px sidebar is enough or it should be 320px. I personally like a wide website; my current one is 1236px, but I hear that's considered very wide for comfortable reading and with all those smaller gadgets kept in mind... The website is for reading - mostly articles of around 3000 words per page on average - so no matter the width, readers will have to scroll down :) Also, what should I decide about the banner size for the sidebar? In a 300px sidebar I can put a 300px banner, but in a 320px sidebar I could put a 336px banner if necessary...

  • WWW.yoursite.com or HTTP://yoursite.com - which one is future-proof?

    - by Sam
    http://yoursite.com
    www.yoursite.com
    http://www.yoursite.com
    yoursite.com
    Which of these would you choose as your favourite to work with if you were making a site for 2011 and beyond? Which domain name would you provide to clients, to websites linking to you, on your letterhead and contact cards? Why one or the other? Which should be avoided? I'm thinking of the following aspects:
    - validity, correctly loading URL
    - audience: most geeks know http://, most seniors/clients don't
    - easiest to remember / URL as a brand
    - misspellings by user input (in a mobile phone or desktop browser)
    - browsers not understanding protocol-less links
    - total length of characters for easy user input
    - method preferred by major search engines/social media sites
    - consistency, so that links don't fragment but all point to the same place
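
    Whichever form is chosen, a common companion step (sketched here as an assumption, not part of the question) is to 301-redirect every other variant to the canonical one so links never fragment:

        <?php
        // Redirect any non-canonical host to the canonical one, keeping the path intact.
        $canonical_host = 'www.yoursite.com'; // hypothetical choice

        if (strcasecmp($_SERVER['HTTP_HOST'], $canonical_host) !== 0) {
            header('Location: http://' . $canonical_host . $_SERVER['REQUEST_URI'], true, 301);
            exit;
        }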

  • Tracking AdWord ads with different text in Google Analytics

    - by at01
    I'm trying to see how the text in my Google AdWords ads affects my metrics in Analytics. I have auto-linking enabled, so I figured I would be able to see this automatically in Analytics. Unfortunately, if I add a second dimension of Traffic Sources > Ad Content, the metrics are only split by the ad's headline. Most of my tests change only the ads' descriptions... So I guess I need to add a tracking parameter like ?campaign=special_text to my URLs? Or is there a way to see the ads split by ad description? Should I add the full suite of utm_campaign/utm_medium/etc. parameters? What's the proper way to track these ads, which are mostly similar except for their descriptions?
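
    A minimal sketch (not from the post) of the manual-tagging route: put the ad description variant into utm_content on the destination URL so Analytics can split the traffic by it. All parameter values are hypothetical.

        <?php
        // Build a manually tagged destination URL for one ad variation.
        $params = array(
            'utm_source'   => 'google',
            'utm_medium'   => 'cpc',
            'utm_campaign' => 'spring_sale',
            'utm_content'  => 'description_variant_b', // identifies the ad description being tested
        );
        echo 'http://www.example.com/landing-page?' . http_build_query($params);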

  • Shared Hosting Provider [closed]

    - by Garry
    Possible Duplicate: How to find web hosting that meets my requirements? I've been with Dreamhost for 5 years but the amount of downtime I have experienced over the last 6 months has been outrageous. As of now (2012) which hosting provider would you recommend? Most of my sites are small to medium readership blogs running WordPress. I've been looking at Inmotion and Hostgator. Reliability is paramount. Thanks

  • What is the aim of this email? Is this a ping/sping? [closed]

    - by mplungjan
    Hi, I received this spam in my catch-all. As the webmaster of the domain it was sent to, I am really curious what the reason for this mail is. It was sent to a non-existent user "tania" on my domain - here I used mydomain.zzz - what does the sender want to achieve? Since many mail servers have stopped backscattering, not getting a bounce would not mean anything, would it? And if this is off topic, where in the Stack Exchange network WOULD it be on topic?
    Delivered-To: [email protected]
    Received: (qmail 8015 invoked from network); 27 Jan 2011 02:32:47 -0000
    Received: from unknown (HELO p3pismtp01-021.prod.phx3.secureserver.net) ([10.6.12.26]) (envelope-sender <[email protected]>) by smtp35.prod.mesa1.secureserver.net (qmail-1.03) with SMTP for <[email protected]>; 27 Jan 2011 02:32:47 -0000
    X-IronPort-Anti-Spam-Result: At4FAAlnQE1GVjtCVGdsb2JhbACWXo4gCwEWCA0YJLwyhU8EhRc
    Received: from mx.dt3ls.com ([70.86.59.66]) by p3pismtp01-021.prod.phx3.secureserver.net with ESMTP; 26 Jan 2011 19:32:47 -0700
    Received: from 70.86.59.66 by mx.dt3ls.com (Merak 8.9.1) with ASMTP id JXF39710 for <[email protected]>; Wed, 26 Jan 2011 17:31:10 -0500
    Return-Path: [email protected]
    Status:
    Message-ID: <20110126173109.4d9d6c3f2b@1c3c>
    From: "Tech Support" <[email protected]>
    To: <[email protected]>
    Subject: Information, as instructed.
    Date: Wed, 26 Jan 2011 17:31:09 -0500
    X-Priority: 3
    X-Mailer: General-Mailer v.3
    MIME-Version: 1.0
    Content-Type: text/plain; charset="us-ascii"
    Content-Transfer-Encoding: 7bit
    Quote: I give it to you not that you may remember time, but that you might forget it now and then for a moment and not spend all your breath trying to conquer it. Because no battle is ever won he said. They are not even fought. The field reveals to a man his own folly and despair, and victory is an illusion of philosophers and fools.
    William Faulkner, The Sound and the Fury

  • Do web crawlers/spiders index azure web sites?

    - by Clay Shannon
    For somebody who wants their web site to be as discoverable as possible (and who doesn't?), are Microsoft's Azure web sites (azurewebsites.net) a feasible domain to host sites on? I have a site that is both on an azurewebsites.net domain and hosted under a completely different name by discountasp.net. Both of these sites are exactly the same except for the URL; whenever I update the code, I republish the site to both places, so obviously they both have the same H1 and H2 elements. Searching for the content of my H1 tag, I find my .com site listed #3 on Google and #2 on both Bing and Yahoo; OTOH, my azurewebsites.net site doesn't show up on the first page at all in any of them. This makes me wonder whether azurewebsites.net should only be used for Web API hosting and the like, not for generic/commercial "public" sites. Are my conclusions valid?

  • "X-Robots-Tag: noindex" on an HTTP 301 response

    - by Peter O.
    I understand that a resource with X-Robots-Tag: noindex forces some search engines, including Google, not to index the resource further. I also understand that an HTTP 301 response causes search engines to use the redirected URL instead of the original URL to refer to the resource. But what happens if both "X-Robots-Tag: noindex" and status code 301 occur on the same response? It's likely that the original URL will no longer be indexed, but will that cause the redirected URL to no longer be indexed too? This possibility is not mentioned in the X-Robots-Tag specification.
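
    For reference, a minimal sketch (not from the post) of the response being described: a 301 redirect that also carries the noindex directive in its headers.

        <?php
        // Send both the X-Robots-Tag header and a permanent redirect on the same response.
        header('X-Robots-Tag: noindex');
        header('Location: https://example.com/new-url', true, 301);
        exit;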

  • Is there a way of listing files for a directory if it contains index.html?

    - by fredley
    On my server (over which I have little control), directories are listed by default, so for mysite.com/images I get:
    Index of /images
      Parent Directory
      BirdsAreHere.png
      CanYouSpot-AdBlank.jpg
      etc.
    Is putting an index.html in that directory enough to prevent people listing the files, or is there still a way of getting at that list? Is it the same for my web root directory (mysite.com)?

  • Multiple domain names with pages linking to one website

    - by Mark Ravenhill
    Hello, I work for a company that has been redesigning its website. I have been asked to register loads of domain names that contain the keywords they want to use on the original site. Each of these domain names will host a one-page website with a description of what the company offers and a link saying something along the lines of 'click here for more information', which then takes you to the main site. The idea is that the main site will then receive a lot of inbound links and hopefully rise in the Google rankings, not to mention bring in more customers who came through all the other domain names and wouldn't normally have found the website because it wasn't ranked on the first page. Is this a good idea, or will Google see this as spam and penalise the main site for having loads of links to it from one-page websites hosted on the same nameserver? Any advice would be greatly appreciated. Thanks, Mark.

  • Can I import an existing member data used in old ASP to a new ASP.NET membership database? [closed]

    - by Rick Brown
    I have an old website that I designed and still maintain using old ASP that has a membership database (MS-SQL) that I built from scratch. It is a very simple database that has all the user information in one table (including login info and personal info) and then details and other odds and ends in other tables. It is WAY past time to upgrade this to .NET, especially since I need to add a Paypal payment system into it as soon as I can. I've designed several other sites with membership in .NET, but they have all been from scratch. Is there an easy way to transition from the old ASP site to a new .NET membership database without losing the data? There are hundreds of users with thousands of records relating to those users that I'd rather not lose, if possible. Any ideas on a relatively painless way to do this?

  • OpenSearchDescriptions good or bad signal in Google's eyes?

    - by JeremyB
    I noticed a site using this tag: <link rel="search" type="application/opensearchdescription+xml" title="XXXXXXXXX" href="http://www.XXXXXXXXXX.com/api/opensearch" /> As I understand it (based on http://www.opensearch.org/Home), this tag is a way of describing search results (so you use it on pages which contain search results) to make it easier for other search engines to understand and use your results. Given that Matt Cutts has said Google generally frowns on "search results within search results", is using this tag a bad idea on a page that you hope will rank well in Google?

  • Temporary background image while the big one is loading? [migrated]

    - by Mikhail
    Is there a way, without JavaScript, to load a small image as a background before the real image has downloaded? Without JavaScript because I already know how to do it that way. I can't test whether the following CSS3 would work because locally it loads too quickly: body { background-image:url('hugefile.jpg'), url('tinypreload.jpg'); } If tinypreload.jpg is only, say, 20k and hugefile.jpg is 300k -- would this accomplish the task? I assume that both downloads would start at the same time rather than being consecutive.

  • http to https upgrade -- SEO troubles

    - by SLIM
    I upgraded my site so that all pages have gone from http to https. I didn't consider that Google treats https pages differently from http. I re-created my sitemap so that all links now reflect the new https URLs and let it be for a few days. (Whoops!) Google is now re-indexing all the https pages. I have about 19k pages on the site, and Google has already indexed about 8k of the new https ones. The problem is that Google sees all of these as brand new pages, when many of them have a long http history. Of course most of you will recognize the problem: I didn't set up a 301 from the old http URLs to the new https ones. Is it too late to do this? Should I switch my sitemap back to http and then 301 to the new https? Or should I leave the sitemap as is and set up 301 redirects anyway? I'm not even sure if Google is trying to reach the http site anymore. Currently the site is doing 303 redirects (from http to https), although I haven't figured out why yet. Thanks for any suggestions you can offer.
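
    A minimal sketch (an assumption about the stack, not something from the post) of what a site-wide 301 from the old http URLs to their https equivalents could look like in PHP, in place of the 303s currently being sent:

        <?php
        // If the request arrived over plain HTTP, send a permanent redirect to the HTTPS URL.
        if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
            header('Location: https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'], true, 301);
            exit;
        }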

  • Does anyone know any good resources for learning how to market a web app?

    - by Jack Kinsella
    I'm a developer first and foremost. I write web apps but have a hard time generating traffic and converting potential users once I've released my product into the wild. I know I need to learn more about marketing but I don't know where to start as I've no baseline to judge the quality of the materials I stumble across. Does anyone know any websites, blogs, e-books or other resources for learning how to market effectively?

  • Creating a Template Like System in cPanel

    - by clifgray
    I am creating a medium-sized website using cPanel and its File Manager. The majority of my pages are going to be the same apart from the title and content section, and I wanted to see if there is a system for making one general template file and having all the other pages inherit from it, so that each page only has to supply a title and content section and the rest of the links, headers, and whatnot can be changed throughout the site by editing one file. Is there anything like this? I have used Jinja2 in Python and a few other systems for other server scripting languages, but I am not sure how to implement something similar with cPanel.
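
    In the absence of a built-in cPanel feature, a minimal sketch (an assumption, not from the post) of a plain-PHP include "template": header.php and footer.php hold everything shared, and each page supplies only its title and content.

        <?php
        // about.php — one page of the site.
        $page_title = 'About Us'; // available inside templates/header.php

        include __DIR__ . '/templates/header.php'; // shared <head>, navigation, banners
        ?>
        <h1>About Us</h1>
        <p>Page-specific content goes here.</p>
        <?php
        include __DIR__ . '/templates/footer.php'; // shared footer and closing tags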

  • Parallel downloading of JavaScript files on page load

    - by user359650
    Below is a quote from one of the Yahoo performance pages:
    "While a script is downloading, however, the browser won't start any other downloads, even on different hostnames."
    When I look at the page load of our website, I can see that many scripts are downloaded at the same time. Am I mistaken, or should the quote instead read like this?
    "While scripts are downloading (there can be several scripts downloading at the same time), the browser won't start any other downloads, even on different hostnames."
