Search Results

Search found 5380 results on 216 pages for 'webmasters'.

Page 32/216 | < Previous Page | 28 29 30 31 32 33 34 35 36 37 38 39  | Next Page >

  • What does an asterisk/star in traceroute mean?

    - by Chang
    Below is part of a traceroute to my hosted server:

         9  ae-2-2.ebr2.dallas1.level3.net (4.69.132.106)  19.433 ms  19.599 ms  19.275 ms
        10  ae-72-72.csw2.dallas1.level3.net (4.69.151.141)  19.496 ms  ae-82-82.csw3.dallas1.level3.net (4.69.151.153)  19.630 ms  ae-62-62.csw1.dallas1.level3.net (4.69.151.129)  19.518 ms
        11  ae-3-80.edge4.dallas3.level3.net (4.69.145.141)  19.659 ms  ae-2-70.edge4.dallas3.level3.net (4.69.145.77)  90.610 ms  ae-4-90.edge4.dallas3.level3.net (4.69.145.205)  19.658 ms
        12  the-planet.edge4.dallas3.level3.net (4.59.32.30)  19.905 ms  19.519 ms  19.688 ms
        13  te9-2.dsr01.dllstx3.networklayer.com (70.87.253.14)  40.037 ms  24.063 ms  te2-4.dsr02.dllstx3.networklayer.com (70.87.255.46)  28.605 ms
        14  * * *
        15  * * *
        16  zyzzyva.site5.com (174.122.37.66)  20.414 ms  20.603 ms  20.467 ms

    What is the meaning of lines 14 and 15? Is information being hidden?
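
    A hedged aside for readers hitting the same output: an asterisk just means no reply came back for that probe before the timeout, commonly a router that rate-limits or drops its "time exceeded" responses rather than anything being hidden. A quick cross-check, assuming a typical Linux traceroute build (flags vary by platform, and the hostname below is a placeholder):

        # Probe with ICMP echo instead of UDP and a longer per-hop wait,
        # to see whether the silent hops answer at all.
        traceroute -I -q 3 -w 5 example.com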

    Read the article

  • Set up iis7.5 to deny connections outside of LAN for certain folder [migrated]

    - by Darkcat Studios
    I'm currently setting up a combined website and extranet; both read from the same database on the server that hosts the site, because the website is fed from the data that staff enter through the extranet interface. It also links in to AD for authorising access to the extranet. I have the extranet in a folder within the website folder. What I want to do is only allow the extranet to be accessed from computers within our LAN, while leaving the main website freely accessible to internet users. It is currently set up as a generic web server, so anyone can view anything (well, up to the point where the user is asked to log into the extranet, of course). I have read a lot on this, but nothing I read applies to, or works in, IIS 7.5.
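
    A hedged sketch of one way this is often handled in IIS 7.5, assuming the "IP and Domain Restrictions" feature is installed and the extranet lives in a folder literally named extranet; the folder name and LAN subnet are placeholders:

        <configuration>
          <location path="extranet">
            <system.webServer>
              <security>
                <!-- Deny everything by default, then whitelist the LAN range. -->
                <ipSecurity allowUnlisted="false">
                  <add ipAddress="192.168.1.0" subnetMask="255.255.255.0" allowed="true" />
                </ipSecurity>
              </security>
            </system.webServer>
          </location>
        </configuration>

    If IIS reports the section as locked, the ipSecurity section usually has to be unlocked at the server level (feature delegation) before a site-level web.config can set it.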

    Read the article

  • Google Analytics API - Super simple?

    - by Jens Törnell
    Google Analytics API - too complicated? I've read about the Google Analytics API, but heard from others that it is a bit complicated to get working. I use PHP. Copy/paste example: my question is whether there is a copy/paste example anywhere on the web for getting a stats curve for the last month, or just the numbers for that period. Important: I need to use the new Google Analytics API version for 2012; the old one is going to be shut down soon.
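
    A hedged, minimal sketch of the kind of copy/paste call being asked about, querying the Core Reporting API (v3) directly over HTTP; it assumes you already have an OAuth 2.0 access token and a profile ID, and both placeholder values below are hypothetical:

        <?php
        // Placeholders - replace with a real access token and profile ID.
        $accessToken = 'ya29.EXAMPLE_TOKEN';
        $profileId   = 'ga:12345678';

        $params = http_build_query(array(
            'ids'          => $profileId,
            'start-date'   => date('Y-m-d', strtotime('-30 days')),
            'end-date'     => date('Y-m-d'),
            'metrics'      => 'ga:visits',
            'dimensions'   => 'ga:date',   // one row per day, enough for a curve
            'access_token' => $accessToken,
        ));

        $json = file_get_contents('https://www.googleapis.com/analytics/v3/data/ga?' . $params);
        $data = json_decode($json, true);

        foreach ($data['rows'] as $row) {
            echo $row[0] . ': ' . $row[1] . "\n";   // date: visits
        }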

    Read the article

  • Targeting advertising for T-Mobile and Virgin Mobile users

    - by Codek
    We'd like to target our advertising to Virgin Mobile users. However, Virgin Mobile actually runs on the T-Mobile network, so when we look up the IP address it reports T-Mobile. This gives two problems: there is no way to target Virgin Mobile, and when we target T-Mobile we accidentally target Virgin Mobile users too. We actually have two separate sites for T-Mobile and Virgin Mobile, so is there any way we can make sure we send people to the right site? I would also appreciate suggestions on other places for this discussion; I'm not entirely sure this is "webmaster" talk.

    Read the article

  • SEO Suggestion For My Blog [closed]

    - by Rana
    I have a programming tutorial blog which has decent traffic. However, I am interested in doing some basic SEO for my blog to get it optimized, and I want to learn to do it myself. I was wondering if the experts here could suggest how I should proceed. Also, if you could review my blog and point out the most common SEO concerns that come to mind first, that would be helpful as well. My blog URL is as follows: http://codesamplez.com/ Looking forward to your feedback. Thanks.

    Read the article

  • Friendly URLs: is there a max length for search engines?

    - by Olivier Pons
    People from Stack Overflow have been working closely with the Google team to help make the Panda algorithm more efficient, so I guess they've learned a lot from Google and may have designed very clever friendly URLs to maximize PageRank. I've occasionally seen very long URLs on Stack Overflow (I can't find where), but after a certain number of characters there were only numbers, as if to say "OK, past this length search engines will ignore the rest, so let's put only numbers". I've done a huge amount of work on my framework to generate very friendly URLs, and my website can come up with URLs like: http://www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/departement/bouches-du-rhone/categorie-de-metiers/paramedical/ It's very long, and I'm wondering whether that URL might get mixed up with, say, this one: http://www.mysite.fr/recherche/region/provence-alpes-cote-d-azur/departement/bouches-du-rhone/categorie-de-metiers/art/
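
    A hedged sketch of the kind of length cap the question alludes to; the 80-character limit and the helper name truncate_slug are arbitrary assumptions for illustration, not a documented search-engine threshold:

        <?php
        // Cap a slug at $maxLength characters without cutting a word in half.
        function truncate_slug($slug, $maxLength = 80)
        {
            if (strlen($slug) <= $maxLength) {
                return $slug;
            }
            $cut = substr($slug, 0, $maxLength);
            $pos = strrpos($cut, '-');              // last full word boundary
            return $pos ? substr($cut, 0, $pos) : $cut;
        }

        echo truncate_slug('provence-alpes-cote-d-azur-bouches-du-rhone-categorie-de-metiers-paramedical');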

    Read the article

  • Unknown CSS font-family oddity with IE7-10 on Win Vista-8

    - by Jeff
    I am seeing the following "oddity" with IE7-10 on Windows Vista-8: when declaring font-family: serif; I get an old bitmapped serif font that I can't identify (see screenshot below) instead of the expected Times New Roman. I know it's an old bitmapped font because it displays aliased, without any font smoothing, in IE7-10 on Windows Vista-8 (just like Courier on every version of Windows). Screenshot: (omitted). I would like to know (1) whether anyone else can confirm my research and (2) as a bonus, which font IE is displaying. Notes: IE6 and IE7 on Windows XP display Times New Roman, as they should. It doesn't matter whether font-family: serif; is declared in an external stylesheet or inline on the element. Quoting the CSS value makes no difference. Adding "Unkown Font" to the stack also makes no difference. Update: the answer from Jukka below is correct; a new screenshot with Batang (not BatangChe) illustrates it. Hope this helps someone.
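
    A hedged workaround sketch, independent of identifying the mystery font: naming the desired face before the generic keyword leaves nothing for IE to guess (the selector below is a placeholder):

        /* Ask for Times New Roman explicitly and keep `serif` only as a fallback. */
        body {
            font-family: "Times New Roman", Times, serif;
        }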

    Read the article

  • 302 or 301 redirect in case where redirect lasts 1-2 months

    - by Matt Helmick
    I have a case where a newly built "author site" (promoting the author in general as a speaker and author) needs to temporarily take traffic from the author's "book site" (which focuses on advertising the specific book). Because of some upcoming publicity we want to redirect traffic from the book site to the author site as a truly temporary measure, but that redirect would probably only last for 1-2 months (until the flurry of activity around the publicity dies down, or until the author site has had a chance to rise in the search rankings). At first glance this seems to be exactly the situation a 302 redirect was designed for, but I'm worried about losing link juice for the original book site. Would a 301 redirect be better (keeping in mind that this would be temporary) as long as the 301 redirect was lifted after 1-2 months?
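
    For reference, a minimal sketch of what the temporary version could look like in the book site's .htaccess, assuming Apache with mod_rewrite and a placeholder destination domain (swap [R=302] for [R=301] if the permanent route is chosen):

        RewriteEngine On
        # Send every request on the book site to the same path on the author site.
        RewriteRule ^(.*)$ http://author-site.example/$1 [R=302,L]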

    Read the article

  • How does one use the built in IIS URL Rewrite SEO rule that adds trailing slash only to files that exist?

    - by Sn3akyP3t3
    The default rule template is AddTrailingSlash. I've added another condition so that the rule applies to directories and not files, but I'm not sure whether this is industry standard. Added: the rule allows for filenames that are not standard, such as .mobileconfig. The web.config contains this rule when the template is applied:

        <rule name="AddTrailingSlashRule1" enabled="true" stopProcessing="true">
          <match url="(.*[^/])$" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" pattern="^.*\.[a-z]{1,12}" negate="true" />
          </conditions>
          <action type="Redirect" url="{R:1}/" />
        </rule>

    Read the article

  • FTP GoDaddy Issues

    - by Brian McCarthy
    Is there a special port for GoDaddy servers? Do I have to call them to enable FTP support? I can log in with the username and password on the control panel at godaddy.com, but not over FTP, and I'm not sure what I'm doing wrong. I tried FileZilla and CuteFTP Pro on port 21 with no luck. GoDaddy's instructions are:

        1. FTP Address or Hostname: your domain name
        2. FTP Username & Password: you selected both of these during account creation
        3. Start Directory: leave this blank or include a single forward slash (i.e. /)
        4. FTP Port: Standard, or 21
        FTP Client: FileZilla, WS-FTP, CuteFTP Pro, or AceFTP

    Thanks!

    Read the article

  • Apache URL rewriting doesn't work (using GoDaddy hosting)

    - by AzizAG
    I'm using a framework (CodeIgniter) to create my website. By default the URLs look like this: mysite.com/index.php?/etc/etc/etc. I'm trying to remove the index.php?, and tried doing so with this (it didn't work):

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /index.php?$1 [L]

    Note: it works on my localhost (when my website's files are in the root directory). So, is this issue on my end or with the hosting company (GoDaddy)?
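
    A hedged variant that is often reported to work on shared hosts, keeping the query-string style CodeIgniter already uses; the only assumption is that $config['uri_protocol'] in application/config/config.php is set to something compatible (for example QUERY_STRING or AUTO):

        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # Note the "?/" - some hosts need the slash after the question mark.
        RewriteRule ^(.*)$ /index.php?/$1 [L,QSA]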

    Read the article

  • How to handle geo sitemaps?

    - by Floran
    I have a site with company profiles, and I already have a sitemap.xml. However, I'm now reading about KML geo sitemaps: http://www.seomoz.org/blog/understand-and-rock-the-google-venice-update I wondered how that applies to a site with company profiles. Do I publish one large KML file showing all the business locations, or do I need to make it more specific, for example locations-in-LA.kml? And what is the relationship between a KML file and the sitemap.xml? It seems that the sitemap.xml needs to reference the KML files, but if that is the case, I don't know how. Help! :)
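
    A hedged sketch of how a sitemap entry referencing a KML file looked under Google's (since retired) geo sitemap extension; the file name and domain are placeholders based on the question:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
          <url>
            <loc>http://example.com/kml/locations-in-LA.kml</loc>
            <geo:geo>
              <geo:format>kml</geo:format>
            </geo:geo>
          </url>
        </urlset>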

    Read the article

  • Can Campaign URL tags cause a soft 404 error?

    - by user35306
    I was checking one of my company's websites in Webmaster Tools to analyze the cause of some soft 404 errors and discovered that a few of the older errors had affiliate mp referral tags listed as the relative URLs. Since these are older problems and I don't see many of them coming up in the last few months, I don't think it's still an issue. I'm just curious whether it's possible to cause a soft 404 by improperly copying a campaign or referral tag into the URL.

    Read the article

  • Am I correctly handling duplicate URLs for my homepage?

    - by Rob Goldstein
    I own a job search site named www.conservationjobboard.com and have a concern about how the domain is viewed by search engines. The issue is that when the site was first designed, the default page was left as default.php, but the homepage was actually JobBoard.php. To handle this, the default.php page performed a redirect to the JobBoard.php file when www.conservationjobboard.com/ was requested. The main problem arose because the redirect was a temporary redirect, causing search engines to index conservationjobboard.com/ and conservationjobboard.com/JobBoard.php as 2 separate pages. This has since been corrected in the .htaccess file so that JobBoard.php is now the default file for the root directory, eliminating the need for the redirect. The problem is that search engines still show both URLs in search results (one including JobBoard.php and one that ends with /). Another potential problem is that some of my early backlinks point to conservationjobboard.com/JobBoard.php while the rest point to conservationjobboard.com. The 2 outstanding questions are as follows:

        1. Is my domain still being penalized by search engines like Google for having duplicate homepage URLs?
        2. Are all of the backlinks to my homepage being considered as the same now, or is the total number of backlinks being split between the 2 different URLs?

    If you think there are still issues with how we have this set up, I was wondering if you could give me advice on what we should do differently. Thanks.
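
    A hedged sketch of the usual consolidation step, assuming Apache and mod_rewrite: permanently redirect the old explicit URL to the root so both engines and backlinks settle on one address (the THE_REQUEST condition keeps the internal DirectoryIndex lookup from looping):

        RewriteEngine On
        # Only act on requests that literally asked for /JobBoard.php.
        RewriteCond %{THE_REQUEST} \s/JobBoard\.php[\s?] [NC]
        RewriteRule ^JobBoard\.php$ / [R=301,L]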

    Read the article

  • SEO one longer page vs. several targeted subpages?

    - by Travis
    We're working on a site and have come to a choice between one long (but not excessive) main page and several subpages. The subpages would have custom metas/titles/h2 tags and the content that corresponds to them. The main page would have all the content and many more inbound links (PageRank), with longer content encompassing what we'd put on the shorter pages. Which would be better for SEO and traffic in general? Both schemes are very usable for the user, although we are a little concerned about duplicate content (the page's header/footer and other elements remain the same).

    Read the article

  • AWStats configuration issue [on hold]

    - by Dan
    I have an Ubuntu 12.04.2 LTS server on which I host my website, which runs on Drupal. I tried to set up AWStats for the site, but it is giving me a lot of problems. I followed this link - http://www.sysadminworld.com/2011/set-up-awstats-on-ubuntu/ - but I am confused about the domain name that needs to be given. The website can be accessed at http://xyz.com, but the actual link is http://xyz.abc-def.com/root-folder. What is the domain name in this case, and what should the name of the conf file be?
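
    A hedged sketch of the relevant lines, assuming the public name is used for the config file (so /etc/awstats/awstats.xyz.com.conf) and every other hostname the site answers to is listed as an alias; all values below are placeholders taken from the question:

        # /etc/awstats/awstats.xyz.com.conf
        SiteDomain="xyz.com"
        HostAliases="xyz.com www.xyz.com xyz.abc-def.com localhost 127.0.0.1"
        LogFile="/var/log/apache2/access.log"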

    Read the article

  • Rebuilt website from static html to CMS need to redirect indexed links

    - by Michael Dunn
    I have rebuilt a website that was originally created with static HTML pages; it now runs on a CMS. I need to find a way of redirecting all the existing links to their new corresponding pages, which use friendly URL rewrites on the CMS-based site. I imagine there will be several hundred if not thousands, as I have pages and images linked from Google. What is the most efficient way to do this? Thanks in advance, Mike
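
    A hedged sketch of the usual approach, assuming Apache: one permanent redirect per legacy URL in .htaccess (the paths below are made-up placeholders), with a RewriteMap text file being the tidier option once the list runs into the thousands:

        # One line per old static page, pointing at its new friendly URL.
        Redirect 301 /about-us.html /about
        Redirect 301 /products/widget1.html /products/widget-one
        Redirect 301 /images/old-logo.gif /assets/img/logo.png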

    Read the article

  • CSS Style Element if it does not contain another specific type of Element [migrated]

    - by Chris S
    My CSS includes the following:

        #mainbody a[href ^='http'] {
            background: transparent url('/images/icons/external.svg') no-repeat top right;
            padding-right: 12px;
        }

    This places an "external" icon next to links that start with "http" (all internal site links are relative). It works perfectly, except that if I link an image, it also gets the icon. For example, <a href='http://example.com'><img src='whatever.jpg'/></a> would also get the "external" icon next to the image. I can live with this if necessary, but would like to eliminate it. This must be implemented in CSS (no JS) and must not require any special IDs, classes, or styling in the HTML for the image or the anchor around it. Is this possible?
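
    A hedged aside: pure CSS had no parent selector when this was asked, but the now widely supported :has() pseudo-class can express exactly this exception; a minimal sketch:

        /* Turn the icon off for external links that just wrap an image. */
        #mainbody a[href^='http']:has(> img) {
            background: none;
            padding-right: 0;
        }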

    Read the article

  • New META TAGS with positive effects for seo ranking in 2011 and beyond

    - by Sam
    Hi all, I'm trying to make an up-to-date chart of meta tags, for all of us, with their purposes, their use, and their good (or bad) effects on search engines/being found. Also, does anybody know of new/promising meta tags? I will add yours to my list so this chart stays a result of live discussion and up to date. Also, it would be creative to invent your own useful meta, because we are the ones making the web, or aren't we?

        LEGEND
        P PURPOSE?   What does this meta tag do in 2011, if anything
        N NECESSARY? Does every site really need it or not?
        G GOOD       Whether it will have a good effect on your site being found
        I INVENTED   Invented meta tag - who knows, it may be accepted in a year!

        META "METANAME" = PURPOSE? - NECESSARY? - GOOD EFFECT?

        #### important
        meta "title"       = P concise summary + teaser - N very - G extremely
        meta "description" = P description + teaser - N yes - G very
        meta "robots"      = P if needed, to skip default dmoz/yahoodir listing - N no - G?

        #### new & promising! Thanks for input (John, )
        meta "original-source"    = P url of whoever broke the news gets credits - N? - G?
        meta "syndication-source" = P url for syndication of published news - N? - G?
        meta "canonical"          = P? - N? - G?

        #### seems obsolete
        meta "keywords"   = P some keywords - N+G not for google but yahoo likes them
        meta "language"   = P overrule guesswork by defining language - N no - G?
        meta "page-topic" = P topic/theme - N? - G?
        meta "abstract"   = P short summary - N? - G?
        meta "copyright"  = ?

        #### invented by me
        meta "audience" = P filters audience: "+seniors, +parents, -children, -youth"
        meta "mood"     = P specifies textual style: "discussion, informative, commercial, sexual, fictional, scientific, romantic, therapeutic, technical"
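
    A hedged sketch of a head section using the tags the chart rates highest; the noodp/noydir robots values correspond to the "skip default dmoz/yahoodir listing" purpose above, and note that canonical is actually a link element rather than a meta tag:

        <head>
          <title>Concise page title that doubles as the search-result headline</title>
          <meta name="description" content="One or two sentences summarising this page for the results snippet.">
          <meta name="robots" content="noodp,noydir">
          <link rel="canonical" href="http://example.com/this-page/">
        </head>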

    Read the article

  • Clean URLs issue using .htaccess in PHP project

    - by x4ph4r
    I am working on a PHP Laravel project and am currently facing issues with the .htaccess file. I have the following .htaccess:

        <IfModule mod_rewrite.c>
            Options -MultiViews
            RewriteEngine On
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^ index.php [L]
        </IfModule>

    When I reload the page it gives me the following error: "404 Not Found - The requested URL /contacts was not found on this server." I then opened the /etc/apache2/users/username.conf file, which had the following:

        <Directory "/Users/username/Sites/">
            Options Indexes MultiViews
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    In the above I changed AllowOverride None to AllowOverride All, reloaded the page, and got: "403 Forbidden - You don't have permission to access /contacts on this server." When I add FollowSymLinks to the Options line in the .htaccess file, like this: Options -MultiViews FollowSymLinks, I sometimes get a 500 Internal Server Error and sometimes "Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data" - each reload gives one of these errors with the FollowSymLinks option. I also uncommented the following lines in /etc/apache2/httpd.conf:

        LoadModule rewrite_module libexec/apache2/mod_rewrite.so
        LoadModule php5_module libexec/apache2/libphp5.so

    and I am still getting the same permission denied error. Please help; I have been trying to solve this problem for the past 3 days, but it is still unresolved.
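
    A hedged observation plus sketch: Apache rejects an Options directive that mixes signed (+/-) and unsigned keywords, which would explain the errors that appear once FollowSymLinks is added; using the signed form for both, together with AllowOverride All in the Directory block, is the usual combination:

        <IfModule mod_rewrite.c>
            # Mixing "-MultiViews" with a bare "FollowSymLinks" is a syntax error;
            # both keywords need a sign.
            Options -MultiViews +FollowSymLinks
            RewriteEngine On
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^ index.php [L]
        </IfModule>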

    Read the article

  • Apache on Windows - splitting vHost logs

    - by Cylindric
    I have a Windows Server 2008 running Apache, and it will be hosting several virtual hosts. I'd rather not use the logrotate tool (|bin/logrotate), as it seems to create significant extra overhead with all the processes. Is there a simple Windows alternative to get the log entries from a combined log file split into several per-site files? Preferably with custom output directories, but that is optional.
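
    A hedged alternative sketch that avoids splitting after the fact: give each VirtualHost its own CustomLog/ErrorLog so entries are separated as they are written (names and paths below are placeholders):

        <VirtualHost *:80>
            ServerName site-one.example
            DocumentRoot "C:/sites/site-one"
            CustomLog "C:/Apache/logs/site-one-access.log" combined
            ErrorLog  "C:/Apache/logs/site-one-error.log"
        </VirtualHost>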

    Read the article

  • What is the most time-effective way to monitor & manage threats from bots and/or humans?

    - by CheeseConQueso
    I'm usually overwhelmed by the number of tools that hosting companies provide to track and quantify traffic data and statistics. I'm equally overwhelmed by the countless flavors of malicious 'attacks' that target any and every web site known to man. The security methods used to protect both the back and front end of a website are well documented and straightforward to implement and apply, but the army of autonomous bots knows no boundaries and will always find a niche of a website to infest. So what can be done to handle the inevitable swarm of bots that pound your domain with brute force? Whenever I look at the error logs for my domains, there are always thousands of entries that look like bots trying to sneak SQL code into the database by tricking the variables in the URL into giving them schema information or private data. My barbaric and time-consuming plan of defense is simply to monitor visitor statistics for those obvious patterns of abuse and ban the IPs or IP ranges accordingly. Aside from that, I don't know what else I could do to prevent all of the ping pong going on all day. Are there any good tools that automatically monitor this background activity (specifically activity that throws errors on the web and database server) and proactively deal with these sources of mayhem?
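
    A hedged example of the kind of automated response being asked about, assuming a Linux host with fail2ban available: a jail that watches the Apache error log and bans IPs that repeatedly trip the stock filters (filter names and log paths vary by distribution, so treat these values as placeholders):

        # /etc/fail2ban/jail.local
        [apache-noscript]
        enabled  = true
        port     = http,https
        filter   = apache-noscript
        logpath  = /var/log/apache2/error.log
        maxretry = 6
        bantime  = 3600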

    Read the article

  • Apache cannot find mysql database modules

    - by user809857
    I've created a simple Django project and set up a MySQL database. My simple project just creates an entry in the database. The project works fine when I use Django's built-in development server (runserver). But when I deployed the project on Apache and mod_wsgi (Ubuntu server), Django could not find 'books', which in this case is my table in the database. The MySQL database I use under runserver and under Apache is the same one. I also rebuilt the database using Django's sqlall, validate and syncdb, but I still get the error. What could be wrong with what I'm doing? Thanks
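
    A hedged sketch of a minimal mod_wsgi setup worth comparing against: this kind of "table not found" error often comes down to Apache loading a different settings module, virtualenv or database than runserver does, so pinning the paths explicitly can expose the mismatch (project names and paths below are placeholders):

        WSGIDaemonProcess myproject python-path=/srv/myproject:/srv/venv/lib/python2.7/site-packages
        WSGIProcessGroup  myproject
        WSGIScriptAlias   /  /srv/myproject/myproject/wsgi.py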

    Read the article

  • Is an xml sitemap good or bad [closed]

    - by Frederik Heyninck
    Possible Duplicate: Are there any clear indicators that my sitemap file is beneficial? The good: you provide search engines with all the URLs on your site. But does a search engine crawl further than the URLs provided in the XML sitemap? If you have a website with a forum, does every post need to be in the sitemap? And what if you remove the sitemap afterwards - will the search engine need to start over?

    Read the article

  • Can't get heroku site updated on custom domain

    - by Joseph Brown
    I have hosayif.com registered at GoDaddy, and I set up a CNAME for rails.hosayif.com pointing to my Heroku app at sharp-meadow-6535.herokuapp.com. I set this up with a previous app and it worked. I then made a new app, renamed the old one, and renamed the new one to sharp-meadow-6535.herokuapp.com in the hope of not having to change anything at GoDaddy. In theory, rails.hosayif.com and sharp-meadow-6535.herokuapp.com should now be the same site, but they are not. Can someone tell me what I am doing wrong?
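
    A hedged checklist sketch: the CNAME only handles DNS, while Heroku also has to know which app should answer for the custom domain, and that binding stays with the old app rather than following the name swap; something along these lines with the Heroku CLI would confirm and fix it:

        # List the custom domains the renamed app currently answers for.
        heroku domains --app sharp-meadow-6535

        # Attach the custom domain to it if it is missing.
        heroku domains:add rails.hosayif.com --app sharp-meadow-6535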

    Read the article

< Previous Page | 28 29 30 31 32 33 34 35 36 37 38 39  | Next Page >