Search Results

Search found 10444 results on 418 pages for 'macbook pro retina'.

  • Remove IP address from the URL of website using apache

    - by sapatos
    I'm on an EC2 instance and have the domain domain.com linked to the EC2 name servers, and it happily serves my pages if I type domain.com in the URL bar. However, when a page is served, the URL resolves to 1.1.1.10/directory/page.php. Following the examples provided at http://httpd.apache.org/docs/2.0/dns-caveats.html, I've set up the following VirtualHost in Apache:

        Listen 80
        NameVirtualHost 1.1.1.10:80
        <VirtualHost 1.1.1.10:80>
            DocumentRoot /var/www/html/directory
            ServerName domain.com
            # Other directives here ...
            <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
                Header set Cache-Control "max-age=290304000, public"
            </FilesMatch>
        </VirtualHost>

    However, I'm not getting any change in how the URL is displayed. This is the only VirtualHost configured on this site, and I've confirmed it's the one being used, as I've managed to break it a number of times while experimenting with the configuration. The Route 53 entries I have are:

        domain.com  A    1.1.1.10
        domain.com  NS   ns-11.awsdns-11.com ns-111.awsdns-11.net ns-1111.awsdns-11.org ns-1111.awsdns-11.co.uk
        domain.com  SOA  ns-11.awsdns-11.com. awsdns-hostmaster.amazon.com. 1 1100 100 1101100 11100
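
    A VirtualHost alone won't change what the address bar shows; something has to redirect IP-based requests to the hostname. A minimal sketch of that redirect, assuming mod_rewrite is enabled (the IP and domain are the placeholders from the question):

        RewriteEngine On
        # If the client addressed us by the raw IP, 301 to the canonical
        # hostname, preserving the requested path.
        RewriteCond %{HTTP_HOST} ^1\.1\.1\.10$
        RewriteRule ^(.*)$ http://domain.com$1 [R=301,L]

    If the IP only appears after following links or form posts, it is also worth checking for an absolute base URL hard-coded in the application itself.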

  • Is eCPM dropping by about 50% in January usual behavior on Google AdSense?

    - by Andrew G. Johnson
    So I just got semi-serious about running some AdSense sites over the past 6 months, and the eCPMs have hovered between 1.38 and 1.42 [yes, it's that close] when I look at the eCPM for each month. Obviously there's some deviation day to day, but in aggregate it's pretty damn close to a buck forty. So far for January I am sitting at an eCPM of 0.80. I know it's not a huge sample size, but the daily pageviews are fairly consistent with [actually a bit higher than] where they were in December. I am trying to justify this by thinking that somehow a lot of ad buyers buy inventory for the year and have to get set up to do another big buy now that it's a new calendar year, but that thought isn't close to comforting. Is this happening to anyone else? EDIT: I run a lot of websites, and the ratios of pageviews are about the same this month as last month, but just to be clear: the eCPM I posted is for 20 websites in a variety of niches, so it doesn't accurately depict any one domain.

  • Can search engine robots read a file with permission 640?

    - by dkjain
    I am on a shared Linux web hosting server. I want search engine robots/spiders to be able to read robots.txt, but not anyone typing www.mysite.com/robots.txt into a browser. According to a Google Groups post, setting the file permission to 640 makes it possible to deny the world access to robots.txt while still letting search engine robots read it. Is that true? If not, how is it possible to deny the general public access to robots.txt but still allow search engine robots to read it?
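
    File permissions can't make that distinction: the web server reads robots.txt as its own system user for every request, so a 640 file either works for everyone (if the server user can read it) or for no one, crawlers included. The closest approximation is filtering by user agent; a hedged .htaccess sketch follows, with the caveat that user-agent strings are trivially spoofed, so this only deters casual visitors:

        RewriteEngine On
        # Serve robots.txt only to requests identifying as a known crawler.
        RewriteCond %{REQUEST_URI} ^/robots\.txt$
        RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot|Slurp) [NC]
        RewriteRule . - [F]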

  • Adding my face to my website in Google's search results

    - by Roman Matveev
    I'm trying to add a rich snippet to the template of my future website. The data format is a review, and I used microdata formatting to add all the necessary information to the page. The Structured Data Testing Tool picks up the rating, author information and review date. However, my face image is missing and the sections related to authorship are empty, even though I did everything recommended to link my Google+ profile to the website. Did I do something wrong? Or will I never see my face in the testing tool, but it will appear in the real SERPs?
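
    At the time, the author photo required a two-way link: the page links to the Google+ profile with rel=author, and the profile's "Contributor to" section links back to the site. A minimal sketch of the page side (the profile ID is a placeholder):

        <!-- Somewhere in the page or template -->
        <a href="https://plus.google.com/112345678901234567890?rel=author">Roman Matveev</a>

    If either direction of the link is missing, the testing tool typically shows the review data but leaves the authorship section empty.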

  • Displaying Google Analytics data on my website

    - by anon-user0
    I sell ad space on my websites directly to advertisers. I have a page where I want to show Google Analytics information that updates automatically, without me having to update it manually every day or every month. Something like this: http://wstats.net/en/website/riverplate.com#stat_trafic I don't want to use embedded or iframed third-party services. I know Google has a public API, and you can connect it to the Google Chart API to show pretty graphs. There is a tutorial by Google on how to do it here: https://developers.google.com/analytics/resources/articles/gdataAnalyticsCharts A few problems: I don't know much JavaScript, and the JavaScript seems to prompt for authentication as opposed to logging in automatically (from my understanding of the comments in the code). Does anyone know of a ready-made script that does what I am looking for, or how I can fix this code so it displays the Analytics info without authenticating? Thanks.
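
    The prompt appears because the tutorial's JavaScript authenticates as whoever is viewing the page. A common workaround is to fetch the numbers server-side on a schedule (where credentials can live safely) and publish them as a static file the page reads without any authentication. A minimal browser-side sketch, assuming a cron job on the server writes the figures to a hypothetical /stats.json:

        var xhr = new XMLHttpRequest();
        xhr.onload = function () {
          // stats.json is regenerated periodically by the server-side job.
          var stats = JSON.parse(xhr.responseText);
          document.getElementById('visits').textContent = stats.visits;
          document.getElementById('pageviews').textContent = stats.pageviews;
        };
        xhr.open('GET', '/stats.json');
        xhr.send();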

  • Website URL layout & structure for SEO & PR

    - by Junaid Saeed
    I have already mastered the skills of building a page to comply with SEO requirements. What I want to do now is tune the URL structure for the best SEO and PageRank. I run a wallpaper blog, wallz, which provides desktop wallpapers in different resolutions, and I am thinking about expanding the site to provide wallpapers for multiple devices (iPhone, Android, PC, etc.). My goal is to make things easy for users by detecting their devices and directing them to the relevant section of the site, while keeping my URLs SEO-friendly. I have the following two options for my URL structure:

        wallpapers.com/iphone/    or  iphone.wallpapers.com
        wallpapers.com/galaxyS3/  or  galaxys3.wallpapers.com

    Subdomain or subfolder, which one is the better option? I will have separate interfaces for every subdomain/subfolder based on the specific device. After this, the URL structure will be like:

        wallpapers.com/galaxyS3/Cars/Ferrari-550.html
        galaxys3.wallpapers.com/Cars/Ferrari-550.html

    What is the better way for me to proceed?
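
    Whichever structure wins, the device detection itself usually lives in the server config. A rough mod_rewrite sketch of the redirect for the subfolder variant (the user-agent patterns are illustrative, not exhaustive):

        RewriteEngine On
        # Send phones arriving at the front page to their device section.
        RewriteCond %{HTTP_USER_AGENT} iPhone [NC]
        RewriteRule ^$ /iphone/ [R=302,L]
        RewriteCond %{HTTP_USER_AGENT} Android [NC]
        RewriteRule ^$ /android/ [R=302,L]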

  • Is there any reason to allow Yahoo! Slurp to crawl my site?

    - by James Skemp
    I thought that a year or more ago Yahoo! switched to another search engine for its results and no longer used its own Slurp bot. However, on a couple of the sites I manage, Yahoo! Slurp continues to crawl pages, and it seems to ignore the Gone status code when returned (as it keeps coming back). Is there any reason why I wouldn't want to block Yahoo! Slurp via robots.txt or by IP (since it tends to ignore robots.txt in some cases anyway)? I've confirmed that when the bot does hit, it is from Yahoo! IPs, so I believe this is a legit instance of the bot. "Is Yahoo Search the same as Bing Search now?" is a related question, but I don't think it completely answers whether one should add a new block for the bot.
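
    For reference, the robots.txt side of the block is a single stanza; Slurp is the user-agent token Yahoo! documents for its crawler. Since the question notes the bot sometimes ignores robots.txt, an IP-level deny may still be needed as a backstop:

        User-agent: Slurp
        Disallow: /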

  • Is a backlink with a duplicate description and title from a news site bad for SEO?

    - by Dejan Pelzel
    I have a blog with over a thousand posts. I have posted some of them to a news aggregator site, including the same preview photo and description that I used on my own site plus a link back to the post. Since the posts are mainly videos and images, the description was usually a complete match of 4-6 lines of text. It now looks like I have been affected by Panda, and since I am not doing anything shady, I suspect it might be due to duplicate content. For example, when I search for the title of one of my posts, sometimes my site is not even returned, but the news aggregator site is. Could this be the problem with Panda?
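
    One common mitigation when the same description appears on an aggregator is to make sure every post on the original site declares its canonical URL, so the engines know which copy is authoritative. A minimal sketch (the URL is a placeholder):

        <!-- In the <head> of each post on the original blog -->
        <link rel="canonical" href="http://myblog.example.com/my-post.html" />

    Ideally the aggregator's copy would point its canonical at the original as well, though that is in the aggregator's hands.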

  • Chrome Web Store verification

    - by Vince V.
    A couple of days ago I created an extension for Chrome and added it to the store, and now I want to get it verified. I paid the 5 dollars and added my website to Webmaster Tools, where the site is verified. Today I wanted to add that URL to my extension (ultimately to do online installations), but the store doesn't recognize the URL I verified in Webmaster Tools. I tried refreshing and clicking "add site", but it just doesn't work. Is there some step I am missing, or is this a bug in the Chrome Web Store? I don't see any other option left.

  • Install Moodle to subdomain with Softaculous via cPanel

    - by Sean
    I installed Moodle to a directory with Softaculous. Since it doesn't allow installing to a subdomain, after installing I created a subdomain and pointed its destination to the previously created Moodle directory. Now when I go to subdomain.example.com it says: Incorrect access detected, this server may be accessed only through "http://example.com/moodle" address, sorry. Please notify server administrator. I must be doing something wrong; the installation was very similar to these instructions. Any suggestions would be much appreciated.
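
    That message is Moodle comparing the requested URL against the root URL hard-coded in its config.php, so pointing a subdomain at the directory isn't enough on its own. A sketch of the likely fix, assuming the subdomain should become the canonical address:

        <?php
        // In Moodle's config.php: wwwroot must match the URL visitors use.
        $CFG->wwwroot = 'http://subdomain.example.com';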

  • Parsing Google Site Speed in Analytics

    - by Kevin Burke
    I'm having a hard time making heads or tails of the Site Speed graphs in Google Analytics. Our site speed fluctuates wildly from month to month, despite a large sample (the report is "based on 100,000's of visits") and a consistent web setup (static files served from an EC2 instance running nginx behind a load balancer). Here's our site speed, with each data point representing a week's worth of data. Over this time period we modified our source and HTTP headers to increase cache hits on static resources by 5x. Why would it fluctuate so much? Is there any way to get more reliable information from these graphs?
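
    One thing worth ruling out: at the time, Site Speed was computed from a small sample of visits (1% by default), so the timing graphs could swing far more than the headline visit counts suggest. The classic ga.js tracker exposed a knob for this; a hedged sketch raising the sample to 10%:

        // In the ga.js tracking snippet, before _trackPageview.
        _gaq.push(['_setSiteSpeedSampleRate', 10]);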

  • CSS help positioning divs inline

    - by JaPerk14
    I need help with a recurring problem. I want to create a header that consists of 3 sections positioned inline, which I display using display: inline and float: left. The problem is that when I resize my browser window, the last div is pushed down and isn't displayed inline anymore. I know it sounds like I'm being picky, but I don't want the design to distort as visitors change their monitor size. I have provided the HTML and CSS I am working with below. Hopefully I have explained this well enough. Thanks in advance.

    HTML:

        <div class="masthead-wrapper">&nbsp;</div>
        <div class="searchbar-wrapper">&nbsp;</div>
        <div class="profile-menu-wrapper">&nbsp;</div>

    CSS:

        #Header { display: block; width: 100%; height: 80px; background: #C0C0C0; }
        .masthead-wrapper { display: inline; float: left; width: 200px; height: 80px; background: #3b5998; }
        .searchbar-wrapper { display: inline; float: left; width: 560px; height: 80px; background: #FF0000; }
        .profile-menu-wrapper { display: inline; float: left; width: 200px; height: 80px; background: #00FF00; }
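
    Floated boxes wrap to the next line as soon as their container gets narrower than their combined widths, 200 + 560 + 200 = 960px here. A minimal fix sketch, assuming the three divs sit inside #Header: give the container a matching minimum width so it stops shrinking past that point (at the cost of a horizontal scrollbar on narrower windows):

        #Header { min-width: 960px; }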

  • 301 URL rewrite loop

    - by anyvendetta
    I need a 301 rewrite to force all URLs to lowercase. In .htaccess I put (with RewriteMap lc int:tolower in httpd.conf):

        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:%{REQUEST_URI}} [R=301,L]

    Everything works just fine except for URLs with subcategories, which in this case look like /category-1256-Product-page-example.html, where the number 1256 refers to a subcategory. When I try to access /category-1256-Product-page-example.html I get a redirect loop error. I think other rewrite rules are causing the loop, but I don't know how to fix it, because it's just these rules that don't work with the rewrite above:

        RewriteRule ^main-site-url/category-([0-9]*)-([-_a-zA-Z0-9]*)\.html$ /subcategories.php?idcategory_main=1&idcategory=$1&category=$2 [L]
        RewriteRule ^main-site-url/([0-9]*)-([-_a-zA-Z0-9]*)-([0-9]*)\.html$ /file.php?idcategory_main=1&idsubcategory=$1&product=$2&idproduct=$3 [L]
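
    In per-directory (.htaccess) context, the rule set is re-run after every internal rewrite, so once a request has been rewritten to subcategories.php or file.php the lowercase rule can fire again on the new URI and loop with the [L] rules. One common fix is to restrict the lowercase redirect to the original client request; a sketch:

        # REDIRECT_STATUS is empty on the original request and set once
        # mod_rewrite has internally rewritten it.
        RewriteCond %{ENV:REDIRECT_STATUS} ^$
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule . ${lc:%{REQUEST_URI}} [R=301,L]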

  • Website still blocked after hack

    - by dotman14
    I manage a website that was hacked a few months ago (I wasn't the webmaster then), when it was running on Joomla. I have managed to redo the website with custom code (PHP/MySQL), but some visitors still complain that their antivirus blocks them from viewing it. I have also cleared the former database and everything related to it, content and the like. My website is here. I have looked for malware in Google Webmaster Tools, but it says there is none, and I also checked with Google Safe Browsing. What could the problem be?

  • Using multiple A-records for my domain - do web browsers ever try more than one?

    - by Jonas
    If I add multiple A records for my domain, they are returned in round-robin order by DNS servers, e.g.:

        1.1.1.1 A example.com
        1.1.1.2 A example.com
        1.1.1.3 A example.com

    But how do web browsers react if the first host (1.1.1.1) is down or unreachable? Do they try the second host (1.1.1.2), or do they return an error message to the user? Are there any differences between the most popular browsers? If I implement my own application, I can make it use the second address when the first is down, so it's certainly possible, and it would be very helpful for building a fault-tolerant website.
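
    For the "my own application" case, the fallback is straightforward: resolve every A record, then walk the list until one address answers. A minimal Node.js sketch (hostname and path are placeholders):

        var dns = require('dns');
        var http = require('http');

        function fetchWithFallback(hostname, path, addresses, done) {
          if (addresses.length === 0) return done(new Error('all hosts down'));
          var req = http.get(
            { host: addresses[0], path: path, headers: { Host: hostname } },
            function (res) { done(null, res); }
          );
          // On connection failure, retry with the next address.
          req.on('error', function () {
            fetchWithFallback(hostname, path, addresses.slice(1), done);
          });
        }

        dns.resolve4('example.com', function (err, addresses) {
          if (err) throw err;
          fetchWithFallback('example.com', '/', addresses, function (err, res) {
            if (err) throw err;
            console.log('answered with status ' + res.statusCode);
          });
        });

    Note this only covers outright connection errors; a production version would also want timeouts.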

  • Strategy for managing lots of pictures for a website

    - by Nate
    I'm starting a new website that will (hopefully) have a lot of user-generated pictures, and I'm trying to figure out the best way to store and serve them. The CMS I'm using (Umbraco) has a media library that puts a folder on the server for each image; inside it you can have different sizes of the same image. That folder has an ID on it, and the database holds additional information for the image along with the ID of the folder. This works great for small sites, but what if the pictures get up to 10,000, 100,000 or 1,000,000? It seems like the lookup on the directory would take a long time to find the correct folder. I'm on Windows 2008, if that makes a difference. I'm not so worried about load; I can load-balance my server pretty easily and replicate the images across the servers. The nature of the site won't put a lot of users on it either, but it could have a lot of pics. Thanks. -Nate EDIT: After some thought, I think I'm going to create a directory for each user under a root image folder, then keep each user's pictures under that. I would be pretty stoked if I had even 5,000 users, so that shouldn't be too bad of a linear lookup. If it does get slow, I will break it down into folders like /media/a/adam/image123.png. If it ever gets really big, I will expand that method into a bigger tree. That would take a LOT of content, though.
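
    The bucketing scheme from the edit is a one-liner to compute, which makes it easy to adopt later without touching the database records, since the path can always be derived from the username. A minimal Node.js sketch:

        var path = require('path');

        // Bucket each user's folder under the first letter of the username,
        // e.g. mediaPath('/media', 'adam', 'image123.png') gives
        // /media/a/adam/image123.png (separators per platform).
        function mediaPath(root, username, filename) {
          var bucket = username.charAt(0).toLowerCase();
          return path.join(root, bucket, username, filename);
        }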

  • How to enable customers to use their own domain for sites hosted by me

    - by Scott
    I am thinking of running a self-serve site builder, but I was wondering how I would allow customers to use domains that they already own. Is that even possible? Let's say my site is www.bestsitebuildingwebsite.com and each customer has a URL like this:

        www.bestsitebuildingwebsite.com/frances
        www.bestsitebuildingwebsite.com/eden
        www.bestsitebuildingwebsite.com/john

    Now a customer has a domain called widgets.com. Is it actually possible for widgets.com to go to my site somehow and still have hashes in the URL work (my site makes use of hashes for AJAX queries)? And would their site still have good SEO with Google? Thanks Scott
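
    It is possible: the customer points widgets.com at your server (via a CNAME or A record), and your server picks the right site from the Host header; this is ordinary name-based virtual hosting. Hash fragments keep working because they never leave the browser; they are not sent to the server at all. A hedged Apache sketch (paths and names are placeholders):

        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerName widgets.com
            DocumentRoot /var/www/bestsitebuildingwebsite/sites/frances
        </VirtualHost>

    For SEO, search engines would then see widgets.com as its own site; the usual caveat is to avoid serving the same content at both widgets.com and the /frances path without a canonical.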

  • Has anyone been able to convert a site's media content from Flash to HTML5?

    - by Muhammad
    I'm looking to convert a site from Flash-based video to HTML5. The current video uses time marks to display slides (kind of like how YouTube shows ads on its videos), but unlike YouTube the slides don't appear inside the video; they are displayed next to it. Is there any way I can accomplish this with HTML5, or do I have to use JavaScript for it? If this isn't clear enough, please let me know.
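
    HTML5 provides the <video> element, but reacting to playback time is JavaScript's job: listen for the timeupdate event and swap the slide shown beside the player. A minimal sketch (file names and time marks are placeholders):

        <video id="talk" src="talk.mp4" controls></video>
        <img id="slide" src="slide1.png" alt="">
        <script>
          // Each entry: show `src` once playback passes `t` seconds.
          var marks = [{ t: 0, src: 'slide1.png' }, { t: 30, src: 'slide2.png' }];
          var video = document.getElementById('talk');
          var slide = document.getElementById('slide');
          video.addEventListener('timeupdate', function () {
            for (var i = marks.length - 1; i >= 0; i--) {
              if (video.currentTime >= marks[i].t) {
                if (slide.getAttribute('src') !== marks[i].src) {
                  slide.src = marks[i].src;
                }
                break;
              }
            }
          });
        </script>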

  • Apache HTTPS ProxyPass certificate location

    - by oz1cz
    I'm trying to set up an Apache server that uses ProxyPass to pass HTTPS requests on to another server. Let's call the proxy server ALPHA and the target server BETA. ALPHA does not run HTTPS, but BETA does. I first tried this virtual host specification on ALPHA:

        <VirtualHost *:443>
            ServerName mysite.com
            ProxyPass / https://192.168.1.105/         # BETA's IP address
            ProxyPassReverse / https://192.168.1.105/  # BETA's IP address
            ProxyPreserveHost On
            ProxyTimeout 600
            SSLProxyEngine On
            RequestHeader set Front-End-Https "On"
            CacheDisable *
        </VirtualHost>

    But when I tried this, Apache complained: "[error] Server should be SSL-aware but has no certificate configured [Hint: SSLCertificateFile]". I had to copy the SSL certificate from BETA to ALPHA and add these lines to the host specification on ALPHA:

        SSLEngine on
        SSLCertificateKeyFile /usr/local/ssl/private/BETA_private.key
        SSLCertificateFile /usr/local/ssl/crt/BETA_public.crt
        SSLCertificateChainFile /usr/local/ssl/crt/BETA_intermediate.crt

    Now the system works, but I have a feeling I have done something wrong or unnecessary. The website's private key and certificate are now lying on both ALPHA and BETA. Is that necessary? Should I have done it differently?
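
    Copying BETA's key isn't strictly required; what clients need is for ALPHA, where their TLS connection terminates, to present a valid certificate for mysite.com. That can be ALPHA's own certificate for the name, while the proxy leg to BETA is an independent TLS session against whatever certificate BETA presents. A hedged sketch of the usual split (file paths are placeholders):

        # On ALPHA: terminate TLS for mysite.com with ALPHA's own cert.
        SSLEngine on
        SSLCertificateKeyFile /usr/local/ssl/private/mysite.key
        SSLCertificateFile /usr/local/ssl/crt/mysite.crt
        # For the separate ALPHA -> BETA HTTPS connection.
        SSLProxyEngine On

    The private key then only needs to exist on whichever machine presents the matching certificate.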

  • 1 Google Analytics account or top-level domain + profiles for sub-domains vs. 1 account for each sub-domain

    - by Eric Nguyen
    We have the following websites:

        - An online magazine, Singapore edition: sg.abc.com
        - The same online magazine, Malaysia edition: my.abc.com
        - Forums around the same subjects as the online magazine, but functioning independently: forums.abc.com
        - A classifieds site, also around the same subjects, but functioning independently: directory.abc.com

    Each of the above websites currently has its own Google Analytics account, and abc.com has a separate Google Analytics account too. sg.abc.com has the most traffic and generates the most revenue. Are there any practical benefits to merging all the above subdomains under abc.com? I can think of more reliable analytics and consistency for sure. Are there more? Cross-sales?

  • Breadcrumbs using schema.org rich snippets

    - by Adam Jenkin
    I am having problems implementing the breadcrumb rich snippets from schema.org. When I construct my breadcrumb following the documentation and run it through the Google Rich Snippet Testing Tool, the breadcrumb is identified but not shown in the preview:

        <!DOCTYPE html>
        <html>
        <head>
          <title>My Test Page</title>
        </head>
        <body itemscope itemtype="http://schema.org/WebPage">
          <strong>You are here: </strong>
          <div itemprop="breadcrumb">
            <a title="Home" href="/">Home</a> >
            <a title="Test Pages" href="/Test-Pages/">Test Pages</a> >
          </div>
        </body>
        </html>

    If I change to the snippets from data-vocabulary.org, the rich snippets show correctly in the preview:

        <!DOCTYPE html>
        <html>
        <head>
          <title>My Test Page</title>
        </head>
        <body>
          <strong>You are here: </strong>
          <ol itemprop="breadcrumb">
            <li itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
              <a href="/" itemprop="url">
                <span itemprop="title">Home</span>
              </a>
            </li>
            <li itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
              <a href="/Test-Pages/" itemprop="url">
                <span itemprop="title">Test Pages</span>
              </a>
            </li>
          </ol>
        </body>
        </html>

    I want the breadcrumb to be shown in the search result rather than the URL of the page. Given that schema.org is the recommended way to use rich snippets, I would rather use it, but since the breadcrumb is not showing in the preview of the search result with this method, I'm not convinced it's working correctly. Am I doing something wrong in the markup of the schema.org example?

  • Rich Snippets - LocalBusiness - Photos - Correct Implementation

    - by user32622
    Does somebody know how this is supposed to be implemented correctly? On my local business page I have a carousel with several images, so what I did is put the following on the container of the carousel:

        <div class="tourism-product-media-gallery" itemprop="photos" itemscope itemtype="http://schema.org/ImageObject">

    and then the following on each and every image:

        <img src="@mediaItem.NormalImage" alt="@mediaItemCaption" itemprop="contentURL"/>

    But I am not convinced that this is the way it should be. Does anyone have any insight or more knowledge on this? Thanks. Note: here are the results from the Rich Snippet Google Testing Tool: click here
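
    In schema.org terms, photos is a repeatable property of the business, and each photo is its own ImageObject, so the itemscope usually belongs on a per-image wrapper rather than once on the whole carousel. A hedged sketch of that shape (attribute values are placeholders; note schema.org spells the property contentUrl):

        <div class="tourism-product-media-gallery">
          <figure itemprop="photos" itemscope itemtype="http://schema.org/ImageObject">
            <img itemprop="contentUrl" src="photo1.jpg" alt="First photo">
          </figure>
          <figure itemprop="photos" itemscope itemtype="http://schema.org/ImageObject">
            <img itemprop="contentUrl" src="photo2.jpg" alt="Second photo">
          </figure>
        </div>

    This assumes the surrounding page marks the business itself up as a LocalBusiness item, so the photos property has a parent item to attach to.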

  • Preventing search engines from indexing duplicate content

    - by umesh awasthi
    I am in the process of migrating my old domain (www.oldurl.com) to a new domain (www.newurl.com). Almost all the content and URL structure, as well as the database, are the same apart from a few URLs; the only real difference will be the domain name. I have made entries in Apache's .htaccess file to set up 301 redirects, and I have currently blocked all search engines from crawling my new domain via robots.txt. I am not sure how to handle the duplicate-content issue when the new domain goes live. Should I block search engines from indexing/crawling my old domain? I am new to this field and not sure whether this is actually a duplicate-content issue at all.
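
    With permanent redirects in place there is normally no duplicate-content problem to worry about; the key is that the old domain must stay crawlable so the engines can follow the 301s and transfer the URLs, which also means the robots.txt block on the new domain should be lifted at launch. A sketch of the per-URL redirect on the old domain:

        RewriteEngine On
        # Map every path on the old host onto the same path on the new one.
        RewriteCond %{HTTP_HOST} ^(www\.)?oldurl\.com$ [NC]
        RewriteRule ^(.*)$ http://www.newurl.com/$1 [R=301,L]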

  • Is the Google Webmaster Tools verification temporary?

    - by Senseful
    When you add a site to Google Webmaster Tools, it asks you to verify it (e.g. via a <meta> tag). I verified a site a while ago, but when I logged in, I noticed that it isn't verified anymore. The history shows that it was verified 58 days ago, but then 30 days ago a check failed, saying "reverification failed". I'm not sure if this is the result of some setting I changed that required reverification, or if Google Webmaster Tools periodically re-verifies sites. I was under the impression that verification only happens once, when you add the site, and that you can then delete the <meta> tag. If that's not how it works and it does re-verify periodically, will it require a different <meta> tag value, or can I keep the original one and never have to worry about it again?

  • What naughty ways are there of driving traffic?

    - by Tom Wright
    OK, so this is purely for my intellectual curiosity, and I'm not interested in illegal methods (no botnets, please). But say, for instance, that some organisation incentivised link sharing in a bid to drive publicity. How could I drive traffic to my link? Obviously I could spam all my friends on social networking sites, which is what they want me to do, but that doesn't sound as fun as trying to game the system. (Not that I necessarily dispute the merit of this particular campaign.) The ideas I've come up with so far (in order of increasing deviousness) include:

        - Link-dropping: this is too close to what they want me to do to be devious, but I've done it here (sorry) and on Twitter. I'm subverting it slightly by focusing on the game aspects rather than their desired message.
        - AdWords: not very devious at all, but effectively free with the vouchers I've accrued. That said, I must be pretty poor at choosing keywords, because I've seen very few hits (~5) so far.
        - Browser testing websites: the target has a robots.txt which prevents Browsershots from processing it, but I got around this by including it in an iframe on a page that I hosted.

    But my creative juices have run dry, I'm afraid. Does anyone have any cheeky/devious/cunning/all-of-the-above ideas for driving traffic to my page?
