Search Results

Search found 9757 results on 391 pages for 'shekhar pro'.


  • Google Places Review - Owner or Business Name?

    - by Svengali
    I currently manage a Google Places business page, where we have received several reviews. I want to respond to these reviews, but I want to respond as the company, not as myself. I am the 'owner' of the page, though it is registered through my personal Gmail address. On Facebook, when replying to comments and posts by other users on a business's wall, you reply as the actual business, not under your personal name. Is this the case with Google Places? If not, how can I make sure the replies are posted under the company name, not my personal name?

    Read the article

  • Want to change host account for Google Apps email?

    - by Sathyam Shivam Sundaram
    I have a Standard (free) Google Apps email account, and we have been using this service for the last 5 months. Our website was hosted with a third-party web hosting company. Now I am planning to change my web hosting provider, but I want to keep my domain with the previous company. Does Google Apps allow changing the web host for a domain that is registered with the Google Apps email service? Has anybody done this?

    Read the article

  • Google Analytics Funnel Step Regular Expression Not Working

    - by scoarescoare
    The first step in a funnel is going to have a dynamic ending fragment. Examples: http://mysite.com/invite/tickle-party http://mysite.com/invite/pajama-party http://mysite.com/invite/puppy-party To allow for such dynamism, I provided this URL for step one: \invite(.*) My goals work, but the funnel visualization report shows 0 for everything. I know this problem is due to the regex in the funnel step, because I copied this entire goal but replaced \invite(.*) with /invite/puppy-party, and when I hardcoded /invite/puppy-party the funnel worked as expected. Why is my funnel report not working with my original funnel step URL parameter?
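
    For reference, the leading backslash in the pattern above escapes the letter i (at best a no-op, at worst an invalid escape, depending on the regex engine) rather than matching the path separator, which is a forward slash. A funnel step pattern that matches all of the invite URLs would look something like this sketch, assuming the goal and its funnel steps are set to regular-expression matching:

        ^/invite/.*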

    Read the article

  • SEO for replacing blog content, but keeping the same page URL

    - by cphill
    This might not have any major impact on SEO, but basically I have a random blog at this URL: http://example.com/blog (not a real URL), which I am removing and replacing with a company blog. I want to keep using the http://example.com/blog URL address, but I'm not sure how this would affect my SEO, since the random blog content I am removing lives under the example.com/blog URL prefix. Would I just add 301 redirects for those old blog articles and leave the base /blog URL without any redirects?
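
    For illustration, per-article 301s can be declared while /blog itself simply serves the new content; a minimal sketch, assuming an Apache host with mod_alias and using hypothetical old and new slugs:

        # .htaccess: redirect individual old posts (hypothetical slugs) to new homes
        Redirect 301 /blog/old-personal-post /blog/company-announcement
        Redirect 301 /blog/another-old-post /blog/company-product-update
        # No rule for /blog itself, so it serves the new company blog directly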

    Read the article

  • Does Google sometimes prevent new white-hat sites from ranking at all in some verticals?

    - by JVerstry
    Assume someone wants to launch a new viagra or acai berry e-commerce website. There is a lot of competition, and this site does not really bring anything new other than another online counter to buy products at a nice price. Assume this site does not use any black-hat techniques and stays within Google's quality guidelines, and assume it has no (or few) backlinks, and those only from non-authoritative websites. Assume this website's pages are indexed properly in Webmaster Tools, no penalties are reported, and no site improvements are suggested. Google crawls the site daily, as reported in GWT, and there are no robots.txt configuration issues. Does Google sometimes decide not to rank such a site for any user query (for weeks) because of its lack of original content? The reason I am asking is that I am trying to understand the possible cause of a similar situation I am observing with two sites. If so, what is the way out to start ranking for these sites? If not, does it mean the cause is certainly elsewhere? Any confirmed information to get out of the maze is welcome.

    Read the article

  • Is it good to buy multiple domains for competitive reasons?

    - by lowestofthekeys
    I am attempting to convince the higher-ups at my company that spending $55 a year to renew a single domain is wasteful when they end up holding 3-4 domain names for one website. Their reasoning for doing so is to keep these domain names out of the hands of the competition. For example, the company name is Pie Consulting & Engineering. They want to buy up pieforensicconsulting.com to keep it out of the hands of a competitor (we also do forensic engineering). Could a competitor use that domain in any kind of diabolical way? I figure if someone is typing pieforensicconsulting into the URL field, they know what they're looking for, and if it redirects to another company, they're not just going to stay on that site.

    Read the article

  • What is recommended minimum object size for gzip benefits?

    - by utt73
    I'm working on improving page display times, and one of the methods is to gzip content from the web server. Google recommends: "Note that gzipping is only beneficial for larger resources. Due to the overhead and latency of compression and decompression, you should only gzip files above a certain size threshold; we recommend a minimum range between 150 and 1000 bytes. Gzipping files below 150 bytes can actually make them larger."

    We serve our content through Akamai, using their network as a proxy and CDN. What they've told me: "Following up on your question regarding the minimum size at which Akamai will compress a requested object when sending it to the end user: the minimum size is 860 bytes."

    My reply: "What is the reason Akamai's minimum size is 860 bytes? And why, for example, is this not the case for files Akamai serves for Facebook? (see below) Google recommends gzipping more aggressively, and that seems appropriate on our site, where the most frequent hits, by far, are AJAX calls that are under 860 bytes."

    Akamai's response: "The reason 860 bytes is the minimum size for compression is twofold: (1) the overhead of compressing an object under 860 bytes outweighs the performance gain; (2) objects under 860 bytes can be transmitted in a single packet anyway, so there isn't a compelling reason to compress them."

    So I'm here for some fact checking. Is the 860-byte limit, due to packet size, the end of this reasoning? Why would high-traffic sites push this lower, closer to the 150-byte limit? Just to save on bandwidth costs, or is there a performance gain in doing so?
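
    For context, this threshold is an ordinary tunable in most web servers; a minimal sketch of how it would be set in nginx (assuming nginx; Apache's mod_deflate has an equivalent), using the 860-byte value Akamai cites:

        # nginx: compress only responses of 860 bytes or more
        gzip on;
        gzip_min_length 860;
        gzip_types text/plain text/css application/json application/javascript;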

    Read the article

  • django & postgres linux hosting (with SSH access) recommendations

    - by Justin Grant
    We're looking for a good place to host our custom Django app (a fork of OSQA) and its PostgreSQL backend. Requirements include:

    - Linux
    - Python 2.6 or (ideally) Python 2.7
    - Django 1.2
    - Postgres 8.4 or later
    - DB backup/restore handled by the hoster, not us
    - OS & dev-platform-stack patching/maintenance handled by the hoster, not us
    - SSH access (so we can pull source code from GitHub, install Python eggs, etc.)
    - ability to set up cron jobs (e.g. to send out daily email updates)
    - ability to send up to 10K emails/day
    - good performance (not ganged up with a zillion other sites on one CPU, not starved for RAM)
    - FTP or SCP access to web logs
    - dedicated public IP
    - SSL support
    - costs under $1000/month for a relatively small site (<5M pageviews/month)
    - good customer service

    We already have a prototype site running on EC2 on top of a Bitnami DjangoStack. The problem is that we have to patch the OS, patch Postgres, etc. We'd really prefer a platform-as-a-service (PaaS) offering, like Heroku offers for Rails apps, where all we need to worry about is deploying our code instead of system software patching and maintenance. Google App Engine is closest to what we're looking for, but they don't offer relational DB access (not yet, at least). Anyone have a recommendation?

    Read the article

  • Changed URL from non-www to www: Google indexing

    - by user20321
    I recently (about 1 week ago) changed my URL from the non-www version to the www version. I told my hosting company to do this, and they did it successfully: all my URLs are now directed to the www version. But Google is still indexing my non-www version in the search results. I have updated new content on my website, and Google indexes that content with the changed URL, i.e. with the www prefix, but the main page, i.e. the site name, is still shown without www and is not updated. I have checked that www.sitename.com is listed on Google but not shown when I type www.sitename.com. So how much time does it take to remove the old URLs from the index and replace them with the new ones?
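
    For reference, the host-level redirect a hosting company typically sets up for this looks something like the sketch below (assuming Apache with mod_rewrite; setting the preferred domain in Google Webmaster Tools also speeds up the canonical switch):

        # 301-redirect every non-www request to the www hostname
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^sitename\.com$ [NC]
        RewriteRule ^(.*)$ http://www.sitename.com/$1 [R=301,L]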

    Read the article

  • Strange robots.txt - how and why did it get there?

    - by Mick
    I recently created a very simple, pure-HTML website which I have hosted with "hostmonster". Hostmonster had very good reviews on some comparison websites, and in general so far they appear to be perfectly good in every way... at least I thought so until just now. I have been making lots of edits to my site on an almost daily basis. My site now appears on the first page (7th on the list) for my most important keyphrase in a Google search. But I did notice a problem with the snippet chosen by Google. I asked a question on this site about snippets and got some great answers. I then made some modifications to my metadata, and within 48 hours the Google snippet for my search was perfect. The odd thing, though, was that the "cached" version Google had still appeared to be very old, from about three weeks previous. This seemed very odd: how could the Google robots have read my new metadata without updating the cache? This puzzled me greatly. Just now it occurred to me that maybe I had some goofy setting in my robots.txt file. I didn't actually remember even making one, but I thought I'd have a look just in case. Much to my horror, I saw that there was a robots.txt, and it contained the disturbing text below:

        sitemap: http://cdn.attracta.com/sitemap/728687.xml.gz

    Intuitively this looks like some kind of junk or spam trick, and I had indeed been getting some spam from "attracta". So my questions are:
    1. Should I simply delete this robots.txt?
    2. Was the file there all along, placed there because of some commercial tie-in between Attracta and Hostmonster?
    3. Does the Attracta robots file explain the lack of re-caching?
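
    For comparison, a deliberately minimal robots.txt that allows all crawling and declares no third-party sitemap would look like this (a sketch; a Sitemap line is optional and, if used, should point at your own sitemap):

        User-agent: *
        Disallow: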

    Read the article

  • separate domains vs subdomains [duplicate]

    - by Sharon
    This question already has an answer here: Registering multiple domains vs. subdomains (5 answers)

    We manufacture a very versatile product used in a wide variety of products under multiple brands. In order to market these brands, should we create a separate domain for each brand/market, or use subdomains of our well-established main domain? What would be best for SEO without breaking the bank?

    Read the article

  • Cannot add DataTables.net javascript into Joomla 1.5

    - by mfmz
    I've been having this problem where I couldn't add the DataTables.net JavaScript to my Joomla article. I have been trying to include it through Jumi. To say that my editor strips out the tags is not quite right, as I have been able to run the Google Chart API in Joomla, which also uses JavaScript. Any clue why? The code is below:

        <link href="//datatables.net/download/build/nightly/jquery.dataTables.css" rel="stylesheet" type="text/css" />
        <script src="http://code.jquery.com/jquery-1.9.1.min.js"></script>
        <script src="//datatables.net/download/build/nightly/jquery.dataTables.js"></script>
        <script type="text/javascript">
        $(document).ready( function () {
            var table = $('#example').DataTable();
        } );
        </script>
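
    One common culprit worth ruling out: Joomla 1.5 ships MooTools, which also claims the $ symbol, so loading a second library can clobber jQuery's alias. A sketch of the same initialisation in jQuery's noConflict mode, assuming the link and script tags above have already loaded:

        <script type="text/javascript">
        // Release $ back to MooTools and address jQuery by its full name
        jQuery.noConflict();
        jQuery(document).ready(function () {
            var table = jQuery('#example').DataTable();
        });
        </script>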

    Read the article

  • multiple godaddy domains to home router then reverse proxy to multiple internal servers [closed]

    - by Dan
    I need someone to steer me correctly, advising on all required components. I have multiple domains with GoDaddy, say site1.com, site2.com, site3.net. I have multiple home LAMP servers: say, lampsrv1 = 192.168.0.2:8080 (Windows) and lampsrv2 = 192.168.0.3:9080 (Linux srv2). I would like server1.site1.com to point to lampsrv1 and server2.site1.com to point to lampsrv2. I may also want server1.site2.com to point to my lampsrv1 as an option. My thinking is that I have a dedicated Linux srv1 behind the router running a reverse proxy, i.e. Apache or NGINX or equivalent, directing to the appropriate LAMP server. It's the GoDaddy subdomains, CNAMEs, redirections, etc. that I'm having a challenge with for starters. I have tested Apache with virtual servers but can't get proxying to work based on Host header info; requests seem to go to one address, making me think it's actually the Apache reverse proxying that's not quite working. Finally, to add to this, my router has a dynamic IP, but the lease lasts quite a while; that would be my final piece. I'm sure this might be a popular question, but I can't seem to piece this together. I need someone who has actually configured this scenario to advise, but I will take other suggestions. Please indicate if you have successfully configured this.
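
    For orientation, Host-header-based reverse proxying in Apache hinges on one VirtualHost per public hostname, each proxying to a different internal address. A minimal sketch, assuming mod_proxy and mod_proxy_http are enabled and using the hostnames and addresses from the question:

        # Apache 2.2 additionally needs: NameVirtualHost *:80
        <VirtualHost *:80>
            ServerName server1.site1.com
            ProxyPass        / http://192.168.0.2:8080/
            ProxyPassReverse / http://192.168.0.2:8080/
        </VirtualHost>

        <VirtualHost *:80>
            ServerName server2.site1.com
            ProxyPass        / http://192.168.0.3:9080/
            ProxyPassReverse / http://192.168.0.3:9080/
        </VirtualHost>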

    Read the article

  • Redirect Permanent and https

    - by Clem
    I just set up HTTPS on my server, and I have an issue with Redirect permanent. If I have a link, for example http://domain.com/index.html, it redirects me to https://www.domain.comindex.html. The / is missing, and I can't figure out how to fix it. It works with http://www.domain.com/index.html. Here is my httpd.conf:

        <VirtualHost *:80>
            ServerName domain.com
            Redirect permanent / https://www.domain.com/
        </VirtualHost>

        <VirtualHost *:80>
            ServerName www.domain.com
            Redirect permanent / https://www.domain.com/
        </VirtualHost>

        <VirtualHost *:443>
            DocumentRoot /var/www/domain/
            ServerName www.domain.com
            SSLEngine on
            SSLCertificateFile ssl.crt
            SSLCertificateKeyFile ssl.key
        </VirtualHost>
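
    As a point of comparison, the same canonicalisation is sometimes written with RedirectMatch, which makes the slash handling explicit in the pattern (a sketch; with a trailing slash on the target the behaviour should be equivalent):

        <VirtualHost *:80>
            ServerName domain.com
            # Capture the path without its leading slash and re-attach it explicitly
            RedirectMatch permanent ^/(.*)$ https://www.domain.com/$1
        </VirtualHost>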

    Read the article

  • How to overcome politics of the net (Google Translate code refuses to work from a specific region)

    - by Jawad
    According to the FAQs I am not sure if my question is OK to ask, or will be closed, or whether I should post it on meta; I might even blame someone for downvoting it. However, it is one that has been bugging me since the trouble started. Let me explain. I have this website. It uses the Google Translate API (I can't post the link; it does not open from this region) with the following code:

        <meta name="google-translate-customization" content="9f841e7780177523-3214ceb76f765f38-gc38c6fe6f9d06436-c"></meta>
        <script type="text/javascript">
        function googleTranslateElementInit() {
          new google.translate.TranslateElement({pageLanguage: 'en'}, 'google_translate_element');
        }
        </script>
        <script type="text/javascript" src="http://translate.google.com/translate_a/element.js?cb=googleTranslateElementInit"></script>

    The problem is that since this, it just stopped working. On the site you can see that I had to actually remove the above from here, here, and here, while leaving it here, here, here and here. This is because the website "refuses" to load at all on the pages that have the code (i.e., from this region). If I use the Firefox Stealthy plugin and open the site in Firefox, it works like a charm without any problems. But with Google Chrome, Apple Safari and the Opera web browser, the site does not load/open at all because of the Google Translate code. (I know this because if I remove the Google Translate code, the site works/loads fine.) It was one thing to program for "cross-browser compatibility" and altogether another to program for "cross-region compatibility". What can I do to make sure that the site works from anywhere? Do I completely remove the Google Translate code and just do without the additional functionality, or do I look for alternatives like this, or according to this?
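
    One pattern worth noting: a plain <script src="..."> from an unreachable host blocks page rendering until the connection times out, which matches the symptom described. A sketch of loading the same widget asynchronously, so the rest of the page renders even when translate.google.com is blocked (assuming the element ID and callback from the snippet above):

        <script type="text/javascript">
        function googleTranslateElementInit() {
          new google.translate.TranslateElement({pageLanguage: 'en'}, 'google_translate_element');
        }
        // Inject the loader asynchronously so a blocked host cannot stall rendering
        (function () {
          var s = document.createElement('script');
          s.src = 'http://translate.google.com/translate_a/element.js?cb=googleTranslateElementInit';
          s.async = true;
          document.getElementsByTagName('head')[0].appendChild(s);
        })();
        </script>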

    Read the article

  • How to create a good sitemap for dynamic website

    - by Saif Bechan
    I have a website with dynamic content and different kinds of pages. Some pages rarely change, and some, like blog pages, change often. The blog pages also have links for sorting, for example sorting by date, ascending or descending. Some of the pages also have links to tabbed content, and links that are just anchors. Now, when I use an XML sitemap generator, all of these links are thrown into the sitemap, and I don't think they are all really relevant. The blog posts up until now are also included in the sitemap. Is this really necessary? I think the links to the blog posts can be indexed just fine on their own. Is the best way to make a sitemap to manually assign just the main menu links to it, or is indexing everything really recommended?
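
    For reference, a hand-curated sitemap that lists only canonical pages, with no sort-order variants or anchor links, would look something like this minimal sketch (hypothetical URLs):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://example.com/</loc>
            <changefreq>monthly</changefreq>
          </url>
          <url>
            <loc>http://example.com/blog/</loc>
            <changefreq>daily</changefreq>
          </url>
          <!-- one entry per canonical blog post; omit ?sort=... variants -->
          <url>
            <loc>http://example.com/blog/some-post</loc>
          </url>
        </urlset>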

    Read the article

  • Understanding the maximum hit-rate supported by a web-server

    - by SNag
    I would like to crawl a publicly available site (one that's legal to crawl) for a personal project. From a brief trial of the crawler, I gathered that my program hits the server with a new HTTP request 8 times per second. At this rate, per my estimate, obtaining the full set of data I need would take about 60 full days of crawling. While the site is legal to crawl, I understand it can still be unethical to crawl at a rate that causes inconvenience to the regular traffic on the site. What I'd like to understand here is: how high is 8 hits per second for the server I'm crawling? Could I possibly do 4 times that (by running 4 instances of my crawler in parallel) to bring the total effort down to just 15 days instead of 60? How do you find the maximum hit-rate a web server supports? What would be the theoretical (and ethical) upper limit for the crawl rate so as to not adversely affect the server's routine traffic?
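
    For illustration, politeness is usually enforced client-side with a fixed delay between requests, whatever rate you settle on. A minimal Python sketch (hypothetical URL list; the third-party requests library is assumed available):

        import time
        import requests

        URLS = ["http://example.com/page/%d" % i for i in range(1, 100)]
        DELAY = 0.5  # seconds between requests, i.e. at most 2 requests/second

        for url in URLS:
            response = requests.get(url, timeout=10)
            # ... parse/store response.text here ...
            time.sleep(DELAY)  # be polite: cap the request rate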

    Read the article

  • What does it take to successfully run a URL shortener service [on hold]

    - by MxyL
    What are the costs (technology-wise) of running a URL shortener service such as bit.ly or anonym.to? For example, if I decided to use an inexpensive shared host with "unlimited" bandwidth, would that be feasible, or would I need dedicated hosting? I found this question: "I want to run a URL shortener for my own usage, what do I need to do?", which makes it easy to set one up, but I'm not too clear on what kinds of things I need to consider.

    Read the article

  • Cost effective way to provide static media content

    - by james
    I'd like to be able to deliver around 50MB of static content, either as about 30 individual files of up to 10MB each or grouped into 3 compressed files, around 5k to 20k times a day. Ideally I'd like to put some sort of very basic security around the data to ensure that a request comes from the expected source, but if dropping the security for a big reduction in price is possible, then it's an option. Does anyone have any suggestions beyond what I've found? Google App Engine is $0.12/GB and I believe has a file size limit of 10MB, so I'd have to break the data up a bit; a rough calculation suggests this would cost me about $30 to $120 a day. I've also seen what seems to be plain public static content delivery with no logic capabilities, like Usenet.nl, at what I calculate to be about $0.025/GB, which would cost me about $6 to $25 a day. Any idea if I'm going about these calculations right, and whether there might be a better option for just static content at decently high delivery volume? Again, some basic security would be great, but if cost is greatly reduced without it, then I'm up for that.
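
    On the "very basic security" point, the usual lightweight approach is a signed URL: the issuing site appends an HMAC of the file path and an expiry time, and the delivery endpoint recomputes it before serving. A minimal Python sketch with a hypothetical secret and path:

        import hashlib
        import hmac
        import time

        SECRET = b"shared-secret-between-issuer-and-delivery"  # hypothetical

        def sign(path, expires):
            # The token covers the path and expiry so neither can be tampered with
            msg = ("%s|%d" % (path, expires)).encode()
            return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

        def make_url(path, ttl=3600):
            expires = int(time.time()) + ttl
            return "%s?expires=%d&token=%s" % (path, expires, sign(path, expires))

        def verify(path, expires, token):
            return time.time() < expires and hmac.compare_digest(token, sign(path, expires))

        print(make_url("/bundles/data-part1.zip"))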

    Read the article

  • Silverstripe: How can I disable comments?

    - by SamIAm
    My client's site is built in SilverStripe. There is a news page, and it allows people to leave comments. Unfortunately we've got loads of spam. I'm new to this: is there any way we can disable the comment field by default, and how do I do it? Alternatively, is there an easy way for me to install spam protection? Update: because this is someone else's code, I just realised that they already have some sort of spam protection, so we are now trying to disable comments instead. I have managed to set "no comments" as the default by changing BlogEntry.php from

        static $defaults = array(
            "ProvideComments" => true,
            'ShowInMenus' => false
        );

    to

        static $defaults = array(
            "ProvideComments" => false, // changed
            'ShowInMenus' => false
        );

    Am I on the right track to disable comments by default? Also, how can I stop the news page from showing the "xxx comments" link? E.g.:

        Test
        Posted by Admin on 21 June 2011 | 3 Comments
        Tags: P
        This is a test....
        3 comments | Read the full post

    Read the article

  • Bing flagging pages as Malware

    - by Vince Pettit
    Bing has flagged some pages on a site I manage as malware. These have been looked at, and it looks like there was some malware at some point, but it has since been removed. Bing is also pointing to some pages which no longer exist, saying there is malware on those. Is there anything specific I need to do to get Bing to stop trying to access the removed pages, and to de-flag the pages that have been fixed?

    Read the article

  • Can I instruct the browser not to look for a favicon?

    - by Peter Boughton
    I have a website that doesn't have/need a favicon. Is there a way to instruct the browser not to waste a request looking for /favicon.ico? I don't mean filtering logs, but something client-side, like this:

        <link rel="shortcut icon" href="about:blank" />

    That appears to work, but I'm not in a position to do comprehensive tests (and search engines are being unhelpful). Can anyone confirm whether this is a valid method, or provide a suitable alternative?
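
    For comparison, another client-side approach that avoids the extra request is an inline data URI; a sketch (an empty data URI, so nothing is fetched and nothing is rendered):

        <link rel="icon" href="data:," />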

    Read the article

  • Worth changing the URL structure to incorporate keywords?

    - by Dejan Pelzel
    I am migrating my blog from PHP to ASP.NET, and while recoding the whole website I figured I might as well improve the URL structure. This is what a URL looks like now: example.com/blog/post/755/hakurei-reimu-cosplay-from-touhou-by-kishigami-hana and this is how it will look after the change (cosplay being the dynamic main keyword of the post): example.com/blog/cosplay/hakurei-reimu-cosplay-from-touhou-by-kishigami-hana-755/ The website is a bit more than half a year old and receives around 650k page views a month, mainly from search traffic. Of course everything would be redirected with 301 redirects. Do you think it is worth changing to the new URL structure, or will it harm the ranking in the long run?

    Read the article

  • adding tagged / dynamic pages in sitemap

    - by sam
    I've got a blog that's been running for about a year. I've made about 200 posts, and there should be about 220 pages to index (additional pages for about / contact etc.). When I crawl the site I get 1900 pages, because of all the pages related to the tags I've used in my blog; 70% of these pages contain only one blog post. When submitting my sitemap to Google, should I exclude all pages with /tagged/ in the URL so I'll only be submitting unique pages, or should I submit the full sitemap?
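
    For illustration, thin tag archives are often left out of the sitemap and additionally blocked from crawling; a sketch of the robots.txt route, assuming the tag pages really all live under /tagged/ (note this blocks crawling outright; a meta noindex tag on the tag pages is the gentler alternative if you want them crawled but not indexed):

        User-agent: *
        Disallow: /tagged/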

    Read the article

  • A particular URL on a website suddenly disappeared from Google search results - why?

    - by Ragavendran Ramesh
    I have a website with a particular page URL that was indexed in Google search results, in the first 10 results. Suddenly it disappeared, and now that page is not even in the first 100 results. What could be the reason? I have a feeling the page has been spammed by our competitors. Is it possible to avoid that, or can I find out whether that page has been spammed or not? Is it possible to tell whether a particular page on a website is spammy or malicious?

    Read the article
