Search Results

Search found 9728 results on 390 pages for 'zee pro'.


  • Invoice from GoDaddy with intent to defraud?

    - by Berliner
    Hi Webmasters, I have received several emails asking me to renew a domain name: "REMINDER: Renew early for multiple years and lock in your savings! For your review, listed below are domain names and their expiration dates. F.....COM - Mar. 09, 2011". Since I lost the domain name a long time ago and couldn't get it back, I asked if it was available again. GoDaddy replied that according to WHOIS the domain name is registered to a Japanese company, with an expiry date of 2011-12-02.

    I wrote to GoDaddy: "According to your information the domain holder is a Japanese company as described below. Can you give me an explanation why you send me an email asking me to pay for a domain name which I do not own? (Expiration Date: 2011-12-02) I am just curious, I am sure there is no ill will on your part." GoDaddy answered: "Dear Sir or Madam, Thank you for contacting online support. This was just to let you know the domain is registered to someone else and who."

    Then today I got yet another invoice asking me to renew the same domain name: "REMINDER: Renew early for multiple years and lock in your savings! The product(s) listed below have expired or are at risk of expiring: Product Name / Next Attempt Date: .COM Domain Name Renewal - 1 Year (recurring), 03/14/2011, F........COM. You are at risk of losing the service(s) or product(s) listed above. Your products are currently set to renew manually - they will NOT be renewed automatically on the next attempt date." The expiry date has now been changed from 9 March to 14 March.

    Another party owns the domain name, and furthermore the domain name was never registered with GoDaddy. This looks like a way to make a few bucks off an unsuspecting customer; it might even be illegal. Any comment on how to take this further would be most welcome.

    Read the article

  • What constitutes a "substantial, good-faith effort to remove the links"

    - by Luke McCallum
    We engaged a third-party SEO consultant to assist us in managing our meta data and to write regular blogs on our site http://cyberdesignworks.com.au. Without our authorisation, the SEO also ran a link-building campaign which has seen us Penguin-slapped, and we no longer appear in Google for a number of our core keywords. Since being notified by Google back in March that we have "unnatural links", we have undertaken a significant campaign to rid ourselves of these dodgy backlinks by a number of methods. I have just received feedback on my 4th or 5th resubmission, which still advises that we need to make a "substantial, good-faith effort to remove the links" before Google will reconsider us for inclusion. After the effort I have gone through to get links removed, I am now at a loss as to what else I can do. Below is a summary of the actions we have taken to date:

    - According to http://removem.com we had about 5584 back-linking domains.
    - Of those, we have successfully contacted and had links removed from 344 domains.
    - We ignored links from 625 domains, as they were either legitimate press releases, natural backlinks, or client websites containing an attribution link in the footer that points back to us.
    - Due to our efforts, or the sites simply becoming defunct, removem.com reports that links from 3262 domains have been removed.
    - We have contacted but are yet to receive feedback from 1666 domains, so we can assume those backlinks remain. We have configured an automatic 301 redirect for each of the links from these 1666 domains to point to http://redirects.sanscode.com/, which we are calling our Bad Link Catcher (a stroke of genius, I thought). e.g. http://www.mysimplewebdesign.com/create-a-perfect-webpage-with-four-important-tips-from-sydney-web-development-service-companies.php
    - As we are a web design agency, we have a large number of client websites which contain an attribution link in their footer pointing back to us. We have gone through the vast majority of these and updated the links to replace the anchor text with an image and a rel="nofollow" link, e.g. <a rel="nofollow" target="_blank" href="http://www.cyberdesignworks.com.au/"><img src="https://sessions.sanscode.com/site/assets/media/badges/Badge_CDW_SANSCODE.png"></a> - see http://www.milkatwork.com.au/
    - An export from http://removem.com detailing the number of times we contacted each link, and whether it is still found or not, was supplied with each resubmission.
    - The total backlinks reported in Google Webmaster Tools has dropped from over 100K to 87K, and I expect it to drop significantly lower once Google re-crawls each back-linking page.

    Based on all of the above, I am not sure what else I can do to demonstrate a "substantial, good-faith effort to remove the links". I would sincerely appreciate any feedback or suggestions, as I am out of ideas.
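
    For readers wondering how a "Bad Link Catcher" style redirect might be wired up: a minimal sketch, assuming Apache with mod_rewrite in an .htaccess file on the site receiving the unwanted deep links (the URL pattern is illustrative, not the poster's actual configuration):

        RewriteEngine On
        # 301 a known spammed landing page off to the catcher domain
        RewriteRule ^create-a-perfect-webpage-.*$ http://redirects.sanscode.com/ [R=301,L]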

    Read the article

  • DNS slowdowns on development environment

    - by Sequenzia
    I have a local development environment set up on my Mac. I am running an Ubuntu web server inside a VirtualBox VM, and I set up a hosts file entry on my Mac that points my dev site to the IP of the Ubuntu virtual server. Everything works well except that a lot of the time (not all of it) a page takes more than 5 seconds to load. I used Firebug to track down where the problem is, and when it's slow, the DNS part of my request is taking over 5 seconds. Like I said, it's not all the time - sometimes it resolves and loads the page within milliseconds. The same page will be super fast on one click and then take over 5 seconds the next time. It's really slowing me down and I am not sure what is causing it. Anyone have any ideas? Any help would be great. Thanks.
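
    One thing worth checking when a hosts-file setup still shows DNS delays is that every hostname variant actually used is listed, so nothing falls through to a real DNS lookup. A minimal sketch (the IP and hostnames are assumptions - substitute the VM's address and your dev domain):

        # /etc/hosts on the Mac
        192.168.56.101   mysite.dev www.mysite.dev

    After editing, flushing the Mac's DNS cache (dscacheutil -flushcache on many OS X versions) helps rule out stale lookups.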

    Read the article

  • Host and expose an application to a small local network

    - by tartak
    I developed a little web application using Java EE and MySQL. I use it to keep some data and, from time to time, to get some reports out of that data. My problem is that I have to access this application from 4-5 computers in the office. They are connected through a switch; it's a typical small office network, nothing fancy. I need some advice on how to do this. I mean, for a small application with no external communication, is it mandatory to use an Apache machine? I'd use a simple Tomcat container on the "server machine" (which is my computer, a Windows machine) and, basically, I would like to give my colleagues access as well. I don't have any knowledge about concurrency (I know MySQL permits concurrent access), so I would like some configuration tips too.
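
    For a small office setup like this, plain Tomcat is generally enough; an Apache front end is not mandatory. A sketch of the relevant Connector in Tomcat's conf/server.xml - by default Tomcat listens on all interfaces, so colleagues can browse to http://<your-machine-ip>:8080/yourapp, provided Windows Firewall allows the port (the app path is a placeholder):

        <!-- conf/server.xml: the stock HTTP connector, reachable from the LAN -->
        <Connector port="8080" protocol="HTTP/1.1"
                   connectionTimeout="20000" />

    MySQL handles the concurrent access on its own; the main thing on the app side is to use a connection pool rather than a single shared connection.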

    Read the article

  • Make Google Plus One only work for the domain and not the path [closed]

    - by Saeed Neamati
    Possible duplicate: Make Google +1 button +1 a specific URL rather than the URL it's on? I'm creating an image-sharing website, and since it's going to have many thousands of links, it's almost impossible to accumulate Google Plus One votes on any single page of my site. Plus One is an indication of a site's popularity and trust: you follow a link in the SERP because you see that somebody you know has already plus-oned that link, so you trust it and click it. The more plus ones a page gets, the more trustworthy it becomes. Sites with simple static pages can gather many plus ones, but sites like mine (dynamic sites with thousands of links) can't aggregate plus ones on one page. Is there any way to tell Google that I only want the Plus Ones to be counted for the domain, and not for the path? In other words, how can I turn a plus one given to http://example.com/tag1-tag2/2525 into a plus one given to http://example.com? Is it possible at all?
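
    The +1 button accepted an explicit href attribute, so every page could credit the domain rather than its own URL. A minimal sketch using the button markup Google documented at the time (the URL is a placeholder):

        <script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
        <g:plusone href="http://example.com/"></g:plusone>

    With href pinned to the domain root, every +1 made anywhere on the site counts toward the home page.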

    Read the article

  • Canonical URL for a home page and trailing slashes

    - by serg
    My home page could potentially be linked as:

        http://example.com
        http://example.com/
        http://example.com/?ref=1
        http://example.com/index.html
        http://example.com/index.html?ref=2

    (the same page is served for all those URLs). I am thinking about defining a canonical URL to make sure Google doesn't consider those URLs to be different pages:

        <link rel="canonical" href="/" />                    (relative)
        <link rel="canonical" href="http://example.com/" />  (trailing slash)
        <link rel="canonical" href="http://example.com" />   (no trailing slash)

    Which one should be used? I would just slap in /, but messing with canonicals seems like scary business, so I wanted to double-check first. Is it a good idea at all to define a canonical URL for a home page?
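
    For what it's worth, canonical targets are conventionally written as absolute URLs, and the trailing-slash form is what a request for the bare hostname resolves to anyway. A minimal sketch of the usual choice (example.com standing in for the real host):

        <head>
          <link rel="canonical" href="http://example.com/" />
        </head>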

    Read the article

  • How do I force the www subdomain on both https and http

    - by Brian Perin
    For whatever reason I can't seem to get this right; I've looked at many examples on here and on Apache's website. I'm trying to force www.domain.com instead of domain.com on EITHER http or https, but I am not trying to force https over http. The following code seems to work for all https connections, but http will not redirect to www:

        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
        RewriteRule ^ https://www.domain.com%{REQUEST_URI} [R=301]

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
        RewriteRule ^ http://www.domain.com%{REQUEST_URI} [R=301]
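
    A common cause of exactly this symptom is that the rules live only in the port-443 VirtualHost (or an SSL-only config file), so plain-http requests never see them. A sketch of one way to cover both protocols, assuming the rules sit somewhere both vhosts read, and with domain.com as a placeholder:

        RewriteEngine On
        # https traffic without www
        RewriteCond %{HTTPS} on
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^ https://www.domain.com%{REQUEST_URI} [L,R=301]
        # http traffic without www
        RewriteCond %{HTTPS} off
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^ http://www.domain.com%{REQUEST_URI} [L,R=301]

    If each protocol has its own VirtualHost block, each block needs its own RewriteEngine On and its own copy of the matching rule.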

    Read the article

  • How to get Magento to update order status when PayPal returns IPN message?

    - by Nick
    When someone checks out in Magento with PayPal, and PayPal flags their payment for review, Magento correctly sets the order status to "Payment Review". However, if after a day or two PayPal decides the order is OK, it sends an IPN message to Magento with the proper payment status of "Pending" and a pending reason of "authorization". I can see this IPN message in Magento's PayPal logs (and can simulate it with the sandbox); however, when Magento receives this message it does not update its order status. Why not, and how can this be fixed? I am using Magento 1.5.1.0.

    Read the article

  • Google Webmaster Tools, DNS Errors & HostPapa

    - by Gravy
    Received a message from Google Webmaster Tools: "Over the last 24 hours, Googlebot encountered 2 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 40.0%. You can see more details about these errors in Webmaster Tools. Recommended action: ..." I contacted HostPapa and they deny that there is any issue with the site or server. Support in terms of what I can actually do to resolve this issue is non-existent. The site is currently online, and I don't know much about DNS, so any advice about how to resolve this problem would be much appreciated. Basically, the message from Google says it is my web host's fault, while the message from my web host (HostPapa) is: "Just tell Google to crawl your site, as there are no errors."
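
    One way to check the DNS yourself, independent of both Google and HostPapa, is to query each authoritative nameserver directly and see whether any of them fails or times out. A sketch with dig (example.com stands in for the real domain; the nameserver hostname is a guess - use whatever the NS query actually returns):

        # list the authoritative nameservers for the domain
        dig +short NS example.com
        # then query each one directly for the A record
        dig @ns1.hostpapa.com example.com A

    If one of the listed nameservers answers slowly or not at all, that would account for an intermittent error rate like 40% even while the site itself stays up.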

    Read the article

  • DNS and Wildcards

    - by Thomas Chapman
    Whenever I attempt to make a record for *.schneiderdonnelly.com.au and CNAME it, I get two errors: "You can't mix CNAME/MX records together using the same hostname." and "Domain root's cannot be CNAME's, however you can web-forward this record to www.schneiderdonnelly.com.au instead for the same effect." I've read it's possible, so why can't I make it work? I donated $5 to be a premium member and I've been trying to make it work for yonks. http://i.stack.imgur.com/D9Ui5.jpg - this is how I want it to appear (the last record). I am prepared to swap DNS providers as long as they're free.
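
    The general rule behind both errors is that a CNAME cannot coexist with any other record at the same name - so the zone apex (which always carries SOA/NS, and here MX) can never be a CNAME, while a wildcard at a name with nothing else on it is fine. A minimal zone-file sketch of the distinction (the address is a placeholder):

        ; apex: must use A/MX records, never a CNAME
        schneiderdonnelly.com.au.    IN A      203.0.113.10
        schneiderdonnelly.com.au.    IN MX 10  mail.schneiderdonnelly.com.au.
        ; wildcard: fine as a CNAME, as long as no MX shares that name
        *.schneiderdonnelly.com.au.  IN CNAME  schneiderdonnelly.com.au.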

    Read the article

  • Getting a double slash when redirecting for a canonical hostname on Firefox only

    - by Brian Neal
    I have a Django-powered website, and I'm trying to solve the "canonical hostname" problem: I want www.example.com to redirect to example.com. I have tried both techniques found in the Apache documentation (under "Canonical hostnames"). I'm currently trying the mod_rewrite method, and I have this in a virtual host container:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^/?(.*)$ http://example.com/$1 [L,R=301,NE]

    This works for me, except for one case. In Firefox only, if I type www.example.com into the browser, it redirects and I see example.com// in the URL bar (note the two trailing slashes). However, something like www.example.com/news/ gets redirected correctly to example.com/news/. I only see this on the root URL, and only in Firefox. It seems to work fine on Windows under Chrome, IE9 and Opera (maybe those browsers eat the double slash?). My Mac-using friend says it is fine in Safari, but he also sees the problem in Firefox. As far as Django settings go, I am using the default value of APPEND_SLASH=True. I don't know if Django has anything to do with it, but I've tried mod_rewrite rules like the above on static HTML sites before and it always seemed to work.
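
    Firefox caches 301 redirects aggressively, so if an earlier, buggier version of the rule once returned a double slash, Firefox may keep replaying it long after the rule was fixed. A quick way to see what the server actually sends now, bypassing every browser cache (example.com is the placeholder from the question):

        curl -sI http://www.example.com/ | grep -i '^location'

    If the Location header is clean here but Firefox still shows //, clearing Firefox's cache (or testing in a fresh profile) is the next thing to try.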

    Read the article

  • Why is email HTML stuck in the 90's?

    - by Sean Dunwoody
    (Disclaimer: I've already tried asking this on Stack Overflow, but apparently it was off topic. If the same is true here, please let me know and I'll close/delete this question.) I've spent about a day putting together a frustrating email newsletter, using tables, inline styles, etc. It feels a lot harder than it should be. I was just wondering: is there any reason why email clients have such poor support for HTML and CSS (CSS in particular)? I would have imagined they'd be scrambling to outdo each other in this department... Is it a security thing (I can't really imagine why)? Or are they just lazy?

    Read the article

  • Google bots are severely affecting site performance

    - by Lynn
    I have an aggregator site on a Linux server that pulls in feeds from a universe of about 2,000 blogs. It runs WordPress 3.4.2, and I have a cron job, staggered to run five times an hour on another server, that pulls in the stories and publishes them to the front page of this site. This is so I don't put too much pressure all on one server. However, the Google bots, which visit a few times every hour, bring the server to its knees in the mornings and evenings when there is an increase in traffic on the site. The bots have something like 30,000 links to follow at this point. How do I throttle the bots to simply grab the new stories off the front page and stop there?

    EDIT - details of my server configuration: the server that handles all the publishing is an unmanaged instance on AWS. It mounts the NFS server and connects to the RDS to update content, etc. You get to this publishing instance via a plugin that detects the wp-admin link and redirects you there. The front-end app server also mounts the NFS and requests data from the RDS; it is the only one that has WP Super Cache on it. The OS is Ubuntu on the app server, and the NFS server runs CentOS. The front end is Nginx and the publishing server is Apache.
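
    Googlebot's overall crawl rate can usually be lowered in Webmaster Tools (site settings), and robots.txt can keep it out of the long tail of archive URLs entirely. A sketch, assuming the archive and feed paths of a stock WordPress install - the paths are assumptions to adjust to the actual permalink structure:

        User-agent: Googlebot
        # keep the bot off paginated archives and per-post feeds
        Disallow: /page/
        Disallow: /*/feed/
        Disallow: /tag/

    This doesn't literally restrict crawling to the front page, but cutting the 30,000-link tail down to recent content is normally what relieves the load.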

    Read the article

  • Google Analytics checkout page tracking problem

    - by Amir E. Habib
    I am running a multilingual website, each language on a different domain name, and I send all purchase requests to a checkout process which has its own domain too. To keep Google Analytics tracking working, I've updated the tracking code accordingly and set the source domain to 'multiple top-level domains'. Everything is going fine so far, except that in the E-commerce overview the "Source / Medium" always shows as (direct) - or as the name of the source domain. Since I am redirecting using PHP (header('Location: ...')), the Google _link method doesn't seem to apply, so I want to focus on two questions. First: should I create a new profile for the checkout domain in Google Analytics? (I am now using the profile ID of the source domain even after I move to the checkout domain - is that OK?) Second: when I pass the cookies of the source domain to the checkout domain, I notice the Google cookies are copied to the new domain (the cookie path is .checkout-domain/) with the same values as the originals - but for some reason another set of cookies, with different values (same path), is created once I access a page with the Google Analytics code on the checkout pages. It feels like I'm doing something wrong here, so my question is: what am I doing wrong? Does anyone have an idea how to pass the cookies to the checkout domain correctly?
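
    With the classic ga.js tracker, cross-domain setups normally need both sides to opt in to linking, and the hand-off URL has to carry the linker parameters. A minimal sketch of the tracker configuration, with a placeholder account ID:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXX-1']);
        _gaq.push(['_setDomainName', 'none']);   // multiple top-level domains
        _gaq.push(['_setAllowLinker', true]);    // accept linker values from the other domain
        _gaq.push(['_trackPageview']);

    Because the hand-off here is a server-side PHP redirect rather than a clicked link, the linker parameters never get appended, and the checkout domain starts a fresh session - which matches the (direct) attribution being seen. One workaround is to send the visitor across via a client-side link decorated with the linker (e.g. _gaq.push(['_link', url])) instead of a bare header() redirect.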

    Read the article

  • What Web Technology to use?

    - by Sven
    Hey guys, I would like to start a project and I am wondering what kind of programming language/web framework to use. There is not that much logic involved. It's a community page with a lot of users (not that many at the beginning, but I would like to be ready to welcome a lot) who should be able to communicate through private messages and a forum, and there will be a lot of content (news, articles) to consume. I also want to provide several authorization settings so that some content is only available to specific people. In effect it's a content management system, but I want to design the functionality myself. And I want to use some external APIs. The only website I can think of with almost similar functionality is pokerstrategy.com; I looked up their job offers and it seems like they use Java and PHP. Maybe you guys can give me your thoughts: what would you use to meet these requirements, and how would you approach it? Thank you.

    Read the article

  • How can I point wildcard domains to a folder in Apache

    - by Abishek R Srikaanth
    I am developing an app using PHP and deploying it on Apache in the Amazon AWS environment. This app needs to be made available to customers from their own chosen domain names. How can I achieve this? For example:

        www.customer1.com = /var/www/myapp.mydomain.com
        www.customer2.com = /var/www/myapp.mydomain.com

    I would like to do this similar to how bitly enables shortened URLs for custom domains, where www.myshrturl.com is DNS-configured as a CNAME to cname.bitly.com. I would appreciate it if someone could help me achieve this functionality. If any other details are required, please let me know and I shall update the question.
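
    A common way to do this in Apache is a catch-all VirtualHost: customers point a CNAME at your host, one vhost serves the same DocumentRoot for every Host header, and the PHP app branches on $_SERVER['HTTP_HOST']. A sketch with placeholder names:

        <VirtualHost *:80>
            ServerName myapp.mydomain.com
            # ServerAlias * makes this vhost match any Host header
            ServerAlias *
            DocumentRoot /var/www/myapp.mydomain.com
        </VirtualHost>

    Inside the app, something like $customer = $_SERVER['HTTP_HOST']; then look that hostname up in a customers table to decide whose data to render.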

    Read the article

  • Tracking multiple subdomains and domains going to the same site, separately in Google Analytics

    - by miles
    I have a new site that has multiple top-level domains and subdomains all going to it: www.domain.com, campaign.domain.com, chicago.domain.com, domain2.com - all go to the same site/site directory. Right now I have one Google Analytics account profile set up for it, but I want to be able to track the traffic that is hitting those different URLs separately. The domains are being routed on the server-side (not .htaccess). How can I do this in Google Analytics? Do I need to create filters? Or create different profiles for each domain?
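
    The usual pattern with Google Analytics is one profile per hostname, each with an include filter on the Hostname field (Custom filter > Include > Filter Field: Hostname). A sketch of the filter pattern for one of the profiles:

        ^chicago\.domain\.com$

    Repeat per profile (^www\.domain\.com$, ^campaign\.domain\.com$, ^domain2\.com$), and keep one unfiltered profile as a catch-all so no data is ever lost to a misconfigured filter.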

    Read the article

  • DNS for domain shows old website for www version

    - by user3745746
    I bought 2 domains from GoDaddy, but with both I am seeing the same problem: the www version of the domain goes to the old site, which is still being hosted. I have checked the IntoDNS website, and the www record shows:

        Your www.example.com A record is:
        www.example.com -> example.typepad.com -> cname-cloudflare.typepad.com ->

    What can I do to stop this from happening? Will this eventually be removed automatically and fix itself? Though obviously it hasn't fixed itself during the long drawn-out expiry process... It's been quite a while for one of them and the www still hasn't propagated. I'm not having any problems with the plain example.com part of the site.
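
    That IntoDNS trace suggests the zone still carries a stale CNAME for www pointing at Typepad, and DNS will keep serving a record like that indefinitely - nothing expires it automatically. The usual fix is to edit the zone at the DNS provider, delete the old record, and add one pointing at the new host. A sketch of the change (the IP is a placeholder):

        ; remove the stale record:
        ;   www.example.com.  IN CNAME  example.typepad.com.
        ; replace it with one of:
        www.example.com.  IN CNAME  example.com.
        ; or, if the host expects an A record:
        www.example.com.  IN A      203.0.113.10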

    Read the article

  • JavaScript and PHP: how should I encode/decode my data?

    - by Ron
    Hello everyone. I would like to know how I should encode my data, and why. First of all, I use the GET method because it is a search engine inside a website. Second, I use an RTL language (Hebrew). Third - and this is basically why I am asking - Firefox and Safari (as I understand it) encode and decode URLs automatically, so if I encode a URL, in Firefox I will see it decoded, which is good; but if I copy-paste the URL into the address bar and then enter the site, Firefox encodes the unencoded URL to UTF (I think). Anyway, what encoding/decoding should I use, and how can I overcome Firefox's automatic encoding/decoding?
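
    The safe baseline is to keep everything UTF-8 end to end and percent-encode the query value exactly once on the way out. A sketch in the question's own stack (names like search.php and the q parameter are placeholders), first the browser side:

        // JavaScript: building the search URL client-side
        var q = document.getElementById('search').value;   // Hebrew text is fine
        location.href = '/search.php?q=' + encodeURIComponent(q);

    and then the server side:

        <?php
        // PHP: $_GET arrives already percent-decoded, still UTF-8
        $q = $_GET['q'];
        // re-encode whenever emitting it back into a URL:
        echo '<a href="/search.php?q=' . rawurlencode($q) . '">next page</a>';
        ?>

    The browser showing decoded Hebrew in the address bar is purely cosmetic; what goes over the wire is still the percent-encoded UTF-8 form, so there is nothing to overcome on the server side.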

    Read the article

  • Server Firewall preventing sending of email [migrated]

    - by Jo Fitzgerald
    The firewall on my VPS appears to be preventing my site from sending email. It was working fine until the end of last month. My hosting provider (Webfusion) has been next to useless. I am able to send email if I open INPUT ports 32768-65535, but not if these ports are closed. Why would this be? I have the following rules in my firewall:

        # sudo iptables -L
        Chain INPUT (policy DROP)
        target      prot opt source               destination
        VZ_INPUT    all  --  anywhere             anywhere

        Chain FORWARD (policy DROP)
        target      prot opt source               destination
        VZ_FORWARD  all  --  anywhere             anywhere

        Chain OUTPUT (policy DROP)
        target      prot opt source               destination
        VZ_OUTPUT   all  --  anywhere             anywhere

        Chain VZ_FORWARD (1 references)
        target      prot opt source               destination

        Chain VZ_INPUT (1 references)
        target      prot opt source               destination
        ACCEPT      tcp  --  anywhere             anywhere             tcp dpt:www
        ACCEPT      tcp  --  anywhere             anywhere             tcp dpt:https
        ACCEPT      tcp  --  anywhere             anywhere             tcp dpt:smtp
        ACCEPT      tcp  --  anywhere             anywhere             tcp dpt:ssmtp
        ACCEPT      tcp  --  anywhere             anywhere             tcp dpt:pop3
        ACCEPT      tcp  --  anywhere             anywhere             tcp dpt:domain
        ACCEPT      udp  --  anywhere             anywhere             udp dpt:domain
        ACCEPT      tcp  --  anywhere             anywhere             tcp dpts:32768:65535
        ACCEPT      udp  --  anywhere             anywhere             udp dpts:32768:65535
        ACCEPT      tcp  --  localhost.localdomain  localhost.localdomain
        ACCEPT      udp  --  localhost.localdomain  localhost.localdomain

        Chain VZ_OUTPUT (1 references)
        target      prot opt source               destination
        ACCEPT      tcp  --  anywhere             anywhere
        ACCEPT      udp  --  anywhere             anywhere

    The VPS is running Plesk 10.4.4 (please ask if you require further technical information to help me).
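
    The symptom fits a missing connection-tracking rule: with the INPUT policy at DROP and no ESTABLISHED rule, the replies to outbound SMTP connections come back on a high ephemeral port and get dropped - which is exactly why opening 32768-65535 "fixes" it. A sketch of the conventional fix (the chain name is taken from the listing above; whether the state module is available depends on the VPS kernel):

        # accept replies to connections this host initiated
        sudo iptables -I VZ_INPUT 1 -m state --state ESTABLISHED,RELATED -j ACCEPT

    With that rule in place, the blanket 32768-65535 holes can usually be closed again.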

    Read the article

  • Is having a 'home' navigation item on the home page negative for your site's SEO?

    - by Brady
    My work colleague has recently had conversations with some SEO consultants, and after those conversations she has come to the conclusion that having a link to the home page on the home page will have a negative effect on the website's SEO. Because of this, we are now building websites that don't show a home link until you are on any page other than the home page. If the above argument were true, then surely when we are on the about page of a website we shouldn't show a navigation item for the page we are on, and the same would apply to every other page of the website... So my question is: does having a home navigation item on the home page have a negative effect on the website's SEO? And if not, why has my colleague come to this conclusion? Could she be misunderstanding something more important about home links on the home page regarding SEO?

    Read the article

  • Why do SEO-based code tips not appear to affect ranking?

    - by Ben
    I've been researching various methods for SEO, where pages have precise titles, keywords are highlighted with h tags, and the many boxes stated in good page markup for SEO are ticked. However, looking at some top-ranked sites on Google for key terms, they have terrible SEO-oriented markup: really long page titles, no heading tags, limited appearance of keywords in the text, and so on. SEO analysis services rate them lower than other sites, yet these sites rank really high. Even with a low number of backlinks they rank high, so I don't understand how these sites earn their position when they appear inferior to those below them, which have better markup and links. I don't want to cause trouble by mentioning sites or keywords, but looking in Google at 'executive search', it makes no sense why the roughly 5th-placed site should rank so highly, especially with all the added .swfs. The same applies to the top result for 'Japan Executive Search'. My main point is that these sites seem to have none of the important structural rules stated in SEO page-rating applications and general suggested best practice, nor do they show large numbers of backlinks. It makes me feel like there is no point bothering to write decent markup if it really doesn't matter. Can anyone explain how sites with such markup and few backlinks can outrank well-written and well-structured sites with greater linkage? Sorry if this is a fuzzy question - I want to avoid singling out any sites - but it really has me perplexed that sites which appear to ignore the suggested best practices rank so well.

    Read the article

  • GoDaddy VPS or switch to another provider?

    - by Charlie
    Long story short: I want close to full control over my server, mainly to be able to install things on it that GoDaddy shared hosting does not allow (gzip, etc.). Should I switch providers (all my domains are hosted there) or upgrade my hosting to a VPS? If I upgrade, how hard will it be to set up without their "assisted service" (where they do virus scanning, etc.)?

    Read the article

  • GWT: reporting crawl errors for non-existent links

    - by pixeline
    Google Webmaster Tools is reporting crawl errors for links that never existed, and if I check the "Linked from" tab for a given error link, it shows another URL that never existed either. They all mention joomla/, which is not the CMS used on this domain (it's WordPress, FYI). Example:

        http://example.com/joomla/index.php/component/user/register
        Linked from: http://example.com/joomla/component/user/login?return=L2######

    What is going on?

    UPDATE 1: I tried something - I provided one of the faulty URLs to the "Fetch as Google" feature. Instead of returning a 404, it returns a 301 to another Joomla page:

        HTTP/1.1 301 Moved Permanently
        Server: Apache/2.4.3
        X-Powered-By: PHP/5.4.4-10
        X-Pingback: http://example.com/xmlrpc.php
        Expires: Wed, 11 Jan 1984 05:00:00 GMT
        Cache-Control: no-cache, must-revalidate, max-age=0
        Pragma: no-cache
        Set-Cookie: PHPSESSID=1fgr5v2oip39miibuptd51s8h0; path=/
        Set-Cookie: woocommerce_items_in_cart=0; expires=Sat, 12-Jan-2013 11:44:01 GMT; path=/
        Location: http://example.com/joomla/component/user/register
        Content-Type: text/html; charset=iso-8859-1
        Content-Length: 387
        Date: Sat, 12 Jan 2013 12:44:01 GMT
        Via: 1.1 varnish
        Connection: keep-alive
        Accept-Ranges: bytes
        Age: 0

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>301 Moved Permanently</title>
        </head><body>
        <h1>Moved Permanently</h1>
        <p>The document has moved <a href="http://example.com/joomla/component/user/register">here</a>.</p>
        <p>Additionally, a 301 Moved Permanently error was encountered while trying to use an ErrorDocument to handle the request.</p>
        </body></html>
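
    The 301 instead of a 404 is consistent with WordPress's canonical-redirect guessing (redirect_canonical() will "correct" unknown URLs to the nearest match). One blunt way to make the phantom /joomla/ tree answer with a real 404 so Google eventually drops it - a sketch, assuming Apache with mod_alias available:

        # .htaccess - force a real 404 for anything under /joomla/
        RedirectMatch 404 ^/joomla/

    Once Googlebot receives 404s for these URLs for a while, they normally fall out of the crawl-errors report on their own.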

    Read the article

  • Should Cloudflare's MX record point to a CNAME or an A record?

    - by user7787
    Cloudflare's official support page says: https://support.cloudflare.com/hc/en-us/articles/200168876-My-email-or-mail-stopped-working-What-should-I-do- But traditionally an MX record should not point to a CNAME: http://www.exchangepedia.com/blog/2006/12/should-mx-record-point-to-cname-records.html Cloudflare also has a service called "CNAME flattening" - is that related, and is it a reason it would be acceptable to point an MX record at a CNAME? So should I point my Cloudflare MX record at a CNAME?
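
    The conservative setup, which also matches the RFC guidance in the second link, is to give mail its own DNS-only (unproxied) subdomain with an A record and point the MX at that, so no CNAME is involved at all. A sketch with placeholder names and address:

        ; DNS-only, not proxied through Cloudflare:
        mail.example.com.   IN A      203.0.113.25
        ; the MX points at the A record, never at a CNAME:
        example.com.        IN MX 10  mail.example.com.

    CNAME flattening addresses CNAMEs at the zone apex for web traffic; it doesn't change the traditional advice that MX targets should resolve directly to an address record.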

    Read the article
