Search Results

  • Looking for a modern payment processor which accepts adult sites

    - by JakeRow123
    I love how easy authorize.net is to use, but they don't accept adult sites. I am launching an adult site soon, but I can't find any good credit card processors I can use. Most sites use CCBill or Epoch, but they are both terrible since the customer is redirected to an external site, which also has an ancient 1990s look. It's not like authorize.net's API, which you can query and get a result back as to whether the payment went through or not; that lets authorize.net blend seamlessly with the product. But since adult sites are against their TOS, and also PayPal's, what is a good alternative? I am looking for one that won't redirect the user away from my site, is big enough to be reliable and trustworthy, and has fairly low rates. Any help is appreciated!

  • Aggregating customer service emails from multiple ecommerce sites for easy handling

    - by nitbuntu
    For one of my main customer help email addresses I use Mozilla Thunderbird with a combination of tags and saved searches. As the number of my e-commerce sites grows from 1 to more, customer service handling gets more tricky. Is there any simple and efficient way of handling emails from the different sites? Perhaps what I'm looking for is a way of aggregating customer service emails from different sites, into one place? Perhaps there's a way of already doing this within Thunderbird or Gmail?

  • Country specific content vs global content

    - by Ando
    I have a global product presentation website, myproduct.com. For certain countries I also own the country domain: myproduct.co.uk, myproduct.com.au, myproduct.es, myproduct.de, etc. The presentation website is translated into multiple languages and I set up redirects: myproduct.es redirects to myproduct.com/es/, myproduct.de redirects to myproduct.com/de/, etc. The content so far is the same, just translated into different languages. The advantage is that it's easy to keep the content aligned - everything is managed from one centralized dashboard (I'm using WordPress with qTranslate). Now I'm running into trouble, as for different countries I want localized content - for the UK I want to run different promotions and use a different reseller than for .com.au, so I would like users coming from myproduct.co.uk to see something different from those coming from myproduct.com.au (and not be redirected to myproduct.com as they are right now). How can I achieve this? I could duplicate the whole main website and modify only certain parts, but then I would have a lot of duplicate content (e.g. info about how the product works) and pages that are likely to change (the FAQ page) that I would have to keep updated across all websites. Or I could duplicate only part of the main website: the localized website would contain only the pages that are different, and all other links would point to the .com site. This would solve the duplication problem but would confuse users, who would navigate from .co.uk to .com without noticing and then wonder how to get back. Is there another, better option?
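
    For reference, the country-domain redirects described above can each be done with a couple of mod_rewrite lines. A minimal .htaccess sketch (the host names come from the question; the rule itself is only an illustrative assumption about how the redirect is implemented, not the author's actual configuration):

        # Illustrative only: send the Spanish country domain to the /es/
        # section of the main .com site with a permanent redirect.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?myproduct\.es$ [NC]
        RewriteRule ^(.*)$ http://myproduct.com/es/$1 [R=301,L]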

  • How can I monitor a website for malicious changes to the files

    - by user41421
    I had an occasion recently where our website was compromised - a link farm was added to a couple of the pages on one occasion, and on another occasion a large and nasty .aspx file was put on the server. I won't mention the host's name (Hostway), but I was pretty annoyed that someone was able to do this. No, it wasn't a leaky password - around 10 sites hosted by HW with consecutive IP addresses got trashed. Anyway, what I need is a utility or service (preferably free) that takes a snapshot of my website's contents and then regularly monitors the files (size and datestamp) for unauthorized changes or additions, and alerts me. I've used web services that monitor one file for changes, but I'm looking for something a bit more aggressive.

  • Copyright of pictures uploaded to a website?

    - by All
    I want to run a website like a stock photo site. How can I be sure that the uploader is the real copyright holder of the picture? Is it possible to leave the responsibility for this copyright claim with the uploader, or is the webmaster ultimately responsible for the website's content? It generally confuses me: for example, stock photo websites need a form signed by the model for photos showing a human face. How can they be sure that the signature actually belongs to the model? How do they keep themselves safe from a possible lawsuit in this case (e.g. if selling photos of a model with a fake signature)?

  • SEO title tag and earning a high rank on search engines [closed]

    - by Josh White
    Possible Duplicate: What are the best ways to increase your site's position in Google? One of the most basic SEO techniques is including an accurate description of fewer than 64 characters in the title tags of each page. I was wondering if it is considered ethical SEO to set up the contents based on a search keyword, for example. So if the user searches for 'apples pictures', for example, then the title of the webpage would be 'apple pictures'. Note that the search keywords accurately describe my website's contents, because the title will always relate to the body of the webpage and 85-90% of the terms searched for will return corresponding results. Is this considered good SEO practice, and is it ethical? Also, can someone explain the idea behind "linking"? I read somewhere that it is good SEO practice to link to other websites and that it is good when other websites link to you. Does this mean that I should include as many links to other websites as possible (ones that are somehow relevant to my website's goal)? Also, if I joined forums/services and posted my website URL in the signature, would that still be considered other websites linking to me?

  • SEO Keyword Research Help

    - by user5857
    I'm new at SEO and keyword research. I am using Market Samurai as my research tool, and I was wondering if I could ask for your help to identify the best keyword to target for my niche. I do plan on incorporating all of them into my site, but I wanted to start with one. If you could give me your input on these keywords, I would appreciate it. This is all new to me :) I'm too new to post pictures, but here are my keywords (Searches, SEO Traffic, and SEO Value / Day):

        Searches | SEO Traffic | PBR | SEO Value | Average PR/Backlinks of Current Top 10
        1:   730 |         307 | 20% |   2311.33 | 1.9 / 7k-60k
        2:   325 |         137 | 24% |    822.94 | 2.3 / 7k-60k
        3:   398 |         167 | 82% |    589.79 | 1.6 / 7k-60k

    I'm wondering if the PBR (phrase-to-broad) value of #1 is too low. It seems like the best value because the SEOV is crazy high - that is like $70k a month. #3 has the highest PBR, but also the lowest SEOV. #2 doesn't seem worth it because of the PR competition; it might be a little too hard to get onto the top page of Google. I'm wondering which keyword to target, and whether I should be looking at any other metric to see if this is a profitable niche to jump into. Thanks.

  • Is AdWords ad blocked from top spots of SERPs until it is reviewed?

    - by Omeoe
    I have an AdWords ad and a keyword with a Quality Score of 10. Inferring from the CPC of actual clicks, the max CPC is set way beyond that of the third advertiser in the SERP for this keyword (there are three ads at the top). Still, the ad is shown in the 4th spot, which is located either on the right or at the bottom of the SERP. The only catch is that the ad's status is "under review". Is that the reason why it's blocked from the top spots?

  • How to check that Google Analytics Tracking Code is firing on an iPad

    - by crmpicco
    I am used to using the Firebug extension "Omnibug" with Firefox to check that Google Analytics Tracking Code is firing on my website. This application works very well and has minimal overhead. I am now testing the website on an iPad and would like to know if there is a way to check that the GATC is firing on the iPad natively. I have spoofed the iPad UA string in Firefox on the desktop and it appears to fire correctly; however, I'd like to see it happening on the device itself (if at all possible). I know that Firebug can be installed on an iPhone by means of a bookmarklet; however, 1) it is quite buggy and not very user-friendly and 2) it doesn't support Omnibug. How can I check that my GATC is firing on my iPad?

  • Redirects in .htaccess to avoid crawl errors

    - by user71698
    I am getting a lot of errors in Webmaster Tools; basically, there are a lot of links ending like this: mydomainname.com/links.php How can I redirect these links to shave off this part at the end? For instance, there is a link in Google: http://www.onlineglobalbiz.com/article-marketing/www.onlineglobalbiz.com/links.php This should be: http://www.onlineglobalbiz.com/article-marketing/ Using .htaccess, how can I redirect the incorrect links?
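
    A minimal .htaccess sketch of one way to do this (an assumption about the fix, not something stated in the question): permanently redirect any URL whose path ends in the stray "www.onlineglobalbiz.com/links.php" suffix back to the directory in front of it.

        # Illustrative only: if the requested path ends with the bogus
        # "www.onlineglobalbiz.com/links.php" suffix, 301-redirect to the
        # part of the path that precedes it (e.g. /article-marketing/).
        RewriteEngine On
        RewriteRule ^(.*/)?www\.onlineglobalbiz\.com/links\.php$ /$1 [R=301,L]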

  • Search ranking for important keywords has gone down drastically [duplicate]

    - by Vaivhav
    This question already has an answer here: How to diagnose a search engine ranking drop? Firstly, we are a small entrepreneurial team of 3 people, and I am more of an amateur webmaster for the company's website, as we cannot really afford a technical guy/department right now. A few weeks ago, our website traffic and rankings for most keywords decreased overnight. I have done a lot of reading since then and learned about Penguin 2.1, which people said is the reason for the drop. Something like this had never happened before. Now, I have gone through the entire Google Webmaster help section. It says there that if a manual penalty is taken against us, we would see a message on the Manual Actions page. So far, we haven't received any notice from Google for web spam. Some SEO guys I contacted said they found spam links in our backlink profile. I do believe I mistakenly purchased a cheap link/SEO scheme when I was still very new to SEO. This was more than a year back, but since then we have been legitimate. Moreover, how do I find out which links are spam and which are not? Our content is all original, refreshing and the best you will find in our niche. We also have a blog, but on a different domain (wordpress.com), from which we send out anchored links to our business website. Is this a good thing to do? Now, how should we proceed to recover our traffic/rankings? I tried searching in Webmaster Tools for a way to reach Google and ask them why the traffic has decreased suddenly, but I couldn't find a contact form or anything. Can someone please go through our website and help make things clearer regarding the reason for the drop, along with a solution? I will really appreciate this, as I can't figure this out and it's taking a lot of time. Vaivhav

  • VPS Server OS differences

    - by silvercover
    I have two VPS servers: one of them runs Linux and the other runs Windows. I've uploaded the same file to their public_html folders and can see it in my browser via the static IP address of each one, like http://178.63.165.178/getorder/file.xml and http://178.63.165.178/getorder/file.xml. On the other side there is a device called SMSPrinter that is configured to read those XML files over GPRS and needs a static IP address to reach the destination server. Unfortunately, this device can only read the file from the Windows server and cannot reach the file on the Linux server. There is no note in the device's manual requiring a Windows server or any specific OS! I've also set the file permissions on the Linux server to 777 so there are no restrictions. What could be the cause of our problem? Thanks.

  • Google webmaster tools: changing address from domain name to subdomain

    - by Charliz
    We originally had our blog on our main domain (for example, it would be on www.example.com). Now we have moved it to http://blog.example.com. My question is: how do we change the address from www.example.com to blog.example.com? I read this http://www.google.com/support/webmasters/bin/answer.py?answer=83106 and it said to make sure your site is on the main domain, not a subdomain, but I'm trying to move the site to a subdomain. Help.
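
    Whatever the Change of Address tool ends up supporting here, the usual way to signal a move like this to search engines is a permanent (301) redirect from each old URL to its new home. A minimal .htaccess sketch, under the purely illustrative assumption (the question doesn't say) that the old posts lived under a /blog/ path on the main domain:

        # Illustrative only: 301-redirect old blog URLs on www.example.com to
        # the new subdomain, preserving the rest of the path. The /blog/
        # prefix is an assumption - adjust the pattern to the real structure.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^blog/(.*)$ http://blog.example.com/$1 [R=301,L]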

  • Clean URLs using .htaccess

    - by Napster
    I am trying to implement clean URLs using .htaccess. After searching for some time, I found this code:

        RewriteRule latestnews/([a-zA-Z0-9]+)/$ http://thinkmovie.in/index.php/latestnews/?nid=$1 [L]
        RewriteRule latestnews/([a-zA-Z0-9]+)$ http://thinkmovie.in/index.php/latestnews/?nid=$1 [L]

    So when I try to access the following URL, http://thinkmovie.in/index.php/latestnews/272, it redirects to http://thinkmovie.in/index.php/latestnews?nid=272. But what I want is to retain the URL in the browser's address bar as http://thinkmovie.in/index.php/latestnews/272.
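
    A minimal sketch of the usual fix (an assumption about the setup, not something stated in the question): when the substitution is a path rather than a full http://host/... URL, mod_rewrite performs an internal rewrite instead of sending the browser elsewhere, so the address bar keeps showing the clean URL.

        # Illustrative only, assuming this lives in the site root's .htaccess
        # and index.php reads the nid query parameter. A path-only
        # substitution is an internal rewrite, so the visitor keeps seeing
        # .../latestnews/272 in the address bar.
        RewriteEngine On
        RewriteRule ^index\.php/latestnews/([a-zA-Z0-9]+)/?$ index.php/latestnews/?nid=$1 [L]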

  • Is this Anti-Scraping technique viable with Crawl-Delay?

    - by skibulk
    I want to prevent web scrapers from abusively scraping the 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" error code to users that access an abnormal number of pages per minute. I don't want search engine spiders to ever receive the error. My inclination is to set a robots.txt crawl-delay that will ensure spiders access a number of pages per minute that stays under my 503 threshold. Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?
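
    For illustration, a minimal robots.txt sketch of the idea (the delay value is made up): Crawl-delay asks compliant crawlers to wait the given number of seconds between requests, i.e. at most 60/N pages per minute. Note that support varies - Bing and Yandex honor the directive, while Googlebot ignores Crawl-delay and uses the crawl-rate setting in Webmaster Tools instead.

        # Illustrative robots.txt: ask compliant crawlers to wait 10 seconds
        # between requests (at most 6 pages per minute), comfortably below a
        # per-minute 503 threshold.
        User-agent: *
        Crawl-delay: 10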

  • How to discredit hacked links pointing at my company's website

    - by Dan Gayle
    The competition of one of my company's websites has started a really dirty campaign of acquiring hacked links. One of their ingenious tactics has been to seed links to OUR site within their hack bot, making US look like we might be responsible for it, or using us to cover their tail. These are .gov and .edu sites. Is there any way possible to discredit these links? To disavow them at all? EDIT: Penguin has really affected this question, IMO. Does anyone know if there is a revised opinion on disavowing backlinks to your site?
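
    For reference, a minimal sketch of the file format Google's Disavow Links tool accepts (the URLs below are hypothetical placeholders, not the sites from the question): a plain text file with one URL per line, a whole host via the domain: prefix, and # for comments.

        # Hacked pages we never asked to link to us
        http://compromised.example.edu/some-hacked-page.html
        http://infected.example.gov/another-hacked-page.html
        # Disavow everything from an entire host
        domain:spammy-network.example.com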

  • Is there a list of shortcuts which are not reserved in any major browser?

    - by MainMa
    In the past, when I implemented shortcuts for web applications, I used different ones for different browsers: for example, Ctrl+Shift+A was used in Chrome but would be something else, like Ctrl+Shift+C, in Firefox, since Firefox reserves the Ctrl+Shift+A shortcut for add-ons. Now I have a project where it would be nice to have the same shortcuts in every browser. Is there a list of shortcuts which are not reserved in any major browser (Chrome, Firefox, Safari, IE and Opera), or do I have to compile my own list? Google wasn't helpful, since it mostly shows lists of shortcuts which are common to all major browsers.

  • Repeat use of Schema / Rich Snippets Markup, i.e. LocalBusiness Data

    - by bybe
    I am unable to find official wording and I'm hoping that some Rich Snippets/Schema guru can give me some insight into the proper usage of repeated content when it comes to markup. I'm building a site that wants to use Schema as the markup type, and the owner would like as much usage as possible. The business name, telephone and address will appear on every page; is it valid, or even useful, to use Rich Snippets markup on every page where this information is displayed? This information appears in the header and footer of every page of the site, and to give you an example of my current markup, see below:

        <body itemscope itemtype="http://schema.org/LocalBusiness">
          <header>
            <a itemprop="url" href="http://www.domain.co.uk/">
              <img itemprop="logo" src="image.png" alt="Company Name Logo" />
            </a>
            <span itemprop="telephone">01202 000 000</span>
          </header>
          <div> This is where the content will go</div>
          <footer>
            <span itemprop="name">Company Name</span>
            <span itemprop="description"> A small little bit about this company</span>
            <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
              <span itemprop="streetAddress">Address Goes here</span>
              <span itemprop="addressLocality">Area Here</span>,
              <span itemprop="addressRegion">Region Here</span>
            </div>
          </footer>
        </body>
        <!-- Local Business Schema now closed -->

    So, as you can see above, this information will be displayed on every single page. Is it valid or bad practice to repeat this information in Schema format on every page?

  • Google Authorship image for my blogspot has disappeared from Google SERPs. Why?

    - by Sathiya Kumar
    I have a blogspot blog, and I used Google Authorship markup so that my image appears in the Google SERPs for my keywords. My image was shown for the last 2 months, but while checking the SERPs for my blog, I found that my authorship markup is no longer working: my image, name and G+ follower count do not appear next to my blogspot URL in the SERPs. I didn't make any changes to my Google+ profile or to my blogspot header tag where I had put the authorship code. I tried to find the reason but couldn't find any useful answer. Can anyone answer this question? Please let me know if you have already experienced something like this.

  • How do I word my URL so that it doesn't get blocked or appear spammy

    - by user18681
    I'm creating a fairly large site. Will my links appear spammy if I use the same word in the URL path as in the slug? For example: www.example.com/apples/great-apple-recipes www.example.com/apples/fresh-apple-pie www.example.com/apples/delicious-apple-turnovers I do not want my links to appear spammy. But is it OK if the keyword in the path is almost always the same as in the slug on a huge site? Does the path count as part of the keyword? Also, how many words in total should a URL (including the path, etc.) be?

  • Website Stopped Showing in Google Search Results Suddenly

    - by Aman Virk
    I have a design and development blog, http://www.thetutlage.com (1.5 years old), which was doing really well in Google search: I was getting over 70% of my traffic from Google. Suddenly, over the last two days, the share of traffic from Google dropped from 70% to 20%, and when I search for the exact posts that I created, even after appending my website name, Google does not show any results for them. Sample search text: JQuery Game Programming Creating A Ping Pong Game Part 1. I have a post with exactly that title and it does not show up anywhere in Google search. I am totally shocked; I write my own unique content and follow Google's guidelines like the Bible. Also, there is no message under my Webmaster Tools account stating any problem or error.

  • Pointing domain away from Google

    - by redbeard
    I've already asked this question on Google Apps support but didn't get an answer yet, and I am in a bit of a hurry: I need to have the website up by Monday. Question: we have a domain registered at 101domain; we used the domain just for Google Apps, for email and some services, and the domain was never used for a website. Now we have pointed the domain to a local hosting service, set everything up as it should be, and pointed it back to Google from the hosting service for Gmail, and Gmail works perfectly. The problem is that we tried to set up a website at the hosting service, but the domain still points to the Google Apps login. Everything looks like it should; we recently bought 2 more domains that are set up the same way (we use Google Apps on them too), and the only difference that I can think of is that we didn't use those for Apps first. Could this be because the CNAME change lags behind the DNS server change?

  • Shared Hosting Provider [closed]

    - by Garry
    Possible Duplicate: How to find web hosting that meets my requirements? I've been with Dreamhost for 5 years but the amount of downtime I have experienced over the last 6 months has been outrageous. As of now (2012) which hosting provider would you recommend? Most of my sites are small to medium readership blogs running WordPress. I've been looking at Inmotion and Hostgator. Reliability is paramount. Thanks

  • Is it good or bad to have dynamic content in page titles and/or description

    - by Gunjan
    On a local listing website, I append the number of search results found to the description meta tag of the page (not the title, currently), as I think this is valuable for users, e.g. "Find address, phone numbers, blah blah blah for 21 outlets in locality. Some more stuff after this..." As more places are added to the database, the description for the same page will change frequently. Is this good or bad for SEO? And how about doing the same for title tags?

  • Download/submission directories are dead? Are they good for SEO?

    - by fborozan
    We have just released a new major version of our software product. In the past, if you wanted visibility, you would create a standardized PAD file and submit it to hundreds of directories, or you used a web service that would do that for you. These directories would then serve as the first incoming links to your website. How about today? I think download directories are pretty much dead. Do you think this is still a good SEO approach today? Are these software download directories useless?
