Search Results

Search found 9728 results on 390 pages for 'zee pro'.


  • Facebook link to facebook.com/company-page doesn't work

    - by Teo
    For the last two days I've been trying to find out why my previous setup, a link to my website's Facebook page, no longer works. I assume I mistakenly changed something in the Facebook developer area, but I can't remember what it was. The button previously linked to Facebook.com/company-page; now the same button links me just to Facebook.com. I think I saw a redirect in the tab, but I'm not sure, since it changes to Facebook.com too fast. The original link in the footer is correct: <a href="http://facebook.com/company-page " target="_blank" class="facebook_ico"></a>. Any ideas?

  • Looking for Hosting Companies that Meet the Following Criteria [closed]

    - by Bryan Hadaway
    Possible Duplicate: How to find web hosting that meets my requirements? Please note: this is not a subjective question and I am not looking for opinions. This is very much an objective question with a legitimate use and purpose: to identify hosts that offer the following: a multi-domain SSL certificate; a Linux server; PHP 5+; cPanel; unlimited storage, bandwidth, MySQL databases, and addon domains. SSL is mentioned first because it is the most important. This is not a single-domain or wildcard SSL cert; it's relatively new and unique, and its purpose is to secure multiple domains on one account without needing an entirely separate hosting account and SSL cert for every domain. I'm currently using BlueHost/HostMonster, which meets all my criteria except for this special kind of SSL cert. So far, HostGator is the only host I've been able to find that offers everything listed. Again, I'm not requesting recommendations, advice, or opinions on the best or most reputable service based on your experiences. I am asking for an objective list of known hosts that offer the aforementioned items only. Thereafter, I (and others who this will benefit) can make our comparisons and selection privately.

  • Google Webmaster Tools crawl error

    - by Shiro
    I am looking into the Google Webmaster Tools Crawl Errors section. How should I handle URLs that another site's system or application has mangled into invalid ones? E.g., my URL is http://www.example/images/products/s_=enlarge_16gb.jpg but, for whatever reason, Yahoo Groups breaks the link into http://www.example/images/products/s_= enlarge_16gb.jpg and makes only the top part a hyperlink, i.e. http://www.example/images/products/s_= Because of that URL, Google reports a crawl error, and I get a few errors from this kind of result or from other people's typos. How do I prevent this? I obviously can't go and change other people's posts. What is the solution for this? Thanks!
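
    A hedged sketch of one common way to clear such errors, assuming an Apache host with mod_rewrite and rules in an .htaccess file (the path mirrors the example above): answer the truncated URL with "410 Gone" so crawlers stop retrying it.

        RewriteEngine On
        # links truncated by third-party sites end at the "=" sign; a 410
        # tells crawlers the URL is intentionally dead
        RewriteRule ^images/products/s_=$ - [G]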

  • What is the best approach to copy public dynamic pages?

    - by Renan
    Situation: the government is supposed to publish official information online, such as acts and laws. Problem: they're using '90s expertise to do it. You can tell by the constant use of deprecated HTML tags such as <table> and the complete lack of compression, which pushes some documents past 700,000 bytes even though they're pure text. Side problem: some companies are actually editing and selling this content, which should be public and free. What I need to know is the best approach to offering said official content on my own site for free. I've thought of setting up a mirror that copies the official pages from time to time, since some of them are updated frequently; the copies would automatically be compressed, as all my pages are, via .htaccess.
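
    For reference, a sketch of the kind of .htaccess compression the poster mentions, assuming Apache with mod_deflate available (the MIME-type list is illustrative):

        <IfModule mod_deflate.c>
            # compress text-heavy responses before they leave the server
            AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
        </IfModule>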

  • Why does Google reject my title tag change?

    - by Michal P.
    I made a simple webpage, http://pundaquitboat.michaelspages.com/, gave it the title tag "Boat – Pundaquit", and submitted it to the Google bot through Google Webmaster Tools. Then I decided to change the title of the same page to "Anawangin trip" and submitted the webpage again in the same way. The result was that the new title coexisted with the old title in the SERPs for maybe two days. After that, the new title was rejected, and if I enter site:pundaquitboat.michaelspages.com/ I can see that Google still has the old copy of my webpage, with the old title, in its database. This problem doesn't occur in Bing, where I enjoy a high position for the phrase "Anawangin trip". (In Bing I hadn't submitted the old version of the title.)

  • What do I do if a user uploads child pornography?

    - by Tom Marthenal
    If my website allows uploading images (which are not moderated), what action do I take if a user uploads child pornography? I already make it easy to report images and have never had this problem before, but I'm wondering what the appropriate response is. My initial thought is to: immediately delete (not just make inaccessible) the image; file a report with the National Center for Missing and Exploited Children with all the information I have on the user (IP, URL, user-agent, etc.), identifying myself as the website operator and providing contact information; and check any other images uploaded from that IP or user and prevent them from uploading in the future (this is impossible to guarantee, but I can at least block their account). This seems like a good way to report responsibly, but does it satisfy all of my legal and moral responsibilities? Would it be better not to delete the image and just make it inaccessible, so that it can be sent to the National Center for Missing & Exploited Children, the police, the FBI, etc.?

  • Small IIS web farm: create an Active Directory domain or not?

    - by brian b
    We have a smallish web farm of fewer than 5 Windows 2008 servers. Some hold data; most do IIS hosting. Is it a good or bad idea to set up a domain controller and put them all in the same "production" domain? We want to avoid a world where we have to sync multiple admin passwords between the boxes (or share admin credentials among the team). Presumably the DC would be just another VM, so hardware cost doesn't enter into the discussion.

  • Domains with similar names and legal issues

    - by abel
    I recently purchased one of those domain names like del.icio.us. While registering, I found that delicious.com was already in use. Argument: delicious.com belongs to the same category as my to-be website; it serves premium delicious dishes. Counter-argument: my to-be domain, though in the same category, specializes in serving free but delicious dishes, or in giving out (affiliate) links to other sites that serve premium delicious dishes. Additional counter-arguments: 1. delicious.com is not in English. 2. The del.icio.us in my domain name, though spelled the same, is not going to be used in the same fashion. For example (this may not make sense, because the names have been changed), the "d" in delicious on my website actually stands for the Greek letter delta (Δ/δ), and since internationalized domains are still not easily typable, I am going for the English equivalent. The prefix holds importance for the theme of the service my website intends to offer. My question: can I use the domain name del.icio.us for my website? How are these kinds of matters dealt with? (The domain names used are fictitious. I have already registered the domain but have not started using it; I chanced upon this domain name because it was short, easy to remember, and suited the theme of my website.)

  • Rewriting URLs with lighttpd

    - by Tim Post
    I'm using lighttpd to serve a GET-based API that I'm working on, and I'm having some difficulty with rewriting requests. My API calls are very simple. An example would be: url:/method/submethod?var1=something&var2=something&key=something This is what I have: url.rewrite-once = ( "^/methodfoo(.*)" => "/index.php$1&method=methodfoo") This would work fine if all methods were shallow, but I have methodfoo/submethod to deal with. What I'd like to do is use a rule that can split this up for me, appending a &submethod= to the end of the rewritten string. For instance: url://methodfoo/submethod?foo=bar&foobar=foo would be rewritten to: url://index.php?foo=bar&foobar=foo&method=methodfoo&submethod=submethod Can I do that without an explicit rule for each submethod? Additional information: yes, I know I can use a rule like: "^/methodfoo/(.*)/(.*)" => "/index.php$2&method=methodfoo&submethod=$1" However, that fuglifies (TM) my link structure, as it would have to match: url://methodfoo/submethod/?foo=bar&foobar=foo when I really want: url://methodfoo/submethod?foo=bar&foobar=foo Thanks in advance for any suggestions.
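
    A hedged sketch of a generic rule pair that splits any method/submethod path, assuming lighttpd's url.rewrite-once (which matches the query string as part of the URL) and that method and submethod names never contain a slash; requests without a query string would need a companion rule:

        url.rewrite-once = (
            # /method/submethod?args -> /index.php?args&method=...&submethod=...
            "^/([^/?]+)/([^/?]+)\?(.*)$" => "/index.php?$3&method=$1&submethod=$2",
            # /method?args -> /index.php?args&method=...
            "^/([^/?]+)\?(.*)$" => "/index.php?$2&method=$1"
        )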

  • Mod Rewrite - URL rewriting

    - by modrewriteNewbie
    I am very new to mod_rewrite. I need to redirect any user with a "citizenhawk" parameter in their URL to my root URL. For example, http://www.mywebsite.com/?sc=CX12N003&cm_mmc=affiliate--citizenhawk--nooffer-_-na&prfc=5&clickid=0004c845fa9a87050a4277221a003262 should result in http://www.mywebsite.com/ Here are my rewrite directives: RewriteCond %{QUERY_STRING} (&|^)cm_mmc=(.)citizenhawk(.)(&|$)$ RewriteRule ^/rrs/ [NC,R=302,L] Where am I going wrong? Is my RewriteCond wrong?
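
    Two likely problems, offered as a hedged reading: in a per-directory .htaccess the RewriteRule pattern should not start with a slash, and the rule above is missing its substitution argument, so "[NC,R=302,L]" is parsed as the target rather than as flags. Also, (.) matches exactly one character, not a run of them. A sketch of a corrected rule set, assuming Apache with the rules in .htaccess and keeping the original 302:

        RewriteEngine On
        # match "citizenhawk" anywhere inside the cm_mmc query parameter
        RewriteCond %{QUERY_STRING} (^|&)cm_mmc=[^&]*citizenhawk [NC]
        # send the visitor to the homepage; the trailing "?" drops the query string
        RewriteRule ^ /? [R=302,L]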

  • Templates for forms, tabs etc? - Patterntap alternatives

    - by Marco Demaio
    I used to find http://www.patterntap.com quite useful for design inspiration for forms, tabs, and other web elements. Unfortunately, since the ZURB acquisition of Patterntap, you are forced to sign in with your Twitter account simply to view larger images of the patterns provided by the crowd, so in a way it's not free anymore. Do you know of alternatives to Patterntap that are free and don't oblige you to sign in?

  • Global vs. Local Monthly Searches in Adwords keyword tool

    - by Gregory
    I'm trying to learn how to use the keyword tool in AdWords. Here's what I entered: country: Russia; language: Russian; desktop and laptop devices; and the keyword туры в Израиль ("tours to Israel" in Russian Cyrillic letters), as a broad match type. Now, the results I got were: global monthly: 60,500; local monthly: 40,500. If I've got it right, "global monthly" in this context means the worldwide average of monthly searches for this term in ANY language on any Google search site (google.ru, google.com.ua, google.com, google.fr, etc.). That's all nice, BUT... then I made a query for tours to Israel in English in the US, and I got: global monthly: 60,500; local monthly: 27,100. That doesn't make any sense to me! How come the total (global) is actually smaller than the combined sum of just TWO countries? (27,100 + 40,500 = 67,600 > 60,500.) By "any language", do they mean a translation of the term into ANY possible language? Or maybe by "language" Google means the language of the searcher's operating system? Or their browser's language?

  • List of usage information to collect in a web application

    - by Thomas Levine
    I'm writing a web application that will allow people to create accounts, edit stuff, send stuff to people, etc. I plan on recording things like when items were created and sent. Is there a list of usage information one should collect in a web application? I'd like to check whether I'm missing something. Also, is there a list of usage information one shouldn't collect (like information people consider private)?
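
    A minimal sketch in PHP of the kind of event recording the poster describes; the table name, columns, and event names are illustrative assumptions:

        <?php
        // record one usage event: who did what, from where, and when
        function log_event(PDO $db, int $userId, string $action): void
        {
            $stmt = $db->prepare(
                'INSERT INTO usage_events (user_id, action, ip, user_agent, created_at)
                 VALUES (?, ?, ?, ?, NOW())'
            );
            $stmt->execute([
                $userId,
                $action, // e.g. "account.created", "item.sent"
                $_SERVER['REMOTE_ADDR'] ?? '',
                $_SERVER['HTTP_USER_AGENT'] ?? '',
            ]);
        }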

  • Should I post my PDF library for SEO? [closed]

    - by Iunknown
    Possible Duplicate: Do search engines crawl PDFs, and if so are there any rules to follow when making them? When a sales call comes in, the caller often says something like: 'I searched for 3 days before finding your product, and it's exactly what I need!' That tells me I need some SEO work. We redid our website and streamlined it, which removed many of our how-to documents. Since those PDF documents contain words people might search for, I was wondering if I could add a 'Complete library' link to the bottom of a page that loads the entire PDF library. Would that help my ranking?

  • What's wrong with my htaccess? (500 Error)

    - by Dany Khalife
    I've written a small .htaccess file to redirect Internet Explorer users to a specific page. Here are the contents:

        # MS Internet Explorer - Mozilla v4
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4(.*)MSIE
        RewriteRule ^index\.php$ /sorry.php [L]

        # All other browsers
        #RewriteRule ^index\.html$ /index.32.html [L]

    Any clue why this would give a 500 Internal Server Error? I have used mod_rewrite before, so I have the module loaded there...
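
    The directives themselves look syntactically valid. If mod_rewrite is in fact absent, RewriteEngine is an unknown directive and Apache answers 500; a hedged guard for that case (it will not help if AllowOverride denies FileInfo directives, the other usual culprit):

        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4(.*)MSIE
            RewriteRule ^index\.php$ /sorry.php [L]
        </IfModule>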

  • Does Google treat AWS IP addresses as related?

    - by ElHaix
    We are hosting several websites on one of our servers and wonder whether, because they are on the same subnet, they have somehow been penalized. We are not interlinking between the websites. However, in an attempt to have everything hosted on AWS, we will have some sites that we do want interlinked. If those sites resided on the same subnet, that could be bad. With AWS, however, we can allocate multiple Elastic IP addresses that reside on different subnets. How does Google deal with this?

  • Economical DNS hosting separate from local registrar for country-specific TLDs - email or web hosting not required

    - by Eric Nguyen
    Our company owns many country-specific top-level domains (TLDs: .sg, .my), and we will purchase more for other countries across South East Asia. These domains are associated with our websites hosted on Amazon EC2. The DNS records are currently hosted on a dedicated server that will shut down tomorrow (the name servers are set to those of a web hosting company), so I need to host the DNS records somewhere else. Hosting the DNS records with the local registrar costs SGD 18 a year per domain on top of the domain price (which is already very expensive, but we have no choice). It would be convenient to host the DNS records for all our country-specific TLDs with a single service, separate from the local registrars from which we bought the domains. A few searches turned up examples like Amazon Route 53, dnsmadeeasy.com, and the like. However, since I'm only concerned with country-specific TLDs, not .com: 1) Is it really economical to host the DNS records of all domains in one place as described above? (Have the relevant countries and/or local registrars done something to keep their monopoly and always charge ridiculous prices for their country-specific TLDs?) 2) I imagine I will need to tell the local registrars to update the name servers to those of the DNS hosting provider, e.g. dnsmadeeasy.com here. Am I correct about how this works? 3) Will I be able to point the TLDs themselves to the IP addresses I desire (the EC2 instances where my websites are), or only subdomains? 4) Are there any drawbacks I should know about? Background on our needs: the websites associated with the country TLDs must be up and running at all times; we need to be able to add/edit A and CNAME records; and we use Google Apps for Business for internal email, so I also need to add/edit MX and TXT records.
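
    A sketch of the record types involved, in BIND zone-file notation; the names and addresses are illustrative assumptions, and note that the domain apex normally takes an A record rather than a CNAME:

        example.sg.      IN  A      203.0.113.10          ; apex pointed at an EC2 Elastic IP
        www.example.sg.  IN  CNAME  example.sg.
        example.sg.      IN  MX 1   aspmx.l.google.com.   ; Google Apps mail
        example.sg.      IN  TXT    "v=spf1 include:_spf.google.com ~all"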

  • Which token from a long User-Agent should I use in robots.txt?

    - by Gaia
    The definition of User-Agent states that several tokens can be included, as deemed necessary by the client. I want to block certain bots via robots.txt, and I am confused as to which part of the User-Agent string to use, especially for the more obscure bots. For example:

        Mozilla/5.0 (compatible; uMBot-LN/1.0; mailto: [email protected])
        JS-Kit URL Resolver, http://js-kit.com/
        Mozilla/5.0 (compatible; SEOkicks-Robot +http://www.seokicks.de/robot.html

    Do I use the second token? Can tokens contain spaces, or did the SEOkicks folks forget a semicolon after SEOkicks-Robot? I don't actually intend to make my question specific to a couple of bots; I want to know the guideline: which part of the UA do I place in robots.txt for these exotic bots with UAs as long as a haiku?

        User-agent: uMBot-LN/1.0
        Disallow: /

    PS: Thank you, but I do not need to hear that undesirable bots are better blocked with mod_security. I already have commercial mod_sec rules in place.
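
    A hedged sketch of the usual convention: use the bot's product-name token without its version, since well-behaved crawlers match the User-agent line against their product token rather than against the full string:

        # the product name alone matches regardless of version
        User-agent: uMBot-LN
        Disallow: /

        User-agent: SEOkicks-Robot
        Disallow: /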

  • Adsense click bot is click bombing my site

    - by Graham
    I have a site that gets roughly 7,000 - 10,000 page views per day right now. Starting around 1 AM on 7/1/12, I noticed the CTR rising dramatically. These clicks would be credited, then de-credited soon after, so they were obviously fraudulent. The next day I had about 200 clicks in the account, with about 100 of them fraudulent: about 3 - 8 per hour, evenly dispersed across each of the three ads, 24 hours a day. This leads me to believe it's some sort of AdSense click bot. Also, I removed the ads last evening, then put them back up around 3 AM, and the invalid clicks started within 10 minutes. I signed up for statcounter.com to analyze the exit links on the AdSense ads, then conditionally blocked ads for the IP address of the person/bot I suspected. But I think the bot has several proxies to choose from and can refresh IP addresses. I've notified Google through the invalid-click form/email 4 times over the past two days to let them know I'm aware of the situation and am working on a solution. I've also temporarily removed all ads on that site. How can I block a bot like this? Thank you.
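
    A minimal PHP sketch of the conditional blocking the poster describes; the blocklist addresses and the include name are illustrative assumptions, and this cannot stop a bot that rotates proxies:

        <?php
        // suspected click-bot addresses gathered from log analysis
        $blockedIps = ['198.51.100.23', '198.51.100.87'];

        // serve the AdSense markup only to visitors not on the blocklist
        $visitorIp = $_SERVER['REMOTE_ADDR'] ?? '';
        if (!in_array($visitorIp, $blockedIps, true)) {
            include 'adsense_snippet.php'; // hypothetical file holding the ad code
        }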

  • My blog not even ranking for exact title match [on hold]

    - by Akshay Hallur
    I have original, detailed blog posts about blogging and SEO. The domain was dropped (expired) twice before my acquisition; I am its third owner, for the last 143 days. Blog posts are not ranking even for exact title matches; Google+ or LinkedIn shares show up instead of my content, and some blog posts are not indexed at all. I hardly get around 7 organic visits a day. Example 1: http://www.infoflame.com/offer-pdf-of-blog-posts-for-likes-and-shares/ with the title "Offer Readers PDF of Blog Posts for Their Likes and Shares" is not indexed at all. Example 2: http://www.infoflame.com/anchor-text-for-seo/ is indexed but does not come up for its exact title. My suspicion: a dropped domain, possibly used for spam in the past (the Wayback Machine shows 3 captures since 2004 across the 2 drops; I don't know whether there was email spam), but there are no manual actions in Webmaster Tools, so no reconsideration request is possible. What's the reason for this? Should I wait? How can I tell Google that ownership has changed and the domain is now spam-free? Or should I de-index it and start a new blog? Thank you for any advice.

  • Is the Mailchimp API available in other languages?

    - by boundaryfunctions
    I'm using the Mailchimp API in combination with PHP and jQuery to provide the subscribe/unsubscribe actions on a website via Ajax. On errors in user data you get useful messages like "Invalid Email Address", "[email protected] is already subscribed to list x. Click here to update your profile." or "There is no record of "[email protected]" in the database". I want to keep these messages, but is there a way to get them in other languages (in particular, German)? How would I achieve this? I wasn't able to find anything about it in the Mailchimp docs. I'd rather not translate them myself...
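
    If the API returns English only, one fallback (a PHP sketch, not a Mailchimp feature) is translating the known strings before display; the message map is an illustrative assumption:

        <?php
        // map known Mailchimp error strings to German equivalents
        $translations = [
            'Invalid Email Address' => 'Ungültige E-Mail-Adresse',
        ];

        // fall back to the original English text for unmapped messages
        function translateError(string $message, array $translations): string
        {
            return $translations[$message] ?? $message;
        }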

  • Increasing traffic for music blog

    - by Wladimir Ivanov
    I own a music blog with some articles and, of course, embedded YouTube iframes and MP3 listening plugins. I get traffic from Google and from one not-very-popular site (I won't mention it so you don't think this is spam) where I post links with pictures to my posts. Any ideas for getting more traffic to this kind of blog? I already know about Myspace, blog directories, video-sharing sites, same-niche blogs, and relevant forums. Anything I'm missing? Thanks in advance.

  • SEO optimisation problems after Google Panda [on hold]

    - by Daniel West
    I am currently trying to improve a website's SEO after it took quite a hit from the Google Panda updates. What are the main things I need to look at when trying to improve its ranking in Google? I have already made sure the pages validate to W3C standards, minified the CSS and JS, and done the obvious meta-tag and header optimization, but this hasn't made any difference yet. It could be a content issue, as the pages currently read much like a brochure, and there were some pages with just a video and no text content, which is also an issue. I've added a rel="nofollow" attribute to the links to those pages, although I'm told this doesn't really work anymore. If anyone has any ideas, let me know. Cheers!

  • Permanent redirect of domain to www subdomain without web.config

    - by Lord Simpson
    I've just set up a site with 1and1 and have run into an issue. I want to accomplish the simple task of redirecting the root domain to the www subdomain, but due to complications I can't seem to find a way to make it work. I'm on a Microsoft (ASP.NET) package, so I can't use .htaccess, and their IIS server doesn't have the URL Rewrite module installed (so I can't use <rewrite> in web.config). They have built-in HTTP forwarding options, but if I set the root domain to redirect to the www subdomain, it just redirects infinitely. Hopefully there is some obvious option/method I've missed during the past two days of searching!
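
    One route that needs neither .htaccess nor the URL Rewrite module is issuing the redirect from application code. A hedged C# sketch for Global.asax on an ASP.NET 4+ site; it assumes the www host is served by the same application and that the host's forwarding option is switched off (otherwise the loop persists):

        // Global.asax.cs
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            var url = Request.Url;
            if (!url.Host.StartsWith("www.", StringComparison.OrdinalIgnoreCase))
            {
                var target = new UriBuilder(url) { Host = "www." + url.Host };
                Response.RedirectPermanent(target.Uri.ToString()); // 301
            }
        }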

  • Best way to prevent Google from indexing a directory [duplicate]

    - by Gkhan14
    This question already has an answer here: Stopping Google index some web pages (5 answers) I've researched many methods of preventing Google and other search engines from crawling a specific directory. The two most popular I've seen are: adding it to the robots.txt file: Disallow: /directory/ or adding a meta tag to each page: <meta name="robots" content="noindex, nofollow"> Which method works best? I want this directory to remain "invisible" to search engines so it does not affect my site's ranking in any way. In other words, I want the directory to be neutral/invisible and "just there". Which method is best for achieving this?
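
    A third option worth knowing, sketched here under the assumption of Apache with mod_headers and an .htaccess file inside the directory: an X-Robots-Tag response header applies noindex to every file served from there, including non-HTML files that cannot carry a meta tag. Note that a robots.txt Disallow prevents crawling but not indexing of URLs discovered via links, whereas noindex requires the page to remain crawlable:

        <IfModule mod_headers.c>
            # tell crawlers not to index or follow anything served from this directory
            Header set X-Robots-Tag "noindex, nofollow"
        </IfModule>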
