Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • Transfer .com domain to GoDaddy - websites running on the same domain - 3 weeks left until expiration, 2 days of web hosting left

    - by Eric Nguyen
    Our company purchased the abc.com domain from a local registrar. The domain will expire in about 3 weeks. Our main websites run on this abc.com domain and they cannot be down for long. The web hosting service will end in 2 days, but the websites are already hosted on Amazon EC2 and are up and running. We would like to transfer the domain to GoDaddy now, or as soon as possible, since we have many other domains there and we believe GoDaddy will be better in the long term considering the prices and features it offers. There are several questions around the decision to transfer the domain to GoDaddy:
    1) What cost and time are required to move out of our local registrar? This is currently unknown, as I'm still trying to retrieve the agreement we have with them.
    2) How do the 3 weeks left until the domain expires matter here? Should we wait until the domain expires and then purchase it through GoDaddy? How long would such a process take, as I suppose our websites would be down during that time? Any other drawbacks?
    3) What can I do to ensure our websites keep functioning regardless of the domain transfer process? It seems the actual registrar here is enom.com and the local registrar just partners with it, so I suppose I should park the abc.com domain with enom.com and change the DNS settings so that our websites can continue to be hosted on EC2 as normal. How long does it normally take for a domain to be transferred to GoDaddy completely? Is it even possible to keep our websites up and running during the whole domain transfer process?
    Apologies for throwing many questions at the same time here. It's rather last minute and I suddenly realised there are too many unknown risks.

    Read the article

  • Singular or Plural Nouns as file names for better Search & SEO friendliness? [closed]

    - by Sam
    Possible Duplicate: Should I use singular or plural nouns in a domain name and why? Dear folks, here are two scenarios where the file names should best match what audiences actually search for.
    Scenario 1: website.org/en/logo.php, website.org/en/brochure.php, website.org/en/poster.php, website.org/en/design.php
    OR
    Scenario 2: website.org/en/logos.php, website.org/en/brochures.php, website.org/en/posters.php, website.org/en/designs.php
    Q1. What do you intuitively think would be best?
    Q2. What do the facts generally show: do people search for singular or plural terms?
    Q3. Do search engines have a common rule of thumb for this?
    Q4. Should I pick one scenario and use it consistently, or does it depend on the word?
    Thanks very much for your ideas/suggestions. I really don't know which one to go for.

    Read the article

  • Hiding PHP includes from search spiders?

    - by 21stcn
    Quick and simple question. I have 80+ HTML files which I want to be crawled. They are individual product pages. Each of these pages pulls in its content using PHP includes. These PHP include files are in a separate folder on the server and contain the core content for the individual product pages. I just wanted to ask: if I use robots.txt or .htaccess to prevent crawling of the directory that holds the PHP content files, will the HTML pages that include these files still be crawled without issue? What I want to achieve is to have the HTML files indexed with the PHP content included in them, but I don't want visitors landing on the PHP content pages, nor these PHP files indexed as duplicate content. I just need clarification as to whether it is safe to block spiders from accessing the PHP folder without this affecting the HTML files being indexed with the included content. Is this the best way to do things? Or should I just leave the content PHP files to be crawled?
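
    A minimal robots.txt sketch for this kind of setup is shown below; the folder name includes/ is only an assumption, so substitute the actual directory holding the PHP content files. Blocking that folder only stops crawlers from requesting the include files as standalone URLs; it has no effect on the HTML pages that include them server-side, because a crawler only ever sees the rendered HTML output.

        User-agent: *
        Disallow: /includes/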

    Read the article

  • Properly force SSL with .htaccess, no double authentication

    - by cwd
    I'm trying to force SSL with .htaccess on a shared host. This means I only have access to .htaccess and not the vhosts config. I know you can put a rule in the VirtualHost config file to force SSL, which will be picked up there (and acted upon first), preventing double authentication, but I can't get to that. Here's the progress I've made.
    Config 1. This works pretty well, but it does force double authentication if you visit http://site.com - once for http and then once for https. Once you are logged in, it automatically redirects http://site.com/page1.html to the https counterpart just fine:
        RewriteEngine On
        RewriteCond %{HTTPS} !=on
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
        RewriteEngine on
        RewriteCond %{HTTP_HOST} !(^www\.site\.com*)$
        RewriteRule (.*) https://www.site.com$1 [R=301,L]
        AuthName "Locked"
        AuthUserFile "/home/.htpasswd"
        AuthType Basic
        require valid-user
    Config 2. If I add this to the top of the file, it works a lot better in that it will switch to SSL before prompting for the password:
        SSLOptions +StrictRequire
        SSLRequireSSL
        SSLRequire %{HTTP_HOST} eq "site.com"
        ErrorDocument 403 https://site.com
    It's clever how it uses the SSLRequireSSL option and the ErrorDocument 403 to redirect to the secure version of the site. My only complaint is that if you try to access http://site.com/page1.html it will redirect to https://site.com/. So it is forcing SSL without a double login, but it is not properly forwarding non-SSL resources to their SSL counterparts. Regarding the first config, Insyte mentioned "using mod_rewrite to perform a simple redirect is a bit of overkill. Use the Redirect directive instead. It's possible this may even fix your problem, as I believe mod_rewrite rules are some of the last directives to be processed, just before the file is actually grabbed from the filesystem". I have had no such luck finding a force-SSL option for the Redirect directive, and so have been unable to test this theory.
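
    One possible way to keep Config 2's behaviour while preserving the requested path is to point the 403 ErrorDocument at a small PHP script instead of a fixed URL. This is only a sketch: the file name forcessl.php is made up, and it assumes the host allows a local error document and that PHP still sees the original request path (REQUEST_URI, or REDIRECT_URL on some setups).

        SSLOptions +StrictRequire
        SSLRequireSSL
        ErrorDocument 403 /forcessl.php

        <?php
        // forcessl.php (hypothetical name): redirect the denied plain-HTTP request
        // to the same path on https, so deep links keep working.
        $host = 'www.site.com';
        $uri  = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/';
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: https://' . $host . $uri);
        exit;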

    Read the article

  • Is there a Content Management System that allows multiple & independent blogs to be running on one domain?

    - by Ron
    Hello webmasters, I am a WordPress fan, and I'm now building a new site but I'm not sure which CMS can achieve what I'm trying to do. I am building a food blog network for a bunch of cities in the US, and I want my city pages to be independently running blogs themselves. So basically:
    Home Page - its own blog with its own users, talking about food in general
    Dallas Page (child of home page) - its own blog with its own users
    Chicago Page ... and so on and so forth.
    The web layout and design will all be the same, but I'm trying to run 25-50 independent blogs on one domain. How can I achieve this? I'm hoping that I don't have to install WordPress into every subdomain I create... Thank you for your help in advance. -RP

    Read the article

  • Bingbot requests from Google IP address

    - by JITHIN JOSE
    We have some suspicious requests to our server:
        74.125.186.46 - - [24/Aug/2014:23:24:11 -0500] "GET <url> HTTP/1.1" 200 16912 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
        74.125.187.193 - - [24/Aug/2014:23:24:12 -0500] "GET <url> HTTP/1.1" 200 20119 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
    As it shows, the user agent says it is bingbot, but whois data for the IP addresses (74.125.186.46 and 74.125.187.193) shows they belong to Google's servers. So is it Google, Bing, or some other content scraper?
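
    Both Google and Bing recommend verifying a claimed crawler with a reverse DNS lookup on the requesting IP followed by a forward lookup on the resulting hostname. A small PHP sketch of that check (the IP is just the one from the log above); genuine Googlebot resolves under googlebot.com or google.com, genuine bingbot under search.msn.com:

        <?php
        // Verify a claimed bingbot/googlebot by reverse DNS + forward DNS.
        $ip   = '74.125.186.46';
        $host = gethostbyaddr($ip);    // hostname the IP claims to be
        $back = gethostbyname($host);  // forward-confirm that hostname
        if ($back === $ip && preg_match('/\.(googlebot\.com|google\.com|search\.msn\.com)$/i', $host)) {
            echo "Verified crawler: $host\n";
        } else {
            echo "Not a verified crawler (resolves to $host)\n";
        }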

    Read the article

  • Google+1 button strategy - Combined +1s or separate +1s?

    - by nctrnl
    I have included the Google +1 button on my blog. Each post outputs a +1 button at the bottom. Depending on whether you are viewing the actual post or just the main page, the +1 button will "+1" either the post address or the blog's website address. This made me wonder whether the +1 button should be configured to +1 the blog section (www.example.org/blog), +1 the main website address (www.example.org), or +1 individual posts.

    Read the article

  • Traffic fall after a server problem

    - by Sébastien
    I have a website whose traffic I analyse with Google Analytics. Day after day the traffic (mainly from Google SE) increased, until I had a problem with my server. The server was offline for one day, and since then I no longer have as many users as before. Now it's as if the site is no longer referenced in Google's index (although when I type "site:mysite.com", I still get all the results). Do you know if this is normal behaviour, and whether the traffic will come back to what it was before (the server had its problems two days ago)?

    Read the article

  • Setup site folders on Apache and PHP [closed]

    - by Cobus Kruger
    I'm trying to set up my first Apache server on my Windows PC at home and I have real trouble finding out which configuration settings go where. I downloaded and installed XAMPP, which seemed to get everything nicely set up, and I can see a working website at http://localhost. So far so good. The point of this is to develop a website, of course, and to make my life easier (irony?) I wanted to point the web site root at my Eclipse project folder. So I opened httpd.conf, uncommented a VirtualHost block and changed its DocumentRoot to my local path. Now when I try to load http://localhost I get a 403 (Access denied) error. So where do I configure permissions for my folder? And is that all I need to let my site run from the specified folder, or am I going to have to clear another hurdle?
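
    Typically that 403 comes from Apache's per-directory access rules rather than Windows permissions: a DocumentRoot outside the default htdocs also needs a matching <Directory> block. A minimal sketch for httpd.conf is below; the path is just a placeholder for the Eclipse project folder, and the access directive depends on the Apache version XAMPP ships (Require all granted on 2.4, Order/Allow on 2.2):

        <VirtualHost *:80>
            # placeholder path: point this at the Eclipse project folder
            DocumentRoot "C:/workspace/mysite"
            <Directory "C:/workspace/mysite">
                Options Indexes FollowSymLinks
                AllowOverride All
                # Apache 2.4 syntax; on 2.2 use "Order allow,deny" plus "Allow from all"
                Require all granted
            </Directory>
        </VirtualHost>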

    Read the article

  • Is this a DNS or server-side error?

    - by joshlfisher
    I am having difficulty accessing a specific website (I get 500 server errors). I can access the site on my iPhone when NOT connected to WiFi. I CANNOT access the site when connected to WiFi or via an Ethernet connection to my home network. I thought it might be a DNS issue, so I copied the DNS servers from a friend who has a different ISP and has no problem accessing the site. No luck. I also tried some of the public DNS servers out there, again with no luck. Does anyone have any idea how to trace this issue?

    Read the article

  • Domain masking (and simple page links)

    - by Halik
    How do you set up the domain (I'm using GoDaddy) to mask the server URL but still append the sub-page link? I'm thinking of something like Wikipedia's en.wikipedia.org/wiki/something (or, if it would require httpd.conf access, setting it to append the default sub-page link, e.g. '?page_id=2'). Currently I can set up the domain either to be masked completely without showing any sub-page links, or to simply redirect my domain to my web server.

    Read the article

  • DNS MX and NS entries

    - by unkown
    I was wondering about my domain and whether the following is feasible. First of all, this is my "architecture": domain registration at GoDaddy.com, hosting at Dreamhost, mail at Google Apps. Until now I have set up the Google Apps MX entries for my domain through the GoDaddy manager, but now I want to set up the hosting I have hired from Dreamhost. I understand that all I have to do is set up the following Dreamhost NS entries in the GoDaddy domain manager:
        NS1.Dreamhost.COM. 66.33.206.206
        NS2.Dreamhost.COM. 208.96.10.221
        NS3.Dreamhost.COM. 66.33.216.216
    My question is: will my mail keep working, given that the MX entries I have already set up at GoDaddy are the Google Apps ones?
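
    One point worth keeping in mind: once the NS records point to Dreamhost, DNS answers (including MX) are served from Dreamhost's zone rather than GoDaddy's, so the Google Apps MX records need to exist wherever the zone is actually hosted. For reference, the classic Google Apps MX set looks like this (zone-file style; the exact entry form depends on the DNS panel):

        @   IN  MX   1  ASPMX.L.GOOGLE.COM.
        @   IN  MX   5  ALT1.ASPMX.L.GOOGLE.COM.
        @   IN  MX   5  ALT2.ASPMX.L.GOOGLE.COM.
        @   IN  MX  10  ALT3.ASPMX.L.GOOGLE.COM.
        @   IN  MX  10  ALT4.ASPMX.L.GOOGLE.COM.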

    Read the article

  • How do I make/set up a mailing list?

    - by acidzombie24
    I know I could program it myself, but maybe there is a simple (that's the keyword) package. I'd like to add a box, snippet, form or page to my site for users to enter their email address. Then I'd like it to automatically send an email saying you have successfully subscribed. I'd want an unsubscribe link, and I would be using SMTP on Google Apps. I would like to send HTML email so I can have a simple sentence with a complex link such as this. I don't know if I'll get marked as spam if I try to send everyone an email at once, so I'd like the software to manage that. Also, I'd like to get the emails in a simple format like CSV. Is there a simple app that does this? If not, I can write a simple form which does a simple insert (or delete when unsubscribing) and write a 200-300 line app to send my emails out for me.
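
    As a rough idea of how small the do-it-yourself route can be, here is a minimal PHP sketch of a subscribe handler that appends addresses to a CSV file and sends a confirmation. Everything in it (file name, field name, sender address) is made up for illustration, and a real setup would still need an unsubscribe link, double opt-in and proper SMTP delivery through the Google Apps account:

        <?php
        // subscribe.php (hypothetical): store the address and confirm by email.
        $email = isset($_POST['email']) ? trim($_POST['email']) : '';
        if (filter_var($email, FILTER_VALIDATE_EMAIL)) {
            // append to a CSV-style list, one address per line
            file_put_contents('subscribers.csv', $email . "\n", FILE_APPEND | LOCK_EX);
            // plain confirmation mail; HTML mail needs extra headers or a mail library
            $headers = "From: list@example.org\r\n";
            mail($email, 'Subscription confirmed', 'You have successfully subscribed.', $headers);
            echo 'Thanks, you are on the list.';
        } else {
            echo 'Please enter a valid email address.';
        }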

    Read the article

  • Guidance on building an au pair-to-family networking site.

    - by Philip Kidd
    I'm building a website for an au pair agency business that will connect au pairs to families around Europe. I know nothing about website building, HTML etc., so I'm using a WYSIWYG editor (Weebly). How I would like the site to function:
      - Families upload their information into profiles
      - Au pairs do the same
      - Families can view a limited part of an au pair's profile until they pay a deposit
      - After the deposit is paid, all of an au pair's profile information becomes open to families
      - Families can order au pairs and confirm their order with another payment; payment must be made before the 'order' is confirmed
    By 'order' I mean full communications become open between the family and the au pair they have 'ordered', as well as travel information being sent to another agency. The site needs to be linked with a bank account (e.g. PayPal) and with another agency, who will look after the flight bookings etc. A website already exists for this business, however it just contains information on the business and application forms; if the site becomes fully automated it will relieve a lot of strain on administration in the office (dealing with applications, travel information etc.).

    Read the article

  • Ajax does not send the data to my php file [migrated]

    - by Mert METIN
    I try to send my data to a PHP file but it does not work. This is my Ajax code:
        var artistIds = new Array();
        $(".p16 input:checked").each(function(){
            artistIds.push($(this).attr('id'));
        });
        $.post('/json/crewonly/deleteDataAjax2', { artistIds: artistIds }, function(response){
            if (response == 'ok')
                alert('dolu');
            else if (response == 'error')
                alert('bos');
        });
    and this is my PHP:
        public function deleteDataAjax2()
        {
            extract($_POST);
            if (isset($artistIds))
                $this->sendJSONResponse('ok');
            else
                $this->sendJSONResponse('error');
        }
    However, my artistIds on the PHP side is null. Why?
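
    One detail worth checking here (an assumption about the cause, not a confirmed diagnosis): jQuery omits parameters whose value is an empty array, so if the .p16 input:checked selector matches nothing, no artistIds field is posted at all and isset() fails on the PHP side. Reading the posted array explicitly, instead of relying on extract(), makes that case visible; a sketch:

        public function deleteDataAjax2()
        {
            // read the posted array explicitly instead of extract()
            $artistIds = isset($_POST['artistIds']) ? (array) $_POST['artistIds'] : array();
            if (count($artistIds) > 0)
                $this->sendJSONResponse('ok');
            else
                $this->sendJSONResponse('error');
        }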

    Read the article

  • .htaccess URL rewriting friendly URL with 2 parameters, the second parameter is optional

    - by letsworktogether
    I'm kind of stuck at this part and was hoping I'd get some assistance. I'm building a highscores page in PHP; that's going great, it works. However, I dislike the idea of index.php?skill=name and therefore wanted a bit of SEO in this. I have successfully replaced the URL with a more friendly version: highscores/skill/name. And this is where the problem starts: I have added pagination to the highscores, and the page is read from the HTTP GET page variable ($_GET['page']). I dislike the idea of highscores/skill/name&page=2 and was hoping you could assist me to make the URL like the following:
    Page 1, accessing the file without declaring the page number: DOMAIN.TLD/highscores/skill/name
    Page 2, where the page variable is needed: DOMAIN.TLD/highscores/skill/name/2
    As you can tell, the "2" defines page 2 and should load the correct data for that page. However, I'm having much trouble configuring my .htaccess file this way. This is my latest attempt to get it to work:
        # Skills page
        RewriteRule ^highscores\/skill\/(.*?)(\/(.*?)*)$ highscores/skills.php?skill=$1&page=$2 [L]
    Unfortunately it does not work: it makes the page look horrible (the CSS doesn't load) and it doesn't go to the page specified in the URL.
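
    A sketch of one way to express the optional page segment is to use two rules, the more specific one first (this assumes the script really is highscores/skills.php; the broken CSS is usually a separate issue, caused by relative asset paths breaking under the extra /name/2 segments and fixable with absolute paths or a <base href>):

        # page given explicitly: /highscores/skill/name/2
        RewriteRule ^highscores/skill/([^/]+)/([0-9]+)/?$ highscores/skills.php?skill=$1&page=$2 [L,QSA]
        # no page given: /highscores/skill/name defaults to page 1
        RewriteRule ^highscores/skill/([^/]+)/?$ highscores/skills.php?skill=$1&page=1 [L,QSA]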

    Read the article

  • EV SSL Certificates - does anyone care?

    - by pygorex1
    Is anyone aware of any data or studies from an impartial source that show the impact of EV SSL certificates on customer behavior? I've been unable to find any such studies. If an EV SSL certificate increases sales on a web storefront by even a few points, I can see the value. Aside from data targeted at EV SSL, it may be possible to guess at customer behavior based on user interaction with regular SSL certificates. Are users even aware of SSL security? Does regular SSL have any proven effect on web storefront sales? Note that I'm not asking about the necessity of good encryption; I'm asking about a potential customer's perception of security & trust.

    Read the article

  • % new visitor vs. % returning visitor

    - by Torben Gundtofte-Bruun
    I'm not sure how to interpret the results in Google Analytics. I understand that some metrics should be high, and some should be low. But this one I don't get: % new visitors vs. % returning visitors. It's good that users are returning, but surely it's also good to get new, fresh visitors. How do I evaluate this %-vs-% ratio?
    The higher the better: visits, unique visitors, pageviews, pages per visit, avg. visit duration.
    The lower the better: bounce rate, drop-offs.

    Read the article

  • Embarking on a website redevelopment and all developers are pushing to move to ASP.NET 4.0

    - by Sue
    Our company is going through a website redevelopment / retooling exercise and we are not quite sure which direction to take. We are told that the website was built in ASP classic and that we should be moving to ASP.NET 4.0. Some developers refuse to do any work in the ASP classic framework, citing the advantages of ASP.NET 4.0: stability, compilation, language support. We are generally happy with our website as it is. There are some kinks in the backend involving forms, and there is little integration between the CRM of the website and any content management system. Does the move from ASP classic to ASP.NET 4.0 give major advantages in the integration between how content is created and how it is delivered to our customers?

    Read the article

  • Difference between using third-party social buttons and directly integrating each social button ourselves

    - by Jayapal Chandran
    I wanted to add specific social buttons to my article, so I used ShareThis. It gives a Facebook Like button, a Google +1 button, etc. by default, whereas in other articles of different modules I had integrated the Facebook Like button myself by following the documentation (including markup in the head section). What is the difference between adding the markup manually and using third-party code? Will that affect SEO, or offer any other advantage on the respective social networking sites (here, for example, Facebook and Google+)?

    Read the article

  • Is it true that the Google spider gives the most relevance to the first 68 characters of the <title>?

    - by leeand00
    I am reading documentation about my CMS, and it states that an HTML page's <title> tag is really important for SEO. It states that the Google spider gives the most relevance to the first 68 characters of a page title (68 characters being the number of characters that Google will display in its search engine result pages). Can anyone verify this is still true? I read in The Information Diet that content farms were getting too good at gaming Google's algorithm for collecting and posting SERPs, and so Google had to change the search algorithm.

    Read the article

  • Stop bots from crawling old links with extensions

    - by Jared
    I've recently switched to MVC3, which uses extension-less URLs, but Google and Bing have a wealth of links that they are crawling which no longer exist. So I'm trying to find out if there is a way to format robots.txt (or use some other method) to tell Google/Bing that any link that ends in an extension isn't a valid link. Is this possible? On pages that I'm concerned a user may have saved as a favorite, I'm displaying a 404 page that lists the links to use once they are redirected to the new page (I decided not to just redirect them, as I don't want to maintain these forever). For Google's and Bing's sake I do have the canonical tag in the header.
        User-agent: *
        Allow: /
        Disallow: /*.*
    EDIT: I just added the 3rd line (in the text above) and it APPEARS to do what I want: allow a path, but disallow a file. Can anyone confirm this?

    Read the article

  • htaccess redirect problem

    - by jimbo
    Hi all, I am currently building a site and want to hide all development work on it. I am using an .htaccess file and redirecting to my holding page index.php?id=7:
        Options +FollowSymlinks
        RewriteCond %{REQUEST_URI} !/index.php
        RewriteCond %{REQUEST_URI} !/assets/
        RewriteRule $ /index.php?id=7 [R=307,L]
    This is working for pretty much all pages, but changing index.php?id=7 to another number, id=6 for example, still shows the page with no redirect. Any help welcome...
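
    A sketch of one possible change, under the assumption that the holding page is identified purely by its id=7 query string: the current second line exempts every index.php request whatever its id, so matching on QUERY_STRING instead would also catch index.php?id=6. Untested, so treat it as a starting point rather than a fix:

        Options +FollowSymlinks
        RewriteEngine On
        # let the holding page itself and static assets through
        RewriteCond %{QUERY_STRING} !(^|&)id=7(&|$)
        RewriteCond %{REQUEST_URI} !/assets/
        RewriteRule ^ /index.php?id=7 [R=307,L]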

    Read the article

  • Facebook like button for a Facebook feed item

    - by jverdeyen
    Here is what I do: when the admin of a website I'm creating adds a new 'news' item to his website, it directly posts the same item on the Facebook fan page of the website. This is done by a Facebook app with permission to post an item on the Facebook fan page. So far so good; I'm getting the new post id from the Facebook graph. Here is what I want to achieve: I'm adding a 'like' button on the original website, but when it is clicked it should reference the same post item on the Facebook page. So if someone likes the news post on the website, he should also like the referencing Facebook feed post, which is exactly the same. Can this be done? I've been playing around with the href etc. of the like button, without any success. When I hit the like button it likes the given URL, but it doesn't recognize that this is a feed post. Any idea? Thanks in advance.

    Read the article

  • Legal responsibility for embedding code

    - by Tom Gullen
    On our website we have an HTML5 arcade. Each game has an "embed this game on your website" copy + paste code box. We've made the game approval process as strict and safe as possible; we don't actually think it is possible for any malicious code to be in the games. However, we are aware that there are people out there smarter than us who might be able to exploit it. For webmasters wanting to copy + paste our games onto their websites, we want to warn them that they are doing so at their own risk. But could we be held responsible if, say, a malicious game were hosted on an important website and it stole their users' credentials and caused them damage? I'm wondering if having an HTML comment in the copy + paste code saying "Use at your own risk" is sufficient.

    Read the article
