Search Results



  • Is there a way of using HTTPS with Amazon's CloudFront CDN and CNAMEs?

    - by Metalshark
    We use Amazon's CloudFront CDN with custom CNAMEs hanging under the main domain (static1.example.com). Although we can break this uniform appearance and use the original whatever123wigglyw00.cloudfront.net URLs to utilise HTTPS, is there another way? Does Amazon or any similar provider offer HTTPS CDN hosting? Is TLS, with its selective encryption via SNI (Server Name Indication), available for use anywhere? Footnote: I'm assuming the answer is no, but asking in the hope that someone knows otherwise. EDIT: Now using Google App Engine https://developers.google.com/appengine/docs/ssl for CDN hosting with SSL support.
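
    A rough way to test this is to offer the CNAME via SNI and see whether the edge presents a matching certificate. A minimal PHP probe (a sketch only; the stream options shown are the pre-PHP 5.6 names, and 'cafile' may be needed on older setups):

        <?php
        // Probe (assumptions marked): does the CDN edge serve a valid
        // certificate for our CNAME when we offer that name via SNI?
        $host = 'static1.example.com';                 // our CNAME
        $edge = 'whatever123wigglyw00.cloudfront.net'; // the CDN endpoint
        $ctx = stream_context_create(array('ssl' => array(
            'SNI_enabled'     => true,
            'SNI_server_name' => $host, // 'peer_name' on PHP 5.6+
            'verify_peer'     => true,  // may need 'cafile' => '/path/to/ca-bundle.crt'
            'CN_match'        => $host, // pre-5.6 hostname check
        )));
        $fp = @stream_socket_client("ssl://$edge:443", $errno, $errstr, 10,
            STREAM_CLIENT_CONNECT, $ctx);
        echo $fp ? "Edge presented a certificate valid for $host\n"
                 : "TLS handshake failed: $errstr\n";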

    Read the article

  • Open Source PHP based secure file download script?

    - by SiddharthP
    Basically I need a self-hosted solution where I, as the admin, can create client areas (which can be simple folders) where I upload files and secure them with a username/password. A client page will then be automatically generated, which the client can access with the username/password to download the files. It's a relatively simple script, but I'm having a hard time finding open-source solutions that accomplish what I need. Any help would be appreciated.
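
    For reference, the core of such a script is small enough to sketch here (all paths, client names and the password hash are placeholders; the files live outside the web root so they can only be fetched through the script):

        <?php
        // Minimal client-area download sketch: one folder per client,
        // one password per folder, files streamed only after login.
        session_start();

        $clients = array(
            // client area => crypt()/bcrypt hash generated offline (placeholder)
            'acme' => '$2y$10$examplehashexamplehashexampleha',
        );
        $area = basename(isset($_REQUEST['area']) ? $_REQUEST['area'] : '');
        $file = basename(isset($_REQUEST['file']) ? $_REQUEST['file'] : '');

        // Authenticate once per session; basename() above blocks ../ traversal.
        if (!isset($_SESSION['area']) || $_SESSION['area'] !== $area) {
            $pass = isset($_POST['pass']) ? $_POST['pass'] : '';
            if (!isset($clients[$area]) || crypt($pass, $clients[$area]) !== $clients[$area]) {
                header('HTTP/1.1 401 Unauthorized');
                exit('Login required');
            }
            $_SESSION['area'] = $area;
        }

        // Stream the requested file from outside the web root.
        $path = '/home/site/private/' . $area . '/' . $file;
        if (!is_file($path)) { header('HTTP/1.1 404 Not Found'); exit; }
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="' . $file . '"');
        header('Content-Length: ' . filesize($path));
        readfile($path);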

    Read the article

  • Which forum software has the most advanced community/GetSatisfaction type features?

    - by Gaia
    I need to assemble a GetSatisfaction/Lithium/Jive type support forum/community. The first is not available in the desired language, and the last two are priced for the enterprise market. I did research some other options (open source or SaaS), but they all seem to be either:
      - kind of dead (the open source options)
      - too focused on gathering ideas/feedback (UserVoice)
      - strictly support, without the community/voting features (Zendesk)
    I need an open forum (people-powered support/UGC with community/voting features), so I will have to do some of the work on my own. I want to piece things (plugins/mods/etc.) on top of a standard forum platform to give it the features I need. For this purpose, I want to use a mature product with a widespread user base, an active community and lots of plugin options. I believe most will agree that my options therefore are: vBulletin, phpBB and SMF. Here are the questions: Which of the three above offers the easier path towards the desired goal? Which of the three above has the most advanced features related to the desired goal? Of course I don't expect anyone to know these answers cut and dried. I am hoping to hear some experiences and see some examples. Also, it would be great if both questions had the same answer, but I am not going to get my hopes up... PS: I wish I could add the tags "phpbb" and "smf" ;)

    Read the article

  • Synchronise databases between servers via PHP [closed]

    - by Emmanuel
    Hi guys, I need to synchronise two MySQL databases between different servers on a regular basis, via a client-initiated interface. I've been doing it over a remote MySQL connection, adding the IPs of the servers to the whitelist for remote MySQL connections. The problem, however, is that the client has a dynamic IP, so as soon as it changes they can no longer sync. So I'm trying to find an alternative way of synchronising the two databases via some sort of secure PHP script.
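
    One workaround that sidesteps the IP whitelist entirely is to stop exposing MySQL and let the client sync over HTTPS instead. A sketch of the server side (the token, table name and updated_at column are assumptions for illustration):

        <?php
        // HTTPS sync endpoint sketch: the dynamic-IP client can call this
        // from anywhere; authentication is a shared token, not an IP list.
        $token = isset($_GET['token']) ? $_GET['token'] : '';
        if ($token !== 'long-random-shared-secret') { // agreed out of band
            header('HTTP/1.1 403 Forbidden');
            exit;
        }
        $since = isset($_GET['since']) ? $_GET['since'] : '1970-01-01 00:00:00';

        $db = new PDO('mysql:host=localhost;dbname=app', 'sync', 'password');
        $stmt = $db->prepare(
            'SELECT * FROM customers WHERE updated_at > ? ORDER BY updated_at');
        $stmt->execute(array($since));

        // The client stores the newest updated_at it has applied and
        // passes it back as ?since=... on the next run.
        header('Content-Type: application/json');
        echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));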

    Read the article

  • Will Tracking Subdomains as Single Entity with Google Analytics Help SEO? [closed]

    - by Sam Gridley
    Possible Duplicate: Does Google Analytics data affect SEO? We have two subdomains, one for our blog and one for our ecommerce store. The blog serves to bring traffic, and the store is how we monetize the site. They are designed to appear as one large site, but I know Google sees them as two sites. Here is how the subdomains look:
      www.example.com (store)
      blog.example.com (blog)
    I believe I can configure Analytics to use subdomain tracking as explained here: http://support.google.com/googleanalytics/bin/answer.py?hl=en&answer=55524 But my question is whether this will cause Google to see our two subdomains as one larger domain for SEO purposes. In other words, is there any relationship between how you configure Google Analytics and how Google indexes and ranks your website(s) and pages? Is there anything I need to do in Analytics or Webmaster Tools to make Google aware that these two subdomains work together as one website? Thanks! Sam

    Read the article

  • Why do some video posts from the same blog appear in Google with thumbnails, while others do not?

    - by jayarjo
    We own a media blog, which is basically a big collection of various videos streamed through our branded player. The interesting thing is that some of our posts show up in Google search results with a thumbnail denoting that the post in question is in fact a video, but more often they do not. We basically wonder why. What affects it, and can we control it somehow? All posts (their single pages) have Facebook og meta tags in place.
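
    Besides og tags, the documented way to hand Google a thumbnail explicitly is a video sitemap. A minimal entry sketch (all URLs are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
          <url>
            <loc>http://blog.example.com/some-video-post/</loc>
            <video:video>
              <video:thumbnail_loc>http://blog.example.com/thumbs/some-video.jpg</video:thumbnail_loc>
              <video:title>Some video post</video:title>
              <video:description>Short description of the clip.</video:description>
              <video:player_loc allow_embed="yes">http://blog.example.com/player?id=123</video:player_loc>
            </video:video>
          </url>
        </urlset>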

    Read the article

  • "X-Robots-Tag: noindex" on an HTTP 301 response

    - by Peter O.
    I understand that a resource with X-Robots-Tag: noindex forces some search engines, including Google, not to index the resource further. I also understand that an HTTP 301 response causes search engines to use the redirected URL instead of the original URL to refer to the resource. But what happens if both "X-Robots-Tag: noindex" and status code 301 occur on the same response? It's likely that the original URL will no longer be indexed, but will that cause the redirected URL to no longer be indexed too? This possibility is not mentioned in the X-Robots-Tag specification.
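
    For concreteness, such a response is easy to produce; whether the noindex is honoured on the redirect target is exactly the open question. A PHP sketch:

        <?php
        // Emit a permanent redirect and a noindex hint on the same response.
        header('X-Robots-Tag: noindex');
        header('Location: http://example.com/new-url', true, 301);
        exit;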

    Read the article

  • SEO effects of intermix of WP blog, custom PHP site and FB app game

    - by melbournetechlover
    We're a Melbourne tech company in the process of building a custom site in PHP. We plan to launch a "pre-launch" page which is also custom coded (CSS3 on the Twitter Bootstrap framework + HTML5 front end and PHP back end). On that site will be a link to a blog; the idea behind this is to build up ranking for a variety of relevant keywords before the full site goes live (the majority of the site is a members-only community anyway, so the blog is really the main way we'll be able to execute on-site SEO). Ideally, we would like to install WordPress in a subdirectory on our servers and just customise the header to look the same as the landing page of the website. But some questions and concerns... Is there any detrimental effect on SEO efforts in having two separate systems (one custom PHP, the other an installation of WordPress) to manage the blog vs the rest of the site? Are there any benefits or detriments to installing on a subdomain such as blog.sitename.com vs. sitename.com/blog? My preference would be sitename.com/blog as it feels neater, but I'm open to suggestions based on knowledge of Google's preferences. Separately, we are building a Facebook app which is under another site name. Again, because we are launching this app first, from an SEO perspective would it actually be better to run it from a subdomain on the main site, e.g. gamename.mainsitename.com, instead of on app.gamename.com? Currently we have it on app.gamename.com, but if there are SEO benefits to moving it to the other domain and server then we'll do it. Basically we don't want our SEO efforts divided: will Google's algorithms prefer two sites heavily referring traffic to each other, or is it better to focus our efforts on one? I guess that's the crux of the issue. But the other one is: does Google care about traffic accessing a page built for the Facebook app iframe, and does that count toward rankings? Sorry, I hope these questions aren't too complex, but we're in the tech world every day and still can't seem to find a good answer to these ones... hence I'm taking to the forums!! Free beer for whoever can give me a solid answer!

    Read the article

  • Do PHP-FPM (and other PHP handlers) need execute permissions on the PHP files they're serving?

    - by Andrew Cheong
    I read in a post at Server Fault that PHP-FPM needs execute permissions. However, the answer in When creating a website, what permissions and directory structure? only grants read and write permissions to PHP-FPM. Maybe I don't quite understand how PHP handlers (or CGI in general) work, but the two claims seem contradictory to me. As I understand, when Apache / Nginx gets a request for foobar.php, it "passes" the file to an appropriate handler. That is, I imagine it's as if www-root (or apache or whomever the webserver's running as) were to run some command, /usr/sbin/php-fpm foobar.php Actually, no, that's naive, I just realized. PHP-FPM must be a running instance (if it's to be performant, and cache, etc.), so probably PHP-FPM is just being told, "Hey, quick, process this file for me!" In either case, I don't see why execute permissions are necessary. It's not like the webserver needs to literally execute the file, i.e. ./foobar.php Is the Server Fault answer simply mistaken?
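
    The usual reading is that FPM needs read permission on the .php file plus execute (traverse) permission on every directory above it; the execute bit on the file itself is irrelevant, because FPM reads and interprets the source rather than exec()ing it. One quick way to audit a path (namei ships with util-linux; the path is a placeholder):

        # Show owner/group/mode for each component of the path: the FPM
        # worker user needs r on the file and x on every directory above it.
        namei -l /var/www/example.com/public/foobar.php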

    Read the article

  • Google still has record of my old site URL - what to do?

    - by Mayeenul Islam
    I had a blog site, i.e. http://example2.com; then I bought a new domain, i.e. http://example.com, and 301 (permanent) redirected example2.com to example.com. But in Google Webmaster Tools, when I get a 404 and click into the link and look at the "Linked from" tab, it shows links like:
      http://example.com/post-1
      http://example2.com/feed
      http://example2.com/post-1
    According to Google, if you change your domain, you should keep the redirect in place for at least 4-6 months, and that period has almost passed. So why does Google still have traces of my old site? The issue is important, because I don't want to pay for the old domain any more. I tried deleting my existing sitemap.xml and recreating it from the new site, but such links are still stored. What could I do?
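
    Keeping the old domain redirecting for as long as it remains registered is usually done in one place. An Apache sketch for the old domain's docroot (assuming mod_rewrite is available; the Change of Address feature in Webmaster Tools complements this):

        # .htaccess on example2.com: permanently redirect every path
        # to the same path on the new domain.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?example2\.com$ [NC]
        RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]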

    Read the article

  • How can a domain use its own nameservers?

    - by Thomas Clayson
    I have to change the MX DNS records for our company domain name, and I've come across this odd situation: a whois search shows that the nameservers for ourcompany.com are ns1.ourcompany.com and ns2.ourcompany.com. In the DNS settings at the registrar there are no A/CNAME records at all. However, the nameservers are defined in the DNS settings for the domain on our dedicated server. (Registrar and host are two different companies.) Using the DNS lookup on http://www.mxtoolbox.com/ shows that ns2.ourcompany.com is reporting the correct IP for our dedicated server. It's all very odd... the DNS on the dedicated server doesn't seem to have much effect, yet it's odd that the DNS at the registrar's end doesn't have any records. Thanks for your help.
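
    What resolves this chicken-and-egg is glue: the registrar publishes the nameservers' IP addresses (glue records) in the parent .com zone, so resolvers can reach ns1/ns2 without first resolving the domain. You can see the glue directly with dig:

        # Ask a .com registry server for the delegation; the ADDITIONAL
        # section carries the glue A records for ns1/ns2.ourcompany.com.
        dig @a.gtld-servers.net ourcompany.com NS +norecurse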

    Read the article

  • Deny access to a folder on hosting server but serve the pages

    - by Sourav
    My hosting server allows me to host multiple websites. The directory structure is like this:

        root
        |_ www.a.com
        |_ www.b.com
        |_ www.c.com
        |_ www.d.com

    I want to put some PHP files in the www.d.com folder so that anyone browsing the site in a web browser can use them, but no one can get at their source code [even by logging in to the root folder]. Is there any way of doing so? There is a feature called "Password protect folder" or similar; can it help in this case?
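
    Note that two concerns are mixed here: a web browser can never fetch PHP source from a correctly configured server, since it only receives the script's output. If the goal is also to block direct URL access to raw include files while index.php keeps serving pages, a per-folder rule does that. An Apache 2.4 sketch (on 2.2 the equivalent is "Order allow,deny" plus "Deny from all"):

        # www.d.com/includes/.htaccess: PHP scripts can still include()
        # these files, but no URL will ever serve them directly.
        Require all denied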

    Read the article

  • (not provided) count has reached 80%! But what about the remaining 20%?

    - by Rajesh Magar
    I am up to date on all the (not provided) reasons, as Google has encrypted all of its searches, but here is a little question banging again and again in my head: if all search results are encrypted with the HTTPS protocol, then how is Google Analytics still able to track some (20%) of organic keyword details? I mean, there are still some keywords appearing in my organic keywords section. So how does Google Analytics track, or bypass, that HTTPS thing? Thanks in advance!

    Read the article

  • Does a system exist to facilitate virtual meetings and file sharing?

    - by CSharp Mania
    I'm looking for a system that is similar to an online classroom setup but allows for virtual meeting rooms with video/audio conferencing and, of course, file sharing. I'd prefer an open source solution that I can edit/tweak myself as needed, and that is of course free. Ultimately, I guess what I'm looking for is something that we could tweak to give our own "branded" look and feel, if possible, along with full integration within our own servers. That's the reason I brought up open source solutions. Do you masters of the web know of such a system? If so, do you have a preferred one that you would suggest? Or can such a system be put together from a couple of open source projects to arrive at what is desired? Thanks for sharing your expertise. (FYI: I am a developer who is comfortable with PHP and C#. I'm not experienced with Ruby or Python, but a system using them, or something else, is acceptable. We can figure it out, I'm sure.)

    Read the article

  • How can I avoid a 302 for Fetch as Bot?

    - by CookieMonster
    I originally posted this on Stack Overflow, but I believe here is a better place to ask. My web application is very similar to notepad.cc, which redirects to a randomly generated URL upon access, e.g. http://myapp.com/roTr94h4Gd. (Please note that notepad.cc is not my site.) Probably because of this redirect feature, when I do "Fetch as Google" or "Fetch as Bingbot", I get a 302 and no HTML content, not even an <html></html> tag:

        HTTP/1.1 302 Moved Temporarily
        Server: nginx/1.4.1
        Date: Tue, 01 Oct 2013 04:37:37 GMT
        Content-Type: text/html
        Transfer-Encoding: chunked
        Connection: keep-alive
        X-Powered-By: PHP/5.4.17-1~dotdeb.1
        Set-Cookie: PHPSESSID=vp99q5e5t5810e3bnnnvi6sfo2; expires=Thu, 03-Oct-2013 04:37:37 GMT; path=/
        Expires: Thu, 19 Nov 1981 08:52:00 GMT
        Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
        Pragma: no-cache
        Location: /roTr94h4Gd

    How should I avoid the 302 in this case? I suppose I could modify my site to prevent the redirect, but generating a random URL on each access is a necessary feature of my web app. I added a <meta name="fragment" content="!"> tag to my index page and set it to return a static snapshot of the page when the flag is set, but this still returns a 302. I also added a header to return 200 before redirecting, but this had no effect either. Could someone give me a good suggestion to solve this problem?
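
    Since the AJAX-crawling meta tag is already in place, one sketch (a hypothetical front controller; the snapshot path is an assumption) is to intercept Google's _escaped_fragment_ request and answer it with a 200 snapshot before the redirect logic runs:

        <?php
        // Crawlers that honour <meta name="fragment" content="!"> re-request
        // the page as /?_escaped_fragment_=...; serve those a 200 snapshot.
        if (isset($_GET['_escaped_fragment_'])) {
            http_response_code(200);
            readfile(__DIR__ . '/snapshots/index.html'); // pre-rendered (assumption)
            exit;
        }

        // Normal visitors keep the existing behaviour: a fresh random URL.
        $id = substr(str_shuffle('abcdefghjkmnpqrstuvwxyz23456789'), 0, 10);
        header('Location: /' . $id, true, 302);
        exit;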

    Read the article

  • My blog which gets 300+ daily impressions has stopped appearing on the 1st page of Google

    - by Sangram
    I have had a blog about placement papers since December 2010. My monthly impressions are around 4,000. For the last 2 days, my blog has disappeared from Google search engine result pages and impressions have dropped drastically. Please check the stat reports. My blog is still in the search engine's index, because when I search site:mydomain.com on Google I can see all my pages indexed there, but the pages that used to appear on the first or second results page of Google no longer appear at all. Example: if I search with the query "GE round 2 code writing test" on bing.com or Yahoo search, the first link on the result page is my blog, but if you do the same on Google, my URLs do not appear even in the first 3 result pages. I used to get lots of visitors from these search queries.

    Read the article

  • Track sales and commission with third-party tool

    - by Andrew
    I have a clothing website where I link to various clothing retailers. I have reached an agreement with one of the retailers whereby they will pay a commission to us for every sale they make from traffic that was referred by our site. I need a mechanism for tracking how much commission should be paid to us, that involves as little work as possible to implement from their side. We both have Google Analytics. Option 1: They record a goal in their GA account whenever someone makes a purchase on their site. They see how many completed goals are marked as referral traffic from our site and calculate commission accordingly. The problem with this is that the whole process of calculating and paying commission will be manual. They will need to frequently check how many sales were generated by referral traffic from our site, and probably we will have to chase them for commission payments. Also - since we won't have access to their GA data - we will need to trust that they report all sales accurately. Option 2: Sign them up to an affiliate network like Commission Junction or Google's Affiliate Network, and connect to them through this network. The problem with this solution is that it seems too heavyweight; ideally we don't want to ask a retailer to go through the whole sign up process just to deal with us and pay us commission. I am assuming that there must be some lightweight service that tracks the number of sales by one site and pays commission accordingly to the other site, where the sign up and installation procedure is simple and fast.
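
    One lightweight pattern worth sketching (every name here is hypothetical, not an existing service) is a server-to-server postback: the retailer calls an endpoint on our site with the order value and an HMAC signature whenever a referred sale completes, so commission is tallied automatically on our side:

        <?php
        // Hypothetical commission postback endpoint, e.g.
        //   /commission.php?order=123&total=59.90&sig=<hmac>
        $secret = 'shared-secret-agreed-offline'; // assumption: exchanged out of band
        $order  = isset($_GET['order']) ? $_GET['order'] : '';
        $total  = isset($_GET['total']) ? (float) $_GET['total'] : 0.0;
        $sig    = isset($_GET['sig'])   ? $_GET['sig'] : '';

        // Only the retailer (who knows the secret) can report sales.
        // On PHP 5.6+ prefer hash_equals() for a constant-time compare.
        if (hash_hmac('sha256', $order . '|' . $total, $secret) !== $sig) {
            header('HTTP/1.1 403 Forbidden');
            exit;
        }
        $db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
        $stmt = $db->prepare(
            'INSERT INTO sales (order_ref, total, commission) VALUES (?, ?, ?)');
        $stmt->execute(array($order, $total, $total * 0.10)); // assumed 10% rate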

    Read the article

  • Is it time to add IPv6 access to my websites?

    - by Rob Hoare
    I have several dedicated servers and VPS servers, and some of those are at companies that have provided me with native IPv6 blocks (in addition to the IPv4 IP addresses). Does it currently make sense to point an AAAA record to an IPv6 address on my server, in addition to the A record pointing to the IPv4 address? This would be for (for example) the www subdomain. (the networking and web server software would be set up on the server to respond appropriately). A while ago I read that a small percentage of users (1 in a thousand?) would have slow or no access if a subdomain had both A and AAAA records because their networking software asked for one and got the other. Is that still the case, will adding an AAAA record inconvenience some users, or is the percentage already smaller and falling? In other words, is now the time to get around to adding native IPv6 support for a busy website aimed at the general public, or is it still too early?
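
    Serving both records is just a matter of publishing them side by side; a zone-file sketch using the documentation prefixes from RFC 5737 and RFC 3849:

        ; Dual-stack record set: substitute your real addresses.
        www  IN  A     192.0.2.10
        www  IN  AAAA  2001:db8::10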

    Read the article

  • Digital "Post It" notes for organizing content of sites/pages

    - by Alex
    We're restructuring our old intranet into a new one and are going through each site to find content and use our new standard structure/look-and-feel. Do you recommend a tool where you can do "digital Post-It" notes? It would provide a way to type some items on a "card" and be able to move it around and organize it quickly. Also, if you know of tools in general for this kind of task, please advise. Thank you.

    Read the article

  • Reach Local Proxy Page - Duplicate content?

    - by Simon Bennett
    We have a client who has instructed Reach Local to manage their paid SEO work etc. RL have created a proxy version of the page at http://example-px.rtrk.co.uk which mirrors the existing site completely. Would I be correct in assuming that this would count as duplicate content and one or both of the sites would be penalized because of this? And would the addition of a rel="canonical" meta-tag on the proxy site assist with this? Many thanks in advance.
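
    A canonical reference should indeed help; one nitpick is that it is a <link> element in the <head> (or a Link: HTTP header) rather than a meta tag. On every proxied page it would point back at the primary URL, e.g. (URL is a placeholder):

        <!-- In the <head> of each page served from example-px.rtrk.co.uk -->
        <link rel="canonical" href="http://www.example.com/same-page/" />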

    Read the article

  • I'm getting a 403 Forbidden error on my website

    - by user1230090
    I was accessing the directories through Cyberduck and also trying to upload files, but then the site started showing this forbidden error. I was getting the homepage at first; now I don't get that either. Can anyone please tell me how I can get my website back? The error log shows:

        [Fri Mar 02 14:36:21 2012] [error] File does not exist: /var/www/vhosts/example.com/httpdocs/bin
        [Fri Mar 02 14:37:24 2012] [error] File does not exist: /var/www/vhosts/example.com/httpdocs/httpsdocs
        [Fri Mar 02 14:39:01 2012] [error] (13)Permission denied: file permissions deny server access: /var/www/vhosts/example.com/httpdocs/index.html
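
    The first two log lines are requests for files that don't exist, but the third is a plain permissions problem, which an FTP/SFTP client can cause by uploading files with overly restrictive modes. A common fix sketch (paths follow the Plesk-style layout in the log; run with appropriate privileges and adjust ownership to your vhost user):

        # Make directories traversable and files readable by the web server.
        find /var/www/vhosts/example.com/httpdocs -type d -exec chmod 755 {} \;
        find /var/www/vhosts/example.com/httpdocs -type f -exec chmod 644 {} \;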

    Read the article

  • Why is Google still not indexing my #! website?

    - by Zubair
    I have been working on a website which uses #! (2minutecv.com), but even after 6 weeks of the site being up and running and conforming to the Google hash-bang guidelines stated here, you can still see that Google hasn't indexed the site yet. For example, if you use Google to search for "2MinuteCV.com benefits", it does not find this page, which is linked from the homepage. Can anyone tell me why Google isn't indexing this website? Update: Thanks for all the help with this answer. Just to make sure I understand what is wrong: according to the answers, Google never actually indexes the pages after the JavaScript has run. I need to create a "shadow site" which Google indexes (which Google calls HTML snapshots). If I am right in thinking this, then I can pick a winner for the bounty.

    Read the article

  • How do Expires headers and cache manifest rules work together?

    - by Robert K
    I find the W3C's official Offline Web Applications specification to be rather vague about how the cache manifest interacts with headers such as ETag, Expires, or Pragma on cached assets. I know that the manifest should be checked with each request so that the browser knows when to check the other assets for updates. But because the specification doesn't define how the cache manifest interacts with normal cache instructions, I can't predict precisely how the browser will react. Will assets with a future expiration date be refreshed (no matter the cache headers) when the cache manifest is updated? Or, will those assets obey the normal caching rules? Which caching mechanism, HTTP cache versus cache manifest, will take precedence, and when?
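
    For reference, a minimal manifest sketch. The behaviour generally observed (hedging here, since the spec leaves the interaction underspecified) is that once an asset is in the application cache it is refetched only when the manifest file itself changes, and the normal HTTP caching headers then govern that refetch, so a far-future Expires can cause a stale copy to be served from the HTTP cache even after a manifest update:

        CACHE MANIFEST
        # v2013-10-01 - edit this comment to force the manifest (and thus
        # the assets below) to be re-checked.
        CACHE:
        /css/styles.css
        /js/app.js
        NETWORK:
        *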

    Read the article

  • joomla subscription ecommerce solution

    - by CQM
    I am using Joomla and want to implement a subscription-based ecommerce solution. I started using VirtueMart and noticed its shortcomings regarding subscription-based recurring items. It turns out VirtueMart can take its own plugins, and I found something called SimSu, which I don't want to pay for. I feel like I have brought a nuke to a knife fight, as I only need one or two subscription-based products on this site, with recurring payments. Is anyone familiar with a solution for this issue?

    Read the article

  • Panda 4: Reducing #indexed pages. How much is enough?

    - by Noam
    I've been hit by Panda 4 (a 40% decrease). I didn't see any change during Pandas 1-3. From what I've read, and comparing it to my site, the change is probably due to the fact that I have over 30M pages indexed on Google, and they've started seeing that as some sort of bad signal. Although I feel all of the pages have a unique value that Google should crawl, it seems I should make some tough calls and reduce the indexed pages according to some prioritization I will conduct. The question is what my target should be, or what factors should help me figure out a relevant target. How many pages should I try to reduce to?
      - 25M
      - 15M
      - 1M
      - 2000
    Is it enough to add noindex to low-priority pages, or should I also remove all internal linking to them?

    Read the article
