Search Results

Search found 7 results on 1 page for 'matw'.

  • Nginx + php5-fpm = "File not found"

    - by MatW
    I've hit a wall whilst setting up a site using nginx / fpm. The page displays "File not found", and this appears in the nginx error.log:

        FastCGI sent in stderr: "Primary script unknown" while reading response header from upstream

    I'm new to both nginx and fpm, and that error message means nothing to me (even the google machine hasn't helped!). Can anyone shed any light onto what could be happening?
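
    For context, "Primary script unknown" is PHP-FPM's way of saying it was handed a SCRIPT_FILENAME that doesn't point at an existing .php file. A minimal sketch of a location block that sets it explicitly (the root path and FPM address are assumptions, not taken from the question):

        # Assumed site root and FPM address; adjust to the actual setup.
        root /var/www/example.com/public;

        location ~ \.php$ {
            include fastcgi_params;
            # Without an explicit SCRIPT_FILENAME, FPM often receives no
            # usable script path and logs "Primary script unknown".
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_pass 127.0.0.1:9000;
        }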

    Read the article

  • Will running CSF & Bastille cause any conflicts?

    - by MatW
    I'm taking my first steps into the world of unmanaged servers, and have confused myself whilst reading through the 101 tutorials on server hardening that Google spews out! The most recent advice I have been given is to install both CSF and Bastille on my server (used to serve a consumer-facing ecommerce site and act as the business' email server), but my understanding is that both of these tools are abstraction layers above netfilter / iptables. Will installing both packages cause any conflicts, or do they play well together?
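
    If it helps to frame the question: since both tools ultimately write netfilter rules, one way to see whether they are stepping on each other is to compare the live ruleset with what CSF reports as its own. A rough sketch (exact output and paths vary by distro):

        # Show every rule currently loaded into netfilter
        iptables -L -n -v

        # Show the rules CSF itself has generated
        csf -l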

    Read the article

  • Google Analytics - async tracking with two accounts

    - by MatW
    I'm currently testing GA's new async code snippet using two different tracking codes on the same page:

        _gaq.push(
            ['_setAccount', 'UA-XXXXXXXX-1'],
            ['_trackPageview'],
            ['b._setAccount', 'UA-XXXXXXXX-2'],
            ['b._trackPageview']
        );

    Although both codes work, I've noticed that they present inconsistent results. Now, we aren't talking huge differences here, only 1 or 2 visits / day every now and then. However, this site is tiny, and 1 or 2 visits equates to a 15% difference in figures. Now, the final site has much more traffic, but my concerns are: will this inconsistency scale with traffic? Assuming not, is a slight variation in recorded stats an accepted norm?
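
    For reference, the same two-tracker setup can also be written as separate pushes, which makes it easier to confirm that each tracker receives both its _setAccount and _trackPageview commands (the "b" tracker name and placeholder account IDs simply mirror the snippet above):

        var _gaq = _gaq || [];

        // Default (unnamed) tracker
        _gaq.push(['_setAccount', 'UA-XXXXXXXX-1']);
        _gaq.push(['_trackPageview']);

        // Second, named tracker ("b.") reporting to the other profile
        _gaq.push(['b._setAccount', 'UA-XXXXXXXX-2']);
        _gaq.push(['b._trackPageview']);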

    Read the article

  • Memcache localhost connection oddity

    - by MatW
    When I try to connect to memcache using this code:

        $memcache = new Memcache;
        $memcache->connect('localhost', 11211) or die("Could not connect");

    the call dies with the "Could not connect" error, but if I use localhost's IP:

        $memcache = new Memcache;
        $memcache->connect('127.0.0.1', 11211) or die("Could not connect");

    it works! So what's my problem? Well, this new computer is the only development environment I've set up that's been sensitive to that difference. I'm not about to go changing the settings in any code for what seems to be a computer-specific issue, but I can't figure out what could be causing this behaviour... Any ideas? I'm running XP, memcached 1.2.4, and wampserver 2. I've checked the hosts file and it does have an entry for localhost.
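
    One way to narrow it down is to check what 'localhost' actually resolves to on that machine and see which address the daemon answers on. A rough diagnostic sketch (the host names and port simply mirror the snippets above):

        <?php
        // See which address PHP resolves 'localhost' to on this box;
        // if it isn't 127.0.0.1, the hosts/DNS resolution is the suspect.
        var_dump(gethostbyname('localhost'));

        // Try both names and report which one memcached answers on.
        foreach (array('localhost', '127.0.0.1') as $host) {
            $memcache = new Memcache;
            if (@$memcache->connect($host, 11211)) {
                echo "$host: connected (memcached " . $memcache->getVersion() . ")\n";
                $memcache->close();
            } else {
                echo "$host: connection failed\n";
            }
        }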

    Read the article

  • Partial Git deployment strategy?

    - by MatW
    I need to set up a Kohana dev environment that allows me to make full use of shared module / system classes across separate applications, with each application typically belonging to a different client. I use Git for source control, but am struggling to come up with a clean deployment method that will allow me to pull only those parts of the dev environment specific to a client / app down into that client's production environment (assuming that the client's production environment will have Git installed).

    Dev environment:

        - kohana
            - applications
                - clientapp1
                - clientapp2
            - modules
            - public_html
                - clientapp1
                - clientapp2
            - system
                - 3.0.1
                - 3.0.5

    Client 1's production environment:

        - /
            - applications
                - clientapp1
            - modules
            - public_html
                - client_app1
            - system
                - 3.0.5

    Naturally, I want to have total control over each client "sub repo" as if it were an independent repo (in terms of gitignore, etc). I have seen topics that cover Git's sparse checkout feature, but it seems like it may cause a few problems down the line from a maintenance point of view, and I don't like the idea of the entire repo's metadata existing in the client's production environment repo. As you can probably tell, I'm not exactly a Git poweruser, so any suggestions / wisdom are very welcome!
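
    Since sparse checkout came up, a rough sketch of what that would look like on Client 1's production box, assuming the full dev repo is cloned and the working tree is then limited to that client's paths (the repository URL, target directory, and branch name are placeholders):

        # On the client's production server (git 1.7+)
        git clone --no-checkout <dev-repo-url> site
        cd site
        git config core.sparseCheckout true

        # Only these paths will appear in the working tree
        echo "applications/clientapp1/" >  .git/info/sparse-checkout
        echo "modules/"                 >> .git/info/sparse-checkout
        echo "public_html/clientapp1/"  >> .git/info/sparse-checkout
        echo "system/3.0.5/"            >> .git/info/sparse-checkout

        git checkout master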

    Read the article

  • Image upload storage strategies

    - by MatW
    When a user uploads an image to my site, the image goes through this process:

        - user uploads pic
        - store pic metadata in db, giving the image a unique id
        - async image processing (thumbnail creation, cropping, etc)
        - all images are stored in the same uploads folder

    So far the site is pretty small, and there are only ~200,000 images in the uploads directory. I realise I'm nowhere near the physical limit of files within a directory, but this approach clearly won't scale, so I was wondering if anyone had any advice on upload / storage strategies for handling large volumes of image uploads.
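
    One common strategy is to fan the files out into nested sub-directories derived from the image's unique id (or a hash of it), so that no single directory grows without bound. A rough sketch in PHP, assuming the numeric id assigned in the metadata step:

        <?php
        // Map an image id to a sharded path such as uploads/e1/0a/e10adc....jpg,
        // keeping each directory to a manageable number of files.
        function upload_path($image_id, $ext, $base = 'uploads')
        {
            $hash = md5($image_id);   // spreads ids evenly across shards
            $dir  = sprintf('%s/%s/%s', $base, substr($hash, 0, 2), substr($hash, 2, 2));

            if (!is_dir($dir)) {
                mkdir($dir, 0755, true);   // create the shard directories on demand
            }

            return sprintf('%s/%s.%s', $dir, $hash, $ext);
        }

        // e.g. upload_path(123456, 'jpg') => "uploads/e1/0a/e10adc3949ba59abbe56e057f20f883e.jpg"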

    Read the article

  • Trouble creating stored procedure

    - by MatW
    I'm messing around with stored procedures for the first time, but can't even create a simple select! I'm using phpMyAdmin, and this is my SQL:

        DELIMITER //
        CREATE PROCEDURE test_select()
        BEGIN
            SELECT * FROM products LIMIT 10;
        END //
        DELIMITER ;

    After submitting that, my localhost does some thinking for a loooong time and eventually loads a page with no content called /phpmyadmin/import.php. After reloading phpMyAdmin and trying to invoke the procedure:

        CALL test_select();

    I get a "PROCEDURE doesn't exist" error. Any ideas?
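
    The procedure body itself looks fine; DELIMITER is a client-side command, and phpMyAdmin handles it differently from the mysql command-line client (many versions expect it in a separate "Delimiter" field below the SQL box rather than inline). One way to check whether the CREATE actually went through is to run the block above from the mysql CLI, where DELIMITER works inline, and then verify:

        -- After running the CREATE PROCEDURE block from the mysql CLI,
        -- confirm it exists in the current database and call it:
        SHOW PROCEDURE STATUS WHERE Db = DATABASE();
        CALL test_select();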

    Read the article
