Search Results

Search found 136 results on 6 pages for 'martijn courteaux'.

Page 2/6 | < Previous Page | 1 2 3 4 5 6  | Next Page >

  • Set default MySQL connect charset for PHP (in RHEL)?

    - by Martijn Heemels
    We're running a hundred or so legacy PHP websites on an older server which runs Gentoo Linux. When these sites were built latin1 was still the common charset, both in PHP and MySQL. To make sure those older sites used latin1 by default, while still allowing newer sites to use utf8 (our current standard), we set the default connect charset in php.ini: mysql.connect_charset = latin1 mysqli.connect_charset = latin1 pdo_mysql.connect_charset = latin1 Specific more modern sites could override this in their bootstrapping code with: <?php mysql_set_charset("utf8", $dsn ); ...and all was well. Now the server is overloaded and we're no longer with that hoster, so we're moving all these sites to a faster server at our standard hoster, which uses RHEL 5 as their OS of choice. In setting up this new server I discover to my surprise that the *.connect_charset directives are a Gentoo specific patch to PHP, and RHEL's version of PHP doesn't recognize them! Now how do I set PHP to connect to MySQL with the latin1 charset? I thought about setting a default in my.cnf but would prefer not to force every app and client to default to latin1. Our policy is to use utf8, and we'd like to restrict the exception to PHP only. Also, converting every legacy site to properly use utf8 is not doable since many are of the touch 'm and you break 'm kind. We simply don't have the time to go fix them all. How would I set a default mysql/mysqli/pdo_mysql connection charset to latin1 for PHP, while still allowing individual scripts to override this to utf8 with mysql_set_charset()?
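
    A sketch of the per-site override mentioned above, spelled out for all three extensions (hostnames and credentials are placeholders; the charset option in the PDO DSN is only honoured by reasonably recent PHP versions):

        <?php
        // Per-site override to utf8; connection details below are made up.

        // mysqli
        $mysqli = new mysqli('localhost', 'user', 'secret', 'mydb');
        $mysqli->set_charset('utf8');

        // PDO: newer PHP honours a charset option in the DSN
        $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'secret');

        // legacy mysql extension
        $link = mysql_connect('localhost', 'user', 'secret');
        mysql_set_charset('utf8', $link);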

    Read the article

  • Cisco ASA and static IPv6 tunnel endpoint?

    - by Martijn Heemels
    I recently installed a Cisco ASA 5505 firewall on the edge of our LAN. The setup is simple: Internet <-- ASA <-- LAN I would like to provide the hosts in the LAN with IPv6 connectivity by setting up a 6in4 tunnel to SixXS. It would be nice to have the ASA as tunnel endpoint so it can firewall both IPv4 and IPv6 traffic. Unfortunately the ASA apparently can't create a tunnel itself, and can't port-forward protocol 41 traffic, so I believe I would have to do one of the following instead: Set up a host with its own IP outside the firewall, and have that function as tunnel endpoint. The ASA can then firewall and route the v6 subnet to the LAN. Set up a host inside the firewall that functions as endpoint, separated via vlan or whatever, and loop the traffic back into the ASA where it can be firewalled and routed. This seems contrived, but would allow me to use a VM instead of a physical machine as endpoint. Any other way? What would you suggest is the optimal way to set this up? P.S. I do have a spare public IP address available if needed, and can spin up another VM in our VMware infrastructure.

    Read the article

  • Cisco ASA5505 won't sync with NTP

    - by Martijn Heemels
    Today I noticed that the clock on my Cisco ASA 5505 firewall was running about 15 minutes late, which surprised me since I've set up the NTP client. My two NTP servers 10.10.0.1 and 10.10.0.2 are virtualized Windows Server 2008 R2 domain controllers, and both have the correct time. As shown below, the ASA knows about the two servers, can ping them and seems to poll them periodically, so I suppose it can reach them both. The ASA claims its time source is NTP, but the clock is unsynchronized. Neither host is marked as synced. Result of the command: "ping 10.10.0.1" Type escape sequence to abort. Sending 5, 100-byte ICMP Echos to 10.10.0.1, timeout is 2 seconds: !!!!! Success rate is 100 percent (5/5), round-trip min/avg/max = 1/1/1 ms Result of the command: "sh ntp ass" address ref clock st when poll reach delay offset disp ~10.10.0.1 .LOCL. 1 78 1024 377 0.5 643.69 17.0 ~10.10.0.2 10.10.0.1 2 190 1024 377 0.9 655.91 58.4 * master (synced), # master (unsynced), + selected, - candidate, ~ configured Result of the command: "sh ntp stat" Clock is unsynchronized, stratum 16, no reference clock nominal freq is 99.9984 Hz, actual freq is 99.9984 Hz, precision is 2**6 reference time is 00000000.00000000 (07:28:16.000 CEST Thu Feb 7 2036) clock offset is 0.0000 msec, root delay is 0.00 msec root dispersion is 0.00 msec, peer dispersion is 0.00 msec Result of the command: "sh clock detail" 10:33:23.769 CEDT Tue Jun 26 2012 Time source is NTP UTC time is: 08:33:23 UTC Tue Jun 26 2012 Summer time starts 02:00:00 CEST Sun Mar 25 2012 Summer time ends 03:00:00 CEDT Sun Oct 28 2012 I've tried the basic steps of manually setting the time and removing and adding the timeservers, to no avail. My ASA's ntp config is simply: ntp server 10.10.0.1 ntp server 10.10.0.2 Do I need to enable authentication to use a Windows NTP server? Any thoughts?

    Read the article

  • Web-based disk space visualizer

    - by Martijn
    I have a number of Linux webservers for which I'd like to track where disk space is going and keep disk space to a minimum. Typically I log in over SSH and use du to find out where disk space is wasted, but this is cumbersome and slow. A visualisation tool like KDirStat would be ideal, but it requires installing an X server at the very least, which kind of defeats the purpose. Is there any web-based disk space visualizer? I'm open to alternative solutions.
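
    Not web-based, but while searching for one, a du one-liner over SSH can at least make the manual check quicker. A sketch (path and depth are placeholders; sort -h needs a fairly recent GNU coreutils):

        du -xh --max-depth=2 /var/www | sort -rh | head -n 25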

    Read the article

  • ServerName wildcards in Apache name-based virtual hosts?

    - by Martijn Heemels
    On our LAN I've set up several 'fake' TLDs in the DNS server, with the intention of using them for Apache name-based virtual hosting. I'd like to combine this with mass-virtual-hosting (i.e. VirtualDocumentRoot) on an Ubuntu 10.04 LAMP server. However, I can't get it to select the right vhost! Here is a summary of the Apache config: NameVirtualHost 10.10.0.205 <VirtualHost 10.10.0.205> ServerName *.test VirtualDocumentRoot /var/www/%-3.0.%-2/test/%1/ CustomLog /var/log/apache2/access.log vhost_combined </VirtualHost> <VirtualHost 10.10.0.205> ServerName *.dev VirtualDocumentRoot /var/www/%-3.0.%-2/dev/%1/ CustomLog /var/log/apache2/access.log vhost_combined </VirtualHost> A hostname such as www.domain.com.dev correctly resolves to 10.10.0.205, but always selects the top vhost, instead of the bottom one, which matches more closely. I was under the impression that Apache would first try to match the ServerName before defaulting to the top vhost for a given IP. What am I doing wrong? Or is this not possible and must I use another IP for each TLD? apachectl -S outputs (trimmed): 10.10.0.205:* is a NameVirtualHost default server *.test port * namevhost *.test port * namevhost *.dev
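
    One detail worth noting: Apache only matches wildcards in ServerAlias; ServerName is treated as a literal hostname, which would explain the apachectl -S output above. A sketch of the config rewritten that way (the literal ServerName values are made up):

        NameVirtualHost 10.10.0.205

        <VirtualHost 10.10.0.205>
            ServerName catchall.test
            ServerAlias *.test
            VirtualDocumentRoot /var/www/%-3.0.%-2/test/%1/
            CustomLog /var/log/apache2/access.log vhost_combined
        </VirtualHost>

        <VirtualHost 10.10.0.205>
            ServerName catchall.dev
            ServerAlias *.dev
            VirtualDocumentRoot /var/www/%-3.0.%-2/dev/%1/
            CustomLog /var/log/apache2/access.log vhost_combined
        </VirtualHost>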

    Read the article

  • Redirect 301 fails with a path as destination

    - by Martijn Heemels
    I'm using a large number of Redirect 301s which are suddenly failing on a new webserver. We're in pre-production tests on the new webserver, prior to migrating the sites, but some sites are failing with 500 Internal Server Error. The content, both databases and files, is mirrored from the old to the new server, so we can test if all sites work properly. I traced this problem to mod_alias' Redirect statement, which is used from .htaccess to redirect visitors and search engines from old content to new pages. Apparently the Apache server requires the destination to be a full URL, including protocol and hostname. Redirect 301 /directory/ /target/ # Not Valid Redirect 301 /main.html / # Not Valid Redirect 301 /directory/ http://www.example.com/target/ # Valid Redirect 301 /main.html http://www.example.com/ # Valid This contradicts the Apache documentation for Apache 2.2, which states: The new URL should be an absolute URL beginning with a scheme and hostname, but a URL-path beginning with a slash may also be used, in which case the scheme and hostname of the current server will be added. Of course I verified that we're using Apache 2.2 on both the old and the new server. The old server is a Gentoo box with Apache 2.2.11, while the new one is a RHEL 5 box with Apache 2.2.3. The workaround would be to change all paths to full URLs, or to convert the statements to mod_rewrite rules, but I'd prefer the documented behaviour. What are your experiences?
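
    For reference, a sketch of the mod_rewrite workaround mentioned above, using the same placeholder paths as the question (.htaccess context, so the patterns have no leading slash):

        RewriteEngine On
        RewriteRule ^directory/(.*)$ /target/$1 [R=301,L]
        RewriteRule ^main\.html$ / [R=301,L]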

    Read the article

  • What does ldapsearch response mean?

    - by Martijn Burger
    I created an LDAP directory with a number of users and groups. When I query this directory from a remote server with: ldapsearch -H ldap://ldap.myserver.net/ -x -vvvvvvv -b dc=myserver,dc=net -D cn=admin,dc=myserver,dc=net -W I get all objects in the directory returned. The result finishes with the following: # search result search: 2 result: 0 Success # numResponses: 85 # numEntries: 84 What do these numbers mean exactly?

    Read the article

  • How to filter Varnish logs based on XID?

    - by Martijn Heemels
    I'm running into infrequent 503 errors which appear hard to pinpoint. Varnishlog is driving me mad, since I can't seem to get the information I want out of it. I'd like to see both the client- and backend-communications as seen by Varnish. I thought the XID number, which is logged on Varnish's default error page, would allow me to filter the exact request out of the logging buffer. However, no combination of varnishlog parameters gives me the output I need. The following only shows the client-side communication: varnishlog -d -c -m ReqStart:1427305652 while this only shows the resulting backend communication: varnishlog -d -b -m TxHeader:1427305652 Is there a one-liner to show the entire request?

    Read the article

  • RDP with multiple monitors, display preferences get reset?

    - by Martijn Kooij
    Problem: When I connect to my pc at the office via RDP all the application windows I had previously carefully placed on either monitor 1 or 2 will be "scrambled". Either all applications show on monitor 1 and monitor 2 is empty, or they have switched 1 <- 2. Expected behaviour: When I connect I see all the application windows on exactly the same position and in the exact same size as I left them the night before. I have the exact same monitors at home as I have at work: Primary 2560x1440, Secondary 900x1440. Yesterday I tried switching the physical cables on the host machine hoping that the hardware order of the monitors was the difference. But this morning my secondary monitor was completely blank, not even the taskbar (which I had set to ONLY show on the secondary). Somewhere there must be something to help Windows understand which physical monitor is which virtual RDP monitor is which RDP "server" monitor... Are there more options than switching the cables? This one has been bothering me for a long long time now, I hope someone has a solution or workaround for me. Edit I want to use both monitors, so I have checked the "Use all monitors" setting in the RDP client. For example I leave my mail and total commander on the right monitor, and visual studio and Firefox on the left monitor. When I connect to RDP I want to see those applications on the same positions and sizes.

    Read the article

  • Returning in a static initializer

    - by Martijn Courteaux
    Hello, This isn't valid code: public class MyClass { private static boolean yesNo = false; static { if (yesNo) { System.out.println("Yes"); return; // The return statement is the problem } System.exit(0); } } This is a stupid example, but in a static initializer we can't use return;. Why? Are there good reasons for this? Does someone know more about this? The reason I want to return is to end the initialization there. Thanks
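
    A sketch of one common workaround: move the logic into a private static method, where return is legal, and call that method from the initializer.

        public class MyClass {
            private static boolean yesNo = false;

            static {
                init(); // delegate; return is not allowed directly in here
            }

            private static void init() {
                if (yesNo) {
                    System.out.println("Yes");
                    return; // fine inside an ordinary method
                }
                System.exit(0);
            }
        }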

    Read the article

  • Java + Eclipse: Synchronize stdout and stderr

    - by Martijn Courteaux
    Hi, I use Eclipse. When I have an application like this: write 20 times 'Hello World\n' to stdout write 'ERROR\n' to stderr write 5 times 'Hello World\n' to stdout The output often looks like this: Hello World Hello World Hello World Hello World Hello World Hello World ... Hello World Hello World Hello World ERROR Is there a way to synchronize these two output streams? Preferably without waiting a few milliseconds after the block of 20 Hello Worlds and again after printing ERROR.
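
    One workaround sketch, assuming it is acceptable to lose the red colouring of stderr in the Eclipse console: route System.err through System.out so the console receives a single stream and ordering is preserved. The class name and output below are just for illustration.

        public class MergedStreams {
            public static void main(String[] args) {
                // Send stderr through stdout so both share one stream.
                System.setErr(System.out);

                for (int i = 0; i < 20; i++) System.out.println("Hello World");
                System.err.println("ERROR");
                for (int i = 0; i < 5; i++) System.out.println("Hello World");
            }
        }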

    Read the article

  • Returning in a static class constructor

    - by Martijn Courteaux
    Hello, This isn't valid code: public class MyClass { private static boolean yesNo = false; static { if (yesNo) { System.out.println("Yes"); return; // The return statement is the problem } System.exit(0); } } This is a stupid example, but in a static class constructor we can't use return;. Why? Are there good reasons for this? Does someone know more about this? The reason I want to return is to end the construction there. Thanks

    Read the article

  • Java2D: Fill a convex rounded polygon (QuadCurves)

    - by Martijn Courteaux
    Hi, If I have a QuadCurve like this (+ = node): + + \ ./ +--?? And if I fill it in Java2D, the result is something like this: (x = colored) +xxxxxxxxx+ \xxxxxx./ +--?? But I want to color the other side: + + x\ ./x xxx +--??xx xxxxxxxxxxx I can achieve this by drawing a rectangle around the curve in the color I want for the other side and then filling the curve with the background color. But this isn't good enough to fill a convex rounded polygon (based on QuadCurves), because in some cases the rectangles from that trick overlap other pieces of the polygon. Here are two images (the green area is my polygon): So, the question is simple: "How can I color a shape built of curves?" But the answer will not be simple, I think... Any advice would be VERY VERY appreciated. Thanks in advance. I might put a bounty on this question if I don't get an answer
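
    A sketch of the approach usually suggested for this (all coordinates are made up): instead of filling individual QuadCurve2D segments, build the whole outline, straight edges and rounded corners together, as one closed GeneralPath and fill that; Java2D then colours the enclosed region, i.e. the side of the curves you actually want.

        import java.awt.*;
        import java.awt.geom.GeneralPath;
        import javax.swing.*;

        public class RoundedPolygonDemo extends JPanel {
            @Override
            protected void paintComponent(Graphics g) {
                super.paintComponent(g);
                Graphics2D g2 = (Graphics2D) g;
                g2.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                                    RenderingHints.VALUE_ANTIALIAS_ON);

                // One closed path: straight edges via lineTo, rounded corners via quadTo.
                GeneralPath path = new GeneralPath();
                path.moveTo(40, 40);
                path.lineTo(200, 40);
                path.quadTo(240, 40, 240, 80);    // rounded top-right corner
                path.lineTo(240, 160);
                path.quadTo(240, 200, 200, 200);  // rounded bottom-right corner
                path.lineTo(40, 200);
                path.closePath();

                g2.setColor(Color.GREEN);
                g2.fill(path);                    // fills the enclosed region
            }

            public static void main(String[] args) {
                JFrame f = new JFrame("Rounded polygon");
                f.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                f.add(new RoundedPolygonDemo());
                f.setSize(320, 280);
                f.setVisible(true);
            }
        }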

    Read the article

  • When I'm iterating over two arrays at once, which one do I use as the limit?

    - by Martijn Courteaux
    Hi, I'm always struggling with something like the following Java example: String breads[] = {"Brown", "White", "Sandwich"}; int count[] = new int[breads.length]; for (int i = 0; i < ****; i++) { // Prompt the number of breads } ****: which array.length should I choose? I can choose between breads.length and count.length I know it would be the same result, but I don't know which one I should choose. There are many other examples where I get the same problem. I'm sure that you have encountered this problem as well in the past. What should you choose? Are there general conventions? Thanks
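
    For what it's worth, one common convention is to loop over the array that "drives" the iteration, here breads, since count.length is defined from breads.length anyway. A minimal sketch (the prompt wording is made up):

        import java.util.Scanner;

        public class BreadCounter {
            public static void main(String[] args) {
                String[] breads = {"Brown", "White", "Sandwich"};
                int[] count = new int[breads.length];

                Scanner in = new Scanner(System.in);
                // Iterate over the "driving" array; count was sized from it.
                for (int i = 0; i < breads.length; i++) {
                    System.out.print("How many " + breads[i] + " breads? ");
                    count[i] = in.nextInt();
                }
            }
        }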

    Read the article

  • Eclipse And Linux: Keyboard unusable after gnome-screen-saver

    - by Martijn Courteaux
    Hi, I know this is not programming related. But I can't find any topics on Google or UbuntuForums. So the problem is: when gnome-screensaver starts while Eclipse has the focus and I then wake my laptop up again, Eclipse doesn't respond to keyboard events. To solve this I have to change the focus to another program and then back to Eclipse. Then it works again. This isn't a real problem, but it would be nice if someone could solve it. Thanks

    Read the article

  • Software Protection: Shuffling my application?

    - by Martijn Courteaux
    Hi, I want to continue on my previous question: http://stackoverflow.com/questions/3007168/torrents-can-i-protect-my-software-by-sending-wrong-bytes Developer Art suggested to add a unique key to the application, to identify the cracker. But JAB said that crackers can search where my unique key is located by checking for binary differences, if the cracker has multiple copies of my software. Then crackers change that key to make themselves anonymous. That is true. Now comes the question: If I want to add a unique key, are there tools to shuffle the program modules (a kind of obfuscation), so that a binary compare would say that the two files are completely different? That way they can't locate the identifier key. I'm pretty sure it is possible (maybe by replacing assembler blocks and making some jumps). I think it would be enough to make 30 to 40 shuffles of my software. Thanks

    Read the article

  • Programming Contest Question: Counting Polyominos

    - by Martijn Courteaux
    Hi, An example question for a programming contest was to write a program that finds out how many polyominoes are possible with a given number of stones. So for two stones (n = 2) there is only one polyomino: XX You might think this is a second solution: X X But it isn't. Polyominoes are not counted as different if you can rotate one into the other. So, for 4 stones (n = 4), there are 7 solutions: X X XX X X X X X X XX X XX XX XX X X X XX X X XX The application has to be able to find the solution for 1 <= n <= 10 PS: Using the list of polyominoes on Wikipedia isn't allowed ;) EDIT: Of course the question is: How to do this in Java, C/C++, C#
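
    A rough sketch of one possible approach in Java (written against a current JDK; class and method names are made up): grow every shape of size k+1 from a shape of size k by adding one neighbouring stone, and deduplicate by reducing each shape to a canonical form over its four rotations, so rotated copies collapse to one entry. For n up to 10 this finishes quickly, and for n = 4 it should report the 7 shapes listed above.

        import java.util.*;

        public class PolyominoCounter {

            // Translate cells so min x/y are 0 and sort them for a stable form.
            static List<int[]> normalize(Collection<int[]> cells) {
                int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
                for (int[] c : cells) { minX = Math.min(minX, c[0]); minY = Math.min(minY, c[1]); }
                List<int[]> out = new ArrayList<>();
                for (int[] c : cells) out.add(new int[]{c[0] - minX, c[1] - minY});
                out.sort((a, b) -> a[0] != b[0] ? a[0] - b[0] : a[1] - b[1]);
                return out;
            }

            static String key(List<int[]> cells) {
                StringBuilder sb = new StringBuilder();
                for (int[] c : cells) sb.append(c[0]).append(',').append(c[1]).append(';');
                return sb.toString();
            }

            // Canonical form: the smallest key among the four rotations.
            static String canonical(Collection<int[]> cells) {
                List<int[]> cur = new ArrayList<>();
                for (int[] c : cells) cur.add(new int[]{c[0], c[1]});
                String best = null;
                for (int r = 0; r < 4; r++) {
                    String k = key(normalize(cur));
                    if (best == null || k.compareTo(best) < 0) best = k;
                    List<int[]> rot = new ArrayList<>();
                    for (int[] c : cur) rot.add(new int[]{c[1], -c[0]}); // rotate 90 degrees
                    cur = rot;
                }
                return best;
            }

            static List<int[]> parse(String key) {
                List<int[]> cells = new ArrayList<>();
                for (String part : key.split(";")) {
                    String[] xy = part.split(",");
                    cells.add(new int[]{Integer.parseInt(xy[0]), Integer.parseInt(xy[1])});
                }
                return cells;
            }

            public static void main(String[] args) {
                Set<String> current = new HashSet<>();
                current.add(canonical(Collections.singletonList(new int[]{0, 0})));
                System.out.println("n=1: " + current.size());

                int[][] dirs = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};
                for (int n = 2; n <= 10; n++) {
                    Set<String> next = new HashSet<>();
                    for (String k : current) {
                        List<int[]> cells = parse(k);
                        Set<String> occupied = new HashSet<>();
                        for (int[] c : cells) occupied.add(c[0] + "," + c[1]);
                        for (int[] c : cells) {
                            for (int[] d : dirs) {
                                int nx = c[0] + d[0], ny = c[1] + d[1];
                                if (occupied.contains(nx + "," + ny)) continue;
                                List<int[]> grown = new ArrayList<>(cells);
                                grown.add(new int[]{nx, ny});
                                next.add(canonical(grown));
                            }
                        }
                    }
                    current = next;
                    System.out.println("n=" + n + ": " + current.size());
                }
            }
        }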

    Read the article

  • C++: Platform independent game lib?

    - by Martijn Courteaux
    Hi, I want to write a serious 2D game, and it would be nice to have a version for Linux and one for Windows (and eventually OSX). Java is fantastic because it is platform independent. But Java is too slow to write a serious game. So I thought of writing it in C++. But C++ isn't very cross-platform friendly. I can find game libraries for Windows and libraries for Linux, but I'm searching for one that I can use for both, by recompiling the source on a Windows platform and on a Linux platform. Are there engines for this, or is this idea irrelevant? Isn't it that easy (recompiling)? Any advice and information about C++ libraries would be very very very appreciated!

    Read the article

  • Java: Make a method abstract for each extending class

    - by Martijn Courteaux
    Hi, Is there any keyword or design pattern for doing this? public abstract class Root { public abstract void foo(); } public abstract class SubClass extends Root { public void foo() { // Do something } } public class SubberClass extends SubClass { // Here it is not necessary to override foo() // So is there a way to make this necessary? // A way to force the developer to override it again } Thanks
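
    As far as I know there is no keyword for this, but one pattern that comes close is an abstract intermediate class that re-declares foo() as abstract: every concrete class below it is then forced to provide its own implementation again. A sketch (the intermediate class name is made up; one class per file in practice):

        public abstract class Root {
            public abstract void foo();
        }

        abstract class SubClass extends Root {
            @Override
            public void foo() {
                // Do something
            }
        }

        // Intermediate class: re-abstracts foo(), hiding the inherited body.
        abstract class SubberBase extends SubClass {
            @Override
            public abstract void foo();
        }

        class SubberClass extends SubberBase {
            @Override
            public void foo() {
                // The compiler now requires this override again.
            }
        }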

    Read the article
