Search Results

Search found 11913 results on 477 pages for 'fail fast fail early'.

  • Best way to grow Linux software RAID 1 to RAID 10

    - by Hans Malherbe
    mdadm does not seem to support growing an array from level 1 to level 10. I have two disks in RAID 1. I want to add two new disks and convert the array to a four-disk RAID 10 array. My current strategy:

    1. Make a good backup.
    2. Create a degraded four-disk RAID 10 array with two missing disks.
    3. rsync the RAID 1 array to the RAID 10 array.
    4. Fail and remove one disk from the RAID 1 array.
    5. Add the freed disk to the RAID 10 array and wait for the resync to complete.
    6. Destroy the RAID 1 array and add the last disk to the RAID 10 array.

    The problem is the lack of redundancy at step 5. Is there a better way?
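
    A rough command-line sketch of the strategy above. Device and array names (/dev/sd[a-d]1, /dev/md0, /dev/md1) and mount points are assumptions for illustration, not taken from the question:

        # Step 2: build a degraded 4-disk RAID 10 from the two new disks (assumed sdc1/sdd1).
        # Ordering matters: each mirror pair should get one real disk and one "missing".
        mdadm --create /dev/md1 --level=10 --raid-devices=4 /dev/sdc1 missing /dev/sdd1 missing

        # Step 3: copy the data across (filesystem and mount points are assumptions)
        mkfs.ext4 /dev/md1 && mount /dev/md1 /mnt/new
        rsync -aHAX /mnt/old/ /mnt/new/

        # Steps 4-5: pull one disk out of the RAID 1 (assumed /dev/md0) and give it to the RAID 10
        mdadm /dev/md0 --fail /dev/sda1 --remove /dev/sda1
        mdadm /dev/md1 --add /dev/sda1      # resync starts; watch /proc/mdstat until it settles

        # Step 6: retire the old mirror and add its last disk
        mdadm --stop /dev/md0
        mdadm --zero-superblock /dev/sdb1
        mdadm /dev/md1 --add /dev/sdb1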

    Read the article

  • Linux or Windows for a server?

    - by Matt
    I'm a Linux guy when it comes to (web) servers, for the following reasons:

    - Legally free
    - Fast software updates (unless you're running CentOS :)
    - Powerful CLI management of services
    - Easy to secure (in terms of users and groups)
    - Web server software is, well, built for Linux: Apache, PHP, Python, etc. are Linux programs that get ported to Windows (I'm 90% sure of this)

    Unless the web server needed to run ASP, I wouldn't use Windows. My boss's IT friend is a Windows guy, though. He recently set up a server in the office to run Microsoft Exchange and some other stuff. What I'm asking is: if he wanted to start running websites on this thing, what would be good reasoning to convince him otherwise? He's not very bright in terms of IT, and the IT friend is all Windows. So it's two against one here... What would you say to running a Windows web server?

    Read the article

  • Xpath and parameters

    - by Daniel
    I am relatively new to XSL and I think this is a basic question, so I better get my apologies in early! Basically, I would like to use a value-of instruction. This will be in a template, and I would like to pass part of the XPath as a parameter. <xsl:variable name="TEST_VAR">"h:elementA/elementB"</xsl:variable> ... and then use the variable (or a passed-in param) as all (or part) of the XPath: <xsl:element name="transformedElement"><xsl:value-of select="$TEST_VAR"/></xsl:element> I would really like the XPath to be a mixture of string literals and the value of TEST_VAR. Thanks in advance.
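
    For context: in plain XSLT 1.0 a string held in a variable is not re-evaluated as an XPath expression, so select="$TEST_VAR" would just output the string itself; dynamic evaluation generally needs an extension such as EXSLT's dyn:evaluate, which some processors support and others do not. A minimal sketch under that assumption. The element names h:elementA/elementB come from the question; the h namespace URI and everything else are illustrative:

        <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
            xmlns:dyn="http://exslt.org/dynamic"
            xmlns:h="http://example.com/h"
            extension-element-prefixes="dyn">

          <!-- the XPath is built from string literals plus the variable/param -->
          <xsl:param name="TEST_VAR" select="'h:elementA/elementB'"/>

          <xsl:template match="/">
            <transformedElement>
              <!-- dyn:evaluate() treats the string as an XPath expression (EXSLT extension) -->
              <xsl:value-of select="dyn:evaluate(concat('/', $TEST_VAR))"/>
            </transformedElement>
          </xsl:template>
        </xsl:stylesheet>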

    Read the article

  • Using the link command to keep backups on another drive

    - by Xavier
    I have a folder, /data/backup, that does not have much space available. I have been told that if I link that folder (/data/backup) to a much bigger area such as /bigdata/backup, I will still be able to run backups to the /data/backup folder: it just becomes a link, the data is visible in both places, and the latter folder (/bigdata/backup) actually holds the backup results. Since /bigdata/backup has far more disk space, the backups would then no longer fail because of space problems in /data/backup. Is this true?
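
    If what was described is a symbolic link, a minimal sketch would look like the following. It assumes /bigdata exists, /bigdata/backup does not yet, and the existing contents of /data/backup are moved aside first (ln -s will not replace an existing directory); the paths are the ones from the question:

        mv /data/backup /bigdata/backup          # move the existing backups onto the big volume
        ln -s /bigdata/backup /data/backup       # recreate the old path as a symlink

        # anything written to /data/backup/... now physically lives under /bigdata
        ls -ld /data/backup                      # shows: /data/backup -> /bigdata/backup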

    Read the article

  • Strange behavior in networking between 64bit and 32bit

    - by Rob
    I'm seeing some strange behavior in my network setup. I have two laptops, one (Lenovo) with Windows 7 Professional 64-bit and another (Acer) with Windows 7 Ultimate 32-bit, plus a wireless router. I connect the two machines through the router, but with odd results: I can ping both machines, as well as the router, but when I try to access their shared folders (\\computer_name\shared_folder) the connection starts to fail and I need to reboot both machines to get it working again. This only happens sometimes; other times it works.

    Read the article

  • How would you fool your boss, if you needed to? [closed]

    - by Starx
    Even as a programmer myself, I am not always in the mood for coding, so I think of a way to fool my boss and get going... (if you know what I mean). Like one time I wanted to go home early, so I said to my boss, "Sir, I have a check-up, I need to meet my doctor at 12:00 pm today." And I was out of there. Well, since I do this, I thought others might also have done something like this, or wanted to. So, how about sharing them?

    Read the article

  • Slow Jquery Animation

    - by Pyronaut
    I have this webpage: http://miloarc.pyrogenicmedia.com/ which at the moment is nothing special. It has a few effects, but none that break the bank. If you mouse over a tile, it should change its opacity to give it a fade effect. This is done through jQuery's animate(), not through CSS (I do this so it can fade, instead of being a straight change). Everything looks nice when the page loads, and the fades look perfect. In fact, if you drag your mouse all over the place, it gives you a "trail", almost like a snake. Anyway, my next problem: you will see there is a box in the top left, which is going to tell you information about the tile you are hovering over. This seems to work fine. When you mouse over that information box, it switches its position (so that you can reach the tiles that were previously hidden underneath it). From my understanding, this is all working fine, and to the letter. However, after one move of the info box (one hover), viewing the page in Firefox turns sluggish: the fade effects become very stuttered, and it doesn't pick up events as fast, meaning you can't draw a "trail" on the screen. Chrome doesn't have this issue; it seems to work fine no matter what. Safari also seems OK, although I do notice that if I move my mouse really fast it does jump a bit, but not as much as Firefox. Internet Explorer doesn't work at all; it seems to crash the browser... And there is an error with the rounded-corner plugin I'm using (not sure why...). All in all, I think whatever I'm doing inside my code must be heavily sluggish, but I don't know where it is happening. Here is the full code, but I would advise going to the link to view everything:

        <script type="text/javascript">
        $(document).ready(function() {
            var WindowWidth = $(window).width();
            var WindowMod = WindowWidth % 20;
            var WindowWidth = WindowWidth - WindowMod;
            var BoxDimension = WindowWidth / 20;
            var BoxDimensionFixed = BoxDimension - 12;
            var dimensions = BoxDimensionFixed + 'px';

            $('.gamebutton').each(function(i) {
                $(this).css('width', dimensions);
                $(this).css('height', dimensions);
            });

            var OuterDivHeight = BoxDimension * 10;
            var TopMargin = ($(window).height() - OuterDivHeight) / 2;
            var OuterDivWidth = BoxDimension * 20;
            var LeftMargin = ($(window).width() - OuterDivWidth) / 2;

            $('#gamePort').css('margin-top', TopMargin).css('margin-left', LeftMargin).css('margin-right', LeftMargin);

            $('.gamebutton img').each(function(i) {
                $(this).hover(
                    function () { $(this).animate({'opacity': 0.65}); },
                    function () { $(this).animate({'opacity': 1}); }
                );
            });

            $('.rounded').corners();

            $('.gamebutton').each(function(i) {
                $(this).hover(function() {
                    $('.gameTitlePopup').html($(this).attr('title'));
                    FadeActive();
                });
            });

            function FadeActive() {
                $('.activeInfo').fadeIn('slow');
            }

            $('#gameInfoLeft').hover(function() {
                $(this).removeClass('activeInfo');
                $(this).fadeOut('slow', function() {
                    $('#gameInfoRight').addClass('activeInfo');
                    FadeActive();
                });
            });

            $('#gameInfoRight').hover(function() {
                $(this).removeClass('activeInfo');
                $(this).fadeOut('slow', function() {
                    $('#gameInfoLeft').addClass('activeInfo');
                    FadeActive();
                });
            });
        });
        </script>

    Thanks for any help :).
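
    One thing worth noting about the hover handlers above: every mouseenter/mouseleave queues another animate() call, and queued animations pile up quickly when the mouse moves fast. A common hedge (not necessarily the cause of the Firefox slowdown here) is to clear the queue before animating, roughly:

        $('.gamebutton img').hover(
            function () { $(this).stop(true).animate({opacity: 0.65}); },  // stop(true) drops queued animations
            function () { $(this).stop(true).animate({opacity: 1}); }
        );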

    Read the article

  • Is a cluster the most cost effective redundancy method for windows server 2003?

    - by Ryan
    We had a server with bad RAM, which caused a long outage while they figured it out, and our client-facing apps had to go down for a while. We are coming up with a solution for instant fail-over but are not sure what the most cost-effective method would be. Is a Windows Server cluster the best method for this? Also note we are using Parallels Virtuozzo, if that makes any difference here. We found Parallels has a documented method for setting this up, but it said it required a domain controller as well as a fibre connection to shared storage; is all that really needed? Thanks.

    Read the article

  • Init modules in apache2

    - by user306963
    Hello, I used to write Apache modules for Apache 1.3, but these days I am willing to move to Apache 2. The module I am writing at the moment has its own binary data, not a database, for performance purposes. I need to load this data into shared memory so every child can access it without making its own copy, and it would be practical to load/create the binary data at startup, as I used to do with Apache 1.3. The problem is that I can't find an init event in Apache 2. In 1.3, in the module struct, immediately after STANDARD_MODULE_STUFF you find a place for a /** module initializer */, in which you can put a function that will be executed early. The body of the function I used to write is something like:

        if (getppid() == 1) {
            /* this is the parent process: load the global data here */
            void *data = loadGlobalData(someFilePath);
            setGlobalData(config, data);
        } else {
            /* this is the init of a child process: do nothing */
        }

    I am looking for a place in Apache 2 where I can put a similar function. Can you help? Thanks, Benvenuto
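
    For reference, a minimal sketch of how an Apache 2.x module typically registers startup code: the usual stand-in for the 1.3 module initializer is the post_config hook (runs in the parent before children are forked), optionally paired with child_init for per-child setup. Everything except the Apache APIs (module name, load/set helpers, file path) is made up for illustration:

        #include "httpd.h"
        #include "http_config.h"

        /* runs once in the parent after the configuration is read (actually twice,
         * since Apache re-reads the config on startup, so guard one-time work) */
        static int example_post_config(apr_pool_t *pconf, apr_pool_t *plog,
                                       apr_pool_t *ptemp, server_rec *s)
        {
            /* placeholders for the module's own shared-memory loading code:
             * void *data = load_global_data("/path/to/file");
             * set_global_data(data);                                         */
            return OK;
        }

        /* runs in each child right after it is forked */
        static void example_child_init(apr_pool_t *pchild, server_rec *s)
        {
            /* per-child setup, e.g. attaching to the shared memory segment */
        }

        static void example_register_hooks(apr_pool_t *p)
        {
            ap_hook_post_config(example_post_config, NULL, NULL, APR_HOOK_MIDDLE);
            ap_hook_child_init(example_child_init, NULL, NULL, APR_HOOK_MIDDLE);
        }

        module AP_MODULE_DECLARE_DATA example_module = {
            STANDARD20_MODULE_STUFF,
            NULL, NULL, NULL, NULL, NULL,   /* config creators/mergers, command table */
            example_register_hooks
        };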

    Read the article

  • how to install mpgtx from source code

    - by Ahmet vardar
    I am new to Linux servers. I have an mpgtx folder in my root directory; how can I install it? The README says to run: ./configure && make. When I type this I get a permission-denied error. Thanks. EDIT: Here are the steps I did:

        root@server [/]# cd /mpgtx
        root@server [/mpgtx]# ./configure
        -bash: ./configure: Permission denied
        root@server [/mpgtx]# make
        -----------------------------------------------------------------------------
        Hello ! I'm afraid I'm a dummy Makefile. My goal in life is to politely ask
        you to run the configure script to actually generate a real Makefile.
        Would you be kind enough to type "./configure --help" to see the options
        that will suit your needs ? Please note that typing "./configure" without
        option will generate a Makefile that will suit most people needs.
        I wish you a good day. Please don't drive to fast.
        -----------------------------------------------------------------------------
        root@server [/mpgtx]# ./configure
        -bash: ./configure: Permission denied
        root@server [/mpgtx]#
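
    The "Permission denied" message usually just means the configure script lost its execute bit (common after some download or extraction methods). A sketch of the usual workaround, using the path from the question:

        cd /mpgtx
        chmod +x configure      # restore the execute bit
        ./configure && make
        # alternatively, without changing permissions, run it through the shell:
        # sh ./configure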

    Read the article

  • CentOS PAM+LDAP login and host attribute

    - by pianisteg
    My system is CentOS 6.3, OpenLDAP is configured well, and PAM authorization works fine. But after turning pam_check_host_attr to yes, all LDAP auths fail with the message "Access denied for this host". hostname on the server returns the correct value, and the same value is listed in the user's profile. "pam_check_host_attr no" works fine and allows everyone with a correct uid/password. An excerpt from /var/log/secure:

        Sep 26 05:33:01 ldap sshd[1588]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=my-host user=my-username
        Sep 26 05:33:01 ldap sshd[1588]: Failed password for my-username from 77.AA.BB.CC port 58528 ssh2
        Sep 26 05:33:01 ldap sshd[1589]: fatal: Access denied for user my-username by PAM account configuration

    Another two servers (CentOS 5.7, Debian) authenticate against this LDAP server correctly, even with pam_check_host_attr yes! I didn't edit /etc/security/access.conf; it is empty, only default comments. I don't know what to do! How do I fix this?

    Read the article

  • Will Dolphin emulator run (smoothly) on this machine?

    - by Mark
    Product Specs: Intel® NM10 Express chipset with Intel® Atom® D525 (dual-core, 1.8 GHz). I think the most I can put in there for RAM is 4 GB. (They don't make 8 GB SODIMMs, do they?) The Dolphin site is pretty vague about requirements: Windows XP or higher, or Linux, or Mac OS X; Intel fast CPU with SSE2; GPU with Pixel Shader 2.0 or greater; some integrated graphics chips work but it depends on the model (and only with DirectX 9). I'm trying to build a lightweight, quiet little machine for streaming video and playing emulators, and I'm trying to figure out the minimum I will need to do what I want. I know I'm not supposed to ask for product recommendations here, so if you could just advise the minimum requirements in terms of CPU, graphics card, and RAM, that'd be helpful.

    Read the article

  • Arch linux ati graphics crash after grub install

    - by Jay
    OK, so I've had an Arch Linux install with GNOME 3 for a while now. A while ago I installed Ubuntu on another partition, I think to fix an issue that caused Arch to fail. It was all working fine, but then I went and reinstalled GRUB 1 on the Arch partition (Ubuntu had overwritten it during its install). When I then tried to boot into Arch it booted, but the graphics weren't working correctly: GDM wouldn't even show, and there were weird colors instead. So I uninstalled xf86-video-ati and then installed xf86-video-vesa. That made GDM run in fallback mode and I was able to boot to GNOME 3 fallback mode (or whatever it's called). But I can't seem to get the graphics working correctly.

    Read the article

  • Internet very slow when upgrading to Ubuntu 9.10

    - by roojoo
    I was running Ubuntu 8.x on my desktop and everything worked fine. I'm using wired internet and it worked perfectly; pages loaded pretty fast. However, when I decided to upgrade to 9.10 the upgrade failed at some point, but I was left with what appeared to be Ubuntu 9.10. Since then the internet has been weird. When I go to a website it takes at least 10 seconds for the page to display, but if I'm on a site and navigate to other pages on the same website they load quickly. This never happened prior to the upgrade. I thought this might be due to the upgrade not installing correctly, so I did a fresh install of Xubuntu 9.10, but the problem is still the same. I'm writing this on a Vista machine over the wireless network and its internet is fine. Does anyone have any idea what the issue is? Thanks.

    Read the article

  • Will this increase my VPS failing rate ?

    - by Spencer Lim
    Will this increase my virtual private server's failure rate if I:

    - install Microsoft Windows Server 2008 Enterprise
    - install SQL Server 2008 Enterprise
    - install IIS 7.5
    - install ASP.NET MVC 2
    - install Microsoft Exchange
    - install Team Foundation Server

    all on one mini VPS? The VPS is on a shared-plan Dell PowerEdge R710 with the following specification: 16 GB of DDR3 ECC RAM (1 GB for this VPS), a Dell PERC 6i RAID controller (this thing alone costs about 1.5k-2k) and 15K RPM 146 GB SAS HDDs (33 GB for this VPS); each HDD is freaking fast, with over 300 MB/s read/write possible with proper tuning. The motherboard is a Dell and it has twin redundant PSUs (870 W, 85% efficiency). It's running two Intel Xeon 5502s (quad core), so about 8 physical processors, fairly shared. Is there any rule of thumb for which of these services a single VPS can reasonably run? Thanks for any reply.

    Read the article

  • Can KVM CPU assignment count differ from physical hosts CPU count?

    - by javano
    I have read this question. I already knew that I could, for example, have a quad-core machine with four guests, each having two vCPUs. Since they won't all require 100% CPU usage all the time, the scheduler handles this for me. My question is about how this relates to a fail-over or migration situation: if host1 has two dual-core CPUs and I assign guest1 four vCPUs (so it can use all four physical cores), what will happen if I try to migrate it to host2, which only has one dual-core CPU? Can qemu-kvm emulate more vCPUs than there are physical cores? Or would I have to shut down the virtual machine, change the CPU assignment, migrate it, and then boot it back up (so no live migration)? Many thanks.

    Read the article

  • Can't login to a new mysql user

    - by mostar
    Hi, when I create a new MySQL user, it is impossible to log in with that user and password. I can only log in if I create a user without a password. For example:

        mysql -u root -phererootpass
        grant all privileges on mydb.* to testuser@'%' identified by '' with grant option;
        grant all privileges on mydb.* to testuser2@'%' identified by 'mypass' with grant option;
        FLUSH PRIVILEGES;
        exit;

        mysql -u testuser               # <<< works fine
        mysql -u testuser2 -pmypass     # <<< fails to log in
        ERROR 1045 (28000): Access denied for user 'testuser2'@'localhost' (using password: YES)

    I'm using MySQL 5.0 on Red Hat 5. Please advise. Mostar
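
    For what it's worth, this access-denied pattern is often caused by MySQL's default anonymous account ''@'localhost', which matches a local connection before testuser2@'%' does and has an empty password. That is a guess at the cause, not something stated in the question; a sketch of the two usual workarounds (database and user names taken from the question):

        -- option 1: give the user an explicit localhost entry
        GRANT ALL PRIVILEGES ON mydb.* TO 'testuser2'@'localhost' IDENTIFIED BY 'mypass' WITH GRANT OPTION;

        -- option 2: remove the anonymous accounts that shadow it
        SELECT user, host FROM mysql.user WHERE user = '';
        DROP USER ''@'localhost';

        FLUSH PRIVILEGES;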

    Read the article

  • SELECT TOP N With Two Variables

    - by Ricardo Deano
    Hello all. It's Tuesday morning and I am being thick (I'm blaming my daughter waking up early this morning!). I have the following example data in a SQL table:

        Cust  Group  Sales
        A     1      15
        A     1      10
        A     1       5
        A     2      15
        A     2      10
        A     2       5
        B     1      15
        B     1      10
        B     1       5
        B     2      15
        B     2      10
        B     2       5

    What I would like to show is the top 2 products per customer, per group, sorted descending by Sales, i.e.:

        Cust  Group  Sales
        A     1      15
        A     1      10
        A     2      15
        A     2      10
        B     1      15
        B     1      10
        B     2      15
        B     2      10

    I'm assuming I need to declare two variables, Cust and Group; I'm just not sure how to complete this in one fell swoop. Apologies for the thick question... no excuse. Thanks for any help.
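
    A common way to get a per-group top N in a single statement is a ranking function rather than variables. A rough sketch in SQL Server 2005+ syntax: column names come from the question (Group bracketed because it is a reserved word), while the table name MyTable is assumed:

        SELECT Cust, [Group], Sales
        FROM (
            SELECT Cust, [Group], Sales,
                   ROW_NUMBER() OVER (PARTITION BY Cust, [Group]
                                      ORDER BY Sales DESC) AS rn
            FROM MyTable                -- table name assumed
        ) ranked
        WHERE rn <= 2
        ORDER BY Cust, [Group], Sales DESC;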

    Read the article

  • SSL FTP fails on Windows 7 but not Windows XP clients

    - by Andrew Neely
    We currently use a free SSL-FTP client called Move-It-Freely to transmit data from a custom data entry program at over forty facilities scattered around the state to our central server. Under XP, it works flawlessly. Some facilities have upgraded to Windows 7. On these machines, uploads (transfers to us) work, downloads (transfers from us to them) fail. Replacing the Windows 7 machine with an XP machine solves the problem. We have also verified that the network firewall settings have not changed. This problem persists even if Windows firewall is not running. We were able to remote into one of the Windows 7 machines to verify that the Windows firewall was indeed turned off. We cannot replicate the problem on our own Windows 7 machines, and are at a loss of how to fix this feature for our customers. The data contain health-related information, and needs to be encrypted (hence SSL-FTP.) Despite hours spent on Google, we cannot find a solution.

    Read the article

  • what might cause a print error in perl?

    - by Mark J Seger
    I have a long-running script that every hour opens a file, prints to it and closes the file. I've recently found that, very rarely, the print is failing; I know this not because I'm testing the status of the print itself, but because of missing entries in the file, which persist until the system is actually rebooted! I do trap file-open failures and write a message to syslog when that happens, and I'm not seeing any open failures, so I'm now guessing it may be the print that is failing. I'm not trapping print failures, which I suspect most people don't, but I am now going to update that one print. Meanwhile, my question is: does anyone know what types of situations could cause a print statement to fail when there is plenty of disk space and no contention for a file that has been successfully opened in append mode?
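
    A minimal sketch of trapping the failure (the log path and messages are placeholders, not from the question). Because output is buffered, a write error may only surface when the handle is flushed at close(), so it is worth checking both:

        use strict;
        use warnings;
        use Sys::Syslog qw(openlog syslog);

        openlog('hourly-job', 'pid', 'user');

        open(my $fh, '>>', '/var/log/hourly.log')
            or do { syslog('err', "open failed: $!"); die "open failed: $!" };

        print {$fh} "entry at " . localtime() . "\n"
            or syslog('err', "print failed: $!");

        close($fh)                                  # buffered data is flushed here
            or syslog('err', "close/flush failed: $!");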

    Read the article

  • Best practice? Using DPM to backup VMs within each VM or through the host?

    - by andrew
    We've got two Hyper-V hosts running multiple VMs (all flavors of Windows Server). One of the VMs runs MS Data Protection Manager 2010, which runs beautifully (most of the time) and is connected to a separate NAS via iSCSI for the DPM storage. I noticed that when I installed the DPM agent on the Hyper-V hosts, it enumerated the VMs in the DPM protection listing. I don't want to burn through my storage space too fast with duplicate protection, so I was wondering: is it recommended to back up VMs through the host, or is it better to install the DPM agent on each VM and back it up as I would any other machine? It would seem as though most people (currently including me) do it the second way, but is there any advantage to using the entries under Hyper-V ("Backup Using Child Partition Snapshot")?

    Read the article

  • How can I get a value out of a jQuery object?

    - by Jannis
    I am returning some data (example below) and saving it to a jQuery object (or is this an array and I am confusing the two?). When I log the variable holding the object, it has the values I am looking for, but how do I access the data inside this object?

        $itemPosition = {
            'top': $item.offset().top,
            'left': $item.offset().left
        };
        console.log($itemPosition);

    This would log out (in this case the expected) top: 0 and left: 490. But how can I now work with those values? Also, while this is probably obvious, I am still in the early stages of learning jQuery/JavaScript; rest assured that reference books are on their way, but so far the SO community has been invaluable to my learning, so thanks for reading! J.
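
    For reference, what the snippet above builds is a plain JavaScript object literal (not a jQuery object or an array), so its values are read by property access. A small sketch; $otherItem is just an illustrative element, not from the question:

        var itemPosition = {
            top:  $item.offset().top,
            left: $item.offset().left
        };

        // dot notation or bracket notation both work
        var fromTop  = itemPosition.top;        // e.g. 0
        var fromLeft = itemPosition['left'];    // e.g. 490
        $otherItem.css({ top: fromTop, left: fromLeft });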

    Read the article

  • What programming technique / practice done by you was ahead of its time?

    - by Binoj Antony
    I once built a very good web application in classic ASP back in 2001 and used the XmlHttpRequest object extensively in it. (I was lucky that the clients were only using IE, and only IE supported this object at that time.) Then later, when people started talking about AJAX in 2005, it felt good to have used something ahead of (or early for) its time. Well, maybe this does not qualify to be listed as something done ahead of its time. Which programming technology/technique/practice have you used that was ahead of its time? One story per answer, please. The title of this question was taken from the opposite question here.

    Read the article

  • Postgresql init.d script not working

    - by Bram Jetten
    I installed PostgreSQL 8.4 on my VPS with Ubuntu 10.04. Default setup, nothing unusual. After the installation the DB server is automatically started and runs great. The installer also puts an init.d script in place. This script, however, doesn't seem to affect Postgres:

        $ sudo /etc/init.d/postgresql stop

    The above line does not stop the server. The command does not fail or show any message, and the logs don't say anything either. After killing all postgres processes with killall I cannot get Postgres working again using the init script. When rebooting my VPS it somehow starts up and works again.
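
    On Debian/Ubuntu packaging, PostgreSQL instances are managed as named clusters, which can be useful for checking what the init script is (or isn't) acting on. A hedged sketch using the cluster tools that ship with postgresql-common; "8.4 main" is the default version/cluster assumed for this install:

        pg_lsclusters                          # list clusters, their versions, ports and status
        sudo pg_ctlcluster 8.4 main stop       # stop the 8.4/main cluster directly
        sudo pg_ctlcluster 8.4 main start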

    Read the article
