Search Results

Search found 42869 results on 1715 pages for 'running total'.

Page 212 of 1715

  • How to manage iowait over cifs?

    - by Silvia
    For backup purposes we have a CIFS file server running that contains encrypted containers for backing up the more sensitive data. The container is mounted with cryptsetup and a loop device as a local filesystem, and rsync is used for the backups. Because the CIFS server is not the fastest machine ever built, running the rsync process causes high iowait on the servers running the backup, which in turn drives Nagios into an email frenzy. The question is: how do we reduce the iowait on the server? Configuring Nagios not to report it seems more like a workaround than a solution. Spreading the backups over different time intervals has already been done, with little effect, and spending money is not an option either because, apparently, we are talking about a "non-critical system".
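
    A common mitigation, assuming a Linux backup host and a stock rsync, is to throttle the transfer and drop its I/O and CPU priority so the CIFS share is never saturated. The paths and the bandwidth figure below are placeholders, not the real configuration:

      # Sketch: run the backup at idle I/O priority, lowest CPU priority, and a capped transfer rate
      ionice -c3 nice -n 19 rsync -a --bwlimit=5000 /mnt/secure-container/ /backup/target/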

    Read the article

  • Can a web server and XBMC HTPC coexist happily?

    - by Mild Fuzz
    I have a machine that is currently dedicated to running my home theatre. It is way more powerful than it needs to be, and spec-wise it would have no problem running both a few websites and an HTPC. What I want to know is whether this is a reasonable thing to expect of a single machine. 90% of the time, all its power would go to the web server (and the odd torrent). Currently it's running Windows, but I am pretty sure I will have to turn it into a Linux box. Will I run into any problems? Is there anything I need to know before I start? Any prerequisites? The web server will mainly run Ruby on Rails sites, but might be called upon to run PHP for WordPress as well.

    Read the article

  • Unable to connect to a remote SQL Server Instance over a VPN

    - by Jack Njiri
    I'm running SQL Server 2005 on two different servers running Windows XP. The two servers are in different physical locations and are connected via a dedicated point-to-point data link in a virtual private network (VPN). I'm only able to connect to the remote instance of SQL Server by specifying the IP address in the server name property. If I provide the actual server name, say 'ServerA', I get an error message. Everything works fine except configuring replication at the subscriber level, which requires the actual name of the instance, not an IP address or alias. I have already configured both instances to allow remote connections, and I'm running the SQL Server Browser. How do I connect to the remote instance by providing the instance name? Alternatively, how do I configure a subscription to a remote publisher without supplying the remote instance name?
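
    One workaround, assuming the VPN provides no DNS or NetBIOS name resolution between the sites, is to map the remote server's name to its VPN address in the local hosts file so the real instance name resolves. The IP address, server name and instance name below are placeholders:

      :: Sketch (run from an administrator command prompt); values are examples only
      echo 192.168.10.5   ServerA>> %SystemRoot%\system32\drivers\etc\hosts

      :: verify the name resolves and the named instance answers (SQL Server Browser must be running on the remote side)
      ping ServerA
      sqlcmd -S ServerA\INSTANCENAME -E -Q "SELECT @@SERVERNAME"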

    Read the article

  • Windows calibration settings persistence over reboots

    - by Dmatig
    I'm running Windows 7 64-bit on a laptop (Samsung R560) with a cheap external CRT monitor. The screen is a little dark for my liking, despite all the brightness-related settings on the physical monitor being turned up to the max. Windows 7 has a tool called "Calibrate display color" (search for it in the Start menu). This tool has a slider that adjusts the gamma, and sliding it up gives me acceptable brightness levels. Unfortunately, upon reboot (and after certain other activities, such as running some fullscreen games) this is reset to the default. Is there a way to make it persistent? Some registry setting? Even a batch file run at startup (less preferable, as I'd like games to run brighter too)?

    Read the article

  • How do I give MacPorts privileges?

    - by cojadate
    I tried to install the PostgreSQL server development libraries using MacPorts and got the following:

      Warning: MacPorts running without privileges. You may be unable to complete certain actions (e.g. install).
      ---> Computing dependencies for postgresql-server-devel
      ---> Dependencies to be installed: postgresql-devel
      ---> Building postgresql-devel
      Error: Target org.macports.build returned: shell command failed
      Error: The following dependencies failed to build: postgresql-devel
      Error: Status 1 encountered during processing.
      To report a bug, see <http://guide.macports.org/#project.tickets>

    So I guess that means I need to run MacPorts with privileges and try again. Unfortunately I've no idea how to give MacPorts privileges. I'm running OS X 10.6.3.
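
    The usual way to give MacPorts the privileges it needs is simply to run port through sudo; a minimal sketch using the port names from the error output:

      # clean up the failed build first, then retry the install as root
      sudo port clean postgresql-devel
      sudo port install postgresql-server-devel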

    Read the article

  • Remote desktop session ends abruptly with a "protocol error"

    - by Jon
    Intermittently we get a problem where a Remote Desktop session is disconnected with the error message “Because of a protocol error, this session will be disconnected. Please try connecting to the remote computer again.” We are getting this with only one server, which is running Windows Server 2008; we connect from Windows 7 clients. The session itself stays running, you just get disconnected, and you can try to reconnect. Sometimes you get in for a while and then it kicks you out. We have tried connecting using CoRD on a Mac and that works fine, so it's not as if the session itself is corrupted. One problem is that some critical applications are running under the session (I know, let's not discuss the idiocy of that), so we cannot reset the session in any way during the working day – any diagnostics must have minimal impact. Thanks, Jon

    Read the article

  • How to set up a web server with remote SMTP

    - by IP
    I have two servers, both running Server 2008 (R2). One is the web server; the other runs as a mail server. The setup I want is for any mail sent from apps (PHP, ASP and ASP.NET) on the web server to go out through the mail server's SMTP server... but this is proving trickier than I'd hoped. The mail server is running MailEnable, and the web server IIS7 (maybe 7.5). What I don't want is to set up an open-relay SMTP server on the web server, as that is going to be open to abuse (even if I only allow relaying from the local address). The problem is that there doesn't appear to be a way to specify credentials in PHP, so if I point it at the mail server, the mail server has to be set up as an open relay, which is almost worse. Any ideas how I should be doing this?

    Read the article

  • Cygwin/Git Bizarre Terminal Issue

    - by emptyset
    Alright, this is weird. First off, this is mintty running on an up-to-date Cygwin, with git pulled from Cygwin's setup.exe. I am running zsh.

      $ git clone https://<user>@<domain>/<repository>/ ~/src/project/dev
      Initialized empty Git repository in /cygdrive/c/src/project/dev/.git/
      Password: <actual password in plain text appears>
      # Nothing happens... ^C
      $ <password text that I just typed>
      zsh: command not found: <same password text>

    What is going on here? Is this a terminal problem, a shell problem, a git problem, or a Cygwin problem? Update: Yes, I'm running the Cygwin git version, not the Windows version:

      $ which git
      /usr/bin/git
      $ git --version
      git version 1.7.1
      $ /cygdrive/c/Program\ Files\ \(x86\)/Git/bin/git.exe --version
      git version 1.7.0.2.msysgit.0
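
    One low-tech workaround, assuming the hang is simply the HTTPS password prompt never reaching git under mintty, is to let git's curl-based HTTPS transport read the credentials from ~/.netrc instead of prompting. The machine, login and password values below are placeholders:

      # Sketch: store the HTTPS credentials where libcurl (git's https transport) will find them.
      # <domain>, <user> and the password are placeholders for the real values.
      printf 'machine <domain>\nlogin <user>\npassword <your-password>\n' >> ~/.netrc
      chmod 600 ~/.netrc   # keep the file private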

    Read the article

  • Nagios Wouldn't Start, Now Won't Stop!

    - by Bart B
    I ran an update on a CentOS server running Nagios; after the update, Nagios failed to start. The error in the logs was: Failed to obtain lock on file /var/run/nagios.pid: Permission denied. So I checked, and there was no pid file for Nagios in /var/run. I created one and gave it the following permissions: -rwxr--r-- 1 nagios nagios 6 May 31 11:58 nagios.pid. Nagios then started and seems to be running normally. The only problem is that it now refuses to stop, so I can't restart it to add new servers and services to be monitored! When I issue the command "service nagios stop", I get [FAILED], but nothing at all gets written to the log, and the service stays up. Any ideas on how I can get the service to stop now? I'm running the RPM version, installed via yum from the RPMForge repositories. The server is CentOS 5.5.
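
    If the init script can no longer match its pid file to the running daemon, one way out is to stop the process by hand and then make the pid file location and ownership agree with what nagios.cfg expects. The paths below are the usual RPMForge defaults and may differ on your system:

      # stop the daemon manually, since "service nagios stop" no longer can
      kill $(cat /var/run/nagios.pid) 2>/dev/null || pkill -u nagios nagios

      # confirm where the config expects the lock/pid file to live
      grep ^lock_file /etc/nagios/nagios.cfg

      # make sure the file is owned by the nagios user before starting again
      touch /var/run/nagios.pid && chown nagios:nagios /var/run/nagios.pid
      service nagios start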

    Read the article

  • Which are the most important directories to back up on a Linux server?

    - by QAH
    Hello everyone! I'm running an Ubuntu 9.10 Linux server. I'm trying to find a way to back up the machine while it is running, and from what I can see that rules out the disk-cloning utilities: all of the disk-cloning tools I have seen for Linux require rebooting into a special live CD. So my question is this: what is the best solution for backing up the system while it is running? Also, I don't care much about the OS configuration; I just want to keep my stored files and the programs I have installed. Thanks
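
    A minimal sketch of a live backup, assuming the usual places user data and service configuration end up on an Ubuntu server (the destination path is a placeholder; adjust the directory list to the machine):

      # copy the directories that typically hold data and installed-software state while the system is running
      rsync -aAX --delete /etc /home /root /srv /var/www /opt /usr/local /backup/myserver/

      # a list of installed packages makes the "programs I have installed" part easy to restore later
      dpkg --get-selections > /backup/myserver/package-selections.txt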

    Read the article

  • 64-bit on Core i5 with 2GB DDR3 RAM?

    - by Jacques
    Core i5 2.3 GHz processor, 512 MB ATI HD 4570, 2 GB 1333 MHz RAM, 64-bit Windows 7 Home Premium. Should I "downgrade" to 32-bit? Does running 64-bit with only 2 GB of RAM make the laptop weaker than running 32-bit with 2 GB of RAM, or is the performance pretty much the same? Is there any performance benefit to running 64-bit with only 2 GB of RAM? Is there an impact on battery life between 64-bit and 32-bit? Should I maybe just add another 2 GB of RAM? Thanks.

    Read the article

  • How to prevent unison from synchronizing a file while it is still uploading

    - by user134600
    I use CentOS 5.8 Final. I am running unison from cron with the entry below:

      */1 * * * * /usr/bin/unison > /dev/null 2>&1

    and a default profile like this:

      root = /var/www
      root = ssh://web02.example.com//var/www
      auto=true
      batch=true
      confirmbigdel=true
      fastcheck=true
      group=true
      owner=true
      prefer=newer
      silent=true
      times=true

    So the www folder is synchronized every minute. My problems are: I upload a file bigger than 10 MB to www from a client as user1, where user1 owns the www folder. While the file is still uploading, unison runs in that minute and suddenly the owner of the uploaded file changes to root:root. Likewise, when I edit a file in the www folder and save it while unison is running, the file owner changes to root:root when it should be user1:user1. Does anyone know about this problem?
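
    Two mitigations are often combined in this situation, sketched here on the assumption that the profile shown above is ~/.unison/default.prf and that uploads arrive with a temporary suffix: stop overlapping runs with flock, and tell unison to ignore files that are still being uploaded.

      # cron entry: flock prevents a new unison run from starting while the previous one is still copying a large file (lock path is an example)
      */1 * * * * flock -n /var/lock/unison.lock /usr/bin/unison > /dev/null 2>&1

      # added to the unison profile: skip in-progress uploads, assuming the upload client writes them with a ".part" suffix (an example pattern)
      ignore = Name *.part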

    Read the article

  • Mysql-proxy compile in CentOS

    - by gtfx
    Hey, while trying to compile mysql-proxy following the instructions here, I get the following error:

      Libtool library used but `LIBTOOL' is undefined
      The usual way to define `LIBTOOL' is to add `AC_PROG_LIBTOOL'
      to `configure.in' and run `aclocal' and `autoconf' again.
      If `AC_PROG_LIBTOOL' is in `configure.in', make sure
      its definition is in aclocal's search path

    Libtool is installed from source. Running aclocal gives no error. Running ./configure:

      ./configure: line 5821: AC_DISABLE_STATIC: command not found
      ./configure: line 5823: AC_PROG_LIBTOOL: command not found
      checking shared library path variable... configure: error: eval "libtool --config | grep ^shlibpath_var" failed

    Running the libtool command:

      libtool --config | grep ^shlibpath_var
      shlibpath_var=LD_LIBRARY_PATH

    What am I missing? Thank you for your time.
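
    The unexpanded AC_PROG_LIBTOOL/AC_DISABLE_STATIC macros usually mean aclocal cannot see libtool's m4 files, which happens when libtool was installed under /usr/local but aclocal only searches the system macro directory. A sketch of two common fixes, assuming a stock CentOS box:

      # Option 1: use the distro libtool so the m4 macros land where aclocal already looks
      yum install -y libtool
      autoreconf --force --install    # re-runs aclocal, libtoolize, autoconf and automake

      # Option 2: keep the source-built libtool, but point aclocal at its macro directory
      # (the /usr/local prefix is an assumption about where it was installed)
      aclocal -I /usr/local/share/aclocal
      autoreconf --force --install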

    Read the article

  • Using FastCGI for PHP on Mac OS X

    - by DanieL
    I have apache2 running on a Mac OS X (10.6) machine, and it is currently serving PHP pages fine using php5_module, but I would like to configure fastcgi_module to handle the PHP pages instead. I have tried the configuration found on www.fastcgi.com, but I get the following errors:

      [warn] FastCGI: (dynamic) server "/Path/to/script.php" has failed to remain running for 30 seconds given 3 attempts, its restart interval has been backed off to 600 seconds
      [warn] FastCGI: server "/usr/bin/php" has failed to remain running for 30 seconds given 3 attempts, its restart interval has been backed off to 600 seconds

    I'm thinking this is because PHP has not been compiled with FastCGI support, but seeing as it came with Mac OS X, I'm not sure how to recompile it. Is this the problem? And if so, how do I recompile PHP with FastCGI?
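
    Before recompiling anything it is worth checking which PHP binaries 10.6 already provides: /usr/bin/php is the command-line SAPI and will not stay resident under mod_fastcgi, whereas a CGI/FastCGI-capable binary (php-cgi, if present) is what the FastCGI server line should point at. A quick check, assuming the stock layout:

      # which SAPI does each binary report? The CLI build is not usable as a FastCGI server.
      /usr/bin/php -v          # expect "(cli)"
      ls -l /usr/bin/php-cgi   # does a CGI/FastCGI build exist at all?
      /usr/bin/php-cgi -v      # if present, expect "(cgi-fcgi)"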

    Read the article

  • Exchange 2013 Virtual Machine: Backup just mailboxes and clear logs

    - by Ben Curtis
    I have a Windows Server 2012 machine running Exchange 2013, itself running as a KVM virtual machine. For my VM guests I do full image-based backups from the host, so that I can quickly restore to any host server simply by copying over the disk image files. This means I don't need a nightly full system backup. That being said, without running a VSS full backup the Exchange logs get massive (specifically the performance logs, which grow by about 500 MB a day). In addition, I would also like a nightly backup of just the mail database. What is the best way to accomplish this? A full backup of the C:\Program Files\Microsoft\Exchange Server\V15 folder, as suggested in one tutorial, did not clear out the logs. Thanks, Ben

    Read the article

  • Having problems with connecting to/seeing the local SQL server with Microsoft SQL Server Management Studio

    - by Hans-Henrik
    I'm having some difficulties trying to connect to my local SQL Server. I'm pretty sure the server is running (many of the other topics on this subject suggest that the services might not be running, so I looked into it, but they do seem to be running). But when I try to access it through Microsoft SQL Server Management Studio, it doesn't seem to be able to find it.

      Server type: Database Engine
      Server name: ILIZANESQL* - I'm trying to "browse for more..." to find my server, but it doesn't show up
      Authentication: Windows Authentication

    Read the article

  • What are the limitations of virtual machines?

    - by j-g-faustus
    I'm considering setting up a virtual machine running Windows, with Ubuntu 10.10 as the host OS, for those cases where I have a Windows-only program. I understand that using a VM costs some performance, but are there other limitations to what the OS in a virtual machine can do compared to "running on bare metal"? For example:

    Can a VM play games, like Dragon Age: Origins or Civilization V? (Possibly with poorer framerates and/or lower resolution, but does it play at all?)
    Can a VM rip DVDs/Blu-rays using AnyDVD or a similar Windows program?
    Can a VM handle new hardware that requires dedicated drivers, where the drivers are only available for the OS running inside the VM? (e.g. a graphics card, digital camera, or a card reader for smart-card authentication.)

    Is it possible to say anything about "general limitations" of VMs, or is this wholly dependent on the specific VM?

    Read the article

  • Intel CPU hyper-threading on or off for IBM DB2?

    - by rtorti19
    Has anyone ever done any database performance comparisons with hyper-threading enabled vs. disabled? We are running IBM DB2 and I'm curious whether anyone has any recommendations for enabling hyper-threading or not. With hyper-threading enabled it becomes quite difficult to do capacity planning for CPU usage. For example: with 8 physical cores represented as 16 "threads" on the OS and a CPU-bound workload, does that mean that when your CPU usage hits 50% you are actually running at 100%? What real benefits do I gain from leaving hyper-threading enabled on an Intel server running DB2? Does hyper-threading help if your workload is truly disk-I/O bound? If so, up to what percentage? These are the types of questions I'm trying to answer. Any thoughts?

    Read the article

  • Why does Task Scheduler NOT re-run successfully completed tasks

    - by Teo
    I am using Task Scheduler on Windows 2008 x64. I have 3 tasks, running every night at different times without overlapping. It works for some days - usually 2-3, up to 10 (it's really random) - then it stops running the tasks. When I look at the history, I see that the tasks completed successfully. In the UI, the "Next Run Time" column stays empty. The tasks are set to run in the background; the account that runs them is a domain account, and it is valid and enabled. When I check with Process Explorer, there are no left-over processes associated with my tasks. I am completely baffled by what's going on.

    Read the article

  • I can't do a Remote Assistance session to a Windows XP box from Windows 7.

    - by superkinhluan
    My Mom's computer is running Windows XP, and my desktop is running Windows 7. She's having some technical issue, so I want to start a Remote Assistance session to her machine. However, no matter what I've tried, the Remote Assistance program doesn't connect successfully. I've verified that the Windows Firewall (on both my machine and hers) is configured properly to allow the Remote Assistance program through. What's interesting is that I have the same problem when I try to do Remote Assistance from my desktop to my laptop, which is also running Windows XP. However, when I try to connect to my girlfriend's machine, which is running Windows 7, the connection is successful. So in the end, I guess there must be some incompatibility between Windows 7 and Windows XP. Has anyone experienced the same issue? How did you resolve it?

    Read the article

  • Win2008 DC in a Windows 2000 domain: can I keep the old DC?

    - by gravyface
    I will be putting a new Windows 2008 SE server into a single-domain network with two domain controllers, both running Windows 2000 Server. The functional level of the domain is mixed mode/2000. Until a second 2008 DC can be purchased, I'd like to leave the current Win2k operations master DC as a backup DC, since the other member servers (running 2003) have either accounting/SQL or Exchange on them. Eventually all the Win2k servers will be decommissioned, but until then I need another DC for redundancy. Following the standard process for adding a new DC, can I leave the old operations master DC (or the other backup DC) running after I transfer the FSMO roles to the new server? Will this cause any issues?
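
    After moving the roles it is easy to confirm where they ended up before deciding what to do with the old DCs; a quick check from any domain-joined 2003/2008 machine with the AD support tools installed (the DC name below is a placeholder):

      :: list which DC currently holds each of the five FSMO roles
      netdom query fsmo

      :: verify the new DC is advertising and replication is healthy
      dcdiag /s:NEW-DC-NAME /test:advertising
      repadmin /replsummary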

    Read the article

  • Chef bootstrap giving 401 unauthorized

    - by loddy1234
    I'm trying to bootstrap a new Chef node by running:

      knife bootstrap <server ip> -x lewis -N gitlab --sudo

    But I get the following output:

      [Mon, 03 Sep 2012 14:45:17 +0000] INFO: *** Chef 10.12.0 ***
      [Mon, 03 Sep 2012 14:45:17 +0000] INFO: Client key /etc/chef/client.pem is not present - registering
      [Mon, 03 Sep 2012 14:45:17 +0000] INFO: HTTP Request Returned 401 Unauthorized: Failed to authenticate. Ensure that your client key is valid.
      [Mon, 03 Sep 2012 14:45:17 +0000] FATAL: Stacktrace dumped to /var/chef/cache/chef-stacktrace.out
      [Mon, 03 Sep 2012 14:45:17 +0000] FATAL: Net::HTTPServerException: 401 "Unauthorized"

    My Chef server is running Ubuntu 12.04 x32 and the machine I'm trying to bootstrap is running CentOS 6.3 x64. Any idea what's going wrong?
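
    A 401 at the "registering" step usually means the validation key on the workstation does not match the one the Chef server expects, or that the clocks are far enough apart to break request signing. A few checks, sketched with the default Chef 10 file locations (adjust paths to your setup):

      # does the workstation's own key still authenticate? (rules out a broken server)
      knife client list

      # does knife.rb point at a validation key that actually matches the server's?
      grep -E 'validation_key|validation_client_name' ~/.chef/knife.rb

      # clock skew of more than a few minutes breaks Chef's signed requests -- compare both machines
      date -u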

    Read the article

  • Why would VMWare go defunct? How to recover from/prevent it?

    - by Josh
    I am running VMWare Server 2.0.2 (Build 203138) on a dual-core Intel i5 with an Ubuntu Server 10.04 LTS system (kernel 2.6.32-22-server #33-Ubuntu SMP). The disk subsystem is a software RAID5 array. The system has been set up for a little over a week. For the past 5 days I have been running at least 3 VMs (Linux and a variety of Windows OSes) with no issues whatsoever. But while I was installing Linux onto a new VM, suddenly all VMs became unresponsive, including the one I was installing to. I could not log in to the VMWare Management Interface, and the system was somewhat unresponsive via SSH. When I looked at top, I saw:

      top - 16:14:51 up 6 days, 1:49, 8 users, load average: 24.29, 24.33, 17.54
      Tasks: 203 total, 7 running, 195 sleeping, 0 stopped, 1 zombie
      Cpu(s): 0.2%us, 25.6%sy, 0.0%ni, 74.3%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
      Mem: 8056656k total, 5927580k used, 2129076k free, 20320k buffers
      Swap: 7811064k total, 240216k used, 7570848k free, 5045884k cached

        PID USER  PR  NI  VIRT  RES  SHR S %CPU %MEM     TIME+ COMMAND
      21549 root  39  19     0    0    0 Z  100  0.0  15:02.44 [vmware-vmx] <defunct>
       2115 root  20   0     0    0    0 S    1  0.0 170:32.08 [vmware-rtc]
       2231 root  21   1 1494m 126m 100m S    1  1.6 892:58.05 /usr/lib/vmware/bin/vmware-vmx -# product=2;
       2280 jnet  20   0 19320 1164  800 R    0  0.0  30:04.55 top
      12236 root  20   0  833m  41m  34m S    0  0.5  88:34.24 /usr/lib/vmware/bin/vmware-vmx -# product=2;
          1 root  20   0 23704 1476  920 S    0  0.0   0:00.80 /sbin/init
          2 root  20   0     0    0    0 S    0  0.0   0:00.01 [kthreadd]
          3 root  RT   0     0    0    0 S    0  0.0   0:00.00 [migration/0]
          4 root  20   0     0    0    0 S    0  0.0   0:00.84 [ksoftirqd/0]
          5 root  RT   0     0    0    0 S    0  0.0   0:00.00 [watchdog/0]
          6 root  RT   0     0    0    0 S    0  0.0   0:00.00 [migration/1]

    The VMWare process for the virtual machine I was installing to became a zombie. Yet it was still consuming 100% of the CPU time on one of the cores, and I couldn't reach it or any other virtual machine. (I was logged in to one virtual machine over SSH, another via X11, and a third via VNC. All three connections died.) When I ran ps -ef and similar commands, I found that the defunct vmware-vmx process had its parent PID set to init (1). I also used lsof -p 21549 and found that the defunct process had no open files. Yet it was using 100% of CPU time... I was unable to kill any vmware-vmx processes, including the defunct one, even with kill -9. As a last resort I tried to reboot the box; however, shutdown, halt, reboot, and init 6 all failed to reboot or shut down the machine, even when given the appropriate --force options. Ctrl-Alt-Del produced a message about rebooting on the console, but the system would not reboot. I had to hard power-cycle the box to resolve the situation. (See my other question, "Should I worry about the integrity of my linux software RAID5 after a crash or kernel panic?")

    What would cause a scenario like this? What else could I have done to resolve it besides a hard reboot? What can I do to prevent such a situation in the future?
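
    For next time, one low-impact way to see why an unkillable process is stuck is to ask the kernel to dump task states via SysRq (assuming kernel.sysrq is enabled); the output lands in dmesg/syslog and can be collected without touching the running VMs:

      # show blocked (D-state) tasks and their kernel stacks
      echo w > /proc/sysrq-trigger
      # show all tasks' kernel stacks (much more verbose)
      echo t > /proc/sysrq-trigger
      dmesg | tail -n 100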

    Read the article

  • Can only see part of the screen when connecting via NX, how can I control NX size?

    - by Oak
    I'm using NoMachine's NX client on OS X 10.7.2 to connect to an NX server running on Ubuntu 10.04.3. I chose the Gnome desktop when creating the connection. The NX screen that opens is smaller than the Gnome screen running "behind" it: it looks like it's 1024x768, while the screen "behind" it appears to be running at 1680x1050. The result is naturally very inconvenient (e.g. I can't see the taskbar or click the lower options in popup menus). In the past I've used the NX client successfully to connect and see the entire screen. Resizing the NX window only resizes the transferred image; it does not let me see more of the Gnome desktop. Playing with the NX client display options has the same effect. How can I control the size of the NX client and make it show the entire Gnome desktop?

    Read the article

  • Should I Use PHP as FastCGI?

    - by Synetech inc.
    Hi, I am running an Apache web server on my Windows machine. It is not generally a public server (most of the small amount of traffic comes from the machine itself, and most of the public traffic comes from crawlers). Basically, it is mostly just for use as a test-bed/development system. I have read about how running PHP as FastCGI is better (i.e. faster and more stable) than running it as an Apache module. However, I really don’t like the idea of multiple PHP.exe processes (I don’t like that Apache has two processes, and I’m not even too thrilled with Chromium’s multi-process model). So I’m wondering if it would be worthwhile to change PHP to FastCGI for this scenario. If it is, how would I configure it? Pretty much all of the information I have seen has been either for non-Windows or for IIS. As I said, I’m running Windows+Apache. Thanks a lot.
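
    For Apache on Windows the usual route is mod_fcgid rather than the older mod_fastcgi. A minimal httpd.conf sketch, assuming mod_fcgid is installed and PHP lives in C:/php (both are assumptions - adjust the paths):

      # load mod_fcgid and hand .php requests to the PHP CGI binary
      LoadModule fcgid_module modules/mod_fcgid.so
      <IfModule fcgid_module>
          AddHandler fcgid-script .php
          FcgidInitialEnv PHPRC "C:/php"
          FcgidWrapper "C:/php/php-cgi.exe" .php
          # keep the number of resident php-cgi.exe processes small on a dev box
          FcgidMaxProcesses 3
      </IfModule>
      # note: the DocumentRoot's <Directory> block also needs "Options +ExecCGI"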

    Read the article
