Search Results

Search found 59084 results on 2364 pages for 'local system'.

Page 530/2364 | < Previous Page | 526 527 528 529 530 531 532 533 534 535 536 537  | Next Page >

  • "killed" message from cron.daily, but not when run from command line

    - by Dan Stahlke
    On Fedora 17, I put a file into /etc/cron.daily with the following contents:

      cd /
      su dstahlke /home/dstahlke/bin/anacron-daily.sh
      exit 0

    For some reason, I get a mail every day that just says /etc/cron.daily/dstahlke-daily: ...killed. I tried with and without the exit 0 line above (I noticed that some system scripts have it and others don't; I'm not sure of its purpose). Running /etc/cron.daily/dstahlke-daily from the command line as root produces no ...killed message. Other than the message, everything seems to work fine. Putting set -x in the above script, as well as in the /home/dstahlke/bin/anacron-daily.sh script, shows that the ...killed message appears just after the latter script terminates (or perhaps just after the su command finishes). What causes the ...killed message? Or is there a more acceptable way to have anacron run a user script daily? I figured that putting this in /etc/cron.daily would help the system coordinate all of the daily tasks rather than potentially running my task concurrently with the system tasks.
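
    One variation worth trying, offered as a sketch rather than a verified fix: hand su a single command string with -c and force a known shell with -s, so the user's job runs as one child command instead of an interactive su session. The paths are the ones from the question; everything else is an assumption.

      #!/bin/sh
      # /etc/cron.daily/dstahlke-daily -- run one user's daily job from the system cron.daily
      # sketch only: -s forces the shell, -c passes the job as a single command string
      cd /
      su -s /bin/bash -c '/home/dstahlke/bin/anacron-daily.sh' dstahlke
      exit 0

    Alternatively, a per-user crontab entry (crontab -e as dstahlke) such as "@daily /home/dstahlke/bin/anacron-daily.sh" avoids su altogether, at the cost of not being serialized with the other cron.daily jobs.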

    Read the article

  • How to update a keyboard layout DLL on Windows?

    - by user87690
    Information about keyboard layouts is stored in keyboard layout DLLs on Windows. One can make a custom layout using a tool like Microsoft Keyboard Layout Creator (http://msdn.microsoft.com/en-us/goglobal/bb964665.aspx). It creates the dynamic library and also an installer which places the DLL in the right place and puts the necessary information into the registry so the layout can be used. Now my question: how can one update a custom layout? One could uninstall the layout and install the new version, which effectively replaces the DLL file. However, this alone doesn't work, because it seems that the old copy of the DLL stays loaded somewhere and is used when one selects the layout, instead of the new version being loaded. So is there a way to tell the system that its “keyboard layout cache” is invalid? Of course I could reboot the whole system, but I'd like to avoid that, as it's annoying and it seems poor design to reboot the whole system just to reload a DLL.

    Read the article

  • Workflow Automation software for SVN

    - by KyleMit
    We're currently using IBM's ClearQuest for task management and ClearCase for change management. They plug and play very well with each other. Users can create tasks in ClearQuest as defects and enhancements, and developers can use those tasks to check out and modify code in source control. We're looking to upgrade to a better, more modern source control system, like SVN, although we're not married to that as our source control system. There are loads of source control systems out there, but I'm having difficulty finding one that also includes the ability to have users enter tasks and track them, especially in a way native to the source control system itself. Are there any products that replace ClearQuest for systems like SVN? Are there any other cheap / open source application pairs that handle both sides of the coin?

    Read the article

  • Does Wake-on-LAN from power state S5 require any OS configuration?

    - by TARehman
    I am configuring an HTPC which I would like to be able to power on using Wake-on-LAN from the S5 state (full shutdown, still plugged in). The system is running Linux Mint 14 Cinnamon. I'm getting some conflicting information in my searching on the net. I am not concerned with using WoL to change the state from standby or hibernate to on. Because of the current interface to our TV, the system must be either turned on or turned off, so this system will cycle from S0 to S5, from S5 back to S0, and so on. Some tutorials suggest that I need to use ethtool to configure things after enabling WoL in my BIOS, but my understanding is that an S5-to-S0 power-on only requires the BIOS to be configured (since when the computer is in state S5, the OS hasn't even been loaded anyway). Can I use WoL with only the BIOS configured to go from state S5 to S0, or will I need to configure the OS as well?
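
    In practice, whether the BIOS alone is enough depends on the NIC and its driver: some adapters lose their WoL arming when the driver brings the link down at shutdown, in which case the OS side has to re-enable it before powering off. A minimal sketch, assuming the wired interface is eth0 and a second Linux machine is available to send the magic packet (the MAC address is a placeholder):

      # on the HTPC: check and enable Wake-on-LAN in the driver (g = wake on magic packet)
      sudo ethtool eth0 | grep Wake-on
      sudo ethtool -s eth0 wol g

      # from another machine on the same LAN, after the HTPC has been shut down to S5
      wakeonlan 00:11:22:33:44:55      # or: sudo etherwake 00:11:22:33:44:55

    The ethtool setting is not persistent across reboots, so if it turns out to be needed it usually goes into something like /etc/rc.local or a networking hook.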

    Read the article

  • Set up Linux box as WAP for MyBookLive?

    - by AcidFlask
    I inherited an old Linux box as well as a MyBookLive and would like to make the MyBookLive available over my wireless network, essentially using the Linux box as a wireless access point. I just wiped the Linux box (home) and installed Ubuntu 12.04 on it. My network setup currently looks like this:

      ISP --- wireless router (192.168.0.1, netmask 255.255.255.0) --- wlan0 on home (192.168.0.12)
                     |                                                         |
              MacBook (192.168.0.11)                                    eth0 on home --- MyBookLive

    so that the MyBookLive is basically a glorified external hard drive. The router does have an Ethernet port, but it is being used by my roommate's computer, so I can't plug the MyBookLive directly into it. Right now I can ping MyBookLive.local and MacBook.local from home, but I am having trouble understanding and figuring out the correct iptables commands to make my MacBook see my MyBookLive through the Bonjour network. Also, I'm not sure if I need to set up DNS to forward xxx.local Bonjour/Zeroconf addresses. I tried the following to forward my entire wired network (which has only my MyBookLive) to a single IP address:

      sysctl net.ipv4.ip_forward=1
      iptables -A FORWARD -i wlan0 -o eth0 -j ACCEPT
      iptables -A FORWARD -i eth0 -o wlan0 -j ACCEPT
      iptables -t nat -A PREROUTING -i eth0 -p tcp -j DNAT --to 192.168.0.66
      iptables -t nat -A PREROUTING -i eth0 -p udp -j DNAT --to 192.168.0.66

    but I can't ping this address from my MacBook. This is probably horribly wrong, but I am a complete noob at setting up this kind of network and could use some expert help with setting this up properly.
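
    For comparison, here is one common shape for this kind of setup when the wired side is NATed behind the wireless side. It is only a sketch under assumptions not in the question: eth0 on home is given its own subnet (say 192.168.2.1/24, with the MyBookLive at 192.168.2.2), and only the usual file sharing ports are forwarded. Note that the PREROUTING rules in the question match traffic arriving on eth0, i.e. coming from the MyBookLive's side, which is the opposite direction from what the MacBook needs.

      # enable forwarding and hide the wired subnet behind home's wireless address
      sysctl -w net.ipv4.ip_forward=1
      iptables -t nat -A POSTROUTING -o wlan0 -j MASQUERADE
      iptables -A FORWARD -i eth0 -o wlan0 -j ACCEPT
      iptables -A FORWARD -i wlan0 -o eth0 -m state --state ESTABLISHED,RELATED -j ACCEPT

      # publish the MyBookLive's file sharing ports on home's wireless IP (AFP 548, SMB 445 assumed)
      iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 548 -j DNAT --to-destination 192.168.2.2
      iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 445 -j DNAT --to-destination 192.168.2.2
      iptables -A FORWARD -i wlan0 -o eth0 -p tcp -d 192.168.2.2 --dport 548 -j ACCEPT
      iptables -A FORWARD -i wlan0 -o eth0 -p tcp -d 192.168.2.2 --dport 445 -j ACCEPT

    Bonjour names will not cross this NAT on their own, since mDNS is link-local multicast: the MacBook would connect to home's address (192.168.0.12) rather than MyBookLive.local, unless something like avahi's reflector mode is used to repeat mDNS between the two interfaces.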

    Read the article

  • How to remotely install Linux via SSH?

    - by netvope
    I need to remotely install Ubuntu Server 10.04 (x86) on a server currently running RHEL 3.4 (x86). I'll have to be very careful, because no one can press the restart button for me if anything goes wrong. Have you ever remotely installed Linux? Which way would you recommend? Any advice on things to watch out for? Update: Thanks for your help. I managed to "change the tires while driving"! The main components of my method are drawn from "HOWTO - Install Debian Onto a Remote Linux System", "grub legacy: Booting once-only, grub single boot and kernel panic reboot", and "Ubuntu Community Documentation: InstallationFromKnoppix". Here is the outline of what I did:

      1. Run debootstrap on an existing Ubuntu server
      2. Transfer the files to the swap partition of the RHEL 3.4 server
      3. Boot into the swap partition (the debootstrap system)
      4. Transfer the files to the original root partition
      5. Boot into the new Ubuntu system and finish up the installation with tasksel, apt-get, etc.

    I tested the method in a VM and then applied it to the server. I was lucky enough that everything went smoothly :)
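
    To make the outline concrete, here is a heavily hedged sketch of the riskiest steps. The device names, partition numbers, and entry number are placeholders, "lucid" is the codename for Ubuntu 10.04, and the once-only boot relies on the savedefault --once form of grub 0.9x described in the linked recipe (present in Debian/Ubuntu builds; an older RHEL grub may need the fallback directive instead).

      # on an existing Ubuntu machine: build a minimal 10.04 system into a directory
      sudo debootstrap --arch i386 lucid /mnt/newroot http://archive.ubuntu.com/ubuntu
      # then copy it onto the remote server's (disabled) swap partition, e.g. /dev/sda2

      # on the remote server: add a grub-legacy entry for the new system, with an automatic
      # reboot on kernel panic so a failed boot falls back by power-cycling itself
      #   title Ubuntu 10.04 (debootstrap)
      #   root (hd0,1)
      #   kernel /boot/vmlinuz-... root=/dev/sda2 ro panic=15
      #   initrd /boot/initrd.img-...

      # with "default saved" in menu.lst, try the new entry exactly once (N = its entry number)
      echo "savedefault --default=N --once" | grub --batch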

    Read the article

  • Users and Groups management on 7 Home Premium

    - by AviD
    Recently I upgraded the home PC from XP Pro to Windows 7 Home Premium. I'm looking for a solution for a few things that seem to be missing from this edition... Since Local Users and Groups is blocked on Home Premium, I can't figure out how to manage groups, or even do anything even slightly advanced with users (basically, create/group/picture is it). net localgroup, net user, net etc. don't seem to work - I'm getting "system error 5". While I'm on the topic, I can't activate (what was once) "Local Security Policy"... Looking for any help, advice, or even a new direction, cuz things is differ'nt on Winnows7... To clarify, I'm looking to do some of the following, which were simple back in XP-land:

      - remote user only (i.e. no local logon)
      - grant special privileges for a specific user
      - grant access to e.g. the C$ share for a specific remote user
      - create custom groups for users, to be able to separate privileges of, say, my wife's from my kids'
      - define quite specifically what each user can do (beyond just standard users)
      - harden the OS (hmm, I guess maybe what I'm looking for is a security hardening guide for 7...?)
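
    One detail worth checking, offered as a hedged suggestion rather than a confirmed fix: "system error 5" from the net commands is an access-denied error, and on Windows 7 these commands generally have to be run from an elevated command prompt (right-click cmd.exe, Run as administrator) even when logged on as an administrator. The group and user names below are only examples; the Local Security Policy snap-in itself stays unavailable on Home Premium regardless.

      net user
      net localgroup
      net localgroup "SomeCustomGroup" /add
      net localgroup "Remote Desktop Users" SomeUser /add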

    Read the article

  • Windows 7 - Windows 8 dual boot installation error

    - by Nikhil
    I am trying to dual boot Windows 8 with Windows 7, but I keep getting an error while selecting the drive to install on. The error is "Setup was unable to create a new system partition or locate an existing system partition. See the Setup log files for more information." What might be the problem? Also, previously I tried to install only Windows 7 on my HDD. I downloaded the ISO from Digital River and made a bootable USB. I got the same error as above. But when I tried to install it via another USB, which had a pirated Windows 7 downloaded from torrents, it gave no error. My system config is: Motherboard - Gigabyte G41 MT S2P, HDD - 160 GB SATA, RAM - 8 GB. Help!!!

    Read the article

  • Can't run .net software on Samsung Omnia Pro B7330

    - by Sam
    I've got a Samsung Omnia Pro B7330 and am unable to run any program built for the .NET Compact Framework on it. My last three smartphones were HTC brand, and all the programs I tried are just simple WinForms apps, which ran on all the HTC phones. But not on the Omnia. Whenever I try to start an app, I get an "unexpected error" / "NotSupportedException" at:

      Microsoft.AGL.Common.MISC.HandleArt()
      System.Windows.Forms.Control._InitInstalce()
      System.Windows.Forms.Control..ctor()
      System.Windows.Forms.Button..ctor()
      MyApp.InitializeComponents()

    The proper Compact Framework 2.0 is already installed in the ROM, and installing CF 3.5 didn't change anything. Can anyone tell me what the problem is with this phone?

    Read the article

  • Rollback driver in Windows via command line

    - by ultrasawblade
    I have a remote system on which I've updated the Nvidia graphics driver, but now RDPDD.dll, which I'm guessing stands for RDP Display Driver, will not load (found this out via Event Viewer). I can use Sysinternals' PsExec to execute commands, though. I'm sure something has gone very wrong, and the user of this system (a CAD engineer) won't be able to use his system normally. I've tried two remote reboots, and that has not resolved the issue. So I would like to use the "Roll Back" option for the Nvidia driver in Device Manager. Is there a command-line way of doing this?
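
    As far as I know there is no exact command-line equivalent of Device Manager's Roll Back button, so what follows is only a hedged workaround sketch: list the third-party driver packages with pnputil, remove the package that the recent update added, and let Windows fall back to whatever older driver is still in the store on the next rescan or reboot. The oemNN.inf name is a placeholder that has to be read off the enumeration output.

      rem enumerate third-party driver packages in the driver store
      pnputil -e
      rem force-delete the package added by the recent update (oemNN.inf is a placeholder)
      pnputil -f -d oemNN.inf

    Whether the older driver actually reactivates depends on what is still present in the driver store, so this is worth trying on a non-critical machine first if at all possible.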

    Read the article

  • Run disk error check on NTFS file?

    - by paulius_l
    I have a feeling that my system hard drive is dying. A benchmark kind of reinforces it. Here is the benchmark of my system hard drive during low system activity: And here is the benchmark of the backup drive: Furthermore, there are some files which I just can't touch, because I get CRC errors, and the hard drive activity spikes to 100% with operating speeds of less than 1 MB/s while working with such files. I haven't yet tried swapping the SATA cable, as I have read this might cause the problems. Anyway, I would like to run some tests on the specific clusters where those files I am interested in are stored. I don't want to do a full chkdsk because it takes a very long time. I would like to either find a utility which runs the disk check directly on the clusters where the file belongs, or a couple of utilities where one tells me the cluster locations and another can check just those locations. How do I check and possibly fix disk errors where the files I am interested in are stored? Edit: S.M.A.R.T. info:

    Read the article

  • Creating a custom view for windows log based on a "Contains {text}" rule

    - by jussinen
    I have a server running Windows Server 2008. I'm using Windows Server auditing to check when, and by which user, a folder is modified, to determine who is making the modifications that are causing problems. I can see the audit log entries in the System log when a change is made. How do I create a Custom View that will return all events from the System log where a certain text (the folder name) is present? The Create Custom View dialog doesn't seem to have that option. I'm not sure whether it's possible via a custom XML query, or whether I'll need to export the System log to CSV and search in Excel. John
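
    A hedged alternative to exporting to Excel: Custom Views do accept a hand-written XML query (the XML tab of the Create Custom View dialog), but the XPath subset Event Viewer uses is limited and, to my knowledge, has no substring matching, which makes filtering on a folder name inside the event data awkward there. Dumping the log with wevtutil and filtering the text output is a workable fallback; the folder name and event count below are placeholders, and if the auditing events actually land in the Security log, substitute Security for System.

      rem dump the newest 500 System-log events as text and keep only those mentioning the folder
      wevtutil qe System /c:500 /rd:true /f:text | findstr /i /c:"ProblemFolder"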

    Read the article

  • What is the difference between du -h and ls -lh?

    - by PeanutsMonkey
    I am having a difficult time grasping the correct way to read the size of files, since each command gives you varying results. I also came across a post at http://forums.devshed.com/linux-help-33/du-and-ls-generating-inconsistent-file-sizes-42169.html which states the following: "du gives you the size of the file as it resides on the file system (i.e. it will always give you a result that is divisible by 1024). ls will give you the actual size of the file. What you are looking at is the difference between the actual size of the file and the amount of space on disk it takes (also called file system efficiency)." What is the difference between "as it resides on the file system" and the "actual size" of the file?
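
    A quick way to see the distinction is to run both commands, plus du's --apparent-size switch, on the same files. The sizes below are illustrative, but the relationship is the point: ls -l reports the apparent size (how many bytes the file claims to contain), while du reports how much disk is actually allocated to it, which is rounded up to whole filesystem blocks and can be larger (small files) or smaller (sparse files) than the apparent size.

      # a 1-byte file still occupies a whole block on disk
      printf 'x' > tiny.txt
      ls -lh tiny.txt                  # apparent size: 1 byte
      du -h tiny.txt                   # allocated size: e.g. 4.0K (one block)
      du -h --apparent-size tiny.txt   # back to the byte count ls reports

      # a sparse file shows the opposite: large apparent size, little allocation
      truncate -s 100M sparse.img
      ls -lh sparse.img                # 100M apparent
      du -h sparse.img                 # close to 0 actually allocated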

    Read the article

  • JFFS2 poor mount performance

    - by Marcin Polkowski
    I run multiple ARM boards with Debian Linux installed. Each board is equipped with 512 MB of NAND memory. I've observed that after ~3 months of continuous running, the booting time increased significantly - it takes over 3 minutes to mount the filesystem (JFFS2). The system was using about 35% of available storage, so I removed unnecessary files (got down to ~18%), but this didn't change anything. Then I realized that my software produces directories that are left empty, so I removed ~500 empty and unnecessary dirs. This didn't help either. After the system is started, I see the JFFS2 garbage collector (jffs2_gcd_mtd4) running and occupying over 90% of the CPU. Now my question: is there a way to "optimize" a JFFS2 filesystem for better performance - faster booting (my system has a limited time to boot up)? It would be great if this optimization could be done remotely - I have no physical access to the boards.
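
    One avenue that is often suggested for slow JFFS2 mounts, offered here only as a hedged sketch since it means rebuilding the image and requires kernel support (CONFIG_JFFS2_SUMMARY): the mount-time scan of every erase block is what makes large, well-filled JFFS2 partitions slow to mount, and images post-processed with sumtool from mtd-utils carry per-block summaries the kernel can read instead. The erase block size below (128 KiB) is a placeholder that has to match the actual NAND.

      # build the root image, then append erase-block summaries to speed up mounting
      mkfs.jffs2 -r rootfs/ -e 128KiB -o rootfs.jffs2
      sumtool -i rootfs.jffs2 -o rootfs.sum.jffs2 -e 128KiB

    Doing this on a live root filesystem with no physical access is risky; it is more realistic for boards that can be re-flashed from a recovery partition, so it may not fit the remote-only constraint.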

    Read the article

  • CPU and HD degradation on source-based Linux distribution

    - by danilo2
    I have wondered for a long time whether source-based Linux distributions, like Gentoo or Funtoo, are "destroying" your system faster than binary ones (like Fedora or Debian). I'm talking about CPU and hard drive degradation. Of course, when you're updating your system, it has to compile everything from source, so it takes longer and your CPU is used under harder conditions (it is warmer and more heavily loaded). Such systems compile hundreds of packages weekly, so does it really matter? Does such a system degrade faster than binary-based ones?

    Read the article

  • Unable to install vlc and mplayer after update on fedora 18

    - by mahesh
    I just updated Fedora 18 using:

      # yum update

    Then if I try:

      # rpm -ivh http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-stable.noarch.rpm

    I get:

      Retrieving http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-stable.noarch.rpm
      warning: /var/tmp/rpm-tmp.0K5pWw: Header V3 RSA/SHA256 Signature, key ID 172ff33d: NOKEY
      error: Failed dependencies:
              system-release >= 19 is needed by rpmfusion-free-release-19-1.noarch

    So I tried installing vlc from the development version:

      # rpm -ivh http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-rawhide.noarch.rpm

    I get:

      Retrieving http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-rawhide.noarch.rpm
      warning: /var/tmp/rpm-tmp.WZC0gw: Header V3 RSA/SHA256 Signature, key ID 6446d859: NOKEY
      error: Failed dependencies:
              system-release >= 21 is needed by rpmfusion-free-release-21-0.1.noarch

    There's no system release after 20. What does this mean?
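
    It means the "stable" and "rawhide" packages currently point at RPM Fusion release packages built for newer Fedora versions (19 and 21), which depend on a newer system-release than Fedora 18 provides. A hedged sketch of the usual workaround is to install the release package that matches the installed Fedora version explicitly; the URL pattern below follows RPM Fusion's published instructions but is worth verifying before running.

      # as root: install the repo package matching Fedora 18, then pull the players via yum
      yum localinstall --nogpgcheck http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-18.noarch.rpm
      yum install vlc mplayer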

    Read the article

  • Users are getting a temporary profile

    - by Serhiy
    A bit about the current setup: the AD servers are Windows 2008 R2 (all of them), and there are a couple of locations which are set up as Sites. Each location has DFS on its AD server. Roaming profiles are not used nor configured. Users have their home folder configured as a mapped S: drive pointing to a DFS shared folder. For example, in the profile tab a user has: Home Folder - Connect - S: to \\domain.com\dc\users\%username%. We have also redirected the Desktop, Documents and Downloads folders to \\domain.com\dc\users. Everything was fine. Suddenly (today), users in most locations lost their local profile (on both XP and W7 desktops) and got temporary profiles. Also, it looks like the local profile was created today (judging from the folder properties). I checked the events on a couple of machines and there are no errors related to profiles or the logon process. I don't see issues in the event logs on the servers either. Basically, I have run out of ideas about what is wrong and why the machines lost their local profiles. PS: Laptop users do not have their folders redirected, but lost their profiles as well.

    Read the article

  • PowerShell does not like command

    - by Campo
    Any ideas what I did wrong here? I copied this script from a tutorial and get this error....

      PS C:\Windows\system32> Get-Service | Where-Object ($_.status -eq "running")
      Where-Object : Cannot bind parameter 'FilterScript'. Cannot convert value "False" to type
      "System.Management.Automation.ScriptBlock". Error: "Invalid cast from 'System.Boolean' to
      'System.Management.Automation.ScriptBlock'."
      At line:1 char:27
      + Get-Service | Where-Object <<<< ($_.status -eq "running")
          + CategoryInfo          : InvalidArgument: (:) [Where-Object], ParameterBindingException
          + FullyQualifiedErrorId : CannotConvertArgumentNoMessage,Microsoft.PowerShell.Commands.WhereObjectCommand
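
    The error itself points at the likely culprit: with parentheses, $_.status -eq "running" is evaluated immediately (to a Boolean, with $_ empty at that point) and the result is handed to Where-Object's -FilterScript parameter, which expects a script block. Wrapping the comparison in braces instead should behave as the tutorial intended; this is standard PowerShell syntax rather than anything specific to this machine.

      Get-Service | Where-Object { $_.Status -eq "Running" }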

    Read the article

  • Installing VirtIO drivers in Windows Server 2008

    - by Stefan K.
    We are running a Windows Server 2008 system as a guest on a Linux KVM virtual server (SLES 11, with VirtIO support). We have trouble with the system's performance, and this is possibly due to not using the VirtIO drivers. I don't have much experience with either KVM or VirtIO; I just heard that this could be the reason for our problem. Questions: The install examples I have found describe how to install the drivers during Windows setup. Is it possible to install the VirtIO drivers later? We have software running on that system and would like to avoid reinstalling and setting all of it up again. I already found a page describing how to sign the drivers, which seems to be needed. A good tutorial page (step-by-step instructions) would be nice. Is there anything like that out there?

    Read the article

  • error while installing binutils in LFS

    - by user53347
    lfs:/mnt/lfs/sources/binutils-build$ ../binutils-2.15.94.0.2.2/configure \
        --target=$LFS_TGT --prefix=/tools \
        --disable-nls --disable-werror
    loading cache ./config.cache
    checking host system type... i686-pc-linux-gnuoldld
    checking target system type... i686-lfs-linux-gnu
    checking build system type... i686-pc-linux-gnuoldld
    checking for a BSD compatible install... /usr/bin/install -c
    checking whether ln works... yes
    checking whether ln -s works... yes
    checking for gcc... no
    checking for cc... no
    configure: error: no acceptable cc found in $PATH
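
    The last two "checking for gcc/cc... no" lines are the real message: configure cannot find any C compiler in the PATH of the user running the build, so either the host system has no compiler installed or the lfs user's PATH does not include it. A hedged sketch of what to check; the package name assumes a Debian/Ubuntu host, other hosts differ.

      # is any compiler visible to the current user?
      type gcc cc
      echo $PATH

      # if the host genuinely has no compiler, install one (Debian/Ubuntu example)
      sudo apt-get install build-essential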

    Read the article

  • Auto Launching PHP-FPM

    - by Seth
    My plist file:

      <?xml version='1.0' encoding='UTF-8'?>
      <!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd" >
      <plist version='1.0'>
      <dict>
        <key>Label</key><string>org.macports.php-fpm</string>
        <key>ProgramArguments</key>
        <array>
          <string>/opt/local/bin/daemondo</string>
          <string>--label=php-fpm</string>
          <string>--start-cmd</string>
          <string>/opt/local/sbin/php-fpm</string>
          <string>;</string>
          <string>--pid=fileauto</string>
          <string>--pidfile</string>
          <string>/opt/local/var/run/php-fpm/php-fpm.pid</string>
        </array>
        <key>Debug</key><false/>
        <key>Disabled</key><true/>
        <key>OnDemand</key><false/>
      </dict>
      </plist>

    After rebooting, it's not loading up automatically. I still have to manually start php-fpm. I have tried unloading, adding RunAtLoad, etc. with no luck, and tried both these launchctl commands:

      sudo launchctl load -F /Library/LaunchDaemons/org.macports.php-fpm.plist
      sudo launchctl load -w /Library/LaunchDaemons/org.macports.php-fpm.plist
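
    Two things stand out, offered as hedged suggestions rather than a confirmed diagnosis: the plist carries <key>Disabled</key><true/>, which tells launchd not to start the job at boot unless an override says otherwise, and MacPorts normally manages that override through its own tooling. A sketch of what to try; the port name is a guess and may be something like php54-fpm on a given install.

      # let MacPorts register and enable the daemon itself
      sudo port load php-fpm

      # after the next reboot, check whether launchd picked the job up at all
      sudo launchctl list | grep org.macports.php-fpm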

    Read the article

  • Server 2003 and SSL Certificates

    - by Keith Stokes
    I have a Windows 2000 domain with dozens of Windows 2000 servers and a few 2003 servers. Each server runs a custom app talking to a 3rd party utilizing self-signed certificates. To help with troubleshooting, we've created a custom test app. The 2000 servers are able to talk within seconds. The 2003 servers take anywhere from 10-30 seconds using a domain account, and much less - usually under 5 seconds - using a local account. The only exception to the local account performance is a new account, which is slow initially and then faster. If you leave the test app open and reconnect repeatedly, it talks in seconds. If you leave it open for somewhere between 1 and 2 hours, it reverts back to the previous 10 seconds, so obviously something is being cached. Installing the destination certificates in the local 2003 server store makes no difference. I've installed the certificates in AD, and that apparently makes domain accounts work in 9-12 seconds, vs the 30 seconds that was usual before. Manually clearing the certificate store on the 2003 server makes no difference. I'm at a loss as to where the certs might be cached, and whether I'm using some sort of domain certificate store that's hiding from me.

    Read the article

  • Can't access server sound card when vnc'd into ubuntu server

    - by Corey Kennedy
    I've set up my Ubuntu 10 server with xfce, nxserver, and now tightvncserver so that I can control it remotely from my Windows 7 laptop. NX is working fine for remote access, but when I run (for example) exaile, no sound is sent through the server's sound card. I installed tightvncserver and connected, but ran into the same problem. Exaile opens, sound isn't muted, and I can see that sound cards are installed (via cat /proc/asound/cards), but I can't seem to get the remote sessions to access the server's sound card. Also, just to confirm that the sound card was working, I hooked up a monitor/keyboard to the server and opened a local xfce session. That worked fine. While I had the local session running, I was also able to open a remote session with NXClient and start exaile - which then successfully piped sound to the local card. After disconnecting the monitor/keyboard and moving the box back to its normal spot, though, I was not able to play sound via either an NX or VNC session. Does anyone have any suggestions? Surely it's possible to configure my remote sessions to pipe sound to the server's sound card, right? Or at least get xfce up and running without a monitor or keyboard, but with access to the sound card, so I can VNC into it? Thanks!

    Read the article

  • HP DL160 G6 memory PC3-10600R vs PC3-10600E

    - by Jeremy Hajek
    I am using an HP DL160 G6 server, which according to the specs takes PC3 Registered or Unbuffered memory. When I combine the two types of memory below, the system will not POST. When I use just the first type of memory listed, the system will POST. I have two pieces of HP memory that came with the server, labeled PC3-10600E-9-10-E0, and then I have some Crucial memory labeled PC3-10600R-9-10-B0. I wager that the R means Registered memory and the E means ECC - so shouldn't the Crucial memory boot with the system according to the HP specs? Or does the E mean it is Unbuffered, and therefore I shouldn't mix and match, according to this HP memory config doc?

    Read the article

  • Can a Windows Domain play along with a Hosted Exchange service?

    - by benzado
    I'm setting up a computer network for a small (10-20 people) company. They are currently using a Hosted Exchange service they are totally happy with. Other than that, they are starting from scratch (the office doesn't even have furniture yet). They will need some kind of file sharing server set up in their office. If I set up a machine as a file server and nothing more, users will have three passwords to deal with: local machine, file server, and email. If I set up a Domain Controller, the identities for the local machine and the file server will be the same. But what about the Hosted Exchange server? Must the users have a separate email password, or is it possible to combine the two? (I realize it might depend on the specific hosting provider, but is it possible?) If not, it seems like I have these options:

      - Deal with it: users have a separate email password.
      - Host Exchange on the local server: more than they want to manage in-house?
      - Purchase a hosted VPS, make it part of the domain, and host Exchange there. (Or can/should a VPS be a domain controller?)

    I realize I have a lot of questions in there. The main one: is there any reason to use a Hosted Exchange service if I'm setting up other Windows services?

    Read the article
