Search Results

Search found 21227 results on 850 pages for 'zombie process'.


  • SQL Server Migration Assistant for Oracle problem

    - by Paul
    I've recently installed SSMA on my computer and connected to both the Oracle instance (which holds the database to be converted) and the SQL Server. I've mapped the needed schemas from Oracle to MSSQL. The problem is that when I click the Report button for the assessment report, an error pops up:

      Assessment Error: Nothing to Process

    The output window states:

      Starting conversion...
      Analyzing metadata...
      Conversion finished with 0 errors, 0 warnings, and 0 informational messages.
      There is nothing to process.

    Has anyone got experience with SSMA? I can't figure out what I am doing wrong. Thank you.

  • Error while removing the new kernel 2.6.37

    - by Tarek
    Hi! I tried to install the new kernel but something went wrong, and now I'm trying to remove it. The error message is:

      mhd@Tarek-Laptop:~$ sudo apt-get install -f
      Reading package lists... Done
      Building dependency tree
      Reading state information... Done
      The following packages will be REMOVED:
        linux-image-2.6.37-020637-generic
      0 upgraded, 0 newly installed, 1 to remove and 9 not upgraded.
      1 not fully installed or removed.
      After this operation, 111MB disk space will be freed.
      Do you want to continue [Y/n]? y
      (Reading database ... 188780 files and directories currently installed.)
      Removing linux-image-2.6.37-020637-generic ...
      Examining /etc/kernel/postrm.d .
      run-parts: executing /etc/kernel/postrm.d/initramfs-tools 2.6.37-020637-generic /boot/vmlinuz-2.6.37-020637-generic
      run-parts: executing /etc/kernel/postrm.d/zz-update-grub 2.6.37-020637-generic /boot/vmlinuz-2.6.37-020637-generic
      /etc/default/grub: 33: Syntax error: EOF in backquote substitution
      run-parts: /etc/kernel/postrm.d/zz-update-grub exited with return code 2
      Failed to process /etc/kernel/postrm.d at /var/lib/dpkg/info/linux-image-2.6.37-020637-generic.postrm line 328.
      dpkg: error processing linux-image-2.6.37-020637-generic (--remove):
        subprocess installed post-removal script returned error exit status 1
      Errors were encountered while processing:
        linux-image-2.6.37-020637-generic
      E: Sub-process /usr/bin/dpkg returned an error code (1)

    The previously unsolved error is described in this bug.
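
    One approach that might work here (an untested sketch, assuming the failure really is an unbalanced backquote around line 33 of /etc/default/grub):

      sed -n '30,36p' /etc/default/grub    # inspect the lines around 33 for a stray ` character
      sudo nano /etc/default/grub          # close or delete the unbalanced backquote
      sudo apt-get install -f              # retry the removal once the file parses cleanly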

  • How do I throttle a command in a terminal window?

    - by To Do
    I needed to run convert on a lot of images at the same time. The command took quite a while, but that doesn't bother me. The issue is that the command rendered my computer unusable while it was running (for about 15 minutes). So is it possible to throttle the command by limiting the resources (processor and memory) available to it, directly from the command line? This can only work if I add something to the same line before pressing Enter, because once I start the process the computer slows down so much that it is impossible, for example, to switch to System Monitor and reduce the priority.

    Edit: top and iotop results. I managed to run top and sudo iotop > iotop.txt while doing one of these convert operations. (The iotop.txt file produced is difficult to read.)

    Results of top:

      PID   USER     PR NI VIRT  RES  SHR  S %CPU %MEM TIME+   COMMAND
      14275 username 20 0  4043m 3.0g 1448 D 7.0  80.4 0:16.45 convert

    Results of iotop:

      Total DISK READ: 1269.04 K/s | Total DISK WRITE: 0.00 B/s
      TID   PRIO USER     DISK READ   DISK WRITE SWAPIN  IO     COMMAND
      2516  be/4 username 350.08 K/s  0.00 B/s   0.00 %  0.00 % zeitgeist-datahub
      7394  be/4 username 568.88 K/s  0.00 B/s   77.41 % 0.00 % --rendere~.530483991
      14275 idle username 350.08 K/s  0.00 B/s   37.49 % 0.00 % convert S~f test.pdf
      2048  be/4 root     0.00 B/s    0.00 B/s   0.00 %  0.00 % [kworker/3:2]
      1     be/4 root     0.00 B/s    0.00 B/s   0.00 %  0.00 % init

    Furthermore, even after the process ends, the computer does not return to its previous performance. I found a way around this by running sudo swapoff -a followed by sudo swapon -a.
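
    One way this is commonly tackled, as a sketch (assumes ImageMagick's convert plus the stock nice/ionice; cpulimit would need to be installed separately):

      # launch the job at the lowest CPU and I/O priority
      nice -n 19 ionice -c3 convert input.png output.pdf

      # ImageMagick can also be told to stay inside a memory budget instead of swapping
      convert -limit memory 256MiB -limit map 512MiB input.png output.pdf

      # or cap an already-running process at ~50% of one core
      cpulimit -p 14275 -l 50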

  • Distributing application updates using SCCM 2007

    - by theraneman
    Hi all, if there are any System Center Configuration Manager (SCCM) users out there, please clarify a doubt for me. I have used the ConfigMgr console to distribute a custom application to a client machine. Now I need to distribute some updated files for that application. Isn't it possible to add those files to the same package source used earlier and advertise it again? Or should I use the SCCM software updates section for this? Not sure if it's only me, but the software distribution process looks much easier than the Software Updates process in SCCM 2007. Please let me know if there are any online tutorials which explain how to update a custom application. Any help much appreciated.

  • Is there a way to make scp run faster on Mac OS X?

    - by paul_sns
    I'm trying to upload a Flex-generated SWF file from my MacBook (running Snow Leopard) using the command scp main.swf server.com:/ and I have set up key authentication to avoid typing the user/pass every time. This process normally takes up to two minutes using my connection at home (768 kbps down / 300+ kbps up). The interesting part is that when I use WinSCP on my Windows XP machine, the process takes 30 seconds at most. Both my MacBook and the Windows XP machine use the same internet connection. The MacBook is connected to the router via cable (which should be faster, right?) while the Windows XP machine connects through Wi-Fi. Let me know if you need additional information to diagnose the problem. Thanks!
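
    Two scp knobs that sometimes help in cases like this (a sketch, assuming the stock OpenSSH client):

      scp -C main.swf server.com:/             # -C compresses the stream, which can help on a slow uplink
      scp -c aes128-ctr main.swf server.com:/  # a cheaper cipher can help when CPU is the bottleneck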

  • Is chroot the right choice for my use case?

    - by Anthony
    Backstory: I am working on setting up a MineCraft server and want to allow admins to have ssh access to the MineCraft server console and the appropriate mc server files, but not the whole system. The console provided by the minecraft server is only available to the user that launched the process. In addition, the admins will need terminal access to some basic cli tools such as wget, cp, mv, rm, and a text editor.

    Plan:

      - I have already set up the ssh aspect of things, requiring pre-shared keys and whatnot.
      - Set up a jailed environment in which all user activity will be contained.
      - Set up user accounts. The first account will be the minecraft user, which will start the MC server in a multiuser screen session and allow the other admins to attach to it. Subsequent users should have their own /home directory for normal usage.
      - Set up ACLs on the appropriate files to allow each user to edit the mc server files.
      - No one will be doing system updates, nor will anyone be installing any programs, so I'll be the only user with sudo.

    The issues:

      - I don't want the ssh users to have access to the whole system.
      - Users will still need to use wget or curl to update the mc server files.

    Is chroot the right tool for this use case, or is there something more appropriate for the job? I have no experience setting up a chroot environment, and I have found several tools to aid in the process. Jailkit seems to be the most robust, but it's not in the standard repos.
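
    If Jailkit does turn out to be the right tool, the core setup might look like this (a sketch only; it assumes Jailkit was built from source, and the jail path and account name are made up for illustration):

      sudo mkdir -p /srv/mcjail
      sudo jk_init -v /srv/mcjail basicshell editors netutils ssh   # populate the jail with shells, editors, wget, etc.
      sudo jk_jailuser -m -j /srv/mcjail mcadmin                    # move an existing mcadmin account into the jail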

  • Building a complete program?

    - by Bob
    Reading books, watching videos, and reviewing tutorials is all very easy. Taking notes and actually learning the material may be slightly harder, but for anyone with a decent brain and a fair amount of interest, it's easy enough (not to mention fun). The thing is, none of that really prepares you to write a full program or website. Let's say you're one of those teens (only in high school, no true (college-level) computer science or programming courses, and no real-world experience), and you come out with Groupon. Or even Mark Zuckerberg; sure, he was a genius, and he was a very capable programmer... but how? How do you recommend that people who are not necessarily new to programming, but new to programming real applications, go about developing one? What is the "development process", especially for single programmers (or maybe 2-3 teens)? Also, as far as web development goes, what is the process? Was something like Facebook or Groupon written with a framework (like CodeIgniter or Zend for PHP)? Or did they develop their own frameworks? I'm not asking how to come up with a great idea, but how to implement great ideas in an effective way. Does anyone have advice? I've read a couple of books on both C and C++ (primarily The C Programming Language and The C++ Programming Language) and taken AP Computer Science (as well as read a few additional books on Java and OOP). I also have read a few tutorials on PHP (and CodeIgniter) and Python. But I'm still in high school, and I'm technically not even old enough to work at an internship for a few more months.

  • Seeking a solution to automatically copy files from a CD-ROM disc to a USB drive once it's connected

    - by Ray Nathan
    I plan to distribute a free CD that automatically copies files to a connected USB device. This process will be done on the computers of the users that obtain the CD. The CD will contain an autorun.inf file that will instruct the computer to copy a set of files located on the CD to a specific directory on the connected USB device. The USB drive letter is not the same on all systems, so Windows XP should automatically determine the drive letter of the USB device before the copy operation begins. What would be the best way of creating a short batch file or script that I can place on the CD to execute this process? Also, please note that it is NOT feasible or recommended to include a batch file on the USB devices to sync this operation, for the reason explained at the beginning of this paragraph. :) Thank You All
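
    One way to find the removable drive at run time (a rough sketch in VBScript, which autorun.inf could launch via wscript; the payload and target folder names are invented for illustration):

      ' copytousb.vbs: copy the CD's payload folder to the first ready removable drive
      Set fso = CreateObject("Scripting.FileSystemObject")
      For Each d In fso.Drives
          ' DriveType 1 = removable; note that a floppy drive would match too
          If d.DriveType = 1 And d.IsReady Then
              fso.CopyFolder "payload", d.DriveLetter & ":\MyFiles", True
              Exit For
          End If
      Next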

  • How to check if a server runs in pressing mode

    - by Ice
    Hi there. Layout: at a customer site I have a server (Windows 2003 R2 SP2 Standard Edition, 32-bit) with SQL Server 2005 and some databases. The system starts with the /3GB switch. The system reports 3.25 GB of RAM, and Task Manager reports the sqlserver.exe process, at 2,758,255 K, as the process with the highest memory consumption. The OS divides RAM between applications and itself, normally 50:50. But here the /3GB switch is activated, so I think the applications' share is more than 50% of RAM.

    Knowledge (or rather, lack of knowledge): Somebody told me that if the OS runs out of memory within its part of RAM, the server goes into "pressing mode".

    Questions:

      - What is this pressing mode?
      - Is pressing mode possible at all in this scenario?
      - What should be done to get more performance out of this SQL Server, besides optimizing the database and all that?
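
    In case it helps to rule memory limits in or out, SQL Server's own memory ceiling can be inspected and capped from T-SQL (a sketch; the 2560 MB figure is only an example):

      EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
      EXEC sp_configure 'max server memory (MB)', 2560; RECONFIGURE;
      DBCC MEMORYSTATUS;   -- detailed breakdown of how sqlservr.exe is using its memory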

  • Xorg becomes unkillable at 3AM

    - by chew socks
    Most nights, some time during the 3 AM hour, my Xorg process will increase to 100% CPU and GPU load will also increase to 100%. The process also becomes unkillable: I cannot sudo kill -9 it or get back control with sudo service lightdm restart. I also cannot switch to a tty screen with Ctrl+Alt+F1. To reboot I have to log in with ssh, but this is not ideal, because if I reboot while it is doing this, my ZFS pool will fail to mount when the system comes back up (and that is where my /home is). Does anyone have any ideas as to why I can't stop and restart Xorg, or even better, know why this is happening? Thanks.

    NOTE: For anyone who comes looking for the same problem: I disabled Catalyst AI and made it through the night. I've been up for 1 day 3 hours now. My record for this month is 2 days and 19 hours without a problem. My all-time record is 6 days without a crash. I'll post here if it crashes again or I'm able to set a new record.

  • Is LSB's init script function "start_daemon" really used for real daemons or should I stick to start-stop-daemon?

    - by Fred
    In the context of init scripts, according to the LSB specification, "Each conforming init script shall execute the commands in the file /lib/lsb/init-functions", which then defines a couple of functions to be used when working with daemons. One of those functions is start_daemon, which obviously "runs the specified program as a daemon" while checking whether the daemon is already running. I'm in the process of daemonizing a service app of mine, and I'm looking at how other daemons are run to try to "fit in". In the process of looking at how it's done elsewhere, I noticed that not a single daemon on my Ubuntu 10.04 machine uses start_daemon. They all call start-stop-daemon directly. The same goes for my Fedora 14 machine. Should I try to play nice and be the first one to use start_daemon, or is there really no point, and start-stop-daemon is the way to go since everybody is already using it? Why are there no daemons using LSB's functions?
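
    For reference, a minimal LSB-style script using those functions might look like this (a sketch; /usr/sbin/mydaemon is a made-up path):

      #!/bin/sh
      . /lib/lsb/init-functions

      case "$1" in
        start)
          # per the LSB, start_daemon does not start a second copy if one is already running
          start_daemon /usr/sbin/mydaemon
          ;;
        stop)
          killproc /usr/sbin/mydaemon
          ;;
        status)
          pidofproc /usr/sbin/mydaemon >/dev/null && echo running || echo stopped
          ;;
      esac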

  • How to pass parameters to a function?

    - by sbi
    I need to process an SVN working copy in a PS script, but I have trouble passing arguments to functions. Here's what I have:

      function foo($arg1, $arg2)
      {
          echo $arg1
          echo $arg2.FullName
      }

      echo "0: $($args[0])"
      echo "1: $($args[1])"

      $items = get-childitem $args[1]
      $items | foreach-object -process { foo $args[0] $_ }

    I want to pass $args[0] as $arg1 to foo, and $args[1] as $arg2. However, it doesn't work; for some reason $arg1 is always empty:

      PS C:\Users\sbi> .\test.ps1 blah .\Dropbox
      0: blah
      1: .\Dropbox

      C:\Users\sbi\Dropbox\Photos
      C:\Users\sbi\Dropbox\Public
      C:\Users\sbi\Dropbox\sbi

      PS C:\Users\sbi>

    Note: The "blah" parameter isn't passed as $arg1. I am absolutely sure this is something hilariously simple (I only just started with PS and still feel very clumsy), but I have banged my head against this for more than an hour now, and I can't find anything.
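
    The likely culprit (hedged, though this is documented PowerShell behavior): inside a script block, $args is the block's own automatic variable, not the script's, so $args[0] is empty there. Capturing the value in a named variable first sidesteps that:

      $first = $args[0]
      $items | ForEach-Object -Process { foo $first $_ }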

  • How to find spyware dll launched using svchost.exe

    - by Sheen
    This weekend I found that my PC was possibly infected by some virus or spyware. There is one "svchost.exe -k netsvcs" process in my Task Manager, and it is running under my user name rather than a SYSTEM account. There is already another process with the same command-line options running under the SYSTEM account. The svchost.exe under my user account consistently consumes 50% CPU (1 of the 2 cores of my CPU). In Process Explorer, I can see it was started by explorer.exe instead of services.exe. However, I have failed to find its real service DLL location in the registry or on disk. Does anyone know how to find this malicious program?
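
    Two stock commands that can help narrow down what a given svchost.exe instance hosts (a sketch; run from cmd):

      rem list the services hosted inside each svchost.exe instance
      tasklist /svc /fi "imagename eq svchost.exe"

      rem the legitimate membership of the netsvcs group, for comparison
      reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Svchost" /v netsvcs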

  • CHROOT for shell script testing

    - by Josh
    I am looking at setting up a shell script in order to properly document and automate the process I am using to set up a few servers we have. In order to test the shell script through its different stages, I was thinking a chroot would be ideal, since I can wipe out the "virtual root" and recreate it on the fly. I have never used chroot before, however. I was just curious: what are the exact steps I would need to follow to implement this process of creating a chroot (with the basic core functions that would be needed to install apache/php/etc.) and then destroying it?
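
    On Debian/Ubuntu the usual route is debootstrap (a sketch; the release name and path are examples):

      sudo debootstrap jaunty /srv/testroot   # build a minimal Ubuntu 9.04 tree
      sudo chroot /srv/testroot /bin/bash     # enter it and run the script under test
      exit                                    # leave the chroot
      sudo rm -rf /srv/testroot               # wipe the "virtual root" and start over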

  • Do large folder sizes slow down IO performance?

    - by Aaron
    We have a Linux server process that writes a few thousand files to a directory, deletes the files, and then writes a few thousand more files to the same directory without deleting the directory. What I'm starting to see is that the process doing the writing is getting slower and slower. My question is this: the directory size of the folder has grown from 4096 to over 200000, as seen in this output of ls -l:

      root@ad57rs0b# ls -l 15000PN5AIA3I6_B
      total 232
      drwxr-xr-x 2 chef chef 233472 May 30 21:35 barcodes

    On ext3, can these large directory sizes slow down performance? Thanks. Aaron
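
    One thing that may be worth checking (a sketch; /dev/sda1 stands in for the real device): ext3 only hashes directory lookups when the dir_index feature is enabled, and a directory that has ever been huge keeps its oversized index blocks until it is rebuilt.

      sudo tune2fs -l /dev/sda1 | grep dir_index   # is hashed (htree) directory indexing on?
      sudo e2fsck -D -f /dev/sda1                  # offline only: rebuild and optimize directories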

  • Extract first page from multiple pdfs

    - by Tim Alexander
    I've got about 500 PDFs to go through and extract the first page of. They then need to go through some time-consuming conversion process, so I was hoping to save some time with a batch process that extracts just the first page from each of the 500 PDFs and places it in a new PDF. I've had a poke around Acrobat but can find no real method of doing this for multiple files. Does anyone know any other programs or methods by which this could be achieved? Free and open source are obviously more favourable. :) EDIT: I've actually had some success using Ghostscript to extract just one page. I'm now looking at how to batch that, taking the list of files and using those.
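
    Building on the Ghostscript result, a batch wrapper could be as simple as this (a sketch; it assumes the 500 PDFs sit in the current directory and an out/ directory already exists):

      for f in *.pdf; do
        gs -q -dNOPAUSE -dBATCH -sDEVICE=pdfwrite \
           -dFirstPage=1 -dLastPage=1 \
           -sOutputFile="out/${f%.pdf}-p1.pdf" "$f"
      done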

  • lighttpd silently stops logging

    - by Max Cantor
    I'm on a Slicehost 256MB VPS with Ubuntu 9.04 (Jaunty). lighttpd is the only web server process running; it listens on port 80. My lighttpd.conf can be found here. I'm using Ubuntu's default logrotate setup for lighty. At seemingly random times, lighttpd will stop logging. It is not correlated with log rotation; that is, the errors do not occur when logrotate kicks in. What happens is, I will verify that the server is serving files by hitting a URL with my browser, and I will verify that it is not logging by checking access.log and seeing that the GET request I just made is not there. Using init.d to restart the process starts logging again, without truncating or rotating the log file. That is, new requests will be logged at the end of the existing access.log file. There are no cron jobs running on this box. Any ideas?
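
    A quick check for the next time it happens (a sketch): see whether the running lighttpd still holds an open handle on the file it is supposed to be writing.

      lsof -p "$(pgrep -o lighttpd)" | grep -i log   # which log files does the process actually have open?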

  • Automatic document generation

    - by Bowler
    I have some data in an Excel file from which I have to generate a report. I repeat this task fairly regularly and am looking to automate it. I have a LaTeX project into which I usually just copy data by hand, export the necessary worksheets as PDFs, add them to my LaTeX project, and compile with pdflatex. It has occurred to me that there must be a way to automate this process. Is there an efficient way to export the data from Excel into a LaTeX project? Possibly a VBA script in Excel could run the process? Also, it doesn't have to be LaTeX; I'm not all that experienced with MS Office's more advanced features. Is there some way, akin to a mail merge, by which I could achieve this? In some ways that might be better, in case I have to pass the work on to someone who doesn't know LaTeX. Thanks.
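
    One possible pipeline, sketched under the assumption that the report data can be exported to CSV (ssconvert ships with Gnumeric; the file names are examples):

      ssconvert report-data.xls report-data.csv   # headless spreadsheet-to-CSV conversion
      pdflatex report.tex                         # report.tex reads the CSV, e.g. via the csvsimple package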

  • The balance between client and server functionality

    - by Eugen Martynov
    I want to bring up a discussion that started in our teams and get your opinion on it. Assume we have a user account which could have different credentials for authentication and an associated email for recovery. A user can sign up with an email or use his social profile to complete the signup process. The REST API from the backend to the client looks like:

      - Create account
      - Authorise
      - Update user data
      - Link social account
      - Register email
      - Verify email

    In addition, our BE is distributed and divided between several services/servers/clusters, so different calls are related to different endpoints. In the case of social signup, some of the steps should be skipped or simplified. For example, with Facebook signup we could skip the email registration and verification steps (we ask for the email permission from the user), skip linking the social account, and pre-fill the user's display name. So we proposed having another endpoint which hides/combines the different calls on the BE and returns the whole process result to the clients.

    The pros of this approach:

      - No more duplication of functionality between clients
      - Speeds up networking and the user experience

    The cons of this approach:

      - Additional work for the backend
      - Probably more complex scenarios in future updates

    I would like to get your opinion on or experience with this situation, especially if you have already run into point #2 from the cons.

  • Windows 2003 Server - Logon Failure error message in Event Viewer

    - by user45192
    Hi guys, I received a lot of events logged in Event Viewer with this message. I notice it is always the same user ID which encounters this error. The user ID is used by an application to access the database. However, this account does not exist on this server. How do I trace the service/program using this user ID that causes these error messages?

      Reason=Unknown user name or bad password
      User Name=
      Domain=
      Logon Type=3
      Logon Process=NtLmSsp
      Authentication Package=NTLM
      Workstation Name=
      Caller User Name=-
      Caller Domain=-
      Caller Logon ID=-
      Caller Process ID=-
      Transited Services=-
      Source Network Address=-
      Source Port=-
      User=SYSTEM
      ComputerName=

  • rdiff-backup failed due to target machine being down, but is unkillable

    - by Markus
    My backup script was invoked by cron and uses rdiff-backup to back up the user files onto a target system on the network. That target computer went down at some point, yet it still appeared as mounted on the server. rdiff-backup didn't do anything, but it still appears as a process, and kill-ing it doesn't stop it. Similarly, running rdiff-backup for other directories works, but it doesn't exit properly and remains in the process list. Is there anything short of rebooting the server that I can try?
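
    Two things that sometimes help with processes stuck on a dead network mount (a sketch; the mount point and PID are examples):

      sudo umount -l /mnt/backup-target   # lazy unmount detaches the dead mount; the stuck I/O may then error out
      ps -o pid,stat,wchan -p 12345       # a 'D' (uninterruptible sleep) state explains why kill -9 has no effect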

  • mongod fork vs nohup

    - by Daniel Kitachewsky
    I'm currently writing process-management software. One package we use is mongo. Is there any difference between launching mongod with "mongod --fork --logpath=/my/path/mongo.log" and with "nohup mongod >> /my/path/mongo.log 2>&1 < /dev/null &"? My first thought was that --fork could spawn more processes and/or threads, and it was suggested to me that --fork could be useful for changing the effective user (downgrading privileges). But we run everything under the same user (the process manager and mongod), so is there any other difference? Thank you

  • Why does my CPU Usage reach 100% too often?

    - by deathlock
    I'm using a dual-core processor and often see my CPU usage reach 100%. I realize this may happen if I'm running too many applications, so when I notice the computer starting to run slowly, I start closing applications. I usually run 4-5 applications simultaneously, typically: a web browser (Google Chrome), Adobe Photoshop, Notepad++, XAMPP, and Windows Task Manager. Usually I close tabs in Chrome first, because I often browse the net with about 20 tabs across 4 windows open, so I presume that takes a lot of memory (bad habit, I know). But even after closing Chrome's tabs or closing other applications, my CPU usage often stays at a high percentage: 72% at best, 100% at worst. I check the Processes tab in Windows Task Manager and usually find System, System Idle Process, or services.exe taking the highest CPU share (it can reach 60). Why is this happening? And is there any solution? EDIT: I have a T2250 @ 1.73 GHz and 2.5 GB RAM.

  • Running a Screen instance of a program as non-root

    - by user288467
    I've got a dedicated server (Ubuntu 12.04, no GUI) set up to launch an instance of McMyAdmin attached to a screen session every time I reboot the hardware. I have the command saved in root's crontab as:

      @reboot cd /var/MC_SVR && screen -dmS McMyAdmin ./MCMA2_Linux_x86_64

    The problem is that I have a user set up specifically for FTP access to the server files, so I don't always have to SSH into the machine. Since the server is being started as a root process, all the files it creates are, obviously, owned by root. So I chown'd all the files and set them to ftpuser. Now I'm stuck trying to get the process to start as ftpuser. I've tried the following, but to no avail:

      cd /var/MC_SVR && su ftpuser - -c 'screen -dmS McMyAdmin ./MCMA2_Linux_x86_64'

    I try this in a terminal and I get no errors or anything (in fact I never get anything unless it's a syntax error from su), but there's no screen instance to access, so I can assume the server never starts. So, what am I doing wrong? Or am I just not accessing the screen instance correctly, since it's (supposed) to be launched by another user?
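
    Two hedged things to check: su's option order matters (the "-" must come before -c), and a screen session started for another user only shows up in that user's own session list.

      su - ftpuser -c 'cd /var/MC_SVR && screen -dmS McMyAdmin ./MCMA2_Linux_x86_64'
      su - ftpuser -c 'screen -ls'   # the McMyAdmin session should be listed here, not under root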

  • How to check use of userva boot option on Win 2K3 server

    - by Tim Sylvester
    I have some 32-bit Win2K3 servers running an application that fails now and then, apparently due to heap fragmentation (process virtual bytes grow, private bytes do not). I do not have access to the source code or the build process of this application. I have modified the boot.ini file on one of these servers to include /userva=2560, halfway between the normal mode of operation and the /3GB option. Normally it takes weeks to reach the point of failure, but I'd like to see right away whether this has actually had any effect. As I understand it, this option limits the kernel to the remaining address space (1536 MB instead of 2048), but does not necessarily give an application the extra address space, depending on the flags in the application's PE header. How can I determine whether the OS is allowing a particular application, running in production, to access address space above 2 GB? Additionally, what's the best way to monitor the system to ensure that the kernel is not starved for address space, and more generally, how should I go about finding the optimal value for this setting?
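
    The PE flag in question can be read without source access (a sketch; dumpbin ships with Visual Studio, and app.exe is a placeholder):

      dumpbin /headers app.exe | find "large"
      rem look for "Application can handle large (>2GB) addresses" among the file header characteristics

    For the kernel-starvation side, the Memory\Free System Page Table Entries counter in Performance Monitor is the usual canary when /userva or /3GB is in play.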
