Search Results

Search found 23404 results on 937 pages for 'script compression'.


  • wait after call command minecraft

    - by smeagogol
    I'm trying to create a batch file for a friend because he has some problems with Minecraft. He needs to launch Minecraft 80 times without the Java error on closing... I have two batch files: one with a loop, and another one that runs the java command to launch Minecraft.

        ::Launcher.bat
        title Script Minecraft
        set tour=10
        set tour2=tour
        :boucle
        set /a tour=tour-1
        call "D:\thepath\Minecraft2.bat"
        if %tour%==0 goto suite
        goto boucle
        :suite
        wait javaw.exe
        :boucle2
        set /a tour2=tour2-1
        taskkill /F /IM "javaw.exe"
        if %tour2%==0 goto fin
        goto boucle2
        :fin
        echo Appuyez sur une touche pour quitter...
        pause >nul

    and the other one:

        ::Minecraft2.bat
        @echo off
        java -Xmx2048m -Xms1024m -cp "D:\thepath\Minecraft.exe" net.minecraft.LauncherFrame

    My problem is that when the first batch calls the second one, it waits for that window to close, but the windows must stay open! If someone has already encountered this problem, I'd be grateful. Thanks. PS: If my English is bad, it's because I'm French ;)

    Read the article

  • Tidying up old apache logs managed by vlogger

    - by Andre Lackmann
    We're using vlogger to manage our Apache logs, which keeps everything nice and neat but pretty much breaks the ability to use logrotate, as far as I can see. E.g. our virtual access logs each sit within their own directory and are named like this:

        /virtual.com/
            20100501-access.log
            20100502-access.log
            20100503-access.log
            20100504-access.log
            etc..

    Has anyone created a cleanup script to go through the /var/log/httpd/ sub-directories and remove old logs? We like using vlogger, but cleaning up old logs after it is a pain!
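    Not a ready-made tool, but a minimal sketch of the kind of cron job that could do it, assuming the vhost directories live under /var/log/httpd/ and that 30 days is an acceptable retention period (both the path and the retention are assumptions, not from the question):

        #!/bin/sh
        # Delete vlogger-style YYYYMMDD-access.log files older than DAYS days.
        # -mindepth 2 keeps the search inside the per-vhost sub-directories.
        LOGDIR=/var/log/httpd
        DAYS=30
        find "$LOGDIR" -mindepth 2 -type f -name '*-access.log' -mtime +"$DAYS" -print -delete

    Running it with -print and without -delete first is a cheap way to confirm it only matches what you expect.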

    Read the article

  • Mirroring a MySQL server with a different configuration

    - by HTF
    I have to migrate a MySQL server to a different data centre, so I would like to create another MySQL slave in the new DC and then promote it to master later on. I previously used LVM snapshots and Percona XtraBackup for this purpose, but this time I've tuned the MySQL configuration in a way that prevents me from using those methods.

    Old server (backup):

        innodb_log_file_size = 256M
        innodb_log_files_in_group = 3

    New server (restore):

        innodb_log_file_size = 512M
        innodb_log_files_in_group = 2

    The XtraBackup script and LVM snapshots copy the whole directory structure, so the MySQL server won't start because the InnoDB log files have a different size. Is there any solution that avoids downtime in this case? I can't use mysqldump as there are around 8000 databases, so I would have to take the server down for a couple of hours. I was also thinking of using the old settings with XtraBackup and then changing them once the new server is promoted to master - less downtime, but I'm not sure if this will work? Thank you. Regards
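    For what it's worth, the redo log size can usually be changed on the new slave after the restore rather than baked into the backup; a rough sketch, assuming a Debian-style service name and the default /var/lib/mysql datadir (both assumptions), and a clean shutdown so the logs hold no unflushed changes:

        # Restore with the old settings, let replication catch up, then:
        mysql -e "SET GLOBAL innodb_fast_shutdown = 0"    # full flush/purge on shutdown
        service mysql stop
        mkdir -p /root/ib_logfile_backup
        mv /var/lib/mysql/ib_logfile* /root/ib_logfile_backup/
        # edit my.cnf: innodb_log_file_size=512M, innodb_log_files_in_group=2
        service mysql start                               # InnoDB recreates the log files

    That keeps the change on the new box only, while the old master stays up.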

    Read the article

  • SQL Server backup/restore error: The Media Family on Device is Incorrectly Formed.

    - by Chris
    Basically, I'm having this issue: http://www.sqlcoffee.com/Troubleshooting047.htm What I'm doing is running a script I found online (http://pastebin.com/3n0ZfybL) to do a full backup, then rar'ing up the file and moving it to my computer. The CRC of the backup file inside the rar is correct on both computers, so the data isn't being corrupted in transfer. But when I try to restore the database on my dev computer here, I get the errors "SQL Server cannot process this media family" ... "Msg 3013". Why is this happening? I'd test the backup on the server I'm getting it from, but it's a production server.

    Read the article

  • How to determine if a file has been backed up?

    - by Console
    I'm trying to consolidate old drives onto new ones of larger capacity. Sometimes files have been renamed but are otherwise identical. Sometimes an old directory has just a few more files in it than a newer directory with the same name. Sometimes a file has the same name but the size differs. So I often find myself asking the question: are there any files on this old drive or directory that I haven't already copied to the new drive? I just want to know that I have the files; I don't want to try and sync stuff automatically (syncing tools tend to just sync, creating duplicate folder structures and other problems, so I prefer to do it by hand). Basically, if an old drive has a file called "foo.bar" ten directories deep, and my new big drive has an identical file called "oldstuff.zip" in the root, I just want a "yes, you have it" or "no, unique files exist". Is there a free tool, a script or a quick and easy method (Mac/Unix or Windows) to get the answer?
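    A minimal sketch of the checksum approach on the Mac/Unix side, with the two mount points as placeholders: index the new drive by content hash, then list every old-drive file whose hash is missing from that index, so renames and moved files don't matter.

        #!/bin/sh
        # "Do I already have this content somewhere on the new drive?"
        # On macOS, replace md5sum with "md5 -r" or install GNU coreutils.
        OLD=/mnt/old_drive
        NEW=/mnt/new_drive
        find "$NEW" -type f -exec md5sum {} + | awk '{print $1}' | sort -u > /tmp/new_sums
        find "$OLD" -type f -exec md5sum {} + | while read -r sum path; do
            grep -qx "$sum" /tmp/new_sums || echo "unique on old drive: $path"
        done

    It is slow on large drives (everything gets read once), but it answers exactly the "yes you have it / no, unique files exist" question.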

    Read the article

  • Reporting available RDP sessions

    - by Sergei
    Hi all. By default, Windows Server 2003 has two RDP sessions available. If all of them are taken by someone else, you are only notified about it after providing your credentials. This seems like a waste of time, especially in our environment where we share a lot of development servers. Could a message be shown as soon as you open the connection window if all the connections are taken? We can use a qwinsta-based script to check sessions before logging on, but that still takes time. Is there a feature that does what I am asking for?

    Read the article

  • Apache: Use a Specific Application for CGI scripts

    - by RandomInsano
    I have two servers, one in production and one for development. The production server is Solaris, dev is FreeBSD. Because of this, Python is installed in different directories. I'm using Python right now for writing CGI scripts, and needing to remember to swap my hashbangs when I copy from dev to production is a little annoying (same issue for SVN updates, depending on which server I'm committing from). Is there a way to configure Apache so that I no longer need the hashbangs? Like, if it would launch python itself and feed the CGI script to it? Might be a bit of a stretch, but no harm in asking.

    Read the article

  • How do I log which process is deleting a file on Windows XP?

    - by Jordan Milne
    I'm having an issue with a file getting deleted seemingly randomly throughout the day. The vendor of the software whose file is getting deleted says that another piece of software installed on the computer is deleting it, while the other software's vendor says the opposite. I've tried using Process Monitor so I can pinpoint exactly what's deleting it, but even when filtered down to that specific file, CreateFile operations are being triggered a few times a second, and I can't seem to filter it to deletions specifically. Is there a tool or script I can use to specifically monitor deletion attempts on a single file?

    Read the article

  • Change Windows Service Priority

    - by SchlaWiener
    I have a Windows service that needs to run with high priority. At the end of the day I want to use this script to raise the priority after service startup:

        Const HIGH = 256
        strComputer = "."
        strProcess = "BntCapi2.exe"
        Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
        Set colProcesses = objWMIService.ExecQuery _
            ("Select * from Win32_Process Where Name = '" & strProcess & "'")
        For Each objProcess in colProcesses
            objProcess.SetPriority(HIGH)
        Next

    But currently I am not able to change the priority, even with Task Manager. Task Manager throws an "Access Denied" error, even though I am logged on as administrator and I changed the user account of the service to administrator too. I still get the "Access Denied" message when trying to change the priority. Any ideas what permission I need for that?

    Read the article

  • SQL Server 2008 Log-shipping: Without a UNC drive: how?

    - by samsmith
    My real question here is... is there a tool I can use? (E.g. I have a lot to do, and would prefer not to script it all up myself!) Is anyone using the Red Gate tool? (Hmmm, they had a tool for this, but I do not see it on their web site now...) I have a primary web app at Rackspace and am setting up a backup copy of the app in another data center. I want to use SQL log shipping to sync the db. Using SQL Server Web Edition. TIA for suggestions and insight!

    Read the article

  • How to update Preview.app from the command line without losing focus on OS X?

    - by snies
    Hello, I want to update Preview.app in the background from the command line without losing focus from my current window. I know that I can use the following to open/update the view of a file, but then I lose focus to Preview.app:

        open -a Preview foo.pdf

    I guess there might be some clever AppleScript command to do this, but so far I haven't found the right one. Alternatively, I would be interested in transferring the focus back to my current app directly after the update. I need this in order to update Preview.app's view of a PDF through a vi autocmd after I update the PDF according to changes in a TeX file I am editing. Here is an example of what I want to achieve, but using Ubuntu and evince.
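    One thing that may be worth trying before resorting to AppleScript (untested here, but -g is documented as "do not bring the application to the foreground"):

        # Refresh the PDF in Preview without stealing focus from the current app.
        open -g -a Preview foo.pdf

    If Preview still grabs focus, the fallback would be an osascript call that re-activates the frontmost app afterwards.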

    Read the article

  • Problem with dpkg-preconfigure, how to correct?

    - by Eric Wilson
    I was trying to install TeamViewer, and I followed the instructions here even though they specify 11.10 instead of 12.04 (what I'm running). In particular, I executed:

        $ wget http://www.teamviewer.com/download/teamviewer_linux.deb
        $ sudo dpkg -i teamviewer_linux.deb

    The dpkg command failed, and after this point my packaging system has been broken. The Software Center instructs me to try:

        $ sudo apt-get -f install

    which leads to:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following packages will be REMOVED:
          teamviewer7:i386
        0 upgraded, 0 newly installed, 1 to remove and 17 not upgraded.
        9 not fully installed or removed.
        Need to get 89.0 kB of archives.
        After this operation, 81.9 MB disk space will be freed.
        Do you want to continue [Y/n]? y
        Get:1 http://us.archive.ubuntu.com/ubuntu/ precise/main dash amd64 0.5.7-2ubuntu2 [89.0 kB]
        Fetched 89.0 kB in 1s (83.9 kB/s)
        E: Sub-process /usr/sbin/dpkg-preconfigure --apt || true returned an error code (100)
        E: Failure running script /usr/sbin/dpkg-preconfigure --apt || true

    At this point I'm stumped.

    Read the article

  • show/hide specific scripts from browsers (adblock)

    - by user143822
    I'm using Adblock Plus and Element Hiding Helper to show/hide DOM elements, but I don't understand how I can show/hide a specific JavaScript-driven element on a page. For example, look at this page: http://downloadzoneforum.net There are several divs with class maintitle, and every maintitle div has a spoiler. If I click on the minus picture I can close the container. By default the maintitle divs have their spoilers opened, but I want to hide some of the spoilers using a filter, so that when I open Firefox the spoilers from Extra Forum and Discussioni Varie are already collapsed. How can I do this using Adblock, Element Hiding Helper, or another solution?

    Read the article

  • Major permission repair needed on Mac OS

    - by Luke1111
    I made the fatal error of copying and pasting a sudo command into my terminal without double-checking it. Here it is:

        sudo chown -R mysql /

    What this does (for those that don't know) is recursively change the owner of every file from the root down to mysql!! Obviously not what I was intending. This has of course played havoc with my system. The first thing I did was Apple's permission repair, but that only works for files it knows about, though it did change a lot of file ownerships back to root. It seems that many library files are still owned incorrectly, as a lot of things don't work. What I propose doing as a temporary fix, until I can reinstall Mountain Lion, is to recursively set all ownerships that are currently mysql back to my user (Luke). I'm not sure what they should be precisely, but this is still better than nothing. Is this possible using a shell script? I realise that this won't fix the problem properly and I will have to reformat, but I need the machine in a workable state just for this week.
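    A minimal sketch of that temporary fix, assuming the account really is named luke and that only the boot volume needs walking (the user name is a guess from the question; double-check it with the id command first, and expect to catch some files that genuinely should belong to other system users):

        #!/bin/sh
        # Give every file currently owned by mysql back to luke.
        # -x stops find from crossing onto other mounted volumes.
        sudo find -x / -user mysql -exec chown luke {} +

    It only touches files whose owner is mysql, so anything the Apple permission repair already fixed is left alone.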

    Read the article

  • creating secure multicast with socat

    - by arash
    How can we create secure multicast tunnels with socat? Assume we have a list of IP addresses and CIDR network addresses that we want to create secure tunnels to. I found this:

        socat STDIO UDP4-DATAGRAM:224.1.0.1:6666,range=192.168.10.0/24

    but I want a secure tunnel, and different addresses with their network addresses. I want to create a script that takes the IPs and network addresses and creates the secure tunnels:

        ./myscript IP1 NetAdd1 IP2 NetAdd2 ....

    How can I pass these parameters to socat? Does socat multicast have any limits? Thanks for your help.
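    A sketch of the argument handling only, not the "secure" part (the socat options are kept exactly as in the question, and whether they are adequate is a separate discussion):

        #!/bin/sh
        # Usage: ./myscript IP1 NET1 [IP2 NET2 ...]
        # Walk the arguments two at a time and run one socat per address/network pair.
        while [ "$#" -ge 2 ]; do
            ip=$1
            net=$2
            shift 2
            socat STDIO UDP4-DATAGRAM:"$ip":6666,range="$net"
        done

    Each socat here runs in the foreground until you end it; for real tunnels you would more likely hand each pair to a detached socat with concrete listen/connect addresses.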

    Read the article

  • How should I determine if a user is logged in graphically while lightdm is running?

    - by Jack
    I want to know if someone is logged into a local X session. In the past I looked at the output of ck-list-sessions, which looked something like this:

        Session12:
                unix-user = '[redacted]'
                realname = '[redacted]'
                seat = 'Seat1'
                session-type = ''
                active = TRUE
                x11-display = ':0'
                x11-display-device = '/dev/tty8'
                display-device = ''
                remote-host-name = ''
                is-local = TRUE
                on-since = '2012-10-22T18:17:55.553236Z'
                login-session-id = '4294967295'

    If no one was logged in, there was no output. I checked whether someone was logged in with:

        ck_result" string => execresult("/usr/bin/ck-list-sessions | /bin/grep x11 | /usr/bin/cut --delimiter=\\' -f 2 | /usr/bin/wc -w

    This no longer works, because the lightdm greeter looks like a logged-in user:

        Session12:
                unix-user = '[redacted]'
                realname = 'Light Display Manager'
                seat = 'Seat1'
                session-type = 'LoginWindow'
                active = TRUE
                x11-display = ':0'
                x11-display-device = '/dev/tty8'
                display-device = ''
                remote-host-name = ''
                is-local = TRUE
                on-since = '2012-10-22T22:17:55.553236Z'
                login-session-id = '4294967295'

    I guess I could check session-type, but I don't know how to do that and check x11-display in one line. I could then write my own script, but before doing that I thought I would check whether anyone else has already done the work, whether there is a way to get ConsoleKit to tell me what I want, or whether I should be using a different tool.
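    A sketch of the session-type filter, assuming the ck-list-sessions output format shown above: count only session blocks that have a non-empty x11-display and whose session-type is not 'LoginWindow', so the LightDM greeter is ignored.

        #!/bin/sh
        # Prints the number of real local graphical sessions (0 = nobody logged in).
        ck-list-sessions | awk '
            /^Session/ { tally() }                              # a new session block starts
            $1 == "session-type" && /LoginWindow/  { greeter = 1 }
            $1 == "x11-display"  && length($3) > 2 { x11 = 1 }  # more than just the two quotes
            END        { tally(); print count + 0 }
            function tally() { if (x11 && !greeter) count++; x11 = 0; greeter = 0 }
        '

    The old pipeline's job then reduces to comparing that number against zero.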

    Read the article

  • Best way to compare (diff) a full directory structure?

    - by Adam Matan
    Hi, what's the best way to compare directory structures? I have a backup utility which uses rsync. I want to tell the exact differences (in terms of file sizes and last-changed dates) between the source and the backup. Something like:

        Local file                       Remote file                      Compare
        /home/udi/1.txt (date)(size)     /home/udi/1.txt (date)(size)     EQUAL
        /home/udi/2.txt (date)(size)     /home/udi/2.txt (date)(size)     DIFFERENT

    Of course, the tool can be ready-made or an idea for a python script. Many thanks! Udi
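    Since rsync is already in use, a dry run can produce most of that report; a minimal sketch, with both the local path and the backup location as placeholders:

        # Itemized dry run: nothing is copied; each differing file is listed with
        # flags that show whether its size (s) or modification time (t) differs.
        rsync -rni /home/udi/ backuphost:/backups/udi/

        # For two locally mounted trees, a plain "which files differ" listing:
        diff -rq /home/udi /mnt/backup/udi

    Files missing from the backup show up in the rsync output as plain transfers, so "EQUAL" is simply everything that is not listed.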

    Read the article

  • You don't have permission to access / on this server on centos 5.3

    - by zahid
    Hello, I am using CentOS 5.3 with the Kloxo panel. Everything was fine, but last night when I was updating my site I started getting an error when I try to access my site's script; everything else looks OK: www.w3scan.net www.dl4fun.com

        Forbidden
        You don't have permission to access / on this server.

    Please help. I checked httpd and it seems to be OK. My httpd.conf:

        #<VirtualHost *:80>
        #    ServerName www.domain.tld
        #    ServerPath /domain
        #    DocumentRoot /home/user/domain
        #    DirectoryIndex index.html index.htm index.shtml default.cgi default.html default.htm
        #</VirtualHost>

    I uninstalled Apache and installed it again; now I just get "Index of /". I modified Apache's welcome.conf to remove the Apache test page. Help.

    Read the article

  • sendmail on Ubuntu won't send from www-data user

    - by bumperbox
    If I call the mail() function in PHP from the web server (running as www-data) I get an error sending email. If I call the same script from the command line logged in as root, it works. If I switch user to www-data and run it from the command line I get this error message:

        WARNING: RunAsUser for MSP ignored, check group ids (egid=33, want=107)
        can not chdir(/var/spool/mqueue-client/): Permission denied
        Program mode requires special privileges, e.g., root or TrustedUser.
        FAILED
        WARNING: RunAsUser for MSP ignored, check group ids (egid=33, want=107)
        can not chdir(/var/spool/mqueue-client/): Permission denied
        Program mode requires special privileges, e.g., root or TrustedUser.
        FAILED
        Test Complete$ WARNING: RunAsUser for MSP ignored, check group ids (egid=33, want=107)

    I am guessing I need to change something in the sendmail configuration. I have googled for solutions but have ended up more confused. Can someone let me know what configuration I need to change so I can send mail as the www-data user?

    Read the article

  • How to get Bash shell history range

    - by Aniti
    How can I get/filter history entries in a specific range? I have a large history file and frequently use history | grep somecommand. Now, my memory is pretty bad and I also want to see what else I did around the time I entered the command. For now I do this: get a match, say 4992 somecommand, then I do history | grep 49[0-9][0-9]. This is usually good enough, but I would much rather do it more precisely, that is, see commands from 4972 to 5012 - 20 commands before and 20 after. I am wondering if there is an easier way? I suspect a custom script is in order, but perhaps someone else has done something similar before.
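    Two small sketches of the "context" idea, assuming GNU grep (for -C) and bash; the second is a helper meant for ~/.bashrc, since history is only populated in an interactive shell:

        # 20 history entries before and after every match:
        history | grep -C 20 somecommand

        # Explicit range by entry number, e.g.: hrange 4972 5012
        hrange() { history | awk -v a="$1" -v b="$2" '$1+0 >= a+0 && $1+0 <= b+0'; }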

    Read the article

  • Vista - perform scheduled actions only if screen is not locked

    - by Syntax Error
    OK, here's the general idea of what I want to do. After a certain time, I would like the computer to nag me to go to sleep, maybe every five minutes or so. But I don't want the messages to pop up if the screen is locked, because I leave it like that all night. Ideally I would like to be able to do more, like shut down running instances of the web browser, or lock my user session if I ignore the notices for too long. But I'm happy with just pop-up messages if that's all I can do. So, how much of this is possible and where do I start? I'm not too well versed in Task Scheduler, and I'm assuming I'll use that to at least start whatever script I put together.

    Read the article

  • System-wide proxy settings when on a Windows network with a password

    - by sav
    I'm using Ubuntu on a Windows network and want to connect to the world wide web. I have followed the steps here, which I have found very useful. However, when I try to ping a website (e.g. ping www.wikipedia.org) I get no reply. I can ping local computers on my network, but I need to go through our proxy to get to the world wide web. I can even browse Wikipedia using Firefox; I just needed to enter the proxy configuration script location and my username and password. I'm quite sure the reason I'm having this trouble is that I haven't entered a username and password for the proxy system-wide, and I'm not sure how to do that. Ultimately I would like to be able to use package managers like Synaptic, but first they need to be able to connect to the internet.

    EDIT: As suggested, I created an /etc/apt/apt.conf file like:

        Acquire::http::Proxy "http://chrisav:[email protected]:8080";
        Acquire::https::Proxy "https://chrisav:[email protected]:8080";
        Acquire::ftp::Proxy "ftp://chrisav:[email protected]:8080";
        Acquire::socks::Proxy "socks://chrisav:[email protected]:8080";

    However, I still can't ping Wikipedia, and when I try installing stuff I get:

        chris@chris-Ubuntu:~$ sudo apt-get install kate
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package kate
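    Two things may be worth keeping apart here: ping uses ICMP, which never goes through an HTTP proxy, so a failing ping by itself doesn't prove the proxy settings are wrong; and "Unable to locate package" usually just means sudo apt-get update has not yet completed successfully through the proxy. For the system-wide part, a minimal sketch with placeholder credentials and proxy host (the real values come from the network admin):

        # /etc/profile.d/proxy.sh  -- exported for every login shell
        export http_proxy="http://chrisav:PASSWORD@proxyhost:8080"
        export https_proxy="http://chrisav:PASSWORD@proxyhost:8080"
        export ftp_proxy="http://chrisav:PASSWORD@proxyhost:8080"
        export no_proxy="localhost,127.0.0.1"

    With that in place (and the apt.conf entries above), sudo apt-get update followed by the install is the real test, not ping.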

    Read the article

  • mini-dinstall chmod 0600 changes file: Operation not permitted

    - by V. Reileno
    I'm getting "Operation not permitted" in mini-dinstall.log every time a new Debian package is uploaded to the custom Debian repository using dput. The deb file is installed successfully, but the changes file remains in the incoming folder, and I cannot use a post-install script while the changes file cannot be processed. How can I fix this problem?

        Traceback (most recent call last):
          File "/usr/bin/mini-dinstall", line 780, in install
            retval = self._install_run_scripts(changefilename, changefile)
          File "/usr/bin/mini-dinstall", line 826, in _install_run_scripts
            do_chmod(changefilename, 0600)
          File "/usr/bin/mini-dinstall", line 193, in do_chmod
            do_and_log('Changing mode of "%s" to %o' % (name, mode), os.chmod, name, mode)
          File "/usr/bin/mini-dinstall", line 176, in do_and_log
            function(*args)
        OSError: [Errno 1] Operation not permitted: '/srv/debian-repository/mini-dinstall/incoming/debian-repository_1.3_amd64.changes'

    The mini-dinstall permissions:

        ls -lad incoming/
        drwxrws--- 2 mini-dinstall debian-repository-uploader 4096 Jun 6 11:45 incoming/

        ls -la incoming/debian-repository_1.3_amd64.changes
        -rw-rw---- 1 uploader-user debian-repository-uploader 1322 Jun 6 11:43 incoming/debian-repository_1.3_amd64.changes

        groups uploader-user
        uploader-user : uploader-user adm users debian-repository debian-repository-uploader puppet-client-updater

        groups mini-dinstall
        mini-dinstall : mini-dinstall debian-repository-uploader

    Cheers and thanks, V.

    Read the article

  • KDE doesn't start up anything else

    - by Shane
    I just installed KDE under Arch Linux. The problem is that nothing else starts up with it - no window manager, no panels, nothing. All I get is a small terminal window in the bottom right corner of the screen, which I'm assuming is Konsole. From that single window I can do things like start kwin or launch programs whose names I happen to know (like chromium or firefox), but I don't have a panel for starting programs or switching between them. It doesn't matter whether I start KDM through inittab or manually by typing # /etc/rc.d/kdm start. KDM looks great, but once I log in as my normal user I just get a console window with no decoration. Is there a startup script for KDE somewhere that needs to run and that normally launches a window manager, panels, widgets, and all the usual background programs of an ordinary GUI? If so, how can I "restore" the default behavior?

    Read the article

  • cURL looking for CA in the wrong place

    - by andrewtweber
    On Red Hat Linux, in a PHP script I am setting cURL options as follows:

        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
        curl_setopt($ch, CURLOPT_CAINFO, '/home/andrew/share/cacert.pem');

    Yet I am getting this exception when trying to send data (curl error 77):

        error setting certificate verify locations:
          CAfile: /etc/pki/tls/certs/ca-bundle.crt
          CApath: none

    Why is it looking for the CAfile at /etc/pki/tls/certs/ca-bundle.crt? I don't know where this path is coming from, as I don't set it anywhere. Shouldn't it be looking in the place I specified, /home/andrew/share/cacert.pem? I don't have write permission to /etc/, so simply copying the file there is not an option. Am I missing some other curl option that I should be using? (This is on shared hosting - is it possible that it's disallowing me from setting a different path for the CAfile?)
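    Curl error 77 means the CA file it ended up with could not be read; the path in the message is the one actually in effect, and since that is the build-time default it looks as if either the CURLOPT_CAINFO value never reached this handle or the custom file is unreadable. A couple of quick shell checks, assuming the web server runs as the apache user (that user name is a guess):

        # Is the file there, and readable by the account PHP runs under?
        ls -l /home/andrew/share/cacert.pem
        sudo -u apache head -c 100 /home/andrew/share/cacert.pem

        # Every directory on the way also needs execute (search) permission:
        namei -m /home/andrew/share/cacert.pem

    On shared hosting, open_basedir or a home directory with mode 700 are the usual suspects for the unreadable case.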

    Read the article
