Search Results

Search found 25651 results on 1027 pages for 'shell script'.


  • Load balancing two web servers, each on a different ISP?

    - by Scott
    I have two ISPs that provide me hosting via Apache/PHP/MySQL, and I am running Drupal on both. On occasion the MySQL server will go away (crash), so I was hoping to find a reasonable way to fail over: if server A's SQL is down, all traffic gets sent to server B. I know traditionally this is handled in DNS, where a second, alternate IP is returned if there is a problem, or something similar. But I do not have control over the ISPs, other than being able to run PHP, Perl and the usual Apache stuff. Also, I have static IPs at each ISP, and I can create DNS entries (A/CNAME/TXT). So I was hoping there might be a way for me to have a script that checks whether Drupal has a problem and, if so, somehow alters DNS. Or any other ideas? (Other than spending lots more money on a better ISP.)
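
    A minimal sketch of the health-check idea, assuming a cron-capable machine (server B or a third box) and a DNS host with an HTTP update API - the dns-api.example.com endpoint, hostnames and token below are all placeholders:

        #!/bin/sh
        # check-drupal.sh - run from cron every few minutes
        PRIMARY="http://www.example.com/"          # site as served by server A
        FAILOVER_IP="203.0.113.2"                  # static IP at ISP B
        if ! curl -fsS --max-time 10 "$PRIMARY" >/dev/null 2>&1; then
            # primary looks down: repoint the A record at server B
            # (hypothetical DNS provider API call - substitute your provider's mechanism)
            curl -fsS "https://dns-api.example.com/update?host=www&ip=${FAILOVER_IP}&token=SECRET"
        fi

    Note that even with a low TTL, DNS changes take a while to propagate, so this only softens an outage rather than eliminating it.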

    Read the article

  • How can I check the location of Perl and CPAN files?

    - by Rob
    I constantly have to set up new servers for an employer of mine, for one exact purpose of his, and as such they all have to be set up in exactly the same way. So I've created a script in PHP that I run from my own box to automatically send over all the relevant files, compile everything, run updates, and everything else. However, these brand-new servers come with Perl preinstalled, which is fine, but they have it installed in different locations. This makes it a pain for me to copy over Config.pm for CPAN without going in and finding the location manually. Is there perhaps some command I'm unaware of that will hunt down the precise location? If it helps, the servers are usually CentOS 5.
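
    Since Perl knows where it was installed, a few one-liners (standard perl/Config queries, so they should behave the same on the CentOS 5 boxes) can be called from the PHP deployment script:

        which perl                                              # path of the perl binary on $PATH
        perl -MConfig -e 'print "$Config{installprivlib}\n"'    # core library directory
        perl -MConfig -e 'print "$Config{installsitelib}\n"'    # site (CPAN) library directory
        perl -e 'print join("\n", @INC), "\n"'                  # full module search path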

    Read the article

  • Exchange - get age range of items using PowerShell

    - by marcwenger
    We are going to be implementing personal archives for Exchange in our organization. To get a good grasp on how much space is needed, we need an idea of the age of the items we currently have. Is it possible to write a PowerShell script that tells me the total size and number of items in certain date ranges across all mailboxes in all databases? What I'd like to have is 1) the number of items and 2) the total size of items (GB), grouped by date range (less than 15 days, 15-30 days, 30-60 days, 60-90 days, more than 90 days). Another possibility would be to have it also grouped by mailbox database.

    Read the article

  • Get a y-error plot without a line in Octave

    - by queueoverflow
    I'd like to print a plot with y error bars and just plain points, no connecting line. My current Octave script looks like this: errorbar(x_list, y_list, Delta_y_list, "~.x"); title("physikalisches Pendel"); xlabel("a^2 [m^2]"); ylabel("aT^2 [ms^2]"); print -dpdf plot.pdf The plot I get still has a line, although I specified the .x style option. How can I get rid of that line? Also, the ylabel ends up overlapping the axis scale; is there some way to fix that?

    Read the article

  • Creating a Scheduled Task that runs forever on Windows XP

    - by Mike Fiedler
    When I create a scheduled task, I do so via the command line: schtasks.exe /Create /TN "startup-script" /TR "C:\startup.bat" /RU taskuser /RP taskpasswd /SC ONLOGON The idea is that this task runs forever: the batch file starts a Java process that is never meant to end. I've used ONLOGON, as the machine automatically logs in as taskuser. All this works fine for about 72 hours, after which the duration limit kicks in and ends the process. Windows XP doesn't have the /DU flag on the command line - is there an alternative way to create a task that runs from system startup (without even requiring a logon) and runs forever, without touching a GUI?

    Read the article

  • Can I prevent Internet Explorer 8 from running scripts until the page is loaded?

    - by Tom W
    When trying to view a number of different websites in IE8, I get the following error message: HTML Parsing Error: Unable to modify the parent container element before the child element is closed (KB927917) On investigating the error message, it appears a script is trying to modify parts of the page before the page is fully loaded. Is there a setting in IE8 I can change to prevent scripts from running until the page is fully loaded? EDIT: The sites in question used to work just fine, until I had to re-install IE8 for a separate issue. Then they stopped working.

    Read the article

  • Try the Oracle Database Appliance Manager Configurator - For Fun!

    - by pwstephe-Oracle
    If you would like to get a first-hand glimpse of how easy it is to configure an ODA, even if you don't have access to one, it's possible to download the Appliance Manager Configurator from the Oracle Technology Network and run it standalone on your PC or Linux/Unix workstation. The configurator is packaged in a zip file that contains the complete Java environment to run standalone. Once the package is downloaded and unzipped, it's simply a matter of launching it using the config command or shell script for your runtime environment. Oracle Appliance Manager Configurator is a Java-based tool that enables you to input your deployment plan and validate your network settings before an actual deployment, or you can just preview and experiment with it. Simply download and run the configurator on a local client system, which can be a Windows, Linux, or UNIX system (on Windows, launch the batch file config.bat; in Linux/Unix environments, run ./config.sh). You will be presented with the very same dialogs and options used to configure a production ODA, but on your workstation. At the end of a configurator session, you may save your deployment plan in a configuration file. If you were actually ready to deploy, you could copy this configuration file to a real ODA, where the online Oracle Appliance Manager Configurator would use its contents to deploy your plan in production. You may also print the file's content and use the printout as a checklist for setting up your production external network configuration. Be sure to use the actual production network addresses you intend to use, as this will only work correctly if your client system is connected to the same network that will be used for the ODA. (This step is not necessary if you are just previewing the Configurator.) This is a great way to get an introductory look at the simple and intuitive Database Appliance configuration interface and the steps to configure a system.
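
    For example, a typical Linux/Unix session might look like this (the zip file name varies by version, so treat it as illustrative):

        unzip OracleApplianceManagerConfigurator.zip -d oda-configurator
        cd oda-configurator
        ./config.sh        # on Windows, run config.bat from a command prompt instead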

    Read the article

  • automated printouts from a wireless printer

    - by Piotr
    I have a wireless printer which is always on, and an always-on fanless Linux server. Looking at the mprinter project on Kickstarter, I started to wonder if there is already an app out there that will prepare an automated daily printout based on some settings. Things to be printed could include the weather forecast for my location, todos scheduled for that day, a "quote of the day" or "word of the day", stats from Google Analytics for my site, and more. I would schedule a printout at 6:15 every work day so it's on my printer when I'm up and having a coffee. Does anyone know of something that can be used for this purpose? While I know this can be done by combining the power of TeX, cron and a scripting language to manage the dynamic part of the PDF, I believe this is a use case someone has already addressed.
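
    If nothing off-the-shelf turns up, the cron-plus-CUPS route is short enough to sketch; the printer name, data sources and paths below are placeholders:

        # crontab entry: 6:15 on weekdays
        # 15 6 * * 1-5  /home/piotr/bin/daily-sheet.sh

        #!/bin/sh
        OUT=$(mktemp)
        date +"%A, %d %B %Y" > "$OUT"
        curl -fsS "https://wttr.in/YourCity?format=3" >> "$OUT"   # one-line weather report
        echo "Quote of the day: ..." >> "$OUT"                    # pull from whatever source you like
        lp -d wireless_printer "$OUT"                             # queue it on the CUPS printer
        rm -f "$OUT"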

    Read the article

  • No-break (UPS) compatible with a Linux box?

    - by Somebody still uses you MS-DOS
    I'm buying this unit from Deal Extreme: it's a BitTorrent downloader with NAS capability. I'm interested in sharing an external HD through it, for media and backup purposes. I'm afraid of power problems (don't know if this is the correct term) corrupting my mounted drives, like after a storm, so I thought about buying a no-break (UPS) that sends a "signal" to my Linux box, and a script on my Linux box would unmount everything to avoid problems. Does this "no-break signal" feature exist? Do you have model suggestions? Thanks!
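
    Yes - most UPS ("no-break") units with a USB or serial port can talk to Network UPS Tools (NUT) on Linux, and NUT's upsmon can run a command when the UPS switches to battery. A rough sketch (paths, names and the mount point are placeholders; check the NUT hardware compatibility list before buying a specific model):

        # /etc/nut/upsmon.conf excerpt:
        #   NOTIFYCMD /usr/local/bin/on-ups-event.sh
        #   NOTIFYFLAG ONBATT SYSLOG+EXEC

        #!/bin/sh
        # /usr/local/bin/on-ups-event.sh - upsmon sets $NOTIFYTYPE before calling this
        if [ "$NOTIFYTYPE" = "ONBATT" ]; then
            sync
            umount /mnt/external || logger "could not unmount /mnt/external"
        fi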

    Read the article

  • How to configure Linux to act as a Bluetooth RFCOMM SPP server?

    - by regulatre
    I'm writing a phone app for Android that connects to a Bluetooth RFCOMM device in my car; my phone app talks AT commands with it. For development work, I often need to communicate with the device to try different commands and things. My neighbors are starting to think I'm weird because I sit in my car for hours on end with my laptop screen shining on my face, typing away like a script kiddie. I'd much rather configure one of my many Linux servers to act as a Bluetooth RFCOMM device and allow me to connect to it (indoors, while I sit on my couch). I imagine I have to start with something like sdptool add SP, but then what? I'm perfectly happy writing a Perl app to handle the I/O; I just don't know how to make the BlueZ stack accept connections and then pipe that stream to the Perl app.
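
    With BlueZ the listening side can be done without writing any socket code: advertise an SPP record, then let rfcomm hand each incoming connection to a program as a tty. A sketch (the Perl handler path is hypothetical):

        sdptool add --channel=1 SP                 # advertise Serial Port Profile on channel 1
        rfcomm watch /dev/rfcomm0 1 \
            /usr/local/bin/fake-car-device.pl /dev/rfcomm0   # run the handler per connection

    The handler then just reads and writes /dev/rfcomm0 like a serial port, answering the AT commands however you like.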

    Read the article

  • uWSGI and a Python virtualenv

    - by user27512
    I'm trying to use uWSGI with a virtualenv in order to run the Trac bug tracker in it. I've installed uwsgi system-wide via pip. Next, I installed Trac in a virtualenv: $ virtualenv venv $ . venv/bin/activate $ pip install trac I then wrote a simple uWSGI configuration file: [uwsgi] master = true processes = 1 socket = localhost:3032 home = /srv/http/trac/venv/ no-site = true gid = www-data uid = www-data env = TRAC_ENV=/srv/http/trac/projects/my_project module = trac.web.main:dispatch_request But when I try to launch it, it fails: $ uwsgi --http :8000 --ini /etc/uwsgi/vassals-available/my_project.ini --gid www-data --uid www-data ... Set PythonHome to /srv/http/trac/venv/ ... *** Operational MODE: single process *** ImportError: No module named trac.web.main unable to load app 0 (mountpoint='') (callable not found or import error) I think uWSGI isn't using the virtualenv: from inside the virtualenv, I can import trac.web.main without an ImportError. How can I fix that? Thanks.
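
    One thing worth trying, assuming the mismatch is between the system-wide uWSGI build and the virtualenv's Python: install uWSGI inside the virtualenv and launch that binary, so the interpreter and site-packages are guaranteed to match:

        . /srv/http/trac/venv/bin/activate
        pip install uwsgi
        deactivate
        /srv/http/trac/venv/bin/uwsgi --ini /etc/uwsgi/vassals-available/my_project.ini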

    Read the article

  • Ubuntu: encrypt a user's home directory and protect it from the admin?

    - by Luc
    I have the following problem: I need to run some scripts on an Ubuntu machine, but I do not want those scripts to be visible to anybody. What would be the best way to do that? I was thinking of the following: create a dedicated user, add the scripts to this user's home directory, then protect and encrypt the home directory. Can I run the script from outside if the directory is encrypted? Can the superuser see the contents of the home dir? Is there a right way to do this? UPDATE: I think the best way would be for root to own those scripts. In that case I would need to allow another user to modify the network configuration. Is it possible to grant ONLY network rights to a user (via sudo or something else)?
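
    For the update (root owns the scripts, another user only gets network rights), sudo can be limited to specific commands. A sketch of a sudoers entry added with visudo - the username and command list are examples, and the user can run whatever you list here as root, so keep the list tight:

        netadmin ALL=(root) /sbin/ifconfig, /sbin/ip, /sbin/route, /etc/init.d/networking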

    Read the article

  • Win32 service dependencies

    - by grmbl
    I know this has been handled before, but I'm not getting a clear answer from that question. I have a service that depends on the print spooler. Every now and then the spooler crashes (luckily not often), and I need to stop my service when that happens. I'm not sure whether adding a dependency on Spooler to my service will do just that. I tried using the recovery option "Run Program" with a script that stops the service, but I don't fully trust that (I'm getting "Access Denied" errors). Thank you for your advice.
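
    For completeness: a dependency can be added from the command line, but it only keeps your service from starting before the Spooler and stops it when the Spooler is stopped cleanly - it will not stop your service when the Spooler crashes, which is why the recovery-action route is still needed. The command (note the required space after depend=, and that it replaces any existing dependency list):

        sc config MyService depend= Spooler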

    Read the article

  • Centrally managing 100+ websites without bankrupting a small company

    - by palintropos
    I'm mainly interested in opinions on the trade-offs between having a single central server that all the websites connect to, as opposed to each website mirroring a subset of the master database with all the products in it. For example, will I run into severe performance issues (or even security issues, or restrictions) making queries to an offsite database? Will we hit scalability issues we can't handle early on, from the sheer bandwidth required to maintain this? If we do go with something like a script that keeps smaller databases (each containing a subset of the central master data) in sync, what sorts of issues will we likely encounter there? I would really like the opinions of people far more knowledgeable than I am regarding the pros and cons of both setups and what headaches we are likely to encounter. CLARIFICATION: This should not be viewed as a question about whether we should implement one database vs. multiple databases; that question has been answered numerous times. The question is about the pros and cons of a deployment that manages all the websites centrally (one server) vs. trying to keep them all in sync if they each have their own DB (multiple servers). REAL-WORLD EXAMPLE: We are a t-shirt company, and we have individual websites for our different kinds of t-shirts, but we're looking at central order management integrated with our single shopping cart (which is ColdFusion + MySQL). Now, let's say we have a t-shirt that's on 10 of our websites and we change an image for it. Ideally we would change that in one place and the change would propagate, but how would we set this up?

    Read the article

  • best way to record local modifications to an application's configuration files

    - by Menelaos Perdikeas
    I often install applications on Linux which don't come in package form; rather, one just downloads a tarball, unpacks it, and runs the app out of the exploded folder. To adjust the application to my environment I need to modify the default configuration files, and perhaps add the odd script of my own, and I would like a way to record all these modifications automatically so I can apply them to another environment. Clearly, the modifications cannot be reproduced verbatim, since things like IP addresses or usernames need to change from system to system; still, an exhaustive record of what was changed and added would be useful. My solution is a pattern involving git: basically, after I explode the tarball I do a git init and an initial commit, and then I can save to a file the output of git diff plus a cat of all files appearing as new in git status -s. But I am sure there are more efficient ways. What would you suggest?
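
    A slight refinement of the git pattern, assuming the tarball is exploded under /opt/myapp: tag the pristine tree once, commit your changes as you go, and a single diff against that tag then captures every later edit and every added file in one portable patch:

        cd /opt/myapp
        git init && git add -A && git commit -m "pristine upstream tree"
        git tag pristine
        # ... edit configs, add scripts, commit as you go ...
        git diff pristine > local-changes.patch    # full record of modifications and additions
        git diff --stat pristine                   # quick per-file summary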

    Read the article

  • Multiple MySQL instances via mysqlmanager under Debian

    - by Karolis T.
    Has anyone got multiple MySQL instances running on Debian with mysqlmanager? The problem is that Debian doesn't ship an init.d script that takes mysqlmanager into account. Oh, and it doesn't work for me: I'm trying to run 3 instances, and here's what I get after starting mysqlmanager: # mysqlmanager --defaults-file=/etc/mysql/my.cnf ... 090614 0:42:10 starting instance 'mysqld2'... 090614 0:42:10 guardian: starting instance 'mysqld1'... 090614 0:42:10 starting instance 'mysqld1'... 090614 0:42:10 starting instance 'mysqld3'... 090614 0:42:10 guardian: starting instance 'mysqld3'... 090614 0:42:10 guardian: starting instance 'mysqld2'... 090614 0:42:10 starting instance 'mysqld2'... 090614 0:42:10 guardian: starting instance 'mysqld1'... 090614 0:42:10 starting instance 'mysqld1'... ... It just keeps "starting" and "restarting", but no MySQL instance ever starts up.
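
    One alternative worth trying, since it ships with Debian's MySQL packages: mysqld_multi, which reads [mysqld1], [mysqld2], [mysqld3] groups from my.cnf (each needs its own port, socket, datadir and pid-file) and manages the instances directly, without the instance manager:

        mysqld_multi report       # shows which configured instances are running
        mysqld_multi start 1,2,3  # start all three
        mysqld_multi stop 2       # stop just mysqld2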

    Read the article

  • Waiting for a daemontools service to stop

    - by also
    I'm running a service under daemontools that takes several seconds to stop when sent the TERM signal. I need to stop it in a script, and then wait for the process to actually stop before continuing to take an LVM snapshot or restart the service. Does daemontools provide a way to do this? If not, what's the best way? I was thinking of sleeping while svok exits with 0, but it seems like this should be a common problem with an easier solution. Thoughts?
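
    Nothing built into daemontools blocks until the service is down, but polling svstat is the usual idiom - note that svok only tells you whether supervise itself is running, not whether your service is up. A sketch (adjust the service directory path):

        #!/bin/sh
        SVC_DIR=/etc/service/myservice
        svc -d "$SVC_DIR"                                 # send TERM and keep the service down
        until svstat "$SVC_DIR" | grep -q ' down '; do    # wait for "down" in the status line
            sleep 1
        done
        # ... take the LVM snapshot here ...
        svc -u "$SVC_DIR"                                 # bring the service back up afterwards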

    Read the article

  • Terminal mail delivery delay in Mac OS X

    - by cmaughan
    I'm using mail from the Mac OS X terminal to send the results of a database query to myself via email. Most of the time it works, but sometimes there is a long delay before the mail arrives (often when another similar script is run). It looks like there is some kind of send queue, but I can't find any documentation mentioning it. Is there something I need to do to flush mail from the terminal? UPDATE: Sometimes delivery doesn't even seem to happen, though I get no errors at the console. Very weird.
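
    Assuming a stock Mac OS X install, command-line mail goes through the bundled Postfix, which queues and retries on its own schedule; you can inspect and flush that queue yourself:

        mailq                  # list messages still sitting in the local queue (and why)
        sudo postqueue -f      # ask Postfix to flush the queue and retry delivery now

    If messages sit there with "connection refused" or similar, the delay is on the receiving relay's side rather than in the terminal.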

    Read the article

  • Windows VPS running Apache and MySQL, PHP scripts running slow but CPU usage is 1-3%

    - by Roeland
    So every night I run some cron jobs. They take about 20 minutes to process all the records; I gather the script does something like 10,000 SQL queries. I figured the task was just that intense and needs time to complete, but I looked at CPU and memory usage, and it is super low: CPU usage stays between 1-3% and once in a while bounces to around 50% for 2-3 seconds. This VPS is running Windows Server 2003 with Apache and MySQL. Does this sound right?

    Read the article

  • How to make Microsoft Keyboard special keys run osascript commands on OS X?

    - by t-a-w
    I'm trying to make the (1) special key open a new Terminal window. I bound it to the file /Users/taw/bin/new_term, which contains: #!/bin/sh exec osascript -e 'tell application "Terminal" to do script "cd ."' This does the trick, except it also opens a Terminal window with this (even though Terminal.app is configured to always close windows when processes finish): Last login: Thu Mar 11 19:41:29 on ttys000 /Users/taw/bin/new_term ; exit; ~$ /Users/taw/bin/new_term ; exit; tab 1 logout [Process completed] How do I make it all work correctly? (Possibly using an approach completely different from what I've been attempting so far.)

    Read the article

  • Auto-login Cisco VPN client on Linux [closed]

    - by user70704
    Hi, I have installed the Cisco VPN client on my Linux system (Fedora Core 8). After every login I need to run the vpnc command to connect to the VPN server, and vpnc prompts me for the following input: IPSec gateway: IPSec ID: IPSec Secret: Username: Password: So, my requirement is: can I connect to the VPN server with a single command? I feel too lazy to enter the details above every time, and I'd like to connect to the VPN server at boot. I tried using an expect script, but I couldn't get it to work. Thanks in advance.
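
    vpnc can read everything from a configuration file, which makes a one-command (or boot-time) connection possible. A sketch - the file name and values are placeholders, and the file should be root-owned with mode 600 since it contains the password:

        # /etc/vpnc/work.conf
        IPSec gateway vpn.example.com
        IPSec ID mygroup
        IPSec secret mygroupsecret
        Xauth username myuser
        Xauth password mypassword

        sudo vpnc work             # connects using /etc/vpnc/work.conf, no prompts
        sudo vpnc-disconnect       # tears the tunnel down

    For connecting at boot, the same vpnc work line can go into /etc/rc.local or an init script.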

    Read the article

  • How to add a service to the S runlevel in Debian?

    - by MasterM
    I have the following script (what it does exactly is not important): #!/bin/sh -e ### BEGIN INIT INFO # Provides: watchdog_early # Required-Start: udev # Required-Stop: # Default-Start: S # Default-Stop: # X-Interactive: true # Short-Description: Start watchdog early. ### END INIT INFO # Do stuff here... I insert it into the S runlevel by invoking: insserv watchdog_early The appropriate link is created in /etc/rcS.d: S04watchdog_early -> ../init.d/watchdog_early and /etc/init.d/watchdog_early is executable (mode 755). Despite all this, it is NOT being run at boot. Why?
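
    One way to narrow it down is to prove whether the script is invoked at all, separately from whether its payload works - for example, a throwaway trace line near the top of the script (just after the LSB header):

        # temporary debugging aid inside /etc/init.d/watchdog_early
        echo "watchdog_early invoked with '$1' at $(date)" >> /var/tmp/watchdog_early.trace

    If the trace file never appears after a reboot, the problem is in how the rcS.d sequence is run; if it does appear, the script runs but its payload fails silently (note that the -e in #!/bin/sh -e aborts the script on the first failing command).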

    Read the article

  • How to remove all associated files and configuration settings of an app installed with the 'force architecture' option

    - by Mysterio
    A few weeks ago I installed a 32-bit .deb file with the 'force architecture' option (on my 64-bit notebook); however, the procedure was unsuccessful and I used apt-get purge to uninstall the app. It seems there are some leftovers of the app which have now broken system updates. Synaptic recommended sudo apt-get install -f, which I ran in the terminal with this initial response: Reading package lists... Done Building dependency tree Reading state information... Done The following package was automatically installed and is no longer required: libntfs10 Use 'apt-get autoremove' to remove them. The following packages will be REMOVED: crossplatformui 0 upgraded, 0 newly installed, 1 to remove and 0 not upgraded. 1 not fully installed or removed. After this operation, 0 B of additional disk space will be used. Do you want to continue [Y/n]? I chose 'Y' and then got this response: (Reading database ... 187616 files and directories currently installed.) Removing crossplatformui ... ztemtvcdromd: no process found dpkg: error processing crossplatformui (--remove): subprocess installed post-removal script returned error exit status 1 Errors were encountered while processing: crossplatformui E: Sub-process /usr/bin/dpkg returned an error code (1) It seems the app I installed, crossplatformui, is still on my system and has caused Update Manager to stop running with a partial-upgrade warning. What do I do now?
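
    A common way out of this particular loop, since the failure is in the package's own post-removal script, is to neutralise that script so dpkg can finish the purge (keep a copy first if you want to inspect it). A sketch:

        sudo cp /var/lib/dpkg/info/crossplatformui.postrm /root/crossplatformui.postrm.bak
        sudo sh -c 'printf "#!/bin/sh\nexit 0\n" > /var/lib/dpkg/info/crossplatformui.postrm'
        sudo dpkg --purge crossplatformui
        sudo apt-get install -f
        sudo apt-get autoremove        # removes the leftover libntfs10 noted in the apt output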

    Read the article

  • JavaScript is not loading

    - by Oden
    Hey, I've got a problem with JavaScript under Ubuntu that drives me crazy. I'm using gedit for my web sites since I'm an Ubuntu user. When I start a new website I create the folder structure (usually with the GNOME terminal) and copy the files I need into it. The next step is creating an index.html where I build the design and basic JavaScript functionality. The JavaScript is stored in a sub-folder of the project, and when I try to load one of the files using a script tag in the header, my whole page body disappears. If the source contains a script tag with its own body, and it's not the first one, its code won't run. I've tried to solve the problem by setting chmod to 777 with sudo chmod -R 777 . but nothing changed. CSS loads correctly, but JS doesn't. I'm using the newest version of Apache, no mod_rewrite stuff, and I get the same problem when I open the HTML from a file (file:///...). Does anyone know how to solve this problem?

    Read the article

  • Install Office software on a standard user account automatically.

    - by Earls
    If I know the administrator account name and password on a Windows 7 computer, would it be possible to create an Office 2010 install CD that would "silently" install Office 2010 under a standard user account which does not have installation privileges? (As in, the group policy "Always install with elevated privileges" is in effect for the user.) Is there some way to build the admin account privileges into the Office installer? A VBS script? CMD? Understand that the laptops are in the field; the end user doesn't have the admin password and can't have the admin password. Thanks.

    Read the article
