Search Results

Search found 5786 results on 232 pages for 'umbraco scripts'.

Page 131/232

  • Recursively resize images from one directory tree to another?

    - by davr
    I have a large, complex directory tree full of JPG images. I would like to create a second directory tree that exactly mirrors the first, but with all the images resized down to a set size (say 2000x1500) and quality (perhaps 85%). Is there any tool that would let me do this easily on Windows? I could write some scripts to automate it with bash and ImageMagick, but I first want to see if it's already been done. Faster is better too, as I have thousands of images, so something like Photoshop is probably not a good solution since it might take a couple of seconds per image.
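
    A minimal sketch of the bash-plus-ImageMagick route, assuming ImageMagick's convert is installed and using placeholder SRC/DST paths:

        SRC=/path/to/originals      # hypothetical source root
        DST=/path/to/resized        # hypothetical destination root
        # recreate the directory layout, then shrink every JPG into it
        find "$SRC" -type d | while read -r d; do
            mkdir -p "$DST${d#"$SRC"}"
        done
        find "$SRC" -type f -iname '*.jpg' | while read -r f; do
            convert "$f" -resize '2000x1500>' -quality 85 "$DST${f#"$SRC"}"
        done

    The '>' in the geometry tells convert to only shrink images larger than 2000x1500 rather than enlarging smaller ones.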

    Read the article

  • Outgoing mail from linux not being delivered

    - by Jason
    I can't seem to send mail through my PHP scripts or through the Linux console on my CentOS 5.5 LAMP server when the email is addressed to a domain that is hosted by my box. I think it is something to do with the email routing internally, or the DNS servers that the box uses not reporting the correct MX records. Basically my box doesn't host any mail; it's all hosted on Google Apps. My name servers are hosted by a 3rd-party provider and I am using Webmin, which doesn't recognise the settings at the 3rd-party provider. I'm unsure how to fix this. Previously, when I had this problem on a cPanel server, I would edit the remotedomains and localdomains files, moving domains from one file to the other, and it would fix the problem. What information do I need to provide for anyone to work out what the issue is? Thanks
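
    A sketch of the usual culprit when the MTA is Postfix (an assumption; the same idea applies to other MTAs): a domain listed in mydestination is delivered locally rather than routed by MX, which is roughly what cPanel's localdomains file controls. With example.com standing in for the Google Apps domain:

        # show which domains the box considers local
        postconf mydestination
        # if example.com is listed, remove it so Postfix routes it via MX instead
        postconf -e "mydestination = localhost.localdomain, localhost"
        postfix reload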

    Read the article

  • How to make ssh/rsync/etc use a VLAN network interface?

    - by Annan
    A company I work for has a number of virtual servers with ElasticHosts. They are set up in such a way that eth1 is on a private VLAN connecting them to each other, so that backups sent between servers are not charged at the same rate as external data transfer. My understanding of how VLANs and network interfaces work is sketchy at best. How can I make ssh, rsync, etc. transfer data through the VLAN? My final solution, after spending a while trying to figure this out: on all servers involved, edit /etc/sysconfig/network-scripts/ifcfg-eth1:

        DEVICE=eth1
        BOOTPROTO=static
        ONBOOT=yes
        HWADDR=YOUR_MAC_ADDR
        IPADDR=192.168.0.100
        NETMASK=255.255.255.0

    where HWADDR should already be set and the last octet of IPADDR should differ between servers. Then run /etc/init.d/network restart on all servers. After this, the addresses specified by IPADDR can be used directly like any other IP address.
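
    Once the interfaces are up, pointing ssh or rsync at the peer's VLAN address is enough to keep the traffic on the private network (a sketch; 192.168.0.101 is a hypothetical second server):

        # copy backups over the private VLAN instead of the public interface
        rsync -avz /var/backups/ 192.168.0.101:/var/backups/
        ssh user@192.168.0.101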

    Read the article

  • List of MD/RAID/LVM devices: how to mount them without any further information available?

    - by Jens
    Hello experts, I do not have much Linux experience. I installed a system two years ago that I now had to reboot, but it seems I did not automate everything with start-up scripts... My problem: I am missing some mount points. I have a list of my RAIDs (excerpt):

        md3 : active (auto-read-only) raid1 sda6[0] sdb6[1]
              97659008 blocks [2/2] [UU]
        md4 : active (auto-read-only) raid1 sda7[0] sdb7[1]
              250099776 blocks [2/2] [UU]

    and it seems md3 and md4 are NOT mounted. However, I do NOT have any entries for them in the fstab file. What should I do next? I do NOT know which filesystem they have (most likely ext3). Can I safely try to mount them with (mount -t ext3 /dev/md3 /mnt/mymntpoint), or will that lead to corrupted data in case they are not ext3? The goal is to remount these devices again, but I do not know anything about them anymore... Thank you very much, Jens
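
    A cautious sketch for the next step: ask the kernel what is on the arrays before mounting, and mount read-only first to minimise the chance of changing anything while you inspect the contents:

        # report the filesystem type without mounting
        blkid /dev/md3 /dev/md4
        file -s /dev/md3
        # mount read-only; drop the ro option once the contents look right
        mkdir -p /mnt/md3
        mount -o ro /dev/md3 /mnt/md3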

    Read the article

  • Monit won't start/stop any processes

    - by Vaughan Magnusson
    Hi, I've got Monit running on a Linux vserver, installed in a custom location (/home/user/bin/monit), as that is the only suitable location according to the web host provider. When I installed Monit I used ./configure --prefix=/home/user. Monit itself runs, sends me emails about its activity, and the control file syntax is correct. However, Monit cannot seem to start or stop anything, or even run the simplest of scripts. E.g., using 'monit stop all', I try to run the following stop command:

        stop = "/bin/bash /home/user/simple_script.sh"

    which fails (and says so in the log). I can't figure out why this is failing; can anyone help with this?
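
    For comparison, a minimal control-file stanza of the shape Monit expects (all paths hypothetical). Monit runs start/stop programs outside the user's login environment, so both the interpreter and everything inside the script should use absolute paths:

        check process myapp with pidfile /home/user/run/myapp.pid
            start program = "/bin/bash /home/user/bin/start_myapp.sh"
            stop program  = "/bin/bash /home/user/bin/stop_myapp.sh"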

    Read the article

  • Extract attachments from an mbox through MIME

    - by Simeon
    I am a little bit frustrated. I'm working on a project with the aim of building a system which automatically prints the e-mail attachments of incoming mails (an "e-mail to print" system). I have already set up an e-mail server (exim4) which receives e-mail perfectly and stores it in an mbox in /var/mail/. Now I want to extract the attachments out of the mbox file through MIME, back to the original .PDF, .DOC, .JPG, .GIF, ... files, and save them in a directory from where they get printed. After the attachments have been extracted, they should be deleted so they don't get extracted again. But how can I get this to work? I am not a coder, so I looked for existing scripts and programs but found nothing to work with. Could anyone give me a little help? I would be very thankful! Thanks, Simeon
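
    One possible sketch, assuming the procmail and mpack packages are available (formail splits an mbox into individual messages, munpack decodes each message's MIME attachments); the mailbox and output paths are examples only:

        OUT=/var/spool/printjobs
        mkdir -p "$OUT"
        # decode every message's attachments into $OUT
        formail -s sh -c "cd '$OUT' && munpack" < /var/mail/print
        # truncate the mbox so the same attachments are not picked up twice
        : > /var/mail/print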

    Read the article

  • Transferring websites from x64 to x86 server

    - by Ke
    Hi, I run an x64 staging server here with Solr, Java, etc. However, I am about to get a Linode VPS for production, and I'm quickly realising that x86 is the way to go for their lowest-RAM package (thinking to upgrade later). My staging server is x64 with 12 GB of RAM, so going down to 300 MB of RAM is going to feel devilishly slow. Here are my questions: 1) Will I have problems transferring my scripts, DBs, etc. (e.g. Solr indexes) from an x64 to an x86 server? 2) Is it worth going for the x86 package? I will probably upgrade later down the line, and x64 might be better for servers with more RAM; or should I stick with x64, since there isn't much difference when using low RAM? Cheers, Ke

    Read the article

  • Robustly disabling specific cron.{hourly,daily,weekly} script

    - by benizi
    On various systems that I administer, there are cron scripts that get run via the commonly-used /etc/cron.{hourly,daily,weekly} layout. What I want to know is whether there's any common 'disable this script' functionality. Obviously, simply deleting something out of a given directory will disable it, but I'm looking for a more permanent solution. Deleting /etc/cron.daily/slocate will work to disable the nightly updatedb on my home machine (where I never use slocate), but next time I upgrade the slocate package, I'm pretty sure it'll reappear. The two distributions I'm most interested in are Gentoo and OpenSUSE, but I'm hoping there's a widely-implemented mechanism. Both distros as I have them use vixie-cron (not sure it matters).
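
    One avenue to test (a sketch, and distribution-dependent): where the cron.{hourly,daily,weekly} directories are processed with run-parts, clearing a script's execute bit disables it without deleting it; whether a later package upgrade restores the bit depends on how the package marks that file:

        chmod -x /etc/cron.daily/slocate
        # run-parts --test lists what would still be executed
        run-parts --test /etc/cron.daily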

    Read the article

  • Export and import a PostgreSQL database with a different name?

    - by J. Pablo Fernández
    Is there a way to export a PostgreSQL database and later import it with another name? I'm using PostgreSQL with Rails, and I often export the data from production, where the database is called blah_production, and import it on development or staging with the names blah_development and blah_staging. On MySQL this is trivial, as the export doesn't mention the database anywhere (except perhaps in a comment), but on PostgreSQL it seems to be impossible. Is it? I've seen some people out there using sed scripts to modify the dump. I'd like to avoid that solution, but if there is no alternative I'll take it. Has anybody written a script to alter the dump's database name that ensures no data is ever altered?
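
    For what it's worth, a dump made without pg_dump's -C/--create option does not mention the database name at all, so it can be restored into a database of any name; a sketch:

        # on production
        pg_dump blah_production > blah.sql
        # on development: create an empty database under the new name, then restore into it
        createdb blah_development
        psql -d blah_development -f blah.sql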

    Read the article

  • Run script when shutting down Ubuntu before the logged-in user is logged out

    - by Travis
    I'm writing a script to back up some local directories on a Unix machine (Ubuntu) to a Samba drive. The script works fine, and I've got it running at shutdown and restart using the method described at http://en.kioskea.net/faq/3348-ubuntu-executing-a-script-at-startup-and-shutdown, which works by placing the backup script into the /etc/rc6.d and /etc/rc0.d directories. However, there is a problem. Looking at the script's logfile, it seems to be run after the user is logged out. We are using LDAP authentication, and once the user logs out, the system cannot back up to their Samba share. Does anyone know of any way to run the script before the user is logged out?
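
    One direction to explore (a sketch, not a tested fix): sysvinit runs the K-scripts in /etc/rc0.d and /etc/rc6.d in lexical order, so a link with a lower number than the display manager's kill link runs earlier in the shutdown sequence, while the session may still be valid. The script name below is hypothetical:

        ls /etc/rc6.d/                 # check what numbers gdm and networking are killed at
        ln -sf /etc/init.d/backup-to-samba /etc/rc0.d/K01backup-to-samba
        ln -sf /etc/init.d/backup-to-samba /etc/rc6.d/K01backup-to-samba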

    Read the article

  • Script to run a script

    - by user31568
    Hello everyone. There is a script:

        Dim WSHShell, WinDir, Value, wshProcEnv, fso, Spath
        Set WSHShell = CreateObject("WScript.Shell")
        Dim objFSO, objFileCopy
        Dim strFilePath, strDestination
        Const OverwriteExisting = True
        Set objFSO = CreateObject("Scripting.FileSystemObject")
        Set windir = objFSO.getspecialfolder(0)
        objFSO.CopyFile "\\dv.rt.ru\SYSVOL\DV.RT.RU\scripts\shutdown.vbs", windir & "\", OverwriteExisting
        strComputer = "."
        Set objWMIService = GetObject("winmgmts:" _
            & "{impersonationLevel=impersonate}!\\" _
            & strComputer & "\root\cimv2")
        JobID = "1"
        Set colScheduledJobs = objWMIService.ExecQuery _
            ("Select * from Win32_ScheduledJob")
        For Each objJob in colScheduledJobs
            objJob.Delete
        Next
        Set objNewJob = objWMIService.Get("Win32_ScheduledJob")
        errJobCreate = objNewJob.Create _
            (windir & "\shutdown.vbs", "**093000.000000+660", _
            True, 1 OR 2 OR 4 OR 8 OR 16 OR 32 OR 64, , True, JobId)

    How can I make shutdown.vbs run not just once at 9:00, but repeatedly from 9:00 to 12:00? Thanks

    Read the article

  • Change Directory Browsing Page in IIS 7.5

    - by Gabriel Ryan Nahmias
    NOTE: This post is tagged ASP Classic, but really that's just one of the languages in which I could write it. I need assistance with configuring IIS 7.5. I have found many scripts and ideas to accomplish this, but I require that it not be a "drop-in" replacement; it must work globally, for any possible directory, from one codebase. Here are several links related to this goal: http://mvolo.com/get-nice-looking-directory-listings-for-your-iis-website-with-directorylistingmodule: the best example of what I want, and the one I can't seem to follow through with. http://www.daleanderson.ca/edb/: an example of a "drop-in" replacement (at least it's oriented toward that purpose); it still has viable code that could be useful as the main file that processes directory traversal.

    Read the article

  • tar: How to create a tar file with arbitrary leading directories w/o 'cd'ing to parent dir

    - by Yan
    Say I have a directory of files at /home/user1/dir1 and I want to create a tar with only "dir1" as the leading directory:

        /dir1/file1
        /dir1/file2

    I know I can first cd to the parent directory:

        cd /home/user1/
        tar czvf dir1.tar.gz dir1

    But when writing scripts, jumping from directory to directory isn't always favorable. I am wondering: is there a way to do it with absolute paths without changing the current directory? I know I can always create a tar file with absolute paths inside and use --strip-components when extracting, but sometimes the extra path names are private information that you don't want to distribute with your tar files. Thanks!
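
    GNU tar's -C option changes into a directory before archiving, which gives exactly that layout without a cd in the script:

        # archive /home/user1/dir1 so the archive contains only dir1/...
        tar czvf /tmp/dir1.tar.gz -C /home/user1 dir1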

    Read the article

  • How to share files between cPanel accounts?

    - by Darren
    I am setting up a multi-site/multi-store Magento installation, and I want each site to have its own cPanel account so I can set up SSL and a dedicated IP properly. I have created a Linux group called 'magento', changed the group of the files I need to share to it, and added the users to that group; however, when I try to access the files through my scripts on those accounts, it doesn't acknowledge that the files exist. I first made a soft symbolic link, which didn't work, and then tried including them via their real location, but that didn't work either. Am I missing a step in controlling which users can access which files? As I said, I added the users to the magento group and changed the group of the files I need to share, but it's still not working. Thanks, Darren
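
    A sketch of the permission details that commonly get missed (account and path names are hypothetical): every directory on the way down to the shared files needs group execute permission so the other account can traverse it, and cPanel home directories are usually created without it:

        chmod g+x /home/store1                          # allow traversal into the owning account's home
        chgrp -R magento /home/store1/public_html/shared
        chmod -R g+rX /home/store1/public_html/shared   # group can read files and enter directories
        chmod g+s /home/store1/public_html/shared       # new files inherit the magento group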

    Read the article

  • What is AddType application/x-httpd-php-source

    - by egor
    I have Apache 2.0, PHP 5.2.4, and this directive in httpd.conf: AddType application/x-httpd-php-source .php .php3 .php4 .php5 .php6. The AddType directive is used to map the given filename extensions onto the specified content type; that is its only meaning. So why does this line switch off the PHP handler that was assigned to .php extensions, letting me view the source code of scripts in my browser? And another: AddType application/x-httpd-php5 .php. Why does this line switch the PHP handler on? It should simply send the header "Content-Type: application/x-httpd-php5" to my browser, and that should be the only effect of mod_mime's AddType directive. I'm confused. Thanks for your replies.
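
    For context: mod_php hooks into those two specific content types, so AddType does double duty here; x-httpd-php makes mod_php execute the file, while x-httpd-php-source makes it run the file through the syntax highlighter instead of executing it. The conventional arrangement looks roughly like:

        AddType application/x-httpd-php .php
        # show highlighted source only for a dedicated extension
        AddType application/x-httpd-php-source .phps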

    Read the article

  • Redirecting to a different exe for download based on user agent

    - by Ra
    I own a Linux/Apache site where I host .exe files for download. When a user clicks this link to my site (published on another site): http://mysite.com/downloads/file.exe, I need to dynamically check their user agent and redirect them to either http://mysite.com/downloads/file-1.exe or http://mysite.com/downloads/file-2.exe. It seems to me that I have two options: 1) use a .htaccess file stating that .exe files should be treated as scripts, then write a script that checks the user agent and redirects to a real exe placed in another folder, and call this script file.exe; or 2) use Apache mod_rewrite to point file.exe to redirect.php. Which of these is better? Any other considerations? Thanks.
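
    A third variant that avoids serving the exe through a script at all is to let mod_rewrite pick the file directly from the user agent; a sketch for a .htaccess in the document root (the MSIE pattern is only a placeholder for whatever actually distinguishes the two downloads):

        RewriteEngine On
        # send one build to browsers whose UA matches the placeholder pattern
        RewriteCond %{HTTP_USER_AGENT} MSIE
        RewriteRule ^downloads/file\.exe$ /downloads/file-1.exe [L]
        # everyone else gets the other build
        RewriteRule ^downloads/file\.exe$ /downloads/file-2.exe [L]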

    Read the article

  • MySQL on a laptop for remote workers - MyISAM keeps corrupting

    - by Jonathon
    We have an application that is used by remote, mobile workers. It installs WAMP (Server2Go) on a laptop and uses MySQL to store data locally. All tables are MyISAM. Once a day, the workers sync the database to our central server via HTTP scripts that query the data and post it to our site. The problem is that many of these laptop database tables are corrupting continually. It appears that MySQL acts like it saves the information (I don't get any query errors), but the table becomes corrupt. I have to repair the tables constantly (which removes several rows of data in the process). Does anyone have any ideas about how to work around this problem? Would it be wise to switch to InnoDB on the laptops? How about a different database system altogether? I have looked at MySQL Embedded, but it appears to be the same engine as regular MySQL.
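
    If InnoDB turns out to be the answer, the switch is a per-table ALTER rather than a reinstall; a sketch with placeholder database and table names:

        mysql -e "ALTER TABLE orders ENGINE=InnoDB;" fieldsales
        # list the engines currently in use to see what is left to convert
        mysql -e "SELECT TABLE_NAME, ENGINE FROM information_schema.TABLES WHERE TABLE_SCHEMA='fieldsales';"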

    Read the article

  • System information shown when booting Debian

    - by WebDevHobo
    When booting Debian, you'll see it printing a lot of information about system variables and such. I don't really need to see all that, so I'd like to modify some scripts to make sure that on boot it just does what it has to do, without printing it all on the screen. Just something I fancy. Of course, still seeing errors would be nice, but that long slur of text I could do without. I've tried looking it up, but I can't find documentation on this specific thing anywhere.
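
    Two settings usually control that output on Debian of this vintage (a sketch; treat both as things to verify rather than a recipe): the quiet kernel parameter silences the kernel's own messages, and VERBOSE in /etc/default/rcS tones down the init scripts:

        # /boot/grub/menu.lst (GRUB legacy) - append "quiet" to the kernel line, e.g.
        #   kernel /boot/vmlinuz-2.6.26-2-686 root=/dev/sda1 ro quiet
        # /etc/default/rcS - reduce init script chatter
        #   VERBOSE=no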

    Read the article

  • How do I run a beanshell script on my Mac?

    - by jonalv
    My Ubuntu friend told me to test-run a BeanShell script by doing bsh <filename>, and when I told him that I don't know what bsh is, nor have it, he told me to sudo apt-get install bsh. Being on a Mac, I instead ran sudo port install beanshell, but there is still no bsh command available. A listing of the package contents revealed a jar file named /opt/local/share/java/bsh.jar, but when I try to run that with my script file, a complete window manager written in Java starts up (and does not run the script file, btw). Now, clearly I am doing something wrong; I am sure there must be a way of running BeanShell scripts in a Mac terminal, although it does seem more natural for Linux users. What am I doing wrong, and what should I do to run that script?
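
    Launching the jar directly starts BeanShell's graphical desktop, which is the "window manager" being seen; invoking the interpreter class runs a script instead (a sketch, using the MacPorts path from the question and a placeholder script name):

        java -cp /opt/local/share/java/bsh.jar bsh.Interpreter myscript.bsh
        # optional: provide a bsh command like the Ubuntu package does
        alias bsh='java -cp /opt/local/share/java/bsh.jar bsh.Interpreter'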

    Read the article

  • How do I create a Launcher in Ubuntu 9.10 that runs a shell script?

    - by mkelley33
    Here's my situation: I'm new to Ubuntu (just installed 9.10 Karmic Koala, 64-bit). My purpose is to easily run PyCharm without too much typing (i.e. cd ... then ./pycharm.sh), so I want to create a desktop launcher instead of opening a terminal and typing (without resorting to the "Run in Terminal" option). I tried to create a launcher that executes the .sh script in my Documents directory: right-clicked the desktop, chose Create Launcher, then (a) Type == Application, Browse [insert absolute path to .sh script]: no luck; (b) Type == Application in Terminal, Browse: ditto. I'm open to any other alternatives that involve as little typing as possible. I would like to just start Ubuntu, click launcher icons, and have terminals spring to life running the intended scripts. Crazy? No. Lazy? Probably. Productive? Hopefully :)
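
    A sketch of a launcher command that usually works for this kind of script, since it needs to run from its own directory (the path is a placeholder, and the script may also need chmod +x first):

        # command to enter in the launcher, with Type set to "Application in Terminal"
        bash -c 'cd /home/me/pycharm-1.0/bin && ./pycharm.sh'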

    Read the article

  • Drupal + Lighttpd: enabling clean urls (rewriting)

    - by Patrick
    I'm emulating Ubuntu on my Mac, and I use it as a server. I've installed lighttpd + Drupal, and the following configuration section requires a domain name in order to make clean URLs work. Since I'm using a local server I don't have a domain name, and I was wondering how to make it work given that the IP of the local machine keeps changing. Thanks.

        $HTTP["host"] =~ "(^|\.)mywebsite\.com" {
            server.document-root = "/var/www/sites/mywebsite"
            server.errorlog = "/var/log/lighttpd/mywebsite/error.log"
            server.name = "mywebsite.com"
            accesslog.filename = "/var/log/lighttpd/mywebsite/access.log"
            include_shell "./drupal-lua-conf.sh mywebsite.com"
            url.access-deny += ( "~", ".inc", ".engine", ".install", ".info", ".module", ".sh", "sql", ".theme", ".tpl.php", ".xtmpl", "Entries", "Repository", "Root" )
            # "Fix" for Drupal SA-2006-006, requires lighttpd 1.4.13 or above
            # Only serve .php files of the drupal base directory
            $HTTP["url"] =~ "^/.*/.*\.php$" {
                fastcgi.server = ()
                url.access-deny = ("")
            }
            magnet.attract-physical-path-to = ("/etc/lighttpd/drupal-lua-scripts/p-.lua")
        }

    Read the article

  • Reliable Backup Solution for Linux for Complete System Restoration

    - by Chris S
    What's the best backup solution for Linux that can completely restore the entire filesystem to a blank harddrive (including partitioning) after an old harddrive dies? I'm currently running a few Ubuntu machines, some with RAID-1 and others without RAID (mostly laptops). I'd like to implement a backup solution that can take incremental snapshots of the entire filesystem, so that if I were to replace all the harddrives in a machine, I could use the backup to restore a perfect copy of the previous filesystem. Unfortunately, nearly all the backup solutions I've found seem to be glorified rsync scripts, which only backup some files, and have no easy way to restore once the entire filesystem is gone. Some of the more complicated solutions, like Bacula, might do what I need, but require a complicated server/client setup and are notoriously difficult to maintain. I've heard that Apple's TimeMachine utility has this ability, and I've had similar success taking differential disk images with Acronis True Image on Windows, but of course neither of these work on Linux. Is there anything comparable for Ubuntu?

    Read the article

  • Cannot run CMD script from Vista Windows Explorer

    - by jamesvista
    I am running Vista Home Premium. I tried to write a script to do some simple automation... it does not work! Even the simplest script, like:

        @echo ON
        dir .

    does not get executed; only an empty CMD shell pops open when it is started from Explorer. From the cmd window itself there is no problem. This is really weird and I have never seen it before (and I have written many CMD scripts). The ftype entries for cmdfile and batfile are unchanged from "%1" %*, and a virus scan found no problems. Is there a policy setting that might have changed? Any ideas?

    Read the article

  • Active DFS node did not restore after failure

    - by Mark Henderson
    On Tuesday we had a Server 2008 R2 DFS-R node go offline unexpectedly. DFS did the right thing and started routing requests to a different node, which was in a remote site. This is by design: even though it's slow, at least it's still working. We had the local DFS-R node back online within an hour, and it had synced all its changes 10 minutes after that. Three of the five terminal servers reset themselves to the local DFS node, but the other two stayed pointing at the remote DFS node for three days, until someone finally piped up about how slow requests were. What reasons could there be for some, but not all, of the servers reverting? Is the currently active DFS node for a namespace exposed anywhere in the OS (WMI, or even scripts) so that we can monitor the active nodes?

    Read the article

  • What are good CLI tools for JSON?

    - by jasonmp85
    General Problem: Though I may be diagnosing the root cause of an event, determining how many users it affected, or distilling timing logs in order to assess the performance and throughput impact of a recent code change, my tools stay the same: grep, awk, sed, tr, uniq, sort, zcat, tail, head, join, and split. To glue them all together, Unix gives us pipes, and for fancier filtering we have xargs. If these fail me, there's always perl -e. These tools are perfect for processing CSV files, tab-delimited files, log files with a predictable line format, or files with comma-separated key-value pairs. In other words, files where each line has next to no context.

    XML Analogues: I recently needed to trawl through gigabytes of XML to build a histogram of usage by user. This was easy enough with the tools I had, but for more complicated queries the normal approaches break down. Say I have files with items like this:

        <foo user="me">
            <baz key="zoidberg" value="squid" />
            <baz key="leela" value="cyclops" />
            <baz key="fry" value="rube" />
        </foo>

    And let's say I want to produce a mapping from user to average number of <baz>s per <foo>. Processing line-by-line is no longer an option: I need to know which user's <foo> I'm currently inspecting so I know whose average to update. Any sort of Unix one-liner that accomplishes this task is likely to be inscrutable. Fortunately in XML-land, we have wonderful technologies like XPath, XQuery, and XSLT to help us. Previously, I had gotten accustomed to using the wonderful XML::XPath Perl module to accomplish queries like the one above, but after finding a TextMate plugin that could run an XPath expression against my current window, I stopped writing one-off Perl scripts to query XML. And I just found out about XMLStarlet, which is installing as I type this and which I look forward to using in the future.

    JSON Solutions? So this leads me to my question: are there any tools like this for JSON? It's only a matter of time before some investigation task requires me to do similar queries on JSON files, and without tools like XPath and XSLT, such a task will be a lot harder. If I had a bunch of JSON that looked like this:

        {
            "firstName": "Bender",
            "lastName": "Robot",
            "age": 200,
            "address": {
                "streetAddress": "123",
                "city": "New York",
                "state": "NY",
                "postalCode": "1729"
            },
            "phoneNumber": [
                { "type": "home", "number": "666 555-1234" },
                { "type": "fax", "number": "666 555-4567" }
            ]
        }

    and wanted to find the average number of phone numbers each person had, I could do something like this with XPath:

        fn:avg(/fn:count(phoneNumber))

    Questions: Are there any command-line tools that can "query" JSON files in this way? If you have to process a bunch of JSON files on a Unix command line, what tools do you use? Heck, is there even work being done to make a query language like this for JSON? If you do use tools like this in your day-to-day work, what do you like/dislike about them? Are there any gotchas? I'm noticing more and more data serialization is being done using JSON, so processing tools like this will be crucial when analyzing large data dumps in the future. Language libraries for JSON are very strong, and it's easy enough to write scripts to do this sort of processing, but to really let people play around with the data, shell tools are needed.

    Related Questions: Grep and Sed Equivalent for XML Command Line Processing; Is there a query language for JSON?; JSONPath or other XPath-like utility for JSON/Javascript; or jQuery JSON
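
    As a present-day footnote, jq has since become the de-facto command-line query tool for exactly this; a sketch of the phone-number average, assuming a file people.json containing an array of objects shaped like the example above:

        # average number of phone numbers per person
        jq '[ .[] | .phoneNumber | length ] | add / length' people.json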

    Read the article
