Search Results

Search found 23627 results on 946 pages for 'alter script'.


  • System-wide proxy settings on a Windows network with a password

    - by sav
    I'm using Ubuntu on a Windows network and I want to connect to the world wide web. I followed the steps here, which I found very useful. However, when I try to ping a website (e.g. ping www.wikipedia.org) I get no reply. I can ping local computers on my network, but I need to go through our proxy to reach the web. I can even browse Wikipedia using Firefox; I just needed to enter the proxy configuration script location and my username and password. I'm quite sure the reason I'm having this trouble is that I haven't entered a username and password system-wide, and I'm not sure how to do that. Ultimately I would like to be able to use package managers like Synaptic, but first they need to be able to connect to the internet. EDIT: As suggested, I created an /etc/apt/apt.conf file like

        Acquire::http::Proxy "http://chrisav:[email protected]:8080";
        Acquire::https::Proxy "https://chrisav:[email protected]:8080";
        Acquire::ftp::Proxy "ftp://chrisav:[email protected]:8080";
        Acquire::socks::Proxy "socks://chrisav:[email protected]:8080";

    However I still can't ping Wikipedia, and when I try installing stuff I get

        chris@chris-Ubuntu:~$ sudo apt-get install kate
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package kate
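
    Apt only reads apt.conf; most other command-line tools (and anything started from a login shell) look at the standard proxy environment variables instead. A minimal sketch of a system-wide setting via /etc/environment, with USER, PASSWORD and PROXYHOST as placeholders for the values already used in the apt.conf above:

        # /etc/environment  (log out and back in, or reboot, for the change to take effect)
        http_proxy="http://USER:PASSWORD@PROXYHOST:8080"
        https_proxy="http://USER:PASSWORD@PROXYHOST:8080"
        ftp_proxy="ftp://USER:PASSWORD@PROXYHOST:8080"
        no_proxy="localhost,127.0.0.1"

    Two side notes: ping uses ICMP, which never goes through an HTTP proxy, so ping www.wikipedia.org can keep failing even once apt and Firefox work; and "Unable to locate package" usually means the package lists have not yet been refreshed through the proxy (sudo apt-get update) or the repository carrying the package is not enabled.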

    Read the article

  • Easy to use JSON Web Service Hosts?

    - by Serguei Fedorov
    I saw this being used by someone in a college class once and cannot find anything analogous to it. I am not sure if this is the right place to ask about something like this, but hopefully I can get some direction. I want to write an app that uses web services which the client apps can pull data from and push data back to. Right now I am gathering up the design and documentation of this app. Not having to code the web service myself would reduce development time by a lot; instead I would use a hosted web service that is easy to set up and manage. Either XML-based or JSON-based is totally fine, though I would prefer JSON for its reduced overhead. Like I said, I have seen this demonstrated before: you define the data structure to be stored and how it is treated. I cannot find the person who demonstrated this; hopefully someone can suggest something? The service he used was free, with a limited number of requests allowed. EDIT: He was using an online service to do this, not a script installed onto an existing web hosting account. Thank you!

    Read the article

  • Change Windows Service Priority

    - by SchlaWiener
    I have a Windows service that needs to run with high priority. At the end of the day I want to use this script to modify the priority after service startup:

        Const HIGH = 256
        strComputer = "."
        strProcess = "BntCapi2.exe"
        Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
        Set colProcesses = objWMIService.ExecQuery _
            ("Select * from Win32_Process Where Name = '" & strProcess & "'")
        For Each objProcess in colProcesses
            objProcess.SetPriority(HIGH)
        Next

    But currently I am not able to change the priority, even with Task Manager. Task Manager throws an "Access Denied" error, although I am logged on as administrator and I changed the user account of the service to administrator, too. I still get the "access denied" message when trying to change the priority. Any ideas what permission I need to do that?

    Read the article

  • sendmail on Ubuntu won't send from www-data user

    - by bumperbox
    If I call the mail() function in PHP from the webserver (running as www-data) I get an error sending email. If I call the same script from the command line logged in as root, it works. If I switch user to www-data and run it from the command line I get this error message (repeated once per message, each attempt ending in FAILED):

        WARNING: RunAsUser for MSP ignored, check group ids (egid=33, want=107)
        can not chdir(/var/spool/mqueue-client/): Permission denied
        Program mode requires special privileges, e.g., root or TrustedUser.

    I am guessing I need to change something in the sendmail configuration. I have googled for solutions but have ended up more confused. Can someone let me know what configuration I need to change so I can send from the www-data user?
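
    The egid/want numbers in that warning are the clue: 33 is www-data, and the group sendmail wants (107 here) is normally smmsp, the group the sendmail submission (MSP) binary should run setgid as. A diagnostic sketch under that assumption; the paths are the usual Debian/Ubuntu ones and may differ on your box:

        getent group 107                                    # confirm which group sendmail expects (likely smmsp)
        ls -l /usr/sbin/sendmail /usr/lib/sm.bin/sendmail   # the real MSP binary should be setgid smmsp
        ls -ld /var/spool/mqueue-client/                    # should be owned smmsp:smmsp

    If the setgid bit or the spool ownership has been lost, restoring it (for example by re-running sendmailconfig or reinstalling the sendmail packages, which reset these permissions) is usually enough; www-data itself should not normally need to be added to any group.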

    Read the article

  • How to get Bash shell history range

    - by Aniti
    How can I get/filter history entries in a specific range? I have a large history file and frequently use history | grep somecommand. Now, my memory is pretty bad and I also want to see what else I did around the time I entered the command. For now I do this: get a match, say 4992 somecommand, then run history | grep 49[0-9][0-9]. This is usually good enough, but I would much rather do it more precisely, that is, see commands 4972 to 5012: 20 commands before and 20 after. I am wondering if there is an easier way? I suspect a custom script is in order, but perhaps someone else has done something similar before.
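
    Bash's fc builtin can list an explicit range of history entries by number, so one option is a small function in ~/.bashrc that finds the last match and prints a window around it. A sketch (histgrep is a made-up name):

        # usage: histgrep somecommand   -- show the 20 history entries before and after the last match
        histgrep() {
            local n
            # last matching entry number, ignoring invocations of histgrep itself
            n=$(history | grep "$1" | grep -v 'histgrep' | tail -n 1 | awk '{print $1}')
            [ -n "$n" ] && fc -l $((n - 20)) $((n + 20))
        }

    One caveat: fc complains if the end of the range lies beyond the most recent entry, so for a very recent match you may need to cap the upper bound at the current history number.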

    Read the article

  • SSH connection dropping immediately after login

    - by kappa
    I've set up a connection with autossh that creates some tunnels at system startup, but if I try to connect, the connection drops right after a successful login (with an RSA key). Here is a trace:

        debug1: Authentication succeeded (publickey).
        debug1: Remote connections from LOCALHOST:5006 forwarded to local address localhost:22
        debug1: Remote connections from LOCALHOST:6006 forwarded to local address localhost:80
        debug1: channel 0: new [client-session]
        debug1: Requesting [email protected]
        debug1: Entering interactive session.
        debug1: remote forward success for: listen 5006, connect localhost:22
        debug1: remote forward success for: listen 6006, connect localhost:80
        debug1: All remote forwarding requests processed
        debug1: Sending environment.
        debug1: Sending env LANG = it_IT.UTF-8
        debug1: Sending env LC_CTYPE = en_US.UTF-8
        debug1: client_input_channel_req: channel 0 rtype exit-status reply 0
        debug1: client_input_channel_req: channel 0 rtype [email protected] reply 0
        debug1: channel 0: free: client-session, nchannels 1
        Transferred: sent 2400, received 2312 bytes, in 1.3 seconds
        Bytes per second: sent 1904.2, received 1834.4
        debug1: Exit status 1

    What can be the problem? All this stuff is managed by a script already running on another machine (creating reverse tunnels to the same machine but with different ports).
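
    The trace ends with the remote side returning exit status 1 as soon as the session opens, which is what you typically see when ssh starts a remote shell or command that exits immediately. For a tunnels-only connection the usual cure is -N (request no remote command) plus keepalives instead of autossh's monitor port; a sketch, with the ports taken from the trace above and user@remotehost as a placeholder:

        autossh -M 0 -f -N \
            -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" \
            -R 5006:localhost:22 \
            -R 6006:localhost:80 \
            user@remotehost

    With -N there is nothing on the remote side that can exit and take the session down, and the ServerAlive options let ssh (and therefore autossh) notice a dead connection and rebuild it.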

    Read the article

  • How should I determine if a user is logged in graphically while lightdm is running?

    - by Jack
    I want to know if someone is logged into a local X session. In the past I looked at the output of ck-list-sessions, which looked something like this:

        Session12:
            unix-user = '[redacted]'
            realname = '[redacted]'
            seat = 'Seat1'
            session-type = ''
            active = TRUE
            x11-display = ':0'
            x11-display-device = '/dev/tty8'
            display-device = ''
            remote-host-name = ''
            is-local = TRUE
            on-since = '2012-10-22T18:17:55.553236Z'
            login-session-id = '4294967295'

    If no one was logged in, there was no output. I checked if someone was logged in with

        ck_result" string => execresult("/usr/bin/ck-list-sessions | /bin/grep x11 | /usr/bin/cut --delimiter=\\' -f 2 | /usr/bin/wc -w

    This no longer works, because the lightdm greeter looks like a logged-in user:

        Session12:
            unix-user = '[redacted]'
            realname = 'Light Display Manager'
            seat = 'Seat1'
            session-type = 'LoginWindow'
            active = TRUE
            x11-display = ':0'
            x11-display-device = '/dev/tty8'
            display-device = ''
            remote-host-name = ''
            is-local = TRUE
            on-since = '2012-10-22T22:17:55.553236Z'
            login-session-id = '4294967295'

    I guess I could check session-type, but I don't know how to do that and check x11-display in a one-liner. I could write my own script, but before doing that I thought I would check whether anyone else has already done the work, whether there is a way to get ConsoleKit to tell me what I want, or whether I should be using a different tool.
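
    Going by the two outputs above, the greeter is the only session whose session-type is 'LoginWindow' while a real session leaves it empty, so a grep on that field is enough for a one-liner; a sketch under that assumption:

        # count ConsoleKit sessions that are not the LightDM greeter
        if [ "$(ck-list-sessions | grep -c "session-type = ''")" -gt 0 ]; then
            echo "someone is logged in graphically"
        fi

    This only checks the session-type field; tying it to x11-display within the same session block would need awk over the per-session blocks rather than plain grep.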

    Read the article

  • Postfix Vacation.pl with local users

    - by Simiyu
    Hi, I am trying to set up the vacation.pl script on a mail server that has local users only (since there are only 10 users). I have installed the SquirrelMail plugin and the auto-respond option is available to the users, but when an email is sent to their addresses no auto-reply is sent back to the sender. There are also no logs in the /var/log/vacation folder which I created, nor in the normal log files. Most of the examples online refer to virtual users; can it work with local users, and if so, how? Regards, Arthur

    Read the article

  • Best method(s) to back up VMs running on Hyper-V?

    - by Kara Marfia
    We're in the middle of P2V'ing most of the network, so the current backup method is likely the worst: the backup agent is still installed on the guest OSes, and the backup device is dutifully pulling them onto tape, one file at a time. I suspect there's a clever way to script (PowerShell?) a suspend of the VMs, then a backup of the .vhd files, then a resume of the VMs. This seems like it would provide big speed benefits, while losing file-level restore (so it might be best for things like DCs and app servers). What methods/policies have you hammered out?

    Read the article

  • VPS-like [load] graphs

    - by foober
    I investigated a couple of tools but they were really annoying and not polished. kSar, for example, is supposed to graph sar output, but it doesn't work. There's a Perl script around (sar2rrd) that's supposed to convert sar output to RRD format and generate graphs; it doesn't work either (at least it doesn't like the output of "atsar" as shipped in the Debian/Ubuntu package). I tried Munin, but it wants to mess with HTTP servers, and for some reason it didn't really work either: it displayed errors in the web page generated by the HTTP server it put on port 4949. So, is there a simple install-and-forget tool to generate daily load, CPU, memory and network graphs? It seems strange to me that this problem has not been solved; maybe I'm looking in the wrong places.

    Read the article

  • How to store prices that have effective dates?

    - by lal00
    I have a list of products. Each of them is offered by N providers. Each provider quotes us a price for a specific date. That price is effective until that provider decides to set a new price; in that case, the provider will give the new price with a new date. The MySQL table header currently looks like:

        provider_id, product_id, price, date_price_effective

    Every other day, we compile a list of products/prices that are effective for the current day. For each product, the list contains a sorted list of the providers that have that particular product. That way, we can order certain products from whoever happens to offer the best price. To get the effective prices, I have a SQL statement that returns all rows that have date_price_effective >= NOW(). That result set is processed with a Ruby script that does the sorting and filtering necessary to obtain a file that looks like this:

        product_id_1,provider_1,provider_3,provider8,provider_10...
        product_id_2,provider_3,provider_2,provider1,provider_10...

    This works fine for our purposes, but I still have an itch that a SQL table is probably not the best way to store this kind of information. I have a feeling that this kind of problem has been solved previously in other, more creative ways. Is there a better way to store this information other than in SQL? Or, if using SQL, is there a better approach than the one I'm using?

    Read the article

  • Retrieve malicious IP addresses from Apache logs and block them with iptables

    - by Gabriel Talavera
    I'm trying to keep away some attackers who try to exploit XSS vulnerabilities on my website. I have found that most of the malicious attempts start with a classic "alert(document.cookie);\" test. The site is not vulnerable to XSS, but I want to block the offending IP addresses before they find a real vulnerability, and also keep the logs clean. My first thought is to have a script constantly checking the Apache logs for IP addresses that send that probe and feeding those addresses to an iptables DROP rule, with something like this:

        cat /var/log/httpd/-access_log | grep "alert(document.cookie);" | awk '{print $1}' | uniq

    What would be an effective way to send the output of that command to iptables? Thanks in advance for any input!
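
    One straightforward pattern, sketched below on the assumption that dropping by source address in the INPUT chain is acceptable, is to loop over the unique addresses and add a DROP rule for each one that is not already blocked:

        #!/bin/bash
        # Block every client IP whose requests contain the XSS probe string (run as root).
        # The log path is copied from the question; adjust it to your access log(s).
        LOG=/var/log/httpd/-access_log

        grep "alert(document.cookie);" "$LOG" | awk '{print $1}' | sort -u | while read -r ip; do
            # -C checks whether an identical rule already exists, so reruns don't pile up duplicates
            if ! iptables -C INPUT -s "$ip" -j DROP 2>/dev/null; then
                iptables -A INPUT -s "$ip" -j DROP
            fi
        done

    For more than a handful of addresses, an ipset, or a log-watching tool such as fail2ban that manages the iptables rules for you, scales better than one rule per IP.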

    Read the article

  • Boot to VHD backup plan

    - by Josh Barker
    I have a machine that I just reinstalled Windows and all of my applications onto... what a chore that is. I want to totally and completely avoid this from now on by creating an image. My first thought was to see whether it is possible to copy a VHD file while you are booted into it, since I am using Windows 7 Ultimate as boot-to-VHD (without a parent machine). Is this possible, and if so, how could I accomplish it? Keep in mind, this is my personal machine and I'm trying to keep things inexpensive (a good script would work). Thanks, Josh

    Read the article

  • Retrieve a domain name based on an IP Address?

    - by Neil Kodner
    I'm reviewing some Apache logs, specifically with respect to downloaded files. I'm interested in knowing, if possible, which domain is responsible for a download, given an IP address. I've given nslookup a try and it seems to (mostly) get the job done, but it returns all sorts of extraneous information. Ideally, I pass in an IP and receive a domain back. Before I write a shell script to parse the output of nslookup to capture the domain, I'd like to know if this is the best way of approaching the problem, or if there is a more tried-and-true method. Specifically, I'd like to know whether an address resolves to an amazonaws.com domain. I understand that this might be difficult because EC2 machines are dynamically created and destroyed; I'd like to know if the IP addresses for AWS/EC2/EMR machines fit any sort of addressing pattern.
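
    For a single clean value per address, a reverse lookup with dig +short (or host) avoids parsing nslookup output altogether; a sketch, using a documentation address purely for illustration:

        ip=203.0.113.10                  # example address only
        name=$(dig +short -x "$ip")      # for an EC2 host this is typically ec2-xx-xx-xx-xx.compute-1.amazonaws.com.
        case "$name" in
            *.amazonaws.com.) echo "$ip looks like AWS/EC2" ;;
            *)                echo "$ip -> ${name:-no PTR record}" ;;
        esac

    Since a PTR record can be missing or customized, matching against Amazon's published list of public IP ranges is the more reliable test when you specifically need to identify AWS addresses.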

    Read the article

  • Deciding on a company-wide JavaScript strategy [on hold]

    - by drogon
    Our company is moving most of its software from thick-client WinForms apps to web apps. We are using ASP.NET MVC on the server side. Most of the developers are brand new to the web and need to become efficient and knowledgeable at writing client-side web code (JavaScript). We are deciding on a number of things and would appreciate feedback on the following.

    Angular.js or Backbone.js? Backbone (with Underscore) is certainly more lightweight, but requires more custom development. Angular seems to be a full-fledged framework, but would require everyone to embrace it and probably has a longer learning curve(??). (Note: I know nothing about Angular at this point.)

    Require.js or script includes with MVC bundleconfig? Require.js makes development "feel like" C# (importing namespaces), but integrating the build/minification process can be a pain (especially the configuration). Bundling via MVC requires developers to worry more about which scripts to include, but has less overall development friction.

    TypeScript or JavaScript? Regardless of frameworks, our developers are going to need to learn the basics. TypeScript is more like C# and MAY be easier for C# developers to understand. However, learning TypeScript before JavaScript may hinder their mastery of JavaScript at the expense of efficiency.

    Read the article

  • Help desk software - specific feature needed

    - by LunchMoney
    I am having a hard time finding help desk software that allows for drop-down hyperlink selection during ticket creation. The situation is that we do external support for client systems and connect via RemotelyAnywhere or LogMeIn. Right now we use a poorly modified PHP-based system that has a customer drop-down menu and then a site drop-down list, which is then parsed by a bit of JavaScript that opens a URL. What I am looking for is the ability to store customer site URL information in the database and, during the creation of a ticket, be able to select the customer name and then select the site, thereby placing the corresponding site URL in the ticket. The support tech will then be able to click on this link to access the customer's site. Has anyone used or seen help desk software with this feature?

    Read the article

  • Simple dig output?

    - by knocte
    In a script I want to be able to write an IP address to somewhere easily, so I thought of using dig (or a similar command) in back-ticks. However, the simplest output I've been able to come up with from dig's parameters is

        > dig -t A +noall +answer www.google.com
        www.google.com.    300    IN    A    173.194.66.106
        www.google.com.    300    IN    A    173.194.66.104

    Is there any way (an extra argument, or a different tool instead of dig?) to get rid of the junk apart from the IP address? (And please don't tell me to use sed.) Thanks
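
    dig has a flag for exactly this: +short prints only the answer data, one record per line, which drops straight into back-ticks or $( ). A quick sketch (the two addresses are simply copied from the output above; your resolver will return whatever is current):

        > dig +short -t A www.google.com
        173.194.66.106
        173.194.66.104
        > ip=$(dig +short -t A www.google.com | head -n 1)   # keep just the first address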

    Read the article

  • Error while removing the new kernel 2.6.37

    - by Tarek
    Hi! I tried to install the new kernel but something went wrong, and now I'm trying to remove it. The error message is:

        mhd@Tarek-Laptop:~$ sudo apt-get install -f
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        The following packages will be REMOVED:
          linux-image-2.6.37-020637-generic
        0 upgraded, 0 newly installed, 1 to remove and 9 not upgraded.
        1 not fully installed or removed.
        After this operation, 111MB disk space will be freed.
        Do you want to continue [Y/n]? y
        (Reading database ... 188780 files and directories currently installed.)
        Removing linux-image-2.6.37-020637-generic ...
        Examining /etc/kernel/postrm.d .
        run-parts: executing /etc/kernel/postrm.d/initramfs-tools 2.6.37-020637-generic /boot/vmlinuz-2.6.37-020637-generic
        run-parts: executing /etc/kernel/postrm.d/zz-update-grub 2.6.37-020637-generic /boot/vmlinuz-2.6.37-020637-generic
        /etc/default/grub: 33: Syntax error: EOF in backquote substitution
        run-parts: /etc/kernel/postrm.d/zz-update-grub exited with return code 2
        Failed to process /etc/kernel/postrm.d at /var/lib/dpkg/info/linux-image-2.6.37-020637-generic.postrm line 328.
        dpkg: error processing linux-image-2.6.37-020637-generic (--remove):
         subprocess installed post-removal script returned error exit status 1
        Errors were encountered while processing:
         linux-image-2.6.37-020637-generic
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    The previously unsolved error is described in this bug.
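
    The removal itself is failing because the kernel's post-removal hook runs update-grub, which sources /etc/default/grub and hits the syntax error reported above ("EOF in backquote substitution" at line 33 of that file, i.e. an unterminated backquote or quote). A sketch of how one might confirm and fix that before retrying the removal:

        sh -n /etc/default/grub          # re-parse the file and reprint any shell syntax error
        sudo nano /etc/default/grub      # look around line 33 for an unmatched ` or "
        sudo update-grub                 # should now complete without errors
        sudo apt-get install -f          # retry the pending kernel removal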

    Read the article

  • How do I get around "Access is Denied" [Number: 5 (0x80070005)], with IIS6/FastCGI and PHP 5.2.3?

    - by Evan Carroll
    I'm getting this error with IIS 6.0 (I assume), PHP 5.2.3, and FastCGI:

        FastCGI Error
        The FastCGI Handler was unable to process the request.
        Error Details:
        Error Number: 5 (0x80070005).
        Error Description: Access is denied.
        HTTP Error 500 - Server Error.
        Internet Information Services (IIS)

    Any ideas? There is nothing revealing in the logs (other than 500 errors); this is pretty much all I have to work with. The script has read and execute privileges for the Internet Guest account, and I've added read/execute privileges to the whole of D:\PHP. I followed this tutorial http://learn.iis.net/page.aspx/247/using-fastcgi-to-host-php-applications-on-iis-60/ to set it up. The only major divergence is that I installed PHP to D:\PHP.

    Read the article

  • cURL looking for CA in the wrong place

    - by andrewtweber
    On Red Hat Linux, in a PHP script I am setting cURL options as such:

        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, True);
        curl_setopt($ch, CURLOPT_CAINFO, '/home/andrew/share/cacert.pem');

    Yet I am getting this exception when trying to send data (curl error 77):

        error setting certificate verify locations:
          CAfile: /etc/pki/tls/certs/ca-bundle.crt
          CApath: none

    Why is it looking for the CAfile in /etc/pki/tls/certs/ca-bundle.crt? I don't know where this path is coming from, as I don't set it anywhere. Shouldn't it be looking in the place I specified, /home/andrew/share/cacert.pem? I don't have write permission to /etc/, so simply copying the file there is not an option. Am I missing some other curl option that I should be using? (This is on shared hosting; is it possible that it's disallowing me from setting a different path for the CAfile?)

    Read the article

  • Knife leaves stray processes on my system

    - by Leons
    I'm seeing stray knife processes on my system. I have an automated Ruby script that runs bundle exec knife bootstrap against various nodes. Most of the time the knife process completes and goes away, but sometimes it stays around for days; I notice it days later in ps aux. I think it's related to the target node being down when knife runs. The Chef server timeout is high, so the action completes eventually when the node comes back up, but I think knife may give up or hang somehow during the wait. Is there something I can do about the stray knife processes? Does knife have timeout settings separate from the Chef server's timeout settings?
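
    Independent of knife's own settings, one blunt safeguard is to wrap the invocation in coreutils timeout so a hung bootstrap can never outlive a hard limit; a sketch, with the duration, node address and SSH user as placeholders:

        # send SIGTERM after 30 minutes, then SIGKILL 60 seconds later if knife is still around
        timeout --kill-after=60 30m bundle exec knife bootstrap NODE_ADDRESS -x SSH_USER --sudo
        if [ $? -eq 124 ]; then
            echo "knife bootstrap timed out" >&2
        fi

    timeout exits with status 124 when the limit was hit, which the calling script can use to log the failure or schedule a retry instead of leaving the process behind.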

    Read the article

  • Creating secure multicast with socat

    - by arash
    How can we create secure multicast tunnels with socat? Assume we have a list of IP addresses and CIDR network addresses that we want to create secure tunnels to. I found this:

        socat STDIO UDP4-DATAGRAM:224.1.0.1:6666,range=192.168.10.0/24

    but I want a secure tunnel, and different addresses with their network addresses. I want to create a script that takes the IPs and network addresses and creates the secure tunnels:

        ./myscript IP1 NetAdd1 IP2 NetAdd2 ....

    How can I pass these parameters to socat? Does socat multicast have any limits? Thanks for your help
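
    For the parameter-passing part, a plain shell loop that consumes the arguments two at a time and substitutes them into the socat address is enough. A sketch that reuses the multicast/range form from the command above; it only prints the socat command lines, and the "secure" part (for example socat's OPENSSL address type, or running the whole thing over a VPN) still has to be chosen and filled in:

        #!/bin/sh
        # usage: ./myscript IP1 NET1 [IP2 NET2 ...]   (hypothetical wrapper, not a complete tunnel setup)
        while [ $# -ge 2 ]; do
            ip=$1; net=$2; shift 2
            echo "socat STDIO UDP4-DATAGRAM:$ip:6666,range=$net"
        done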

    Read the article

  • How do I elevate privileges when running appcmd from a NAnt task?

    - by Rune
    We are using a Windows 7 box as a build server. As part of our continuous integration process I would like to stop and start an IIS 7 website. I have tried doing this from the command line using appcmd:

        appcmd start site "my website"

    However, this only works if I start the console window by choosing "Run as Administrator", so it won't work out of the box from NAnt etc. How do I script appcmd to run with elevated privileges (or am I going about this the wrong way)? Thank you.

    Read the article

  • Wrong resolution for Lightdm/GDM on Ubuntu 13.04 using HDMI

    - by f03lipe
    I've tried all the solutions I could find on the matter so far, but the error persists. My problem is that the login screen (both under GDM and LightDM) runs at the wrong resolution, even though everything is fine once I log in. The error occurs only when the HDMI cable is connected to my other screen. The login screen resolution becomes 1024x768 (on my 1366x768 laptop screen) and is mirrored on my external screen, which is 1920x1080. I had this issue on version 12.04 (the last one before I upgraded to 13.04), but I fixed it by adding the xrandr commands at the beginning of the /etc/gdm/Init/Default file. That doesn't seem to work anymore. I've also tried telling LightDM to run a script that fixes the resolution with xrandr (by editing /etc/lightdm/lightdm.conf), but LightDM crashes and I'm forced to log in with low graphics settings. Hint: when Ubuntu is loading, the resolution starts out OK, then goes bad right before the login screen is initialized. Does that mean there's nothing wrong with my graphics cards? What do you think? Cheers!
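
    For the LightDM route, the supported hook is the display-setup-script key in /etc/lightdm/lightdm.conf (under [SeatDefaults]), which runs as root just before the greeter starts. A sketch of such a script; the output names LVDS1 and HDMI1 are assumptions and should be checked against xrandr -q from a logged-in session:

        #!/bin/sh
        # /usr/local/bin/greeter-resolution.sh, referenced from lightdm.conf as
        #   display-setup-script=/usr/local/bin/greeter-resolution.sh
        # The script must be executable and must exit 0; a non-zero exit keeps the greeter
        # from starting, which may be why the earlier attempt ended in low-graphics mode.
        xrandr --output LVDS1 --mode 1366x768
        xrandr --output HDMI1 --mode 1920x1080 --right-of LVDS1
        exit 0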

    Read the article

  • Can't locate RRDs.pm in @INC

    - by User4283
    Hi, if I run any of my Perl scripts without "use lib qw( /opt/rrdtool-1.4.4/lib/perl );" after the perl interpreter line, I get the following error:

        Can't locate RRDs.pm in @INC (@INC contains: /usr/lib/perl5/site_perl/5.8.8/i386-linux-thread-multi
        /usr/lib/perl5/site_perl/5.8.8 /usr/lib/perl5/site_perl
        /usr/lib/perl5/vendor_perl/5.8.8/i386-linux-thread-multi /usr/lib/perl5/vendor_perl/5.8.8
        /usr/lib/perl5/vendor_perl /usr/lib/perl5/5.8.8/i386-linux-thread-multi /usr/lib/perl5/5.8.8 .)

    It is hard for me to add "use lib qw( /opt/rrdtool-1.4.4/lib/perl );" to all of my scripts because there are hundreds of them. Can anyone help me resolve this?
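
    Perl also consults the PERL5LIB environment variable when it builds @INC, so exporting the path once, in a profile script or in the environment of whatever launches these scripts (a crontab, an init script), avoids editing every file; a sketch:

        # e.g. in /etc/profile.d/rrdtool.sh, or at the top of the crontab that runs the scripts
        export PERL5LIB=/opt/rrdtool-1.4.4/lib/perl

        # quick check that RRDs is now found without the 'use lib' line
        perl -MRRDs -e 'print "RRDs loaded\n"'

    One caveat: PERL5LIB is ignored when a script runs with taint checks (-T) or setuid, in which case the use lib line, or installing RRDs into one of the standard @INC directories, is still needed.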

    Read the article
