Search Results

Search found 5809 results on 233 pages for 'curl multi'.


  • Pathfinding in multi goal, multi agent environment

    - by Rohan Agrawal
    I have an environment with multiple agents (a), multiple goals (g) and obstacles (o). . . . a o . . . . . . . o . g . . a . . . . . . . . . . o . . . . o o o o . g . . o . . . . . . . o . . . . o . . . . o o o o a What would be an appropriate algorithm for pathfinding in this environment? The only thing I can think of right now is to run a separate A* search for each goal, but I don't think that's very efficient.
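
    One common alternative to running A* once per goal is a single multi-source BFS that floods distances out from every goal at once; each agent can then simply step toward any neighbour with a lower distance value. Below is a minimal PHP sketch of the distance-field part, assuming the grid is stored as an array of strings using the same markers as the layout above ('g' for goals, 'o' for obstacles); it is an illustration of the idea, not code from the question.

      <?php
      // Flood-fill distances from every goal cell at once (multi-source BFS).
      function goalDistanceField(array $grid): array {
          $rows = count($grid);
          $cols = strlen($grid[0]);          // assumes all rows have equal length
          $dist = array_fill(0, $rows, array_fill(0, $cols, PHP_INT_MAX));
          $queue = new SplQueue();

          // Seed the queue with every goal cell at distance 0.
          for ($r = 0; $r < $rows; $r++) {
              for ($c = 0; $c < $cols; $c++) {
                  if ($grid[$r][$c] === 'g') {
                      $dist[$r][$c] = 0;
                      $queue->enqueue([$r, $c]);
                  }
              }
          }

          // Expand to the four neighbours, skipping obstacles.
          $moves = [[1, 0], [-1, 0], [0, 1], [0, -1]];
          while (!$queue->isEmpty()) {
              [$r, $c] = $queue->dequeue();
              foreach ($moves as [$dr, $dc]) {
                  $nr = $r + $dr;
                  $nc = $c + $dc;
                  if ($nr >= 0 && $nr < $rows && $nc >= 0 && $nc < $cols
                      && $grid[$nr][$nc] !== 'o'
                      && $dist[$nr][$nc] === PHP_INT_MAX) {
                      $dist[$nr][$nc] = $dist[$r][$c] + 1;
                      $queue->enqueue([$nr, $nc]);
                  }
              }
          }
          return $dist;
      }
      ?>

    Because the grid is unweighted, one BFS pass gives every agent its distance to the nearest goal, so per-goal A* runs are unnecessary.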

    Read the article

  • SSL / HTTP / No Response to Curl

    - by Alex McHale
    I am trying to send commands to a SOAP service, and getting nothing in reply. The SOAP service is at a completely separate site from either server I am testing with. I have written a dummy script with the SOAP XML embedded. When I run it at my local site, on any of three machines -- OSX, Ubuntu, or CentOS 5.3 -- it completes successfully with a good response. I then sent the script to our public host at Slicehost, where I fail to get the response back from the SOAP service. It accepts the TCP socket and proceeds with the SSL handshake. I do not however receive any valid HTTP response. This is the case whether I use my script or curl on the command line. I have rewritten the script using SOAP4R, Net::HTTP and Curb. All of which work at my local site, none of which work at the Slicehost site. I have tried to assemble the CentOS box as closely to match my Slicehost server as possible. I rebuilt the Slice to be a stock CentOS 5.3 and stock CentOS 5.4 with the same results. When I look at a tcpdump of the bad sessions on Slicehost, I see my script or curl send the XML to the remote server, and nothing comes back. When I look at the tcpdump at my local site, I see the response just fine. I have entirely disabled iptables on the Slice. Does anyone have any ideas what could be causing these results? Please let me know what additional information I can furnish. Thank you! Below is a wire trace of a sample session. The IP that starts with 173 is my server while the IP that starts with 12 is the SOAP server's. No. Time Source Destination Protocol Info 1 0.000000 173.45.x.x 12.36.x.x TCP 36872 > https [SYN] Seq=0 Win=5840 Len=0 MSS=1460 TSV=137633469 TSER=0 WS=6 Frame 1 (74 bytes on wire, 74 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 0, Len: 0 No. Time Source Destination Protocol Info 2 0.040000 12.36.x.x 173.45.x.x TCP https > 36872 [SYN, ACK] Seq=0 Ack=1 Win=8760 Len=0 MSS=1460 Frame 2 (62 bytes on wire, 62 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 0, Ack: 1, Len: 0 No. Time Source Destination Protocol Info 3 0.040000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=1 Ack=1 Win=5840 Len=0 Frame 3 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 0 No. Time Source Destination Protocol Info 4 0.050000 173.45.x.x 12.36.x.x SSLv2 Client Hello Frame 4 (156 bytes on wire, 156 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 102 Secure Socket Layer No. 
Time Source Destination Protocol Info 5 0.130000 12.36.x.x 173.45.x.x TCP [TCP segment of a reassembled PDU] Frame 5 (1434 bytes on wire, 1434 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1, Ack: 103, Len: 1380 Secure Socket Layer No. Time Source Destination Protocol Info 6 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=1381 Win=8280 Len=0 Frame 6 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 1381, Len: 0 No. Time Source Destination Protocol Info 7 0.130000 12.36.x.x 173.45.x.x TLSv1 Server Hello, Certificate, Server Hello Done Frame 7 (1280 bytes on wire, 1280 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1381, Ack: 103, Len: 1226 [Reassembled TCP Segments (2606 bytes): #5(1380), #7(1226)] Secure Socket Layer No. Time Source Destination Protocol Info 8 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=2607 Win=11040 Len=0 Frame 8 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 0 No. Time Source Destination Protocol Info 9 0.130000 173.45.x.x 12.36.x.x TLSv1 Client Key Exchange, Change Cipher Spec, Encrypted Handshake Message Frame 9 (236 bytes on wire, 236 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 182 Secure Socket Layer No. Time Source Destination Protocol Info 10 0.190000 12.36.x.x 173.45.x.x TLSv1 Change Cipher Spec, Encrypted Handshake Message Frame 10 (97 bytes on wire, 97 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2607, Ack: 285, Len: 43 Secure Socket Layer No. Time Source Destination Protocol Info 11 0.190000 173.45.x.x 12.36.x.x TLSv1 Application Data Frame 11 (347 bytes on wire, 347 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 285, Ack: 2650, Len: 293 Secure Socket Layer No. 
Time Source Destination Protocol Info 12 0.190000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU] Frame 12 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 13 0.450000 12.36.x.x 173.45.x.x TCP https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0 Frame 13 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0 No. Time Source Destination Protocol Info 14 0.450000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU] Frame 14 (206 bytes on wire, 206 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 2038, Ack: 2650, Len: 152 No. Time Source Destination Protocol Info 15 0.510000 12.36.x.x 173.45.x.x TCP [TCP Dup ACK 13#1] https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0 Frame 15 (54 bytes on wire, 54 bytes captured) Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6) Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x) Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0 No. Time Source Destination Protocol Info 16 0.850000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 16 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 17 1.650000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 17 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. Time Source Destination Protocol Info 18 3.250000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 18 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer No. 
Time Source Destination Protocol Info 19 6.450000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU] Frame 19 (1514 bytes on wire, 1514 bytes captured) Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1) Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x) Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460 Secure Socket Layer

    Read the article

  • WGet or cURL: Mirror Site from http://site.com And No Internal Access

    - by alharaka
    I have tried wget -m, wget -r, and a whole bunch of variations. I am getting some of the images on http://site.com, one of the scripts, and none of the CSS, even with the fscking -p parameter. The only HTML page is index.html and there are several more referenced, so I am at a loss. curlmirror.pl on the cURL developers' website does not seem to get the job done either. Is there something I am missing? I have tried different levels of recursion with only this URL, but I get the feeling I am missing something. Long story short, a school allows its students to submit web projects, but they want to know how they can collect everything for the instructor who will grade it, instead of him going to all the externally hosted sites. UPDATE: I think I figured out the issue. I thought the links to the other pages were in the index.html page that downloaded. I was way off. It turns out the footer of the page, which has all the navigation links, is handled by a JavaScript file, Include.js, which reads JLSSiteMap.js and some other JS files to do page navigation and the like. As a result, wget does not pick up any other dependencies, because much of this is not handled in the HTML pages themselves. How can I handle such a website? This is one of several problem cases. I assume little can be done if wget cannot parse JavaScript.
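
    Since wget cannot execute the JavaScript that builds the navigation, one workaround is a small script that downloads the JS files the question mentions (Include.js and JLSSiteMap.js), scrapes anything that looks like a page path out of them, and then fetches those pages directly. A rough PHP sketch; the regex and file layout are assumptions, since the actual contents of the JS files are not shown.

      <?php
      // Pull candidate page paths out of the JS navigation files, then fetch each page.
      $jsFiles = ['http://site.com/Include.js', 'http://site.com/JLSSiteMap.js'];
      $pages = [];
      foreach ($jsFiles as $js) {
          $ch = curl_init($js);
          curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
          $source = curl_exec($ch);
          curl_close($ch);
          // Guess: page paths appear as quoted strings ending in .htm or .html.
          preg_match_all('/["\']([^"\']+\.html?)["\']/i', $source, $m);
          $pages = array_merge($pages, $m[1]);
      }
      foreach (array_unique($pages) as $page) {
          $url = 'http://site.com/' . ltrim($page, '/');
          file_put_contents(basename($page), file_get_contents($url));
      }
      ?>

    The extracted URLs could also be written to a file and handed to wget -i, so that -p still pulls in each page's images and CSS.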

    Read the article

  • HTTP responses curl and wget different results

    - by Fab
    To check the HTTP response headers for a set of URLs, I send the following request headers with curl: foreach ( $urls as $url ) { // Setup headers - I used the same headers from Firefox version 2.0.0.6 $header[ ] = "Accept: text/xml,application/xml,application/xhtml+xml,"; $header[ ] = "text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5"; $header[ ] = "Cache-Control: max-age=0"; $header[ ] = "Connection: keep-alive"; $header[ ] = "Keep-Alive: 300"; $header[ ] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7"; $header[ ] = "Accept-Language: en-us,en;q=0.5"; $header[ ] = "Pragma: "; // browsers keep this blank. curl_setopt( $ch, CURLOPT_URL, $url ); curl_setopt( $ch, CURLOPT_USERAGENT, 'Googlebot/2.1 (+http://www.google.com/bot.html)'); curl_setopt( $ch, CURLOPT_HTTPHEADER, $header); curl_setopt( $ch, CURLOPT_REFERER, 'http://www.google.com'); curl_setopt( $ch, CURLOPT_HEADER, true ); curl_setopt( $ch, CURLOPT_NOBODY, true ); curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true ); curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, true ); curl_setopt( $ch, CURLOPT_HTTPAUTH, CURLAUTH_ANY ); curl_setopt( $ch, CURLOPT_TIMEOUT, 10 ); //timeout 10 seconds } Sometimes I receive 200 OK, which is good, and other times 301, 302, or 307, which I consider good as well, but sometimes I receive weird statuses such as 406, 500, or 504, which should indicate an invalid URL; yet when I open those URLs in the browser they are fine. For example, the script returns http://www.awe.co.uk/ => HTTP/1.1 406 Not Acceptable while wget returns wget http://www.awe.co.uk/ --2011-06-23 15:26:26-- http://www.awe.co.uk/ Resolving www.awe.co.uk... 77.73.123.140 Connecting to www.awe.co.uk|77.73.123.140|:80... connected. HTTP request sent, awaiting response... 200 OK Does anyone know which request header I am missing or adding in excess?
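
    One thing that stands out in the snippet is that the Accept header is split across two array entries, so the second entry ("text/html;q=0.9,...") goes out as a malformed header line of its own, and some servers answer exactly that kind of request with 406 Not Acceptable. Below is a sketch of the header setup with the Accept value built as a single string, assuming $ch is the handle used in the loop above; whether this explains every odd status is a guess.

      <?php
      // Build the Accept header as one entry so no bare "text/html;q=0.9,..."
      // line is sent as its own (invalid) header.
      $header = array();
      $header[] = "Accept: text/xml,application/xml,application/xhtml+xml,"
                . "text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5";
      $header[] = "Cache-Control: max-age=0";
      $header[] = "Connection: keep-alive";
      $header[] = "Keep-Alive: 300";
      $header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7";
      $header[] = "Accept-Language: en-us,en;q=0.5";

      curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
      ?>

    Re-initialising $header inside the foreach also stops the list from growing with every URL, which the original loop would otherwise do.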

    Read the article

  • Saving images only available when logged in

    - by James
    Hi, I've been having some trouble getting images to download when logged into a website that requires you to be logged in. The images can only be viewed when you are logged in to the site, but you cannot seem to view them directly in the browser if you copy its location into a tab/new window (it redirects to an error page - so I guess the containing folder has be .htaccess-ed). Anyway, the code I have below allows me to log in and grab the HTML content, which works well - but I cannot grab the images ... this is where I need help! <? // curl.php class Curl { public $cookieJar = ""; public function __construct($cookieJarFile = 'cookies.txt') { $this->cookieJar = $cookieJarFile; } function setup() { $header = array(); $header[0] = "Accept: text/xml,application/xml,application/xhtml+xml,"; $header[0] .= "text/html;q=0.9,text/plain;q=0.8,image/gif;q=0.8,image/x-bitmap;q=0.8,image/jpeg;q=0.8,image/png,*/*;q=0.5"; $header[] = "Cache-Control: max-age=0"; $header[] = "Connection: keep-alive"; $header[] = "Keep-Alive: 300"; $header[] = "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7"; $header[] = "Accept-Language: en-us,en;q=0.5"; $header[] = "Pragma: "; // browsers keep this blank. curl_setopt($this->curl, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US; rv:1.8.1.7) Gecko/20070914 Firefox/2.0.0.7'); curl_setopt($this->curl, CURLOPT_HTTPHEADER, $header); curl_setopt($this->curl, CURLOPT_COOKIEJAR, $this->cookieJar); curl_setopt($this->curl, CURLOPT_COOKIEFILE, $this->cookieJar); curl_setopt($this->curl, CURLOPT_AUTOREFERER, true); curl_setopt($this->curl, CURLOPT_FOLLOWLOCATION, true); curl_setopt($this->curl, CURLOPT_RETURNTRANSFER, true); } function get($url) { $this->curl = curl_init($url); $this->setup(); return $this->request(); } function getAll($reg, $str) { preg_match_all($reg, $str, $matches); return $matches[1]; } function postForm($url, $fields, $referer = '') { $this->curl = curl_init($url); $this->setup(); curl_setopt($this->curl, CURLOPT_URL, $url); curl_setopt($this->curl, CURLOPT_POST, 1); curl_setopt($this->curl, CURLOPT_REFERER, $referer); curl_setopt($this->curl, CURLOPT_POSTFIELDS, $fields); return $this->request(); } function getInfo($info) { $info = ($info == 'lasturl') ? curl_getinfo($this->curl, CURLINFO_EFFECTIVE_URL) : curl_getinfo($this->curl, $info); return $info; } function request() { return curl_exec($this->curl); } } ?> And below is the page that uses it. <? // data.php include('curl.php'); $curl = new Curl(); $url = "http://domain.com/login.php"; $newURL = "http://domain.com/go_here.php"; $username = "user"; $password = "pass"; $fields = "user=$username&pass=$password"; // Calling URL $referer = "http://domain.com/refering_page.php"; $html = $curl->postForm($url, $fields, $referer); $html = $curl->get($newURL); echo $html; ?> I've tried putting the direct URL for the image into $newURL but that doesn't get the image - it simply returns an error saying since that folder is not available to view directly. I've tried varying the above in different ways, but I haven't been successful in getting an image, though I have managed to get a screen through basically saying error 405 and/or 406 (but not the image I want). Any help would be great!
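
    For the image itself, the same Curl object can be reused after the login POST: request the image URL so the cookie jar is sent along, then write the raw response bytes to disk rather than echoing them. A minimal sketch, where the image URL and filename are placeholders; the site may additionally require a Referer of the page that embeds the image (CURLOPT_AUTOREFERER in the class helps only when following redirects).

      <?php
      include('curl.php');

      $curl = new Curl();
      $curl->postForm("http://domain.com/login.php", "user=user&pass=pass",
                      "http://domain.com/refering_page.php");

      // Hypothetical direct image URL behind the login.
      $imageUrl  = "http://domain.com/images/photo123.jpg";
      $imageData = $curl->get($imageUrl);

      // Save only if the response looks like binary data rather than an error page.
      if ($imageData !== false && stripos($imageData, '<html') === false) {
          file_put_contents(basename($imageUrl), $imageData);
      }
      ?>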

    Read the article

  • Convert this curl to multi PHP

    - by user1642423
    I have this code and I want to make 10 curl connections like this with curl_multi, but I don't know how to do that with this specific code. What does the code do? It makes a curl request to an .asp page, uses the result to send some data in a form ($ciudad), then the page takes this submission, makes an internal request, and shows a result; the script outputs that final result. function curl($header,$encoded,$cookie){ $options = array( CURLOPT_USERAGENT => $_SERVER['HTTP_USER_AGENT'], CURLOPT_TIMEOUT => 120, //CURLOPT_REFERER => '', //CURLOPT_HTTPHEADER => $header, CURLOPT_COOKIE => $cookie, CURLOPT_POST => true, CURLOPT_POSTFIELDS => $encoded, CURLOPT_RETURNTRANSFER => true, CURLOPT_HEADER => false, CURLOPT_FOLLOWLOCATION => true, ); $ch = curl_init("http://procesos.ramajudicial.gov.co/consultaprocesos/consultap.aspx"); curl_setopt_array( $ch, $options ); $output = curl_exec($ch); curl_close($ch); return $output; } $cookie = ""; foreach($_COOKIE as $k => $v) $cookie .= $k."=".$v.";"; $cookie = substr($cookie,0,strlen($cookie)-1); $encoded = ''; foreach($_POST as $name => $value) { $encoded .= urlencode($name).'='.urlencode($value).'&'; } $lk = "http://procesos.ramajudicial.gov.co/consultaprocesos/"; $header[] = 'User-Agent: '.$_SERVER['HTTP_USER_AGENT']; $header[] = 'Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5'; $header[] = 'Accept-Language: en-us,en;q=0.5'; $header[] = 'Accept-Encoding: gzip,deflate'; $header[] = 'Connection: keep-alive'; $header[] = 'Cookie : '.$cookie; $header[] = 'Content-Type: application/x-www-form-urlencoded'; $output = curl($header,$encoded,$cookie); $CIUDAD = urlencode("Medellin"); // to change $CORPORACION = urlencode("JUZGADOS CIVILES MUNICIPALES DE MEDELLIN"); // to change $DIGITOS = $numsus; // BEGIN STEP 1 $__VIEWSTATE = 'id="__VIEWSTATE" value="'; $i = stripos($output,$__VIEWSTATE) + strlen($__VIEWSTATE); $j = stripos($output,'"',$i); $__VIEWSTATE = substr($output,$i,$j-$i); $__EVENTVALIDATION = 'id="__EVENTVALIDATION" value="'; $i = stripos($output,$__EVENTVALIDATION) + strlen($__EVENTVALIDATION); $j = stripos($output,'"',$i); $__EVENTVALIDATION = substr($output,$i,$j-$i); $encoded = '__EVENTTARGET=DropDownList1&__EVENTARGUMENT=&__LASTFOCUS=&__VIEWSTATE='.urlencode($__VIEWSTATE).'&__EVENTVALIDATION='.urlencode($__EVENTVALIDATION).'&DropDownList1='.$CIUDAD.'&TextBox13='; $output = curl($header,$encoded,$cookie);
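
    A rough sketch of a curl_multi version: build one handle per POST body (for example, one per city) with the same options as the curl() function above, add them all to a multi handle, and drive them together. Note that the __VIEWSTATE/__EVENTVALIDATION values still have to come from a prior request, so only the independent submissions can run in parallel; $requests below is an assumed array of pre-built POST strings.

      <?php
      // Run several of the POSTs in parallel with curl_multi.
      function makeHandle($encoded, $cookie) {
          $ch = curl_init("http://procesos.ramajudicial.gov.co/consultaprocesos/consultap.aspx");
          curl_setopt_array($ch, array(
              CURLOPT_USERAGENT      => $_SERVER['HTTP_USER_AGENT'],
              CURLOPT_TIMEOUT        => 120,
              CURLOPT_COOKIE         => $cookie,
              CURLOPT_POST           => true,
              CURLOPT_POSTFIELDS     => $encoded,
              CURLOPT_RETURNTRANSFER => true,
              CURLOPT_FOLLOWLOCATION => true,
          ));
          return $ch;
      }

      $mh = curl_multi_init();
      $handles = array();
      foreach ($requests as $i => $encoded) {      // $requests: one POST body per query
          $handles[$i] = makeHandle($encoded, $cookie);
          curl_multi_add_handle($mh, $handles[$i]);
      }

      // Pump all transfers until every handle has finished.
      do {
          curl_multi_exec($mh, $running);
          if (curl_multi_select($mh) === -1) {
              usleep(100000);   // avoid spinning if select() is unavailable
          }
      } while ($running > 0);

      $outputs = array();
      foreach ($handles as $i => $ch) {
          $outputs[$i] = curl_multi_getcontent($ch);
          curl_multi_remove_handle($mh, $ch);
          curl_close($ch);
      }
      curl_multi_close($mh);
      ?>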

    Read the article

  • Multi-Finger Gestures in 14.04

    - by Alex Mundy
    I'm running 14.04 on a Lenovo Y500. I want to get multi-touch gestures running, specifically a three-finger swipe to switch desktops. I would like to keep using the Unity interface, so I can't use touchegg, and I have a buttonless touchpad, so easystroke is not a good candidate either. Is there another third party program that will allow me to use buttonless three finger gestures, or some config hack that will accomplish the same thing?

    Read the article

  • how can i set cookie in curl

    - by Sushant Panigrahi
    I am fetching a page from another site, but it displays nothing and the URL in the address bar changes. For example, I typed http://localhost/sushant/EXAMPLE_ROUGH/curl.php and my code in the curl page is: $fp = fopen("cookie.txt", "w"); fclose($fp); $agent= 'Mozilla/5.0 (Windows; U; Windows NT 5.1; pl; rv:1.9) Gecko/2008052906 Firefox/3.0'; $ch = curl_init(); curl_setopt($ch, CURLOPT_USERAGENT, $agent); // 2. set the options, including the url curl_setopt($ch, CURLOPT_URL, "http://www.fnacspectacles.com/place-spectacle/manifestation/Grand-spectacle-LE-ROI-LION-ROI4.htm"); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt($ch, CURLOPT_HEADER, 0); curl_setopt ($ch, CURLOPT_COOKIEJAR, 'cookie.txt'); curl_setopt($ch, CURLOPT_COOKIEFILE, "cookie.txt"); // 3. execute and fetch the resulting HTML output if(curl_exec($ch) === false) { echo 'Curl error: ' . curl_error($ch); } else echo $output = curl_exec($ch); // 4. free up the curl handle curl_close($ch); ?> but it changes the URL to something like http://localhost/aide.do?sht=_aide_cookies_ and I get "object not found". How can I solve this problem? Please help.
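
    A common pattern for sites that bounce first-time visitors to a cookie-check page (which is what the aide.do?sht=_aide_cookies_ URL looks like) is to let curl follow the redirect itself and then make a second request once the jar has been populated. A sketch along those lines, reusing the same options as above; whether this is enough for that particular site is untested.

      <?php
      $jar   = dirname(__FILE__) . '/cookie.txt';
      $agent = 'Mozilla/5.0 (Windows; U; Windows NT 5.1; pl; rv:1.9) Gecko/2008052906 Firefox/3.0';
      $url   = 'http://www.fnacspectacles.com/place-spectacle/manifestation/Grand-spectacle-LE-ROI-LION-ROI4.htm';

      $ch = curl_init();
      curl_setopt($ch, CURLOPT_USERAGENT, $agent);
      curl_setopt($ch, CURLOPT_URL, $url);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
      curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);   // follow the cookie-check redirect server-side
      curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);
      curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);

      curl_exec($ch);              // first pass only collects the cookies
      $output = curl_exec($ch);    // second pass runs with the jar populated
      curl_close($ch);

      echo $output;
      ?>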

    Read the article

  • PHP & cUrl - POST problems in while loop.

    - by Max Hoff
    I have a while loop, with cUrl inside. (I can't use curl_multi for various reasons.) The problem is that cUrl seems to save the data it POSTs after each loop traversal. For instance, if parameter X is One the first loop through, and Two the second loop through, cUrl posts "One,Two". It should just POST "Two". (This is despite closing and unsetting the curl handle.) Here's a simplified version of the code (with unnecessary info stripped out): while(true){ // code to form the $url. this code is local to the loop. so the variables should be "erased" and made new for each loop through. $ch = curl_init(); $userAgent = 'Googlebot/2.1 (http://www.googlebot.com/bot.html)'; curl_setopt($ch,CURLOPT_USERAGENT, $userAgent); curl_setopt($ch,CURLOPT_URL,$url); curl_setopt($ch,CURLOPT_POST,count($fields)); curl_setopt($ch,CURLOPT_POSTFIELDS,$fields_string); curl_setopt($ch, CURLOPT_FAILONERROR, true); curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); curl_setopt($ch, CURLOPT_AUTOREFERER, true); curl_setopt($ch, CURLOPT_RETURNTRANSFER,true); curl_setopt($ch, CURLOPT_TIMEOUT, 10); $html = curl_exec($ch); curl_close($ch); unset($ch); $dom = new DOMDocument(); @$dom->loadHTML($html); $xpath = new DOMXPath($dom); $resultTable = $xpath->evaluate("/html/body//table"); // $resultTable is 20 the first time through the loop, and 0 every time thereafter because the POSTing doesn't work right with the "saved" parameters. What am I doing wrong here?
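
    The snippet does not show where $fields and $fields_string are built; if they are only appended to (e.g. with .=) outside the loop or at its top, every pass re-sends the values from the previous one, which matches the "One,Two" symptom. A sketch that rebuilds the POST body from scratch each iteration; the field names are placeholders.

      <?php
      while (true) {
          // ... code that determines $url and this pass's values ...

          // Rebuild the POST body from scratch so nothing carries over.
          $fields = array(
              'x'    => $currentValue,   // hypothetical field names
              'page' => $pageNumber,
          );
          $fields_string = http_build_query($fields);

          $ch = curl_init($url);
          curl_setopt($ch, CURLOPT_POST, true);
          curl_setopt($ch, CURLOPT_POSTFIELDS, $fields_string);
          curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
          $html = curl_exec($ch);
          curl_close($ch);

          // ... DOM/XPath parsing as before ...
      }
      ?>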

    Read the article

  • CURL C API: callback was not called

    - by Pierre
    Hi all, The code below is a test for the CURL C API . The problem is that the callback function write_callback is never called. Why ? /** compilation: g++ source.cpp -lcurl */ #include <assert.h> #include <iostream> #include <cstdlib> #include <cstring> #include <cassert> #include <curl/curl.h> using namespace std; static size_t write_callback(void *ptr, size_t size, size_t nmemb, void *userp) { std::cerr << "CALLBACK WAS CALLED" << endl; exit(-1); return size*nmemb; } static void test_curl() { int any_data=1; CURLM* multi_handle=NULL; CURL* handle_curl = ::curl_easy_init(); assert(handle_curl!=NULL); ::curl_easy_setopt(handle_curl, CURLOPT_URL, "http://en.wikipedia.org/wiki/Main_Page"); ::curl_easy_setopt(handle_curl, CURLOPT_WRITEDATA, &any_data); ::curl_easy_setopt(handle_curl, CURLOPT_VERBOSE, 1); ::curl_easy_setopt(handle_curl, CURLOPT_WRITEFUNCTION, write_callback); ::curl_easy_setopt(handle_curl, CURLOPT_USERAGENT, "libcurl-agent/1.0"); multi_handle = ::curl_multi_init(); assert(multi_handle!=NULL); ::curl_multi_add_handle(multi_handle, handle_curl); int still_running=0; /* lets start the fetch */ while(::curl_multi_perform(multi_handle, &still_running) == CURLM_CALL_MULTI_PERFORM ); std::cerr << "End of curl_multi_perform."<< endl; //cleanup should go here ::exit(EXIT_SUCCESS); } int main(int argc,char** argv) { test_curl(); return 0; } Many thanks Pierre

    Read the article

  • Wordpress curl save Images

    - by Jeton Ramadani
    I am working on saving images from external sites into a folder in my wordpress theme. And I was wondering if its Ok to call curl twice or can it be done with one time. Example: $data = get_url('http://www.veoh.com/watch/v19935546Y8hZPgbZ'); // getting the url first curl instance preg_match('/fullHighResImagePath="(.*?)"/', $data, $thumbnail); // find the image from content savePhoto($thumbnail, $post->ID); //2nd instance of curl to save the image function get_url($url) { $user_agent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2)"; $ytc = curl_init(); // initialize curl handle curl_setopt($ytc, CURLOPT_URL, $url); // set url to post to curl_setopt($ytc, CURLOPT_FAILONERROR, 1); // Fail on errors curl_setopt($ytc, CURLOPT_FOLLOWLOCATION, 1); // allow redirects curl_setopt($ytc, CURLOPT_RETURNTRANSFER, 1); // return into a variable curl_setopt($ytc, CURLOPT_PORT, 80); //Set the port number curl_setopt($ytc, CURLOPT_TIMEOUT, 15); // times out after 15s curl_setopt($ytc, CURLOPT_HEADER, 1); // include HTTP headers curl_setopt($ytc, CURLOPT_USERAGENT, $user_agent); $source = curl_exec($ytc); curl_close($ytc); $data = trim( $source ); return $data; } function savePhoto($remoteImage, $isbn) { $ch = curl_init(); curl_setopt ($ch, CURLOPT_URL, $remoteImage); curl_setopt ($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt ($ch, CURLOPT_CONNECTTIMEOUT, 0); $fileContents = curl_exec($ch); curl_close($ch); if (DIRECTORY_SEPARATOR=='/'){ $absolute_path = dirname(__FILE__).'/'; } else { $absolute_path = str_replace('\\', '/', dirname(__FILE__)).'/'; } $newImg = imagecreatefromstring($fileContents); return imagejpeg($newImg, $absolute_path ."video_images/{$isbn}.jpg",100); }
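
    Two curl calls are fine, but they can also share one handle: initialise it once, change CURLOPT_URL between the two curl_exec() calls, and keep CURLOPT_HEADER off so the saved file is not prefixed with response headers. A condensed sketch of that approach; it writes the raw bytes with file_put_contents instead of re-encoding through imagejpeg, which is a deliberate simplification.

      <?php
      // One handle, two requests: fetch the page, find the image URL, save the bytes.
      $ch = curl_init();
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
      curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
      curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2)');
      curl_setopt($ch, CURLOPT_TIMEOUT, 15);

      curl_setopt($ch, CURLOPT_URL, 'http://www.veoh.com/watch/v19935546Y8hZPgbZ');
      $data = curl_exec($ch);

      if (preg_match('/fullHighResImagePath="(.*?)"/', $data, $thumbnail)) {
          curl_setopt($ch, CURLOPT_URL, $thumbnail[1]);   // the captured URL, not the whole match array
          $fileContents = curl_exec($ch);
          file_put_contents(dirname(__FILE__) . "/video_images/{$post->ID}.jpg", $fileContents);
      }
      curl_close($ch);
      ?>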

    Read the article

  • PHP, Apache and curl: Differences between Windows and Linux?

    - by beginner_
    I'm trying to run my php App on Ubuntu Server 11.10. This App works fine under Apache + PHP in windows. I have other applications that I can simply copy&paste between the 2 OS and they work on both. (These don't use cURL). However this one uses the php library tonic (RESTful webservices) and makes us of php cURL module. The issue is I'm not getting an error message which makes it impossible to find the issue. I (must) use NTLM authentication and this is done with AuthenNTLM Apache Module: Order allow,deny Allow from all PerlAuthenHandler Apache2::AuthenNTLM AuthType ntlm AuthName "Protected Access" require valid-user PerlAddVar ntdomain "domainName server" PerlSetVar defaultdomain domainName PerlSetVar ntlmsemtimeout 2 PerlSetVar ntlmdebug 1 PerlSetVar splitdomainprefix 0 All files that cURL needs to fetch override AuthenNTLM authentication: order deny,allow deny from all allow from 127.0.0.1 Satisfy any Since these files are only fectehd by cURL from same server, access can be limited to localhost. Possible issues are: NTLM auth isn't overridden for files requested through cURL (even though AllowOverride All is set) curl works differently on linux $ch = curl_init(); curl_setopt($ch, CURLOPT_COOKIE, $strCookie); curl_setopt($ch, CURLOPT_URL, $baseUrl . $queryString); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); $html = curl_exec($ch); curl_close($ch); other? Apache log says: [error] Bad/Missing NTLM/Basic Authorization Header for /myApp/webservice/local/viewList.php But this directory should override NTLM authentication using curl command line from windows to access same resource i get: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html> <head> <title>406 Not Acceptable</title> </head> <body> <h1>Not Acceptable</h1> <p>An appropriate representation of the requested resource /myApp/webservice/myResource could not be found on this server.</p> Available variants: <ul> <li><a href="myResource.php">myResource.php</a> , type application/x-httpd-php</li> </ul> <hr> <address>Apache/2.2.20 (Ubuntu) Server at localhost Port 80</address> </body> </html> Note: This is duplicate from http://stackoverflow.com/questions/9821979/php-curl-on-linux-what-is-the-difference-to-curl-on-windows Is it was suggested I post it here. EDIT: Please see Ubuntu Server: Apache2 seems to attach .php to URI as I discovered why it does not work but need help so the issue does not occur anymore. ANSWER: The issue is the default Apache configuration on Ubuntu: Options Indexes FollowSymLinks MultiViews MultiViews is changing request_uri from myResource to myResource.php. Solutions: disable MultiViews in .htaccess: Options -MultiViews remove MultiViews from default config rename the file as example to myResourceClass I chose last option because that should work regardless of configuration and I only have 3 such files so the change took about 30 secs...

    Read the article

  • Building cURL & libcurl with Visual Studio 2010

    - by KTC
    With the help of question #197444, I have managed to build cURL & libcurl from source on Windows from within the Visual Studio 2010 IDE, against OpenSSL 1.0.0 and zlib 1.2.5. The problem I see is that at the moment, if I run the resulting curl.exe with the argument -V, the version that it reports is curl 7.20.1 (i386-pc-win32) libcurl/7.20.1 OpenSSL/0.9.8d zlib/1.2.3 Protocols: dict file ftp ftps http https imap imaps ldap pop3 pop3s rtsp smtp smtps telnet tftp Features: AsynchDNS Largefile NTLM SSL libz Note that the versions reported for both OpenSSL & zlib don't match what I actually used. Any idea on how to fix this? p.s. Is there a clear list of optional components that can be compiled into libcurl and which options/preprocessor directives to use? (e.g. SSPI, libidn, ...?)

    Read the article

  • Google Charts Through CURL

    - by swt83
    I have a PHP class that helps me generate URLs for custom charts using Google Chart service. These URLs work fine when I load them in my browser, but I'm trying to pull them using CURL so I can access the charts on secure https websites. Whenever I try and pull a chart via CURL, I get an Error 400 Bad Request. Any idea on how to get around this? Everything I have tried has failed. $url = urldecode($_GET['url']); $session = curl_init($url); // Open the Curl session curl_setopt($session, CURLOPT_HEADER, false); // Don't return HTTP headers curl_setopt($session, CURLOPT_RETURNTRANSFER, true); // Do return the contents of the call $image = curl_exec($session); // Make the call #header("Content-Type: image/png"); // Set the content type appropriately curl_close($session); // And close the session die($image);

    Read the article

  • curl object moved here error, bash

    - by adam n
    I'm trying to download an HTML file with curl in bash. When I download it manually, it works fine. However, when I try to run my script through crontab, the output HTML file is very small and just says "Object moved to here." with a broken link. Does this have something to do with the sparse environment the crontab commands run in? I found this question: http://stackoverflow.com/questions/1279340/php-ssl-curl-object-moved-error but I'm using bash, not PHP. What are the equivalent command line options or variables to set to fix this problem in bash? (I want to do this with curl, not wget.)

    Read the article

  • Login to Gmail Inbox using Curl ?

    - by Vishwanath dalvi
    Hello friends, I'm doing a project where the user will first save his Gmail ID and password, and after confirmation I will provide a link to log in to Gmail directly next time without entering the Gmail ID and password. The saved credentials would be passed as the userid and passwd parameters using cURL. I'm doing this in PHP. I heard cURL can do this, and I tried lots of code but didn't get anything working. Can anyone tell me how to log in to the Gmail inbox using cURL in PHP?

    Read the article

  • Get unicode character with curl in PHP

    - by saturngod
    I tried to get a URL with curl. The return value contains unicode characters, but they come back as \u escape sequences. How can I get the actual unicode characters with curl? This is my code: <?php $ch = curl_init(); // set URL and other appropriate options curl_setopt($ch, CURLOPT_URL, "http://www.ornagai.com/index.php/api/word/q/test/format/json"); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); $return=curl_exec($ch); echo $return; ?>
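
    The \uXXXX sequences are JSON escapes produced by the API, not something curl adds; decoding the JSON turns them back into real UTF-8 characters. A small sketch built on the same request:

      <?php
      $ch = curl_init();
      curl_setopt($ch, CURLOPT_URL, "http://www.ornagai.com/index.php/api/word/q/test/format/json");
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
      $return = curl_exec($ch);
      curl_close($ch);

      header('Content-Type: text/html; charset=utf-8');
      $data = json_decode($return, true);   // \uXXXX escapes become UTF-8 characters here
      print_r($data);
      ?>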

    Read the article

  • CURL issue in PHP while getting location list

    - by Ajay
    I am retrieving the nearest locations for a given address (longitude/latitude) from a geolocation website. It works fine, but for some places it gives junk characters in the name. Moreover, in the browser I am getting different characters compared to my PHP cURL code. Here is the URL: http://www.geoplugin.net/extras/nearby.gp?lat=17.7374669&long=83.3214858&limit=5&radius=50&format=php One of the locations is "Sitampeta" as the original location name, but in the browser I am getting "Sitammapeta", whereas in the cURL function I am getting "SÄ«tammapeta". Please tell me why there is this difference. I wrote a function to convert the browser output to the original, which works fine. function convert ($old) { $n=""; for ($i=0; $i<strlen($old); $i++) { $n .= chr(ord(substr($old,$i,1))); } return $n; } But I don't understand how to convert the cURL output to the original name.
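
    "SÄ«tammapeta" is what the UTF-8 spelling "Sītammapeta" looks like when its bytes are displayed as ISO-8859-1, so the difference is very likely a character-set mismatch rather than junk in the data. A sketch that either serves the page as UTF-8 or converts before output; the geoplugin_place key is an assumption about the response layout.

      <?php
      $ch = curl_init('http://www.geoplugin.net/extras/nearby.gp?lat=17.7374669&long=83.3214858&limit=5&radius=50&format=php');
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      $raw = curl_exec($ch);
      curl_close($ch);

      $places = unserialize($raw);                       // format=php returns a serialized array
      header('Content-Type: text/html; charset=utf-8');  // option 1: tell the browser it is UTF-8
      foreach ($places as $place) {
          echo $place['geoplugin_place'] . "<br/>\n";    // assumed key name
          // option 2: convert if the page must stay ISO-8859-1
          // echo mb_convert_encoding($place['geoplugin_place'], 'ISO-8859-1', 'UTF-8');
      }
      ?>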

    Read the article

  • Alternative to CURL due to long waiting

    - by aalan
    Hey guys, I currently run a PHP script that uses cURL to send data to another server, to run a PHP script there that could take up to a minute to complete. That server doesn't send any data back, but the cURL request still has to wait for it to finish before the rest of the original page loads. I would like my PHP script to just send the data to the other server and not wait for an answer. So my question is: how should I solve this? I have read that cURL always has to wait. What are your suggestions? Thank you!
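
    One way to avoid waiting is to skip cURL for this call and write the HTTP request to a socket directly, closing it as soon as the body has been sent. A sketch of that fire-and-forget pattern follows; host, path and fields are placeholders, and the receiving script may need ignore_user_abort(true) so it keeps running after the caller disconnects.

      <?php
      // Send a POST and hang up without reading the response.
      function postWithoutWaiting($host, $path, array $data) {
          $body = http_build_query($data);
          $fp = fsockopen($host, 80, $errno, $errstr, 5);   // 5-second connect timeout
          if (!$fp) {
              return false;
          }
          $request  = "POST {$path} HTTP/1.1\r\n";
          $request .= "Host: {$host}\r\n";
          $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
          $request .= "Content-Length: " . strlen($body) . "\r\n";
          $request .= "Connection: Close\r\n\r\n";
          $request .= $body;

          fwrite($fp, $request);   // send the request...
          fclose($fp);             // ...and close without waiting for a reply
          return true;
      }

      // Example: postWithoutWaiting('other-server.example.com', '/long-job.php', array('id' => 42));
      ?>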

    Read the article

  • How would you practice concurrency and multi-threading?

    - by Xavier Nodet
    I've been reading about concurrency, multi-threading, and how "the free lunch is over". But I've not yet had the possibility to use MT in my job. I'm thus looking for suggestions about what I could do to get some practice of CPU heavy MT through exercises or participation in some open-source projects. Thanks. Edit: I'm more interested in open-source projects that use MT for CPU-bound tasks, or simply algorithms that are interesting to implement using MT, rather than books or papers about the tools like threads, mutexes and locks...

    Read the article

  • Do I need to recompile PHP to make use of CURL API?

    - by amn
    I have both Apache and PHP set up manually, albeit the latter without CURL. There is a jungle of instructions and explanations on extensions for PHP. I have a very straightforward question: what do I need to do to enable CURL in a more dynamic way? I resent the idea of static linking; in fact I hate and avoid static linking like the plague. Is it possible to have my Apache and PHP understand that there is CURL in town? I can compile CURL if necessary. Package management may be out of the question, because I built PHP myself. I am on Ubuntu, and it does not provide PHP without Suhosin without a whole lot of work, so I removed it and built PHP myself. The whole slew of related questions simply proposes installing the "php5-curl" package, which is exactly the one thing I CANNOT do, since it installs into a completely unrelated directory that my PHP does not even seem to link against.

    Read the article

  • Ubuntu 10.04/CURL: How do I fix/update the CA Bundle?

    - by Nick
    I recently upgraded our server from 8.04 to 10.04, and all the software along with it. From what I've found online, it seems that the new version of CURL doesn't include a CA bundle and, as a result, fails to verify that the certificate of the server you're connecting to is signed by a valid authority. The actual error is: CURL error: SSL certificate problem, verify that the CA cert is OK. Details: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE: certificate verify failed Some places I've found suggest manually specifying a CA file or disabling the check altogether by setting an option when you call CURL, but I'd much rather fix the issue globally than have to modify each application's CURL calls. Is there a way to fix CURL's CA problem server-wide so that all of the existing application code works as is without needing to be modified?
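
    Not quite the server-wide fix asked for, but for reference, the per-request workaround is to point cURL at the bundle that Ubuntu's ca-certificates package installs (path assumed below); installing or reinstalling that package, so the bundle exists at the standard location, is the usual system-wide route.

      <?php
      // Per-call workaround: use the system CA bundle instead of disabling verification.
      $ch = curl_init('https://example.com/');
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
      curl_setopt($ch, CURLOPT_CAINFO, '/etc/ssl/certs/ca-certificates.crt');   // assumed path
      $result = curl_exec($ch);
      if ($result === false) {
          echo 'Curl error: ' . curl_error($ch);
      }
      curl_close($ch);
      ?>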

    Read the article

  • Multi-tenant ASP.NET MVC - Views

    - by zowens
    Part I – Introduction Part II – Foundation Part III – Controllers   So far we have covered the basic premise of tenants and how they will be delegated. Now comes a big issue with multi-tenancy, the views. In some applications, you will not have to override views for each tenant. However, one of my requirements is to add extra views (and controller actions) along with overriding views from the core structure. This presents a bit of a problem in locating views for each tenant request. I have chosen quite an opinionated approach at the present but will coming back to the “views” issue in a later post. What’s the deal? The path I’ve chosen is to use precompiled Spark views. I really love Spark View Engine and was planning on using it in my project anyways. However, I ran across a really neat aspect of the source when I was having a look under the hood. There’s an easy way to hook in embedded views from your project. There are solutions that provide this, but they implement a special Virtual Path Provider. While I think this is a great solution, I would rather just have Spark take care of the view resolution. The magic actually happens during the compilation of the views into a bin-deployable DLL. After the views are compiled, the are simply pulled out of the views DLL. Each tenant has its own views DLL that just has “.Views” appended after the assembly name as a convention. The list of reasons for this approach are quite long. The primary motivation is performance. I’ve had quite a few performance issues in the past and I would like to increase my application’s performance in any way that I can. My customized build of Spark removes insignificant whitespace from the HTML output so I can some some bandwidth and load time without having to deal with whitespace removal at runtime.   How to setup Tenants for the Host In the source, I’ve provided a single tenant as a sample (Sample1). This will serve as a template for subsequent tenants in your application. The first step is to add a “PostBuildStep” installer into the project. I’ve defined one in the source that will eventually change as we focus more on the construction of dependency containers. The next step is to tell the project to run the installer and copy the DLL output to a folder in the host that will pick up as a tenant. Here’s the code that will achieve it (this belongs in Post-build event command line field in the Build Events tab of settings) %systemroot%\Microsoft.NET\Framework\v4.0.30319\installutil "$(TargetPath)" copy /Y "$(TargetDir)$(TargetName)*.dll" "$(SolutionDir)Web\Tenants\" copy /Y "$(TargetDir)$(TargetName)*.pdb" "$(SolutionDir)Web\Tenants\" The DLLs with a name starting with the target assembly name will be copied to the “Tenants” folder in the web project. This means something like MultiTenancy.Tenants.Sample1.dll and MultiTenancy.Tenants.Sample1.Views.dll will both be copied along with the debug symbols. This is probably the simplest way to go about this, but it is a tad inflexible. For example, what if you have dependencies? The preferred method would probably be to use IL Merge to merge your dependencies with your target DLL. This would have to be added in the build events. Another way to achieve that would be to simply bypass Visual Studio events and use MSBuild.   I also got a question about how I was setting up the controller factory. 
Here’s the basics on how I’m setting up tenants inside the host (Global.asax) protected void Application_Start() { RegisterRoutes(RouteTable.Routes); // create a container just to pull in tenants var topContainer = new Container(); topContainer.Configure(config => { config.Scan(scanner => { scanner.AssembliesFromPath(Path.Combine(Server.MapPath("~/"), "Tenants")); scanner.AddAllTypesOf<IApplicationTenant>(); }); }); // create selectors var tenantSelector = new DefaultTenantSelector(topContainer.GetAllInstances<IApplicationTenant>()); var containerSelector = new TenantContainerResolver(tenantSelector); // clear view engines, we don't want anything other than spark ViewEngines.Engines.Clear(); // set view engine ViewEngines.Engines.Add(new TenantViewEngine(tenantSelector)); // set controller factory ControllerBuilder.Current.SetControllerFactory(new ContainerControllerFactory(containerSelector)); } The code to setup the tenants isn’t actually that hard. I’m utilizing assembly scanners in StructureMap as a simple way to pull in DLLs that are not in the AppDomain. Remember that there is a dependency on the host in the tenants and a tenant cannot simply be referenced by a host because of circular dependencies.   Tenant View Engine TenantViewEngine is a simple delegator to the tenant’s specified view engine. You might have noticed that a tenant has to define a view engine. public interface IApplicationTenant { .... IViewEngine ViewEngine { get; } } The trick comes in specifying the view engine on the tenant side. Here’s some of the code that will pull views from the DLL. protected virtual IViewEngine DetermineViewEngine() { var factory = new SparkViewFactory(); var file = GetType().Assembly.CodeBase.Without("file:///").Replace(".dll", ".Views.dll").Replace('/', '\\'); var assembly = Assembly.LoadFile(file); factory.Engine.LoadBatchCompilation(assembly); return factory; } This code resides in an abstract Tenant where the fields are setup in the constructor. This method (inside the abstract class) will load the Views assembly and load the compilation into Spark’s “Descriptors” that will be used to determine views. There is some trickery on determining the file location… but it works just fine.   Up Next There’s just a few big things left such as StructureMap configuring controllers with a convention instead of specifying types directly with container construction and content resolution. I will also try to find a way to use the Web Forms View Engine in a multi-tenant way we achieved with the Spark View Engine without using a virtual path provider. I will probably not use the Web Forms View Engine personally, but I’m sure some people would prefer using WebForms because of the maturity of the engine. As always, I love to take questions by email or on twitter. Suggestions are always welcome as well! (Oh, and here’s another link to the source code).

    Read the article

  • Curl Error 52 Empty reply from server

    - by Paul Sheldrake
    Hello, I have a cron job set up on one server to run a PHP backup script that is hosted on another server. The command I've been using is formatted like this: curl -sS http://www.example.com/backup.php Lately I've been getting this error when the cron runs: curl: (52) Empty reply from server I have no idea what this means. If I go to the link directly in my browser, the script runs fine and I get my little backup zip file. Can anyone help? Thanks, Paul

    Read the article

  • How can I speed up CURL tasks?

    - by embedded
    I'm using cURL to fetch some data from user accounts. First it logs in and then redirects to another URL where the data resides. My stats show that it takes an average of 14 seconds to fetch data spread over 5 pages. I would like to speed things up; my questions are: Is it possible to see how much time each step takes? Do you know how I could speed up/enhance cURL? Thanks
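
    For the first question, cURL can report its own timing breakdown per request via curl_getinfo(), which makes it easy to see whether DNS, connection setup, or the server's response time dominates the 14 seconds. A small sketch (the URL is a placeholder); if server time dominates, reusing one handle across the 5 pages for keep-alive, or fetching them in parallel with curl_multi, are the usual next steps.

      <?php
      // Break one fetch down into DNS, connect, first-byte and total time.
      $ch = curl_init('http://example.com/account/page1');   // placeholder URL
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_exec($ch);

      $timing = array(
          'dns'        => curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME),
          'connect'    => curl_getinfo($ch, CURLINFO_CONNECT_TIME),
          'first byte' => curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME),
          'total'      => curl_getinfo($ch, CURLINFO_TOTAL_TIME),
      );
      curl_close($ch);
      print_r($timing);
      ?>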

    Read the article
