Search Results

Search found 17749 results on 710 pages for 'connection pool'.

Page 293/710 | < Previous Page | 289 290 291 292 293 294 295 296 297 298 299 300  | Next Page >

  • How to detect internet connectivity using a Java program

    - by Sunil Kumar Sahoo
    How can I write a Java program that tells me whether I have internet access? I don't want to ping or open a connection to some external URL, because if that server is down my program will not work. I want a reliable way to detect, irrespective of the operating system, whether I have an internet connection. The program is for computers that are directly connected to the internet. I have tried the program below: URL url = new URL("http://www.xyz.com/"); URLConnection conn = url.openConnection(); conn.connect(); I want something more appropriate than this. Thanks, Sunil Kumar Sahoo
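
    A minimal sketch of one common approach, assuming that probing a couple of well-known public hosts over TCP is an acceptable heuristic; no check can give a literal 100% guarantee, and the host names, port and timeout below are illustrative only, not taken from the question:

        import java.io.IOException;
        import java.net.InetSocketAddress;
        import java.net.Socket;

        public class ConnectivityCheck {

            // Returns true if a TCP connection to host:port succeeds within timeoutMs.
            static boolean canReach(String host, int port, int timeoutMs) {
                try (Socket socket = new Socket()) {
                    socket.connect(new InetSocketAddress(host, port), timeoutMs);
                    return true;
                } catch (IOException e) {
                    return false;
                }
            }

            public static void main(String[] args) {
                // Probe more than one host so a single remote outage does not look like being offline.
                String[] hosts = {"www.google.com", "www.bing.com"};
                boolean online = false;
                for (String host : hosts) {
                    if (canReach(host, 80, 3000)) {
                        online = true;
                        break;
                    }
                }
                System.out.println(online ? "Internet connection available" : "No internet connection");
            }
        }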

    Read the article

  • help finding a hosting company with unixODBC and FreeTDS support

    - by patrick
    I need to find a hosting company that provides a LAMP stack, the P being PHP. Finding that is pretty easy, but I have a further requirement of unixODBC and FreeTDS or some equivalent. The project will require a remote connection to a Microsoft SQL 2005 database. Most of the project will use a local MySQL database, but it also requires data from a remote MS SQL 2005 database. From my reading it looks like I'll need unixODBC and FreeTDS installed on the server to make that connection. So far I've been unable to find a shared host that provides these. Can anyone suggest or use a host that might work? The project has budget limits, so we were hoping to find a shared host.

    Read the article

  • WebSocket support on mobile devices

    - by Marco W.
    For communication between players in an Android multiplayer game I'm using a WebSocket server, with TooTallNate's Java library on the client side to enable WebSocket support in the Android app. Just to point it out clearly: WebSocket support in mobile browsers is not important to me. Unfortunately, users report problems such as connection failures or messages that never arrive. Is that a general problem of WebSockets on mobile devices (blocked ports, firewalls, the mobile Internet connection), or is it more likely a flaw in the client-side code? Do you have experience with WebSocket client libraries such as the one above? I've just discovered autobahn.ws for Android, but I don't know if it's worth switching from my current library (see above). And what about WAMP? Is plain WebSocket not quite the adequate solution, so that I should use the WAMP sub-protocol instead?
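
    For reference, a minimal sketch of how a TooTallNate Java-WebSocket client is typically wired up; the server URI is a placeholder, and the callback signatures should be checked against the library version actually bundled with the app:

        import java.net.URI;

        import org.java_websocket.client.WebSocketClient;
        import org.java_websocket.handshake.ServerHandshake;

        public class GameSocket extends WebSocketClient {

            public GameSocket(URI serverUri) {
                super(serverUri);
            }

            @Override
            public void onOpen(ServerHandshake handshake) {
                send("hello");                      // connection established, safe to send now
            }

            @Override
            public void onMessage(String message) {
                // hand the message to the game logic on the appropriate thread
            }

            @Override
            public void onClose(int code, int_reason_placeholder -> {} // see note below
            }
        }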

    Read the article

  • cURL PHP Proper SSL between private servers with self-signed certificate

    - by PolishHurricane
    I originally had a connection between my 2 servers running with CURLOPT_SSL_VERIFYPEER set to "false" and no Common Name in the SSL cert, to avoid errors. The following is the client code that connected to the server with the certificate: curl_setopt($ch,CURLOPT_SSL_VERIFYPEER,FALSE); curl_setopt($ch,CURLOPT_SSL_VERIFYHOST,2); However, I recently changed this code (set it to true) and specified the server's certificate in PEM format: curl_setopt($ch,CURLOPT_SSL_VERIFYPEER,TRUE); curl_setopt($ch,CURLOPT_SSL_VERIFYHOST,2); curl_setopt($ch,CURLOPT_CAINFO,getcwd().'/includes/hostcert/Hostname.crt'); This worked great on the local network from a test machine, as the certificate uses its hostname as the CN. How can I set up the PHP code so that it only trusts that host and maintains a secure connection? I'm well aware you can just set CURLOPT_SSL_VERIFYHOST to "0" or "1" and CURLOPT_SSL_VERIFYPEER to "false", but these are not valid solutions, as they break the SSL security.

    Read the article

  • PHP File Serving Script: Unreliable Downloads?

    - by JGB146
    This post started as a question on ServerFault ( http://serverfault.com/questions/131156/user-receiving-partial-downloads ), but I determined that our PHP script was the culprit, so I'm asking an updated question here about what I believe is the actual issue. I am using a PHP script to verify permissions and then serve up a file for users of my website to download. Most of the time this works, but recently one user has been seeing problems with larger downloads: he is only getting ~80% of downloads for files that are 100MB in size. Also, all downloads from this script fail to report a filesize. Further, tests revealed that the same user COULD reliably download each of the failed files if given a direct link (at which point the filesize is reported). Here's the relevant snippet of code that we are using to serve the file:

    header("Content-type:$contenttype"); $len = filesize($filename); header("Content-Length: $len"); header("Content-Disposition: attachment; filename=".$title.".".$ext); readfile($filename);

    Note that $contenttype, $filename, $title, and $ext are all set correctly before we get here. These have been triple-checked; none of them are the problem. Also, $len does provide the correct filesize. While researching this issue, I came across this post: http://stackoverflow.com/questions/1334471/content-length-header-always-zero It seems that I am encountering the same issue: when I use the script, I get chunked encoding on the file and no size is set for Content-Length. I'm hypothesizing that something is going wrong on the large downloads, leading him to get a zero-length chunk before the end of the file. Here's what the headers look like for a direct request:

    http://www.grinderschool.com/videos/zfff5061b65ae00e8b21/KillsAids021.wmv GET /videos/zfff5061b65ae00e8b21/KillsAids021.wmv HTTP/1.1 Host: www.grinderschool.com User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729) Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: en-us,en;q=0.5 Accept-Encoding: gzip,deflate Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive: 115 Connection: keep-alive Referer: http://www.grinderschool.com/phpBB3/viewtopic.php?f=14&p=29468 Cookie: style_cookie=printonly; phpbb3_7c544_u=2; phpbb3_7c544_k=44b832912e5f887d; phpbb3_7c544_sid=e8852df42e08cc1b2250300c2897f78f; __utma=174624884.2719561324781918700.1251850714.1270986325.1270989003.575; __utmz=174624884.1264524375.411.12.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=low%20stakes%20poker%20videos; phpbb3_cmviy_k=; phpbb3_cmviy_u=2; phpbb3_cmviy_sid=d8df5c0943863004ca40ef9c392d371d; __utmb=174624884.4.10.1270989003; __utmc=174624884 Pragma: no-cache Cache-Control: no-cache HTTP/1.1 200 OK Date: Sun, 11 Apr 2010 12:57:41 GMT Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_auth_passthrough/2.1 FrontPage/5.0.2.2635 Last-Modified: Sun, 04 Apr 2010 12:51:06 GMT Etag: "eb42d6-7d9b843-48368aa6dc280" Accept-Ranges: bytes Content-Length: 131708995 Keep-Alive: timeout=10, max=30 Connection: Keep-Alive Content-Type: video/x-ms-wmv

    And here's what they look like for the request answered by my script:

    http://www.grinderschool.com/download_video_test.php?t=KillsAids021&format=wmv GET /download_video_test.php?t=KillsAids021&format=wmv HTTP/1.1 Host: www.grinderschool.com User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729) Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: en-us,en;q=0.5 Accept-Encoding: gzip,deflate Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive: 115 Connection: keep-alive Cookie: style_cookie=printonly; phpbb3_7c544_u=2; phpbb3_7c544_k=44b832912e5f887d; phpbb3_7c544_sid=e8852df42e08cc1b2250300c2897f78f; __utma=174624884.2719561324781918700.1251850714.1270986325.1270989003.575; __utmz=174624884.1264524375.411.12.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=low%20stakes%20poker%20videos; phpbb3_cmviy_k=; phpbb3_cmviy_u=2; phpbb3_cmviy_sid=d8df5c0943863004ca40ef9c392d371d; __utmb=174624884.4.10.1270989003; __utmc=174624884 HTTP/1.1 200 OK Date: Sun, 11 Apr 2010 12:58:02 GMT Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_auth_passthrough/2.1 FrontPage/5.0.2.2635 X-Powered-By: PHP/5.2.11 Content-Disposition: attachment; filename=KillsAids021.wmv Vary: Accept-Encoding Content-Encoding: gzip Keep-Alive: timeout=10, max=30 Connection: Keep-Alive Transfer-Encoding: chunked Content-Type: video/x-ms-wmv

    So the question is: what can I do to make downloads from the script work properly? Again, for 99% of users it works as is (though I find it annoying that no filesize is reported, and thus that no time estimate can be computed for the download).

    Read the article

  • Can't connect to Sunspot server on an Ubuntu server

    - by Chris Benseler
    I followed the steps in https://github.com/outoftime/sunspot/wiki/Adding-Sunspot-search-to-Rails-in-5-minutes-or-less to install and set up Sunspot search in Rails on Mac OS, and it works fine. On an Ubuntu server, I get a connection refused error. When I run rake sunspot:solr:start, the process starts and the file sunspot-solr-development.pid is created in /tmp/pids. But when I try to reindex with rake sunspot:reindex I get: ... rake aborted! Connection refused - connect(2) I tried running the commands with sudo and gave permission 777 to the project files, but the error persists. Rails 3.0.8. I don't know where else to search for a solution...

    Read the article

  • Communications link failure in a Hibernate-based Java servlet application powered by MySQL

    - by Vatsala
    Let me describe my question. I have a Java application with Hibernate as the DB interfacing layer over MySQL, and I get a communications link failure error in the application. This error occurs in a very specific case: when I leave the MySQL server unattended for more than approximately 6 hours (i.e. when no queries are issued to MySQL for more than approximately 6 hours). I am pasting the top-level exception description below, with a pastebin link to the detailed stack trace.

    javax.persistence.PersistenceException: org.hibernate.exception.JDBCConnectionException: Cannot open connection - Caused by: org.hibernate.exception.JDBCConnectionException: Cannot open connection - Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure - The last packet successfully received from the server was 1,274,868,181,212 milliseconds ago. The last packet sent successfully to the server was 0 milliseconds ago. - Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure - The last packet successfully received from the server was 1,274,868,181,212 milliseconds ago. The last packet sent successfully to the server was 0 milliseconds ago. - Caused by: java.net.ConnectException: Connection refused: connect

    Here is the link to the pastebin for further investigation: http://pastebin.com/4KujAmgD

    What I understand from these exceptions is that MySQL refuses to accept any connections after a period of idle activity. I have been reading up a bit about this via Google and learned that one possible way to overcome it is to set values for c3p0 properties, since c3p0 comes bundled with Hibernate. Specifically, I read at http://www.mchange.com/projects/c3p0/index.html that setting the two properties idleConnectionTestPeriod and preferredTestQuery should solve this for me, but these values don't seem to have had an effect. Is this the correct approach to fixing this? If not, what is the right way to get past it? The following are related communications link failure questions at stackoverflow.com, but I've not found a satisfactory answer among them: http://stackoverflow.com/questions/2121829/java-db-communications-link-failure http://stackoverflow.com/questions/298988/how-to-handle-communication-link-failure Note 1 - I don't get this error when I am using my application continuously. Note 2 - I use JPA with Hibernate, so my hibernate.dialect and other Hibernate properties reside in persistence.xml in the META-INF folder (does that prevent the c3p0 properties from working?) edit - Here are the c3p0 parameters I tried out - select 1; 2
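
    For reference, a rough sketch of the kind of c3p0 settings that are commonly placed in persistence.xml when Hibernate manages the pool; the numeric values are purely illustrative, and in many setups preferredTestQuery has to live in a separate c3p0.properties file rather than here, so verify the property names against the c3p0 and Hibernate documentation for the versions in use:

        <!-- inside the <properties> element of the persistence unit in META-INF/persistence.xml -->
        <property name="hibernate.c3p0.min_size" value="5"/>
        <property name="hibernate.c3p0.max_size" value="20"/>
        <property name="hibernate.c3p0.timeout" value="300"/>
        <property name="hibernate.c3p0.max_statements" value="50"/>
        <!-- test idle connections every 5 minutes, well below MySQL's wait_timeout -->
        <property name="hibernate.c3p0.idle_test_period" value="300"/>

    The idea is to test or retire idle connections more often than MySQL's wait_timeout closes them, so the pool never hands the application a connection the server has already dropped.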

    Read the article

  • Building a webserver, client doesn't acknowledge HTTP 200 OK frame.

    - by Evert
    Hi there, I'm building my own webserver based on a tutorial. I have found a simple way to initiate a TCP connection and send one segment of HTTP data (the webserver will run on a microcontroller, so it will be very small). Anyway, the following is the sequence I need to go through:

    1. receive SYN
    2. send SYN,ACK
    3. receive ACK (the connection is now established)
    4. receive ACK with HTTP GET command
    5. send ACK
    6. send FIN,ACK with HTTP data (e.g. 200 OK)
    7. receive FIN,ACK <- I don't receive this packet!
    8. send ACK

    Everything works fine until I send my acknowledgement and HTTP 200 OK message. The client won't send an acknowledgement for those two packets, and thus no webpage is displayed. I've added a pcap file of the sequence as I recorded it with Wireshark. Pcap file: http://cl.ly/5f5 (now it's the right data). All sequence and acknowledgement numbers are correct, the checksums are OK, and the flags are also right. I have no idea what is going wrong.

    Read the article

  • libcurl - unable to download a file

    - by marmistrz
    I'm working on a program which will download lyrics from sites like AZLyrics. I'm using libcurl. Here is my code.

    lyricsDownloader.cpp: #include "lyricsDownloader.h" #include <curl/curl.h> #include <cstring> #include <iostream> #define DEBUG 1 ///////////////////////////////////////////////////////////////////////////// size_t lyricsDownloader::write_data_to_var(char *ptr, size_t size, size_t nmemb, void *userdata) // this function is a static member function { ostringstream * stream = (ostringstream*) userdata; size_t count = size * nmemb; stream->write(ptr, count); return count; } string AZLyricsDownloader::toProviderCode() const { /*this creates an url*/ } CURLcode AZLyricsDownloader::download() { CURL * handle; CURLcode err; ostringstream buff; handle = curl_easy_init(); if (! handle) return static_cast<CURLcode>(-1); // set verbose if debug on curl_easy_setopt( handle, CURLOPT_VERBOSE, DEBUG ); curl_easy_setopt( handle, CURLOPT_URL, toProviderCode().c_str() ); // set the download url to the generated one curl_easy_setopt(handle, CURLOPT_WRITEDATA, &buff); curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, &AZLyricsDownloader::write_data_to_var); err = curl_easy_perform(handle); // The segfault should be somewhere here - after calling the function but before it ends cerr << "cleanup\n"; curl_easy_cleanup(handle); // copy the contents to text variable lyrics = buff.str(); return err; }

    main.cpp: #include <QString> #include <QTextEdit> #include <iostream> #include "lyricsDownloader.h" int main(int argc, char *argv[]) { AZLyricsDownloader dl(argv[1], argv[2]); dl.perform(); QTextEdit qtexted(QString::fromStdString(dl.lyrics)); cout << qPrintable(qtexted.toPlainText()); return 0; }

    When running ./maelyrica Anthrax Madhouse I'm getting this logged from curl: * About to connect() to azlyrics.com port 80 (#0) * Trying 174.142.163.250... * connected * Connected to azlyrics.com (174.142.163.250) port 80 (#0) > GET /lyrics/anthrax/madhouse.html HTTP/1.1 Host: azlyrics.com Accept: */* < HTTP/1.1 301 Moved Permanently < Server: nginx/1.0.12 < Date: Thu, 05 Jul 2012 16:59:21 GMT < Content-Type: text/html < Content-Length: 185 < Connection: keep-alive < Location: http://www.azlyrics.com/lyrics/anthrax/madhouse.html < Segmentation fault

    Strangely, the file is there. The same error is displayed when there's no such page (redirect to the azlyrics.com main page). What am I doing wrong? Thanks in advance.

    EDIT: I made the function for writing data static, but this changes nothing. Even wget seems to have problems: $ wget http://www.azlyrics.com/lyrics/anthrax/madhouse.html --2012-07-06 10:36:05-- http://www.azlyrics.com/lyrics/anthrax/madhouse.html Resolving www.azlyrics.com... 174.142.163.250 Connecting to www.azlyrics.com|174.142.163.250|:80... connected. HTTP request sent, awaiting response... No data received. Retrying. Why does opening the page in a browser work and wget/curl not?

    EDIT2: After adding this: curl_easy_setopt(handle, CURLOPT_FOLLOWLOCATION, 1); the log is: * About to connect() to azlyrics.com port 80 (#0) * Trying 174.142.163.250... * connected * Connected to azlyrics.com (174.142.163.250) port 80 (#0) > GET /lyrics/anthrax/madhouse.html HTTP/1.1 Host: azlyrics.com Accept: */* < HTTP/1.1 301 Moved Permanently < Server: nginx/1.0.12 < Date: Fri, 06 Jul 2012 09:09:47 GMT < Content-Type: text/html < Content-Length: 185 < Connection: keep-alive < Location: http://www.azlyrics.com/lyrics/anthrax/madhouse.html < * Ignoring the response-body * Connection #0 to host azlyrics.com left intact * Issue another request to this URL: 'http://www.azlyrics.com/lyrics/anthrax/madhouse.html' * About to connect() to www.azlyrics.com port 80 (#1) * Trying 174.142.163.250... * connected * Connected to www.azlyrics.com (174.142.163.250) port 80 (#1) > GET /lyrics/anthrax/madhouse.html HTTP/1.1 Host: www.azlyrics.com Accept: */* < HTTP/1.1 200 OK < Server: nginx/1.0.12 < Date: Fri, 06 Jul 2012 09:09:47 GMT < Content-Type: text/html < Transfer-Encoding: chunked < Connection: keep-alive < Segmentation fault

    Read the article

  • MySQL query killing my server

    - by Webnet
    Looking at this query there's got to be something bogging it down that I'm not noticing. I ran it for 7 minutes and it only updated 2 rows. //set product count for makes $tru->query->run(array( 'name' => 'get-make-list', 'sql' => 'SELECT id, name FROM vehicle_make', 'connection' => 'core' )); while($tempMake = $tru->query->getArray('get-make-list')) { $tru->query->run(array( 'name' => 'update-product-count', 'sql' => 'UPDATE vehicle_make SET product_count = ( SELECT COUNT(product_id) FROM taxonomy_master WHERE v_id IN ( SELECT id FROM vehicle_catalog WHERE make_id = '.$tempMake['id'].' ) ) WHERE id = '.$tempMake['id'], 'connection' => 'core' )); } I'm sure this query can be optimized to perform better, but I can't think of how to do it. vehicle_make = 45 rows taxonomy_master = 11,223 rows vehicle_catalog = 5,108 rows All tables have appropriate indexes
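
    One common way to avoid the row-by-row PHP loop is to let MySQL update every make in a single statement. A sketch, assuming (as the original subqueries imply) that taxonomy_master.v_id references vehicle_catalog.id and that vehicle_catalog.make_id references vehicle_make.id:

        UPDATE vehicle_make m
        SET m.product_count = (
            SELECT COUNT(t.product_id)
            FROM taxonomy_master t
            JOIN vehicle_catalog c ON t.v_id = c.id
            WHERE c.make_id = m.id
        );

    With indexes on vehicle_catalog.make_id and taxonomy_master.v_id this runs once over the 45 makes instead of issuing 45 separate correlated UPDATEs from PHP.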

    Read the article

  • PHP MySQL syntax error

    - by Jacksta
    My script is supposed to look up contacts in a table and present them on the screen to then be edited; however, this is not the case. I am getting the error Parse error: syntax error, unexpected $end in /home/admin/domains/domain.com.au/public_html/pick_modcontact.php on line 50 (NOTE: this is the last line in the script). <? session_start(); if ($_SESSION[valid] != "yes") { header( "Location: contact_menu.php"); exit; } $db_name = "testDB"; $table_name = "my_contacts"; $connection = @mysql_connect("localhost", "user", "pass") or die(mysql_error()); $db = @mysql_select_db($db_name, $connection) or die(mysql_error()); $sql = "SELECT id, f_name, l_name FROM $table_name ORDER BY f_name"; $result = @mysql_query($sql, $connection) or die(mysql_error()); $num = @mysql_num_rows($result); if ($num < 1) { $display_block = "<p><em>Sorry No Results!</em></p>"; } else { while ($row = mysql_fetch_array($result)) { $id = $row['id']; $f_name = $row['f_name']; $l_name = $row['l_name']; $option_block .= "<option value\"$id\">$l_name, $f_name</option>"; } $display_block = "<form method=\"POST\" action=\"show_modcontact.php\"> <p><strong>Contact:</strong> <select name=\"id\">$option_block</select> <input type=\"submit\" name=\"submit\" value=\"Select This Contact\"></p> </form>"; ?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>Modify A Contact</title> </head> <body> <h1>My Contact Management System</h1> <h2><em>Modify a Contact</em></h2> <p>Select a contact from the list below, to modify the contact's record.</p> <? echo "$display_block"; ?> <br> <p><a href="contact_menu.php">Return to Main Menu</a></p> </body> </html>

    Read the article

  • PHP script to return a specific XML tag value

    - by Alaa
    I need a PHP script to extract the IP address from the following XML page: www.ip-address.com/test.xml <dnstools> <service_provider>Domain Tools</service_provider> <provider_url>http://www.domaintools.com/</provider_url> <date>Wed, 12 May 2010 00:43:07 GMT</date> <unix_time>1273624987</unix_time> <ip_address>94.252.157.241</ip_address> <hostname>94.252.157.241</hostname> <blacklist_status>Clear</blacklist_status> <remote_port>43577</remote_port> <protocol>HTTP/1.0</protocol> <connection>keep-alive</connection> <keep_alive/>

    Read the article

  • Does a C# using statement perform try/finally?

    - by Lirik
    Suppose that I have the following code: private void UpdateDB(QuoteDataSet dataSet, String tableName) { using(SQLiteConnection conn = new SQLiteConnection(_connectionString)) { conn.Open(); using (SQLiteTransaction transaction = conn.BeginTransaction()) { using (SQLiteCommand cmd = new SQLiteCommand("SELECT * FROM " + tableName, conn)) { using (SQLiteDataAdapter sqliteAdapter = new SQLiteDataAdapter()) { sqliteAdapter.Update(dataSet, tableName); } } transaction.Commit(); } } } The C# documentation states that with a using statement the object within the scope will be disposed, and I've seen several places where it's suggested that we don't need to use a try/finally clause. I usually surround my connections with a try/finally, and I always close the connection in the finally clause. Given the above code, is it reasonable to assume that the connection will be closed if there is an exception?

    Read the article

  • Lamp with mod_fastcgi

    - by Jonathan
    Hi! I am building a CGI application, and I would like it to behave like a persistent application that handles each connection, so that all session variables can be kept in memory instead of being saved to a file (or anywhere else) and loaded again on every new connection. I am using LAMP inside a Linux VMware image, but I can't figure out how to install the module and what to change in httpd.conf to make it work. I tried to compile the module, but I couldn't, because my Apache isn't a regular installation; it's a pre-built LAMP bundle, and it seems the module needs the Apache source directory to be compiled. I've seen some coding examples out there, so I guess it's not that hard once it's running OK with Apache. Can you help me with this please? Thanks, Joe

    Read the article

  • Wireless barcode scanner

    - by Zinx
    Hi All, I have 2 wireless barcode scanners. I have created an application in C# which reads a barcode and sends the data to a web service, which then manipulates the data and does further processing. When I start the application, it first tries to connect to the web service and will proceed further only if the connection succeeds. The problem I am facing is that if I deploy the application through Visual Studio, it works fine and connects to the web service. But if I just copy the contents (exe and config files) manually, it gives an "unknown host name" error. Can someone please help me understand how this connection works? Does it need some special settings on the scanner device that Visual Studio applies automatically during deployment? Thanks and Cheers.

    Read the article

  • How to change a Datasource's username/password at runtime in a J2EE app?

    - by Toto
    I've deployed a web module which connects to the database via a datasource configured in the J2EE application server. Currently, the user/password for the database connection is set in the application server's datasources configuration file. I want to change the datasource's user/password at runtime (e.g. implement a new web form in which the user is asked to enter the user/password to be used for the database connection). Is there a standard way to do that in J2EE applications, or does it depend on the J2EE application server? In this case I'm using the Orion application server.
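
    One standard building block worth noting here (a sketch, not a full answer, and the JNDI name below is made up for illustration): javax.sql.DataSource defines a getConnection(user, password) overload, so credentials collected from a web form can be passed at connection time without editing the server's datasource configuration; whether the container's pool honours per-user credentials is vendor-specific and would need to be checked for Orion:

        import java.sql.Connection;

        import javax.naming.InitialContext;
        import javax.sql.DataSource;

        public class UserScopedConnections {

            // user and password come from the submitted form; the JNDI name is hypothetical.
            public static Connection open(String user, String password) throws Exception {
                InitialContext ctx = new InitialContext();
                DataSource ds = (DataSource) ctx.lookup("java:comp/env/jdbc/MyDataSource");
                // Standard JDBC: override the credentials configured on the datasource for this connection only.
                return ds.getConnection(user, password);
            }
        }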

    Read the article

  • System.Net.WebException when the system has been connected for a while

    - by Sharpeye500
    Hi, I have an ASP.NET application connecting to a .NET web service. Whenever I log in and reach the main page, the web service returns the data. But if I have been logged in for maybe more than 1 or 2 hours, when I click any of the links I get: "System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond" What is the fix for this? Thanks.

    Read the article

  • Trouble getting Flash socket policy file to work.

    - by Alex
    Basically I'm using Flash to connect to a Java server. Despite my Java application replying to the policy file request, the Flash debug log lists (not sure about the order, as there are lots): * Security Sandbox Violation * Connection to 192.168.1.86:4049 halted - not permitted from http://127.0.0.1:8888/Current/wander.swf Warning: Timeout on xmlsocket://192.168.1.86:4049 (at 3 seconds) while waiting for socket policy file. This should not cause any problems, but see http://www.adobe.com/go/strict_policy_files for an explanation. Error: Request for resource at xmlsocket://192.168.1.86:4049 by requestor from http://127.0.0.1:8888/Current/wander.swf is denied due to lack of policy file permissions. What I don't understand is that the server (port 4049) receives the request, outputs the policy file and then closes the connection, so surely it shouldn't time out? The policy file I'm using is: <?xml version="1.0"?> <cross-domain-policy><allow-access-from domain="*" to-ports="*" /> </cross-domain-policy>
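
    For context, a sketch of the exchange the Flash runtime expects from a raw socket policy server; the port matches the one in the question, but the NUL-terminated request/response framing is the usual convention rather than something stated in the post, so treat it as an assumption to verify:

        import java.io.OutputStream;
        import java.net.ServerSocket;
        import java.net.Socket;
        import java.nio.charset.StandardCharsets;

        public class PolicyResponder {

            private static final String POLICY =
                "<?xml version=\"1.0\"?>"
                + "<cross-domain-policy>"
                + "<allow-access-from domain=\"*\" to-ports=\"*\"/>"
                + "</cross-domain-policy>";

            public static void main(String[] args) throws Exception {
                // Flash sends "<policy-file-request/>" followed by a NUL byte,
                // then waits for the policy XML followed by a NUL byte.
                try (ServerSocket server = new ServerSocket(4049)) {
                    while (true) {
                        try (Socket client = server.accept()) {
                            // Read and discard the request up to its NUL terminator.
                            int b;
                            while ((b = client.getInputStream().read()) > 0) { /* skip */ }
                            OutputStream out = client.getOutputStream();
                            out.write(POLICY.getBytes(StandardCharsets.US_ASCII));
                            out.write(0);                 // the trailing NUL byte matters
                            out.flush();
                        }
                    }
                }
            }
        }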

    Read the article

  • How to retrieve email from GMail account using PHP?

    - by Tatu Ulmanen
    Hi, I'm trying to automatically retrieve some email from my GMail account for further parsing, but I can't get my head around how to do that. I've searched the internet and it was suggested that I use PHP's imap functions, like this: $server = '{imap.gmail.com:993/ssl}'; $connection = imap_open($server, '[email protected]', 'password'); But using that code, I get: Warning: imap_open() [function.imap-open]: Couldn't open stream {imap.gmail.com:993/ssl} Any idea what I am doing wrong? Is there any server setting that might be preventing me from making a connection to GMail (I'm using a shared service)? Is the address even right? Has anyone ever managed to do something like this? I've found tons of examples of how to send email via GMail, but very little about retrieving it. Any help is much appreciated.

    Read the article

  • HttpURLConnection does not read the whole response

    - by Peter Szanto
    I use HttpURLConnection to do an HTTP POST, but I don't always get back the full response. I wanted to debug the problem, but when I step through each line it works. I thought it must be a timing issue, so I added Thread.sleep and that really made my code work, but this is only a temporary workaround. I wonder why this is happening and how to solve it. Here is my code:

    URL u = new URL(url);
    URLConnection c = u.openConnection();
    InputStream in = null;
    String mediaType = null;
    if (c instanceof HttpURLConnection) {
        //c.setConnectTimeout(1000000);
        //c.setReadTimeout(1000000);
        HttpURLConnection h = (HttpURLConnection) c;
        h.setRequestMethod("POST");
        //h.setChunkedStreamingMode(-1);
        setAccept(h, expectedMimeType);
        h.setRequestProperty("Content-Type", inputMimeType);
        for (String key : httpHeaders.keySet()) {
            h.setRequestProperty(key, httpHeaders.get(key));
            if (logger.isDebugEnabled()) {
                logger.debug("Request property key : " + key + " / value : " + httpHeaders.get(key));
            }
        }
        h.setDoOutput(true);
        h.connect();
        OutputStream out = h.getOutputStream();
        out.write(input.getBytes());
        out.close();
        mediaType = h.getContentType();
        logger.debug(" ------------------ sleep ------------------ START");
        try {
            Thread.sleep(2000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        logger.debug(" ------------------ sleep ------------------ END");
        if (h.getResponseCode() < 400) {
            in = h.getInputStream();
        } else {
            in = h.getErrorStream();
        }

    It generates the following HTTP headers:

    POST /emailauthentication/ HTTP/1.1 Accept: application/xml Content-Type: application/xml Authorization: OAuth oauth_consumer_key="b465472b-d872-42b9-030e-4e74b9b60e39",oauth_nonce="YnDb5eepuLm%2Fbs",oauth_signature="dbN%2FWeWs2G00mk%2BX6uIi3thJxlM%3D", oauth_signature_method="HMAC-SHA1", oauth_timestamp="1276524919", oauth_token="", oauth_version="1.0" User-Agent: Java/1.6.0_20 Host: test:6580 Connection: keep-alive Content-Length: 1107

    In other posts it was suggested to turn off keep-alive by using the http.keepAlive=false system property. I tried that, and the headers changed to:

    POST /emailauthentication/ HTTP/1.1 Accept: application/xml Content-Type: application/xml Authorization: OAuth oauth_consumer_key="b465472b-d872-42b9-030e-4e74b9b60e39", oauth_nonce="Eaiezrj6X4Ttt0", oauth_signature="ND9fAdZMqbYPR2j%2FXUCZmI90rSI%3D", oauth_signature_method="HMAC-SHA1", oauth_timestamp="1276526608", oauth_token="", oauth_version="1.0" User-Agent: Java/1.6.0_20 Host: test:6580 Connection: close Content-Length: 1107

    The Connection header is now "close", but I still cannot read the whole response. Any idea what I am doing wrong?

    Read the article

  • accessing pdf via https URL

    - by Paul
    I send out a newsletter email containing URLs to an https website that then redirects to a PDF document. On first invocation of a URL the user is prompted with the typical https browser "security alert" popup; on selecting "Yes" the display of the PDF fails. The HTTP headers on the failed response are: HTTP/1.1 200 OK Server: ECS/HTTP-Server Date: Tue, 16 Mar 2010 15:57:26 GMT Content-type: application/pdf Content-language: en-US Set-cookie: JSESSIONID=0000r111cRz1Vc-PtCJg8Cdu4eR:-1; Path=/ Expires: Thu, 01 Dec 1994 16:00:00 GMT Cache-control: no-cache="set-cookie, set-cookie2" Connection: close Subsequent invocations of the URL successfully open the PDF (at this point we have the session id cookie set by the initial failed request). The HTTP headers on the successful response are: HTTP/1.1 200 OK Server: ECS/HTTP-Server Date: Tue, 16 Mar 2010 16:53:03 GMT Content-type: application/pdf Content-language: en-US Connection: close The email client is Lotus Notes 6.5, which launches an IE6 browser. Any ideas?

    Read the article

  • Are there Python ORMs out there that support multiple independent databases concurrently in use?

    - by sdt
    I'm writing an application in Python where I wish to use sqlite as the backing store for documents edited by the app, with documents generally living in memory, but being saved to disk-based databases when the application saves. Ideally I'd like to use something like an ORM to make access to the data from my Python application code simple. Unfortunately it seems like the majority of Python ORMs, including SQLAlchemy, SQLObject, Django, and Storm, associate the database connection (or engine or whatever) with the classes representing table data, rather than instances of those classes. This restricts these ORMs to working with a single database connection across all instances. Since I'd like to support having multiple documents open simultaneously, this isn't going to work for me. Are there any ORMs out there that support this usage model in Python? Bazaar seems to support this, but it's quite out of date, and at first glance appears to have some other shortcomings. Thanks for any suggestions!

    Read the article

  • python interactive web data/forms/interface communicating with remote server

    - by decipher
    What's an efficient (and preferably simple) method for communicating with a remote server and allowing the user to 'interact' with it (i.e. submit commands through a user interface) via the web browser (i.e. a text box to input commands and a text area for output, or various command-less abstracted interfaces)? I have the 'standalone' Python code for the communication finished and working (terminal/console based right now). My primary concern is refactoring the code to suit the web, which involves establishing a connection (Python sockets) and maintaining the connection while the user is logged on. Some further details: I'm currently using the Django framework for the basic back end/templates.

    Read the article

  • Reading HttpURLConnection InputStream - manual buffer or BufferedInputStream?

    - by stormin986
    When reading the InputStream of an HttpURLConnection, is there any reason to use one of the following over the other? I've seen both used in examples. Manual buffer: while ((length = inputStream.read(buffer)) > 0) { os.write(buffer, 0, length); } BufferedInputStream: is = http.getInputStream(); bis = new BufferedInputStream(is); ByteArrayBuffer baf = new ByteArrayBuffer(50); int current = 0; while ((current = bis.read()) != -1) { baf.append(current); } EDIT: I'm still new to HTTP in general, but one consideration that comes to mind is that if I am using a persistent HTTP connection, I can't just read until the input stream is empty, right? In that case, wouldn't I need to read the message length and just read the input stream for that length? And similarly, if NOT using a persistent connection, is the code I included 100% good to go in terms of reading the stream properly?
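
    For comparison, a sketch of the buffered-read pattern applied to an HttpURLConnection response; the connection layer delimits the message (via Content-Length or chunked encoding), so reading until read() returns -1 is safe even with keep-alive. The URL is a placeholder:

        import java.io.ByteArrayOutputStream;
        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        public class ReadBody {
            public static void main(String[] args) throws Exception {
                HttpURLConnection http =
                    (HttpURLConnection) new URL("http://example.com/").openConnection();
                ByteArrayOutputStream body = new ByteArrayOutputStream();
                try (InputStream in = http.getInputStream()) {
                    byte[] buffer = new byte[8192];          // read in chunks rather than byte by byte
                    int length;
                    while ((length = in.read(buffer)) != -1) {
                        body.write(buffer, 0, length);       // -1 marks the end of THIS response, not the socket
                    }
                }
                System.out.println(body.size() + " bytes read");
            }
        }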

    Read the article
