Search Results

Search found 59381 results on 2376 pages for 'http request'.

  • Limit TCP requests per IP

    - by asmo
    Hello! I'm wondering how to limit TCP requests per client (per specific IP) in Java. For example, I would like to allow a maximum of X requests per Y seconds for each client IP. I thought of using a static Timer/TimerTask in combination with a HashSet of temporarily restricted IPs. private static final Set<InetAddress> restrictedIPs = Collections.synchronizedSet(new HashSet<InetAddress>()); private static final Timer restrictTimer = new Timer(); So when a user connects to the server, I add its IP to the restricted list and start a task to unrestrict it in X seconds. restrictedIPs.add(socket.getInetAddress()); restrictTimer.schedule(new TimerTask() { public void run() { restrictedIPs.remove(socket.getInetAddress()); } }, MIN_REQUEST_INTERVAL); My problem is that by the time the task runs, the socket object may be closed and the remote IP address won't be accessible anymore... Any ideas are welcome! Also, if someone knows a built-in way to achieve this in a Java framework, I'd really like to hear it.
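
    One way to sidestep the closed-socket concern is to capture the remote address in a final local variable before scheduling the task, so the TimerTask never touches the socket at all. Below is a minimal sketch of that idea; the class and method names (IpThrottle, tryAccept) are hypothetical, and MIN_REQUEST_INTERVAL is given an illustrative value.

        import java.net.InetAddress;
        import java.util.Collections;
        import java.util.HashSet;
        import java.util.Set;
        import java.util.Timer;
        import java.util.TimerTask;

        class IpThrottle {
            private static final long MIN_REQUEST_INTERVAL = 5000; // ms, illustrative value
            private static final Set<InetAddress> restrictedIPs =
                    Collections.synchronizedSet(new HashSet<InetAddress>());
            private static final Timer restrictTimer = new Timer(true);

            /** Returns true if the request may proceed, false if the IP is still restricted. */
            static boolean tryAccept(final InetAddress remote) {
                if (!restrictedIPs.add(remote)) {
                    return false; // still inside the restriction window
                }
                restrictTimer.schedule(new TimerTask() {
                    @Override public void run() {
                        restrictedIPs.remove(remote); // lift the restriction later
                    }
                }, MIN_REQUEST_INTERVAL);
                return true;
            }
        }

    The caller would do something like: if (!IpThrottle.tryAccept(socket.getInetAddress())) socket.close(); — the address is read while the socket is still open, so the scheduled task needs nothing from the socket itself.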

  • How can my Facebook application post a message to a wall?

    - by Thomas Dekiere
    I already found out how to post something to a wall with the Graph API on behalf of the Facebook user. But now I want to post something in the name of my application. Here is how I'm trying to do this: protected void btn_submit_Click(object sender, EventArgs e) { Dictionary<string, string> data = new Dictionary<string, string>(); data.Add("message", "Testing"); // I'll add more data later here (picture, link, ...) data.Add("access_token", FbGraphApi.getAppToken()); FbGraphApi.postOnWall(ConfigSettings.getFbPageId(), data); } FbGraphApi.getAppToken() // ... private static string graphUrl = "https://graph.facebook.com"; //... public static string getAppToken() { MyWebRequest req = new MyWebRequest(graphUrl + "/" + "oauth/access_token?type=client_cred&client_id=" + ConfigSettings.getAppID() + "&client_secret=" + ConfigSettings.getAppSecret(), "GET"); return req.GetResponse().Split('=')[1]; } FbGraphApi.postOnWall() public static void postOnWall(string id, Dictionary<string,string> args) { call(id, "feed", args); } FbGraphApi.call() private static void call(string id, string method, Dictionary<string,string> args ) { string data = ""; foreach (KeyValuePair<string, string> arg in args) { data += arg.Key + "=" + arg.Value + "&"; } MyWebRequest req = new MyWebRequest(graphUrl +"/" + id + "/" + method, "POST", data.Substring(0, data.Length - 1)); req.GetResponse(); // here I get: "The remote server returned an error: (403) Forbidden." } Does anyone see where this is going wrong? I'm really stuck on this. Thanks!

  • ServerVariables and POST Data

    - by tyndall
    I've got a piece of tracking code which is capturing REMOTE_HOST, SERVER, REQUEST_METHOD, SCRIPT_NAME and QUERY_STRING. It grabs these from ServerVariables and sticks them in a database by user and IP. What is the best way to pick up the exact contents of what was posted back to a URL in ASP.NET? Is there an HTTP_POST? I'd rather not grab something and then have to parse it.
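
    For reference, ASP.NET exposes the posted data itself in two places: Request.Form (the parsed fields of a form post) and Request.InputStream (the raw bytes of the body). Below is a minimal sketch of capturing both alongside the server variables already being logged; the helper name PostCapture is hypothetical.

        using System.IO;
        using System.Web;

        public static class PostCapture
        {
            // Raw bytes of the POST body, exactly as the client sent them.
            public static string RawBody(HttpRequest request)
            {
                request.InputStream.Position = 0; // rewind in case it was already read
                using (var reader = new StreamReader(request.InputStream))
                {
                    return reader.ReadToEnd();
                }
            }

            // Parsed key/value pairs for application/x-www-form-urlencoded posts.
            public static string ParsedFields(HttpRequest request)
            {
                return request.Form.ToString(); // e.g. "a=1&b=2"
            }
        }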

  • Can you iterate over chunks() with request.POST in Django?

    - by Sebastian
    I'm trying to optimize a site I'm building with Django/Flash and am having a problem using Django's iterate over chunks() feature. I'm sending an image from Flash to Django using request.POST data rather than through a form (using request.FILES). The problem I foresee is that if there is large user volume, I could potentially kill memory. But it seems that Django only allows iterating over chunks with request.FILES. Is there a way to: 1) wrap my request.POST data into a request.FILES (thus spoofing Django) or 2) use chunks() with request.POST data
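
    If the raw body can be read incrementally, the upload never has to sit in memory as one big request.POST value. A sketch of that idea, assuming a Django version in which HttpRequest exposes a file-like interface (request.read); the function name and chunk size are illustrative.

        # Stream the raw POST body (e.g. the image sent from Flash) to disk in chunks.
        def save_upload(request, destination_path, chunk_size=64 * 1024):
            with open(destination_path, "wb") as destination:
                while True:
                    chunk = request.read(chunk_size)  # at most chunk_size bytes per call
                    if not chunk:
                        break
                    destination.write(chunk)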

  • How to cancel a deeply nested process

    - by Mystere Man
    I have a class that is a "manager" sort of class. One of its functions is to signal that the long-running process of the class should shut down. It does this by setting a boolean called "IsStopping" in the class. public class Foo { bool isStopping; void DoWork() { while (!isStopping) { // do work... } } } Now, DoWork() was a gigantic function, and I decided to refactor it, and as part of the process broke some of it into other classes. The problem is, some of these classes also have long-running functions that need to check if isStopping is true. public class Foo { bool isStopping; void DoWork() { while (!isStopping) { MoreWork mw = new MoreWork(); mw.DoMoreWork(); // possibly long running // do work... } } } What are my options here? I have considered passing isStopping by reference, which I don't really like because it requires there to be an outside object. I would prefer to make the additional classes as standalone and dependency-free as possible. I have also considered making isStopping a property, and then having it raise an event that the inner classes could subscribe to, but this seems overly complex. Another option was to create a "Process Cancellation Token" class, similar to what .NET 4 Tasks use, and then pass that token to those classes. How have you handled this situation? EDIT: Also consider that MoreWork might have an EvenMoreWork object that it instantiates and calls a potentially long-running method on... and so on. I guess what I'm looking for is a way to signal an arbitrary number of objects down a call tree to tell them to stop what they're doing, clean up and return.
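
    The cancellation-token idea mentioned in the question can stay very small. Below is a minimal sketch under the assumption that a single shared token object is handed down the call tree and polled by every long-running loop; the type names (StopToken, RequestStop) are hypothetical, and on .NET 4 the built-in CancellationTokenSource/CancellationToken pair fills the same role.

        // One shared token is passed downward; each level only depends on the token type.
        public class StopToken
        {
            private volatile bool stopRequested;
            public bool IsStopRequested { get { return stopRequested; } }
            public void RequestStop() { stopRequested = true; }
        }

        public class MoreWork
        {
            public void DoMoreWork(StopToken token)
            {
                while (!token.IsStopRequested)
                {
                    // long-running inner work, polling the token periodically
                }
            }
        }

        public class Foo
        {
            private readonly StopToken token = new StopToken();

            public void Stop() { token.RequestStop(); }

            public void DoWork()
            {
                while (!token.IsStopRequested)
                {
                    new MoreWork().DoMoreWork(token); // the same token goes down the call tree
                }
            }
        }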

  • someone help me please

    - by Ronnie Chester Lynwood
    Hey, I want to build something but I need some help. I've got an index.php with some code, and I added a "file" parameter to index.php. What I mean is: if "index.php?file=/folder/folder/picture.png" is set, go to that file; if "file" is not set, do nothing. I read the "file" parameter with $_REQUEST. Please help, thanks.
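
    A minimal sketch of that check in PHP; note that a real site must validate the value before acting on it (an unchecked path from $_REQUEST is a path-traversal/open-redirect risk), which is deliberately left out here for brevity.

        <?php
        // Act on the "file" parameter only when it is present and non-empty.
        if (isset($_REQUEST['file']) && $_REQUEST['file'] !== '') {
            header('Location: ' . $_REQUEST['file']); // send the browser to the file
            exit;
        }
        // no "file" parameter: fall through and do nothing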

  • jQuery - Cancel form submit (using "return false"?)?

    - by Nike
    Hey there. I'm running into a bit of trouble while trying to cancel the submit of a form. I've been following this tutorial (even though I'm not making a login script), and it seems to be working for him. Here's my form: <form action="index.php" method="post" name="pForm"> <textarea name="comment" onclick="if(this.value == 'skriv här...') this.value='';" onblur="if(this.value.length == 0) this.value='skriv här...';">skriv här...</textarea> <input class="submit" type="submit" value="Publicera!" name="submit" /> </form> And here's the jQuery: $(document).ready(function() { $('form[name=pForm]').submit(function(){ return false; }); }); I've already imported jQuery in the header and I know it's working. My first thought was that the tutorial might be outdated, but it's actually "just" a year old. So does anyone see what's wrong? Thanks in advance. EDIT: From what I've read, the easiest and most appropriate way to abort the submit is to return false? But I can't seem to get it working. I've searched the forum and found several helpful threads, but none of them actually works. I must be screwing something up.
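
    For comparison, the same cancellation written with event.preventDefault(), which stops the post even if a later statement in the handler throws; the selector matches the form from the question.

        // Cancel the submit of the pForm form and handle the comment in script instead.
        $(document).ready(function () {
            $('form[name="pForm"]').submit(function (event) {
                event.preventDefault(); // stop the browser from posting the form
                // e.g. send the comment via $.post(...) here
            });
        });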

  • How to get different related values from different SQL tables (PHP)

    - by Ole Jak
    I am trying to fill an options list. I have 2 tables, USERS and STREAMS. I want to get all streams and the names of the users assigned to those streams. Users consists of username and id; Streams consists of id, userID, streamID. I tried this code: <?php global $connection; $query = "SELECT * FROM streams "; $streams_set = mysql_query($query, $connection); confirm_query($streams_set); $streams_count = mysql_num_rows($streams_set); while ($row = mysql_fetch_array($streams_set)){ $userid = $row['userID']; global $connection; $query2 = "SELECT email, username "; $query2 .= "FROM users "; $query2 .= "WHERE id = '{$userid}' "; $qs = mysql_query($query2, $connection); confirm_query($qs); $found_user = mysql_fetch_array($qs); echo ' <option value="'.$row['streamID'].'">'.$row['userID'].$found_user.'</option> '; } ?> But it does not return user names from the DB =( So what should I do to this code to see the usernames as the "option" text?
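
    One way this is commonly solved is with a single JOIN, so the username arrives in the same row as the stream and can be echoed directly (echoing the whole mysql_fetch_array() result, as in the snippet above, prints "Array" rather than the name). A sketch using the table and column names from the question:

        <?php
        // One query: each row carries the streamID and the matching username.
        $query  = "SELECT streams.streamID, users.username ";
        $query .= "FROM streams JOIN users ON users.id = streams.userID";
        $result = mysql_query($query, $connection);
        while ($row = mysql_fetch_array($result)) {
            echo '<option value="' . $row['streamID'] . '">'
               . htmlspecialchars($row['username'])
               . '</option>';
        }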

  • Can't get CouchDB external HTTP handlers to work.

    - by fuzzy lollipop
    following the instructions here http://wiki.apache.org/couchdb/ExternalProcesses this is what I get { * error: "{{badarg,[{erlang,port_command, [#Port<0.2056>, [123, [34,<<"info">>,34], 58, [123, [34,"db_name",34], 58, [34,<<"transfer_central">>,34], 44, [34,"doc_count",34], 58,"39441",44, [34,"doc_del_count",34], 58,"0",44, [34,"update_seq",34], 58,"56508",44, [34,"purge_seq",34], 58,"0",44, [34,"compact_running",34], 58,<<"false">>,44, [34,"disk_size",34], 58,"43593828",44, [34,"instance_start_time",34], 58, [34,<<"1272560477320483">>,34], 44, [34,"disk_format_version",34], 58,"5",125], 44, [34,<<"id">>,34], 58,<<"null">>,44, [34,<<"method">>,34], 58, [34,"GET",34], 44, [34,<<"path">>,34], 58, [91, [34,<<"transfer_central">>,34], 44, [34,<<"_test">>,34], 93], 44, [34,<<"query">>,34], 58,<<"{}">>,44, [34,<<"headers">>,34], 58, [123, [34,<<"Accept">>,34], 58, [34, <<"text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8,application/json">>, 34], 44, [34,<<"Accept-Charset">>,34], 58, [34,<<"ISO-8859-1,utf-8;q=0.7,*;q=0.7">>,34], 44, [34,<<"Accept-Encoding">>,34], 58, [34,<<"gzip,deflate">>,34], 44, [34,<<"Accept-Language">>,34], 58, [34,<<"en-us,en;q=0.5">>,34], 44, [34,<<"Connection">>,34], 58, [34,<<"keep-alive">>,34], 44, [34,<<"Host">>,34], 58, [34,<<"127.0.0.1:5984">>,34], 44, [34,<<"Keep-Alive">>,34], 58, [34,<<"115">>,34], 44, [34,<<"User-Agent">>,34], 58, [34, <<"Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3">>, 34], 125], 44, [34,<<"body">>,34], 58, [34,"undefined",34], 44, [34,<<"peer">>,34], 58, [34,<<"127.0.0.1">>,34], 44, [34,<<"form">>,34], 58,<<"{}">>,44, [34,<<"cookie">>,34], 58,<<"{}">>,44, [34,<<"userCtx">>,34], 58, [123, [34,<<"db">>,34], 58, [34,<<"transfer_central">>,34], 44, [34,<<"name">>,34], 58,<<"null">>,44, [34,<<"roles">>,34], 58,<<"[]">>,125], 125,10]]}, {couch_os_process,writeline,2}, {couch_os_process,writejson,2}, {couch_os_process,handle_call,3}, {gen_server,handle_msg,5}, {proc_lib,init_p_do_apply,3}]}, {gen_server,call, [<0.110.0>, {prompt,{[{<<"info">>, {[{db_name,<<"transfer_central">>}, {doc_count,39441}, {doc_del_count,0}, {update_seq,56508}, {purge_seq,0}, {compact_running,false}, {disk_size,43593828}, {instance_start_time,<<"1272560477320483">>}, {disk_format_version,5}]}}, {<<"id">>,null}, {<<"method">>,'GET'}, {<<"path">>,[<<"transfer_central">>,<<"_test">>]}, {<<"query">>,{[]}}, {<<"headers">>, {[{<<"Accept">>, <<"text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8,application/json">>}, {<<"Accept-Charset">>, <<"ISO-8859-1,utf-8;q=0.7,*;q=0.7">>}, {<<"Accept-Encoding">>,<<"gzip,deflate">>}, {<<"Accept-Language">>,<<"en-us,en;q=0.5">>}, {<<"Connection">>,<<"keep-alive">>}, {<<"Host">>,<<"127.0.0.1:5984">>}, {<<"Keep-Alive">>,<<"115">>}, {<<"User-Agent">>, <<"Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3">>}]}}, {<<"body">>,undefined}, {<<"peer">>,<<"127.0.0.1">>}, {<<"form">>,{[]}}, {<<"cookie">>,{[]}}, {<<"userCtx">>, {[{<<"db">>,<<"transfer_central">>}, {<<"name">>,null}, {<<"roles">>,[]}]}}]}}, infinity]}}" * reason: "{gen_server,call, [<0.109.0>, {execute,{[{<<"info">>, {[{db_name,<<"transfer_central">>}, {doc_count,39441}, {doc_del_count,0}, {update_seq,56508}, {purge_seq,0}, {compact_running,false}, {disk_size,43593828}, {instance_start_time,<<"1272560477320483">>}, {disk_format_version,5}]}}, {<<"id">>,null}, {<<"method">>,'GET'}, {<<"path">>,[<<"transfer_central">>,<<"_test">>]}, {<<"query">>,{[]}}, {<<"headers">>, 
{[{<<"Accept">>, <<"text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8,application/json">>}, {<<"Accept-Charset">>, <<"ISO-8859-1,utf-8;q=0.7,*;q=0.7">>}, {<<"Accept-Encoding">>,<<"gzip,deflate">>}, {<<"Accept-Language">>,<<"en-us,en;q=0.5">>}, {<<"Connection">>,<<"keep-alive">>}, {<<"Host">>,<<"127.0.0.1:5984">>}, {<<"Keep-Alive">>,<<"115">>}, {<<"User-Agent">>, <<"Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3">>}]}}, {<<"body">>,undefined}, {<<"peer">>,<<"127.0.0.1">>}, {<<"form">>,{[]}}, {<<"cookie">>,{[]}}, {<<"userCtx">>, {[{<<"db">>,<<"transfer_central">>}, {<<"name">>,null}, {<<"roles">>,[]}]}}]}}, infinity]}" }

  • apt-get update getting 404 on Debian lenny

    - by JoelFan
    Here is my /etc/apt/sources.list ###### Debian Main Repos deb http://ftp.us.debian.org/debian/ lenny main contrib non-free ###### Debian Update Repos deb http://security.debian.org/ lenny/updates main contrib non-free deb http://ftp.us.debian.org/debian/ lenny-proposed-updates main contrib non-free When I do: # apt-get update I'm getting some good lines, then: Err http://ftp.us.debian.org lenny/contrib Packages 404 Not Found [IP: 35.9.37.225 80] Err http://ftp.us.debian.org lenny/non-free Packages 404 Not Found [IP: 35.9.37.225 80] Err http://ftp.us.debian.org lenny-proposed-updates/main Packages 404 Not Found [IP: 35.9.37.225 80] Err http://ftp.us.debian.org lenny-proposed-updates/contrib Packages 404 Not Found [IP: 35.9.37.225 80] Err http://ftp.us.debian.org lenny-proposed-updates/non-free Packages 404 Not Found [IP: 35.9.37.225 80] Err http://ftp.us.debian.org lenny/main Packages 404 Not Found [IP: 35.9.37.225 80] W: Failed to fetch http://security.debian.org/dists/lenny/updates/main/binary-i386/Packages 404 Not Found [IP: 149.20.20.6 80] W: Failed to fetch http://security.debian.org/dists/lenny/updates/contrib/binary-i386/Packages 404 Not Found [IP: 149.20.20.6 80] W: Failed to fetch http://security.debian.org/dists/lenny/updates/non-free/binary-i386/Packages 404 Not Found [IP: 149.20.20.6 80] W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/contrib/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80] W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/non-free/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80] W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny-proposed-updates/main/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80] W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny-proposed-updates/contrib/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80] W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny-proposed-updates/non-free/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80] W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/main/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80] E: Some index files failed to download, they have been ignored, or old ones used instead. Now what?
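
    Lenny packages were removed from the regular mirrors once the release reached end of life, which is consistent with the 404s above; the usual fix is to point sources.list at the Debian archive instead. A sketch of what that might look like (the archive receives no further security updates, and the lenny-proposed-updates lines can simply be dropped):

        # /etc/apt/sources.list -- sketch, assuming lenny has moved to archive.debian.org
        deb http://archive.debian.org/debian/ lenny main contrib non-free
        deb http://archive.debian.org/debian-security/ lenny/updates main contrib non-free

    After editing, run apt-get update again; upgrading to a supported release is the longer-term answer.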

  • What is the best practice with KML files when adding geositemap?

    - by Floran
    Im not sure how to deal with kml files. Now important particularly in reference to the Google Venice update. My site basically is a guide of many company listings (sort of Yellow Pages). I want each company listing to have a geolocation associated with it. Which of the options I present below is the way to go? OPTION 1: all locations in a single KML file with a reference to that KML file from a geositemap.xml MYGEOSITEMAP.xml <?xml version="1.0" encoding="UTF-8"?> <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0"> <url><loc>http://www.mysite.com/locations.kml</loc> <geo:geo> <geo:format>kml</geo:format></geo:geo></url> </urlset> ALLLOCATIONS.kml <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom"> <Document> <name>MyCompany</name> <atom:author> <atom:name>MyCompany</atom:name> </atom:author> <atom:link href="http://www.mysite.com/locations/3454/MyCompany" rel="related" /> <Placemark> <name>MyCompany, Kalverstraat 26 Amsterdam 1000AG</name> <description><![CDATA[<address><a href="http://www.mysite.com/locations/3454/MyCompany">MyCompany</a><br />Address: Kalverstraat 26, Amsterdam 1000AG <br />Phone: 0646598787</address><p>hello there, im MyCompany</p>]]> </description><Point><coordinates>5.420686499999965,51.6298808,0</coordinates> </Point> </Placemark> </Document> </kml> <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom"> <Document> <name>MyCompany</name><atom:author><atom:name>MyCompany</atom:name></atom:author><atom:link href="http://www.mysite.com/locations/22/companyX" rel="related" /><Placemark><name>MyCompany, Rosestreet 45 Amsterdam 1001XF </name><description><![CDATA[<address><a href="http://www.mysite.com/locations/22/companyX">companyX</a><br />Address: Rosestreet 45, Amsterdam 1001XF <br />Phone: 0642195493</address><p>some text about companyX</p>]]></description><Point><coordinates>5.520686499889632,51.6197705,0</coordinates></Point></Placemark> </Document> </kml> OPTION 2: a separate KML file for each location and a reference to each KML file from a geositemap.xml (kml files placed in a \kmlfiles folder) MYGEOSITEMAP.xml <?xml version="1.0" encoding="UTF-8"?> <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0"> <url><loc>http://www.mysite.com/kmlfiles/3454_MyCompany.kml</loc> <geo:geo> <geo:format>kml</geo:format></geo:geo></url> <url><loc>http://www.mysite.com/kmlfiles/22_companyX.kml</loc> <geo:geo> <geo:format>kml</geo:format></geo:geo></url> </urlset> *3454_MyCompany.kml* <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom"> <Document><name>MyCompany</name><atom:author><atom:name>MyCompany</atom:name></atom:author><atom:link href="http://www.mysite.com/locations/3454/MyCompany" rel="related" /><Placemark><name>MyCompany, Kalverstraat 26 Amsterdam 1000AG</name><description><![CDATA[<address><a href="http://www.mysite.com/locations/3454/MyCompany">MyCompany</a><br />Address: Kalverstraat 26, Amsterdam 1000AG <br />Phone: 0646598787</address><p>hello there, im MyCompany</p>]]></description><Point><coordinates>5.420686499999965,51.6298808,0</coordinates></Point></Placemark> </Document> </kml> *22_companyX.kml* <kml xmlns="http://www.opengis.net/kml/2.2" xmlns:atom="http://www.w3.org/2005/Atom"> <Document><name>companyX</name><atom:author><atom:name>companyX</atom:name></atom:author><atom:link href="http://www.mysite.com/locations/22/companyX" 
rel="related" /><Placemark><name>companyX, Rosestreet 45 Amsterdam 1001XF </name><description><![CDATA[<address><a href="http://www.mysite.com/locations/22/companyX">companyX</a><br />Address: Rosestreet 45, Amsterdam 1001XF <br />Phone: 0642195493</address><p>some text about companyX</p>]]></description><Point><coordinates>5.520686499889632,51.6197705,0</coordinates></Point></Placemark> </Document> </kml> OPTION 3?

  • How to setup SyntaxHighlighter with GeeksWithBlogs in about 10 minutes.

    - by mbcrump
    SyntaxHighlighter is a fully functional self-contained code syntax highlighter developed in JavaScript. Below is a sample of what it looks like in your blog. class Test { static void Main() { System.Console.WriteLine("Sample SyntaxHighlighter"); } } This tutorial will help you setup SyntaxHighlighter with GeeksWithBlogs.net in about 10 minutes. Even though this guide is specifically for GWB, you can use it on any other hosting provider that does not allow you to upload custom CSS/JavaScript. It is recommended that if you are using LiveWriter to go ahead and download Code Snippet with SyntaxHighlighter Support to integrate this functionality within Live Writer. 1) Log into GWB and select Options->Configure Now under the Custom CSS insert the following code at the top of the textbox: @import url("http://alexgorbatchev.com/pub/sh/current/styles/shCore.css"); @import url("http://alexgorbatchev.com/pub/sh/current/styles/shThemeDefault.css"); Please note that you can change the default theme by changing the shThemeDefault.css to one listed below: shThemeDefault.css shThemeDjango.css shThemeEmacs.css shThemeFadeToGrey.css shThemeMidnight.css shThemeRDark.css 2) Under the Static News/Announcements insert the following code at the top: <script type="text/javascript" src="http://alexgorbatchev.com/pub/sh/current/scripts/shCore.js"></script> <script type="text/javascript" language="javascript" src="http://alexgorbatchev.com/pub/sh/current/scripts/shBrushCSharp.js"></script> <script type="text/javascript" language="javascript" src="http://alexgorbatchev.com/pub/sh/current/scripts/shBrushJScript.js"></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushJava.js' type='text/javascript'></script> <script language='javascript'> SyntaxHighlighter.config.bloggerMode = true; SyntaxHighlighter.config.clipboardSwf = 'http://alexgorbatchev.com/pub/sh/current/scripts/clipboard.swf'; SyntaxHighlighter.all(); </script> Please note that this will only give you support for Java, JavaScript and C Sharp. If you want more languages like Ruby and SQL. Then add the proper tags listed below. The reason that I didn’t add them is because I do not want to load languages that I will not be blogging about. 
<link href='http://alexgorbatchev.com/pub/sh/current/styles/shCore.css' rel='stylesheet' type='text/css'/> <link href='http://alexgorbatchev.com/pub/sh/current/styles/shThemeDefault.css' rel='stylesheet' type='text/css'/> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shCore.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushCpp.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushCSharp.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushCss.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushJava.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushJScript.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushPhp.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushPython.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushRuby.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushSql.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushVb.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushXml.js' type='text/javascript'></script> <script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushPerl.js' type='text/javascript'></script> <script language='javascript'> SyntaxHighlighter.config.bloggerMode = true; SyntaxHighlighter.config.clipboardSwf = 'http://alexgorbatchev.com/pub/sh/current/scripts/clipboard.swf'; SyntaxHighlighter.all(); </script> 3) Now install Code Snippet with SyntaxHighlighter Support and launch Windows Live Writer. Click on the PreCode Snippet plugin add copy/paste your code into the windows. Make sure you select “PRE” and the Language that you are using. It should look similar to the following screenshot.  After you finish editing the post, hit publish and your code should look nice and neat like the example shown earlier.

  • How to repeat a POST request using Chrome's developer tools?

    - by Luke The Obscure
    Not sure if this is the right stack exchange to ask this, but here goes... I'm trying to wean myself off of Firebug, which has served me very well for a lot of years. One feature that seems to be missing in Chrome's dev tools is the ability to repeat an AJAX POST. In firebug I can right click on the request in the console and hit "Open in new tab" and the request is repeated exactly as it was originally sent. In Chrome, the same action just does a normal GET on the link, without any of the post data. Is there any way to repeat an AJAX POST in Chrome's dev tools?

  • How to calculate the maximum number of requests a 128 MB VPS can handle?

    - by ifdion
    I am a newbie here, so please let me know if I'm using the wrong webmaster terms. I am currently setting up a VPS for a multi-site WordPress. The VPS uses a Debian 6 LNMP setup and the DNS is being taken care of by another service. Currently the VPS is running a non-multi-site WordPress using roughly 83 MB of the 128 MB RAM. As far as I know, performance is relative to the number of requests, not the number of sites in the multi-site setup. The question: how do I calculate the maximum number of requests with that setup? If that information is not enough, what other factors do I need to know? Thank you in advance.
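
    A rough back-of-the-envelope approach, assuming memory is the bottleneck: divide the RAM left over after the base system by the memory one PHP worker needs, then multiply by how many requests each worker can finish per second. The 30 MB per worker and 0.5 s response time below are illustrative assumptions, not measurements of this VPS.

        concurrent PHP workers ≈ free RAM / RAM per worker
                               ≈ (128 MB − 83 MB) / 30 MB ≈ 1–2 workers
        requests per second    ≈ workers / average response time
                               ≈ 2 / 0.5 s ≈ 4 requests/s

    Measuring the actual per-worker memory (e.g. with top) and the real response time of a WordPress page gives far better numbers than these placeholders.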

  • Do WebSockets have exclusive access to their sockets?

    - by Aoriste
    I'm curious to know whether, after a WebSocket has been established (after having received the proper handshake from a server that supports them), the TCP socket used by the "WebSocket connection" is used exclusively by the WebSocket, or whether the browser may still make regular HTTP requests with it. It only makes sense to me that WebSockets would have exclusive use of their TCP sockets, but I don't remember having read in any of the documentation that this is the case.

  • GZip compression with WCF hosted on IIS7

    - by joniba
    So I'm going to add my query to the small ocean of questions on the subject. I'm trying to enable GZip compression on large soap responses from a WCF service. So far, I've followed instructions here and in a variety of other places to enable dynamic compression on IIS. Here's my dynamicTypes section from the applicationHost.config: <dynamicTypes> <add mimeType="text/*" enabled="true" /> <add mimeType="message/*" enabled="true" /> <add mimeType="application/x-javascript" enabled="true" /> <add mimeType="application/atom+xml" enabled="true" /> <add mimeType="application/xaml+xml" enabled="true" /> <add mimeType="application/xop+xml" enabled="true" /> <add mimeType="application/soap+xml" enabled="true" /> <add mimeType="*/*" enabled="false" /> </dynamicTypes> And also: <urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="true" /> Though I'm not so clear on why that's needed. Threw some extra mime-types in there just in case. I've implemented IClientMessageInspector to add Accept-Encoding: gzip, deflate to my client's HttpRequests. Here's an example of a request-header taken from fiddler: POST http://[omitted]/TestMtomService/TextService.svc HTTP/1.1 Content-Type: application/soap+xml; charset=utf-8 Accept-Encoding: gzip, deflate Host: [omitted] Content-Length: 542 Expect: 100-continue Now, this doesn't work. There's simply no compression happening, no matter what the size of the message (tried up to 1.5Mb). I've looked at this post, but have not run into an exception as he describes, so I haven't tried the CodeProject implementation that he proposes. Also I've seen a lot of other implementations that are supposed to get this to work, but cannot make sense of them (e.g., msdn's GZip encoder). Why would I need to implement the encoder, or the code-project solution? Shouldn't IIS take care of the compression? So what else do I need to do to get this to work? Joni

  • Detecting REFERRER 301 redirects in AwStats

    - by Riccardo
    About six months ago, I moved a website to a new domain and helped the migration along with 301 redirects in the .htaccess of the old domain. This morning I was looking at the AwStats log of the new domain and was surprised to notice that in the "HTTP Status codes" section, 301 redirects account for 77% of all codes (it seems 200s are not tracked here). So, what is the proper meaning of the 301 code in those stats? Does it mean that 77% of the traffic is coming in (as referrer) from 301 redirects, or something else?

  • Can URIs have non-ASCII characters?

    - by Cheeso
    I tried to find this in the relevant RFC, IETF RFC 3986, but couldn't figure it out. Do URIs for HTTP allow Unicode, or non-ASCII characters of any kind? Can you please cite the section and the RFC that support your answer? NB: For those who might think this is not programming related - it is. It's related to an ISAPI filter I'm building.

  • What might cause the big overhead of making an HttpWebRequest call?

    - by Dimitri C.
    When I send/receive data using HttpWebRequest (on Silverlight, using the HTTP POST method) in small blocks, I measure the very small throughput of 500 bytes/s over a "localhost" connection. When sending the data in large blocks, I get 2 MB/s, which is some 5000 times faster. Does anyone know what could cause this incredibly big overhead? Update: I did the performance measurement on both Firefox 3.6 and Internet Explorer 7. Both showed similar results. Update: The Silverlight client-side code I use is essentially my own implementation of the WebClient class. The reason I wrote it is because I noticed the same performance problem with WebClient, and I thought that the HttpWebRequest would allow to tweak the performance issue. Regrettably, this did not work. The implementation is as follows: public class HttpCommChannel { public delegate void ResponseArrivedCallback(object requestContext, BinaryDataBuffer response); public HttpCommChannel(ResponseArrivedCallback responseArrivedCallback) { this.responseArrivedCallback = responseArrivedCallback; this.requestSentEvent = new ManualResetEvent(false); this.responseArrivedEvent = new ManualResetEvent(true); } public void MakeRequest(object requestContext, string url, BinaryDataBuffer requestPacket) { responseArrivedEvent.WaitOne(); responseArrivedEvent.Reset(); this.requestMsg = requestPacket; this.requestContext = requestContext; this.webRequest = WebRequest.Create(url) as HttpWebRequest; this.webRequest.AllowReadStreamBuffering = true; this.webRequest.ContentType = "text/plain"; this.webRequest.Method = "POST"; this.webRequest.BeginGetRequestStream(new AsyncCallback(this.GetRequestStreamCallback), null); this.requestSentEvent.WaitOne(); } void GetRequestStreamCallback(IAsyncResult asynchronousResult) { System.IO.Stream postStream = webRequest.EndGetRequestStream(asynchronousResult); postStream.Write(requestMsg.Data, 0, (int)requestMsg.Size); postStream.Close(); requestSentEvent.Set(); webRequest.BeginGetResponse(new AsyncCallback(this.GetResponseCallback), null); } void GetResponseCallback(IAsyncResult asynchronousResult) { HttpWebResponse response = (HttpWebResponse)webRequest.EndGetResponse(asynchronousResult); Stream streamResponse = response.GetResponseStream(); Dim.Ensure(streamResponse.CanRead); byte[] readData = new byte[streamResponse.Length]; Dim.Ensure(streamResponse.Read(readData, 0, (int)streamResponse.Length) == streamResponse.Length); streamResponse.Close(); response.Close(); webRequest = null; responseArrivedEvent.Set(); responseArrivedCallback(requestContext, new BinaryDataBuffer(readData)); } HttpWebRequest webRequest; ManualResetEvent requestSentEvent; BinaryDataBuffer requestMsg; object requestContext; ManualResetEvent responseArrivedEvent; ResponseArrivedCallback responseArrivedCallback; } I use this code to send data back and forth to an HTTP server.

  • Mochiweb's Scalability Features

    - by ErJab
    In all the articles I've read so far about Mochiweb, I've heard over and over again that Mochiweb provides very good scalability. My question is, how exactly does Mochiweb get its scalability property? Is it from Erlang's inherent scalability properties, or does Mochiweb have additional code that explicitly enables it to scale well? Put another way, if I were to write a simple HTTP server in Erlang myself, with a simple 'loop' (recursive function) to handle requests, would it have the same level of scalability as a simple web server built using the Mochiweb framework?

  • How to expand URLs in C#?

    - by Hao Wooi Lim
    If I have a URL like http://popurls.com/go/msn.com/l4eba1e6a0ffbd9fc915948434423a7d5, how do I expand it back to the original URL programmatically? Obviously I could use an API like expandurl.com, but that limits me to 100 requests per hour.
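
    Since a shortener just answers with a redirect, one self-contained option is to let HttpWebRequest follow the redirect chain and read the final address back from ResponseUri. A minimal sketch; the helper name UrlExpander is hypothetical, and some services refuse HEAD requests, in which case "GET" works at the cost of downloading the page.

        using System;
        using System.Net;

        static class UrlExpander
        {
            public static Uri Expand(string shortUrl)
            {
                var request = (HttpWebRequest)WebRequest.Create(shortUrl);
                request.Method = "HEAD";          // headers are enough to follow redirects
                request.AllowAutoRedirect = true; // follow 301/302 hops automatically
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    return response.ResponseUri;  // the URL after the last redirect
                }
            }
        }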
