Search Results

Search found 15228 results on 610 pages for 'literature request'.

Page 2/610 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • HTTP request stream not readable outside of request handler

    - by Jason Young
    I'm writing a fairly complicated multi-node proxy, and at one point I need to handle an HTTP request, but read from that request outside of the "http.Server" callback (I need to read the request data and line it up with a different response at a different time). The problem is, the stream is no longer readable. Below is some simple code to reproduce the issue. Is this normal, or a bug?

      var http = require('http');

      function startServer() {
          http.Server(function (req, res) {
              req.pause();
              checkRequestReadable(req);
              setTimeout(function () { checkRequestReadable(req); }, 1000);
              setTimeout(function () { res.end(); }, 1100);
          }).listen(1337);
          console.log('Server running on port 1337');
      }

      function checkRequestReadable(req) {
          // The request is not readable here!
          console.log('Request readable? ' + req.readable);
      }

      startServer();

    Read the article

  • Pull Request Changes, Multi-Selection in Advanced View, and Advertisement Changes

    [Do you tweet? Follow us on Twitter @matthawley and @adacole_msft] We deployed a new version of the CodePlex website today.

    Pull Request Changes

    In this release, we have begun to re-focus on Pull Requests to ensure a productive experience between the project users and developers. We feel we made significant progress in this area for this release and look forward to using your feedback to drive future iterations. One of the biggest hurdles people have indicated is the inability to see what a pull request includes without pulling the source down from a Mercurial client. With today's changes, any user has the ability to view a pull request, see the changesets and changes included, and perform an inline diff of each file. When a pull request is made, the CodePlex website will query for all outgoing changes from the fork to the main repository for a point-in-time comparison. Because of this point-in-time comparison:

    - All existing pull requests created prior to this release will not have changesets associated with them.
    - If new commits are pushed to the fork while a pull request is active, they will not appear associated with the pull request. The pull request will need to be re-submitted for them to appear.

    Once a pull request is created, you can "View the Pull Request", which takes you to the pull request page. As you may notice, we now display a lot more detailed information regarding that pull request, including who it was requested by and when, the associated changesets, the description, who it's assigned to (we'll come back to this), and the listing of summarized file changes. You'll also notice that each modified file has the ability to view a diff of all changes made. When you click "(view diff)" for a file, an inline diff experience appears. This new experience allows you to quickly navigate through all of the modified files as well as view the various change blocks for each file. As you browse through each file's changes, we update the URL to include the file path so you can quickly send a direct link to a pull request's file. Clicking "(close diff)" will bring you back to the original pull request view. View this pull request live on WikiPlex.

    Pull Request Review Assignment

    Another new feature we added for pull requests is the ability for project members to assign pull requests for review. Any project member has the ability to assign (and re-assign if needed) a pull request to a project member. Once the assignment has been made, that project member will be notified via email of the assignment. Once they complete the review of the pull request, they can either accept or deny it, similarly to the previous process.

    Multi-Selection in Advanced View Filters

    One of the more recent requests we have heard from users is the ability to multi-select advanced view filters for work items. We are happy to announce this is now possible. Simply control-click the multiple options for each filter item and your work item query will be refined as such. Should you happen to unselect all options for a given filter, it will automatically reset to the default option for that filter. Furthermore, the "Direct Link" URL will be updated to include the multi-selected options for each filter. Note: The "Direct Link" feature was released in our previous deployment, just never written about. It allows you to capture the current state of your query and send it to other individuals.
    Advertisement Changes

    Very recently, the advertiser (The Lounge) we partnered with to provide advertising revenue for projects, or to donate it to charity, was acquired by Lake Quincy Media. There has been no change in the advertising platform offering, and all projects have been converted over to using the new infrastructure. Project owners should note the new contact information for getting paid. The CodePlex team values your feedback, and is frequently monitoring Twitter, our Discussions, and the Issue Tracker for new features or problems. If you've not visited the Issue Tracker recently, please take a few moments to log an idea or vote for the features you would most like to see implemented on CodePlex.

    Read the article

  • webservice request issue with dynamic request inputs

    - by nanda
      try
      {
          const string siteURL = "http://ops.epo.org/2.6.1/soap-services/document-retrieval";
          const string docRequest = "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:xsd='http://www.w3.org/2001/XMLSchema'><soap:Body><document-retrieval id='EP 1000000A1 I ' page-number='1' document-format='SINGLE_PAGE_PDF' system='ops.epo.org' xmlns='http://ops.epo.org' /></soap:Body></soap:Envelope>";

          var request = (HttpWebRequest)WebRequest.Create(siteURL);
          request.Method = "POST";
          request.Headers.Add("SOAPAction", "\"document-retrieval\"");
          request.ContentType = "text/xml; charset=utf-8";

          Stream stm = request.GetRequestStream();
          byte[] binaryRequest = Encoding.UTF8.GetBytes(docRequest);
          stm.Write(binaryRequest, 0, binaryRequest.Length);
          stm.Flush();
          stm.Close();

          var memoryStream = new MemoryStream();
          WebResponse resp = request.GetResponse();
          var buffer = new byte[4096];
          Stream responseStream = resp.GetResponseStream();
          {
              int count;
              do
              {
                  count = responseStream.Read(buffer, 0, buffer.Length);
                  memoryStream.Write(buffer, 0, count);
              } while (count != 0);
          }
          resp.Close();

          byte[] memoryBuffer = memoryStream.ToArray();
          System.IO.File.WriteAllBytes(@"E:\sample12.pdf", memoryBuffer);
      }
      catch (Exception ex)
      {
          throw ex;
      }

    The code above retrieves the PDF web response. It works fine as long as the request remains constant:

      const string docRequest = "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:xsd='http://www.w3.org/2001/XMLSchema'><soap:Body><document-retrieval id='EP 1000000A1 I ' page-number='1' document-format='SINGLE_PAGE_PDF' system='ops.epo.org' xmlns='http://ops.epo.org' /></soap:Body></soap:Envelope>";

    But how can I retrieve the same response with dynamic requests? When the above code is changed to accept dynamic inputs like

      [WebMethod]
      public string DocumentRetrivalPDF(string docid, string pageno, string docFormat, string fileName)
      {
          try
          {
              ...
              string docRequest = "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:xsd='http://www.w3.org/2001/XMLSchema'><soap:Body><document-retrieval id=" + docid + " page-number=" + pageno + " document-format=" + docFormat + " system='ops.epo.org' xmlns='http://ops.epo.org' /></soap:Body></soap:Envelope>";
              ...
              return "responseTxt";
          }
          catch (Exception ex)
          {
              return ex.Message;
          }
      }

    it returns an "INTERNAL SERVER ERROR: 500". Can anybody help me with this?
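
    One detail worth checking, going only by the two snippets above (an observation, not necessarily the whole answer): the constant request quotes its attribute values (id='EP 1000000A1 I '), while the dynamic version concatenates docid, pageno and docFormat without quotes, which produces invalid XML such as id=EP 1000000A1. A minimal sketch of building the same envelope with quoted attribute values (the helper name is illustrative):

      // Illustrative only: same envelope as above, with the attribute values quoted.
      static string BuildDocRequest(string docid, string pageno, string docFormat)
      {
          return "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/' " +
                 "xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' " +
                 "xmlns:xsd='http://www.w3.org/2001/XMLSchema'><soap:Body>" +
                 "<document-retrieval id='" + docid + "' page-number='" + pageno + "' " +
                 "document-format='" + docFormat + "' system='ops.epo.org' " +
                 "xmlns='http://ops.epo.org' /></soap:Body></soap:Envelope>";
      }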

    Read the article

  • 502: proxy: pass request body failed

    - by Apikot
    Sometimes I get the following error (in Apache's error.log) when viewing my site over https:

      (502)Unknown error 502: proxy: pass request body failed to xxx.xxx.xxx.xxx:443

    I'm not entirely sure what this is or why it happens, and it's not consistent. The request route is: browser -> proxy server (Apache with mod_proxy + mod_ssl) -> load balancer (AWS) -> web server (Apache with mod_ssl). The configuration on the proxy server is as follows:

      <VirtualHost *:443>
          ProxyRequests Off
          ProxyVia On
          ServerName www.xxx.co.uk
          ServerAlias xxx.co.uk

          <Directory proxy:*>
              Order deny,allow
              Allow from all
          </Directory>

          <Proxy *>
              AddDefaultCharset off
              Order deny,allow
              Allow from all
          </Proxy>

          ProxyPass / balancer://cluster:443/ lbmethod=byrequests
          ProxyPassReverse / balancer://cluster:443/
          ProxyPreserveHost off

          SSLProxyEngine On
          SSLEngine on
          SSLCipherSuite ALL:!ADH:!EXPORT56:RC4+RSA:+HIGH:+MEDIUM:+LOW:+SSLv2:+EXP:+eNULL
          SSLCertificateFile /var/www/vhosts/xxx/ssl/www.xxx.co.uk.cert
          SSLCertificateKeyFile /var/www/vhosts/xxx/ssl/www.xxx.co.uk.key

          <Proxy balancer://cluster>
              BalancerMember https://xxx.eu-west-1.elb.amazonaws.com
          </Proxy>
      </VirtualHost>

    Any idea what the issue might be?

    Read the article

  • Please help! request compression

    - by Naor
    Hi, I wrote an IHttpModule that compresses my response using gzip (I return a lot of data) in order to reduce the response size. It works great as long as the web service doesn't throw an exception. When an exception is thrown, the exception is gzipped but the Content-Encoding header disappears, so the client doesn't know how to read the exception. How can I solve this? Why is the header missing? I need to get the exception on the client. Here is the module:

      public class JsonCompressionModule : IHttpModule
      {
          public JsonCompressionModule() { }

          public void Dispose() { }

          public void Init(HttpApplication app)
          {
              app.BeginRequest += new EventHandler(Compress);
          }

          private void Compress(object sender, EventArgs e)
          {
              HttpApplication app = (HttpApplication)sender;
              HttpRequest request = app.Request;
              HttpResponse response = app.Response;
              try
              {
                  // Ajax web service requests always start with application/json
                  if (request.ContentType.ToLower(CultureInfo.InvariantCulture).StartsWith("application/json"))
                  {
                      // User may be using an older version of IE which does not support compression, so skip those
                      if (!((request.Browser.IsBrowser("IE")) && (request.Browser.MajorVersion <= 6)))
                      {
                          string acceptEncoding = request.Headers["Accept-Encoding"];
                          if (!string.IsNullOrEmpty(acceptEncoding))
                          {
                              acceptEncoding = acceptEncoding.ToLower(CultureInfo.InvariantCulture);
                              if (acceptEncoding.Contains("gzip"))
                              {
                                  response.AddHeader("Content-encoding", "gzip");
                                  response.Filter = new GZipStream(response.Filter, CompressionMode.Compress);
                              }
                              else if (acceptEncoding.Contains("deflate"))
                              {
                                  response.AddHeader("Content-encoding", "deflate");
                                  response.Filter = new DeflateStream(response.Filter, CompressionMode.Compress);
                              }
                          }
                      }
                  }
              }
              catch (Exception ex)
              {
                  int i = 4;
              }
          }
      }

    Here is the web service:

      [WebMethod]
      public void DoSomething()
      {
          throw new Exception("This message gets corrupted on the client because the client doesn't know it is gzipped.");
      }

    I appreciate any help. Thanks!

    Read the article

  • What is the "opposite" of request serialization called?

    - by Adam Lindberg
    For example, if a request is made to a resource and another identical request is made before the first has returned a result, the server returns the result of the first request for the second request as well. This is to avoid unnecessary processing on the resource. This is not the same thing as caching/memoization, since it only concerns identical requests that are ongoing in parallel. Is there a term for the reuse of results for currently ongoing requests to a resource for the purpose of minimizing processing?
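
    A rough illustration of the pattern being described, which is often discussed under names such as request coalescing or in-flight request de-duplication (a sketch only; the class and method names are illustrative, not from any particular framework):

      using System;
      using System.Collections.Concurrent;
      using System.Threading.Tasks;

      // Identical requests that arrive while an equivalent request is still in flight
      // share that request's Task instead of repeating the work. Unlike a cache, the
      // entry is removed as soon as the work completes.
      public class CoalescingFetcher
      {
          private readonly ConcurrentDictionary<string, Task<string>> _inFlight =
              new ConcurrentDictionary<string, Task<string>>();

          public Task<string> GetAsync(string key, Func<string, Task<string>> fetch)
          {
              return _inFlight.GetOrAdd(key, k =>
              {
                  Task<string> task = fetch(k);
                  task.ContinueWith(_ =>
                  {
                      // Forget the request once it has finished, so later calls start fresh.
                      Task<string> removed;
                      _inFlight.TryRemove(k, out removed);
                  });
                  return task;
              });
          }
      }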

    Read the article

  • How to make a per-HTTP-request cache in ASP.NET 3.5

    - by Artem
    We are using ASP.NET 3.5 (controls-based approach) and need to have storage specific to one HTTP request only. A thread-specific cache with keys derived from the session id won't work, because threads are pooled and therefore I have a chance of getting data from some previous request in the cache, which is undesirable in my case. I always need to have brand new storage for each request, available throughout the whole request. Any ideas how to do this in ASP.NET 3.5?
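
    One request-scoped store that ASP.NET provides out of the box is HttpContext.Current.Items, a dictionary that is created for each request and discarded when the request ends, regardless of which pooled thread serves it. A minimal sketch (the wrapper class name is illustrative):

      using System.Web;

      // Minimal sketch: HttpContext.Items is scoped to the current HTTP request,
      // so every request sees a fresh, empty dictionary.
      public static class RequestCache
      {
          public static void Set(string key, object value)
          {
              HttpContext.Current.Items[key] = value;
          }

          public static object Get(string key)
          {
              return HttpContext.Current.Items[key];
          }
      }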

    Read the article

  • copying the request header from request object to urlConnection object

    - by Bunny Rabbit
      protected void doPost(HttpServletRequest request, HttpServletResponse response)
              throws ServletException, IOException {
          URL url = new URL("http://localhost:8080/testy/Out");
          HttpURLConnection connection = (HttpURLConnection) url.openConnection();
          connection.setDoOutput(true);
          connection.setRequestMethod("POST");
          PrintWriter out = response.getWriter();
          // Copy every incoming request header onto the outgoing connection.
          for (Enumeration e = request.getHeaderNames(); e.hasMoreElements();) {
              Object o = e.nextElement();
              String value = request.getHeader(o.toString());
              out.println(o + "--is--" + value + "<br>");
              connection.setRequestProperty((String) o, value);
          }
          connection.connect();
      }

    I wrote the above code in a servlet to post the form to some location other than this servlet, but it's not working. Is it okay to use connection.setRequestProperty to set the header fields to what they are in the incoming request to the servlet?

    Read the article

  • How to validate HTTP request headers before receiving request body using WCF

    - by anelson
    I'm implementing a REST service using WCF which will be used to upload very large files. The HTTP headers in this request will communicate information which will be validated prior to allowing the upload to proceed (things like permissions, available disk space, etc.). It's possible this validation will fail, resulting in an error response. I'd like to do this validation before the client sends the body of the request, so it has a chance to detect failure before uploading potentially gigabytes of data. RESTful web services use the HTTP/1.1 Expect: 100-continue header in the request to implement this. For example, Amazon S3's REST API can validate your key and ACLs in response to an object PUT operation, returning 100 Continue if all is well, indicating you may proceed to send your data. I've rummaged around the WCF documentation and I just can't see a way to accomplish this without doing some pretty low-level hooking into the HTTP request processing pipeline. How would you suggest I solve this problem?
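
    For context, the client half of that handshake is straightforward in .NET; a rough sketch is below (the server-side hook in WCF, which is the hard part of the question, is not shown, and the custom header name is made up for illustration):

      using System;
      using System.IO;
      using System.Net;

      // Rough sketch of the client side of the Expect: 100-continue handshake.
      // With Expect100Continue enabled, HttpWebRequest sends the headers first and
      // waits briefly for the server's interim response before streaming the body.
      class UploadClient
      {
          static void Upload(string url, Stream body)
          {
              var request = (HttpWebRequest)WebRequest.Create(url);
              request.Method = "PUT";
              request.ServicePoint.Expect100Continue = true;
              request.Headers.Add("x-expected-size", body.Length.ToString()); // illustrative header
              using (Stream requestStream = request.GetRequestStream())
              {
                  body.CopyTo(requestStream);
              }
              using (var response = (HttpWebResponse)request.GetResponse())
              {
                  Console.WriteLine((int)response.StatusCode);
              }
          }
      }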

    Read the article

  • Spiceworks versus Request Tracker?

    - by dmackey
    We currently utilize Request Tracker for help desk ticketing and Spiceworks for asset inventorying. I am pondering whether it might be worthwhile to move from RT to Spiceworks for the help desk as well. Has anyone used both systems and can provide some insight into any benefits or problems with either? Or does anyone have general philosophical reasons why one should use one solution over the other? Of course, RT is open source and Spiceworks is not - and usually this would be a major item for me - but since Spiceworks is free and engages its community fairly actively, it's not as major of a concern for me (personally).

    Read the article

  • Apache Connection vs. Request

    - by user101570
    I apologize in advance if this is a basic question, but I am quite confused after reading the Apache documentation and other tutorials. Does a single Apache prefork process serve all HTTP requests for a given client? That's what I thought, but when I reduce MaxClients to a low number, my page load times slow to a crawl, despite the fact that I'm the only client on the server in question. This would suggest that each process serves a single HTTP request at a time, rather than serving all requests within the TimeOut window. So if a single web page requires 15 HTTP requests to load fully, do I require 15 prefork Apache processes to serve it optimally?

    Read the article

  • Classic ASP Request.Form removes spaces?

    - by alex
    I'm trying to figure this oddity out... in classic ASP I seem to be losing spaces in Request.Form values. For example, Request.Form("json") is

      {"project":{"...","administrator":"AlexGorbatchev", "anonymousViewUrl":null,"assets":[],"availableFrom":"6/10/20104:15PM"...

    However, CStr(Request.Form) is

      json={"project":{"__type":"...":"Alex Gorbatchev", "anonymousViewUrl":null,"assets":[],"availableFrom":"6/10/2010 4:15 PM"...

    Here's the entire code :)

      <%@ language="VBSCRIPT"%>
      <%
      Response.Write(CStr(Request.Form("json")))
      Response.Write(CStr(Request.Form))
      %>

    Somebody please tell me I haven't lost all my marbles...

    Read the article

  • ASP.NET binding object to Request in asp.net mvc

    - by Alxandr
    I've created an object that I'd like to have accessible from wherever the request object is accessible, and to "die" with the request, more or less like how you always have access to the RouteData collection in an MVC application. In particular, it's important that I have access to this object during the execution of action filters. A new instance of my class also needs to be created whenever a new request is made to the page (the object needs to be request-safe, i.e. only one request modifies that one object). Any thoughts about how to achieve this?
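
    As in the per-request-cache question above, HttpContext.Items is one request-scoped place such an object is often kept, and it is reachable from action filters through the filter context. A minimal sketch (the attribute and data class names are illustrative):

      using System.Web.Mvc;

      // Illustrative sketch: create the per-request object in a filter and park it in
      // HttpContext.Items; the dictionary is new for every request, so instances never
      // leak from one request to another.
      public class MyRequestDataAttribute : ActionFilterAttribute
      {
          public override void OnActionExecuting(ActionExecutingContext filterContext)
          {
              filterContext.HttpContext.Items["MyRequestData"] = new MyRequestData();
          }
      }

      public class MyRequestData
      {
          public string Note { get; set; }
      }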

    Read the article

  • Django: request object to template context transparency

    - by anars
    Hi! I want to include an initialized data structure in my request object, making it accessible in the context object from my templates. What I'm doing right now is passing it manually, which is tiresome, within all my views:

      render_to_response(...., ( {'menu': RequestContext(request)}))

    The request object contains the key/value pair, which is injected using a custom context processor. While this works, I had hoped there was a more generic way of passing selected parts of the request object to the template context. I've tried passing it via generic views, but as it turns out the request object isn't instantiated when parsing the urlpatterns list.

    Read the article

  • Request Tracker 4: Ticket Escalation

    - by Randy
    I am running Request Tracker 4 on a Debian Squeeze server. I have to implement a priority escalation. Actually, escalation is not quite the right term, since the ticket priority should be set linearly via rt-crontool (or any other tool that can be run from a cronjob), depending on the time that has passed between "Started" and "Due", to a number between 0 (the starting priority) and the "Final Priority" (e.g. 100), and the value of the "Final Priority" should be reached exactly at the moment the "Due" date is passed. This already implies that the search condition should be all tickets of a certain queue that have "Started" AND "Due" AND "Final Priority" set. The cronjob should be called very frequently, for example every 5 or 10 minutes, so the call should be idempotent and not dependent on the frequency of the rt-crontool invocations. One example: a ticket is Started at 2012-12-23 00:00 and Due at 2012-12-23 23:59, while the Final Priority is 100. When the call is made at noon, the priority should be set to 50. Could anybody help me with this? Thank you for reading this to the bottom!
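
    The arithmetic being asked for is a simple linear interpolation between the "Started" and "Due" times. RT ships an RT::Action::LinearEscalate action intended to be driven from rt-crontool for this kind of job, though the exact invocation should be checked against the RT 4 documentation. A small illustrative sketch of just the arithmetic (not RT code; the worked example follows the question):

      using System;

      // Illustrative only: priority rises from 0 at "Started" to FinalPriority at "Due".
      // The result depends only on the current time, so repeated cron runs are idempotent.
      class Escalation
      {
          static int CurrentPriority(DateTime started, DateTime due, int finalPriority, DateTime now)
          {
              if (now <= started) return 0;
              if (now >= due) return finalPriority;
              double elapsed = (now - started).TotalMinutes;
              double total = (due - started).TotalMinutes;
              return (int)Math.Round(finalPriority * elapsed / total);
          }

          static void Main()
          {
              var started = new DateTime(2012, 12, 23, 0, 0, 0);
              var due = new DateTime(2012, 12, 23, 23, 59, 0);
              var noon = new DateTime(2012, 12, 23, 12, 0, 0);
              // 720 of 1439 minutes elapsed: 100 * 720 / 1439 rounds to 50, matching the example.
              Console.WriteLine(CurrentPriority(started, due, 100, noon));
          }
      }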

    Read the article

  • IIS Request Filtering Rule for User Agent

    - by alexp
    I'm trying to block requests from a certain bot. I've added a request filtering rule, but I know it is still hitting the site because it shows up in Google Analytics. Here is the filtering rule I added:

      <security>
        <requestFiltering>
          <filteringRules>
            <filteringRule name="Block GomezAgent" scanUrl="false" scanQueryString="false">
              <scanHeaders>
                <add requestHeader="User-Agent" />
              </scanHeaders>
              <denyStrings>
                <add string="GomezAgent+3.0" />
              </denyStrings>
            </filteringRule>
          </filteringRules>
        </requestFiltering>
      </security>

    This is an example of the user agent I'm trying to block:

      Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0;+GomezAgent+3.0)+Gecko/20100101+Firefox/13.0.1

    In some ways it seems to work. If I use Chrome to spoof my user agent, I get a 404, as expected. But the bot traffic is still showing up in my analytics. What am I missing?

    Read the article

  • jQuery getJSON request returning empty on a valid request

    - by mikemike
    I'm trying to grab some JSON from Apple's iTunes JSON service. The request is simple:

      http://ax.phobos.apple.com.edgesuite.net/WebObjects/MZStoreServices.woa/wa/wsSearch?term=jac&limit=25

    If you visit the URL in your browser you will see some well-formed (backed up by jsonlint.com) JSON. When I use the following jQuery to make the request, however, the request finds nothing:

      $("#soundtrack").keypress(function () {
          $.getJSON("http://ax.phobos.apple.com.edgesuite.net/WebObjects/MZStoreServices.woa/wa/wsSearch",
              { 'term': $(this).val(), 'limit': '25' },
              function (j) {
                  var options = '';
                  for (var i = 0; i < j.results.length; i++) {
                      options += '<option value="' + j.results[i].trackId + '">' +
                          j.results[i].artistName + ' - ' + j.results[i].trackName + '</option>';
                  }
                  $("#track_id").html(options);
              });
      });

    Firebug sees the request, but only receives an empty response. Any help would be appreciated here, as I'm at my wits' end trying to solve it. You can view the script here: http://rnmtest.co.uk/gd/drives_admin/add_drive (the soundtrack input box is at the bottom of the page). Thanks

    Read the article

  • Perl: POST request how?

    - by Peterim
    Unfortunately, I'm not familiar with Perl, so asking here. Actually I'm using FCGI with Perl. I need to (1) accept a POST request, (2) send it via POST to another URL, (3) get the results, and (4) return the results to the first POST request. To accept a POST request (step 1) I use the following piece of code (found somewhere on the Internet):

      $ENV{'REQUEST_METHOD'} =~ tr/a-z/A-Z/;
      if ($ENV{'REQUEST_METHOD'} eq "POST") {
          read(STDIN, $buffer, $ENV{'CONTENT_LENGTH'});
      } else {
          print("some error");
      }

      @pairs = split(/&/, $buffer);
      foreach $pair (@pairs) {
          ($name, $value) = split(/=/, $pair);
          $value =~ tr/+/ /;
          $value =~ s/%(..)/pack("C", hex($1))/eg;
          $FORM{$name} = $value;
      }

    The content of $name (it's a string) is the result of the first step. Now I need to send $name via a POST request to some_url (step 2), which returns another result (step 3), which I have to return as the result to the very first POST request (step 4). Any help with this would be greatly appreciated. Thank you.

    Read the article

  • HttpContext.Current.Request.UserHostName is empty when called from a class

    - by John Galt
    I have various web pages that need to build up a URL to display or to place in an emitted email message. The code I inherited kept the name of the web server in a Public Const in a Public Class called FixedConstants. For example:

      Public Const cdServerName As String = "WEBSERVERNAME"

    Trying to improve on this, I wrote this:

      Public Class UIFunction
          Public Shared myhttpcontext As HttpContext

          Public Shared Function cdWebServer() As String
              Dim s As New StringBuilder("http://")
              Dim h As String
              h = String.Empty
              Try
                  h = Current.Request.ServerVariables("REMOTE_HOST").ToString()
              Catch ex As Exception
                  Dim m As String
                  m = ex.Message.ToString()
                  'Ignore this should-not-occur thingy
              End Try
              If h = String.Empty Then
                  h = "SomeWebServer"
              End If
              s.Append(h)
              s.Append("/")
              Return s.ToString()
          End Function
      End Class

    I've tried different things while debugging, such as HttpContext.Current.Request.UserHostName, and I always get an empty string, which pumps out my default string "SomeWebServer". I know Request.UserHostName or Request.ServerVariables("REMOTE_HOST") works when invoked from a page, but why does this return empty when invoked from a called method of a class file (i.e. UIFunction.vb)?

    Read the article

  • Outlook 2007 meeting request varying times across users

    - by gtaylor85
    I've googled this quite a bit, but none of the answers seem to apply to me. User A creates a meeting and invites everyone to a meeting at 1:30pm. Everyone gets the meeting for 1:30pm except User B who gets it for 2:30pm. User B responds with a "Correction" for 1:30pm and it shows up to User A for 12:30pm. I've checked Time Zone settings both in Windows Time and Date settings and also in Outlook options for both computers involved. Also, the DST check boxes are all checked (4x). I'm not sure what else to check. Any ideas?

    Read the article

  • How to redirect external web request to localhost's testing server

    - by Ivan Monteiro
    Some web services call my web application (www.myapplication.com/external_update_handler). I need to test those requests locally, so I'd like to know your opinions about how I can "redirect" those requests to my localhost dev machine (which is outside of my web application's domain) so I can debug. A service/server is probably needed to receive those external requests, plus a desktop application that sends them on to localhost:5555/external_update_handler, but I have no idea where to start, or whether there are simpler options.

    Read the article

  • php to translate GET request to POST request

    - by DennyHalim
    I have a hosted script somewhere that only accepts POST requests, for example some.hosted/script.php. How can I set up another simple PHP script that accepts a GET request and then POSTs it to the hosted script? That way I can put up a link like this: other.site/post2hostedscript.php?postthis=data and it will then POST postthis=data to the hosted script. Thanks.

    Read the article

  • Initiate a POST request from a form with a payload in the request body

    - by Martin Böschen
    I have the following problem. I have a webservice which accepts a POST request with some JSON data in the request body and which also returns JSON data. Now I want to build a user-friendly HTML page to test this service. I have a form to fill in the data; when the user clicks the button, the JSON should be built from the form data and POSTed to my webservice, and the response should be displayed to the user. How do I achieve that?

    Read the article
