Search Results

Search found 53597 results on 2144 pages for 'http requests'.

Page 219/2144 | < Previous Page | 215 216 217 218 219 220 221 222 223 224 225 226  | Next Page >

  • Shaping outbound Traffic to Control Download Speeds with Linux

    - by Kyle Brandt
    I have a situation where a server makes lots of requests to big webservers all at the same time. Currently, I have no control over the number of requests or the rate of the requests from the application that does this. The responses from these webservers are more than the internet line can handle. (Basically, we are launching a DoS on ourselves.) I am going to push to get this fixed at the application level, but for the time being, is there any way I can use traffic shaping on the Linux server to control this? I know I can only shape outbound traffic, but maybe there is a way I can slow the TCP responses so the other side will detect congestion, and this will help my situation? If there is anything like this with tc, what might the configuration look like? The idea is that the traffic control might help me control which packets get dropped before they reach my router.
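    A hedged sketch of the closest tc equivalent (the interface name and rate are assumptions, not from the question): an ingress policer drops over-rate packets at the server. That does not stop them crossing the line, but the resulting loss makes the senders' TCP back off, which is exactly the congestion signal described above:

        # attach an ingress qdisc and police all inbound IP traffic to 50 Mbit/s
        tc qdisc add dev eth0 handle ffff: ingress
        tc filter add dev eth0 parent ffff: protocol ip prio 50 u32 \
            match ip src 0.0.0.0/0 \
            police rate 50mbit burst 500k drop flowid :1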

    Read the article

  • Weighted round robins via TTL - possible?

    - by Joe Hopfgartner
    I currently use DNS round robin for load balancing, which works great. The records look like this (I have a TTL of 120 seconds):

        ;; ANSWER SECTION:
        orion.2x.to. 116 IN A 80.237.201.41
        orion.2x.to. 116 IN A 87.230.54.12
        orion.2x.to. 116 IN A 87.230.100.10
        orion.2x.to. 116 IN A 87.230.51.65

    I learned that not every ISP / device treats such a response the same way. For example, some DNS servers rotate the addresses randomly or always cycle through them; some just propagate the first entry; others try to determine which is best (regionally near) by looking at the IP address. However, if the userbase is big enough (spread over multiple ISPs etc.) it balances pretty well: the discrepancy between the highest and lowest loaded server hardly ever exceeds 15%. But now I have the problem that the servers I am introducing into the system do not all have the same capacity. I currently only have 1 Gbps servers, but I want to work with 100 Mbit and also 10 Gbps servers. So what I want is to introduce a 10 Gbps server with a weight of 100, a 1 Gbps server with a weight of 10 and a 100 Mbit server with a weight of 1. I used to add servers twice to bring more traffic to them (which worked nicely; the bandwidth almost doubled), but adding a 10 Gbit server 100 times to DNS is a bit ridiculous. So I thought about using the TTL. If I give server A a 240-second TTL and server B only 120 seconds (which is about the minimum to use for round robin, as a lot of DNS servers reset to 120 if a lower TTL is specified, or so I have heard), I think something like this should occur in an ideal scenario:

        First 120 seconds:
          50%   of requests get server A -> keep it for 240 seconds
          50%   of requests get server B -> keep it for 120 seconds

        Second 120 seconds:
          50%   of requests still have server A cached -> keep it for another 120 seconds
          25%   of requests get server A -> keep it for 240 seconds
          25%   of requests get server B -> keep it for 120 seconds

        Third 120 seconds:
          25%   will get server A (from the 50% of server A that now expired) -> cache 240 sec
          25%   will get server B (from the 50% of server A that now expired) -> cache 120 sec
          25%   will have server A cached for another 120 seconds
          12.5% will get server B (from the 25% of server B that now expired) -> cache 120 sec
          12.5% will get server A (from the 25% of server B that now expired) -> cache 240 sec

        Fourth 120 seconds:
          25%   will have server A cached -> cache for another 120 secs
          12.5% will get server A (from the 25% of B that now expired) -> cache 240 secs
          12.5% will get server B (from the 25% of B that now expired) -> cache 120 secs
          12.5% will get server A (from the 25% of A that now expired) -> cache 240 secs
          12.5% will get server B (from the 25% of A that now expired) -> cache 120 secs
          6.25% will get server A (from the 12.5% of B that now expired) -> cache 240 secs
          6.25% will get server B (from the 12.5% of B that now expired) -> cache 120 secs
          12.5% will have server A cached -> cache another 120 secs

    I think I lost track somewhere around this point, but you get the idea. As you can see, this gets pretty complicated to predict, and it will certainly not work out exactly like this in practice, but it should definitely have an effect on the distribution! I know that weighted round robin exists and is controlled just at the root server: it cycles through DNS records when responding and returns records with a set probability that corresponds to the weighting. My DNS server does not support this, and my requirements are not that precise: if it doesn't weight perfectly, that's okay, but it should go in the right direction. I think using the TTL field could be a more elegant and easier solution; it doesn't require a DNS server that controls the weighting dynamically, which saves resources, and that is in my opinion the whole point of DNS load balancing versus hardware load balancers. My question now is: are there any best practices / methods / rules of thumb to weight round robin distribution using the TTL attribute of DNS records?
    Edit: The system is a forward proxy server system. The amount of bandwidth (not requests) exceeds what one single server with Ethernet can handle, so I need a balancing solution that distributes the bandwidth to several servers. Are there any alternative methods to using DNS? Of course I can use a load balancer with fibre channel etc., but the costs are ridiculous, and it also only increases the width of the bottleneck without eliminating it. The only thing I can think of is anycast (is it anycast or multicast?) IP addresses, but I don't have the means to set up such a system.
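    A quick way to sanity-check the idealized scenario above is to simulate it. The sketch below is not from the post; it assumes each resolver re-queries the moment its cache expires, traffic is spread uniformly across resolvers, and every cache miss picks A or B 50/50. It suggests the steady-state traffic share is simply proportional to the TTL, so 240s vs 120s yields roughly a 2:1 split rather than anything like the 100:10:1 weighting wanted:

        import random

        TTL = {"A": 240, "B": 120}
        HORIZON = 100_000          # seconds of simulated time per resolver
        seconds = {"A": 0, "B": 0}

        for _ in range(1000):      # independent caching resolvers
            t = 0
            while t < HORIZON:
                s = random.choice("AB")  # each cache miss returns A or B with equal odds
                seconds[s] += TTL[s]     # the resolver sticks with that answer for its TTL
                t += TTL[s]

        total = sum(seconds.values())
        print({k: round(v / total, 3) for k, v in seconds.items()})
        # prints roughly {'A': 0.667, 'B': 0.333}: time share is proportional to TTL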

    Read the article

  • Deserializing classes from XML generated using XSD.exe

    - by heap
    I have classes generated (using xsd.exe) from an .xsd that I can serialize just fine, but when I try to deserialize, I get the error:

        {"<XMLLanguages xmlns='http://tempuri.org/XMLLanguages.xsd'> was not expected."}

    I've searched for a couple of hours and found that most people's problems lie in not declaring namespaces in their xsd/xml, not defining namespaces in their classes, etc., but I can't find a solution for my problem. Here are code snippets for the relevant classes.

        <?xml version="1.0" encoding="utf-8"?>
        <xs:schema id="SetupData" targetNamespace="http://tempuri.org/XMLLanguages.xsd"
            elementFormDefault="qualified"
            xmlns="http://tempuri.org/XMLLanguages.xsd"
            xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="XMLLanguages">
            <xs:complexType>
              <xs:sequence>
                <xs:element name="Tier" minOccurs="1" maxOccurs="unbounded">
                  <xs:complexType>
                    <xs:sequence>
                      <xs:element name="L" minOccurs="1" maxOccurs="unbounded" type="Language"/>
                    </xs:sequence>
                    <xs:attribute name="TierID" type="xs:int"/>
                  </xs:complexType>
                </xs:element>
              </xs:sequence>
            </xs:complexType>
          </xs:element>
          <xs:complexType name="Language">
            <xs:sequence>
              <xs:element name="LangID" type="xs:int"/>
              <xs:element name="Tier" type="xs:int"/>
              <xs:element name="Name" type="xs:string"/>
            </xs:sequence>
            <xs:attribute name="PassRate" type="xs:int"/>
          </xs:complexType>
        </xs:schema>

    And the class:

        /// <remarks/>
        [System.CodeDom.Compiler.GeneratedCodeAttribute("xsd", "4.0.30319.1")]
        [System.SerializableAttribute()]
        [System.Diagnostics.DebuggerStepThroughAttribute()]
        [System.ComponentModel.DesignerCategoryAttribute("code")]
        [System.Xml.Serialization.XmlTypeAttribute(Namespace = "http://tempuri.org/XMLLanguages.xsd")]
        [System.Xml.Serialization.XmlRootAttribute(Namespace = "http://tempuri.org/XMLLanguages.xsd", IsNullable = false)]
        public partial class XMLLanguages
        {
            private List<XMLLanguagesTier> tierField;

            /// <remarks/>
            [System.Xml.Serialization.XmlElementAttribute("Tier")]
            public List<XMLLanguagesTier> Tiers
            {
                get { return this.tierField; }
                set { this.tierField = value; }
            }
        }

    And the line in the XML causing the error:

        <XMLLanguages xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://tempuri.org/XMLLanguages.xsd">
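    One direction worth trying (a hedged sketch, not a confirmed fix; the file name is made up): construct the serializer with an explicit root that carries the schema namespace, so the deserializer accepts the namespaced root element:

        var root = new System.Xml.Serialization.XmlRootAttribute("XMLLanguages")
        {
            Namespace = "http://tempuri.org/XMLLanguages.xsd"
        };
        var serializer = new System.Xml.Serialization.XmlSerializer(typeof(XMLLanguages), root);
        using (var reader = System.Xml.XmlReader.Create("XMLLanguages.xml"))
        {
            // succeeds once the expected root matches the document's root element
            var languages = (XMLLanguages)serializer.Deserialize(reader);
        }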

    Read the article

  • How to secure Firefox traffic (+DNS) through SOCKS proxy under Ubuntu 10.04?

    - by Maarx
    I'm using Ubuntu 10.04, starting a SOCKS proxy with 'ssh -D', and setting Ubuntu to use it with System - Preferences - Network Proxy. Firefox uses the proxy, and the proxy's IP appears when I visit a site like http://www.whatismyip.com/. My question is: is Firefox resolving DNS requests through this proxy? Is my web browsing truly secure? (That is, until it exits the other end of the proxy; I know it's insecure after that. And I've verified the keys, so I'm not being man-in-the-middled. You know what I mean: is it resolving DNS requests through the proxy?) I don't know how I would go about verifying such a thing for myself. Using additional hardware such as another debugging proxy is not an option. If Firefox isn't resolving my DNS requests through the SOCKS proxy, how do I go about fixing it?
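    By default, Firefox of that era resolved host names locally even when a SOCKS proxy was configured, so the answer is "not unless you tell it to". A hedged sketch of the switch, set in about:config or as a user.js line as shown (it requires SOCKS v5, which ssh -D provides):

        // force DNS lookups through the SOCKS proxy
        user_pref("network.proxy.socks_remote_dns", true);

    One way to verify without extra hardware is to watch outbound port 53 with tcpdump while browsing; with the preference set, no DNS queries should leave the machine.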

    Read the article

  • How to detect links without an anchor element in plain text

    - by dhee
    If a user enters text in the text box and saves it, and later wants to add some more text, he can edit that text and save it again. First, if the user enters text with some links, I detect them and convert any hyperlinks to open in a new tab. Second, if the user wants to add more text and links, he clicks on edit, adds them and saves; at this point I must ignore the links that are already hyperlinked with an anchor element. Please help and advise. For example:

        what = "<span>In the task system, is there a way to automatically have any site / page URL or image URL be hyperlinked in a new window?</span><br><br><span>So If I type or copy http://www.stackoverflow.com/&nbsp; for example anywhere in the description, in any of the internal messages or messages to clients, it automatically is a hyperlink in a new window.</span><br><a href="http://www.stackoverflow.com/">http://www.stackoverflow.com/</a><br> <br><span>Or if I input an image URL anywhere in support description, internal messages or messages to clients, it automatically is a hyperlink in a new window:</span><br> <span>https://static.doubleclick.net/viewad/4327673/1-728x90.jpg</span><br><br><a href="https://static.doubleclick.net/viewad/4327673/1-728x90.jpg">https://static.doubleclick.net/viewad/4327673/1-728x90.jpg</a><br><br><br><span>This would save us a lot of time in task building, reviewing and creating messages.</span> Test URL's http://www.stackoverflow.com/ http://stackoverflow.com/ https://stackoverflow.com/ www.stackoverflow.com //stackoverflow.com/ <a href='http://stackoverflow.com/'>http://stackoverflow.com/</a>";

    I've tried this code:

        function Linkify(what) {
            str = what;
            out = "";
            url = "";
            i = 0;
            do {
                url = str.match(/((https?:\/\/)?([a-z\-]+\.)*[\-\w]+(\.[a-z]{2,4})+(\/[\w\_\-\?\=\&\.]*)*(?![a-z]))/i);
                if (url != null) {
                    // get href value
                    href = url[0];
                    if (href.substr(0, 7) != "http://") href = "http://" + href;
                    // where the match occurred
                    where = str.indexOf(url[0]);
                    // add it to the output
                    out += str.substr(0, where);
                    // link it
                    out += '<a href="' + href + '" target="_blank">' + url[0] + '</a>';
                    // prepare str for next round
                    str = str.substr(where + url[0].length);
                } else {
                    out += str;
                    str = "";
                }
            } while (str.length > 0);
            return out;
        }

    Please help. Thanks.
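    One hedged direction (a sketch assuming a browser DOM is available, with a deliberately simpler URL regex than the one above): parse the HTML and linkify only text nodes, never descending into existing anchors, so already-linked URLs are left alone:

        // linkify plain URLs but skip anything already inside an <a>
        function linkifySkippingAnchors(html) {
            var container = document.createElement('div');
            container.innerHTML = html;
            var urlRe = /(https?:\/\/[^\s<]+)/g;
            (function walk(node) {
                for (var i = node.childNodes.length - 1; i >= 0; i--) {
                    var child = node.childNodes[i];
                    if (child.nodeType === 3) {          // text node: safe to linkify
                        var replaced = child.nodeValue.replace(urlRe,
                            '<a href="$1" target="_blank">$1</a>');
                        if (replaced !== child.nodeValue) {
                            var span = document.createElement('span');
                            span.innerHTML = replaced;
                            node.replaceChild(span, child);
                        }
                    } else if (child.nodeName !== 'A') { // don't descend into anchors
                        walk(child);
                    }
                }
            })(container);
            return container.innerHTML;
        }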

    Read the article

  • remove multiple trailing slashes mod_rewrite

    - by Boyan
    I know this question was asked a number of times on this site alone, but browsing through the relevant posts I couldn't find a solution. Trying to remove multiple trailing slashes after the domain. The following mod_rewrite expressions seem to work for URLs such as http://www.domain.com//path1///path2////, but do not work for domain//

        DirectorySlash Off
        RewriteEngine on

        # Canonical fix
        RewriteCond %{HTTP_HOST} !^www.domain.com$ [NC]
        RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301]
        RewriteRule ^/main.do http://www.domain.com/ [R=301,L]
        RewriteRule ^/index.jsp http://www.domain.com/ [R=301,L]

        # Remove bogus query strings
        RewriteCond %{query_string} q= [NC]
        RewriteRule (.*) http://www.domain.com/$1? [R=301,L]

        # Remove multiple slashes after domain - DOESN'T WORK!!!
        #RewriteCond %{REQUEST_URI} ^//+(.*)$ [OR]
        #RewriteCond %{REQUEST_URI} ^(.*/)/+$
        #RewriteRule / http://www.domain.com/%1 [R=301,L]

        # Remove multiple slashes anywhere in URL
        RewriteCond %{REQUEST_URI} ^(.*)//(.*)$
        RewriteRule . %1/%2 [R=301,L]

        # Externally redirect to get rid of trailing slash except for home page, ads
        RewriteCond %{REQUEST_URI} !^/ads/
        RewriteRule ^(.+)/$ $1 [R=301,L]

    Your help is appreciated.
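    A hedged sketch of one commonly cited workaround (untested here): THE_REQUEST preserves the raw request line exactly as the client sent it, including slashes directly after the domain, so it can catch the domain// case that the REQUEST_URI conditions miss:

        # collapse any run of slashes immediately after the domain
        RewriteCond %{THE_REQUEST} ^[A-Z]+\s//+(\S*)
        RewriteRule ^ http://www.domain.com/%1 [R=301,L]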

    Read the article

  • Problem serializing complex data using WCF

    - by Gustavo Paulillo
    Scenario: a WCF client app calling a web service (Java) operation which requires a complex object as a parameter. I already have the metadata. Problem: the operation has some required fields, one of them an enum, and the field below (from the generated metadata) is missing from the SOAP sent. I am using WCF diagnostics and the Windows Service Trace Viewer:

        [System.CodeDom.Compiler.GeneratedCodeAttribute("System.Xml", "2.0.50727.3082")]
        [System.SerializableAttribute()]
        [System.Diagnostics.DebuggerStepThroughAttribute()]
        [System.ComponentModel.DesignerCategoryAttribute("code")]
        [System.Xml.Serialization.XmlTypeAttribute(TypeName="Consult-Filter", Namespace="http://webserviceX.org/")]
        public partial class ConsFilter : object, System.ComponentModel.INotifyPropertyChanged
        {
            private PersonType customerTypeField;

    The property:

        [System.Xml.Serialization.XmlElementAttribute("customer-type", Form=System.Xml.Schema.XmlSchemaForm.Unqualified, Order=1)]
        public PersonType customerType
        {
            get { return this.customerTypeField; }
            set
            {
                this.customerTypeField = value;
                this.RaisePropertyChanged("customerType");
            }
        }

    The enum:

        [System.CodeDom.Compiler.GeneratedCodeAttribute("System.Xml", "2.0.50727.3082")]
        [System.SerializableAttribute()]
        [System.Xml.Serialization.XmlTypeAttribute(TypeName="Person-Type", Namespace="http://webserviceX.org/")]
        public enum PersonType
        {
            /// <remarks/>
            F,
            /// <remarks/>
            J,
        }

    The trace log:

        <MessageLogTraceRecord>
          <HttpRequest xmlns="http://schemas.microsoft.com/2004/06/ServiceModel/Management/MessageTrace">
            <Method>POST</Method>
            <QueryString></QueryString>
            <WebHeaders>
              <VsDebuggerCausalityData>data</VsDebuggerCausalityData>
            </WebHeaders>
          </HttpRequest>
          <s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
            <s:Header>
              <Action s:mustUnderstand="1" xmlns="http://schemas.microsoft.com/ws/2005/05/addressing/none"></Action>
              <ActivityId CorrelationId="correlationId" xmlns="http://schemas.microsoft.com/2004/09/ServiceModel/Diagnostics">activityId</ActivityId>
            </s:Header>
            <s:Body xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
              <filter xmlns="http://webserviceX.org/">
                <product-code xmlns="">116</product-code>
                <customer-doc xmlns="">777777777</customer-doc>
              </filter>
            </s:Body>
          </s:Envelope>
        </MessageLogTraceRecord>
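    A hedged guess at the cause (a common pitfall with xsd.exe/svcutil-generated code, not confirmed from the post): when an element is optional in the schema, the generated class carries a companion bool *Specified property marked [XmlIgnore], and the serializer only emits the element when that flag is set:

        // illustrative only; whether ConsFilter actually has this flag depends
        // on the generated code
        var filter = new ConsFilter();
        filter.customerType = PersonType.F;
        // without this line the serializer silently skips the element:
        // filter.customerTypeSpecified = true;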

    Read the article

  • Getting XML data from an external page and parsing it with PHP

    - by James P
    I'm trying to create a database of World of Warcraft gems. If I go to this page: http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=purple&searchType=items and go to View Source in Firefox, I see a tonne of XML data, which is exactly what I want. I wrote up this quick script to try and parse some of it:

        <?php
        $gemUrls = array(
            'Blue' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=blue&searchType=items',
            'Red' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=red&searchType=items',
            'Yellow' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=yellow&searchType=items',
            'Meta' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=meta&searchType=items',
            'Green' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=green&searchType=items',
            'Orange' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=orange&searchType=items',
            'Purple' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=purple&searchType=items',
            'Prismatic' => 'http://www.wowarmory.com/search.xml?fl[source]=all&fl[type]=gems&fl[subTp]=purple&searchType=items'
        );

        // Get blue gems
        $blueGems = file_get_contents($gemUrls['Blue']);
        $xml = new SimpleXMLElement($blueGems);
        echo $xml->items[0]->item;
        ?>

    But I get a load of errors like this:

        Warning: SimpleXMLElement::__construct() [simplexmlelement.--construct]: Entity: line 20: parser error : xmlParseEntityRef: no name in C:\xampp\htdocs\WoW\index.php on line 19
        Warning: SimpleXMLElement::__construct() [simplexmlelement.--construct]: if(Browser.iphone && Number(getcookie2("mobIntPageVisits")) < 3 && getcookie2( in C:\xampp\htdocs\WoW\index.php on line 19

    I'm not sure what's wrong. I think file_get_contents() is bringing back data that isn't XML, maybe some JavaScript, judging by the iPhone parts in the errors. Is there any way to just get back the XML from that page, without any HTML or anything? Thanks :)
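    A hedged guess (not from the post): the Armory was known to serve XML only to clients it recognized as browsers and HTML plus JavaScript to everything else, which would explain the JavaScript fragments in the parser errors. Sending a browser-like User-Agent may help:

        // same fetch, but with a browser User-Agent on the request
        $context = stream_context_create(array(
            'http' => array(
                'header' => "User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0\r\n",
            ),
        ));
        $blueGems = file_get_contents($gemUrls['Blue'], false, $context);
        $xml = new SimpleXMLElement($blueGems);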

    Read the article

  • Mod rewrite with 3 parameters?

    - by Axel
    Hello, I tried tons of methods to figure out how to make this mod_rewrite work, but I was completely unsuccessful. I want a .htaccess rule that rewrites as follows:

        http://www.mydomain.com/apple/upcoming/2 -> http://www.mydomain.com/handler.php?topic=apple&orderby=upcoming&page=2

    This is easy to do, but the problem is that not all parameters are required, so the link has a different number of segments each time, like this:

        http://www.mydomain.com/apple/popular/2 -> topic=apple&orderby=popular&page=2
        http://www.mydomain.com/apple/2         -> topic=apple&orderby=&page=2
        http://www.mydomain.com/all/popular/2   -> topic=all&orderby=popular&page=2
        http://www.mydomain.com/apple/upcoming/ -> topic=apple&orderby=upcoming&page=

    So briefly, the URL has 3 optional parameters in one fixed order: (topic) (orderby) (page). Note: the orderby parameter can be "popular", "upcoming" or nothing. Thanks
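    A hedged sketch of one way to cover these cases, with the file and parameter names taken from the question (in a real setup, existing files such as handler.php itself should be excluded with a RewriteCond %{REQUEST_FILENAME} !-f before each rule):

        RewriteEngine On
        # topic / orderby / page
        RewriteRule ^([^/]+)/(popular|upcoming)/([0-9]+)/?$ handler.php?topic=$1&orderby=$2&page=$3 [L,QSA]
        # topic / orderby
        RewriteRule ^([^/]+)/(popular|upcoming)/?$ handler.php?topic=$1&orderby=$2&page= [L,QSA]
        # topic / page
        RewriteRule ^([^/]+)/([0-9]+)/?$ handler.php?topic=$1&orderby=&page=$2 [L,QSA]

    Ordering matters: the most specific pattern comes first, and restricting orderby to its two known values keeps a page number from being mistaken for an order.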

    Read the article

  • Wildcard redirect for subdomains and URI

    - by user1807680
    I have a problem creating a permanent (301) redirect in Apache. I have two domains: olddomain.com, with many subdomains, and newdomain.com, and I want redirects like this:

        http://anysubdomain.olddomain.com        -> http://anysubdomain.newdomain.com
        http://olddomain.com/something           -> http://newdomain.com/something
        http://olddomain.com/different/index.html -> http://newdomain.com/different/index.html
        http://example.olddomain.com/ex/index.html -> http://example.newdomain.com/ex/index.html

    I don't know how I should set this:

        <VirtualHost *:80>
            ServerName olddomain.com
        </VirtualHost>

    Regards
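    A hedged sketch (it assumes mod_rewrite is enabled and that *.olddomain.com already resolves to this server): one catch-all virtual host that preserves both the subdomain and the path:

        <VirtualHost *:80>
            ServerName olddomain.com
            ServerAlias *.olddomain.com
            RewriteEngine On
            # bare domain: just swap the host, keep the path
            RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
            RewriteRule ^(.*)$ http://newdomain.com$1 [R=301,L]
            # any subdomain: carry it over to the new domain
            RewriteCond %{HTTP_HOST} ^(.+)\.olddomain\.com$ [NC]
            RewriteRule ^(.*)$ http://%1.newdomain.com$1 [R=301,L]
        </VirtualHost>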

    Read the article

  • What good technology/programming vodcasts are out there?

    - by Sam Saffron
    I'm trying to round up a list of programming/technology-related vodcasts. Related question: What good technology podcasts are out there? Here I am looking for podcasts which include video content, like:

        dnr tv:     http://www.dnrtv.com/
        Channel 9:  http://channel9.msdn.com/
        Dimecasts:  http://dimecasts.net/
        Railscasts: http://railscasts.com/
        Zendcasts:  http://www.zendcasts.com/
        Net Tuts:   http://net.tutsplus.com/category/videos/screencasts/

    Feel free to edit this post and improve the list (with perhaps university lectures in RSS formats etc.). For the voting to have even a slight amount of meaning, please include only one vodcast per post. If this is a dupe, let me know and I'll delete it.

    Read the article

  • Zend_Soap_Client doesn't work with proxy

    - by understack
    I'm accessing a SOAP web service like:

        $wsdl_url = 'http://abslive3.timesgroup.com:8888/clsRSchedule.soap?wsdl';
        $client = new Zend_Soap_Client($wsdl_url, array('proxy_host' => "http://virtual-browser.25u.com",
                                                        'proxy_port' => 80));

    Since my shared server blocks port 8888, I'm using this proxy server, but Zend_Soap_Client tries to connect directly:

        Exception information:
        Message: SOAP-ERROR: Parsing WSDL: Couldn't load from 'http://abslive3.timesgroup.com:8888/clsRSchedule.soap?wsdl' : failed to load external entity "http://abslive3.timesgroup.com:8888/clsRSchedule.soap?wsdl"

        Stack trace:
        #0 /home/..../library/Zend/Soap/Client/Common.php(51): SoapClient->SoapClient('http://abslive3...', Array)
        #1 /home/..../library/Zend/Soap/Client.php(1024): Zend_Soap_Client_Common->__construct(Array, 'http://abslive3...', Array)
        #2 /home/..../library/Zend/Soap/Client.php(1180): Zend_Soap_Client->_initSoapClientObject()
        #3 /home/..../library/Zend/Soap/Client.php(1104): Zend_Soap_Client->getSoapClient()
        #4 [internal function]: Zend_Soap_Client->__call('ReturnDataSet', Array)

    What am I doing wrong?
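    A hedged first step (based on how PHP's underlying SoapClient treats these options, not verified against this service): proxy_host expects a bare host name, not a URL, so the http:// scheme may be what breaks the proxy setup:

        $client = new Zend_Soap_Client($wsdl_url, array(
            'proxy_host' => 'virtual-browser.25u.com',  // no http:// prefix here
            'proxy_port' => 80,
        ));

    If the WSDL download still bypasses the proxy, fetching the WSDL separately through the proxy to a local file and passing that path as the first argument is another direction worth trying.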

    Read the article

  • Remove server from loadbalancer [on hold]

    - by Cris
    Our datacenter load balances the traffic to our WebLogic servers based on an application deployed on each server called probe.ear. It is a simple Java app which is accessed by the load balancer; if it is accessible, the load balancer keeps directing traffic to that server. When we want to remove one node from the load balancer we just stop the probe app, and new traffic is no longer directed to it, BUT I am asking myself: what about the current requests? Are the responses for the current requests still executed (completed)? I have not had time to test (I will do that first thing when I am back at work), but on a theoretical level it makes sense to me that they would be, unless the load balancer has some special settings. Does my reasoning make sense to you? Just to make it clear, since it was reported that the question is not clear: is there a way to configure a load balancer so that, from a specific moment, it will not direct traffic to a certain server, without resetting the current requests?
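    What is described here is usually called connection draining. As one hedged illustration (assuming an HAProxy-style balancer with an admin socket; the pool and server names are made up), putting a backend in maintenance mode stops new traffic while letting established requests finish:

        # stop sending new connections to web01; in-flight requests complete
        echo "disable server mypool/web01" | socat stdio /var/run/haproxy.sock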

    Read the article

  • XSD file, where to get xmlns argument?

    - by Daok
    <?xml version="1.0" encoding="utf-8"?>
    <xs:schema id="abc" targetNamespace="http://schemas.businessNameHere.com/SoftwareNameHere"
        elementFormDefault="qualified"
        xmlns="http://schemas.businessNameHere.com/SoftwareNameHere"
        xmlns:mstns="http://schemas.businessNameHere.com/SoftwareNameHere"
        xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="..." type="..." />
      <xs:complexType name="...">

    I am working on a project using XSD to generate a .cs file. My question concerns the string "http://schemas.businessNameHere.com/SoftwareNameHere". If I change it, it doesn't work. But the http:// is not a valid one... what is the logic behind it, and where can I find information about what to put there or how to change it?
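    For what it's worth, an XML namespace is an opaque identifier, not an address that gets fetched; the http:// form is just a convention for keeping names globally unique. It only has to match everywhere it is referenced. A hedged sketch of renaming it (a urn: is just as valid; the schema, the regenerated .cs classes and the instance documents must all use the new value consistently):

        <xs:schema id="abc"
            targetNamespace="urn:businessNameHere:softwareNameHere"
            elementFormDefault="qualified"
            xmlns="urn:businessNameHere:softwareNameHere"
            xmlns:mstns="urn:businessNameHere:softwareNameHere"
            xmlns:xs="http://www.w3.org/2001/XMLSchema">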

    Read the article

  • Cannot access maven repository from behind proxy, need help.

    - by Digambar Daund
    I am trying to access a Maven repository from behind a proxy. I configured settings.xml correctly (I guess so...):

        <proxy>
          <active>true</active>
          <protocol>http</protocol>
          <username>username</username>
          <password>password</password>
          <host>12.34.56.78</host>
          <port>8080</port>
        </proxy>

    But I still get errors. If I don't configure the username/password, I get the correct error message, which is HTTP response code 407, saying authentication is required. But whether I configure correct or incorrect proxy authentication, it always prints the error message below:

        Downloading: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-clean-plugin/2.2/maven-clean-plugin-2.2.pom
        [WARNING] Unable to get resource 'org.apache.maven.plugins:maven-clean-plugin:pom:2.2' from repository central (http://repo1.maven.org/maven2): Error transferring file: Server redirected too many times (20)
        Downloading: http://repo1.maven.org/maven2/org/apache/maven/plugins/maven-clean-plugin/2.2/maven-clean-plugin-2.2.pom
        [WARNING] Unable to get resource 'org.apache.maven.plugins:maven-clean-plugin:pom:2.2' from repository central (http://repo1.maven.org/maven2): Error transferring file: Server redirected too many times (20)
        [INFO] ------------------------------------------------------------------------
        [ERROR] BUILD ERROR

    Read the article

  • Page.ClientScript: registering JavaScript code on a page

    - by blgnklc
    How can I register JavaScript code on a page? I want to put the JavaScript function into the aspx page; I think it might be like this:

        string strJS = "<script language = javascript>
            (function(images, elements) {
                var fetchImages = function() {
                    if (images.length > 0) {
                        var numImages = 10;
                        while (images.length > 0 && numImages-- > 0) {
                            // assuming your elements are <img>
                            document.getElementById(elements.shift()).src = images.shift();
                            // if not you could also set the background (or backgroundImage) css property
                            // document.getElementById(elements.shift()).style.background = "url(" + images.shift() + ")";
                        }
                        setTimeout(fetchImages, 5000);
                    }
                }
                // bind to window onload
                window.onload = fetchImages;
                // if you're going to use something like jquery then do something like this instead
                //$(fetchImages);
            }(['url1', 'url2', 'url3'], ['img1', 'img2', 'img3']))
            </script>";

        Page.ClientScript.RegisterClientScriptBlock(typeof(ui_SUBMVGInfo), "SmartPenCheck", strJS);

    The second question is:

        ...
        ... g(ctrlIDBirthPlaceCode).onchange(); }}
        </script>
        <script language = javascript>
        var url1 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Adiniz';
        var url2 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Soyadiniz';
        var url3 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_tarih';
        var url4 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Adiniz';
        var url5 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Soyadiniz';
        var url6 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_tarih';
        var url7 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Adiniz';
        var url8 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Soyadiniz';
        var url9 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_tarih';
        var url10 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Adiniz';
        var url11 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_Soyadiniz';
        var url12 = 'http://localhost:3645/ImageHandler.ashx?FormId=XXX.11&FieldId=s1_tarih';
        function(images, elements) {
            var fetchImages = function() {
                if (images.length > 0) {
                    var numImages = 5;
                    while (images.length > 0 && numImages-- > 0) {
                        document.getElementById(elements.shift()).src = images.shift();
                    }
                    setTimeout(fetchImages, 5000);
                }
            }
            window.onload = fetchImages;
        }(['+url1+', '+url2+', '+url3+', '+url4+', '+url5+', '+url6+', '+url7+', '+url8+', '+url9+', '+url10+', '+url11+', '+url12+'],
          ['ui_taskFormControl$ctl03$ctl00$ctl03$ui_CustomerNameImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_CustomerSurNameImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_MaritalStatusImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_SexImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_BirthDateImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_BirthPlaceCodeImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_BirthPlaceImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_IdNationalityImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_MotherOldSurNameImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_TaxNoImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_CitizenshipNoImage',
           'ui_taskFormControl$ctl03$ctl00$ctl03$ui_HomePhoneImage']));
        </script>
        <script>
        var ui_BirthPlaceCode_a='Intertech.Utility';
        var ui_BirthPlaceCode_c='Intertech.Utility.Common';
        var ui_BirthPlaceCode_sbn='';
        ...
        ..

    What do you think: is it all right, and how can I do that? And regarding the second question above, is the use of the code correct?
    PS: To see my purpose, please continue reading. There is a web page coded in ASP.NET using C#, with Ajax enabled. I want a very fast loading web page, which should happen with the following architecture: 1. First, all data is shown in the text boxes (there are 50 text boxes; it is an application form). 2. When the page is requested and loaded, I want all the photos shown next to each text box, 10 at a time, from the top of the page to the end (each photo is between 5 kB and 20 kB). I know ImageHandlers; the question is how I can put this idea into practice. Some examples and ideas would be great! Thanks. This issue has been answered and you may have a look - here. Regards, Bk
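    A hedged sketch of the registration itself (type and key names kept from the question; the URL and element arrays are placeholders): passing true as the last argument makes RegisterClientScriptBlock emit the <script> wrapper itself, so the C# string no longer has to carry the tags, which avoids the nested-quote problems above:

        // register only the JavaScript body; addScriptTags=true writes the <script> tags
        string strJS =
            "(function(images, elements) { /* fetch 10 images every 5s, as above */ })" +
            "(['url1', 'url2', 'url3'], ['img1', 'img2', 'img3']);";
        Page.ClientScript.RegisterClientScriptBlock(
            typeof(ui_SUBMVGInfo), "SmartPenCheck", strJS, true);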

    Read the article

  • bind: blackhole for invalid recursive queries?

    - by Udo G
    I have a name server that's publicly accessible, since it is the authoritative name server for a couple of domains. Currently the server is flooded with faked type ANY requests for isc.org, ripe.net and similar (a known distributed DoS attack). The server runs BIND and has allow-recursion set to my LAN, so these requests are rejected. In such cases the server responds with just the authority and additional sections, referring to the root servers. Can I configure BIND so that it completely ignores these requests, without sending a response at all?
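    If BIND offers no way to stay silent, one hedged workaround is to drop the queries in the firewall before named sees them. The rule below matches the QNAME "isc.org" in DNS wire format (03 'i' 's' 'c' 03 'o' 'r' 'g' 00); note it would also drop legitimate queries for that name:

        iptables -A INPUT -p udp --dport 53 -m string --algo bm \
                 --hex-string '|03697363036f726700|' -j DROP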

    Read the article

  • CSS layout changes in Internet Explorer

    - by user1784103
    My problem: I just coded a website that is styled just fine in Chrome and Firefox; however, in Internet Explorer (9) it breaks. The gray header background is supposed to be pushed up to the right of the logo block, and the buttons are supposed to be in the dark grey area. I posted my code below. I'm no expert at CSS; any tips would be greatly appreciated. (The second image displays the desired result.)

    The HTML:

        Website
        </head>
        <body>
        <div class="wrap_overall">
          <div class="header">
            <a href="http://localhost">
              <img class="logo" src="http://localhost/images/logo.png" width="175" height="24" alt="weblogo" />
            </a>
          </div>
          <div class="headerbg"></div>
          <!-- nav top highlight -->
          <div style="background-color:#6c6c6c;margin:9px0px0px;height:1px;width:1020px;z-index:1;"></div>
          <!-- nav bar -->
          <div style="background-color:#5a5a5a;height:53px;width:1020px;z-index:1;"></div>
          <!-- nav bottom frame -->
          <div style="background-color:#d4e6b6;height:13px;width:1020px;z-index:1;border-top:4px solid #9de629; margin:0px 0px 10px;"></div>
          <div class="nav_main">
            <ul>
              <li>
                <a href="http://localhost/button1">
                  <img src="http://localhost/images/button1.png" width="63" height="18" alt="button1" />
                </a>
              </li>
              <li>
                <a href="http://localhost/index.php?page=button2">
                  <img src="http://localhost/images/button2.png" width="59" height="18" alt="button2" />
                </a>
              </li>
              <li>
                <a href="http://localhost/index.php?page=button3">
                  <img src="http://localhost/images/button3.png" width="62" height="18" alt="button3" />
                </a>
              </li>
              <li>
                <a href="http://localhost/index.php?page=button4">
                  <img src="http://localhost/images/button4.png" width="41" height="18" alt="button4" />
                </a>
              </li>
              <li>
                <a href="http://localhost/index.php?page=button5">
                  <img src="http://localhost/images/button5.png" width="73" height="18" alt="button5" />
                </a>
              </li>
            </ul>
          </div>
        </div>
        </body>
        </html>

    The CSS:

        .logo { padding:60px 20px 50px 20px; }

        body {
            background-color:#282828;
            color:#FFFFFF;
            font-family: Arial, Helvetica, sans-serif;
        }
        body a, img { border-style:none; color:#9de629; text-decoration:none; }
        body a:visited { color:#9de629; }
        body a:hover { color:#FFFFFF; text-decoration:underline; }

        .wrap_overall {
            position:relative;
            width: 1020px;
            margin:0px auto;
        }
        .header {
            width:216px;
            height:148px;
            margin:0px 0px;
            padding:0px 0px;
            background-color:#252525;
            float:left;
            display:inline;
        }
        .headerbg {
            margin:0px 0px 0px;
            padding:0px 0px;
            width:1020px;
            height:148px;
            background-color:#c7c7c7;
        }
        .nav_main { /* holds the buttons */
            margin:0px 0px 1px 0px;
            padding:0px 0px 0px 0px;
            position:absolute;
            top:148px;
            left:363px;
            z-index:2;
            overflow: hidden;
        }
        .nav_main ul {
            margin:0px 0px 0px 0px;
            padding:0px 0px 0px 0px;
            overflow: hidden;
        }
        .nav_main ul li {
            margin:0px 0px 0px 0px;
            display: inline;
            float: left;
        }
        .nav_main ul li a {
            outline: none;
            border:none;
            margin:0px 0px 0px 0px;
            margin-right:-10px;
            height:54px;
            width:125px;
            color:#FFFFFF;
            background-image:url(../images/button.png);
            text-align:center;
            display:table-cell;
            vertical-align:middle;
        }
        .nav_main ul li a:hover {
            background-image:url(../images/buttonlight.png);
        }
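    Two hedged observations rather than a verified fix. First, if the page is served without a doctype, IE9 drops into quirks mode, where float plus display:table-cell layouts behave very differently from Chrome and Firefox, so ensure the document starts with one. Second, the nav-highlight div's inline style margin:9px0px0px is missing spaces between its values, which makes the declaration invalid:

        <!DOCTYPE html>
        <!-- must be the very first line, before <html>, to keep IE9 in standards mode -->

        <!-- the nav top highlight, with valid margin values -->
        <div style="background-color:#6c6c6c;margin:9px 0px 0px;height:1px;width:1020px;z-index:1;"></div>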

    Read the article

  • libcurl - unable to download a file

    - by marmistrz
    I'm working on a program which will download lyrics from sites like AZLyrics. I'm using libcurl. This is my code:

    lyricsDownloader.cpp:

        #include "lyricsDownloader.h"
        #include <curl/curl.h>
        #include <cstring>
        #include <iostream>

        #define DEBUG 1

        /////////////////////////////////////////////////////////////////////////////

        // this function is a static member function
        size_t lyricsDownloader::write_data_to_var(char *ptr, size_t size, size_t nmemb, void *userdata)
        {
            ostringstream * stream = (ostringstream*) userdata;
            size_t count = size * nmemb;
            stream->write(ptr, count);
            return count;
        }

        string AZLyricsDownloader::toProviderCode() const
        {
            /* this creates an url */
        }

        CURLcode AZLyricsDownloader::download()
        {
            CURL * handle;
            CURLcode err;
            ostringstream buff;

            handle = curl_easy_init();
            if (! handle) return static_cast<CURLcode>(-1);

            // set verbose if debug on
            curl_easy_setopt( handle, CURLOPT_VERBOSE, DEBUG );
            // set the download url to the generated one
            curl_easy_setopt( handle, CURLOPT_URL, toProviderCode().c_str() );
            curl_easy_setopt(handle, CURLOPT_WRITEDATA, &buff);
            curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, &AZLyricsDownloader::write_data_to_var);

            err = curl_easy_perform(handle);
            // The segfault should be somewhere here - after calling the function but before it ends
            cerr << "cleanup\n";
            curl_easy_cleanup(handle);

            // copy the contents to text variable
            lyrics = buff.str();
            return err;
        }

    main.cpp:

        #include <QString>
        #include <QTextEdit>
        #include <iostream>
        #include "lyricsDownloader.h"

        int main(int argc, char *argv[])
        {
            AZLyricsDownloader dl(argv[1], argv[2]);
            dl.perform();
            QTextEdit qtexted(QString::fromStdString(dl.lyrics));
            cout << qPrintable(qtexted.toPlainText());
            return 0;
        }

    When running ./maelyrica Anthrax Madhouse I'm getting this logged from curl:

        * About to connect() to azlyrics.com port 80 (#0)
        *   Trying 174.142.163.250...
        * connected
        * Connected to azlyrics.com (174.142.163.250) port 80 (#0)
        > GET /lyrics/anthrax/madhouse.html HTTP/1.1
        Host: azlyrics.com
        Accept: */*

        < HTTP/1.1 301 Moved Permanently
        < Server: nginx/1.0.12
        < Date: Thu, 05 Jul 2012 16:59:21 GMT
        < Content-Type: text/html
        < Content-Length: 185
        < Connection: keep-alive
        < Location: http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        <
        Segmentation fault

    Strangely, the file is there. The same error is displayed when there's no such page (a redirect to the azlyrics.com main page). What am I doing wrong? Thanks in advance.

    EDIT: I made the function for writing data static, but this changes nothing. Even wget seems to have problems:

        $ wget http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        --2012-07-06 10:36:05-- http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        Resolving www.azlyrics.com... 174.142.163.250
        Connecting to www.azlyrics.com|174.142.163.250|:80... connected.
        HTTP request sent, awaiting response... No data received.
        Retrying.

    Why does opening the page in a browser work and wget/curl not?

    EDIT2: After adding this:

        curl_easy_setopt(handle, CURLOPT_FOLLOWLOCATION, 1);

    the log is:

        * About to connect() to azlyrics.com port 80 (#0)
        *   Trying 174.142.163.250...
        * connected
        * Connected to azlyrics.com (174.142.163.250) port 80 (#0)
        > GET /lyrics/anthrax/madhouse.html HTTP/1.1
        Host: azlyrics.com
        Accept: */*

        < HTTP/1.1 301 Moved Permanently
        < Server: nginx/1.0.12
        < Date: Fri, 06 Jul 2012 09:09:47 GMT
        < Content-Type: text/html
        < Content-Length: 185
        < Connection: keep-alive
        < Location: http://www.azlyrics.com/lyrics/anthrax/madhouse.html
        <
        * Ignoring the response-body
        * Connection #0 to host azlyrics.com left intact
        * Issue another request to this URL: 'http://www.azlyrics.com/lyrics/anthrax/madhouse.html'
        * About to connect() to www.azlyrics.com port 80 (#1)
        *   Trying 174.142.163.250...
        * connected
        * Connected to www.azlyrics.com (174.142.163.250) port 80 (#1)
        > GET /lyrics/anthrax/madhouse.html HTTP/1.1
        Host: www.azlyrics.com
        Accept: */*

        < HTTP/1.1 200 OK
        < Server: nginx/1.0.12
        < Date: Fri, 06 Jul 2012 09:09:47 GMT
        < Content-Type: text/html
        < Transfer-Encoding: chunked
        < Connection: keep-alive
        <
        Segmentation fault
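    A hedged sketch of a known-good libcurl pattern (not the confirmed fix for this particular crash): libcurl expects a plain C-style callback, so using a free function with exactly the documented signature, long values for the options, and FOLLOWLOCATION for the 301 removes the usual suspects:

        #include <curl/curl.h>
        #include <sstream>
        #include <string>

        static size_t write_cb(char *ptr, size_t size, size_t nmemb, void *userdata)
        {
            std::ostringstream *out = static_cast<std::ostringstream *>(userdata);
            out->write(ptr, static_cast<std::streamsize>(size * nmemb));
            return size * nmemb;
        }

        std::string fetch(const std::string &url)
        {
            std::ostringstream buff;
            CURL *handle = curl_easy_init();
            if (!handle) return "";
            curl_easy_setopt(handle, CURLOPT_URL, url.c_str());
            curl_easy_setopt(handle, CURLOPT_FOLLOWLOCATION, 1L); // follow the 301
            curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, write_cb);
            curl_easy_setopt(handle, CURLOPT_WRITEDATA, &buff);
            curl_easy_perform(handle);
            curl_easy_cleanup(handle);
            return buff.str();
        }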

    Read the article

  • Identify Long Running or Slow PHP Scripts

    - by Kirk
    I have a web server that is getting around 25K visits a day at yougetsignal.com. Sometimes the site feels a bit sluggish. I am hosting it on nginx with php5-fpm. Is there a way for me to see a list of all of the long-running requests that are coming to the site? I'd love to have a real-time list of all of the active requests that PHP is handling and how long they have been running. Kind of like top, but just for the web server. This would let me know how long requests are taking and which script is the culprit. Anyone have any ideas on how I can do this?
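    php5-fpm can do this itself; a hedged sketch of the relevant pool directives (the log path and pool file location are assumptions):

        ; in the pool config, e.g. /etc/php5/fpm/pool.d/www.conf

        ; write a PHP stack trace for any request running longer than 5 seconds
        slowlog = /var/log/php5-fpm.slow.log
        request_slowlog_timeout = 5s

        ; live status page: /status?full lists each worker's current request,
        ; script and how long it has been running
        pm.status_path = /status

    The status path still has to be routed to php-fpm in the nginx config, and is usually restricted by IP.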

    Read the article

  • Getting error 400 / 404 - HttpUtility.UrlEncode not encoding full string?

    - by Justin808
    Why do the following URLs give me the IIS errors below?

        A)  http://192.168.1.96/cms/View.aspx/Show/Small+test'
        A2) http://192.168.1.96/cms/View.aspx/Show/Small%20test'  <-- this works, but is not the result from HttpUtility.UrlEncode()
        B)  http://192.168.1.96/cms/View.aspx/Show/'%26$%23funky**!!~''+page

    Error for A:

        HTTP Error 404.11 - Not Found
        The request filtering module is configured to deny a request that contains a double escape sequence.

    Error for B:

        HTTP Error 400.0 - Bad Request
        ASP.NET detected invalid characters in the URL.

    The last part of the URL after /Show/ is the result of sending the text through HttpUtility.UrlEncode(), so according to Microsoft it is URL-encoded correctly. If I use HttpUtility.UrlPathEncode() rather than HttpUtility.UrlEncode(), I get the A2 results, but B ends up looking like:

        http://192.168.1.96/TVCMS-CVJZ/cms/View.aspx/Show/'&$#funky**!!~''%20page

    which is still wrong. Does Microsoft know how to URL encode at all? Is there a function someone has written to do it the correct way?
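    If fixing the encoding side doesn't pan out, the 404.11 can also be relaxed server-side; a hedged web.config sketch (note this deliberately loosens request filtering, which has security implications):

        <configuration>
          <system.webServer>
            <security>
              <requestFiltering allowDoubleEscaping="true" />
            </security>
          </system.webServer>
        </configuration>

    The 400 from ASP.NET is a separate check; on .NET 4 the set of characters it rejects in the path can be adjusted via the requestPathInvalidCharacters attribute of <httpRuntime>.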

    Read the article

  • Using a Filter to serve a specific page?

    - by user246114
    Hi, I am using a class which implements Filter for my JSP stuff. It looks like this:

        public class MyFilter implements Filter {
            public void doFilter(ServletRequest request, ServletResponse response,
                                 FilterChain chain) throws IOException, ServletException {
                request.getRequestDispatcher("mypage.jsp").forward(request, response);
            }
        }

    So the target, "mypage.jsp", is just sitting in my top-level directory. The filter works fine if I enter URLs like:

        http://www.mysite.com/foo
        http://www.mysite.com/boo

    but if I enter a trailing slash, I get a 404:

        http://www.mysite.com/foo/
        http://www.mysite.com/boo/

        HTTP ERROR: 404
        /foo/mypage.jsp
        RequestURI=/foo/mypage.jsp

    It seems that if I enter the trailing slash, the filter looks for mypage.jsp in the subfolder foo or boo, but I really always want it to find it at:

        http://www.mysite.com/mypage.jsp

    How can I do that? Thank you
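    A hedged sketch of the usual fix: a path passed to getRequestDispatcher without a leading slash is resolved relative to the current request, so /foo/ makes the container look for /foo/mypage.jsp. A leading slash makes the path context-relative and independent of the request:

        public class MyFilter implements Filter {
            public void doFilter(ServletRequest request, ServletResponse response,
                                 FilterChain chain) throws IOException, ServletException {
                // context-relative: always /mypage.jsp, whatever the request path is
                request.getRequestDispatcher("/mypage.jsp").forward(request, response);
            }
        }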

    Read the article

  • accessing pdf via https URL

    - by Paul
    I send out a newsletter email containing URLs to an https website that then redirects to a PDF document. On the first invocation of a URL the user is prompted with the typical https browser "security alert" popup; on selecting "Yes", the display of the PDF fails. The HTTP headers on the failed response are:

        HTTP/1.1 200 OK
        Server: ECS/HTTP-Server
        Date: Tue, 16 Mar 2010 15:57:26 GMT
        Content-type: application/pdf
        Content-language: en-US
        Set-cookie: JSESSIONID=0000r111cRz1Vc-PtCJg8Cdu4eR:-1; Path=/
        Expires: Thu, 01 Dec 1994 16:00:00 GMT
        Cache-control: no-cache="set-cookie, set-cookie2"
        Connection: close

    Subsequent invocations of the URL successfully open the PDF (at this point we have the session id cookie set by the initial failed request). The HTTP headers on the successful response are:

        HTTP/1.1 200 OK
        Server: ECS/HTTP-Server
        Date: Tue, 16 Mar 2010 16:53:03 GMT
        Content-type: application/pdf
        Content-language: en-US
        Connection: close

    The email client is Lotus Notes 6.5, which launches an IE6 browser. Any ideas?
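    A hedged lead (a known IE6 limitation, not confirmed as the cause here): IE6 cannot open file downloads over SSL when the response forbids caching, because it refuses to write the temporary file the PDF viewer reads from. The failed response carries an Expires date in the past and a Cache-control: no-cache header that the successful one lacks, which fits that pattern. A sketch of headers that sidestep it:

        HTTP/1.1 200 OK
        Content-Type: application/pdf
        Cache-Control: private

    That is, set the session cookie without triggering the no-cache="set-cookie" directive, and drop the 1994 Expires header from the PDF response.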

    Read the article

< Previous Page | 215 216 217 218 219 220 221 222 223 224 225 226  | Next Page >