Search Results

Search found 8206 results on 329 pages for 'firefox addon'.

Page 293/329 | < Previous Page | 289 290 291 292 293 294 295 296 297 298 299 300  | Next Page >

  • PNG Transparency Problems in IE8

    - by user138777
    I'm having problems with a transparent PNG image showing black dithered pixel artifacts around the edge of the non-transparent part of the image. It only does this in Internet Explorer, and only when the image is loaded from a JavaScript file. Here's what I'm talking about... http://70.86.157.71/test/test3.htm (link now dead) ...notice the girl in the bottom right corner. She has artifacts around her in IE8 (I haven't tested previous versions of IE, but I'm assuming they probably do the same). It works perfectly in Firefox and Chrome. The image is loaded from a JavaScript file to produce the mouseover effect. If you load the image all by itself, it works fine. Here's the image... http://70.86.157.71/test/consultant2.png Does anyone know how to fix this? The image was produced in Photoshop CS3. I've read things about removing the gAMA chunk, but that apparently applied to previous versions of Photoshop, and when I load the image in TweakPNG, it doesn't have a gAMA chunk. Please help!

    Read the article

  • Scriptaculous problem in IE

    - by Django Reinhardt
    Hi there. We've got this very annoying problem with Scriptaculous and Internet Explorer 7/8. We have two Effect.toggles on the same page, but only one of them is ever working (the first one). I hope it's some simple mistake in my implementation, but I can't seem to find any decent documentation. Hopefully somebody here can help. The HTML/JS looks like this: <ul> <li id="LinkA" class="icon"> <a onclick="new Effect.toggle('divA', 'slide', { duration: 0.6 }); return false;" href="#">Show List A</a> </li> </ul> <div id="divA" style="display:none"> <div> -- Things to display -- </div> </div> <ul> <li id="LinkB" class="icon"> <a onclick="new Effect.toggle('divB', 'slide', { duration: 0.6 }); return false;" href="#">Show List B</a> </li> </ul> <div id="divB" style="display:none"> <div> -- Things to display -- </div> </div> It works perfectly in Chrome and Firefox, but the second one never works in IE 7 or 8, no matter what I do. Any help would be greatly appreciated!

    Read the article

  • Comet (long polling) and XmlHttpRequest status

    - by chris_l
    I'm playing around a little bit with raw XmlHttpRequestObjects + Comet Long Polling. (Usually, I'd let GWT or another framework handle all of this for me, but I want to learn more about it.) I wrote the following code: function longPoll() { var xhr = createXHR(); // Creates an XmlHttpRequestObject xhr.open('GET', 'LongPollServlet', true); xhr.onreadystatechange = function () { if (xhr.readyState == 4) { if (xhr.status == 200) { ... } if (xhr.status > 0) { longPoll(); } } } xhr.send(null); } ... <body onload="javascript:longPoll()">... I wrapped the longPoll() call in an if statement that checks for status > 0, because I noticed that when I leave the page (by browsing somewhere else, or by reloading it), one last unnecessary comet call is sent. [And on Firefox, it even causes severe problems when doing a page reload, for some reason I don't fully understand yet.] Question: Is that status check the correct way to handle this problem, or is there a better solution?
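
    A hedged sketch of an alternative guard, assuming the same createXHR() helper and handler names as the code above: set a flag when the page starts unloading and skip re-polling in that case, instead of (or in addition to) inspecting xhr.status.

        // Assumed helper names (createXHR, longPoll) mirror the question's code.
        var unloading = false;
        window.onbeforeunload = function () {
            // Mark that the page is going away so the last aborted
            // request does not trigger another poll.
            unloading = true;
        };

        function longPoll() {
            var xhr = createXHR();
            xhr.open('GET', 'LongPollServlet', true);
            xhr.onreadystatechange = function () {
                if (xhr.readyState === 4) {
                    if (xhr.status === 200) {
                        // handle the pushed data here
                    }
                    if (!unloading && xhr.status > 0) {
                        longPoll(); // re-arm the long poll only while the page is alive
                    }
                }
            };
            xhr.send(null);
        }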

    Read the article

  • How to retrieve captcha and save session with PHP cURL?

    - by user302974
    Hi all, I'm writing a script to submit content via PHP cURL. It first fetches the session and captcha, and the user must submit the captcha for the final submit. The problem is I can't get the captcha. I've tried with this code plus a preg_match to grab the image tag and return it: $ch = curl_init(); curl_setopt($ch, CURLOPT_URL,$url); curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.2) Gecko/20070219 Firefox/2.0.0.2'); curl_setopt($ch, CURLOPT_HEADER, 0); curl_setopt($ch, CURLOPT_COOKIE, 1); curl_setopt($ch, CURLOPT_COOKIEJAR, "1"); curl_setopt($ch, CURLOPT_COOKIEFILE, "1"); curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); curl_setopt($ch, CURLOPT_RETURNTRANSFER,1); $result = curl_exec($ch); curl_close($ch); But no luck. The page I'm trying to submit to is http://abadijayaiklan.co.cc/pasang-iklan/. I hope someone can help me out :) Thanks and regards

    Read the article

  • CSS Background image in Redmine template arbitrarily not loading

    - by Pekka
    I'm in the process of building a template for Redmine (a project management system based on Ruby on Rails.) Ruby is running on a virtual server from a Bitnami.org installation package. The OS is Windows. The template essentially consists of a styles.css file. In that file, I have the following line: #header { padding: 0px; padding-top: 48px; background-color: #62DFFF; background-image: url(../images/bkg.jpg) background-position: center bottom; background-repeat: repeat-x; height:150px; } It's a header element with a background image. The problem: This background image arbitrarily appears and disappears when reloading. Say you reload ten times in twenty seconds; the image will appear in two instances, and be missing in the others. I would have put this down to server problems, but the weird thing is that when it's missing, the request for the image doesn't appear in Firebug's net tab at all. Even if it were cached, the request should be there. Raw screenshots of the identical page on two reloads: I am 100% sure the CSS file does not change in between. I have examined both instances with Firebug and the CSS is identical. It happens in both Firefox and Chrome so it must be something basic I'm overlooking. What could be causing a browser not to load a resource at all? I have zero idea about Ruby nor Rails - getting Redmine running and customized is all I have ever had to do with this platform - so I don't really know where to look. Apache's, Mongrel's and Redmine's error logs look fine, though.

    Read the article

  • How can I use JSONP to download client-side javascript objects?

    - by Alex Mcp
    I'm trying to get client-side javascript objects saved as a file locally. I'm not sure if this is possible. The basic architecture is this: Ping an external API to get back a JSON object Work client-side with that object, and eventually have a "download me" link This link sends the data to my server, which processes it and sends it back with a mime type application/json, which (should) prompt the user to download the file locally. Right now here are my pieces: Server Side Code <?php $data = array('zero', 'one', 'two', 'testing the encoding'); $json = json_encode($data); //$json = json_encode($_GET['']); //eventually I'll encode their data, but I'm testing header("Content-type: application/json"); header('Content-Disposition: attachment; filename="backup.json"'); echo $_GET['callback'] . ' (' . $json . ');'; ?> Relevant Client Side Code $("#download").click(function(){ var json = JSON.stringify(collection); //serializes their object $.ajax({ type: "GET", url: "http://www.myURL.com/api.php?callback=?", //this is the above script dataType: "jsonp", contentType: 'jsonp', data: json, success: function(data){ console.log( "Data Received: " + data[3] ); } }); return false; }); Right now when I visit the api.php site with Firefox, it prompts a download of download.json and that results in this text file, as expected: (["zero","one","two","testing the encoding"]); And when I click #download to run the AJAX call, it logs in Firebug Data Received: testing the encoding which is almost what I'd expect. I'm receiving the JSON string and serializing it, which is great. I'm missing two things: The Actual Questions What do I need to do to get the same prompt-to-download behavior that I get when I visit the page in a browser (much simpler) How do I access, server-side, the json object being sent to the server to serialize it? I don't know what index it is in the GET array (silly, I know, but I've tried almost everything)
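
    One point the question's reasoning circles around is that responses fetched via XHR/JSONP never trigger the browser's save dialog; the Content-Disposition header only takes effect when the browser itself navigates to (or posts to) the URL. Below is a hedged sketch of that idea, not the asker's code: the "collection" variable and the api.php endpoint are taken from the question, while the hidden field name "data" is an assumption (server-side it would be read from $_POST['data'] rather than the callback GET parameter).

        // Minimal sketch: submit the JSON through a real form navigation so the
        // server's Content-Disposition: attachment header is honored by the browser.
        $("#download").click(function () {
            var json = JSON.stringify(collection);   // serialize the client-side object
            var $form = $('<form>', {
                method: 'POST',
                action: 'http://www.myURL.com/api.php'   // endpoint name taken from the question
            }).append($('<input>', {
                type: 'hidden',
                name: 'data',          // assumed field name; read server-side from $_POST['data']
                value: json
            }));
            $form.appendTo('body').submit();  // the attachment response becomes a download prompt
            return false;
        });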

    Read the article

  • What do I have to change in my PHP/CURL code to retrieve data from a https:// URL?

    - by Edward Tanguay
    I have a PHP file using CURL that accepts a Google Doc URL as a parameter, then returns the plain text of the Google Doc. It worked well until recently when apparently a redirect was added so that the http:// address redirects to the equivalent https:// address, as in this example: http://docs.google.com/View?id=dc7gj86r_20dn2csqg3 So I changed my code to access the https:// address, but it just returns blank. What do I have to change in my cURL code so that I can get the HTML text from the https:// address? $url = filter_input(INPUT_GET, 'url',FILTER_SANITIZE_STRING); $validUrlPrefixes[] = "https://docs.google.com"; if(beginsWithOneOfThese($url, $validUrlPrefixes)) { $user_agent = 'Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729)'; $ch = curl_init(); curl_setopt($ch, CURLOPT_COOKIEJAR, "/tmp/cookie"); curl_setopt($ch, CURLOPT_COOKIEFILE, "/tmp/cookie"); curl_setopt($ch, CURLOPT_URL, $url ); curl_setopt($ch, CURLOPT_FAILONERROR, 1); curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0); curl_setopt($ch, CURLOPT_RETURNTRANSFER,1); curl_setopt($ch, CURLOPT_TIMEOUT, 15); curl_setopt($ch, CURLOPT_USERAGENT, $user_agent); curl_setopt($ch, CURLOPT_VERBOSE, 0); $rawData = curl_exec($ch); $rawData = cleanText($rawData); if(beginsWith($url, "https://docs.google.com")) { echo qstr::convertGoogleDocContentToText($rawData); die; } echo $rawData; die;

    Read the article

  • Using same onmouseover function for multiple objects

    - by phpscriptcoder
    I'm creating a building game in JavaScript and PHP that involves a grid. Each square in the grid is a div, with its own onmouseover and onmousedown function: for(x=0; x < width; x++) { for(y=0; y < height; y++) { var div = document.createElement("div"); //... div.onmouseclick = function() {blockClick(x, y)} div.onmouseover = function() {blockMouseover(x, y)} game.appendChild(div); } } But all of the squares seem to have the x and y of the last square that was added. I can sort of see why this is happening - it is keeping a reference to x and y instead of cloning the variables - but how could I fix it? I even tried for(x=0; x < width; x++) { for(y=0; y < height; y++) { var div = document.createElement("div"); var myX = x; var myY = y; div.onmouseclick = function() {blockClick(myX, myY)} div.onmouseover = function() {blockMouseover(myX, myY)} game.appendChild(div); } } with the same result. I was using div.setAttribute("onmouseover", ...) which worked in Firefox, but not IE. Thanks!
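
    The reasoning in the question is right: every handler closes over the same x and y, and the myX/myY attempt still shares one pair of variables because var is function-scoped, not block-scoped. A minimal sketch of the classic fix, assuming the same width/height/game/blockClick/blockMouseover names from the question (onclick is used here, since onmouseclick is not a standard handler property):

        for (var x = 0; x < width; x++) {
            for (var y = 0; y < height; y++) {
                (function (myX, myY) {          // a new scope per square, so each handler gets its own copy
                    var div = document.createElement("div");
                    div.onclick = function () { blockClick(myX, myY); };
                    div.onmouseover = function () { blockMouseover(myX, myY); };
                    game.appendChild(div);
                })(x, y);                        // pass in the current x and y by value
            }
        }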

    Read the article

  • Interrupting Prototype handler, alert() vs event.stop()

    - by lxs
    Here's the test page I'm using. This version works fine, forwarding to #success: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html><head> <script type="text/javascript" src="prototype.js"></script> </head><body> <form id='form' method='POST' action='#fail'> <button id='button'>Oh my giddy aunt!</button> <script type="text/javascript"> var fn = function() { $('form').action = "#success"; $('form').submit(); } $('button').observe('mousedown', fn); </script> </form> </body></html> If I empty the handler: var fn = function() { } The form is submitted, but of course we are sent to #fail this time. With an alert in the handler: var fn = function() { alert("omg!"); } The form is not submitted. This is awfully curious. With event.stop(), which is supposed to prevent the browser taking the default action: var fn = function(event) { event.stop(); } We are sent to #fail. So alert() is more effective at preventing a submission than event.stop(). What gives? I'm using Firefox 3.6.3 and Prototype 1.6.0.3. This behaviour also appears in Prototype 1.6.1.
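
    One hedged reading of the difference (an assumption, not a confirmed diagnosis): the handler observes mousedown, but the form's default submission to #fail is the default action of the later click/submit, which event.stop() on mousedown does not cancel, whereas a blocking alert() happens to swallow that follow-up click. A minimal sketch of observing the click itself and stopping it there, using the same Prototype idioms as the question:

        // Sketch: stop the event whose default action actually submits the form.
        var fn = function (event) {
            event.stop();                     // cancel the button's default submit to #fail
            $('form').action = "#success";
            $('form').submit();
        };
        $('button').observe('click', fn);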

    Read the article

  • Opa app does not load in Internet Explorer when compiled with Opa 1.1.1

    - by Marcin Skórzewski
    I did a minor update to an already working application and then had problems using the new version of the Opa compiler. First problem - runtime exception. Since the original deployment, Opa 1.1.1 has been released, and it resulted in this error: events.js:72 throw er; // Unhandled 'error' event ^ Error: listen EADDRINUSE at errnoException (net.js:901:11) at Server._listen2 (net.js:1039:14) at listen (net.js:1061:10) at Server.listen (net.js:1127:5) at global.BslNet_Http_server_init_server (/opt/mlstate/lib/opa/stdlib/server.opp/serverNodeJsPackage.js:223:1405) at global.BslNet_Http_server_init_server_cps (/opt/mlstate/lib/opa/stdlib/server.opp/serverNodeJsPackage.js:226:15) at __v1_bslnet_http_server_init_server_cps_b970f080 (/opt/mlstate/lib/opa/stdlib/stdlib.qmljs/stdlib.core.web.server.opx/main.js:1:175) at /opt/mlstate/lib/opa/stdlib/stdlib.qmljs/stdlib.core.web.server.opx/main.js:440:106 at global.execute_ (/opt/mlstate/lib/opa/static/opa-js-runtime-cps/main.js:19:49) at /opt/mlstate/lib/opa/static/opa-js-runtime-cps/main.js:17:78 I decided to build Opa from sources and that helped, but another problem occurred :( Second problem - Internet Explorer no longer supported. The application stopped working in Internet Explorer. I tried two different machines (Windows XP and 7) with IE 8 and 10. The web page does not load at all (it looks like a network problem, but the same URL works fine in Firefox). I confirmed the same problem with "Hello world" from the Opa tutorial compiled with both the stable Opa 1.1.1 and the build from sources. I suspected that the problem is due to the Node.js update (Opa 1.1.1 requires Node 0.10.* - now I am using 0.10.12, but I also tried other 0.10 releases), but the "Hello world" from Node's front page works fine. I am running the app on an OSX developer box and a Linux Debian 7.0 server. Any suggestions as to what I am doing wrong? PS. I was away for a while. Does anyone know what happened to the Opa forum? Signing in seems not to work.

    Read the article

  • embed dll in html <object>

    - by Raynos
    I've come across some old code: <object id="foo" classid="/location/bar.dll#ProjectName.ClassName" viewastext></object> It doesn't currently work, though it used to work in older versions of IE. I've never come across embedding a DLL in a web page like this. It appears to be a Windows .NET application written in C#. This is used on our intranet, and ClassName is of type System.Windows.Forms.UserControl. It also seems I can call the C# methods of the UserControl directly through JavaScript. Does anyone have any documentation on how this works and whether it's possible to hack it into Firefox? Rewriting the Windows control as a web application would be a nightmare. [Edit] It appears to be some kind of ActiveX/COM thing where, in IE, you could just port a Windows application directly into an HTML file. It's supposed to be able to run locally if various settings are configured correctly. If anyone has an idea of what needs to be set up for this to work, that would be nice.

    Read the article

  • ASP.Net IE6 disable button

    - by RemotecUk
    Hi, I have the following code running as part of the OnClientClick attribute on my custom ASP.NET button.... function clickOnce(btnSubmit) { if ( typeof( Page_ClientValidate ) == 'function' ) { if ( ! Page_ClientValidate() ) { return false; } } btnSubmit.disabled = true; } There is a validator on the page. If a given text box is empty then the validator activates, no problem. If the text box is populated then the button disables but a postback does not occur. The rendered markup looks like this... <input type="submit" name="TestButton" value="Test Button" onclick="clickOnce(this);WebForm_DoPostBackWithOptions(new WebForm_PostBackOptions(&quot;TestButton&quot;, &quot;&quot;, true, &quot;&quot;, &quot;&quot;, false, false))" id="TestButton" class="euva-button-decorated" /> This works nicely in Firefox but not in IE6. It's almost like after the button has been disabled it simply does not run the postback JavaScript. Any ideas welcomed. EDIT: I have tried returning true from the function as well.
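
    A minimal sketch of one commonly cited workaround (an assumption about the cause, not a confirmed fix): defer the disable with setTimeout so the rest of the generated onclick attribute - the WebForm_DoPostBackWithOptions call - still runs against an enabled button in IE6.

        function clickOnce(btnSubmit) {
            if (typeof(Page_ClientValidate) == 'function') {
                if (!Page_ClientValidate()) {
                    return false;             // validation failed, leave the button enabled
                }
            }
            window.setTimeout(function () {
                btnSubmit.disabled = true;    // disable only after the submit has been kicked off
            }, 0);
            return true;
        }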

    Read the article

  • jQuery.addClass not adding a class

    - by John Nolan
    Why is my style not being applied in the jQuery below? It also only adds the table in Firefox. $.each(data.AdvisorPerformances, function(i) { $("#advisorPerfomance").append("<tr>" + "<td>" + data.AdvisorPerformances[i].Advisor + "</td>" + "<td>" + data.AdvisorPerformances[i].PackInCount + "</td>" + "<td>" + data.AdvisorPerformances[i].PacksInValue + "</td>" + "<td>" + data.AdvisorPerformances[i].PacksOutCount + "</td>" + "<td> " + data.AdvisorPerformances[i].PaymentsInCount + "</td>" + "<td>" + data.AdvisorPerformances[i].PaymentsInValue + "</td>" + "</tr>"); }); $("#advisorPerfomance").append("</table>"); $("#advisorPerfomance").addClass("NOTAPPLIEDSTYLE"); Also, is there a better way to add a table?
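
    jQuery's append() works on parsed DOM fragments rather than a raw markup stream, so appending a stray closing "</table>" tag has no effect. A hedged sketch of one way to build the rows (same data.AdvisorPerformances shape and element id as the question, including its spelling), appending each row as a complete fragment and then applying the class:

        var $table = $("#advisorPerfomance");
        $.each(data.AdvisorPerformances, function (i, perf) {
            var $row = $("<tr>");
            $.each([perf.Advisor, perf.PackInCount, perf.PacksInValue,
                    perf.PacksOutCount, perf.PaymentsInCount, perf.PaymentsInValue],
                   function (j, value) {
                       $row.append($("<td>").text(value));   // .text() also escapes the values
                   });
            $table.append($row);                              // each <tr> is a complete fragment
        });
        $table.addClass("NOTAPPLIEDSTYLE");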

    Read the article

  • Ajax UpdatePanels SetFocus issue

    - by George
    I set the AutoPostback property of a textbox to True so I can process the TextChanged event on the server and, based on what was typed in the textbox, appropriately display a message in an update panel. The problem is, when the partial screen refresh is performed, no control on the screen has focus. 99% of the time, when the text in the textbox is changed, it is because the user has tabbed forward, and so, to limit the disruption from the loss of focus, I perform a "Focus" call on the next control in the tab sequence. For the most part, this works OK, but of course it is disruptive if the user is tabbing in the reverse order or has used the mouse to set the focus to another control. In these situations, the focus would be set to the next control even though the user was trying to set focus elsewhere. OK, that sucks. Now for what I consider the bigger problem with calling the focus method on the server: in IE, it works OK, but in Mozilla Firefox and Chrome, setting the focus causes a repositioning of the scroll bar, even though none is necessary because the control is already in view. I realize that I could switch to doing AJAX web service calls, but these darn UpdatePanels are so convenient if used in moderation. Is there any way to use UpdatePanels and not have these focus/scroll issues?
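
    A hedged client-side sketch of one alternative, assuming the standard ASP.NET AJAX client library that UpdatePanels emit (Sys.WebForms.PageRequestManager): remember which element had focus before the async postback and restore it after the partial refresh, instead of calling Focus() on the server and guessing the tab direction.

        var prm = Sys.WebForms.PageRequestManager.getInstance();
        var lastFocusedId = null;

        prm.add_beginRequest(function () {
            // Record whatever the user actually had focused before the async postback.
            lastFocusedId = document.activeElement ? document.activeElement.id : null;
        });

        prm.add_endRequest(function () {
            if (lastFocusedId) {
                var el = document.getElementById(lastFocusedId);
                if (el) {
                    el.focus();   // put focus back without a server-side Focus() call
                }
            }
        });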

    Read the article

  • What is wrong with the JavaScript event handling in this example? (Using click() and hover() jQuery

    - by Bungle
    I'm working on a sort of proof-of-concept for a project that approximates Firebug's inspector tool. For more details, please see this related question. Here is the example page. I've only tested it in Firefox: http://troy.onespot.com/static/highlight.html The idea is that, when you're mousing over any element that can contain text, it should "highlight" with a light gray background to indicate the boundaries of that element. When you then click on the element, it should alert() a CSS selector that matches it. This is somewhat working in the example linked above. However, there's one fundamental problem. When mousing over from the top of the page to the bottom, it will pick up the paragraphs, <h1> element, etc. But, it doesn't get the <div>s that encompass those paragraphs. However, for example, if you "sneak up" on the <div> that contains the two paragraphs "The area was settled..." and "Austin was selected..." from the left - tracing down the left edge of the page and entering the <div> just between the two paragraphs (see this screenshot) - then it is picked up. I assume this has something to do with the fact that I haven't attached an event handler to the <body> element (where you're entering the <div> from if you enter from the left), but I have attached handlers to the <p>s (where you're entering from if you come from the top or bottom). There are also other issues with mousing in and out elements - background colors that "stick" and the like - that I think are also related. As indicated in the related question posted above, I suspect there is something about event bubbling that I don't understand that is causing unexpected behavior. Can anyone spot what's wrong with my code? Thanks in advance for any help!
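
    One hedged sketch of an alternative structure: rather than binding handlers to every element that can contain text, delegate a single mouseover/mouseout/click trio at the document level and work from event.target, so container <div>s (and anything entered from the <body>) get highlighted without per-element bindings. This assumes jQuery as in the example page; buildSelectorFor() is a placeholder for the question's own selector-building code.

        $(document).ready(function () {
            $(document).bind('mouseover', function (e) {
                $(e.target).css('background-color', '#eee');   // highlight whatever is under the pointer
            });
            $(document).bind('mouseout', function (e) {
                $(e.target).css('background-color', '');       // undo the highlight on leave
            });
            $(document).bind('click', function (e) {
                e.preventDefault();
                alert(buildSelectorFor(e.target));             // placeholder for the selector logic
            });
        });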

    Read the article

  • Characters in usernames that cause trouble

    - by acidzombie24
    I am testing out security and reliability issues on my site. I have made \n and \r illegal. I created a user with a null character in the name, which caused my PM system to not message the user. \b worked, however, and \t didn't allow copy/paste to work correctly: the browser (Firefox, which I am testing with) copied the tab as a single space, so the pasted name didn't match and the username wasn't recognized. Since I can't copy/paste it easily, I'll probably disallow it. \f works as well, although I do see a symbol in the title (but nowhere else) because of the \f. What else should I try? It appears 0-31 and 127-159 (I don't understand this range) are illegal. What characters in the legal range might I want to disallow? I heard there is a zero-width space character; that may be something I want to disallow. What else is there?
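
    For reference, 0-31 and 127-159 are the C0 control characters plus DEL and the C1 controls. A minimal validation sketch (an illustration, not a complete policy - and the same test belongs server-side too) that rejects those ranges along with the zero-width space (U+200B) and a few other invisible characters:

        function isSuspiciousUsername(name) {
            var controlChars = /[\u0000-\u001F\u007F-\u009F]/;      // C0 controls, DEL, C1 controls
            var invisibleChars = /[\u200B-\u200D\u2060\uFEFF]/;     // zero-width space/joiners, word joiner, BOM
            return controlChars.test(name) || invisibleChars.test(name);
        }

        // Example: isSuspiciousUsername("bob\u200Balice") returns true.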

    Read the article

  • Flash won't load, embed error?

    - by Adrian M.
    Hello, I want to know why the flash movie in the header located here: http://www.dolphintemplate.com/demo/dolphin7/index.php?skin=dt_firestarter_red only loads in Firefox but NOT in IE and Chrome.. The flash movie resides in a iframe, this is the code of the iframe: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" > <head> <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" /> <title>header</title> </head> <body bgcolor="#000000" topmargin="0" leftmargin="0" marginwidth="0" marginheight="0"> <object data="header.swf" type="application/x-shockwave-flash" id="myflash" width="988" height="240"> <param name="movie" value="header.swf" /> <param name="bgcolor" value="#000000" /> <param name="height" value="988" /> <param name="width" value="240" /> <param name="quality" value="high" /> <param name="menu" value="false" /> <param name="allowscriptaccess" value="samedomain" /> <p>Adobe <a href="http://get.adobe.com/flashplayer/">Flash Player</a> is required to view this content.</p> </object> </body> </html> Thanks.
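
    Hand-written <object> markup tends to behave differently between IE and other browsers (and note the pasted markup's height/width <param> values appear swapped relative to the object attributes). A hedged sketch of the common alternative - letting a JavaScript embed library such as SWFObject generate browser-appropriate markup - assuming swfobject.js is included in the iframe document and a placeholder element exists for it to replace:

        var flashvars = {};
        var params = { bgcolor: "#000000", menu: "false", quality: "high",
                       allowscriptaccess: "samedomain" };
        var attributes = { id: "myflash", name: "myflash" };
        // Targets a hypothetical placeholder, e.g. <div id="flashHeader"></div> in the iframe body.
        swfobject.embedSWF("header.swf", "flashHeader", "988", "240", "9.0.0",
                           false, flashvars, params, attributes);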

    Read the article

  • How to prevent chrome from injecting content to webpage

    - by Nazariy
    Recently I discovered that my application is misbehaving in Google Chrome. On a page with a form, after it is submitted, my application reloads the page using a simple method like this: header('Location: ' . $url); After that, the page is rendered incorrectly and this content is injected into the DOM: <div id="sbi_camera_button" class="sbi_search" style="left: 0px; top: 0px; position: absolute; width: 29px; height: 27px; border: none; margin: 0px; padding: 0px; z-index: 2147483647; display: none; "></div> After a manual page refresh everything works as expected. I'm not sure what is causing this behavior, as I'm working in a closed local environment and the application works fine in Firefox. My application uses the following libraries (hosted locally): jQuery v1.7.1, jQuery UI 1.8.16, Bootstrap.js v2.1.1. Can someone suggest what could possibly be causing this issue?

    Read the article

  • Submit form using javascript, work in FF but not in IE

    - by Permana
    I have this code. The code below works in Firefox, but it does not work in IE: <body> <?php $data = getLoginData($_SESSION['whoyouare']); ?> <form name="frm_redirect_dfr" action="<?php echo $data['url']; ?>" method="POST" id="frm_redirect_dfr" style="display: none;"> <input name="DFRNet_User" value="<?php echo $data['username']; ?>" type="hidden" /> <input name="DFRNet_Pass" value="<?php echo $data['password']; ?>" type="hidden" /> <input name="tbllogin" value="login" type="hidden" /> <input type="submit" value="submit" /> </form> <script language="javascript" type="text/javascript"> document.forms["frm_redirect_dfr"].submit(); </script> </body> What I want to do is this: when the user accesses the page, it first fetches the login data, echoes it into the form, and then submits the form automatically using JavaScript.
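
    A minimal sketch of one thing worth trying (an assumption, not a confirmed diagnosis): in some IE setups, calling submit() from an inline script before the document has finished loading is unreliable, so defer the auto-submit to window.onload instead of calling it at the bottom of <body>.

        window.onload = function () {
            var form = document.forms["frm_redirect_dfr"];
            if (form) {
                form.submit();   // auto-submit once the page has fully loaded
            }
        };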

    Read the article

  • Google Chrome and (cache or memory leaks).

    - by Alexey Ogarkov
    Hello all, I have a big problem with Google Chrome and its memory. My app displays several image charts to the user and reloads them every 10s. In the interval I have code like this: var image = new Image(); var src = 'myurl/image'+new Date().getTime(); image.onload = function() { document.getElementById('myimage').src = src; image.onload = image.onabort = image.onerror = null; } image.src = src; I have no memory leaks in Firefox and IE. Here are the response headers for the images: Server: Apache-Coyote/1.1, Vary: *, Cache-Control: no-store (I have also tried no-cache, must-revalidate and so on here), Content-Type: image/png, Content-Length: 11131, Date: Mon, 31 May 2010 14:00:28 GMT, Vary: *. The about:cache page shows none of my cached images. If I enable the purge-memory button for Chrome (the --purge-memory-button parameter), it doesn't help. The images are PNG-24. So I think the problem is not the cache; maybe Google Chrome is not releasing memory for the old images. Please help. Any suggestions? Thanks.

    Read the article

  • can javascript process binary data?

    - by Johnny
    Let me describe my question in a situation-oriented way. Assume IE is still the dominant web browser (Firefox has documentation for binary processing): XMLHttpRequest.responseText and XMLHttpRequest.responseXML in IE expect text or xml/xhtml/html, but what if the server responds to the XMLHttpRequest with MIME type application/octet-stream? Would every character of the response string be less than 256 (every char code < 256)? Thanks very much for a straight answer - I have no web server environment at hand, so I don't know how to test it out. The reason I ask: using text or XML raises character-set encoding issues, and I don't know how to process a CDATA node of an encoded XML document (e.g. utf-8, ascii, gb18030) with JavaScript. When I get the node text, does the document object return me bytes or decoded characters? If it is decoded according to the charset indicated in the HTTP response header, it could all be wrong. To avoid the charset mess, I would like the server to respond with octet data and force string data to be encoded as utf-8 (or another charset) in binary form. If the response is octets, I guess the browser would not try to decode the response "text". Does this sound weird, or am I misunderstanding something fundamental? EDIT: I believe the question is asking this: Can JavaScript safely process strings that aren't encoded in Unicode? What are the problems with trying to do so? EDIT: No, no, no - what I mean is: if the http-header Content-Type is "application/octet-stream", will IE try to decode it as 16-bit Unicode (or IE's local charset setting) when I read XMLHttpRequestobj.responseText from JavaScript? Or does IE just wrap every single byte of the response body into a JavaScript string, so that every char in that string is less than or equal to 256 (char <= 256)? Am I talking Martian? Sadly, if I were a Martian I would come as a tourist without fuzzy questions; as it is, I am in a country that shares at least one property with Mars: red.
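
    For context, the classic non-IE workaround the question alludes to is to override the response MIME type with a user-defined charset so the bytes arrive undecoded, then mask each char code to 8 bits. A hedged sketch of that Gecko-style approach (old IE lacks overrideMimeType, so this does not answer the IE case directly; the URL is hypothetical):

        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/some/binary/resource', true);   // hypothetical URL
        xhr.overrideMimeType('text/plain; charset=x-user-defined');
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                var text = xhr.responseText;
                var bytes = [];
                for (var i = 0; i < text.length; i++) {
                    bytes.push(text.charCodeAt(i) & 0xFF);   // each value is now 0-255
                }
                // bytes[] holds the raw octets of the response body
            }
        };
        xhr.send(null);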

    Read the article

  • MVC JsonResult not working with chrome?

    - by Karsten Detmold
    I want jQuery to take a JsonResult from my MVC controller, but it doesn't receive any data! If I put the output into a text file and enter its link, it works, so I think my jQuery is fine. Then I tested with other browsers like Chrome and I saw NOTHING. The requested page was just empty - no errors. IE also seems to have problems receiving my string; only Firefox displays the string, but why? public JsonResult jsonLastRequests() { List<Request> requests = new List<Request>(); while (r.Read()) { requests.Add(new Models.Request() { ID = (int)r[0], SiteID = r[1].ToString(), Lat = r[2].ToString(), City = r[4].ToString(), CreationTime = (DateTime)r[5] }); } r.Close(); return Json(requests); } I found out that it also doesn't work if I return the JSON as a string! It's working with a string in all browsers now, but jQuery is still not loading anything: var url = "http://../jsonLastRequests"; var source = { datatype: "json", datafields: [ { name: 'ID' }, { name: 'SiteID' }, { name: 'Lat' }, { name: 'CreationTime' }, { name: 'City' }, ], id: 'id', url: url }; var dataAdapter = new $.jqx.dataAdapter(source, { downloadComplete: function (data, status, xhr) { }, loadComplete: function (data) { }, loadError: function (xhr, status, error) { } }); I fixed my problem by adding: access-control-allow-origin:*

    Read the article

  • Can't write to dynamic iframe using jQuery

    - by Fremont Troll
    My goal is to dynamically create an iframe and write ad JavaScript into it using jQuery (e.g. Google AdSense script). My code works on Chrome, but fails intermittently in Firefox i.e. sometimes the ad script runs and renders the ad, and other times it doesn't. When it doesn't work, the script code itself shows up in the iframe. My guess is these intermittent failures occur because the iframe is not ready by the time I write to it. I have tried various iterations of *iframe_html* (my name for the function which is supposed to wait for the iframe to be ready), but no luck. Any help appreciated! PS: I have read various threads (e.g. http://stackoverflow.com/questions/205087/jquery-ready-in-a-dynamically-inserted-iframe). Just letting everyone know that I've done my research on this, but I'm stuck :) Iteration 1: function iframe_html(html){ $('<iframe name ="myiframe" id="myiframe"/>').appendTo('#maindiv'); $('#myiframe').load( function(){ $('#myiframe').ready( function(){ var d = $("#myiframe")[0].contentWindow.document; d.open(); d.close(); d.write(html); }); } ); }; Iteration 2: function iframe_html(html){ $('<iframe id="myiframe"/>').appendTo('#maindiv').ready( function(){ $("#myiframe").contents().get(0).write(html); } ); };
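
    One detail worth noting in Iteration 1: the document is closed before write() is called (d.open(); d.close(); d.write(html);), which leaves the write racing a freshly reset document. For a same-origin iframe that was just appended, the document can usually be written immediately with open/write/close in that order, with no load handler needed. A minimal sketch using the same #maindiv and html names as the question:

        function iframe_html(html) {
            var $iframe = $('<iframe name="myiframe" id="myiframe"></iframe>').appendTo('#maindiv');
            var doc = $iframe[0].contentWindow.document;
            doc.open();
            doc.write(html);    // write the ad markup/script into the fresh document
            doc.close();        // close last, so the written scripts can run
        }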

    Read the article

  • Every flash uploader giving bad progress values.

    - by Mike Boers
    The file upload script I wrote early last year for an internal website has been misbehaving oddly on a number of machines. On some machines it consistently works fine, on others it consistently misbehaves. I am having exactly the same problem with YUI Uploader, SWFUpload (2.2 and 2.5a), and Uploadify. On the misbehaving machines, the progress event (or callback as the case may be) is reporting the upload going far too quickly. It is progressing around 9 or 10MB/s, instead of the 50 or 60kb/s that is actually going on. The progress bar fills up very quickly, and then no more progress events are triggered. A few minutes later the completion event will trigger when the upload is actually done. I must emphasize that the file upload does proceed normally, even though the progress being reported is very wrong. The progress events are reporting a correct file size, but the reported amount uploaded is usually way too high, and it appears that it is always a multiple of 2^16 (65536). I'm only having this problem with Firefox 3.5 on Windows XP, all of which have various subversions of Flash 10. Has anyone heard of this happening, or have any idea what is going on? (I'm off to go file a number of bug reports, but hopefully someone here has some previous experience with this.)

    Read the article

  • Overlapping 2 Flash objects and controlling z-index

    - by Magnus Smith
    I have two Flash objects on a webpage (call them A and B), and they overlap so one partially obscures the other. I don't seem to have any control over the z-index, to force B in front of A. Whatever I try, A always 'wins' and stays on the top! I have read many people's posts about problem with getting HTML to show over the top of Flash...but nothing about when your two overlapping items are both Flash objects. I have tried various combinations of wmode=opaque/transparent/window I have tried CSS position:absolute/relative and z-index:0/999 I have tried placing the HTML sections in a different order The problem is the same in IE6 and Firefox 2.0 I do not want to use jQuery in this case In my particular situation B must have position:absolute and wmode=transparent, and sit above A. A needs relative positioning and transparency is not required. However, I have been testing without these restrictions, and I still have no control over the overlap. Are some SWFs (ours are adverts sent by clients) created in such a way as to override any code control of z-index? Thanks for any advice you can give.

    Read the article

< Previous Page | 289 290 291 292 293 294 295 296 297 298 299 300  | Next Page >