Search Results

Search found 17188 results on 688 pages for 'browser plugins'.


  • Positioning / Scrolling problem with Flex popup.

    - by user284163
    Hi all, I'm trying to work out a specific problem with positioning in Flex using the PopUpManager. I want to create a popup that scrolls with its parent container. This is necessary because the parent container is large, and if the user's browser window isn't big enough (which will be the case most of the time) they have to use the container's scrollbar to scroll down. The problem is that the popup is positioned relative to another component, and it needs to stay next to that component.

        <?xml version="1.0" encoding="utf-8"?>
        <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute">
            <mx:Script>
                <![CDATA[
                    import mx.core.UITextField;
                    import mx.containers.TitleWindow;
                    import mx.managers.PopUpManager;

                    private function clickeroo(event:MouseEvent):void {
                        var popup:TitleWindow = new TitleWindow();
                        popup.width = 250;
                        popup.height = 300;
                        popup.title = "Example";

                        var tf:UITextField = new UITextField();
                        tf.wordWrap = true;
                        tf.width = popup.width - 30;
                        tf.text = "This window stays put and doesn't scroll when the hbox is scrolled (even with using the hbox as parent in the addPopUp method), I need the popup to be local to the HBox.";
                        popup.addChild(tf);

                        PopUpManager.addPopUp(popup, hbox, false);
                        PopUpManager.centerPopUp(popup);
                    }
                ]]>
            </mx:Script>
            <mx:HBox width="100%" height="2000" id="hbox">
                <mx:Button label="Click Me" click="clickeroo(event)"/>
            </mx:HBox>
        </mx:Application>

    Could anyone give me any pointers in the right direction? Thanks.

    Read the article

  • ajax delay load UserControl asp.net

    - by user196202
    Regarding the AJAX delayed loading of UserControls (or any controls) covered in this post at Encosia.com: http://encosia.com/2008/02/05/boost-aspnet-performance-with-deferred-content-loading/ I tried to implement it, but I noticed it only works for simple controls, or UserControls that contain plain ASP.NET controls (or HTML tags). When an advanced, dynamic AJAX control is involved (such as the AjaxControlToolkit or Telerik controls) that carries JavaScript inside it, this method of injecting the HTML into the .InnerHtml property of a div (for example) does NOT work. I read that the browser needs to load scripts at page load, and afterwards it won't interpret scripts injected via .InnerHtml.

    I have an example of a delayed-load project (from encosia.com by Dave Ward) with my modifications (look at DefaultPopup.aspx, beforePopup.aspx and AfterPopup.aspx), in which I changed the RssReader to show a ListView with popup items (implemented via the ACT HoverMenuExtender). Done the regular way, the popup items are shown correctly; but with the delayed load - which is done by creating a virtual page to render the HTML and injecting it into the .InnerHtml property - it is NOT working.

    So my questions are: is there a way to delay-load controls that include scripts, like ACT, Telerik and others? And for the AJAX templates - if I need to inject an advanced control into the page, how do I do it with your approach? Thanks very much. (I can't attach files here, so please ask me by mail ([email protected]) and I'll send them.) Zahi Kramer
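    As a rough sketch of one common workaround for the ".InnerHtml scripts never run" behaviour described above (this is not from the original post and does not cover the ACT/Telerik-specific initialization being asked about): after injecting the markup, re-create any script elements so the browser actually executes them. It assumes a modern DOM and that the deferred markup arrives as a plain HTML string.

        // Sketch: inject deferred HTML, then re-create its <script> tags so they run.
        function injectWithScripts(target: HTMLElement, html: string): void {
          target.innerHTML = html; // scripts inserted this way do NOT execute
          target.querySelectorAll("script").forEach(old => {
            const fresh = document.createElement("script"); // a freshly created element does execute
            if (old.src) {
              fresh.src = old.src;                 // external script: copy its URL
            } else {
              fresh.text = old.textContent ?? "";  // inline script: copy its body
            }
            old.parentNode?.replaceChild(fresh, old);
          });
        }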

    Read the article

  • Which combining css technique?

    - by DotnetShadow
    Hi there. Which of the following would you say is the best way to combine CSS files?

    Say I have a master.css file that is used across all pages on my website (page1.aspx, page2.aspx).
    Page1.aspx - a specific page with some unique CSS that is only ever used on that page, so I create page1.css; it also uses another stylesheet, grids.css.
    Page2.aspx - another specific page, different from all other pages on the site and from page1.aspx; I create page2.css for it, and it doesn't use grids.css.

    So would you combine the stylesheets as:

    Option 1: combine per page
    csshandler.axd?d=master.css,page1.css,grids.css when visiting page1
    csshandler.axd?d=master.css,page2.css when visiting page2
    Benefits: page specific, so rendering is quicker since only selectors for that page need to be matched up; no unused selectors.
    Drawback: multiple combinations of master.css + page-specific CSS, so master.css has to be downloaded again for each page.

    Option 2: combine all stylesheets whether a page needs them or not
    csshandler.axd?d=master.css,page1.css,page2.css,grids.css (master, page1 and page2), so it gets cached as one file. The problem is that rendering may be slower, since the browser has to try to match EVERY selector in the CSS against the page, even the unused ones; in the case of page2.aspx, which doesn't use grids.css, the selectors in grids.css still have to be parsed to see if they apply to page2, so rendering will be slow.
    Benefits: only one file is ever downloaded and cached, no matter which page you visit.
    Drawback: unused selectors have to be parsed by the browser, so rendering is slower.

    Option 3: leave the master file on its own and only combine the other stylesheets (the benefit being that, because master.css is used across all pages, there is a good chance it is already cached and doesn't need to keep being downloaded)
    csshandler.axd?d=master.css
    csshandler.axd?d=page1.css,grids.css
    Benefits: master.css can be cached no matter which page you visit; not many unused selectors, since the page-specific CSS is applied.
    Drawback: initially a minimum of 2 HTTP requests have to be made.

    What do you guys think? Cheers, DotnetShadow

    Read the article

  • How to delete a large cookie that causes Apache to 400

    - by jakemcgraw
    I've come across an issue where a web application has managed to create a cookie on the client which, when submitted by the client to Apache, causes Apache to return the following:

        HTTP/1.1 400 Bad Request
        Date: Mon, 08 Mar 2010 21:21:21 GMT
        Server: Apache/2.2.3 (Red Hat)
        Content-Length: 7274
        Connection: close
        Content-Type: text/html; charset=iso-8859-1

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>400 Bad Request</title>
        </head><body>
        <h1>Bad Request</h1>
        <p>Your browser sent a request that this server could not understand.<br />
        Size of a request header field exceeds server limit.<br />
        <pre>
        Cookie: ::: A REALLY LONG COOKIE :::
        </pre>
        </p>
        <hr>
        <address>Apache/2.2.3 (Red Hat) Server at www.foobar.com Port 80</address>
        </body></html>

    After looking into the issue, it would appear that the web application has managed to create a really long cookie, over 7000 characters. Now, don't ask me how the web application was able to do this; I was under the impression browsers were supposed to prevent this from happening. I've managed to come up with a solution to prevent the cookies from growing out of control again. The issue I'm trying to tackle is: how do I reset the large cookie on the client if, every time the client tries to submit a request to Apache, Apache returns a 400 client error? I've tried using the ErrorDocument directive, but it appears that Apache bails on the request before reaching any custom error handling.
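    As a hedged sketch of the "reset it on the client" half of the problem (not from the original post): it assumes you can get some page or script to the affected browsers at all, for example by temporarily raising Apache's request-header size limit (the LimitRequestFieldSize directive) or serving the page from another hostname on the same parent domain. The cookie name, path and domain below are hypothetical and must match how the application originally set the cookie.

        // Sketch: expire the oversized cookie by re-setting it with a past expiry date.
        // Name, path and domain are assumptions; if they don't match the original
        // cookie, the browser simply keeps the old one.
        function expireCookie(name: string, path: string = "/", domain: string = ".foobar.com"): void {
          document.cookie =
            name + "=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=" + path + "; domain=" + domain;
        }

        expireCookie("someHugeCookie"); // hypothetical cookie name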

    Read the article

  • Parsing Line Breaks in PHP/JavaScript

    - by Matt G
    I have a textarea in my PHP application where users can enter notes on a project. Sometimes these notes are displayed on the page via PHP and sometimes via JavaScript. The problem is that if the note spans multiple lines (i.e. the user presses Enter while entering notes in the textarea), it causes the JS to fail. It's fine when it's being done by the PHP. The line of code in question is:

        var editnotes = '<textarea class="desc_text" style="width:20em;" id="notes_editor"><?php print $notes; ?></textarea>';

    So, if the note is over multiple lines, the PHP builds the page as:

        var editnotes = '<textarea class="desc_text" style="width:20em;" id="notes_editor">This is
        a test note
        over multiple lines
        </textarea>';

    And this obviously causes problems for the JS. So my question is: what can I do to prevent this? As the code is built by PHP before it even gets to the browser, I'm thinking the best approach may be to parse it in the PHP so that the output is something more like this:

        var editnotes = '<textarea class="desc_text" style="width:20em;" id="notes_editor">This is<br/>a test note<br/>over multiple lines<br/></textarea>';

    Will this work? How would I do it? Thanks
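    As a sketch of one client-side way to sidestep the escaping problem entirely (not the poster's approach): deliver the note text as data, e.g. JSON-encoded by the server, and assign it to a textarea built via the DOM, so the newlines never have to survive inside a JavaScript string literal. Function and variable names here are illustrative.

        // Sketch: build the textarea through the DOM and set the note text as its value.
        // Assumes noteText reaches the page as data rather than spliced into a script literal.
        function buildNotesEditor(noteText: string): HTMLTextAreaElement {
          const editor = document.createElement("textarea");
          editor.className = "desc_text";
          editor.id = "notes_editor";
          editor.style.width = "20em";
          editor.value = noteText; // newlines are preserved; nothing to escape
          return editor;
        }

        // Usage: document.body.appendChild(buildNotesEditor(note));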

    Read the article

  • Rails 2.3.11 Server Crashing After 4 Requests

    - by Taka
    I have a Rails 2.3.11 application running on my local Windows machine using InstantRails. I cd to my application directory, run ruby script/server to start the server running, and point my browser to localhost:3000. I get the page I expect, and am able to click a few links to other pages (all of them static). The problem starts when I load the 4th page or so. My server crashes, with this message:

        Processing HomeController#index (for 127.0.0.1 at 2012-06-23 15:48:40) [GET]
        Rendering template within layouts/application
        Rendering home/index
        Completed in 11ms (View: 9, DB: 1) | 200 OK [http://localhost/index]
        C:/rails/ruby/lib/ruby/gems/1.8/gems/activesupport-2.3.11/lib/active_support/memoizable.rb:46: [BUG] Segmentation fault
        ruby 1.8.7 (2012-02-08 patchlevel 358) [i386-mingw32]

        This application has requested the Runtime to terminate it in an unusual way.
        Please contact the application's support team for more information.

    I've uninstalled this gem and reinstalled it, which didn't help. It doesn't seem to be the gem though, because the segmentation fault sometimes occurs in C:/rails/ruby/lib/ruby/gems/1.8/gems/mongrel-1.1.2-x86-mswin32/lib/mongrel.rb:114 or C:/rails/ruby/lib/ruby/1.8/benchmark.rb:306

    Versions:

        >ruby -v
        ruby 1.8.7 (2012-02-08 patchlevel 358) [i386-mingw32]
        >rails -v
        Rails 2.3.11

    I'd like to get this fixed so while I'm developing I don't have to keep restarting my server. Any suggestions?

    Read the article

  • Webservice returning 403 error

    - by user48408
    I'm wondering whether I'm receiving the 403 errors because too many connection attempts are being made to the webservice. If this is the case, how do I get around it? I've tried creating a new instance of InternalWebService each time and disposing of the old one, but I get the same problem. I've disabled the firewall, and the webservice is located locally at the moment. I'm beginning to think it may be a problem with the credentials, but the control tree is populated via the webservice at some stage. If I browse to the webmethods in my browser I can run them all.

    I return an instance of the webservice from my login handler, LoginSession.cs:

        static LoginSession()
        {
            ...
            g_NavigatorWebService = new InternalWebService();
            g_NavigatorWebService.Credentials = System.Net.CredentialCache.DefaultCredentials;
            ...
        }

        public static InternalWebService NavigatorWebService
        {
            get { return g_NavigatorWebService; }
        }

    I have a tree view control which uses the webservice to populate itself, IncidentTreeViewControl.cs:

        public IncidentTreeView()
        {
            InitializeComponent();
            m_WebService = LoginSession.NavigatorWebService;
            ...
        }

        public void Populate()
        {
            m_WebService.BeginGetIncidentSummaryByCompany(new AsyncCallback(IncidentSummaryByClientComplete), null);
            m_WebService.BeginGetIncidentSummaryByDepartment(new AsyncCallback(IncidentSummaryByDepartmentComplete), null);
            ...
        }

        private void IncidentSummaryByClientComplete(IAsyncResult ar)
        {
            MyTypedDataSet data = m_WebService.EndGetIncidentSummaryByCompany(ar); // 403
            // ..cont...
        }

    I'm getting the 403 on the last line.

    Read the article

  • Shopify JSONP issue in ajaxAPI

    - by Aaron U
    I'm getting an odd response back from the Shopify Ajax API for JSONP. If you cURL a Shopify Ajax API location, http://storename.domain.com/cart.json?callback=handler, you get a JSONP response. But something is breaking the same request in browsers - it appears to be related to compression? Here are the responses from each browser when attempting to call the JSONP endpoint as documented:

        Firefox: The page you are trying to view cannot be shown because it uses an invalid or unsupported form of compression.
        Internet Explorer: Internet Explorer cannot display the webpage
        Chrome/Safari/Webkit: Cannot decode raw data, or failed (chrome)

    Attempted use via jQuery:

        $.getJSON('http://storename.domain.com/cart.json?callback=?', function(data) { ... });
        // Results in a failed request, viewable in the network request panels of dev tools

    Here is some output from cURL, including response headers:

        $ curl -i http://storename.domain.com/cart.json?callback=CALLBACK_FUNC
        HTTP/1.1 200 OK
        Server: nginx
        Date: Tue, 18 Dec 2012 13:48:29 GMT
        Content-Type: application/javascript; charset=utf-8
        Transfer-Encoding: chunked
        Connection: keep-alive
        Status: 200 OK
        ETag: cachable:864076445587123764313132415008994143575
        Cache-Control: max-age=0, private, must-revalidate
        X-Alternate-Cache-Key: cachable:11795444887523410552615529412743919200
        X-Cache: hit, server
        X-Request-Id: a0c33a55230fe42bce79b462f6fe450d
        X-UA-Compatible: IE=Edge,chrome=1
        Set-Cookie: _session_id=b6ace1d7b0dbedd37f7787d10e173131; path=/; HttpOnly
        X-Runtime: 0.033811
        P3P: CP="NOI DSP COR NID ADMa OPTa OUR NOR"

        CALLBACK_FUNC({"token":null,"note":null,"attributes":{},"total_price":0,...})

    Also related and unanswered here: Shopify Ajax API JSONP supported? Thanks

    Read the article

  • Why would javascript click-areas not be working in IE8?

    - by Edward Tanguay
    I'm trying to find a bug in an old ASP.NET application which causes IE8 to not be able to click on the following "button" area in our application:

        <td width="150px" class="ctl00_CP1_UiCommandManager1i toolBarItem" valign="middle"
            onmouseout="onMouseOverCommand(this,1,'ctl00_CP1_UiCommandManager1',0,0);"
            onmouseover="onMouseOverCommand(this,0,'ctl00_CP1_UiCommandManager1',0,0);"
            onmousedown="onMouseDownCommand(this, 'ctl00_CP1_UiCommandManager1', 0, 0);"
            onmouseup="onMouseUpCommand(this, 'ctl00_CP1_UiCommandManager1', 0, 0);"
            id="ctl00_CP1_UiCommandManager1_0_0">
            <span style="width:100%;overflow:hidden;text-overflow:ellipsis;vertical-align:middle;white-space:nowrap;">
                NEW
            </span>
        </td>

    When we switch IE8 to IE7 compatibility mode, the problem disappears; IE7 is able to click on it. Since the above HTML is generated by a third-party control (Janus, http://www.janusys.com/controls), we don't have the source code. Has anyone experienced similar problems with IE8? I've determined that it actually fires the onMouseDownCommand command; also, the CSS of the button area is different in IE8 - it doesn't have the color shading that it does in IE7. I can imagine that somewhere the HTML is not valid and IE8, being stricter, is not playing along - but where? Any advice on how to narrow in on this bug is welcome.

    ANSWER: It turned out that the application was not checking navigator.userAgent for "MSIE 8.0" and was thus treating IE8 as a non-Internet-Explorer browser. Thanks Lazarus for the tip; the IE8 JavaScript debugger is very nice, like a Firebug for IE - I will be using it more!
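    Purely as an illustration of the fix described in the ANSWER above (the function name is made up), a minimal sketch of the missing user-agent check:

        // Treat "MSIE 8.0" in the user-agent string as Internet Explorer 8.
        function isIE8(userAgent: string = navigator.userAgent): boolean {
          return userAgent.indexOf("MSIE 8.0") !== -1;
        }

        if (isIE8()) {
          // take the Internet Explorer code path here
        }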

    Read the article

  • Question about registering COM server and Add Reference to it in a C# project

    - by smwikipedia
    I built a COM server in raw C++. Here is the procedure:

    (1) Write an IDL file to define the interface and library.
    (2) Use midl.exe to compile the IDL file into the necessary .h, .c and .tlb files.
    (3) Implement the COM server in C++ and build a .dll file.
    (4) Add the following registry entries:

        [HKEY_CLASSES_ROOT\RawComCarLib.ComCar.1\CurVer]
        @="RawComCarLib.ComCar.1"

        ;CLSID
        [HKEY_CLASSES_ROOT\CLSID\{6CC26343-167B-4CF2-9EDF-99368A62E91C}]
        @="RawComCarLib.ComCar.1"
        [HKEY_CLASSES_ROOT\CLSID\{6CC26343-167B-4CF2-9EDF-99368A62E91C}\InprocServer32]
        @="D:\com\Project01.dll"
        [HKEY_CLASSES_ROOT\CLSID\{6CC26343-167B-4CF2-9EDF-99368A62E91C}\ProgID]
        @="RawComCarLib.ComCar.1"
        [HKEY_CLASSES_ROOT\CLSID\{6CC26343-167B-4CF2-9EDF-99368A62E91C}\TypeLib]
        @="{E5C0EE8F-8806-4FE3-BC0E-3A56CFB38BEE}"

        ;TypeLib
        [HKEY_CLASSES_ROOT\TypeLib\{E5C0EE8F-8806-4FE3-BC0E-3A56CFB38BEE}]
        [HKEY_CLASSES_ROOT\TypeLib\{E5C0EE8F-8806-4FE3-BC0E-3A56CFB38BEE}\1.0]
        @="Car Server Type Lib"
        [HKEY_CLASSES_ROOT\TypeLib\{E5C0EE8F-8806-4FE3-BC0E-3A56CFB38BEE}\1.0\0]
        [HKEY_CLASSES_ROOT\TypeLib\{E5C0EE8F-8806-4FE3-BC0E-3A56CFB38BEE}\1.0\0\win32]
        @="D:\com\Project01.tlb"
        [HKEY_CLASSES_ROOT\TypeLib\{E5C0EE8F-8806-4FE3-BC0E-3A56CFB38BEE}\1.0\FLAGS]
        @="0"
        [HKEY_LOCAL_MACHINE\SOFTWARE\Classes\TypeLib\{E5C0EE8F-8806-4FE3-BC0E-3A56CFB38BEE}\1.0\0\win32]
        @="C:\Windows\System32\msdatsrc.tlb"

    (5) I try to add a reference to the COM server by clicking Add Reference in the C# project.
    (6) In the COM tab I see my "Car Server Type Lib", so everything is OK up to this point.

    I then try to use the Object Browser to browse my COM lib, but Visual Studio says "the following components could not be browsed", and I notice that no new reference has been added to the References list in the C# project. I can use tlbimp.exe to generate an interop.CarCom.dll and then use the COM server through this interop dll, but I want the interop assembly to be generated automatically when I simply add a reference to the COM server. Could someone tell me what's wrong? Many thanks.

    Read the article

  • Rendering PDF on WebPage

    - by Priyank
    Hi. We are trying to load a PDF file in the web browser using the PDFObject JavaScript API. Currently the PDFs we are trying to display are close to 10 MB. This creates a long delay before the PDF appears on the web page, while the complete PDF is downloaded. We need to remove this lag by achieving one of the following alternatives:

    1. Show a progress bar until the PDF is actually displayed. We couldn't find an event that is triggered and can be used to tell whether the PDF is visible yet; without it we can't decide when to stop showing the progress bar/spinner.
    2. Lazy-load the PDF so that it is displayed as soon as the first page has loaded. With that, at least the user has a visual indication that something is happening. We couldn't find anything in PDFObject that lets us do a lazy load.
    3. Use an alternative PDF rendering API. This is a low priority, since we already have complete code in place, but if the first two alternatives can't be met we'd have to consider this option - so please feel free to suggest one.

    Any other ideas on how the user interaction could be made more intuitive or pleasant would be welcome. Cheers

    Read the article

  • WCF REST with jQuery AJAX - removing/working around same origin policy

    - by csauve
    So I'm trying to create a C# WCF REST service that is called by jQuery. I've discovered that jQuery requires AJAX calls to be made under the same-origin policy. I have a few questions about how I might proceed. I am already aware of:

    1. The hacky solution of JSONP with a server callback.
    2. The way-too-much-server-overhead approach of a cross-domain proxy.
    3. Using Flash in the browser to make the call and setting up crossdomain.xml at my WCF server root.

    I'd rather not use these because:

    1. I don't want to use JSON, or at least I don't want to be restricted to using it.
    2. I would like to separate the server that serves static pages from the one that serves application state.
    3. Flash, in this day and age, is out of the question.

    What I'm thinking: is there anything like Flash's crossdomain.xml file that works for jQuery? Is this "same-origin" policy part of jQuery, or is it a restriction in specific browsers? If it's just part of jQuery, maybe I'll try digging into the code to work around it.
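    Purely to illustrate option 1 (JSONP) from the list above, a minimal client-side sketch follows. The endpoint URL is hypothetical, the WCF service would still have to wrap its JSON response in the supplied callback, and this does nothing for the poster's wish to avoid JSON.

        declare const $: any; // assume jQuery is loaded on the page

        // JSONP sketch: jQuery injects a <script> tag instead of using XMLHttpRequest,
        // so the same-origin policy is not an obstacle.
        $.ajax({
          url: "http://api.example.com/Service.svc/items",
          dataType: "jsonp",
          success: (data: any) => {
            console.log("received", data);
          }
        });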

    Read the article

  • CodeIgniter's Scaffolding and Helper Functions Not Working

    - by 01010011
    Hi, I'm following CodeIgniter's tutorial "Create a blog in 20 minutes" and I am having trouble getting the helper, anchor and scaffolding functions to work.

    First, I can't seem to create links on my HTML page using the helper and anchor functions. I put

        $this->load->helper('url');
        $this->load->helper('form');

    in the constructor under parent::Controller(); and

        <p><?php anchor('blog/comments','Comments'); ?></p>

    within the foreach loop as specified in the tutorial, but I'm not getting the links to appear.

    Secondly, I keep getting a 404 Page Not Found error whenever I try to access CodeIgniter's scaffolding page in my browser, like so:

        localhost/codeignitor/index.php/blog/scaffolding/mysecretword

    I can access localhost/codeignitor/index.php/blog just fine. I followed CodeIgniter's instructions in "Create a blog in 20 minutes" by storing my database settings in the database.php file, automatically connecting to the database by inserting "database" in the core array of autoload.php, and adding both parent::Controller(); and $this->load->scaffolding('myTableName') to the blog's constructor. It still gives me this 404. Any assistance will be appreciated. Thanks in advance.

    Read the article

  • BlazeDS StreamingAMF: How to detect when flex client closes the connection?

    - by Adrian Pirvulescu
    Hello, I have a Flex application that connects to a BlazeDS server using the StreamingAMF channel. On the server side the logic is handled by a custom adapter that extends ActionScriptAdapter and implements the FlexSessionListener and FlexClientListener interfaces.

    My question: how can I detect which "flex client" has closed its connection when, for example, the user closes the browser? (So I can clean up some information in the database.) I have tried the following:

    1. Manually managing the command messages:

        @Override
        public Object manage(final CommandMessage commandMessage) {
            switch (commandMessage.getOperation()) {
                case CommandMessage.SUBSCRIBE_OPERATION:
                    System.out.println("SUBSCRIBE_OPERATION = " + commandMessage.getHeaders());
                    break;
                case CommandMessage.UNSUBSCRIBE_OPERATION:
                    System.out.println("UNSUBSCRIBE_OPERATION = " + commandMessage.getHeaders());
                    break;
            }
            return super.manage(commandMessage);
        }

    But the clientIDs are always different from the ones that came in.

    2. Listening for sessionDestroyed and clientDestroyed events:

        @Override
        public void clientCreated(final FlexClient client) {
            client.addClientDestroyedListener(this);
            System.out.println("clientCreated = " + client.getId());
        }

        @Override
        public void clientDestroyed(final FlexClient client) {
            System.out.println("clientDestroyed = " + client.getId());
        }

        @Override
        public void sessionCreated(final FlexSession session) {
            System.out.println("sessionCreated = " + session.getId());
            session.addSessionDestroyedListener(this);
        }

        @Override
        public void sessionDestroyed(final FlexSession session) {
            System.out.println("sessionDestroyed = " + session.getId());
        }

    But those sessionDestroyed and clientDestroyed methods are never called. :(

    Read the article

  • Automatically resize jQuery UI dialog to the width of the content loaded by ajax

    - by womp
    I'm having a lot of trouble finding specific information and examples on this. I've got a number of jQuery UI dialogs in my application attached to divs that are loaded with .ajax() calls. They all use the same setup call:

        $(".mydialog").dialog({
            autoOpen: false,
            resizable: false,
            modal: true
        });

    I just want the dialog to resize to the width of the content that gets loaded. Right now the width stays at 300px (the default) and I get a horizontal scrollbar. As far as I can tell, "autoResize" is no longer an option for dialogs, and nothing happens when I specify it. I'm trying not to write a separate function for each dialog, so .dialog("option", "width", "500") is not really an option, as each dialog is going to have a different width. Specifying width: 'auto' in the dialog options just makes the dialogs take up 100% of the width of the browser window.

    What are my options? I'm using jQuery 1.4.1 with jQuery UI 1.8rc1. It seems like this should be something that is really easy.

    EDIT: I've implemented a kludgy workaround for this, but I'm still looking for a better solution.
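    As a generic sketch of one approach (not from the original post): after the ajax content is inserted, measure the loaded content and push that width back through .dialog("option", "width", ...), which the post already mentions. The selector, URL and the 40px allowance for dialog chrome/padding are assumptions.

        declare const $: any; // assume jQuery and jQuery UI are loaded

        // Load content into the dialog's div, size the dialog to it, then open it.
        function openSizedDialog(selector: string, url: string): void {
          const dlg = $(selector);
          dlg.load(url, () => {
            const naturalWidth = dlg[0].scrollWidth + 40; // content width + padding guess
            dlg.dialog("option", "width", naturalWidth);
            dlg.dialog("open");
          });
        }

        // Usage: openSizedDialog("#detailsDialog", "/details/42");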

    Read the article

  • not a valid AllXsd value

    - by jun
    I got this from a SOAP client request:

        Exception: SoapFault exception: [soap:Client] Server was unable to read request. ---
        There is an error in XML document (2, 273). ---
        The string '2010-5-24' is not a valid AllXsd value. in /path/filinet.php:21
        Stack trace:
        #0 [internal function]: SoapClient->__call('SubIdDetailsByO...', Array)
        #1 /path/filinet.php(21): SoapClient->SubIdDetailsByOfferId(Array)
        #2 {main}

    It seems like I am sending an incorrect value, so how do I format my value as an AllXsd in PHP? Here is my code:

        <?php
        $start = isset($_GET['start']) ? $_GET['start'] : date("Y-m-d");
        $end = isset($_GET['end']) ? $_GET['end'] : date("Y-m-d");

        //define parameter array
        $param = array('userName' => 'user',
                       'password' => 'pass',
                       'startDate' => $start,
                       'endDate' => $end,
                       'promotionId' => '');

        //Get wsdl path
        $serverPath = "https://webservices.filinet.com/affiliate/reports.asmx?WSDL";

        //Declare Soap client
        $client = new SoapClient($serverPath);

        try {
            //make the call
            $result = $client->SubIdDetailsByOfferId($param);

            //If error found display error
            if (isset($fault)) {
                echo "Error: " . $fault;
            }
            //If no error display response
            else {
                //Used to display raw XML in the Web Browser
                header("Content-Type: text/xml;");

                //SubIdDetailsResult = XML results
                echo $result->SubIdDetailsByOfferIdResult;
            }
        } catch (SoapFault $ex) {
            echo "<b>Exception:</b> " . $ex;
        }

        unset($client);
        ?>

    Read the article

  • Getting Google results in Java? Need help!

    - by Cris Carter
    Hello. Right now I'm trying to get the results from Google in Java by searching for a term. I'm using a desktop program, not an applet. That in itself isn't complicated, but then Google gave me a 403 error. Anyway, I added a referrer and User-Agent and then it worked.

    Now my problem is that I don't get the results page from Google. Instead, I get their script which fetches the results page. My code right now simply uses a GET request on "http://www.google.com/search?q=" + Dork; and then it outputs each line. Here is what I get when I run my program:

        <.!doctype html<.head<.titledork - Google Search<./title<.scriptwindow.google={kEI:"9myaS-Date).getTime()}}};try{}catch(u){}window.google.jsrt_kill=1;
        align:center}#logo{display:block;overflow:hidden;position:relative;width:103px;height:37px;
        <./ script<./div

    Lots of stuff like that. I shortened it (A LOT) and put in dots to fit it here. So my big question is: how do I turn this whole mess into the nice results page I get when searching Google with a browser? Any help would be seriously appreciated, and I really need the answer fast. Also, please keep in mind that I do NOT want to use Google's API for this. Thanks in advance!

    Read the article

  • Crystal Reports - export to pdf in MVC

    - by BhejaFry
    Hi folks, I have integrated the code below into my application to generate a PDF file using Crystal Reports in an MVC project. However, after the request is processed I see only 2 pages in the PDF file, while my 'data' returns more than 2 records. Also, the PDF isn't rendered as soon as the page is processed; instead I have to refresh at least once before the PDF is rendered in the browser.

        using CrystalDecisions.CrystalReports.Engine;

        public FileStreamResult Report()
        {
            ReportClass rptH = new ReportClass();
            List<sampledataset> data = objdb.getdataset();
            rptH.FileName = Server.MapPath("[reportName].rpt");
            rptH.Load();
            rptH.SetDatabaseLogon("un", "pwd", "server", "db");
            rptH.SetDataSource(data);
            Stream stream = rptH.ExportToStream(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat);
            stream.Seek(0, System.IO.SeekOrigin.Begin);
            return new FileStreamResult(stream, "application/pdf");
        }

    I took the code from here on SO but modified it as above. TIA.

    Read the article

  • jQuery ajax success chaining Internet Explorer Issues

    - by Nickd
    I have a jQuery ajax function that retrieves JSON data. In the success block I call another function to parse the data and update the page. At the end of this parsing/updating function a different ajax call is made. This works perfectly in all browsers except Internet Explorer (7 and 8). The problem is that Internet Explorer thinks the script is taking too long to process, because the success block from the first ajax call doesn't complete until the second ajax call finishes. I get the message:

        "Stop running this script? A script on this page is causing your web browser to run slowly.
        If it continues to run, your computer might become unresponsive."

    My jQuery code:

        $("#id_select").bind("change", function(e){
            $.ajax({
                url: "/retrieve_data.js",
                data: { id: $(e.target).children(":selected").attr("value") },
                type: "get",
                dataType: "json",
                success: function(data, status, form){
                    processData(data);
                },
                error: function(response, status){
                    alert(response.responseText);
                }
            });
        })

    Any suggestions on how to get IE to stop timing out, or on how to accomplish this without rewriting all my jQuery functions, would be appreciated.
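    As a sketch of one common mitigation (not from the original post): yield to the browser between the heavy parse/update step and the follow-up request by scheduling it with setTimeout, so the first success handler returns quickly and IE's long-running-script watchdog regains control. processData and startSecondRequest below stand in for the post's own functions.

        declare function processData(data: unknown): void;   // stands in for the parse/update step
        declare function startSecondRequest(): void;         // stands in for the follow-up $.ajax call

        // Break the long synchronous success handler into two turns of the event loop.
        function onFirstRequestSuccess(data: unknown): void {
          processData(data);
          window.setTimeout(startSecondRequest, 0); // defer; the success handler returns immediately
        }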

    Read the article

  • Prevent illegal behavior to the registered user

    - by Al Kush
    I am building a website focused on publishing novels. Every writer who publishes their novel with us will receive a royalty from us, and this royalty comes from the users who read the novel online on our website. When a user searches for a novel and wants to read it, they click a link to the page whose content is that novel. The HTML page for each novel has session handling that first forces them to log in or register and make a payment (for example with a credit card or PayPal) before accessing that page.

    My problem now is: once the user has successfully logged in and accessed the HTML page, I am afraid the user will copy the content of the novel. A discussion here, How to Disable Copy Paste (Browser), suggests building it in Flash so the text can't be copied and pasted. But I think that if the user who accesses it is a web developer like us, they will try to find the path of the file from the link in the page source, and then they can steal it. I hope this explains the problem well enough, and that someone has a good idea for how to solve it.

    Read the article

  • viewstack causing error 1065 variable not defined issue?

    - by jason
    I've got a Flex application with a left-side TREE control and a ViewStack on the right. When someone makes a selection in the tree, it loads the named ViewStack child based on the hidden node value in the tree's XML. But it's throwing an error 1065 (variable not defined) on a ViewStack child that worked on the last browser refresh/reload. From what I can tell it's not related to a particular ViewStack child; it just seems to throw the error on certain render events. I've tried using creationPolicy="all" on the ViewStack, but that doesn't seem to help.

        public function treeChanged(event:Event):void {
            selectedNode = Tree(event.target).selectedItem as XML;
            //trace(selectedNode.@hidden);
            //Alert.show(selectedNode.@hidden.toString() + " *");
            if (selectedNode.@hidden.toString() == '' || selectedNode.@hidden.toString() == null) {
                //Alert.show("NULL !");
                return;
            }
            mainviewstack.selectedChild = Container(mainviewstack.getChildByName(selectedNode.@hidden.toString()));
            //Container(mainviewstack.getChildByName(selectedNode.@hidden));
        }

    If I add an Alert box before the getChildByName call, the ViewStack has time to render and everything works fine, so it leads me to believe the app is not giving it enough time to load the ViewStack?

    Read the article

  • Is using a StringBuilder for writing XML ok?

    - by Jack Lawson
    It feels dirty. But maybe it isn't... is it ok to use a StringBuilder for writing XML? My gut instinct says "although this feels wrong, it's probably pretty darn performant because it's not loading extra libraries and overhead." Here's what it looks like. I'm building an OpenSearch XML doc based on the domain you come in from.

        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "text/xml";
            string domain = WebUtils.ReturnParsedSourceUrl(null); //returns something like www.sample.com
            string cachedChan = context.Cache[domain + "_opensearchdescription"] as String;

            if (cachedChan == null)
            {
                StringBuilder sb = new StringBuilder();
                sb.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
                sb.Append("<OpenSearchDescription xmlns=\"http://a9.com/-/spec/opensearch/1.1/\" xmlns:moz=\"http://www.mozilla.org/2006/browser/search/\">");
                sb.Append(" <ShortName>Search</ShortName>");
                sb.Append(" <Description>Use " + domain + " to search.</Description>");
                sb.Append(" <Contact>[email protected]</Contact>");
                sb.Append(" <Url type=\"text/html\" method=\"get\" template=\"http://" + domain + "/Search.aspx?q={searchTerms}\" />");
                sb.Append(" <moz:SearchForm>http://" + domain + "/Search.aspx</moz:SearchForm>");
                sb.Append(" <Image height=\"16\" width=\"16\" type=\"image/x-icon\">http://" + domain + "/favicon.ico</Image>");
                sb.Append("</OpenSearchDescription>");

                cachedChan = sb.ToString();
                context.Cache.Insert(domain + "_opensearchdescription", cachedChan, null, DateTime.Now.AddDays(14), TimeSpan.Zero);
            }

            context.Response.Write(cachedChan);
        }

    Read the article

  • asp.net does not Redirect when in frameset

    - by Snoop Dogg
    I have developed an application in ASP.NET and uploaded it to my host - let's say http://myhost/app. My manager wrapped this address in an empty frameset on http://anotherhost/somename and set the src of the frame to http://myhost/app. Now nobody can log in. When the login button is hit, it posts back (the browser loads, the progress bar fills up and finishes) but nothing happens - it does not redirect. (I have set IE to always allow cookies and it now works for me, but other people still cannot log in.) I think there is something I have no clue about regarding framesets and ASP.NET.

    PS: I never use frames but could not convince my manager to drop them. He likes to develop in FrontPage :) What's happening? Thanks in advance.

        protected void btnLogin_Click(object sender, ImageClickEventArgs e)
        {
            Member member = Logic.DoLogin(txtUsername.Text.Trim(), txtPassword.Text.Trim());
            if (null == member)
            {
                lblError.Text = "Invalid Login !";
                return;
            }

            CurrentMember = member; // CurrentMember is an inherited property that accesses Session["member"]
            Response.Redirect("Default.aspx");
        }

    Read the article

  • How best to embed multiple Flash Player instances using swfobject via a usercontrol?

    - by panamack
    I have a ListView on a Page within a MasterPage, and some very ugly auto-generated IDs, such as "ctl00_workbenchPlaceHolder_ListView1_ctrl1_LibItem2One". Using swfobject.embedSWF(...) requires me to hand over the id of a div on my page that can be replaced with object/embed markup depending on the browser context.

    My aim is to show users a collection of videos they have uploaded to their website so they can review them and change some related data if desired - hence the ListView, which is populated via a SQLDataSource that currently provides a number of URLs pointing to .flv files. But it isn't going to work if I put a <div id="replaceme"></div> in my user control, because I may then have more than one id="replaceme" on the page, and poor swfobject won't like that.

    So my evil solution is to put an <asp:Literal> in my user control and build the script, function name and div tag id as a string. ApplyVideoConfiguration is called if the library object retrieved from the database is a video, and switches to the relevant View of a MultiView control:

        protected void ApplyVideoConfiguration()
        {
            MultiViewLibItem.ActiveViewIndex = 3;
            string functionName = "MakeFlashFor_" + this.ClientID;
            string divId = "fp" + this.ClientID;
            VideoScriptLiteral.Text =
                "<script type=\"text/javascript\">" +
                "Sys.Application.add_load(" + functionName + ");" +
                "function " + functionName + "(){" +
                "swfobject.embedSWF('PanamaVideoThumbnail.swf', '" + divId + "', '140', '127', '10');" +
                "};" +
                "</script>" +
                "<div id=\"" + divId + "\" ></div>";
        }

    I was wondering just how bad a solution this is. I'm really completely inexperienced when it comes to best practices, but my instincts tell me this is bad - although it does succeed in embedding some Flash Player instances. Can anyone help me make it beautiful?
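    As a sketch of a purely client-side alternative (not the poster's approach): mark each placeholder div with a class, let script assign unique ids, and call swfobject.embedSWF for each one with the same arguments the post already uses. The "video-thumb" class and the data-swf attribute are hypothetical.

        declare const swfobject: any; // assume swfobject.js is loaded on the page

        // Give each placeholder a unique id on the client, then embed a player into it.
        function embedAllThumbnails(): void {
          document.querySelectorAll<HTMLElement>("div.video-thumb").forEach((div, index) => {
            const divId = "fp_" + index;
            div.id = divId;
            const swfUrl = div.getAttribute("data-swf") ?? "PanamaVideoThumbnail.swf";
            swfobject.embedSWF(swfUrl, divId, "140", "127", "10");
          });
        }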

    Read the article

  • "SessionId doesn't exist" error when starting Selenium server

    - by ripper234
    I'm starting a Selenium server (the jar) and getting this exception without ever trying to talk to the server. What could be the cause? The errors keep coming in, once every 2 seconds. Could this be some leftover from a previous Selenium run?

        C:\Foo>java -jar ..\..\..\..\lib\Selenium\selenium-server.jar
        14:53:30.141 INFO - Java: Sun Microsystems Inc. 14.2-b01
        14:53:30.142 INFO - OS: Windows Server 2008 6.1 amd64
        14:53:30.149 INFO - v1.0.1 [2696], with Core v@VERSION@ [@REVISION@]
        14:53:30.209 INFO - Version Jetty/5.1.x
        14:53:30.210 INFO - Started HttpContext[/selenium-server/driver,/selenium-server/driver]
        14:53:30.211 INFO - Started HttpContext[/selenium-server,/selenium-server]
        14:53:30.211 INFO - Started HttpContext[/,/]
        14:53:30.217 INFO - Started SocketListener on 0.0.0.0:4444
        14:53:30.218 INFO - Started org.mortbay.jetty.Server@2747ee05
        14:53:31.729 INFO - Checking Resource aliases
        14:53:31.735 WARN - POST /selenium-server/driver/?seleniumStart=true&localFrameAddress=top&seleniumWindowName=&uniqueId=sel_27224&sessionId=1f2385b8bae24f6fb79816753de7cd69&counterToMakeURsUniqueAndSoStopPageCachingInTheBrowser=1255006411692&sequenceNumber=268 HTTP/1.1
        java.lang.RuntimeException: sessionId 1f2385b8bae24f6fb79816753de7cd69 doesn't exist; perhaps this session was already stopped?
            at org.openqa.selenium.server.FrameGroupCommandQueueSet.getQueueSet(FrameGroupCommandQueueSet.java:218)
            at org.openqa.selenium.server.SeleniumDriverResourceHandler.handleBrowserResponse(SeleniumDriverResourceHandler.java:159)
            at org.openqa.selenium.server.SeleniumDriverResourceHandler.handle(SeleniumDriverResourceHandler.java:127)
            at org.mortbay.http.HttpContext.handle(HttpContext.java:1530)
            at org.mortbay.http.HttpContext.handle(HttpContext.java:1482)
            at org.mortbay.http.HttpServer.service(HttpServer.java:909)
            at org.mortbay.http.HttpConnection.service(HttpConnection.java:820)
            at org.mortbay.http.HttpConnection.handleNext(HttpConnection.java:986)
            at org.mortbay.http.HttpConnection.handle(HttpConnection.java:837)
            at org.mortbay.http.SocketListener.handleConnection(SocketListener.java:245)
            at org.mortbay.util.ThreadedServer.handle(ThreadedServer.java:357)
            at org.mortbay.util.ThreadPool$PoolThread.run(ThreadPool.java:534)

    Read the article
