Search Results

Search found 17538 results on 702 pages for 'request headers'.

Page 321 of 702

  • How do I stop Safari from caching my Servlet response?

    - by Cliff
    I'm having trouble testing a web app with Safari. My app returns wave audio data. The problem happens when I change the application and hit it again from Safari. Safari caches the original response, so no matter how many times I hit refresh it looks as though I haven't updated anything. I can almost get around this with a force refresh in Firefox, but because I'm having trouble generating the wave headers using the javax.sound API, Firefox only plays the first second of the returned audio. A few weeks ago I tried setting the HTTP header in my servlet to prevent caching, but I don't think I was setting it correctly. (What is the header for browser cache control?) This is becoming a real pain and I'm looking for any ideas, comments, or alternative approaches. I'm getting ready to try again, but I figured I'd ask here in the interim to see if someone can provide help.
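
    For reference, a minimal sketch of the usual cache-defeating response headers in a servlet (assuming a javax.servlet.http.HttpServletResponse; adjust to whatever doGet/doPost you already have):

        // sketch: tell the browser and any proxies not to reuse a cached copy
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
            resp.setHeader("Pragma", "no-cache");   // HTTP/1.0 fallback
            resp.setDateHeader("Expires", 0);       // already expired
            resp.setContentType("audio/x-wav");     // or whatever you already send
            // ... write the wave data to resp.getOutputStream() as before ...
        }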

    Read the article

  • Why is Varnish not caching?

    - by Justin
    I am troubleshooting the setup of Varnish 3.x on my Ubuntu server. I'm running Drupal 7 on two sites set up on the box, via named-based vhosts. Before trying to get Varnish to play nice with Drupal I'm trying to just get Varnish to a PNG from cache. Here are the headers I get from a curl -I request of the PNG file: HTTP/1.1 200 OK Server: Apache/2.2.22 (Ubuntu) Last-Modified: Sun, 07 Oct 2012 21:18:59 GMT ETag: "a57c2-3850-4cb7ea73db6c0" Accept-Ranges: bytes Content-Length: 14416 Cache-Control: max-age=1209600 Expires: Thu, 25 Oct 2012 22:55:14 GMT Content-Type: image/png Accept-Ranges: bytes Date: Thu, 11 Oct 2012 22:55:14 GMT X-Varnish: 1766703058 Age: 0 Via: 1.1 varnish Connection: keep-alive X-Varnish-Cache: MISS Here is the Varnish VCL file I'm using (It's a default VCL configuration designed for Drupal): # Default backend definition. Set this to point to your content # server. # backend default { .host = "127.0.0.1"; .port = "8080"; } # Respond to incoming requests. sub vcl_recv { # Use anonymous, cached pages if all backends are down. if (!req.backend.healthy) { unset req.http.Cookie; } # Allow the backend to serve up stale content if it is responding slowly. set req.grace = 6h; # Pipe these paths directly to Apache for streaming. #if (req.url ~ "^/admin/content/backup_migrate/export") { # return (pipe); #} # Do not cache these paths. if (req.url ~ "^/status\.php$" || req.url ~ "^/update\.php$" || req.url ~ "^/admin$" || req.url ~ "^/admin/.*$" || req.url ~ "^/flag/.*$" || req.url ~ "^.*/ajax/.*$" || req.url ~ "^.*/ahah/.*$") { return (pass); } # Do not allow outside access to cron.php or install.php. #if (req.url ~ "^/(cron|install)\.php$" && !client.ip ~ internal) { # Have Varnish throw the error directly. # error 404 "Page not found."; # Use a custom error page that you've defined in Drupal at the path "404". # set req.url = "/404"; #} # Always cache the following file types for all users. This list of extensions # appears twice, once here and again in vcl_fetch so make sure you edit both # and keep them equal. if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") { unset req.http.Cookie; } # Remove all cookies that Drupal doesn't need to know about. We explicitly # list the ones that Drupal does need, the SESS and NO_CACHE. If, after # running this code we find that either of these two cookies remains, we # will pass as the page cannot be cached. if (req.http.Cookie) { # 1. Append a semi-colon to the front of the cookie string. # 2. Remove all spaces that appear after semi-colons. # 3. Match the cookies we want to keep, adding the space we removed # previously back. (\1) is first matching group in the regsuball. # 4. Remove all other cookies, identifying them by the fact that they have # no space after the preceding semi-colon. # 5. Remove all spaces and semi-colons from the beginning and end of the # cookie string. set req.http.Cookie = ";" + req.http.Cookie; set req.http.Cookie = regsuball(req.http.Cookie, "; +", ";"); set req.http.Cookie = regsuball(req.http.Cookie, ";(SESS[a-z0-9]+|SSESS[a-z0-9]+|NO_CACHE)=", "; \1="); set req.http.Cookie = regsuball(req.http.Cookie, ";[^ ][^;]*", ""); set req.http.Cookie = regsuball(req.http.Cookie, "^[; ]+|[; ]+$", ""); if (req.http.Cookie == "") { # If there are no remaining cookies, remove the cookie header. If there # aren't any cookie headers, Varnish's default behavior will be to cache # the page. 
unset req.http.Cookie; } else { # If there is any cookies left (a session or NO_CACHE cookie), do not # cache the page. Pass it on to Apache directly. return (pass); } } } # Set a header to track a cache HIT/MISS. sub vcl_deliver { if (obj.hits > 0) { set resp.http.X-Varnish-Cache = "HIT"; } else { set resp.http.X-Varnish-Cache = "MISS"; } } # Code determining what to do when serving items from the Apache servers. # beresp == Back-end response from the web server. sub vcl_fetch { # We need this to cache 404s, 301s, 500s. Otherwise, depending on backend but # definitely in Drupal's case these responses are not cacheable by default. if (beresp.status == 404 || beresp.status == 301 || beresp.status == 500) { set beresp.ttl = 10m; } # Don't allow static files to set cookies. # (?i) denotes case insensitive in PCRE (perl compatible regular expressions). # This list of extensions appears twice, once here and again in vcl_recv so # make sure you edit both and keep them equal. if (req.url ~ "(?i)\.(pdf|asc|dat|txt|doc|xls|ppt|tgz|csv|png|gif|jpeg|jpg|ico|swf|css|js)(\?.*)?$") { unset beresp.http.set-cookie; } # Allow items to be stale if needed. set beresp.grace = 6h; } # In the event of an error, show friendlier messages. sub vcl_error { # Redirect to some other URL in the case of a homepage failure. #if (req.url ~ "^/?$") { # set obj.status = 302; # set obj.http.Location = "http://backup.example.com/"; #} # Otherwise redirect to the homepage, which will likely be in the cache. set obj.http.Content-Type = "text/html; charset=utf-8"; synthetic {" <html> <head> <title>Page Unavailable</title> <style> body { background: #303030; text-align: center; color: white; } #page { border: 1px solid #CCC; width: 500px; margin: 100px auto 0; padding: 30px; background: #323232; } a, a:link, a:visited { color: #CCC; } .error { color: #222; } </style> </head> <body onload="setTimeout(function() { window.location = '/' }, 5000)"> <div id="page"> <h1 class="title">Page Unavailable</h1> <p>The page you requested is temporarily unavailable.</p> <p>We're redirecting you to the <a href="/">homepage</a> in 5 seconds.</p> <div class="error">(Error "} + obj.status + " " + obj.response + {")</div> </div> </body> </html> "}; return (deliver); } I'm getting a MISS and age 0 every time. If I'm understanding correctly, this means the file isn't being returned from Varnish's cache. Is there a problem with my Varnish config?
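
    Not a diagnosis, just a quick check that often narrows this down (it assumes Varnish is the listener on port 80 with Apache behind it on 8080, as the backend definition above suggests; the PNG path is a placeholder): request the file twice through Varnish and compare the headers. On a request served from cache, Age rises above 0, X-Varnish carries two transaction IDs, and the custom X-Varnish-Cache header flips to HIT.

        # run twice; the second response should show the cache being used
        curl -s -o /dev/null -D - http://127.0.0.1/sites/default/files/example.png | grep -Ei '^(age|x-varnish)'
        curl -s -o /dev/null -D - http://127.0.0.1/sites/default/files/example.png | grep -Ei '^(age|x-varnish)'

    If the second request still misses, the next thing worth looking at is whether the request reaching Varnish carries cookies that survive the stripping above (curl sends none, a browser may), since any leftover cookie makes vcl_recv pass straight to Apache.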

    Read the article

  • SSIS - Can I get the column schema for a flat file source from a database?

    - by Steve Clement
    We receive a nightly data export from a vendor in the form of about 10 tab-delimited flat files without column headers. In addition, the vendor provides us with the SQL scripts for the database tables so that we can import the files into our system. Unfortunately, the vendor recently changed the schema for the flat files. Each file has upwards of 150 columns, and having to go through the DB schema and adjust column types on a Flat File Data Source in SSIS is extremely time consuming, not to mention a royal pain. Since I know the file's data layout from the database schema, is there any way I can dynamically pull that into a Flat File source to set the columns correctly? Or am I just stuck with setting everything manually?
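
    One half-way measure, sketched under the assumption that the vendor's table definitions match the files: pull the column list and types straight out of the catalog views and use that as the checklist (or feed it to a script/BIML that regenerates the Flat File connection), instead of reading the CREATE TABLE scripts by hand. The table name is a placeholder.

        SELECT COLUMN_NAME,
               DATA_TYPE,
               CHARACTER_MAXIMUM_LENGTH,
               NUMERIC_PRECISION,
               NUMERIC_SCALE
        FROM   INFORMATION_SCHEMA.COLUMNS
        WHERE  TABLE_NAME = 'VendorExportTable'   -- placeholder
        ORDER  BY ORDINAL_POSITION;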

    Read the article

  • How can I add styles to dynamically added table cells?

    - by Doc Hoffiday
    In my program I have a table that, when loaded, has jQuery add some styles/classes to the table cells and table headers. Everything works fine until rows are added via functionality on the rest of the page. Instead of adding the classes to each table cell at the moment it is added, is it possible to "listen" for, or fire, some event that checks whether child elements were added to the table? Essentially, I want something functionally equivalent to this: $("#table td").live("ready", function(){ // do something }); but live/ready won't work on a table cell... Any ideas?
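
    Short of a reliable DOM-insertion event, a sketch of the usual workaround: keep the styling in one function and call it both on load and from whatever code appends the rows (the selector and class names are placeholders).

        function styleTable($scope) {
            // idempotent: addClass does nothing if the class is already there
            $scope.find("td").addClass("cell-style");
            $scope.find("th").addClass("header-style");
        }

        $(function () {
            styleTable($("#table"));   // initial load
        });

        // wherever rows get added elsewhere on the page:
        var $row = $("<tr><td>new cell</td></tr>").appendTo("#table tbody");
        styleTable($row);              // restyle just the new row

    Mutation events such as DOMNodeInserted existed at the time but were unevenly supported across browsers, so the explicit call after insertion tends to be the safer route.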

    Read the article

  • Bookmarkable URLs after Ajax for Wicket

    - by Wolfgang
    There is the well-known problem that browsers don't put Ajax requests in the request history, which causes problems for bookmarkability, the forward/back buttons, and refresh. There is also a common solution to that problem that appends the hash symbol # and some additional parameters to the URL using JavaScript (window.location.hash = ...). In this question, a basic solution to the problem is proposed, for example. My question is whether such a solution has been integrated into Wicket, so that existing Wicket facilities are used and no custom JavaScript has to be added. If not, I'd be interested in how this could be done. Such a solution would have to answer the question of what should be put after the hash. I like the idea that the bookmarkable URL that (in the non-Ajax case) would be in front of the hash could be put behind it. For example, when you are on http://host/catalog and reach the page http://host/product/xyz, the Ajax-triggered URL would be http://host/catalog#/product/xyz. Then it would be easy to write an onload handler that checks for the # and redirects to the URL after the hash.
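
    That last step is framework-agnostic; a sketch of the onload handler (plain JavaScript, nothing Wicket-specific, assuming the fragment always carries a path that starts with /):

        window.onload = function () {
            var hash = window.location.hash;            // e.g. "#/product/xyz"
            if (hash && hash.charAt(1) === "/") {
                // replace() avoids pushing yet another history entry
                window.location.replace(hash.substring(1));
            }
        };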

    Read the article

  • Visual Studio 2008 linker error

    - by ravi
    In Visual Studio 2008, I have created a static DLL called test_static.dll. I am trying to call it from an application: I have included this DLL in the source files folder and the related header file in the headers folder. When I run the application I get the following linker error. Please give me a solution. error LNK2019: unresolved external symbol "struct morph_output * __cdecl morpho_data(struct morph_input *)" (?morpho_data@@YAPAUmorph_output@@PAUmorph_input@@@Z) referenced in function _wmain 1D:\test_app\Debug\test_app.exe : fatal error LNK1120: 1 unresolved externals 1Build log was saved at "file://d:\test_app\test_app\Debug\BuildLog.htm" Here test_app is the application that uses the DLL, and morpho_data is the DLL function, which takes one structure as input and returns another structure.
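
    Two common causes for this flavor of LNK2019, offered as guesses rather than a diagnosis: the import library that goes with test_static.dll (normally test_static.lib) is not listed under Linker > Input > Additional Dependencies (adding the .dll to the source files folder does not link anything), or the function is exported with C linkage while the consumer declares it with C++ linkage (the mangled name above is a C++ decoration). Keeping one shared declaration avoids the second problem; a sketch of such a header:

        /* shared header included by both the library and test_app (sketch) */
        #ifdef __cplusplus
        extern "C" {
        #endif

        struct morph_input;
        struct morph_output;

        struct morph_output *morpho_data(struct morph_input *in);

        #ifdef __cplusplus
        }
        #endif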

    Read the article

  • ASP code for uploading data

    - by vicky
    Hello everyone, I have this code for uploading an Excel file and saving the data into a database. I'm not able to write the code for the database entry; someone please help: <% if (Request("FileName") <> "") Then Dim objUpload, lngLoop Response.Write(server.MapPath(".")) If Request.TotalBytes > 0 Then Set objUpload = New vbsUpload For lngLoop = 0 to objUpload.Files.Count - 1 'If accessing this page annonymously, 'the internet guest account must have 'write permission to the path below. objUpload.Files.Item(lngLoop).Save "D:\PrismUpdated\prism_latest\Prism\uploadxl\" Response.Write "File Uploaded" Next Dim FSYSObj, folderObj, process_folder process_folder = server.MapPath(".") & "\uploadxl" set FSYSObj = server.CreateObject("Scripting.FileSystemObject") set folderObj = FSYSObj.GetFolder(process_folder) set filCollection = folderObj.Files Dim SQLStr SQLStr = "INSERT ALL INTO TABLENAME " for each file in filCollection file_name = file.name path = folderObj & "\" & file_name Set objExcel_chk = CreateObject("Excel.Application") Set ws1 = objExcel_chk.Workbooks.Open(path).Sheets(1) row_cnt = 1 'for row_cnt = 6 to 7 ' if ws1.Cells(row_cnt,col_cnt).Value <> "" then ' col = col_cnt ' end if 'next While (ws1.Cells(row_cnt, 1).Value <> "") for col_cnt = 1 to 10 SQLStr = SQLStr & "VALUES('" & ws1.Cells(row_cnt, 1).Value & "')" next row_cnt = row_cnt + 1 WEnd 'objExcel_chk.Quit objExcel_chk.Workbooks.Close() set ws1 = nothing objExcel_chk.Quit Response.Write(SQLStr) 'set filobj = FSYSObj.GetFile (sub_fol_path & "\" & file_name) 'filobj.Delete next End if End If Please tell me how to save this Excel data to the Oracle database. Any help would be appreciated.
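
    For the database side, a sketch using classic ADO with a parameterized INSERT, which also avoids building the SQL string by hand (the connection string, table and column names are placeholders; extend the parameter list to cover all the columns you need):

        <%
        ' ws1 and row_cnt come from the Excel loop above
        Dim conn, cmd
        Set conn = Server.CreateObject("ADODB.Connection")
        conn.Open "Provider=OraOLEDB.Oracle;Data Source=MYDB;User Id=scott;Password=secret;"   ' placeholder

        Set cmd = Server.CreateObject("ADODB.Command")
        Set cmd.ActiveConnection = conn
        cmd.CommandText = "INSERT INTO TABLENAME (COL1) VALUES (?)"
        ' 200 = adVarChar, 1 = adParamInput (numeric values so adovbs.inc is not needed)
        cmd.Parameters.Append cmd.CreateParameter("p1", 200, 1, 255, "")

        While (ws1.Cells(row_cnt, 1).Value <> "")
            cmd.Parameters("p1").Value = ws1.Cells(row_cnt, 1).Value
            cmd.Execute
            row_cnt = row_cnt + 1
        WEnd

        conn.Close
        %>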

    Read the article

  • What is your favorite API developer community site? And why? [closed]

    - by whatupwilly
    There are a lot of great sites out there that offer good documentation, tools, tips, best-practices, sample code, etc. for the API's they are publishing. A sample: http://apiwiki.twitter.com http://developer.netflix.com/ http://developers.facebook.com/ https://affiliate-program.amazon.com/gp/advertising/api/detail/main.html http://code.google.com/ http://remix.bestbuy.com/ http://www.flickr.com/services/api/misc.overview.html http://products.wolframalpha.com/api/webserviceapi.html There are some no-brainers that I think a good developer site should have: Hi level introduction Quick start guide API specific details - showing example request and responses Links to sample code and/or 3rd party libraries Developer registration (e.g. get an API key) Blog But what about some other things: Online-Forum or Msg Board vs. Google Group (or similar) Galleries/ShowCases - spotlighting great apps built on the API - who has done nice galleries? Community Wiki - How do people feel about letting the community have edit rights on API documentation pages Online testing tools (like Facebook has a lot of nice interactive tools to simulate request/responses) What are some packages that you would recommend to put this all together: pbwiki Google Group pages MediaWiki API vendor package such as Sonoa Systems that offers a customizable developer portal So, to summarize: What are some other great API developer portals out there What are some nice features you like on them Any recommendations on what to use to build these features out Thanks, Will Zappos.com Public API (soon to launch) Product Manager

    Read the article

  • Unable to HTTP PUT with libcurl

    - by Jesse Beder
    I'm trying to PUT data using libcurl to mimic the command curl -u test:test -X PUT --data-binary @data.yaml "http://127.0.0.1:8000/foo/" which works correctly. My options look like: curl_easy_setopt(handle, CURLOPT_USERPWD, "test:test"); curl_easy_setopt(handle, CURLOPT_URL, "http://127.0.0.1:8000/foo/"); curl_easy_setopt(handle, CURLOPT_VERBOSE, 1); curl_easy_setopt(handle, CURLOPT_UPLOAD, 1); curl_easy_setopt(handle, CURLOPT_READFUNCTION, read_data); curl_easy_setopt(handle, CURLOPT_READDATA, &yaml); curl_easy_setopt(handle, CURLOPT_INFILESIZE, yaml.size()); curl_easy_perform(handle); I believe the read_data function works correctly, but if you ask, I'll post that code. I'm using Django with django-piston, and my update function is never called! (It is called when I use the command line version above.) libcurl's output is: * About to connect() to 127.0.0.1 port 8000 (#0) * Trying 127.0.0.1... * connected * Connected to 127.0.0.1 (127.0.0.1) port 8000 (#0) * Server auth using Basic with user 'test' > PUT /foo/ HTTP/1.1 Authorization: Basic dGVzdDp0ZXN0 Host: 127.0.0.1:8000 Accept: */* Content-Length: 244 Expect: 100-continue * Done waiting for 100-continue ** this is where my read_data handler confirms: read 244 bytes ** * HTTP 1.0, assume close after body < HTTP/1.0 400 BAD REQUEST < Date: Thu, 13 May 2010 08:22:52 GMT < Server: WSGIServer/0.1 Python/2.5.1 < Vary: Authorization < Content-Type: text/plain < Bad Request* Closing connection #0
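
    One detail the verbose output hints at, offered as a guess rather than a confirmed diagnosis: libcurl adds Expect: 100-continue to the upload (a 244-byte command-line curl run may be small enough not to), and Django's single-threaded development WSGIServer may not handle that handshake the way a production server does, which can end in a 400 before the view is ever reached. Suppressing the header is cheap to try:

        /* sketch: send the PUT without the Expect: 100-continue handshake */
        struct curl_slist *headers = NULL;
        headers = curl_slist_append(headers, "Expect:");   /* empty value removes the header */
        curl_easy_setopt(handle, CURLOPT_HTTPHEADER, headers);

        curl_easy_perform(handle);
        curl_slist_free_all(headers);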

    Read the article

  • How to invert rows and columns using a T-SQL Pivot Table

    - by Jeff Stock
    I have a query that returns one row. However, I want to invert the rows and columns, meaning show the rows as columns and the columns as rows. I think the best way to do this is to use a pivot table, which I am no expert in. Here is my simple query: SELECT Period1, Period2, Period3 FROM GL.Actuals WHERE Year = 2009 AND Account = '001-4000-50031' Results (with headers): Period1, Period2, Period3 612.58, 681.36, 676.42 I would like the results to look like this: Desired Results: Period, Amount Period1, 612.58 Period2, 681.36 Period3, 676.42 This is a simple example, but what I'm really after is a bit more complex than this. I realize I could produce these results by using several SELECT commands instead. I'm just hoping someone can shed some light on how to accomplish this with a pivot table, or whether there is a better way.
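
    Turning columns into rows is what UNPIVOT (the counterpart of PIVOT) does; a sketch against the query above:

        SELECT u.Period, u.Amount
        FROM (
            SELECT Period1, Period2, Period3
            FROM GL.Actuals
            WHERE Year = 2009 AND Account = '001-4000-50031'
        ) AS src
        UNPIVOT (Amount FOR Period IN (Period1, Period2, Period3)) AS u;

    Note that UNPIVOT drops rows whose value is NULL, which is harmless for the sample data shown here but worth knowing for the more complex case.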

    Read the article

  • Can not access response.body inside after filter block in Sinatra 1.0

    - by Petr Vostrel
    I'm struggling with a strange issue. According to http://github.com/sinatra/sinatra (section Filters), a response object is available in after filter blocks in Sinatra 1.0. While response.status is accessible correctly, I cannot see a non-empty response.body from my routes inside the after filter. I have this rackup file: config.ru require 'app' run TestApp Then the Sinatra 1.0.b gem installed using: gem install --pre sinatra And this is my tiny app with a single route: app.rb require 'rubygems' require 'sinatra/base' class TestApp < Sinatra::Base set :root, File.dirname(__FILE__) get '/test' do 'Some response' end after do halt 500 if response.empty? # used 500 just for illustration end end And now, I would like to access the response inside the after filter. When I run this app and access the /test URL, I get a 500 response as if the response is empty, but the response clearly is 'Some response'. Along with my request to /test, a separate request to /favicon.ico is issued by the browser, and that returns 404 as there is no route nor a static file. But I would expect the 500 status to be returned, as the response should be empty. In console, I can see that within the after filter, the response to /favicon.ico is something like 'Not found' and the response to /test really is empty, even though there is a response returned by the route. What am I missing?
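
    A cheap thing to try, based on the assumption (not verified against the 1.0 source) that the route's return value is only copied into the response after the after filter has run: set the body explicitly with the body helper, so it is already populated when the filter inspects it.

        get '/test' do
          body 'Some response'   # explicit, rather than relying on the return value
        end

        after do
          halt 500 if response.empty?   # used 500 just for illustration
        end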

    Read the article

  • HTTP Handler error when downloading files - SSL

    - by Chiefy
    Ok big problem as this is affecting two projects on our new server. We have a file that is downloaded by users, the files are downloaded using a HTTPHandler. Since moving the site to the server and setting SSL the downloads have stopped working and we get an error message "Unable to download DownloadDocument.ashx" from site". DownloadDocument.ashx is the handler page that is set in the web.config and the button that goes there is a hyperlink with the id of the document as a querystring. Ive read the article on http://support.microsoft.com/kb/316431 and read a few other requests on this site but nothing seems to be working. This problem only happens in IE and works fine when I run it on the server in http instead of https. public override void HandleRequest(HttpContext context) { Guid guid = new Guid(context.Request.QueryString["ID"]); DataTable dt = Documents.GetDocument(guid); if (dt != null) { context.Response.Cache.SetCacheability(HttpCacheability.Private); context.Response.AddHeader("content-disposition", string.Format("attachment; filename={0}", dt.Rows[0]["DocumentName"].ToString())); context.Response.AddHeader("Content-Transfer-Encoding", "binary"); context.Response.AddHeader("Content-Length", ((byte[])dt.Rows[0]["Document"]).Length.ToString()); context.Response.ContentType = string.Format("application/{0}", dt.Rows[0]["Extension"].ToString().Remove(0, 1)); context.Response.Buffer = true; context.Response.BinaryWrite((byte[])dt.Rows[0]["Document"]); context.Response.Flush(); context.Response.End(); } } The above is my current code for the request. Ive used the base handler on http://haacked.com/archive/2005/03/17/AnAbstractBoilerplateHttpHandler.aspx. Any ideas on what this might be and how we can fix it. Thanks in advance for all responses.
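
    For what it's worth, the KB article cited above boils down to this: over HTTPS, IE refuses to hand a downloaded file to the user if the response forbids caching (Cache-Control: no-cache or no-store, or Pragma: no-cache), because IE needs a temporary cached copy to open the attachment. So the thing to verify is what headers the download response actually carries over SSL; a global filter, an HttpModule or an IIS setting may be adding the forbidden ones. A sketch of headers that IE generally accepts for attachments over SSL (an illustration, not a drop-in replacement for the handler above):

        // sketch: pin the cache headers explicitly so nothing upstream wins
        context.Response.ClearHeaders();
        context.Response.Cache.SetCacheability(HttpCacheability.Private);   // "private", never "no-cache"/"no-store"
        context.Response.Cache.SetMaxAge(TimeSpan.FromMinutes(1));
        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition",
            string.Format("attachment; filename=\"{0}\"", dt.Rows[0]["DocumentName"]));
        context.Response.BinaryWrite((byte[])dt.Rows[0]["Document"]);
        context.Response.End();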

    Read the article

  • How do I use HTML5's localStorage in a Google Chrome extension?

    - by davidkennedy85
    I am trying to develop an extension that will work with Awesome New Tab Page. I've followed the author's advice to the letter, but it doesn't seem like any of the script I add to my background page is being executed at all. Here's my background page: <script> var info = { poke: 1, width: 1, height: 1, path: "widget.html" } chrome.extension.onRequestExternal.addListener(function(request, sender, sendResponse) { if (request === "mgmiemnjjchgkmgbeljfocdjjnpjnmcg-poke") { chrome.extension.sendRequest( sender.id, { head: "mgmiemnjjchgkmgbeljfocdjjnpjnmcg-pokeback", body: info, } ); } }); function initSelectedTab() { localStorage.setItem("selectedTab", "Something"); } initSelectedTab(); </script> Here is manifest.json: { "update_url": "http://clients2.google.com/service/update2/crx", "background_page": "background.html", "name": "Test Widget", "description": "Test widget for mgmiemnjjchgkmgbeljfocdjjnpjnmcg.", "icons": { "128": "icon.png" }, "version": "0.0.1" } Here is the relevant part of widget.html: <script> var selectedTab = localStorage.getItem("selectedTab"); document.write(selectedTab); </script> Every time, the browser just displays null. The local storage isn't being set at all, which makes me think the background page is completely disconnected. Do I have something wired up incorrectly?
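
    Since background.html and widget.html belong to the same extension, they share one chrome-extension:// origin, so a value written by the background page should be readable from the widget. A quick way to tell whether the background code ran at all is to read the value straight off the background page object (a sketch; it assumes the extension APIs are available inside the frame that Awesome New Tab Page loads widget.html into):

        <script>
        // in widget.html
        var bg = chrome.extension.getBackgroundPage();
        if (bg) {
            document.write(bg.localStorage.getItem("selectedTab"));
        } else {
            document.write("background page not reachable from this frame");
        }
        </script>

    Opening the background page's own console from chrome://extensions also shows whether initSelectedTab() ran or threw before reaching localStorage.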

    Read the article

  • Could DataGridView be this dumb? Or is it me? lol

    - by Selase
    I am trying to bind data to a dropdown list on page load based on a condition. The code below explains further. public partial class AddExhibit : System.Web.UI.Page { string adminID, caseIDRetrieved; DataSet caseDataSet = new DataSet(); SqlDataAdapter caseSqlDataAdapter = new SqlDataAdapter(); string strConn = WebConfigurationManager.ConnectionStrings["CMSSQL3ConnectionString1"].ConnectionString; protected void Page_Load(object sender, EventArgs e) { adminID = Request.QueryString["adminID"]; caseIDRetrieved = Request.QueryString["caseID"]; if (caseIDRetrieved != null) { CaseIDDropDownList.Text = caseIDRetrieved; //CaseIDDropDownList.Enabled = false; } else { try { CreateDataSet(); DataView caseDataView = new DataView(caseDataSet.Tables[0]); CaseIDDropDownList.DataSource = caseDataView; CaseIDDropDownList.DataBind(); } catch (Exception ex) { string script = "<script>alert('" + ex.Message + "');</script>"; } } } The CreateDataSet method called in the if..else statement contains the following code. private void CreateDataSet() { SqlConnection caseConnection = new SqlConnection(strConn); caseSqlDataAdapter.SelectCommand = new SqlCommand("Select CaseID FROM Cases", caseConnection); caseSqlDataAdapter.Fill(caseDataSet); } However, when I load the page and the condition that is supposed to bind the data is met as usual, the gridview decides to display as follows... IS IT ME OR IS IT THE DATAGRID?
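
    The illustration referred to by "displays as follows" didn't make it into this excerpt, but a common way for a bound DropDownList to misbehave here is to show a list full of System.Data.DataRowView entries, which happens when DataTextField and DataValueField are not set. A sketch of the else branch with those two properties added (a guess at the symptom, not a certainty):

        CreateDataSet();
        DataView caseDataView = new DataView(caseDataSet.Tables[0]);
        CaseIDDropDownList.DataSource = caseDataView;
        CaseIDDropDownList.DataTextField = "CaseID";    // column shown to the user
        CaseIDDropDownList.DataValueField = "CaseID";   // column posted back
        CaseIDDropDownList.DataBind();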

    Read the article

  • Why does Microsoft not provide a static Win32 class for C# with most of the native functions and structures?

    - by Oleg
    Everybody who has used P/Invoke with the Windows API knows the long list of declarations of static functions with attributes like [DllImport ("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)] The declarations of structures copied from Windows headers like WinNT.h, or from web sites like www.pinvoke.net, also take up a lot of space in our programs. Why do we all have to spend our time on this? Why does Microsoft not give us a simple way, like the #include <windows.h> line in old unmanaged programs, to get access to a static class Native with all or most of the Windows functions and structures inside?

    Read the article

  • Loading a CSV file using jQuery GET returns the header but no data

    - by Cees Meijer
    When reading a CSV file from a server using the jQuery 'GET' function I do not get any data. When I look at the code using FireBug I can see the GET request is sent and the return value is '200 OK'. Also I see that the header is returned correctly so the request is definitely made, and data is returned. This is also what I see in Wireshark. Here I see the complete contents of the CSV file is returned as a standard HTTP response. But the actual data is not there in my script. Firebug shows an empty response and the 'success' function is never called. What could be wrong ? <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title>New Web Project</title> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <script src="jquery.js" type="text/javascript" charset="utf-8"></script> <script type="text/javascript"> var csvData; $(document).ready(function() { $("#btnGET").click(function() { csvData = $.ajax({ type: "GET", url: "http://www.mywebsite.com/data/sample_file.csv", dataType: "text/csv", success: function () { alert("done!"+ csvData.getAllResponseHeaders()) } }); }); }) </script> </head> <body> <h1>New Web Project Page</h1> <button id="btnGET">GET Data</button> </body> </html>
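
    Two things worth checking, offered as guesses: jQuery's dataType option takes one of jQuery's own keywords (xml, html, script, json, jsonp, text) rather than a MIME type, so "text/csv" gets in the way; and a GET against a different host than the page's own origin is blocked by the same-origin policy, which is exactly the case where Wireshark sees the response but the script never does. A sketch with both adjusted (the relative URL assumes the CSV can be served from the same site as the page):

        $("#btnGET").click(function () {
            $.ajax({
                type: "GET",
                url: "/data/sample_file.csv",   // same-origin path (placeholder)
                dataType: "text",               // a jQuery keyword, not a MIME type
                success: function (data, textStatus, xhr) {
                    alert("done! " + data.split("\n").length + " lines; headers: " +
                          xhr.getAllResponseHeaders());
                },
                error: function (xhr, textStatus) {
                    alert("failed: " + textStatus);   // surfaces the failure instead of silence
                }
            });
        });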

    Read the article

  • Where to learn about HTTP?

    - by razass
    I am fluent in HTML and PHP and am slowly learning JavaScript; however, I have noticed that there is a huge hole in my knowledge when it comes to understanding how web software communication actually works. I understand the flow of information across the net, but I would like to learn about the HTTP protocol to better understand how data is actually sent back and forth through the internet, to help me understand things like REST, HTTP headers, AJAX requests, etc. However, I must be searching for the wrong terms, because I haven't been able to find a good description of HTTP. Any help is appreciated to point me in the right direction. Thanks!
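
    For a feel of what is actually on the wire before diving into a book or RFC 2616, it helps to stare at one raw exchange: a request is a start line plus headers (and optionally a body), and a response has the same shape.

        GET /index.html HTTP/1.1
        Host: www.example.com
        Accept: text/html
        Connection: close

        HTTP/1.1 200 OK
        Content-Type: text/html; charset=utf-8
        Content-Length: 1256

        <!DOCTYPE html> ...body follows...

    Everything else (REST, AJAX, custom headers) is convention layered on that request/response shape; telnet www.example.com 80 or a browser's network inspector shows the exchange live.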

    Read the article

  • jQuery AJAX Redirection problem

    - by meosoft
    Hello please consider this: On page A I have a link that takes you to page B when JS is off, but when JS is on, I want to replace content on current page with content from the page B. Pages A and B are in fact the same script that is able to tell AJAX calls from regular ones and serve the content appropriately. Everything works fine, as long as there are no redirects involved. But, sometimes there is a 301 redirect and what seems to be happening is that client browser then makes a second request, which will return with a 200 OK. Only the second request is sent without a X-Requested-With header, therefore I cannot tell within my script wether it came from AJAX or not, and will send a complete page instead of just the content. I have tried checking for 301 status code in my error, success, and complete handlers but none of them worked. It seems to be handling the 301 behind the scenes. Could anyone help me with this? jQuery 1.4, PHP 5 Edit: People requested the code to this, which I didn't think was necessary but here goes: // hook up menu ajax loading $('#menu a').live("click", function(){ // update menu highlight if($(this).parents('#menu').size() > 0){ $("#menu>li").removeClass("current_page_item"); $(this).parent().addClass("current_page_item"); } // get the URL where we will be retrieving content from var url = $(this).attr('href'); window.location.hash = hash = url; $.ajax({ type: "GET", url: url, success: function(data){ // search for an ID that is only present if page is requested directly if($(data).find('#maincontent').size() > 0){ data = $(data).find('#maincontent .content-slide *').get(); } // the rest is just animating the content into view $("#scroller").html(data); $('.content-slide').each(setHeight); $('.content-slide').animate({ left: "0px" }, 1000, 'easeOutQuart', function(){ $('#home').css("left", "-760px").html(data); $('#scroller').css("left", "-760px"); $('.content-slide').each(setHeight); } ); } }); return false; });
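
    One workaround used in this situation, sketched under the assumption that the 301 is a canonicalizing redirect (trailing slash and the like) that preserves the query string: mark Ajax requests in the URL as well as in the header, and have the PHP side treat either signal as "this came from Ajax".

        $.ajax({
            type: "GET",
            // append a flag that survives the redirect because it is part of the URL
            url: url + (url.indexOf("?") === -1 ? "?" : "&") + "ajax=1",
            success: function (data) {
                // ... same handling as before ...
            }
        });

    Server-side, checking isset($_GET['ajax']) alongside the X-Requested-With header then covers the redirected request too.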

    Read the article

  • How do I create an "iframe popup" when I hover over an <a> tag?

    - by Angela
    Here is the scenario: a User will see a list of company names, each wrapped in an <a> tag. He is able to see dynamic information as he hovers over each name, and can then make a request. So: Given a list of companies, each wrapped in an <a> tag, When the cursor hovers over an <a> tag, Then a "pop-up" appears that contains <iframe>-based, dynamic content. Given the pop-up, When the User clicks on the "submit" button in the pop-up, Then the form (based on the framework) is submitted and ajax displays "request successful". So, because I am using a PHP framework, I'd like to use an iframe to contain the form. Some challenges: When the cursor is no longer hovering over the <a> tag, the hover disappears. How do I keep it open? How do I make it appear in an <iframe> so I can have full form submission and POST-ing of dynamic values through the URL? How does the "popup" disappear when the cursor is no longer on either the <a> tag or the pop-up itself? Can I do it without loading a bunch of <iframe>s onto the page, because the list of companies could be long?
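
    A sketch of the usual pattern: one shared popup div holding a single iframe whose src is set on demand, shown on hover and hidden on a short timer so that moving the cursor from the link onto the popup does not close it (the selector, the data-company-id attribute and the form URL are all placeholders):

        $(function () {
            var $popup = $('<div id="company-popup" style="position:absolute; display:none;">' +
                           '<iframe frameborder="0" width="300" height="200"></iframe></div>').appendTo("body");
            var hideTimer;

            function showPopup($link) {
                clearTimeout(hideTimer);
                $popup.find("iframe").attr("src", "/company_form.php?id=" + $link.attr("data-company-id"));
                $popup.css({ top: $link.offset().top + $link.outerHeight(), left: $link.offset().left }).show();
            }
            function scheduleHide() {
                hideTimer = setTimeout(function () { $popup.hide(); }, 300);
            }

            $("a.company").hover(function () { showPopup($(this)); }, scheduleHide);
            $popup.hover(function () { clearTimeout(hideTimer); }, scheduleHide);
        });

    Because the iframe is created once and reused, a long company list does not multiply iframes; the trade-off is that only one popup is open at a time.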

    Read the article

  • PHP Session code work differently on two servers

    - by williamsdb
    I have some code which works fine on one server but on another gives a session header warning: Warning: session_start() [function.session-start]: Cannot send session cache limiter - headers already sent. I have checked the php.ini settings on the two servers and they are identical. I know the warning is supposed to suggest that something has been output before session_start, but what I don't understand is why the same code works on one server and not the other. Is there anything other than the php.ini settings that could explain it?
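
    Besides the usual suspects (stray whitespace or a UTF-8 BOM before the opening <?php in a file that differs between the two deployments), the effective configuration at runtime can differ from php.ini because of .htaccess, php_value directives or per-directory overrides. A quick sketch to compare the two servers side by side:

        <?php
        // run this on both servers; if output_buffering is on, output emitted before
        // session_start() is tolerated, and if it is off, even a single space or a BOM
        // triggers the "headers already sent" warning
        var_dump(ini_get('output_buffering'), ini_get('session.auto_start'));
        session_start();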

    Read the article

  • Force sending a user to custom QuerySet.

    - by Jack M.
    I'm trying to secure an application so that users can only see objects which are assigned to them. I've got a custom QuerySet which works for this, but I'm trying to find a way to force the use of this additional functionality. Here is my Model: class Inquiry(models.Model): ts = models.DateTimeField(auto_now_add=True) assigned_to_user = models.ForeignKey(User, blank=True, null=True, related_name="assigned_inquiries") objects = CustomQuerySetManager() class QuerySet(QuerySet): def for_user(self, user): return self.filter(assigned_to_user=user) (The CustomQuerySetManager is documented over here, if it is important.) I'm trying to force everything to use this filtering, so that other methods will raise an exception. For example: Inquiry.objects.all() ## Should raise an exception. Inquiry.objects.filter(pk=69) ## Should raise an exception. Inquiry.objects.for_user(request.user).filter(pk=69) ## Should work. inqs = Inquiry.objects.for_user(request.user) ## Should work. inqs.filter(pk=69) ## Should work. It seems to me that there should be a way to force the security of these objects by allowing only certain users to access them. I am not concerned with how this might impact the admin interface.
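
    A sketch of one way to make the unsafe entry points fail loudly, assuming the pre-1.6 manager API (get_query_set) that the snippet above already relies on: let the manager's default queryset raise, and hand out a real queryset only through for_user().

        from django.core.exceptions import PermissionDenied
        from django.db import models


        class ForUserManager(models.Manager):
            def for_user(self, user):
                # bypass our own override and filter the real queryset
                return super(ForUserManager, self).get_query_set().filter(assigned_to_user=user)

            def get_query_set(self):
                raise PermissionDenied("Use Inquiry.objects.for_user(user) instead of .all()/.filter()")

    With this in place, Inquiry.objects.all() and Inquiry.objects.filter(pk=69) raise, while Inquiry.objects.for_user(request.user).filter(pk=69) chains normally. The admin and related-object lookups go through get_query_set as well, so they raise too, which matches the caveat above but is worth keeping in mind.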

    Read the article

  • Getting an Internal Server Error

    - by Rishi2686
    Hi There, When I try to run my site it gives error as Internal Server error, when I refresh the page I get my result properly.The error page looks like this: Internal Server Error The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator, [email protected] and inform them of the time the error occurred, and anything you might have done that may have caused the error. More information about this error may be available in the server error log. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request. I also checked error_log file on my server, it gives error as: [Sat Jun 12 01:21:55 2010] [error] [client 117.195.6.76] File does not exist: /home/rohit25/public_html/test/500.shtml, referer: http://www.test.mysite.com/home.php sometimes error can be; [Sat May 29 19:35:12 2010] [error] [client 97.85.189.208] File does not exist: /home2/carlton/public_html/test/favicon.ico Are there any changes required in configuration file, I also tried to involve this error code in custom error page, it shows error page, which could not resolve this issue. Your urgent help will be greatly appreciated.
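
    For what it's worth, the two log lines quoted are secondary: they only say that Apache could not find the custom 500 page (500.shtml) and the favicon, not what caused the 500 in the first place. The primary error is usually logged separately (suEXEC or suPHP logs, a .htaccess or permissions problem are common culprits on shared cPanel-style hosts). Pointing ErrorDocument at a page that exists at least removes the extra 404 noise; a sketch for the site's .htaccess (file names are placeholders):

        # .htaccess
        ErrorDocument 500 /error500.html
        ErrorDocument 404 /error404.html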

    Read the article

  • ASP.NET MVC URL Routing problem

    - by Sadegh
    hi, i have defined a route as below: context.MapRoute("SearchEngineWebSearch", "search/web/{query}/{index}/{size}", new { controller = "search", action = "web", query = "", index = 0, size = 5 }); and action method to handle request match with that: public System.Web.Mvc.ActionResult Web(string query = "", int index = 0, int size = 5) { if (string.IsNullOrEmpty(query)) return RedirectToRoute("SearchEngineBasicSearch"); var search = new Search(); var results = search.PerformSearch(query, index, size); ViewData["Query"] = query; if (results != null && results.Count() > 0) { ViewData["Results"]= results; return View("Web"); } else return View("Not-Found"); } and form to sent parameter to action method: <% using (Html.BeginForm("Web", "Search", FormMethod.Post)) { %> <input name="query" type="text" value="<%: ViewData["Query"]%>" class="search-field" /> <input type="submit" value="Search" class="search-button" /> <input type="hidden" name="index" value="2" /> <input type="hidden" name="size" value="2" /> <%} %> now after click on submit and sending value to action method all route values updated but url values still is equals to first time of sending parameter. for example if i sent for first time request such as http://localhost/search/web/google and for next time http://localhost/search/web/yahoo, query parameter which passed to action method is yahoo but url after postback is http://localhost/search/web/google still! can anybody help me plz? ;)
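
    The form POSTs to the same action URL every time, and returning View() leaves whatever URL the browser posted to sitting in the address bar. One conventional fix, sketched here as Post/Redirect/Get (assumes MVC 2-style verb attributes, with the existing action marked [HttpGet] so the two overloads don't collide): let the POST do nothing but redirect to the bookmarkable GET route, so the address bar always carries the current query.

        [HttpPost]
        public ActionResult Web(string query)
        {
            // lands on /search/web/{query}/{index}/{size}; the GET action then renders the results
            return RedirectToRoute("SearchEngineWebSearch", new { query = query, index = 0, size = 5 });
        }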

    Read the article

  • real time stock quotes, StreamReader performance optimization

    - by sean717
    I am working on a program that extracts real time quote for 900+ stocks from a website. I use HttpWebRequest to send HTTP request to the site and store the response to a stream and open a stream using the following code: HttpWebResponse response = (HttpWebResponse)request.GetResponse(); Stream stream = response.GetResponseStream (); StreamReader reader = new StreamReader( stream ) the size of the received HTML is large (5000+ lines), so it takes a long time to parse it and extract the price. For 900 files, It takes about 6 mins for parsing and extracting. Which my boss isn't happy with, he told me he'd want the whole process to be done in TWO mins. I've identified the part of the program that takes most of time to finish is parsing and extracting. I've tried to optimize the code to make it faster, the following is what I have now after some optimization: // skip lines at the top for(int i=0;i<1500;++i) reader.ReadLine(); // read the line that contains the price string theLine = reader.ReadLine(); // ... extract the price from the line now it takes about 4 mins to process all the files, there is still a significant gap to what my boss's expecting. So I am wondering, is there other way that I can further speed up the parsing and extracting and have everything done within 2 mins?
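
    Reading those 1,500 lines is mostly network time in disguise (the data still has to come down the wire before ReadLine can return), and with 900 requests handled strictly one after another that waiting adds up. Overlapping the downloads usually buys far more than tuning the string handling; a sketch, assuming .NET 4's Parallel.ForEach (symbols, UrlFor and the skip count are placeholders):

        ServicePointManager.DefaultConnectionLimit = 20;   // allow several connections per host

        Parallel.ForEach(symbols, new ParallelOptions { MaxDegreeOfParallelism = 10 }, symbol =>
        {
            var request = (HttpWebRequest)WebRequest.Create(UrlFor(symbol));   // placeholder helper
            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                for (int i = 0; i < 1500; i++) reader.ReadLine();   // skip the top, as before
                string theLine = reader.ReadLine();
                // ... extract the price from theLine and store it, then stop reading
                //     instead of consuming the rest of the 5,000-line page ...
            }
        });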

    Read the article

  • Consolidate multiple site files into single location

    - by seengee
    We have a custom PHP/MySQL CMS running on Linux/Apache thats rolled out to multiple sites (20+) on the same server. Each site uses exactly the same CMS files with a few files for each site being customised. The customised files for each site are: /library/mysql_connect.php /public_html/css/* /public_html/ftparea/* /public_html/images/* There's also a couple of other random files inside /public_html/includes/ that are unique to each site. Other than this each site on the server uses the exact same files. Each site sitting within /home/username/. There is obviously a massive amount of replication here as each time we want to deploy a system update we need to update to each user account. Given the common site files are all stored in SVN it would make far more sense if we were able to simply commit to SVN and deploy to a single location direct from there. Unfortunately, making a major architecture change at this stage could be problematic. In my mind the ideal scenario would mean creating an account like /home/commonfiles/ and each site using these common files unless an account specific file exists, for example a request is made to /home/user/public_html/index.php but as this file doesnt exist the request is then redirected to /home/commonfiles/public_html/index.php. I know that generally this approach is possible, similar to how Zend Framework (and probably others) redirect all requests that dont match a specific file to index.php. I'm just not sure about how exactly to go about implementing it and whether its actually advisable. Would really welcome any input/ideas people have got. Thanks.
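
    Short of the request-level fallback described above, one low-risk way to get "commit once, deploy everywhere" without touching the architecture is a small deploy script driven from a single SVN working copy; a sketch (the paths and the exclude list are illustrative, taken from the per-site files listed above):

        #!/bin/sh
        # update the shared working copy once, then push it into every account,
        # leaving each site's customized files untouched
        SRC=/home/commonfiles/cms/              # svn working copy; run "svn update" here first
        for ACCOUNT in /home/site1 /home/site2; do
            rsync -a \
                --exclude 'library/mysql_connect.php' \
                --exclude 'public_html/css/' \
                --exclude 'public_html/ftparea/' \
                --exclude 'public_html/images/' \
                --exclude 'public_html/includes/' \
                "$SRC" "$ACCOUNT/"
        done

    Without --delete this never removes anything from an account, which keeps it safe around files the shared tree does not know about; files dropped from SVN would need a separate cleanup pass.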

    Read the article
