Search Results

Search found 15403 results on 617 pages for 'request querystring'.


  • Returned JSON from Twitter and displaying tweets using FlexSlider

    - by Trey Copeland
    After sending a request to the Twitter API using geocode, I'm getting back a JSON response with a list of tweets. I then decode that into a PHP array using json_decode() and use a foreach loop to output what I need. I'm using FlexSlider to show the tweets in a vertical fashion after wrapping them in a list. So what I want is for it to only show 10 tweets at a time and scroll through them infinitely like an escalator. Here's my loop to output the tweets: foreach ($tweets["results"] as $result) { $str = preg_replace('/[^\00-\255]+/u', '', $result["text"]); echo '<ul class="slides">'; echo '<li><a href="http://twitter.com/' . $result["from_user"] . '"><img src=' . $result["profile_image_url"] . '></a>' . $str . '</li><br /><br />'; echo '</ul>'; } My jQuery looks like this as of right now, as I'm trying to play around with things: $(window).load(function() { $('.flexslider').flexslider({ slideDirection: "vertical", start: function(slider) { //$('.flexslider .slides > li gt(10)').hide(); }, after: function(slider) { // current.sl } }); }); Non-working demo here - http://macklabmedia.com/tweet/
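    One hedged guess at the broken slider, offered as a sketch rather than a confirmed fix: the loop opens a fresh <ul class="slides"> for every tweet, while FlexSlider expects one slides list containing all the <li> items (the stray <br /> tags between slides will also confuse it). Moving the <ul> outside the loop would look like this:

      <?php
      // Sketch: build a single slides list, one <li> per tweet
      echo '<ul class="slides">';
      foreach ($tweets["results"] as $result) {
          $str = preg_replace('/[^\00-\255]+/u', '', $result["text"]);
          echo '<li><a href="http://twitter.com/' . $result["from_user"] . '">'
             . '<img src="' . $result["profile_image_url"] . '"></a>' . $str . '</li>';
      }
      echo '</ul>';

    With a single list in place, FlexSlider's own options (rather than hiding slides by hand) should control how many are visible at once.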

    Read the article

  • i18n redirection breaks my tests ....

    - by Mike
    I have a big application covered by more than a thousand tests via RSpec. We just made the choice to redirect any page like: / /foo /foo/4/bar/34 ... TO: /en /en/foo /fr/foo/4/bar/34 ... So I made a before filter in application.rb like so: if params[:locale].blank? headers["Status"] = "301 Moved Permanently" redirect_to request.env['REQUEST_URI'].sub!(%r(^(http.?://[^/]*)?(.*))) { "#{$1}/#{I18n.locale}#{$2}" } end It's working great, but... it's breaking a lot of my tests, e.g.: it "should return 404" do Video.should_receive(:failed_encodings).and_return([]) get :last_failed_encoding response.status.should == "404 Not Found" end To fix this test, I should do: get :last_failed_encoding, :locale => "en" But... seriously, I don't want to fix all my tests one by one... I tried to make the locale a default parameter like this: class ActionController::TestCase alias_method(:old_get, :get) unless method_defined?(:old_get) def get(path, parameters = {}, headers = nil) parameters.merge({:locale => "fr"}) if parameters[:locale].blank? old_get(path, parameters, headers) end end ... but I couldn't make this work... Any ideas?
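    A hedged guess at why the override does nothing: Hash#merge is non-destructive, so the merged hash is built and then thrown away. A sketch that keeps the result:

      # Sketch: keep the merged hash (merge returns a new one)
      class ActionController::TestCase
        alias_method(:old_get, :get) unless method_defined?(:old_get)

        def get(path, parameters = {}, headers = nil)
          parameters ||= {}
          parameters = parameters.merge(:locale => "fr") if parameters[:locale].blank?
          old_get(path, parameters, headers)
        end
      end

    The same treatment would be needed for post, put and delete if those tests break too.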

    Read the article

  • How can I fetch Google static maps with TIdHTTP?

    - by cloudstrif3
    I'm trying to return content from maps.google.com from within Delphi 2006 using the TIdHTTP component. My code is as follows: procedure TForm1.GetGoogleMap(); var t_GetRequest: String; t_Source: TStringList; t_Stream: TMemoryStream; begin t_Source := TStringList.Create; try t_Stream := TMemoryStream.Create; try t_GetRequest := 'http://maps.google.com/maps/api/staticmap?' + 'center=Brooklyn+Bridge,New+York,NY' + '&zoom=14' + '&size=512x512' + '&maptype=roadmap' + '&markers=color:blue|label:S|40.702147,-74.015794' + '&markers=color:green|label:G|40.711614,-74.012318' + '&markers=color:red|color:red|label:C|40.718217,-73.998284' + '&sensor=false'; IdHTTP1.Post(t_GetRequest, t_Source, t_Stream); t_Stream.SaveToFile('google.html'); finally t_Stream.Free; end; finally t_Source.Free; end; end; However, I keep getting the response HTTP/1.0 403 Forbidden. I assume this means that I don't have permission to make this request, but if I copy the URL into my web browser (IE 8), it works fine. Is there some header information that I need, or something else?
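    A hedged observation: the browser issues a GET, while the code calls IdHTTP1.Post, and the static maps endpoint may simply refuse POSTs with 403. A sketch using TIdHTTP.Get instead (note the response is a PNG image, not HTML):

      // Sketch: fetch the map with GET, as the browser does
      procedure TForm1.GetGoogleMap;
      var
        t_Url: String;
        t_Stream: TMemoryStream;
      begin
        t_Url := 'http://maps.google.com/maps/api/staticmap?' +
          'center=Brooklyn+Bridge,New+York,NY&zoom=14&size=512x512' +
          '&maptype=roadmap&sensor=false';
        t_Stream := TMemoryStream.Create;
        try
          IdHTTP1.Get(t_Url, t_Stream);      // GET instead of POST
          t_Stream.SaveToFile('google.png'); // the body is image data
        finally
          t_Stream.Free;
        end;
      end;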

    Read the article

  • SQL to get list of dates as well as days before and after without duplicates

    - by Nathan Koop
    I need to display a list of dates, which I have in a table: SELECT mydate AS MyDate, 1 AS DateType FROM myTable WHERE myTable.fkId = @MyFkId; Jan 1, 2010 - 1 Jan 2, 2010 - 1 Jan 10, 2010 - 1 No problem. However, I now need to display the date before and the date after as well, with a different DateType: Dec 31, 2009 - 2 Jan 1, 2010 - 1 Jan 2, 2010 - 1 Jan 3, 2010 - 2 Jan 9, 2010 - 2 Jan 10, 2010 - 1 Jan 11, 2010 - 2 I thought I could use a union: SELECT MyDate, DateType FROM ( SELECT mydate - 1 AS MyDate, 2 AS DateType FROM myTable WHERE myTable.fkId = @MyFkId UNION SELECT mydate + 1 AS MyDate, 2 AS DateType FROM myTable WHERE myTable.fkId = @MyFkId UNION SELECT mydate AS MyDate, 1 AS DateType FROM myTable WHERE myTable.fkId = @MyFkId ) AS myCombinedDateTable This, however, includes duplicates of the original dates: Dec 31, 2009 - 2 Jan 1, 2010 - 2 Jan 1, 2010 - 1 Jan 2, 2010 - 2 Jan 2, 2010 - 1 Jan 3, 2010 - 2 Jan 9, 2010 - 2 Jan 10, 2010 - 1 Jan 11, 2010 - 2 How can I best remove these duplicates? I am considering a temporary table but am unsure if that is the best way to do it. It also appears that this may cause performance issues, as I am running the same query three separate times. What would be the best way to handle this request?
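    One sketch of a fix, assuming the original date should always win over a computed neighbour: group the union by date and keep the smallest DateType, so each date appears exactly once and type 1 beats type 2.

      SELECT MyDate, MIN(DateType) AS DateType
      FROM
      (
          SELECT mydate - 1 AS MyDate, 2 AS DateType FROM myTable WHERE myTable.fkId = @MyFkId
          UNION
          SELECT mydate + 1 AS MyDate, 2 AS DateType FROM myTable WHERE myTable.fkId = @MyFkId
          UNION
          SELECT mydate AS MyDate, 1 AS DateType FROM myTable WHERE myTable.fkId = @MyFkId
      ) AS myCombinedDateTable
      GROUP BY MyDate
      ORDER BY MyDate;

    The three subqueries still run, but they should remain cheap if fkId is indexed; the GROUP BY is what removes the duplicate rows.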

    Read the article

  • Speeding up a SOAP-powered website

    - by ChrisRamakers
    Hi all, we're currently looking into doing some performance tweaking on a website which relies heavily on a SOAP web service. But... our servers are located in Belgium and the web service we connect to is located in San Francisco, so it's a long-distance connection, to say the least. Our website is PHP powered, using PHP's built-in SoapClient class. On average a call to the web service takes 0.7 seconds, and we are doing about 3-5 requests per page. All possible request/response caching is already implemented, so we are now looking at other ways to improve the connection speed. This is the code which instantiates the SoapClient; what I'm looking for now is other ways/methods to improve speed on single requests. Does anyone have ideas or suggestions? private function _createClient() { try { $wsdl = sprintf($this->config->wsUrl.'?wsdl', $this->wsdl); $client = new SoapClient($wsdl, array( 'soap_version' => SOAP_1_1, 'encoding' => 'utf-8', 'connection_timeout' => 5, 'cache_wsdl' => 1, 'trace' => 1, 'features' => SOAP_SINGLE_ELEMENT_ARRAYS )); $header_tags = array('username' => new SOAPVar($this->config->wsUsername, XSD_STRING, null, null, null, $this->ns), 'password' => new SOAPVar(md5($this->config->wsPassword), XSD_STRING, null, null, null, $this->ns)); $header_body = new SOAPVar($header_tags, SOAP_ENC_OBJECT); $header = new SOAPHeader($this->ns, 'AuthHeaderElement', $header_body); $client->__setSoapHeaders($header); } catch (SoapFault $e){ controller('Error')->error($id.': Webservice connection error '.$e->getCode()); exit; } $this->client = $client; return $this->client; }
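    One sketch of an easy win, offered as an assumption to verify: if the WSDL itself is ever re-fetched across the Atlantic on a cold cache, pinning PHP's disk cache keeps that round trip off the critical path (worth checking that the ini-level cache is enabled, since the constructor's 'cache_wsdl' => 1 interacts with it):

      // Sketch: make sure the parsed WSDL is cached locally
      ini_set('soap.wsdl_cache_enabled', '1');
      ini_set('soap.wsdl_cache_dir', '/tmp');   // any writable local path
      ini_set('soap.wsdl_cache_ttl', '86400');  // re-fetch the WSDL once a day

    Beyond that, with 3-5 sequential calls per page at 0.7 s each, batching several questions into one service call (if the API allows it) would save more than any client-side tuning.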

    Read the article

  • How to persist objects between requests in PHP

    - by SztupY
    I've been using Rails, Merb, Django and ASP.NET MVC applications in the past. What they have in common (that is relevant to the question) is that they have code that sets up the framework. This usually means creating objects and state that is persisted until the web server is recycled (like setting up routing, or checking which controllers are available, etc.). As far as I know, PHP is more like a CGI script that gets compiled to some bytecode each time it's run, and after the request it's discarded. Of course you can have sessions, to persist data between requests from the same user, and as I see there are extensions like APC, with which you can persist objects between requests at the server level. My question is: how can one create a PHP application that works like Rails and such? I mean an application that on the first request sets up the framework, then on the 2nd and later requests uses the objects that are already set up. Is there some built-in caching facility in mod_php (for example, one that stores the compiled bytecode of executed PHP applications)? Or is using APC or some similar extension the only way to solve this problem? How would you do it? Thanks.
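    mod_php only caches compiled bytecode when an opcode cache such as APC is installed, and even then live objects still die with the request; what survives is data you explicitly park in shared memory. A minimal sketch of the APC route, where build_framework() is a hypothetical stand-in for the expensive bootstrap:

      // Sketch: rebuild the object graph only on cache misses
      $framework = apc_fetch('framework_setup');
      if ($framework === false) {
          $framework = build_framework();                  // hypothetical bootstrap
          apc_store('framework_setup', $framework, 3600);  // keep it for an hour
      }

    Note that APC serializes the stored graph, so resources such as open connections still cannot be carried across requests.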

    Read the article

  • IntegrityError: foreign key violation upon delete

    - by Lukasz Korzybski
    I have Order and Shipment models. Shipment has a foreign key to Order: class Order(...): ... class Shipment() order = m.ForeignKey('Order') ... Now in one of my views I want to delete an order object along with all related objects, so I invoke order.delete(). I have Django 1.0.4 and PostgreSQL 8.4, and I use the transaction middleware, so the whole request is enclosed in a single transaction. The problem is that upon order.delete() I get: ... File "/usr/local/lib/python2.6/dist-packages/django/db/backends/__init__.py", line 28, in _commit return self.connection.commit() IntegrityError: update or delete on table "main_order" violates foreign key constraint "main_shipment_order_id_fkey" on table "main_shipment" DETAIL: Key (id)=(45) is still referenced from table "main_shipment". I checked in connection.queries that the proper queries are executed in the proper order. First the shipment is deleted; after that Django executes the delete on the order row: {'time': '0.000', 'sql': 'DELETE FROM "main_shipment" WHERE "id" IN (17)'}, {'time': '0.000', 'sql': 'DELETE FROM "main_order" WHERE "id" IN (45)'} The foreign key has ON DELETE NO ACTION (the default) and is initially deferred. I don't know why I get the foreign key constraint violation. I also tried to register a pre_delete signal and manually delete shipment objects before the delete on order is called, but it resulted in the same error. I can change the ON DELETE behaviour for this key in Postgres, but that would be just a hack; I wonder if anyone has a better idea of what's going on here. There is also a small detail: my Order model inherits from a Cart model, so it actually doesn't have an id field but cart_ptr_id, and after the DELETE on order is executed there is also a DELETE on cart, but it seems unrelated to the shipment-order problem, so I simplified it in the example.
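    A diagnostic sketch rather than a fix: because the constraint is initially deferred, it only fires at COMMIT, which hides the statement that actually leaves the dangling reference. Forcing the check to immediate for one transaction makes the offending statement fail on the spot:

      BEGIN;
      SET CONSTRAINTS main_shipment_order_id_fkey IMMEDIATE;
      -- replay the application's DELETEs here; the first statement that
      -- strands a main_shipment row will now raise the error itself
      COMMIT;

    If the error then points at the DELETE on main_order, some shipment row outside the cascade (perhaps one attached via the parent cart row) still references it.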

    Read the article

  • WCF and ASP.NET - Server.Execute throwing object reference not set to an instance of an object

    - by user208662
    Hello, I have an ASP.NET page that calls a WCF service. This WCF service uses a BackgroundWorker to asynchronously render an ASP.NET page on my server. Here is the service: [OperationContract] [WebInvoke(Method = "POST", BodyStyle = WebMessageBodyStyle.WrappedRequest, RequestFormat = WebMessageFormat.Json, ResponseFormat = WebMessageFormat.Json)] public void PostRequest(string comments) { // Do stuff // If everything went o.k. asynchronously render a page on the server. I do not want to // block the caller while this is occurring. BackgroundWorker myWorker = new BackgroundWorker(); myWorker.DoWork += new DoWorkEventHandler(myWorker_DoWork); myWorker.RunWorkerAsync(HttpContext.Current); } private void myWorker_DoWork(object sender, DoWorkEventArgs e) { // Set the current context so we can render the page via Server.Execute HttpContext context = (HttpContext)(e.Argument); HttpContext.Current = context; // Retrieve the url to the page string applicationPath = context.Request.ApplicationPath; string sourceUrl = applicationPath + "/log.aspx"; string targetDirectory = currentContext.Server.MapPath("/logs/"); // Execute the other page and load its contents using (StringWriter stringWriter = new StringWriter()) { // Write the contents out to the target url // NOTE: THIS IS WHERE MY ERROR OCCURS currentContext.Server.Execute(sourceUrl, stringWriter); // Prepare to write out the result of the log targetPath = targetDirectory + "/" + DateTime.Now.ToShortDateString() + ".aspx"; using (StreamWriter streamWriter = new StreamWriter(targetPath, false)) { // Write out the content to the file sb.Append(stringWriter.ToString()); streamWriter.Write(sb.ToString()); } } } Oddly, when the currentContext.Server.Execute method is executed, it throws an "object reference not set to an instance of an object" error. The reason this is so strange is that I can look at the currentContext properties in the watch window; in addition, Server is not null. Because of this, I have no idea where this error is coming from. Can someone point me in the right direction as to what the cause of this could be? Thank you!
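    A hedged guess: ASP.NET tears down the request's internals once PostRequest returns, so by the time the BackgroundWorker runs, the captured HttpContext is a shell; the property bag still shows values in the watch window, but the handler state Server.Execute needs internally is gone. A sketch that sidesteps the dead context by rendering the page over plain HTTP (the absolute URL is a hypothetical example):

      // Sketch: fetch the rendered page instead of calling Server.Execute
      using System.IO;
      using System.Net;

      string pageUrl = "http://localhost/log.aspx";   // hypothetical absolute URL
      using (var client = new WebClient())
      {
          string html = client.DownloadString(pageUrl);
          File.WriteAllText(targetPath, html);        // same targetPath as above
      }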

    Read the article

  • PHP $_SERVER['HTTP_HOST'] vs. $_SERVER['SERVER_NAME'], am I understanding the man pages correctly?

    - by Jeff
    I did a lot of searching and also read the PHP $_SERVER man page. Do I have this right regarding which to use for my PHP scripts for simple link definitions used throughout my site? $_SERVER['SERVER_NAME'] is based on your web server's config file (Apache2 in my case), and varies depending on a few directives: (1) VirtualHost, (2) ServerName, (3) UseCanonicalName, etc. $_SERVER['HTTP_HOST'] is based on the request from the client. Therefore, it would seem to me that the proper one to use in order to make my scripts as compatible as possible would be $_SERVER['HTTP_HOST']. Is this assumption correct? Follow-up comments: I guess I got a little paranoid after reading this article and noting that someone said they "wouldn't trust any of the $_SERVER vars": http://markjaquith.wordpress.com/2009/09/21/php-server-vars-not-safe-in-forms-or-links/ and also: http://www.php.net/manual/en/reserved.variables.server.php (comment: Vladimir Kornea 14-Mar-2009 01:06). Apparently the discussion is mainly about $_SERVER['PHP_SELF'] and why you shouldn't use it in the form action attribute without proper escaping to prevent XSS attacks. My conclusion about my original question above is that it is "safe" to use $_SERVER['HTTP_HOST'] for all links on a site without having to worry about XSS attacks, even when used in forms. Please correct me if I'm wrong.
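    That reading of the man page matches mine, with one caveat worth a sketch: HTTP_HOST is client-supplied, so the usual hardening step is to validate it against the hosts you actually serve before echoing it into links or forms (the host names below are hypothetical):

      // Sketch: whitelist the client-supplied host before trusting it
      $allowed = array('www.example.com', 'example.com');
      $host = in_array($_SERVER['HTTP_HOST'], $allowed, true)
            ? $_SERVER['HTTP_HOST']
            : 'www.example.com';   // safe fallback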

    Read the article

  • HTTP crawler in Erlang

    - by ctp
    I'm writing a simple HTTP crawler, but I have an issue running the code at the bottom: I'm requesting 50 URLs and get the content of only 20+ back. I've generated a few files, 150 kB each, to test the crawler. So I think the 20+ responses are limited by the bandwidth? BUT: how do I tell the Erlang snippet not to quit until the last file is fetched? The test data server is online, so please try the code out; any hints are welcome :) -module(crawler). -define(BASE_URL, "http://46.4.117.69/"). -export([start/0, send_reqs/0, do_send_req/1]). start() -> ibrowse:start(), proc_lib:spawn(?MODULE, send_reqs, []). to_url(Id) -> ?BASE_URL ++ integer_to_list(Id). fetch_ids() -> lists:seq(1, 50). send_reqs() -> spawn_workers(fetch_ids()). spawn_workers(Ids) -> lists:foreach(fun do_spawn/1, Ids). do_spawn(Id) -> proc_lib:spawn_link(?MODULE, do_send_req, [Id]). do_send_req(Id) -> io:format("Requesting ID ~p ... ~n", [Id]), Result = (catch ibrowse:send_req(to_url(Id), [], get, [], [], 10000)), case Result of {ok, Status, _H, B} -> io:format("OK -- ID: ~2..0w -- Status: ~p -- Content length: ~p~n", [Id, Status, length(B)]); Err -> io:format("ERROR -- ID: ~p -- Error: ~p~n", [Id, Err]) end. That's the output: Requesting ID 1 ... Requesting ID 2 ... Requesting ID 3 ... Requesting ID 4 ... Requesting ID 5 ... Requesting ID 6 ... Requesting ID 7 ... Requesting ID 8 ... Requesting ID 9 ... Requesting ID 10 ... Requesting ID 11 ... Requesting ID 12 ... Requesting ID 13 ... Requesting ID 14 ... Requesting ID 15 ... Requesting ID 16 ... Requesting ID 17 ... Requesting ID 18 ... Requesting ID 19 ... Requesting ID 20 ... Requesting ID 21 ... Requesting ID 22 ... Requesting ID 23 ... Requesting ID 24 ... Requesting ID 25 ... Requesting ID 26 ... Requesting ID 27 ... Requesting ID 28 ... Requesting ID 29 ... Requesting ID 30 ... Requesting ID 31 ... Requesting ID 32 ... Requesting ID 33 ... Requesting ID 34 ... Requesting ID 35 ... Requesting ID 36 ... Requesting ID 37 ... Requesting ID 38 ... Requesting ID 39 ... Requesting ID 40 ... Requesting ID 41 ... Requesting ID 42 ... Requesting ID 43 ... Requesting ID 44 ... Requesting ID 45 ... Requesting ID 46 ... Requesting ID 47 ... Requesting ID 48 ... Requesting ID 49 ... Requesting ID 50 ...
OK -- ID: 49 -- Status: "200" -- Content length: 150000 OK -- ID: 47 -- Status: "200" -- Content length: 150000 OK -- ID: 50 -- Status: "200" -- Content length: 150000 OK -- ID: 17 -- Status: "200" -- Content length: 150000 OK -- ID: 48 -- Status: "200" -- Content length: 150000 OK -- ID: 45 -- Status: "200" -- Content length: 150000 OK -- ID: 46 -- Status: "200" -- Content length: 150000 OK -- ID: 10 -- Status: "200" -- Content length: 150000 OK -- ID: 09 -- Status: "200" -- Content length: 150000 OK -- ID: 19 -- Status: "200" -- Content length: 150000 OK -- ID: 13 -- Status: "200" -- Content length: 150000 OK -- ID: 21 -- Status: "200" -- Content length: 150000 OK -- ID: 16 -- Status: "200" -- Content length: 150000 OK -- ID: 27 -- Status: "200" -- Content length: 150000 OK -- ID: 03 -- Status: "200" -- Content length: 150000 OK -- ID: 23 -- Status: "200" -- Content length: 150000 OK -- ID: 29 -- Status: "200" -- Content length: 150000 OK -- ID: 14 -- Status: "200" -- Content length: 150000 OK -- ID: 18 -- Status: "200" -- Content length: 150000 OK -- ID: 01 -- Status: "200" -- Content length: 150000 OK -- ID: 30 -- Status: "200" -- Content length: 150000 OK -- ID: 40 -- Status: "200" -- Content length: 150000 OK -- ID: 05 -- Status: "200" -- Content length: 150000 Update: thanks, stemm, for the hint about wait_workers. I've combined your code and mine, but the behaviour is the same :( -module(crawler). -define(BASE_URL, "http://46.4.117.69/"). -export([start/0, send_reqs/0, do_send_req/2]). start() -> ibrowse:start(), proc_lib:spawn(?MODULE, send_reqs, []). to_url(Id) -> ?BASE_URL ++ integer_to_list(Id). fetch_ids() -> lists:seq(1, 50). send_reqs() -> spawn_workers(fetch_ids()). spawn_workers(Ids) -> %% collect reference to each worker Refs = [ do_spawn(Id) || Id <- Ids ], %% wait for response from each worker wait_workers(Refs). wait_workers(Refs) -> lists:foreach(fun receive_by_ref/1, Refs). receive_by_ref(Ref) -> %% receive message only from worker with specific reference receive {Ref, done} -> done end. do_spawn(Id) -> Ref = make_ref(), proc_lib:spawn_link(?MODULE, do_send_req, [Id, {self(), Ref}]), Ref. do_send_req(Id, {Pid, Ref}) -> io:format("Requesting ID ~p ... ~n", [Id]), Result = (catch ibrowse:send_req(to_url(Id), [], get, [], [], 10000)), case Result of {ok, Status, _H, B} -> io:format("OK -- ID: ~2..0w -- Status: ~p -- Content length: ~p~n", [Id, Status, length(B)]), %% send message that work is done Pid ! {Ref, done}; Err -> io:format("ERROR -- ID: ~p -- Error: ~p~n", [Id, Err]), %% repeat request if there was error while fetching a page, do_send_req(Id, {Pid, Ref}) %% or - if you don't want to repeat request, put there: %% Pid ! {Ref, done} end.
Running the crawler works fine for a handful of files, but then the code doesn't even fetch entire files (each file is 150000 bytes); the crawler fetches some files only partially, as the following web server log shows :( 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /10 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /1 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /3 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /8 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /39 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /7 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /6 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /2 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /5 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /50 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /9 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /44 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /38 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /47 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /49 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /43 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /37 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /46 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /48 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:00 +0200] "GET /36 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /42 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /41 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /45 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /17 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /35 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /16 HTTP/1.1" 200 150000 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /15 HTTP/1.1" 200 17020 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /21 HTTP/1.1" 200 120360 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /40 HTTP/1.1" 200 117600 "-" "-" 82.114.62.14 - - [13/Sep/2012:15:17:01 +0200] "GET /34 HTTP/1.1" 200 60660 "-" "-" Any hints are welcome. I have no clue what's going wrong there :(
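    A hedged guess at both symptoms: ibrowse caps concurrency per host (max_sessions and max_pipeline_size default to a small number, 10 in the versions I've seen), so 50 simultaneous workers queue up, and the 10000 ms timeout can then cut requests off mid-body, which would explain the truncated transfers in the log. Raising the per-host limits before spawning, as in this sketch, would test that theory:

      %% Sketch: lift ibrowse's per-host limits before spawning workers
      send_reqs() ->
          ibrowse:set_max_sessions("46.4.117.69", 80, 50),
          ibrowse:set_max_pipeline_size("46.4.117.69", 80, 50),
          spawn_workers(fetch_ids()).

    Alternatively, keep the defaults and raise the timeout, or spawn the workers in batches of 10.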

    Read the article

  • Save a form in an XML file using Ajax and JSP

    - by novellino
    Hello, I want to create a simple form with a name and an email and save these data in an XML file. So far I have found that using Ajax with jQuery is quite easy, so I used the usual code: //dataString has the values taken from the form var dataString = 'name='+ name + '&email=' + email; $.ajax({ type: "POST", url: "users.xml", data: dataString, dataType: "xml", success: function() { .... } }); If I understood correctly, the url should contain the name of the XML file that will be created. When the user clicks a button I call the function with the Ajax request, and then I should call somewhere a function for generating the XML. I am also using two beans: one for setting the elements of the user, and one for saving the data in the XML. I am using the XStream library for the XML, although I don't know if it is the best solution. The problem now is that I cannot connect all these together in order to save the data in the XML. Does anyone know what I should do? Thanks a lot!
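    Two hedged notes on the wiring, since that is the missing piece: the url in $.ajax must point at a JSP or servlet that performs the save (POSTing to users.xml itself stores nothing), and on the server the XStream part is short. A sketch, assuming a User bean with name and email:

      import com.thoughtworks.xstream.XStream;
      import java.io.FileWriter;

      // Sketch: inside the save servlet's doPost(), with a hypothetical User bean
      XStream xstream = new XStream();
      xstream.alias("user", User.class);
      User user = new User(request.getParameter("name"),
                           request.getParameter("email"));
      FileWriter out = new FileWriter("users.xml");   // overwrites; merging is left out
      xstream.toXML(user, out);
      out.close();

    The success callback in $.ajax then just reports what the servlet returned; dataType should match that response ("text" or "xml").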

    Read the article

  • Images in database vs file system

    - by Jesse
    We have a project coming up where we will be building a whole backend CMS system that will power our entire extranet and intranet with one package. The question I have been trying to find an answer to is which is better: storing images in the database (SQL Server 2005), so we may have integrity, a single replication plan, etc., OR storing them on the file system? One issue we have is that we have multiple load-balanced servers that are required to have the same data at all times. As of now we have SQL replication taking care of that, but file replication seems to be a little tougher. Another concern is that we would like to have multiple resolutions of the same image; we are not sure whether creating and storing each version on the file system would be best, or dynamically creating the requested resolution upon request. Our concerns are with the following: data integrity, data replication, multiple resolutions, speed of database vs. file system, overhead load of database vs. file system, and data management and backup. Does anyone have a similar situation or any input on what would be recommended? Thanks in advance for the help!

    Read the article

  • Django: Serving Media Behind Custom URL

    - by TheLizardKing
    So I of course know that serving static files through Django will send you straight to hell, but I am confused about how to use a custom URL to mask the true location of the file using Django. I asked http://stackoverflow.com/questions/2681338/django-serving-a-download-in-a-generic-view, but the answer I accepted seems to be the "wrong" way of doing things. urls.py: url(r'^song/(?P<song_id>\d+)/download/$', song_download, name='song_download'), views.py: def song_download(request, song_id): song = Song.objects.get(id=song_id) fsock = open(os.path.join(song.path, song.filename)) response = HttpResponse(fsock, mimetype='audio/mpeg') response['Content-Disposition'] = "attachment; filename=%s - %s.mp3" % (song.artist, song.title) return response This solution works perfectly, but not perfectly enough, it turns out. How can I avoid having a direct link to the mp3 while still serving through nginx/apache? EDIT 1 - ADDITIONAL INFO: Currently I can get my files by using an address such as: http://www.example.com/music/song/1692/download/ But the above-mentioned method is the devil's work. How can I accomplish what I get above while still making nginx/apache serve the media? Is this something that should be done at the web server level? Some crazy mod_rewrite? http://static.example.com/music/Aphex%20Twin%20-%20Richard%20D.%20James%20(V0)/10%20Logon-Rock%20Witch.mp3
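    The pattern usually recommended for exactly this, sketched under the assumption of nginx in front: let the Django view do the permission check, then hand the actual file streaming back to the web server via an X-Accel-Redirect header aimed at an internal location (mod_xsendfile's X-Sendfile is the Apache equivalent). The direct static URL stops working because the location is marked internal.

      # nginx side (sketch): location /protected/ { internal; alias /path/to/music/; }
      def song_download(request, song_id):
          song = Song.objects.get(id=song_id)
          response = HttpResponse(mimetype='audio/mpeg')
          response['Content-Disposition'] = "attachment; filename=%s - %s.mp3" % (song.artist, song.title)
          response['X-Accel-Redirect'] = '/protected/%s' % song.filename
          return response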

    Read the article

  • problem in accessing the path involving pagination using display tag

    - by sarah
    Hi all, I am using the display tag for pagination and display of the data in table format. The code is like: <display:column title="Select" style="width: 90px;"> <input type="checkbox" name="optionSelected" value="<c:out value='${userList.loginName}'/>"/> </display:column> <display:column property="loginName" sortable="false" title="UserName" paramId="loginName" style="width: 150px; text-align:center" href="./editUser.do?method=editUser"/> The pagesize is one, so it displays one row per page, and the edit page is displayed on click of the login name. But when I go to the second page I get an error saying /views/editUser path not found. How exactly should I define the path? The struts-config is like: <!-- action for edit user --> <action path="/editUser" name="editUserForm" type="com.actions.UserManagementAction" parameter="method" input="/EditUser.jsp" scope="request"> <forward name="success" path="/views/EditUser.jsp" /> <forward name="failure" path="/views/failure.jsp" /> </action> Please tell me how I should define the access path for the display tag, as on click of the edit link the first page works but the second does not.
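    A sketch of the usual cure, assuming the problem is that links on page 2 are resolved relative to the current URL (note the column href above starts with "./"): pin the table's own pagination links with the requestURI attribute and make the column href context-absolute (/listUsers.do is a hypothetical name for the action that renders this list):

      <display:table name="userList" pagesize="1" requestURI="/listUsers.do">
        <display:column title="Select" style="width: 90px;">
          <input type="checkbox" name="optionSelected"
                 value="<c:out value='${userList.loginName}'/>"/>
        </display:column>
        <display:column property="loginName" sortable="false" title="UserName"
                        paramId="loginName"
                        href="${pageContext.request.contextPath}/editUser.do?method=editUser"/>
      </display:table>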

    Read the article

  • Pattern for limiting number of simultaneous asynchronous calls

    - by hitch
    I need to retrieve multiple objects from an external system. The external system supports multiple simultaneous requests (i.e. threads), but it is possible to flood the external system - therefore I want to be able to retrieve multiple objects asynchronously, but I want to be able to throttle the number of simultaneous async requests. i.e. I need to retrieve 100 items, but don't want to be retrieving more than 25 of them at once. When each request of the 25 completes, I want to trigger another retrieval, and once they are all complete I want to return all of the results in the order they were requested (i.e. there is no point returning the results until the entire call is returned). Are there any recommended patterns for this sort of thing? Would something like this be appropriate (pseudocode, obviously)? private List<externalSystemObjects> returnedObjects = new List<externalSystemObjects>; public List<externalSystemObjects> GetObjects(List<string> ids) { int callCount = 0; int maxCallCount = 25; WaitHandle[] handles; foreach(id in itemIds to get) { if(callCount < maxCallCount) { WaitHandle handle = executeCall(id, callback); addWaitHandleToWaitArray(handle) } else { int returnedCallId = WaitHandle.WaitAny(handles); removeReturnedCallFromWaitHandles(handles); } } WaitHandle.WaitAll(handles); return returnedObjects; } public void callback(object result) { returnedObjects.Add(result); }
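    Roughly that shape works; a sketch with a counted semaphore keeps the bookkeeping simpler than juggling WaitHandle arrays (ExecuteCall and ExternalSystemObject are hypothetical names). Writing each result into a slot indexed by request order preserves the ordering for free:

      // Sketch: at most 25 calls in flight, results returned in request order
      // (requires System.Threading: Semaphore, CountdownEvent, ThreadPool)
      private ExternalSystemObject[] GetObjects(List<string> ids)
      {
          var results = new ExternalSystemObject[ids.Count];
          var throttle = new Semaphore(25, 25);
          var done = new CountdownEvent(ids.Count);   // .NET 4

          for (int i = 0; i < ids.Count; i++)
          {
              int slot = i;   // capture a fresh copy per iteration
              ThreadPool.QueueUserWorkItem(_ =>
              {
                  throttle.WaitOne();
                  try { results[slot] = ExecuteCall(ids[slot]); }
                  finally { throttle.Release(); done.Signal(); }
              });
          }

          done.Wait();   // block until all requests are back, then return in order
          return results;
      }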

    Read the article

  • Is Software Engineering Dead? [closed]

    - by nik
    Right from Jeff's blog, Software Engineering: Dead?: "I was utterly floored when I read this new IEEE article by Tom DeMarco (pdf). See if you can tell why." He quotes DeMarco: "I'm gradually coming to the conclusion that software engineering is an idea whose time has come and gone". Further: "What DeMarco seems to be saying -- and, at least, what I am definitely saying -- is that control is ultimately illusory on software development projects." I am quoting these lines without further context to encourage reading of the original piece. What are the views of the programming community here? I have started to realize that a community wiki does not get the right amount of participation here. That is the reason I left this question out in the open, while still contemplating a change to CW. It was closed once, and I thought that was the end of it. But now I see it was reopened and has more answers (all of which I have not yet read). However, I see a lot of CW requests and am forced to reconsider. This is how I intend to make the CW decision here: there is a comment by Neil Butterworth requesting CW, at 12 upvotes ("should be community wiki"); there is a comment by Lance Roberts requesting no CW, at 0 upvotes ("+1 for not putting it in community wiki"). The difference is 12 in favour of CW at the moment. If this difference grows by 5 more (that is, to 17), I'll move this question to CW, and it will not return from there. Of course, there is also a close vote at the moment; the question may be closed again.

    Read the article

  • How to send a JSONObject to a REST service?

    - by Sebi
    Retrieving data from the REST server works well, but if I want to POST an object it doesn't work: public static void postJSONObject(int store_type, FavoriteItem favorite, String token, String objectName) { String url = ""; switch(store_type) { case STORE_PROJECT: url = URL_STORE_PROJECT_PART1 + token + URL_STORE_PROJECT_PART2; //data = favorite.getAsJSONObject(); break; } HttpClient httpClient = new DefaultHttpClient(); HttpPost postMethod = new HttpPost(url); try { HttpEntity entity = new StringEntity("{\"ID\":0,\"Name\":\"Mein Projekt10\"}"); postMethod.setEntity(entity); HttpResponse response = httpClient.execute(postMethod); Log.i("JSONStore", "Post request, to URL: " + url); System.out.println("Status code: " + response.getStatusLine().getStatusCode()); } catch (ClientProtocolException e) { I always get a 400 error code. Does anybody know what's wrong? I have working C# code, but I can't convert it: System.Net.WebRequest wr = System.Net.HttpWebRequest.Create("http://localhost:51273/WSUser.svc/pak3omxtEuLrzHSUSbQP/project"); wr.Method = "POST"; string data = "{\"ID\":1,\"Name\":\"Mein Projekt\"}"; byte [] d = UTF8Encoding.UTF8.GetBytes(data); wr.ContentLength = d.Length; wr.ContentType = "application/json"; wr.GetRequestStream().Write(d, 0, d.Length); System.Net.WebResponse wresp = wr.GetResponse(); System.IO.StreamReader sr = new System.IO.StreamReader(wresp.GetResponseStream()); string line = sr.ReadToEnd();
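    One concrete difference between the two versions, offered as the likely fix: the C# code sets wr.ContentType = "application/json" while the Java code never declares a content type, so the service may be rejecting the body with 400. A sketch of the Java side:

      // Sketch: declare the JSON content type, as the working C# code does
      StringEntity entity = new StringEntity("{\"ID\":1,\"Name\":\"Mein Projekt\"}", "UTF-8");
      entity.setContentType("application/json");
      postMethod.setEntity(entity);
      HttpResponse response = httpClient.execute(postMethod);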

    Read the article

  • oData/ADO.NET Data Services using LINQ-to-SQL with a decryption layer

    - by Program.X
    I have written an application using LINQ-to-SQL that submits a web form into a database. I abstract the LINQ-to-SQL away using a repository pattern. This repository has the basic methods: Get(), Save(), etc. As the project developed, I needed to encrypt certain fields in the form. This was trivial, as I just added the encryption calls to the Get() and Save() methods in the repository. Now I want to put an oData layer over it, to allow RESTful extraction from MS Excel 2010 (when it comes out). I have this working, after a few stumbles on useless error messages, etc. However, obviously, those encrypted fields are still encrypted; my repository pattern would have decrypted them for me. As far as I know, I have to bind my oData service directly to the LINQ-to-SQL context for the schema, etc. to work - unless I enter a whole world of pain (any URLs appreciated). Is there a way I can insert my encryption/decryption layer into the request so decryption is done "on the fly"? I looked at the OnStartProcessingRequest() overload of DataService, but it doesn't seem that useful.
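    One hedged alternative to binding the service straight to the LINQ-to-SQL context: point the DataService at a thin class whose IQueryable properties pull from the repository, so decryption happens before serialization (the reflection provider builds the schema from these properties). All names below are hypothetical, and the provider needs an explicit key attribute:

      // Sketch: serve already-decrypted DTOs through the reflection provider
      [DataServiceKey("Id")]
      public class FormEntry
      {
          public int Id { get; set; }
          public string Field { get; set; }   // decrypted by the repository
      }

      public class DecryptedContext
      {
          private readonly Repository repository = new Repository();  // hypothetical

          public IQueryable<FormEntry> FormEntries
          {
              get { return repository.Get().AsQueryable(); }          // Get() decrypts
          }
      }

      public class FormService : DataService<DecryptedContext>
      {
          public static void InitializeService(IDataServiceConfiguration config)
          {
              config.SetEntitySetAccessRule("FormEntries", EntitySetRights.AllRead);
          }
      }

    The trade-off is that updates and efficient server-side filtering need extra work compared with the native LINQ-to-SQL binding.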

    Read the article

  • With regards to urllib AttributeError: 'module' object has no attribute 'urlopen'

    - by Matt
    import re import string import shutil import os import os.path import time import datetime import math import urllib from array import array import random filehandle = urllib.urlopen('http://www.google.com/') #open webpage s = filehandle.read() #read print s #display #what i plan to do with it once i get the first part working #results = re.findall('[<td style="font-weight:bold;" nowrap>$][0-9][0-9][0-9][.][0-9][0-9][</td></tr></tfoot></table>]',s) #earnings = '$ ' #for money in results: #earnings = earnings + money[1]+money[2]+money[3]+'.'+money[5]+money[6] #print earnings #raw_input() This is the code that I have so far. Now, I have looked at all the other forums that give solutions, such as checking the name of the script (which is parse_Money.py), and I have tried doing it with urllib.request.urlopen, AND I have tried running it on Python 2.5, 2.6, and 2.7. If anybody has any suggestions, they would be really welcome. Thanks, everyone!! --Matt ---EDIT--- I also tried this code and it worked, so I'm thinking it's some kind of syntax error; if anybody with a sharp eye can point it out, I would be very appreciative. import shutil import os import os.path import time import datetime import math import urllib from array import array import random b = 3 #find URL URL = raw_input('Type the URL you would like to read from[Example: http://www.google.com/] :') while b == 3: #get file name file1 = raw_input('Enter a file name for the downloaded code:') filepath = file1 + '.txt' if os.path.isfile(filepath): print 'File already exists' b = 3 else: print 'Filename accepted' b = 4 file_path = filepath #open file FileWrite = open(file_path, 'a') #acces URL filehandle = urllib.urlopen(URL) #display souce code for lines in filehandle.readlines(): FileWrite.write(lines) print lines print 'The above has been saved in both a text and html file' #close files filehandle.close() FileWrite.close()
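    Two things worth checking, offered as guesses: a file named urllib.py (or a leftover urllib.pyc) in the same directory will shadow the standard library and produce exactly this AttributeError, and running under Python 3 moves urlopen to urllib.request. An import that tolerates both interpreter versions:

      # Sketch: version-proof import, plus a quick shadowing check
      try:
          from urllib.request import urlopen   # Python 3
      except ImportError:
          from urllib import urlopen           # Python 2

      s = urlopen('http://www.google.com/').read()
      print(s)

    Printing urllib.__file__ right after the import will reveal whether a local file is being picked up instead of the standard library.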

    Read the article

  • share the same cookie between two website using PHP cURL extension

    - by powerboy
    I want to get the contents of some emails in my Gmail account. I would like to use the PHP cURL extension to do this. I followed these steps in my first try: 1) in the PHP code, output the contents of https://www.google.com/accounts/ServiceLoginAuth; 2) in the browser, the user inputs username and password to log in; 3) in the PHP code, save cookies in a file named cookie.txt; 4) in the PHP code, send a request to https://mail.google.com/ along with the cookies retrieved from cookie.txt and output the contents. The following code does not work: $login_url = 'https://www.google.com/accounts/ServiceLoginAuth'; $gmail_url = 'https://mail.google.com/'; $cookie_file = dirname(__FILE__) . '/cookie.txt'; $ch = curl_init(); curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); curl_setopt($ch, CURLOPT_HEADER, false); curl_setopt($ch, CURLOPT_POST, true); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie_file); curl_setopt($ch, CURLOPT_URL, $login_url); $output = curl_exec($ch); echo $output; curl_setopt($ch, CURLOPT_URL, $gmail_url); curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie_file); $output = curl_exec($ch); echo $output; curl_close($ch);
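    A hedged structural note: step 2 can never feed step 3, because the user logs in with their browser while the cookie jar belongs to the server-side cURL handle; the two never share cookies. For the jar to fill, the login POST has to come from cURL itself, sketched here with hypothetical form field names and server-side credentials:

      // Sketch: cURL performs the login so the jar receives the session cookies
      curl_setopt($ch, CURLOPT_URL, $login_url);
      curl_setopt($ch, CURLOPT_POST, true);
      curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
          'Email'  => $username,   // hypothetical field names from the login form
          'Passwd' => $password,
      )));
      curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie_file);
      curl_exec($ch);              // the jar now holds the session cookies

    For Gmail specifically, IMAP over SSL (or an official API) is a far more robust route than screen-scraping the login flow.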

    Read the article

  • Qt Socket blocking functions required to run in QThread where created. Any way past this?

    - by Alexander Kondratskiy
    The title is very cryptic, so here goes! I am writing a client that behaves in a very synchronous manner. Due to the design of the protocol and the server, everything has to happen sequentially (send request, wait for reply, service reply etc.), so I am using blocking sockets. Here is where Qt comes in. In my application I have a GUI thread, a command processing thread and a scripting engine thread. I create the QTcpSocket in the command processing thread, as part of my Client class. The Client class has various methods that boil down to writing to the socket, reading back a specific number of bytes, and returning a result. The problem comes when I try to directly call Client methods from the scripting engine thread. The Qt sockets randomly time out and when using a debug build of Qt, I get these warnings: QSocketNotifier: socket notifiers cannot be enabled from another thread QSocketNotifier: socket notifiers cannot be disabled from another thread Anytime I call these methods from the command processing thread (where Client was created), I do not get these problems. To simply phrase the situation: Calling blocking functions of QAbstractSocket, like waitForReadyRead(), from a thread other than the one where the socket was created (dynamically allocated), causes random behaviour and debug asserts/warnings. Anyone else experienced this? Ways around it? Thanks in advance.
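    The standard way around this, sketched under the assumption that Client's methods can be made slots or Q_INVOKABLE: marshal each call onto the thread that owns the socket with a blocking queued invocation, so the scripting thread still sees a synchronous call while the socket is only ever touched by its owning thread.

      // Sketch: 'client' lives in the command-processing thread;
      // 'payload' stands in for the request bytes
      QByteArray reply;
      QMetaObject::invokeMethod(client, "sendRequest",          // an invokable Client method
                                Qt::BlockingQueuedConnection,   // waits for completion
                                Q_RETURN_ARG(QByteArray, reply),
                                Q_ARG(QByteArray, payload));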

    Read the article

  • ASP.Net Response Filter Causing SharePoint 2010 "Unexpected Error"

    - by Jason Weber
    Hello everyone, I'm debugging an HttpModule with an ASP.NET response filter. This dynamically rewrites portions of rendered SharePoint WCM pages. The publishing pages render fine in SP2007 on both Server 2003 and Server 2008. However the equivalent pages fail to render in SP2010 B2 on Server 2008 R2. The generic "An unexpected error has occurred message" page is displayed. This error only happens when the response filter is applied to an .aspx page. Other page types, such as .css, render fine on this platform. This error also happens when the response filter does not modify the page at all (pure pass-through). This KB article seems very closely related: http://support.microsoft.com/kb/2014472. However, this same error occurs with caching disabled. I see no related entries in any of the following: ULS for SP, Event Log, Failed Request Tracing (IIS7). Running under the debugger suggests that the custom code is not raising any exceptions. Any help or insight would be greatly appreciated.

    Read the article

  • How can I display an ASP.NET MVC html part from one application in another

    - by Frank Sessions
    We have several ASP.NET MVC apps in the following setup: SecurityApp (root application - handles forms auth for SSO and has a profile edit page), Application1 (virtual directory), Application2 (virtual directory), Application3 (virtual directory), so that domain.com points to SecurityApp and domain.com/Application1 etc. point to their associated virtual directories. All of our Single Sign On (SSO) is working properly using forms authentication. Based on the user's permissions when logging in, a menu that lists their available applications and a logout link is generated and saved in the cache. This menu displays fine whenever the user is in the SecurityApp (editing their profile), but we cannot figure out how to get the applications in the virtual directories to display the same application menu. We have tried: 1) using JSONP to make a request that returns the HTML for the menu - the Ajax call returns, but because User.IsAuthenticated is false, the menu comes back empty; 2) creating a user control and including it along with the DLLs for the SecurityApp project - this works, but we don't want to have to include all the DLLs for the SecurityApp project in every application that we create (along with all the app settings in the web.config). We would like this to be as simple as possible to implement, so that anyone creating a new app can add the menu to their application in as few steps as possible... Any ideas?
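    One hedged variant of option 1 that keeps the user identity: each application fetches the menu server-side and forwards the visitor's forms-auth cookie, so SecurityApp sees an authenticated request without any shared DLLs (the /Menu/Render endpoint name is hypothetical):

      // Sketch: fetch the menu from SecurityApp on the user's behalf
      var menuRequest = (HttpWebRequest)WebRequest.Create("http://domain.com/Menu/Render");
      var auth = HttpContext.Current.Request.Cookies[FormsAuthentication.FormsCookieName];
      menuRequest.CookieContainer = new CookieContainer();
      menuRequest.CookieContainer.Add(new Cookie(auth.Name, auth.Value, "/", "domain.com"));

      using (var response = menuRequest.GetResponse())
      using (var reader = new StreamReader(response.GetResponseStream()))
      {
          string menuHtml = reader.ReadToEnd();   // cache per user, as SecurityApp does
      }

    If the forms-auth cookie is already shared across the applications (as the working SSO suggests), forwarding it server-side avoids whatever stripped it from the JSONP call.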

    Read the article

  • C# Byte[] to Url Friendly String

    - by LorenVS
    Hello, I'm working on a quick captcha generator for a simple site I'm putting together, and I'm hoping to pass an encrypted key in the URL of the page. I could probably do this as a query string parameter easily enough, but I'm hoping not to (just because nothing else runs off the query string)... My encryption code produces a byte[], which is then transformed using Convert.ToBase64String(byte[]) into a string. This string, however, is still not quite URL friendly, as it can contain things like '/' and '='. Does anyone know of a better function in the .NET framework to convert a byte array to a URL-friendly string? I know all about System.Web.HttpUtility.UrlEncode() and its equivalents; however, they only work properly with query string parameters. If I URL-encode an '=' inside of the path, my web server returns a 400 Bad Request error. Anyway, not a critical issue, but I'm hoping someone can give me a nice solution. **EDIT: Just to be absolutely sure exactly what I'm doing with the string, I figured I would supply a little more information. The byte[] that results from my encryption algorithm should be fed through some sort of algorithm to make it into a URL-friendly string. After this, it becomes the content of an XElement, which is then used as the source document for an XSLT transformation, and is used as part of the href attribute for an anchor. I don't believe the XSLT transformation is causing the issues, since what is coming through on the path appears to be an encoded query string parameter, but it causes the HTTP 400. I've also tried HttpUtility.UrlPathEncode() on a base64 string, but that doesn't seem to do the trick either (I still end up with '/'s in my URL).**
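    Two options, hedged: System.Web's HttpServerUtility.UrlTokenEncode(byte[]) does essentially this out of the box, or you can roll the common "base64url" variant by hand - swap the two troublesome characters and drop the padding, which is recoverable from the length on decode. A sketch of the latter:

      // Sketch: base64url encoding - safe inside a URL path segment
      public static string ToBase64Url(byte[] data)
      {
          return Convert.ToBase64String(data)
              .TrimEnd('=')          // padding is implied by the length
              .Replace('+', '-')     // URL-safe alphabet
              .Replace('/', '_');
      }

      public static byte[] FromBase64Url(string s)
      {
          string padded = s.Replace('-', '+').Replace('_', '/');
          switch (padded.Length % 4)
          {
              case 2: padded += "=="; break;
              case 3: padded += "="; break;
          }
          return Convert.FromBase64String(padded);
      }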

    Read the article

  • PHP Json Encoding w/ quote escaping in 5.2?

    - by NickAldwin
    I'm playing with the Flickr API and PHP. I want to pass some information from PHP to JavaScript through Ajax. I have the following code: json_encode($pics); which results in the following example JSON string: [{"id":"4363603591","title":"blue, white and red...another seattle view","date_faved":"1266379499"},{"id":"4004908219","title":"\u201cI just told you my dreams and you made me see that I could walk into the sun and I could still be me and now I can't deny nothing lasts forever.\u201d","date_faved":"1259987670"}] JavaScript has problems with this, however, due to the unescaped single quote in the second item ("can't deny"). I want to use the json_encode function with the options parameter to make it escape the quotes, but that's only available in PHP 5.3, and I'm running 5.2 (not my server). Is there a fast way to run through the entire array and escape everything before encoding it as JSON? I looked for a way to do this, but everything seems to deal with escaping the data as it is generated, something I cannot do as I'm not the one generating the data. If it helps, I'm currently using the following JavaScript after the Ajax request: var photos = eval('(' + resptxt + ')');
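    A sketch of a 5.2-friendly workaround, assuming the apostrophe really is the offender: apply the substitution that 5.3's JSON_HEX_APOS flag would make, on the already-encoded text (in valid JSON a quote can only occur inside a string literal, so the blanket replace is safe):

      // Sketch: emulate JSON_HEX_APOS on PHP 5.2
      $json = json_encode($pics);
      $json = str_replace("'", '\u0027', $json);
      echo $json;

    Letting jQuery parse the response (dataType: "json" in $.ajax) instead of hand-rolled eval would likely sidestep the problem entirely.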

    Read the article
