Search Results

Search found 50062 results on 2003 pages for 'http 1 1'.


  • Problematic tags displaying HTML in Java

    - by Andez
    Hi again, I have the following tag in my HTML which causes the JEditorPane not to show the HTML output: <META http-equiv="Content-Type" content="text/html; charset=UTF-8"> Not a big deal, but does anyone know why this happens? I cannot find much documentation on this - the best I came up with was someone having a go a few years ago, on a retro virus site, to see what was supported. At the minute I'm doing a simple find-and-replace on the string, which is not good - are there any better ways? I've seen a lot of people saying that regex is no good. The code I have used is: this._html = this._html.replace("<META http-equiv=\"Content-Type\" content=\"text/html; charset=UTF-8\">", ""); Andez
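    A hedged alternative to the exact-string replace above: strip the Content-Type meta tag with a case-insensitive pattern before handing the HTML to the component. The sketch below is Python purely to illustrate the pattern; the same expression works in Java via java.util.regex.Pattern compiled with CASE_INSENSITIVE, and the function name is made up for the example.

        import re

        # Case-insensitive removal of a Content-Type <META> tag (attribute quoting may vary).
        META_CONTENT_TYPE = re.compile(
            r'<meta\s+http-equiv=["\']content-type["\'][^>]*>',
            re.IGNORECASE,
        )

        def strip_content_type_meta(html):
            """Return the HTML with any Content-Type meta tag removed."""
            return META_CONTENT_TYPE.sub("", html)

        print(strip_content_type_meta(
            '<META http-equiv="Content-Type" content="text/html; charset=UTF-8"><p>hi</p>'
        ))  # -> <p>hi</p>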

    Read the article

  • fullscreen map not displayed correctly

    - by user1747168
    I want the map to open full-screen. I've tried this: <div class="b-firm-map-content" id="map"></div> <a href="#" onclick="test2();return false;">full screen</a> function test2(){ var width = $(window).width()-3; var height = $(window).height(); $('#map').css({ 'width': width, 'height': height - 40 , 'position': 'absolute ', 'z-index' : '900' }); } but it results in: http://pixs.ru/showimage/Snimokpng_5811285_6065704.png http://pixs.ru/showimage/Snimok1png_4265065_6065745.png The map is not completely displayed.

    Read the article

  • php string search - grabbing specific urls

    - by MEM
    Hello, I have a string that may contain some URLs that I need to grab. For instance, if the user types www.youtube ... or www.vimeo ... or http://www.youtube ... or HttP://WwW.viMeo, I need to grab it (up to the next space, perhaps) and store it in an already created array. The goal is to separate the Vimeo links from the YouTube ones and place each of them in the appropriate video object. I'm not sure if this is possible, I mean, whether the URL coming from the browser could be placed in a predefined video object. If it is, then this is the way to go (so I believe). If all this is feasible, can I have your help building such a rule? Thanks in advance
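    A minimal sketch of such a rule, written in Python only to illustrate the pattern (in PHP, preg_match_all with the same case-insensitive expression applies): match an optional scheme, then www.youtube or www.vimeo, take everything up to the next whitespace, and bucket the result by host. The names and sample strings are made up.

        import re

        # Case-insensitive: optional http(s)://, www.youtube or www.vimeo, then up to whitespace.
        VIDEO_URL = re.compile(r'(?:https?://)?www\.(youtube|vimeo)\S*', re.IGNORECASE)

        def collect_video_links(text):
            """Return {'youtube': [...], 'vimeo': [...]} for links found in the text."""
            links = {"youtube": [], "vimeo": []}
            for match in VIDEO_URL.finditer(text):
                links[match.group(1).lower()].append(match.group(0))
            return links

        print(collect_video_links("see HttP://WwW.viMeo.com/123 and www.youtube.com/watch?v=abc"))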

    Read the article

  • Delay and Inconsistent results using Twitter search API when using "since_id" parameter.

    - by benr
    Hi. We've noticed what seems to be a delay and/or inconsistent results when using the Twitter Search API with a since_id parameter. For example: http://search.twitter.com/search?ors=%23b4esummit+@b4esummit+b4esummit&q=&result_type=recent&rpp=100&show_user=true&since_id= will give the most recent tweets, but http://search.twitter.com/search?ors=%23b4esummit+@b4esummit+b4esummit&q=&result_type=recent&rpp=100&show_user=true&since_id=12642940173 will often not return tweets newer than that ID for several hours (even though they're visible in the first query)... anyone have similar problems?
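    One way to quantify the lag is to poll the same query with and without since_id and diff the returned IDs. The sketch below is generic Python using the requests library against a hypothetical JSON search endpoint that returns {"results": [{"id": ...}, ...]}; the legacy search.twitter.com endpoint in the question has since been retired, so treat the URL and response shape as placeholders.

        import requests  # third-party: pip install requests

        # Hypothetical endpoint and query parameters, for illustration only.
        SEARCH_URL = "http://example.com/search.json"
        QUERY = {"q": "b4esummit", "rpp": 100}

        def result_ids(since_id=None):
            params = dict(QUERY)
            if since_id is not None:
                params["since_id"] = since_id
            response = requests.get(SEARCH_URL, params=params, timeout=10)
            response.raise_for_status()
            return {item["id"] for item in response.json()["results"]}

        since_id = 12642940173
        missing = {i for i in result_ids() if i > since_id} - result_ids(since_id)
        print("IDs newer than since_id but missing from the filtered query:", missing)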

    Read the article

  • Automatically Host User Domains in Rails/Apache

    - by Steve F
    Hi, I'm currently developing a user-facing web application that gives each new user their own subdomain on the site, which is fine (using subdomain_fu), but is there a way to let a user map their own domain to this subdomain? I know how to do this manually by SSH-ing into the server and editing the Apache vhosts file by hand, but is there a way to do it automatically, so that a user simply enters their domain into a box on the site (obviously they'd have to change their own DNS elsewhere)? I'm using Ruby 1.8 and Rails 2.3.3 on top of Apache. Essentially, letting http://user.application.com/article-1 be accessed from http://userdomain.com/article-1 Thanks for any help!
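    One common pattern, offered here only as an assumption about how this is usually handled rather than a Rails-specific answer: keep a single catch-all Apache vhost pointed at the app and resolve the incoming Host header against a table of customer domains at request time, so no vhost file ever needs editing. A rough illustration of the lookup as a tiny Python WSGI app (the domain mapping is hard-coded where a real app would query its database):

        # Hypothetical mapping of customer-owned domains to their subdomain accounts.
        DOMAIN_TO_ACCOUNT = {
            "userdomain.com": "user",
            "www.userdomain.com": "user",
        }

        def application(environ, start_response):
            host = environ.get("HTTP_HOST", "").split(":")[0].lower()
            # Custom domain -> account, otherwise treat the first label as the subdomain account.
            account = DOMAIN_TO_ACCOUNT.get(host) or host.split(".")[0]
            start_response("200 OK", [("Content-Type", "text/plain; charset=utf-8")])
            return [("Serving content for account: %s\n" % account).encode("utf-8")]

        if __name__ == "__main__":
            from wsgiref.simple_server import make_server
            make_server("", 8000, application).serve_forever()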

    Read the article

  • How to create the simplest PHP GET API with UTF-8 support?

    - by Ole Jak
    How do I create the simplest (fewest lines of code, fewest strange words) PHP GET API, so that any program made in .NET C# could call a URL like http://localhost/api.php?astring=your_utf-8_string&bstring=your_utf-8_string, with UTF-8 support? What I need is a PHP API with one function - concatenate 2 strings - so that a simple .NET client like this would be able to use it: public string setStream(string astring, string bstring) { string newAstring = Uri.EscapeDataString(astring); string newBstring = Uri.EscapeDataString(bstring); WebClient client = new WebClient(); var result = client.DownloadString(("http://localhost/api.php?" + string.Format("astring={0}&bstring={1}", newAstring, newBstring)).ToString()); return result; }
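    For reference, the whole contract here is just "read two query parameters, return their concatenation as UTF-8 text". A minimal sketch of that contract using only Python's standard library (not the asker's PHP stack; in PHP it reduces to sending a text/plain; charset=utf-8 header and echoing the two $_GET values), with the port and handler name chosen arbitrarily:

        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import urlparse, parse_qs

        class ConcatHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # parse_qs decodes percent-encoded UTF-8 query values.
                params = parse_qs(urlparse(self.path).query)
                astring = params.get("astring", [""])[0]
                bstring = params.get("bstring", [""])[0]
                body = (astring + bstring).encode("utf-8")
                self.send_response(200)
                self.send_header("Content-Type", "text/plain; charset=utf-8")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            HTTPServer(("localhost", 8000), ConcatHandler).serve_forever()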

    Read the article

  • Need URLs to be non-secure when moving away from a secured link (without hard-coded URLs in HTML)?

    - by Tony_Henrich
    I have an ASP.NET site. It has an order form which is accessible at https://secure.example.com/order.aspx. The links on the site do not include the domain name, so for example the home page link is 'default.aspx'. The issue is that if I click a link like the home page from the secure page, the URL becomes https://secure.example.com/default.aspx instead of http://www.example.com/default.aspx. What's a good way to handle this? The scheme should work automatically with any domain name, based on where the site is launched from. So if the site is launched from 'localhost', moving away from the secured page the URLs should be http://localhost/... The navigation links are in a master page.

    Read the article

  • Scheduler for asp.net ?

    - by user359706
    Is there a schedule control for ASP.NET? What I need: columns display users; rows display months and days. Clicking a cell opens a popup, in which we can select a status in a DropDownList; if the status is "be close", show two calendars (start date and end date) and then apply a color to the selected period. I know I won't find a control that matches this exactly, but I want the component that comes closest - something like http://www.daypilot.org/ or http://www.codeproject.com/KB/webforms/EventCalendarControl.aspx. I hope this is clearly understandable; thank you, your help will be precious to me.

    Read the article

  • problem in playing mp4 video in safari

    - by Codiator
    I have a link to a video file on a server. When I launch Safari through [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"http://mydomain/category/2.mp4"]]; the simulator pops up an alert message saying "Cannot Play MOVIE.. Server is not correctly configured". I also tried using the MediaPlayer framework - MPMoviePlayerController - loading it with the URL "http://mydomain/category/2.mp4", but it didn't play either. When I paste the same URL into Safari on 10.5 Leopard, it plays smoothly. What may be the reason behind this?

    Read the article

  • Entity Framework 4.0 - Code only Reference

    - by joe
    Hello all, I am trying to learn EF 4 and its code-only features. I tried the following great articles and was able to make a sample application. http://blogs.taiga.nl/martijn/2009/11/22/entity-framework-4-0-a-fresh-start-with-demo-application/#reply http://blogs.msdn.com/efdesign/archive/2009/10/12/code-only-further-enhancements.aspx But I am looking for a good reference library / website on the code-only feature. I tried searching MSDN but couldn't find one. Please help. Thanks a lot.

    Read the article

  • How to remove "index.php?" from HTACCESS [duplicate]

    - by Francis Goris
    This question already has an answer here: Reference: mod_rewrite, URL rewriting and "pretty links" explained (2 answers). I have a URL like this: www.site.com/index.php?/genero/aventura/av/ But I would like this to be my new URL: site.com/genero/aventura/av/ I used the following code:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www.site.com/$ [NC]
        RewriteRule ^index.php\?/(.*)$ site.com/$1 [R=301,L]
        </IfModule>

    but it only gives me: site.com/index.php?/genero/aventura/av/ This is my latest and full version:

        RewriteEngine on
        #RewriteCond $1 !^(index\.php|ver_capitulo\.html|google3436eb8eea8b8d6e\.html|BingSiteAuth\.xml |portadas|public|mp3|css|favicon\.ico|js|plantilla|i|swf|plugins|player\.swf|robots\.txt)
        RewriteCond $1 !^(index\.php|public|css|js|i|feed|portadas|robots\.txt|BingSiteAuth\.xml|plugins|i|mp3|favicon\.ico|pluginslist\.xml|google3436eb8eea8b8d6e\.html)
        RewriteRule ^(.*)$ /index.php?/$1 [L]
        #DirectoryIndex index.php
        #RewriteCond %{THE_REQUEST} http://www.page.com/index\.php [NC]
        #RewriteRule ^(.*?)index\.php$ http://page.com/$1 [L,R=301,NC,NE]
        #DirectoryIndex index.php
        #RewriteEngine On

    Thanks for reading.

    Read the article

  • [Jquery] Help with animation

    - by Pennywise83
    Hi there, I've used the "FeatureList" jQuery plugin to make my own content slider. The script can be found here: http://pastebin.com/7iyE5ADu Here is an example image showing what I'm trying to achieve: http://i41.tinypic.com/6jkeq1.jpg Currently the slider adds a "current" class to an item (in the example, the squares 1, 2 and 3) and, for each thumb, shows its content in the main area. In the example, with an interval of 2 seconds, the script switches from 1 to 2, from 2 to 3, and so on. I'd like to make the animation of the thumbs continuous - can anyone help me?

    Read the article

  • Django template CSS/IMG is "off" in the URL

    - by erimar77
    I have /path/to/my/theme/static/css/frontend.css, which is pulled in by base.html: <link rel="stylesheet" type="text/css" href="{{ STATIC_URL }}css/frontend.css" media="all" /> In it I've got a background for the header: #header-wrapper { min-width: 960px; height: 150px; background: transparent url(img/header-bg.png) repeat-x center bottom; } The image file is in /path/to/my/theme/static/img. I've run manage.py collectstatic to gather the files and almost everything looks correct, except that the generated link looks like http://example.com/static/css/img/header-bg.png, at which the image does not show, because the correct URL is http://example.com/static/img/header-bg.png Where am I going wrong??
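    The mechanism at work: a relative url() inside a stylesheet resolves against the URL of the CSS file itself, not against the page or STATIC_URL. A quick way to see that resolution, sketched with Python's urljoin (the example.com URLs are simply the ones from the question):

        from urllib.parse import urljoin

        css_url = "http://example.com/static/css/frontend.css"

        # Relative references in CSS resolve against the stylesheet's own URL.
        print(urljoin(css_url, "img/header-bg.png"))     # http://example.com/static/css/img/header-bg.png
        print(urljoin(css_url, "../img/header-bg.png"))  # http://example.com/static/img/header-bg.png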

    Read the article

  • JQuery + Rails + Select Menu

    - by blackpond
    I want to have a select menu to change a field on a Customer dynamically. I've never used jQuery with a select menu, and I'm having problems. The code:

        <% form_for @customer, :url => { :action => "update" }, :html => { :class => "ajax_form" } do |f| %>
          Pricing: <%= select :customer, :pricing, Customer::PRICING, {}, :onchange => "$('this').closest('form').submit();" %>

    Application.js:

        $(document).ready(function(){
          $(".ajax_link").live("click", function(event){ // take elements with class ajax_link and call this function when clicked
            event.preventDefault(); // cancels the http request
            $.post($(this).attr("href"), null, null, "script");
            return false;
          });
          ajaxFormSubmitHooks();
        });

        function ajaxFormSubmitHooks(){
          $(".ajax_form").submit(function(event){
            event.preventDefault(); // cancels the http request
            $.post($(this).attr("action"), $(this).serializeArray(), null, "script");
            return false;
          });
        }

    Read the article

  • Data extract from website URL

    - by user2522395
    With the script below I am able to extract all the links of a particular website, but I need to know how I can pull data from the extracted links, especially email addresses and phone numbers if they are there. Please help me modify the existing script to get that result, or if you have a full sample script please provide it.

        Private Sub btnGo_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnGo.Click
            'url must be in this format: http://www.example.com/
            Dim aList As ArrayList = Spider("http://www.qatarliving.com", 1)
            For Each url As String In aList
                lstUrls.Items.Add(url)
            Next
        End Sub

        Private Function Spider(ByVal url As String, ByVal depth As Integer) As ArrayList
            'aReturn is used to hold the list of urls
            Dim aReturn As New ArrayList
            'aStart is used to hold the new urls to be checked
            Dim aStart As ArrayList = GrabUrls(url)
            'temp array to hold data being passed to new arrays
            Dim aTemp As ArrayList
            'aNew is used to hold new urls before being passed to aStart
            Dim aNew As New ArrayList
            'add the first batch of urls
            aReturn.AddRange(aStart)
            'if depth is 0 then only return 1 page
            If depth < 1 Then Return aReturn
            'loops through the levels of urls
            For i = 1 To depth
                'grabs the urls from each url in aStart
                For Each tUrl As String In aStart
                    'grabs the urls and returns non-duplicates
                    aTemp = GrabUrls(tUrl, aReturn, aNew)
                    'add the urls to be checked to aNew
                    aNew.AddRange(aTemp)
                Next
                'swap urls to aStart to be checked
                aStart = aNew
                'add the urls to the main list
                aReturn.AddRange(aNew)
                'clear the temp array
                aNew = New ArrayList
            Next
            Return aReturn
        End Function

        Private Overloads Function GrabUrls(ByVal url As String) As ArrayList
            'will hold the urls to be returned
            Dim aReturn As New ArrayList
            Try
                'regex string used: thanks google
                Dim strRegex As String = "<a.*?href=""(.*?)"".*?>(.*?)</a>"
                'i used a webclient to get the source
                'web requests might be faster
                Dim wc As New WebClient
                'put the source into a string
                Dim strSource As String = wc.DownloadString(url)
                Dim HrefRegex As New Regex(strRegex, RegexOptions.IgnoreCase Or RegexOptions.Compiled)
                'parse the urls from the source
                Dim HrefMatch As Match = HrefRegex.Match(strSource)
                'used later to get the base domain without subdirectories or pages
                Dim BaseUrl As New Uri(url)
                'while there are urls
                While HrefMatch.Success = True
                    'loop through the matches
                    Dim sUrl As String = HrefMatch.Groups(1).Value
                    'if it's a page or sub directory with no base url (domain)
                    If Not sUrl.Contains("http://") AndAlso Not sUrl.Contains("www") Then
                        'add the domain plus the page
                        Dim tURi As New Uri(BaseUrl, sUrl)
                        sUrl = tURi.ToString
                    End If
                    'if it's not already in the list then add it
                    If Not aReturn.Contains(sUrl) Then aReturn.Add(sUrl)
                    'go to the next url
                    HrefMatch = HrefMatch.NextMatch
                End While
            Catch ex As Exception
                'catch ex here. I left it blank while debugging
            End Try
            Return aReturn
        End Function

        Private Overloads Function GrabUrls(ByVal url As String, ByRef aReturn As ArrayList, ByRef aNew As ArrayList) As ArrayList
            'overloads function to check duplicates in aNew and aReturn
            'temp url arraylist
            Dim tUrls As ArrayList = GrabUrls(url)
            'used to return the list
            Dim tReturn As New ArrayList
            'check each item to see if it exists, so not to grab the urls again
            For Each item As String In tUrls
                If Not aReturn.Contains(item) AndAlso Not aNew.Contains(item) Then
                    tReturn.Add(item)
                End If
            Next
            Return tReturn
        End Function
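    As for the new part of the question: once each page's HTML has been downloaded, email addresses and phone numbers are usually pulled out with two more regular expressions over the page source. A rough, language-agnostic sketch of that step in Python rather than the asker's VB.NET (the patterns are deliberately simple, and the phone pattern in particular is only a placeholder to be tightened for real data):

        import re
        from urllib.request import urlopen

        # Deliberately simple patterns; real-world email/phone formats need stricter rules.
        EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
        PHONE_RE = re.compile(r"\+?\d[\d\s().-]{6,}\d")

        def extract_contacts(url):
            """Download one page and return the emails and phone-like strings found in it."""
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
            return {
                "emails": sorted(set(EMAIL_RE.findall(html))),
                "phones": sorted(set(PHONE_RE.findall(html))),
            }

        # Usage: loop over the URLs the spider collected.
        # for url in spidered_urls:
        #     print(url, extract_contacts(url))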

    Read the article

  • <script> Tag cannot be self closed?

    - by Joe Hopfgartner
    I had this code on my website: <script type="text/javascript" src="http://code.jquery.com/jquery-1.4.4.min.js"/> <script type='text/javascript' src='/lib/player/swfobject.js'></script> swfobject was not working (not loaded). After altering the code to <script type="text/javascript" src="http://code.jquery.com/jquery-1.4.4.min.js"></script> <script type='text/javascript' src='/lib/player/swfobject.js'></script> it worked fine. The document was parsed as HTML5. I think it's funny. Okay, granted, a closed tag and a self-closing tag are not the same, so I would understand if jQuery couldn't load, although I find it ridiculous. But what I do not understand is that jQuery loads while the following, correctly written tag doesn't?

    Read the article

  • CodeIgniter won't run in a subdirectory

    - by Tim Piele
    I have a CodeIgniter install running in our root web directory that I copied into a subdirectory called /dev... I edited the config/config.php file to reflect the change so it is now like this: $config['base_url'] = 'http://mysite.com/dev'; $config['sbase_url'] = 'https://mysite.com/dev'; $config['default_email'] = '[email protected]'; $config['site_url'] = 'http://mysite.com/dev'; This is my .htaccess file: <IfModule mod_rewrite.c> RewriteEngine On RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^(.*)$ index.php/$1 [L] </IfModule> When I hover over any links on the site they are correct, for example the contact us page link reads www.mysite.com/dev/contact but when I click any of the links I get a 404 error... Is there something common I am missing?

    Read the article

  • not autolinking all-numeric twitter hashtags in perl?

    - by all_numeric_no_hash
    I'm producing HTML from Twitter search results, happily using the Net::Twitter module :-) One of the rules on Twitter is that all-numeric hashtags are not links. This makes it possible to unambiguously tweet things like "ur not my #1 anymore", as in here: http://twitter.com/natarias2007/status/11246320622 The solution I came up with looks like: $tweet =~ s{#([0-9]*[A-Za-z_]+[0-9]*)}{<a href="http://twitter.com/search?q=%23$1">#$1</a>}g; It seems to work (let's hope), but I'm still curious... how would you do it?
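    A quick way to sanity-check that rule - link hashtags containing at least one letter or underscore, leave purely numeric ones alone - sketched in Python rather than the asker's Perl; the substitution mirrors the expression above, and the sample tweets are made up:

        import re

        # Same idea as the Perl substitution: a hashtag must contain at least one
        # letter or underscore to become a link; "#1" stays plain text.
        HASHTAG = re.compile(r"#([0-9]*[A-Za-z_]+[0-9]*)")

        def linkify_hashtags(tweet):
            return HASHTAG.sub(
                r'<a href="http://twitter.com/search?q=%23\1">#\1</a>', tweet
            )

        print(linkify_hashtags("ur not my #1 anymore"))  # unchanged
        print(linkify_hashtags("loving #perl today"))    # hashtag becomes a link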

    Read the article

  • AWS ECS API - ItemLookup - Get item's (EAN) SalesRank by BrowseNode

    - by Lysender
    I'm using Amazon Japan with ItemLookup. We already have the EAN numbers (used as the ItemId). It was working well until we needed to change how we use the SalesRank. The SalesRank returned so far is for the whole SearchIndex, e.g. Books, Music, etc. What we want is the SalesRank for a specific BrowseNode. It seems this may not be implemented on Amazon US. For example: http://www.amazon.com/Slackware-Essentials-Cantrell-Johnson-Lumens/dp/1571763384/ref=sr_1_1?ie=UTF8&s=books&qid=1276142663&sr=1-1 The book above only shows the overall SalesRank (Bestsellers Rank), but on Amazon Japan: http://www.amazon.co.jp/Slackware-Linux-Dummies/dp/0764506897/ref=sr_1_4?ie=UTF8&s=english-books&qid=1276142708&sr=1-4 it shows a SalesRank for three other BrowseNodes. We need to get the SalesRank for a specific BrowseNode for a given ItemId, or by any other means. Any ideas, guys? TIA

    Read the article

  • Using directory traversal attack to execute commands

    - by gAMBOOKa
    Is there a way to execute commands using directory traversal attacks? For instance, I access a server's etc/passwd file like this http://server.com/..%01/..%01/..%01//etc/passwd Is there a way to run a command instead? Like... http://server.com/..%01/..%01/..%01//ls ..... and get an output? EDIT: To be clear here, I've found the vuln in our company's server. I'm looking to raise the risk level (or bonus points for me) by proving that it may give an attacker complete access to the system

    Read the article

  • [MySQL] Optimize Query

    - by bordeux
    Hello. I have problem with optimize this query: SET @SEARCH = "dokumentalne"; SELECT SQL_NO_CACHE `AA`.`version` AS `Version` , `AA`.`contents` AS `Contents` , `AA`.`idarticle` AS `AdressInSQL` , `AA` .`topic` AS `Topic` , MATCH (`AA`.`topic` , `AA`.`contents`) AGAINST (@SEARCH) AS `Relevance` , `IA`.`url` AS `URL` FROM `xv_article` AS `AA` INNER JOIN `xv_articleindex` AS `IA` ON ( `AA`.`idarticle` = `IA`.`adressinsql` ) INNER JOIN ( SELECT `idarticle` , MAX( `version` ) AS `version` FROM `xv_article` WHERE MATCH (`topic` , `contents`) AGAINST (@SEARCH) GROUP BY `idarticle` ) AS `MG` ON ( `AA`.`idarticle` = `MG`.`idarticle` ) WHERE `IA`.`accepted` = "yes" AND `AA`.`version` = `MG`.`version` ORDER BY `Relevance` DESC LIMIT 0 , 30 Now, this query using ^ 20 seconds. How to optimize this? EXPLAIN gives this: 1 PRIMARY AA ALL NULL NULL NULL NULL 11169 Using temporary; Using filesort 1 PRIMARY ALL NULL NULL NULL NULL 681 Using where 1 PRIMARY IA ALL accepted NULL NULL NULL 11967 Using where 2 DERIVED xv_article fulltext topic topic 0 1 Using where; Using temporary; Using filesort This is example server with my data: user: bordeux_4prog password: 4prog phpmyadmin: http://phpmyadmin.bordeux.net/ chive: http://chive.bordeux.net/

    Read the article
