Search Results

Search found 3028 results on 122 pages for 'urls'.

Page 84/122 | < Previous Page | 80 81 82 83 84 85 86 87 88 89 90 91  | Next Page >

  • Next track or shuffle in M3U playlist?

    - by Benjamin Oakes
    I have an M3U playlist that has URLs for some MP3s around the web. It's on a server, so I can open it on other computers and my iPhone. Unfortunately, none of the players I've tried lets me hit the "next" button to go to the next song in the playlist. Is there a way to specify that ability in the M3U file? Or, if not, can I make a media player automatically shuffle the playlist? I could always write a script to shuffle it myself, but I'd like to use something built into M3U if it exists.

    Read the article

  • What's my best bet for replacing plain text links with anchor tags in a string? .NET

    - by Craig Bovis
    What is my best option for converting plain text links within a string into anchor tags? Say, for example, I have "I went and searched on http://www.google.com/ today". I would want to change that to "I went and searched on <a href="http://www.google.com/">http://www.google.com/</a> today". The method will also need to be safe from any kind of XSS attack, since the strings are user generated. They will be safe before parsing, so I just need to make sure that no vulnerabilities are introduced through parsing the URLs.

    Read the article

  • How reliable are URIs like /index.php/seo_path

    - by Boldewyn
    I noticed that sometimes (especially where mod_rewrite is not available) this path scheme is used:

        http://host/path/index.php/clean_url_here

    This seems to work, at least in Apache, where index.php is called and one can query the /clean_url_here part via $_SERVER['PATH_INFO']. PHP even kind of advertises this feature. Also, e.g., the CodeIgniter framework uses this technique as the default for its URLs. The question: How reliable is the technique? Are there situations where Apache doesn't call index.php but tries to resolve the path? What about lighttpd, nginx, IIS, AOLserver? A ServerFault question? I think it's got more to do with using this feature inside PHP code, therefore I ask here.
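
    For illustration, a minimal front-controller sketch of the technique being asked about (the routing shown is hypothetical): it reads the "clean" part of the URL from PATH_INFO, falling back to ORIG_PATH_INFO, which some server/PHP combinations populate instead:

        <?php
        // index.php: dispatch /index.php/clean_url_here without mod_rewrite.
        $path = '';
        if (!empty($_SERVER['PATH_INFO'])) {
            $path = $_SERVER['PATH_INFO'];
        } elseif (!empty($_SERVER['ORIG_PATH_INFO'])) {
            // Some SAPIs (e.g. certain IIS setups) expose this variable instead.
            $path = $_SERVER['ORIG_PATH_INFO'];
        }

        // Split "/products/42" into array('products', '42').
        $segments = array_values(array_filter(explode('/', $path), 'strlen'));
        var_dump($segments);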

    Read the article

  • Regexing it up with IIS re-write module

    - by Michael Jasper
    I am developing a profile-based web application where each user is assigned their own URL through their username and the IIS rewrite module's magic. A typical user's profile URL would be http://www.mymark.com/mike. Each user also gets a blog in a multi-user WordPress installation; the WordPress URL looks like this: http://www.mymark.com/blog/mike. I am trying to use the rewrite module to create more canonical URLs for the user (http://www.mymark.com/mike/blog), and have tried several regex variations that I created through RegExr (a regex testing tool), ending up with (www.|)mymark.com/([^/]+)/blog as the pattern to match, but I haven't had any success so far. What am I doing wrong here? Here is a screenshot of my rewrite rule:

    Read the article

  • Detecting a URL using preg_match without http:// in the string

    - by Stefan
    Hey there, I was wondering how I could check a string, broken into an array, against a preg_match to see if it starts with www. I already have one that checks for http://www.

        $stringToArray = explode(" ", $_POST['text']);
        foreach ($stringToArray as $key => $val) {
            $urlvalid = isValidURL($val);
            if ($urlvalid) {
                $_SESSION["messages"][] = "NO URLS ALLOWED!";
                header("Location: http://www.domain.com/post/id/" . $_POST['postID']);
                exit();
            }
        }

    Thanks! Stefan
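
    For what it's worth, a minimal sketch of an isValidURL() helper along these lines (the name comes from the snippet above; the exact pattern is an assumption) that flags tokens starting with http://, https://, or a bare www.:

        <?php
        // Hypothetical implementation of the isValidURL() call used above.
        function isValidURL($str)
        {
            return preg_match('#^(https?://|www\.)\S+#i', $str) === 1;
        }

        var_dump(isValidURL('http://www.example.com'));  // bool(true)
        var_dump(isValidURL('www.example.com'));         // bool(true)
        var_dump(isValidURL('hello'));                   // bool(false)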

    Read the article

  • How to set up custom DNS with Azure Websites Preview?

    - by husainnz
    I created a new Azure Website, using Umbraco as the CMS. I got a page up and going, and I already have a .co.nz domain with www.domains4less.com. There's a whole lot of stuff on the internet about pointing URLs to Azure, but that seems to be more of a redirection service than anything (i.e. my URLs still use azurewebsites.net once I land on my site). Has anybody had any luck getting it to go? Here's the error I get when I try adding the DNS entry to Azure (I'm in reserved mode; reemdairy is the name of the website):

        There was an error processing your request. Please try again in a few moments.

        Browser: 5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.56 Safari/536.5
        User language: undefined
        Portal Version: 6.0.6002.18488 (rd_auxportal_stable.120609-0259)
        Subscriptions: 3aabe358-d178-4790-a97b-ffba902b2851
        User email address: [email protected]

        Last 10 Requests:

        message: Failure: Ajax call to: Websites/UpdateConfig. failed with status: error (500) in 2.57 seconds. x-ms-client-request-id was: 38834edf-c9f3-46bb-a1f7-b2839c692bcf-2012-06-12 22:25:14Z dateTime: Wed Jun 13 2012 10:25:17 GMT+1200 (New Zealand Standard Time) durationSeconds: 2.57 url: Websites/UpdateConfig status: 500 textStatus: error clientMsRequestId: 38834edf-c9f3-46bb-a1f7-b2839c692bcf-2012-06-12 22:25:14Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com response: {"message":"Try again. Contact support if the problem persists.","ErrorMessage":"Try again. Contact support if the problem persists.","httpStatusCode":"InternalServerError","operationTrackingId":"","stackTrace":null}

        message: Complete: Ajax call to: Websites/GetConfig. completed with status: success (200) in 1.021 seconds. x-ms-client-request-id was: a0cdcced-13d0-44e2-866d-e0b061b9461b-2012-06-12 22:24:43Z dateTime: Wed Jun 13 2012 10:24:44 GMT+1200 (New Zealand Standard Time) durationSeconds: 1.021 url: Websites/GetConfig status: 200 textStatus: success clientMsRequestId: a0cdcced-13d0-44e2-866d-e0b061b9461b-2012-06-12 22:24:43Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com

        message: Complete: Ajax call to: https://manage.windowsazure.com/Service/OperationTracking?subscriptionId=3aabe358-d178-4790-a97b-ffba902b2851. completed with status: success (200) in 1.887 seconds. x-ms-client-request-id was: a7689fe9-b9f9-4d6c-8926-734ec9a0b515-2012-06-12 22:24:40Z dateTime: Wed Jun 13 2012 10:24:42 GMT+1200 (New Zealand Standard Time) durationSeconds: 1.887 url: https://manage.windowsazure.com/Service/OperationTracking?subscriptionId=3aabe358-d178-4790-a97b-ffba902b2851 status: 200 textStatus: success clientMsRequestId: a7689fe9-b9f9-4d6c-8926-734ec9a0b515-2012-06-12 22:24:40Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com

        message: Complete: Ajax call to: /Service/GetUserSettings. completed with status: success (200) in 0.941 seconds. x-ms-client-request-id was: 805e554d-1e2e-4214-afd5-be87c0f255d1-2012-06-12 22:24:40Z dateTime: Wed Jun 13 2012 10:24:40 GMT+1200 (New Zealand Standard Time) durationSeconds: 0.941 url: /Service/GetUserSettings status: 200 textStatus: success clientMsRequestId: 805e554d-1e2e-4214-afd5-be87c0f255d1-2012-06-12 22:24:40Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com

        message: Complete: Ajax call to: Extensions/ApplicationsExtension/SqlAzure/ClusterSuffix. completed with status: success (200) in 0.483 seconds. x-ms-client-request-id was: 85157ceb-c538-40ca-8c1e-5cc07c57240f-2012-06-12 22:24:39Z dateTime: Wed Jun 13 2012 10:24:40 GMT+1200 (New Zealand Standard Time) durationSeconds: 0.483 url: Extensions/ApplicationsExtension/SqlAzure/ClusterSuffix status: 200 textStatus: success clientMsRequestId: 85157ceb-c538-40ca-8c1e-5cc07c57240f-2012-06-12 22:24:39Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com

        message: Complete: Ajax call to: Extensions/ApplicationsExtension/SqlAzure/GetClientIp. completed with status: success (200) in 0.309 seconds. x-ms-client-request-id was: 2eb194b6-66ca-49e2-9016-e0f89164314c-2012-06-12 22:24:39Z dateTime: Wed Jun 13 2012 10:24:40 GMT+1200 (New Zealand Standard Time) durationSeconds: 0.309 url: Extensions/ApplicationsExtension/SqlAzure/GetClientIp status: 200 textStatus: success clientMsRequestId: 2eb194b6-66ca-49e2-9016-e0f89164314c-2012-06-12 22:24:39Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com

        message: Complete: Ajax call to: Extensions/ApplicationsExtension/SqlAzure/DefaultServerLocation. completed with status: success (200) in 0.309 seconds. x-ms-client-request-id was: 1bc165ef-2081-48f2-baed-16c6edf8ea67-2012-06-12 22:24:39Z dateTime: Wed Jun 13 2012 10:24:40 GMT+1200 (New Zealand Standard Time) durationSeconds: 0.309 url: Extensions/ApplicationsExtension/SqlAzure/DefaultServerLocation status: 200 textStatus: success clientMsRequestId: 1bc165ef-2081-48f2-baed-16c6edf8ea67-2012-06-12 22:24:39Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com

        message: Complete: Ajax call to: Extensions/ApplicationsExtension/SqlAzure/ServerLocations. completed with status: success (200) in 0.309 seconds. x-ms-client-request-id was: e1fba7df-6a12-47f8-9434-bf17ca7d93f4-2012-06-12 22:24:39Z dateTime: Wed Jun 13 2012 10:24:40 GMT+1200 (New Zealand Standard Time) durationSeconds: 0.309 url: Extensions/ApplicationsExtension/SqlAzure/ServerLocations status: 200 textStatus: success clientMsRequestId: e1fba7df-6a12-47f8-9434-bf17ca7d93f4-2012-06-12 22:24:39Z sessionId: 09c72263-6ce7-422b-84d7-4c21acded759 referrer: https://manage.windowsazure.com/#Workspaces/WebsiteExtension/Website/reemdairy/configure host: manage.windowsazure.com

    Read the article

  • Building path independent mod_rewrite statements for generic .htaccess file

    - by Pekka
    Say I have a few small web applications stored under a shared web root:

        www.example.com/app1/
        www.example.com/app2/
        www.example.com/app3/
        www.example.com/app4/

    Each application has a .htaccess file containing some run-of-the-mill mod_rewrite statements to rewrite URLs, like:

        RewriteCond %{REQUEST_URI} ^/app1/([^/]+)/([^/]+)\.html$
        RewriteRule .* /app1/index.php?selectedProfile=%1&match=%2&%{QUERY_STRING}

    Now, I would like to have a generic .htaccess file in each /app{n} directory, so no RewriteBase and no /app{n} prefix in the RewriteConds. One idea I had was making the first level a wildcard as well:

        RewriteCond %{REQUEST_URI} ^/([^/]+)/([^/]+)/([^/]+)\.html$

    Seeing as the .htaccess file gets triggered only when the /app{n} directory is entered, this should work. Is this an acceptable solution? Are there other, better ones?

    Read the article

  • Remove index.php in CodeIgniter

    - by Gabriel Bianconi
    Hello. I'm trying to remove 'index.php' from CodeIgniter URLs. I've tried many solutions and none of them worked. I've already set these variables in config.php:

        $config['index_page'] = "";
        $config['uri_protocol'] = "REQUEST_URI";

    And my current .htaccess is:

        Options +FollowSymLinks
        RewriteEngine On
        RewriteBase /
        RewriteCond %{HTTP_HOST} ^plugb.com$ [NC]
        RewriteRule ^(.*)$ http://www.plugb.com/$1 [R=301,L]
        RewriteCond $1 !^(index\.php|files|robots\.txt)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php/$1 [L,QSA]

    The www prefix part works fine, but the 'index.php' part doesn't. If you want to check the webpage, here it is: http://www.plugb.com/index.php/home

    Read the article

  • Regex for url formatting (www.domain.tld to anchors)

    - by Kristaps
    Hi. I'm currently developing a little browser-based Twitter widget, and I'm stuck on getting the URLs to work. I'm kind of a newbie when it comes to regex (I know how to get parts of a string, but this one is a tough one). So, I need a regex that would search/replace www.domain.tld -> <a href="http://www.domain.tld">http://www.domain.tld</a>, with or without a leading http://, preferably. Any advice is welcome. Thanks.

    Read the article

  • URL Encoding - Illegal Character Replacement

    - by ThePower
    Hi, I am doing some URL redirections in a project that I am currently working on. I am new to web development and was wondering what the best practice is to remove any illegal path characters, such as ' and ?. I'm hoping I don't have to resort to manually replacing each character with its encoded equivalent. I have tried UrlEncode and HTMLEncode, but UrlEncode doesn't cater for the ? and HTMLEncode doesn't cater for the '. E.g., if I was to use the following:

        Dim name As String = "Dave's gone, why?"
        Dim url As String = String.Format("~/books/{0}/{1}/default.aspx", bookID, name)
        Response.Redirect(url)

    I've tried wrapping url like this:

        Dim encodedUrl As String = Server.UrlEncode(url)

    and:

        Dim encodedUrl As String = Server.HTMLEncode(url)

    Thanks in advance. P.S. Happy Christmas

    Read the article

  • How to stop a curl while requesting or stop the running php script?

    - by Chris
    I'm using curl to request remote URLs that are sometimes very slow or simply down. In those cases my PHP scripts keep waiting for the response, which makes Apache hold too many requests in memory and then overload. I need a way to stop the curl request, or stop the running PHP script, once a specified time has passed. I tried declare(), but it makes no difference for curl. Does anyone know how to solve this? BTW: what is the effect of CURLOPT_CONNECTTIMEOUT and CURLOPT_TIMEOUT? They don't seem to work the way I expect.
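
    For reference, CURLOPT_CONNECTTIMEOUT limits only the connection phase, while CURLOPT_TIMEOUT limits the whole transfer (connect plus response); a minimal sketch of how the two are usually combined (the URL and the limits are placeholders):

        <?php
        // Give up quickly on slow or dead remote hosts instead of tying up Apache.
        $ch = curl_init('http://example.com/slow-endpoint');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // max seconds to establish the connection
        curl_setopt($ch, CURLOPT_TIMEOUT, 15);        // max seconds for the entire request
        $body = curl_exec($ch);
        if ($body === false) {
            // Timeouts (CURLE_OPERATION_TIMEDOUT) and connection failures land here.
            error_log('curl failed: ' . curl_error($ch));
        }
        curl_close($ch);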

    Read the article

  • iPhone: fast hash function for storing web images (url) as files (hashed filenames)

    - by Stefan Klumpp
    What is a fast hash function available on the iPhone for hashing web URLs (images)? I'd like to store each cached web image as a file with a hash as the filename, because I suppose the raw web URL could contain strange characters that could cause problems on the file system. The hash function doesn't need to be cryptographic, but it definitely needs to be fast. Example:

        Input:  http://www.calumetphoto.com/files/iccprofiles/icc-test-image.jpg
        Output: 3573ed9c4d3a5b093355b2d8a1468509

    This was done using MD5(), but since I don't know much about that topic I don't know whether it is overkill (i.e. too slow).

    Read the article

  • Any thoughts on A/B testing in Django based project?

    - by Maddy
    We just started doing A/B testing for our Django-based project. Can I get some information on best practices or useful insights about this A/B testing? Ideally each new test page will be differentiated by a single parameter (just like Gmail): mysite.com/?ui=2 should give a different page. So for every view I need to write a decorator that loads different templates based on the 'ui' parameter value, and I don't want to hard-code any template names in the decorators. So what would the urls.py URL pattern look like?

    Read the article

  • Wrap link around links in tweets with php preg_replace

    - by Ben Paton
    Hello, I'm trying to display the latest tweet using the code below. This preg_replace works great for wrapping a link around Twitter @usernames, but it doesn't work for web addresses in tweets. How do I get this code to wrap links around URLs in tweets?

        <?php
        /** Script to pull in the latest tweet */
        $username = 'fairgroceruk';
        $format = 'json';
        $tweet = json_decode(file_get_contents("http://api.twitter.com/1/statuses/user_timeline/{$username}.{$format}"));
        $latestTweet = htmlentities($tweet[0]->text, ENT_QUOTES);
        $latestTweet = preg_replace('/@([a-z0-9_]+)/i', '<a href="http://twitter.com/$1" target="_blank">@$1</a>', $latestTweet);
        $latestTweet = preg_replace('/http://([a-z0-9_]+)/i', '<a href="http://$1" target="_blank">http://$1</a>', $latestTweet);
        echo $latestTweet;
        ?>

    Thanks for the help, Ben
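
    As an aside, the second preg_replace above fails because the unescaped slashes in http:// end the /.../ pattern early; a minimal sketch of one possible fix, using # as the delimiter (the character class for what counts as part of a URL is an assumption):

        <?php
        // Link anything starting with http:// or https:// up to the next
        // whitespace or '<' (the tweet text has already been through htmlentities).
        $latestTweet = preg_replace(
            '#(https?://[^\s<]+)#i',
            '<a href="$1" target="_blank">$1</a>',
            $latestTweet
        );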

    Read the article

  • Can I use the CSS :visited pseudo class on 'wildcard' links?

    - by rabidpebble
    Let's say I have a site with multiple links as follows:

        www.example.com/product/1
        www.example.com/product/2
        www.example.com/product/3

    I also append tracking info to links from time to time so that I can see how my site is being used, e.g., if somebody visits the products page from the product browser I would set a ref parameter:

        www.example.com/product/1&ref=pb
        www.example.com/product/2&ref=pb
        www.example.com/product/3&ref=pb

    The problem with this is that if the user visits a link of the first type and then views a link of the second type, the :visited pseudo-class doesn't seem to apply, because the browser only seems to match on exact URLs. Is there any way to have "wildcards" apply to links in this sense, so that a link is highlighted whether the user has seen the first form or the second? Note: I cannot change this "ref" architecture; it is inherited.

    Read the article

  • How to get all paths in drupal install

    - by Aaron
    Hi, I need to write a module that gives me a page with all possible paths in a Drupal install, including orphaned pages (a site map won't work for that). I can query the url_alias table for aliases, and I can query the menu_router table for all paths, even ones set in page/feed displays in Views. But variable paths (those with arguments) only get interpreted at run-time. So, is there a way to get all possible paths in a Drupal install, including dynamic paths and orphans? It's a catch-22: I have to know all the URLs ahead of time to get them.
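
    For the static part of the problem, a minimal Drupal 6-style sketch (the function name is hypothetical, and the column names are from memory; they differ in other Drupal versions) that lists router entries and aliases. Wildcard (%) router paths still have to be expanded against real data, which is the hard part of the question:

        <?php
        // Hypothetical page callback: list every path this install knows about.
        function mymodule_all_paths_page() {
          $items = array();

          // Registered router items, including ones containing % wildcards.
          $result = db_query("SELECT path FROM {menu_router} ORDER BY path");
          while ($row = db_fetch_object($result)) {
            $items[] = check_plain($row->path);
          }

          // URL aliases and the system paths they point to.
          $result = db_query("SELECT src, dst FROM {url_alias}");
          while ($row = db_fetch_object($result)) {
            $items[] = check_plain($row->dst) . ' (alias of ' . check_plain($row->src) . ')';
          }

          return theme('item_list', $items);
        }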

    Read the article

  • What is the best way to generate a sitemap?

    - by Zakaria
    Hi everybody, I need to build a sitemap for my website. The URL will be "www.example.com/mysitemap.html". I know there are tools that automatically generate an XML file containing the reachable URLs, which also improves SEO. So my questions are: How can I build this HTML page from the generated XML? Or am I wrong, and is this kind of HTML page built manually? If not, how do we integrate the XML and convert it into a page on the website? Thank you very much. Regards.
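
    One common approach is to render the HTML page from the generated XML sitemap; a minimal PHP sketch of that idea (the file name, the output markup, and the assumption that the generator produced a standard sitemap.xml are all placeholders):

        <?php
        // mysitemap.php (hypothetical): render sitemap.xml as a plain HTML link list.
        $doc = new DOMDocument();
        $doc->load('sitemap.xml');  // path to the generated XML sitemap

        echo "<ul>\n";
        foreach ($doc->getElementsByTagName('loc') as $loc) {
            $url = htmlspecialchars($loc->nodeValue, ENT_QUOTES);
            echo "  <li><a href=\"$url\">$url</a></li>\n";
        }
        echo "</ul>\n";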

    Read the article

  • Trouble using genericra to integrate activemq and glassfish when using failover protocol

    - by Kyle
    Hi, I'm attempting to use ActiveMQ in GlassFish using the genericra resource adapter provided with GlassFish 2.1. I have found a few pages with helpful information, including http://activemq.apache.org/sjsas-with-genericjmsra.html. I have actually had success and been able to get MDBs to use ActiveMQ as their JMS provider, but I'm running into an issue as I'm trying to do some more complicated configuration. I want to set up a master-slave configuration, which would require my clients to use a brokerURL of failover:(tcp://broker1:61616,tcp://broker2:61616). In order to do this, I set the following property when calling asadmin create-resource-adapter-config (I have to escape '=' and ':'):

        ConnectionFactoryProperties=brokerURL\=failover\:(tcp\://127.0.0.1\:61616,tcp://127.0.0.1\:61617)

    However, I am now getting a StringIndexOutOfBoundsException when my application starts up. I suspect the comma between the two URLs is the culprit, since this works fine:

        brokerURL\=failover\:(tcp\://127.0.0.1\:61616)

    Just wondering if anyone has dealt with this issue before. Also wondering if there is a better way to integrate with GlassFish than using the generic resource adapter.

    Read the article

  • Hide *.inc.php from website visitors

    - by Ghostrider
    I have a script myscript.inc.php which handles all URLs that look like /script-blah. I accomplish this by using the following .htaccess:

        RewriteEngine On
        RewriteRule ^script-(.*)$ myscript.inc.php?s=$1 [QSA,L]

    However, users could also access it directly by typing /myscript.inc.php?s=blah, and I would like to prevent that. I tried:

        <Files ~ "\.inc\.php$">
        Order deny,allow
        Deny from all
        </Files>

    and:

        RewriteCond %{REQUEST_URI} \.inc\.php
        RewriteRule .* - [F,L,NS]

    They both prevent users from viewing /myscript.inc.php?s=blah, but they also cause /script-blah to return 403... Is there a way to do this correctly?

    Read the article

  • Using .htaccess to replace backslash in URL with forward-slash

    - by DamienL
    I realise that a backslash should never appear in a URL in a form other than a URL escape code; however, in this case the URLs are being generated by a .NET application for generating flashbooks. I have contacted the developer of this application with a bug report. In the interim I would like to use .htaccess to rewrite the offending backslashes. This is how the URLs appear in the Fiddler debugging proxy:

        www.example.com/folder/folder/thumbs%5C1.jpg

    I am using Firefox, and it looks as though Firefox is translating the backslash into its URL-encoded equivalent ( \ == %5C ). Interestingly, IE translates the backslash into a forward-slash automatically (not adhering to standards, but convenient in this case). Is there a way to use .htaccess to rewrite all \ to /?

    Read the article

  • Mac dashboard widgets not loading external images

    - by andrhamm
    I set out to make a quick Mac OS X dashboard widget. I read the documentation and was pleased to find out they use simple HTML, JS, and CSS. I created my widget and it works when I open the .html file in Firefox, but it does not work when I install the widget to the dashboard. The widget is simple: it displays the most recent image from a weather web cam stream. The image URLs look like this: http://webcam.com/stream.jpg?1274213999617. The timestamp is appended to the URL and the server automatically responds with the latest image for that time. I did not write the server script. The widget appears to be loading correctly, but the web cam image will not load. Notice the blue question mark in the upper left. The image should appear over the square background image. Is there any special procedure for loading external images into a widget?

    Read the article

  • How can I write a "user can only access own profile page" type of security check in Play Framework?

    - by karianneberg
    I have a Play framework application with a model like this: a Company has one and only one User associated with it. I have URLs like http://www.example.com/companies/1234, http://www.example.com/companies/1234/departments, http://www.example.com/companies/1234/departments/employees and so on. The numbers are the company IDs, not the user IDs. I want normal users (not admins) to be able to access only their own profile pages, not other people's. So a user associated with the company with ID 1234 should not be able to access the URL http://www.example.com/companies/6789. I tried to accomplish this by overriding Secure.check() and comparing the request parameter "id" to the ID of the company associated with the logged-in user. However, this obviously fails if the parameter is called anything other than "id". Does anyone know how this could be accomplished?

    Read the article

  • What database works well with 200+GB of data?

    - by taw
    I've been using MySQL (with InnoDB, on Amazon RDS) because it's sort of the universal default, but it's been ridiculously under-performing, and tweaking it only delays the inevitable. The data is mostly relatively short blobs (<1 kB each) of information about 100M+ URLs. There is (or should be; MySQL can't seem to handle it) a very high volume of insert / update / retrieve operations but few complex queries. Not that complex queries wouldn't be useful, but MySQL is so slow that it's far faster to get the data out, process it locally, and cache the results somewhere. I can keep tweaking MySQL and throwing more hardware at it, but it seems increasingly futile. So what are the options? SQL/relational model/etc. optional: anything will do as long as it's fast, networked, and language-independent.

    Read the article

  • How do I protect myself?

    - by ved
    I was poking around at my work computer this evening and was looking at my timesheets. I noticed that all my timesheets had variables in the URLs and I could figure out the numbering scheme for the pages. Then I got a little curious about SQL injection and thought of trying out simple SQL injections like "OR 1=1" etc. to see how protected our timesheet info really was. One of these strings yielded a friendly error page saying that an error email was sent to the developer. I am concerned that my ID and request will be seen by the developer, immediately recognized as SQL injection, and reported to the network security officer as a malicious attempt by an employee to hack the timesheet DB. What is my defense? I am really worried.

    Read the article

  • Actionscript problems with social share encoding

    - by Rittmeyer
    Hi, I'm trying to make some "social share" buttons on my site, but the URLs I generate just don't get decoded by these services. One example, for Twitter:

        private function twitter(e:Event):void {
            var message:String = "Message with special chars âõáà";
            var url:String = "http://www.twitter.com/home?status=";
            var link:URLRequest = new URLRequest( url + escape(message) );
        }

    But when Twitter opens up, the message is: "Message with special chars %E2%F5%E1%E0". Something similar is happening with Facebook and Orkut (but these two hide the special chars). Does anyone know why this is happening?

    Read the article
