Search Results

Search found 22988 results on 920 pages for 'url encoding'.


  • file-name encoding problems

    - by tenhouse
    I googled this topic but couldn't find what I was looking for. The following happened to me: I had my files stored on an NTFS USB hard disk and, because of space problems, moved them to an ext3 system. Somehow the filename encoding got screwed up (the content is still OK, as far as I can tell). My files now look like the following:

        Kküken <--- should have an "ü"
        Jäger  <--- should have an "ä"
        Zwölf  <--- should have an "ö"
        fünfte <--- should have an "ü"

    These are just examples, but they already give me my first question: why does the "ü" have two different representations? (Maybe I screwed up earlier, and now I have a mix of different encoding layers?) I tried the following command:

        convmv -r -f UTF-8 -t ISO-8859-1 *

    This command works for some files (for example Zwölf) but not for all: ISO-8859-1 doesn't cover all the characters needed for "fünfte". So I guess it must be another encoding - but which? How can I find that out? And is there any way I can still fix all of this?
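    This looks like UTF-8 file names that were re-interpreted as Latin-1 somewhere along the copy, possibly twice, which would explain seeing two different renderings of "ü". A sketch of one recovery path, assuming the names really are doubly encoded UTF-8 (convmv has a --fixdouble mode for exactly that case, and it only previews changes until you add --notest):

        # Preview the repairs (convmv runs in test mode by default)
        convmv -r -f utf8 -t utf8 --fixdouble .
        # Apply them once the preview looks right
        convmv -r -f utf8 -t utf8 --fixdouble --notest .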

    Read the article

  • How to auto-detect text file encoding?

    - by ???
    There are many plain text files encoded in various charsets. I want to convert them all to UTF-8, but before running iconv I need to know each file's original encoding. Most browsers have an "auto detect" option for encodings, but I can't check these text files one by one - there are too many. Once I know the original encoding, I can convert the texts with iconv -f DETECTED_CHARSET -t utf-8. Is there a utility to detect the encoding of plain text files? It doesn't have to be 100% correct, but it should recognize most of them.
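    Command-line detectors such as file -bi, enca, and uchardet do exactly this guessing; none is perfect, but they handle the common charsets. A minimal sketch wiring uchardet into iconv (assuming uchardet is installed and its output names match what iconv accepts, which holds for the usual cases; it may print "unknown" for files it can't classify):

        # Convert every .txt file to UTF-8, guessing each file's charset
        for f in *.txt; do
          enc=$(uchardet "$f")               # prints e.g. "ISO-8859-1"
          iconv -f "$enc" -t UTF-8 "$f" -o "$f.utf8" && mv "$f.utf8" "$f"
        done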

    Read the article

  • DNS hosting - URL forwarding - hiding the forwarded URL?

    - by jeremycollins
    I have free DNS hosting with my domain registrar, and I'd like the DNS-hosted domain www.example.com to display the contents of www.myotherlongdomain.com. I only have 301/302/iframe forwarding options, but I want to mask the redirected (long-domain) URL. If I use frames, users can view the source and see the (long-domain) URL the contents come from. How can I hide it so the browser always displays www.example.com? There is no cloaking/masking option with the registrar. Thanks.
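    With only registrar-side forwarding, frames are the sole masking option, and they always leak in the source. Fully hiding the long domain needs a server you control acting as a reverse proxy for www.example.com. A minimal Apache sketch (assuming mod_proxy and mod_proxy_http are enabled; the domains are the placeholders from the question):

        <VirtualHost *:80>
            ServerName www.example.com
            ProxyPass        / http://www.myotherlongdomain.com/
            ProxyPassReverse / http://www.myotherlongdomain.com/
        </VirtualHost>

    The address bar then always shows www.example.com, at the cost of hosting the proxy yourself.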

    Read the article

  • Shopping Cart URL Structure

    - by Drew
    Regarding URL structure for guests versus authenticated users: can I track traffic for both paths separately while still tracking total conversions through the shopping cart? I have set up the following URL structure.

    Authenticated users follow this path:
        /cart
        /checkout
        /checkout-confirmation-ty

    Guests follow:
        /cart
        /checkout-guest
        /checkout-confirmation-guest-ty

    Can I track authenticated users and guests separately? Is this possible with Google Analytics?
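    In Google Analytics this is typically handled with one regex destination goal (total conversions) plus funnel steps or segments on the distinct /checkout vs /checkout-guest URLs (per-path traffic). A sketch of the goal configuration, assuming classic GA's goal setup (field names may differ in newer versions):

        Goal type:   URL Destination
        Match type:  Regular Expression
        Goal URL:    ^/checkout-confirmation(-guest)?-ty$

    Segmenting on pages matching ^/checkout-guest then isolates guest checkouts from authenticated ones.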

    Read the article

  • What does it take to successfully run a URL shortener service [on hold]

    - by MxyL
    What are the costs (technology-wise) of running a URL shortener service such as bit.ly or anonym.to? For example, if I decided to use inexpensive shared hosting with "unlimited" bandwidth, would that be feasible? Or would I need dedicated hosting? I found this question: "I want to run a URL shortener for my own usage, what do I need to do?", which makes it easy to set up, but I'm not too clear on what kinds of things I need to consider.

    Read the article

  • Do extra words in a URL affect SEO?

    - by smp7d
    Often, for technical reasons, we end up with some extra words in a URL that we would not want to optimize for, as they have no bearing on the content. Examples:

        sportssite.com/content/sports-article
        movieportal.com/node/movie-review
        electronicsforum.com/blog/top-10-cameras
        webmasters.stackexchange.com/questions/34046/do-extra-words-in-url-affect-seo

    Do these have any effect on ranking in any of the major search engines? Would it behoove us to strip the extra words?

    Read the article

  • Uniform URL across different devices

    - by yanglifu90
    I noticed that almost all of Stack Exchange's sites use the same URL in a mobile browser. I think this is cool, because when I share something from my phone, people following the link won't see a mobile page on their desktop. What is this technique called - is there a W3C specification for it? How do I find other websites that use it? I noticed that Ars Technica and the Telegraph also use the same URL as their desktop versions.
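    There is no single W3C name for this; the usual umbrella term is "responsive web design", built on the W3C CSS Media Queries specification plus a viewport meta tag, so one URL serves every device. A minimal sketch of the idea (the class name is hypothetical):

        <meta name="viewport" content="width=device-width, initial-scale=1">

        <style>
          /* Same URL, same HTML; only the layout adapts to screen width */
          .sidebar { float: right; width: 300px; }
          @media (max-width: 600px) {
            .sidebar { float: none; width: auto; }  /* stack on phones */
          }
        </style>

    An easy way to spot other sites doing this is to narrow a desktop browser window and watch the layout reflow.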

    Read the article

  • bitly PHP URL won't work???

    - by mathiregister
    Hi guys,

        <?php
        include('bitly.php');
        $bitly = new bitly('myusername', 'myapikey');
        print $bitly->shorten('http://www.google.com');
        ?>

    WORKING!!!

        $currenturl = (!empty($_SERVER['HTTPS'])) ? "https://".$_SERVER['SERVER_NAME'].$_SERVER['REQUEST_URI'] : "http://".$_SERVER['SERVER_NAME'].$_SERVER['REQUEST_URI'];
        include('bitly.php');
        $bitly = new bitly('myusername', 'myapikey');
        print $bitly->shorten($currenturl);

    WORKING!!!

        include('bitly.php');
        $currenturl = (!empty($_SERVER['HTTPS'])) ? "https://".$_SERVER['SERVER_NAME'].$_SERVER['REQUEST_URI'] : "http://".$_SERVER['SERVER_NAME'].$_SERVER['REQUEST_URI'];
        $url = "somehashtag";
        $shareurl = $currenturl . '#' . $url;
        $bitly = new bitly('myusername', 'myapikey');
        print $bitly->shorten($shareurl);

    NOT WORKING!!! Any idea why? If I print out $shareurl, I can see that it's a completely normal URL that I could paste into the normal bit.ly website. I don't get it! Any ideas? It would be great if you could help me!
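    One likely culprit is the '#': everything after it is a URL fragment, and if bitly.php splices the long URL into the API call's query string without percent-encoding it, the fragment gets cut off or corrupts the request. A sketch of the workaround, assuming the wrapper does not already encode its argument (worth checking inside bitly.php first):

        <?php
        include('bitly.php');

        $scheme = (!empty($_SERVER['HTTPS'])) ? "https://" : "http://";
        $currenturl = $scheme . $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI'];
        $shareurl = $currenturl . '#' . 'somehashtag';

        $bitly = new bitly('myusername', 'myapikey');
        // Percent-encode so the '#' travels as %23 inside the API request
        print $bitly->shorten(rawurlencode($shareurl));
        ?>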

    Read the article

  • Ripping Blu-Ray for Xbox 360 with Minimal Encoding

    - by Adam Haile
    What's the best way to rip a Blu-ray disc to an Xbox 360 compatible format, preferably maintaining surround sound and doing as little video encoding as possible? As far as I can tell, the 360 technically supports both AVC and VC-1 (though whether it handles Blu-ray bit rates is questionable), so I'm hoping it's possible to avoid re-encoding the video at all and instead just process the audio and re-mux everything together into a new file.
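    If a ripper (e.g. MakeMKV) gets the decrypted title into an .mkv, and the video track is AVC within the 360's profile/level limits, the video can be stream-copied and only the audio converted. A hedged ffmpeg sketch (file names are placeholders; whether the 360 accepts the result still depends on the disc's bit rate and level):

        # Copy the H.264 video untouched; convert the lossless/DTS audio to
        # multichannel AAC for an MP4 container. Older ffmpeg builds may need
        # "-strict experimental" for the native AAC encoder.
        ffmpeg -i movie.mkv -map 0:v:0 -map 0:a:0 -c:v copy -c:a aac -b:a 384k movie.mp4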

    Read the article

  • .htaccess URL rewriting problem

    - by letsworktogether
    I'm kind of stuck on this part and was hoping to get some assistance. I'm building a highscores page in PHP; that's going great, and it works. However, I dislike the idea of "index.php?skill=name" and therefore wanted a bit of SEO in this. I have successfully replaced the URL with a friendlier version: "highscores/skill/name".

    This is where the problem starts. I have added pagination to the highscores, and the page number is read from the page GET variable ($_GET['page']). I dislike the idea of "highscores/skill/name&page=2" and was hoping you could help me make the URL work like the following:

    Page 1, accessing the file without declaring a page number: DOMAIN.TLD/highscores/skill/name
    Page 2, where the page variable is now needed: DOMAIN.TLD/highscores/skill/name/2

    As you can tell, the "2" selects page 2 and should load the correct data for it. However, I'm having a lot of trouble configuring my .htaccess file this way. This is my latest attempt:

        RewriteRule ^highscores\/skill\/(.*?)(\/(.*?)*)$ highscores/skills.php?skill=$1&page=$2 [L] # Skills page

    Unfortunately it does not work: it makes the page look horrible (the CSS doesn't load) and it doesn't go to the page specified in the URL. I hope you understand my issue - thank you!
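    Two different problems are tangled together here. The broken styling is almost certainly relative paths: at /highscores/skill/name/2 the browser resolves css/style.css under that fake directory, so reference assets absolutely (href="/css/style.css") or add a <base href="/"> tag. For the rewrite itself, a sketch that makes the page segment optional (assuming skills.php treats a missing page parameter as page 1):

        RewriteEngine On
        # /highscores/skill/NAME    -> skills.php?skill=NAME
        RewriteRule ^highscores/skill/([^/]+)/?$ highscores/skills.php?skill=$1 [L,QSA]
        # /highscores/skill/NAME/2  -> skills.php?skill=NAME&page=2
        RewriteRule ^highscores/skill/([^/]+)/([0-9]+)/?$ highscores/skills.php?skill=$1&page=$2 [L,QSA]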

    Read the article

  • Changing website URL - am I making an SEO mistake?

    - by Denis
    I have a website on a .com domain that is a year old. The business is a shop based in Ireland, and I have purchased a .ie domain. I plan to move the website over to the new domain. SEO-wise, good or bad idea?

    Old URL: SmythsOfTerenure.com | New URL: SmythsComputerRepair.ie (fake names and a fictional business in the example URLs)

    The new domain has my main keyword in it; the old domain has my family name and business location (city district). The site currently ranks high for lots of relevant keywords in Google, with low traffic and low competition. Current traffic is about 80 sessions per week, 80% of it organic from Google. I am changing domains in an attempt to help SEO long-term by having a ccTLD (.ie rather than .com) and my main keyword in the domain. I plan to do 301 redirects from old to new and update Google Webmaster Tools and Google Analytics, but am I making a mistake changing it at all, given that rankings may fall in the short term? Homepage PR is 0 and there are very few inbound links. Should I just leave it on the old domain? Or, after a few months, should I be back to ranking as well as I do now?

    Read the article

  • How to set the mechanize page encoding?

    - by Juan Medín
    Hi, I'm trying to get a page with ISO-8859-1 encoding by clicking on a link, so the code is similar to this:

        page_result = page.link_with( :text => 'link_text' ).click

    So far I get the result with the wrong encoding, so I see characters like 'T?tulo:' instead of 'Título:'. I've tried several approaches, including:

    Stating the encoding in the first request, using the agent:

        @page_search = @agent.get(
          :url => 'http://www.server.com',
          :headers => { 'Accept-Charset' => 'ISO-8859-1' }
        )

    Stating the encoding for the page itself:

        page_result.encoding = 'ISO-8859-1'

    But I must be doing something wrong: a simple puts always shows the wrong characters. Do you know how to set the encoding? Thanks in advance.

    Added - executable example:

        require 'rubygems'
        require 'mechanize'

        WWW::Mechanize::Util::CODE_DIC[:SJIS] = "ISO-8859-1"

        @agent = WWW::Mechanize.new
        @page = @agent.get(
          :url => 'http://www.mcu.es/webISBN/tituloSimpleFilter.do?cache=init&layout=busquedaisbn&language=es',
          :headers => { 'Accept-Charset' => 'utf-8' }
        )
        puts @page.body
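    On Ruby 1.9+ one workaround is to leave mechanize alone and repair the string afterwards: declare the raw bytes as ISO-8859-1, then transcode to UTF-8. A sketch (assuming the server really sends Latin-1, as the ? placeholders suggest):

        # Re-tag the raw bytes, then transcode so puts prints 'Título:'
        body_utf8 = @page.body.force_encoding('ISO-8859-1').encode('UTF-8')
        puts body_utf8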

    Read the article

  • PLT Scheme URL dispatch

    - by Inaimathi
    I'm trying to hook up URL dispatch with PLT Scheme. I've taken a look at the tutorial and the server documentation. I can figure out how to route requests to the same servlet. Specific example:

        (define (start request)
          (blog-dispatch request))

        (define-values (blog-dispatch blog-url)
          (dispatch-rules
           (("") list-posts)
           (("posts" (string-arg)) review-post)
           (("archive" (integer-arg) (integer-arg)) review-archive)
           (else list-posts)))

        (define (list-posts req) `(list-posts))
        (define (review-post req p) `(review-post ,p))
        (define (review-archive req y m) `(review-archive ,y ,m))

    Assuming the above code is running on a server listening on 8080, localhost:8080/ goes to a page that says "list-posts". Going to localhost:8080/posts/test goes to a PLT "file not found" page (with the above code, I'd expect a page that says "review-post test"). It feels like I'm missing something small and obvious. Can anyone give me a hint?
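    The dispatch table itself looks right; the classic gotcha is how the server is launched. serve/servlet only forwards URLs matching its servlet path by default, so /posts/test never reaches blog-dispatch and falls through to the file server. A sketch of a launch that hands every request to start (keyword names from web-server/servlet-env):

        (require web-server/servlet-env)

        ; #:servlet-regexp #rx"" matches every URL, so /posts/test and
        ; /archive/2010/4 are dispatched instead of 404ing
        (serve/servlet start
                       #:servlet-regexp #rx""
                       #:servlet-path "/"
                       #:port 8080)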

    Read the article

  • Question on dynamic URL parsing

    - by jerebear
    I see many, many sites with URLs for individual pages like:

        http://www.mysite.com/articles/this-is-article-1
        http://www.mysite.com/galleries/575

    And they don't redirect, and they don't run slowly. I know how to parse URLs; that's easy enough. But to my mind that seems slow and cumbersome on a dynamic site. Also, if the pages were all statically built (hence the custom URLs), then all components of the page would be static as well, which would be bad. I'd love to hear some ideas about how this is typically accomplished.
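    The usual mechanism is a "front controller": the web server hands every URL that is not a real file to one script, which looks the slug up in a database, so the pages are fully dynamic and nothing needs to be statically built. A sketch of the pattern (Apache + PHP; names are illustrative):

        # .htaccess - route everything that isn't an existing file or
        # directory to a single entry point
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^ index.php [L]

        <?php
        // index.php - /articles/this-is-article-1 arrives here untouched
        $path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
        list($section, $slug) = array_pad(explode('/', $path, 2), 2, null);
        // e.g. $section = "articles", $slug = "this-is-article-1";
        // one indexed DB lookup on the slug keeps this fast
        ?>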

    Read the article

  • Getting text after the URL in ASP.NET / URL rewriting (sort of!)

    - by alex
    My app is a very simple "one page" type app - it has Default.aspx. I'm basically trying to handle, for example:

        www.myappurl.com/this is my text

    I want to get hold of "this is my text" from the above example. This will be displayed on the page (for now). I didn't really want to use any complex URL rewriting for this (my hosting provider uses IIS6). I tried using a 404 handler, but that is a bit long-winded, and I'm on shared hosting, which can't set the "execute URL" on custom 404 pages. Any other ideas?
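    Without URL rewriting, the request first has to reach ASP.NET at all, which on IIS6 normally means a wildcard application map - often unavailable on shared hosting. Given that, the text is just the tail of the raw URL. A sketch for the Default.aspx code-behind (displayLabel is a hypothetical control):

        // in Page_Load of Default.aspx.cs (needs using System.Web;)
        string text = HttpUtility.UrlDecode(Request.RawUrl.TrimStart('/'));
        displayLabel.Text = Server.HtmlEncode(text);  // show it safely on the page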

    Read the article

  • SEO URL Structure

    - by Neil
    Based on the following example URL structure:

        mysite.com/mypage.aspx?a=red&b=green&c=blue

    Pages in the application use ASP.NET user controls, and some of these controls build a query string. To prevent duplicate keys being created (e.g. &pid=12&pid=10), I am researching methods of rewriting the URL:

    a) mysite.com/mypage.aspx/red/green/blue
    b) mysite.com/mypage.aspx?controlname=a,red|b,green|c,blue

    Pages using this structure would publish content that I would like to get indexed and ranked: articles and products (8,000 products to start, with thousands more added later). My gut instinct tells me to go with the first method, but it would be overkill to add all that infrastructure if the second method will accomplish my goal of getting pages indexed AND ranked. So my question: looking at the pros and cons - Google ranking, time to implement, etc. - which method should I use? Thanks!

    Read the article

  • Encoding over SSH Issues

    - by user1104160
    I have a Linux machine and a Windows machine, both using Vim with the Powerline plugin. Both work great with patched fonts. Next, I want to SSH into an OSX 10.6 machine and also use Powerline in the terminal with Vim. However, I get weird symbols in normal mode ("^^B" in one area) and in fancy mode ("~@" and "~B" spread throughout the bar). I thought this mix-up was an encoding issue, but PuTTY is set to UTF-8, and the same goes for the Ubuntu terminal. Additionally, on the OSX machine "locale" returns "en_US.UTF-8" for all variables (I set it that way to troubleshoot), yet the symbols still show. I am using a patched font (Inconsolata, the same one as in the Ubuntu terminal) in the OSX terminal, so I am stumped. Is there a missing component in this equation? Are there additional problems that can arise from SSH encoding? The same symbols also appear locally on the OSX machine, so it may not even be related to SSH, and therefore I'm totally lost.
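    One thing worth ruling out is which locale is actually in effect inside the SSH session, since OpenSSH only carries LANG/LC_* across when both ends allow it, and PuTTY does not send those variables at all. A sketch of the check and the OpenSSH settings (stock file locations assumed):

        # run this inside the SSH session, not in a local terminal
        locale

        # client-side ~/.ssh/config (OpenSSH clients only)
        SendEnv LANG LC_*

        # server-side sshd_config on the OSX machine
        AcceptEnv LANG LC_*

    If the remote locale really is en_US.UTF-8, the remaining suspects are the terminal's declared character set and whether the terminal is actually rendering with the patched font rather than a fallback.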

    Read the article

  • What is the SEO-recommended method for using underscores and dashes in URLs that contain geographic locations?

    - by ElHaix
    In reading through this article ("In Subfolder & File Names, Use Dashes, Not Underscores"):

        Good: http://www.domain.com/sub-folder/file-name.htm
        Bad:  http://www.domain.com/sub_folder/file_name.htm

    In my URLs, I may have one or two city names followed by the province/state: Burnaby_New_Westminister-BC/[some search term]. My URL rules are currently defined such that everything after the dash is the province/state. Some geographic locations already contain dashes - Notre-Dame-de-Grâce (in QC) - which I would convert to ~/Notre_Dame_de_Grace-QC/. I thought of placing the province/state after another "/", but in some cases the province/state may not exist, leaving just ~/Notre_Dame_de_Grace/, so the first term after the domain name always contains the geo location {city, city_name-state}. I am now revisiting this and wondering whether this rule set should change, and if so, what is the recommended way of implementing it?

    -- UPDATE --

    After reviewing this video, I see that I should be using dashes rather than underscores. However, since I still want my geo locations in the first URL section, is there anything wrong with using a double-dash separator, i.e. /city-name--state/?

    Read the article

  • Rewrite a URL that's already been redirected?

    - by Jack
    Hi guys, I'm running an Apache2 web server with a dynamic IP address. I bought exampledomain.net, and I use no-ip.com's domain-update service to redirect any visitors to my current IP address (see endnote 1). For example, someone visits exampledomain.net and gets redirected to 73.181.57.34. It works like a charm. However, it isn't very user-friendly. Can I rewrite the redirected, IP-address URL? I tried these rewrite rules in the root folder's .htaccess:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^73\.181\.57\.34:88
        RewriteRule ^(.*)$ http://www.exampledomain.net/$1 [L,NC]
        # I simplified the RewriteCond. I would use regex in a real situation.

    Of course, this creates an infinite loop. The user visits www.exampledomain.net and is redirected to 73.181.57.34:88 by no-ip. Apache redirects them to www.exampledomain.net, which redirects them back to 73.181.57.34:88, and so on. I'm a noob when it comes to rewriting, but is there a way to rewrite a URL without redirecting? I tried these rules too (a shot in the dark):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^73\.181\.57\.34:88
        RewriteRule ^(.*)$ my.exampledomain.net/$1 [L,NC]
        # I'd read that Apache replies with a redirect header when you include http

    Thanks!

    (1) No-IP works like this: you download and install their dynamic update client on your server. Every couple of minutes it polls your server for its current external IP address. If it has changed, it updates your server's IP address in no-ip's records.
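    Strictly speaking, no rewrite rule can fix this: an HTTP redirect tells the browser to display the new location, and Apache never gets a chance to change what the address bar shows. The cure is to point the name at the IP with DNS records instead of a redirect, which dynamic-DNS services also support. A sketch of the zone entries (assuming the registrar or no-ip can host the records; the update client keeps the A record current):

        exampledomain.net.      A      73.181.57.34
        www.exampledomain.net.  CNAME  exampledomain.net.

    With real DNS records the browser connects directly and the URL never changes. The one wrinkle is the non-standard port: DNS cannot express ":88", so visitors would need it in the URL unless the server can listen on port 80.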

    Read the article
