Search Results

Search found 8185 results on 328 pages for 'transfer encoding'.

Page 2/328 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Purchase existing domain and transfer to new registrar

    - by Kiefer
    I am purchasing an existing domain from the owner who has it registered with GoDaddy. I want to transfer the domain to another registrar and of course have it under my name. If they update the registrant info to my name then it will lock down for 60 days. That's no good. If they simply transfer it to my registrar, how will they update the registrant info? I know about escrow services, but I don't feel I need one because I trust the seller and the amount is (relatively) small. Advice? Thanks!

    Read the article

  • Encoding problem with preg_replace() and scandir()

    - by itarato
    On OS X (PHP 5.2.11) I have a file siësta.doc (and a thousand others with Unicode filenames) and I want to convert the file names to a web-consumable format (a-zA-Z0-9.). If I hardcode the file name above, the conversion works:

        <?php
        $file = 'siësta.doc';
        echo preg_replace("/[^a-zA-Z0-9.]/u", '_', $file); // Output: si_sta.doc
        ?>

    But if I read the file names with scandir(), I get strange conversions:

        <?php
        $files = scandir(DIRNAME);
        foreach ($files as $file) {
            echo preg_replace("/[^a-zA-Z0-9.]/u", '_', $file); // Output for the file above: sie_sta.doc
        }
        ?>

    I tried to detect the encoding, set the encoding, and convert it with the iconv functions. I tried the mb_ functions too, but it only got worse. What did I do wrong? Thanks in advance.
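    A plausible explanation, not confirmed from the question alone: HFS+ on OS X returns filenames in decomposed (NFD) form, so scandir() hands back "e" plus a combining diaeresis (U+0308) rather than the precomposed "ë" (U+00EB); the regex then replaces only the combining mark and leaves the "e" behind, giving sie_sta.doc. A minimal sketch of a fix, assuming PHP's intl extension is available:

        <?php
        // Sketch: recompose NFD filenames to NFC before sanitizing, so "e" + U+0308
        // collapses back into the single character "ë" and is replaced as one unit.
        $files = scandir('.');
        foreach ($files as $file) {
            $nfc = Normalizer::normalize($file, Normalizer::FORM_C);
            echo preg_replace("/[^a-zA-Z0-9.]/u", '_', $nfc), "\n"; // si_sta.doc
        }
        ?>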

    Read the article

  • C#, UTF-8 and encoding characters

    - by AspNyc
    This is a shot-in-the-dark, and I apologize in advance if this question sounds like the ramblings of a madman. As part of an integration with a third party, I need to UTF8-encode some string info using C# so I can send it to the target server via multipart form. The problem is that they are rejecting some of my submissions, probably because I'm not encoding their contents correctly. Right now, I'm trying to figure out how a dash or hyphen -- I can't tell which it is just by looking at it -- is received or interpreted by the target server as ?~@~S (yes, that's a 5-character string and is not your browser glitching out). And unfortunately I don't have a thorough enough understanding of Encoding.UTF8.GetBytes() to know how to use the byte array to begin identifying where the problem might lie. If anybody can provide any tips or advice, I would greatly appreciate it. So far my only friend has been MSDN, and not much of one at that.
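    One way to start the diagnosis, sketched under the assumption that the mystery character can be pasted into the source: dump its code points and UTF-8 bytes. An en dash (U+2013), for instance, encodes as E2 80 93, three bytes that turn into three junk characters whenever a server reads UTF-8 through a single-byte code page, which would fit garbage of roughly that shape:

        using System;
        using System.Text;

        class DumpBytes
        {
            static void Main()
            {
                string s = "–"; // paste the mystery dash/hyphen between the quotes
                foreach (char c in s)
                    Console.WriteLine("U+{0:X4}", (int)c);  // e.g. U+2013 (en dash)
                byte[] utf8 = Encoding.UTF8.GetBytes(s);
                Console.WriteLine(BitConverter.ToString(utf8)); // e.g. E2-80-93
            }
        }

    Comparing those bytes against what the third party actually receives should narrow down which side is mis-decoding.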

    Read the article

  • Is There any GUI Application for Flash Media Live Encoding for Ubuntu or Linux

    - by Dumindu Mahawela
    I need to broadcast a TV channel to a website, so I need a GUI application for Flash Media Live Encoding. The famous Adobe FMLE does not have a Linux version. I tried to install Open Broadcast Encoder on Ubuntu 13.04 amd64 but wasn't successful. So the things that I need to know are: Is there any GUI application for Flash Media Live Encoding for Ubuntu or Linux? Is it possible to successfully install Open Broadcast Encoder in Ubuntu?

    Read the article

  • Change language encoding for file uploading

    - by jc.yin
    Previously we were running a WordPress site on a Mac OS X Server machine. We had several hundred images with Chinese characters in the image names. Now we're trying to migrate to an Ubuntu system and everything is fine except the images. Every time I try to upload an image with a Chinese name via FTP, I get the following message: "MyImage contains illegal characters. Please choose an appropriate text encoding". I have no idea how to solve this issue. Do I need to somehow change the system language encoding in Ubuntu to allow for image uploading? Thanks

    Read the article

  • Global UTF-encoding, the right way

    - by mowgli
    I'm curious as to what the right way is to have UTF-8 encoding on all web files. All my files (incl. CSS and JS) are created and saved in UTF-8 encoding. In PHP, I set the charset at the top of the main page (this page includes all others) with:

        header('Content-type: text/html; charset=utf-8');

    In the same page I have this HTML meta tag:

        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />

    Then I stumbled upon an external CSS file that has this on its first line:

        @charset "UTF-8";

    And now I wonder: should I set the charset INSIDE all my CSS/JS files too, like that? And/or should I serve each file with charset=utf-8 in the meta tag?
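    One way to make the per-file declarations redundant, sketched under the assumption of an Apache front end: an explicit charset in the HTTP Content-Type header takes precedence over both the meta tag and @charset, so stamping it at the server level covers every text response at once:

        # .htaccess / server config sketch (Apache assumed)
        AddDefaultCharset utf-8
        # label stylesheets and scripts explicitly as well
        AddCharset utf-8 .css .js

    With that in place, the @charset and meta declarations become harmless fallbacks for files read outside the server.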

    Read the article

  • Character Encoding: â??

    - by akaphenom
    I am trying to piece together the mysterious string of characters â?? that I am seeing quite a bit of in our database. I am fairly sure this is a result of conversion between character encodings, but I am not completely positive. Users are able to enter text (or cut and paste) into an Ext JS rich text editor. The data is posted to a servlet which persists it to the database, and when I view it in the database I see those strange characters. Is there any way to decode these back to their original meaning if I can discover the correct encoding, or have bits or bytes been lost in the conversion process? Users are cutting and pasting from multiple versions of MS Word and PDF. Does the encoding follow from where the user copied? Thank you. The website is UTF-8. We are using MS SQL Server 2005:

        SELECT serverproperty('Collation')             -- Server default collation: Latin1_General_CI_AS
        SELECT databasepropertyex('xxxx', 'Collation') -- Database default: SQL_Latin1_General_CP1_CI_AS

    And the column:

        Column_name: text | Type: varchar | Computed: no | Length: -1 | Nullable: yes | TrimTrailingBlanks: no | FixedLenNullInSource: yes | Collation: SQL_Latin1_General_CP1_CI_AS

    From the SQL Server documentation: when Unicode data is inserted into one of the non-Unicode equivalents of the nchar, nvarchar, and ntext data types through a command string (otherwise known as a "language event"), SQL Server converts the data using the code page associated with the collation of the column. When a character cannot be represented in that code page, it is replaced by a question mark (?), indicating the data has been lost. The appearance of unexpected characters or question marks in your data indicates it has been converted from Unicode to non-Unicode at some layer, and this conversion resulted in lost characters. So this may be the root cause of the problem, and not an easy one to solve on our end.
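    The quoted documentation is easy to demonstrate. A minimal sketch using code page 1252 (the one behind SQL_Latin1_General_CP1_CI_AS) shows the irreversible '?' substitution; as an aside, and only as an educated guess, the â?? pattern itself looks like UTF-8 smart-punctuation bytes (0xE2 0x80 0x9x, e.g. a curly quote) being displayed through a single-byte view, where 0xE2 renders as â and the two unprintable bytes as question marks:

        using System;
        using System.Text;

        class CodePageLoss
        {
            static void Main()
            {
                // Default encoder fallback replaces unmappable characters with '?'.
                Encoding cp1252 = Encoding.GetEncoding(1252);

                string unicode = "Temperature Ω rising → fast";
                byte[] stored  = cp1252.GetBytes(unicode);   // what a varchar column keeps
                Console.WriteLine(cp1252.GetString(stored)); // "Temperature ? rising ? fast"
                // The original characters are gone; no decoding can bring them back.
            }
        }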

    Read the article

  • [Asp.Net MVC] Encoding a character

    - by Trimack
    Hi, I am experiencing some weird encoding behaviour in my ASP.NET MVC project. In my Site.Master there is:

        <div class="logo">
            <a href="<%=Url.Action("Index", "Win7")%>"><%= Html.Encode("Windows 7 Tutoriál") %></a></div>

    which comes out in the resulting page as:

        <div class="logo">
            <a href="/">Windows 7 TutoriA?l</a></div>

    However, in Index.aspx there is:

        <h1>Windows 7 Tutoriál</h1>

    which renders correctly on the same resulting page. I do have <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> as the first line in <head>. Locally, both files are saved in UTF-8 encoding. Any ideas why this is happening and how to fix it? Thanks in advance.
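    One avenue worth checking, offered as an assumption rather than a confirmed diagnosis: when only one source file garbles text, that file is often being read with the wrong encoding at compile time (for example, UTF-8 saved without a BOM). ASP.NET's <globalization> element in web.config makes the expectation explicit for all source files, requests, and responses:

        <!-- web.config sketch: tell ASP.NET how source files are saved and
             how requests/responses should be encoded -->
        <system.web>
            <globalization fileEncoding="utf-8"
                           requestEncoding="utf-8"
                           responseEncoding="utf-8" />
        </system.web>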

    Read the article

  • Decoding not reversing unicode encoding in Django/Python

    - by PhilGo20
    Ok, I have a hardcoded string I declare like this:

        name = u"Par Catégorie"

    I have a # -*- coding: utf-8 -*- magic header, so I am guessing it's converted to UTF-8. Down the road it's output to XML through:

        xml_output.toprettyxml(indent='....', encoding='utf-8')

    And I get:

        UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 3: ordinal not in range(128)

    Most of my data is in French and is output correctly in CDATA nodes, but that one hardcoded string keeps ... I don't see why an ascii codec is called. What's wrong?
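    A minimal sketch of the usual mechanism behind that exact message, assuming Python 2 (which Django ran on at the time): the magic header does not convert anything, it only makes the literal legal; the error appears wherever a UTF-8 byte string and a unicode string meet, because Python implicitly decodes the bytes with the ascii codec:

        # -*- coding: utf-8 -*-
        # Sketch: mixing UTF-8 bytes with unicode forces an implicit ascii decode.
        name = u"Par Catégorie"
        as_bytes = name.encode('utf-8')        # str on Python 2; "é" becomes 0xC3 0xA9
        try:
            mixed = u"<category>" + as_bytes   # implicit as_bytes.decode('ascii')
        except UnicodeDecodeError as e:
            print e                            # 'ascii' codec can't decode byte 0xc3 ...

    If the value stays unicode everywhere and is encoded exactly once, at the toprettyxml(encoding='utf-8') boundary, the ascii codec never enters the picture.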

    Read the article

  • Slow network file transfer (under 20KB/s) on newly built x64 Win7

    - by Mangoshake
    I am getting <20KB/s on local network file transfers. If I transfer a very small file (less than 100KB) it starts quickly, then slows to <20KB/s; all subsequent network file transfers are slow, and a reboot is needed to reset this. If I transfer a large file it is stuck on "calculating" for a long time and then begins at <20KB/s immediately. This is a newly built desktop running Windows 7 x64 SP1, with Realtek gigabit LAN from the motherboard (ASRock Extreme3 Gen3). The problematic speed is observed on the private LAN, both through ethernet and WiFi. The router is a D-Link DIR-655. Remote Differential Compression is off. Drivers are up to date from ASRock's website. I have tested network file transfer to and from another Windows 7 laptop and a MacBook Pro, so I am fairly certain it is the desktop's problem. The slow speed also happens in only one direction, outbound from the desktop, regardless of whether I initiate the file transfer from the origin or the destination. Inbound network file transfer and internet speeds are fine, so I don't think this is a hardware issue. I am getting 74.8MB/s internet upload speed from speedtest.net (http://www.speedtest.net/result/1852752479.png). On inbound network file transfer I get around 10-15MB/s. I am hoping this community has some insight for me to troubleshoot this. I don't see anything obviously related in the Event Viewer, and beyond that I just don't know where else to look. Any suggestions are greatly appreciated; thank you in advance.
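    Not a confirmed diagnosis, but the classic knobs for one-directional slowness on Windows 7 are TCP auto-tuning and the NIC's offload features. The netsh commands below are standard Windows tools; the offload settings (Large Send Offload in particular) live in the Realtek adapter's Advanced properties in Device Manager:

        rem sketch of common mitigations; run from an elevated command prompt
        netsh interface tcp set global autotuninglevel=highlyrestricted
        netsh interface tcp set global rss=disabled
        rem revert with autotuninglevel=normal / rss=enabled if it makes no difference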

    Read the article

  • C#/Oracle: Specify Encoding/Character Set of Query?

    - by Reini
    I'm trying to fetch some data from an Oracle 10 database. Some cells contain German umlauts (äöü). In my administration tool (TOAD) I can see them very well: "Mantel für Damen" (jacket for women). This is my C# code (simplified):

        var oracleCommand = new OracleCommand(sqlGetArticles, databaseConnection);
        var articleResult = oracleCommand.ExecuteReader();
        articleResult.Read();
        string temp = articleResult["SomeField"].ToString();
        Console.WriteLine(temp);

    The output is: "Mantel f?r Damen". I've tried viewing it while debugging (hovering over the variable), in the Debug window, the console window, and a file. I think I have to specify the encoding/character set somewhere. But where?
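    A cheap check worth doing before blaming the driver, sketched on the assumption that the '?' shows up in console and file output: the default console code page, not the Oracle client, may be eating the umlauts. If this prints correctly, the string arrived intact and only the display layer was at fault; if not, the client/database NLS character-set configuration is the next suspect:

        using System;
        using System.Text;

        class ConsoleEncodingCheck
        {
            static void Main()
            {
                Console.OutputEncoding = Encoding.UTF8; // force a Unicode-capable code page
                Console.WriteLine("Mantel für Damen");  // stand-in for the fetched value
            }
        }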

    Read the article

  • Can SVG render partially if gzipped and chunk-transferred?

    - by Scott Stafford
    Hi - I have some large, dynamically generated SVGs that are being served over a relatively slow internet connection. I'm trying to optimize them to be viewable as fast as possible. If I set the server to Content-Encoding: gzip and Transfer-Encoding: chunked, will any SVG viewers take advantage of that and render it partially, as it is transferred? If not, are there other ways to get it to render as-it-streams? I could break it up into several SVG pieces but that will be a lot of work, I was hoping for server settings... The most common users use IE7 with the Adobe SVG Viewer plugin. I doubt it matters but I'm serving with C#/ASP.NET and IIS6.
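    For the server side, a minimal sketch of gzip plus streaming in ASP.NET, using the same Response.Filter technique shown in the GZip article further down this page; GenerateSvgFragments is a hypothetical stand-in for the dynamic SVG generator, and whether a given viewer renders partial SVG as it arrives is entirely up to the client:

        // Sketch: gzip the response and flush as fragments are produced; with
        // buffering off, IIS sends the body with Transfer-Encoding: chunked.
        Response.ContentType = "image/svg+xml";
        Response.AppendHeader("Content-Encoding", "gzip");
        Response.Filter = new System.IO.Compression.GZipStream(
            Response.Filter, System.IO.Compression.CompressionMode.Compress);
        Response.BufferOutput = false;
        foreach (string fragment in GenerateSvgFragments()) // hypothetical generator
        {
            Response.Write(fragment);
            Response.Flush(); // pushes a chunk; GZipStream may still buffer some bytes
        }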

    Read the article

  • download speed is fast but transfer rate is slow

    - by Ieyasu Sawada
    I've just changed ISP and I'm pretty disappointed with the transfer rate. My previous connection had a download speed of 1.08 Mb/s as measured by http://speedtest.net, and the download transfer rate was about 100KB/s for sites that don't limit their bandwidth. Now my connection has about 2Mb/s download speed, but the transfer rate bounces between 20-50KB/s. I was expecting a speed much higher than this because of the download speed I'm getting when testing. The questions are: what's the difference between transfer rate and download speed? Is it normal to have a high download speed but a low transfer rate? Shouldn't the download speed be proportional to the transfer rate?
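    A unit check that frames the question (plain arithmetic, not a diagnosis of the ISP): speed tests report megabits per second, while downloads usually report kilobytes per second, and there are eight bits to the byte:

        2 Mbit/s    ÷ 8 = 250 KB/s   theoretical ceiling of the new line
        1.08 Mbit/s ÷ 8 = 135 KB/s   old line's ceiling, close to the observed 100 KB/s
        observed now: 20-50 KB/s, i.e. only 8-20% of the new line's capacity

    So yes, the two should be roughly proportional, and the old connection was; the shortfall on the new line points at the path or the ISP rather than at the units.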

    Read the article

  • Free Domain hosting configurations and transfer

    - by upog
    I have registered a new domain name with GoDaddy.com, and now I would like to host my domain for free. Assume the app is a basic HTML page. I have done some searching and decided to host it under Google App Engine. I am looking for answers to a few questions: Currently my domain name is managed by GoDaddy; how can I transfer it to Google App Engine so that going forward it will be managed by Google? How can I configure the new domain in Google App Engine and associate it with my domain name? Is there any indirect cost involved in a domain hosted via Google App Engine? Any suggestion for free and reliable hosting is also welcome. Update: can I host a free web page on cloud.google.com?

    Read the article

  • USB file transfer preparing to copy, but file count climbs indefinitely

    - by Alex
    I have downloaded Ubuntu 12.04 to back up my Windows machine that won't boot, and am running Ubuntu from the CD. I copied and pasted a volume of about 160GB to my external HDD. The transfer has been stuck in the "preparing to copy" stage for several hours and is displaying a file/GB count about twice as high as the volume of data actually being copied! The number is now larger than the entire partition that's being copied from! However, I know it's doing something, because occasionally it pops up with a minor I/O error on this or that file which I then have to click through. I've not had this problem before, so I can only assume it's a Linux/Ubuntu thing. More importantly, what I want to know is: is there any other way to copy it across that will actually work?
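    One alternative worth sketching, with the mount points below as assumptions since they vary between live sessions: rsync copies file by file, prints progress as it goes, and keeps going past individual read errors instead of front-loading a giant "preparing" scan:

        # run in a terminal on the live CD; source/destination paths are examples
        rsync -rtv --progress /media/windows/ /media/external-hdd/backup/
        # re-running the same command resumes, skipping files already copied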

    Read the article

  • Typical text encoding+BOM, and EOL behavior on mobile devices

    - by Dan W
    Typical things to worry about when dealing with text are the BOM/signature, encoding, and the end of line (EOL) char/chars. I know that Windows often favours \r\n (CR+LF) and Mac/Linux favours \n (LF), but how about mobile devices such as the iPhone and Android? Do typical apps on those platforms favour one or the other? Also, which text encodings are mobiles most likely to use - UTF-8, iso-8859-1, or even Windows 1252 (or other default codepage) or maybe even UTF-16? And if they use UTF-8/16, are they likely to need (or require not having) a BOM/signature? What is the typical behavior here?

    Read the article

  • How to speed up file transfer to/from Ubuntu Server 11.10 (wifi)

    - by Alexander
    I've been searching AU & elsewhere for the last day and a half and haven't found an answer, so I joined AU to ask for help. I'm hoping someone can point me in the right direction. The setup:

        Ubuntu Server 11.10, Samba, VSFTPD
        Windows 7 PC, 2 MacBook Pros (Snow Leopard/Lion), 1 iMac (Lion)
        Wireless LAN using a D-Link DIR-655
        Link speed: 195 Mbit/s on Mac, 54 Mbps on Windows
        ISP connection: cable, 20 down / 3 up

    No domain controller; all machines are members of the same workgroup. No matter how I connect, I can't get better than about a 700K transfer rate up/down: Mac/PC, SMB/FTP, domain name/local IP. I've tried different user accounts and using different folders and different volumes on the server. Nothing seems to make a difference: 700K up/down, period. Any suggestions would be greatly appreciated. Thanks, Alexander

    EDIT: Using sftp now, and uploading seems to peak at 980K. About 5 minutes into a 650MB file, downloading is at 1072K and climbing about 500B/s every ten seconds, if any of that matters... I was expecting a lot faster than a 1MB transfer rate. Am I off base here?

    EDIT: From all I've read so far, perhaps the speed isn't that bad. I only installed Ubuntu out of boredom this past weekend. The trouble is, I like it. Guess it's time to ditch the wifi and run some Cat 5.
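    Before blaming Samba or vsftpd, a raw TCP measurement separates the wifi link from the file services sitting on top of it; a sketch assuming iperf is installed on both ends (apt-get install iperf on the server):

        # on the Ubuntu server:
        iperf -s
        # on a Mac/PC client (IP is an example), run for 30 seconds:
        iperf -c 192.168.0.10 -t 30
        # if raw TCP is also ~1 MB/s, the wifi link itself is the bottleneck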

    Read the article

  • Using GameKit to transfer CoreData data between iPhones, via NSDictionary

    - by OscarTheGrouch
    I have an application where I would like to exchange information, managed via Core Data, between two iPhones: first turning the Core Data object into an NSDictionary (something very simple that gets turned into NSData to be transferred). My Core Data entity has 3 string attributes and 2 image attributes that are transformable. I have looked through the NSDictionary API but have not had any luck creating one from the Core Data information or adding that information to it. Any help or sample code regarding this would be greatly appreciated.
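    A minimal sketch of one approach; the attribute names here are hypothetical stand-ins for the entity's real ones. Key-value coding can pull the attributes into a dictionary, and converting the transformable images to PNG NSData keeps everything archivable:

        // Sketch: managed object -> NSDictionary -> NSData for the GameKit send.
        NSMutableDictionary *payload = [[entity dictionaryWithValuesForKeys:
            [NSArray arrayWithObjects:@"title", @"subtitle", @"notes", nil]] mutableCopy];
        [payload setObject:UIImagePNGRepresentation(entity.frontImage) forKey:@"frontImage"];
        [payload setObject:UIImagePNGRepresentation(entity.backImage)  forKey:@"backImage"];
        NSData *blob = [NSKeyedArchiver archivedDataWithRootObject:payload];
        // receiving side:
        // NSDictionary *received = [NSKeyedUnarchiver unarchiveObjectWithData:blob];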

    Read the article

  • Detect the URI encoding automatically in Tomcat

    - by Roland Illig
    I have an instance of Apache Tomcat 6.x running, and I want it to interpret the character set of incoming URLs a little more intelligently than the default behavior. In particular, I want to achieve the following mapping:

        So%DFe      => Soße
        So%C3%9Fe   => Soße
        So%DF%C3%9F => (error)

    The behavior I want could be described as "try to decode the byte stream as UTF-8, and if it doesn't work, assume ISO-8859-1". Simply using the URIEncoding configuration doesn't work in that case. So how can I configure Tomcat to decode the request the way I want? I might have to write a Filter that takes the request (especially the query string) and re-encodes it into the parameters. Would that be the natural way?
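    Tomcat's URIEncoding is indeed one fixed charset, so the heuristic has to live in code. A sketch of the decoding core, on the assumption that it gets wrapped in a servlet Filter that re-parses the raw query string; note that because ISO-8859-1 can decode any byte sequence, the third mapping above (mixed input => error) needs an extra validity rule on top of this fallback:

        import java.nio.ByteBuffer;
        import java.nio.charset.CharacterCodingException;
        import java.nio.charset.Charset;
        import java.nio.charset.CharsetDecoder;
        import java.nio.charset.CodingErrorAction;

        final class LenientUriDecoder {
            /** Strict UTF-8 first; on malformed input, fall back to ISO-8859-1. */
            static String decode(byte[] raw) {
                CharsetDecoder utf8 = Charset.forName("UTF-8").newDecoder()
                        .onMalformedInput(CodingErrorAction.REPORT)
                        .onUnmappableCharacter(CodingErrorAction.REPORT);
                try {
                    return utf8.decode(ByteBuffer.wrap(raw)).toString();
                } catch (CharacterCodingException e) {
                    return new String(raw, Charset.forName("ISO-8859-1"));
                }
            }
        }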

    Read the article

  • Ruby character encoding problems in NetBeans and command window

    - by salgo60
    I use NetBeans as my development IDE and run the application from cmd, but I have problems displaying ISO 8859-1 characters like åäö correctly, both in the cmd window and when I run the application from NetBeans. Question: what is best practice to set this up? Right now I do:

        @output.puts indent + "V" + 132.chr + "lkommen till Ruby Camping!"

    to get ä. My environment:

        chcp 65001
        Active code page: 65001
        ruby main.rb
        Source encoding: #<Encoding:US-ASCII>
        Default external: #<Encoding:UTF-8>
        Default internal: nil
        Locale charmap: "CP65001"

    where I have in the code:

        def self.printEncoding
          puts "Source encoding: #{__ENCODING__.inspect}" if defined? __ENCODING__
          if defined? Environment::Encoding
            puts "Default external: #{Encoding.default_external.inspect}"
            puts "Default internal: #{Encoding.default_internal.inspect}"
            puts "Locale charmap: #{Encoding.locale_charmap.inspect}"
          end
          puts "LANG environment variable: #{ENV['LANG'].inspect}" unless ENV['LANG'].nil?
        end

        ruby -v
        ruby 1.9.1p378 (2010-01-10 revision 26273) [i386-mingw32]
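    A sketch of the usual Ruby 1.9 setup, assuming the source files are actually saved as UTF-8: declare the source encoding with a magic comment so the 132.chr byte trick becomes unnecessary; cmd.exe then needs a matching code page (chcp 65001, as above) and a TrueType console font such as Lucida Console to display the characters:

        # encoding: utf-8
        # First line of the file: makes the source encoding UTF-8 instead of
        # US-ASCII, so literals like "ä" are legal and no 132.chr tricks are needed.
        puts "Välkommen till Ruby Camping!"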

    Read the article

  • Character-encoding problem spring

    - by aelshereay
    Hi all, I am stuck on a big problem with encoding on my website! I use Spring 3, Tomcat 6, and a MySQL DB. I want to support German and Czech along with English. I created all the JSPs as UTF-8 files, and in each JSP I include the following:

        <%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
        <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">

    I created messages.properties (the default, which is Czech), messages_de.properties, and messages_en.properties, and all of them are saved as UTF-8 files. I added the following to web.xml:

        <filter>
            <filter-name>encodingFilter</filter-name>
            <filter-class>org.springframework.web.filter.CharacterEncodingFilter</filter-class>
            <init-param>
                <param-name>encoding</param-name>
                <param-value>UTF-8</param-value>
            </init-param>
            <init-param>
                <param-name>forceEncoding</param-name>
                <param-value>true</param-value>
            </init-param>
        </filter>
        <filter-mapping>
            <filter-name>encodingFilter</filter-name>
            <url-pattern>/*</url-pattern>
        </filter-mapping>
        <locale-encoding-mapping-list>
            <locale-encoding-mapping>
                <locale>en</locale>
                <encoding>UTF-8</encoding>
            </locale-encoding-mapping>
            <locale-encoding-mapping>
                <locale>cz</locale>
                <encoding>UTF-8</encoding>
            </locale-encoding-mapping>
            <locale-encoding-mapping>
                <locale>de</locale>
                <encoding>UTF-8</encoding>
            </locale-encoding-mapping>
        </locale-encoding-mapping-list>

    And I added the following to my applicationContext.xml:

        <bean id="messageSource"
              class="org.springframework.context.support.ResourceBundleMessageSource"
              p:basenames="messages"/>
        <!-- Declare the Interceptor -->
        <mvc:interceptors>
            <bean class="org.springframework.web.servlet.i18n.LocaleChangeInterceptor"
                  p:paramName="locale" />
        </mvc:interceptors>
        <!-- Declare the Resolver -->
        <bean id="localeResolver"
              class="org.springframework.web.servlet.i18n.SessionLocaleResolver" />

    I set the useBodyEncodingForURI attribute to true in the <Connector> element of server.xml under %CATALINA_HOME%/conf, and another time tried adding URIEncoding="UTF-8" instead. I created all the tables and fields with charset utf8 and collation utf8_general_ci. The encoding in my browser is UTF-8 (BTW, I have IE8 and Firefox 3.6.3). When I open MySQL Query Browser and insert Czech or German data manually, it is inserted correctly and displayed correctly in my app as well. So, here's the list of problems I have:

        1. By default the messages.properties (Czech) bundle should load; instead messages_en.properties loads by default.
        2. In the web form, when I enter Czech data and click submit, the data the controller prints to the console before saving contains strange characters, and this exact data is what gets saved to the DB.

    I don't know where the mistake is! Why can't I get it working although I did what worked for other people? Please help me; I've been stuck on this problem for days and it's driving me crazy. Thank you in advance.
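    Two things stand out, offered as likely suspects rather than certainties. First, java.util.ResourceBundle reads .properties files as ISO-8859-1, so Czech/German bundles saved as UTF-8 get mangled unless the message source is told the encoding; Spring's reloadable variant accepts one. Second, the ISO 639-1 code for Czech is cs, not cz, which matters anywhere a Czech locale is referenced. A sketch of both fixes:

        <!-- applicationContext.xml sketch: read the bundles as UTF-8 -->
        <bean id="messageSource"
              class="org.springframework.context.support.ReloadableResourceBundleMessageSource"
              p:basename="classpath:messages"
              p:defaultEncoding="UTF-8" />

        <!-- make Czech the explicit default locale instead of the JVM's -->
        <bean id="localeResolver"
              class="org.springframework.web.servlet.i18n.SessionLocaleResolver"
              p:defaultLocale="cs" />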

    Read the article

  • ASP.NET GZip Encoding Caveats

    - by Rick Strahl
    GZip encoding in ASP.NET is pretty easy to accomplish using the built-in GZipStream and DeflateStream classes and applying them to the Response.Filter property. While applying GZip and Deflate behavior is pretty easy, there are a few caveats that you have to watch out for, as I found out today for myself with an application that was throwing up some garbage data. But before looking at caveats, let's review GZip implementation for ASP.NET.

    ASP.NET GZip/Deflate Basics

    Response filters are applied to the Response.OutputStream and transform it as data is written to it through the ASP.NET Response object. So a Response.Write eventually gets written into the output stream and, if a filter is attached, passes through the filter stream's interface as well. To perform the actual GZip (and Deflate) encoding typically used by Web pages, .NET includes the GZipStream and DeflateStream stream classes, which can be readily assigned to the Response.Filter property. With these two stream classes in place it's almost trivially easy to create a couple of reusable methods that allow you to compress your HTTP output. In my standard WebUtils utility class (from the West Wind Web Toolkit) I created two static utility methods, IsGZipSupported and GZipEncodePage, that check whether the client supports GZip encoding and then actually encode the current output (note that although the method includes 'Page' in its name, this code will work with any ASP.NET output).

        /// <summary>
        /// Determines if GZip is supported
        /// </summary>
        /// <returns></returns>
        public static bool IsGZipSupported()
        {
            string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
            if (!string.IsNullOrEmpty(AcceptEncoding) &&
                (AcceptEncoding.Contains("gzip") || AcceptEncoding.Contains("deflate")))
                return true;
            return false;
        }

        /// <summary>
        /// Sets up the current page or handler to use GZip through a Response.Filter
        /// IMPORTANT:
        /// You have to call this method before any output is generated!
        /// </summary>
        public static void GZipEncodePage()
        {
            HttpResponse Response = HttpContext.Current.Response;
            if (IsGZipSupported())
            {
                string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
                if (AcceptEncoding.Contains("deflate"))
                {
                    Response.Filter = new System.IO.Compression.DeflateStream(
                        Response.Filter, System.IO.Compression.CompressionMode.Compress);
                    Response.Headers.Remove("Content-Encoding");
                    Response.AppendHeader("Content-Encoding", "deflate");
                }
                else
                {
                    Response.Filter = new System.IO.Compression.GZipStream(
                        Response.Filter, System.IO.Compression.CompressionMode.Compress);
                    Response.Headers.Remove("Content-Encoding");
                    Response.AppendHeader("Content-Encoding", "gzip");
                }
            }
            // Allow proxy servers to cache encoded and unencoded versions separately
            Response.AppendHeader("Vary", "Content-Encoding");
        }

    As you can see, the actual assignment of the Filter is as simple as:

        Response.Filter = new DeflateStream(Response.Filter, System.IO.Compression.CompressionMode.Compress);

    which applies the filter to the OutputStream. You also need to ensure that your response reflects the new GZip or Deflate encoding, and that any pages cached in proxy servers can differentiate between pages that were encoded with the various different encodings (or no encoding).
    Using this utility function is now trivially easy. Any ASP.NET code that wants to compress its Response output simply calls it:

        protected void Page_Load(object sender, EventArgs e)
        {
            WebUtils.GZipEncodePage();

            Entry = WebLogFactory.GetEntry();
            var entries = Entry.GetLastEntries(App.Configuration.ShowEntryCount,
                "pk,Title,SafeTitle,Body,Entered,Feedback,Location,ShowTopAd", "TEntries");
            if (entries == null)
                throw new ApplicationException("Couldn't load WebLog Entries: " + Entry.ErrorMessage);

            this.repEntries.DataSource = entries;
            this.repEntries.DataBind();
        }

    Here I use an ASP.NET page, but the above WebUtils.GZipEncodePage() method call will work in any ASP.NET application type, including HTTP handlers. The only requirement is that the filter needs to be applied before any other output is sent to the OutputStream. For example, in my CallbackHandler service implementation, output over a certain size is GZip encoded by default. The output that is generated is JSON or XML, and if the output is over 5k in size I apply WebUtils.GZipEncodePage():

        if (sbOutput.Length > GZIP_ENCODE_TRESHOLD)
            WebUtils.GZipEncodePage();

        Response.ContentType = ControlResources.STR_JsonContentType;
        HttpContext.Current.Response.Write(sbOutput.ToString());

    Ok, so you probably get the idea: encoding GZip/Deflate content is pretty easy.

    Hold on there Hoss – Watch your Caching

    Or is it? There are a few caveats that you need to watch out for when dealing with GZip content. The first issue is that you need to deal with the fact that some clients don't support GZip or Deflate content. Most modern browsers support it, but if you have a programmatic HTTP client accessing your content, GZip/Deflate support is by no means guaranteed. For example, WinInet HTTP clients don't support GZip out of the box; it has to be explicitly implemented. Other low level HTTP clients on other platforms also don't support GZip out of the box. The problem is that your application, your Web server, and proxy servers on the Internet might be caching your generated content. If you return content with GZip once and then again without, either caching is not applied or, worse, the wrong type of content is returned back to the client from a cache or proxy. The result is an unreadable response for *some clients*, which is also very hard to debug and fix once in production. You already saw the issue of proxy servers addressed in the GZipEncodePage() function:

        // Allow proxy servers to cache encoded and unencoded versions separately
        Response.AppendHeader("Vary", "Content-Encoding");

    This ensures that any proxy servers also check the Content-Encoding HTTP header to cache their content, not just the URL. The same thing applies if you do output caching in your own ASP.NET code. If you generate output for GZip on an OutputCached page, the GZipped content will be cached (either by ASP.NET's cache or, in some cases, by the IIS kernel cache). But what if the next client doesn't support GZip? She'll get served a cached GZip page that won't decode, and she'll get a page full of garbage. Wholly undesirable.
    To fix this you need to add some custom OutputCache rules by way of the GetVaryByCustomString() HttpApplication method in your global.asax file:

        public override string GetVaryByCustomString(HttpContext context, string custom)
        {
            // Override Caching for compression
            if (custom == "GZIP")
            {
                string acceptEncoding = HttpContext.Current.Response.Headers["Content-Encoding"];
                if (string.IsNullOrEmpty(acceptEncoding))
                    return "";
                else if (acceptEncoding.Contains("gzip"))
                    return "GZIP";
                else if (acceptEncoding.Contains("deflate"))
                    return "DEFLATE";
                return "";
            }
            return base.GetVaryByCustomString(context, custom);
        }

    In a page that uses output caching you then specify:

        <%@ OutputCache Duration="180" VaryByParam="none" VaryByCustom="GZIP" %>

    to use that custom rule.

    It's all Fun and Games until ASP.NET throws an Error

    Ok, so you're up and running with GZip, you have your caching squared away, and the pages you are applying it to are jamming along. Then BOOM, something strange happens and you get a lovely garbled page of raw binary junk. Lovely, isn't it? What's happened here is that I have WebUtils.GZipEncodePage() applied to my page, but there's an error in the page. The error falls back to the ASP.NET error handler, and the error handler removes all existing output (good) and removes all the custom HTTP headers I've set manually (usually good, but very bad here). Since I applied the Response.Filter (via GZipEncodePage), the output is now GZip encoded, but ASP.NET has removed my Content-Encoding header, so the browser receives the GZip encoded content without any notification that it is encoded as GZip. The result is binary output. Here's what Fiddler says about the raw HTTP header output when an error occurs while GZip encoding was applied:

        HTTP/1.1 500 Internal Server Error
        Cache-Control: private
        Content-Type: text/html; charset=utf-8
        Date: Sat, 30 Apr 2011 22:21:08 GMT
        Content-Length: 2138
        Connection: close

        ?`I?%&/m?{J?J??t??` … binary output stripped here

    Notice: no Content-Encoding header, and that's why we're seeing this garbage. ASP.NET has stripped the Content-Encoding header but left our filter intact. So how do we fix this? In my applications I typically have a global Application_Error handler set up, and in this case I've been using that. One thing that you can do in the Application_Error handler is explicitly clear out the Response.Filter and set it to null at the top:

        protected void Application_Error(object sender, EventArgs e)
        {
            // Remove any special filtering especially GZip filtering
            Response.Filter = null;
            …
        }

    And voila, I get my Yellow Screen of Death or my custom generated error output back via uncompressed content. BTW, the same is true for page-level errors handled in Page_Error or ASP.NET MVC error handling methods in a controller.
    Another, and possibly even better, solution is to check whether a filter is attached just before the headers are sent to the client, as pointed out by Adam Schroeder in the comments:

        protected void Application_PreSendRequestHeaders()
        {
            // ensure that if GZip/Deflate Encoding is applied that headers are set
            // also works when error occurs if filters are still active
            HttpResponse response = HttpContext.Current.Response;
            if (response.Filter is GZipStream &&
                response.Headers["Content-encoding"] != "gzip")
                response.AppendHeader("Content-encoding", "gzip");
            else if (response.Filter is DeflateStream &&
                     response.Headers["Content-encoding"] != "deflate")
                response.AppendHeader("Content-encoding", "deflate");
        }

    This uses the Application_PreSendRequestHeaders() pipeline event to check for compression encoding in a filter and adjusts the headers accordingly. This is actually a better solution since it is generic: it'll work regardless of how the content is cleaned up. For example, an error Response.Redirect() or a short error display might get changed and the filter not cleared, and this code actually handles that. Sweet, thanks Adam. It's unfortunate that ASP.NET doesn't natively clear out Response.Filters when an error occurs, just as it clears the Response and headers. I can't see where leaving a filter in place in an error situation would make any sense, but hey, this is what it is, and it's easy enough to fix as long as you know where to look. Riiiight!

    IIS and GZip

    I should also mention that IIS 7 includes good support for compression natively. If you can defer encoding to let IIS perform it for you rather than doing it in your code, by all means you should do it! Especially any static or semi-dynamic content that can be made static should be using IIS built-in compression. Dynamic caching is also supported, but it is a bit more tricky to judge in terms of performance and footprint. John Forsyth has a great article on the benefits and drawbacks of IIS 7 compression which gives some detailed performance comparisons and impact reviews. I'll post another entry next with some more info on IIS compression, since information on it seems to be a bit hard to come by.

    Related Content: Built-in GZip/Deflate Compression in IIS 7.x; HttpWebRequest and GZip Responses

    © Rick Strahl, West Wind Technologies, 2005-2011. Posted in ASP.NET, IIS7

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >