Search Results

Search found 1246 results on 50 pages for 'backup compression'.

Page 7/50

  • What is the recommended minimum object size for gzip benefits?

    - by utt73
    I'm working on improving page display speed, and one of the methods is to gzip content from the web server. Google recommends: "Note that gzipping is only beneficial for larger resources. Due to the overhead and latency of compression and decompression, you should only gzip files above a certain size threshold; we recommend a minimum range between 150 and 1000 bytes. Gzipping files below 150 bytes can actually make them larger."

    We serve our content through Akamai, using their network as a proxy and CDN. What they've told me: "Following up on your question regarding the minimum size at which Akamai will compress the requested object when sending it to the end user: the minimum size is 860 bytes."

    My reply: "What is the reason Akamai's minimum size is 860 bytes? And why, for example, is this not the case for files Akamai serves for Facebook? (see below) Google recommends gzipping more aggressively, and that seems appropriate on our site, where the most frequent hits, by far, are AJAX calls that are <860 bytes."

    Akamai's response: "The reason 860 bytes is the minimum size for compression is twofold: (1) the overhead of compressing an object under 860 bytes outweighs the performance gain; (2) objects under 860 bytes can be transmitted in a single packet anyway, so there isn't a compelling reason to compress them."

    So I'm here for some fact checking. Is the 860-byte limit due to packet size the end of this reasoning? Why would high-traffic sites push this lower, closer to the 150-byte limit? Just to save on bandwidth costs, or is there a performance gain in doing so?
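
    Google's 150-to-1000-byte guidance is easy to sanity-check locally. Below is a minimal Python sketch (standard library only; the sample payloads are invented for illustration) showing how the fixed gzip header/trailer overhead dominates tiny responses while repetitive text still compresses well:

        import gzip
        import os

        payloads = {
            "tiny JSON": b'{"ok":true,"id":42}',
            "short sentence": b"The quick brown fox jumps over the lazy dog.",
            "1 KB of random bytes": os.urandom(1024),
            "5 KB of repetitive HTML": b"<li class='item'>hello</li>" * 190,
        }
        for name, data in payloads.items():
            out = gzip.compress(data)
            print(f"{name:24}  raw={len(data):>5}  gzipped={len(out):>5}")

    The gzip container alone costs roughly 18 bytes before any DEFLATE output, which is why sub-150-byte responses can grow; and a response well under the typical ~1460-byte payload of a single TCP packet fits in one packet either way, which appears to be the thinking behind Akamai's 860-byte cutoff.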

    Read the article

  • Compressing/compacting messages over websocket on Node.js

    - by icelava
    We have a websocket implementation (Node.js/Sock.js) that exchanges data as JSON strings. As our use cases grow, so has the size of the data transmitted across the wire. The websocket protocol does not natively offer any compression feature, so in order to reduce the size of our messages we would have to do something about the serialisation ourselves. There appear to be a variety of LZW implementations in Javascript, some of which confuse me as to whether they are suitable only for in-browser use or also for transmission across the wire, due to my lack of understanding of low-level encodings. More importantly, all of them seem to impose a noticeable performance drag when Javascript is the engine doing the compression/decompression work, which is not desirable for mobile devices. Looking instead at other forms of compact serialisation: MessagePack does not appear to have any active support in Javascript itself; BSON does not have a Javascript implementation; and an alternative BISON project that I tested does not deserialise everything back to its original values (large numbers), and it does not look like any further development will happen either. What other options have people explored for Node.js?
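
    For a rough sense of what plain DEFLATE buys on a JSON message, here is a minimal Python sketch (not the poster's Node.js stack; the message shape is invented for illustration):

        import json
        import zlib

        message = {"type": "update",
                   "items": [{"id": i, "x": i * 0.5, "y": i * 1.5} for i in range(200)]}

        raw = json.dumps(message).encode("utf-8")
        packed = zlib.compress(raw, 6)          # level 6 is the usual speed/size middle ground

        print(f"JSON: {len(raw)} bytes, deflated: {len(packed)} bytes")
        assert json.loads(zlib.decompress(packed)) == message   # round-trips losslessly

    The same trade-off the poster mentions applies wherever the decompression runs: on a phone browser the CPU cost may outweigh the byte savings. (Later websocket stacks can also negotiate the permessage-deflate extension, which pushes this work into the transport layer itself.)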

    Read the article

  • Optimize video filesize without quality loss

    - by user12015
    Is there a simple way (on the command line - I want to write a script which compresses all videos in a folder) to reduce the file size of a video (almost) without quality loss? Is there a method which works equally well for different video formats (mp4, flv, m4v, mpg, mov, avi)? I should mention that most of the videos I would like to compress are downloaded web videos (mp4, flv), so it's not clear whether there is much room for further compression.
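
    One common route (a sketch, not a definitive recipe: it assumes ffmpeg with libx264 is installed, and the folder name, CRF value and preset are illustrative) is to batch re-encode to H.264 with a constant rate factor, which gives a large size reduction at a quality loss that is usually hard to see:

        import pathlib
        import subprocess

        SRC = pathlib.Path("videos")                     # hypothetical input folder
        EXTS = {".mp4", ".flv", ".m4v", ".mpg", ".mov", ".avi", ".wmv"}

        for f in sorted(SRC.iterdir()):
            if f.suffix.lower() not in EXTS:
                continue
            out = f.with_name(f.stem + "_small.mp4")
            # CRF 23 is the libx264 default; lower values mean better quality and larger files
            subprocess.run(["ffmpeg", "-i", str(f),
                            "-c:v", "libx264", "-crf", "23", "-preset", "slow",
                            "-c:a", "aac", "-b:a", "128k",
                            str(out)], check=True)

    As the poster suspects, files that are already efficiently encoded web video (mp4/flv) may not shrink much without a visible quality hit; truly lossless size reduction of already-compressed video is generally not possible.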

    Read the article

  • Anyone have any opinions about Chilkatsoft? [closed]

    - by Joe Enos
    I'm considering purchasing the Chilkatsoft bundle, which includes a bunch of libraries on lots of technologies. Specifically, I care about .NET compression, encryption, FTP, and mail libraries, but I'm interested in looking at the rest of their stuff as well. Does anyone have any experience using these libraries, or opinions on the company or product in general? The price is right, and the content seems good, so I just want to make sure I do my homework before purchasing. Thanks

    Read the article

  • Decoding a compressed short string; uncertain on compression used - Updated

    - by James
    Hi, I have a program that is compressing a string in an unknown way. I know a few inputs and the outputs produced, but I am not sure what is being used to compress the string. Here are my examples:

    (just 38 x "a", no spaces or anything else)
    In:  "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
    Out: "026900211AA63000026900"

    (just 32 x "a")
    In:  "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
    Out: "0209001c1aa7a000020900"

    (31 x "a", then 1 "b")
    In:  "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaab"
    Out: "0209000177c553c000020900"

    (31 x "b", then 1 "a")
    In:  "bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbba"
    Out: "0209001e7754f38000020900"

    In:  "Hey wot u doing 2day u wanna do something"
    Out: "02990011C7C62E78CE6B8E3ACD83E81B37C5C5A6B9D1E1B06963DB5E71155C1000029900"

    (same as the previous string, but with a space at the end)
    In:  "Hey wot u doing 2day u wanna do something "
    Out: "02A90012C7718B9E339AE2EB360FA02CDF17177A674786DF4B1EDAF388AAE08000000002A90000"

    The only definite thing I can see so far is that digits 2 and 3 are the character count as a hex value; i.e. the first one is 0x26 = 38. Also, the first 6 digits are repeated at the end. Any help / advice would be great, thanks!
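
    The length-field observation can be checked mechanically against the samples above; a small Python sketch (it only confirms the poster's own observation, and says nothing about the compression scheme itself):

        samples = [
            ("a" * 38, "026900211AA63000026900"),
            ("a" * 32, "0209001c1aa7a000020900"),
            ("a" * 31 + "b", "0209000177c553c000020900"),
            ("b" * 31 + "a", "0209001e7754f38000020900"),
            ("Hey wot u doing 2day u wanna do something",
             "02990011C7C62E78CE6B8E3ACD83E81B37C5C5A6B9D1E1B06963DB5E71155C1000029900"),
            ("Hey wot u doing 2day u wanna do something ",
             "02A90012C7718B9E339AE2EB360FA02CDF17177A674786DF4B1EDAF388AAE08000000002A90000"),
        ]
        for text, encoded in samples:
            length_field = int(encoded[1:3], 16)    # the 2nd and 3rd hex digits
            print(len(text), length_field, len(text) == length_field)

    Every sample matches, including the trailing-space case (0x2A = 42), which supports the idea that the wrapper stores the plaintext length while the middle section carries the compressed payload.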

    Read the article

  • What are the benefits of the different PHP compression libraries?

    - by Christopher W. Allen-Poole
    I've been looking into ways to compress PHP libraries, and I've found several libraries which might be useful, but I really don't know much about them. I've specifically been reading about the bcompiler and PHAR libraries. Is there any performance benefit to either of these? Are there any "gotchas" I need to watch out for? What are the relative benefits? Do either of them add to or detract from performance? I'm also interested in learning of other libs which might be out there but are not obvious in the documentation. As an aside, does anyone happen to know whether these work more like zip files which just happen to have the code in there, or whether they operate more like Python's pre-compiling, which actually runs a pseudo-compiler? ======================= EDIT ======================= I've been asked, "What are you trying to accomplish?" Well, I suppose the answer is that this is all hypothetical. It is a combination of these: What if my pet project becomes the most popular web project on earth and I want to distribute it quickly and easily? (hey, a man can dream, right?) It also seems that if using PHAR can be done easily, it would be the best way to create a Subversion snapshot. Python has this really cool pre-compiling policy; I wonder if PHP has something like that? These libraries seem to do something similar. Will they do that? Hey, these libraries seem pretty neat, but I'd like clarification on the differences, as they seem to do the same thing.
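
    On the aside about Python: Python's "pre-compiling" just means translating source files to bytecode (.pyc) ahead of time instead of on first import. A minimal sketch (the file and directory names are hypothetical):

        import compileall
        import py_compile

        py_compile.compile("example.py")         # writes the bytecode file under __pycache__/
        compileall.compile_dir("myproject/")     # recursively pre-compiles a whole source tree

    PHAR, by contrast, is essentially an archive format with an executable stub, which is closer to the "zip file that happens to have the code in there" model the poster describes.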

    Read the article

  • Arithmetic Coding Questions

    - by Xophmeister
    I have been reading up on arithmetic coding and, while I understand how it works, all the guides and instructions I've read start with something like: Set up your intervals based upon the frequency of symbols in your data; i.e., more likely symbols get proportionally larger intervals. My main query is, once I have encoded my data, presumably I also need to include this statistical model with the encoding, otherwise the compressed data can't be decoded. Is that correct? I don't see this mentioned anywhere -- the most I've seen is that you need to include the number of iterations (i.e., encoded symbols) -- but unless I'm missing something, this also seems necessary to me. If this is true, that will obviously add an overhead to the final output. At what point does this outweigh the benefits of compression (e.g., say if I'm trying to compress just a few thousand bits)? Will the choice of symbol size also make a significant difference (e.g., if I'm looking at 2-bit words, rather than full octets/whatever)?
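
    For what it's worth, yes: the decoder needs the model, so it is either shipped with the data, fixed by convention, or rebuilt adaptively on both sides as symbols are processed. A back-of-envelope Python sketch of when a shipped table stops mattering (the header format here, one fixed-width count per distinct symbol, is an assumption for illustration):

        import math
        from collections import Counter

        def estimate(data: bytes, counter_bits: int = 32) -> None:
            counts = Counter(data)
            total = len(data)
            # ideal entropy-coded body: -sum(count * log2(probability)) bits
            body_bits = sum(-c * math.log2(c / total) for c in counts.values())
            header_bits = len(counts) * counter_bits
            print(f"{total:>6} bytes: body ~ {body_bits / 8:7.0f} B, model header ~ {header_bits / 8:4.0f} B")

        estimate(b"abracadabra" * 10)      # ~110 bytes: the header is a large fraction of the output
        estimate(b"abracadabra" * 1000)    # thousands of symbols: the header is negligible

    Symbol size pulls in both directions: 2-bit symbols make the model table tiny, but octet (or larger) symbols usually have more skewed statistics to exploit, so neither choice is free.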

    Read the article

  • Ogre3D: seeking advice about game file management

    - by Tibor
    I'm working on a new game, and its related level editor, based on Ogre3D. I was thinking about how I could manage the game files, knowing that Ogre uses .mesh files for models, .material files for material/texture information, etc. At first I thought about a common .zip folder decompressed at runtime (the same way Torchlight and the Ogre samples do). But this way the game assets become a monolithic archive, loading takes time, and it could be difficult to patch them later. So, let's say I have a game object named "Cube" that I want to load in my program. Going for modularity, what if I create a compressed file (using zlib compression routines) named Cube.extname, containing its sub-files Cube.mesh, Cube.material and so on? Are there any alternatives, or should I stick with compressed objects? PS: Just to clarify, this is unrelated to my program code; at the moment I'm using "resources.cfg" pointing to the OgreSDK media directory.
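
    A minimal sketch of the per-object bundle idea using Python's zipfile module (which is DEFLATE/zlib underneath); the Cube.* file names come from the question, and the bundle extension is whatever you pick:

        import zipfile

        def pack_object(name, parts, ext=".bundle"):
            bundle = name + ext
            with zipfile.ZipFile(bundle, "w", compression=zipfile.ZIP_DEFLATED) as z:
                for part in parts:
                    z.write(part)                 # e.g. Cube.mesh, Cube.material
            return bundle

        pack_object("Cube", ["Cube.mesh", "Cube.material"])

    One point in favour of plain zip rather than a custom zlib container: Ogre can mount Zip archives directly as resource locations, so per-object zips may slot into resources.cfg without custom loading code while still keeping patching granular.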

    Read the article

  • Photo sharing social platform [closed]

    - by user1696497
    I am working on a photo sharing social platform like Flickr or Photobucket. To start off with, I have half a million photos as of now. I want to convert all of these into a single format and compression ratio and use that as the original image. I will be storing the original image, an image re-sized for the page layout, and a thumbnail. I started off with Ruby but didn't find supporting libraries; I am considering Python, as it has a good image processing library and Instagram uses it. I would like some advice about how the images should be processed on upload, the most efficient way to store them (database or file system), image compression, and precautions to take. I will also have profile pictures; do I need to store them separately or along with the other images? If I store the images on a file system, which file system should I use, and should I store the URL directly or use an intermediate key-value store like redis?
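
    For the processing step itself, a minimal Python sketch using Pillow (the paths, sizes and JPEG quality values are illustrative, not recommendations):

        from PIL import Image

        def make_variants(src, stem):
            with Image.open(src) as im:
                im = im.convert("RGB")                 # normalise mode for JPEG output
                im.save(stem + "_original.jpg", "JPEG", quality=90)

                display = im.copy()
                display.thumbnail((1280, 1280))        # layout-sized copy, aspect ratio preserved
                display.save(stem + "_display.jpg", "JPEG", quality=85)

                thumb = im.copy()
                thumb.thumbnail((200, 200))
                thumb.save(stem + "_thumb.jpg", "JPEG", quality=80)

        make_variants("upload.png", "photo_000001")

    Most setups of this kind keep the image files on a filesystem or object store and put only metadata and paths/URLs in the database (or a cache such as redis), rather than storing image bytes in the database itself.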

    Read the article

  • [Japanese-language entry: BI/DWH performance tuning on Oracle Database, part 2 (title garbled by encoding)]

    - by Yusuke.Yamamoto
    [Body garbled by encoding; recoverable details only: a short demo video (04:19) showing how Oracle Enterprise Manager can be used to analyse SQL and I/O for BI/DWH workloads on Oracle Database, provided via the Oracle GRID Center. Related keywords: Advanced Compression, Oracle Enterprise Manager, DWH (data warehouse).]

    Read the article

  • Can compressing Program Files save space *and* give a significant boost to SSD performance?

    - by Christopher Galpin
    Considering solid-state disk space is still an expensive resource, compressing large folders has appeal. Thanks to VirtualStore, could Program Files be a case where it might even improve performance?

    Discovery. In particular I have been reading: "SSD and NTFS Compression Speed Increase?", "Does NTFS compression slow SSD/flash performance?", and "Will somebody benchmark whole disk compression (HD, SSD) please?" (may have to scroll up). The first link is particularly dreamy, but maybe has its head a little too far in the clouds. The third link has this sexy semi-log graph (logarithmic scale!). Quote (with notes): "Using highly compressible data (IOmeter), you get at most a 30x performance increase [for reads], and at least a 49x performance DECREASE [for writes]." Assuming I interpreted that sentence correctly, this single user's benchmark has me incredibly interested. Although write performance tanks wretchedly, read performance still soars. It gave me an idea.

    Idea: VirtualStore. It so happens that, thanks to sanity-saving security features introduced in Windows Vista, write access to certain folders such as Program Files is virtualized for non-administrator processes. This means that, in normal (non-elevated) usage, a program or game's attempt to write data to its install location in Program Files (which is perhaps a poor location anyway) is redirected to %UserProfile%\AppData\Local\VirtualStore, somewhere entirely different. Thus, to my understanding, writes to Program Files should primarily occur only when installing an application. This makes compressing it not only a big source of space gain, but also a potential candidate for performance gain.

    Testing. The beginning of this post has me a bit timid: it suggests benchmarking NTFS compression on a whole drive is difficult because turning it off "doesn't decompress the objects". However, it seems to me the compact command is perfectly capable of doing so for both drives and individual folders. Could it be only marking them for decompression the next time the OS reads from them? I need to find the answer before I begin my own testing.

    Read the article

  • Disable SSL / TLS compression in Apache 2.2.x

    - by DevGav
    Is there a way to disable SSL/TLS Compression in Apache 2.2.x when using mod_ssl? If not, what are people doing to mitigate the effects of CRIME/BEAST in older browsers? Related Links: https://issues.apache.org/bugzilla/show_bug.cgi?id=53219 https://threatpost.com/en_us/blogs/new-attack-uses-ssltls-information-leak-hijack-https-sessions-090512 http://security.stackexchange.com/questions/19911/crime-how-to-beat-the-beast-successor

    Read the article

  • RAR Command Line Maximum Compression

    - by Steve Robathan
    I am writing a batch file to compress a folder using various archiving applications. Currently I also use WinRAR (x64), but I set the parameters up manually; I would like to add rar to my batch file. The folder concerned has many subfolders and I need to take this into account. What is the command line for the following, keeping the folder structure: solid archive, archive format RAR5, compression method Best, dictionary size 1024 MB? Many thanks.
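
    Not an authoritative answer, but my reading of the rar.exe switches is: a = add, -r = recurse subfolders, -s = solid archive, -ma5 = RAR5 format, -m5 = best method, -md1024m = 1024 MB dictionary. A sketch wrapping that in Python only so it can sit inside a larger batch script (the paths are illustrative, and rar must be on the PATH):

        import subprocess

        archive = "backup.rar"            # hypothetical output archive
        source = r"C:\data\project"       # hypothetical folder to compress

        subprocess.run(
            ["rar", "a", "-r", "-s", "-ma5", "-m5", "-md1024m", archive, source],
            check=True,
        )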

    Read the article

  • VLC Dynamic Range compression multiple songs

    - by Sion
    In my collection of music I have some songs which seem to be compressed nicely. But in addition to those I have songs which are overly quiet compared to the louder, compressed songs. So maybe the problem isn't compression but average volume. Would the Dynamic Range Compressor in VLC work for this type of problem, or would I have better luck using external speakers and running the audio through a guitar compressor?

    Read the article

  • AAC 256kbit to MP3 320kbit conversion. I know it's lossy, but how?

    - by Fabian Zeindl
    Has anyone ever transcoded music from a high-quality AAC to MP3 (or vice versa)? The internet is full of people who say this should never be done, but apart from the theoretical standpoint that you can only lose information, does it matter in practice? Is the difference perceivable, except on studio equipment? Does the re-encoding actually lose much information? If, for example, high frequencies are chopped away by the initial compression, those frequencies aren't there any more, so this part of the compression algorithm won't touch the data during the second compression. Am I wrong?

    Read the article

  • How to highly compress files

    - by Alfred James
    I want to learn how we can highly compress a file. I have downloaded files whose compressed size was 4 GB and which expanded to 10 GB. So which software do I need to use for such high compression? Will I lose quality of the data through compression? If you direct me to WinRAR or 7-Zip, please tell me how I can highly compress a file, because I have tried compression with them but the size was reduced by only a few MB. I am compressing games, i.e. .exe files. Thanks

    Read the article

  • Why does a zip file appear larger than the source file especially when it is text?

    - by PeanutsMonkey
    I have a text file that is 19 bytes in size, and having compressed the file using zip and 7zip, it appears to be larger. I had a read of the questions "Why is a 7zipped file larger than the raw file?" as well as "Why doesn't ZIP Compression compress anything?", but considering the file is not already compressed I would have expected further compression. Attached is a screenshot. EDIT 0: I took the example further by creating a file that contained random data, as follows: dd if=/dev/urandom of=sample.log bs=1G count=1 and attempted to compress the file using both zip and 7zip, however there were no compression gains. Why is that?
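
    Both effects are easy to reproduce in a few lines of Python: a container's fixed header and metadata dwarf a 19-byte payload, and random data has no redundancy for DEFLATE to squeeze out:

        import gzip
        import os
        import zlib

        tiny = b"hello compression\n"              # an 18-byte text "file"
        random_blob = os.urandom(1024 * 1024)      # 1 MiB of incompressible data

        print(len(tiny), "->", len(gzip.compress(tiny)))                # grows: gzip adds ~18 bytes of framing
        print(len(random_blob), "->", len(zlib.compress(random_blob)))  # about the same size, or slightly larger

    A .zip archive adds even more per-file metadata (a local header plus a central directory entry, both repeating the file name), which is why the 19-byte text file grows noticeably rather than shrinking.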

    Read the article

  • Convert old videos to have smaller sizes

    - by Tim
    I have some videos from a few years ago, with various formats, such as avi, mpg, wmv, rm, rmvb, .... Their sizes are huge (more than 500 MB, and sometimes 1 GB). Given that there have likely been advances in data compression since then, I would like to know which file formats and compression methods are recommended these days, the goal being a big size reduction without obvious loss of quality. How can I perform the file format conversion and data compression in Ubuntu 12.04? Command-line and batch ways would be the most convenient, although GUI ways are also appreciated. Thanks and regards!

    Read the article

  • Oracle Database 11g R2 is now also supported under SAP

    - by Lajos Sárecz
    Since Easter, Oracle Database 11g R2 can also be used under SAP. It is well known that SAP only releases support for Release 2 versions, so this is genuinely welcome news for SAP users, because the following 11g R2 features can now be used in SAP environments: • Advanced Compression option (for tables, RMAN backups, expdp, the Data Guard network) • Real Application Testing • Oracle Database 11g Release 2 Database Vault • Oracle Database 11g Release 2 RAC • Advanced Encryption for tablespaces, RMAN backups, expdp, the Data Guard network • Direct NFS • Deferred Segments • Online Patching. This means, for example, that the SAP database, or backups made from it, can be compressed; experience so far shows a compression ratio of 2-4x, depending on the database. The risk of a database upgrade, and of any other change affecting the database infrastructure, can be significantly reduced by using Real Application Testing. Administrator roles can be separated with Database Vault. The new features of Real Application Clusters 11g R2 also become available. With Transparent Data Encryption, tablespaces and backups can be encrypted in a way that is transparent to the application, while the data cannot be read by accessing the media directly. The Direct NFS client is supported, which significantly improves NFS access speed. With Deferred Segments, table segments are only allocated when data is actually put into a table; this is useful because application installers typically create every table even though many tables never receive data, so both installation time and database size are reduced. Online Patching makes it possible to install patches without downtime. I think these are attractive capabilities, and it is worth planning the database upgrade under your SAP systems for the near future, since Premier Support for the 10g release expires this summer. For the upgrade I strongly recommend Real Application Testing, which lets you test the upgrade in a test environment under production-like load. Unfortunately, the Sun Oracle Database Machine and Exadata are not yet supported under SAP, because the ASM certification has not yet been completed; reportedly this is expected by early 2011.

    Read the article

  • 'Binary XML' for game data?

    - by bluescrn
    I'm working on a level editing tool that saves its data as XML. This is ideal during development, as it's painless to make small changes to the data format, and it works nicely with tree-like data. The downside, though, is that the XML files are rather bloated, mostly due to duplication of tag and attribute names. Also due to numeric data taking significantly more space than using native datatypes. A small level could easily end up as 1Mb+. I want to get these sizes down significantly, especially if the system is to be used for a game on the iPhone or other devices with relatively limited memory. The optimal solution, for memory and performance, would be to convert the XML to a binary level format. But I don't want to do this. I want to keep the format fairly flexible. XML makes it very easy to add new attributes to objects, and give them a default value if an old version of the data is loaded. So I want to keep with the hierarchy of nodes, with attributes as name-value pairs. But I need to store this in a more compact format - to remove the massive duplication of tag/attribute names. Maybe also to give attributes native types, so, for example floating-point data is stored as 4 bytes per float, not as a text string. Google/Wikipedia reveal that 'binary XML' is hardly a new problem - it's been solved a number of times already. Has anyone here got experience with any of the existing systems/standards? - are any ideal for games use - with a free, lightweight and cross-platform parser/loader library (C/C++) available? Or should I reinvent this wheel myself? Or am I better off forgetting the ideal, and just compressing my raw .xml data (it should pack well with zip-like compression), and just taking the memory/performance hit on-load?
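
    The "just compress the raw .xml" option at the end is cheap to evaluate before designing a binary format, since DEFLATE removes exactly the tag/attribute-name duplication being complained about. A minimal Python sketch (the level data is invented for illustration):

        import zlib

        level_xml = (
            b"<level>"
            + b"".join(
                b'<object type="crate" x="%d" y="%d" scale="1.0" visible="true"/>' % (i, i * 2)
                for i in range(2000)
            )
            + b"</level>"
        )

        packed = zlib.compress(level_xml, 9)
        print(f"raw XML: {len(level_xml)} bytes, deflated: {len(packed)} bytes")

    Note this only fixes the on-disk size; it does nothing for the parse time or the in-memory cost of text-encoded numbers, which a true binary format (or one of the existing binary-XML schemes) would also address.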

    Read the article

  • How does Google Web Starter Kit serve adaptive images for mobile?

    - by 5argon
    My website weirdly (in a good way) serves smaller images when viewed on mobile. I wanted to know what causes this. As far as I know this is not the default behaviour, so I think it must be Google Web Starter Kit's doing. Here is the debug information when debugging on a device: all images become 231 B in size, no matter how large they actually are (when debugging on the desktop the sizes vary). I tried using Google Web Starter Kit (https://github.com/google/web-starter-kit) recently. The tools in it are built on Ruby, Node.js, SASS and Gulp to help you 'build' a website. Before the build you get automatic reload, because the Gulp script watches all files for you; at build time it runs various tools to minify HTML and CSS and compress images. According to this page https://developers.google.com/web/fundamentals/tools/build/build_site, gulp-imagemin was used. So I guess imagemin is doing the mobile optimization for me? What kind of compression can serve an automatically resized image on mobile? And why is the size 231 B? Is this related to my screen size?

    Read the article

  • gzip compression

    - by SyAu
    I'm using the WebLogic application server and the Apache web server in my J2EE environment, and I'm planning to implement gzip compression of responses. I'm not sure whether to implement the compression on the Apache web server or in WebLogic.

    Read the article
