Search Results

Search found 1246 results on 50 pages for 'compression'.

Page 14 of 50

  • Does gunzip work in memory or does it write to disk?

    - by Ryan Detzel
    We have our log files gzipped to save space. Normally we keep them compressed and just run gunzip -c file.gz | grep 'test' to find important information, but we're wondering whether it would be quicker to keep the files uncompressed and run the grep directly: cat file | grep 'test'. There has been some discussion about how gzip works: if it reads the file into memory and unzips it there, the first approach would be faster; if it doesn't, the second would be. Does anyone know how gzip uncompresses data?

    Read the article
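
    For the question above, a minimal way to settle it empirically is to time both pipelines on the same log. A sketch assuming the shell's time keyword, a sample compressed log access.log.gz and an uncompressed copy access.log (both file names are placeholders):
      time gunzip -c access.log.gz | grep -c 'test'   # decompress on the fly: streams to the pipe, no temp file on disk
      time grep -c 'test' access.log                  # read the uncompressed copy directly
      zgrep -c 'test' access.log.gz                   # zgrep wraps the first pipeline for convenience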

  • Encoding Uncompressed video

    - by Sakamoto Kazuma
    I'd like to encode uncompressed video into compressed AVI or MPEG-4, and I'm wondering what program I should look at to do this. The videos are between 3 and 20 minutes long and range anywhere from 1.3 to 10 GB in uncompressed .avi form (captured with Fraps).

    Read the article
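
    One common command-line route (my suggestion, not something the poster mentions) is ffmpeg with the x264 encoder. A minimal sketch, assuming ffmpeg is installed and the input file is named capture.avi:
      ffmpeg -i capture.avi -c:v libx264 -crf 20 -preset medium -c:a aac -b:a 160k capture.mp4
      # -crf 20 trades size for quality (lower = better quality, larger file);
      # -preset medium balances encoding speed against compression efficiency.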

  • How can I compress a movie to a specific file size in Windows 7's Live Movie Maker?

    - by Nathan Fellman
    In previous versions of Windows Movie Maker I could take a raw video file and specify the file size to compress it to, and Movie Maker would compress it accordingly (with the appropriate loss in quality). Live Movie Maker, which comes with Windows 7, doesn't seem to have this option; I can only specify the requested quality. Is there any way to specify the size of the target file in Windows Live Movie Maker?

    Read the article
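
    Targeting a file size boils down to picking a bitrate: total bits divided by duration, minus the audio bitrate. A hedged sketch of that arithmetic and of a two-pass encode with ffmpeg as a swapped-in tool (not Live Movie Maker's own method); the 700 MB target, one-hour duration and file names are made-up examples, and on Windows /dev/null would be NUL:
      # 700 MB over 3600 s: 700*8192 kbit / 3600 s ~= 1593 kbit/s total; minus 128 kbit/s audio ~= 1465 kbit/s video
      ffmpeg -y -i raw.avi -c:v libx264 -b:v 1465k -pass 1 -an -f mp4 /dev/null
      ffmpeg    -i raw.avi -c:v libx264 -b:v 1465k -pass 2 -c:a aac -b:a 128k out.mp4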

  • Content not being compressed even though I'm using zlib in php.ini

    - by Tola Odejayi
    I've edited my php.ini file so that it has these two entries:
      zlib.output_compression = On
      zlib.output_compression_level = 4
    However, after restarting Apache, when I request PHP pages the headers returned in the response indicate that my server is still NOT serving compressed pages. Here are selected headers as viewed using Chrome's Network feature:
      Cache-Control: no-cache, must-revalidate, max-age=0
      Connection: Keep-Alive
      Content-Type: text/html; charset=UTF-8
      Date: Mon, 17 Sep 2012 23:46:13 GMT
      Expires: Wed, 11 Jan 1984 05:00:00 GMT
      Last-Modified: Mon, 17 Sep 2012 23:46:13 GMT
      Pragma: no-cache
      Proxy-Connection: Keep-Alive
      Server: Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 PHP/5.2.17
      Transfer-Encoding: chunked
      Via: 1.1 XXX-PRXY-07
      X-Powered-By: PHP/5.2.17
    What might I be doing wrong? Is there any other setting that I need to change?
    EDIT: Here is another set of headers returned to another computer:
      Cache-Control: no-cache, must-revalidate, max-age=0
      Connection: close
      Content-Type: text/html; charset=UTF-8
      Date: Thu, 20 Sep 2012 09:45:26 GMT
      Expires: Wed, 11 Jan 1984 05:00:00 GMT
      Last-Modified: Thu, 20 Sep 2012 09:45:26 GMT
      Pragma: no-cache
      Server: Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 PHP/5.2.17
      Transfer-Encoding: chunked
      Vary: Cookie
      X-Powered-By: PHP/5.2.17

    Read the article
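
    A quick way to check whether compression is being negotiated at all is to request the page with an explicit Accept-Encoding header and inspect the response, and to confirm the running PHP actually picked up the setting. A sketch assuming curl is available and using a placeholder URL:
      curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip,deflate' http://example.com/page.php | grep -i 'content-encoding'
      php -i | grep -i 'zlib.output_compression'
    Worth noting: the Via header in the first response suggests a proxy in the path, and an intermediary can also prevent or strip compression.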

  • Too big a difference between loud and quiet when watching DVDs... constantly have to adjust volume.

    - by Dan
    It is hard for me to find the right volume for watching DVDs on my computer: at most reasonable volumes the loudest parts of a movie become overwhelming, while the dialog in the quietest parts is hard to make out, so I'm constantly adjusting the volume during the course of a movie. Are there ways to make the difference between the loud and quiet parts less extreme? Both computer-related and non-computer-related solutions are welcome. Would moving my speakers across the room and increasing the volume help, or the opposite? Would the extremes be less noticeable if I used headphones? Are there movie players with more sophisticated sound-adjustment features? If there is a software solution out there for Linux, that would be great too. Thanks, Dan

    Read the article
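
    What the poster describes is usually addressed with dynamic range compression in the player. A hedged sketch for Linux, assuming MPlayer is installed; the filter choice and the DVD title number are just examples:
      mplayer -af volnorm dvd://1        # volume normalization filter evens out loud and quiet passages
      mplayer -a52drc 1 dvd://1          # for AC-3 soundtracks, apply the dynamic range compression hints in the stream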

  • unable to decompress a *.tar.xz file

    - by neubert
    Per http://askubuntu.com/a/107976 I tried
      tar xf php-5.6.0RC4.tar.xz
    and
      tar -xJf php-5.6.0RC4.tar.xz
    and in both cases I get the following:
      tar (child): xz: Cannot exec: No such file or directory
      tar (child): Error is not recoverable: exiting now
      tar: Child returned status 2
      tar: Error is not recoverable: exiting now
    Here's php-5.6.0RC4.tar.xz: http://downloads.php.net/tyrael/php-5.6.0RC4.tar.xz
    I'm running Ubuntu 14.04 LTS.

    Read the article
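
    The error says tar cannot find the xz binary itself, so the likely fix is simply to install it. A sketch for Ubuntu 14.04:
      sudo apt-get install xz-utils        # provides the xz binary, which tar invokes for .tar.xz archives
      tar -xJf php-5.6.0RC4.tar.xz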

  • tar: How to create a tar file with arbitrary leading directories w/o 'cd'ing to parent dir

    - by Yan
    Say I have a directory of files at /home/user1/dir1 and I want to create a tar archive with only "dir1" as the leading directory:
      /dir1/file1
      /dir1/file2
    I know I can first cd to the parent directory:
      cd /home/user1/
      tar czvf dir1.tar.gz dir1
    But when writing scripts, jumping from directory to directory isn't always favorable. Is there a way to do this with absolute paths, without changing the current directory? I know I can always create a tar file with absolute paths inside and use --strip-components when extracting, but sometimes the extra path components are private information that you don't want to distribute with your tar files. Thanks!

    Read the article
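
    GNU tar's -C option changes directory internally before adding the members that follow it, which avoids a cd in the script. A minimal sketch, with the output path chosen arbitrarily:
      tar czvf /tmp/dir1.tar.gz -C /home/user1 dir1    # archive contains dir1/file1, dir1/file2, ...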

  • Compressed disk image on Linux

    - by Aaron Digulla
    I just got my new computer with a much bigger hard disk. I think I copied all the important files over, but just to be sure I'd like to keep a disk image of my old disk. To save space I'd like to compress it, but I haven't found an option to mount a compressed image. My goals:
      The result must be easy to access
      No need to decompress the whole thing before I can access anything
      Files should be quick to locate, so no TAR/CPIO archive
      The necessary space should be less than just copying the files over
    So ideally, I'm looking for a read-only, compressed file system which I can create in a file and which grows automatically.

    Read the article
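
    One option that fits most of these goals is SquashFS: a read-only, compressed filesystem built into a single file that can be loop-mounted. A sketch assuming squashfs-tools is installed and the old disk is mounted at /mnt/olddisk (both assumptions):
      mksquashfs /mnt/olddisk olddisk.sqsh -comp xz     # build a compressed, read-only image
      sudo mkdir -p /mnt/image
      sudo mount -o loop -t squashfs olddisk.sqsh /mnt/image
      ls /mnt/image                                     # browse individual files without decompressing everything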

  • Apache2 and logrotate: delaycompress needed?

    - by j0nes
    Hello, I am currently looking at the size of my Apache logs, as they have become huge. In my logrotate configuration I have delaycompress enabled. Does Apache really need this (the logrotate documentation says that some programs may keep writing to the old file), or is it safe to disable delaycompress? Best regards, Jonas

    Read the article
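
    For context, delaycompress postpones compression by one rotation cycle so that a process which still has the old file open can keep writing to it until it is told to reopen its logs. A hedged sketch of a typical stanza; the log path and the graceful-restart command are assumptions, not taken from the poster's setup:
      /var/log/apache2/*.log {
          weekly
          rotate 12
          compress
          delaycompress          # Apache may still write to the old log until the reload below completes
          postrotate
              /usr/sbin/apachectl graceful > /dev/null
          endscript
      }
    With a prompt reload in postrotate the window is small, but many distributions keep delaycompress anyway as a cheap safety margin.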

  • Compressing and copying large files on Windows Server?

    - by Aaron
    I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without having to resort to a USB hard drive and the mail.
    The database server is running Windows Server 2003 R2 Enterprise with 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. The files are SQL Server 2005 backup files between 100 GB and 250 GB. The pipe is not the fastest, and SQL Server backup files typically compress down to 10-40% of the original size, so it made sense to me to compress the files first. I've tried a number of methods, including:
      gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin)
      bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin)
      WinRAR 3.90
      7-Zip 4.65 (7za.exe)
    I've attempted to use the WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups. I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either; no matter which tool I try, it ends up choking on the size of the file. Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option in my situation, unfortunately. For example, this is how 7-Zip dies:
      H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak
      7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03
      Scanning
      Creating archive h:\dbatmp\zip\db-20100419_1228.7z
      Compressing  db-20100419_1228.bak
      System error: Unspecified error

    Read the article
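
    If compression keeps failing at these sizes, another angle is to make the raw copy itself restartable rather than smaller. A hedged sketch using robocopy, which shipped in the Windows Server 2003 Resource Kit; the destination share and paths are made up:
      rem /Z copies in restartable mode, so an interrupted transfer resumes instead of starting over
      rem /R and /W control the retry count and the wait (in seconds) between retries
      robocopy H:\dbatmp \\testbox\backups db-20100419_1228.bak /Z /R:5 /W:10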

  • How to split a file on Windows 2003 using an MS-supported tool

    - by Rune
    Hi, is it possible to split a large file into smaller files on Windows 2003 using a tool provided, supported, or sanctioned by Microsoft? I see that there are a lot of freeware tools (various zip tools) for this task, but I need to move files off of a production server and would like to avoid tools I don't know whether I can trust. I would much prefer a tool included in the Windows Server 2003 Resource Kit Tools or something along those lines. Does such a tool exist? Thank you.

    Read the article

  • How can I evaluate the best choice of archive format for compressing files?

    - by Mehrdad
    In general, I've observed the following:
      Linux-y files or tools use bzip2 or gzip for distributing archives
      Windows-y files or tools use ZIP for distributing archives
      Many people use 7-Zip for creating and distributing their own archives
    Questions:
      What are the advantages and disadvantages of these formats, all of which appear to be open formats?
      When/why should I choose one (say, 7-Zip) over another (say, ZIP)?
      Why does the trend above appear to hold, even though all of these are portable formats?
      Are there any particular advantages to using a particular archive format on a particular platform?

    Read the article
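
    Abstract comparisons only go so far; an easy way to evaluate the trade-offs on your own data is to time each compressor on a representative file and compare the resulting sizes. A sketch assuming the command-line tools are installed and a sample input named sample.tar:
      time gzip  -9 -c sample.tar > sample.tar.gz
      time bzip2 -9 -c sample.tar > sample.tar.bz2
      time xz    -6 -c sample.tar > sample.tar.xz
      time zip   -9 -q sample.zip sample.tar
      time 7z a -mx=5 sample.7z sample.tar > /dev/null
      ls -l sample.*                     # compare sizes against the wall-clock times above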

  • CRC error when extracting to SSD from 2nd HDD

    - by gbn
    Hello. I have a large multi-part RAR file containing an ISO on my second HDD. When I extract it to the same HDD, everything is OK; when I extract it to the system/OS SSD, I get CRC errors. I've checked memory (ran memtests), checked the cables, etc. I have no other issues, only with this one RAR file. Any ideas, please?

    Read the article
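
    One way to narrow it down is to verify the archive's own CRCs first and then compare checksums of the ISO extracted to each destination. A sketch assuming the command-line unrar tool and a Windows version whose certutil supports -hashfile (on Linux, md5sum would play the same role); all file names and paths are placeholders:
      unrar t image.part01.rar
      rem the test above checks the archive's CRCs without extracting anything
      certutil -hashfile D:\extracted\image.iso MD5
      certutil -hashfile C:\extracted\image.iso MD5
      rem if the two hashes differ, the problem is in the write path to the SSD, not in the archive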

  • Multithreaded TIFF to PNG conversion

    - by mtone
    I'm working with images of scanned books, so hundreds of high-resolution pictures. I'm doing conversion work with Photoshop Elements: I can quickly save them as uncompressed TIFF, but converting to compressed PNG using a single thread takes ages. Do you know of a piece of software, ideally simple and free, that would batch-convert those TIFFs to PNG in a multi-threaded manner (4 to 8 simultaneous files) to take advantage of those cores and cut conversion times? I'm not too worried about slight variations in final size.

    Read the article
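
    One free route (my suggestion, not the poster's) is ImageMagick driven by xargs, which runs several conversions at once. A sketch assuming ImageMagick and GNU xargs are available and the TIFFs sit in the current directory:
      find . -maxdepth 1 -name '*.tif' -print0 | xargs -0 -n1 -P8 mogrify -format png
      # -P8 runs eight conversions in parallel; mogrify -format png writes a .png next to each .tif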

  • gunzip unexpected end of file

    - by Mark J Seger
    I like to write high-volume data directly to a gzip file via Perl. Works like a champ! However, on some rare occasions I've found that gunzip can't uncompress the result, declaring an unexpected end-of-file. The thing is, I can uncompress it with a simple Perl script, which presumably just skips the corrupted record(s) at the end. My question is: does anyone know of a way to use a standard utility to do the same thing? Maybe with an 'uncompress-as-much-as-you-can' switch? -mark

    Read the article
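
    In practice gzip's own tools already behave this way when writing to stdout: they emit everything they can decode before the truncation and only then exit with an error. A sketch assuming a damaged file named damaged.gz:
      gzip -dc damaged.gz > recovered.txt    # writes all recoverable data, then complains about the unexpected EOF
      echo "gzip exit status: $?"            # non-zero, but recovered.txt still holds the decoded records
      wc -l recovered.txt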

  • Minimize HTML, CSS and JS files

    - by karmic
    How do you automatically pack/minimize the HTML, CSS and JS files served on a web page? More specifically, I want this for a WordPress website. Should it be done at the web server level (lighttpd), at the application level (WordPress), at the PHP level, or somewhere else?

    Read the article
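
    Minification usually happens as a build or deploy step, separate from any on-the-fly gzip the server adds. A hedged sketch using two common command-line minifiers (UglifyJS and clean-css, both Node.js packages that would have to be installed; the file names are placeholders):
      uglifyjs script.js -c -m -o script.min.js     # -c compresses, -m mangles local variable names
      cleancss -o style.min.css style.css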

  • Three formats? Why?

    - by Yar
    I needed to download the Ruby source recently from here, and it says it is "available in three formats": .tar.bz2, .tar.gz and .zip. Is there any reason we need all three formats? On Linux and OS X, at least, I can handle any of the three easily; on Windows, I think only zip is built in. Is there anything behind these preferences, or is this just a religious battle?

    Read the article

  • Looking for a Unix tool/script that, given an input path, will compress each batch of uncompressed text files (~100MB per batch) into a single gzip file

    - by newToFlume
    I have a dump of thousands of small text files (1-5 MB each), every one containing lines of text. I need to "batch" them up so that each batch is of a roughly fixed size, say 100 MB, and compress each batch. A batch could be either:
      A single file that is just a 'cat' of the contents of the individual text files, or
      Just the individual text files themselves
    Caveats:
      unix split -b will not work here, as I need to keep lines of text intact. Using the lines option is a bit complicated, as there is a large variance in the number of bytes per line.
      The batches need not be a strictly fixed size, as long as they are within 5% of the requested size.
      The lines are critical and should not be lost: I need to confirm that the input made its way to the output without loss, using some kind of checksum (something like CRC32, but stronger in the face of collisions).
    A script should do nicely, but this seems like a task someone has done before, and it would be nice to see some code (preferably Python or Ruby) that does at least something similar.

    Read the article
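
    A minimal shell sketch of the "concatenate whole files until the batch is full" variant. The input/ directory, batch naming and SHA-256 verification are my assumptions, and the poster asked for Python or Ruby, so treat this as the shape of the logic rather than the requested deliverable:
      limit=$((100 * 1024 * 1024))               # ~100 MB per batch
      n=1; size=0
      out=$(printf 'batch_%03d.txt' "$n"); : > "$out"
      for f in input/*.txt; do
          fsize=$(stat -c%s "$f")
          if (( size > 0 && size + fsize > limit )); then
              gzip "$out"
              n=$((n + 1)); size=0
              out=$(printf 'batch_%03d.txt' "$n"); : > "$out"
          fi
          cat "$f" >> "$out"                     # whole files are appended, so no line is ever split
          size=$((size + fsize))
      done
      gzip "$out"
      cat input/*.txt | sha256sum                # the two digests should match:
      zcat batch_*.txt.gz | sha256sum            # nothing was lost or reordered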

  • How could I compress a folder into split archives (individual ZIPs)?

    - by Shiki
    I have to compress folders into ZIP packages, but the size is limited: only about 10-15 MB is allowed per package. Every major application comes with a "Split archive to..." option, which does what I want... except that I can't uncompress the parts one by one (you need them all, and then you use the .7z, .rar or .zip file to uncompress). Here is an example: FolderX is 35 MB, which makes 4 packages, i.e. 4 zip files. The normal split function would give me folderx.zip, folderx.zip.001, folderx.zip.002, folderx.zip.003. What I really need is folderx_1.zip, folderx_2.zip, folderx_3.zip, folderx_4.zip, i.e. packages that can each be uncompressed on their own. I could write an app to do this, but that is a waste of time if such a utility already exists.

    Read the article
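
    If no ready-made tool turns up, a rough first-fit loop gets close: keep adding files to the current ZIP until it reaches the limit, then start a new one. A sketch assuming Info-ZIP's zip on a Unix-like shell; note an archive may end up slightly over the limit, since the size check runs after each add:
      limit=$((10 * 1024 * 1024))
      n=1
      find FolderX -type f | while read -r f; do
          zip -q "folderx_$n.zip" "$f"           # each archive is a complete, independently extractable ZIP
          if (( $(stat -c%s "folderx_$n.zip") >= limit )); then
              n=$((n + 1))
          fi
      done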

  • What's faster, cp -R or unpacking tar.gz files?

    - by Buttle Butkus
    I have some tar.gz files that total many gigabytes on a CentOS system. Most of the tar.gz files are actually pretty small, but the ones with images are large: one is 7.7 GB, another is about 4 GB, and a couple are around 1 GB. I have already unpacked the files once and now I want a second copy of them all. I assumed that copying the unpacked files would be faster than re-unpacking them, but I started running cp -R about 10 minutes ago and so far less than 500 MB has been copied. I feel certain that the unpacking process was faster. Am I right? And if so, why? It doesn't seem to make sense that unpacking would be faster than simply duplicating the existing structures.

    Read the article
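
    The difference is easy to measure directly, and a tar pipe is a common way to copy a large tree as one sequential stream. A sketch with placeholder paths (time and the tar-to-tar pipe are standard; the directory names are assumptions):
      mkdir -p /data/copy_from_archive /data/copy_from_files /data/copy_via_pipe
      time tar -xzf images.tar.gz -C /data/copy_from_archive                    # re-extract: one sequential read of the .gz
      time cp -R /data/unpacked /data/copy_from_files                           # plain copy: many scattered reads and writes
      time tar -C /data/unpacked -cf - . | tar -C /data/copy_via_pipe -xf -     # copy the tree as a single stream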

  • Is there a tool for verifying the contents of a Zip archive against the source directory's contents?

    - by Basil
    Here's the scenario: I create a ZIP archive using some GUI package like WinZip, 7-Zip or whatever by right-clicking on a directory "somename" and selecting "Compress to archive 'somename.zip'". When the archive is complete, I open it and discover that some files are missing from it (for reasons as yet unknown). I want to find all the files that are missing from the archive without having to extract it to another directory and then run a directory diff, etc. So: is there a tool (GUI or command-line, standalone or built into a compressor, for Windows or Linux, I don't care) that can walk through an archive and compare its contents against a directory on the filesystem?

    Read the article
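
    Lacking a dedicated tool, comparing name listings is usually enough. A sketch assuming Info-ZIP's unzip on a Unix shell (its -Z1 mode prints one archive member per line) and assuming the archive stores paths relative to the parent of somename, which is what a right-click compress typically produces:
      unzip -Z1 somename.zip | grep -v '/$' | sort > in_zip.txt     # member names, directory entries filtered out
      find somename -type f | sort > on_disk.txt
      diff on_disk.txt in_zip.txt                                   # lines starting with '<' exist only on disk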

  • Best archive format & tool for large amounts of data (50 GB+)

    - by marcusstarnes
    I only realised this afternoon that the ZIP format has a limit of what appears to be around 20 GB. I am trying to automate an archive process (using Automate) to zip/rar/whatever a collection of folders and files on one of my disks. It always appeared to bomb out with an incomplete archive at about 20 GB, so I tried using WinRAR and doing it manually as a ZIP file, but it told me about the limit. So I was wondering: what is a recommended archive format (and tool for accomplishing the task) for archiving a large amount of data (around 50 GB)? Thanks

    Read the article
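
    For archives in the 50 GB range, 7z handles large inputs and can split the result into fixed-size volumes; modern ZIP tools with Zip64 support also lift the classic size limits. A hedged sketch with 7-Zip's command-line tool; the folder path and volume size are examples:
      rem -v4g splits the archive into 4 GB volumes; -mx=5 is the default compression level
      7z a -t7z -v4g -mx=5 backup.7z D:\data\folder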
