Search Results

Search found 802 results on 33 pages for 'compressed sensing'.

Page 2/33

  • How to mount a compressed ISO image?

    - by dma_k
    I have a problem mounting a compressed (ISZ) image under Linux, created by e.g. UltraISO. I am aware of the user-space fuseiso, but it fails to mount these images, as I have reported in the Debian bug tracker (correct me if I did something wrong). I ask the community for help: I need a proven way to mount these images without decompressing them. I believe the CONFIG_ZISOFS kernel option cannot help, as it refers to a special Rock Ridge extension (per-file compression with mkisofs -z or mkzftree).

    Read the article

  • How to inline compressed CSS in Rails with assets pipeline

    - by haimg
    I'm trying to inline CSS into my layout. I'm currently using = Rails.application.assets.find_asset('embedded.css').body.html_safe However, the CSS returned is not compressed. I verified that the .digest_path asset file exists and is properly compressed. I can, of course, write a helper that checks whether a current on-disk compressed asset file exists for a given asset, and use it. However, I think find_asset actually compiles a CSS asset each time it is called -- not good in production. I hope a cleaner solution exists for this issue.

    Read the article

  • gunzip: invalid compressed data--format violated

    - by Arunjith
    Problem definition: I transferred a tar.gz file from a Linux machine to a Windows partition. The Windows partition is mounted on the Linux server as CIFS. OS: Red Hat Enterprise Linux Server release 5. Symptom: after the copy completes successfully, an integrity check with gunzip -t produces the following error: gunzip -t Backup-28--Jun--2011--Tuesday.tar.gz gunzip: Backup-28--Jun--2011--Tuesday.tar.gz: invalid compressed data--format violated Trying to untar (tar -xvzf) fails as well.
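
    Since the copy crosses a CIFS mount, a sensible first step is to rule out transfer corruption by comparing checksums of the source file and the copy. A minimal Python sketch (both paths are hypothetical):

        import hashlib

        def md5sum(path, chunk_size=1 << 20):
            """Hash the file in 1 MB chunks so multi-GB archives fit in memory."""
            h = hashlib.md5()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    h.update(chunk)
            return h.hexdigest()

        src = md5sum("/backups/Backup-28--Jun--2011--Tuesday.tar.gz")   # original on the Linux box
        dst = md5sum("/mnt/cifs/Backup-28--Jun--2011--Tuesday.tar.gz")  # copy on the CIFS mount
        print("checksums match" if src == dst else "the transfer corrupted the file")

    If the checksums differ, the archive itself is fine and the CIFS transfer is the culprit.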

    Read the article

  • Read and write directly from and to compressed files in C

    - by victor
    Hi, in Java I think it is possible to cruise through jar files as if they were not compressed. Is there something similar (and portable) in C/C++? I would like to read binary data into memory from a large (zipped or similar) file without decompressing it to disk first, and afterwards write it back to disk in compressed form. Maybe some trick with shell pipes and the zip utility?
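
    The jar-like behavior described here is a property of the zip format: each member is compressed independently and indexed by a central directory, so one member can be read into memory without touching the rest. In C this is commonly done with a library such as libzip; as a reference point for the intended behavior, a minimal Python sketch (archive and member names are hypothetical):

        import zipfile

        # Read one member straight into memory; no temp files on disk.
        with zipfile.ZipFile("archive.zip") as zf:
            data = zf.read("data.bin")

        # ... work with `data` in memory ...

        # Write the result back as a new compressed member.
        with zipfile.ZipFile("archive.zip", "a", compression=zipfile.ZIP_DEFLATED) as zf:
            zf.writestr("data_out.bin", data)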

    Read the article

  • Very basic question about Hadoop and compressed input files

    - by Luis Sisamon
    I have started to look into Hadoop. If my understanding is right, I could process a very big file and it would get split over different nodes. However, if the file is compressed, it cannot be split and has to be processed by a single node (effectively destroying the advantage of running MapReduce over a cluster of parallel machines). My question is, assuming the above is correct: is it possible to split a large file manually into fixed-size or daily chunks, compress them, and then pass the list of compressed input files to a MapReduce job?
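
    The manual pre-splitting the question describes can be sketched as below. Splitting on line boundaries keeps records intact, and each chunk becomes its own small gzip input that one mapper processes whole. File names and the chunk size are assumptions:

        import gzip

        CHUNK_LINES = 1_000_000  # assumption: size chunks so each is roughly one block

        def split_and_compress(src_path, prefix):
            part, out, count = 0, None, 0
            with open(src_path, "rb") as src:
                for line in src:
                    if out is None:
                        out = gzip.open(f"{prefix}.part{part:04d}.gz", "wb")
                    out.write(line)
                    count += 1
                    if count >= CHUNK_LINES:
                        out.close()
                        out, count, part = None, 0, part + 1
            if out is not None:
                out.close()

        split_and_compress("bigfile.log", "bigfile")  # hypothetical names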

    Read the article

  • MySQL: How to know if an entry is compressed or not

    - by Guy
    I'm working with Python and MySQL and I want to verify that a certain entry is compressed in the DB, i.e.: cur = db.getCursor() cur.execute('''select compressed_column from table where id=12345''') res = cur.fetchall() At this point I would like to verify that the entry is compressed (i.e. that in order to work with the data you would have to use select uncompress(compressed_column)). Ideas?
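
    One client-side check, assuming the column was written with MySQL's COMPRESS(): that function prepends the uncompressed length as four little-endian bytes, followed by a zlib stream, so attempting to inflate the remainder tells you whether the value is compressed. A minimal sketch:

        import zlib

        def looks_compressed(blob):
            """Heuristic for MySQL COMPRESS() output: a 4-byte length
            header followed by a zlib stream that inflates cleanly."""
            if blob is None or len(blob) <= 4:
                return False
            try:
                zlib.decompress(blob[4:])
                return True
            except zlib.error:
                return False

        # With the cursor from the question:
        # cur.execute('''select compressed_column from table where id=12345''')
        # print(looks_compressed(cur.fetchall()[0][0]))

    Alternatively, selecting uncompress(compressed_column) and checking for NULL performs the equivalent test on the server side.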

    Read the article

  • Identify ENCRYPTED compressed files at the command line

    - by viking
    I have directories with hundreds of RAR files. Currently I use PowerShell 2.0 with a script that uses WinRAR's RAR utility to decompress the files. The issue is that a small number of the files are encrypted, which pauses the script and requires interaction. Is there any way to do one of the following: Identify the encrypted files before trying to decompress; entirely ignore the encrypted files; or automate an incorrect (or correct) password that will attempt to open the file but just skip it if incorrect. NOTE: Some of the compressed files encrypt just the file contents, whereas others encrypt both the file names and the contents. Relevant code: $files = Get-ChildItem foreach($file in $files) { if($file.Attributes -eq "Archive") { $folder = $file.basename rar x $file $folder\ -y } }
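
    One way to pre-screen, assuming the rar binary on PATH behaves like WinRAR's: the -p- switch tells rar never to prompt for a password, so a test pass fails fast on encrypted archives instead of pausing the script. A sketch in Python (a heuristic only, since other corruption also fails the test):

        import subprocess
        from pathlib import Path

        def probably_encrypted(rar_path):
            # "rar t" tests the archive; "-p-" suppresses the password prompt,
            # so encrypted files make the test exit nonzero instead of blocking.
            result = subprocess.run(["rar", "t", "-p-", str(rar_path)],
                                    capture_output=True)
            return result.returncode != 0

        for rar in Path(".").glob("*.rar"):
            if probably_encrypted(rar):
                print(f"skipping (probably encrypted): {rar}")
            else:
                subprocess.run(["rar", "x", "-y", str(rar), f"{rar.stem}/"])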

    Read the article

  • How to disable Apache http compression (mod_deflate) when SSL stream is compressed

    - by Mohammad Ali
    I found that Google Chrome supports SSL compression, and Firefox should support it soon. I'm trying to configure Apache to disable HTTP compression when SSL compression is used, to prevent CPU overhead, with the configuration option: SetEnvIf SSL_COMPRESS_METHOD DEFLATE no-gzip While the custom log (using %{SSL_COMPRESS_METHOD}x) shows that the SSL-layer compression method is DEFLATE, the above option did not work and the HTTP response content is still being compressed by Apache. I had to use the option: BrowserMatchNoCase ".Chrome." no-gzip I would prefer a more general method, in case other browsers add SSL compression or someone runs a version of Chrome that does not support it.

    Read the article

  • Compressed filesystem inside a file in Linux [migrated]

    - by Doc
    I have a flash drive which is FAT32-formatted. I want to put a Linux filesystem on the drive inside a file. I know I can do this by creating a file, formatting it with ext3 (or any other filesystem) and then mounting it with the -o loop option. What I would like is for the above filesystem to be compressed: essentially something like a read-write squashfs. Is there something that exists that I can use? Bonus points if the file can be stored sparse, i.e. it resizes as data is written or deleted.

    Read the article

  • Serving Compressed Files Amazon vs Lightty

    - by tike
    We are currently using Amazon CloudFront to serve CSS, and according to Amazon itself, CloudFront can serve both compressed and uncompressed files from an origin server. But when I check compression, the origin server looks fine while the same check against the CloudFront link reports the file as uncompressed. e.g. http://www.port80software.com/tools/compresscheck.asp?url=http%3A%2F%2Fimgsrv.mydomain.com%2Fen-UK%2Fsomething.css results in Compression status: (gzip) while with CloudFront http://www.port80software.com/tools/compresscheck.asp?url=http%3A%2F%2hereisit.cloudfront.net%2F%2Fsomething.css gives Compression status: Uncompressed The origin server is running lighttpd with mod_deflate, and the allowed config is: deflate.allowed_encodings = ("bzip2", "gzip", "deflate") [I would think the extra allowed encodings would not matter as such.] Here I am clueless as to what the real issue is.

    Read the article

  • show differences between file and file in (compressed) tar archive

    - by Kyss Tao
    Say I have unpacked a gz-compressed tar file and do not remember what changes I made to the unpacked files, or I archived a folder a while ago and want to know what has changed in the files since. I can use tar -zd to get an overview. Then, say it shows me that file foo has changed. How can I see the changes in this file, i.e. the difference between the file on my filesystem and the (older) file in the archive (ideally in vimdiff, but diff output would be fine too)?
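
    GNU tar can print a single member to stdout with -O (--to-stdout), which pipes straight into diff; the same idea in Python, using tarfile and difflib (archive and member names are hypothetical):

        import difflib
        import tarfile

        archive, member = "backup.tar.gz", "foo"  # hypothetical names

        # The (older) version stored in the archive.
        with tarfile.open(archive, "r:gz") as tar:
            old = tar.extractfile(member).read().decode().splitlines(keepends=True)

        # The current version on disk.
        with open(member) as f:
            new = f.read().splitlines(keepends=True)

        print("".join(difflib.unified_diff(
            old, new, fromfile=f"{archive}:{member}", tofile=member)))

    For vimdiff, extracting the member to a temporary file and opening both is the simplest route.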

    Read the article

  • Improve efficiency when using parallel to read from compressed stream

    - by Yoga
    This is another question extending the previous one [1]. I have a compressed file and stream it into a Python program, e.g. bzcat data.bz2 | parallel --no-notice -j16 --pipe python parse.py > result.txt parse.py reads from stdin continuously and prints to stdout. My EC2 instance has 16 cores, but top shows a load average of only 3 to 4. In ps I see a lot of entries like: sh -c 'dd bs=1 count=1 of=/tmp/7D_YxccfY7.chr 2>/dev/null'; I know I could improve performance with -a in.txt, but in my case I am streaming from bz2 (I cannot extract it since I don't have enough disk space). How can I improve efficiency in this setup? [1] Gnu parallel not utilizing all the CPU
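
    With --pipe, GNU parallel chops the stream itself, and its default --block size of 1 MB can leave workers starved; raising it (e.g. --block 10M) is one common tweak. Another option is to drop parallel entirely and let Python fan the work out, since bz2.open decompresses incrementally without extracting anything to disk. A minimal sketch with hypothetical names, where parse stands in for the per-line logic of parse.py:

        import bz2
        import multiprocessing

        def parse(line):
            # Stand-in for whatever parse.py does per line (hypothetical).
            return line.strip().upper()

        if __name__ == "__main__":
            with bz2.open("data.bz2", "rt") as src, \
                 multiprocessing.Pool(16) as pool, \
                 open("result.txt", "w") as out:
                # imap with a large chunksize keeps all 16 workers fed.
                for result in pool.imap(parse, src, chunksize=10_000):
                    out.write(result + "\n")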

    Read the article

  • BOOTMGR is compressed

    - by JavaAndCSharp
    So I've just finished messing up my computer beyond repair. I originally had a Windows 7 install on my 64GB SSD and all my data on my 1.5TB HDD. Then I decided to install Windows 8 Consumer Preview. I tried to clone my 64GB SSD onto my HDD, but that didn't work at first; I later learned the attempt wiped all the data off the HDD. Thank goodness for the cloud. Once I realized the data was gone, I retried the cloning and finished it. On reboot I got a cryptic error message: BOOTMGR is compressed. Press Ctrl+Alt+Del to restart. So I did. Several times. I followed the instructions to rebuild the boot manager from many websites, including HowToGeek, smallvoid.com, and more. None worked. BTW, I've lost my Windows 7 Professional DVDs. All I have is my Windows 7 Home Premium DVD, the one that came with my laptop. How can I get back to a working computer? I'm willing to wipe anything, as my data is all in the cloud.

    Read the article

  • Copying compressed files from Server 2008 R2 network share to XP client via VPN fails

    - by Dejan Janjuševic
    At first sight this question looks similar to this one. I have experienced odd behavior while trying to copy a certain file from a Windows Server 2008 R2 network share to a Windows XP Professional client via VPN. The VPN was set up using RRAS on the server machine. I will try to provide as much information as possible to make the issue clear. When trying to copy the compressed file, sized ~2.5 MB (via Explorer or CMD, it doesn't matter), the process stalls at about 20% and produces an error message after a few seconds: Cannot copy filename: The specified network name is no longer available. If I run ping -t 192.168.2.1 (the server's IP address) side by side with the copy command, I can clearly see that ping times out for a few seconds as the copy process stalls. When this happens, all network activity is frozen. After a few seconds the network recovers and ping continues normally, but the copy process stands still until it displays the above error message. Copying other files (I tried 4-5 files), some larger and some smaller, succeeds. It seems I can copy all uncompressed files; as soon as I try to copy an archive, the process freezes. Even a 707 KB archive can't be copied. I can only reproduce this behavior on 2 machines, both Windows XP Professional, one with SP2 and the other with SP3. Other XP clients don't have this problem, and neither do Windows 7 clients. If I connect to the server using Remote Desktop Connection without the VPN from either of these 2 machines (using the same user account), I can copy anything I want normally, even these "problematic" files. Does anyone have any clue about what could possibly be going on?

    Read the article

  • Content not being compressed even though I'm using zlib in php.ini

    - by Tola Odejayi
    I've edited my php.ini file so that it has these two entries: zlib.output_compression = On zlib.output_compression_level = 4 However, after restarting apache, when I request php pages, the headers returned in the response indicate that my server is still NOT serving compressed pages (here are selected headers as viewed using Chrome's Network feature): Cache-Control:no-cache, must-revalidate, max-age=0 Connection:Keep-Alive Content-Type:text/html; charset=UTF-8 Date:Mon, 17 Sep 2012 23:46:13 GMT Expires:Wed, 11 Jan 1984 05:00:00 GMT Last-Modified:Mon, 17 Sep 2012 23:46:13 GMT Pragma:no-cache Proxy-Connection:Keep-Alive Server:Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 PHP/5.2.17 Transfer-Encoding:chunked Via:1.1 XXX-PRXY-07 X-Powered-By:PHP/5.2.17 What might I be doing wrong? Is there any other setting that I need to change? EDIT Here is another set of headers returned to another computer: Cache-Control:no-cache, must-revalidate, max-age=0 Connection:close Content-Type:text/html; charset=UTF-8 Date:Thu, 20 Sep 2012 09:45:26 GMT Expires:Wed, 11 Jan 1984 05:00:00 GMT Last-Modified:Thu, 20 Sep 2012 09:45:26 GMT Pragma:no-cache Server:Apache/2.2.21 (Unix) mod_ssl/2.2.21 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 PHP/5.2.17 Transfer-Encoding:chunked Vary:Cookie X-Powered-By:PHP/5.2.17
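
    The server only compresses when the client advertises support, and the Via: header above suggests a proxy in the path that may also strip compression, so it is worth checking the response directly with an explicit Accept-Encoding header. A minimal sketch (URL hypothetical):

        import urllib.request

        req = urllib.request.Request(
            "http://example.com/page.php",  # hypothetical URL
            headers={"Accept-Encoding": "gzip, deflate"},
        )
        with urllib.request.urlopen(req) as resp:
            # Expect "gzip" here once zlib.output_compression takes effect.
            print("Content-Encoding:", resp.headers.get("Content-Encoding"))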

    Read the article

  • l2tp server always logs 'sent [CCP ResetReq id=0x3]' when it receives a compressed data request

    - by wilbur
    I have built an xl2tpd/ipsec server on my Ubuntu 12.04.3 box, and I managed to make an L2TP VPN connection to the xl2tpd server from my Android phone. The xl2tpd log said xl2tpd[10828]: Enabling IPsec SAref processing for L2TP transport mode SAs xl2tpd[10828]: IPsec SAref does not work with L2TP kernel mode yet, enabling forceuserspace=yes xl2tpd[10828]: setsockopt recvref[22]: Protocol not available xl2tpd[10828]: This binary does not support kernel L2TP. xl2tpd[10828]: xl2tpd version xl2tpd-1.2.8 started on atime.me PID:10828 xl2tpd[10828]: Written by Mark Spencer, Copyright (C) 1998, Adtran, Inc. xl2tpd[10828]: Forked by Scott Balmos and David Stipp, (C) 2001 xl2tpd[10828]: Inherited by Jeff McAdams, (C) 2002 xl2tpd[10828]: Forked again by Xelerance (www.xelerance.com) (C) 2006 xl2tpd[10828]: Listening on IP address 0.0.0.0, port 1701 xl2tpd[10828]: control_finish: Peer requested tunnel 39154 twice, ignoring second one. xl2tpd[10828]: Connection established to 117.136.8.59, 43149. Local: 25339, Remote: 39154 (ref=0/0). LNS session is 'default' However, I cannot access the web in my browser. The pppd log said rcvd [Compressed data] 00 1d 82 c4 7c 04 d8 09 ... sent [CCP ResetReq id=0x7] I have googled a lot and found that this is mostly caused by an MPPE decompression error. I have disabled BSD-Compress compression with nobsdcomp in /etc/ppp/xl2tpd-options, but it did not work. I used openswan-2.6.33 and xl2tpd-1.2.8, both built from source. My configurations: /etc/ipsec.conf version 2.0 config setup nat_traversal=yes virtual_private=%v4:10.0.0.0/8,%v4:192.168.0.0/16,%v4:172.16.0.0/12 oe=off protostack=netkey conn L2TP-PSK-NAT rightsubnet=vhost:%priv also=L2TP-PSK-noNAT conn L2TP-PSK-noNAT authby=secret pfs=no auto=add keyingtries=3 rekey=no ikelifetime=8h keylife=1h type=transport left=106.186.121.214 leftprotoport=17/1701 right=%any rightprotoport=17/%any /etc/xl2tpd/xl2tpd.conf [global] ipsec saref = yes [lns default] local ip = 10.10.11.1 ip range = 10.10.11.2-10.10.11.245 refuse chap = yes refuse pap = yes require authentication = yes ppp debug = yes pppoptfile = /etc/ppp/xl2tpd-options length bit = yes /etc/ppp/xl2tpd-options require-mschap-v2 ms-dns 8.8.8.8 ms-dns 8.8.4.4 asyncmap 0 auth crtscts lock hide-password modem name l2tpd proxyarp lcp-echo-interval 30 lcp-echo-failure 4 debug nobsdcomp Any suggestions? Thanks in advance.

    Read the article

  • Reading from compressed lucene index

    - by Akhil
    I created a Lucene index and compressed the index directory with bz2 or zip. I do not want to uncompress it. Is there any API call that can read the index from this zipped directory and thus allow searching and other functionality? That is, can Lucene's IndexReader read the index from a compressed file? I saw that Lucene's IndexReader does not accept a "Reader" to open the index, otherwise I would have created a Reader class that uncompresses the file and streams the uncompressed version. Any alternatives to this are welcome. Thanks, Akhil

    Read the article

  • compressed archive with quick access to individual file

    - by eric.frederich
    I need to come up with a file format for a new application I am writing. This file will need to hold a bunch of other files, which are mostly text but can be other formats as well. Naturally, a compressed tar file seems to fit the bill. The problem is that I want to be able to retrieve some data from the file very quickly, and getting just one particular file out of a tar.gz seems to take longer than it should. I am assuming this is because it has to decompress the entire file even though I just want one member. When I have a regular uncompressed tar file I can get that data really quickly. Let's say the file I need quickly is called data.dat. For example the command... tar -x data.dat -zf myfile.tar.gz ... is what takes a lot longer than I'd like. MP3 files have id3 data and jpeg files have exif data that can be read quickly without opening the entire file. I would like my data.dat file to be available in a similar way. I was thinking that I could leave it uncompressed and separate from the rest of the files in myfile.tar.gz; I could then create a tar file of data.dat and myfile.tar.gz, and hopefully that data could be retrieved faster because it is at the head of the outer tar file and is uncompressed. Does this sound right... putting a compressed tar inside of a tar file? Basically, my need is an archive-type file with quick access to one particular member. Tar does this just fine, but I'd also like the data compressed, and as soon as I do that, I no longer have quick access. Are there other archive formats that will give me the quick access I need? As a side note, this application will be written in Python. If the solution calls for reinventing the wheel with my own binary format, I am familiar with C and would have no problem writing the Python module in C. Ideally I'd just use tar, dd, cat, gzip, etc. though. Thanks, ~Eric
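
    This is essentially the difference between tar.gz and zip: a tar.gz is one gzip stream over the whole tar, so reaching one member means inflating everything before it, while zip compresses each member independently and keeps a central directory for seeking. Since the application is in Python, a zipfile sketch of both sides (member names from the question, the second one hypothetical):

        import zipfile

        # Write: each member is deflated on its own and indexed
        # in the zip central directory.
        with zipfile.ZipFile("myfile.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
            zf.write("data.dat")
            zf.write("other.txt")  # hypothetical second member

        # Read just data.dat: zipfile seeks straight to that member,
        # with no full-archive decompression.
        with zipfile.ZipFile("myfile.zip") as zf:
            data = zf.read("data.dat")

    That gives compression plus the id3/exif-style quick access without inventing a format or nesting tars.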

    Read the article

  • Quicktime Audio Extraction for Compressed Movies

    - by Noorul
    Hi all, I am trying to extract audio from a QuickTime movie. I followed the steps in http://developer.apple.com/quicktime/audioextraction.html and it works fine. But when I try to extract from a movie that is compressed (audio compressed as AAC), it gives a first-chance exception; the call stack shows CoreAudioToolbox.dll. If I continue, it renders the audio without any issues. On Mac this works without any issues. Is this really anything to worry about? I am a QT beginner. Please help me. My QT version is 7.6.7 (1675) and I am using Windows 7.

    Read the article

  • How to remove password protection from compressed files

    - by Mehper C. Palavuzlar
    This has been a problem for me for a long time. Let's see if any SuperUser can solve this: I have a directory containing lots of password-protected .RAR files whose passwords I know. I want to remove the password protection from all of them without extracting the contents. Since each file is larger than 1 GB, decompressing and then recompressing without password encryption is not a good option for me. How can I easily do that? I'm using WinRAR 3.80 on Win7. Any other 3rd-party tools are welcome.

    Read the article

  • Combine two or more compressed files

    - by shantanuo
    I have two gz files that I need to merge into one. time join <(zcat r_TR2_2012-05-28-08-10-00.gz) <(zcat r_TR1_2012-05-28-08-10-00.gz) The above statement is not working as expected, so I am using three commands instead: gunzip r_TR2_2012-05-28-08-10-00.gz gunzip r_TR1_2012-05-28-08-10-00.gz tar -zcvf combined.tar.gz r_TR1_2012-05-28-08-10-00 r_TR2_2012-05-28-08-10-00 Is there any way to do it in one statement?
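
    If simple concatenation (rather than a field-wise join) is the goal, the gzip format already allows members to be concatenated byte for byte: zcat and gunzip decompress them back to back, so cat a.gz b.gz > combined.gz is valid on its own. The equivalent as a minimal Python sketch:

        import shutil

        # gzip streams can be appended as-is; decompressors read
        # the members back-to-back (multi-member gzip).
        with open("combined.gz", "wb") as out:
            for name in ("r_TR2_2012-05-28-08-10-00.gz",
                         "r_TR1_2012-05-28-08-10-00.gz"):
                with open(name, "rb") as src:
                    shutil.copyfileobj(src, out)

    Note that the three-command version above produces something different: a tar archive containing two separate files, not one merged stream.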

    Read the article
