Search Results

Search found 1246 results on 50 pages for 'compression'.

Page 5/50

  • Oracle Advanced Compression Webcast Replay Available

    - by [email protected]
    Did you miss our webcast "Save BIG on Storage - with Oracle Database 11g and Advanced Compression"? Don't worry, you can still register and view the recording including the full Q&A session with Tim Shetler and Bill Hodak. Click here to learn how Oracle Advanced Compression can reduce your disk space requirements for all types of data, improve query and storage performance and lower storage costs throughout the datacenter.

    Read the article

  • Compression Array of Bytes

    - by Pedro Magalhaes
    Hi, my problem is: I want to store an array of bytes in a compressed file and then read it back with good performance. So I create an array of bytes, pass it to a ZLIB algorithm, and store it in the file. To my surprise the algorithm doesn't work well, probably because the array is a random sample. Using this approach it would be very easy to read: just copy the stream to memory, decompress it, and copy it into an array of bytes. But I need to compress the file. Do I have to use an algorithm, like RLE, to compress the byte array? I think I could store the byte array as a string and then compress it, but I think I would get poor performance when reading the data. Sorry for my poor English. Thanks
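
    If the bytes really are close to random, no general-purpose compressor (ZLIB included) will shrink them, and the output can even grow slightly; RLE will not help either unless the data contains literal runs. For data that does have redundancy, a minimal round-trip sketch using Python's zlib module looks like this (the file name and compression level are illustrative only):

    ```python
    import zlib

    def save_compressed(path: str, data: bytes, level: int = 9) -> None:
        # DEFLATE the byte array and write it out in one shot.
        # If `data` is essentially random, the output will be the same size or larger.
        with open(path, "wb") as f:
            f.write(zlib.compress(data, level))

    def load_compressed(path: str) -> bytes:
        # Read the whole file into memory and inflate it back to the original bytes.
        with open(path, "rb") as f:
            return zlib.decompress(f.read())

    if __name__ == "__main__":
        original = bytes(range(256)) * 1000   # repetitive, so it compresses well
        save_compressed("data.bin.z", original)
        assert load_compressed("data.bin.z") == original
    ```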

    Read the article

  • Windows command line compression/extraction tool?

    - by Will Marcouiller
    I need to write a batch file that unzips files into their current folder, starting from a given root folder. Folder 0 |----- Folder 1 | |----- File1.zip | |----- File2.zip | |----- File3.zip | |----- Folder 2 | |----- File4.zip | |----- Folder 3 |----- File5.zip |----- FileN.zip I want my batch file to be launched like this: ocd.bat /d="Folder 0" It should then iterate through all of the subfolders and unzip each file exactly where the .zip file is located. So here's my question: does Windows (from XP onward, at least) have a command-line interface to its built-in zip tool? Otherwise, should I stick with a third-party utility?
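
    As far as I know, the built-in Compressed Folders feature on XP has no documented command-line interface, so batch scripts usually call a third-party tool such as 7-Zip or Info-ZIP. If a scripting runtime is acceptable instead of pure batch, the traversal itself is small; a hedged sketch in Python (assuming Python is installed; the script name in the usage comment is made up):

    ```python
    import os
    import sys
    import zipfile

    def unzip_in_place(root: str) -> None:
        # Walk every subfolder under `root` and extract each .zip next to itself.
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if name.lower().endswith(".zip"):
                    archive = os.path.join(dirpath, name)
                    with zipfile.ZipFile(archive) as zf:
                        zf.extractall(dirpath)

    if __name__ == "__main__":
        # Usage: python unzip_tree.py "Folder 0"
        unzip_in_place(sys.argv[1] if len(sys.argv) > 1 else ".")
    ```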

    Read the article

  • String Compression using Lempel-Ziv

    - by roybot
    I'm looking for a way of compressing a given string using the Lempel-Ziv algorithm. Preferably there would only be a set of two functions, encoder and decoder. The encoder takes the string and returns an integer. The decoder takes the integer and returns the original string. Time complexity is not important. How would you implement this?
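
    A minimal sketch of the LZW variant in Python; note that the encoder here returns a list of integer codes rather than a single integer (packing the codes into one big integer would be a separate, straightforward step):

    ```python
    def lzw_encode(text):
        # Dictionary starts with all single byte-valued characters.
        table = {chr(i): i for i in range(256)}
        out, w = [], ""
        for ch in text:
            wc = w + ch
            if wc in table:
                w = wc
            else:
                out.append(table[w])
                table[wc] = len(table)   # register the new phrase
                w = ch
        if w:
            out.append(table[w])
        return out

    def lzw_decode(codes):
        table = {i: chr(i) for i in range(256)}
        w = table[codes[0]]
        out = [w]
        for code in codes[1:]:
            # The only code that can be unknown is the one being built right now.
            entry = table[code] if code in table else w + w[0]
            out.append(entry)
            table[len(table)] = w + entry[0]
            w = entry
        return "".join(out)

    sample = "TOBEORNOTTOBEORTOBEORNOT"
    assert lzw_decode(lzw_encode(sample)) == sample
    ```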

    Read the article

  • Maximum compression of a MSI install using WIX

    - by MX4399
    I used to build installers for an app with NSIS and the final self-extractor was 1.2 MB. Now I need to use WiX due to operational needs, and the same install comes out at 4.2 MB. I set the compressed flags on the package node as the docs and specs indicate. Zipping the MSI with 7z results in a 2.4 MB zip file. Question: how can I apply maximum compression to the MSI, or otherwise create a smaller MSI (e.g. by removing unneeded resources)? Note - size is extremely important and I have to use MSI/WiX now - this is a show stopper!
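
    Part of the gap is likely the container format itself: MSI cabinet compression generally cannot match the LZMA that NSIS can use. Within those limits, it is worth checking that the cabinet compression level is actually set to high; a hedged WiX v3 fragment (surrounding attributes are placeholders, not taken from the original project):

    ```xml
    <!-- Sketch only: WiX v3; InstallerVersion and other attributes are placeholders -->
    <Package InstallerVersion="200"
             Compressed="yes"
             CompressionLevel="high" />

    <!-- WiX v3.6+ alternative: control the embedded cab via MediaTemplate -->
    <MediaTemplate EmbedCab="yes" CompressionLevel="high" />
    ```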

    Read the article

  • Java code compression and decompression of a string

    - by user1822710
    I am having trouble figuring out how to scan a string for consecutive identical characters, count the length of each run, print the character and its count, record where each run ends, and then move on to the next character that differs from the previous one. The program is case-sensitive. So the input could be: aaaaAAAbbbddccc How would I compress this string to a4A3b3d2c3, and then decompress it?
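
    The question is about Java, but the run-length logic is the same in any language; a minimal Python sketch of both directions (it translates directly to Java with a StringBuilder and a loop over runs):

    ```python
    from itertools import groupby

    def rle_encode(s: str) -> str:
        # "aaaaAAAbbbddccc" -> "a4A3b3d2c3"; case-sensitive because 'a' != 'A'
        return "".join(f"{ch}{len(list(run))}" for ch, run in groupby(s))

    def rle_decode(s: str) -> str:
        # Assumes every character is followed by its count and the data contains no digits.
        out, i = [], 0
        while i < len(s):
            ch = s[i]
            i += 1
            j = i
            while j < len(s) and s[j].isdigit():
                j += 1
            out.append(ch * int(s[i:j]))
            i = j
        return "".join(out)

    assert rle_encode("aaaaAAAbbbddccc") == "a4A3b3d2c3"
    assert rle_decode("a4A3b3d2c3") == "aaaaAAAbbbddccc"
    ```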

    Read the article

  • Can a JPEG compressed image be rotated without a loss in quality?

    - by Mat
    JPEG is a lossy compression scheme, so decompression-manipulation-recompression normally reduces the image quality further for each step. Is it possible to rotate a JPEG image without incurring further loss? From what little I know of the JPEG algorithm, it naively seems possible to avoid further loss with a bit of effort. Which common image manipulation programs (e.g. GIMP, Paint Shop Pro, Windows Photo Gallery) and graphic libraries cause quality loss when performing a rotation and which don't?

    Read the article

  • Does ZFS cache Compressed or Uncompressed data in a ZFS file-system with compression turned on?

    - by George Bailey
    ZFS supports file-system compression and it also caches frequently or recently accessed data. If a system has lots of CPU but the underlying data storage is slow, it is possible that ZFS would perform better with compression turned on. This can easily be tested when writing files by measuring CPU and disk usage and throughput (of course there may be some latency, but this would not be an issue for large files). But what about the cache? If data has to be decompressed every time it is read, then this is probably less of a good idea. Is the cached data compressed? Does anybody have some information on this?

    Read the article

  • MPI: are there MPI libraries capable of message compression?

    - by osgx
    Sometimes MPI is used to send low-entropy data in messages, so it can be useful to try to compress messages before sending them. I know that MPI can run on very fast networks (10 Gbit/s and more), but many MPI programs are used with cheap networks like 0.1 or 1 Gbit/s Ethernet and with cheap (slow, low-bisection) network switches. There is a very fast compression algorithm, Snappy (wikipedia), with a compression speed of about 250 MB/s and a decompression speed of about 500 MB/s, so on compressible data and a slow network it would give some speedup. Is there any MPI library which can compress MPI messages (at the MPI layer; not compression of IP packets as in PPP)? MPI messages are also structured, so there could be some special method, like compressing the exponent part of an array of doubles.
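
    Whether any MPI implementation does this transparently at the MPI layer, I can't say; a common workaround is to compress the payload at the application level before the send and decompress it after the receive. A hedged sketch using mpi4py with python-snappy (both are assumed to be installed; run with something like mpirun -n 2):

    ```python
    # Application-level message compression sketch; assumes mpi4py and python-snappy.
    import numpy as np
    import snappy
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    if rank == 0:
        payload = np.zeros(1_000_000, dtype=np.float64)   # low-entropy data compresses well
        packed = snappy.compress(payload.tobytes())
        comm.send(packed, dest=1, tag=0)                  # send the compressed bytes
    elif rank == 1:
        packed = comm.recv(source=0, tag=0)
        payload = np.frombuffer(snappy.decompress(packed), dtype=np.float64)
        print(rank, payload.shape)
    ```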

    Read the article

  • Increase the compression performance of VPN

    - by Martin
    I am currently switching from a system with HPN-SSH tunnels and enabled compression to something VPN-based. I have tried tinc and n2n so far; hamachi requires a library I do not have. In my primitive benchmarks I am not satisfied with the achievable bandwidth compared to the SSH tunnels. In tinc the low LZO setting performed best, but compression is only available in UDP mode. Ideally I would like to have a TCP-based VPN with multi-threaded compression. Can you suggest some ways to increase the performance? Would it be possible to somehow put a compression filter in front of the tun interface? Or are there any VPN implementations that might be better suited to my needs (fast compression, TCP-based, switch mode, does not have to be super-secure)? I would consider tunnelling Ethernet over SSH, but according to some articles it is not advisable.

    Read the article

  • Is there a quality, file-size, or other benefit to JPEG sizes being multiples of 8px or 16px?

    - by davebug
    The JPEG compression encoding process splits a given image into blocks of 8x8 pixels, working with these blocks in future lossy and lossless compressions. [source] It is also mentioned that if the image dimensions are a multiple of 1 MCU block (defined as a Minimum Coded Unit, 'usually 16 pixels in both directions') then lossless alterations to a JPEG can be performed. [source] I am working with product images and would like to know both whether, and how much, benefit can be derived from using multiples of 16 in my final image size (say, an image sized 480px by 360px) vs. a non-multiple of 16 (such as 484x362). In this example I am not interested in further alterations, editing, or recompression of the final image. To try to get closer to a specific answer, where I know there must be large generalities: Given a 480x360 image that is 64k and saved at maximum quality in Photoshop [example]: Can I expect any quality loss from an image that is 484x362? What amount of file-size increase can I expect (for this example, the additional space would be white pixels)? Are there any other disadvantages to growing larger than the 8px grid? I know it's arbitrary to use that specific example, but it would still be helpful (for me and potentially any others pondering an image size) to understand what level of compromise I'd be dealing with in breaking the 8px grid. The key issue here is a debate I've had about whether images whose dimensions are divisible by 8 pixels are higher quality than images that are not.

    Read the article

  • Creating transparent PNG with exact RGBA values

    - by rrowland
    I'm color-coding a transparent image that is to be read programmatically. However, the image seems to be getting compressed, and my code is reading color values other than the ones I mean to pass it. Concept This is the output I get when exporting as PNG-24. I programmatically check each pixel for one of the six colors I use in creating the image: 0x00000F 0x0000F0 0x000F00 0x00F000 0x0F0000 0xF00000 Each color represents a different texture to apply. Top right (0x00000F) will pull texture from the tile to its top right and blend it at a ratio equal to the opacity of the pixel. The end goal is to create a hex-tiled grid with differing textures that blend smoothly. What's happening It seems that when converting to PNG, Photoshop changes the RGBA values, either to smooth things out or just to help compression. Parts whose red value should be 250 range anywhere from 150 to 255. Question Whether using PNG or another web-compatible format, I need to be able to save these pixel values - essentially instructions - losslessly while still maintaining transparency. Is this possible in any format, or will I need to re-think my approach?
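
    PNG itself is lossless, so exact RGBA values can survive a round trip; when they don't, the usual suspects are steps in the export pipeline (color-profile conversion, "Save for Web" options, anti-aliased edges sampling neighbouring colors) rather than the format. A hedged sketch using Pillow (assumed installed) to write and verify exact channel values, independent of Photoshop; the file name is illustrative:

    ```python
    from PIL import Image

    # Write a tiny RGBA PNG with exact channel values and read them back unchanged.
    img = Image.new("RGBA", (2, 1), (0, 0, 0, 0))
    img.putpixel((0, 0), (0x00, 0x00, 0x0F, 128))   # 0x00000F at ~50% opacity
    img.putpixel((1, 0), (0xF0, 0x00, 0x00, 255))   # 0xF00000, fully opaque
    img.save("codes.png")                            # PNG stores these values losslessly

    check = Image.open("codes.png").convert("RGBA")
    assert check.getpixel((0, 0)) == (0x00, 0x00, 0x0F, 128)
    assert check.getpixel((1, 0)) == (0xF0, 0x00, 0x00, 255)
    ```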

    Read the article

  • How does a web server/the http protocol handle version control and compression?

    - by Sune Rasmussen
    When a client browser requests a file from the web server, I know that some kind of check is performed, because the files needed to serve the web page may already be cached by the web browser. So, if a file exists in the cache, no file is sent. But if the file on the server has changed since it was cached in the browser, the file is sent and updated anyway. Then, if you have compression like gzipping enabled on the server, the files that are to be delivered to the client must be gzipped on the way, requiring some amount of server-side processing. But how is this managed? The logical approach seems to me that the web server should have a cache as well, containing the newest version of all files that have been requested within a certain time span - and thus a compressed version of these files - so that compression would not have to be done each time a file is requested. And also, how are files actually requested? Does the browser ask for each file as it encounters one in the HTML code (when that file is not already in the local cache), or does it add up all the files that are needed and ask for the whole bunch at once? This is only guessing from a programming point of view, and I don't really know. If the answers are very different among web server systems, I'm primarily interested in Apache, but other answers are appreciated, too.
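
    The client side of this negotiation is visible in the request headers: Accept-Encoding advertises gzip support, and If-None-Match / If-Modified-Since turn the request into a conditional GET that the server can answer with 304 Not Modified instead of resending the (compressed) body. A hedged sketch using the requests library (third-party, assumed installed; the URL is only a placeholder):

    ```python
    import requests

    url = "https://example.com/static/app.js"

    # First request: advertise gzip support; the server may reply with
    # Content-Encoding: gzip plus cache validators such as ETag / Last-Modified.
    first = requests.get(url, headers={"Accept-Encoding": "gzip"})
    print(first.status_code, first.headers.get("Content-Encoding"), first.headers.get("ETag"))

    # Revalidation: a conditional GET. If the resource has not changed, the server
    # can answer 304 Not Modified and skip sending (and re-compressing) the body.
    etag = first.headers.get("ETag")
    if etag:
        second = requests.get(url, headers={"Accept-Encoding": "gzip",
                                            "If-None-Match": etag})
        print(second.status_code)  # 304 when the cached copy is still valid
    ```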

    Read the article

  • Is there a compression method for compressing a group of very similar files, without archiving them?

    - by awiebe
    I want to compress a large number of files that have near-identical headers, and also some data, but I do not wish to archive them, nor do I wish to zip them individually (because the compression ratio would be much higher if substitutions of similar blocks could be done using a single table). Does a compression method already exist to do this, or should I implement it myself? Note: don't say "disk space is cheap", because I may want to use this on an embedded system.
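
    One way to get the "single shared table" effect while keeping each file independent is a preset dictionary: zlib lets you prime the compressor and decompressor with the bytes the files have in common (for example, the near-identical header). A hedged sketch in Python (the shared bytes and sample payload below are made up):

    ```python
    import zlib

    # Bytes the files have in common (e.g. the near-identical header); illustrative only.
    shared = b"COMMON-HEADER v1\nfield=..." * 4

    def compress_with_dict(data: bytes) -> bytes:
        c = zlib.compressobj(level=9, zdict=shared)
        return c.compress(data) + c.flush()

    def decompress_with_dict(blob: bytes) -> bytes:
        # The decompressor must be primed with the same dictionary.
        d = zlib.decompressobj(zdict=shared)
        return d.decompress(blob) + d.flush()

    sample = b"COMMON-HEADER v1\nfield=..." + b"per-file payload bytes"
    blob = compress_with_dict(sample)
    assert decompress_with_dict(blob) == sample
    ```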

    Read the article

  • btrfs and missing free space

    - by easteregg
    I converted my ext4 partition to btrfs and deleted the saved subvolume after doing so. Then I enabled compression (lzo) for the filesystem in the fstab file, and everything is correct so far. Then I forced compression of all existing files using the defragmentation command with the parameter -c, so that the new compression is applied to all files. While doing so, I noticed that my SSD got completely filled up - before, I had 6 GB of free space; now I have nothing left. easteregg@x201s:~$ btrfs fi df / Data: total=50.00GB, used=49.17GB System: total=32.00MB, used=4.00KB Metadata: total=24.50GB, used=9.86GB and easteregg@x201s:~$ df -ha Filesystem Size Used Avail Use% Mounted on /dev/sda1 75G 60G 852M 99% / So now, how can I regain my free space? I expected to gain more space because of the lzo compression. The filesystem is correctly mounted: easteregg@x201s:~$ mount /dev/sda1 on / type btrfs (rw,noatime,ssd,compress=lzo) Any ideas how to fix this issue?

    Read the article

  • Question about Byte-Pairing for data compression [closed]

    - by user1669533
    Question about byte-pairing for data compression. If byte pairing converts two byte values to a single byte value, splitting the file in half each pass, then shrinking a 1 GB file by a factor of 16 (four passes) brings it down to 62,500,000 bytes. My question is: is byte-pairing really efficient? Is the creation of a 5,000,000-iteration loop, to be conservative, efficient? I would like some feedback and some incisive opinions, please. Best regards.

    Read the article

  • How can I maximally compress .gz files in Nautilus?

    - by Takkat
    When selecting Compress... from the right-click context menu in Nautilus, I can quickly compress files to .gz format. However, by default Nautilus does not use maximum compression. Can I make Nautilus use maximum compression, like gzip -9? Using gconftool or gconf-editor to set the compression_level for File Roller to maximum seems like the right approach, but unfortunately it does not have the desired effect and does not lead to maximally compressed files. As this is the expected way to set compression levels, a bug report has been filed upstream. Any ideas for a workaround are welcome.
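
    Until the File Roller setting takes effect, one workaround is to script the compression directly so the level is explicit. A minimal Python sketch, roughly equivalent to gzip -9 while keeping the original file (the script name in the comment is made up):

    ```python
    import gzip
    import shutil
    import sys

    def gzip_max(path: str) -> None:
        # Write <path>.gz at the maximum gzip compression level, leaving the original in place.
        with open(path, "rb") as src, gzip.open(path + ".gz", "wb", compresslevel=9) as dst:
            shutil.copyfileobj(src, dst)

    if __name__ == "__main__":
        # Usage: python gzip_max.py file1 file2 ...
        for name in sys.argv[1:]:
            gzip_max(name)
    ```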

    Read the article

  • Need help with some IIS7 web.config compression settings.

    - by Pure.Krome
    Hi folks, I'm trying to configure my IIS7 compression settings in my web.config file. I'm trying to enable HTTP 1.0 requests to be gzipped. MSDN has all the info about it here. Is it possible to have this config info in my own website's web.config file? Or do I need to set it at an application level? Currently, I have this code in my web.config... <system.webServer> <urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="true" /> <httpCompression cacheControlHeader="max-age=86400" noCompressionForHttp10="False" noCompressionForProxies="False" sendCacheHeaders="true" /> ... other stuff snipped ... </system.webServer> It's not working :( HTTP 1.1 requests are getting compressed, just not 1.0. That MSDN page above says that it can be used in: Machine.config ApplicationHost.config Root application Web.config Application Web.config Directory Web.config So, can we set these settings on a per-website basis, programmatically in a web.config file? (this is an Application Web.config file...) What have I done wrong? Cheers :) EDIT: I was asked how I know HTTP 1.0 is not getting compressed. I'm using the Failed Request Tracing rules, which report back: DYNAMIC_COMPRESSION_START DYNAMIC_COMPRESSION_NOT_SUCESS Reason: 3 Reason: NO_COMPRESSION_10 DYNAMIC_COMPRESSION_END

    Read the article

  • Does Compressed Sensing bring anything new to data Compression?

    - by anon
    Compressed sensing is great for situations where capturing data is expensive (either in energy or time), and thus fewer samples can be taken. However, in situations like image compression, given that the data is already on the computer - does compressed sensing offer anything? For example, would it offer better data compression? Would it result in better image search? ... (Note: if you don't know what the field of compressed sensing is, please do not respond.)

    Read the article

  • Sharp HealthCare Reduces Storage Requirements by 50% with Oracle Advanced Compression

    - by [email protected]
    Sharp HealthCare is an award-winning integrated regional health care delivery system based in San Diego, California, with 2,600 physicians and more than 14,000 employees. Sharp HealthCare's data warehouse forms a vital part of the information system's infrastructure and is used to separate business intelligence reporting from time-critical health care transactional systems. Faced with tremendous data growth, Sharp HealthCare decided to replace their existing Microsoft products with a solution based on Oracle Database 11g and to implement Oracle Advanced Compression. Join us to hear directly from the primary DBA for the Data Warehouse Application Team, Kim Nguyen, how the new environment significantly reduced Sharp HealthCare's storage requirements and improved query performance.

    Read the article

  • Oracle saves with Oracle Database 11g and Advanced Compression

    - by jenny.gelhausen
    Oracle Corporation runs a centralized eBusiness Suite system on Oracle Database 11g for all its employees around the globe. This clustered Global Single Instance (GSI) has scaled seamlessly with many acquisitions over the years, doubling the number of employees since 2001 and supporting around 100,000 employees today, 24 hours a day, 7 days a week around the world. In this podcast, you'll hear from Raji Mani, IT Director for Oracle's PDIT Group, on how Oracle Database 11g and Advanced Compression are helping to save big on storage costs.

    Read the article

  • How do I get the compression on a specific dynamic body?

    - by Mike JM
    Sorry, I could not find any tag that would suit my question. Let me first show you the image and then describe what I want to do: I'm using Box2D. As you can see, there are three dynamic bodies connected to each other (think of it as a table seen from the front). LEG1 and LEG2 are connected to the static body (it's the ground body). Another dynamic body is falling onto the table. I need to get the compression in LEG1 and LEG2 separately. Joints have a GetReactionForce() function, which returns a b2Vec2 that in turn has Length() and LengthSquared() functions. This gives the total force in any given joint. But what I need is the force in each individual body connected by the joints. Once you connect several bodies with a single joint, it again shows only the sum of the forces, which is not useful. Here's the case I'm talking about:

    Read the article
