Search Results

Search found 22292 results on 892 pages for 'image optimization'.

Page 147/892

  • install grub on disk image

    - by Dima
    I have a disk image with 2 partitions. Partition 1 has a cramfs file system (read-only) and contains all the system files of the OS. Partition 2 has an ext3 file system and holds only configuration files that may be changed. How can I install the GRUB1 (GRUB Legacy) boot loader into the MBR? I tried copying the first 446 bytes of my hard disk into the image and copying the GRUB files to the /boot directory on the 1st (cramfs) partition. I cannot use grub-install because I have a disk image, not the disk itself. Any ideas?
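
    A sketch of one common route, assuming GRUB Legacy's grub shell is available on the host and that the stage files sit on a filesystem it can read (cramfs is not among GRUB Legacy's supported stage filesystems, so /boot may need to live on the ext3 partition instead); the image path and loop device are illustrative:

    ```bash
    # Expose the image's partitions as loop devices (losetup -P needs newer
    # util-linux; on older systems, kpartx -av disk.img does the same job)
    sudo losetup -fP --show disk.img        # prints e.g. /dev/loop0
    echo "(hd0) /dev/loop0" > device.map
    # Have the grub shell embed stage1/stage2 into the image's MBR and post-MBR gap;
    # (hd0,1) is the second (ext3) partition in GRUB Legacy's 0-based numbering
    printf '%s\n' 'root (hd0,1)' 'setup (hd0)' 'quit' | \
        sudo grub --device-map=device.map --batch
    sudo losetup -d /dev/loop0
    ```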

    Read the article

  • Creating Custom ISO Images

    - by ericl42
    I am working on creating some custom ISO images using primarily Fedora and CentOS. I want the image to be a bootable live CD with some specific files on it, and I also want it to offer the option of being installed to the hard drive. I've read various articles but want to get a few more opinions, since I've never done this before. Currently I'm trying 2 different methods. The first is to install Fedora configured exactly how I want it and then run the livecd-tools program to pull everything I currently have into an ISO; I haven't got this to work yet, and I do see a few issues with it, such as the default passwords I had to put in. The second is to run a Fedora live CD, install a few things I want on it, and then copy an image of it. I believe this would work better since it has more of a live-CD feel; however, I'm not 100% sure how I should go about pulling the current image into my own ISO. I know some people have said to use mkisofs and a few other programs, but any advice would be greatly appreciated.
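
    For the second approach, the remastering usually ends with mkisofs writing a new bootable image from the modified tree; a minimal sketch, assuming an isolinux-based live CD unpacked into extracted-iso/ (directory and volume names are hypothetical):

    ```bash
    # Master a bootable El Torito ISO from the modified live-CD tree
    mkisofs -o custom.iso \
      -b isolinux/isolinux.bin -c isolinux/boot.cat \
      -no-emul-boot -boot-load-size 4 -boot-info-table \
      -J -R -V "CustomLive" \
      extracted-iso/
    ```

    The first approach is what Fedora's livecd-creator automates: it builds a fresh live image from a kickstart file, which also sidesteps the default-password problem, since the configuration lives in the kickstart rather than in a snapshotted install.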

    Read the article

  • MySQL too many connections

    - by Webnet
    On my server I have 7 databases. The server has 512 MB of RAM, which I'm getting upgraded to 2 GB this evening, and a single 2.4 GHz processor. I've gotten an error about the connection limit being exceeded. Along with increasing my RAM, is it OK to increase the number of connections? Currently it's set to 200, but a single page may connect to 3-4 databases, considering JOINs and such. We've set up so many databases purely for organization. We have a total of about 250-300 tables across all of the databases. Any advice would be appreciated :)
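
    A sketch of the usual sanity check before raising the limit (values are illustrative, not a recommendation for this exact box); each allowed connection also reserves per-thread buffers, which count against the new RAM:

    ```bash
    # Compare the configured ceiling with the high-water mark actually reached
    mysql -e "SHOW VARIABLES LIKE 'max_connections'; SHOW STATUS LIKE 'Max_used_connections';"
    # Raise it at runtime, then persist it under [mysqld] in my.cnf (max_connections = 300)
    mysql -e "SET GLOBAL max_connections = 300;"
    ```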

    Read the article

  • Can't select hard disk off Windows 7 system image creator

    - by David
    When I try to create a system image in Windows 7 from the Control Panel (Control Panel\All Control Panel Items\Backup and Restore), I get the option to select a hard disk or a removable disk. I have 2 disks and wanted to create the image on my spare one. However, when I click refresh it doesn't show either of my disks, but it does show my CD-ROM under the removable disks area. Has anyone had this problem? Also, when I select a USB disk instead, it tries to image both my disks! I can't deselect my active Windows 7 installation disk! How pointless!

    Read the article

  • Using different SSDs types (not only SATA based) as system drive

    - by Hubert Kario
    Currently I have a Thinkpad X61s and want to make it both a bit faster and a bit more power efficient. For that reason I thought that adding an SSD would make the most sense. Unfortunately, for financial reasons, buying an SSD of over 200GB capacity is out of reach for me (not only would it be worth more than the rest of the laptop, but I currently have a 500GB drive in it, so even such a drive would be kind of a downgrade for me). During preliminary testing with a cheap Transcend 4GB Class 6 card (14MiB/s streaming, 9MiB/s random read) I saw boot times reduced by half, so putting only the OS on it would already be an improvement. Unfortunately, my system is now about 11GiB in size, so anything less than 16GB would be constraining. In this laptop I can connect additional drives in at least 5 different ways:
    - using a SATA-ATA converter caddy in the X6 UltraBase
    - using the internal mini PCIe slot
    - using the integrated SDHC slot
    - using the CardBus (a.k.a. PCMCIA or PC Card) slot
    - using USB
    Thankfully, because I use only Linux on this PC, bootability is irrelevant: I can put the /boot partition on the internal HDD and / on any of the above-mentioned flash memories (as I already did for the SDHC test). From what I was able to research, and from my own experience, those options come with rather big downsides or other problems:

    SATA-ATA caddy: It has three downsides: I have to carry the UltraBase with me at all times (it's not really inconvenient, but those grams do add up) and couldn't disconnect it when I want to disconnect the battery; it makes the bay unusable for the optical drive and occasional quick access to other hard drives; and the only caddies I could buy have rather flaky controllers in them, so putting my OS on one would hamper its stability.

    Internal mini PCIe slot: This would be an ideal solution, if only I could find real PCIe SSDs, not devices that can talk only SATA or ATA over the PCIe mechanical connection (the ones used in the Dell Mini or Asus EEE). Theoretically Samsung did release such devices, but I couldn't find them in retail anywhere.

    Integrated SDHC slot: A nice solution with a single drawback: the fastest 16GB SDHC card on the market can only do around 35MiB/s read and 15MiB/s write while costing as much as a normal 40GB SATA SSD that's 10 times faster. Not really cost-effective.

    CardBus (a.k.a. PCMCIA or PC Card) slot: These cards are much faster than the SDHC option (there are ones that do well over 50MiB/s read in benchmarks), and from what I could find the PCMCIA controller in my laptop supports UDMA, so it should be able to deliver comparable speeds. They still cost about the same as SD cards, but at least they provide streaming performance comparable to my current HDD.

    USB: That's the worst option. Not only is it limited to 20-30MiB/s by the interface itself, the drive would stick out of the laptop, so it's a big no-no.

    The question: As such, I think that going the "CF in a CardBus adapter" route will be the best option. My question is: did anyone try using CF cards in CardBus adapters as system drives with Linux on Thinkpad laptops? On laptops in general? What was the real-world performance? I don't have any CF cards, so I can't check how well it works with suspend/resume, or whether it's easy to make it work in the initramfs (I'm using Arch Linux, and the SD card was trivial: add 3 modules in a single config line and rebuild the initramfs), so any tips/gotchas on this are welcome as well.
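
    For the initramfs part, a guess at the Arch Linux equivalent for a CardBus CF adapter, assuming the kernel's PCMCIA PATA driver binds the card (the module names are an assumption and would need checking against lsmod with the card inserted):

    ```bash
    # Guessed additions to /etc/mkinitcpio.conf so / on a CardBus CF card is
    # reachable at boot
    MODULES="pcmcia_core pcmcia pata_pcmcia"
    # then rebuild the initramfs for your kernel's preset
    mkinitcpio -p linux
    ```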

    Read the article

  • Utility to optimally distribute files onto multiple DVDs?

    - by Alex R
    I have a bunch of media files which I want to burn to DVD, but since each DVD only fits about 4.5GB, I have to find the optimal way to organize the files so as to use the minimum number of DVDs (otherwise the empty space left on each DVD can easily add up). Are there any tools to help with this? Many years ago there was a DOS utility that did this for floppy disks.
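
    This is the classic bin-packing problem. One Linux tool that attempts it is dirsplit, shipped alongside genisoimage/mkisofs on Debian-style systems; a sketch (the flag spelling is from memory, so check dirsplit --help, as options vary by version):

    ```bash
    # Split a directory tree into catalogs that each fit one single-layer DVD
    dirsplit -s 4380M /path/to/media-files
    ```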

    Read the article

  • convert several image files to a single djvu file

    - by user62046
    Suppose I have several BMP image files, say 001.bmp, 002.bmp, ..., 100.bmp. I want to convert these files into a single DjVu file whose first page is the content of 001.bmp, whose second page is the content of 002.bmp, etc. What is the best way (software) to do this task? I don't want to upload those image files to a server, since that takes too much time. On the other hand, I am not restricted to BMP files; I can also work with PNG or JPG files.
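
    A sketch using DjVuLibre plus ImageMagick, entirely offline (assumes the djvulibre and imagemagick packages are installed; c44 is DjVuLibre's continuous-tone encoder, while cjb2 would suit black-and-white scans instead):

    ```bash
    # Encode each BMP as a single-page DjVu (via PNM, which the encoders expect),
    # then bundle the pages in name order into one document
    for f in [0-9][0-9][0-9].bmp; do
      convert "$f" "${f%.bmp}.ppm"
      c44 "${f%.bmp}.ppm" "${f%.bmp}.djvu"
    done
    djvm -c book.djvu [0-9][0-9][0-9].djvu
    ```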

    Read the article

  • Cannot install 64-bit version of Visio due to Microsoft Office Single Image 2010

    - by Ryan Kohn
    I tried to install Visio on Windows 7, but I received the error message below:

        You cannot install the 64-bit version of Office 2010 because you have 32-bit Office products installed. These 32-bit products are not supported with 64-bit installations:

        Microsoft Office Single Image 2010

        If you want to install 64-bit Office 2010, you must uninstall all 32-bit Office products first, and then run setup.exe in the x64 folder. If you want to install 32-bit Office 2010, close this Setup program, and then either go to the x86 folder at the root of your CD or DVD and run setup.exe, or get the 32-bit Office 2010 from the same place you purchased 64-bit Office 2010.

    I cannot find Microsoft Office Single Image 2010 in the installed programs list, so I tried to use Microsoft's Fix It tool to remove the software, but this doesn't resolve my issue.

    Read the article

  • PowerShell create new Azure VM from uploaded disk (not image)

    - by MikeBaz
    I have a VHD in Azure storage. That VHD is configured as an OS disk through a command like the following:

        Add-AzureDisk -DiskName $newCode -MediaLocation "http://$script:accountName.blob.core.windows.net/$newCode/$sourceVhdName.vhd" `
            -Label $newCode -OS "Windows"

    I would like to create a new VM pointing at that disk. From what I can tell, if I were doing this with an image I would do something like:

        New-AzureVMConfig -Name $newCode -InstanceSize $instanceSize `
            -MediaLocation "http://$script:accountName.blob.core.windows.net/$newCode/$sourceVhdName.vhd" -ImageName $newCode `
            | Add-AzureProvisioningConfig -Windows -Password $adminPassword `
            | New-AzureVM -ServiceName $newCode

    However, this is wrong for me because I don't have an image - I have a configured VHD that is not sysprepped and can't be. How can I create the VM in PowerShell so that it points at the existing disk, as I can through the portal?
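
    A sketch of what should work with the classic service-management cmdlets: New-AzureVMConfig takes -DiskName in place of -ImageName/-MediaLocation for a disk already registered with Add-AzureDisk, and the provisioning step is skipped entirely (verify the parameter set against your module version):

    ```powershell
    # Boot from the registered OS disk; Add-AzureProvisioningConfig is omitted
    # on purpose, since the VHD is an already-configured, non-sysprepped install
    New-AzureVMConfig -Name $newCode -InstanceSize $instanceSize -DiskName $newCode |
        New-AzureVM -ServiceName $newCode
    ```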

    Read the article

  • How to determine the best byte size for the dd command

    - by James
    I know that doing a dd if=/dev/hda of=/dev/hdb does a deep hard-drive copy. I've heard that people have been able to speed up the process by increasing the number of bytes that are read and written at a time (the default is 512) with the "bs" option. People have suggested that the optimal byte size is due to the sector size, while I personally think it has something to do with the amount of cache the hard drive has. My questions are: what determines the ideal byte size for copying from a hard drive, and why does it determine the ideal byte size?
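
    Since the answer depends on the drive, controller, and kernel readahead, the pragmatic approach is to measure; a sketch, assuming GNU dd (iflag=count_bytes keeps the total read constant across block sizes; the device path is illustrative):

    ```bash
    # Read the same 1 GiB at several block sizes and compare dd's reported throughput
    for bs in 512 4K 64K 1M 4M; do
      sync && echo 3 | sudo tee /proc/sys/vm/drop_caches >/dev/null  # avoid cache hits
      echo "bs=$bs"
      sudo dd if=/dev/sda of=/dev/null bs="$bs" count=1G iflag=count_bytes 2>&1 | tail -n1
    done
    ```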

    Read the article

  • How to get the best LINPACK result and conquer the Top500?

    - by knweiss
    Given a large Linux HPC cluster with hundreds or thousands of nodes, what are your best practices for getting the best possible LINPACK benchmark (HPL) result to submit to the Top500 supercomputer list? To give you an idea of the kind of answers I would appreciate, here are some sub-questions:
    - How do you tune the parameters (N, NB, P, Q, memory alignment, etc.) in the HPL.dat file (without spending too much time trying every possible permutation, especially with large problem sizes N)?
    - Are there any Top500 submission rules to be aware of? What is allowed, what isn't?
    - Which MPI product, and which version? Does it make a difference?
    - Any special host order in your MPI machine file? Do you use CPU pinning?
    - How do you configure your interconnect? Which interconnect?
    - Which BLAS package do you use for which CPU model? (Intel MKL, AMD ACML, GotoBLAS2, etc.)
    - How do you prepare for the big run (on all nodes)? Do you start with small runs on a subset of nodes and then scale up? Is it really necessary to do a big run on all of the nodes (or is extrapolation allowed)?
    - How do you optimize for the latest Intel/AMD CPUs? Hyper-Threading? NUMA?
    - Is it worth it to recompile the software stack, or do you use precompiled binaries? Which settings? Which compiler optimizations, which compiler? (What about profile-based compilation?)
    - How do you get the best result given only a limited amount of time for the benchmark run? (You can block a huge cluster forever.)
    - How do you prepare the individual nodes (stopping system daemons, freeing memory, etc.)?
    - How do you deal with hardware faults (which can ruin a huge run)?
    - Are there any must-read documents or websites about this topic? E.g. I would love to hear background stories of some of the current Top500 systems and how they did their LINPACK benchmark.
    I deliberately don't want to mention concrete hardware details or discuss hardware recommendations, because I don't want to limit the answers. However, feel free to mention hints, e.g. for specific CPU models.
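
    As one concrete example of the parameter tuning asked about above, a widely used rule of thumb sizes N so that the double-precision matrix fills roughly 80% of aggregate memory, leaving room for the OS and MPI buffers (node counts and RAM figures are illustrative):

    ```bash
    # N ~ sqrt(0.8 * total_RAM_bytes / 8), since HPL's matrix holds N^2 doubles
    NODES=512; RAM_GB_PER_NODE=64
    TOTAL_BYTES=$((NODES * RAM_GB_PER_NODE * 1024 * 1024 * 1024))
    awk -v b="$TOTAL_BYTES" 'BEGIN { printf "N ~ %d\n", sqrt(0.8 * b / 8) }'
    # NB is then typically taken from the BLAS vendor's suggested range (e.g. 192/256),
    # and the P x Q process grid is chosen close to square with P <= Q.
    ```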

    Read the article

  • Unable to create a Windows 7 system image of a failing hard drive

    - by Rahul
    The hard disk of my one-year-old T400 ThinkPad has started failing periodic hardware tests. I get a "Targeted Read Test Failed" error, and the "SMART short self test" times out. I am now trying to create a Windows 7 system image of the hard disk, but it fails without giving any specific error message. I tried using Comodo Backup but got an error (code 101117) there as well. I have copied the important files into Dropbox, but I would like to take a full system backup, as I have plenty of software installed on the machine. Does anyone know why this is happening and how I can take a backup of the system image?
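
    One hedge against the failing reads: image the drive with GNU ddrescue from a live USB rather than from the running system, since it grabs the readable areas first and retries bad ones later with a resumable log (device and paths are illustrative):

    ```bash
    # First pass: copy everything readable quickly, skipping bad areas (-n)
    sudo ddrescue -n /dev/sda /mnt/usb/disk.img /mnt/usb/rescue.log
    # Second pass: go back and retry the bad areas a few times
    sudo ddrescue -r3 /dev/sda /mnt/usb/disk.img /mnt/usb/rescue.log
    ```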

    Read the article

  • Taking an image backup of an entire server?

    - by WarDoGG
    I am currently using a dedicated server for my hosting needs. However, the costs are too high and I would like to suspend everything until I work out my business strategy again. Is there a way I can take a complete backup of the filesystem and run it in VMware? I cannot just copy the filesystem's contents, because there are lots of tools installed, and changes to the server configuration files (made by the developers) that I myself don't know about. I need a snapshot of the entire disk image, with everything installed, exactly as it is, because for development I need to work on this copy in VMware or VirtualBox etc. Is it possible for me to take a full image copy? How do I do it?
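
    A low-tech sketch of that physical-to-virtual move, assuming the host can be booted into a rescue system first (imaging a mounted, running filesystem risks an inconsistent copy; hostnames and paths are illustrative):

    ```bash
    # Stream a raw image of the whole disk over SSH, padding unreadable blocks
    ssh root@dedicated-server 'dd if=/dev/sda bs=1M conv=sync,noerror' > server.raw
    # Convert the raw image into a VMDK that VMware/VirtualBox can attach
    qemu-img convert -f raw -O vmdk server.raw server.vmdk
    ```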

    Read the article

  • Unusable Source for Ubuntu image on Xen 3

    - by Roberto Aloi
    Hi all, I'm trying to create a new VM in Xen 3, running Ubuntu 10.04 (32-bit) as the guest OS. Xen 3 is installed on a machine running openSUSE 11.2. I downloaded the Ubuntu image from the ubuntu.com website and mounted it on /dev/loop0. When I try to create the new VM in Xen with the given source, Xen complains that the "source is unusable". I've also checked the MD5 sum of the image; it's fine. Any suggestion or hint that could help me?

    Read the article

  • How to optimize Apache on a web server

    - by Prakash
    How can I optimize the server with the following configuration? It takes too much time to load a page. IBM X3200 M3 server - 1 Intel Xeon processor with 4 GB RAM. Below is my current configuration for Apache:
    - Start Servers: 5 (default)
    - Minimum Spare Servers: 10
    - Maximum Spare Servers: 20
    - Server Limit: 500
    - Max Clients: 500
    - Max Requests Per Child: 10000 (default)
    - Keep-Alive: On
    - Keep-Alive Timeout: 5
    - Max Keep-Alive Requests: 100
    - Timeout: 200
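
    For sizing, the usual prefork arithmetic is free RAM divided by the resident size of one Apache child; on a 4 GB box with, say, 25-30 MB per PHP-laden process, a ceiling of 500 clients would need far more memory than exists. A hedged starting point in mpm_prefork directive terms (values are illustrative, not tuned for this site):

    ```apache
    <IfModule mpm_prefork_module>
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        ServerLimit         150
        MaxClients          150
        MaxRequestsPerChild 5000
    </IfModule>
    KeepAlive        On
    KeepAliveTimeout 3
    Timeout          60
    ```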

    Read the article

  • Is ext4 more expensive than NTFS?

    - by ???
    I have just converted an NTFS partition to ext4; however, the total space seems reduced from 421G to 415G. Where did the 6G go? Also, the reserved space has grown to 199M on ext4, much larger than the 78M on NTFS - why? The partition is mainly used for movies/music, so most files are very large (10M each). I want to use the ext4 file system; any suggestions?

        mkfs.ntfs: /dev/sdb4 421G  78M 421G 1% /mnt/mmedia
        mkfs.ext4: /dev/sdb4 415G 199M 393G 1% /mnt/mmedia

    It's also weird that the remaining size on ext4 is 393G; shouldn't it be 415G or 414G? What happened to the missing 22G? Compared to NTFS, ext4 seems to have eaten 28G in total.
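
    Part of the gap is by design: mke2fs reserves 5% of blocks for root by default, and 5% of 415G is roughly the missing ~21G (the journal and up-front inode tables account for most of the rest, overhead NTFS allocates lazily instead). On a media-only partition the reserve can be reclaimed; device name per the df output above:

    ```bash
    # Drop the root-reserved percentage; df's "Avail" column grows accordingly
    sudo tune2fs -m 0 /dev/sdb4
    # Or keep a token 1% so the volume never fills completely:
    # sudo tune2fs -m 1 /dev/sdb4
    ```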

    Read the article

  • Lots of files being used by blank web page. What are they?

    - by byronyasgur
    I am trying to optimise a website, and I was using the network waterfall facility in Google Chrome. When I looked at the results there were lots of files which I didn't recognise. I first thought they might be something to do with Google Chrome itself, so I put a blank HTML file on my desktop and checked, but there was nothing in the waterfall except the file itself. So I put a blank file on my server and got the output below. What are all these files? Are they all necessary? Is this normal, and do I need to be concerned in any way? My hosting provider has always been excellent in every regard that I'm aware of. My host is shared hosting, using cPanel, based on a LAMP server. I also note that a couple of those files show problems, but I have no idea how to fault-find that or whether it's a concern. EDIT: I have cleared the cache, so I don't think it's a browser-cache issue.

    Read the article

  • Image resolution not showing up in "Get info" dialog in Mac OS X

    - by R.A
    When I bought a Mac Mini with Mac OS X, I could see image dimensions in the Get Info dialog. Then I installed some mobile-application-development software, and later I noticed that Get Info was no longer showing the image dimensions. I need to check the dimensions of those images to include artwork in my project. Now I have reinstalled Mac OS X and it's working fine - it shows the dimensions. Before the reinstall, the 516x314 figure was missing. Why did that happen? How can I prevent it from happening again?
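
    Finder's Get Info reads those dimensions from Spotlight metadata, so when they vanish it is usually the metadata importer or the index that broke (some developer tools ship their own importers). If it recurs, a sketch of how to check and rebuild (the file path is illustrative):

    ```bash
    # Ask Spotlight directly for the attributes Get Info displays
    mdls -name kMDItemPixelWidth -name kMDItemPixelHeight artwork.png
    # If they are missing for files that clearly have them, rebuild the index
    sudo mdutil -E /
    ```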

    Read the article

  • Prefork or Worker MPM for amazon xlarge server?

    - by Netismine
    I'm trying to determine whether the prefork or the worker MPM would be the better Apache module for the server I'm working on, an Amazon X-Large (15 GB memory, 8 EC2 Compute Units: 4 virtual cores with 2 EC2 Compute Units each) that will run a Magento website with about 50 active users at once. The site serves a lot of images, with about 45 requests per page. Images sometimes hang, so it seems worker would be the better option? Thanks
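
    It is worth checking first which MPM the current build actually uses, since with mod_php the practical choice is constrained to prefork unless PHP moves to FastCGI (command assumes a standard httpd install):

    ```bash
    # Show the MPM compiled into the running Apache ("Server MPM:" line)
    apachectl -V | grep -i mpm
    ```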

    Read the article

  • Image doesn't fill monitor when connected via HDMI

    - by russau
    I'm connecting a Dell Studio Slim to a Dell ST2210 21.5" monitor. The PC has an ATI Radeon HD 4350 video card with DVI and HDMI connections. When I connect with DVI, everything works fine. When I connect with HDMI, the image doesn't fill the screen, i.e. there's an approximately 1-inch black border around the image. Win7 tells me the resolution is 1920x1080 (the native resolution of the monitor). I've tried rebooting; no luck.

    Read the article

  • MySQLTuner and query_cache_size dilemma

    - by wbad
    On a busy MySQL server, MySQLTuner 1.2.0 always recommends adding query_cache_size, no matter how much I increase the value (I tried up to 512MB). On the other hand it warns that: Increasing the query_cache size over 128M may reduce performance. Here are the latest results:

        >> MySQLTuner 1.2.0 - Major Hayden <[email protected]>
        >> Bug reports, feature requests, and downloads at http://mysqltuner.com/
        >> Run with '--help' for additional options and output filtering
        -------- General Statistics --------------------------------------------------
        [--] Skipped version check for MySQLTuner script
        [OK] Currently running supported MySQL version 5.5.25-1~dotdeb.0-log
        [OK] Operating on 64-bit architecture
        -------- Storage Engine Statistics -------------------------------------------
        [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
        [--] Data in InnoDB tables: 6G (Tables: 195)
        [--] Data in PERFORMANCE_SCHEMA tables: 0B (Tables: 17)
        [!!] Total fragmented tables: 51
        -------- Security Recommendations -------------------------------------------
        [OK] All database users have passwords assigned
        -------- Performance Metrics -------------------------------------------------
        [--] Up for: 1d 19h 17m 8s (254M q [1K qps], 5M conn, TX: 139B, RX: 32B)
        [--] Reads / Writes: 89% / 11%
        [--] Total buffers: 24.2G global + 92.2M per thread (1200 max threads)
        [!!] Maximum possible memory usage: 132.2G (139% of installed RAM)
        [OK] Slow queries: 0% (2K/254M)
        [OK] Highest usage of available connections: 32% (391/1200)
        [OK] Key buffer size / total MyISAM indexes: 128.0M/92.0K
        [OK] Key buffer hit rate: 100.0% (8B cached / 0 reads)
        [OK] Query cache efficiency: 79.9% (181M cached / 226M selects)
        [!!] Query cache prunes per day: 1033203
        [OK] Sorts requiring temporary tables: 0% (341 temp sorts / 4M sorts)
        [OK] Temporary tables created on disk: 14% (760K on disk / 5M total)
        [OK] Thread cache hit rate: 99% (676 created / 5M connections)
        [OK] Table cache hit rate: 22% (1K open / 8K opened)
        [OK] Open file limit used: 0% (49/13K)
        [OK] Table locks acquired immediately: 99% (64M immediate / 64M locks)
        [OK] InnoDB data size / buffer pool: 6.1G/19.5G
        -------- Recommendations -----------------------------------------------------
        General recommendations:
            Run OPTIMIZE TABLE to defragment tables for better performance
            Reduce your overall MySQL memory footprint for system stability
            Increasing the query_cache size over 128M may reduce performance
        Variables to adjust:
            *** MySQL's maximum memory usage is dangerously high ***
            *** Add RAM before increasing MySQL buffer variables ***
            query_cache_size (> 192M) [see warning above]

    The server has 76GB of RAM and dual E5-2650s. The load is usually below 2. I would appreciate your hints on interpreting the recommendation and optimizing the database configuration.
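
    One way out of the dilemma is to look at why the cache prunes before growing it: with large query_cache_size values, invalidation and fragmentation costs often outweigh the hits, which is exactly what the 128M warning is about. A sketch:

    ```bash
    # Many Qcache_lowmem_prunes despite plenty of Qcache_free_memory usually
    # means fragmentation rather than lack of space
    mysql -e "SHOW GLOBAL STATUS LIKE 'Qcache%';"
    # FLUSH QUERY CACHE defragments the cache without discarding cached queries
    mysql -e "FLUSH QUERY CACHE;"
    ```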

    Read the article

  • How to optimize a PostgreSQL server for a "write once, read many"-type infrastructure?

    - by mhu
    Greetings, I am working on a piece of software that logs entries (and related tagging) in a PostgreSQL database for storage and retrieval. We never update any data once it has been inserted; we might remove it when an entry gets too old, but that happens at most once a day. Stored entries can be retrieved by users. The insertion of new entries can happen rather fast and regularly, so the database will commonly hold several million elements. The tables used are pretty simple: one table for ids, raw content, and insertion date; and one table storing tags and their values associated with an id. User searches mostly concern tag values, so SELECTs usually consist of JOIN queries on ids across the two tables. To sum it up:
    - 2 tables
    - lots of INSERTs
    - no UPDATEs
    - some DELETEs, once a day at most
    - some user-generated SELECTs with JOINs
    - a huge data set
    What would an optimal server configuration (software and hardware; I assume, for example, that RAID 10 could help) be for my PostgreSQL server, given these requirements? By optimal, I mean one that allows SELECT queries to complete in a reasonably short time. I can provide more information about the current setup (tables, indexes, ...) if needed.
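
    On the software side, a hedged postgresql.conf starting point for an insert-heavy, read-mostly workload (values assume roughly 16 GB of RAM and are illustrative; the hardware side, such as RAID 10 for data and WAL, matters at least as much):

    ```
    # postgresql.conf sketch - tune to the actual machine
    shared_buffers = 4GB                  # ~25% of RAM is the usual starting point
    effective_cache_size = 12GB           # what the OS page cache can realistically hold
    checkpoint_completion_target = 0.9    # spread checkpoint I/O to smooth INSERT bursts
    wal_buffers = 16MB
    maintenance_work_mem = 512MB          # faster index builds and VACUUM after the daily DELETE
    ```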

    Read the article
