Search Results

Search found 13068 results on 523 pages for 'copy and paste'.


  • Apache on CentOS 5.9 VM serves my optimized images corrupted (but my Mac doesn't)

    - by Robert K
    I'm using a Vagrant VM to mirror the client's environment as closely as I can. As part of our build process we do no optimization of assets early on; that comes as we're ready to take a site live. Needless to say, this issue is beginning to worry me, as we need to take the site live very soon. I use ImageOptim to automate optimization of image assets; it runs a whole series of tools (Zopfli, PNGOUT, OptiPNG, AdvPNG, PNGCrush), and I always set the optimizations to their maximum setting. After optimization, my PNGs come out visibly corrupted. What's weird is that if I serve the same file through my Mac's copy of Apache, not through Vagrant, the image loads fine. In fact, the only time it's ever corrupt is when the image is served from the Vagrant VM and its install of Drupal. All optimized JPEGs display only the first ~20% of the image, and PNGs, depending on the image, may show either a portion of the image or "progressive"-style corruption. The browser itself makes no difference: the same browser will serve an uncorrupted image from my Mac's Apache instance and a corrupt image from the VM. Even when I disable all PNG optimizations except PNGCrush and the removal of PNG metadata, the image is still served corrupted. I'm optimizing JPEG images with JPEGmini. The server is running CentOS 5.9, Apache 2.2.3-85, PHP 5.3.3, and Drupal 7. As best I can tell, the error lies somewhere within the VM, either with Apache or (perhaps) with the network stack; the tools that optimize the compression of the PNGs and JPEGs seem to be what trigger it. I've already determined that the .htaccess file isn't interfering with how the images load. What should I try to troubleshoot this?
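
    A quick way to confirm whether the VM is mangling the bytes in transit (rather than the optimizers damaging the files) is to checksum a file on disk inside the VM and compare it against what Apache actually serves. The paths below are placeholders; and if the docroot sits on a VirtualBox shared folder (Vagrant's default), Apache's sendfile/mmap handling is a well-known cause of exactly this kind of truncation, so that is worth trying too:

        # Inside the VM: checksum the file on disk, then as served over HTTP
        md5sum /var/www/html/sites/default/files/example.png
        curl -s http://localhost/sites/default/files/example.png | md5sum

        # If they differ, disable sendfile/mmap (safe on shared folders) and retest
        echo "EnableSendfile Off" >> /etc/httpd/conf/httpd.conf
        echo "EnableMMAP Off" >> /etc/httpd/conf/httpd.conf
        service httpd restart

    If the checksums match and the file is still corrupt in the browser, the problem is downstream of Apache; if they differ, the sendfile/mmap angle is the likely culprit.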


  • How to backup virtual machines on a standalone ESXi host?

    - by Massimo
    Standalone ESXi (4.1) host without any vCenter Server. How do I back up virtual machines as quickly and storage-efficiently as possible? I know I can access the ESXi console and use the standard Unix cp command, but this has the drawback of copying the whole VMDK files, not just their actually used space; so for a 30 GB VMDK of which only 1 GB is used, the backup would take a full 30 GB of space, and time accordingly. And yes, I know about thin-provisioned virtual disks, but they tend to behave very badly when physically copied and/or to blow up to their full provisioned size; they are also not recommended for VM performance. It is OK for me to shut down the VMs before backing them up (i.e. I don't need "live" backups), but I need a way to copy them around efficiently; and yes, a way to automate shutdown/startup when taking a backup would also help. I only have ESXi; no Service Console, no vCenter Server... what's the best way to handle this task? Also, what about restores?
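
    For what it's worth, a standalone host still has the Tech Support Mode shell, and two tools there cover most of this: vim-cmd for scripted power operations and vmkfstools for cloning, which understands the VMDK format and can write a thin (used-blocks-only) copy, unlike cp. A rough sketch, with the VM id, datastore, and paths as placeholders:

        # Find the VM id, shut the guest down, clone the disk thin, power back on
        vim-cmd vmsvc/getallvms
        vim-cmd vmsvc/power.shutdown 42    # clean shutdown needs VMware Tools; power.off forces it
        vmkfstools -i /vmfs/volumes/datastore1/myvm/myvm.vmdk \
            -d thin /vmfs/volumes/backup/myvm/myvm.vmdk
        vim-cmd vmsvc/power.on 42

    Copy the .vmx and other small files with plain cp; a restore is the same vmkfstools -i in the opposite direction (plus re-registering the VM if needed).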


  • Alias for Drupal "Sites" folder with Apache on Windows Server 2008

    - by sgtbeano
    I'm having to move a number of sites from a LAMP stack to a WAMP one, provided by Zend, and I've hit a problem. Our architecture is a number of load-balanced web servers which have their own local webapp drives that are kept in sync, with one server acting as the master copy. There is then a separate DFS share provided to all web servers from our Pillar SAN. Under our LAMP cluster, a Drupal install would have the main Drupal web app in a local HTDOCS mount for each server, and the SITES directory within Drupal would then be symlinked out to the DFS or NFS share so that there is a common FILES and TMP directory. The problem I'm having is that there seems to be no equivalent of symlinks on Windows Server 2008; shortcuts have a .lnk extension, making Apache see them as distinct files. So I've tried using an Alias in the vhost file like this:

        <Location /drupal-626/sites>
            Order deny,allow
            Allow from all
        </Location>
        Alias /drupal-626/sites "Z:\Path to alternate sites directory"

    The root for this test is http://main-domain-url/drupal-626/. Unfortunately this isn't working, so I'm wondering if any of you have a solution that would work? Many thanks for taking the time to read this.
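
    One hedged aside: NTFS on Windows Server 2008 does have true symbolic links via mklink (not .lnk shortcuts), and Apache follows them like any other directory, so the LAMP-style layout may be reproducible as-is. Paths below are placeholders; note that a symlink to a mapped drive letter can fail for a service account that never maps Z:, so a UNC target is often safer:

        REM Directory symbolic link; Apache treats it as a normal directory
        mklink /D "C:\htdocs\drupal-626\sites" "\\san\share\sites"

    A junction (mklink /J) is the alternative for local volumes, but junctions cannot point at remote paths, so for the DFS share a symlink is the one to try.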


  • Missing Memory on Windows Server 2008

    - by Chris Lively
    I have a Windows 2008 x64 server with 8GB of RAM installed. Task Manager and Resource Monitor both insist that 7.5GB of the RAM is in use. However, the memory list under Processes (Memory Private Bytes) doesn't add up: I do have "Show processes from all users" checked, and hand-adding the numbers I come up with about 3.5GB of RAM. I also looked at the latest copy of SysInternals Process Explorer, and neither Private Bytes nor Working Set adds up to more than about 3.5GB of RAM in use. What's going on?

    Update: I bounced the server to see what would happen with the memory utilization. After boot and regular operations began, it sat at 3GB of RAM usage. 18 hours later, it's back up to 6.8GB of usage with no indication as to where the additional 3.5GB or so of RAM is being used. I took screenshots of Resource Monitor and Task Manager showing this.

    Update 2: Well, I believe I located the problem. When I detached one of the larger databases from my SQL Server, the amount of RAM shown as "in use" dropped drastically, while the Memory Private Bytes count barely moved. So I'm guessing that SQL Server has some way of allocating memory where it doesn't really show up in any of the monitors. I went further and created a new database file, then transferred all of the data from the one I detached. Even though it has the same data, and the same transactions going through it, the memory in use has stayed low. Maybe there was some corruption in the DB? I'll leave it to the DB gods and go searching for another "problem" ;)
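
    For reference, SQL Server's buffer pool is exactly the kind of allocation that per-process counters miss when "locked pages in memory" is in effect, which would explain the gap. If the instance is SQL Server 2008 or later, it can report its own memory use directly; a hedged sketch (the instance name is a placeholder):

        sqlcmd -S localhost -Q "SELECT physical_memory_in_use_kb/1024 AS mb_in_use FROM sys.dm_os_process_memory"
        sqlcmd -S localhost -Q "SELECT name, value_in_use FROM sys.configurations WHERE name = 'max server memory (MB)'"

    Capping 'max server memory' is the usual way to keep the buffer pool from absorbing most of the box over time; the growth from 3GB to 6.8GB over 18 hours fits that pattern.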


  • How can I get DVDs playing after a Vista to XP change?

    - by Liath
    I replaced my Vista install on a Dell Inspiron 1525 with XP and have managed to get most things up and running again, however I'm having trouble with playing DVDs. When I try to play a DVD I get the following message:

    Windows Media Player cannot play this DVD because there is a problem with digital copy protection between your DVD drive, decoder, and video card. Try installing an updated driver for your video card.

    I have ensured that my drive is configured to play Region 2 discs (I'm in the UK), and I've installed the most up-to-date XP codec pack, which makes me think it's a driver issue. In Device Manager I have got my DVD drivers up to date, however under "Other Devices" I'm missing several which sound key:

    Audio Device on High Definition Audio Bus
    Modem Device on High Definition Audio Bus
    Video Controller
    Video Controller (VGA Compatible)

    However, I've installed all the relevant drivers I can find on the Dell website. The drive itself is working - I've run software from the drive. I'm afraid I am far from a sysadmin, so I'm struggling on this one. How can I get my DVDs playing again?


  • Undoing a Windows disk conversion of an HFS+ drive

    - by BLAKE
    Last night, a friend asked me to give him a copy of a Word document. He handed me an external hard drive and left. I plugged the hard drive into my file server running Windows Server 2003, opened Disk Management and clicked OK. (I know that in Windows 2003 you need to manually assign a drive letter to external drives.) I then looked at the drive in Disk Management and it said that it was unallocated space. I called my friend and he said that there was data on the drive, but he used it with his MacBook. Apparently, when I clicked OK in Disk Management I converted the drive from the HFS+ file system to something else. Is there any way to undo the disk conversion? I immediately removed the drive, so there was no writing to it. Windows did not format the drive, it just converted it. Is the data still there? All the data recovery programs I have are for Windows; can they read the Mac file system? I need to get the data back - what can I do?
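
    On the last question: most Windows recovery suites indeed don't speak HFS+, but a Linux live CD does. A cautious path is to image the drive first and do all experiments on the copy; if clicking OK only wrote a new (empty) partition table, a tool like TestDisk can usually find the intact HFS+ partition on the image and restore the table. A hedged sketch, with device names as placeholders:

        # Image the disk first - work on the copy, never the original
        dd if=/dev/sdb of=/storage/friend-disk.img bs=1M conv=noerror,sync

        # TestDisk can scan the image for the lost HFS+ partition and rewrite the table
        testdisk /storage/friend-disk.img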


  • Create DFS replica from a NAS drive

    - by Mark
    We have two offices at two different locations. In one we have a NAS with some shares. We also have a Domain Controller running Windows 2003 R2, and we have set up a second Domain Controller, also Windows 2003 R2, to put in the second office. What we would also like is to replicate the NAS drive onto the second Domain Controller, so the second office has a local copy and their changes are replicated back to the NAS. Is there a way to set up DFS replication to do this? Or will it only work with local folders on each server?

    Update 1 Sept: Based on the answer below, I think I need to add some clarification. The real issue is that the NAS which hosts the shared folder we want to replicate is external to both servers, and we have that particular share mapped to, say, S:. The replication setup doesn't seem to accept network shares external to the server as candidates for replication. I can understand why; I just need confirmation that DFSR will only work with block devices that are local on at least one server. Is this the case?


  • Windows - Decrypt encrypted file when user account is destroyed

    - by dc2
    I have a virtual machine running on my Windows Server 2008 computer that I originally received encrypted, as the builder of the VM did it on a Mac, which decrypts files by default. I never thought to decrypt these files, since they automatically 'decrypt' when you have permission over them, so the VM has been running for over a year despite the encryption. I just upgraded my computer to Domain Controller (dcpromo.exe). Now when I try to access/run the VM, I can't, because I don't have permission to decrypt the files: they belonged to another logon (the local administrator), and I am now the domain administrator. Apparently the local admin account is totally nuked when you upgrade to Domain Controller. I have tried EVERYTHING: taking ownership of the files, which works but doesn't get me anywhere, and adding Full Control for Everyone on the files. Under File > Properties > Advanced > Details (under encryption) > "Users who can access this file", the only user is administrator@localcomputername, along with a certificate thumbprint. I try adding a new cert - I don't have permission. I don't have permission to decrypt the file (access is denied) or to copy the file to another computer (access denied). I am totally stumped, and this VM is a production machine and needs to get up right now. Does anyone have any ideas?
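
    A hedged note on why the usual tricks fail: EFS decryption is tied to the encrypting account's certificate and private key, not to NTFS permissions, so ownership and ACL changes can't help. If a backup of the old local administrator's EFS certificate (.pfx) exists anywhere, or the old profile's key store survived dcpromo, importing that key into the current account is the recovery path (file names are placeholders):

        REM Import the old account's EFS key, then decrypt the VM files
        certutil -user -importPFX efs-backup.pfx
        cipher /d /a "C:\VMs\myvm.vhd"

    Without that private key there is, by design, no way to decrypt the files, so hunting down any old profile backup or system image is worth more time than fighting permissions.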


  • How can I create a macro that acts on a relative reference rather than an absolute reference to cell A1?

    - by Bruce
    I have a master rent statement in an Excel 2007 (macro-enabled) spreadsheet that shows all tenants in rows, with columns formed by the months. Each tenant then has a separate rent statement sheet like the one below that pulls its data through from the master rent statement. All I do then is copy the last 4 columns to the right and add them on the right, renaming the month labelled 'rent due' to the current month and then hiding the previous last 4 columns to the left, so that the statement always shows the previous month's activity and the amount due for the current month.

    I used a macro to speed up the creation of these statements, but then found that in some cases the result was wrong and needed major correction, because the macro used absolute references, i.e. its starting position was relative to cell A1, whereas some of my rent worksheets commence from a different column and in some cases from a different column and a different row. I have tried recording the macro with 'Use relative references', but when using the macro it only gets part way through its operation before it stops and the message appears:

    Run-time error '1004': Application-defined or object-defined error

    with the option to End or Debug or go to Help, and then I'm stuck, as I don't know how to debug or work in VBA or understand what has gone wrong. I want to record a single macro that always remains relative to the last 'Total Due' column heading (in the sample it's cell FF3, but on another worksheet it could be cell GA26) and thus enables me, regardless of where on the worksheet the rent statement is placed, to add through my recorded macro a further four columns with updated dates and a repositioned 'Total Due' summary (in the sample, in cells FE23 and FF23). The contents of cells FE23 and FE22 are always the same number of rows from the 'Sample Rent Statement, Service Charge and Sub Total' rows.

    I've searched the web and the Excel 2007 help files but have been totally stumped by this, so currently I have to re-record a number of macros each month to cover all of the permutations of the worksheets in my Excel rent workbook, which is starting to become pointless in terms of saving time. Does someone know a solution to this problem, please?


  • Games on Windows 8 in Boot Camp lag even on lowest graphics

    - by Jackson Gariety
    I've been playing Crysis 2 and Skyrim on my Retina MacBook Pro (10,1) for months now. The two games used to run super smoothly, even on nearly maxed-out settings. This laptop has an Nvidia GeForce GT 650M graphics card inside; it runs great. But I recently replaced my Windows 8 Consumer Preview with the retail copy, and since then 3D games lag in an odd way, no matter what the graphics settings. Every second, Skyrim and Crysis alternate between running smoothly and lagging - a cyclical lag that comes and goes like clockwork. I can turn the graphics down to 800x600 with no antialiasing and low texture quality, and it runs much smoother on the "up" part of the cycle, but every second it drops back into the lag spike. I've tried installing beta graphics drivers, reinstalling the operating system, reinstalling the Boot Camp support software, and freeing up space (I have about 20 GB free). I can't figure out what suddenly caused this, other than some obscure difference between the Consumer Preview and the retail version. What can I try? Is my video card failing? Are there some other drivers I can install? This isn't normal lag from maxing out the card - it happens even on the lowest settings.


  • Solution for an offline server

    - by dashmug
    I'm trying to set up a development server at work that will ideally be able to test-drive a couple of projects in PHP, Rails, or Django (not always running at the same time). I develop the apps locally on a Mac and then put the projects up on this server for testing with my actual users (non-techies) before deploying to a production server. My problem is that we have a very poor internet connection (almost negligible) at work, and the usual apt-get/yum/ports (make, clean, install) processes for setting up servers always get their packages from online repositories somewhere. I know I could probably download the source and compile it myself, but that's going to be too much of a hassle. I'm thinking about two solutions:

    Plan A: Run a server VM on my Mac and then use this VM as the source repository for the offline server. I've read about Ubuntu's apt-proxy and it seems good enough, though I haven't tried it yet. I'm not sure if this is possible, but can I simply do apt-get install nginx --download-only so that the package and its dependencies are downloaded into my VM, and my server can use the VM as the source repo for apt-get?

    Plan B: Run a server VM on my Mac (which I can set up/update easily when I'm home) and then clone the VM to the offline development server. Maybe I should simply make the server a VM host so I can copy the VM over. I think this is okay for the first-time setup, but subsequent updates will take too long (cloning the VM image).

    If I were working on Windows, I imagine it'd be easier, because most services have an installer file that I can download and then run at the server. If you could suggest another way, it would be much appreciated.

    Update: From Michael Hampton's answer, I found a possible solution: apt-cacher. I also found this page on Ubuntu's website. I wonder if there is a better tool than this one.
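
    On the Plan A detail: the real flag is --download-only (or -d), and it leaves the .debs for the package and any not-yet-installed dependencies in apt's cache, which can then be carried to the offline box. A minimal sketch, assuming the VM and the server run the same Ubuntu release:

        # On the online VM: fetch without installing
        apt-get install --download-only nginx

        # Carry the cached .debs over and install them on the offline server
        scp /var/cache/apt/archives/*.deb user@devserver:/tmp/debs/
        ssh user@devserver 'dpkg -i /tmp/debs/*.deb'

    The caveat is that apt only downloads dependencies the VM itself is missing, so the VM should stay as bare as the target; a shared apt-cacher (as in the update above) avoids that wrinkle.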


  • Can installing Linux or Grub make all of my operating systems slow?

    - by Geore Shg
    I installed Linux Mint 64-bit recently, but it freezes up all the time. Not only that, but GRUB wouldn't detect Windows. I thought it was just an issue with Linux Mint, so I booted into Ubuntu - however, that froze up as well. I downloaded Fedora to replace Linux Mint with, but that also freezes up. I finally gave up trying to repair GRUB, burning a Super GRUB2 Disk and using that to boot into Windows. Windows now freezes up as well. I installed a new copy of Windows on a different drive, but to no avail. Right before all this started, my computer was running very smoothly. I am wondering if the installation of Linux Mint, the reinstallation of GRUB, or my messing with the BIOS (when I was attempting to repair GRUB) could have done something drastic enough that everything is slow now? I realize that computers get gradually slower over time, but this was in no way gradual, and it happened directly after the installation of Linux Mint. If so, what should I do?
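
    Given that even a fresh Windows install on a different drive is slow, failing hardware is worth ruling out before blaming GRUB - the timing may be coincidence. A hedged first check from any live CD, repeated for each drive:

        # Overall SMART verdict, then the attributes that flag a dying disk
        smartctl -H /dev/sda
        smartctl -A /dev/sda | egrep -i 'reallocated|pending|uncorrect'

    Non-zero reallocated or pending sector counts on the system drive would explain system-wide freezes far better than anything a bootloader could do.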


  • Shortcuts that make using Windows easier? [closed]

    - by ekaj
    Over the years I have found a good number of shortcuts, all of which make using Windows that much faster and easier. I was wondering if I had missed any important ones, or any that just save time. I tried to keep these relative to Windows only, as programs such as Mozilla, Photoshop, etc. bring in hundreds of other shortcuts which do not apply to all users. These are the ones I currently know:

    For the mouse:

    Scroll wheel - opens a link in IE in a new tab; closes a specific tab in IE when clicked on it
    Right click-and-drag - nifty menu to make a copy of a file or create a shortcut

    For the keyboard:

    Ctrl + Alt + Del - need this be explained?
    WinKey + R - opens 'Run...'
    WinKey + D - shows the desktop
    WinKey + E - opens 'My Computer'
    WinKey + F - opens 'Find...'
    WinKey + Tab - cool way to switch between windows
    WinKey + L - locks the computer
    Alt + Tab - switches windows with a simple interface
    Alt + F4 - closes windows
    Alt + F - opens 'File' (handy for things like IE if you have the menu bar disabled)
    Ctrl + F - find text in the current document
    Ctrl + W - closes a window, or the current active tab in IE (or IE itself if only one tab)
    Ctrl + C - copies selected text / image
    Ctrl + V - pastes selected text / image

    So does anyone know of any more shortcuts that just make life easier? I would be much appreciative of any, and I am sure that some other users would like to know some too =]


  • Should the MAC Tables on a switch Stack be the same between sessions?

    - by Kyle Brandt
    According to Cisco's documentation: "The MAC address tables on all stack members are synchronized. At any given time, each stack member has the same copy of the address tables for each VLAN." However, when logged into the switch I see the following:

        ny-swstack01#show mac ad | inc Total
        Total Mac Addresses for this criterion: 222
        ny-swstack01#ses 2
        ny-swstack01-2#show mac ad | inc Total
        Total Mac Addresses for this criterion: 229
        ny-swstack01-2#exit
        ny-swstack01#ses 3
        ny-swstack01-3#show mac ad | inc Total
        Total Mac Addresses for this criterion: 229
        ny-swstack01-3#exit
        ny-swstack01#ses 4
        ny-swstack01-4#show mac ad | inc Total
        Total Mac Addresses for this criterion: 235
        ny-swstack01-4#exit
        ny-swstack01#show mac ad | inc Total
        Total Mac Addresses for this criterion: 222

    Going back and forth, this isn't just a matter of the tables changing over time, either; within certain sessions there are entries that I don't see from the master session. We are currently waiting to hear back from Cisco on this, but has anyone run into it before? I stumbled upon this while looking into unicast flooding: one of the hosts that is a destination MAC of the flooding has a MAC entry that appears in session 3, but nowhere else. Also, I checked, and all sessions show the same aging time.


  • I can't change the MySQL port (5.6.12) by changing the lines of my.ini (Windows 8)

    - by videador
    I was trying to change the port of my MySQL server on my local machine, but I can't. The MySQL version is 5.6.12, from a WAMP installation, and I am on Windows 8. I changed these lines in my my.ini file, located in C:\wamp\bin\mysql\mysql5.6.12:

        [client]
        #password = your_password
        port = 3307
        socket = /tmp/mysql.sock

        [wampmysqld]
        port = 3307
        socket = /tmp/mysql.sock
        key_buffer = 16M
        max_allowed_packet = 1M

    The previous values were 3306. I then restarted the installed service, but it doesn't work; the MySQL server is still running on 3306. Then I changed the service's command path to the following, to make sure that my.ini is read by the mysqld instance:

        c:\wamp\bin\mysql\mysql5.6.12\bin\mysqld.exe --defaults-file="C:\wamp\bin\mysql\mysql5.6.12\my.ini" wampmysqld

    But it still doesn't work. My last resort was to copy the contents of my.ini into my-default.ini (a file placed in C:\wamp\bin\mysql\mysql5.6.12\ whose purpose I don't know). It still doesn't work, and the port is still 3306.
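
    Two checks narrow this down: ask mysqld which option files it actually reads (WAMP builds sometimes ship more than one candidate path), and ask the running server which port it bound. A hedged sketch using the paths from the question:

        rem In which order does this mysqld read option files?
        c:\wamp\bin\mysql\mysql5.6.12\bin\mysqld.exe --verbose --help | findstr /C:"Default options" /C:"my.ini" /C:"my.cnf"

        rem What does the running server think its port is?
        c:\wamp\bin\mysql\mysql5.6.12\bin\mysql.exe -u root -p -e "SHOW VARIABLES LIKE 'port';"

    If SHOW VARIABLES says 3306, the edited my.ini simply isn't the file (or isn't the option group) the service reads; a mysqld installed as a Windows service also reads the option group named after the service, which is why the [wampmysqld] section only matters when the service name matches.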


  • Sharepoint web part fails intermittently

    - by pringly
    I have a MOSS 2007 environment: two web servers and a DB server, load-balanced between the two web servers. I deployed a web part recently, which worked fine for a while but failed on web server 2 after a day. When it fails, it gets the error message: 'A Web Part or Web Form Control on this Page cannot be displayed or imported. The type could not be found or it is not registered as safe.' Once it has failed, it stays that way until an IIS reset is done. The other web server never fails. I tried to force the second web server to fail to recreate the issue and have been unable to: I placed it under heavy HTTP traffic and it handled it fine. Put back in the pool, it failed again after about 7 hours. Oddly, if I remove the .dll for the web part from the affected web server, the web part doesn't stop working. Is this normal behavior? I checked the bin directory for the site and the global assembly cache, and there is no other copy of the .dll anywhere else on the server. Also, when checking the web part gallery: if the web part has failed, it will still appear in the gallery, but when trying to add a new web part, the .dll won't be listed. I have no idea how to continue troubleshooting from here, or even how to fix it - any ideas?


  • AWS instance went down for restart, came back up and has been "Waiting for network configuration... " for 12 hours

    - by kavisiegel
    What happened: about a month ago I created a new instance, re-configured everything, and mounted the old instance's drive on the new one. I then detached the old drive from the instance, but it was still mounted, so it errored out. Then came the outage last month, and when the instance booted back up, the drive wasn't there, because during the downtime it had been dismounted. Boot hangs because the fstab is still looking for it. I got word today that the server wasn't up, checked the logs, re-attached the drive and restarted. Now I'm getting "Waiting for network configuration..." and it doesn't seem to be doing anything. Some googling told me that I'm going to need to start from zero again, which I don't think is right. I created a new instance, which booted with no problem; then I stopped it and swapped the two old drives into the new instance. Still the same error. Using the fresh AMI, it worked. I figure it's just a misconfigured file somewhere. I could attach the drive to a functioning instance on the same AMI, copy some files over, and try again - but I don't know where to even start or which files to check.
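
    The standard way out of a "Waiting for network configuration..." hang when there's no console is exactly the plan described at the end: fix the fstab from a helper instance. A hedged sketch (device names and paths are placeholders):

        # Stop the broken instance, detach its root EBS volume, attach it to a
        # helper instance (shows up as e.g. /dev/xvdf), then on the helper:
        mount /dev/xvdf /mnt/broken

        # Comment out the fstab entry for the missing drive so boot stops waiting on it
        sed -i 's|^/dev/xvdg|#/dev/xvdg|' /mnt/broken/etc/fstab

        umount /mnt/broken
        # Detach, re-attach to the original instance as its root device, and start it

    On Ubuntu images, adding the nobootwait option to non-essential fstab entries prevents this class of hang in the future.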


  • How should an experienced Windows SysAdmin learn Linux? [closed]

    - by Systemspoet
    I have a new hire starting in a few weeks who is an experienced Windows sysadmin. I think he's fairly senior on the Windows side, with a pretty deep AD understanding and experience with Exchange 2007, 2010, and Exchange migrations. He's done a little PowerShell, but I suspect more of the "run this command to do this" variety than the "write a script to do this" sort. However, we are a mixed shop and (he knows this) I expect him to become a reasonably competent Linux sysadmin over time. I'm looking for good starting points to bring him along. I have over ten years of Linux/UNIX experience, so it all seems intuitive to me, but I've been thinking about the toolkit you actually need to be productive in the Linux CLI world. Just to be able to use the machines at all, off the top of my head:

    vi
    Basic CLI stuff - move around, rename files, copy files, tar, gzip, changing passwords, finding relevant manpages, keeping track of where you are, finding things in your history, etc., etc.
    More advanced things that I take for granted but are actually pretty hard - doing things with 'find', extracting relevant text via 'awk' and/or 'cut', knowing when to use 'grep' and when to use 'grep -e' or 'egrep'.
    Distribution-specific stuff: compiling software, rpm, yum, apt-get, you name it.

    This all seems pretty basic to me, but when I think back to 1995 when I was first learning my way, some of those things took me years to master. So my question is: where should I send him to pick up those skills? I'm not just thinking of classes, but also websites and books. Where would you all suggest as a starting point for picking up Linux skills?


  • Git and Django projects

    - by Garfonzo
    I have two servers, a Dev server and a Production server. The Production server runs the live Django site, while the Dev server has a copy of the Django project. I use the Dev server to work on the site, make improvements, fix bugs, etc. Once I am satisfied with how the Dev version is working, I move the whole Django directory from the Dev server and replace the same directory on the Production server. The two servers are not on the same LAN, so the process is not straightforward. There are a few issues with this so far:

    Moving the whole directory is laborious and time-consuming.
    If I only change a few files, it is even more tedious to replace a few files than the whole directory, since the project is getting fairly large and I worry that I'll miss something.
    I often run into permission issues after I've moved things.
    It's super inefficient, and, due to lack of time, I haven't bothered figuring out a new method.

    Now it's just getting out of hand and I need to address the situation. I am thinking I need to move to a Git repository for this process. But my question is: how would I set this all up? Do I host the repository on the Production server, pull from the Dev server, do work, then commit? Then I would pull from the Production server (the same server the repo is hosted on) to run the current working version. Or do I host the repo on the Dev server, pulling from the same server to do work on the repo, then pull a working version onto the Production server? Should I be hosting the repo on a third server, separate from both Production and Dev? Are there any special considerations with Django and repos that I need to worry about? Thanks for the help :)
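
    One common shape for this, sketched under the assumption of SSH access between the boxes (paths and names are placeholders, not the one right answer): a bare repository that Dev pushes to and Production pulls from, so no working directory is ever copied wholesale.

        # One-time: a bare repo, on the Production box or a third machine
        git init --bare /srv/git/mysite.git

        # On the Dev server: point the project at it and push
        git remote add origin ssh://user@production/srv/git/mysite.git
        git push origin master

        # On the Production server: deploy by pulling into the live directory
        git clone /srv/git/mysite.git /var/www/mysite    # first deploy only
        cd /var/www/mysite && git pull                   # each release

    Because each pull creates files as the deploying user, the permission problems largely disappear; the Django-specific care points are keeping local settings, the database, and user-uploaded media out of the repo so a pull never clobbers them.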


  • DVD/CD burning .zip: is it more reliable, faster, longer lasting to burn a zip of files rather than the files as a folder?

    - by Rob
    Is it more reliable, faster, and longer-lasting to burn to CD/DVD one zip (or a few large zips) of files, rather than the files as a folder? I'm just thinking that thousands of small files might not be recorded as efficiently as one or a few large zips. Also, even after the burning program verifies the disc, I use Beyond Compare to compare the files with those on the disc. It always binary-compares them as identical, but I hear the drive stuttering, presumably as the head is shifted only slightly each time to seek the next file - which leads me to think it's best to make one or more zips and copy those locally to compare. Or is it that burning individual files to the disc makes them less readable, which causes the head to stutter? There aren't any problems - my disc burns are reliable - I'm just thinking about efficiency and longevity. The discs burn and verify fast enough on my 18x DVD burner. I'm using ImgBurn mostly; I've also used Nero in the past. I burn whole discs, closed and finalised. I'm not sure which write mode, but I would think Disc At Once from a temporary cached image made by the burning program would be the most reliable.


  • Booting Windows from different partition than system

    - by szamil
    I have bought an SSD disk, but my laptop (Dell Precision M6300) refuses to use it as a target disk for Windows (AHCI on/off, BIOS up to date). Unfortunately I can't exchange the disk... But fortunately, I've managed to install Windows onto it using a USB disk case. The problem is that when I put that disk in as my internal drive, it can't boot ("Disk read error", three-finger salute...). So I tried Linux (openSUSE); I managed to install it as well, but when I tried to boot GRUB from the internal drive I got errors again. (Should I try GRUB2?) I figured out that I can boot into the internal hard drive's openSUSE system using a small USB drive with GRUB, a kernel, and an image on it: I run GRUB from the USB drive, it loads the necessary files from the USB drive, and then continues from the internal drive. I want to do the same with Windows, but GRUB (rootnoverify and chainloader +1) does not boot the Windows install on the internal drive. The question: is there any way to copy the critical Windows boot files onto the USB drive, so that booting starts from the USB drive but continues from the internal (in general, a different) drive? The USB drive would become a hardware system key! ;-) Disk: Plextor M5S 128GB SATA III; the laptop has SATA II, but that's backward compatible anyway, right?
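
    If the Windows install is Vista or later, its boot files (bootmgr plus the BCD store) can be planted on the USB stick with bcdboot, which is pretty much the "hardware key" idea. A hedged sketch from within the running Windows, where C: is the Windows volume and E: is the USB stick:

        REM Copy the boot environment onto the USB stick's volume
        bcdboot C:\Windows /s E:

        REM Then mark the stick's partition active in diskpart:
        REM   diskpart > list disk > select disk N > select partition 1 > active

    The BCD entry written this way identifies the Windows partition by disk signature rather than by BIOS drive order, so booting from the stick should hand off to the internal drive; whether the internal disk's own read error (likely the Dell BIOS and the SSD disagreeing) also bites at that point is the part to test.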


  • ZFS & Deduplicating FLAC Data

    - by jasongullickson
    I'm experimenting with using ZFS to deduplicate a large library of FLAC files. The purpose of this is twofold: Reduce storage utilization Reduce bandwidth needed to sync the library with cloud storage Many of these files are of the same music tracks but from different physical media. This means that for the most part they are the same and usually close to the same size, which makes me think that they should benefit from block-level deduplication. However in my testing I'm not seeing good results. When I create a pool and add three of these tracks (identical songs from different source media) zpool list reports 1.00 dedupe. If I copy all of the files (make exact duplicates of the three) dedupe climbs, so I know that it is enabled and functioning, but it's not finding any duplication in the original collection of files. My first thought was that perhaps some of the variable header data (metadata tags, etc.) might be mis-aligning the bulk of the data in these files (the audio frames) but even making the header data consistent across the three files doesn't seem to have any impact on deduplication. I'm considering taking alternate routes (testing other dedupe filesystems as well as some custom code) but since we're already using ZFS and I like the ZFS replication options, I'd prefer to use ZFS dedupe for this project; but perhaps it's simply not capable of working well with this sort of data. Any feedback regarding tuning that might improve dedupe performance for this sort of dataset, or confirmation that ZFS dedupe is not the right tool for this job are appreciated.
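
    Two checks are worth running before concluding anything: confirm the dedup property really is on, and let zdb simulate the dedup table over the existing data, which separates "dedup is misconfigured" from "this data simply has no identical blocks". A hedged sketch, with the pool name as a placeholder:

        # Confirm dedup is actually on, and the pool-level ratio
        zfs get dedup tank
        zpool list -o name,size,alloc,dedup tank

        # Simulate dedup over existing data (can take a long time and much RAM)
        zdb -S tank

    If zdb also reports ~1.00, the data genuinely has no block-aligned duplication - plausible for FLAC, since rips of the same track from different media rarely match at the byte level, and dedup only fires on whole records that hash identically.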


  • Install Windows 8 on SSD and Programs & Users on HDD

    - by Foe
    I have been dealing with a few problems while installing Windows 8 on my computer. In my old configuration, I had Windows 7 installed on my 60GB SSD, with my programs and user data on my 1TB HDD, thanks to relative links. Yet while installing Windows 8 on my SSD, it created a small "System Reserved" partition on my HDD. Plus, I'm afraid using only links is a bit cheap, and I saw lots of people messing with their registry when trying to put user data on another drive. I read a lot about optimizing Windows 8 for an SSD and putting Users on another drive, and found very similar situations that didn't quite correspond to what I was trying to achieve. Here's what I tried:

    http://www.eightforums.com/tutorials/4275-user-profiles-relocate-another-partition-disk.html

    Booting in Audit Mode and using an XML answer file to relocate Users didn't work, as the version specified in the file is a test one, and I don't know what to enter if I'm using the final release.

    Booting with the install DVD in repair mode to copy the Users folder and create a relative link resulted in an error on the logon screen when entering my password, saying "The profile can't be loaded" (a rough translation of the error from French to English).

    Does anyone know how to do a cleanly separated install of Windows 8, with the OS on one drive and the other data on a second one? Thanks.
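
    A less invasive middle ground than registry edits, sketched here as an assumption rather than the canonical method: leave the profiles on the SSD, move only the bulky per-user folders via each library folder's Location tab, and relocate big applications with a junction so their install paths keep working (paths are placeholders):

        REM Move an installed application to the HDD, leaving a junction behind
        robocopy "C:\Program Files\BigApp" "D:\Program Files\BigApp" /E /MOVE
        mklink /J "C:\Program Files\BigApp" "D:\Program Files\BigApp"

    Do this with the application closed; anything with a running service or locked files is better reinstalled to D: than moved.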


  • Failed Software RAID0 on Linux - Attempting to recover data

    - by Gizmo_the_Great
    I have a two-disk RAID0 software RAID (not hardware RAID) that is reported to have failed during boot, and my OS won't start. Using a Live CD, I get the following output:

        sudo mdadm -E /dev/sdc1 /dev/sdd1

        /dev/sdc1:
                  Magic : a92b4efc
                Version : 1.2
            Feature Map : 0x0
             Array UUID : 3710713d:fb301031:84b61247:d1d53e0f
                   Name : HP-xw9300:0
          Creation Time : Sun Sep  1 15:22:26 2013
             Raid Level : -unknown-
           Raid Devices : 0
         Avail Dev Size : 1465145328 (698.64 GiB 750.15 GB)
            Data Offset : 16 sectors
           Super Offset : 8 sectors
                  State : active
            Device UUID : ad427cd2:9f885f57:7f41015f:90f8f6af
            Update Time : Sun Jun  8 12:35:11 2014
               Checksum : a37407ff - correct
                 Events : 1
            Device Role : spare
            Array State :  ('A' == active, '.' == missing)

        /dev/sdd1:
                  Magic : a92b4efc
                Version : 1.2
            Feature Map : 0x0
             Array UUID : 3710713d:fb301031:84b61247:d1d53e0f
                   Name : HP-xw9300:0
          Creation Time : Sun Sep  1 15:22:26 2013
             Raid Level : -unknown-
           Raid Devices : 0
         Avail Dev Size : 976771056 (465.76 GiB 500.11 GB)
            Data Offset : 16 sectors
           Super Offset : 8 sectors
                  State : active
            Device UUID : 2ea0199d:cb08d9e7:0830448a:a1e1e348
            Update Time : Sun Jun  8 13:06:19 2014
               Checksum : 8883c492 - correct
                 Events : 1
            Device Role : spare
            Array State :  ('A' == active, '.' == missing)

    GParted lists both disks, detects the flags as 'Raid', and lists the data usage. Can anyone please help me re-assemble the array, just so that I can copy off some of the data that I have not backed up recently? Thanks
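
    Since both superblocks now read "spare" with an unknown RAID level, --assemble has nothing to go on; the usual last resort for a RAID0 whose members are physically healthy is to re-create the array over the same partitions with the same geometry, which on RAID0 rewrites only metadata (there is no resync). This is destructive if any parameter is wrong, so a heavily hedged sketch - image both disks first, and note that the chunk size below is mdadm's default, a guess rather than a value read from this array:

        # Image both members and experiment on the images if at all possible
        dd if=/dev/sdc1 of=/backup/sdc1.img bs=1M conv=noerror,sync
        dd if=/dev/sdd1 of=/backup/sdd1.img bs=1M conv=noerror,sync

        # Re-create with the original order and geometry; answer 'y' to the
        # "device contains an existing superblock" prompt
        mdadm --create /dev/md0 --metadata=1.2 --level=0 --raid-devices=2 \
            --chunk=512 /dev/sdc1 /dev/sdd1

        # Mount read-only and copy data off before anything else
        mount -o ro /dev/md0 /mnt/recovery

    The dumps show a 16-sector data offset; if a newer mdadm places data elsewhere, the filesystem won't appear, and --data-offset (mdadm 3.3+) would have to be used to match the original before this can work.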


  • Non-restored Files Corrupted on System Restore

    - by Yar
    I restored OS X 10.6.2 today (it was 10.6.3 and not booting) by copying the system over from a backup. The data directories were not touched. In the data directories, I'm seeing some files as 0 bytes and getting permission-denied errors when copying, even when using sudo cp or the Finder itself. Some programs, by contrast, take the files at face value and see no permission problems (such as zip), but they see the files as zero bytes, which would be game over for recovery:

        cp: .git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca: could not copy extended attributes to /eraseme/blah/.git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca: Operation not permitted

    I have tried sudo chown, sudo chmod -R 777 and sudo chflags -R nouchg, which do not change the end result. Strangely, this only affects my .git directories (perhaps because they start with a period - but renaming them, which works, does not change anything). What else can I do to take ownership of these files?

    Edit: This question comes from Stack Overflow because I originally thought it was a Git problem. It's definitely not (just) Git. Anyway, this is to help put some of the comments in context.
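
    Files that honestly contain data but read as 0 bytes to zip are a classic sign of HFS+ compression (new in 10.6), where the contents live in a com.apple.decmpfs extended attribute or the resource fork - which would also explain why cp dies precisely on the xattr step. A hedged way to check, and a copy tool that understands both:

        # The 'compressed' flag and the decmpfs attribute betray HFS+ compression
        ls -lO .git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca
        xattr -l .git/objects/fe/86b676974a44aa7f128a55bf27670f4a1073ca

        # ditto copies compressed files and extended attributes that trip up cp
        sudo ditto .git /eraseme/blah/.git

    If that is the situation, do not strip the xattrs (the data would go with them); copying with ditto, or onto a filesystem where the data is decompressed on the way out, is the safe route.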

