Search Results

Search found 35708 results on 1429 pages for 'default copy constructor'.

Page 684/1429

  • wget not converting links

    - by acrosman
    I am trying to mirror a fairly large site (20,000+ pages) prior to a major overhaul. Basically, I need a backup before cutting over to the new one in case we forgot something we need (we'll have about 1,000 pages at launch). The site is run on a CMS that I cannot easily extract usable data from, so I'm trying to make the copy with wget. My problem is that wget does not appear to be actually converting links, despite the presence of --convert-links or -k in the command. I've tried a couple of different combinations of flags, but I haven't been able to get the output I need. The most recent failed attempt was: nohup wget --mirror -k -l10 -PafscSnapshot --html-extension -R *calendar* -o wget.log http://www.example.org & I've also tried including --backup-converted, and --convert-links instead of -k (not that it should have mattered). I've done it with and without -P and -l; again, not that they should matter. The result is files that still have links like: http://www.example.org/ht/d/sp/i/17770
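
    For comparison, a hedged sketch of an equivalent invocation, assuming GNU wget 1.12 or thereabouts. Two details worth noting: wget only rewrites links after the entire retrieval finishes, so a run that is interrupted or killed leaves every link unconverted, and an unquoted -R *calendar* can be expanded by the shell before wget ever sees it. The flag choices here are illustrative, not a confirmed fix:

        nohup wget --mirror --convert-links --backup-converted \
             --html-extension --page-requisites -l 10 \
             -P afscSnapshot --reject '*calendar*' \
             -o wget.log http://www.example.org &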

    Read the article

  • Specific font works in Windows 7, but not in Windows Server 2003. Why?

    - by Vinicius Ottoni
    I have a .TTF font, and when I open it in Windows 7 everything is fine: the characters appear at various sizes, etc. But when I open it in Windows Server 2003, nothing appears inside it; it shows up as a "blank font", without the characters. I need this font for my app, which has to work on both systems... Note: all other fonts are fine in Windows Server 2003; when I open any of them the characters appear. -- EDIT: I copied the font to another Windows Server 2003 machine... and it works fine there. Does anyone have any idea?

    Read the article

  • Are periodic full backups really necessary on an incremental backup setup?

    - by user2229980
    I intend to use an old computer I have as a remote backup server for myself and a few other people. We are all geographically separated, and the plan is to do incremental daily backups using rsync and ssh. My original idea was to make one initial full backup, never again have to deal with the overhead of doing that, and from then on only copy the files changed since the last backup. I've been told that this could be bad, but I fail to understand why. Since each snapshot consists of hard links to the unchanged files plus full copies of the changed ones, isn't it going to be identical to a new full backup? Why would I want to make another full backup?
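
    A minimal sketch of the hard-link snapshot scheme being described, assuming a reasonably recent rsync on both ends; the host name and paths are hypothetical:

        # Each day's snapshot hard-links unchanged files against yesterday's copy,
        # so it looks like a full backup but only changed files consume new space.
        rsync -a --delete \
              --link-dest=/backups/alice/2012-09-12 \
              alice@client.example:/home/alice/ \
              /backups/alice/2012-09-13/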

    Read the article

  • CentOS Existing host to new host with all data/files

    - by ganesh
    Good afternoon. Management at our small startup has decided to move our production server from our existing provider to Azure. We have CentOS on both. The server runs a classifieds site with a considerable amount of data and thousands of users, each with a disk space quota. This is our first time moving our servers, so I need your guidance and suggestions on the following:
    1) How to migrate the MySQL DB (dump, slave, or copy the filesystem)?
    2) How to manage email during the downtime?
    3) How to manage the files?
    4) A security/firewall checklist for the new system
    5) An IP/DNS-related checklist
    6) Anything that I missed!
    Since this is our first time, we are planning to be more cautious. Any reference documents would be highly appreciated. Thank you all!
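
    On item 1, a hedged sketch of the plain dump-and-restore option (host name and file paths are hypothetical; a replication slave or a filesystem copy may suit a larger or busier dataset better):

        # On the old host: take a consistent dump of all databases (InnoDB-friendly)
        mysqldump --all-databases --single-transaction --routines --triggers \
                  | gzip > /root/all-dbs.sql.gz
        # Copy the dump and the site files across (run from the old host)
        scp /root/all-dbs.sql.gz root@new-host.example:/root/
        rsync -aH --numeric-ids /var/www/ root@new-host.example:/var/www/
        # On the new host: load the dump
        gunzip < /root/all-dbs.sql.gz | mysql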

    Read the article

  • Postfix sends to original recipient name instead of alias name

    - by user141742
    I have set up Postfix as part of my ISPConfig implementation. It should just forward all mail. Clients should be able to define a new alias in ISPConfig and receive mail on a different e-mail address. Example: a mail sent to [email protected] on my ISPConfig server should be forwarded to the external mailbox [email protected]. This works fine except for one important thing. When opening the mailbox for [email protected], I see the mail with the original sender name and the original recipient name, i.e. [email protected]. I have tried both the forward function and the send-copy function on a mailbox in ISPConfig. Both cases show the original recipient [email protected] instead of the final recipient [email protected], as would be the case if I manually forwarded the e-mail. Can this be achieved without having to maintain a list for each entry? Thanks for looking into this. M.

    Read the article

  • Transfer nearly an entire file system to a fresh install as smoothly as possible

    - by Xander
    I've got a friend who needs his computer working in just a few hours. His files are safe; however, he managed to corrupt his main install of Windows 7. My plan is to go in with a Linux disk, copy his C:\ to a backup drive I've got, and then reinstall. Restoring many of his files will be pretty simple (documents and such); however, things such as applications won't transfer as easily. Is there any easy way to transfer applications such as MS Office (which he needs in the morning) or other commercial software packages without having to go through the hassle of locating the keys and reinstalling them completely? I don't think just moving them over will work, because I'm sure much of that is stored in the registry (validation stuff and such). Anyway, quick responses would be super nice! Also, additional help would be great!

    Read the article

  • I can't work locally unless connected to the internet - how to fix?

    - by Rodney
    Hi. In Firefox, when I am disconnected from the net, I want to work locally against my local IIS server (Win XP, Firefox 3.5.10). I do NOT have Work Offline checked, but FF says that it cannot find my site (i.e. the message FF shows if you try to access an online site while offline). This applies to any localhost URL. I tried 127.0.0.1 and checked my hosts file; that does not work either. If I check Work Offline, then it shows the Firefox message that the site cannot be reached because I have Work Offline checked. Unchecking it does not help. Then I load up Safari, copy and paste the URL into that browser, and it connects to my development localhost site. It is not just browser caching, as I can log in etc. So Firefox will not let me develop locally unless I am connected to the internet, which is a problem. Suggestions please?

    Read the article

  • Problem installing Windows Server 2008 R2 on Xen 3.0

    - by GodEater
    Hi there folks, I've been googling this for a few hours now and not really getting anywhere. We have a Xen 3.0 host onto which I'm trying to install a copy of Windows Server 2008 R2 Standard Edition as a guest OS, but the install hangs at the "Starting Windows" screen when it starts running the installer. Is this a known issue with the version of Xen we're running (I know it's positively ancient)? Is there a workaround for it at all? We've successfully got a great number of vanilla 2008 servers running on it; it appears to be an issue specific to R2. Bryan

    Read the article

  • Copying dovecot maildir to another server with courier maildir

    - by NovumCoder
    Hi all, I just moved all my mailboxes from one server to the new one using rsync. After that I created the folders using Thunderbird, to get the same folder structure as on the old server. Then I copied all the mail files into the folders. Now, when I subscribe and click on a folder in Thunderbird, it starts downloading the headers of all the mails, but after the download finishes nothing appears in the mail list. It's as if the folder is empty, and every time I click on the folder again, Thunderbird starts downloading headers again. What is wrong here? I found a solution using a tool called imapsync, but it's not free, so I started doing it by copy & paste. I thought Thunderbird would be able to fix the indexes. :-( Or is there a better way to migrate from dovecot maildir to courier maildir?

    Read the article

  • Office 2007 network share access denied

    - by Rodent43
    I hope I have not duplicated an issue already posted, but I could not find anything in the search... Right, here is the problem: we have recently updated all our desktops to the MS Office 2007 suite, and people have issues trying to open simple files like Word documents. The systems are Windows XP (SP3), a Novell network with the Novell client, and Office 2007. When they try to open a Word document from a usual network share, Word presents a message reporting "Access Denied. Contact Administrator". So we assumed network permissions, none of which have changed... but try the same file with WordPad and it opens fine, albeit with formatting issues of course... Now copy the file to your desktop, which is not redirected, and you can open the file in Word as normal... So does anyone know if Office 2007 uses some new permission when opening files? Does it create temp files or something? Any pointers would be appreciated.

    Read the article

  • Replacing files in a folder structure with files from an unsorted folder

    - by Andrew
    I have over 50,000 PDFs organized into subfolders of a folder called PDFACT. I needed to compress these files, so I ran them through Adobe to batch-compress them, and this worked, except that Adobe could only output the files without their folder structure. So basically I started with 50,000 PDFs set up in a folder with hundreds of subfolders, all organized, and I ended up with one folder with 50,000 compressed PDFs in it, just in alphabetical order. Somehow I need to replace all the original PDFs with their compressed copies. Let me give an example: in the folder PDFACT we have the following file: C:\PDFACT\BIG DINNER\BILL\NEWESTBILL.PDF … and in the output folder that Adobe created we have just: C:\COMPRESSED_PDF_FOLDER\NEWESTBILL.PDF This copy is smaller than the one in PDFACT and has the same name, but it is just lumped in with every other PDF; the folder structure and subfolders are gone. Is there any way to replace all the larger, uncompressed PDFs inside the original folder structure with their now-compressed counterparts?
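
    A rough sketch of one way to do the overwrite from a Windows command prompt, assuming every file name is unique across the tree (files that share a name in different subfolders would all receive the same compressed copy, so check for duplicates first):

        rem Walk PDFACT recursively; wherever a compressed file with the same name exists, overwrite the original in place
        for /r "C:\PDFACT" %f in (*.pdf) do @if exist "C:\COMPRESSED_PDF_FOLDER\%~nxf" copy /y "C:\COMPRESSED_PDF_FOLDER\%~nxf" "%f"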

    Read the article

  • Harddrive in the freezer ever work for you?

    - by Stefan Thyberg
    Once upon a time, my little 10 GB drive in my webserver failed and of course I had no backup, teaching me to immediately set up an automatic backup job afterwards. Anyhow, this drive refused to start and as a last-ditch effort I put it in a plastic bag and put it in the freezer overnight, since I had heard somewhere that it might work and I really didn't have any other options. The next day I take it out, immediately plug it in outside the case and lo and behold, the drive works long enough for me to copy my data off it. Have you ever had a similar experience with this method?

    Read the article

  • Automatic LaTex document generation from Excel spreadsheet

    - by Bowler
    I have some data in an Excel file from which I have to generate a report. I repeat this task fairly regularly and am looking to automate it. I have a LaTeX project into which I usually just copy data by hand, export the necessary worksheets as PDFs, add them to my LaTeX project, and compile with pdflatex. It has occurred to me that there must be a way to automate this process. Is there an efficient way to export the data from Excel into a LaTeX project? Could a VBA script in Excel drive the process? Also, it doesn't have to be LaTeX; I'm not all that experienced with MS Office's more advanced features. Is there some way, akin to a mail merge, that I could achieve this with? In some ways that might be better, in case I have to pass the work on to someone who doesn't know LaTeX. Thanks.
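
    A minimal sketch of one low-tech route, assuming the worksheet is first saved as CSV (data.csv and table.tex are hypothetical names) and that no cell contains embedded commas:

        # Turn each CSV row into a LaTeX table row: cells joined by " & ", line ended with "\\"
        awk -F, '{ for (i = 1; i <= NF; i++) printf "%s%s", $i, (i < NF ? " & " : " \\\\\n") }' data.csv > table.tex

    The generated rows can then be pulled into the document with something like \begin{tabular}{...} \input{table.tex} \end{tabular}, with the column spec matched to the data.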

    Read the article

  • How do I mklink junction + move content from C:\Program Files to D:\Program Files?

    - by Matt
    I have a few applications that absolutely refuse to install into anything but C:\Program Files or C:\Program Files (x86). Changing the registry keys for the default install folders doesn't seem to provide any satisfaction, so now I'm wondering about throwing an NTFS junction in there to force these pesky applications to cooperate. There are files currently in use within Windows, so it's quite likely I am not going to be able to do this from within the active OS. Are there some bootable Windows 7 system tools that would allow me to make this happen? It seems I will need the ability to copy files (with permissions!) from one drive to another, as well as make the junction for Windows.
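
    A hedged sketch of the sequence, run from an elevated prompt in a Windows PE / repair environment; the drive letters (C: system volume, D: target drive) are assumptions, and some applications may still misbehave because they store absolute paths in the registry:

        rem Copy everything including NTFS permissions, ownership and directory timestamps
        robocopy "C:\Program Files" "D:\Program Files" /E /COPYALL /DCOPY:T /R:1 /W:1
        rem Remove the original and replace it with a junction pointing at the new location
        rmdir /s /q "C:\Program Files"
        mklink /J "C:\Program Files" "D:\Program Files"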

    Read the article

  • Does RAID 1 protect against corruption?

    - by Shaun
    Does RAID 1 protect against data corruption? For example, let's say that I am keeping all of my important files on a NAS that uses 2 disks in a RAID 1. If one hard drive has some kind of internal problem and the data becomes corrupted, does the RAID recognize this automatically and correct it using data from the other, good disk? Could it even know which copy is the good one? Does RAID 5 protect against corruption? I know that RAID is not a backup solution. I am trying to figure out how to make sure that I am not backing up corrupt data!

    Read the article

  • Strange FTP issues - some files are not downloaded

    - by FractalizeR
    I have a machine that cannot fetch some files from remote servers by FTP. The machine runs CentOS. I tested FTP on three files:
    12.09.2012 21:21   166 007   ll091212.002
    13.09.2012 11:32    23 040   ll091212.003
    13.09.2012 11:50    61 313   ll091212.004
    Of these, I can only ever download one successfully: ll091212.004. The other two download to about 90% (I can see them on disk) and then the FTP transfer hangs without any error messages. I have moved the files around and copied them elsewhere on the remote server; no luck. Another machine from the same subnet can download all three of them easily. I just don't know what the reason for this is.

    Read the article

  • VMware ESXi 4 On-Disk Data Deduplication - possible and supported?

    - by hurikhan77
    Environment: We are running multiple web, database, and application servers which usually share a pretty common installation (Gentoo Linux) and similar configuration in VMware ESXi 4. The differences are usually only some installed features or differing component versions. To create a new server, I usually choose the most similar (by features) running server, rsync a copy of it into freshly mounted filesystems, run grub, reconfigure, and reboot. Problem: Over time this duplicates many on-disk data blocks, which probably adds up to several tens of gigabytes. I suppose that if I could use a base system as a template, with the actual machines layered on top of it and only changed blocks written to some sort of "diff image", performance should improve (increased cache hit rate) and storage efficiency should increase (deduplicated storage space). This would be similar to what ESXi already supports for RAM deduplication (page sharing). Question: Is there any way to easily do this on ESXi 4? I already share the portage tree via NFS, but this would not work for the rootfs.

    Read the article

  • How to detect when a user copies files from a server over the network?

    - by Mr. Graves
    I have a few virtual servers and desktops that are used for shared development with remote users, including some consultants. Each user has an account with access to most aspects of the server. I don't want to prevent people from being productive, or track passwords, or read emails, but I do want to know when and what files they copy from the virtual server, what they upload from the server to a remote site, and what applications, if any, they install. This will help make sure my IP is protected, that no one is installing tools they shouldn't, and that things are licensed appropriately. What is the simplest way to do this? In order of importance, I would say detecting file transfers off the machine is the most critical. Thanks

    Read the article

  • Installing W2K8 R2 on a Dell Poweredge 2850

    - by DerekT
    I'm a server novice and have been given a Dell 2850 (PERC 4e/Di). It has 3 blank HDs that I think are configured as RAID 5. I am trying to install W2K8 R2 SP1 on it. It doesn't have a DVD reader, so I created an install USB. This works fine until it's time to copy files to the HD; the installer can't see a hard drive. There is an option to browse for drivers at this stage. Any idea what drivers I need and where to download them? I downloaded the driver RAID_DRVR_WIN_R227150.EXE, but this failed with an invalid signature. Thanks to dyasny for the link to LSI-LOGIC_MULTI-DEVICE_A00_R227150.exe.

    Read the article

  • Fake domain doesn't resolve when offline

    - by Fletcher Moore
    I have a flimsy grasp of DNS. Nonetheless, in order to install a local development copy of WordPress MU, I needed to create a fake domain, which I called local.dev. It and all its subdomains simply resolve to 127.0.0.1, and Apache then directs requests to the correct folder. I installed PowerDNS and got it working properly with a MySQL backend. I didn't feel entirely comfortable with it, but since it worked, I didn't ask any more questions. The bizarre thing is that it requires an internet connection to resolve correctly, and now I need to use it offline. If I am offline, Chrome gives the error: Error 105 (net::ERR_NAME_NOT_RESOLVED): The server could not be found. If you need more information, I am happy to provide it.
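
    For reference, a minimal sketch of one lighter-weight way to get wildcard resolution that keeps working offline, using dnsmasq rather than PowerDNS, and assuming the machine's resolver is then pointed at 127.0.0.1:

        # /etc/dnsmasq.conf -- answer local.dev and every subdomain of it with 127.0.0.1;
        # everything else is forwarded to the normal upstream nameservers when online
        address=/local.dev/127.0.0.1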

    Read the article

  • Downloading multiple files with wget and handling parameters

    - by coure2011
    How can I download multiple files using wget? I also want to rename the files. Here are the commands I'm running one by one (copy/paste into the terminal):
    wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812720774/PS11.rar -O part11.rar
    wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812721094/PS12.rar -O part12.rar
    wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812720804/PS13.rar -O part13.rar
    wget -c --load-cookies cookies.txt http://www.filesonic.com/file/812720854/PS14.rar -O part14.rar
    ........ and so on..
    What can I do to download all these files one after another without typing each command by hand?
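
    A minimal sketch of one way to script it, assuming a hypothetical urls.txt that lists each URL and the desired output name on a single line:

        # urls.txt (one pair per line):
        #   http://www.filesonic.com/file/812720774/PS11.rar part11.rar
        #   http://www.filesonic.com/file/812721094/PS12.rar part12.rar
        while read -r url name; do
            wget -c --load-cookies cookies.txt "$url" -O "$name"
        done < urls.txt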

    Read the article

  • Problem with script that excludes large files using Duplicity and Amazon S3

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. If I run the script, duplicity gives an error. However, if I copy and paste the same command generated by the script, everything works... Here is the script:
        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"
        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="gpgkey"
        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g' #Replace whitespace with "\ "
            done
        }
        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"
        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST
        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=
    When the script is run I get the error: Command line error: Expected 2 args, got 6 Where am I going wrong??
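
    A hedged sketch of one common rework, assuming the "Expected 2 args, got 6" comes from word splitting of the command substitution (the quotes that exclude() prints are literal characters rather than shell quoting, so filenames containing spaces break into extra arguments). Building the options in a bash array sidesteps the quoting entirely:

        EXCLUDES=()
        while IFS= read -r FILE; do
            EXCLUDES+=( --exclude "**${FILE##/*/}" )   # each option and value stays a single argument, spaces and all
        done < <(find /home/jason -size +100M)
        duplicity --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "${EXCLUDES[@]}" "$SOURCE" "$DEST"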

    Read the article

  • File permissions issue with an NFSv4 share, uploaded from a Mac Lion

    - by POP.sicle
    I have an NFSv4 share that was working fine with Macs running Snow Leopard to share files across the network. The NFS share has one cloned user/group that all clients auto-connect as. However, when I use a Lion Mac to copy a file from its user directory to the NFS share, no other computer (Mac SL / Mac Lion / Win7) can edit, delete, or write to the file that was uploaded, despite the correct read/write/execute permissions being visible on the NFS share and through the terminal. Attempting to edit the file permissions through Finder completely locks the file. I suspect this has something to do with Lion's ACLs (or maybe its version control) conflicting with NFSv4. Is there a way to disable or ignore extended ACLs or extended file permissions on the NFSv4 side so that users don't run into this conflict? The current workaround is to use NFS Manager and set automounts to ignore ownership, but installing NFS Manager and configuring automounts on all of the computers seems like more trouble than attempting to reconfigure the NFS settings. Advice?
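
    If the extra ACL entries added by Lion are indeed what locks the files, a hedged sketch of checking for and stripping them on the Mac side (the mount path and file names are hypothetical):

        # -e lists any ACL entries attached to the file (a trailing "+" in the mode column means one is present)
        ls -le /Volumes/nfs-share/uploaded-file
        # Remove all ACL entries recursively from the uploaded directory
        chmod -RN /Volumes/nfs-share/uploaded-dir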

    Read the article

  • How to choose my own filename format for subscribed podcast files?

    - by meomaxy
    I subscribe to several podcasts where the filenames of the downloaded mp3 files have no particular pattern to them. When I copy the directory of accumulated mp3 files into my mp3 player, the files play in alphabetical order. What I really want is to play the files chronologically by release date. I currently use iTunes on Windows XP to download the files. What I do now is manually rename the files, adding the date in YYYYMMDD format to the start of each filename so that an alphabetical listing of files will correspond to their chronological order when I listen to them later in PocketTunes on my Palm Centro. Is there some way to get the release date into the filename automatically? If so, I could automate or possibly skip the file renaming step. I would switch from iTunes to something else if that would solve my problem. The file creation time on my local disk isn't a reliable indicator because sometimes I download a few days worth of content at one time, and the files don't necessarily get downloaded in chronological order.

    Read the article

  • System Monitoring Redundancy

    - by Josh Brower
    I consult in a small business environment where I have two Hyper-V hosts (with <10 VMs) plus a couple of other servers. I recently had an issue where one of the Hyper-V hosts had a CPU problem and came down, bringing most of my non-critical VMs with it, plus a free piece of software that I use for network and system monitoring and availability. Because of this, and the fact that iDRAC locked up too, I did not get any alerts about the crash. So I am wondering how I can (cheaply) get a redundant availability-monitoring system in place. Is it as simple as running Nagios or Zenoss (or whatever) on two different Hyper-V hosts? It just seems like running more than one copy of Nagios/Zenoss/etc. could be expensive and have high overhead. Thoughts? Thanks! -Josh

    Read the article
