Search Results

Search found 13068 results on 523 pages for 'copy and paste'.


  • Transparently cache files from a network drive in Linux

    - by Vadim
    We have a Linux server that reads files from a network drive and processes them. In a common scenario, a user will log in and access the same files over and over again. The size of the files varies, but the larger ones can be around 50+ MB. The files seldom change. I was wondering if it's somehow possible to transparently cache the files. I don't want to (and can't) change the program that reads the files, nor do I control the protocol by which the files are accessed. I just want something to detect that I access a certain path, copy the file locally (if needed), and then read the file from the local drive. I've read about bcache but can't figure out if it's what I need. Do you have any suggestions? Thanks, Vadim.
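
    This kind of transparent, persistent local caching is what FS-Cache (via the cachefilesd daemon) provides for network filesystems. A minimal sketch, assuming the share is NFS, an export named fileserver:/export/data, and a Debian/Ubuntu-style system (all of these are assumptions, not details from the question):

        sudo apt-get install cachefilesd
        # Enable the daemon by uncommenting RUN=yes in /etc/default/cachefilesd (Debian/Ubuntu packaging).
        sudo sed -i 's/^#RUN=yes/RUN=yes/' /etc/default/cachefilesd
        sudo /etc/init.d/cachefilesd start
        # The fsc mount option turns on FS-Cache for this mount.
        sudo mount -t nfs -o fsc fileserver:/export/data /mnt/data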


  • Why not install Msvcr71.dll into system32?

    - by hillu
    While looking for an authoritative source for the missing Msvcr71.dll that is needed by a few old applications, I stumbled across the MSDN article "Redistribution of the shared C runtime component in Visual C++". The advice given to developers is to drop the DLL into the application's directory instead of system32, since DLLs in the application directory are considered before the system paths. What can/will go wrong if I (as an administrator, not a developer) take the lazy path and install Msvcr71.dll (and Msvcp71.dll while I'm at it) into the system32 directory of 32-bit Windows XP or Windows 7 systems instead of putting a copy in each application's directory? Is there another good solution that provides the applications with the needed DLLs and doesn't involve copying files to the application directories?


  • Using 64 bit wuauclt from 32 bit command prompt

    - by Tim Brigham
    I have a script that for legacy reasons needs to run inside a 32-bit command shell. This script also references certain core Windows binaries - most notably wuauclt, but others as well - which are not accessible by default from the 32-bit environment. The script is run in several locations, including many Windows 7 and Server 2008 R2 boxes. I'm aware of the possibility of copying files from system32 to SysWOW64 in order to get around this. Is there any better method - something along the lines of adding an entry to the PATH variable - that will let me fall back to these 64-bit binaries from within a 32-bit script?
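
    One well-known workaround, sketched here under the assumption that the script runs on 64-bit Windows Vista/7/2008: a 32-bit process can reach the real 64-bit system32 through the virtual Sysnative alias, so nothing has to be copied.

        REM From a 32-bit cmd.exe on 64-bit Windows, Sysnative maps to the 64-bit system32.
        REM (The alias is only visible to 32-bit processes.)
        %windir%\Sysnative\wuauclt.exe /detectnow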


  • Script to gather all the files ending in .log and create a tar.gz file.

    - by Oscar Reyes
    I'm currently using this command to find all the log files in a given directory structure and copy them to another directory where I can easily compress them:

        find . -name "*.log" -exec cp \{\} /tmp/allLogs/ \;

    The problem is that the directory/subdirectory information gets lost, because I'm copying only the file. For instance I have:

        ./product/install/install.log
        ./product/execution/daily.log
        ./other/conf/blah.log

    and I end up with:

        /tmp/allLogs/install.log
        /tmp/allLogs/daily.log
        /tmp/allLogs/blah.log

    but I would like to have:

        /tmp/allLogs/product/install/install.log
        /tmp/allLogs/product/execution/daily.log
        /tmp/allLogs/other/conf/blah.log
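
    A hedged variation that keeps the directory structure, assuming GNU coreutils (cp --parents) is available:

        # --parents recreates each file's directory path under the destination.
        find . -name "*.log" -exec cp --parents {} /tmp/allLogs/ \;
        # Then compress the whole tree in one go, matching the title's goal.
        tar czf /tmp/allLogs.tar.gz -C /tmp/allLogs .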


  • When to use delaycompress option in logrotate?

    - by Anand Chitipothu
    The logrotate man page says: "It can be used when some program cannot be told to close its logfile and thus might continue writing to the previous log file for some time." I'm confused by this. If a program cannot be told to close its logfile, it will continue to write forever, not just for some time. If the compression is postponed to the next rotation cycle, the program continues to write to that file even after the next rotation cycle. How does postponing solve the problem? My understanding is that copytruncate should be used when a program cannot be told to close the logfile. I'm aware that some data written to the logfile may be lost while the copy is in progress. I was looking at the logrotate file for CouchDB, and it had both the copytruncate and delaycompress options:

        /usr/local/couchdb-1.0.1/var/log/couchdb/*.log {
            weekly
            rotate 10
            copytruncate
            delaycompress
            compress
            notifempty
            missingok
        }

    It looks like there is no point in using delaycompress when copytruncate is already there. What am I missing?
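
    For context, a minimal sketch of the case delaycompress is aimed at, with a hypothetical daemon (not CouchDB) that is rotated without copytruncate and only reopens its log some time after rotation:

        /var/log/myapp/*.log {
            weekly
            rotate 10
            compress
            # Leave the newest rotated file uncompressed for one cycle: the daemon
            # may keep writing to it until it reopens its log, and those writes
            # would be lost if the file were gzipped immediately.
            delaycompress
            notifempty
            missingok
        }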


  • Outlook 2010: using signatures stored on network

    - by Gregory MOUSSAT
    With Outlook versions before 2010, it was possible to specify any path for the signatures. With Outlook 2010, the only way is to use those stored in C:\Documents and Settings\UserName\Application Data\Microsoft\Signatures\. I'd like to point the signatures to a network share, allowing us to modify the signatures on the share instead of logging on to every computer each time we are asked to change them (and this is quite often, because the signatures contain logos for current events). We currently use a script to copy the signatures from the share to the local disk when users log in. Question: how do I set Outlook 2010 to use signatures outside of the default signature folder?
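
    For reference, a minimal sketch of such a logon script, assuming a hypothetical \\server\signatures share (Outlook reads the local copies from %APPDATA%\Microsoft\Signatures):

        REM Mirror the central signature store into the local Outlook folder at logon.
        robocopy "\\server\signatures" "%APPDATA%\Microsoft\Signatures" /MIR /R:1 /W:1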


  • Moving files from Public folder to C: takes a minute, even though they are on the same hard drive and same partition

    - by Jian Lin
    I have a big file, about 2 GB, and would like to move it from Network -> Bookroom -> Users -> Public (this is the computer in the bookroom of the house) to C:\myfiles. They are actually on the SAME hard drive (and the same partition), but the move still takes a minute or so. I thought that on the same hard drive and partition it would be a "move" and should take only 2 or 3 seconds. That Public folder is also \\Bookroom\Users\Public. Update: Sorry, I actually do mean "move" all the way, not copy, so that's why I thought it should take only 2 or 3 seconds.


  • How can I format a USB drive as FAT from a MacBook Pro?

    - by Edward Tanguay
    I plugged a 250 GB USB hard drive into my MacBook Pro and want to format it as FAT so I can transfer files back and forth with a Windows machine. (My Windows 7 machine only formats it as exFAT, which my Snow Leopard 10.6.4 doesn't support until I do the update.) So I want to format it on the Mac, but when I right-click on the drive, it gives me the options to eject and copy, but not to format. I can go into Disk Utility and click on Partition, but the only option is the "Mac OS Extended (Journaled)" format. How can I format my USB drive as FAT from my MacBook Pro?
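
    From Terminal, diskutil can do the FAT32 format that the GUI is hiding. A hedged sketch, where /dev/disk2 is an assumed identifier to be checked first:

        diskutil list
        # Erase the whole drive as FAT32 with an MBR partition map
        # (the volume name must be upper case for FAT32).
        diskutil eraseDisk "MS-DOS FAT32" UNTITLED MBRFormat /dev/disk2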


  • What is the simplest and fastest way to transfer a large file through a Windows network?

    - by Sake
    I have a Windows 2000 Server machine running MS SQL Server that stores over 20 GB of data. The database is backed up every day to the second hard drive. I want to transfer those backup files to another computer to build another test server and for recovery practicing. (The backup never actually got restored for almost 5 years. Don't tell my boss about that!) I have trouble transferring that huge file over the network. I've tried plain network copy, Apache download, and FTP. Every method I tried ends up failing when the amount of data transferred reaches 2 GB. The last time I successfully transferred the file, it was via a USB-attached external hard drive. But I want to perform this task routinely and preferably automatically. What is the most pragmatic approach for this situation?
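
    One pragmatic, scriptable option is robocopy in restartable mode (built into Vista/2008 and later, and available for Windows 2000/2003 from the Resource Kit). A hedged sketch with assumed share, path, and file names:

        REM /Z = restartable mode, so an interrupted transfer resumes instead of failing.
        robocopy "\\sqlserver\backup" "D:\sqlbackups" *.bak /Z /R:5 /W:30 /LOG:C:\robocopy.log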


  • Trouble Connecting to Virtual Machine after IP address Change

    - by David
    I have a VMware image running a copy of Fedora 11 which is hosted on a remote server. The remote server recently had its IP address changed, and I'm now unable to connect to my virtual machine. The server admin assures me that my virtual machine is running and is assigned the new IP address. I have checked the firewalls and had the remote admin restart the VM instance. Neither of these fixed the problem. How do I troubleshoot a remote server that I am unable to SSH to? I'm actually even unable to ping the remote IP (the connection times out).


  • Remote symbolic link / junction

    - by Blueberry
    This might be a pretty obvious one, but I have had some trouble finding solid answers. I have a directory on a Windows network share containing different versions of an application. I would like to have a link called 'current' sitting beside all the other versions and pointing to one of them. Creating this link seems to be more of an issue than I would have thought. It looks like a symlink only shows the link on the same machine where it was created (which is not going to work, for obvious reasons), and junction needs to be run on the server, which is practically impossible due to various restrictions. What would be the best way to go about this? Would I just need to copy the files twice, or can I have a symbolic link that can be created and accessed remotely?
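
    For what it's worth, a heavily hedged sketch of the symlink route with the standard Windows tools, assuming the share is mapped as X: and 1.2.3 is one of the version folders; whether other clients can follow the link depends on the remote symlink evaluation policy:

        REM Check / enable following symlinks that point between remote shares.
        fsutil behavior query SymlinkEvaluation
        fsutil behavior set SymlinkEvaluation R2R:1 R2L:1

        REM Create a relative directory symlink next to the version folders.
        X:
        cd \apps
        mklink /D current 1.2.3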


  • mod_rewrite RewriteRule is not working

    - by buggy1985
    Hi, this is a follow-up to this question: Rewrite URL - how to get the hostname and the path? And a copy of this one: mod_rewrite RewriteRule is not working. I have this rewrite rule:

        RewriteEngine On
        RewriteRule ^(http://[-A-Za-z0-9+&@#/%=~_|!:,.;]*)/([-A-Za-z0-9+&@#/%=~_|!:,.;]*)\?([A-Za-z0-9+&@#/%=~_|!:,.;]*)$ http://http://www.xmldomain.com/bla/$2?$3&rtype=xslt&xsl=$1/$2.xsl

    It seems to be correct, and exactly what I need, but it doesn't work on my server: I get a 404 page not found error. mod_rewrite is enabled, as the following simple rule works fine:

        RewriteEngine On
        RewriteRule ^page/([^/\.]+)/?$ index.php?page=$1 [L]

    Can you help? Thanks
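
    Not a drop-in fix, but a hedged illustration of the usual stumbling block here: a RewriteRule pattern is matched against the URL path only, never against "http://host" or the query string, so those parts have to come from server variables (the "bla" path below is a placeholder):

        RewriteEngine On
        # The query string is captured in a RewriteCond and referenced as %1;
        # the path captured by the rule itself is referenced as $1.
        RewriteCond %{QUERY_STRING} ^(.+)$
        RewriteRule ^bla/([^/]+)$ http://www.xmldomain.com/bla/$1?%1&rtype=xslt [L]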


  • Elastic beanstalk access private git repo

    - by user221676
    I am currently trying to add an SSH key to my Elastic Beanstalk instances using .ebextensions commands. The keys are stored in my application code, and I copy them to root's .ssh folder so I can access them when doing a git+ssh clone later. Here is an example of the config file in my .ebextensions folder:

        packages:
          yum:
            git: []
        container_commands:
          01-move-ssh-keys:
            command: "cp .ssh/* ~root/.ssh/; chmod 400 ~root/.ssh/tca_read_rsa; chmod 400 ~root/.ssh/tca_read_rsa.pub; chmod 644 ~root/.ssh/known_hosts;"
          02-add-ssh-keys:
            command: "ssh-add ~root/.ssh/tca_read_rsa"

    The problem is that I get an error when attempting to clone the repo: Host key verification failed. I have tried many ways of adding the host to the known_hosts file, but none have worked! The command that does the clone is npm install, as the repo points to a node module.
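
    "Host key verification failed" usually just means the git server's key is not in root's known_hosts yet. A hedged sketch of one way to pre-seed it from the same .ebextensions file (github.com is an assumed host; substitute the real one):

        container_commands:
          00-add-known-host:
            # Append the git server's host key so the later git+ssh clone is non-interactive.
            command: "ssh-keyscan -H github.com >> ~root/.ssh/known_hosts"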


  • Where should I go to learn about networking? [closed]

    - by Ollie Saunders
    I wonder if anyone could recommend a resource or resources, such as a good book, that:

      - explains how all the important protocols work and interact; I'm interested in those that are relevant in a typical home network and used over the Internet
      - explains in detail how ADSL Internet connections work, to the level of depth necessary so that I'm able to tweak and measure performance settings
      - starts from the beginning but attempts to provide proper understanding rather than idiot-oriented steps to follow

    Basically, I'm interested in how these technologies work and tend to be implemented in hardware and software, rather than "here's what to do if...". I'm interested in Computer Networking by Andrew S. Tanenbaum, and I wonder if anyone else has any experience with that title. It's expensive, but I could probably borrow a copy from the library for £3 or so.


  • Shared storage solution for our SQL Server backups

    - by Gokhan
    We have 3 clustered SQL Servers. We have 5+ multi-terabyte databases, and their backup files (compressed using Quest LiteSpeed) are hitting over 600 GB each. We are required to keep at least a week or two (if we can) of weekly full backups, then 6 days of differential backups, and a week or two's worth of log backups locally. We are currently limited to 2 TB volumes by our SAN team; we can have multiple volumes, but they are expensive ($200 per raw TB per month), and having to deal with many backup volumes instead of a single big volume is difficult. I think it would be good if we could have shared network storage of 20 TB+ (RAID 10 or so) for all our servers to keep the backups on; another department would copy them to tape from the network storage and delete files according to the retention period. It would be fine if this box came with a built-in operating system (even Unix, as a complete file storage system). What do you think - does this make sense to you, and is there any manufacturer that sells a storage product like that which works in a clustered environment? Thank you


  • Is there a way to reinstall a fresh Adobe trial on Mac?

    - by Moshe
    I was wondering if it is easier to fool the Adobe installer into giving a second, third, etc. 30-day trial of various products on Mac OS X 10.5 than it is on Windows. Is this doable? How so? I am expecting to get a PC version of CS4 soon, possibly through a legit source; this is just for the interim. Edit: I am not a big fan of pirating, but as I stated above, I am expecting a legit copy soon. Why the downvotes?


  • Homegroup and NTFS permissions

    - by bytenik
    I'm running a copy of Windows 7 as a "server" at my home. I have several file shares that I want to make available to specific users only. I've modified the NTFS permissions to only allow these users to access their respective shares. However, while a locally logged on user can access the actual folders just fine, over the network the remote access is authenticating as HomeGroupUser$ rather than the actual user in question, as shown by the Computer Management panel for shares. I do have matching user accounts (i.e. my username locally is abc and a parallel account with username abc and the same password exists on the server machine). I don't want to disable homegroup because there are other shares where homegroup authentication would be desirable, especially for some people where they don't have a parallel account. Is there a way to get the system to authenticate first by matching username, and then by homegroup authentication if there's no matching user?


  • Trigger ZFS dedup one-off scan/rededup

    - by Jake Wharton
    I have a ZFS filesystem that has been running for some time, and I recently had the opportunity to upgrade it (finally!) to the latest ZFS version. Our data doesn't scream dedup, but I firmly believe, based on small tests, that we could gain anywhere from 5-10% of our space back for free by utilizing it. I have enabled dedup on the filesystem and new files are slowly being deduplicated, but the majority (95%+) of our data already exists on the filesystem. Short of moving the data off-pool and then recopying it back, is there any way to trigger a dedup scan of the existing data? It doesn't have to be asynchronous or live. (And FYI, there isn't enough room on the pool to copy the entire filesystem to another one and then just switch the mounts.)
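
    Two things worth knowing, offered as a hedged sketch (tank/data is an assumed pool/dataset name): zdb can simulate dedup over the existing data without touching it, and existing blocks only enter the dedup table when they are rewritten, for example via zfs send | zfs receive into a new dataset on the same pool, one dataset at a time if space allows:

        # Read-only simulation of the dedup ratio the existing data would achieve.
        zdb -S tank

        # Rewriting a dataset makes its blocks pass through the (now enabled) dedup table.
        zfs snapshot tank/data@prededup
        zfs send tank/data@prededup | zfs receive tank/data_dedup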


  • Safe to Use an Amazon EBS Volume While Snapshot in Progress?

    - by Justin Noel
    Is it safe to use an EBS volume while a snapshot is being created? I've currently got a 100 GB EBS volume mounted, and I am in the process of snapshotting it. Goodness, it's slow!! It's going to end up taking more than 45 minutes to snapshot. My question: is the EBS volume already copied and just being saved somewhere, or is the snapshot actively copying from my mounted volume right now? Basically, if I start using it before the snapshot completes, am I hosed? I just can't believe it takes this long to copy. There really isn't even 100 GB in use; it's more like 25 GB.


  • Reboot loop after Windows XP Service Pack 3 update

    - by espais
    I recently upgraded to Service Pack 3, and now it seems that something has gone terribly wrong with the update. After logging in, my computer blue-screens after about 5 minutes and then goes into a reboot loop (I don't have the exact error message handy). I have a Sager NP2092 notebook, running an Intel chipset. I'd rather avoid having to reformat my XP, especially with my copy of Windows 7 arriving right around the corner. After doing some Googling, I came across this article: Does your AMD-based computer boot after installing XP SP3? However, it deals with AMD chips and specifically states not to use its fix on Intel-based systems. EDIT: After stopping the reboot, this is the error that pops up: STOP: 0x000000F4 (0x00000003, 0x8A187118, 0x8A18728C, 0x80604438) EDIT 2: I have run Memtest86, and it reported 0 errors.


  • Breaking mdadm RAID and moving to NTFS

    - by daveyt
    I'm running Ubuntu 8-something, and my data is on a mirrored pair of 1 TB disks formatted as ext3, with the RAID managed by mdadm. I want to move to Windows 7 (yeah, yeah, I know, but Linux ain't doing it for me at the moment) and migrate the disks to NTFS. My plan is:

      1. Break the mdadm RAID (by failing one disk logically)
      2. Format the 'failed' disk as NTFS
      3. Copy data from the RAID array to the NTFS disk (don't care about perms)
      4. Install Windows on a new, separate, non-RAID disk, so my data disk is available

    I've researched this and it seems the easiest way. I don't have another disk to back up to, so I think this is my only option. Can anyone see a better/easier way?
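
    A hedged sketch of those steps on the Linux side, assuming the array is /dev/md0 mounted at /mnt/raid, the member to repurpose is /dev/sdb1, and the ntfsprogs/ntfs-3g packages are installed (all assumed names):

        mdadm --manage /dev/md0 --fail /dev/sdb1      # logically fail one mirror half
        mdadm --manage /dev/md0 --remove /dev/sdb1    # detach it from the array
        mkfs.ntfs -f /dev/sdb1                        # -f = quick format, skips zeroing the disk
        mkdir -p /mnt/ntfs
        mount -t ntfs-3g /dev/sdb1 /mnt/ntfs
        cp -a /mnt/raid/. /mnt/ntfs/                  # copy the data off the degraded array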


  • WMV file won't play in Windows Media Player

    - by user1053768
    I have a .wmv file that I uploaded to my website. When I click the link to the video, the video plays in Windows Media Player without any problems. However, on some systems, when the user clicks the link, Windows Media Player gives them the error: "Windows Media Player cannot play the file. The player might not support the file type or might not support the codec that was used to compress the file." All I did was copy the video to the server and store the URL in a database. Why are users getting this error? How can I fix it?


  • Need disc image help pronto!

    - by data
    I recently got a job as a junior network administrator. Last week the senior admins did their yearly reinstall of Server 2003, Exchange, drivers, etc. on the main server. I've been asked to back up the disk so that next year they can just copy over the pre-made image. What tools can I use to both create an image of the entire server's HDD and load it back on (I'd like to test it in the sandbox)? To impress them, a program that is free is preferable, and ideally a tool that can do it all when booted off a USB drive.


  • How do I SCP/FTP to a Coraid SR2421

    - by Pitto
    I need to save data to a Coraid SR2421. I am no expert, and I'm trying to understand how this costly piece of hardware works. All I have is an Ethernet cable connecting my laptop and the Coraid to a switch, and the Coraid's console with its cursor blinking. Any kind of help would be greatly appreciated. Edit and further explanation: I've tried using the software on the Coraid website, but it's not working. So I contacted support, and they sent me a file to upload to the Coraid. It looks like if I upgrade the Coraid's firmware, then I'll be able to use the Coraid software on the Windows side. That's why I need SCP/FTP or any other way to copy a file onto a partition that can be read from the Coraid console command line, so I can launch the update command as requested. I hope I was a bit clearer now.


  • Getting error code -41 when copying files to external drive

    - by diego
    I'm having trouble copying some files from my Mac to an external hard drive: I keep getting the nondescript "error code -41". I noticed that some of the files with an additional "@" permission bit had the "com.apple.quarantine" flag set. I used the xattr command from the article "What should I do about com.apple.quarantine?" to take care of the quarantine flag and sort that out (these files were copied over from another Mac on my network, so I guess OS X flagged them as quarantined). That took care of the problem for those files, but I still have some that I can't copy over to the external drive. The only other thing I've noticed is that some of these files have an extra permission bit, "drwxr-xr-x+", which I haven't had any success googling. Aside from that I don't see anything else, and Disk Utility says everything's fine. Any help would be greatly appreciated.
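
    For inspecting the remaining files, a hedged sketch of the relevant commands (somefile is a placeholder; the trailing "+" in the mode string means an ACL is present, which is separate from the "@" extended attributes):

        xattr -l somefile                  # list extended attributes (the "@")
        xattr -d com.apple.quarantine somefile
        ls -le somefile                    # show the ACL entries behind the "+"
        chmod -N somefile                  # remove all ACL entries from the file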

