Search Results

Search found 10417 results on 417 pages for 'large'.


  • Storing large amounts of small files into bigger files on Windows

    - by asmo
    Let's say I have 50 GiB of files that weigh around 500 KiB each. My guess is that having, for example, 5 large files of 10 GiB each with the same content archived inside them would be better for hard drive performance. Am I correct? Will there be a noticeable gain on an NTFS filesystem? Finally, which tool could I use to group the files together while retaining the ability to modify the contents of the archive with zero or minor performance loss? For example, I like TrueCrypt's style of archiving because after mounting an archive file, it creates a drive which I can use seamlessly as if it were a normal drive. The only thing with TrueCrypt is that I don't need encryption/compression, only archiving.
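    One possible approach, sketched here only as an illustration, is a plain VHD container mounted as a drive letter, which behaves much like the TrueCrypt volume but without encryption or compression. It assumes Windows 7 / Server 2008 R2 or later; the path, size and drive letter are placeholders:

        rem run inside an elevated diskpart session
        create vdisk file="D:\archives\files.vhd" maximum=10240 type=expandable
        select vdisk file="D:\archives\files.vhd"
        attach vdisk
        create partition primary
        format fs=ntfs quick label="archive"
        assign letter=V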

    Read the article

  • design a large scale network for an organization

    - by Essam
    I want to design a large-scale network for an organization with a HQ and two branches, and I want to use a class A subnet. If I use the network address 30.0.0.0 for the whole organization, how can it be distinguished from another organization (a company or whatever) that is using the same address in another country? I have three locations for this organization, so I need 5 subnets: one for the HQ, one each for branch A and branch B, one for connecting branch A to the HQ, and one for connecting branch B to the HQ, since I will use a central DHCP server at the HQ. Is that number of subnets right? Is it advisable to use class A or class B for this organization in terms of the addresses that will be wasted (let's say it is a university with two branches in two different states)?
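    For illustration only, one possible addressing plan uses the private 10.0.0.0/8 range (30.0.0.0 is publicly routed space, which is why two organizations cannot both use it on the Internet) with a /16 per site and /30s for the WAN links; every value below is an assumption, not a recommendation for this particular network:

        10.1.0.0/16    HQ LAN
        10.2.0.0/16    Branch A LAN
        10.3.0.0/16    Branch B LAN
        10.255.0.0/30  WAN link: HQ <-> Branch A
        10.255.0.4/30  WAN link: HQ <-> Branch B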

    Read the article

  • Downloading a large site with wget

    - by Evan Gill
    Hi, I'm trying to mirror a very large site but wget never seems to finish properly. I am using the command: wget -r -l inf -nc -w 0.5 {the-site} I have downloaded a good portion of the site, but not the whole thing. The content does not change fast enough to bother using time-stamping. After running overnight, this message appears: File `{filename}.html' already there; not retrieving. File `{filename}.html' already there; not retrieving. File `{filename}.html' already there; not retrieving. File `{filename}.html' already there; not retrieving. Killed Does anyone know what is happening and how I can fix it?
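    The bare "Killed" line is usually the kernel's OOM killer ending the process rather than wget stopping on its own; a minimal check plus a resumable re-run, assuming a Linux host ({the-site} stays as the original placeholder):

        # did the out-of-memory killer end the previous run?
        dmesg | grep -i -E 'out of memory|killed process'
        # resume the mirror, logging to a file instead of the terminal
        wget -r -l inf -nc -w 0.5 -o wget.log {the-site}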

    Read the article

  • How do you manage large web farms?

    - by Andrew Katz
    I have a quickly growing web farm running IIS 7 (30+ servers). All servers are identical copies of each other, and all of them are physical. We update the software about once a month, and the current process follows these steps: disable the server in the pool on the F5 load balancer; disable HTTP keep-alives in IIS so connections drop quickly; change the default directory of the website to the new folder containing the new binaries; test the server; enable HTTP keep-alives; enable the server in the F5 pool; move on to server 2. Microsoft used to have Application Center, which was abandoned a while ago. They have made a second attempt with the Web Farm Framework, but it adds as much QA time testing the release package as it saves in deployment. Has anyone seen a commercial off-the-shelf application that is tailored for managing and deploying to large web farms? Thanks!
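    As a sketch only: the "change default directory" step can be scripted per node with appcmd so the same sequence is repeatable from one admin box (assuming some remote-execution mechanism such as PsExec or WinRM is already in place); the site name, app pool name and release path below are hypothetical:

        %windir%\system32\inetsrv\appcmd.exe set vdir "Default Web Site/" /physicalPath:"D:\releases\2012-06-01"
        %windir%\system32\inetsrv\appcmd.exe recycle apppool /apppool.name:"DefaultAppPool"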

    Read the article

  • Do large folder sizes slow down IO performance?

    - by Aaron
    We have a Linux server process that writes a few thousand files to a directory, deletes the files, and then writes a few thousand more files to the same directory without deleting the directory. What I'm starting to see is that the process doing the writing is getting slower and slower. My question is this: the directory size of the folder has grown from 4096 to over 200,000, as seen in this output of ls -l: root@ad57rs0b# ls -l 15000PN5AIA3I6_B total 232 drwxr-xr-x 2 chef chef 233472 May 30 21:35 barcodes On ext3, can these large directory sizes slow down performance? Thanks. Aaron
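    Two checks worth sketching here (the device name is a placeholder): whether the filesystem has hashed directory indexes (dir_index) enabled, and, if the directory itself has become bloated, an offline e2fsck pass that repacks directories:

        # is dir_index listed among the filesystem features?
        tune2fs -l /dev/sdX | grep -i 'features'
        # with the filesystem unmounted, -D optimizes (repacks) directories
        e2fsck -fD /dev/sdX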

    Read the article

  • Photoshop / Illustrator Fill text box with large string.

    - by Xetius
    I have a massive string (lots of Fibonacci numbers concatenated together). I don't know how much of this text I need to fill an A4 page. What I was hoping for was to paste a large block into a text box and have it display as much as possible, wrapping the text at the end of each line, but it is not doing that. It is just displaying a blank box (with the text overflowing into an awaiting text box or something). I have tried pasting smaller amounts of text into the text box, and it appears that it will get about half way and then go into 'blank' mode. All I need is a simple way of creating a background of numbers which I don't have to type in. Any ideas?

    Read the article

  • Large .tmp files in C:\Windows\Temp

    - by Brian
    I have a Windows 2008 server which functions as a web service server, and every other week or so the C:\Windows\Temp folder is filled with large (between 40 and 300 MB) files with names like ~DF5D73.tmp, ~DFA95.tmp, etc. My web services on the server don't create any files with that type of name, and the contents don't appear to help identify them. Process Explorer indicates that w3wp.exe is currently using several of the files, but I can't imagine specifically where they're coming from. As an example, the top of one of the files begins with what looks like an OLE compound-document header (ÐÏࡱá ... R o o t E n t r y) followed by long runs of filler bytes, so the dump itself isn't much help. The owner of the files is only identified as localname\Administrators, so that's no help either. Has anyone else experienced a similar issue, or any suggestions to track where the files are coming from?
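    One way to narrow this down, assuming the Sysinternals handle utility can be copied onto the server: list which processes currently hold the .tmp files open, and which files w3wp.exe has open (the name fragments below are just the ones from the question):

        handle.exe ~DF
        handle.exe -p w3wp.exe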

    Read the article

  • Backups of Exchange 2007 SP3 using VSS are abnormally large

    - by Stew
    I have recently implemented Veeam Backup and Recovery 6.0, and have noticed when backing up my Exchange server via incremental updates that it transfers way more data than expected. The backup is incremental and set up to use VSS, and VSS is stable and healthy according to vssadmin. Exchange 2007 SP3 is running on Windows Server 2008 R2; just last weekend I installed the latest rollup for Exchange. I thought the nightly incrementals were large, but perhaps my users really are sending that much mail, so I tested by taking one incremental backup, waiting 10 minutes, and taking a second. The second incremental backup transferred 5.8GB of data. We as an organization are absolutely NOT putting 5.8GB of data on the mail server every 10 minutes. Are there any other Veeam users who have seen something similar? Is my test faulty? Are there other considerations for VSS?

    Read the article

  • Installing software at large organisation

    - by CJ7
    I have to give an MSI to a large organisation for it to install (presumably via GPO) on to some of their workstations that are running XP. I already know that my application is not allowed to write to the application folder. I realise it should be writing to the AppData folder. The organisation has allocated a folder on a server for the database files and other configuration files. This folder is referenced by UNC naming and not by mapped drives. My question is: based on normal practices, is my app likely to have the rights to create a sub-folder on this server? Is my app likely to have the rights to create files on this server?

    Read the article

  • Copy large files to multiple machines on a LAN

    - by Jonathan Callen
    I have a few large files that I need to copy from one Linux machine to about 20 other Linux machines, all on the same LAN, as quickly as is feasible. What tools/methods would be best for copying these files, noting that this is not going to be a one-time copy? These machines will never be connected to the Internet, and security is not an issue. Update: The reason for my asking this is that (as I understand it) we are currently using scp in serial to copy the files to each of the machines, and I have been informed that this is "too slow" and a faster alternative is being sought. According to what I have been told, attempting to parallelize the scp calls simply slows things down further due to hard drive seeks.
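    One hedged sketch of an alternative to serial scp: publish the files once over HTTP on the source machine and have every target pull in parallel, so the source disk is read largely from cache; hostnames, paths and the port are placeholders, and it assumes Python 2 and password-less SSH to the targets:

        # on the source machine, serve the directory holding the files
        cd /data/to/copy && python -m SimpleHTTPServer 8000 &
        # from an admin shell, start all the pulls at once
        for host in node01 node02 node03; do
            ssh "$host" "wget -q http://source-host:8000/bigfile.bin -O /data/bigfile.bin" &
        done
        wait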

    Read the article

  • md5sum of large files gives different results sometimes

    - by Emanuele
    I have an AMD quad core, 8 GB RAM, 1 SSD with ext2 (2 months old), and 2 HDDs with ext4, approximately 1 year old. I'm using Ubuntu 10.04 x86-64, and when I compute the md5sum of large files (9 GB) I sometimes get different values than the one stored in a reference file. After restarting or switching off the PC, I get the expected results no matter how many times I repeat it. But this is random. I've turned on ECC (the fastest possible settings) and the issue seems to be rarer, but I've run memtest86+ for 6+ hours without a glitch (and with ECC off!). Any idea? Should I update the BIOS of my motherboard (Asus EVO-something...don't remember it now)? I've tried everything else apart from this, but genuinely don't know what to do anymore... Any suggestion is appreciated!
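    A simple way to see whether the mismatch is reproducible within a single boot (the file path is a placeholder): hash the same file several times in a row and compare the outputs; if they differ between runs, the data is being corrupted on the read path rather than on disk.

        for i in 1 2 3 4 5; do md5sum /path/to/large-file; done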

    Read the article

  • A raw dump from a corrupted large file

    - by Masoud M.
    I have a large .rar file on partition D: (Windows 7/NTFS). It's corrupted due to a bad sector (I think), and when I copy it to another place (an external HDD) the system freezes after 88% of progress. I even tried to copy it with my Ubuntu system and the same problem occurred. I also tried chkdsk and it doesn't fix it. I think my last chance is to dump the file with a tool which ignores bad sectors and creates a raw copy of it. Then I will repair the file with RAR tools. But I cannot find a tool to make a raw dump of a specific file. (On Linux there is the dd tool, but it dumps a whole partition and I cannot use it.) So, does somebody know a tool to do a raw dump of a single file?
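    For what it's worth, dd (and GNU ddrescue) will happily take an ordinary file as input, not only a partition; a hedged sketch with the source and destination paths as placeholders, which keeps going past read errors and pads unreadable blocks with zeros so offsets stay aligned for the later RAR repair:

        dd if=/media/D/archive.rar of=/media/external/archive.rar bs=4096 conv=noerror,sync
        # or, with GNU ddrescue, which retries bad areas and logs them:
        ddrescue /media/D/archive.rar /media/external/archive.rar rescue.log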

    Read the article

  • Upload large database SQL file

    - by Devy
    I have a database dump of more than 20GB in size on my hard disk. What is the best way to upload it with the least possible (monetary) load on the server? I'm on Windows 7, and I have FTP and SSH access to the server. I avoid using FTP because my connection cuts off a lot, and I can't face re-uploading the file again after it fails at 99%. I found some tools that split the large .sql file into small .sql files, but they didn't mention how to gather those files back into one file. Another way is to archive the big .sql file to .rar with the -v option, upload the parts through FTP, then unpack them. But unpacking will also cost, right? I know it will cost in any case, but any best practice would be strongly appreciated.
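    A hedged sketch of two options over the SSH access, assuming GNU tools such as rsync and split are available on the Windows side (e.g. via Cygwin); filenames and the remote path are placeholders. rsync can resume a partial transfer after the connection drops, and split pieces can be re-joined on the server with cat:

        # resumable, compressed transfer over SSH
        rsync --partial --progress --compress dump.sql user@server:/var/backups/
        # or split locally, upload the parts, then reassemble on the server:
        split -b 100m dump.sql dump.sql.part_
        cat dump.sql.part_* > dump.sql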

    Read the article

  • How to copy/paste LARGE amounts of text in Windows

    - by Johnson
    I am not sure if this is more suited for Super User or Stack Overflow, but here goes... A little bit of background: I'm learning SQL and was trying to make a very large table which I could use for optimization tests; something generic with random values. I created a little Java program to do just that, and was able to put out a text file with 100,000 lines, each line being an SQL INSERT statement for a new random record. However, with anything much bigger than 100,000 lines, I had problems either opening/using the text file in any text editor, or copying/pasting the text to the Windows clipboard and then into SQL Developer so I could execute it as a script. I'm probably overlooking something really obvious, or doing something really stupid. There has got to be a better way to do this, but I couldn't find anything through Google, Stack Overflow or Super User. Thanks!
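    One way to avoid the clipboard entirely, sketched under the assumption that this is Oracle SQL Developer or SQL*Plus and that the generated file is already saved to disk (the path is a placeholder): run the file as a script instead of pasting it.

        -- in a SQL Developer worksheet (Run Script) or in SQL*Plus:
        @C:\work\generated_inserts.sql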

    Read the article

  • Need to transfer large video from Camera!-app to computer

    - by Henrik Söderlund
    I have a jailbroken iPhone 4S and am trying to transfer a 25-minute HD video that I have recorded with SmugMug's Camera Awesome (Camera!) app. Once recorded in the app, it stays within that app's interface until you choose to save it via the app onto the camera roll. When trying this option, the app just stalls, even when left for an hour or more. I assume the video is too large to copy. I am trying the iExplorer app on my MacBook Air. I can find the Documents folder inside the Camera! folder, but as soon as I access it to view the contents, the app stalls completely, probably while trying to read the enormous video. Is there a clever way to transfer this file onto the computer? I can use iFile on the iPhone to transfer over Wi-Fi, but I don't know the Camera! app's Documents folder location on the file system.
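    A hedged sketch, assuming OpenSSH is installed on the jailbroken phone and both devices are on the same Wi-Fi; the IP address, app container ID and file name are placeholders, and the .mov extension is a guess. Locate the app's Documents folder by searching from a shell on the phone, then pull the file with scp from the Mac:

        ssh root@192.168.1.50 "find /var/mobile/Applications -type f -name '*.mov'"
        scp root@192.168.1.50:"/var/mobile/Applications/<app-id>/Documents/<video>.mov" ~/Desktop/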

    Read the article

  • Preventing - Large Number of Failed Login Attempts from IP

    - by Silver89
    I'm running a CentOS 6.3 server and currently receive emails entitled "Large Number of Failed Login Attempts from IP" from my server every 15 minutes or so. Surely, with the configuration below, only someone at my static IP should even be able to attempt a login? If that's the case, where are these remote unknown users trying to log in, such that these emails are generated? Current security steps: root login is only allowed without-password; StrictModes yes; SSH password login is disabled - PasswordAuthentication no; SSH public keys are used; the SSH port has been changed to a number greater than 40k; cPHulk is configured and running; logins are limited to a specific IP address; cPanel and WHM are limited to my static IP only. hosts.allow: sshd: (my static ip) vsftpd: (my static ip) whostmgrd: (my static ip) hosts.deny: ALL : ALL
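    To see which service the attempts are actually hitting, a couple of standard log checks on CentOS (authentication messages land in /var/log/secure, and lastb reads the failed-login record in /var/log/btmp); the grep pattern is just a starting point:

        grep -i 'fail' /var/log/secure | tail -n 50
        lastb | head -n 50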

    Read the article

  • Clean out a large MediaWiki text table

    - by Bart van Heukelom
    I just discovered that an old MediaWiki of mine was infested with spam, and the database table named "text" (which contains the page content) is 3GB in size. I've deleted all the spam pages manually, but the table is still the same size. I wonder how it got to 3GB anyway; there wasn't that much spam (about a hundred medium-sized pages). How can I get rid of this mess? If you want to inspect the wiki, it's over here. The database is MySQL 5.0.75.
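    A hedged sketch of the usual cleanup path, assuming shell access to the wiki's maintenance directory and a fresh backup: deleting pages only archives their revisions, so the orphaned revision text has to be removed and the table rebuilt before MySQL gives the space back. The script names below are the ones shipped with MediaWiki, but flags vary between versions:

        php maintenance/deleteArchivedRevisions.php --delete
        php maintenance/purgeOldText.php --purge
        # then, in MySQL, rebuild the table to reclaim the freed space:
        # OPTIMIZE TABLE text;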

    Read the article

  • Manage Large E-Book Archive

    - by Cnkt
    I have a very large e-book archive (approx. 1TB) in various file formats, e.g. PDF, DJVU, MOBI and EPUB. I put them in different folders by subject, e.g. Engineering, Programming etc. But after many years, things are going crazy. The Programming folder alone is 220GB and the file names are cryptic. Some filenames are well defined, like 236659889_Final_Report_of_2012_Climate_Change_Conference.pdf, but some are just ISBN numbers or just download.pdf. I need an application for organizing and searching my e-books. I have already tried Calibre, Mendeley and Debenu, but all these apps try to import the files first, and I don't have a spare 1+ TB for an app's import folder. Is there a good Windows application that just indexes the filenames and contents of e-books without importing them?

    Read the article

  • Copying large Windows directory structure to new server with permissions intact

    - by Chris
    I'm soon going to be doing a large migration of an old-school web server that serves mostly ASP pages (currently on a Windows 2003 server) to a newer, virtualized Windows 2008 server. This new server is going to be in a different domain, as well. So I'm copying the root web folder, and all its subfolders and files, to this new server. I'd like to keep permissions intact. It's also pretty massive - I'd like to be able to compress it before transferring to the new server with permissions intact. Any way to do that? And will the new server being in a different AD domain screw with my plans?
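    A minimal sketch of one common way to move the tree with NTFS permissions intact, using robocopy (built into Windows 2008; available for 2003 via the Resource Kit); paths are placeholders, and note that ACL entries referring to accounts in the old domain will not resolve in the new one:

        robocopy D:\inetpub\wwwroot \\newserver\d$\wwwroot /E /COPYALL /ZB /R:1 /W:1 /LOG:C:\robocopy.log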

    Read the article

  • Windows 7 pagefile size with large RAM and SSD

    - by Avi
    I've just upgraded my Windows 7 machine from 12GB to 24GB of RAM - both for running more VMs and for future-proofing. My C: drive is an SSD with 129GB formatted size. I was surprised to find that the SSD only has 68GB free (most of my files are on D: to G:). Researching, I found that 24GB of my precious C: SSD is taken up by virtual memory. So - do I need such a large amount of virtual memory when I have 24GB of RAM? I bought this much memory so I'd not have to go to disk...
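    If a smaller, fixed page file turns out to be acceptable, it can be set from an elevated command prompt; the 4096MB figure below is only an example, not a sizing recommendation for this workload:

        wmic computersystem where name="%computername%" set AutomaticManagedPagefile=False
        wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=4096,MaximumSize=4096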

    Read the article

  • Best archive format & tool for large amounts of data (50gb+)

    - by marcusstarnes
    I only realised this afternoon that the ZIP format has a limit of what appears to be around 20gb. I am trying to automate an archive process (using Automate) to zip/rar/whatever a collection of folders/files on one of my disks. It always appeared to bomb out with an incomplete archive at about 20gb. So I tried using WinRAR and doing it manually as a ZIP file, but it told me of the limit. So, I was wondering, what is a recommended zip format (and tool for accomplishing the task) for archiving up a large amount of data (around 50gb)? Thanks
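    For what it's worth, a sketch using 7-Zip's command line, which handles archives far beyond this size and can also emit fixed-size volumes for easier handling; the source path, output path and volume size are placeholders:

        7z a -t7z -v4g D:\backup\archive.7z D:\data\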

    Read the article

  • Large volume at /mnt on AWS instance

    - by rhaag71
    I know this is probably a somewhat 'dumb' question :) I have an AWS (small) instance and I just noticed that there is a ~150GB volume attached at /mnt. Is this normal? It kinda freaked me out; I was thinking maybe someone was trying to capture whatever I mount in /mnt. There is an entry in my fstab too (and I found by googling that others have this). The entry is as follows: /dev/xvdb /mnt auto defaults,nobootwait,comment=cloudconfig 0 2 I don't have any volumes this large in my AWS volumes section though. I was just trying to understand this and be sure that no one is trying to 'get in'... as there are many attempts daily. Thanks

    Read the article

  • php.ini settings change not taking effect for large file uploads

    - by user51347
    My server was just reprovisioned, and my application, which uploads large (100M+) files, now breaks after the re-installation. The symptom is quite consistent: smaller files (8MB in my tests) upload just fine, while larger files cut off at very nearly the same point every time on a given computer. A file that fails at 26% will fail at just about the same point every time, and one that takes 1:40 to fail will fail within 2 seconds of that every time. I have set my php.ini settings extravagantly: post_max_size = 512M upload_max_filesize = 512M max_input_time = 3600 max_execution_time = 3600 Is there possibly a setting at the Apache level which would override PHP?
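    Two quick checks worth sketching: confirm the PHP that Apache actually runs picked up those values (a phpinfo() page is the surest way, since the CLI may read a different php.ini), and look for an Apache-level cap on the request body; the directory path below is a placeholder and 0 means "no limit":

        php -i | grep -E 'Loaded Configuration File|post_max_size|upload_max_filesize'
        # in the relevant Apache vhost or directory block (0 = unlimited):
        # <Directory "/var/www/uploads">
        #     LimitRequestBody 0
        # </Directory>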

    Read the article

  • NHibernate - Stream large result sets?

    - by Dan Black
    Hi, I have to read in a large record set, process it, then write it out to a flat file. The large result set comes from a stored proc in SQL 2000. I currently have: var results = session.CreateSQLQuery("exec usp_SalesExtract").List(); I would like to be able to read the result set row by row, to reduce the memory footprint. Thanks

    Read the article
