Search Results

Search found 3618 results on 145 pages for 'huge'.

Page 4 of 145

  • SQL Server Backup modes, and a huge log file

    - by Matt Dawdy
    Okay, I'm not a server administrator, a network guy, or a DBA. I'm merely a programmer helping out a small company. They have an IT guy who isn't MS-centric (most of their stuff is on Mac), and he and I are trying to figure out a solution here. We've got one main database. We run nightly full backups. I know they are full backups because I can take the latest file, or any of the daily backups, go to a completely new machine, "restore" the backup to an empty database, and our app runs perfectly fine off it. The backups have grown from 60 MB to 250 MB over 4 months. When running, the log file is 1.7 GB, and the data file is only 200-300 MB. Yes, the recovery model is set to full. So, my question, after all of that: if we are keeping daily backups, and we don't have the need (or aren't smart enough) to roll the DB back to a point in time, am I really losing anything by changing the recovery model to simple? And if I do change it to simple, will it completely dump the log file, or at least reduce it way the hell down? And will that make our database run faster? I know it'll make my life easier when I copy a relatively recent backup to my local machine for development and testing...
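
    If simple recovery turns out to be acceptable, the usual sequence is to switch the recovery model and then shrink the log once. Below is a minimal sketch in Python via pyodbc; the database name MyAppDb and logical log file name MyAppDb_log are placeholders, not taken from the question. Under simple recovery you give up point-in-time restores between backups, but the nightly full backups stay restorable exactly as before.

        import pyodbc

        # Hypothetical server/database names; adjust to the real instance.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=master;Trusted_Connection=yes;",
            autocommit=True,  # ALTER DATABASE cannot run inside a transaction
        )
        cur = conn.cursor()
        cur.execute("ALTER DATABASE MyAppDb SET RECOVERY SIMPLE")
        cur.execute("USE MyAppDb")
        # Shrink the (now reusable) log down to roughly 100 MB.
        cur.execute("DBCC SHRINKFILE (MyAppDb_log, 100)")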

    Read the article

  • Huge Excel sheet taking too long to update links or calculate formulae

    - by user7231
    I have an Excel sheet with 5,000 rows and columns up to AY (12 MB in size). Except for the first 6 columns, the rest contain either VLOOKUPs or formulae, and all the VLOOKUPs reference a separate Excel file. I have changed the Excel settings to update links and calculate formulae manually. Now every time I try to update the links, Excel either hangs or takes something like 15 minutes. Any ideas on how I can get it done quickly?
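
    One way to take the lookups out of Excel entirely is to resolve them once, outside Excel, and keep only static values in the sheet. A hedged sketch with pandas; the file names and the shared key column "ID" are assumptions, not from the question:

        import pandas as pd

        # Join the main sheet to the lookup workbook once, instead of
        # recalculating thousands of live cross-file VLOOKUPs.
        main = pd.read_excel("main.xlsx")
        lookup = pd.read_excel("lookups.xlsx")
        merged = main.merge(lookup, on="ID", how="left")  # VLOOKUP equivalent
        merged.to_excel("main_resolved.xlsx", index=False)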

    Read the article

  • Email server for a huge number of subscribers

    - by bogha
    My company is thinking of providing a free email account for each of its customers. As a new company, we will assume our corporate email system will be MS Exchange Server, which will support about 1,000 employees. They are asking why not add the customer list as Exchange users too. My suggestion was to separate the two systems: for the corporate side we can use Exchange, but for customers (around 30,000) we should use a Linux-based system. My only argument was that Linux can handle an enterprise service at this scale where Microsoft may fail. What do you suggest? And if you agree with me on choosing Linux as the server platform, what do you suggest as an alternative to Exchange on Linux? Thank you.

    Read the article

  • Ubuntu 12.04 - Terminal - Huge/Large text on each command line [closed]

    - by gotqn
    Possible Duplicate: Is it possible to change my terminal window prompt text? I have been using Ubuntu 12.04 for a few days now (no previous Linux experience at all), and I have noticed that the prompt text on each command line is longer than in many examples I see online. For example, my prompt includes "gotqn-System-Product-Name", and I want to remove that part because it takes up too much space. What should I do to change this?

    Read the article

  • Looking for a new, free firewall (Sunbelt has a huge hole)

    - by Jason
    I've been using Sunbelt Personal Firewall v4.5 (previously Kerio). I've discovered that blocking Firefox connections in the configuration doesn't stop EXISTING Firefox connections (see my post here yesterday: http://superuser.com/questions/132625/sunbelt-firewall-4-5-wont-block-firefox). The "stop all traffic" option may work on existing connections, but I'm done testing, as I need to be able to be selective at any time. I was using the free version, so the "web filtering" option quit working after some time (it mostly blocked ads and popups), but I didn't use that anyway. I used the last free version of Kerio before finally having to go to Sunbelt, because Kerio had an unfixed bug where you'd eventually get the BSOD and have to reset Kerio's configuration and start over (configure everything again). So I'm looking for a new firewall. I don't like ZoneAlarm at all (no offense to its users who may be here; personal taste). I need the following (Sunbelt has all of these, except the items marked with *):

    1. Be able to block in/out to localhost (trusted)/internet selectively for each application with a click (so there are 4 checkboxes for each application) [*this should take effect immediately, regardless of what's already connected]. When a new application attempts a connection, you get an allow/deny/remember window.
    2. Be able to easily set up filter rules for 'individual application'/'all applications', by protocol, port/address (range), local, remote, in, out. [*Adding a filter rule also doesn't block existing connections in Sunbelt. That needs to work too.]
    3. Have an easy-to-get-to way to "stop all traffic" (like a right-click option on the icon in the task bar).
    4. Be able to set trusted/internet in/out blocked/allowed (4 settings per item) for each of IGMP, ping, DNS, DHCP, VPN, and broadcasts.
    5. Define localhost as trusted/untrusted, and define adapter connections as trusted/untrusted.
    6. Block incoming connections during boot-up and shutdown.
    7. Show existing connections, including local & remote IP/port, protocol, current speed, total bytes transferred, and local ports opened for listening.
    8. An intrusion prevention system which blocks (optionally, selectable individually) known intrusions (a long list).
    9. Block/allow applications from starting other applications (with a deny/allow/remember window).

    Wish list: a way of knowing what svchost.exe is doing, i.e. who is actually using or calling it. I allowed it for localhost, and selectively allowed it for the internet each time the allow/deny window came up. Thanks for any help/suggestions. (I'm using Windows XP SP3.)

    Read the article

  • Huge HDD response time in Resource Monitor

    - by Mille
    I just bought all the parts for a computer, put it together, and installed a fresh copy of Windows 7. After a while of use the computer gets very slow, and even shutting down Windows can take several minutes. I looked in Resource Monitor and thought I had found the answer watching my HDD. The thing is, the HDD completes all the tests in Seagate's SeaTools for Windows successfully, which makes me doubt the diagnosis and whether I can send it in for a replacement. Any suggestions on what it could be and what I can do about it? A screenshot from Resource Monitor accompanied the original post.

    Read the article

  • MySQL: migrating a huge DB from InnoDB to NDB Cluster fails with "The table is full"

    - by Nguyen Trong Nhan
    I'm trying to migrate an old database to MySQL Cluster (4 data nodes) using the command ALTER TABLE sample ENGINE=NDBCLUSTER, but I'm getting the following error: The table '#sql-7ff3_3' is full. There are approximately 300 million rows in this table. Here are my config files:

    /mysql-cluster/config.ini:
        [NDBD DEFAULT]
        NoOfReplicas=2
        DataDir=/data/mysql-cluster/ndb/
        BackupDataDir=/data/mysql-cluster/backup/
        DataMemory=10G
        IndexMemory=5G
        TimeBetweenLocalCheckpoints=6
        FragmentLogFileSize=256MB
        NoOfFragmentLogFiles=50
        MaxNoOfOrderedIndexes=8000
        MaxNoOfConcurrentOperations=100000
        MaxNoOfTables = 10000
        RedoBuffer=128M
        MaxNoOfAttributes=5000
        MaxNoOfUniqueHashIndexes=1024

    /etc/my.cnf:
        [mysqld]
        basedir=/usr/local/mysql
        datadir=/data/mysql-cluster/mysqld/
        event_scheduler=on
        default-storage-engine=ndbcluster
        ndbcluster
        ndb-connectstring=192.168.x.x,192.168.x.x
        innodb_file_per_table
        innodb_buffer_pool_size = 512MB
        key_buffer = 512M
        key_buffer_size = 512M
        sort_buffer_size = 512M
        table_cache = 1024
        read_buffer_size = 512M
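
    "The table is full" from NDB usually means the copy exhausted DataMemory/IndexMemory or per-transaction resources: a single ALTER copies all 300 million rows in one operation. If the memory settings are in fact adequate, one hedged workaround is to create an empty NDB copy of the table and move rows over in primary-key batches. A sketch in Python with MySQL Connector/Python; the id column and the credentials are illustrative assumptions. Once the copy completes, RENAME TABLE can swap the two tables in one step.

        import mysql.connector

        conn = mysql.connector.connect(host="192.168.x.x", user="root",
                                       password="secret", database="mydb")
        cur = conn.cursor()
        cur.execute("CREATE TABLE sample_ndb LIKE sample")
        cur.execute("ALTER TABLE sample_ndb ENGINE=NDBCLUSTER")  # empty: fast
        # Keep batches well under MaxNoOfConcurrentOperations (100000 here).
        batch, last_id = 10000, 0
        while True:
            cur.execute("INSERT INTO sample_ndb SELECT * FROM sample "
                        "WHERE id > %s ORDER BY id LIMIT %s", (last_id, batch))
            conn.commit()
            if cur.rowcount == 0:
                break
            cur.execute("SELECT MAX(id) FROM sample_ndb")
            last_id = cur.fetchone()[0]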

    Read the article

  • Why the huge discrepancy in size between two similar zip files

    - by twpc
    I use WinZip to zip entire directories of code and send them to a fellow programmer. He makes changes and sends the directories of code back to me. Ignoring the fact that this is not a good way to keep the code clean when we are both working on it, I notice that his zip files are far smaller than mine with basically the same data inside (mine are around 36,000 KB, his around 2,000 KB). I believe he is also using WinZip. What's going on here, and how can I make mine "more compressed"?
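
    Two usual suspects: the archives were built with different compression methods or levels (WinZip can store files uncompressed, or deflate at various levels), or one tree contains already-compressed artifacts (binaries, images, nested archives) that the other lacks. As a point of comparison, here is a minimal Python sketch that zips a source tree with DEFLATE at maximum level while skipping common already-compressed extensions; the SKIP list and the src directory are illustrative assumptions:

        import zipfile
        from pathlib import Path

        SKIP = {".zip", ".jar", ".png", ".dll", ".exe"}
        with zipfile.ZipFile("code.zip", "w",
                             compression=zipfile.ZIP_DEFLATED,
                             compresslevel=9) as zf:
            for path in Path("src").rglob("*"):
                if path.is_file() and path.suffix.lower() not in SKIP:
                    zf.write(path, path.relative_to("src"))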

    Read the article

  • HUGE MAC FILTER and scripting

    - by user195917
    I set up a DHCP server on CentOS, and I apply a MAC filter for my clients. With a small number of clients (max 10) that's not hard, but what will I do with 2,000 clients? My idea was to create a list (e.g. "macfilter.lst") and have this list updated from a database. I have two questions. First: how do I create a filter in iptables that takes its info from a file (hosted on the server)? Second: any idea how to write a script that updates a file from a database? Thanks so much for your help.
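
    A hedged sketch of both pieces in one Python script: pull the active MACs from a database, rewrite macfilter.lst, then rebuild a dedicated iptables chain that accepts listed sources and drops the rest. The table schema and credentials are illustrative assumptions; hook the chain into INPUT once with "iptables -A INPUT -j MACFILTER" and run the script from cron to keep it synced.

        import subprocess
        import mysql.connector

        # 1) Refresh macfilter.lst from the database (hypothetical schema).
        conn = mysql.connector.connect(host="localhost", user="dhcp",
                                       password="secret", database="clients")
        cur = conn.cursor()
        cur.execute("SELECT mac FROM clients WHERE active = 1")
        macs = [row[0] for row in cur.fetchall()]
        with open("/etc/macfilter.lst", "w") as f:
            f.write("\n".join(macs) + "\n")

        # 2) Rebuild the MACFILTER chain from the list.
        subprocess.run(["iptables", "-N", "MACFILTER"], check=False)  # may exist
        subprocess.run(["iptables", "-F", "MACFILTER"], check=True)
        for mac in macs:
            subprocess.run(["iptables", "-A", "MACFILTER", "-m", "mac",
                            "--mac-source", mac, "-j", "ACCEPT"], check=True)
        subprocess.run(["iptables", "-A", "MACFILTER", "-j", "DROP"], check=True)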

    Read the article

  • Robocopy (without /l flag) running for hours making log file huge, but not actually copying anything

    - by Mickster
    Here's my command: Robocopy C: C:\C_root /FP /BYTES /TEE /S /E /COPYALL /DCOPY:T /MOVE /Z /ETA /XJ /R:2 /W:30 /XF pagefile.sys /XD /LOG:C:\robocopy.log. Notice the /XD option: after it I listed a few directories I wanted to omit, written between angle brackets. Among these was the C_root directory itself, so the copy didn't recurse into itself "infinitely". (That list was stripped from the post because angle brackets apparently have a meta-meaning on superuser.com for hyperlinks.) The command window this was running in listed a few "EXTRA" files, then hung: no more output, no command prompt, and if I tried to scroll it up, it would immediately scroll right back to the bottom. After about six hours it finally finished, although I never got a command prompt back in the window I started it in. DIR shows the log file at more than 1.3 GB, but when I try to do a MORE on it, I get "Cannot access file". C:\C_root never grew larger. Does anyone have an idea what is going on here?

    Read the article

  • Huge discrepancy in Inkscape file size

    - by Keyran
    When using Inkscape to create many pictures with common elements, I tend to copy the first SVG file I have created as many times as I need pictures, and then edit the copies. If I reuse files across projects, a file can end up having been copied and modified tens to hundreds of times. I have recently realized that the latest copies are between 29 and 60 MB in size, slowing my computer down significantly. My pictures are very simple; nothing that would normally go over 1 MB in size. As an experiment, I copied the entire content of one of the latest files into a new Inkscape file. I am certain that I copied the content of the file entirely (I have only one layer and I used the "Select All" option). The new file has a size of 102.2 KB. This would indicate that about 30 MB of data per file is irrelevant to me. What could be the cause of this size difference? Is there a way to reduce the size of a file without having to copy the content into a new file? I am using Inkscape 0.48.4 on Debian Unstable. Thanks for any input you might be able to provide!
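
    A likely cause: Inkscape keeps every gradient, filter, marker, and clip path ever used in the file's <defs> section, and those survive copy after copy even when nothing on the canvas references them anymore. "Select All" copies only canvas objects, not <defs>, which would explain why pasting into a fresh file shrank it to 102.2 KB. Inkscape 0.48 can strip unused definitions in place, via File > Vacuum Defs in the GUI or from the command line; a small batch sketch (the project/*.svg glob is an assumption):

        import glob
        import subprocess

        # Remove unused <defs> from every SVG in the project, in place.
        for path in glob.glob("project/*.svg"):
            subprocess.run(["inkscape", "--vacuum-defs", path], check=True)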

    Read the article

  • huge C file debugging problem

    - by valdo
    Hello all. I have a source file in my project with more than 65,536 lines of code (112,444 to be exact). I'm using the SQLite amalgamation, which comes as a single huge source file, and MSVC 2005. The problem arrives during debugging. Everything compiles and links OK, but when I try to step into a function with the debugger, it shows an incorrect code line. What's interesting is that the difference between the correct line number and the one the debugger shows is exactly 65,536. This makes me suspect (in fact, be almost sure of) an unsigned short overflow. I also suspect that it's not a bug in MSVC itself; perhaps it's a limitation of the debug information format. That is, the debug information format used by MSVC stores line numbers as 2-byte shorts. Is there anything that can be done about this (apart from cutting the huge file into several smaller ones)?
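
    The overflow theory checks out arithmetically: if the debug format stores lines in 16 bits, everything past line 65,536 wraps around, and the debugger shows the real line modulo 2^16. A quick Python sanity check (the 46,908 figure is just 112,444 minus 65,536):

        # Map a wrapped 16-bit line number back to the real line.
        def real_line(reported: int, wraps: int = 1) -> int:
            return reported + wraps * 2**16

        print(real_line(46908))  # 112444, the question's last line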

    Read the article

  • How to upload a huge (multi-GB) file to a webserver

    - by Aman Jain
    Hi, I want to create a web interface that will allow users to upload huge files. These files are actually VMDK files (virtual machine disks), and they can be multiple GB in size. I think this makes the situation different from normal file uploads (10-20 MB files). Any advice on how to achieve this in an efficient and fault-tolerant way is appreciated. Thanks, Aman Jain
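
    For multi-GB uploads the usual approach is chunking: send the file in fixed-size pieces so a failed piece can be retried instead of restarting the whole transfer, and so no single request body is gigabytes long. A hedged client-side sketch in Python with requests; the /upload endpoint is hypothetical, and the server must understand Content-Range (or a similar resumable protocol):

        import os
        import requests

        CHUNK = 8 * 1024 * 1024  # 8 MB per request

        def upload(path, url):
            size = os.path.getsize(path)
            with open(path, "rb") as f:
                offset = 0
                while offset < size:
                    data = f.read(CHUNK)
                    end = offset + len(data) - 1
                    headers = {"Content-Range": f"bytes {offset}-{end}/{size}"}
                    # Retry logic around this call is where fault tolerance lives.
                    requests.put(url, data=data, headers=headers).raise_for_status()
                    offset += len(data)

        upload("disk.vmdk", "https://example.com/upload")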

    Read the article

  • Deleting huge chunks of data from MySQL InnoDB

    - by ming yeow
    I need to delete a huge chunk of the data in my production database, which is about 100 GB in size. If possible, I would like to minimize downtime. My selection criteria for deleting are likely to be: DELETE FROM POSTING WHERE USER_ID = 5 AND UPDATED_AT < 100. What is the best way to delete it? Build an index first? Write a script that deletes sequentially, paginating through the rows 1,000 at a time?
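
    Batched deletes are the standard way to keep this from locking the table for hours: each small transaction commits quickly, and replication (if any) never falls far behind. An index on (USER_ID, UPDATED_AT) lets every batch seek instead of scan. A hedged sketch in Python; the column names follow the question's pseudo-SQL and the credentials are placeholders:

        import mysql.connector

        conn = mysql.connector.connect(host="localhost", user="root",
                                       password="secret", database="prod")
        cur = conn.cursor()
        while True:
            cur.execute("DELETE FROM POSTING "
                        "WHERE USER_ID = 5 AND UPDATED_AT < 100 LIMIT 1000")
            conn.commit()  # short transactions keep locks and undo logs small
            if cur.rowcount == 0:
                break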

    Read the article

  • Opening a huge .csv file with Excel Interop using C#

    - by user262102
    Hi, I have an application that writes huge .csv files, ranging from 1 GB to 2 GB in size. I need to color-code the file and save it as .xlsx, so I tried using Excel Interop. It works great for small files, but when I try to open a 1.3 GB .csv file with Excel, I get an HRESULT error. Any ideas on how I could accomplish this task, either with Excel or in some other way? Thanks!
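
    Worth checking first: an .xlsx sheet holds at most 1,048,576 rows, and a 1.3 GB CSV almost certainly exceeds that, so Excel (Interop or otherwise) cannot open it as one sheet; the data has to be split. If stepping outside the C#/Interop route is an option, a streaming writer avoids loading the file into memory at all. A hedged sketch with Python's openpyxl in write-only mode; the file names and the "ERROR" coloring rule are illustrative:

        import csv
        from openpyxl import Workbook
        from openpyxl.cell import WriteOnlyCell
        from openpyxl.styles import PatternFill

        RED = PatternFill(fill_type="solid", start_color="FFFF0000")
        wb = Workbook(write_only=True)  # streams rows instead of caching them
        ws = wb.create_sheet()
        with open("big.csv", newline="") as f:
            for row in csv.reader(f):
                cells = []
                for value in row:
                    cell = WriteOnlyCell(ws, value=value)
                    if value == "ERROR":  # illustrative color-coding rule
                        cell.fill = RED
                    cells.append(cell)
                ws.append(cells)
        wb.save("big.xlsx")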

    Read the article

  • Huge Graph Structure

    - by Harph
    I'm developing an application in which I need a structure to represent a huge graph (between 1,000,000 and 6,000,000 nodes, with on the order of 100 to 600 edges each) in memory. The edge representation will contain some attributes of the relation. I have tried a memory-map representation, arrays, dictionaries, and strings to represent that structure in memory, but it always crashes because of the memory limit. I would like some advice on how I can represent this, or something similar. By the way, I'm using Python.
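
    At this scale, per-object structures (dicts of dicts, node objects) cost hundreds of bytes per edge, which is what exhausts memory; flat typed arrays cost 12 bytes per edge in the sketch below. A hedged CSR-style adjacency with NumPy, using small random stand-in data (the single float attribute is an assumption; pack more attributes as extra parallel arrays):

        import numpy as np

        num_nodes, num_edges = 1_000_000, 5_000_000  # scale up as needed
        src = np.random.randint(0, num_nodes, num_edges, dtype=np.int32)
        dst = np.random.randint(0, num_nodes, num_edges, dtype=np.int32)
        attr = np.random.rand(num_edges).astype(np.float32)

        # Sort edges by source, then index them with an offsets table:
        # the neighbours of node v live in dst[offsets[v]:offsets[v + 1]].
        order = np.argsort(src, kind="stable")
        src, dst, attr = src[order], dst[order], attr[order]
        offsets = np.zeros(num_nodes + 1, dtype=np.int64)
        np.cumsum(np.bincount(src, minlength=num_nodes), out=offsets[1:])

        def neighbours(v):
            s, e = offsets[v], offsets[v + 1]
            return dst[s:e], attr[s:e]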

    Read the article
