Search Results

Search found 21702 results on 869 pages for 'large objects'.

  • Programmatically Determine Exchange Attachment Limit

    - by Jeff Ballard
    Is there any way to query the Exchange server to determine the maximum attachment file size? I'd be doing this in ASP.NET/C#. I'd like to validate that the file the user wants to attach is not over the limit before they attempt to send it to the server, as opposed to having the server send back an exception when it attempts to attach the file and discovers it is too large. I've also posted this question on stackoverflow.com - I figured a sysadmin for Exchange may have an answer as well as a developer. Hopefully I do not incur the wrath of the Stack Exchange gods.
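
    If the Exchange Management Shell is available, the organization-wide limits can be read directly; a C# app could shell out to it, or an admin could copy the value into app configuration. A minimal sketch, assuming Exchange 2007 or later (the cmdlets are standard, but connector- and mailbox-level limits can override these values):

        # Organization-wide message size limits
        Get-TransportConfig | Select-Object MaxSendSize, MaxReceiveSize

        # Per-mailbox overrides, if any ('jsmith' is a hypothetical alias)
        Get-Mailbox jsmith | Select-Object MaxSendSize, MaxReceiveSize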

  • Recurring unpack failed on git repo imported from svn

    - by xavier
    I have a git repo created from svn with git-svn. Everything converted just fine, but from time to time, when I try to git push, I get: error: unpack failed: unpack-objects abnormal exit. Other repos on our server (created from scratch or imported from svn) work fine. The workaround is usually to unstage, then commit and push the files one by one, modify the one that fails (e.g. add a whitespace or something) and commit it once again. It's obviously very irritating - for big commits it's a productivity killer, and it requires a lot of server pushes. I'd be grateful for any suggestions on where to look; I couldn't google anything up.
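
    A hedged first step is to rule out object-store corruption with standard git maintenance commands (the commands are real; whether they cure this particular push failure is an assumption):

        # On the server repository: verify the object store
        git fsck --full

        # Repack loose objects; damaged or oversized packs sometimes
        # surface as unpack-objects failures during a push
        git gc --aggressive

        # Then run the same two commands in the local clone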

  • nginx static file buffer

    - by Philip
    I have an NFS share which several frontend servers are connected to, to make the files stored on it available for HTTP downloads. It looks like I have problems with the way Apache is serving the files; there seems to be a very small buffer or no buffer at all, which results in a lot of disk seeks. I did some testing with loading the whole requested file into memory at once and serving it to the client from memory. With this technique I need fewer disk seeks per download stream. Since I don't want to implement this myself for production use, I thought I could maybe use nginx for it, because the documentation says it uses buffers for static file serving. Is it possible to increase the buffer size to a few MB, and if so, which config parameter do I have to change? Has anyone experience with large buffers for static file serving? Is there a better way to reduce disk seeks?
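
    For reference, nginx's userspace buffers for static responses are set with the output_buffers directive. A sketch (the sizes are illustrative, not tuned recommendations; note that output_buffers only applies when sendfile is off):

        # nginx.conf, http or server context
        sendfile       off;
        output_buffers 1 2m;    # one 2 MB buffer per connection

        # alternatively, let the kernel batch the writes:
        # sendfile on;  tcp_nopush on;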

  • How to warehouse data that is no longer needed from SQL Server

    - by I__
    I have been asked to truncate a large table in SQL Server 2008. The data is not needed but might be needed once every two years. It will NEVER have to be changed, only viewed. The question is: since I don't need the data on a day-to-day basis, what do I do with it to protect it and back it up? Please keep in mind that I will need it accessible maybe once every two years, and it is FINE for us if the recovery process takes a few hours. The entire table is about 3 million rows and I need to truncate it to about 1 million rows.
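
    One common pattern is to copy the cold rows into a separate archive database, back that up to a file you can shelve, and only then trim the live table. A T-SQL sketch (table, column and path names are hypothetical; verify row counts before deleting anything):

        -- Copy rows older than the cutoff into an archive database
        SELECT * INTO ArchiveDB.dbo.BigTable_Archive
        FROM dbo.BigTable WHERE CreatedDate < '2010-01-01';

        -- Back the archive up; a restore taking hours in two years is acceptable here
        BACKUP DATABASE ArchiveDB TO DISK = 'D:\Backups\ArchiveDB.bak';

        -- Only after verifying the backup, delete the archived rows
        DELETE FROM dbo.BigTable WHERE CreatedDate < '2010-01-01';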

  • Accidentally ejected my Verbatim drive and can't get the icon back

    - by Erin
    Hi, I have Time Machine running on my iMac (OS X v10.5.8) and also have a 1 TB Verbatim drive attached that I use as a workspace/scratch disk, so I can manipulate large music files before I transfer them. However, when cleaning behind my computer the other day I think I dislodged the connection (or maybe one of the kids hit the eject button, I don't know). I've re-booted many times and it's not reconnected. It doesn't appear in my Disk Utility window and I don't know how to get the icon back! I've looked in Time Machine but it doesn't appear there at all (because it's not supposed to, I think - it's not connected; my mate hooked it up for me and he won't return my calls!). Help. I don't know how to get it back! Sorry for being a plank.

  • Moving folders takes a long time in Windows 7

    - by acidzombie24
    What can I do to fix this? Maybe drop permission properties? Maybe not. I have a large folder with 100k files. I moved it into my archive folder and it's taking forever to move. Why is that? I know on XP it takes <1 sec, but not on Windows 7. I am sure it's a permission thing; is there a way I can disable it and make it faster?

  • How to fix missing icons in the taskbar and start menu?

    - by adrianbanks
    I installed Visual Studio 2012 yesterday and during the install my path somehow got screwed up1. Since then, the icons for applications that are part of Windows show the default "unknown" icon, but other icons are fine. The applications that the shortcuts link to launch fine when the icons are clicked on. (Screenshots of the taskbar and Start menu omitted.) I have fixed my path, but the icons still show incorrectly. Any ideas on how to flush what appears to be a set of cached icons? 1 Something took a path of A;B;C;D; and turned it into A;B;C;D;A;B;C;D;E;F; - duplicating a large part of it to the point that no more characters were available to type in the edit box in System Properties. This had the side effect of Windows reporting that it couldn't find %windir%.
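
    A commonly suggested fix is rebuilding the per-user icon cache; a sketch, assuming the stale cache lives in %localappdata%\IconCache.db as it does on a default Windows 7 install:

        REM Close the shell, delete the cache file, restart the shell
        taskkill /f /im explorer.exe
        del /a "%localappdata%\IconCache.db"
        start explorer.exe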

  • Can there be multiple monitors with these video cards and setup

    - by z_Zelman
    I have an Acer Aspire TimelineX 4830TG-6808 laptop running Windows 7 Ultimate, with one VGA port and one HDMI port. It has one Intel HD Graphics family card and one NVIDIA GeForce GT 540M. Right now I have one monitor running via the VGA port, plus the laptop's own display. When I look at the monitor setup, I see one unused video card available. Can I add another monitor via the HDMI port? If I cannot run three monitors, is there some way I could make a large "single" desktop that extends over two monitors? To clarify: maybe something like having a virtual machine window that extends over two monitors. Could this maybe come from third-party software?

  • How to split file on Windows 2003 using MS supported tool

    - by Rune
    Hi, is it possible to split a large file into smaller files on Windows 2003 using a tool provided/supported/sanctioned by Microsoft? I see that there are a lot of freeware tools (various zip tools) for this task, but I need to move files off of a production server, so I would like to avoid tools I don't know whether I can trust. I would much prefer a tool included in the Windows Server 2003 Resource Kit Tools or something along those lines. Does such a tool exist? Thank you.

  • Using my system as the server - need advice

    - by Ashwin
    We are deploying my web application on my system in JBoss, and I am planning to use my system as the server as well. There are no HTML pages or JSP pages deployed; the client requests resources and the server provides them in the form of objects. We are also using a database (PostgreSQL, 15 tables), which will also reside on my machine. My system configuration is Windows Vista, 2.3 GHz, 4 GB RAM, 32-bit. There can be many requests coming in at the same time. So is this configuration enough? Should we go with a different operating system like Windows Server? I have never used a Windows Server OS. How will it be different from other Windows operating systems? Or if you feel some other OS would be good in this situation, please suggest it.

  • Monitoring TCP/IP performance on Solaris

    - by Andy Faibishenko
    I am trying to tune a high-message-traffic system running on Solaris. The architecture is a large number (600) of clients which connect via TCP to a big Solaris server and then send/receive relatively small messages (.5 to 1K payload) at high rates. The goal is to minimize the latency of each message processed. I suspect that the TCP stack of the server is getting overwhelmed by all the traffic. What are some commands/metrics that I can use to confirm this, and in case this is true, what is the best way to alleviate this bottleneck?
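
    Two standard Solaris tools are a reasonable starting point (the commands below exist on Solaris 10; reading the counters as evidence of overload is the judgment call):

        # Per-protocol counters; a rising tcpListenDrop suggests the
        # listen queue is overflowing under the connection load
        netstat -s -P tcp

        # Kernel statistics for the tcp module, sampled every 5 seconds
        kstat -m tcp 5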

  • Comparing two columns of strings in Excel, adding a value for common variables

    - by overtime
    I'm comparing two large datasets containing strings in Excel. Column A contains the numbers 1-1,000,000. Column B contains 1,000,000 strings, neatly organized in the desired order. Column C contains 100,000 randomly organized strings that have identical values somewhere in column B. Example:

        A    B          C            D
        1    String1    String642
        2    String2    String11
        3    String3    String8000
        4    String4    String78

    What I'd like to do is find the duplicate values in columns B and C, then output into column D the column A value that corresponds with the string in column C. Desired output:

        A    B          C            D
        1    String1    String642    642
        2    String2    String11     11
        3    String3    String8000   8000
        4    String4    String78     78
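
    Since column A is simply the row number of column B, a single MATCH gives the answer directly (a sketch; it assumes the data starts in row 1 and the strings match exactly):

        In D1, then fill down:   =MATCH(C1, B:B, 0)

        Or, if column A may hold arbitrary values rather than row numbers:

        =INDEX(A:A, MATCH(C1, B:B, 0))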

  • Word table copy/drag formulae like Excel

    - by tumchaaditya
    I am inserting formulae into a Word table to get subtotals for rows and columns and a grand total. My problem is that I have a large number of rows and I don't want to enter the formulae manually in each cell. I cannot use =SUM(LEFT) because the subtotal does not cover all the columns. So, is there any way to drag the formulae like we do in Excel (bottom-right corner of the selected cell)? I cannot send the table to Excel and copy it back because that would ruin the formatting, which took a lot of my time.

  • What's a fast way to copy a lot of files from an internal hard-drive to external (USB) storage?

    - by jonathanconway
    I have a large amount of data - about 500 GB - on the internal hard drive of a desktop PC. This includes music, videos, PDFs... you name it. I want to copy everything to an external USB hard drive (1.5 TB capacity). The desktop PC runs Ubuntu. To begin with, I simply plugged in and mounted the hard drive and dragged the top-level folder onto the drive. It started copying, but it seems to be proceeding very slowly. About 10 minutes later it's only done about 500 MB. I'm sure this is slower than what I could achieve with less total data. So I'm wondering if there's a quicker way of doing this. Would it be better to copy it in portions of 500 MB or so, rather than all at once?
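
    One widely used alternative to a file-manager drag is rsync, which also lets an interrupted copy resume instead of starting over (a sketch; the source and mount-point paths are examples):

        # -a preserves permissions/times, -h prints human-readable sizes
        rsync -ah --progress /home/you/data/ /media/usb/data/

        # re-running the same command picks up where it left off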

  • Add SATA Port to Motherboard?

    - by YAS
    I recently took off the bottom covers of my laptop, an Aspire 6930, and one of the covers was hiding an empty space large enough for a second hard drive. The bit of motherboard that was showing had the solder joints for a SATA port, but no port. What I'm wondering is: if I get a spare SATA port and solder it in, would it mess up my motherboard and kill my laptop? I'm not concerned about a clean solder job - I can do that. The question is whether, even if the port is soldered in cleanly, there would be any danger in doing it. It'd be pretty darn awesome to get a second hard drive in my laptop.

  • Joining H264 *without* re-encoding

    - by jdmuys
    I have two halves of a single show in two .MP4 files, encoded in H264. I would like to join them without re-encoding. Is this possible? I managed to create a joined video as a QuickTime file (.mov) using QuickTime Pro, but then QuickTime Pro will not convert it back to .MP4 without re-encoding. This may be because, looking inside the .mov file, the two H264 videos are still in there separated as individual "objects". I am also struggling with MPEG Streamclip without reaching a real solution, but I may have missed something. Note that I do not have the same issue with MPEG2 files: I can export them to an .MPEG container or a .TS file, for example, and then join them without re-encoding using MPEG Streamclip. Any suggestion welcome, preferably using Mac software.
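
    If a command-line tool is acceptable, recent ffmpeg builds can join the halves with the concat demuxer and stream copying, i.e. no re-encoding (a sketch; it assumes both files share identical codec parameters):

        # list the parts in playback order
        printf "file '%s'\n" part1.mp4 part2.mp4 > parts.txt

        # -c copy copies the streams untouched; -safe 0 allows arbitrary paths
        ffmpeg -f concat -safe 0 -i parts.txt -c copy joined.mp4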

  • Is ext4 more expensive than NTFS?

    - by ???
    I have just converted an NTFS partition to ext4, but the total space seems to have dropped from 421G to 415G. Where did the 6G go? Also, the reserved space has grown to 199M in ext4, much larger than the 78M in NTFS - why? The partition is mainly used for movies/music, so most files are very large (10M each). I want to use the ext4 file system; is there any suggestion?

        mkfs.ntfs:  /dev/sdb4  421G   78M  421G  1%  /mnt/mmedia
        mkfs.ext4:  /dev/sdb4  415G  199M  393G  1%  /mnt/mmedia

    It's also weird that the remaining size of ext4 is 393G - shouldn't it be 415G or 414G? What happened to the disappeared 22G? Compared to NTFS, ext4 seems to have eaten 28G in total.
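
    For what it's worth, ext4 reserves 5% of blocks for root by default, which on a 415G filesystem is roughly the 21-22G gap in question; tune2fs can shrink that reservation (reasonable on a media-only partition):

        # reduce the root-reserved blocks from the default 5% to 1%
        tune2fs -m 1 /dev/sdb4

        # or reclaim them entirely on a pure data filesystem
        tune2fs -m 0 /dev/sdb4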

  • Nagios remote monitoring: NRPE Vs. SSH

    - by sam
    We use Nagios to monitor quite a few (~130) servers. We monitor CPU, disk, RAM and a few other things on each server. I've always used SSH to run the remote commands, purely because it requires little to no additional config on the remote server: just install nagios-plugins, create the nagios user and add the SSH key, all of which I've automated into a shell script. I've never actually considered the performance implications of using SSH over NRPE. I'm not too bothered about the load hit on the Nagios server (it's probably over-specced for what it does; it's never been over 10% CPU), but we run each remote check every 30 seconds and each server has 5 different checks performed. I assume SSH requires more resources for each check, but is there a huge difference? (I.e. enough of a difference to warrant the switch to NRPE.) If it's any help, we monitor a mix of physical servers (normally with 8, 12 or 16 physical cores) and Amazon EC2 medium/large instances.

  • A good log file analyzer for Windows

    - by Raminder
    Is there a text editor for Windows that can open the first n lines of a large file for me? It would be nice if it could also open a set of lines from the middle of the file. EDIT: Basically my requirement is that I want to analyze huge (2 GB) log files, so any good tool that can open huge files with some analysis capabilities (searching, text highlighting etc.) would be nice. I like Notepad++ but it wouldn't open even a file of about 650 MB. P.S. - Open-source tools will be preferred.
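
    Not an editor, but if PowerShell is installed it can stream slices of a huge file without loading the whole thing (standard cmdlets; the file name is an example):

        # first 1000 lines
        Get-Content big.log -TotalCount 1000

        # a window from the middle (still has to read up to that point)
        Get-Content big.log | Select-Object -Skip 499999 -First 101

        # search without opening an editor at all
        Select-String -Path big.log -Pattern "ERROR"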

  • Should I run my own MTA?

    - by Mascarpone
    I need to send a large volume of emails, roughly 60,000 per week. At the moment we outsource this service to a third party, and we expect to double our volume within the next 6 months. Since the service is starting to become too expensive, I was thinking about setting up our own MTA. Our own sysadmin told us it is not difficult at all to run our own MTA, but I'm afraid he might have oversimplified this. Is it difficult to handle an MTA? Should I be afraid that my MTA will lose the company's mail? Should I stay with a third-party service? P.S.: The emails have been collected respecting the local legislation on privacy, so no spam.

  • Multiple subnets behind SonicWall TZ 180

    - by Derek
    We have a SonicWall TZ 180 that acts as a VPN endpoint. Right now it has one WAN IP address and a /24 assigned to the LAN interface. Our mail cluster administrator asked if it was possible to add a second private class C behind the VPN. This second subnet would be available to the other network, and then we would use address objects and ACLs to limit access. Is this possible? I read up on PortShield but I don't know if that's what we would need to use, because we're pushing all data out of one physical port into a Cisco switch that has VLANs already set up. Addendum: It appears that PortShield will do what I want, with only one limitation: it requires a direct 1:1 relationship of PortShield to physical port. This would then limit us to 4 PortShields on 1 TZ 180. Is there a better solution than this?

  • How to delete "System Volume Information" folder from external drives?

    - by Nadude
    I'm running Vista Professional 32-bit on a Lenovo W500 ThinkPad. I have four external drives and use 4 different PCs, all of which have System Volume Information folders taking up lots of space, and I can't delete them. I don't even know which computer's files are backed up on which external drive. I've used ThinkVantage Rescue and Recovery to delete all backups, and checked the System Restore settings so that only my main C drive is used. I checked all the PCs to ensure only the main drive keeps system restore points, and deleted previous versions. I ran Disk Cleanup too. But I can't figure out how to get rid of these large folders from my external drives.

  • Fix Fatal Error Condition showing system path

    - by JMC
    I've noticed there are a large number of servers running Magento Commerce that will return a fatal error showing the system path:

        Fatal error: Uncaught exception 'Exception' with message 'File
        '/usr/local/www/magento/data1702/media/css' does not exists.' in
        /usr/local/www/magento/data1702/lib/Varien/File/Transfer/Adapter/Http.php:96
        Stack trace:
        #0 /usr/local/www/magento/data1702/get.php(205): Varien_File_Transfer_Adapter_Http->send('/usr/local/www/...')
        #1 /usr/local/www/magento/data1702/get.php(165): sendFile('/usr/local/www/...')
        #2 {main}
          thrown in /usr/local/www/magento/data1702/lib/Varien/File/Transfer/Adapter/Http.php on line 96

    Magento as an application is generally good about suppressing error messages. How can a Linux server running Apache be configured to avoid returning this error message, since the app has problems suppressing it?
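
    Since the trace leaks the path whatever Magento does, the usual fix is one level down, in PHP itself (standard php.ini directives; the log path is an example):

        ; php.ini - or php_admin_flag/php_admin_value in the Apache vhost
        display_errors = Off
        log_errors     = On
        error_log      = /var/log/php/error.log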

  • Terminal runs svn commands very slowly, how can I speed this up?

    - by Paul
    Spending all day in the terminal is beginning to get frustrating. We're working with large CakePHP projects, including a ton of schema files and complex controllers. Whenever I go into a project and enter svn up or svn ci, my system chokes. It takes a good 15-30 seconds before it returns what revision number I'm on. I'm running OS X 10.6 on a MacBook Pro. Is there any reasoning behind this? Any way I could fix this speed issue?

  • Linux: using the link command

    - by Xavier
    Here it goes. I have a folder, /data/backup, that does not contain much free space. I have been told that if I link that folder to a bigger area, for example /bigdata/backup, I will be able to run backups into /data/backup because it will be just a link: the data will be seen in both folders, but will actually live in /bigdata/backup, which has far more disk space, so the backups will no longer fail because of space problems in /data/backup. Is this true? Thanks, Xav
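
    What is being described is a symbolic link; a hard link cannot cross filesystems, so it has to be symbolic. A sketch of the usual sequence (assumes /bigdata is a separate, larger filesystem):

        # keep anything already in /data/backup
        mv /data/backup /data/backup.old

        # point /data/backup at the big area; writes now land on /bigdata
        ln -s /bigdata/backup /data/backup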
