Search Results

Search found 11197 results on 448 pages for 'handle leak'.

Page 220 of 448

  • What antivirus software supports updates without an internet connection?

    - by Michael Gundlach
    I'm putting antivirus software on Windows 7 computers in the middle of Africa. The computers don't have internet access, but still need to be protected against viruses from CDs and thumb drives. Separate from these computers is one computer that does have extremely spotty internet access. What's the best AV software for this situation? The important part, as I see it, is that we need to keep the computers up to date, but can't let the AV software suck down updates at its leisure: the computers are disconnected, and getting emails onto the connected computer is challenge enough. We thought we might transfer update files to the connected computer using a protocol that can handle repeated connection drops (e.g. FTP with resume), then manually apply the update files to the disconnected computers. Does any AV software support this? Is there a better solution?
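    As a sketch of the transfer half of that plan: plain HTTP range requests resume interrupted downloads just as well as FTP with resume. A minimal Python sketch of the resumable-download side (the update URL and filename are placeholders, and the server must support range requests):

        import os
        import urllib.request

        def resume_download(url, dest, chunk=64 * 1024):
            # start appending where the last interrupted attempt stopped
            start = os.path.getsize(dest) if os.path.exists(dest) else 0
            req = urllib.request.Request(url, headers={"Range": "bytes=%d-" % start})
            with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
                if start and resp.status != 206:
                    raise RuntimeError("server ignored the Range header")
                while True:
                    data = resp.read(chunk)
                    if not data:
                        break
                    out.write(data)

        resume_download("http://updates.example.com/av-defs.zip", "av-defs.zip")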

  • Microsoft fax server not seeing new modem

    - by Tim Meers
    I recently added a new modem (a plain ol' consumer-grade one) to a fax server that's been up and running for years on Microsoft Server 2003 fax services. The server currently has two modems; the new one is identical to one of the existing ones. After installing the new modem, it showed up in the Fax Server Manager as a device but was not doing outbound faxes. (The server by default does not handle incoming.) After a reboot, the server no longer sees the modem in the Fax Server Manager, though it is still listed as a device in Device Manager. I've tried restarting just the fax service and rebooting the whole box again, but to no avail. Anyone have any ideas on this one? Or any good links to resources for the fax service?

  • Hyper-V stuck restoring VM

    - by Blax
    I have a Hyper-V server that runs 5 virtual machines. I believe the physical box rebooted last night, which is usually not a problem, but this morning one of the virtual machines is stuck at "restoring 0%". I have rebooted the physical machine and get the same thing: 4 virtual machines come up fine, and the 5th gets stuck at "restoring 0%". I right-clicked the VM in Hyper-V Manager and selected "Cancel restore", and nothing happens. I was able to copy the VHD to another Hyper-V box and light it up there, so I know the VHD is good. Any ideas on what to look at next? If I can, I would like to just dump the saved state and move on, but if there is a better way to handle it I'm all ears. TIA!

  • Solutions for scheduling cron jobs

    - by Shamsul
    I have set up a list of cron jobs. Some of the scripts take a long time, 1-5 hours, and that is increasing day by day. I do not want to run two cron scripts at the same time; this is not because of dependencies, but because my server does not have enough memory to handle two big operations at once. So I need a solution that keeps a scheduled script from starting until the previous one has finished. I have 10-15 cron jobs in the list, and 5 of them must not overlap. Can anyone help me find a solution?
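    A common way to get this behavior without a real scheduler is a non-blocking lock around each heavy job. A minimal Python sketch, assuming a wrapper script per job (the lock path and command are placeholders):

        import fcntl
        import subprocess
        import sys

        LOCK_PATH = "/var/lock/heavy-cron.lock"  # one shared lock for the 5 jobs

        with open(LOCK_PATH, "w") as lock:
            try:
                # non-blocking exclusive lock; fails fast if another job holds it
                fcntl.flock(lock, fcntl.LOCK_EX | fcntl.LOCK_NB)
            except BlockingIOError:
                print("previous job still running; skipping this run")
                sys.exit(0)
            subprocess.run(["/usr/local/bin/heavy-report.sh"], check=True)

    The util-linux flock(1) command wraps the same idea directly in a crontab line, e.g. flock -n /var/lock/heavy-cron.lock /usr/local/bin/heavy-report.sh.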

  • Comparing outgoing mail options: self-hosted vs. Google Apps in my situation

    - by Hoofamon
    I am setting up a small site that needs to send mail programmatically from time to time: password resets and things like that. I'm already using Google Apps to handle my domain's email, so now I'm left with the decision between setting up basic SMTP services on my server to send these mails, OR having the software use Google Apps to send mail. I like the Google Apps option as it's less stuff to worry about and seems to fit my needs; I'm trying to figure out what the downsides would be. Thanks for any thoughts.
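    For reference, the Google Apps route is just authenticated SMTP on Google's standard submission endpoint. A minimal Python sketch (addresses and credentials are placeholders):

        import smtplib
        from email.message import EmailMessage

        msg = EmailMessage()
        msg["From"] = "noreply@example.com"
        msg["To"] = "user@example.com"
        msg["Subject"] = "Password reset"
        msg.set_content("Use this link to reset your password: ...")

        # Google's SMTP submission endpoint with STARTTLS
        with smtplib.SMTP("smtp.gmail.com", 587) as server:
            server.starttls()
            server.login("noreply@example.com", "app-password")
            server.send_message(msg)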

  • Google Chrome not using local cache

    - by Steve
    Hi. I've been using Google Chrome as a substitute for Firefox, which can't handle having lots of tabs open at the same time. Unfortunately, it looks like Chrome has the same problem. Freakin' useless. I had to end Chrome as my whole system had slowed to a crawl. When I restarted it, I opted to restore the tabs that were last open. At this stage, every one of the 20+ tabs started re-downloading the pages they had previously had open. My question is: why can't they open a locally stored/saved copy of the web page from cache? Does Google Chrome store pages in a cache? Also: after most of the pages had completed their downloading, I clicked on each tab to view the page. Half of them only display a white page, and I have to reload the page manually. What is causing this? Thanks for your help.

  • Postfix appears to ignore domain's MX records

    - by DisgruntledGoat
    On my dedicated server, I have Postfix installed for sending email from the websites. One of my clients hosts their email with a third party, so we have MX records set up on the domain. However, they do not receive any emails that Postfix sends from the server. I think that since the domain points at the server itself, Postfix tries to deliver the mail to itself, but there is nothing on the server to handle email for that domain. (There are mail accounts for other domains which are working fine.) How do I get Postfix to use the domain's MX records to send email? The server is Ubuntu 8.10 with a standard LAMP stack. I have Webmin installed, and a control panel called "Matrix" provided by the host.
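    One cause that fits these symptoms: if the domain is listed as a local destination, Postfix delivers to itself and never consults the MX records. A hedged main.cf sketch of the usual fix (hostname values are placeholders):

        # the client's domain must NOT appear here (or in virtual_alias_domains);
        # any domain listed is delivered locally instead of via its MX records
        mydestination = $myhostname, localhost.$mydomain, localhost

    followed by a "postfix reload".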

  • Multiple Java Apps on single Server

    - by kdssea
    What is the best way to host potentially dozens of fairly trivial Java web applications on a single machine? These will be for different clients, so having them isolated from each other is important: if one goes down, they shouldn't all go down. I realize Tomcat can handle multiple apps on its own, but having them all run in a single Tomcat instance sounds a little scary to me. An instance of Tomcat per app also seems silly, since these apps are likely to be fairly basic. Any thoughts?
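    For what it's worth, per-app instances can be lighter than they sound: separate instances can share one Tomcat install, with the binaries in CATALINA_HOME and a small per-client CATALINA_BASE. A minimal sketch (paths are placeholders; each instance needs unique ports in its conf/server.xml):

        # shared binaries, per-client state (conf/, logs/, webapps/)
        export CATALINA_HOME=/opt/tomcat
        export CATALINA_BASE=/srv/clients/acme/tomcat
        $CATALINA_HOME/bin/startup.sh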

  • Is it better to combine Apache for file manipulation and uploads with Nginx for static file serving, or to use one of the two alone?

    - by user1032393
    Based on my research, I've read that nginx is ideal for serving static files and images. My application depends heavily on uploading images, rewriting them, and then serving them up. Given that I only have one VPS currently, it has been suggested that I use nginx to serve up the images and the website, and reverse-proxy to Apache (on the same VPS) to rewrite files with ImageMagick and handle the file uploads. Which would be the best solution: Apache, Nginx, or Apache + Nginx? In terms of best solution, I'm looking for minimal average RAM consumption while maintaining a decent load speed, maybe sub-2 seconds.
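    A hedged sketch of the suggested split, with nginx serving everything static and proxying only the upload/ImageMagick work to Apache on a local port (the port and paths are assumptions):

        server {
            listen 80;
            root /var/www/site;

            location /images/ {
                expires 30d;                       # static files served by nginx
            }
            location /upload {
                proxy_pass http://127.0.0.1:8080;  # Apache handles uploads/rewrites
            }
        }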

  • Keeping folder of files in sync over 3 machines

    - by Wizzard
    Morning. I've got 3 machines that have user content on them, which I need to keep in sync. This is a 3-way sync. Currently I run rsync, but we just don't handle deletes. I have looked at something like Gluster, but that seems a little over the top. Is there any other software out there to do a 3-way sync, or a good network file system? This is for web servers, so we don't want a slow, IO-hungry process. 3 servers: user content could be added to one and needs to be moved to the other two.
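    If one node can be treated as authoritative, rsync already handles deletes via --delete, which removes receiver files that no longer exist on the sender (hosts and paths below are placeholders):

        rsync -az --delete /srv/content/ user@server2:/srv/content/
        rsync -az --delete /srv/content/ user@server3:/srv/content/

    For a true 3-way sync where any node may originate changes, that mirror would wipe files created on the other two; a state-tracking tool such as Unison or csync2 is the safer direction there.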

  • "private" directory not accessible in Apache

    - by janeden
    The directory private lives under my DocumentRoot, and despite its name, it should be accessible just like any other dir. But if I add the following RewriteRule to httpd.conf: RewriteRule ^/([^\.]+)$ /$1.html [L] Apache returns 403 for http://server/private/2201. The error log states client denied by server configuration: /private/2201.html If I then rename private to foo, or if I request 2201.html directly, the file is served: 127.0.0.1 - - [21/Nov/2011:10:24:45 +0100] "GET /private/2201 HTTP/1.1" 403 214 127.0.0.1 - - [21/Nov/2011:10:24:58 +0100] "GET /foo/2201 HTTP/1.1" 200 3068 127.0.0.1 - - [21/Nov/2011:10:27:39 +0100] "GET /private/2201.html HTTP/1.1" 200 3068 This is confusing. Is there any special rule for directories named private? If so – why does the direct request for 2201.html work (although the denied request seems to handle the same resource, at least according to the error log entry)?
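    One explanation that fits the log: in server context, mod_rewrite checks whether the substitution matches an existing filesystem path, and on systems where /private exists at the filesystem root (OS X, for instance), /private/2201.html is served from there and denied by the default access rules instead of being mapped under the DocumentRoot; /foo has no such counterpart. If that is the cause, forcing the result back through URL mapping with the PT flag should help (a hedged sketch):

        RewriteRule ^/([^\.]+)$ /$1.html [PT,L]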

  • 1K incoming HTTP POST requests per second, each with a 10-50K file

    - by Blankman
    I'm trying to figure out what kind of server setup I will need to support 1K HTTP POST requests per second, where each post contains an XML file between 5 and 50 KB (average of 25 kilobytes). Even if I get a 100 Mb/s connection with my dedicated box (they usually give 10 Mb/s, but you can upgrade), by my calculations that is about 12,500 KB/s, which means about 480 25 KB files per second. So this means I need around 3 servers, each with a 100 Mb/s connection. Would a single server running HAProxy be able to redirect the requests to the other servers, or does this mean I need something that can handle more than 100 Mb/s to proxy things out to the other servers? If my math is off I'd appreciate any corrections.
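    A quick check of the numbers (payload only, ignoring HTTP and TCP overhead):

        1,000 requests/s x 25 KB = 25,000 KB/s, or roughly 200 Mb/s
        one 100 Mb/s link = 12,500 KB/s, or 500 files/s at 25 KB each
        1,000 / 500 = 2 fully saturated links, so 3 servers leaves headroom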

  • Clients can make a maximum of only 15 connections to Ubuntu custom server

    - by gtan
    I have a custom server written in C# running on Ubuntu 9 under Mono. I can make up to 15 Silverlight clients connect to the server. When I make the 16th, it just waits; if I close one of the established connections, the 16th client is able to connect. I am making the connections from one machine. I am also not exceeding any file handle limit: the limit is 1024 and I am at around 300. Any ideas how to make more connections? Also, why the number 15? Is it something Linux-specific? EDIT: I have run the same server on an Ubuntu 11.10 virtual machine and was able to make more than 15 connections, so I presume it's a configuration problem on the Ubuntu 9 machine. Any help on that?

  • Resizing 2 partitions (NTFS and ReiserFS3)

    - by steven
    When creating a Win7 and Gentoo dual-boot setup, I misallocated the space needed for Windows and Linux. I have a 320 GB drive and created a 40 GB partition for Win7, using the rest of the space for Linux. Now I need about 70 GB on the NTFS partition. Are there any tools that will shrink the ReiserFS3 partition (it is using about 80 GB and has the rest free) while growing the NTFS partition? If I have to clone, does the tool copy free space inside the image? I would prefer this not happen, as I'm short on backup space. [I can handle 100-150 GB of images, but I can't copy the entire HD.]

  • Will 5 Terabyte NAS drive be compatible with Windows XP SP3 32 bit?

    - by TrevorBoydSmith
    (NOTE: The operating system (in this case Windows XP SP3 32-bit) we are using is not a choice.) I am trying to set up a short-term storage device. First, I found a large 5-terabyte NAS drive that would, IMO, fulfill my storage requirements. Second, I also found that Windows XP seems to have a hard drive size limit (see 'Is there a limit to the size of a hard drive for Windows XP pre-SP1?'): "XP should handle up to 2 TB per volume after the service packs are applied. You are correct. There was a 137 GB limit on the original pre-service-pack Windows XP. This was addressed/fixed in SP1." My question is: will my Windows XP SP3 32-bit machine see the 5-terabyte NAS and be able to read/write properly to the NAS drive?

  • Apache: multiple domains handling

    - by cache
    So I use the following schema to handle multiple sites on my Apache:

        <VirtualHost 192.168.1.100:80>
            # get the server name from the Host: header
            UseCanonicalName Off
            VirtualDocumentRoot /var/www/%0/docs
            VirtualScriptAlias /var/www/%0/cgi-bin
        </VirtualHost>

    Therefore, if a client goes to www.example.com, it will actually point to /var/www/www.example.com/docs/, which is good. However, what if the client goes to example.com? It will point to /var/www/example.com/docs, which is not what we want. So my question is: is there a better schema for that? Or what should I do to fix the issue? Thanks!
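    A hedged sketch of one common fix: canonicalize bare domains onto their www. host with a redirect in the same vhost, before VirtualDocumentRoot maps the request:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    A simpler alternative is a symlink per site, e.g. /var/www/example.com pointing at /var/www/www.example.com.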

  • How to prefer ipv6 over ipv4 only for specific websites?

    - by kria
    I only have IPv6 connectivity via an HE tunnel on my router, so normally I want to prefer IPv4 over IPv6. For some websites, however, I would like to prefer IPv6. Right now I have just set DisabledComponents to 0x20 and hard-coded the IPv6 addresses into my hosts file for the sites I want to access over IPv6. Since these IP addresses change at times, this is not a good solution. Any ideas on how to handle this in a non-clunky way? Some kind of Chrome/Firefox add-on might do the trick, but I couldn't find one for this purpose.
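    One non-clunky variant of the hosts-file approach is simply to automate it: re-resolve the AAAA records on a schedule and regenerate the relevant lines. A minimal Python sketch (the site list and marker are assumptions; it must run with administrator rights, e.g. from Task Scheduler):

        import socket

        SITES = ["example.org"]
        HOSTS = r"C:\Windows\System32\drivers\etc\hosts"
        MARKER = "# ipv6-pref"  # tags the lines this script owns

        def aaaa(host):
            # first AAAA record for the host
            return socket.getaddrinfo(host, None, socket.AF_INET6)[0][4][0]

        with open(HOSTS) as f:
            lines = [l for l in f if MARKER not in l]
        lines += ["%s %s %s\n" % (aaaa(s), s, MARKER) for s in SITES]
        with open(HOSTS, "w") as f:
            f.writelines(lines)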

  • Stream computer screen to TV via network instead of a USB wireless link

    - by user24559
    I want to stream my computer screen (not just video or a limited amount of content) to my TV via the network. I know there are wireless devices that use USB to transfer the screen to the TV; however, these are limited to a short distance. What I want to do is stream the data via the network so I can be anywhere within the network and have it shown on the TV. I am looking to transfer both video and sound: the entire computer screen, just as when you connect the computer to the TV via VGA or HDMI and take the sound out using the 3.5mm plug. I have been unable to find a unit that allows the entire computer screen to transfer via the network; I can only find the ability to stream video files. I am using Windows 7 Ultimate with a quad-core processor and 16 GB of memory, so I have the power to handle the transfer. My TV is an HDTV.

  • BIND: How do I allow DNS query from specific external host?

    - by krbvroc1
    I'm running CentOS 5.8 (BIND 9.3.6). Here is my issue: I run my own DNS server to serve the local machine, and I would like to use this DNS server from home. Since my home has a dynamic IP address, I am not sure how to accomplish this. In my named.conf there are allow-query{} and allow-recursion{} statements. It seems both of those take an IP address, but I need to specify a hostname (at least a CNAME). This is not a public DNS server (so "any" is not an option). My hostname/CNAME is already updated automatically using nsupdate. The only solution I can think of, which I do not like, is to change my nsupdate script to somehow search/replace the allow-query/allow-recursion IP address in named.conf. That would also require restarting named whenever the hostname changes. Is there some other way to handle this?
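    BIND 9.3 address match lists take addresses and prefixes, not hostnames, so a less invasive variant of the script idea is to keep the changing address in a small acl that the update script rewrites, then run "rndc reconfig", which rereads the config without a full restart. A hedged named.conf sketch:

        acl "home" { 203.0.113.27; };   // placeholder, rewritten on each address change

        options {
            allow-query     { localhost; home; };
            allow-recursion { localhost; home; };
        };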

  • OS X: Setup for file storage in a medium business

    - by Franatique
    In our office every machine runs OS X. In search of an ideal storage and sharing solution, we decided to let OS X Server handle all account information and auth requests while a 7 TB QNAP provides NFS shares. All shares are published as mounts in the company-wide LDAP. As it turns out, handling permissions in this situation is very clumsy (e.g. inheriting permissions on newly created files). Unfortunately, using NFSv4 in combination with ACLs did not solve the problem. As a possible solution, I set up an iSCSI connection between the QNAP and the machine running OS X Server, which in turn serves the LUN as an AFP share. Permission handling works like a charm for this setup, although I am a bit concerned about its performance. As we are a fast-growing company, we expect the solution to serve at least 100 clients with files of roughly 100 MB and up. Are there any known drawbacks of this solution?

  • Hardware advice for bitmap / OpenGL image processing server?

    - by pdizz
    I am trying to work out a build for a processing server to handle bitmap processing as well as OpenGL rendering for chroma-keying images and Photoshop automation. My searches here and on Google have turned up surprisingly few results, and seeing that there aren't tags for bitmap or image processing, I take it this is a specialized application. The bitmap processing is very CPU-intensive, while the chroma-keying and Photoshop work is GPU-intensive. I doubt this is a case of over-optimization, as our company batches thousands of images a day (currently on individual workstations) and any saving in processing time and workstation downtime would be beneficial. Does anyone have experience with this type of processing server? Are there any special considerations that would go into a build like this, or am I over-thinking it?

  • Multi-server management

    - by user788721
    We are running a website that allows users to create their own content and then share it through an iframe. We would like to get more servers to host the user content, with the main one serving our website. Each user has a link like xxxx.com/content989856 or xxxx.com/content45454545. We were thinking of two options: using an .htaccess on the main server that redirects to the right server (the problem being that if the main server is down, all the content is down as well), or using a subdomain depending on where the content is hosted (but then if we move a user's content from one server to another, we will have to change their links as well). Do you know a better option, or are those really the only two available? I am wondering how big websites like YouTube handle this problem. Thank you very much for your help.
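    A hedged sketch of the first option using a RewriteMap, which must live in the server config rather than an .htaccess file; the map file holds lines like "989856 server2.xxxx.com", and the file path and fallback host are placeholders:

        RewriteEngine On
        RewriteMap contenthost "txt:/etc/apache2/content-hosts.txt"
        RewriteRule ^/content(\d+)$ http://${contenthost:$1|www.xxxx.com}/content$1 [R=302,L]

    This keeps user links stable when content moves (just edit the map), though the main server remains a single point of failure for the redirect itself.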

  • Syncing 1TB+ to an iSCSI device, software needed

    - by mojah
    Hi, I need to sync a local disk to an iSCSI mount on Windows (Server 2003), and I'm struggling to find software capable of doing so in a reasonable timeframe. Notes on the current 1 TB disk:

    - 800 GB currently in use
    - contains a folder with several hundred thousand subfolders, which in turn hold several thousand files each

    So I'm trying to find a piece of software that can handle such large file lists and give me a good estimate of how long the copy will take. I've tried DeltaCopy (the rsync GUI client for Windows), but it's intolerably slow and doesn't provide a good estimate of the time remaining. DeltaCopy: http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp Does anyone know alternative software for Windows that would do this well?
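    One alternative worth testing is robocopy (in the Server 2003 Resource Kit, built into later Windows versions): it copes with very large trees and retries dropped files, though it reports per-file rather than overall progress. A hedged sketch, assuming the iSCSI LUN is mounted as E: (paths are placeholders):

        robocopy D:\data E:\data /MIR /R:1 /W:1 /NP /LOG:C:\robocopy.log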

  • What factors can affect the performance of an HTTP server written in C#?

    - by Yousaf
    I am having trouble handling huge databases. I have many clients, around 100-300 (the clients are basically servers running, e.g., Windows and SQL Server). Each client may have 38 thousand rows/listings of data, and each row has 10-12 fields. I cannot afford to keep a JSON file per client and then handle them all on the main server, because of memory issues. What if I had an HTTP server written in C or C# installed on each client, returning 250 rows per response to the main server? How would factors like speed, memory, or other issues affect us? In short: if a server written in C# sends 250 rows per request, what factors can affect its performance? For example: speed, processing power, the operating system, or the server's algorithm and implementation? How much do these factors really affect performance at large scale?
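    Rough numbers for the 250-rows-per-response scheme (the per-field size is an assumption):

        38,000 rows / 250 rows per response = 152 requests per client per full pull
        152 requests x 300 clients = 45,600 requests per sweep
        250 rows x 12 fields x ~50 bytes/field = roughly 150 KB per response body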

  • Need hosting (e-mail, http) for external domains

    - by disappointed
    This may not be the right place, but since it is a more technical aspect of the hosting world, I am taking the liberty to ask. I'm currently running a virtual server with nginx and Postfix for web and e-mail, but I can't handle the administration, and due to frequent problems with the e-mail services, I need to replace this with an almost-standard hosting package (anything should work; even 5 MB of static files would be OK). The exception is that I would like to use several domains, hosted with different registrars, for web and e-mail. Currently, this is a very simple configuration in my setup. All hosters I have looked at seem to treat this as a costly extra (more than the domain registration costs), and of course they recommend transferring the domains to them (they want the $$). Does anyone know of a hosting company that allows its customers to freely manage domains registered somewhere else?
