Search Results

Search found 13928 results on 558 pages for 'large scale nat'.

Page 205/558 | < Previous Page | 201 202 203 204 205 206 207 208 209 210 211 212  | Next Page >

  • Best Tomcat hosting service provider for JSP

    - by akshay
    I want to host my website, built using JSP/Java. I am looking for a good host that offers the following features: unlimited bandwidth (to support large traffic; I don't want it to run slowly), a low price, and good customer support that can help me with deployment in case of any problems. I am running tight on budget, as I am a recent university graduate.

    Read the article

  • Is there anything better than Microsoft Project? [closed]

    - by GuruAbyss
    Possible Duplicate: Project Planning Tools I'll soon be knee-deep in a very large project and I'm looking into project management software. I need users' opinions on software-based (not web-based) solutions that are equal to or better than MS Project. It can be open source or closed source. Thank you all in advance for your insight and opinions!

    Read the article

  • Out of sync audio video using mencoder

    - by 1ch1g0
    Hi, I converted an MKV (Matroska) file to AVI using ffmpeg:
        ffmpeg -i input.mkv -f mp4 -vcodec mpeg4 -sameq -r 29.97 -b 512kb -acodec ac3 -ab 128kb -vol 512 output.avi
    The output file plays fine in MPlayer. After that, I used mencoder to insert subtitles:
        mencoder output.avi -o new.avi -oac pcm -ovc lavc -subfont-text-scale 3 -sub subtitle.srt
    However, when I play back "new.avi", the video and audio are out of sync. What options can I pass to mencoder to keep the A/V in sync? I have also tried ffmpeg's -newsubtitle option but can't get it to work. Any examples of -newsubtitle usage would be greatly appreciated. Thanks
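    One thing to try (a sketch, not a guaranteed fix for this particular file): mencoder can shift the audio track with -delay and limit its own A/V resync behaviour with -mc/-noskip, so re-running the subtitle pass with an explicit audio delay is a common workaround. The 0.5 value below is only a placeholder to adjust by ear (negative values shift audio the other way):
        mencoder output.avi -o new.avi -oac pcm -ovc lavc -subfont-text-scale 3 -sub subtitle.srt -delay 0.5 -mc 0 -noskip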

    Read the article

  • Add padding to display

    - by frbry
    I know this one is a bit weird. I need to scale down the display on my monitor like the example below: http://img706.imageshack.us/img706/4130/58518806.jpg Normally this could be done with some adjustments in the monitor's OSD, but my LCD monitor has no OSD. Is it even possible in software? Thanks. Edit: Graphics card: Intel G33/G31 Express Chipset family. It would be good if the change were permanent, but that's not required. I can change my operating system, but currently I use Windows XP.

    Read the article

  • Can't chgrp in NFS4 mounts

    - by Philipp
    Hello, I'm using Linux in a large multi-user network. Let A be some group of which I am a member, but which is not my primary group. According to chmod(2) I should be able to chgrp a file to group A. Trying to do so succeeds on a local mount as well as on an NFSv3 mount, but not on an NFSv4/Kerberos mount (EPERM). Are there any special considerations regarding chgrp when using NFSv4 mounts?

    Read the article

  • Amazon S3 bucket - download only certain files

    - by mottey
    Hi, I have an Amazon S3 bucket with 10,000 images in it, following a standard naming convention: 001_small.jpg 001_large.jpg 002_small.jpg 002_large.jpg Because there is such a large number of files, I don't want to download ALL of them, and I don't want to sit there for a couple of hours selecting just the *_large.jpg files... Can someone suggest an S3 file manager that lets me select only the *_large.jpg files to download? Thanks!
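    If a command-line tool is acceptable instead of a graphical file manager, the AWS CLI can filter objects by pattern during a recursive copy. A minimal sketch (the bucket name and target folder are placeholders; the order matters: exclude everything first, then re-include the wanted pattern):
        aws s3 cp s3://my-image-bucket/ ./large-images/ --recursive --exclude "*" --include "*_large.jpg"
    The same --exclude/--include pair also works with aws s3 sync.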

    Read the article

  • Providing DNS redirection to honeypot server for known bad domains

    - by syn-
    Currently running BIND on RHEL 5.4 and am looking for a more efficient manner of providing DNS redirection to a honeypot server for a large (30,000+) list of forbidden domains. Our current solution for this requirement is to include a file containing a zone master declaration for each blocked domain in named.conf. Subsequently, each of these zone declarations points to the same zone file, which resolves all hosts in that domain to our honeypot servers. ...basically this allows us to capture any "phone home" attempts by malware that may infiltrate the internal systems. The problem with this configuration is the large amount of time taken to load all 30,000+ domains, as well as management of the domain list configuration file itself... if any errors creep into this file, the BIND server will fail to start, thereby making automation of the process a little frightening. So I'm looking for something more efficient and potentially less error-prone.
    named.conf entry:
        include "blackholes.conf";
    blackholes.conf entry example:
        zone "bad-domain.com" IN {
            type master;
            file "/var/named/blackhole.zone";
            allow-query { any; };
            notify no;
        };
    blackhole.zone entries:
        $INCLUDE std.soa
        @    NS    ns1.ourdomain.com.
        @    NS    ns2.ourdomain.com.
        @    NS    ns3.ourdomain.com.
             IN    A     192.168.0.99
        *    IN    A     192.168.0.99
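    One alternative worth evaluating is BIND's response-policy zone (RPZ) feature, which keeps every blocked domain as ordinary records in a single zone file instead of 30,000+ zone declarations, so the list can be validated with named-checkzone before it is loaded rather than risking a failed named start. This is only a sketch under the assumption that an upgrade is possible: RPZ needs BIND 9.8 or later, which is newer than the BIND shipped with RHEL 5.4, and the zone name below is a placeholder. In named.conf:
        options {
            response-policy { zone "rpz.blackhole"; };
        };
        zone "rpz.blackhole" {
            type master;
            file "/var/named/rpz.blackhole.zone";
            allow-query { localhost; };
        };
    and in /var/named/rpz.blackhole.zone, two records per blocked domain:
        $TTL 60
        @                  IN SOA ns1.ourdomain.com. hostmaster.ourdomain.com. ( 1 3600 600 86400 60 )
                           IN NS  ns1.ourdomain.com.
        bad-domain.com     IN A   192.168.0.99
        *.bad-domain.com   IN A   192.168.0.99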

    Read the article

  • How can I archive a 30 GB file?

    - by Joel Coehoorn
    I have a 30 GB zip file containing an archive of digital materials available in the school library that I want to burn to DVD. Of course, 30 GB is far too large for a single DVD, and the content is already zipped. I'm open to ideas, but leaning towards suggestions that will help me automatically spread the file over multiple DVDs, including a simple program to stitch it back together again later.
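    One low-effort route (a sketch, assuming 7-Zip is acceptable; the file names are placeholders) is to wrap the existing zip in a multi-volume archive sized to fit single-layer DVDs. Since the content is already compressed, store-only mode (-mx=0) keeps it fast, and reassembly later only needs 7-Zip pointed at the first volume:
        7z a -mx=0 -v4400m library-archive.7z library.zip
        7z x library-archive.7z.001
    Burn one .7z.001/.002/... volume per DVD and copy them back into a single folder before extracting.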

    Read the article

  • stsadm -o. What does the -o mean?

    - by ddono25
    I am working on a large SharePoint farm, mainly with the backend SQL Servers. We have always used stsadm -o for all stsadm functions, but no one seems to know why. I can't seem to find information specific to stsadm; is it general Windows command-line syntax?
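    For what it's worth, -o is stsadm's own switch rather than general Windows command-line syntax: it names the operation to run, and every call follows the pattern stsadm -o <operation> <operation-specific parameters>. A couple of illustrative invocations (the URL and file path are placeholders):
        stsadm -o enumsites -url http://portal
        stsadm -o backup -url http://portal/sites/teamsite -filename C:\backups\teamsite.bak
    Running stsadm -help lists the available operations, and stsadm -help <operation> shows the parameters for one of them.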

    Read the article

  • Will new Acer Revo (with Atom 330) be fast enough to be MythTV client/server?

    - by vava
    As a geek I really like Atom CPUs but can't find a reason to buy one yet :( I have been thinking about building my own DVR with NAS and media center functionality. Unfortunately, even today's Acer Revo, built on the ION platform, is not fast enough for streaming Full HD video. So what do you think: will the new dual-core CPU make it better? Will it be able to play Full HD video, store it to disk, and transfer something over the network at the same time? Will it be able to scale videos from Hulu and YouTube to fullscreen?

    Read the article

  • What are some good Server Name Themes/Categories [duplicate]

    - by Arian
    This question already has an answer here: What are the most manageable and interesting server naming schemes being used? [closed] 17 answers I need to create a naming scheme for my servers, but I am having a hard time coming up with a good category list to go by. I want something with an abundance of names, so that as my server count scales I won't run out. Some that I have heard being used are Greek philosophers (Plato), planet names (Saturn, Mercury, Venus, Mars), and Mario characters (Mario, Luigi, Yoshi, Toad). I feel like the above categories are kind of limited. What are some good naming schemes that you use?

    Read the article

  • How do I host multiple SSL websites on a single EC2 instance using Amazon Elastic Load Balancers?

    - by Developr
    If I have 3 separate websites that all require SSL (separate certificates) and that I want to host on the same EC2 instance(s) across multiple availability zones so that we have the ability to scale and be highly available, how do I achieve this using ELBs in my Amazon VPC? Each site requires a separate IP address, so I have added multiple private IPs to the EC2 instance, but I am unsure how to bind the ELB to a certain IP on the instance. I was also able to set up multiple ELBs pointing to the same instance, but again, I am not seeing any way to bind each ELB to a separate IP on the instance. If this is not possible, what is the best option?
    1. Run each site on a separate EC2 instance / ELB combo (expensive and harder to maintain)
    2. Give each site a separate public IP and use Route 53 to do the load balancing (seems like a hack)
    3. Use a different load balancer such as HAProxy that should be able to work like a normal load balancer appliance.
    Please help!
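    If option 3 is pursued, one way to keep all three certificates behind a single address without terminating TLS on the load balancer is SNI-based TCP passthrough in HAProxy (1.5 or later): route on the hostname in the TLS ClientHello and let each backend present its own certificate. A rough sketch with placeholder hostnames and addresses (backends site_b and site_c would mirror site_a):
        frontend https_in
            bind *:443
            mode tcp
            # wait for the TLS ClientHello so the SNI name can be inspected
            tcp-request inspect-delay 5s
            tcp-request content accept if { req_ssl_hello_type 1 }
            use_backend site_a if { req_ssl_sni -i www.site-a.example }
            use_backend site_b if { req_ssl_sni -i www.site-b.example }
            use_backend site_c if { req_ssl_sni -i www.site-c.example }
        backend site_a
            mode tcp
            server a1 10.0.1.10:443 check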

    Read the article

  • Free web hosting that allows JavaScript and CSS

    - by Raul Agrait
    I was considering using Google Sites to host some webpages with HTML5 and JavaScript experiments I'm trying out, but it seems that they don't allow JavaScript. Does anybody have any good suggestions for a free web hosting service where I can upload simple HTML/CSS & JavaScript experiments? I don't have large bandwidth needs, nor do I need a WYSIWYG editor. Ideally I'd like to just upload the HTML, CSS, and JS files directly.

    Read the article

  • Sync my files across multiple computers

    - by EnderMB
    I do a lot of work on my home computer, ranging from programming and writing stored procedures to writing documentation and reports. A lot of this work is university-related, and constantly swapping files across several computers is annoying at best. I have a large final-year project coming up that I'm going to be sharing between home and university, and I need some kind of online storage that provides version control for my programs, as well as my Word documents, PDFs, and saved academic papers. Are there any good solutions for my problem?

    Read the article

  • TCP/IP & throughput between FreeNAS (BSD) server & other LAN machines

    - by Tim Dickerson
    I have a question for someone who knows BSD a bit better than I do, regarding my LAN setup at home/work here outside Chicago. I can't seem to fully optimize my network's (LAN) throughput via my FreeNAS (BSD-based) file server. It runs the latest FreeBSD release, modified to support several protocols for file transfers and more. Every machine behind my Smoothwall (Linux-based) router is on the usual 192.168.0.x subnet and for the most part works just fine. Behind the Smoothwall box, all machines are connected to a gigabit HP unmanaged switch. I host a large WISP here and have an OC-3 connection at home/work, and I have no issues with downloading/uploading from/to the net. My problem is with throughput. When I try to transfer large files...really any, for that matter...between any of the machines and the FreeNAS server via FTP, the maximum throughput I can achieve, say between a Win 7 or a Linux box, is ~65Mbit/sec. All machines are running Intel Pro 1000 GB NICs and all cable is CAT6. Each is set to 'auto negotiation' and each shows 1500 MTU, full duplex at 1 Gb, so I know the hardware is okay. I have not adjusted the MTU on any machine, as I understand it to be pointless unless certain configurations are used (I assume I am not one of those). My settings for the FreeNAS machine are the following:
        # FreeNAS /etc/sysctl.conf - pertinent settings shown
        kern.ipc.maxsockbuf=262144
        kern.ipc.nmbclusters=32768
        kern.ipc.somaxconn=8192
        kern.maxfiles=65536
        kern.maxfilesperproc=32768
        net.inet.tcp.delayed_ack=0
        net.inet.tcp.inflight.enable=0
        net.inet.tcp.path_mtu_discovery=0
        net.inet.tcp.recvbuf_auto=1
        net.inet.tcp.recvbuf_inc=524288
        net.inet.tcp.recvbuf_max=16777216
        net.inet.tcp.recvspace=65536
        net.inet.tcp.rfc1323=1
        net.inet.tcp.sendbuf_inc=16384
        net.inet.tcp.sendbuf_max=16777216
        net.inet.tcp.sendspace=65536
        net.inet.udp.recvspace=65536
        net.local.stream.recvspace=65536
        net.local.stream.sendspace=65536
        net.inet.tcp.hostcache.expire=1
    From what I can tell, that looks to be a somewhat optimized profile for a typical BSD machine acting as a server for a LAN. I might be wrong and just want to find out from someone who knows BSD better than I do whether that is indeed OK, or whether something is out of tune. Are there other approaches that would work better for P2P file transfers? I honestly do not know what I SHOULD expect for throughput between the NAS box and another client when transferring files via FTP, but I am told that what I get on average (40-70MB/sec) is too low for what it could be. I have thought about adding another NIC to the FreeNAS box as well as to the Win7 machine and using a crossover cable with a static route, but wanted to check with someone first to see if that might be worth it or not. I don't know whether that would bypass the HP GB switch and allow a direct machine-to-machine transfer anyway. The FTP client I use is FileZilla, and I have tried both active and passive modes with no real gain from either. The NAS box runs ProFTPD.
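    Before tuning sysctls further, it may help to separate raw TCP throughput from FTP and disk overhead with a memory-to-memory test such as iperf (a sketch; the address is a placeholder for the FreeNAS box). If iperf reports close to line rate, the bottleneck is more likely ProFTPD, disk I/O, or the client than the TCP stack:
        # on the FreeNAS box
        iperf -s
        # on the Windows 7 or Linux client
        iperf -c 192.168.0.10 -t 30 -i 5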

    Read the article

  • What is the best way to keep a folder synchronized with my USB drive?

    - by Ivo Flipse
    I know there is a similar topic on syncing between computers, but I'm looking for an application to run on one computer that will sync a "document/file" folder with a folder on my secondary/external USB drive. What would be the best solution? I know I could use Dropbox or Live Mesh, but they use up bandwidth, which isn't great when I drop in a lot of large files. I'm running Windows 7, but I assume any solution for Windows Vista would work just fine.
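    If a built-in tool is acceptable, Windows 7 ships with robocopy, which can mirror a folder to the USB drive with no network traffic at all; pairing it with Task Scheduler gives a hands-off sync. A sketch with placeholder paths (note that /MIR deletes files from the destination that no longer exist in the source, and /FFT helps when the drive is FAT-formatted):
        robocopy "C:\Users\Me\Documents" "E:\Sync\Documents" /MIR /FFT /R:2 /W:5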

    Read the article

  • How to make the PC speaker beep from the Windows 7 command prompt?

    - by oKtosiTe
    I'm running some lengthy video encodes using the Handbrake command line interface. After all my encodes are done, I would like to have the PC speaker beep, as I usually turn my large external speakers off. On Linux I would install the "beep" package, but so far I haven't found such a program for Windows 7. Possibly related links: System "Beep" sound does not function in Windows Vista x64 with HD Audio devices (I am indeed using an HD Audio device: the SoundMAX ADI1986A) What’s up with the Beep driver in Windows 7?
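    One thing to try from a batch file once the encodes finish (a sketch; note that on Windows 7 the Beep API is generally routed to the default audio device rather than the legacy PC speaker, so whether the internal speaker actually sounds depends on the machine and driver):
        powershell -command "[console]::beep(1000,750)"
    The two arguments are frequency in Hz and duration in milliseconds.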

    Read the article

  • Proven and Scalable Comet Server

    - by demetriusnunes
    What is the most proven, scalable Comet server solution out there that can handle up to 100,000 real-life connections per node using HTTP streaming (not long-polling)? It must be a free, preferably open-source project. We've already tried Meteor (Perl), with no success: Meteor was able to scale only up to 20,000 connections per node. We are looking right now at these options: APE (C++), Orbited (Python), Grizzly (GlassFish), Cometd (Jetty). Any big success stories with any of these?

    Read the article
