Search Results

Search found 9952 results on 399 pages for 'big al'.


  • How does a winkydink Teradici offer high res, full FPS, 3D rendering on ESXi 5 VDIs for AutoCAD/SolidWorks/1080p YouTube applications?

    - by BlueToast
    How does such a small Teradici card offer high-resolution, full-FPS 3D graphics (see 1:38 in http://www.youtube.com/watch?v=eXA4QMmfY5Y&feature=player_detailpage#t=97s) for ESXi 5.0/5.1 VDI environments? We're shooting for an environment capable of AutoCAD, SolidWorks, and 1080p YouTube. I can't see how such a small, low-profile card could possibly have the horsepower to handle that kind of GPU computation for a big environment. We're going to have up to 64 VDIs per server, and we're a company of 500-1000 employees. Someone enlighten me, please! We're still determining which route to go (RemoteFX vs. VMware View/PCoIP) and which hardware (NVIDIA 4GB non-Quadro/Tesla GPUs vs. the Teradici card). The servers have three 4x, three 8x, and one 16x PCIe slots; two of the 8x slots will be occupied by SAS RAID cards.

    Read the article

  • How to prevent people taking software home?

    - by Robert MacLean
    Most companies I have worked at have had either a collection of disks or a network share with the installers for the commonly used software. This is to allow the IT department and skilled users to install the software they need on their work machines very easily. However, some users would see this as an opportunity to get "free" software for their home machines. I've seen the draconian approach of locking the machine down completely, but that does not work well (in my view; if you disagree, feel free to comment) because you add so much extra work for IT, and users get that big-brother feeling. So how do you prevent users from taking software home while still allowing them to install what they need? You can assume that most of the users in the organisations I work in are smart enough to install software; I'm not worried about the tea lady here.

    Read the article

  • Why is ext3 so slow to delete large files?

    - by Janis Peisenieks
    I have a server which makes an incremental backup of a system every night. On Saturdays there is a full backup. After the full backup has finished, a script kicks in that deletes the incrementals. The script sometimes breaks, because the incrementals are each about 10GB and deleting them sometimes takes too long for the script. Could someone explain to me, or point me to a resource that explains, why ext3 is so slow to delete files compared to, let's say, NTFS? I know these are two completely different file systems, but I'm really interested in why there is such a big difference in deletion time.

    Read the article

  • monitoring TCP/IP performance on Solaris

    - by Andy Faibishenko
    I am trying to tune a high message traffic system running on Solaris. The architecture is a large number (600) of clients which connect via TCP to a big Solaris server and then send/receive relatively small messages (.5 to 1K payload) at high rates. The goal is to minimize the latency of each message processed. I suspect that the TCP stack of the server is getting overwhelmed by all the traffic. What are some commands/metrics that I can use to confirm this, and in case this is true, what is the best way to alleviate this bottleneck?
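
    A minimal sketch of the kind of built-in Solaris commands that could confirm (or rule out) a stressed TCP stack; the sampling interval is just an example:

      # Per-protocol TCP counters: watch tcpListenDrop, tcpListenDropQ0 and
      # the retransmit counters climb between samples
      netstat -s -P tcp

      # The same counters through the kernel statistics framework
      kstat -m tcp

      # Per-CPU view: sustained high sys/intr time points at the stack
      mpstat 5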

    Read the article

  • Install Windows XP using USB

    - by AmanBe
    How do I install Windows XP from USB? I have the ISO image, and my CD-ROM drive is not working. I read up on this on the internet, but the articles are all way too complex and long, and they all differ, so I don't know which one to try. I want to know if someone has tried something like this and can tell me the best and easiest way, like some tool that will automatically write the ISO file onto the flash drive and make it bootable, or something similar. Thank you in advance.

    Read the article

  • Spam mail through SMTP and user spoofing

    - by Josten Moore
    I have noticed that it's possible to telnet into a mail server that I own and send spoofed messages to other clients. This only works for the domain that the mail server handles; I cannot do it for other domains. For example, let's say that I own example.com. If I telnet example.com 25, I can successfully send a message to another user without authentication:

      HELO local
      MAIL FROM: [email protected]
      RCPT TO: [email protected]
      DATA
      SUBJECT: Whatever
      this is spam
      Spam spam spam
      .

    I consider this a big problem; how do I secure this?
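
    One minimal sketch of a common mitigation, assuming the MTA is Postfix (example.com stands in for your own domain; adjust for whatever MTA is actually in use): reject your own domain in the envelope sender unless the client has authenticated, so outsiders on port 25 can no longer impersonate local users.

      # /etc/postfix/main.cf
      smtpd_sender_restrictions =
          permit_sasl_authenticated,
          check_sender_access hash:/etc/postfix/sender_access

      # /etc/postfix/sender_access
      example.com   REJECT

      # build the lookup table and reload Postfix
      postmap /etc/postfix/sender_access
      postfix reload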

    Read the article

  • What is a good php 5.3.x shared hosting company?

    - by Abba Bryant
    I am looking for the best shared host - features-wise, not price - for hosting CakePHP and Lithium applications. I would like to be able to use MongoDB / MySQL as well as have access to some of the more common PHP extensions like MCrypt, etc. I currently use dreamhost with a custom PHP 5.3.x build on my sandbox domain - Please do not suggest this as a solution. I want to move away from managing my own PHP build if possible. I need ssh access but email support isn't as big of an issue.

    Read the article

  • How to determine the source for wakeup in hibernate

    - by Erik
    I have a big problem with my home theater PC that runs Windows 7 64-bit. Normally I send the PC into hibernation every evening, but from time to time it wakes up for no obvious reason and stays on until I notice, which is sometimes half a day later :( I have already checked Windows Update, which is not set to automatic, since I prefer installing updates manually. When I look in the system event log, there is an entry called "Power Troubleshooter" which tells me that my system was reactivated at a specific time, but it also says Source = Unknown, which is the most annoying part. So how can I actually figure out which process reactivates the PC? Is it possible to set a group policy that forbids applications or services from scheduling tasks that can wake the machine from hibernation at all?
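
    A minimal sketch of the built-in Windows 7 commands that could narrow this down (run from an elevated command prompt):

      REM What woke the machine most recently
      powercfg /lastwake

      REM Scheduled tasks or applications currently holding wake timers
      powercfg /waketimers

      REM Devices that are allowed to wake the PC
      powercfg /devicequery wake_armed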

    Read the article

  • Is there a RAR extractor (for multiple rar files like .r00 etc.) that will use all of my quad cores?

    - by Christopher Done
    I've got a quad core Intel processor. I've got a big file split into little ones as RAR files, foo.r00, foo.r01, etc. which the RAR program extracts into one file/directory. Is there a RAR program that I can specify like "use four cores" in the extract process? At the moment it sits there using 100% of one core. I recognise the bottleneck might be my hard drive anyway, but I don't see a lot of HD usage and suspect the decompression process is more intensive than waiting on I/O. For example, GNU Make accepts a (-j, I think) argument to tell it how many cores to use, which I used to compile PHP 6 really quickly.

    Read the article

  • Nginx save file to local disk

    - by Dean Chen
    My case is this: from our office in China we have to access a web server at our USA headquarters over the Internet, but the network is too slow and we download many big image files, so all our developers have to wait. We want to set up an Nginx instance that acts as a reverse proxy, with our USA web server as its upstream. The question is: can we make Nginx save the image files from the USA web server onto its local disk, i.e. let Nginx act as a cache server?
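
    A minimal sketch of what that could look like in nginx.conf, assuming usa.example.com stands in for the upstream server and the paths, sizes, and times are placeholders:

      proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=usa_cache:50m
                       max_size=20g inactive=30d;

      server {
          listen 80;
          location / {
              proxy_pass        http://usa.example.com;
              proxy_cache       usa_cache;
              # keep successful responses on local disk for a day
              proxy_cache_valid 200 302 1d;
          }
      }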

    Read the article

  • What is this video format, and how do I convert it?

    - by OrangeRind
    Description: I have a big (7.4 GB) .mkv file (1080p) which I want to convert to H.264 (using x264). Problem: MediaCoder and GSpot are unable to detect the codec; they display nothing beyond the fact that the file is a Matroska container video with a MIME type of video/x-matroska. No bitrate, profile, etc. But the source tells me that it is VC-1 encoded. Question: so how do I encode this file, and with what encoding software, since MediaCoder has failed?
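
    A minimal sketch of one possible route, assuming an ffmpeg build with VC-1 decoding and libx264 (the filenames and quality settings are placeholders):

      # confirm what the container actually holds
      ffprobe input.mkv

      # decode the VC-1 stream and re-encode it with x264, copying the audio as-is
      ffmpeg -i input.mkv -c:v libx264 -crf 20 -preset slow -c:a copy output.mkv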

    Read the article

  • Automatically Log off Google when logging in using Google OpenID?

    - by Ross Charette
    I use Google as my OpenID provider. Once I log into a website with Google's OpenID, I am logged into Google as well, which I do not want. Can I somehow automatically log out of my Google account, or prevent Google from logging me in every time I use my Google OpenID? I prefer not to have my personal Google account always logged in. It's not a big deal to go to Gmail and click log off, but if there is a simpler way, that would be good. Note to admins: this is not just about Stack Overflow, please don't close the question.

    Read the article

  • Standalone server setup for compute capacity

    - by mikera
    I'm developing an application for my company that will require a lot of compute capacity (running some very big mathematical calculations), and looking for some form of server setup to do this. For various reasons, we want to run this on-site in our office rather than hosting it externally. It's been a while since I last had to set up my own servers so I thought I would tap into the collective wisdom of serverfault! My broad requirements are:
      - Budget $30-50k, with an aim to get as much compute capacity as possible for that budget
      - 64-bit servers suitable to run Ubuntu Linux + Java
      - Some relatively standalone rack that can be installed in secure office space
      - Fast/low latency network connections between the servers, but don't really care about connectivity to the outside world
      - Storage capacity shared between the servers - they don't necessarily need their own storage providing they can be booted from a common image
      - Downtime can be tolerated (since the calculations are run in batch mode)
      - The software itself is fault-tolerant, so there is no need for extra resiliency in the server setup (cheap replaceable commodity parts will be fine in general)
    Given these requirements, what kind of setup would you recommend and why?

    Read the article

  • Does mdadm allow mixing SATA drives and USB-to-SATA drives?

    - by marc
    Hello, I have a question. I have a working mdadm RAID 5 array of 5 drives, and we are running out of space. The server doesn't see heavy usage, but it has to stay online, and we can't add another controller. I got the idea of buying a USB drive enclosure. Does mdadm allow mixing, in one array, drives attached to the SATA controller with drives attached over USB? (We don't boot from the mdadm array.) I'm not sure whether this will work: are USB drives configured before the mdadm daemon starts? Regards
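
    For what it's worth, mdadm works on block devices regardless of the bus they sit on; USB enumeration order at boot is the part worth testing. A minimal sketch of adding such a disk, assuming /dev/md0 is the array and /dev/sdf is the USB-attached disk (both device names are placeholders):

      # check that the USB-attached disk shows up as an ordinary block device
      fdisk -l

      # add it to the existing array
      mdadm --manage /dev/md0 --add /dev/sdf

      # grow the array onto the new member if the goal is more capacity
      mdadm --grow /dev/md0 --raid-devices=6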

    Read the article

  • Ubuntu from console/command-line/shell

    - by Xolve
    Earlier Linux distros, though they required a lot of manual work, were quite good to use from the command line. If the X server didn't start, or you just wanted a shell to work in, they all supported it: the network was configured by init, sound was up and ready, and newly inserted devices would be configured, with their configuration placed in fstab. There were also small scripts I found on many distros which used windows under X but switched to ncurses on the console. But now all of this needs a GUI with a desktop environment (KDE, GNOME); the new paradigms (NetworkManager, HAL, etc.) require a GUI :'-( So if you are on just the command line you have to be root, edit config files, or type big commands; it looks like they believe only geeky admins need that. Is there any way to make this easy again in Ubuntu through the shell?
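
    As one concrete example of the console-only setup being asked about, a minimal sketch of a static network configuration under the ifupdown system that Ubuntu of this era uses (the interface name and addresses are placeholders):

      # /etc/network/interfaces
      auto eth0
      iface eth0 inet static
          address 192.168.1.10
          netmask 255.255.255.0
          gateway 192.168.1.1

      # bring the interface up from the console
      sudo ifup eth0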

    Read the article

  • Any limitations for putting an SSD in a Mini? How fast would an external HDD be via Firewire? Is Ser

    - by Cyrcle
    I'm considering getting a Mini for web programming. I do a lot of text searches, so I want to put an SSD in it. Does the Mini have any limitations that might affect the performance of an SSD? I'm also trying to decide if I should get a Mini Server. I'd like to have two internal drives, so one can be an SSD for the OS and the code I'm working on, and the other can be my storage drive. However, I'm not sure whether I'll use the extra functionality of the Server edition of OS X, so I'm reluctant to pay the $200 premium. In a "regular" Mini I could put the SSD internally and use a big external drive, but would the external drive be fast enough via FireWire? Thanks in advance for any info.

    Read the article

  • Illustrator "Save for Web & Devices" returning crappy, pixelated images

    - by Tory Waterman
    I'm trying to create a nice title for my webpage: a big white title to sit on a black background. I'm using Illustrator to do so. When I create it, it looks nice, but when I hit "Save for Web & Devices", it comes out looking like a pixelated piece of crap on the site. Is there some setting I need to change to make Illustrator save a higher-resolution image? Thanks. EDIT: I understand, from looking at some other posts, that this may be a result of "posterization" or "dither", but this is only a plain white image, so I don't see how this results in a color problem. (I could be completely misinterpreting these terms.) EDIT: Figured the version might be important... I'm using CS5.1.

    Read the article

  • How to view multiple log files as one file in unix/linux

    - by user42679
    Hi, I was wondering if there is a convenient way in Linux/Unix to read multiple log files as one. More specifically, I would like to view a sequence of log files (app.log, app.log.1, app.log.2, etc.) as one big file using normal Unix tools (vi, less, etc.). When the end of one file is reached, the tool would automatically move to the beginning of the next file. During my work I have to analyze UAT/prod logs to investigate and solve problems, and the fact that I need to traverse many log files disturbs my work and causes delays. Any ideas?
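
    A couple of minimal sketches of how standard tools can already approximate this (the file names follow the app.log example above):

      # view the rotated files plus the current one as a single stream, oldest first
      cat app.log.2 app.log.1 app.log | less

      # or open them all in less and switch between them with :n (next) and :p (previous)
      less app.log.2 app.log.1 app.log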

    Read the article

  • Fan is spinning too fast just in Windows - software?

    - by B. Roland
    I've recently replaced my fans (CPU and GPU, and I bought a CHA (chassis) fan). The GPU remained the same, though I've occasionally seen its fan spin twice as fast as usual. The problem is that the CPU fan spins too fast in Windows (especially 7): it keeps the CPU under 40°C, but it runs at 3300-3600 RPM, which I think is too high. If I switch to Ubuntu, it keeps the CPU at ~40-45°C with 2500-2800 RPM, which is a big difference both in numbers and in noise. I'm looking for a manual fan control solution, or some way to reduce Windows' fan speed curve. I bought the new fans for lower noise (and they deliver it, but not at 3.6k RPM). Thank you!

    Read the article

  • Not enough storage is available to process this command

    - by Mohit
    I am getting this error on almost all operations on a Windows 7 Pro 32-bit machine. By operations I mean anything I do: update a repo from Subversion, access a local IIS site, copy a big folder, run an installer. Sometimes, if I try again, it resolves itself. I think there is something wrong with Windows 7. I searched around and found posts suggesting increasing the IRPStackSize value in the registry; I did that, with no luck. I am using Microsoft Security Essentials Version 1.0.1961.0 as my antivirus package. Once this error starts popping up, I have to restart, and then after some random amount of time it starts showing up again. Any help is appreciated; I am losing a lot of time restarting my system or retrying again and again.

    Read the article

  • How do I put back different SCSI hard drives into their original RAID arrays across different servers?

    - by Edgar
    I have potentially a big mess in my hands: I received today a box with several hard drives that used to be connected to different servers each one of them using an unknown - at least as of right now- RAID configuration. Regretfully, these are not marked and I'm not sure how to go about putting them back into their original servers. Currently I don't have much more information: I don't know what type of array was being used on each instance and I don't have any specifics about the RAID controller originally used on each one of the servers (currently these servers are at a remote location with no easy access). Is there a way to sort through this mess? What would be the consequences of using trial and error to go about it? This might be a very basic question but I don't have much experience dealing with RAID arrays.
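
    If any of these drives happen to have been members of Linux software RAID sets (this is only an assumption; drives from hardware controllers carry controller-specific metadata and generally need the matching controller), a minimal sketch of sorting them non-destructively:

      # print any md superblock on the drive: array UUID, RAID level, and slot
      mdadm --examine /dev/sdb

      # after grouping drives by array UUID, try a read-only assembly
      mdadm --assemble --scan --readonly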

    Read the article

  • Why doesn't rsync use delta-transfer for local files?

    - by o_O Tync
    I have a big ISO image which is currently being downloaded by a torrent client with space reservation turned on: that means the file size is not changing, while some chunks in it (4 MiB each) are constantly changing because of the download. At 90% downloaded I do the initial rsync to save time later:

      $ rsync -Ph DVD.iso /some/target/
      sending incremental file list
      DVD.iso
            2.60G 100%   40.23MB/s    0:01:01 (xfer#1, to-check=0/1)
      sent 2.60G bytes  received 73 bytes  34.59M bytes/sec
      total size is 2.60G  speedup is 1.00

    Then, when the file is fully downloaded, I rsync again:

      total size is 2.60G  speedup is 1.00

    Speedup=1 says delta-transfer was not used, although 90% of the file has not changed. Why?!
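
    For what it's worth, rsync implies --whole-file when both source and destination are local paths, because the delta algorithm saves network I/O rather than disk I/O (both files are read in full either way). A minimal sketch of forcing the delta transfer anyway, using the same filenames as above:

      # force the rolling-checksum delta algorithm and update the target in place,
      # so only changed blocks are written to /some/target/DVD.iso
      rsync -Ph --no-whole-file --inplace DVD.iso /some/target/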

    Read the article

  • Installing VMware Tools in Windows Server 2008 breaks system startup

    - by Hoghweed
    I recently created a VMware virtual machine with Windows Server 2008 Enterprise as the guest. My host is Ubuntu 10.04 on my Lenovo laptop. I ran into big trouble that makes the VM unusable after installing VMware Tools: once the tools are installed, I'm able to run the system only in safe mode. After some Event Viewer analysis, I found the issue is with drivers installed by VMware Tools. Has anyone had the same issue? Is there a good practice for doing this? The configuration of the VM is the following: CPU: 1, RAM: 1020 MB, HD: 40 GB (split files, SCSI), CD: IDE. Thanks in advance.

    Read the article

  • Request Multiple Maya Floating Server Licenses for extra Satellite clients

    - by Rob
    Hello all: I am currently setting up a 'render farm' for Maya 2008 Unlimited. One Maya workstation license comes with the ability to render on eight satellite nodes, and it works perfectly; the remote rendering works like a charm. However, we have additional boxes to set up as satellite rendering nodes, and we have extra Maya workstation licenses. Ideally the workstation could consume two licenses and thus render on 16 nodes, but I haven't been able to figure out how, or determine whether it is actually possible. It's a big project, where rendering the entire thing takes on the order of weeks, so the speedup would be worth it. Any thoughts?

    Read the article

  • When can an FTP server close its passive connections?

    - by Don Kirkby
    Does the FTP protocol allow the server to close any of its passive connections while the client is still connected? Can it tell when the client is finished receiving and then close the connection? I'm including an FTP server in my application using the pyftpdlib Python project. I've got it to work in active and passive mode, but I'm a bit concerned about when it closes its passive connections. I've tried connecting to it with both FileZilla and the default ftp command in Ubuntu, and in both cases, I get a new passive port for every request. That is, if I sit in the root folder and type ls 10 times, I use up 10 ports. This means that I have to allocate a big block of passive ports for the FTP server to use so it won't run out. As soon as the client disconnects, the server releases all the passive connections associated with that client and those ports can be reused. However, a long-running connection could use up a lot of ports.
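
    A minimal sketch of pinning down the passive port pool via pyftpdlib's passive_ports attribute; the import paths follow the current pyftpdlib layout, and the share path, port range, and listening port are placeholders:

      from pyftpdlib.authorizers import DummyAuthorizer
      from pyftpdlib.handlers import FTPHandler
      from pyftpdlib.servers import FTPServer

      authorizer = DummyAuthorizer()
      authorizer.add_anonymous("/srv/ftp")          # placeholder share

      handler = FTPHandler
      handler.authorizer = authorizer
      handler.passive_ports = range(60000, 60100)   # pool handed out in PASV replies

      server = FTPServer(("0.0.0.0", 2121), handler)
      server.serve_forever()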

    Read the article
