Search Results

Search found 18148 results on 726 pages for 'performance monitor'.

Page 223 of 726

  • Can you use a DVI-VGA Adapter on a monitor instead of a video card?

    - by Joel Coehoorn
    I suspect the answer to this is "no", but here goes: I have a monitor with inputs for DVI and VGA. I want to be able to share this display with two computers (one at a time, of course) that both have VGA only. I also have a DVI-VGA dongle that came with a video card that's in a different computer. Can I connect this dongle directly to the DVI port on the monitor so that I can connect both VGA computers? I'd rather not resort to a kvm.

    Read the article

  • Windows 7 treating my third monitor as an extension of my second rather than detecting it as a standalone display

    - by user331837
    I am trying to attach a third monitor to my PC; however, I am running into a rather strange issue. Windows screen resolution still shows only two monitors connected, but the second has been extended over both the second and third monitors. Basically, it shows the main display at its normal size, and the secondary as being twice as wide as normal. My main monitor is connected via DisplayPort, and the second and third are using two of the three mini HDMI ports my video card provides. How do I get Windows to accept my third monitor the same way it does my second?

    Read the article

  • How can I watch full size video on 1 monitor and work on another?

    - by jasondavis
    Here is my situation. I have 3 monitors hooked up to my new PC, running off 2 video cards. When I watch a video on one of the monitors and make it go full screen on that monitor, it is great; however, as soon as I click anywhere on one of the other 2 monitors, the video drops out of full-screen mode and goes back to its original size. This happens when watching a Flash or Silverlight based video in Google Chrome, as well as when I watch video in a player such as iTunes. Is it possible to make a video play full screen on one of my monitors and still work on the other two screens without losing full-screen mode on the one monitor?

    Read the article

  • C#: Direct3D in a control AND fullscreen on a secondary monitor - what's the best way?

    - by Led
    I'm working on a C# application that needs to use Direct3D in a control in a windows form, AND (at the same time) fullscreen on a secondary monitor. Basically, I want a Windows Forms application on one screen with a user-interface to control the graphics, and I'd like to show preview-graphics in a small control, and full-blown superduper megafancy graphics fullscreen on a secondary monitor. What's the best way to approach this? (For example, I know XNA can render in a Windows Forms control, but is it possible to then add a fullscreen window on another monitor as well?)
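
    One common workaround, sketched below, is to skip exclusive fullscreen and instead create a borderless window sized to the secondary monitor's bounds, then render into it with the same Direct3D/XNA device that drives the preview control. This is only an illustrative WinForms sketch under that assumption, not a statement of the best way; the rendering itself is left to the caller.

      using System.Windows.Forms;

      static class SecondaryScreenWindow
      {
          // Creates a borderless, screen-filling window on the second monitor (if one exists).
          // Rendering into it (e.g. handing form.Handle to a second swap chain) is up to the caller.
          public static Form CreateFullscreenForm()
          {
              Screen target = Screen.AllScreens.Length > 1
                  ? Screen.AllScreens[1]
                  : Screen.PrimaryScreen;

              return new Form
              {
                  FormBorderStyle = FormBorderStyle.None,    // no title bar or borders
                  StartPosition = FormStartPosition.Manual,  // we position it ourselves
                  Bounds = target.Bounds,                    // cover the whole secondary screen
                  TopMost = true
              };
          }
      }

    The preview control on the main form and this window can then be treated as two render targets of one device; a borderless window also avoids the mode switches that exclusive fullscreen forces when focus moves back to the primary screen.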

    Read the article

  • How to monitor MySQL query errors, timeouts and logon attempts?

    - by Abel
    While setting up a third-party closed-source CMS (Sitefinity), the setup doesn't create all the tables and procedures necessary to run it. The software lacks a logging system itself, which made me wonder: could I trace and monitor failing SQL statements from MySQL? This serves more than just solving my issue with Sitefinity. More often I wonder what's sent to the MySQL server without wanting to dive into the software products or set up a debugging environment, etc. I tried JetProfiler (performance only) and looked through a few others, but although they monitor a lot, they don't monitor query failures, timeouts or logon attempts. Does anyone know a profiler, tracer or monitoring tool, commercial or free, that can show me this information?
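
    For the "what is being sent to the server" part specifically, one low-tech option is MySQL's own general query log, which records every statement the server receives (including ones that subsequently fail). A minimal sketch, assuming MySQL 5.1 or later (where general_log is a dynamic variable), the SUPER privilege, and the Connector/NET (MySql.Data) package; the connection string is a placeholder:

      using System;
      using MySql.Data.MySqlClient;

      class QueryLogPeek
      {
          static void Main()
          {
              // Placeholder connection string - adjust host/user/password for your server.
              const string connStr = "Server=localhost;Uid=root;Pwd=secret;";

              using (var conn = new MySqlConnection(connStr))
              {
                  conn.Open();

                  // Route the general log to a table and switch it on (MySQL 5.1+, SUPER required).
                  new MySqlCommand("SET GLOBAL log_output = 'TABLE'", conn).ExecuteNonQuery();
                  new MySqlCommand("SET GLOBAL general_log = 'ON'", conn).ExecuteNonQuery();

                  // Let Sitefinity (or any client) run for a while, then inspect what it sent.
                  var cmd = new MySqlCommand(
                      "SELECT event_time, argument FROM mysql.general_log " +
                      "ORDER BY event_time DESC LIMIT 20", conn);

                  using (var reader = cmd.ExecuteReader())
                      while (reader.Read())
                          Console.WriteLine("{0}  {1}", reader[0], reader[1]);
              }
          }
      }

    This does not cover timeouts or failed logons (those end up in the error log, depending on server settings), but it answers the "what is actually being sent" question without touching the application.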

    Read the article

  • How can I display a hidden view in Interface Builder which is on an unattached monitor?

    - by Brennan
    I am using Interface Builder to work on NIBs, and one of the NIBs must have a view positioned on my external monitor, which is not currently attached, because I cannot see it on my MacBook. I have had this problem while editing iPad NIBs, which I work on with my larger external monitor. For some reason Interface Builder is not detecting that there is now just one screen and is not pulling this view onto that monitor. There has to be a way to get it back into the visible space so that I can work on it. I have tried double-clicking the view icon in the organizer, which normally brings the view forward, but it is not coming into view. What can I do? Is this really a bug that has been around this whole time?

    Read the article

  • Ubuntu ATI second display as main display

    - by Josh
    How can I make my external second display the main display in Ubuntu? I'm using the ATI Control Center (amdcccle), and there seems to be no way to make this switch in the GUI. Here is my xorg.conf:

      Section "ServerLayout"
          Identifier "amdcccle Layout"
          Screen 0   "amdcccle-Screen[1]-0" 0 0
      EndSection

      Section "Files"
      EndSection

      Section "Module"
          Load "glx"
      EndSection

      Section "ServerFlags"
          Option "Xinerama" "off"
      EndSection

      Section "Monitor"
          Identifier "0-LCD"
          Option "VendorName" "ATI Proprietary Driver"
          Option "ModelName" "Generic Autodetecting Monitor"
          Option "DPMS" "true"
          Option "PreferredMode" "1366x768"
          Option "TargetRefresh" "60"
          Option "Position" "1680 0"
          Option "Rotate" "normal"
          Option "Disable" "false"
      EndSection

      Section "Monitor"
          Identifier "0-CRT1"
          Option "VendorName" "ATI Proprietary Driver"
          Option "ModelName" "Generic Autodetecting Monitor"
          Option "DPMS" "true"
          Option "PreferredMode" "1680x1050"
          Option "TargetRefresh" "60"
          Option "Position" "0 0"
          Option "Rotate" "normal"
          Option "Disable" "false"
      EndSection

      Section "Device"
          Identifier "Default Device"
          Driver     "fglrx"
      EndSection

      Section "Device"
          Identifier "amdcccle-Device[1]-0"
          Driver     "fglrx"
          Option     "Monitor-LCD" "0-LCD"
          Option     "Monitor-CRT1" "0-CRT1"
          BusID      "PCI:1:5:0"
      EndSection

      Section "Device"
          Identifier "amdcccle-Device[1]-1"
          Driver     "fglrx"
          Option     "Monitor-LCD" "0-LCD"
          BusID      "PCI:1:5:0"
          Screen     1
      EndSection

      Section "Screen"
          Identifier   "Default Screen"
          DefaultDepth 24
      EndSection

      Section "Screen"
          Identifier   "amdcccle-Screen[1]-0"
          Device       "amdcccle-Device[1]-0"
          DefaultDepth 24
          SubSection "Display"
              Viewport 0 0
              Virtual  3046 3046
              Depth    24
          EndSubSection
      EndSection

      Section "Screen"
          Identifier   "amdcccle-Screen[1]-1"
          Device       "amdcccle-Device[1]-1"
          DefaultDepth 24
          SubSection "Display"
              Viewport 0 0
              Depth    24
          EndSubSection
      EndSection

    Read the article

  • Triple-Head Setup: Radeon 5770

    - by Aren B
    I currently own a Sapphire Radeon HD 5770 1GB w/ DDR5 (Link). I've got two LCDs set up in this configuration:

      +---------++------+
      |         ||      |
      |    1    ||   2  |
      +---------++------+

    I'm looking into buying a new TV/monitor, a Samsung T240HD (h**p://www.memoryexpress.com/Products/PID-MX22473(ME).aspx), and I'd like to set up a triple monitor configuration like this (new monitor being #3):

      +---------++------++---------+
      |         ||      ||         |
      |    3    ||   2  ||    1    |
      +---------++------++---------+

      Monitor 1: DVI
      Monitor 2: DVI
      Monitor 3: HDMI
      PS3 - Monitor 3: HDMI2

    Is this possible with my current video card? Can I plug in 2x DVI + 1x HDMI and get a third display? Or am I going to have to buy a slew of DisplayPort adapters? I know on older video cards you could only have 2 active displays, but I heard that barrier was overcome with the DisplayPort series of video cards.

    Read the article

  • Strange scaling when duplicating monitors with another screen

    - by Aerione
    I can't get two monitors to scale application resolutions the same way. My main monitor works normally. My second monitor, however, which is set to duplicate its image onto a TV I have in my room, renders applications at a far lower resolution than the 1080p I've set it to. Also, the mouse pointer on the second monitor is enormous; it looks 2-3 times bigger than the one on the main monitor. I've checked "Let me choose one scaling level for all my displays", to no avail. I took comparison screenshots of Metro on the main monitor and of Metro on monitor 2 (set to 1080p and duplicated onto the TV). This issue only seems to arise when I duplicate the monitor onto the TV. Does anyone have any idea how to solve this?

    Read the article

  • What are some good ways to store performance statistics in a database for querying later?

    - by Nathan
    Goal: Store arbitrary performance statistics of stuff that you care about (how many customers are currently logged on, how many widgets are being processed, etc.) in a database so that you can understand how your servers are doing over time.

    Assumptions: A database is already available, and you already know how to gather the information you want and are capable of putting it in the database however you like.

    Some ideal attributes of a solution:

      - Causes no noticeable performance hit on the server being monitored
      - Has a very high precision of measurement
      - Does not store useless or redundant information
      - Is easy to query (lends itself to gathering/displaying useful information)
      - Lends itself to being graphed easily
      - Is accurate
      - Is elegant

    Primary questions:

      1) What is a good design/method/scheme for triggering the storing of statistics?
      2) What is a good database design for how to actually store the data?

    Example answers... that are sort of vague and lame...

      1) I could, once per [fixed time interval], store a row of data with all the performance measurements I care about in each column of one big flat table, indexed by timestamp and/or server.
      2) I could have a daemon monitoring the performance stuff I care about, and add a row whenever something changes (instead of at fixed time intervals) to a flat table as in #1.
      3) I could trigger either as in #2, but store information about each aspect of performance that I'm measuring in separate tables, opening up the possibility of adding tons of rows for often-changing items, and few rows for seldom-changing items. Etc.

    In the end, I will implement something, even if it's some super-braindead approach I make up myself, but I'm betting there are some really smart people out there willing to share their experiences and bright ideas!
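
    For question 2, one commonly suggested shape is the narrow one-row-per-sample table from option #3: it tolerates metrics being added later and sampled at different rates, at the cost of more rows. A rough sketch, assuming SQL Server via System.Data.SqlClient and a hypothetical table name; adapt the DDL and driver to your database:

      using System;
      using System.Data.SqlClient;   // assumes SQL Server; swap the provider for another database

      class MetricWriter
      {
          // Hypothetical narrow table: one row per (time, server, metric) sample.
          // CREATE TABLE PerfSample (
          //     SampledAt  DATETIME2    NOT NULL,
          //     ServerName VARCHAR(64)  NOT NULL,
          //     MetricName VARCHAR(128) NOT NULL,
          //     Value      FLOAT        NOT NULL
          // );
          // An index on (MetricName, SampledAt) keeps per-metric graphs cheap to query.

          public static void Write(string connStr, string server, string metric, double value)
          {
              using (var conn = new SqlConnection(connStr))
              using (var cmd = new SqlCommand(
                  "INSERT INTO PerfSample (SampledAt, ServerName, MetricName, Value) " +
                  "VALUES (@t, @s, @m, @v)", conn))
              {
                  cmd.Parameters.AddWithValue("@t", DateTime.UtcNow);
                  cmd.Parameters.AddWithValue("@s", server);
                  cmd.Parameters.AddWithValue("@m", metric);
                  cmd.Parameters.AddWithValue("@v", value);
                  conn.Open();
                  cmd.ExecuteNonQuery();
              }
          }
      }

    A flat wide table (option #1) is easier to query for "everything at time T", while the narrow layout is easier to graph per metric; which wins depends mostly on how often the set of metrics changes.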

    Read the article

  • How to tweak the performance of Bit blit on Barco monitors?

    - by krishna
    Hi, the performance of a bit blit on the small monitor (16 bpp, 60 Hz, 1280x1024 resolution) is 0.9909 ms. On the big monitors (8 bpp, 60 Hz, 2048x5260) it is 52.315 ms. I use SRCCOPY to do the bit blit operation. How can we optimize the performance of the bit blit on the big monitor? Please share your thoughts. Thanks, kk
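
    A small measurement harness can at least make the comparison repeatable. The C# sketch below is an assumption-laden illustration (the original code is presumably native GDI); it times one SRCCOPY BitBlt from a memory DC to the screen. One common optimization to test with it is making the source bitmap's pixel format match the destination's, since GDI converts pixel formats when they differ:

      using System;
      using System.Diagnostics;
      using System.Drawing;
      using System.Runtime.InteropServices;

      class BlitTimer
      {
          [DllImport("gdi32.dll")]
          static extern bool BitBlt(IntPtr hdcDest, int xDest, int yDest, int width, int height,
                                    IntPtr hdcSrc, int xSrc, int ySrc, CopyPixelOperation rop);

          static void Main()
          {
              const int width = 2048, height = 1024;   // adjust to the region you actually blit

              using (var source = new Bitmap(width, height))     // 32bpp memory bitmap
              using (var gSrc = Graphics.FromImage(source))
              using (var gDst = Graphics.FromHwnd(IntPtr.Zero))  // the whole screen
              {
                  IntPtr hdcSrc = gSrc.GetHdc();
                  IntPtr hdcDst = gDst.GetHdc();

                  var sw = Stopwatch.StartNew();
                  BitBlt(hdcDst, 0, 0, width, height, hdcSrc, 0, 0, CopyPixelOperation.SourceCopy);
                  sw.Stop();
                  Console.WriteLine("SRCCOPY blit took {0:F3} ms", sw.Elapsed.TotalMilliseconds);

                  gSrc.ReleaseHdc(hdcSrc);
                  gDst.ReleaseHdc(hdcDst);
              }
          }
      }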

    Read the article

  • DB2: increased bufferpool size and compressed tables do not equal better performance. Why?

    - by Mestika
    Hi, I'm working on tuning and increasing the performance of my IBM DB2 version 9.7 database. I've been searching around the net for the last couple of days and learned that if I created my tables in COMPRESS mode, created one more bufferpool, and set both bufferpools to 1024 MB, then the performance of my queries should increase because of fewer I/Os to the disks. However, when I run my time analysis, the performance decreases. I added the new additions to my regular database with the indexes I've used all the time. Each time I search Google I come up with the statement that increased bufferpool size, several bufferpools AND table compression SHOULD give better performance. I'm very puzzled by this totally unexpected result. Are there some tuning mechanisms I've forgotten, or does anyone have an explanation for this odd behavior? Sincerely, Mestika

    Read the article

  • Why Is Vertical Monitor Resolution So Often a Multiple of 360?

    - by Jason Fitzpatrick
    Stare at a list of monitor resolutions long enough and you might notice a pattern: many of the vertical resolutions, especially those of gaming or multimedia displays, are multiples of 360 (720, 1080, 1440, etc.) But why exactly is this the case? Is it arbitrary or is there something more at work?

    Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    The Question

    SuperUser reader Trojandestroy recently noticed something about his display interface and needs answers:

    YouTube recently added 1440p functionality, and for the first time I realized that all (most?) vertical resolutions are multiples of 360. Is this just because the smallest common resolution is 480×360, and it’s convenient to use multiples? (Not doubting that multiples are convenient.) And/or was that the first viewable/conveniently sized resolution, so hardware (TVs, monitors, etc) grew with 360 in mind? Taking it further, why not have a square resolution? Or something else unusual? (Assuming it’s usual enough that it’s viewable). Is it merely a pleasing-the-eye situation? So why have the display be a multiple of 360?

    The Answer

    SuperUser contributor User26129 offers us not just an answer as to why the numerical pattern exists but a history of screen design in the process:

    Alright, there are a couple of questions and a lot of factors here. Resolutions are a really interesting field of psychooptics meeting marketing.

    First of all, why are the vertical resolutions on youtube multiples of 360. This is of course just arbitrary, there is no real reason this is the case. The reason is that resolution here is not the limiting factor for Youtube videos – bandwidth is. Youtube has to re-encode every video that is uploaded a couple of times, and tries to use as little re-encoding formats/bitrates/resolutions as possible to cover all the different use cases. For low-res mobile devices they have 360×240, for higher res mobile there’s 480p, and for the computer crowd there is 360p for 2xISDN/multiuser landlines, 720p for DSL and 1080p for higher speed internet. For a while there were some other codecs than h.264, but these are slowly being phased out with h.264 having essentially ‘won’ the format war and all computers being outfitted with hardware codecs for this.

    Now, there is some interesting psychooptics going on as well. As I said: resolution isn’t everything. 720p with really strong compression can and will look worse than 240p at a very high bitrate. But on the other side of the spectrum: throwing more bits at a certain resolution doesn’t magically make it better beyond some point. There is an optimum here, which of course depends on both resolution and codec. In general: the optimal bitrate is actually proportional to the resolution.

    So the next question is: what kind of resolution steps make sense? Apparently, people need about a 2x increase in resolution to really see (and prefer) a marked difference. Anything less than that and many people will simply not bother with the higher bitrates, they’d rather use their bandwidth for other stuff. This has been researched quite a long time ago and is the big reason why we went from 720×576 (415kpix) to 1280×720 (922kpix), and then again from 1280×720 to 1920×1080 (2MP). Stuff in between is not a viable optimization target. And again, 1440P is about 3.7MP, another ~2x increase over HD. You will see a difference there. 4K is the next step after that.

    Next up is that magical number of 360 vertical pixels. Actually, the magic number is 120 or 128. All resolutions are some kind of multiple of 120 pixels nowadays, back in the day they used to be multiples of 128. This is something that just grew out of LCD panel industry. LCD panels use what are called line drivers, little chips that sit on the sides of your LCD screen that control how bright each subpixel is. Because historically, for reasons I don’t really know for sure, probably memory constraints, these multiple-of-128 or multiple-of-120 resolutions already existed, the industry standard line drivers became drivers with 360 line outputs (1 per subpixel). If you would tear down your 1920×1080 screen, I would be putting money on there being 16 line drivers on the top/bottom and 9 on one of the sides. Oh hey, that’s 16:9. Guess how obvious that resolution choice was back when 16:9 was ‘invented’.

    Then there’s the issue of aspect ratio. This is really a completely different field of psychology, but it boils down to: historically, people have believed and measured that we have a sort of wide-screen view of the world. Naturally, people believed that the most natural representation of data on a screen would be in a wide-screen view, and this is where the great anamorphic revolution of the ’60s came from when films were shot in ever wider aspect ratios.

    Since then, this kind of knowledge has been refined and mostly debunked. Yes, we do have a wide-angle view, but the area where we can actually see sharply – the center of our vision – is fairly round. Slightly elliptical and squashed, but not really more than about 4:3 or 3:2. So for detailed viewing, for instance for reading text on a screen, you can utilize most of your detail vision by employing an almost-square screen, a bit like the screens up to the mid-2000s.

    However, again this is not how marketing took it. Computers in ye olden days were used mostly for productivity and detailed work, but as they commoditized and as the computer as media consumption device evolved, people didn’t necessarily use their computer for work most of the time. They used it to watch media content: movies, television series and photos. And for that kind of viewing, you get the most ‘immersion factor’ if the screen fills as much of your vision (including your peripheral vision) as possible. Which means widescreen.

    But there’s more marketing still. When detail work was still an important factor, people cared about resolution. As many pixels as possible on the screen. SGI was selling almost-4K CRTs! The most optimal way to get the maximum amount of pixels out of a glass substrate is to cut it as square as possible. 1:1 or 4:3 screens have the most pixels per diagonal inch. But with displays becoming more consumery, inch-size became more important, not amount of pixels. And this is a completely different optimization target. To get the most diagonal inches out of a substrate, you want to make the screen as wide as possible. First we got 16:10, then 16:9 and there have been moderately successful panel manufacturers making 22:9 and 2:1 screens (like Philips). Even though pixel density and absolute resolution went down for a couple of years, inch-sizes went up and that’s what sold. Why buy a 19″ 1280×1024 when you can buy a 21″ 1366×768? Eh…

    I think that about covers all the major aspects here.
    There’s more of course; bandwidth limits of HDMI, DVI, DP and of course VGA played a role, and if you go back to the pre-2000s, graphics memory, in-computer bandwidth and simply the limits of commercially available RAMDACs played an important role. But for today’s considerations, this is about all you need to know.

    Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.

    Read the article

  • Change order of monitors without changing fullscreen "size"

    - by user171489
    I have a dual monitor setup. My primary monitor is a 22" with a max resolution of 1680x1050 and my secondary is a 19" with a max resolution of 1280x1024. The secondary is standing on the left side of the primary one. My problem now is that if I change the order of the monitors in my NVIDIA X Server Settings so that my secondary is the first one (or the one on the left), the fullscreen mode in Flash is scaled to my secondary monitor, even if it's displayed on my primary one. Meaning that I get a 1280x1024 "fullscreen" window on my bigger primary monitor. When I configure my X server settings so the secondary monitor is the one on the right, I don't have this problem. The only thing then is that I have to scroll out on the right to get to my monitor on the right. I can't move my secondary monitor to the right side of my primary due to lack of space and my belief that there must be a software solution. ;) Thanks in advance.

    Read the article

  • Using onboard and PCI-E graphics at the same time

    - by Endle
    Hello wonderful people. I know there are several other posts with similar questions. I also know how to use Google. I have also read up on posts discussing Bumblebee, CrossFire, ATI Catalyst and many other interesting topics. I would just like someone to give me advice on how to use the onboard and PCI-E graphics at the same time. I know the computer is capable of doing this. It works in Windows: I can use the VGA and DVI onboard ports and the HDMI port of the add-on card all at the same time. It works great in Windows 7. In Ubuntu, it seems only one or the other will work. I can use any combination of two displays on either adapter: VGA and HDMI, HDMI and DVI, and so forth. I have started experimenting with xorg.conf files, but have not been able to get any of them to work. Here is my last attempt at writing an xorg.conf file:

      Section "ServerLayout"
          Identifier  "X.org Configured"
          Screen 0    "Screen0" 0 0
          Screen 1    "Screen1" LeftOf "Screen0"
          Screen 2    "Screen2" LeftOf "Screen1"
          InputDevice "Mouse0" "CorePointer"
          InputDevice "Keyboard0" "CoreKeyboard"
      EndSection

      Section "Device"
          Identifier "Onboard Video"
          Driver     "radeon"
          BusID      "PCI:01:05.0"
      EndSection

      Section "Device"
          Identifier "Graphics Card"
          Driver     "radeon"
          BusID      "PCI:02:00.0"
      EndSection

      Section "Monitor"
          Identifier "CRT2"
          Option "VendorName" "ViewSonic"
          Option "ModelName" "Generic Autodetecting Monitor"
          Option "DPMS" "true"
      EndSection

      Section "Monitor"
          Identifier "DVI1"
          VendorName "ACR"
          ModelName  "P224W"
          Option "DPMS"
      EndSection

      Section "Monitor"
          Identifier "DVI2"
          Option "VendorName" "Acer"
          Option "ModelName" "Generic Autodetecting Monitor"
          Option "DPMS" "true"
      EndSection

      Section "Screen"
          Identifier   "Screen0"
          Device       "Onboard Video"
          Monitor      "CRT2"
          DefaultDepth 24
          SubSection "Display"
              Depth 24
              Modes "1280x1024"
          EndSubSection
      EndSection

      Section "Screen"
          Identifier   "Screen1"
          Device       "Graphics Card"
          Monitor      "DVI1"
          DefaultDepth 24
          SubSection "Display"
              Depth 24
              Modes "1920x1080"
          EndSubSection

    Read the article

  • How Can I Disable Windows 7's Aero Performance Warnings?

    - by Jason Fitzpatrick
    You know your computer isn’t cutting edge, but there’s no need for Windows 7 to constantly remind you. Read on to see how you can disable its constant nagging to adjust your color scheme to improve performance. Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    Read the article

  • Dual displays not working - NVidia - Ubuntu 12.4

    - by user75105
    Graphics Card: NVidia 460 GTX. Driver: NVIDIA accelerated graphics driver (version current) I have one DVI monitor, an old Dell LCD from 2005, and one VGA monitor, an Asus ML238H from 2010 whose HDMI port broke. The Asus is plugged into my graphics card's primary monitor slot and is the better monitor even though it is VGA but my computer defaults to the Dell. This happens when I boot as well; the loading screens, the motherboard brand image, etc. are all displayed on the Dell monitor until Windows loads. Then both monitors work. The same thing happened when I booted up Ubuntu 12.4 but I did not see the second monitor when the log-in screen popped up, nor did I when I logged in. I went to System Settings/Displays and my Asus monitor is not an option. I clicked Detect Displays and the Asus is not detected. I looked at the other questions regarding NVIDIA drivers and recalled my problems with Ubuntu a few years ago and decided to check the driver. I went to Additional Drivers to install the proprietary driver and it looks like it's installed and active but I'm still having this problem. There is another driver option, the post-release NVIDIA driver, but that does not fix the problem either. Also, under System Details/Graphics the graphics device is listed as Unknown, which might indicate that it is using an open source generic driver and not the proprietary NVidia driver. But under Additional Drivers it says that I am using the NVidia driver. Any help is appreciated.

    Read the article

  • How can I use Performance Counters in C# to monitor 4 processes with the same name?

    - by Waffles
    I'm trying to create a performance counter that can monitor the processor time of applications, one of which is Google Chrome. However, I notice that the processor time I get for Chrome is unnaturally low. Looking in Task Manager, I realized my problem: Chrome has more than one process running under the exact same name, but each process has a different working set size and thus (I would believe) different processor times. I tried doing this:

      // get all processes running under the same name, and make a performance counter
      // for each one.
      Process[] toImport = Process.GetProcessesByName("chrome");
      instances = new PerformanceCounter[toImport.Length];
      for (int i = 0; i < instances.Length; i++)
      {
          PerformanceCounter toPopulate = new PerformanceCounter
              ("Process", "% Processor Time", toImport[i].ProcessName, true);
          //Console.WriteLine(toImport[i].ProcessName + "#" + i);
          instances[i] = toPopulate;
      }

    But that doesn't seem to work at all - I just monitor the same process several times over. Can anyone tell me of a way to monitor separate processes with the same name?
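
    The usual way around this is that the "Process" performance object disambiguates same-named processes as chrome, chrome#1, chrome#2, and so on, so you enumerate the category's instance names instead of using Process.ProcessName, and read the "ID Process" counter if you need to map an instance back to a PID. A sketch of that approach (instance numbering can shift when a process exits, which is why the PID mapping is worth keeping):

      using System;
      using System.Collections.Generic;
      using System.Diagnostics;
      using System.Linq;

      class ChromeCounters
      {
          static void Main()
          {
              var category = new PerformanceCounterCategory("Process");

              // Instances of the same executable show up as "chrome", "chrome#1", "chrome#2", ...
              string[] instances = category.GetInstanceNames()
                  .Where(name => name == "chrome" || name.StartsWith("chrome#"))
                  .ToArray();

              var counters = new List<PerformanceCounter>();
              foreach (string instance in instances)
              {
                  // "ID Process" maps the counter instance back to an actual PID.
                  using (var idCounter = new PerformanceCounter("Process", "ID Process", instance, true))
                      Console.WriteLine("{0} -> PID {1}", instance, (int)idCounter.RawValue);

                  counters.Add(new PerformanceCounter("Process", "% Processor Time", instance, true));
              }

              // The first NextValue() of a rate counter always returns 0; sample again after a delay.
              foreach (var c in counters) c.NextValue();
              System.Threading.Thread.Sleep(1000);
              foreach (var c in counters)
                  Console.WriteLine("{0}: {1:F1} % CPU", c.InstanceName, c.NextValue());
          }
      }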

    Read the article

  • Impact of the L3 cache on performance - worth a dual-processor system?

    - by Dan Nissenbaum
    I will be purchasing a new high-end system, and I would like to have a better sense of whether a dual-processor Xeon system (I am looking at the new, high-end Xeon E5-2687W) might, realistically, provide a noticeable performance improvement due to the doubling of the L3 cache (20 MB per CPU). (This is in addition to the occasional added advantage due to the doubling of cores and RAM.) My usage scenario is, roughly, that I have many background applications running at any time - 3 or 4 data compression/backup applications, a low-impact web server, one or two virtual machines at any given time (usually fairly idle), and perhaps 20 utility programs that utilize a noticeable (but small) portion of the CPU cores. In total, when I am not actively using the computer, about 25% of the total CPU power is utilized in my current i7-970 6-core (12 thread) system. When I am doing routine work, the CPU utilization often exceeds 50%, and occasionally hits 75%-80%. The Xeon E5-2687W is not only a second-generation i7 (so should improve performance for that reason), but also has 8 cores (16 threads), rather than 6 cores. For this reason, I expect to run into the 75% CPU range even less frequently. Nonetheless, the ability to double the cores and the RAM is a consideration. However, in the end, I believe this decision comes down to whether the doubling of the L3 cache will provide a noticeable improvement. There are many benchmarks, and a lot of discussion, regarding CPU power. However, I find very little discussion of L3 cache utilization, and how increases in the L3 cache (such as doubling it with dual processors) affect performance. For example: If there are only two processes running, but each benefits from a large L3 cache (such as might be the case for background processes that frequently scan the file system), perhaps the overall system performance might noticeably improve with dual CPU's - even if only a single core is active on each CPU - due to each process having double the effective L3 cache. I am hoping that someone has a sense of the benefits of increasing (or doubling) the L3 cache size. Note: the CPU I am considering (the Xeon E5-2687W) has 20 MB L3 cache, so a system with dual CPU's would have 40 MB L3 cache.

    Read the article

  • Do database snapshots of mirrored databases affect performance of the principal database?

    - by yrushka
    I have 2 servers set up for high-safety mirroring. One is the principal and the other is the mirror. Currently I have 2 snapshots of a production database (100 GB in size) created on the principal server (so that massive SELECT processes can run without locking) and 2 snapshots on the mirror server of the same database for reporting purposes. I know snapshots reduce the performance of their source databases, but I am not sure whether snapshots on the mirror server have any impact on the principal server's performance. Thanks,

    Read the article

  • MySQL 5.0.23 vs 5.5: performance benefits and upgrade issues?

    - by WarDoGG
    I have been told that MySQL 5.5 has a significant performance boost compared to 5.0. Our server handles a lot of data (around 30 million records processed per 5-10 seconds) and needs every drop of performance we can get. Will it be beneficial if we upgrade from 5.0.23 to MySQL 5.5? Also, we have lots of database indexes set up on the tables, and I've been told that sometimes the indexes become corrupt after a version upgrade and have to be rebuilt. Is this true?

    Read the article

  • Does QEMU's performance (still) lag VirtualBox's and is there a way to improve it without kvm?

    - by Catskul
    I've noticed several articles that have claimed that QEMU is slower than VirtualBox (without hardware assistance) but several are years old, and the newest seemed to be from last year. Is it true that QEMU is slower than VirtualBox? If so why? Are there any tricks to close the performance gap? Some of my host systems do not have virtualization support so I'm especially interested in performance tips that work without the kernel module.

    Read the article

  • Virtual guest machine disk defrag improves performance: myth or reality?

    - by jafin
    When operating a VMware or Hyper-V guest, the typical advice is to defrag the host and the virtual disk images to improve performance. Something like the command vmware-vdiskmanager -d <file.vmdk> works great. Yet I can't find any qualitative evidence suggesting that defragging inside the guest VM improves performance. Does anyone have advice or evidence, not taken from a commercial defragger's whitepaper, that suggests in-guest defragging helps?

    Read the article

  • How to monitor outgoing messages from a TIBCO EMS .NET client?

    - by Waheed Sayed
    While using the TIBCO EMS .NET client, how do I monitor the outgoing messages from my .NET TIBCO client? I'm going to send application-level, not JMS-level, acknowledgements and replies. How can I tell if the application sent them or not? If the client failed to send a message, will it throw an exception or store the message to try again later? Bottom line: is there any tool that enables me to monitor (outgoing) activities from the client's point of view?

    Read the article
