Search Results

Search found 6387 results on 256 pages for 'cpu allocation'.


  • Building a workstation computer for Image processing? [closed]

    - by echolab
    I am shooting a gigapixel image (my goal is 50 gigapixels and the shooting is almost done), and I am doing some research to build a workstation so I can stitch the images together. My questions: could you suggest a dual-CPU motherboard that works well with Xeon 5500-series processors and supports 64GB+ of RAM? And which hardware matters most for image processing? All I see in the stories behind gigapixel panoramas is that they used dual Xeons and 32GB+ of RAM, so I wonder if I am going about this right; they don't post information about the graphics card, motherboard and so on. I did ask on several websites, but the best answer I got was "get a high-end workstation and plenty of hours". I don't want to purchase a ready-to-use workstation, I want to build it myself. Thanks in advance

    Read the article

  • Git completion __git_ps1 really slow on Mac

    - by mckeed
    I've had __git_ps1 in my bash prompt for a while, but just recently (I noticed it after I did some messing around with Homebrew and rbenv), it has slowed down my prompt horribly. When I'm in a git directory I have to wait 3-4 seconds after every command for the prompt to appear. If I just mash return and watch the Activity Monitor, it shows that distnoted and Finder are using more CPU than normal during the delay. Could something git-completion.bash is doing be triggering a notification to Finder? Maybe it involves folder actions or something?

    Read the article

  • Compiling a vanilla kernel for several machines

    - by Linux Pwns Mac
    When compiling kernels for multiple machines, is there a safe or correct way to create a template, say for servers? I work with a lot of RHEL servers and want to build their kernels with grsecurity (GRSEC). However, I don't want to rebuild from each machine's .config every time and go in and remove a bunch of unrelated modules like wireless, Bluetooth, etc., which you typically don't need on servers. I want to create a template .config that can be used on any machine, but is there a safe way to do that when hardware changes? From my experience, Linux handles moving between hardware far more easily than Windows/OSX. I assume that as long as I leave most of the main hardware and CPU modules enabled, this could produce a .config that would work for all, or just about any, machines?

    Read the article

  • How do I disable Windows Search Indexing Service while on battery on Windows 7?

    - by Slaggg
    Since upgrading from Windows XP to Windows 7, I've found laptop battery life is reduced. One issue is that the Windows Search Indexing Service is constantly running, even when on battery power. Using Resource Monitor shows it is often the top consumer of CPU time and top producer of disk activity. I imagine this is one of the primary reasons the battery is getting drained faster. I cannot find any options to tell Windows Search Indexing Service to not index while on battery power. The only solution seems to be to stop the service, and also disable the service (if you just stop it, it will restart itself after a while). Doing this gives me considerably more battery life, but it's a pain because I have to re-enable the service when plugged in again. Is there a better solution?

    Read the article

  • VPS stops responding every now and again

    - by Or W
    I have a Linode VPS that I use to host some of my websites. It's Ubuntu-based and up to date in terms of packages. I don't have any cron jobs scheduled or any automatic processes. I host a few (up-to-date) WordPress blogs there that have very little traffic altogether. Every day, at a different time, the server stops responding: I can't SSH to it, web access times out, and it just dies until I reboot it through the Linode manager. On the Linode dashboard I can see that CPU usage is not very high (2-3%), incoming/outgoing traffic is at 0, and the IO count spikes just before the server stops responding (swap IO is at 2k and IO rate is at 5k). When I reboot the server everything is fine again. I'm trying to figure out a way to analyze what's going on at these random times when the server freezes up. How can I determine the problem?

    Read the article

  • I know this is a stupid question but... How many websites can my server potentially hold?

    - by Daniel Kindler
    Sorry for the "noob" question, but... About how many medium-sized websites with average traffic could this server hold? Just like the average website, kind of like a small business site. How many sites could this server hold, but still maintain nice, decent speed? PowerEdge R510 PE R510 Chassis for Up to Four 3.5" Cabled Hard Drives, LED edit Processor Intel® Xeon® E5630 2.53Ghz, 12M Cache,Turbo, HT, 1066MHz Max Mem edit Memory 8GB Memory (4x2GB), 1333MHz Single Ranked UDIMMs for 1 Procs, Optimized edit Operating System SUSE Linux Enterprise Server 10, SP3, Up To 32 CPU Lic, 1 YR Sub, DIB, Media edit Red Hat Enterprise Linux Licensing Hard Drives 250GB 7.2K RPM SATA 3.5" Cabled Hard Drive edit Hard Drives 1TB 7.2K RPM SATA 3.5" Cabled Hard Drive edit Hard Drives 2 X 2TB 7.2K RPM SATA 3.5in Cabled Hard Drive Hard Drive Configuration No RAID, Embedded SATA Controller for x4 Chassis edit Power Supply 480 Watt Non-Redundant Power Supply edit Thank you!

    Read the article

  • Optimizing MySQL

    - by Thoman
    Hello. My website runs on a dedicated server: Intel Xeon E5620 (8 cores), 12GB RAM, CentOS 32-bit with DirectAdmin, an 80GB SAS disk, and PHP running as php-cgi. This dedicated server runs a single website using WordPress 2.9.2 (plus a cache plugin and others); the database size is 600MB and there are only about 100 users online, but my website is running very slowly. Please help me with my my.cnf config file:
      [mysqld]
      user=mysql
      key_buffer=128M
      set-variable = max_connections=1000
      socket = /var/lib/mysql/mysql.sock
      key_buffer = 32M
      table_cache = 1024
      open_files_limit = 16344
      join_buffer_size = 8M
      read_buffer_size = 8M
      sort_buffer_size = 8M
      tmp_table_size = 512M
      read_rnd_buffer_size = 8M
      max_heap_table_size = 256M
      #myisam_sort_buffer_size = 256M
      thread_cache_size = 8
      thread_cache = 32
      query_cache_type = 1
      query_cache_limit = 1024M
      query_cache_size = 1024M
      thread_concurrency = 16
      wait_timeout = 10
      connect_timeout = 10
      interactive_timeout = 10
      long_query_time = 1
      log-slow-queries = /var/log/mysqlslowqueries.log
      max_allowed_packet = 32M
      skip-innodb
      [myisamchk]
      key_buffer = 64M
      sort_buffer = 64M
      read_buffer = 16M
      write_buffer = 16M
      [isamchk]
      key_buffer = 64M
      sort_buffer = 64M
      read_buffer = 16M
      write_buffer = 16M
    And Apache

    Read the article

  • RCA monitor won't display highest resolution when NVidia drivers are installed

    - by novellterminator
    My computer has a dual-monitor setup, and it won't display my second monitor's highest resolution. In fact, it will only display one resolution; anything higher or lower makes the monitor not show the screen at all. I had this problem with this computer before, and even after upgrading almost all my components (CPU, RAM, motherboard, and video card included) I still have it with my new setup. This leads me to believe it's a problem with my monitor and the NVIDIA driver together. Any thoughts on what can be done? The manufacturer of the monitor doesn't provide a driver for it. I used to run Windows Vista and my new setup runs Windows 7.

    Read the article

  • Problems with the backup

    - by marcodv
    I wrote a script that runs around 4 o'clock in the morning to back up all the MySQL databases and the config files for 250 Linux VMs. The problem is that it takes ages to complete, and more than 50% of these VMs need over 8 hours to finish. More or less all the VMs have the same configuration: the same amount of RAM, the same amount of disk space, the same number of CPUs, and Debian 6.0.5. I am saving these backups to Amazon S3, because it's the cheapest solution I've found. Now my question is: does anyone have any solutions or suggestions about this? On one blog I've read that a combination of ionice and nice could be a good workaround. Any thoughts?

    Read the article

  • Freeing of allocated memory in Solaris/Linux

    - by user355159
    Hi, I have written a small program and compiled it on both Solaris and Linux to measure how this approach would perform in my application. The program first calls sbrk(0) to get the base address of the heap region. After that I allocate about 1.5GB of memory using malloc, then use memcpy to copy 1.5GB of content into the allocated area. Then I free the allocated memory. After freeing, I call sbrk(0) again to view the heap size. This is where I am a little confused. On Solaris, even though I freed the allocated memory (nearly 1.5GB), the heap size of the process remains huge. But when I run the same program on Linux, after freeing, the heap size of the process is equal to the heap size before the 1.5GB allocation. I know Solaris does not free the memory immediately, but I don't know how to tune Solaris to release the memory back immediately after free(). Also, please explain why the same problem does not occur on Linux. Can anyone help me out with this? Thanks, Santhosh.
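
    A minimal sketch of the experiment described above, for the Linux/glibc side only (malloc_trim() is a glibc extension and does not exist in the Solaris libc; the size, fill pattern and labels are just illustrative). Note that malloc() and free() are library calls, not system calls, so whether freed memory goes back to the OS is an allocator policy rather than a kernel setting:

      /* heap_probe.c - sketch of the malloc/free/sbrk experiment (Linux/glibc).
       * Build: gcc -O2 heap_probe.c -o heap_probe
       */
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>
      #include <unistd.h>
      #include <malloc.h>                 /* malloc_trim() - glibc only */

      #define SIZE ((size_t)1536 * 1024 * 1024)   /* ~1.5GB, as in the question */

      static void *heap_base;

      static void report(const char *label)
      {
          /* how far the program break has moved since startup, in KB */
          long grown_kb = (long)((char *)sbrk(0) - (char *)heap_base) / 1024;
          printf("%-24s break has grown by %ld KB\n", label, grown_kb);
      }

      int main(void)
      {
          heap_base = sbrk(0);

          char *buf = malloc(SIZE);
          if (buf == NULL) { perror("malloc"); return 1; }
          memset(buf, 0xAB, SIZE);        /* touch every page, like the memcpy above */
          report("after malloc + touch:");

          free(buf);
          report("after free:");

          /* glibc extension: ask the allocator to return free pages to the kernel */
          malloc_trim(0);
          report("after malloc_trim(0):");
          return 0;
      }

    On Linux/glibc an allocation this large is normally served by mmap() rather than by moving the break (see the M_MMAP_THRESHOLD mallopt() parameter), so free() unmaps it immediately and sbrk(0) barely changes, which matches the Linux behaviour described above. On Solaris the behaviour depends on which malloc library the program is linked against (the default libc malloc, libumem, mtmalloc, ...), and it is that library, not the kernel, that decides when the heap shrinks.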

    Read the article

  • Isn't ethernet used for LAN?

    - by Alraxite
    I'm not really knowledgeable about networking, hence the stupid question. I read that Ethernet is used for LAN connections, but I'm pretty sure what I'm currently using to connect to the internet is 'Ethernet'. The only knowledge I have about networking is that there are two types of internet connections, dial-up and broadband, and that broadband can be either DSL or cable. So where does Ethernet come in? The only thing I can tell is that I have a DSL modem with an Ethernet port on the back of it, and a cable runs from there to a port on the back of my computer case. So, can Ethernet be used to connect to the internet? Is it a subtype of DSL? I'm really confused about this, so I would appreciate it if someone could explain it to me. Also, I apologize if this isn't the right place to ask.

    Read the article

  • Apache stops serving requests when connections increase

    - by Gunjan
    The values of the MaxClients, ServerLimit, etc. parameters are quite high (4000). Available RAM on the server is high too (~8GB). Load average remains below 1 on a 24-core CPU. But when the number of visitors on the website increases, Apache just stops serving requests. The Apache error log is blank and the access log shows no more requests coming in. Restarting Apache makes it work again until the number of requests increases again. Any ideas where to start looking? UPDATE: running with LogLevel debug, the Apache error log shows messages like this: [info] server seems busy, (you may need to increase StartServers, or Min/MaxSpareServers), spawning 32 children, there are 479 idle, and 1027 total children

    Read the article

  • Good books about server architecture?

    - by ajsie
    When the traffic to a website grows, I don't think one Apache server in a VPS is the way to go. I would like to know more about how I should then set up the server-side architecture. I'm not that much into the hardware side (what kind of cables to use, different CPU architectures, etc.), but I am interested in the software architecture: which servers (Apache, nginx, Squid, Varnish, etc.) to use and how they interact with each other; whether to run one server per machine; how many MySQL servers; how many Apache or nginx servers, and so on; and what the overall fleet of machines (the "machine court") looks like. Are there any good books about this area?

    Read the article

  • Games and movies become too slow after some time

    - by ishaq
    I was fixing my gaming desktop's power supply when something seemed to burn (I got that stinging burning smell). However, when I turned the computer on (it was off before), everything seemed to be fine. Since then, though, I have noticed that it becomes too slow when I play a game or a movie. The computer becomes painfully slow about 5 minutes into the movie/game (it works fine otherwise, e.g. for browsing); it gets so slow I can see individual frames of the movie/game. What could be the problem? A fried video card? Fried memory (RAM)? Something else? My system configuration: Intel Core i7 CPU 3.40GHz, 8GB DDR3 RAM, 2TB WDC HDD, NVIDIA GeForce GT 220 (1GB), Thermaltake Commander MS-I chassis (http://www.thermaltakeusa.com/Product.aspx?S=1394&ID=2051), Intel DH67CL AAG10212-208 mainboard, Realtek High Definition Audio.

    Read the article

  • How do I know if my disks are being hit with too much IO reads or writes or both?

    - by Mark F
    Hi all. I know a bit about disk I/O and the bottlenecks related to it, especially where databases are concerned. But how do I really know what the maximum I/O numbers for my disks will be? What metric might be available for working out, roughly (but as a good approximation), how much I/O capacity I have left? I've seen it before where things are bubbling along nicely and then all of a sudden everything screams to a halt, and it ends up being an I/O-bound problem. Is there a better way to predict when I/O is reaching its limits? This article was interesting but didn't give the answer I'm after: http://serverfault.com/questions/61510/linux-how-can-i-see-whats-waiting-for-disk-io. So is my best bet just looking at CPU I/O wait? Surely there must be a better way to track this? Best, M
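
    Not a full answer, but one number worth watching on Linux is the per-device "time spent doing I/O" counter in /proc/diskstats: sampled over an interval it gives the figure iostat reports as %util, i.e. the fraction of wall-clock time the disk had at least one request in flight. Treat it as a rough early-warning signal rather than a hard capacity limit (drives with command queuing can sit at 100% before throughput actually peaks). A minimal sketch in C, assuming Linux and using "sda" purely as an example device name:

      /* diskutil.c - sample the "time spent doing I/O" field of /proc/diskstats
       * twice and report device utilization, roughly iostat's %util column.
       * Linux-only sketch.  Build: gcc -O2 diskutil.c -o diskutil
       */
      #include <stdio.h>
      #include <string.h>
      #include <unistd.h>

      /* Return the "milliseconds spent doing I/O" counter for one device:
       * field 13 of a /proc/diskstats line (major minor name + 11 counters). */
      static long long io_ticks_ms(const char *dev)
      {
          FILE *f = fopen("/proc/diskstats", "r");
          if (!f) return -1;

          int major, minor, c;
          char name[64];
          long long v[11];
          long long result = -1;

          while (fscanf(f, "%d %d %63s %lld %lld %lld %lld %lld %lld %lld %lld %lld %lld %lld",
                        &major, &minor, name, &v[0], &v[1], &v[2], &v[3], &v[4],
                        &v[5], &v[6], &v[7], &v[8], &v[9], &v[10]) == 14) {
              if (strcmp(name, dev) == 0) { result = v[9]; break; }  /* ms doing I/O */
              while ((c = fgetc(f)) != '\n' && c != EOF) ;  /* skip extra fields on newer kernels */
          }
          fclose(f);
          return result;
      }

      int main(int argc, char **argv)
      {
          const char *dev = (argc > 1) ? argv[1] : "sda";   /* example device name */
          const int interval_s = 1;

          long long t1 = io_ticks_ms(dev);
          sleep(interval_s);
          long long t2 = io_ticks_ms(dev);
          if (t1 < 0 || t2 < 0) {
              fprintf(stderr, "could not read /proc/diskstats entry for %s\n", dev);
              return 1;
          }

          /* fraction of the interval the device was busy with at least one request */
          double util = 100.0 * (double)(t2 - t1) / (interval_s * 1000.0);
          printf("%s utilization over %ds: %.1f%%\n", dev, interval_s, util);
          return 0;
      }

    Run it (e.g. ./diskutil sdb) against whichever device backs the database volume while the load is on; utilization that keeps pinning near 100% is a sign the disk is close to its limit, and it is often easier to interpret than CPU I/O wait on a many-core box.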

    Read the article

  • What sort of things can cause a whole system to appear to hang for 100s-1000s of milliseconds?

    - by Ogapo
    I am working on a Windows game, and while rendering, some computers experience intermittent pauses ("hitches" for lack of a better term). When profiled, they appear in seemingly random places in the code. Eventually I noticed that it wasn't just my process that was affected but (seemingly) every process on the system. All of the threads in my application hitch at once, CPU utilization drops during these hitches, and it appears as if most processes make no progress. This leads me to believe it may be an operating system or driver issue, but it only occurs while playing the game (and only on some systems). What sort of operations might the operating system be doing that would require the kernel to pause all user threads and block? Some kind of I/O? At first I thought of paging, but my impression is that would only affect a single process, no? Some systems in use: Windows, DirectX (3D), NVIDIA cards (unknown whether it replicates on ATI), using overlapped I/O for streaming.
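
    One way to narrow this down, sketched below under the assumption of a plain C/Win32 build (the 10 ms slice and 100 ms threshold are arbitrary): run a tiny, independent watchdog process next to the game that does nothing but sleep and timestamp itself. If its wake-ups stall at exactly the moments the game hitches, the pause really is machine-wide (kernel, driver or hardware) rather than something inside the game process. This is a diagnostic sketch only, not a fix:

      /* hitch_watchdog.c - independent stall detector.  Sleeps in 10 ms slices
       * and logs whenever the wall-clock gap between wake-ups is far longer
       * than requested.  Stop it with Ctrl+C.  Build: cl hitch_watchdog.c
       */
      #include <windows.h>
      #include <stdio.h>

      int main(void)
      {
          const DWORD slice_ms = 10;       /* requested sleep per iteration */
          const double report_ms = 100.0;  /* well above the ~15 ms default timer
                                              granularity, to avoid false positives */
          LARGE_INTEGER freq, prev, now;

          QueryPerformanceFrequency(&freq);
          QueryPerformanceCounter(&prev);

          for (;;) {
              Sleep(slice_ms);
              QueryPerformanceCounter(&now);

              double elapsed_ms = 1000.0 * (double)(now.QuadPart - prev.QuadPart)
                                         / (double)freq.QuadPart;
              if (elapsed_ms > report_ms) {
                  /* a 10 ms sleep took far longer: this process (and possibly the
                     whole machine) was held off the CPU */
                  printf("hitch: asked for %lu ms, woke after %.0f ms\n",
                         (unsigned long)slice_ms, elapsed_ms);
                  fflush(stdout);
              }
              prev = now;
          }
      }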

    Read the article

  • Hardware changes that require XP activation? (for a virtual machine)

    - by NVRAM
    I have Windows XP x64 running in a VM and, for testing and performance reasons, I would like to occasionally change the resources allocated to it. Changes might include the number of CPU cores, the amount of RAM, and adding/removing network adapters. But I'm concerned that XP will demand re-activation and that I might eventually run into licensing issues if I do this. So, can anyone tell me: what kinds of changes trigger re-activation in XP? Are there limits or caveats with regard to re-activation? I've perused this question and the article it references, but wanted more recent and verified info. (FWIW, I'm not trying to cheat: the OS copy was purchased explicitly for the VM.)

    Read the article

  • ESXi 4.0: slow response when opening anything across the network on a virtual server running Windows 2008

    - by user40944
    Hi, I recently installed an HP ML350 G6 server with Windows Small Business Server 2008 64-bit and Exchange 2007. The server ran fantastically; we transferred all user data onto the new server and had no problems for two weeks. We then installed SQL Server 2008 and transferred the accounts package onto the server, and this is where the problems started. Users are now complaining that opening a work document can take 2 minutes, and the same goes for anything else. The host seems fine and the virtual server seems fine: no disk performance problems (it doesn't go above 50% unless I really copy a lot of things), no memory pressure (12GB, only 7GB used), low CPU usage (about 15% on average), both on the VM and in Windows Task Manager. I have made sure disk caching is enabled on the RAID controller (which made no difference). The network cards are running at 1Gb and plugged into an HP gigabit switch. Please help!

    Read the article

  • Dell Inspiron 530 - SSD Worth it?

    - by DrFredEdison
    I'm going to be upgrading my Dell Inspiron 530 (2.0GHz Intel dual-core CPU, 3GB RAM) to Windows 7 soon, and rather than back up and reformat my existing drive, I'm planning on getting a second drive to replace my current primary and moving the old one to secondary. This seems like an excellent time to get a solid-state drive, if it's going to be worth it. As far as I can tell this machine has a SATA I controller, and I'm unsure whether I'll see a noticeable performance increase from an SSD without going to SATA II. So I have a three-part question: Will spending the money on an SSD be worth it if I hook it up to a SATA I controller? Is it reasonable to upgrade the controller on this machine to a SATA II controller? Given that this PC is kind of old to begin with, am I better off, performance-wise, just sticking with a faster HDD?

    Read the article

  • Which motherboard for Intel i7 and how to get RAID working?

    - by jasondavis
    I want to build a new PC and have a couple of questions. 1) I want to go with the Intel Core i7 920 processor; can anyone recommend a good, reliable motherboard for it? Graphics card support does not matter (SLI/CrossFire). I would like it to support a lot of RAM, so the more RAM slots the better. I have read so many bad reviews about certain boards not working well that I would love a recommendation from experience. 2) I want to run a couple of SSDs in RAID 0, which I have never done. Will I need to purchase anything in addition to the motherboard, CPU and drives to get RAID working?

    Read the article

  • Why can't I open programs after watching YouTube videos for a while?

    - by manjivsanotsu
    I recently built a new PC, and it worked fine for a while (1-2 months with no problems whatsoever). However, in recent weeks I have noticed that after I watch some YouTube videos and close everything, I can no longer do anything except move the mouse and open the Start Menu. If I click on any of the programs in the Start Menu or type a program name into the Run box, nothing opens. I can't open Task Manager or Windows Explorer, or even shut down the PC. I don't have anything else running while I'm watching videos except ZoneAlarm and Avast. The only workaround when this happens is a forced shutdown (holding the power button), then a restart if I want to do anything more. But this happens a lot, about 4-5 times a week, so I'm worried it will fry my hardware if I keep doing this. OS: Windows 7. Other installed software: OpenOffice, the Tropico 4 game, Adobe Photoshop. Browser used: Google Chrome. Hardware: CPU: i7 2600K; RAM: 16GB; Motherboard: Asus P8Z68-V GEN3; Hard drive: 120GB Corsair Force GT SSD; Graphics: 2047MB GeForce GTX 560 Ti.

    Read the article

  • Sharing RAM resources between 2 or more computers

    - by davee44
    I know there was a somewhat similar question before (How to share CPU or RAM?), but let me be a little more specific. When Microsoft Windows requires more RAM than is available, it uses a swap file to temporarily store data there; this is effectively hard-drive-backed RAM, and the technique has been used for many years. Theoretically, it shouldn't be too hard to implement a similar technology that uses the RAM of other computer(s) on the network for temporary data storage. This just requires software running on the networked computers that accepts data from, and returns it to, the main computer and keeps that data in RAM, plus an operating system on the main computer that can use computers on the network instead of (or in addition to) the swap file. I wonder, are there any implementations of this idea? It would allow users to build RAM clusters out of all of their home or office computers and boost the performance of a single computer for development, gaming, video tasks, etc.

    Read the article

  • Dual 9500GT video cards, one works

    - by bossDub
    I have two video cards, both PCIe 512MB NVIDIA 9500 GTs. One is an EVGA and the other is another brand. If I put them both in, the one in the top slot works (either card). The one in the lower slot reports Code 12, "not enough resources to use this device". I've tried disabling FireWire and a few other things in an attempt to free up more resources. Intel DP45SG motherboard, 4GB RAM, 64-bit Vista, Q8200 CPU. I do not have the SLI connector.

    Read the article

  • Why does reusing arrays increase performance so significantly in C#?

    - by Willem
    In my code I perform a large number of tasks, each requiring a large array of memory to temporarily store data. I have about 500 tasks. At the beginning of each task, I allocate memory for an array: double[] tempDoubleArray = new double[M]; M is a large number that depends on the particular task, typically around 2,000,000. Now, I do some complex calculations to fill the array, and at the end I use the array to determine the result of the task. After that, tempDoubleArray goes out of scope. Profiling reveals that the calls to construct the arrays are time-consuming. So I decided to try to reuse the array by making it static and reusing it: double[] tempDoubleArray = staticDoubleArray; It requires some additional juggling to figure out the minimum size the array needs (an extra pass through all tasks), but it works. Now the program is much faster (from 80 seconds to 22 seconds for executing all tasks). However, I'm a bit in the dark about why exactly this works so well. I'd say that in the original code, when tempDoubleArray goes out of scope, it can be collected, so allocating a new array should not be that expensive, right? I'm asking because understanding why it works might help me find other ways to achieve the same effect, and because I would like to know in which cases allocation causes performance problems.
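
    As an aside on why the allocation is expensive: in .NET a 2,000,000-element double[] is about 16MB, so every new double[M] lands on the large object heap and must be zero-initialized before use, and the dead arrays later have to be collected; reusing one buffer skips all of that. The same shape of cost shows up outside the CLR too. Below is a rough analogy in C (not the questioner's code; the task body, sizes and timings are purely illustrative, and how large the gap is depends on the allocator and OS), comparing a fresh zeroed allocation per task with one reused buffer:

      /* alloc_vs_reuse.c - fresh zeroed buffer per task vs. one reused buffer.
       * Build: gcc -O2 alloc_vs_reuse.c -o alloc_vs_reuse
       */
      #include <stdio.h>
      #include <stdlib.h>
      #include <time.h>

      #define TASKS 500
      #define M     2000000                /* doubles per task, ~16MB */

      static double run_task(double *buf)
      {
          /* stand-in for the real per-task work: write and read every element */
          double sum = 0.0;
          for (size_t i = 0; i < M; i++) { buf[i] = (double)i; sum += buf[i]; }
          return sum;
      }

      static double now_sec(void)
      {
          struct timespec ts;
          clock_gettime(CLOCK_MONOTONIC, &ts);
          return ts.tv_sec + ts.tv_nsec / 1e9;
      }

      int main(void)
      {
          double checksum = 0.0;

          /* variant 1: a fresh zeroed allocation per task, like `new double[M]` */
          double t0 = now_sec();
          for (int t = 0; t < TASKS; t++) {
              double *buf = calloc(M, sizeof(double));
              if (!buf) { perror("calloc"); return 1; }
              checksum += run_task(buf);
              free(buf);
          }
          double t1 = now_sec();

          /* variant 2: one buffer allocated once and reused, like the static array */
          double *shared = calloc(M, sizeof(double));
          if (!shared) { perror("calloc"); return 1; }
          for (int t = 0; t < TASKS; t++)
              checksum += run_task(shared);
          double t2 = now_sec();
          free(shared);

          printf("fresh allocation per task: %.2f s\n", t1 - t0);
          printf("one reused buffer:         %.2f s\n", t2 - t1);
          printf("(checksum %.0f)\n", checksum);   /* keep the work from being optimized away */
          return 0;
      }

    In the fresh-allocation variant the pages behind each 16MB buffer have to be mapped and zeroed again for every task, which is the rough analogue of the CLR zero-initializing a newly allocated array; the reused buffer pays that cost only once.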

    Read the article

  • How to get rid of disturbance on LCD monitor?

    - by Uday Kanth
    I have an Acer G195HQL LCD monitor, and recently I've been noticing a lot of interference on the screen. It looks like flickering horizontal white lines, which are more apparent on dark backgrounds like grey or black. The curious thing is that the intensity of these lines increases and decreases with no particular pattern. The interference temporarily goes away when I detach and re-attach the VGA cable at the computer end. My speakers are magnetically shielded, but the problem persists even when I turn them off. I don't know what to do, and this is really annoying. Is it possible that my monitor is failing? Or is there anything else I should check?

    Read the article
