Search Results

Search found 7107 results on 285 pages for 'processing efficiency'.


  • Simple Ubuntu Server - Expanding disk space - adding a new drive LVM, RAID0 existing setup - how?

    - by NightWolf
    I have a 1TB ext4 partition mounted at / with all my data and Ubuntu 11.04 (Natty) installed. Now this drive is almost full (I used it as a database server for some processing). RAID0 is OK; I can take a failure (touch wood). But I need a way to grow this partition. I have a new 1TB drive I want to add; however, as my Ubuntu boot and all my data are on the one partition, I'm not sure how I can go about setting up a RAID0 or LVM array without losing all my data. So the question is: how can I extend my existing ext4 partition over two physical drives without losing data? Thanks!
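    For reference, once the root filesystem sits on an LVM logical volume, growing it onto a second drive is only a few commands. This is just a sketch of the general workflow (device and volume names are placeholders, and it assumes the existing partition has already been migrated onto LVM, which is the hard part in the setup described above):

        # assumes /dev/sdb is the new, empty 1TB drive and the root LV is vg0/root
        sudo pvcreate /dev/sdb                       # mark the new drive as an LVM physical volume
        sudo vgextend vg0 /dev/sdb                   # add it to the existing volume group
        sudo lvextend -l +100%FREE /dev/vg0/root     # grow the logical volume over the new space
        sudo resize2fs /dev/vg0/root                 # grow the ext4 filesystem online to fill the LV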

    Read the article

  • How do I analyze an Apache Bench result?

    - by Alan Hoffmeister
    I need some help with analyzing a log from Apache Bench:

        Benchmarking texteli.com (be patient)
        Completed 100 requests
        Completed 200 requests
        Completed 300 requests
        Completed 400 requests
        Completed 500 requests
        Completed 600 requests
        Completed 700 requests
        Completed 800 requests
        Completed 900 requests
        Completed 1000 requests
        Finished 1000 requests

        Server Software:
        Server Hostname:        texteli.com
        Server Port:            80
        Document Path:          /4f84b59c557eb79321000dfa
        Document Length:        13400 bytes
        Concurrency Level:      200
        Time taken for tests:   37.030 seconds
        Complete requests:      1000
        Failed requests:        0
        Write errors:           0
        Total transferred:      13524000 bytes
        HTML transferred:       13400000 bytes
        Requests per second:    27.01 [#/sec] (mean)
        Time per request:       7406.024 [ms] (mean)
        Time per request:       37.030 [ms] (mean, across all concurrent requests)
        Transfer rate:          356.66 [Kbytes/sec] received

        Connection Times (ms)
                      min  mean[+/-sd] median   max
        Connect:       27   37   19.5     34    319
        Processing:    80 6273 1673.7   6907   8987
        Waiting:       47 3436 2085.2   3345   8856
        Total:        115 6310 1675.8   6940   9022

        Percentage of the requests served within a certain time (ms)
          50%   6940
          66%   6968
          75%   6988
          80%   7007
          90%   7025
          95%   7078
          98%   8410
          99%   8876
         100%   9022 (longest request)

    What can these results tell me? Isn't 27 rps too slow?
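    A quick sanity check on those numbers (my own reading, not part of the original post): with 200 concurrent connections and a mean time per request of about 7.4 s, throughput works out to roughly 200 / 7.406 ≈ 27 requests per second, which matches the reported 27.01 [#/sec]. Almost all of that time sits in the Processing phase (mean 6.3 s of the 6.3 s total), so the 27 rps figure reflects slow per-request processing on the server rather than a limit imposed by ab itself.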

    Read the article

  • Improve heavy work in a loop in multithreading

    - by xjaphx
    I have a little problem with my data processing.

        public void ParseDetails()
        {
            for (int i = 0; i < mListAppInfo.Count; ++i)
            {
                ParseOneDetail(i);
            }
        }

    For 300 records, it usually takes around 13-15 minutes. I've tried to improve it by using Parallel.For(), but it always stops at some point.

        public void ParseDetails()
        {
            Parallel.For(0, mListAppInfo.Count, i => ParseOneDetail(i));
        }

    In the method ParseOneDetail(int index), I write an output log line to track the record id that is being processed. It always hangs at some point, and I don't know why:

        ParseOneDetail(): 89 ...
        ParseOneDetail(): 90 ...
        ParseOneDetail(): 243 ...
        ParseOneDetail(): 92 ...
        ParseOneDetail(): 244 ...
        ParseOneDetail(): 93 ...
        ParseOneDetail(): 245 ...
        ParseOneDetail(): 247 ...
        ParseOneDetail(): 94 ...
        ParseOneDetail(): 248 ...
        ParseOneDetail(): 95 ...
        ParseOneDetail(): 99 ...
        ParseOneDetail(): 249 ...
        ParseOneDetail(): 100 ...
        _ <hangs at this point>

    I'd appreciate your help and suggestions to improve this. Thank you!

    Edit 1: the method in question:

        private void ParseOneDetail(int index)
        {
            Console.WriteLine("ParseOneDetail(): " + index + " ... ");
            ApplicationInfo appInfo = mListAppInfo[index];
            var htmlWeb = new HtmlWeb();
            var document = htmlWeb.Load(appInfo.AppAnnieURL);
            // get first one only
            HtmlNode nodeStoreURL = document.DocumentNode.SelectSingleNode(Constants.XPATH_FIRST);
            appInfo.StoreURL = nodeStoreURL.Attributes[Constants.HREF].Value;
        }

    Edit 2: This is the error output after running for a while, as Enigmativity suggested:

        ParseOneDetail(): 234 ...
        ParseOneDetail(): 87 ...
        ParseOneDetail(): 235 ...
        ParseOneDetail(): 236 ...
        ParseOneDetail(): 88 ...
        ParseOneDetail(): 238 ...
        ParseOneDetail(): 89 ...
        ParseOneDetail(): 90 ...
        ParseOneDetail(): 239 ...
        ParseOneDetail(): 92 ...

        Unhandled Exception: System.AggregateException: One or more errors occurred.
        ---> System.Net.WebException: The operation has timed out
           at System.Net.HttpWebRequest.GetResponse()
           at HtmlAgilityPack.HtmlWeb.Get(Uri uri, String method, String path, HtmlDocument doc, IWebProxy proxy, ICredentials creds) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1355
           at HtmlAgilityPack.HtmlWeb.LoadUrl(Uri uri, String method, WebProxy proxy, NetworkCredential creds) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1479
           at HtmlAgilityPack.HtmlWeb.Load(String url, String method) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1103
           at HtmlAgilityPack.HtmlWeb.Load(String url) in D:\Source\htmlagilitypack.new\Trunk\HtmlAgilityPack\HtmlWeb.cs:line 1061
           at SimpleChartParser.AppAnnieParser.ParseOneDetail(ApplicationInfo appInfo) in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\AppAnnieParser.cs:line 90
           at SimpleChartParser.AppAnnieParser.<ParseDetails>b__0(ApplicationInfo ai) in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\AppAnnieParser.cs:line 80
           at System.Threading.Tasks.Parallel.<>c__DisplayClass21`2.<ForEachWorker>b__17(Int32 i)
           at System.Threading.Tasks.Parallel.<>c__DisplayClassf`1.<ForWorker>b__c()
           at System.Threading.Tasks.Task.InnerInvoke()
           at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
           at System.Threading.Tasks.Task.<>c__DisplayClass7.<ExecuteSelfReplicating>b__6(Object )
           --- End of inner exception stack trace ---
           at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
           at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
           at System.Threading.Tasks.Parallel.ForWorker[TLocal](Int32 fromInclusive, Int32 toExclusive, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Func`4 bodyWithLocal, Func`1 localInit, Action`1 localFinally)
           at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](TSource[] array, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
           at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IEnumerable`1 source, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
           at System.Threading.Tasks.Parallel.ForEach[TSource](IEnumerable`1 source, Action`1 body)
           at SimpleChartParser.AppAnnieParser.ParseDetails() in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\AppAnnieParser.cs:line 80
           at SimpleChartParser.Program.Main(String[] args) in c:\users\nhn60\documents\visual studio 2010\Projects\FunToolPack\SimpleChartParser\Program.cs:line 15
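    One likely culprit (my guess, not something confirmed in the original post) is that the unthrottled Parallel.For opens too many simultaneous HtmlWeb requests to the same host, so connections queue up, eventually time out, and the first unhandled WebException tears the whole loop down. A minimal sketch of capping the parallelism and surviving individual timeouts, assuming the same mListAppInfo and ParseOneDetail members as above:

        // Sketch only: caps concurrent downloads and logs failures instead of
        // letting one timeout abort the whole Parallel.For.
        public void ParseDetails()
        {
            var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
            Parallel.For(0, mListAppInfo.Count, options, i =>
            {
                try
                {
                    ParseOneDetail(i);
                }
                catch (System.Net.WebException ex)
                {
                    Console.WriteLine("ParseOneDetail(): " + i + " failed: " + ex.Message);
                }
            });
        }

    Raising System.Net.ServicePointManager.DefaultConnectionLimit may also matter here (it defaults to two outgoing connections per host for non-ASP.NET applications), since every ParseOneDetail call hits the same site.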

    Read the article

  • Supporting Piping (A Useful Hello World)

    - by blastthisinferno
    I am trying to write a collection of simple C++ programs that follow the basic Unix philosophy: make each program do one thing well, and expect the output of every program to become the input to another, as yet unknown, program. I'm having an issue getting the output of one program to be the input of another, and getting the output of one to be the input of a separate instance of itself. Very briefly, I have a program add which takes arguments and spits out the summation. I want to be able to pipe the output to another add instance:

        ./add 1 2 | ./add 3 4

    That should yield 6 but currently yields 10. I've encountered two problems. First, cin waits for user input from the console. I don't want this, and I haven't been able to find a simple example showing the use of the standard input stream without querying the user in the console. If someone knows of an example, please let me know. Second, I can't figure out how to use standard input while supporting piping; currently, it appears it does not work. If I issue the command ./add 1 2 | ./add 3 4 it results in 7. The relevant code is below.

    add.cpp snippet:

        // ... COMMAND LINE PROCESSING ...
        std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing
        if (numbers.size() > 0)
        {
            double sum = numbers[0];
            double arg;
            for (int i = 1; i < numbers.size(); i++)
            {
                arg = numbers[i];
                sum += arg;
            }
            std::cout << sum << std::endl;
        }
        else
        {
            double input;
            // right now this is test code while I try to get standard input streaming working as expected
            while (std::cin)
            {
                std::cin >> input;
                std::cout << input << std::endl;
            }
        }
        // ... MORE IRRELEVANT CODE ...

    So, I guess my question is: does anyone see what is incorrect with this code as far as supporting piped standard input goes? Are there some well-known (or hidden) resources that explain clearly how to implement an example application supporting the basic Unix philosophy?

    @Chris Lutz: I've changed the code to what's below. The problem remains that cin still waits for user input on the console and doesn't just take the standard input passed from the pipe. Am I missing something trivial for handling this? I haven't tried Greg Hewgill's answer yet, but I don't see how that would help, since the issue is still with cin.

        // ... COMMAND LINE PROCESSING ...
        std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing
        double sum = numbers[0];
        double arg;
        for (int i = 1; i < numbers.size(); i++)
        {
            arg = numbers[i];
            sum += arg;
        }
        // right now this is test code while I try to get standard input streaming working as expected
        while (std::cin)
        {
            std::cin >> arg;
            std::cout << arg << std::endl;
        }
        std::cout << sum << std::endl;
        // ... MORE IRRELEVANT CODE ...
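    For what it's worth, a common way to get both behaviours (arguments summed, plus anything arriving on a pipe summed in, without blocking when the program is run interactively) is to read stdin only when it is not a terminal. The following is just a sketch of that idea under those assumptions, using POSIX isatty and plain argv parsing rather than the TCLAP code in the original:

        #include <cstdlib>     // std::atof
        #include <iostream>
        #include <unistd.h>    // isatty, STDIN_FILENO (POSIX)

        int main(int argc, char* argv[])
        {
            double sum = 0.0;

            // Sum the command-line arguments: ./add 3 4
            for (int i = 1; i < argc; ++i)
                sum += std::atof(argv[i]);

            // Only read stdin when it is a pipe or file, so a bare "./add 3 4"
            // does not sit waiting for keyboard input.
            if (!isatty(STDIN_FILENO))
            {
                double value;
                while (std::cin >> value)
                    sum += value;
            }

            std::cout << sum << std::endl;
            return 0;
        }

    With this sketch, ./add 1 2 | ./add 3 4 prints 10 (the piped 3 plus the arguments 3 and 4); whether 10 or 6 is the "right" answer depends on the semantics you want piped input to have.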

    Read the article

  • Is it possible to upgrade PHP to 5.3 on a Centos Kloxo installation?

    - by Malachi Soord
    I have a VPS running CentOS with Kloxo on it, and I was wondering how I would upgrade PHP to 5.3 - it's currently running 5.2.6. When I try to do a yum update I get the following errors:

        Resolving Dependencies
        --> Running transaction check
        --> Processing Dependency: libpq.so.4 for package: lxphp
        ---> Package postgresql-libs.i386 0:8.3.7-umask.7 set to be updated
        --> Finished Dependency Resolution
        lxphp-5.2.1-400.i386 from installed has depsolving problems
          --> Missing Dependency: libpq.so.4 is needed by package lxphp-5.2.1-400.i386 (installed)
        Error: Missing Dependency: libpq.so.4 is needed by package lxphp-5.2.1-400.i386 (installed)
         You could try using --skip-broken to work around the problem
         You could try running: package-cleanup --problems
                                package-cleanup --dupes
                                rpm -Va --nofiles --nodigest
        The program package-cleanup is found in the yum-utils package.

    Any help would be greatly appreciated.

    Read the article

  • How do I read multiple lines from STDIN into a variable?

    - by The Wicked Flea
    I've been Googling this question to no avail. I'm automating a build process here at work, and all I'm trying to do is get version numbers and a tiny description of the build, which may span multiple lines. The system this runs on is OS X 10.6.8. I've seen everything from using cat to processing each line as necessary, and I can't figure out what I should use and why.

    Attempts:

        read -d '' versionNotes

    Results in garbled input if the user has to use the backspace key. Also, there's no good way to terminate the input, since ^D doesn't terminate it and ^C just exits the process.

        read -d 'END' versionNotes

    Works... but still garbles the input if the backspace key is needed.

        while read versionNotes
        do
            echo " $versionNotes" >> "source/application.yml"
        done

    Doesn't properly end the input (because I haven't yet worked out how to match against an empty string).
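    A sketch of one approach (untested against the original build script, and the END sentinel is just an illustrative choice): use read -e so readline handles the backspace key cleanly, accumulate lines into the variable, and stop on a sentinel line or end-of-input.

        versionNotes=""
        echo "Enter version notes; finish with a line containing only END (or press ^D):"
        while IFS= read -e -r line; do
            [ "$line" = "END" ] && break      # sentinel line ends the input
            versionNotes+="$line"$'\n'        # keep the newline so multi-line notes survive
        done
        printf '%s' "$versionNotes" >> "source/application.yml"

    If the notes are piped in from another command rather than typed, versionNotes=$(cat) is simpler still, since there is no interactive editing to worry about.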

    Read the article

  • Why did you start with Linux? And why did you continue using it?

    - by Stefano Borini
    I'd like to know the reasons that moved you towards Linux. Personally, I started because we had to use a Digital machine for the Fortran 77 exercises during my first year at university. Linux was installed on many university computers, and I got interested in it. I had always liked to code (on the C64) in BASIC and assembler, but I knew nothing about other languages. I soon discovered a chat engine called NUTS, and the idea of becoming proficient in C appealed to me, so I started hacking on the code. To do so, I needed a Unix at home, so I bought Slackware 3.4 and installed it on my Pentium 166. I then continued using it for many years, the reason being that I enjoyed learning new things and the openness of information about the internals. It was a great learning platform. I eventually moved to OS X because I enjoy the power of Unix with the beauty and efficiency of its interface. I am interested in your answers because I believe the panorama has changed somewhat. While I still expect to find many "hackers" interested in Linux for the sake of knowledge, I also believe there are other reasons (work, friends, buying a netbook).

    Read the article

  • <authentication mode="Windows"/>

    - by kareemsaad
    I get this error when I browse a new web site that is a sub-site of a sub-domain, http://sharp.elarabygroup.com/ha/deault.aspx (ha is my new web site):

        Configuration Error
        Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
        Parser Error Message: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS.
        Source Error:
        Line 61: ASP.NET to identify an incoming user.
        Line 62: --
        Line 63:
        Line 64: section enables configuration
        Line 65: of what to do if/when an unhandled error occurs

    I note that other sub-sites of sub-domains that don't have their own web.config work fine. Is this related to web.config - do all the subs and the domain share one web.config?

    Read the article

  • VPS hangs when one virtual CPU is at 100% usage

    - by garconcn
    We are using XenCenter to manage all of our cPanel VPS servers. Each host has two CPUs (Intel(R) Xeon(R) CPU [email protected]) and 32GB of memory, and carries 4 cPanel VPSes, each with 8GB of memory and 4 virtual CPUs. Every one or two months, one of the VPS servers will hang because one virtual CPU sits at 100% usage and won't release the CPU unless we force a reboot. We have 10 similar hosts, and this causes a server to go down almost every day. We have tried to avoid running Statistics Processing and the Fantastico update during the night, but the problem still happens randomly. I cannot find anything in the server log when it hangs. Any clue? Thank you.

    Read the article

  • Suggestions for Backup solution

    - by jiewmeng
    I am deciding between Windows Home Server, a simple NAS, or extra HDDs in my desktop. By the way, I will be the main user. I am looking to fulfil the following needs:

    reliability (I am thinking RAID 1 or 5)
    not so prone to virus/malware infections (will using a separate NAS or home server help? or is Windows Home Server still just a Windows PC, only separated by the network?)
    power efficiency (e.g. spin down when not in use)
    downloads (e.g. I may want to download big files/torrents overnight, and I may not want to use a full-powered PC for it - does a full PC vs. a NAS differ enough in power usage to justify the cost of a new system, especially since I am the only user?)
    performance (I'd like to write and access my files fast; on second thought, maybe for backup I can forgo this - maybe a WD Green HDD? but how much slower will it be? plus, since I am the only user, the whole HDD will be mine?)

    Read the article

  • High-performance Academic Server [closed]

    - by PHPsmith
    Suppose I want to build a server for the university's academic interests. The server is dedicated to a single site, where users (students and lecturers) just view and fill in academic data. But at certain times (e.g. once a semester), about 12,000 students will access the site simultaneously. Due to limited resources, I have to build the server using free software (except for the operating system, Windows 7, which the university has already prepared). The hardware is also limited to the usual 4-core computers (e.g. an Ivy Bridge Intel Core i7-3770) with approximately 16GB of memory (DDR3 1600 MHz), equipped with an RJ-45 port (Intel 82579 Gigabit Ethernet). With all these limitations, I have to choose software (web server, database, etc.) appropriate for achieving this goal. I decided to create the site in PHP. Please help me by answering the following questions based on your expertise (my prime candidate after Googling is in parentheses):

    Which web server is fastest, most stable and most secure when implemented and optimized for PHP, and why? (nginx)
    Which PHP accelerator is fastest, most stable and most compatible with the selected web server, and why? (APC with Zend Optimizer+)
    Which database is fastest, most stable and most secure when implemented and optimized for the selected web server and PHP accelerator? (MySQL)
    Are there any mistakes, present or future, in my setup? If so, please enlighten me.
    Is there anything else I need to know in order to achieve this goal? If so, please enlighten me.

    I understand that performance also depends on how the source code is written, so assume the site will be built as efficiently as possible (e.g. using AJAX).

    Read the article

  • Why is only one Excel spreadsheet crippled, but others are fine?

    - by Dallas
    I have an inherited spreadsheet that I really don't want to rebuild at the moment. It's a simple, small workbook (< 200 rows, with columns that don't even reach AA) and does nothing more than calculate some totals within the same worksheets. No macros, no external data sources, nothing beyond basic formatting of dates, numbers and strings. I see that importing data from CSV/text has created many, many workbook connections over time, but even if I delete them all (there were hundreds) it makes no difference in performance. Even clicking to simply change focus from cell to cell takes 10+ seconds, adorned by the spinning cursor, "(Not Responding)" appended to the title bar, and the application locking up. The program seems to "recover" every time, but editing this file is obviously seriously handicapped. All other files seem fine in Excel, and other programs have no apparent performance issues. I see Excel is chewing up CPU, but I'm not sure how to narrow down what process or service is "clashing" with it. I tried the same file on other computers and performance is fine. If I turn off all start-up services and run only Excel, performance is restored... until I start using other programs, and then it bogs down again. At this point, I would entertain almost any idea, theory or suggestion that helps pinpoint, solve or work around the issue.

    Read the article

  • Which GUI toolkit was used for TuneUp Utilities and Avast?

    - by vettipayyan
    There are arguments both for and against the use of TuneUp Utilities. However, its user interface is really good, and I can't find out what GUI toolkit they've used. I wonder the same about Avast. I'm using PyQt for my image-processing app and am trying to improve its look and feel, which is essential for it, but I can't find these kinds of custom styles anywhere. If you have suggestions, please tell me - it would be helpful.

    Read the article

  • How to run a process and completely detach it of its parent shell

    - by Bicou
    I'm running a program on a Linux server that will take days to complete. I'm launching it from my workstation via an SSH terminal, as this program is command-line only. I want to be able to do all of these: launch the program, redirect its standard outputs to files, and exit my SSH session without terminating the process. I thought about:

        $ ./MyProg.csh -params -foo -bar </dev/null 1>~/out.log 2>~/err.log &

    However, the process is terminated the moment I close my SSH session. My workstation is running Windows XP, and I cannot guarantee its uptime over the several days required for processing my data on the Linux server. As you may have noted, my program has to be launched from csh. Is it possible to do this? Thanks.
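    One common way to do this (a sketch, keeping the same placeholder program name and log paths as above) is to wrap the command in nohup, so the job ignores the hangup signal sent when the SSH session closes:

        # nohup keeps the job alive after logout; output is already redirected to files
        nohup ./MyProg.csh -params -foo -bar </dev/null 1>~/out.log 2>~/err.log &

    Alternatives with the same effect include starting the job inside screen or tmux and detaching, or, if the login shell is bash, backgrounding the job and running disown before logging out.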

    Read the article

  • Is there a convenient method to pull files from a server in an SSH session?

    - by tel
    I often SSH into a cluster node for work, and after processing I want to pull several results back to my local machine for analysis. Typically, to do this I use a local shell to scp from the server, but this requires a lot of path manipulation. I'd prefer to use a syntax like interactive FTP and just 'pull' files from the server to my local pwd. Another possible solution might be some way to automatically set up my client computer as an ssh alias, so that something like scp results home:~/results would work as expected. Is there any obscure SSH trick that'll do this for me?

    Working from grawity's answer, a complete solution in config files is something like this.

    Local .ssh/config:

        Host ex
            HostName ssh.example.com
            RemoteForward 10101 localhost:22

    ssh.example.com's .ssh/config:

        Host home
            HostName localhost
            Port 10101

    which lets me run commands exactly like scp results home:, transferring the file results to my home machine.

    Read the article

  • How to batch convert video files on OSX for AppleTV2 / iPhone4?

    - by Luke404
    I'd like to have a solution to batch convert video files to a format suitable for the AppleTV2, iPad2 and iPhone4, while at the same time preserving as much quality as possible; I want a single output file that will play on all of those devices and is also good for consumption by other Mac software (e.g. Aperture, iMovie, iTunes). Batch processing is a requirement, since I'm going to convert many, many files from different sources (mainly lots of videos captured by compact digital cameras, cell phones, and so on). I'm looking into ffmpeg and MEncoder (both installed via MacPorts), but I can't seem to find a suitable preset for libx264, even though everyone out there is talking about them. A different approach involving different software would be OK too, as long as I can script it somehow and run it on a whole directory full of files to be converted.
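    One possible scripted approach (a sketch only - it assumes HandBrakeCLI is installed, for example from the HandBrake site or MacPorts, and that its built-in "AppleTV 2" preset is acceptable for the target devices):

        # convert every video in the current directory to an AppleTV2/iPhone4-friendly .m4v
        for f in *.mp4 *.avi *.mov; do
            [ -e "$f" ] || continue                              # skip patterns that matched nothing
            HandBrakeCLI -i "$f" -o "${f%.*}.m4v" --preset="AppleTV 2"
        done

    The same loop structure works with any other encoder command, so it is easy to swap in an ffmpeg/libx264 invocation instead once a satisfactory set of flags is found.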

    Read the article

  • Combine several locations with regex in nginx

    - by AlexAtNet
    I have a dynamic number of Joomla installations in subfolders of the domain. For example:

        http://site/joomla_1/
        http://site/joomla_2/
        http://site/joomla_3/
        ...

    Currently I have the following config, which works:

        index index.php;

        location / {
            index index.php index.html index.htm;
        }

        location /joomla_1/ {
            try_files $uri $uri/ /joomla_1/index.php?q=$uri&$args;
        }

        location /joomla_2/ {
            try_files $uri $uri/ /joomla_2/index.php?q=$uri&$args;
        }

        location ~ \.php$ {
            fastcgi_pass unix:/var/run/php5-fpm/joomla.sock;
            ...
        }

    I'm trying to combine the joomla_N rules into one:

        location ~ ^/(joomla_[^/]+)/ {
            try_files $uri $uri/ /$1/index.php?q=$uri&$args;
        }

    but the server starts to return index.php as-is (it does not call php-fpm). It looks like nginx stops processing the regex rules after the first match. Is there any way to combine these rules with something like a regex?
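    For context (based on how nginx documents location matching, not on any answer from the original thread): regular-expression locations are checked in the order they appear in the configuration, and the first match wins. So once ^/(joomla_[^/]+)/ matches a request such as /joomla_1/index.php, the later \.php$ block is never reached and the file is served raw. A sketch of one way around that is simply to declare the PHP location before the combined Joomla rule:

        location ~ \.php$ {
            fastcgi_pass unix:/var/run/php5-fpm/joomla.sock;
            # ... remaining fastcgi_* settings as in the working config ...
        }

        location ~ ^/(joomla_[^/]+)/ {
            try_files $uri $uri/ /$1/index.php?q=$uri&$args;
        }

    With this ordering, a non-PHP URI still falls through to the Joomla rule, and the internal redirect to /joomla_N/index.php then hits the \.php$ block first on the second pass.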

    Read the article

  • App to slice'n'dice video, specifically remove chunks, on a Mac?

    - by Phillip Oldham
    I have a couple of collections of DVD Box-Sets I've ripped to my mac. Now I'd like to sweeten the viewing experience by removing the title sequences and credits so that viewing doesn't mean I have to keep reaching for the remote to skip 30 seconds of annoying music (think watching multiple episodes of Family Guy). If I can find an app that will let me do this reasonably quickly manually that would be great, but it would be perfect if I could dump a load of commands into a file and have everything trimmed while the mac is "inactive". I'm thinking that if I can specify chunks of time to remove from the original file that would be perfect. I had a quick look at importing into iMovie to do it manually and gave up at the "Processing Thumbnails" stage as it said it would be a couple of hours to produce them for a 45min mp4 file, which I can understand at 25fps but I'm not willing to wait, especially when I've got over a week's worth of files. Any suggestions?

    Read the article

  • What happens if a server never receives the RST packet?

    - by Rob
    Someone recently decided to show me a POC of a new denial-of-service method using SYN/TCP that he's figured out. I thought it was complete nonsense, but after I explained SYN - SYN/ACK - RST to him, he left me speechless. He asked: "What if the server you're tricking into sending the SYN/ACK packets can't receive the RST packet?" I have no idea. He claims that the server will keep retrying the SYN/ACK packets, and that the packet rate will continue to build up. Is there any truth to this? Can anyone elaborate? Apparently, the way it works is this:

    He spoofs the source IP of the SYN packet to be the target's IP.
    He then sends the SYN packet to a handful of random servers.
    They all reply with their SYN/ACK packets to the target IP, of course.
    The target responds with RST, as we know.
    BUT somehow he keeps the target from sending the RST, or keeps the random servers from processing it.

    With this, apparently the servers will continue trying to send the SYN/ACK packets, producing something of a "snowball" effect.

    Read the article

  • How can I get a notification from my server if the mail queue stops

    - by Ash
    I am using qmail with Plesk 10 on an Apache server. Occasionally the mail queue stops processing emails - this most recently happened when an email account got hacked and started sending hundreds of emails. We did not find out about this until a client of ours contacted us to say that their emails were not being received, so we checked the mail queue and, lo and behold, the service had stopped. In the future I would like to be notified when the mail queue stops. How can I set something up so the server will run a command whenever the mail queue stops?
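    A rough sketch of one approach (the script, paths, threshold and alert address are all illustrative assumptions, and the alert should go out through something other than the local qmail instance, since that is exactly what may be stuck): run a small check from cron every few minutes that looks at whether qmail-send is alive and whether the queue is piling up.

        #!/bin/sh
        # /root/check_mailqueue.sh - illustrative watchdog, run from cron every few minutes
        ALERT="admin@example.com"    # deliver via an external relay, not the local queue

        # alert if the qmail delivery daemon is not running at all
        if ! ps ax | grep -v grep | grep -q qmail-send; then
            echo "qmail-send is not running on $(hostname)" | mail -s "Mail queue stopped" "$ALERT"
            exit 1
        fi

        # alert if the queue has grown past an arbitrary threshold (here 500 messages)
        COUNT=$(/var/qmail/bin/qmail-qstat | awk 'NR==1 {print $NF}')
        if [ "$COUNT" -gt 500 ]; then
            echo "Mail queue on $(hostname) has $COUNT messages" | mail -s "Mail queue backlog" "$ALERT"
        fi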

    Read the article

  • MySQLTuner and query_cache_size dilemma

    - by wbad
    On a busy MySQL server, MySQLTuner 1.2.0 always recommends increasing query_cache_size, no matter how much I raise the value (I tried up to 512MB). On the other hand, it warns that:

        Increasing the query_cache size over 128M may reduce performance

    Here are the latest results:

        >>  MySQLTuner 1.2.0 - Major Hayden <[email protected]>
        >>  Bug reports, feature requests, and downloads at http://mysqltuner.com/
        >>  Run with '--help' for additional options and output filtering

        -------- General Statistics --------------------------------------------------
        [--] Skipped version check for MySQLTuner script
        [OK] Currently running supported MySQL version 5.5.25-1~dotdeb.0-log
        [OK] Operating on 64-bit architecture

        -------- Storage Engine Statistics -------------------------------------------
        [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
        [--] Data in InnoDB tables: 6G (Tables: 195)
        [--] Data in PERFORMANCE_SCHEMA tables: 0B (Tables: 17)
        [!!] Total fragmented tables: 51

        -------- Security Recommendations -------------------------------------------
        [OK] All database users have passwords assigned

        -------- Performance Metrics -------------------------------------------------
        [--] Up for: 1d 19h 17m 8s (254M q [1K qps], 5M conn, TX: 139B, RX: 32B)
        [--] Reads / Writes: 89% / 11%
        [--] Total buffers: 24.2G global + 92.2M per thread (1200 max threads)
        [!!] Maximum possible memory usage: 132.2G (139% of installed RAM)
        [OK] Slow queries: 0% (2K/254M)
        [OK] Highest usage of available connections: 32% (391/1200)
        [OK] Key buffer size / total MyISAM indexes: 128.0M/92.0K
        [OK] Key buffer hit rate: 100.0% (8B cached / 0 reads)
        [OK] Query cache efficiency: 79.9% (181M cached / 226M selects)
        [!!] Query cache prunes per day: 1033203
        [OK] Sorts requiring temporary tables: 0% (341 temp sorts / 4M sorts)
        [OK] Temporary tables created on disk: 14% (760K on disk / 5M total)
        [OK] Thread cache hit rate: 99% (676 created / 5M connections)
        [OK] Table cache hit rate: 22% (1K open / 8K opened)
        [OK] Open file limit used: 0% (49/13K)
        [OK] Table locks acquired immediately: 99% (64M immediate / 64M locks)
        [OK] InnoDB data size / buffer pool: 6.1G/19.5G

        -------- Recommendations -----------------------------------------------------
        General recommendations:
            Run OPTIMIZE TABLE to defragment tables for better performance
            Reduce your overall MySQL memory footprint for system stability
            Increasing the query_cache size over 128M may reduce performance
        Variables to adjust:
          *** MySQL's maximum memory usage is dangerously high ***
          *** Add RAM before increasing MySQL buffer variables ***
            query_cache_size (> 192M) [see warning above]

    The server has 76GB of RAM and dual E5-2650s. The load is usually below 2. I'd appreciate your hints on interpreting the recommendation and optimizing the database configs.
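    For what it's worth, the two findings pulling in opposite directions here are "Query cache prunes per day: 1033203" (the cache is constantly evicting entries, which is why the script keeps suggesting a larger size) and the generic warning that very large query caches hurt, because the cache is guarded by a single mutex and every write to a table invalidates all cached results for that table. A minimal my.cnf sketch of a middle ground (values are illustrative, not tuned for this server):

        # my.cnf [mysqld] section - illustrative values only
        query_cache_type  = 1      # cache SELECT results unless SQL_NO_CACHE is given
        query_cache_size  = 128M   # stay at or below the 128M threshold MySQLTuner warns about
        query_cache_limit = 2M     # keep a few huge result sets from crowding out everything else

    The other flagged item, maximum possible memory usage at 139% of RAM, comes from the 92.2M per-thread buffers multiplied by max_connections (1200) on top of the 24.2G of global buffers, and is arguably the more urgent thing to address.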

    Read the article

  • Why won't Windows use the other CPU cores?

    - by revloc02
    In Windows Task Manager, the Performance tab shows the first CPU maxed out and the other 7 just idling along with the occasional spike. What gives? More info: I've got 8GB of RAM and only 4.5GB is being used. The Processes tab gives no indication of any process hogging processing power; in fact, the System Idle Process is at 98-99. When I'm programming and have something like 8 to 12 applications going (several directly unrelated to programming, of course), my computer slows to a crawl. System info: Intel Core i7-2600K processor (quad-core with hyper-threading), 8GB RAM, Intel BOXDZ68BC LGA 1155 motherboard, 500GB HDD.

    Read the article

  • Odd SVN checkout failures occur frequently on VMware virtual machines

    - by snowballhg
    We've recently been experiencing seemingly random SVN checkout failures on our Hudson build system. Google searching has failed me; I'm hoping the Super User community can help me out :-) We occasionally receive the following SVN error when our Hudson build jobs check out source via the Hudson Subversion plug-in (which uses SVNKit):

        ERROR: Failed to check out http://server/svnroot/trunk
        org.tmatesoft.svn.core.SVNException: svn: Processing REPORT request response failed: XML document structures must start and end within the same entity. (/svnroot/!svn/vcc/default)
        svn: REPORT request failed on '/svnroot/!svn/vcc/default'

    The issue seems to occur only when checking out on our virtual machines (Windows XP, Fedora 9, Fedora 12) using Hudson's SVN plug-in; systems that use the traditional SVN client seem to work. SVN server version: 1.6.6. Hudson version: 1.377. Hudson SVN plugin version: 1.17. Has anyone dealt with this issue, or have any suggestions? Thanks

    Read the article

  • Outbound Traffic Logging on ASA 5520 possible?

    - by j2k4j
    Taking a look at the ASDM (6.4) for my ASA 5520, I get a nice summary of traffic status, with items like "interface traffic usage" and "connections per second". This works well, but it only shows data for the last 5-6 minutes or so. Recently, I've been asked whether it's possible to pull up this same type of traffic data for a particular time in the past (such as: find the traffic usage for a 3-minute period from date xx:xx:xx @ time xx:xx:xx). I've noticed that my ASA 5520 is logging the warnings, errors, etc. that it processes, but traffic data is not logged (yet), as far as I can tell from searching through the ASA. Is logging the traffic data amounts (as described above) actually possible? Is there any way to look up past traffic data and similar values? Thanks!

    Read the article

  • How to block null/blank user-agents in IIS 7.5

    - by Jeremy
    We are going through a large-scale DDoS attack, but it isn't the typical botnet that our Cisco Guard can handle; it is a BitTorrent attack. This is new to me, so I am unsure how to stop it. Here are the stats: IIS is processing between 40 and 100 requests per second from BitTorrent clients. About 20% of the requests carry a User-Agent, but the other 75% are blank. We want to block the blank user agents at the server level. What is the best approach?
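    One server-level option (a sketch, assuming the IIS URL Rewrite module is installed - it is a separate download, not part of a default IIS 7.5 install) is a web.config rewrite rule that aborts any request whose User-Agent is empty:

        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Block blank user agents" stopProcessing="true">
                <match url=".*" />
                <conditions>
                  <add input="{HTTP_USER_AGENT}" pattern="^$" />
                </conditions>
                <action type="AbortRequest" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>

    AbortRequest drops the request without returning a response body, which is generally what you want under an attack; IIS Request Filtering rules can achieve something similar if installing URL Rewrite is not an option.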

    Read the article
