Search Results

Search found 65558 results on 2623 pages for 'large data'.


  • PHP 5.3.2 + Fcgid 2.3.5 + Apache 2.2.14 + SuExec => Connection reset by peer: mod_fcgid: error reading data from FastCGI server

    - by Zigzag
    I'm trying to use PHP 5.3.2 + Fcgid 2.3.5 + Apache 2.2.14, but I always get the error "Connection reset by peer: mod_fcgid: error reading data from FastCGI server", and Apache returns a 500 error each time I try to execute a PHP page. I compiled Apache with these options:

        ./configure --with-mpm=worker --enable-userdir=shared --enable-actions=shared --enable-alias=shared --enable-auth=shared --enable-so --enable-deflate \
            --enable-cache=shared --enable-disk-cache=shared --enable-info=shared --enable-rewrite=shared \
            --enable-suexec=shared --with-suexec-caller=www-data --with-suexec-userdir=site --with-suexec-logfile=/usr/local/apache2/logs/suexec.log --with-suexec-docroot=/home

    Then PHP:

        ./configure --with-config-file-path=/usr/local/apache2/php --with-apxs2=/usr/local/apache2/bin/apxs --with-mysql --with-zlib --enable-exif --with-gd --enable-cgi

    Then Fcgid:

        APXS=/usr/local/apache2/bin/apxs ./configure.apxs

    The vhost is:

        <Directory /home/website_panel/site/>
            FCGIWrapper /home/website_panel/cgi/php .php
            ...
            ErrorLog /home/website_panel/logs/error.log
        </Directory>

    cat /home/website_panel/logs/error.log:

        [Sun Mar 07 22:19:41 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server
        [Sun Mar 07 22:19:41 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php
        [Sun Mar 07 22:19:41 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server
        [Sun Mar 07 22:19:41 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php
        [Sun Mar 07 22:19:42 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server
        [Sun Mar 07 22:19:42 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php
        [Sun Mar 07 22:19:43 2010] [warn] [client xx.xx.xx.xx] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server
        [Sun Mar 07 22:19:43 2010] [error] [client xx.xx.xx.xx] Premature end of script headers: test.php

    The suEXEC log:

        root:/usr/local/apache2# cat /var/log/apache2/suexec.log
        [2010-03-07 22:11:05]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php
        [2010-03-07 22:11:15]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php
        [2010-03-07 22:11:23]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php
        [2010-03-07 22:19:41]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php
        [2010-03-07 22:19:41]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php
        [2010-03-07 22:19:42]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php
        [2010-03-07 22:19:43]: uid: (1001/website_panel) gid: (1001/website_panel) cmd: php

    Apache's own error_log:

        root:/usr/local/apache2# cat logs/error_log
        [Sun Mar 07 22:18:47 2010] [notice] suEXEC mechanism enabled (wrapper: /usr/local/apache2/bin/suexec)
        [Sun Mar 07 22:18:47 2010] [notice] mod_bw : Memory Allocated 0 bytes (each conf takes 32 bytes)
        [Sun Mar 07 22:18:47 2010] [notice] mod_bw : Version 0.7 - Initialized [0 Confs]
        [Sun Mar 07 22:18:47 2010] [notice] Apache/2.2.14 (Unix) mod_fcgid/2.3.5 configured -- resuming normal operations

    And the PHP binary:

        root:/usr/local/apache2# /home/website_panel/cgi/php -v
        PHP 5.3.2 (cli) (built: Mar 7 2010 16:01:49)
        Copyright (c) 1997-2010 The PHP Group
        Zend Engine v2.3.0, Copyright (c) 1998-2010 Zend Technologies

    If someone has an idea, I want to hear it ^^ Thanks!
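
    One detail worth noticing in the output above: /home/website_panel/cgi/php reports itself as PHP 5.3.2 (cli). mod_fcgid needs the CGI/FastCGI SAPI (the php-cgi binary that --enable-cgi builds), not the CLI one, and pointing FCGIWrapper at the wrong SAPI can produce exactly this kind of "premature end of script headers" failure. A hedged wrapper-script sketch; the php-cgi path is an assumption:

        #!/bin/sh
        # keep a few FastCGI children alive and recycle them periodically
        PHP_FCGI_CHILDREN=4
        PHP_FCGI_MAX_REQUESTS=1000
        export PHP_FCGI_CHILDREN PHP_FCGI_MAX_REQUESTS
        # exec the CGI/FastCGI binary, not the CLI one
        exec /usr/local/bin/php-cgi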

    Read the article

  • Is it better to have more small ram chips or fewer large ones?

    - by Alex Andronov
    I am currently building a new server and have a choice between two configurations at the same price: 32GB of memory for 2 CPUs, DDR3, 1066MHz (8x4GB dual-ranked RDIMMs), or 36GB of memory for 2 CPUs, DDR3, 1066MHz (18x2GB dual-ranked RDIMMs). Should I go for the higher RAM total spread across more, smaller DIMMs, or for the fewer, larger ones? This will be for a Dell PowerEdge R710 with two Intel® Xeon® E5530, 2.4GHz, 8MB cache, 5.86 GT/s QPI, Turbo, HT. Thanks!

    Read the article

  • How to connect my Android to my laptop wirelessly, to stream data between the two?

    - by Deepun
    I want to stream data from my laptop PC to my Android phone over TCP or UDP, creating sockets on both the phone and the laptop, and it has to be done wirelessly. How do I connect them to stream the data? I thought creating an ad-hoc wireless network from my laptop and connecting to it from my Android would work, but my Android does not detect the ad-hoc network. Is there any other way to connect the two? I downloaded a tool called Connectify, created a Wi-Fi hotspot on my laptop, and successfully connected the two. Will I be able to stream data to my device over this connection? Could a plain Bluetooth connection also let me create sockets on both the phone and the laptop and stream the data?
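
    As a quick sanity check that such a hotspot link can carry a raw TCP stream, here is a hedged netcat sketch; the laptop's hotspot IP and the port are placeholders, and flag syntax differs between netcat variants:

        # on the laptop: listen on TCP port 5000 and serve a file
        nc -l -p 5000 < test.bin

        # on the phone (e.g. from a terminal-emulator app): connect and receive
        nc 192.168.1.1 5000 > test.bin

    If this works, application sockets over the same link should work too.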

    Read the article

  • Is anybody using Splunk in a large-scale production environment?

    - by Nano Taboada
    I've been watching the videos at splunk.com and it's really hard to believe that one can get all those features for free; there's still a "where's the catch?" in the back of my head. So it would be great if anybody who is actually using Splunk in production would share their experiences, perhaps highlighting its benefits over, say, Nagios. Thanks much in advance.

    Read the article

  • Best way to send large files point-to-point?

    - by Adam S
    I'm looking for a way to send a 10GB file to a friend. I really need to send it over the internet, but e-mail or uploading sites are not really an option. I remember using MSN messenger and having a file transfer feature that worked decently well. However, my friend doesn't have this software and doesn't want to get it. I know that the professional versions of TeamViewer have such a feature, but are there any free alternatives?

    Read the article

  • Website with large number of users keeps going down due to memory leaks: Tomcat 6 and Java 6

    - by user1766478
    We host many websites on one of our two virtual servers, using Tomcat 6 and Java 6. It is an MVC application with a Hibernate-like persistence layer. The problem is that one of our biggest clients, the one with the largest number of members, keeps crashing the server every 6-8 hours (precisely in the mornings, when most members log in). We have been having this issue for 4 days now and are trying to figure out the cause, but we suspect memory leaks. Any suggestions?
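
    For what it's worth, a hedged sketch of the standard JDK tooling for confirming a leak before hunting for it; the PID and paths are placeholders, and the dump is best analysed offline in a tool such as Eclipse MAT:

        # watch GC utilisation every 5 s; the old generation (column O) climbing
        # steadily towards 100% across collections suggests a leak
        jstat -gcutil <tomcat-pid> 5s

        # capture a heap dump of live objects (briefly pauses the JVM)
        jmap -dump:live,format=b,file=/tmp/heap.hprof <tomcat-pid>

        # or have the JVM dump automatically at the moment of failure
        # (add to Tomcat's JAVA_OPTS)
        -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp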

    Read the article

  • Copying and rotating large table from Excel to Word without turning it into picture/wmf/...

    - by ldigas
    What would be the easiest way of copying and rotating a table made in Excel into Word without turning it into a picture/enhanced metafile/something alike? I know I can use the section-break routine, but the problem is that the table needs to go into a company frame (which I cannot turn to landscape), so I literally need to rotate the table by 90 degrees. Is there any way of doing something like that?

    Read the article

  • Viewing a large field in a query in SQL Server Management Studio with zoom?

    - by smithym
    Hi there! Can anyone help? I am using SQL Server Management Studio (SQL Server 2008) to run queries, and some of the fields that come back are, for example, varchar(max) and contain a lot of information. Is there a zoom feature to open a window showing the contents with vertical and horizontal scrollbars? I remember there was one; I thought it was F2, but I must be mistaken, as that doesn't work. For now I have to scroll horizontally across the field, and it's really difficult to see everything. Also, some of the fields contain newline codes and the like, so it would be great if the zoom feature rendered those as actual line breaks. Does anybody know how to do this?

    Read the article

  • Why does the partition tool GParted read the 190GB of data twice when shrinking a 250GB partition to 190GB?

    - by Jian Lin
    When using GParted to shrink a 250GB partition to 190GB, I thought it would move the 60GB of data back into the 190GB region and call it done. Instead, it reads the 190GB of data twice, the first pass taking about 1 hour and the second about 2 hours. My questions: 1) why does it touch the 190GB of data instead of just the 60GB, and 2) why does it read it twice? Update: I suspect the following. It says "moving /dev/sdb1 to the right and then shrink it to 190GB", so perhaps it first shrinks the partition to 190GB and then moves it to the right; that is, it is not moving right and then shrinking, but shrinking first and then moving (it cannot move first, because the original 250GB is the whole hard drive). But why move it to the right at all?

    Read the article

  • Database on SSD: data only, or the DBMS software too?

    - by simone
    I plan on moving the data I use for statistical analysis (100-ish GB) onto an SSD. The data is either in single-file SQLite databases or managed by PostgreSQL. The SSD is 240 GB, 550 MB/s read and 520 MB/s write. Should I reserve that space for the data only, or would it be a good idea to install the operating system (Mac OS X) and the application directory (Adobe Suite, Microsoft Office and the like) on the SSD too? And would it make a substantial speed difference if I also installed the PostgreSQL binaries on the SSD? I have plenty of other space (another 300GB hard drive, and a 1TB one). I don't know the specs of the non-SSD drives, but they're our standard equipment on all Macs and they're definitely OK. Thanks.

    Read the article

  • In gnuplot, how to plot with lines but skip missing data points?

    - by Anna
    I've got a value associated with each day, like this:

        120530 70.1
        120531 69.0
        120601 69.2
        120602 69.5
        # and so on for 200 lines

    When plotting this data in gnuplot with lines, the data points are nicely connected. Unfortunately, in places over a week of data points can be missing, and gnuplot draws long lines across these intervals. How can I make gnuplot connect only points on consecutive days? Solutions that require preprocessing of the data are fine, as I already smooth it with a script. Here is what I use:

        set xdata time
        set timefmt "%y%m%d"
        plot "vikt_ma.txt" using 1:2 with lines title "first line", \
             "" using 1:3 with lines title "second line"
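
    One approach that fits the "preprocessing is fine" constraint: gnuplot breaks a "with lines" plot wherever the data file contains a single blank line, so the smoothing script can emit a blank record whenever consecutive dates are more than a day apart. A sketch of the resulting data file (these dates are invented for illustration):

        120530 70.1
        120531 69.0

        # the blank line above stops gnuplot from connecting across the gap
        120608 69.2
        120609 69.5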

    Read the article

  • Very large number of connections in TIME_WAIT state; server is slow; ip_conntrack

    - by Sparsh Gupta
    I have an nginx server doing load balancing and reverse proxying. Right now it sits behind another nginx, but very soon I plan to put it in front, where it will receive TCP connections from clients directly at a rate of 500 req/second. I am having some big trouble with the server. I have pasted my configuration here, and I am fairly sure the problem lies with ip_conntrack and similar things, which are alien to me: http://paste.org/pastebin/view/28543

        root@load_balancer:/proc/sys/net/ipv4# netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c
             67 CLOSING
            727 ESTABLISHED
            173 FIN_WAIT1
            183 FIN_WAIT2
             19 LAST_ACK
              5 LISTEN
            447 SYN_RECV
              1 SYN_SENT
          27970 TIME_WAIT

    It's an Ubuntu machine running mainly nginx (load balancer and reverse proxy). It surely isn't in great shape. Can you help me understand what's going on and how I can fix it? This is my live server and I am sure it's in a bad state right now. Any documents or commands to fix this, or settings I should change to reduce the TIME_WAIT and FIN_WAIT1/2 counts, would be awesome.
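
    For reference, a hedged sketch of the kernel knobs most often discussed for this symptom; the values are illustrative starting points, not recommendations, and should be tested before being applied to a live box:

        # /etc/sysctl.conf
        net.ipv4.tcp_fin_timeout = 30                  # shorten FIN_WAIT2 hold time (default 60 s)
        net.ipv4.tcp_tw_reuse = 1                      # let outbound connections reuse TIME_WAIT sockets
        net.ipv4.netfilter.ip_conntrack_max = 131072   # raise the conntrack table ceiling

        # apply without rebooting
        sysctl -p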

    Read the article

  • I need a formula showing counts, totals and sub-totals for a data set from a different sheet

    - by Sapthagiri
    I am using Excel 2003. I have a cell in Sheet 1 with a color value, plus totals and sub-totals. On Sheet 2 I have a data set with columns for color, make, dress and type. On Sheet 1 I need a tabulation showing totals per color, with sub-totals per dress (shirt, pants) split by type (full, half, tee). The table below represents my data set on Sheet 2:

        Colors   Make      Dress   Type
        -------------------------------
        Red      Arrow     shirt   full
        Red      Levi      shirt   half
        blue     Rugger    Pant    full
        yellow   Wrangler  shirt   tee
        yellow   Rugger    Pant    half
        yellow   Arrow     shirt   tee
        yellow   Wrangler  Pant    half
        Green    Rugger    Pant    full
        Red      Levi      shirt   tee
        blue     Rugger    Pant    full
        blue     Arrow     shirt   full
        blue     Wrangler  Pant    half
        Green    Levi      shirt   full

    I need a formula showing counts, totals and sub-totals on Sheet 1 for the data set from Sheet 2. The table below represents my expected output on Sheet 1:

                 total   Shirt   Full   Half   Tees   Pants   Full   Shorts
        Red      10      8       4      3      1      2       1      1
        Blue
        Green
        Yellow

    Please note I am not looking for a pivot-table solution.
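
    Since this must work in Excel 2003 (which lacks COUNTIFS) without a pivot table, one hedged sketch uses SUMPRODUCT for each count cell; the sheet name and ranges below are assumptions based on the layout above:

        count of Red shirts of type "full":
        =SUMPRODUCT((Sheet2!$A$2:$A$14="Red")*(Sheet2!$C$2:$C$14="shirt")*(Sheet2!$D$2:$D$14="full"))

        sub-total of all Red shirts:
        =SUMPRODUCT((Sheet2!$A$2:$A$14="Red")*(Sheet2!$C$2:$C$14="shirt"))

        grand total for Red:
        =COUNTIF(Sheet2!$A$2:$A$14,"Red")

    Copying the pattern across the grid, with the literal strings replaced by references to the row and column headers, fills in the whole tabulation.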

    Read the article

  • Do CDNs actually dump the data to the client or pass the URL of the content?

    - by zengr
    I am curious. Say, for example, Facebook is a client of the Akamai CDN. When I log in to my Facebook page, I see all the content (video, image, text, etc.) and I click on a video to view it. Facebook is the client of Akamai for that content. So when a request is made from Facebook to Akamai, does Akamai dump the video/image from its data centers to Facebook's data centers, where it resides for a while (depending on their heuristics) and is flushed after some time? Or do I see (stream) that video directly from Akamai's servers? UPDATE: Data resides in the CDN permanently (agreed), but is a copy of the content sent to Facebook's data centers too?

    Read the article

  • Return a cell reference as the result of an IF statement with VLOOKUPs

    - by EMJ
    I have two sets of data in Excel. One contains records representing the initial step of a process; the other represents the additional steps which take place after the first step is completed. Each record in the "additional step data" has an id in a column. I need to find the ids of the "additional step data" records that correspond with the initial-step records. The problem is that I have to match the data across 4 columns between the two data sets and return the id of the matching "additional step data" record. I started with a combination of IF and VLOOKUP functions, but got stuck trying to make the IF statement reference the id of the matching record. Basically, I am trying to avoid finding corresponding records by manually filtering between the two data sets. Does anyone have any idea how to do this?
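
    One hedged sketch for matching on four columns at once and returning the id: an INDEX/MATCH array formula, confirmed with Ctrl+Shift+Enter. The sheet name "Steps", the criteria columns A:D and the id column E are assumptions for illustration:

        =INDEX(Steps!$E$2:$E$500,
               MATCH(1, (Steps!$A$2:$A$500=A2)
                       *(Steps!$B$2:$B$500=B2)
                       *(Steps!$C$2:$C$500=C2)
                       *(Steps!$D$2:$D$500=D2), 0))

    Multiplying the four comparisons yields 1 only where every column matches, and MATCH finds the position of that row for INDEX.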

    Read the article

  • What are the possible disadvantages of enabling the "data access" server option in sys.servers for the local server?

    - by Corp. Hicks
    We plan to change the default server options of a SQL Server 2005 instance by enabling data access. The reason is that we want to run "SELECT * FROM OPENQUERY(LOCALSERVER, '...')"-like statements on the server. What are the possible disadvantages of enabling the server option "data access" (alias sys.servers.is_data_access_enabled) for the local server (sys.servers.server_id = 0)? (There must be a reason for MS setting this option to disabled by default...)

    EDIT: it turns out that I'm not the first person to ask this question: http://sqlblogcasts.com/blogs/piotr_rodak/archive/2009/11/22/data-access-setting-on-local-server.aspx

    "The DATA ACCESS server option is not very well documented in my opinion - the Books On Line say it is a property of linked servers. It doesn't mention at all that you actually can have it enabled on your local server to enable OPENQUERY calls. I noticed that when you disable DATA ACCESS on a linked server, you can't query any table located on it (I tested it on my loopback server) neither using OPENQUERY nor four-part naming convention. You can still call procedures (with four-part naming) that return rowsets. Well, the interesting question is why it is disabled by default on local server - I suppose to discourage users from using OPENQUERY against it."

    It also seems that the author of the post (Piotr Rodak) is a Stack Overflow user :-)
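
    For context, a hedged sketch of what the option enables; the server name is a placeholder for the local server's entry in sys.servers:

        -- enable data access on the local server entry
        EXEC sp_serveroption @server = 'LOCALSERVER', @optname = 'data access', @optvalue = 'true';

        -- OPENQUERY can now target the local server
        SELECT * FROM OPENQUERY(LOCALSERVER, 'SELECT name FROM sys.databases');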

    Read the article

  • How do I get a chart in LibreOffice Calc to have regular time intervals when the data is not regular?

    - by Dave M G
    I have an area chart in LibreOffice Calc where I would like the X axis to be measured in one-month intervals spaced evenly apart. The data going into the graph, however, is not regularly spaced at one month; there can be zero, one, or two entries in any given month. Right now, the chart keeps the horizontal spacing of the data points consistent and adjusts the X axis to accommodate them, which is the reverse of what I want. In my example chart there is one data point in March, one in May, and two in June; along the X axis, April is gone and June takes up double the space. Instead of this, I'd like the months to keep the same spacing, so April, May, and June all appear, with the data area of the chart compressing or extending horizontally as necessary. I have tried editing the X axis and adjusting the time intervals, but this doesn't seem to do anything. Is what I'm after possible?

    Read the article

  • Scanned JPEGs are large and slow to load - can they be optimized losslessly?

    - by Alistair Knock
    I have hundreds of JPEG photographs which were scanned about 5 years ago from negatives using a Konica Minolta DiMAGE Scan Dual IV. The dimensions are ~4500x3000 and the file size is around 12 MB, compared to shots from a DSLR with dimensions of 3000x2300 and file sizes of 2-4 MB (actually, these are the output of a RAW converter). The file size is obviously quite a big difference, but the issue that bothers me is that the perceived loading time is at least 10 times slower. Is this size/speed discrepancy likely to be because the scanner software saved the JPEGs inefficiently or with an old compression format, or is it simply that the scanned negatives contain much more "detail" (in the form of grain/noise) than the digital images? If the former, is there a way to losslessly optimize them? I've tried re-exporting the scanned files to full-size JPEG from my RAW software, but the file size is pretty much the same. Both files were saved at quality 100.
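
    On the lossless-optimization question, a hedged sketch using jpegtran (shipped with libjpeg), which rewrites the entropy coding without touching the pixel data; the filenames are placeholders:

        # rebuild optimal Huffman tables - lossless, often saves a few percent
        jpegtran -optimize -copy all in.jpg > out.jpg

        # progressive encoding, also lossless, can improve perceived load time
        jpegtran -progressive -optimize -copy all in.jpg > out.jpg

    Film grain and scanner noise do compress poorly, so much of the 12 MB may simply be irreducible at quality 100.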

    Read the article

  • Linux: How to break a large file into smaller files?

    - by Runcible
    I have a giant file (20 gigs) sitting on my source machine and I need to transfer it to my target machine. For the purposes of this question, let's assume that I do not have network connectivity between the two machines. I need to break this file into a series of smaller files, write the smaller files to DVD(s), then re-assemble everything on the target machine. Both source and destination machines are Linux boxes. Is there a way to accomplish this using tar? I have a feeling that I need to use the --multi-volume parameter. What are my options? I need to be able to specify the size of the volume files, in order to make sure that each one will fit onto a single DVD. Thanks!
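
    Two hedged sketches, assuming GNU tar and coreutils; the sizes are illustrative (a single-layer DVD holds roughly 4.4 GiB):

        # tar multi-volume: -M enables it, -L gives the volume size in units
        # of 1024 bytes; tar prompts for the next volume when one fills up
        tar -cM -L 4500000 -f bigfile.tar bigfile

        # simpler alternative: split into fixed-size chunks ...
        split -b 4400m bigfile bigfile.part.
        # ... burn the parts, then on the target machine reassemble and verify
        cat bigfile.part.* > bigfile
        md5sum bigfile   # compare with a checksum taken on the source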

    Read the article

  • Copy-paste speed very slow for a large number of tiny files on Windows but not on Linux

    - by Arno2501
    I've got a folder which contains 15,000 tiny images (around 400 bytes each). If I copy-paste this folder on my laptop (Windows 7, latest-gen i7, super-fast SSD) it takes about 30 seconds (yes, for 7 MB!). The average transfer rate is 400 KB/second, which is very slow; my usual transfer rate is more like hundreds of MB per second. I get the same problem on my servers (Windows 2003, 2008/R2) and on every Windows box I can get my hands on. On the other hand, if I do the same on a Linux box (Debian-based, ext3 filesystem, running on the same SAN as all the Windows servers I've tested), it's nearly instantaneous! I'm pretty sure the size and number of the files stress any filesystem more than one big file would, but such a difference!? Why is it so slow on the Windows boxes (more than 30 seconds for 7 MB) and so fast on the Linux ones (a second or so)? (This was a true copy I created, not a hardlink.) Is this normal behaviour or something unusual?
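
    Not an explanation of the difference, but a hedged workaround sketch for many-small-file copies on Windows 7 / 2008 R2, where robocopy can copy with multiple threads; the paths are placeholders:

        robocopy C:\images D:\backup\images /E /MT:32 /R:1 /W:1
        :: /E copies subfolders, /MT:32 uses 32 threads,
        :: /R and /W cut the default retry counts and waits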

    Read the article

  • What's the best way to monitor a large number of application pools in IIS7?

    - by Kev
    Some background first: we're running IIS 7 on Windows 2008, with around 250 websites per server and each site in its own application pool. I need a way to monitor each application pool for crashes and hangs, and to send an email alert if an application pool is unresponsive for more than, say, 2 minutes. I thought about mapping a virtual directory into each site with an ASP.NET page that we could poll via our existing monitoring system (HostMonitor). Does anyone else have experience in this area?
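
    One hedged building block: IIS 7 ships appcmd, which can list application pools filtered by state, so a scheduled script could flag stopped pools and raise the email alert; the alerting glue itself is left out here:

        %windir%\system32\inetsrv\appcmd list apppool /state:Stopped

    This catches crashed pools but not hung ones, so polling a page in each site (as described above) would still be needed for hang detection.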

    Read the article

  • Why does Exchange 2003 silently reject emails with large attachments?

    - by Cypher
    Our environment: Exchange Server 2003 Standard, single instance, running on Windows Server 2003 Standard, configured to not send/receive mail with attachments larger than 10 MB. NDRs are not enabled.

    The issue: when an external sender sends an email with an attachment larger than 10 MB, Exchange, as configured, does not receive the message. However, the sender of that message does not receive any notification from his own mail server that the message could not be delivered due to attachment size. Yet if an external user tries to send an email to a non-existent user, they do receive a message from their mail server indicating that the user does not exist. Why is that, and is there anything I can do about it? It would be nice if the sender received notification that the attachment size exceeds our limits and that their message was never received.

    Update: the Exchange server has a SpamAssassin box in front of it; could that have something to do with it? Here is one of the last lines from SpamAssassin's logs when searching for my test e-mails:

        mail postfix/smtp[19133]: 2B80917758: to=, relay=10.0.0.8[10.0.0.8]:25, delay=4.3, delays=2.6/0/0/1.7, dsn=2.6.0, status=sent (250 2.6.0 Queued mail for delivery)

    My assumption is that SpamAssassin thinks the message is OK and is forwarding it on to Exchange.

    Update: I've verified that Exchange is receiving the message and generating an NDR. However, delivery of NDRs is disabled to prevent backscatter. Is there something I can do to get Exchange to send a bounce message to the sending mail server (or verify that such a message is being sent) so the sending mail server can notify its sender of the bounce?

    Read the article

  • How to rsync a large file, with as little CPU and bandwidth expense as possible?

    - by Johan Allgoth
    I have a 500 GB file that I plan on backing up remotely. The file changes often. I'll be rsyncing it from a desktop to a server; both can run the rsync client or server. What is the proper command for this? The ones I've tried so far have either taken forever or simply acted strangely. Example and results:

        rsync -cv --partial --inplace --no-whole-file /desktop/file1 myserver.com::module/file1

    This seems to work, but only if I do it twice (?!). Also, it's slow. Does the above command do the checksumming on both computers, or only on the sending one? Is it correct otherwise?
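
    On the checksum question, a hedged note: -c forces a full-file checksum pass on both ends before any data moves, which for a 500 GB file means reading the whole file once per side up front. rsync's delta algorithm already exchanges block checksums during the transfer, so dropping -c is the usual first economy; --no-whole-file is also already the default when source and destination are on different hosts. A sketch (module and paths taken from the question):

        # rely on size+mtime to decide whether to transfer;
        # the delta algorithm still verifies blocks as it goes
        rsync -v --partial --inplace /desktop/file1 myserver.com::module/file1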

    Read the article
