Hi
I'd just like to know what the number '64' represents in this MySQL connection error message:
Can't connect to MySQL server on '127.0.0.1' (64) in /var/www/test.php
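From what I understand, the number in parentheses is the operating system's error code rather than a MySQL error code, but I'm not certain. If that's right, something like this should translate it on the machine in question:

# perror ships with MySQL and translates both MySQL and OS error codes
perror 64
# or ask the OS directly what errno 64 means on this particular system
python -c 'import os; print(os.strerror(64))'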
Thank you.
I know Ghost and Clonezilla aren't able to build images of a system while the system is running (without rebooting). I haven't checked Acronis, but I don't sympathize with proprietary solutions.
Question: Is there a software solution which is able to build a "Live" image?
Would appreciate answers, since I'm one step away from building a Clonezilla test environment and this will help with my decision.
Thank you.
I am trying to reproduce the file structure of my VPS locally on my Mac, so that it's easier for me to test websites in a local development environment.
To do this I would need to have a /home folder at the root level of the hard drive.
Using Panic's Transmit I can see that there is already a volume called home at the root level.
Can I store other files and folders in here to set up my local web server?
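In case it helps, this is what I was planning to run to see what that home volume actually is (I believe /home on OS X is normally an autofs mount point, but I'm not certain):

# show what /home is currently mounted as
mount | grep home
# the autofs map that (I think) controls /home on OS X
cat /etc/auto_master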
Sorry if this is a dumb question, folks.
I need to create an Active Directory lab that contains printers and test out various printer-related functionality (adding printers in AD, clients attaching to printers, etc.).
Is there a good way to properly simulate printers on a network? Or do there need to be real physical printers somewhere that are eventually attached, even if no output ever comes out?
How would you solve this problem?
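If it matters, the only idea I've had so far is faking them with dummy TCP/IP ports on a print server, roughly like the PowerShell below (which I think needs the 2012-level PrintManagement cmdlets); the address is just a placeholder and I don't know whether this behaves enough like a real printer for testing:

# create a port that points at an address where no printer actually exists
Add-PrinterPort -Name "FakePrinterPort1" -PrinterHostAddress "192.0.2.50"
# attach a queue to it using an in-box driver
Add-Printer -Name "Fake Printer 1" -DriverName "Generic / Text Only" -PortName "FakePrinterPort1"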
The content on the Server 2008 R2 Trial Software page states that it can be evaluated for up to 180 days; however, on a test machine we installed last week, it's requesting "re-arming" every 10 days, which seems to be doable a maximum of 5 times.
How do we get it to last more than 50 days? It'd be a pain to have to rebuild the server concerned!
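For what it's worth, this is what we've been running on the test machine to check the licensing state (the remaining time and rearm count show up in the verbose output):

rem show detailed licence status, including remaining time and rearm count
slmgr.vbs /dlv
rem what we've been doing every 10 days so far
slmgr.vbs /rearm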
I am in charge of production servers serving static content for a website. Those servers are constantly being crawled by bots looking for potential exploits (which isn't that much of a problem security-wise, because no application can be reached behind the web server), but this generates thousands of 404s per day, sometimes per hour. I am looking into ways of blocking those requests, but it's tricky (you want to make sure you don't block legitimate traffic, and these bots are getting ever better at looking legit), and it's going to take me a while to find an acceptable solution.
In the meantime I would like to reduce the performance impact of serving those 404 pages. We're using nginx, which by default serves its 404 page from disk. (This can be changed using the error_page directive, but in the end the 404 will either have to be served from disk or from another external source, e.g. an upstream application, which would be even worse.) That isn't ideal.
I ran a test with ab on my local machine with a basic configuration: in one case I echo a message directly from nginx so the disk isn't touched at all, in the other case I hit a missing page and nginx serves its 404 from disk.
server {
    # [...] the default nginx stuff

    location / {
    }

    location /this_page_exists {
        echo "this page was found";
    }
}
Here are the test results (my laptop has Intel(R) Core(TM) i7-2670QM + SSD in case you're wondering why they are so high):
$ ab -n 500000 -c 1000 http://localhost/this_page_exists
Requests per second: 25609.16 [#/sec] (mean)
$ ab -n 500000 -c 1000 http://localhost/this_page_doesnt_exists
Requests per second: 22905.72 [#/sec] (mean)
As you can see, returning a value with echo is roughly 12% faster ((25609-22905)÷22905×100 ≈ 11.8) than serving the 404 page from disk. Accordingly, I would like to echo a simple "404 - Page not found" string from nginx.
I have tried many things so far, but they all failed; essentially the idea was this:
location / {
    try_files $uri @not_found;
}

location @not_found {
    echo "404 - Page not found";
}
The problem is that as soon as the echo directive is used, the HTTP response code is set to 200. I tried changing that with error_page 200 = 400, but that breaks the configuration.
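One thing I haven't properly tested yet is replacing echo with nginx's return directive, which (if I read the docs correctly) can send a body string together with a status code on reasonably recent versions:

location @not_found {
    # untested sketch: return should keep the 404 status, unlike echo
    return 404 "404 - Page not found";
}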
How can I serve a 404 page directly from nginx (without hacking the source, which might be my next step)?
Hi
I want to install Internet Explorer 6 on Windows Vista but I can't. Is there a way to do it?
I just want to test my website on it to fix some CSS bugs.
Thanks
I want to upgrade my Vista 64-bit edition to Windows 7 64-bit, so I've installed and run the Windows 7 Compatibility Test. The only item that is being highlighted as incompatible is my Zyxel G-202 wireless USB network device.
Does anyone know of a wireless USB device that is compatible with Windows 7 64-bit?
I'm setting up a database server on EC2, and I need to ensure that an EBS volume is automatically attached and is available before the database service starts up.
I'm using SMF, so I can test whether a particular filesystem is available before starting the db service; there's no problem from that perspective. However, I'm not quite sure how to tell the server to auto-attach the EBS volume during/after boot.
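The rough idea I have so far is a boot-time script along these lines (just a sketch: it assumes the AWS CLI is available on the instance, and the volume ID and device name are placeholders that will differ on my OS):

#!/bin/sh
# figure out which instance we are from the metadata service
INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)

# attach the data volume (placeholder IDs)
aws ec2 attach-volume --volume-id vol-0123456789abcdef0 \
    --instance-id "$INSTANCE_ID" --device /dev/sdf

# wait until the device actually shows up before the DB service starts
while [ ! -e /dev/sdf ]; do sleep 2; done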
What would be the best strategy for this?
There's a headless Ubuntu instance used as a host for our build server. I have some UI code that requires graphical output. Installing vnc4server and redirecting DISPLAY to it worked like a charm: now my UI tests are running and the test scripts can take screenshots.
The problem is that I need to set the resolution that vnc4server uses to serve the graphical content. Does anybody know how to configure this on Ubuntu Server?
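What I've tried so far is just passing the geometry on the command line when starting the server, which I assume is the usual way, but I'd like to know where to set it permanently:

# start display :1 at a fixed resolution and colour depth
vnc4server :1 -geometry 1600x900 -depth 24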
Is it possible to add a wildcard ServerAlias (example: *.somesite.com) to an Apache server without modifying httpd.conf manually? I use a DNS provider separate from my hosting server, and I have added a wildcard A record to my DNS to point all requests (test.somesite.com, test2.somesite.com, etc.) to my hosting server's IP, but I don't see any way of adding wildcard ServerAliases to Apache's httpd.conf file in my cPanel. Please, is there a solution?
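If I could edit the vhost by hand, I assume the relevant line would simply be something like the following (based on the Apache docs, not on cPanel's generated config):

<VirtualHost *:80>
    ServerName somesite.com
    ServerAlias somesite.com *.somesite.com
    # ... rest of the vhost as cPanel generated it
</VirtualHost>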
My nginx configuration currently has this:
location / {
    if (!-e $request_filename) {
        rewrite ^/(.*)$ https://domain.com/index.php?id=$1 redirect;
    }
}
Basically, for non-existing pages (404) it redirects the user to the home page. But now I have a WordPress blog set up at https://domain.com/blog/, and any WordPress item, e.g. https://domain.com/blog/test, also gets redirected to the home page. How can I fix this?
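I was thinking of adding a separate location block for the blog, something like the sketch below (the usual WordPress-in-a-subdirectory pattern, as far as I can tell), but I'm not sure whether it's the right fix or how it interacts with the existing block:

location /blog/ {
    # let WordPress handle its own permalinks instead of the global rewrite
    try_files $uri $uri/ /blog/index.php?$args;
}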
I am new to Twitter. Rather than using Twitter to broadcast my text, I would like to organize my ideas using hashtags. One thing I can't figure out is how to search my own tweets ONLY.
How do I limit search result so I can only see my tweets?
I tried from:myid #test, but it does not work.
I am trying to use the PHP ftp_connect function on my dedicated server and I'm unable to establish a connection:
$conn_id = ftp_connect($ftp_server, 21) or die("Unable to connect to $ftp_server") ;
I'm sure the function is available, as I tested with:
function_exists('ftp_connect')
and it returns true.
When I FTP to the server from the shell I can reach it, so I guess it's not a firewall issue.
Am I missing something else?
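For reference, this is roughly the test script I'm using (the hostname is a placeholder); the fsockopen call is only there to rule out a plain network problem:

<?php
error_reporting(E_ALL);
ini_set('display_errors', '1');

$ftp_server = 'ftp.example.com'; // placeholder

// raw TCP test on port 21
$sock = fsockopen($ftp_server, 21, $errno, $errstr, 10);
var_dump($sock, $errno, $errstr);

// the call that fails for me
$conn_id = ftp_connect($ftp_server, 21, 10);
var_dump($conn_id);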
Thanks in advance for your advice.
I am trying to redirect /folder to / using .htaccess, but all I am getting is the Apache HTTP Server Test Page.
My root directory looks like this:
/
.htaccess
-/folder
-/folder2
-/folder3
My .htaccess looks like this:
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/folder/
RewriteRule (.*) /folder/$1
What am I doing wrong? I checked my httpd.conf (I'm running CentOS) and the mod_rewrite module is being loaded. As a side note, my server is not a www server; it's simply a virtual machine, so its hostname is centosvm.
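In case it's useful, here is what I was planning to check next (guesses on my part): whether the test page comes from CentOS's welcome.conf, and whether .htaccess files are even allowed for the document root:

# dump the virtual host / document root configuration Apache actually parsed
apachectl -S
# the default test page on CentOS usually comes from here
cat /etc/httpd/conf.d/welcome.conf
# .htaccess is ignored unless AllowOverride permits it for the directory
grep -n "AllowOverride" /etc/httpd/conf/httpd.conf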
In a vsftpd server environment, with various directories shared from NFS mountpoints, I can log in without a problem, but when I send the first "ls", vsftpd gives me the directory listing:
lftp [email protected]:~ ls
-rw-rw-rw- 1 1160 1016 392 Jun 06 09:28 test.gif
but it does not give me the prompt back (lftp client). In the server log I can see that the last message is:
"150 Here comes the directory listing."
Why does this happen?
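In case it's relevant, this is how I can reproduce it with more verbose output (user and host are placeholders); toggling passive mode is just a guess on my part:

# verbose protocol trace, trying active instead of passive mode
lftp -e "debug 5; set ftp:passive-mode off; ls; bye" user@server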
I changed the default location from c:\inetpub\wwwroot to d:\inetpub\wwwroot, but when I access my .NET 4.0 site I get this error:
Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
Parser Error Message: Unrecognized attribute 'targetFramework'. Note that attribute names are case-sensitive.
Source Error:
Line 105: Set explicit="true" to force declaration of all variables.
Line 106: -->
Line 107: <compilation debug="true" strict="true" explicit="true" targetFramework="4.0">
Line 108: <assemblies>
Line 109: <add assembly="System.Web.Extensions.Design, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/>
When I try to Manage the Basic Settings on the Site and click the "Test Settings" button, I see that I have a problem under "authorization:"
The server is configured to use pass-through authentication with a built-in account to access the specified physical path. However, IIS Manager cannot verify whether the built-in account has access. Make sure that the application pool identity has Read access to the physical path. If this server is joined to a domain, and the application pool identity is NetworkService or LocalSystem, verify that <domain>\<computer_name>$ has Read access to the physical path. Then test these settings again.
1) Do I need to grant rights to IIS on the new folder? Which user? I thought it was something like IIS_USER, but I cannot determine the correct name of the user.
2) Also, do I need to set the default version of the framework somewhere at the Default Site level or at the Virtual folder level? How is this done in IIS6? I am used to IIS5, or whatever came with XP Pro.
3) My original site had a subfolder under wwwroot called "aspnet_client". How was this created? I manually copied it to the corresponding new location. My app was using separate ASP-specific databases for storing session state and role info, if that is relevant.
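For what it's worth, this is roughly what I was planning to try next from an elevated command prompt (the group name and application pool name are guesses on my part, which is partly why I'm asking):

rem give the IIS worker process group read access to the new location
icacls D:\inetpub\wwwroot /grant "IIS_IUSRS:(OI)(CI)(RX)" /T

rem point the site's application pool at the .NET 4.0 runtime
%windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /managedRuntimeVersion:v4.0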
Thanks
Bought a new computer back in August with 4x4 GB RAM. Had problems with the RAM. They sent me four new sticks, which also generated errors. Singled out four sticks (from the eight I now had) that didn't generate any errors. Discovered by coincidence a new RAM error last week (this time no BSOD). Contacted the company. According to them there have been issues with a bad stock from last summer, so I got two tested 8 GB sticks sent to me. Been running Memtest86+ over the weekend. After 20 hours I got an error (see attached photo). The test has now been running for 37 hours, but so far only this one error. I contacted the company where I bought the computer. They wrote back:
I wouldn't worry about that one fail.
We have had similar situations here whereby it passes numerous times
but then fails once. We think it's an issue with memtest, after all
memory is faulty or it isn't so you can't really have it pass a few
times, fail the next time around and then pass again!
Please trust me on this and continue with the memory we sent you and
if your problems continue we'll look at getting it replaced again.
I gather from other forum posts that many people do not accept a single error. What could this single error signify, faulty RAM or a glitch in the MEMTEST program (or other)?
Update: From the helpful comments below I conclude that an occasional (and rare) "random" error could occur and be acceptable, but repeated errors at the same address would indicate malfunction. Memtest has now run for 45 hours and I still have only one error. For everyone's information, I will keep running the test. In less than two days I am going away for a month. I will most likely leave Memtest running. As I do not have a UPS there is a risk that a power outage will ruin the experiment. The computer is a desktop so I cannot bring it with me (which would curiously have exposed it to more cosmic rays as I will be flying ;)).
I use eCryptfs to encrypt the home directory of my laptop. My backup script copies the encrypted files to a server (together with everything else in /home/.ecryptfs).
How can I mount the encrypted files of the backup? I'd like to verify that I can do that, and that everything is in place.
My naive try with
mount -t ecryptfs /backup/home/.ecryptfs/boldewyn /mnt/test
didn't work; eCryptfs wanted to create a new partition.
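The only other thing I've found so far (untested) is the ecryptfs-recover-private helper, pointed at the .Private directory inside the backup, assuming the backup keeps the usual <user>/.Private layout:

# interactively mounts a read-only copy after asking for the login passphrase
sudo ecryptfs-recover-private /backup/home/.ecryptfs/boldewyn/.Private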
When I open Network and Sharing Center in Windows 7, it puts a red X between the network and the internet, but when I run Troubleshoot Problems it tells me that it does not know what the problem is.
Is there any way to tell what test Windows used to place that red X, and how it failed? The system obviously knows something that it is not telling me. Knowing the details would help me solve this problem.
When we downloaded Xcode 4.5.1, we installed the older simulators for testing applications from Settings -> Downloads -> Components.
Now Xcode 4.5.2 is available. Do we need to download the simulators again for that Xcode, or is there a location where those simulators are stored, similar to the Documentation?
In the 2nd image below we can see the location for the documentation, but in the 1st image selecting a simulator doesn't show any installation location.
Any ideas?
I have four websites all hosted under the same account (all under public_html of a main domain).
All but one give me an A for "use cookie-free domains"; that one gives me an F. What should I be doing to change this one domain? I don't even know what to look for...
Wait a second... grr... I just ran the test again and got an A. Weird...
I make regular snapshots of my VM using a nightly script. These backups are compressed using WinRAR and do shrink considerably, but I suspect it's not as efficient as it would be if the file had been deduplicated first (just a hunch which I'm hoping to test).
So instead of compressing the VHD itself, I would like to deduplicate the single file first, and then compress the output of the deduplicator.
Is anyone aware of such a CLI tool?