  • Bind DHCP Server to Network Bridge

    - by Luke
    My wireless router died, so I decided to route everything through my server. I installed a second NIC and a wireless card to build the new network: one NIC to the modem, one NIC to the switch, and the wireless card to... well, wireless. Anyway, I got far enough to get DHCP working on just ONE adapter when I used Internet Connection Sharing (I couldn't get RRAS set up for the life of me), then I decided to try bridging the wireless card and the second NIC. Now the DHCP server won't bind to the bridge, but if I enter manual IPs on my clients they can reach the Internet. I also tried changing my wireless adapter's IP to 192.168.0.2, and then to 192.168.1.1 to try to set up a separate scope, but to no avail. Running Windows Server 2003.
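
    For reference, the Windows DHCP Server service generally only binds to connections that have a static address, so the bridge usually needs one of its own first. A minimal sketch; the interface name "Network Bridge" and the 192.168.0.x addressing are assumptions, not taken from the question:

        rem give the bridge a static address (name and addresses are placeholders)
        netsh interface ip set address name="Network Bridge" static 192.168.0.1 255.255.255.0
        rem confirm the result, then re-check the server bindings in the DHCP console
        ipconfig /all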

  • TFTP uploads failing

    - by dunxd
    I am running TFTPD via xinetd on a Centos 5.4 server. I am able to access files via tftp fine, so I know the service is running ok. However, whenever I try and upload a file I get a 0 Permission denied message. I have already created the file in /tftpboot and set the permissions to 666. My tftpd config has verbose logging (-vvvv), but all I see in my /var/log/messages is: START: tftp pid=20383 from=192.168.77.4 I have seen some mention that SELinux can prevent TFTPD uploads, but I'd expect to see something in the logs. I have SELinux set in permissive mode. Any ideas?
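
    For comparison, a typical xinetd stanza that permits uploads looks something like the sketch below; the -c flag (allow new files to be created) and the assumption that the daemon is /usr/sbin/in.tftpd serving /tftpboot are mine, not confirmed from the question. Whether this matches the existing setup is unknown; it is only meant to show where the upload-related flags live.

        # /etc/xinetd.d/tftp -- sketch only
        service tftp
        {
                socket_type     = dgram
                protocol        = udp
                wait            = yes
                user            = root
                server          = /usr/sbin/in.tftpd
                server_args     = -c -s /tftpboot -vvvv
                disable         = no
        }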

  • Bash vs. GNU Screen: Replace Ctrl-A with Ctrl-Shift-A

    - by Stefan Lasiewski
    I'm a new user to GNU Screen. I've been using Bash for a very long time, and I want to give GNU Screen a try. As you know, GNU Screen uses 'C-a' (Control-A) as the command character. Trouble is, this interferes with the line-editing feature in Bash (and GNU Readline), because Control-A in Bash will "move to the start of the line." I know I can set the command character to another key sequence, like "^Q" or "`" (backtick), but I have trouble finding another key sequence which isn't already in use (^Q is used by the terminal, backtick is used when writing shell scripts). It appears that the command character may only be one or two characters in length. Can I set the GNU Screen control character to be something like "Control-Shift-A"? (I can't use more than one hyperlink yet, so I cannot link to the Bash documentation)
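
    For context, the command character is controlled by the escape directive in ~/.screenrc; a minimal sketch, where the choice of Ctrl-B is an arbitrary example rather than a recommendation:

        # ~/.screenrc -- example only
        # first character = command key, second = key that sends a literal copy of it
        escape ^Bb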

  • phpMyAdmin: 403 error - don't have permission to access localhost/phpmyadmin/ on Apache2, Fedora 13

    - by George
    I am running Apache 2.17 on Fedora 13. I installed phpMyAdmin from the repos (via yum); it is installed in /usr/share/phpMyAdmin. I made a symlink from my document root to /usr/share/phpMyAdmin, set 755 permissions on that folder, and set it to be owned by user apache and group apache. And yet, when I try to open http://localhost/phpmyadmin, it gives me the 403 error - you do not have permission! I also tried commenting out some deny lines in the phpMyAdmin.conf file, with no success. Any help would be gladly appreciated.
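
    For reference, the packaged phpMyAdmin is normally exposed through an Alias rather than a symlink, and the stock /etc/httpd/conf.d/phpMyAdmin.conf typically restricts access to localhost. A sketch of what that stanza tends to look like; the Allow line shown is the assumption here:

        # sketch -- Apache 2.2 syntax
        Alias /phpmyadmin /usr/share/phpMyAdmin
        <Directory /usr/share/phpMyAdmin>
            Order Deny,Allow
            Deny from All
            Allow from 127.0.0.1 ::1
        </Directory>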

  • How to recover a BitLocker-encrypted partition that is now 'unallocated'/'free space'?

    - by Atishay Jain
    My hard drive had 5 partitions (including one BitLocker-encrypted one of some 4-5 GB). When I used Disk Management I could see 2 partitions (24.4 GB and 8.94 GB) in green, labeled empty space. I wanted to merge them, so I used MiniTool Partition Wizard for the purpose. I don't know what that software did, but all I was left with was 2 partitions and lots of green free space. I recovered 2 partitions using EaseUS Partition Master, but the BitLocker-encrypted partition cannot be found by it (nor by MiniTool Partition Recovery). Now Disk Management shows 2 free-space areas of 28.36 GB and 8.94 GB respectively. Here is a screenshot: http://s14.postimage.org/4tvij041t/Screen_Shot003.jpg Please tell me a way to recover the BitLocker-encrypted partition that is showing as free space in Disk Management. P.S. - It contains very important data.
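
    For reference, if the partition entry itself can be restored first (e.g. with a partition-recovery tool), recent versions of Windows include a command-line BitLocker Repair Tool, repair-bde, that can sometimes salvage data from a damaged volume. A hedged sketch of its usage, where the drive letters, output target and recovery password are all placeholders:

        rem source volume, destination volume, then the BitLocker recovery password
        repair-bde D: E: -rp 111111-222222-333333-444444-555555-666666-777777-888888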

  • Access internal IP using public IP

    - by willvv
    Hi, I have a DSL modem with a public IP address (201.206.x.x), and I have a web server in my internal network (192.168.0.50). I set up the modem to forward requests to port 80 to my web server, so, if I access 201.206.x.x from outside my network, it shows my web page, the same happens if I access 192.168.0.50 from a computer inside my network. Now, the problem is when I try to access 201.206.x.x from my internal network, the browser tries to connect to the DSL modem configuration, instead of redirecting my request to my Web server. Which settings do I have to change in the modem to set up this redirection? Thanks!
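
    Many DSL modems simply cannot hairpin traffic back into the LAN (the feature is usually called NAT loopback). If the modem has no such option, a common workaround is a hosts-file entry on the internal machines so the site's name resolves to the internal address; a sketch, where the hostname is a placeholder:

        # /etc/hosts on Linux/Mac, or C:\Windows\System32\drivers\etc\hosts on Windows
        192.168.0.50    www.example.com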

  • Disable Windows Media Player "media server" network locations

    - by Moses
    I'm running Windows 8 and in the Computer menu, I see a huge list of "media server" network locations of many of the PCs in my network (most running Windows 7). Is there a way to either locally disable this so I don't see this list every time, or disable this sharing feature on the other computers? I've tried disabling "Media Streaming options" from the Network and Sharing Center (on my PC), but that had no effect. Another thing I tried was enabling Media Streaming, but then selecting all the found clients and clicking Blocked in the list of found clients. That had no effect in removing the list either. I've also attempted disabling the Windows Media Player Network Sharing Service, but alas, the list remains. I'm starting to believe there's a magic registry key to unbury and flip to a "1", but all the searching I've done has come up empty.
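
    On the second half of the question (turning the feature off at the source), the listings are published by the Windows Media Player Network Sharing Service; a sketch of disabling it from an elevated prompt on each Windows 7 machine, though whether this also clears entries already cached on the Windows 8 box is not certain:

        net stop WMPNetworkSvc
        sc config WMPNetworkSvc start= disabled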

  • Ubuntu 12.04: apt-get "failed to fetch"; apt is trying to fetch via old static IP

    - by gabe
    Sample error: W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/precise-security/universe/i18n/Translation-en Unable to connect to 192.168.1.70:8118: Now this was working just fine until I changed the IP this morning. I have the server set to a static IP of 10.0.1.70 and for years it has been 192.168.1.70 - the IP apt-get is trying to use right now. I use Privoxy and Tor, thus the 8118 port. Like I said, it all worked until I changed the static IP from 192.168.1.70 to 10.0.1.70. I was forced to do so because of router issues. (Long and involved story, I didn't really want to change the IP because I knew something like this would happen.) The setup for Tor/Privoxy requires that you point Privoxy at Tor via 127.0.0.1:9050, then point curl etc. to Privoxy via $HOME/.bashrc. Typically you would set the listen address for Privoxy to 127.0.0.1, but if you want it accessible to the rest of the LAN you set it to the server's LAN IP. I did that a long time ago and it was working fine until this morning. I have changed all instances of 192.168.1.70 to 10.0.1.70 in both /etc/privoxy/config and $HOME/.bashrc. What makes this really strange for me is that curl is working fine. I curl icanhazip.com and voila, I get a new IP every 10 minutes or so. I curl CNN.com and I get the short but sweet permanently-moved-to-www.cnn.com message I expect. Firefox works fine. Ping works fine. And I've tested all of this via Remote Desktop over my LAN. So the connection appears to be fine for everything except apt. I've also rebooted, hoping that would clear 192.168.1.70 from apt. So the connection to the internet and DNS aren't an issue for these programs, and they are, as far as I can tell, using Privoxy/Tor just fine. The real irony here is that I've tried to open up Privoxy to go to Ubuntu's servers directly without going through Tor to speed up the downloads from Ubuntu (did this months ago). So somewhere that I have not been able to find, apt has stored the IP 192.168.1.70. And 192.168.1.70 is no longer valid. Thanks for the help.
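
    One place worth checking that is independent of $HOME/.bashrc: apt reads its proxy from its own configuration files, so a stale entry there would survive a reboot. A sketch of how one might hunt for it; the 01proxy file name is only an example:

        # find any leftover reference to the old address
        sudo grep -r "192.168.1.70" /etc/apt/ /etc/environment
        # a typical apt proxy stanza, e.g. in /etc/apt/apt.conf.d/01proxy, looks like:
        # Acquire::http::Proxy "http://10.0.1.70:8118/";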

  • customErrors="RemoteOnly" not working properly in Server 2008

    - by Atomiton
    It would appear that on my brand new Windows Server 2008 with IIS7, customErrors is not working. We have customErrors set to RemoteOnly in the web.config on our ASP.NET sites and applications. However, no matter what we do, our sites act as if it's set to On, and we can't get any detailed error messages showing up in our applications when remoted into our servers. I'm not entirely sure how to trace where this is being overridden, or whether there is something in the way the server is configured that would make the server treat a request from the local machine as remote. How is "local" actually resolved here, anyway? Any help is appreciated... Our network admin has added domains to our hosts file to direct applications to the IP address.
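
    For reference, on IIS7 there are two layers in play: the ASP.NET customErrors element and IIS's own system.webServer/httpErrors section, and the latter can also hide detailed errors even when customErrors is set to RemoteOnly. A sketch of the relevant web.config fragments; the defaultRedirect value is a placeholder:

        <configuration>
          <system.web>
            <customErrors mode="RemoteOnly" defaultRedirect="~/Error.aspx" />
          </system.web>
          <system.webServer>
            <httpErrors errorMode="DetailedLocalOnly" />
          </system.webServer>
        </configuration>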

  • Setting the view of a toolbar to Large icons creates a gap in the Windows 7 taskbar

    - by Boris
    If you add a custom toolbar to the Windows 7 taskbar and set the toolbar view to Large Icons (and the icons in the toolbar are not set to Use small icons), the height of the taskbar unexpectedly increases by around 5 pixels, which makes a rather stupid gap at the bottom of the screen. If the option Use small icons is used for the taskbar appearance, the height of the taskbar is normal. It appears that the programmers at Microsoft were not very meticulous while designing the Windows 7 UI; it is obviously a bug. I was wondering if there is a registry hack to fix this or if anyone knows any solution to the problem, except for the obvious one: "use the small icons"? Thanks.

  • SELinux interfering with vboxwebsrv or phpVirtualBox

    - by Mike W
    I have a brand new installation of Fedora 18, with a brand new installation of Virtualbox 4.2. I have spent a painful few hours trying to get phpVirtualBox working. Apache 2.4 and PHP 5.4 are installed, along with the phpVirtualBox software. Attempting to access phpVirtualBox allowed me to login, but then I'd have a prolonged wait until an 'Error fetching HTTP headers' message appeared. Finally, I set SeLinux to permissive, and Bingo! things start to work. For some reason the SeLinux Troubleshooter isn't flagging any messages from SeLinux, I don't know what to look for now. This is a development box so I could leave SeLinux set to permissive but I will need to make this work in anger on the next project. My question, then, is this: What changes to SeLinux policies do I need to make to allow phpVirtualBox and vboxwebsrv to work together? If there's more information I can post that will assist I'll gladly post it - just let me know what it is.
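
    When the troubleshooter stays silent it is often because the denial is covered by a dontaudit rule. The usual workflow for turning the permissive-mode denials into a local policy looks roughly like the sketch below; the module name is a placeholder, and whether the httpd_can_network_connect boolean alone is enough here is an assumption:

        # surface denials hidden by dontaudit rules, then reproduce the problem
        sudo semodule -DB
        sudo ausearch -m avc -ts recent
        # let Apache/PHP make outbound connections (vboxwebsrv listens on a TCP port)
        sudo setsebool -P httpd_can_network_connect 1
        # or build a tailored module from the logged denials
        sudo grep vbox /var/log/audit/audit.log | audit2allow -M vboxweb_local
        sudo semodule -i vboxweb_local.pp
        # re-enable dontaudit processing afterwards
        sudo semodule -B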

  • Linux Mint: something wrong with my .bashrc

    - by user2309862
    The path of my .bashrc file is /home/vamsi/.bashrc. It is weird that my file has nothing but the path I set. I think I am using a file at the wrong location, or that I have lost my .bashrc file, as none of the environment variables set here seem to work.
        #ANDROID_DEV
        ANDROID_HOME=/opt/android-sdk-linux
        export ANDROID_HOME
        PATH= $PATH:$ANDROID_SDK_HOME/tools
        export PATH
        PATH=$PATH:$ANDROID_HOME/platform-tools
        export PATH
        PATH=$PATH:$ANDROID_HOME/build-tools
        export PATH
        #MAVEN-PATH
        M2_HOME=/opt/apache-maven-3.1.0
        export M2_HOME
        M2=$PATH:$M2_HOME/bin
        export M2
    I was prompted to install maven2 in order to use mvn, but the android command cannot be found. Could you please help me find a solution to this issue. EDIT: Meanwhile, I tried this:
        export PATH=${PATH}:/opt/android-sdk-linux/platform-tools
        export PATH=${PATH}:/opt/android-sdk-linux/tools
    Now, the output of $PATH echoes:
        bash: /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/opt/android-sdk-linux/platform-tools:/opt/android-sdk-linux/build-tools: No such file or directory
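
    As a point of comparison, a cleaned-up version of that fragment is sketched below; note that the original mixes $ANDROID_HOME and $ANDROID_SDK_HOME and has a space after PATH=, either of which would break the tools entry (the paths themselves are taken from the question). Also, typing $PATH on its own makes bash try to execute the expanded value, which is exactly what produces the "No such file or directory" message; echo $PATH prints it instead.

        # sketch of a corrected ~/.bashrc fragment
        export ANDROID_HOME=/opt/android-sdk-linux
        export PATH=$PATH:$ANDROID_HOME/tools:$ANDROID_HOME/platform-tools:$ANDROID_HOME/build-tools
        export M2_HOME=/opt/apache-maven-3.1.0
        export M2=$M2_HOME/bin
        export PATH=$PATH:$M2_HOME/bin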

  • Windows 7 won't load unless other hard drives are "disconnect"ed in the UEFI shell

    - by lmz
    I have three disks, one GPT partitioned containing Windows 7 and Debian, the other MBR partitioned containing CentOS, and the other one MBR partitioned, empty. It used to work (loading Windows boot manager using rEFIt) but now after installing CentOS and OpenIndiana on the second drive, Windows won't boot. The logo is displayed briefly and then a text mode scrollbar "Loading files", then back to the rEFIt menu. The only thing that makes it work is if I drop into the UEFI shell and run disconnect XX where XX is the device handle of the other hard drives (obtained from running devices). This makes me think that the bootloader is getting confused about where the Windows partition is. Is there any information on how the Windows UEFI boot loader finds the Windows partition, or is there any logging I can turn on to help troubleshoot this issue?

  • Forwarding port to a VM - How to?

    - by Peter Gadd
    I use Win 8 Ent x64 on my PC, and I also have a Win 7 VMware virtual machine set up using a bridged network adapter. The IPv4 number for the Win 7 VM is 192.168.1.115. I require access to the VM from the Internet through port 1688. How do I set up port forwarding to achieve this? My router is a Cisco Linksys WAG120N. ========= If you require any further information to help me with this, I will gladly supply it. ========= Thanks in advance.
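
    Once the forward is configured in the WAG120N's web interface (usually somewhere under an Applications & Gaming / port forwarding section - the exact menu name is an assumption), a quick way to sanity-check it is to confirm the VM is listening and then test from outside the LAN; a sketch:

        rem on the Windows 7 VM: is anything listening on 1688?
        netstat -an | findstr :1688
        rem from a machine outside the LAN (replace with your public address):
        telnet your.public.ip 1688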

  • Setting DocumentRoot in Apache

    - by fusion
    I've set the DocumentRoot in httpd.conf as: DocumentRoot "C:\Users\user1\Documents\WebProjects". If the files are located directly in WebProjects, they work; however, if I create a subfolder [project] in WebProjects and access it via the browser, it doesn't load. For example, if I create a folder 'test' in WebProjects with a PHP file called test.php and call localhost/test/test.php, it won't work and gives a "file not found on server" error. But if I put all the files in WebProjects itself, i.e. test.php in WebProjects, it will work [localhost/test.php]. This makes my WebProjects folder look very cluttered, with files from different projects strewn around, and it isn't practical either. I'm new to using Apache and would like to know how to set the document root such that I can access and load all the project folders in WebProjects.
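
    For comparison, subdirectories under the document root are normally reachable as long as the enclosing <Directory> block allows it; a sketch of the httpd.conf fragment this usually involves. The Options/AllowOverride choices are assumptions, and Apache accepts forward slashes in Windows paths:

        DocumentRoot "C:/Users/user1/Documents/WebProjects"
        <Directory "C:/Users/user1/Documents/WebProjects">
            Options Indexes FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>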

  • How do I get Windows 7 to recognize a newly installed RAID 5 volume?

    - by GregH
    I had a previously running Windows 7 (64 bit) system. I added 3 new 1TB Seagate drives that I set up as a RAID 5 volume. I have a Gigabyte GA-P55M-UD2 motherboard. I installed the drives, set up the BIOS and configured the three drives as a RAID volume through the RAID setup utility that was accessed via Ctrl-I while the system was booting. I rebooted the system and could see the drives during the boot sequence. However, when Windows 7 was starting I got an error (quick blue screen) and then Windows tried to repair itself with no success. Do I need to install RAID drivers in Windows? How do I do it if Windows won't boot? Thanks in advance.

  • Ubuntu: How to log in without entering a username and password

    - by torbengb
    I'm a newbie running Ubuntu 9.10. I have two users (wife and me), and each user's screensaver is set to lock so that on wakeup, we get to choose which user's desktop to go to. However, Ubuntu requires a password, so this is pretty tedious. I'd like to switch users without entering any password. I know about this trick that works for the boot login, but it doesn't deal with multiple users. Is it possible to set empty passwords for users in Ubuntu, or skip the password in other ways? (I'm expecting real Linux users to suggest that passwordless users must not get any rights and there be an admin user with a strong password. Yes, you're right. But that's not what this question is about. Thanks.)
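
    For what it's worth, a password can be blanked from a terminal as sketched below; whether the screensaver's unlock dialog then lets the account through without typing anything depends on the PAM configuration (the nullok/nullok_secure options), so treat this as a starting point rather than a complete recipe. The username is a placeholder:

        # remove the password for one account
        sudo passwd -d someuser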

  • Setting up a fileserver, some questions?

    - by Tanax
    Recently I've become very interested in setting up a fileserver, mostly for home usage, but also because I live in 2 places and need to be able to access my files from both homes. I have already done some research into this but I am unclear about some things. My requirements are the following:
    - Needs to work on both Mac and PC (only using Windows on the PC at the moment, but it would be good if it supports more OSes to make it futureproof in case I need Linux or something else)
    - Need to be able to set up a folder/drive/network space to act as a link to a certain folder on the fileserver
    - All files should only be stored on the fileserver, i.e. no "shared" folders like in Dropbox where files are stored on the client computer
    - Would prefer it if folders are password protected, or that I can somehow specify which users can access the fileserver's shares
    - The fileserver's OS will most likely have to be Windows due to other factors beyond being just a fileserver
    I've already figured out that I will need to set up a VPN so that I can access my fileserver from outside the local network; I'm probably going to use OpenVPN. Question 1: How would I go about setting up a VPN server so that I can connect to my local network at the fileserver's location? I know that since I'm on a dynamic IP I will have to get some sort of dynamic DNS service - I've already checked into this and I'm fairly sure I know how to fix that. I also know that I will have to forward the port OpenVPN uses in my router. Question 2: How would I actually share the folders on the fileserver so that I can access them on my other computers? I've researched Samba but I'm uncertain whether it needs to run on a Linux OS. I know that the clients connecting to it can be Windows, for example, but can the Samba "server" be run on Windows? Also, it appears that Samba shares a folder, meaning it works like Dropbox - I don't want that. So how would I share a folder in that case to make it work like I want it to? Sorry for the incredibly long question, I tried to structure it the best I could for an easier read. Thanks in advance!
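
    On question 1, a bare-bones OpenVPN server configuration tends to look like the sketch below; every value here is a placeholder, certificates still have to be generated separately, and UDP 1194 is simply the default port one would forward on the router. On question 2, note that Windows can publish folders natively as SMB shares with per-user permissions (via the folder's Sharing tab or the net share command), so Samba is only needed if the server runs Linux.

        # server.conf -- minimal OpenVPN sketch, all values are placeholders
        port 1194
        proto udp
        dev tun
        server 10.8.0.0 255.255.255.0
        ca ca.crt
        cert server.crt
        key server.key
        dh dh2048.pem
        keepalive 10 120
        persist-key
        persist-tun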

  • Restore content database in SharePoint Server 2007

    - by Boris
    I have a site collection set up at a web app running on port 80. I made a backup of the site collection content db using the stsadm.exe tool. Now I want to restore that backup as a new content db for a different site collection - the one set up at a web app running on port 500. I have done the following:
    - Created a backup
    - Created a new web app at port 500 (I did not create a site collection for this web app)
    - Removed the content db of that new web app using Central Administration
    - Run stsadm.exe -o addcontentdb -url webapp-at-port-500 -databasename
    The command completes successfully, however when I check the Content Database page for that web app, it says that the Number of Sites is 0! Also, when I try to open http://webapp-at-port-500, I get an error saying that the webpage cannot be found. Could anyone please help me, it's driving me crazy. Thanks.
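
    For reference, if the original backup was taken with stsadm -o backup (a site-collection backup rather than a SQL-level content-database backup), the matching restore operation is stsadm -o restore rather than addcontentdb. A hedged sketch of both steps, with placeholder paths and URLs:

        stsadm -o backup -url http://server:80 -filename C:\backups\sitecoll.bak
        stsadm -o restore -url http://server:500 -filename C:\backups\sitecoll.bak -overwrite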

  • Using WSUS to update machines not on the domain

    - by Arcath
    I have a WSUS server providing updates for the computers on my domain. We also bring a lot of machines back to our office and run Windows Update on them as we build images, which means we end up downloading the same updates over and over again. Is there any way to get such a machine to download its updates from our WSUS server? I found that there's something running on port 8530, but it's just an empty document; in fact every folder listed in the IIS config returns a blank document. Does anyone know if this is possible, and how I would do it?
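
    Pointing non-domain machines at a WSUS server is normally done through the same local-policy registry values a domain GPO would set; a sketch, where the server name is a placeholder and 8530 is assumed to be the WSUS port (which matches what was seen in IIS):

        reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v WUServer /t REG_SZ /d "http://wsus-server:8530" /f
        reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" /v WUStatusServer /t REG_SZ /d "http://wsus-server:8530" /f
        reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v UseWUServer /t REG_DWORD /d 1 /f
        wuauclt /detectnow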

  • TIME_WAIT connections not being cleaned up after timeout period expires

    - by Mark Dawson
    I am stress testing one of my servers by hitting it with a constant stream of new network connections. The tcp_fin_timeout is set to 60, so if I send a constant stream of something like 100 requests per second, I would expect to see a rolling average of 6000 (60 * 100) connections in a TIME_WAIT state. This is happening, but looking in netstat (using -o) to see the timers, I see connections like: TIME_WAIT timewait (0.00/0/0) where their timeout has expired but the connection is still hanging around, and I then eventually run out of connections. Anyone know why these connections don't get cleaned up? If I stop creating new connections they do eventually disappear, but while I am constantly creating new connections they don't; it seems like the kernel isn't getting a chance to clean them up? Are there some other config options I need to set to remove the connections as soon as they have expired? The server is running Ubuntu and my web server is nginx. Also it has iptables with connection tracking, not sure if that would cause these TIME_WAIT connections to live on. Thanks, Mark.
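
    For orientation, tcp_fin_timeout governs FIN-WAIT-2 rather than TIME_WAIT (whose length is a compiled-in constant of 60 s on Linux), so the knobs people usually inspect in this situation are the ones below; whether any of them is appropriate for this particular load test is left open:

        # current values
        sysctl net.ipv4.tcp_fin_timeout net.ipv4.tcp_max_tw_buckets
        sysctl net.netfilter.nf_conntrack_tcp_timeout_time_wait
        # allow new outbound connections to reuse sockets still in TIME_WAIT
        sudo sysctl -w net.ipv4.tcp_tw_reuse=1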

  • Simple Workstation Imaging Solution?

    - by user23087
    I need a fairly cheap imaging solution for Windows XP corporate desktops. Ideally, I'd be able to set up a desktop exactly as we want it, create an image, deploy this image to a server, then boot a new desktop to a CD/USB Drive/Network and quickly set up the workstation. Ideally, each computer would also have a unique workstation name. Any ideas? Right now I'm using a custom built Linux DD solution, but it's slow, not network-based, can't image multiple computers at the same time as there's only one copy on a USB drive, and can't uniquely name the computers. Thanks, Will

  • Are there any 5.1 surround audio switches on the market?

    - by thepurplepixel
    (Somewhat related to this question) I have a set of Logitech 5.1 surround speakers, which use 3 stereo 3.5mm TRS connectors (minijacks) to transfer the audio (the typical green/black/orange audio outputs). I have a Griffin Firewave hooked up to my MacBook Pro, and my desktop has a Realtek ALC889 audio chipset. I have looked for a way to, essentially, switch the speaker inputs between my Firewave and my desktop without having to disconnect the cables from one, route them around my desk, and plug them into the other. I'd love to have something like an old Belkin DB-25/LPT switch, but for these audio cables. Of course, purchasing one and soldering my own cables on the connection terminals is always an option, but, is there a reasonably priced 5.1 audio switch (or 3x stereo) on the market that will accomplish the simple task of switching audio outputs between two computers into a set of 5.1 speakers? Thanks in advance!

  • Nginx + PHP-FPM executes script, but returns 404

    - by MorfiusX
    I am using Nginx + PHP-FPM to run a Wordpress based site. I have a URL that should return dynamically generated JSON data for use with the DataTables jQuery plugin. The data is returned properly, but with a return code of 404. I think this is a Nginx config issue, but I haven't been able to figure out why. The script 'getTable.php' works properly on the production version of the site which is currently using Apache. Anyone know how I can get this to work on Nginx?
    URL: http://dev.iloveskydiving.org/wp-content/plugins/ils-workflow/lib/getTable.php
    SERVER: CentOS 6 + Varnish (caching disabled for development) + Nginx + PHP-FPM + Wordpress + W3 Total Cache
    Nginx Config:
        server {
            # Server Parameters
            listen 127.0.0.1:8082;
            server_name dev.iloveskydiving.org;
            root /var/www/dev.iloveskydiving.org/html;
            access_log /var/www/dev.iloveskydiving.org/logs/access.log main;
            error_log /var/www/dev.iloveskydiving.org/logs/error.log error;
            index index.php;
            # Rewrite minified CSS and JS files
            location ~* \.(css|js) {
                if (!-f $request_filename) {
                    rewrite ^/wp-content/w3tc/min/(.+\.(css|js))$ /wp-content/w3tc/min/index.php?file=$1 last;
                    expires max;
                }
            }
            # Set a variable to work around the lack of nested conditionals
            set $cache_uri $request_uri;
            # Don't cache uris containing the following segments
            if ($request_uri ~* "(\/wp-admin\/|\/xmlrpc.php|\/wp-(app|cron|login|register|mail)\.php|wp-.*\.php|index\.php|wp\-comments\-popup\.php|wp\-links\-opml\.php|wp\-locations\.php)") {
                set $cache_uri "no cache";
            }
            # Don't use the cache for logged in users or recent commenters
            if ($http_cookie ~* "comment_author|wordpress_[a-f0-9]+|wp\-postpass|wordpress_logged_in") {
                set $cache_uri 'no cache';
            }
            # Use cached or actual file if they exists, otherwise pass request to WordPress
            location / {
                try_files /wp-content/w3tc/pgcache/$cache_uri/_index.html $uri $uri/ /index.php?q=$uri&$args;
            }
            # Cache static files for as long as possible
            location ~* \.(xml|ogg|ogv|svg|svgz|eot|otf|woff|mp4|ttf|css|rss|atom|js|jpg|jpeg|gif|png|ico|zip|tgz|gz|rar|bz2|doc|xls|exe|ppt|tar|mid|midi|wav|bmp|rtf)$ {
                try_files $uri =404;
                expires max;
                access_log off;
            }
            # Deny access to hidden files
            location ~* /\.ht {
                deny all;
                access_log off;
                log_not_found off;
            }
            location ~ \.php$ {
                try_files $uri =404;
                fastcgi_split_path_info ^(.+\.php)(/.+)$;
                include /etc/nginx/fastcgi_params;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_param PATH_INFO $fastcgi_script_name;
                fastcgi_intercept_errors on;
                fastcgi_pass unix:/var/lib/php-fpm/php-fpm.sock; # port where FastCGI processes were spawned
            }
        }
    Fast CGI Params:
        fastcgi_param QUERY_STRING $query_string;
        fastcgi_param REQUEST_METHOD $request_method;
        fastcgi_param CONTENT_TYPE $content_type;
        fastcgi_param CONTENT_LENGTH $content_length;
        fastcgi_param SCRIPT_NAME $fastcgi_script_name;
        fastcgi_param REQUEST_URI $request_uri;
        fastcgi_param DOCUMENT_URI $document_uri;
        fastcgi_param DOCUMENT_ROOT $document_root;
        fastcgi_param SERVER_PROTOCOL $server_protocol;
        fastcgi_param HTTPS $https if_not_empty;
        fastcgi_param GATEWAY_INTERFACE CGI/1.1;
        fastcgi_param SERVER_SOFTWARE nginx/$nginx_version;
        fastcgi_param REMOTE_ADDR $remote_addr;
        fastcgi_param REMOTE_PORT $remote_port;
        fastcgi_param SERVER_ADDR $server_addr;
        fastcgi_param SERVER_PORT $server_port;
        fastcgi_param SERVER_NAME $server_name;
        # PHP only, required if PHP was built with --enable-force-cgi-redirect
        fastcgi_param REDIRECT_STATUS 200;
    UPDATE: Upon further digging, it looks like Nginx is generating the 404 and PHP-FPM is executing the script properly and returning a 200.
    UPDATE: Here are the contents of the script:
        <?php
        /**
         * Connect to Wordpres
         */
        require(dirname(__FILE__) . '/../../../../wp-blog-header.php');
        /**
         * Define temporary array
         */
        $aaData = array();
        $aaData['aaData'] = array();
        /**
         * Execute Query
         */
        $query = new WP_Query( array( 'post_type' => 'post', 'posts_per_page' => '-1' ) );
        foreach ($query->posts as $post) {
            array_push( $aaData['aaData'], array( $post->post_title ) );
        }
        /**
         * Echo JSON encoded array
         */
        echo json_encode($aaData);
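
    One low-effort way to narrow down which layer injects the 404 is to compare the status code seen through the full Varnish/nginx chain with the one returned by nginx directly on the listen address from the config above; a sketch:

        # through the normal front end
        curl -s -o /dev/null -w "%{http_code}\n" http://dev.iloveskydiving.org/wp-content/plugins/ils-workflow/lib/getTable.php
        # straight at nginx on the address it listens on (run on the server itself)
        curl -s -o /dev/null -w "%{http_code}\n" -H "Host: dev.iloveskydiving.org" http://127.0.0.1:8082/wp-content/plugins/ils-workflow/lib/getTable.php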
