Search Results

Search found 57458 results on 2299 pages for 'http response codes'.


  • Visual Studio 2013 Preview now available as free download

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/06/27/visual-studio-2013-preview-now-available-as-free-download.aspx
    At http://www.microsoft.com/visualstudio/eng/2013-downloads, Microsoft has made Visual Studio 2013 Preview available as a free download. Four versions plus the TFS server are available: Ultimate, Premium, Professional and Test Professional. Installing them will also install the .NET Framework 4.5.1. Somasegar blogged about this at http://blogs.msdn.com/b/somasegar/archive/2013/06/26/visual-studio-2013-preview.aspx. The new features that VS2013 brings include: round-tripping projects with VS2012 (requires VS2012 Update 3), Git support, support for Windows 8.1, improved asynchronous support, improved debugging, and 64-bit edit and continue.

    Read the article

  • how to remove update repo that always fails

    - by David M. Karr
    A week or so ago I tried to add a new package repo that supposedly had a package I wanted. Unfortunately, the information about it was out of date, and I found that it fails to connect to it each time. I'd like to just remove the new repo, but I'm not sure how to do that. For context, when I update, I get this: W:Failed to fetch http://ppa.launchpad.net/geod/ppa-geod/ubuntu/dists/precise/main/source/Sources 404 Not Found [IP: 135.214.42.30 8080] , W:Failed to fetch http://ppa.launchpad.net/geod/ppa-geod/ubuntu/dists/precise/main/binary-amd64/Packages 404 Not Found [IP: 135.214.42.30 8080] , W:Failed to fetch http://ppa.launchpad.net/geod/ppa-geod/ubuntu/dists/precise/main/binary-i386/Packages 404 Not Found [IP: 135.214.42.30 8080] , E:Some index files failed to download. They have been ignored, or old ones used instead.
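    One way to drop it, sketched here rather than taken from an accepted answer: the failing source is the geod PPA visible in the 404 messages (ppa:geod/ppa-geod), and on Ubuntu 12.04 it can be removed either with add-apt-repository or by deleting the list file it created (the exact filename under /etc/apt/sources.list.d/ is a guess based on the usual naming scheme):
        # Unregister the PPA (reverses the original add-apt-repository call)
        sudo add-apt-repository --remove ppa:geod/ppa-geod
        # Or delete its source list directly, then refresh the package index
        sudo rm /etc/apt/sources.list.d/geod-ppa-geod-precise.list
        sudo apt-get update
    After that, apt-get update should no longer try to fetch the dead PPA.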

    Read the article

  • Why does Google report a soft 404 when I redirect to the signup page?

    - by Hettomei
    In the last month, I've seen an increased number of "soft 404" errors reported by Google Webmaster Tools for pages that actually work well for users. Configuration (maybe useless): the website is built with Rails 3.1 and authentication is handled by the gem Devise. Problem: on this page http://en.bemyboat.com/yacht-charter/9965-sailboat-beneteau-oceanis-43, click on "Ask a Boat request" (a simple form, submitted via GET to http://en.bemyboat.com/boat_requests/new/9965). You are redirected with HTTP status 302 to sign in, and then sent back to the "new" page if you sign in successfully. Google tells me that the "Ask a boat request" link returns a soft 404. I can't make this form a POST (which would solve the problem) because we need to automatically redirect users back to the page after sign-in (the gem Devise memorizes the GET link). To simplify, the question is: how do I protect a private page behind authentication, reached via a simple GET, without being penalized by Google with a "soft 404"?

    Read the article

  • I can see traffic coming from google ads, even though I don't have any google ads running, how is that possible?

    - by freethinker
    I can see "googleads.g.doubleclick.net/pagead/ads...." as a referrer on my website. However I am not running any google ads. So how am I getting the traffic? Here's the complete URL http://googleads.g.doubleclick.net/pagead/ads?output=html&h=250&slotname=5275434421&w=300&lmt=1311690266&flash=10.1.102&url=http%3A%2F%2Fwww.humbug.in%2Fsuperuser%2Fes%2Fpromiscuous-modo-con-intel-centrino-adelantado-n-6200-agn-tarjeta-inalambrica--214946.html&dt=1311690268347&bpp=3&shv=r20110713&jsv=r20110719&prev_slotnames=7553874535%2C5896307875%2C8590890981&correlator=1311690268052&frm=4&adk=3153418812&ga_vid=1396001617.1311690268&ga_sid=1311690268&ga_hid=558368546&ga_fc=1&u_tz=-180&u_his=1&u_java=0&u_h=1024&u_w=1280&u_ah=960&u_aw=1280&u_cd=24&u_nplug=6&u_nmime=22&biw=1263&bih=851&fu=0&ifi=4&dtd=441&xpc=T5MgG9EVV9&p=http%3A%2F%2Fwww.humbug.in

    Read the article

  • Nearest color algorithm using Hex Triplet

    - by Lijo
    The following page lists colors with names: http://en.wikipedia.org/wiki/List_of_colors. For example, the hex triplet #5D8AA8 is "Air Force Blue". This information will be stored in a database table (tbl_Color (HexTriplet, ColorName)) in my system. Suppose I created a color with the hex triplet #5D8AA7. I need to get the nearest color available in the tbl_Color table. The expected answer is "#5D8AA8 - Air Force Blue", because #5D8AA8 is the nearest color to #5D8AA7. Is there an algorithm for finding the nearest color? How would I write it in C# / Java? REFERENCE: http://stackoverflow.com/questions/5440051/algorithm-for-parsing-hex-into-color-family http://stackoverflow.com/questions/6130621/algorithm-for-finding-the-color-between-two-others-in-the-colorspace-of-painte Suggested formula (suggested by @user281377): choose the color where the sum of the squared differences is minimal: Square(Red(source)-Red(target)) + Square(Green(source)-Green(target)) + Square(Blue(source)-Blue(target))
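    A minimal C# sketch of the suggested formula (not from the original post; the in-memory palette below is a hypothetical stand-in for rows loaded from tbl_Color): parse each hex triplet into its red, green and blue components and keep the entry with the smallest sum of squared channel differences.

        using System;
        using System.Collections.Generic;
        using System.Globalization;

        class NearestColorFinder
        {
            // Parse "#RRGGBB" into its three channel values.
            static int[] ParseHex(string hex)
            {
                hex = hex.TrimStart('#');
                return new int[] {
                    int.Parse(hex.Substring(0, 2), NumberStyles.HexNumber),
                    int.Parse(hex.Substring(2, 2), NumberStyles.HexNumber),
                    int.Parse(hex.Substring(4, 2), NumberStyles.HexNumber)
                };
            }

            // Return "hex - name" for the palette entry with the smallest squared RGB distance.
            static string FindNearest(string source, IDictionary<string, string> palette)
            {
                int[] s = ParseHex(source);
                string best = null;
                int bestDist = int.MaxValue;
                foreach (KeyValuePair<string, string> entry in palette)
                {
                    int[] c = ParseHex(entry.Key);
                    int dist = (s[0] - c[0]) * (s[0] - c[0])
                             + (s[1] - c[1]) * (s[1] - c[1])
                             + (s[2] - c[2]) * (s[2] - c[2]);
                    if (dist < bestDist) { bestDist = dist; best = entry.Key + " - " + entry.Value; }
                }
                return best;
            }

            static void Main()
            {
                var palette = new Dictionary<string, string>
                {
                    { "#5D8AA8", "Air Force Blue" },
                    { "#E32636", "Alizarin Crimson" }
                };
                Console.WriteLine(FindNearest("#5D8AA7", palette)); // prints "#5D8AA8 - Air Force Blue"
            }
        }

    Squared RGB distance ignores perceptual weighting; converting both colors to a space such as CIELAB before measuring distance gives matches closer to what the eye considers "nearest", at the cost of a more involved conversion.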

    Read the article

  • Hyper-V for Developers - presentation from NxtGenUG Oxford (including link to more info on Dynamic Memory)

    - by Liam Westley
    Many thanks to Richard Hopton and the NxtGenUG guys in Oxford for inviting me to talk on Hyper-V for Developers last night, and to Research Machines for providing the venue. It was great to have developers not yet using Hyper-V who were really interested in some of the finer points to help them with specific requirements. For those wanting to follow up on the topics I covered, you can download the presentation deck as either a PDF (with speaker notes included) or the original PowerPoint slide deck: http://www.tigernews.co.uk/blog-twickers/nxtgenugoxford/HyperV4Devs.pdf http://www.tigernews.co.uk/blog-twickers/nxtgenugoxford/HyperV4Devs.zip I also mentioned that the new feature, Dynamic Memory, coming in Service Pack 1, had been presented in a session at TechEd 2010 by Ben Armstrong; you can download his presentation from here: http://blogs.msdn.com/b/virtual_pc_guy/archive/2010/06/08/talking-about-dynamic-memory.aspx

    Read the article

  • Call Web Service via BizTalk Orchestration via received file

    This example shows how some business logic can be implemented by receiving a file into a BizTalk Orchestration and calling a Web Service. The results of the Web Service call are decided from the contents of the incoming file, and the response message is constructed accordingly. The response message is also saved down to the local file system. (Read more by BiZTech Know.)

    Read the article

  • Dealing with blackhat SEO companies and low quality link building competitors [closed]

    - by Mikko Ohtamaa
    I have often faced cases where my client's competitors use blackhat SEO tactics: they contract an SEO company to do link building for their websites and products. Here is a typical example of a fake blog created only for link-building purposes. A very low-content article, http://marshallfab.com/fundus-camera-explained.html, in an obvious fake blog: no author information, partially machine-generated text, and all blog posts are solely about link building. Following the link you get to the promoted company page, http://www.patternless.com/, which, unsurprisingly, links to the SEO company homepage in its footer text, http://www.affordableseofl.com/, who are not shy about advertising their "extremely aggressive SEO plan". Does Google have any feedback channel where one could submit cases like this, so that Google would punish the link builders? Are there any means of publicly shaming these blackhat companies to damage their reputation?

    Read the article

  • What has Ubuntu contributed to the Linux Kernel?

    - by Luis Alvarado
    This question is similar to: What unique enhancements and features has Ubuntu brought to the Linux community? In this case, though, it is directed at what Ubuntu has contributed to the official Linux kernel. For example, I often hear about Intel contributing patches to the Linux kernel, like the recent RC6 patches and others related to the recent Sandy Bridge/Ivy Bridge hardware. In another group, Android upstreamed patches, and a lot of ARM patches have also come into the Linux kernel. I have seen only a small percentage of the companies and groups that have contributed to the Linux kernel (http://kernel.org), but I want to know: since the beginning of Ubuntu until now, what has Ubuntu contributed to the Linux kernel, in regard to any aspect of the kernel? For kernel information I typically go to http://kernelnewbies.org and http://kernel.org

    Read the article

  • Google Webmasters tools crawl error caused by URL split into two lines

    - by Shiro
    I am looking into the Google Webmaster Tools Crawl Errors section. How should I handle URLs that are invalid because of someone else's system or application? E.g. the correct URL is http://www.example/images/products/s_=enlarge_16gb.jpg but, for some reason (I don't know what happened with Yahoo Groups), it breaks the link into http://www.example/images/products/s_= enlarge_16gb.jpg and only the top part becomes the hyperlink, i.e. http://www.example/images/products/s_= Because of this URL, Google shows a crawl error; I get a few errors from this kind of result, or from other people's typos. How do I prevent this? I am sure I don't have the right to go and change other people's posts. What is the solution for this? Thanks!

    Read the article

  • Using Substring() in XML FLWOR Queries

    - by Jonathan Kehayias
    Tonight I was monitoring the #sqlhelp hashtag on Twitter for a response to a question I had asked, when Randy Knight ( Twitter ) asked a question about using SUBSTRING in FLWOR statements with XML: "#sqlhelp Is there a way to do a SQL Type "LIKE" or "SUBSTRING" in the where clause of FLWOR statement? Need to evaluate just first n chars." By the time I posted a response, Randy had figured out how to use the contains() function to solve his problem, but I am going to blog this because...(read more)

    Read the article

  • How exactly is Google Webmaster Tools measuring "Site Performance"?

    - by Rémi
    I've been working for two months now on improving our response time (mainly server side) on a new forum (a brand new product from a technical point of view) we launched in Germany a few months ago, and I'm quite surprised by the results I get. I monitor our response time using Apache logs and our own implementation of a Boomerang beacon. Using my stats, I can see that our new product responds in about 680 ms where our old product was responding in about 1050 ms. On the other side, Google Webmaster Tools tells us that our pages have an average response time of about 1500 ms today, where it was 700 ms three months ago with our old product. I figured that GWT was taking client-side metrics into account, so I added some measures to our Boomerang beacon and everything looks just fine. I've also run some random pages through YSlow and Google's Page Speed and everything looks better than it was before. We even have an 82% score on Google's Page Speed tool, which is quite cool for a site with some ads in it :) Lately, we have signed a deal with Akamai to use two of their products: CDN for our static files (we were using another CDN before but it wasn't very effective) and RMA to improve network routes. We have also introduced a new aggressive cache mechanism to ensure that most of the pages served to crawlers are cached by our memcache grid. After checking my metrics, it seems that these changes have improved response time from 650 ms to about 500 ms, which is good (still not great, but it is definitely an improvement). But Webmaster Tools continues to report an increasing average response time where we see it decreasing over the same period. Have you ever seen this kind of weird behavior on your sites while doing performance improvements? Do you have any idea how to monitor the same thing Google does with Site Performance in Google Webmaster Tools, so that we could improve our site and constantly check whether it is what Google wants?
    Edit 2011/07/26: Thanks for your answers, guys! Nevertheless, I was not precise enough. The main issue we have is not with the Site Performance page but with the Crawl Stats one for now. We probably found an issue on our side with some very slow pages (around 3000 ms!!) and we are trying to fix them. I'll keep you posted as soon as I have some info. Thanks again!

    Read the article

  • [Dear Recruiter] Do you have any disabilities?

    - by refuctored
    Recruiter letter for a technical position: ... Do you have any disabilities that prevent you from successfully performing the essential functions of this job, with or without accommodations? ... My response: Robin -- The only qualification I can see as a hindrance to my ability to perform is my lack of fingers. I find that if I mash the keyboard enough with my stubs, I can eventually get the code to compile correctly. Will this be a problem? Thank you, George. Her response: [None] So much for being an equal opportunity employer, eh?

    Read the article

  • Cannot install nautilus elementary.

    - by coklatua
    When I run apt-cache policy nautilus it shows this:
      Installed: 1:2.32.0-0ubuntu1-ppa1
      Candidate: 1:2.32.0-0ubuntu1-ppa1
      Version table:
     *** 1:2.32.0-0ubuntu1-ppa1 0
            100 /var/lib/dpkg/status
         1:2.32.0-0ubuntu6~ppa160 0
            500 http://ppa.launchpad.net/am-monkeyd/nautilus-elementary-ppa/ubuntu/ maverick/main amd64 Packages
         1:2.32.0-0ubuntu1.1 0
            500 http://archive.ubuntu.com/ubuntu/ maverick-updates/main amd64 Packages
         1:2.32.0-0ubuntu1 0
            500 http://archive.ubuntu.com/ubuntu/ maverick/main amd64 Packages
    As you can see, I already added the am-monkeyd PPA, but when I update & upgrade nothing changes.
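    One thing that may be worth trying, offered as a guess rather than a confirmed fix: when apt will not move to the PPA build on its own, the version shown in the policy output can be requested explicitly (related packages such as nautilus-data may need the same treatment if apt complains about dependencies):
        sudo apt-get update
        # Ask for the am-monkeyd PPA build by its exact version string
        sudo apt-get install nautilus=1:2.32.0-0ubuntu6~ppa160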

    Read the article

  • Finally found a replacement for my.live.com…

    - by eddraper
    As I had alluded to before, the transition of http://my.live.com/ to http://my.msn.com/ caused me serious grief. I've been an RSS addict for many, many years and I found the my.live.com UI to be the ultimate RSS reader and gateway to the web. It had been my home page for a long time. My.msn has a LONG way to go before it matches the elegance and performance of my.live. The site I ended up going with is http://www.netvibes.com/. It's the closest thing I could find that could do four-column tiles with a reasonable amount of information density. I'd still prefer a lot less "chrome" and better use of space, but it's as close as I'm going to get… One feature I really do like about netvibes is the pagination feature. The built-in feed reader is also quite nice… All in all, I'd recommend netvibes…

    Read the article

  • FREE Technical Training on Windows Server 2012 Virtualization / Hyper-V / Private Cloud

    - by KeithMayer
    Microsoft Learning partnered with the Microsoft Server and Tools team and Developer and Platform Evangelism (DPE) to deliver the "Windows Server 2012 Jump Start: Preparing for the Datacenter Evolution" on June 20-21, 2012. Thanks to an amazing product and a phenomenal team effort, this event shattered two Jump Start records with 2,064 attendees from 103 different countries and extremely positive event feedback! We are excited to announce the release of the HD-quality video recordings, available on TechNet Videos now! For complete details: http://aka.ms/TrainWS12JS If I can help with any other learning topics, please feel free to connect with me and let me know! HTH, Keith http://keithmayer.com | Twitter: @KeithMayer | LinkedIn: http://linkedin.com/in/KeithM

    Read the article

  • Sharing unique links on social media vs SEO

    - by MJWadmin
    We're currently implementing a voucher system on our site which will allow our users to obtain a 25+% discount on certain products, provided they donate 10% of the purchase price to charity. We will offer the ability to share the discounts via social media in return for larger discounts to the sharer for each person who clicks through the link and buys an item. I understand that social links have SEO benefits, but this appears to be based on lots of people sharing the same link. If our voucher users share a unique link, i.e. http://ourdomain.com/sipsfesdf, rather than a fixed link, http://ourdomain.com/product-name, will we still receive the same benefits? Should we instead share something like http://ourdomain.com/product-name/sipsfesdf Thanks in advance.

    Read the article

  • Remove Kernel 3.1

    - by chazdg
    Is there a way to remove kernel 3.1 from Oneiric? I downloaded and upgraded to 3.1 with these instructions: Open the terminal and run these two commands for both 32-bit and 64-bit versions of Ubuntu 11.10/11.04: wget http://kernel.ubuntu.com/~kernel-ppa...241006_all.deb sudo dpkg -i linux-headers-3.1.0-030100_3.1.0-030100.201110241006_all.deb Ubuntu (64-bit) For Ubuntu 11.10/11.04 (64-bit), issue these commands: wget http://kernel.ubuntu.com/~kernel-ppa...1006_amd64.deb sudo dpkg -i linux-headers-3.1.0-030100-generic_3.1.0-030100.201110241006_amd64.deb wget http://kernel.ubuntu.com/~kernel-ppa...1006_amd64.deb sudo dpkg -i linux-image-3.1.0-030100-generic_3.1.0-030100.201110241006_amd64.deb Everything went well. I was able to reboot quickly, but Firefox and Chrome constantly crash with Kernel 3.1. I am using Gnome 3.2 and saw improvement with 3.0.0.13 provided by ppa. Any help with 3.1 or just removing it would be helpful. Thanks to all that reply.
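    Since the packages were installed directly with dpkg, removing them the same way should work; a sketch, with the package names taken from the dpkg -i commands above (boot into the stock kernel first, and check the dpkg -l output before purging anything):
        # See which 3.1 packages are actually installed
        dpkg -l 'linux-*3.1.0*'
        # Purge the 3.1 image and headers, then rebuild the GRUB menu
        sudo apt-get purge linux-image-3.1.0-030100-generic linux-headers-3.1.0-030100-generic linux-headers-3.1.0-030100
        sudo update-grub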

    Read the article

  • How to Implement Single Sign-On between Websites

    - by hmloo
    Introduction
    Single sign-on (SSO) is a way to control access to multiple related but independent systems: a user only needs to log in once and gains access to all the other systems. A lot of commercial systems provide single sign-on solutions, and you can also choose an open source solution such as OpenSSO or CAS. Both of these use centralized authentication and provide a more robust authentication mechanism, but if each system has its own authentication mechanism, how do we provide a seamless transition between them? Here I will show you one way.
    How it Works
    The method we'll use is based on a secret key shared between the sites. The origin site builds a hashed authentication token from some other parameters and redirects the user to the target site.
      Variable     | Status   | Description
      ssoEncode    | required | hash(ssoSharedSecret + "," + ssoTime + "," + ssoUserName)
      ssoTime      | required | timestamp in the format YYYYMMDDHHMMSS, used to prevent playback attacks
      ssoUserName  | required | unique username; required when a user is logged in
    Note: the variables are sent via POST for security reasons.
    Building a Single Sign-On Solution
    The origin site has to: 1. create the URL for the request; 2. generate the required authentication parameters; 3. redirect to the target site.

    using System;
    using System.Web.Security;
    using System.Text;

    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            string postbackUrl = "http://www.targetsite.com/sso.aspx";
            string ssoTime = DateTime.Now.ToString("yyyyMMddHHmmss");
            string ssoUserName = User.Identity.Name;
            string ssoSharedSecret = "58ag;ai76"; // get this from config or similar
            string ssoHash = FormsAuthentication.HashPasswordForStoringInConfigFile(
                string.Format("{0},{1},{2}", ssoSharedSecret, ssoTime, ssoUserName), "md5");
            string value = string.Format("{0}:{1},{2}", ssoHash, ssoTime, ssoUserName);

            // Auto-submitting form posts the token to the target site
            Response.Clear();
            StringBuilder sb = new StringBuilder();
            sb.Append("<html>");
            sb.AppendFormat(@"<body onload='document.forms[""form""].submit()'>");
            sb.AppendFormat("<form name='form' action='{0}' method='post'>", postbackUrl);
            sb.AppendFormat("<input type='hidden' name='t' value='{0}'>", value);
            sb.Append("</form>");
            sb.Append("</body>");
            sb.Append("</html>");
            Response.Write(sb.ToString());
            Response.End();
        }
    }

    The target site has to: 1. get the authentication parameters; 2. validate the parameters against the shared secret; 3. if the user is valid, authenticate and redirect to the target page; 4. if the user is invalid, show errors and return.

    using System;
    using System.Web.Security;
    using System.Text;

    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                if (User.Identity.IsAuthenticated)
                {
                    Response.Redirect("~/Default.aspx");
                }
            }

            if (Request.Params.Get("t") != null)
            {
                // Token format is "hash:time,username"
                string ticket = Request.Params.Get("t");
                char[] delimiters = new char[] { ':', ',' };
                string[] ssoVariable = ticket.Split(delimiters, StringSplitOptions.None);
                string ssoHash = ssoVariable[0];
                string ssoTime = ssoVariable[1];
                string ssoUserName = ssoVariable[2];
                DateTime appTime = DateTime.MinValue;
                int offsetTime = 60; // get this from config or similar

                try
                {
                    appTime = DateTime.ParseExact(ssoTime, "yyyyMMddHHmmss", null);
                }
                catch
                {
                    // show error
                    return;
                }

                // Reject tokens outside the allowed time window (playback protection)
                if (Math.Abs(appTime.Subtract(DateTime.Now).TotalSeconds) > offsetTime)
                {
                    // show error
                    return;
                }

                bool isValid = false;
                string ssoSharedSecret = "58ag;ai76"; // get this from config or similar
                string hash = FormsAuthentication.HashPasswordForStoringInConfigFile(
                    string.Format("{0},{1},{2}", ssoSharedSecret, ssoTime, ssoUserName), "md5");

                // The token is valid only if the recomputed hash matches (time window already checked above)
                if (string.Compare(ssoHash, hash, true) == 0)
                {
                    isValid = true;
                }

                if (isValid)
                {
                    // Do authenticate;
                }
                else
                {
                    // show error
                    return;
                }
            }
            else
            {
                // show error
            }
        }
    }

    Summary
    This is a very simple and basic SSO solution. Its main advantage is its simplicity: it only needs a single extra page to do the SSO authentication and does not require modifying the existing system infrastructure.

    Read the article

  • cowbuilder --create --distribution lucid fails

    - by Daenyth
    I'm trying to create a build environment for Lucid, and calling cowbuilder --create --distribution lucid fails with the messages below: Get:1 http://us-east-1.ec2.archive.ubuntu.com lucid Release.gpg [189B] Hit http://us-east-1.ec2.archive.ubuntu.com lucid Release Hit http://us-east-1.ec2.archive.ubuntu.com lucid/main Packages Fetched 189B in 0s (2376B/s) Reading package lists... I: Obtaining the cached apt archive contents Reading package lists... Building dependency tree... 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. Reading package lists... Building dependency tree... apt is already the newest version. Package cowdancer is not available, but is referred to by another package. This may mean that the package is missing, has been obsoleted, or is only available from another source E: Package cowdancer has no installation candidate I: unmounting dev/pts filesystem I: unmounting proc filesystem pbuilder create failed forking: rm -rf /opt/cowbuilder
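    One possible cause, offered as a guess rather than a confirmed diagnosis: cowdancer lives in the universe component, and the chroot being created appears to pull only main from the EC2 mirror, so the package ends up with no installation candidate. pbuilder (and therefore cowbuilder) can be told to enable universe as well, either on the command line or via COMPONENTS in ~/.pbuilderrc:
        # Recreate the base.cow with universe enabled alongside main
        sudo cowbuilder --create --distribution lucid --components "main universe"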

    Read the article

  • Network strange problem

    - by Ali
    I have a cPanel server with 5 IPs and a few domains. During the night, access to the main domain over HTTP returns "324 no response" many times. After several refreshes it comes up, but many assets won't load and return 324. Using HTTPS is fine, and during the day HTTP is also fine. But another domain on that very server works fine all day over HTTP. The server's DNS entries are ns1 and ns2 of the first domain. The second domain is on the shared IP and the first domain has a dedicated IP. I can't resolve the problem :( and would appreciate any help so much!

    Read the article

  • How to install GIMP 2.7?

    - by Bucic
    Here I ask this question since Beta 2 is usable and this question would come up sooner rather than later ;) After issuing the standard sudo add-apt-repository ppa:matthaeus123/mrw-gimp-svn and sudo apt-get update I get: W: Failed to fetch http://ppa.launchpad.net/matthaeus123/mrw-gimp-svn/ubuntu/dists/precise/main/source/Sources 404 Not Found, W: Failed to fetch http://ppa.launchpad.net/matthaeus123/mrw-gimp-svn/ubuntu/dists/precise/main/binary-amd64/Packages 404 Not Found, W: Failed to fetch http://ppa.launchpad.net/matthaeus123/mrw-gimp-svn/ubuntu/dists/precise/main/binary-i386/Packages 404 Not Found, E: Some index files failed to download. They have been ignored, or old ones used instead. Is there any way to get around this, excluding compiling from source, as that usually introduces even more multi-level errors?
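    If the PPA really no longer publishes packages for precise (the 404s suggest it has gone away), one way to at least stop apt-get update from failing, sketched here, is to remove that source again and then look for a PPA that still builds GIMP 2.7/2.8 for 12.04:
        sudo add-apt-repository --remove ppa:matthaeus123/mrw-gimp-svn
        sudo apt-get update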

    Read the article

  • Bandwidth Problem in Terminal?

    - by Rob Barker
    I'm trying to install the Mac cursors on Ubuntu 12.04 but I get this error when using the wget command in the Terminal. ubuntu@ubuntu:~$ wget -O mac-cursors.zip http://dl.dropbox.com/u/53319850/NoobsLab.com/mac-cursors.zip --2012-12-09 16:31:17-- http://dl.dropbox.com/u/53319850/NoobsLab.com/mac-cursors.zip Resolving dl.dropbox.com (dl.dropbox.com)... 23.21.195.136, 23.23.139.153, 107.20.134.231, ... Connecting to dl.dropbox.com (dl.dropbox.com)|23.21.195.136|:80... connected. HTTP request sent, awaiting response... 509 Bandwidth Error 2012-12-09 16:31:18 ERROR 509: Bandwidth Error. Can someone tell me what this means, please, and a possible workaround? Thanks very much.

    Read the article

  • Latest MapViewer 11g patch released

    - by lqian
    Hi, we are glad to announce that the latest MapViewer 11g patch (version 11.1.1.7.2) has just been uploaded to OTN in the usual place. This is mostly a bug-fix release, with several noticeable enhancements to the HTML5 API. For the full release notes, please check here: http://download.oracle.com/otndocs/products/mapviewer/mapviewer_11p6_2_readme.txt On a related note, our hosted mapping service (elocation.oracle.com) has also updated its MapViewer server to this release. Finally, the public demo server running all the standard MapViewer demos has been patched to 11.1.1.7.2 as well, so make sure to give the demos a spin! http://slc02okf.oracle.com : showcases some of the main HTML5 mapping demos http://slc02okf.oracle.com/mvdemo : the MapViewer Samples & Demos Application. Thanks, LJ

    Read the article

  • Port 21 will not open. status is closed. forward port 21 to new domains.

    - by Bob Swaggerty
    I am having a problem opening port 21 on my Linux Ubuntu server. No matter what I do, I cannot get the status to change from closed to open. Here is a recent iptables command I used and the result:
      iptables -A INPUT -p tcp -m tcp --dport 21 -j ACCEPT
      nmap -p21-22,25,80,443 CCR1
      Starting Nmap 4.53 ( http://insecure.org ) at 2012-06-19 03:13 CDT
      Interesting ports on CCR1.chennaichristianradio.com (5.10.69.98):
      PORT    STATE  SERVICE
      21/tcp  closed ftp
      22/tcp  open   ssh
      25/tcp  closed smtp
      80/tcp  open   http
      443/tcp closed https
    I also used commands from the advice in this forum: http://www.cyberciti.biz/faq/iptables-open-ftp-port-21/ I need to open this for FTP access to the server, and ultimately I need to forward port 21 to two domains I am setting up. Thank you for any assistance you can provide. -Bob Swaggerty
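    Two things worth checking, offered as a sketch rather than a diagnosis: nmap reporting the port as closed (rather than filtered) usually means nothing is listening on 21 at all, and a rule appended with -A can land after an earlier REJECT rule and never be reached. The service name below is an assumption (pure-ftpd is common on cPanel boxes; adjust to whatever FTP daemon is actually installed):
        # Is anything listening on port 21?
        sudo netstat -plnt | grep ':21'
        # Insert the accept rule at the top of the INPUT chain instead of appending it
        sudo iptables -I INPUT 1 -p tcp --dport 21 -j ACCEPT
        # Start the FTP daemon if it is not running
        sudo /etc/init.d/pure-ftpd start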

    Read the article
