Search Results

Search found 18715 results on 749 pages for 'website attack'.

Page 264 of 749

  • How are suspected DoS attacks handled by webservers?

    - by Jan Kuboschek
    I rent a server somewhere out in Canada or so that I'm using to host a website of mine. That website has close to 400,000 pages that I wanted to index today. For that, I wrote a crawler a while back (see JCrawler on Stackoverflow.com). Now, I'm greedy and didn't want it to take too long, so I ran multiple threads, resulting in some 60+ requests per second from my IP. A couple of minutes later, my server locked me out. I can still FTP into it, but I can't reach it over HTTP. As a server administrator or user, do you have any idea how servers usually handle these situations? Is it common to place a permanent or temporary ban on the IP, or what is typically done? Naturally, I'll re-run my software with fewer requests once I'm back on.
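
    For illustration, one common way a host limits this kind of burst is an iptables rule using the "recent" match; this is only a sketch of the general technique, not what this particular provider runs, and the port, thresholds and list name are assumptions:

        # track new HTTP connections per source IP
        iptables -A INPUT -p tcp --dport 80 -m conntrack --ctstate NEW -m recent --set --name HTTP
        # drop a source that opens more than 20 new connections within 10 seconds
        iptables -A INPUT -p tcp --dport 80 -m conntrack --ctstate NEW -m recent --update --seconds 10 --hitcount 20 --name HTTP -j DROP

    Tools such as fail2ban or mod_evasive automate the same idea, usually with a temporary rather than permanent ban.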

    Read the article

  • Is the decision to use SNI or IP based SSL made during cert purchase or cert installation?

    - by Neil Thompson
    It's time to renew an SSL cert, but the website will soon be moving from a dedicated machine with a fixed IP to a cloud-based host behind a load balancer. When I renew or re-purchase my SSL cert, do I make the decision about whether it should be an SNI or IP-based SSL cert at the point of purchase, or is a cert a cert, and it's all about where and how it's installed? I'm hoping the renewed cert can continue to be IP-based for now, and in a few months, when the website (and its domain, of course) moves to the cloud, I can re-use the cert in 'SNI mode'.
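
    As a rough illustration of why this is an installation rather than a purchase concern: SNI simply means the web server picks a certificate based on the hostname the client sends. A hedged nginx sketch (names and paths are placeholders, not the actual setup):

        server {
            listen 443 ssl;                          # the same IP can carry several of these blocks
            server_name example.com;
            ssl_certificate     /etc/ssl/example.com.crt;
            ssl_certificate_key /etc/ssl/example.com.key;
        }

    The same certificate file can be bound to a dedicated IP today and served via SNI later; the detail worth checking is whether the load balancer terminates SSL itself.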

    Read the article

  • Can't get IIS6 to set up properly.

    - by AngryHacker
    I am trying to set up a website on a Windows 2003 R2 box. So I cranked up IIS, clicked on Web Sites, created a new website and set up the host headers for my domain. No matter what I did, I kept getting Service Unavailable errors. Finally, I followed the instructions in this link, and it worked. Basically, the page recommends switching to IIS5 isolation mode. I don't really understand why it works, and I don't understand why it didn't work before. If someone could enlighten me, that would be great. Bonus points for explaining how to do it with IIS6 worker process isolation mode.
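
    For reference, the switch that kind of page usually refers to is a metabase property; if memory serves, it can be toggled from the command line roughly like this (the path is the default AdminScripts location and may differ):

        cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/IIs5IsolationModeEnabled TRUE
        iisreset

    Setting it back to FALSE returns the server to IIS6 worker process isolation mode, where each application pool runs in its own w3wp.exe process.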

    Read the article

  • Changing Windows 'hosts' file in guest OS under Parallels Desktop 6 on Mac [migrated]

    - by ling
    I am running Win7 under Parallels Desktop 6 on a Mac. On the Mac, /etc/hosts contains:

        127.0.0.1 mysite

    and it works fine: I can type mysite in the URL bar and the website displays. What I want to do is reproduce the same on Win7, with IE8 for example. Another guy did it successfully here, but I can't: Parallels: How to see a Mac-hosted website from Windows? On the Mac, ifconfig -a shows vnic0 as 10.211.55.2, so on Win7 I tried this in C:\Windows\System32\drivers\etc\hosts:

        10.211.55.2 mysite

    What am I missing?

    Read the article

  • Transferring whole folders using Dropbox without the desktop app

    - by Ender
    Is there any way I can upload whole folders to my Dropbox account without the desktop application? I am syncing my home computer to a computer at university, and the university computers do not allow users to install applications anywhere, meaning that I cannot use the Dropbox utility to share my folders. I've attempted to use the online utility (on the website) to share my files, but much of my work is spread over dozens of files, each within its own folder. All you can do with the upload feature on the website is upload individual files, and for my projects this would take a long time.

    Read the article

  • Danger in running a proxy server? [closed]

    - by NessDan
    I currently have a home server that I'm using to learn more and more about servers. There's also the advantage of being able to run things like a Minecraft server (Yeah!). I recently installed and set up a proxy service known as Squid. The main reason was so that no matter where I was, I would be able to access sites without dealing with any network content filter (like at schools). I wanted to make this public, but I had second thoughts about it. I thought last night that if people were using my proxy, couldn't they access illegal materials with it? What if someone used my proxy to download copyrighted material? Or launched an attack on another site via my proxy? What if someone actually looked up child pornography through the proxy? My question is, am I liable for what people use my proxy for? If someone commits an illegal act and it leads back to my proxy server, could I be held accountable for the actions done?

    Read the article

  • Windows 2008 server inaccessible with no trace in the event log

    - by Rob
    I am trying to figure out why a Windows 2008 server became inaccessible in terms of RDP and access to a web application. The server was turned off and then on. Looking at the event log around the time it went offline, I can't find anything. And looking at miscellaneous application logs, the system was running normally after it went offline. It has to be said that, by mistake, the firewall had been switched off earlier, so a lot of attempts had been made to access the SQL Server with the sa user, as well as RDP logins. But those attempts have been going on for days, so nothing new about that. Besides the event logs, is there anywhere else I can go to examine the cause of this? I am also in doubt whether a DoS attack or similar would show up in the event log at all. From a log for a backup application running on this server, I can see that an attempt was made to reach a remote IP after the server went offline, but no connection was made.

    Read the article

  • MAMP and WordPress - Theme is gone

    - by ninjaboi21
    Hey SuperUsers, I hope this is the right section to post this kind of question. If it's not, I am sorry. But to my question: I created a working WordPress blog, running locally with MAMP Pro. I wanted to share my site and thought I would go public with it, but self-hosted. I signed up at DynDNS for my domain and got it working. People can connect, myself included of course. But here is my problem: every time I try to access my website (**.dyndns.info:8888), I can reach it, but the theme and pictures are gone, so it's only plain text with some links. I can provide a screenshot of the website if needed. What am I doing wrong? I tried to enable Web Sharing, but that didn't make a difference. Thank you so much if you can figure this out!
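
    This is usually a symptom of WordPress generating theme and image URLs from its stored site address (localhost:8888) rather than the public hostname; a hedged sketch of the usual override in wp-config.php, with a placeholder domain:

        /* serve links and theme assets from the public address rather than localhost */
        define('WP_HOME',    'http://example.dyndns.info:8888');
        define('WP_SITEURL', 'http://example.dyndns.info:8888');

    Whether this is the actual cause here would need checking against the page source (look at which host the missing CSS files are requested from).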

    Read the article

  • What are the current options to encrypt a partition on Mac OS X?

    - by symbion
    I recently had my laptop stolen with some sensitive information on it (personal source code, bank details in a secure file, passwords, etc.) and I learned the lesson: encrypt your sensitive data. Now, I am wondering what the options are to encrypt a partition (not an encrypted disk image). Aim: the aim is to prevent anyone except me from accessing this data. Requirement 0: the software must be able to encrypt a non-system partition. Requirement 1: plausible deniability is required, but preventing cold boot attacks is not an absolute requirement (I am not famous enough, nor is my info sensitive enough, for that). Requirement 2: software taking advantage of hardware AES encryption is very welcome, as I intend to get a MacBook Pro with an i7 CPU (with AES-NI instructions). I will have a virtual machine running in the encrypted partition. Requirement 3: free or reasonably cheap. Requirement 4: the software must run on Mac OS X Snow Leopard or Lion. So far, TrueCrypt is the only option I have found. Regards,
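
    For one concrete option on Lion: the built-in CoreStorage layer can encrypt a non-system volume in place, sketched below with a placeholder volume name (it does not provide plausible deniability, so it only covers part of the requirements):

        # convert an existing data volume to an encrypted CoreStorage volume (Lion and later)
        diskutil cs convert /Volumes/Data -passphrase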

    Read the article

  • Browser considering the www and non-www domains to be different

    - by user1444680
    I've bought a domain name and hosted it. My browser is storing separate passwords for mydomain.com and www.mydomain.com, and also caching them separately. I want these two to be considered the same website. The zone records of mydomain.com are:

        A record: @ points to the IP address of my hosting
        CNAME: www points to @

    As a CNAME signifies an alias, shouldn't the browser understand (like search engines do) that the two URLs refer to the same website? Is it the browser's fault? How can I correct the problem? Do I need to enter some other record for www instead of a CNAME?
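
    The browser is actually behaving correctly: for saved passwords and caching, mydomain.com and www.mydomain.com are different hostnames, regardless of the DNS records behind them. The usual fix is a server-side 301 to one canonical host; a hedged .htaccess sketch, assuming Apache with mod_rewrite:

        RewriteEngine On
        # send bare-domain requests to the www host (or swap the two to prefer the bare domain)
        RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
        RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]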

    Read the article

  • Multiple domains for different products?

    - by alexandertr
    I have a website with software applications. Is it good for SEO to choose one keyword-rich domain name for each of our software products, or should we stick to a single domain? From a user's perspective, I think it would be easier to remember a domain that is keyword-rich, as the user will instantly know what the product is for. But I have read articles saying that the latest trend in SEO is to stick to one domain for all of your products and invest in this single-domain website. Is that true? What do you advise? Should I register a separate domain for each of our products, or should I use only one single domain? Should I do a 301 redirect with .htaccess to a single domain? And what about the sitemaps? Should I register all sites in Google Webmaster Tools and post a separate sitemap for each one of them? Should my main site's sitemap include all pages, or should separate domains have their own sitemaps?

    Read the article

  • Web services not reachable through the web browser [closed]

    - by Tony
    I am trying to reference my .asmx web services in .NET, but my server is not exposed to the internet. When I browse to the following address, I get the message below. What's the reason for not being able to see the directory? Am I missing something in my IIS configuration? Am I missing anything in my permissions? For reference, I have other folders with web services and I have the same issue. When I log in to the server, I do it with my Windows user and password (I am using Windows authentication). It's worth mentioning that when I enter the URL I get a popup asking for my user ID and password, but it doesn't seem able to validate them, since it keeps asking a couple of times. Let me know if you need more information to address this issue.

        http://appsvr02/Inetpub/wwwroot/DevWebApi/

        Internet Explorer cannot display the webpage
        What you can try: reconnect to the Internet, retype the address, or go back to the previous page.
        Most likely causes:
        • You are not connected to the Internet.
        • The website is encountering problems.
        • There might be a typing error in the address.
        More information - this problem can be caused by a variety of issues, including:
        • Internet connectivity has been lost.
        • The website is temporarily unavailable.
        • The Domain Name Server (DNS) is not reachable.
        • The Domain Name Server (DNS) does not have a listing for the website's domain.
        • If this is an HTTPS (secure) address, click Tools, click Internet Options, click Advanced, and check that the SSL and TLS protocols are enabled under the Security section.

    Read the article

  • Custom Rule Sets in JohnTheRipper

    - by user854619
    I'm trying to create a custom rule set for hash cracking. I have a SHA1 hash and the rule set that was enforced when the password was created. The password must be of the following form:

        6-8 characters
        Every other letter changes case
        The password "shifts" characters at least one degree and at most three
        One odd number and one even number are at the beginning of the password
        One special character and one punctuation character are appended to the end of the password

    How can I define a brute-force attack in JohnTheRipper or a similar hash-cracking program? I've also attempted to write code to generate a wordlist of possible passwords, with no success. Thanks!
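
    As a very rough sketch of where custom rules live: John reads named rule sections from john.conf. The section name, character sets and options below are assumptions, the raw-SHA1 format and named rule sections need a jumbo build, and the rules only cover the prepend/append parts of the policy, not the case alternation or shifting:

        [List.Rules:MyPolicy]
        # prepend two digits, append one special character and one punctuation character
        ^[0-9] ^[0-9] $[!@#$%] $[.,;:]

        # run a wordlist through the custom rules against the hash file
        john --format=raw-sha1 --wordlist=words.txt --rules=MyPolicy hashes.txt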

    Read the article

  • Google Analytics custom variables and how are they recorded?

    - by mrtsherman
    I have been asked to add GA custom variable tracking to my company's website. The company website uses server-side includes, so modifications to the tracking code happen identically everywhere; maintenance is therefore a headache. Also, GA takes about twenty-four hours for custom variables to start showing up in reports, and that makes troubleshooting a headache. So say you have custom variables:

        // visitor level tracking, id = 12345
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);
        // page level tracking, email = [email protected]
        _gaq.push(['_setCustomVar', 1, 'email', '[email protected]', 1]);

    The marketing people want the following out of this: a user visits the site and we record a unique id for them; whenever they return, this id is used in GA. A user signs up for our newsletter on page X and we record their email address; whenever they return, this email address is used in GA. Now, a big problem for me is that I don't use GA and the marketing people don't use custom variables, so we don't actually know how this will work. Do I want page, session or visitor level tracking? What happens because the same GA code is used on every page? If they visit the email sign-up form and we record the email address, but then they go somewhere else where the email is nonexistent, will the value get overwritten? Sorry for the long question, but there are a lot of unknowns for a GA noob.
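
    For what it's worth, the last argument of _setCustomVar is the scope, which is what decides the page/session/visitor question; a hedged sketch with placeholder slot numbers, names and values:

        // scope 1 = visitor: persists across visits, suited to a permanent id
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);
        // scope 2 = session: lasts for the current visit only
        _gaq.push(['_setCustomVar', 2, 'member', 'newsletter', 2]);
        // scope 3 = page: applies only to the single pageview it is sent with
        _gaq.push(['_setCustomVar', 3, 'page_type', 'signup', 3]);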

    Read the article

  • IIRF - Redirecting all traffic to the https equivalent

    - by GordonB
    I'm using IIRF and having some trouble getting it to redirect all traffic to the secure version of my sites. So... I have a website with about 20 apps in virtual directories in IIS6. The website takes traffic on ports 80 and 443. I want to use IIRF to redirect all port 80 traffic, e.g.:

        http://myserver/app1/page1/param1
        http://myserver/app2/
        http://myserver

    to the secure equivalent (https). Here's my config so far:

        # Iirf.ini
        # ini file for IIRF
        RewriteLogLevel 1
        RewriteLog D:\Websites\Apptemetry\IirfLogs
        RewriteEngine ON
        StatusInquiry ON
        IterationLimit 5
        RewriteLogLevel 3
        RewriteCond %{HTTPS} off
        RewriteCond %{SERVER_PORT} ^80$
        RedirectRule ^http(.*)$ https$1

    Can anyone advise the correct configuration to use to redirect all traffic?

    Read the article

  • syslog log of TCP packet

    - by com
    Occasionally, I notice a lot of the following messages in syslog:

        Nov {datetime} hostname kernel: [8226528.586232] AIF:PRIV TCP packet: IN=eth0 OUT= MAC={mac} SRC={sourceip} DST={destinationip} LEN=60 TOS=0x00 PREC=0x00 TTL=63 ID=20361 DF PROTO=TCP SPT=39950 DPT=37 WINDOW=14600 RES=0x00 SYN URGP=0

    On the Internet, I found that a DoS attack may cause this type of output; unfortunately, I don't understand what this log means. The only thing that is clear to me is that it is related to the network. The source host is the host where Nagios is installed. Does it mean Nagios is somehow misbehaving? And what does the message mean at all?
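
    Roughly, the fields in that line break down as follows (the AIF prefix suggests the entry was logged by Arno's iptables firewall script, though that is an assumption):

        # SRC / DST    source and destination IP addresses of the packet
        # SPT / DPT    source and destination TCP ports; DPT=37 is the legacy "time" service
        # SYN          first packet of a TCP handshake, i.e. a connection attempt
        # LEN/TTL/ID   ordinary IP header fields, nothing suspicious by themselves

    So each line records a logged (and most likely blocked) connection attempt to port 37, which is not in itself evidence of a DoS attack.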

    Read the article

  • How to move mail accounts when migrating webhosting

    - by pkswatch
    I am migrating my website abc.com from one web hosting company to another in a shared hosting environment. Both have cPanel. The second hosting account, which I am preparing to move to, is a multi-domain hosting account with 3 domains already on it. The problem is, I have many email accounts associated with my website abc.com, which are accessed using webmail. So if I move it to the other host, will I lose all those accounts and their emails? If yes, then how should I synchronise the email accounts so that all the accounts and the contained emails remain intact? I saw several sync tools like IMAP Sync, etc., but these require two hosts while synchronizing, and as you see, I have just one domain name to be synchronized over 2 servers. PS: I do not have SSH access on either of them, and I have made a complete backup of all files using the backup wizard in cPanel.
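
    If the mailboxes do need to be copied rather than re-created, imapsync can run from any third machine and talks to the two servers by hostname or IP, so having a single domain name is not a blocker; a hedged sketch with placeholder hosts, mailbox and passwords:

        imapsync \
          --host1 server1.oldhost.example --user1 info@abc.com --password1 'oldpass' \
          --host2 server2.newhost.example --user2 info@abc.com --password2 'newpass'

    No SSH access on either server is needed for this, only IMAP.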

    Read the article

  • How to emulate a domain name - Webmin Setup

    - by theonlylos
    I am currently working on a client project where they are using a custom CMS which relies on having specific domains configured for it to work properly. In plain English, that means that when I try running the site in my test environment, the entire website fails because it isn't located at the primary domain (and I'm pretty sure the domain is hard-coded, since there's no control panel to adjust the file locations). What I wanted to ask is whether it is possible to use my test environment URL but have Apache and DNS emulate my client's website URL locally, rather than hitting the actual name servers. Right now I have a virtual host set up in Apache, but I am not sure where to go from there. Any assistance is greatly appreciated.
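
    In outline, the usual trick is to resolve the client's domain to the test box locally and let Apache answer for that name; a hedged sketch with placeholder names and paths:

        # /etc/hosts on the test machine (or C:\Windows\System32\drivers\etc\hosts)
        127.0.0.1   clientsite.com www.clientsite.com

        # Apache virtual host answering for the real domain
        <VirtualHost *:80>
            ServerName clientsite.com
            ServerAlias www.clientsite.com
            DocumentRoot /var/www/clientsite
        </VirtualHost>

    Only machines whose hosts file (or local DNS) carries that override will see the test copy; everyone else keeps resolving the real name servers.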

    Read the article

  • Google Indexing Issue after htaccess changes

    - by Klement
    I have a site called www.FuneralCoverFinder.co.za. I have about 30 pages on the site and usually have 29 indexed (excluding 15 blog posts). They are new. I recently upgraded my entire site and made some redirection changes in my .htaccess file. I have made my URLs more SEO-friendly (removing index.php/) and redirected dead pages to working pages. I have tons of unique content, all checked with Grammarly and Plagium to ensure I have no duplicate content. I have since resubmitted my sitemap to Google and now have only one page indexed. That happened within a couple of minutes; I usually see results almost immediately after submitting, but now it's stuck on 1 page indexed. I assume I might have made errors in the .htaccess file, as this was my first attempt. The site runs perfectly and all the URLs redirect the way they should. I'm scared I have some loop or other, although the website runs fine. I still see many of my old indexed pages in the SERPs; I'm just worried that the issue with the new sitemap could cause my rankings some harm. My website is pretty well SEO-optimized on-site. I have about 1500 indexed backlinks and have been building them steadily over about half a year. I would really appreciate some clarity on this matter.
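
    A typical shape for the index.php/ removal, shown only as a reference point for spotting loops (the actual rules on the site may differ):

        RewriteEngine On
        # externally redirect old /index.php/foo URLs to /foo, once
        RewriteCond %{THE_REQUEST} \s/index\.php/ [NC]
        RewriteRule ^index\.php/(.*)$ /$1 [R=301,L]

    Matching on THE_REQUEST is what keeps the redirect from fighting any internal rewrite back to index.php, which is the usual source of redirect loops.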

    Read the article

  • Artists and music - Need Help Deciding on a CMS

    - by infty
    A friend has asked me to build a site with the following options:

        staff members must be able to add new music and artists to the page
        a gallery must be provided - it is also good if each artist can have his/her own smaller gallery
        users must be able to vote for artists
        users must be able to take part in discussions (forums or comment sections)
        staff members must be able to blog
        staff members must be able to write articles

    I did a small project where I actually implemented all of these features, but I want to use an existing content management system for them so that future developers can, hopefully, more easily extend the website - and also so that I don't have to provide too much documentation. I have never developed a website using an external CMS like Drupal or WordPress, and after seeing hours of tutorial videos of both systems, I still can't make up my mind on whether I should: a) use Drupal 7, b) use WordPress 3, or c) create my own CMS. I can imagine that staff members will also want to create content using iPhone- or Android-based mobile devices, but this is not a required feature. Can someone with experience please tell me about their experiences with larger projects like this? The site will have approximately 400,000-500,000 visitors (not daily visitors; based on numbers from last year over a period of 4 months).

    Read the article

  • How do web servers enforce the same-origin policy?

    - by BBnyc
    I'm diving deeper into developing RESTful APIs and have so far worked with a few different frameworks to achieve this. Of course I've run into the same-origin policy, and now I'm wondering how web servers (rather than web browsers) enforce it. From what I understand, some enforcement seems to happen on the browser's end (e.g., honoring an Access-Control-Allow-Origin header received from a server). But what about the server? For example, let's say a web server hosts a JavaScript web app that accesses an API, also hosted on that server. I assume that server would enforce the same-origin policy, so that only the JavaScript hosted on that server would be allowed to access the API. This would prevent someone else from writing a JavaScript client for that API and hosting it on another site, right? So how would a web server be able to stop a malicious client that tries to make AJAX requests to its API endpoints while claiming to be running JavaScript that originated from that same web server? How do the most popular servers (Apache, nginx) protect against this kind of attack? Or is my understanding of this somehow off the mark? Or is the same-origin policy only enforced on the client end?
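
    A hedged sketch of the mechanics, since it is the crux of the question: the server only declares a policy in response headers, and it is the browser that enforces it; a non-browser client ignores the header entirely (hostnames below are placeholders):

        # request issued by JavaScript served from https://other-site.example
        GET /api/items HTTP/1.1
        Host: api.example.com
        Origin: https://other-site.example

        # response: without a matching Access-Control-Allow-Origin the browser
        # refuses to hand the body to the script; curl or a server-side client
        # reads it regardless
        HTTP/1.1 200 OK
        Access-Control-Allow-Origin: https://other-site.example

    So Apache and nginx do not stop arbitrary clients this way; APIs rely on authentication tokens, CSRF protection and similar measures instead.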

    Read the article

  • My server's been hacked EMERGENCY

    - by Grant unwin
    I'm on my way into work at 9.30 pm on a Sunday because our server has been compromised somehow and was resulting in a DoS attack on our provider. The server's access to the Internet has been shut down, which means over 500-600 of our clients' sites are now down. Now this could be an FTP hack or some weakness in code somewhere; I'm not sure till I get there. Does anyone have any tips on how I can track this down quickly? We're in for a whole lot of litigation if I don't get the server back up ASAP. Any help appreciated.
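
    As a loose first-look checklist once console access is available (these assume a Linux host and are generic triage commands, not specific to this incident):

        last -a | head -n 20              # recent logins and their source addresses
        netstat -tnp                      # established TCP connections and the owning processes
        ps auxf                           # process tree; unfamiliar children of web/FTP daemons stand out
        find /var/www -type f -mtime -2   # web files modified in the last two days

    The usual advice also applies: take the machine off the network, image the disk before changing anything, and rebuild rather than clean once the entry point is known.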

    Read the article

  • Should I indicate that the user exists or was deleted on the error page?

    - by animuson
    On an ordinary public website, the user's profile is always publicly visible to all visitors (such as on Stack Overflow), where they can limit certain pieces of information via privacy settings or just remove the information. Now the user has decided to delete their account (in my case, deactivate), so that their account doesn't technically "exist" anymore. The way my system is set up, when their account is deactivated, their username on any content connected to them just becomes "Anonymous User", as if it were a guest that posted. I feel like this could cause some confusion for other users. I'm also concerned about what kind of error to display when someone attempts to view their profile page. My gut tells me to just display a standard 404 page to hide the fact that they ever existed, but then you also have to consider that, since usernames must be unique, anyone can go to the register page and type in the username to see if it really exists or not. I have a similar problem with another website, which gives users the ability to hide their profiles from the public and only allow registered users to view them. Again, it's the same dilemma of what kind of error message to display when an unregistered user attempts to view a profile without permission. So, would it be acceptable to display basic errors such as "user has been deactivated" or "you must be logged in to view this profile" in order to give other visitors some idea of why the page can't be displayed, or should I attempt to protect the user's privacy a little and just display a standard 404 without indicating in any way that the user might exist? Are there any other issues that I'm not realizing about either route? To go back to the beginning, should I even bother changing the user's name to "Anonymous User" when their account is deactivated? Would it be acceptable to just display a non-linked version of their username in place of the normal linked display name?

    Read the article

  • Googlebot visit but no cache update - why?

    - by Mick
    I have made a new plain-vanilla HTML website. I have been making regular modifications to it on an almost daily basis. The site is hosted by HostMonster, and as part of their service they offer "awstats" to let you know assorted details of visitors to the site. One thing is puzzling me. According to awstats, a "robot/spider" calling itself "Googlebot" visited my site as recently as today (28th June 2011), but when I find my site on Google (e.g. by searching for "full reserve banking") the cache is dated only the 5th of June. I always thought that a visit from the Google robot was synonymous with a cache update. Am I wrong? Or have I accidentally put something in the site telling Google that nothing has been updated? EDIT: It seems a moderator has removed the name of my website, so there is now no chance that anyone could check whether I had made some error on my site :-( ... but anyway, in answer to paulmorriss' question, here is what awstats was telling me:

    Read the article

  • Amazon - install a complete server on EBS

    - by user1169575
    Is it possible to install a full working OS with a web server, DB, and all the needed stuff on an EBS volume? If so, would I immediately gain benefits by mounting this EBS volume on a better instance? Otherwise (if I cannot install a complete image, or if you don't think it's reasonable), can I install the software so that I only need to mount the EBS volume on a new instance to have it working? I purchased a Medium Reserved Instance, but when the need arises for a better instance I'd like to move the whole db/website: I'd simply buy a better instance and then attach the EBS volume. Is it possible? I'm imagining it like a hard drive that would be mounted on a better server. Of course, more RAM would allow me to increase caching limits, and that's OK, but I don't want to reinstall anything (the main website is a Magento commerce store and it's pretty painful to move). P.S. Is standard EBS (100 IOPS) valid, or do I need to choose Provisioned IOPS (up to 1000 IOPS)?
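
    Broadly, yes: an EBS-backed instance boots from an EBS root volume, and a data volume can be detached and re-attached to a bigger instance much like a removable drive. A hedged aws-cli sketch with placeholder IDs (the older ec2-api-tools expose equivalent commands):

        aws ec2 stop-instances  --instance-ids i-0123456789abcdef0
        aws ec2 detach-volume   --volume-id vol-0123456789abcdef0
        aws ec2 attach-volume   --volume-id vol-0123456789abcdef0 \
            --instance-id i-0fedcba9876543210 --device /dev/sdf
        aws ec2 start-instances --instance-ids i-0fedcba9876543210

    In practice the simpler route for moving a whole server is to create an AMI from the EBS-backed instance and launch the larger instance from that image; either way, the volume and the target instance must be in the same Availability Zone.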

    Read the article
