Search Results

Search found 18715 results on 749 pages for 'website attack'.


  • Public Facing Recursive DNS Servers - iptables rules

    - by David Schwartz
    We run public-facing recursive DNS servers on Linux machines, and they have been used in DNS amplification attacks. Are there any recommended iptables rules that would help mitigate these attacks? The obvious solution is just to limit outbound DNS packets to a certain traffic level, but I was hoping to find something a little more clever, so that an attack only cuts off traffic to the victim IP address rather than throttling everyone. I've searched for advice and suggestions, but they all seem to be "don't run public-facing recursive name servers". Unfortunately, we are backed into a situation where things that are not easy to change will break if we don't run them, and this is due to decisions made more than a decade ago, before these attacks were an issue.
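    A minimal sketch of the per-source rate limiting this usually comes down to, using the iptables hashlimit match (the thresholds are assumptions to tune, and on older iptables versions the rate option is --hashlimit rather than --hashlimit-upto):

        # Allow each client IP a modest sustained query rate with a small burst,
        # then drop further UDP/53 packets from that source. Because the source
        # address in an amplification attack is the spoofed victim, this confines
        # the damage to that address instead of throttling all clients.
        iptables -A INPUT -p udp --dport 53 \
            -m hashlimit --hashlimit-name dns-per-ip --hashlimit-mode srcip \
            --hashlimit-upto 10/second --hashlimit-burst 30 -j ACCEPT
        iptables -A INPUT -p udp --dport 53 -j DROP

    Response-rate limiting inside the name server itself, where the software supports it, is the more targeted fix, since it sees query contents rather than just packet counts.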

    Read the article

  • How to generate user-specific PDF with encrypted hidden watermark?

    - by Dave Jarvis
    Background: Using LaTeX to write a book. When a user purchases the book, the PDF will be generated automatically. Problem: The PDF should have a watermark that includes the person's name and contact information. Question: What software meets the following criteria?
      - Applies encrypted, undetectable watermarks to a PDF
      - Open source
      - Platform independent (Linux, Windows)
      - Fast (marks a 200-page PDF in under 1 second)
      - Batch processing (exclusively command-line driven)
      - Collusion-attack resistant
      - Non-fragile (e.g., PDF to EPS to PDF still contains the watermark)
      - Well documented (shows example usages)
    Ideas & Resources: Some thoughts and findings: natural language processing (NLP) watermarks, or applying steganography on a randomly selected image (http://openstego.sourceforge.net/cmdline.html). The problem with NLP is that grammatical errors can be introduced. The problem with steganography is that the images are sourced from an image cache, so recreating that cache with watermarked images will impart a delay when generating the PDF (I could just delete one image from the cache, but that's not an elegant solution). Thank you!
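    The watermarking tool itself is the open question, but the per-purchase generation step is easy to automate from the LaTeX side; a hedged sketch (the \buyername macro, job name and file names are illustrative assumptions), after which whichever watermarking or steganography tool is chosen can post-process the finished PDF:

        # Inject the purchaser's details into the build without editing book.tex;
        # the document can reference \buyername wherever the mark should appear.
        pdflatex -jobname=book-jsmith \
            "\def\buyername{Jane Smith (jane@example.com)}\input{book.tex}"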

    Read the article

  • Does Google sometimes prevent new white hat sites from ranking at all in some verticals?

    - by JVerstry
    Assume someone wants to implement a new viagra or acai berry e-commerce website. There is a lot of competition, and this site does not really bring anything new other than a new online counter to buy products at a nice price. Assume this site does not use any black hat techniques, stays within Google's quality guidelines, and has no (or few) backlinks (from non-authoritative websites). Assume this website's pages are indexed properly in Webmaster Tools, no penalties are reported, no site improvements are suggested, Google crawls the site daily as reported in GWT, and there are no robots.txt configuration issues. Does Google sometimes decide not to rank this site for any user query (for weeks) because of a lack of original content? The reason I am asking is that I am trying to understand the possible cause of a similar situation I am observing with two sites. If so, what is the way to start ranking for these sites? If not, does it mean the cause is elsewhere for sure? Any confirmed info to get out of the maze is welcome.

    Read the article

  • Which server is best for me?

    - by mathew
    I am looking for the best hosting service for my website. It is a PHP/MySQL-driven site which does site scraping of more than 10 websites, parses about 8-10 APIs, reads a roughly 150 MB data file from the local hard drive, parses one RSS feed, and pulls a live graph from other sites plus a geo map and the Maps API from Google, and so on. There is also one widget which shows real-time results for anyone who chooses it. So my question is: which is the best option for me? I am actually considering the SoftLayer cloud, as they are pretty cheap and offer more facilities than Rackspace. Another option is a dedicated server with two single-core processors, 4 GB RAM and a 250 GB SATA II drive on a 100 Mbps uplink. So please tell me which will be the best option. I have heard that a dedicated server has more limitations than the cloud, but the cloud uses a SAN for storage, so I am afraid the database reads may be a bit slower, and their basic plan has only 1 GB of RAM.

    Read the article

  • Do I need an SSL certificate if just pointing my domain to CloudFront?

    - by hashpipe
    I have a website running on a domain (e.g. site.com). I have an additional domain (e.g. sitecdn.com) which basically points to Amazon CloudFront for delivery; CloudFront in turn fetches the data from the main domain (site.com). I use this setup primarily so that multiple subdomains of sitecdn.com can point to assets via the CDN. The main website has an SSL certificate, and I intend to serve all assets from the CDN as https links only, something like <img src="https://img.sitecdn.com/image.jpg" />. I'm a little confused about whether I need an SSL certificate for my CDN domain. In CloudFront I can set the distribution to allow both https and http traffic. Do I need an SSL certificate for this? If yes, where do I install it, since I don't have a server for sitecdn.com?

    Read the article

  • Do I need social networks to be an expert developer? [closed]

    - by Gerald Blizzy
    This question may sound odd, but do I need Twitter, Facebook and Google+ if I am a web developer? I see many expert developers using them for work these days. It seems like it's harder to stay in touch with customers, co-workers and potential customers if you don't use social networks. Am I right? The reason I ask is that I am not a Facebook/Twitter person at all; I find them boring and annoying. I understand that LinkedIn is useful for a career, but what about Twitter and Facebook? Are they needed for a web development career? What I am trying to ask is: if I only use LinkedIn, my own portfolio website, Google Talk, Gmail and something like GitHub, would I actually miss anything professionally/job-wise? My thinking is that I can just have a portfolio website listing all my projects, as well as a contacts page with my Google Talk/Gmail account. That can suit full-time jobs, freelance work and my own projects alike, so email and Google Talk are enough. Am I right or not? Thanks in advance!

    Read the article

  • Devoxx 2011 Started Today

    - by Yolande
    Devoxx 2011, organized by the Java user group in Belgium, is the biggest Java conference in Europe. The first two University Days set the tone for the week-long conference with in-depth technical sessions led by luminaries from the Java community and industry experts. Each day is a great mix of 3-hour sessions and hands-on labs, 30-minute Tools-in-Action sessions giving tips for faster and better application development, and the traditional Birds-of-a-Feather sessions in the evening. Java sessions for today and tomorrow:
      - Next Gen Enterprise Apps - Bert Ertman and Paul Bakker talked about the new Java EE 6 APIs that reduce the need for boilerplate code and configuration.
      - JavaFX 2.0 - A Java developer's guide - Stephen Chin and Peter Pilgrim will give an overview of the new version and how Java developers can take advantage of it.
      - Java Rich Clients with JavaFX 2.0 - Richard Bair and Jasper Potts will get into the JavaFX 2.0 APIs.
      - Building an end-to-end application using Java EE 6 and NetBeans - Arun Gupta will showcase how to write Java EE 6 applications more effectively.
      - The OpenJDK Community BOF with Dalibor Topic.
    Starting Tuesday, come by the Oracle booth to chat about technology, enter our raffle and have a beer every day at 18:45. The sessions will be available on the Parleys website after the conference. In the meantime, you can learn a lot about these Java technologies on our website:
      - JavaFX 2.0 tutorials and documentation
      - OpenJDK
      - News from the GlassFish community
      - Java EE 6 resources
      - JavaOne sessions

    Read the article

  • Is there an Apache module to slow down site scans?

    - by florin
    I am administering a few web servers. Each night, random hosts from the Internet probe them for various vulnerabilities in php, phpMyAdmin, Horde, mysqladmin, etc. Is there a way (an Apache module?) to slow down the rate of attack? For SSH, I have a rate-limiting rule on the firewall which does not allow more than three connections per minute. But I don't want to rate-limit all HTTP access, only the access that returns 404s. Is there such an Apache module?
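    One approach that matches the "404s only" requirement is to do it outside Apache, with fail2ban watching the access log and banning scanners at the firewall; a rough sketch assuming a combined-format log at the usual Debian path (the filter regex, thresholds and paths are all assumptions to adjust):

        # /etc/fail2ban/filter.d/apache-404.conf
        [Definition]
        failregex = ^<HOST> .* "(GET|POST|HEAD) [^"]*" 404
        ignoreregex =

        # /etc/fail2ban/jail.local
        [apache-404]
        enabled  = true
        port     = http,https
        filter   = apache-404
        logpath  = /var/log/apache2/access.log
        maxretry = 20
        findtime = 60
        bantime  = 600

    If it has to stay inside Apache, mod_evasive is the usual suggestion, but it throttles on overall request rate rather than on 404 responses.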

    Read the article

  • .htaccess HTTPS redirect best method

    - by Douglas Cottrell
    I have searched through all the redirects posted by others and can't quite find the answer to my problem. I have a website with over 3000 pages and we are getting duplication issues within Google. We want everything in the parent directory to stay on HTTP except our contact.php and login.php pages. We then have three folders that must be secured: admin, clients, customers. I have tried using the following code in separate .htaccess files for each folder, but I keep getting a conflict, and I am still trying to find a good solution for the home directory:

        RewriteEngine On
        RewriteCond %{SERVER_PORT} 80
        RewriteCond %{REQUEST_URI} admin
        RewriteRule ^(.*)$ https://www.website.com/$1 [R,L]

    Any help would be greatly appreciated.
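    For what it's worth, one way to avoid the per-folder conflicts is a single rule set in the document root's .htaccess along these lines; treat it as a sketch to test rather than a drop-in (the host name and folder names are taken from the question, and the %{HTTPS} checks assume SSL terminates on this server):

        RewriteEngine On

        # Force HTTPS for the three secured folders and the two sensitive pages.
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} ^/(admin|clients|customers)(/|$) [OR]
        RewriteCond %{REQUEST_URI} ^/(contact|login)\.php$
        RewriteRule ^(.*)$ https://www.website.com/$1 [R=301,L]

        # Send everything else back to plain HTTP so duplicate https URLs drop out.
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(admin|clients|customers)(/|$)
        RewriteCond %{REQUEST_URI} !^/(contact|login)\.php$
        RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]

    Adding rel="canonical" tags to pages that exist on both schemes covers the remaining duplication side.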

    Read the article

  • Creating a backup - Rsync - Connection refused (111)

    - by pablofiumara
    I am trying to create a backup of my website for free. I just want a full backup of my website: not only all the files and the configuration but also the databases. If it can be done automatically, that would be better. I feel there are better ways than using cPanel to achieve this (actually, I believe some web hosts do not have cPanel at all). I read the following advice on how to do it: "Automatically mirror the entire contents and configuration of your main server to a secondary backup server on a completely separate network in a different data centre. Use RSync, FXP, cPanel voodoo, or whatever method you wish to automate syncing." That is why I installed the rsync daemon, which is an alternative to SSH for remote backups. I configured it, but the test went wrong. The terminal shows me this:

        pablofiumara@pablofiumara-Lenovo-G470:~$ sudo rsync [email protected]::share
        [sudo] password for pablofiumara:
        rsync: failed to connect to pablofiumara.com (50.87.147.75): Connection refused (111)
        rsync error: error in socket IO (code 10) at clientserver.c(122) [Receiver=3.0.9]
        pablofiumara@pablofiumara-Lenovo-G470:~$ sudo rsync [email protected]::share
        failed to connect to 50.87.147.7 (50.87.147.7): Connection refused (111)
        rsync error: error in socket IO (code 10) at clientserver.c(122) [Receiver=3.0.9]

    What should I do? Is there a better or easier way to achieve what I described in the first paragraph?
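    "Connection refused" on the rsync port usually just means no rsync daemon is listening on the remote host, which is common on shared hosting. If the host allows SSH, rsync can run over SSH with no daemon at all; a rough sketch, with the user name and paths as placeholder assumptions:

        # Pull the site's files into a local backup directory over SSH.
        # -a preserves permissions/times, -z compresses, --delete mirrors removals.
        rsync -az --delete -e ssh user@pablofiumara.com:public_html/ ~/backups/site/

        # Databases are not safely copied as live files; dump them first
        # (credentials here assumed to come from ~/.my.cnf on the server).
        ssh user@pablofiumara.com 'mysqldump --all-databases' > ~/backups/site-db.sql

    If SSH is not available either, the host's own backup download (cPanel or otherwise) is effectively the remaining option.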

    Read the article

  • Blank pale blue screen with Live USB Kubuntu on AMD Sempron 2800+ processor

    - by WGCman
    I am trying to install Kubuntu onto a USB stick to use on my Acer Aspire 1362 laptop with an AMD Sempron 2800+ chip. Using Windows XP, I downloaded and saved to the laptop's hard drive kubuntu-12.04.1-desktop-i386.iso from the Get Kubuntu website and LinuxLive USB Creator 2.8.16.exe from the LinuxLive website. I then installed the latter and ran it, installing Kubuntu onto the memory stick. Leaving the BIOS setup unchanged, the USB stick is ignored and Windows boots. If I change the BIOS boot order so that the memory stick takes precedence, I see a dark blue screen announcing Kubuntu 12.04, and on selecting either "live mode" or "persistent mode", messages flash by quickly, some of which appear to be error messages, including "trying to unpack rootfs image as initramfs", "cannot allocate resource for mainboard" and "no plug and play device found". Eventually I see a pale blue screen with four moving dots announcing Kubuntu 12.04, similar to the login screen of my Kubuntu desktop, but with no invitation to log in or indeed any dialog. After several minutes, this changes to a black screen with more messages, including "no caching mode present" and "ADDRCONF(NETDEV_UP): wlan0: link is not ready", then degrades to a blank pale blue screen which can only be cleared by switching the computer off. Finding no way to log the error messages passing by, I managed to photograph most of them, but know no way to attach the photo to this forum. As suggested by User 68186 (to whom thanks!), I have edited my original post to reflect the recent progress, so the following two comments are now superseded.

    Read the article

  • Hacking prevention, forensics, auditing and countermeasures

    - by tmow
    Recently (though it is also a recurrent question) we saw three interesting threads about hacking and security: "My server's been hacked EMERGENCY", "Finding how a hacked server was hacked" and "File permissions question". The last one isn't directly related, but it highlights how easy it is to mess up web server administration. As there are several things that can be done before something bad happens, I'd like to have your suggestions in terms of good practices to limit the side effects of an attack and how to react in the sad case that it does happen. It's not just a matter of securing the server and the code but also of auditing, logging and countermeasures. Do you have a list of good practices, or do you prefer to rely on software or on experts who continuously analyze your web server(s) (or on nothing at all)? If yes, can you share your list and your ideas/opinions?

    Read the article

  • Radeon 9200 Driver?

    - by usuhh86
    I've been using Linux for a while, but haven't really done more than install programs. I have Ubuntu 12.04 with an ATI Radeon 9200 graphics card, which Ubuntu didn't recognize during the install. All I want is a driver, preferably the proprietary driver, but I'll settle for an open-source one if I need to. I went to the AMD support website, and whenever I click on the download link for "ATI Proprietary Linux x86 Display Driver 8.28.8" I get redirected to the main AMD website. I tried this on Firefox, Chrome, and even booted up my netbook and tried it on IE9. Does anybody have a download link? And if not, a link to download an open-source alternative? Any help will be greatly appreciated; I'm pretty much still a newbie at this.

        OpenGL vendor string:     Tungsten Graphics, Inc.
        OpenGL renderer string:   Mesa DRI R200 (RV280 5961) x86/MMX/SSE2 TCL DRI2
        OpenGL version string:    1.3 Mesa 8.0.2
        Not software rendered:    yes
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  yes
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      no
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       no
        Unity 3D supported:       no

    Read the article

  • Showing content from pages at different URLs (masking), possibly with .htaccess

    - by zigojacko
    If I have URLs like:

        domain.com/category/widgets/filter/blue
        domain.com/category/widgets/filter/red

    and it is pretty difficult to reconstruct them to something like:

        domain.com/category/blue-widgets
        domain.com/category/red-widgets

    is there any way at all that I can use URL rewrites, or anything else with .htaccess or on the server, to display the URL as domain.com/category/blue-widgets on the domain.com/category/widgets/filter/blue page? I've looked into masking URLs but got nowhere, and this has been bugging me for almost 6 months now. Is there any way to achieve what I want to do? FYI: this is a Magento website, and I want to implement the above for potentially hundreds of URLs.
    Edit (responding to @kkugelmann's answer): I couldn't get your proposed RewriteRule to make any difference in the .htaccess file, so I started testing a few things in an .htaccess tester. The proposed RewriteRule didn't work in the tester; a couple of variations did, but adding any of those RewriteRules to the website's .htaccess file did not rewrite the URL at all.
    Edit 2: If I add [R=301,L] to the end of the rewrite rule, it does then rewrite the URL, but of course it 301-redirects as well, which is unwanted behaviour.
    Edit 3: I found another question with the same issue, and an accepted answer that solved that problem, which seemed to involve mod_proxy and the [P] flag on the rule (if I try this, the page 404s).
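    For reference, the masking direction described here is normally just an internal rewrite from the pretty URL to the real one, which needs neither a redirect nor [P] as long as both URLs are answered by the same Magento instance; a hedged sketch (the widgets pattern is generalised from the two examples and would sit above Magento's existing rewrite to index.php):

        RewriteEngine On

        # Serve /category/blue-widgets with the content of
        # /category/widgets/filter/blue, leaving the address bar untouched.
        RewriteRule ^category/([a-z]+)-widgets/?$ /category/widgets/filter/$1 [L]

    With [R=301] the rewrite becomes a visible redirect (the behaviour noted in Edit 2), and [P] additionally requires mod_proxy to be enabled on the server.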

    Read the article

  • Tried teaching myself to program before college, accidentally overwhelmed myself, tips?

    - by Gunnar Keith
    I'm sixteen, I'm really interested in programming, and I'm currently taking IT classes during my mornings in high school. Last year I tried teaching myself to code. It was quite exciting, but all I did was watch TheNewBoston's YouTube videos on Python. After his tutorials, I just did research and made some command-line programs, and that's it. After that, I got cocky and got my feet wet in many other languages: Java, C++, C#, Perl, Ruby... and it overwhelmed me, which made it less fun to code. I want to go to college for a 2-year programming course, and I want to make writing code my profession. How do you recommend I attack learning it all again? Start with Python? Don't even try? Also, I'm not 100% solid at math, but I'm good friends with a lot of programmers who say they suck at math yet manage to code just fine. I'm not looking for negative feedback; I just want the proper head start on things before college.

    Read the article

  • Domain changes required for SSL integration

    - by user131003
    Currently my site supports regular payment options (the user is taken to the payment gateway's (PG's) website). Now I'm trying to implement "seamless" PG integration, and I need SSL for this. I have a dedicated server with 5 static IPs from HostGator (HG). The options:
    1. I get SSL for www.my_domain.com. According to HG, I need to change the IP of the main site, as the current IP is not really dedicated (it is shared by cPanel etc.), so they need to bind another dedicated IP to the main domain for SSL to work. This would require a DNS change for the main website and hence cause a few hours of downtime (which is OK).
    2. I've noticed that most e-commerce websites use subdomains like secure.my_domain.com for SSL/HTTPS. This sounds like a better approach, but I have a few doubts in this case: a) Would I need to re-register with the existing PGs (PayPal, Google Checkout, Authorize.net) if I switch to a subdomain? Re-registering is not an option for me. b) Would a DNS change be required for www.my_domain.com in this case? This confusion arose because of the following reply from HG: "If the sub domain secure.my_domain.com is added to an existing cPanel it will use the IP for that cPanel so as long as it is a Dedicated IP that will be fine. If secure.my_domain.com gets setup as its own cPanel it will need to be assigned to a Dedicated IP which would have a DNS change involved." Please advise.

    Read the article

  • Basic IIS7 permissions question

    - by Tom Gullen
    We have a website with a file at www.example.com/apis/httpapi.asp. This file is used by the site internally to make requests joining two systems on the website together (one is Classic ASP, the other ASP.NET). However, we do not want the public to be able to access the file. In IIS 7.5, is there a setting I can use to make this file internal-only? I've tried rewriting its URL, but the rewrite is also applied internally, so the scripts stop working when they fetch the rewritten URL. Thanks for any help!
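    One hedged option, assuming the internal callers are HTTP requests made from the server itself and the "IP and Domain Restrictions" role service is installed (the ipSecurity section may also need unlocking in applicationHost.config), is to restrict just that file to the server's own addresses in web.config:

        <configuration>
          <location path="apis/httpapi.asp">
            <system.webServer>
              <security>
                <!-- Deny everyone except the addresses the internal calls
                     actually come from (loopback and/or the server's own IP). -->
                <ipSecurity allowUnlisted="false">
                  <add ipAddress="127.0.0.1" allowed="true" />
                </ipSecurity>
              </security>
            </system.webServer>
          </location>
        </configuration>

    If the "internal" use is purely server-side (includes or in-process calls) rather than HTTP requests, Request Filtering can simply hide the path from clients instead.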

    Read the article

  • Unix: Sync directory with FTP or SFTP directory

    - by Svish
    I have a website on my local computer running Mac OS X. I am wondering if there is any built-in command I can run in the Terminal that will upload that website to my web server, either through FTP or, if possible, SFTP. Installing new commands through MacPorts is also a possibility. A big bonus would be if it only uploaded the files that need to be updated and not everything else. It would also be nice if I could tell it, once in a while, to delete the files on the server that no longer exist locally. Any good tips?
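    If the server accepts SSH/SFTP, the rsync that ships with OS X does exactly this; a small sketch with host, user and paths as placeholder assumptions:

        # Upload only changed files; --delete removes remote files that no longer
        # exist locally (add -n first for a dry run).
        rsync -az --delete -e ssh ~/Sites/mysite/ user@example.com:/var/www/mysite/

        # If only plain FTP is available, lftp (installable via MacPorts) can mirror:
        lftp -u user ftp.example.com -e "mirror -R --delete /Users/me/Sites/mysite /public_html; quit"

    Running the rsync line without --delete for day-to-day uploads, and with it only occasionally, matches the "once in a while" cleanup described above.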

    Read the article

  • Controlling access to site folders if you cannot use Roles

    - by DavidMadden
    I find myself on an assignment where I could not use System.Web.Security.Roles, which meant that I could not use Visual Studio's Website | ASP.NET Configuration. I had to go about things another way. The clues were in these two websites:
    http://www.csharpaspnetarticles.com/2009/02/formsauthentication-ticket-roles-aspnet.html
    http://msdn.microsoft.com/en-us/library/b6x6shw7(v=VS.71).aspx
    You can set the restrictions on folders in your root web.config without having to set them in multiple folders through their own web.config files. In the main default.aspx of my protected subfolder off the main site, I used the following code, MultiFormAuthentication (MFA) having provided the security up to this point:

        string role = string.Empty;
        if (((Login)Session["Login"]).UserLevelID > 3)
        {
            role = "PowerUser";
        }
        else
        {
            role = "Newbie";
        }

        FormsAuthenticationTicket ticket = new FormsAuthenticationTicket(
            1,
            ((Login)Session["Login"]).UserID,
            DateTime.Now,
            DateTime.Now.AddMinutes(20),
            false,
            role,
            FormsAuthentication.FormsCookiePath);

        string hashCookies = FormsAuthentication.Encrypt(ticket);
        HttpCookie cookie = new HttpCookie(FormsAuthentication.FormsCookieName, hashCookies);
        Response.Cookies.Add(cookie);

    This all gave me the ability to change restrictions on folders without having to restart the website or do any hard coding.
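    As a companion sketch to the ticket code above (the folder and role names here are illustrative; the usual companion step of reading the role back out of the ticket's UserData in Application_AuthenticateRequest is what makes the roles attribute take effect), the per-folder restrictions can all live in the root web.config via location elements:

        <configuration>
          <system.web>
            <authentication mode="Forms">
              <forms loginUrl="Login.aspx" timeout="20" />
            </authentication>
          </system.web>

          <!-- Only users whose ticket carries the PowerUser role reach this folder. -->
          <location path="ProtectedFolder">
            <system.web>
              <authorization>
                <allow roles="PowerUser" />
                <deny users="*" />
              </authorization>
            </system.web>
          </location>
        </configuration>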

    Read the article

  • Legal issues regarding embedding a toolbar into a browser [closed]

    - by OmarOthman
    We are in the process of developing software that provides a service to internet users, and we would like to ask about the legal liabilities of some issues. Of course, everything is to be done with the consent of the user of our software, but our concern is about third-party tools and services that may be invoked/used by our product. In particular, these are the concerns: (1) Embedding a toolbar into an existing browser. The screenshot is an example, where the words in the highlighted toolbar are passed to www.google.com for searching, and the contents of the window are the results of the search. I want to know whether any consent should be obtained before such a toolbar can be embedded in a web browser, whether there are any legal requirements by the web browser, and whether different web browsers have different requirements (at least for Internet Explorer, Firefox, Chrome, Opera and Safari). (2) Invoking a free website from that toolbar (like Google's search page), as the screenshot demonstrates with an existing toolbar. (3) Full ownership of, and unrestricted access to, the data entered into this toolbar. In the screenshot, I want to take the words ("translation english to spanish") and own them, i.e. store them in my database and do some processing on them. (4) The ability to track the pages visited by the user starting from that free website. In the screenshot, you can notice that the user opted only for the third result, whose URL is translate.google.com. I want to have access to this and all URLs clicked from this page, for some processing as well. This is a commercial application, so I need a very concrete, precise and reference-supported answer.

    Read the article

  • Why was my site rejected for Google AdSense?

    - by hyuun jjang
    I have a 3-year-old blog with around 16 articles/tutorials about programming problems and solutions. It has been getting a lot of views lately, so I decided to apply for a Google AdSense account. When I first applied via Blogger, Google replied with the following statement: "Page Type: In order to participate in Google AdSense, publishers' websites and application information must satisfy the following guidelines: - Your website must be your own top-level domain (www.example.com and not www.example.com/mysite). - You must provide accurate personal information with your application that matches the information on your domain registration. - Your website must contain substantial, original content..." So, as I understood it, I decided to buy a domain and point my Blogger blog to that new naked domain. Here is the newly bought domain, where all the content of my old blog now resides: http://icodeya.com/. I reapplied, hoping that this time I would make the cut, but then I got this reply: "Further detail: Unable to review your site: While reviewing http://www.icodeya.com/, we found that your site was down or unavailable. We suggest you check whether there was a typo in the URL submitted. When your site is operational, you can resubmit your application with the correct site by following the directions below." I'm a bit disappointed. Maybe I did something wrong with the DNS configuration or something, but you can clearly see that my site is fully functional. I heard that Google sends robots to crawl the site, etc. It's just sad, because I invested in a domain name and now I can't even find ways to earn from it. Any tips?

    Read the article

  • Why is Ubuntu offline (except torrents) while Windows is online?

    - by Fahim al Islam
    I am using a static wired connection. Everything was perfect, but suddenly, starting a few hours ago, I can't access any website. Dropbox and Ubuntu One also can't connect, and ping requests are unsuccessful, yet I can download through torrents. I am not running torrent downloads and browsing at the same time, so I don't think torrents are using all the bandwidth. One important point is that this connection works perfectly under Windows on this same PC (it is dual-boot). I have tried what izx suggested (using sudo sh -c 'echo nameserver 8.8.8.8 > /etc/resolv.conf'), but I'm facing the same problem again. Now I can't even ping 8.8.8.8 or google.com, though I can ping 74.125.228.2 (which is a Google IP address). I can't understand what is happening or why. I'm new to this website and many of its rules are unknown to me, so please don't be bothered by my mistakes. Looking forward to help from anyone. Thanks to all.

    Read the article

  • Personalized Pricing

    - by David Dorf
    In past postings I've spent a fair amount of time talking about targeted promotions. Using a complete view of the customer that includes purchase history, location history, and psychographics gleaned from social media, we can select the offer with the greatest chance of redemption. This is done to influence shopping behavior, which might mean introducing the consumer to a new product line, increasing their basket size, increasing frequency of purchases, etc. Safeway seems to be taking a slightly different approach with their personalized pricing. In addition to offering electronic coupons and club card offers, they are also providing a personalized price for certain items based on purchase history. So when Sally wants to shop at Safeway, she first checks the "Just for U" website for three types of deals. She starts by selecting manufacturer coupons to load into her loyalty card, then she checks the Club Card for offers like "buy one get one free." The third step is the interesting one: Safeway will set a particular lower price for Sally, good for 90 days, on items she buys often. Clearly this isn't enforcing a new behavior but rather instilling loyalty. I would love to know exactly how they determine the personalized price. Of course bargain hunters can still stack the three offers so they can, for example, get their $4.99 oatmeal for $0.72. I like this particular question and answer from their website's FAQ: "My offers are not that great. Can I tell you what offers I need? That's a good idea. That functionality is not currently available, but we appreciate your input and are constantly improving our just for U program. Stay tuned for exciting enhancements!" I suppose if Safeway is tracking all the purchases, they can easily determine whether the customer is profitable. As long as the customer stays profitable, why not let them determine a few offers themselves? Food for thought.

    Read the article

  • Basic Google Analytics Click Tracking and/or Overview

    - by Alan Storm
    This is a really basic Google Analytics question; apologies in advance if it's not appropriate here, but I've had a lot of luck on Stack Overflow and this seems like the best Stack Exchange site for a question like this. I'm trying to understand how Google Analytics goals work, or whether they're the right feature to be using for my situation. Most of the documentation I find online refers to the old version of the UI, not the new one. I have a website, let's call it blog.example.com. This website drives traffic to an ecommerce store, let's call that store.example2.com. I want to get reports on which links from blog.example.com are being clicked through to store.example2.com. How do you do this in Google Analytics? Are goals the right area to be looking at? Do I set up the goals on store.example2.com or blog.example.com? Or both? Is there any canonical user guide (free or paid) that covers how this works? I'm a competent programmer, but it's years since I dealt with conversion tracking on any serious level, and we've progressed well beyond my frozen caveman pixel-tracking knowledge. Thanks in advance.
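    Assuming the classic asynchronous ga.js snippet is already installed on blog.example.com, one low-tech way to answer the "which links get clicked" part without goals is event tracking on the outbound links; a sketch (the category and action labels are arbitrary):

        <a href="https://store.example2.com/some-product"
           onclick="_gaq.push(['_trackEvent', 'Outbound', 'Store', this.href]);">
          See it in the store
        </a>

    The clicks then appear in the Events reports for blog.example.com's profile. Goals measuring what happens after the visitor lands on store.example2.com are a separate matter and generally need cross-domain tracking so the visit isn't split between the two domains.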

    Read the article

  • Strange spam posts not making sense

    - by Paaland
    I'm running a web site with a forum where one small part is open to posting from unregistered users. The site uses a captcha, but some spam posts still get through every day. Here is the thing: all of the messages follow the same pattern, but each one also comes from a different IP. That makes me think this is some sort of automated, scripted "attack" from a botnet of some kind. The strange thing is that all the messages start with six random characters and contain a couple of links. The words have no meaning and the domains in the links do not even exist. Why would anyone spend time and resources spreading these things? Below you can see two of these messages:

        A5Zfs6 exrzvrbspntz, [url=http://nktqoqllnuab.com/]nktqoqllnuab[/url], [link=http://wtrenldadvsy.com/]wtrenldadvsy[/link], [http://rnlrqfgdvdot.com/]
        O2oLpL nqeffxhryfdk, [url=http://jutyurbpfxow.com/]jutyurbpfxow[/url], [link=http://jpcdtmdalpow.com/]jpcdtmdalpow[/link], [http://qopqwqxwjdjx.com/]

    Since all the messages come from different IPs, I can't see blocking those helping much. For now I'm considering just dropping all messages following this pattern, since it's quite easy to match with a regexp. Has anyone else seen these kinds of messages, or does anyone know the point of posting them?
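    For what it's worth, the two samples share enough structure that a single PCRE-style pattern catches them both; a hedged sketch to tune before trusting it against legitimate posts:

        ^[A-Za-z0-9]{6}\s+[a-z]+,\s*\[url=https?://[a-z]+\.com/\]

    That is: six alphanumeric characters, a nonsense word, then the bracketed url/link markup; anything matching it can be silently dropped as described above.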

    Read the article
