Search Results

Search found 7625 results on 305 pages for 'scraper sites'.


  • HTG Explains: How Hackers Take Over Web Sites with SQL Injection / DDoS

    - by Jason Faulkner
    Even if you’ve only loosely followed the events of the hacker groups Anonymous and LulzSec, you’ve probably heard about web sites and services being hacked, like the infamous Sony hacks. Have you ever wondered how they do it? There are a number of tools and techniques that these groups use, and while we’re not trying to give you a manual to do this yourself, it’s useful to understand what’s going on. Two of the attacks you consistently hear about them using are “(Distributed) Denial of Service” (DDoS) and “SQL Injections” (SQLI). Here’s how they work. Image by xkcd.
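
    To make the SQL injection part concrete, here is a minimal sketch in Python, using the standard sqlite3 module and a made-up users table (not anything from the article): the first query splices user input straight into the SQL text, which is exactly what SQLI abuses, while the second uses a parameterized query, the standard defense.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
      conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

      user_input = "' OR '1'='1"  # a classic injection payload

      # Vulnerable: the input is concatenated into the SQL string, so the
      # payload rewrites the WHERE clause and matches every row.
      vulnerable = f"SELECT * FROM users WHERE name = '{user_input}'"
      print(conn.execute(vulnerable).fetchall())   # returns all users

      # Safer: a parameterized query treats the input as a plain value.
      safe = "SELECT * FROM users WHERE name = ?"
      print(conn.execute(safe, (user_input,)).fetchall())  # returns nothing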

    Read the article

  • Sites with overlapping code bases: developing multiple sites with few changes

    - by Web Developer
    I have to develop 3 different sites: video.com for hosting video, audio.com for hosting audio, and docs.com for hosting docs (domain names are examples only). Almost 80% of the functionality is the same for all three, with the remaining 20% being completely different features. How do I handle this? How do sites like SO handle this? I am developing this in the Yii framework and was thinking of having these different features as modules, but in that case the menu/navigation links in the HTML can become difficult to manage.
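
    The question is about the Yii framework (PHP), but the usual approach to an 80/20 split like this is a shared core plus a thin per-site feature module selected by configuration, which also keeps the menus consistent. A rough, framework-agnostic sketch of that idea, written in Python with made-up names (SHARED_PAGES, SITE_FEATURES):

      # Hypothetical sketch: one shared core, one small feature module per site.
      SHARED_PAGES = ["home", "search", "account", "upload"]

      SITE_FEATURES = {
          "video.com": ["player", "transcoding"],
          "audio.com": ["player", "playlists"],
          "docs.com":  ["viewer", "full_text_search"],
      }

      def render_menu(site: str) -> list[str]:
          """Menu = shared pages plus whatever this site's feature module adds."""
          return SHARED_PAGES + SITE_FEATURES.get(site, [])

      for site in SITE_FEATURES:
          print(site, "->", render_menu(site))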

    Read the article

  • 1000 most visited sites on the web: A Google Analysis

    Google has released an analysis of the 1000 most visited sites on the web. Considering that we own/operate 3 of the top 10 sites and have a significant interest in Facebook, plus this recent report that states that Microsoft employees are the most social-media-savvy, we will go to great lengths to show how well we can operate in our cloud and social media integration and collaboration strategies. William Tay 2000-2010 | Swinging Technologist http://www.softwaremaker.net/blog

    Read the article

  • Can't connect to certain HTTPS sites

    - by mind.blank
    I've just moved to a new apartment with an internet connection via a router, and I'm finding that I can't connect to quite a few sites that use SSL. For example, trying to connect to PayPal:

      curl -v https://paypal.com
      * About to connect() to paypal.com port 443 (#0)
      * Trying 66.211.169.3... connected
      * successfully set certificate verify locations:
      * CAfile: none  CApath: /etc/ssl/certs
      * SSLv3, TLS handshake, Client hello (1):
      * Unknown SSL protocol error in connection to paypal.com:443
      * Closing connection #0
      curl: (35) Unknown SSL protocol error in connection to paypal.com:443

    curl -v -ssl https://paypal.com gives the same output. For some sites it works:

      curl -v https://www.google.com
      * About to connect() to www.google.com port 443 (#0)
      * Trying 74.125.235.112... connected
      * successfully set certificate verify locations:
      * CAfile: none  CApath: /etc/ssl/certs
      * SSLv3, TLS handshake, Client hello (1):
      * SSLv3, TLS handshake, Server hello (2):
      * SSLv3, TLS handshake, CERT (11):
      * SSLv3, TLS handshake, Server key exchange (12):
      * SSLv3, TLS handshake, Server finished (14):
      * SSLv3, TLS handshake, Client key exchange (16):
      * SSLv3, TLS change cipher, Client hello (1):
      * SSLv3, TLS handshake, Finished (20):
      * SSLv3, TLS change cipher, Client hello (1):
      * SSLv3, TLS handshake, Finished (20):
      * SSL connection using ECDHE-RSA-RC4-SHA
      * Server certificate:
      *   subject: C=US; ST=California; L=Mountain View; O=Google Inc; CN=www.google.com
      *   start date: 2011-10-26 00:00:00 GMT
      *   expire date: 2013-09-30 23:59:59 GMT
      *   common name: www.google.com (matched)
      *   issuer: C=ZA; O=Thawte Consulting (Pty) Ltd.; CN=Thawte SGC CA
      *   SSL certificate verify ok.
      > GET / HTTP/1.1
      > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
      > Host: www.google.com
      > Accept: */*
      >
      < HTTP/1.1 302 Found
      < Location: https://www.google.co.jp/
      . . .

    I'm using Ubuntu 12.04, with Windows 7 installed as well. These sites work on Windows :( Not sure if this information helps, but I ran ifconfig and got the following:

      eth0  Link encap:Ethernet  HWaddr 1c:c1:de:bc:e2:4f
            inet6 addr: 2408:c3:7fff:991:686b:8d18:81b3:8dd1/64 Scope:Global
            inet6 addr: 2408:c3:7fff:991:1ec1:deff:febc:e24f/64 Scope:Global
            inet6 addr: fe80::1ec1:deff:febc:e24f/64 Scope:Link
            UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
            RX packets:87075 errors:0 dropped:0 overruns:0 frame:0
            TX packets:54522 errors:0 dropped:0 overruns:0 carrier:0
            collisions:0 txqueuelen:1000
            RX bytes:78167937 (78.1 MB)  TX bytes:10016891 (10.0 MB)
            Interrupt:46 Base address:0x4000

      eth1  Link encap:Ethernet  HWaddr ac:81:12:0d:93:80
            inet6 addr: fe80::ae81:12ff:fe0d:9380/64 Scope:Link
            UP BROADCAST MULTICAST  MTU:1500  Metric:1
            RX packets:0 errors:0 dropped:0 overruns:0 frame:498
            TX packets:0 errors:26 dropped:0 overruns:0 carrier:0
            collisions:0 txqueuelen:1000
            RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
            Interrupt:17

      lo    Link encap:Local Loopback
            inet addr:127.0.0.1  Mask:255.0.0.0
            inet6 addr: ::1/128 Scope:Host
            UP LOOPBACK RUNNING  MTU:16436  Metric:1
            RX packets:630 errors:0 dropped:0 overruns:0 frame:0
            TX packets:630 errors:0 dropped:0 overruns:0 carrier:0
            collisions:0 txqueuelen:0
            RX bytes:39592 (39.5 KB)  TX bytes:39592 (39.5 KB)

      ppp0  Link encap:Point-to-Point Protocol
            inet addr:180.57.228.200  P-t-P:118.23.8.175  Mask:255.255.255.255
            UP POINTOPOINT RUNNING NOARP MULTICAST  MTU:1492  Metric:1
            RX packets:39631 errors:0 dropped:0 overruns:0 frame:0
            TX packets:22391 errors:0 dropped:0 overruns:0 carrier:0
            collisions:0 txqueuelen:3
            RX bytes:43462054 (43.4 MB)  TX bytes:2834628 (2.8 MB)

    Read the article

  • What Do Your Customers Want in an Online Experience?

    - by Christie Flanagan
    In a time where customers have an increasing number of choices and an increasing level of control over their relationships with brands, what matters most is engagement. In order to engage your customers online, you need to provide them with a relevant, interactive and multichannel experience. Check out this video to see the kind of engaging online experience that Oracle WebCenter can power for your customers. Want to learn more? Visit our Connected Customer Experience Resource Center to:
    - See a demonstration of how easy it is for marketers and other non-technical business users to create and manage online experiences like the one above with Oracle WebCenter Sites
    - Hear Ancestry.com describe how they use Oracle WebCenter Sites to deliver an online experience that converts site visitors into customers and keeps them coming back to learn more about their family histories
    - Hear what analysts are saying about the exciting new and enhanced web experience management capabilities in Oracle WebCenter Sites 11g

    Read the article

  • What is /etc/apache2/sites-available used for and is it necessary?

    - by Mariane
    I have 3 sites, each with a specific IP, running on Apache 2 (up-to-date Ubuntu). To put a site online, I just created a file in /etc/apache2/sites-enabled, and in this file I told Apache which directory was the root directory for the site and to which IP it should correspond. So I have 000-default, 001-www.lapf.eu, 002-www.felkin.info and 003-www.seidhr.fr in this directory. My first site, lapf, suddenly lost contact with its database after the domain name was transferred from another registrar to the registrar who is also hosting the site's data. Then I did an update, I reinstalled mysql-server and mysql-common, and I did I-have-forgotten-what to reinstall the locales (utf8 and such) which had vanished for some reason. This fixed my first site. Now I have noticed that the other 2 sites are offline; pointing a browser at them just hangs until timeout. They used to work, and their domain names did not move; they are still registered at the same place. The files are still in /etc/apache2/sites-enabled. I noticed another directory, /etc/apache2/sites-available, with just default and default.ssl in it. Why are there 2 directories, sites-enabled and sites-available? Should I copy the files from "sites-enabled" into "sites-available"? Or should I put a modified version of each in "sites-available"? Output of "apache2ctl -S":

      VirtualHost configuration:
      92.243.20.169:80  Charlotte (/etc/apache2/sites-enabled/001-www.lapf.eu:1)
      92.243.21.141:80  xvm-21-141.ghst.net (/etc/apache2/sites-enabled/002-www.felkin.info:1)
      92.243.4.114:80   xvm-4-114.ghst.net (/etc/apache2/sites-enabled/003-www.seidhr.fr:1)
      wildcard NameVirtualHosts and default servers:
      *:80  is a NameVirtualHost
            default server Charlotte (/etc/apache2/sites-enabled/000-default:1)
            port 80 namevhost Charlotte (/etc/apache2/sites-enabled/000-default:1)
      Syntax OK
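
    For context on the two directories the question asks about: on Debian/Ubuntu, sites-available is where virtual host files normally live, and sites-enabled holds symlinks to the ones Apache should actually load; a2ensite and a2dissite manage those links. A small sketch, in Python purely for illustration, that lists what is enabled and whether each entry is a symlink back into sites-available (paths are the stock Ubuntu ones):

      import os

      AVAILABLE = "/etc/apache2/sites-available"
      ENABLED = "/etc/apache2/sites-enabled"

      # On a stock layout, entries in sites-enabled are symlinks into
      # sites-available; plain files (as in the question) also work, but
      # a2ensite/a2dissite cannot manage them.
      for name in sorted(os.listdir(ENABLED)):
          path = os.path.join(ENABLED, name)
          if os.path.islink(path):
              print(f"{name}: symlink -> {os.path.realpath(path)}")
          else:
              print(f"{name}: plain file (not managed by a2ensite)")

      print("available but not enabled:",
            sorted(set(os.listdir(AVAILABLE)) - set(os.listdir(ENABLED))))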

    Read the article

  • Add site to trusted sites through GPO

    - by Matt Bear
    I need to add a site to the trusted sites on all computers in my domain. I can do it with the "site to zone assignment list"; however, when I do, it locks trusted sites on the client computers ("this setting is managed by your administrator"). What I need is a way to add the site, make it persistent, and not affect the users' ability to add trusted sites of their own. (It's a development environment; sites are created and tested regularly, and the users need that ability.)

    Read the article

  • Sending parameters to other sites

    - by moustafa
    See this first: http://stackoverflow.com/questions/2883338/how-can-i-send-a-date-from-one-site-to-other-sites. Let me change the question a bit; I didn't really explain myself properly. What I intend to do is get z.php to read a text file called 'sites.txt' which has a list of sites: site1.com/a.php, site2.com/b.php, site3.com/c.php. To execute the URLs listed in 'sites.txt', I want to go through siteA.com/z.php?ip=xxx.xxx.xx.xxx&location=UK (z.php will then read 'sites.txt'). All sites in the 'sites.txt' file will then be executed as site1.com/a.php?ip=xxx.xxx.xx.xxx&location=UK, site2.com/b.php?ip=xxx.xxx.xx.xxx&location=UK and site3.com/c.php?ip=xxx.xxx.xx.xxx&location=UK. I hope that makes more sense; I have tried looking around but couldn't find what I was looking for. Thanks for your help so far, everyone.
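
    A sketch of the flow z.php is meant to implement, written here in Python purely to illustrate the logic (the file name sites.txt and the ip/location parameters are the asker's; everything else is assumed): read the list of URLs and request each one with the query string appended.

      from urllib.parse import urlencode
      from urllib.request import urlopen

      def notify_sites(ip: str, location: str, list_path: str = "sites.txt") -> None:
          """Call every URL listed in sites.txt with ?ip=...&location=... appended."""
          query = urlencode({"ip": ip, "location": location})
          with open(list_path) as fh:
              for line in fh:
                  url = line.strip()
                  if not url:
                      continue
                  if not url.startswith("http"):
                      url = "http://" + url
                  with urlopen(f"{url}?{query}", timeout=10) as resp:
                      print(url, "->", resp.status)

      # notify_sites("203.0.113.7", "UK")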

    Read the article

  • Best way to setup multiple sites' emails in my Gmail

    - by John
    I've a dozen sites and I want all of their emails to come to one Gmail ID, and I want to reply centrally from Gmail only. I've also added all of those addresses to the "send email as:" list in Gmail. I could add email forwarders in my cPanel, but in that case I won't be able to send email from addresses whose inboxes haven't been created (for example [email protected]). If I create the email account, then I'd receive emails in that inbox as well as forwarded (by the forwarder) to my Gmail ID. Otherwise I could set up Gmail for my domain, but for a dozen addresses I'm not sure if that would be fine. I see at http://www.google.com/enterprise/apps/business/pricing.html that it is free for up to 10 accounts. But then, to send email from the web hosting, the PHP code will need SMTP login details, and leaving my important Gmail account details in my web hosting account is very risky given that my sites have been compromised twice. What is the best way to centralize all my emails so that I can read/reply/search from a single place?
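
    On the SMTP point: the credential the PHP code holds does not have to be the main Gmail password, since sending can go through smtp.gmail.com with a dedicated or app-specific login. A minimal sketch in Python (the addresses and the credential are placeholders, not a recommendation of any particular setup):

      import smtplib
      from email.message import EmailMessage

      SMTP_USER = "you@example.com"            # placeholder account
      SMTP_PASS = "dedicated-or-app-password"  # the secret that sits on the web host

      msg = EmailMessage()
      msg["From"] = "noreply@site1.example"
      msg["To"] = "customer@example.org"
      msg["Subject"] = "Hello from site1"
      msg.set_content("Sent through Gmail's SMTP relay.")

      with smtplib.SMTP("smtp.gmail.com", 587, timeout=30) as server:
          server.starttls()                    # upgrade the connection to TLS
          server.login(SMTP_USER, SMTP_PASS)
          server.send_message(msg)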

    Read the article

  • Best CMS for review-type sites

    - by Pru
    Is there an ideal CMS for making a review site? By review site, I mean like a restaurant review site where you have each entry belonging to different major categories like Cuisine and City. Then users can browse and filter by each or by combination (Chinese Food in Los Angeles, with suggestions of other Chinese restaurants in LA, etc). Furthermore, I'd want it to support other fields like price, parking, kid-friendliness, etc. And to have users be able to filter by those criteria. I've been told that with a combination of custom taxonomies, plug-ins and many clever little queries, that Wordpress 3.x can handle this. But I'm having a heck of a time with it getting into the nitty gritty, and that's where I find the community support is lacking. The sort of stuff you'd think would work in WP, like making one parent category for Cuisine and one for City, don't really work once you get further in and start trying to pull it all together. Then you find these blog posts where people say, "This example shows that one could create a huge movie review site using custom taxonomies..." but when you go and try it you hit all sorts of challenges and oddities that point a big long finger at Wordpress being in fact a blogging platform. The best I came up with was one category for the cuisine and one tag for the city, then I created a couple of custom tag-like taxonomies for the other features. It's quite a mess to try to figure out how to assemble all of that into a natural, intuitive site. I expect a few versions down the road WP will be able to do these sorts of sites out of the box. So I thought I'd take a step back before I run back into the Wordpress fray and find out if maybe there is another platform better suited to this sort of relational content site. Directory scripts in some ways offer many of the features I'm looking for, but I need something more flexible and, hopefully, interactive (comments, reviews). I'm especially looking for feedback from people who've crafted sites like this. Thanks!

    Read the article

  • Identifying elements from data feeds generated by affiliate sites

    - by SPI
    I am working with data feeds from affiliate sites. The basic idea is to provide an interface where the user can paste a link to an XML data feed (these are huge, by the way, around 60 MB) which would then be streamed, parsed into small chunks, and mined for the required data, which would then be stored in the database. The problem is that different affiliate sites have different schemas for their XML. It is a little hard mapping the elements in an XML file to your database attributes when you don't actually know which element contains what. My solution: use XPath to traverse the first set of parents and their descendants, fetch the elements as well as the data, and ask the user to map this data to the attributes in the database by selecting from a set of radio buttons that represent the attributes from the database. This will be done just once for each new feed; once the system knows what's what, it will automatically upload the data from the XML to the database. Does this sound viable? Is there a better solution? I realize this leaves an uncomfortable opening for human error. Thanks.
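
    A sketch of the streaming half of that solution using Python's standard library: iterparse walks the large feed incrementally, the child tags of the first record element become the candidates offered to the user for mapping, and parsed elements are cleared so the ~60 MB file is never held in memory at once. The record tag name "product" and the example mapping are assumptions, since every affiliate schema differs.

      import xml.etree.ElementTree as ET

      def discover_fields(feed_path: str, record_tag: str = "product") -> list[str]:
          """Return the child tag names of the first record element in the feed."""
          for _event, elem in ET.iterparse(feed_path, events=("end",)):
              if elem.tag == record_tag:
                  return [child.tag for child in elem]
          return []

      def stream_records(feed_path: str, mapping: dict[str, str],
                         record_tag: str = "product"):
          """Yield dicts keyed by database attribute names, per the user's mapping."""
          for _event, elem in ET.iterparse(feed_path, events=("end",)):
              if elem.tag == record_tag:
                  yield {db_col: elem.findtext(feed_tag, default="")
                         for feed_tag, db_col in mapping.items()}
                  elem.clear()  # free the element so memory use stays flat

      # mapping chosen by the user from the radio-button UI, e.g.:
      # for row in stream_records("feed.xml", {"name": "title", "saleprice": "price"}):
      #     insert_into_db(row)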

    Read the article

  • Programmatically Starting and Stopping FTP Sites in IIS 7 and IIS 8

    - by The Official Microsoft IIS Site
    I was recently contacted by someone who was trying to use Windows Management Instrumentation (WMI) code to stop and restart FTP websites by using code that he had written for IIS 6.0; his code was something similar to the following:

      Option Explicit
      On Error Resume Next
      Dim objWMIService, colItems, objItem
      ' Attach to the IIS service.
      Set objWMIService = GetObject( "winmgmts:\root\microsoftiisv2" )
      ' Retrieve the collection of FTP sites.
      Set colItems = objWMIService.ExecQuery( "Select...(read more)

    Read the article

  • Example sites which use UCC certificates

    - by Brian
    Can anyone point me to a few sites that make use of UCC (SAN) certificates? I tried to search for this but found a lot of information about UCC certificates without any examples. As a sanity check before buying/configuring a UCC certificate, I wish to do some basic testing to determine exactly how the certificate will look in different browsers. Yes, I realize I could just use makecert instead. I would rather just look at them in the wild.
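
    One way to "look at them in the wild" without guessing: pull a live site's certificate and print its subjectAltName entries, which is where a UCC/SAN certificate lists its additional host names. A small sketch using Python's standard ssl module (the host below is only an example; substitute any site known to use a UCC certificate):

      import socket
      import ssl

      def san_entries(host: str, port: int = 443) -> list[str]:
          """Return the DNS names in a host's certificate subjectAltName extension."""
          ctx = ssl.create_default_context()
          with socket.create_connection((host, port), timeout=10) as sock:
              with ctx.wrap_socket(sock, server_hostname=host) as tls:
                  cert = tls.getpeercert()
          return [value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"]

      print(san_entries("www.google.com"))  # a UCC/SAN cert will show several names here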

    Read the article

  • Google Web Fonts v2 brings new typefaces that are easy to embed in web sites

    Google Web Fonts v2 brings new typefaces that are easy to embed in web sites. Following the presentation of its new social network Google+ and the update to its search engine's user interface, Google has updated its Google Fonts API and the Google Web Fonts directory of web fonts. Now available as a final release, Google Web Fonts v2 adds new web typefaces as well as a new interface for quickly previewing how they render on sample phrases. By...

    Read the article

  • The Best Way to Build Backlinks - A List of 36 Sites to Get Backlinks

    Every webmaster understands the importance of backlinks. We need backlinks to rank our sites higher in Google and other search engines. Search engines count the number of backlinks for a web page and assign it a rank in search results. Hence, every webmaster always looks to get as many backlinks as possible. In this article I explain a few free methods of getting links.

    Read the article

  • Web Sites to Accommodate New Technology

    Popularity has always been the driving force of success throughout the history of mankind. Creating popularity on Twitter, Facebook and other social networking sites opens the door for money and for skilled manipulators to sell their wares.

    Read the article

  • Web Site Search Engines - Sending Your Site to Search Engine Sites

    Search engines are the number one cost-effective approach to marketing your business and web site. Studies indicate that the vast majority of viewers find web sites via the leading search engines and directories. A quality listing on a leading search engine or directory can drive targeted traffic to your website and improve your business in a short period of time.

    Read the article

  • When Using Social Networking Sites Exercise Caution

    With more people using social networking sites, there is also an increase in the various threats people may encounter online. Unfortunately, population masses tend to attract people with less than nobl... [Author: TJ Philpott - Computers and Internet - April 14, 2010]

    Read the article

  • Creating an advanced website by redirecting and replacing content from Google Sites

    - by David
    I would like to create a corporate website with a members area. Importantly, I want many novice web admins to be able to modify the static content themselves. Therefore, I got the idea to create the site using Google Sites and to insert elements with a given width and height in the places where I want dynamic content. The website would be read by PHP on a different server, and the marker elements would be replaced with dynamic content generated by PHP. What would be the drawbacks of this approach?
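
    A sketch of the described pipeline, in Python rather than PHP purely to illustrate the flow: fetch the published Google Sites page, find the placeholder elements, and splice in dynamically generated content before serving the result. The marker convention (an element whose id starts with "dyn-") and the URL are assumptions; the question only says that elements with a given width and height mark the dynamic slots.

      import re
      from urllib.request import urlopen

      GOOGLE_SITES_URL = "https://sites.google.com/view/example-company"  # placeholder

      def render_page(dynamic_blocks: dict[str, str]) -> str:
          """Fetch the static Google Sites page and replace each marker element.

          dynamic_blocks maps a marker id (e.g. "members-area") to the HTML that
          the membership layer generates for that slot.
          """
          html = urlopen(GOOGLE_SITES_URL, timeout=15).read().decode("utf-8")
          for marker_id, replacement in dynamic_blocks.items():
              # Assumed convention: the marker is a div whose id is "dyn-<marker_id>".
              pattern = rf'<div[^>]*id="dyn-{re.escape(marker_id)}"[^>]*>.*?</div>'
              html = re.sub(pattern, replacement, html, flags=re.DOTALL)
          return html

      # page = render_page({"members-area": "<div>Welcome back, Alice!</div>"})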

    Read the article

  • Why Internet Predators Love Social Network Sites

    Internet predators have become a fixture of sorts on many social media sites, which makes it necessary for users to exercise caution. Since the advent of the internet, 'instances' of cyber crime hav... [Author: TJ Philpott - Computers and Internet - May 26, 2010]

    Read the article

  • Restricting access to sites with squid and elinks

    - by Rexxar
    I want to block the Yahoo sites in elinks (www.yahoo.com and all its subdomains, fr.yahoo.com etc.). I tried with squid (squid.conf):

      acl Badsites dstdomain .yahoo.com
      http_acces deny Badsites

    and I wrote in elinks.conf:

      set.protocol.http.proxy.host = "proxy.host:3128"
      set.protocol.http.proxy.user = ""
      set.protocol.http.proxy.passwd = ""

    and it doesn't work. It tells me "Host not found" on every site I want to visit. Do you have any idea why it works that way, and can you tell me a solution?

    Read the article
