Search Results

Search found 29495 results on 1180 pages for 'cross site scripting'.

  • Blocking a country (mass IP ranges): best practice for the actual block

    - by kwiksand
    Hi all, This question has obviously been asked many times in many different forms, but I can't find an actual answer to the specific plan I've got. We run a popular European commercial deals site, and are getting a large amount of incoming registrations/traffic from countries that cannot even take part in the deals we offer (and many of the retailers aren't even known outside Western Europe). I've identified the traffic I need to block, but (as expected) thousands of IP ranges are required. My question now (finally!): on a test server, I created a script to block each range within iptables, but adding the rules took a very long time, and iptables was unresponsive afterwards (especially when attempting an iptables -L). What is the most efficient way of blocking large numbers of IP ranges: iptables? A plugin where I can preload them efficiently? hosts.deny? .htaccess (nasty, as I'd be running it in Apache on every load-balanced web server)? Cheers
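
    A hedged sketch of one common approach, assuming the ipset package and matching kernel support are available (the set name and file are made up; the set type is spelled nethash in older ipset releases and hash:net in newer ones). As a side note, iptables -L without -n reverse-resolves every address, which by itself can make listing thousands of rules appear to hang:

        # create a set sized for thousands of CIDR ranges
        ipset create geoblock hash:net
        # bulk-load via 'ipset restore', far faster than one
        # iptables rule per range; -! ignores duplicate entries
        sed 's/^/add geoblock /' ranges.txt | ipset -! restore
        # a single iptables rule then matches the entire set
        iptables -I INPUT -m set --match-set geoblock src -j DROP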

  • Curl authentication

    - by Jack Humphries
    I am trying to download a file with cURL from a password-protected directory on my site. It is not working: instead of downloading the requested file, it downloads an HTML file that says, "Authentication Required!" I'm not sure what the problem is. The username and password are correct (if the link below is used in a web browser, the file downloads successfully), and I've tried both of these, with the same result:
    1) The username and password included as part of the URL:
    curl https://username:[email protected]/auth/file.dmg --O /file.dmg;
    2) The username and password included as an option:
    curl -u username:wordpass.1 https://www.example.com/auth/file.dmg --O /file.dmg;
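
    A hedged troubleshooting sketch rather than a confirmed fix: --O is not actually a valid curl option (-O keeps the remote filename, -o <file> chooses one), and if the directory uses a scheme other than basic auth, letting curl negotiate it may help:

        # -v prints the WWW-Authenticate challenge the server really sends
        # --anyauth lets curl pick basic/digest/NTLM to match it
        # -L follows any redirect instead of saving the page it points at
        curl -v -L --anyauth -u username:wordpass.1 \
             -o /file.dmg https://www.example.com/auth/file.dmg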

  • SQL Server DBA - How to get a good one!

    - by ETFairfax
    I'm a lone developer, currently developing an application which is seeing me get way, way out of my depth when it comes to SQL Server DBA work, and I've come to realise that I should hire a DBA to help me (which has full support from the company). Problem is: who? This SO thread sees someone hire a DBA only to realise that they would probably cause more harm than good! Also, I have just had a bad experience with an ASP.NET/C# contractor that let us down. So, can anyone out there on SO either: a) offer their services, b) forward me on to someone that could help, or c) give some tips on vetting a DBA? I know this isn't a recruitment site, so maybe some good answers for c) would be a benefit to other readers! BTW: the database is SQL Server 2008. I'm running into performance issues (mainly timeouts) which I think would be sorted out by some proper indexing. I would also need the DBA to provide some sort of maintenance plan, and to review how our database will deal with what we intend to throw at it in the future!

  • Finding how a hacked server was hacked

    - by sixtyfootersdude
    I was just browsing through the site and found this question: My server's been hacked EMERGENCY. Basically the question says: my server has been hacked; what should I do? The best answer is excellent, but it raised some questions in my mind. One of the steps suggested is to: examine the 'attacked' systems to understand how the attacks succeeded in compromising your security. Make every effort to find out where the attacks "came from", so that you understand what problems you have and need to address to make your system safe in the future. I have done no sysadmin work, so I have no idea how I would start doing this. What would be the first step? I know that you could look in the server log files, but as an attacker the first thing I would do would be erasing the log files. How would you "understand" how the attacks succeeded?
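
    A minimal first-steps checklist, assuming a Linux host; this is a sketch, not a forensics procedure, and it is best run from a live CD or against a disk image, since an intruder may have tampered with logs and binaries:

        last -f /var/log/wtmp     # recent logins (lastb shows failures)
        ps auxwf                  # process tree: anything unexpected?
        netstat -tulpn            # listening sockets and owning processes
        find / -mtime -2 -type f  # files changed around the incident window
        ls -la /tmp /var/tmp /dev/shm   # common drop locations for tools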

  • How do I get a more recent version of Java on my Mac than shows up in Software Update?

    - by Cd Lolly
    I need at least Java 1.6 to run a program that someone else in my lab wrote. The Java website tells me to update Java via Apple's Software Update; I've run this a few times, but it only got up to Java 1.5.0_24, and it now says no more updates are available for my computer. Is there another way to update Java on a Mac? Is my operating system maybe too old for Java 1.6? I'm not sure what I'm running exactly, and I can't find a list of which Mac operating systems run which versions of Java, because the Java site just suggests using Mac's Software Update.
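
    A small sketch for finding out what you are actually running (standard Terminal commands). One relevant fact: Apple shipped Java 6 only for Intel Macs on OS X 10.5 or later, and on 10.5 only for 64-bit machines, so PowerPC or 10.4-era systems top out at Java 5:

        sw_vers -productVersion   # OS X version
        uname -m                  # hardware architecture
        java -version             # the Java version currently active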

  • Massive crawling requests from Google App Engine user agent

    - by SilentPlayer
    Hi friends, I'm badly affected by the 'Google AppEngine-Google' user agent, receiving 5-6 requests per second on my HTTP server. This bot is crawling my site just like GoogleBot does. The following is a sample line from my access logs:
    72.14.192.3 - - [19/May/2010:01:27:06 +0000] "GET /some-url/etc-123.htm HTTP/1.1" 200 4707 "-" "AppEngine-Google; (+http://code.google.com/appengine; appid: harpy000)"
    I have checked the IP address; it is registered to Google Inc. Can anyone tell me where I can report abuse to Google Inc., or give any information about this issue? Thank you!
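
    The appid in that user-agent string (harpy000) identifies the offending application, which is worth including in any abuse report. While waiting on Google, a hedged stopgap is to refuse the user agent at the web server; a sketch for Apache 2.2 with mod_setenvif, to be adapted to whatever server is actually in use:

        SetEnvIfNoCase User-Agent "AppEngine-Google" appengine_bot
        <Location />
            Order Allow,Deny
            Allow from all
            Deny from env=appengine_bot
        </Location>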

  • Issues with FTP on my server

    - by homestead
    I installed FTP on my Windows 2003 box and created an FTP site on an IP address; I am not allowing anonymous access. I created a user that has access to the folder c:\ftptest. Connecting to my server using FileZilla shows:
    connecting to 0.0.0.0
    connection established, waiting for welcome message
    could not connect to server
    I tried both active and passive modes. The port on the server is open, i.e. TCP 21. I can connect to other FTP sites, so my local firewall isn't the issue. (Now I know why sysadmin work is so fun!)

  • Using TrueCrypt with File Replication on Windows Server

    - by neildeadman
    We have a few folders that are set to replicate to a DR file server off-site. One of these folders contains a file that is a TrueCrypt volume container. When this file is mounted in TrueCrypt, the file won't replicate (fair enough!). I'm looking at alternatives to improve this situation. One solution I currently have is a scheduled task to unmount the volume; then every morning, as the volume is needed, someone mounts it again. This is slightly painful because the password is known by only a few people (I'm not one, and neither are the colleagues who would be performing the mounting operation), so we'd need to continually get them to come over and type it in. The other idea I had was to have one TrueCrypt container on each server and replicate the contents while they are mounted, but I wasn't able to get TrueCrypt to see the mounted volume, so I guess this is a no-go. Any other solutions I have missed, or a fix for the above?
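
    A hedged sketch of automating the evening dismount with TrueCrypt's command-line switches plus the Task Scheduler (paths, drive letter, and times are made up for illustration; the morning mount still needs the password typed interactively by whoever knows it):

        :: dismount volume X: each evening before replication runs
        schtasks /create /tn "Dismount vault" /sc daily /st 19:00 ^
                 /tr "\"C:\Program Files\TrueCrypt\TrueCrypt.exe\" /d X /f /q"
        :: morning mount, prompting for the password, then exiting
        "C:\Program Files\TrueCrypt\TrueCrypt.exe" /v D:\share\vault.tc /l X /q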

  • What is the easiest way to get XtraDB for MySQL running on CentOS 5

    - by Jeremy Clarke
    I'm having a lot of issues with a dedicated MySQL server and it seems like upgrading to the XtraDB version of InnoDB will probably have a positive effect, but I'm hesitant to get involved with it since I am not really a sysadmin and prefer to stick with things that start with "yum update". What is the easiest way to get XtraDB installed? Should I use the Percona server? MariaDB? OurDelta? Is there a way to avoid using custom RPMs and sticking to a repo instead? The current yum version of MySQL is 5.0.xx, whereas a lot of the alternate MySQL builds are based on 5.1.xx. How does this factor in? Do I need to figure out 5.1 on CentOS before working on getting XtraDB in? For bonus points: Do I need to seriously test XtraDB with my server before implementing it, or is it relatively safe to have the brief downtime for switching servers followed by putting the site back online with XtraDB?
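
    For what it's worth, Percona has published yum repositories for CentOS that carry XtraDB as part of Percona Server, which keeps things close to "yum update" territory. A sketch under that assumption; the repo RPM and exact package names change over time, so verify both against Percona's documentation before running anything:

        # hypothetical repo package name; get the current one from percona.com
        rpm -Uvh percona-release-0.1-3.noarch.rpm
        # Percona Server 5.1 with XtraDB, as packaged for RHEL/CentOS 5
        yum install Percona-Server-server-51 Percona-Server-client-51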

  • How to write PowerShell code part 3 (calling an external script)

    - by ybbest
    In this post, I'd like to show you how to call an external script from a PowerShell script. I'd like to use the site creation script as an example. You can download the script here. 1. To call the external script, you first need to grab the script path. You can do so by calling $scriptPath = Split-Path $myInvocation.MyCommand.Path to grab the current script path; you can then use this to build the path to your external script.
        $scriptPath = Split-Path $myInvocation.MyCommand.Path
        $ExternalScript = $scriptPath + "\CreateSiteCollection.ps1"
        $configurationXmlPath = $scriptPath + "\SiteCollection.xml"
        [xml] $configurationXml = Get-Content $configurationXmlPath
        & "$ExternalScript" $configurationXml
        Write-Host
    2. If you'd like to pass in any parameters, you need to define your script parameters in param () at the top of the called script, separating each parameter with a comma (,); when calling the script, you separate the arguments with spaces, not commas.
        #Pass in the Parameters.
        param ([xml] $xmlinput)

  • How can I correlate a wall jack to a user/machine on the domain?

    - by harryfino
    After reading Valve's new employee handbook, I was really interested in setting up a company map like the one they describe on page 6: "The fact that everyone is always moving around within the company makes people hard to find. That’s why we have http://user — check it out. We know where you are based on where your machine is plugged in, so use this site to see a map of where everyone is right now." What I'm trying to figure out is: how can I tell which machine or domain user (either will do) is connected to a particular wall jack?
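
    One possible approach, stated as an assumption about the setup rather than anything from the handbook: managed switches expose their MAC-address-to-port table over SNMP via BRIDGE-MIB, and switch ports map to wall jacks through the patch-panel records. A sketch with net-snmp, using a made-up switch name and the default community string:

        # dot1dTpFdbPort: learned MAC addresses -> switch port numbers
        snmpwalk -v2c -c public switch01 1.3.6.1.2.1.17.4.3.1.2
        # match MACs to machines/users via DHCP leases or AD logon events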

  • Basic IIS7 permissions question

    - by Tom Gullen
    We have a website with a file: www.example.com/apis/httpapi.asp. This file is used by the site internally to make requests joining two systems on the website together (one is Classic ASP, the other ASP.NET). However, we do not want the public to be able to access the file. In IIS 7.5, is there a setting that would make this file internal-only? I've tried rewriting the URL for it, but the rewrite is also applied internally, so the scripts stop working as they fetch the rewritten URL. Thanks for any help!
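
    One hedged option, assuming the internal requests originate from the web server itself and the IIS "IP and Domain Restrictions" feature is installed: scope an IP restriction to just that file, so only loopback may fetch it. An untested sketch with appcmd (adjust the site name and path):

        %windir%\system32\inetsrv\appcmd.exe set config ^
          "Default Web Site/apis/httpapi.asp" ^
          -section:system.webServer/security/ipSecurity ^
          /allowUnlisted:false /+"[ipAddress='127.0.0.1',allowed='true']" ^
          /commit:apphost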

  • HTTP 401 error in Windows Authentication disappears after swapping Providers

    - by Ray Cheng
    IIS 7 on Windows 2008 R2 is acting really weird. We deploy our web apps as web sites with appcmd.exe. After they are deployed, if I browse to them, I get HTTP 401 errors. The web sites have only Windows Authentication enabled, and the providers are Negotiate and NTLM, in that order. But if I swap the providers, the HTTP 401 error goes away, and even if I swap them back, the errors are still gone. So the order of the providers doesn't seem to matter; what matters is the swapping. It must trigger something. Even if we delete the web site and application pool and reinstall the web sites, the errors are still gone. So far we can't reproduce it easily, since it happens randomly. Has anyone experienced this? How do I go about troubleshooting it?

  • WouldISurviveANuke Assesses Your Distance From Nuclear War Strike Sites

    - by Jason Fitzpatrick
    WouldISurviveANuke is a morbid Google Maps mashup that plots out the effective radius of nuclear weapons on major metropolitan areas, your distance from them, and your chances of survival. Visit the site, plug in your zip code, and set the parameters (how big a nuclear weapon, and how large the nearest target city needs to be) to find out if you're in the blast radius. We plugged in a downtown address in Detroit, MI. The verdict? Neither we nor the cockroaches will be coming out alive. If you plug in a location far enough away from the direct blast radius, you'll also get a quality-of-life report that spells out the effects of a local nuclear strike. As far as startling anti-nuclear-proliferation arguments go, WouldISurviveANuke is an effective and interactive demonstration. Hit up the link below to try it out. WouldISurviveANuke [via Y! Tech]

  • Active DFS node did not restore after failure

    - by Mark Henderson
    On Tuesday we had a Server 2008 R2 DFS-R node go offline unexpectedly. DFS did the right thing and started routing requests to a different node, which was in a remote site. This is by design, because even though it's slow, at least it's still working. We had the local DFS-R node back online within an hour, and it had synced all its changes 10 minutes after that. Three of the five terminal servers reset themselves to the local DFS node, but the other two stayed pointing at the remote DFS node for three days, until someone finally piped up about how slow requests were. What reasons could there be why some, but not all, of the servers reverted? Is the currently active DFS node for a namespace exposed anywhere in the OS (WMI, or even scripts) so that we can monitor the active nodes?
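
    A hedged pointer rather than a confirmed answer: the DFS client keeps a referral cache that records which target it is currently using, and that cache is queryable from a script via dfsutil (part of the file services admin tools; the older dfsutil /pktinfo shows similar output on 2003-era clients):

        REM lists cached referrals; the target in use is marked ACTIVE
        dfsutil cache referral
        REM poll this from a monitoring script and alert when the
        REM remote site's target shows up as the active one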

  • Update OCZ Vertex LE capacity via firmware update

    - by Ben Voigt
    I have an OCZ Vertex LE 100GB drive. It's actually 128GiB of NAND flash, with a whopping 28%+ reserved for write combining; most 128GiB drives are ~115GB usable (and marketed as 120GB or 128GB). There were rumors that the reserved fraction could be decreased on OCZ 100GB drives. Can anyone provide a link to firmware that does that, or an official statement that no such firmware exists? (NB: I recently installed the 1.24 firmware from the OCZ site; it didn't affect the capacity, possibly because the rumors say the capacity change is destructive to existing content.) Of possible interest: flashing the firmware was more of a pain than it should have been. The tool didn't detect the disk until I booted an older Windows install off a secondary hard disk; I suspect the Intel SATA driver is the issue and the tool only works with the msahci.sys driver.

  • Is there a way to get a larger desktop than the screen?

    - by Cajuntechie
    Is there any freeware out there that will allow me to run a desktop that is larger than the resolution of the screen? My situation: Windows XP, Dell Mini 10, IIRC an Intel GMA950. I need to use a netbook for one specific web site (with Firefox) that doesn't render properly if the screen isn't at least 1024x768. It is a netbook, so the resolution is 1024x600. Is there anything out there that will allow me to use a 1024x768 desktop on this smaller screen WITHOUT shrinking the desktop to fit? I want to pan around. Thanks! Cajuntechie

  • Find BIOS update for HP Pavilion xt966

    - by NitroxDM
    I installed FreeNAS on an HP Pavilion xt966. Every three seconds or so, a message comes up on the console: acpi_tz0: _TMP value is absurd, ignored (-270.2C). From what I have been able to find, this is because of a bug in the BIOS, but I have yet to find an update for it; HP's site only shows drivers. FreeNAS is built on BSD. Anyone know where to find a BIOS update for this unit?
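
    A workaround sketch rather than a BIOS fix, assuming shell access to the FreeNAS box: FreeBSD's acpi(4) driver lets you disable just its thermal-zone component, which is what logs the absurd _TMP reading:

        # takes effect on the next reboot
        echo 'debug.acpi.disabled="thermal"' >> /boot/loader.conf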

  • Easy way to access cookies in Chrome

    - by macek
    To view specific cookies in Chrome, currently I have to:
    1. Go to Preferences
    2. Click the Under the Hood tab
    3. Click the Content settings... button
    4. Click the Cookies tab (if it's not already active)
    5. Click the Show cookies and other site data... button
    If I want to narrow this down to a specific domain, I have to type it in, too. Compare this to Firefox:
    1. View Page Info
    2. Click the Security tab
    3. Click View Cookies
    The domain for the page I'm currently on is already used as a filter, too. My question: is there an easier way in Chrome? I've done some searching for an extension but have come up with nothing. Any help is appreciated :)

  • Fix for poor HD playback on 11.04 upwards

    - by mark kirby
    Hi guys, I've seen loads of posts on this site about poor 720/1080p playback in recent Ubuntu versions. I had this problem and fixed it, so I thought I'd share it with everyone:
    1. Install MPlayer.
    2. Install the SMPlayer frontend (in the Software Center).
    3. Open SMPlayer.
    4. Go to Options > Preferences > General.
    5. If you have an NVIDIA card, set Output driver to VDPAU. (For ATI or AMD, choose xv (0 - ATI Radeon AVIVO video); I don't know if this will work, as my card is NVIDIA, but it should.)
    6. Go to Performance on the left-hand side and set both local and streaming cache to 99999 (this may also fix DVD playback if you set that cache as well).
    7. Check the box for Allow hard frame drop, and set Loop filter to skip only on HD.
    8. Set the threads-for-decoding option to the number of cores your CPU has; if you have more than one CPU, add up all the cores for best performance.
    9. Enjoy your HD movies again on Ubuntu.
    I have a pretty average machine; here's my spec: 2x Pentium 4 HT 3 GHz, stock Dell power supply and motherboard, GeForce 310 HDMI, 24-inch full HD TV as a monitor. So anyone with a dual-core CPU should have no problem getting this to work. Hope this helps someone out.
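
    For reference, a rough command-line equivalent of the GUI settings above, assuming an NVIDIA card with VDPAU support (SMPlayer just drives MPlayer underneath; the filename is made up):

        # VDPAU output plus VDPAU-accelerated H.264/MPEG-2 decoding,
        # large cache, two decode threads, hard frame drop (steps 5-8)
        mplayer -vo vdpau -vc ffh264vdpau,ffmpeg12vdpau, -cache 99999 \
                -lavdopts threads=2 -framedrop video-720p.mkv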

  • Use mod_rewrite or RedirectMatch to redirect oldfile.aspx?p=blah to newfile.php, ignoring ?p=blah

    - by Dan
    I've got a site with many incoming links to the old structure (gone for years), with tonnes of URL vars that are no longer relevant, as the database mappings were changed. So I'd like to redirect
    http://www.mysite.com/oldfile.aspx?p=1&c=2
    to:
    http://www.mysite.com/newfile.php
    without the query string at the end. The actual query string varies; there are hundreds of them, but since they don't match up to particular content anymore, I want to take people to the new index page for the content they're looking for, so they can find it from there. I currently use:
    RedirectMatch 301 ^/oldfile\.aspx$ /newfile.php
    This puts the query back on the end, though. Can someone let me know the voodoo recipe I need?
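
    A sketch of the usual fix: RedirectMatch always re-appends the original query string, but mod_rewrite can discard it, either with a bare trailing ? in the substitution or, on Apache 2.4+, with the QSD flag:

        RewriteEngine On
        # the trailing ? replaces the query string with nothing
        RewriteRule ^oldfile\.aspx$ /newfile.php? [R=301,L]
        # Apache 2.4+ alternative:
        # RewriteRule ^oldfile\.aspx$ /newfile.php [QSD,R=301,L]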

  • How does Azure memory usage work?

    - by Jed Grant
    I have a Windows Azure website. The dashboard shows me that I have used 1.51 GB of the 2 GB available per hour, and I keep increasing the number of instances available in the shared node so the site doesn't shut down. After each hour finishes, the memory usage still shows 1.51 GB used. I assumed this would start at zero and then be used up as time goes on, but that doesn't appear to be the case. How does server memory work here? What are some reasons my application is using this much memory? (I use no output caching and have generally just built off the basic MVC templates provided in Visual Studio.) What other considerations should I be making to decrease the amount of memory needed?

  • Cannot connect to the Internet with Internet Explorer

    - by user428368
    I was using Mozilla Firefox for browsing and it worked OK. But now, when I try to open any web site using IE8, it says "Internet Explorer cannot display the webpage" (and Firefox still works). I came across this problem because I wanted to install Google Earth, but it says that the program cannot access the server, so Google suggested I open one of three links in IE, and that doesn't work either. By the way, I'm connected to the Internet via LAN, but in IE > Internet Options > Connections there is nothing, not a single connection. People, please help!

  • Placing h2 and h3 tags around words in paragraphs

    - by sam
    If I have a page with an H1 heading and then just a long paragraph wrapped in p tags, is it OK to write the paragraph as below (with the h tags mixed into the paragraph) and just style it so it all looks the same, so that I get the benefit of using h2 and h3 tags? I'm aware this is not the 'proper' use of the h tags, as they're meant to be headings, but can I use them like this, given the site isn't built using multiple headings on the same page? (Please ignore over-optimization; this is just for illustrative purposes.)
    <h1>Red shoes</h1>
    <p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Phasellus id dui id mi consectetur tincidunt. Mauris at sem non urna congue eleifend sed quis nulla. Aenean nisl porta eget auctor vel, semper eget massa.</p>
    <h2>Red shoes</h2>
    <p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Phasellus id dui id mi consectetur tincidunt. Mauris at sem non urna congue eleifend sed quis nulla. Aenean nisl porta eget auctor vel, semper eget massa.</p>
    <h3>red shoes</h3>
    <p>Lorem ipsum.</p>

  • How do I make subsonic (media server) work with SSL?

    - by John Baber
    The roughly out-of-the-box setup as a regular user works fine (meaning the site appears at http://myserver.com:4040). From ps aux:
    java -Xmx100m -Dsubsonic.home=/var/subsonic -Dsubsonic.host=0.0.0.0 -Dsubsonic.port=4040 -Dsubsonic.httpsPort=0 -Dsubsonic.contextPath=/ -Dsubsonic.defaultMusicFolder=/var/music -Dsubsonic.defaultPodcastFolder=/var/music/Podcast -Dsubsonic.defaultPlaylistFolder=/var/playlists -Djava.awt.headless=true -verbose:gc -jar subsonic-booter-jar-with-dependencies.jar
    But just giving an HTTPS port:
    java -Xmx100m -Dsubsonic.home=/var/subsonic -Dsubsonic.host=0.0.0.0 -Dsubsonic.port=4040 -Dsubsonic.httpsPort=6060 -Dsubsonic.contextPath=/ -Dsubsonic.defaultMusicFolder=/var/music -Dsubsonic.defaultPodcastFolder=/var/music/Podcast -Dsubsonic.defaultPlaylistFolder=/var/playlists -Djava.awt.headless=true -verbose:gc -jar subsonic-booter-jar-with-dependencies.jar
    makes http://myserver.com:4040 say:
    HTTP ERROR: 404 NOT_FOUND
    RequestURI=/index.view
    Powered by jetty://
    and https://myserver.com:6060 say "Unable to connect". I'm only making the change by editing /etc/default/subsonic:
    # SUBSONIC_ARGS="--port=80 --https-port=443 --max-memory=120"
    SUBSONIC_ARGS="--max-memory=100 --https-port=6060"
    and issuing sudo service subsonic restart. (This is Ubuntu Oneiric.)
