Search Results



  • IIS7 binding to subdomain causing authentication errors

    - by Tommy Jakobsen
    I'm trying to bind an IIS web site to a subdomain, which is causing authentication errors. First I'll explain what I've done to set it up. This is the first time I've done this, so please correct me if I'm wrong.

    The web server is a stand-alone Windows Server 2008 R2 x64, running IIS7 with .NET Framework 4. I have the following A-records pointing to my server:

        server.mydomain.com
        *.server.mydomain.com

    So all subdomains of server.mydomain.com point to the server. In IIS7 I have a web site on port 8080, with a virtual directory (named virtual) that is using Windows Authentication. The web site has one binding, listening on all unassigned IP addresses, port 8080, with a host name of sub.server.mydomain.com.

    Now, shouldn't I be able to access the virtual directory through http://sub.server.mydomain.com/virtual? That is not working. However, I can access it through http://sub.server.mydomain.com:8080/virtual. But it won't let me authenticate using a Windows account (Server\Username), even though the same account works when I access the site through http://localhost:8080/virtual. What am I missing here?
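    Two things worth checking, sketched below (the site name is an assumption; the host names come from the question). A URL without an explicit port goes to port 80, so the site needs a second binding on port 80 before http://sub.server.mydomain.com/virtual can work at all; and Windows (Kerberos) authentication against a custom host name typically needs an SPN registered for that name:

        rem add an extra binding on port 80 for the host name
        %windir%\system32\inetsrv\appcmd set site "My Web Site" /+bindings.[protocol='http',bindingInformation='*:80:sub.server.mydomain.com']

        rem register an SPN so Kerberos tickets can be issued for the host name
        setspn -A HTTP/sub.server.mydomain.com SERVERNAME

    If Kerberos still fails after that, NTLM can be forced for testing by reordering the providers under Windows Authentication in IIS Manager.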

  • how to authenticate once for multiple servers, using only apache configs?

    - by Wang
    My problem is, I have a number of prepackaged web apps (a print system, a wiki, a bug tracker, an email archive, etc.) running on different Mac OS X Leopard (soon to be Snow Leopard) servers, and each needs to authenticate users from the internet at large. Right now every server presents an Apache basic authentication prompt, which takes a shared login, but it's apparently enough of an inconvenience to log in repeatedly that people are sending email without checking the wiki or bug tracker or archive. In the case of the bug tracker, a user might need to log in twice: once for Apache if he hasn't used any other protected service on that server, and once for the bug tracker itself so it can distinguish different people.

    Since the only common component to all these apps is Apache 2 itself, does it have any way of authenticating a user once, in some way that will be respected by other servers and various web apps? I looked at http://serverfault.com/questions/32421/how-is-session-stickiness-achieved-across-multiple-web-servers but it sounds like the answer assumes that I get to write my own web app. I also looked at Ian Bicking's blog, but it's four years old and recommends something available only for Apache 1.3, not Apache 2. Sorry not to hyperlink the second site; apparently I need 10 reputation points.

    Edit: Shibboleth does what I need, but I should have specified that I'm looking for a really dumb, really simple solution for in-house services that need to handle all of a dozen users, probably not more than three at a time.
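    For the "really dumb" end of the spectrum, the closest Apache-only approximation is one shared credentials file and one realm string on every server, roughly like this sketch (paths are assumptions; the htpasswd file could live on NFS or be rsync'd around):

        # same fragment on every server, e.g. /etc/apache2/other/shared-auth.conf
        <Location />
            AuthType Basic
            AuthName "Intranet Services"           # identical realm everywhere
            AuthUserFile /Volumes/shared/htpasswd  # one htpasswd file for all hosts
            Require valid-user
        </Location>

    The caveat: browsers cache basic-auth credentials per hostname, so this gives users a single password rather than true single sign-on; they still get one prompt per server. Anything beyond that (one login total) needs a shared-token mechanism such as mod_auth_kerb or a CAS-style setup.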

  • All websites migrated from server running IIS6 to IIS7

    - by Leah
    Hi, I hope someone will be able to help me with this. We have recently migrated all of our clients' sites to a new server running IIS7; all the sites were originally running on a server running IIS6. Ever since the migration, lots of our clients are reporting error messages. There seem to be quite a number of issues related to sending emails, and we have also had the following error message reported by several different clients:

        Server Error in '/' Application.
        Validation of viewstate MAC failed. If this application is hosted by a Web Farm or cluster, ensure that <machineKey> configuration specifies the same validationKey and validation algorithm. AutoGenerate cannot be used in a cluster.
        Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
        Exception Details: System.Web.HttpException: Validation of viewstate MAC failed. If this application is hosted by a Web Farm or cluster, ensure that <machineKey> configuration specifies the same validationKey and validation algorithm. AutoGenerate cannot be used in a cluster.

    I have read elsewhere that this error can appear if a button is clicked before the whole page has finished loading. But as this error has now appeared on multiple sites, and only since the server migration, it seems to me that it must be something else. Could someone tell me if there is something specific which needs to be changed for .NET sites when they are moved from a server running IIS6 to a server running IIS7? I don't deal with the actual servers very much, so I'm afraid this is very much a grey area for me. Any help would be very much appreciated.
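    The usual cure (hedged, since it depends on the sites' setup) is to stop relying on auto-generated keys and pin an explicit <machineKey> in each site's web.config: auto-generated keys differ per machine and can change when an application pool recycles, which invalidates any outstanding viewstate. The values below are placeholders, not usable keys:

        <system.web>
          <machineKey
              validationKey="[64+ hex characters, generate your own]"
              decryptionKey="[48 hex characters, generate your own]"
              validation="SHA1" decryption="AES" />
        </system.web>

    IIS Manager on IIS7 can generate the pair for you (the Machine Key feature), which avoids hand-rolling the hex strings.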

  • Permission / owner issue with pushing to git when editing directly from repo?

    - by Susan
    I have a web interface for deploying scripts from our repo at Github to our live server. The web interface just triggers a bash script with some git commands. If I make changes locally, push to repo, then run the bash script to pull from repo to live, it works fine. However, if I make changes directly in the repo (via Github's web interface), I'm running into fast-forward / lock issues. These are the steps I'm taking:

        1. Make a change on a file at Github repo
        2. Run a bash script (as apache) via web from live server that attempts a git push / pull

    I get these problems:

        PUSH
        To [email protected]:name/name.git
         ! [rejected]        master -> master (non-fast-forward)
        error: failed to push some refs to '[email protected]:name/name.git'
        To prevent you from losing history, non-fast-forward updates were rejected
        Merge the remote changes before pushing again. See the 'Note about fast-forwards' section of 'git push --help' for details.

        PULL
        From github.com:name/name
         * branch            master     -> FETCH_HEAD
        error: unable to unlink old 'includes/footer.inc' (Permission denied)
        Updating 8f6d922..d1eba9d

    If I SSH in as root and attempt a push / pull, it works fine. Any ideas on why this method would not work from apache?
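    Since the pull works as root but not as apache, this smells like file ownership: the checkout (and its .git directory) is owned by root, so the apache user cannot replace files like includes/footer.inc. A hedged sketch of the usual fix (adjust the path and the account Apache actually runs as):

        # give the web checkout to the user the deploy script runs as
        chown -R apache:apache /var/www/site

        # optional: keep permissions group-friendly for future pushes/pulls
        cd /var/www/site && git config core.sharedRepository group

    The push rejection is a separate, ordinary git issue: after the web edit, the live checkout is behind Github, so it has to pull (or fetch and merge) before any push will fast-forward.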

  • Serving and caching content from Amazon S3 with Tomcat

    - by Rob
    Hi all,

    We're looking to serve a range of content using Amazon S3 as a store for the content and Tomcat to host the web application. The content is divided into free and paid-for content. We intend to authenticate the users when they access the web application running in Tomcat. Based on their authentication we are able to tell if the user has access to paid-for content or simply free stuff. So I envision the flow of a request being something like this:

        1. Authenticated request to Tomcat
        2. If user is a "paid" user, display links to premium content
        3. Direct requests for paid content back through Tomcat, to prevent direct access to it by non-paying users
        4. Tomcat makes request to S3 through a web cache to keep our costs down
        5. Content is returned to user

    As we have to pay for each request to S3, I'd ideally like to cache content locally to the Tomcat instance after it has been requested for the first time, to keep costs to a minimum and to speed things up. I would also like to be able to invalidate this cache if we publish fresh content to S3. So to confirm my proposal:

        Client Request -> Tomcat -> Web Cache -> S3

    To invalidate the cache, I was thinking of using something like PubSubHubbub, with the cache waiting for updates to the feed for content that it should invalidate. I'd appreciate some general feedback on this approach as I've no real experience of caching and I'm sure I've made some invalid assumptions. I'd also appreciate any recommendations for caching technologies. Thanks.
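    A minimal sketch of the cache leg with Squid, since a local caching proxy between Tomcat and S3 is a common way to do exactly this (the bucket name and sizes are assumptions):

        # squid.conf -- cache the GETs Tomcat issues against one S3 bucket
        http_port 3128
        cache_dir ufs /var/spool/squid 2048 16 256
        cache_peer mybucket.s3.amazonaws.com parent 80 0 no-query originserver name=s3
        acl s3bucket dstdomain mybucket.s3.amazonaws.com
        cache_peer_access s3 allow s3bucket

    Tomcat would then fetch paid objects via the proxy (e.g. -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128). For invalidation, versioned object keys (publishing new content under a new name) are usually simpler and more reliable than purging a cache via PubSubHubbub.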

  • Getting 401 when using client certificate with IIS 7.5

    - by Jacob
    I'm trying to configure a web site hosted under IIS 7.5 so that requests to a specific location require client certificate authentication. With my current setup, I still get a "401 - Unauthorized: Access is denied due to invalid credentials" when accessing the location with my client cert. Here's the web.config fragment that sets things up:

        <location path="MyWebService.asmx">
          <system.webServer>
            <security>
              <access sslFlags="Ssl, SslNegotiateCert"/>
              <authentication>
                <windowsAuthentication enabled="false"/>
                <anonymousAuthentication enabled="false"/>
                <digestAuthentication enabled="false"/>
                <basicAuthentication enabled="false"/>
                <iisClientCertificateMappingAuthentication enabled="true" oneToOneCertificateMappingsEnabled="true">
                  <oneToOneMappings>
                    <add enabled="true" certificate="MIICFDCCAYGgAwIBAgIQ+I0z6z8OWqpBIJt2lJHi6jAJBgUrDgMCHQUAMCQxIjAgBgNVBAMTGURldiBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwHhcNMTAxMjI5MjI1ODE0WhcNMzkxMjMxMjM1OTU5WjAaMRgwFgYDVQQDEw9kZXYgY2xpZW50IGNlcnQwgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJAoGBANJi10hI+Zt0OuNr6eduiUe6WwPtyMxh+hZtr/7eY3YezeJHC95Z+NqJCAW0n+ODHOsbkd3DuyK1YV+nKzyeGAJBDSFNdaMSnMtR6hQG47xKgtUphPFBKe64XXTG+ueQHkzOHmGuyHHD1fSli62i2V+NMG1SQqW9ed8NBN+lmqWZAgMBAAGjWTBXMFUGA1UdAQROMEyAENGUhUP+dENeJJ1nw3gR0NahJjAkMSIwIAYDVQQDExlEZXYgQ2VydGlmaWNhdGUgQXV0aG9yaXR5ghB6CLh2g6i5ikrpVODj8CpBMAkGBSsOAwIdBQADgYEAwwHjpVNWddgEY17i1kyG4gKxSTq0F3CMf1AdWVRUbNvJc+O68vcRaWEBZDo99MESIUjmNhjXxk4LDuvV1buPpwQmPbhb6mkm0BNIISapVP/cK0Htu4bbjYAraT6JP5Km5qZCc0iHZQJZuch7Uy6G9kXQXaweJMiHL06+GHx355Y="/>
                  </oneToOneMappings>
                </iisClientCertificateMappingAuthentication>
              </authentication>
            </security>
          </system.webServer>
        </location>

    The client certificate I'm using in my web browser matches what I've placed in the web.config. What am I doing wrong here?
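    A few hedged things to rule out, since this setup fails with a plain 401 whenever the mapping never matches. The certificate attribute must be the base64 of the cert's raw DER bytes, which is easy to regenerate and compare; and one-to-one mappings normally also want userName and password attributes on the <add> element so IIS has a Windows identity to run the request as. A PowerShell sketch for the base64 (the subject filter is an assumption):

        $cert = Get-ChildItem Cert:\CurrentUser\My |
            Where-Object { $_.Subject -match 'dev client cert' } |
            Select-Object -First 1
        [Convert]::ToBase64String($cert.GetRawCertData())

    Also confirm that the IIS Client Certificate Mapping Authentication feature is actually installed (it is not by default) and that the relevant config sections are unlocked at server level; otherwise the <iisClientCertificateMappingAuthentication> element is silently ignored or rejected.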

  • How to add Sharepoint Powershell to Console2

    - by BGM
    Salvete! I want to add the PowerShell console for SharePoint to the tab list in Console2. I already have plain PowerShell, but I want the SharePoint PowerShell snap-in added automatically. If I look at the properties of the SharePoint PowerShell console shortcut, I see this:

        C:\Windows\System32\WindowsPowerShell\v1.0\PowerShell.exe -NoExit " & ' C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG\POWERSHELL\Registration\\sharepoint.ps1 ' "

    but that doesn't work in Console2, so I tried this, which doesn't work either:

        C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -PSConsoleFile "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG\POWERSHELL\Registration\psconsole.psc1" -NoExit " & ' C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG\POWERSHELL\Registration\\sharepoint.ps1 ' "

    Whenever I try, it will load PowerShell, but not the SharePoint console. I get this:

        Add-PSSnapin : The Windows PowerShell snap-in 'Microsoft.SharePoint.PowerShell' is not installed on this machine.
        At C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG\POWERSHELL\Registration\SharePoint.ps1:3 char:13
        + Add-PsSnapin <<<< Microsoft.SharePoint.PowerShell
            + CategoryInfo          : InvalidArgument: (Microsoft.SharePoint.PowerShell:String) [Add-PSSnapin], PSArgumentException
            + FullyQualifiedErrorId : AddPSSnapInRead,Microsoft.PowerShell.Commands.AddPSSnapinCommand

    I tried this out, too. Anybody know?
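    The "snap-in not installed" message is the classic symptom of running the 32-bit PowerShell: the SharePoint 2010 snap-in is registered only for 64-bit hosts, and a 32-bit Console2 gets silently redirected from System32 to the 32-bit PowerShell by WOW64. A hedged fix, assuming your Console2 build is 32-bit, is to go through sysnative, which escapes the redirection:

        C:\Windows\sysnative\WindowsPowerShell\v1.0\powershell.exe -NoExit " & ' C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\CONFIG\POWERSHELL\Registration\sharepoint.ps1 ' "

    Alternatively, a 64-bit build of Console2 can use the System32 path unchanged.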

  • Disappearing Arial on 2 Macs

    - by drewk
    I noticed that Safari started rendering common web pages in a funny manner on two different Macs that I have. One is a MacBook Pro and the other a Mac Pro desktop. Yahoo and Google would appear all excessively bold or all italic, and not at all look right or acceptable. The computers are all running OS X 10.6.3 "Snow Leopard".

    It turns out that "Arial.TTF" and "Arial Bold.ttf" got deleted somehow on these two computers. I restored Arial through Font Book and got my web mojo back. So, questions:

        1) Has anyone seen Arial strangely, randomly disappear? The only thing in common is that these are the only two computers out of eight on site that recently got Adobe CS 5 installed. Has anyone had CS 5 delete Arial?

        2) When I restored Arial with Font Book, it went into User fonts rather than All fonts. Can I use Font Book to restore a font in /System/Library/Fonts, or do I need to do that manually?

        3) I located THIS article on the web regarding OS X fonts. Essentially, Snow Leopard did away with the older .dfont format and replaced it with open TrueType. There is a minimum font list, but Arial is not among them. Arial is installed by MS Office.

        4) Why are web sites affected by Arial being missing anyway? If I look at the HTML source for Yahoo, for example, "arial" is specified by name only in an ad. Yahoo itself does not specify a font name. In my Safari preferences, I have Times and Courier specified as the defaults, which is what Safari ships with. How does a missing Arial screw things up anyway?

    Thanks in advance.

  • Apache multiple vhost logs, stored locally and sent to remote logstash

    - by benbradley
    I'm investigating centralised logging, and it seems there are so many different ways this can be done. I don't want to run logstash as a log "sender", preferring to keep the web servers as lean and simple as possible. So that means either using syslog, syslog-ng or the one I'm testing now, rsyslog. But I would like to have separate vhost log files on the web server, in addition to these logs being sent to a remote log collector.

    I've tested rsyslog using the imfile module to watch the Apache log files, but this means I have to hard-code each vhost log file into my rsyslog.conf. Not ideal, as people will invariably forget when they add/remove sites on the server. The reason I'm using rsyslog's imfile is that Apache doesn't appear to let you log to file and syslog at the same time. And I want to keep vhost-specific log files on the web server.

    So how can I do this? Is there a way of having rsyslog produce local log files and forward the logs to a remote collector? I am prepared to change my Apache config to log to a single access/error log for all vhosts, so long as there are vhost-specific log files produced somewhere on the web server machine. I just don't want to lose any logging info if the remote log collector can't be contacted for any reason.

    Any comments/suggestions? Cheers, B
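    One hedged way to square this: let Apache keep writing its per-vhost files exactly as now, and have rsyslog both tail them and forward them, queuing to disk while the collector is down. Paths, tags and the collector address below are assumptions:

        # /etc/rsyslog.d/30-apache.conf -- one imfile block per vhost log
        $ModLoad imfile

        $InputFileName /var/log/apache2/example.com-access.log
        $InputFileTag apache-example.com:
        $InputFileStateFile state-apache-example.com
        $InputFileFacility local6
        $InputRunFileMonitor

        # buffer to disk if the collector is unreachable, then forward over TCP
        $ActionQueueType LinkedList
        $ActionQueueFileName fwdq
        $ActionQueueSaveOnShutdown on
        $ActionResumeRetryCount -1
        local6.* @@logcollector.example.net:5514

    As for Apache doing both jobs itself: a piped log can, e.g. CustomLog "|/usr/bin/tee -a /var/log/apache2/vhost-access.log | /usr/bin/logger -t apache -p local6.info" combined, which removes the need to enumerate files in rsyslog.conf at the cost of one pipe per vhost.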

  • Swapping out a hardware firewall: does the MAC address get cached?

    - by Dan
    We need to replace a hardware firewall (a Cisco PIX) and have a spare that we will use (temporarily). The firewall sits in front of a couple of web servers colocated at a data centre. The replacement will be configured with identical settings (external/internal IP addresses, configured ports etc.).

    When we swap the firewalls over, will this work immediately, or will the old PIX's MAC address be cached and the new firewall not be seen until the cache is cleared? (And what is it that caches the address? Is it just the switch/router that our PIX is connected to?)

    The reason for asking is that a few years ago I had a Smoothwall firewall in front of a lone server (the external IP of the Smoothwall was also the external IP of the web server). When I replaced the Smoothwall with a PIX, the IP address of the web server stayed the same, but it now had to be reached via the new firewall on a different IP. It took about 2-4 hours before the rest of the world could see that web server again. I'm hoping for less downtime this time!
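    What caches the mapping is the ARP table of whatever sits one hop upstream (the data centre's router or layer-3 switch), and entries typically age out in minutes to hours depending on the platform. The usual trick is to have the new firewall announce itself with gratuitous ARP as soon as it comes up, or to ask the provider to clear the entry. A sketch (interface and IP are placeholders; a PIX normally sends gratuitous ARP on its own when an interface comes up):

        # from a Linux box holding the address: announce the new MAC unsolicited
        arping -U -I eth0 203.0.113.10

        # or, on a Cisco upstream router (needs the provider's cooperation):
        #   clear arp-cache

    With identical IPs on the replacement and a fresh ARP announcement, the cutover should be near-instant rather than hours.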

  • Postfix cannot deliver mail to Cyrus mailbox on Ubuntu 11.10 server

    - by user105804
    I have installed and configured Postfix and Cyrus IMAP server with web-cyradm according to this document: http://www.delouw.ch/linux/Postfix-Cyrus-Web-cyradm-HOWTO/html/index.html . I can access the webcyradm interface, I can create new domains and new users, and I can log in via IMAP after creating the user account. However, Postfix fails to deliver mail to Cyrus mailboxes. The mail log contains the errors shown below.

    Installing any IMAP server other than Cyrus is not an option because it is needed by the web application. Please advise me how to make Postfix deliver email to Cyrus mailboxes. The solution should not necessarily include web-cyradm, but there should be a web interface for managing mail domains and mailboxes that is as user-friendly as possible.

        Dec 30 22:46:17 acer-tower cyrus/lmtpunix[4865]: accepted connection
        Dec 30 22:46:17 acer-tower cyrus/lmtpunix[4865]: lmtp connection preauth'd as postman
        Dec 30 22:46:17 acer-tower postfix/cleanup[4868]: 065D5240035: message-id=<[email protected]>
        Dec 30 22:46:17 acer-tower cyrus/lmtpunix[4865]: verify_user(user.imap0001) failed: Mailbox does not exist
        Dec 30 22:46:17 acer-tower postfix/bounce[4867]: 6C6CA24185C: sender non-delivery notification: 065D5240035
        Dec 30 22:46:17 acer-tower postfix/qmgr[4833]: 065D5240035: from=<>, size=3372, nrcpt=1 (queue active)
        Dec 30 22:46:17 acer-tower postfix/qmgr[4833]: 6C6CA24185C: removed
        Dec 30 22:46:17 acer-tower postfix/lmtp[4866]: 53421240372: to=<[email protected]>, orig_to=<[email protected]>, relay=home.webshop-software.ch[/tmp/lmtp], delay=165, delays=165/0.02/0.17/0.09, dsn=5.1.1, status=bounced (host home.webshop-software.ch[/tmp/lmtp] said: 550-Mailbox unknown. Either there is no mailbox associated with this 550-name or you do not have authorization to see it. 550 5.1.1 User unknown (in reply to RCPT TO command))
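    The key line is verify_user(user.imap0001) failed: Mailbox does not exist: Postfix reaches Cyrus over LMTP just fine, but no mailbox named user.imap0001 has ever been created, so delivery bounces with 5.1.1. A hedged sketch of creating it by hand with cyradm (the admin account name depends on your imapd.conf):

        $ cyradm --user cyrus localhost
        Password:
        localhost> cm user.imap0001       # create the mailbox
        localhost> lm                     # list mailboxes to confirm

    If web-cyradm was supposed to create the mailbox, it probably only inserted the account into its own database; check that the Cyrus admin credentials in the web-cyradm config match an admin listed in /etc/imapd.conf.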

  • What does "incoming" and "outgoing" traffic mean?

    - by mgibsonbr
    I've seen many resources explaining how to set up a server's firewall to allow incoming and outgoing traffic on the standard HTTP ports (80 and 443), but I can't figure out why I would need either of them. Do I need to unblock both for a "regular" web site to work? For file uploads to work? Are there situations where it would be advisable to unblock one and leave the other blocked? Sorry if that's a basic question, but I couldn't find it explained anywhere (also, I'm not a native English speaker).

    I know that on a "regular" web site the client is always the one who initiates a request, so I'm assuming a web server must accept incoming traffic on those ports, and my common sense tells me the server is allowed to send a response without unblocking anything else (otherwise it wouldn't make sense to have two types of rules). Is that correct?

    But what is outgoing web (service) traffic, and what would be its use? AFAIK, if the server wanted to initiate a connection with another machine, the port that matters is the one on the other end (i.e. the destination port would be 80); on its own end any free port could be used (the source port would be random). I can open HTTP requests from my server (using wget, for instance) without unblocking anything. So I'm assuming my concepts of "incoming" and "outgoing" are wrong somehow.
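    That reading is essentially right for any stateful firewall: responses travel over the connection the client opened, so serving a web site only needs inbound 80/443 plus a rule letting established traffic back out; a distinct outbound rule only matters when the server itself initiates connections (calling external APIs, fetching updates, wget). A minimal iptables sketch of that distinction:

        # accept new inbound web connections
        iptables -A INPUT  -p tcp -m multiport --dports 80,443 -j ACCEPT
        # let replies to accepted connections leave (this covers the web responses)
        iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
        # with default-deny policies, wget from the server would now fail,
        # because NEW outbound connections have no rule of their own
        iptables -P INPUT  DROP
        iptables -P OUTPUT DROP

    The reason wget currently works without touching anything is almost certainly that the OUTPUT policy is already ACCEPT.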

  • Moved servers running Windows Server 2003

    - by Charles
    Our company has two locations, and each location has a Windows Server 2003 machine as the DC and several servers, running on two different subnets. We are consolidating the locations. I changed the IP address on one of the web servers prior to moving to the main location. I didn't change the IP address on either the DC or the other web servers prior to the move.

    Now, only the web server whose IP was changed is able to serve pages. The other web servers are not able to serve pages, cannot be pinged, and cannot be accessed via RDP. Since we don't need the second DC, it has been powered down. When I tried to ping it, the previous IP address was returned. My colleague changed the IP address in the DC's DNS, but when I ping it, a timeout error is received.

    I know that I should have read a lot more before doing this. What can I do to fix it? Thanks, in advance, for your help!

    Update: MarkM, thanks for the info on demoting a DC. That's one of the things I want to do after everything is working. Is there a good, clear article you recommend? Rusty, there are no DMZs involved at this point. I need to set up a DMZ, but that's another project.
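    Since the servers are unreachable by ping and RDP, not just HTTP, the problem is almost certainly basic IP settings rather than IIS: machines still carrying the old subnet's address, gateway or DNS are invisible on the new network. A hedged checklist, after each moved server has been given a correct IP/mask/gateway for the new subnet:

        :: on each moved web server
        ipconfig /all
        ipconfig /flushdns
        ipconfig /registerdns

        :: on the remaining DC, verify DNS registration and directory health
        dcdiag /v
        nltest /dsregdns

    Stale A records for the moved servers in the DC's DNS zone are worth deleting by hand if /registerdns doesn't replace them.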

  • Multi-authentication scenario for a public internet service using Kerberos

    - by StrangeLoop
    I have a public web server which has users coming from the internet (via HTTPS) and from a corporate intranet. I wish to use Kerberos authentication for the intranet users, so that they would be automatically logged in to the web application without the need to provide any login/password (assuming they are already logged in to the Windows domain). For the users coming from the internet I want to provide traditional basic/form-based authentication. User/password data for these users would be stored internally in a database used by the application. The web application will be configured to use Kerberos authentication for users coming from specific intranet IP networks, and basic/form-based authentication for the rest of the users.

    From a security perspective, are there risks involved in this kind of setup, or is this a generally accepted solution? My understanding is that the server doesn't need access to the KDC (see Kerberos authentication, service host and access to KDC) and it can be completely isolated from AD and the corporate intranet. The server has a keytab file stored locally that is used to decrypt tickets sent by the users coming from the intranet. The tickets only contain the username and domain of the incoming user, and the server never sees the passwords of authenticated users.

    If the server were hacked and the keytab file compromised, it would mean that an attacker could forge tickets for any domain user and get access to the web application as any user. But typically this is the case anyway if a hacker gains access to the keytab file on the local filesystem. The encryption key contained in the keytab file is based on the service account password in AD and is in hashed form; I guess it is very difficult to brute-force this password if strong Kerberos encryption like AES-256-SHA1 is used. As the server has no network access to the intranet, even the compromised service account couldn't be directly used for anything.
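    This split (Kerberos for known intranet ranges, forms for everyone else) is a fairly common pattern, and the main risks are exactly the ones identified: keytab theft and spoofing of the "trusted" IP ranges. The keytab should be readable only by the service account and regenerated if compromise is suspected. For reference, a hedged sketch of how such a keytab is typically produced on the AD side with AES-256 (the principal and account names are placeholders):

        :: run by a domain admin; maps the SPN to a service account and emits the keytab
        ktpass /princ HTTP/webserver.corp.example.com@CORP.EXAMPLE.COM /mapuser CORP\svc-webapp ^
               /crypto AES256-SHA1 /ptype KRB5_NT_PRINCIPAL /pass * /out webapp.keytab

    Using a long random password for the service account makes the brute-force concern largely moot, since the AES key in the keytab is derived from it.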

  • How does Safari's Reader work and when does it show up?

    - by TestSubject528491
    Safari's Reader feature is a cool little app that displays a web page as a newspaper article, without all the distracting sidebars, comments, and ads. Sometimes it works and sometimes it doesn't, and I'm wondering how it knows when to show up.

    On my personal website, one of the pages has this option. You can click the Reader button in the URL bar and the page is displayed beautifully, like a page in an iBook. However, none of my other web pages (on the same site) do this. I thought it had something to do with the <article> tag, but I removed that and it still works. Anyone know how this app works?

    Also, does anyone know of any Chrome extensions that are just like this? Google Reader is not the same thing.

    PS: From the cited Apple website:

        Safari Reader
        As you browse the web, Safari detects if you are on a web page with an article. Click the Reader button that appears in the Smart Address Field and an elegant view of the article appears — without any distracting content.

    Not much help, is it?

  • Problem running “Central Administration” website after Windows update on Windows 2003 Server Standard

    - by Magdy Roshdy
    I had WSS 2.0 and then upgraded to WSS 3.0; the old installation database was SQL 2000, and now I have another SQL Server instance called server_name\MICROSOFT##SSEE. After the upgrade everything worked fine: our team started to use the portal, and we put a lot of documents and activity into it.

    The problem started after installing Windows updates. The website suddenly stopped working, giving me the error "Cannot connect to the configuration database". If I try to open the SharePoint Products and Technologies Configuration Wizard, it gives me a strange error that says:

        An exception of type Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException was thrown. Additional exception information: SharePoint Products and Technologies cannot be configured. The current installation mode does not support SKU to SKU upgrades because there exists an older version of Windows SharePoint Services that must be upgraded first

    At this post: http://stackoverflow.com/questions/114398/iis-error-cannot-connect-to-the-configuration-database/249494#249494 the author of the second answer has the same problem, and he suggested a solution, but I don't understand it well. As he suggested, I tried setting the identity of the SharePoint web site's app pool to "IWAM_server_name". After that the error changed as he said it would: the web site now gives me "Server Application Unavailable", and in the Event Viewer on the server I found that ASP.NET 2.0 throws this exception:

        Could not load file or assembly 'System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. Access is denied.

    I don't know how to solve this problem. I really want to get the web site working, because our team really needs these documents. I hope I will find someone to help me.
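    Two hedged avenues, given the symptoms. First, Windows Update regularly ships WSS 3.0 patches that leave the farm half-upgraded until the configuration step is re-run, which produces exactly "Cannot connect to the configuration database"; the command-line equivalent of the wizard can be run from the 12 hive:

        cd "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\BIN"
        psconfig -cmd upgrade -inplace b2b -wait

    Second, the "Access is denied" on System.Web after switching the app pool to IWAM_server_name suggests that account lacks rights to the .NET temporary directories; reverting the app pool identity to its original account (typically Network Service) and then running the upgrade above is a safer path than continuing with IWAM.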

  • High-performance Academic Server [closed]

    - by PHPsmith
    Suppose I want to build a server for the university's academic interests. The server is dedicated to a single site, where users (students and lecturers) just view and fill in academic data. But at peak times (e.g. once a semester), about 12,000 students will access the site simultaneously. Due to limited resources, I have to build the server using free software (except for the operating system, Windows 7, which the university has already prepared). The hardware is also limited to a usual 4-core computer (e.g. an Ivy Bridge Intel Core i7-3770) with approximately 16GB of memory (DDR3 1600 MHz), equipped with an RJ-45 port (Intel 82579 Gigabit Ethernet).

    With all these limitations, I have to choose software (web server, database, etc.) appropriate for the goal. I decided to create the site in PHP. Please help me by answering the following questions based on your expertise (my prime candidates, after googling, are in parentheses):

        1. Which web server is fastest, most stable and most secure when implemented and optimized for PHP, and why? (nginx)
        2. Which PHP accelerator is fastest, most stable and most compatible with the selected web server, and why? (APC with Zend Optimizer+)
        3. Which database is fastest, most stable and most secure when implemented and optimized for the selected web server and PHP accelerator? (MySQL)
        4. Are there any mistakes in my assumptions so far? If so, please enlighten me.
        5. Is there anything else I need to know in order to achieve this goal? If so, please enlighten me.

    I understand that performance also depends on the implementation of the source code, so I assume the site will be written as efficiently as possible (e.g. using AJAX).
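    For the nginx + PHP combination on Windows, the usual wiring is nginx handing .php requests to a php-cgi.exe FastCGI listener, roughly as in this sketch (paths and ports are assumptions):

        # start PHP's FastCGI listener first, e.g. from a startup script:
        #   C:\php\php-cgi.exe -b 127.0.0.1:9000

        # nginx.conf fragment
        server {
            listen       80;
            server_name  localhost;
            root         C:/www/academic;
            index        index.php;

            location ~ \.php$ {
                fastcgi_pass   127.0.0.1:9000;
                fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
                include        fastcgi_params;
            }
        }

    One caveat worth flagging, hedged but important: Windows 7 is licensed and tuned as a client OS (its EULA caps certain inbound services at 20 concurrent connections, and it lacks the tuning headroom of a server SKU); for 12,000 simultaneous users, a server OS or Linux (which would also fit the free-software constraint) is a far safer base than any particular choice of web server.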

  • ASP.Net Session Timing Out Rapidly

    - by Zac
    We have an ASP.NET 3.5 website running on Windows Server 2008 with IIS7. The session timeout period for this site is configured to be 20 minutes; however, sessions are currently lasting between 40 and 50 seconds. After researching the problem we investigated several configuration values which could be involved in the timeout period, but none of them are set to less than 20 minutes. The areas we looked at are as follows:

        - web.config system.web/sessionState element (20 minutes).
        - web.config system.web/authentication/forms element (not present, defaults to 30 minutes).
        - Sites/{website}/ASP/Session Properties/Time-out (20 minutes).
        - Application Pools/{appPool}/Advanced Settings/Process Model/Idle Time-out (20 minutes).

    We've also noted that the CPU is staying around 0% and that RAM usage is flat-lining around 1.07 GB (of 8 GB available), so there is no performance-based reason for IIS to be recycling the application pool as far as we can tell. Are there any settings we've overlooked which could cause the sessions to expire so quickly?

    EDIT: A couple of additional points:

        - This is not occurring in development, only on the server.
        - The session is not sliding (i.e. if we refresh the page a few times, it still times out approximately 40 - 50 seconds after the session was created).
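    Since in-process sessions die whenever the app pool recycles, a hedged first step is to make IIS report every recycle; if events line up with the lost sessions, the cause (config change, ping failure, overlapped recycle) will be named in the System event log. The app pool name below is a placeholder:

        %windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /recycling.logEventOnRecycle:"Time,Requests,Schedule,Memory,IsapiUnhealthy,OnDemand,ConfigChange,PrivateMemory"

    Two classic culprits worth checking alongside: anything touching web.config or the bin folder on the live box (an anti-virus scanner rewriting timestamps triggers ConfigChange recycles), and a web-garden or load-balanced setup where requests land on a different worker than the one holding the session. Temporarily moving sessions to StateServer or SQL quickly proves or disproves the recycle theory.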

  • tomcat vs FULL J2EE Solutions

    - by jrhickey
    We are getting ready to make a major revision in our web application architecture, which currently runs on JBoss 4.2. At first we were looking at moving from 4.2 to JBoss 6, but after some research Tomcat may be a better solution for us. My first question is: is there anything that JBoss can do that Tomcat cannot do, assuming you are using the correct plugins? We do not really use EJBs in our solution, and it appears there are simple plugins for web services, JMX and other features. Tomcat appears to have much better support, faster upgrade cycles and many, many books. Since there is less to the system, it also seems much easier to support from an admin point of view. What am I missing?

    The main features we want to enable are better clustering support and session replication / persistence. We will consider other application servers as well, such as Glassfish / Geronimo.

    A quote from a web article:

        Apache Tomcat is the world's most widely used web application server, with over one million downloads per month and over 70% penetration in the enterprise datacenter. Tomcat is used to power everything from simple one server sites to large enterprise networks.
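    On the clustering point specifically: Tomcat ships session replication out of the box, and the quick-start form really is minimal. A hedged sketch (the defaults give multicast membership and all-to-all DeltaManager replication, which suits small clusters):

        <!-- conf/server.xml: enable clustering with defaults -->
        <Engine name="Catalina" defaultHost="localhost">
            <Cluster className="org.apache.catalina.ha.tcp.SimpleTcpCluster"/>
            <!-- existing <Host> elements stay as they are -->
        </Engine>

    Each web application that should replicate sessions also needs a <distributable/> element in its web.xml, and its session attributes must be Serializable.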

  • Servers - Buying New vs Buying Second-hand

    - by Django Reinhardt
    We're currently in the process of adding additional servers to our website. We have a pretty simple topology planned: a firewall/router server in front of a web application server and a database server. Here's a simple (and technically incorrect) diagram that I used in a previous question to illustrate what I mean:

    We're now wondering about the specs of our two new machines (the web app and firewall servers) and whether we can get away with buying a couple of old servers. (Note: both machines will be running Windows Server 2008 R2.) We're not too concerned about our firewall/router server as we're pretty sure it won't be taxed too heavily, but we are interested in our web app server.

    I realise that answering this type of question is really difficult without a ton of specifics on users, bandwidth, concurrent sessions, etc., so I just want to focus on the general wisdom of buying old versus new. I had originally specced a new Dell PowerEdge R300 (1U rack) for our company. In short, because we're going to be caching as much data as possible, I focussed on processor speed and memory:

        Quad-Core Intel Xeon X3323 2.5GHz (2x3M Cache) 1333MHz FSB
        16GB DDR2 667MHz

    But when I was looking for a cheap second-hand machine for our firewall/router, I came across several machines that made our engineer ask a very reasonable question: if we stuck a boatload of RAM in this thing, wouldn't it do for the web app server and save us a ton of money in the process? For example, what about a second-hand machine with the following specs:

        2x Dual-Core AMD Opteron 2218 2.6GHz (2MB Cache) 1000MHz HT
        16GB DDR2 667MHz

    Would it really be comparable with the more expensive (new) server above? Our engineer postulated that the reason companies upgrade their servers to newer processors is often that they want to reduce their power costs, and that a 2.6GHz processor is still a 2.6GHz processor, no matter when it was made. Benchmarks on various sites don't really support this theory, but I was wondering what server admins think. Thanks for any advice.

  • Access 2010 datasheet view only/relationships unavailable

    - by Luis
    I'm relatively new to MS Access in general and just started working with Access 2010. I've created a new web database with a few tables that I need to relate.

    First problem: for the life of me, I can't view anything in any view other than datasheet view; everywhere I would expect to be able to change the view, only datasheet view is available.

    Second problem: I can't change the primary key(s). Presumably I would be able to do this if I could get out of datasheet view and into design view.

    Third problem: the 'Relationships' button is greyed out.

    I know these appear to be really simple things, but I've been looking for much more time than I'd like to admit trying to figure out how to get unstuck.

    Update: It would appear that this is happening because it is a 'web database', as I've been able to do all of the above in a new regular database. With this in mind, let me ask a different question: am I able to add relationships and change primary keys in a web database? If so, how? More generally, what is the point of a web database?

  • Very, very simple asp.net page takes forever to load

    - by John Hoge
    I've got a page that couldn't be more simple:

        <%@ Page Trace="true" %>
        <html>
        <head></head>
        <body>
            <h1>Hello World</h1>
            <a href="/OtherPage.aspx">Other Page</a>
            <p><%=DateTime.Now.ToString()%></p>
        </body>
        </html>

    ... but it takes forever to load. There is no database or web service call to slow it down. The trace command reveals that the time from Begin PreInit to End Render is .000049 seconds, but the page itself takes several seconds to load. It is a new web site I just created for this test, and it just has a web.config and two test files. The only thing in the web.config is access control:

        <authorization><allow users="domain\me" /><deny users="*"/></authorization>

    What else could IIS be doing with all of that time?

  • Redirect without changing URL

    - by Coobadivin
    Here's the setup. We have a hardware load balancer with an HTTP virtual cluster; let's call this virtual cluster example1.com. This virtual cluster load-balances between two Squid reverse proxies, which are also on the same physical servers as the web servers. Squid listens on 80 and points to itself as the cache_peer web server, which listens on 81. We also have a standalone web server, which we will call example2.com.

    What we are trying to do is create a subdirectory on example1.com called example1.com/example2. This will point to example2.com, but we want our users to stay at example1.com/example2 in their browser. So it's like a redirect without actually being a redirect. How the hell do I go about doing this? Is this even possible? I'm looking at Squid docs in the meantime.

    Notes: example1.com is running a proprietary web server - not Apache :( - and we can't host example2.com's content in example1.com's file system. These are two very different platforms.
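    Since Squid is already the front door, this can probably be done there with a second cache_peer selected by URL path; a hedged sketch (the peer and ACL names are invented):

        # squid.conf on each reverse proxy
        cache_peer example2.com parent 80 0 no-query originserver name=ex2
        acl ex2path urlpath_regex ^/example2
        cache_peer_access ex2 allow ex2path
        cache_peer_access ex2 deny all

    One catch: Squid forwards the path unchanged, so example2.com will see requests for /example2/...; if it can't be configured to serve its content under that prefix, a url_rewrite_program (a small rewrite helper) is also needed on the Squid side to strip it.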

  • Sharepoint Error: [COMException (0x80004005): Cannot complete this action.

    - by ifunky
    Hi, I've created a basic site definition that uses different master and default pages. Everything works quite well, except that whenever I create a new site based on the definition I receive the following error when browsing to the new site:

        [COMException (0x80004005): Cannot complete this action. Please try again.]
        Microsoft.SharePoint.Library.SPRequestInternalClass.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument, Boolean& pbDefaultToPersonal, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, Byte& pVerGhostedSetupPath, UInt32& pdwPartCount, Object& pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocId, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder) +0
        Microsoft.SharePoint.Library.SPRequest.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument, Boolean& pbDefaultToPersonal, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, Byte& pVerGhostedSetupPath, UInt32& pdwPartCount, Object& pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocId, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder) +219
        [SPException: Cannot complete this action. Please try again.]

    I'm able to work around this by checking out the new master page and checking it back in again, and after doing so there are no further issues at all. Any ideas as to what could cause this?
Thanks Dan ONET.XML module section: <Modules> <Module Name="CustomMasterPage" List="116" Url="_catalogs/masterpage" RootWebOnly="FALSE"> <File Url="Shoes.master" Type="GhostableInLibrary" IgnoreIfAlreadyExists="TRUE" /> </Module> <Module Name="Default" List="116" Url=""> <File Url="default.aspx" Name="default.aspx" NavBarHome="True" IgnoreIfAlreadyExists="FALSE"> <AllUsersWebPart WebPartZoneID="Left" WebPartOrder="1"> &lt;webParts&gt;&lt;webPart xmlns="http://schemas.microsoft.com/WebPart/v3"&gt;&lt;metaData&gt;&lt;type name="BCM.SharePoint.Shoes.ShoesComponents.FooterLinks, BCM.SharePoint.Shoes.ShoesComponents, Version=1.0.0.0, Culture=neutral, PublicKeyToken=2881713f39360b71" /&gt;&lt;importErrorMessage&gt;Cannot import this Web Part.&lt;/importErrorMessage&gt;&lt;/metaData&gt;&lt;data&gt;&lt;properties&gt;&lt;property name="AllowClose" type="bool"&gt;True&lt;/property&gt;&lt;property name="Width" type="string" /&gt;&lt;property name="MyProperty" type="string"&gt;Hello SharePoint&lt;/property&gt;&lt;property name="AllowMinimize" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowConnect" type="bool"&gt;True&lt;/property&gt;&lt;property name="ChromeType" type="chrometype"&gt;None&lt;/property&gt;&lt;property name="TitleIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_FooterLinks.gif&lt;/property&gt;&lt;property name="Description" type="string"&gt;FooterLinks Description&lt;/property&gt;&lt;property name="Hidden" type="bool"&gt;False&lt;/property&gt;&lt;property name="TitleUrl" type="string" /&gt;&lt;property name="AllowEdit" type="bool"&gt;True&lt;/property&gt;&lt;property name="Height" type="string" /&gt;&lt;property name="MissingAssembly" type="string"&gt;Cannot import this Web Part.&lt;/property&gt;&lt;property name="HelpUrl" type="string" /&gt;&lt;property name="Title" type="string" /&gt;&lt;property name="CatalogIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_FooterLinks.gif&lt;/property&gt;&lt;property name="Direction" type="direction"&gt;NotSet&lt;/property&gt;&lt;property name="ChromeState" type="chromestate"&gt;Normal&lt;/property&gt;&lt;property name="AllowZoneChange" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowHide" type="bool"&gt;True&lt;/property&gt;&lt;property name="HelpMode" type="helpmode"&gt;Modeless&lt;/property&gt;&lt;property name="ExportMode" type="exportmode"&gt;All&lt;/property&gt;&lt;/properties&gt;&lt;/data&gt;&lt;/webPart&gt;&lt;/webParts&gt; </AllUsersWebPart> <AllUsersWebPart WebPartZoneID="Left" WebPartOrder="0"> &lt;webParts&gt;&lt;webPart xmlns="http://schemas.microsoft.com/WebPart/v3"&gt;&lt;metaData&gt;&lt;type name="BCM.SharePoint.Shoes.ShoesComponents.SubFooterLinks, BCM.SharePoint.Shoes.ShoesComponents, Version=1.0.0.0, Culture=neutral, PublicKeyToken=2881713f39360b71" /&gt;&lt;importErrorMessage&gt;Cannot import this Web Part.&lt;/importErrorMessage&gt;&lt;/metaData&gt;&lt;data&gt;&lt;properties&gt;&lt;property name="AllowClose" type="bool"&gt;True&lt;/property&gt;&lt;property name="Width" type="string" /&gt;&lt;property name="MyProperty" type="string"&gt;Hello SharePoint&lt;/property&gt;&lt;property name="AllowMinimize" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowConnect" type="bool"&gt;True&lt;/property&gt;&lt;property name="ChromeType" type="chrometype"&gt;None&lt;/property&gt;&lt;property name="TitleIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_SubFooterLinks.gif&lt;/property&gt;&lt;property name="Description" type="string"&gt;Shoes 
home page links (under the hero image)&lt;/property&gt;&lt;property name="Hidden" type="bool"&gt;False&lt;/property&gt;&lt;property name="TitleUrl" type="string" /&gt;&lt;property name="AllowEdit" type="bool"&gt;True&lt;/property&gt;&lt;property name="Height" type="string" /&gt;&lt;property name="MissingAssembly" type="string"&gt;Cannot import this Web Part.&lt;/property&gt;&lt;property name="HelpUrl" type="string" /&gt;&lt;property name="Title" type="string" /&gt;&lt;property name="CatalogIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_SubFooterLinks.gif&lt;/property&gt;&lt;property name="Direction" type="direction"&gt;NotSet&lt;/property&gt;&lt;property name="ChromeState" type="chromestate"&gt;Normal&lt;/property&gt;&lt;property name="AllowZoneChange" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowHide" type="bool"&gt;True&lt;/property&gt;&lt;property name="HelpMode" type="helpmode"&gt;Modeless&lt;/property&gt;&lt;property name="ExportMode" type="exportmode"&gt;All&lt;/property&gt;&lt;/properties&gt;&lt;/data&gt;&lt;/webPart&gt;&lt;/webParts&gt; </AllUsersWebPart> <NavBarPage Name="$Resources:core,nav_Home;" ID="1002" Position="Start" /> <NavBarPage Name="$Resources:core,nav_Home;" ID="0" Position="Start" /> </File> </Module> </Modules> LOG OUTPUT: 05/21/2010 12:22:55.11 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72nz Medium Videntityinfo::isFreshToken reported failure. 05/21/2010 12:22:55.19 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yv Medium Creating default lists 05/21/2010 12:22:55.19 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72lp Medium Creating directory Lists 05/21/2010 12:22:55.26 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yf Medium Creating list "Master Page Gallery" in web "http://mmm-dev-ll/sites/Shoes/test" at URL "_catalogs/masterpage", (setuppath: "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\Template\global\lists\mplib") 05/21/2010 12:22:55.28 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88y1 Medium No document templates uploaded for list "Master Page Gallery" -- none found for list template "100". 05/21/2010 12:22:55.28 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72kc Medium Failed to find generic XML file at "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\Template\xml\onet.xml", falling back to global site definition. 05/21/2010 12:22:56.22 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yz Medium Creating default modules at URL "http://mmm-dev-ll/sites/Shoes/test" 05/21/2010 12:22:56.22 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8e27 Medium Ensuring module folder _catalogs/masterpage 05/21/2010 12:22:56.89 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72h7 Medium Applying template "SubSite#1" to web at URL "http://mmm-dev-ll/sites/Shoes/test". 
05/21/2010 12:22:57.09 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yy Medium Activating web-scoped features for template "SubSite#1" at URL "http://mmm-dev-ll/sites/Shoes/test" 05/21/2010 12:22:57.12 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1c Medium Preparing 20 features for activation 05/21/2010 12:22:57.14 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1d Medium Feature Activation: Batch Activating Features at URL http://mmm-dev-ll/sites/Shoes/test 'AnnouncementsList' (ID: '00bfea71-d1ce-42de-9c63-a44004ce0104'), 'ContactsList' (ID: '00bfea71-7e6d-4186-9ba8-c047ac750105'), 'CustomList' (ID: '00bfea71-de22-43b2-a848-c05709900100'), 'DataSourceLibrary' (ID: '00bfea71-f381-423d-b9d1-da7a54c50110'), 'DiscussionsList' (ID: '00bfea71-6a49-43fa-b535-d15c05500108'), 'DocumentLibrary' (ID: '00bfea71-e717-4e80-aa17-d0c71b360101'), 'EventsList' (ID: '00bfea71-ec85-4903-972d-ebe475780106'), 'GanttTasksList' (ID: '00bfea71-513d-4ca0-96c2-6a47775c0119'), 'GridList' (ID: '00bfea71-3a1d-41d3-a0ee-651d11570120'), 'IssuesList' (ID: '00bfea71-5932-4f9c-ad71-1557e5751100'), 'LinksList' (ID: '00bfea71-2062-426c-90bf-714c59600103'), 'NoCodeWorkflowLibrary' (ID: '00bfe... 05/21/2010 12:22:57.14* w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1d Medium ...a71-f600-43f6-a895-40c0de7b0117'), 'PictureLibrary' (ID: '00bfea71-52d4-45b3-b544-b1c71b620109'), 'SurveysList' (ID: '00bfea71-eb8a-40b1-80c7-506be7590102'), 'TasksList' (ID: '00bfea71-a83e-497e-9ba0-7a5c597d0107'), 'WebPageLibrary' (ID: '00bfea71-c796-4402-9f2f-0eb9a6e71b18'), 'workflowProcessList' (ID: '00bfea71-2d77-4a75-9fca-76516689e21a'), 'WorkflowHistoryList' (ID: '00bfea71-4ea5-48d4-a4ad-305cf7030140'), 'XmlFormLibrary' (ID: '00bfea71-1e1d-4562-b56a-f05371bb0115'), 'TeamCollab' (ID: '00bfea71-4ea5-48d4-a4ad-7ea5c011abe5'), . 05/21/2010 12:22:57.15 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1f Medium Feature Activation: Batch Activated Features at URL http://mmm-dev-ll/sites/Shoes/test 'AnnouncementsList' (ID: '00bfea71-d1ce-42de-9c63-a44004ce0104'), 'ContactsList' (ID: '00bfea71-7e6d-4186-9ba8-c047ac750105'), 'CustomList' (ID: '00bfea71-de22-43b2-a848-c05709900100'), 'DataSourceLibrary' (ID: '00bfea71-f381-423d-b9d1-da7a54c50110'), 'DiscussionsList' (ID: '00bfea71-6a49-43fa-b535-d15c05500108'), 'DocumentLibrary' (ID: '00bfea71-e717-4e80-aa17-d0c71b360101'), 'EventsList' (ID: '00bfea71-ec85-4903-972d-ebe475780106'), 'GanttTasksList' (ID: '00bfea71-513d-4ca0-96c2-6a47775c0119'), 'GridList' (ID: '00bfea71-3a1d-41d3-a0ee-651d11570120'), 'IssuesList' (ID: '00bfea71-5932-4f9c-ad71-1557e5751100'), 'LinksList' (ID: '00bfea71-2062-426c-90bf-714c59600103'), 'NoCodeWorkflowLibrary' (ID: '00bfea... 05/21/2010 12:22:57.15* w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1f Medium ...71-f600-43f6-a895-40c0de7b0117'), 'PictureLibrary' (ID: '00bfea71-52d4-45b3-b544-b1c71b620109'), 'SurveysList' (ID: '00bfea71-eb8a-40b1-80c7-506be7590102'), 'TasksList' (ID: '00bfea71-a83e-497e-9ba0-7a5c597d0107'), 'WebPageLibrary' (ID: '00bfea71-c796-4402-9f2f-0eb9a6e71b18'), 'workflowProcessList' (ID: '00bfea71-2d77-4a75-9fca-76516689e21a'), 'WorkflowHistoryList' (ID: '00bfea71-4ea5-48d4-a4ad-305cf7030140'), 'XmlFormLibrary' (ID: '00bfea71-1e1d-4562-b56a-f05371bb0115'), 'TeamCollab' (ID: '00bfea71-4ea5-48d4-a4ad-7ea5c011abe5'), . 
05/21/2010 12:22:57.15 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 88jb Medium Feature Activation: Activating Feature 'RadEditorFeatureRichText' (ID: '747755cd-d060-4663-961c-9b0cc43724e9') at URL http://mmm-dev-ll/sites/Shoes/test. 05/21/2010 12:22:57.15 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 75fb Medium Calling 'FeatureActivated' method of SPFeatureReceiver for Feature 'RadEditorFeatureRichText' (ID: '747755cd-d060-4663-961c-9b0cc43724e9'). 05/21/2010 12:22:57.20 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 75f8 Medium Feature Activation: Feature 'RadEditorFeatureRichText' (ID: '747755cd-d060-4663-961c-9b0cc43724e9') was activated at URL http://mmm-dev-ll/sites/Shoes/test. 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yv Medium Creating default lists 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72lp Medium Creating directory Lists 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yz Medium Creating default modules at URL "http://mmm-dev-ll/sites/Shoes/test" 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8e27 Medium Ensuring module folder _catalogs/masterpage 05/21/2010 12:22:57.56 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72ix Medium Not enough information to determine a list for module "Default". Assuming no list for this module. 05/21/2010 12:22:57.87 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72h8 Medium Successfully applied template "SubSite#1" to web at URL "http://mmm-dev-ll/sites/Shoes/test". 05/21/2010 12:22:59.48 w3wp.exe (0x1E40) 0x0980 Windows SharePoint Services General 8e2s Medium Unknown SPRequest error occurred. More information: 0x80070057 05/21/2010 12:23:07.06 OWSTIMER.EXE (0x0884) 0x106C Office Server Setup and Upgrade 8u3j High Registry key value {SearchThrottled} was not found under registry hive {Software\Microsoft\Office Server\12.0}. Assuming search sku is not throttled. 05/21/2010 12:23:07.08 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 90gf Medium SQL: dbo.proc_MSS_PropagationGetQueryServers 05/21/2010 12:23:07.09 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 8wni High Resuming default catalog with reason 'GPR_PROPAGATION' for application 'SharedServices1'... 05/21/2010 12:23:07.11 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 8wnj High Resuming anchor text catalog with reason GPR_PROPAGATION' for application 'SharedServices1'... 05/21/2010 12:23:07.14 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 8dvl Medium Search application '3c6751cc-37b0-470a-bfa2-bfd0b5635fe1': Provision start addresses in default content source. 
05/21/2010 12:23:07.15 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 7hmh High exception in SearchUpgradeProvisioner Keyword Config System.InvalidOperationException: jobServerSearchServiceInstance is null at Microsoft.Office.Server.Search.Administration.SearchUpgradeProvisioner..ctor(SearchServiceInstance searchServiceInstance) at Microsoft.Office.Server.Search.Administration.OSSPrimaryGathererProject.ProvisionContentSources() 05/21/2010 12:23:29.19 OWSTIMER.EXE (0x0884) 0x0FFC SharePoint Portal Server Business Data 79bv High Initiating BDC Cache Invalidation Check in AppDomain 'DefaultDomain' 05/21/2010 12:23:29.19 OWSTIMER.EXE (0x0884) 0x0FFC SharePoint Portal Server Business Data 79bx High Completed BDC Cache Invalidation Check in AppDomain 'DefaultDomain' 05/21/2010 12:23:45.84 OWSTIMER.EXE (0x0884) 0x08A8 SharePoint Portal Server SSO 8inc Medium In SSOService::Synch(), sso database conn string: 05/21/2010 12:23:50.80 OWSTIMER.EXE (0x0884) 0x0F14 Excel Services Excel Services Administration 8tqi Medium ExcelServerSharedWebApplication.Synchronize: Starting synchronize for instance of Excel Services in SSP 'SharedServices1'. 05/21/2010 12:23:50.80 OWSTIMER.EXE (0x0884) 0x0F14 Excel Services Excel Services Administration 8tqj Medium ExcelServerSharedWebApplication.Synchronize: Successfully synchronized instance of Excel Services in SSP 'SharedServices1'. 05/21/2010 12:23:52.31 w3wp.exe (0x1E40) 0x0980 SharePoint Portal Server Runtime 8gp7 Medium Topology cache updated. (AppDomain: /LM/W3SVC/1963195510/Root-1-129188762904047141)

  • .NET HTML Sanitation for rich HTML Input

    - by Rick Strahl
    Recently I was working on updating a legacy application to MVC 4 that included free-form text input. When I set up the new site, my initial approach was to not allow any rich HTML input, only simple text formatting that would respect a few simple HTML commands for bold, lists etc. and automatically handle line-break processing for new lines and paragraphs. This is typical for what I do with most multi-line text input in my apps, and it works very well with very little development effort involved.

    Then the client sprang another note: oh, by the way, we have a bunch of customers (real estate agents) who need to post complete HTML documents. Oh uh! There goes the simple theory. After some discussion and pleading on my part (<snicker>) to try and avoid this type of raw HTML input because of potential XSS issues, the client decided to go ahead and allow raw HTML input anyway.

    There have been lots of discussions on this subject on StackOverflow (and here and here), but after reading through some of the solutions I didn't really find anything that would work even closely for what I needed. Specifically, we need to be able to allow just about any HTML markup, with the exception of script code. Remote CSS and images need to be loaded, links need to work and so on. While the 'legit' HTML posted by these agents is basic in nature, it does span most of the full gamut of HTML (4). Most of the XSS prevention/sanitizer solutions I found were way too aggressive and rendered the posted output unusable, mostly because they tend to strip any externally loaded content. In short, I needed a custom solution.

    I thought the best solution would be to use an HTML parser - in this case the Html Agility Pack - and then to run through all the HTML markup provided and remove any of the blacklisted tags, plus a number of attributes that are prone to JavaScript injection. There's much discussion on whether to use blacklists vs. whitelists in the discussions mentioned above. While whitelists can make sense in simple scenarios where you might allow manual HTML input, when you need to allow a larger array of HTML functionality a blacklist is probably easier to manage, as the vast majority of elements and attributes can be allowed. Whitelisting also gets a bit more complex with HTML5 and the proliferation of new HTML tags, and most new tags generally don't affect XSS issues directly. Pure whitelisting based on elements and attributes also doesn't capture many edge cases (see some of the XSS cheat sheets listed at the end of the post), so even with a whitelist, custom logic is still required to handle many of those edge cases.

    The Microsoft Web Protection Library (AntiXSS)

    My first thought was to check out the Microsoft AntiXSS library. Microsoft has an HTML encoding and sanitation library in the Microsoft Web Protection Library (formerly AntiXSS Library) on CodePlex, which provides stricter functions for whitelist encoding and sanitation. Initially I thought the Sanitizer class and its static members would do the trick for me, but I found that this library is way too restrictive for my needs. Specifically, the Sanitizer class strips out images and links, which rendered the full HTML from our real estate clients completely useless. I didn't spend much time with it, but apparently I'm not alone in feeling this library is not really useful without some way to configure its operation.
    To give you an example of what didn't work for me with the library, here's a small and simple HTML fragment that includes script, img and anchor tags. I would expect the script to be stripped and everything else to be left intact. Here's the original HTML:

        var value = "<b>Here</b> <script>alert('hello')</script> we go. Visit the " +
                    "<a href='http://west-wind.com'>West Wind</a> site. " +
                    "<img src='http://west-wind.com/images/new.gif' /> ";

    and the code to sanitize it with the AntiXSS Sanitizer class:

        @Html.Raw(Microsoft.Security.Application.Sanitizer.GetSafeHtmlFragment(value))

    This produced a not so useful sanitized string:

        Here we go. Visit the <a>West Wind</a> site.

    While it removed the <script> tag (good), it also removed the href from the link and the image tag altogether (bad). In some situations this might be useful, but for most tasks I doubt this is the desired behavior. While links can contain javascript: references and images can 'broadcast' information to a server, without configuration to tell the library what to restrict this becomes useless to me. I couldn't find any way to customize the whitelist, nor is there code available in this 'open source' library on CodePlex.

    Using Html Agility Pack for HTML Parsing

    The WPL library wasn't going to cut it. After doing a bit of research I decided the best approach for a custom solution would be to use an HTML parser and inspect the HTML fragment/document I'm trying to import. I've used the Html Agility Pack before for a number of apps where I needed an HTML parser without requiring an instance of a full browser like the Internet Explorer Application object, which is inadequate in Web apps. In case you haven't checked out the Html Agility Pack before, it's a powerful HTML parser library that you can use from your .NET code. It provides a simple, parsable HTML DOM model for full HTML documents or HTML fragments that lets you walk through each of the elements in your document. If you've used the HTML or XML DOM in a browser before, you'll feel right at home with the Agility Pack.

    Blacklist-based HTML Parsing to strip XSS Code

    For my purposes of HTML sanitation, the process is to walk the HTML document one element at a time and then check each element and attribute against a blacklist. There's quite a bit of argument about what's better: a whitelist of allowed items or a blacklist of denied items. While whitelists tend to be more secure, they also require a lot more configuration, and in the case of HTML5 a whitelist could be very extensive. For what I need, I only want to ensure that no JavaScript is executed, so the blacklist includes the obvious <script> tag plus any tag that allows loading of external content, including <iframe>, <object>, <embed> and <link> etc. <form> is also excluded, to avoid posting content to a different location. I also disallow <head> and <meta> tags in particular for my case, since I'm only allowing posting of HTML fragments. There is also some internal logic to exclude attributes that include references to JavaScript or CSS expressions. The default tag blacklist reflects my use case, but is customizable and can be added to.
Here's my HtmlSanitizer implementation:

    using System.Collections.Generic;
    using System.IO;
    using System.Xml;
    using HtmlAgilityPack;

    namespace Westwind.Web.Utilities
    {
        public class HtmlSanitizer
        {
            public HashSet<string> BlackList = new HashSet<string>()
            {
                "script", "iframe", "form", "object",
                "embed", "link", "head", "meta"
            };

            /// <summary>
            /// Cleans up an HTML string and removes HTML tags in the blacklist
            /// </summary>
            public static string SanitizeHtml(string html, params string[] blackList)
            {
                var sanitizer = new HtmlSanitizer();
                if (blackList != null && blackList.Length > 0)
                {
                    sanitizer.BlackList.Clear();
                    foreach (string item in blackList)
                        sanitizer.BlackList.Add(item);
                }
                return sanitizer.Sanitize(html);
            }

            /// <summary>
            /// Cleans up an HTML string by removing elements
            /// on the blacklist and all attributes that start with onXXX
            /// </summary>
            public string Sanitize(string html)
            {
                var doc = new HtmlDocument();
                doc.LoadHtml(html);

                SanitizeHtmlNode(doc.DocumentNode);

                // Use an XmlTextWriter to create self-closing tags
                string output = null;
                using (StringWriter sw = new StringWriter())
                {
                    XmlWriter writer = new XmlTextWriter(sw);
                    doc.DocumentNode.WriteTo(writer);
                    output = sw.ToString();

                    // strip off the XML doc header the writer adds
                    if (!string.IsNullOrEmpty(output))
                    {
                        int at = output.IndexOf("?>");
                        if (at > -1)
                            output = output.Substring(at + 2);
                    }
                    writer.Close();
                }

                return output;
            }

            private void SanitizeHtmlNode(HtmlNode node)
            {
                if (node.NodeType == HtmlNodeType.Element)
                {
                    // check for blacklisted elements and remove them entirely
                    if (BlackList.Contains(node.Name))
                    {
                        node.Remove();
                        return;
                    }

                    // remove <style> blocks that embed CSS expressions or script links
                    if (node.Name == "style")
                    {
                        string css = (node.InnerHtml ?? string.Empty).ToLower();
                        if (css.Contains("expression") || css.Contains("javascript:"))
                        {
                            node.ParentNode.RemoveChild(node);
                            return;
                        }
                    }

                    // remove script-bearing attributes
                    if (node.HasAttributes)
                    {
                        for (int i = node.Attributes.Count - 1; i >= 0; i--)
                        {
                            HtmlAttribute currentAttribute = node.Attributes[i];
                            var attr = currentAttribute.Name.ToLower();
                            var val = currentAttribute.Value != null
                                          ? currentAttribute.Value.ToLower()
                                          : null;

                            // remove event handlers (onclick, onmouseover etc.)
                            if (attr.StartsWith("on"))
                                node.Attributes.Remove(currentAttribute);

                            // remove javascript: links in any attribute
                            // (could be narrowed to href, src, dynsrc and lowsrc)
                            else if (val != null && val.Contains("javascript:"))
                                node.Attributes.Remove(currentAttribute);

                            // remove CSS expressions and script protocols in style attributes
                            else if (attr == "style" && val != null &&
                                     (val.Contains("expression") ||
                                      val.Contains("javascript:") ||
                                      val.Contains("vbscript:")))
                                node.Attributes.Remove(currentAttribute);
                        }
                    }
                }

                // look through child nodes recursively
                if (node.HasChildNodes)
                {
                    for (int i = node.ChildNodes.Count - 1; i >= 0; i--)
                        SanitizeHtmlNode(node.ChildNodes[i]);
                }
            }
        }
    }

Please note: use this as a starting point only for your own parsing and review the code for your specific use case! If your needs are stricter than mine were, you can make this much more restrictive - for example by not allowing src and href attributes or CSS links if your HTML doesn't call for them. You can also check links for external URLs and disallow those - lots of options. The code is simple enough to extend to fit your use cases more specifically, and it's also quite easy to make it work using a whitelist approach if you want to go that route.
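Putting the class to work is a one-liner. Here's a hedged usage sketch - the model and property names (model.PostBody) are made up for illustration, but the calls match the static method shown above:

    // Sanitize with the default blacklist before storing user input
    string safeHtml = HtmlSanitizer.SanitizeHtml(model.PostBody);

    // Or pass a custom blacklist - here the defaults plus <style> blocks
    string stricter = HtmlSanitizer.SanitizeHtml(model.PostBody,
        "script", "iframe", "form", "object", "embed",
        "link", "head", "meta", "style");

Run the earlier sample fragment through SanitizeHtml and the <script> tag should be stripped while the link keeps its href and the image survives - exactly the behavior the AntiXSS Sanitizer couldn't deliver. In a Razor view you would then render the stored, sanitized value with @Html.Raw().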
The code above is semi-generic, allowing full-featured HTML fragments while disallowing only script-related content. The Sanitize method walks through each node of the document and recursively drills into all of its children until the entire document has been traversed. Note that the code uses an XmlTextWriter to write the output - this is done to preserve XHTML-style self-closing tags, which would otherwise be written as non-self-closing tags. The sanitizer scans for blacklisted elements and removes those that aren't allowed. Note that the blacklist is configurable, either as a property on an instance or via the string parameter list on the static method. Additionally the code goes through each element's attributes and applies a host of rules gleaned from some of the XSS cheat sheets listed at the end of the post. Clearly there are a lot more XSS vulnerabilities out there, but many of them apply only to ancient browsers (IE6 and old versions of Netscape) - many of the glaring holes (like CSS expressions - WTF, IE?) have been removed from modern browsers.

What a Pain

To be honest, this is NOT a piece of code that I wanted to write. I think building anything related to XSS is better left to people who have far more knowledge of the topic than I do. Unfortunately, I was unable to find a tool that came even close to working for me, or even provided a working base. For the project I was working on I had no choice, and I'm sharing the code here merely as a baseline to start with and potentially expand on for specific needs. It's sad that the Microsoft Web Protection Library is currently such a train wreck - this is really something that should come from Microsoft as the systems vendor, or possibly from a third party that provides security tools.

Luckily for my application we are dealing with authenticated and validated users, so the user base is fairly well known and relatively small - this is not a wide-open, directly public-facing Internet application. As I mentioned earlier in the post, if I had my way I would simply not allow this type of raw HTML input in the first place, and instead rely on a more controlled HTML input mechanism like Markdown, or a good HTML edit control that can put some limits on what types of input are allowed. Alas, in this case I was overridden and we had to go forward and allow *any* raw HTML to be posted.
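For comparison, the kind of restricted text formatting described at the top of this post takes only a few lines. The following is a rough sketch under my own assumptions - the class name, the allowed-tag list and the paragraph handling are all illustrative, not code from this project: encode everything, re-enable a handful of harmless tags, and convert line breaks.

    using System;
    using System.Net;
    using System.Text.RegularExpressions;

    public static class SimpleTextFormatter
    {
        // Only these harmless formatting tags survive encoding
        static readonly string[] AllowedTags = { "b", "i", "u", "ul", "ol", "li" };

        public static string Format(string input)
        {
            // HTML-encode everything first so no raw markup gets through
            string output = WebUtility.HtmlEncode(input);

            // Re-enable the whitelisted tags (opening and closing forms)
            foreach (string tag in AllowedTags)
                output = Regex.Replace(output,
                    "&lt;(/?)" + tag + "&gt;", "<$1" + tag + ">",
                    RegexOptions.IgnoreCase);

            // Blank lines become paragraphs, single newlines become <br />
            output = output.Replace("\r\n", "\n");
            string[] paragraphs = output.Split(new[] { "\n\n" },
                                               StringSplitOptions.None);
            for (int i = 0; i < paragraphs.Length; i++)
                paragraphs[i] = "<p>" + paragraphs[i].Replace("\n", "<br />") + "</p>";

            return string.Join("\n", paragraphs);
        }
    }

Because everything is encoded up front and only a fixed set of literal tags is re-enabled, there's no attack surface for attributes or script protocols at all - which is why this approach needs so little effort compared to sanitizing raw HTML.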
Sometimes I really feel sad that it's come this far - how many good applications and tools have been thwarted by fear of XSS (or worse) attacks? So many things could be done *if* we had a more secure browser experience and didn't have to deal with every little script twerp trying to hack into Web pages and obscure browser bugs. So much time wasted building secure apps, so much time wasted by others trying to hack apps… We're a funny species - no other species manages to waste as much time, effort and resources as we humans do :-)

Resources

Code on GitHub
Html Agility Pack
XSS Cheat Sheet
XSS Prevention Cheat Sheet
Microsoft Web Protection Library (AntiXss)

StackOverflow Links:
http://stackoverflow.com/questions/341872/html-sanitizer-for-net
http://blog.stackoverflow.com/2008/06/safe-html-and-xss/
http://code.google.com/p/subsonicforums/source/browse/trunk/SubSonic.Forums.Data/HtmlScrubber.cs?r=61

© Rick Strahl, West Wind Technologies, 2005-2012
Posted in Security  HTML  ASP.NET  JavaScript
