Search Results

  • Does SQL Server 2005 return error message numbers to the ASP.NET application?

    - by Duke
    I'd like to get the message number and severity level information from SQL Server upon execution of an erroneous query. For example, when a user attempts to delete a row being referenced by another record, and the cascade relationship is "no action", I'd like the application to be able to check for error message 547 ("The DELETE statement conflicted with the REFERENCE constraint...") and return a user-friendly and localized message to the user. When running such a query directly on SQL Server, the following message is printed:

        Msg 547, Level 16, State 0, Line 1
        <Error message...>

    In an ASP.NET app, is this information available in an event handler parameter or elsewhere? Also, I don't suppose anyone knows where I can find a definitive reference of SQL Server message numbers?
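
    A minimal ADO.NET sketch of how this surfaces in an ASP.NET app (the connection string, table name and message text below are hypothetical): the SqlException thrown by the failed command carries the message number in its Number property and the severity level in its Class property, so error 547 can be caught and mapped to a localized string.

        using System;
        using System.Data.SqlClient;

        public static class OrderRepository
        {
            // Tries the delete and maps a foreign-key violation (Msg 547) to a friendly message.
            public static string TryDeleteOrder(string connectionString, int orderId)
            {
                try
                {
                    using (var conn = new SqlConnection(connectionString))
                    using (var cmd = new SqlCommand("DELETE FROM dbo.Orders WHERE OrderID = @id", conn))
                    {
                        cmd.Parameters.AddWithValue("@id", orderId);
                        conn.Open();
                        cmd.ExecuteNonQuery();
                        return null; // success, no message to show
                    }
                }
                catch (SqlException ex)
                {
                    // ex.Number is the Msg number (547 here), ex.Class is the severity level (16)
                    if (ex.Number == 547)
                        return "This record is still referenced by other records and cannot be deleted.";
                    throw;
                }
            }
        }

    As for a definitive list of the numbers themselves, the sys.messages catalog view in SQL Server 2005 contains every message number, severity and text.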

  • How do you encrypt data between client and server running in Flash and Java?

    - by ArmlessJohn
    We have a multi-client system where the client is written in Flash and the server is written in Java. Currently, communication is done in Flash by use of flash.net.Socket and the protocol is written in JSON. The server uses a custom port to receive connections and then proceeds to talk with each client. As expected, data is sent and received on both fronts as raw bytes, which are then decoded as needed. We would like to encrypt the communication between clients and server. I have some basic understanding of public/private key encryption, but I do not know what the best way to exchange keys is or what libraries are available (in both languages) to do this. What would be the best strategy to attack this problem, and where should I start looking for libraries/methods to implement this encryption?

  • How to shut down a local Tomcat server when closing the browser window?

    - by agez
    Hi, I have a web app running on a local Tomcat server. When the user starts the app (via desktop shortcut), the server starts and the app is opened in a browser window. But when the user just clicks the close button to stop the application, the server is still running in the background - that's annoying. I tried to utilize the "onunload" and "onbeforeunload" events from JavaScript, but unfortunately these events are also fired on some other requests in the app. So I can't use them unless I do a lot of refactoring. Does anyone have an idea for a possible solution? Btw, what I find interesting is the behaviour of Visual Studio when debugging a web application. When I close the browser window, Visual Studio also gets a trigger to stop debug mode. So it seems it somehow notices the close event of the browser window, which would be exactly what I need. But I don't know how they do it... Cheers, Helmut

  • Forwarding HTTP Request with Direct Server Return

    - by Daniel Crabtree
    I have servers spread across several data centers, each storing different files. I want users to be able to access the files on all servers through a single domain and have the individual servers return the files directly to the users. The following shows a simple example:

    1) The user's browser requests http://www.example.com/files/file1.zip
    2) The request goes to server A, based on the DNS A record for example.com.
    3) Server A analyzes the request and works out that /files/file1.zip is stored on server B.
    4) Server A forwards the request to server B.
    5) Server B returns file1.zip directly to the user without going through server A.

    Note: steps 4 and 5 must be transparent to the user and cannot involve sending a redirect to the user, as that would violate the requirement of a single domain. From my research, what I want to achieve is called "Direct Server Return" and it is a common setup for load balancing. It is also sometimes called a half reverse proxy. For step 4, it sounds like I need to do MAC address translation and then pass the request back onto the network; for servers outside server A's network, tunneling will be required. For step 5, I simply need to configure server B as per the real servers in a load-balancing setup. Namely, server B should have server A's IP address on the loopback interface and it should not answer any ARP requests for that IP address. My problem is how to actually achieve step 4. I have found plenty of hardware and software that can do this for simple load balancing at layer 4, but these solutions fall short and cannot handle the kind of custom routing I require. It seems like I will need to roll my own solution. Ideally, I would like to do the routing/forwarding at the web server level, i.e. in PHP or C# / ASP.NET. However, I am open to doing it at a lower level such as Apache or IIS, or at an even lower level, i.e. a custom proxy service in front of everything.

  • persist values/variables from page to page

    - by Todd
    Hello, I'm wondering if there's another solution to my problem that's considered more the SharePoint way. Firstly, my site is an Internet site, not an intranet. The problem is, all I'm trying to do is save values/variables from page to page in SharePoint. I know the issues with Session variables, but this seems to be the only way I can see to accomplish this. I know there are web parts that can store this value, but am I wrong in thinking this won't be persisted from page to page? Basically, I'll be extending the Content Query Web Part to dynamically filter its results based on a variable/value. The user chooses their 'area' from a dropdown, and the CQWPs in the site will change and query results based on this value (it will be a provincial structure as it is a Canadian site, so if someone chooses the province 'Ontario', this value is saved in a global variable, and the extended CQWPs throughout the site will get this value and query lists flagged as Ontario). Are Session variables the only solution? Thanks everyone!
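
    A rough, hedged sketch of the Session-variable approach being weighed here (the key name, province list and web part classes are all hypothetical, and ASP.NET session state must be enabled in the SharePoint web application for it to work): one web part writes the chosen area into session state, and any other part, such as an extended CQWP, reads it back on each page.

        using System;
        using System.Web;
        using System.Web.UI.WebControls;
        using System.Web.UI.WebControls.WebParts;

        // Web part with the province dropdown; stores the selection in session state.
        public class ProvincePickerPart : WebPart
        {
            protected override void CreateChildControls()
            {
                var provinceList = new DropDownList { AutoPostBack = true };
                provinceList.Items.Add("Ontario");
                provinceList.Items.Add("Quebec");
                provinceList.SelectedIndexChanged += delegate
                {
                    HttpContext.Current.Session["SelectedProvince"] = provinceList.SelectedValue;
                };
                Controls.Add(provinceList);
            }
        }

        // Any other web part (e.g. an extended CQWP) reads the value back per request.
        public class ProvinceAwarePart : WebPart
        {
            protected override void OnLoad(EventArgs e)
            {
                base.OnLoad(e);
                object saved = HttpContext.Current.Session == null
                    ? null
                    : HttpContext.Current.Session["SelectedProvince"];
                string province = saved as string ?? "Ontario";
                // ...use 'province' to filter the items this part queries...
            }
        }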

  • Keeping the focus on a web part on a SharePoint site after clicking

    - by Brigadier Jigar
    Hi Friends, I have deployed 3 web parts on a site on a SharePoint server. In my 3rd web part, I have a functionality which needs a button click. Now when I click that button, the whole page gets refreshed and I have to scroll down to see the change that happened due to clicking that button in the 3rd web part. Is there any way that, when I click on any point of the web part, the page is reloaded but the control point or the focus remains at the same position where I clicked? Help me out. Regards, Jigar
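
    One hedged, ASP.NET-level sketch that addresses the scrolling problem without custom JavaScript: the hosting page can be asked to record and restore the scroll position across full postbacks via Page.MaintainScrollPositionOnPostBack (the web part class and button logic below are hypothetical).

        using System;
        using System.Web.UI.WebControls;
        using System.Web.UI.WebControls.WebParts;

        // Minimal sketch: the part opts the page into scroll-position restore, so a full
        // postback from its button brings the user back to roughly the same spot.
        public class ScrollFriendlyPart : WebPart
        {
            protected override void OnInit(EventArgs e)
            {
                base.OnInit(e);
                Page.MaintainScrollPositionOnPostBack = true; // built-in ASP.NET 2.0+ behaviour
            }

            protected override void CreateChildControls()
            {
                var button = new Button { Text = "Do work" };
                button.Click += delegate
                {
                    // ...the actual click logic that refreshes the part's data goes here...
                };
                Controls.Add(button);
            }
        }

    Wrapping the button's content in an UpdatePanel is another common way to avoid the full-page refresh altogether.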

  • SharePoint Designer XSLT: count boolean nodes = true

    - by Heather Masters
    I have a SharePoint list I converted to XSLT to do some additional grouping, counting and percentages. I need to return the number of items = true within my node-set. I have:

        <xsl:value-of select="count($nodeset/@PartnerArrivedAtCall)"/>

    (which returns the count of all the nodes). I have tried

        <xsl:value-of select="count($nodeset/@PartnerArrivedAtCall [@PartnerArrivedAtCall = 'Yes'])"/>

    (returns zero) and

        <xsl:variable name="ArrivedYes" select="$nodeset/@PartnerArrivedAtCall [@PartnerArrivedAtCall='Yes']"/>

    (also returns zero). Can you please give me a good example of how to count only the true values (in my XML, true = "Yes")? Thanks!

  • What's the best way to test SQL Server connection programmatically?

    - by backslash17
    I need to develop a single routine that will be fired every 5 minutes to check whether a list of SQL Servers (10 to 12) are up and running. I could run a simple query on each of the servers, but this means I would have to create a table, view or stored procedure on every server, and even if I use an already-made SP I need to have a registered user on each server too. The servers are not in the same physical location, so having those requirements would be a complex task. Is there a way to simply "ping" a SQL Server from C#? Thanks in advance!
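
    A minimal sketch of one way to do that "ping" from C# without creating any objects on the servers: just try to open a connection with a short timeout (this assumes the account running the check can at least authenticate against each instance; the server names and timeout are placeholders).

        using System;
        using System.Data.SqlClient;

        public static class SqlPing
        {
            // Returns true if a connection to the given server opens within the timeout.
            public static bool IsAlive(string server, int timeoutSeconds)
            {
                string connectionString = string.Format(
                    "Server={0};Database=master;Integrated Security=True;Connect Timeout={1}",
                    server, timeoutSeconds);
                try
                {
                    using (var conn = new SqlConnection(connectionString))
                    {
                        conn.Open(); // throws if the instance is down or unreachable
                        return true;
                    }
                }
                catch (SqlException)
                {
                    return false;
                }
            }
        }

        // Usage sketch: run every 5 minutes over the 10-12 server names and log the ones
        // for which SqlPing.IsAlive(name, 5) returns false.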

  • Allow editing of config files by Windows Server 2008 admins running non-elevated?

    - by Justin Grant
    My company produces a cross-platform server application which loads its configuration from user-editable configuration files. On Windows, config files are locked down at setup time to allow reading by all users but restrict editing to Administrators only. Unfortunately, on Windows Server 2008, even local administrators no longer have admin privileges (because of UAC) unless they're running an elevated app. My question is: if a Windows Server 2008 admin wants to edit an admins-only config file, how does he normally do it? Is he forced to use a text editor which is smart enough to auto-elevate when elevation is needed, like Windows Explorer does in response to access-denied errors? Or is there something we can do in our app (e.g. in the ACLs we lay down at setup time) which signals to apps (or Explorer) that elevation is needed before editing the file, or which otherwise makes our app friendlier to admins running on modern Windows OSes?
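
    For the "ACLs we lay down at setup time" idea, one possible (untested here) pattern is to create a dedicated local group at install time and grant it Modify rights on the config file; unlike the built-in Administrators group, a custom group is not filtered out of the limited UAC token, so its members can edit the file without elevating. A hedged sketch, where the group name is an assumption:

        using System.IO;
        using System.Security.AccessControl;

        public static class ConfigAcl
        {
            // Grants the given local group Modify rights on the config file. The group
            // (e.g. "AppConfigEditors") would be created by the installer beforehand.
            public static void AllowGroupToEdit(string configPath, string groupName)
            {
                FileSecurity security = File.GetAccessControl(configPath);
                security.AddAccessRule(new FileSystemAccessRule(
                    groupName,
                    FileSystemRights.Modify,
                    AccessControlType.Allow));
                File.SetAccessControl(configPath, security);
            }
        }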

  • Connecting to SharePoint using HttpClient

    - by Aravindan R
    I am trying to connect to SharePoint 2007 using HttpClient 4. The authentication scheme is NTLM and I wrote an implementation based on this (http://hc.apache.org/httpcomponents-client-ga/ntlm.html). When I try to connect supplying NTCredentials (username, password, host & domain), I get a 500 error saying "The requested function is not supported". I am trying to understand this error. Does anyone know what this error is, and why it is occurring? Thanks

  • Installing SharePoint 2010 on a dev machine with an external database

    - by Dan Revell
    I've been following Microsoft's guide for installing a dev environment on Windows 7: http://msdn.microsoft.com/en-us/library/ee554869.aspx In order for it not to run like a dog, I've created a SQL Server 2008 instance on our database server specifically for this dev machine. The article does mention that you might want to use an external database, but only in regard to making sure the database cumulative update is installed. It doesn't make any other mention of configuring it to use an external database. I was hoping that the configuration wizard would then prompt me for which database to use, but annoyingly it just set up the configuration database locally. How do I go about installing SharePoint on a dev environment with an external database, and will I need to reformat this machine and do it all again?

  • Deleting list items via ProcessBatchData()

    - by q-tuyen
    You can build a batch string to delete all of the items from a SharePoint list like this:

        // create new StringBuilder
        StringBuilder batchString = new StringBuilder();

        // add the main text to the stringbuilder
        batchString.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?><ows:Batch OnError=\"Continue\">");

        // add each item to the batch string and give it a command Delete
        foreach (SPListItem item in itemCollection)
        {
            // create a new method section
            batchString.Append("<Method>");
            // insert the list id to know where to delete from
            batchString.Append("<SetList Scope=\"Request\">" + Convert.ToString(item.ParentList.ID) + "</SetList>");
            // the item to delete
            batchString.Append("<SetVar Name=\"ID\">" + Convert.ToString(item.ID) + "</SetVar>");
            // set the action you would like to perform
            batchString.Append("<SetVar Name=\"Cmd\">Delete</SetVar>");
            // close the method section
            batchString.Append("</Method>");
        }

        // close the batch section
        batchString.Append("</ows:Batch>");

        // perform the batch
        SPContext.Current.Web.ProcessBatchData(batchString.ToString());

    The only disadvantage that I can think of right now is that all the items you delete will be put in the Recycle Bin. How can I prevent that? I also found a solution like the one below:

        // web is the SPWeb object your lists belong to
        // Before starting your deletion
        web.Site.WebApplication.RecycleBinEnabled = false;
        // When you are finished, re-enable it
        web.Site.WebApplication.RecycleBinEnabled = true;

    Ref [Here](http://www.entwicklungsgedanken.de/2008/04/02/how-to-speed-up-the-deletion-of-large-amounts-of-list-items-within-sharepoint/) But the disadvantage of that solution is that not only are future deletions kept out of the Recycle Bin; the existing Recycle Bin items are deleted as well, which users do not want. Any idea how to prevent the existing items from being deleted? Many thanks in advance, TQT

  • How to force SQL Server 2008 not to change the AUTOINC_NEXT value when IDENTITY_INSERT is ON?

    - by evilek
    Hello, I have a question about IDENTITY_INSERT. When you change it to ON, SQL Server automatically changes the AUTOINC_NEXT value to the last value inserted as the identity. So if you have only one row with ID = 1 and insert a row with ID = 100 while IDENTITY_INSERT is ON, then the next inserted row will have ID = 101. I'd like it to be 2 without needing to reseed. Such behaviour already exists in SQL Server Compact 3.5. Is it possible to force SQL Server 2008 not to change the AUTOINC_NEXT value while doing an insert with IDENTITY_INSERT = ON?

  • How do I make use of multiple cores in Large SQL Server Queries?

    - by Jonathan Beerhalter
    I have two SQL Servers, one for production and one as an archive. Every night, we've got a SQL job that runs and copies the day's production data over to the archive. As we've grown, this process takes longer and longer. When I watch the utilization on the archive server running the archival process, I see that it only ever makes use of a single core. And since this box has eight cores, this is a huge waste of resources. The job runs at 3 AM, so it's free to take any and all resources it can find. So what I need to do is figure out how to structure SQL Server jobs so they can take advantage of multiple cores, but I can't find any literature on tackling this problem. We're running SQL Server 2005, but I could certainly push for an upgrade if 2008 takes care of this problem.
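
    When the nightly copy is essentially one big INSERT ... SELECT, SQL Server will often run it on a single scheduler, so one hedged workaround from the calling side is to split the work into independent pieces (per table, or per date range) and run them on parallel connections. Everything below - the linked server name PROD, the database and table names, and the CreatedOn column - is hypothetical.

        using System.Data.SqlClient;
        using System.Threading.Tasks;

        public static class ArchiveJob
        {
            // Copies each table on its own connection so the archive server can put the
            // separate statements on separate cores.
            public static void Run(string archiveConnectionString, string[] tables)
            {
                Parallel.ForEach(tables, table =>
                {
                    using (var conn = new SqlConnection(archiveConnectionString))
                    using (var cmd = conn.CreateCommand())
                    {
                        cmd.CommandText =
                            "INSERT INTO dbo." + table + " " +
                            "SELECT * FROM PROD.ProductionDb.dbo." + table + " " +
                            "WHERE CreatedOn >= DATEADD(day, -1, GETDATE())";
                        cmd.CommandTimeout = 0; // long-running copy
                        conn.Open();
                        cmd.ExecuteNonQuery();
                    }
                });
            }
        }

    The trade-off is more simultaneous load and locking on both servers, so it's worth testing against the 3 AM window first.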

  • Why does the SQL Server Express 2008 install require Visual Studio 2008 in its checklist?

    - by asksuperuser
    When installing SQL Server Express Edition 2008, the checklist says "Previous version of Visual Studio 2008" and asks me to upgrade to SP1. Unfortunately, SP1 for some reason refuses to install on my brand-new PC (Windows 7). So why can't I just bypass this? Why would SQL Server Express need VS 2008 to install? That's insane. The SQL Server install used to be as easy as 1-2-3; now it has become a nightmare, like installing Oracle. Will I have to go back to Windows XP?

  • SQL Server 2005: When clicking "Add" to attach a database I keep getting 'verify that the path or file exists'

    - by Code Sherpa
    When I right-click on "databases" in SQL Server 2005 Management Studio and then Attach... Add, I get the following error: C:\Documents and Settings\Administrator\My Documents\SQL Server Management Studio\ Projects\Path\To\MDF\And\LDF\Files\ cannot access the specified path or file on the server. Verify that you have the necessary security privileges and that the path or file exists. The answer is easy - the MDF and LDF files were removed when NAnt (by way of my dev machine) issued a drop command. But, after replacing the MDF and LDF files, I want to reattach the database, and the above error keeps coming up when I select "Add". Also, I have already "unattached" the database in question and it no longer appears on the left under "databases". I have tried to replace a copy of the MDF and LDF files in the folder being referenced and that didn't work. Any ideas as to how to gracefully get rid of this error?

  • Can SQL Server 2008 Express be used for offering commercial web hosting services?

    - by Tony_Henrich
    I read that the SQL Server 2008 Express R2 database size limit has increased to 10 GB. That's good news. Can I use the Express edition of SQL Server to offer web hosting services to the public? Microsoft would be best at answering this, but I can't find a clear answer on their site. I am also seeing several Windows web hosting plans that include SQL Server as a total package for less than $5/month. I am wondering how they can afford to offer this.

  • Outlook 2007 unable to connect to Exchange server

    - by mattwarren
    I came into the office a week ago and Outlook has refused to connect ever since; it just says "Disconnected" in the bottom right-hand corner. I've tried restarting it, rebooting Windows, etc. I'm the only one in our office who is having this problem, so it's not a general problem with the server. Things I've tried:

    - Pinging the server via IP address and host name; both work fine
    - Connecting via OWA; this works using the same machine name
    - Connecting to Exchange via HTTP ("Outlook Anywhere"); doesn't work

    None of the suggestions in this question helped: http://serverfault.com/questions/21755/can-ping-exchange-server-cant-connect-outlook-to-it Disabling the Windows firewall on my laptop also has no effect. There are no items in the event viewer that indicate that anything is up. Also, no permissions have changed on the server since it worked. What else can I do to diagnose this? Any suggestions?

  • Exchange 2010 add mailbox server to DAG error

    - by Michael
    Hello, I'm having some problems when adding a second mailbox server to my DAG in Exchange 2010. The test setup goes like this: 1x Windows Server 2008 (DC/DNS), 2x Windows Server 2008 (Exchange 2010). I have made sure all services are up and running and that the "Exchange Trusted Subsystem" account is set as a local admin. When I create a DAG I can add the first mailbox server (A) without any problems, but when I go to add the second (B) it gives me an error saying "Unable to contact the Cluster service on 1 other members (member) of the Database availability group." It does the same if I add (B) first and then try to add (A). Here is a part of the log file:

        [2010-04-05T15:00:27] GetRemoteCluster() for the mailbox server failed with exception = An Active Manager operation failed. Error: An error occurred while attempting a cluster operation. Error: Cluster API '"OpenCluster(EXCHANGE20102.area51.com) failed with 0x6d9. Error: There are no more endpoints available from the endpoint mapper"' failed.. This is OK.
        [2010-04-05T15:00:27] Ignoring previous error, as it is acceptable if the cluster does not exist yet.
        [2010-04-05T15:00:27] DumpClusterTopology: Opening remote cluster AREA51DAG01.
        [2010-04-05T15:00:27] DumpClusterTopology: Failed opening with Microsoft.Exchange.Cluster.Replay.AmClusterApiException: An Active Manager operation failed. Error: An error occurred while attempting a cluster operation. Error: Cluster API '"OpenCluster(AREA51DAG01.area51.com) failed with 0x5. Error: Access is denied"' failed. --- System.ComponentModel.Win32Exception: Access is denied --- End of inner exception stack trace ---

    Any help would be really appreciated, thanks.

  • Gmail - error adding POP3 account from my mail server (Postfix + Courier)

    - by Lucas Lobosque
    I use Courier to add POP3/IMAP support to my mail server, and I get this when I try to add a new POP3 account in Gmail: Server returned error: "Missing +OK response upon connecting to the server: * OK [CAPABILITY IMAP4rev1 UIDPLUS CHILDREN NAMESPACE THREAD=ORDEREDSUBJECT THREAD=REFERENCES SORT QUOTA IDLE ACL ACL2=UNION STARTTLS] Courier-IMAP ready. Copyright 1998-2011 Double Precision, Inc. See COPYING for distribution information." Any help on how to fix this would be appreciated.

  • Dig returns "status: REFUSED" for external queries?

    - by Mikey
    I can't seem to work out why my DNS isn't working properly. If I run dig from the nameserver it functions correctly:

        # dig ungl.org

        ; <<>> DiG 9.5.1-P2.1 <<>> ungl.org
        ;; global options: printcmd
        ;; Got answer:
        ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 24585
        ;; flags: qr aa rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 2, ADDITIONAL: 1

        ;; QUESTION SECTION:
        ;ungl.org.                  IN    A

        ;; ANSWER SECTION:
        ungl.org.           38400   IN    A     188.165.34.72

        ;; AUTHORITY SECTION:
        ungl.org.           38400   IN    NS    ns.kimsufi.com.
        ungl.org.           38400   IN    NS    r29901.ovh.net.

        ;; ADDITIONAL SECTION:
        ns.kimsufi.com.     85529   IN    A     213.186.33.199

        ;; Query time: 1 msec
        ;; SERVER: 127.0.0.1#53(127.0.0.1)
        ;; WHEN: Sat Mar 13 01:04:06 2010
        ;; MSG SIZE rcvd: 114

    but when I run it from another server in the same datacenter I receive:

        # dig @87.98.167.208 ungl.org

        ; <<>> DiG 9.5.1-P2.1 <<>> @87.98.167.208 ungl.org
        ; (1 server found)
        ;; global options: printcmd
        ;; Got answer:
        ;; ->>HEADER<<- opcode: QUERY, status: REFUSED, id: 18787
        ;; flags: qr rd; QUERY: 1, ANSWER: 0, AUTHORITY: 0, ADDITIONAL: 0
        ;; WARNING: recursion requested but not available

        ;; QUESTION SECTION:
        ;ungl.org.                  IN    A

        ;; Query time: 1 msec
        ;; SERVER: 87.98.167.208#53(87.98.167.208)
        ;; WHEN: Sat Mar 13 01:01:35 2010
        ;; MSG SIZE rcvd: 26

    My zone file for this domain is:

        $ttl 38400
        ungl.org.    IN    SOA    r29901.ovh.net. mikey.aol.com. (
                     201003121
                     10800
                     3600
                     604800
                     38400 )
        ungl.org.    IN    NS    r29901.ovh.net.
        ungl.org.    IN    NS    ns.kimsufi.com.
        ungl.org.    IN    A     188.165.34.72
        localhost.   IN    A     127.0.0.1
        www          IN    A     188.165.34.72

    The server is running Ubuntu 9.10 and BIND 9. If anyone can shed some light on this for me it'd make me very happy! Thanks

  • Setting Proxy Server for IE 10 on Windows 8 using pac file and Group Policy

    - by Greg Bray
    We currently use group policy to configure a proxy server PAC file for Windows XP and Windows 7 computers on our network. We now are starting to get requests for Windows 8, but have noticed that our current GPO does not work for setting the proxy server on Windows 8 clients or server 2012. Is it possible to do this using a 2008 R2 domain controller or would we need to update our domain to a 2012 server? I found a reference to creating new GPO settings for "Internet Explorer 10 and 11" and vague references to using RSAT on Windows 8 to set IE 10 settings via preferences, but nothing that talks about using group policy to manage proxy settings.

  • Avoiding DNS timeouts when a DNS server fails

    - by user65124
    Hi there. We have a small datacenter with about a hundred hosts pointing to 3 internal DNS servers (BIND 9). Our problem comes when one of the internal DNS servers becomes unavailable. At that point, all the clients that point to that server start performing very slowly. The problem seems to be that the stock Linux resolver doesn't really have the concept of "failing over" to a different DNS server. You can adjust the timeout and number of retries it uses (and set rotate so it will work through the list), but no matter what settings one uses, our services perform much more slowly if a primary DNS server becomes unavailable. At the moment this is one of the largest sources of service disruptions for us. My ideal answer would be something like "RTFM: tweak /etc/resolv.conf like this...", but if that's an option I haven't seen it. I was wondering how other folks handled this issue? I can see 3 possible types of solutions:

    1. Use linux-ha/Pacemaker and failover IPs (so the DNS IP VIPs are "always" available). Alas, we don't have a good fencing infrastructure, and without fencing Pacemaker doesn't work very well (in my experience Pacemaker lowers availability without fencing).
    2. Run a local DNS server on each node, and have resolv.conf point to localhost. This would work, but it would give us a lot more services to monitor and manage.
    3. Run a local cache on each node. Folks seem to consider nscd "broken", but dnrd seems to have the right feature set: it marks DNS servers as up or down, and won't use 'down' DNS servers.

    Anycasting seems to work only at the IP routing level and depends on route updates for server failure. Multicasting seemed like it would be a perfect answer, but BIND does not support broadcasting or multicasting, and the docs I could find seem to suggest that multicast DNS is more aimed at service discovery and auto-configuration rather than regular DNS resolving. Am I missing an obvious solution?

  • How to secure JBoss application server using SELinux

    - by Jakub Elias
    I want to secure a Red Hat 5.4 application server with SELinux (targeted policy) and have several questions:

    1. Where can I get the SELinux sources (/etc/selinux//src/policy/)? There seems to be no such package on the install CD.
    2. How do I restrict user rights (for example, so user jboss cannot modify /etc/my.cnf)?
    3. How do I configure the JBoss application server to work under SELinux?

    Although I have read many documents from the NSA, the whole topic is still not clear to me. What I want is basically to protect the filesystem in case one account is broken. I cannot find any materials about securing a JBoss server using either a chroot jail, ACLs or SELinux.

  • DHCP "no answer" on CentOS 6.4

    - by Kev
    I installed a DHCP server (yum install dhcp) and this is my conf:

        # create new
        # specify domain name
        option domain-name "mydomain.name";
        # specify DNS's hostname or IP address
        option domain-name-servers 10.0.1.1, 10.0.1.2;
        option ntp-servers 10.0.1.1, 10.0.1.2;
        allow unknown-clients;
        # default lease time
        default-lease-time 2628000;
        # max lease time
        max-lease-time 2628000; # about a month
        # this DHCP server to be declared valid
        authoritative;
        # specify network address and subnet mask
        subnet 10.0.0.0 netmask 255.0.0.0 {
            # specify the range of lease IP address
            range dynamic-bootp 10.0.2.1 10.0.2.50;
            # specify broadcast address
            option broadcast-address 10.255.255.255;
            # specify default gateway
            option routers 10.0.0.1;
            allow unknown-clients;
        }

    service dhcp start reports [ OK ]. Yet, if I disable my other DHCP server (Win2k3) and get a client to try renewing its IP lease, it times out. So I installed dhcping. No matter what options I try, including directing dhcping at my server, adding a client address in the range, adding my hardware address, it replies 'no answer'. I'm also trying -i since that seems to be more akin to what a WinXP client would try to do, based on /var/log/messages. It logs the attempts (from dhcping here) as:

        Oct 24 18:55:13 newdc dhcpd: DHCPINFORM from 10.0.2.15 via eth0:4
        Oct 24 18:55:13 newdc dhcpd: DHCPACK to 10.0.2.15 (00:11:25:66:4e:7f) via eth0:4
        Oct 24 18:55:13 newdc dhcpd: DHCPINFORM from 10.0.2.15 via eth0:3
        Oct 24 18:55:13 newdc dhcpd: DHCPACK to 10.0.2.15 (00:11:25:66:4e:7f) via eth0:3
        Oct 24 18:55:13 newdc dhcpd: DHCPINFORM from 10.0.2.15 via eth0
        Oct 24 18:55:13 newdc dhcpd: DHCPACK to 10.0.2.15 (00:11:25:66:4e:7f) via eth0

    The :3 and :4 are because I have a few extra Host(A) records for this server so it responds on more than one IP for our intranet app. No answer? It sounds like it should be getting three answers...no? (And if that's the problem, how do I limit the DHCP service to replying from eth0?)
