Search Results

Search found 91557 results on 3663 pages for 'server 2008'.


  • CC.NET + SVN : Server certificate issue

    - by MSI
    I am trying to set up Continuous Integration in our office. Being a puny little developer, I am facing this supposedly infamous problem:

        Source control operation failed: svn: OPTIONS of 'https://trunkURL': Server certificate verification failed: issuer is not trusted

    So I tried the following solution: run the CC.NET service (the server runs as a Windows service) under a domain account (rather than the default LOCAL SYSTEM) and accept the certificate permanently from a command prompt under that user by running svn log/list against the repository. It doesn't help :(. I am getting the following in my artifact/log files (and on the dashboard):

        ThoughtWorks.CruiseControl.Core.CruiseControlException: Source control operation failed: svn: OPTIONS of 'https://TrunkURL': Server certificate verification failed: issuer is not trusted (https://ServerAdd).
        Process command: E:\(svn.exe Path) log https://TrunkURL -r "{2010-11-08T02:12:20Z}:{2010-11-08T02:13:21Z}" --verbose --xml --no-auth-cache --non-interactive
        at ThoughtWorks.CruiseControl.Core.Sourcecontrol.ProcessSourceControl.Execute(ProcessInfo processInfo)
        at ThoughtWorks.CruiseControl.Core.Sourcecontrol.Svn.GetModifications(IIntegrationResult from, IIntegrationResult to)
        at ThoughtWorks.CruiseControl.Core.Sourcecontrol.QuietPeriod.GetModificationsWithLogging(ISourceControl sc, IIntegrationResult from, IIntegrationResult to)
        at ThoughtWorks.CruiseControl.Core.Sourcecontrol.QuietPeriod.GetModifications(ISourceControl sourceControl, IIntegrationResult lastBuild, IIntegrationResult thisBuild)
        at ThoughtWorks.CruiseControl.Core.IntegrationRunner.GetModifications(IIntegrationResult from, IIntegrationResult to)
        at ThoughtWorks.CruiseControl.Core.IntegrationRunner.Integrate(IntegrationRequest request)

    We are using VisualSVN Server and CC.NET for this adventure. Tips and suggestions will be highly appreciated. Thanks

    Read the article

  • Slow RDP after server joins domain

    - by Chris Grove
    We're having RDP issues with Amazon cloud servers that we recently joined to an Active Directory domain. The setup is:
    - A local office network
    - A virtual private cloud in Amazon
    - An IPSec tunnel between the two networks
    - A number of Windows 2008 R2 servers on both networks
    - An AD domain (call it abc.net), with one domain controller in each network; the domain controllers are both new, fresh installs.
    Before we had the domain set up we had local accounts for the cloud computers which were used for RDP access. Our idea was to get all of the servers onto the domain so we could use domain logins instead of per-server local logins. Before the cloud servers were in the domain, RDP (from the office network or through a VPN to the cloud) worked great. After we joined the cloud servers to the domain, RDP from the office became very slow - a few minutes to log in, long frequent pauses when the interface is unresponsive, generally just a slow and frustrating experience. This is a problem regardless of whether a domain or local login is used for RDP. Oddly, when outside of the office network and connecting to the cloud directly with the VPN, RDP is still very responsive. Any idea why RDP from office to cloud is suddenly very slow after the cloud servers join the domain? What can I look at in our configuration to address this? Any help is greatly appreciated.

    Read the article

  • How can I tell why I have access to a file share on Windows Server

    - by Joel
    I have a file share on a Windows 2008 R2 server in an AD domain (call it \\SECURESERVER\STUFF) and I am not sure if I have the share and folder permissions set up right. I noticed the problem when I set up a new server (WORKGROUP\FOREIGNSERVER) that was not joined to the domain and tried to copy some files off of \\SECURESERVER\STUFF. I was surprised to find that when I tried to access the files, it did not prompt me for a username and password and proceeded to give me full access to the files. That worried me, so I tried the same thing on some workstations that were not in the domain and they did NOT have the same behavior (they did prompt for a username/password as desired/expected). So, I think there is something peculiar about FOREIGNSERVER. I am logging into it with a local admin account, but my domain and SECURESERVER should know nothing of this server. I've carefully gone through the share and folder permissions on the share but I can't find the reason that FOREIGNSERVER has access. How can I find out why FOREIGNSERVER has access to SECURESERVER?

    Read the article

  • Windows Server wbadmin recover with commas

    - by dlp
    I want to do a recovery of files with commas in their names from the command line, a la:

        wbadmin start recovery -version:10/01/2013-12:00 -itemType:File -overwrite:Overwrite -quiet "-Items:C:\Path\To\File, With Comma.txt,C:\Path\To\File 2, With Comma.txt"

    So there are two files:

        C:\Path\To\File, With Comma.txt
        C:\Path\To\File 2, With Comma.txt

    The problem is that wbadmin assumes commas separate each file, so it sees 4 files specified instead of 2. I've tried putting a \ in front of commas that are part of the file names, like so:

        wbadmin start recovery -version:10/01/2013-12:00 -itemType:File -overwrite:Overwrite -quiet "-Items:C:\Path\To\File\, With Comma.txt,C:\Path\To\File 2\, With Comma.txt"

    but it doesn't work; it just says there's a syntax error. The documentation on TechNet doesn't seem to mention anything that will help either. The OS is Windows Server 2008 R2. A clarifying comment: I've changed the file names to be different than the actual names to be less revealing, but I also see I dumbed it down too much. The comma can occur either in the file name itself, like C:\Path\To\File, With Comma.txt, or in the path to the file, like C:\Path, To\Other\File.txt.

    Read the article

  • SSRS - Unable to determine if the owner of job has server access [SQLSTATE 42000] (Error 15404))

    - by John DaCosta
    In SQL Server Reporting Services (SSRS) it seems like schedules never fire; however, a look at the SQL Agent reveals a permission issue related to not being able to resolve a user account. It seems SQL Agent does not rely on caching or whatever voodoo Windows magically works with. Fix is listed here... edit -- Above is the fix I used to work around this issue; has anyone found any other workarounds or resolutions to it? It seems that by default the SSRS-generated schedules are run as this phantom user account. How do I change this default? Is SSRS creating the jobs as the user the service runs as? Thanks, Remus
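
    A hedged sketch of the kind of workaround usually applied for error 15404 (not necessarily the fix the poster used): the SQL Agent jobs that SSRS generates fail when their owner cannot be resolved, so the owner can be switched with msdb's sp_update_job. The job name below is hypothetical; SSRS names its schedule jobs with a GUID.

        USE msdb;
        GO
        -- Hypothetical job name: replace with the GUID-named job created by the SSRS schedule
        EXEC dbo.sp_update_job
            @job_name = N'1B7B9C33-AAAA-BBBB-CCCC-000000000000',
            @owner_login_name = N'sa';
        GO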

    Read the article

  • Migrating a Windows Server to Ubuntu Server to provide Samba, AFP and Roaming Profiles

    - by Dan
    I'm replacing our old Windows XP Pro office server with an HP Microserver running Ubuntu Server 12.04 LTS. I'm not a Linux expert but I can find my way around a terminal prompt; I'm a Mac user by choice. The office uses a mix of Windows XP Pro machines and OS X Lion laptops. I included Samba during installation, and I'm planning on using Netatalk for the AFP and Bonjour sharing. I'd quite like to have Samba make the server appear in 'My Network Places' on the Windows machines the way Bonjour makes it appear in Finder on the Macs, if this is possible. I want to get to the point where a user logging into Windows gets connected to the Ubuntu server (do they need an Ubuntu user account?), which gets them their shares and their Windows user profile (though a standard profile across users would do). The upshot is centralised control of user accounts (e.g. if a person leaves, killing their account on the server stops their Windows logon and ability to access Samba shares) and ensuring files aren't stored on the individual machines, for backup/security purposes. I want to make this as simple as possible, so I don't want loads of stuff I don't need. I just can't figure out:
    What I need at the server end:
    - Will Samba be enough (already installed as part of the initial installation), or will I need to mess around with LDAP (and how does that interact with Samba)?
    - For someone of moderate Linux competence like me, is there a package that offers easy admin of user accounts, e.g. a GUI like phpLDAPadmin (if LDAP is necessary)?
    How to configure the XP machines:
    - Do I need to have the XP machines set up against a domain controller (I've no idea, really)?
    - Roaming profiles look to offer the feature of putting the user's files on the server rather than the machine itself, along with a profile that follows the user from machine to machine.
    Syncing Mac users' home folders with the server:
    This is less of a concern because I can set up Time Machine if it comes to it, but I'd appreciate any recommendations on what approach I should take to having the Mac home folders synced to the server.

    Read the article

  • Changing SQL Server database collation

    - by plaisthos
    I have a request to change the collation of a SQL Server database:

        ALTER DATABASE solarwind95 COLLATE SQL_Latin1_General_CP1_CI_AS

    but I get this strange error:

        Meldung 5075, Ebene 16, Status 1, Zeile 1
        Das 'Spalte'-Objekt 'CustomPollerAssignment.PollerID' ist von 'Datenbanksortierung' abhängig. Die Datenbanksortierung kann nicht geändert werden, wenn ein schemagebundenes Objekt von ihr abhängig ist. Entfernen Sie die Abhängigkeiten der Datenbanksortierung, und wiederholen Sie den Vorgang.

    Sorry for the German error message; I do not know how to switch the language to English, but here is a translation:

        Message 5075, Level 16, State 1, Line 1
        The 'column' object 'CustomPollerAssignment.PollerID' depends on 'database collation'. The database collation cannot be changed if a schema-bound object depends on it. Remove the dependencies on the database collation and retry the operation.

    I got a ton more errors like that.
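
    A hedged sketch (generic, not from the question) of how the blocking dependencies behind error 5075 are usually located: the catalog views flag computed columns and schema-bound modules that depend on the database collation, and these typically have to be dropped before ALTER DATABASE ... COLLATE and recreated afterwards.

        -- Computed columns (and, similarly, check constraints) that depend on the database collation
        SELECT OBJECT_NAME(object_id) AS table_name, name AS column_name
        FROM sys.computed_columns
        WHERE uses_database_collation = 1;

        -- Schema-bound views/functions that depend on the database collation
        SELECT OBJECT_NAME(object_id) AS object_name
        FROM sys.sql_modules
        WHERE is_schema_bound = 1 AND uses_database_collation = 1;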

    Read the article

  • Visual Studio 2010 is asking to convert RDLC created on VS2008 to RDLC 2008 format?

    - by Junior Mayhé
    I've created my project in Visual Studio 2008, as well as the RDLC files in it. But now, when I open the solution in Visual Studio 2010 and want to open an RDLC file, it shows me a warning. That's a little funny: the report was created in VS2008 and VS2010 is asking to convert it to the 2008 format. Perhaps there was a problem with my VS2008 installation that created RDLC files using some ancient format (2005?!). The problem is, when you confirm with the OK button, make some design adjustments and run the app, it throws an error on 'Main report':

        ex.InnerException {"The definition of the report 'Main Report' is invalid."} [Microsoft.Reporting.DefinitionInvalidException]: {"The definition of the report 'Main Report' is invalid."}
        Data: {System.Collections.ListDictionaryInternal}
        HelpLink: null
        InnerException: {"The report definition is not valid. Details: The report definition has an invalid target namespace 'http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition' which cannot be upgraded."}
        Message: "The definition of the report 'Main Report' is invalid."
        Source: "Microsoft.ReportViewer.Common"
        StackTrace:
          at Microsoft.Reporting.ReportCompiler.CompileReport(CatalogItemContext context, Byte[] reportDefinition, Boolean generateExpressionHostWithRefusedPermissions, ReportSnapshotBase& snapshot)
          at Microsoft.Reporting.StandalonePreviewStore.StoredReport.CompileReport()
          at Microsoft.Reporting.StandalonePreviewStore.StoredReport.get_Snapshot()
          at Microsoft.Reporting.StandalonePreviewStore.GetCompiledReport(CatalogItemContext context, Boolean rebuild, ReportSnapshotBase& snapshot)
          at Microsoft.Reporting.LocalService.GetCompiledReport(CatalogItemContext itemContext, Boolean rebuild, ReportSnapshotBase& snapshot)
          at Microsoft.Reporting.LocalService.CompileReport(CatalogItemContext itemContext, Boolean rebuild)
          at Microsoft.Reporting.WinForms.LocalReport.CompileReport()
        TargetSite: {Microsoft.ReportingServices.ReportProcessing.PublishingResult CompileReport(Microsoft.ReportingServices.Diagnostics.CatalogItemContext, Byte[], Boolean, Microsoft.ReportingServices.Library.ReportSnapshotBase ByRef)}

    Read the article

  • Entire Table is pushed to the next page when rendering a SSRS 2005 Report (as .pdf) in SSRS 2008

    - by Pwninstein
    I have an SSRS 2005 report that I'm rendering in SSRS 2008 as a PDF. The report contains (among other things) a table that's very simple: header row, details, no footer, no aggregation, no grouping, keep together = false, pageBreakAtStart = false, pageBreakAtEnd = false, repeatHeaderOnNewPage = true. I resized the table to be much narrower than the body of the report just to be sure it wasn't extending beyond the bounds of the report and pushing everything down. But no matter what I try, if some of the detail rows in that table would need to be pushed to the next page, then the ENTIRE TABLE is pushed to the next page, not just the extra rows. So my question is: is there a workaround for this problem, is this a known issue, or is it even possible to get this 2005 report to render properly in SSRS 2008? NOTE: this is related to a question that I previously asked here, and is based on this MSDN forum post started by a coworker. This question is not the same as my previous question, as I'd like to see things work properly with a 2005 report. If it's not possible, that would be good to know, as it would indicate that we need to upgrade one of our servers to SQL 2008. Thanks!

    Read the article

  • Continuous Integration with 64-bit Sharepoint and TFS 2008?

    - by Hirvox
    I've set up a 64-bit TFS 2008 build server with SharePoint, continuous integration and out-of-the-box MSTest. Unit tests for plain business logic classes run just fine and test results are published into TFS. However, any test that uses SharePoint's API fails horribly, with SPFarm.Local returning null and so on. Is there a way to fix this? The tests run fine in an otherwise identical 32-bit development environment (Windows Server 2008 under Hyper-V, SharePoint patched up to the June 2009 cumulative update) from both Visual Studio and the command line, so the problem is not about improper use of SPContext.Current or any other part of the API that needs to run in a web server context. I've ruled out permission issues, because the build agent account can deploy the solution and create site collections just fine with stsadm. The next culprit could be that the unit tests were being run in a 32-bit process, which couldn't access the 64-bit SharePoint API properly. I tried a workaround, but it has the side effect of disabling TFS support in MSTest. Do I have to wait for the 2010 versions of the MS tools (and hope for the best), or is there a third-party test framework available that runs natively in 64-bit and can publish test results into TFS 2008?

    Read the article

  • Our application responds differently on the client's server

    - by WtFudgE
    I have an application running on our local server and it works perfectly. However, when I deploy the application to our client's server there are problems. If I copy all the sources from their server to another local machine, it works fine again. I am talking about an ASP.NET application running on Windows Server 2008. The server specs of our client are:
    - Intel Xeon 2.13 GHz
    - 6 GB RAM
    - 300 GB HDD
    - Windows Server 2008 R2
    Although I think it's not that important, I will describe the problems I am talking about to give you a better idea. They are related to updating and deleting fields in SQL Server: whenever there is more than one user running the application and updating or deleting (not even the same record!), fields seem to be updated/deleted wrongly. But as I said before, if we copy the source to another server this is not the case, so I assume it is more configuration-related. Do you guys have any idea what the issue could be? Thanks a lot. EDIT: my client told me he disabled the SQL-related accounts. Could this be related?

    Read the article

  • How to select a server that supports Windows scheduled file IO

    - by Kristof Verbiest
    Background: I am developing an application that needs to read data from disk with a fairly consistent throughput. It is important that this throughput is not influenced by other actions that happen on the disk (e.g. by other processes). For this purpose, I was hoping to use the 'Scheduled File I/O' feature in Windows (through the GetFileBandwidthReservation and SetFileBandwidthReservation functions). However, this StackOverflow question has taught me that this feature is only available if the device driver supports it. Currently I have no computer at my disposal that seems to support this feature (I have an HP ProLiant server and a Dell Precision workstation). Question: If I were to order a new server, how can I know beforehand whether this feature will be supported by the device driver? How 'upscale' does the server have to be? Has anybody used this feature with success and cares to share their experiences?

    Read the article

  • How to move from a physical server to an online server?

    - by Tiago
    My father has a small company: 10 PCs running Windows, 1 running Windows Server 2000, and 1 fax/printer. I want to remove the server and base the network on an online server. Can I do that? If yes, how? By using a Windows VPS? A Linux VPS with VMware? I'm not sure if that's a viable option; if there are others, please tell me. Thank you.

    Read the article

  • Can I do a "one-time" file content search in Windows Server 2008 without adding the folder to the index?

    - by G-.
    Can I search for files which contain a specific string in a folder if that folder is not in the search index? So, let's say the folder 'textFiles' is not in the index. I navigate to this folder in Windows Explorer and type '.ini' in the search box. I want to see a result list containing only 'b.txt'.

        FOLDER C:\textFiles\
        FILE a.php   CONTENT: once twice thrice mice moose monkey
        FILE b.txt   CONTENT: mingle muddle middle.ini banana beer
        FILE c.spo   CONTENT: sellotape stapler phone book

    I do not have permission to add folders to the Windows index, and I do not have permission to install or run any executables that did not ship with the server or approved applications. I'd be happy with a Windows-native command-line solution if necessary. Thanks, G

    Read the article

  • What are the problems and pitfalls of a public-facing Active Directory?

    - by Ralph Shillington
    The situation I'm faced with is this: we plan on using a number of server applications hosted on Amazon EC2 machines, mainly Microsoft Team Foundation Server. These services rely heavily on Active Directory. Since our servers are in the Amazon cloud, it should go without saying (but I will) that all our users are remote. It seems that we can't set up a VPN on our EC2 instance - so the users will have to join the domain directly over the internet; then they'll be able to authenticate and, once authenticated, use that token for accessing resources such as TFS. On the DC instance, I can shut down all ports except those needed for joining/authenticating to the domain. I can also filter the IPs on that machine to just those addresses that we are expecting our users to be at (it's a small group). On the web-based application servers, I imagine all we need to open is port 80 (or 8080 in the case of TFS). One of the problems that I'm faced with is what domain name to use for this Active Directory. Should I go with "ourDomainName.com" or "OurDomainName.local"? If I choose the latter, does that not mean that I'll have to get all our users to change their DNS address to point to our server, so it can resolve the domain name (I guess I could also distribute a hosts file)? Perhaps there is another alternative that I'm completely missing.

    Read the article

  • Get Jira to run on shared Windows server on port 80

    - by codeulike
    I know this can be done on Linux with JIRA, using mod_proxy, but I'm not sure if it's possible on Windows. Say we have a Windows server running IIS 7.0 and serving up pages on port 80, via an address like http://twiddle.something.com. We then install JIRA on the server; it uses its bundled Apache web server to serve stuff up on port 8080, like this: http://twiddle.something.com:8080. Is there a way to configure IIS and Apache so that JIRA runs off a port 80 folder, as in: http://twiddle.something.com still hits IIS, and http://twiddle.something.com/Jira hits JIRA on Apache? Thanks. edit: I guess we might also want to throw SSL into the mix for JIRA too...

    Read the article

  • How to enable RDP to a Server 2008 R2 VM on another network?

    - by Saariko
    I have a W2008 R2 server installed on a different network (I am on 192.168.0.x; the new server is on 192.168.3.x). I had trouble pinging and RDPing to it. I disabled the firewall to test the connection: that made ping work, but I still cannot RDP to that machine. The "allow remote access" option is enabled. As per sinni80's idea - here is the error message. The networks are divided by a Fortigate 60-B router; the secondary interface for the gateway is 192.168.3.254 (and pingable from all), and an any-to-any rule is in place on both networks. As per Joe Schmoe's idea - I am able to RDP to 192.168.3.1 from 192.168.3.3 (which is on the same network). Data to add:
    - The servers are on a VM host; each of the servers has 2 NICs, one DHCP-enabled on the 192.168.0.x network, the second with a static IP on 192.168.3.x.
    Further information:
    - The 192.168.0.x network is on a domain network (Active Directory).
    - The 192.168.3.x network is grouped in a workgroup.
    What more should I check, please?

    Read the article

  • win2008 R2 server core software RAID

    - by shimonyk
    I am trying to set up software RAID on a Windows 2008 R2 Server Core installation. I have the disks configured as dynamic. In the Server Manager GUI I can see the disks, but when I right-click, the option to set up a "new mirrored volume" is not listed. I tried it from the command line using diskpart, and it gives the error "Virtual Disk Service Error: The size of the extent is less than the minimum." The drives are a new pair of 1 TB disks. Is this not supported in Server Core, or am I missing something else? Thank you

    Read the article

  • Developing and implementing a testing plan for a software app deployed on a web server

    - by Abhzoo
    A company in the USA is building a new web app that will be offered as SaaS to customers, and the development is being done by a software development team located in a different country (India). They are about to take delivery of a first demo and provide live feedback to the team in India. The overseas team requires a cloud server (Windows + SQL Standard, 8 GB RAM, 8 vCPUs, 40 GB SSD system disk, 80 GB SSD data disk, 1600 Mb/s network bandwidth) to serve as a test server. When the test server is set up, the team will install the app on it to get live feedback. Q: Explain in detail how you will develop and implement a testing plan for the software app. Be sure to explain the specifics. PLEASE HELP, NEED ANSWER ASAP

    Read the article

  • Slow upload to Server 2008 DC, Downloads work as expected

    - by Anthony
    I have a Windows Server 2008 Domain Controller that I run as a do-it-all server. It has a GbE connection to the network and to every machine on the network. Downloads from the server file shares work as expected, between 70MB/s and 80MB/s to all the machines. However, when I try to copy files back up to the server, speeds fall to 7MB/s-10MB/s. I've disabled flow control and large send offload properties on all the NICs. I had this problem before and managed to fix it through some properties changes, but like an idiot, I never documented my fix and have since moved to a new server. Any ideas what I need to do to get the speeds to be more symmetric? EDIT: Remote differential compression is also disabled.

    Read the article

  • sql server: losing identity column on export/import

    - by Y.G.J
    Recently I started dealing with SQL Server; my previous experience was with MS Access. When I do an import/export of a database, from the server to my computer or even within the server, all columns with a primary key lose the key. Identity is set to false, and even bit columns are not set to their default. How can I use an import/export job to make an exact copy of the database and its data? I don't want to have to perform a backup and restore every time I want the same database somewhere else, for another project, etc. I have read about "edit mapping" and the checkbox, but that did not help with the identity specification... and what about the primary keys of the tables and the rest of the settings?
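
    A hedged sketch of what usually has to happen for identity values to survive a copy (the database and table names below are hypothetical): the Import/Export Wizard only keeps identity values when "Enable identity insert" is ticked in the Edit Mappings dialog, and the equivalent hand-written T-SQL looks like this. Note that neither approach recreates keys, constraints, or the IDENTITY property itself; scripting the schema (or backup/restore) is still needed for an exact copy.

        -- Hypothetical names: copy rows while keeping the original identity values
        SET IDENTITY_INSERT TargetDb.dbo.Customers ON;
        INSERT INTO TargetDb.dbo.Customers (CustomerID, CustomerName)
        SELECT CustomerID, CustomerName
        FROM SourceDb.dbo.Customers;
        SET IDENTITY_INSERT TargetDb.dbo.Customers OFF;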

    Read the article

  • Server was unable to process request.

    - by Naveen
    I have a VB.NET web application that talks to a web service. I get the following error when I try to connect to the web service:

        System.Web.Services.Protocols.SoapException: System.Web.Services.Protocols.SoapException: Server was unable to process request.

    The strange thing is that it works fine on one set of servers, and on another set of servers we get this error. The only notable difference between the servers is the authentication setting in IIS. On the server where it works fine we have ASP.NET impersonation disabled, but on the other server I get the error the moment I disable ASP.NET impersonation.

    Read the article

  • SQL SERVER – Quick Look at SQL Server Configuration for Performance Indications

    - by pinaldave
    Earlier I wrote SQL SERVER – Beginning SQL Server: One Step at a Time – SQL Server Magazine. That was the first article in the series about my real-world performance tuning experience. I have now written the second part of the same series. Read the second part over here: Quick Look at SQL Server Configuration for Performance Indications. In this second part I talk about two types of clients: 1) those who want instant results and 2) those who want the right results. It is really fun to work with both kinds of clients. I talk about the various configuration options which I look at when I try to give a very early opinion about SQL Server performance. There are eight configuration options I give a quick look at before I start talking about performance. Head over to the original article over here: Quick Look at SQL Server Configuration for Performance Indications. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
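
    A quick way to eyeball the instance-level configuration that reviews like this start from (this query is a generic sketch, not taken from the article) is the sys.configurations catalog view:

        -- One-line inventory of every instance-level configuration option and its live value
        SELECT name, value, value_in_use, description
        FROM sys.configurations
        ORDER BY name;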

    Read the article

  • SQL SERVER – Service Broker and CAP_CPU_PERCENT – Limiting SQL Server Instances to CPU Usage

    - by pinaldave
    I have mentioned several times on this blog that the best part of blogging is the questions I receive from readers. They are often very interesting, and they give me a good idea of what other readers might be thinking as well. After reading my earlier article Simple Example to Configure Resource Governor – Introduction to Resource Governor – I received an email from a reader and we exchanged a few emails. After exchanging emails we both figured out what was going on. It was indeed interesting, and the reader suggested that I should blog about it. I asked for permission to publish his name but he does not like the attention, so we will just call him Jeff. I have converted our emails into a chat for easy consumption.

    Jeff: Your script does not work at all. I think there is a bug in SQL Server.
    Pinal: Would you please explain in detail?
    Jeff: Your code does not limit the CPU usage.
    Pinal: How did you measure it?
    Jeff: Well, we have third-party tools for it, but let us say I have limited the resources for Reporting Services and used the script described in your blog. After that I ran only the reporting service workload; the CPU is still used up to 100% and is not limited to 30% as described in your script. Clearly something is wrong somewhere.
    Pinal: Did you say you ONLY ran the reporting server load?
    Jeff: Yeah, to validate it I ran ONLY the reporting server load, and the CPU did not throttle at 30% as per your script.
    Pinal: Oh! I get it; here is the answer - CAP_CPU_PERCENT = 30. Use it.
    Jeff: What is that? I think your earlier script says it will throttle the Reporting Services workload and the application/OLTP workload and balance them.
    Pinal: Exactly, that is correct.
    Jeff: You need to write more in email, buddy! Just like your blogs, your answers do not make sense! No offense!
    Pinal: Hmm… feedback well taken. Let me try again.

    In SQL Server 2012 there are a few enhancements with regard to SQL Server Resource Governor. One of the enhancements is how the resources are allocated. Let me explain with examples.

    Configuration: [Read Earlier Post]
    Reporting Workload: MIN_CPU_PERCENT=0, MAX_CPU_PERCENT=30
    Application/OLTP Workload: MIN_CPU_PERCENT=50, MAX_CPU_PERCENT=100

    Example 1: If there is only the reporting workload on the server, SQL Server will not limit its CPU usage to 30%; the SQL Server instance will use all available CPU (if needed). In other words, in this scenario it will use more than 30% CPU.

    Example 2: If there is a reporting workload and a heavy application/OLTP workload, SQL Server will allocate a maximum of 30% of CPU resources to the reporting workload and allocate the remaining resources to the heavy application/OLTP workload. The reason for this enhancement is better utilization of the resources. If there is only a single workload, which we have limited to a maximum CPU usage of 30%, the remaining available CPU resources are wasted. In this situation SQL Server allows the workload to use more than 30% of resources, leading to overall improved/optimized performance. However, in the case of multiple workloads where lots of resources are needed, the limits specified in MAX_CPU_PERCENT are acknowledged.

    Example 3: If there is a situation where the max CPU limit has to be enforced irrespective of the workload and the enhanced algorithm, the keyword CAP_CPU_PERCENT is essential. It specifies a hard cap on the CPU bandwidth that all requests in the resource pool will receive.
    It will never let CPU usage for the reporting workload go over 30% in our case. You can use the keyword as follows:

        -- Creating Resource Pool for Report Server
        CREATE RESOURCE POOL ReportServerPool
        WITH (
            MIN_CPU_PERCENT=0,
            MAX_CPU_PERCENT=30,
            CAP_CPU_PERCENT=40,
            MIN_MEMORY_PERCENT=0,
            MAX_MEMORY_PERCENT=30)
        GO

    Notice that there is MAX_CPU_PERCENT=30 and CAP_CPU_PERCENT=40. What this means is that when the SQL Server instance is under heavy load from different workloads, this pool will use at most 30% of the CPU. However, when the SQL Server instance is not under load it can go over the 30% limit; but as CAP_CPU_PERCENT is set to 40, it will not go over 40% in any case. CAP_CPU_PERCENT puts a hard limit on the resource usage by the workload.

    Jeff: Nice, Pinal, you should blog about it.
    [A day passes by]
    Pinal: Jeff, it is done! Click here to read it.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Service Broker
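
    For completeness, a hedged sketch of how a pool like this normally gets wired up (the group, function, and login names below are hypothetical and not from the article): a workload group has to point at the pool, and a classifier function in master routes sessions into that group.

        -- Hypothetical names: bind a workload group to the pool from the article
        CREATE WORKLOAD GROUP ReportServerGroup
        USING ReportServerPool;
        GO

        -- Classifier function (must live in master): route a hypothetical reporting login to the group
        CREATE FUNCTION dbo.fnReportClassifier()
        RETURNS sysname
        WITH SCHEMABINDING
        AS
        BEGIN
            IF SUSER_SNAME() = N'ReportServerLogin'
                RETURN N'ReportServerGroup';
            RETURN N'default';
        END;
        GO

        ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fnReportClassifier);
        ALTER RESOURCE GOVERNOR RECONFIGURE;
        GO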

    Read the article

  • SQL SERVER – Fix – Agent Starting Error 15281 – SQL Server blocked access to procedure ‘dbo.sp_get_sqlagent_properties’ of component ‘Agent XPs’ because this component is turned off as part of the security configuration for this server

    - by Pinal Dave
    SQL Server Agent failing to start because of error 15281 is a very common problem. When you try to restart SQL Agent, it will sometimes give the following error:

        SQL Server blocked access to procedure ‘dbo.sp_get_sqlagent_properties’ of component ‘Agent XPs’ because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of ‘Agent XPs’ by using sp_configure. For more information about enabling ‘Agent XPs’, search for ‘Agent XPs’ in SQL Server Books Online. (Microsoft SQL Server, Error: 15281)

    To resolve this error, the following script has to be executed on the server:

        sp_configure 'show advanced options', 1;
        GO
        RECONFIGURE;
        GO
        sp_configure 'Agent XPs', 1;
        GO
        RECONFIGURE
        GO

    When you run the above script, you will see output confirming the configuration change. Now, if you try to restart SQL Agent it will just work fine. That's it! Sometimes there is a simpler solution to a complicated error. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Error Messages, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: SQL Server Agent
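
    A quick way to confirm the change took effect (a generic sketch, not part of the original post) is to check the option in sys.configurations; value_in_use should come back as 1 after RECONFIGURE:

        SELECT name, value, value_in_use
        FROM sys.configurations
        WHERE name = N'Agent XPs';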

    Read the article
