Search Results

Search found 3831 results on 154 pages for 'w2k8 r2'.


  • What is the best way to achieve an RPO of zero and the lowest possible RTO (less than 15 minutes) with SQL 2008 R2?

    - by Adrian Hope-Bailie
    We are running a payments (EFT transaction processing) application which processes high volumes of transactions 24/7, and we are currently investigating a better way of doing DB replication to our disaster recovery site. Our current and previous strategies have included using both DoubleTake and Redgate to replicate data to a warm stand-by. DoubleTake is the supported solution from the payments software vendor; however, their (DoubleTake's) support in South Africa is very poor. We had a few issues and simply couldn't ever resolve them, so we had to give up on DoubleTake. We have been using Redgate to manually read the data from the primary site (via queries) and write to the DR site, but this is:
    - A bad solution
    - Getting the software vendor hot and bothered whenever we have support issues, as it has a tendency to interfere with the payment application, which is very DB intensive
    We recently upgraded the whole system to run on SQL 2008 R2 Enterprise, which means we should probably be looking at using some of the built-in replication features. The server has 2 fairly large databases with a mixture of tables containing highly volatile transactional data and pretty static configuration data. Replication would be done over a WAN link to a separate physical site and needs to achieve the following objectives:
    - RPO: zero loss. This is transactional data with financial impact, so we can't lose anything.
    - RTO: tending to zero. The business depends on our ability to process transactions; every minute we are down, we are losing money.
    I have looked at a few of the other questions/answers but none meet our case exactly:
    - SQL Server 2008 failover strategy - Log shipping or replication?
    - How to achieve the following RTO & RPO with log shipping only using SQL Server?
    - What is the best of two approaches to achieve DB replication?
    My current thinking is that we should use mirroring, but I am concerned that for an RPO of zero we will need to do delayed commits, and this could impact the performance of the primary DB, which is not an option. Our current DR process is to:
    1. Stop incoming traffic to the primary site and allow all in-flight transactions to complete.
    2. Allow the replication to DR to complete.
    3. Change network routing to route to the DR site.
    4. Start all applications and services on the secondary site (ideally we can change this to a warmer stand-by whereby the applications are already running but not processing any transactions).
    In other words, the DR database needs to catch up with the primary as quickly as possible and be ready for processing as the new primary. We would then need to be able to reverse this when we are ready to switch back. Is there a better option than mirroring (should we be doing log shipping too), and can anyone suggest other considerations that we should keep in mind?
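    For comparison, a minimal sketch of the mirroring route using only built-in features: database mirroring in high-safety (synchronous) mode gives an RPO of zero, at the cost of the principal waiting for the mirror to harden each commit over the WAN. The instance names, database name and endpoint addresses below are placeholders, Invoke-Sqlcmd assumes the sqlps tools are loaded, and it assumes mirroring endpoints already exist on both instances and the database has been restored WITH NORECOVERY on the mirror.

        # Point the mirror at the principal first, then the principal at the mirror.
        Invoke-Sqlcmd -ServerInstance 'MIRROR\INST' -Query "ALTER DATABASE Payments SET PARTNER = 'TCP://principal.example.local:5022';"
        Invoke-Sqlcmd -ServerInstance 'PRINCIPAL\INST' -Query "ALTER DATABASE Payments SET PARTNER = 'TCP://mirror.example.local:5022';"

        # SAFETY FULL makes the session synchronous: commits wait for the mirror,
        # which is what buys the zero-RPO guarantee.
        Invoke-Sqlcmd -ServerInstance 'PRINCIPAL\INST' -Query "ALTER DATABASE Payments SET PARTNER SAFETY FULL;"

    Adding a witness server on top of this enables automatic failover, which is the piece that pushes RTO toward the sub-15-minute target.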

    Read the article

  • IIS 7.5 + Windows Server 2008 R2 + ASP.NET 4.0 HTTP 500 Error?

    - by Dave
    Hi, I'm having an issue I cannot track down, and I have looked through the forums and not found anything that sheds any light. I have a fresh install of Server 2008 R2 Web on which I am trying to load an application I created and tested on a Windows 7 machine running IIS 7.5 using ASP.NET 4.0. Everything works fine on the dev machine, but when I used the Web Deployment Tool to move it to the server, I now get an HTTP 500 error without a lot of information:

        Module: AspNetInitClrHostFailureModule
        Notification: BeginRequest
        Handler: StaticFile
        Error Code: 0x80070002
        Requested URL: http://192.168.1.83:80/
        Physical Path: C:\JustStreamIt
        Logon Method: Not yet determined
        Logon User: Not yet determined
        Failed Request Tracing Log Directory: C:\inetpub\logs\FailedReqLogFiles

    And in my trace file I get:

        Warning - SET_RESPONSE_ERROR_DESCRIPTION
          ErrorDescription: An error message detailing the cause of this specific request failure can be found in the application event log of the web server. Please review this log entry to discover what caused this error to occur.
        Warning - MODULE_SET_RESPONSE_ERROR_STATUS
          ModuleName: AspNetInitClrHostFailureModule
          Notification: 1
          HttpStatus: 500
          HttpReason: Internal Server Error
          HttpSubStatus: 0
          ErrorCode: 2147942402
          ConfigExceptionInfo
          Notification: BEGIN_REQUEST
          ErrorCode: The system cannot find the file specified. (0x80070002)

    And I get the following in the Application log:

        Log Name: Application
        Source: Microsoft-Windows-IIS-W3SVC-WP
        Date: 5/28/2010 2:08:10 PM
        Event ID: 2299
        Task Category: None
        Level: Error
        Keywords: Classic
        User: N/A
        Computer: win-ltfkdo1dnfp
        Description: An application has reported as being unhealthy. The worker process will now request a recycle. Reason given: An error message detailing the cause of this specific request failure can be found in the application event log of the web server. Please review this log entry to discover what caused this error to occur. The data is the error.

        Event Xml:
        <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
          <System>
            <Provider Name="Microsoft-Windows-IIS-W3SVC-WP" Guid="{670080D9-742A-4187-8D16-41143D1290BD}" EventSourceName="W3SVC-WP" />
            <EventID Qualifiers="49152">2299</EventID>
            <Version>0</Version>
            <Level>2</Level>
            <Task>0</Task>
            <Opcode>0</Opcode>
            <Keywords>0x80000000000000</Keywords>
            <TimeCreated SystemTime="2010-05-28T21:08:10.000000000Z" />
            <EventRecordID>1663</EventRecordID>
            <Correlation />
            <Execution ProcessID="0" ThreadID="0" />
            <Channel>Application</Channel>
            <Computer>win-ltfkdo1dnfp</Computer>
            <Security />
          </System>
          <EventData>
            <Data Name="Reason">An error message detailing the cause of this specific request failure can be found in the application event log of the web server. Please review this log entry to discover what caused this error to occur.</Data>
            <Binary>02000780</Binary>
          </EventData>
        </Event>

    Anyone have a suggestion about where I should start looking?
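    For what it's worth, 0x80070002 ("file not found") from AspNetInitClrHostFailureModule is commonly reported when the CLR version the application pool requests is not registered with IIS on the new box. A hedged first step is to re-register ASP.NET 4.0 and pin the pool's runtime version; the pool name below is a guess derived from the physical path above, not something from the post.

        # Re-register ASP.NET 4.0 with IIS (64-bit framework path shown).
        & "$env:windir\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe" -iru

        # Pin the pool to the v4.0 runtime; 'JustStreamIt' is a hypothetical pool name.
        & "$env:windir\system32\inetsrv\appcmd.exe" set apppool "JustStreamIt" /managedRuntimeVersion:v4.0
        & "$env:windir\system32\inetsrv\appcmd.exe" list apppool "JustStreamIt"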

    Read the article

  • SQL Server 2008 uses half the CPUs

    - by ACALVETT
    I recently got my hands on a couple of 4-socket servers with Intel E7-4870s (10 cores per CPU), and with hyper-threading enabled that gave me 80 logical CPUs. The server has Windows 2008 R2 SP1 along with SQL 2008 (currently we cannot deploy SQL 2008 R2 for the application being hosted). When SQL Server started, I noticed only 2 NUMA nodes and 40 logical cores were configured, where there should have been 4 NUMA nodes and 80 logical cores (see below). The problem is caused by the fact that...(read more)
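    For readers hitting the same wall, a quick way to see what SQL Server itself reports (the instance name below is a placeholder) is to group the visible schedulers by NUMA node:

        Invoke-Sqlcmd -ServerInstance 'SERVER\INST' -Query "SELECT parent_node_id AS numa_node, COUNT(*) AS visible_schedulers FROM sys.dm_os_schedulers WHERE status = 'VISIBLE ONLINE' GROUP BY parent_node_id;"

    On the hardware described, two rows of 20 schedulers each instead of four rows would match the symptom in the post.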

    Read the article

  • Referencing SQL Server 2008 R2 SMO from Visual Studio 2010

    - by user69508
    Hello. We read a number of things about referencing SQL Server SMO from Visual Studio but still don't have the definite answers we need. So, here it goes... A number of years ago we created a C# application using Visual Studio 2005 and SQL Server 2005. In that application, we added .NET references to a number of SQL Server SMO assemblies, and everything worked fine. Those references were:
    - Microsoft.SqlServer.ConnectionInfo (GAC, 9.0.242.0)
    - Microsoft.SqlServer.Smo (GAC, 9.0.242.0)
    - Microsoft.SqlServer.SqlEnum (GAC, 9.0.242.0)
    We have now migrated to Visual Studio 2010 and SQL Server 2008 R2. However, when we try to reference those same SMO assemblies for SQL Server 2008 R2, they don't appear in the .NET references tab. We want to reference the SQL Server 2008 R2 version of those same SMO assemblies for our upgraded C# application. On our development machines, we have SQL Server 2008 R2 Developer installed with all options, including the SDK, such that the assemblies are found in C:\Program Files (x86)\Microsoft SQL Server\100\SDK\Assemblies. So, my first questions are:
    - Are we supposed to use file references to the SMO assemblies instead of .NET references in Visual Studio 2010 with SQL Server 2008 R2?
    - Or is there some problem with our development machines such that the SMO assemblies are not appearing in the .NET references tab?
    Next, our production machines will have SQL Server 2008 R2 Workgroup installed with the client tools option selected, thus providing those same SMO assemblies in C:\Program Files (x86)\Microsoft SQL Server\100\SDK\Assemblies. So, the next questions are:
    - When we release to production, are we supposed to redistribute the SMO assemblies with our application?
    - Or will our application work on the production servers without redistributing the SMO assemblies (since the client tools/SMO assemblies have been installed)?
    What else? Thanks for the help!
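    Not an answer to the .NET-tab question itself, but a quick hedged check that the R2 (version 10.x) SMO assemblies are present and loadable from the SDK folder before wiring up file references; the path is the default from the post and may differ per machine:

        $sdk = 'C:\Program Files (x86)\Microsoft SQL Server\100\SDK\Assemblies'
        foreach ($name in 'Microsoft.SqlServer.ConnectionInfo',
                          'Microsoft.SqlServer.Smo',
                          'Microsoft.SqlServer.SqlEnum') {
            # LoadFrom resolves the file directly, mirroring what a Visual Studio
            # file (Browse) reference does at build time.
            $asm = [System.Reflection.Assembly]::LoadFrom("$sdk\$name.dll")
            '{0}  {1}' -f $asm.GetName().Name, $asm.GetName().Version
        }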

    Read the article

  • Partner Blog Series: PwC Perspectives Part 2 - Jumpstarting your IAM program with R2

    - by Tanu Sood
    Identity and access management (IAM) isn't a new concept. Over the past decade, companies have begun to address identity management through a variety of solutions that have primarily focused on provisioning. The new-age workforce is converging at a rapid pace, with ever-increasing demand to use a diverse portfolio of applications and systems to interact and interface with their peers in the industry and customers alike. Oracle has taken a significant leap with the release of Identity and Access Management 11gR2 towards enabling this global workforce to conduct their business in a secure, efficient and effective manner. As companies deal with IAM business drivers, it becomes immediately apparent that holistic, rather than piecemeal, approaches better address their needs. When planning an enterprise-wide IAM solution, the first step is to create a common framework that serves as the foundation on which to build cost, compliance and business process efficiencies. As a leading industry practice, IAM should be established on a foundation of accurate data for identity management, making this data available in a uniform manner to downstream applications and processes. Mature organizations are looking beyond IAM's basic benefits to harness more advanced capabilities in user lifecycle management. For any organization looking to embark on an IAM initiative, consider the following use cases in managing and administering user access.

    Expanding the Enterprise Provisioning Footprint
    Almost all organizations have some helpdesk resources tied up in handling access requests from users, a distraction from their core job of handling problem tickets. This dependency has mushroomed from the traditional acceptance of provisioning solutions integrating and addressing only a portion of applications in the heterogeneous landscape. Oracle Identity Manager (OIM) 11gR2 solves this problem by offering integration with third-party ticketing systems as "disconnected applications". It allows existing business processes to be seamlessly integrated into the system and tracked throughout their lifecycle. With minimal effort and analysis, an organization can begin integrating OIM with groups or applications that are involved with manually intensive access provisioning and de-provisioning activities. This aspect of OIM allows organizations to on-board applications and associated business processes quickly, using out-of-the-box templates and frameworks. This is especially important for organizations looking to fold in users and resources from mergers and acquisitions.

    Simplifying Access Requests
    Organizations looking to implement access request solutions often find it challenging to get their users to accept and adopt the new processes. So, how do we improve the user experience, make it intuitive and personalized, and yet simplify the user access process? With R2, OIM helps organizations alleviate the challenge by placing the most used functionality front and centre in the new user request interface. Roles, application accounts, and entitlements can all be found in the same interface as catalog items, giving business users a single location to go to whenever they need to initiate, approve or track a request. Furthermore, if a particular item is not relevant to a user's job function or area inside the organization, it can be hidden so as not to overwhelm or confuse the user with superfluous options. The ability to customize the user interface to suit your needs helps in exercising the business rules effectively and avoiding access proliferation within the organization.

    Saving Time with Templates
    A typical use case that is most beneficial to business users is the flexibility to place, edit, and withdraw requests based on changing circumstances and business needs. With OIM R2, multiple catalog items can now be added to and removed from the shopping cart, an e-commerce paradigm that many users are already familiar with. This feature can be especially useful when setting up a large number of new employees or granting an existing department or group access to a newly integrated application. Additionally, users can create their own shopping cart templates in order to complete subsequent requests more quickly. This feature saves the user from having to search for and select items all over again if a request is similar to a previous one.

    Advanced Delegated Administration
    A key feature of any provisioning solution should be to empower each business unit to manage its own access requests. By bringing administration closer to the user, you improve user productivity, enable efficiency and alleviate the administration overhead. Doing so requires a federated services model, so that the business units capable of shouldering the onus of user lifecycle management for their business users can be enabled to do so. OIM 11gR2 offers advanced administrative options for creating, managing and controlling business logic and workflows through an easy-to-use administrative interface and tools that can be exposed to delegated business administrators. For example, these business administrators can establish or modify how certain requests and operations should be handled within their business unit based on a number of attributes, ranging from the type of request to the risk level of the individual items requested.

    Closed-Loop Remediation
    Security continues to be a major concern for most organizations. Identity management solutions bolster security by ensuring only the right users have the right access to the right resources. Preventing unauthorized access, and, where it already exists, the ability to detect and remediate it, are key requirements of an enterprise-grade proven solution. But the challenge with most solutions today is that some of this information still exists in silos, and when changes are made to systems directly, not all information is captured. With R2, Oracle is offering a comprehensive identity governance solution that our customer organizations are leveraging for closed-loop remediation, which provides an automated way for administrators to revoke unauthorized access. The change is automatically captured and the action noted for continued management.

    Conclusion
    While implementing provisioning solutions, it is important to keep both the near-term and the long-term goals in mind. The provisioning solution should always be part of a larger security and identity management program, with the ability to integrate seamlessly not only with the company's infrastructure but also to leverage the information and business models compiled and used by the other identity management solutions. This allows organizations to reduce the cost of ownership, close security gaps and leverage the existing infrastructure. And having done so at multiple clients' sites, this is the approach we recommend.

    In our next post, we will take a journey through our experiences of advising clients looking to upgrade to R2 from a previous version or migrating from a different solution. Meet the Writers:
    - Praveen Krishna is a Manager in the Advisory Security practice within PwC. Over the last decade Praveen has helped clients plan, architect and implement Oracle identity solutions across diverse industries. His experience includes delivering security across diverse topics like network, infrastructure, application and data, where he brings a holistic point of view to problem solving.
    - Dharma Padala is a Director in the Advisory Security practice within PwC. He has been implementing medium to large scale Identity Management solutions across multiple industries including the utility, health care, entertainment, retail and financial sectors. Dharma has 14 years of experience in delivering IT solutions, out of which he has been implementing Identity Management solutions for the past 8 years.
    - Scott MacDonald is a Director in the Advisory Security practice within PwC. He has consulted for several clients across multiple industries including financial services, health care, automotive and retail. Scott has 10 years of experience in delivering Identity Management solutions.
    - John Misczak is a member of the Advisory Security practice within PwC. He has experience implementing multiple Identity and Access Management solutions, specializing in Oracle Identity Manager and Business Process Execution Language (BPEL).
    - Jenny (Xiao) Zhang is a member of the Advisory Security practice within PwC. She has consulted across multiple industries including financial services, entertainment and retail. Jenny has three years of experience in delivering IT solutions, out of which she has been implementing Identity Management solutions for the past one and a half years.

    Read the article

  • VMRC equivalent for Hyper-V?

    - by Ian Boyd
    VMRC was the client tool used to connect to virtual machines running on Virtual Server. Upgrading to Windows Server 2008 R2 with the Hyper-V role, I need a way for people to be able to use the virtual machines. Note:
    - not all virtual machines will have network connectivity
    - not all virtual machines will be running Windows
    - some people needing to connect to a virtual machine will be running Windows XP
    Hyper-V Manager, allowing management of the Hyper-V server, is less desirable, since it allows management of the Hyper-V server (and doesn't work on all operating systems). What is the Windows Server 2008 R2 equivalent of VMRC; a way to "vnc" to a virtual server?
    Update: I think Tatas was suggesting Microsoft System Center Virtual Machine Manager Self-Service Portal 2.0, which requires SQL Server and IIS. Installing those would unfortunately violate our Windows Server 2008 R2 license. I might be looking at the wrong product link, since a commenter said there is a version that doesn't require System Center.
    Update 2: The Windows Server 2008 R2 running Hyper-V is licensed with the understanding that it only be used to host Hyper-V. From the Windows Server 2008 R2 Licensing FAQ:
        Q. If I have one license for Windows Server 2008 R2 Standard and want to run it in a virtual operating system environment, can I continue running it in the physical operating system environment?
        A. Yes, with Windows Server 2008 R2 Standard, you may run one instance in the physical operating system environment and one instance in the virtual operating system environment; however, the instance running in the physical operating system environment may be used only to run hardware virtualization software, provide hardware virtualization services, or to run software to manage and service operating system environments on the licensed server.
    This is why I'm wary about installing IIS or SQL Server.
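    One hedged suggestion: vmconnect.exe, installed with the Hyper-V management tools, gives a console-level session to a guest, so it works with no guest networking and regardless of guest OS, though the tool itself does not run on Windows XP. Host and VM names below are placeholders.

        # Opens a console window to the named VM on the named host.
        & "$env:ProgramFiles\Hyper-V\vmconnect.exe" 'HYPERV-HOST' 'Guest VM Name'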

    Read the article

  • New P6 Reporting Database R2

    - by mark.kromer
    Along with our recently announced GA release of P6 Analytics R1, you may have noticed that when you purchase P6 Analytics, we provide a restricted use license for P6 Reporting Database R2. This represents an updated version of the previous P6 Reporting Database 6.2 and can be purchased individually on a per-CPU basis. Typically, you will want just the reporting database if you would like the P6 data warehouse components, such as the ETL, data models, ODS and star schemas, in order to report on that data with a reporting tool other than Oracle's. The P6 Analytics solution will only work on Oracle BI (OBI). But I pasted below some examples of a simplistic matrix report that I built from the P6 Reporting Database using Microsoft SQL Server Reporting Services. This is the Report Builder tool, which is very similar to other report-building tools on the market today, such as Crystal Reports or Oracle BI Publisher. This is an example of what you can do (in a very simple format) by using the P6 Reporting Database without P6 Analytics. Here is a quick run-down of some of the key new features in P6 Reporting Database R2 that were added as enhancements to the 6.2 version:
    • 4 new star schemas (improved projects star, project history, resource utilization and resource allocation)
    • Improved ETL performance and reliability
    • P6 security is inherited at the star schema level
    • Custom P6 project, activity & resource codes are now available as customizable dimensions in the star schemas
    • Time-phased data down to the day is now available from the star schemas
    • An updated Operational Data Store (ODS) for operational reporting that includes the WBS hierarchy
    • The ODS now includes daily spreads for activity and resource assignments

    Read the article

  • SQLAuthority News – Download Whitepaper – SQL Server 2008 R2 Analysis Services Operations Guide

    - by pinaldave
    SQL Server Analysis Services (SSAS) has always been an interesting subject for research. Analysis Services cubes are a very powerful tool in the hands of the business intelligence (BI) developer; they provide an easy way to expose even large data models directly to business users. Microsoft has published a very informative white paper, the Analysis Services Operations Guide, authored by Thomas Kejser, John Sirmon, and Denny Lee. In this guide you will find information on how to test and run Microsoft SQL Server Analysis Services in SQL Server 2005, SQL Server 2008, and SQL Server 2008 R2 in a production environment. The focus of the guide is how you can test, monitor, diagnose, and remove production issues on even the largest scaled cubes. The paper also provides guidance on how to configure the server for the best possible performance. It is the goal of this guide to make your operations processes as painless as possible, and to have you run with the best possible performance without any additional development effort to your deployed cubes. In this guide, you will learn how to get the best out of your existing data model by making changes transparent to the data model and by making configuration changes that improve the user experience of the cube. Download the SQL Server 2008 R2 Analysis Services Operations Guide. Note: abstract taken from the white paper. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, T SQL, Technology

    Read the article

  • Microsoft SQL Server 2008 R2 System Databases

    For a majority of software developers, little time is spent understanding the inner workings of the database management systems (DBMS) they use to store data for their applications. I personally place myself in this grouping. In my case, I have used various versions of Microsoft's SQL Server (2000, 2005, and 2008 R2) and just recently learned how valuable the system databases really are when I was preparing to deliver a lecture on "SQL Server 2008 R2 System Databases".

    So what are system databases in MS SQL Server, and why should I know them? Microsoft uses system databases to support the SQL Server DBMS, much like a developer uses config files or database tables to support an application. These system databases individually provide specific functionality that allows MS SQL Server to function.

        Name          Database File              Log File
        Master        master.mdf                 mastlog.ldf
        Resource      mssqlsystemresource.mdf    mssqlsystemresource.ldf
        Model         model.mdf                  modellog.ldf
        MSDB          msdbdata.mdf               msdblog.ldf
        Distribution  distmdl.mdf                distmdl.ldf
        TempDB        tempdb.mdf                 templog.ldf

    Master Database
    If you have used MS SQL Server then you should recognize the Master database, especially if you have used SQL Server Management Studio (SSMS) to connect to a user-created database. MS SQL Server requires the Master database in order for the DBMS to start, due to the information it stores. Examples of data stored in the Master database: user logins, linked servers, configuration information, and information on user databases.

    Resource Database
    Honestly, until recently I never knew this database even existed, until I started to research SQL Server system databases. The reason for this is largely that the Resource database is hidden from users; in fact, the database files are stored within the Binn folder instead of the standard MS SQL Server database folder path. This database contains all system objects that can be accessed by all other databases. In short, this database contains all the system views and stored procedures that appear in all other user databases regarding system information. One of the many benefits of storing system views and stored procedures in a single hidden database is that it simplifies upgrading a SQL Server database; not to mention that maintenance is decreased, since only one code base has to be maintained for all of the system views and procedures.

    Model Database
    The Model database, as the name implies, is the model for all new databases created by users. This allows for predefining default database objects for all new databases within an MS SQL Server instance. For example, if every database created by a user needs to have an "Audit" table, then defining the "Audit" table in the model guarantees that the table will be located in every new database created after the model is altered.

    MSDB Database
    The MSDB database is used by SQL Server Agent, SQL Server Database Mail, and SQL Server Service Broker, along with SQL Server itself. The SQL Server Agent uses this database to store job configurations and SQL job schedules, along with SQL alerts and operators. In addition, this database also stores all SQL job parameters along with each job's execution history. Finally, this database is used to store database backup and maintenance plans, as well as details pertaining to SQL log shipping if it is being used.

    Distribution Database
    The Distribution database is only used during replication and stores metadata and history information pertaining to the act of replicating data. Furthermore, when transactional replication is used, this database also stores information regarding each transaction. It is important to note that replication is not turned on by default in MS SQL Server and that the Distribution database is hidden from SSMS.

    Tempdb Database
    The Tempdb, as the name implies, is used to store temporary data and data objects; examples include temp tables and temp stored procedures. It is important to note that all data and data objects in this database are cleared when SQL Server restarts. This database is also used by SQL Server when it is performing some internal operations; typically, SQL Server uses it for large sort and index operations. Finally, this database is used to store row versions if row versioning or snapshot isolation transactions are being used by SQL Server.

    Additionally, I would love to hear from others about their experiences using system databases, tables, and objects in real-world environments.
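    A small sketch to see these databases on your own instance (instance name is a placeholder); note the Resource database will not appear in sys.master_files because it is hidden, but its version is exposed through SERVERPROPERTY:

        # The first four database_ids are always master, tempdb, model, msdb.
        Invoke-Sqlcmd -ServerInstance 'SERVER\INST' -Query "SELECT d.name, f.type_desc, f.physical_name FROM sys.databases d JOIN sys.master_files f ON f.database_id = d.database_id WHERE d.database_id <= 4;"

        # The hidden Resource database reports its version separately.
        Invoke-Sqlcmd -ServerInstance 'SERVER\INST' -Query "SELECT SERVERPROPERTY('ResourceVersion') AS resource_db_version;"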

    Read the article

  • Dell Management Packs in System Center Operations Manager 2007 R2?

    - by bwerks
    Hey all, I recently set up SCOM in a small business network environment. The root management server is a Dell Poweredge 2950, and I'd like to use SCOM to monitor it using Dell's management packs. I've imported the management packs into the SCOM deployment and followed Dell's installation instructions, but it doesn't seem to be fully working yet. Currently, the Diagram views in the Dell tree (Monitoring tab) seem to show me the server's place in the network topology, so it seems that at least part of it is working. However, none of the reports under "Performance and Power Monitoring Views" provide any information. When clicking on one of them (Power Consumption (Watts), for instance), the display area is blank and there is a tooltip visible that reads "No performance counter is selected. To select a counter, place a check mark in the Show column in legend below." However, in the legend, there's nothing there for me to check. I've installed OpenManage 6.2 on the server as per the Dell documentation, but I don't know what else I could have done that I missed. Does this sound like a familiar problem to anyone?

    Read the article

  • Windows 2008 R2 Scheduled Task Not Running With Admin Privileges Even When Granted?

    - by j.rightly
    I have a scheduled task that is running as USER. I have checked the box "Run with highest privileges" in the scheduled task properties. The task is a powershell script that, among other things, reboots the system. The script executes and runs normally, but as a scheduled task, it fails to reboot the system. Here is the kicker: When I manually run the script as USER using the exact same command line as what's in the scheduled task, the script still runs but this time it actually reboots the system. I have UAC disabled and USER is a member of the local Admins group. The local Admins group has the right to shut down the system. Nothing in the event logs offers any clues. Why would the same script running under the same credentials work interactively but not as a scheduled task? UPDATE: This is too weird. When the task ran on schedule, everything worked normally.
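    One hedged way to narrow this down is to have the script log the security context it actually runs under, since a scheduled-task token can differ from an interactive one even for the same account; the log path below is a placeholder.

        # At the top of the script: capture the effective identity and privileges
        # for comparison against an interactive run of the same script.
        whoami /all | Out-File -FilePath 'C:\Scripts\task-context.log'

        # Reboot via the cmdlet; -Force closes sessions that might otherwise
        # silently block the shutdown.
        Restart-Computer -Force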

    Read the article

  • How can I parse/ transform text log data before it gets captured in SCOM 2007 R2?

    - by Abs
    I'm pretty much a noob with System Center Operations Manager 2007, and I'm probably missing something pretty basic, but I'm stumped anyway. We're setting up monitoring on some of our servers, and we'd like to capture data from some plain text log files (e.g. DNS debug logs, DHCP logs). It looks to me like I can set up a generic text file monitoring rule and get events captured into the main Ops Manager database, but my understanding is that the whole line of text from the plain text log gets captured as one field. In an ideal world, we'd be able to parse or transform that log file data to make it easier to query later. Is this possible? Is it easy? Do I have to buy expensive 3rd-party software to do it? One more thing: it would be even better if there was a way to stuff this data into the Audit Collection Services (ACS) database instead of the main one, but I'll take what I can get. Any help would be greatly appreciated.
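    Ops Manager's generic text-log rules do treat each line as a single field, so one common workaround is a scheduled pre-processing pass that parses the raw log into structured output that a rule or script-based data source can then consume. The sketch below is illustrative only: the path is a placeholder and the regex is an assumption, since the DNS debug log layout varies with the logging options chosen.

        Get-Content 'C:\Logs\dns-debug.log' | ForEach-Object {
            # Pull date, time, protocol and client IP out of each raw line.
            if ($_ -match '^(?<date>\S+)\s+(?<time>\S+)\s+.*\b(?<proto>UDP|TCP)\b.*?(?<client>\d{1,3}(\.\d{1,3}){3})') {
                New-Object PSObject -Property @{
                    Date   = $Matches['date']
                    Time   = $Matches['time']
                    Proto  = $Matches['proto']
                    Client = $Matches['client']
                }
            }
        } | Export-Csv 'C:\Logs\dns-debug-parsed.csv' -NoTypeInformation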

    Read the article

  • Why does copying an XML file from Windows Server 2008 R2 64-bit to Windows XP 32-bit change the file?

    - by Alex In Paris
    What I do:
    1. Copy an XML file (Ctrl+C) on a Win Server 2008 machine.
    2. Minimize mstsc.exe (remote connection app).
    3. Paste the XML file onto my WinXP machine (Ctrl+V).
    The result: all of the original contents are still present, but another bit is appended at the end. E.g. the proper end of the file looks something like this:
        <ApplicationName>MyApp</ApplicationName> </ReceivePort> </ReceivePortCollection> <PartyCollection xsi:nil="true" /> </BindingInfo>
    But, after the copy, it looks like this:
        <ApplicationName>MyApp</ApplicationName> </ReceivePort> </ReceivePortCollection> <PartyCollection xsi:nil="true" /> </BindingInfo>al, PublicKeyToken=3zzf3xxxadyyy35" Type="1" TrackingOption="ServiceStartEnd MessageSendReceive PipelineEvents" Description="" /> <ReceivePipelineData xsi:nil="true" /> <SendPipeline xsi:nil="true" /> <SendPipelineData xsi:nil="true" /> <Enable>true</Enable> <ReceiveHandler Name="WCF_OracleDB_Rx" HostTrusted="false"> <TransportType Name="WCF OracleDB" Capabilities="779" Configuratio
    The extra bits it adds are things that come from earlier in the XML file. If I do the copy multiple times, the extra bits are always exactly the same, but another XML file will add different lines.
    Extra information:
    - If I copy/paste the file as above, but first enclose it in a zip file, I do not have the same problem; i.e. the file copies properly and without any extra surprises.
    - If I do a copy/paste from a Windows Explorer window that's opened to the folder on the remote machine, I do not see the same behavior; i.e. the file copies properly and without any extra surprises.
    Question: why does this happen?
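    Until the root cause is found, a cheap guard is to compare a hash of the file on both sides after each copy (the path is a placeholder); .NET's MD5 class keeps this working on the Windows XP end, which predates the Get-FileHash cmdlet.

        # Run the same three lines on both machines; differing output means the
        # paste mangled the file.
        $md5   = [System.Security.Cryptography.MD5]::Create()
        $bytes = [System.IO.File]::ReadAllBytes('C:\Temp\bindings.xml')
        ([System.BitConverter]::ToString($md5.ComputeHash($bytes))) -replace '-', ''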

    Read the article

  • What is Remote Desktop Services in Windows Server 2008 R2 all about?

    - by fejesjoco
    Seriously, I'm lost in all that sales mumbo-jumbo. Let's say I want 1 or 2 users to be able to remotely log on to a server, run Word, Visual Studio, Firefox, and whatever. Do I gain anything at all if I install Remote Desktop Services? Or do I just install the Desktop Experience feature pack, enable Remote Desktop and voila, nobody will ever notice the difference? Here's what TechNet says about the Remote Desktop Session Host:
        A Remote Desktop Session Host (RD Session Host) server is the server that hosts Windows-based programs or the full Windows desktop for Remote Desktop Services clients. Users can connect to an RD Session Host server to run programs, to save files, and to use network resources on that server. Users can access an RD Session Host server by using Remote Desktop Connection or by using RemoteApp.
    The good old simple Remote Desktop can also host a full Windows desktop for remote clients so that they can run programs, save files and do all that stuff. Why do they write about it like it's such a great new invention, besides that they want to sell it? RDSH doesn't seem all that different. What do I install when I install RDSH, since all those features are already there in Windows?
    What's even more confusing is that you need to take special care when you want to install applications on an RDSH so that they will be usable by many concurrent users. Why? All modern applications install the program files in one directory, store some common settings in the ProgramData folder and the HKLM hive, and store user-specific settings in the Users folder and the HKCU hive. They are designed to be usable by many users on the same machine: 2 or 2000 users can use them concurrently without any effort. I can sign in with 2 users to a server with only Remote Desktop enabled, and both of us can run Word or anything without any problems, can't we? So what changes if I set RDSH to install mode, and what happens if I don't? Why is the feature to switch between install and execute mode there at all?
    Yes, I know of some advantages of Remote Desktop Services: there's no 2-user limit, it supports virtualization, video acceleration and so on, and it has a whole infrastructure with a gateway, web access, connection broker, etc. But I don't need those, so if you take these away, how are these two technologies different? From the articles it seems like they are completely different technologies, whereas it looks to me like they are completely the same at the core, and Remote Desktop Services just adds some additional features, but doesn't reinvent anything.
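    For reference, the install/execute switch being asked about is driven by two commands on the session host. Between them, per-user INI and HKCU registry writes made by an installer are captured into a per-user store so they can be replayed for each later user, which is the piece plain two-session Remote Desktop administration mode does not do. The MSI path below is a placeholder.

        change user /install                   # put the session in install mode
        msiexec /i '\\share\SomeApp.msi' /qn   # hypothetical application install
        change user /execute                   # return to normal execution mode
        change user /query                     # confirm the current mode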

    Read the article

  • Securing RDP access to Windows Server 2008 R2: is Network Level Authentication enough?

    - by jamesfm
    I am a dev with little admin expertise, administering a single dedicated web server remotely. A recent independent security audit of our site recommended that "RDP is not exposed to the Internet and that a robust management solution such as a VPN is considered for remote access. When used, RDP should be configured for Server Authentication to ensure that clients cannot be subjected to man-in-the-middle attacks." Having read around a bit, it seems like Network Level Authentication is a Good Thing, so I have enabled the "Allow connections only from Remote Desktop with NLA" option on the server today. Is this action enough to mitigate the risk of a man-in-the-middle attack? Or are there other essential steps I should be taking? If a VPN is essential, how do I go about it?
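    For what it's worth, NLA can be verified (and enforced) on the server itself through WMI, which is a quick way to confirm the checkbox took effect; a small sketch:

        $ts = Get-WmiObject -Namespace 'root\cimv2\TerminalServices' `
                            -Class Win32_TSGeneralSetting -Filter "TerminalName='RDP-Tcp'"
        $ts.UserAuthenticationRequired                     # 1 means NLA is required
        $ts.SetUserAuthenticationRequired(1) | Out-Null    # enforce it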

    Read the article

  • Server 2012 R2 DNS Conditional forwarding not working reliably, possible caching issue?

    - by Matt
    I have a bit of a home lab setup with a domain controller that is acting as the DNS server for my network. For everything, it's working fine and forwards external DNS requests to my ISP. The household recently wanted to get Netflix going and it seemed a DNS option was better than a VPN to get around the region locking, so I signed up for unblock-us.com Since I have a Windows DNS server I thought I'd be clever and make use of conditional forwarders and added the Netflix domain to the list. Initially this worked well and all devices on the network could now access Netflix, however after about an hour going to the Netflix site would result in a page cannot be found. Doing an nslookup of Netflix.com from my PC resulted in it not returning any IP addresses. As a test, I deleted the Netflix domain from the DNS servers cache and things started working again - devices could get to the site again however the same thing happens again after around half an hour to an hour. Have I missed something here that's causing it to stop working?
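    A hedged sketch of the flush-and-recreate step using the DnsServer module that ships with Server 2012 R2; the forwarder IPs below are placeholders for whatever resolvers unblock-us publishes.

        # Recreate the conditional forwarder for the zone, then drop any cached
        # answers so the next lookup goes back to the forwarder.
        Remove-DnsServerZone -Name 'netflix.com' -Force -ErrorAction SilentlyContinue
        Add-DnsServerConditionalForwarderZone -Name 'netflix.com' -MasterServers 208.122.23.22, 208.122.23.23
        Clear-DnsServerCache -Force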

    Read the article

  • Windows Server 2008 R2 Domain Controller DNS Reverse lookup zone needed?

    - by Joost Verdaasdonk
    When I create a new domain controller with dcpromo, the wizard will also add the DNS role to the server, because the first domain controller must be the global catalog server for the forest. After the install, when I look at the DNS I see the forward lookup zone for the newly created domain; however, no reverse lookup zone is created. So my question is: is this an advisable end result or not? In other words, is it a good idea to add my domain to the reverse lookup zone as well? Just curious to hear how other people use this zone with a domain controller. Thanks
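    If you do decide to add one, an AD-integrated reverse zone takes one dnscmd line on Server 2008 R2; the 192.168.1.0/24 subnet below is just an example.

        dnscmd /ZoneAdd 1.168.192.in-addr.arpa /DsPrimary
        dnscmd /Config 1.168.192.in-addr.arpa /AllowUpdate 2   # secure dynamic updates only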

    Read the article

  • Windows 2008 R2 Task Scheduler triggered an event for unknown reasons.

    - by Mike
    Today I arrived at the office only to find that a task, which was scheduled to trigger at 5:30PM EST each Friday, had triggered on its own at 6:01AM EST this morning. I checked the event logs as well as the task schedule log and all of the evidence points to a timed trigger starting this task with the correct credentials, however the task history reports the task has not been triggered since last Friday when it ran to completion successfully. I do not have this task set to random start times or start if missed. This is the first time I have observed this happen in the Windows Task Scheduler and want to know if anyone else has come across this, why it happened and how to fix it?
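    One hedged avenue for digging further: the Task Scheduler operational event channel records which trigger actually fired, so filtering that log around the 6:01 AM window may show more than the task's own History tab. The task name below is a placeholder.

        Get-WinEvent -FilterHashtable @{
            LogName   = 'Microsoft-Windows-TaskScheduler/Operational'
            StartTime = (Get-Date).Date.AddHours(5)
            EndTime   = (Get-Date).Date.AddHours(7)
        } | Where-Object { $_.Message -like '*\MyWeeklyTask*' } |
            Select-Object TimeCreated, Id, Message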

    Read the article

  • How to configure what certificates can be issued using Web Enrollment in Windows Server 2008 R2 Enterprise?

    - by antik
    I have a CA installed on one of my Windows servers in a small farm of systems. I've installed the Certification Authority Web Enrollment and Certificate Enrollment Web Service roles on the CA. I want to issue a Computer certificate to a computer not joined to my domain. The user attempting web enrollment has domain credentials. The user was able to navigate to https://myServerHostname/certsrv and request a User certificate successfully. However, the user needs a Computer cert as well. From the certsrv site, the user tried the following: Advanced Certificate Request, then Create and Submit a Request to this CA. However, the Computer certificate template is not available under the Certificate Template heading; he is only seeing "User" and "Basic EFS". How do I configure the CA to allow him to request a Computer cert for his system?
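    As context for whoever answers: the browser-based certsrv pages run in user context, which is commonly why machine-context templates such as Computer are filtered out of the template list. A hedged alternative for the non-domain machine is certreq with an INF file; the subject, CA name and paths below are all placeholders, and "Machine" is the internal name of the default Computer template.

        # Write a minimal request INF, then create, submit and install the cert.
        Set-Content 'C:\Temp\machine.inf' @(
            '[NewRequest]',
            'Subject = "CN=standalone-host.example.com"',
            'KeyLength = 2048',
            'MachineKeySet = TRUE',
            '[RequestAttributes]',
            'CertificateTemplate = Machine'
        )
        certreq -new    'C:\Temp\machine.inf' 'C:\Temp\machine.req'
        certreq -submit -config 'CAHOST\My-Issuing-CA' 'C:\Temp\machine.req' 'C:\Temp\machine.cer'
        certreq -accept 'C:\Temp\machine.cer'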

    Read the article

  • Web sites leaking memory? IIS 7.5, Windows Server 2008 R2

    - by Charles
    I have several web sites on my Windows 2008 server that have been working flawlessly for over a year. Just a few days ago I ran into an issue where my server stopped serving up pages on some of these sites for no apparent reason. I dug into it a little more today and I see that some of my sites (they're all ASP.NET MVC 3.0 sites) are consuming over 460 MB of memory. Like I said, this just started the other day after a very long period of no issues at all. I have two questions:
    1) Is there a way to throttle how much memory is consumed by the w3wp process before I have to force it to restart (restart the app pool for a particular site), so that it doesn't keep hogging all of the memory?
    2) Any ideas what could have caused this to start happening?
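    For question 1, IIS can recycle a pool automatically when its private bytes cross a threshold, which caps the damage while you hunt for the leak; a hedged one-liner, where the pool name and the roughly 400 MB limit are examples and the value is in kilobytes:

        & "$env:windir\system32\inetsrv\appcmd.exe" set apppool "MySite" /recycling.periodicRestart.privateMemory:409600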

    Read the article

  • How to properly secure Windows Server 2008 R2 that will host SQL Server 2012?

    - by Max
    I am a .NET programmer trying to create this setup: I want this server to be inaccessible from the DMZ except for IPsec connections, and to also have a private network which will be accessible through another Windows 2008 server which will host the VPN. That is how our Windows 2003 infrastructure works, and I am trying to do the same with 2008 servers. Are there any guides or documentation that cover this scenario?
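    A hedged starting point for the IPsec half on Server 2008 R2 is a connection security rule via netsh; the rule name is a placeholder, and auth1=computerkerb assumes Kerberos (domain) authentication, so certificate or preshared-key setups would be configured differently.

        netsh advfirewall consec add rule name="Require-IPsec-Inbound" endpoint1=any endpoint2=any action=requireinrequestout auth1=computerkerb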

    Read the article
