Search Results

Search found 23 results on 1 page for 'bwerks'.

  • Dell Management Packs in System Center Operations Manager 2007 R2?

    - by bwerks
Hey all, I recently set up SCOM in a small business network environment. The root management server is a Dell PowerEdge 2950, and I'd like to use SCOM to monitor it using Dell's management packs. I've imported the management packs into the SCOM deployment and followed Dell's installation instructions, but it doesn't seem to be fully working yet. Currently, the Diagram views in the Dell tree (Monitoring tab) do show the server's place in the network topology, so at least part of it is working. However, none of the reports under "Performance and Power Monitoring Views" provide any information. When clicking on one of them (Power Consumption (Watts), for instance), the display area is blank and there is a tooltip visible that reads "No performance counter is selected. To select a counter, place a check mark in the Show column in legend below." However, in the legend, there's nothing there for me to check. I've installed OpenManage 6.2 on the server as per the Dell documentation, but I don't know what else I could have done that I missed. Does this sound like a familiar problem to anyone?
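
    For what it's worth, one sanity check I still plan to run is asking OpenManage's own CLI for power readings directly on the server, to see whether the instrumentation layer works at all independent of SCOM. This is only a guess on my part; the path assumes a default OMSA install, and the command returns data only if the hardware exposes power monitoring:

        rem Query OpenManage Server Administrator's CLI for power consumption data.
        rem Path assumes a default OMSA 6.2 install; adjust if yours differs.
        "C:\Program Files\Dell\SysMgt\oma\bin\omreport.exe" chassis pwrmonitoring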

    Read the article

  • Clone virtual machine with Server 2008 R2 and Hyper-V?

    - by bwerks
Hi all, I've just started working with Hyper-V, and so far it's quite nice. However, I've been running into problems with what seems like it should be the most basic of workflows. I've set up a baseline Server 2008 R2 configuration and exported it with the intention of using the export for cloning, entering "C:\Exports\" as the export folder. However, I run into problems when I try to import the image. From Hyper-V Manager, I select "Import Virtual Machine"; in the resulting window I enter "C:\Exports\BuildServer\" as the folder, set the radio button to "Copy the virtual machine (create a new unique ID)", and check the checkbox for "Duplicate all files so the same virtual machine can be imported again." Doing so results in the following error: "Import failed. Import task failed to copy file from 'H:\Exports\BuildServer\Virtual Hard Disks\BuildServer.vhd' to 'C:\Hyper-V\Virtual Hard Disks\BuildServer.vhd': The file exists. (0x80070050)" Have I somehow messed something up in configuration, or is this a known thing? I've read that it should be possible to clone VMs by copying them in the filesystem, but I'd prefer to keep things in the management UI if possible.
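
    In the meantime, a workaround I'm considering (untested, and the folder names below are placeholders): copy the export to a fresh per-clone folder first, so each import works from its own files and can't collide with anything a previous import left behind:

        rem Copy the whole export tree to a scratch folder for this clone only,
        rem then point "Import Virtual Machine" at the copy instead of the original.
        robocopy "C:\Exports\BuildServer" "C:\Imports\BuildServer-clone1" /E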

    Read the article

  • Dell fumbles OpenManage installation process, forgets to write documentation?

    - by bwerks
Hi all, I'm setting up a Dell PowerEdge 2950 for a small business, and I've just spent a while with Dell OpenManage Server Administrator 6.2, trying to clear the installation process of errors before I execute it. Right now I'm getting the following warning from the installer: "The installer has detected that the HTTPS listener is not configured for Windows Remote Management. You can either configure the HTTPS listener before installing Remote Enablement, or install Remote Enablement now by selecting the "Custom" installation screen and configure the HTTPS listener later. See the "Remote Enablement Requirements" section in the "Dell OpenManage Installation and Security User's Guide" for information on configuring the HTTPS listener. Note: Remote Enablement is required to manage this system from a remote Server Administrator Web Server and is applicable only for those systems that support Server Instrumentation. Click here to configure HTTPs Listener for Windows Remote Management." The italicized line is a link, which executes...something...via cmd, and doesn't seem to help the problem. Not knowing exactly what to do here, I consulted the documentation. I read through the Setup and Administration section of the User's Guide, but all it contained was a weird primer on role-based security and some SNMP material. The next section skips installation entirely and moves on to features of the suite. Thinking myself crazy, I consulted the readme, which told me that for installation I should consult the "Dell OpenManage Installation and Security Version 6.2 User's Guide", which exists neither in the documentation nor, as far as I can tell, anywhere on Google. So yeah, if anyone is familiar with this problem, drop me some knowledge!
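
    Based on the wording, my best guess (and it is only a guess) is that the installer wants a WinRM HTTPS listener set up ahead of time. The stock WinRM tooling can do that, though it needs a server-authentication certificate already in the local machine store; the hostname and thumbprint below are placeholders:

        rem Quick route: let WinRM configure an HTTPS listener itself.
        winrm quickconfig -transport:https
        rem Or create the listener explicitly, binding it to an existing certificate.
        winrm create winrm/config/Listener?Address=*+Transport=HTTPS @{Hostname="server.example.com";CertificateThumbprint="PASTE_THUMBPRINT_HERE"}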

    Read the article

  • Can't access Administrator account on Windows XP after adding local user account

    - by bwerks
I have an installation of Windows XP that is not part of a domain. Previously it had only the Administrator account, but upon creating a second local user account, all access to the Administrator account was lost. When the machine starts up, only the new local account is offered for login, which seemed strange. I've checked that the Administrator account is not disabled, and no rights are missing from the local security policy. Furthermore, the Administrator account is accessible via Remote Desktop, where I'm given the opportunity to type the desired account name. REALLY strange. Upon deletion of the new local user account, the Administrator account appeared again. Can anyone tell me what's going on?
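
    For reference, one way to confirm from a command prompt that the account isn't disabled (the output's "Account active" field should read Yes):

        rem Dump the local Administrator account's status, including whether it is active.
        net user administrator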

    Read the article

  • One-way forest trust between geographically distributed forests using Server 2008 R2

    - by bwerks
Hi all, I'm planning out a joinder between two forests, as would take place between contracting companies. Forests A and B exist in distant sites, and there is to be a one-way forest trust so that domain users in Forest A can be authenticated on machines in Forest B. To facilitate this, the domain controllers in each forest must be able to contact one another in order to set up and confirm the trust, but my question is what underlying networking magic must take place beneath that. So far the prevailing approach has been to maintain a VPN connection between the two sites, but the TechNet documentation seems to indicate that DNS forwarding may be the way to go. Is this the case? Furthermore, if DNS will suffice, does that mean each domain must have a DNS server on its boundary so that it can be reached from across the internet? How must they be configured? Thanks!
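
    If conditional forwarding is indeed the mechanism, I assume the setup on each side would look something like the following (server name, zone, and address are hypothetical):

        rem On a DNS server in Forest A, forward queries for Forest B's zone
        rem to Forest B's DNS server; mirror this on the other side.
        dnscmd DC1 /ZoneAdd forestb.example.com /Forwarder 203.0.113.10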

    Read the article

  • Is mismatched firmware on drives in a RAID-6 a bad thing?

    - by bwerks
Hi all, I recently expanded a RAID-1 to a RAID-6 with six drives. I ordered all four of the new drives from the same place, and all of them were advertised to be the same drives as the original two: Seagate 15k RPM 146 GB. However, when I was looking at the drives in the PERC 6/i utility, one of them appeared to have an earlier firmware version; it had S515, compared to the other five drives with S527. Sure enough, after inspecting the drive itself, the label advertised the earlier firmware version. Running Dell's SAS firmware upgrade utility should in theory have moved them all up to S52A, but when I ran it, it moved the S527 drives up to S52A and left the S515 drive untouched. Is this something to be worried about? If it's something that should be corrected, is there a way to target a particular drive for upgrade, since the firmware utility didn't seem to do it by itself?
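
    If it helps anyone reproduce this, the per-drive firmware revisions can also be listed from OpenManage's CLI (controller number assumed to be 0):

        rem List the physical disks on controller 0, including each drive's firmware revision.
        omreport storage pdisk controller=0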

    Read the article

  • Windows 7 VPN Client Default IPsec Configuration?

    - by bwerks
As far as I can tell, the Windows VPN client doesn't provide a lot of flexibility in its IPsec settings. Assuming full configurability on the site end of a client-site VPN configuration, does anyone know how to configure the site to match the Windows client? Bonus points: how would I discover these settings for myself?

    Read the article

  • Is SQL Server 2008 R2 unsupported by Operations Manager (SCOM) 2007 R2?

    - by bwerks
Hey all, I'm performing a test configuration of System Center Operations Manager 2007 R2 on a system prepared with SQL Server 2008 R2. Unfortunately, the SCOM 2007 R2 prerequisites verification program seems to be detecting exact versions of SQL Server, and not simply a minimum version, like it claims: "System Center Operations Manager 2007 R2 requires SQL Server 2005 Standard or Enterprise Edition with SP1 and above or SQL Server 2008 Standard or Enterprise edition with SP1 and above. Note: Operations Manager 2007 R2 does not support a 32-bit Operations Manager Operations database, Reporting Server data warehouse or Audit Collection database on a 64-bit operating system." I had hoped this was just a helper tool assisting in getting me off the ground, but unfortunately it seems to be used as a gate for the installation to proceed. Has anyone encountered this? If so, is there a way to fool the installer into thinking that it has a proper version, or otherwise alert it to my valid configuration?
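
    For completeness, the installed version can be confirmed from the command line, so the mismatch is on the checker's side rather than mine (default instance assumed):

        rem Report the exact SQL Server build and edition the checker should be seeing.
        sqlcmd -S localhost -Q "SELECT SERVERPROPERTY('ProductVersion'), SERVERPROPERTY('Edition')"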

    Read the article

  • Windows 7 upgrade licensing

    - by bwerks
I'm having trouble wading through Microsoft's marketing information. Does anyone know if Windows 7 x86 to Windows 7 x64 is a valid upgrade path? I know you can't actually use the built-in "upgrade" installation path; this is more of a licensing question. Though in asking that I may have answered my own question: is this idea even possible, or do "upgrade" versions of Windows function only when executed from inside the OS, and not when doing fresh installs? Thanks!

    Read the article

  • Netgear FVS336G: appropriate solution for today's small businesses?

    - by bwerks
Hey all, I've been looking into routers to facilitate a VPN solution for a small business. While the Netgear FVS336G looks good on paper, it appears to have some fairly crippling setbacks that drag down what appears to be some great hardware. First off, the unit has been around for a couple of years now, perhaps from before 64-bit operating systems were as common as they are now, and complaints are everywhere claiming that SSL or IPsec (or both) VPN connections will not work with 64-bit operating systems. However, most of these claims mention only Vista, which makes me think the problems could have been solved since then. Unfortunately, Netgear's support forums seem to be incredibly private, policed by some troll named jmizuguchi who closes down public posts in order to marshal them into private ones. Danger, Will Robinson. Apparently their firmware upgrade process is a nightmare too, but that's beside the point. My question is this: has anyone configured a Netgear FVS336G to operate in a Server 2008 (or R2)/Windows 7 64-bit network? If so, is it possible to use the Microsoft VPN client, or are third-party clients still required? And if this thing has simply failed the test of time, is there a feature-comparable unit that I've missed, at anywhere near the same price range? Thanks!

    Read the article

  • Anonymous access to SMB share hosted on Server 2008 R2 Enterprise

    - by bwerks
Hi all, first off, I have read through this post and a whole slew of non-SF posts which seem to address the same or a similar problem, but I was still unable to fix it. I've got three machines in this situation:
    - a domain-joined server running Server 2008 R2 Enterprise ("share server")
    - a domain-joined workstation running XP Pro SP3 ("workstation")
    - a domain-unjoined test server running Server 2003 R2 SP2 ("test server")
    The share server exposes a share on the network that the test server must access; it's a Source/Symbol Server share for our debugging purposes. I believe Visual Studio simply accesses the share with its own credentials in this case, meaning the share must be accessible anonymously, since the test server isn't joined to the domain and there's no opportunity to supply domain authentication. I've attempted a lot of things to avoid the authentication window when accessing the share:
    - enabled the Guest account on the share server and given Guest full sharing/NTFS permissions for the share
    - given ANONYMOUS LOGON full sharing/NTFS permissions for the share
    - added my share to "Network access: Shares that can be accessed anonymously" in the local security policy
    - disabled "Network access: Restrict anonymous access to Named Pipes and Shares"
    - enabled "Network access: Let Everyone permissions apply to anonymous users"
    - added ANONYMOUS LOGON to "Access this computer from the network"
    - added the Guest account to "Access this computer from the network"
    - attempted to provision the share using the Share and Storage Management MMC snap-in
    Unfortunately, when I attempt to access the share from the test server, I still see the prompt and I'm forced to enter "Guest" manually. I also tried this workflow using the local administrator account on the workstation, and the same thing happens both with and without XP Simple File Sharing enabled. Any idea why I'm getting these results, or what I should have done differently?
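
    In case it matters, a quick command-line equivalent of what I'm attempting from the test server (server and share names are placeholders):

        rem Try to map the share as Guest with a blank password;
        rem a successful mapping without a credentials prompt is the goal.
        net use \\shareserver\Symbols "" /user:Guest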

    Read the article

  • Is Dell's server software bundle necessary? (PowerEdge 2950 in my case)

    - by bwerks
Hi all, Dell includes a fair amount of software with its servers, but I'm having a hard time determining from the documentation what each piece does and whether or not I should install it. Dell's support site (unless I'm doing it wrong) seems fairly opaque to me, and its offerings fairly unstandardized in terms of their usage, so if possible I'd like to stay away from them. Specifically, I'm curious whether any of the features offered are duplicated in something like Microsoft System Center. For background, I'm working with a PowerEdge 2950 that was just rebuilt with an expanded RAID-6, but initially I just installed Server 2008 R2 directly instead of using the Build and Update utility. There's nothing of use on it at the moment, so I'm totally open to wiping it again.

    Read the article

  • Authenticate domain-user credentials on unjoined virtual machine?

    - by bwerks
Hi all, this question may sound silly, and perhaps a bit insane, but: is there any way to run a process on a machine not joined to a domain using credentials from a user in that domain? In my case, I'm running virtual machines installed with release binaries from our build process, as well as Visual Studio. Visual Studio is there to debug our release binaries; however, it's being executed with VM-local user credentials. This means it can't authenticate to our TFS deployment when executing "tf.exe view" to utilize our Source Server for debugging. Team Explorer manages to authenticate to TFS using a UI prompt, but I suspect that's because we supply it with the TFS deployment's URI and it's designed to display a prompt to facilitate workgroup scenarios; i.e. it's not like we're getting it for free. My instincts tell me the only way to authenticate on this VM is to join it or somehow form a one-way trust or something, but is there an easier way? For automation we're going to want to script this eventually, but I'm first surveying the feasibility of the thing.
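
    Two things I've been meaning to experiment with, in case either pans out (domain, user, and server names below are placeholders):

        rem 1) Start a shell whose network identity is the domain user, then launch
        rem    Visual Studio or tf.exe from inside it; local access stays VM-local.
        runas /netonly /user:CORP\builduser cmd
        rem 2) Or hand credentials straight to tf.exe on the command line.
        tf view /collection:http://tfs.example.com:8080/tfs/DefaultCollection $/Project/Source/file.cs /login:CORP\builduser,password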

    Read the article

  • Can the features of Dell OpenManage be replaced by Microsoft System Center?

    - by bwerks
    Hi all, I'm new to both OpenManage and System Center, but it sounds as if they're geared towards similar problems/goals. Are the features comparable enough that OpenManage can be forgone completely in favor of System Center products? Specifically I'm hoping to achieve storage monitoring and remote administration, although if someone with experience with both wants to provide a quick compare/contrast (objective, of course) I won't complain. Thanks!

    Read the article

  • Why does TFS 2010 build my WiX project before anything else?

    - by bwerks
Hi all, a similar question was asked and answered about a year ago, but it was either a different issue (everything was in beta) or misdiagnosed. It's located here: http://stackoverflow.com/questions/688162/msbuild-task-fails-because-any-cpu-solution-is-built-out-of-order. My issue is that I have a WiX installer project, and after upgrading to TFS 2010 on Monday, the build fails at the link step because it can't find the build product of the WPF application in the solution. After some digging, that's because the WPF application hasn't been built yet. Building in VS2010 works as normal. The WiX project is set to depend on the WPF project, and when viewing Project Build Order in the IDE, everything looks normal. The problem was originally encountered with only two platform definitions in the solution, x86 and x64. There are also two flavors, Debug and Release, and TFSBuild.proj is set to build all four combinations. There was no occurrence of AnyCPU anywhere. Per the referenced question above, I tried changing the WPF project to use AnyCPU so that it would be built first; at that point the WiX project used the exact configuration and the WPF project used the flavor with AnyCPU. However, doing so didn't seem to change anything. I'm using the TFS 2010 RTM, VS2010 RTM, and the most recent version of WiX, which at the time of this writing is 3.5.1602.0, from 2010-04-02. Anyone else running into this?
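
    One diagnostic I plan to try on the build agent (purely a hunch on my part): force a non-parallel solution build, to see whether the ordering only breaks when projects build concurrently. The solution name and settings below are examples:

        rem Build the solution serially so MSBuild cannot start the WiX project
        rem before the WPF project it depends on has finished.
        msbuild MySolution.sln /maxcpucount:1 /p:Configuration=Release /p:Platform=x86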

    Read the article

  • When does MSBuild set the $(ProjectName) property?

    - by bwerks
I'm fairly new to MSBuild, and I've done some customization on a WPF project file that I'm building both in VS2010 and TFS2010. I've customized the output path as follows:

        <OutputPath Condition=" '$(TeamBuildOutDir)' == '' ">$(SolutionDir)build\binaries\$(ProjectName)\$(Configuration)\$(Platform)</OutputPath>
        <OutputPath Condition=" '$(TeamBuildOutDir)' != '' ">$(TeamBuildOutDir)binaries\$(ProjectName)\$(Configuration)\$(Platform)</OutputPath>

    This allows me to build to a centralized binaries directory when building on the desktop, and allows TFS to find the binaries when CI builds are running. However, it seems that in both cases the $(ProjectName) property evaluates to '' at build time, which creates strange results. Doing some debugging, it appears as if $(ProjectName) is set by the time BeforeBuild executes, but that my OutputPath property is evaluated prior to that point:

        <ProjectNameUsedTooEarly Condition=" '$(ProjectName)' == '' ">true</ProjectNameUsedTooEarly>

    The preceding property is in the same property group as my OutputPath property. In the BeforeBuild target, $(ProjectNameUsedTooEarly) evaluates to true, but $(ProjectName) evaluates to the project name as normal by that point. What can I do to ensure that $(ProjectName) has a value when I use it? Edit: I just used Attrice's MSBuild Sidekick to debug through my build file, and in the very first target available for a breakpoint (_CheckForInvalidConfigurationAndPlatform) all the properties seem to be set already. ProjectName is already set correctly, but my OutputPath property has already been set using the blank value of ProjectName.
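
    Separately, a diagnostic-verbosity log seems like a way to watch property evaluation; its "Initial Properties" section shows what each property held before any targets ran (warning: the log gets huge):

        rem Capture a diagnostic-verbosity log, which records every property's
        rem initial value and each reassignment during the build.
        msbuild MyProject.csproj /verbosity:diagnostic > build.log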

    Read the article

  • DataContractSerializer: preserve string member that happens to be raw XML?

    - by bwerks
I'm a little inexperienced with the DataContract paradigm, and I'm running into a deserialization problem. I have a field that's a string, but it contains XML and it's not being deserialized correctly. I have a feeling that it's because the DataContractSerializer is treating it as input to the serializer and not as an opaque string object. Is there some way to mark a DataMember in code to say "this thing is a string, don't treat its contents as XML", similar to XmlIgnore? Thanks!

    Read the article

  • Is my TFS2010 backup/restore hosed?

    - by bwerks
Hi all, I recently set up a sandbox TFS to test TFS-specific features without interfering with the production TFS, and I'm glad I did, sooner than I expected to be: I hadn't been backing up the encryption key from SSRS, and upon restoring the reporting databases they remained inactive, requiring initialization that could only come from applying that encryption key. Said key was lost when I nuked the partition after backing up the TFS databases. The only option I seem to have is to delete the encrypted data. I'm fine with this, since there wasn't much in there to begin with; however, once it's deleted, I'm not quite sure how to configure TFS to recognize a new installation of these services while using the restored versions of everything else. Unfortunately, the TFS help file doesn't seem to account for this state. Is there a way to essentially rebuild the reporting and analysis databases, or are they gone forever?
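
    For what it's worth, the SSRS-side half of this seems to be handled by the rskeymgmt tool that ships with Reporting Services; the commands below reflect my reading of its documentation rather than anything I've run yet, and the file path and password are placeholders:

        rem Delete the encrypted content that can no longer be decrypted.
        rskeymgmt -d
        rem And for next time: extract the symmetric key so a restore can complete.
        rskeymgmt -e -f C:\backup\ssrs-key.snk -p PLACEHOLDER_PASSWORD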

    Read the article

  • TFS 2010: how to set up a corporate source server?

    - by bwerks
Hi all, I'm looking for guidance in setting up a corporate source server, but when I google this topic the best I can come up with is articles and walkthroughs concerned with configuring VS to use Microsoft's public symbol servers for debugging .NET assemblies. For background, the environment I'm concerned with is VS2010/TFS2010. Basically, the workflow I'm looking to facilitate is this:
    1) Customer reports a problem with the application.
    2) The appropriate version of the application is installed on a virtual machine.
    3) Developer repros the bug, attaching to the process on the virtual machine and leveraging a source server (symbol server?) on the corporate domain. This is the step I'm concerned with.
    4) Developer pinpoints the problem and fixes the bug in his workspace.
    5) Developer performs a DLL swap on the VM to test the changes? (side topic, not sure on this)
    6) Normal development/source control workflows.
    Any advice is welcome! Edit: since writing this, I have stumbled on this article, which is a nice writeup on the configuration of source server for TFS 2008. Has anyone adapted it for TFS 2010?
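
    One concrete check I've picked up along the way, relevant to step 3: whether a given build's PDBs were actually source-indexed can be verified with pdbstr from the Debugging Tools for Windows (the pdb name below is a placeholder):

        rem Dump the srcsrv stream from a pdb; a source-indexed pdb will print
        rem the version-control commands the debugger uses to fetch sources.
        pdbstr -r -p:MyApp.pdb -s:srcsrv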

    Read the article
