Search Results

Search found 4379 results on 176 pages for 'hyper v r2'.

Page 76/176

  • IIS 7.5 returning 404 for unknown host names

    - by WaldenL
    This just doesn't seem correct to me, so I'm looking for someone to tell me how I've misconfigured IIS... The configuration is IIS 7.5 (2008 R2), without SP1. I have IIS 7.5 configured with several sites. All sites have host names defined in their bindings; there is no site without a host name. However, if I request an unknown host name from the server, IIS (technically Microsoft-HTTPAPI/2.0) returns a 404 error, not a 400 error. I would expect a 400 (or some other major error) rather than a lowly 404. This causes a problem when I have nginx in front of multiple IIS boxes and want to stop a site so nginx takes it out of rotation: since IIS still returns a 404 for the request even when there is no active site for that name, nginx doesn't know the server is dead. NB: IIS returns the 404 regardless of whether the site exists but is stopped, or doesn't exist at all. Thoughts? Solutions? -- Additional info: OK, I added a site on a port other than 80 (5000) and then, on a connection to that port, asked for a site that doesn't exist, and I get the expected error 400 (Invalid hostname). So, while IIS isn't listening for generic (no host name) connections on port 80, it would seem that something is. Any ideas how to get HTTP.sys to dump the list of what it's listening for?
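
    For the last question - getting HTTP.sys to dump what it is listening for - the netsh http context can report this directly. A minimal sketch, run from an elevated prompt on the IIS box:

        rem Show every request queue / URL group HTTP.sys is currently servicing, including IIS bindings
        netsh http show servicestate view=requestq
        rem Show URL reservations (which accounts may listen on which URLs)
        netsh http show urlacl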

    Read the article

  • DHCP for Multiple Subnets

    - by TheD
    So this is the current setup - essentially I would like to get my DHCP server serving DHCP requests for two separate subnets. A Netgear DG834G acting as a modem is connected to a SonicWall Pro 2040:
      X0 - LAN  - 192.168.1.0/24
      X1 - WAN  - <WAN-IP>
      X2 - WLAN - 192.168.10.0/24
    At the moment, I have a 2008 R2 server with DHCP installed, with an IP address on the 192.168.1.0/24 range, handling DHCP fine for this subnet. The SonicWall is configured correctly - anything connected to the WLAN has Full Allow to anything in the LAN, and vice versa - but WLAN clients will not lease an IP from my server. I've also added another IP address to the server, so the physical NIC now has two IPs, 192.168.1.2 and 192.168.10.2, with a DHCP scope configured for each. Still no luck! Any ideas? Thanks!
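
    In setups like this, the piece that usually matters is not a second IP on the server's NIC but a DHCP relay (IP Helper) on the SonicWall forwarding WLAN broadcasts to 192.168.1.2, plus a scope for the WLAN subnet on the server. A hedged sketch of the scope side only (scope name, range and gateway are examples):

        rem Create a scope for the WLAN subnet on the existing DHCP server (values are examples)
        netsh dhcp server add scope 192.168.10.0 255.255.255.0 "WLAN"
        netsh dhcp server scope 192.168.10.0 add iprange 192.168.10.50 192.168.10.200
        netsh dhcp server scope 192.168.10.0 set optionvalue 003 IPADDRESS 192.168.10.1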

    Read the article

  • Can I do a "one-time" file content search in Windows Server 2008 without adding the folder to the index?

    - by G-.
    Can I search for files which contain a specific string in a folder if that folder is not in the search index? So, let's say the folder 'textFiles' is not in the index. I navigate to this folder in Windows Explorer and type '.ini' in the search box; I want to see a result list containing only 'b.txt'.
      FOLDER C:\textFiles\
        FILE a.php   CONTENT: once twice thrice mice moose monkey
        FILE b.txt   CONTENT: mingle muddle middle.ini banana beer
        FILE c.spo   CONTENT: sellotape stapler phone book
    I do not have permission to add folders to the Windows index, and I do not have permission to install or run any executables that did not ship with the server or approved applications. I'd be happy with a Windows-native command-line solution if necessary. Thanks, G
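
    Given the no-new-executables constraint, findstr (which ships with Windows) can do a content-only search without touching the index. A minimal sketch:

        rem List only the names of files whose contents contain ".ini", searched recursively and case-insensitively
        cd /d C:\textFiles
        findstr /s /m /i /l /c:".ini" *.*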

    Read the article

  • Possible to disable smart card PIN change in Windows 7?

    - by bobmagoo
    I'm looking for a way to disable the smart card PIN change ability provided with Windows 7's native minidriver. It doesn't allow us to enforce any PIN complexity requirements, so users could change their PIN to 000000 or blank without any issue; we'd like to disable that ability. I've been googling around and haven't found any way to do this, but perhaps someone has encountered a similar issue and found a resolution? A third-party minidriver is the next step, but if we could do it without additional tools I'm all for it.

    Read the article

  • How to deploy a website in IIS with a host name?

    - by Jayakumar
    I am trying to host my application in IIS. Below are the steps I follow:
      1. Publish the code and place it in a path.
      2. Open IIS, right-click on "Sites" and select "Add Website".
      3. In that dialog, give the site name and select the app pool created for the application.
      4. Select the physical path of the published code.
      5. Leave the IP and port in the binding section unchanged.
      6. Finally, give the host name as fus.km.com.
    When I try to browse the application, the page does not load: "Internet Explorer cannot display the page". The machine domain is km.com. UPDATE: I tried adding the host name to the hosts file and flushed the DNS. The application then asked for user credentials (I use Windows authentication in the application), but the login failed. On repeated tries it throws the error: HTTP Error 401.1 - Unauthorized - You do not have permission to view this directory or page using the credentials that you supplied. I tried logging in with a different user but get the same result.
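
    For reference, the same site and binding can be created from the command line with appcmd, and - if the 401.1 appears only when browsing with the host name from the server itself - the NTLM loopback check described in KB896861 is a common cause. A hedged sketch (site name and paths are examples):

        rem Create the site with a host-header binding
        %windir%\system32\inetsrv\appcmd add site /name:"fus" /bindings:"http/*:80:fus.km.com" /physicalPath:"C:\inetpub\fus"
        rem Allow the custom host name past the loopback check when browsing locally (KB896861)
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" /v BackConnectionHostNames /t REG_MULTI_SZ /d "fus.km.com" /f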

    Read the article

  • Remote connection to a Windows Server 2008 Web Edition

    - by Lorenzo
    Hello, I have just installed Windows Server 2008 Web Edition to have a development/test site in my office. In the test network I only have 2 machines: the Windows Server 2008 Web Edition box, and a Vista x64 client machine with Visual Studio. The client and the server are networked using a NETGEAR router. I have enabled Remote Desktop on the server, and when I try to connect to it from the Vista client I get the credential window as in the following screenshot. But even if I enter the correct credentials I am not able to log in to the server remotely. Where am I going wrong? Update 1: I have even tried to create a folder share on the server, but I am not able to access it for the same reason - it says the user name or password is invalid. But this is impossible, as I log in to the server itself with the same credentials. Update 2: If I try to browse the network from the RDP client I receive a message saying that there are no servers running Terminal Services on my network... :O

    Read the article

  • How can I automatically delete /tmp folder on shared drive?

    - by Matt
    We have a /tmp folder that people use for temporary stuff; it can be anything and any file. We want this to automatically delete (or preferably MOVE to another folder on the same shared drive) all files that haven't been accessed in the last two weeks. This should run weekly on a schedule, without me having to do it manually. Is there software out there that does this? Anyone have a script possibly? Server 2008 R2.
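
    One built-in option is forfiles, which ships with Server 2008 R2 and can be driven by Task Scheduler; note it selects on last-modified date rather than last-accessed. A minimal sketch, with example paths:

        rem Move anything not modified in the last 14 days into an archive folder on the same drive
        forfiles /p "D:\Shares\tmp" /s /d -14 /c "cmd /c if @isdir==FALSE move /y @path D:\Shares\tmp-archive"
        rem Schedule the script containing the line above to run every Sunday night
        schtasks /create /tn "Clean tmp share" /sc weekly /d SUN /st 23:00 /tr "C:\Scripts\clean-tmp.cmd" /ru SYSTEM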

    Read the article

  • Clone roaming Appdata in two places

    - by blsub6
    I have my users' AppData (Roaming) stored on an external server, in the location it is normally in. Some users work in two locations equally. This is a problem when someone opens Firefox on a computer in a location other than where their AppData is stored: it takes forever. Is there a way I can clone the redirected AppData (Roaming) folder to two locations and have folder redirection pick the AppData (Roaming) copy based on the location the user is at?

    Read the article

  • Quota, AD and C#

    - by Gnial0id
    First of all, English is not my mother tongue, so I apologize for any mistakes. I'm working on a WS2008R2 server with Active Directory, and a web platform manages this AD with C# code. A group of users has to be able to create user accounts, and during the procedure a disk quota for the new account is (and has to be) created. As the "creator" must not be a member of the Administrators group, access to the C: disk is denied, so I want to perform the File Server Resource Manager operations in C# code under a non-admin account. The code is correct; it works normally with an admin account. So the problem comes down to permissions on the hard drive. I've looked for help on the Internet, without success; it seems that quota delegation is impossible and only an admin can perform this. A colleague helped me a bit and found the "Bypass traverse checking" user right on a forum, but it doesn't seem to be the right way. Any help would be appreciated.
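
    For what it's worth, FSRM also exposes a command-line tool, dirquota.exe, so one workaround pattern is to have the web platform hand the request to a small elevated helper (a scheduled task or service running as an admin account) that applies the quota, rather than delegating FSRM itself to the creator account. A hedged sketch of what that helper could run (template name and path are examples):

        rem Apply an existing FSRM quota template to the new account's folder
        dirquota quota add /path:"D:\Users\newuser" /sourcetemplate:"200 MB Limit"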

    Read the article

  • WSUS is not using the Akamai CDN as its synchronisation source

    - by Geekman
    I've just installed a WSUS onto our network and I'm currently doing the initial sync. I've found that WSUS does not seem to be talking to an Akamai cache, but rather to MS directly, which is contrary to what I've always thought regarding Windows Update traffic. Here is a tcpdump of our WSUS server doing the initial sync; as you can see it's speaking with 65.55.194.221, and for me to reach this IP I have to go over international transit links, which is of course not ideal.
      18:42:31.279757 IP 65.55.194.221.https > XXXX.XXXX.XXXX.XXXX.50888: Flags [.], seq 4379374:4380834, ack 289611, win 256, length 1460
      18:42:31.279759 IP 65.55.194.221.https > XXXX.XXXX.XXXX.XXXX.50888: Flags [.], seq 4380834:4382294, ack 289611, win 256, length 1460
      18:42:31.279762 IP 65.55.194.221.https > XXXX.XXXX.XXXX.XXXX.50888: Flags [.], seq 4382294:4383754, ack 289611, win 256, length 1460
      18:42:31.279764 IP 65.55.194.221.https > XXXX.XXXX.XXXX.XXXX.50888: Flags [P.], seq 4383754:4384144, ack 289611, win 256, length 390
      18:42:31.279793 IP XXXX.XXXX.XXXX.XXXX.50888 > 65.55.194.221.https: Flags [.], ack 4369154, win 23884, length 0
      18:42:31.279888 IP XXXX.XXXX.XXXX.XXXX.50888 > 65.55.194.221.https: Flags [.], ack 4377914, win 23884, length 0
      18:42:31.280015 IP XXXX.XXXX.XXXX.XXXX.50888 > 65.55.194.221.https: Flags [.], ack 4384144, win 23884, length 0
    And yet, if I ping download.windowsupdate.com it resolves to a local (national) Akamai node just fine:
      root@some-node:~# ping download.windowsupdate.com
      PING a26.ms.akamai.net (210.9.88.48) 56(84) bytes of data.
      64 bytes from a210-9-88-48.deploy.akamaitechnologies.com (210.9.88.48): icmp_req=1 ttl=59 time=1.02 ms
      64 bytes from a210-9-88-48.deploy.akamaitechnologies.com (210.9.88.48): icmp_req=2 ttl=59 time=1.10 ms
    Why is this, and how can I change it (if possible)? I know I can manually specify a WSUS source to sync with instead of picking the default Microsoft Update as I currently have, but it seems like I shouldn't have to do this. NOTE: I haven't confirmed whether a WUA speaks with Akamai; I'm just looking at WSUS, as all WUAs will use our internal WSUS from now on. We'll be looking to join an IX shortly in the hope of peering with an Akamai cache and getting very fast access to Windows Updates. Before I let this drive my motivation for an IX at all, I want to first confirm it's actually possible for WSUS to speak with an Akamai cache. I know this is somewhat networking-related, but I feel like it has more to do with WSUS than anything, so someone who knows WSUS better than me will likely be able to figure this out.

    Read the article

  • WBAdmin SystemState Problems

    - by TheD
    I recently installed DHCP on my 2008 R2 server and now Backup Exec and WBAdmin/WSB are having System State issues. After some research I came across a handy tool called VSHADOW, which let me output all the writer components (and their respective directories) to a text file, and hooray, I think I found the problem:
      * WRITER "Dhcp Jet Writer"
        - WriterId   = {be9ac81e-3619-421f-920f-4c6fea9e93ad}
        - InstanceId = {0ed0a8f4-19b0-414d-a3a8-d51d6f4ac8e0}
        - Supports restore events = TRUE
        - Writer restore conditions = VSS_WRE_IF_REPLACE_FAILS
        - Restore method = VSS_RME_RESTORE_AT_REBOOT
        - Requires reboot after restore = TRUE
        - Excluded files:
        - Component "Dhcp Jet Writer:\C:_Windows_system32_dhcp\dhcp"
          - Name: 'dhcp'
          - Logical Path: 'C:_Windows_system32_dhcp'
          - Full Path: '\C:_Windows_system32_dhcp\dhcp'
    The logical path and full path for DHCP are completely wrong. However, I can't find where I would change this path; I assume it is in the registry, but I've had no luck finding the key!
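
    Before hunting for a registry key, it may help to confirm what the writer reports after a clean restart of the DHCP service, and to check the paths the DHCP service itself is configured with - a sketch of where to look:

        rem Paths the DHCP service reads its database locations from
        reg query "HKLM\SYSTEM\CurrentControlSet\Services\DHCPServer\Parameters" /v DatabasePath
        reg query "HKLM\SYSTEM\CurrentControlSet\Services\DHCPServer\Parameters" /v BackupDatabasePath
        rem Restart the service and re-check the writer state
        net stop dhcpserver && net start dhcpserver
        vssadmin list writers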

    Read the article

  • Samba with Active Directory - shares are readonly, NT_STATUS_MEDIA_WRITE_PROTECTED

    - by froh42
    I've set up a Samba server that seems to work; however, all shares are apparently exported read-only. The machine is called "lx". When I'm on lx I can run the following command:
      froh@lx:~$ smbclient //lx/export -UAdministrator
      Enter Administrator's password:
      Domain=[CUSTOMER] OS=[Unix] Server=[Samba 3.5.4]
      smb: \> mkdir wrzlbrmpf
      NT_STATUS_MEDIA_WRITE_PROTECTED making remote directory \wrzlbrmpf
      smb: \> ls
        .        D   0  Fri Dec  3 19:04:20 2010
        ..       D   0  Sun Nov 28 01:32:37 2010
        zork     D   0  Fri Dec  3 18:53:33 2010
        bar      D   0  Sun Nov 28 23:52:43 2010
        ork          1  Fri Dec  3 18:53:02 2010
        foo          1  Sun Nov 28 23:52:41 2010
        gaga     D   0  Fri Dec  3 19:04:20 2010
    How can I troubleshoot this? What I did: First I set up a fresh install of Ubuntu 10.10 x64. Second I got Kerberos working with the following krb5.conf file:
      [libdefaults]
        ticket_lifetime = 24000
        clock_skew = 300
        default_realm = CUSTOMER.LOCAL
      [realms]
        CUSTOMER.LOCAL = {
          kdc = SB4.customer.local:88
          admin_server = SB4.customer.local:464
          default_domain = CUSTOMER.LOCAL
        }
      [domain_realm]
        .customer.local = CUSTOMER.LOCAL
        customer.local = CUSTOMER.LOCAL
      #[login]
      # krb4_convert = true
      # krb4_get_tickets = false
    I also added winbind to group, passwd and shadow in nsswitch.conf. Seemingly Kerberos works:
      root@lx:~# net ads testjoin
      Join is OK
      root@lx:~# wbinfo -a 'Administrator%MYSECRETPASSWORD'
      plaintext password authentication succeeded
      challenge/response password authentication succeeded
    wbinfo -u and wbinfo -g also spit out a list of users and a list of groups respectively. I noted that domain accounts did NOT include a domain and they are in German (as on the SBS 2003 that is the domain server), so I get a "Domänenbenutzer" in wbinfo -u's output, not a "CUSTOMER+Domain User" or something similar. I'm not sure anymore what I did to the PAM configuration, but here is what I currently have:
      root@lx:/etc/pam.d# cat samba
      @include common-auth
      @include common-account
      @include common-session-noninteractive
      root@lx:/etc/pam.d# grep -ve '^#' common-auth
      auth [success=3 default=ignore] pam_krb5.so minimum_uid=1000
      auth [success=2 default=ignore] pam_unix.so nullok_secure try_first_pass
      auth [success=1 default=ignore] pam_winbind.so krb5_auth krb5_ccache_type=FILE cached_login try_first_pass
      auth requisite pam_deny.so
      auth required pam_permit.so
      root@lx:/etc/pam.d# grep -ve '^#' common-account
      account [success=2 new_authtok_reqd=done default=ignore] pam_unix.so
      account [success=1 new_authtok_reqd=done default=ignore] pam_winbind.so
      account requisite pam_deny.so
      account required pam_permit.so
      account required pam_krb5.so minimum_uid=1000
      root@lx:/etc/pam.d# grep -ve '^#' common-session-noninteractive
      session [default=1] pam_permit.so
      session requisite pam_deny.so
      session required pam_permit.so
      session optional pam_krb5.so minimum_uid=1000
      session required pam_unix.so
      session optional pam_winbind.so
    At some point I joined the Linux box into the AD domain. After (manually) creating a home directory on the Linux box I can log in using the Administrator user with the password taken from AD.
    Now I run Samba with the following setup:
      [global]
        netbios name = LX
        realm = CUSTOMER.LOCAL
        workgroup = CUSTOMER
        security = ADS
        encrypt passwords = yes
        password server = 192.168.20.244   # IP of the domain controller
        os level = 0
        socket options = TCP_NODELAY SO_RCVBUF=16384 SO_SNDBUF=16384
        idmap uid = 10000-20000
        idmap gid = 10000-20000
        winbind enum users = Yes
        winbind enum groups = Yes
        preferred master = no
        winbind separator = +
        dns proxy = no
        wins proxy = no
        # client NTLMv2 auth = Yes
        log level = 2
        logfile = /var/log/samba/log.smbd.%U
        template homedir = /home/%U
        template shell = /bin/bash
      [export]
        path = /mnt/sdc1/export
        read only = No
        public = Yes
    Currently I don't care whether export is exported to everyone or just one user; I want to see somebody WRITING to that directory before I start fiddling with the authentication settings (who may access it). As mentioned, accessing the share from smbclient results in this NT_STATUS_MEDIA_WRITE_PROTECTED. Accessing it from Windows shows ACLs that look correct (the user may write), but it does not work: I can only read files, not write. The directory to be exported looks like this:
      root@lx:/etc/pam.d# ls -ld /mnt/
      drwxr-xr-x 5 root root 4096 2010-11-28 01:29 /mnt/
      root@lx:/etc/pam.d# ls -ld /mnt/sdc1/
      drwxr-xr-x 4 froh froh 4096 2010-11-28 01:32 /mnt/sdc1/
      root@lx:/etc/pam.d# ls -ld /mnt/sdc1/export/
      drwxrwxrwx+ 5 administrator domänen-admins 4096 2010-12-03 19:04 /mnt/sdc1/export/
      root@lx:/etc/pam.d# getfacl /mnt/
      getfacl: Removing leading '/' from absolute path names
      # file: mnt/
      # owner: root
      # group: root
      user::rwx
      group::r-x
      other::r-x
      root@lx:/etc/pam.d# getfacl /mnt/sdc1/
      getfacl: Removing leading '/' from absolute path names
      # file: mnt/sdc1/
      # owner: froh
      # group: froh
      user::rwx
      group::r-x
      other::r-x
      root@lx:/etc/pam.d# getfacl /mnt/sdc1/export/
      getfacl: Removing leading '/' from absolute path names
      # file: mnt/sdc1/export/
      # owner: administrator
      # group: domänen-admins
      user::rwx
      group::rwx
      group:domänen-admins:rwx
      mask::rwx
      other::rwx
      default:user::rwx
      default:group::rwx
      default:group:domänen-admins:rwx
      default:mask::rwx
      default:other::rwx
    My, oh my, what am I overlooking? What am I too blind to see?

    Read the article

  • Set Default Program for All Users on Server

    - by MattN
    I work with a large server environment that's running Windows Server 2003, 2008, and now 2012 on some boxes. We have a custom-built log viewer program associated with two file types that I'd like to set as the default program for all users across all boxes, so new users don't have to set the default program themselves on every box they log into. Ideally I'd like a simple registry script we could push out to all machines at once. I realize this likely means changing the registry entries under either HKCR or HKLM for the file extensions, but adding the program location with "%1" to the \shell\open\command value in HKLM simply opens the program and does not also load the log file. Am I just missing an open-and-play setting, or am I looking at this entirely wrong? (And I know the script will need to be different for 2003 and 2008, but maintaining two versions of the script isn't hard.) Thanks!
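
    A hedged sketch of the registry shape that usually works for a machine-wide default: a ProgID whose open command passes a quoted "%1", plus the extensions mapped to that ProgID. The ProgID, extensions and path below are examples, and on 2012 per-user UserChoice settings can still override this:

        rem If these lines go inside a .cmd file, write %%1 instead of %1
        reg add "HKLM\SOFTWARE\Classes\LogViewer.File\shell\open\command" /ve /t REG_SZ /d "\"C:\Tools\LogViewer.exe\" \"%1\"" /f
        reg add "HKLM\SOFTWARE\Classes\.log1" /ve /t REG_SZ /d "LogViewer.File" /f
        reg add "HKLM\SOFTWARE\Classes\.log2" /ve /t REG_SZ /d "LogViewer.File" /f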

    Read the article

  • Manage computer from Active Directory manager

    - by Ripeed
    Within dsa.msc, when I right-click on a computer and choose "Manage", it displays the following error message: "Can't find path to computer \\computer.domain.tld." If I try to ping computer.domain.tld, DNS resolves it to an IP but the computer does not answer. Is it expected that ping gets no answer? Pinging a computer that is not joined to the domain gets replies as expected. How can I correct this issue?

    Read the article

  • batch file infinite loop when parsing file

    - by Bart
    Okay, this should be a really simple task but it's proving to be more complicated than I think it should be. I'm clearly doing something wrong and would like someone else's input. What I would like to do is parse through a file containing paths to directories and set permissions on those directories. An example line of the input file (there are several lines, all formatted the same way, with a different path to a directory):
      E:\stuff\Things\something else (X)\
    (The file in question is generated under Cygwin using find to list all directories with "(X)" in the name. The file is then passed through unix2win to make it Windows-compatible. I've also tried manually creating the input file from within Windows to rule out the file's creation method as the problem.) Here's where I'm stuck... I wrote the following quick and dirty batch file in Windows XP and it worked without any issues at all, but it will not work in Server 2008:
      FOR /F "tokens=*" %%A IN (dirlist.txt) DO echo y| cacls "%%A" /T /C /G "Domain Admins":f "Some Group":f "some-security-group":f
    What this is SUPPOSED to do (and does in XP) is loop through the specified file (dirlist.txt) and run cacls.exe on each directory it pulls from the file. The "echo y|" is in there to automagically confirm when cacls helpfully asks "are you sure?" for every directory in the list. Unfortunately, however, what it DOES is fall into an infinite loop. I've tried surrounding everything after "DO" with quotes, which prevents the endless loop but confuses cacls so it throws an error. Interestingly, I've tried running the code from after "DO" manually (obviously replacing the variable with the full path, copied straight from the file) at a command prompt and it runs as expected. I don't think it's the file or the loop, as adding quotes to the command to be executed prevents the loop from continuing past where it's supposed to... I really have no idea at this point. Any help would be appreciated. I have a feeling it's going to be something incredibly stupid... but I'm pulling my hair out so I thought I'd ask.
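
    As an aside, on Server 2008 and later the confirmation prompt (and therefore the "echo y|" pipe) can be avoided entirely by using icacls, which replaced cacls and does not ask for confirmation. A hedged sketch of the same loop:

        rem usebackq + delims= keeps paths with spaces and parentheses intact; icacls needs no "echo y|"
        FOR /F "usebackq delims=" %%A IN ("dirlist.txt") DO icacls "%%A" /grant "Domain Admins":(OI)(CI)F "Some Group":(OI)(CI)F "some-security-group":(OI)(CI)F /T /C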

    Read the article

  • How can I stop Windows DNS server properties settings from changing by themselves?

    - by paradroid
    When I open the DNS console in Administrative Tools, I keep finding a couple of settings that have changed back by themselves, and I want to stop that from happening. One of the DNS servers has two network interfaces, and it should only be listening for requests on one of them; I get errors in the Event Log otherwise. But when right-clicking that DNS server and selecting Properties, I can see on the Interfaces tab that 'All IP addresses' is selected. If I change it to 'Only the following IP addresses:' and deselect the WAN address, I will find it reselected when I next check a couple of days later. In the other DNS server's Properties, on the Forwarders tab, there should only be two forwarder addresses. However, the address of the router keeps reappearing. This router has the DNS server as its forwarder; nothing should be using the router for DNS other than the router itself, but this surely is causing a loop. How do I get these properties on both DNS servers to stick?

    Read the article

  • EFS Remote Encryption

    - by Apoulet
    We have been trying to set up EFS across our domain. Unfortunately, reading/writing files over a network share does not work; we get an "Access Denied" error. Another worrying fact is that I managed to get it working for one machine, but no other would work. The machines are all Windows 2008 R2, running as VMs under an ESXi host. According to http://technet.microsoft.com/en-us/library/bb457116.aspx#EHAA we set up the machines involved to be trusted for delegation. The users are not restricted and can be trusted for delegation. The users have logged in on both sides and can read/write encrypted files locally without issues. I enabled Kerberos logging in the registry, and these are the relevant logs I get on the machine that holds the encrypted files, for every certificate the user possesses (only the Key Name changes):
      Event ID 5058: Audit Success, "Other System Events"
        Key file operation.
        Subject:
          Security ID: {MyDOMAIN}\{MyID}
          Account Name: {MyID}
          Account Domain: {MyDOMAIN}
          Logon ID: 0xbXXXXXXX
        Cryptographic Parameters:
          Provider Name: Microsoft Software Key Storage Provider
          Algorithm Name: Not Available.
          Key Name: {CE885431-9B4F-47C2-8415-2D766B999999}
          Key Type: User key.
        Key File Operation Information:
          File Path: C:\Users\{MyID}\AppData\Roaming\Microsoft\Crypto\RSA\S-1-5-21-4585646465656-260371901-2912106767-1207\66099999999991e891f187e791277da03d_dfe9ecd8-31c4-4b0f-9b57-6fd3cab90760
          Operation: Read persisted key from file.
          Return Code: 0x0
      Event ID 5061: Audit Failure, "System Integrity"
        Cryptographic operation.
        Subject:
          Security ID: {MyDOMAIN}\{MyID}
          Account Name: {MyID}
          Account Domain: {MyDOMAIN}
          Logon ID: 0xbXXXXXXX
        Cryptographic Parameters:
          Provider Name: Microsoft Software Key Storage Provider
          Algorithm Name: RSA
          Key Name: {CE885431-9B4F-47C2-8415-2D766B999999}
          Key Type: User key.
        Cryptographic Operation:
          Operation: Open Key.
          Return Code: 0x8009000b
    Could this be related to this error from the CryptAcquireContext function? NTE_BAD_KEY_STATE 0x8009000BL - "The user password has changed since the private keys were encrypted." The problem is that the users I am using at the moment cannot change their passwords.

    Read the article

  • Roaming Profiles & Redirected Folders - storage consumption? offline files and caching?

    - by Ben Swinburne
    I understand the concepts of both roaming profiles and folder redirection and have used both separately before. I am about to set up a network from scratch and would ideally like to use both, primarily for the following reasons: roaming profiles allow users to log on to any machine and have their profile; redirected folders allow users to have their My Documents and Desktop etc. backed up without the need to log off at the end of the day (the servers can run their backups overnight and there are no missing files due to the user not logging off); and folder redirection largely alleviates the slow log-on times caused by large profiles. My question is: if some of the folders are redirected and therefore not part of the roaming profile, what happens on machines which truly roam (i.e. laptops)? If there are offline files or a cache, does this mean the problem whereby a user has to log off comes back? And by having both enabled, is there any duplication - i.e. if I have a users$ share and a profiles$ share, would I have Desktop twice, for example?

    Read the article

  • DFS "clobering" files

    - by Badger
    We have DFS set up using the DFS Management administrative tool. I turned on replication in the Distributed File System administrator tool as well, and this morning we lost tons of files from that share. Please explain to me why this was wrong and whether there is anything that can be done to repair it. (No, we don't have backups. We had some shadow copies, but those were deleted as well. We have been using DFS as its own backup.)

    Read the article

  • What to disable on Windows Server? (by list of open ports)

    - by javapowered
    I'm using an HP DL360p Gen8 for HFT trading. I want to disable any network services I don't need, because I also want to try disabling Windows Firewall to test whether this improves performance. Could someone suggest, from the port list below, what is currently turned on and can likely be turned off? I need only RDP (I also drag & drop files via RDP).
      Proto Local Address Foreign Address State
      TCP 0.0.0.0:135 Term:0 LISTENING
      TCP 0.0.0.0:445 Term:0 LISTENING
      TCP 0.0.0.0:2301 Term:0 LISTENING
      TCP 0.0.0.0:2381 Term:0 LISTENING
      TCP 0.0.0.0:3389 Term:0 LISTENING
      TCP 0.0.0.0:47001 Term:0 LISTENING
      TCP 0.0.0.0:49152 Term:0 LISTENING
      TCP 0.0.0.0:49153 Term:0 LISTENING
      TCP 0.0.0.0:49154 Term:0 LISTENING
      TCP 0.0.0.0:49156 Term:0 LISTENING
      TCP 0.0.0.0:49157 Term:0 LISTENING
      TCP HIDEN:139 Term:0 LISTENING
      TCP HIDEN:3389 HIDEN:63373 ESTABLISHED
      TCP HIDEN:139 Term:0 LISTENING
      TCP HIDEN:139 Term:0 LISTENING
      TCP [::]:135 Term:0 LISTENING
      TCP [::]:445 Term:0 LISTENING
      TCP [::]:2301 Term:0 LISTENING
      TCP [::]:2381 Term:0 LISTENING
      TCP [::]:3389 Term:0 LISTENING
      TCP [::]:47001 Term:0 LISTENING
      TCP [::]:49152 Term:0 LISTENING
      TCP [::]:49153 Term:0 LISTENING
      TCP [::]:49154 Term:0 LISTENING
      TCP [::]:49156 Term:0 LISTENING
      TCP [::]:49157 Term:0 LISTENING
      UDP 0.0.0.0:68 *:*
      UDP 0.0.0.0:123 *:*
      UDP 0.0.0.0:161 *:*
      UDP 0.0.0.0:500 *:*
      UDP 0.0.0.0:4500 *:*
      UDP 0.0.0.0:5355 *:*
      UDP HIDEN:137 *:*
      UDP HIDEN:138 *:*
      UDP HIDEN:137 *:*
      UDP HIDEN:138 *:*
      UDP HIDEN:137 *:*
      UDP HIDEN:138 *:*
      UDP [::]:123 *:*
      UDP [::]:161 *:*
      UDP [::]:500 *:*
      UDP [::]:4500 *:*
      UDP [::]:5355 *:*
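
    Rather than guessing from port numbers alone, the box can be asked which process or service owns each listener, and services can then be stopped and disabled one at a time. As a sketch (the PID and service name below are examples; 161/UDP is typically the SNMP service, and 2301/2381 are typically the HP System Management Homepage on ProLiant hardware):

        rem Show the owning executable/PID for every listener (run elevated)
        netstat -abno | findstr /i "LISTENING"
        rem Map a PID from that output to the services hosted in that process
        tasklist /svc /fi "PID eq 1234"
        rem Stop and disable a service you decide you do not need
        sc stop SNMP
        sc config SNMP start= disabled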

    Read the article

  • How does Windows Server route PTP v2 unicast messages?

    - by Bobb
    My server is placed in a PTP v2-enabled network which has a grandmaster clock, and the switch is PTP-aware. The server is W2008R2 (soon to be W2012), and I also have a PTP v2 software client. How do the master clock messages get to the software on Windows Server? Does it need a special PTP-aware NIC, or will they be treated as normal network traffic that the software client receives through a regular NIC with no problem?

    Read the article

  • Cannot connect to Windows SBS Essentials 2011

    - by Michael Ervin
    Using the Launchpad from the Mac and trying to connect to SBS Essentials 2011, no one is able to actually log in; the little login wheel keeps spinning and spinning. Anyone have any ideas what might be keeping us from logging in? On the Mac side we are using 10.6 and 10.7, but both exhibit the same problem. I also cannot connect to the server via a Remote Desktop connection within the local network. HELP!

    Read the article

  • IPv6 6to4 on Windows Server

    - by Graham Wager
    I'm looking for a relatively simple guide to setting up an IPv6 tunnel properly. This network currently has a server (Windows Server 2008 R2) running RRAS that establishes connectivity to the Internet using a demand-dial PPPoE connection and handles the NAT; it also hosts a DNS server and DHCP. My ISP does not support IPv6, but I have a static IPv4 address. I've read about 6to4 and signed up at tunnelbroker.net, but quickly felt out of my depth. How do I configure my network to use it, and how should I configure my DHCP server with regard to IPv6 addresses?
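
    Worth noting: tunnelbroker.net provides a configured 6in4 tunnel rather than 6to4, which is generally simpler to set up, and the broker's site can generate the exact commands for Windows. A hedged sketch of the general shape on the RRAS server, with placeholder addresses (use the values from your tunnel details page; protocol 41 must be able to reach your static IPv4 address):

        rem Create the tunnel interface: local public IPv4, then the broker's server IPv4 (placeholders shown)
        netsh interface ipv6 add v6v4tunnel "IP6Tunnel" 203.0.113.10 198.51.100.1
        rem Assign the client side of the point-to-point /64 and default-route via the broker side
        netsh interface ipv6 add address "IP6Tunnel" 2001:db8:1234::2
        netsh interface ipv6 add route ::/0 "IP6Tunnel" 2001:db8:1234::1

    For LAN clients, the usual approach is to route the broker's additional /64 via this server and advertise it with router advertisements rather than handing out IPv6 leases from the Windows DHCP server, though DHCPv6 is also possible.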

    Read the article

  • What happens when the server that the Remote Desktop Connection Broker runs on goes down?

    - by Frank Owen
    I would like to set up the Remote Desktop Connection Broker to allow better load balancing of our two terminal servers, as well as to allow users to reconnect to the correct server if they get disconnected. My worry is: if I set this up and the server this service runs on goes down, do the terminal servers stop accepting connections, or do they just lose the benefit of having the RDCB turned on? I don't want to add another point of failure to this equation unless I have to.

    Read the article
