Search Results

Search found 22998 results on 920 pages for 'supervised users'.

Page 602/920

  • Do virtual machines running Windows Server 2003 on a Vista Home Premium host suffer from Vista's constraints?

    - by mokokamello
    Experts! I have a machine with Vista Home Premium and I wanted to share a folder with my colleagues. Unfortunately, Vista allows only 10 concurrent connections to a shared folder. One of my colleagues advised me to install a virtual machine with Windows Server 2003 so that I would be able to share the folder with more than 1000 users. Another colleague stated that the kind of virtual server does not matter; what matters is the host OS, in this case Vista, so the folder could still be shared by no more than 10 users. My question is: do virtual machines running Windows Server 2003 on a Vista Home Premium host suffer from Vista's constraints?

    Read the article

  • Upgrading servers while keeping the same domain: what are the best practices?

    - by nLL
    Hi, I am upgrading a domain controller/file server from Windows 2003 Standard to Windows 2008 R2 Standard. We are planning to have a file server and an AD controller. Our old hardware will be scrapped; we want to copy all AD users/computers to the new machine and keep the current domain name. I have never done this before. What are the best practices? Is it better if we get a contractor to do it for us? I guess the best way to start is to build the new servers, copy the data, take the old server down and put the new server online. My gut says we would need to re-join all computers. Is that correct? Any input appreciated.

    Read the article

  • Can't connect via SSH after an NFS problem

    - by MihaiM
    Hello, I had a problem yesterday with a server that lost its connection (S1). That server shared a directory over NFS with another one (S2) - not a home directory and not in $PATH, just a directory for storing old files for archiving. S1 was back online after a few hours, but now I cannot access S2 because of this (and I'm sure that is the cause, because all other services are running without any problem). The SSH connection hangs here: debug1: Entering interactive session. I know a reboot would do the job, but considering this is the NAS of a big app, my bosses will kill me if I do it. Is there any other way to get past this? I tried with different users, but all of them hang in the same place. I connected with HP iLO and even there I cannot use my username. Thanks in advance.

    Read the article

  • Router for Infrastructure Network

    - by amfortas
    We have an HPC operation that has grown down the years to several racks of gear at three sites, hooked up via Gigabit fiber and Catalyst 2960s (we control the links and switches). Thus far all machines have been on a flat RFC 1918 10/8, but we are looking to segment the network in order to streamline matters for iSCSI and generally keep infrastructure equipment away from our end users. We have now reached a point where we need to consider introducing VLANs for specific subnets, and are wondering if it would be worthwhile in the longer run to acquire a small router to keep track of all this and cut down on the complexity of netmasks and routes on host machines, etc. Has anyone here had a similar experience? Suggestions as to suitable equipment would be welcome.

    Read the article

  • nm-applet and nmcli don't work in Arch

    - by user1780765
    I installed Arch Linux about two weeks ago, but I still can't use NetworkManager. When I run the nm-applet command, I get this error: https://gist.github.com/4129816 And when I run the nmcli con command, I get this: https://gist.github.com/4129837 sudo doesn't help either. D-Bus looks fine; I can use dbus-monitor. I installed GNOME, but I still can't use NetworkManager. I'm using Openbox now. I have another problem too: what should I do so that I can use nm-connection-editor with non-root users?

    Read the article

  • How would you manage development between many Staging branches?

    - by Trip
    We have a Staging branch. Then we came out with a Beta branch for users to move, whenever they wanted, from the old Production branch to the new features. Our plan seemed simple: we test on Staging, and when items get QA'd, they get cherry-picked and deployed to Beta. Here's the problem! A bug will quietly make its way onto Beta, and since Beta is a production environment, it needs fixes fast and accurate. But not all the QA got done. Enter Git hell. So I find a problem on Beta. No sweat, it's already been fixed on Staging, but when I go to cherry-pick the item over, Beta barely has any of the other prerequisite code needed to implement this small change. Now Beta has a little here and a little there, and I can't imagine it being as stable a code base as Staging. What's more, I'm dealing with some insane Git conflicts, and having to monkey-patch a bunch of things to make up for what Beta hasn't caught up with from Staging. Can someone, in polite or non-polite terms, tell me what we're doing wrong here as far as assembling this project? Any awesome recommendations, workarounds or alternatives to the system we came up with?

    Read the article

  • Prioritize file sharing capabilities in Windows Server 2008

    - by cmbrnt
    I've got a server running Windows Server 2008, and use it mainly for sharing files throughout the domain from a number of disks. It's running on VMware ESXi 4.0, in case that matters. My problem is that when I log in to the server to check user permissions etc., access to the files on the remote disks almost grinds to a halt. I haven't been able to measure the speeds, but I would guess it slows down to about 100 kB/s as soon as I log in. This is on a gigabit network and the problem is the same for all users, even the ones connected to the same switch as the server. I've assigned 2 GB of RAM to the server, and reserved 1.5 GHz of processor power for it. I don't have to do anything special on the server for this halt to occur. How can I make sure file sharing is prioritized on the server, so that no matter what applications I'm using, file sharing always works properly? Could this be a VMware issue?

    Read the article

  • CentOS drive mapping? [on hold]

    - by DroidOS
    This is the first time I am posting on this particular Stack Exchange site and I hope that I am using the right one for the present question. Briefly, this is what I need to do. I am running a web service where users can, amongst other things, upload and store files on the server. What I want to do is hive off user file storage to a different location so my server (CentOS 64-bit) can concentrate on what it does best: server-side scripting and database management. As things stand, all user files go into subdirectories of a folder called stash that lies above DOC_ROOT. What I would like to do is transparently detect all attempts to read/write stash/sub_folder and get/set the file data on a remote server - ideally one which replicates files like a CDN, so they can be delivered from the closest/fastest location based on the user's location. Even nicer would be if, for all read accesses, I could provide a URL that allows the user's browser to fetch the relevant file directly without having to funnel it via my server. I am a relative newbie when it comes to this sort of thing, so I hope that I have phrased this question adequately. From the little searching I have done, I gather that WebDAV can be used to map drives to a different location on the web, so perhaps that is a starting point. But if that will work, I need to establish how to get WebDAV up and running on my CentOS 64-bit server and, ideally, identify a service that allows this kind of file storage and provides an API I can use in my own scripting. I'd much appreciate any help with this.
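
    As a starting point, WebDAV is essentially HTTP with extra verbs, so a remote store that speaks it can be driven with plain PUT/GET requests. Below is a minimal Python sketch of that half of the problem, using the third-party requests library; the endpoint URL, credentials and function names are placeholders rather than a real service, and the transparent-interception and CDN pieces would have to sit on top of something like this.

    ```python
    import requests

    WEBDAV_BASE = "https://storage.example.com/dav/stash"  # placeholder WebDAV endpoint
    AUTH = ("dav_user", "dav_password")                     # placeholder credentials

    def put_user_file(sub_folder, filename, local_path):
        """Push a locally uploaded file to the remote store with an HTTP PUT."""
        url = f"{WEBDAV_BASE}/{sub_folder}/{filename}"
        with open(local_path, "rb") as fh:
            resp = requests.put(url, data=fh, auth=AUTH)
        resp.raise_for_status()
        return url  # a URL like this could be handed to the browser for direct reads

    def get_user_file(sub_folder, filename, local_path):
        """Pull a stored file back onto the web server with an HTTP GET."""
        url = f"{WEBDAV_BASE}/{sub_folder}/{filename}"
        resp = requests.get(url, auth=AUTH, stream=True)
        resp.raise_for_status()
        with open(local_path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=65536):
                fh.write(chunk)
    ```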

    Read the article

  • Authentication through mod_auth_kerb should provide website with no user if no TGT provided

    - by loomi
    Users are authenticated by mod_auth_kerb, which works great. For that I need to set Require valid-user. If there is no valid user, Apache fails with a 401 Authorization Required. I would like Apache to deliver the website anyway, just without providing a REMOTE_USER to the underlying script. This is related to "How to tell mod_auth_kerb to do its job despite no 'require valid-user'", but with the important difference that on a whole subdirectory, on every URL, a Kerberos negotiation should be initiated, and if it fails the content should be delivered anyway.

    Read the article

  • Mail Server with Google Apps

    - by Daniel Fukuda
    Hello, is there any mail server that can download (POP3) emails from another mail server like Google Apps (Gmail for your own domain), store them, and then allow users to download (POP3/IMAP) the emails into their own mail client like Outlook/Live Mail? I want it to act as a kind of "middle mail server". I hope you guys understand. My main reason for doing this is that Google Apps has limited space for each mailbox, and I also want to have all emails in one place so it's easy to archive and back them up.
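
    For what it's worth, the fetch-and-store half of such a "middle mail server" is a small amount of code. A minimal Python sketch using the standard poplib module is below; the host, credentials and storage path are placeholders. Serving the archived mail back out over POP3/IMAP is the job of a full mail server and is not shown here.

    ```python
    import os
    import poplib

    POP_HOST = "pop.gmail.com"                # upstream POP3 endpoint
    POP_USER = "someone@yourdomain.example"   # placeholder account
    POP_PASS = "app-password"                 # placeholder credential
    STORE_DIR = "/var/mail-archive/someone"   # placeholder local archive

    def fetch_and_store():
        """Download every message waiting upstream and save each one as an .eml file."""
        os.makedirs(STORE_DIR, exist_ok=True)
        conn = poplib.POP3_SSL(POP_HOST, 995)
        conn.user(POP_USER)
        conn.pass_(POP_PASS)
        count, _size = conn.stat()
        for i in range(1, count + 1):
            _resp, lines, _octets = conn.retr(i)
            raw = b"\r\n".join(lines) + b"\r\n"
            with open(os.path.join(STORE_DIR, f"msg_{i:06d}.eml"), "wb") as fh:
                fh.write(raw)
            # conn.dele(i)  # uncomment to delete upstream once safely archived
        conn.quit()
    ```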

    Read the article

  • Why can't I set Windows 7 folder to Writeable?

    - by Clay Nichols
    I moved a SATA HD from one PC to another and copied almost all the files from the old drive to the new one, except for one folder ("oldFolder") and its subfolders and files. I tried to copy just a single file (to simplify things): I can't copy that file into the same directory (I get the above error). I CAN copy the file to the Desktop. I can not copy the file's container folder to the Desktop. Under Properties for that OldFolder: Read-only is checked. Security: Everyone is set to Allow everything except "special permissions". All users are set to allow WRITE.

    Read the article

  • How to manage a multi-user server on Linux?

    - by user1175942
    I'm working on a university project where I have Tomcat as a web server, and I want to create a multi-user environment on top of Linux, so that every user who logs into my website has his own credentials and can access only his own data (files and folders). The main issue is that the purpose of the website is executing code on the server side, so I must have good (reasonable) protection against malicious code (a user destroying his own account is fine by me). I thought that defining a Linux user for every website user is the best solution - it isolates each user from the others, and I can define each one's permissions. Can I create users in Linux using shell commands? Can I configure a maximum quota/memory/CPU for a user? Does anyone have another idea for managing that kind of multi-user environment?
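
    On the specific sub-questions: yes, accounts can be created from the shell (useradd) and per-user disk quotas set with the standard quota tools, while CPU and memory can be capped per process with rlimits (or more thoroughly with cgroups). A minimal Python sketch of running a user's code under their own Linux account with rlimits applied is below; it assumes the per-website-user accounts already exist, that the parent process runs as root, and the function names are only illustrative.

    ```python
    import os
    import pwd
    import resource
    import subprocess

    def demote_and_limit(username):
        """Build a preexec function that drops privileges to `username` and caps resources."""
        record = pwd.getpwnam(username)

        def preexec():
            # Apply hard limits before the untrusted code starts running.
            resource.setrlimit(resource.RLIMIT_CPU, (5, 5))                      # 5 CPU-seconds
            resource.setrlimit(resource.RLIMIT_AS, (256 * 2**20, 256 * 2**20))   # 256 MB address space
            os.setgroups([])              # drop supplementary groups
            os.setgid(record.pw_gid)      # drop group first, then user
            os.setuid(record.pw_uid)

        return preexec

    def run_as_user(username, command):
        """Run `command` (a list of argv strings) as the dedicated Linux account."""
        return subprocess.run(
            command,
            preexec_fn=demote_and_limit(username),
            capture_output=True,
            timeout=10,  # wall-clock safety net on top of the CPU-second limit
        )
    ```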

    Read the article

  • "this network location can't be included because it is not indexed" on Windows 2008R2 Remote Desktop Services Hosting

    - by ChrisNZ
    I'm setting up a new terminal server for our users on Windows 2008 R2 (I guess I should call it Remote Desktop Services now!). When I try to change the location of "Documents" (by removing the default Documents library and adding a new one) to use the file server, i.e. \\fileserver\username\Documents, I get the message: "This network location can't be included because it is not indexed". I certainly don't want to make folders available offline, and in fact I have set the GPO to prohibit offline folders on the terminal servers. What is the best practice for document libraries on terminal servers and network file shares?

    Read the article

  • Does Microsoft offer a corporate IM/collaboration tool similar to Campfire? My googlefu skills appear to be failing me today.

    - by user54266
    I mentioned to my boss that we should look into a single unified IM client that we could use and secure at a corporate level, and then suggested Campfire. We're primarily a Microsoft house, so he suggested we use something that would integrate better with SharePoint and the other tools our end users use in house. However, I'm not aware of any Microsoft tool that does something like this. Obviously there is MSN Messenger, but I think/hope he wasn't referring to that. Other than a product from 2005, I haven't been able to locate a Microsoft corporate IM tool... does anybody know what he may have been talking about?

    Read the article

  • Is there a multi-user Remote Desktop app for Mac OS X?

    - by Peter Walke
    Is there a remote desktop app for the Mac that allows multiple people to be remoted in at the same time, similar to RDP on Windows? I've used VNC, but that only allows one person to control the computer. For some background: I'd like to set up a Mac that many users can RDP into from PCs to do Xcode development. I did some searching and didn't find anything, so I'm assuming it's just not possible, but I want to confirm. Thanks. Update: thanks to a link in one of the answers, I found a reasonable solution: AquaConnect.

    Read the article

  • How can I access a shared Exchange mailbox with IMAP (over telnet)?

    - by gauteh
    I have a mailbox which multiple users have access to. It works fine for me: I can add it in Outlook as an additional mailbox on my account and list all its contents. I can also access my personal mailbox using IMAP; I'm testing it by just telnetting in and LISTing it. The problem is that another user is having trouble accessing the shared mailbox through IMAP, and I want to test whether I can access it with my account - how can I do that in terms of IMAP commands? What I am doing now is: telnet mail.server, then 01 LOGIN user pass, 02 LIST "" *, 03 LOGOUT. Edit: if there is another way to test this, that is an equally good answer.
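
    The same exercise can be scripted instead of typed over telnet. A minimal Python sketch with the standard imaplib module is below; the DOMAIN/username/shared-alias login form is one convention Exchange accepts for opening a shared mailbox over IMAP, but the exact form is worth confirming for your Exchange version.

    ```python
    import imaplib

    HOST = "mail.server"                          # the server from the question
    USER = "DOMAIN/your.account/sharedalias"      # assumed shared-mailbox login form
    PASSWORD = "your-password"

    conn = imaplib.IMAP4_SSL(HOST)
    conn.login(USER, PASSWORD)

    # Equivalent of `02 LIST "" *`: show the folders visible under this login.
    status, folders = conn.list()
    print("LIST:", status)
    for folder in folders:
        print(folder.decode())

    # Open the shared mailbox's inbox read-only, just to prove access works.
    status, _ = conn.select("INBOX", readonly=True)
    print("SELECT INBOX:", status)

    conn.logout()
    ```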

    Read the article

  • Windows 7 cd command only echoes the directory

    - by Zobbl
    Every new instance of the shell starts in my user directory (C:\Users\user). Within this directory, or rather this drive (in this case C:), I can't use the cd command the way I'm used to - it only echoes the specified directory. As soon as I change to a parent directory, I can execute "cd D:" and it changes to that drive. But this behaviour doesn't appear consistently in all instances of the shell; sometimes I have to go to C: to change it. I'm quite sure I'm not using the command in the wrong way, since it's what I'm used to doing to start Grails.

    Read the article

  • Windows 8 Task Manager RAM Usage Accuracy

    - by user264892
    The new Task Manager in Windows 8 has a great UI; however, there are some discrepancies in the data that I can not account for. Machine: 8 GB of total RAM (this is a physical machine, not a virtual one). The Processes tab shows 45% of memory utilized, yet the listed processes do not add up to 3.5 GB of RAM - they add up to 0.948 GB - and there is no "processes for all users" option. The Performance tab shows: In use: 3.6 GB; Available: 4.4 GB; Committed: 4.1/9.2 GB; Cached: 3.7 GB; Paged pool: 376 MB; Non-paged pool: 135 MB. My reading of this says I have a lot of "cloaked" processes running somewhere, eating my RAM. How do I interpret this data, and how do I verify it?

    Read the article

  • What's a good algorithm for a random, uneven distribution of a fixed amount of a resource?

    - by NickC
    Problem: I have X, a positive integer amount, of some resource R. There are N potential targets. I want to distribute all of R to the N targets in some "interesting" way. "Interesting" means: some targets may not get any R; it should rarely be near even (with a majority of targets getting near X/N of the resource); and there should be at least a small chance of one target getting all of R. Bad solutions: the naive approach would be to pick a random target, give one unit of R to it, and repeat X times - this results in too even a distribution. The next idea is to pick a random number between 1 and X and give it to a random target - this results in too large a number (at least X/2 on average) being given to one target. Question: this algorithm will be used frequently, and I want the distribution to be interesting and uneven so that the surprise doesn't wear off for users. Is there a good algorithm for something in between these two approaches that fits the definition of interesting above?
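
    One simple algorithm that sits between the two bad solutions is to cut the interval [0, X] at N-1 random points and hand each target the size of one gap ("stars and bars"). Every unit of R is always handed out, some targets routinely get nothing, near-even splits are uncommon, and there is a small but non-zero chance that one target gets everything. A minimal Python sketch (the function name is only illustrative):

    ```python
    import random

    def distribute(total, n_targets):
        """Split `total` units across `n_targets` by cutting [0, total] at random points."""
        cuts = sorted(random.randint(0, total) for _ in range(n_targets - 1))
        bounds = [0] + cuts + [total]
        return [bounds[i + 1] - bounds[i] for i in range(n_targets)]

    # Example: hand out 100 units of R to 5 targets a few times.
    for _ in range(3):
        print(distribute(100, 5))   # e.g. [7, 0, 55, 21, 17] -- always sums to 100
    ```

    If the results still feel too even for large N, one tweak worth experimenting with is drawing the cut points from a non-uniform distribution, which skews more of R toward fewer targets.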

    Read the article

  • Should I amortize scripting cost via bytecode analysis or multithreading?

    - by user18983
    I'm working on a game sort of thing where users can write arbitrary code for individual agents, and I'm trying to decide the best way to divide up computation time. The simplest option would be to give each agent a set amount of time and skip its turn if that elapses without an action being decided upon, but I would like people to be able to write their agents' decision functions without having to think too much about how long they are taking unless they really want to. The two approaches I'm considering are giving each agent a set number of bytecode instructions (taking cost into account) each timestep and making players deal with the consequences of the game state changing between blocks of computation (as with Battlecode), or giving each agent its own thread and giving each thread equal time on the processor. I'm about equally knowledgeable on both concurrency and bytecode stuff, which is to say not very, so I'm wondering which approach would be best. I have a clearer idea of how I'd structure things if I used bytecode, but less certainty about how to actually implement the analysis. I'm pretty sure I can work up a concurrency-based system without much trouble, but I worry it will be messier, with more overhead, and will add unnecessary complexity to the project.
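
    For comparison purposes, the thread-based variant is only a few lines if a wall-clock budget per agent is acceptable. A minimal Python sketch is below; the agent objects and their decide(state) method are assumed names, and one real caveat is that a timed-out Python thread cannot be killed, so its result is simply discarded while it finishes in the background - true preemption needs separate processes or the bytecode-counting route.

    ```python
    from concurrent.futures import ThreadPoolExecutor, TimeoutError

    BUDGET_SECONDS = 0.05  # assumed per-agent budget for one timestep

    def run_turn(agents, state):
        """Ask every agent for an action; agents that blow their budget forfeit the turn."""
        actions = {}
        with ThreadPoolExecutor(max_workers=len(agents)) as pool:
            futures = {name: pool.submit(agent.decide, state)
                       for name, agent in agents.items()}
            for name, future in futures.items():
                try:
                    actions[name] = future.result(timeout=BUDGET_SECONDS)
                except TimeoutError:
                    actions[name] = None   # over budget: skip this agent's turn
                except Exception:
                    actions[name] = None   # a crashing agent also forfeits
        # Note: pool shutdown still waits for over-budget threads to finish, so a truly
        # runaway agent can stall the loop -- hence the process or bytecode options.
        return actions
    ```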

    Read the article

  • What data should I (can I) gather to prove a .net bug in third party software?

    - by gef05
    Our users are using a third-party system, and every so often - a user might go all day without this happening, other days they may see it two or three times in seven hours - a red X will appear on screen rather than a button, field, or control. This is a .NET system. Looking online I can see that this is a .NET error. The problem is that our devs didn't write it, and the vendor wants proof that the problem resides with their system, not our PCs. What can we do to gather information on such an issue that would allow us to make our case to the vendor and get them to create a fix? I'm thinking of data to gather - anything from the OS and memory to the .NET version installed, etc. But I'd also like to know about error logging - whether it's possible to log errors that occur in third-party systems.
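
    On the data-gathering side, one concrete, scriptable item is the set of .NET Framework versions installed on the affected PCs, which Windows records under the well-known NDP registry key. A minimal Python sketch (run on the client PC; the key name is standard, everything else here is only illustrative):

    ```python
    import platform
    import winreg

    NDP_KEY = r"SOFTWARE\Microsoft\NET Framework Setup\NDP"

    def installed_dotnet_versions():
        """List the .NET Framework versions recorded under the NDP registry key."""
        found = []
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, NDP_KEY) as ndp:
            for i in range(winreg.QueryInfoKey(ndp)[0]):   # number of subkeys
                name = winreg.EnumKey(ndp, i)
                if not name.startswith("v"):
                    continue
                with winreg.OpenKey(ndp, name) as sub:
                    try:
                        version, _ = winreg.QueryValueEx(sub, "Version")
                    except FileNotFoundError:
                        version = "(version stored in nested Full/Client subkeys)"
                found.append((name, version))
        return found

    if __name__ == "__main__":
        print("OS:", platform.platform())
        for name, version in installed_dotnet_versions():
            print(name, version)
    ```

    Pairing that with the relevant Application entries from the Windows Event Log (unhandled .NET exceptions are often recorded there under .NET Runtime or Windows Error Reporting sources) usually gives a vendor something concrete to work from.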

    Read the article

  • Can't ping my computer - "Transmit failed. General failure."

    - by Vaccano
    I am having an issue with my computer: my IIS services are not working. I have narrowed it down to the fact that my computer cannot find itself via its name. When I ping my computer by its name, I get this:

    C:\Users\18773> ping MyComputerNameHere
    Pinging MyComputerNameHere [::1] with 32 bytes of data:
    PING: transmit failed. General failure.
    PING: transmit failed. General failure.
    PING: transmit failed. General failure.
    PING: transmit failed. General failure.
    Ping statistics for ::1:
        Packets: Sent = 4, Received = 0, Lost = 4 (100% loss),

    I had someone else ping my machine and it works fine for them. Any ideas?
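
    The ping output shows the name resolving to the IPv6 loopback address ::1, so the first thing to confirm is what the machine's own name actually resolves to (hosts file, IPv6 binding, etc.). A small Python sketch of that check, assuming Python is available on the box:

    ```python
    import socket

    hostname = socket.gethostname()
    print("Hostname:", hostname)

    # List every address the local resolver returns for this machine's own name.
    # The ping output above suggests it currently maps only to the IPv6 loopback ::1.
    for family, _type, _proto, _canon, sockaddr in socket.getaddrinfo(hostname, None):
        label = "IPv6" if family == socket.AF_INET6 else "IPv4"
        print(label, sockaddr[0])
    ```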

    Read the article

  • Hosting solution for startup social app?

    - by happyhardik
    We are in the process of building a social app. Initially we will have only a few thousand users, but that will grow with time. Which would be the best and most suitable hosting for this purpose: grid, cloud or VPS? (It has to be economical, as we are just starting up.) The hosting needs to be robust, so that if our app sees a sudden increase in its user base it won't break or slow down. Our app is in PHP with MySQL. Sorry if this question is posted in the wrong place. Thanks for your time. :)

    Read the article

  • Solution for file store needing large number of simultaneous connections

    - by Tennyson H
    So I'm fairly new to large-scale architectures. We're currently using Linode instances for our project, but we're brainstorming about scaling. We need a file store system that can deliver ~50 MB folders (user data) to our computing instances in a reasonable amount of time (<20 sec), and scale to 10,000+ total users and perhaps 100+ simultaneous transfers. We are also unsure whether to network-mount (sshfs/NFS) or just do a full store-to-instance transfer at the beginning and an rsync from instance to store at the end. I've experimented with SSHFS between our little Linode instances, but it seems to be bottlenecked at 15 mb/s total bandwidth, which wouldn't hold up under 10+ concurrent transfers, let alone scale very large. I also tried to investigate NFS but couldn't get it working, and I have little hope that it would do the job within our Linode network. Are there tools on other cloud providers that match our needs? Should we be mounting, or should we be transferring? Thanks very much!

    Read the article

  • Using Windows 7 and Fedora

    - by vedant1811
    I need to partition my hard disk for Windows and Fedora (root, swap, users). I thought of creating three primary partitions: two small ones (~5 GB each) for Win7 and Fedora, and a large one (~700 GB) for common files (pictures, videos, documents, etc.). Please tell me which file systems to use in each case and how to set up the primary and extended partitions. I also want to know where, and on which file system, the Linux GRUB (my choice of OS chooser) should be installed. I have just bought a new Asus K53S and am using the Fedora installer's partitioner (Anaconda). Your help is greatly appreciated.

    Read the article
