Search Results

Search found 2206 results on 89 pages for 'wide eyed pupil'.


  • Good-quality voice communication software over the Internet for Windows

    - by ORA600
    I regularly attend over-the-phone meetings where there is a Polycom/Polyspan device on the other side. The sound quality is bad; even people whose first language is English sometimes struggle to make out what is being said on the other side. I understand this is because of overly aggressive audio compression on the line. Does anyone know of good audio chat software that does not over-compress the audio stream? The IP channel between the two offices is wide enough; in other words, you should be able to listen to music (not only speech) from the remote side with good quality. Skype over-compresses as well. Does anyone know anything like that?

    Read the article

  • Visual Studio .NET 2003 on Windows 7 hangs on search

    - by Nikhil
    So I have Visual Studio 2003 running on Windows 7 - yeah, I am aware it isn't officially supported - and no, unfortunately I can't change that situation :-( For the most part it works OK, but I have a specific problem that I can't figure out: the application hangs if you do a project-wide search (Ctrl-Shift-F) for a string. I have a reasonably powerful machine, and all the other heavy tasks like compiling and debugging work fine. It also works if I restrict the search to the current document (Ctrl-F). I am running it as administrator, and VS.NET 2003 SP1 has been applied. The size of the project does not seem to be the problem, since a colleague is also experiencing this issue with a single-project solution containing 5 pages. I am currently using Windows Search as a workaround, and I was wondering if there is something I missed that I should try. PS: I have asked this question on Stack Overflow as well - but I suspect this might be a problem with the Windows 7 OS, so I thought I'd cross-post it here too.

    Read the article

  • Which is the preferred internet security + antivirus solution for Windows, with a good detection rate? [closed]

    - by metal gear solid
    Possible Duplicate: Free antivirus solutions for Windows. Which is the best internet security + antivirus solution for Windows? Free/open-source or commercial, it doesn't matter - I need the best solution. Is Kaspersky the best, or is there another? http://www.kaspersky.com/kaspersky_internet_security "Award-winning technologies in Kaspersky Internet Security 2010 protect you from cybercrime and a wide range of IT threats: * Viruses, Trojans, worms and other malware, spyware and adware * Rootkits, bootkits and other complex threats * Identity theft by keyloggers, screen capture malware or phishing scams * Botnets and various illegal methods of taking control of your PC or Netbook * Zero-day attacks, new fast-emerging and unknown threats * Drive-by download infections, network attacks and intrusions * Unwanted, offensive web content and spam"

    Read the article

  • How do ulimit -n and /proc/sys/fs/file-max differ?

    - by bantic
    I noticed on a new CentOS image that I just booted up on EC2 that the ulimit default is 1024 open files, but /proc/sys/fs/file-max is set to 761,408, and I'm wondering how these two limits work together. I'm guessing that ulimit -n is a per-user limit on the number of file descriptors, while /proc/sys/fs/file-max is system-wide? If that's the case, say I've logged in twice as the same user - does each logged-in session have a 1024 limit on the number of open files, or is it a limit of 1024 combined open files across those sessions? And is there much performance impact to setting your max file descriptors to a very high number, if your system isn't ever opening very many files?
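
    For reference, a quick way to inspect both limits side by side (standard commands on any Linux box). One nuance grounded in the kernel docs: ulimit -n is actually per-process (each login shell inherits it), while file-max is a single kernel-wide ceiling on open file handles:

      ulimit -Sn                   # soft per-process limit on open descriptors
      ulimit -Hn                   # hard per-process limit
      cat /proc/sys/fs/file-max    # system-wide maximum number of file handles
      cat /proc/sys/fs/file-nr     # allocated, free, and maximum handles now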

    Read the article

  • Check if current Command Prompt was launched as the Administrator

    - by Canadian Luke
    I am looking to write a script that takes user input and then makes system-wide changes. I need this to be very generic, but simply put: at the top, I need it to check whether it's being run 'as an Administrator'. If it's not, I want to display a message telling the user so; if it is, I want it to continue on. Is there a consistent way to verify this? I am not looking to start a new session as the Administrator; I just want to detect whether the script is currently running as admin.
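
    One common way to test for elevation in a batch script - a sketch that leans on the fact that "net session" requires administrator rights, so its exit code doubles as an elevation check:

      @echo off
      rem "net session" fails with an access error unless the prompt is elevated.
      net session >nul 2>&1
      if %errorlevel% neq 0 (
          echo This script must be run as Administrator.
          exit /b 1
      )
      echo Running elevated - continuing...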

    Read the article

  • Website won't load until the initial script finishes

    - by wardy277
    Hi, I have a heavily used server (running Plesk). I have some long scripts that take a while to process (huge MySQL database). I have found that in one browser, while such a script runs, I cannot view any other part of the site until the script finishes; the requests go out, but they don't get served until the initial script finishes. I thought this might be a server-wide issue, but it is not: from another computer I can view the site fine, and even on the same computer with a different browser I can navigate fine while the script still loads. I think it must limit the number of requests per session. Is this correct? Is there any way to configure this to allow 2-3 other requests per session? It is really bad that when I am on the phone to a client and have just run a long report, I cannot use the site or follow what they are saying until the page has loaded. Chris
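
    For what it's worth, the symptoms match PHP's file-based session locking: requests that share the same session ID are served one at a time, because each waits for the session file lock. A hedged sketch of the usual fix, assuming the report is a PHP script using sessions (run_long_report() is a hypothetical stand-in for the report code):

      <?php
      session_start();
      // ...read whatever the report needs from $_SESSION here...
      session_write_close();  // release the session lock so other requests
                              // from the same browser can proceed in parallel
      run_long_report();      // hypothetical long-running work
      ?>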

    Read the article

  • DVI to HDMI - bad display

    - by TutorialPoint
    I currently have a computer with an ASUS EAH5770 (ATI Radeon HD 5770) 1GB GDDR5 video card, 4GB of RAM, and a 2.6 GHz i5 processor. I just switched from a DVI cable (the blue one) to an HDMI cable (Bigben flat HDMI 1.3c cable). I use a Samsung SyncMaster 2032MW, which has an HDMI input. The weird thing is that my screen runs off the corners of the TV (so it's too wide) at 1920x1080, and Windows icons and text are not displayed well, though 1080p videos on YouTube look brilliant, just like pictures. So I think it has something to do with Windows. I already have the ATI Catalyst Control Center with the drivers I received with my video card. I do not currently know how to fix these problems. Do I have to reinstall Windows, or is it (hopefully) easier than that?

    Read the article

  • How to set specific environment variables for an Apache service on Windows

    - by Jimm Chen
    I'm facing a problem. I use XAMPP 1.7.7 on Windows, which installs an Apache service. I found that I had to do some tweaking to get all the PHP modules to load properly. For example, php_ldap.dll could not be loaded, and it was a mystery why until I ran httpd.exe from the command line, which revealed that libsasl.dll could not be found. D:\xampp\php\libsasl.dll does exist, but httpd.exe cannot find it. OK, so the fix is to add D:\xampp\php to the PATH environment variable. Now my question: how do I set a specific PATH value for that specific Apache service, but not system-wide? - because I think it is better not to disturb other processes with the extra PATH value. Is there a general way to do that for a specific Windows service? Or is there an Apache-specific way to load extra environment settings from some configuration file?
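
    One general mechanism worth trying - hedged, since the exact service name is an assumption (check yours with "sc query"): Windows services honor a REG_MULTI_SZ value named "Environment" under the service's registry key, which lets you hand a private PATH to just that service:

      rem Service name "Apache2.2" is assumed - adjust to what "sc query" shows.
      rem Note: this PATH replaces the one the service would inherit, so list
      rem everything the service needs, not just the extra directory.
      reg add "HKLM\SYSTEM\CurrentControlSet\Services\Apache2.2" ^
          /v Environment /t REG_MULTI_SZ /d "PATH=D:\xampp\php;C:\Windows\system32" /f
      net stop Apache2.2 && net start Apache2.2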

    Read the article

  • CSC folder data access AND roaming profiles issues (Vista with Server 2003, then 2008)

    - by Alex Jones
    I'm a junior sysadmin for an IT contractor that helps small, local government agencies, like little towns and the like. One of our clients, a public library with ~50 staff users, was recently migrated from Server 2003 Standard to Server 2008 R2 Standard in a very short timeframe; our senior employee, the only network engineer, had suddenly put in his two weeks' notice, so management pushed him to do this project before quitting. A bit hasty on management's part? Perhaps. Could we do anything about that? Nope. Do I have to fix this all by myself? Pretty much. The network is set up like this:

    a) 50ish staff workstations, all running Vista Business SP2. All staff use MS Outlook, which uses RPC-over-HTTPS ("Outlook Anywhere") for cached Exchange access to an offsite location.
    b) One new (virtualized) Server 2008 R2 Standard instance, running atop a Server 2008 R2 host via Hyper-V. The VM is the domain's DC and also the site's one and only file server. Let's call that VM "NEWBOX".
    c) One old physical Server 2003 Standard server, running the same roles. Let's call it "OLDBOX". It's still on the network and accessible, but it's been demoted and its shares have been disabled. No data has been deleted.
    d) Gigabit Ethernet everywhere. The organization has only one domain, and it did not change during the migration.
    e) Most users were set up with a combo of redirected folders + offline files, but some older employees who had been with the organization a long time are still on roaming profiles.

    To sum up: the servers in question handle user accounts and files, nothing else (e.g., no TS, no mail, no IIS, etc.). I have two major problems I'm hoping you can help me with:

    1) Even though all domain users have had their redirected folders moved to the new server, and logging in to their workstations and testing confirms that the Documents/Music/Whatever folders point to the new paths, it appears some users (and not just laptop users, either!) had been working offline from OLDBOX for a long time, and nobody realized it. Here's the ugly implication: a bunch of their data now lives only in their CSC folders, because they can't access the share on OLDBOX and finally sync with it.
    2) What's the best way to migrate roaming-profile users to non-roaming ones, without losing vital data like documents, any lingering PSTs, etc.?

    Things I've thought about trying for problem 1:

    a) Re-enable the documents share on OLDBOX, force an Offline Files sync for ALL domain users, then copy OLDBOX's share data to the equivalent share on NEWBOX, and reinitialize the Offline Files cache for every user. With this: How do I safely force a domain-wide Offline Files sync? Could I lose data by re-enabling the share on OLDBOX and forcing the sync? Afterwards, how can I reinitialize the Offline Files cache for every user without doing it manually, workstation by workstation?
    b) Determine which users have unsynced changes to OLDBOX (again, how?), search each user's CSC folder domain-wide via the workstation admin shares, and grab the unsynced data; then reinitialize the Offline Files cache for every user. With this: How can I detect which users have unsynced changes with a script? How can I search each user's CSC folder when the ownership and permissions set on CSC folders are so restrictive? And again, how can I reinitialize the cache afterwards without visiting every workstation?
    c) Manually visit each workstation, copy the contents of the CSC folder, and manually copy that data onto NEWBOX; then reinitialize the Offline Files cache. With this: Again, how do I 'break into' the CSC folder and get to its data? As an experiment, I took one workstation's HD offsite, imaged it for safety, and then tried the following with one of our shop PCs after attaching the drive: grant myself full control of the folder (failed), grant myself ownership of the folder (failed), run chkdsk on the whole drive to make sure nothing's messed up (all OK), try to take full control of the entire drive (failed), try to take ownership of the entire drive (failed). MS KB articles and Googling around suggest there's a utility called CSCCMD that's meant for this exact scenario... but it looks like it's available for XP, not Vista, no?

    For problem 2:

    a) Figure out which users are on roaming profiles and where their profiles 'live' on the server. Create new folders for them in the redirected-folders repository, migrate the existing data, and disable the roaming. With this: Finding out who's roaming isn't hard. But what's the best way to disable the roaming itself - in AD Users and Computers, or on each user's workstation? Doing it centrally on the server seems more efficient; that said, all of the KB research I've done turns up articles on how to go from local to roaming, not the other way around, so I don't have good documentation on this.

    In closing: we have good backups of NEWBOX and OLDBOX, but not of the workstations themselves, so anything drastic on the client side would need imaging and testing for safety. Thanks for reading along this far! Hopefully you can help me dig us out of this mess.
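
    For the ownership problem in particular, the command-line tools tend to succeed where the Explorer security tab fails. A hedged sketch, assuming the imaged drive is attached as E: and everything runs from an elevated prompt:

      rem Take ownership of the offline files cache recursively, then grant
      rem Administrators full control (Vista keeps the cache in \Windows\CSC).
      takeown /f E:\Windows\CSC /r /d y
      icacls E:\Windows\CSC /grant Administrators:F /t
      rem Copy the rescued data somewhere readable before pushing it to NEWBOX
      robocopy E:\Windows\CSC D:\csc-rescue /e /r:1 /w:1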

    Read the article

  • Remote Desktop Software - TeamViewer comparison?

    - by Martin
    Preliminary note: after reading what I wrote below, I would like to stress that this ain't a TeamViewer ad. It's just that all the other tools I checked online seem to miss one feature or another. :-) OK, so I'm currently trying to get a picture of the available remote desktop solutions. I have found (through personal use) that TeamViewer ticks pretty much all the boxes that I personally would want from any remoting tool. (Specifically, its setup is amazingly trivial.) It supports a wide range of platforms and it's even free for private use, so I'm really quite OK with it. I would be interested to hear if anyone knows of other tools that tick as many boxes as TeamViewer seems to.

    Read the article

  • Mac OS X 10.5/6, authenticating against NIS or LDAP when both servers have your username

    - by Wang
    We have an organization-wide LDAP server and a department-only NIS server. Many users have accounts with the same name on both servers. Is there any way to get Leopard/Snow Leopard machines to query one server and then the other, and let the user log in if the username/password combination matches at least one record? I can get either NIS authentication or LDAP authentication. I can even enable both, with LDAP set as the higher priority, and authenticate using the name and password listed on the LDAP server. However, in that last case, if I set the LDAP domain as higher priority in Directory Utility's search path and then provide the username/password pair listed in the NIS record, my login is rejected even though the NIS server would accept it. Is there any way to make the OS check the rest of the search path after it finds the username?
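
    For checking what the machine is actually consulting, the search policy can be read from the command line (standard dscl on 10.5/10.6):

      # Show the attributes of the authentication search node, including the
      # search policy and the order of the LDAP and NIS entries:
      dscl /Search -read /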

    Read the article

  • High-res icon in Windows Vista alt-tab thumbnail preview?

    - by netvope
    I have customized my alt-tab screen with the following:

      [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\AltTab]
      "OverlayIconPx"=dword:00000040
      "MaxThumbSizePx"=dword:00000100
      "MinThumbSizePcent"=dword:00000064

    It works great: the thumbnail becomes 256 pixels wide and the icon at the corner of the thumbnail becomes 64x64 pixels. However, Windows doesn't load the high-res icons from the programs; instead, it takes the 16x16-pixel icon and scales it up with nearest-neighbor. I'm sure the programs have high-res icons, because I saw them in the "Extra Large Icons" view in Explorer. So the question is: how can I force Windows to load the high-res icons for the alt-tab thumbnail preview? (Perhaps a registry key, or a .dll hack/injection?)

    Read the article

  • Windows 7 - New Windows Are Not Displayed

    - by Peter
    I am running an up-to-date OEM version of Windows 7. For the past few weeks I have been experiencing an annoying problem where new windows are not displayed 70-80% of the time. For example, when I launch Notepad, the Notepad window is not displayed, or when I select Save from the File menu, the Save dialog is not displayed. The window is there, but it is not being drawn; the screen is not refreshed. The way to work around this is to force a system-wide refresh by pressing CTRL+SHIFT+DEL and then cancelling, hiding all windows using "show desktop", and then bringing up the needed window via the taskbar. Can anyone please point me in the right direction to solve this problem? What could be causing it?

    Read the article

  • How do I associate server traffic to a domain hosted on that server?

    - by morley
    I have three or four Linux servers, each of which hosts anywhere from 5 to 50 domains. Each domain has its own folder: /www/projectname/web/ Logs go in: /www/projectname/log However, if there's a traffic spike (or, as I see it on my end, a memory usage spike), I'm not sure how to figure out which domain is responsible for the traffic without running tail -f on each of the projects and making an educated guess based on how fast things scroll. There's got to be a better way! There probably is, but I haven't seen it. And the last time I checked, bandwidth monitors only report system-wide load. So if anyone knows how to do this the right way, please let me know. Thanks!
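
    A rough way to rank projects by current log activity - a sketch that assumes one Apache-style access log per project at /www/<project>/log/access_log (adjust the filename to the layout actually in use):

      #!/bin/sh
      # Count hits stamped with the current minute in each project's log and
      # print the busiest ones first.
      now=$(date +'%d/%b/%Y:%H:%M')
      for f in /www/*/log/access_log; do
          printf '%8d %s\n' "$(grep -c "$now" "$f")" "$f"
      done | sort -rn | head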

    Read the article

  • RewriteRule in htaccess in subdirectory

    - by Jay
    Windows server, running Apache. In my Apache conf, I have AllowOverride None for the root of the site, and then I have a subdirectory set to AllowOverride All:

      <Directory />
          AllowOverride None
      </Directory>
      <Directory "/safe/">
          AllowOverride All
      </Directory>

    However, when I try to set up a rewrite rule in the subdirectory's .htaccess file, nothing happens; I just get a 404 page-not-found error. Example:

      RewriteEngine On
      RewriteRule (.*) /blah?test=$1 [R=302,NC,NE,L]

    Rewriting URLs works fine from the root via the Apache conf. I don't understand why the rule is ignored. I don't want to do the URL rewriting in the conf, because in this case I may need to change the redirects constantly and don't want to reload the server every time a change is made. I also don't want to affect server performance by enabling .htaccess files site-wide, just in the subdirectory where I need them.
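
    One thing worth double-checking, offered as a guess at the cause rather than a confirmed diagnosis: <Directory> blocks match filesystem paths, not URL paths, so on Windows the override section probably needs the full on-disk path of the subdirectory:

      # The path below is illustrative - use the real location of "safe"
      <Directory "C:/sites/example/safe">
          AllowOverride All
      </Directory>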

    Read the article

  • Gnome - windows always open top left

    - by BobTodd
    I find this a highly annoying "feature": on a wide-screen monitor, my most-used apps - terminal and gedit - always open directly under the top-left corner of my screen, and I have to drag them to my eye position every single time. I have tried installing the CompizConfig Settings Manager and using its feature to position windows in the centre, but this has had no effect - the force option there isn't working for me either. I can use e.g. gnome-terminal --geometry=140x50+50+50 for the terminal, but this doesn't work for gedit. Any ideas? Thanks
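
    One workaround that doesn't depend on the application supporting --geometry is wmctrl, assuming the package is installed - a sketch:

      # Launch gedit, give its window a moment to appear, then move it.
      # In "-e 0,400,300,-1,-1" the fields are gravity,x,y,width,height;
      # -1 keeps the current size.
      gedit &
      sleep 1
      wmctrl -r gedit -e 0,400,300,-1,-1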

    Read the article

  • I am the one who needs 5000 CDs!

    - by cabey
    I didn't realise how worldwide the users of this site are. I am based in England and will be helping out at an international camp for young people in Finland this summer. I will be in charge of a game in which 1500 young people will search for these CDs, which will be hidden all over the camp site. They will have to find them and bring them back to base one at a time. The young people will be divided into 5 teams, and the team that brings back the most gets a prize. Hope this helps and allows me to put the request back on the site. I have tried to source the CDs in Finland but have had no success.

    Read the article

  • uWSGI and python virtual env

    - by user27512
    I'm trying to use uWSGI with a virtualenv in order to run the Trac bug tracker on it. I've installed uwsgi system-wide via pip. Next, I installed Trac in a virtualenv:

      $ virtualenv venv
      $ . venv/bin/activate
      $ pip install trac

    I've then written a simple uWSGI configuration:

      [uwsgi]
      master = true
      processes = 1
      socket = localhost:3032
      home = /srv/http/trac/venv/
      no-site = true
      gid = www-data
      uid = www-data
      env = TRAC_ENV=/srv/http/trac/projects/my_project
      module = trac.web.main:dispatch_request

    But when I try to launch it, it fails:

      $ uwsgi --http :8000 --ini /etc/uwsgi/vassals-available/my_project.ini --gid www-data --uid www-data
      ...
      Set PythonHome to /srv/http/trac/venv/
      ...
      *** Operational MODE: single process ***
      ImportError: No module named trac.web.main
      unable to load app 0 (mountpoint='') (callable not found or import error)

    I think uWSGI isn't using the virtualenv: from inside the virtualenv, I can import trac.web.main without getting an ImportError. How can I fix this? Thanks
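
    One way to rule out an interpreter mismatch - a sketch, not the only possible fix: install uWSGI inside the virtualenv itself and launch that binary, so the Python that serves the app is guaranteed to be the one Trac was installed into:

      . /srv/http/trac/venv/bin/activate
      pip install uwsgi
      # the venv's uwsgi now sees the same site-packages that contain trac
      uwsgi --http :8000 --ini /etc/uwsgi/vassals-available/my_project.ini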

    Read the article

  • Material to use for computer system cover against UV and salty air?

    - by hippietrail
    I live right next to the sea and have a large window quite close to my computer setup which allows a lot of indirect sunlight to enter. I'd like to buy or make a cover for my computer system. From visiting my usual mom & pop computer shop yesterday I got the impression these might not really exist any more. If I make my own I need a material with these qualities: Block or reduce ultraviolet light which can depolymerize plastics (the sun here in Australia is much stronger than in the northern hemisphere). Block salt-laden sea air which can oxidize USB and other connectors. Not cause static electricity when covering or uncovering. Keep dust off of course (-: My setup is a laptop plugged into a wide-screen LCD with a few external drives. So I think I'd want a largish sheet to flop over the whole desk. Are such covers commonly sold these days? What material(s) should I look for which provides the listed attributes?

    Read the article

  • MDaemon vs Exchange (2007-2010): which way should we choose?

    - by Deniz
    We are on the verge of a mail server decision. We currently run 2 mail servers: MDaemon 10 and Exchange 2003. We are planning to move to a single, company- and customer-wide solution. Our main candidates are MDaemon 11 and Exchange 2007 or 2010. We would like to learn about other users' experiences with those solutions: server-side experiences, user-side experiences, TCO, support options, etc. And are there other solutions (maybe MDaemon 11 + Exchange, or anything else) you could suggest?

    Read the article

  • Setting per-directory umask using ACLs

    - by Yarin
    We want to mimic the behavior of a system-wide 002 umask on a certain directory foo, in order to ensure the following results:
    1. All sub-directories created underneath foo will have 775 permissions.
    2. All files created underneath foo and its subdirectories will have 664 permissions.
    3. Results 1 and 2 will hold for files/dirs created by all users, including root, and all daemons.
    Assuming that ACL is enabled on our partition, this is the command we've come up with:

      setfacl -R -d -m mask:002 foo

    This seems to be working - I'm basically just looking for confirmation. Is this the most effective way to apply a per-directory umask with an ACL?
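
    For comparison, a sketch of the commonly suggested variant plus a quick self-test. It uses explicit default u/g/o entries rather than a bare mask entry, since the mask is recalculated whenever the ACL changes; the expected modes assume applications request the usual 666/777 at creation time:

      setfacl -R -m  u::rwx,g::rwx,o::r-x foo      # access ACL for existing items
      setfacl -R -d -m u::rwx,g::rwx,o::r-x foo    # default ACL inherited by new items
      getfacl foo                                  # verify the default: entries
      touch foo/f && stat -c '%a %n' foo/f         # expect 664
      mkdir foo/d && stat -c '%a %n' foo/d         # expect 775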

    Read the article

  • Taking stock of an existing ASA Firewall

    - by Nate
    Imagine you are given an existing network using an ASA firewall. The network works, but you aren't sure of anything else. The firewall may be completely improperly configured, with "outside" actually being inside and "inside" actually being outside, for all you know. My question is this: what are the commands to take stock of an existing ASA firewall setup? With only CLI access, how do I figure out: What interfaces are available The names of the interfaces The security levels attached to the interfaces The access-lists attached to the interfaces, including rules and directions I know how to set these things (interface, nameif, security-level, and access-list/access-group), but I don't know how to figure them out given an existing system. On a related note, is there anything else that I should worry about checking to make sure that the network isn't wide open? Thanks!
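
    A starting set of read-only show commands for exactly this inventory (standard ASA CLI; none of these change the config):

      show running-config                ! the whole picture in one go
      show interface ip brief            ! interfaces, addresses, and state
      show nameif                        ! interface names and security levels
      show running-config access-list    ! the rules themselves
      show running-config access-group   ! which ACL is bound to which
                                         ! interface, and in which direction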

    Read the article

  • Does a single LACP channel over multiple switches increase redundancy?

    - by Sirch
    I am curious to hear opinions, findings, or evidence on whether bonding multiple interfaces with LACP to ports on multiple switches can increase redundancy. Previously, bonded interfaces have always gone to a single switch, with a redundant channel to another port. Without getting into vendor specifics, my concern is that since this is a single LACP channel, a single event or change could lead to a wide service outage. Without the spare equipment or time to test a single channel over diverse switches, could anyone with greater networking knowledge than mine tell me whether there is a network-side event that would bring down connectivity for a server with a bonded interface to two ports on separate switches? Does using bonded Ethernet channels across multiple switches (which we are advised we can use) from the server provide both improved throughput (unquestionably) and improved redundancy (uncertain)? Could network events such as switch failure, port migration, patching, recovery, etc., make the channel on both server network interfaces unavailable? Thanks in advance.

    Read the article

  • Extracting httpdocs from Plesk Panel 9.5.4 Webserver backup file

    - by Paddington
    Good day, I am having problems manually extracting domains from a Plesk 9.5 backup that was FTPed onto my backup server. I have followed the article http://kb.parallels.com/en/1757 using method 2. The problem is here:

      zcat DUMP_FILE.gz > DUMP_FILE

    My backup file CP_1204131759.tar is a tar archive, and zcat does not work with it. So I proceeded to run:

      cat CP_1204131759.tar > CP_1204131759

    But when I try:

      # cat CP_1204131759 | munpack

    I get an error that munpack did not find anything to read from standard input. I went on to extract the tar backup file using the xvf flags and got a lot of files (20) similar to these:

      CP_sapp-distrib.7686-0_1204131759.tgz
      CP_sapp-distrib.7686-35_1204131759.tgz
      CP_sapp-distrib.7686-6_1204131759.tgz

    How best can I extract the httpdocs of a domain from this server-wide Plesk 9.5.4 backup?
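
    Once the outer tar is unpacked, a sketch for locating and pulling just the httpdocs content out of the inner archives - the member names below come from the question, while the httpdocs path inside them is an assumption to verify with tar tzf:

      tar xvf CP_1204131759.tar
      # find which inner archive carries the web content
      for f in CP_sapp-distrib.*.tgz; do
          tar tzf "$f" | grep -q httpdocs && echo "httpdocs found in $f"
      done
      # then extract only that subtree (GNU tar)
      tar xzf CP_sapp-distrib.7686-0_1204131759.tgz --wildcards '*httpdocs*'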

    Read the article

  • SAMBA and Linux ACLs -- "Permission denied" on write to share but file written nevertheless

    - by MCH
    I set up a writable share directory "/home/net/share" with an ACL like this:

      sudo mkdir -p "/home/net/share"
      sudo setfacl -m "u:localuser:rwx,u:remoteuser:rwx,g:users:rwx" "/home/net/share"

    My /etc/samba/smb.conf looks like this:

      [global]
      workgroup = w
      server string = server
      security = user
      load printers = no
      log file = /var/log/samba/%m.log
      max log size = 50
      dns proxy = no
      printing = bsd
      printcap name = /dev/null
      disable spoolss = yes
      encrypt passwords = true
      invalid users = nobody root
      follow symlinks = yes
      wide links = yes

      [share]
      comment = Writable by localuser and remoteuser
      path = /home/net/share
      valid users = remoteuser
      read only = no
      public = no
      printable = no

    Locally, localuser and remoteuser both have user accounts and smbpasswds, and both can read, create and delete files in /home/net/share. But when I log on from a different machine (like this: sudo mount -t cifs //server/share mountpoint/ -o username=remoteuser), I get "Permission denied" both when trying to create directories and when trying to create files - oddly, though, it does create the files (not the directories!) despite these messages! How can I get this working?
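
    One hedged guess at the cause: the share directory carries an access ACL but no default ACL, so newly created entries don't inherit the rwx grants, and follow-up operations fail even though the create itself succeeds (note, too, that valid users lists only remoteuser). A sketch to test the ACL theory:

      # Add matching *default* ACL entries so new files and directories
      # inherit the same grants as the share directory itself:
      sudo setfacl -R -d -m u:localuser:rwx,u:remoteuser:rwx,g:users:rwx /home/net/share
      getfacl /home/net/share    # output should now include default: lines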

    Read the article
