Search Results

Search found 6568 results on 263 pages for 'shared'.

Page 144/263

  • hardware: delay and distinct 'click' before hard drive access

    - by matt lohkamp
    I have a Windows 7 box stashed away in my closet, containing (among other things) 2 big HDDs linked together as a mirrored volume - basically a super lazy NAS / media server. I've noticed that when that drive is accessed (whether locally, on the machine itself, or remotely, from another computer or my Xbox, for example) there's a noticeable pause, and then, from the computer itself, a 'click!' noise, after which the drive is accessed; e.g. open \\computername\shared\, wait 2 seconds, hear 'click!', and then see files appear in Windows Explorer. Otherwise the drive performs normally. Any ideas? Is it a Windows thing? An HDD-about-to-die thing? Or a "yeah, that always happens, you've just never noticed it before" thing?
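
    That pause-then-click pattern is consistent with the drive spinning down between accesses and the heads unparking when a request finally arrives. A quick way to test that theory (a sketch, assuming the Windows power plan rather than the drive's own firmware timer is what's parking it) is to disable the OS disk idle timeout and see whether the delay and click disappear:

      rem elevated command prompt; 0 = never spin the disk down (Windows 7 syntax)
      powercfg -change -disk-timeout-ac 0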

  • Printing on Windows 8 64-Bit through Windows Server 2008 (32 Bit) RemoteApp

    - by Chris
    We have a network where our server is running Windows Server 2008 (32-bit) and a client computer is running Windows 8.1 (64-bit) with a local printer attached to the client. The printer is an HP P1006. The RemoteApp works well, but when trying to print we get an odd "error 545". We have tried both using the "connect client printer" function in RemoteApp and also sharing the printer over the network and printing to it via the network from within the RemoteApp. Nothing works. We can print a test page from the server to the client computer just fine, but it seems we cannot from the RemoteApp. We have also tried installing the 32-bit drivers on the 64-bit machine as both the primary and secondary drivers, but cannot get them to install. Suggestions, please? We've been going crazy over this issue.

  • Mercurial changeset hook problem when auto updating. Server permissions maybe??

    - by Gary Willoughby
    I am using Mercurial SCM over a LAN, using a normal shared folder instead of http, and I'm having a problem getting the auto-update hook to run. I have entered the hook as detailed here: http://mercurial.selenic.com/wiki/FAQ#FAQ.2BAC8-CommonProblems.Any_way_to_.27hg_push.27_and_have_an_automatic_.27hg_update.27_on_the_remote_server.3F This installs the hook, but when I push something to the remote repo I get an error:

      added 1 changesets with 1 changes to 1 files
      running hook changegroup: hg update >&2
      warning: changegroup hook exited with status -1

    There is a similar Stack Overflow question here: http://stackoverflow.com/questions/2885246/mercurial-auto-update-problem but it offers no solutions other than that it may be a permissions error somewhere. Has anyone else had this problem? Can anyone shed more light on this, or give me a heads-up on where to start fixing it? Thanks.
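
    For reference, the hook from that FAQ entry lives in the served repository's .hg/hgrc and looks like this (quoted from the question's own error output, so this much is not in doubt):

      [hooks]
      changegroup = hg update >&2

    One point worth checking in the shared-folder case: with a file:// push there is no server process, so the hook runs inside the pushing user's own hg process on their machine, and an exit status of -1 usually means the hook command could not be spawned at all on that side.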

  • Homebrew build with different arch?

    - by StasM
    I tried to install the mysql-connector-c recipe via Homebrew, and it builds just fine, but produces an x86_64 library:

      $ file ~/brew/lib/libmysql.dylib
      .../brew/lib/libmysql.dylib: Mach-O 64-bit dynamically linked shared library x86_64

    However, I need an i386 library for my project. I tried to give it CFLAGS and LDFLAGS like this:

      CFLAGS="-arch i386 -arch x86_64" LDFLAGS="-arch i386 -arch x86_64" brew install mysql-connector-c

    but nothing changes - it still builds an x86_64-only binary. Is there any way to make Homebrew build either a dual-arch library or an i386 library? I have the kernel architecture set to x86_64, if it matters.
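
    Homebrew deliberately scrubs CFLAGS and LDFLAGS from the environment before building, which is why passing them on the command line changes nothing. One avenue (a sketch; it assumes the formula cooperates with Homebrew's universal-build machinery, which not every formula does) is to request a universal build, either through a --universal option if the formula declares one, or by editing the formula to force it:

      $ brew install mysql-connector-c --universal
      $ brew edit mysql-connector-c   # or: add ENV.universal_binary inside def install, then reinstall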

  • Double-click to open Office docs is slow, File -> Open is fast.

    - by Keith
    I have 2 unique networks. They both share a similar architecture:

    - Windows 2003 SBS SP2
    - Running Symantec Endpoint
    - Running Symantec Information Foundation
    - Shared drives off a data partition
    - Clients running Office 2003 or 2007
    - Clients connect to the file server through mapped drives

    When users try to open a file from their local PC by double-clicking, it takes 30-60 seconds to open. When they do File - Open, those same documents open up almost immediately. So far I've tried the following:

    - Ran CCleaner to clean the registry of outdated mapped drives
    - Disabled "using DDE"
    - Disabled A/V
    - Rebooted

    Any ideas beyond that? Figured this question belongs here instead of SU since it's the same issue on different networks.

  • VPS hosting for a social network

    - by Jana
    Hi, I've developed a social network and have been using shared hosting for it since it was launched. With that, I wasn't able to send emails in bulk for things like newsletters and invitations to join my site. More importantly, most of the emails I send end up in users' spam folders. I'm planning to move to a VPS, as it may not have such limits, and I'm wondering what's the cheapest VPS host available. I'm not very familiar with Linux commands, so I'm looking to cPanel to do the work for me. Will the following configuration suit a "new" social network like mine, which has little load?

    - 1000MHz guaranteed
    - 512MB guaranteed RAM
    - 20GB (RAID) disk space
    - 1000GB/month bandwidth
    - 2 IP(s) & 5 backups
    - Semi-managed

    Thanks in advance
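
    One caveat before choosing a host: moving to a VPS lifts sending limits, but by itself it won't keep mail out of spam folders - that hinges on the sending IP's reputation, reverse DNS, and SPF/DKIM records, all of which become your job on a VPS. A minimal SPF record sketch (the domain and IP are placeholders for your own):

      example.com.  IN TXT  "v=spf1 ip4:203.0.113.10 ~all"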

  • Why not install Msvcr71.dll into system32?

    - by hillu
    While looking for an authoritative source for the missing Msvcr71.dll that is needed by a few old applications, I stumbled across the MSDN article "Redistribution of the shared C runtime component in Visual C++". The advice given to developers is to drop the DLL into the application's directory instead of system32, since DLLs in this directory are considered before the system paths. What can/will go wrong if I (as an administrator, not a developer) decide to take the lazy path and install Msvcr71.dll (and Msvcp71.dll while I'm at it) into the system32 directory (of 32-bit Windows XP or Windows 7 systems) instead of putting a copy in each application's directory? Is there another good solution to provide the applications with the needed DLLs that doesn't involve copying stuff to the application directories? Added after first answers: I understand that incompatible API changes may have been made to the mentioned DLLs, but pretty much every mention of incompatibilities I found using Google had to do with games or video codecs. Right now, I expect the risk of breakage is pretty small. Am I missing something?
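
    One low-risk way to gauge the exposure before copying anything into system32 is to ask which running processes have already loaded the DLL, using the stock tasklist tool:

      tasklist /m msvcr71.dll

    If it only ever shows up under the old applications themselves, per-application copies stay cheap and sidestep any chance of a version conflict.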

  • Putting our OLTP and OLAP services on the same cluster

    - by Dynamo
    We're currently in a bit of a debate about what to do with our scattered SQL environment. We are definitely setting up a cluster for our data warehouses, and are now in the process of deciding if our OLTP databases should go on the same one. The cluster will be active/active, with database services running on one node and reporting and analytical services on the other node. From a technical standpoint I don't see an issue here: with the services running on different nodes, they shouldn't compete too heavily for resources. The only physical resource that may be an issue would be the shared disk space. Our environment is also quite small: our biggest OLAP database at the moment is only about 40GB, and our OLTP databases are all under 10GB. I see a potential political issue here, as different groups are involved, but strictly speaking I'm just wondering if any major technical issues could arise from this setup.

  • Installing PHP (suexec) for Apache

    - by John
    I've got Apache installed and running, but how do I install and run PHP as FastCGI so that it runs as its own user? Here is my Apache config:

      ./configure --prefix=/usr/local/apache2 --enable-rewrite=shared --enable-so --enable-suexec --disable-asis --disable-autoindex --enable-cache --enable-deflate --enable-disk-cache --enable-expires --enable-file-cache --enable-mem-cache --enable-ssl --enable-vhost-alias --with-mpm=prefork --with-port=8080

    Here is my PHP config:

      ./configure --prefix=/usr/local/php --without-pear --enable-safe-mode --enable-magic-quotes --with-apxs2=/usr/local/apache2/bin/apxs --disable-cli --disable-cgi --enable-force-cgi-redirect --enable-fastcgi --with-mysql --with-gd --with-jpeg-dir=/usr/lib --with-png-dir=/usr/lib --with-freetype-dir=/usr/lib --enable-calendar --with-curl --enable-mbstring --with-mcrypt
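
    As written, that PHP configure line works against the stated goal: --with-apxs2 builds mod_php, which always runs as Apache's own user, and --disable-cgi suppresses the very CGI SAPI that --enable-fastcgi applies to on PHP 5.2-era sources. The usual suexec arrangement is a FastCGI wrapper instead; a sketch (directive names are mod_fcgid's, and the paths, user, and group are placeholders):

      # rebuild PHP without --with-apxs2 and without --disable-cgi,
      # keeping --enable-fastcgi, so the build produces a php-cgi binary

      # /var/www/cgi-bin/php-wrapper (owned by the site user, mode 755):
      #!/bin/sh
      exec /usr/local/php/bin/php-cgi

      # httpd.conf, inside the site's VirtualHost (requires mod_fcgid + suexec):
      SuexecUserGroup siteuser sitegroup
      AddHandler fcgid-script .php
      FcgidWrapper /var/www/cgi-bin/php-wrapper .php
      <Directory /var/www/site>
          Options +ExecCGI
      </Directory>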

  • Am I using too much memory? (Rails on EC2 with Resque)

    - by Stpn
    I am looking at the memory usage of a Rails application (it uses background processes via Resque), and since the common answer to the question "how many workers is too many?" is "test and see", I ran some memory commands and wonder if someone can help figure out whether memory usage is already too high, or whether I can still add some extra workers. So (this is all under maximum load):

      $ free -t -m
                   total       used       free     shared    buffers     cached
      Mem:          1756       1532        223          0         12        229
      -/+ buffers/cache:       1291        464
      Swap:          895         10        885
      Total:        2652       1543       1108

      $ vmstat
      procs -----------memory---------- ---swap-- -----io---- -system-- ----cpu----
       r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa
       0  0  10588 156172  13400 326476    1    6     4     0    5    4  1  0 99  0

    If there is any extra info I can provide to help answer this, I would be happy to do so. If the question is strange in some way, please let me know and I'll gladly fix it.
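
    Reading that free output: the line to watch is "-/+ buffers/cache", which says 1291MB is genuinely used and 464MB is still available once reclaimable cache is discounted, and swap is nearly untouched - so there is some headroom, but not a lot. To turn that into a worker count, measure what the workers actually cost (a sketch; it assumes the Resque workers show up with "resque" in their command line, which is Resque's usual procline behaviour):

      # total resident memory of all resque workers, in MB
      ps -eo rss,args | grep '[r]esque' | awk '{sum+=$1} END {print sum/1024 " MB"}'

    Divide the available memory by the per-worker share of that total for a rough ceiling, leaving a margin for the app itself and Redis.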

  • YUM and RPM crash due to the liblua-5.1 library being missing

    - by A troubled linux newbie.
    I've been playing around with a LiveUSB install of basic Fedora with persistence. I attempted to install moonscript, which requires Lua and LuaRocks. After installing Lua and discovering there were flaws in the install which prevented LuaRocks from working, I used rpm to force Lua off so I could use yum to re-install it. The result was an error of this sort being yielded by both rpm and yum:

      There was a problem importing one of the Python modules required to run yum. The error leading to this problem was:
      liblua-5.1.so: cannot open shared object file: No such file or directory

    I've concluded from this that my Lua install provided a library which both yum and rpm depend on, and that forcing it off removed it. Is there any way to fix this without reformatting my drive and installing everything from scratch?
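
    This should be recoverable without a reformat: on Fedora, rpm itself links against liblua, which is why forcing the lua package off took both tools down. Since neither can run, the library has to be put back by hand - for example by fetching the lua package that matches your Fedora release on another machine and unpacking it without rpm (a sketch; package and library file names can vary slightly between releases):

      # on a working machine, with the downloaded lua rpm:
      rpm2cpio lua-5.1.*.rpm | cpio -idmv
      # copy the extracted usr/lib/liblua-5.1.so onto the broken system, then:
      ldconfig
      # once yum runs again, straighten out the package database:
      yum reinstall lua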

  • How to use 2 or more internet connections on the same network?

    - by Rogue
    Living in a joint family, we have 3 internet connections: each floor has its own connection, shared between that floor's 4-5 computers using a switch. Each of these internet-sharing networks is independent of the others. What I want to achieve here is a local network (for local messenger and file sharing) that combines all 3 independent networks; the problem is that whenever I try to do that, the whole network tends to use just one internet connection. I have all the necessary hardware. How do I solve this? One approach could be to get one PC to act as a server and bridge the internet connections, so that the whole network accesses the internet through this server. Theoretically this should be possible, but I have never tried it in real life. Also, if certain computers need to be restricted from internet access, how would this be possible on the same network?
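
    On the restriction sub-question: once everything sits on one flat LAN, internet access is decided by which gateway each machine points at, so a machine configured with no default gateway can still reach local shares and messengers but not the internet. A sketch using Windows netsh (interface name and addresses are placeholders for your own scheme):

      rem static address and subnet mask, deliberately no default gateway
      netsh interface ip set address "Local Area Connection" static 192.168.1.50 255.255.255.0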

  • Reducing CPU load to absolute minimum [on hold]

    - by user191338
    I have had a couple of things go missing (I believe stolen) in my shared apartment, and I want to run my laptop constantly with a webcam attached, running surveillance software that records / takes pictures when motion is sensed. I'd like to take whatever steps are necessary to run the laptop constantly without the fan coming on, as it's quite loud and can be heard even though the laptop will be hidden. Thus I'd like to know what steps I can take to reduce CPU load to the bare minimum needed to boot up, run the camera software, and send images via FTP / email when necessary. I have Windows 7 installed, though I can reinstall it clean. Which Windows services can I turn off, and what more extreme disabling or other measures can I take? The OS would need to run the camera and wifi / networking. Thanks very much for any help.
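
    Fan noise tracks CPU heat more than it tracks the length of the service list, so the single most effective lever is usually capping the maximum processor state, which keeps the CPU at a low clock no matter what runs (a sketch; 50 is just a starting percentage to experiment with):

      powercfg /setacvalueindex scheme_current sub_processor PROCTHROTTLEMAX 50
      powercfg /setactive scheme_current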

  • Standalone server setup for compute capacity

    - by mikera
    I'm developing an application for my company that will require a lot of compute capacity (running some very big mathematical calculations), and I'm looking for some form of server setup to do this. For various reasons, we want to run this on-site in our office rather than hosting it externally. It's been a while since I last had to set up my own servers, so I thought I would tap into the collective wisdom of Server Fault! My broad requirements are:

    - Budget $30-50k, with an aim to get as much compute capacity as possible for that budget
    - 64-bit servers suitable to run Ubuntu Linux + Java
    - Some relatively standalone rack that can be installed in secure office space
    - Fast/low-latency network connections between the servers, but we don't really care about connectivity to the outside world
    - Storage capacity shared between the servers - they don't necessarily need their own storage, provided they can be booted from a common image
    - Downtime can be tolerated (since the calculations are run in batch mode)
    - The software itself is fault-tolerant, so there is no need for extra resiliency in the server setup (cheap, replaceable commodity parts will be fine in general)

    Given these requirements, what kind of setup would you recommend, and why?

  • McAfee AutoUpdate from UNC path problem

    - by Vicky
    I have a network with 50 computers and no internet access. So instead of updating each of them individually with a DAT file, I tried to create a shared folder on the server and created a UNC site in the repository. I downloaded the file "DAT Package For Use with McAfee AutoUpdate Architect & ePO 3.0" from http://www.mcafee.com/apps/downloads/security-updates/security-updates.aspx. When I try to update, it gives the error: "Error occurred while downloading file SiteStat.xml". How do I fix it?

  • Windows cache not working as it should?

    - by piotrektt
    I run Windows Server 2012 Datacenter. The setup has 60GB of RAM. I have one file shared from a VHD, and when I copy this file locally all the RAM cache gets used, but when multiple computers connect to the share, the cache is not used. The network is 8Gb. The whole network is around 200 computers that need to read that one file, but on this setup just 10 connections bring the server to its knees. Is there any way to check what is going on? What other solution can I use to manage the cache in Windows?

  • Security considerations for my first eStore.

    - by Rohit
    I have a website through which I am going to sell a few products. It is hosted on simple shared hosting and does not have SSL. On the products page, each product has a Buy Now button created from my PayPal merchant account. PayPal recommends using its Button Factory to create secure buttons and saving them inside PayPal itself. I have followed that advice, so the code of any button is secure and does not disclose any information about either a product or its price. When the user clicks a Buy Now button, he/she is taken to the PayPal site, where a page is opened over SSL for the user to fill in credit card and shipping details. After a successful transaction, control is passed back to my site. I want to know whether there is still any chance that security could be compromised.

  • Monitor or log directory permission changes?

    - by Myles
    I'm having an issue with a cPanel shared server running CentOS 5 where a few directories under the public_html folder keep getting changed to 777 from 755. The customer says they are not changing them, and I'm wondering if there is a way to monitor these specific directories to find out who/what is changing the permissions. I have looked into using auditctl, and after testing it and changing the permissions myself I don't see anything in the logs, so I'm not sure if I'm doing it right or if it's even possible. Does anybody have any suggestions or ideas on how I could figure out what is changing the permissions? Thanks!!
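
    For reference, a watch of the kind described would look something like the following (a sketch; the path is a placeholder and auditd must be running). The detail that is easy to miss is that chmod is an attribute change, so the watch needs the 'a' permission flag or nothing gets logged:

      # log writes and attribute (permission/ownership) changes, tagged for searching
      auditctl -w /home/user/public_html/somedir -p wa -k permwatch
      # after the permissions flip again, ask who did it:
      ausearch -k permwatch -i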

  • Taking over and Moving a PHP site

    - by KCavon
    I have an internal-use PHP site at my new position. It only runs a few days a year, off-site, so we keep it on laptops. The hardware it has been on, an 8-year-old IBM ThinkPad running Fedora, is dying. I have new Lenovo ThinkPads running the latest and greatest Ubuntu. I have copied the contents of /var to a shared drive, renamed the old www folder in /var on the new machine, and copied over the old www folder. I can get to the login page and into the site, but when I look something up it returns "Cannot Open". I know I cannot get to MySQL on the new machine because the users and passwords don't match. The version of PHP from the old machine predates the inclusion of the setup script. I know very little about PHP. I am looking for input on the proper way to link the old PHP files to my MySQL instance. Any help much appreciated.
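
    Copying /var/www moves the code but not the data: the site's content lives in MySQL, so it has to be exported from the old machine, imported on the new one, and the MySQL account the PHP code expects recreated. A sketch of the usual sequence (the database, user, and password names here are hypothetical - the real ones are in the app's PHP configuration file, typically something like config.php or db.php):

      # on the old laptop: export the application's database
      mysqldump -u root -p sitedb > sitedb.sql
      # on the new laptop: import it and recreate the account the code expects
      mysql -u root -p -e "CREATE DATABASE sitedb"
      mysql -u root -p sitedb < sitedb.sql
      mysql -u root -p -e "CREATE USER 'siteuser'@'localhost' IDENTIFIED BY 'samepassword';
                           GRANT ALL ON sitedb.* TO 'siteuser'@'localhost';"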

  • Lots of files being used by blank web page. What are they?

    - by byronyasgur
    I am trying to optimise a website, and I was using the network waterfall facility in Google Chrome. When I looked at the results there were lots of files which I didn't recognise. I first thought they might be something to do with Google Chrome itself, so I put a blank HTML file on my desktop and checked, but there was nothing in the waterfall except the file itself. So I put a blank file on my server and I got the output below. What are all these files? Are they all necessary? Is this normal, and do I need to be in any way concerned? My hosting provider has always been excellent in every regard that I'm aware of. My host is shared hosting, using cPanel, and is based on a LAMP server. I also note that a couple of those files have problems, but I have no idea how to fault-find that or whether it's a concern. EDIT: I have cleared the cache, so I don't think it's a browser cache issue.

  • Why can I not access any file or directory created by PHP from FTP-client?

    - by user43053
    Hello there, if I create a directory with mkdir(), or create a file with fopen(), file_put_contents() or SimpleXMLElement::asXML(), I am unable to access the file with my FTP client or the cPanel File Manager. If I try to delete or edit them, I get errors. Dreamweaver suggests it is a permission problem or a network or filesystem fault, but I've set the permissions with chmod() to 0777, and when I check in cPanel, it confirms chmod 777. I also tried to use fileowner(), and the function returns int(99), the same owner as the files that I can access with my FTP client. It seems files and directories created with PHP can only be modified or deleted with PHP. I thought this must be a server-setup-related issue, so I'm asking it here. I am on a shared server, and I have no idea about setting up servers. Thank you for your time. Kind regards, Marius
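
    For context on that int(99): on many shared Linux hosts uid 99 is the 'nobody' account that Apache (and therefore mod_php) runs as, while FTP logs in as the hosting account's own user - the classic reason PHP-created files can't be touched over FTP, and why hosts address it by running PHP as the account user (suPHP/FastCGI). A quick way to compare owners from a shell, if one is available (a sketch; the path is a placeholder):

      # numeric uid/gid per entry; compare PHP-created against FTP-uploaded files
      ls -ln public_html/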

  • HTTP Redirect from www.mydomain.com to my amazon ec2 account (instance)?

    - by fabius
    Hello! I have a domain that is registered with one service provider, but my site (a WordPress blog) is hosted in a shared account with a friend at another hosting service. I want to separate from this friend because I'm tired of boring him with my blog's downtimes. Now, my problem is that I signed up for Amazon EC2 and created an instance (a virtual machine) to host my WordPress blog, and I'd like to point mydomain.com at this EC2 instance, but I don't know how to proceed. The instance is up and running (it's a 64-bit Linux machine), but I couldn't redirect mydomain.com to it from my host service's web panel. Could someone help me, please?
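
    The piece that usually goes missing here is an Elastic IP: a bare EC2 instance's public address changes across stop/start, so the order of operations is to attach a stable address to the instance and then point the domain's A record at it from the DNS panel where the domain is registered. A sketch (the instance ID and IP are placeholders; commands are from the classic EC2 API tools):

      ec2-allocate-address
      ec2-associate-address -i i-1a2b3c4d 203.0.113.10

      ; then, in the domain's DNS zone at the registrar:
      mydomain.com.  IN A  203.0.113.10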

  • Tuning MySQL to consume less memory

    - by Alex
    I have a VM which has 2GB RAM (full specs), and I am setting up a site which has one table in particular with over a million records. There's little or no usage of this particular database (perhaps once or twice a day), but simply running MySQL grinds the whole server to a halt. I've looked through the top results, but nothing is really denting the CPU; the memory seems to be the issue. The site isn't even live or taking requests yet. The memory situation looks like this:

      # free -m
                   total       used       free     shared    buffers     cached
      Mem:          2006       1880        126          0          3         53
      -/+ buffers/cache:       1823        183
      Swap:         2047        345       1702

    Are there any good pointers to tune MySQL to stop hogging the system memory? Thanks very much. EDIT (requested by 8bit): http://tny.cz/b41a0b12
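
    Most of MySQL's footprint comes from a handful of buffer settings whose defaults assume a dedicated database server. A low-memory starting point for my.cnf (a sketch; the numbers are deliberately conservative guesses to be adjusted against the real workload, and the InnoDB line only matters if the big table is InnoDB):

      [mysqld]
      key_buffer_size         = 16M    # MyISAM index cache
      innodb_buffer_pool_size = 64M    # main InnoDB cache; the usual memory hog
      max_connections         = 30     # each connection costs per-thread buffers
      sort_buffer_size        = 1M
      read_buffer_size        = 1M
      tmp_table_size          = 16M
      max_heap_table_size     = 16M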

  • Log into AD account through Command Line

    - by CranialPain
    Our SBS 2003 server likes to lose its connection every so often. This appears to 'kick' everyone out, so that no one can access the server or its shared folders without logging off and back on. It usually brings up an error message stating that Windows 7 (on the client machines) cannot find the server, even though it's pingable. Is there a way to log in through the command line, so I can just write a batch file and have the users double-click it and enter their credentials, instead of closing down programs and logging out/in over and over?
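
    net use can re-establish the session to the server without a full logoff; with a trailing * it prompts for the password interactively. A batch-file sketch (SERVER, share, and DOMAIN are placeholders):

      rem drop any stale session first, then reconnect with a credential prompt
      net use \\SERVER\share /delete
      net use \\SERVER\share /user:DOMAIN\%username% *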

  • Use HAProxy or Nginx to Load Balance between VPS

    - by xperator
    I want to load-balance (with failover) multiple VPS webservers hosted with different providers. I heard that for HAProxy you need multiple servers under the same subnet, plus a shared (virtual) IP address between the load balancers, but that's not possible in my case because every VPS is on a different node/network. Is there a way to use HAProxy in this kind of setup? (Please explain how, briefly; I don't just want to hear a "yes".) What about Nginx - is it possible to achieve the same result with Nginx when the servers are located on different networks? I know about round-robin DNS, but it doesn't provide a real failover solution, nor load balancing between servers.
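
    For what it's worth, the same-subnet/shared-VIP requirement only applies to making the balancer itself redundant (keepalived and friends); both HAProxy and Nginx will happily proxy to any backend they can reach over the internet. A minimal Nginx sketch with passive failover (hostnames are placeholders):

      upstream pool {
          server vps1.example.com:80 max_fails=2 fail_timeout=30s;
          server vps2.example.com:80 max_fails=2 fail_timeout=30s;
          server vps3.example.com:80 backup;   # used only when the others are down
      }
      server {
          listen 80;
          location / {
              proxy_pass http://pool;
              proxy_set_header Host $host;
          }
      }

    The balancer then becomes the single point of failure, which is exactly where the shared-IP trick - or low-TTL DNS failover - comes back into the picture.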
