Search Results

Search found 28222 results on 1129 pages for 'machine config'.

Page 782 of 1129

  • Can you suggest some DIY PC specs for 1) Value, and 2) Future 'upgradability'?

    - by user17381
    Hi, I'm considering building a new desktop PC from components. For the last 7 or 8 years I have almost exclusively used laptops, so I have fallen behind a bit on the various hardware technologies. Now I'm considering building a new desktop machine, mainly for development work, though it would also be nice to do a bit of gaming. The two main criteria are: I would like the first build to be relatively low cost, and I would like to select components that will allow me to upgrade in the future without throwing too much away. Can anyone recommend a setup? Thanks

    Read the article

  • Apache .htaccess problem: No input file specified.

    - by Michal M
    Hello everyone, can someone help me with this? I feel like I've been hitting my head against a wall for over 2 hours now. I've got Apache 2.2.8 + PHP 5.2.6 installed on my machine, and the .htaccess with the code below works fine, no errors:

      RewriteEngine on
      RewriteCond $1 !^(index\.php|css|gfx|js|swf|robots\.txt|favicon\.ico)
      RewriteRule ^(.*)$ /index.php/$1 [L]

    The same code on my hosting provider's server gives me a 404 error code and outputs only: No input file specified. index.php IS there. I know they have Apache installed (I cannot find version info anywhere) and they're running PHP 5.2.8. I'm on Windows XP 64-bit; they're running some Linux distribution with PHP in CGI/FastCGI mode. Can anyone suggest what could be the problem? PS: in case it matters, this is for CodeIgniter to work with friendly URLs.
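
    A hedged guess at a workaround, based on the fact that many CGI/FastCGI hosts do not pass the PATH_INFO style of URL that /index.php/$1 relies on: send the rest of the URI as a query string instead (CodeIgniter can be told to expect this via its uri_protocol setting). A sketch of the adjusted rules, not guaranteed for every host:

      RewriteEngine on
      RewriteCond $1 !^(index\.php|css|gfx|js|swf|robots\.txt|favicon\.ico)
      # send the path as a query string so the FastCGI wrapper can still locate index.php
      RewriteRule ^(.*)$ /index.php?/$1 [L]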

    Read the article

  • Problems when looping over a series of ssh-ed commands

    - by Jack Medley
    I have a series of server machines on which I want to run the same command. Each command takes hours, and (even though I am running the commands using nohup and setting them to run in the background) I have to wait for each to finish before the next starts. Here is roughly how I have set it up. On the host machine:

      for i in {1..9}; do ssh RemoteMachine${i} ./RunJobs.sh; done

    Where RunJobs.sh on each remote machine is:

      source ~/.bash_profile
      cd AriadneMatching
      for file in FileDirectory/Input_*; do
          nohup ./Executable ${file} &
      done
      exit

    Does anyone know of a way such that I don't have to wait for each job to finish before the next starts? Or alternatively a better way of doing this? I have a feeling that what I am doing is fairly sub-optimal. Cheers, Jack
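
    One hedged sketch of a workaround, assuming the goal is simply to fire off all nine RunJobs.sh invocations without waiting on one another: background the ssh calls themselves on the host machine.

      # push each remote job into the background locally; ssh -n stops the
      # backgrounded sessions from competing for stdin
      for i in {1..9}; do
          ssh -n RemoteMachine${i} ./RunJobs.sh &
      done
      wait   # optional: block here until every remote RunJobs.sh has returned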

    Read the article

  • Why do I have no TTY on a basic Ubuntu 9.10 server install?

    - by pr1001
    I have reinstalled Ubuntu 9.10 Server several times on a bog-standard 1RU server, and each time I finish the install and reboot I see GRUB run and am then presented with a black screen. The machine is running just fine, as I am able to SSH in, but I can't see anything on the attached monitor. I have a simple LCD screen connected via VGA and a signal is apparently being output to it, as it doesn't go to sleep. Looking at /var/log/syslog I see: Mar 24 14:57:44 bridge5 rsyslogd-2039: Could no open output file '/dev/xconsole' [try http://www.rsyslog.com/e/2039 ] However, I later see: Mar 24 14:57:44 bridge5 kernel: [ 0.001368] console [tty0] enabled Any thoughts? Thanks!

    Read the article

  • Worker processes not starting in IIS 7.5. What should I check?

    - by locster
    I have a Windows 7 machine (Windows version 6.1.7601 SP1 Build 7601) with IIS installed. At some point the installation appears to have become 'corrupted' in some way, as any requests are now met with the message: Service Unavailable HTTP Error 503. The service is unavailable. In IIS Manager, IIS is started and the app pool I am using reports itself as 'Started', yet there is no w3wp.exe process listed in the process list in Task Manager (I am a local admin and have clicked the 'Show processes from all users' button). I have enabled logging for the web site (at the default location of %SystemDrive%\inetpub\logs\LogFiles), but this folder is empty. I am assuming that this log output is written by w3wp.exe as it handles requests (no w3wp.exe, no log file?). Presumably there is another layer of request handling that is responsible for starting the worker processes; does this layer have log files I can check, and/or can I uninstall/re-install that layer? Thanks.

    Read the article

  • Ubuntu+Mono+Postgres+ASP.NET 4.0. No problem?

    - by wreck_of_u
    Would this be OK? I'm an ASP.NET developer and I'm planning to build "portable" web app servers based on the Atom D510 mini-ITX. I have run Ubuntu 10 with MySQL alongside separate IIS machines (Win 2k3, 2k8) before with no problems. But now I'm thinking of "packaging" a web/db server into one small, cheap machine. I thought Ubuntu/Mono/Postgres/ASP.NET would be a good idea, but I'm not sure, as I have not actually tried it yet. Your thoughts?
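
    A hedged starting point for such a stack, with the caveat that the exact package names depend on the Ubuntu release and that ASP.NET 4.0 support needs a reasonably recent Mono:

      # assumption: these package names exist in the release's repositories;
      # verify with `apt-cache search mono-apache` before relying on them
      sudo apt-get install postgresql libapache2-mod-mono mono-apache-server4
      # mono-apache-server4 targets the ASP.NET 4.0 runtime; mod_mono then has to be
      # enabled in the Apache configuration before .aspx pages are served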

    Read the article

  • Virtual MS SQL Server not consuming enough CPU

    - by rocketman
    We have a Win2008 32-bit server running as a virtual machine under ESX Server. It has 6 CPU cores of 2 GHz each and 4 GB RAM. It's running MS SQL Server 2008 R2 only. Problem: the server is heavily loaded and responds slowly. From Windows Task Manager's point of view it really looks overloaded, CPU-wise. However, our external "cloud manager" says it's only using 2.5 GHz worth of CPU cycles in the cluster. I/O times look "good". We have already tried setting SQL Server's number of worker threads from 0 (auto) to 256, to no effect. How do we tune the VM host, guest or SQL Server to use all of its allotted resources? Does it sound possible at all?

    Read the article

  • What is the value of workflow tools?

    - by user16549
    I'm new to workflow development, and I don't think I'm really getting the "big picture". Or perhaps to put it differently, these tools don't currently "click" in my head. So it seems that companies like to create business drawings to describe processes, and at some point someone decided that they could use a state-machine-like program to actually control processes from a lines-and-boxes diagram. Ten years later, these tools are huge and extremely complicated (my company is currently playing around with WebSphere, and I've attended some of the training; it's a monster). Even the so-called "minimalist" versions of these workflow tools, like Activiti, are huge and complicated, although not nearly as complicated as the beast that is WebSphere, as far as I can tell. What is the great benefit in doing it this way? I can kind of understand the simple lines-and-boxes diagrams being useful, but these things, as far as I can tell, are visual programming languages at this point, complete with conditionals and loops. Programmers here appear to be doing a significant amount of work in the lines-and-boxes layer, which to me just looks like a really crappy, really basic visual programming language. If you're going to go that far, why not just use some sort of scripting language? Have people thrown the baby out with the bathwater on this? Has the lines-and-boxes thing been taken to an absurd level, or am I just not understanding the value in all this? I'd really like to see arguments in defense of this by people who have worked with this technology and understand why it's useful. I don't see the value in it, but I recognize that I'm new to this as well and may not quite get it yet.

    Read the article

  • Copy large files to multiple machines on a LAN

    - by Jonathan Callen
    I have a few large files that I need to copy from one Linux machine to about 20 other Linux machines, all on the same LAN, as quickly as is feasible. What tools/methods would be best for copying these files, noting that this is not going to be a one-time copy? These machines will never be connected to the Internet, and security is not an issue. Update: the reason I ask is that (as I understand it) we are currently using scp serially to copy the files to each of the machines, and I have been informed that this is "too slow" and a faster alternative is being sought. According to what I have been told, attempting to parallelize the scp calls simply slows it down further due to hard drive seeks.
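
    One hedged sketch of the read-once, send-many idea, assuming bash on the source machine and SSH access to each destination (host1..host3 and /data are placeholders for the real names and path):

      # stream each file from disk once and fan it out to several hosts in parallel,
      # so the local disk only reads through the data a single time
      tar -cf - bigfile1 bigfile2 | tee \
          >(ssh host1 'tar -xf - -C /data') \
          >(ssh host2 'tar -xf - -C /data') \
          >(ssh host3 'tar -xf - -C /data') \
          > /dev/null

    Doing this in rounds of a few hosts at a time, or chaining machine to machine, keeps any single network interface from becoming the bottleneck.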

    Read the article

  • How to force rsync to use destination directory as root

    - by thepurplepixel
    I have a simple script to one-way-sync files/folders within a directory:

      #!/bin/bash
      HOST='<hostname>'
      USER='<username>'
      DIR='/downloads/'
      SOURCE='/srv/torrents'
      rsync -e "ssh -l $USER" --remove-source-files -h -4 -r --stats --progress -i $SOURCE $HOST:$DIR
      find $SOURCE -type d -empty -prune -exec rmdir -p \{\} \;

    However, when this rsync operation runs, it creates a folder, torrents, in /downloads on the destination machine. How can I force rsync to put all folders and files from /srv/torrents (the source) into /downloads/ (the destination) instead of creating /downloads/torrents as a separate directory?
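
    For what it's worth, rsync treats a source path ending in a slash as "the contents of this directory" rather than the directory itself, so the usual fix is simply a trailing slash on $SOURCE:

      # trailing slash = copy the contents of /srv/torrents, so no extra
      # 'torrents' directory appears under /downloads on the destination
      SOURCE='/srv/torrents/'
      rsync -e "ssh -l $USER" --remove-source-files -h -4 -r --stats --progress -i $SOURCE $HOST:$DIR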

    Read the article

  • Allowing access to MPD from local network

    - by August Karlstrom
    I have successfully installed MPD (Music Player Daemon) on my desktop computer. Everything works fine when the client runs on the same machine as the server. Now I would like to access MPD from my laptop computer which is connected (wirelessly) to the local network. In order to allow access to MPD from any computer on the local network I have added this line to /etc/hosts.allow: mpd: .local and restarted MPD. Still I get the message "error: Connection refused" when I try to access MPD with MPC (Music Player Client) from my laptop. Any clues or troubleshooting hints?
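
    A hedged hint: /etc/hosts.allow only comes into play if MPD was built with TCP wrappers support; the more common culprit is that MPD listens on localhost only, which is controlled by bind_to_address in its config. A sketch, assuming the config lives at /etc/mpd.conf (adjust the path if not):

      grep -n 'bind_to_address' /etc/mpd.conf     # check the current setting (it may be commented out)
      # make MPD listen on every interface instead of only localhost, then restart;
      # the sed assumes an existing, uncommented bind_to_address line
      sudo sed -i 's/^bind_to_address.*/bind_to_address "any"/' /etc/mpd.conf
      sudo service mpd restart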

    Read the article

  • Oracle Endeca "Getting Started" Partner Guide

    - by Grant Schofield
    For partners looking for a concise step-by-step guide to getting started with Oracle Endeca Information Discovery, here it is.
    Step 1: Join the Knowledge Zone as a company and as an individual - this will give you a) the right to resell Oracle Endeca ID, and b) notice of any free/subsidised training events in your region.
    Step 2: For a quick general overview and positioning, see the following article, in particular the Agile BI video series, which is useful to share with prospective clients. Also find a link to the official OEID Data Sheet.
    Step 3: For a more detailed overview there is a live recorded OEID partner webcast with downloadable slides. In conjunction with this, your sales/presales team have free access to the official OEID Partner Playbook as well as the full Oracle price book.
    Step 4: Download the OEID software and install it. Please be aware you will need a 64-bit machine and a 64-bit operating system. A useful solution for partners that have a 32-bit operating system is to use Oracle's free VirtualBox software to quickly and easily create a Linux image and install on that.
    Step 5: Attend a free/subsidised training event in your region. Please join the Knowledge Zone as an individual (opt in) to be informed of these. We will also publish these via the blog.
    Things are moving fast, so please be aware that the team are working hard to produce more and more material, such as downloadable data sets (structured/unstructured), a downloadable image and access to demos, and over the next few weeks we will update this article as soon as new material becomes available!

    Read the article

  • Suspected power supply issue? PC repeatedly whirs for some seconds and then dies.

    - by benwebdev
    I came home today and switched on the PC I built a few months ago, only for the machine to whir for a few short seconds and then die. It repeats this until I disconnect the power lead. Nothing is output to the screen, and this cut-out happens very quickly after switching it on. What could this be and how could it be fixed? Is it the power supply? I'm in despair :-( My spec is below; everything is new and was bought and assembled within the last 6 months, and the system has been fantastic until now.
      Power supply: 620W CoolerMaster Real Power M620
      Motherboard: Gigabyte GA-X58A-UD3R Intel X58 (Socket 1366) DDR3
      Processor: Intel Core i7 930 2.80GHz @ 4.00GHz
      RAM: 6GB DDR3
      OS: Windows 7 64-bit
      Graphics card: Sapphire HD 5770 Vapor-X 1GB
      Storage: 1TB Hitachi HDD, 720GB Seagate Barracuda HDD, 350GB Seagate Barracuda HDD
      Sound: EMu 0404 PCI soundcard
      Networking: D-Link PCI-E wireless card
      Optical: Samsung DVD RW drive

    Read the article

  • Rsync and Windows 7

    - by Nate
    Can someone give me any tips on setting up some sort of rsync server/client on Windows 7 to run rsync between my web hosting server and a backup server that I have running Ubuntu? I've tried setting it up with this tutorial: http://www.youtube.com/watch?v=CvwdkZLNtnA using copssh and cwRsync. I ran into all sorts of trouble, including not being able to get cwRsync to run (it installs properly, but never starts up), and copssh not generating the keys at all. The guy was running Windows Server 2003, though, so I'm guessing the problems could just be because I'm running Windows 7. I've been trying to set it up with my Windows machine as the rsync server and then Ubuntu and my web hosting VPS as the clients, but I realize it may be easier (and make more sense) to just set up the rsync server on Ubuntu and an rsync client on Windows 7? Can anyone point me in the right direction? I'm thinking of using this guide: http://www.gaztronics.net/rsync.php It seems a bit outdated, though.
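
    One hedged sketch of the simpler layout, assuming the Ubuntu backup server can reach the hosting server over SSH (the hostname and paths below are placeholders): drive everything from the Ubuntu side and let it pull, so nothing has to run as a service on Windows.

      # run from the Ubuntu backup server; user@webhost.example.com and the
      # paths are placeholders for the real account, host and directories
      rsync -avz -e ssh user@webhost.example.com:/var/www/ /srv/backup/www/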

    Read the article

  • How do I share my iPhoto photos with my ubuntu partition?

    - by Taryn East
    I have a MacBook Pro dual-booted with Snow Leopard and Ubuntu Karmic. I have recently imported hundreds of my photos into iPhoto, but I now want to be able to see them (and use them as desktop/screensaver images) from my Ubuntu partition, i.e. when the machine is running Ubuntu instead of Mac OS. Is there an easy way to do this directly from the iPhoto library, or do I have to shift them all out to an external file directory or something? Further edit, just to make it clear: I have already uploaded my photos directly into iPhoto, then spent many days categorising, tagging and uploading to Flickr. Unless there's something I'm missing, I'm guessing it's likely too late to use the "don't copy into the iPhoto library" option. Happy to be proven wrong :) Perhaps somebody knows of a way to "export" the library without losing any of the current information, so that I can (from then on) keep the photos in an external library? I don't want to do this, though, if I lose the information that is currently there.

    Read the article

  • Choosing a Linux distribution

    - by Luke Puplett
    Dangerous territory with this question so please try to be impartial and instead focus on what to look for when choosing a Linux distribution. I'm completely new to Linux. I thought it'd never happen but I need to have a Linux box to play with and I have a spare fanless Atom PC (32-bit only). I'll be using the machine as a non-commercial hobby server, the trouble is, I don't even know how to compare Linux distributions and why people pick one over another. If anything, I want to have an easy install from USB stick. My question is: what do you look for when choosing a (free?) Linux distribution for a server? If you can, please explain what sorts of things actually differ between one and another without saying which you think is better, just the facts. The way I see it, Linux as a server is just an SSH console and I find it hard to imagine what could be different between one and another.

    Read the article

  • Using SQL Server profiler from 2008 against R2 database

    - by Chris Lively
    On my dev machine I have the Management Studio tools installed from SQL Server 2008. However, one of the SQL Server boxes I need to profile against is running R2. Because of this, when I start the Profiler and connect to that server, all of the templates are gone. If I go to edit the templates, the R2 server type isn't available as an option. Do I have to upgrade my local toolset to R2, or is there some other way around this? Note that if I create a template and save it, the next time I connect to the server that saved template isn't available.

    Read the article

  • Is the BCM4306 wireless card ipv6 capable?

    - by horroricane
    I've been trying to connect to IPv6-enabled networks with my Broadcom wireless card under Ubuntu 12.04. The wireless card model is the BCM4306. lspci reports: Network controller: Broadcom Corporation BCM4306 802.11b/g Wireless LAN Controller (rev 03) and Ethernet controller: Realtek Semiconductor Co., Ltd. RTL-8139/8139C/8139C+ (rev 10). I have been unsuccessful in connecting through an IPv6 address, but I can still connect to a network when assigned an IPv4 address. From searching for an answer, I know the kernel can handle IPv6, so what's left to question should be the hardware handling the connection. Unfortunately nothing comes up when I specifically search for information on BCM4306 IPv6 capabilities. I just tried using a wired connection to establish an IPv6-only connection to the network I'm on right now, but I got the same behaviour of constant disconnections. Maybe it's not the hardware? I don't know. I don't want to disable IPv6 on my machine, as relevant networks I'll be connecting to will be using it exclusively, but I'm not sure what is wrong and which parts I should replace/fix to get this working. Could someone please point me in a fruitful direction to get IPv6 working under Ubuntu 12.04?

    Read the article

  • Running a Screen instance of a program as non-root

    - by user288467
    I've got a dedicated server (Ubuntu 12.04, no GUI) set up to launch an instance of McMyAdmin and attach it to a screen instance every time I reboot the hardware. I have the command saved to root's crontab as: @reboot cd /var/MC_SVR && screen -dmS McMyAdmin ./MCMA2_Linux_x86_64 The problem, though, is that I have a user set up specifically for FTP access to the server files so I don't always have to SSH into the machine. Since the server is being started as a root process, all the files it makes are, obviously, set with root as the owner. So I chown'd all the files and set them to ftpuser. Now I'm stuck with trying to get the process to start as ftpuser. I've tried doing the following but to no avail: cd /var/MC_SVR && su ftpuser - -c 'screen -dmS McMyAdmin ./MCMA2_Linux_x86_64' I try this in a terminal and I get no errors or anything (in fact I never get anything unless it's a syntax error from su), but there's no screen instance to access, so I can assume the server never starts. So, what am I doing wrong? Or am I just not accessing the screen instance correctly, since it's (supposed) to be launched by another user?
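
    Two hedged sketches, on the assumption that the ftpuser account owns /var/MC_SVR: either give ftpuser its own @reboot entry, or keep the job in root's crontab and hand the whole command line to su. Either way the resulting screen session belongs to ftpuser, so it has to be listed as that user rather than as root.

      # option 1: put the job in ftpuser's own crontab (edit it with: crontab -u ftpuser -e)
      @reboot cd /var/MC_SVR && screen -dmS McMyAdmin ./MCMA2_Linux_x86_64

      # option 2: from root's crontab, run the whole pipeline as ftpuser
      @reboot su - ftpuser -c 'cd /var/MC_SVR && screen -dmS McMyAdmin ./MCMA2_Linux_x86_64'

      # the session now belongs to ftpuser, so check for it as that user
      sudo -u ftpuser screen -ls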

    Read the article

  • In VirtualBox, I can't access the DVD drive to install a guest OS

    - by user211062
    I have installed a fresh copy of Ubuntu Server 12.04 and VirtualBox 4.3. I have set up a VM called "MediaServer" and tried to start it. I then get the following error: Cannot open host device '/dev/sr0' for readonly access. Check the permissions of that device ('/bin/ls -l /dev/sr0'): Most probably you need to be member of the device group. Make sure that you logout/login after changing the group settings of the current user (VERR_ACCESS_DENIED) I have looked all over the Internet and have been unable to find a solution. Using Webmin, I tried changing the group settings so that my user name was in the "vboxusers" group, but that did not work. I tried various other changes to group settings and none of them worked. I also tried rebooting the server after the changes and that didn't work either. I have been following a guide on how to set up an Ubuntu server from the website "linuxhomeserverguide.com", and when it came to the section where you finally set up your first virtual machine, I got stumped. I would really appreciate it if someone could help me. Thanks in advance.
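
    A hedged pointer, since the message is about the group that owns the host DVD device rather than vboxusers: on most Ubuntu systems /dev/sr0 belongs to the cdrom group, so adding the account that starts the VM to that group (and logging out and back in) is usually enough.

      ls -l /dev/sr0                                # shows which group owns the device, typically 'cdrom'
      sudo usermod -aG cdrom,vboxusers <username>   # <username> = the account that starts the VM
      # log out and back in (or reboot) so the new group membership takes effect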

    Read the article

  • Have you considered switching from Windows to OS X?

    - by Oscar Reyes
    For several years I wanted to move from Windows to OS X, mainly because back then Windows 95/98/2000 were terrible options. Now I have finally got my hands on one, just when Vista and Win7 seem to redeem all the mistakes of their predecessors. My most terrible rant is the lack of the following keys: PageUp, PageDown, Home, End, Supr. Not having those made my first programming day on OS X terrible. I don't regret switching, but somehow I expected something else. So: have you considered switching to a Mac? What has stopped you? What is holding you back? If you have switched already, are you willing to come back? (Now I run XP/Vista/Ubuntu/OSX on the same machine, yeah!!)

    Read the article

  • Remote desktop to multiple windows machines on a LAN with dynamic IP

    - by kevyn
    Is it possible to use Remote Desktop to connect to multiple computers inside a network that has a dynamic IP address? I use a Netgear WPN824 router which has DynDNS onboard, but I currently use No-IP to control the single computer that I use most frequently. Every so often I need to get onto a couple of other computers in the network, but I don't know how to go about this without logging onto one computer and then starting another RDC session from that machine. What I would like to be able to do is connect to my router, see a list of connected devices, and then choose which to remote desktop onto. I appreciate this probably is not possible, but any other suggestions are welcome!

    Read the article

  • Openfire Installation Issue - Can't Login to admin panel

    - by Lobe
    I am trying to get Openfire to install on an Ubuntu virtual machine, however upon completing the web-based installer I am unable to log in to the admin panel. So far I have:
      - downloaded the Debian installer
      - installed using stock options
      - added a database and built the structure using the supplied SQL file
      - completed the web-based installer
    I am now trying to log in using username admin and my password, however I constantly get a wrong username/password error. There is a record in the MySQL database showing the admin user with an encrypted password, and changing it to an unencoded password doesn't work. What is the problem here?
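
    One frequently reported workaround, hedged because the table and column names below are assumptions based on the stock Openfire schema and the database name is a placeholder: reset the admin password directly in the database, or alternatively set the <setup>true</setup> element to false in openfire.xml (typically under /etc/openfire/ with the Debian package) so the web installer runs again.

      # assumption: MySQL backend, database named 'openfire', stock ofUser table
      mysql -u root -p openfire -e \
        "UPDATE ofUser SET plainPassword='admin', encryptedPassword=NULL WHERE username='admin';"
      sudo service openfire restart    # then log in as admin/admin and change the password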

    Read the article

  • How to do a 3-tier using PHP [closed]

    - by Ric
    I have a requirement from a client for my PHP web application to be 3-tier. For example, I would have a web server running Apache in the DMZ, but it should NOT contain any DB connections. It should connect to a middle server that hosts the business objects but sits behind the firewall. Those objects then connect to my SQL cluster on another server. I have actually done this using .NET, but I am not sure how to set up my stack using PHP. I suppose I could have my UI front tier call the middle tier using REST-based web services if I create my middle tier as a second web server, but this seems overly complex. The main reason for this is advanced security: we cannot have any passwords on the DMZ first-tier web server. The second reason is scalability: to have multiple servers on different tiers that can handle the requests. The last reason is deployment: it is easier if I can take one set of servers offline for testing before putting them back in production. Is there an open source project that shows how to do this? The only example I can find is the web server hosting files from a shared drive on another machine (kind of how DotNetNuke pretends to be 3-tier), but that is NOT secure.

    Read the article

  • Google Chrome on Ubuntu 12.04 not rendering webpages correctly

    - by sumit_gt
    I am facing some serious web page rendering issues with Chrome. They are more prominent during JavaScript-based animations on websites like YouTube. I have tried removing Chrome (sudo apt-get purge google-chrome-stable) and then reinstalling it, but the problems still persist. The same web pages work correctly in Firefox on Ubuntu and in Chrome on Windows; the problem only shows up when I use Chrome on Ubuntu. I think the issue started after I updated to the latest version of Chrome, as I have used Chrome on this machine previously without any problems. I have attached an image that demonstrates the issue. What could possibly be the problem? PS: here's the output of lshw -c video:

      *-display
         description: VGA compatible controller
         product: Madison [Radeon HD 5000M Series]
         vendor: Hynix Semiconductor (Hyundai Electronics)
         physical id: 0
         bus info: pci@0000:01:00.0
         version: 00
         width: 64 bits
         clock: 33MHz
         capabilities: pm pciexpress msi vga_controller bus_master cap_list rom
         configuration: driver=fglrx_pci latency=0
         resources: irq:46 memory:e0000000-efffffff memory:f0020000-f003ffff ioport:d000(size=256) memory:f0000000-f001ffff

    The output of lspci -nn was attached as an image.
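
    A hedged diagnostic, given that the fglrx driver is in use: launch Chrome once with GPU compositing disabled and see whether the artifacts disappear, which would point at the driver/acceleration path rather than at Chrome's own rendering.

      # start Chrome with hardware acceleration turned off, purely as a test
      google-chrome --disable-gpu
      # if pages render correctly this way, the fglrx + GPU-compositing path is the
      # likely culprit; the same switch can be added to the desktop launcher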

    Read the article
