Search Results

Search found 25039 results on 1002 pages for 'machine learning'.

  • Memory Usage on Linux box does not match up with `free`

    - by Chris Lieb
    I have a Linux machine that is not running too much in the way of software, but is somehow using 1.7GB of the 2GB of the installed memory. When I run free, I get:

                     total       used       free     shared    buffers     cached
        Mem:       2072616    1979972      92644          0     164876     129740
        -/+ buffers/cache:    1685356     387260
        Swap:       498004       1632     496372

    However, when I run ps aux, the memory usage of all processes only comes out to 295.9MB, which is a far cry from the 1.7GB of memory that free reports as used. Why is there such a discrepancy?
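
    A note on reading these numbers: Linux counts page cache and buffers as "used", but here even the -/+ buffers/cache row still shows about 1.6 GB in use, so cache alone does not explain the gap against the ~296 MB that ps reports. A rough sketch of a first diagnostic pass (the awk field assumes the standard ps aux column order, where RSS is column 6 in KiB):

        # sum resident set size across all processes (KiB -> MiB)
        ps aux --no-headers | awk '{rss += $6} END {printf "total RSS: %.1f MiB\n", rss / 1024}'

        # compare against free and against kernel-side allocations
        free -m
        grep -E 'Slab|SReclaimable|SUnreclaim|PageTables|Buffers|^Cached' /proc/meminfo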

  • Is there an equivalent to Airfoil available for Linux?

    - by Chris Adams
    Hi there, I just stumbled across Airfoil on Mac OS X, which lets me 'throw' music from my laptop to any other Linux machine, iPhone or other device hooked up to a better set of speakers than my laptop - here's the page: http://www.rogueamoeba.com/airfoil/mac/ What tools would I use to recreate this functionality on a Linux box - are there any nice GUI front ends to something like PulseAudio (which is what I imagine I'd use) for doing this? Thanks
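
    One way to sketch this with plain PulseAudio (no GUI), assuming both machines run PulseAudio and 192.168.1.50 is a placeholder for the machine attached to the good speakers:

        # on the receiving machine: accept audio over the network (anonymous auth is only sensible on a trusted LAN)
        pactl load-module module-native-protocol-tcp auth-anonymous=1

        # on the laptop: create a sink that tunnels audio to the receiver
        pactl load-module module-tunnel-sink server=192.168.1.50

    pavucontrol is the usual GUI for moving an application's stream onto that tunnel sink, and paprefs exposes the same network options as checkboxes.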

  • How to configure mercurial access controls using apache and hgweb?

    - by Gj1
    I have set up a Mercurial repo to be served using Apache+wsgi+hgweb on OS X. It is now completely open to anyone who stumbles upon my server on the correct port number. How can I set it up so that only people with a username+password pair that I approve can pull and/or push from the repo? I know how to very easily achieve this using SSH, but in this specific case the requirement is that the solution doesn't require defining full-fledged user accounts on the machine for each person I'd like to give access to the repo.
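
    A minimal sketch using Apache HTTP Basic authentication in front of hgweb (the /hg location, file paths and the user name are placeholders; keep HTTPS if push_ssl stays enabled):

        # create the password file with the first user (-c only the first time)
        htpasswd -c /usr/local/etc/hg.htpasswd alice

        # in the Apache config, protect the hgweb location
        <Location /hg>
            AuthType Basic
            AuthName "Mercurial repositories"
            AuthUserFile /usr/local/etc/hg.htpasswd
            Require valid-user
        </Location>

        # in the repository's .hg/hgrc, allow authenticated users to push
        [web]
        allow_push = *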

  • Remote SCCM deployment of Operating Systems

    - by Decad
    I am currently using SCCM 2007 for our software deployment and PXE. This summer I have been tasked with upgrading 2000+ machines from Windows XP to Windows 7. My plan is to use SCCM to advertise the Windows 7 task sequence to the machines. My question is: what is the best way to automate the deployment? Can I make SCCM turn a machine on and run an advertised task sequence without my having to be in the same room as the machines?
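
    SCCM 2007 can use Wake On LAN for mandatory advertisements, provided the clients' BIOS and NICs have WoL enabled. If that option is not usable in a given environment, a hedged fallback sketch is to wake the machines yourself from a list of MAC addresses with the common wakeonlan utility (macs.txt is a placeholder file with one MAC per line):

        # send a magic packet to every machine in the list, then let the mandatory advertisement run
        while read mac; do
            wakeonlan "$mac"
        done < macs.txt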

  • Is there a Distributed SAN/Storage System out there?

    - by Joel Coel
    Like many other places, we ask our users not to save files to their local machines. Instead, we encourage them to put files on a file server so that others (with appropriate permissions) can use them and so that the files are backed up properly. The result is that most users have large hard drives that sit mostly empty. It's 2010 now. Surely there is a system out there that lets you turn that empty space into a virtual SAN or document library? What I envision is a client program that is pushed out to users' PCs and coordinates with a central server. The server looks to users just like a normal file server, but instead of keeping entire file contents it merely keeps a record of where those files can be found among the various user PCs. It then coordinates with the right clients to serve up file requests. The client software would be able to respond to such requests directly, as well as be smart enough to cache recent files locally. For redundancy the server could make sure files are copied to multiple PCs, perhaps allowing you to define groups in different locations so that an instance of the entire repository lives in each group, protecting against a disaster in one building taking down everything else. Obviously you wouldn't point your database server here, but for simpler things I see several advantages:
    - Files can often be transferred from a nearer machine.
    - Disk space grows automatically as your company does.
    - It should ultimately be cheaper, as you don't need to keep a separate set of disks.
    I can see a few downsides as well:
    - Occasional degradation of user PC performance, if a machine has to serve or accept a large file transfer during a busy period.
    - Writes have to be propagated around the network several times (though I suspect this isn't really much of a problem, as reading happens in most places more than writing).
    - You still need a way to send a complete copy of the data offsite occasionally, and this would make it very hard to do differentials.
    Think of this like a cloud storage system that lives entirely within your corporate LAN and makes use of your existing user equipment. Our old main file server is due for retirement in about 2 years, and I'm looking into replacing it with a small SAN. I'm thinking something like this would be a better fit. As a school, we have a couple of computer labs I can leave running that would be perfect for adding a little extra redundancy to the system. Unfortunately, the closest thing I can find is Dienst, and it's just a paper that dates back to 1994. Am I just using the wrong buzzwords in my searches, or does this really not exist? If not, is there a big downside that I'm missing?

  • What else can I do to secure my Linux server?

    - by eric01
    I want to put a web application on my Linux server: I will first explain what the web app will do and then tell you what I have done so far to secure my brand new Linux system. The app will be a classified ads website (like gumtree.co.uk) where users can sell their items, upload images, and send to and receive emails from the admin. It will use SSL for some pages. I will need SSH. So far, here is what I did to secure my stock Ubuntu (latest version). NOTE: I probably did some things that will prevent the application from doing all its tasks, so please let me know if so. My machine's sole purpose will be hosting the website. (I numbered the items so you can refer to them more easily.)
    1) Firewall: I installed Uncomplicated Firewall. Deny IN & OUT by default. Rules: allow IN & OUT for HTTP, IMAP, POP3, SMTP, SSH, UDP port 53 (DNS), UDP port 123 (SNTP), SSL/port 443. (The ones I didn't allow were FTP, NFS, Samba, VNC, CUPS.) When I install MySQL & Apache, I will open up port 3306 IN & OUT.
    2) Secure the partition: in /etc/fstab I added the following line at the end: tmpfs /dev/shm tmpfs defaults,rw 0 0 and then ran mount -o remount /dev/shm in the console.
    3) Secure the kernel: in /etc/sysctl.conf there are a few different filters to uncomment. I don't know which ones are relevant to web app hosting. Which should I activate? They are the following:
       A) Turn on source address verification on all interfaces to prevent spoofing attacks
       B) Enable packet forwarding for IPv4
       C) Enable packet forwarding for IPv6
       D) Do not accept ICMP redirects (we are not a router)
       E) Accept ICMP redirects only for gateways listed in our default gateway list
       F) Do not send ICMP redirects
       G) Do not accept IP source route packets (we are not a router)
       H) Log martian packets
    4) Configure the passwd file: replace "sh" with "false" for all accounts except my user account and root. I also did it for the account called sshd; I am not sure whether that will prevent SSH connections (which I want to use) or if it's something else.
    5) Configure the shadow file: in the console, passwd -l to lock all accounts except my user account.
    6) Install rkhunter and chkrootkit.
    7) Install Bum and disable these services: "High performance mail server", "unreadable (kerneloops)", "unreadable (speech-dispatcher)", "Restores DNS" (should this one stay on?).
    8) Install AppArmor profiles.
    9) Install clamav & freshclam (antivirus and updates).
    What did I do wrong and what should I do more to secure this Linux machine? Thanks a lot in advance.
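
    Regarding point 3, a sketch of one reasonable baseline for a single-purpose web host (not a definitive answer): enable the anti-spoofing, ICMP-redirect, source-route and martian-logging options and leave forwarding off. On stock Ubuntu that maps roughly to these /etc/sysctl.conf lines, applied with sudo sysctl -p:

        # A) source address verification
        net.ipv4.conf.default.rp_filter = 1
        net.ipv4.conf.all.rp_filter = 1
        # B) and C) stay commented out: a web host should not forward packets
        # D) and F) ignore and do not send ICMP redirects
        net.ipv4.conf.all.accept_redirects = 0
        net.ipv6.conf.all.accept_redirects = 0
        net.ipv4.conf.all.send_redirects = 0
        # G) refuse source-routed packets
        net.ipv4.conf.all.accept_source_route = 0
        net.ipv6.conf.all.accept_source_route = 0
        # H) log martian packets
        net.ipv4.conf.all.log_martians = 1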

  • Have a server, need to figure out a method of backup

    - by PolishHurricane
    My company has an older Dell 2650 server running Arch Linux x64: http://www.dell.com/downloads/global/products/pedge/en/2650_specs.pdf (2 x 2.4GHz Intel Xeon with around 3287 MB of RAM according to free -m). We use it to host our internal company site and to post some information from our orders to it, and we'd like the ability to keep it up as much as possible. What we require:
    - It needs to always be functional from 8am to 4pm for our data entry person to use it and for others to do other things required on it.
    - If it goes down, we need a quick way to get the machine running again.
    - If it goes down, we would like to have the data backed up.
    Some of the major problems include:
    - The server's old and it may have memory issues.
    - We don't know when one of the hard drives could fail.
    - Our power goes out here once in a while.
    We have a battery backup, but that's pretty much it and it's not for the long term. If the server does go down, we have another system in place to store order information that comes in while it's down and repost it when it's back, but we need it up during the day. So we're wondering, what are our options? These are the things we thought of, sort of:
    - Set up RAID 1, but that would involve wiping everything, right? If we do that, how would we transfer the data over without messing up the server?
    - We could buy an extra server or two off eBay for $100, the same model - is that practical, or should we get something else?
    - Should we buy a PC or another, better server and host off that, because it would be easier to exchange parts?
    - Should we keep extra parts handy in case it implodes?
    - Should we buy/use backup software? We hear Drobos are cool, but suck. Perhaps there is a software solution to this problem that backs up to another machine or gets us up and running again quickly.
    Also, if we are to purchase hardware, what is decent? Does anybody know of one for Arch Linux/Linux? We both know a ton about computers but we're kind of unsure what step to take with this, especially with this type of server. Thanks
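
    Whichever hardware route is taken, a cheap first layer for the "data backed up" requirement is a nightly pull of the important directories to any second machine. A sketch, assuming SSH key access and that /srv holds the site and order data (the path and hostnames are placeholders):

        # cron entry on the backup machine: pull a mirror of /srv every night at 02:00
        0 2 * * *  rsync -aHx --delete root@dell2650:/srv/ /backups/dell2650/srv/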

  • How to redirect external web request to localhost's testing server

    - by Ivan Monteiro
    Some web services call my web application (www.myapplication.com/external_update_handler). I need to test those requests locally, so I'd like to know your opinions on how I can "redirect" those requests to my localhost dev machine (which is outside of my web application's domain) so I can debug. I probably need a service/server to receive those external requests and something on my desktop that forwards them to localhost:5555/external_update_handler, but I have no idea where to start, or whether there are simpler options.
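
    One common approach, assuming you have SSH access to the machine that serves www.myapplication.com, is an SSH reverse tunnel plus a temporary proxy rule on the server; the remote port 8080 and the user name below are placeholders:

        # from the dev machine: expose local port 5555 on the server as port 8080
        ssh -N -R 8080:localhost:5555 user@www.myapplication.com

        # on the server (Apache example, requires mod_proxy/mod_proxy_http): forward the handler into the tunnel
        ProxyPass /external_update_handler http://localhost:8080/external_update_handler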

  • Unable to share a folder between Windows 7 and Ubuntu (running in VMWare)

    - by darthvader
    I have installed VMware Tools in Ubuntu (the guest OS). I tried to share a location from the settings of the virtual machine, but when I click OK, the following error is thrown in the host (Win 7) OS: "Unable to update run-time folder sharing status: Unknown error." The location is not showing up in /mnt/. What could be the reason? P.S. I have the vmhgfs process running in my Ubuntu VM. I was following this method.
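
    A couple of checks from inside the guest that often narrow this down (a sketch; the exact tooling varies with the VMware Tools version):

        # list the shares the guest can actually see from the host
        vmware-hgfsclient

        # try mounting the HGFS tree by hand to see the real error
        sudo mkdir -p /mnt/hgfs
        sudo mount -t vmhgfs .host:/ /mnt/hgfs

        # if the vmhgfs kernel module is missing or stale, re-run the Tools configuration
        sudo vmware-config-tools.pl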

  • Add Ubuntu to windows domain

    - by John Isaacks
    I am new to Linux. I do not have any knowledge of how to do anything on Linux. I just got a new machine and successfully installed Ubuntu onto it. The first thing I want to do is join the domain the rest of the computers are on. How do I do that? I tried googling it, but all the results assume some sort of Linux knowledge and I have none.
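
    On Ubuntu releases of that era, the likewise-open package was the usual low-effort way to join an Active Directory domain. A sketch (example.local and Administrator are placeholders for your domain name and a domain account allowed to join machines):

        sudo apt-get install likewise-open
        sudo domainjoin-cli join example.local Administrator
        sudo reboot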

  • How to start gVim maximized?

    - by Somebody still uses you MS-DOS
    How can I make gVim automatically maximize its window when I open it? I'd like a cross-platform solution, since I'm trying to use the same configs on a Linux and a Windows machine... I've tried the hack :set lines=999 columns=999; it almost works, but the window is not maximized, just resized, and I lose some rows/columns.
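
    A sketch for the vimrc that branches per platform (assumes a GUI build of Vim; the Linux branch relies on the wmctrl utility being installed and on v:windowid, which needs a reasonably recent Vim):

        if has('gui_running')
          if has('win32') || has('win64')
            " Windows gVim: simulate Alt+Space, x to maximize the window
            autocmd GUIEnter * simalt ~x
          else
            " X11: ask the window manager to maximize the gVim window
            autocmd GUIEnter * call system('wmctrl -i -r ' . v:windowid .
                  \ ' -b add,maximized_vert,maximized_horz')
          endif
        endif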

  • Rewrite /folder to / Using .htaccess

    - by manyxcxi
    I am trying to redirect /folder to / using .htaccess, but all I am getting is the Apache HTTP Server Test Page. My root directory looks like this:

        /
          .htaccess
          /folder
          /folder2
          /folder3

    My .htaccess looks like this:

        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/folder/
        RewriteRule (.*) /folder/$1

    What am I doing wrong? I checked my httpd.conf (I'm running CentOS) and the mod_rewrite module is being loaded. As a side note, my server is not a www server, it's simply a virtual machine, so its hostname is centosvm.
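
    The stock test page usually means no rewriting is happening at all rather than the rules being wrong: on CentOS that page is served by /etc/httpd/conf.d/welcome.conf whenever the document root has no index, and .htaccess files are ignored unless overrides are allowed. A hedged sketch of the two things worth checking (not a definitive diagnosis); the [L] flag is also commonly added so the rule stops after matching:

        # in httpd.conf, inside the <Directory> block for your document root
        AllowOverride All

        # .htaccess
        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/folder/
        RewriteRule ^(.*)$ /folder/$1 [L]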

  • How could I determine which SMB client/session has a specific file open on a Server 2008R2 Windows file server?

    - by Rasmir
    What I need is a way to associate a client name or IP address with an open file, so that I can cleanly close the file for maintenance. NET SESSION doesn't show the names of open files and NET FILE doesn't show the client which has the file open. I had hoped that I could cross-reference the data from these two commands, but that doesn't seem doable. Everything else I've seen provides the same data as these commands, with no apparent way to determine which client machine has the file open.

  • Running localhost webapp projects under domain name using fiddler2

    - by user01
    I have a Tomcat server running on my local dev machine (running Windows 8) and I use Fiddler2 to assign an alias to localhost as my domain name (www.mydomainName.com), so my application's web pages open in the browser like this: http://www.mydomainName.com/myAppName/welcome.html instead of http://localhost:8080/myAppName/welcome.html. But I want my webapp's page URLs to omit 'myAppName' and look like http://www.mydomainName.com/welcome.html. How can I configure this?
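
    The 'myAppName' prefix is the Tomcat context path, so the usual fix is to deploy the application as the ROOT context - either by renaming myAppName.war to ROOT.war, or with a small context descriptor like the sketch below (the docBase path is a placeholder); Fiddler's host alias is unaffected either way:

        <!-- conf/Catalina/localhost/ROOT.xml -->
        <Context docBase="C:/path/to/myAppName" />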

  • Help me hosting Apache server on a LAN connected computers?

    - by akhilesh
    I have built a JSP project, but I really want other computers connected to my computer to be able to access my website. I have never done this before, so please help me. Right now my server can be accessed using http://localhost:8080 on my local machine. What configuration do I need to do? Please post some links or step-by-step help.
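
    Tomcat listens on all interfaces by default, so usually the only steps are finding the machine's LAN address and letting port 8080 through the local firewall; other machines then browse to http://<your-LAN-IP>:8080/yourApp/. The exact commands depend on the OS, so treat the lines below as a sketch (the Windows rule name is arbitrary):

        # find your LAN address: ipconfig on Windows, "ip addr" or ifconfig on Linux

        # Windows: allow inbound TCP 8080 (run in an elevated prompt)
        netsh advfirewall firewall add rule name="Tomcat 8080" dir=in action=allow protocol=TCP localport=8080

        # Ubuntu: allow inbound TCP 8080
        sudo ufw allow 8080/tcp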

  • Cisco NAT vs Bridge vs BVI

    - by cjavapro
    All of the devices on this particular LAN will have public IP addresses, and the public IP address will be configured directly on each machine, so we will not translate between private and public IP addresses. If we used NAT, we would have to translate the public IP on the WAN to the public IP on the LAN. The only security feature I expect on the gateway is an access list. I don't really know much about networking, so I am sorry if this question is generic.

  • Windows 7 can't identify network

    - by Carl Hörberg
    I use a Windows 7 machine to share my internet connection, but the one network interface that is connected to my local network is marked as "Unidentified network". The sharing works well anyway, but because the interface can't be chosen as a Home network I can't use the HomeGroup features etc. Do you know which requirements an interface has to meet for Windows 7 to identify a network?

  • MySQL server installation problems (windows)

    - by waitinforatrain
    Hi guys, I'm trying to install some CMS software (Wiccle). I was using XAMPP's MySQL but was getting a lot of errors (the same configuration works on another machine), so I thought I'd install MySQL Community Edition to see if the problem was related to the MySQL server. When I install and run the MySQL Community Edition service, however, it only works with my XAMPP password and contains the same tables as the XAMPP install. Is there a common local database file where the database and login info is stored? Any help appreciated.
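
    Separate MySQL installs do not share a database file unless they point at the same data directory; a more likely explanation is that both servers want port 3306 and the client is still reaching the XAMPP instance. A sketch to confirm which server you are actually talking to (run in a Windows command prompt; the root account is an assumption):

        mysql -u root -p -e "SELECT VERSION(); SHOW VARIABLES LIKE 'datadir'; SHOW VARIABLES LIKE 'port';"

        rem see which process owns port 3306, then match the PID in Task Manager
        netstat -ano | findstr :3306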
