Search Results

Search found 25266 results on 1011 pages for 'internet usage'.


  • How should I monitor memory usage/performance in SunOS/Solaris?

    - by exhuma
    Last week we decided to add some SunOS machines (uname -a: SunOS bbs-sam-belair 5.10 Generic_127128-11 i86pc i386 i86pc) to our running munin instance. The machines are pre-configured appliances, so I want to avoid touching the system too much without supervision from the service provider. Adding them to munin was fairly easy by writing a small socket service (if anyone is interested, I put it up on GitHub: https://github.com/munin-monitoring/contrib/tree/master/tools/pypmmn). Yesterday I implemented/adapted the required plugins for our machines, and here the questions start.

    First, I have not found a way to determine detailed memory usage values. I get the total memory by running prtconf | grep Memory, and the free memory using vmstat. Fiddling together a munin plugin gives me a graph that is pretty much uninformative, especially compared to the default plugin for Linux nodes, which has a lot more detail. Most importantly, the Linux plugin shows me how much memory is actually used by applications. So, first question: is it possible to get detailed memory information on SunOS with the default system tools (i.e. not using top)?

    On to the next puzzle: looking at the graphs, I noticed activity in the "Paging in/out" graphs even though the memory graph still shows unused memory. Upon further investigation, I found that df reports /tmp as mounted on swap. Digging around on the web, I learned that df will display it as swap, but it is in fact mounted as a tmpfs. I don't know if this explains the swap activity. The default munin plugin for Solaris uses kstat -p -c misc -m cpu_stat to get these values, and I find it already strange that it uses the cpu_stat module. So maybe I am simply misinterpreting the "paging" graphs? Second question: do the paging graphs indicate that parts of memory are paged to disk, or is the activity caused by file operations in /tmp?
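    For what it's worth, a hedged sketch of where more detailed figures might come from with only stock Solaris 10 tools (kstat and mdb ship with the OS; ::memstat needs root, and whether the appliance permits it is worth verifying first):

        # page counts for total/free physical memory (multiply by `pagesize`)
        pagesize
        kstat -p unix:0:system_pages:physmem
        kstat -p unix:0:system_pages:freemem

        # kernel / anon / exec / page-cache breakdown, the closest analogue
        # to the Linux plugin's "used by applications" view (requires root)
        echo ::memstat | mdb -k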


  • Why is my apache2, mod_fcgid, php configuration causing 100% cpu usage?

    - by Scott Lundgren
    Page load makes a quick initial connection, then hangs about 10 seconds before the page renders. When the server load goes up, I start watching top and I see that both CPUs get pegged at times to 100% by between 4 and 8 php-cgi processes. My theory is that since I never see RAM usage go above 50%, Apache is able to handle the requests coming in, but is queueing them for PHP to process. What is wrong with my mod_fcgid/PHP configuration?

    Hardware/OS: RHEL 5.4, 2 Xeon E5420s @ 2.50 GHz, 4 GB RAM.

    Apache 2.2.3:

        Timeout 30
        KeepAlive On
        MaxKeepAliveRequests 0
        KeepAliveTimeout 5
        <IfModule worker.c>
          StartServers        2
          MaxClients        300
          MinSpareThreads    25
          MaxSpareThreads    75
          ThreadsPerChild    25
          MaxRequestsPerChild 0
        </IfModule>

    mod_fcgid 2.2.10:

        LoadModule fcgid_module modules/mod_fcgid.so
        <IfModule !mod_fastcgi.c>
          AddHandler fcgid-script fcg fcgi fpl php
        </IfModule>
        SocketPath run/mod_fcgid
        SharememPath run/mod_fcgid/fcgid_shm
        DefaultInitEnv PHPRC "/etc/"
        FCGIWrapper /usr/bin/php-cgi .php
        MaxRequestsPerProcess 1500
        MaxProcessCount 20
        IPCCommTimeout 240
        IdleTimeout 240

    APC 3.0.19:

        extension = apc.so
        apc.enabled=1
        apc.shm_segments=1
        apc.optimization=0
        apc.shm_size=32
        apc.ttl=7200

    The APC cache is 43% used with a 99% hit rate.
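    One hedged thing worth checking in a setup like this (an assumption, not a diagnosis): php-cgi recycles itself after PHP_FCGI_MAX_REQUESTS requests, which defaults to 500, so if mod_fcgid expects each process to survive MaxRequestsPerProcess 1500 requests, the processes die early and mod_fcgid can burn CPU constantly respawning them. A sketch of keeping the two values in step:

        # tell PHP to serve at least as many requests per process
        # as mod_fcgid will route to it before recycling
        DefaultInitEnv PHP_FCGI_MAX_REQUESTS 1500
        MaxRequestsPerProcess 1500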


  • How to read XML from the internet using a Web Proxy?

    - by Mark Allison
    This is a follow-up to this question: How to load XML into a DataTable? I want to read an XML file on the internet into a DataTable. The XML file is here: http://rates.fxcm.com/RatesXML If I do:

        public DataTable GetCurrentFxPrices(string url)
        {
            WebProxy wp = new WebProxy("http://mywebproxy:8080", true);
            wp.Credentials = CredentialCache.DefaultCredentials;
            WebClient wc = new WebClient();
            wc.Proxy = wp;
            MemoryStream ms = new MemoryStream(wc.DownloadData(url));
            DataSet ds = new DataSet("fxPrices");
            ds.ReadXml(ms);
            DataTable dt = ds.Tables["Rate"];
            return dt;
        }

    it works fine. I'm struggling with how to use the default proxy set in Internet Explorer; I don't want to hard-code the proxy. I also want the code to work if no proxy is specified in Internet Explorer.
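    A minimal sketch of one common way to do this (WebRequest.GetSystemWebProxy and WebRequest.DefaultWebProxy are standard .NET APIs; same usings as the original code, i.e. System.Data, System.IO, System.Net):

        public DataTable GetCurrentFxPrices(string url)
        {
            WebClient wc = new WebClient();
            // Pick up the proxy configured in Internet Explorer, if any;
            // with no proxy configured this behaves like a direct connection.
            wc.Proxy = WebRequest.GetSystemWebProxy();
            wc.Proxy.Credentials = CredentialCache.DefaultCredentials;
            using (MemoryStream ms = new MemoryStream(wc.DownloadData(url)))
            {
                DataSet ds = new DataSet("fxPrices");
                ds.ReadXml(ms);
                return ds.Tables["Rate"];
            }
        }

    Note that leaving wc.Proxy untouched usually gives the same result, since WebClient defaults to WebRequest.DefaultWebProxy.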


  • Is Transport security a bad practice for the WCF service over the Internet?

    - by Sergey
    Hello, I have a WCF service accessible over the Internet. It uses the wsHttpBinding binding with Message security mode and username credentials to authenticate clients. MSDN says that we should use Message security for Internet scenarios, because it provides end-to-end security, whereas Transport security only provides point-to-point security. What if I use Transport security for a WCF service over the Internet? Is it a bad practice? Could my data be seen by malicious users? Thanks, Sergey
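    For context, a hedged sketch of what the Transport alternative looks like in config (the binding name is made up; the security elements are standard WCF configuration). With mode="Transport" the channel runs over HTTPS, so the data is encrypted on the wire between client and server, but the protection ends wherever TLS terminates (for example at a load balancer), unlike Message security, which protects the SOAP message itself end to end:

        <wsHttpBinding>
          <binding name="TransportOverHttps">
            <security mode="Transport">
              <!-- username/password sent as HTTP Basic inside the TLS tunnel -->
              <transport clientCredentialType="Basic" />
            </security>
          </binding>
        </wsHttpBinding>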


  • Has anyone ever bought an "Internet Connection" license for WSS 3.0?

    - by strongopinions
    I would like to run a WSS 3.0 site that is exposed to the internet and provides access to an arbitrary number of users through forms-based authentication. If the WSS licensing is analogous to that of MOSS, then there should be some special licensing required to make this legitimate. I have seen several vague statements on the internet about an "Internet Connection" license for WSS 3.0, and some more general statements from Mike Walsh to the effect that it costs "around $2000." But I have never seen anything official about this, and I'm not sure I am even using the correct terminology. Has anyone actually purchased something resembling this license?


  • Java: Anyone know of a library that detects the quality of an internet connection?

    - by Zombies
    I know a simple URLConnection to Google can detect whether I am connected to the internet; after all, it's a safe bet that something is wrong if I can't connect to Google. But what I am looking for at this juncture is a library that can measure how effective my connection to the internet is, in terms of BOTH responsiveness and available bandwidth. BUT, I do not want to measure how much bandwidth is potentially available, as that is too resource intensive. I really just need to be able to test whether or not I can receive something like X KB in Y amount of time. Does such a library already exist?
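    In case no library turns up, a minimal sketch of how such a probe might be hand-rolled with the standard library (the URL and thresholds are placeholders, not recommendations):

        import java.io.InputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        public class ConnectionProbe {
            /** Returns true if at least minBytes bytes arrive within timeoutMillis. */
            public static boolean canReceive(String url, int minBytes, int timeoutMillis) {
                long deadline = System.currentTimeMillis() + timeoutMillis;
                try {
                    HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
                    conn.setConnectTimeout(timeoutMillis);
                    conn.setReadTimeout(timeoutMillis);
                    try (InputStream in = conn.getInputStream()) {
                        byte[] buf = new byte[4096];
                        int total = 0;
                        while (total < minBytes && System.currentTimeMillis() < deadline) {
                            int n = in.read(buf);
                            if (n < 0) break;      // stream ended early
                            total += n;
                        }
                        return total >= minBytes && System.currentTimeMillis() <= deadline;
                    }
                } catch (Exception e) {
                    return false;                  // treat any failure as "not good enough"
                }
            }

            public static void main(String[] args) {
                // e.g. require 64 KB within 2 seconds
                System.out.println(canReceive("http://www.google.com/", 64 * 1024, 2000));
            }
        }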


  • Is there any other use for a Comcast extension cable and splitter?

    - by Tim
    I have an unused Comcast high-speed internet self-install kit from 2006. It has two coils of cable, a splitter, and some screws. The packaging calls the cable "extension cable"; I wonder if that is its actual name. Since ethernet cables are always useful, I wonder whether these extension cables are totally worthless and should be tossed into the trash, or whether they are usable for some other purpose, or even sellable. Thanks! (Photos in the original post: the SVI Digital SV-2G splitter (5/1000 MHz), one connector end of the cables, and the screws.)


  • Can Kies provide a data connection so my phone can avoid mobile data usage?

    - by Highly Irregular
    I regularly connect my Samsung phone by USB cable to my PC to charge it, and when I run the Samsung Kies app on the PC I can sync data too. Is it possible for my phone to use my PC's internet connection through the USB link, to avoid mobile data charges? I don't have a wifi connection available. It would make sense if Kies could provide this, or perhaps there's some other way I can achieve it without wifi, such as using Bluetooth. I have a Samsung GT-S7500, though this question should apply to many other models.


  • WP7 Return the last 7 days of data from an xml web service

    - by cvandal
    Hello, I'm trying to return the last 7 days of data from an XML web service, but with no luck. Could someone please explain how I would accomplish this? The XML is as follows:

        <node>
          <api>
            <usagelist>
              <usage day="2011-01-01">
                <traffic name="total" unit="bytes">23579797</traffic>
              </usage>
              <usage day="2011-01-02">
                <traffic name="total" unit="bytes">23579797</traffic>
              </usage>
              <usage day="2011-01-03">
                <traffic name="total" unit="bytes">23579797</traffic>
              </usage>
              <usage day="2011-01-04">
                <traffic name="total" unit="bytes">23579797</traffic>
              </usage>
            </usagelist>
          </api>
        </node>

    EDIT: The data I want to retrieve will be used to populate a line graph. Specifically, I require the day attribute value and the traffic element value for the past 7 days. At the moment I have the code below in place; however, it only shows the first day 7 times and the traffic for the first day 7 times.

        XDocument xDocument = XDocument.Parse(e.Result);
        var values = from query in xDocument.Descendants("usagelist")
                     select new History
                     {
                         day = query.Element("usage").Attribute("day").Value,
                         traffic = query.Element("usage").Element("traffic").Value
                     };

        foreach (History history in values)
        {
            ObservableCollection<LineGraphItem> Data = new ObservableCollection<LineGraphItem>()
            {
                new LineGraphItem() { yyyymmdd = history.day, value = double.Parse(history.traffic) },
                new LineGraphItem() { yyyymmdd = history.day, value = double.Parse(history.traffic) },
                new LineGraphItem() { yyyymmdd = history.day, value = double.Parse(history.traffic) },
                new LineGraphItem() { yyyymmdd = history.day, value = double.Parse(history.traffic) },
                new LineGraphItem() { yyyymmdd = history.day, value = double.Parse(history.traffic) },
                new LineGraphItem() { yyyymmdd = history.day, value = double.Parse(history.traffic) },
                new LineGraphItem() { yyyymmdd = history.day, value = double.Parse(history.traffic) },
            };
            lineGraph1.DataSource = Data;
        }
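    For what it's worth, a hedged sketch of selecting the last seven usage elements instead (History and LineGraphItem are the asker's own types; the ordering assumes the day attribute sorts chronologically as written). The original query iterates usagelist, of which there is only one, and reads its first usage child each time, which is why the same day appears seven times:

        XDocument xDocument = XDocument.Parse(e.Result);

        // one History per <usage> element, newest first, capped at 7
        var last7 = xDocument.Descendants("usage")
                             .OrderByDescending(u => (string)u.Attribute("day"))
                             .Take(7)
                             .Select(u => new History
                             {
                                 day = (string)u.Attribute("day"),
                                 traffic = (string)u.Element("traffic")
                             });

        var data = new ObservableCollection<LineGraphItem>();
        foreach (History history in last7)
        {
            data.Add(new LineGraphItem
            {
                yyyymmdd = history.day,
                value = double.Parse(history.traffic)
            });
        }
        lineGraph1.DataSource = data;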


  • Amanda Todd – What Parents Can Learn From Her Story

    - by D'Arcy Lussier
    Amanda Todd was a bullied teenager who committed suicide this week. Her story has become headline news, due in part to the YouTube video she posted telling her story (embedded in the original post).

    The story is heartbreaking for so many reasons, but I wanted to talk about what we as parents can learn from this. Being the dad to two girls, one of whom is 10, I'm very aware of the dangers that the internet holds. When I saw her story, one thing jumped out at me: unmonitored internet access at an early age. My daughter (then 9) came home from a friend's place once and asked if she could be in a YouTube video with her friend. Apparently this friend was allowed to do whatever she wanted on the internet, including posting goofy videos. This set off warning bells, and we ensured our daughter realized the dangers and that she was never to post videos of herself. In looking at Amanda's story, the access to unmonitored internet time, along with just being a young girl and being flattered by an online predator, were the key events that ultimately led to her suicide. Yes, the reaction of her classmates and "friends" was horrible as well; I'm not discounting that. But our youth don't yet fully understand that what they do on the internet today may follow them forever, and that the people they meet online aren't necessarily who they claim to be. So what can we as parents learn from Amanda's story?

    Parents Shouldn't Feel Bad About Being Internet Police

    Our job as parents is in part to protect our kids and keep them safe, even if they don't like our measures. This includes monitoring, supervising, and restricting their internet activities. In our house we have a family computer in the living room where the kids can watch videos and surf the web. It's in plain view of everyone, so you can't hide what you're looking at. If our daughter goes to a friend's place, we ask about what they did and what they played. If the computer comes up, we ask about what they did on it. Luckily our daughter is very up front and honest in telling us things, so we have very open discussions.

    Parents Need to Be Honest About the Dangers of the Internet

    I'm sure every generation says that "kids grow up so fast these days", but in our case the internet really does push our kids to be exposed to things they otherwise wouldn't experience. One wrong word in a Google search, a click of a link in a spam email, or just general curiosity can expose a child to things they aren't ready for or should never be exposed to (and I'm not just talking about adult material; have you seen some of the graphic pictures from war zones posted on news sites recently?). Our stance as parents has been to be open about discussing the dangers with our kids before they encounter any content: be proactive instead of reactionary. Part of this is alerting them to the monsters that lurk on the internet as well. As kids explore the world wide web, they're eventually going to encounter some chat room, some Facebook friend invite, or some other personal connection with someone. More than ever, kids need to be educated on the dangers of engaging with people online and sharing personal information. You can think of it as an evolved version of the discussion our parents had with us about using the phone: "Don't say 'I'm home alone', don't say when mom or dad get home, don't tell them any information, etc."

    Parents Need to Talk Self Worth at Home

    Katie makes the point better than I ever could (one bad word towards the end; video embedded in the original post). Our children need to understand their value beyond what the latest issue of TigerBeat says, or the media that keeps flaunting physical attributes over intelligence and character, or a society that puts its focus on status and wealth. They also have to realize that just because someone pays you a compliment, that doesn't mean you should ignore personal boundaries and limits. What does this have to do with the internet? Well, in days past, if you wanted to be social you had to go out somewhere. Now you can video chat with any number of people from the comfort of wherever your laptop happens to be, and not just text but full HD video with sound! While innocent children head online in the hopes of meeting cool people, predators with bad intentions are heading online too. As much as we try to monitor their online activity and be honest about the dangers of the internet, the human side of our kids isn't something we can control. But we can try to influence them to see themselves as not needing to seek out the acceptance of complete strangers online. Way easier said than done, but ensuring self-worth is discussed, encouraged, and celebrated is a step in the right direction.

    Parental Wake Up Call

    This post is not a critique of Amanda's parents. The reality is that cyberbullying and online abuse are happening every day, and there are millions of parents who have no clue it's happening to their children. Amanda's story is a wake up call that our children's online activities may be putting them in danger. My heart goes out to the parents of this girl. As a father of daughters, I can't imagine what I would do if I found my daughter having to hide in a ditch to avoid a mob, or had to call 911 to report that my daughter had attempted suicide by drinking bleach, or had to deal with a child turning to drugs, alcohol, or cutting to cope. It would be horrendous if we as parents didn't re-evaluate our family internet policies in light of this event. In the end, Amanda's video was meant to bring attention to her plight and encourage others going through the same thing. We may not be kids, but we can still honour her memory by helping safeguard our children.


  • What NAS setup for two-way syncing over the internet?

    - by Jamse
    I have family living a few hours away and have a lot of files that I would like to share - especially lots of folders of digital photos, but also documents etc. - partially so they can see them, partially so I can have access when I visit them, and partially for backup/redundancy purposes. My current hard drives on my main machine are getting pretty full anyway, and I have a MythTV box where my music is currently stored, so I was thinking of getting a NAS anyway. And at the other end my family have a few computers, so they would probably benefit from a NAS too.

    My general idea (though I'm willing to shift on this if there are any bright ideas about other ways of achieving my objectives) is to get a matching pair of NASs and have them sync over the internet. (To cut down on bandwidth use I would get them in sync locally to start with.) Having read around as best I can, it seems that syncing over the internet is generally only a feature on quite high-end units. However, I have seen that QNAP seem to offer this on their TS-110 and TS-210 units (they call it "remote replication"), which might work. They seem pretty reasonably priced for what they are, but of course, buying two of them and then adding the drives (say 1 TB or 2 TB each) means I'd be looking at about £400 total.

    So, I'm looking for recommendations really. I don't want to spend more than the QNAPs would cost me, but any other ideas would be most appreciated. I am comfortable with technology and tinkering around, but I don't have as much time for that as I would like, so I would favour solutions that require less tinkering rather than more (even though that's less fun!). Any thoughts would be welcome, as would any comments from people who have used the QNAP boxes for this. Thanks in advance. Some specifications:

    - Two-way syncing. Changes made at either end should be synced to the other. There shouldn't be one unit that is effectively a read-only mirror of the other.
    - Not real time. The syncing doesn't need to be real time - if it updated, say, daily overnight, that would be fine.
    - Set and forget. I would prefer minimal user interaction once set up - it would be great if syncs were scheduled and automatic.
    - OS independence. I am running Windows XP plus an Ubuntu-based MythTV box. At the other end there are Windows 7 and Windows XP machines, plus a networked TV set-top box which I think can play files off the network.
    - Machine independence. I would favour a system that is self-contained, i.e. not reliant on any particular PC being switched on. If the system had enough else going for it I could perhaps work around that at this end, where I only have one PC that's used as such, but it would be harder at the other end, where there are at least two PCs that might be accessing the files.
    - Notifications. I guess things like getting an email notification if the syncing fell over for any reason would be useful, though it's not a deal breaker.

    Update: I've been digging some more, and it looks like QNAP's Remote Replication function is actually just rsync, so it is only really suitable for one-way syncing. I've posted on their forum to double-check, but I think that's the case. In which case, the focus of my question is now either: do any reasonably-priced NASs support bidirectional syncing over the internet?, or has anyone had any luck installing syncing software onto NASs for this purpose? (Also, I have updated the question to clarify that I'm after two-way syncing.)
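    Since rsync only mirrors in one direction, a hedged pointer: Unison is the usual open-source tool for true two-way file synchronisation, and some NAS units can run it if shell access is available (whether a given QNAP model can is an assumption to verify, not a claim). A minimal sketch of a scheduled sync between two boxes:

        # run nightly from cron on one NAS; Unison must be installed at both ends,
        # and the remote path here is illustrative
        unison /share/photos ssh://far-nas//share/photos -batch -auto -times
        # -batch/-auto: no interactive prompts; conflicting edits are skipped
        # and logged for manual review rather than overwritten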


  • Does 64bit Windows 8 have the same 75% memory-usage limitation for applications as Windows 7?

    - by Barleyman
    64-bit Windows 7 (and Windows Vista) has a built-in limit that keeps applications from using the last 25% of RAM. You will get a low-memory warning when you get close to the limit, and even if you disable that warning, applications will run out of memory and crash, since the OS will refuse to allocate memory from that last 25%. That was fine when Vista was designed and machines had 1 GB of total memory, but it is pretty daft for today's 8 GB machines. Yes, the system will use that extra 2 GB for cache etc., but running out of memory when you have "merely" 2 GB left... NB: this has nothing to do with the page file. If you limit the page file to a sensible size like 2 GB, you will still see this behavior: the system will cram the page file to the last byte while refusing to touch that quarter of the RAM.

    Does Windows 8 change this behavior? Is there now some fixed minimum free RAM requirement, like 512 MB, or is it still 25%? Can you actually adjust the low-memory limit?

    EDIT: Here is an older post here which discusses this same behavior on Windows 7. There is a fixed 25% limit in Windows 7, and I'd like to know if it's still in Windows 8: Windows 7 / Page File Disabled / 12 GB RAM / 2+ GB RAM free and "your computer is running low on memory"

    EDIT 2: Here is another link discussing the low-memory warning and how to disable it. Note the author claims the limit for RAM usage is 80%, not 75%. That seems to be correct, as you can in fact allocate 6.4 GB of RAM on an 8 GB machine; anything above and beyond that goes to the page file, though. http://halflight.com.au/2011/04/06/how-to-disable-low-memory-warnings-and-the-advantages-of-removing-the-page-file/

    EDIT 3: Here are a couple of Process Explorer screenshots that demonstrate how it goes down. Exhibit 1: https://dl.dropbox.com/u/42068601/sysinfo.jpg Exhibit 2: https://dl.dropbox.com/u/42068601/sysint2.jpg You can see that Windows 7 will use memory beyond 6.4 GB only as a very last resort. I have the low-memory warning switched off here, so programs crashed at the last screenshot's allocation. With the low-memory warning turned on, it starts nagging before you can push the OS to use that remaining 1.6 GB. The question is not "Is it OK that Windows does not want to allocate the last 20% of RAM because X", it's "Does Windows 8 still behave this way?". With 16 GB this really becomes dumb.


  • Ubuntu Server, 2 Ethernet Devices, Same Gateway - Want to force internet traffic through 1 device (or at least allow it to work!)

    - by Chris Drumgoole
    I have a Ubuntu 10.04 server with 2 ethernet devices, eth0 and eth1. eth0 has a static IP of 192.168.1.210 and eth1 has a static IP of 192.168.1.211. The DHCP server (which also serves as the internet gateway) sits at 192.168.1.1. The issue I have right now is that when both are plugged in, I can connect to both IPs over SSH internally, but I can't connect to the internet from the server. If I unplug one of the devices (e.g. eth1), then it works, no problem. (I get the same result when I run sudo ifconfig eth1 down.) Question: how can I configure it so that both eth0 and eth1 play nice on the same network, but internet access works as well? (I am open to either forcing all internet traffic through a single device, or through both; I'm flexible.) From my Google searching it seems I may have a unique (or at least unpopular) problem, so I haven't been able to find a solution. Is this something that people generally don't do? The reason I want to use both ethernet devices is that I want to run different local traffic services on each, to split the load, so to speak. Thanks in advance.

    UPDATE. Contents of /etc/network/interfaces:

        # The loopback network interface
        auto lo
        iface lo inet loopback

        # The primary network interface
        auto eth0
        iface eth0 inet dhcp

        # The secondary network interface
        #auto eth1
        #iface eth1 inet dhcp

    (Note: above, I commented out the last 2 lines because I thought that was causing issues, but it didn't solve it.)

        $ netstat -rn
        Kernel IP routing table
        Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface
        192.168.1.0     192.168.1.1     255.255.255.0   UG        0 0          0 eth0
        192.168.1.0     0.0.0.0         255.255.255.0   U         0 0          0 eth0
        0.0.0.0         192.168.1.1     0.0.0.0         UG        0 0          0 eth0

    UPDATE 2. I made a change to the /etc/network/interfaces file as suggested by Kevin. Before showing the file contents and the route table: when I am logged into the server (through SSH), I cannot ping an external server, so this is the same issue I was experiencing that led to me posting this question. I ran /etc/init.d/networking restart after making the file changes. Contents of /etc/network/interfaces:

        # The loopback network interface
        auto lo
        iface lo inet loopback

        # The primary network interface
        auto eth0
        iface eth0 inet dhcp
            address 192.168.1.210
            netmask 255.255.255.0
            gateway 192.168.1.1

        # The secondary network interface
        auto eth1
        iface eth1 inet dhcp
            address 192.168.1.211
            netmask 255.255.255.0

    ifconfig output:

        eth0      Link encap:Ethernet  HWaddr 78:2b:cb:4c:02:7f
                  inet addr:192.168.1.210  Bcast:192.168.1.255  Mask:255.255.255.0
                  inet6 addr: fe80::7a2b:cbff:fe4c:27f/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:6397 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:683 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:538881 (538.8 KB)  TX bytes:85597 (85.5 KB)
                  Interrupt:36 Memory:da000000-da012800

        eth1      Link encap:Ethernet  HWaddr 78:2b:cb:4c:02:80
                  inet addr:192.168.1.211  Bcast:192.168.1.255  Mask:255.255.255.0
                  inet6 addr: fe80::7a2b:cbff:fe4c:280/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:5799 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:8 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:484436 (484.4 KB)  TX bytes:1184 (1.1 KB)
                  Interrupt:48 Memory:dc000000-dc012800

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:16436  Metric:1
                  RX packets:635 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:635 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:0
                  RX bytes:38154 (38.1 KB)  TX bytes:38154 (38.1 KB)

        $ netstat -rn
        Kernel IP routing table
        Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface
        192.168.1.0     0.0.0.0         255.255.255.0   U         0 0          0 eth0
        192.168.1.0     0.0.0.0         255.255.255.0   U         0 0          0 eth1
        0.0.0.0         192.168.1.1     0.0.0.0         UG        0 0          0 eth1
        0.0.0.0         192.168.1.1     0.0.0.0         UG        0 0          0 eth0
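    A hedged sketch of the usual fix, assuming the goal is to pin all internet traffic to eth0. Two interfaces on the same subnet here end up with two default routes (visible in the last table), and the kernel may pick either; giving only one interface a gateway keeps a single default route. Note also that "inet dhcp" combined with address/netmask lines is normally written "inet static", which may itself be part of the problem:

        # /etc/network/interfaces (sketch)
        auto eth0
        iface eth0 inet static
            address 192.168.1.210
            netmask 255.255.255.0
            gateway 192.168.1.1        # only eth0 gets a default gateway

        auto eth1
        iface eth1 inet static
            address 192.168.1.211
            netmask 255.255.255.0
            # no gateway line: eth1 carries local traffic only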


  • What's the risk of running a Domain Controller so that it is accessible from the internet?

    - by Adrian Grigore
    I have three remote dedicated web servers at different web hosts. Adding them to a common domain would make a lot of administration tasks much easier. Since two of the servers are running Windows 2008 R2 Standard, I thought about promoting them to domain controllers in order to set up the Windows domain; there's another thread at Server Fault that recommends this. At the same time, I've read many times on different websites that this is not a good idea, because a domain controller should always be behind a firewall on a LAN. But I can't set up something like that, because I don't have a LAN with a static IP accessible from the internet. In fact, I don't even have a Windows server in my LAN. What I have not found out is why exposing a DC to the internet would be a bad idea. The only risk I can see is that if someone penetrates one of my web servers, it would be much easier to penetrate the others as well. But as far as I can see, that's the worst-case scenario, since I am only going to join my web servers to that domain, not any computers from my local network. Is this the only downside, or does it also make it easier to penetrate one of my web servers in the first place? Thanks, Adrian


  • Internet Explorer 9: cannot open file on download, CTRL + J doesn't work, can't open the list of downloads

    - by simonsabin
    If any of the above symptoms are causing a problem, i.e.:
    1. You download a file and the download dialog disappears.
    2. You select Open when you download a file and nothing happens.
    3. The View Downloads window doesn't work.
    4. CTRL + J (view downloads) doesn't work.
    The solution is to clear your download history. See the thread "IE9 - View downloads / Ctrl+J do not open. I cannot open any file. But SAVE function still work fine. 64 bit version." for details; the answer is provided by Steven S. on June 20th. I hope that...(read more)


  • Getting Server 2008 R2 to ignore all traffic from Internet-facing NIC, leaving it to a VM

    - by Wolvenmoon
    I got Server 2008 R2 via DreamSpark and would like to start learning on it. I don't have much option but to put it on a system sitting between the internet and my home LAN, due to electricity bills and the fact that 3 computers in an 11x11 space in 102-degree weather is pretty stygian. Currently I use a ClearOS gateway to manage everything. What I'd like to do is take my Server 2008 R2 box, which has two NICs, and drop it at the head of my network. I'd want Server 2008 R2 to ignore all traffic on the external-facing NIC and pass it to a virtual ClearOS gateway, and to put all of its own internet traffic through the other NIC, which will face the rest of my network and be the default gateway for it. The theory is to keep the potentially vulnerable Server 2008 R2 install tucked as far behind a Linux box as possible, without sacrificing too much performance. This is a home network that occasionally hosts dedicated game servers and voice chat servers, so most malicious activity is in the form of non-targeted drive-by attacks; however, I don't trust Windows Server because I don't know the OS well enough yet. So, three questions:
    1. How do I do this?
    2. Am I going to be reasonably more secure doing this than if I just let the Server 2008 R2 rig handle all the network traffic and DHCP (not an option)?
    3. Should I virtualize the Server 2008 R2 rig instead, and if so, in what? (Core 2 Duo E6600 with 5 GB usable RAM)


  • How to approach people with similar programming interests you've found through the internet?

    - by randomguy
    I've recently really dived into Ruby/Rails and I'm falling in love. I have a gut feeling this might be something that could last for a while. What I've been missing is interaction with people who are as passionate about Ruby, Rails and things closely related to these. I live in a relatively small city, but was able to find five local people through a RoR website. Weekly meetups with Macs, beer and bro-love rushed through my mental theater. Seriously though, I have no clue how I could approach these people. I have their e-mail addresses. Any advice?


  • Updating modules on VPS hosted under OpenVZ

    - by tertle
    I've been trying to install OpenVPN on a VPS, but I've run into a few problems when trying to start the OpenVPN server:

        Service deferred error: IPTablesServiceBase: failed to run iptables-restore [status=1]:
        ['FATAL: Could not load /lib/modules/2.6.18-028stab070.14/modules.dep: No such file or directory',
         'FATAL: Could not load /lib/modules/2.6.18-028stab070.14/modules.dep: No such file or directory',
         'iptables-restore: line 46 failed']:
        internet/base:1175,internet/base:752,internet/process:45,internet/process:306,
        internet/_baseprocess:48,internet/process:775,internet/_baseprocess:60,svc/pp:116,
        svc/svcnotify:26,internet/defer:238,internet/defer:307,internet/defer:323,
        sagent/ipts:105,sagent/ipts:39,util/error:52,util/error:32
        service failed to start due to unresolved dependencies: set(['user', 'iptables_openvpn'])
        service failed to start due to unresolved dependencies: set(['user', 'iptables_openvpn'])
        service failed to start due to unresolved dependencies: set(['iptables_openvpn'])

    Anyway, after a bit of playing around and some advice, I found that the Linux kernel and modules don't match on my server: uname -r returns 2.6.18-028stab070.14, while ls /lib/modules returns 2.6.18-028stab070.7. The server is running OpenVZ and my container uses Ubuntu 9.10. So my question is: is it possible for me to update the modules on a VPS, and if so, how would I do this? Or is this something I'll need to get my host to do? Thanks in advance.


  • When I update, it says I should check my internet connection and use apt-cdrom, but I can't get it to work

    - by Joey
        W:Failed to fetch cdrom://Ubuntu 12.10 _Quantal Quetzal_ - Release amd64 (20121017.5)/dists/quantal/main/binary-amd64/Packages
          Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs,
        W:Failed to fetch cdrom://Ubuntu 12.10 _Quantal Quetzal_ - Release amd64 (20121017.5)/dists/quantal/restricted/binary-amd64/Packages
          Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs,
        W:Failed to fetch cdrom://Ubuntu 12.10 _Quantal Quetzal_ - Release amd64 (20121017.5)/dists/quantal/main/binary-i386/Packages
          Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs,
        W:Failed to fetch cdrom://Ubuntu 12.10 _Quantal Quetzal_ - Release amd64 (20121017.5)/dists/quantal/restricted/binary-i386/Packages
          Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs,
        E:Some index files failed to download. They have been ignored, or old ones used instead.
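    These warnings usually mean the installer left a cdrom: entry in /etc/apt/sources.list. A hedged sketch of the common fix (commenting out the CD-ROM source so apt only uses network mirrors; backing the file up first is a sensible precaution):

        # comment out any "deb cdrom:..." lines, then refresh the package lists
        sudo sed -i '/^deb cdrom:/s/^/# /' /etc/apt/sources.list
        sudo apt-get update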


  • How to run a WebPy server on port 8080 using the DDNS of a D-Link router, and access the site from the internet?

    - by nuke1010
    I have two major issues with setting up a web server using my D-Link DIR-600L router.
    Issue 1: I run a WebPy server on port 8080, but the DDNS service providers (like dlinkddns.com or dyndns.org) only allow port 80. I can run the server on port 80 with the sudo command, but my server becomes vulnerable if I give it root access. So I tried port forwarding in the router and server, but with no luck; I don't know if I did it correctly.
    Issue 2: Even when the server runs on port 80, I can only access the site from my local machines using the registered domain name (say, nikz.dyndns.org). No one on the internet can load the site, even when it is totally up. As I observed in the server log, requests from other clients never reach my server.
    I need to run this server on port 8080, and I need to access the site from the internet. How can I do it? Any ideas?
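    For the port-80-versus-8080 part, a hedged sketch of one common approach on a Linux server (assumes iptables is available; the router's own "virtual server"/port-forwarding page can achieve the same 80-to-8080 mapping for traffic arriving from the internet):

        # keep WebPy on unprivileged port 8080, but redirect incoming port 80 to it
        sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080

    This avoids running the server as root, since only the kernel-level redirect touches the privileged port.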


  • How to set up a VPN Incoming connection with Windows to tunnel Internet traffic?

    - by Mehrdad
    I want to set up a VPN on a remote server to route all my internet traffic through it, for privacy reasons. I can set up an incoming connection and connect to it successfully. The problem is that I can only see the remote computer, and no other web sites will open. I want the remote server to act like a NAT. How can I do that? Note that I don't want to split internet traffic: I actually want to send all the traffic to the remote server, but I need to make it relay that traffic. For the record, my remote server is Windows Web Server 2008, which does not have the Routing and Remote Access service. Clarification: I'm mostly interested in the server configuration; I don't have any problems configuring the client. By the way, Windows Web Server 2008 seems to have the same built-in VPN features as client OSes (like Vista); specifically, it doesn't include the RRAS console in MMC. I'm also open to suggestions regarding free third-party PPTP/L2TP daemons, if any are available.


  • GWT (Google Web Toolkit) - Développez des Applications Internet Riches (RIA) en Java by Damien Picard, reviewed by Benwit

    I have just read the third French-language book on GWT, the one by Damien Picard, published by ENI. (Cover image: http://images-eu.amazon.com/images/P/2746058308.08.LZZZZZZZ.jpg) Quote: This book on GWT (Google Web Toolkit) is aimed at Java developers who want to create RIA applications without resorting to JavaScript, or at experienced web developers (JavaScript/XHTML/CSS) looking for a framework that multiplies their productiv...


  • Unable to remove limit on memory usage for PHP script.

    - by Jess Telford
    The Situation

    I am having an issue with a PHP script getting the following error message:

        Fatal error: Out of memory (allocated 359923712) (tried to allocate 72 bytes) in /path/to/piwik/core/DataTable.php on line 969

    The script I'm running is /path/to/piwik/misc/cron/archive.sh. I am assuming the numbers are bytes, which means the total is approximately 360 MB. For all intents and purposes, I have increased the memory limits on the server well above 360 MB, yet this is the number (give or take a byte) it consistently errors out at. Please note: this question is not about fixing a memory leak in the script, nor about why the script itself is using so much memory. The script is part of the Piwik archiving process, so I cannot just fix any memory leaks, etc. For more info on this script and why I am increasing the memory limit, see "How to setup auto archiving".

    The question

    Given that the script is attempting to use over 360 MB of memory, which I cannot change, why does it not seem possible for me to increase the amount of memory available to PHP on my server?

    What I've tried

    Increasing PHP's memory_limit. Given the php.ini file:

        $ php -i | grep php.ini
        Configuration File (php.ini) Path => /usr/local/lib
        Loaded Configuration File => /usr/local/lib/php.ini

    I have edited that file so the memory_limit directive reads:

        memory_limit = -1

    I restarted Apache and checked that the new value has stuck:

        $ php -i | grep memory_limit
        memory_limit => -1 => -1

    Then I ran the script and got the same error. I've also tried 1G, 768M, etc., all with the same result (i.e. no change). Update 22nd June: based on Vangel's help, I have attempted to set post_max_size to 20M in combination with setting memory_limit. Again, this has no effect.

    Removing the memory limit on child processes of Apache. I found and edited the httpd.conf file to make sure there is no RLimitMEM directive. I then used WHM's Apache Configuration > Memory Usage Restrictions to generate a restriction, which it claimed was at 1000M (and confirmed by checking httpd.conf). Both of these resulted in no change to the script erroring at 360 MB.

    Increasing the per-process memory limits of Linux. The current limits set on the system:

        $ ulimit -m
        524288
        $ ulimit -v
        524288

    I have attempted to set both of these to unlimited:

        $ ulimit -m unlimited
        $ ulimit -v unlimited

    Once again, this has resulted in absolutely no improvement in my problem.

    My setup

        $ cat /etc/redhat-release
        CentOS release 5.5 (Final)
        $ uname -a
        Linux example.com 2.6.18-164.15.1.el5 #1 SMP Wed Mar 17 11:30:06 EDT 2010 x86_64 x86_64 x86_64 GNU/Linux
        $ php -i | grep "PHP Version"
        PHP Version => 5.2.9
        $ httpd -V
        Server version: Apache/2.0.63
        Server built:   Feb 2 2011 01:25:12
        Cpanel::Easy::Apache v3.2.0 rev5291
        Server's Module Magic Number: 20020903:13
        Server loaded:  APR 0.9.17, APR-UTIL 0.9.15
        Compiled using: APR 0.9.17, APR-UTIL 0.9.15
        Architecture:   64-bit
        Server compiled with....
         -D APACHE_MPM_DIR="server/mpm/prefork"
         -D APR_HAS_SENDFILE
         -D APR_HAS_MMAP
         -D APR_HAVE_IPV6 (IPv4-mapped addresses enabled)
         -D APR_USE_SYSVSEM_SERIALIZE
         -D APR_USE_PTHREAD_SERIALIZE
         -D SINGLE_LISTEN_UNSERIALIZED_ACCEPT
         -D APR_HAS_OTHER_CHILD
         -D AP_HAVE_RELIABLE_PIPED_LOGS
         -D HTTPD_ROOT="/usr/local/apache"
         -D SUEXEC_BIN="/usr/local/apache/bin/suexec"
         -D DEFAULT_PIDLOG="logs/httpd.pid"
         -D DEFAULT_SCOREBOARD="logs/apache_runtime_status"
         -D DEFAULT_LOCKFILE="logs/accept.lock"
         -D DEFAULT_ERRORLOG="logs/error_log"
         -D AP_TYPES_CONFIG_FILE="conf/mime.types"
         -D SERVER_CONFIG_FILE="conf/httpd.conf"

    Output of $ php -i: http://pastebin.com/EiRut6Nm
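    One hedged observation (an assumption about this setup, not a confirmed diagnosis): ulimit changes made in an interactive shell apply only to that shell session and its children, so a script launched from cron or a separate login still sees the old 524288 KB (512 MB) caps, and 360 MB of PHP heap plus the process's own overhead can plausibly bump into that address-space limit. A sketch of making the limits persistent and verifying them from a fresh session (the username is a placeholder):

        # /etc/security/limits.conf  (format: domain type item value; "as" is address space)
        someuser  hard  as   unlimited
        someuser  hard  rss  unlimited

        # then verify from a brand-new login session for that user:
        su - someuser -c 'ulimit -v; ulimit -m'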


  • How to install a desktop environment onto Ubuntu Server -- but without internet access or a CDROM?

    - by James
    I am playing around with a computer which has no CDROM drive or internet access, and I have installed Ubuntu Server onto it. I have that all up and running nicely, but now I'd like to install Xfce, GNOME or something similar so I can load up a desktop environment from the command line if I wish. Obviously with internet access or a CDROM this would be a simple task of using apt-get and letting it find and retrieve the packages for me, I assume, but I do not have either. I do however have a USB drive, and I have used UNetbootin to make it into a bootable drive with the Ubuntu Server disk image files on there. I have mounted the USB drive to /media/usb0 and tried the command "sudo apt-cdrom add -d /media/usb0" to get apt to recognise the USB drive as an "Ubuntu CD", i.e. a source of package files, but apt-get doesn't seem to be finding Xfce. I try "sudo apt-get install xfce" and "sudo apt-get install xfce4", but neither finds the package. I would prefer Xfce, but GNOME would be OK too. My question is, am I doing something wrong? I figured that the Ubuntu Server disk (or rather, my Ubuntu Server USB drive) might not have any desktop environment packages on there, so I tried the Xubuntu Desktop disk too (again, from my USB drive). I tried "sudo apt-get install xubuntu-desktop", but it couldn't find the package, even though it is listed under the /casper/ directory in some MANIFEST file. Anyone see where I'm going wrong? Maybe apt-get install is looking somewhere other than my USB drive? Maybe my commands are wrong? Maybe the disks don't even have the desktop environments on!? Thanks in advance guys, any input would be much appreciated. Cheers - James
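    A hedged note on why this likely fails, plus a sketch of the usual offline workflow. apt-cdrom can only index media that carries an APT package pool (a dists/ and pool/ tree); live-CD images like the Xubuntu desktop ISO instead ship a compressed filesystem under /casper/ (that MANIFEST belongs to the squashfs, not to APT), so there are no .debs for apt-get to find, and the Server ISO's pool generally lacks the desktop packages. Assuming a pool-bearing ISO (for example an alternate/DVD image) is on the stick:

        # check the media actually has a package pool first
        ls /media/usb0/dists /media/usb0/pool

        # register the mounted media with APT without trying to mount a real CD
        sudo apt-cdrom -m -d=/media/usb0 add
        sudo apt-get update

        # then install from the media source
        sudo apt-get install xfce4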


  • How to make a static route when using two internet connections?

    - by webmasters
    I have asked a question here on how to choose which applications will use a 3G internet connection and which applications will use the LAN. User harrymc gave a very complete and interesting answer, pointing out that this is possible using static routes for certain websites. Now, let's say I want to access google.com only through my 3G internet connection. What would that static route look like? google.com has the IP 173.194.39.180. Here is a print of my route table; the 3G modem has the IP 10.81.132.96:

        IPv4 Route Table
        ===========================================================================
        Active Routes:
        Network Destination        Netmask          Gateway       Interface  Metric
                  0.0.0.0          0.0.0.0      192.168.2.1    192.168.2.102      20
                  0.0.0.0          0.0.0.0     10.81.132.97    10.81.132.111     286
             10.81.132.96  255.255.255.224          On-link     10.81.132.111    286
            10.81.132.111  255.255.255.255          On-link     10.81.132.111    286
            10.81.132.127  255.255.255.255          On-link     10.81.132.111    286
                127.0.0.0        255.0.0.0          On-link         127.0.0.1    306
                127.0.0.1  255.255.255.255          On-link         127.0.0.1    306
          127.255.255.255  255.255.255.255          On-link         127.0.0.1    306
              192.168.2.0    255.255.255.0          On-link     192.168.2.102    276
            192.168.2.102  255.255.255.255          On-link     192.168.2.102    276
            192.168.2.255  255.255.255.255          On-link     192.168.2.102    276
                224.0.0.0        240.0.0.0          On-link         127.0.0.1    306
                224.0.0.0        240.0.0.0          On-link     192.168.2.102    276
                224.0.0.0        240.0.0.0          On-link     10.81.132.111    286
          255.255.255.255  255.255.255.255          On-link         127.0.0.1    306
          255.255.255.255  255.255.255.255          On-link     192.168.2.102    276
          255.255.255.255  255.255.255.255          On-link     10.81.132.111    286
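    A hedged sketch of the host route this implies (standard Windows route syntax; the gateway 10.81.132.97 is read off the 3G interface's default-route line in the table above, and note that Google serves from many IPs, so a single host route only covers this one address):

        rem send traffic for this one address via the 3G gateway; -p persists across reboots
        route -p add 173.194.39.180 mask 255.255.255.255 10.81.132.97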

